Comtrol Corporation: Latency Performance Testing of Network Device Servers



April 2003

1001 Aviation Parkway, Suite 400, Morrisville, NC 27560  919-380-2800  Fax 919-380-2899
320 B Lakeside Drive, Foster City, CA 94404  650-513-8000  Fax 650-513-8099
www.veritest.com  info@veritest.com

Test report prepared under contract from Comtrol Corporation

Executive summary

Comtrol Corporation commissioned VeriTest, a division of Lionbridge Technologies, Inc., to execute a benchmark comparing the overall latency of device server products when performing network-to-serial communications. As a baseline, VeriTest also executed the same benchmark against a native PC serial port. The test compared single- and multiple-port serial-to-Ethernet device servers from three vendors. In addition to testing the products with the industry-standard TCP/IP protocol, the tests were repeated with three Comtrol products using Comtrol's Rapid Transport (RTS) network protocol to determine whether latency improvements could be obtained by replacing TCP/IP with RTS.

Key findings

- Native PC serial port latency was 5.67 ms, versus 4.075 ms on the DeviceMaster RTS 16RM using the RTS protocol and 7.13 ms using TCP/IP.
- Overall, our testing showed that the Comtrol device servers generated significantly lower latencies than the competitors' device servers.
- The Comtrol DeviceMaster RTS 16RM had the lowest latency of any product tested.
- Comtrol's products, on average, performed approximately 10 to 50 times faster than the Digi or Lantronix products in our test configurations. The average latencies observed across the nine products tested ranged from a low of 4.075 ms to a high of 562.595 ms.

The following device servers were included in the test:

- Comtrol DeviceMaster PRIMO
- Comtrol DeviceMaster RTS 1 Port
- Comtrol RocketPort Serial Hub ia
- Comtrol DeviceMaster RTS 16RM
- Digi One IA RealPort
- Digi PortServer TS 16
- Lantronix MSS4
- Lantronix UDS-10
- Lantronix UDS-100

The device servers listed above connect one or more serial devices and move data to and from the connected devices over an Ethernet network that is configured with vendor-specific driver or COM port redirector software. The COM port redirection software associates an IP or MAC address with a COM port identifier. It is the function of the device server and related software/drivers to convert data from a serial-based system into data that can be delivered to an Ethernet-based client system.

The test objective was to measure the average round-trip latency required to send a stream of single 8-bit characters from a host PC across a network to the test device server's serial port and receive the same data echoed back to the host PC. The echoing of data was accomplished by attaching a loopback plug to the device server's serial port. The lower the round-trip latency, the faster the device server and associated driver or COM port redirection software processed and moved the data.

Comtrol Corporation provided VeriTest with all the device servers and associated software. Comtrol also provided a custom Python-based test program to be executed on a personal computer. The program generated the load for testing, controlled the execution of the test, and measured the latency during testing. Please refer to the Testing methodology section for complete details on how the tests were conducted and our review of the benchmark program used for these tests.

A single iteration of the test consists of the benchmark program sending a single 8-bit character from the Ethernet port of the host PC to the serial port of the device server being tested, and then having the character echoed back to the host PC. The benchmark program recorded the elapsed time from when the character was sent until the time it was received back. This elapsed time is the latency required to send a single character to a device server and get a response back.

For each device server tested, we configured the benchmark program to complete 10,000 iterations of the test described above. For all device servers, the benchmark program sent data at 9600 baud with 8 data bits, 1 stop bit, and no parity. We conducted two tests of 10,000 iterations for each device server to ensure the repeatability and accuracy of the test results, and then computed an average of the two sets of latency results for inclusion in this report. Please refer to the Testing methodology section for complete details and configuration information.

Latency results from the testing are shown in Figure 1 below; lower numbers mean less latency. The benchmark program was run against all device server products using the TCP/IP protocol. For three of the Comtrol products, we also performed the testing using Comtrol's proprietary RTS protocol. Our testing showed that Comtrol device servers running under RTS incurred up to a 52 percent decrease in latency compared to the same product running TCP/IP.

In our testing, the Comtrol DeviceMaster RTS 16RM device server generated the lowest average round-trip latencies of all the device servers, regardless of the network protocol. The RTS 16RM required, on average, 7.13 milliseconds to complete each of the 10,000 iterations in each test using TCP/IP and 4.075 milliseconds using RTS. These results are significantly better than the 110.055 milliseconds required by the next best performing competitor's product, the Digi One IA RealPort 1-port device server. The Digi PortServer TS 16 required, on average, 120.005 milliseconds to complete each of the 10,000 iterations.

Finally, the Lantronix products generated significantly higher latencies than any other vendor's devices in our testing. Specifically, the Lantronix MSS4 required, on average, 562.595 milliseconds to complete each of the 10,000 iterations in each test. The other Lantronix products fared slightly better than the MSS4, with the UDS-10 and UDS-100 each generating average latencies of 529.36 milliseconds. Overall, the Comtrol product line produced significantly lower latencies than any other competitor's products tested, regardless of the protocol used.

Product | Driver/Driver Rev. | Network Protocol | Avg. Latency (ms)
Native PC serial port | Win2K | - | 5.67
Comtrol DeviceMaster PRIMO | W2K/1.5.0.0 | TCP/IP | 10.65
Comtrol DeviceMaster RTS 1 Port | W2K/6.5.0.0 | TCP/IP | 15.855
Comtrol DeviceMaster RTS 1 Port | W2K/6.5.0.0 | Rapid Transport | 9.47
Comtrol RocketPort Serial Hub ia | W2K/6.5.0.0 | TCP/IP | 19.1
Comtrol RocketPort Serial Hub ia | W2K/6.5.0.0 | Rapid Transport | 9.165
Comtrol DeviceMaster RTS 16RM | W2K/6.5.0.0 | TCP/IP | 7.13
Comtrol DeviceMaster RTS 16RM | W2K/6.5.0.0 | Rapid Transport | 4.075
Digi One IA RealPort | W2K/2.6.82.0 | TCP/IP | 110.055
Digi PortServer TS 16 | W2K/2.6.82.0 | TCP/IP | 120.005
Lantronix MSS4 | W2K/Dev Com 1.0 Build 117 | TCP/IP | 562.595
Lantronix UDS-100 | W2K/Dev Com 1.0 Build 117 | TCP/IP | 529.36
Lantronix UDS-10 | W2K/Dev Com 1.0 Build 117 | TCP/IP | 529.36

Figure 1. Device Server Average Latencies
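
The "up to a 52 percent decrease" figure quoted in the executive summary can be re-derived from the Figure 1 averages. The short Python snippet below is illustrative only (it is not part of the Comtrol benchmark program); it recomputes the RTS-versus-TCP/IP reduction for the three Comtrol products that were tested under both protocols, using the values copied from the table above.

# Re-derive the RTS vs. TCP/IP latency reduction from the Figure 1 averages.
pairs = {
    "DeviceMaster RTS 1 Port":  (15.855, 9.470),   # (TCP/IP ms, RTS ms)
    "RocketPort Serial Hub ia": (19.100, 9.165),
    "DeviceMaster RTS 16RM":    (7.130, 4.075),
}
for product, (tcp_ms, rts_ms) in pairs.items():
    reduction = (tcp_ms - rts_ms) / tcp_ms * 100.0
    print(f"{product}: {reduction:.1f}% lower latency under RTS")
# The RocketPort Serial Hub ia shows the largest reduction, roughly 52 percent.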

Testing methodology

Comtrol Corporation commissioned VeriTest, a division of Lionbridge Technologies, Inc., to execute a predefined benchmark test provided by Comtrol that measured the performance of a native PC serial port and compared the performance of the following device servers moving a stream of single 8-bit character data:

- Comtrol DeviceMaster PRIMO
- Comtrol DeviceMaster RTS 1 Port
- Comtrol RocketPort Serial Hub ia
- Comtrol DeviceMaster RTS 16RM
- Digi One IA RealPort
- Digi PortServer TS 16
- Lantronix MSS4
- Lantronix UDS-10
- Lantronix UDS-100

The device servers listed above allow serial devices to connect to, and move data to and from, a device on an Ethernet network configured with vendor-specific COM port driver or redirection software. The COM port driver or redirection software associates an IP or MAC address with a COM port identifier. It is the function of the device server and related software/drivers to convert data from a serial-based system into data that can be delivered to an Ethernet-based client system.

The goal of the testing was to measure the average round-trip latency required to send a stream of single 8-bit characters from a host PC across a network to the test device server's serial port and receive the same data echoed back to the host PC. The echoing of data was accomplished by attaching a loopback plug to the device server's serial port. The lower the latency for this round trip, the faster the device server and associated COM port redirection software processed and moved the data. To provide a basis for comparison, the latency test was also performed on the host PC, sending the same stream of single 8-bit characters out of the native, onboard serial port. A loopback plug was attached to the onboard serial port to echo back each character.

The equipment and topology used for the test are shown in Figure 2 below. Each of the device servers used a 100 Mbps Ethernet port (except the Lantronix UDS-10, which had a 10 Mbps port) and was equipped with one or more serial ports. We used a Dell Dimension 8100 configured with a 1.5 GHz Pentium 4 processor and 256 MB of RAM, running Windows 2000 Professional/SP3, as our host PC. We connected the host PC to a 3Com 10/100 Mbps hub using a Cat 5 Ethernet cable. Next, we connected the Ethernet port on the device server under test to the 3Com hub using a Cat 5 Ethernet cable. A loopback plug was placed on the device server's serial port to redirect the character stream back to the host PC.

Figure 2. Device Server Interconnect Diagram (host PC connected by Ethernet cable to the hub, which connects by Ethernet cable to the device server fitted with a loopback plug)

Comtrol Corporation provided VeriTest with a custom Python script, referred to as the benchmark program in this document, to measure the latency for each of the devices tested. To run the Python script, we downloaded ActivePython-2.2.2-224-win32-ix86.msi from ActiveState's website and installed it on the host PC using the typical installation option. We then installed the benchmark program by copying the custom Comtrol-provided Python files to the host PC. We ran the benchmark program from a command prompt on the host PC as follows:

C:\> bench.bat c10000 com#

where # is the designated COM port number assigned to the device server's serial port.

When executed, the benchmark program writes a stream of single 8-bit characters from the host PC over an Ethernet connection to the device under test. The device under test then echoes each 8-bit character back to the host PC. The echoing of data was accomplished using a loopback plug that connected the RS-232 Tx data pin to the Rx data pin. The benchmark program recorded the time that elapsed from when it initially sent the character to when it received the character back. This elapsed time is the latency required to send a single character to the device server's serial port and get a response back.

A single iteration of this test is defined as the sending of one 8-bit character of data from the host PC to the device server under test and then back to the host PC. For these tests, we instructed the benchmark program to perform 10,000 iterations. When a test completed, the benchmark program computed and displayed the minimum, mean, maximum, and standard deviation of the latencies recorded during the testing. For our reporting, we used the mean.

The test results from this benchmark were obtained using the following parameters (all are default values of the benchmark program except the iteration count):

- inter-character timeout: 2 ms
- total read timeout: 2000 ms
- iteration delay: 100 ms
- baud rate: 9,600 bps
- block size: 1 byte
- number of iterations: 10,000

We reviewed and evaluated the Comtrol-supplied Python source code to ensure that the program code matched the observed operation of the application during testing. During our analysis we verified the following characteristics of the benchmark program:

- the program flow of the application seemed reasonable and fair for all products tested
- the test data was a 256-byte repeating set of ASCII characters
- the program sent a single ASCII character by default
- the program received a single ASCII character by default
- the program would terminate with an error if the received character did not match the sent character
- the timing values printed as output matched the output of the time.clock() system call
- the min, max, and mean results were calculated properly for an iteration count of 10
- the COM port functions issued matched the Win32 parameter definitions for those calls

Given our investigation of the Python script, we are satisfied that the benchmark program operated as intended and generated a valid set of test results.
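
For readers who want a concrete picture of the measurement loop described above, the following is a minimal sketch of a loopback latency benchmark using the same serial settings and parameters. It is not the Comtrol benchmark program: the original drove the COM port through Win32 calls and timed iterations with time.clock(), whereas this sketch assumes the pySerial package, time.perf_counter(), and a hypothetical redirected port name of COM5.

# Minimal sketch of a loopback latency benchmark (illustrative reconstruction only).
import statistics
import time

import serial  # pySerial; assumed to be installed (pip install pyserial)

PORT = "COM5"               # hypothetical COM port assigned by the redirection software
BAUD = 9600                 # 9600 baud, 8 data bits, no parity, 1 stop bit
ITERATIONS = 10_000         # matches the c10000 argument to bench.bat
ITERATION_DELAY = 0.100     # 100 ms delay between iterations
READ_TIMEOUT = 2.0          # 2000 ms total read timeout
INTER_CHAR_TIMEOUT = 0.002  # 2 ms inter-character timeout

# 256-byte repeating set of 8-bit test characters, as in the reviewed script
test_data = bytes(range(256))

port = serial.Serial(PORT, baudrate=BAUD,
                     bytesize=serial.EIGHTBITS,
                     parity=serial.PARITY_NONE,
                     stopbits=serial.STOPBITS_ONE,
                     timeout=READ_TIMEOUT,
                     inter_byte_timeout=INTER_CHAR_TIMEOUT)

latencies_ms = []
for i in range(ITERATIONS):
    ch = test_data[i % 256:i % 256 + 1]   # one 8-bit character per iteration
    start = time.perf_counter()           # the 2003 script used time.clock()
    port.write(ch)
    echoed = port.read(1)                 # loopback plug echoes the character back
    elapsed_ms = (time.perf_counter() - start) * 1000.0
    if echoed != ch:                      # terminate on a mismatch, as the script did
        raise RuntimeError(f"echo mismatch on iteration {i}")
    latencies_ms.append(elapsed_ms)
    time.sleep(ITERATION_DELAY)
port.close()

print(f"min  {min(latencies_ms):.3f} ms   mean  {statistics.mean(latencies_ms):.3f} ms")
print(f"max  {max(latencies_ms):.3f} ms   stdev {statistics.stdev(latencies_ms):.3f} ms")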

Comtrol provided all the device servers as well as the products' software/drivers. To be certain we were using the most current version of each device server's software/driver, we visited each vendor's website and performed any necessary updating. Figure 3 below lists the vendor-specific COM port driver or redirection software versions we used for this testing.

Product | Driver/Driver Rev.
Comtrol DeviceMaster PRIMO | W2K/1.5.0.0
Comtrol DeviceMaster RTS 1 Port | W2K/6.5.0.0
Comtrol RocketPort Serial Hub ia | W2K/6.5.0.0
Comtrol DeviceMaster RTS 16RM | W2K/6.5.0.0
Digi One IA RealPort | W2K/2.6.82.0
Digi PortServer TS 16 | W2K/2.6.82.0
Lantronix MSS4 | W2K/Dev Com 1.0 Build 117
Lantronix UDS-100 | W2K/Dev Com 1.0 Build 117
Lantronix UDS-10 | W2K/Dev Com 1.0 Build 117

Figure 3. Device Server Driver Information

We used the following procedure for testing each device server:

- Connect the host PC and device server to the 3Com hub using Cat 5 Ethernet cables.
- Power up the device server under test and place the loopback plug on a serial port on the device server.
- Power up the host PC.
- Install and configure the vendor-specific COM port redirection software and drivers, specifying the IP address and, in some cases (Comtrol devices only), the MAC address of the Ethernet interface of the device server under test. The appropriate COM port was also defined as the interface through which the device server and the host PC communicate.
- Reboot the host PC and cycle the power on the test device server.
- Start the benchmark script on the host PC by entering the command line C:\> bench.bat c10000 com#, specifying the appropriate COM port number for #.
- Once the command is executed, visually monitor the test until it completes and save the test results.
- Cycle power on both the host PC and the device server.
- Perform the second 10,000-iteration run of the test.
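
As noted in the executive summary, the figure reported for each device is the average of the mean latencies from the two 10,000-iteration runs. A trivial illustration of that final step follows; the run means shown are placeholders, not measured data.

# Placeholder run means; the real values came from the benchmark program's output
# for each of the two 10,000-iteration runs.
run_means_ms = [7.12, 7.14]                        # hypothetical means of run 1 and run 2
reported_avg_ms = sum(run_means_ms) / len(run_means_ms)
spread_ms = max(run_means_ms) - min(run_means_ms)  # simple repeatability check
print(f"Reported average latency: {reported_avg_ms:.3f} ms (run-to-run spread {spread_ms:.3f} ms)")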

Test results

This section describes the results of the device server testing. The goal of the testing was to measure the average round-trip latency required to send a stream of single 8-bit characters from a host PC over a local Ethernet network to a test device server's serial port and have it echo the stream back to the host PC. The lower the latency for this round trip, the faster the device server was able to process and move the data. Please refer to the Testing methodology section for complete details and configuration information.

In our testing, the Comtrol DeviceMaster RTS 16RM device server generated the lowest average round-trip latencies of all the devices, regardless of the network protocol. The RTS 16RM required, on average, 7.13 milliseconds to complete each of the 10,000 iterations in each test using TCP/IP and 4.075 milliseconds using RTS. These results are significantly better than the 110.055 milliseconds required by the next best performing competitor's product, the Digi One IA RealPort 1-port device server. The Digi PortServer TS 16 required, on average, 120.005 milliseconds to complete each of the 10,000 iterations.

Finally, the Lantronix products generated significantly higher latencies than any other vendor's devices in our testing. Specifically, the Lantronix MSS4 required, on average, 562.595 milliseconds to complete each of the 10,000 iterations in each test. The other Lantronix products fared slightly better than the MSS4, with the UDS-10 and UDS-100 each generating average latencies of 529.36 milliseconds. Overall, the Comtrol product line produced significantly lower latencies than any other competitor's products tested, regardless of the protocol used.

Product | Driver/Driver Rev. | Network Protocol | Avg. Latency (ms)
Native PC serial port | Win2K | - | 5.67
Comtrol DeviceMaster PRIMO | W2K/1.5.0.0 | TCP/IP | 10.65
Comtrol DeviceMaster RTS 1 Port | W2K/6.5.0.0 | TCP/IP | 15.855
Comtrol DeviceMaster RTS 1 Port | W2K/6.5.0.0 | Rapid Transport | 9.47
Comtrol RocketPort Serial Hub ia | W2K/6.5.0.0 | TCP/IP | 19.1
Comtrol RocketPort Serial Hub ia | W2K/6.5.0.0 | Rapid Transport | 9.165
Comtrol DeviceMaster RTS 16RM | W2K/6.5.0.0 | TCP/IP | 7.13
Comtrol DeviceMaster RTS 16RM | W2K/6.5.0.0 | Rapid Transport | 4.075
Digi One IA RealPort | W2K/2.6.82.0 | TCP/IP | 110.055
Digi PortServer TS 16 | W2K/2.6.82.0 | TCP/IP | 120.005
Lantronix MSS4 | W2K/Dev Com 1.0 Build 117 | TCP/IP | 562.595
Lantronix UDS-100 | W2K/Dev Com 1.0 Build 117 | TCP/IP | 529.36
Lantronix UDS-10 | W2K/Dev Com 1.0 Build 117 | TCP/IP | 529.36

Figure 4. Device Server Average Latencies

Appendix A. System disclosures

Dell Dimension 8100 (Host PC)

Processor/Speed/Number Of | P4 / 1.5 GHz
System RAM/Type/Slots | 256 MB
Motherboard Manufacturer | Intel
Motherboard Chipset/Model | 82850
Main Bus Type | PCI
L2 Cache | 256 KB
BIOS | Dell A02
HD Model #/Size | 38 GB
HD Controller | Intel 82801BA Ultra ATA Storage Controller
Graphics Adapter | NVIDIA GeForce2 GTS (DDR)
Graphics Driver & Version | NVDISP.DRV / 4.12.01.0634
Graphics Memory (MB, type) | 32 MB
Graphics Chip Type | GeForce2
NIC (Driver) | 3Com 3C920 (3C905C-TX Compatible)
USB Chipset | USB 1.0

Figure 5. Host PC System Configuration Information

VeriTest (www.veritest.com), the testing division of Lionbridge Technologies, Inc., provides outsourced testing solutions that maximize revenue and reduce costs for our clients. For companies who use high-tech products as well as those who produce them, smoothly functioning technology is essential to business success. VeriTest helps our clients identify and correct technology problems in their products and in their line of business applications by providing the widest range of testing services available.

VeriTest created the suite of industry-standard benchmark software that includes WebBench, NetBench, Winstone, and WinBench. We've distributed over 20 million copies of these tools, which are in use at every one of the 2001 Fortune 100 companies. Our Internet BenchMark service provides the definitive ratings for Internet Service Providers in the US, Canada, and the UK.

Under our former names of ZD Labs and eTesting Labs, and as part of VeriTest since July of 2002, we have delivered rigorous, objective, independent testing and analysis for over a decade. With the most knowledgeable staff in the business, testing facilities around the world, and almost 1,600 dedicated network PCs, VeriTest offers our clients the expertise and equipment necessary to meet all their testing needs. For more information, email us at info@veritest.com or call us at 919-380-2800.

Disclaimer of Warranties; Limitation of Liability: VERITEST HAS MADE REASONABLE EFFORTS TO ENSURE THE ACCURACY AND VALIDITY OF ITS TESTING, HOWEVER, VERITEST SPECIFICALLY DISCLAIMS ANY WARRANTY, EXPRESSED OR IMPLIED, RELATING TO THE TEST RESULTS AND ANALYSIS, THEIR ACCURACY, COMPLETENESS OR QUALITY, INCLUDING ANY IMPLIED WARRANTY OF FITNESS FOR ANY PARTICULAR PURPOSE. ALL PERSONS OR ENTITIES RELYING ON THE RESULTS OF ANY TESTING DO SO AT THEIR OWN RISK, AND AGREE THAT VERITEST, ITS EMPLOYEES AND ITS SUBCONTRACTORS SHALL HAVE NO LIABILITY WHATSOEVER FROM ANY CLAIM OF LOSS OR DAMAGE ON ACCOUNT OF ANY ALLEGED ERROR OR DEFECT IN ANY TESTING PROCEDURE OR RESULT. IN NO EVENT SHALL VERITEST BE LIABLE FOR INDIRECT, SPECIAL, INCIDENTAL, OR CONSEQUENTIAL DAMAGES IN CONNECTION WITH ITS TESTING, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. IN NO EVENT SHALL VERITEST'S LIABILITY, INCLUDING FOR DIRECT DAMAGES, EXCEED THE AMOUNTS PAID IN CONNECTION WITH VERITEST'S TESTING. CUSTOMER'S SOLE AND EXCLUSIVE REMEDIES ARE AS SET FORTH HEREIN.