Evaluating Wireless Broadband Gateways for Deployment by Service Provider Customers




Overview

A leading provider of voice, video, and data services to residential and business communities designed its broadband network around the latest advances in network technology. For Wi-Fi, this meant the ability to deliver speed and efficiency while maintaining high-quality performance over distance.

Broadband service providers constantly seek ways to improve their average revenue per user (ARPU) by offering value-added services and multimedia applications on top of traditional voice and data services. Among other things, these services demand high quality and performance from the equipment the service provider deploys.

Offering service in more than a dozen western U.S. states, the provider featured in this case study responded to demand from its customer base to extend broadband access wirelessly across the homes and businesses it serves. To guarantee superior customer satisfaction, the provider decided to qualify wireless broadband gateways, or base stations (WBS), from two different vendors, using Ixia's IxVeriWave Wi-Fi test systems to subject the equipment to the actual network conditions it would encounter in live end-user environments. Testing not only helped the provider assess the quality of the two competing solutions, but also provided a clear picture of how each compared to enterprise-class wireless access points (APs).
The Challenge

The provider had several goals in evaluating wireless broadband gateway performance:

- Determining the ability of each WBS to deliver a superior quality of experience (QoE) for users in a home networking environment
- Assessing the impact of subsequent software releases with minimal manual intervention
- Increasing the quality of the deployed solution, making it as good as or better than leading WLAN access points
- Understanding the latency, packet loss, and throughput characteristics of each WBS under various traffic conditions
- Ensuring customer satisfaction and a superior end-user experience, thus reducing support costs

The provider decided to test throughput, packet loss, and latency to baseline the performance of its WBS options. More importantly, it wished to observe results for various combinations of test parameters, creating real-world tests that were repeatable, automated, and able to run in controlled RF environments. The only viable means of conducting this testing was the IxVeriWave traffic generator/analyzers.

Company: A leading provider of voice, data, and wireless services to residential, business, and government agency users.

Project Overview: Strategic evaluation of wireless broadband gateways used in providing WLAN services.

Key Issues: Requirements included:
- Comparing the performance of two competing broadband gateway solutions
- Assessing application performance under real-world conditions with reproducible results
- Automation and repeatability to optimize evaluation of future firmware releases

Results: Strategic emulation of real-world conditions and precise measurement of application performance revealed that considerable effort was required to raise the quality of experience delivered by both broadband gateway options.

The Solution

The service provider used Ixia's IxVeriWave WaveTest Wi-Fi traffic generator/analyzer test system to verify the performance of its WBS devices in a multi-client environment at the WLAN, IP transport, and application layers. The IxVeriWave system delivered numerous benefits:

- Verification of application performance, with emphasis on voice and video quality and their impact on end-user QoE
- Tests that could be conducted in different scenarios to assess the access network at realistic scale
- High-level, easy-to-understand results and graphs, along with detailed CSV files containing per-flow statistics
- Reduction of test cycle times from 4-6 weeks to about 1-2 weeks, while minimizing the manual intervention required of the testers

The Packet Throughput and Packet Latency tests were run using seven different frame sizes: 88, 128, 256, 512, 1024, 1280, and 1518 bytes. Three traffic directions were also employed: upstream, downstream, and bidirectional.

To achieve its diverse goals, the provider employed the following tests:

1. Packet Throughput Test: Measures the maximum forwarding rate of a WBS or access point at which zero packet loss is observed.
2. Packet Latency Test: Measures the delay imposed on traffic by the WBS; used to determine the key performance metrics for voice, video, and delay-sensitive data traffic.
3. WiMix Enterprise Test: Determines application-layer performance, indicating the gateway's ability to handle voice, video, and data traffic under different deployment conditions. This test was run with ten independent clients and a typical small-office deployment profile.
4. TCP Goodput Test: Creates high-bandwidth TCP traffic of specific MSS sizes from various independent clients through the WBS or access point to determine the maximum forwarding rate for stateful transport-layer data.

Test Bed Topology

- WaveTest 90 traffic generator/analyzer with Wi-Fi and Ethernet traffic generation and analysis modules
- Access point isolation chamber with internal antenna and external SMA connector
- RF chamber connected to the Wi-Fi WaveBlade using an SMA cable
- Ethernet WaveBlade and WBS Ethernet interfaces connected to a Cisco 10/100 Mbps switch
- Management plane separate from the test plane
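The zero-loss search at the heart of the Packet Throughput Test follows the same pattern as a classic RFC 2544 throughput benchmark: bisect the offered load until the highest rate with no observed loss is found. A minimal sketch, assuming a hypothetical `send_trial` callback that runs one fixed-duration trial and reports lost packets (this is illustrative, not the IxVeriWave API):

```python
def zero_loss_throughput(send_trial, line_rate_mbps, resolution=0.1):
    """Binary-search the highest offered load (Mbps) forwarded with
    zero packet loss, in the style of an RFC 2544 throughput test.

    send_trial(rate_mbps) -> packets_lost is assumed to run one
    fixed-duration trial at the given offered rate.
    """
    lo, hi = 0.0, line_rate_mbps
    best = 0.0
    while hi - lo > resolution:
        rate = (lo + hi) / 2.0
        if send_trial(rate) == 0:
            best, lo = rate, rate   # no loss: remember rate, search higher
        else:
            hi = rate               # loss observed: back off
    return best
```

Run once per frame size and traffic direction, this yields the per-frame-size throughput curves shown in the results below.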

Test Results

Packet Throughput Test Results

The packet throughput results for both WBS products were expected to be very high for all frame sizes. Figures 2 and 3 show the results for Vendors A and B respectively. Although measured throughput fell no more than 1.5 Mbps below the maximum theoretical throughput at the 1518-byte frame size, the results clearly indicated there was room for improvement in the ability of both devices to deliver the best possible customer experience.

Figure 2 - Throughput vs. Frame Size, Vendor A results (measured):

Frame size (bytes):  88    128   256   512    1024   1280   1518
Throughput (Mbps):   1.08  2.16  5.01  9.36   16.34  18.64  21.28

Figure 3 - Throughput vs. Frame Size, Vendor B results (measured):

Frame size (bytes):  88    128   256   512    1024   1280   1518
Throughput (Mbps):   2.17  3.17  5.90  10.88  17.51  19.95  22.71

The theoretical throughput of the system, as limited by the physical media, is also indicated on each graph. The System Under Test (SUT) throughput should ideally be as close as possible to the theoretical values.

Packet Latency Test Results

Figure 4 shows the results for Vendor A, summarizing the measured minimum, maximum, and average latency of the SUT at each frame size, accumulated over all trials. Lower values indicate better performance, and a smaller spread between maximum and minimum latency indicates a better-performing SUT datapath. Normal latency values range from 1 to 100 microseconds; values in excess of 20 milliseconds are a cause for concern, as they can pose problems for VoIP traffic. While there were no significant delays for most frame sizes, the delay was close to 20 ms for 256-byte packets. In practice, packet latencies were expected to stay below 20 ms in WLAN networks in order to maintain high performance for delay-sensitive voice and video applications. Given that enterprise-class APs were able to support latencies consistently under 5 ms, there was again room for improvement in the QoE achieved using the WBS.

WiMix Enterprise Test Results

To determine the performance of the WBS under real-world deployment conditions, the provider chose to run the WiMix test. The Enterprise deployment scenario was chosen to verify that the WBS would perform satisfactorily under a variety of application traffic combinations and loads. This was also intended to indicate how much capacity headroom these devices had to accommodate future value-added services in residential deployments.

The WiMix Enterprise tests were run with 10 clients and involved a wide range of application traffic, including FTP, HTTP, VoIP calls, UDP, and RTP. Each traffic type was associated with a default Service Level Agreement (SLA). For example, voice calls were set to meet an SLA of R-Value = 78, roughly equivalent to a MOS of 4; video was set to a Delay Factor of 150 and a Media Loss Ratio of 10; HTTP and FTP goodput targets were set at 50% of ILOAD (intended load).

Figures 5, 6, and 7 show the WiMix test results for Vendor A. The results indicate that the WBS was unable to handle the 7 VoIP calls and 1 MPEG-2 video stream, and failed to deliver the required performance for most of the application traffic: about 50% of the VoIP calls, 100% of the video streams, 75% of the HTTP flows, 50% of the UDP flows, 79% of the TCP flows, and all of the FTP and RTP flows failed to meet their required performance targets. Only 51% of the intended traffic load was admitted by the WBS, and only 32% of that traffic was forwarded successfully, which indicated to the provider that considerable quality improvement was required to support the services planned for deployment.

Figure 5: WiMix Enterprise Test SLA - Vendor A Results

Flow Type    SLA Requirement                                   % Flows that Met SLA
FTP          Per load: 50                                      0
MPEG video   DF: 150, MLR: 10                                  0
HTTP         Per load: 50                                      25
VoIP G.711   SLA mode: R-Value: 78                             50
UDP          Latency: 10000 ms, Jitter: 500 ms, Loss: 10%      50
TCP          Per load: 50                                      21
RTP          Latency: 10000 ms, Jitter: 500 ms, Loss: 10%      0

Figure 6: WiMix Enterprise Test Per-Flow Results - Vendor A

Flow Type    Flows  Layer 7 Results                  ILOAD   OLOAD   Fwd. rate  Latency  Jitter  % Packet
                                                     (kbps)  (kbps)  (kbps)     (msec)   (msec)  Loss
VoIP G.711   14     MOS: 3.16                        87.2    92.8    59.8       69.8     6.1     35.0
MPEG         1      MDI Score: 66.75 msecs: 49.29    2003.3  2044.3  1071.1     135.5    3.6     -
FTP          2      Transfer time: 968.17 s;         100.0   74.0    -          -        -       26.1
                    Goodput: 10.53 kbps
HTTP         4      Goodput: 398.13 kbps             1650.0  539.3   -          -        -       29.3
TCP          14     Goodput: 107.65 kbps             428.6   177.3   -          -        -       29.9
UDP          4      -                                400.5   416.9   308.6      68.0     3.5     26.0
RTP          1      -                                400.0   408.1   235.9      135.1    1.3     51.5
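The voice SLA's R-Value of 78 and the measured MOS figures are related through the standard ITU-T G.107 E-model conversion from R-value to estimated MOS. A minimal sketch of that mapping:

```python
def r_to_mos(r):
    """ITU-T G.107 mapping from an E-model R-value to estimated MOS.

    MOS = 1 + 0.035*R + 7e-6 * R * (R - 60) * (100 - R) for 0 < R < 100,
    clamped to 1.0 below R=0 and 4.5 above R=100.
    """
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    return 1.0 + 0.035 * r + 7.0e-6 * r * (r - 60.0) * (100.0 - r)
```

For R = 78 this gives a MOS of about 3.95, which is why the SLA describes R-Value 78 as roughly equivalent to a MOS of 4.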

Figure 7 shows the total traffic loads averaged across all traffic types for Vendor A:

iload (Total): 19.83 Mbps
oload (Total): 10.21 Mbps
Achieved Load (Total): 6.50 Mbps

Figure 7 - WiMix Enterprise Test Traffic, Vendor A Results

Figures 8, 9, and 10 show the WiMix test results for Vendor B. Once again, the WBS was unable to handle the 7 voice calls and 1 MPEG-2 video stream, and it failed to deliver the required performance for most application traffic: about 50% of the VoIP calls, 100% of the video streams, 100% of the HTTP flows, 50% of the UDP flows, 93% of the TCP flows, and all of the FTP and RTP flows failed to meet their required performance targets. Only 44% of the intended traffic load was admitted by the WBS, and only 26% of that traffic was forwarded successfully. Considerable quality improvement was again clearly required.

Figure 8: WiMix Enterprise Test SLA - Vendor B Results

Flow Type    SLA Requirement                                   % Flows that Met SLA
FTP          Per load: 50                                      0
MPEG video   DF: 150, MLR: 10                                  0
HTTP         Per load: 50                                      0
VoIP G.711   SLA mode: R-Value: 78                             50
UDP          Latency: 10000 ms, Jitter: 500 ms, Loss: 10%      50
TCP          Per load: 50                                      7
RTP          Latency: 10000 ms, Jitter: 500 ms, Loss: 10%      0

Figure 9: WiMix Enterprise Test Per-Flow Results - Vendor B

Flow Type    Flows  Layer 7 Results                  ILOAD   OLOAD   Fwd. rate  Latency  Jitter  % Packet
                                                     (kbps)  (kbps)  (kbps)     (msec)   (msec)  Loss
VoIP G.711   14     MOS: 3.5                         87.2    94.9    72.0       96.4     5.5     24.2
MPEG         1      MDI Score: 80.49 msecs: 46.99    2003.3  2092.5  1166.5     191.8    6.1     -
FTP          2      Transfer time: 453.23 s;         100.0   145.7   -          -        -       21.7
                    Goodput: 73.97 kbps
HTTP         4      Goodput: 138.4 kbps              1650.0  225.6   -          -        -       26.0
TCP          14     Goodput: 52.4 kbps               428.6   154.2   -          -        -       26.2
UDP          4      -                                400.5   426.5   330.1      96.7     4.6     23.3
RTP          1      -                                400.0   417.7   253.2      189.9    2.8     50.1

Figure 10 shows the total traffic loads averaged across all traffic types for Vendor B:

iload (Total): 19.83 Mbps
oload (Total): 8.90 Mbps
Achieved Load (Total): 5.18 Mbps

Figure 10 - WiMix Enterprise Test Traffic, Vendor B Results
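The MDI scores reported for the video streams pair a Delay Factor (DF) with a Media Loss Ratio (MLR), as defined in RFC 4445. A simplified sketch of both components, under the assumption of a virtual receive buffer drained at the nominal media rate (the input names and sample stream are illustrative):

```python
def media_loss_ratio(expected_packets, received_packets, interval_s):
    """MLR (RFC 4445): media packets lost per second of measurement."""
    return (expected_packets - received_packets) / interval_s

def delay_factor_ms(arrival_times_s, packet_bytes, media_rate_bps):
    """DF (RFC 4445), simplified: swing of a virtual buffer that fills
    on each packet arrival and drains at the nominal media rate,
    expressed in milliseconds of stream time."""
    drain = media_rate_bps / 8.0          # bytes drained per second
    buf = lo = hi = 0.0
    t_prev = arrival_times_s[0]
    for t, size in zip(arrival_times_s, packet_bytes):
        buf -= drain * (t - t_prev)       # bytes drained since last arrival
        t_prev = t
        buf += size                       # bytes added by this packet
        lo, hi = min(lo, buf), max(hi, buf)
    return (hi - lo) / drain * 1000.0
```

A perfectly paced stream yields a DF of one packet interval; bursty arrival jitter inflates it, and an MLR above the SLA threshold of 10 (as both vendors showed) fails the video flow outright.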

TCP Goodput Results

The TCP Goodput test results for both vendors were expected to be very high for all MSS sizes. Figures 11 and 12 show the results for Vendors A and B respectively. While no more than 1 Mbps of difference was observed between the two solutions at the largest MSS size, Vendor A performed better on average.

Figure 11 - TCP Goodput Test, Vendor A results (measured):

Max. segment size (bytes):  88    128   256   512   1024   1280   1518
TCP goodput (Mbps):         1.86  2.51  4.73  9.06  16.65  18.12  19.35

The graph shows the TCP goodput of the SUT in Mbps for each TCP maximum segment size in bytes; the theoretical TCP goodput is also indicated.
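The shape of these curves follows from header overhead: at small MSS values, a larger share of every segment is TCP/IP header rather than payload. A minimal sketch of that ceiling, assuming 40 bytes of TCP/IP header per segment and ignoring Wi-Fi MAC/PHY overhead and retransmissions (so real measured goodput sits below this bound):

```python
def tcp_goodput_ceiling(link_mbps, mss_bytes, overhead_bytes=40):
    """Simplified upper bound on TCP goodput: the payload fraction of
    each segment (40-byte TCP/IP header assumed, no options) times the
    available link throughput. Wi-Fi per-frame MAC/PHY overhead and
    TCP retransmissions push measured goodput lower still."""
    return link_mbps * mss_bytes / (mss_bytes + overhead_bytes)
```

For example, on a 25 Mbps link the ceiling at an 88-byte MSS is only about 69% of that at a 1460-byte MSS, mirroring the steep rise in the measured curves above.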

Conclusion

The precise, accurate, and highly repeatable test methodology achieved using Ixia's IxVeriWave test systems enabled the provider to validate and verify the quality of its WBS options, to compare them with enterprise-class APs, and to run numerous variations of the basic benchmarking tests in a short amount of time. Testing allowed the provider to identify specific areas where improvements were needed to increase the quality of its WBS performance, as well as issues occurring on its access points. Using the Ixia system allowed the provider to increase productivity, decrease test cycle time, reduce support costs, and minimize the time needed to bring a new service to market.

Ixia Worldwide Headquarters
26601 Agoura Rd.
Calabasas, CA 91302
(Toll Free North America) 1.877.367.4942
(Outside North America) +1.818.871.1800
(Fax) 818.871.1805
www.ixiacom.com

Ixia European Headquarters
Ixia Technologies Europe Ltd
Clarion House, Norreys Drive
Maidenhead SL6 4FL
United Kingdom
Sales +44 1628 408750
(Fax) +44 1628 639916

Ixia Asia Pacific Headquarters
21 Serangoon North Avenue 3
#04-01
Singapore 554864
Sales +65.6332.0125
Fax +65.6332.0127