Black Book, Edition 10. Wi-Fi Device Testing: Critical Testing for Wireless Client Devices. September 2015. PN Rev A.
Critical Testing for Wireless Client Devices

Your feedback is welcome

Our goal in the preparation of this Black Book was to create high-value, high-quality content. Your feedback is an important ingredient that will help guide our future books. If you have any comments regarding how we could improve the quality of this book, or suggestions for topics to be included in future Black Books, please contact us. Your feedback is greatly appreciated!

Copyright 2015 Ixia. All rights reserved. This publication may not be copied, in whole or in part, without Ixia's consent.

RESTRICTED RIGHTS LEGEND: Use, duplication, or disclosure by the U.S. Government is subject to the restrictions set forth in subparagraph (c)(1)(ii) of the Rights in Technical Data and Computer Software clause at DFARS and FAR.

Ixia, the Ixia logo, and all Ixia brand names and product names in this document are either trademarks or registered trademarks of Ixia in the United States and/or other countries. All other trademarks belong to their respective owners. The information herein is furnished for informational use only, is subject to change by Ixia without notice, and should not be construed as a commitment by Ixia. Ixia assumes no responsibility or liability for any errors or inaccuracies contained in this publication.
Table of Contents

- How to Read this Book
- Dear Reader
- Introduction
- Wi-Fi Device Issues
- Wi-Fi Device Test Challenges
- Test Case 1: Throughput Benchmarking Test
- Test Case 2: Performance Characterization over Packet Sizes
- Test Case 3: Performance Characterization over Distance (Rate vs. Range)
- Test Case 4: Cost of Throughput Analysis
- Test Case 5: Roaming Validation
- Test Case 6: Security Test
- Test Case 7: Ecosystem Test
- Test Case 8: Radio Transmitter Quality
- Test Case 9: Interoperability Testing (Performance Characterization over Distance)
- Contact Ixia
How to Read this Book

The book is structured as several standalone sections that discuss test methodologies by type. Every section starts by introducing the reader to relevant information from a technology and testing perspective. Each test case has the following organization structure:

- Overview: Provides background information specific to the test case.
- Objective: Describes the goal of the test.
- Setup: An illustration of the test configuration highlighting the test ports, simulated elements, and other details.
- Step-by-Step Instructions: Detailed configuration procedures using Ixia test equipment and applications.
- Test Variables: A summary of the key test parameters that affect the test's performance and scale. These can be modified to construct other tests.
- Results Analysis: Provides the background useful for test result analysis, explaining the metrics and providing examples of expected results.
- Troubleshooting and Diagnostics: Provides guidance on how to troubleshoot common issues.
- Conclusions: Summarizes the result of the test.

Typographic Conventions

In this document, the following conventions are used to indicate items that are selected or typed by you:

- Bold items are those that you select or click on. Bold is also used to indicate text found on the current GUI screen.
- Italicized items are those that you type.
Dear Reader

Ixia's Black Books include a number of IP and wireless test methodologies that will help you become familiar with new technologies and the key testing issues associated with them. The Black Books can be considered primers on technology and testing. They include test methodologies that can be used to verify device and system functionality and performance. The methodologies are universally applicable to any test equipment. Step-by-step instructions using Ixia's test platform and applications are used to demonstrate the test methodology.

This tenth edition of the Black Books includes twenty-five volumes covering key technologies and test methodologies:

- Volume 1: Higher Speed Ethernet
- Volume 2: QoS Validation
- Volume 3: Advanced MPLS
- Volume 4: LTE Evolved Packet Core
- Volume 5: Application Delivery
- Volume 6: Voice over IP
- Volume 7: Converged Data Center
- Volume 8: Test Automation
- Volume 9: Converged Network Adapters
- Volume 10: Carrier Ethernet
- Volume 11: Ethernet Synchronization
- Volume 12: IPv6 Transition Technologies
- Volume 13: Video over IP
- Volume 14: Network Security
- Volume 15: MPLS-TP
- Volume 16: Ultra Low Latency (ULL) Testing
- Volume 17: Impairments
- Volume 18: LTE Access
- Volume 19: 802.11ac Wi-Fi Benchmarking
- Volume 20: SDN/OpenFlow
- Volume 21: Network Convergence Testing
- Volume 22: Testing Contact Centers
- Volume 23: Automotive Ethernet
- Volume 24: Audio Video Bridging
- Volume 25: Wi-Fi Client Device Testing

A soft copy of each of the chapters of the books and the associated test configurations are available on Ixia's Black Book website; registration is required to access this section of the website.

Ixia is committed to helping our customers' networks perform at their highest level, so that end users get the best application experience. We hope this Black Book series provides valuable insight into the evolution of our industry, and helps customers deploy applications and network services in a physical, virtual, or hybrid network configuration.
Bethany Mayer, Ixia President and CEO
Critical Testing for Wireless Client Devices: Test Methodologies

Wi-Fi has become the de facto standard of communication for local area networks. It's fast, flexible, and cheap, which has led to a proliferation of Wi-Fi devices in the market. While this trend is overall a good thing for the market, a large number of devices still exhibit issues resulting in a poor experience for the end user. This booklet aims to address this quality gap by proposing various test methodologies to verify the performance, functionality, and security resiliency of Wi-Fi client devices.
Introduction

Over 10 billion Wi-Fi enabled devices have been shipped to date, and this number is projected to grow at 10% for years to come. Although a majority of Wi-Fi device shipments today are made up of smartphones, tablets, e-readers, and laptops, there is a growing trend of Wi-Fi becoming the access technology for several application-specific devices, including:

- Home: security cameras, set-top boxes and media players, thermostats, etc.
- Hospitals: patient monitors, infusion pumps, oxygen monitoring devices, etc.
- Industry: machine diagnostics, sensors, smart grids, etc.

For most of these devices, good Wi-Fi connectivity is critical to their functioning and to the quality of experience delivered to the end user.

Figure 1: Wi-Fi Devices
Wi-Fi Device Issues

Wi-Fi devices have been around for over a decade now; however, their use cases have come a long way from the early days. These new use cases impose several requirements that the technology was not originally designed to address. Wi-Fi technology has been evolving to keep up with these new use cases, but it is not immune to issues. From a test perspective, it is these issues that need to be targeted comprehensively.

- Unlicensed frequencies: Because Wi-Fi operates in unlicensed frequencies, devices typically have to contend with interference from other devices operating in the same bands. This can come from Bluetooth, microwave ovens, DECT phones, and other Wi-Fi devices.
- Lack of deployment standards: There are no standard deployment models, which leads to a lot of variation between deployments. Devices have to cope with these variations when they operate.
- Legacy devices: Wi-Fi standards evolve quickly. This model is possible because the standard imposes backwards compatibility on devices. Wi-Fi devices often have to operate in an ecosystem with several legacy devices. This causes a range of issues, because legacy devices operate at different PHY rates and usually occupy the medium for much longer periods.
- Roaming issues: Handover/roaming is a relatively new concept in Wi-Fi. Originally, Wi-Fi was designed for fixed or nomadic wireless access; however, due to its recent popularity in the enterprise, several new use cases now require a seamless handover. Moreover, the device is completely responsible for planning and executing the handover, which creates an extremely challenging scenario.
- AP/device interoperability issues: Interoperability issues exist in any technology; Wi-Fi is not immune either.
- QoS: Wi-Fi networks mostly carried best-effort data traffic in the beginning; however, with the surge in popularity, several new use cases require QoS. This is extremely challenging in Wi-Fi networks because of the lack of centralized control.
- Radio resource management: Transmitting data and transmitting data efficiently are different concepts. Efficient transmission is increasingly becoming a focal point, as it leads to better overall network utilization and minimal impact on the battery.
- Battery performance: Most Wi-Fi devices are battery operated. Devices that don't optimize their transmission algorithms will find that they spend too many cycles in transmissions and retransmissions. Optimizing transmission is key to better battery performance.
- Radar compliance: Regulatory bodies impose restrictions on the usage of certain frequencies at certain times. This functionality is also referred to as Dynamic Frequency Selection (DFS). Devices exhibit many issues when it comes to DFS, because it involves dynamically switching from one channel to another.
- Antenna design: Antenna design is a complex subject; if not done right, device performance can vary vastly depending on the device's orientation or its interaction with the environment.
Wi-Fi Device Test Challenges

Tools and equipment

Prior to Ixia's foray into device testing, Wi-Fi device test labs had to use several test tools from various vendors:

- Real APs: access points that the DUT can connect to.
- Traffic generation tools: tools to generate different types of traffic.
- Programmable attenuator: controls attenuation to set different path loss.
- Channel emulator: emulates different channel conditions, such as home, small/large office, outdoor, etc. The channel conditions vary in each of these deployment environments.
- Packet capture/sniffing and protocol analyzer: captures wireless traffic and helps decode and analyze it.
- Signal and spectrum analyzer: tools to analyze the transmitted radio frequency signal.

This hodge-podge solution had several issues:

- Complexity: imagine working with six vendors!
- Usability: users have to train themselves on all of these different products with different interfaces.
- Predictability: accounting for failure is six times harder when dealing with different pieces.
- Time: the time-cost to put together and maintain these different pieces.
- Cost: and finally, $$$.

With Ixia's WaveDevice solution, this hodge-podge approach should now be in the rear view.
Testing Methodology and Focus

Unlike cellular technologies, Wi-Fi has no test-focused specifications for client devices. This space is relatively new compared to the cellular world, where several generations of technologies have been released and the market has evolved to standards-based test approaches. When testing Wi-Fi devices, three main technology areas should be assessed:

PHY Layer

This layer is responsible for converting information into RF signals, and for transmitting them between the source and the destination. To ensure successful communication, the transmitter and receiver of the device need to perform well and conform to all necessary specifications. On the functional side, the transmitter needs to make sure that the transmitted RF energy is confined to the allowed spectrum mask within the frequency band of operation. It also needs to ramp power up and down within spec to ensure any given transmission does not interfere with the next one. Because a proper PHY layer is critical to end-user quality of experience on a Wi-Fi device, RF testing is an essential step. On the performance side, the transmitter needs proper modulation accuracy at different modulation rates, while the receiver needs a low Error Vector Magnitude (EVM) when receiving at various data rates and power levels.

MAC Layer

Unlike other wireless access technologies such as LTE, where the base station makes most of the decisions for the device, the Wi-Fi protocol requires the device to make many decisions on its own. The device must decide:

- When to transmit
- How to contend for and acquire the channel
- How to roam between access points
- How and when to rate adapt
- How and when to use power-save mechanisms

A typical Wi-Fi device is expected to be a lot smarter, and needs to implement several complex algorithms at the MAC layer.
Hence, the MAC layer on a device needs to be tested thoroughly for both functionality and performance. On the functional side, the device needs to be tested to make sure it can roam, rate adapt, and connect to the AP using the proper security mechanisms, and that it can only connect to APs with matching credentials. On the performance side, it's important to test that the device can optimize its resources to maximize throughput, implement proper traffic classification under load, and minimize battery consumption.
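The channel-contention decision above is one of the complex MAC algorithms a device must implement. A minimal sketch (illustrative only, not Ixia code) of the binary exponential backoff used by 802.11 DCF contention, with the standard OFDM contention-window bounds:

```python
import random

# CWmin/CWmax for OFDM PHYs per the 802.11 standard.
CW_MIN, CW_MAX = 15, 1023

def backoff_slots(retry_count):
    """Pick a random backoff (in slots) for a given retransmission attempt.

    The contention window doubles after every failed transmission
    (binary exponential backoff), capped at CW_MAX.
    """
    cw = min((CW_MIN + 1) * (2 ** retry_count) - 1, CW_MAX)
    return random.randint(0, cw)

# First attempt draws from [0, 15]; the 4th retry draws from [0, 255];
# from the 6th retry on, the window is capped at [0, 1023].
assert 0 <= backoff_slots(0) <= 15
assert 0 <= backoff_slots(4) <= 255
```

A device that gets this logic wrong either starves itself of airtime (backing off too aggressively) or collides repeatedly with neighbors, which is exactly the kind of behavior Stage 1 MAC testing is meant to expose.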
Application Layer

This is what the user sees and interacts with. Here, we need to look at several aspects of application performance, including issues such as seamless LTE to Wi-Fi handover, delay-sensitive Unified Communications (UC) applications, high-definition video streaming over Wi-Fi, and the like.

One very important point to note is that bad RF performance or bad MAC layer performance will result in a bad user experience with an application. It's important to thoroughly test and harden a device at each one of these layers in isolation, and then test the system as a whole.

Ixia recommends a staged approach to testing. It is important to baseline the performance of the Device Under Test (DUT) under ideal conditions, and to find and fix issues. Testing should then progress by introducing one variable at a time, moving from the most deterministic to the most realistic test conditions. Testing can be divided into three major stages:
Design/Development/QA Testing (Stage 1)

During Stage 1 of testing (development test and QA), configurability, repeatability, stress, and automation are very important to achieving maximum test coverage in the minimum amount of time. This stage of testing is best addressed by using a piece of test equipment that can simulate a Wi-Fi access point. The key tests that need to be run in Stage 1 include measuring radio performance, validating device connectivity, measuring raw throughput, and ensuring protocol conformance. Stage 1 also includes other MAC and PHY protocol-related aspects such as roaming, rate adaptation, power-save protocols, and security. Extensive test coverage is critical during this stage, and covering numerous test cases in a small amount of time is essential for an on-time release of a high-quality product in a highly competitive market.

The WaveDevice Golden AP solution combines hardware and software. The hardware includes an 802.11a/b/g/ac Golden AP emulator, a full line-rate traffic generator, a channel and distance emulator, a line-rate real-time protocol sniffer, and a line-rate, real-time signal generator and analyzer. The test hardware connects to an RF enclosure using RF cables; the device under test is placed inside the RF enclosure. The DUT runs simple endpoint software, called WaveAgent, that sits at the transport layer on the device. WaveAgent receives commands from the Ixia WaveDevice hardware to send and receive different types of traffic at different rates, making precise performance measurements.

Interoperability Testing (Stage 2)

Stage 2 of testing can begin during the post-production phase of the development life-cycle. The device is now fully developed and must be tested as a system prior to release. During this stage, it is important to subject the device to more realistic test conditions, including testing against real APs and testing over the air.
In Stage 2, the client device under test has to be tested against the most common real APs to make sure that it can work well with those APs in the field. The key tests in Stage 2 include TCP/UDP/VoIP upstream/downstream performance at different frame sizes and rates, and on different frequency channels with different settings on the AP and the client.

The client device under test is connected using RF cables to the AP through the IxVeriWave RF Management Unit. Both the AP and the client are placed in separate RF enclosures to create an isolated, fully controllable, and repeatable test environment. The testbed also includes the IxVeriWave WT-90/92, which houses two 802.11ac wireless cards that can capture all the traffic between the AP and the client on the wireless interface and perform expert analysis to isolate and identify PHY/MAC-level issues with the AP and the client device. The WT-90/92 chassis also includes an Ethernet card that is connected to the Ethernet interface of the AP and acts as one of the endpoints for traffic. The second endpoint is the WaveAgent software installed on the device under test. The WaveDevice software application can create TCP/UDP/VoIP traffic in both upstream and downstream directions and measure end-to-end Key Performance Indicators (KPIs). The RF Management Unit can also programmatically simulate distance between the AP and the client, and thus allows the software application to run performance-over-distance tests.

All the components of the testbed are fully integrated and can be controlled from an easy-to-use GUI. The user can run automated tests with various combinations of test settings and watch the results in real time.

Field Testing (Stage 3)

Stage 3 occurs when the device is deployed in a live network. Here, it is important to characterize the behavior of the device in real-world conditions, and to find and fix the small percentage of issues that fell through the cracks during lab testing.
IxVeriWave users can also validate that the network into which the client device is being deployed has no major issues and can support the reliable operation of the device.
In Stage 3 of testing, the goal is to evaluate the device's performance in the field. Stage 1 and Stage 2 testing provide an excellent platform for test engineers to do everything possible in the lab and ship an excellent product. However, there will almost certainly be some issues that only show up in the field.

While testing Wi-Fi networks and devices in the field, the common misconception is that good RF coverage means happy users. A device could be getting excellent signal strength at all locations on the floor but still have poor performance. There could be several reasons for this: maybe the device is getting an excellent signal in general but is not connected to the best available AP; maybe the traffic load is not balanced across all the APs, resulting in low throughput on the device; maybe all the neighboring devices are communicating only on the 2.4 GHz band even though there is ample free bandwidth available on the 5 GHz band; or maybe it is not even a wireless problem, and some policy- or role-based misconfiguration on the wired network is causing poor performance.

Ixia's WaveDeploy test tool allows customers to run active site assessments from real devices using the same WaveAgent software used in Stage 1 and Stage 2 testing. These assessments measure the voice, video, and data performance of real devices at various locations on the deployment floor under very real usage conditions. From extensive testing conducted in the field over several sites, it becomes clear that traditional RSSI-based survey testing is not sufficient. Coverage doesn't mean capacity. It is very important to:

- Measure application performance
- Use the real client device in the test
- Test in the actual deployment site
Only then can users find the issues in the field that were not found in the Stage 1 and Stage 2 lab testing.

Device Test Methodology Summary

- Start with a standard solution offering, such as a reference design from a silicon manufacturer or an embedded Wi-Fi module.
- Customize the generic solution to hit the performance, power, reliability, and physical requirements of your client device.
- Exhaustively test the behaviors of the device to ensure that the Wi-Fi customizations have not compromised functionality or performance. Identifying and addressing a handful of issues at this stage can save a lot of angry phone calls and customer support trips after deployment.
- Tune the design: test, then adjust, then test again. Continue until you reliably achieve the behavior you need.
- Once the device is rock-solid on its own, ensure that it interoperates with the network under all possible realistic environments while testing in the lab.
- Finally, test the end user's network while performing a major rollout to ensure that the network can support a high-quality client transaction and that there are no site-specific issues.
Test Case 1: Throughput Benchmarking Test

Overview

Devices come in various shapes and sizes, but one function common to all of them is the ability to transmit, receive, and process traffic. The throughput benchmark test provides a concise sketch of the overall performance of the device. It indirectly validates the radio hardware, RF signal chain, device driver, OS, and application-level performance, all in a single test. In Wi-Fi, the overall throughput of a device can be impacted by several factors, some of which are listed below:

- Transmit power
- Ecosystem traffic
- Packet sizes
- Frame aggregation
- Number of spatial streams (SISO/MIMO)
- TCP vs. UDP

To be able to make sense of the results, it's essential to keep the test variables to a minimum and under control during each trial. A testbed like Ixia's WaveDevice Golden AP gives users this control while enabling them to execute various tests.

Objective

Benchmark the maximum downstream and upstream throughput of a given device.

Setup

The setup consists of an AP with traffic generation and control capabilities, a chamber to isolate the RF environment from external RF sources, and a DUT with an agent that can be controlled to generate data traffic.
Step-by-step Instructions

1. Launch the IxVeriWave Golden AP (WaveDevice). The workflow for configuring a test is outlined in the left frame of the GUI: System (chassis and port assignment), Access Points (AP configuration), Devices (devices and tests), and Analysis (results analysis). Please refer to the user guide to familiarize yourself with the WaveDevice GUI.
2. Enter the IP address of the chassis that hosts the Golden AP card. Click Connect when done.
3. Select the Golden AP card to be used for simulating the AP, and click Reserve at the bottom of the screen.
4. Set the channel information for the simulated AP. Note: channel selection can have an impact on AP configuration parameters like AP Type and Bandwidth.
5. Switch to the Access Point configuration page to configure the simulated AP. You can leave most of the parameters at their default values.

General tab:
- Port: make sure this matches what you have reserved.
- SSID: Blackbook_Exercise
- Default Tx Power: leave at the default to start with; adjust based on RSSI feedback from the device.

Data and Beacon PHY Rates tab:
- This is the screen to set the maximum supported data and management PHY rates of the simulated AP. The DUT used for this exercise is an Apple iPhone 6, which is a SISO device that supports 256-QAM. Therefore, the AP will be configured to support MCS 8 and 9. Note: disable antennas 2, 3, and 4 in the configuration screen if they are not connected to the device.
- Under OFDM, leave the default settings as-is, as they offer the best compatibility. Also set the Beacon PHY Rate to 6 Mbps for maximum compatibility. The Beacon PHY Rate sets the PHY rate for management frames transmitted by the AP.
Note: setting a low beacon PHY rate will impact the maximum throughput the device under test can achieve. If you are sure your DUT supports higher management PHY rates, you can override this setting.

- Under VHT Rates, set:
  - NSS 1: MCS 0-9
  - NSS 2: Not Supported
  - NSS 3: Not Supported
  - NSS 4: Not Supported

Advanced tab:
- Leave all remaining parameters at their default values. Aggregation parameters will be discussed in more detail in a separate test case.

6. Activate the AP and switch to the Devices page. Clicking Activate AP will begin beacon transmission from the AP. The beacons will be transmitted at the Tx Power level set in the AP configuration screen (default value 15 dBm).

7. From the DUT, join the wireless network and start WaveAgent.
8. Select the DUT from the devices that show up in the summary screen.

9. Pick the GDPT option from the Test Type drop-down menu and configure it with the following parameters to drive maximum throughput. The General Data Plane Test (GDPT) is designed to characterize the performance of a device by subjecting it to different types of traffic, and is an ideal test to benchmark the throughput performance of a device under test.

- Traffic Type: UDP and TCP
- Traffic Direction: Downstream and Upstream
- Frame Size: set to the MTU value, 1518, for best results.
- Frame Rate: set to 100% of the theoretical frame rate. The theoretical frame rate is derived from several configuration parameters, such as channel bandwidth, maximum data PHY rate, guard interval, and aggregation settings. For an 80 MHz 802.11ac SISO device with support for 256-QAM modulation (MCS 9) and a short guard interval, the theoretical rate works out to 433.3 Mbps.
- Under Options, set the trial duration to 60 seconds. Trial duration can be increased for long-duration and stability tests.
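The 433.3 Mbps figure in step 9 can be reproduced from the 802.11ac VHT PHY parameters. A hedged sketch (the constants are the standard VHT values for an 80 MHz channel, MCS 9, short guard interval; this is not WaveDevice code):

```python
# 802.11ac VHT PHY parameters for an 80 MHz channel at MCS 9 with SGI.
DATA_SUBCARRIERS_80MHZ = 234   # N_SD for an 80 MHz VHT channel
BITS_PER_SUBCARRIER = 8        # 256-QAM carries 8 bits per subcarrier
CODING_RATE = 5 / 6            # MCS 9 coding rate
SYMBOL_TIME_SGI_US = 3.6       # 3.2 us symbol + 0.4 us short guard interval

def vht_phy_rate_mbps(n_ss=1):
    """Theoretical VHT PHY rate in Mbps for n_ss spatial streams."""
    bits_per_symbol = (DATA_SUBCARRIERS_80MHZ * BITS_PER_SUBCARRIER
                       * CODING_RATE * n_ss)
    return bits_per_symbol / SYMBOL_TIME_SGI_US

print(round(vht_phy_rate_mbps(1), 1))  # 433.3 for the SISO iPhone 6 case
```

For a 2-stream device the same formula gives 866.7 Mbps, which is why keeping the number of spatial streams under control matters for a throughput benchmark.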
10. Start the test by clicking the Start Test icon in the ribbon on top.

Result Analysis

When the test starts executing, monitoring stats begin populating simultaneously. Monitoring stats are retrieved from the IxVeriWave cards (RFA/WBA) as well as from the WaveAgent running on the DUT. Retrieved stats are presented in the WaveDevice GUI in three categories:

- Flow Stats: stats measuring the active traffic flows
- Client Stats: stats pertaining to each client or DUT
- Port Stats: stats measured at the port, which include all clients and APs using the specific IxVeriWave hardware port

Stats are also stored as CSV files on the hard disk. An analysis module called View Measurement can analyze results by correlating stats and presenting them as bar or line graphs.

GDPT tests run as trials; for each trial, the set of key configuration parameters is highlighted as the trial progresses. In this test there are four trials:

- UDP downstream with a packet size of 1518 bytes
- UDP upstream with a packet size of 1518 bytes
- TCP downstream with a packet size of 1518 bytes
- TCP upstream with a packet size of 1518 bytes

When each trial starts executing, the first set of stats to look at is offered load and forwarding rate. But first, some background on the stats used here:
Statistics Terminology

Intended load: the throughput intended to be generated. For tests like GDPT and RvR, it is computed as a percentage of the theoretical PHY rate, and it is configurable in the test.

Theoretical frame rate: an estimate of the maximum PHY throughput achievable, based on the current Golden AP configuration. It is derived from several configuration parameters, such as channel bandwidth, maximum data PHY rate, guard interval, and a few more. This value is an estimate, because the exact value cannot be determined without taking into account clients and their behavior.

Offered load:
- Downstream: this stat represents the L4 traffic load generated at the simulated distribution system. As the L4 traffic is generated in the same hardware as the simulated Golden AP, this stat also represents the throughput load at L2 of the AP. Moreover, because of the way the Wi-Fi MAC works, the system limits the load value to traffic successfully ACKed by L2 of the DUT. So offered load is the L4 traffic load generated at the simulated AP that is also successfully ACKed at L2 of the DUT. The difference between offered load and intended load can be attributed to system overhead (management frames, contention, retransmission, etc.) and receiver performance.
- Upstream: this stat represents the L4 traffic generated by WaveAgent. WaveAgent relies on the DUT operating system's TCP/IP stack to send the traffic out. If for whatever reason there is a bottleneck in the transmission path and the OS is unable to keep up with the traffic generated by WaveAgent, this will be reflected in a lower offered load.

Forwarding rate:
- Downstream: this stat represents the effective traffic that reaches WaveAgent. If any traffic is dropped between L2 and L4 of the DUT, it will be reflected in the delta between forwarding rate and offered load.
- Upstream: this stat represents the traffic received by the simulated AP (L2). Because the entire simulated AP subsystem is in the same hardware, there are no packet losses between layers of the simulated AP. Therefore, this can also be interpreted as the L4 traffic at the distribution system.
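These statistics relate to each other by simple arithmetic: packet loss is the delta between offered load and forwarding rate, expressed as a percentage of the offered load. A minimal sketch (the Mbps figures below are hypothetical, chosen only to illustrate the calculation):

```python
def loss_percent(offered_load_mbps, forwarding_rate_mbps):
    """Percentage of the offered load that never reached the receiver."""
    if offered_load_mbps == 0:
        return 0.0
    delta = offered_load_mbps - forwarding_rate_mbps
    return 100.0 * delta / offered_load_mbps

# Hypothetical trial: 390.0 Mbps offered vs. 389.1 Mbps forwarded.
print(round(loss_percent(390.0, 389.1), 2))  # 0.23
```

The same delta interpretation applies in both directions; only the layer at which the loss occurs differs (L2-to-L4 of the DUT downstream, L4-to-L2 of the DUT upstream).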
Trial 1: UDP downstream with a packet size of 1518 bytes

To begin with, check the offered load and forwarding rate for the given trial. Also take note of medium utilization, failed ACK frames (L2 frame errors in DL), and FCS errors (L2 frame errors in UL).

Observation 1: Throughput

Offered load and forwarding rate, measured in Mbps, are pretty close.
The difference between offered load and forwarding rate is generally attributed to packet loss across the various interfaces of the stack. Note that the L2 frame errors are all in one direction only. That's because traffic flows only in the downstream direction, and even though the client transmits some management frames upstream, they don't result in any L2 errors.

Observation 2: Packet loss

Packet loss and the corresponding L2 frame errors are low: 0.23% and 1.8%, respectively. L2 frame errors can be related to the DUT not being able to keep up with received traffic rates, problems processing aggregated frames (A-MPDUs), or MCS decoding errors. Minimizing L2 frame errors will improve overall throughput performance.
Observation 3: Aggregation

In the downstream direction, the simulated AP aggregates packets based on negotiations with the DUT. Aggregating MPDUs (MAC-layer protocol data units) results in overall higher throughput, as there is less overhead in acquiring the medium for transmitting the same amount of data. As we can see from the result below, aggregation performance was quite good; all Tx packets were aggregated in the maximum aggregation bucket. Overall, these results reflect very good performance.
Trial 2 - UDP Upstream with packet size of 1518 bytes

To begin with, check the offered load and forwarding rate for the given trial. Also take note of the medium utilization, failed ACK frames (L2 frame errors in DL), and FCS errors (L2 frame errors in UL).

Observation 1 - Throughput

Once again, offered load and forwarding rate are very close ( Mbps vs Mbps).
The difference between offered load and forwarding rate is generally attributed to packet loss across the various interfaces of the stack. Note that L2 frame errors appear in only one direction. That's because traffic flows only in the upstream direction, and even though the Golden AP transmits some management frames downstream, they don't result in any L2 errors.

Observation 2 - Packet loss

Packet loss and the corresponding L2 frame errors are low: 0.31% and 0%, respectively. In the upstream direction, L2 frame errors occur due to transmitter issues or low signal quality. Packet loss is measured as the number of packets lost between transmitter and receiver; because L2 takes care of retransmission in case of errors, packet loss equals the packets lost between the layers of the DUT (L4-L2).
Observation 3 - Aggregation

In the upstream direction, the DUT has to aggregate packets. Aggregating MPDUs (MAC protocol data units) results in overall higher throughput, as there is less overhead in acquiring the medium to transmit the same amount of data. As we can see from the result below, aggregation performance was OK but not maximized: each A-MPDU had roughly 32 MPDUs aggregated. Effective throughput could have been higher if more MPDUs were aggregated. In real life, though, DUTs have to balance high throughput against the overhead associated with retransmission; from that angle, aggregating fewer MPDUs can be seen as a balancing act to optimize throughput. Overall, these results reflect very good performance.
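The ~32-MPDU average can be read off the aggregation histogram reported for the trial. A rough sketch of that calculation; the histogram values below are invented for illustration, not taken from this run:

```python
def avg_mpdus_per_ampdu(ampdu_histogram):
    """ampdu_histogram maps 'MPDUs per A-MPDU' -> count of A-MPDUs
    observed in that bucket, as shown in the aggregation stats."""
    total_mpdus = sum(size * count for size, count in ampdu_histogram.items())
    total_ampdus = sum(ampdu_histogram.values())
    return total_mpdus / total_ampdus

# Hypothetical upstream histogram: most A-MPDUs carry about 32 MPDUs
hist = {16: 100, 32: 800, 64: 100}
print(avg_mpdus_per_ampdu(hist))  # 33.6
```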
Trial 3 - TCP Downstream with packet size of 1518 bytes

To begin with, check the offered load and forwarding rate for the given trial. Also take note of the medium utilization, failed ACK frames (L2 frame errors in DL), and FCS errors (L2 frame errors in UL).

Observation 1 - Throughput

Offered load and forwarding rate match ( Mbps), but the overall throughput is lower compared to UDP. This is expected for TCP. Note that L2 errors exist in both directions (unlike UDP); this is because in TCP the ACKs coming back also take up medium resources.
Observation 2 - Packet loss

Packet loss and the corresponding L2 frame errors are low: 0.23% and 1.8%, respectively. L2 frame errors can be related to the DUT not being able to keep up with received traffic rates, problems processing aggregated frames (A-MPDUs), or MCS decoding errors. Minimizing L2 frame errors will improve overall throughput performance.
Observation 3 - Aggregation

In the downstream direction, the simulated AP aggregates packets based on its negotiations with the DUT. Aggregating MPDUs (MAC protocol data units) results in overall higher throughput, as there is less overhead in acquiring the medium to transmit the same amount of data. As we can see from the result below, aggregation performance was quite good; all Tx packets fell in the maximum aggregation bucket. Overall, these results reflect very good performance.
Trial 4 - TCP Upstream with packet size of 1518 bytes

This trial is not analyzed here, as the same techniques described in the prior trials can be applied to analyze its results.

Troubleshooting and Diagnostics

Symptom: Low throughput (DL)
Diagnosis:
1. Check for L2 frame errors. High L2 frame errors imply any one of the following:
   - DUT unable to keep up with high traffic rates
   - DUT framing issues with aggregated packets
   - DUT unable to decode a specific MCS at certain power levels (even though this is an L1 issue, it is reported as an L2 frame error)
2. Check packet loss; packet loss can be a result of inter-layer processing errors.
Comments: Assumes testing is done in an RF-isolated chamber, so interference is not an issue.

Symptom: Low throughput (UL)
Diagnosis:
1. Check for L2 frame errors. High L2 frame errors imply any one of the following:
   - DUT transmitter issue
   - Tx power level below the Golden AP's receiver sensitivity
2. Check packet loss; packet loss can be a result of inter-layer processing errors.
3. Check aggregation performance.
Comments: Assumes testing is done in an RF-isolated chamber, so interference is not an issue.

Test Variables
- Different packet sizes
- Device orientation (TRP/TIS)
Conclusion

Based on the analysis done above, we conclude that the maximum throughput for this device has been:

UDP
- Downstream: 375 Mbps
- Upstream: 325 Mbps

TCP
- Downstream: 216 Mbps

By using the Golden AP and a conducted setup, we were able to get consistent results with regard to maximum throughput.
Test Case 2: Performance Characterization over Packet Sizes

Overview

Many wireless devices are designed for specific applications: medical devices, point-of-sale terminals, VoIP phones, and so on. The traffic pattern in such devices is generally quite deterministic, so these devices need to be benchmarked with specific traffic profiles. General-purpose devices (laptops, tablets, etc.), on the other hand, deal with wide-ranging traffic patterns, and they too need to be benchmarked with different traffic profiles. There have been many instances where device performance deteriorates for specific packet sizes. It is therefore important to characterize a device's performance with different traffic characteristics.

Objective

Characterize device performance over different packet sizes.

Setup

Step-by-step Instructions

1. Follow steps 1-8 defined in TC1 to configure a simulated AP.
2. Pick the GDPT option from the Test Type drop-down menu and configure it with the following parameters. The General Data Plane Test is designed to characterize the performance of a device by subjecting it to different types of traffic, which makes it an ideal benchmark for a device under test.
   - Traffic Type: UDP and TCP
   - Traffic Direction: downstream and upstream
   - Frame Size: Set to MTU values 128, 256, 512, 1024, and 1518 for a full sweep of different packet sizes
   - Frame Rate: Set 100% of the theoretical frame rate. The theoretical frame rate is derived from several configuration parameters, such as channel bandwidth, max data PHY rate, guard interval, and aggregation settings. For an 80 MHz 802.11ac SISO device with support for 256-QAM modulation (MCS 9) and short guard interval, the theoretical frame rate works out to 433.3 Mbps.
   - Under Options, set the trial duration to 60 seconds. The trial duration can be increased for long-duration and stability tests.
3. Start the test and begin monitoring statistics.

Result Analysis

Tests are broken into trials, which lock the configuration down during execution. For this test case there are 10 trials. For the sake of brevity, we will focus our analysis on UDP traffic only.

Observation 1 - Offered Load

From the graph below, the first thing that becomes evident is that the offered load decreases with decreasing frame size. Offered load represents the traffic successfully transferred between the L2 (MAC) of the AP and the DUT. Refer to the statistics terminology in Test Case 1 for more information on offered load. The second point that becomes evident is that offered load is higher in the downstream direction than in the upstream direction.
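As an aside, the 433.3 Mbps theoretical rate quoted in the setup above can be reproduced from standard 802.11ac parameters. A sketch assuming a VHT80 OFDM symbol with 234 data subcarriers and a 3.6 µs short-GI symbol duration:

```python
# 802.11ac (VHT) data-rate parameters for an 80 MHz channel
DATA_SUBCARRIERS_80MHZ = 234   # data tones in a VHT80 OFDM symbol
SYMBOL_US_SGI = 3.6            # symbol duration with short guard interval (µs)

def vht_phy_rate_mbps(bits_per_subcarrier, coding_rate, n_ss,
                      data_subcarriers=DATA_SUBCARRIERS_80MHZ,
                      symbol_us=SYMBOL_US_SGI):
    """Data bits carried per symbol, divided by symbol time -> Mbps."""
    bits_per_symbol = data_subcarriers * bits_per_subcarrier * coding_rate * n_ss
    return bits_per_symbol / symbol_us

# MCS 9 = 256-QAM (8 bits/subcarrier) with rate-5/6 coding, 1 spatial stream
print(round(vht_phy_rate_mbps(8, 5 / 6, 1), 1))  # 433.3
```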
Observation 2 - Forwarding rate

The overall forwarding rate also tracks offered load closely, with a small percentage of packet loss, which we will analyze next.
Observation 3 - L2 Frame Errors and Packet loss

L2 frame errors represent transmission issues between the L2 of the AP and the DUT. These errors are an overhead for the system, as they increase retries and slow down the overall system. Note: L2 frame errors don't directly result in packet loss, as the stack takes care of retransmissions. In the run below, it is evident that L2 errors increase drastically (almost 40X) in the downstream direction as the frame size decreases. Packet loss, on the other hand, is nominal. In the downstream direction, packet loss is highest for the biggest frame size (1518 bytes), while in the upstream direction packet loss is fairly even across the board.
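One way to see why small frames stress the DUT harder: at a fixed offered load, the number of frames per second (and therefore per-frame MAC operations) grows inversely with frame size. A rough sketch that ignores MAC/PHY overhead; the 400 Mbps load is a hypothetical illustration, not a value from this run:

```python
def frames_per_second(load_mbps, frame_bytes):
    """Lower bound on frames/s needed to sustain a load (overhead ignored)."""
    return load_mbps * 1e6 / (frame_bytes * 8)

# Hypothetical 400 Mbps load across the swept frame sizes
for size in (1518, 512, 128):
    print(size, round(frames_per_second(400, size)))
# 128-byte frames require roughly 12x the frame rate of 1518-byte frames
```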
Observation 4 - Aggregation

Aggregation stats represent the number of MAC layer PDUs aggregated by the system, presented in both the upstream and downstream directions. From the graph below, it becomes evident that in the upstream direction the DUT aggregates MPDUs differently for different packet sizes. In the downstream direction, the simulated AP always aggregates at a constant rate of 64 MPDUs. This could be one of the reasons why the L2 frame errors are high in the downstream direction, especially for smaller frame sizes.
Troubleshooting and Diagnostics

Symptom: Low throughput (DL)
Diagnosis:
1. Check for L2 frame errors. High L2 frame errors imply any one of the following:
   - DUT unable to keep up with high traffic rates
   - DUT framing issues with aggregated packets
   - DUT unable to decode a specific MCS at certain power levels (even though this is an L1 issue, it is reported as an L2 frame error)
Comments: Assumes testing is done in an RF-isolated chamber, so interference is not an issue.

Symptom: Low throughput (UL)
Diagnosis:
1. Check for L2 frame errors. High L2 frame errors imply any one of the following:
   - DUT transmitter issue
   - Tx power level below the Golden AP's receiver sensitivity
2. Check packet loss; packet loss can be a result of inter-layer processing errors.
3. Check aggregation performance.
Comments: Assumes testing is done in an RF-isolated chamber, so interference is not an issue.
Test Variables
- Frame rates
- More packet sizes
- Longer-duration tests

Conclusion

Based on the analysis done above, we conclude that for the given DUT:
- The bigger the frame size, the better the throughput performance.
- The number of L2 frame errors is much lower for bigger frame sizes than for small frame sizes; this translates to more overhead for smaller frame sizes.
- Bigger frame sizes achieve a forwarding rate closer to the intended load than smaller frame sizes.
- Overall performance is much better for larger frame sizes.
Test Case 3: Performance Characterization over Distance - Rate vs Range

Overview

Wi-Fi technology is very commonly used in residential and enterprise scenarios to carry delay-sensitive, high-bandwidth real-time voice and video traffic. Since the devices connected to the access point are wireless, they can be located at different distances from the AP. It is important to make sure that users get a good quality of experience at different distances from the access point.

Objective

Characterize the performance of a device over various distance profiles.

Setup

Step-by-step Instructions

1. Follow steps 1-5 of Test Case 1.
2. Switch to the Access Point configuration to configure the simulated AP. You can leave most of the parameters at their default values.
   General Tab
   - Port: Select the Golden AP port reserved in the ports page.
   - SSID: Blackbook_Exercise
   - Default Tx Power: Set this value to the default and adjust if required, depending on feedback from the device.
   Data and Beacon PHY rates tab
   - This is the screen to set the max supported data and management PHY rates of the simulated AP. The DUT used for this exercise is a Nexus 6, which is a MIMO 2x2 device that supports 256-QAM. Therefore the AP will be configured to support Nss 1 and Nss 2 with MCS 0-9. Disable antennas 3 and 4 in the configuration screen if they are not connected to the device.
   - Under OFDM, leave the default settings as is; these are mandatory supported rates according to the specification. You can change these settings if required. Note: Setting a low beacon PHY rate will impact the maximum throughput the device under test can achieve. If you are sure your DUT supports higher management PHY rates, you can override this setting.
   - Under VHT Rates, set:
     - NSS 1: MCS 0-9
     - NSS 2: MCS 0-9
     - NSS 3: Not Supported
     - NSS 4: Not Supported
   Advanced
   - Leave all remaining parameters at their default values.
   - Aggregation parameters will be discussed in more detail in a separate test case.
3. Activate the AP and switch to the Devices page. Clicking Activate AP will begin beacon transmission from the AP.
4. From the DUT, join the wireless network and start WaveAgent.
5. Select the DUT from the devices that show up in the summary screen.
Note: The RSSI value shown for each device is reported by the WaveAgent endpoint installed on the DUT; the initial path loss can be estimated from the Tx power configured in the Golden AP and the RSSI value reported by WaveAgent: TxPower - RSSI.

6. Pick the Rate vs. Range test from the Test Type drop-down menu and configure the test with the following parameters. The Rate vs. Range test is designed to characterize the device's receiver performance at different distances from the access point while locking the data rate. This provides a fair idea of receiver performance for the given modulation and coding scheme (MCS)/data rate. The test can be configured with multiple data rates, so you can benchmark receiver performance for different MCS/data rates at different distances in a single run. The Golden AP uses Tx power to simulate distance; the Tx power can be configured with a minimum step of 1 dB.

RvR Configuration
Test Configuration
- Tx PHY Rates: Nss 1 MCS 0, Nss 1 MCS 9, Nss 2 MCS 3, and Nss 2 MCS 6. Note: Ideally this test case should be run with all possible MCS values to fully sweep and identify any issues. However, to keep the content relevant for this Black Book, the test has been configured to target certain key MCS indexes.
- Tx Powers: 0 dBm to -40 dBm with a step size of -2 dBm. Note: The Tx power value can be configured with a minimum step size of 1 dB in the range of +15 dBm to -50 dBm.
- Traffic Type: UDP
- Traffic Direction: Downstream
- Frame Size: Set the frame size value to 1518.
- Frame Rate: Set 50% of the theoretical frame rate, to target reasonable throughput values.

7. Start the test by clicking the Start Test icon in the ribbon on top.

Result Analysis

Path loss in the setup: It is always recommended to measure the path loss between the AP and the DUT before measuring receiver performance. The initial estimated path loss can be computed from the Tx power configured in the Golden AP and the RSSI value reported by WaveAgent. In this example the initial estimated path loss is 44 dB.

When the test starts executing, monitoring stats will begin populating simultaneously. Monitoring stats are retrieved from the IxVeriWave cards (e.g., RFA/WBA3601) as well as the WaveAgent running on the DUT. Retrieved stats are presented in the WaveDevice GUI in 3 categories:
- Flow Stats: stats measuring the active traffic flows
- Client Stats: stats pertaining to each client or DUT
- Port Stats: stats measured at the port, which include all clients and APs using the specific IxVeriWave HW port
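The 44 dB path-loss estimate follows directly from the TxPower - RSSI relation described earlier. A one-line sketch; the +15 dBm Tx power and -29 dBm RSSI readings are hypothetical values that happen to give 44 dB, not measurements from this setup:

```python
def estimated_path_loss_db(tx_power_dbm, rssi_dbm):
    """Initial path-loss estimate: configured Golden AP Tx power
    minus the RSSI reported by WaveAgent on the DUT."""
    return tx_power_dbm - rssi_dbm

# Hypothetical readings: +15 dBm Tx power, -29 dBm reported RSSI
print(estimated_path_loss_db(15, -29))  # 44
```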
Stats are also stored as CSV files for each trial on the host PC. There is an analysis module called View Measurement that can analyze results by correlating stats and presenting them as bar or line graphs. Along with other UI graphs, trial results are also available in table format with a great deal of information.

RvR tests go by trials; for each trial, the set of key configuration parameters is highlighted as the trial progresses. In this test there are 84 trials: the number of trials is derived from the number of Tx PHY rates times the number of Tx powers times the number of frame rates. For the sake of brevity, we will pick some key trials for analysis. This should be sufficient to give an idea of how to go about analyzing results from the RvR test.

When each trial starts executing, the first set of stats to look at would be Offered Load and Forwarding Rate, L2 Frame Errors, and Medium Utilization. The following chart shows the typical receiver sensitivity for 802.11ac modulation and coding schemes with channel widths of 20/40/80 and 160 MHz. Most receivers today exceed these performance metrics quite comfortably. If the DUT does not have any specifications on its receiver sensitivity, this reference chart should provide some guidance on how to evaluate the device.

Observation 1 - Throughput Analysis

Note: Please refer to Test Case 1 - Statistics Terminology for more info on the terms used here.

Use the View Measurement module to plot Offered Load and Forwarding Rate graphs over distance (represented as decreasing Tx power). The first observation is that offered load maps closely to intended load. This implies the DUT is receiving the target throughput from an L2 perspective.
Second, the DUT's L2 performance at NSS-2-MCS-6 is slightly better than at NSS-1-MCS-9, which matches our expectation. The DUT exhibits the expected waterfall curve when it comes to L2 performance.
Here a key issue is noticed: the forwarding rate drops close to 0 for NSS-2-MCS-6. Offered load for the same modulation was 250 Mbps, which implies that at L2 the DUT's success rate was close to 100%. However, between L2 and L4 of the DUT, there was over 99% packet loss. These packets will not be retransmitted, as the test was using UDP packets. This translates to a poor quality of experience for the user. At -16 dBm Tx power, there is again a dip in forwarding rate for NSS-2-MCS-6. NSS 1 MCS 9 and NSS 2 MCS 3 also show drops in forwarding rate at certain Tx powers, which recover at adjacent lower power values.
Finally, when comparing results with the packet loss graph, it is once again clear there is an issue for NSS-2-MCS-6 at certain power levels.

Observation 2 - L2 Frame Errors

L2 frame errors are related to the DUT not being able to keep up with received traffic rates, problems processing aggregated frames (A-MPDUs), or MCS decoding errors at certain Tx powers. L2 errors yo-yo up and down for a range of power levels, after which they go all the way up. This behavior clearly demonstrates that the DUT is not able to decode/acknowledge frames at a specific Tx power and Tx PHY rate. This will result in more Layer 2 retries and will ultimately increase the cost of throughput by using additional air time. Since Wi-Fi is a shared medium, this impacts not only the device's performance but overall Wi-Fi network performance.

Observation 3 - Jitter

Increased path loss may result in high jitter values. High variation in jitter across different power levels can cause degraded voice/video quality at different distances from the access point.
- High jitter values were observed with Nss 1 MCS 0, the lowest configured Tx PHY rate, irrespective of Tx power.
- Higher Tx PHY rates resulted in low jitter values, which shows that the device is receiving a continuous stream.
Troubleshooting and Diagnostics

Symptom: Low forwarding rate
Diagnosis:
1. Check for L2 frame errors. High L2 frame errors imply any one of the following:
   - DUT unable to keep up with high traffic rates
   - DUT framing issues with aggregated packets
   - DUT unable to decode a specific MCS at certain power levels (even though this is an L1 issue, it is reported as an L2 frame error)
2. Check packet loss; packet loss can be a result of inter-layer processing errors.
Comments: Assumes testing is done in an RF-isolated chamber, so interference is not an issue.

Test Variables
- Modulation and Coding Scheme/Tx PHY rates
- Tx power
- Frame rates
Conclusion

Based on the analysis done above, we can conclude that the DUT clearly exhibited receiver issues at some modulation rates and power levels. This calls for deeper analysis to understand and remediate the issues. The forwarding rate of a device at different distances can be heavily influenced by several factors:
- Modulation and Coding Scheme/Tx PHY rate
- Path loss between AP and client
- Device receiver sensitivity
- Device's ability to process frames at the receive data rate

There can be other factors as well, but assuming the test bed is isolated and kept in optimum conditions, the above factors are the key influencers.
Test Case 4: Cost of Throughput Analysis

Overview

Wi-Fi devices operate in a shared-access medium, and they have to cooperate and coexist with several other devices. Wi-Fi, like Ethernet, uses a distributed access scheme, with the small difference that it uses CSMA/CA (carrier sense multiple access with collision avoidance) to control access to the medium. However, collisions still occur. Moreover, there are several other challenges that the MAC layer has to deal with that add to the overhead.

Cost of throughput is a lesser-known but very important metric that represents the performance of the device in terms of overhead to the DUT as well as to the overall system. In simple terms, cost of throughput reflects the proportional use of the medium (medium utilization) required to transfer a given amount of data. There will always be some cost associated with throughput; the aim is to minimize it. Because the medium is shared, a higher cost of throughput implies a burden for all devices using the medium.

Objective

Validate that cost of throughput improves with increasing modulation rate.

Setup

Step-by-step Instructions

1. Follow steps 1-8 defined in TC1 to configure a simulated AP.
2. Pick the Simple test for this exercise. It supports locking down Tx power, MCS, and data rate for a given trial.
3. Configure and collect results for the following 4 trials:
   - Trial 1: MCS=3, Tx Power=5, Data rate=50 Mbps
   - Trial 2: MCS=5, Tx Power=5, Data rate=50 Mbps
   - Trial 3: MCS=8, Tx Power=5, Data rate=50 Mbps
   - Trial 4: MCS=9, Tx Power=5, Data rate=50 Mbps
4. The Tx power setting needs to correspond to an optimum RSSI. In this case, Tx Power = 5 results in an RSSI of -30 dBm at the device.

Result Analysis

Observation 1 - MCS 3 @ 50 Mbps

As the test execution starts, begin monitoring real-time stats. First make sure the forwarding rate matches the target throughput.
- Forwarding rate: Mbps

Next, check the Tx Flow Medium Utilization.
- Tx Flow Medium Utilization %: 79.4

This represents the amount of medium resources taken up by the DUT to transmit 50 Mbps successfully. It includes retransmissions; as the retransmission rate is quite low in this case, the medium utilization is mostly made up of the cost of transmitting 50 Mbps.
- Tx Failed ACK Frame rate (pps): 34

Observation 2 - MCS 5 @ 50 Mbps

Same as above. Note down the Forwarding Rate, Medium Utilization, and Tx Failed ACK Frame rate.
- Forwarding rate: 50 Mbps
- Tx Flow Medium Utilization %: 62.6
- Tx Failed ACK Frame rate (pps): 58

Observation 3 - MCS 8 @ 50 Mbps

Same as above. Note down the Forwarding Rate, Medium Utilization, and Tx Failed ACK Frame rate.
- Forwarding rate: Mbps
- Tx Flow Medium Utilization %: 56.4
- Tx Failed ACK Frame rate (pps): 95
Observation 4 - MCS 9 @ 50 Mbps

Same as above. Note down the Forwarding Rate, Medium Utilization, and Tx Failed ACK Frame rate.
- Forwarding rate: Mbps
- Tx Flow Medium Utilization %: 73
- Tx Failed ACK Frame rate (pps): 1467
Troubleshooting and Diagnostics

Symptom: High medium utilization
Diagnosis:
- Check L2 frame errors (or Tx failed ACK rate): if that is high, medium utilization can be impacted.
- Check the modulation rate: if that is set low, medium utilization can again be impacted.
- Check the target throughput: setting it high can result in high medium utilization.
Comments: Assumes testing is done in an RF-isolated chamber, so interference is not an issue.

Test Variables
- Check results for different MCS rates
- Check different packet sizes
Conclusion

The expected result, in general, is that medium utilization goes down with higher-order modulation rates, because a high MCS results in more efficient encoding at the PHY layer and therefore better overall performance. Looking closer at the results, it becomes clear that medium utilization trends down from MCS 3 to MCS 8. However, for MCS 9 it shoots back up. This is because of the number of L2 errors and the associated retransmissions, which are indeed quite high for MCS 9.
- Medium Utilization for MCS 3 = 79.4
- Medium Utilization for MCS 5 = 62.6
- Medium Utilization for MCS 8 = 56.4
- Medium Utilization for MCS 9 = 73

With this data we can conclude that MCS 9 performance for the given Tx power is not on par with expectations. The overall cost of throughput is much higher for MCS 9.
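The conclusion can be made more explicit by normalizing medium utilization per delivered Mbps, which is the cost of throughput itself. A small sketch using the four measured utilization values at the 50 Mbps target (the normalization itself is our illustration, not a tool-reported metric):

```python
# Medium utilization measured in the four trials, all at a 50 Mbps target
medium_util = {"MCS 3": 79.4, "MCS 5": 62.6, "MCS 8": 56.4, "MCS 9": 73.0}
TARGET_MBPS = 50.0

# Cost of throughput: medium-utilization percentage points spent
# per Mbps successfully delivered (lower is better)
cost = {mcs: mu / TARGET_MBPS for mcs, mu in medium_util.items()}
for mcs, c in sorted(cost.items(), key=lambda kv: kv[1]):
    print(f"{mcs}: {c:.3f} %/Mbps")
# MCS 8 is the cheapest; MCS 9 regresses despite its higher PHY rate
```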
Test Case 5: Roaming Validation

Overview

Roaming is the ability of a device to move from one AP to another while keeping an active network session. Roaming is now very common in most commercial deployments; users typically move within a campus and expect their connection to stay up. When it comes to the roaming function, the network only plays a small part (this is changing somewhat with 802.11r and 802.11k): the device makes the key decisions on when to roam and where to roam. The complexity arises from the fact that the active connection has to be maintained and serviced in parallel with completing the roaming process. The standards don't address roaming, so every vendor has their own implementation. This makes the roaming function particularly susceptible to failures and interoperability issues. Testing roaming should be at the top of a device vendor's test plan, as it can impact the user's experience quite significantly.

Objective

Validate the roaming success rate of a device roaming between APs on channels 48 and 44.

Setup
Step-by-step Instructions

1. Launch the IxVeriWave Golden AP WaveDevice. The workflow for configuring a test is outlined in the left frame of the GUI: System (chassis and port assignment), Access Points (AP configuration), Devices (devices and tests), and Analysis (results analysis). Please refer to the user guide to familiarize yourself with the WaveDevice GUI.
2. Enter the IP address of the chassis that hosts the Golden AP card. Click Connect when done.
3. Select and reserve the Golden AP cards (IxAP) to be used for simulating the roaming test. For this example, we select 2 IxAP cards.
4. Set the channel information for the assigned cards. Set them to channels 44 and 48, for example. Note: channel selection can have an impact on AP configuration parameters, such as AP Type and Bandwidth.
5. Switch to the Access Point configuration to configure the simulated AP. You can leave most of the parameters at their default values.
   General Tab
   - Count: 5 (for each port)
   - Port: <Make sure this matches what has been reserved>
   - SSID: Blackbook_Exercise
   - Default Tx Power: <leave as default to start with; adjust based on RSSI feedback from device>

   Data and Beacon PHY rates tab
   - This is the screen to set the max supported data and management PHY rates of the simulated AP. The DUT used for this exercise is an Apple iPhone 6, which is a SISO device that supports 256-QAM. Therefore, the AP will be configured to support MCS 8 and 9. Note: Disable antennas 2, 3, and 4 in the configuration screen if they are not connected to the device.
   - Under OFDM, leave the default settings as is, as they offer maximum compatibility. Also set the Beacon PHY Rate to 6 Mbps for maximum compatibility; the Beacon PHY rate sets the PHY rate for management frames transmitted by the AP. Note: Setting a low beacon PHY rate will impact the maximum throughput the device under test can achieve. If you are sure your DUT supports higher management PHY rates, you can override this setting.
   - Under VHT Rates, set:
     - NSS 1: MCS 0-9
     - NSS 2: Not Supported
     - NSS 3: Not Supported
     - NSS 4: Not Supported

   Advanced
   - Leave all remaining parameters at their default values.
   - Aggregation parameters will be discussed in more detail in a separate test case.
6. Activate the AP and switch to the Devices page. Clicking Activate AP in the toolbar will begin beacon transmission from the AP. The beacons will be transmitted at the Tx power level set in the AP config screen (default value 15 dBm).
7. From the DUT, join the wireless network and start WaveAgent.
8. Pick the Roaming test for this exercise. The Roaming test is designed to simulate various roaming scenarios, such as:
   - 2-AP roam back and forth
   - Multi-AP roam
   - Intra-channel
   - Inter-channel
   - Mix of intra- and inter-channel roam
   - Roam in the presence of neighbor APs

   Roaming simulation works by adjusting Tx power levels between a source AP and a target AP: the test engine steps down the power level in the source AP and steps up the power level in the target AP. These power transitions occur at regular intervals, which can be
configured in the test. When the device roams, the test engine determines whether the roam was successful and then calculates a roam delay.

Select the device and configure roaming with the following settings:
- Path: Inter-channel Roam. This will automatically create a roam path of APs with alternating channels.
- Repeat: 3, to get sufficient trials. This will cycle through the roam path 3 times.
- Return to source AP: check.
- Continue on fail: check. If a roam fails, i.e., the device doesn't end up on the target AP, this will continue the test and try to recover from the failure for future trials.
- Min Power: Set this to -50 dBm. This will be the lowest Tx power level applied when stepping down the power in the source AP. Currently -50 is a hardware limitation as well; external attenuators can be used if more attenuation is needed.
- Max Power: Set this to +15 dBm. This will be the maximum Tx power setting applied to a target AP when stepping up power. This is also a hardware limitation.
- Power Step: 1 dB. Every: 1000 ms. Power Step and Every are best interpreted together: the power program will step power up/down in the target/source AP based on these values.
- Neighbor APs Enable: Uncheck. This is an advanced configuration designed to validate the roaming algorithm's performance with regard to picking the right AP. This will be covered in another test.
- Estimated Attenuation: This setting is only exposed for devices that don't report an RSSI. For such devices, enter the approximate path loss; the test engine will use this to work out the estimated RSSI based on the changing Tx power.

Start the test.
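The power program described above (1 dB steps every 1000 ms between +15 dBm and -50 dBm) can be sketched as a simple schedule; the mirror-image ramp would raise the target AP's power on the same timeline. This is only an illustration of the stated parameters, not the tool's actual implementation:

```python
def power_ramp(start_dbm=15, floor_dbm=-50, step_db=1, interval_ms=1000):
    """Tx-power schedule applied to the source AP: a list of
    (time_ms, power_dbm) pairs stepping down until the floor."""
    t_ms, power, schedule = 0, start_dbm, []
    while power >= floor_dbm:
        schedule.append((t_ms, power))
        power -= step_db
        t_ms += interval_ms
    return schedule

ramp = power_ramp()
print(len(ramp), ramp[0], ramp[-1])  # 66 steps from (0, 15) to (65000, -50)
```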
Result Analysis
When the test starts, the roaming dashboard begins populating. The roaming dashboard tracks BSSID, Channel, TxPower, and Estimated RSSI, and it automatically calculates Roam Delay and Trial Status. Real-time monitoring stats are also available in parallel for deeper analysis.
Observation 1 - Roam Summary
The roaming dashboard presents a run-time view of key statistics while the roam simulation is in progress. TxPower and Est. RSSI are updated every time they change, and they pause at the moment the device roams. Soon after the roam, the status column indicates Passed or Failed. A successful roam is one in which the device switches to the configured target AP and is able to resume the active traffic session.
Based on the table above, this device roamed successfully in each of the 6 trials. The roam delay was between 30 ms and 60 ms for each trial.
Observation 2 - Forwarding Rate vs. Packet Loss
Forwarding rate is the effective throughput that a device is able to achieve; packet loss tracks the number of packets lost during the last sampled period. These stats give good insight into the impact on the user's quality of experience. High packet loss results in high medium utilization, in other words increased overhead to sustain the same forwarding rate. Based on the graph below (ForwardingRate vs. RxFlow1PacketLossNumber), this device showed very little impact on forwarding rate during the roam. This could also be because the intended load was not set high.
Observation 3 - TxPHYDataRate
The client device's TxPHYDataRate represents the link rate that the device picks for transmission. A number of factors determine this selection, chief among them the quality of previous transmissions. Looking at the graph below, the TxPHYDataRate remains mostly steady around 38 Mbps, but it also drops to a lower PHY rate quite often. Before we analyze this further, it helps to understand how the test system simulates roaming. The GoldenAP simulates roaming by controlling transmit power, which primarily affects traffic in the downstream direction. This roaming test, however, only runs traffic in the upstream direction, so the simulation shouldn't have much of an impact on the client's TxPHYDataRate, and the fluctuating rate doesn't make sense. This is an issue that requires further investigation.
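The dashboard's Roam Delay is the interruption in the active traffic session across the roam. The exact method the test engine uses isn't described here, but a common way to estimate it is as the largest inter-arrival gap in the received test flow around the roam event; a sketch under that assumption:

```python
def roam_delay_ms(rx_timestamps_ms):
    """Estimate roam delay as the largest inter-arrival gap in the received
    traffic stream. rx_timestamps_ms: receive times (ms) of the test flow's
    packets, in arrival order, spanning the roam event."""
    gaps = [b - a for a, b in zip(rx_timestamps_ms, rx_timestamps_ms[1:])]
    return max(gaps)

# Hypothetical flow: 10 ms packet spacing with a ~45 ms interruption at the
# roam, consistent with the 30-60 ms delays seen in this test.
times = [0, 10, 20, 30, 75, 85, 95]
```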
(Chart: TxDataPHYRate)
Troubleshooting and Diagnostics
Symptom: Roaming failure.
Diagnosis: Check the call flow and see if the device probes all active channels. Check if the device sends association request messages to the target AP.
Test Variables
- Number of roam trials
- Inter-channel and intra-channel roams
- Roam frequency
- Traffic types
- Traffic direction
Conclusion
Overall, the device under test performed well: all the roams were successful and the roam delay was under 100 ms, which is the benchmark for voice traffic. However, the client's TxPHYDataRate fluctuated quite a bit, which requires further investigation.
Test Case 6: Security Test
Overview
Wireless security covers two key functions: authentication and encryption. Today, most wireless networks operate with some form of 802.1X-based authentication scheme and an 802.11i-based encryption scheme. It is therefore important to measure the impact of these settings on the performance of the device. For instance, a device's performance with and without encryption might vary quite a bit. The same goes for roaming, power save, and other functionality.
Objective
Measure the impact of AES-CCMP encryption on effective performance.
Setup
Step-by-step Instructions
1. Follow steps 1-8 defined in TC1 to configure a simulated AP.
2. Select a GDPT test and set up a simple configuration to benchmark throughput: UDP traffic with a 1518-byte packet size, in both the Upstream and Downstream directions.
3. Start the test and collect results.
4. Re-run the same test configuration with security turned ON: in the Access Point configuration, enable security and set a password.
5. Start the test and collect results.
Result Analysis
To analyze this test case, we will compare the results of the same test configuration with security turned ON and OFF.
Observation 1 - Security Setup (Authentication and Encryption)
When a device associates with the AP, it goes through a security handshake to complete authentication and set up encryption. This can be analyzed by capturing a trace and viewing it in Wireshark. WaveDevice supports packet capture in real time.
Note: this step is not necessary, but it is always good practice to validate that the security setup was successful.
Observation 2 - Forwarding Rate
Forwarding rate is the effective throughput a device is able to achieve in the test. The results below indicate that throughput is lower in both the Upstream and Downstream directions when security is turned ON.

Direction   | Encryption ON | Encryption OFF | Delta
Upstream    | 312 Mbps      | 327 Mbps       | 5% drop
Downstream  | 316 Mbps      | 373 Mbps       | 16% drop

As you can see, in the upstream direction there is a 5% drop in forwarding rate when security is turned ON, and in the downstream direction a 16% drop.
Figure 2: Security Turned ON
Figure 3: Security Turned OFF
Let's continue analyzing forwarding rate together with packet loss and L2 frame errors.
Observation 3 - L2 Frame Errors
L2 frame errors indicate the number of transmission errors that occur between the AP and the DUT. The results below indicate that, in the downstream direction, L2 frame errors are significantly higher when encryption is turned ON: 16.24% vs. 1.89%.
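Both the delta column in the forwarding-rate table and the L2 frame error percentages above are simple ratios; for clarity:

```python
def pct_drop(off_mbps, on_mbps):
    """Relative forwarding-rate drop (percent) when security is enabled."""
    return (off_mbps - on_mbps) / off_mbps * 100

def l2_error_rate(errored_frames, total_frames):
    """L2 frame error percentage over a sample period."""
    return errored_frames / total_frames * 100

upstream_drop = pct_drop(327, 312)    # ~4.6%, reported as a 5% drop
downstream_drop = pct_drop(373, 316)  # ~15.3%, the table reports 16%
```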
Figure 4: Security Turned ON
Figure 5: Security Turned OFF
The L2 frame errors are unusually high when security is turned ON; this requires deeper investigation at the device level.
Observation 4 - Comparison with Lower Load
In the previous observation we noted that L2 frame errors jumped from 1.89% to an unusually high 16.24% when security was turned ON, and this had a direct impact on the end result. One possible explanation is that the DUT was unable to keep up with the frame rate when encryption was turned ON. To validate this claim, we need to cross-check with a lower intended load. The earlier test was run with the intended load set to 100% of the theoretical frame rate, so we cross-check with 25% of the theoretical frame rate. The following are results from the test run with security turned ON.
The following are results with security turned OFF.
From the results above:
- When the frame rate is set to 25% of the theoretical frame rate, the forwarding rate is not impacted by security (unlike at 100% of the theoretical frame rate).
- When security is turned OFF at 100% of the theoretical frame rate, the L2 frame errors drop drastically from 16% to 1.8%. This directly correlates with the higher downstream forwarding rate.
- When security is turned OFF at 25% of the theoretical frame rate, the L2 frame errors increase drastically from 1.8% to 22%. This is an anomaly: even after re-running the test several times, the results are consistent. This is also something to investigate.
Troubleshooting and Diagnostics
Symptom: Authentication failure.
Diagnosis: Download the capture file and analyze the call flow between the DUT and the AP.
Symptom: Poor forwarding rate.
Diagnosis: Check related stats such as L2 frame errors and packet loss.
Conclusion
Turning ON encryption takes up some processing resources and is bound to have an impact on the performance of the device. It is important to understand this impact, and in the case of this device we can conclude that a couple of key issues were noticed:
- In the downstream direction, at 100% of the theoretical frame rate with security turned ON, the device is unable to keep up with the generated traffic.
- In the downstream direction, at 25% of the theoretical frame rate with security turned OFF, the L2 errors were much higher than expected.
Test Case 7: Ecosystem Test
Overview
Most devices operate in an ecosystem with several other devices. Performance in this ecosystem depends heavily on the ability to acquire the medium and transmit successfully. This is quite complex, as Wi-Fi uses a distributed coordination mechanism, and more devices bring more complexity. It is therefore important to model these deployment scenarios in the lab, both to understand the impact on the device's performance and to optimize that performance.
Objective
Determine the performance of the DUT in the presence of 10 other devices connected to the same Access Point.
Setup
Step-by-step Instructions
1. Follow steps 1-8 defined in TC1 to configure a simulated AP. In Step 3, instead of reserving just one port, reserve a second IxClient port as well. The IxClient port will be used for simulating the ecosystem clients.
2. Select the Simple Test from the Devices page. Currently, the Simple test is the only test that supports ecosystem client simulation. As the DUT is being tested for its ability to acquire the medium and transmit, the traffic configuration targets upstream traffic.
- Traffic Direction: Upstream
- Frame Size: 1518 bytes
- Data Rate: 30 Mbps
- Client Count: 10 (number of ecosystem clients simulated by Ixia's IxClient cards)
3. Start the test.
Result Analysis
When the test starts, monitoring stats automatically begin reporting. Two columns are created: one reporting stats from the DUT and another reporting stats from the simulated IxClients.
Observation 1 - Throughput Impact
(Charts: Forwarding Rate - DUT; Forwarding Rate per IxClient)
Per the configuration, each client offers 30 Mbps of upstream traffic, and the clients have to contend with each other to acquire the medium and transmit. Based on the graph, we can see that the performance of the DUT was much lower than that of the simulated IxClients: each IxClient was able to achieve around 20 Mbps of forwarding rate, whereas the DUT's forwarding rate was under 10 Mbps. This is certainly an area for optimization.
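With 11 stations (the DUT plus 10 simulated clients) each offering 30 Mbps, the aggregate offered load exceeds what the channel can carry, so per-station throughput is set by contention. A rough fair-share sketch — the usable channel capacity here is an illustrative assumption, not a measured value:

```python
def fair_share(channel_capacity_mbps, offered_per_sta_mbps, n_stations):
    """Per-station throughput under ideal equal medium sharing: each station
    gets the lesser of its offered load and an equal slice of capacity."""
    return min(offered_per_sta_mbps, channel_capacity_mbps / n_stations)

# Assuming ~220 Mbps of usable capacity shared by 11 contending stations,
# the ideal fair share is 20 Mbps each -- roughly what each IxClient
# achieved, and double what the DUT managed.
share = fair_share(220, 30, 11)
```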
Observation 2 - DUT Transmit Rate
(Charts: TxDataPHYRate; TxDataMCSRate)
The DUT starts with a high MCS and TxPHYDataRate, but begins trending downward soon after and ends up at a low MCS of 1, or a TxPHYDataRate of 27 Mbps, for the majority of the test. This low rate had a significant impact on the net result, the low forwarding rate: it takes much longer to transmit the same amount of data at lower rates. Devices typically adapt their transmit rates to changing channel conditions and frame-transmission quality feedback, but in this case the DUT made some aggressive moves while adapting, which resulted in a poor forwarding rate.
Test Variables
- Add more simulated APs to the mix
- Add more clients
- Add bidirectional traffic
Conclusion
Based on the results, it is clear the DUT has trouble performing well under busy deployment conditions. The simulated IxClients performed much better than the DUT in the same ecosystem. The DUT made some aggressive moves in switching Tx rates while rate-adapting, which impacted the forwarding rate.
Test Case 8: Radio Transmitter Quality
Overview
For any client device to work well and meet expectations, it needs a solid foundation in the form of a very good transmitter and receiver. The transmitter should produce high-quality signals at different transmit power settings and at different modulation rates. Similarly, the receiver should meet or beat the spec in its ability to successfully receive and decode all the data at different RSSI values and modulation rates. It is very important that both the transmitter and receiver meet specifications, at least under ideal test conditions.
Objective
Validate the quality of the transmitter by measuring Error Vector Magnitude (EVM) at the receiver while the transmitter transmits at different data rates.
Setup
Step-by-step Instructions
1. Follow steps 1-8 defined in TC1 to configure a simulated AP.
2. Pick the Simple test for this exercise.
3. Configure the test with Upstream UDP traffic.
4. For the first trial, set up a traffic flow from the DUT to the simulated AP at 1 Mbps.
5. Measure the EVM of the traffic stream on the simulated AP using the WaveAnalyze application, as shown in the screenshot below. (This functionality is only available on the RFA L1-7 hardware.)
In the example above, the DUT was transmitting at MCS 7 with a 40 MHz channel bandwidth, and the 5-second moving maximum EVM was measured at 3.9%, which is well within the spec requirement for MCS 7 of 4.47%.
6. Now repeat the same test, but with a 100 Mbps traffic load from the device under test to the Golden AP, and make the same EVM measurements. The results can be seen below: in this case the measured EVM was 6.02%, which is well above the 4.47% spec.
Result Analysis
The device's transmitter quality clearly degraded substantially when the data rate increased from 1 Mbps to 100 Mbps. Note that the theoretical throughput of the DUT is much higher than 100 Mbps, so in theory it should be perfectly capable of transmitting at 100 Mbps without any issues. But as we can see from the results, at a high data rate (which is very common when devices carry applications like HD video) the transmitter quality degrades substantially, leaving the receiver unable to decode frames properly. The receiver then cannot acknowledge the frames, causing the transmitter to retransmit extensively, which substantially increases the cost of throughput.
Troubleshooting and Diagnostics
Symptom: High EVM values.
Diagnosis: Poor transmitter quality under certain test conditions. Test antenna placement and the quality of the various radio components under various conditions. Look for problems caused by interference from multiple radios and radio technologies (such as Wi-Fi, LTE, and Bluetooth) placed too close to each other.
Comments: Testing is done under ideal test conditions with a fully cabled and isolated setup.
Test Variables
- Check results for different data rates
Conclusion
Devices need excellent transmitter quality to avoid a high cost of throughput when transmitting at high data rates, which happens very commonly with applications like streaming HD video.
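EVM limits are often quoted in dB in the 802.11 specifications; the 4.47% limit used in this test case corresponds to -27 dB. The conversion between the two forms is pure arithmetic:

```python
import math

def evm_pct_to_db(evm_pct):
    """Convert an EVM percentage to dB: EVM_dB = 20 * log10(EVM / 100)."""
    return 20 * math.log10(evm_pct / 100)

def evm_db_to_pct(evm_db):
    """Convert an EVM in dB back to a percentage."""
    return 100 * 10 ** (evm_db / 20)

# -27 dB is ~4.47%. The 3.9% measured in the first trial (~-28.2 dB) is
# below that limit, while the 6.02% measured at 100 Mbps (~-24.4 dB) is not.
```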
Test Case 9: Interoperability Testing - Performance Characterization over Distance
Overview
Wi-Fi technology is commonly used in residential and enterprise scenarios to carry delay-sensitive, high-bandwidth real-time voice and video traffic. Since the devices connected to the Access Point are wireless, they can be located at different distances from the AP, and it is important to make sure that users of applications like streaming HD video get good quality of service at different distances from the Access Point. The previous test was designed to benchmark the performance over distance of a DUT against a Golden AP; this test case runs the same test using real APs.
Objective
Characterize the performance of the device under test against a real AP over various distance profiles.
Setup
Step-by-step Instructions
1. Launch IxVeriWave Interoperability WaveDevice. The workflow for configuring a test is outlined in the left frame of the GUI: System (chassis and port assignment), End Points (endpoint IP configuration), Devices (devices and tests), and Analysis (results analysis). Please refer to the user guide to familiarize yourself with the Interoperability WaveDevice GUI.
2. Enter the host name or IP address of the chassis that hosts the Interoperability hardware, and click Connect to establish communication with the chassis. The Interoperability hardware includes the following components:
- 1 Ethernet port
- 2 Wi-Fi ports, i.e., Access Point and Client expert analysis ports
- 1 programmable RF Management Unit (RFMU)
- 1 Access Point
- Device under test with WaveAgent software pre-installed
3. After successfully connecting to the chassis, the application populates the endpoint and monitoring port information. Select the appropriate hardware ports for the current test.
4. Enter the RFMU IP address and click Connect to establish communication with the RFMU. After a successful connection, the application retrieves the RFMU model, RFMU firmware revision, default attenuation value, and available RFMU bank information. Reserve RFMU banks by clicking the check-box against the bank number. For SISO testing, one RFMU bank is enough; MIMO testing requires multiple banks.
5. Reserve the endpoint port. The Ethernet endpoint will be used to generate or receive traffic depending on the traffic direction.
6. Reserve the Access Point and Client monitoring ports. The AP and Client Wi-Fi monitor ports will monitor Tx frames from the Access Point and the device, respectively. Reserving the Access Point monitor port initiates the scan functionality and discovers all available Wi-Fi networks. The available Wi-Fi network information is shown in table format; choose the test wireless network.
7. Reserve the Client monitor port. The Client monitor port will be configured on the same Wi-Fi channel as selected on the AP monitor port.
8. Switch to the Endpoint configuration. Enter the WaveAgent endpoint information by clicking the + sign. Click Active Endpoints to establish communication between the Ethernet and WaveAgent endpoints.
9. Switch to the Devices configuration. Select the Rate vs Range Test from Test Type:
- Traffic Type: UDP
- Traffic Direction: Downstream, Upstream
- Attenuations: 0-60 dB with a 1 dB step
- Frame Rate: 200
Note: path loss can be measured using the AP Tx Power and the RSSI value.
10. Start the test.
Result Analysis
Interoperability WaveDevice provides the ability to validate the DUT's rate adaptation algorithm using upstream traffic and to measure receive performance using downstream traffic. When the test starts executing, monitoring stats begin populating simultaneously. Monitoring stats are retrieved from IxVeriWave cards (RFA/WBA3601) as well as from the WaveAgent running on the DUT. Retrieved stats are presented in the WaveDevice GUI in two categories:
- Flow Statistics: stats measuring the active traffic flows
- Station Statistics: Tx stats from the client device to the Access Point and vice versa
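The path-loss note above is a simple subtraction; a sketch with illustrative numbers:

```python
def path_loss_db(ap_tx_power_dbm, rssi_dbm):
    """Estimate total path loss between the AP and the client from the AP's
    transmit power and the RSSI observed at the client:
    loss = Tx power - RSSI."""
    return ap_tx_power_dbm - rssi_dbm

# e.g., an AP transmitting at +15 dBm heard at -45 dBm implies 60 dB of
# total loss: the fixed cabling loss plus the programmed RFMU attenuation.
loss = path_loss_db(15, -45)
```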
Stats are also stored as CSV files for each trial on the host PC. An analysis module called View Measurement can analyze results by correlating stats and presenting them as bar or line graphs.
RvR tests run as a series of trials; for each trial, the set of key configuration parameters is highlighted as the trial progresses. In this test there are 120 trials; the number of trials is derived from the Traffic Types, Traffic Directions, Attenuations, and Frame Rates. For the sake of brevity, we will pick some key trials for analysis. This should be sufficient to give an idea of how to analyze results from an RvR test. When each trial starts executing, the first set of stats to look at is Offered Load, Forwarding Rate, Packet Loss, and Client Data PHY Rate.
Observation 1 - Forwarding Rate
The downstream forwarding rate follows a nice curve of decreasing forwarding rate over simulated distance, which is the expected result. But in the upstream direction, when the client is transmitting to the AP, there is a sharp drop in the forwarding rate at around 25 dB of attenuation, and there are a number of instances where the measurements could not be made because of lost connectivity. This indicates that the client device is not performing well beyond a certain simulated distance while transmitting.
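The trial count is just the product of the swept parameters. A sketch — the exact attenuation list the test engine enumerates is an assumption here; 60 attenuation steps across two directions matches the 120 trials cited above:

```python
def trial_count(traffic_types, directions, attenuations, frame_rates):
    """One trial per combination of swept test parameters."""
    return (len(traffic_types) * len(directions)
            * len(attenuations) * len(frame_rates))

# 1 traffic type x 2 directions x 60 attenuation steps x 1 frame rate = 120
n = trial_count(["UDP"], ["Downstream", "Upstream"], list(range(0, 60)), [200])
```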
Observation 2 - Packet Loss
The same effect seen in the forwarding-rate chart is reflected in the packet-loss chart: packet loss shoots up when the attenuation goes beyond a certain value. The increase in packet loss means that the AP is unable to properly receive frames from the client at those attenuation values. This could also mean that the client device is not picking the most optimal transmit data rate for the channel conditions.
Observation 3 - Average Data PHY Rate
The Data PHY rate indicates the transmission rate selected by the DUT. Devices pick the optimal rate for a successful transmission; under rate adaptation, devices dynamically update the rate to increase the proportion of successful transmissions and minimize retries.
Figure 6: AP Data PHY Rate
Figure 7: Client Data PHY Rate
The above two charts plot the transmit PHY data rate of the AP and the client, respectively, over attenuation on the X-axis. These charts reinforce the point that the AP did very well in rate-adapting from over 600 Mbps down to 6 Mbps as the attenuation (simulated distance) increased. The client, however, started from about a 300 Mbps PHY rate and went through several ups and downs as the signal between the AP and the client was attenuated.
For example, when the client transmits at 300 Mbps, it uses a high Modulation and Coding Scheme (MCS) rate. The disadvantage of a high MCS rate is that it requires a high Signal-to-Noise Ratio (SNR) to decode the frames at the receiver; the advantage is that the transmitter sends more bits per symbol, achieving higher data rates and higher spectral efficiency. As the attenuation increases, the SNR at the receiver decreases, and the receiver (in this case the AP) starts seeing packet errors when the client transmits at high MCS rates, so the AP cannot acknowledge (ACK) some of the frames from the client. If the client keeps sending at high MCS rates, the AP keeps losing frames. The client therefore needs to drop its MCS rate to adapt to the changing channel conditions, always trying to find the optimal PHY rate that minimizes packet errors while maximizing spectral efficiency. In this case, the client does not seem to be doing a proper job of rate adaptation, which causes it to lose more frames in the upstream. This in turn results in a lower upstream forwarding rate.
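The adaptation loop described above can be sketched as a toy rate controller. This mirrors the general shape of ARF-style algorithms, not the DUT's actual implementation; the thresholds are illustrative:

```python
def adapt_mcs(mcs, consecutive_failures, consecutive_successes,
              fail_limit=2, success_limit=10, max_mcs=7):
    """Step the MCS down after repeated unACKed transmissions, and probe
    back up after a long enough run of successes."""
    if consecutive_failures >= fail_limit and mcs > 0:
        return mcs - 1
    if consecutive_successes >= success_limit and mcs < max_mcs:
        return mcs + 1
    return mcs
```

A well-behaved controller converges near the highest MCS the current SNR can sustain; the repeated large swings seen in the client's chart suggest its controller over-reacts in both directions.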
Observation 4 - Aggregation
The above chart shows how the client and the AP aggregate frames as the attenuation increases. Downstream here represents the AP's perspective, and upstream the client's. The AP seems very consistent in the way it aggregates. Interestingly, it starts with about 30 MPDUs per A-MPDU at low attenuation values, then increases to 64 MPDUs, and then drops back to 30. The increase to 64 in the middle of the test could be because the AP has many lost frames to retransmit and is therefore trying to build large aggregates. The client, however, seems to be all over the place when it comes to aggregation. This indicates a number of potential issues with the way the client device buffers and reorders frames for transmission.
Troubleshooting and Diagnostics
Symptom: Sharp drop in forwarding rate for upstream traffic.
Diagnosis: Possibly a problem with the quality of the transmitted signal at high attenuation values.
Comments: The test is run under ideal, isolated test conditions, so real-world performance may be worse.
Symptom: Client Data PHY rate very inconsistent over distance.
Diagnosis: The client's transmit PHY rates possibly do not adapt well to changing error rates as the attenuation increases.
Comments: The rates drop and then go back up, which causes a lot of retries and errors.
Test Variables
- Traffic type
- Traffic direction
- Attenuation values
- Frame rate
Conclusion
Based on the results, the AP's performance was on par with expectations; however, the client device exhibited multiple issues in packet loss, throughput, and Data PHY rate. The ups and downs in the PHY rates and in the aggregation sizes of packets transmitted by the DUT in the upstream direction indicate that upstream application performance may not meet expectations at certain attenuation levels, i.e., at certain distances from the AP.
Contact Ixia
Corporate Headquarters
Ixia Worldwide Headquarters
W. Agoura Rd.
Calabasas, CA, USA

EMEA
Ixia Technologies Europe Limited
Clarion House, Norreys Drive
Maidenhead SL6 4FL
United Kingdom
Renewals: [email protected]
Support: [email protected]

Asia Pacific
Ixia Asia Pacific Headquarters
101 Thomson Road, #29-04/05 United Square
Singapore
Support: [email protected]
PN Rev A September 2015
Wireless Ethernet LAN (WLAN) General 802.11a/802.11b/802.11g FAQ
Wireless Ethernet LAN (WLAN) General 802.11a/802.11b/802.11g FAQ Q: What is a Wireless LAN (WLAN)? Q: What are the benefits of using a WLAN instead of a wired network connection? Q: Are Intel WLAN products
LTE, WLAN, BLUETOOTHB
LTE, WLAN, BLUETOOTHB AND Aditya K. Jagannatham FUTURE Indian Institute of Technology Kanpur Commonwealth of Learning Vancouver 4G LTE LTE (Long Term Evolution) is the 4G wireless cellular standard developed
WIRELESS IN THE METRO PACKET MICROWAVE EXPLAINED
WIRELESS IN THE METRO PACKET MICROWAVE EXPLAINED RAJESH KUMAR SUNDARARAJAN Assistant Vice President - Product Management, Aricent Group WIRELESS IN THE METRO PACKET MICROWAVE EXPLAINED This whitepaper
Understanding the IEEE 802.11ac Wi Fi Standard Preparing for the next gen of WLAN
WHITEPAPER Understanding the IEEE 802.11ac Wi Fi Standard Preparing for the next gen of WLAN July 2013 Author: Richard Watson Contributors: Dennis Huang, Manish Rai Table of Contents Executive summary...
Key Features. Multiple Operation Modes ENH500 can operate into four different modes with Access Point, Client Bridge, Client Router and WDS Mode.
802.11a/n Long Range Wireless Outdoor CPE Key Features IEEE 802.11 a/n compliant Up to 300Mbps (5GHz) 24V Proprietary PoE support Waterproof Housing IP65 rated AP/CB/CR/WDS Modes 4 SSIDs support + VLAN
Get the best performance from your LTE Network with MOBIPASS
Get the best performance from your LTE Network with MOBIPASS The most powerful, user friendly and scalable enodeb test tools family for Network Equipement Manufacturers and Mobile Network Operators Network
IEEE802.11ac: The Next Evolution of Wi-Fi TM Standards
QUALCOMM, Incorporated May 2012 QUALCOMM is a registered trademark of QUALCOMM Incorporated in the United States and may be registered in other Countries. Other product and brand names may be trademarks
Wi-Fi CERTIFIED n: Longer-Range, Faster-Throughput, Multimedia-Grade Wi-Fi Networks
Wi-Fi CERTIFIED n: Longer-Range, Faster-Throughput, Multimedia-Grade Wi-Fi Networks September 2009 The following document and the information contained herein regarding Wi-Fi Alliance programs and expected
Site Survey and RF Design Validation
CHAPTER 8 Site Survey Introduction In the realm of wireless networking, careful planning is essential to ensure that your wireless network performs in a manner that is consistent with Cisco s design and
Link Link sys E3000 sys RE1000
User Guide High Performance Extender Wireless-N Router Linksys Linksys RE1000 E3000Wireless-N Table of Contents Contents Chapter 1: Product Overview 1 Front 1 Top 1 Bottom 1 Back 2 Chapter 2: Advanced
MEASURING WIRELESS NETWORK CONNECTION QUALITY
Technical Disclosure Commons Defensive Publications Series January 27, 2016 MEASURING WIRELESS NETWORK CONNECTION QUALITY Mike Mu Avery Pennarun Follow this and additional works at: http://www.tdcommons.org/dpubs_series
Network Simulation Traffic, Paths and Impairment
Network Simulation Traffic, Paths and Impairment Summary Network simulation software and hardware appliances can emulate networks and network hardware. Wide Area Network (WAN) emulation, by simulating
WI-FI PERFORMANCE BENCHMARK TESTING: Aruba Networks AP-225 and Cisco Aironet 3702i
WI-FI PERFORMANCE BENCHMARK TESTING: Networks AP-225 and Cisco Aironet 3702i Conducted at the Proof-of-Concept Lab January 24, 2014 Statement of Test Result Confidence makes every attempt to optimize all
AIRAYA offers several customer support options to assist you with difficulties you might experience with your WirelessGRID wireless bridge:
Contents Step 1. Setup a wired Ethernet network between test stations...2 Step 2. Setup wired Ethernet network connections to WirelessGRID bridges...3 Step 3. Setup bridge software configuration for WirelessGRID
ACRS 2.0 User Manual 1
ACRS 2.0 User Manual 1 FCC Regulatory Information This device complies with part 15 of the FCC Rules. Operation is subject to the following two conditions: (1) This device may not cause harmful interference,
Propsim enabled Mobile Ad-hoc Network Testing
www.anite.com Propsim enabled Mobile Ad-hoc Network Testing Anite is now part of Keysight Technologies Lab-based, end-to-end performance testing of systems using Propsim MANET channel emulation A Mobile
Burst Testing. New mobility standards and cloud-computing network. This application note will describe how TCP creates bursty
Burst Testing Emerging high-speed protocols in mobility and access networks, combined with qualityof-service demands from business customers for services such as cloud computing, place increased performance
High-Density Wi-Fi. Application Note
High-Density Wi-Fi Application Note Table of Contents Background... 3 Description... 3 Theory of Operation... 3 Application Examples... Tips and Recommendations... 7 2 Background One of the biggest challenges
Characterizing Wireless Network Performance
Characterizing Wireless Network Performance Ruckus Wireless Black Paper Accurate performance testing for wireless networks requires understanding how to test for worst case scenarios As expensive and inconvenient
alcatel-lucent converged network solution The cost-effective, application fluent approach to network convergence
alcatel-lucent converged network solution The cost-effective, application fluent approach to network convergence the corporate network is under pressure Today, corporate networks are facing unprecedented
Wi-Fi / WLAN Performance Management and Optimization
Wi-Fi / WLAN Performance Management and Optimization Veli-Pekka Ketonen CTO, 7signal Solutions Topics 1. The Wi-Fi Performance Challenge 2. Factors Impacting Performance 3. The Wi-Fi Performance Cycle
Linksys WAP300N. User Guide
User Guide Contents Contents Overview Package contents 1 Back view 1 Bottom view 2 How to expand your home network 3 What is a network? 3 How to expand your home network 3 Where to find more help 3 Operating
CWNA Instructor Led Course Outline
CWNA Instructor Led Course Outline Enterprise Wi-Fi Administration, Outline v7.0 Introduction The Enterprise Wireless LAN Administration 7.1 course (which prepares students for the CWNA-106 exam), whether
HomePlug AV2 Technology
HomePlug AV2 Technology Raising the Bar for Sustained High-Throughput Performance and Interoperability for Multi-stream Networking Using Existing Powerline Wiring in the Home. Copyright 2013, HomePlug
AC1750 Dual Band Wireless Router with StreamBoost Technology. TEW-824DRU (v1.0r) TEW-824DRU
AC1750 Dual Band Wireless Router with StreamBoost Technology TEW-824DRU (v1.0r) Low latency gaming/voice prioritization AC1750: 1,300 Mbps WiFi AC + 450 Mbps WiFi N bands Intelligent traffic shaping Pre-encrypted
Understanding and Optimizing 802.11n
Understanding and Optimizing 802.11n Buffalo Technology July 2011 Brian Verenkoff Director of Marketing and Business Development Introduction: Wireless networks have always been difficult to implement
End-to-end Cognitive Radio Testbed (EECRT ) current state and proposal for continuation TEKES TRIAL program
End-to-end Cognitive Radio Testbed (EECRT ) current state and proposal for continuation TEKES TRIAL program Department of Communications and Networking School of Electrical Engineering Aalto University
Qualcomm Atheros, Inc. 802.11ac MU-MIMO: Bridging the MIMO Gap in Wi-Fi
Qualcomm Atheros, Inc. 802.11ac MU-MIMO: Bridging the MIMO Gap in Wi-Fi January, 2015 Qualcomm Atheros, Inc. Not to be used, copied, reproduced, or modified in whole or in part, nor its contents revealed
Delivering Network Performance and Capacity. The most important thing we build is trust
Delivering Network Performance and Capacity The most important thing we build is trust The Ultimate in Real-life Network Perfomance Testing 1 The TM500 Family the most comprehensive 3GPP performance and
White Paper. Actionable Insight Into WiFi Device Performance
White Paper Actionable Insight Into WiFi Device Performance 26601 Agoura Road, Calabasas, CA 91302 Tel: 818.871.1800 Fax: 818.871.1805 www.ixiacom.com 915-6026-01 Rev. B, September 2013 2 Table of Contents
Mail Gateway Testing. Test Plan. 26601 W. Agoura Rd. Calabasas, CA 91302 (Toll Free US) 1.877.FOR.IXIA (Int'l) +1.818.871.1800 (Fax) 818.871.
Mail Gateway Testing 26601 W. Agoura Rd. Calabasas, CA 91302 (Toll Free US) 1.877.FOR.IXIA (Int'l) +1.818.871.1800 (Fax) 818.871.1805 www.ixiacom.com Test Plan Copyright 2006 by Ixia All rights reserved
Top Six Considerations
Top Six Considerations for Upgrading to table of contents + Section I: Introduction Understanding...2 + Section II: Uses Cases for... 3 + Section III: Top 6 considerations for...5 + Section IV: Conclusion...
Express Forwarding : A Distributed QoS MAC Protocol for Wireless Mesh
Express Forwarding : A Distributed QoS MAC Protocol for Wireless Mesh, Ph.D. [email protected] Mesh 2008, Cap Esterel, France 1 Abstract Abundant hidden node collisions and correlated channel access
WaveInsite Mobile WLAN Client Interoperability and Performance Testing
WaveInsite Mobile WLAN Client Interoperability and Performance Testing WaveInsite is the fastest way to conduct over the air Wi-Fi interoperability & performance testing between Wi-Fi access points (APs)
Whitepaper. Next Generation Gigabit WiFi - 802.11ac
Whitepaper Next Generation Gigabit WiFi - 802.11ac Next Generation Gigabit WiFi - 802. 11ac The first WiFi-enabled devices were introduced in 1997. For the first time, we were liberated from a physical
Using TrueSpeed VNF to Test TCP Throughput in a Call Center Environment
Using TrueSpeed VNF to Test TCP Throughput in a Call Center Environment TrueSpeed VNF provides network operators and enterprise users with repeatable, standards-based testing to resolve complaints about
Testing Packet Switched Network Performance of Mobile Wireless Networks IxChariot
TEST PLAN Testing Packet Switched Network Performance of Mobile Wireless Networks IxChariot www.ixiacom.com 915-6649-01, 2006 Contents Testing Packet Switched Network Performance of Mobile Wireless Networks...3
LTE-Advanced Carrier Aggregation Optimization
Nokia Networks LTE-Advanced Carrier Aggregation Optimization Nokia Networks white paper LTE-Advanced Carrier Aggregation Optimization Contents Introduction 3 Carrier Aggregation in live networks 4 Multi-band
500M Powerline Pass-Through Ethernet Bridge
500M Powerline Pass-Through Ethernet Bridge Key Features IEEE Compliant HomePlug AV & LA Designed for high-definition multimedia streaming Data rate up to 500Mbps and distance up to 300 Meters over existing
Whitepaper. 802.11n The Next Generation in Wireless Technology
Whitepaper 802.11n The Next Generation in Wireless Technology Introduction Wireless technology continues to evolve and add value with its inherent characteristics. First came 802.11, then a & b, followed
Measuring Wireless Network Performance: Data Rates vs. Signal Strength
EDUCATIONAL BRIEF Measuring Wireless Network Performance: Data Rates vs. Signal Strength In January we discussed the use of Wi-Fi Signal Mapping technology as a sales tool to demonstrate signal strength
Wireless Video Best Practices Guide
Wireless Video Best Practices Guide Using Digital Video Manager (DVM) with the OneWireless Universal Mesh Network Authors: Annemarie Diepenbroek DVM Product Manager Soroush Amidi OneWireless Product Manager
Maximizing Range and Battery Life in Low-Cost Wireless Networks
Maximizing Range and Battery Life in Low-Cost Wireless Networks The proliferation of cost-effective wireless technology has led to the rise of entirely new types of networks across a wide range of applications
SwannSecure Monitoring System
EN SwannSecure Monitoring System Wi-Fi Connections Quick Setup Guide Welcome! Lets get started. QWIFISS130814E Swann Communications 2014 1 INTRODUCTION 1 2 3 4 By default, the SwannSecure Monitoring System
MIGRATING PUBLIC SAFETY NETWORKS TO IP/MPLS
AVIAT NETWORKS MIGRATING PUBLIC SAFETY NETWORKS TO IP/MPLS CHOOSING THE RIGHT MICROWAVE PLATFORM converging w i r e l e s s networks This paper explores the migration strategy for Public Safety (PS) networks
HSPA+ and LTE Test Challenges for Multiformat UE Developers
HSPA+ and LTE Test Challenges for Multiformat UE Developers Presented by: Jodi Zellmer, Agilent Technologies Agenda Introduction FDD Technology Evolution Technology Overview Market Overview The Future
When SDN meets Mobility
When SDN meets Mobility The result is an automated, simpler network that supports the way you work With wireless increasingly becoming the primary means of access for end users, it is essential that any
Lecture 17: 802.11 Wireless Networking"
Lecture 17: 802.11 Wireless Networking" CSE 222A: Computer Communication Networks Alex C. Snoeren Thanks: Lili Qiu, Nitin Vaidya Lecture 17 Overview" Project discussion Intro to 802.11 WiFi Jigsaw discussion
Enterprise WiFi System. Datasheet. Models: UAP, UAP-LR, UAP-PRO, UAP-AC UAP-Outdoor, UAP-Outdoor5
Enterprise WiFi System Models: UAP, UAP-LR, UAP-PRO, UAP-AC UAP-Outdoor, UAP-Outdoor5 Unlimited Indoor/Outdoor AP Scalability in a Unified Management System Breakthrough Speeds up to 1300 Mbps (802.11ac)
Cooperative Techniques in LTE- Advanced Networks. Md Shamsul Alam
Cooperative Techniques in LTE- Advanced Networks Md Shamsul Alam Person-to-person communications Rich voice Video telephony, video conferencing SMS/MMS Content delivery Mobile TV High quality video streaming
Deliberant.com, 2011. Technology review
Technology review 11N advantages Parameter 802.11a/g 802.11n 2x2 Improvement factor Data rate, Mbps 108 300 2.7x Max throughput, h t Mbps 45 150 3.3x3 Spectral efficiency, bit/hz 1.125 3.75 3.3x Signal
communication over wireless link handling mobile user who changes point of attachment to network
Wireless Networks Background: # wireless (mobile) phone subscribers now exceeds # wired phone subscribers! computer nets: laptops, palmtops, PDAs, Internet-enabled phone promise anytime untethered Internet
LTE Mobility Enhancements
Qualcomm Incorporated February 2010 Table of Contents [1] Introduction... 1 [2] LTE Release 8 Handover Procedures... 2 2.1 Backward Handover... 2 2.2 RLF Handover... 3 2.3 NAS Recovery... 5 [3] LTE Forward
Spectrum Analysis How-To Guide
Spectrum Analysis How-To Guide MOTOROLA SOLUTIONS and the Stylized M Logo are registered in the US Patent & Trademark Office. Motorola Solutions, Inc. 2012. All rights reserved. Spectrum Analysis 3 Contents
Improving Quality of Service
Improving Quality of Service Using Dell PowerConnect 6024/6024F Switches Quality of service (QoS) mechanisms classify and prioritize network traffic to improve throughput. This article explains the basic
Networking: Certified Wireless Network Administrator Wi Fi Engineering CWNA
coursemonster.com/uk Networking: Certified Wireless Network Administrator Wi Fi Engineering CWNA View training dates» Overview This new market-leading course from us delivers the best in Wireless LAN training,
1 Introduction 1 1.1 Services and Applications for HSPA 3 1.2 Organization of the Book 6 References 7
Figures and Tables About the Authors Preface Foreword Acknowledgements xi xix xxi xxiii xxv 1 Introduction 1 1.1 Services and Applications for HSPA 3 1.2 Organization of the Book 6 References 7 2 Overview
WiLink 8 Solutions. Coexistence Solution Highlights. Oct 2013
WiLink 8 Solutions Coexistence Solution Highlights Oct 2013 1 Products on market with TI connectivity 2004 2007 2009-11 2013 Use cases: BT voice, WLAN data Features: TDM based operation Strict protection
White Paper. D-Link International Tel: (65) 6774 6233, Fax: (65) 6774 6322. E-mail: [email protected]; Web: http://www.dlink-intl.
Introduction to Voice over Wireless LAN (VoWLAN) White Paper D-Link International Tel: (65) 6774 6233, Fax: (65) 6774 6322. Introduction Voice over Wireless LAN (VoWLAN) is a technology involving the use
TECHNICAL NOTE. GoFree WIFI-1 web interface settings. Revision Comment Author Date 0.0a First release James Zhang 10/09/2012
TECHNICAL NOTE GoFree WIFI-1 web interface settings Revision Comment Author Date 0.0a First release James Zhang 10/09/2012 1/14 Web interface settings under admin mode Figure 1: web interface admin log
Universal Form-factor. Wi Fi Troubleshooting Made Easy
AirMedic USB AirMedic USB is a powerful, easy-touse and affordable spectrum analysis tool that brings Wi-Fi troubleshooting to entry-level users. Built upon AirMagnet expertise in Wi-Fi troubleshooting,
Wireless Networks. Reading: Sec5on 2.8. COS 461: Computer Networks Spring 2011. Mike Freedman
1 Wireless Networks Reading: Sec5on 2.8 COS 461: Computer Networks Spring 2011 Mike Freedman hep://www.cs.princeton.edu/courses/archive/spring11/cos461/ 2 Widespread Deployment Worldwide cellular subscribers
Lab Testing Summary Report
Lab Testing Summary Report November 2011 Report 111018 Product Category: Supervisor Engine Vendor Tested: Product Tested: Catalyst 4500E Supervisor Engine 7L-E Key findings and conclusions: Cisco Catalyst
NetComm Wireless NP920 Dual Band WiFi USB Adapter. User Guide
NetComm Wireless NP920 Dual Band WiFi USB Adapter User Guide Contents Preface... 3 Important Safety Instructions... 3 Introduction... 4 Overview... 4 Features... 4 Package Contents... 5 Minimum System
ENSC 427: Communication Networks. Analysis of Voice over IP performance on Wi-Fi networks
ENSC 427: Communication Networks Spring 2010 OPNET Final Project Analysis of Voice over IP performance on Wi-Fi networks Group 14 members: Farzad Abasi ([email protected]) Ehsan Arman ([email protected]) http://www.sfu.ca/~faa6
LoRaWAN. What is it? A technical overview of LoRa and LoRaWAN. Technical Marketing Workgroup 1.0
LoRaWAN What is it? A technical overview of LoRa and LoRaWAN Technical Marketing Workgroup 1.0 November 2015 TABLE OF CONTENTS 1. INTRODUCTION... 3 What is LoRa?... 3 Long Range (LoRa )... 3 2. Where does
