Nokia Siemens Networks. Bandwidth Proximity Control. A recipe for low latency networks


Executive summary

Contents
The rapidly growing use of mobile applications demands reduced network latency
Why low latency matters
Key success factors for improving network latency
Nokia Siemens Networks recommended approach to reducing network latency
Nokia Siemens Networks is making low latency networks a reality

Low latency is the key to instant satisfaction for mobile broadband users. Latency has a major impact on services and applications such as mobile video communication, live video streaming and multiplayer gaming. It may be impossible to predict the kinds of applications that people will be using in five years' time, but one thing is certain: we are going to see more emphasis on real-time performance in mobile services. Gamers are especially sensitive to latency, since their applications call for latencies below 10 milliseconds. Similarly, many automated trading applications are also highly sensitive to latency: transactions can take place in milliseconds and delays could cause financial losses.

Three critical parameters need to be addressed to create a low-latency mobile network: bandwidth, proximity and control. While bandwidth helps to improve the overall customer experience, including latency, simply adding more and more capacity to a network is not economically feasible. Latency can be lowered significantly by implementing QoS differentiation and policy control to help ensure that latency-sensitive traffic is prioritized through the network. The third aspect, proximity, relates to the laws of physics. Data travels at a fraction of the speed of light in a network, so bringing content closer to the end user can have a dramatic impact on the latency experienced by the user.

Latency is an opportunity for network operators to change the game in content delivery by enhancing the customer experience of latency-sensitive applications. It is a big challenge, but adding capacity, improving traffic control and placing content close to users will lead to a noticeably better customer experience and create substantial business benefits for network operators.

The rapidly growing use of mobile applications demands reduced network latency

The use of latency-sensitive applications in mobile networks is growing. Today, 69% of all mobile traffic is streaming video and this proportion is expected to grow to 76% in 2020 (Cisco VNI 2012). In the future, latency will become even more important because applications will go far beyond those of today. Remote real-time control of machines, 3D video and cloud-based gaming are expected to be among the next wave of applications and will revolutionize the way people work, learn and play. Latency determines the perception of speed in about 80% of all mobile broadband uses. Real-time functionality demands the lowest possible delay in the network; latency will therefore become a business opportunity for operators.

The loading time of web pages is a good example of how response time affects the customer experience. Studies into consumer response to the performance of a travel shopping site indicate that people expect a loading time of less than three seconds. In 1999, consumers expected an eight-second load time, and in 2006 less than four seconds. Consumers are clearly becoming more demanding. Today, 65% of 18-24 year-olds expect a web page to load in two seconds or less. It is highly likely that in the near future, people will be looking for a sub one-second browsing experience.

Awareness of how latency affects the customer experience has motivated over-the-top (OTT) providers to tailor their messaging protocols to minimize latency over today's networks. However, since the biggest factors affecting latency lie in the networks, operators are better placed to address the issue and can also potentially benefit commercially by reducing latency. Nokia Siemens Networks forecasts a growth in partnerships between OTT providers and network operators, as the OTTs want people to experience their content and services with the best possible quality.
The majority of today's mobile networks have an end-to-end delay in the range of 200 to 500 ms. However, measurements of 15 to 20 milliseconds in LTE networks, and 20 to 50 milliseconds in HSPA networks, prove that end-to-end latency can be lowered.

The effect of latency on people
Response time is relevant for every interaction between humans and machines. Some 20 years ago, three response time limits were defined by Nielsen*.
0.1 second gives the feeling of instantaneous response; that is, the outcome feels like it was caused by the user, not the computer. This level of responsiveness is essential to support the feeling of direct manipulation.
1 second keeps the user's flow of thought seamless. Users can sense a delay, and thus know the computer is generating the outcome, but they still feel in control of the overall experience and that they are moving freely rather than waiting on the computer.
10 seconds keeps the user's attention. From 1 to 10 seconds, users definitely feel at the mercy of the computer and wish it was faster, but they can handle it. After 10 seconds, they start thinking about other things, making it harder to get their brains back on track once the computer finally does respond.
*Nielsen, Response time limits: http://www.useit.com/papers/responsetime.html
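Nielsen's three limits can be expressed as a tiny classifier. This is a minimal sketch with the thresholds taken from the text above; the function name and labels are my own shorthand:

```python
def perceived_responsiveness(delay_s: float) -> str:
    """Classify a response time against Nielsen's three limits (in seconds)."""
    if delay_s <= 0.1:
        return "instantaneous"   # outcome feels caused by the user
    if delay_s <= 1.0:
        return "flow kept"       # delay noticed, but user stays in control
    if delay_s <= 10.0:
        return "attention kept"  # user waits, at the mercy of the computer
    return "attention lost"      # user starts thinking about other things

print(perceived_responsiveness(0.05))  # instantaneous
print(perceived_responsiveness(12.0))  # attention lost
```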

Why low latency matters

Looking into latency-sensitive services in mobile networks provides some insight into how network latency needs to be improved.

Fast browsing requires more than just bandwidth
Web page performance depends on the technology choices of web developers and web hosts. Advanced web developers create pages with HTML compression and consolidated CSS and JS files, use optimized images, and avoid bad requests, redirects and other pitfalls. Moreover, the dynamic and diverse development of web browsers and servers often focuses on speed as a competitive advantage. Web page download enhancements, such as opening several simultaneous parallel TCP connections and HTTP pipelining, are used to deal with issues in high-delay networks. Today, during an average web page download, information must be retrieved from 90 different resources within six different domains. The browser has to deal with several TCP connections to those domains and resources, without being able to load everything in parallel. Typically, the response time should be less than 50 ms for each individual connection.

Video and voice calls are strongly affected by delay
For a fluid conversation, the mouth-to-ear delay should be less than 150 ms, otherwise the participants may detect a pause and take that as their cue to speak. By the time their words arrive at their destination, the other speaker has already begun the next sentence. A conversation becomes very difficult at around 400 ms mouth-to-ear delay. In video calls or video conferences, this 150 ms audio requirement applies as well, but lip synchronization also enters the scene. Visual lip movements must match the spoken words, with any misalignment being referred to as skew. A skew of less than 20 ms is considered imperceptible. As the skew approaches 50 ms, some viewers will begin to notice the audio/video mismatch. As the skew increases, viewers become increasingly distracted from the video conference.
When the skew approaches one second, the video signal provides no benefit: viewers will ignore the video and focus on the audio. So lip sync skew should be well below 50 ms. Web communication applications, such as Google Hangout, Apple Facetime and Skype, use low latency codecs and scalable video coding to make the applications more robust against network latency.

Cloud-based applications are delay sensitive by definition
Cloud-based gaming is another good example of a latency-sensitive application. For example, in first-person avatar games, the time difference between a player performing an action and the result of that action appearing on the screen should be less than 100 ms. In some fast action games, latency in excess of 70 or 80 ms may be unacceptable. Typically, people expect an in-game response of less than a hundred milliseconds.

Future latency demands
As applications and services become more sophisticated and capable, and as people's expectations grow, network latency will become an increasingly important consideration for operators. In general, an overall latency of 50 ms is likely to be the future benchmark. In today's mobile networks, this 50 ms figure is far from being achieved. However, measurements in LTE networks show significant improvements in latency thanks to high bandwidth and low latency technologies (Figure 1).
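The browsing figures above (90 resources across six domains, fetched over a handful of parallel connections) hint at why RTT dominates page load time: every wave of requests pays a full round trip. A rough illustrative model, where all parameter values are assumptions for illustration rather than measurements:

```python
import math

def page_load_estimate(rtt_s, resources=90, parallel_conns=6, transfer_s=0.01):
    """Crude page-load model: resources are fetched in waves of
    `parallel_conns` simultaneous requests, each wave costing one RTT
    plus a fixed transfer time. Ignores DNS, TLS, pipelining and
    TCP slow start, so it shows the trend, not real numbers."""
    waves = math.ceil(resources / parallel_conns)
    return waves * (rtt_s + transfer_s)

# Halving the RTT roughly halves the load time in this model:
print(round(page_load_estimate(rtt_s=0.10), 2))  # 1.65
print(round(page_load_estimate(rtt_s=0.05), 2))  # 0.9
```

The point of the sketch is the multiplier: a page that needs 15 request waves amplifies every millisecond of RTT fifteenfold.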

Figure 1. Measured latencies today and in the future. Left: customer experience of network delay (RTT) by region in mobile and fixed networks (source: Cisco analysis of Ookla Speedtest data, 2011). Right: LTE measurements in tomorrow's network (source: Epitiro Ltd., LTE Real World Performance Study, TeliaSonera).

Key success factors for improving network latency

To understand how to improve network latency, we need to understand its sources. First, propagation delay, defined by the speed of light, depends directly on the distance the data must travel within the network. Second, serialization delay occurs at each interface that a data packet meets along its journey. This delay is defined by the length of the packet and the interface speed. Both delay sources can be estimated by determining the distance, the number of nodes and the bandwidth. However, complications arise when considering other sources of delay, such as the processing delay incurred in every node performing routing, switching and inspection of the data packets. The processing time depends on the task and the actual processing load. Finally, other real-world factors can cause delay, such as congestion, faults and outages, maintenance interruptions, automated optimization routines, poor network management and, of course, simple human error. These unpredictable events cause packet queuing and packet losses, and they are almost impossible to predict and hard to control. Queuing is the time a packet has to wait until processing, which is extremely dependent on the traffic load and its burstiness. All quality of service concepts address queuing delay, but the easiest way to control it is to optimize buffers in the network or minimize the number of nodes.

After analyzing all these sources of delay we can come to two very simple conclusions. Firstly, it is possible to build and operate low latency networks.
Secondly, operators are uniquely placed to reduce latency and make a profit by doing so, since they can add perceptible value for third parties in the value chain linking the content provider with the end user. Evolving network technologies will make it possible to improve network latency by a factor of three to five, depending on the network.
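The delay sources described above can be combined into a back-of-envelope estimate. Propagation, serialization and per-hop processing are modeled; queuing is load-dependent and deliberately omitted. The 200 km-per-millisecond propagation figure and the per-hop processing budget are illustrative assumptions:

```python
def one_way_delay_ms(distance_km, hops, pkt_bytes, link_mbps, proc_ms_per_hop=0.1):
    """Back-of-envelope one-way delay from the sources named above.
    Assumes ~200 km per millisecond propagation in fiber and a fixed
    per-hop processing budget; queuing delay is omitted."""
    propagation = distance_km / 200.0                              # ms
    serialization = hops * (pkt_bytes * 8) / (link_mbps * 1000.0)  # ms
    processing = hops * proc_ms_per_hop                            # ms
    return propagation + serialization + processing

# 1,000 km path, 10 hops, 1,500-byte packets on 1 Gbps (1,000 Mbps) links:
print(round(one_way_delay_ms(1000, 10, 1500, 1000), 2))  # 6.12
```

Even this toy calculation shows the document's point: over any real distance, propagation dominates serialization and processing, which is why proximity matters so much.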

Nokia Siemens Networks recommended approach to reducing network latency

There are three key factors operators must take into account if they are to reduce latency. Access bandwidth is critical, especially for media-rich, low-latency applications like video conferencing and cloud-based gaming. Fiber distance will strongly limit the delivery of low latency services like telepresence or teleprotection. Instant network control is vital and requires a low-latency network architecture, latency-aware traffic engineering and latency-aware network/service control and management. Bandwidth, proximity and control are the three key words to remember for low latency success.

Bandwidth: Improving the perceived network performance
Access speed remains a limiting factor for the latency of media-rich, time-sensitive applications, and is particularly critical on shared radio, copper or fiber links under high load conditions. Therefore, building low latency networks requires a next generation access network that supports more than 100 Mbps per user. In addition to bandwidth, other ways to improve latency include the fast allocation of resources on shared media, fast retransmission procedures when transmission errors occur, the implementation of QoS capabilities and congestion control.

Bandwidth also has an indirect impact on the perception of speed because it affects latency in a couple of ways. First, the traffic load throughout the network is important, because packets will be held up or dropped if they are forced to queue. Queuing (scheduling) delay is highly dependent on the burstiness of the traffic, as well as the overall load. In addition, packet processing resources may be spread more thinly under heavy load, leading to increased delay at each network element.
Sufficient bandwidth and fast transmission are both important, but so too is fast signaling.

Figure 2. Ultra-fast broadband can be achieved by adding 1,000 times more capacity and ten times faster transmission, and will be vital in meeting future application requirements. Left: 3GPP delay (RTT) evolution from HSPA through LTE to B4G. Right: average time to load a video frame.

High bandwidth technologies such as LTE are evolving, and their high capacity will certainly help to reduce the load on each element and hence contribute to an overall reduction in latency. However, simply ramping up overall capacity to provide unlimited extra headroom at peak times is unrealistic from an economic perspective, so the key is to introduce mechanisms to manage the traffic more effectively.

Proximity: Bringing content closer to the user
The propagation speed of light in fiber is around two thirds of the speed of light in a vacuum, so in one millisecond a signal can travel around 200 km. A user in Europe accessing a server in the US will face a round trip time of around 50 ms because of the distance alone, no matter how fast the network. The only way to beat this limit is to reduce the distance between user devices and the content and applications they are accessing. There are already content distribution networks (CDNs) that aim to address the distance issue by distributing static content to the edges of the network. Bringing the content 2,000 km closer to the user can potentially reduce the round trip time (RTT) by about 30 ms and page load times by more than 30%.

Local caches could be an answer for some applications. Storage is relatively cheap, so that even a smartphone today is able to store tens of gigabytes; that will grow to hundreds or thousands of gigabytes in the future. Furthermore, the rising power of data analytics will make it possible to estimate very precisely which content an individual user will ask for next, at which location and on which device. However, some future applications, such as cloud-based gaming, depend on dynamically generated content that cannot be cached. That means processing capacity also has to be dispersed into local data centers. For ultra-low latency services, and to offload the mobile backhaul, this functionality will even be integrated into the base stations at the very edge of the network.
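The speed-of-light limit described above is easy to quantify. A short sketch, where the 6,000 km Europe-to-US distance is an illustrative assumption:

```python
def propagation_rtt_ms(distance_km, km_per_ms=200.0):
    """Lower bound on round trip time from fiber propagation alone
    (light in fiber covers roughly 200 km per millisecond)."""
    return 2 * distance_km / km_per_ms

# A Europe-to-US path of ~6,000 km (illustrative distance):
print(propagation_rtt_ms(6000))  # 60.0 ms, no matter how fast the network
# Moving content 2,000 km closer removes 20 ms of propagation RTT:
print(propagation_rtt_ms(6000) - propagation_rtt_ms(4000))  # 20.0
```

No amount of equipment upgrades can beat this bound; only reducing the distance, which is the document's case for proximity.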
Control: Setting the right priorities
Technologies such as Quality of Service (QoS) differentiation and policy control allow networks to treat different traffic streams according to their specific needs. So latency-sensitive media traffic might take a different path through the network or receive preferred treatment over plain data transfers. Networks are becoming application-aware, so that the priority scheduling and buffering available with QoS differentiation can minimize the impact of load-dependent delay.

Figure 3. With local caching, content can be delivered to end users from the edge of the network, substantially reducing latency simply because the content has less far to travel. The chart shows the RTT impact on page loading time with policy-controlled content delivery, caching and pre-fetching: bringing the content 2,000 km closer to the user can potentially reduce page load times by 30% or more.
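The priority scheduling that QoS differentiation relies on can be illustrated with a toy strict-priority queue. This is a sketch of the general idea only, not any specific network implementation:

```python
import heapq

class StrictPriorityQueue:
    """Toy strict-priority scheduler: a lower priority number means
    more latency-sensitive traffic, which is always dequeued first.
    A sequence counter preserves FIFO order within a priority class."""
    def __init__(self):
        self._heap = []
        self._seq = 0

    def enqueue(self, packet, priority):
        heapq.heappush(self._heap, (priority, self._seq, packet))
        self._seq += 1

    def dequeue(self):
        return heapq.heappop(self._heap)[2]

q = StrictPriorityQueue()
q.enqueue("bulk download chunk", priority=2)
q.enqueue("voice frame", priority=0)
q.enqueue("video frame", priority=1)
print([q.dequeue() for _ in range(3)])
# ['voice frame', 'video frame', 'bulk download chunk']
```

Real schedulers add safeguards such as weighted fairness so low-priority traffic is not starved, but the core effect is the same: voice and video frames skip ahead of bulk transfers, bounding their queuing delay.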

With improved control, the most sensitive, low-latency traffic could be charged at a premium, providing network operators with new ways to monetize services and content while complying with net neutrality regulations. Monitoring and optimization of network parameters in response to the real-time experience of users can also improve the perception of low latency networks. Conversely, the experience can also be optimized by making applications network-aware and equipping them with interfaces that instantly request the necessary connectivity.

Nokia Siemens Networks is making low latency networks a reality

Nokia Siemens Networks is committed to delivering on the promise of low latency networks and has developed a variety of capabilities:
Bandwidth: Commercial LTE networks deployed by Nokia Siemens Networks show an average round trip time of less than 20 ms. Benchmarks have also shown that average and peak data rates are significantly higher compared with other vendors' implementations.
Proximity: Only operators can place content close enough to users for hosting or providing low latency services. Therefore, our enhanced Content Delivery Network (CDN) functionality brings content caching and processing as close as possible to the end user.
Control: Our Intelligent Broadband provides policy-controlled content delivery, content optimization, QoS and an application-aware RAN. Intelligent Broadband is able to control latency across the entire network, not just parts of it. Customer Experience Management (CEM) for Liquid Net reveals how the network performs and what is affecting the customer experience. This monitoring and analysis allows latency to be managed as a KPI.

Low latency is a vital pillar of our technology vision for future networks.

Figure 4. Nokia Siemens Networks LTE RAN solutions: a benchmark in providing low latency networks today. Drive tests performed by Nokia Siemens Networks (Oct. 2011) in a three-vendor LTE network (10 MHz carrier, LTE 850 MHz) showed significantly lower worst-case and average delay (Vendor B +41% and Vendor C +96% relative to Nokia Siemens Networks) and much higher average and peak data rates (Vendor B -26% and Vendor C -47%).

Abbreviations
CDN Content Delivery Network
CEM Customer Experience Management
CSS Cascading Style Sheets
JS JavaScript
HSPA High Speed Packet Access
HTML Hypertext Markup Language
LTE Long Term Evolution
QoS Quality of Service
RAN Radio Access Network
RTT Round Trip Time
TCP Transmission Control Protocol

Nokia Siemens Networks P.O. Box 1 FI-02022 NOKIA SIEMENS NETWORKS Finland Visiting address: Karaportti 3, ESPOO, Finland Switchboard +358 71 400 4000 (Finland) Switchboard +49 89 5159 01 (Germany) Copyright 2011 Nokia Siemens Networks. All rights reserved. Nokia is a registered trademark of Nokia Corporation, Siemens is a registered trademark of Siemens AG. The wave logo is a trademark of Nokia Siemens Networks Oy. Other company and product names mentioned in this document may be trademarks of their respective owners, and they are mentioned for identification purposes only. This publication is issued to provide information only and is not to form part of any order or contract. The products and services described herein are subject to availability and change without notice. www.nokiasiemensnetworks.com Every effort is made to ensure that our communications materials have as little impact on the environment as possible