Case Study: How network conditions impact IPTV QoE

This case study examines the importance of validating IPTV service quality under realistic Multiplay network conditions.

Introduction

Telecommunication service providers are investing heavily in IPTV to complement their existing voice and data services and to compete with cable multiple service operators (MSOs). Before IPTV can proliferate and the promise of Multiplay services can be realized, service providers and network equipment manufacturers (NEMs) must first verify that IPTV services will in fact meet user quality expectations. Viewers have come to expect a predictable level of service quality from their broadcast and satellite TV services, and will not tolerate service interruptions, picture and sound degradation, or long waits to change channels on their new IPTV service. With a plethora of competing IPTV and cable TV service offerings, subscriber churn due to poor quality of experience is a serious risk. This case study identifies how IPTV service quality is affected by different network characteristics, to illustrate the importance of validating IPTV service quality under realistic Multiplay network conditions.

IPTV Quality of Experience (QoE)

IPTV Quality of Experience (QoE) refers to how well the video service satisfies users' expectations. The IPTV quality experienced by subscribers must be equal to or better than today's cable and satellite TV services, or service providers run the risk of significant subscriber churn and the resulting loss of revenue. IPTV is a key component of service provider growth and has been shown to as much as double ARPU. With such significant financial rewards at stake, service providers are taking IPTV QoE very seriously. Measuring IPTV QoE means testing the technical aspects that influence the subscriber's service experience. This discussion focuses on the emerging test methodologies and metrics for verifying IPTV QoE across the IPTV delivery network, as illustrated in Figure 1.

IPTV QoE test measurements

There are two fundamental areas of IPTV QoE testing:

- Channel zapping measurements
- Media (audio and video) quality metrics

Channel zapping measurements gauge how quickly subscribers can change channels, and verify that they are in fact receiving the correct channel. Acceptable channel zapping delay is generally considered to be approximately 1 second total, end-to-end; a channel zapping time of 100-200 ms is perceived by viewers as instantaneous. Multicast protocols enable channel changes within the network infrastructure, so IGMP (Internet Group Management Protocol) or MLD (Multicast Listener Discovery) leave/join delay has a direct impact on channel zapping delay. To keep overall channel zapping delay within one second, the multicast leave/join delay of each network component needs to fall within the 10-200 ms range.

Measuring IPTV media quality is a formidable challenge, since many factors can compromise perceived media quality. The number and behavior of IPTV subscribers, and the convergence of other Multiplay traffic contending for finite network resources, have a significant impact on the timely and accurate forwarding of IPTV packets. Network impairments (packet loss and sequence errors, latency, and jitter) can have various detrimental effects on visible video quality, such as blocking, blurring, edge distortion, judder (choppy playback), and visual noise. Therefore, a complex network environment that accurately reflects the characteristics of Multiplay networks must be reproduced in the lab in order to sufficiently stress network equipment and evaluate IPTV media quality. The media quality test metrics themselves must be scalable, repeatable, and provide insight into the reasons behind performance problems (i.e., be relevant).

Figure 1: IPTV delivery network.
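As a back-of-the-envelope check, the per-component delay budget can simply be added up along a delivery path. The short Python sketch below uses hypothetical hop names and delay values (not measurements from this study) and tests them against the 10-200 ms per-component and 1-second end-to-end targets discussed above.

    # Hypothetical per-hop multicast leave/join delays, in milliseconds.
    # Real values would come from IGMP latency measurements like those below.
    hop_join_delay_ms = {
        "core_router": 15,
        "aggregation_switch": 40,
        "dslam": 120,
    }

    PER_HOP_TARGET_MS = (10, 200)   # target leave/join delay per component
    END_TO_END_TARGET_MS = 1000     # ~1 second total channel-change budget
                                    # (STB decode time not modeled here)

    total_ms = sum(hop_join_delay_ms.values())
    for hop, delay in hop_join_delay_ms.items():
        low, high = PER_HOP_TARGET_MS
        status = "OK" if low <= delay <= high else "OUT OF RANGE"
        print(f"{hop:20s} {delay:4d} ms  {status}")
    print(f"network total: {total_ms} ms "
          f"({'within' if total_ms <= END_TO_END_TARGET_MS else 'exceeds'} budget)")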

The Media Delivery Index (MDI) is gaining widespread industry acceptance for testing media quality across the network elements of a video delivery infrastructure. MDI is an industry standard defined in RFC 4445. MDI's two components, the delay factor (DF) and the media loss rate (MLR), are based on concepts that translate directly into networking terms: jitter and loss. MDI correlates network impairments with video quality, which is vital for isolating problems and determining their root cause.

A high delay factor directly indicates that the device/system under test has introduced jitter into the traffic, which can degrade video quality, and warns of possible impending packet loss as device buffers approach overflow or underflow levels. A high DF value thus points to congestion in the network and indicates the amount of buffer resources required to dejitter the received video and avoid poor video quality at the Set Top Box. The MLR measures the number of out-of-order or lost media packets per second. Media packet loss, represented as a non-zero MLR, compromises video quality and can introduce visual distortions or irregular, uneven video playback.

The amount of media packet jitter and loss that is tolerable or acceptable from a viewer's perspective is subjective; however, a recent study by Agilent Technologies recommends the following cumulative MDI measurements throughout the delivery network:

- Maximum acceptable DF: 9-50 ms
- Maximum acceptable average MLR (all codecs): 0.004 for SDTV and VOD, and 0.0005 for HDTV (1)

The maximum acceptable delay factor is specified as a range because the exact value must be tuned to the buffer size of the Set Top Box being used. It is also important to note that the acceptable MLR depends on the test scenario. The recommended MLR values above are based on a viewing period of several hours, which is why they are expressed as averages. For measurements taken over short viewing periods, the maximum acceptable total MLR is zero.

Network characteristics and media quality

Exponential subscriber growth

It is estimated that IPTV subscriber growth in North America alone will increase 12,985% between 2004 and 2009 (2). France, the leading European country in rolling out IPTV services, had 281,000 subscribers registered to the three main IPTV services (Maligne, Free and Neuf) at the end of 2005 (3). Digital video distributed via IP multicast does not ensure consistent video quality among all the users watching the same channel, so it is difficult to ensure that each and every subscriber is receiving the video properly. Since both bandwidth and processing resources in the IPTV delivery network are finite, it follows that the more subscribers requesting the IPTV service, the higher the threat of compromised QoE. It is critical that network equipment be tested under an increasing scale of both subscribers and IPTV channels to identify the point at which per-subscriber IPTV QoE reaches an unacceptable level (i.e., the performance limits).

Dynamic subscriber behaviors

In a realistic Multiplay user environment, subscribers behave dynamically. A household receiving Multiplay services from a single provider may simultaneously be changing channels, browsing the Internet, and holding multiple VoIP telephone conversations. When scaled across the subscriber base, this dynamic behavior can be very demanding on the control plane of IPTV network elements, and can jeopardize the quality of experience IPTV viewers receive.
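One way to emulate this behavior in a test bed is to model each household as overlapping streams of channel-change, web, and VoIP events. A minimal illustrative sketch in Python (the event mix and rates are hypothetical assumptions, not parameters from this study):

    import random

    random.seed(7)

    def household_events(duration_s=60.0, zap_rate=1.0, web_rate=0.5, voip_calls=2):
        """Generate a time-ordered mix of events that one Multiplay household
        might produce: channel changes, web requests, and VoIP call setups."""
        events = []
        t = 0.0
        while t < duration_s:                       # channel zapping (Poisson-like)
            t += random.expovariate(zap_rate)
            events.append((t, "IGMP leave/join"))
        t = 0.0
        while t < duration_s:                       # web browsing
            t += random.expovariate(web_rate)
            events.append((t, "HTTP request"))
        for _ in range(voip_calls):                 # concurrent VoIP sessions
            events.append((random.uniform(0.0, duration_s), "SIP call setup"))
        return sorted(e for e in events if e[0] < duration_s)

    for when, what in household_events()[:8]:
        print(f"{when:6.2f} s  {what}")

Scaling such generators across hundreds of emulated subscribers is what exercises the control plane in the way the paragraph above describes.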
Converged Multiplay traffic

IPTV traffic shares network links with voice and data traffic from the same Multiplay subscriber, and with traffic from other subscribers sharing an uplink from an aggregation device. All three services contend for finite network bandwidth and equipment resources, and each traffic type requires a different level of service from the network. It is imperative to include a combination of Multiplay traffic within the test environment to identify how the presence (or interference) of other service traffic impacts quality of service and the timely forwarding of high-priority video traffic.

1. Andre Dufour, Agilent White Paper: Understanding the Media Delivery Index, 2006.
2. Infonetics Research, Inc., IPTV Equipment and Services Market Outlook, 2005, p. 28.
3. Franz Kozamernik, IPTV - a different television, July 2006; available from www.ebu.ch/en/union/diffusion_on_line/tech/tcm_6-46276.php, accessed August 3, 2006.
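The contention among the three services described above is typically reproduced in a test bed by generating them as distinct traffic classes with different markings and rates. A small sketch of such a mix (the class names, DSCP values, and rates are illustrative assumptions, not the profile used in these tests):

    # Illustrative Multiplay traffic mix for one GbE uplink. DSCP markings
    # follow common conventions (EF for voice, AF41 for video, best effort
    # for data); the actual profile depends on the operator's QoS design.
    traffic_mix = [
        {"service": "VoIP", "dscp": "EF",   "rate_mbps": 20,  "per_flow_kbps": 100},
        {"service": "IPTV", "dscp": "AF41", "rate_mbps": 400, "per_flow_kbps": 3750},
        {"service": "HSI",  "dscp": "BE",   "rate_mbps": 500, "per_flow_kbps": None},
    ]

    LINK_MBPS = 1000
    load = sum(c["rate_mbps"] for c in traffic_mix)
    print(f"offered load: {load} Mb/s on a {LINK_MBPS} Mb/s link "
          f"({100 * load / LINK_MBPS:.0f}% utilization)")
    for c in traffic_mix:
        print(f'{c["service"]:5s} dscp={c["dscp"]:4s} {c["rate_mbps"]:4d} Mb/s')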

Testing under realistic Multiplay network conditions

Agilent Technologies performed a series of tests against a popular edge router (Broadband Network Gateway) to illustrate how the number of supported subscribers, channel zapping behavior, and other converged service traffic impact IPTV quality of experience. The tests compare sustained IPTV channel zapping performance (IGMP group join/leave latency) and media quality metrics (MDI delay factor and media loss rate) while manipulating the following variables:

- Number of simulated subscribers (10 vs. 100)
- Channel zapping activity (absence vs. presence)
- Background voice and data traffic (absence vs. presence)

Test configuration

Figure 2 illustrates the equipment set-up used for the IPTV QoE measurements. On the service emulation side, a Gigabit Ethernet (GbE) port of the Agilent N2X Multiservice Test Solution (4) was connected to an upstream port of the system under test (SUT) to simulate the core network and service sources, and to generate actual video and background voice and data (High Speed Internet) traffic. On the subscriber access side, a second GbE N2X port was used to simulate an access node (DSLAM) aggregating IPTV subscribers behind the SUT, and to make measurements on the received service traffic.

Channel Zapping Performance Tests

Agilent performed several channel zapping performance test variations to determine how the number of subscribers affected IGMP group join latency during periods of typical random channel zapping behavior, as well as during peak load conditions. Peak load conditions represent instances when all subscribers change to the same channel at the same time, as when viewers return to the live broadcast of a major event, such as the Super Bowl or the Academy Awards, after a commercial break.

Test 1a: Subscriber Scale and Random Channel Zapping
- 10 subscribers cycling through 4 channels at a rate of 1 channel per second
- 100 subscribers cycling through 4 channels at a rate of 1 channel per second

Test 1b: Subscriber Scale and Peak Load Channel Zapping
- 10 subscribers cycling through the same 4 channels at the exact same instant (rate of 1 channel per second)
- 100 subscribers cycling through the same 4 channels at the same instant (rate of 1 channel per second)

Figure 2: Test Configuration.

4. Agilent N2X web site, http://www.agilent.com/find/n2x.
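Join latency in tests like these is derived by pairing each IGMP membership report with the first multicast video packet subsequently received for that group. A simplified Python sketch of that bookkeeping (the event tuples are illustrative; a hardware tester such as N2X timestamps these events on the wire):

    # Events observed on the subscriber side: (time_s, subscriber, kind, group).
    # kind is "join" (IGMP report sent) or "first_pkt" (first video packet seen).
    events = [
        (0.000, "sub01", "join",      "239.1.1.1"),
        (0.142, "sub01", "first_pkt", "239.1.1.1"),
        (0.500, "sub02", "join",      "239.1.1.2"),
        (0.913, "sub02", "first_pkt", "239.1.1.2"),
    ]

    pending = {}        # (subscriber, group) -> time the IGMP join was sent
    latencies_ms = []
    for ts, sub, kind, group in events:
        if kind == "join":
            pending[(sub, group)] = ts
        elif (sub, group) in pending:
            latencies_ms.append((ts - pending.pop((sub, group))) * 1000.0)

    print(f"avg join latency {sum(latencies_ms) / len(latencies_ms):.0f} ms, "
          f"max {max(latencies_ms):.0f} ms")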

Channel Zapping Measurements: Sustained IGMP Performance

For each test variation, N2X measured the maximum and average IGMP group join latency over a period of 1 minute with a measurement interval of 5 seconds, as shown in Figures 3 and 4.

Figure 3: IGMP Group Join/Leave Latency under Random Channel Zapping (Test 1a) (10 vs. 100 Subscribers).

Measurement Analysis: Channel Zapping Measurements

To keep overall channel zapping delay within one second, the multicast leave/join delay of each network component needs to fall within the 10-200 ms range. Tests 1a and 1b measured sustained IGMP group join delay. A comparison of the join delay measurements in test 1a (Figure 3) indicates that it takes approximately 3 times longer for subscribers to join a new channel when the device is supporting 100 IPTV subscribers rather than only 10. The average join delay in the 10-subscriber case remains below the recommended 200 ms for group leave/join delay. With 100 subscribers, however, the average join delay fluctuated between 200 and 400 ms, which exceeds the acceptable threshold for channel zapping delay. It is also important to note that the maximum channel zapping delay with 10 subscribers peaked at 460 ms, indicating that at least 1 of the 10 subscribers experienced unacceptable channel zapping delay during the 60-second test interval. The maximum join delay with 100 subscribers ranged from 400 ms to over 600 ms, meaning at least one subscriber experienced very noticeable delays between channel changes.

During peak load conditions (Test 1b), when all subscribers change to the same channel at the same time, there is a notable increase in the join delay experienced by subscribers compared with the random channel zapping test case (1a). The average join delay with 10 subscribers was below the 200 ms threshold in test 1a; under the peak load conditions of test 1b, however, the join delay consistently exceeded 200 ms for the same 10 subscribers. With 100 subscribers, the join delay averaged approximately 600 ms over the measurement interval, with the maximum delay reaching over 800 ms: four times the maximum recommended channel change delay.

Figure 4: IGMP Group Join/Leave Latency under Peak Load Channel Zapping (Test 1b) (10 vs. 100 Subscribers).
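The per-interval statistics plotted in Figures 3 and 4 are a straightforward reduction of raw latency samples. A sketch of that reduction, using synthetic (timestamp, latency) samples over a 60-second run in place of real measurements:

    import random

    random.seed(1)
    # Synthetic raw samples: (time_s, join_latency_ms) over a 60-second run.
    samples = [(t + random.random(), random.uniform(150.0, 450.0))
               for t in range(60)]

    INTERVAL_S = 5
    buckets = {}
    for ts, latency in samples:
        buckets.setdefault(int(ts // INTERVAL_S), []).append(latency)

    for idx in sorted(buckets):
        lats = buckets[idx]
        avg, peak = sum(lats) / len(lats), max(lats)
        flag = "" if peak <= 200 else "  <- exceeds 200 ms target"
        print(f"{idx * INTERVAL_S:2d}-{(idx + 1) * INTERVAL_S:2d}s  "
              f"avg {avg:4.0f} ms  max {peak:4.0f} ms{flag}")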

Media Quality Performance Tests

Test 2a: Subscriber Scale
- 10 subscribers, with no channel zapping and no background voice or data traffic
- 100 subscribers, with no channel zapping and no background voice or data traffic

Test 2b: Subscriber Scale and Random Channel Zapping
- 10 subscribers randomly cycling through 4 channels (rate of 1 channel per second)
- 100 subscribers randomly cycling through 4 channels (rate of 1 channel per second)

Test 2c: Subscriber Scale, Random Channel Zapping plus Background Voice and Data Traffic
- 10 subscribers, random channel zapping plus background voice and data traffic
- 100 subscribers, random channel zapping plus background voice and data traffic

Test 2d: Subscriber Scale, Peak Load Channel Zapping plus Background Voice and Data Traffic
- 10 subscribers cycling through the same 4 channels at the exact same instant (rate of 1 channel per second) plus background voice and data traffic
- 100 subscribers cycling through the same 4 channels at the same instant (rate of 1 channel per second) plus background voice and data traffic

Media Quality Measurements: For each test variation, N2X measured the maximum MDI Delay Factor and Media Loss Rate over a period of 1 minute, with a measurement interval of 5 seconds, as shown in Figures 5 through 9.

Figure 5: MDI Delay Factor under Subscriber Scale (Test 2a) (10 vs. 100 Subscribers).
Figure 6: MDI Delay Factor during Random Channel Zapping (Test 2b) (10 vs. 100 Subscribers).
Figure 7: MDI Delay Factor during Random Channel Zapping plus Background Voice and Data Traffic (Test 2c) (10 vs. 100 Subscribers).
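The four variations above, crossed with the two subscriber counts, form a small test matrix. Enumerating it programmatically makes the experimental design explicit (the labels mirror tests 2a-2d; the representation is illustrative, not N2X configuration syntax):

    from itertools import product

    subscriber_counts = (10, 100)
    variations = {
        "2a": {"zapping": "none",   "background": False},
        "2b": {"zapping": "random", "background": False},
        "2c": {"zapping": "random", "background": True},
        "2d": {"zapping": "peak",   "background": True},
    }

    for (name, params), subs in product(variations.items(), subscriber_counts):
        print(f"Test {name}: {subs:3d} subscribers, zapping={params['zapping']:<6s}, "
              f"background voice/data={'yes' if params['background'] else 'no'}")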

Figure 8: MDI Delay Factor under Peak Load Channel Zapping plus Background Voice and Data Traffic (Test 2d) (10 vs. 100 Subscribers).
Figure 9: Media Loss Rate (10 vs. 100 Subscribers, All Test Variations).

Measurement Analysis: Delay Factor

To keep the cumulative Delay Factor across the IPTV delivery network within the acceptable 9-50 ms range, the amount of delay introduced by each individual network device must remain minimal. Measurements from Figure 5 (Test 2a) indicate that the delay introduced by the DUT does not vary significantly when the number of supported subscribers increases from 10 to 100. The DF values across the sampling interval remain below 1.22 ms, which is acceptable given the buffer size of most Set Top Boxes. The performance of the device should be predictable under these ideal (yet unrealistic) conditions, when no channel change requests or other Multiplay traffic are being processed.

Results in Figure 6 (Test 2b) show that when supporting only 10 subscribers, the performance of the DUT appears relatively unaffected by random channel zapping activity. However, when the supported subscriber base increases to 100, random channel zapping raises the delay factor from 1.22 ms to approximately 1.50 ms, with one sample reaching 2.37 ms. When background voice and data traffic is added in Test 2c (Figure 7), the delay factor introduced by the DUT climbs to over 15 ms. The 15 ms of delay introduced by this DUT alone could cause problems when the video is played on lower-end Set Top Boxes with small buffers: such Set Top Boxes would be unable to completely dejitter the received video, and buffer overflow or underflow could result, leading to poor video quality. Furthermore, if several devices in the IPTV delivery network introduced similar delay, the resulting delay factor could easily exceed the recommended 50 ms threshold for high-end Set Top Boxes. Figure 8 (Test 2d) shows that the added stress placed on the DUT during peak load channel zapping results in even greater jitter, with the DF value ranging from 15-21 ms across the 100 emulated subscribers.

Measurement Analysis: Media Loss Rate

Given that the measurement interval was only 60 seconds, any recorded media packet loss during this brief test duration would not be deemed acceptable. Figure 9 shows that there was no media packet loss across the 10-subscriber pool for any of the test variations. However, minor packet loss (MLR 0.0007) was recorded across the 100-subscriber pool when channel zapping and background voice and data traffic were present. Although this amount may appear negligible, if it were sustained over time it would have a detrimental impact on subscribers' viewing experience. The most alarming measurement is the amount of media packet loss that occurred across the 100-subscriber pool when peak load channel zapping was added to the background voice and data traffic: MLR values across the sampling period range from 0.001 to 0.015. Subscribers experienced packet loss and, over time, the impact of this loss would result in significant video impairments and a very poor quality of experience for viewers.
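The DF and MLR values discussed above can be recomputed from a raw packet-arrival trace. Here is a minimal Python sketch of the RFC 4445 virtual-buffer calculation, assuming a constant-bitrate stream; the drain rate, packet sizes, and trace below are illustrative rather than data from these tests:

    def mdi(arrivals, rate_bps):
        """RFC 4445 Media Delivery Index over one measurement interval.
        arrivals: list of (time_s, size_bytes, seq) for received media packets.
        rate_bps: nominal media drain rate in bits per second."""
        rate_bytes = rate_bps / 8.0
        vb = vb_min = vb_max = 0.0           # virtual buffer level and its extremes
        prev_t = arrivals[0][0]
        expected_seq = arrivals[0][2]
        lost_or_ooo = 0
        for t, size, seq in arrivals:
            vb -= (t - prev_t) * rate_bytes  # buffer drains continuously
            vb_min = min(vb_min, vb)
            vb += size                       # arriving packet fills it
            vb_max = max(vb_max, vb)
            prev_t = t
            if seq != expected_seq:          # gaps/reordering count toward MLR
                lost_or_ooo += abs(seq - expected_seq)
            expected_seq = seq + 1
        span_s = (arrivals[-1][0] - arrivals[0][0]) or 1.0
        df_ms = (vb_max - vb_min) / rate_bytes * 1000.0   # Delay Factor (ms)
        mlr = lost_or_ooo / span_s                        # Media Loss Rate (pkts/s)
        return df_ms, mlr

    # Illustrative trace: 1316-byte payloads (seven 188-byte TS packets) at a
    # nominal 3.75 Mb/s, with packet seq 2 lost and some arrival jitter. Real
    # measurement intervals span a second or more, not a few milliseconds.
    trace = [(0.0000, 1316, 0), (0.0031, 1316, 1), (0.0090, 1316, 3),
             (0.0115, 1316, 4), (0.0145, 1316, 5)]
    df, mlr = mdi(trace, rate_bps=3_750_000)
    print(f"DF = {df:.2f} ms, MLR = {mlr:.1f} lost packets/s")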

Conclusion

This case study measured how IPTV QoE is affected when common, yet increasingly stressful, network characteristics are introduced into the test environment. The measurements revealed that the performance of the DUT was predictable and acceptable, in terms of the subscriber's video quality of experience, under ideal network conditions and a low number of subscribers. However, when the subscriber pool was scaled and additional stress was placed on the DUT's control and forwarding planes, video quality suffered significantly. The comparative video quality measurements illustrate that it is critical to test under realistic network conditions in order to identify how a device will perform once deployed in a live Multiplay network.

The best way for service providers and network equipment manufacturers to mitigate the risk of quality-related subscriber churn is to ensure that their customers' viewing experience meets expectations from day one. It is critical to select test tools and methodologies able to exercise IPTV in the larger context of Multiplay networks, verifying IPTV QoE under increasing scale, with dynamic subscriber behavior, and amidst other voice and data services. Since quality can be inconsistent across the subscriber base, it is important to evaluate IPTV QoE on a per-subscriber basis. Test equipment must be capable not only of providing these measurements, but also of troubleshooting performance issues so that equipment and network configurations can be optimized accordingly. Diligent and thorough pre-deployment testing under realistic and dynamic network conditions has a direct influence on IPTV subscriber satisfaction, and on the ultimate success of new IPTV and Multiplay service offerings.


Sales, Service and Support

N2X must be serviced by an approved Agilent Technologies service centre; please contact us for more information.

United States: Agilent Technologies, Test and Measurement Call Center, P.O. Box 4026, Englewood, CO 80155-4026. Tel: 1-800-452-4844

Canada: Agilent Technologies Canada Inc., 2660 Matheson Blvd. E, Mississauga, Ontario L4W 5M2. Tel: 1-877-894-4414

Europe: Agilent Technologies, European Marketing Organisation, P.O. Box 999, 1180 AZ Amstelveen, The Netherlands. Tel: (31 20) 547-2323. United Kingdom: 07004 666666

Japan: Agilent Technologies Japan Ltd., Measurement Assistance Center, 9-1, Takakura-Cho, Hachioji-Shi, Tokyo 192-8510, Japan. Tel: (81) 426-56-7832, Fax: (81) 426-56-7840

Latin America: Agilent Technologies, Latin American Region Headquarters, 5200 Blue Lagoon Drive, Suite #950, Miami, Florida 33126, U.S.A. Tel: (305) 269-7500, Fax: (305) 267-4286

Asia Pacific: Agilent Technologies, 19/F, Cityplaza One, 1111 King's Road, Taikoo Shing, Hong Kong, SAR. Tel: (852) 3197-7777, Fax: (852) 2506-9233

Australia/New Zealand: Agilent Technologies Australia Pty Ltd, 347 Burwood Highway, Forest Hill, Victoria 3131. Tel: 1-800-629-485 (Australia), Fax: (61-3) 9272-0749; Tel: 0-800-738-378 (New Zealand), Fax: (64-4) 802-6881

This information is subject to change without notice. Printed on recycled paper. Agilent Technologies, Inc. 2008. Printed in USA, October 03, 2008. 5989-6143EN