Doc No

20TH EACO REGULATORY ASSEMBLY/CONGRESS

REPORT OF THE EACO QUALITY OF SERVICE (QoS) TASKFORCE MEETING HELD FROM 22ND TO 26TH APRIL 2013 IN NAIROBI, KENYA

1.0 Introduction

1.1 Background

Following the 18th EACO Congress decision (Agreed 57 xiv), hereby quoted as follows:

"A Quality of Service Taskforce is created with the mandate to review and advise on quality of service parameters across all ICT Services and Networks in East Africa. It shall coordinate with other related taskforces including the Consumer Affairs Taskforce in the implementation of its terms of reference."

The maiden taskforce meeting was held on 17th and 18th April 2012 in Kigali, Rwanda, under the chairmanship of RURA. The 2nd taskforce meeting was held from 22nd to 26th April 2013 in Nairobi under the chairmanship of CCK, with oversight by an official from the EACO Secretariat. The proceedings of this meeting are presented below; areas that are not fully covered will be finalised as per the attached work plan.
1.2 Attendance

Below is the list of participants:

- Mr. Daniel Waturu (CCK) - Kenya (Chairman)
- Mr. Liston Kirui (CCK) - Kenya
- Mr. Protais Kanyankore (RURA) - Rwanda
- Mr. Constaque Hakizimana (ARCT) - Burundi
- Mr. Didace Ndivyariye (ARCT) - Burundi
- Ms. Irene Nakaggwa (UCC) - Uganda
- Ms. Caroline Koech (EACO Secretariat)

Additional details of the participants are provided in ANNEX 1.

1.3 Opening of the Meeting

The meeting was officially opened with remarks from CCK's Director of Licensing, Compliance and Standards, Mr. Christopher Kemei. In his remarks he emphasised the following:

- Good quality ICT systems and services are important in unlocking the enormous potential benefits of ICT to the public.
- It is important to deploy and provide ICT services that meet and surpass the minimum acceptable quality of service and standards, in order to ensure that consumers get value for money in their usage of ICT.
- It is important to institutionalize SLAs as a way of improving QoS for consumers.
- There are challenges in monitoring compliance with quality standards; he tasked the QoS Taskforce to consider these during the meeting deliberations with a view to finding solutions.

1.4 Election of the Bureau

Mr. Daniel Waturu from the host country (Kenya) was requested by the EACO Secretariat to chair the meeting. Burundi was elected as 1st rapporteur and Uganda as 2nd rapporteur. The EACO Secretariat was requested to support the two rapporteurs.

1.5 Agenda

The agenda outlined below was adopted:

a. Arrival and Registration of Participants
b. Opening of the Meeting
c. Election of the Bureau
d. Adoption of the Agenda
e. Review of the report of the last QoS/QoE Taskforce meeting
f. Review and Adoption of the ToRs
g. Presentation of country experiences in enforcing QoS/QoE on Licensees and challenges encountered
h. Identification of the ICT networks, ICT services, QoS & QoE parameters, targets and QoS monitoring tools & different approaches in the EAC region
i. Harmonization of QoS/QoE parameters and targets for the region
j. Identification of relevant ITU-T Recommendations and Standards from other SDOs on QoS/QoE
k. Benchmarking with other Regional (CRASA, WATRA) and International (TRAI-India) administrations
l. Recommendations
m. Drafting of Report
n. Closing

1.6 Review of the Report of the Last QoS/QoE Taskforce Meeting

The report of the last EACO Quality of Service (QoS) Taskforce meeting was presented to members by the chairman, Mr. Daniel Waturu. Matters arising out of the last report are included in this report and form part of the attached annexes.

2.0 Summary of Decisions

2.1 Review and Adoption of the ToRs

The Taskforce discussed the above item and recommended adoption of the general Terms of Reference provided for in the 18th EACO Congress report, as stated below. The ToRs and their status are shown in the attached ANNEX 2.

2.2 Presentation of Country Experiences in Enforcing QoS/QoE on Licensees and Challenges Encountered

It was observed that different EACO member countries have different QoS targets and monitoring mechanisms. It was also observed that some countries did not have the ability to monitor certain technologies
such as PSTN, data networks and the internet, among others. The country experiences are presented in the attached ANNEX 3.

2.3 Identification of the ICT Networks, ICT Services, QoS & QoE Parameters, Targets and QoS Monitoring Tools & Different Approaches in the EAC Region

ICT networks, ICT services, QoS & QoE parameters, targets and QoS monitoring tools have been identified by the different countries. A matrix containing each country's status on the parameters and targets was developed and populated, as attached in ANNEX 4. A matrix containing each country's monitoring and measuring tools and their capabilities is attached in ANNEX 5.

2.4 Harmonization of QoS/QoE Parameters and Targets for the EACO Region

A matrix of QoS/QoE parameters, targets, computations, definitions, references, measurement methodologies and other relevant details for the region was developed, as illustrated in the attached ANNEX 6.

Noted: Parameters, targets and computations have only been harmonised for mobile networks; they have been identified (but not harmonised) for the other networks.

Agreed: The Taskforce is to continue working on the areas that have not been covered, as per the attached work plan.
2.5 Identification of Relevant ITU-T Recommendations and Standards from Other SDOs on QoS/QoE

The team reviewed various documents, as detailed in ANNEX 7.

Agreed:
i. The Taskforce is to consider these documents in developing the harmonised QoS parameters.
ii. Taskforce members were encouraged to study the documents and utilise the recommendations therein.

2.6 Benchmarking with Other Regional and International (CRASA, WATRA, TRAI-India, etc.) Administrations

Agreed:
i. It was recommended that EACO member countries sponsor QoS/QoE Taskforce members for benchmarking visits to member countries of the above regional regulatory organizations, to build capacity on the subject matter.

3.0 Other Activities of the Taskforce During the Meeting

3.1 Meeting with the Director General, CCK

The members of the QoS/QoE Taskforce held a meeting with Mr. Francis Wangusi, Director General, CCK. The Director General commended the team for their efforts and emphasised the following:

- It is important to involve other stakeholders, including academia and other industry players, when developing and discussing QoS/QoE issues.
- There is need to have more Taskforce meetings in order to have continuity between Congress meetings.
- There is need for the various Taskforces to coordinate their activities for effective delivery of the expected results.
3.2 Enforcing Compliance

The QoS/QoE Taskforce appreciates the Congress recommendation that the Enforcement Taskforce be reinstated to work with the QoS/QoE Taskforce. It was felt that once the QoS/QoE parameters and targets have been identified and included in the respective licenses, they MUST be enforced. This will entail elaborate monitoring and the imposition of the requisite sanctions. Below are the proposed enforcement sanctions for non-compliance in the EACO region, as developed by the Enforcement Taskforce:

- Regular publication of QoS measurement results
- Issuance of reprimands
- Issuance of compliance directives
- Imposition of penalties
- Requiring operators/service providers to compensate their customers
- Imposition of embargos on persistent violators (denial of Government contracts)
- License revocation (as a last resort)
- Encouraging the signing of SLAs between service providers, or between service providers and their customers

4.0 RECOMMENDATIONS

1. QoS/QoE Taskforce meetings require more time than currently allocated in order to comprehensively handle all issues relating to the subject matter. It is proposed to have a minimum of five days per session (instead of the current 2-3 days).
2. The Regulators Assembly to recommend to the Congress to approve the development of a model industry SLA by the QoS/QoE Taskforce, as an enforcement tool in the region.
3. The Regulators Assembly to consider non-technical mechanisms, such as customer service surveys, as part of QoS/QoE monitoring and measurements.
4. The Regulators Assembly to incorporate operators and other stakeholders to contribute to the QoS/QoE Taskforce work.
5. The Taskforce to develop a comprehensive document on QoS/QoE guidelines for EACO members.
6. Member countries are requested to sponsor QoS/QoE Taskforce members for benchmarking with other regional and international bodies to build their capacity in handling QoS/QoE matters. They are in particular encouraged to sponsor Taskforce members to attend ITU-T SG 12 and Regional Group for Africa meetings/workshops on QoS/QoE.
7. EACO member countries are requested to maintain consistency in their representation at the Taskforce meetings for effective contribution to the work of the Taskforce.
8. EACO member countries are requested to ensure that they are represented, and participate, in the QoS/QoE Taskforce meetings.

5.0 Decision Expected/Request

The Regulators Assembly is requested to take note of this report together with the recommendations, and to recommend them for approval and adoption by the Congress.

Mr. Daniel K. Waturu
Chairman, EACO QoS/QoE Taskforce
ANNEX 1 - PARTICIPANTS

NAME                    ORGANIZATION/COUNTRY    EMAIL ADDRESS
Protais Kanyankore      RURA                    Protais.kanyakore@rura.gov.rw
Liston Kirui            CCK                     lkirui@cck.go.ke
Daniel Waturu           CCK                     waturu@cck.go.ke
Didace Ndivyariye       ARCT                    ndivyariyedidace@yahoo.fr
Constaque Hakizimana    ARCT                    hkizimanac@yahoo.fr
Irene Nakaggwa          UCC                     inakagwa@ucc.co.ug
Caroline Koech          EACO                    ckoech@eaco.int
ANNEX 2 - STATUS ON TERMS OF REFERENCE

1. Terms of Reference: Harmonize QoS & QoE parameters of the various networks (including mobile networks, legacy PSTN networks and IP networks) at EAC region level. The work of the Broadcasting Technical Taskforce on QoS needs to be considered.
   Current Status: Compiled each country's ICT networks and services. The EACO Secretariat representative is to coordinate with the Broadcasting Taskforce on the QoS issues that need to be considered. Work in progress.

2. Terms of Reference: Identify and develop measurement formulas and network performance monitoring (test equipment, activities...) for quality of service purposes, at EAC region level.
   Current Status: Benchmarked with other international standards. Each country to provide the measurement tools used for monitoring QoS. To consider both technical and non-technical measurements.

3. Terms of Reference: Encourage EACO country members (regulators, operators...) to contribute to the development of new/revised ITU-T Recommendations in relation to quality of service.
   Current Status: Region members are contributing and actively participating in ITU-T Study Group 12 and other SDOs. Contributed to ITU-T approved Recommendations on QoS/QoE.

4. Terms of Reference: Encourage participation of EACO country members in QoS Rapporteur's meetings, workshops, Quality of Service Development Group (QSDG) meetings and other ITU-T Study Group 12 events.
   Current Status: Members are participating in QoS Rapporteur's meetings, QSDG meetings and ITU-T Study Group 12 events. A region member from RURA (Ms Yvonne Umotoni) is currently chairing the QSDG meetings.

5. Terms of Reference: Carry out other activities which may improve the QoS, QoE & network performance of the EAC region.
   Current Status: The Taskforce will: encourage regular publication of QoS/QoE reports; participate in both regional & international benchmarks; develop model SLAs for consideration and adoption by EACO member countries; and carry out consumer awareness/education.
6. Terms of Reference: Liaise with other stakeholders, namely telecom operators/ISPs, on issues relating to harmonization.
   Current Status: To engage the different stakeholders on harmonising QoS parameters.
ANNEX 3 - COUNTRY EXPERIENCES IN ENFORCING QOS/QOE ON LICENSEES AND CHALLENGES ENCOUNTERED

UGANDA

Industry Statistics (as of December 2012):
- PIPs: 24
- Major mobile operators: 5
- Mobile subscribers: 16,356,387
- Fixed lines: 314,956
- Internet subscribers: penetration rate of 48.8%
- Operational TV stations: 60
- Operational radio stations: 250
- Courier licence operators: 8 international, 7 regional, 13 domestic

QoS Interventions:
- Legal mandate under the Communications Act 2013
- License provisions
- QoS parameters set for all networks (mobile, fixed & internet networks)
- Monitoring of mobile networks (drive tests & physical inspections)
- Quarterly "name & shame" reports
- Consumer complaints mechanism
- Consumer surveys
- Conducting consumer awareness
- Strengthening collaborations with consumer associations
- Established regional offices

Challenges:
- Appropriate monitoring and evaluation system
- Low enforcement system
- Infrastructural challenges
- Culture of complacency
- Unavailable comparable and relevant QoS information

Future Plans:
- Formulate more regulations on QoS/QoE
- Strengthen enforcement
- Develop QoE parameters
- Continue to monitor consumer expectations and perceptions through research
- Strengthening stakeholder collaborations
- More consumer awareness

BURUNDI

Industry Statistics:
- Mobile subscribers: 2,247,126
- Main line subscribers: 17,394
- Internet subscribers: 173,329
- Radio stations: 20
- TV stations: 5 (operational) + 1 operator (not operational)

QoS Interventions:
- Intervene in situations where the QoS of networks and services has deteriorated
- Conduct surveys to assess consumers' perception of the quality of services
- Organize group and individual meetings to address degradation of service quality
- Conduct inspections and collect the necessary data from operators to identify the cause of faults
- Perform technical measurements on the network (drive tests)

Challenges:
- Lack of improved skills in the use of monitoring equipment
- The regulatory framework does not provide specific penalties for non-compliance with QoS indicators
- Lack of an updated measurement system in terms of capacity and software

Future Plans:
- Improve skills in the use of monitoring equipment

KENYA

Industry Statistics:
- No. of mobile operators: 4
- No. of mobile subscribers: 30,731,754
- No. of fixed line subscribers: 251,567
- No. of internet subscribers: 9.49 million
- No. of internet users: 16.2 million
- No. of licensed TV broadcasters: 16; radio broadcasters: 99

QoS Interventions:
- Stringent QoS monitoring and enforcement for mobile QoS: scheduled monitoring and reporting in place to independently take QoS measurements; requirement placed on service providers to submit quarterly QoS performance information to the Commission
- For the rest of the licensees: certain parameters identified and prescribed, but no targets set for the service providers to meet
- Additional interventions undertaken whenever consumers raise QoS concerns, whether on mobile or other services

Challenges:
- Currently lacks the means to independently verify the accuracy of the returns received
- Insufficient capacity to undertake countrywide monitoring on a quarterly basis in order to make compliance & enforcement more effective
- Penalties regime not sufficiently deterrent

Future Plans:
- Plans to enhance QoS monitoring of mobile QoS to include data/multimedia services
- Plans to identify and acquire QoS measurement equipment for fixed line and other services

RWANDA

TANZANIA
ANNEX 4 - IDENTIFIED ICT NETWORKS, SERVICES, QOS/QOE PARAMETERS AND TARGETS

ICT Networks:
- Burundi: PSTN, Mobile, IP-based Data, Broadcasting
- Kenya: PSTN, Mobile, Data, Broadcasting
- Rwanda: PSTN, Mobile, IP-based, Broadcasting
- Tanzania: PSTN, Mobile, International IP, Broadcasting
- Uganda: PSTN, Mobile, Data, Broadcasting

ICT Services:
- Burundi: Voice, Data, SMS, Multimedia, VAS, VoIP
- Kenya: Voice, Data, PRSP/Content, Multimedia, SMS and USSD, VoIP, VAS, Money Transfer
- Rwanda: Voice, Data, Value-added services (SMS, USSD services...), Multimedia services (VoIP, Video on Demand...), Cloud computing
- Tanzania: PSTN, International, Mobile Telephone, Internet services, Content services
- Uganda: Voice, Data, SMS, Content services, Multimedia services, USSD services, VoIP services, Money Transfer

QoS & QoE Parameters and Minimum QoS Targets
(Where more than one target is listed for a parameter, the values reflect different member countries' entries.)

IP-based Networks / Internet Services

- Successful log-in ratio: Dial-up users must be able to connect at least 90% of the time; leased line users must be able to connect at least 99% of the time.
- Service activation/provisioning time: >95% of work should be done within 5 working days, subject to technical feasibility; 5 working days from the time of completion of the request for fixed services; 90% of all activations should be complete within 24 hours for mobile services; in case of NTF (non technical feasibility), greater than 99% should be done on the date specified and agreed with the customer. An SLA should always be signed between the ISP and the customer.
- Delay (one-way transmission time): One-way transmission time (international) should be less than or equal to 150 ms.
- Fault repair/restoration time: >85% of all faults should be cleared within 24 hrs; >99% of all faults should be cleared within a maximum of four (4) days, unless natural disasters/acts of God arise.
- Network availability: >99% availability for core network elements; >95% for access networks (1).
- Loss ratio: The loss ratio for any class of service should be less than 10^-3.
- Customer complaint resolution: >95% of complaints should be resolved within 24 hrs (24/7); >99% of complaints should be resolved within 72 hrs (24/7).
- Unsuccessful data transmission ratio
- Billing performance: Billing complaints per 100 bills issued <2% of bills issued during the billing period.
- Data transmission speed achieved: The data transmission speed achieved should be at least 80% of that advertised by the service provider.
- Bandwidth utilization/throughput: <90% link/route bandwidth utilization during peak hours (TCBH); if bandwidth utilization on any link/route exceeds 90%, the network is considered to have congestion.
- Broadband connection speed achieved: 95% or greater of the speed of connection (up/downstream) from the ISP's server(s) to the customer shall be achieved full time (24/7); >80% for non-dedicated lines.
- Network latency: The round-trip delay for traffic within the local broadband network from end user to ISP/IX should be less than 150 ms for 95% of the time during peak hours.
- Service availability/uptime (for all users): Greater than 99% of the time, the network shall be available to the subscribers.
- Customer perception of services: % satisfied with the provision of service >95%; % satisfied with billing performance >95%; % satisfied with help services; % satisfied with network performance, reliability and availability >95%; % satisfied with maintainability >95%. Complaint categories tracked: network faults, poor service reception, disconnection while service is in progress, billing, poor customer service, spam control.

Mobile Cellular Networks

- Call Setup Success Rate (CSSR): >=95%; >95%; >=95% (in busy hour)
- Blocked call rate: <2%
- Call Drop Rate (CDR): <2%; <2%; <3%
- Dropped call rate: Less than 2% of established calls dropped before either the called or the calling party terminates the connection
- Call setup time: 20 s for on-net calls; <13.5
- TCH blocking rate / Blocked Call Rate (TCH Congestion): <2%; <5%; <2%
- SDCCH blocking rate / Blocked Call Rate (SDCCH Congestion): <2%; <1%
- Good call quality: >3.1
- Good call quality rate: Greater than 90% of successful calls during busy hour
- Speech quality (MOS, PESQ values): >95% of samples >2.7
- Voice quality (percentage of voice with good quality): >90%
- Outgoing intra-BSS HO drop rate: <1%
- Outgoing inter-BSS HO drop rate: <2%
- POI Congestion: <2%
- Point of Interconnect blocking: Not more than 1.5% of inter-network calls should be blocked at the interconnect point
- Handover Success Rate: >90%; >=95%
- HO request UL quality rate: 20%
- HO request UL level rate: 10%
- HO request DL quality rate: 20%
- HO request DL level rate: 5%
- Rx Lev (dBm): Indoor -95 dBm; in-car -100 dBm; outdoor -102 dBm
- SMS completion rate: 98% of peer-to-peer SMSs sent are delivered within 2 minutes from the instant they are sent
- Successful SMS ratio / SMS success rate: >=95%
- SMS end-to-end duration: Upper limit 48 hours
- End-to-end delivery time for SMS
- Network availability (for MSCs & BSCs): >98%
- Network availability (for BTSs): >95%

Fixed Networks

- Telephone main line faults per 100 main lines per annum
- Fault incidence per 100 access lines: 15 or fewer fault reports for copper lines a month; 5 or fewer fault reports for WLL lines per month; less than 20% of proved faults are repeated
- Service restoration time: 80% of all service restorations should be fulfilled within 24 hrs; 100% of all service restorations should be fulfilled within 48 hrs. Any service interruptions that cannot be resolved within these limits and are due to exogenous factors should be reported to the Commission within 12 hours of the occurrence of the fault.
- Fault repair time: 85% of all troubles should be cleared within 24 hrs; 95% of all troubles should be cleared within a maximum of four (4) working days, unless extenuating circumstances arise
- Average time taken for calls to be connected
- Average daily percentage of payphones in working condition
- Percentage of payphone faults repaired
- Call Completion Rate (CCR): 85% or higher of all call attempts
- Blocked Call Rate (BCR): 2% of all established connections per month
- Call Setup Success Rate (CSSR) (outgoing and incoming): 95% or higher of the total number of local and national calls shall be successful; 93% or higher of the total number of incoming and outgoing international calls shall be successful
- Grade of Service: Less than 2%

(1) Access network: An implementation comprising those entities which provide the required transport bearer capabilities for the provision of telecommunications services between a Service Node Interface (SNI) and each of the associated User-Network Interfaces (UNIs) - ITU-T G.902.
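The mobile-network figures above lend themselves to a simple automated check. The following is a minimal sketch, not part of the Taskforce report, showing how CSSR, CDR and TCH blocking could be computed from a batch of call-attempt records and compared against representative targets drawn from the table (CSSR >= 95%, CDR < 2%, TCH blocking < 2%); the record fields, the function name kpi_report and the sample data are illustrative assumptions, not a prescribed methodology.

```python
# Illustrative sketch only: the record layout, thresholds chosen and names below
# are assumptions for demonstration, not part of the Taskforce report.
from dataclasses import dataclass

@dataclass
class CallAttempt:
    tch_assigned: bool   # a traffic channel was successfully assigned
    setup_ok: bool       # the call was successfully set up end to end
    dropped: bool        # the call ended abnormally after being established

def kpi_report(attempts):
    """Compute CSSR, CDR and TCH blocking and flag them against example targets."""
    total = len(attempts)
    setups = sum(1 for a in attempts if a.setup_ok)
    dropped = sum(1 for a in attempts if a.setup_ok and a.dropped)
    blocked_tch = sum(1 for a in attempts if not a.tch_assigned)

    cssr = 100.0 * setups / total if total else 0.0
    cdr = 100.0 * dropped / setups if setups else 0.0
    tch_blocking = 100.0 * blocked_tch / total if total else 0.0

    return {
        "CSSR (%)": (cssr, cssr >= 95.0),
        "CDR (%)": (cdr, cdr < 2.0),
        "TCH blocking (%)": (tch_blocking, tch_blocking < 2.0),
    }

if __name__ == "__main__":
    # 96 good calls, 1 dropped call, 3 attempts blocked for lack of a traffic channel
    sample = ([CallAttempt(True, True, False)] * 96
              + [CallAttempt(True, True, True)]
              + [CallAttempt(False, False, False)] * 3)
    for name, (value, ok) in kpi_report(sample).items():
        print(f"{name}: {value:.2f} -> {'meets target' if ok else 'misses target'}")
```

With the sample data, CSSR (97%) and CDR (about 1%) meet the example targets while TCH blocking (3%) misses its 2% threshold, which is the kind of per-parameter result a drive-test or counter-based tool would feed into a compliance report.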
ANNEX 5 - IDENTIFIED QOS MEASUREMENT TOOLS & THEIR CAPABILITIES FOR EACH COUNTRY

UGANDA
Measurement Tool/Equipment: Quality of service monitoring tool supplied by SGS and installed by Keynote Sigos, based in Germany.
Capabilities: Drive tests for voice and data, both fixed and mobile, on GSM/UMTS, LTE and CDMA 450 networks; fixed data tests for CDMA 800. Supports automatic reporting for all ITU data and voice KPIs on a variety of platforms, including display on Google Maps. The system can be used to monitor value-added premium services such as mobile money, application services on USSD, and MMS services.
Challenges: Only limited samples can be collected in a given location. Can only test 3 networks simultaneously. Events cannot be viewed in real time remotely. Cannot decode network cause values. The system uses simulated terminals and, as a consequence, is not able to capture and decode some service setup protocols.

KENYA
Measurement Tool/Equipment: QVoice, by Ascom AG of Switzerland.
Capabilities: QVoice has voice as well as data measurement capabilities. It is capable of performing analysis, with report-generating functionality that assists in the interpretation of network behaviour as experienced by customers.
Challenges: Has now been superseded by other versions of the equipment. Based on proprietary hardware and software.

BURUNDI

RWANDA

TANZANIA
ANNEX 6 - QOS/QOE HARMONIZED PARAMETERS & TARGETS FOR THE REGION - ICT NETWORKS

A. TECHNICAL PARAMETERS

Mobile Networks (2G-3G)

1. Call Setup Success Rate (CSSR)
   Definition: The probability that the end-user can access the mobile telephony service when requested, if it is offered by display of the network indicator on the UE.
   Computation: CSSR [%] = (Number of successful call set-ups / Total number of call attempts) x 100
   Proposed target: >95%
   Reference: GSMA, ITU-T E.800, ETSI

2. Call Setup Time
   Definition: The period starting when the address information required for setting up a call is received by the network (recognized on the calling user's access line) and finishing when the called party busy tone, ringing tone or answer signal is received by the calling party (i.e., recognized on the calling user's access line).
   Computation: Call setup time = T2 - T1 (time of alerting signal minus time of sending address signal)
   Proposed target: <9 s
   Reference: ITU-T E.CCH
   Comments: Local, national and service calls should be included, but calls to other licensed operators should not, as a given operator cannot control the QoS delivered by another network.

3. Call Completion Rate (CCR)
   Definition: The probability that a call, after being successfully set up, is maintained during a period of time and ends normally, i.e., according to the user's will.
   Computation: Call Completion [%] = (Number of normally ended calls / Total number of call attempts) x 100
   Proposed target: >85%
   Reference: ITU-T E.CCH

4. Call Drop Rate (CDR)
   Definition: The percentage of calls that are terminated, after connection to the system, within the period of the call duration, without any of the users' will.
   Computation: CDR [%] = (Number of dropped calls / Total number of established calls) x 100
   Proposed target: <9%
   Reference: ITU-T E.CCH

5. Blocked Call Rate
   Definition: SDCCH Congestion is the probability of failure of accessing a stand-alone dedicated control channel during call set-up. TCH Congestion (Traffic Channel Congestion Rate) is the probability of failure of accessing traffic channel(s) during call connections.
   Proposed targets: SDCCH Congestion <1%; TCH Congestion <1%
   Reference: ITU-T E.CCH

6. Network Availability
   Definition: The time that network resources are available to the customer and/or the percentage uptime of the link where bandwidth is accessed from the operator by the customer. Excludes time for planned maintenance.
   Proposed target: >99.7%
   Reference: ETSI TS 102 250

7. Voice/Speech Quality
   Definition: An indicator representing the quantification of the end-to-end speech transmission quality of the mobile telephony service. This parameter computes the speech quality on the basis of completed calls.
   Computation: MOS (scale of 1-5), POLQA, PESQ
   Proposed target: 3.5
   Reference: ETSI

8. POI Congestion
   Definition: The proportion of successful interconnect call attempts over the total number of interconnection attempts.
   Computation: (Number of successful interconnect call attempts / Total number of interconnection attempts) x 100
   Proposed target: >99%

9. Handover Success Rate
   Definition: The proportion of successful handovers over the total number of handover attempts.
   Computation: (Number of successful handovers / Total number of handover attempts) x 100
   Proposed target: >95%

10. Service Coverage (Rx Lev)
    Definition: The proportion of an area within which a specified minimum level of signal strength is achieved, or the areas in which a given service is available.
    Computation: Average area over which the minimum signal strength is received / Total area where the network is available
    Proposed target: >95% of the area served should have a minimum outdoor signal strength of -90 dBm for a given reporting area.

11. Successful SMS Ratio
    Definition: The probability that a user can send an SMS/MMS successfully from a terminal equipment to an SMS Centre.

12. SMS/MMS Completion Rate
    Definition: The ratio of correctly sent and received SMS/MMS between two terminal equipments.

13. SMS/MMS End-to-End Delivery Time
    Definition: The period starting when sending an SMS from a terminal equipment to an SMSC and finishing when receiving the very same SMS on another terminal equipment.

Additional mobile-network parameters identified (definitions, computations and targets not yet completed):
- USSD session completion rate
- USSD transaction time
- SMS/MMS service accessibility
- Attach failure ratio
- Attach setup time
- PDP context activation failure
- PDP context activation time
- PDP context cut-off ratio
- Rx Qual

IP-based Networks (parameters identified, not yet harmonized):
- Fault repair/restoration time
- Bandwidth utilization/throughput
- Broadband connection speed achieved
- Successful log-in ratio
- Data transmission speed achieved
- Latency
- Time taken to transfer
- BER
- Jitter variation

PSTN/Fixed Networks (parameters identified, not yet harmonized):
- Fault incidence per 100 access lines
- Fault repair time
- Call Completion Rate (CCR)
- Blocked Call Rate (BCR)
- Grade of Service
- Call setup time
- Call Setup Success Rate

B. NON-TECHNICAL PARAMETERS
(Definitions, computations, proposed targets, references and measurement methodologies not yet completed.)
- Service activation/provisioning
- Service restoration
- Charging and billing performance
- Customer support performance
- Customer satisfaction surveys
- Customer complaint resolution
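To illustrate how the harmonized mobile-network targets in Part A could be applied in practice, the following is a minimal sketch, not drawn from the Taskforce report, that encodes a few of the targets above as threshold/direction pairs and flags measured values that miss them. The dictionary layout, the function name check_compliance, the sample measured figures, and the reading of the MOS figure of 3.5 as a minimum are assumptions made for illustration only.

```python
# Illustrative sketch only: target values are copied from Annex 6 above; the
# comparison directions, data structure and sample figures are assumptions.
HARMONIZED_TARGETS = {
    # parameter: (threshold, comparison)
    "CSSR (%)":                 (95.0, ">"),
    "Call setup time (s)":      (9.0,  "<"),
    "Call completion rate (%)": (85.0, ">"),
    "SDCCH congestion (%)":     (1.0,  "<"),
    "TCH congestion (%)":       (1.0,  "<"),
    "Network availability (%)": (99.7, ">"),
    "Speech quality (MOS)":     (3.5,  ">="),  # assumed to be a minimum
}

def check_compliance(measured):
    """Return {parameter: True/False} for every measured value with a known target."""
    ops = {">": lambda v, t: v > t, "<": lambda v, t: v < t, ">=": lambda v, t: v >= t}
    results = {}
    for name, value in measured.items():
        if name in HARMONIZED_TARGETS:
            threshold, op = HARMONIZED_TARGETS[name]
            results[name] = ops[op](value, threshold)
    return results

if __name__ == "__main__":
    measured = {
        "CSSR (%)": 96.4,
        "Call setup time (s)": 7.8,
        "TCH congestion (%)": 1.6,
        "Speech quality (MOS)": 3.6,
    }
    for name, ok in check_compliance(measured).items():
        print(f"{name}: {'meets target' if ok else 'misses target'}")
```

The same structure could later be extended with the fixed-network, IP-based and non-technical parameters once their regional targets have been harmonized as per the attached work plan.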
ANNEX 7 - LIST OF IDENTIFIED RELEVANT ITU-T RECOMMENDATIONS AND STANDARDS FROM OTHER SDOS ON QOS/QOE

1. ITU, ITU-T E.CCH: QoE draft new Recommendation - Definitions and associated measurement methods of user-centric parameters for call handling in cellular mobile voice service.
2. ETSI, ETSI TS 102 250-2: Speech and multimedia Transmission Quality (STQ); QoS aspects for popular services in mobile networks; Part 2: Definition of Quality of Service parameters and their computation.
3. ITU, ITU-T E.800: Definitions of terms related to quality of service.
4. ITU, ITU-T E.802: Framework and methodologies for the determination and application of QoS parameters.
5. ITU, ITU-T E.803: Quality of service parameters for supporting service aspects.
6. African Forum for Utility Regulators (AFUR), QoS Guidelines: Common guidelines on minimum quality of service standards for communication services.