Supplier Ratings Systems: A Survey of Best Practices
Executive Summary

Supplier ratings systems were created in the early 1990s as a way to measure supplier performance and to help companies with supply base optimization decisions. As ratings systems have evolved, transparency and understanding have increased; however, each company's system remains slightly different. The Aerospace Industries Association (AIA) Supplier Management Council formed the Supplier Ratings Committee in 2008 with the following objectives:

- Survey suppliers to determine best practices
- Provide feedback to prime contractors
- Provide information assisting suppliers in creating their own ratings systems
- Help suppliers gain increased knowledge to better partner with customers.

In 2008, the Supplier Ratings Committee surveyed 180 suppliers regarding best practices in their customers' supplier ratings systems. Results and key themes were presented via a survey report and a panel discussion at the October 2008 Supplier Management Council Conference. A key theme which emerged was that ratings cannot replace the need for supplier relationship management. In 2010, the Supplier Ratings Committee invited 227 AIA members to analyze improvements made to customers' ratings systems in the aerospace and defense industry. Key themes which emerged from the latest survey included:

- Significant improvements experienced in data accuracy, timeliness, and communications
- More timely feedback communicated via electronic media such as email and supplier portals
- Increased convergence in how on-time delivery and quality metrics are defined, with two primary options listed for each
- Increased focus on supplier responsiveness as a key qualitative metric
- Addition of suppliers' financial health and formal risk management processes as new attributes used to calculate supplier ratings.

The following pages of this best practices benchmark explain the history of ratings systems, the process followed in this survey, and comparisons between the 2008 and 2010 survey results.
For Aerospace Industries Association Members Only 2
Background

Supplier ratings systems have evolved in complexity, maturity, and usage since their inception nearly twenty years ago. Historically, supplier ratings systems were used by primes simply to measure supplier performance. As they evolved to include other total cost of ownership (TCO) aspects, they began to be used in source selection decisions. As a result, data accuracy and timeliness became a larger concern. Accurate and timely data exchange benefits both parties during conversations relating to performance. While technology continues to better facilitate this exchange, supplier relationship management discussions are still necessary to allow open communication and retain a strong partnership. Below is a brief review of how ratings systems have changed.

Historical (early 1990s): Used by primes; quality and OTD metrics; inaccurate data; no dialogue; no process for challenging data.
Recent past (2008): Multiple systems; expansion to total cost of ownership; greater utilization in sourcing; some transparency; improved system capabilities.
Present (2010): Improved accuracy; greater data visibility; increased use of electronic media; more focus on responsiveness; increased interaction.
Future (2012-2013): Adaptation to systems integration; greater use for root cause analysis; additional focus on risk management; used for some sourcing decisions.

Committee Membership

The Supplier Ratings Committee is composed of both suppliers and prime contractors. These volunteers participate in the Supplier Management Council and have an interest in driving supply chain improvements within the industry. Since the committee's inception in 2008, its members have actively participated in both the 2008 and 2010 survey analyses. The committee's representatives are detailed below.
Name / Company
Roger Weiss / Rockwell Collins
Kristin M. King / Rockwell Collins
Ken Bram / AUSCO Inc.
Don Weiss / Harris Corp
Bob White / Millitech Incorporated
Greg Bolles / Plexus
Robert Barnett / BTC Electronics
Bill Peterson / AIA
Bill Lewandowski / Aerospace Supply Chain Solutions
Larry Stenger / Jabil Defense and Aerospace Services, LLC
Tom Carll / Astro-Med, Inc.
Mike Hackerson / PRTM
Kevin Engfer / Northrop Grumman
Bill Haslett / Northrop Grumman
John Heckman / Southern California Braiding Company, Inc.

Objectives

AIA's key strategic goals were established to promote the health and vitality of the aerospace and defense industry. The AIA goal which aligns supplier performance to one of those strategies is: "Enhance U.S. economic health and warfighter success by enabling more efficient and effective aerospace industry-wide performance outcomes relative to safety/reliability/quality of supplier products." As a result of the strategic goals established for the Supplier Management Council, supplier ratings systems were identified as an opportunity for improvement. The following objectives were established for this committee:

- Determine best practices
- Provide feedback to primes on best practices
- Help suppliers work with primes
- Provide information to help suppliers implement their own ratings systems.

Supplier Ratings Committee Process

In 2008, the Supplier Ratings Systems (SRS) Committee established the following process to identify best practices relating to supplier ratings systems and the customers whose systems best exhibited those characteristics.
The process moved through five stages:

1. Define objectives (supplier / prime): determine best practices; provide feedback to primes on best practices; help suppliers work with primes; provide information to help suppliers implement their own ratings systems.
2. Collect information (suppliers survey): finalize survey (6/27); AIA launch survey (7/28); responses due (8/8); AIA gather data (8/15); subcommittee synthesize data (9/5).
3. Determine best practices: discuss and select best practices (9/5).
4. Communicate results: socialize results with AIA membership; select 3 speakers to present best practices (9/10); present results at the October SMC (including speakers).
5. Utilize best practices: primes act on ratings systems suggestions; associates implement ratings systems.

After presenting the best practice survey results in October 2008, the committee focused on helping suppliers and primes implement ratings systems which included key best practice elements. First, the committee ensured all AIA members had access to the 2008 results. Next, the committee created a brochure to educate suppliers on how to best utilize the systems to gain the most value for them and their customers. Additionally, the committee created a web-based toolkit to guide suppliers through the decision-making elements of how to create or improve their own ratings systems. Both of these are available to AIA members. For additional details, please refer to the References section of this document. To further round out its objectives, the SRS Committee decided to obtain trending data to assist both primes and suppliers in understanding the direction of supplier ratings systems. Thus, the 2010 best practices survey was launched. Trends and key themes are listed in the remainder of this document.

Survey Logistics

The 2010 survey was sent to 159 AIA associate members, representing various industries throughout aerospace and defense. Additionally, 68 regular members were also invited to participate.
Only 24 suppliers responded to the 2010 survey, an overall response rate of roughly 11% of the 227 members invited. In comparison, the 2008 initial best practices survey had a response rate of 24%. While the 2010 response rate was lower than in 2008, the committee surmises this is a positive indicator that issues associated with supplier ratings systems are less of a concern than in the past.

Key Trends and Themes
Key themes and trends which emerged from the suppliers' responses regarding supplier ratings include the five listed below.

Significant improvements experienced in data accuracy, timeliness, and communications

It is evident from the responses that OEMs are providing more weekly and monthly feedback, compared to the quarterly and monthly communications seen in 2008. Furthermore, more than 83% of respondents perceive supplier ratings systems as being accurate more than 75% of the time, a large increase from 50% in 2008. Additionally, 43% of respondents believe data accuracy has improved since 2008.

More timely feedback communicated via electronic media such as email and supplier portals

The more frequent performance feedback mentioned previously has been aided in part by the increased use of technology as a communication mechanism. Nearly half of respondents described customer feedback as being sent via electronic media (web portals and email). As technology continues to evolve and some suppliers move into systems integrator roles, the frequency of performance communications is expected to continue to grow.

Increased convergence in how on-time delivery and quality metrics are defined, with two primary options listed for each

In 2008, OEMs used a wide variety of definitions for measuring on-time delivery (OTD) and quality. OTD metrics ranged primarily from 10 days early / 10 days late to zero days early / late. The 2008 survey results showed just over one-third of customers considered on-time delivery to be 3 days early / zero days late. Responses to the 2010 survey showed a strong convergence within the industry, with nearly 60% selecting this definition. Interestingly, customers within the industry continued to utilize two primary definitions for quality measurement: equal percentages of respondents indicated the industry measures quality via defective parts per million (DPPM) versus by lot / shipment.
Increased focus on supplier responsiveness as a key qualitative metric

In 2008, customer satisfaction, innovation, and integration / partnership were the key qualitative metrics also used in supplier ratings. Responsiveness was a frequent write-in vote and was thus added as an option in the follow-up survey. In 2010, responsiveness was the most common qualitative metric, with over 30% of the responses. Though customer satisfaction and innovation continued to be popular responses, both received fewer votes than in 2008.

Addition of suppliers' financial health and formal risk management processes as new attributes used to calculate supplier ratings

Given the changes in economic climate and technology over the past two years, the Supplier Ratings System Committee was interested in identifying what new attributes are being used in supplier ratings systems. The survey showed that since 2008, risk management and financial health are the two major attributes that have been added as criteria in many ratings systems. While the economy was a strong influencer, customers' increased focus on business continuity planning was likely a factor as well.

Conclusion
Supplier ratings systems were listed as one of the top concerns for AIA's associate members during a 2008 issues workshop. The Supplier Ratings System Committee focused on achieving its key objectives of communicating best practices across the industry to both OEMs and suppliers, and of assisting suppliers in working more effectively with their customers' systems and in implementing their own. In 2008, the Committee surveyed associate members, who responded overwhelmingly that their key struggles were inaccurate data, infrequent feedback, and, in many instances, a one-size-fits-all approach to ratings definitions. Upon surveying the AIA membership again in 2010, key trends between the surveys emerged:

- Significant improvement in the accuracy of ratings system data
- Improved timeliness of performance feedback
- Increased use of technology to provide feedback to suppliers
- Convergence in both the on-time delivery and quality metric definitions

As a result of survey feedback, the Supplier Ratings Systems Committee was able to identify the following best practices in today's supplier ratings systems:

- OEMs provide interactive discussions with timely feedback.
- Ratings are communicated using easy and efficient technology available to all, most frequently via supplier portals.
- Data accuracy has improved since 2008; however, the best OEMs maintain a strong focus on keeping data accurate and fix inconsistencies when they are found.
- Overall supplier performance includes qualitative or total cost of ownership (TCO) metrics in addition to on-time delivery and quality. Key qualitative metrics frequently utilized include responsiveness, customer satisfaction, risk management, and financial health.

Simply communicating metrics to suppliers is not enough. Interactive conversations and feedback are important to supplier relationship management, and their importance will only grow as the industry drives to increased levels of systems integration.
It is recommended that OEMs review the survey responses to understand how their own supplier ratings systems compare to the industry best practices and whether any updates are appropriate. Suppliers should also review their own communications to their customers to help strengthen the mutual relationship. Suppliers are also encouraged to evaluate their own supplier ratings systems to determine whether they meet the identified best practices, or to implement a system if they do not already have one. The three (3) resources listed in the References section are available to assist them in this process.

References

To assist AIA members in further understanding and implementing supplier ratings systems, the following tools can be accessed via the members-only section of AIA's website:

- 2008 Best Practices Supplier Ratings Survey Results
- "How to Create a Supplier Ratings System" toolkit
- "Supplier Ratings Systems: Understanding Your Customers' Systems to Enhance Your Business" brochure
Addendum: Supplier Survey Responses

Question 1: In addition to on-time delivery, quality, and cost metrics, please identify the 3 most common metrics on which your customers measure your performance.

[Chart: Qualitative Metrics, % of responses, 2008 vs. 2010]

In 2008, responsiveness was listed as a common write-in response. As such, the Supplier Ratings Survey team added it as an option in 2010. Though responsiveness was predicted to score as a frequent qualitative metric in 2010, it was not expected to be the most frequent response. Customer satisfaction and innovation continued to be commonly utilized metrics, though down slightly from 2008. This may be due to a shift toward the more specific term "responsiveness" in place of "customer satisfaction."
Questions 2-3 & 13: In which types of formats is your supplier rating presented to you? How often do most customers update / communicate your performance to you? How do you believe the feedback and communications from your customers' supplier rating systems are trending?

[Chart: Communication Format, % of responses, 2008 vs. 2010 — Internet / Portal, Email, Excel, PDF file, Hard Copy (Mail or Fax), Other]

[Chart: Frequency of Communication, % of responses, 2008 vs. 2010 — Daily, Weekly, Monthly, Quarterly, Semi-annually, Annually, Other]
[Pie chart: Communications Trends — Improving 43%, Staying the same 43%, Worsening 13%]

Since 2008, suppliers indicate that customers are communicating with them more frequently and by more electronic means. While there is a trend toward web-based portals and email, a remarkably high number of customers still utilize traditional means of communicating ratings to their suppliers. Most improvements can be traced back to the implementation of technology to improve the accuracy and timeliness of information flow and communications. One positive trend is that the most common frequency with which customers notify suppliers has moved from quarterly to monthly. The team found it interesting that there was a slight increase in the amount of annual feedback provided; however, this is deemed to be due more to a shift in the respondents answering the survey than to a trend within the industry overall. Overall, a significant number of suppliers view feedback from customers as improving. This is not surprising given the increase in both the frequency of communicating ratings and the visibility into the data comprising those ratings.
Questions 4-6: How do customers measure on-time delivery (OTD)? Is there agreement between you and your customers on delivery dates before they become contractual? When considering which customer has the most effective way to measure OTD, please explain what / why that is the best.

[Chart: On-Time Delivery Metric, % of responses, 2008 vs. 2010 — 3 days early + on-time (0 days late), On-time only, 1 day early + on-time + 1 day late, Other]

[Pie chart: Delivery Date Agreement — responses of 47.8%, 30.4%, 17.4%, 4.3%, and 0.0% across "All of the Time," "Most of the Time," "Sometimes," "50% of the Time," and "Never"]

Since 2008, the industry has begun to converge on two primary definitions of on-time delivery (OTD): 3 days early / 0 days late, and on-time only. Previously, survey respondents indicated a vast difference in definitions, including allowing up to 10 days early to count as being on-time. While it cannot be said that a single definition exists across the industry, customers with best practices
continue to provide timely data that is based upon an easily understandable definition, with delivery dates mutually agreed upon either via contractual dates or via dates listed on the purchase order.

Survey respondents identified the following customers as having the best ratings systems in terms of measuring on-time delivery. In no particular order, these include:

- Boeing / Boeing IDS
- Raytheon
- Rockwell Collins
- UTC / Sikorsky / Pratt & Whitney

Respondents described these systems as:

- Allowing for weekly review of data
- Providing visibility of data when it is signed for at the dock
- Utilizing contracted or agreed-upon delivery dates
- Offering the ease of a min/max replenishment system.
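The two leading OTD definitions reduce to a simple date-window check. The sketch below is an illustration only, not any particular customer's algorithm; the function names and the default window are assumptions, with the window kept configurable because definitions still vary by customer.

```python
from datetime import date

def is_on_time(agreed: date, received: date,
               early_window: int = 3, late_window: int = 0) -> bool:
    # Negative delta means an early delivery; positive means late.
    delta = (received - agreed).days
    # Defaults reflect the "3 days early / 0 days late" definition that
    # nearly 60% of 2010 respondents reported seeing.
    return -early_window <= delta <= late_window

def otd_rate(deliveries) -> float:
    # Percent of (agreed, received) date pairs counted as on time.
    hits = sum(is_on_time(agreed, received) for agreed, received in deliveries)
    return 100.0 * hits / len(deliveries)
```

Under this definition, a part received two days early counts as on time, while a part received one day late does not; setting both windows to zero reproduces the stricter "on-time only" definition.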
Questions 7-8: How do customers measure quality? When considering which customer has the most effective way to measure quality, please explain what / why that is the best.

[Chart: Quality Rating Criteria, % of responses, 2008 vs. 2010 — DPPM, Lot / Shipment, Other]

Similar to the on-time delivery metric, there is not one method of measuring quality. The results of the surveys indicated slight shifting between measuring by lot / shipment and by DPPM. However, these two remain the dominant methods of choice for measuring supplier quality performance. As in 2008, companies that consistently ranked as having better ratings systems were noted for communicating the reason why a product or service was viewed as not meeting quality expectations. Furthermore, these companies recognized that the defect may sometimes be customer-caused and allowed for an open dialogue with the supplier. It was reiterated in 2010 that future ratings systems need built-in flexibility to allow for differences between the various products / services provided. For example, suppliers providing hardware have different quality requirements levied upon them than a company developing a complex subsystem. Customers who were listed as having the most effective quality metrics were listed for the following reasons:

- Utilization of DPPM and/or escapes as a measurement
- Verification that specifications are accurate and understood by the supplier
- Assignment of responsibility for a quality defect prior to awarding a rating, rather than assuming it was supplier-caused until otherwise notified
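Both dominant quality measures are straightforward ratios. The sketch below illustrates them under stated assumptions: the function names are hypothetical, DPPM is the standard defective-parts-per-million ratio, and the lot / shipment measure is shown as a simple acceptance percentage (individual customers may weight or bucket these differently).

```python
def dppm(defective_parts: int, total_parts: int) -> float:
    # Defective parts per million: (defects / total parts) * 1,000,000.
    return 1_000_000 * defective_parts / total_parts

def lot_acceptance_rate(lots_accepted: int, lots_shipped: int) -> float:
    # Lot / shipment basis: percent of lots accepted without rejection.
    return 100.0 * lots_accepted / lots_shipped
```

For instance, three rejects across 12,000 parts shipped works out to a DPPM of 250, while 19 of 20 lots accepted is a 95% lot acceptance rate; the two views can rank the same supplier quite differently, which is one reason both definitions persist.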
Question 9: Do your customers measure and utilize an overall composite rating to measure your entire company's performance, or do they provide a rating based upon each type of product / service you provide to them?

[Chart: Rating Applicability, % of responses, 2008 vs. 2010 — Rating applies to entire company, Rating applies to individual locations, Rating is for each product / service, Not applicable (we only provide 1 type of product/service)]

There has been a significant shift in the applicability of ratings. While most customers provide one overall supplier rating for the entire company, many are now measuring suppliers by each customer location served. Though this may appear counterintuitive, it may be partially due to the additional response option added in the 2010 survey: suppliers indicated in 2008 that measuring by location was a new trend in supplier ratings systems over the previous two years, and this option was added as a result. Regardless, customers with the best ratings systems still provided an overall composite score for the entire company while also allowing for an additional level of detail at each location.
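A composite score layered over per-location detail can be sketched as a simple roll-up. This is an illustration only; the survey does not prescribe a formula, and the optional weighting (e.g., by sales volume per location) is an assumption, not something any customer reported using.

```python
def composite_rating(scores: dict, weights: dict = None) -> float:
    # scores: {location or product line -> rating}.
    # With no weights this is a plain average; weights allow, say,
    # volume-weighted roll-ups while preserving per-location detail.
    if weights is None:
        weights = {key: 1.0 for key in scores}
    total_weight = sum(weights[key] for key in scores)
    return sum(scores[key] * weights[key] for key in scores) / total_weight
```

Two sites scoring 90 and 70 yield an unweighted composite of 80; weighting the first site 3:1 shifts the composite to 85, showing why the per-location detail behind the composite matters.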
Questions 10-12: On average, what percent of the time do you believe the data utilized to create your supplier ratings is accurate? Please list the top 2 factors which contribute to inaccurate data used for the ratings. How do you believe the accuracy of your customers' supplier ratings systems is trending?

[Chart: Accuracy Measurement, % of responses, 2008 vs. 2010 — data believed accurate 0-50%, 51-75%, 76-85%, 86-99%, and 100% of the time]

[Pie chart: Data Accuracy Trends — Improving 43%, Staying the same 43%, Worsening 13%]

In 2008, data accuracy stood out as one of the largest issues with supplier ratings systems. Just over 50% of suppliers believed the data was accurate more than 75%
of the time. In 2010, a positive increase was seen, with over 43% of respondents believing data accuracy had improved over the past two years. Additionally, there was a marked increase in confidence, with approximately 83% of suppliers believing ratings system data was accurate more than 75% of the time. It remains clear that further actions to improve data accuracy are needed and would help make ratings even more useful. The reasons provided for inaccurate data remain much the same as in 2008. Inaccuracies are frequently a result of:

- Lack of purchase order maintenance
- Untimely or inaccurate receipt of products (e.g., goods remaining at receiving/inspection)
- Lack of agreement and/or communication on scheduled delivery dates
- Poorly communicated specification requirements.

Question 14: Who in your company reviews your customer ratings performance metrics? (List top 3 positions.)

Suppliers have a combination of front-line personnel and executive leadership accessing customers' ratings. The front-line personnel include those who work directly with customers on a day-to-day basis. Executive and senior leadership, however, remain very interested in monitoring customer ratings. Additionally, suppliers frequently share these ratings with employees throughout their company as a communication and motivation mechanism. As in 2008, the most frequently listed functions receiving ratings data include:

- Quality Management
- CEO / President / General Manager
- Senior Executives
- Sales / Product Management / Customer Service
- Functional Leadership (e.g., Engineering or Production leaders)

Question 15: In addition to on-time delivery and quality, what are the two (2) most important criteria that you as a supplier analyze to measure your own performance to your customers?
In addition to utilizing customer ratings, suppliers' management teams also measure themselves by reviewing:

- Customer Responsiveness
- Customer Satisfaction
- Internal Performance Measurements (Quality / DPPM)
- Sales Capture Rates
- Business Development
Question 16: Do you utilize a ratings system to measure your suppliers?

[Chart: Active Supplier Ratings Systems, % of responses, 2008 vs. 2010 — Yes, In Process, No]

Though the vast majority of suppliers currently have their own systems to measure supplier performance, there was a slight decline compared to 2008. The committee does not believe this is necessarily indicative of the entire industry; rather, the Supplier Ratings Survey team believes it is likely due primarily to a slight shift in the respondents answering the 2010 survey. It remains evident that as systems integration increases within the industry, supplier ratings systems will become an even more essential tool utilized at deeper tiers within the supply chain.
Question 17: Please describe what actions are taken by your company as a result of reviewing the ratings data.

Suppliers take the performance feedback they receive seriously and utilize the data to:

- Implement root cause and corrective actions when ratings necessitate
- Increase company awareness
- Modify approved vendor lists (AVLs)
- Identify processes which require additional employee training
- Challenge ratings with customers when needed
- Drive continuous improvement into operations.

More suppliers are utilizing their ratings to understand the root causes and corrective actions relating to performance. The 2010 survey results showed that suppliers are now using their ratings and corrective action results to modify their own selection of suppliers, including adding and removing their own suppliers from their approved vendor lists (AVLs).

Question 18: Name three (3) customers who you believe have the best supplier rating system. (Non-aerospace and defense customers also apply.)

The following companies were identified as having the best supplier ratings systems, listed in alphabetical order:

- Boeing / Boeing Integrated Defense Systems
- Cessna
- GE Aviation
- Honeywell
- Lockheed Martin
- Northrop Grumman
- Parker Hannifin
- Raytheon
- Rockwell Collins
- UTC / Sikorsky / Pratt & Whitney

Notable companies outside the aerospace / defense industry listed as having the best ratings include John Deere and Harley-Davidson.
Question 19: Name three (3) customers who you believe have most improved their supplier rating system over the last two years. (Non-aerospace and defense customers also apply.)

Similar to the customers believed to have the overall best ratings systems, the following customers were identified by respondents more than once as having the most improved ratings systems since 2008:

- Boeing / Boeing Integrated Defense Systems
- Lockheed Martin
- Raytheon
- Rockwell Collins
- Sikorsky

Question 20: Please name up to three (3) improvements you would recommend for upgrading your customers' rating systems.

The most universally suggested improvements for enhancing ratings systems include:

- Ensure data accuracy is maintained, and provide the appropriate resources to do so
- Expand user-friendly formats which allow for quick or real-time feedback
- Streamline the appeals process to make it easier for suppliers to correct inaccurate data
- Require or incentivize buyers to correct supplier ratings system data in a timely manner.

Even though ratings systems are important feedback mechanisms, they should not replace regular, personal interaction. Notably, fairness of ratings systems, a concern frequently listed in 2008, was mentioned only rarely in 2010: a very small percentage of suppliers referred to customizing ratings to accommodate highly complex sub-assemblies versus off-the-shelf components or non-deliverable services, as was common in 2008.
Question 21: What NEW attributes have customers begun measuring in their rating systems during the past two (2) years?

[Chart: New Ratings System Attributes, % of responses — Risk Management, Financial Health, Other / None, Change Management, Developmental Assistance]

In the past two years, customers have added both risk management and financial health as the two primary new system attributes. This may be a reflection of the change in economic climate, the increased awareness of business continuity planning, or a combination of both.
Questions 22-23: How many customers include risk management as a part of their ratings system? As a supplier, do you have a formal risk management process?

[Chart: Usage of Risk Management Ratings, % of responses — number of customers measuring risk management: 0, 1 to 2, 3 to 4, 5+]

[Pie chart: Suppliers with a Formal Risk Management Process — Yes 61%, No 39%]

Though risk management was the most noted new attribute, there appears to be a large variation in the total number of customers utilizing risk management as a criterion for calculating supplier ratings. While nearly one-third of suppliers have more than 5 customers measuring risk management, nearly as many have no customers measuring it. Also, some companies may be utilizing risk management as
an informal part of their supplier evaluations, but have not formally included it in their supplier ratings systems. On a positive note, nearly 70% have at least one customer measuring suppliers' risk management, and the majority of suppliers already have an established formal risk management process. Both the use of risk management as a ratings criterion and the creation of formal processes by suppliers seem likely to increase in the future.