A Data Mining Based Approach to Electronic Part Obsolescence Forecasting



Peter Sandborn, CALCE Electronic Products and Systems Center, Department of Mechanical Engineering, University of Maryland
Frank Mauro and Ron Knox, PartMiner Information Systems, Inc.

Part obsolescence dates (the date on which a part is no longer procurable from its original source) are important inputs during design planning. Most electronic part obsolescence forecasting algorithms are based, at least in part, on the development of models for the part's life cycle. Traditional methods of life cycle forecasting utilized in commercially available tools and services are ordinal-scale-based approaches, in which the life cycle stage of the part is determined from a combination of technological and market attributes (e.g., TACTrac, Q-Star, Total Parts Plus). Analytical models based on technology and/or market trends have also appeared, including a methodology based on forecasting part sales curves [1] and leading-indicator approaches [2]. Existing commercial forecasting tools are good at articulating the current state of a part's availability and identifying alternatives, but they are limited in their capability to forecast future obsolescence dates and do not generally provide quantitative confidence limits when predicting future obsolescence dates or risks. More accurate forecasts, or at least forecasts with a quantifiable accuracy, would open the door to the use of life cycle planning tools that could lead to more significant sustainment cost avoidance [3]. This paper demonstrates the use of data mining based algorithms to augment commercial obsolescence risk databases by increasing their predictive capabilities. The method is a combination of life cycle curve forecasting and the determination of electronic part vendor-specific windows of obsolescence using data mining of historical last-order or last-ship dates.
The extended methodology not only enables more accurate obsolescence forecasts but can also generate forecasts for user-specified confidence levels. The methodology has been demonstrated on both individual parts and modules. While successful electronic part obsolescence forecasting involves more than just predicting part-specific last-order dates, being able to predict original vendor last-order dates more accurately using a combination of market trending and data mining is an important component of an overall obsolescence risk forecasting strategy.

Obsolescence Forecasting Approach

The obsolescence forecasting approach discussed in this paper is an extension of a previously published life cycle curve forecasting methodology based on curve fitting sales data for an electronic part [1]. In the existing methodology, attributes of the curve fits (e.g., mean and standard deviation for sales data fitted with a Gaussian) are plotted and trend equations are created that can be used for predicting the life cycle curve of future versions of the part type (see the next section for an example). Similar procedures could

be used to forecast the life cycle trends of secondary attributes such as bias level or package type. This obsolescence forecasting approach used a fixed window of obsolescence, determined as a fixed number of standard deviations from the peak sales year of the part. This method was evaluated along with several other approaches (i.e., i2 TACTrac and Total Parts Plus) in late 2002 by Northrop Grumman (in the Air Force CPOM program) and proved to be about as accurate as the commercial ordinal-scale forecasting approaches.

Example Life Cycle Curve Forecasting Algorithm [1]: Flash Memory

Figure 1 shows the historical and forecasted sales data for monolithic flash memory from [4]. The values of µ_p and σ_p that resulted from the best Gaussian fits to the data sets in Figure 1 were plotted; the trends for µ_p and σ_p are shown in Figures 2 and 3. For flash memory, the trends in peak sales year and standard deviation in number of units shipped are given by

µ_p = 1.5663 ln(M) + 1997.2    (1)
σ_p = -0.0281 ln(M) + 2.2479    (2)

where M is the size of the flash memory chip in Megs. The resulting trend equations can be used to reproduce the life cycle curves of the parts that were used to create the relationships and of parts that do not yet exist. For example, for M = 1 Meg, the trend equations give µ_p = 1997.2 and σ_p = 2.25 (these can be compared to the actual data in Figure 1). Similarly, plugging M = 512 Meg into equations (1) and (2) gives µ_p = 2007 and σ_p = 2.07 (this is a monolithic flash memory chip that does not exist yet).

[Figure 1: Historical and forecasted sales data for monolithic flash memory [4]. Number of units shipped (millions) vs. year (1990-2004) for sizes 256K through 128M.]
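As a minimal sketch (not the authors' code), the trend equations (1) and (2) can be evaluated directly; the coefficient values below are the OCR-restored ones assumed in this transcription:

```python
import math

# Trend equations (1) and (2) for commercial monolithic flash memory,
# with coefficients as restored from the (OCR-damaged) text.
def peak_sales_year(M):
    """mu_p: peak sales year for a flash memory part of size M Megs."""
    return 1.5663 * math.log(M) + 1997.2

def sales_std_dev(M):
    """sigma_p: standard deviation (years) of the fitted sales curve."""
    return -0.0281 * math.log(M) + 2.2479

# Worked example from the text: an existing part (1 Meg) and a future part (512 Meg).
print(round(peak_sales_year(1), 1), round(sales_std_dev(1), 2))   # 1997.2 2.25
print(round(peak_sales_year(512)), round(sales_std_dev(512), 2))  # 2007 2.07
```

Note that because the fit uses ln(M), the predicted peak sales year grows only logarithmically with memory size, matching the roughly even spacing of successive generations in Figure 2.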

[Figure 2: Trend equation for peak sales year (µ_p) for flash memory. Peak sales year (1994-2006) vs. size M (0.1 to 1000, log scale); fit: y = 1.5663 ln(x) + 1997.2.]

[Figure 3: Trend equation for standard deviation (σ_p) for flash memory. Standard deviation (years) vs. size M (log scale); fit: y = -0.0281 ln(x) + 2.2479.]

When generating the life cycle curve trend equations, one should be careful not to mix mil-spec parts and commercial parts. For example, equations (1) and (2) were generated for commercial flash memory chips and should only be applied to commercial flash memory chips. Mil-spec flash memory chips (if they existed) would be considered a completely different part, and unique trend equations would need to be developed for them.

Determining the Window of Obsolescence via Data Mining

The methodology described above provides a way to create or re-create the life cycle curve for a part type given its primary attribute.¹ In the original baseline methodology [1], the window of obsolescence specification was defined to be 2.5σ_p to 3.5σ_p after the peak sales date (µ_p). In reality, the window of obsolescence specification is not a constant but depends on numerous factors; we suggest that it is dependent on manufacturer-specific and part-specific business practices.

For a particular part type (e.g., flash memory), historical last-order date data is collected and sorted by manufacturer. Each part instance (data entry) in the resulting sorted data has a specific value of the primary attribute (e.g., 32M) for which the peak sales date (µ_p) and standard deviation (σ_p) can be computed using the previously created trend equations (for flash memory these are given in (1) and (2)). The last-order date for the part instance is then normalized relative to the peak sales year. The normalization is performed for every part instance for the selected part type and manufacturer. Next, a histogram of the normalized vendor-specific last-order dates is plotted; the histogram represents a probability distribution of when (relative to the peak sales year) the specific manufacturer obsoletes the part type. As an example, Figure 4 shows the histogram for Atmel (ATM) flash memory (based on 57 last-order dates mined from PartMiner CAPS Expert). To quantify the manufacturer-specific obsolescence probability, the histogram is fit with a Gaussian and the parameters of the fit, µ_lo and σ_lo, are extracted.
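The normalization and fitting steps just described can be sketched as follows. The part records here are invented illustration data, not the mined CAPS Expert data, and fitting the histogram with a Gaussian is approximated by taking the sample mean and standard deviation of the normalized dates:

```python
import math
from statistics import mean, stdev

def flash_mu_p(M):      # equation (1), coefficients as restored in the text
    return 1.5663 * math.log(M) + 1997.2

def flash_sigma_p(M):   # equation (2)
    return -0.0281 * math.log(M) + 2.2479

def normalized_last_order_dates(records):
    """records: (size_in_Megs, last_order_year) pairs for ONE manufacturer.
    Returns each last-order date expressed in standard deviations past the
    part's peak sales year, per the trend equations."""
    return [(lo - flash_mu_p(M)) / flash_sigma_p(M) for M, lo in records]

# Hypothetical last-order records for a single vendor (illustration only).
records = [(16, 2003.0), (32, 2004.5), (64, 2005.0), (128, 2006.5)]
normalized = normalized_last_order_dates(records)

# "Fitting" the histogram of normalized dates with a Gaussian reduces,
# for this sketch, to the sample mean and standard deviation.
mu_lo, sigma_lo = mean(normalized), stdev(normalized)
```

In the paper's workflow the Gaussian parameters would instead come from a curve fit to the histogram of the real mined data, as in Figure 4.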
The window of obsolescence specification is then given by

Obsolescence window = µ_p + (µ_lo ± x·σ_lo)·σ_p    (3)

where x depends on the confidence level desired (x = 1 gives 68% confidence that the range accurately brackets the obsolescence event; similarly, x = 2 represents 95% confidence). By combining the life cycle curve trends and the ATM-specific obsolescence window, the resulting obsolescence dates for ATM flash memory are given by the following equation as a function of the size in Megs (M) and the confidence level desired:

Obsolescence date = 1.5663 ln(M) + 1997.2 + [0.88 ± 0.72x]·(-0.0281 ln(M) + 2.2479)    (4)

where the first two terms are the peak sales date (µ_p), the bracketed factor is the number of standard deviations past the peak for ATM flash, and the final factor is the standard deviation in the sales data (σ_p).

¹ Primary attributes are attributes that can be identified with the evolution of the part over time.
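Equations (3) and (4) can be combined into a small helper. This is a sketch under the assumptions above: µ_lo = 0.88 and σ_lo = 0.72 are the Atmel values from Figure 4, and the trend-equation coefficients are the OCR-restored values assumed in this transcription:

```python
import math

def obsolescence_window(M, mu_lo=0.88, sigma_lo=0.72, x=1.0):
    """Equation (3) for ATM flash memory of size M Megs.
    x = 1 -> ~68% confidence range, x = 2 -> ~95%."""
    mu_p = 1.5663 * math.log(M) + 1997.2       # equation (1)
    sigma_p = -0.0281 * math.log(M) + 2.2479   # equation (2)
    earliest = mu_p + (mu_lo - x * sigma_lo) * sigma_p
    latest = mu_p + (mu_lo + x * sigma_lo) * sigma_p
    return earliest, latest

# 68% confidence window for a hypothetical 32 Meg ATM flash part
lo, hi = obsolescence_window(32, x=1.0)
```

Widening x trades a longer (less specific) window for higher confidence that the actual last-order date falls inside it, which is exactly the user-specified confidence capability claimed for the extended methodology.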

[Figure 4: Atmel (ATM) flash memory last-order dates. Histogram of the number of parts vs. number of standard deviations past the peak (57 points); Gaussian fit mean µ_lo = 0.88, standard deviation σ_lo = 0.72.]

Note that (4) assumes that the uncertainty in the window of obsolescence dominates the model uncertainty associated with the trend equations. Using the methodology for the entire set of flash memory provided by PartMiner (262 data points) yields the results shown in Figure 5.

[Figure 5: Forecasting results for monolithic flash memory chips; 262 flash memory chips plotted (predicted vs. actual obsolescence date, 1997-2005). Fixed Window model = assumes a fixed window of obsolescence of 2.5σ_p to 3.5σ_p; Data Mined Window model = assumes the manufacturer-specific window of obsolescence.]

The diagonal line in the plot shows exact agreement between prediction and actual; above the diagonal the model predicts obsolescence later than it actually occurs, and below the diagonal the model predicts obsolescence earlier than it actually occurs. The error bars represent a 68%

confidence level. The accuracy with which the improved algorithm forecasts the obsolescence of parts is a substantial improvement over the original algorithm.

Application of Data Mining Determined Windows of Obsolescence to Memory Modules

As a further demonstration of the methodology described in this paper, consider its application to memory modules that are made up of multiple chips. The obsolescence of memory modules is not generally dictated by the obsolescence of the memory chips embedded within them. Rather, the obsolescence of memory modules is related to the beginning of availability of monolithic replacements for identical volumes of memory. As an example, in Figure 6, the 16M DRAM module became obsolete when monolithic 16M DRAM chips became available.

In the case of DRAM memory modules, the last-order date data is collected. Each module instance (data entry) has a specific value of the primary attribute (e.g., 16M). For each module instance, the peak sales date (µ_p) and standard deviation (σ_p) are computed for the monolithic equivalent. The last-order date for the module instance is then mapped (normalized) to the number of standard deviations before the peak sales date of the monolithic equivalent. In the case of memory modules, there was no need to sort the data by vendor; all the vendors considered appear to be obsoleting memory modules based on the same driver. Figure 7 shows a curve fit of the resulting data mined last-order dates mapped to the life cycle curves of the monolithic equivalents. Armed with the relation shown in Figure 7 and the life cycle curve trends for DRAMs (e.g., see [1]), obsolescence dates for DRAM memory modules can be determined.
[Figure 6: Obsolescence characteristics of DRAM memory modules vs. monolithic DRAM. Life cycle curve for a 16M DRAM: units shipped (millions) vs. year (1992-2005), 16M actual and 16M forecast, with life cycle profile parameters µ = 1998.2 and σ = 1.6 years. Annotations mark where a 16M DRAM module might go obsolete and where the monolithic 16M DRAM goes obsolete.]
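A hypothetical sketch (not the authors' code) of how the Figure 7 relation would be used: evaluate the fitted polynomial at the log of the module size to get the number of standard deviations before the monolithic equivalent's peak sales year, then convert to a date using that part's µ_p and σ_p. The coefficients are exactly as printed in this transcription and may be OCR-damaged, so treat them as placeholders:

```python
import math

def sigmas_before_peak(size_M):
    """Figure 7 fit: number of standard deviations BEFORE the monolithic
    equivalent's peak sales year at which same-capacity modules go obsolete.
    Coefficients as printed in the transcription (possibly OCR-damaged)."""
    x = math.log10(size_M)  # horizontal axis of Figure 7 is log(size in M)
    return 0.923 * x**3 - 4.747 * x**2 + 13.167 * x - 11.935

def module_obsolescence_date(size_M, mu_p, sigma_p):
    """Convert to a date using the monolithic equivalent's life cycle
    parameters (e.g., mu_p = 1998.2, sigma_p = 1.6 for 16M DRAM, Figure 6)."""
    return mu_p - sigmas_before_peak(size_M) * sigma_p

# Hypothetical use: a 16M DRAM module against the Figure 6 monolithic parameters.
date = module_obsolescence_date(16, mu_p=1998.2, sigma_p=1.6)
```

The sign convention follows the Figure 7 axis: a positive value means the module goes obsolete before the monolithic equivalent's peak sales year, so the date is obtained by subtracting that many σ_p from µ_p.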

[Figure 7: Data mined data mapping for DRAM memory modules. Number of standard deviations before the peak vs. log(size in M) for IBM, OKLJ, and SIEG modules, with a polynomial fit y = 0.923x³ - 4.747x² + 13.167x - 11.935.]

Discussion

Successful use of existing commercial electronic part obsolescence forecasting relies on the assumption that the forecasting is updated often and that the forecasts become better (more accurate) the closer one gets to the actual obsolescence date. This implies that real forecasting value depends on an organization's ability to institute a continuous monitoring strategy and its ability to act quickly if a part accelerates toward obsolescence.² Unfortunately, the closer to the actual obsolescence event one gets before the forecast converges, the less useful the forecast is, and thereby the value of pro-active refresh planning is limited. Knowing the obsolescence date years in advance obviously provides many more options than knowing it one month in advance.

The methodology presented in this paper is a move in the direction of providing real obsolescence forecasting with quantifiable confidence limits. However, the work presented in this paper does not represent a standalone solution. This approach needs to be combined with the subjective information included in traditional obsolescence forecasting tools, e.g., number of sources, market share, technology factors, etc. It is also worth pointing out that the two examples presented in this paper are straightforward applications of the methodology (they are easy cases). Not all part types have easily identifiable primary attributes (attributes that can be identified with the evolution of the part over time); therefore, we do not claim that the methodology will be useful for every part type.
² This implies that organizations should very carefully consider the update frequency of the electronic part availability risk forecasting data before subscribing to a particular tool or service if they expect to make practical use of the forecasts provided.

The large electronic part databases are treasure troves of data

for predicting obsolescence; the challenge is figuring out how to mine the data to find the significant trends.

Acknowledgements

This work was funded in part by PartMiner Information Services and the National Science Foundation (Division of Design, Manufacture, and Industrial Innovation) Grant No. DMI-438522.

References

[1] R. Solomon, P. Sandborn, and M. Pecht, "Electronic part life cycle concepts and obsolescence forecasting," IEEE Trans. on Components and Packaging Technologies, vol. 23, no. 4, pp. 707-713, December 2000.
[2] M. Meixell and S. D. Wu, "Scenario analysis of demand in a technology market using leading indicators," IEEE Transactions on Semiconductor Manufacturing, vol. 14, no. 1, pp. 65-78, 2001.
[3] P. Sandborn, "Beyond reactive thinking - We should be developing pro-active approaches to obsolescence management too!," DMSMS Center of Excellence Newsletter, vol. 2, no. 3, pp. 4 and 9, July 2004.
[4] http://smithsonianchips.si.edu/ice/cd/memory97/sec5.pdf