A Data Mining Based Approach to Electronic Part Obsolescence Forecasting
Peter Sandborn
CALCE Electronic Products and Systems Center, Department of Mechanical Engineering, University of Maryland

Frank Mauro and Ron Knox
PartMiner Information Systems, Inc.

Part obsolescence dates (the date on which a part is no longer procurable from its original source) are important inputs during design planning. Most electronic part obsolescence forecasting algorithms are based, at least in part, on the development of models for the part's life cycle. Traditional life cycle forecasting methods used in commercially available tools and services (e.g., TACTrac, Q-Star, Total Parts Plus) are ordinal-scale approaches, in which the life cycle stage of the part is determined from a combination of technological and market attributes. Analytical models based on technology and/or market trends have also appeared, including a methodology based on forecasting part sales curves [1] and leading-indicator approaches [2]. Existing commercial forecasting tools are good at articulating the current state of a part's availability and identifying alternatives, but they are limited in their capability to forecast future obsolescence dates and do not generally provide quantitative confidence limits when predicting future obsolescence dates or risks. More accurate forecasts, or at least forecasts with a quantifiable accuracy, would open the door to the use of life cycle planning tools that could lead to more significant sustainment cost avoidance [3]. This paper demonstrates the use of data mining based algorithms to augment commercial obsolescence risk databases by increasing their predictive capabilities. The method is a combination of life cycle curve forecasting and the determination of electronic part vendor-specific windows of obsolescence using data mining of historical last-order or last-ship dates.
The extended methodology not only enables more accurate obsolescence forecasts but can also generate forecasts for user-specified confidence levels. The methodology has been demonstrated on both individual parts and modules. While successful electronic part obsolescence forecasting involves more than just predicting part-specific last-order dates, being able to predict original-vendor last-order dates more accurately using a combination of market trending and data mining is an important component of an overall obsolescence risk forecasting strategy.

Obsolescence Forecasting Approach

The obsolescence forecasting approach discussed in this paper is an extension of a previously published life cycle curve forecasting methodology based on curve fitting sales data for an electronic part [1]. In the existing methodology, attributes of the curve fits (e.g., mean and standard deviation for sales data fitted with a Gaussian) are plotted, and trend equations are created that can be used for predicting the life cycle curve of future versions of the part type (see the next section for an example). Similar procedures could
be used to forecast the life cycle trends of secondary attributes such as bias level or package type. The original obsolescence forecasting approach used a fixed window of obsolescence, defined as a fixed number of standard deviations past the peak sales year of the part. This method was evaluated along with several other approaches (i.e., i2-TACTrac and Total Parts Plus) in late 2002 by Northrop Grumman (in the Air Force CPOM program) and proved to be about the same accuracy as the commercial ordinal-scale forecasting approaches.

Example Life Cycle Curve Forecasting Algorithm [1]: Flash Memory

Figure 1 shows the historical and forecasted sales data for monolithic flash memory from [4]. The values of µp and σp that resulted from the best Gaussian fits to the data sets in Figure 1 were plotted; the trends for µp and σp are shown in Figures 2 and 3. For flash memory, the trends in peak sales year and standard deviation in number of units shipped are given by

µp = ln(M)    (1)

σp = -0.281 ln(M)    (2)

where M is the size of the flash memory chip in Megs. The resulting trend equations can be used to reproduce the life cycle curves of the parts that were used to create the relationships, and of parts that do not yet exist. For example, for M = 1 Meg, the trend equations give µp = and σp = 2.25 (these can be compared to the actual data in Figure 1). Similarly, plugging M = 512 Meg into equations (1) and (2) gives µp = 2007 and σp = 2.7 (this is a monolithic flash memory chip that does not exist yet).

Figure 1. Historical and forecasted sales data (number of units shipped, in millions, by year) for monolithic flash memory, 512K through 128M [4].
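As a sketch of the fitting step, the shipment profile for a single part size can be fit with a Gaussian to extract µp and σp. The shipment numbers below are hypothetical stand-ins for the Figure 1 data, and the method-of-moments fit is just one simple way to do it (the paper does not specify its fitting procedure):

```python
# Hypothetical annual shipment data (millions of units) for one flash
# memory size -- invented stand-in values, not the actual Figure 1 data.
sales = {1996: 2.0, 1997: 7.0, 1998: 12.0, 1999: 7.0, 2000: 2.0}

def fit_lifecycle_gaussian(sales):
    """Method-of-moments Gaussian fit: treat shipment counts as weights
    on the years, and return the weighted mean (peak sales year, mu_p)
    and weighted standard deviation (sigma_p)."""
    total = sum(sales.values())
    mu = sum(year * units for year, units in sales.items()) / total
    var = sum(units * (year - mu) ** 2 for year, units in sales.items()) / total
    return mu, var ** 0.5

mu_p, sigma_p = fit_lifecycle_gaussian(sales)  # here: 1998.0 and 1.0
```

Repeating this fit for each part size produces the (size, µp) and (size, σp) points that the trend equations of Figures 2 and 3 are fit to.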
Figure 2. Trend equation for peak sales year (µp) for flash memory: y = ln(x) fit against size (M).

Figure 3. Trend equation for standard deviation (σp) for flash memory: y = -0.281 ln(x) fit against size (M).

When generating the life cycle curve trend equations, one should be careful not to mix mil-spec parts and commercial parts. For example, equations (1) and (2) were generated for commercial flash memory chips and should only be applied to commercial flash memory chips. Mil-spec flash memory chips (if they existed) would be considered a completely different part, and unique trend equations would need to be developed for them.
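The trend lines in Figures 2 and 3 are logarithmic fits of the form y = a ln(x) + b. A minimal ordinary-least-squares sketch is below; the (size, peak year) points and the resulting coefficients are invented for illustration, not the actual flash memory fit:

```python
import math

# Hypothetical (size in Megs, fitted peak sales year) pairs -- invented
# stand-ins for the points behind Figure 2.
points = [(4, 1994.7), (16, 1996.0), (64, 1997.4), (256, 1998.7)]

def fit_log_trend(points):
    """Ordinary least squares for y = a*ln(x) + b."""
    xs = [math.log(x) for x, _ in points]
    ys = [y for _, y in points]
    n = len(points)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((u - mx) * (v - my) for u, v in zip(xs, ys)) / \
        sum((u - mx) ** 2 for u in xs)
    return a, my - a * mx

a, b = fit_log_trend(points)
# The fitted trend can then be extrapolated to a part size that does not
# exist yet, exactly as the paper does for M = 512 Meg:
mu_p_512 = a * math.log(512) + b
```

The same routine, applied to the σp points, yields the second trend equation.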
Determining the Window of Obsolescence via Data Mining

The methodology described above provides a way to create or re-create the life cycle curve for a part type given its primary attribute.¹ In the original baseline methodology [1], the window of obsolescence was defined to be 2.5σp to 3.5σp after the peak sales date (µp). In reality, the window of obsolescence is not a constant but depends on numerous factors; we suggest that it is dependent on manufacturer-specific and part-specific business practices.

For a particular part type (e.g., flash memory), historical last-order date data is collected and sorted by manufacturer. Each part instance (data entry) in the resulting sorted data has a specific value of the primary attribute (e.g., 32M), for which the peak sales date (µp) and standard deviation (σp) can be computed using the previously created trend equations (for flash memory these are given in (1) and (2)). The last-order date for the part instance is then normalized relative to the peak sales year. The normalization is performed for every part instance for the selected part type and manufacturer. Next, a histogram of the normalized vendor-specific last-order dates is plotted; the histogram represents a probability distribution of when (relative to the peak sales year) the specific manufacturer obsoletes the part type. As an example, Figure 4 shows the histogram for Atmel (ATM) flash memory (based on 57 last-order dates mined from PartMiner CAPS Expert). To quantify the manufacturer-specific obsolescence probability, the histogram is fit with a Gaussian form and the parameters of the fit, µlo and σlo, are extracted.
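The normalization step can be sketched as follows: each last-order date is expressed in standard deviations past the peak sales year given by the trend equations. The records and trend coefficients below are made-up stand-ins (the real data came from PartMiner CAPS Expert, and the real trends are equations (1) and (2)):

```python
import math

# Hypothetical last-order records for one manufacturer:
# (size in Megs, last-order year). Invented values for illustration.
records = [(16, 1999.5), (32, 2000.8), (64, 2001.6)]

# Stand-in trend equations for mu_p and sigma_p (coefficients invented).
mu_p = lambda M: 0.97 * math.log(M) + 1993.3
sigma_p = lambda M: -0.281 * math.log(M) + 3.0

# Normalize each last-order date to standard deviations past the peak.
z = [(year - mu_p(M)) / sigma_p(M) for M, year in records]

# Fitting the histogram of z with a Gaussian (method of moments here)
# gives the manufacturer-specific parameters mu_lo and sigma_lo.
mu_lo = sum(z) / len(z)
sigma_lo = (sum((v - mu_lo) ** 2 for v in z) / len(z)) ** 0.5
```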
The window of obsolescence is then given by

Obsolescence window = µp + (µlo ± x σlo) σp    (3)

where x depends on the confidence level desired: x = 1 gives 68% confidence that the range brackets the obsolescence event, and x = 2 represents 95% confidence. By combining the life cycle curve trends and the ATM-specific obsolescence window, the obsolescence dates for ATM flash memory are given, as a function of the size in Megs (M) and the confidence level desired, by

Obsolescence date = µp + [0.88 ± 0.72x] σp    (4)

where µp is the peak sales date from equation (1), σp is the standard deviation in the sales data from equation (2), and [0.88 ± 0.72x] is the number of standard deviations past the peak for ATM flash.

¹ Primary attributes are attributes that can be identified with the evolution of the part over time.
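Equation (3) reduces to a few lines of arithmetic. The sketch below plugs in the ATM values from Figure 4 (µlo = 0.88, σlo = 0.72); the µp and σp values used are invented placeholders for the trend-equation outputs:

```python
def obsolescence_window(mu_p, sigma_p, mu_lo, sigma_lo, x):
    """Equation (3): window from mu_p + (mu_lo - x*sigma_lo)*sigma_p to
    mu_p + (mu_lo + x*sigma_lo)*sigma_p. x = 1 gives ~68% confidence,
    x = 2 gives ~95%."""
    lo = mu_p + (mu_lo - x * sigma_lo) * sigma_p
    hi = mu_p + (mu_lo + x * sigma_lo) * sigma_p
    return lo, hi

# mu_p = 2000.0 and sigma_p = 2.0 are placeholder trend-equation outputs;
# mu_lo and sigma_lo are the ATM flash values from Figure 4.
lo, hi = obsolescence_window(2000.0, 2.0, mu_lo=0.88, sigma_lo=0.72, x=1)
```

Note how a larger x widens the window: the forecast trades precision of the date range for confidence that the range contains the actual obsolescence event.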
Figure 4. Atmel (ATM) flash memory last-order dates (57 points), histogrammed by number of standard deviations past the peak: mean 0.88, standard deviation 0.72.

Note that (4) assumes the uncertainty in the window of obsolescence dominates the model uncertainty associated with the trend equations. Applying the methodology to the entire set of flash memory provided by PartMiner (262 data points) yields the results shown in Figure 5. The diagonal line in the plot shows exact agreement between prediction and actual: above the line the model predicts obsolescence later than it actually occurs, and below the line it predicts obsolescence earlier than it actually occurs.

Figure 5. Forecasting results for monolithic flash memory chips; 262 flash memory chips plotted (predicted vs. actual obsolescence date). Fixed Window model = assumes a fixed window of obsolescence of 2.5σp to 3.5σp; Data Mined Window model = assumes the manufacturer-specific window of obsolescence.

The error bars represent a 68%
confidence level. The accuracy with which the improved algorithm forecasts the obsolescence of parts is a substantial improvement over the original algorithm.

Application of Data Mining Determined Windows of Obsolescence to Memory Modules

As a further demonstration of the methodology described in this paper, consider its application to memory modules that are made up of multiple chips. The obsolescence of memory modules is not generally dictated by the obsolescence of the memory chips embedded within them. Rather, the obsolescence of memory modules is related to the beginning of availability of monolithic replacements for identical volumes of memory. As an example, in Figure 6, the 16M DRAM module became obsolete when monolithic 16M DRAM chips became available.

In the case of DRAM memory modules, the last-order date data is collected. Each module instance (data entry) has a specific value of the primary attribute (e.g., 16M). For each module instance, the peak sales date (µp) and standard deviation (σp) are computed for the monolithic equivalent. The last-order date for the module instance is then mapped (normalized) to the number of standard deviations before the peak sales date of the monolithic equivalent. In the case of memory modules, there was no need to sort the data by vendor; all the vendors considered appear to be obsoleting memory modules based on the same driver. Figure 7 shows a curve fit of the resulting data mined last-order dates mapped to the life cycle curves of the monolithic equivalents. Armed with the relation shown in Figure 7 and the life cycle curve trends for DRAMs (e.g., see [1]), obsolescence dates for DRAM memory modules can be determined.

Figure 6. Obsolescence characteristics of DRAM memory modules vs. monolithic DRAM: life cycle curve (units shipped, in millions, vs. year; 16M actual and forecast) for a 16M DRAM with life cycle profile parameter σ = 1.6 years, annotated with where a 16M DRAM module might go obsolete and where the monolithic 16M DRAM goes obsolete.
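The module mapping differs from the single-part case only in direction: the module's last-order date is expressed in standard deviations before the peak of its monolithic equivalent. A sketch, with invented trend coefficients standing in for the DRAM trends of [1]:

```python
import math

# Stand-in trend equations for the MONOLITHIC equivalent (coefficients
# invented; the real DRAM trends are in [1]).
mu_p = lambda M: 1.1 * math.log(M) + 1990.0
sigma_p = lambda M: 1.6  # constant spread, as in the Figure 6 example

def stds_before_peak(M, last_order_year):
    """Map a module last-order date to standard deviations BEFORE the
    peak sales year of the monolithic equivalent -- modules go obsolete
    early in the monolithic part's life cycle, when monolithic
    replacements become available."""
    return (mu_p(M) - last_order_year) / sigma_p(M)

z = stds_before_peak(16, 1992.0)  # a hypothetical 16M module last order
```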
Figure 7. Data mined data mapping for DRAM memory modules: number of standard deviations before the peak vs. log(size M), with data points by vendor (IBM, OKLJ, SIEG) and a polynomial fit.

Discussion

Successful use of existing commercial electronic part obsolescence forecasting relies on the assumption that the forecasting is updated often and that the forecasts become better (more accurate) the closer you get to the actual obsolescence date. This implies that real forecasting value depends on an organization's ability to institute a continuous monitoring strategy and its ability to act quickly if a part accelerates toward obsolescence.² Unfortunately, the closer to the actual obsolescence event you get before the forecast converges, the less useful the forecast is, and thereby the value of proactive refresh planning is limited. Knowing the obsolescence date years in advance obviously provides many more options than knowing it one month in advance.

The methodology presented in this paper is a move in the direction of providing real obsolescence forecasting with quantifiable confidence limits. However, the work presented in this paper does not represent a standalone solution. This approach needs to be combined with the subjective information included in traditional obsolescence forecasting tools, e.g., number of sources, market share, technology factors, etc. It is also worth pointing out that the two examples presented in this paper are straightforward applications of the methodology (they are easy cases). Not all part types have easily identifiable primary attributes (attributes that can be identified with the evolution of the part over time); therefore, we do not claim that the methodology will be useful on every part type.
The large electronic part databases are treasure troves of data for predicting obsolescence; the challenge is figuring out how to mine the data to find the significant trends.

² This implies that organizations should very carefully consider the update frequency of the electronic part availability risk forecasting data before subscribing to a particular tool or service if they expect to make practical use of the forecasts provided.

Acknowledgements

This work was funded in part by PartMiner Information Services and the National Science Foundation (Division of Design, Manufacture, and Industrial Innovation) Grant No. DMI

References

[1] R. Solomon, P. Sandborn, and M. Pecht, "Electronic part life cycle concepts and obsolescence forecasting," IEEE Trans. on Components and Packaging Technologies, vol. 23, no. 4, pp. 707-717, December 2000.
[2] M. Meixell and S. D. Wu, "Scenario analysis of demand in a technology market using leading indicators," IEEE Transactions on Semiconductor Manufacturing, vol. 14, no. 1, pp. 65-78, 2001.
[3] P. Sandborn, "Beyond reactive thinking: We should be developing pro-active approaches to obsolescence management too!," DMSMS Center of Excellence Newsletter, vol. 2, no. 3, pp. 4 and 9, July 2004.
[4]
Empirical study of Software Quality Evaluation in Agile Methodology Using Traditional Metrics
Empirical study of Software Quality Evaluation in Agile Methodology Using Traditional Metrics Kumi Jinzenji NTT Software Innovation Canter NTT Corporation Tokyo, Japan [email protected] Takashi
Valuation of Software Intangible Assets
Valuation of Software Intangible Assets Eric A. Thornton Senior Associate (703) 917-6616 [email protected] ASA International Conference San Diego, California August 28, 2002 San Francisco, California
Measurement Information Model
mcgarry02.qxd 9/7/01 1:27 PM Page 13 2 Information Model This chapter describes one of the fundamental measurement concepts of Practical Software, the Information Model. The Information Model provides
Big Data, Socio- Psychological Theory, Algorithmic Text Analysis, and Predicting the Michigan Consumer Sentiment Index
Big Data, Socio- Psychological Theory, Algorithmic Text Analysis, and Predicting the Michigan Consumer Sentiment Index Rickard Nyman *, Paul Ormerod Centre for the Study of Decision Making Under Uncertainty,
Inventory Optimization for the Consumer Products Industry
Inventory Optimization for the Consumer Products Industry Highlights Helps improve service levels and reduce working capital costs by enabling improved inventory planning and optimization Dynamically adjusts
Project Risks. Risk Management. Characteristics of Risks. Why Software Development has Risks? Uncertainty Loss
Project Risks Risk Management What can go wrong? What is the likelihood? What will the damage be? What can we do about it? M8034 @ Peter Lo 2006 1 M8034 @ Peter Lo 2006 2 Characteristics of Risks Uncertainty
Chapter 14 Managing Operational Risks with Bayesian Networks
Chapter 14 Managing Operational Risks with Bayesian Networks Carol Alexander This chapter introduces Bayesian belief and decision networks as quantitative management tools for operational risks. Bayesian
Setting smar ter sales per formance management goals
IBM Software Business Analytics Sales performance management Setting smar ter sales per formance management goals Use dedicated SPM solutions with analytics capabilities to improve sales performance 2
Interaction of FAS 157 with FAS 133:
Interaction of FAS 157 with FAS 133: The Second Wave By: Peter Seward, Vice President, Product Strategy, Reval June 2, 2009 CONTENT Executive Summary Introduction FAS 157 Fair Valuations Impact on FAS
Converged, Real-time Analytics Enabling Faster Decision Making and New Business Opportunities
Technology Insight Paper Converged, Real-time Analytics Enabling Faster Decision Making and New Business Opportunities By John Webster February 2015 Enabling you to make the best technology decisions Enabling
Use Advanced Analytics to Guide Your Business to Financial Success
SAP Information Sheet Analytics Solutions from SAP Quick Facts Use Advanced Analytics to Guide Your Business to Financial Success Quick Facts Summary With advanced analytics from SAP, finance experts can
COMBINING THE METHODS OF FORECASTING AND DECISION-MAKING TO OPTIMISE THE FINANCIAL PERFORMANCE OF SMALL ENTERPRISES
COMBINING THE METHODS OF FORECASTING AND DECISION-MAKING TO OPTIMISE THE FINANCIAL PERFORMANCE OF SMALL ENTERPRISES JULIA IGOREVNA LARIONOVA 1 ANNA NIKOLAEVNA TIKHOMIROVA 2 1, 2 The National Nuclear Research
