A Framework for Prioritizing Economic Statistics Programs




Thomas L. Mesenbourg, Associate Director for Economic Programs
Ronald H. Lee, Senior Financial Advisor
Office of the Associate Director for Economic Programs, U.S. Census Bureau

Disclaimer: This report is released to inform interested parties of research and to encourage discussion. Any views expressed on statistical, methodological, technical, or operational issues are those of the authors and not necessarily those of the U.S. Census Bureau.

In the future, we expect increased scrutiny of statistical budgets, with budget appropriations increasingly tied to program performance. Maintaining financial program support during the remainder of the decade will be a challenge, as funding requests for the 2007 Economic Census and the 2010 Decennial Census will be increasing. Nonetheless, we believe that simply maintaining the programmatic status quo is not sufficient; we also must reallocate resources to improve and enhance our programs, even in the face of constrained financial resources. We assume that all but the most important program improvements will have to be funded internally by reallocating existing resources from lower to higher priority programs.

To facilitate program review and reallocation decisions and to respond to possible budget cuts, the U.S. Census Bureau's Economic Programs Directorate has been developing a framework for prioritizing and evaluating its existing programs. Prioritizing our economic statistics programs has proved an easier task than developing specific measures that evaluate and rank individual programs using proxy metrics for program usefulness and importance to key stakeholders, data quality, and cost efficiency.
Moreover, we have found that the framework we developed for establishing relative priorities among our economic statistics portfolio has proved useful in responding to possible budget reductions, but it is not as useful in identifying possible cost savings and efficiency opportunities to pay for internally funded program improvements. In fact, our experience over the past several years convinces us that no single framework will satisfactorily respond to the myriad of program needs.

This paper describes the Census Bureau's strategies for establishing relative priorities among diverse economic statistics programs when faced with significant budget reductions, their effectiveness, and their shortcomings. In addition, the paper discusses our initial efforts to define program metrics to help us identify possible streamlining opportunities.

Keywords: Budget Framework, Resources, Economic Statistics Programs, Metrics, Streamlining Opportunities, Prioritizing, Priorities, Criteria, Data Quality, Reduction, Quality Measures, Unit Costs, and Stakeholders.

Budget Priorities Framework

Budget reductions can take several forms. The Congress may earmark a specific program for elimination, but this has been very rare. More often, it reduces funding for a major subactivity, such as Current Economic Statistics or the Economic Census. Generally, such reductions are closely tied to the previous year's budget for the subactivity. Even a moderate increase in the appropriation may not completely fund pay raises, increases in postage, or increased rents associated with our new Census buildings. Recently, budget reductions have taken the form of an across-the-board cut of one or two percent. Since across-the-board cuts damage all programs, the Census Bureau generally has proposed to eliminate lower priority programs and asked Congress for permission to reprogram the reduction against specific programs.
The Congress, of course, has the discretion to approve or disapprove the reprogramming request. Since budget reductions rarely eliminate a specific survey or program, we have developed criteria to help determine which programs we should retain and which programs we should consider for elimination. Over the past decade, we used the following broad criteria:

- Retain programs that provide source data to the Bureau of Economic Analysis (BEA) National Income and Product Accounts (NIPA) and to the Federal Reserve Board's (FRB) Index of Industrial Production or Flow of Funds accounts. These agencies are two of our most important data users and stakeholders.
- Protect programs and content that serve as benchmarks for composite measures of economic activity, such as Gross Domestic Product (GDP), the Producer Price Index, productivity measures, or the FRB index. The Economic Census sectoral censuses (i.e., census of manufactures, retail, etc.) and the Census of Governments are the two most important benchmark programs.
- Preserve the data quality of existing programs. When faced with budget cuts, we have avoided slashing existing samples. Sample cuts generally result in relatively small cost savings but significant losses in data quality.

We used these broad criteria to respond to an across-the-board cut in the Census Bureau budget in 2005. Rather than reducing all of our individual economic statistics programs by 1 percent, a total of almost $3 million, we selected the Vehicle Inventory and Use Survey (VIUS) for elimination. VIUS, part of the Economic Census, was selected because its elimination had a minimal impact on BEA's programs. We also took into consideration that all 2002 VIUS data products had been released and that planning for the 2007 program had just started in earnest. Additionally, we hoped that the transportation community, the primary users of the VIUS data, could possibly secure external funding for continuing the program. Initially, the Congress opposed the elimination of VIUS, but eventually we were able to convince Appropriations staff that our criteria and rationale for elimination were sound and preferable to negatively impacting all of our economic statistics programs. The program was terminated and was not included in the FY 2007 President's Budget.

Early in the summer of 2006, the Appropriations committees asked the Government Accountability Office (GAO) to review the Census Bureau's FY 2007 budget request and focus on the criteria the Census Bureau would employ if faced with a significant budget reduction of $53 million in the 2010 Census request. As we responded to GAO inquiries, we further refined the framework for prioritizing economic statistics programs.
The new framework is described below:

1. The 2010 Decennial Census was deemed the Census Bureau's top priority. We argued that a FY 2007 reduction of $53 million would make it impossible to fully test the handheld computers, a critical component of the decennial automation initiative. Without testing, the new technologies would not be used for nonresponse follow-up in 2010, substantially increasing data collection costs.
2. The Economic Census sectoral censuses and the Census of Governments were identified as the second priority. Both programs provide critical source data to the NIPA and serve as benchmarks for GDP, the Producer Price Index, and the FRB's Index of Industrial Production, as well as for many Census Bureau annual surveys. Moreover, both programs have critical roles in updating the Census Bureau's registers of businesses and of state and local governments.
3. The Census Bureau's principal economic indicators and related annual surveys were deemed the third priority. The Census Bureau's 12 economic indicator programs provide critical source data for quarterly GDP estimates and serve as current barometers of economic activity. Response to the indicator surveys is generally voluntary, while response to the related annual surveys is mandatory. Since we benchmark the monthly and quarterly estimates to the related annual surveys, we were unwilling to eliminate or significantly change any of the related annual surveys.
4. The fourth priority would be Census Bureau surveys that provide source data for the NIPA.
5. The final priority would be the remaining economic surveys not directly used by the Bureau of Economic Analysis.

It is clear from the above criteria that protecting programs that provide the Bureau of Economic Analysis with source data for both the NIPA and the GDP was the overriding consideration in prioritizing economic statistics programs.
Moreover, we determined that the underlying infrastructure supporting the censuses and our current economic surveys was not going to be targeted for reduction or elimination. This included the Business Register and associated supporting activities, such as the Company Organization Survey, industry and product classification, and the research conducted and supported by the Center for Economic Studies.

The list below identifies the major economic surveys and programs and their alignment under budget priorities 2 through 5. We used this model in the fall of 2006 to identify possible program reductions if economic statistics programs had to absorb $10 million of the proposed $53 million cut in 2010 census funding levels. It also identifies the programs that were considered for possible elimination or modification.

Priority 2: Economic Census and Census of Governments Programs

- Economic Census sectoral censuses
- Census of Governments
- Business Expenses Survey
- Island Area economic censuses
  - Economic Census of Puerto Rico
  - Economic Census of the Northern Mariana Islands
  - Economic Census of Guam
  - Economic Census of the Virgin Islands
  - Economic Census of American Samoa
- Survey of Business Owners (proposed cut eliminates coverage of businesses with no paid employees)

Priority 3: Principal Economic Indicators and Related Annual Surveys

Tier 1 principal economic indicators (related annual survey in parentheses):
- U.S. International Trade in Goods & Services
- Advance Monthly Retail Sales Survey (Annual Retail Trade Survey)
- Advance Report on Durable Goods (Annual Survey of Manufactures)
- New Residential Construction

Tier 2 principal economic indicators:
- Monthly Retail Trade Survey (Annual Retail Trade Survey)
- Manufacturers' Shipments, Inventories, and Orders, Durables and Nondurables One Week Later Release (Annual Survey of Manufactures)
- Monthly Wholesale Trade Survey (Annual Wholesale Trade Survey)
- Quarterly Services Survey (Service Annual Survey)
- Value of New Construction Put in Place
- Quarterly Financial Report (proposed cut eliminates coverage of small manufacturing firms)

Priority 4: Economic Surveys That Provide NIPA Data

- Annual Capital Expenditures Survey
- Information and Communications Technology Survey (proposed for elimination)
- Measuring Electronic Business: Manufacturers' Sales Offices and Branches Annual Estimates
- Annual Survey of State and Local Finances
- Annual Public Employment Survey
- Current Industrial Reports (proposed for elimination)
- State Government Tax Collections Survey
- State and Local Public Employee Retirement System Survey
- Local Government School System Finance Survey
- Quarterly Tax Survey
- Quarterly Public Employee Retirement Systems Survey
- Residential Alterations and Repairs Survey (proposed for elimination)

Priority 5: Other Current Economic Surveys and Programs

- County Business Patterns (proposed for one-year suspension)
- Measuring Electronic Business: Quarterly Retail E-Commerce Sales
- Nonemployer Report
- Consolidated Federal Funds Report
- Federal Audit Clearinghouse

While the criteria permitted us to assign relative priorities among our economic statistics programs, we also considered other dimensions, such as data quality, availability of alternative data sources, integration into the National Accounts, and associated savings. If faced with a possible reduction, we needed to identify programs whose elimination would substantially reduce costs, such as the possible elimination of the Information and Communications Technology Survey and the Current Industrial Reports program and the dropping of nonemployer coverage from the quinquennial Survey of Business Owners. The list of possible economic program reductions was shared with the GAO, the Appropriations staff, and other important stakeholders. Fortunately, Congress, after assessing the possible impact on 2010 census plans and the economic statistics programs, provided sufficient funds, and no reductions were required to our economic statistics programs.

Program Improvements Model

Over the past five years, we have made significant improvements to the Census Bureau's economic statistics programs. A new principal economic indicator, the Quarterly Services Survey, was introduced in 2004. We published data from the new annual Information and Communications Technology Survey, provided first-time annual coverage of manufacturers' sales offices and branches, and substantially expanded the content of the Survey of Business Owners. Nonetheless, serious data gaps remain. The FY 2008 President's Budget requests an additional $8 million to improve our measurement of services.
If the Congress appropriates these funds, Quarterly Services Survey coverage of GDP would be increased from 17 percent to 30 percent in calendar year 2008; Service Annual Survey coverage would be increased from 30 percent to 55 percent of GDP in 2009, matching Economic Census coverage; and QSS coverage would be increased to 55 percent in 2010. The FY 2008 President's Budget also requests an additional $45 million above the 2007 appropriation for the Economic Census and $285 million more for the 2010 census. Given the magnitude of the requested increase in FY 2008, we will be fortunate indeed to get full funding for the services increase.

While we do not believe we can fund major program enhancements like the services initiative internally, we do believe more modest improvements will have to be funded through program streamlining, reductions, or improved efficiency. To accomplish this, we need a framework for evaluating our programs and identifying cost savings opportunities. The framework, which was developed over a year ago, ranks programs using various attributes such as relevance, cost effectiveness, users and uses, and quality. We developed initial metrics to help quantify these and other important program attributes. This methodology is still in its infancy and needs additional refinement.

Initially, we evaluated eight of our economic indicators, eight annual economic surveys, and the Current Industrial Reports program. The Current Industrial Reports program is shown as a single program even though it consists of 5 monthly surveys, 12 quarterly surveys, and 26 annual product surveys. We did not develop metrics for each CIR survey, both because it would have been too difficult and because the entire program was thoroughly reviewed and evaluated in 2004. As a result of that review, 11 surveys were eliminated, several high-tech surveys were re-engineered, and several other surveys were changed from annual to quarterly frequency.
The Economic Census and the Census of Governments were not evaluated because of their special characteristics and their cyclical nature.

The nine annual programs listed below account for over $43 million, or about 30 percent, of our total current economic statistics budget:

- Annual Retail Trade Survey
- Annual Survey of Manufactures
- Annual Wholesale Trade Survey
- Service Annual Survey
- Annual Capital Expenditures Survey
- Information and Communications Technology Survey
- Annual Survey of State and Local Finances
- Annual Public Employment Survey
- Current Industrial Reports

The eight economic indicator programs listed below total $39 million annually. Initially, we also included the monthly international trade program, but its uniqueness and reliance on import and export transactions make it unlike any of the other indicators, and it is not included in this paper.

- New Residential Construction
- Value of Construction Put in Place
- Advance Monthly Retail
- Monthly Retail Trade Survey
- Monthly Wholesale Survey
- Manufacturers' Shipments, Inventories, and Orders Survey (M-3)
- Quarterly Financial Report
- Quarterly Services Survey

The initial framework consists of three dimensions: value to the Bureau of Economic Analysis (BEA) and the Federal Reserve Board (FRB), data quality, and cost efficiency. Each dimension includes two measures, and a 5-level scale was used to rate each of the six measures. Values ranged from 1 to 5; the higher the number, the better the measure. Each dimension, its associated measures, and the scoring scales are described below.

Value to the BEA and the FRB

Our two most important stakeholders, the Bureau of Economic Analysis and the Federal Reserve Board, were asked to rank the principal economic indicators and selected annual surveys. The BEA and FRB scores were each treated as a separate value measure. The BEA rated eight of the annual surveys and seven of the economic indicators as critically important, with a rank of 5. The Federal Reserve Board rated all the indicator programs as a 4, except the QFR program, which it ranked as a 5; the QFR was the only indicator ranked a 4 by the BEA. The high ratings reflect how important our indicator programs and our annual programs are to both the BEA and the FRB, and the high rankings support the priorities presented in the first section of the paper.
BEA/FRB Rankings Distribution

            Annual Surveys      Indicators
  Rank       BEA     FRB        BEA     FRB
  5           8       2          7       1
  4           1       5          1       7
  3           0       2          0       0

Data Quality

We used the following two measures of data quality: the survey's unit response rate and the coefficient of variation, or CV, for its principal variable. The CV expresses the standard error as a percentage of the estimate to which it refers. For example, an estimate of 100 units that has an estimated standard error of 10 units has an estimated CV of 10 percent; the resulting 90 percent confidence interval for the estimate would run from 83.5 to 116.5 units. Values for the quality measures were assigned as follows:

Response Rate Distribution

  Unit Response Rate    Rank    Annuals    Indicators
  80% or greater          5        1           1
  75%-79%                 4        5           1
  70%-74%                 3        3           1
  60%-69%                 2        0           3
  <60%                    1        0           2
  Not applicable         NA        0           0
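The coefficient-of-variation arithmetic in the worked example above can be sketched in a few lines. The numbers are the paper's illustrative figures, not survey data, and the 1.65 multiplier is inferred from the 83.5-116.5 interval quoted in the text:

```python
def cv_percent(estimate: float, standard_error: float) -> float:
    """CV: the standard error expressed as a percentage of the estimate."""
    return 100.0 * standard_error / estimate

def confidence_interval_90(estimate: float, standard_error: float) -> tuple:
    """90 percent normal-approximation interval, using the 1.65
    multiplier implied by the worked example in the text."""
    half_width = 1.65 * standard_error
    return (estimate - half_width, estimate + half_width)

estimate, se = 100.0, 10.0                   # illustrative figures
print(cv_percent(estimate, se))              # 10.0
print(confidence_interval_90(estimate, se))  # (83.5, 116.5)
```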

Coefficient of Variation Distribution

  Coefficient of Variation    Rank    Annuals    Indicators*
  <1%                           5        5           3
  1.0%-1.9%                     4        2           2
  2.0%-2.9%                     3        2           2
  3.0%-3.9%                     2        0           0
  Not applicable               NA       NA           1

  *CV not applicable to the M-3.

Cost Efficiency

We used the following two measures of cost efficiency: the cost per annualized number of survey units and the cost per annualized number of variables. The second measure gives some weight to the survey's complexity; that is, the more variables collected, generally the more expensive the survey's processing. The cost per survey unit was calculated by dividing the survey's budget by the number of forms mailed annually. The cost per variable was calculated similarly, except that the number of variables was substituted for forms. Values ranged from 1 to 5, where a higher number indicated lower cost. Different ranges were used for the annual surveys and the economic indicators. Values were based on the following cost ranges:

Cost Per Annualized Variable Distribution

          Annual Surveys                 Indicators
  Rank    Cost Per Variable    Count     Cost Per Variable    Count
  5       <$2.00                 2       <$4.00                 3
  4       $2.00-$4.99            4       $4.00-$9.99            2
  3       $5.00-$9.99            2       $10.00-$19.99          1
  2       $10.00-$19.99          0       $20.00-$39.99          2
  1       >$20.00                1       >$40.00                0

Cost Per Annualized Sampling Unit Distribution

          Annual Surveys                 Indicators
  Rank    Cost Per Unit        Count     Cost Per Unit        Count
  5       <$100.00               3       <$50.00                2
  4       $100.00-$149.99        2       $50.00-$74.99          2
  3       $150.00-$199.99        3       $75.00-$99.99          2
  2       $200.00-$399.99        0       $100.00-$199.99        1
  1       >$399.99               1       >$199.99               1
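The two cost-efficiency measures and their 1-5 ranking bands can be sketched as follows. The band edges come from the annual-survey cost-per-unit table above; the survey budget and form count are hypothetical:

```python
def cost_per_unit(budget: float, forms_mailed: int) -> float:
    """Survey budget divided by the number of forms mailed annually."""
    return budget / forms_mailed

def cost_per_variable(budget: float, variables: int) -> float:
    """Survey budget divided by the annualized number of variables."""
    return budget / variables

def rank_from_bands(value: float, bands) -> int:
    """bands: ascending (upper_bound, rank) pairs; a higher rank
    means lower cost. Values above the last bound get rank 1."""
    for upper, rank in bands:
        if value < upper:
            return rank
    return 1

# Annual-survey bands from the cost-per-unit table above.
ANNUAL_UNIT_BANDS = [(100.0, 5), (150.0, 4), (200.0, 3), (400.0, 2)]

budget, forms = 2_500_000.0, 20_000          # hypothetical annual survey
unit_cost = cost_per_unit(budget, forms)     # 125.0
print(rank_from_bands(unit_cost, ANNUAL_UNIT_BANDS))  # 4
```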

Measures Considered But Not Used

We considered a number of other measures related to data value, such as GDP coverage, customer satisfaction scores from our annual Web survey, and the number of survey program Web page hits. The customer satisfaction survey results did not relate directly to the specific surveys identified in Table 1, and the page hits proved difficult to analyze and use intelligently. We considered using GDP coverage as a measure but decided against it: if a survey is the primary or only source of data for GDP or the NIPA, the BEA ranks the survey as critically important regardless of its GDP coverage. We also calculated timeliness metrics but decided they were not especially useful in identifying streamlining opportunities. For the economic indicators, we plan to calculate a metric related to the average size of the revision, comparing the first estimate to the final estimate for one key measure in each survey; that work has not been completed.

Findings

The two tables below provide information on the eight economic indicator programs and the nine annual surveys. Relative rankings on each of the six measures are presented, along with the average for each measure. The indicator information is provided in Table 1; annual survey information is provided in Table 2.
Table 1: Indicator Findings

  Program                               BEA   FRB   Response   CV   $/Unit   $/Variable
  New Residential Construction           5     4       5        3      5         5
  Value of Construction Put in Place     5     4       2        4      5         4
  Advance Monthly Retail                 5     4       1        5      5         2
  Monthly Retail Trade Survey            5     4       2        5      4         3
  Monthly Wholesale Survey               5     4       3        4      4         4
  Manufacturers' Shipments,
    Inventories, and Orders Survey       5     4       4       NA      3         3
  Quarterly Financial Report             4     5       2        5      2         5
  Quarterly Services Survey              5     4       1        3      2         1
  Average                               4.9   4.1     2.5      4.1    3.8       3.4

Table 2: Annual Findings

  Program                               BEA   FRB   Response   CV   $/Unit   $/Variable
  Information and Communications
    Technology Survey                    4     4       3        5      5         5
  Annual Capital Expenditures Survey     5     4       3        4      5         5
  Current Industrial Reports             5     5       3        3      4         3
  Annual Public Employment Survey        5     3       4        5      4         5
  Service Annual Survey                  5     4       4        3      4         3
  Annual Survey of Manufactures          5     5       5        5      4         4
  Annual Retail Trade Survey             5     4       4        5      3         3
  Annual Government Finance Survey       5     3       4        5      3         4
  Annual Trade Survey                    5     3       4        4      1         1
  Average                               4.9   3.9     3.8      4.3    3.7       3.7

This information was compiled to evaluate the usefulness of different metrics in helping us identify streamlining or reallocation opportunities for shifting resources from lower priority programs to fund higher priority improvements or programs. These decisions also would be made in the context of the program priorities presented in the first section of the paper. For information purposes, we have computed an average score for each column. We did not compute an average for each survey across the six measures, or a subset of them, because the measures are not all equally important. In fact, our first finding is that survey assessments or evaluations cannot be captured by a single metric. Rather, multiple dimensions,

including ones not presented in this paper, need to be thoughtfully considered.

The BEA and FRB rankings support the program priorities presented in the first section of the paper; namely, protect the economic indicators and related annual surveys that provide critical source data for GDP and, next, protect surveys providing source data for the National Income and Product Accounts. These criteria argue against eliminating any of the programs included in the above tables unless the magnitude of the reduction were such that program eliminations were unavoidable. When faced with a possible $10 million reduction last fall, we did follow the BEA priorities very closely: the Quarterly Financial Report program was preserved, but its manufacturing size cutoff was raised to $250 million in assets, matching the cutoff for the other sectors. The Information and Communications Technology Survey, the only annual survey the BEA ranked as a 4, was considered for elimination. The other significant program reduction was the Current Industrial Reports program; this cut was required to generate enough cost savings. Overall, the BEA rankings, and even the lower FRB scores, do not provide much differentiation among surveys and are not much help in identifying lower priority programs or streamlining opportunities.

In terms of the quality measures, the coefficient of variation scores were quite comparable for the indicator and annual surveys. The response rate rankings were calculated using unweighted unit response data and, as such, should be interpreted with caution. For example, the Quarterly Services Survey has a unit response rate close to 60 percent, yet its weighted response rate for revenue is close to 80 percent. The difference between the average response rate rank of the economic indicators and that of the annual surveys, however, is striking.
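The unweighted-versus-weighted distinction noted above can be illustrated with hypothetical data, chosen so that a low unit response rate coexists with a high revenue-weighted rate, as with the QSS example:

```python
# Each tuple is (responded?, revenue weight); figures are hypothetical.
units = [
    (True,  800.0),   # one large responding firm dominates revenue
    (False,  50.0),
    (False,  75.0),
    (False,  75.0),
]

# Unweighted unit response rate: share of units that responded.
unit_rate = 100.0 * sum(r for r, _ in units) / len(units)

# Weighted response rate: share of total revenue held by respondents.
weighted_rate = 100.0 * sum(w for r, w in units if r) / sum(w for _, w in units)

print(unit_rate)      # 25.0
print(weighted_rate)  # 80.0
```

A single large respondent can thus carry most of the revenue weight even when most sampled units do not report.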
This difference is largely explained by the fact that seven of the annual surveys (all but the government employment and finance surveys) are collected under mandatory reporting authority, while response to all the indicator programs, except the Quarterly Financial Report program, is voluntary.

The efficiency measures, cost per sampling unit and cost per variable, identified some interesting differences among both the indicator programs and the annual surveys, but some caveats need to be heeded. The underlying cost figures were extracted from our financial system, and not all costs are charged directly to specific surveys. For several of the programs, programming, sampling, and other statistical support, as well as other support charges, are available only at the line-item level (for example, manufacturing, construction, or business statistics), and line items may include multiple surveys. Support costs were allocated back to specific surveys using each survey's share of total direct costs for the line item. This allocation method will distort survey costs to the degree that support costs are not proportionately expended on each survey within the line item.

The lower unit costs of the indicators are not a surprise, given that the nonconstruction-related monthly and quarterly indicators have relatively small samples: 4,000-5,000 for most monthlies and 6,000-8,000 for the existing quarterlies. On the other hand, the indicators typically collect information for very few variables (the QFR and the construction indicators are exceptions), so their costs per variable are generally much higher than those of the annual surveys. Annual surveys consistently cost more than the associated monthly or quarterly indicator surveys, even when the annualized monthly sample is the same size as or somewhat larger than the annual survey's. The higher cost of the annual surveys is primarily due to the much more detailed information collected.
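The support-cost allocation rule described above, spreading line-item support charges across surveys in proportion to each survey's share of the line item's direct costs, can be sketched as follows. The survey names and dollar figures are hypothetical:

```python
def allocate_support(direct_costs: dict, support_total: float) -> dict:
    """Allocate a line item's support charges to its surveys in
    proportion to each survey's share of total direct costs."""
    total_direct = sum(direct_costs.values())
    return {name: support_total * cost / total_direct
            for name, cost in direct_costs.items()}

# Hypothetical line item with two surveys and $250,000 of support costs.
line_item_direct = {"survey_a": 600_000.0, "survey_b": 400_000.0}
print(allocate_support(line_item_direct, 250_000.0))
# {'survey_a': 150000.0, 'survey_b': 100000.0}
```

As the text cautions, this proportional rule distorts per-survey costs whenever support effort is not actually expended in proportion to direct costs.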
The average number of variables for the nine annual surveys is 1.4 million, while the annualized average number of variables for the nonconstruction indicators is 442,000; if we remove the QFR, the average number of annualized variables is only 218,000. The significantly larger number of variables collected in the annual surveys requires substantially more staff resources for analysis and review.

The differences in unit costs and costs per variable within both the economic indicator and annual survey programs warrant further investigation. For example, the QSS indicator ranks in the lowest quartile in cost per unit and cost per variable, reflecting the fact that it is a brand new survey that we have staffed to handle significant increases in coverage over the next several years. As the quarterly sample triples, the cost per unit will drop significantly. We plan to review in some detail individual surveys with higher than average costs to see whether best practices are being used and to identify opportunities for streamlining.

Conclusions

What we collect is for the most part determined by our key stakeholders. Given their strong support for our existing programs, few opportunities remain to save money by eliminating programs should we face future budget cuts. On the other hand, how we collect and process data is something we can control. A careful review of existing methods, processes, and practices, using some of the information discussed in this paper as well as other metrics, may help us identify opportunities for cutting individual program costs so we can reallocate funds to additional improvements. For example, the unit and variable cost data presented here reinforce our strategy of leveraging existing surveys by adding new content rather than starting new surveys, a much more expensive proposition. We invite suggestions and recommendations, both on our program priorities and on possible methods to refine and improve our evaluations of individual programs.