
EXCLUSIVELY FOR TDWI PREMIUM MEMBERS
Volume 17, Number 4, 4th Quarter 2012
The leading publication for business intelligence and data warehousing professionals

The Necessary Skills for Advanced Analytics, Hugh J. Watson 4
BI Dashboards the Agile Way, Paul DeSarra 8
Best Practices for Turning Big Data into Big Insights, Jorge A. Lopez 17
Implementing Dashboards for a Large Business Community, Doug Calhoun and Ramesh Srinivasan 22
Data Governance Models for Healthcare, Jason Oliveira 34
BI Q&A: Gaming Companies on the Bleeding Edge of Analytics, Linda L. Briggs 40
Offloading Analytics: Creating a Performance-Based Data Solution, John Santaferraro 43
BI Experts' Perspective: Mobile Apps, Timothy Leonard, William McKnight, John O'Brien, and Lyndsay Wise 49

BI Training Solutions: As Close as Your Conference Room

We know you can't always send people to training, especially in today's economy. So TDWI Onsite Education brings the training to you. The same great instructors, the same great BI/DW education as a TDWI event, brought to your own conference room at an affordable rate. It's just that easy. Your location, our instructors, your team. Contact Yvonne Baho at [email protected] for more information. tdwi.org/onsite

volume 17, number 4

3 From the Editor
4 The Necessary Skills for Advanced Analytics, Hugh J. Watson
8 BI Dashboards the Agile Way, Paul DeSarra
17 Best Practices for Turning Big Data into Big Insights, Jorge A. Lopez
22 Implementing Dashboards for a Large Business Community, Doug Calhoun and Ramesh Srinivasan
34 Data Governance Models for Healthcare, Jason Oliveira
40 BI Q&A: Gaming Companies on the Bleeding Edge of Analytics, Linda L. Briggs
43 Offloading Analytics: Creating a Performance-Based Data Solution, John Santaferraro
49 BI Experts' Perspective: Mobile Apps, Timothy Leonard, William McKnight, John O'Brien, and Lyndsay Wise
55 Instructions for Authors
56 BI StatShots

Business Intelligence Journal, Vol. 17, No. 4

volume 17, number 4, tdwi.org

EDITORIAL BOARD
Editorial Director: James E. Powell, TDWI
Managing Editor: Jennifer Agee, TDWI
Senior Editor: Hugh J. Watson, TDWI Fellow, University of Georgia
Director, TDWI Research: Philip Russom, TDWI
Director, TDWI Research: David Stodder, TDWI
Associate Editors: Barry Devlin, 9sight Consulting; Mark Frolick, Xavier University; Troy Hiltbrand, Idaho National Laboratory; Claudia Imhoff, TDWI Fellow, Intelligent Solutions, Inc.; Barbara Haley Wixom, TDWI Fellow, University of Virginia
Senior Graphic Designer: Bill Grimmer

1105 MEDIA, INC.
President: Rich Zbylut
Director, Online Products & Marketing: Melissa Parrish
President & Chief Executive Officer: Neal Vitale
Senior Vice President & Chief Financial Officer: Richard Vitale
Executive Vice President: Michael J. Valenti
Vice President, Finance & Administration: Christopher M. Coates
Vice President, Information Technology & Application Development: Erik A. Lindgren
Vice President, Event Operations: David F. Myers
Chairman of the Board: Jeffrey S. Klein

Advertising Sales: Scott Geissler, [email protected]

List Rentals: 1105 Media, Inc., offers numerous postal and telemarketing lists targeting business intelligence and data warehousing professionals, as well as other high-tech markets. For more information, please contact our list manager, Merit Direct.

Reprints: For single article reprints (in minimum quantities of ), e-prints, plaques, and posters, contact PARS International, [email protected].

Copyright 2012 by 1105 Media, Inc. All rights reserved. Reproductions in whole or in part are prohibited except by written permission. Mail requests to Permissions Editor, c/o Business Intelligence Journal, 1201 Monster Road SW, Suite 250, Renton, WA. The information in this journal has not undergone any formal testing by 1105 Media, Inc., and is distributed without any warranty expressed or implied. Implementation or use of any information contained herein is the reader's sole responsibility. While the information has been reviewed for accuracy, there is no guarantee that the same or similar results may be achieved in all environments. Technical inaccuracies may result from printing errors, new developments in the industry, and/or changes or enhancements to either hardware or software components. Printed in the USA. [ISSN ] Product and company names mentioned herein may be trademarks and/or registered trademarks of their respective companies.

Reaching the Staff
Staff may be reached via telephone, fax, mail, or the following form: [email protected]
Renton office (weekdays, 8:30 a.m. to 5:00 p.m. PT): 1201 Monster Road SW, Suite 250, Renton, WA
Corporate office (weekdays, 8:30 a.m. to 5:30 p.m. PT): Oakdale Avenue, Suite 101, Chatsworth, CA
Business Intelligence Journal (article submission inquiries): Jennifer Agee, [email protected], tdwi.org/journalsubmissions
TDWI Premium Membership (inquiries & changes of address): [email protected], tdwi.org/premiummembership

From the Editor

Speed is on everyone's mind these days. From real-time data to on-demand reporting, BI professionals want up-to-the-minute information, and they want it now. The authors in this issue of the Business Intelligence Journal understand.

Agile development methodologies have long promised speedier delivery of new applications or features thanks to shorter development cycles and increased user collaboration. Paul DeSarra explains how an agile approach can be leveraged to meet the highly dynamic needs of business; he uses an agile dashboard project to illustrate his ideas. Dashboards are a quick and easy way to communicate key performance indicators, and Doug Calhoun and Ramesh Srinivasan provide tips and best practices for creating a successful dashboard design. An agile approach may also be what's needed for mobile development at a maternity clothes maker, the subject of our Experts' Perspective. Timothy Leonard, William McKnight, John O'Brien, and Lyndsay Wise offer their advice for getting mobile BI up and running quickly.

Of the three leading characteristics of big data (the so-called 3 Vs: volume, variety, and velocity), it's the speed component that is often cited as its biggest challenge. How can you process so much data without becoming bogged down? Jorge A. Lopez describes one approach. John Santaferraro discusses how analytics must be offloaded to separate analytics databases if big data is to provide accelerated queries, faster batch processing, and immediate access to a robust analytics environment.

Senior editor Hugh J. Watson notes that studies suggest enterprises will soon face a shortage of data scientists. He explains that we will have to give business analysts and data scientists wider and more in-depth permissions and provide more training for existing staff if we're to keep up with current trends.

Healthcare organizations face a variety of tough governance challenges. Jason Oliveira explores what can be learned from other governance and services organizations that have adopted business intelligence competency centers (BICCs) and how to apply that knowledge to help improve healthcare's BI disciplines.

Speed can present challenges, which is why our Q&A explores how gaming companies are on the bleeding edge of analytics, using real-time information to improve gameplay (as well as up-sell or cross-sell products or services to players).

How are you keeping up? We welcome your feedback and comments; please send them to [email protected].

Advanced Analytics

The Necessary Skills for Advanced Analytics
Hugh J. Watson

Hugh J. Watson holds the C. Herman and Mary Virginia Terry Chair of Business Administration in the Terry College of Business at the University of Georgia. He is a Fellow of TDWI and senior editor of the Business Intelligence Journal. [email protected]

Analytics work requires business domain knowledge, the ability to work with data, and modeling skills. Figure 1 identifies some of the skills in each area. The importance of particular skills and the exact forms they take depend on the user and the kind of analytics involved. Let's take a closer look.

It is useful to distinguish among business users, business analysts, and data scientists. Business users access analytics-related information and use descriptive analytics tools and applications in their work: reports, OLAP, dashboards/scorecards, and data visualization. They have extensive business domain knowledge and are probably familiar with the data they are accessing and using, but have a limited need for and understanding of modeling.

Figure 1. Skills needed for analytics:
- Business domain: goals, strategies, processes, decisions, communication of results
- Data: access, integration, transformation, preparation
- Modeling: methods, techniques, and algorithms; tools and products; methodologies

Business analysts use analytical tools and applications to understand business conditions and drive business processes. Their job is to access and analyze data and to provide information to others in the organization. Most business analysts are located in the functional areas of a business (such as marketing) and perform analytical work (such as designing marketing campaigns), or they may work in a centralized analytics team that provides analytics organizationwide. Depending on their positions, business analysts work with some combination of descriptive, predictive, and prescriptive analytics. They tend to have a good balance of business domain knowledge as well as data and modeling skills.

The data scientist title is taking hold even though it sounds elitist (I've also heard the term "data ninja"). Data scientists typically have advanced training in multivariate statistics, artificial intelligence, machine learning, mathematical programming, and simulation. They perform predictive and prescriptive analytics and often hold advanced degrees, including Ph.D.s, in fields such as econometrics, statistics, mathematics, and management science. Companies don't need many data scientists, but they come in handy for some advanced work. Data scientists often have limited business domain knowledge, the ability to handle data related to performing analytics (e.g., data transformations), and strong modeling skills. They often move from project to project and are paired with business users and business analysts so that the necessary domain knowledge is included on the team.

Most companies have moved along the BI/analytics maturity curve and now have business users who understand and can employ descriptive analytics and business analysts who can deliver descriptive and some predictive analytics.
Interest is now focusing on the organizational capability to perform predictive and prescriptive (that is, advanced) analytics to answer why things happen and propose changes that will optimize performance. This explains why enterprises are employing more data scientists. Successful advanced analytics requires a high level of business domain, data, and modeling skills, and a team of people is often required to ensure that all of the skills and perspectives are in place. As an example, consider the following experience.

Southwire: Bringing the Skills Together

Several years ago, I received a call from a manager at Southwire, a leading producer of building, utility, industrial power, and telecommunications cable products and copper and aluminum rods. He wanted help solving an impending problem associated with the production of copper, a key component of many of his company's products. My experience on that project (in particular, how the problem was approached and solved) provides a good example of the skills required to be successful with advanced analytics.

I learned that there is no set formula for manufacturing copper. A variety of ores and other ingredients are used depending on what is available. The current approach involved an expert who would look at what materials were on hand and decide what and how much of each ingredient should be used. It was critical that the ingredients produced copper and that the copper would be viscous enough to flow out of the smelter and refining furnace. The problem was that the expert was retiring soon and his expertise was going to be lost. A new solution approach was needed.

Southwire assembled a team of business people, chemical engineers, IT, and me. We had individuals with business knowledge, subject area experts, people who were familiar with available data and systems, and members with modeling expertise. The team contained all the skills needed for advanced analytics.
My role was to provide the modeling (data scientist) skills. I saw two possible modeling approaches. The first option was to create an expert/rules-based system based on the knowledge of the retiring expert. We would capture in an application the heuristics the expert used when deciding what to put into the smelter each day. The model would be descriptive in that it would describe the expert's thinking.
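To make the contrast with the second option concrete, here is a minimal sketch of what such a rules-based charge planner might look like. Everything in it (ingredient names, tonnage thresholds, and the rules themselves) is invented for illustration; it is not Southwire's actual logic.

```python
# Toy sketch of an expert/rules-based approach: hand-coded heuristics
# that mimic how a retiring expert might choose a daily smelter charge.
# All ingredient names, rules, and numbers are invented for illustration.

def plan_charge(inventory):
    """Return a {ingredient: tons} charge using simple expert-style rules."""
    charge = {}
    # Rule 1: use available high-grade scrap first, but never more than 40 tons.
    charge["scrap"] = min(inventory.get("scrap", 0), 40)
    # Rule 2: top up toward a 100-ton charge with ore_a.
    remaining = 100 - charge["scrap"]
    charge["ore_a"] = min(inventory.get("ore_a", 0), remaining)
    # Rule 3: fall back to ore_b only if the furnace is still short.
    remaining -= charge["ore_a"]
    charge["ore_b"] = min(inventory.get("ore_b", 0), max(remaining, 0))
    return charge

print(plan_charge({"scrap": 55, "ore_a": 30, "ore_b": 80}))
# {'scrap': 40, 'ore_a': 30, 'ore_b': 30}
```

The weakness of this style, which motivated the alternative, is that the rules encode one person's habits rather than an optimal blend: nothing guarantees the cheapest mix, or even a chemically feasible one, once conditions drift from what the expert anticipated.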

The other alternative, and the one chosen, was to use linear programming. If you are familiar with linear programming, Southwire's problem was the classic production blending application. You create an objective function (that is, an equation) that you want to minimize: the sum of the cost of each ingredient multiplied by the quantity of that ingredient. You also write constraint equations that reflect the conditions the solution must satisfy. The output of the analysis is the quantity of each ingredient that will minimize costs while satisfying all of the constraints.

Writing the constraint equations was fascinating to me. Remember that the solution had to produce copper and it had to be sufficiently viscous. These requirements were handled through the constraint equations and reflected what ingredients were available and the chemical reactions involved. The chemical engineers' input was critical for developing these equations. Remember when you took chemistry in high school or college and studied valences (the number of bonds a given atom has formed, or can form, with one or more other atoms)? This and other factors (such as what ingredients were available each day) were incorporated into the constraint equations.

Data scientists are not a one-trick pony when it comes to modeling. They are familiar with multiple modeling approaches and algorithms, and they are able to identify and experiment with different models until they find the one that seems most appropriate. At Southwire, a linear programming modeling approach was selected over an expert/rules-based system.
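A blending formulation of this shape can be written down in a few lines. The sketch below uses SciPy's linprog; the costs, yields, and constraint coefficients are invented stand-ins (Southwire's actual equations involved the chemistry discussed above and are not public). The objective is the cost-times-quantity sum; the constraints stand in for "must yield enough copper" and "must stay viscous enough."

```python
# Illustrative production-blending LP with invented numbers.
# Decision variables: tons of ore_a, ore_b, and scrap to charge.
from scipy.optimize import linprog

# Objective: minimize cost per ton of each ingredient times its quantity.
costs = [420.0, 380.0, 310.0]  # $/ton for ore_a, ore_b, scrap

# Constraints in A_ub @ x <= b_ub form:
#  1) contained copper must reach 60 tons:
#     0.30*a + 0.25*b + 0.90*s >= 60, written as <= by negating.
#  2) impurity load (a stand-in for the viscosity requirement) capped at 6:
#     0.05*a + 0.08*b + 0.02*s <= 6
A_ub = [[-0.30, -0.25, -0.90],
        [ 0.05,  0.08,  0.02]]
b_ub = [-60.0, 6.0]

# Today's availability of each ingredient (tons).
bounds = [(0, 120), (0, 150), (0, 40)]

res = linprog(costs, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print(res.x)    # optimal tons of each ingredient
print(res.fun)  # minimized total cost
```

With these made-up numbers the solver charges all 40 tons of cheap scrap and tops up with ore_a; change any coefficient (say, the day's available tonnage) and rerunning gives the new optimal blend, which is exactly the daily-decision role the expert had been filling.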
Once the objective function and constraint equations were developed, it was necessary for IT and me to select an appropriate linear programming package, enter the objective function and constraint equations, test the solution, develop a user interface that operational people could easily use for entering data (such as ingredients) and interpreting the output, implement the system, and train people to use it.

Assembling the Skills

Enterprises have the business domain knowledge for advanced analytics. However, as illustrated at Southwire, a key to success is to make sure that people with business domain knowledge are on the analytics team. Enterprises also have the required data skills, but a few changes may be necessary to accommodate the need for advanced analytics. Data scientists (and some business analysts) may need to have fewer restrictions on the data they can access and what they can do with it. They may need access to underlying data structures as well as the ability to join, transform, and aggregate data in ways necessary for their work. They may also need the ability to enter new data into the warehouse, such as from third-party demographic data sources. A possible solution to the potential conflict over control versus flexibility is an analytical sandbox, whether it is internal to the warehouse or hosted on a separate platform.

Finding the required modeling skills is a trickier proposition. You can hire consultants, as Southwire did, or use a third-party analytics provider, but these options can become costly over time if your plans include extensive advanced analytics. You can probably coach some of your current business analysts. There are many conferences (such as those offered by TDWI), short courses, and university offerings that teach advanced analytics.
As advanced analytics becomes better integrated into application software (for example, campaign management) and easier to use, it is likely that trained business analysts can take on tasks that have skill requirements typically associated with data scientists.

You can also hire data scientists. This isn't a new approach; many companies have already done so and have data scientists scattered throughout their business units or in specialized groups such as analytics competency centers. Studies suggest that companies are planning to hire more data scientists and will face a shortage of such resources. For example, the McKinsey Global Institute predicts a shortfall of between 140,000 and 190,000 data scientists by 2018 (Manyika et al., 2011). Many universities are gearing up to meet the need through degree programs, concentrations, and certificates. These offerings are usually in business, engineering, or statistics, and the instructional delivery varies from on campus to online.

One of the first and best-known programs is the Master of Science in Analytics at North Carolina State University. SAS has been an important contributor to the program, which is offered through the Institute for Advanced Analytics and has its own facility on campus. Deloitte Consulting has partnered with the Kelley School of Business at Indiana University to offer a certificate in business analytics for Deloitte's professionals. Just this year, Northwestern University rolled out an online Master of Science in Predictive Analytics offered through its School of Continuing Studies. Will students take advantage of these programs in large enough numbers?

Advanced analytics is a tough study, and many students may not have the necessary aptitude, inclination, and drive to complete the programs, even though the career opportunities are great.

Summary

You have already been performing analytics under the BI umbrella. BI includes descriptive analytics, and you have probably also been performing predictive analytics. For more advanced analytics, however, you will need to ramp up your game a little. You have the business domain knowledge covered. For the data component, you will need to grant business analysts and data scientists wider or more in-depth permissions, and you will likely need to extend and enhance your analytical platforms (such as appliances and sandboxes). For the modeling skills, you will probably need to provide training for existing staff and bring in new people with specialized analytical skills.

Reference

Manyika, James, Michael Chui, Brad Brown, Jacques Bughin, Richard Dobbs, Charles Roxburgh, and Angela Hung Byers [2011]. "Big Data: The Next Frontier for Innovation, Competition, and Productivity," McKinsey Global Institute, May. technology_and_innovation/big_data_the_next_frontier_for_innovation

Agile Dashboard Development

BI Dashboards the Agile Way
Paul DeSarra

Paul DeSarra is Inergex practice director for business intelligence and data warehousing. He has 15 years of BI strategy, development, and management experience working with enterprises.

Abstract

Although the concept of agile software development has been around for more than 10 years, organizations only recently began to think about how this methodology can be applied to business intelligence (BI) and analytics. BI teams are continually evolving to rapidly deliver additional value through reporting, analytics, and dashboard solutions. These teams must also discover what types of BI solutions can reinvigorate a BI deployment and produce meaningful results. One way to reinvigorate BI deployments is to take the concept of agile software development and apply it to BI initiatives such as BI dashboard solutions, which can both re-engage the business and drive actionable intelligence and confident decision making. Agile BI replaces traditional BI project methods (heavy in documentation and process) with a user-driven approach. This article discusses an approach to building BI solutions and dashboards using an agile software development methodology.

Introduction

Although the concept of agile software development has been around for more than a decade, it's only been in the last few years that organizations have started to examine how this methodology can be applied to business intelligence and analytics. The constantly changing, highly dynamic needs of business today have increased the demands on BI environments and teams. The pressure to be more organized, turn projects around faster, and ensure user adoption at all levels is increasing. Teams need to be able to react to demands from the business and proactively develop ideas and solutions that give the business more creative ways to think about how to use data. Leveraging an agile software methodology as it applies to business intelligence is a great way to meet these constantly changing business needs.

In a nutshell, using an agile software development methodology ("agile") instead of a traditional development methodology allows end users to experience a version of the software product sooner. Instead of adhering to a strict and intensive requirements and design phase before development begins, agile employs a series of shorter development cycles to increase user collaboration. The agile approach welcomes changes during the development process to provide a better product that delivers measurable value quickly and efficiently. There are four guiding principles for agile software development (according to the Manifesto for Agile Software Development). These can also be applied to business intelligence development efforts.

Principle #1: Value individuals and interactions over processes and tools

Traditional BI development focuses on strong processes and tools to solve development challenges. As a result, many organizations end up creating silos among the business and IT teams. Each team silo focuses on individual responsibilities and objectives and, in effect, each team loses sight of the overall project goal: providing cohesive and comprehensive data-driven solutions that improve performance levels. When using an agile BI approach, all those involved in the BI initiative work together as one team with one goal and set of objectives. To accomplish this, many organizations create hybrid teams and a business intelligence competency center (BICC) composed of individuals with the necessary skills to define, architect, and deliver analytic solutions. In some cases, many of these teams are organized under business units outside of IT and the program management office.
Principle #2: Value working software over comprehensive documentation

Traditional BI development in a big-bang approach focuses on developing detailed documentation about common metrics, terminologies, processes, governance, support, business cases, and data warehouse architectures. Organizations may create a standardized enterprise data warehouse and then fail because they were focused on the documentation and lost touch with the business and the problems they were trying to solve. This doesn't mean we should stop creating detailed documentation. BI teams can and should continue to focus on creating documentation that emphasizes the vision and scope as well as the architecture for future support. With agile BI, the focus is not on solving every BI problem at once but rather on delivering pieces of BI functionality in manageable chunks via shorter development cycles and documenting each cycle as it happens.

Principle #3: Value customer collaboration over contract negotiation

Using an agile BI approach does not mean giving users an unlimited budget or tolerance for changes. Instead, users can review changes discussed in the last development cycle to ensure expectations and objectives are being met throughout the project. Traditional BI development teams use functional documentation to discuss what the solution will look like and how it will operate. Such an "imagine this" method often leaves users to try to visualize what they believe the solution will become. The resulting subjective expectations can quickly derail a BI project. In contrast, an agile methodology reviews progress during each development cycle using prototypes so that stakeholders and business users can see how the BI solution is expected to look and

function. Prototypes put a visual face to the project by showing what data is available, how it will be used, and how it will be delivered.

Figure 1. The agile process: each release (1...N) cycles through build, scope, and prototype, with stakeholder reviews.

Principle #4: Value responding to change over following a plan

With an agile methodology, traditional BI projects that focus on huge project and resource plans are replaced by shorter development cycles designed to better incorporate changes and keep the project team focused and informed. For BI projects, changes should be expected and welcomed. When users see prototypes and gain a better understanding of what analytic capabilities and information are available, they are better able to communicate how they could use that information to make improved business decisions. Such revelations and ideas only strengthen the final product. An agile BI project still uses a plan, but its plan is short, manageable, and coupled with a prototype users can see and experience. Changes are jointly reviewed with business sponsors, users, and IT professionals at every project stage.

Example: An Agile Dashboard

To better understand how this methodology can be used, let's look at a real-world example of incorporating agile BI into a BI dashboard project for an executive sales team. The vice president of sales of a large manufacturing organization asked us to help his company gain better insight into its orders, shipments, and pipeline in order to hold the sales teams more accountable. Specifically, he wanted a dashboard that he and his executive team could use to meet accountability objectives. His vision for the dashboard was solid, and our role was to take that vision and boil it down to key metrics that would drive actions. After a few meetings with the vice president of sales and the IT sponsor (in this case, the IT director), we concluded that an agile BI dashboard project was the best approach.
We ensured we had the needed sponsorship from both the business and IT teams. In addition, we confirmed the organization was using a BI tool that was capable of delivering the desired solution. We advanced the project using a hybrid approach to agile development, breaking the project into three phases to quickly and efficiently develop the scope, build prototypes, conduct reviews, develop the solution, and implement it.

Phase 1

This was the foundational phase for our project and focused on the third agile principle (customer collaboration over contract negotiation). Phase 1 should last no more than one week and involves identifying, at a high level, the scope of the BI dashboard to ensure that the executive sponsors are engaged and the internal teams are assembled. Phase 1 is essential because it is used to narrow the scope and prioritize what can be delivered in the set time frame.

In the first week, we worked with IT and the vice president of sales to ensure that the team had the right people with the right skills who understood the project goals. We outlined roles and responsibilities, opportunity and vision, and the high-level scope, all standard practices for an agile BI project. We worked with the vice president of sales along with several key business users to identify the metrics of greatest value. We worked diligently to understand what metrics were needed and how they influenced business decisions. A dashboard metric isn't enough; we strived to enable users to respond to each metric to achieve the best business results. For example, we examined what happened after the dashboard highlighted a large gap between what the customer relationship management (CRM) application identified as a sales opportunity and the revenue actually generated. We asked questions about the process of capturing these opportunities in the CRM to better understand leading factors that could influence revenue. Delving into these questions ensured that we understood the full sales engagement process.

Figure 2. Dashboard prototype examples.

We didn't stop there. We identified about 10 metrics for invoicing, orders, shipments, and budgets across four different dimensions: business area,

product, customer, and date attributes. In our vision, the dashboard would allow the sales teams to focus more effectively on specific sales opportunities, better track budgets, and confidently predict and forecast sales throughout the year (and know where and how to make necessary adjustments). We held two meetings with the IT team to better understand the ability of the source systems to provide the data elements required.

The Phase 1 deliverables included a high-level vision and scope document that clearly set the stage for the rest of the project. By quickly defining the vision and scope as well as establishing a short time frame, we removed one barrier (long contract negotiations and timelines) so we could focus on having the right people involved and the right team defined. Phase 1 was completed in one week.

Phase 2

Phase 2 is where collaboration, rapid prototyping, whiteboard sessions, and interactive brainstorming take place. This phase applies three of the agile principles (individuals and interactions, customer collaboration, and responding to change). Phase 2 focuses on using prototyping methods in brainstorming sessions to quickly build and show business users how their ideas and needs are reflected in the proposed solution, sometimes iteratively and on the fly. The prototyping tool may be separate from your BI tool, but it must be able to demonstrate visual elements as well as drill-up, drill-down, and interactivity. This phase requires collaboration between the sponsors, key business users, and IT. A key benefit of this phase is that users see the data in action and will know whether the data is being presented in a way that effectively delivers the information they need.
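In data terms, "metrics across dimensions" of this kind boils down to grouped aggregation: each dashboard tile is one or two measures summed over one dimension. A minimal sketch with invented sample rows and hypothetical column names (not the project's actual data model):

```python
# Invented sample of order facts; all column names and values are
# hypothetical stand-ins for the project's metrics and dimensions.
import pandas as pd

orders = pd.DataFrame({
    "business_area": ["Utility", "Utility", "Industrial", "Industrial"],
    "product":       ["Cable",   "Rod",     "Cable",      "Rod"],
    "invoiced":      [120_000,   80_000,    95_000,       60_000],
    "budget":        [100_000,   90_000,    90_000,       70_000],
})

# One dashboard tile: invoiced vs. budget by business area, with a
# variance column to flag where teams are off plan.
tile = orders.groupby("business_area")[["invoiced", "budget"]].sum()
tile["variance"] = tile["invoiced"] - tile["budget"]
print(tile)
```

Swapping `"business_area"` for `"product"` (or any other dimension column) gives another tile from the same fact data, which is why a small set of measures can fan out into the roughly 10-metric, 4-dimension grid the team scoped.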
In fact, the process often gives users new ideas for using the information to make business decisions (see Figure 2).

In Phase 1 we created our vision and scope, outlined the business problem, and understood the set of metrics and dimensions necessary to reach the desired outcome. We approached Phase 2 with two goals in mind:

Collaborate with the vice president of sales and the sales teams to define the look of the BI dashboard and the data interactions required to populate it.

Work with the IT team to determine the data components and further understand what could be accomplished and delivered by the project deadline. (The overall project length was seven weeks, so we had only six weeks left.)

The collaboration sessions were held with the vice president of sales, several key business users, and individuals from the IT team. The meetings started as whiteboarding sessions. Once we completed the initial design, we built a prototype with phony (but business-sensible) data and set up daily meetings to review and refine our development cycles. In each session, we identified how and why metrics were to be used and outlined the decisions that would be made using the data. We evaluated different ways to display information so it would be most useful to users. We also mocked up the drill-through detail analysis and reporting that would be available via the easy-to-understand dashboard and made sure only a single path led to the detail at each level. The resulting dashboard prototype

had four quadrants, each of which was meant to answer a specific question: How are we performing today? Are we on plan, and what is our updated forecast? Where are we winning and losing? Who and what is not profitable?

One product category was captured in a separate data source outside the ERP, and tying it in was required to ensure we were capturing the full picture. This product was set to be coded in the new ERP. We decided to pull in and map this information from the separate data source and also put in place a process to map it into the new ERP when the time was right.

The mockup took the form of charts, regional maps, and dynamic and color-coded lists. It also included detailed drill-through paths and report examples to help guide users in making decisions. For example, a user could click on a troubled region on the map that identified a large revenue gap based on forecasting and get details on current activity within that region as well as open opportunities and win/loss details.

All in all, we held about 10 different business sessions and kept coming back the next day with a refined prototype to generate ideas. As a result, throughout the entire process, users were engaged, excited, and willing to participate in the sessions. They also felt confident that their needs were being addressed and their ideas and feedback were incorporated.

We uncovered more than 15 potential roadblocks to the initiative, and we worked through them all with the team. We kept everyone informed and made joint IT/business decisions to move forward, accomplishing this with daily status meetings with IT and the business subject matter experts to address issues quickly and outline resolutions. Sponsors and stakeholders were also part of weekly checkpoint meetings.
We simultaneously worked on the data components to map the vision to the actual data sources. To do so, we had to remove several roadblocks and make some tough decisions as a team (IT and business) in order to meet our deadline. As the team forged ahead, we uncovered several items that needed to be worked through as quickly as possible. A few financial metrics were not in the current ERP but would be implemented in an upgraded version, which was set to go live the following year. We worked with the business to outline the metrics and ultimately decided to put them on hold so that we could continue building the rest of the dashboard.

After we removed all our technical and business roadblocks, we completed Phase 2 and delivered the prototype dashboard, drill-through mockups, and a Lean Requirements document that captured the requirements and outlined the assumptions and decisions we had made. We also built a Lean Design document that described the database design, data mapping, reporting designs, and ETL construct. Phase 2 was completed in four weeks.

Phase 3

Phase 3 is the build phase and applies the second agile principle ("working software"). The foundation has been

set, the scope has been refined to ensure rapid delivery, IT and business are fully engaged, and now the time has come to take the prototype and construct it within the BI environment. Phase 3 should take no more than a few weeks and involves building integration and ETL procedures, security, and the BI solution itself.

At this point, with two weeks left, we began to build the required dashboard, drill-through reports, and supporting data layers. Building everything in dynamic prototypes made it much easier to ensure expectations were in line as development progressed. During this phase, we continued to show the results of actual development of the dashboard every two days to the business sponsor and key users. Throughout this process, changes were still submitted. We reviewed all changes, put them into one of two buckets (implement or put on hold), and made notes in our change control log. Some of the change requests that flowed through in this phase revolved around adding different relative time-period buckets for revenue and margin analysis, some minor layout changes, and three changes that were put on hold for future phases around customizing various alternate drill paths from the dashboard based on a user's business unit and region. Phase 3 was completed in two weeks.

Tips for Agile BI Success

In the end, the initial phase of the dashboard was released in seven weeks. The project was a success because of the agile BI processes applied to every aspect of the project. One of the core success factors was the use of prototypes and interactive sessions. Using prototypes enabled us to keep all players involved from the beginning and provided a forum to exchange ideas, discuss issues, and actually see the solution as its development progressed.
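The change control log used in Phase 3, with its two buckets (implement or put on hold), need not be a heavyweight tool. A minimal Python sketch follows; the field names and entries are illustrative assumptions, not a prescribed schema:

```python
import csv
import io
from datetime import date

# A minimal change control log: every request is recorded and routed
# to one of two buckets, "implement" or "hold" (illustrative schema).
LOG_FIELDS = ["logged_on", "requested_by", "description", "decision"]

def log_change(log, requested_by, description, decision):
    """Append one change request to the in-memory log."""
    if decision not in ("implement", "hold"):
        raise ValueError("decision must be 'implement' or 'hold'")
    log.append({
        "logged_on": date.today().isoformat(),
        "requested_by": requested_by,
        "description": description,
        "decision": decision,
    })

def to_csv(log):
    """Serialize the log so it can be shared at status meetings."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=LOG_FIELDS)
    writer.writeheader()
    writer.writerows(log)
    return buf.getvalue()

changes = []
log_change(changes, "VP of sales", "Add relative time-period buckets", "implement")
log_change(changes, "Sales ops", "Custom drill paths by business unit", "hold")
held = [c for c in changes if c["decision"] == "hold"]
```

Even a small log like this makes the implement/hold decisions auditable at the weekly checkpoint meetings.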
After reading the case study, perhaps you are now thinking, "Can organizations really implement these types of BI solutions in seven weeks?" You may be asking, "What about data governance, load procedures, ETL, business rules, capacity planning, and security maintenance?" The reality is that you must strike a balance when using the agile software development methodology for your BI initiatives. The process walks a fine line: you must build a solid foundation that has the longevity, speed, and strength to weather the dynamic and demanding needs of the business. The following ideas and concepts can help you implement an agile BI process.

Tip #1: Start small, think big

When you begin to build an agile BI solution, it doesn't matter if you have an enterprise data warehouse coupled with a large-scale, mega-vendor BI software stack or a small data mart managed with a niche tool. The key is to focus on the immediate business need and pain, then map that to the ultimate vision. Get the stakeholders to help define the vision and work with you to build out what it will look like. Once you have the vision, determine the best approach that completes the work quickly and keeps the long-term picture in mind. If you need to take shortcuts to get the work done, that's fine, as long as everyone approves the shortcuts and you have a process in place to close the loop at a future point. For example, if you have an ERP application and you have to group some of your sales data into a customized dimension (instead of modifying the ERP source of records) in order to deliver the BI solution, then do so, but ensure that you get approval and that everyone understands the costs and benefits.
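The customized-dimension shortcut in the example above can be as simple as a lookup table applied during the load, leaving the ERP source of record untouched. A minimal Python sketch; the product codes, category names, and field names are illustrative assumptions:

```python
# Map raw ERP product codes into a customized reporting dimension
# without modifying the ERP source of record. The mapping and the
# product codes below are illustrative, not from any real system.
CUSTOM_DIMENSION = {
    "HW-001": "Hardware",
    "HW-002": "Hardware",
    "SW-100": "Software",
    "SV-200": "Services",
}

def categorize(sale):
    """Return a copy of a sale row tagged with the custom category."""
    tagged = dict(sale)
    # Unmapped codes are flagged rather than dropped, so the loop
    # can be closed later, as the tip recommends.
    tagged["category"] = CUSTOM_DIMENSION.get(sale["product_code"], "UNMAPPED")
    return tagged

sales = [
    {"product_code": "HW-001", "revenue": 1200.0},
    {"product_code": "SV-200", "revenue": 800.0},
    {"product_code": "XX-999", "revenue": 50.0},
]
tagged = [categorize(s) for s in sales]
unmapped = [s for s in tagged if s["category"] == "UNMAPPED"]
```

A mapping table like this is also something the business can review and approve directly, which keeps the cost of the shortcut visible.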

Although you are building a specific solution, you can still take steps to ensure it is repeatable, scalable, and fits into your overall data architecture. For example, in our case study, we were building a specific BI dashboard solution that was focused on shipments, orders, and pipeline processes specific to the sales functional area. However, in creating the solution, we built a data design that could scale outside of sales by building conformed dimensions and process-driven fact tables. If, for performance reasons, we had to create summary or aggregated tables to support specific business areas, we made sure these mapped back to lower-grain fact tables for data consistency and detail analysis.

Tip #2: Remove the roadblocks

Whether you face IT challenges or other obstacles, work systematically to overcome them. Typical roadblocks you may encounter in an agile BI project include:

A narrowed scope. In some cases, it can be challenging to narrow the scope of a BI project so that a portion of the solution can be delivered in a shorter time frame. This is a slippery slope and requires the ability to prioritize and find common ground with business users and/or sponsors. If you can get the business sponsor to commit to a shorter time frame up front, it will be easier to narrow the scope. In addition, separate out the must-haves and the would-like-to-haves right away.

Data gaps. In any BI project, data gaps are typical as users may not fully understand how information is collected or data anomalies are discovered. Agile BI is no different, and data profiling is a necessary step. In our case study example, we encountered data gaps that we had to eliminate or overcome by accepting risk, leaving out components, or implementing a temporary fix.

Business commitment and time. Agile BI requires interaction with business stakeholders, sponsors, and users throughout the project's life.
Secure commitment up front with everyone and clearly outline the project's benefits in terms of effective decision making.

Managing expectations. There is often a gap between expectations of what it takes to deliver a BI solution and the time it actually takes. Users may believe that much more can be done in a short amount of time, which can cause extreme tension between IT and the business. Managing these expectations requires strong communication skills and an individual on the team who can effectively bridge IT and business users. This individual should understand data modeling and architecture, reporting and analytics, and dimensional concepts, and be able to articulate the challenges to business managers and sponsors in a language they understand.

Rogue development. Agile BI still follows a process and a method. There is still documentation and a plan; success metrics are still defined at the beginning of the project. Project management is still a core component in this process. We recommend that you still use the following tools, documentation, and processes to help guide the project:

A vision and scope document is used to define initial, critical success factors and get project approval.

A requirements document outlines the core business problems and key data elements, metrics, and dimensions that are needed for the BI solution. The difference from traditional BI development is that this document focuses on the smaller and shorter deliverables and keeps it lean.

A design document describes the database design, data mapping, reporting designs, and ETL constructs. Again, different from a traditional BI project, this design should focus on bringing the technical team together on the architecture and for future support without getting lost in too many details.

A project baseline plan for delivering a piece of functionality quickly, with the longer-term plan represented at a higher level.

A change control log to track which changes are implemented and which are put on hold.

An enhancement log to track enhancements that the team is unable to fit into the first release.

If you have obtained the right sponsorship at the start and ensured everyone has the same vision and understands the project, your ability to remove roadblocks will be improved. Inevitably, however, challenges will arise, so always keep one eye on the vision and one on the scope.

Tip #3: Engage the business

BI professionals sometimes get so focused on the technology that even after the initial meeting with business users they may flip back to thinking mostly about the tools and technology rather than the business's pains, needs, and objectives. In our case study, we used rapid prototyping and whiteboarding sessions to gather requirements and keep the right people involved and working in unison. We had daily brainstorming sessions to promote collaboration on the design, discuss the metrics and information needed to make business decisions, and show the BI dashboard prototype progress. From this, we built a requirements document that was focused on the key metrics and data elements, and we incorporated visuals from our prototype to ensure we had everything captured. Once we completed this phase and moved to the full development stage, keeping the key users involved continued to be highly important. During development, we still met at least twice a week to review progress and update our change control logs as we showed progress on the BI dashboard solution. The prototype became our guide to ensuring the development was on course and meeting all expectations.

Summary

As business becomes more dynamic and social in nature, BI environments need to be prepared to move fast and deliver value in creative ways. Intertwining BI best practices with the agile software methodology is one way to infuse speed, creativity, commitment, and value into any BI project.

Best Practices for Turning Big Data into Big Insights

Jorge A. Lopez

Jorge A. Lopez is senior manager, data integration, for Syncsort Incorporated.

Abstract

Big data is surfacing from a variety of sources, from transaction growth and increases in machine-generated information to social media input and organic business growth. What does an enterprise need to do to maximize the benefits of this big data? In this article, we examine several best practices that can help big data make a difference. We discuss the role that extract, transform, and load (ETL) plays in transforming big data into useful data. We also discuss how it can help address the scalability and ease-of-use challenges of Hadoop environments.

Introduction

Growing data volumes are not a new problem. In fact, big data has always been an issue. Fifty years ago, big data was someone with a ledger recording inventory; more recently, it was a bank's mainframe processing customer transactional data. Today, new technologies enable the creation of both machine- and user-generated data at unprecedented speeds. With the growing use of smartphones and social networks, among other technologies, IDC estimates that digital data will grow to 35 zettabytes by 2020 (IDC, 2011). These new technologies have turned big data into a mainstream problem. In fact, it's not uncommon to see small and midsize organizations with just a few hundred employees struggling to keep up with growing data volumes and shrinking batch windows, just as large enterprises do. The viability of many businesses will depend on their ability to transform all this data into competitive insights. According to McKinsey (Manyika et al., 2011), big data presents opportunities to drive innovation, improve productivity, enhance customer satisfaction, and increase profit margins. Although many CIOs and CEOs recognize the value of big data, they have struggled

(through no fault of their own) to handle this new influx of data. The problem isn't information overload; it's the failure to harness, prioritize, and understand the data flowing in. This is why data integration is a critical yet often overlooked step in the big data analytics strategy. Traditional IT approaches will not generate the results businesses expect in this era of big data. Therefore, IT organizations should look at the hype around big data as an opportunity to set a new strategy for harnessing their data to improve business outcomes. As a first step, organizations must examine their existing data strategies and ask:

Are these data strategies helping us achieve the objectives of the business?

Can our environment economically scale to support the requirements of big data?

Can our infrastructure quickly adapt to new demands for information?

To fully take advantage of new sources of information, organizations must cut through the buzz that big data creates. There are many definitions of big data, but most experts agree on three fundamental characteristics: volume, velocity, and variety. Another key aspect, often overlooked, is cost. Forrester, for instance, defines big data as techniques and technologies that make handling data at extreme scale affordable (Hopkins and Evelson, 2011). This touches on two critical areas that must be addressed to have a successful data management strategy: scalability and cost effectiveness. To scale data volumes 10, 50, or 100 times requires new architectural approaches to the data integration process. Doing so in a cost-effective way has been the biggest challenge to date for organizations. No matter what kind of IT environment you have or how you label your data (big or small), there are steps you can take to rearchitect and optimize your approach to data management, such as returning your attention to the data integration process in your quest for improved business insights.
Not All Big Data Is Important Data

Sometimes it's easy to get caught up in the hype about big data. However, trying to process larger data volumes can significantly increase the amount of noise, hindering your ability to uncover valuable insights. It's important to remember that not all data is created equal. Any big data strategy must include ways to efficiently and effectively process the required data while filtering out the noise. Data integration tools play a key role in filtering out the unnecessary data early in the process to make data sets more manageable and, ultimately, load only relevant data into the appropriate environment for analysis (whether that is a data warehouse, Hadoop, or an appliance).

Organizations can take three approaches:

1. Define a clear data strategy that identifies the users' data requirements. (Why do I need this data? How will it help me accomplish my business objectives?)

2. Build an efficient data model that is adequate to the demands of the business.

3. Have the right data integration tools to do the job.

Ultimately, the data integration tool is the critical component; it can help materialize the strategy and execute on it to build an efficient data model. The tool must have the right capabilities as well as the scalability and performance required to work effectively. A key component is the ease of use that allows developers to focus on business requirements instead of worrying about performance, scalability, and cost.
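The filter-early principle above can be expressed as a pipeline step that drops noise before any transform or load work is spent on it. A short sketch under assumed record shapes; the relevance rule is a placeholder, not a recommended business rule:

```python
def is_relevant(record):
    # Illustrative filter rule: keep only completed transactions above
    # a materiality threshold; everything else is noise for this
    # particular analysis. Real rules come from the data strategy.
    return record.get("status") == "complete" and record.get("amount", 0) >= 10

def extract_filter_load(source, load):
    """Apply the filter at the front of the pipeline so only relevant
    rows are transformed and loaded into the target environment."""
    loaded = 0
    for record in source:
        if is_relevant(record):
            load(record)
            loaded += 1
    return loaded

raw = [
    {"status": "complete", "amount": 250},
    {"status": "pending", "amount": 900},   # filtered: not complete
    {"status": "complete", "amount": 3},    # filtered: below threshold
]
target = []
count = extract_filter_load(raw, target.append)
```

Pushing the predicate ahead of the load is the same idea whether the target is a warehouse table, an appliance, or an HDFS directory.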

Bringing Data Transformations Back to the ETL Layer

Data integration and ETL tools have historically focused on expanding functionality. For instance, ETL was originally conceived as a means to extract data from multiple sources, transform it to make it consumable (by sorting, joining, and aggregating the data), and ultimately load and store it within a data warehouse. However, in today's era of big data, this strategy neglects two critical success factors: ease of use and high performance at scale. As IT organizations confront the accelerating volume, variety, and velocity of data by applying analytics, they have been forced to turn to costly and inefficient workarounds, such as hand-coded solutions or pushing transformations into the database, to overcome their performance challenges. The costs of such scaling approaches can outweigh their benefits. The best example is staging data when joining heterogeneous data sources. This practice alone increases the complexity of data integration environments and adds millions of dollars a year in database costs just to keep the lights on. As such, an Enterprise Strategy Group survey (Gahm et al., 2011) found data integration complexity cited as the number one data analytics challenge.

There are new approaches that don't require big budgets, however. To rectify this situation, we recommend bringing all data transformations back into a high-performance, in-memory ETL engine. This approach tackles four main issues:

1. Think about performance in strategic, rather than tactical, terms. This requires a proactive, not reactive, approach. Performance and scalability should be at the core of any decision throughout the entire development cycle, from inception and evaluation to development and ongoing maintenance. Organizations must attack the root of the problem with approaches that are specifically designed for performance.

2. Organizations must improve the efficiency of the data integration architecture by optimizing hardware resource utilization to minimize infrastructure costs and complexity.

3. Productivity gains can be achieved through self-optimization techniques, which means that little, if any, manual tuning of data transformations should be required. The constant tuning of databases can consume so many hours and resources that it actually hinders the business.

4. Cost savings are realized by eliminating the data staging environment, resulting in server and database maintenance cost savings; deferring large infrastructure investments with the efficient use of system resources; and gaining improved developer productivity because a considerable amount of time need not be spent tuning for growing data volumes, thus providing more time for strategic projects.

The high-performance ETL approach should accelerate existing data integration environments where organizations have already made significant investments and enhance emerging big data frameworks such as Hadoop. IT departments within several companies have initiated high-performance ETL projects to achieve a long-term, sustainable solution to their data integration challenges:

A storage industry pioneer and leading producer of high-performance hard drives and solid-state drives needed to improve its assurance process and inventory management with faster data processing of one million data records from its manufacturing plants. Using a high-performance ETL strategy, the company has reduced data processing times from 5.5 hours to 12 minutes and has released 70 percent of its data warehousing capacity to devote to analytics.

An independent provider of risk assessment and decision analytics expertise to the global healthcare industry needed to process and analyze terabytes of claims data per month to uncover risk-mitigation opportunities.
Through a similar approach, the healthcare analytics organization reduced processing

from 2 hours to 2.5 minutes. Further business growth could also be supported by reducing turnaround time for new customers being entered into the system from 5 days to 24 hours.

A mobile advertising platform company needed to quickly analyze large volumes of online activity data (such as views, clicks, and conversion rates) that was doubling every year in order to make important decisions (such as what ad space to bid on and when and where to place ads for customers). The business went from waiting one hour to obtain the information needed to adjust advertising campaigns down to 10 minutes. In addition, its two developers, who spent most of their time just maintaining the infrastructure, can now work on more valuable projects.

Hadoop gets its scalability by deploying a significant number of commodity servers. This way, the Hadoop framework can distribute the work among servers for increased performance at scale. Of course, adding commodity hardware running open source software looks like a more cost-effective proposition than adding nodes to a high-end, proprietary database appliance. However, the hardware required to cope with growing data volumes and performance service-level agreements can grow significantly. Therefore, it is not uncommon to find Hadoop deployments with a significant number of nodes. This elevates capital and operational costs because of hardware maintenance, cooling, power, and data center expenses. In addition, the required tuning involves hundreds of configurable parameters, making it difficult to achieve optimum performance.

The benefits of a proper ETL process with fast, efficient, simple, and cost-effective data integration translate into benefits across the entire organization, including operational, financial, and business gains, with the ultimate benefit being quicker access to cleaner, more relevant data to drive big data insights and optimize decision making.
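One way to avoid the staging step criticized earlier is an in-memory hash join of heterogeneous sources. The sketch below is a generic illustration of the technique, not any vendor's engine; the sources, fields, and key are hypothetical:

```python
from collections import defaultdict

def hash_join(left_rows, right_rows, key):
    """Join two heterogeneous sources entirely in memory, avoiding a
    staging table: build a hash index on one side, stream the other."""
    index = defaultdict(list)
    for row in right_rows:
        index[row[key]].append(row)
    for left in left_rows:
        for right in index.get(left[key], []):
            merged = dict(left)
            merged.update(right)
            yield merged

# One source might be ERP orders, the other CRM opportunities.
orders = [{"cust_id": 1, "order_total": 500},
          {"cust_id": 2, "order_total": 120}]
crm = [{"cust_id": 1, "pipeline": 2000}]
joined = list(hash_join(orders, crm, "cust_id"))
```

Because neither input is landed in a database first, there is no staging environment to license, tune, or maintain for this step; the trade-off is that the hashed side must fit in memory.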
Optimize Your Hadoop Environment

In today's world of mobile devices, social networks, and online data, organizations must massively scale data integration and analytics differently than before. According to Forrester (2011), despite the opportunity new data presents, organizations use only a small fraction of the data available to them. A new architecture is necessary to change both the performance and the costs, which is what is driving Hadoop, the open source framework for big data. Hadoop is designed to manage and process large data volumes. It presents several opportunities but also introduces challenges, including scalability and ease of use, that lead to siloed deployments with limited functionality, which is why Hadoop doesn't provide significant value by itself. Organizations should not expect to rely solely on Hadoop for all their needs; other tools and platforms need to complement Hadoop to optimize the data management environment for these data sets.

Such increased complexity is tied to ease of use, which is one of the major challenges facing nearly every organization working with Hadoop. Hadoop is not easy to develop for. For instance, adding new capabilities (such as reverse sorting) and coding MapReduce jobs is typically performed manually, which requires specific skills that are expensive and difficult to find. For many organizations, finding the skill set needed to manage Hadoop is the most significant barrier to Hadoop adoption. Organizations can overcome these challenges and extend Hadoop's capabilities, maximize its value, and simplify the overall Hadoop experience by integrating the high-performance ETL approach. This approach allows for sorting and organizing the data before it is pushed into the Hadoop environment by leveraging Hadoop Distributed File System (HDFS) connectivity and by creating MapReduce jobs in a separate graphical interface rather than writing Java or Pig scripts.
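Conceptually, sorting and organizing data before it lands in HDFS amounts to partitioning records by key and sorting each partition so that downstream jobs read organized input. A simplified, single-process illustration of that idea (not Hadoop's actual partitioner; the record fields are assumptions):

```python
from collections import defaultdict

def partition_and_sort(records, key, num_partitions):
    """Group records into hash partitions by key and sort each
    partition, mimicking the pre-organization an ETL layer can do
    before data is pushed into a cluster."""
    partitions = defaultdict(list)
    for rec in records:
        partitions[hash(rec[key]) % num_partitions].append(rec)
    for part in partitions.values():
        part.sort(key=lambda r: r[key])
    return partitions

events = [{"user": "carol", "clicks": 3},
          {"user": "alice", "clicks": 7},
          {"user": "bob", "clicks": 1}]
parts = partition_and_sort(events, "user", 2)
total = sum(len(rows) for rows in parts.values())
```

Doing this work outside the cluster keeps the MapReduce jobs themselves simpler, which is part of the ease-of-use argument made above.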
Data integration comes into play after analysis as well; the results of the analyzed data need to be reintegrated into other information systems. For example, comScore, a global digital information provider of online consumer behavior insights, saw its data volume increase 72 times per day within two years and deployed a Hadoop cluster to better manage the data processing.

However, it is challenging to bring Hadoop into an enterprise with heterogeneous operating systems. Moreover, Hadoop lacks critical features such as real-time integration and robust high availability. Therefore, comScore deployed a data integration strategy that groups and splits larger data files so they fit more efficiently into Hadoop, which provides a higher rate of parallelism on compressed files and reduces disk costs for the Hadoop cluster. This saved 75 TB of data storage per month and slashed processing time from 48 hours to just 6 hours, so comScore can now process twice the data each month (compared to a year before), allowing the company to provide its customers data insights faster.

Summary

Today's enterprises must make sense of the increasing volume, velocity, and variety of data while maintaining cost and operational efficiencies. Your business intelligence strategy must focus on optimizing the data integration process so it is fast, efficient, simple, and cost effective. This means ensuring you have all the right data at your fingertips by managing the volume and new sources of data, coupled with scalability as data requirements evolve. Quicker access to cleaner, more relevant data is what drives big data insights and what will truly lead your enterprise to faster and more profitable decisions.

References

Gahm, Jennifer, Bill Lundell, and John McKnight [2011]. "The Impact of Big Data on Data Analytics," Enterprise Strategy Group, research report, September.

Hopkins, Brian, and Boris Evelson [2011]. "Expand Your Digital Horizon With Big Data," Forrester, research report, September.

IDC [2011]. "The 2011 Digital Universe Study: Extracting Value from Chaos," digital iView report, June.

Manyika, James, Michael Chui, Brad Brown, Jacques Bughin, Richard Dobbs, Charles Roxburgh, and Angela Hung Byers [2011]. "Big Data: The Next Frontier for Innovation, Competition, and Productivity," McKinsey Global Institute, May.

Implementing Dashboards for a Large Business Community

A Practical Guide for the Dashboard Development Process

Doug Calhoun and Ramesh Srinivasan

Doug Calhoun is systems analyst, claims technology data and information delivery at Allstate Insurance Company. [email protected]

Ramesh Srinivasan is manager, claims technology data and information delivery at Allstate Insurance Company. [email protected]

Abstract

Dashboards are becoming more prevalent as business intelligence tools, and the reason is obvious: well-designed, accurate dashboards can quickly communicate important business indicators and trends and provide actionable information. However, creating and implementing a successful dashboard involves a great amount of work. It often requires implementing tight controls while allowing the flexibility needed to test and learn with the business. This article outlines tips for how to integrate these seemingly divergent processes as well as how to ensure the data accuracy, ease of use, and optimal performance that make a truly successful dashboard.

Introduction

The use of dashboards as a primary business intelligence tool is expanding quickly. When supporting a business unit fueled by data, how does an application team build dashboards that will provide great business value and be sustainable? There are many methods for doing this, as we will explain in this article. However, there are also certain fundamental principles that may seem obvious, but can be difficult to implement:

Engage business users, not just at the beginning and end of a project, but throughout the entire process. Make business users your partners.

- Involve the entire application team throughout your project's life. A factory-like approach of handing off tasks from phase to phase will not work well. Although design updates may require an iterative approach with business users, the number of components needed should drive the team to define phases and key deliverables early in your project to keep it on track.

- Sophisticated user interfaces are great, but in the end, it's really about the data. Ensure that everyone is in agreement about how to define the data from a business point of view, and create a plan for how to validate it.

- Ease of use is critical. Make sure your business partners get hands-on opportunities as often as possible.

- Design your technology based on the number and types of users. Performance and capacity should be considered when designing and building dashboards, much as they are with more traditional transactional systems.

This article is not intended to serve as a guide to visual design. That topic has already been studied extensively and successfully. We will discuss best practices for the process of creating a successful design.

In addition, the word "dashboard" is used here as a general term for data visualization tools showing at-a-glance trends and other indicators. It is not meant to signify the timing or refresh rate of the data, and could be used interchangeably with "scorecard" depending on how a business unit chooses to define it. In the business intelligence world, "dashboard" has become the most common term, so it will be used here with assumed broader connotations.

Another concern is process methodology. Many companies primarily employ a waterfall life cycle, which can be a difficult fit for a business intelligence implementation. However, a purely agile methodology for dashboards can also lead to trouble, as there are complexities with development and testing that require a certain level of more traditional phase-gating.
Essentially, the dashboard needs to be treated both as an application (with all the functional testing that requires) and as a mechanism for providing data, including iterative testing and prototype updates. A certain level of flexibility in your development process may be required to achieve a happy medium and ensure a successful rollout.

Depending on the size of your company, you may also need to leverage the assistance of other technology groups to implement. Where appropriate, involve groups such as your business intelligence community of practice or center of expertise; data, solution, and/or BI architecture; database administrators; all associated infrastructure/server administrators; change and release coordinators; and any other applicable groups you believe should be enlisted. Do this early.

All of this may require innovating your process, which might sound like a contradiction in terms to process methodologists but may have practical application to your work. The best practices below will guide you in the direction that best fits your project's needs.

Starting the Project
If you are embarking on a dashboard project for the first time, there are several rules of thumb you should follow at the project's outset. First, as with any project, you will need to define team roles and lay the groundwork for how the project life cycle will work. At the same time, you will need to

identify and engage all stakeholders and ensure both groups agree on expected outcomes. It is unlikely that you will be working on a dashboard project without a business case behind it, but getting a request from the business and truly engaging users as partners are two very different things. Although it can be easy to take orders and make assumptions, keeping the key business partners involved throughout the entire project life cycle, and beyond, is absolutely essential.

Finally, you will need some vital information before you begin. Some questions are obvious from a technical point of view. What are the data sources? Will data be stored separately? What tools will be used? What environment(s) must be built?

Other questions are just as vital, but may not be so obvious. For example, is the project feasible, especially as an initial effort? We recommend you limit the scope of an initial dashboard to a simple, straightforward first effort that has high business value. This way, a quick win is more likely, success can be attained early, and business trust will be earned as you learn the ropes.

You will also need to be sure that the project is appropriate for a dashboard or other visualizations. For example, if the business primarily wants to track how hundreds of their individual workers are performing, a dashboard is likely not the right vehicle. However, if they want to track how their offices are performing over a period of time, using standard, well-known measures within the company, then a dashboard may be the best option. (You can still consider getting to the individuals' detail, which we'll discuss shortly.)

The main lesson here, and throughout the early phases of your project, is to ask questions and keep on asking them! If something does not make sense or seems impossible, work with business users until you reach a mutually satisfactory agreement.
Once the project looks possible, list all your assumptions, whether business related, technical, or process/project based. You'll need this list to build an order-of-magnitude estimate, define the technical space you will be working within, and help business users understand their role during the project (and how crucial it is). Having everything in order even before detailed requirements are determined will give both you and business users confidence. After all, before you start involving them in detailed requirements meetings, they're going to want some idea about when to expect a finished product.

Finally, as you devise this plan, treat the dashboard as a full-blown application. Although the dashboard is built in the business intelligence space, it has both the complexity of a dynamic user interface (with the myriad possibilities of errors on click events) and the need for absolutely exact, gold-standard data. Both the data and the functionality will need to be tested thoroughly, as if you were developing a transactional application. If you release the slickest, most attractive dashboard your business has ever seen but the data is wrong or a button doesn't work, user confidence will quickly erode. Your dashboard may be pretty: pretty meaningless.

Consider the metrics and aggregations needed and what types of structures will be required to support your project. Depending on your company's standards, you might be using denormalized tables, dimensional tables in a warehouse (or a combination of these), an integration of detailed and aggregated data, OLAP cubes, or many other possible sources and targets. As with any BI solution, you need to choose the appropriate data model.

[Figure 1. Both business users and technical staff should be involved throughout a project's life cycle. The chart plots dashboard implementation effort by role (business sponsor; business champion/SME; analysts (technical); developers; testing team; architects/BI CoE/DBA) across the phases scope, data requirements, data and dashboard design, build and UI update, user testing, and implement and change navigation.]

The point here is that performance is paramount for user adoption. Document and agree upon functional requirements and data definitions while offering the flexibility of iterative testing and tweaking that a business intelligence solution should provide. It is critical to lock down the logic behind the displayed metrics early in the project. If that logic changes or is vague to everyone, there is little chance you'll deliver a successful dashboard.

Gathering Requirements
Involving business users in your work is crucial and most clearly needed early in a project, especially during requirements gathering and scoping. You may need to remind yourself to keep your business partners actively involved, because it's vital to your project's success! Multiple meetings will certainly be necessary, but make sure to keep users actively engaged via various methods, including whiteboarding at first, dashboard prototyping later, and sharing early data results. This will not only help hone the requirements, but also allow business users to feel they are truly partnering on the project. This will ensure that they know and trust what they will be getting.

In addition, the entire development team should be involved from inception through implementation to ensure nothing gets lost in translation along the way. See Figure 1 for a gauge of both business and technical involvement through a general project life cycle (regardless of the specific methodology used).
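As a concrete sketch of the data-model choice discussed above, a small star schema keeps dashboard queries fast by joining a pre-aggregated fact table to dimension tables referenced by ID. All table and column names below are hypothetical (a claims-style office-by-month aggregate), not taken from the article:

```python
import sqlite3

# A minimal star-schema sketch: one aggregated fact table keyed to
# dimension tables, so dashboard queries hit a small, denormalized set.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_office (
    office_id   INTEGER PRIMARY KEY,
    office_name TEXT,
    region      TEXT
);
CREATE TABLE dim_month (
    month_id   INTEGER PRIMARY KEY,   -- e.g., 201204
    year       INTEGER,
    month_name TEXT
);
-- Pre-aggregated metrics, one row per office per month
CREATE TABLE fact_office_month (
    office_id    INTEGER REFERENCES dim_office(office_id),
    month_id     INTEGER REFERENCES dim_month(month_id),
    claims_count INTEGER,
    cycle_days   REAL,
    PRIMARY KEY (office_id, month_id)
);
""")

conn.execute("INSERT INTO dim_office VALUES (1, 'Springfield', 'Midwest')")
conn.execute("INSERT INTO dim_month VALUES (201204, 2012, 'April')")
conn.execute("INSERT INTO fact_office_month VALUES (1, 201204, 42, 7.5)")

# The dashboard query joins the small aggregate to its dimensions.
row = conn.execute("""
    SELECT o.office_name, m.month_name, f.claims_count
    FROM fact_office_month f
    JOIN dim_office o ON o.office_id = f.office_id
    JOIN dim_month  m ON m.month_id  = f.month_id
""").fetchone()
print(row)  # ('Springfield', 'April', 42)
```

Because business attributes such as office name and region live only in the dimension tables, a rule change there flows through to the dashboard without restating the fact rows.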
The following best practices can help you avoid pitfalls during requirements gathering, even when the relationship with the business is good.

Know your user. It is possible that your business partner may represent only one part of the larger group using the dashboard, or may be assigned to a project and may not be an ultimate end user at all. Some users may have different business needs from your primary business partner. Make sure that you define all the groups of users who will have access to the dashboard, and ensure all of their voices are heard. This is not as easy as it sounds, but is worth the effort.

Define a use case for every component you build. There is no point in creating a dashboard component unless there is a direct use for it that can be easily defined and documented. Documenting the requirements is crucial to ensure business users get what they have asked for and so developers and testers have a clear guide about what they must build. You want to ensure that the use cases, and the data shown, will stay meaningful over time for each component; it is not a good idea to introduce new or rarely used metrics with a dashboard solution. Finally, require sign-off for all use cases, business requirements, and scope documentation you create. The scope should be limited to the business metrics and granularity of the data at this stage; visualization requirements can be developed later.

Know your data sources and plan your approach. You must understand both where the data initially resides and, if you use an extract-transform-load (ETL) or similar process, where it will eventually reside. If storing the data, you will need to know how it should be stored, how long the data will need to be available for access, and how often it needs to be refreshed. Especially if using ETL, three-quarters of your work will be spent on the analysis, load build and testing, and validation of the data. Even without ETL, our experience is that the majority of the time should be spent working with the data rather than building the front end. Given the visual nature of dashboards, it is easy to assume that the bulk of your work is spent building attractive, user-friendly interfaces. This is simply not the case with successful implementations, especially when so many easy-to-implement dashboard tool suites are available.

Include only trusted, known metrics whenever possible.
Exceptions may arise, but if metrics are well known, the exceptions will be much easier to validate. The sources of the data must also be trusted, and business users should be included in selecting data sources.

Know your refresh rate. Will the dashboard be loaded monthly, weekly, daily, hourly, or a combination of these frequencies? The fundamental dashboard design approach will depend on your answer to this question. Use cases will drive your design. Make sure you have thorough discussions about what is really needed versus what would be nice to have, because the more often the dashboard will be refreshed, the more support (and cost) it will require after rollout.

Identify all filters and selections. The earlier in the project's life you can do this, the better. This information has a major influence on your dashboard design and will affect decisions about performance and capacity. If a large combination of multi-select filters can be selected for one component, there will be a multitude of data combinations to validate and possibly many thousands of rows to be stored. Technologists can be tempted to impress their business partners, but be careful not to promise something that is not scalable or sustainable.

Understand what levels of aggregation and detail are required. An early requirements exercise should involve the filters and dimensions that will be used as well as how they should be aggregated. Time periods are a common dimension, as are office or geographical territories. On the flip side, sophisticated business users will inevitably want to know the details behind what is driving their trend or that one outlier metric. Not having a method of either drilling down to (or otherwise easily accessing) the detail behind the aggregation will frustrate users after the post-implementation honeymoon period has ended.
Determining aggregation/detail needs should be part of the discussions during requirements gathering, but remember to balance your requirements with development difficulty and desired timelines. If detailed data is provided, it should be accessed directly via the dashboard,

whether through sub-reports or drill-down capabilities in the components themselves, depending on your tool set.

Identify how much history you need. Some graphical trends will require year-over-year comparisons. Beyond that, it may be worth considering how long any data that no longer appears on the dashboard should be retained. If it does need to be retained for compliance or other purposes, an archival strategy should be considered, or possibly a view built on top of the base data. The more the dashboard can be limited to querying only the data it needs to display, the better it will perform.

Define the data testing and validation process. It is never too early to address how you will ensure data quality through a validation process. Defining specific responsibilities and expectations, and what methods will be used for validation, should happen even before design. This will also ensure that business users will be ready when they are asked to begin testing. The validity of the data is the most critical factor in the dashboard's success and continued use.

Integrate business users. There are several ways to involve business users in requirements gathering and refinement besides letting them dictate while you take notes. These options include:

Prototype early and often. Prototyping can start with simple whiteboard exercises, and many dashboard tools now lend themselves to quick prototyping so business users can see and play with something similar to the final product deliverable. This hands-on method is excellent for rooting out requirements gaps, although it should not replace formal documentation. Use real data wherever possible when prototyping to give business users a better context. It also helps you to identify and correct data issues early.

Integrate developers. Requirements gathering should not be done solely by analysts.
If there are separate individuals responsible for coding, they must be involved at this stage so they truly understand the value and meaning of what they will build.

Set expectations for production support. Agree upon a process for communicating user questions and any defects users discover. Depending on the user, this can be done many ways, although users at the executive level will likely prefer a direct communication path with the team's manager(s). Additional suggestions appear in the post-implementation section later in this article.

Define milestone deliverables. Regardless of the software development methodology you use, defining milestone deliverables is critical for instilling and retaining business confidence in the project. It is also necessary to ensure the development team is progressing as expected. Milestone due dates should be communicated early and deadlines met. If a deadline is at risk of being missed, share this information (as well as the reasons for the problem and the recommended course of action) with business users so new dates and deadlines can be agreed upon or so the team can remove items from the project scope or adjust resource levels and assignments.
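The aggregation and drill-down requirements discussed above can be sketched concretely: the dashboard reads a small pre-built aggregate, and a click event fetches only the detail rows behind one cell. The schema and names below are illustrative assumptions, not from the article:

```python
import sqlite3

# Sketch: the dashboard reads pre-aggregated rows; a click event drills
# down to the detail behind a single office/month cell. Names are
# hypothetical (a claims-style example).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE claim_detail (
    claim_id  INTEGER PRIMARY KEY,
    office_id INTEGER,
    month_id  INTEGER,
    amount    REAL
);
CREATE TABLE agg_office_month (
    office_id    INTEGER,
    month_id     INTEGER,
    claims_count INTEGER,
    total_amount REAL,
    PRIMARY KEY (office_id, month_id)
);
""")
details = [(1, 1, 201204, 100.0), (2, 1, 201204, 250.0), (3, 2, 201204, 75.0)]
conn.executemany("INSERT INTO claim_detail VALUES (?, ?, ?, ?)", details)

# Aggregate once during the load, not on every dashboard query.
conn.execute("""
    INSERT INTO agg_office_month
    SELECT office_id, month_id, COUNT(*), SUM(amount)
    FROM claim_detail GROUP BY office_id, month_id
""")

# Dashboard component: one fast read of the aggregate.
summary = conn.execute(
    "SELECT claims_count, total_amount FROM agg_office_month "
    "WHERE office_id = 1 AND month_id = 201204").fetchone()

# Click event: drill down to the detail behind that cell only.
drill = conn.execute(
    "SELECT claim_id, amount FROM claim_detail "
    "WHERE office_id = 1 AND month_id = 201204 ORDER BY claim_id").fetchall()

print(summary)  # (2, 350.0)
print(drill)    # [(1, 100.0), (2, 250.0)]
```

Keeping the detail in the same database as the aggregate, as the article recommends, is what makes this kind of targeted drill-down query cheap.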

Required deliverables from the business requirements-gathering phase may include:

- Scope lockdown, with documentation of what is in scope and out of scope.

- Final prototype with business sign-off. (Note: This remains a working prototype, and all team members must understand and agree that the design may change later in the project if practical.) The highest-level sponsor of the project should be part of this sign-off, as well as further sign-offs of the actual product prior to rollout.

- Detailed requirements definitions, including images from the prototype. Such documents can tie the business definitions of the metrics to the way they will be displayed. Such a connection will bring clarity both to the business client and to the developers/analysts building the solution.

- Technical conceptual design. This high-level document defines all data sources and targets, what delivery mechanisms are being used, and the general business use case(s) for the dashboard.

Designing and Building the Dashboard: Soup to Nuts
When dashboard design has begun, all layers should be considered in relation to one another. For example, if the dashboard will be connected to aggregated tables designed for performance, those tables, the way they are loaded (or otherwise populated), and any performance and capacity concerns should be considered. This is just as important as designing the dashboard functionality.

In general, the dashboard design should:

Ensure a single, consistent view of the data. This can apply to the visual look and feel as well as how often the components on a screen are refreshed. The user should not have to think about how to interpret the dashboard; the data presentation should be clear and intuitive.

Keep everything in one place.
If detailed data or supplemental reports are needed, use the dashboard like a portal or ensure a centralized interface keeps the data logically consolidated. Also, make sure the same data source is used for both detailed and aggregated data on the dashboard. Keep in mind, however, that business users may expect that a snapshot of the dashboard will not change. For example, a monthly metric could possibly vary slightly in the source data, but re-querying every time for the dashboard view with different results could erode confidence and even skew expected trends. Have a conversation with business users early on to discuss such scenarios and determine whether storing point-in-time dashboard snapshots will be required.

Understand the usage scenario. Knowing the size of the user base, as well as the types of users and when they will be accessing the dashboard, can drive design. You should understand the usage volumetrics early in your project and plan accordingly. You must also ensure that any maintenance windows do not conflict with peak-time use. Environment sizing, capacity, and performance will all be critical to ensure a stable tool.

Address multiple environments for development. If your environment has the necessary capacity, build development, test, and production environments. It's worth it.
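The point-in-time snapshot question above can be made concrete: persist each published dashboard view as snapshot rows, so a later re-query of drifting source data cannot silently change an already-reported number. The table and column names here are illustrative assumptions:

```python
import sqlite3

# Sketch of a point-in-time snapshot table: the dashboard reads published
# rows rather than re-querying a source system that may later be restated.
# Schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE dashboard_snapshot (
    snapshot_month TEXT,
    office_id      INTEGER,
    metric_name    TEXT,
    metric_value   REAL,
    loaded_at      TEXT DEFAULT CURRENT_TIMESTAMP,
    PRIMARY KEY (snapshot_month, office_id, metric_name)
)
""")

# Publish April's number once, at load time.
conn.execute(
    "INSERT INTO dashboard_snapshot "
    "(snapshot_month, office_id, metric_name, metric_value) "
    "VALUES ('2012-04', 1, 'claims_count', 42)"
)

# Even if the source later restates April, the published view is stable.
val = conn.execute(
    "SELECT metric_value FROM dashboard_snapshot "
    "WHERE snapshot_month = '2012-04' AND office_id = 1 "
    "AND metric_name = 'claims_count'"
).fetchone()[0]
print(val)  # 42.0
```

The `loaded_at` column doubles as the per-component "data refresh date" label that the article recommends displaying on every dashboard component.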

Plan to validate data accuracy as early as possible, and ensure your design and project plan allow this. To avoid rework, it is crucial to make every effort to get the data perfect and acquire sign-off in a lower database environment during user testing. This will ensure that the data acquisition process is free of bugs. At the same time, ensure that you validate using data from the production source system(s), because that data will be well defined and likely have an independent method of validation.

Roll out with historical data available. Plan on migrating all validated data to production tables along with the rest of the code. Implementing a dashboard with predefined history and trends will ensure a great first impression and enhance user confidence.

In addition to these areas of focus, consider several design best practices for both database/data quality and dashboard interfaces.

Database-Level Best Practices
Ideally, your dashboard will be running in a stable database environment. This environment may be managed by your team or may be the responsibility of another area of your company. Either way, your dashboard is meant to provide data for quick and meaningful analysis, so treating the data and the tables in which it resides with care is critical. Some best practices include:

Store the data using IDs, and reference static code or dimensional tables wherever possible. This way, if a business rule changes, only one table must be modified, and no data has been persisted to a table that is now outdated. Design and model the data so the front end can dynamically handle any business changes at the source level. This eliminates the need to update the code every time business users make a change, and maintenance costs will be much lower. The development team will then be able to work on exciting new projects rather than just keeping the lights on.

Detailed data should be kept separate and not reloaded anywhere, if possible.
However, it should be available in the same database so the aggregate and related detail can easily coexist.

Unless absolutely necessary, do not store calculated values or any data that is prone to business rule changes. If persisted data becomes incorrect, it can be a huge effort to restate it. Calculated fields can be computed quickly using front-end queries or formulas (if designed properly).

Create a data archival strategy based on business needs for data retention and how much history the dashboard needs to show.

Use ETL or other data acquisition methods to regularly write to a highly aggregated, denormalized table. This will ensure optimal performance, as dashboard click events need to be fast. A good goal is to ensure that no click event takes more than three seconds to return data to the dashboard.

Use predefined and well-maintained dimensional tables wherever possible. This ensures consistency and eliminates redundant data structures.

Ensure that any queries from the dashboard to the tables are well-tuned and that they will continue to run quickly over time. Likewise, ensure that any middle-tier environment used for running the dashboard queries is highly stable and can take advantage of any caching opportunities to enhance performance.

Dashboard-Level Best Practices
Spending a great deal of time on getting the dashboard data modeled, stored, automated, and correct will, of

course, all be for naught if the dashboard front end is not intuitive, does not perform, or otherwise does not have high availability. To address this, take these steps throughout the life cycle:

Test plans should include scripts for testing the overall dashboard load time as well as specific load times for all click events. This will afford the time needed to tweak code for optimal performance.

Check the dashboard usability by bringing in end users who were not involved in the initial project. Observe how quickly and easily they can meet their objectives, and remove all bias as you watch. You will need to plan for their participation well in advance, and this work should be done early in your testing (make sure to have production data at this point) so there is time to react to their input.

Within the dashboard code, implement dynamic server configuration so all dashboard components can automatically reference the proper environment for the database, middle tier, and front end itself. This reduces reliance on hard-coded server names and can prevent deployments from accidentally pointing to the wrong location.

Users will want to use Excel regardless of how well-designed your dashboard is. Make sure an Excel export option is available for all the data shown on the dashboard and any included reports.

For every dashboard component, include a label referencing the specific data source as well as the data refresh date. This simple step resolves confusion and will greatly reduce the number of support questions you receive post-rollout.

Do everything possible to avoid hard-coding filters, axes, or any other front-end components that change based on predictably changing business. The data and the front end both need to be flexible and dynamic enough to display information based on a changing business.
The dashboard should not have to display invalid or stale data for a time period while the development team scrambles to implement a production fix. That would inevitably lead to a drop in user adoption and reduced confidence in the dashboard's validity.

Near the end of testing, simulate a performance load test, whether you have automated tools to do this or you have to do it manually with multiple users. The purpose is to ensure no part of the underlying infrastructure has an issue with load.

Test boundary conditions to avoid unforeseen defects later in the project's life. For example, what happens when a multi-year trend goes into a new year? Will the x-axis continue to sort properly? Define all conditions like this and find a way to test each one.

Running the Project (and Subsequent Projects)
Considering the myriad complexities involved in implementing a dashboard, from ensuring correct data is available when expected, to designing a usable and innovative front end, to working with the business through multiple and complex requirements, costs can be high and timelines can easily be missed if the project is not carefully managed. The following procedures will help ensure a successful dashboard release, all in the context of the best practices explained so far:

Create and use an estimating model. The model should cover all aspects of a dashboard release (from data to user interface) and all the technical roles and resources that will be involved, and be sufficiently detailed to break down time in hours by both phase and resource type. A model

that can be defined by selecting answers to requirements-based questions will be the easiest for your analysts to use, such as: How many metrics and components will be displayed? How many data sources will be used? Does data for validation exist? The model should be refined after each large project by entering the answers to these questions and determining how closely the model's hours match those actually spent.

Data validation is your top priority. Plan and allocate the time with your business partners and understand what data sources you will use for validation. If there is no independent source, you and your business users must reach an agreement about how validation will be performed.

Share real data as soon as it becomes available and the team has reasonable confidence in its accuracy. There is no reason to wait to share data, regardless of how early in the process this occurs, because the earlier data defects are identified and resolved, the more smoothly the subsequent processes will go. As we've mentioned, we recommend you implement your project with historical data loaded. If this is planned, ensure that business users are aware and secure their pledge to spend adequate time comparing and validating the historical data.

Define phases of work and identify key deliverables for each. Regardless of the development methodology your department uses, you must align milestones to specific dates to ensure the project does not get out of control and to keep business users confident in your progress. Depending on your business client and their expectations, you may need to blend agile and waterfall methods. Although this will not satisfy ardent practitioners of either methodology, a blended approach can allow for the iterative testing and discovery that this type of work requires while ensuring adherence to a strict timeline, which a release of this complexity also requires.

Implementations are complex, so make a detailed plan.
The manager or lead of the project should define all the steps needed, assign dates and responsible parties, and build a T-minus document/project scorecard. These tasks should be completed during the initial stages of the work, soon after any intake approval and/or slotting, and the document should be reviewed with the entire team at least once a week to ensure the project is consistently on track.

Escalate all identified issues and risks early and often. If your department already has a process for bringing issues and risks into the open and to the attention of those who can mitigate them, use it. Otherwise, create your own process for the project. Enlist all stakeholders and technology leaders for support, and do this proactively.

Review, review, review. Plan multiple design and code reviews, and assume at least a draft and final review will be needed for each major piece of work. Devote ample time to design review, because waiting until the dashboard is built may make recovery impossible if a fundamental design flaw has gone unnoticed. Formalize a method for tracking and implementing all changes identified during reviews.

Keep the development team engaged. For example, if the development team includes offshore resources, record key meetings using webinar technology. This can serve as both knowledge transfer and training material later. Make sure everyone knows about the recording and ensure that no legal or compliance issues will arise.

Even though your work may be completed in phases, dashboards can rarely be efficiently delivered if a factory

approach is used (in which requirements are passed to designers, and designs passed to builders, without everyone being involved). When a developer is far removed from business users working on a dashboard project, this can lead to project failure.

Implement a formal user-acceptance testing process. Once the development team has completed all internal testing of data and functionality, plan time (we recommend two to three weeks) to allow the business team to complete their tests. Testing should include as much data validation as possible. We recommend you formally kick off the testing phase with business users and employ a documented process for submitting defects and questions to the development team. Make this easy for your business partners. They should focus on testing, not on how to submit their test results.

Require sponsor/stakeholder approval before rollout. This will give your dashboard legitimacy with the ultimate end users and is invaluable for those early weeks when adoption may hang in the balance. This approval should include a presentation during which the sponsor can view and provide feedback about the dashboard, with sufficient time allotted to make adjustments. As mentioned, we recommend you conduct sponsor reviews of the dashboard throughout the project, including during prototype design.

Post-Implementation (You're Never Really Done)
After the dashboard is implemented, team members are often tempted to relax. There may also be new projects demanding focus. Do not become distracted or complacent, because there are certain post-implementation steps that will ensure both that the few critical months after rollout go smoothly and that the development team does not become bogged down by production support or answering business questions.

First, build post-implementation work into the initial plan.
Sustainability and support should be factors in scope and technical design. For larger rollouts, consider having the sponsoring business group and the technology team handle presentations jointly. This way, both business and technical questions can be answered accurately, all key partners are included, and accountability is shared. Post-rollout sponsorship and change navigation coordination are crucial. The business unit will likely be responsible for communications and training, but the technology team can and should influence this.

If possible, ensure you have a method to collect usage metrics. If you can identify usage by user ID, that is even better: you can distinguish business from technology usage and identify groups for training if usage is lower than expected.

The development team can suggest and implement innovative ways to communicate with users:

Add a scrolling marquee to the dashboard or use some other technique for instantly communicating important messages. This component should be database driven, and the technical support team should have write access to a table separate from the dashboard's main tables. This way, announcements such as planned downtime or key data load dates can be easily delivered to all users.

Add an e-mail button that goes directly to the dashboard support team. This may not be a popular choice for all technology teams, but dashboards are

often used by upper-level managers who have no desire to call in tickets through a service center. They prefer direct and immediate access to the group that can resolve a problem. Create an internal process for ticket and defect handling, and implement bug fixes in small, bundled releases. Communicate the fixes to all users.

Build context-sensitive help directly into the dashboard. Dynamically displayed help can greatly increase usability as well as cut down on support questions. Help text should discuss how to use the dashboard components, but should primarily emphasize the business rules and definitions of the metrics themselves. The dashboard should be built intuitively enough that users are not confused about its use. A document attachment is a viable alternative to context-sensitive help, depending on the user base and what would best suit them.

However, implementing dashboards is not as magical as a flashy user interface might make it seem. The accuracy of the data, the timeliness of its updates, and the performance of user interactions are all as important as the visual design. To successfully implement all of these things, careful planning is required, as is a strong partnership between business sponsors and the entire development team. The dashboard should be easy to use but can be difficult to develop. By utilizing the methods we've discussed, you can improve both the process and results of dashboard development at your enterprise.

Continuously funnel large and small enhancements into subsequent releases to maintain momentum. No matter how well you succeed with the initial deployment, there will certainly be areas for improvement. Be open to user suggestions, even regarding component design or more traditionally technical items. Business end users are often great innovators you can learn from, and with advancements in Excel and business intelligence tools, they are becoming more technically skilled.
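The database-driven announcement component and the per-user usage metrics described earlier can be sketched with a small relational schema. The table and column names are illustrative only, and SQLite stands in here for whatever database the dashboard actually uses:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Kept separate from the dashboard's main tables so the support
    -- team can have write access without touching reporting data.
    CREATE TABLE announcements (
        message    TEXT NOT NULL,
        starts_at  TEXT NOT NULL,
        ends_at    TEXT NOT NULL
    );
    -- One row per dashboard visit, keyed by user ID so business and
    -- technology usage can be separated later.
    CREATE TABLE usage_log (
        user_id    TEXT NOT NULL,
        viewed_at  TEXT NOT NULL
    );
""")

conn.execute("INSERT INTO announcements VALUES (?, ?, ?)",
             ("Planned downtime Saturday 02:00-04:00", "2012-11-01", "2012-11-04"))
conn.executemany("INSERT INTO usage_log VALUES (?, ?)",
                 [("alice", "2012-11-02T09:00"), ("alice", "2012-11-02T14:00"),
                  ("bob",   "2012-11-02T10:30")])

# Marquee text: whatever announcements are active on the given day.
active = [row[0] for row in conn.execute(
    "SELECT message FROM announcements WHERE ? BETWEEN starts_at AND ends_at",
    ("2012-11-02",))]

# Usage counts by user ID, e.g. to find groups needing extra training.
usage = dict(conn.execute(
    "SELECT user_id, COUNT(*) FROM usage_log GROUP BY user_id"))
```

The key design point from the article is the separation: the support team writes only to the announcements table, never to the dashboard's reporting tables.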
Learn from each deployment, and continue improving and documenting your own best practices.

Summary

Dashboards can be incredibly useful tools for business users, offering at-a-glance indicators of key company measures, the ability to view trends over time, and filtering capabilities to drill down to areas that could impact key business objectives. Dashboards can offer great visual flair and a quick method to identify and understand a positive or negative business impact.

Data Government Models for Healthcare

Jason Oliveira

Jason Oliveira is the managing director of health systems consulting at Recombinant Data Corp.

Abstract

The U.S. healthcare provider industry, which represents roughly 17 percent of gross domestic product, is on the trailing end of the adoption curve of business intelligence (BI) approaches. Now that enterprise information management and analytic technologies are starting to become prevalent, healthcare providers need to reorganize their BI support services, resources, and data governance. Healthcare organizations are unique business entities that present challenges for optimally organizing governance, people, and services for next-generation BI. Learning from other industries that have adopted the concept of the business intelligence competency center (BICC), this article explores the available options and evaluates which service and organizational model best fits healthcare providers and similarly complex organizations.

Introduction

BI, performance management, and enterprise data warehousing have become more strategic in healthcare organizations. Hospitals, health systems, payers, home health, and physician practices are all struggling to find ways to manage and support BI deployments across multiple entities, departments, and functions, as well as to support the multiple missions of patient care, research, and academic medical education. This article explores what can be learned from industries that have adopted the BICC approach to the organization of services and what can be applied to healthcare organizations. Because form must follow function, we examine several of the unique operational realities of complex healthcare organizations. Given these realities, we identify governance and services organization models to consider.

Finally, given a set of optimal fit criteria, we discuss why a BICC is a good fit for the typical healthcare provider organization seeking to mature its BI disciplines.

Healthcare Realities

I have worked in the U.S. healthcare sector for my entire 27-year professional life. One of my professional objectives has been to learn from other industries and associations, such as TDWI, and apply that knowledge to benefit my healthcare-provider clients. Along the way, several realities of the unique business model and operational makeup of healthcare organizations have presented challenges to strategic BI efforts. These challenges may also exist in other industry organizations.

Variation in Constituencies

The typical healthcare organization is a manifestation of diverse missions and constituencies that are all hungry for data and actionable insights. The patient-care enterprise (i.e., the hospital) seeks to improve performance (better quality, safer care, for more people, at a lower cost/reimbursement) in the face of dramatic healthcare reform such as Obamacare. Researchers are advancing medical science through bench and translational research. All the while, we are training the medical students who will be the next generation of care providers within the halls of our patient-care business. These enterprises run the gamut of financial, supply chain, human capital, quality measurement, safety, production function, capacity, and throughput analytics that largely mirror the performance goals of any business entity in any industry.
However, healthcare is further colored by several unique realities, including the prevalent not-for-profit status; intense state and federal regulation; privacy laws restricting the sharing of patient data; an orientation toward public good over profit; and the independence of the professional workforce (that is, physicians are often not employees of the hospital, and clinical researchers are employees/faculty of a university, yet both practice on the hospital's patients). In addition, few healthcare organizations own the entire production function; they are a care community of many independent clinical professionals with little data shared across organizational boundaries.

This fragmentation of the healthcare production function is directly mirrored in the legacy of health-system BI solutions and services. Over time, different departments and functions representing different user constituencies (all too numerous to list here) have grown to support the data and analytics needs of their specific user constituencies. Each department, in turn, has its own data mart solution, analytical tool set, data collectors, data quality controls, master data, and analyst professionals supporting it all. Each user may also operate in multiple domains and may need to access multiple support services and teams. Doctors wearing their clinical-process-improvement hats need to go to the quality department. The same doctors conducting clinical research need to go to a School of Medicine research data administration team for support. They also need to manage their practices' revenue, costs, and productivity, and thus turn to yet another practice management analytics team for help and support. All the while, the same patient data that enables these three different use cases is duplicated and managed in silos.

Follow the Money

Another dynamic is that some constituencies have a place to go for support, but many do not.
Several user constituencies have revenue from large clinical service lines (cardiology and oncology, for instance), and therefore have the wherewithal to create their own data and analytical fiefdoms. In the absence of any enterprise services and solutions, an adverse consequence is that smaller departments and breakeven functions do not have the same access to required data and analytical resources and solutions that could be used to improve their performance. The performance of the entire organization suffers in this environment of haves and have-nots.

[Figure 1. Three tiers of data government: C-level executives (CEO, CFO, CIO, CNO, CMO) at the strategic tier (strategic alignment), a tactical tier (application architecture, data architecture, reporting and analysis), and an operational tier (constituencies, users, operations, and technology teams), connected by oversight/support/guide, collaborates-with, and reports-to relationships.]

[Figure 2. The benevolent monarchy form of data government: C-level executives oversee a business intelligence executive, who manages tactical teams (analytical applications, data architecture, reporting and analysis, informatics, privacy/security) collaborating with governance business partners and operations leaders, serving operational customers, operations, and technology teams.]

Data Government Models

Recognizing that the current state is not optimal, many health systems are striving to design a better way. They quickly discover the need for data governance to foster the enterprisewide recognition that data is an asset that requires rigor and discipline in the management of its life cycle across every use case and constituency. As shown in Figure 1, three tiers are universal, whatever the model of organizing your BI government:

Strategic: The alignment of BI to corporate strategy, goals, and objectives is embodied in some form and framework of governance.

Tactical: Much functional expertise is required to design, lead, guide, and participate in the building of the BI architecture and delivery of analytical services.

Operational: The tactical functions are applied to specific projects in tight integration with the user constituencies and operations of the organization.

The interaction between governance and services organizations becomes an exercise in how best to shuffle, delegate, and assign resources to the various boxes in the three tiers. When presented with a complex ecosystem of data management and analytics constituencies, interests, missions, and services, how can we best organize and govern ourselves for success? The next sections use the history of the formation of the U.S.
government as an analogy to describe three political systems for data governance and for providing analytical services to multiple constituencies.

Political System #1: The Benevolent Monarchy

In a benevolent monarchy, a dedicated and accountable executive manages a dedicated resource team: an enterprise BICC (see Figure 2). In this model, executive leadership or delegated governance bodies provide active support and guidance to ensure alignment of analytics with strategies. A BI executive (the benevolent monarch) takes a full-time leadership position as the BI and chief knowledge officer. This is not the part-time function of a CIO or other existing executive. The tactical production function in this model provides analytical data management and services for the entire organization, and reports to and is managed by the BI executive. This tactical team defines and drives standards and architecture, and therefore a de facto adherence to standards. Business partners, data stewards, and operational teams collaborate to execute data management/analytics projects to satisfy business requirements.

In short, a single, permanent function and leadership governs both the strategic and tactical layers of services and resources on behalf of the entire organization and its body politic. It is a concentration of oversight, resources, and enforcement of policies and procedures into a single, controlling, benevolent government body for all things related to BI.

[Figure 3. The independent confederacies form of data government: C-level executives and operations leaders provide enterprise BI governance and enterprise P&Ps over many independent tactical teams (analytical applications, data architecture, reporting and analysis) executing for customers, users, operations, and technology teams.]

[Figure 4. The federation of states form of data government: C-level executives oversee BI governance bodies (enterprise, clinical, research), which guide a BI competency center and tactical teams serving constituencies, users, operations, and technology teams.]

Political System #2: Independent Confederacies

The independent confederacies model (see Figure 3) can be thought of as virtual governance over many independent tactical teams and resources. Each constituency (i.e., research, clinical, quality, finance) thinks of itself as a state governed by its own constitution of choice in lieu of an enterprise governing or services function. Access to services and data across constituencies, as increasingly demanded by healthcare reform, needs to be negotiated, coordinated, and taxed for each project. Each confederacy has its own money (data), laws (data policies and procedures), and armies (data management and analytical services teams). Enterprise BI governance comes through the part-time contributions of operations leaders charged with strategic planning and establishing minimal enterprisewide data standards.
They seek to guide and influence the many operations, BI functional teams, and technology teams: a congress of the confederation. All standards and attendant responsibilities to implement them are distributed across the constituencies and their tactical teams. Access to the tactical teams is largely uncontrolled by demand governance and is negotiated project by project. The interaction with, and enforcement of, any enterprise data policies is more distant, making it more difficult to appreciate the enterprise good of data governance. Team members could deem it an overbearing interference by a big government that does not understand local requirements. Data stewards, if any exist, typically serve a single department and source system, versus addressing enterprisewide data issues. It should be noted that this model is the typical current state of many health systems and describes how they approach data governance and the organization of BI tactical teams today.

Political System #3: Federation of States

The third and final data government model, a federation of states (Figure 4), is characterized by a blend of virtual governance and a dedicated enterprise BI services team.

Criteria | Description | Monarchy | Confederacies | Federation
Ensures alignment with the states | Ability to represent, be responsive, and be the voice of disparate operations, needs, and missions | Low | High | Moderate
Ensures adherence to standards and policy | Ability to drive compliance and use of established data policy, standards, and solutions | High | Low | Moderate
Cultural fit | Suited for adoption by existing healthcare culture; may need to shift culture to realize full value | Low | High | Moderate
Systemness | Degree to which the model achieves economies of scale and avoids conflicting, disparate, duplicative consumption of resources and effort | High | Low | Moderate
Increases data services maturity | Experts, incubation, advanced analytics, and innovation responsively service users' information needs | Moderate | Low | High
Minimizes interference with existing structures | Degree to which reorganizing resources is deemed disruptive | Low | High | Moderate

Table 1. Criteria to evaluate the fitness of each data government model.

The emphasis is on integrating best practices across the functional silos and collaborating with the tactical teams through a BICC that concentrates user-facing access to analysis services. This enterprise federal function is, in turn, governed by an enterprise BI governing body. Within an existing enterprise governance structure (for example, IT or performance excellence), a BI governing body brings together executives and operations leadership charged with strategic planning and guiding the BI architecture development and services. They collaborate with the other subject area governance bodies to ensure alignment of BI solutions and services to various business needs. A BICC takes on operations-facing services such as architecture design, a chief data steward, skill-set development, and other shareable services. The goal is to drive enterprise design, economies of scale, consistency, and enforcement of enterprise data policies.
The domain-specific tactical teams (the states) retain application, data mart building, project management, and domain-specific analytical support for their unique and local constituencies. This data government model mirrors the federation-of-states approach of today's United States of America. Enterprise governance and the BICC act as a federal government that provides laws, regulations, policy enforcement, and a common, shareable foundation of enterprise data assets and services. The many tactical teams, in turn, deliver constituency-specific implementations of data and analytical services by tapping into the shareable enterprise resources, funding, and expertise.

A New Order for a New Age

Table 1 describes how each of these three data government models aligns with different fitness criteria. If driving standard enterprise policies for data and analytics is paramount and achieving a pure theoretical state of economies of scale is important, then the benevolent monarchy approach works best. There are several examples of this model in U.S. healthcare, particularly in command-and-control cultures such as for-profit hospital chains and faith-based systems. The model will need to navigate the politics of the different constituencies and seek to keep divergent fiefdoms happy in terms of responsiveness and delivering

domain-specific expertise and solutions. This task is easier said than done, unfortunately, given the realities of many U.S. healthcare organizations with multiple missions and a workforce of independent professionals.

If it is paramount to have perfect alignment with the many different constituencies within a complex portfolio of funding streams, and sensitivity to arms-length relationships with multiple independent organizations, then the confederacy model is best. It introduces the least amount of conversion trauma while seeking to collaborate and cooperate toward shared enterprise objectives. Adopting this model will mean suffering the inherent inefficiencies and higher total cost of ownership of duplication, negotiating politics, and divergent data and analytical architecture fiefdoms.

If an organization recognizes that the strategic response to healthcare reform requires far more intense collaboration and integration across once-independent domains while retaining what makes the individual domains so good at what they do, then the federal government model approach works best.

Conclusion

In my role as a strategic business advisor who facilitates my clients' consideration of these governing models, I have found that it is often self-evident (to borrow a phrase from the Declaration of Independence) that the federal model, although imperfect, is the optimal fit.

Where should the BICC live within the organization? It might be an extension of the IT function as an enterprise architecture team concept; it might report to the CEO outside of IT or to the chief medical officer; or it might have a dual-reporting matrix function that ensures fairness (real or perceived) in support of different constituencies. Should existing analysts be moved from multiple departments into the new BICC, or does the BICC require all new hires along with the attendant increased costs?
These are just a few of the questions and permutations that must be designed and aligned with an individual organization's culture, legal formation, diversity, and existing governance and analytical teams.

Healthcare organizations are responding to a convergence of environmental drivers, including healthcare reform, lower reimbursement, accountability to coordinate care, and caring for chronic diseases. These require a new kind of business intelligence that is inclusive of integrated enterprise data assets and new types of analytics that cross multiple domains of research, clinical excellence, cost containment, and revenue integrity. The old order of divergent, independent, and duplicative analytical solutions and services teams is no longer tenable. The BICC adopted by other industries, in conjunction with mature enterprise data governance, is the new order for a new age. Modern healthcare organizations need to organize and govern themselves as a federation of states.

The challenging part of redesigning the data governance and services organization is the actual manifestation of the model with real people, real reporting relationships, and real politics. Where should the BI governance committee/council reside? It could be something new; formed within an existing governance body such as IT, quality/clinical transformation, or performance excellence; or an additional agenda item for an existing executive council (requiring no new governance body).

BI Q&A

Gaming Companies on the Bleeding Edge of Analytics

Two experts discuss the use of analytics by gaming companies, which collect huge quantities of data about players and then use it in real time to shape their products on the fly.

David Loshin is president of Knowledge Integrity and a recognized expert in information management, information quality, business intelligence, and metadata management, topics on which he writes and speaks frequently. [email protected]

Ellie Fields is senior director of product marketing for Tableau Software; she also speaks frequently at industry events on business intelligence and data journalism. [email protected]

Linda L. Briggs

Analytics experts David Loshin, with Knowledge Integrity, and Ellie Fields, with Tableau Software, discuss an industry that is using analytics effectively on huge quantities of data. "The thing that gaming companies do that other companies can really learn from," says Fields, "is they continually gather and analyze new types of data. They aren't limited to a fixed set of reports or analyses. They're always dealing with new phenomena in new ways."

Business Intelligence Journal: David, you and Ellie recently spoke in a TDWI Webinar on Data Analytics and Interactive Gaming: Opportunities to Influence Consumer Behavior (tdwi.org/webcasts/2012/05/data-analytics-and-interactive-gaming). That's somewhat of a departure from your standard topics. As an expert in information management, what about that subject caught your attention?

David Loshin: It was based on my experience watching my children playing some online games. We have a room set up with a few computers that are intended for homework, but my children also use them for playing games, and I was observing how they interacted with the games as well as how they interacted with each other in real life.
I watched as they played and saw how many decision points existed in the game that could be used to enhance the experience, such as moving to different places on the play map, choosing different parts of the game to play, how points were scored, and the types of skills necessary to succeed in each part of the game. I also noticed a lot of barren real estate that could be used for messaging (perhaps advertising is too strong a word when you are targeting eight-year-olds). The whole experience got me thinking about other play experiences and the types of data that could be collected and analyzed.

Ellie, are gaming companies ahead of the curve in many ways when it comes to using analytics?

Ellie Fields: In fact, gaming companies are on the bleeding edge of working with analytics. It's because they are working with a type of data, in-game analytics, that never existed before. Therefore, that data doesn't have a defined format or report. In-game analytics data is hugely valuable for helping gaming companies design better games,

monetize games, and upsell and cross-sell new games. It's a treasure trove, and because gameplay data includes every interaction a user has with a game, that data can be massive and is growing fast.

Backtracking a little, what's the definition of an interactive game, and who is playing these games? Is it mostly kids?

David: I am probably not the person to strictly define what is or is not an interactive game, although I would say that pretty much all games are interactive. However, in the context of the kinds of games my children play, they tend to involve a play space that is mapped out with different areas of focus, in which the player takes on a role (thus joining many other players doing the same). They move around on the map, interacting with parts of the game and with other players. There is definitely a social aspect to it. The environment is designed to test players' skills, to let them fail or succeed, and to move them along to other parts of the game. In some games you are matched with, and play against, other opponents. Although kids make up a large part of the game-playing community, I'm guessing that there are many adults spending a lot of time playing these games as well!

Ellie, is there something gaming companies in general have in common, including huge quantities of data being collected and analyzed in real time?

Ellie: Yes, absolutely. We can talk about the similarities in questions they want to answer. Who are my best users? What makes my game sticky, or are there aspects that cause users to get frustrated and abandon it? What are the trigger points for a user to share my game or invite friends? How do I increase those? What is my most successful monetization strategy?
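One common way to quantify the "sticky" question Ellie raises is the ratio of daily to monthly active users (DAU/MAU). This sketch assumes nothing about any vendor's schema, just a simple event log of (user, day) pairs:

```python
from collections import defaultdict

def stickiness(events, month_days):
    """DAU/MAU: average daily active users divided by monthly active users."""
    daily = defaultdict(set)   # day -> set of user IDs active that day
    monthly = set()            # every user active at any point in the month
    for user, day in events:
        daily[day].add(user)
        monthly.add(user)
    avg_dau = sum(len(users) for users in daily.values()) / month_days
    return avg_dau / len(monthly)

# Toy gameplay log: one player returns every day of a 30-day month,
# another shows up only once, so stickiness sits just above one half.
events = [("p1", d) for d in range(1, 31)] + [("p2", 5)]
ratio = stickiness(events, month_days=30)
```

In production this aggregation would run over hundreds of millions of event rows, but the metric itself stays this simple.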
All of these questions, and potentially dozens more, can be answered with gameplay data, which, as we've been discussing, can easily be hundreds of millions or billions of rows. The closer to real time you can answer these questions, the better offers you can give your users.

Tying all this to business intelligence, what kinds of analysis are you seeing among customers, Ellie? What kinds of things do gaming companies want to track?

Ellie: Several types of analyses are becoming critical for gaming companies. We've mentioned gameplay analysis, which helps game designers make more interesting, stickier games. Game companies are also very focused on monetization analysis, which may include cross-sell across games, up-sell to premium versions, and the revenue generated in a game. Finally, gaming companies are very interested in social analysis, answering questions such as "Who is most likely to share and pull new users into the game?" and "When and why do they do that?"

What are the opportunities for getting value from online games using BI?

David: Players interact via a variety of transactions with the game and among themselves, and engage with aspects of the game. Within the game, players will maintain and use an inventory of items acquired or purchased within the game, establish links with other players, and self-organize into communities of interest. However, you have to think in terms of what the game companies want to achieve: improved performance and a richer player experience (to drive stickiness and virality as well as influence player behavior). Ultimately, of course, gaming companies want to drive revenue,

either through upselling services and in-game purchases, advertisement revenue, or through the development of information products that can be sold to other businesses.

Can non-gaming companies learn something from how gaming companies are using BI and analytics? It seems as if we are talking about some leading trends in data management: big data and a need for fast analysis.

David: I think you hit the nail on the head. This is an environment in which there are millions of simultaneous play transactions incorporating data about individuals, actions, locations, times, durations, and ultimately decisions that players make regarding their interaction. There is a lot of data being streamed in real time from many sources, and this needs to be merged with existing player profiles to do real-time delivery of actionable knowledge directly into the operation of the game.

Ellie: I completely agree with David. The thing that gaming companies do that other companies can really learn from is that they continually gather and analyze new types of data. They aren't limited to a fixed set of reports or analyses. They're always dealing with new phenomena in new ways.

How do gaming companies profile players? What are they looking for? Using that information, how can gameplay behavior be influenced?

David: Certainly, these companies look to create archetypal profiles for players and use statistical results from historical behavior to predict the probabilities that a particular player in a specific scenario will make a specific decision. These are typical data mining applications (segmentation, clustering, market-basket, decision trees) that can be combined with the transactions of play to enable the generation of customized offers within the game.
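A minimal sketch of the clustering idea David mentions, segmenting players on made-up features (sessions per week, in-game spend) with a tiny deterministic k-means. Real profiling pipelines would use far richer features and production libraries; this only illustrates the mechanic:

```python
def kmeans(points, centroids, iterations=20):
    """Plain k-means with fixed starting centroids, so the run is deterministic."""
    for _ in range(iterations):
        clusters = [[] for _ in centroids]
        for p in points:
            # Assign each player to the nearest centroid (squared distance).
            best = min(range(len(centroids)),
                       key=lambda i: sum((a - b) ** 2
                                         for a, b in zip(p, centroids[i])))
            clusters[best].append(p)
        # Move each centroid to the mean of its cluster (keep it if empty).
        centroids = [
            tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters

# Hypothetical player profiles: (sessions per week, dollars spent).
players = [(2, 0), (3, 1), (1, 0),        # casual players
           (20, 50), (25, 80), (18, 40)]  # highly engaged spenders
centroids, clusters = kmeans(players, centroids=[(0, 0), (30, 100)])
```

The resulting clusters are the "archetypal profiles": each centroid summarizes a player segment that can then be targeted with customized in-game offers.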
The same ad placement algorithms that retail vendors use can be incorporated within the game to post ads within the game's layout, such as a billboard on a building.

Given all the information collected, a great deal of it about underage customers, how do game companies ensure that players' private information is not being exposed as part of the analysis?

David: This is a good question and one that I have raised many times in many different situations. Even when gaming sites have privacy policies, there is always a risk that the company is not properly safeguarding the data. However, if they say that the data is used within the constraints of what is set forth in the privacy policy, we can hope that they are protecting personally identifiable data. At the same time, few people actually read through those policies and terms of use. They might be surprised to see how loosely defined the privacy obligation is. For the most part, the objective is not different from any other interactive environment, such as a retail site or a search engine. They are always collecting data. The question is whether these companies, like any others, are good citizens when it comes to data protection. I would hope that most companies would act ethically because not doing so, of course, would ultimately negatively impact the customer experience.

Linda Briggs writes about technology in corporate, education, and government markets. She is based in San Diego. [email protected]

Offloading Analytics: Creating a Performance-Based Data Solution

John Santaferraro

John Santaferraro is vice president of solutions for ParAccel Inc. [email protected]

It's a big data world and getting bigger every day. Everyone acknowledges the potential benefits of this data explosion, but the growing mountains of information can also overwhelm existing data warehouse infrastructure. Unless companies can effectively use the new data to gain greater insight, agility, and competitiveness than they could achieve with less data, big data means nothing more than increased infrastructure and storage costs and tremendous disappointment.

The real center for value creation in big data is analytics. Unfortunately, efforts to run analytics on large data sets are choking existing data warehouse technology, which was designed for reporting, dashboards, and static analysis. This bottleneck tends to limit business analysts' access to the system as well as how quickly they can find the answers they need.

The answer is a straightforward and standards-based approach to accelerating performance. New technology, designed with analytic workloads in mind, makes it possible to offload analytics from the data warehouse to improve performance in both environments. The deployment of a performance-based data solution can free analysts to drive greater innovation and maximize investments in big data.

Challenges of the Old Technology: Analytics Backlog

Existing data warehouse technologies suffer from four key weaknesses that are leading to significant reporting and analytics backlogs at large enterprises, even without the introduction of big data initiatives.

First, many of the databases being used for data warehousing were originally built for online transaction processing, that is, for performing simple workloads on flat data. Although they have been tweaked over the years to work for reporting and static analytics, databases still have architectural limitations. For example, running complex analytics requires databases to scan every row of their increasingly massive data sets over and over again. As a result, they simply don't have the processing power and throughput to handle today's complex queries and workloads. This means that analysts are constantly waiting for long periods of time for queries to run, and they often have to settle for data sampling and aggregation, which results in suboptimal or even inaccurate results.

Second, making frequent changes to support the analytic discovery process is time-consuming and resource intensive. It requires constant interaction between data administrators and business analysts to determine the right magic for modeling the data and tuning the database. This severely limits analysts' ability to perform quick-turn, ad hoc, and on-the-fly queries, that is, to get the answers they (and their company executives) need when they need them. That means when timely response is critical for businesses to dynamically react to new risks and opportunities, it is almost impossible for them to make fact-based decisions. The result is often poor decision making and excessive risk.

Third, most data warehouse technology was not designed to handle new types of data. Big data isn't simply about a larger amount of the same data that companies have been collecting all along. It's about new sources of data that represent new opportunities. It's about mixing structured and unstructured (polystructured) data. It's about affinity and sentiment campaigns, as well as enterprise and portfolio risk management. It's about demand signaling, personalized marketing, micro-segmentation, network optimization, fraud protection, health management, supply-chain optimization, warranty and RFID analytics, smart meter analytics, digital media analytics, and so much more.
Bringing all the new data sources and data types into existing data warehouses presents administrators with significant difficulties that often result in long (and sometimes endless) delays.

Fourth, even though the price of processing, memory, and storage continues to drop, today's data warehouse platforms are expensive to scale. Even without huge new analytic workloads, these systems are already stressed by current requirements for reports, dashboards, and analytics (on relatively small data sets and run by relatively few analysts). In most environments, 20 percent of the workloads take up 80 percent of the data warehouse resources, and that 20 percent tends to be centered on analytics. Because impacts to operational reports and dashboards compromise daily business operations, the more intensive, next-generation analytics are constrained or crowded out when system resources reach their limits. Running complex analytics on huge new data sets would simply bring the system to a halt.

One possible solution to these challenges is to add hardware, software, and labor to achieve the needed acceleration. Although this approach may give analysts what they need, it is hugely expensive and not sustainable in the long run.

Analytics Offloading

Analytics offloading is a strategy for moving analytic workloads from data warehouses and large data marts to a separate analytics database. New analytics platforms have been designed to handle complex analytics and large amounts of data. An offload strategy allows the data warehouse to do what it was designed to do while bringing in the power of new platforms to handle big data analytics. Because most new platforms are designed to scale linearly, it makes sense to utilize them for growing data sets and increasingly complex analytic computations. Like new big data platforms, many of these analytics platforms are built to run on industry-standard hardware.
This commodity approach lowers the overall cost of data management by allowing workloads to migrate to the platforms that handle them best.
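The migration principle can be sketched as a simple dispatcher that classifies each workload and routes it to the platform best suited to it. The workload categories and routing table below are illustrative assumptions, not a prescribed product design:

```python
# Hypothetical workload router: each job goes to the platform that
# handles it best (categories are illustrative only).
ROUTES = {
    "report": "data_warehouse",             # reporting, dashboards
    "static_analysis": "data_warehouse",    # fixed question sets
    "complex_analytics": "analytics_platform",
    "data_mining": "analytics_platform",
    "batch_filter": "hadoop",               # massive feeds, archiving
    "text_analytics": "hadoop",
}

def route(workload_type):
    """Return the platform that should run this workload."""
    try:
        return ROUTES[workload_type]
    except KeyError:
        raise ValueError(f"unknown workload type: {workload_type}")

print(route("complex_analytics"))  # analytics_platform
print(route("report"))             # data_warehouse
```

In practice the classification would come from query metadata or a workload manager rather than a literal lookup table, but the design intent is the same: no workload should land on a platform by default when a better-suited one exists.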

Inherent in the offload design is the development of different nodes for different loads. This enables an enterprise to offload workloads to:

- Staging nodes for fast discovery, extraction, transformation, cleansing, and compression of diverse source data types
- Reporting nodes for fast intake, efficient storage, and flexible administration and governance of structured data, metadata, dimensions, and hierarchies
- Analytic nodes for concurrent access, flexible exploration, and fast queries of conformed subject-area data

Ideally, the analytic nodes should run a purpose-built analytics database that optimizes data retrieval for analytics processing efficiency. Features might include:

- A columnar orientation (to eliminate repeated scans)
- Compression (to maximize memory efficiency and the effective scan speed)
- Query optimization (parsing complex queries to run efficiently over multiple nodes)
- Inline data acquisition (to bring in new types of data during each query)

To support the iterative nature of the analytics process, these nodes should also make it easy to rapidly spin up and spin down analytic data marts on demand, both on and off premises and in both physical and virtual environments.

Minimizing the need for administrative intervention is absolutely critical to the strategy. Businesses today are wasting huge amounts of time, money, and effort modeling data and administering the database. Current systems require an excessive amount of time and effort to create indexes, materialize views, and fine-tune optimization engines to run complex analytics. In-house experts or outside consultants must constantly make adjustments to the data warehouse. Purpose-built analytics databases must eliminate these constraints by storing data so it is immediately accessible for running queries without tuning the database or tweaking the front-end SQL tools.

Finally, when creating a collaborative analytics environment, the analytic nodes must support industry standards such as ODBC, JDBC, and ANSI SQL to support a wide variety of data integration and business intelligence (BI) tools. It's important that integration be as seamless as possible, especially for companies that have already standardized on ETL and front-end BI platforms.

When architected properly, the analytics offload platform serves as a foundation for a performance-based data solution that can scale to serve the analytic needs of the entire enterprise. Such a solution has been discussed by Gartner's Mark Beyer (the "logical data warehouse") and Enterprise Management Associates' Shawn Rogers (the "hybrid data ecosystem"). Both writers call for creating a larger data ecosystem composed of the data warehouse, an analytics platform, and a big data platform (Hadoop), with connectivity, integration, and interaction among the parts. (See References for more information.)

Four Principles

With this new data ecosystem in mind, here are four principles for architecting a data solution that can take optimal advantage of analytics offload and deliver the greatest benefits to the enterprise.

Principle 1: Allow for Interaction and Collaboration

The goal of separating analytics and big data from the data warehouse is to allow each platform to accelerate and optimize the activities for which it is best suited. However, any performance gains made on each platform can be nullified if the solution isn't architected for fast, seamless interaction and collaboration among the parts.

For example, there must be real-time synchronization between the data warehouse and the separate analytics platform to ensure consistent results between the two. A query run on the analytics platform must accurately mirror a report run on the data warehouse.

There must also be on-demand capabilities between the two platforms. For example, analysts running a query on the analytics platform must be able to pull the needed data from the data warehouse in real time. In the same way, managers running reports from the data warehouse must be able to query the analytics platform and bring the results into their reports. This can be accomplished through the open APIs and connectors provided by vendors. If vendors have not provided open APIs, the integration effort can be more difficult to maintain.

The analytics platform must also integrate and collaborate with the big data platform. For example, suppose an analyst is running a complex analytic query, part of which depends on the results of a sentiment analysis of data stored in Hadoop. In a properly architected system, the two platforms would share the workload. Hadoop would run the sentiment analysis and feed the results back to the analytics platform, which would then deliver the results to the analyst, along with corporate customer analytics. Such collaboration requires the same kind of on-demand capabilities that should exist between the analytics platform and the data warehouse. Each platform should be able to pull the most up-to-date information from the other platform at the time of the query.

As we've discussed, a fundamental capability of the analytics platform is the ability to make changes quickly without the intervention of an administrator. This must remain true in the interaction between platforms.
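A minimal sketch of this on-demand pattern, assuming hypothetical connector objects for each platform (no specific vendor API is implied): the analytics side pulls fresh rows from the warehouse at query time instead of relying on a stale local copy.

```python
# Illustrative federation layer: the analytics platform fetches the
# needed warehouse rows in real time, then computes locally.
class Warehouse:
    """Stand-in for the data warehouse connector (hypothetical)."""
    def __init__(self, orders):
        self.orders = orders  # conformed order records

    def fetch(self, customer_id):
        return [o for o in self.orders if o["customer"] == customer_id]

class AnalyticsPlatform:
    """Stand-in for the separate analytics database (hypothetical)."""
    def __init__(self, warehouse):
        self.warehouse = warehouse

    def customer_revenue(self, customer_id):
        # Pull the needed data from the warehouse at query time,
        # then run the analytic computation on the analytics side.
        rows = self.warehouse.fetch(customer_id)
        return sum(r["amount"] for r in rows)

wh = Warehouse([
    {"customer": "C1", "amount": 120.0},
    {"customer": "C2", "amount": 80.0},
    {"customer": "C1", "amount": 40.0},
])
ap = AnalyticsPlatform(wh)
print(ap.customer_revenue("C1"))  # 160.0
```

In a real deployment the two classes would wrap ODBC/JDBC connections or vendor connectors; the design point is only that the pull happens at query time, so both platforms see consistent, current data.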
Analysts must be able to iterate quickly: to create a hypothesis, ask a question, change the hypothesis, ask a new question, and so on, arriving at a final algorithm, and then to repeat this process for the next level of analysis. Each platform, and the interaction among platforms, must have the performance capabilities, resources, and simplicity for such iteration. In addition, look for common languages, such as SQL or XML, to provide analysts with a consistent way to access data across the three platforms.

Principle 2: Let Workloads Gravitate to the Platform that Handles Them Best

This principle is critical to the overall performance of the solution. Creating separate workload platforms accomplishes little if mismatches remain between workloads and the optimal platform. Consider the 80/20 rule, which asserts that 80 percent of resources are typically taken by 20 percent of workloads, and those workloads tend to be analytical. Complex analytic workloads demand the most resources, so if complex analytics continues to be performed on the data warehouse, then performance of the data solution will continue to suffer, and both business users and analysts will get suboptimal results.

Instead, the performance-based data solution should be architected so that workloads can be moved to the platform that handles them best. The data warehouse is reserved for reporting and dashboards, static analysis using sets of questions that don't change over time, building cubes and supporting data marts for unique sets of requirements, and staging data that will be fed to the analytics platform. With its ability to iterate, combine diverse data types, and parse queries to run over multiple nodes, the analytics platform is reserved for complex analytics, data mining, and dynamic analysis.
Hadoop, which is essentially a distributed file system with a programming framework around it, is reserved for archiving large amounts of data; simple, fast filtered searches of massive data feeds; simple batch analytics; and text analytics. Hadoop is also great for transformations used in sentiment analysis or for data being fed into the data warehouse or analytics environment. For example, Hadoop can be used to take enormous Twitter and Facebook feeds, filter out certain words for their sentiment value, count the number of instances of those words, and assign a numerical rank. In this way, the complexity of language can be transformed into a value that can be incorporated into a report or used by the analytics platform. This ability to perform transformations on massive data sets is also useful for translations, such as integrating financial data collected from countries using different currencies.

By moving appropriate workloads to separate platforms, it becomes possible to extend the life of the data warehouse, perhaps for years, while eliminating the backlog of dashboard and reporting requests.

Principle 3: Understand and Utilize the Different Value Creation Capabilities of the Different Platforms

IT serves the business by creating value. A performance-based data solution should be architected so that each platform creates the highest possible business value.

For example, the data warehouse delivers operational value. It should be used to quantify what is happening in the business, reveal trends, monitor KPIs, and provide warnings about changes in the supply chain and marketing programs. The data warehouse excels at reporting on what has happened in the past and what is happening now, as well as providing guidance on how to adapt going forward.

In contrast, the analytics platform delivers analytic value. It drives innovation and competitive advantage by providing answers to discovery questions: the "why." It brings together diverse data and complex algorithms to provide answers to questions that decision makers inevitably ask: Which customers are most likely to buy new products? What will next quarter look like? How can I optimize the supply chain to get the right products to the right stores when they are needed?
The analytics platform supports predictive analytics, so decision makers can look forward to what is likely to happen and fine-tune their activities to favor business goals. It also supports prescriptive analytics, providing clear guidance about what to do in a particular situation. For example: What offers should I present to users visiting my website based on a variety of criteria (such as time of year, number of visits, or where they click)? If an enterprise has just produced 10,000 new widgets, business users might want to know: What distribution centers should they be sent to for optimal distribution to meet current and evolving demand?

Hadoop delivers informational value. With its ability to quickly work on massive data feeds, Hadoop can help businesses quickly see what trends are emerging across social media sites, how customers are reacting to news about the company or a new product, and what's happening on a smart grid or across a supply chain. Again, the results can be fed back into the analytics platform to make the predictive and prescriptive analytics more robust and accurate.

Principle 4: Share Data and Analytics Results at the Right Time for the Greatest Value

The final principle for architecting a performance-based data solution that maximizes the benefits of analytics offload is that users must be able to share data and analytics results for maximum value. To increase the potential value of shared data, the sharing should be in real time and on demand. Users must be able to pull data from one platform to another at the time of a query. For example, suppose a telecommunications company (telco) has a business customer that has recently doubled in size as the result of an acquisition. The customer logs on to the telco's portal for the first time since the acquisition. What services should be offered to the customer? The telco doesn't want to offer services that the customer has already purchased, so the customer portal must pull in information about the customer and the acquired company from the order management system, then feed this information to the analytics platform to generate the prescriptive analytics, which is then fed back to the customer portal.

Another simple but increasingly important example is enabling a call center application to access Hadoop to run queries on Twitter and Facebook feeds. When the call center of a large retailer receives a call regarding a news report about a product safety issue, the operator could immediately use Hadoop to query Twitter and Facebook feeds to learn whether customers have experienced the issue, explore how widespread the problem might be, and consider whether there is a potential impact to the retailer's brand.

Sharing analytic results openly can also speed time to value. Let's say a business analyst is using Hadoop to do sentiment analysis on Twitter and Facebook feeds and discovers that customers are increasingly complaining about product installations. In this case, the analyst can mix the sentiment analysis with product, order management, and supply chain data to obtain intelligence from all three platforms on what may be causing the installation issues.

Living in a Big Data World

Ultimately, the technical benefit of a properly architected performance-based data solution that uses analytics offload as a foundation is the ability to bring more data and more data types into the business intelligence workflow while achieving extreme performance and flexibility at any scale and price. The solution supports accelerated queries and table scans, faster batch and incremental loads, additional user and query concurrency, and a wider range of mixed query, reporting, and other workloads. It is also more resource efficient and offers lower storage utilization as well as speedier archiving, backup, and recovery. These benefits will enable businesses to get the maximum value out of their existing investments in data warehousing and analytics by aligning workloads with the right platforms while reducing hardware operational and acquisition costs and releasing resources previously tied down with system maintenance.

More important, by solving the big data analytics challenge, businesses will be able to solve the competitive challenges that lie ahead. With immediate access to a more robust analytics environment and more granular analysis, businesses can uncover hidden risks and capitalize on previously undiscovered opportunities. They can make more accurate decisions based on full access to their data and analytic tools and functions.

References

Beyer, Mark [2011]. "Mark Beyer, Father of the Logical Data Warehouse, Guest Post," blog post, Gartner.com, November. merv-adrian/2011/11/03/mark-beyer-father-of-the-logical-data-warehouse-guest-post/

Rogers, Shawn [2012]. "Embracing a Hybrid Data Ecosystem," blog post, EnterpriseManagement.com, April. shawnrogers/2012/04/16/embracing-hybrid-data-ecosystem/

BI Experts' Perspective: Mobile Apps

Timothy Leonard, William McKnight, John O'Brien, and Lyndsay Wise

Timothy Leonard is an independent consultant. William McKnight is president of McKnight Consulting Group. John O'Brien is principal and CEO of Radiant Advisors. [email protected] Lyndsay Wise is president of WiseAnalytics and author of Using Open Source Platforms for Business Intelligence: Avoid Pitfalls and Maximize ROI (Morgan Kaufmann, 2012). [email protected]

Torey Williams is the BI director at New Addition, a company that produces and sells maternity clothes through its own nationwide chain of stores as well as independent department and specialty stores. Most of Torey's BI efforts have focused on building a data warehouse that supports reporting and dashboards for sales information. Management is pleased with what Torey has done, but wants to take it to the next level and be able to access the information on cell phones and tablet devices. One manager has also asked if Torey can monitor social media data such as Twitter and feed interesting information to his iPad. The BI vendor Torey uses for reporting and dashboards does not currently support mobile apps, but is promising to have a product solution within the next six months. This is all new to Torey, and she has many questions.

1. Should she wait for her current BI vendor, or should she strike out on her own? What factors should influence this decision? Torey might be able to get management to wait six months, but she would prefer to respond more quickly.

2. What technology options are available if Torey does decide to proceed without waiting for her current BI vendor?

3. Management has mentioned using both cell phones and tablets. Should she try to support both? What are the differences that she should know about for BI purposes? How much does she need to standardize on the devices she supports?

4. To what extent do people's experiences with cell phones and tablets influence their expectations for BI delivered on these devices?

5. What are the options for security? For distributing the apps?

6. Is it feasible for her to stream social media comments? How can this be done?

Timothy Leonard

No, don't wait! It's important to go through the experience and the effort it takes to develop new programs when technology and methodologies aren't established yet. This will help you develop an understanding of the available options and approaches as well as the challenges. The key to making mobile BI successful is to start developing it. You want to show progress, not perfection.

In order for the overall mobile BI effort to be viewed as successful, the end solution must show ROI to the executive team and must make the analysts' and business users' processes easier. Training, documentation (including making short videos available for reference), spoon-feeding, and follow-up to ensure adoption and satisfaction are all essential and should be expected to be a significant part of the effort.

As far as supporting cell phones and/or tablets, that depends on the specific projects and the skills required, but there are many open source software packages and resources available to help Torey get started quickly and inexpensively. If she has a unique team, each member with very different skills and not much overlap, there won't be much organizational flexibility. The formal team should include the report generators and the requirement producers. It should also include IT staff members who understand how to use the tools and can teach the analysts. The IT members must also be able to work closely with the business users in order to understand the analysis requirements and objectives. A formal team lead will be required to maintain focus. Finally, have specific goals so the proper time can be allocated to the project. Annual performance reviews can be used as a motivational tool.

In short, Torey needs to build a cross-platform mobile development solution.
As to the request about supporting social media, it seems clear that social media is here to stay. It's a tool, like any other, and it is effective when used correctly and in the right situations. Social media gives you a whole new perspective on actionable customer data. There is a lot of hype surrounding it right now, but you need to prepare for the long term. After all, as technology advances, we will move from big data to humongous data and so on. I like to start with sample data from social media to gain predictive value. This will often start yielding patterns. All the social media sites provide APIs that are accessible to everyone; you can use these to get started. Start small and look for reference patterns that fit your business activities.

William McKnight

A BI director today should be initiating conversations about mobile at her organization. Torey should not be on the receiving end of a requirement for mobile or in any way surprised by it, regardless of her otherwise daily responsibilities. If so, lesson learned.

Torey cannot wait six months for mobile support from her current BI vendor. Not only is this too long a time frame, but the eventual solution may also be subpar and could easily exceed the current vendor's estimate for availability. Many BI vendors' first ports to mobile simply take the long form and shrink it down for the small screen. This is totally inadequate. It is perfectly reasonable to have different BI tools and vendors to address different needs in an organization. Although it appears Torey is fortunate to have the same vendor for reporting and dashboards, this consistency may not extend into mobile deployment. A BI vendor who provides reporting and dashboards today, but is six months away from mobile support, is not full stack nor positioned to address the majority of a client's BI needs over time. Perhaps Torey can begin to hedge her reliance on this vendor with her new mobile solution.

Torey should probably look to buy, not build, her mobile solution. Keeping up with the differences between iOS and Android (HTML5 notwithstanding), and the many phones, tablets, and in-betweens using these, and perhaps other, operating systems, is not a task for most IT organizations. Keep in mind that the deployment profile within the organization might change over time. I've seen them change quickly. It's important that the mobile BI solution address a variety of platforms. Vendors with excellent mobile ports include MicroStrategy, YellowFin, RoamBI, BIRT, and SAP.

Information and analytics must flow through the path of least resistance, utilizing the deployment option that turns the information into valuable business action most quickly. Both smartphones and tablets are business devices that are widely deployed to knowledge workers in an organization and should be supported. Phones may come first, but tablets must not be far behind.

Users have different expectations and cognitive and emotional attachments for different devices. For example, the expectation of authoring data is low to nonexistent on a cell phone, low but somewhat expected for a tablet, and limitless on a laptop or PC. In any case, best use of the real estate is a must. Users will quickly grow impatient with a PC-optimized screen that is shrink-fitted for mobile. Showing less of a PC rendering on a mobile device is also not the user expectation. The idea of categorization and drill is much more pronounced in the mobile world because of the limited viewing space.
The considered norm for business intelligence involves zero-footprint, Web-based delivery of reports. This allows information to reach many users. Although detailed, transactional data must be accessible on a drill-through basis, it is the rapid availability of summary-level information that activates the process. The emergence of corporate dashboards has prepared organizations for working from summary to detail, which is the ideal approach for mobile BI.

Furthermore, notifications take on a whole new utility with mobile. Notifications can be sent to mobile devices when data changes in a way that is meaningful to the knowledge worker. Mobile applications provide useful notifications on the devices themselves, so a separate delivery channel does not have to be a part of the process. Likewise, the transmission of data and collaboration around that data is facilitated with mobile by revving up the usage of the information. Collaboration in mobile BI should not be an afterthought. Mobile notifications, combined with the knowledge worker's ability to immediately access information wherever he or she may be, provide many advantages over dashboards, which require Web browser technologies. Note that mobile often refers to in-office mobility. The user does not have to be at a restaurant, at home, or in an airport to prefer mobile. The old saw about business intelligence is that it gets the right information to the right people at the right time. It's really time to add "via the right medium" to that mix.

Although organizations would like to support BYOD (bring your own device to work), many struggle. These devices can contain, either locally or through their connections to corporate systems, cached credentials that give access to personally identifiable information (PII) on customers, trade secrets, customer lists, and so on.
Companies developing internal projects that utilize mobile, such as mobile business intelligence, need to require formal policies and procedures pertaining to controls such as encryption, remote wiping, and password authorization on the devices. As far as distribution, mobile apps can be distributed internally by hosting them on a Web server where employees can access in-house apps. This is simply the Apple App Store concept applied to internal apps. Employees can download and update at their convenience, and policy should direct employees to stay relatively current.

As a short-term solution, the manager who wanted social media can easily set up a keyword search for the company name or whatever tag he is interested in and view that on a mobile Twitter application such as TweetDeck, or have specific Twitter account activity sent to his phone in real time as text messages. It is not difficult to feed any RSS application (many of which work on the iPad), specific Twitter account, or keyword activity. I'd be concerned about the volume, however. This might be why he specified "interesting" information. Torey may want to do some monitoring herself and schedule weekly meetings with the manager to share what she is seeing and ask if it is interesting activity. She could then start building smarter Twitter searches that could be fed to an RSS reader app.

Navigating the fast-changing world of social media is going to be important for every organization and, again, it will be a challenge for IT to home-grow the needed capabilities. In the longer term, Torey may want to investigate the capabilities of Gnip, a provider of social media to the enterprise. Gnip has the best head start on this dilemma and is adding all sources possible. This will still require work on the part of the Gnip customer to intelligently filter the information received and to actually analyze it and do the right thing with the information. Torey should help the manager utilize social media data as well as provide the data.
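The keyword filtering and numerical ranking described in these answers can be sketched in a few lines. The brand name, word lists, and weights below are made-up placeholders, and a real pipeline would read from a social media API rather than an in-memory list:

```python
# Toy keyword filter and sentiment rank over a stream of social posts.
# Word lists are illustrative, not a real sentiment lexicon.
POSITIVE = {"love", "great", "easy"}
NEGATIVE = {"broken", "hate", "difficult"}

def interesting(post, brand="newaddition"):
    """Keep only posts that mention the brand keyword."""
    return brand in post.lower()

def sentiment_rank(posts):
    """Count sentiment words across posts and return a net numerical rank."""
    score = 0
    for post in posts:
        words = post.lower().split()
        score += sum(w in POSITIVE for w in words)
        score -= sum(w in NEGATIVE for w in words)
    return score

feed = [
    "I love my NewAddition dress, so easy to order",
    "Zipper arrived broken #NewAddition",
    "Unrelated post about lunch",
]
brand_posts = [p for p in feed if interesting(p)]
print(len(brand_posts))           # 2
print(sentiment_rank(brand_posts))  # 1
```

This is exactly the shape of the Hadoop sentiment transformation described in the preceding article: filter a massive feed for relevant terms, count instances, and reduce the language to a number that a report or dashboard can consume.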
John O'Brien

It's natural for BI and information needs to be a company's first mobile implementation. After all, being mobile is about having instant access to information and answers when you are away from your desk. Adopting a mobile strategy can be a large undertaking with many variables and risks.

One of the first things Torey should do is verify that there are no other mobile strategies currently being planned or rolled out within IT. If another initiative is under way, she should determine what she can adopt and leverage from its strategy, standards, technologies, and security perspective. This is infrastructure that Torey should avoid duplicating, if possible. Any IT mobile implementation will need to organize strategies for the bring-your-own-device (BYOD) phenomenon, as well as application, data, and access management; application stores; user permissions for downloading apps and accessing data; app performance testing; remote wipe on lost devices; and so on. Each of these items must be addressed regardless of whether the implementation is related to BI, so a quick but thorough check through IT for similar initiatives is always a good place to start.

Many companies choose to implement mobile BI by extending their current BI tool sets. Doing so may yield quicker deployments that avoid BI migration work and additional infrastructure. However, many long-standing BI tools treat their mobile BI offering as the same reports and dashboards, just on different devices. Newer mobile BI platforms are more likely to start fresh and recognize that mobile BI involves a new and different user profile and level of interactivity. Torey should look at her current BI environment and decide which approach has the most value for New Addition.

Torey must identify and understand new mobile user profiles for her BI

developers. Mobile users can only do so much with the small screen of a smartphone. Tablets are used much like portfolio binders; they are effective for retrieving and sharing information, but users don't want to do hours of iterative analysis on a tablet as they would on a desktop. The form factor of the mini-tablet is still mostly unknown, but my hunch is that it will be used as an oversized smartphone. None of these devices will foster the desire to author substantial content. Torey's analysis of her own likes, dislikes, and usage of these devices will be revealing. For example, what is the average length of time she spends interacting with different types of apps on different devices?

Torey should also recognize that the three dominant form factors will apply to many hardware vendors, mobile OSes, and carriers. Developing or adopting HTML5-with-CSS3 apps opens up a single common denominator across all of these platforms, but presents trade-offs in performance and offline access. HTML5-based apps may also lack access to device functions such as the camera, GPS and accelerometers, voice recognition, near-field communication, and gesture control. These are all great instruments for data collection, and the devices themselves are incredible instruments for collaboration and analysis. One of the most critical decisions, which will depend on New Addition's strategy toward BYOD, is the choice between native app development per mobile OS platform and HTML5 development for all mobile OS platforms. This daunting decision is one reason an initial mobile BI implementation that extends current BI tools makes sense until Torey is able to develop mobile BI delivery expertise.

Finally, Torey's mobile strategy should be planned around use cases and mobile BI maturity levels.
Use cases will center on how users interact with data, starting with the simplest information consumption and moving to the most sophisticated mobile apps. In the beginning there will be many infrastructure details, and simpler information apps will deliver enormous, reliable value while fleshing out platform details. Most companies target the tablet first (the iPad in particular) as the standard device with the most potential value in the enterprise, and then extend the use cases to smartphones. Mobile BI users on smartphones tend to be more interrupt driven and work well with notifications, alerts, or quick information decisions. Use cases can be categorized as BI consumption, BI authoring, collaboration, presenting information, and action oriented.

The experience she gains with mobile app development will allow Torey to discover new ways to develop BI apps. She should always watch how users want to interact with information and allow developers to experiment and be creative. She must be careful not to simply deliver the same reports and dashboards on a smaller screen with no keyboard, or she's missing the whole point of mobile BI.

Lyndsay Wise

Torey's decision to expand into mobile BI will help expand general visibility across the organization. However, she has a difficult choice to make: whether or not to wait for the current BI vendor to release a mobile BI application that can be used on both cell phones and tablets. In general, a vendor's first iteration of an offering is not always its best in terms of features and capabilities. Therefore, it can be a risk to wait for a solution that may or may not meet user expectations when other solutions already exist that do. On the other hand, there may be the potential to have a say in solution development and the solution road map, depending on how willing the vendor is to work with Torey and the New Addition team.
As she weighs this decision, Torey should consider:

- The management time frame: can she wait six months, or is it important for BI momentum to expand access more quickly?
- Managing internal expectations: this may propel Torey toward evaluating current mobile BI applications to ensure that current successes are carried over.
- Payoffs for waiting: assuming that the BI vendor can meet the needs of end users, what benefits exist beyond expansion of use? Will waiting enable more agility later, or will the new offering inhibit expansion in the future?
- If a different solution is selected, how will it play into current development efforts and management over time?

Many BI vendors provide mobile BI options. When considering outside alternatives, Torey must ensure that the solutions can be integrated easily. For instance, Torey doesn't want to spend additional time reinventing dashboards. Some platforms provide a centralized development platform that can be used for both Web-based and mobile applications. If she selects a new solution, this might not be possible; it will largely depend on the development platform. If it's not possible, it becomes important to identify the effort involved, not only in development but also in data integration, data transformations, design, maintenance, and the cohesion between managing two distinct platforms.

Determining whether to support both phones and tablets will depend on the types of sales and other data being accessed and the levels of interactivity required. For instance, screen real estate on cell phones is very limited; in addition, many businesses choose to standardize on one type of cell phone for security and risk management purposes. Depending on expectations for interactivity and information access, one form of mobile access may be better than another. For instance, a smartphone might not provide the expected level of interactivity.
On the other hand, if smartphones are more widely used than tablets, expanding to tablets may require broader BI applications. This means that determining the best type of solution can be a challenge, depending on users' comfort with mobile devices. In general, tablets are better suited for touch and higher levels of interactivity. Proper expectations should be set; cell phones should be used only for summary-level information and cannot be interacted with as effectively as larger devices. Consequently, depending on the exact analytics and information required, it may be more effective not only to standardize on one type of device, but also to develop a single type of use until end users develop a better understanding of what can and cannot be achieved through mobile BI access.

Even though Torey's group is familiar with BI, the expansion to mobile means that end users will be interacting with BI in new and different ways, leading to broader applications. Therefore, it makes the most sense to start with a few key performance indicators (KPIs) and job-related analytics while still giving end users the freedom to develop their own uses. Many vendors now offer the ability to stream social media data, so the potential exists to embed it within a mobile application. However, it might be more valuable for now to identify trends within the social media streams, or at least to develop procedures for handling comments and identifying trends over time.
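The advice above, to keep cell phones to summary-level information and start with a few KPIs, can be illustrated with a minimal sketch. Everything here is hypothetical: the row layout and the three KPIs are invented for illustration. Detailed rows are collapsed into the handful of figures that fit a phone screen, while a tablet view might receive the full detail.

```python
from collections import defaultdict

# Hypothetical detailed rows: (region, revenue, units)
SALES = [
    ("East", 120_000.0, 300),
    ("West",  95_000.0, 250),
    ("East",  40_000.0, 100),
]

def phone_kpis(rows):
    """Collapse detailed rows into a few summary KPIs for a small screen."""
    total_revenue = sum(r[1] for r in rows)
    total_units = sum(r[2] for r in rows)
    by_region = defaultdict(float)
    for region, revenue, _units in rows:
        by_region[region] += revenue
    # The phone view gets three numbers, not the row-level detail
    return {
        "total_revenue": total_revenue,
        "avg_price": total_revenue / total_units,
        "top_region": max(by_region, key=by_region.get),
    }

if __name__ == "__main__":
    print(phone_kpis(SALES))
```

The design point is the asymmetry: the aggregation happens server-side, so the phone client only renders a few values, while richer interactivity stays on tablets and desktops.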

Editorial Calendar and Instructions for Authors

The Business Intelligence Journal is a quarterly journal that focuses on all aspects of data warehousing and business intelligence. It serves the needs of researchers and practitioners in this important field by publishing surveys of current practices, opinion pieces, conceptual frameworks, case studies that describe innovative practices or provide important insights, tutorials, technology discussions, and annotated bibliographies. The Journal publishes educational articles that do not market, advertise, or promote one particular product or company.

Editorial Topics
Journal authors are encouraged to submit articles of interest to business intelligence and data warehousing professionals, including the following timely topics:
- Agile BI
- Architecture and deployment (including cloud computing, software-as-a-service, Hadoop)
- BI adoption and use
- BI and big data
- Data analysis and delivery
- Data design and integration
- Data management: MDM, data quality, and data governance

Editorial Acceptance
All articles are reviewed by the Journal's editors before they are accepted for publication. The publisher will copyedit the final manuscript to conform to its standards of grammar, style, format, and length. Articles must not have been published previously, either online or in printed form. Submission of a manuscript implies the authors' assurance that the same work has not been submitted elsewhere, nor will be submitted elsewhere during the Journal's evaluation. Authors will be required to sign a release form before the article is published; this agreement is available upon request (contact [email protected]). The Journal will not publish articles that market, advertise, or promote one particular product or company.

Submissions
For more information and complete submission guidelines, please visit tdwi.org/journalsubmissions.
- Data warehouse and database technologies
- Mobile BI
- Project management and planning
- Selling and justifying the data warehouse

Upcoming Submissions Deadlines
Volume 18, Number 2
Submission deadline: February 22, 2013
Distribution: June 2013

Volume 18, Number 3
Submission deadline: May 17, 2013
Distribution: September 2013

Materials should be submitted to:
Jennifer Agee, Managing Editor
[email protected]

StatShots

Update on Cloud Computing
The Technology Survey that TDWI circulated at its recent World Conference in Boston asked attendees to answer a few questions about their perceptions of cloud computing in general, plus its potential use in business intelligence and data warehousing (BI/DW). TDWI had asked the same questions at conferences in 2009 and 2010. Comparing responses across the three runs of the survey reveals a few trends:

BI/DW professionals are much more familiar with cloud computing. Comparing results from the three survey runs, respondents selecting "not familiar at all" dropped from 42% to 16% to 13%. Correspondingly, results for "very familiar" rose from 6% to 9% to 21%. (See Figure 1.) The change suggests that cloud computing is finally on the radar screens of BI/DW professionals, who have been reading about it and studying it more seriously in recent years.

Enterprise cloud use is up, but cloud BI is still rare. For example, the percentage of survey respondents reporting no plans for enterprise cloud use has steadily decreased from 52% to 30% to 26%, while those already using it increased from 13% to 15% to 27%. (See Figure 2.) Although enterprise cloud use is up, a mere 6% of survey respondents report already using cloud BI (which is simply any implementation of a BI/DW platform on any kind of cloud). (See Figure 3.) A whopping 57% have no plans for cloud BI today. TDWI suspects that clouds are like any platform: operational applications are the first systems to deploy on the new platform, and BI/DW systems will follow later.

Philip Russom, TDWI Research Director for Data Management

How familiar with cloud computing are you?
                        2009    2010    2012
Not familiar at all      42%     16%     13%
Somewhat familiar        52%     75%     66%
Very familiar             6%      9%     21%
Figure 1. Based on 113 respondents in 2009, 208 in 2010, and 143 in 2012.

What's the status of your organization's enterprise cloud strategy?
                        2009    2010    2012
No plans                 52%     30%     26%
Exploration phase        26%     49%     36%
Design phase              5%      2%      3%
Implementation phase      4%      4%      8%
Already using            13%     15%     27%
Figure 2. Based on 110 respondents in 2009, 206 in 2010, and 141 in 2012.

What's the status of your organization's cloud BI strategy?
No plans                 57%
Exploration phase        32%
Design phase              4%
Implementation phase      1%
Already using             6%
Figure 3. Based on 139 respondents in 2012.


Hybrid: The Next Generation Cloud Interviews Among CIOs of the Fortune 1000 and Inc. 5000

Hybrid: The Next Generation Cloud Interviews Among CIOs of the Fortune 1000 and Inc. 5000 Hybrid: The Next Generation Cloud Interviews Among CIOs of the Fortune 1000 and Inc. 5000 IT Solutions Survey Wakefield Research 2 EXECUTIVE SUMMARY: Hybrid The Next Generation Cloud M ost Chief Information

More information

Delivering Value to the Business. Why Your Current HR Systems Hold You Back

Delivering Value to the Business. Why Your Current HR Systems Hold You Back Delivering Value to the Business Why Your Current HR Systems Hold You Back Delivering Value to the Business Why Your Current HR Systems Hold You Back When your Human Resources organization directly contributes

More information

Applied Business Intelligence. Iakovos Motakis, Ph.D. Director, DW & Decision Support Systems Intrasoft SA

Applied Business Intelligence. Iakovos Motakis, Ph.D. Director, DW & Decision Support Systems Intrasoft SA Applied Business Intelligence Iakovos Motakis, Ph.D. Director, DW & Decision Support Systems Intrasoft SA Agenda Business Drivers and Perspectives Technology & Analytical Applications Trends Challenges

More information

The Information Management Center of Excellence: A Pragmatic Approach

The Information Management Center of Excellence: A Pragmatic Approach 1 The Information Management Center of Excellence: A Pragmatic Approach Peter LePine & Tom Lovell Table of Contents TABLE OF CONTENTS... 2 Executive Summary... 3 Business case for an information management

More information

Top 10 Business Intelligence (BI) Requirements Analysis Questions

Top 10 Business Intelligence (BI) Requirements Analysis Questions Top 10 Business Intelligence (BI) Requirements Analysis Questions Business data is growing exponentially in volume, velocity and variety! Customer requirements, competition and innovation are driving rapid

More information

Information Governance Workshop. David Zanotta, Ph.D. Vice President, Global Data Management & Governance - PMO

Information Governance Workshop. David Zanotta, Ph.D. Vice President, Global Data Management & Governance - PMO Information Governance Workshop David Zanotta, Ph.D. Vice President, Global Data Management & Governance - PMO Recognition of Information Governance in Industry Research firms have begun to recognize the

More information

Successful Outsourcing of Data Warehouse Support

Successful Outsourcing of Data Warehouse Support Experience the commitment viewpoint Successful Outsourcing of Data Warehouse Support Focus IT management on the big picture, improve business value and reduce the cost of data Data warehouses can help

More information

Tools for Managing and Measuring the Value of Big Data Projects

Tools for Managing and Measuring the Value of Big Data Projects Tools for Managing and Measuring the Value of Big Data Projects Abstract Big Data and analytics focused projects have undetermined scope and changing requirements at their core. There is high risk of loss

More information

"Why Didn't We Do It Sooner?" Deployment of a New BI Solution at The Pain Center of Arizona

Why Didn't We Do It Sooner? Deployment of a New BI Solution at The Pain Center of Arizona Buyer Case Study "Why Didn't We Do It Sooner?" Deployment of a New BI Solution at The Pain Center of Arizona Dan Vesset IDC OPINION Investment in analytics, business intelligence, and big data technologies

More information

CREATING PACKAGED IP FOR BUSINESS ANALYTICS PROJECTS

CREATING PACKAGED IP FOR BUSINESS ANALYTICS PROJECTS CREATING PACKAGED IP FOR BUSINESS ANALYTICS PROJECTS A PERSPECTIVE FOR SYSTEMS INTEGRATORS Sponsored by Microsoft Corporation 1/ What is Packaged IP? Categorizing the Options 2/ Why Offer Packaged IP?

More information

Best practices for managing the data warehouse to support Big Data

Best practices for managing the data warehouse to support Big Data E-Guide Best practices for managing the data warehouse to support Big Data The new challenge for IT and data warehousing teams is how to leverage existing technology investments along with emerging tools

More information

Data Virtualization A Potential Antidote for Big Data Growing Pains

Data Virtualization A Potential Antidote for Big Data Growing Pains perspective Data Virtualization A Potential Antidote for Big Data Growing Pains Atul Shrivastava Abstract Enterprises are already facing challenges around data consolidation, heterogeneity, quality, and

More information

Becoming Agile: a getting started guide for Agile management in Marketing and their partners in IT, Sales, Customer Service and other business teams.

Becoming Agile: a getting started guide for Agile management in Marketing and their partners in IT, Sales, Customer Service and other business teams. Becoming Agile: a getting started guide for Agile management in Marketing and their partners in IT, Sales, Customer Service and other business teams. Agile for Business www.agilefluent.com Summary The

More information

Before getting started, we need to make sure we. Business Intelligence Project Management 101: Managing BI Projects Within the PMI Process Group

Before getting started, we need to make sure we. Business Intelligence Project Management 101: Managing BI Projects Within the PMI Process Group PMI Virtual Library 2010 Carole Wittemann Business Intelligence Project Management 101: Managing BI Projects Within the PMI Process Group By Carole Wittemann, PMP Abstract Too many times, business intelligence

More information

Focus Experts Briefing: Five Ways Modern ERP Solutions Increase Business Agility

Focus Experts Briefing: Five Ways Modern ERP Solutions Increase Business Agility Focus Experts Briefing: Five Ways Modern ERP November 16, 2011 topics: Information Technology Operations Enterprise Resource Planning ERP Business Agility Mobility Cloud Computing Business Intelligence

More information

Achieving Greater Agility with Business Intelligence Improving Speed and Flexibility for BI, Analytics, and Data Warehousing.

Achieving Greater Agility with Business Intelligence Improving Speed and Flexibility for BI, Analytics, and Data Warehousing. Achieving Greater Agility with Business Intelligence Improving Speed and Flexibility for BI, Analytics, and Data Warehousing By David Stodder TDWI Best Practices Report Sponsors 2 Agenda About this report:

More information

ON Semiconductor identified the following critical needs for its solution:

ON Semiconductor identified the following critical needs for its solution: Microsoft Business Intelligence Microsoft Office Business Scorecards Accelerator Case study Harnesses the Power of Business Intelligence to Drive Success Execution excellence is an imperative in order

More information

Analytics Outsourcing: The Hertz Experience 4 Hugh J. Watson, Barbara H. Wixom, and Thomas C. Pagano

Analytics Outsourcing: The Hertz Experience 4 Hugh J. Watson, Barbara H. Wixom, and Thomas C. Pagano EXCLUSIVELY FOR TDWI PREMIUM MEMBERS volume 18 number 4 The leading publication for business intelligence and data warehousing professionals Analytics Outsourcing: The Hertz Experience 4 Hugh J. Watson,

More information

Statement of Direction

Statement of Direction Microsoft Dynamics SL Statement of Direction Product strategy and roadmap for Microsoft Dynamics SL Date: January 2012 www.microsoft.com/dynamics/sl Page 1 CONTENTS Welcome... 3 Overview of Microsoft Dynamics

More information

Senior Business Intelligence/Engineering Analyst

Senior Business Intelligence/Engineering Analyst We are very interested in urgently hiring 3-4 current or recently graduated Computer Science graduate and/or undergraduate students and/or double majors. NetworkofOne is an online video content fund. We

More information

GUIDEBOOK MAXIMIZING SUCCESS DELIVERING MICROSOFT DYNAMICS

GUIDEBOOK MAXIMIZING SUCCESS DELIVERING MICROSOFT DYNAMICS GUIDEBOOK MAXIMIZING SUCCESS DELIVERING MICROSOFT DYNAMICS Corporate Headquarters Nucleus Research Inc. 100 State Street Boston, MA 02109 Phone: +1 617.720.2000 Nucleus Research Inc. TOPICS Enterprise

More information