
Dissertation for Master's Degree (Master of Engineering)

Design and Implementation of a Performance Management System for a Call Center (外包电话公司绩效管理系统的设计与实现)

Houchuan Li (李厚川)

UNIVERSITY OF PAVIA

October 2013

Classified Index (国内图书分类号): TP311
University Code (学校代码): 10213
U.D.C. (国际图书分类号): 681
Security Classification (密级): Public

Dissertation for the Master's Degree in Engineering (Master of Engineering)

Design and Implementation of a Performance Management System for a Call Center (外包电话公司绩效管理系统)

Candidate (硕士研究生): Houchuan Li (李厚川)
Supervisor (导师): Professor Dong Li (李东)
Associate Supervisor (副导师): Professor Gianmario Motta
Industrial Supervisor (实习单位导师): Giuseppe Ferraro
Academic Degree Applied for (申请学位): Master of Engineering
Speciality (学科): Software Engineering
Affiliation (所在单位): School of Software
Date of Defence (答辩日期): October 2013
Degree-Conferring Institution (授予学位单位): Harbin Institute of Technology

Classified Index: TP311
U.D.C: 681

Dissertation for the Master's Degree in Engineering

Design and Implementation of a Performance Management System for a Call Center

Candidate: Houchuan Li
Supervisor: Professor Dong Li
Associate Supervisor: Professor Gianmario Motta
Industrial Supervisor: Giuseppe Ferraro
Academic Degree Applied for: Master of Engineering
Speciality: Software Engineering
Affiliation: School of Software
Date of Defence: October 2013
Degree-Conferring Institution: Harbin Institute of Technology

摘要 (Abstract)

In recent years, with the rapid rise of Internet technology and the service industry, and especially the wide application of IT in all sectors, enterprise performance software has become increasingly popular among companies large and small. This project is a performance management system for the business intelligence side of a call center, developed jointly by the Service Engineering Laboratory of the University of Pavia and the Phonetica call center. In the business analysis phase, we analyzed the call center's business activities from the perspective of the different roles in order to determine their KPIs. In short, we used BPMN to analyze the current business activities and identify the data produced in each activity, and then analyzed that data according to the different roles. Next, SIRE was used to determine the specific data modules we needed to analyze. HIGO acts as a selector, finding the right KPIs according to the different roles.

After the KPIs were identified, this thesis also describes how the performance management system was designed and implemented on the open source business intelligence suite Pentaho. First, the DFM (Dimensional Fact Model) was used to design the fact table and the dimension tables of the database. Second, the system uses an ETL (Extract, Transform and Load) process to extract and transform the data, which ultimately links the presentation layer to the data layer. Finally, the system analyzes the data, compares similar and related data, and displays the results through dashboards.

In conclusion, this thesis shows how to create a performance management system: the KPIs are determined by analyzing the different business modules, and the system is then implemented on the open source suite Pentaho. The system provides a suitable logical framework for analyzing business processes according to the different roles and finally determining the KPIs and, based on Pentaho, develops a relatively flexible solution that turns the identified KPIs into data results and displays them to meet the customer's needs.

Keywords: performance management system, business intelligence, call center, business process, BI tools

Abstract

This thesis illustrates the design and implementation of a dashboard application for a call center. This application is a module of a complete Business Intelligence system, which also includes a warehouse, a forecasting system, web-based reporting, and mobile reporting. The system has been developed for Phonetica, a company which provides customer care services. Specifically, the Performance Management System enables management to analyze the level of service to customers and the performance of employees.

During the analysis phase, KPIs (Key Performance Indicators) were identified for the various stakeholders by means of the HIGO model. Then, by BPMN (Business Process Model and Notation) we modeled business processes and identified information flows and stakeholders; finally, by SIRE (Strategic Information Requirements Elicitation) we identified the key data entities. This thesis also presents the design and implementation phase, which has been based on an open source platform, namely the Pentaho BI Suite. Firstly, by DFM (Dimensional Fact Model) we modeled the fact and dimension tables of the data warehouse. Secondly, we implemented ETL (Extract, Transform and Load) to create information from the transactional database and store it in the data warehouse. Thirdly, relevant hypercubes are created to provide knowledge for the upper presentation layer. Finally, the dashboard presents the knowledge for analysis by the various Phonetica stakeholders.

In short, this thesis illustrates how to design a performance management system through a set of analysis models and how to implement it on an open source platform (namely the Pentaho BI Suite). The analysis model provides a logical and smooth procedure to identify KPIs that reflect the requirements of a wide range of stakeholders. In turn, the open source solution provides a flexible way to implement Business Intelligence systems.

Keywords: performance management system; business intelligence; ETL; BI tools

目录 (Contents)

摘要
Abstract
Chapter 1 Introduction
  1.1 Background
  1.2 The Purpose of Project
  1.3 The Status of Related Research
    Research about Corporate Performance Management
    Research about Business Intelligence
    Research about Management Process
    Research about BI Tools
    Research about Report
    Research about KPI
    Research about SLA
    Research about Dashboard
  1.4 Main Content and Organization of the Thesis
Chapter 2 System Requirement Analysis
  2.1 The Goal of the System
  2.2 The Functional Requirements
    2.2.1 BPMN
    2.2.2 SIRE Analysis
    2.2.3 KPI Requirements
    2.2.4 HIGO Analysis
    2.2.5 KPI Selection
    2.2.6 Use Case
  2.3 The Non-Functional Requirements
  2.4 Brief Summary
Chapter 3 System Design
  3.1 The Overall System Design
  3.2 DFM Model
  3.3 Data Mining and Processing Model
    3.3.1 Introduction
    3.3.2 Data Warehouse Design
    3.3.3 Data Integration
    3.3.4 Cubes Design
  3.4 Data Access Model
    3.4.1 Introduction
    3.4.2 Model Process
    3.4.3 Control Model
  3.5 Dashboard Model
    3.5.1 Introduction
    3.5.2 Program Process
    3.5.3 Interface Design
    3.5.4 Sequence Diagram
  3.6 Key Techniques
    3.6.1 ETL
    3.6.2 MDX
    3.6.3 CDE
    3.6.4 CDA
  3.7 Brief Summary
Chapter 4 System Implementation and Testing
  4.1 The Environment of System Implementation
  4.2 Key Program Flow Charts
    4.2.1 Data Integration
    4.2.2 Data Access Model
    4.2.3 Control Model
    4.2.4 Dashboard Model
  4.3 Key Interfaces of the Software System
  4.4 System Testing
    4.4.1 Test Environment
    4.4.2 ETL Test
    4.4.3 OLAP Test
    4.4.4 Server Test
    4.4.5 Presentation Test
  4.5 Brief Summary
Conclusion
References
Statement of Originality and Letter of Authorization
Acknowledgement
Resume

Chapter 1 Introduction

This chapter introduces the background and purpose of this project, including the division of work and a brief introduction to performance management. At the end of the chapter, I describe the status of related research and summarize the main content and structure of this thesis.

1.1 Background

The project was launched for a genuine outsourcing call center company, namely Phonetica, a midsized enterprise that operates in Business Process Outsourcing (BPO). It provides various services to a wide range of companies, such as "Business Concierge" (corporate switchboard), customer care front and back office, remote answering service, and reception activities. Phonetica also provides marketing services such as market research and opinion surveys, the updating and management of marketing databases, sales support, and events organization.

Since customer satisfaction (e.g., service level) is the KSF (Key Success Factor) [1] of a call center, call quality monitoring is one of the most effective methods for improving the level of service. Generally, call quality monitoring refers to the process of listening to or observing an agent's phone conversations or other multimedia contacts with customers [2]. Companies can measure customer satisfaction through customer focus groups, telephone surveys of clients, and written satisfaction surveys, but the results are usually not detailed enough or timely enough to help agents understand their contribution or impact. On the other hand, if call center monitoring is implemented correctly, it can report a wealth of consumer information in a timely manner, reveal the status of the relevant business processes and policies, and collect individual agent performance. Not only can it improve client satisfaction, but it can also improve overall call center performance, e.g. by reducing callbacks, rework, and training effort; as a result a company achieves more cost-effective processes, more sales opportunities, and more capable employees.

Good quality monitoring must reflect customer expectations and value; it must truly measure the experience of each customer. How to measure the

customer experience is the key purpose of the call center's quality monitoring process. In other words, quality monitoring is a performance management tool. As with any measurement tool, we must be careful about what we measure, because the process of measurement in and of itself produces change; we must make sure what we are measuring is the right behaviors. How frequently should call quality be measured? The answer depends on many factors, shown in Table 1-1.

Table 1-1 Call quality factors
1. How well is the employee prepared to do the job?
2. How well do your agents understand the process and the measurements?
3. How calibrated are your evaluators? Do they measure consistently and fairly, capturing performance?
4. How good is your feedback? Can employees relate to and have faith in the scoring, and see how they can change their behavior to improve?
5. Does regulation or a bargaining unit agreement restrict your organization? Make sure you comply with your labor contract or local legislation.
6. How many evaluations can you do? Do you have a system to help record calls and track performance?
7. How close are employees to the standard? How far does their behavior have to change to meet expectations?
8. Lack of buy-in or credibility.
9. Insufficient resources or time.
10. Lack of automation, or poor system reliability or support.
11. Inadequate training for coaches and supervisors.
12. Inconsistent evaluation results.
13. Lack of support from management [4].

However, call center quality monitoring systems are challenged by many factors, including difficulty in designing evaluation criteria. Studies have shown that there is little correlation between employee tenure and quality scores; generally, behavior varies based on employee skill level, experience, and motivation. There needs to be a way to help

employees achieve a level of "unconscious competence" in their customer interactions through training, measurement, feedback, and rewards. However, the number of calls monitored per month per representative varies widely across sectors. As a group, participants average 6.3 monitoring sessions per month per agent [3]. The survey results are shown in Figure 1-1.

Figure 1-1 Calls monitored per month per representative

Call quality monitoring is therefore all about changing behavior and improving and stabilizing employee performance, and an efficient analysis tool is still needed to make that possible. On the other hand, since the mid-1990s, business intelligence (BI) solutions have promised increasingly fast and widespread access to information that drives better business decisions. But the better part of a decade later, BI's considerable benefits are still limited to a lucky few, and still draw from just a subset of the data available to an organization. In particular, if your organization has tried to apply mainstream BI solutions to extract intelligence from your seemingly infinite store of telecommunications data, you've probably realized only a fraction of the potential payback [5].

Highly coveted business intelligence includes understanding consumer needs, knowing what the competition is up to, keeping ears to the ground to make sure that products and services continue to meet changing customer needs, and complying with a growing array of rules and regulations. According to Gartner, in 2010 companies spent $10.5 billion on BI and related software, trying to make sense of all the information siloed away in disparate databases and applications. This money was spent trying to mine structured information (numbers and text). These costly investments overlook the contact center, where unstructured data (voice and screen actions) contain answers to many of the

key questions that influence strategy [6]. As a consequence, more recently, call centers have begun looking at vendors that are exploring the customer experience side of the problem using databases designed according to the principles of Big Data. Call centers have a larger breadth of data subject to more scrutiny through tools similar to IT-friendly business intelligence systems and performance management systems. As businesses ask more complex questions about customer interactions, call centers respond by turning to analytics tools that mimic business intelligence applications.

Given all these requirements, Phonetica's management wants to improve the quality of its business processes and of the business decisions that impact costs and revenue, encourage staff, promote enterprise development, and increase the company's market position. Therefore, this operations research shall study how to provide suitable performance management to satisfy these requirements at the lowest cost. It aims to provide analysis of all service levels, worker efficiency, traffic time, and so on; the key point is that all the requirements must be evaluated from different dimensions and with different measures.

In order to specify and supply these demands, we divide this project into two phases. The first is the analysis and design phase, which includes business process analysis (BPMN), fit-gap analysis, KPI (Key Performance Indicator) analysis in terms of identifying the main entities and stakeholders based on HIGO, and the creation of a DFM (Dimensional Fact Model) for each KPI. The second is the implementation phase, which implements the KPIs as dashboards to build up the corporate performance management system. The whole project includes business process analysis, data integration, and analysis result visualization. The analytical model used in this project is aimed at analyzing, designing, and implementing a performance management system for a call center; however, it can also serve as a reference for other business process analysis systems.

1.2 The purpose of project

The purpose of this project is to save staff costs and increase efficiency in the call center. We will analyze Phonetica's business processes and, according to the different stakeholders, design KPIs which can help Phonetica define and measure its progress toward its goals; there must be a way to define and measure this accurately. Therefore this project should show the satisfaction of all its service levels, the efficiency of the staff, the

traffic time, and so on; the key requirements must be evaluated from different dimensions and different measures to save cost and increase efficiency in the call center. The aims of this project are:

- Meet all the requirements set out in the requirement analysis.
- Provide a solution to analyze the call center business process.
- Provide a method for designing KPIs according to the business process.
- Provide a solution for extracting, transforming, and loading (ETL) data from the raw data according to the KPIs.
- Display the data in an effective way.
- Provide a basic solution for implementing a performance management system for the call center.
- Reduce the salary costs of Phonetica.
- Improve the level of service to the customer.

1.3 The status of related research

1.3.1 Research about Business Process

A Business Process (BP) is a fundamental element of business architecture. Indeed it defines how an organization works and produces its products and services [7]. The overall schema of organizational BPs can be represented as a T model, as shown in Figure 1-2.

Figure 1-2 T model (leg: primary processes; arms: management processes and support processes)

As the figure shows, the arms of the T represent support processes and management processes respectively, while the leg represents primary processes. Primary processes are specific to each industry and are therefore shown vertically, while management and support processes are cross-industry and so are shown horizontally.

Primary processes are end-to-end, cross-functional, and deliver value to customers. Primary processes are often called critical processes, as they represent the essential activities that an organization performs to accomplish its mission. These processes make up the value chain, where each step adds value to the previous step as measured by its contribution to the creation or delivery of a product or service, ultimately generating value for customers.

A management process is a process of planning and controlling the organizing, leading, and execution of any type of activity. The difference between a management process and performance management is that performance management ensures that goals are being met in an effective and efficient manner, whereas a management process focuses on how to control the organization of, for example, a project (project management process) or a process (process management process, sometimes referred to as the process performance measurement and management system). The organization's senior management is responsible for carrying out its management process; however, this is not always the case for all management processes [8]. In other words, process management means describing and analyzing the service process according to the function of the services and, following the service model, managing, executing, and updating the services provided by organizations [9]. A vulnerability management process, for example, should be part of an organization's effort to control information security risks.

From the organization's strategic point of view, performance management is the process of creating a work environment or setting in which people are enabled to perform to the best of their abilities; it covers the whole work system, which begins when a job is defined as needed and ends when an employee leaves the organization. The reason why many organizations want to improve their management process is to unlock the potential value of evaluation. More effective evaluation processes accomplish these goals and have additional benefits. A performance management process, participated in effectively and with the appropriate mindset, accomplishes the same goals and more, and also supplies additional advantages to both the manager and the employee. After reviewing a great deal of information about the management process, I found the term "performance management" used as a substitute for the traditional appraisal system. A

performance management system includes the following actions, shown in Table 1-2.

Table 1-2 Performance management system actions
1. Develop clear job descriptions.
2. Select appropriate people with an appropriate selection process.
3. Negotiate requirements and accomplishment-based performance standards, outcomes, and measures.
4. Provide effective orientation, education, and training.
5. Provide on-going coaching and feedback.
6. Conduct quarterly performance development discussions.
7. Design effective compensation and recognition systems that reward people for their contributions.
8. Provide promotional/career development opportunities for staff.
9. Assist with exit interviews to understand why valued employees leave the organization.

Organizations can also use the following checklist to help them in a more traditional performance appraisal process:
1. Define the purpose of the job, job duties, and responsibilities.
2. Define performance goals with measurable outcomes.
3. Define the priority of each job responsibility and goal.
4. Define performance standards for key components of the job.
5. Hold interim discussions and provide feedback about employee performance, preferably daily, summarized and discussed at least quarterly (provide positive and constructive feedback).
6. Maintain a record of performance through critical incident reports (jot notes about contributions or problems throughout the quarter in an employee file).
7. Provide the opportunity for broader feedback: use a 360-degree performance feedback system that incorporates feedback from the employee's peers, customers, and people who may report to him or her.
8. Develop and administer a coaching and improvement plan if the employee is not meeting expectations.

The organization's senior management is responsible for carrying out its management process. However, this is not always the case for all management processes; for example, it is the responsibility of the project manager to carry out a project management process. The control sequence activities are shown in Figure 1-3: Define (define objectives), Appraise (monitor performances; review results), and Adjust (plan adjustments; execute adjustments), with interaction between the control and execution flows in the form of information or commands.

Figure 1-3 Overall schema of the management control sequence

Support processes are designed to provide support for primary processes, often by managing the resources and/or infrastructure required by primary processes. The key differentiator between primary and support processes is that support processes do not generate direct value to customers, while primary processes do. However, support processes can, and usually do, cross functional boundaries. A capability management process, for example, does not deliver direct value to the customer, but supports the organization's ability to deliver products and services. Each support process activity can involve cross-functional teams with representatives from accounting, human resources, IT services, and so on.

1.3.2 Research about Corporate Performance Management

Performance management defines the expected performance of each IT service in terms of availability, response time, and similar performance indicators. The performance values are often written into a Service Level Agreement (SLA) that, in the case of a third-party provider, ties price and penalties to performance levels. It includes activities which ensure that goals are consistently being met in an effective and efficient manner. It can focus on the performance of an organization, a department, an employee, or even the processes that build a product or service, as well as many other areas. PM is also known as a process by which organizations align their resources, systems, and employees to strategic objectives and priorities [10].

Corporate Performance Management (CPM) is a framework that integrates strategy with business operations. It gives management a prospective and real-time picture of what is actually going on across the value chain and provides a robust platform to support future growth. It helps executives address the fundamental business questions, so how to translate strategy into sustainable performance is the key of CPM. A strategically oriented performance management process system is the key to the success of performance management activities, and it has currently become one of the hot issues for researchers and practitioners of performance management theory [11]. A performance management system for a complex-structure organization, based on emergence and strategy, requires a dominant logic [12].

Companies struggle mightily to meld their strategic intent with their operations, leaving a gap between the strategy they develop and their ability to execute that strategy in day-to-day business. What is needed is a better capability to understand the pulse of the organization: not only what is going on financially, but also actionable management information about operations and compliance to guide operational and financial decisions. Developing a strategic performance management approach drives sustainable performance by aligning the activities of the management team and employees with corporate strategy. Companies pursuing a valid strategic plan usually assign small teams of top executives to set their goals, using a rigorous planning process. Most companies use a top-down process; some, however, combine this process with a bottom-up approach that enables middle management to bring their local knowledge into the process. Such a flexible approach helps build middle-manager ownership and motivation [13]. The extent to which a performance management system promotes the organization's overall

performance in a complex-structure organization depends on a series of cooperative activities among the members of the system and the elements of its internal processes [14].

Managing performance requirements can be quite difficult due to their nature. First, performance requirements can have a global impact on the target system: meeting one such requirement may change several parts of a system. Indeed, one cannot simply add a performance module and expect to produce a system with good performance. Rather, one must consider performance requirements throughout the system and throughout the development process [12]. There are some main approaches to support such decisions. In order to meet performance requirements, which change from one system to another, one needs to consider the characteristics of the particular company and system, such as its workload. To create an effective CPM framework, some important questions must be answered: What business outcomes demonstrate successful execution of our business strategy? How do we best monitor and measure those outcomes? What are the key elements of our operating model and the related business processes? How do we best measure the effectiveness of our operating model and the efficiency of the business processes that support it? What information is necessary to add context to the performance measurement framework and to answer business questions? The company must then define the performance measurement and analytical information environment that best supports its business strategy and operating model, and outline the steps necessary to enable it with technology. Major components of the CPM vision and roadmap include:

- Definition of the company-level outcome measurements
- Definition of the enterprise-level process measurements
- Rationale and importance for each measurement
- Description of the primary drill-downs and analytical dimensions
- Identification of the primary data sources and any major business process or data gaps
- Business case for action
- Roadmap to enable CPM [15]

A performance management system should be completely embedded in the culture of the organization, not the other way around, and performance measures

must be defined on the basis of the overall goals or objectives of the organization, which are the principal concepts underlying any employee evaluation process [16]. The process model of cooperation management is shown in Figure 1-4.

Figure 1-4 Process model of cooperation management (cooperative management target, evaluation gap, current situation, cooperative opportunity identification, cooperative value evaluation, communication, elements of integration, management of the order parameter, achieved results and cooperation effect)

First, identify the objectives that need to be achieved; this gives the organization's future state. Then analyze the organization's current situation. Using these two, we can identify the company's evaluation gap, that is, what must be bridged. Once we know the future state and the current situation, we can think about what is needed to bridge the gap and reach the project's objectives: cooperative opportunity identification and cooperative value evaluation. Insights can be gained concerning the factors considered by cooperative managers when making restructuring decisions, and extension education programs can be adapted to meet the greatest need. The results achieved then influence the organization's next target and also its business process.

1.3.3 Research about Business Intelligence

Business intelligence (BI) is a set of theories, methodologies, processes, architectures, and technologies that transform raw data into meaningful and useful information for business purposes. BI can handle large amounts of information to help identify and develop new opportunities; making use of new opportunities and implementing an effective strategy can provide a competitive market advantage and long-term stability [7]. BI provides the consolidation and analysis of raw data, and also the capacity to process raw data into executable decision-making information [17]. It is a broad category of computer software solutions that enables a company or organization to gain insight into its critical operations through reporting applications and analysis tools. BI applications may include a variety of components such as tabular reports, spreadsheets, charts, and dashboards. Although traditional business intelligence systems were delivered via host terminals or paper reports, the typical modern deployment of a BI application is over the web, via Internet or intranet connections. It is also possible, and becoming more popular, to develop interactive BI apps optimized for mobile devices such as tablets and smart phones.

It was predicted that by 2001 "Information Democracy" would emerge in forward-thinking enterprises, with business intelligence information and applications available broadly to employees, consultants, customers, suppliers, and the public. The key to thriving in a competitive marketplace is staying ahead of the competition. Making sound business decisions based on accurate and current information takes more than intuition. Data analysis, reporting, and query tools can help business users wade through a sea of data to synthesize valuable information from it; today these tools collectively fall into a category called "business intelligence". For many years, BI applications have been working successfully in supply chain management, GIS, insurance, retailing, telecommunications, aviation, customer relationship management, banking, healthcare, disaster management, and many other domains [18]. Nowadays most business companies have implemented business intelligence to improve their decision making [19]; the point is to give users the information they need, with analysis when they need it, in the right format. By integrating data from across the enterprise and delivering self-service reporting and analysis, IT spends less time responding to requests and business users spend less time looking for information. The popular approach is to offer an integrated, robust, and flexible presentation

layer for the full breadth of analytics capabilities, including statistics, predictive analytics, data and text mining, forecasting, and optimization, all integrated within the business context for better, faster decision making. From the related research, some details [19] are summarized in Table 1-3 below.

Table 1-3 Business intelligence models analysis

- Business Intelligence Model from Voice of Customer [20]: combination of unstructured and structured information in an information-intensive enterprise; it can derive richer business insights from the combined data of all the business processes.
- A Feasible Enterprise BI Design Model [21]: a specific BI model created to solve the problem of the lack of a reference and prototype.
- Application of Data Warehouse in Bank Operations [22]: improves the performance of bank operations; different ways of schema reconciliation and data duplication.
- BI to Improve Delivery Reliability [23]: analytical performance is ensured; flexibility for expanding the current technical architecture.
- SOA in BI for Real-Time Environments [24]: the system continuously receives information in real time according to the contracts between the respective services.
- A BI to Support Information Retrieval in an Ontology-Based Environment [25]: an ETL module integrates semantically indexed data with operational data; retrieves relevant data; cleans fake data and stores the meaningful data in the database.
- A Systematic Information Collection BI Model [26]: design of a structured information collection method based on extenics theory (avoids information overload or knowledge overload).
- The General System Framework of Text-Driven BI [27]: consists of text data collection, text preprocessing, feature extraction, special information extraction, and mining.

- BI Understanding Application Consumer Heterogeneity [28][31]: a neural network simplifies the understanding of domain requirements.
- A Model Using Amorphic Architecture [29]: the Amorphic web information extraction system prototype can locate data of interest based on domain knowledge or page structure.
- A BI System for Catalogue and Online Retailers [30]: CRM systems with BI increase satisfaction and customer relations.
- A BI Approach to Support Strategy-Making [32]: BI reveals usage patterns (customer usage behaviors and network facility utilization).
- BI in Enhancing the Teaching-Learning Process [33]: a BI system ensures an effective student-institution relationship.
- BI in E-Learning [34]: the proposed solution produces reports on the time spent by each student on learning.
- Application of BI in Banks [35]: a very secure, dynamic, and analytically enriched tool.
- BI for Automated Test System Environments [36]: cost effective and readily available from the raw data.

Research about BI tools

"To accomplish a goal, make sure the proper tools are selected." Table 1-4 lists popular business intelligence tools and their vendors.

Table 1-4 Popular BI tools (tool, version, vendor)
- BizzScore Suite, 7.4, EFM Software
- IBM Cognos Series, IBM
- MicroStrategy, 9.2, MicroStrategy
- Style Intelligence, 11.3, InetSoft

- Pentaho BI Suite (open source), 4.8, Pentaho
- Board Management Intelligence Toolkit, 7.4, Board International
- JasperSoft (open source), 5.0, JasperSoft
- WebFOCUS, 8.01, Information Builders
- Microsoft Business Intelligence, 2013/10, Microsoft
- QlikView, 11.2, QlikTech
- SAS Enterprise BI Server, 9.3, SAS Institute
- Tableau Software, 7.0, Tableau Software

From another perspective, BI software can be divided into five main parts, as shown in Table 1-5.

Table 1-5 The parts of BI software
- Database/Hardware: database systems are the information heart of modern enterprises, where they are used for processing business transactions and for understanding and managing the enterprise.
- ETL tools: read data from a specified source database and extract a desired subset; a transform function works with the acquired data; the resulting data is written to a target database.
- OLAP tools: enable business users to analyze large quantities of data in real time.
- Reporting tools: the presentation layer.
- Metadata tools: provide information to the end users.

Open source business intelligence introduction

Open source BI is BI software that can be distributed for free and permits users to modify the source code. Open source software is available for all BI tools, from data modeling to reporting to OLAP to ETL. Because open source software is community driven, it relies on the community for improvement. As such, new feature sets typically come from community contributions rather than as a result of dedicated R&D efforts.

1) Advantages of open source BI tools

Easy to get started. With traditional BI software, the business model typically involves a hefty startup cost, and then there is an annual fee for support and maintenance calculated as a percentage of the initial purchase price. In this model, a company needs to spend a substantial amount of money before any benefit is realized. With the substantial cost also comes the need to go through a sales cycle, from the RFP process to evaluation to negotiation, and multiple teams within the organization typically get involved. These factors mean that it is not only costly to get started with traditional BI software, but the amount of time it takes is also long. With open source BI, the beginning of the project typically involves a free download of the software. Given this, bureaucracy can be kept to a minimum and it is very easy and inexpensive to get started.

Lower cost. Because of its low startup cost and the typically lower ongoing maintenance/support cost, the cost of open source BI software is lower (sometimes much lower) than that of traditional BI software.

Easy to customize. By definition, open source software means that users can access and modify the source code directly. That means it is possible for developers to get under the hood of the open source BI tool and add their own features. In contrast, it is much more difficult to do this with traditional BI software because there is no way to access the source code.

2) Disadvantages of open source BI tools

Features are not as robust. Traditional BI software vendors put a lot of money and resources into R&D, and the result is that the product has a rich feature set. Open source BI tools, on the other hand, rely on community support, and hence do not have as strong a feature set.

Consulting help is not as readily available. Most of the traditional BI software products (MicroStrategy, Business Objects, Cognos, Oracle, and so on) have been around for a long time. As a result, there are a lot of people with experience with those tools, and finding consulting help to implement these solutions is usually not very difficult. Open source BI tools, on the other hand, are a fairly recent development, and there are relatively few people with implementation experience. So it is more difficult to find consulting help if you go with open source BI.

After considering the advantages and disadvantages of open source BI tools, we selected Pentaho as our development platform.

Pentaho

1) Pentaho introduction

Pentaho is one of the world's most popular enterprise open source BI suites, with more than 2 million lifetime downloads, averaging 100K per month. Founded in 2004, it is a pioneer in professional open source BI, with a management team of proven BI and open source veterans from Business Objects, Cognos, Hyperion, JBoss, Oracle, Red Hat, and SAS.

2) Pentaho professional open source advantages

Pentaho is widely recognized as the leader in open source BI. The advantages are shown in Table 1-6.

Table 1-6 Advantages of Pentaho
- Open source licensing: software code is free; dramatically lower up-front and ongoing costs; complete transparency.
- Superior relationship: relationship based 100% on quality of customer support; no vendor lock-in based simply on access to future upgrades; reduced risk.
- Whole product: delivering a whole product using an open source core, with support, training, documentation, a global partner network, consulting, product management, quality assurance, and longevity.
- Enterprise development methodology: innovative, enterprise-quality products via a professional methodology; superior development productivity and commercial quality via extensive QA; continuous testing in diverse environments by a huge global community; a transparent, detailed roadmap; product roadmap, core development, and project contributions managed by Pentaho.
- Enterprise support methodology: delivered via subscription service, 9x5 or 24x7; 20% of core developers' time allocated to delivering services and support.

3) Pentaho architectural advantages

Pentaho's architectural advantages are shown in Table 1-7.

Table 1-7 Architectural advantages
1. 100% J2EE server-side application, for scalability, manageability, and integration.

2. Aggressive support of open standards wherever available: J2EE, JDBC, MDX, SQL, JSR-170, etc.
3. Designed for embeddability and Service-Oriented Architectures (SOA), not a monolithic, hardwired stack exposed via a thin web services layer.
4. Lack of legacy architectural issues, acquisition baggage, or cumbersome migrations.
5. Componentized and modular, for flexibility and easy customization.
6. Completely exposed via AJAX and Web Services [37].

The architecture diagram in Figure 1-5 shows the relationship between the major components of the BI Server and its interfaces with the outside world. The heart of the server is the Solution Engine, which is the focal point for activity within the Pentaho BI Platform.

Figure 1-5 Pentaho architecture (BI platform components for reporting, analysis, dashboards, and process management; data and application integration via ETL, metadata, and EII; data sources including the data warehouse, ERP/CRM, legacy data, OLAP, and local data; access via browser, portal, office, and web services)

1.3.4 Research about Report

A report is any informational work (usually written, spoken, televised, or filmed) made with the specific intention of relaying information or recounting certain events in a widely presentable form; reporting is also a fundamental part of the larger movement towards improved business intelligence and knowledge management. Reports are often used to display the result of an experiment, investigation, or inquiry, using features such as graphics, images, voice, or specialized vocabulary in order to persuade a specific audience to undertake an action. Among the most common formats for presenting reports are tables and dashboards. Reports are not required to follow this pattern and may use alternative patterns like the problem-solution format. Additional elements often used to persuade readers range from headings to indicate topics to more complex formats including charts, tables, figures, pictures, and tables of contents. Reports are very important in all their various forms, especially in business intelligence.

A quality reporting system is a facility that provides site-wide centralized access to the quality control data. The reporting system is menu driven, making it extremely user friendly. The data is reported and presented to meet the requirements of management, while allowing for the more detailed information needed by engineering. The programming for this facility is modular, and the modules are aligned with the components of the quality measurement system.

Research about SLA

A service-level agreement (SLA) is a formal contract between a service provider and a customer guaranteeing quantifiable service performance at defined levels; it states, usually in measurable terms, what services the network service provider will furnish. It describes in detail the service type, scope, and quality, and the service can be monitored at runtime to check whether there is any default [41]. An SLA should contain a specified level of service, support options, enforcement or penalty provisions for services not provided, and a guaranteed level of system performance.

Many Internet service providers (ISPs) provide their customers with an SLA. More recently, Internet service (IS) departments in major enterprises have adopted

the idea of writing a service level agreement so that services for their customers (users in other departments within the enterprise) can be measured, justified, and perhaps compared with those of outsourcing network providers. Some metrics that SLAs may specify include:

- What percentage of the time services will be available
- The number of users that can be served simultaneously
- Specific performance benchmarks to which actual performance will be periodically compared
- The schedule for notification in advance of network changes that may affect users
- Help desk response time for various classes of problems
- Dial-in access availability
- Usage statistics that will be provided

For a call center, the service level KPI is the single most important indicator of the ability to deliver a high quality of service to customers. The call center's service level agreement is its commitment to a basic standard of service; if it is unable to meet its SLA, the call center is in trouble. Service level is typically set using a goal such as answering 75% of calls within 20 seconds, or 75/20. Monitoring this KPI in real time is essential, as it provides a barometer of the call center's performance. Up to now, research studying SLAs in call centers is not mature, and the relevant literature is relatively scarce; this makes the project more meaningful.

Research about KPI

Key Performance Indicators, also known as KPIs or Key Success Indicators (KSIs), help an organization define and measure progress toward organizational goals [38]. Once an organization has analyzed its mission, identified all its stakeholders, and defined its goals, it needs a way to measure progress toward those goals. KPIs can be monitored using BI techniques to assist in prescribing a course of action as well as to assess the current state of the business [39]. Recent research has found that BI has a significant impact on organizational profit growth; among the many BI tools, the dashboard is a popular and flexible choice, and for greater

effectiveness it is better to recognize which KPIs are more important for each industry [40]. Therefore I selected the dashboard as my solution to meet this project's requirements. KPIs are measurements, quantifiable and agreed to beforehand, that reflect the critical success factors of an organization; they differ from company to company [42]. In the last century, management realized that corporate performance could not be measured only by financial ratios; the practice emerged of measuring so-called physical performances that ultimately lead to financial results (higher product quality may lead to higher profitability). KPIs cover a wide range of measures, including effectiveness and efficiency, as shown in Figure 1-6.

Figure 1-6 KPI performance (efficiency: unit cost, productivity, workload; effectiveness: time/service quality, response/lead time, conformity, perfect orders, reliability, flexibility, user satisfaction, timeliness)
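To make these measures concrete, the sketch below computes two of the KPIs discussed above, the service-level indicator from the SLA section (the share of calls answered within a threshold, e.g. the 75/20 goal) and the average waiting time, from a few sample call records. This is a minimal illustration in Python, not Phonetica's actual logic; the record fields and sample numbers are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class CallRecord:
    wait_seconds: float   # time the caller waited before an agent picked up
    answered: bool        # False if the caller hung up before being served

def service_level(calls, threshold_seconds=20):
    """Share of all calls answered within the threshold (the '75/20' style KPI)."""
    if not calls:
        return 0.0
    within = sum(1 for c in calls
                 if c.answered and c.wait_seconds <= threshold_seconds)
    return within / len(calls)

def average_wait(calls):
    """Mean waiting time over answered calls only."""
    waits = [c.wait_seconds for c in calls if c.answered]
    return sum(waits) / len(waits) if waits else 0.0

if __name__ == "__main__":
    sample = [CallRecord(12, True), CallRecord(45, True),
              CallRecord(8, True), CallRecord(70, False)]
    print(f"Service level (20s threshold): {service_level(sample):.0%}")  # 50%
    print(f"Average wait (answered calls): {average_wait(sample):.1f}s")  # 21.7s
```

Measured this way, a 75/20 service-level goal is met when service_level returns at least 0.75 with a 20-second threshold.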

1.3.7 Research about Dashboard

A business intelligence dashboard is a data visualization tool that displays the current status of an enterprise's metrics and key performance indicators (KPIs). Dashboards consolidate and arrange numbers, indicators, and sometimes performance scorecards on a single screen. They may be designed for a specific role and display indicators for a single point of view or department. The basic features of BI dashboard products include a customizable interface and the ability to pull real-time data from multiple data sources [43]. Support for interactive queries against the data warehouse and the usability and ease of use of the dashboard are essential, because dashboards provide analysts with a view of the key business indicators that reflect business performance [44].

Business intelligence dashboards are often confused with performance scorecards. Traditionally, the main difference between them is that a business intelligence dashboard, like the dashboard of a car, indicates status at a particular point in time, whereas a scorecard shows progress over time towards a specific goal. Dashboard and scorecard designs are increasingly converging; for example, some commercial dashboard products also include the ability to track progress towards goals, and a product combining elements of dashboards and scorecards is sometimes referred to as a scoreboard. A key factor in the development of the enterprise dashboard is that the dashboard essentially presents a variety of core business data formats [45].

In this project, the dashboard displays the current status of metrics and KPIs for Phonetica, pulling real-time data from multiple data sources to display the current service status. My project focuses on the support system that helps the manager develop performance management. It includes five parts: requirements analysis, KPI creation, DFM creation, ETL for mining the data, and dashboard creation. For requirements analysis, we concentrate on all of the Phonetica company's business activities. The next step selects KPIs that significantly reflect the current state of the company, such as its service level and customer satisfaction. Then, according to the KPIs, we build the DFM for the analytic information used in the KPI element data, considering three parts: measures, dimensions, and facts. A fact is a focus of interest; typically, it models an event occurring in the enterprise world. Dimensions are discrete attributes which determine the minimum granularity adopted to represent facts.

Measures are continuously valued attributes which describe the fact from different points of view; for instance, each sale is measured by its revenue. After ETL, the final step is to present all the results in the dashboard. We aim to find the best way to present the data using data visualizations. Data visualizations are easier to understand and look more appealing to the audience, and it is crucial to achieve a balance between visual appeal and functionality. There are two main advantages. The first is clarity: it is a lot easier to understand a dial or graphic than numbers. The second is less confusion: it is easy to get confused when dealing with lots of numbers, as you actually need to memorize them to understand the communicated information.

Own solution

To implement this system, my solution includes three processes: KPI selection, DFM analysis, and data integration and implementation. Figure 1-7 illustrates the three processes.

Figure 1-7 The main process of the system

KPI selection. We use SIRE to help us select the KPIs which meet our requirements. SIRE helps us analyze the whole business process: which data will be used and which data is to be analyzed in the different processes, according to the different stakeholders and their different analyses, and

32 ultimately decide which data we need. According to the BPMN that the business process of the Phonetica. DFM analyzed. Depending on each state holder, we can understand how to ETL the data from the data base and how to clean the data for preparing to build the cubes. Therefore I have to make sure all the main function of the requirement which is using the dashboard to show all the KPIs which I have selected. Data integration and result display. This is the main function of the system was implemented in open source software: Pentaho BI suite, which will provide the BI tools. ETL (exact, transform, load), it happens during the processing original historical records. OLAP(on-line transaction processing). And using MDX language to retrieve data from the cubes which implemented by the ETL. After the transformations of data, the Pentaho Schema Workbench, which is an interface designer that allows creating and testing Mondrian OLAP cube schemas will offer the cube of forecasting data visually. The Mondrian engine processes MDX requests for the ROLAP (Relational OLAP) schemas. These schema files are XML metadata models that are created in a specific structure used by the Mondrian engine and the dimensions of the each services and time line are needed to connect into cubes by using the Pentaho Schema Workbench [50].The data source is retrieved from the relational database, and presents the results in a multidimensional format via a Java API, then translate MDX to SQL to select the relational database. Finally, by using CDF (Community Dashboard Framework), to implement dashboards and reports using web application. In order to improve the system responding time and make the data update automatically, we will make a control model through the server. The control model sends the parameter to the CDA. Next step CDA will return the data source to the server. The server will translate the result in JSON pattern and store it in the database.the data, which are extracted from the database (original data) under the ETL, through data clean processing, Data transformations is performed by using the ETL that includes both the 'transformation' piece and the 'cleansing' piece. ETL plays an important role in metadata because it maps the source data to the destination, which is an important piece of the metadata. After creating cubes, we can use CDF(Community Dashboard framework) the most important and core part of Pentaho BI tools.cde(community Dashboard Editor) /CDA(Community Data Access) allows you to build real time dashboards by their feature rich UI editor. CDA is a community project that allows fetching data in 24

1.4 Main content and organization of the thesis

The main content of this thesis is as follows:

- It introduces the requirements and analyzes them.
- It describes how the KPIs are designed depending on the requirements.
- It shows how the cubes are built according to the KPIs (ETL and data integration).
- It introduces the use of BI tools such as Pentaho and EXT-JS for implementing the system.
- It describes the use cases for each module and introduces the testing methods.
- It introduces the steps to deploy the system and describes the deployment documents.

The thesis includes system requirement analysis, system design, system implementation and testing, and deployment. The organization of the thesis is shown in Table 1-8.

Table 1-8 The organization of the thesis
- Requirement analysis: describe the goal of the system; describe the functional requirements, which show what the business intelligence system will provide; describe how to design KPIs according to the business process; describe the non-functional requirements.
- System design: describe the overall functional design; show the system architecture; design the process for each module.

- System implementation and test: introduce the steps of system implementation; show how to ETL and build cubes; show how to retrieve data from cubes to build dashboards; describe the key techniques and problems; describe the work schedule; describe the testing methods; describe the environment of the system; describe the dashboard system testing.
- Deployment: introduce the steps to deploy the system; describe the deployment documents.

Chapter 2 System Requirement Analysis

This chapter describes the main goal of the system and introduces the requirements. In order to illustrate the system functionalities more clearly, it describes how the KPIs are designed from the business process. After that it describes the non-functional requirements.

2.1 The goal of the system

Phonetica's management wants to improve the quality of business decisions that impact costs and revenue, encourage staff, promote enterprise development, and increase the company's market position. Therefore, this operations research shall study how to provide suitable performance management to satisfy these requirements at the smallest cost: to provide analysis of satisfaction for all service levels, the efficiency of the staff, the traffic time, and so on, where the key requirements must be evaluated from different dimensions and with different measures. In order to specify and implement corporate performance management, we classify the requirements into two steps. The first step is business process analysis, which includes fit-gap analysis, KPI analysis, and the creation of the DFM. The second step implements corporate performance management according to the KPIs. In order to describe the system functionalities more clearly, this chapter introduces them by showing the KPI requirements and the other requirements respectively.

The goal of the system is to monitor call center performance according to diverse aggregation dimensions, oriented to the stakeholders: top management, supervisors, agents, and customers. The analysis roadmap is shown in Figure 2-1.

Figure 2-1 The analysis roadmap overview (BPMN entity analysis; identification in terms of data entities and stakeholders; SIRE grid (Strategic Information Requirements Elicitation); KPI grid (HIGO); conceptual modeling (DFM); information analysis according to the stakeholder perspective)

Performance management, as an effective tool, plays an important role in evaluation, encouraging staff, promoting enterprise development, improving the market position of the company, and enhancing vitality and competitiveness. In my project, I consider the performance management coverage shown in Figure 2-2.

Figure 2-2 Performance management coverage (data sources: quality management via surveys, recordings, observation, etc.; objectives and planning: revenue improvement, first call resolution improvement, etc.; optimization and improvement: operations improvement, new tools, new marketing plans, etc.; monitoring via dashboards for agents, customers, supervisors, and top management; KPIs: quality, cost/efficiency, service)

2.2 The KPI analysis

In the first step we analyze the Phonetica business process; the point is to understand, for each activity, what kind of data is generated and what kind of information is used. In the second step we analyze the data entities according to the different stakeholders. The third step uses SIRE to define the attributes of the data entities. Finally we use HIGO to select the KPIs.

2.2.1 BPMN

Business Process Model and Notation (BPMN) is a graphical representation for specifying business processes in a business process model. It was previously known as Business Process Modeling Notation [46]. BPMN is an agreement between multiple modeling tool vendors, who had their own notations, to use a single notation for the benefit of end-user understanding and training. Business processes describe how a business pursues its objectives, and they can be modeled with the BPMN standard. In order to illustrate the Phonetica business process briefly, a simple BPMN model of the process is shown in Figure 2-3.

Figure 2-3 Simple Phonetica business process (lanes: Phonetica call center, customer; activities: make a phone call, receive the call, wait or hang up, tell the problem, provide the solution, record the times)

From the figure we can see that when a customer makes a phone call, a Phonetica customer service agent receives and answers it. However, at that moment the call center may be very busy (traffic peak) for the related service, so the call is not necessarily answered promptly; customers may therefore wait a few minutes or hang up directly. If the customer waits too long, the client may abandon the call. If a customer service agent receives the consumer's request, it means the staff answered the phone. The agent then records the customer's request and answers the client's question. All of the above is simply one service interaction. After a service is completed, the company records all the related information, including the time when the customer called, the time when the phone was hung up, whether the customer's problem was solved, the waiting time, and the question the customer wanted to solve. All this information is written into the database. Since this project has commercial value and involves trade secrets, the exact process cannot be described in detail; it is therefore described here in an approximate way.
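Although the real process and schema are confidential, the kind of record that this process writes to the database can be sketched as a single table. The following SQL is a minimal, hypothetical illustration (all table and column names are invented for the examples in this thesis, not Phonetica's actual schema); the KPI sketches in the next sections query this hypothetical table.

-- Hypothetical call-record table; the real Phonetica schema is confidential.
CREATE TABLE call_record (
    call_id        BIGSERIAL PRIMARY KEY,
    customer_phone VARCHAR(20)  NOT NULL,   -- caller number
    service_name   VARCHAR(100) NOT NULL,   -- service the call belongs to
    agent_id       INTEGER,                 -- NULL if the call was never answered
    call_time      TIMESTAMP    NOT NULL,   -- when the call entered the queue
    answer_time    TIMESTAMP,               -- NULL for abandoned calls
    hangup_time    TIMESTAMP    NOT NULL,   -- when the call ended
    is_answered    BOOLEAN      NOT NULL,
    is_resolved    BOOLEAN,                 -- whether the problem was solved
    question       TEXT                     -- the customer's request
);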

2.2.2 SIRE analysis

The method, called SIRE (Strategic Information Requirements Elicitation), includes the elicitation and modeling of strategic information requirements, which sit one abstraction level higher than the traditional conceptual level [47]. SIRE helps us analyze the whole business process: which data are used and which data are analyzed in the different processes, according to the different stakeholders and their different analyses, in order to ultimately decide which data we need. According to the BPMN model of the Phonetica business process, the SIRE grid is designed as shown in Table 2-1.

Table 2-1 SIRE grid
Stakeholders: customer; staff; resources unit.
Master data: customer (call number, start time); staff (master data, other information); resources unit (master data, service resource).
Transactions: make a phone call; wait; provide the request; hang up the phone; answer the phone; provide the solution; record the start time; record the customer phone number; record the waiting time; record whether the call was answered; record the request; record the end time; record the staff number; record the satisfaction score.
Analysis: customer problem; staff skills; traffic time; waiting time.

2.2.3 HIGO analysis

HIGO is a framework that defines business process performances and stakeholder returns, and it has been the main framework used to define and select the key performance indicators [13]. In this project there are three main stakeholders: customer, worker and manager. A simple HIGO grid is shown in Figure 2-4.

Figure 2-4 HIGO grid

2.2.4 KPI requirements

Phonetica has its own KPIs that its managers use to determine the success of the operation. Many KPIs were given to us; the common ones are listed below with short descriptions, comparable with those of other call centers, as illustrated in Table 2-2.

Table 2-2 KPIs
Average waiting time: How long does it take for an agent to answer an incoming call?
Abandon Rate: What percentage of the calls is lost before they can be answered?
Call Handling Time: How long does it take the agent to complete the call?
First Call Resolution: What percentage of calls can be resolved in a single call?
Transfer Rate: What percentage of calls has to be transferred to someone else to complete?
Idle Time: How much time does an agent spend after the completion of a call to finish the business from that call?
Hold Time: How much time does the agent keep the caller on hold during the call?

Continue Table 2-2
Service Level: What is the overall service level?
Average talking time: How long does a phone call last?
Customer call frequency: A metric that indicates the frequency of repeated calls from the same customer.
Customer Satisfaction: How satisfied are customers with the service provided?
Back office: How well an agent works within their schedule, and how efficiently they spend their time.
Agent Performance: How the agents are performing across several categories.
Call Resolution: The outcome of each call handled by the call center.

2.2.5 KPI description

KPI of Service Level
Formula:
Percentage of calls answered within X seconds = Σ calls answered within X seconds / total calls
Percentage of calls lost after X seconds = Σ calls lost after X seconds / total calls

Table 2-3 KPI specification: Service Level
KPI: Service Level (SL)
Metrics: percentage of calls answered within X seconds; percentage of calls lost after X seconds; average talk time (see KPI VI); average speed of answer (see KPI II)
Objective: varying

Continue Table 2-3
KPI: Service Level (SL)
Description: it is important to have a well-defined threshold in order to fit the distribution of calls. Information about the service level is necessary to size the call center correctly. Moreover, it is important to make the agents aware of the service level goal (the trainer can demonstrate to the agents how the effort of each individual contributes to reaching the goal by showing the Erlang distribution).
Source: «Chiamate» table

There are in total seven types of SLA; each SLA is defined by one metric or by a combination of two metrics, as shown in Table 2-4.

Table 2-4 Service level types (Service Level Agreement; definition; variable X range)
SLA 1: answered calls within X seconds
SLA 2: lost calls after X seconds (X range: 0-40)
SLA 3: average speed of answer (the same as KPI II)
SLA 4: answered calls within X seconds (X = 40) and average speed of answer
SLA 5: lost calls after X seconds and average speed of answer
SLA 6: average talk time (the same as KPI VI)
SLA 7: answered calls within X seconds (X = 20) and lost calls after X seconds
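As a concrete illustration, the service level percentages can be computed with a single aggregate query over the call records. This is only a sketch against the hypothetical call_record table of Section 2.2.1 (the real «Chiamate» table is confidential), with X = 20 seconds as a sample threshold:

-- Service level per service: share of calls answered within X = 20 seconds
-- and share of calls lost after 20 seconds (hypothetical schema).
SELECT
    service_name,
    AVG(CASE WHEN is_answered
              AND answer_time <= call_time + INTERVAL '20 seconds'
             THEN 1.0 ELSE 0.0 END) AS pct_answered_within_20s,
    AVG(CASE WHEN NOT is_answered
              AND hangup_time > call_time + INTERVAL '20 seconds'
             THEN 1.0 ELSE 0.0 END) AS pct_lost_after_20s
FROM call_record
GROUP BY service_name;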

KPI of Average Speed of Answer
Formula:
Average waiting time = Σ waiting time of answered calls / number of calls

Table 2-5 KPI specification: Average Speed of Answer
KPI: Average speed of answer (ASA)
Metrics: average waiting time of answered calls
Objective: no standard
Description: it is a measure for the call center, not for individuals. However, it is directly affected by staffers being available to take calls when scheduled, so schedule adherence is the measure of individual performance that is typically in place to ensure that the call center's ASA goal is met. The recommended metric for service level and ASA is the percentage of the intervals (hourly or half-hourly) of the day in which the service level or ASA goal is met. This metric provides a much more useful look at speed of answer, with more information to address staffing issues and adjust schedules for better work coverage.
Source: «Chiamate» table

KPI of Upselling Opportunity
Formula:
Number of overtime calls = Σ overtime calls

Table 2-6 KPI specification: Upselling Opportunity
KPI: Upselling opportunity
Metrics: number of overtime calls; average of overtime calls
Objective: no standard

Continue Table 2-6
KPI: Upselling opportunity
Description: it is a measure of upselling opportunities. The contract between the call center and the customer defines in which time range (e.g. from 9 AM to 6 PM) and on which days (e.g. from Monday to Friday) the calls will be answered. The indicator counts the number of overtime calls in order to assess opportunities to propose and sell an additional range that is profitable for both the call center and the customer.
Source: «Chiamate» table

KPI of Traffic
Formula:
Number of calls = Σ calls

Table 2-7 KPI specification: Traffic
KPI: Traffic
Metrics: number of calls
Objective: no industry standard
Description: this is the number of contacts received by the center (technically, the volume is measured at the point of entry into the queue; calls handled within the IVR are measured separately). Call volumes can be represented on a yearly, monthly, weekly and daily basis. For the purpose of Work Force Management (WFM), call volumes are often broken down into 15-minute intervals. Traffic represents the workload that a center must handle, and although the calls are initiated by individual customers, a good forecasting process can predict the volume demand with a very high degree of accuracy. Actual call volumes are tracked and provide information for any intra-day adjustment as well as for short- and long-term forecasting.
Source: Phonetica calls

KPI of Abandonment Rate
Formula:
Percentage of abandoned calls = Σ abandoned calls / total calls

Table 2-8 KPI specification: Abandonment rate
KPI: Abandonment rate
Metrics: number of abandoned calls; percentage of abandoned calls
Objective: no industry standard
Description: the abandon rate can translate into lost customers, and tracking it may help to identify patterns of abandon behavior. The abandon rate is measured by looking at the calls that abandon during the defined period of time compared with all calls for that period. The abandon rate is a typical measure of call center performance, but it is not entirely under the call center's control. While abandons are affected by the average wait time in queue (which can be controlled by the call center), a multitude of other factors influence this number, such as caller tolerance, time of day, and availability of service alternatives.
Source: Phonetica calls

KPI of Average Talk Time
Formula:
Average call time = Σ call time / number of calls

Table 2-9 KPI specification: Average talk time
KPI: Average Talk Time (ATT)
Metrics: average call time
Objective: no industry standard
Description: the most common measure of contact handling is the average talk time. ATT is used when determining overall workload and staffing requirements. To accommodate differences in calling patterns, ATT should be measured and identified by time of day as well as by day of week. It measures overall call center performance as well as team and individual agent performance. Although handle times will vary based on call content, an agent should typically deliver a consistent handle time within an acceptable range. However, overemphasizing short ATT can reduce the quality of the interaction and decrease the conversion rate. ATT numbers should be gathered and analyzed primarily to determine whether agents are in an acceptable range of performance and whether differences among agents are associated with different conversion rates.
Source: Phonetica calls

KPI of Average Back-office Time
Formula:
Average handle time of instance = Σ handle time of back-office instances / total number of back-office instances

Table 2-10 KPI specification: Average Back-office Time
KPI: Average Back-office Time (ABT)
Metrics: average handle time of instance
Objective: benchmarking
Description: one of the components that is considered to be the most variable and the most controllable is the back-office time portion of the contact. The ACD provides this measure. ABT should be measured and evaluated over time to determine the appropriate amount of time needed to accomplish the necessary tasks. This overall call center ABT number will then typically serve as the benchmark against which to measure an individual agent's ABT. Comparisons between agents should be made on similar types of calls, because the requirements of different call-handling situations can vary significantly. ABT should be measured by type of call as well as by individual.
Source: Phonetica calls

KPI of Delay In Queue
Formula:
Max delay = maximum of waiting time

Table 2-11 KPI specification: Delay in queue
KPI: Delay in queue (LDQ)
Metrics: max delay
Objective: no industry standard
Description: the age of the call that has been in queue the longest, or the longest delay in queue (LDQ), is a real-time measure of performance that is used by many call centers to indicate when immediate staffing changes are required. LDQ is also a historical gauge of performance that indicates the "worst-case" experience of a customer over a period of time, such as a day. LDQ is a call center measure and not an individual gauge of performance, but, like the other speed-of-service measures, this statistic is affected by schedule adherence. The LDQ measure is an indicator of an extreme situation and can be used as a "red flag" for real-time management. Although it can be used as a historical key performance measure, that use is not recommended, because unusual call arrival patterns can make the queue situation look much worse than it really is.
Source: Phonetica calls
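The speed-of-answer KPIs above all reduce to simple aggregations over the call records. The following sketch, again on the hypothetical call_record table of Section 2.2.1, computes the average speed of answer, the abandonment rate, and the longest delay in queue per service and day:

-- ASA, abandonment rate and longest delay in queue per service and day
-- (hypothetical schema; delay = answer or hang-up time minus queue entry).
SELECT
    service_name,
    call_time::date AS day,
    AVG(CASE WHEN is_answered
             THEN EXTRACT(EPOCH FROM answer_time - call_time) END) AS asa_seconds,
    AVG(CASE WHEN is_answered THEN 0.0 ELSE 1.0 END) AS abandonment_rate,
    MAX(EXTRACT(EPOCH FROM COALESCE(answer_time, hangup_time) - call_time))
        AS ldq_seconds
FROM call_record
GROUP BY service_name, call_time::date
ORDER BY service_name, day;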

2.3 System functional requirements

In the first step we need to analyze the participants, i.e. who will use the system. After consulting and analysis, we determined that one kind of user will utilize this system: the managers of Phonetica. The next step is the use case analysis. As analyzed in the first step, there is only one kind of user of this system; for these operators, the system mainly provides five functions to meet their requirements.

The system should automatically update the data day by day. According to the phone records, the system should extract the information and store the related data in the database, and it should run this function automatically. In this way we can make sure the data we use is fresh and that the results calculated from the data are correct.

In order to let the user view the required KPI information, the system should provide an appropriate method to display it; for this function we use a table chart to display the results. The system should provide a summary page to show the results of each service, and allow the user to see, for different time ranges, what is going on in each service. The system should let the user access the specific detail information behind each summary page item: the manager wants to know the exact details of each service at different times, so that he can see what happened in a given period and make decisions. Considering the user requirements of this system, and in order to see the workforce results, the system should provide a function that compares the precise information of a service across different time periods. In this way we can quickly see how a service is evolving over a period and how efficiently each service progresses. We use a line chart to display this result: line charts simply use a line to connect the data points, and they are most useful for showing trends and for identifying whether two variables relate to (or "correlate with") one another. Finally, considering that the user may want to reuse the data results, for instance to print them or send them to other users, the system should also provide a function to save them as Excel. A use case diagram of the system is shown in Figure 2-5.

Figure 2-5 System use case (user: browse all the KPIs, see KPI details, compare information, save the results as Excel; system: update the data automatically)

2.4 Non-functional requirements

The system has the following non-functional requirements:
1) The response time after a user click should be less than 5 s.
2) All the calculated results must be updated in a timely and automatic way.
3) The interface should be easy to use and consistent with the other systems.
The non-functional requirements focus on friendly interface design, response time, ease of use, and reliability in different environments.

2.5 Brief summary

In this chapter, I introduced the simplified business process of Phonetica and described how to design and select the KPIs from the business process. After that I illustrated the functions of the system. Through the analysis of the business process, we

can see how to analyze a call center business process, not only for Phonetica but also for other companies. After the KPI analysis, the requirements became clearer. Finally, the chapter described the non-functional requirements: response time and a friendly interface are very important for this system and need to be addressed in the final product.

Chapter 3 System Design

This chapter describes the overall design of the system. Specifically, it describes the design of each functional module, and it introduces the details of each process, including the program flow, class diagrams, and flow charts. At the end of the chapter, it introduces some popular BI tools and how these tools were used in this project.

3.1 The overall system design

Figure 3-1 shows the overall system architecture, which mainly includes four layers: the data (warehouse) layer, the OLAP layer, the server layer, and the presentation layer.
1. The main function of the data layer is to store and retrieve the data sources.
2. The main function of the OLAP layer is to process the data sources: it extracts and analyzes the data from the database and returns the results to the server layer.
3. The main function of the server layer is to send queries with the necessary parameters to the OLAP layer and then save the results in the database. Its other function is to receive requests from the presentation layer, query the database for the results, and return the data to the presentation layer.
4. The presentation layer is used by the system user; all user operations happen in this layer.
The main functionality of this system lies in three layers, the OLAP layer, the server layer, and the presentation layer, which I will explain one by one.

Figure 3-1 System architecture overview (presentation layer: dashboard, HTTP requests, JSON data; server layer: data access component, HTTP client, data store; OLAP layer: CDA, cubes, Mondrian engine; data warehouse layer: Phonetica raw data, ETL, data warehouse and data marts)

According to the analysis requirements of the last chapter, we focus on the KPIs decided during the requirement analysis; all the module designs serve this requirement. The system has four main modules. In the data warehouse layer we use the DFM model: according to the stakeholders and the KPI requirements, we decide which data we need, analyze the data in the call records, and design the new relational tables of the data warehouse. Its function is to design the database used by the data warehouse layer. The data mining and processing model is used by the OLAP layer; its function is to ETL the data and store it in the database designed according to the DFMs. In the server layer we use the data access model and the control model. The function of the data access model is to retrieve the data and return it; the function of the control model is to send the parameters to the access model, store the results in the database, and provide this service to the display module. In the presentation layer we use the dashboard module which, according to the functional needs, must compare results over different time periods and save them, as a table chart, to Excel. These modules operate in sequence. In order to describe all the modules clearly, they are summarized in Table 3-1.

Table 3-1 Module description
DFM model: according to the stakeholders and the KPI requirements, it decides which data we need.
Data mining and processing model: its function is the process of analyzing data from different perspectives and summarizing it into useful information; the useful information is defined by the KPIs.
Data access model: its function refers to the software and activities related to storing, retrieving, or acting on data housed in a database or other repository.
Control model: it retrieves the data source from the data access model and stores the results in the database; when the dashboard sends a request, it returns the data source to the dashboard.
Dashboard model: its function is to let the user monitor the major results at a glance, like an instrument cluster.

3.2 DFM design

According to the KPIs, we create a DFM (Dimensional Fact Model) for each stakeholder. After that we can ETL the data from the source database, cleaning it in preparation for building the cubes. Up to now, I have described the principal functional requirement, which is to use the dashboard to show all the selected KPIs; the next step is to design the system according to the DFMs. The data related to the DFMs can be retrieved from the Phonetica database server. Therefore the first step is ETL and data integration, which we handle with the Pentaho tools. After that we can build the cubes. In the subsequent step we use CDA to implement the data sources that retrieve the data from the cubes, and in the last step CDE builds the dashboard, whose components use the data sources provided by CDA. Below I show the DFMs designed for the KPIs already selected; the other DFMs are similar.

3.2.1 Service Level DFM

The Service Level DFM is designed according to the Service Level KPI. The facts correspond to the metrics, so we design the facts following the formula of the KPI. We need to calculate the results depending on time and service name, which means we design three dimensions: the time dimension, the service dimension, and the date dimension. The DFM of the service level is described in Table 3-2, and the diagram is shown in Figure 3-2.

Table 3-2 DFM of Service Level
Fact: percentage of calls answered within X seconds; percentage of calls lost after X seconds
Dimensions: Date (week, month, month of year, day of month, holiday, weekend); Service (SLA type, skillset, customer, insieme); Time (quarter, half, hour, shift)

Figure 3-2 DFM of Service Level

3.2.2 KPI 01 Average Speed of Answer DFM

The Average Speed of Answer DFM is designed according to the Average Speed of Answer KPI. The fact corresponds to the metric, so we design the fact following the formula of the KPI. We need to calculate the result depending on time and service name, which means we design three dimensions: the time dimension, the service dimension, and the date dimension. The DFM of the average speed of answer is described in Table 3-3, and the diagram is shown in Figure 3-3.

Table 3-3 DFM of Average Speed of Answer
Fact: Σ waiting time of answered calls / number of calls
Dimensions: Date (week, month, month of year, day of month, holiday, weekend); Service (SLA type, skillset, customer, insieme); Time (quarter, half, hour, shift)

Figure 3-3 DFM of Average Speed of Answer

3.2.3 KPI 02 Upselling Opportunity DFM

The Upselling Opportunity DFM is designed according to the Upselling Opportunity KPI. The fact corresponds to the metric, so we design the fact following the formula of the KPI. We need to calculate the result depending on time and service name, which means we design three dimensions: the time dimension, the service dimension, and the date dimension. The DFM of the upselling opportunity is described in Table 3-4, and the diagram is shown in Figure 3-4.

Table 3-4 DFM of Upselling Opportunity
Fact: Σ overtime calls
Dimensions: Date (week, month, month of year, day of month, holiday, weekend); Service (SLA type, skillset, customer, insieme); Time (quarter, half, hour, shift)

Figure 3-4 DFM of Upselling Opportunity

3.2.4 KPI 03 Traffic DFM

The Traffic DFM is designed according to the Traffic KPI. The fact corresponds to the metric, so we design the fact following the formula of the KPI. We need to calculate the result depending on time and service name, which means we design the corresponding dimensions: the time dimension, the service dimension, the date dimension, and, for this KPI, the agent dimension. The DFM of the traffic is described in Table 3-5, and the diagram is shown in Figure 3-5.

Table 3-5 DFM of Traffic
Fact: Σ calls
Dimensions: Date (week, month, month of year, day of month, holiday, weekend); Service (SLA type, skillset, customer, insieme); Time (quarter, half, hour, shift); Agent (skillset, insieme, profilo)

Figure 3-5 DFM of Traffic

3.2.5 KPI 04 Abandonment Rate DFM

The Abandonment Rate DFM is designed according to the Abandonment Rate KPI. The fact corresponds to the metric, so we design the fact following the formula of the KPI. We need to calculate the result depending on time and service name, which means we design three dimensions: the time dimension, the service dimension, and the date dimension. The DFM of the abandonment rate is described in Table 3-6, and the diagram is shown in Figure 3-6.

Table 3-6 DFM of Abandonment Rate
Fact: Σ abandoned calls / total calls
Dimensions: Date (week, month, month of year, day of month, holiday, weekend); Service (SLA type, skillset, customer, insieme); Time (quarter, half, hour, shift)

Figure 3-6 DFM of Abandonment Rate

3.2.6 KPI 05 Average Talk Time DFM

The Average Talk Time DFM is designed according to the Average Talk Time KPI. The fact corresponds to the metric, so we design the fact following the formula of the KPI. We need to calculate the result depending on time and service name, which means we design the corresponding dimensions: the time dimension, the service dimension, the date dimension, and the agent dimension. The DFM of the average talk time is described in Table 3-7, and the diagram is shown in Figure 3-7.

Table 3-7 DFM of Average Talk Time
Fact: Σ call time / number of calls
Dimensions: Date (week, month, month of year, day of month, holiday, weekend); Service (SLA type, skillset, customer, insieme); Time (quarter, half, hour, shift); Agent (skillset, insieme, profilo)

Figure 3-7 DFM of Average Talk Time

3.2.7 KPI 06 Average Back-office Time DFM

The Average Back-office Time DFM is designed according to the Average Back-office Time KPI. The fact corresponds to the metric, so we design the fact following the formula of the KPI. We need to calculate the result depending on time and service name, which means we design the corresponding dimensions: the time dimension, the service dimension, the date dimension, and the agent dimension. The DFM of the back-office time is described in Table 3-8, and the diagram is shown in Figure 3-8.

Table 3-8 DFM of Average Back-office Time
Fact: Σ handle time of back-office instances / total number of back-office instances
Dimensions: Date (week, month, month of year, day of month, holiday, weekend); Service (SLA type, skillset, customer, insieme); Time (quarter, half, hour, shift); Agent (skillset, insieme, profilo)

Figure 3-8 DFM of Average Back-office Time

3.2.8 KPI 07 Delay in Queue DFM

The Delay in Queue DFM is designed according to the Delay in Queue KPI. The fact corresponds to the metric, so we design the fact following the formula of the KPI. We need to calculate the result depending on time and service name, which means we design the corresponding dimensions: the time dimension, the service

dimension, the date dimension, and the agent dimension. The DFM of the delay in queue is described in Table 3-9, and the diagram is shown in Figure 3-9.

Table 3-9 DFM of Delay in Queue
Fact: maximum of waiting time
Dimensions: Date (week, month, month of year, day of month, holiday, weekend); Service (SLA type, skillset, customer, insieme); Time (quarter, half, hour, shift); Agent (skillset, insieme, profilo)

Figure 3-9 DFM of Delay in Queue

3.3 Data mining and processing model

3.3.1 Introduction

The data mining function is the process of analyzing and extracting meaningful data from all the information in the Phonetica call records. The call record information is stored in a separate database, so according to the different services and skillsets we have to extract different data, then summarize and organize it. After the DFM modeling we know clearly which data we need. The next step is to retrieve the data from the source database and use ETL to translate it into a new database, moving each call record from the original store to the new one and storing each item of the record in the appropriate table.

3.3.2 Data warehouse design

The data extraction function mainly includes the following aspects: data analysis and data relationship analysis, and, according to the DFM framework, the design of a star schema with which we translate the business model into our system model. The Phonetica call records are stored in the database; in accordance with the DFM design, we find the corresponding data that we will use. Then we translate the DFM design into a star schema design, and following the star schema design we can finish the database design. It should be noted that in a star schema the fact table is designed according to the DFM facts, while the other dimension tables are designed according to the DFM dimensions. In the DFM model the dimensions are the parameters of the calculation: when the parameters needed to calculate a result are provided by the user, the result is obtained according to the formula recorded in the fact. For example, suppose we want to calculate the average talk time of each service on a given date. In order to calculate the average talk time we must know how many calls were answered on that date and how long the talking time of each call of the service was. At this point we already know the date dimension value (the given date) and another implicit dimension parameter (all the services). Next, according to these parameters, we query the database to obtain the number of calls and the talking time of each call; adding up every call's talking time, we get the total talk time of each service. Using the formula total talk time / number of answered calls = average talk time, we can get every result that is related to the fact function. So

we follow the star schema, combined with the DFM model, to design the database. With this design we can accurately link all the parameters needed to calculate a result, and through the relationships of the fact table we can select the other dimension tables and calculate the other parameters. The dimensions act as the properties of the tables: once we obtain a parameter, we can select the related dimension value and calculate the result. The model is described by the two figures below.

Figure 3-10 Model description (DFM fact becomes the fact table, DFM dimensions become the dimension tables; the star schema becomes the database)

Figure 3-11 Star schema of the average talk time (fact table: date-id, agent-id, service-id, time-id, talking time; Date dimension: week, month, month of year, day of month, holiday, weekend; Service dimension: SLA type, skillset, customer, insieme; Time dimension: quarter, half, hour, shift; Agent dimension: skillset, insieme, profilo)

The other data sources are designed according to their DFM models.
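To make the translation of Figure 3-11 concrete, the star schema can be written down in SQL. The following sketch uses illustrative names only (the confidential Phonetica schema differs): it creates the fact and dimension tables for the average talk time and then computes the KPI per service and month, exactly as in the worked example above.

-- Star schema for average talk time (illustrative names).
CREATE TABLE dim_date (
    date_id       INTEGER PRIMARY KEY,
    day           DATE NOT NULL,
    week          INTEGER, month INTEGER, month_of_year INTEGER,
    day_of_month  INTEGER, holiday BOOLEAN, weekend BOOLEAN
);
CREATE TABLE dim_time (
    time_id INTEGER PRIMARY KEY,
    hour INTEGER, half INTEGER, quarter INTEGER, shift VARCHAR(20)
);
CREATE TABLE dim_service (
    service_id INTEGER PRIMARY KEY,
    service VARCHAR(100), sla_type VARCHAR(20),
    skillset VARCHAR(50), customer VARCHAR(100), insieme VARCHAR(50)
);
CREATE TABLE dim_agent (
    agent_id INTEGER PRIMARY KEY,
    skillset VARCHAR(50), insieme VARCHAR(50), profilo VARCHAR(50)
);
CREATE TABLE fact_talk (
    date_id        INTEGER REFERENCES dim_date,
    time_id        INTEGER REFERENCES dim_time,
    service_id     INTEGER REFERENCES dim_service,
    agent_id       INTEGER REFERENCES dim_agent,
    talking_time_s INTEGER NOT NULL        -- talk time of one call, in seconds
);

-- Average talk time per service and month, as in the worked example.
SELECT s.service, d.month, AVG(f.talking_time_s) AS avg_talk_time_s
FROM fact_talk f
JOIN dim_service s ON s.service_id = f.service_id
JOIN dim_date d    ON d.date_id    = f.date_id
GROUP BY s.service, d.month;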

3.3.3 Data integration

The function of the ETL process is to integrate the data and store it in the new database. The first step of the ETL process is mapping the data between the source systems (the Phonetica call records) and the target database. The second step is the cleansing of the source data in a staging area. The third step is transforming the cleansed source data and loading it into the target database. We use the open source BI tool Pentaho to manage the enormous volume and velocity of the data. After the data warehouse design we already know which data we will use and how to store it; the logical model is therefore evaluated and the evaluation results are available for further processing by the downstream steps. As part of the data mining flow, the process logic model is serialized to a file. After the ETL process, all the data needed by the star schema dimensions is extracted from the other databases, so we can calculate the results based on this data. With this process design we have completed the translation of the business model into a data model. Because of trade secrets, I cannot show a real call item of the Phonetica company; I therefore describe a parallel call record. For instance, when a customer makes a phone call to Phonetica, suppose the call is answered, the client asks a question, the question is answered by the staff, and then the customer hangs up the phone. The call record looks like Table 3-10.

Table 3-10 Call record
Customer call number: ********; Is answered: yes; Call time: 10:45:20; Hang-up time: 10:50:30; Staff name: ******; Service name: ******

According to this record, the ETL process translates the call record information and stores the data in the database described in the data warehouse model. For instance, for the average talking time DFM, the database is designed as in Figure 3-11; we extract the information of this call record, transform it, and load it into those tables. The time table has six properties, so we extract the call time and the hang-up time, calculate the talking time (5 min 10 s), and store it in the fact table in the property named talking time. The next step translates the call time and stores it in the time table and the date table.
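In SQL terms, the load step of this ETL can be sketched as an INSERT ... SELECT from the transactional call records into the star schema. This is a simplified illustration of the daily batch (in the real system Pentaho Kettle performs the extraction and the dimension-key lookups), and it assumes the hypothetical call_record table of Chapter 2 and the illustrative dimension tables above:

-- Simplified load step: derive the talk-time fact from one day of call records.
INSERT INTO fact_talk (date_id, time_id, service_id, agent_id, talking_time_s)
SELECT
    d.date_id,
    t.time_id,
    s.service_id,
    a.agent_id,
    EXTRACT(EPOCH FROM c.hangup_time - c.answer_time)::int
FROM call_record c
JOIN dim_date    d ON d.day     = c.call_time::date
JOIN dim_time    t ON t.hour    = EXTRACT(HOUR   FROM c.call_time)
                  AND t.quarter = EXTRACT(MINUTE FROM c.call_time)::int / 15
JOIN dim_service s ON s.service = c.service_name
JOIN dim_agent   a ON a.agent_id = c.agent_id
WHERE c.is_answered
  AND c.call_time::date = CURRENT_DATE - 1;   -- daily batch: yesterday's calls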

The key point is that the data integration must be done automatically. According to the operating mechanism of Phonetica, to keep the data timely and accurate, we have to run the data integration day by day. Therefore, according to the time attribute of every item of the call records, we can update the data into the new database that we designed. After the ETL, the data is saved in a separate database.

3.3.4 Cubes design

A cube is a system for collecting time-stamped events and deriving metrics, and we use cubes to select the data from the database. In this project a cube is designed as a collection of measures (the DFM facts) and dimensions (the DFM dimensions). In the star schema, the fact table holds the columns from which the measures are calculated and contains references to the tables which hold the dimensions. An event in the cube is simply a JSON object with a type, a time, and arbitrary data. The cube model is illustrated in Figure 3-12.

Figure 3-12 The cube of average talking time

In the data warehouse design we solved the problem of how to ETL the data and store it in a new database whose design follows the star schema pattern; the function of the cube, then, is to retrieve the data from that database.
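As a hedged illustration of how the upper layers query such a cube, the following MDX statement retrieves the total talking time and the number of calls per month from a cube like the one in Figure 3-12; the cube and member names are hypothetical, not the actual Phonetica schema:

-- Hypothetical cube and member names, for illustration only.
SELECT
    {[Measures].[Talking Time], [Measures].[Calls]} ON COLUMNS,
    [Date].[Month].Members ON ROWS
FROM [TalkTime]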

As we can see, a cube is similar to its DFM: both have the same dimensions with the same names. The relationship between them is shown in Figure 3-13.

Figure 3-13 The relationship between DFM and cubes (data layer: Phonetica transaction DB, ETL, data warehouse; logic layer: the DFM defines the cubes, which store and retrieve the data)

3.4 Data access model

3.4.1 Introduction

The data access model provides the main data function of the system. After the cube model is designed, we retrieve the data from the cubes using MDX. MDX acts as the language to retrieve data from a relational database and present the results in a multidimensional format via a Java API; the engine translates the MDX into SQL to query the relational database, driven by the MDX query statement and the schema model file. This module mainly includes the following steps: the first step is the selection of the data sources, the second is the data source connection, the third step is the data access, the next is the data processing, and the last one is the data return. The module functions are summarized in Table 3-11.

Table 3-11 Module description
Extract data: because of the different requirements we need to select and access different cubes, and then obtain the corresponding data source.

Continue Table 3-11
Result calculated: according to the demands of the different levels, calculate the result.
Return the result: return the data source.

3.4.2 Model process

Extract data. When we want to extract data from the database we must use the cubes: we send a request to the cube, then the cube retrieves the data from the database and returns the result. This module contains several steps:
1. Configure the database: connect to the database that holds the original data; it must contain all of the original data and be updated automatically every day. In this project, this is the phone records of Phonetica.
2. Select the data source: connect to a data source using JNDI; the engine that translates the MDX language into relational SQL commands is configured to reference a specific database using the database's JNDI name. When running the Pentaho BI Platform in the JBoss application server, several steps are required to successfully configure a JNDI name for the data source.
3. Test whether the connection is successful.
4. Create an MDX query: depending on the requirement we select the related cube and choose the dimensions.
5. Calculate the results: the result is calculated according to the fact in the cube.
6. Return the result.
The program flow of this module is shown in Figure 3-14.

Figure 3-14 Extract data flow chart

Result calculated. Based on the analysis results of the previous chapter, we have all the KPIs and the finished DFM designs; how to calculate the results accurately is therefore a key point of this system, since accurately calculated results confirm whether the KPIs we selected are correct and whether we built the DFMs right. The calculation of a result includes the following steps:
1. Obtain the parameters needed for the calculation (extract data model).
2. Request the dimensions of the cube.
3. Select the current time.
4. Obtain the function.
5. Calculate the result.
6. Return the result.
This model is illustrated in Figure 3-15.

Figure 3-15 Result calculated flow chart

To describe it briefly, Table 3-12 shows the hierarchy of the tables. After we connect to the database and finish the JNDI step, the control module provides the parameters and sends the request to the result calculation model, which selects a cube to calculate the result. Next, we select the dimension named Service and choose another dimension called Time, so the system retrieves the data from the dimension tables and selects the related data in Dim_time and Dim_services. After that the model translates the result to JSON and stores it in the database.

Table 3-12 Hierarchy of the tables
Date dimension (table Dim_date), hierarchy levels: Year, Month, Week, Months, Day and month, Day in month, Day
Time dimension (table Dim_time), hierarchy levels: Hour, Half, Quarter
Service dimension (table Dim_services), hierarchy levels: Service, Insieme_azienda
Skillset dimension (table Dim_services), hierarchy level: Skillset
Customer dimension (table Dim_services), hierarchy level: Customer

3.4.3 Control model

The key point in the design of this model is how to meet the customer's need to render the dashboard quickly. In the previous tests of the dashboard model, when the user wanted to display the dashboard each result had to be calculated one by one, and every time the user viewed the dashboard again it had to be recalculated; this cost a lot of time and gave a very poor user experience. So, in order to improve the user experience, I designed this model to solve the problem. The function of this model is to automatically calculate all the parameters, send them to the result calculation model, and then receive the data sources and store them in the database; we implement this with a server process.

Retrieve the data source. Considering that this system is used through a web browser, I decided to translate the data results into JSON format, in order to improve the response time: the dashboard can receive the data source immediately and does not need to calculate it again. First we must retrieve the entire data source using the data access model. This process includes the following steps:
1. Log in to CDA.
2. Generate the parameters that need to be sent to the data access model.
3. Send the parameters to the data access model.
4. Receive the data source results.
5. Translate the results to JSON.

6. Store the result.
The program flow of this module is shown in Figure 3-16.

Figure 3-16 Store result model flow diagram

In order to store all the results of the data sources, the server accesses CDA automatically and then stores all the results in the database. The server logs into Pentaho, then sends the parameters to CDA. CDA returns the data source to the server, and the server translates the data source to JSON and stores it in the database. When CDA accesses the data in the Phonetica database, the first step is to access the cube using the MDX language; then, through JNDI, it accesses the Phonetica database and retrieves the data; the last step returns the result to CDA. The sequence diagram is shown in Figure 3-17.

Figure 3-17 Store result sequence diagram (data source, MDX, cube, JNDI, database)

In order to make sure the data in the database is updated automatically, the server must retrieve the data automatically. In the first step, we log in to Pentaho, sending the user name and password. In the second step we prepare all the parameters needed to retrieve the data source from CDA (the time, the name of the cube, the address of the CDA file), send them to the Pentaho CDA, and then retrieve the information from the CDA page. The returned information is JSON, so the third step is to analyze the JSON format and separate the JSON result. This part is described by the sequence diagram in Figure 3-18.

Figure 3-18 Retrieve data sequence diagram (login, parameter preparation, CDA call, JSON analysis, database store)

In order to compare the same service over different time periods, the server has to prepare the data source and join the different pieces of information. This part is described by the sequence diagram in Figure 3-19.

Figure 3-19 Compare result sequence diagram (data extraction, translation to the target JSON format, database store)

Interact with the dashboard model. After the first step of the control model, we already have all the data source results, so when the user wants to see the dashboard the results do not need to be calculated at that moment; what the system needs to do is supply the data sources to the dashboard model. The next step is therefore the interaction with the dashboard model. Because the database is very large, if the server returned all the results at once the dashboard module would crash. So the design is as follows: when the user selects one KPI, the dashboard system sends a request with the parameters, then the server receives the parameters, selects the related data source result, and returns it to the dashboard model. This process includes the following steps:
1. Receive the dashboard model request.
2. Analyze the parameters.
3. Connect to the database.
4. Query the database with the parameters received from the dashboard model.
5. Return the result.
6. Close the database connection.
The program flow of this module is shown in Figure 3-20.

Figure 3-20 Interact with dashboard model flow diagram

To implement the control model, the server accesses the data from the web page and uses CDA to retrieve the data from the cubes; the cubes then communicate with the database to select the data. To explain this model more clearly, the sequence diagram is shown in Figure 3-21.

Figure 3-21 Control sequence diagram (dashboard model, server control, parameter analysis, data source retrieval, database control, looped over the number of parameters)

3.5 Dashboard model

3.5.1 Introduction

The main function of the dashboard model is to display the results and, at the same time, compare them. To access a data source it sends the parameters to the store-result model, receives the data source, and then displays it. How to display the results to the user in a more intuitive and concise way, while meeting the KPI requirements, is the focus of this module's design. After consultations with the Phonetica managers, and after investigating the actual situation, we decided to distinguish the KPI information between service and skillset, and each item needs to display its detail information. This includes: yesterday; the detail comparing the last week with two weeks ago; and the detail comparing the past month with two months ago. The detail information of yesterday is divided into 24 hours, and for every hour it shows the number of phone calls of the service. The detail information of the week is separated into 7 days, and for each day it shows the number of phone calls of the service (or skillset). At the same time, note that the comparison is made on the same weekday, for instance Monday to Monday, Tuesday to Tuesday, and so on. The detail information of the month is separated into 30 (or 31) days, and for each day it shows the service's

number of phone calls. Note again that the comparison is aligned on the same weekday: for instance, if the first day of the previous month is a Monday and the first day of two months ago is a Tuesday, then the comparison starts from the first day of the last month and the second day of two months ago, and so on. The function description is shown in Table 3-13.

Table 3-13 Function description for the dashboard model
Extract data source: for the different requirements we need to select and access the related data source, and display it.
Save as Excel: for the convenience of the user.

To clarify each requirement of the dashboard, they are described one by one in Table 3-14.

Table 3-14 Description of the dashboard requirements
Show the summary information: for each KPI it displays its information, divided by service and skillset.
Show the detail information: displays the items behind the summary information.
Save as Excel: for the convenience of the user.

The summary pages and detail information are described in Table 3-15 and Table 3-16.

Table 3-15 Summary pages
Service: yesterday; last week; this month; previous month

Continue Table 3-15
Skillset: yesterday; last week; this month; previous month

Table 3-16 Detail pages
Yesterday: the detail of yesterday
Week comparison: last week compared with two weeks ago
Month comparison: the past month compared with two months ago

3.5.2 Program process

Summary page process. The user browses the summary page according to the following steps:
1. Browse the index page; it shows all the KPIs.
2. Select one KPI; it displays all the information divided into service and skillset.
3. The server queries the database, retrieves the data source, and returns it to the summary page.
4. The summary page displays the result in table format.
5. The user clicks the Excel button; it saves all the information as Excel.
The process of the summary module is shown in Figure 3-22.

Figure 3-22 Browse summary page flow diagram

Detail page process. The user browses the detail page according to the following steps:
1. The user selects one summary page.
2. Select one item; it displays all the information divided into yesterday, week comparison and month comparison.
3. The server queries the database, retrieves the data source, and returns it to the detail page.
4. The detail page displays the result in table format.
5. The user clicks the Excel button; it saves all the information as Excel.
The process of the detail module is shown in Figure 3-23.

Figure 3-23 Browse detail page flow diagram

3.5.3 Interface design

This part introduces the system page design. According to the functional and non-functional requirements, the pages should be designed concisely, be easy to use, and display the results in a suitable place. I therefore decided to use tables and line charts to show the results. As shown in Figure 3-24, in the center of the page is the name of the KPI, and on the right side is the Phonetica company logo. In the upper left corner of the page is a text description of the KPI. In the center of the page is the top panel, divided by service and skillset, whose content is the table chart. The table includes five columns; they

are: service name, yesterday, last week, this month, and previous month. The service column displays the service name; the yesterday, last week, this month and previous month columns display the number of calls. In the top right is the button whose function is to save the results as Excel.

Figure 3-24 Mock-up of the summary page

As shown in Figure 3-25, in the center of the page is the name of the KPI, and on the right side is the Phonetica company logo. In the upper left corner of the page is the name of the service. In the center of the page is the top panel, divided into yesterday, week comparison and month comparison, whose content is the line chart. The chart shows the number of calls divided by the related dimension. In the top right is the button whose function is to save the results as Excel.

Figure 3-25 Mock-up of the detail page

3.5.4 Sequence diagram

The main sequence diagrams of this system are shown in the following figures. When the user wants to browse the summary page, the first step is to open the index page and the second step is to choose one KPI; after that the system jumps to the summary page, and the summary page sends the URL request to the server. The server queries the database for the result. Finally the server returns the result to the summary dashboard. The sequence diagram for browsing the summary page is shown in Figure 3-26.

Figure 3-26 Sequence diagram of browsing the summary page (summary page, summary page control, parameter analysis, parameter sending, data source retrieval, server control, database control, looped over the number of parameters)

Browsing the detail page is similar to browsing the summary page. The sequence diagram is shown in Figure 3-27.

Figure 3-27 Detail page sequence diagram

3.6 Key techniques

3.6.1 ETL

Data. All the information is saved into relational tables of the Phonetica database server, (usually) in normalized format, which allows the data to be manipulated effectively. The call record information stored in the database is represented in a flat format that is very efficient for data storage and retrieval.

Measures and dimensions. Database systems store terabytes of data, which is just raw data and of no use from the business perspective. At the business level we need meaningful information instead of raw data. There is a quite significant difference between data and information: data are not yet information. For analysis purposes we need to convert data into information. Definition: "Information is organised data, in a form relevant for us". It means that to get information we need to extract the most valuable data and transform it into a usable form; this process is called ETL (extract, transform and load). As a result of this transformation we get measures. Measures are exactly the information you want to analyse; examples of measures are average talk time, traffic, and the number of calls. However, measures need to be qualified by a range of something: we need to create dimensions to group all the values we measure. For instance, dimensions can be date and time; a dimension allows us to set the context of a measure (in other words, the value of a filtered set of similar information), and dimensions can be organized in hierarchies. Measures classified by dimensions are organized into OLAP (online analytical processing) cubes following a specific pattern (star or snowflake). The cube represents a different approach to data analysis, allowing drilling, aggregation, decomposition and reporting of data for forecasting, budgeting, planning and other purposes.

3.6.2 MDX

MDX was introduced by Microsoft with Microsoft SQL Server OLAP Services in around 1998, as the language component of the OLE DB for OLAP API. More recently, MDX has appeared as part of the XML for Analysis API. Microsoft proposed MDX as a standard, and its adoption among application writers and other

OLAP providers is steadily increasing. MDX stands for 'multi-dimensional expressions'. It is the main query language implemented by Mondrian. The language inherited the built-in functions available in the Microsoft environment: even though Mondrian cannot interface with Visual Basic, it includes a large number of VBA functions to allow MDX queries written in a Microsoft environment to run unchanged [48]. The visual image of a SQL result set is intuitive: a collection composed of rows and columns in two-dimensional form. The visual image of an MDX result set, however, is not intuitive, since an MDX result can include more than two dimensions, so its structure is more difficult to visualize. In SQL, to reference a single data unit (a field) in this two-dimensional data, the column name and the row uniquely identify it. In MDX, instead, referencing a data unit, whether it is a single cell or a group of cells, uses a very specific and uniform syntax. SQL and MDX are very similar, but MDX is more powerful and can also be very complicated. However, since the original design intent of MDX was to provide a simple and effective way to query multidimensional data, it offers a consistent and easy-to-understand way for the user to express multi-dimensional queries. When looking at time series that exhibit strong deviations, it is sometimes hard to get the general picture of the development; a practical approach is to smooth out the values by calculating a sliding average, which MDX supports through calculated members.
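For instance, the sliding average mentioned above can be expressed directly in MDX with a calculated member. The query below is only a sketch over a hypothetical Traffic cube (all names are illustrative, not Phonetica's actual schema):

-- Hypothetical cube and member names: three-month sliding average of traffic.
WITH MEMBER [Measures].[Calls 3M Avg] AS
    'Avg(LastPeriods(3, [Date].CurrentMember), [Measures].[Calls])'
SELECT
    {[Measures].[Calls], [Measures].[Calls 3M Avg]} ON COLUMNS,
    [Date].[Month].Members ON ROWS
FROM [Traffic]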

CDE

CDE (Community Dashboard Editor) is a plugin for the Pentaho Business Analytics suite, designed to allow greater flexibility with data sources. Most available tools can only run queries over a single data source, and there is no easy way to join data coming from different databases or written in different languages (for instance, joining the data of an SQL query with the data of an MDX query); such tasks usually require an independent ETL job that synchronizes the different data sources and builds a new database. CDE is composed of three parts: CDE Layout, CDE Components and CDE Data Sources.

1) CDE Layout defines static HTML blocks that the dynamic parts of the dashboard can hook into. Think of it as providing the underlying grid for the more interesting dashboard elements. CSS and general JavaScript are also maintained in the layout section.

2) CDE Components are entities that render and display data. Charts, selector boxes and data tables are components: if a dashboard entity displays data, or requires data in order to work properly, it is a component. In the example dashboard the year selector box, the charts and the data table are components. Components are placed on the dashboard by first preparing a named space for them in the layout section, and then linking the component into that space.

3) CDE Data Sources provide data to components. They are usually parameterized SQL or MDX queries, and they have no direct visual representation on the dashboard; they are just the data rows [49].

Figure 3-28 shows the three concepts and how they relate.

Figure 3-28 The dashboard overview

CDA

CDA (Community Data Access) was developed as an abstraction layer between database connections and CDF (Community Dashboard Framework). It allows data to be retrieved from multiple data sources and combined into a single output that can easily be passed on to dashboard components. It serves three main purposes: to allow joining data from different sources; to avoid SQL injection problems within CDF; and to allow an easier way to export data from queries. CDA uses two different elements: a Connection, the database or Pentaho data source to use, and a Data Access, a query over that connection. Connections and queries are defined in an XML file (the CDA file) that lies in the solution repository. Data access is performed by calling a specific Data Access id in the CDA file. Each Data Access may have parameters, and calculated columns can be added to its results. It is also possible to define compound Data Access elements, which represent joins or unions over different queries. CDA features an API to fetch query results, an editor to edit CDA files, and a previewer to visualize the results of a query in table form. Finally, for export, query results can be returned from the API in various formats; currently supported are JSON, XML, CSV and XLS.

3.6 Brief summary

To implement this system, my solution includes three parts. The first is the analysis and design phase, which covers business process analysis (BPMN), Fit-Gap analysis, KPI analysis identifying the main entities and stakeholders based on HIGO, and the creation of a DFM for each KPI. The second is the data integration phase: the DFMs are analysed, the database is designed, and the Pentaho BI tools are used to ETL the data and store it in the database. The third is the presentation phase, which implements the KPIs as dashboards to build up the corporate performance management system.

Chapter 4 System Implementation and Testing

4.1 The environment of system implementation

In this project we use the Pentaho BI platform as the main business intelligence tool, and PostgreSQL as the database. Before this BI system can be implemented, a complete operating environment is needed, covering both the software environment and the hardware environment. The software environment is shown in Table 4-1.

Table 4-1 Software environment
Operating system: Windows 8 Professional
IDE: MyEclipse Professional 2013
Database: PostgreSQL 9.2
BI platform: Pentaho (stable)
Web server: Apache Tomcat
Experiment data: Phonetica transaction data from 2011 to 2012
Programming languages: Java, HTML5, CSS3, JavaScript, MDX, SQL
Other tools: Ext-JS, Kettle, CDE, CDA, PgAdmin, Mondrian, Spoon

In order to implement this system we need a server and a personal computer; the hardware environment is shown in Table 4-2.

Table 4-2 Hardware environment
Server: quad-core Intel Xeon CPU, 16 GB memory, 4 TB disk, Ubuntu 11.04 (server edition)
Client: Intel Core i5 CPU, 8 GB memory, WD 500 GB disk (5400 rpm)

4.2 Key program flow charts

4.2.1 Data integration

To implement data integration, the first step is to ETL the data from the Phonetica database. Following the data integration model, we first build the fact table and the dimension tables. The next step extracts from the Phonetica database each call record stored there. The last step splits the information of each record and inserts the values into the new database designed in the data integration model.

Service level

In order to describe the ETL and explain how the data is stored into the service level database, each step is shown in Table 4-3.

Table 4-3 Data integration process of service level
Create fact and dimension tables: Executes a SQL script that creates the fact and dimension tables (service level fact, date, agent, time and service) in the database.
Extract data: Extracts the answered calls and the lost calls from the Phonetica database.
SLA service lookup: Extracts the records of the services that belong to service level.
Date lookup: Obtains the date value of each call record belonging to service level, splits it, and writes it into the date and time tables.
Service lookup: Obtains the service value of each call record belonging to service level and writes it into the service table.
Agent lookup: Obtains the agent value of each call record belonging to service level and writes it into the agent table.
Store values into fact table: Stores the id values for date, service, agent and time into the fact table, together with the values from which service level is calculated.
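The first step above can be illustrated by a minimal star schema in PostgreSQL. The table and column names below are simplified, hypothetical stand-ins for the actual Phonetica schema:

-- Hypothetical, simplified star schema for the service level KPI.
CREATE TABLE date_dim (
    date_id   SERIAL PRIMARY KEY,
    full_date DATE    NOT NULL,
    day       INTEGER NOT NULL,
    month     INTEGER NOT NULL,
    year      INTEGER NOT NULL
);

CREATE TABLE time_dim (
    time_id INTEGER PRIMARY KEY,  -- e.g. one row per time slot of the day
    hour    INTEGER NOT NULL,
    minute  INTEGER NOT NULL
);

CREATE TABLE service_dim (
    service_id   SERIAL PRIMARY KEY,
    service_name VARCHAR(100) NOT NULL
);

CREATE TABLE agent_dim (
    agent_id   SERIAL PRIMARY KEY,
    agent_name VARCHAR(100) NOT NULL
);

-- One row per call; the measures allow service level to be aggregated later.
CREATE TABLE service_level_fact (
    date_id      INTEGER NOT NULL REFERENCES date_dim (date_id),
    time_id      INTEGER NOT NULL REFERENCES time_dim (time_id),
    service_id   INTEGER NOT NULL REFERENCES service_dim (service_id),
    agent_id     INTEGER REFERENCES agent_dim (agent_id),  -- NULL for lost calls
    answered     INTEGER NOT NULL,  -- 1 if the call was answered, 0 if lost
    wait_seconds INTEGER NOT NULL   -- queue time before answer or abandonment
);

The fact table keeps one row per call with foreign keys into the four dimension tables, so the upper layers can aggregate the answered flag and the waiting time along any combination of date, time, service and agent.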

Average speed of answer

In order to describe the ETL and explain how the data is stored into the average speed of answer database, each step is shown in Table 4-4.

Table 4-4 Data integration process of average speed of answer
Create fact and dimension tables: Executes a SQL script that creates the fact and dimension tables (average speed of answer fact, date, agent, time and service) in the database.
Extract data: Extracts the answered calls and the lost calls from the Phonetica database.
SLA service lookup: Extracts the records of the services that belong to average speed of answer.
Date lookup: Obtains the date value of each call record belonging to average speed of answer, splits it, and writes it into the date and time tables.
Service lookup: Obtains the service value of each call record belonging to average speed of answer and inserts it into the service table.
Agent lookup: Obtains the agent value of each call record belonging to average speed of answer and writes it into the agent table.
Store values into fact table: Stores the id values for date, service, agent and time into the fact table, together with the values from which average speed of answer is calculated.

Upselling opportunity

In order to describe the ETL and explain how the data is stored into the upselling opportunity database, each step is shown in Table 4-5.

Table 4-5 Data integration process of upselling opportunity
Create fact and dimension tables: Executes a SQL script that creates the fact and dimension tables (upselling opportunity fact, date, agent, time and service) in the database.
Extract data: Extracts the answered calls and the lost calls from the Phonetica database.
SLA service lookup: Extracts the records of the services that belong to upselling opportunity.
Date lookup: Obtains the date value of each call record belonging to upselling opportunity, splits it, and writes it into the date and time tables.
Service lookup: Obtains the service value of each call record belonging to upselling opportunity and inserts it into the service table.
Agent lookup: Obtains the agent value of each call record belonging to upselling opportunity and writes it into the agent table.
Store values into fact table: Stores the id values for date, service, agent and time into the fact table, together with the values from which upselling opportunity is calculated.

Traffic

In order to describe the ETL and explain how the data is stored into the traffic database, each step is shown in Table 4-6.

Table 4-6 Data integration process of traffic
Create fact and dimension tables: Executes a SQL script that creates the fact and dimension tables (traffic fact, date, agent, time and service) in the database.
Extract data: Extracts the answered calls and the lost calls from the Phonetica database.
SLA service lookup: Extracts the records of the services that belong to traffic.
Date lookup: Obtains the date value of each call record belonging to traffic, splits it, and writes it into the date and time tables.
Service lookup: Obtains the service value of each call record belonging to traffic and inserts it into the service table.
Agent lookup: Obtains the agent value of each call record belonging to traffic and writes it into the agent table.
Store values into fact table: Stores the id values for date, service, agent and time into the fact table, together with the values from which traffic is calculated.

Abandonment rate

In order to describe the ETL and explain how the data is stored into the abandonment rate database, each step is shown in Table 4-7.

Table 4-7 Data integration process of abandonment rate
Create fact and dimension tables: Executes a SQL script that creates the fact and dimension tables (abandonment rate fact, date, agent, time and service) in the database.
Extract data: Extracts the answered calls and the lost calls from the Phonetica database.
SLA service lookup: Extracts the records of the services that belong to abandonment rate.
Date lookup: Obtains the date value of each call record belonging to abandonment rate, splits it, and writes it into the date and time tables.
Service lookup: Obtains the service value of each call record belonging to abandonment rate and inserts it into the service table.
Agent lookup: Obtains the agent value of each call record belonging to abandonment rate and writes it into the agent table.
Store values into fact table: Stores the id values for date, service, agent and time into the fact table, together with the values from which the abandonment rate is calculated.

Average talk time

In order to describe the ETL and explain how the data is stored into the average talk time database, each step is shown in Table 4-8.

Table 4-8 Data integration process of average talk time
Create fact and dimension tables: Executes a SQL script that creates the fact and dimension tables (average talk time fact, date, agent, time and service) in the database.
Extract data: Extracts the answered calls and the lost calls from the Phonetica database.
SLA service lookup: Extracts the records of the services that belong to average talk time.
Date lookup: Obtains the date value of each call record belonging to average talk time, splits it, and writes it into the date and time tables.
Service lookup: Obtains the service value of each call record belonging to average talk time and inserts it into the service table.
Agent lookup: Obtains the agent value of each call record belonging to average talk time and writes it into the agent table.
Store values into fact table: Stores the id values for date, service, agent and time into the fact table, together with the values from which average talk time is calculated.

Back office

In order to describe the ETL and explain how the data is stored into the back office database, each step is shown in Table 4-9.

Table 4-9 Data integration process of back office
Create fact and dimension tables: Executes a SQL script that creates the fact and dimension tables (back office fact, date, agent, time and service) in the database.
Extract data: Extracts the answered calls and the lost calls from the Phonetica database.
SLA service lookup: Extracts the records of the services that belong to back office.
Date lookup: Obtains the date value of each call record belonging to back office, splits it, and writes it into the date and time tables.
Service lookup: Obtains the service value of each call record belonging to back office and inserts it into the service table.
Agent lookup: Obtains the agent value of each call record belonging to back office and writes it into the agent table.
Store values into fact table: Stores the id values for date, service, agent and time into the fact table, together with the values from which back office is calculated.

Delay in queue

In order to describe the ETL and explain how the data is stored into the delay in queue database, each step is shown in Table 4-10.

Table 4-10 Data integration process of delay in queue
Create fact and dimension tables: Executes a SQL script that creates the fact and dimension tables (delay in queue fact, date, agent, time and service) in the database.
Extract data: Extracts the answered calls and the lost calls from the Phonetica database.
SLA service lookup: Extracts the records of the services that belong to delay in queue.
Date lookup: Obtains the date value of each call record belonging to delay in queue, splits it, and writes it into the date and time tables.
Service lookup: Obtains the service value of each call record belonging to delay in queue and inserts it into the service table.
Agent lookup: Obtains the agent value of each call record belonging to delay in queue and writes it into the agent table.
Store values into fact table: Stores the id values for date, service, agent and time into the fact table, together with the values from which delay in queue is calculated.

4.2.2 Data access model

In order to retrieve data from the database, the first step is to send the parameters to CDA and select the right cube; then the MDX query over the cube is written; next, CDA retrieves the data from the database that was populated in the data integration part. The flow is shown in Figure 4-1.

Figure 4-1 Data access flow chart

The function of Pentaho CDA is to accept all the parameters and then send them, together with the MDX, to retrieve the data from the cube. Pentaho CDA is shown in Figure 4-2.

Figure 4-2 Pentaho CDA
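For illustration, a CDA Data Access of this kind might run an MDX query like the following sketch; the cube, dimension and member names are hypothetical rather than the actual Phonetica definitions, and in a real CDA file the slicer would normally be filled in from a parameter instead of being hard-coded:

// Hypothetical query: one day's calls per service.
SELECT
  NON EMPTY { [Measures].[Number of Calls] } ON COLUMNS,
  NON EMPTY [Service].[Service Name].Members ON ROWS
FROM [Traffic]
WHERE ( [Date].[2012].[10].[15] )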

4.2.3 Control model

To implement this model I use Java to build a server; its function is described in the control model section of Chapter 3. Regarding retrieving the data sources automatically: every day at 00:00 the server runs by itself. It first connects to Pentaho by sending a user name and password, then sends the parameters to CDA and retrieves the data sources one by one, using the data access model. The final step is to store all the data sources in the database; before storing them, the results must be translated into JSON, in the format described in Chapter 3. The flow is shown in Figure 4-3.

Figure 4-3 Retrieve data source flow chart

When the user wants to use the dashboard, the dashboard sends a request to connect to the server, after which the server retrieves the data source from the database. Because the database is very large, we implement it as follows: when the user selects one KPI, the dashboard system sends a request with the corresponding parameter; the server receives the parameter, selects the related data source result, and returns it to the dashboard model. The program flow of this module is shown in Figure 4-4.

Figure 4-4 Interaction with the dashboard flow chart
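One possible shape for the stored results and the parameterized lookup described above, sketched under the assumption of a hypothetical kpi_result_cache table (the real table layout and JSON format follow the design in Chapter 3):

-- Hypothetical cache of the nightly CDA results, keyed by KPI and parameters.
CREATE TABLE kpi_result_cache (
    kpi_name    VARCHAR(50)  NOT NULL,
    service     VARCHAR(100) NOT NULL,
    period      DATE         NOT NULL,
    result_json TEXT         NOT NULL,
    PRIMARY KEY (kpi_name, service, period)
);

-- The server answers a dashboard request by selecting the matching row.
SELECT result_json
FROM   kpi_result_cache
WHERE  kpi_name = 'Traffic'
  AND  service  = 'BI_TO_ABM'
  AND  period   = DATE '2012-10-01';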

4.2.4 Dashboard model

When the user wants to browse the dashboard pages, the first step is to open the index page and the second is to choose one KPI; the system then jumps to the summary page, which sends a URL request to the server. The server queries the database for the result and finally returns it to the summary dashboard. All the dashboards are implemented with Ext-JS. The flow of this module is shown in Figure 4-5.

Figure 4-5 Browse dashboard flow chart

4.3 Key interfaces of the software system

The index page is shown in Figure 4-6. When the user starts using the system, the index page is displayed first. It lists all the KPIs; if the user wants to see one of them, he can click its link. On the left is the KPI name, and on the right is the description of each KPI.

Figure 4-6 Index page

When the user selects the KPI named Traffic, the system jumps to the traffic summary page. On this page the user sees the customers divided by service and by skillset. In the center of the page is the name of the KPI, Traffic; on the right side is the Phonetica company logo; in the upper left corner of the page is a text description of the KPI. In the center of the page is the top panel, split into service and skillset parts, whose content is a table chart. The table includes six columns: service name, yesterday, daily trend, this month, previous month and monthly trend. The service column displays the service name; yesterday, this month and previous month display the results; the daily trend and monthly trend show the outcome of comparing them. In the top right is a button whose function is to save the results as Excel. The service view of Traffic is shown in Figure 4-7, and the skillset view of Traffic in Figure 4-8.

Figure 4-7 Service view of Traffic

Figure 4-8 Skillset view of Traffic

When the user wants to see the detail of a service, he can click on that service, and the system jumps to the detail page. In the center of the page is the name of the KPI; on the right side is the Phonetica company logo; in the upper left corner of the page is the name of the service. In the center of the page is the top panel, divided into yesterday, week completion and month completion, whose content is a line chart. The chart shows the result as the number of calls, separated by the related dimension. In the top right is a button whose function is to save the results as Excel. When the user selects "BI_TO_ABM", the system shows the line chart of yesterday's information, as shown in Figure 4-9.

Figure 4-9 Yesterday view of the detail page

When the user wants to see the weekly comparison information of "BI_TO_ABM", the page is as shown in Figure 4-10.

Figure 4-10 Weekly comparison view of the detail page

When the user wants to see the monthly comparison of the average talk time for "AMPLIFON_A", the page is as shown in Figure 4-11.

Figure 4-11 Monthly comparison view of the detail page

If the user wants to save the information as Excel, all he has to do is click the Export button; the system then saves the information as an Excel file, as shown in Figure 4-12.

Figure 4-12 Save as Excel

Opening the Excel file, the user can see all the information, as shown in Figure 4-13.

Figure 4-13 Excel output

4.4 System Testing

Based on the analysis of Phonetica's business processes, we developed this Performance Management System to monitor and evaluate 8 key performance indicators (KPIs) that help the managers find the best business solutions and improve their performance management. The system strategy therefore focuses on the manager as end user: managers can use this system easily, and it helps them evaluate their key performance indicators.

Test environment

The software environment is shown in Table 4-11.

Table 4-11 Test environment
Operating system: Windows 8 Professional
IDE: MyEclipse Professional 2013
Database: PostgreSQL 9.2
BI platform: Pentaho (stable)
Web server: Apache Tomcat
Experiment data: Phonetica transaction data from 2011 to 2012
Programming languages: Java, HTML5, CSS3, JavaScript, MDX, SQL
Other tools: Ext-JS, Kettle, CDE, CDA, PgAdmin
