In-memory databases from an IT audit perspective


Post-Master Thesis
In-memory databases from an IT audit perspective

Student Name: Chi Kin Cheng, MSc
Studentnumber: Thesis number: /
Date: August, 2014
Thesis Supervisor VU: Dr. Wiekram B. Tewarie RE
Second Reader: Chiel Meulendijks (Manager IT Audit PwC)

Postgraduate IT Audit, Compliance & Advisory
Faculty of Economics and Business Administration (FEWEB)
Vrije Universiteit Amsterdam

Preface

This thesis is written as part of the Postgraduate program IT Audit, Compliance and Advisory of the Faculty of Economics and Business Administration of the Vrije Universiteit Amsterdam. Its students are required to conduct research and write a thesis in the field of IT auditing in order to finalize the program. This thesis is written at the PwC Amsterdam office.

I would like to thank Wiekram Tewarie, the thesis supervisor, for his patience, support and feedback during the process of writing this thesis. Many times I had different priorities in my life and work that led to postponing and delaying the finalization of this thesis. Wiekram has kept me on track by supporting and motivating me to finish what I have started. I would like to thank the teachers and professors of the Vrije Universiteit for guiding me and teaching me valuable insights that I can use in my daily work. Furthermore, I would like to thank PwC for providing me with the information and knowledge that has enabled me to perform this research. I would like to thank Chiel Meulendijks, manager IT audit at PwC, for co-reading this thesis and providing me with interesting insights that improved the quality of this thesis. Finally, I would like to thank my family and loved ones for their patience and support. After a long time I am finally able to finish what I have started.

Chi Kin Cheng
Rotterdam / Amsterdam, The Netherlands
September

Abstract

In-memory databases (IMDB) are a type of database where data is stored entirely in the main physical memory of the system (DRAM). This is different compared to traditional databases (TDB), where data is stored via a persistent disk storage mechanism. The main advantage of IMDB is the resulting performance increase: since data is directly available in memory, IMDB can provide much better response times and transaction throughputs than TDB. In recent years, due to the dropping prices for DRAM and the increasing demand for higher database performance, several providers, such as SAP and Oracle, are introducing products based on in-memory technology to meet their clients' demand. A lot of research has been conducted into the added (business) value of in-memory databases. However, when looking at IMDB from an IT audit perspective (in terms of database continuity and database security) as part of a financial statements audit, there are not many studies conducted on the impact of IMDB on the financial statements. Therefore, the main question of this study is: do the current audit standards and principles for databases cover the risks associated with the usage of IMDB?

In-memory databases introduce a number of risks that are inherent to the technology and that are not present in a traditional database environment. Firstly, from a data continuity perspective, data stored in the main memory is volatile in nature, which increases the risk of data loss in the event of a system error or failure. Secondly, from a data security perspective, access to the main memory should be sufficiently restricted in order to prevent unauthorised access to the data. This study has noted that the two mentioned risks can be covered by the existing audit standards and principles. This is illustrated by one practical example: the SAP HANA IMDB environment. Within SAP HANA, traditional database controls, such as logging, access management and backup and recovery measures, are applicable. The theory is validated through the use of the Expert Opinion method, in which four experts in the field of IT audit are consulted. Based on interviews and open discussions, the findings of this study have been acknowledged by the experts.

This study concludes that the audit standards regarding traditional database controls are also applicable to an in-memory database environment; the audit standards also cover the risks associated with in-memory databases. The actual implementation (such as configuration settings and hardware set-up) of these controls may differ due to the technical differences between traditional databases and IMDB, but the controls can be implemented in a similar way. It is recommended that the risks be carefully considered during the definition of the IT audit approach when the audit object is an IMDB. The findings of this study can provide guidelines to properly plan the IT audit approach as part of the financial statements audit.

This study has several limitations: the technical differences between traditional databases and IMDBs are not in scope for this research and therefore any risks related to the design differences are not regarded in this research. Furthermore, this study is conducted in the context of a financial statements audit. Other types of audits, such as technical security audits or compliance audits, have different objectives compared to a financial audit and might provide new perspectives on the risks related to IMDB.

Table of Contents
1 Introduction
   Problem description
   Research Objective
   Research Question
   Research Design
   Research Scope & Limitations
   Research Relevance
   Structure of Thesis
2 In-Memory Databases
   Traditional Databases (TDB)
      The Database Concept
      The Physical Database design
      Database Risks
   In-Memory Databases
      The History of IMDB
      The need for IMDB
      The Characteristics of IMDB
   Main differences between traditional Databases and In-Memory Databases
   Risks associated with In-Memory Databases
3 Audit Standards and Principles regarding databases
   IT audit as part of the financial statements audit
   Audit standards and principles for databases
   Audit standards applied to risks associated with In-Memory Databases
   Control Implementation In-Memory Databases: Database Security
   Control Implementation In-Memory Databases: Data Continuity
   Summary of database controls in-memory databases
4 Validation
   Expert Opinion Method
   Define Problem Statement and Problem Background
   Select Experts based on a Set of Criteria
   Design Interview
   Elicit Opinion: Conduct Interview
   Aggregation of Collected Data
   Analysis and Conclusion
   Limitations and Validity Threat of the Expert Opinion Method
5 Conclusion
   Main Findings
   Limitations
   Recommendations for Further Research
References
Glossary of Terms

1 Introduction

As part of the Postgraduate IT audit course at the Vrije Universiteit Amsterdam, its students are required to conduct scientific research and write a thesis on a topic that is relevant to the Postgraduate IT audit course. The topic of this thesis is in-memory databases (IMDB) and their impact on the IT environment from a financial audit perspective. While performing financial audits for the purpose of annual reporting as an external auditor, the IT audit component in such audits is increasing in importance; financial data and transactions are processed and stored in IT systems (such as ERP solutions), therefore the work performed by the IT auditor is of increased relevance for the auditing of financial statements.

In recent years, organizations have continuously been developing and implementing new IT solutions in order to support and optimize their business processes by aligning the business strategy with the IT strategy (Business-IT alignment). Furthermore, technical developments are also impacting the design of business processes and the function of IT within an organization, such as cloud based storage and services, mobile devices, social media and the improvement of access speed to management information via the introduction of in-memory databases.

In-memory databases (IMDB), also known as main memory databases (MMDB), are a type of database where the storage of data relies on the main memory of the system; data is stored entirely in the main physical memory of the system (DRAM). This is in contrast with traditional databases (TDB), where data is stored via a disk storage mechanism and the memory is used for caching (storing temporary data) and other computational activities. The key difference between IMDB and TDB is that IMDB do not rely on persistent data storage and eliminate the sub-systems (such as storage I/O control) that support that storage. An IMDB keeps data in one place, the main memory, during data processing rather than moving it around. A TDB requires an application to read a piece of data from the database, modify it and write that data back to the database. This process requires time and CPU cycles, which inherently cannot be avoided in a traditional database. In contrast, the same process using an IMDB entails a single data transfer from the database to the application, and back again. The main advantage of IMDB is this performance increase: since data is directly available in memory, IMDB can provide much better response times and transaction throughputs compared to TDB (Garcia-Molina & Salem, 1992). The access time for main memory is orders of magnitude less than for disk storage (Garcia-Molina & Salem, 1992). Although this concept is not new (research on this subject has been conducted dating back to the 1980s, such as DeWitt et al. 1984), it is becoming more relevant since more and more companies are implementing it as part of their ERP solution. Since the market price for memory is dropping, the economic incentive to use memory for primary data storage is increasing. ERP suppliers have already made IMDB products commercially available; for instance,

SAP introduced its product SAP HANA in 2010 and Oracle offers its own product line, the Oracle TimesTen In-Memory Database. The introduction of these new solutions is mainly focused on providing added value to the client in terms of dramatically improved performance and providing aggregated and complex management information in a real-time fashion. Furthermore, a lot of research has been conducted on the added value to the business based on the advantages of IMDB compared to traditional databases (Loos et al. 2011). However, when looking at IMDB from an IT audit perspective (such as database continuity and database security), what are the differences between the traditional database and IMDB and the risks associated with this new technology? This chapter will continue with the problem description, research objective and research question in order to further explore the differences between TDB and IMDB.

1.1 Problem description

The current International Standards on Auditing (ISA) describe that IT also poses specific risks to an entity's internal control, thereby increasing the risk of material misstatements, such as unauthorized access to data or potential loss of data or inability to access data as required (ISA 315 revised, 2013). Therefore the IT environment is an important aspect of the entity's internal control environment. Furthermore, according to ISA 315.21, the external auditor is required to understand how the entity has responded to risks arising from IT, including the related business processes relevant to financial reporting, to maintain the integrity of information and security of data (ISA 315 revised, 2013). The database is a crucial component of the IT infrastructure, as the database is the place where all financial transactions are initiated, recorded, processed and reported. Therefore it is important to assess the database controls as part of the IT audit.

Since the implementation of IMDB in the business is a relatively new concept, a lot of organizations are exploring the possibilities of using IMDB for their business. On the one hand, IMDB promises a lot of (business) advantages: faster (near real-time) data processing, response time and transaction throughput compared to traditional databases. On the other hand, from an IT audit perspective, not a lot of research has been conducted yet to address the technological consequences of implementing IMDB and the differences between IMDB and the traditional databases. A lot of organizations are struggling at the moment to develop a business case for implementing IMDB. The differences between IMDB and traditional databases in terms of the database mechanism may result in new risks that are not present in the traditional databases. For instance, one potential disadvantage of IMDB is that data storage in the main memory is highly volatile and not persistent, thereby increasing the risk of data loss in the event of a system failure, whereas disk storage is considered passive/persistent and does not need active measures to store

and retain the data. As a result of the shift in technology and the inherent risks involved, this may lead to changes in the audit approach regarding database security and continuity aspects. Regarding risks at the traditional database level (extracted from the PwC Audit Guide, 2013), the following risks might apply to IMDB from an audit perspective:

- Database Security: risk of unauthorized access to the database - how is access managed in IMDB?
- Database Continuity: continuity risks such as accidental system shutdown or the incorrect restore of data - what measures are available in IMDB to assure continuity of the data?

Access to data is a routine and essential aspect of operating most computerized accounting systems (often as part of an Enterprise Resource Planning solution, ERP). The Information Technology General Controls (ITGCs) will often contribute indirectly to the achievement of many or all financial statement assertions, such as completeness (all financial transactions are recorded) and accuracy (financial transactions and other data related to recorded transactions and events are recorded appropriately). This is because effective ITGCs ensure the continued effective operation of application controls and automated accounting procedures that depend on computer processes (PwC Audit Guide, 2013, based on ISA 330). IT poses specific risks to an entity's internal control (PwC Audit Guide, 2013, based on ISA 315.A56), including:

- Reliance on systems or programs that are inaccurately processing data, processing inaccurate data, or both
- Unauthorized access to data that may result in destruction of data or improper changes to data, including the recording of unauthorized or non-existent transactions, or inaccurate recording of transactions. Particular risks may arise where multiple users access a common database
- The possibility of IT personnel gaining access privileges beyond those necessary to perform their assigned duties, thereby breaking down segregation of duties
- Unauthorized changes to data in master files
- Potential loss of data or inability to access data as required

These risks are addressed during the audit procedures performed during the IT audit, by testing the corresponding IT General Controls that cover these risks as part of the audit of the financial statements of a company. Do the differences between traditional DBMS and IMDB have an impact on the risks identified in this section? Both elements will be combined in the next sections, where the research objective and research question will be formulated.

1.2 Research Objective

This section describes the objective of this research. The general objective of this research is to contribute to the development of a theory to provide insight into the development of IMDB in

relation to the IT audit as part of the financial statements audit. Furthermore, this research also contributes to organizations that are exploring the possibilities to implement IMDB. By researching the differences between the traditional databases and IMDB, this research aims at identifying whether the current set of audit standards and principles regarding database audits is also relevant for IMDB.

1.3 Research Question

The purpose of this research is to identify the audit risks associated with IMDB. Based on these risks, the current set of audit standards and principles is used to map the current database security and database continuity controls to IMDB. Therefore, the main research question of this study is:

What existing audit standards and principles regarding database security and database continuity cover the risks associated with In-Memory Databases (IMDB)?

In order to address the main research question, the following sub-questions are formulated, divided into the following sections:

Theoretical perspective:

Section 1: In-Memory Databases
1: What are the key differences between the in-memory database and the traditional database?
2: Based on these differences, what are the specific risks associated with in-memory databases?

The answers to the abovementioned sub-questions in section 1 will lead to an identification of risks that are specifically applicable to IMDB, but not to the traditional databases.

Section 2: Audit Standards and Principles regarding databases
3: What is the current set of audit standards and principles regarding database security and database continuity that applies to the traditional databases?
4: Do these standards cover the specific risks associated with In-Memory Databases?

The research questions from sections 1 and 2 are evaluated from a theoretical perspective. The output of sections 1 and 2 will provide the theoretical background in order to answer the main research question. Thereafter, the output of sections 1 and 2 is used as a basis for validating the research from a practical perspective:

Practical perspective:

Section 3: Validate the theory in a practical environment with the help of experts

5: To what extent does the theory, as researched in sections 1 and 2, apply in a practical environment?

The validation of the theory will be conducted based on validation interviews with experts and experienced practitioners in the field of audit. The results will lead to answering the main research question from a practical perspective, yet grounded in a theoretical perspective.

1.4 Research Design

In order to provide an answer to the main research question and its sub-questions, the research will be conducted via a literature study. The answers to the sub-questions in section 2 (see chapter 1.3) will result in the mapping of the current audit standards to the identified risks from section 1. Based on the results of sections 1 and 2 (see chapter 1.3), a possible gap will be identified for auditing IMDB with the current set of audit standards. Based on the gap, recommendations will be formulated to address this gap. The output of sections 1 and 2 will be used as input to conduct the research from a practical perspective. During the practical phase, the formulated theory (the answer to the main research question) will be validated on relevance based on interviews with experts in the field of IT database audit, using the Expert Opinion Method. As a result of the research conducted from a theoretical and a practical perspective, the main research question will be answered. The conclusion and possible recommendations for further studies will be formulated.

1.5 Research Scope & Limitations

The scope of this research is exploring the main differences between IMDB and the traditional databases, the associated risks that are involved with IMDB and the mapping of these new risks to the current audit standards and principles of ISACA for database security and database continuity (ISACA, 2013). This study will focus on the differences between IMDB and traditional databases at a conceptual level; thus the (technical) architectural differences and implementation consequences will not be discussed in this study. This study will focus on the database risks and controls as part of the IT General Controls (ITGC) in the context of the audit of financial statements. The audit standards researched in the context of this study are based on the International Standards on Auditing (ISA). Several other standards were considered (such as COBIT, NIST), but these standards do not provide additional standards for databases that are not already covered by the ISA regarding the financial statements audit. Since this study is related to databases in the context of financial statement audits, it puts an emphasis on databases where financial transactions are recorded. Typically, these will be databases that are part of an ERP solution, where transactional data such as purchase orders,

invoices and journal entries are recorded. Databases that are used for management information (such as data warehouses) will not be addressed in this study. From an IT audit perspective, this study will only highlight the risks and controls regarding databases. This study is not meant to provide comprehensive guidelines on controls testing, such as how to assess design, implementation and operating effectiveness of controls, nor does it present an exhaustive list of database controls to support the financial statement audit. Furthermore, other aspects regarding database access and security will not be part of the scope of this research, as illustrated in the figure below. Only the database (represented as data in the figure below) will be the object of this study:

Figure 1-1: Layers of security (PwC Audit Guide, 2013)

All adjacent layers, such as the Application layer, the Operating System layer and the Network layers, are not in scope for this research.

1.6 Research Relevance

This research is relevant for both the academic world and the organizations that are exploring the possibilities to adopt IMDB in their own business. At the moment, there are a number of providers that are introducing IMDB as a platform as part of the ERP business solutions for their clients. Organizations are still struggling to discover and incorporate the advantages of IMDB that will help them reach a competitive advantage. This study will provide a comparison of the traditional databases and IMDB in terms of risks to facilitate organizations in their choice. Furthermore, from an IT audit perspective, the introduction of IMDB may have consequences for the way an IT audit is currently performed as part of the financial statement audit. Since the technology of data storage and retrieval in IMDB is conceptually different from the traditional DBMS, the current audit standards may not be sufficient to address the risks related to IMDBs. This exploratory study will provide insight to IT auditors, such as an external IT auditor of an accounting firm, in order to address the risks related to an IMDB during an IT audit.

1.7 Structure of Thesis

The objective of this study is to provide insight into the IMDB risks compared to the traditional database environment. To reach this objective, this thesis is divided into the following chapters:

- Chapter 1 describes the problem description and research question. The research design, research scope and research outline are defined in this chapter.
- The first part of chapter 2 provides an overview of the traditional database concept: what is a database, what are the characteristics of a database and what are the associated risks? The second part of chapter 2 describes the concept of in-memory databases in more detail. Chapter 2 concludes by comparing the differences between DBMS and IMDB and the associated risks.
- Chapter 3 discusses the understanding of an IT audit as part of the financial statements audit. The different aspects of an IT audit (the different domains of the IT general controls) will be briefly explained. The second part of this chapter describes the risks and controls in an IT audit setting when auditing a traditional database environment. The differences between IMDB and traditional databases, as identified in chapter 2, will be applied to the risks and controls in order to provide an answer to the main research question of this study.
- Chapter 4 discusses the validation of the results of chapter 3. The validation is performed through peer review. The output of this validation is feedback from the interviewees regarding the validity and reliability of the results of this study.
- Chapter 5 presents the conclusion and main findings of this study, as well as the limitations of this research, and also provides some recommendations for further research.

2 In-Memory Databases

In this section, the following two sub-questions will be answered, as described in the previous chapter:

1: What are the key differences between In-Memory Databases and the traditional database management systems?
2: Based on these differences, what are the specific risks associated with In-Memory Databases?

Based on the answers to these two research sub-questions, the main (conceptual) differences between IMDB and the traditional DBMS will be identified and the risks associated with In-Memory Databases will be highlighted. Before focusing on answering the sub-questions, this chapter will commence with describing the characteristics of the traditional DBMS. Thereafter, this chapter will continue with providing an understanding of the IMDB and the differences between traditional databases and IMDBs. This chapter will conclude with identifying the risks that are associated with In-Memory Databases, based on the differences with the traditional databases, in order to provide an answer to the sub-questions.

2.1 Traditional Databases (TDB)

This section describes the characteristics of the traditional database management systems, the use of a database and the supporting software known as the database management system that controls and oversees the database environment (Gillenson, 2012). The relation to risks related to the audit of financial statements is described in the next chapter. A database is a concept that represents the collection of organized information into data, assembled in a structured way to serve a particular purpose (Gillenson, 2012). The database provides a manner to manipulate, store and retrieve data in an effective way with the use of specialised software called a database management system (DBMS). According to Elmasri & Navathe (2006), the DBMS is a collection of software applications that allows the user to define, construct, manipulate and share databases amongst various users and applications. The collection of different databases and the supporting database management systems is called the database environment. A database environment is designed to allow storage of large volumes of data and provides access to data, both to applications (such as Enterprise Resource Planning software solutions) and to end-users. Furthermore, the database environment also supports a variety of functionalities, such as data sharing, controlling data redundancy and data accuracy improvement, and provides tools to control data security/privacy and business continuity measures, such as back-up and recovery (Gillenson, 2012). The database as a concept is further elaborated in the following section.
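To make the notion of a DBMS that lets users define, construct and manipulate a database more concrete, the following minimal sketch uses Python's built-in sqlite3 module. The module choice and the journal_entries table are illustrative assumptions made for this thesis context; they are not taken from any product discussed in this study.

```python
import sqlite3

# Illustrative only: a small, file-backed (disk-based) database, as in a traditional DBMS.
conn = sqlite3.connect("finance.db")

# Define and construct: create a table in which an ERP-style application could record transactions.
conn.execute("""
    CREATE TABLE IF NOT EXISTS journal_entries (
        entry_id INTEGER PRIMARY KEY,
        account  TEXT NOT NULL,
        amount   REAL NOT NULL,
        posted   TEXT NOT NULL
    )
""")

# Manipulate: insert and retrieve data through the DBMS instead of touching the storage directly.
conn.execute(
    "INSERT INTO journal_entries (account, amount, posted) VALUES (?, ?, ?)",
    ("1000 Cash", 250.00, "2014-08-01"),
)
conn.commit()

for row in conn.execute("SELECT entry_id, account, amount, posted FROM journal_entries"):
    print(row)

conn.close()
```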

2.1.1 The Database Concept

The database concept is not an isolated element, but is part of the information systems environment. This environment consists of a collection of components such as hardware, networks, application software, systems software and the database. The database concept allows access to data by an Enterprise Resource Planning (ERP) system as part of an enterprise resource. The ERP system is a collection of applications that supports the business processes of an organization, such as purchasing, sales and the financial administration. The database of an ERP provides centralised access to shared data within the database and therefore supports multiple business processes from a centralized enterprise resource. There are two key principles that are relevant in the database environment in relation to an ERP system: Data Integration and Data Redundancy. Data integration is related to the ability to combine different data elements within an ERP environment. For instance, supplier master data (such as the supplier's name, address and payment related information) is related to a purchase order that has been placed at this supplier (the types of items ordered, quantity and price information). Data integration allows the data to be combined and made accessible, therefore providing answers to questions such as: what items were purchased at this particular supplier in the past month? Data redundancy refers to information that has been stored more than once in an information system. This is the exact opposite of data integration, as data redundancy is the storage of redundant data that leads to a waste of precious storage space. Therefore the key is to optimize data integration and minimize data redundancy, which results in more efficient storage and shorter seek/response times.

2.1.2 The Physical Database design

This section describes the physical design of the database concept. Typically, computers execute programs and process data using the CPU and store temporary data in the main memory during processing. Since the main memory is usually very fast, this allows the performance of data processing to reach a desirable level (Garcia-Molina & Salem, 1992). However, there are several disadvantages within this concept. First, the main memory is relatively expensive and therefore not suitable to install in large quantities in order to support the vast volume of business data. Secondly, main memory is not portable and therefore not transferable. Lastly, the main memory is not designed for persistent storage of data and therefore the data in the main memory is highly volatile; if the power source is disrupted, the data in the main memory is lost. Therefore, in a traditional database design, the data is stored on a secondary memory device, such as a physical disk (magnetic disks) or a portable medium such as magnetic tapes and optical media. Magnetic disks are often arranged in one or more platters on which data can be stored magnetically. Access to data on magnetic disks is handled using specialised I/O algorithms that handle the basic retrieval and manipulation of data, such as read, insert, delete and update activities.
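As an illustration of the data integration example above (combining supplier master data with purchase orders to answer which items were purchased at a supplier in the past month), the following sketch again uses Python's sqlite3 module. The table layout, names and figures are hypothetical and only serve to show the idea of a join across integrated data.

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # any relational DBMS would serve; in-memory here only for brevity
conn.executescript("""
    CREATE TABLE suppliers (
        supplier_id INTEGER PRIMARY KEY,
        name        TEXT,
        city        TEXT
    );
    CREATE TABLE purchase_orders (
        order_id    INTEGER PRIMARY KEY,
        supplier_id INTEGER REFERENCES suppliers(supplier_id),
        item        TEXT,
        quantity    INTEGER,
        price       REAL,
        order_date  TEXT
    );
    INSERT INTO suppliers VALUES (1, 'Example Supplies B.V.', 'Amsterdam');
    INSERT INTO purchase_orders VALUES (10, 1, 'Paper', 500, 0.02,  '2014-07-15');
    INSERT INTO purchase_orders VALUES (11, 1, 'Toner',  12, 39.95, '2014-07-28');
""")

# Data integration in practice: join supplier master data with transactional purchase orders
# to list the items ordered at one supplier during the past month (relative to a reporting date).
query = """
    SELECT s.name, p.item, p.quantity, p.price, p.order_date
    FROM   purchase_orders AS p
    JOIN   suppliers       AS s ON s.supplier_id = p.supplier_id
    WHERE  s.name = ?
      AND  p.order_date >= date(?, '-1 month')
"""
for row in conn.execute(query, ("Example Supplies B.V.", "2014-08-01")):
    print(row)
```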

2.1.3 Database Risks

In order to have a functioning operational database environment, there are certain risk areas regarding database control that need to be addressed. These issues can be classified into the following two areas:

- Database security
- Database continuity

This section will provide more detail regarding these two risk areas in general. In the next chapter, these risk areas will be related to actual risks and controls relevant to the financial statement audit. According to Sandhu & Jajodia (1993), and in line with international standards such as COBIT (2005), the objective of database security and database continuity is divided into the following three quality aspects:

- Data Confidentiality is concerned with unauthorized disclosure of data and confidentiality of sensitive data;
- Data Integrity is concerned with consistency (completeness and accuracy) of data, granting minimum access to data in order to prevent (un)intended modification or deletion of data;
- Data Availability is concerned with availability of data and the continuity of the operational database, for instance by means of adding redundancy.

Database Security

Data is considered to be a corporate resource and an important asset of an organization. Data is critical for an organization to conduct business. Sensitive data, such as customer personal data, needs to be protected and should not be accessible to the rest of the world. It is the organization's responsibility to protect the data. Furthermore, even within an organization, data such as privileged data should usually not be accessible to everyone within the organization. Therefore, it is important to have a secured database environment in order to reduce the probability of unauthorized data access. In general, database security is concerned with protecting data from theft, unauthorized deletion and unauthorized modification. To protect a database environment, database security controls can be designed and implemented into the database environment. There are different types of database security controls, each protecting against its corresponding risk, such as physical security controls (to prevent unauthorized physical access to a database), network security controls (to prevent unauthorized access at the network level, for instance with the use of firewalls to block access) and security controls at the database level (such as access controls on database level). The collection of database security controls implemented is used to ensure the data confidentiality and data integrity of the database. For the database controls regarding database security from an audit perspective, see chapter 3.
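The idea that a user should only be able to perform the database operations for which that user is authorized can be sketched as follows. This is a deliberately simplified, hypothetical illustration in Python: the role names and the permission check live in application code here, whereas a real DBMS typically enforces such restrictions itself, for example through privileges granted to database users.

```python
import sqlite3

# Hypothetical role model: which database operations each role may perform (least privilege).
ROLE_PERMISSIONS = {
    "auditor":    {"SELECT"},                              # read-only access
    "accountant": {"SELECT", "INSERT"},                    # may record new transactions
    "dba":        {"SELECT", "INSERT", "UPDATE", "DELETE"},
}

def execute_as(conn, role, sql, params=()):
    """Run a statement only if the role is permitted to perform that type of operation."""
    operation = sql.strip().split()[0].upper()
    if operation not in ROLE_PERMISSIONS.get(role, set()):
        raise PermissionError(f"role '{role}' is not authorized to {operation}")
    return conn.execute(sql, params)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE journal_entries (entry_id INTEGER PRIMARY KEY, account TEXT, amount REAL)")

execute_as(conn, "accountant", "INSERT INTO journal_entries (account, amount) VALUES (?, ?)", ("1000 Cash", 100.0))
print(execute_as(conn, "auditor", "SELECT * FROM journal_entries").fetchall())

try:
    execute_as(conn, "auditor", "DELETE FROM journal_entries")   # an unauthorized modification attempt
except PermissionError as exc:
    print(exc)                                                   # the control blocks the operation
```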

Database Continuity

The main objective regarding database continuity is to prevent potential loss of data or inability to access data as required (PwC Audit Guide, 2014). Regardless of how the database is designed, the risk exists that an event can occur that can affect or even destroy data (Gillenson, 2012). Such an event can range from erroneous user input (leading to corruption of the data in the database) and interruption of the power supply (due to a power black-out) to a full-blown disaster (destroying the physical database). Elmasri & Navathe (2006) have pointed out several types of failures that may occur, which can be classified as transaction failures (a transactional error such as division by zero, an integer overflow or a concurrency failure), system failures (a hardware, software or network error occurs during database transaction execution) or media failures (ranging from disk failure and physical problems such as power or air-conditioning errors to catastrophes such as fire or flood). Therefore, database recovery techniques can be designed and implemented to ensure the continuity of the database. In case of transaction failures, a system log can be kept in the database, which records the transactions and the changes they applied to data items. In case of transaction errors, the system log can be used to recover from the failure by rolling back the transaction that has caused the failure. In case of system and media failures, a database back-up can be used to restore the database. A database back-up can be made by periodically copying the database and the system log to an external location, such as a back-up database server or an external medium, such as a magnetic tape that can be stored in a secure location. The database back-up can be used to reload the data into the primary database and therefore restore the data. The collection of database continuity controls implemented is used to ensure the data integrity and data availability of the database. For the database controls regarding database continuity from an audit perspective, see chapter 3.

2.2 In-Memory Databases

This section describes the characteristics of In-Memory Databases (IMDB). This section will commence with describing the history of IMDB and the necessity of IMDB in order to meet the increasing demands on the database environment in terms of data volume, data accessibility and speed. Furthermore, this section will elaborate on the technological characteristics and the advantages of IMDB. The next section will focus on the main differences between traditional databases and In-Memory Databases.

2.2.1 The History of IMDB
The need for IMDB

There are several drivers that play important roles in the introduction of IMDBs. One of the reasons is that the market demands more and more of the database environment in order to keep up with the technological advancements (Plattner & Zeier, 2012). Modern hardware is subject to change and continuous innovation in order to meet the business requirements. These drivers can

be categorized into two perspectives: a technical perspective and a business perspective.

From a technical perspective, the emergence of the client-server architecture (introduced in the 80s) allows the application to be run on multiple, relatively cheap clients connected to a central application server and database server. This is called the three-tier architecture. All processing activities requested from the clients (tier 1) are handled by the application server (tier 2). The database server (tier 3) handles all incoming data requests, such as insert, read and delete commands, and processes these requests on the physical database through the DBMS. For a logical overview of the three-tier architecture, see the diagram below:

Diagram 2-1: Three-Tier Architecture

The database soon became the bottleneck: receiving data from and transmitting data to the increasing number of application servers created a huge performance bottleneck (Word, 2013). The bottleneck is not caused by the processing power of the database server: the problem lies in the Input/Output (I/O) of databases. All data handling (such as read, insert, delete and update activities) is processed through the database I/O, which proved to be the bottleneck of the database process.

From a business perspective, the demand for reporting data is also increasing in the current fast-moving business environments. Enterprises require access to their data in order to analyse it in a timely manner. However, the traditional databases are designed to process operational

data via transactions in an efficient manner based on so-called Online Transaction Processing (OLTP). The concept of OLTP is to minimize the volume of data entry and speed up data handling (insert, delete, update) by using a high degree of normalization of the data in order to minimize redundancy (Elmasri & Navathe, 2006). This means that data is stored in multiple tables across the database, where the data is often stored according to an entity-relational model. For enterprise reporting purposes, data needs to be retrieved and joined from multiple tables and multiple reads from the database are required, which further impacts the performance of the database in a negative manner. For the purpose of performing analytics on data, Online Analytical Processing (OLAP) systems were developed. The data in an OLAP system is defined in a different data structure compared to the OLTP traditional database, often in a dimensional data model. The OLAP system is designed as a separate environment solely for the purpose of speeding up the processing of complex analytical queries. The database as part of the OLAP system is typically called a data mart or enterprise data warehouse. All transactional data from the database is transferred and loaded into the OLAP data warehouse via the so-called ETL process (Extract, Transform & Load), where data is sorted, aggregated and deduplicated (Vassiliadis et al, 2002). The traditional OLTP database is focused on processing all transactional data, whilst the OLAP system handles all reporting of data via the processing of analytical queries. The OLAP system provides the end-user with a snapshot of the OLTP database, depending on the frequency of the (usually scheduled) ETL process that refreshes the OLAP system. The relation between OLTP and OLAP is displayed in the diagram below:

Diagram 2-2: OLTP and OLAP (Qikkwit, 2014)

However, organizations were creating quantities of data that were getting larger and larger due to business processes getting more and more automated. This created a vicious circle: since more and more data was created and becoming available, more end-users demanded more data to use

and as a result, leading to the creation of even more data. Furthermore, end-users demanded more and more up-to-date data (preferably in real time) in order to support their business decisions (Word, 2013), whereas the OLAP system could only present data as a snapshot of the last ETL process. The combination of OLTP databases for handling transactions and OLAP systems for reporting purposes was no longer sufficient for the rapid increase in data creation and (real-time) data demand: one is unable to do real-time analytics, as analytical queries are posed against a copy of the data in the OLAP system that does not include the latest transactions.

2.2.2 The Characteristics of IMDB

The concept of an in-memory database dates back to the early 1980s, when the possibilities of using main memory as a database were researched, see DeWitt et al. (1984). The advantage of using main memory as a database was soon proven to be evident (see DeWitt et al, 1984): the access and read times for main memory are dramatically lower compared to disk. For the comparison between main memory and disk, see the table below:

Action                     Main Memory    Disk
Data Access                100 ns         ... ns
Read 1 MB sequentially     ... ns         ... ns

Table 2-3: Access and Read times for main memory compared to disk (Plattner & Zeier, 2012).

However, the commercial implementation of databases based on in-memory technology is relatively new. For instance, major ERP provider SAP developed specific algorithms based on in-memory technology, applied to its logistics planning module in the early 2000s to increase performance. Furthermore, the users of the SAP BW environment (SAP BW is the SAP OLAP module for analytics and reporting as part of the SAP ERP solution) were having significant performance issues in the mid-2000s, since analytical reporting procedures on the SAP BW OLAP environments were taking a long time (several hours or even days), even though the OLAP system was already specifically designed for analytical reporting purposes. The turning point was the continuous decrease of the price for storage: the prices for main memory and disk storage have decreased in line with Moore's Law, to the point that the price for main memory DRAM (Dynamic Random Access Memory) was becoming inexpensive enough to provide an affordable option for usage in large enterprise application systems (Plattner & Zeier, 2012). Therefore the cost : size ratio for main memory provided an excellent cost-efficient opportunity to explore the possibilities for commercial use of in-memory technology. Large main memory is becoming cheap enough to load all information needed and therefore speed up the transactions (in the OLTP environment) and analytics (in the OLAP environment) significantly, which leads to cost reduction. Therefore, different suppliers (from ERP application providers to database developers) are currently developing and implementing solutions based on

in-memory technology. The following section provides a short overview of the in-memory solutions commercially available at the moment:

SAP HANA

ERP solutions provider SAP AG has introduced its SAP HANA platform, an in-memory database management system. HANA is an acronym for High-Performance Analytic Appliance. HANA was introduced in 2010 and initially positioned to perform specific (niche) functionalities using IMDB. The performance increase of IMDB was a key selling point and used as a business case for its customers (Word, 2013). In 2011, the SAP HANA platform was made available for the SAP NetWeaver BW environment (OLAP system) in order to speed up the analytical procedures. The SAP ERP transactional data (OLTP) is loaded into the SAP BW environment, which runs on an IMDB. The next step would be to move the SAP ERP environment to IMDB as well, creating one single platform where transactional as well as analytical data is readily available, diminishing the need for a separate enterprise data warehouse (EDW), see the diagram below:

Diagram 2-4: the Vision of SAP HANA as a single platform for all business processing (Vital BI, 2011)

Oracle TimesTen

Oracle is well known for its traditional object-relational database management systems, supporting many applications and ERP solutions. In the mid-2000s, Oracle acquired the company TimesTen, which is specialized in IMDBs. Since then, Oracle has been able to provide an IMDB solution for its customers, named Oracle TimesTen. This in-memory database was designed to store all data needed in the physical memory, unlike the disk-based databases of the traditional Oracle Database. In 2012, Oracle introduced Oracle Exalytics to its customers. Oracle Exalytics provides an in-memory BI machine that delivers fast performance for business intelligence and planning applications (Oracle, 2014), similar to the SAP BW HANA platform.
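The two properties that recur in the discussion above and in Table 2-3, much faster access when data lives in main memory and the continued need for a persistent copy on disk, can be observed with a small, vendor-neutral sketch. It uses Python's sqlite3 in-memory mode purely as an illustration; SAP HANA and Oracle TimesTen use their own engines, and real IMDB products combine snapshots with transaction logging on non-volatile media, as discussed in the next sections.

```python
import sqlite3
import time

def insert_rows(conn, rows=20_000):
    """Create a table and time how long it takes to insert a batch of rows."""
    conn.execute("DROP TABLE IF EXISTS t")
    conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, payload TEXT)")
    start = time.perf_counter()
    conn.executemany("INSERT INTO t (payload) VALUES (?)", (("x" * 100,) for _ in range(rows)))
    conn.commit()
    return time.perf_counter() - start

in_memory = sqlite3.connect(":memory:")   # data lives entirely in DRAM
on_disk   = sqlite3.connect("demo.db")    # data is persisted to a file on disk

print(f"in-memory inserts: {insert_rows(in_memory):.3f} s")
print(f"on-disk inserts  : {insert_rows(on_disk):.3f} s")

# Main memory is volatile, so the in-memory working copy is periodically snapshotted
# to disk; after a failure the snapshot can be loaded back into a fresh in-memory database.
with sqlite3.connect("snapshot.db") as snapshot:
    in_memory.backup(snapshot)            # Connection.backup() requires Python 3.7+

restored = sqlite3.connect(":memory:")
with sqlite3.connect("snapshot.db") as snapshot:
    snapshot.backup(restored)
print("rows restored:", restored.execute("SELECT COUNT(*) FROM t").fetchone()[0])
```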

The following section will provide a summary of the main differences between traditional databases and in-memory databases.

2.3 Main differences between traditional Databases and In-Memory Databases

In order to determine the risks involved with the use of an IMDB, this section will describe the traditional databases and IMDB. The focus will be on the main differences between the traditional databases and IMDB. Based on the noted differences, the research sub-questions as described in chapter 1 will be answered in this section.

As discussed earlier in section 2.2.1, in a traditional client-server database system environment (Elmasri & Navathe, 2006), the end-user interacts with the application (such as an ERP) through one of the clients. User interactions with the application include data processing activities such as reading and writing information in the application. The user interactions in the application are passed to the database at the server side and processed in the database. A DBMS is a collection of programs that enable users to create and maintain a database (Elmasri & Navathe, 2006). The data is stored on persistent disk-based media and maintained in a database server. In this traditional set-up, access to data (I/O) is the bottleneck: every user interaction at the application level, such as querying for data, has to go through the DBMS and is subsequently searched for on disk. When the data is acquired, it follows the same path back to the application and is made available to the user. This process can be optimized by using database caching, whereby frequently accessed data is stored temporarily in the database cache, which eliminates the access and seek time on disk. However, for complex queries and processing large amounts of data, disk reads are still required.

In an in-memory database environment (Plattner & Zeier, 2012), the I/O sequences between the application and the database (disk) are entirely eliminated: IMDB uses the main memory as the primary storage. The main drivers for using IMDB are the steadily falling prices of main memory, increasing CPU processing power and the increasing demand for real-time analytics and computation on the fly. The differences are highlighted in the following illustration (Gartner, 2012):

Fig 2-5: Difference between traditional and in-memory computing (Gartner, 2012)

The diagram above displays the main differences between traditional and in-memory computing. (Please note that the terms in-memory database and in-memory computing are used interchangeably, since in-memory computing makes use of in-memory databases). The diagram shows the traditional approach of computing: the application is running on the application server (which consists of the CPU and main memory, among other elements) and communicates (I/O) with an external database. This external database holds the actual data (as database records). The yellow arrows represent the I/O communication lines between the application and the database disk. The in-memory computing on the right-hand side of the illustration displays the IMDB situation: the database resides in the main memory and is therefore accessed directly by the application.

However, the in-memory database cannot fully replace the traditional disk-based database: the main memory is a volatile storage medium and is still prone to system failures, such as system errors or power disruptions. This issue is also recognized in the research of Garcia-Molina & Salem (1992): although the probability of failure can be reduced by using techniques such as battery-backed-up memory, uninterruptible power supplies, error detection/correction and memory redundancy, the probability can never be zero. Thus, one will always have to have a back-up copy of the database, probably on disk. Therefore, most IMDBs are also equipped with a traditional database for back-up and recovery purposes. Different approaches exist for backing up an IMDB environment, such as recording transaction logs on non-volatile media or periodic full data replications.

Main memory has different properties from those of traditional database disks, which have implications for the design and performance of the database system. According to Garcia-Molina & Salem (1992), amongst other researchers, the main differences are summarized below:

(1) Access method: the data in the main memory is accessed directly by the CPU, while data on disk is not;
(2) Data Volatility: main memory is highly volatile, while disk storage is persistent;
(3) Access time / Performance: the access time for main memory is orders of magnitude less than for disk storage, by eliminating the need for I/O to perform database transactions. This performance increase is also confirmed by other studies, such as Raja et al. (2008);
(4) Cost per Access: disk storage has a high, fixed cost per access compared to access from main memory. Disk-based access is usually sequential in TDB. Data in memory is accessed randomly; however, this does not pose a disadvantage, as access to memory is much faster compared to TDB, see also (3);
(5) Data structure / Architecture: since IMDB and the traditional DBMS are conceptually different, these differences have architectural consequences and lead to differences in data structures, such as entity-relational databases in traditional databases versus the column-oriented data structures commonly used in IMDBs. As described in the research scoping section, these technical differences in architecture will not be in scope for this study.

In this section, the main differences between the traditional DBMS and the in-memory databases (IMDB) are explored and summarized. The summary above provides the answer to sub-question 1 as described in the beginning of this chapter. In the next section, the risks associated with these database techniques will be explored in more detail.

2.4 Risks associated with In-Memory Databases

In this section, the differences between IMDB and traditional databases (as identified in the previous section) are analysed and grouped from a risk-based perspective. These risks can be grouped into two categories: data security and database continuity, based on the database control issues as described in section 2.1.3. The main differences in the previous sections are linked to one of these categories. Previously, we have noted five main differences between traditional databases and IMDB. In the next chapter, the identified risks will be linked to the audit standards and principles regarding databases.

Database Security in In-Memory Database

Access Method

As discussed in the previous section, one of the differences between TDB and IMDB is the (1) access method. In case of the traditional DBMS, the data is stored on a disk-based storage device. Typical access controls are in place that ensure that a user is only authorized to perform database operations for which that user is permitted (Sandhu & Jajodia, 1993). These access controls are


More information

In-memory Tables Technology overview and solutions

In-memory Tables Technology overview and solutions In-memory Tables Technology overview and solutions My mainframe is my business. My business relies on MIPS. Verna Bartlett Head of Marketing Gary Weinhold Systems Analyst Agenda Introduction to in-memory

More information

Outline. Failure Types

Outline. Failure Types Outline Database Management and Tuning Johann Gamper Free University of Bozen-Bolzano Faculty of Computer Science IDSE Unit 11 1 2 Conclusion Acknowledgements: The slides are provided by Nikolaus Augsten

More information

IS IN-MEMORY COMPUTING MAKING THE MOVE TO PRIME TIME?

IS IN-MEMORY COMPUTING MAKING THE MOVE TO PRIME TIME? IS IN-MEMORY COMPUTING MAKING THE MOVE TO PRIME TIME? EMC and Intel work with multiple in-memory solutions to make your databases fly Thanks to cheaper random access memory (RAM) and improved technology,

More information

Certified Information Systems Auditor (CISA)

Certified Information Systems Auditor (CISA) Certified Information Systems Auditor (CISA) Course Introduction Course Introduction Module 01 - The Process of Auditing Information Systems Lesson 1: Management of the Audit Function Organization of the

More information

Speed and Persistence for Real-Time Transactions

Speed and Persistence for Real-Time Transactions Speed and Persistence for Real-Time Transactions by TimesTen and Solid Data Systems July 2002 Table of Contents Abstract 1 Who Needs Speed and Persistence 2 The Reference Architecture 3 Benchmark Results

More information

CHAPTER - 5 CONCLUSIONS / IMP. FINDINGS

CHAPTER - 5 CONCLUSIONS / IMP. FINDINGS CHAPTER - 5 CONCLUSIONS / IMP. FINDINGS In today's scenario data warehouse plays a crucial role in order to perform important operations. Different indexing techniques has been used and analyzed using

More information

Improving Business for SMEs with Online Backup Improving Business for SMEs with Online Backup

Improving Business for SMEs with Online Backup Improving Business for SMEs with Online Backup Improving Business for SMEs with Online Backup www.cloudsecure.co.uk/cloudsecure 1 Accountants and Solicitors Firms Professional organisations such as accountancy and solicitors firms have an ever increasing

More information

Disk-to-Disk-to-Tape (D2D2T)

Disk-to-Disk-to-Tape (D2D2T) Where Disk Fits into Backup Tape originated in the 1950 s as the primary storage device for computers. It was one of the first ways to store data beyond the memory of a computer, which at the time was

More information

Preview of Oracle Database 12c In-Memory Option. Copyright 2013, Oracle and/or its affiliates. All rights reserved.

Preview of Oracle Database 12c In-Memory Option. Copyright 2013, Oracle and/or its affiliates. All rights reserved. Preview of Oracle Database 12c In-Memory Option 1 The following is intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated into any

More information

FIFTH EDITION. Oracle Essentials. Rick Greenwald, Robert Stackowiak, and. Jonathan Stern O'REILLY" Tokyo. Koln Sebastopol. Cambridge Farnham.

FIFTH EDITION. Oracle Essentials. Rick Greenwald, Robert Stackowiak, and. Jonathan Stern O'REILLY Tokyo. Koln Sebastopol. Cambridge Farnham. FIFTH EDITION Oracle Essentials Rick Greenwald, Robert Stackowiak, and Jonathan Stern O'REILLY" Beijing Cambridge Farnham Koln Sebastopol Tokyo _ Table of Contents Preface xiii 1. Introducing Oracle 1

More information

MAS 200. MAS 200 for SQL Server Introduction and Overview

MAS 200. MAS 200 for SQL Server Introduction and Overview MAS 200 MAS 200 for SQL Server Introduction and Overview March 2005 1 TABLE OF CONTENTS Introduction... 3 Business Applications and Appropriate Technology... 3 Industry Standard...3 Rapid Deployment...4

More information

Taming Big Data Storage with Crossroads Systems StrongBox

Taming Big Data Storage with Crossroads Systems StrongBox BRAD JOHNS CONSULTING L.L.C Taming Big Data Storage with Crossroads Systems StrongBox Sponsored by Crossroads Systems 2013 Brad Johns Consulting L.L.C Table of Contents Taming Big Data Storage with Crossroads

More information

IBM DB2 Near-Line Storage Solution for SAP NetWeaver BW

IBM DB2 Near-Line Storage Solution for SAP NetWeaver BW IBM DB2 Near-Line Storage Solution for SAP NetWeaver BW A high-performance solution based on IBM DB2 with BLU Acceleration Highlights Help reduce costs by moving infrequently used to cost-effective systems

More information

The Benefits of Continuous Data Protection (CDP) for IBM i and AIX Environments

The Benefits of Continuous Data Protection (CDP) for IBM i and AIX Environments The Benefits of Continuous Data Protection (CDP) for IBM i and AIX Environments New flexible technologies enable quick and easy recovery of data to any point in time. Introduction Downtime and data loss

More information

Projected Cost Analysis of the SAP HANA Platform

Projected Cost Analysis of the SAP HANA Platform A Forrester Total Economic Impact Study Commissioned By SAP Project Director: Shaheen Parks April 2014 Projected Cost Analysis of the SAP HANA Platform Cost Savings Enabled By Transitioning to the SAP

More information

ICOM 6005 Database Management Systems Design. Dr. Manuel Rodríguez Martínez Electrical and Computer Engineering Department Lecture 2 August 23, 2001

ICOM 6005 Database Management Systems Design. Dr. Manuel Rodríguez Martínez Electrical and Computer Engineering Department Lecture 2 August 23, 2001 ICOM 6005 Database Management Systems Design Dr. Manuel Rodríguez Martínez Electrical and Computer Engineering Department Lecture 2 August 23, 2001 Readings Read Chapter 1 of text book ICOM 6005 Dr. Manuel

More information

Memory-Centric Database Acceleration

Memory-Centric Database Acceleration Memory-Centric Database Acceleration Achieving an Order of Magnitude Increase in Database Performance A FedCentric Technologies White Paper September 2007 Executive Summary Businesses are facing daunting

More information

ENTERPRISE RESOURCE PLANNING SYSTEMS

ENTERPRISE RESOURCE PLANNING SYSTEMS CHAPTER ENTERPRISE RESOURCE PLANNING SYSTEMS This chapter introduces an approach to information system development that represents the next step on a continuum that began with stand-alone applications,

More information

Alexander Nikov. 5. Database Systems and Managing Data Resources. Learning Objectives. RR Donnelley Tries to Master Its Data

Alexander Nikov. 5. Database Systems and Managing Data Resources. Learning Objectives. RR Donnelley Tries to Master Its Data INFO 1500 Introduction to IT Fundamentals 5. Database Systems and Managing Data Resources Learning Objectives 1. Describe how the problems of managing data resources in a traditional file environment are

More information

SAP HANA Live for SAP Business Suite. David Richert Presales Expert BI & EIM May 29, 2013

SAP HANA Live for SAP Business Suite. David Richert Presales Expert BI & EIM May 29, 2013 SAP HANA Live for SAP Business Suite David Richert Presales Expert BI & EIM May 29, 2013 Agenda Next generation business requirements for Operational Analytics SAP HANA Live - Platform for Real-Time Intelligence

More information

Whitepaper - Security e-messenger

Whitepaper - Security e-messenger Whitepaper 1 Security e-messenger Contents 1. Introduction Page 3 2. Data centre security and connection Page 3 a. Security Page 3 b. Power Page 3 c. Cooling Page 3 d. Fire suppression Page 3 3. Server

More information

INCREASING EFFICIENCY WITH EASY AND COMPREHENSIVE STORAGE MANAGEMENT

INCREASING EFFICIENCY WITH EASY AND COMPREHENSIVE STORAGE MANAGEMENT INCREASING EFFICIENCY WITH EASY AND COMPREHENSIVE STORAGE MANAGEMENT UNPRECEDENTED OBSERVABILITY, COST-SAVING PERFORMANCE ACCELERATION, AND SUPERIOR DATA PROTECTION KEY FEATURES Unprecedented observability

More information

INFORMATION TECHNOLOGY CONTROLS

INFORMATION TECHNOLOGY CONTROLS CHAPTER 14 INFORMATION TECHNOLOGY CONTROLS SCOPE This chapter addresses requirements common to all financial accounting systems and is not limited to the statewide financial accounting system, ENCOMPASS,

More information

Knowledge Management Series. Internal Audit in ERP Environment

Knowledge Management Series. Internal Audit in ERP Environment Knowledge Management Series Internal Audit in ERP Environment G BALU ASSOCIATES Knowledge Management Series ISSUE-5 ; VOL 1 Internal Audit in ERP Environment APRIL/2012 Editorial Greetings..!!! Raja Gopalan.B

More information

Chapter 14: Databases and Database Management Systems

Chapter 14: Databases and Database Management Systems 15 th Edition Understanding Computers Today and Tomorrow Comprehensive Chapter 14: Databases and Database Management Systems Deborah Morley Charles S. Parker Copyright 2015 Cengage Learning Learning Objectives

More information

An Introduction to the Cloud

An Introduction to the Cloud An Introduction to the Cloud It s increasingly being recognised that Cloud based solutions are now rapidly becoming mainstream in terms of providing superior, much more efficient, and highly cost-effective

More information

PART 10 COMPUTER SYSTEMS

PART 10 COMPUTER SYSTEMS PART 10 COMPUTER SYSTEMS 10-1 PART 10 COMPUTER SYSTEMS The following is a general outline of steps to follow when contemplating the purchase of data processing hardware and/or software. The State Board

More information

IT General Controls Domain COBIT Domain Control Objective Control Activity Test Plan Test of Controls Results

IT General Controls Domain COBIT Domain Control Objective Control Activity Test Plan Test of Controls Results Acquire or develop application systems software Controls provide reasonable assurance that application and system software is acquired or developed that effectively supports financial reporting requirements.

More information

Data Center Infrastructure

Data Center Infrastructure Data Center Infrastructure Module 1.3 2006 EMC Corporation. All rights reserved. Data Center Infrastructure - 1 Data Center Infrastructure Upon completion of this module, you will be able to: List the

More information

Data Warehouse (DW) Maturity Assessment Questionnaire

Data Warehouse (DW) Maturity Assessment Questionnaire Data Warehouse (DW) Maturity Assessment Questionnaire Catalina Sacu - csacu@students.cs.uu.nl Marco Spruit m.r.spruit@cs.uu.nl Frank Habers fhabers@inergy.nl September, 2010 Technical Report UU-CS-2010-021

More information

Protect Microsoft Exchange databases, achieve long-term data retention

Protect Microsoft Exchange databases, achieve long-term data retention Technical white paper Protect Microsoft Exchange databases, achieve long-term data retention HP StoreOnce Backup systems, HP StoreOnce Catalyst, and Symantec NetBackup OpenStorage Table of contents Introduction...

More information

Business Continuity and Disaster Survival Strategies for the Small and Mid Size Business. www.integrit-network.com

Business Continuity and Disaster Survival Strategies for the Small and Mid Size Business. www.integrit-network.com Business Continuity and Disaster Survival Strategies for the Small and Mid Size Business www.integrit-network.com Business Continuity & Disaster Survival Strategies for the Small & Mid Size Business AGENDA:

More information

Today s Volatile World Needs Strong CFOs

Today s Volatile World Needs Strong CFOs Financial Planning Today s Volatile World Needs Strong CFOs Strategist Steward Operator CFO 2014 SAP AG or an SAP affiliate company. All rights reserved. 2 2 Top Business Priorities for the CFO Finance

More information

Firewall Administration and Management

Firewall Administration and Management Firewall Administration and Management Preventing unauthorised access and costly breaches G-Cloud 5 Service Definition CONTENTS Overview of Service... 2 Protects Systems and data... 2 Optimise firewall

More information

DEFINING THE RIGH DATA PROTECTION STRATEGY

DEFINING THE RIGH DATA PROTECTION STRATEGY DEFINING THE RIGH DATA PROTECTION STRATEGY The Nuances of Backup and Recovery Solutions By Cindy LaChapelle, Principal Consultant, ISG www.isg-one.com INTRODUCTION Most organizations have traditionally

More information

Emerging Technologies Shaping the Future of Data Warehouses & Business Intelligence

Emerging Technologies Shaping the Future of Data Warehouses & Business Intelligence Emerging Technologies Shaping the Future of Data Warehouses & Business Intelligence Appliances and DW Architectures John O Brien President and Executive Architect Zukeran Technologies 1 TDWI 1 Agenda What

More information

Database Management. Chapter Objectives

Database Management. Chapter Objectives 3 Database Management Chapter Objectives When actually using a database, administrative processes maintaining data integrity and security, recovery from failures, etc. are required. A database management

More information

Data Backup Options for SME s

Data Backup Options for SME s Data Backup Options for SME s As an IT Solutions company, Alchemy are often asked what is the best backup solution? The answer has changed over the years and depends a lot on your situation. We recognize

More information

Library Recovery Center

Library Recovery Center Library Recovery Center Ever since libraries began storing bibliographic information on magnetic disks back in the 70 s, the challenge of creating useful back-ups and preparing for a disaster recovery

More information

Cloud Based Application Architectures using Smart Computing

Cloud Based Application Architectures using Smart Computing Cloud Based Application Architectures using Smart Computing How to Use this Guide Joyent Smart Technology represents a sophisticated evolution in cloud computing infrastructure. Most cloud computing products

More information

The Information Systems Audit

The Information Systems Audit November 25, 2009 e q 1 Institute of of Pakistan ICAP Auditorium, Karachi Sajid H. Khan Executive Director Technology and Security Risk Services e q 2 IS Environment Back Office Batch Apps MIS Online Integrated

More information

Providing real-time, built-in analytics with S/4HANA. Jürgen Thielemans, SAP Enterprise Architect SAP Belgium&Luxembourg

Providing real-time, built-in analytics with S/4HANA. Jürgen Thielemans, SAP Enterprise Architect SAP Belgium&Luxembourg Providing real-time, built-in analytics with S/4HANA Jürgen Thielemans, SAP Enterprise Architect SAP Belgium&Luxembourg SAP HANA Analytics Vision Situation today: OLTP and OLAP separated, one-way streets

More information

IBM Software Information Management. Scaling strategies for mission-critical discovery and navigation applications

IBM Software Information Management. Scaling strategies for mission-critical discovery and navigation applications IBM Software Information Management Scaling strategies for mission-critical discovery and navigation applications Scaling strategies for mission-critical discovery and navigation applications Contents

More information

SAP HANA In-Memory Database Sizing Guideline

SAP HANA In-Memory Database Sizing Guideline SAP HANA In-Memory Database Sizing Guideline Version 1.4 August 2013 2 DISCLAIMER Sizing recommendations apply for certified hardware only. Please contact hardware vendor for suitable hardware configuration.

More information

DELL s Oracle Database Advisor

DELL s Oracle Database Advisor DELL s Oracle Database Advisor Underlying Methodology A Dell Technical White Paper Database Solutions Engineering By Roger Lopez Phani MV Dell Product Group January 2010 THIS WHITE PAPER IS FOR INFORMATIONAL

More information

Storage Backup and Disaster Recovery: Using New Technology to Develop Best Practices

Storage Backup and Disaster Recovery: Using New Technology to Develop Best Practices Storage Backup and Disaster Recovery: Using New Technology to Develop Best Practices September 2008 Recent advances in data storage and data protection technology are nothing short of phenomenal. Today,

More information

Near-Instant Oracle Cloning with Syncsort AdvancedClient Technologies White Paper

Near-Instant Oracle Cloning with Syncsort AdvancedClient Technologies White Paper Near-Instant Oracle Cloning with Syncsort AdvancedClient Technologies White Paper bex30102507wpor Near-Instant Oracle Cloning with Syncsort AdvancedClient Technologies Introduction Are you a database administrator

More information

Chapter 6. Foundations of Business Intelligence: Databases and Information Management

Chapter 6. Foundations of Business Intelligence: Databases and Information Management Chapter 6 Foundations of Business Intelligence: Databases and Information Management VIDEO CASES Case 1a: City of Dubuque Uses Cloud Computing and Sensors to Build a Smarter, Sustainable City Case 1b:

More information

Well packaged sets of preinstalled, integrated, and optimized software on select hardware in the form of engineered systems and appliances

Well packaged sets of preinstalled, integrated, and optimized software on select hardware in the form of engineered systems and appliances INSIGHT Oracle's All- Out Assault on the Big Data Market: Offering Hadoop, R, Cubes, and Scalable IMDB in Familiar Packages Carl W. Olofson IDC OPINION Global Headquarters: 5 Speen Street Framingham, MA

More information

An Oracle White Paper November 2010. Backup and Recovery with Oracle s Sun ZFS Storage Appliances and Oracle Recovery Manager

An Oracle White Paper November 2010. Backup and Recovery with Oracle s Sun ZFS Storage Appliances and Oracle Recovery Manager An Oracle White Paper November 2010 Backup and Recovery with Oracle s Sun ZFS Storage Appliances and Oracle Recovery Manager Introduction...2 Oracle Backup and Recovery Solution Overview...3 Oracle Recovery

More information

Chapter 14: Recovery System

Chapter 14: Recovery System Chapter 14: Recovery System Chapter 14: Recovery System Failure Classification Storage Structure Recovery and Atomicity Log-Based Recovery Remote Backup Systems Failure Classification Transaction failure

More information

BACKUP ESSENTIALS FOR PROTECTING YOUR DATA AND YOUR BUSINESS. Disasters happen. Don t wait until it s too late.

BACKUP ESSENTIALS FOR PROTECTING YOUR DATA AND YOUR BUSINESS. Disasters happen. Don t wait until it s too late. BACKUP ESSENTIALS FOR PROTECTING YOUR DATA AND YOUR BUSINESS Disasters happen. Don t wait until it s too late. OVERVIEW It s inevitable. At some point, your business will experience data loss. It could

More information

Would-be system and database administrators. PREREQUISITES: At least 6 months experience with a Windows operating system.

Would-be system and database administrators. PREREQUISITES: At least 6 months experience with a Windows operating system. DBA Fundamentals COURSE CODE: COURSE TITLE: AUDIENCE: SQSDBA SQL Server 2008/2008 R2 DBA Fundamentals Would-be system and database administrators. PREREQUISITES: At least 6 months experience with a Windows

More information

IT Service Management

IT Service Management IT Service Management Service Continuity Methods (Disaster Recovery Planning) White Paper Prepared by: Rick Leopoldi May 25, 2002 Copyright 2001. All rights reserved. Duplication of this document or extraction

More information

SAP Real-time Data Platform. April 2013

SAP Real-time Data Platform. April 2013 SAP Real-time Data Platform April 2013 Agenda Introduction SAP Real Time Data Platform Overview SAP Sybase ASE SAP Sybase IQ SAP EIM Questions and Answers 2012 SAP AG. All rights reserved. 2 Introduction

More information

MICHIGAN AUDIT REPORT OFFICE OF THE AUDITOR GENERAL THOMAS H. MCTAVISH, C.P.A. AUDITOR GENERAL

MICHIGAN AUDIT REPORT OFFICE OF THE AUDITOR GENERAL THOMAS H. MCTAVISH, C.P.A. AUDITOR GENERAL MICHIGAN OFFICE OF THE AUDITOR GENERAL AUDIT REPORT THOMAS H. MCTAVISH, C.P.A. AUDITOR GENERAL The auditor general shall conduct post audits of financial transactions and accounts of the state and of all

More information

EMC VFCACHE ACCELERATES ORACLE

EMC VFCACHE ACCELERATES ORACLE White Paper EMC VFCACHE ACCELERATES ORACLE VFCache extends Flash to the server FAST Suite automates storage placement in the array VNX protects data EMC Solutions Group Abstract This white paper describes

More information

Speeding ETL Processing in Data Warehouses White Paper

Speeding ETL Processing in Data Warehouses White Paper Speeding ETL Processing in Data Warehouses White Paper 020607dmxwpADM High-Performance Aggregations and Joins for Faster Data Warehouse Processing Data Processing Challenges... 1 Joins and Aggregates are

More information

Microsoft SQL Server 2008 R2 Enterprise Edition and Microsoft SharePoint Server 2010

Microsoft SQL Server 2008 R2 Enterprise Edition and Microsoft SharePoint Server 2010 Microsoft SQL Server 2008 R2 Enterprise Edition and Microsoft SharePoint Server 2010 Better Together Writer: Bill Baer, Technical Product Manager, SharePoint Product Group Technical Reviewers: Steve Peschka,

More information

MAS 200 for SQL Server. Technology White Paper. Best Software, Inc.

MAS 200 for SQL Server. Technology White Paper. Best Software, Inc. MAS 200 for SQL Server Technology White Paper Best Software, Inc. Table of Contents MAS 200 for SQL Server............ 1 Why Microsoft SQL Server for MAS 200?... 3 Tuning Wizard...3 Query Optimizer...4

More information

Rackspace Cloud Databases and Container-based Virtualization

Rackspace Cloud Databases and Container-based Virtualization Rackspace Cloud Databases and Container-based Virtualization August 2012 J.R. Arredondo @jrarredondo Page 1 of 6 INTRODUCTION When Rackspace set out to build the Cloud Databases product, we asked many

More information

Connectivity. Alliance Access 7.0. Database Recovery. Information Paper

Connectivity. Alliance Access 7.0. Database Recovery. Information Paper Connectivity Alliance Access 7.0 Database Recovery Information Paper Table of Contents Preface... 3 1 Overview... 4 2 Resiliency Concepts... 6 2.1 Database Loss Business Impact... 6 2.2 Database Recovery

More information

Course 103402 MIS. Foundations of Business Intelligence

Course 103402 MIS. Foundations of Business Intelligence Oman College of Management and Technology Course 103402 MIS Topic 5 Foundations of Business Intelligence CS/MIS Department Organizing Data in a Traditional File Environment File organization concepts Database:

More information

Rajan R. Pant Controller Office of Controller of Certification Ministry of Science & Technology rajan@cca.gov.np

Rajan R. Pant Controller Office of Controller of Certification Ministry of Science & Technology rajan@cca.gov.np Rajan R. Pant Controller Office of Controller of Certification Ministry of Science & Technology rajan@cca.gov.np Meaning Why is Security Audit Important Framework Audit Process Auditing Application Security

More information

POLAR IT SERVICES. Business Intelligence Project Methodology

POLAR IT SERVICES. Business Intelligence Project Methodology POLAR IT SERVICES Business Intelligence Project Methodology Table of Contents 1. Overview... 2 2. Visualize... 3 3. Planning and Architecture... 4 3.1 Define Requirements... 4 3.1.1 Define Attributes...

More information

IT - General Controls Questionnaire

IT - General Controls Questionnaire IT - General Controls Questionnaire Internal Control Questionnaire Question Yes No N/A Remarks G1. ACCESS CONTROLS Access controls are comprised of those policies and procedures that are designed to allow

More information

Innovative technology for big data analytics

Innovative technology for big data analytics Technical white paper Innovative technology for big data analytics The HP Vertica Analytics Platform database provides price/performance, scalability, availability, and ease of administration Table of

More information

Navisphere Quality of Service Manager (NQM) Applied Technology

Navisphere Quality of Service Manager (NQM) Applied Technology Applied Technology Abstract Navisphere Quality of Service Manager provides quality-of-service capabilities for CLARiiON storage systems. This white paper discusses the architecture of NQM and methods for

More information

Promise of Low-Latency Stable Storage for Enterprise Solutions

Promise of Low-Latency Stable Storage for Enterprise Solutions Promise of Low-Latency Stable Storage for Enterprise Solutions Janet Wu Principal Software Engineer Oracle janet.wu@oracle.com Santa Clara, CA 1 Latency Sensitive Applications Sample Real-Time Use Cases

More information

Optimizing Backup and Data Protection in Virtualized Environments. January 2009

Optimizing Backup and Data Protection in Virtualized Environments. January 2009 Optimizing Backup and Data Protection in Virtualized Environments January 2009 Introduction The promise of maximizing IT investments while minimizing complexity has resulted in widespread adoption of server

More information

Big Data Analytics Service Definition G-Cloud 7

Big Data Analytics Service Definition G-Cloud 7 Big Data Analytics Service Definition G-Cloud 7 Big Data Analytics Service Service Overview ThinkingSafe s Big Data Analytics Service allows information to be collected from multiple locations, consolidated

More information

Contents. visualintegrator The Data Creator for Analytical Applications. www.visualmetrics.co.uk. Executive Summary. Operational Scenario

Contents. visualintegrator The Data Creator for Analytical Applications. www.visualmetrics.co.uk. Executive Summary. Operational Scenario About visualmetrics visualmetrics is a Business Intelligence (BI) solutions provider that develops and delivers best of breed Analytical Applications, utilising BI tools, to its focus markets. Based in

More information

White paper: Unlocking the potential of load testing to maximise ROI and reduce risk.

White paper: Unlocking the potential of load testing to maximise ROI and reduce risk. White paper: Unlocking the potential of load testing to maximise ROI and reduce risk. Executive Summary Load testing can be used in a range of business scenarios to deliver numerous benefits. At its core,

More information

NCOE whitepaper Master Data Deployment and Management in a Global ERP Implementation

NCOE whitepaper Master Data Deployment and Management in a Global ERP Implementation NCOE whitepaper Master Data Deployment and Management in a Global ERP Implementation Market Offering: Package(s): Oracle Authors: Rick Olson, Luke Tay Date: January 13, 2012 Contents Executive summary

More information

Chapter 6 FOUNDATIONS OF BUSINESS INTELLIGENCE: DATABASES AND INFORMATION MANAGEMENT Learning Objectives

Chapter 6 FOUNDATIONS OF BUSINESS INTELLIGENCE: DATABASES AND INFORMATION MANAGEMENT Learning Objectives Chapter 6 FOUNDATIONS OF BUSINESS INTELLIGENCE: DATABASES AND INFORMATION MANAGEMENT Learning Objectives Describe how the problems of managing data resources in a traditional file environment are solved

More information

Foundations of Business Intelligence: Databases and Information Management

Foundations of Business Intelligence: Databases and Information Management Foundations of Business Intelligence: Databases and Information Management Content Problems of managing data resources in a traditional file environment Capabilities and value of a database management

More information