Dacorum Borough Council
Final Internal Audit Report
ICT Change Management

Distribution list:
Chris Gordon - Group Manager
Neil Telkman - Information, Security and Standards Officer
Gary Osler - ICT Service Support Manager

Key dates:
Date of fieldwork: December 2010
Date of draft report: March 2011
Receipt of responses: April 2011
Date of final report: April 2011

This report has been prepared on the basis of the limitations set out in Appendix C. This report and the work connected therewith are subject to the Terms and Conditions of the Contract between Dacorum Borough Council and Deloitte & Touche Public Sector Internal Audit Limited. The report is produced solely for the use of Dacorum Borough Council. Its contents should not be quoted or referred to in whole or in part without our prior written consent except as required by law. Deloitte & Touche Public Sector Internal Audit Limited will accept no responsibility to any third party, as the report has not been prepared, and is not intended, for any other purpose.

Contents

1. Executive Summary
2. Scope of Assignment
3. Assessment of Control Environment
4. Observations and Recommendations
   Recommendation 1: Change Management Procedures (Priority 2)
   Recommendation 2: Documentation of Changes (Priority 2)
   Recommendation 3: User Requirements Analysis (Priority 2)
   Recommendation 4: Compatibility of Systems (Priority 2)
   Recommendation 5: Roll Back and Fault Logging Procedures (Priority 2)
   Recommendation 6: System Testing (Priority 2)
   Recommendation 7: Third Party Access (Priority 2)
   Recommendation 8: Hardware Changes (Priority 2)
   Recommendation 9: Hardware Inventory (Priority 2)
Appendix A - Reporting definitions
Appendix B - Staff interviewed
Appendix C - Statement of responsibility

1. Executive summary

1.1. Background

This audit forms part of the agreed 2010/11 Internal Audit Plan with Dacorum Borough Council. ICT Change Management is the controlled process for managing system changes within ICT, helping to ensure that changes are formally evaluated, tested and implemented consistently across IT systems. This helps to ensure that the risks relating to system changes are mitigated and that changes do not conflict with the existing IT environment.

The Support Works system is currently used by ICT to log and manage changes to systems within the Council; these can range from routine patch updates to changes in the functionality of key Council service applications. The system has been in use since June 2010, when it replaced the previous system, Magic. The Information, Security and Standards Officer and the Service Support Manager are responsible for managing and approving any changes that are requested.

1.2. Objectives and Scope

The overall objective of this audit was to assess whether the Council's systems of internal control over ICT Change Management support the control objectives set out in section 2.3. In summary, the scope covered Change Management Processes, Software Changes, Hardware Changes, Asset Management and User Management. Further detail on the scope of the audit is provided in Section 2 of the report.

1.3. Summary assessment

Our audit of DBC's internal controls operating over ICT Change Management found that there are weaknesses in design which may place some of the system objectives at risk. Our assessment in terms of the design of, and compliance with, the system of internal control covered is set out below:

Evaluation Assessment: Limited
Testing Assessment: Limited

Management should be aware that our internal audit work was performed according to UK Government Internal Audit Standards, which are different from audits performed in accordance with International Standards on Auditing (UK and Ireland) issued by the Auditing Practices Board. Similarly, the assessment gradings provided in our internal audit report are not comparable with the International Standard on Assurance Engagements (ISAE 3000) issued by the International Audit and Assurance Standards Board.

The classifications of our audit assessments and the priority rating definitions for our recommendations are set out in more detail in Appendix A, whilst further analysis of the control environment for ICT Change Management is shown in Section 3.

1.4. Key findings

We have raised nine priority 2 recommendations where we believe there is scope for improvement within the control environment. These are summarised below:

- Comprehensive change management procedures covering the management of all ICT changes have not been established.
- Sample audit testing identified that not all changes are documented, authorised and appropriately prioritised. A formal review of all changes that have been performed is also not carried out.
- Not all software changes are supported by documentation pertaining to the user requirements of the change or a business case.
- System compatibility is not always checked prior to implementing changes to systems.
- Although system snapshots are taken before changes are implemented, formal roll back plans are not always documented and the snapshots of systems are not always retained. Audit testing also identified that fault logs for system changes are not always documented.
- There was no evidence to confirm that system testing is undertaken prior to a change being fully implemented and closed on the system.
- There are currently no formal processes for third parties to obtain access to Council systems. Remote access requests are not always formally completed and approved.
- Hardware compatibility and installation is not formally checked as part of the change management process. Although hardware performance is logged, it is not always reported and monitored within the quarterly performance reports. There was no evidence to confirm that adequate support arrangements are in place to govern the support and maintenance of hardware assets.
- Although a PC and server inventory is maintained, other items of hardware are not recorded on the inventory.

Full details of the audit findings and recommendations are shown in Section 4 of the report.

1.5. Management Response

We have included a summary of the management responses in our Final report. We would like to take this opportunity to thank all staff involved for their time and co-operation during the course of this audit.

2. Scope of assignment

2.1 Objective

The overall objective of this audit was to assess whether DBC's systems of internal control over ICT Change Management support the control objectives set out in section 2.3.

2.2 Approach and methodology

The following procedures were adopted to identify and assess risks and controls and thus enable us to recommend control improvements:

- discussions with key members of staff to ascertain the nature of the systems in operation;
- evaluation of the current systems of internal control through walk-through and other non-statistical sample testing;
- identification of control weaknesses and potential process improvement opportunities;
- discussion of our findings with management and further development of our recommendations; and
- preparation and agreement of a draft report with the process owner.

2.3 Areas covered

In accordance with our agreed terms of reference, our work was undertaken to cover the following system control objectives:

- Change Management Processes: change management procedures have been documented and changes are handled appropriately.
- Software Changes: controls are in place over the software change management environment.
- Hardware Changes: controls are in place over the hardware change management environment.
- Asset Management: Council assets are managed appropriately and unwanted hardware is disposed of securely.
- User Management: Council users are managed appropriately and are subject to the change management protocol.

3. Assessment of Control Environment

The following table sets out in summary the control objectives we have covered as part of this audit, our assessment of risk based on the adequacy of controls in place, the effectiveness of the controls tested and any resultant recommendations.

Control objectives assessed and recommendations raised:
- Change Management Processes: Recommendations 1 and 2
- Software Changes: Recommendation 3
- Hardware Changes: Recommendations 4, 5 and 6
- Asset Management: Recommendations 8 and 9
- User Management: Recommendation 7

The classifications of our assessment of risk for the design and operation of controls are set out in more detail in Appendix A.

4. Observations and Recommendations

Recommendation 1: Change Management Procedures (Priority 2)

Recommendation
Management should ensure that a comprehensive change management procedure is documented to outline all stages of the change management process. The procedure should contain information regarding the processes and responsibilities for change identification, the approval process and the emergency change process.

Observation
Creating comprehensive change management procedures helps to ensure that staff are fully aware of the change management process. This also provides guidance on how change management should be implemented within the Council and defines the expected standards for how IT change should be implemented.

A change management flow chart and user guide were provided which show some of the steps to be followed for managing ICT changes; however, they did not contain details about how changes are identified, who can approve changes, which staff have overall responsibility for change management, and how emergency changes are managed and approved.

Where change management procedures are not in place, there is an increased risk that changes to the ICT infrastructure are not managed according to a specified process and that changes may not be adequately tested or authorised prior to implementation.

Responsibility
Change Process Owner

Management response / deadline
Accepted: We will review the documentation currently in place on the Change Management process. Changes to the documentation will be made in line with this recommendation, ensuring that responsibilities and the approval process are clear, as well as how the emergency change process will work and when the emergency process can be used. This will be completed in July 2011.

Recommendation 2: Documentation of Changes (Priority 2)

Recommendation
Management should ensure all changes are appropriately documented, authorised and prioritised. A formal review of all changes should also take place before the change is closed on the system. This should be documented and retained for future reference.

Observation
Documenting, approving and prioritising changes helps to ensure that the change has followed established practices, is valid and is adequately handled. Management review of the change helps to confirm that this has been completed as required.

From a sample of 10 hardware and software changes tested, it was identified that three out of 10 changes had been documented; seven out of 10 changes had been approved; and eight out of 10 had a priority assigned. Of the 10 changes, three required dual key approval under the newly introduced process; however, it was not evident that these three changes had been approved by two officers. Audit were informed that changes are reviewed before they are closed; however, there was no evidence to confirm this had been undertaken for the sample of 10 changes tested.

Where changes are not documented, approved, prioritised and reviewed, there is an increased risk that inappropriate changes are implemented. There is also a risk of ineffective implementation leading to the need for further changes to rectify initially poorly specified changes.

Responsibility
ICT Service Manager

Management response / deadline
Accepted: During the review of the Change Management process a new way of working will be implemented, ensuring more control around authorisation and prioritisation. All changes will also require clear documentation. Formal reviews will be made of all medium and large scale changes; for small scale changes, reviews will be made of the more business critical changes only. July 2011
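To illustrate the controls described above, the following is a minimal sketch in Python. It is purely illustrative and not part of the Support Works system; the field names and the dual approval rule are assumptions. It shows a change record that cannot be closed until it has been documented, approved, prioritised and formally reviewed.

```python
# Illustrative sketch only: a change record enforcing the controls in
# Recommendation 2 (documentation, approval, prioritisation, review before
# closure). Field names and the dual-approval rule are assumptions.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class ChangeRecord:
    change_id: str
    description: str                      # documentation of the change
    priority: Optional[str] = None        # e.g. "low" / "medium" / "high"
    approvers: List[str] = field(default_factory=list)
    requires_dual_approval: bool = False  # e.g. changes under the dual key process
    review_notes: Optional[str] = None    # formal review recorded before closure
    status: str = "open"

    def close(self) -> None:
        """Close the change only once the control requirements are all met."""
        if not self.description:
            raise ValueError("Change must be documented before closure.")
        if self.priority is None:
            raise ValueError("Change must be assigned a priority.")
        required = 2 if self.requires_dual_approval else 1
        if len(set(self.approvers)) < required:
            raise ValueError(f"Change requires approval by {required} officer(s).")
        if not self.review_notes:
            raise ValueError("A formal review must be recorded before closure.")
        self.status = "closed"
```

In practice the same checks could be enforced as mandatory fields and workflow rules within the Council's service desk tool rather than in code.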

Recommendation 3: User Requirements Analysis (Priority 2)

Recommendation
Management should ensure a user requirements analysis is performed prior to the development or procurement of a system or software. This should be documented and retained as part of the change management process.

Observation
Undertaking a requirements analysis helps to ensure user needs are identified prior to the change being implemented. Audit could not obtain evidence that a user requirements analysis, or a reason for the change, was submitted for the sample of 10 changes tested.

Where user requirements are not identified there is an increased risk that the changes implemented do not fulfil users' requirements and their business needs.

Responsibility
ICT Service Managers

Management response / deadline
Accepted: All projects or significant changes involving ICT will need to have clear business cases stating the reason for the change and the expected outcome. If the business case is progressed, this document will form part of the change documentation. June 2011

Recommendation 4: Compatibility of Systems (Priority 2)

Recommendation
Management should ensure that the compatibility of new systems and software with the Council's existing IT environment is formally documented prior to a change being approved for implementation. This should form a standard check on the change control template to ensure changes that conflict with the existing environment are not implemented.

Observation
Checking the compatibility of systems helps to ensure the software is able to operate effectively within the Council's IT environment. Audit were informed that the compatibility of systems is checked prior to approving a change; however, we could not obtain any formal evidence to confirm that this was the case.

Where the compatibility of a change with existing systems is not checked, there is an increased risk that inappropriate system changes or purchases are made. Where this is not tested or evaluated, it could lead to implemented changes having a detrimental effect on the Council's existing IT infrastructure.

Responsibility
ICT Service Managers

Management response / deadline
Accepted: Although I am not aware of any instances where we have implemented incompatible products, formal documentation of this check will be made part of the Change Control. June 2011

Recommendation 5: Roll Back and Fault Logging Procedures (Priority 2)

Recommendation
Management should ensure that roll back plans are documented for all approved changes. These should be supported by before and after images of changes to master data. Additionally, processes should be in place, and communicated to all users, for logging faults identified within a system with the Helpdesk.

Observation
Documenting roll back plans helps to ensure that changes that do not achieve the anticipated benefits in the live environment can be reversed, restoring the IT environment to its original state. The retention of before and after images helps to provide assurance that the requested change has been performed as requested. The logging and monitoring of faults assists in the identification of areas where changes may be needed to existing implemented changes.

Audit were informed that snapshots are taken before changes are implemented to help ensure the application can be rolled back if required. However, formal roll back plans are not documented and the snapshots are not retained. Additionally, procedures to log faults were not provided and it could not be confirmed whether faults are tracked over the long term.

Where roll back plans are not documented and before and after images are not retained, there is an increased risk that the IT environment cannot be restored if the result of the change is not as intended. Failure to log and monitor faults increases the risk that any unanticipated effects of the change on the ICT environment are not identified, which could result in an increase in incidents logged at a later stage.

Responsibility
Change Manager

Management response / deadline
Accepted: Roll back plans will be clearly documented to the satisfaction of the Change Manager, and approval of changes will depend on this field being completed. In addition, a communication plan will also be part of the new change process. June 2011
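As a simple illustration of the before/after image and roll back approach described above, the sketch below (Python; the function and field names are assumptions rather than the Council's actual tooling) takes a "before" image, applies a change, retains both images as evidence, and rolls back and logs a fault if the post-change check fails.

```python
# Illustrative sketch only: apply a change with retained before/after images,
# automatic roll back on failure and a logged fault. Names are assumed.
import copy
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("change-faults")


def validate(config: dict) -> None:
    """Placeholder post-change check; a real check would exercise key services."""
    if not config.get("service_enabled", True):
        raise RuntimeError("Key service unavailable after change")


def apply_change_with_rollback(config: dict, change: dict) -> dict:
    """Apply a change, retaining before/after images and rolling back on failure."""
    before_image = copy.deepcopy(config)              # retained as evidence
    try:
        updated = {**config, **change}                # the change itself
        validate(updated)                             # post-implementation check
        return {"config": updated,
                "before": before_image,
                "after": copy.deepcopy(updated)}      # retained as evidence
    except Exception as exc:
        log.error("Change failed, rolling back: %s", exc)   # fault logged
        return {"config": before_image, "before": before_image, "after": None}
```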

Recommendation 6: System Testing (Priority 2)

Recommendation
Management should ensure system testing is performed prior to the change being marked as complete. Documented evidence of the testing should be retained for future reference.

Observation
Undertaking formal testing of the change before it is closed helps to provide assurance that the change is working as required before it is transferred to the live environment. Audit were informed that system testing is performed when a software change is implemented. However, there was no evidence of these tests, and documentation relating to the tests is not stored on the system.

Where system testing is not performed, there is an increased risk that errors are not identified prior to full implementation, leading to poor system performance or system downtime.

Responsibility
ICT Service Manager

Management response / deadline
Accepted: A test plan will be implemented as part of the Change Process documentation. In addition, before the Manager signs off a change it should also have been through the tests in the plan. July 2011
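The following sketch (Python; illustrative only, with assumed field names) shows one simple way of retaining documented test evidence against a change so that it is only marked complete once every recorded test has passed.

```python
# Illustrative sketch only: documented test evidence retained against a change.
from dataclasses import dataclass, field
from datetime import date
from typing import List


@dataclass
class TestResult:
    test_case: str        # e.g. "Log in to the amended application"
    expected: str
    actual: str
    passed: bool
    tested_by: str
    tested_on: date


@dataclass
class TestEvidence:
    change_id: str
    results: List[TestResult] = field(default_factory=list)

    def all_passed(self) -> bool:
        """A change should only be marked complete when every recorded test passed."""
        return bool(self.results) and all(r.passed for r in self.results)
```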

Recommendation 7: Third Party Access (Priority 2)

Recommendation
Third party access to the Council's live IT environment should be controlled through access requests, which should be authorised and retained to provide accountability for the remote access. The reason for access should be noted and the length of time the access is required indicated. Access should be removed when it is no longer required.

Observation
Documented formal processes for remote access support help to ensure unauthorised changes are not made to software and systems which could place the integrity of the ICT environment and system data at risk. Reviewing remote access logs helps to ensure that suppliers only access Council systems after their access has been approved.

Audit were informed that third party access is only enabled when required and disabled once the work has been undertaken. This is not raised through the change management system; instead it is raised as a service request. However, no further evidence was received of the procedure or of the service requests raised.

Where suppliers have unlimited and unrestricted access to the Council's infrastructure there is a risk that ICT has no record of the work undertaken by suppliers on the network. There is also no record of instances where the supplier has accessed the Council's systems for development work.

Responsibility
ICT Service Manager

Management response / deadline
Accepted: Service requests will be followed up, with enablement and disablement documented through the ICT help desk. July 2011
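To illustrate the time-bound, authorised access the recommendation describes, the sketch below (Python; the field names and workflow are assumptions, not the Council's service request system) records the supplier, the reason for access, the approver and the approved period, and removes access once that period has passed.

```python
# Illustrative sketch only: a time-bound, approved third party access request.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional


@dataclass
class RemoteAccessRequest:
    supplier: str
    reason: str                            # why access is needed
    approved_by: Optional[str] = None
    access_from: Optional[datetime] = None
    access_until: Optional[datetime] = None
    enabled: bool = False

    def enable(self, approver: str, start: datetime, end: datetime) -> None:
        """Enable access only once approved, and only for a stated period."""
        self.approved_by = approver
        self.access_from, self.access_until = start, end
        self.enabled = True

    def disable_if_expired(self, now: datetime) -> None:
        """Remove access once the approved window has passed."""
        if self.access_until is not None and now > self.access_until:
            self.enabled = False
```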

Recommendation 8: Hardware Changes (Priority 2)

Recommendation
Management should ensure that, for all hardware and hardware changes:

- the compatibility of hardware is assessed prior to making a hardware change or purchase;
- processes are in place to ensure that checks are undertaken to confirm hardware is correctly installed;
- evidence of the compatibility and installation checks is retained for future reference;
- responsibility for reporting hardware performance is determined and performance is reported on a regular basis; and
- adequate support arrangements are in place to govern the support and maintenance of hardware assets.

Observation
Checking the compatibility of hardware helps to ensure the hardware is able to operate effectively in the IT environment. Checking the installation of the hardware helps to confirm the change process has been successfully completed prior to formal closure of the change. The early recognition of potential hardware problems can assist in avoiding longer system disruptions. Ensuring support arrangements are in place would help to provide a level of assurance that the potential risk of hardware related failures is mitigated.

Hardware compatibility and installation is not formally checked as part of the change management process. Although Key Performance Indicators have been established for hardware performance and were reported for Quarter 2, they were not reported on the Council's performance management system (Corvu) and therefore were not included in the Council's quarterly performance reports for Quarter 1 in June 2010. Furthermore, no evidence was provided to confirm that support arrangements are in place with the supplier, Dell.

Where hardware compatibility and installation are not checked, there is an increased risk of system conflicts and poor hardware performance. Failure to monitor hardware performance on a regular basis increases the risk of hardware failure, which could render systems unstable.

Responsibility
Change Manager

Management response / deadline
Accepted: While checks are made, they are not clearly documented. These checks will form part of the documentation process. July 2011
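As a rough illustration of the checks listed in Recommendation 8, the sketch below (Python; the field names and the idea of a support contract reference are assumptions for illustration) records the compatibility and installation checks, the evidence for each, and the support arrangement covering the asset, and only allows the change to be closed once all of these are present.

```python
# Illustrative sketch only: a hardware change checklist covering compatibility,
# installation verification, retained evidence and support cover. Names assumed.
from dataclasses import dataclass
from typing import Optional


@dataclass
class HardwareChangeChecklist:
    asset_tag: str
    compatibility_checked: bool = False          # assessed before the change/purchase
    compatibility_evidence: Optional[str] = None
    installation_checked: bool = False           # verified after installation
    installation_evidence: Optional[str] = None
    support_contract_ref: Optional[str] = None   # e.g. reference to a supplier agreement

    def ready_to_close(self) -> bool:
        """Only close the hardware change once checks and support cover are evidenced."""
        return (self.compatibility_checked and self.compatibility_evidence is not None
                and self.installation_checked and self.installation_evidence is not None
                and self.support_contract_ref is not None)
```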

Recommendation 9: Hardware Inventory (Priority 2)

Recommendation
Management should ensure that the hardware inventory is updated to include all items of hardware, including (but not limited to) peripherals and items such as printers, switches and routers. The inventory should be updated in the event of a change and reviewed regularly. Management should also consider implementing elements of Configuration Management to assist in the timely identification of IT asset configuration.

Observation
Maintaining a comprehensive hardware inventory helps to ensure all items of hardware are tracked and the inventory is updated following a change. It was identified that a PC and server inventory is maintained; however, other items of hardware are not recorded.

Where a comprehensive hardware inventory is not maintained, there is an increased risk that hardware items cannot be traced in the event of loss or theft. It also makes it difficult to identify whether there have been any changes to hardware.

Responsibility
Change Manager

Management response / deadline
Accepted: All ICT hardware assets will be placed on an inventory. August 2011
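As a final illustration, the sketch below (Python; the asset tags, categories and functions are invented for the example) shows a simple inventory keyed by asset tag that covers peripherals as well as PCs and servers, and records a history entry whenever an asset changes, which is also a small step towards the Configuration Management element mentioned above.

```python
# Illustrative sketch only: a hardware inventory covering all asset types,
# updated whenever an asset changes. Asset details below are invented.
from datetime import datetime

inventory: dict[str, dict] = {}


def register_asset(asset_tag: str, category: str, location: str) -> None:
    """Add any item of hardware (PC, server, printer, switch, router, ...)."""
    inventory[asset_tag] = {
        "category": category,
        "location": location,
        "history": [(datetime.now(), "registered")],
    }


def record_change(asset_tag: str, description: str) -> None:
    """Update the inventory whenever an asset is moved, modified or disposed of."""
    inventory[asset_tag]["history"].append((datetime.now(), description))


# Example usage with invented asset details
register_asset("DBC-PRN-0042", "printer", "Civic Centre, 2nd floor")
record_change("DBC-PRN-0042", "relocated to ground floor print room")
```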

Appendix A - Reporting definitions

Audit assessment

In order to provide management with an assessment of the adequacy and effectiveness of their systems of internal control, the following definitions are used:

Full
Evaluation Assessment: There is a sound system of internal control designed to achieve the system objectives.
Testing Assessment: The controls are being consistently applied.

Substantial
Evaluation Assessment: Whilst there is a basically sound system of internal control design, there are weaknesses in design which may place some of the system objectives at risk.
Testing Assessment: There is evidence that the level of non-compliance with some of the controls may put some of the system objectives at risk.

Limited
Evaluation Assessment: Weaknesses in the system of internal control design are such as to put the system objectives at risk.
Testing Assessment: The level of non-compliance puts the system objectives at risk.

Nil
Evaluation Assessment: Control is generally weak, leaving the system open to significant error or abuse.
Testing Assessment: Significant non-compliance with basic controls leaves the system open to error or abuse.

The assessment gradings provided here are not comparable with the International Standard on Assurance Engagements (ISAE 3000) issued by the International Audit and Assurance Standards Board and, as such, the grading of Full does not imply that there are no risks to the stated control objectives.

Grading of recommendations

In order to assist management in using our reports, we categorise our recommendations according to their level of priority as follows:

Priority 1: Recommendations which are fundamental to the system and upon which the organisation should take immediate action.
Priority 2: Recommendations which, although not fundamental to the system, provide scope for improvements to be made.
Priority 3: Recommendations concerning issues which are considered to be of a minor nature, but which nevertheless need to be addressed.
System Improvement Opportunity: Issues concerning potential opportunities for management to improve the operational efficiency and/or effectiveness of the system.

Appendix B - Staff interviewed

The following personnel were consulted:

Neil Telkman - Information, Security and Standards Officer
Gary Osler - Service Support Manager
John Worts - Service Support Manager

We would like to thank the staff involved for their co-operation during the audit.

Appendix C - Statement of responsibility

We take responsibility for this report, which is prepared on the basis of the limitations set out below.

The matters raised in this report are only those which came to our attention during the course of our internal audit work and are not necessarily a comprehensive statement of all the weaknesses that exist or all improvements that might be made. Recommendations for improvements should be assessed by you for their full impact before they are implemented. The performance of internal audit work is not and should not be taken as a substitute for management's responsibilities for the application of sound management practices.

We emphasise that the responsibility for a sound system of internal controls and the prevention and detection of fraud and other irregularities rests with management, and work performed by internal audit should not be relied upon to identify all strengths and weaknesses in internal controls, nor relied upon to identify all circumstances of fraud or irregularity. Auditors, in conducting their work, are required to have regard to the possibility of fraud or irregularities. Even sound systems of internal control can only provide reasonable and not absolute assurance and may not be proof against collusive fraud.

Internal audit procedures are designed to focus on areas identified by management as being of greatest risk and significance, and as such we rely on management to provide us with full access to their accounting records and transactions for the purposes of our audit work and to ensure the authenticity of these documents. Effective and timely implementation of our recommendations by management is important for the maintenance of a reliable internal control system.

The assurance level awarded in our internal audit report is not comparable with the International Standard on Assurance Engagements (ISAE 3000) issued by the International Audit and Assurance Standards Board.

Deloitte & Touche Public Sector Internal Audit Limited
London
April 2011

In this document references to Deloitte are references to Deloitte & Touche Public Sector Internal Audit Limited. Registered office: Hill House, 1 Little New Street, London EC4A 3TR, United Kingdom. Registered in England and Wales No 4585162. Deloitte & Touche Public Sector Internal Audit Limited is a subsidiary of Deloitte LLP, the United Kingdom member firm of Deloitte Touche Tohmatsu Limited ("DTTL"), a UK private company limited by guarantee, whose member firms are legally separate and independent entities. Please see www.deloitte.co.uk/about for a detailed description of the legal structure of DTTL and its member firms.

Member of Deloitte Touche Tohmatsu Limited