COURSE: 80156 SQL SERVER INSTALLATION AND OPTIMIZATION FOR MICROSOFT DYNAMICS NAV 2009




Last Revision: July 2009

The information contained in this document represents the current view of Microsoft Corporation on the issues discussed as of the date of publication. Because Microsoft must respond to changing market conditions, it should not be interpreted to be a commitment on the part of Microsoft, and Microsoft cannot guarantee the accuracy of any information presented after the date of publication. This document is for informational purposes only. MICROSOFT MAKES NO WARRANTIES, EXPRESS, IMPLIED OR STATUTORY, AS TO THE INFORMATION IN THIS DOCUMENT.

Complying with all applicable copyright laws is the responsibility of the user. Without limiting the rights under copyright, no part of this document may be reproduced, stored in or introduced into a retrieval system, or transmitted in any form or by any means (electronic, mechanical, photocopying, recording, or otherwise), or for any purpose, without the express written permission of Microsoft Corporation.

Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property rights covering subject matter in this document. Except as expressly provided in any written license agreement from Microsoft, the furnishing of this document does not give you any license to these patents, trademarks, copyrights, or other intellectual property.

2009 Microsoft Corporation. All rights reserved. Microsoft Dynamics, Microsoft PowerPoint, Microsoft SQL Server, Microsoft Dynamics NAV, and MorphX are trademarks or registered trademarks of Microsoft Corporation. The names of actual companies and products mentioned herein may be the trademarks of their respective owners.

This course content is designed for Microsoft Dynamics NAV 2009.

Table of Contents

Introduction ... 0-1
  Welcome ... 0-1
  Microsoft Dynamics Courseware Contents ... 0-2
  Documentation Conventions ... 0-3
  Student Objectives ... 0-4

Chapter 1: Introduction to the Course ... 1-1
  Objectives ... 1-1
  Introduction ... 1-1
  Microsoft SQL Server tool set ... 1-2
  Summary ... 1-5

Chapter 2: Setup and Installation ... 2-1
  Objectives ... 2-1
  Introduction ... 2-1
  Software Requirements ... 2-1
  Hardware Requirements ... 2-13
  Microsoft Dynamics NAV Architecture ... 2-21
  Configuration ... 2-24
  Security Synchronization ... 2-41
  Summary ... 2-53
  Lab 2.1 - Change the Recovery Model ... 2-55
  Lab 2.2 - Set Trace Flags ... 2-58
  Lab 2.3 - Create a Login Stored Procedure ... 2-60
  Lab 2.4 - Use Filegroups to Change the Storage Location of a Table ... 2-63
  Lab 2.5 - Create Users and Synchronize Security ... 2-66
  Quick Interaction: Lessons Learned ... 2-68

Chapter 3: Advantages of SQL Server Option ... 3-1
  Objectives ... 3-1
  Introduction ... 3-1
  Backup Facilities ... 3-2
  Database Access Using Third-Party Tools ... 3-8
  Performance Monitoring ... 3-10
  Scalability ... 3-23
  Summary ... 3-23
  Test Your Knowledge ... 3-24
  Lab 3.1 - Create a Backup and Restore to a Point in Time ... 3-26
  Lab 3.2a - Transfer Data from Test to Live Database (Transact-SQL) ... 3-33
  Quick Interaction: Lessons Learned ... 3-43
  Solutions ... 3-44

Chapter 4: Performance Audits ... 4-1
  Objectives ... 4-1
  Introduction ... 4-1
  Set up a Test Environment ... 4-2
  System Monitor ... 4-4
  Client Monitor ... 4-18
  Session Monitor ... 4-31
  SQL Server Profiler ... 4-34
  Database Engine Tuning Advisor ... 4-45
  Dynamic Management Views ... 4-50
  Useful Scripts, Tools, and Reports ... 4-56
  Summary ... 4-60
  Test Your Knowledge ... 4-61
  Lab 4.1 - Setup and Schedule System Monitor ... 4-63
  Lab 4.2 - Create a SQL Server Profiler Trace ... 4-67
  Lab 4.3 - Correlate System Monitor and SQL Server Profiler Data ... 4-69
  Lab 4.4a - Analyze a SQL Server Profiler Trace using DTA ... 4-72
  Lab 4.4b - Analyze an SQL query using DTA ... 4-74
  Lab 4.4c - Find Missing Indexes using DMVs ... 4-76
  Quick Interaction: Lessons Learned ... 4-78
  Solutions ... 4-79

Chapter 5: Improving Application Performance ... 5-1
  Objectives ... 5-1
  Introduction ... 5-1
  Optimizing C/AL Code ... 5-2
  SIFT ... 5-9
  FIND Instructions ... 5-15
  Keys ... 5-23
  Locks, Blocks and Deadlocks ... 5-29
  Graphical User Interface ... 5-33
  Index and Rowlock Hinting ... 5-35
  Bulk Insert ... 5-38
  Best Practices ... 5-39
  Summary ... 5-48
  Test Your Knowledge ... 5-49
  Lab 5.1 - Optimize C/AL Code for Performance ... 5-52
  Lab 5.2a - Find Index Usage ... 5-54
  Lab 5.2b - Find Unused Indexes ... 5-56
  Lab 5.2c - Disable Unused Keys ... 5-61
  Lab 5.3 - Create a Deadlock Trace ... 5-64
  Quick Interaction: Lessons Learned ... 5-68
  Solutions ... 5-69

Chapter 6: Maintenance ... 6-1
  Objectives ... 6-1
  Introduction ... 6-1
  Optimizing a Microsoft Dynamics NAV Database ... 6-2
  Implementing Maintenance on SQL Server ... 6-2
  Monitoring ... 6-23
  Summary ... 6-32
  Test Your Knowledge ... 6-33
  Lab 6.1 - Create a Maintenance Plan ... 6-35
  Lab 6.2 - Change the Fill Factor for Hot Tables ... 6-41
  Lab 6.3 - Compare Query Execution Before and After Deleting Statistics ... 6-44
  Quick Interaction: Lessons Learned ... 6-49
  Solutions ... 6-50

Chapter 7: Appendix ... 7-1
  Objectives ... 7-1
  Introduction ... 7-1
  Backup Facilities ... 7-2
  Summary ... 7-3
  Lab 7.1 - Set Up a Connection from Microsoft Excel ... 7-4
  Quick Interaction: Lessons Learned ... 7-9

INTRODUCTION TO SQL SERVER INSTALLATION AND OPTIMIZATION FOR MICROSOFT DYNAMICS NAV 2009

Training is a vital component of retaining the value of your Microsoft Dynamics NAV investment. Quality training from industry experts helps keep you updated on your solution and develops skills to maximize the value of your solution. Whether choosing E-Learning, instructor-led training, or self-paced study using training materials, there is a type of training that meets your needs. Additionally, validate your training and demonstrate your expertise with one of many certifications for Microsoft Dynamics. Choose the training or certification type that best enables you to stay ahead of the competition.

E-Learning

Online training for Microsoft Dynamics products helps you increase your productivity without spending time away from your home or office. E-Learning allows you to learn at your own pace through flexible access to training, making it beneficial for those lacking the time or budget to travel. E-Learning courses are online training courses designed to cover detailed concepts on specific product areas and allow you to:
- Gain in-depth technical and business application training through daily on-demand training.
- Learn at your own pace - lessons can be stopped and restarted, skipped or repeated.
- Save time and increase your productivity.
- Receive product knowledge comparable to instructor-led training without the need for travel or time away from the office.
- Gain beneficial training when preparing for Microsoft Dynamics certification exams.
- Find tips and tricks to show you how to increase productivity and save time.
- Learn about the changes in features and functionality of a new Microsoft Dynamics product version.
- Evaluate a new Microsoft Dynamics module or product.

Instructor-Led Training

With instructor-led training, you can gain a solid foundation or refresh your knowledge in Microsoft Dynamics products and processes while learning from an expert in an interactive environment. With courses on a variety of topics, you can:
- Follow demonstrations and attend presentations.
- Receive hands-on product experience.
- Participate in classroom activities and discussions with other attendees.
- Gain beneficial training when preparing for Microsoft Dynamics certification exams.

Training Materials

Training materials can be ordered for the purpose of self-paced study. These materials are comparable to courseware used with instructor-led training, and enable you to:
- Learn at your own pace, in your own time.
- Refer to an abundance of tips, tricks, and insights.
- Learn using a self-study format when preparing for Microsoft Dynamics certification exams.

For selected training materials, local functionality training materials are available that cover country-specific features in the product. The local functionality materials add on to existing training materials and are designed to teach local functionality within a given country. Note that local functionality materials are used only in conjunction with the training material, not as stand-alone training materials.

Certifications

Certifications help identify and distinguish an individual's technical skill set using a Microsoft Dynamics or related business product. Certifications for Microsoft Dynamics are widely recognized by industry employers and provide an objective validation of an individual's knowledge. Organizations that employ certified individuals benefit from a complete approach to learning: certified individuals have higher skills retention and increased productivity. Organizations can streamline their employee recruitment process and lower their external support costs and downtime by maintaining technically skilled employees through the Microsoft Certified Business Management Solutions Specialist and Professional certifications.

Microsoft Certified Business Management Solutions Specialist

The achievement of this certification demonstrates an individual's proficiency in one module of a Microsoft Dynamics or related business product. Microsoft Certified Business Management Solutions Specialists must pass a single certification exam for a Microsoft Dynamics or related business product to earn the title.

Microsoft Certified Business Management Solutions Professional

The Microsoft Certified Business Management Solutions Professional is a premier certification awarded when an individual has completed a pre-determined set of required and elective certification exams. These include certification exams for Microsoft Dynamics and related business products, as well as certification exams for other Microsoft technologies such as SQL Server. The pre-determined set of required and elective exams for this certification is focused on one of three specific knowledge areas (Applications, Developer, or Installation and Configuration) for one Microsoft Dynamics product. This certification demonstrates an individual's broad and deep knowledge of a Microsoft Dynamics product, beyond what is needed to achieve a Microsoft Certified Business Management Solutions Specialist title.

Certification Exam Preparation Guides

To help prepare for a certification exam, Microsoft highly recommends the use of the certification exam preparation guides available for each exam. Certification exam preparation guides contain valuable information about a specific exam, including:
- The target audience
- Skills being measured
- Time expected to take each portion of the exam
- Pass rates and requirements
- Question types and topics
- Preparation tools, such as:
  o recommended training
  o supplemental learning resources
  o additional recommended skills

Certification exam preparation guides help you determine which learning options are appropriate to best prepare you for a certification exam. Microsoft does not expect or intend one course or type of training to be the sole preparation method for passing a certification exam. A combination of hands-on experience using the product and training is the recommended certification exam preparation.

Certification exam preparation guides are available through PartnerSource and CustomerSource.

Learning Plans

Learning Plans can help you plan the best learning strategy for you and your organization. Learning plans illustrate module-specific learning tracks that can be easily targeted towards specific roles. Learning plans also provide a learning track towards obtaining certifications. Learning plans can be found on both PartnerSource and CustomerSource.

NOTE: For more information on E-Learning, instructor-led training, training materials and certifications for Microsoft Dynamics, log in to PartnerSource or CustomerSource and visit Training & Certification.

Elements of Training Materials for Microsoft Dynamics

Training Materials for Microsoft Dynamics products contain a number of sections or elements. Each chapter includes the following elements:

Objectives
Each chapter begins with a statement of the learning objectives for that chapter. Learning objectives are important because they inform you about what needs to be done to successfully complete the chapter.

Introduction
An introduction sets the stage for the learning to take place and prepares you with key statements of the chapter.

Topics
Chapters are split up into topic areas, usually according to the learning objectives for the chapter. This is especially beneficial in large chapters so that the knowledge and skills to be learned are split up into more manageable units.

Test Your Knowledge
The Test Your Knowledge section consists of review questions for each chapter or topic and is designed to help reinforce learning concepts. Questions can be short answer, true and false, multiple-choice, fill-in-the-blank or any other type. Answers to questions are also provided.

Conclusion
The conclusion wraps up the chapter by highlighting the important parts of the chapter as well as providing a transition to the next chapter. The conclusion also offers an opportunity to refresh earlier learning.

Labs
Labs test your skills with the learning concepts presented and learned during a topic or chapter. Labs begin with a scenario paragraph which describes the business problem to be solved, and also sets the stage for the exercise. Solutions to the labs are also provided. Labs may be offered at different levels to accommodate the variety of skills and expertise of each student.

Challenge Yourself!
Challenge Yourself! labs are the most challenging. These exercises are designed for the experienced student who requires little instruction to complete the required task. This level of exercise states the business problem to be solved and describes the tasks the learner needs to complete.

Need a Little Help?
These exercises are designed to challenge students while providing some assistance. These exercises do not provide step-by-step instructions; however, they provide the user with helpful hints and more information to complete the lab. We suggest you try the Challenge Yourself! labs first, and if you need help completing the task, look to the information in the Need a Little Help? labs. If additional assistance is required, refer to the Step by Step lab solutions located in an Appendix.

Quick Interaction: Lessons Learned
At the end of each chapter within the Training Material for Microsoft Dynamics, you will find a Quick Interaction: Lessons Learned page. This interaction is designed to provide you with a moment to reflect on the material you have learned. By outlining three key points from the chapter, you are maximizing knowledge retention, and providing yourself with an excellent resource for reviewing key points after class.

About This Training Material

This section provides you with a brief description of:
- The training material
- Audience
- Suggested prerequisites
- Training material objectives
- Student materials
- Related certification exams

Description
This three-day course is designed for students preparing to optimize performance in Microsoft Dynamics NAV.

Audience
This course is intended for Developers, IT Professionals, Microsoft Dynamics Partner professionals and Consultants.

At Training Material Completion
The course completion objectives are:
- Conduct a performance audit
- Improve application performance
- Maintain Microsoft SQL Server
- Monitor performance

Prerequisites
Before attending this course, students must have:
- General knowledge of Microsoft Dynamics NAV.
- Attended course 80043, Introduction to Microsoft Dynamics NAV 2009, and What's New in Microsoft Dynamics NAV 2009 - Installation and Development.
- Working experience with Microsoft SQL Server.

Student Objectives

What do you hope to learn by participating in this course? List three main objectives below.

1.

2.

3.


CHAPTER 1: INTRODUCTION TO THE COURSE

Objectives
The objectives are:
- Know the structure and scope of the course.

Introduction
This chapter provides an overview of the course structure and the scope of the course. This course focuses on Microsoft Dynamics NAV 2009 and the way it integrates with Microsoft SQL Server 2005 and 2008. This chapter starts with an introduction to the Microsoft SQL Server tool set and how you can use it with Microsoft Dynamics NAV 2009. Next, it provides an overview of all of the chapters in this course and explains the sequence of the topics.

Microsoft SQL Server tool set

Microsoft SQL Server Tools and Microsoft Dynamics NAV 2009

Microsoft SQL Server is a database platform for online transaction processing (OLTP), data warehousing, and e-commerce applications. It is also a business intelligence platform for data integration, analysis, and reporting solutions. SQL Server is a multi-component relational database management system centered around a high-performance, highly available database engine.

The SQL Server Database Engine is the core service for storing, processing, and securing data. The Database Engine provides controlled access and rapid transaction processing to meet the requirements of the most demanding data-consuming applications within enterprises. The Database Engine also provides rich support for sustaining high availability.

SQL Server is more than just a database. The following diagram shows the relationships among SQL Server components and identifies interoperability between components.

FIGURE 1.1 SQL SERVER COMPONENTS OVERVIEW

SQL Server consists of a suite of tools and components that support you in designing, managing, maintaining, and programming a SQL Server installation and its associated data. This course mainly focuses on the tools that are important for the performance of the Microsoft Dynamics NAV 2009 database.

It is important to understand how Microsoft Dynamics NAV integrates with SQL Server. SQL Server offers many tools you can use to change the design of the database tables and indexes. For example, you can use SQL Server Management Studio (SSMS) to completely manage a database on SQL Server.

Microsoft SQL Server Management Studio, new in Microsoft SQL Server 2005, is an integrated environment for accessing, configuring, managing, administering, and developing all components of SQL Server. SQL Server Management Studio combines a broad group of graphical tools with a number of rich script editors that provide access to SQL Server for developers and administrators of all skill levels. SQL Server Management Studio combines the features of Enterprise Manager, Query Analyzer, and Analysis Manager, included in previous releases of SQL Server, into a single environment.

In Microsoft Dynamics NAV, objects are managed and designed using the Object Designer. Within Microsoft Dynamics NAV you create tables, design keys, and implement properties using the Table Designer. The Microsoft Dynamics NAV database driver then translates these settings for SQL Server.

Although you can use SQL Server Management Studio and other SQL Server tools to change the way the database tables and indexes are designed, you must be careful when doing this, because Microsoft Dynamics NAV is always the master of the design. Microsoft Dynamics NAV stores metadata describing tables and indexes. This metadata is decoupled from the SQL Server metadata about tables and indexes and can be brought out of sync if changes are made directly from SQL Server Management Studio. The consequence is that the next time a table is modified from inside Microsoft Dynamics NAV, all changes made from outside Microsoft Dynamics NAV will be overwritten.

For example, Simon, the Systems Implementer, uses SQL Server Management Studio to change the fields contained in an index for the customer table. Afterward, if Mort, the IT Systems Developer, opens the Microsoft Dynamics NAV Table Designer, the changes made by Simon will not be visible. Moreover, if Mort closes the Table Designer and saves or recompiles the customer table, the customer table on SQL Server may be resynchronized with the design specified in Microsoft Dynamics NAV, and the changes applied in SSMS may be lost. When Simon goes back into SQL Server Management Studio, his changes will be gone and he will have to re-implement them.

This is why it is important to understand how to use the SQL Server tools. If you are required to make a change in the design of a table, you must do it from Microsoft Dynamics NAV. In Microsoft Dynamics NAV there are many properties available that influence how the database will be created on SQL Server. For example, within Microsoft Dynamics NAV there are several properties in tables and in keys that you can use to influence the way corresponding indexes are created on SQL Server. Only when there is no property available for a required change should you make the change directly from SQL Server.
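If you do make index changes outside Microsoft Dynamics NAV, it is worth keeping a record of what SQL Server actually contains so that you can detect when a resynchronization has removed them. The following Transact-SQL sketch lists the indexes and index columns that currently exist for one table; the table name used here is only an illustrative assumption, because Microsoft Dynamics NAV table names on SQL Server are prefixed with the company name and differ per database.

    -- List the indexes SQL Server currently holds for one table, so the result
    -- can be compared with the keys defined in the NAV Table Designer.
    -- The table name below is an example only; adjust it to your own database.
    SELECT i.name AS index_name,
           i.type_desc,
           i.is_unique,
           ic.key_ordinal,
           COL_NAME(ic.object_id, ic.column_id) AS column_name
    FROM sys.indexes AS i
    JOIN sys.index_columns AS ic
        ON ic.object_id = i.object_id
       AND ic.index_id = i.index_id
    WHERE i.object_id = OBJECT_ID(N'dbo.[CRONUS International Ltd_$Customer]')
    ORDER BY i.name, ic.key_ordinal;

Running the same query before and after a table is recompiled from the Object Designer makes it easy to see which manually added indexes were dropped by the resynchronization.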

It is also important to know that if you use the Microsoft Dynamics NAV client to make a backup of your database, not all of the changes you performed directly in SQL Server will be included in the backup. After you restore the backup, you will most likely have to redo your changes.

Course Outline and Scope

The topics in this course are chosen to match real-world implementations of Microsoft Dynamics NAV as closely as possible. The course starts with an introduction to the technology and explains the requirements for the different components. One of the first things you do is install and configure the software after evaluating the requirements.

Hardware is an important factor for high performance in any ERP system. Selecting the correct hardware and configuring it from the beginning helps produce optimal results for performance and reduces the risk of encountering future problems. Although you can choose to comply with minimal hardware requirements, you also need to consider the future. The company may grow and the number of concurrent users may increase, causing increased demands on the hardware. Choosing and configuring the correct hardware settings will affect the scalability of the solution.

If the customer chooses not to invest much initially and only complies with the minimal hardware requirements, and if the hardware you purchase is not scalable, then you may encounter problems if the company and the corresponding concurrent users and database size grow beyond their expected limits. At that time you may need to expand the hardware to meet the demand. If the server is not scalable, you will have to invest again to buy a bigger server. This kind of investment can be avoided by selecting a server that is scalable at the initial implementation.

The next chapter provides an overview of the advantages of SQL Server. When selecting SQL Server as the database platform, there are a number of advantages and possibilities that were not available in the old native database.

Once the application is implemented, it is important to start monitoring performance. By monitoring performance you can predict bottlenecks or problems that may occur in the future and take action to avoid issues. You can proactively monitor the database and also, for example, forecast database growth. This chapter presents and explains the tools SQL Server and Microsoft Dynamics NAV offer to monitor performance.

If you discover a performance bottleneck, there are different ways to solve the problem. The chapter about improving application performance goes into detail on what you need to know when designing the Microsoft Dynamics NAV application objects, properties, C/AL code, and keys to keep them performing optimally.

It is important to know that, although you can try to make the design of the application as good as possible, a part of its performance will depend on how the application is used. This is why monitoring performance is important. It provides you with information on how the application is being used, allowing you to change the design accordingly. For example, if the Microsoft Dynamics NAV application is in one case mainly used for reporting and in another case mainly used for data entry, these two scenarios may have completely different requirements on indexes.

Once Microsoft Dynamics NAV is running smoothly and the design and performance are optimal, your work is not finished. Now it is important to maintain the database. If you do not maintain a database, performance will decrease and may become sub-optimal. For example, indexes may become fragmented, the transaction log will continue to grow, and so on. There are several tools you can use to maintain a SQL Server database, and this chapter explains how you can use them to keep the application running optimally.

Summary

This chapter gives a high-level overview of Microsoft SQL Server. It also describes the outline of this course.

Quick Interaction: Lessons Learned

Take a moment and write down three key points you have learned from this chapter:

1.

2.

3.

CHAPTER 2: SETUP AND INSTALLATION

Objectives
The objectives are:
- Evaluate the software requirements for Microsoft Dynamics NAV 2009.
- Evaluate the hardware requirements for Microsoft Dynamics NAV 2009.
- Review the Microsoft Dynamics NAV 5.0 Architecture.
- Understand the Microsoft Dynamics NAV 2009 Architecture.
- Configure the components in a Microsoft Dynamics NAV implementation.
- Explain the security model and security synchronization.

Introduction
One of the most important steps in a Microsoft Dynamics NAV implementation is the selection of the hardware and software platform. When planning a Microsoft Dynamics NAV 2009 installation, you must make sure that the computers that will be used meet the minimum requirements and are sufficient for your current and future needs. Failure to meet these specifications can cause the installation of some or all of the components to fail.

This lesson describes the requirements for installing Microsoft Dynamics NAV 2009. Understanding the setup requirements and options helps you plan a successful installation.

Software Requirements

The software requirements for a Microsoft Dynamics NAV 2009 implementation depend on the following factors:
- The architecture and the different tiers
- The database platform
- Additional components and desired functionality

Architecture and Tiers

Microsoft Dynamics NAV can be used in single-user or multiuser installations. In single-user installations, all the work is performed on one computer, and all the information (the database) is stored on this computer. Even if you purchase several single-user installations and run each on its own computer, the entire installation is still called "single-user" because the information is stored on each individual computer and not on one centrally located database. In a multiuser installation, many users share common information that is stored in one or more databases on a server. The computers that work with the data are called clients, and the way the server and the computers work together is called a client/server installation.

For a single-user installation, the software requirements are limited. The client must be installed on a supported operating system. Furthermore, when you choose Microsoft SQL Server as a database platform, you also have to consider the software requirements for Microsoft SQL Server Express Edition.

Multiuser installations require the presence of a server program that is installed on the server computer. Microsoft Dynamics NAV supports two server options: Microsoft Dynamics NAV Classic Database Server and Microsoft SQL Server. Each of these server options has its own software requirements. In addition, in multiuser installations, you can choose between a two-tier and a three-tier architecture. (In fact, both architectures can coexist.) While the Classic client works with both database platforms, the RoleTailored client requires Microsoft SQL Server as a database platform.

Database Platform

The selected database platform affects the software requirements. While the Microsoft Dynamics NAV Database Server has its own requirements, the requirements for the Microsoft SQL Server Option fully depend on the specifications for Microsoft SQL Server.

Microsoft Dynamics NAV 2009 supports the following SQL Server platforms:
- Microsoft SQL Server 2005 SP2: Express, Workgroup, Standard, Enterprise
- Microsoft SQL Server 2008: Express, Workgroup, Standard, Enterprise

Each edition has its own software and hardware requirements.

Additional Components and Functionality

Depending on the functions and components used in or together with Microsoft Dynamics NAV, additional software requirements apply. Functional areas such as Business Analytics or Automated Data Capture System require additional software components (for example, Application Server for Microsoft Dynamics NAV and Microsoft SQL Server Analysis Services). To use Office integration functions (mail merge, Outlook client integration, and export to Excel), you need Microsoft Office. Other software requirements can come from a more general perspective or need, such as database size, scalability, high availability, and so on.

For a detailed overview of the software requirements for the individual Microsoft Dynamics NAV components, go to http://www.microsoft.com/dynamics/nav/product/systemrequirements.mspx.

Microsoft Dynamics NAV 2009 is designed to take advantage of the Microsoft software platform. This platform consists of the following prerequisites for Microsoft Dynamics NAV 2009:
- Windows Server 2003 or 2008
- Microsoft SQL Server 2005 or 2008
- The Microsoft .NET Framework 3.5
- Microsoft Web Services Enhancements
- ASP.NET
- Microsoft Report Viewer 2008

Users must install and configure these prerequisites to prepare the server on which to load Microsoft Dynamics NAV. These software requirements will influence the hardware configuration, which will be discussed in the next lessons.

Operating System

Microsoft Dynamics NAV 2009 introduces a three-tier RoleTailored architecture that improves the security, scalability, and flexibility of Microsoft Dynamics NAV.

In the RoleTailored architecture there are the following three tiers:
- Client tier
- Service tier
- Database server tier

Each tier has its own role and its own requirements that further determine the operating system requirements.

The Client Tier - RoleTailored Client

The Microsoft Dynamics NAV 2009 RoleTailored client can be installed on the following operating systems:
- Microsoft Windows XP Professional SP3 or later (X86 or running 32 bit on X64)
- Windows Server 2003 SP2 or later (X86 or running 32 bit on X64)
- Windows Server 2003 R2 SP2 or later (X86 or running 32 bit on X64)
- Microsoft Windows Vista (Business, Enterprise, or Ultimate) SP1 or later (X86 or running 32 bit on X64)
- Windows Server 2008 (X86 or running 32 bit on X64)

Furthermore, depending on the functionalities and applications used by the client, the following applications are required:
- Internet Explorer 6.0 or later
- Microsoft .NET Framework 3.5
- Active Directory (required for 3-tier configurations)
- For instant messaging and TAPI, Microsoft Office Communicator 2007 is required.
- For Mail Merge*, Microsoft Outlook Client Integration, Import and Export Budget to and from Microsoft Excel, and Office XML and SharePoint links: Microsoft Office 2003 or 2007

*Mail merge requires Collaboration Data Objects (CDO) to be installed.

The Client Tier - Classic Client

The Microsoft Dynamics NAV 2009 Classic client can be installed on the following operating systems:
- Windows XP Professional SP3 or later (X86 or running 32 bit on X64)
- Windows Server 2003 SP2 or later (X86 or running 32 bit on X64)
- Windows Server 2003 R2 SP2 or later (X86 or running 32 bit on X64)
- Windows Vista (Business, Enterprise, or Ultimate) SP1 or later (X86 or running 32 bit on X64)
- Windows Server 2008 (X86 or running 32 bit on X64)

Depending on the functionalities and applications used by the client, the following applications are required:
- Microsoft .NET Framework 3.5
- For Mail Merge*, Outlook Client Integration, Import and Export Budget to and from Microsoft Excel, and Office XML and SharePoint links: Microsoft Office 2003 or 2007

*Mail merge requires Collaboration Data Objects (CDO) to be installed.

For a single-user installation, Microsoft SQL Server 2005 Express or SQL Server 2008 Express is supported.

For developing reports for the RoleTailored client, one of the following products is required:
- Microsoft Visual Web Developer 2005 Express Edition SP1 or above*
- Microsoft Visual Studio 2005 Standard/Professional SP1 or later
- Microsoft Visual Studio 2008 Standard/Professional SP1 or later

*If Microsoft Visual Web Developer 2005 Express Edition is used, the Reporting Add-in for Microsoft Visual Web Developer 2005 Express is also required.

For debugging applications running on the Microsoft Dynamics NAV Server, Microsoft Visual Studio 2008 SP1 is required.

The Service Tier

The Service Tier requires one of the following operating systems:
- Windows XP Professional with SP3 or later (X86 or running 32 bit on X64)
- Windows Server 2003 SP2 or later (X86 or running 32 bit on X64)
- Windows Server 2003 R2 SP2 or later (X86 or running 32 bit on X64)
- Microsoft Small Business Server 2003 R2 SP2 or later (X86 or running 32 bit on X64)
- Microsoft Small Business Server 2008 or later (X86 or running 32 bit on X64)
- Windows Vista (Business, Enterprise, or Ultimate) SP1 or later (X86 or running 32 bit on X64)
- Windows Server 2008 (X86 or running 32 bit on X64)
- Microsoft Windows Essential Business Server 2008 Standard or Premium (running 32 bit on X64)

The following components are needed:
- Microsoft .NET Framework 3.5
- Active Directory (required for 3-tier configurations)

The Database Tier - Microsoft SQL Server Option

Microsoft Dynamics NAV 2009 supports the following versions of SQL Server:
- Microsoft SQL Server 2005 SP2: Express, Workgroup, Standard, Enterprise
- Microsoft SQL Server 2008: Express, Workgroup, Standard, Enterprise

The operating system requirements depend on the specifications for both SQL Server versions. Both x86 and x64 operating systems are supported.

Windows Server 2003

Windows Server 2003 or Windows Server 2008 are the platform operating systems for Microsoft Dynamics NAV 2009. Whether you use 2003 or 2008 depends on your hardware and infrastructure, and your performance needs.

Windows Server 2003 includes all the functionality customers need to do more with less, while providing security, reliability, availability, and scalability. Microsoft has improved the Microsoft Windows server operating systems to incorporate the benefits of Microsoft .NET. This enables information, people, systems, and devices to successfully connect to one another.

Windows Server 2003 is a multipurpose operating system that can handle a diverse set of server roles, depending on a user's needs, in either a centralized or distributed manner. Some of these server roles include the following:
- Application server
- File and print server
- Web server and Web application services
- Mail server
- Terminal server
- Remote access and Virtual Private Network (VPN) server
- Directory services, including Domain Name System (DNS), Dynamic Host Configuration Protocol (DHCP) server, and Microsoft Windows Internet Naming Service (WINS)
- Streaming media server

Windows Server 2003 Standard Edition is designed for departmental and standard workloads. Windows Server 2003 R2 Enterprise Edition differs from Windows Server 2003 R2 Standard Edition primarily in its support for high-performance servers and its ability to cluster servers for more load handling. These capabilities include the following:
- Eight-way Symmetric Multiprocessing (SMP)
- Eight-node clustering
- Up to 64 gigabytes of RAM

This powerful platform provides reliability that helps systems remain available even if problems occur.

Windows Server 2003 Datacenter Edition is designed for:
- The highest levels of scalability and reliability
- Supporting mission-critical solutions for databases
- Enterprise resource planning software
- High-volume, real-time transaction processing
- Server consolidation

Windows Server 2003 R2, Datacenter Edition, is available in both 32-bit and 64-bit versions and includes the following:
- Thirty-two-way SMP
- Eight-node clustering
- Up to 64 gigabytes of RAM

Microsoft Dynamics NAV uses Windows Server 2003 as an application server, and will be installed on Windows Server 2003 Standard Edition in most production environments. Other environments may require the Enterprise or Datacenter Editions.

Windows Server 2008

Windows Server 2008, with built-in Web and virtualization technologies, enables businesses to increase the reliability and flexibility of their server infrastructure. New virtualization tools, Web resources, and security enhancements help save time, reduce costs, and provide a platform for a dynamic and optimized datacenter. Powerful new tools, such as Internet Information Services (IIS) 7.0 and Server Manager, provide more control over servers and streamline Web, configuration, and management tasks. Advanced security and reliability enhancements, such as Network Access Protection and the Read-Only Domain Controller, strengthen the operating system and help protect the server environment, providing a solid foundation on which to build businesses.

Windows Server 2008 is available in multiple editions to support the varying server needs of organizations of all sizes. Windows Server 2008 is available in five primary editions, and three of these editions are also available without Windows Server Hyper-V, bringing the total number of editions to eight.

Windows Server 2008 roles are specified in the following table:

FIGURE 2.1 WINDOWS SERVER 2008 ROLES

Microsoft SQL Server

The software requirements for Microsoft Dynamics NAV 2009 with Microsoft SQL Server depend on the SQL Server versions supported by Microsoft Dynamics NAV. There are two supported versions:
- Microsoft SQL Server 2005 SP2: Express, Workgroup, Standard, Enterprise
- Microsoft SQL Server 2008: Express, Workgroup, Standard, Enterprise

Microsoft SQL Server is a comprehensive database platform providing enterprise-class data management with integrated business intelligence (BI) tools. It is fully integrated into the Microsoft Data Platform.

FIGURE 2.2 THE MICROSOFT DATA PLATFORM

The Microsoft SQL Server data engine is the core of this enterprise data management solution. Additionally, Microsoft SQL Server combines analysis, reporting, integration, and notification. Business intelligence (BI) features provide a competitive advantage. These advantages include enriching data and building complex business analytics with Analysis Services, and writing, managing, and delivering rich reports that use Reporting Services.

Software Requirements for SQL Server

SQL Server must be installed on a computer running Microsoft Windows. The specific version of Windows required depends on the edition of SQL Server being installed. SQL Server is available in many editions to help meet the needs of your organization.

Microsoft SQL Server Enterprise Edition

Microsoft SQL Server 2008 Enterprise provides a trusted, productive, and intelligent data platform that enables you to run your most demanding business-critical applications, reduce the time and cost of developing and managing applications, and deliver actionable insight to your entire organization. SQL Server 2008 Enterprise provides the highest levels of security, reliability, and scalability. Enterprise Edition is meant to support the largest enterprise online analytical processing environments, highly complex data analysis, data warehousing, and active Web servers.

For more information, go to http://www.microsoft.com/sqlserver/2008/en/us/enterprise.aspx.

Microsoft SQL Server Standard Edition

SQL Server 2008 Standard is a complete data management and business intelligence platform providing best-in-class ease of use and manageability for running departmental applications. Standard Edition includes the necessary functionality for e-commerce, data warehousing, and line-of-business solutions that most small and medium sized businesses use. If your organization needs to track large amounts of data but does not need all the functionality of Enterprise Edition, then Standard Edition will work well with Microsoft Dynamics NAV.

For more information, go to http://www.microsoft.com/sqlserver/2008/en/us/standard.aspx.

Microsoft SQL Server Workgroup Edition

SQL Server 2008 Workgroup is a reliable data management and reporting platform that delivers secure, remote synchronization and management capabilities for running branch applications. It includes the core database features of the SQL Server product line and is easy to upgrade to Standard or Enterprise.

Workgroup Edition is the data management solution for small organizations that need a database that has no limit on the size or number of users and can work as a back end to small Web servers and departmental or branch office operations.

For more information, go to http://www.microsoft.com/sqlserver/2008/en/us/workgroup.aspx.

Microsoft SQL Server Developer Edition

SQL Server 2008 Developer enables developers to build and test applications that run on SQL Server on 32-bit, IA-64, and x64 platforms. SQL Server 2008 Developer includes all of the functionality of Enterprise Edition, but is licensed only for development, test, and demonstration use. The license for SQL Server 2008 Developer entitles one developer to use the software on as many systems as necessary. For rapid deployment into production, instances of SQL Server 2008 Developer can easily be upgraded to SQL Server 2008 Enterprise without reinstallation.

Developer Edition includes all the functionality of SQL Server Enterprise Edition. However, it is licensed as a development and test server, not as a production server.

For more information about SQL Server Developer Edition, go to http://www.microsoft.com/sqlserver/2008/en/us/developer.aspx. For more information about upgrading Developer to Enterprise Edition, go to http://msdn.microsoft.com/en-us/library/ms143393.aspx.

Microsoft SQL Server Express

Microsoft SQL Server 2008 Express is a free edition of SQL Server that is ideal for learning, developing, and powering desktop, Web, and small server applications, and for redistribution by Independent Software Vendors (ISVs). SQL Server 2005 Express Edition is a free, easy to use and easy to manage database that can be redistributed to act as a client database and basic server database. It is usually suited for small data sets and will not work in some Microsoft Dynamics NAV implementations.

For more information, go to http://www.microsoft.com/sqlserver/2008/en/us/express.aspx.

Microsoft SQL Server Web Edition and Compact Edition

Microsoft SQL Server 2008 Web and Compact 3.5 Editions are specifically targeted at web application hosting and mobile application development scenarios.

Edition Features

SQL Server is available in many editions to meet the needs of your organization. To determine which SQL Server edition will work best for your Microsoft Dynamics NAV implementation, review the features in each edition.

FIGURE 2.3 SQL SERVER EDITION FEATURE COMPARISON

For a full comparison of the features available in each edition of SQL Server, go to http://www.microsoft.com/sqlserver/2008/en/us/editionscompare.aspx.

SQL Server 2005 on Windows Server 2008 and Windows Vista

In order to give customers more secure products, Windows Server 2008 and Windows Vista are supported by SQL Server 2005 Express Edition Service Pack 1 (SP1). All other editions will be supported by SQL Server 2005 Service Pack 2 (SP2) or later when it becomes available. Earlier versions of SQL Server, including SQL Server 2000 (all editions, including Desktop Engine Edition, also known as MSDE), SQL Server 7.0, and SQL Server 6.5, will not be supported on Windows Server 2008 or Windows Vista. Customers running applications that have these earlier versions of SQL Server should consider evaluating and upgrading to SQL Server 2005, which was designed to take advantage of the upcoming security and performance enhancements in the operating environment.
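Before applying these rules to an existing server, it helps to confirm exactly which version, service pack level, and edition an instance is running. The following is a minimal Transact-SQL check; SERVERPROPERTY is available in both SQL Server 2005 and SQL Server 2008, and the example values in the comments are only illustrations.

    -- Report the version, service pack level, and edition of the current instance.
    SELECT SERVERPROPERTY('ProductVersion') AS product_version, -- e.g. 9.00.xxxx for SQL Server 2005
           SERVERPROPERTY('ProductLevel')   AS product_level,   -- e.g. 'SP2'
           SERVERPROPERTY('Edition')        AS edition;         -- e.g. 'Standard Edition'

The product_level value shows whether SP2 has been applied before you attempt to run the instance on Windows Server 2008 or Windows Vista.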

SQL Server 2005 requires Microsoft SQL Server 2005 Service Pack 2 (SP2) (http://www.microsoft.com/downloads/details.aspx?familyid=d07219b2-1e23-49c8-8f0c-63fa18f26d3a&displaylang=en) to run on Windows Server 2008 and Windows Vista SP1. You must first install the full release version of SQL Server 2005 before you apply SP2.

Other

.NET Framework
The .NET Framework is a development and execution environment that enables different programming languages and libraries to work together seamlessly to create Microsoft Windows-based applications that are easier to build, manage, deploy, and integrate with other networked systems.

Microsoft Web Services Enhancements
The Web Services Enhancements for Microsoft .NET is an add-in to Microsoft Visual Studio 2005 and the Microsoft .NET Framework 3.5 that enables developers to build secure Web services based on the latest Web services protocol specifications.

Microsoft Report Viewer 2008
The Microsoft Report Viewer 2008 Redistributable Package includes Windows Forms and ASP.NET Web server controls for viewing reports designed by using Microsoft reporting technology. The Microsoft Report Viewer control enables applications that run on the .NET Framework to display reports designed using Microsoft reporting technology. This redistributable package contains Windows Forms and ASP.NET Web server control versions of the Report Viewer.

Hardware Requirements

Microsoft Dynamics NAV does not require particularly sophisticated equipment, but as with all programs, the better your equipment, the better the results. You get the best solution with the optimal equipment and with the program settings optimized for that equipment. In a multi-user installation, you can, in principle, use the same type of computer for both the clients and the server. However, there is a difference in how much CPU power, memory, and disk space the client and server will need. Computers for servers that run Windows Server 2003 or Windows Server 2008 must comply with the requirements specified by Microsoft. If you use the Microsoft Dynamics NAV SQL Server option, the hardware must meet the specifications of both the operating system and the Microsoft SQL Server edition.

The database is stored on the server, making it a critical area of the application because several users can access it at the same time. Therefore, it is important to select a powerful computer for the server. In addition, the following are aspects of the server that require additional consideration:
- The hard disk and controller
- The RAID system
- The memory
- The network adapter
- The CPU

Hard Disks

The hard disk is the slowest component in a computer because it consists of mechanical parts. Access times to the hard disk are long compared to those to memory (typical access time to memory is less than 60 nanoseconds, while access time to a hard disk is around 10 milliseconds). All of the programs and information are stored on the hard disk, so data is continuously read from and written to the disk. Because there is only one read/write head in a hard disk, only one read or write operation can be carried out at a time. By using more than one disk in your system, you can increase performance significantly.

Hard Disk Controller

You must, however, use a hard disk controller that supports control of more than one hard disk at a time without increasing access time to the disks. Furthermore, it is important for the controller to have a high transfer rate so data can travel quickly between the memory and the hard disk. The use of CPU per disk transfer must also be minimized. An example of a controller with these features is the Fast Wide SCSI-2 (Small Computer System Interface) controller.

When selecting a hard disk controller, do not use the write-back or lazy-write caching systems that are built into your hard disk controller unless the disk controller has a battery backup. Using a battery-supported hard disk controller prevents loss of data that might otherwise result if the system experiences a power failure.

You should also be aware of the write-cache facility that most of today's hard disks use. When you buy a hard disk, make sure that you can disable its write-cache (using software or a jumper on the disk). When write-cache is enabled and a power failure occurs while data is still in this cache, you could lose the data.

It is also necessary to have some sort of error detection unit implemented to allow the controller to determine when a byte of data in the cache is corrupted (for example, caused by a single-bit error or a defective memory chip). Any errors that occur must be corrected, so a correction scheme must be implemented in the controller. ECC (Error Correction Code) RAM is an example of this kind of correction scheme.

Disk Speed and Disk Size
Because the hard disk is the slowest component of a computer, you want to make disk read and write transactions as fast as possible. You can do this by using fast disks (15,000 rpm). Fast disks perform noticeably better than disks that have a lower speed (10,000 rpm). Adding hard disks to your system also increases the overall performance. To avoid poor performance in your daily work, add more than one hard disk to your system and divide the database among these hard disks. Four relatively slow hard disks perform much better together than one super fast hard disk. Several "intelligent" controllers exist that can control several hard disks simultaneously, for example, RAID (Redundant Array of Independent Disks) controllers.

It is important to know that pure disk performance is not the same as Microsoft Dynamics NAV performance, because many other factors are involved, for example, in-memory processing, network transport, and so on. It would be wrong to claim that write performance is more important than read performance, because Microsoft Dynamics NAV does about five reads for every write. With a RAID setup you can achieve almost double the performance by doubling the number of disks in a pure striping setup, but most customers run both striping and mirroring, which requires more disks to achieve this performance gain.

Solid-State Drive (SSD)
Solid-state disks solve the problem of physical constraints by replacing hard disk drives with high-speed circuitry. Instead of a rotating disk, a solid-state disk uses memory chips (typically DDR RAM or flash memory) to read and write data. Solid-state drives have several advantages over magnetic hard drives. Most of this comes from the fact that the drive does not have any moving parts. While a traditional drive has drive motors to spin up the magnetic platters and the drive heads, all the storage on a solid-state drive is handled by flash memory chips. This provides three distinct advantages:

- Less power usage
- Faster data access
- Higher reliability

While solid-state disks will often boost write performance, they typically do not provide as large a read performance gain, because the memory cache on the SQL Server already serves a lot of read requests. However, initial measurements (http://blogs.msdn.com/freddyk/archive/2009/02/09/more-ssd-testing.aspx) show that they can improve raw read/write performance compared to conventional hard drives.

As with conventional hard drives, it is still a good idea to build redundancy into a system that uses solid-state drives (mirroring).

RAID (Redundant Array of Independent Disks)
RAID systems provide two main advantages: reliability (RAID 1, or mirroring) and performance (RAID 0, or striping). Given the low cost of hard drives, any company should at least implement RAID 1 to be better protected against data loss. Whether you use RAID 0 depends on the number of transactions the system is required to handle. The advantage of RAID systems is that you can add disks over time and improve performance or increase capacity.

A RAID system consists of several disks. The key feature of a RAID system is that the failure of one disk does not bring the entire system down. Several RAID configurations exist. The most important configurations are described below:

RAID 0: This level is also known as disk striping because it uses a disk file system called a stripe set. Data is divided into blocks and spread in a fixed order among all disks in an array. RAID 0 improves read and write performance by spreading operations across multiple disks. RAID 0 is similar to RAID 5, but RAID 0 does not provide redundancy (fault tolerance).

RAID 1: Called mirroring. The data is written redundantly to pairs of drives and can be read independently from each drive. This is fast and provides full redundancy, but the disk capacity required is doubled. The read performance can be up to twice as fast as a single drive because both drives can process the read request simultaneously. Write performance is almost unchanged. RAID 1 is best for transaction processing, where many small I/Os are required. RAID 1 is also the most expensive RAID configuration because of the disk overhead. In theory, RAID 1 has twice the read transaction rate of single disks and the same write transaction rate as single disks.

RAID 5: Also known as striping with parity, this level is the most popular strategy for new designs. RAID 5 stripes the data in large blocks across the disks in an array. The parity for the stripes of data is also spread across all of the drives, so no one drive is dedicated to parity. Data redundancy is provided by the parity information. The data and parity information are arranged on the disk array so that the two types of information are always on different disks. In general, striping with parity offers better performance than disk mirroring (RAID 1). RAID 5 requires a minimum of three drives to implement. It has the highest read transaction rate, but a medium write transaction rate.

RAID 10 (1+0): RAID 10 uses a mirrored array of disks (a RAID 1 array) that is striped to another set of disks. RAID 10 is not to be confused with RAID 0+1.

RAID 0+1: This level is also known as mirroring with striping. RAID 0+1 uses a striped array of disks that are then mirrored to another identical set of striped disks. For example, a striped array can be created by using five disks. The striped array of disks is then mirrored using another set of five striped disks. RAID 0+1 provides the performance benefits of disk striping with the disk redundancy of mirroring.

RAID Levels and SQL Server
RAID is a disk system that contains multiple disk drives, called an array, to provide better performance, reliability, storage capacity, and reduced cost. Fault-tolerant arrays are categorized in six RAID levels: 0 through 5. Each level uses a different algorithm to implement fault tolerance. Although RAID is not a part of SQL Server, implementing RAID can directly affect the way SQL Server performs. RAID levels 0, 1, and 5 are typically used with SQL Server.

A hardware disk array improves I/O performance because I/O functions, such as striping and mirroring, are handled efficiently in firmware. Conversely, an operating system-based RAID offers reduced cost, but consumes processor cycles. When cost is an issue and redundancy and high performance are required, RAID 5 volumes are a good solution.

Data striping (RAID 0) is the RAID configuration with the best performance, but if one disk fails, all the data on the stripe set becomes inaccessible. A common installation technique for relational database management systems is to configure the database on a RAID 0 drive and then put the transaction log on a mirrored drive (RAID 1). You can obtain the best disk I/O performance for the database and maintain data recoverability through a mirrored transaction log, assuming you perform regular database backups.

RAID 5 provides redundancy of all data on the array. This allows a single disk to fail and, in most cases, be replaced without system downtime. Be aware that RAID 5 offers reduced performance compared to RAID 0 or RAID 1 (because of the parity overhead), but better reliability and faster recovery. Write performance on RAID 5 increases when more disks are added to the RAID 5 array. In general, read performance is better in RAID 5 than in RAID 1 (because RAID 5 has at least three disks, whereas RAID 1 has only two). For ERP systems, with many write transactions, RAID 5 is not always the best solution.

In general, the following RAID recommendations apply to SQL Server:

SQL Server Component                    RAID Configuration
Operating System + SQL Server Files     RAID 1
Database                                RAID 10
Transaction Log                         RAID 1 or 10
TempDB                                  RAID 1 or 10

The operating system files on a SQL Server are not the most critical nor the most frequently used files of a SQL Server configuration; a RAID configuration with good read performance can always be recommended. However, it is not required (because reading these files does not generate much overhead).

For the database files, it is important to select a RAID level with good read and write performance. Make sure that you put the database files and the transaction log files on separate disk arrays. Otherwise you will encounter disk contention, which reduces the performance of the SQL Server. You could also lose both the database and transaction log files should a disk failure occur. It is also recommended to keep the Windows page file and the database files on separate disk arrays.

The transaction log files require a RAID level with excellent write performance, because the log files are constantly written to. Here, RAID 10 is a good choice. If you have multiple transaction log files, it is better to store each transaction log on a separate RAID 10 array.

Although it is possible to place the TempDB on a RAID 0, we recommend that you select a more fault-tolerant RAID configuration for production environments. (For development environments or environments where downtime is not important, RAID 0 can be used.) Be aware that failure of a disk that contains the TempDB will result in server downtime, which is to be avoided in production environments.

Processor
The speed of the processor (also called Central Processing Unit or CPU) is also an important performance factor. It is the CPU that performs all the calculations involved in Microsoft Dynamics NAV - the faster the CPU, the more calculations per second. It is also important to have as much level 2 cache in the system as possible. This increases the speed with which the CPU gets data from and saves data to RAM. However, it should be noted that adding more hard disks gives a greater improvement in performance than increasing the speed of the CPU.
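When sizing the processor configuration, it can be useful to check how many logical processors the SQL Server instance actually sees. The query below, against a standard dynamic management view available in SQL Server 2005 and later, is a small sketch of that check.

    -- Number of logical CPUs visible to SQL Server, the hyperthreading ratio,
    -- and the number of schedulers the instance has created for them.
    SELECT cpu_count,
           hyperthread_ratio,
           scheduler_count
    FROM sys.dm_os_sys_info;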

Multiple Processors
Microsoft SQL Server can be configured to use multiple processors. The maximum number of processors that can be used by SQL Server depends on the edition:

SQL Server edition    Number of CPUs
Enterprise            OS Maximum
Standard              4
Workgroup             2
Express               1

If multiple processors are enabled, it is recommended that you:

- Create as many database files as there are processors
- Disable automatic database growth in SQL Server

If you have a multi-processor server, make sure not to allocate all processors to SQL Server, but keep a number of processors available for other tasks that are running on the server. For example, in a server that has four processors, allocate a maximum of three processors to SQL Server and keep at least one for the operating system and other applications.

32-bit versus 64-bit
The terms 32-bit and 64-bit refer to the way a computer's processor handles information. The 64-bit version of Windows handles large amounts of RAM more effectively than a 32-bit system. The benefits are most apparent when you have a large amount of RAM installed on your computer (typically 4 GB of RAM or more). Because a 64-bit operating system can handle large amounts of memory more efficiently than a 32-bit operating system can, a 64-bit system can be more responsive when running several programs at the same time and switching between them frequently.

Memory
Memory is an important resource in a Microsoft SQL Server environment. Microsoft SQL Server tends to use its memory to serve the user requests. The amount of memory that SQL Server can address depends on the SQL Server edition (and the operating system). It should be proportional to the number of users.

Edition       Required Minimum    Recommended Minimum (SQL Server 2005)
Enterprise    512 MB              1 GB
Standard      512 MB              1 GB
Workgroup     512 MB              1 GB
Express       192 MB              512 MB

Edition       Recommended Minimum (SQL Server 2008)    Maximum Memory Utilization
Enterprise    2 GB                                     4 GB (32-bit) or OS Maximum (64-bit)
Standard      2 GB                                     4 GB (32-bit) or OS Maximum (64-bit)
Workgroup     2 GB                                     4 GB
Express       1 GB                                     1 GB

The memory requirements in the table do not reflect additional memory requirements for the operating systems. For detailed memory requirements, go to http://www.microsoft.com/sql/.

The amount of memory that SQL Server can address can be configured through the Memory page in the Server Properties window in SQL Server Management Studio. When Minimum server memory is set to 0 and Maximum server memory is set to 2147483647, SQL Server can take advantage of the optimal amount of memory at any given time, subject to how much memory the operating system and other applications are currently using. As the load on the computer and SQL Server changes, so does the memory allocated. You can further limit this dynamic memory allocation to the minimum and maximum values.

Minimum server memory (in MB) specifies that SQL Server should start with at least the minimum amount of allocated memory and not release memory under this value. Set this value based on the size and activity of your instance of SQL Server. Always set the option to a reasonable value to ensure that the operating system does not request too much memory from SQL Server and inhibit Windows performance.

Maximum server memory (in MB) specifies the maximum amount of memory SQL Server can allocate when it starts and while it runs. This configuration option can be set to a specific value if you know there are multiple applications running at the same time as SQL Server and you want to guarantee that these applications have sufficient memory to run. If these other applications, such as Web or e-mail servers, request memory only as needed, then do not set the option, because SQL Server will release memory to them as needed. However, applications often use whatever memory is available when they start and do not request more if needed. If an application that behaves in this manner runs on the same computer at the same time as SQL Server, set the option to a value that guarantees that the memory required by the application is not allocated by SQL Server.
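The same minimum and maximum memory settings can be changed from Transact-SQL with the standard sp_configure procedure instead of the Memory page in Management Studio. The statements below are only a sketch: the 1024 MB and 6144 MB figures are example values and must be sized for the actual server.

    -- Enable access to advanced options such as the memory settings.
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;

    -- Reserve a floor of 1 GB and cap SQL Server at 6 GB (example values only).
    EXEC sp_configure 'min server memory (MB)', 1024;
    EXEC sp_configure 'max server memory (MB)', 6144;
    RECONFIGURE;

    -- Verify the configured and running values.
    EXEC sp_configure 'max server memory (MB)';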

Be aware that sometimes you can achieve better performance by reserving a reduced amount of memory for SQL Server. However, if SQL Server has insufficient memory (because of a lack of physical memory or because of an inadequate memory configuration) and Microsoft SQL Server uses all available memory, query response time, CPU usage, and disk input/output will go up (because Windows starts paging). In general, you can say that a SQL Server cannot have too much memory. In the ideal scenario, the SQL Server has sufficient memory to keep the entire Microsoft Dynamics NAV database in memory. It is also important that the memory has a parity bit or is of the ECC (Error Correction Code) type.

32-bit versus 64-bit systems
If you run SQL Server on a 32-bit operating system, you can only address 4 GB of memory. For a 64-bit edition running on a 64-bit operating system, the amount of memory is limited only by the operating system's maximum.

Network
Communication to and from the client passes through the network. If messages are to be delivered quickly, you must have a fast network adapter. This also ensures that the CPU use per network send/receive activity is minimal, which reduces the load on the CPU. The physical connection (the cabling) between the server and the clients must also be able to support the high speed.

The Microsoft Dynamics NAV Classic client requires a 100 Mbit switched (no hubs) connection to the server. Therefore, 56K modem or broadband connections are not supported with the standard Microsoft Dynamics NAV client. Alternative solutions are available (for example, Windows Terminal Services, Microsoft Dynamics NAV Employee Portal, or Automated Data Capture System (ADCS)).

It is important to have a powerful network connection between the Microsoft Dynamics NAV Service Tier (NST) and the SQL Server, because this is where most network communication will take place. The NST processes all business logic on behalf of the client. The Microsoft Dynamics NAV RoleTailored client is designed to communicate less with the NST than the Classic client does.

Microsoft Dynamics NAV Architecture
Microsoft Dynamics NAV 2009 is built on a three-tier architecture model. This differs significantly from the two-tier architecture of Microsoft Dynamics NAV 5.0 and earlier versions. Microsoft Dynamics NAV 2009 introduces a new layer in the architecture. This additional layer, or tier, is designed to host and execute all the business logic. In the two-tier architecture of Microsoft Dynamics NAV 5.0, the business logic resides at the client layer.

It improves security to execute the business logic on another layer that does not provide an interface, or access point, to the user. Scalability is also improved in the three-tier architecture.

Microsoft Dynamics NAV 5.0 Architecture
Two-tier architecture models are used in the true Client/Server Distributed Data solution. In two-tier architectures, the data and data manipulation layers reside on the server, whereas the application logic, presentation logic, and presentation layers reside on the client.

Microsoft Dynamics NAV 5.0 is designed according to the two-tier architecture model. It puts the application logic (business logic) and the presentation logic and layers (user interface) on the client computer. The two-tier architecture also puts the data and data manipulation layers (DML) on the server, in both the native database and SQL Server configurations.

The Client Tier
The Classic client is located on the client tier. It consists of an administration component for administrators and Microsoft Dynamics NAV 4.0/5.0 super users, and client access for the Classic client user. The client tier has specific connections for data transfer to the server tier and the database management component. These connections consist of the client's security-approved access to the database components on the server tier, and administration access to the database management.

The Server Tier
The server tier consists of the database management system and the native database for Microsoft Dynamics NAV 4.0/5.0 or the Microsoft SQL Server database. The existing application server, or Application Server for Microsoft Dynamics NAV, is an integration point to the Microsoft Dynamics NAV 4.0/5.0 application. It can be used, for example, to connect to Microsoft BizTalk Server. The application server represents an effort to achieve three-tier architecture benefits. However, it is limited to only one process at a time. Therefore, multiple application servers are needed for many production environments.

Microsoft Dynamics NAV 2009 Architecture
The three-tier architecture is used in the Client/Server Distributed Data and Application system. With this architecture, the data and data manipulation layers are put on their own server, the application logic is put on its own server, and the presentation and presentation logic are put on the client computer.

Microsoft Dynamics NAV 2009 is designed according to the three-tier architecture model as follows:

- This version puts the presentation logic and layers (user interface) on the client computer.
- It puts the business logic on another layer available in the three-tier architecture, called the service tier.
- It puts the data and data manipulation layers (DML) on the database server tier.

The Microsoft Dynamics NAV 2009 three-tier architecture and the classic two-tier architecture are not mutually exclusive. In fact, they coexist. The Classic client remains an essential component for developing and administering Microsoft Dynamics NAV applications. (For example, you must use the Classic client to upload the Microsoft Dynamics NAV 2009 license file or to develop objects for the RoleTailored client.)

The three-tier architecture is multithreaded, so it can handle more than one process at a time, overcoming an intrinsic limitation of the two-tier architecture. The tiers are as follows:

- The first tier of the architecture is the client.
- The second tier is the multithreaded middle tier. This is the service tier that is based on Web services.
- The third tier is where the SQL Server database resides.

The RoleTailored client connects to the service tier instead of to the database directly, and all business logic resides in the Microsoft Dynamics NAV Service Tier. Because the business logic runs on the service tier, parties cannot gain access to sensitive data merely by hacking a client. Also, customization and integration scenarios are greatly improved, because programming interfaces are hosted by the service tier and are therefore available from custom solutions.

The Service Tier
Briefly, the Microsoft Dynamics NAV Server is a .NET-based Windows Service application that works exclusively with SQL Server databases. It uses Windows Communication Foundation as the communication protocol for RoleTailored clients and for Web services. It can execute multiple client requests in parallel and serve other clients by providing Web service access to authenticated clients. As soon as a request is received and validated, it is passed on to the relevant component (metadata provider, application, or reporting service) for execution. When execution is complete, the executing component sends a response to the calling client.

The Database Tier
The database tier in Microsoft Dynamics NAV 2009 can hold either a native or a SQL Server database. The two-tier architecture supports both database server options. Classic client users connect to the database and continue to work with the two-tier architecture. RoleTailored client users pass through the three-tier architecture, which requires Microsoft SQL Server in the database tier. When upgrading to the new three-tier architecture, make sure that you plan for a possible database migration.

Configuration
When creating a Microsoft Dynamics NAV database, you need to specify a number of database settings, such as the database name, the database and transaction log files with their size and location, the collation, and so on. Although the impact of these settings is not always immediately visible when the new database is created, some of these settings can have an influence on user-friendliness and performance once you start using the database. In addition, some of these settings, such as collation, are difficult to change afterward. This lesson describes the most important database settings to use when configuring a Microsoft Dynamics NAV environment.

Recommended Database Settings
To create a new Microsoft Dynamics NAV database, open the Classic client and select File, Database, and then New.

FIGURE 2.4 SELECT SERVER WINDOW

In the Select Server window, you select the SQL Server that you want to create a database on. Also, select the database authentication type.

If you select Windows Authentication, the current Windows account will be used to create the database. If you select Database Server Authentication, you have to specify a User ID and a password. Regardless of the selected authentication type, you must use a user ID that is part of the sysadmin server role in Microsoft SQL Server to create a database.

On the Advanced tab, you can select the network protocol to connect to the server. We recommend that you use either the Default or the TCP/IP protocols. Finally, when you click OK, the New Database window opens. The New Database window contains a number of tabs with database settings.

FIGURE 2.5 THE NEW DATABASE WINDOW

On the General tab, enter the name of the database. The Database Name specified here will be used to automatically define the names of the database and transaction log files. After specifying a database name, click the Database Files tab.

FIGURE 2.6 THE DATABASE FILE NAMES

Here you can specify the number of files that the new database will contain. You can also enter the name, size, growth rate, and so on, of the individual database files. It is recommended that you carefully consider the size of your database before you create it. Expanding your database later can take considerable time and resources.

Logical Name
Logical names make it easier to manage data files. Microsoft Dynamics NAV will by default configure two data files for each new database. The first data file is created as the primary file in the PRIMARY filegroup. The second data file and every subsequent file that is added to the database will be created in an additional filegroup. The primary data file is always the first file listed on the Database Files tab and must always be specified. It is recommended that primary data files have the extension .mdf and secondary data files have the extension .ndf.

Multiple Database Files
When you create multiple database files, make sure that you create database files of equal size. This is because SQL Server uses a proportional fill strategy across all files in a filegroup. The proportional fill algorithm is based on the size of the data files and writes an amount of data proportional to the free space in each file. If data files are created with unequal sizes, the proportional fill algorithm will use the largest file more frequently rather than spreading the data between all the files. This defeats the purpose of creating multiple data files.

Creating multiple database files is especially recommended when you have several server processors enabled in SQL Server. If you enable multiple processors, the server load can be spread over several CPUs. However, if there is only one database file, only one processor can write to the database at a time. To increase performance, it is recommended to create as many database files as you have processors enabled for SQL Server.

File Name and Locations
Whether you have single or multiple database files, always place them on a disk system that offers excellent read and write performance. As discussed earlier in this chapter, a RAID configuration is preferred to ensure high availability. For more details about the RAID configurations, see the Hardware Requirements section in this chapter. If you have multiple database files, make sure that you spread the database files over multiple disk systems. Because there is only one read/write head in a hard disk, only one read or write operation can be carried out at a time. By using more than one disk in your system, you can increase performance significantly.
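The two previous recommendations - multiple data files of equal size (one per processor) spread over separate disk systems - can also be expressed directly in Transact-SQL rather than through the New Database window. The following is only an illustrative sketch: the database name, drive letters, and sizes are assumptions, and, unlike the layout Microsoft Dynamics NAV creates itself (which places secondary files in an additional filegroup), this sketch keeps all files in the PRIMARY filegroup for brevity.

    -- Hypothetical layout: four equally sized data files on four separate drives
    -- (one per CPU), with the transaction log on its own drive.
    CREATE DATABASE [Demo Database NAV (6-0)]
    ON PRIMARY
        (NAME = Demo_Data1, FILENAME = 'E:\NAVData\Demo_Data1.mdf', SIZE = 2048MB, FILEGROWTH = 256MB),
        (NAME = Demo_Data2, FILENAME = 'F:\NAVData\Demo_Data2.ndf', SIZE = 2048MB, FILEGROWTH = 256MB),
        (NAME = Demo_Data3, FILENAME = 'G:\NAVData\Demo_Data3.ndf', SIZE = 2048MB, FILEGROWTH = 256MB),
        (NAME = Demo_Data4, FILENAME = 'H:\NAVData\Demo_Data4.ndf', SIZE = 2048MB, FILEGROWTH = 256MB)
    LOG ON
        (NAME = Demo_Log, FILENAME = 'L:\NAVLog\Demo_Log.ldf', SIZE = 4096MB, FILEGROWTH = 512MB);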

By default, the database files will all be placed in SQL Server's data folder (C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\DATA\). You can specify different locations by entering the full path and file name in the File Name field. By preference, place the database files on a different disk than the one that contains the Windows page file. Do not place the database files on a compressed drive, as this degrades performance.

Database Size
As a default, the size of the primary data file will be set to 40 MB or to the size of the primary data file in the model database (whichever is greater). The size of the secondary data file will be set to 160 MB. The size of the transaction log file will be set to 50% of the sum of the primary and secondary data files. These settings allow you to restore the standard database backup that is included with the program to a new database without causing the data or transaction log files to grow. You can change all of these sizes and all of the other file properties so that they meet your requirements before you create the database. As stated earlier, make sure that you select an initial size that meets your needs. Be aware that expanding the database costs time and resources. Also, create database files of equal size. To change the default size, enter the new database size (in megabytes) in the File Size (MB) field.

File Growth
As you continue to work in Microsoft Dynamics NAV, more and more data is entered in the database and the database will grow. At some point, the database files will be completely full and must be expanded. In the File Growth field, you can enter the amount by which the data file will increase in size each time it expands. This can be expressed in megabytes (MB) or as a percentage (%). The default unit is MB. Make sure that you specify an equal file growth size for all database files, to preserve the proportional fill strategy. If you enter a percentage, take into account that the database will be expanded by a variable amount.

Unrestricted Growth versus Maximum Size
You can also specify the maximum size for a database file. You can specify a fixed size (in the Maximum Size (MB) field) or you can select the Unrestricted Growth option for a database file. In the first case, the database will grow until the maximum size specified is reached. In the latter case, the database will continue to grow until all available space on the corresponding disk is used.
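If a database has already been created, the same size, growth, and maximum-size properties can be adjusted afterward with ALTER DATABASE. The statements below are a sketch that mirrors the recommendations above (a pre-grown file, fixed growth in MB, and a capped maximum size); the database name, file name, and values are placeholders.

    -- Pre-grow the data file so it does not have to expand during working hours.
    ALTER DATABASE [Demo Database NAV (6-0)] MODIFY FILE (NAME = Demo_Data1, SIZE = 4096MB);

    -- Grow in fixed 256 MB steps rather than by a percentage.
    ALTER DATABASE [Demo Database NAV (6-0)] MODIFY FILE (NAME = Demo_Data1, FILEGROWTH = 256MB);

    -- Cap the file well below the size of the drive (roughly 80% of a 50 GB drive in this example).
    ALTER DATABASE [Demo Database NAV (6-0)] MODIFY FILE (NAME = Demo_Data1, MAXSIZE = 40960MB);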

Whatever option you choose, you must always monitor the amount of free space on the different disks. If the database cannot be expanded (because of a lack of free disk space), SQL Server can no longer write to the database, an error message will be generated, and the database will become unresponsive. We recommend that you do not set the maximum size of the database file to the maximum size of the disk. Instead, set it to 80 or 90% of the disk drive size and keep the remaining disk space as a buffer for emergency cases.

Transaction Log Files
Microsoft Dynamics NAV will configure one transaction log file for each new database. As a default, the size of this transaction log file will be set to 50% of the sum of the primary and secondary data files. The Transaction Log Files tab lets you control the location, size, growth, and maximum size of all the transaction log files.

FIGURE 2.7 DEFINING THE TRANSACTION LOG FILES

In general, the recommendations for the database files also apply to the transaction log files. The transaction log is used to track the changes that are made to the database and for database recovery. Therefore, the transaction log files can be considered the most important files in the SQL Server environment. If a high-performing disk system is important for the database files, it is even more important, and almost a requirement, for the transaction log files, because these files are constantly being written to. Remember not to place the transaction log files and the database files on the same disk. Make sure that you monitor the free disk space on disks containing the transaction logs. If the transaction log is full and cannot be expanded, the database will enter Suspect mode and become unavailable to users.
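Free space in the transaction log can be monitored from Transact-SQL as well as at the disk level. The following is a small sketch using two standard commands; the database name is a placeholder.

    -- Report log size and percentage of log space used for every database on the instance.
    DBCC SQLPERF(LOGSPACE);

    -- Report allocated and free space inside one specific database.
    USE [Demo Database NAV (6-0)];
    EXEC sp_spaceused;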

Collation Types
The New Database window contains editable collation information - that is, the system you will use to sort and compare your data.

FIGURE 2.8 THE DATABASE COLLATION

The SQL Server Option for Microsoft Dynamics NAV allows you to choose between Windows collations and SQL collations. A Windows collation corresponds to the collations supported by the Windows operating systems, where they are known as Regional and Language Options. SQL collations are the original collations introduced in SQL Server 7.0 and are still supported for backward compatibility.

SQL collations are not recommended at all. Instead, Microsoft advises you to use a Windows collation. This type of collation closely follows the collation rules of the operating system. Using a Windows collation enables Microsoft Dynamics NAV to sort and filter data the same way as SQL Server.

When you create a new database, this tab displays the default server collation information. If the server collation is a Windows collation, this collation will be used as the default collation for the database. If the server collation is a SQL collation, then a case- and accent-sensitive Windows collation for the English language will be used as the default collation for the database. If you do not make any modifications, this will be the collation information used for the new database. Always choose the collation that best matches your requirements when you create a database.

Before choosing a collation for Microsoft Dynamics NAV, it is important to know in which countries/regions the database will be used. If Microsoft Dynamics NAV is used in a single country/region environment, it is advised to always use the valid local Windows collation. In multiple country/region environments, it is recommended to use one database per country/region or, if possible, one for each collation in your region.
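To see which collation a server or an existing database currently uses - for example, before deciding whether the server default is suitable - the standard metadata functions can be queried. This is only a sketch; the database name and the Danish/Norwegian collation are example values.

    -- Default collation of the SQL Server instance.
    SELECT SERVERPROPERTY('Collation') AS ServerCollation;

    -- Collation of an existing Microsoft Dynamics NAV database.
    SELECT DATABASEPROPERTYEX('Demo Database NAV (6-0)', 'Collation') AS DatabaseCollation;

    -- List case-insensitive, accent-sensitive Windows collations for a given language.
    SELECT name, description
    FROM fn_helpcollations()
    WHERE name LIKE 'Danish_Norwegian_CI_AS%';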

Microsoft Dynamics NAV works best with a single database per country/region. When you work with multiple collations in your environment, it is advised that you use a dedicated SQL Server for each one. The collation of the master database and other system databases, such as tempdb, should be the same collation as the Microsoft Dynamics NAV database to avoid performance issues caused by sorting problems. If users with different collations must work in one and the same database, it is advised that you select the Windows collation that best matches the geographic needs of the different users. When selecting a collation, the following recommendations apply:

- Select a Windows collation.
- Never use a binary sort order.
- Use a case-insensitive, accent-sensitive collation.
- Never use a collation that mismatches the Regional Settings on the server.

Selecting a case-sensitive collation has a negative effect on user-friendliness. Suppose you want to search the Contact table for a contact named "Contoso". With a case-insensitive collation, any notation of the company name (all uppercase, all lowercase, or a mix) will result in the company name being found. With a case-sensitive collation, you have to remember the exact notation used when creating the record for the search to return a result. Although a case-sensitive collation is faster, it is less user-friendly. You can modify the collation to suit your requirements before you create a database. Afterward, it is very difficult to change the collation.

Validate Code Page
If you have selected the Validate Code Page option, the Collation tab only displays the collation descriptions supported by the operating system installed on the client computer that is being used to create the database. This means that it displays the collations that match either the OEM or ANSI code pages used by the client computer. If you have not selected the Validate Code Page option, the Collation tab displays all of the available collations. You can disable this option if you are sure that every character is converted correctly between all the clients and the database. Disabling this setting allows clients that are using different regional settings (code pages) to use the same database, even though special characters entered by one client may not be interpreted correctly by another client or by the server.

The following are other problems that can be caused by not validating the code page:

- The sorting of textual data is governed by the database server, which means that the data may not be sorted according to the rules specified on the "incompatible" client computers. This problem will be more acute if there is some C/AL code that only works correctly when a particular sort order is selected.
- If you are accessing SQL Server with external tools, these tools may not be able to correctly read the data that has been entered by the "incompatible" clients.

We recommend that you use the default setting and validate code pages, because this will avoid all of these problems.

On the Options tab, there are a number of options that may influence performance.

FIGURE 2.9 DEFINING THE RECOVERY MODEL

The Recovery Model
The first option that you should consider is the Recovery Model for the database. This setting determines the kind of information written to the transaction log and therefore the kind of recovery model that you want to use in this database. The available options are:

- Bulk-Logged
- Full
- Simple

The Full and Bulk-Logged recovery models are similar, and many users of the Full recovery model will use the Bulk-Logged model occasionally.

If you select Full, the details of every transaction are stored in the transaction log, and this information can be used when you apply transaction log backups. The Full recovery model uses database backups and transaction log backups to provide complete protection against media failure. If one or more data files are damaged, media recovery can restore all the committed transactions. Incomplete transactions are rolled back. Full recovery allows you to recover the database to the point of failure or to a specific point in time. To guarantee this degree of recoverability, all operations, including bulk operations such as SELECT INTO, CREATE INDEX, and bulk loading data, are fully logged. This model has the biggest effect on the size of the transaction log files.

If you select Bulk-Logged, the transaction log will only contain limited information about certain large-scale or bulk copy operations. The Bulk-Logged recovery model provides protection against media failure combined with the best performance and minimal use of log space for certain large-scale or bulk copy operations. For more information about which operations are logged under this recovery model, see the Minimally logged operations [SQL Server] topic in SQL Server Books Online.

If you select Simple, the database can be recovered to the point at which the last backup was made. However, you cannot restore the database to the point of failure or to a specific point in time. To do that, select either the Full or Bulk-Logged recovery model.

For development or test environments, the Simple recovery model can be selected. For production environments, the Full or Bulk-Logged model is highly recommended. We recommend using the SQL Server backup functions to make backups. If you only use Microsoft Dynamics NAV's internal backup functionality to make backups, it makes no sense to use the Full or Bulk-Logged recovery model. On the contrary, the transaction logs will continue to grow until the hard disk runs out of free space. When selecting the Bulk-Logged or Full model, it is extremely important to make backups with the SQL Server functions. For more information about backups, see the Advantages of the SQL Server Option chapter.

Auto-shrink
If you select the Auto-shrink option, the SQL Server Database Engine will periodically shrink the data files. The shrink process runs in the background and reduces the size of the database files by removing free space from them. As already stated, expanding the database can cost time and resources. Therefore, it is recommended not to activate this option for live production environments. However, for static environments (test, development, or archive databases) it can be activated.
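Both settings can also be changed on an existing database from Transact-SQL. The statements below sketch the recommendations for a production database (Full recovery, Auto-shrink off); the database name is a placeholder, and a transaction log backup schedule must be in place before switching to the Full recovery model.

    -- Use the Full recovery model in production so the log can be backed up and the
    -- database restored to the point of failure or to a specific point in time.
    ALTER DATABASE [Demo Database NAV (6-0)] SET RECOVERY FULL;

    -- Do not let the engine shrink data files in the background on a live system.
    ALTER DATABASE [Demo Database NAV (6-0)] SET AUTO_SHRINK OFF;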

Torn Page Detection
Torn Page Detection is a built-in check mechanism that allows SQL Server to guarantee data integrity. If this option is activated, SQL Server marks database pages with a small checksum and checks this number when reading from and writing data pages to the database. Activating this option causes a small performance hit for every read and write operation, which can become visible for large record sets. In SQL Server 2005, new I/O verification mechanisms were introduced, and Torn Page Detection is likely to be removed in a future SQL Server version. Therefore, it is recommended not to activate this option.

Allow Find as You Type
This setting determines whether you can use the Find as You Type option when using the Find function to find an entry in a table or form. Using the Find as You Type facility can affect performance, because requests are sent to the server for each character that is typed. By default, the option is selected.

Enable for Microsoft Dynamics NAV Server
This option determines whether a Microsoft Dynamics NAV database is enabled for the three-tier architecture. If you activate this option, objects will be automatically compiled upon import into the database. Be aware that object compilation will take longer, because the objects will be translated to C# code and saved as a BLOB in the Object table. If you do not plan to use the three-tier architecture, it is recommended that you disable this option. If you create a new database in Microsoft Dynamics NAV 2009, the option is selected by default. When upgrading a Microsoft Dynamics NAV 5.0 database, you must select this option if you plan to use only the RoleTailored client.

The Advanced tab contains settings that let you control the way locking is handled in the database.

FIGURE 2.10 ADVANCED DATABASE SETTINGS

Lock Timeout and Timeout Duration (sec.)
The Lock Timeout setting allows you to specify whether a session will wait to place a lock on a resource that has already been locked by another session. If you select this option, you can specify a Timeout Duration, which is the maximum length of time a session will wait to place a lock on a resource that has already been locked by another session. The default value is 10 seconds, which under normal circumstances is sufficient. You can change this value. If this option is cleared, the session will wait indefinitely, which is not recommended. If a lock timeout error message occurs while this option is activated, you can slightly increase the value. However, this may be an indication of a malfunctioning component (a network failure, limited network connectivity, suboptimal coding, or insufficient database maintenance). Use monitoring tools to locate the cause of the problem.

Always Rowlock
This setting allows you to specify that Microsoft Dynamics NAV always places row-level locks instead of page- and table-level locks. By default, the Always rowlock option is not enabled. This means that the Query Optimizer will decide how to lock a table: by row, by page, or by table. Because row locks require additional memory, SQL Server will sometimes convert multiple row locks into a single table lock and use the memory for other purposes. The advantage of this is that SQL Server needs less memory to maintain the locks, which has a positive effect on performance. It can, however, have a negative impact on concurrency, because page locks can lock too much data.

To keep the Query Optimizer from choosing a locking method, you can activate the Always rowlock option, so that Microsoft Dynamics NAV sends ROWLOCK hints to SQL Server. Sending row locks reduces the risk of blocks, but puts pressure on the master database, because administering row locks has a higher cost than administering other locks. If you have a high transaction volume and deal with large result sets, the Always rowlock option can cause an overall decrease in performance if the master database reacts too slowly because of the high number of lock administrations. We do not recommend activating this option. If you do activate it, make sure that SQL Server has sufficient memory to maintain the locks.

Security Model
The security model setting allows you to specify whether this database uses the Standard or the Enhanced security model. The default setting is Enhanced.

The main difference between these two security models is how they synchronize the Microsoft Dynamics NAV security system with SQL Server and the way that they integrate the Microsoft Dynamics NAV security system with Windows authentication. Security will be discussed in the next lesson. The Standard model does not require synchronization, except for an initial synchronization when changing from Enhanced to Standard. We recommend that you use the Standard security model because it does not require all the synchronizations. The biggest difference is that the Standard security model creates one application role, whereas the Enhanced model creates a separate application role for each user.

The security model can be changed at any time, but doing so can take time, depending on the number of users. To do this, open the database in single-user mode, change the model in the Alter Database window, and clear the single user option. Close and reopen the database.

Caching
The Record Set field allows you to specify how many records are cached when Microsoft Dynamics NAV performs a FINDSET operation with the ForUpdate parameter set to FALSE. The default value is 50. In versions before Microsoft Dynamics NAV 2009, the default value was 500. The recommended setting for this property is the average number of sales lines. If a FINDSET statement reads more than the number of records specified here, additional SQL statements will be sent to the server, which decreases performance. Increasing this value also increases the amount of memory that the client uses to support each FINDSET statement. These statements are discussed further in the Performance Audits chapter.

Configure the RoleTailored Client
The RoleTailored client is a component of the RoleTailored architecture in Microsoft Dynamics NAV 2009. As you run Microsoft Dynamics NAV 2009 Setup, you can provide configuration information for the RoleTailored client. This information is then written by Setup to the ClientUserSettings.config file.

On Windows Server 2003 or Windows XP, a separate instance of ClientUserSettings.config is maintained for each client user at the following location:

Documents and Settings\<username>\Local Settings\Application Data\Microsoft\Microsoft Dynamics NAV

On Windows Vista or Windows Server 2008, the location is:

Users\<username>\AppData\Local\Microsoft\Microsoft Dynamics NAV

This file is hidden. Change your folder options in Windows Explorer to allow access to hidden files before trying to edit ClientUserSettings.config. After you install the RoleTailored client, configuration settings are stored in the ClientUserSettings.config file. If you have configured the RoleTailored client in Setup, your setting values are included in this file. In addition to the settings you can configure in Setup, ClientUserSettings.config contains additional settings for the RoleTailored client. After modifying ClientUserSettings.config, you must restart the RoleTailored client for the changes to take effect. For more information about the different settings, see Configuring the RoleTailored Client (http://msdn.microsoft.com/en-us/library/dd301077.aspx).

Configure the Classic Client Object Cache
The object cache, like any cache, allows Microsoft Dynamics NAV to work faster. Objects such as code, descriptions, and windows that will be used on the client computer are stored in the object cache. This means that the client computer only needs to retrieve these objects once from the server. The client computer must have sufficient memory to store the objects while they are being used. To change this setting, in the menu select Tools, and then Options.

FIGURE 2.11

In the Options window, set the Object Cache parameter to the desired value. The default value is 32 MB. To disable the object cache completely, set the value to 0. Disabling the object cache is frequently done by developers to make sure they always have the latest version of the objects. Be aware that disabling the object cache can have a negative impact on performance, because each object must be retrieved from the server.

If you do not want to disable the object cache, you have to restart the Classic client to make sure you work with the latest version of the objects. The RoleTailored client will notice page and report modifications and automatically use the latest version without restarting.

Configure the Microsoft Dynamics NAV Service Tier
In Microsoft Dynamics NAV 2009, the client tier does not connect directly to the database tier. The client tier connects to the service tier, which in turn connects to the database tier (and also executes the business logic). When you install the Service Tier, it waits for a connection from a RoleTailored client, so even if the Service Tier is started, it consumes minimal resources until RoleTailored clients connect to it.

After you install the Service Tier, it must be configured to connect to a SQL Server database. Each Service Tier has its own configuration file called CustomSettings.config. This file can be found in the Service folder in the installation path of Microsoft Dynamics NAV 2009. Note that the installation path varies according to the version and language of the operating system. If you install on a 64-bit computer, it will be installed under C:\Program Files (x86) because the Service Tier in Microsoft Dynamics NAV 2009 is 32-bit only.

A Microsoft Dynamics NAV Server can only connect to one SQL Server database at a time. However, in many cases you have multiple databases in the Microsoft Dynamics NAV implementation (for example, a test database and a production database). You can change the DatabaseServer and DatabaseName keys in the configuration file and restart the service tier every time you need to connect to another database, but this is not really a best practice. In environments that have many users, you can enable multiple service tiers on the same database to do load balancing. In both scenarios, you probably want to install additional service tiers and configure each service tier to connect to the correct database.

If you have multiple service tiers on the same computer, you have to change the InstanceName, ServerPort, and WebServicePort keys for each service tier. When installing multiple service tiers, we recommend that you use the InstanceName key to differentiate between service tiers, rather than the ports. For more information about how to install multiple service tiers, see the following walkthroughs:

Walkthrough: Installing the Three Tiers On Two Computers (http://msdn.microsoft.com/en-us/library/dd355184.aspx)

Walkthrough: Installing the Three Tiers on Three Computers (http://msdn.microsoft.com/en-us/library/dd301254.aspx)
Walkthrough: Accessing Multiple Microsoft Dynamics NAV Databases from a Single Microsoft Dynamics NAV Server Computer (http://msdn.microsoft.com/en-us/library/dd301437.aspx)

Configure Microsoft SQL Server
Microsoft Dynamics NAV 2009 imposes no special requirements for the Microsoft SQL Server installation. If you already have Microsoft SQL Server installed, you should be able to use it with Microsoft Dynamics NAV, provided you have the necessary components installed. If you do not have Microsoft SQL Server installed, this topic provides information about how to install and configure Microsoft SQL Server.

SQL Server Components to Install
Microsoft Dynamics NAV 2009 is compatible with either Microsoft SQL Server 2005 or Microsoft SQL Server 2008. However, the list of SQL Server components you should install varies somewhat between the two versions. If you have already installed SQL Server, you may need to modify the installation to add components.

Microsoft SQL Server 2005 Components
If you are installing Microsoft SQL Server 2005 to use with Microsoft Dynamics NAV, install the following components:

- SQL Server Database Services
- Workstation components, Books Online, and development tools

After you install Microsoft SQL Server 2005, remember to also install Service Pack 2.

Microsoft SQL Server 2008 Components
If you are installing Microsoft SQL Server 2008 to use with Microsoft Dynamics NAV, install the following components:

- Database Engine Services
- Client Tools Connectivity
- Management Tools - Complete

Setup Options for Microsoft SQL Server
When running Microsoft SQL Server Setup, you will be required to provide various pieces of information. Your responses can affect your use of SQL Server with Microsoft Dynamics NAV 2009.

SQL Server Instances
Use the default instance and Instance ID on the Instance Configuration page as you install Microsoft SQL Server. If you need to use a non-default instance or instance ID, contact Microsoft Dynamics NAV support for information about how to customize your installation to work correctly with Microsoft Dynamics NAV 2009.

Server Configuration
We recommend that you use a dedicated domain user account created specifically for the SQL Server service (MSSQLSVC), rather than the Local System account or the Network Service account. For more information about service security, see the Microsoft Dynamics NAV 2009 Security Hardening Guide (http://go.microsoft.com/fwlink/?linkid=126282).

Configuring SQL Server for Microsoft Dynamics NAV 2009
The following additional steps are required to configure SQL Server for Microsoft Dynamics NAV:

- Set the Microsoft Dynamics NAV trace flag for SQL Server.
- Configure the Microsoft Dynamics NAV extended stored procedures for a 64-bit version of SQL Server.

Procedure: Set the Microsoft Dynamics NAV Trace Flag for SQL Server
On a 64-bit version of Microsoft SQL Server, you must set the trace flag to enable Microsoft Dynamics NAV Server to connect to the SQL database. The Microsoft Dynamics NAV 2009 Setup program automatically sets the trace flag when you install Microsoft Dynamics NAV Database Components to a 32-bit version of SQL Server. Perform the following steps to set the trace flag:

1. Open SQL Server Configuration Manager.
2. In the left pane, click SQL Server 2005 Services.
3. In the right pane, right-click SQL Server (MSSQLSERVER), and on the shortcut menu, click Properties.
4. In the Properties window, click the Advanced tab.
5. Click the Startup Parameters property, and open the drop-down list.
6. Type ;-T4616 at the end of the line in the drop-down list and then click OK.
7. In the right pane, right-click SQL Server (MSSQLSERVER), and on the shortcut menu, click Restart.
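Adding -T4616 as a startup parameter makes the trace flag permanent. As a quick check, or to turn a trace flag on without restarting the service (it then only remains active until the next restart), the standard DBCC commands can be used. This is only a sketch of those commands; it does not replace the startup-parameter procedure above.

    -- Enable trace flag 4616 globally for the running instance (lost after a service restart).
    DBCC TRACEON (4616, -1);

    -- Enable detailed deadlock reporting to the SQL Server log (discussed in the Performance Audit chapter).
    DBCC TRACEON (1222, -1);

    -- List the trace flags that are currently active.
    DBCC TRACESTATUS (-1);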

Setting the -T4616 trace flag is done automatically by Microsoft Dynamics NAV 2009 SP1. In the same way, you can add additional trace flags, such as 1204 and 1222, to the SQL Server configuration to facilitate SQL Server monitoring. These trace flags send additional information related to locks and deadlocks to the SQL Server log. Both trace flags will be discussed in the Performance Audit chapter. The -T4616 trace flag is no longer required with Microsoft Dynamics NAV 2009 SP1 and onwards. For more information about SQL Server trace flags, see Trace Flags (http://msdn.microsoft.com/en-us/library/ms188396.aspx).

Procedure: Configure the Microsoft Dynamics NAV Extended Stored Procedures for a 64-bit Version of SQL Server
When installing Microsoft Dynamics NAV 2009 on a 64-bit version of SQL Server, you must manually configure the extended stored procedures that Microsoft Dynamics NAV uses to interact with SQL Server. The Microsoft Dynamics NAV Setup program automatically installs these extended stored procedures when you install Microsoft Dynamics NAV Database Components to a 32-bit version of SQL Server. Perform the following steps to configure the Microsoft Dynamics NAV extended stored procedures on a 64-bit version of SQL Server:

1. Copy xp_ndo_x64.dll from the Sql_esp\x64 folder on the Microsoft Dynamics NAV DVD to the Microsoft Dynamics NAV Database folder on the server computer. In a default installation, the location of the Database folder is C:\Program Files (x86)\Microsoft Dynamics NAV\60\Database.
2. Open SQL Server Management Studio.
3. In the Connect to Server pane, fill in the following fields:

   Field             Action
   Server Type       Select Database Engine.
   Server Name       Select the name of the computer hosting SQL Server.
   Authentication    Select Windows Authentication.

4. Click Connect.
5. In the left pane, expand Databases, System Databases, master, Programmability, and then Extended Stored Procedures. Update the xp_ndo_enumusergroups extended stored procedure with the 64-bit DLL.

6. Under Extended Stored Procedures, right-click xp_ndo_enumusergroups, and, on the shortcut menu, select Properties.
7. In the Select a Page pane, click General. In the DLL property, provide the full path of the 64-bit DLL that you previously copied, such as C:\Program Files (x86)\Microsoft Dynamics NAV\60\Database\xp_ndo_x64.dll. Then, in the Select a Page pane, click Permissions.
8. In the Users or roles section, click Add to open the Select Users or Roles window.
9. Click Browse, select [public], and then click OK. Click OK to close the Select Users or Roles window.
10. In the Users or roles section, click public. In the Explicit permissions for public section, select the Grant check box for the Execute permission. Click OK to close the Properties window.
11. Under Extended Stored Procedures, right-click xp_ndo_enumusersids, and, on the shortcut menu, select Properties. Repeat steps 7 through 10 to point this extended stored procedure to the same 64-bit DLL and to grant the Execute permission to public.
12. Restart the SQL Server service. To do this:
13. Open SQL Server Configuration Manager.
14. In the left pane, click SQL Server 2005 Services.
15. In the right pane, right-click SQL Server (MSSQLSERVER), and on the shortcut menu, click Restart.

Security Synchronization

An enterprise business solution must have a built-in security system that protects your database and the information it contains from being accessed by unauthorized people. It must also allow you to specify what the authorized users are allowed to do in the database (whether they can read, enter, or modify data).

The minimum acceptable level of security requires that each user be assigned an ID and a password. This ensures that only authorized personnel can gain access to your database. This is database-level security.

A medium level of security requires you to limit the users' access so that they can only access certain types of information stored in the database. In other words, they can only access particular tables in the database. This is table-level security.

A high level of security requires that you limit the access that users have to the information stored in the tables. This is record-level security.

The SQL Server Option for Microsoft Dynamics NAV satisfies these requirements by integrating its own security system (which includes record-level security) with the Microsoft SQL Server security system and with the Windows security system. This allows Microsoft Dynamics NAV to use the unified login system provided by Windows. If your domain is running on Windows Server 2003 or Windows Server 2008, Microsoft Dynamics NAV makes use of both the Active Directory Services and the single sign-on system.

Security Overview

To understand how security is managed in the SQL Server Option for Microsoft Dynamics NAV, you must understand the SQL Server security system, Active Directory, the Microsoft Dynamics NAV security system, and how they interact. With several security systems interacting, the terminology can be confusing, so before explaining how the Microsoft Dynamics NAV security system works it is necessary to clarify two key concepts.

Authentication

Authentication is the process by which the system validates the user identity. This can be done by having the user enter an ID and password when they log on. Microsoft Dynamics NAV supports two kinds of authentication: Windows authentication and database server authentication.

Login

When users have identified themselves and been recognized by the system, they are granted access to the parts of the system for which they have permission. If the user has used Windows authentication to log on to the system, then he or she has been assigned a Windows login. If users have used database server authentication to log on to the system, they have been assigned a database login.

The following table shows what the different authentication modes require from the user before granting access to databases.

Windows authentication:
  Windows requires - a Windows account (user ID and password).
  SQL Server requires - a Windows login.

Database server authentication:
  SQL Server requires - a SQL Server login (user ID and password).

In the SQL Server Option for Microsoft Dynamics NAV, database server authentication is based on Microsoft SQL Server authentication.

Windows Authentication

The Windows single sign-on and the unified login supported by Windows are the same. In this course, both of these systems are referred to as Windows authentication.

Chapter 2: Setup and Installation With Windows authentication, when a user tries to connect with SQL Server to open a database, they do not have to supply a user ID or password. Microsoft Dynamics NAV automatically asks Windows to confirm whether this user, who has already logged on to the network, has a valid Windows account and whether this account gives them the right to access this particular server. If the user is allowed to access the server then Microsoft Dynamics NAV will check to see if the user has been assigned a Windows login within Microsoft Dynamics NAV. If the user has a Windows login, they will be granted access to the database. The user will be granted access to Microsoft Dynamics NAV and be given the permissions specified for that Windows user and those specified for any Windows groups of which they are a member. If the user does not have a valid Windows account or if their account does not include permission to log on to the Microsoft Dynamics NAV database, authentication fails and the user receives an error. The RoleTailored client and Microsoft Dynamics NAV Service Tier only support Windows authentication. Advantages of Windows security The Windows authentication system includes the following security features: Secure validation and encryption of passwords A time limit on passwords Minimum password length Account lockout after an invalid password is entered Active Directory and Microsoft Dynamics NAV To take full advantage of the features provided by the Active Directory Security system, the Microsoft Dynamics NAV client computers and the domain controller must all either be running on Windows XP or Windows Vista or otherwise have access to Active Directory. If your Microsoft Dynamics NAV client computers do not have access to Active Directory, they will not be able to see the Windows Users & Groups window. When they create or open a database, the clients will not be able to see the generated list of available servers either. Active Directory allows the administrator to give administrative permissions to other users, thereby delegating large areas of responsibility to other members of the organization. This feature makes administering Microsoft Dynamics NAV more flexible. Other users, for example, department managers, can administer all the groups that they need within their department from the Microsoft Management Console. 2-45

SQLServer Installation and Optimization for Microsoft Dynamics NAV 2009 With this tool you can make Windows user members of specific security groups that have already been given roles within Microsoft Dynamics NAV. You can control access to and permissions within Microsoft Dynamics NAV, without having to open the program, provided that the Windows security groups have been given the appropriate roles within Microsoft Dynamics NAV. In an Active Directory environment, Microsoft Dynamics NAV allows you to create Microsoft Dynamics NAV users and roles from Windows accounts, and modify the rights of these users and roles. All Active Directory security groups will be visible within Microsoft Dynamics NAV and can be given roles within Microsoft Dynamics NAV. Active Directory Service security The Active Directory Service adds new features to the security used by Microsoft Dynamics NAV. The two key features are: The administrators can grant or deny users access to Microsoft Dynamics NAV by adding them to or deleting them from a Windows security group. The administrators can allow other people in the organization to create and administer users and groups. The SQL Server Security System Microsoft SQL Server has two levels of security: server security and database security. The SQL Server Option for Microsoft Dynamics NAV works with both levels of security and interacts with them by means of an automatic synchronization process. Server security consists of server-wide security accounts (known as logins), which are used to authenticate users before granting them access to the server. Database security consists of database-specific security accounts that control the level of access and the permissions granted to individual users for the databases on the server. Server Security The SQL Server security system authenticates users by validating their logins before granting them access to any of the resources contained in the system. SQL Server employs two types of authentication. These correspond to the two types of logins that can be created in SQL Server: Windows logins and SQL Server logins. The Windows authentication used by SQL Server corresponds to the Windows authentication used by Microsoft Dynamics NAV, as described earlier. 2-46

Chapter 2: Setup and Installation The database server authentication used by Microsoft Dynamics NAV refers to SQL Server authentication. It is used when the network administrator has decided not to support Windows authentication or the SQL Server administrator has chosen not to use Windows authentication. With this method, SQL Server performs its own authentication of the user's ID and password. SQL Server does this by checking whether an SQL Server login with this user's ID and password has been created. This login must first have been created by an SQL Server administrator, with an SQL Server tool. If an SQL Server login has not been set up, authentication fails and the user receives an error. Database Security In the SQL Server security system, access to individual databases on the server is controlled by the database user accounts in each database. The user is granted access to the server after the login has been authenticated. Database security then validates permissions by checking the database user accounts on the server. The permissions that the user has been granted to the various objects within the database, such as tables, are determined by the information contained in the user's database user account. It also contains information about any additional permissions that the user may have been granted to alter the database itself. Users who have valid SQL Server logins, but no database user accounts, will be granted default permissions. The default setting grants users access to the master database as guests. Guests have very limited rights. This means that a valid SQL Server login always gives access to at least one database. Microsoft Dynamics NAV and the SQL Server Security System The previous sections of this lesson have been devoted to explaining the SQL Server security system and the Microsoft Dynamics NAV security system. This section explains how these two systems interact. Note that Microsoft Dynamics NAV has two login tables. Windows logins are listed in the Windows Login table. Database logins are listed in the User table. The main portion of the security system for the SQL Server Option for Microsoft Dynamics NAV is the synchronization process. The synchronization process ensures that the information contained in the Microsoft Dynamics NAV User table and Windows Login table corresponds with the information contained in the SQL Server security system. SQL Server database user accounts contain information about the permissions that the users have to the objects contained in the database. The information for managing permissions to Microsoft Dynamics NAV objects is contained and administered within Microsoft Dynamics NAV. 2-47
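Although logins and permissions for Microsoft Dynamics NAV are administered from within Microsoft Dynamics NAV, the resulting SQL Server logins and database user accounts can be inspected with ordinary T-SQL. The following read-only queries are a sketch that assumes the standard demo database name:

   USE [Demo Database NAV (6-0)]
   GO
   -- Server-level logins: Windows logins (U), Windows groups (G), and SQL Server logins (S)
   SELECT name, type_desc FROM sys.server_principals WHERE type IN ('U', 'G', 'S');
   -- Database user accounts maintained in the current database
   SELECT name, type_desc FROM sys.database_principals WHERE type IN ('U', 'G', 'S');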

SQLServer Installation and Optimization for Microsoft Dynamics NAV 2009 Synchronization of User Accounts Every time a user is added, deleted, or renamed in the Windows Login table or User table, a synchronization process is initiated. The synchronization process compares the Microsoft Dynamics NAV login tables with the security system in SQL Server. It modifies the SQL Server security system to reflect the changes made in the Microsoft Dynamics NAV Windows Login table or User table. This means that every time a Microsoft Dynamics NAV database administrator alters the information about a login in one of the Microsoft Dynamics NAV login tables for a particular database, the synchronization process automatically updates the information contained in the SQL Server database user accounts for this database. Microsoft Dynamics NAV cannot create or delete an SQL Server login. The SQL Server login must first be created by an SQL Server administrator. Microsoft Dynamics NAV can only verify or reject the validity of a login before updating the database user account. However, Microsoft Dynamics NAV can create and delete Windows logins in SQL Server through the synchronization process. Adding Users For both kinds of logins, the synchronization process creates a database user account for the login in the corresponding database if such an account does not already exist. If a new Windows login is added to the Windows Login table of a Microsoft Dynamics NAV database, the synchronization process matches this login to that in SQL Server. This is done by comparing the security identifiers (SIDs) of the two logins. If the synchronization process does not find a match, the system creates a new Windows login in SQL Server. If a new database login is added to the User table of a Microsoft Dynamics NAV database, the synchronization process checks whether this user ID has a valid SQL Server login in SQL Server. This SQL Server login must have the same name (user ID) as the Microsoft Dynamics NAV login that is being added. Deleting Users For both kinds of logins, if you delete a login from one of the Microsoft Dynamics NAV login tables, the synchronization process deletes the SQL Server database user account for that login. Note that if you delete SQL Server database user accounts from outside Microsoft Dynamics NAV, without deleting the login in Microsoft Dynamics NAV, synchronization will create new database user accounts for these users. When a Windows login is deleted from the Microsoft Dynamics NAV Windows Login table, you are asked if you want to delete this user's Windows login on 2-48

Chapter 2: Setup and Installation SQL Server. Microsoft Dynamics NAV does not delete the Windows login on SQL Server automatically. When a database login is deleted from the Navision User table, the synchronization process will not delete the SQL Server login. It can only be deleted by an SQL Server administrator using an SQL Server tool, such as SQL Server Management Studio. When Microsoft Dynamics NAV tries to match user IDs in the User table with SQL Server logins, the uppercase user ID in the User table is matched with the uppercase representation of the logins in SQL Server, regardless of case. Synchronizing The synchronization process can be initiated from within Microsoft Dynamics NAV. To start the synchronization process, click Tools, Security, Synchronize All Logins or Synchronize Single Login. You may need to initiate the synchronization process after restoring a Microsoft Dynamics NAV backup. If the logins in the backup do not match the SQL Server logins or the Windows users and groups, the necessary changes must be made to the Microsoft Dynamics NAV logins, Windows users and groups or SQL Server logins, after the backup has been fully restored. Re-initiate the synchronization process after these changes have been made. Never use SQL Server tools to add or delete information stored in the Microsoft Dynamics NAV Windows Login table or User table because this information is used during the synchronization process. Security Models Microsoft Dynamics NAV contains a comprehensive security system that enables you to manage the access that all of your users have to the objects and data in your Microsoft Dynamics NAV database. As this database is stored on SQL Server, the Microsoft Dynamics NAV security system and SQL Server's own security system must work together to ensure that only authorized users can access the database. The Microsoft Dynamics NAV security system therefore contains a synchronization mechanism that ensures that the information contained in the Microsoft Dynamics NAV security system corresponds with the information contained in the SQL Server security system. Microsoft Dynamics NAV allows you to specify the following level of security to implement: Standard Security Enhanced Security The main difference between these security models is the way in which they synchronize the Microsoft Dynamics NAV security system with SQL Server and the way that they integrate the Microsoft Dynamics NAV security system with Windows authentication. Another difference is that Standard security creates one 2-49

application role, whereas Enhanced Security creates a separate application role for each user.

The security system is not synchronized automatically when you do the following:

- Change the security model
- Restore a backup
- Convert a database
- Update the executable files
- Update the application

To change the security model used in the database, you must be:

- A member of the sysadmin server role on SQL Server, or a member of the db_owner database role for the database in question.
- Assigned the SUPER role in Microsoft Dynamics NAV.

Before you change security models, you must ensure that both of the extended stored procedures that come with Microsoft Dynamics NAV have been added to the instance of SQL Server that you are using. These extended stored procedures are called:

- xp_ndo_enumusergroups
- xp_ndo_enumusersids

These extended stored procedures are part of the xp_ndo.dll that comes on the Microsoft Dynamics NAV product DVD.

The main differences between the two security models are listed in the following table:

Synchronization performance:
  Standard Security - Fast.
  Enhanced Security - Slower. If you have several companies and many users in the same database, the synchronization process will be slower with Enhanced Security.

Windows groups displayed:
  Standard Security - Local domain plus forest of domains.
  Enhanced Security - Local domain only.

Logins required in Microsoft Dynamics NAV:
  Standard Security - Windows groups and individual Windows users.
  Enhanced Security - Windows groups plus the members of each group and individual Windows users.

Granularity of synchronization:
  Standard Security - Entire security system.
  Enhanced Security - Entire security system and individual logins.

Automatic synchronization when you insert, modify, or delete a Windows login or a database login in Microsoft Dynamics NAV:
  Standard Security - Yes.
  Enhanced Security - No.

Required extended stored procedure:
  Standard Security - xp_ndo_enumusersids.
  Enhanced Security - xp_ndo_enumusergroups.

When to Synchronize the Security System

The Microsoft Dynamics NAV security system must be synchronized with SQL Server every time you do the following:

- Change the security model.
- Change the users, permissions, or roles in Microsoft Dynamics NAV.
- Restore a backup.
- Convert a database.
- Modify an object in the database.
- Update the executable files.
- Update the application.

NOTE: Every time you modify an object in the database or modify the permissions that an object has to other database objects, you must update all the roles and users who have permission to access this object, and then you must synchronize these roles and users.

SQLServer Installation and Optimization for Microsoft Dynamics NAV 2009 Selecting the Security Model You can specify the security model to use in a database when creating the database. You specify the security model in the Advanced tab of the New Database window. FIGURE 2.12 SELECTING THE SECURITY MODEL You can change the security model used in the database in the Alter Database window. Perform the following steps to change the security model: 1. Ensure that no other users or tools, such as Microsoft SQL Server Management Studio, are using the database. 2. Close and reopen the database to ensure that you are the only user currently accessing the database. 3. Click File, Database, and then Alter to open the Alter Database window. Click the Options tab. 4. Select the Single user field. This prevents other users and tools from accessing the database and blocking the synchronization. 5. Click the Advanced tab. 6. In the Security Model field, click the drop-down list and select the security model to implement in this database. 7. Click OK to alter the database and change the security model. As mentioned earlier, you must synchronize the entire Microsoft Dynamics NAV security system with SQL Server after changing the security model. Be sure to clear the check mark in the Single user field before synchronizing the security system. If another user logs on to the database while you are synchronizing the security system, they will not have access to all the resources that they need. 2-52

After Changing the Security Model

When changing from Enhanced Security to Standard Security, you can delete all of the individual users who have been given a Windows login in Microsoft Dynamics NAV if the Windows group(s) that they are members of have the permissions required. However, if any of the users have been assigned extra permissions, do not delete them from the Windows Logins window.

When changing from Standard Security to Enhanced Security, you must give all of the individual members of any Windows groups entered in the Windows Logins window a Windows login of their own. You do not need to assign any permissions or roles to these logins in Microsoft Dynamics NAV. The Windows group that they belong to has already been assigned the permissions they need.

Standard Security

The Standard Security model only allows you to synchronize the entire security system when updating the permissions system in Microsoft Dynamics NAV.

When using Standard Security, you can enter a Windows group in the Windows Logins window and assign it a role in Microsoft Dynamics NAV. All the users who are members of this Windows group are then automatically assigned this role in Microsoft Dynamics NAV.

With the Standard Security model, the Windows Users & Groups window lists all of the Windows groups and users that are shown in Active Directory and the local groups on your computer.

With the Standard Security model, every time you create, modify, or delete a Windows login or a database login, the security system is automatically synchronized. However, if you add, alter, or delete a role in the Microsoft Dynamics NAV security system, you must manually synchronize the security system.

The synchronization of the security system is performed faster with Standard Security than it is with Enhanced Security. Standard Security may be preferable if you have several companies in the same database and need to update the security system on a regular basis.

Synchronizing the Standard Security Model

If using the Standard Security model, you can only synchronize the entire security system. To synchronize the entire security system, click Tools, Security, Synchronize All. When you are using the Standard Security model, synchronizing the entire security system is a short process.

SQLServer Installation and Optimization for Microsoft Dynamics NAV 2009 Enhanced Security The Enhanced Security model has a more refined synchronization system. Enhanced Security allows you to do the following: Synchronize individual users one at a time. When you modify the permissions of a particular user, you can select that user and synchronize them. Synchronize the entire security system at one time. When using Enhanced Security, you can enter a Windows group in the Windows Logins window and assign it a role in Microsoft Dynamics NAV. However, you must also enter all of the individual users who are members of this Windows group in the Windows Logins window. You do not need to assign these individual logins any permissions in Microsoft Dynamics NAV as they receive their permissions by virtue of their membership of the Windows group. However, you can assign them any extra permissions that they might need in Microsoft Dynamics NAV. When using Enhanced Security, the Windows Users & Groups window only lists all of the Windows groups and users that are visible to you in Active Directory. You cannot see any local groups on your computer. IMPORTANT: All of the Microsoft Dynamics NAV users must be members of the current domain. With the Enhanced Security model, every time you create, modify, or delete a Windows login or a database login, the security system is not automatically synchronized. You must remember to synchronize the security system yourself. No message is displayed. If you have implemented Enhanced Security, synchronizing the entire security system can be a lengthy process and is considerably slower than Standard Security. It is therefore recommended that no other users be logged on to the database when you synchronize the entire security system. Synchronizing the Enhanced Security Model If using the Enhanced Security model, you can synchronize: A single user All (the entire security system) Perform the following steps to synchronize a user: 1. Click Tools, Security, and Windows Logins to open the Windows Logins window. 2-54

Chapter 2: Setup and Installation 2. Select the Windows login that you want to synchronize. You can only synchronize one login at a time. 3. Click Tools, Security, and Synchronize Single Login to synchronize that login. You can also open the Database Logins window and synchronize the database logins one at a time. To synchronize the entire security system, click Tools, Security, and Synchronize All and the entire security system will be synchronized. Note that synchronizing the entire security system can take a considerable time. NOTE: When you are altering the permissions of users or roles in Microsoft Dynamics NAV, make sure that none of the users whose permissions are being altered, or who have been assigned the role you are altering, are logged on to the database. When synchronizing the entire security system, make sure that you are the only user in the database. Summary This chapter explains how to evaluate the software and hardware requirements for Microsoft Dynamics NAV 2009. It also reviews the Microsoft Dynamics NAV two-tier and three-tier architectures. Next, the chapter provides an overview of the recommended settings for a Microsoft Dynamics NAV database. Finally, the security models and the synchronization process are explained. 2-55

SQLServer Installation and Optimization for Microsoft Dynamics NAV 2009 Test Your Knowledge Test your knowledge with the following questions. 1. Which version of RAID is not recommended for use in ERP applications? ( ) RAID 0 ( ) RAID 1 ( ) RAID 5 ( ) RAID 10 2. When selecting a recovery model for a database, which option allows for a restore operation to a point in time? ( ) Full ( ) Simple ( ) Bulk-Logged 3. What is the advantage of implementing filegroups? (Select all that apply) ( ) Filegroups allow your files to be grouped inside SQL Server. That way it is easier to find them again when they are lost. ( ) Using files and filegroups improves database performance, because it allows creating a database across multiple disks. ( ) When multiple filegroups are used, the files in a database can be backed up and restored individually. ( ) Filegroups enable data placement, because a table can be created in a specific filegroup. That way all I/O for a specific table can be directed at a specific disk. 4. Which two security models are available for a Microsoft Dynamics NAV database? (Select all that apply) ( ) Standard ( ) Simple ( ) Enhanced ( ) Advanced 2-56

Lab 2.1 - Change the Recovery Model Chapter 2: Setup and Installation Recovery models are designed to control transaction log maintenance. Three recovery models exist: simple, full, and bulk-logged. Typically, a database uses the full or simple recovery model. In this lab you will have to select and implement the appropriate recovery model for the Demo Database NAV (6-0) database. Scenario Cronus International Ltd. has been using Microsoft Dynamics NAV for some time. The company's business has expanded a lot since the application was implemented and so have the requirements for data availability and recovery. Originally, a data restore for a specific point in time was not required. If a database crash occurred, it was considered acceptable to lose a half a day of work. To meet this requirement a series of backup procedures were implemented. There is a weekly full backup procedure. Every weekday at 12:00 P.M. and at 7:00 P.M. there are incremental backups in place. Now, the requirements have changed. It is no longer considered acceptable to lose data whenever a database restore occurs. To meet this requirement you will have to change the recovery model of the database to the appropriate level. Challenge Yourself! You need to change the recovery model of the Cronus database to a setting that meets the requirements. There are two possible ways to achieve this goal. You can use the Microsoft Dynamics NAV database properties window or you can use SQL Server Management Studio to implement the requested change. You must decide which is the best tool to implement the change. Need a Little Help? The requirements specify that no data from the Cronus database may be lost when a disaster recovery situation occurs. When to Use the Simple Recovery Model Use the simple recovery model if the following are all true: Point of failure recovery is unnecessary. If the database is lost or damaged, you are willing to lose all the updates between a failure and the previous backup. You are willing to risk losing some data in the log. 2-57

SQLServer Installation and Optimization for Microsoft Dynamics NAV 2009 You do not want to back up and restore the transaction log, preferring to rely exclusively on full and differential backups. When to Use the Full Recovery Model Use the full recovery model and, optionally, the bulk-logged recovery model if any one of the following is true: You must be able to recover all the data. You must be able to recover to the point of failure. You want to be able to restore individual pages. You are willing to incur the administrative costs of transaction log backups. When to Use the Bulk-Logged Recovery Model The bulk-logged recovery model is intended strictly as an adjunct to the full recovery model. We recommend that you use it only during periods when you are running large-scale bulk operations, and when you do not require point-in-time recovery of the database. Step by Step To comply with the new requirements, you must make sure the recovery model for the Demo Database NAV (6-0) database is set to Full. The following are two ways to change the recovery model: Using the Alter Database window in the Microsoft Dynamics NAV Classic client. Using SQL Server Management Studio (SSMS). Using the Microsoft Dynamics NAV client: Using SSMS: 1. Start your Microsoft Dynamics NAV classic client. 2. Connect and open the Demo Database NAV (6-0) database and company. 3. Go to the File menu, then select Alter database. 4. On the Options tab, to change the recovery model select a different model from the list. The choices are Full, Bulk-logged, or Simple. Select Full. 5. Click OK to save the changes. 1. In the Start menu, select All Programs, Microsoft SQL Server, and then SQL Server Management Studio. 2-58

Chapter 2: Setup and Installation 2. After connecting to the appropriate instance of the Microsoft SQL Server Database Engine, in Object Explorer, click the server name to expand the server tree. 3. Expand Databases, and select the Demo Database NAV (6-0) database. 4. Right-click the database and select Properties to open the Database Properties dialog box. 5. In the Select a Page pane, click Options. The current recovery model is displayed in the Recovery model list box. 6. In the Recovery model field, select Full. 7. Click OK to save the changes. 2-59
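The same change can also be scripted in a query window. The following T-SQL is a minimal sketch that sets the recovery model and then verifies the result; it assumes the standard demo database name:

   ALTER DATABASE [Demo Database NAV (6-0)] SET RECOVERY FULL;
   GO
   -- Verify the current recovery model
   SELECT name, recovery_model_desc
   FROM sys.databases
   WHERE name = 'Demo Database NAV (6-0)';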

SQLServer Installation and Optimization for Microsoft Dynamics NAV 2009 Lab 2.2 - Set Trace Flags It is important to know how to set a trace flag on SQL Server. Before accessing SQL Server from Microsoft Dynamics NAV, you must set trace flag 4616 on SQL Server. Scenario In this lab you will set trace flag 4616 on SQL Server. Next, you will test to verify that the trace flag has been set correctly. Challenge Yourself! Make sure trace flag 4616 is set on SQL Server. Make sure that when the server is rebooted the setting is still enabled. Need a Little Help? Before you can access Microsoft SQL Server 2005 from Microsoft Dynamics NAV, you must set trace flag 4616 on SQL Server 2005. Trace flags are used to customize certain characteristics by controlling how SQL Server operates. For more information, see the section How To: Set the Microsoft Dynamics NAV Trace Flag for 64-bit SQL Server in this chapter. Step by Step Perform the following steps to enable the trace flag through SQL Server Configuration Manager: 1. Open SQL Server Configuration Manager. 2. In the left pane, right-click SQL Server 2005 Services, and then click Open to see of all the services. 3. In the right panel, right-click SQL Server (MSSQLSERVER) and select Properties to open the Properties window. 4. In the Properties window, click the Advanced tab and expand the Advanced option if necessary. 5. Click the Startup Parameters property and open the drop-down list. Type ;-T4616 at the end of the line in the drop-down list. 6. Restart the SQL Server service. Perform the following to enable the trace flag using the DBCC TRACEON statement: 1. Open SSMS. 2. Click the New Query button to open a new query window. 2-60

Chapter 2: Setup and Installation 3. In the New Query window, enter the following SQL statement: DBCC TRACEON (4616) 4. Click the Execute button. Perform the following steps to verify if the trace flag has been set correctly: 1. Open SSMS. 2. Click the New Query button to open a new query window. 3. In the New Query window, enter in the following SQL statement: DBCC TRACESTATUS (4616, -1). If you do not specify a trace flag, you will get an overview of all the global trace flags that are currently enabled. 4. Click the Execute button. 2-61
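Note that DBCC TRACEON (4616) without a second argument enables the flag only for the current connection, and a flag enabled with DBCC TRACEON does not survive a service restart; that is why the -T4616 startup parameter is also set. A sketch of the global variant, which matches the TRACESTATUS check above:

   -- The -1 argument applies the trace flag to all connections, not only the current session
   DBCC TRACEON (4616, -1);
   -- Confirm the global status of the flag
   DBCC TRACESTATUS (4616, -1);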

SQLServer Installation and Optimization for Microsoft Dynamics NAV 2009 Lab 2.3 - Create a Login Stored Procedure A login stored procedure is a procedure that you can use to perform predefined functions after a user logs on to Microsoft Dynamics NAV with Microsoft SQL Server. The login stored procedure is run immediately after the user has logged on to SQL Server and opened a database and before Microsoft Dynamics NAV carries out any tasks including executing any C/AL triggers. The user must have successfully logged on to the server and have access to the database before the stored procedure is run. Scenario You are the database administrator for Cronus International Ltd. Because you need to perform a large maintenance task on the Microsoft Dynamics NAV 2009 database, you want to make sure nobody else is able to log into the Demo Database NAV (6-0) database. Challenge Yourself! To accomplish this task, you will create and implement a login stored procedure for the Demo Database NAV (6-0) database. This stored procedure will execute every time someone tries to login to the database, until you remove it from the database. The stored procedure will compare the user name to the user name of the administrator. If they are different, login will not be granted and a corresponding error message displays. Finally, you will perform a test to be sure the stored procedure is working. Need a Little Help? The stored procedure is created in the database and has a predefined name and a list of parameters. The stored procedure is called [sp_$ndo$loginproc] and has the following characteristics: It takes two VARCHAR parameters: the name of the application and the C/SIDE version number. These parameters must be declared as part of the stored procedure but do not have to be used. It can perform transactions. Microsoft Dynamics NAV uses a COMMIT to flush any outstanding transactions after the stored procedure has finished executing. 2-62

- The RAISERROR statement can be used to display an error message in Microsoft Dynamics NAV and prevent the user from accessing the database.
- The PRINT statement can be used to display a warning in Microsoft Dynamics NAV and allow the user to access the database.
- If the stored procedure returns a value, it is ignored.
- If the stored procedure does not exist, no action is taken by Microsoft Dynamics NAV and the login process continues as usual.

After creating the stored procedure, start the RoleTailored and the Classic clients with different credentials.

Step by Step

Perform the following steps to create a login stored procedure:

1. Open SSMS.
2. Click the New Query button to open a new query window.
3. In the New Query window, enter the following SQL statement:

   USE [Demo Database NAV (6-0)]
   GO
   IF EXISTS (SELECT name FROM sysobjects
              WHERE name = 'sp_$ndo$loginproc' AND type = 'P')
       DROP PROCEDURE [sp_$ndo$loginproc]
   GO
   CREATE PROCEDURE [sp_$ndo$loginproc]
       @appname VARCHAR(64) = NULL,
       @appversion VARCHAR(16) = NULL
   AS
   BEGIN
       IF SUSER_SNAME() NOT IN ('CONTOSO\Administrator')
           RAISERROR('The system administrator has currently disabled logging into the database. Please try again later.', 11, 1)
   END
   GO
   GRANT EXECUTE ON [sp_$ndo$loginproc] TO public
   GO

4. Click the Execute button.
5. Start the Microsoft Dynamics NAV RoleTailored client. The error message is not displayed and the user is granted access.
6. Close the RoleTailored client.
7. Log off from Windows.

8. Log on to Windows with the following credentials:
   - User name: Susan
   - Password: pass@word1
   - Domain: CONTOSO
9. Start the Microsoft Dynamics NAV Classic client. The error message is displayed.
10. Click OK to close the error message dialog box. The user is not logged in.
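When the maintenance task is finished, remember to remove the login stored procedure again so that users can log on as usual. A minimal sketch:

   USE [Demo Database NAV (6-0)]
   GO
   DROP PROCEDURE [sp_$ndo$loginproc]
   GO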

Lab 2.4 - Use Filegroups to Change the Storage Location of a Table

In this lab you move a Microsoft Dynamics NAV table from one filegroup to another.

Scenario

Cronus International Ltd. has been working in Microsoft Dynamics NAV for some time now and has noticed that the data in some tables is growing enormously. Management does not allow deleting or archiving old data in tables because this data has to be available whenever it is requested. As a database administrator you know that spreading a database over multiple hard disks would improve performance. Management allowed the purchase of a new hard disk and you installed it on the database server. Your task now is to move some tables to the newly installed hard disk.

Database monitoring has shown that a large percentage of the queries executed on your database make use of the Value Entry table, so this table is a good candidate to move to the new hard disk.

Challenge Yourself!

Create a new filegroup named SECONDARY for the Demo Database NAV (6-0) database. Next, add a new database file to the Demo Database NAV (6-0) database and give it the name SECONDARYFILE. Use the newly installed hard disk as the storage location for the file. Make sure that the new database file is assigned to the new filegroup. Finally, assign the Value Entry table to the new filegroup. (A T-SQL sketch for creating the filegroup and the database file is included at the end of this lab.)

Need a Little Help?

Perform the following to move an existing index to a different filegroup or partition scheme.

1. In Object Explorer, connect to an instance of the SQL Server Database Engine and then expand that instance.
2. Expand Databases, expand the Demo Database NAV (6-0) database that contains the table with the specific index, and then expand Tables.
3. Expand the table in which the index belongs and then expand Indexes.
4. Right-click the index to be moved and then select Properties.

5. On the Index Properties dialog box, select the Storage page.
6. Select the filegroup in which to move the index.

You cannot move indexes created using a unique or primary key constraint by using the Index Properties dialog box. To move these indexes, you need to drop the constraint using ALTER TABLE (Transact-SQL) with the DROP CONSTRAINT option and then re-create the constraint on the desired filegroup using ALTER TABLE (Transact-SQL) with the ADD CONSTRAINT option.

If the table or index is partitioned, select the partition scheme in which to move the index. If you are moving a clustered index, you can use online processing. Online processing allows concurrent user access to the underlying data and to nonclustered indexes during the index operation.

Step by Step

Perform the following steps to move a Microsoft Dynamics NAV table from one filegroup to another:

1. Open SSMS.
2. Click the New Query button to open a new query window.
3. In the New Query window, enter the following SQL statement:

   USE [Demo Database NAV (6-0)]
   GO
   IF EXISTS (SELECT * FROM sys.indexes
              WHERE object_id = OBJECT_ID(N'[dbo].[CRONUS International Ltd_$Value Entry]')
              AND name = N'CRONUS International Ltd_$Value Entry$0')
       ALTER TABLE [dbo].[CRONUS International Ltd_$Value Entry]
       DROP CONSTRAINT [CRONUS International Ltd_$Value Entry$0]
   GO
   ALTER TABLE [dbo].[CRONUS International Ltd_$Value Entry]
   ADD CONSTRAINT [CRONUS International Ltd_$Value Entry$0]
   PRIMARY KEY CLUSTERED ([Entry No_] ASC)
   WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF,
         IGNORE_DUP_KEY = OFF, ONLINE = ON, ALLOW_ROW_LOCKS = ON,
         ALLOW_PAGE_LOCKS = ON)
   ON [SECONDARY]
   GO

4. Run the query.

Chapter 2: Setup and Installation 5. Open the Object Explorer in SSMS. 6. Expand the Databases tree. 7. Select Demo Database NAV (6-0) in the list of databases. 8. In the selected database, open the Tables tree. 9. Select the Cronus International Ltd_$Value Entry table in the list 10. Right-click the table and select its properties 11. On the General tab, check the value for the Filegroup property. It is now set to SECONDARY. 2-67
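As referenced in the Challenge Yourself! section, the SECONDARY filegroup and the SECONDARYFILE database file must exist before the index can be moved. The following T-SQL is a sketch only; the drive letter, path, and size values are assumptions that must be adjusted to match the newly installed disk:

   ALTER DATABASE [Demo Database NAV (6-0)] ADD FILEGROUP [SECONDARY];
   GO
   ALTER DATABASE [Demo Database NAV (6-0)]
   ADD FILE
   (
       NAME = N'SECONDARYFILE',
       FILENAME = N'E:\NAVData\SECONDARYFILE.ndf',
       SIZE = 500MB,
       FILEGROWTH = 100MB
   )
   TO FILEGROUP [SECONDARY];
   GO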

SQLServer Installation and Optimization for Microsoft Dynamics NAV 2009 Lab 2.5 - Create Users and Synchronize Security This lab demonstrates how to create a new user in Microsoft Dynamics NAV, and the difference between database and windows user creation. Scenario As the database administrator for the Demo Database NAV (6-0) database you are required to manage the creation and deletion of users. You have been asked to create two new users, Michael and David. Michael will use the RoleTailored client. David will use the Classic client. David requested that his login should be available on different workstations, independent of the current Windows user. Challenge Yourself! Create a new user for Michael and for David. Make sure they are able to login to the Demo Database NAV (6-0) database. Use the RoleTailored and the Classic client to test the users for Michael and David. Once the users have been created, make sure you log off from the database, close it, and reopen it to test the users. Need a Little Help? Michael will work with the RoleTailored client, which requires a Windows login to logon to the database. You need to create a Windows user in the Microsoft Dynamics NAV database and synchronize it with SQL Server. David will work with the Classic client. Because he needs to login to Microsoft Dynamics NAV on different workstations, regardless of the current Windows user, you create a database user for David. Before you create a database user for David in Microsoft Dynamics NAV, you first need to create a database login in SQL Server. After the database login has been created in SQL Server, you must create the database user in Microsoft Dynamics NAV and synchronize the security system. Step by Step Create a Windows user for Michael 1. Open Microsoft Dynamics NAV Classic client. 2. From the Tools menu, click Security, and then Windows Logins. 3. On an empty line, enter CONTOSO\Michael and press Enter to insert the login. 4. Select the newly added user and click Roles. The Roles window will appear. 2-68

Chapter 2: Setup and Installation 5. In the Role ID column, click the button to get an overview of the available roles. 6. Select the SUPER role and click OK. 7. Close the Roles window. 8. From the Tools menu, click Security, and then click Synchronize All Logins. Create a SQL Server login for David (Microsoft SQL Server): 1. In SQL Server Management Studio, open Object Explorer and expand the folder of the server instance in which to create the new login. 2. Right-click the Security folder, point to New, and then click Login. 3. On the General page, in the Login Name field, enter DAVID. 4. Select SQL Server Authentication. 5. Enter the following strong password for the login: pass@word1 6. Click OK. Add a Database User for David (Microsoft Dynamics NAV) 1. Open Microsoft Dynamics NAV Classic with SQL Server. 2. From the Tools menu, click Security, and then Database Logins. 3. On an empty line, in the User ID field, enter DAVID. 4. Press Enter to add the user. 5. Select the user DAVID and click the Roles button. 6. In the Role ID column, click the button for an overview of the available roles. 7. Select the SUPER role and click OK. 8. Close the Roles window. 9. From the Tools menu, click Security, and then click Synchronize All Logins. 2-69
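The SQL Server login for David can also be created with T-SQL instead of the SSMS dialog. A minimal sketch:

   -- Creates a SQL Server (database server) authentication login
   CREATE LOGIN [DAVID] WITH PASSWORD = 'pass@word1';
   GO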

SQLServer Installation and Optimization for Microsoft Dynamics NAV 2009 Quick Interaction: Lessons Learned Take a moment and write down three Key Points you have learned from this chapter 1. 2. 3. 2-70

Solutions

Test Your Knowledge

1. Which version of RAID is not recommended for use in ERP applications?
(•) RAID 0
( ) RAID 1
( ) RAID 5
( ) RAID 10

2. When selecting a recovery model for a database, which option allows for a restore operation to a point in time?
(•) Full
( ) Simple
( ) Bulk-Logged

3. What is the advantage of implementing filegroups? (Select all that apply)
( ) Filegroups allow your files to be grouped inside SQL Server. That way it is easier to find them again when they are lost.
(•) Using files and filegroups improves database performance, because it allows creating a database across multiple disks.
(•) When multiple filegroups are used, the files in a database can be backed up and restored individually.
(•) Filegroups enable data placement, because a table can be created in a specific filegroup. That way all I/O for a specific table can be directed at a specific disk.

4. Which two security models are available for a Microsoft Dynamics NAV database? (Select all that apply)
(•) Standard
( ) Simple
(•) Enhanced
( ) Advanced

SQLServer Installation and Optimization for Microsoft Dynamics NAV 2009 2-72

Chapter 3: Advantages of SQL Server Option CHAPTER 3: ADVANTAGES OF SQL SERVER OPTION Objectives Introduction The objectives are: Define an adequate backup strategy. Access a Microsoft Dynamics NAV database from third-party tools. Describe the available Performance Monitoring tools. Evaluate the scalability requirements of a Microsoft Dynamics NAV implementation and anticipate future growth. As stated earlier, Microsoft Dynamics NAV can run on two servers - Classic Database Server and Microsoft SQL Server. To the client these two server options look and perform the same. However, there are some important differences between them, including: The way you create a database. The backup facilities that are available. The ability to access the data in the database by using third-party tools. Performance monitoring. Scalability. Multi-processor support. Multiple processor support and database creation have already been discussed in the chapter Setup and Installation. This chapter describes the most important differences between both server options. 3-1

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Backup Facilities An enterprise business solution must be able to manage a substantial amount of input and output activity every day. This increases the need to guard against information loss if there is database or hardware failure. Therefore, make sure that you implement a suitable backup procedure and that the system is set up so that the possibilities for data loss are minimized. If the system fails, you must be able to recover all of your data, including the data that has been modified since you made your last backup. The SQL Server Option allows you to use two types of backup: Microsoft SQL Server backup and Microsoft Dynamics NAV backup. Microsoft SQL Server Backup When using the Microsoft Dynamics NAV SQL Server Option, we recommend that you use the backup facilities provided by SQL Server for your daily needs. Microsoft SQL Server allows server-side database backups. The scope of a backup of data can be a whole database, a partial database, or a set of files or filegroups. For each of these, SQL Server supports full and differential backups. SQL Server uses a roll forward capability to recover all the committed transactions that were carried out up to the point of failure. Roll forward is achieved by restoring your last database backup and applying all subsequent transaction log backups to re-create these transactions. In these cases, only uncommitted work (incomplete transactions) will be lost, provided the active transaction log is also backed up and applied. The active transaction log also contains details of all uncommitted transactions. When you apply the active transaction log backup, SQL Server will roll back the uncommitted transactions. Losing the active transaction log will prevent the system from successfully applying all the transaction log backups. One way of protecting both the transaction log files and the data files against hardware failure is to place them on mirrored disks. When you place the primary data file and the transaction log files on different physical disks than the data files containing the user objects, you ensure that any media failure on the disks containing the user database files affects only those files. You can further protect the files from isolated media failure by placing the primary data file and the transaction log files on mirrored disks. For more information about transaction logs, see Transaction Log Physical Architecture (http://msdn.microsoft.com/en-us/library/ms179355.aspx). In order to apply transaction log backups, you must choose the correct options when you create your databases and implement suitable backup procedures. These options will be explained in the next section. 3-2
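Backing up the active portion of the transaction log (the tail of the log) after a failure is done with the NO_TRUNCATE option of the BACKUP LOG statement. The following is a sketch only; the backup path is an assumption:

   -- Capture the tail of the log from a damaged database before starting the restore sequence
   BACKUP LOG [Demo Database NAV (6-0)]
   TO DISK = 'F:\BACKUPS\DemoDB60_tail.trn'
   WITH NO_TRUNCATE;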

Chapter 3: Advantages of SQL Server Option Microsoft SQL Server Backups Microsoft SQL Server supports different types of backup. You should choose the type of backup that you will be using carefully in order to ensure that you achieve the level of security you require. The four types of backup are the following: Database backup - this makes a backup of the entire database. Transaction log backup - this makes a backup of the entire transaction log. Differential backup - this makes a backup of all committed entries since the last database backup. File and filegroup backup - this makes a backup of individual files or filegroups within a database. These can be combined to form many types of backup and restore procedures. This allows you to make your backup and restore strategy fit your database needs. The SQL Server backup/restore system is server-based and is therefore considerably faster than the Dynamics NAV backup/restore system, which is client-based. You can restore a SQL Server backup of a Dynamics NAV database directly into SQL Server without using Dynamics NAV. You can also create a database directly in SQL Server without first having to create it in Microsoft Dynamics NAV and then restore a SQL Server backup of a Dynamics NAV database directly into the database on SQL Server. SQL Server allows you to make backups when the system is being used. With SQL Server, you can also automate many of your administrative tasks, including making backups. SQL Server also allows you to establish a database maintenance plan (with the help of a wizard) that includes database optimization, integrity tests and a backup plan. SQL Server Database Tests You should run SQL Server database consistency tests (using the SQL Server DBCC options) before making backups. However, SQL Server allows you to include integrity tests in its backup procedure. Backup Media and Destinations SQL Server can back up to a hard disk or to a tape. Disk files (local or network) are the most common media used for storing backups. When you back up to a tape, the tape drive must be attached locally to SQL Server. 3-3
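As an illustration of the most common destination, a disk file, a full backup of the demo database might look like the following sketch (the path is an assumption):

   BACKUP DATABASE [Demo Database NAV (6-0)]
   TO DISK = 'F:\BACKUPS\DemoDB60_full.bak';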

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Notice that SQL Server can write to multiple backup files at the same time (in parallel). In that case, data is striped on across all files. You can back up to multiple tapes or disk controllers to decrease the total backup time for a database. All devices used in a single backup operation must be of the same media type (disk or tape). You cannot mix disk and tape devices for a single backup. IMPORTANT: You must never store your backups on the same disk drive as your database and/or transaction log files. Backup Types Microsoft SQL Server provides several backup methods to meet the needs of many different business environments and database activities. Full Backups A full backup of a database includes the data files and part of the transaction log. A full backup represents the database at the time that the backup was completed and serves as your baseline if there is a system failure. When you perform a full database backup, SQL Server does the following: Backs up all the data in the database Backs up any changes that occurred while the backup was running Backs up any uncommitted transactions in the transaction log. SQL Server uses the portions of the transaction log that were captured in the backup file to ensure data consistency when the backup is restored. The restored database matches the state of the database when the backup completed, minus any uncommitted transactions. When the database is recovered, uncommitted transactions are rolled back. If your database is a read-only database, full database backups may be sufficient to prevent data loss. Transaction Log Backups Transaction log backups record any database changes. You typically back up transaction logs when you perform full database backups. Note the following about transaction log backups: You should not back up a transaction log unless you have performed a full database backup at least once. You cannot restore transaction logs without a corresponding database backup. You cannot back up transaction logs when using the Simple recovery model. 3-4

Chapter 3: Advantages of SQL Server Option When you backup the transaction log, SQL Server does the following: Backs up the transaction log from the last successfully executed BACKUP LOG statement to the end of the current transaction log Truncates the transaction log up to the beginning of the active portion of the transaction log and discards the information in the inactive portion. The active portion of the transaction log starts at the point of the oldest open transaction and continues to the end of the transaction log. If you do not make any full database backups, the transaction logs will continue to grow until the transaction log has reached its predefined limit or it runs out of disk space. If you do not plan to use the Microsoft SQL Server backup features, use the Simple recovery model. Differential Backups You must perform a differential backup if you want to minimize the time that is required for restoring a frequently modified database. You can perform a differential backup only if you have performed a full database backup. In a differential backup, SQL Server does the following: Backs up the parts of the database that have changed since the last full database backup. Backs up any activity that occurred during the differential backup, as well as any uncommitted transactions in the transaction log. If you have a backup strategy including differential backups, it is extremely important not to make ad hoc database backups (except copy-only backups). Ad hoc database backups will disturb the backup schedule, because they will make the differential backups unusable. File or Filegroup Backups If performing a full database backup on very large databases is not practical, you can perform database file or filegroup backups. When SQL Server backs up files or filegroups, it does the following: Backs up only the database files that you specify in the FILE or FILEGROUP options. Allows you to back up specific database files instead of the entire database. When you perform database file or filegroup backups: You must specify the logical files or filegroups 3-5

You must perform transaction log backups to make restored files consistent with the rest of the database.
You should establish a plan to back up each file on a rotating basis to ensure that all database files or filegroups are backed up regularly.

Copy-Only Backups

SQL Server supports creating copy-only backups. Unlike other backups, a copy-only backup does not affect the overall backup and restore procedures for the database. Copy-only backups can be used to create a copy of the backup to take offsite to a safe location. All recovery models support copy-only data backups. A copy-only backup cannot be used as a base backup and does not affect any existing differential backups.

In Microsoft SQL Server 2005, copy-only backups can be created or restored only by using the BACKUP and RESTORE Transact-SQL statements; SQL Server Management Studio does not support these backups. (In SQL Server 2008, copy-only backups can also be made using SQL Server Management Studio.)

The following example shows how to create a copy-only backup from the Demo Database NAV (6-0) database:

BACKUP DATABASE [Demo Database NAV (6-0)]
TO DISK = 'F:\BACKUPS\DemoDB60.BAK'
WITH COPY_ONLY

Collations in BACKUP and RESTORE Operations

If you restore a database, RESTORE uses the collation of the source database that was recorded in the backup file. The restored database has the same collation as the original database that was backed up. Individual objects within the database that have different collations also retain their original collation. The database can be restored even if the instance on which you run RESTORE has a default collation different from the instance on which BACKUP was run.

If there is already a database with the same name on the target server, the only way to restore from the backup is to specify REPLACE on the RESTORE statement. If you specify REPLACE, the existing database is replaced with the contents of the database in the backup file, and the restored version of the database has the collation recorded in the backup file. If you are restoring log backups, the destination database must have the same collation as the source database.
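To illustrate the REPLACE option discussed above, here is a minimal Transact-SQL sketch (not taken from the original material); the backup file name and path are assumed examples:

-- Overwrite the existing database with the contents of the backup file
RESTORE DATABASE [Demo Database NAV (6-0)]
FROM DISK = 'F:\BACKUPS\DemoDB60.BAK'
WITH REPLACE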

Backup Strategy

Determining a procedure for creating backups is a vital part of maintaining your database. If you make frequent entries in your database, you will need a backup procedure that guarantees the reliability of your data and allows you to fully recover your data after any failure that may occur.

The choice of a backup strategy is often determined by the recoverability requirement. How much data are you willing to lose if there is a system failure? Or, in other words, what is the cost of system downtime? There are many backup strategies that can be used, and each strategy has advantages and disadvantages. If your database must be highly available, you have to use database and transaction log backups, and you must have the necessary hardware resources to support this. If high availability is less important, you can use the Simple recovery model. In that case, it will take more time to recover the database, because point-in-time recovery is not possible.

Point-in-time Recovery

If you want to enable point-in-time recovery, you must use the Full or Bulk-Logged recovery model and schedule both database and transaction log backups.

Full Backup Strategy

A full database backup strategy is a recovery strategy that involves regular full database backups. If the database fails, you can restore the most recent full backup to recover the database to the state it was in when the backup was taken. Your database size and how frequently data is modified together determine the time and resources that are involved in implementing a full database backup strategy. Implement a full backup strategy if:

The database is small. The time that is required to back up a small database is reasonable.
The database has few data modifications or is read-only. Performing a full database backup captures a reasonably complete set of data.
You may be willing to accept a minor loss of data if the database fails between backups and must be restored.

The recommended recovery model for this strategy is Simple.

Database and Transaction Log Backup Strategy

When it is impractical to meet the recoverability requirements by performing only full database backups, you can perform intervening transaction log backups to have a record of all database activities that occurred between full database backups.

With this approach, you can restore a database from the most recent full backup and then apply all of the transaction log backups that were created since the last full database backup.

Differential Backup Strategy

A differential database backup strategy involves performing regular full database backups with intervening differential backups. Optionally, you can also perform transaction log backups between the full and differential backups. To recover the database if there is a disaster, you must restore the most recent full database backup, followed by the most recent differential backup, and then restore each transaction log taken since the differential backup, in order.

Use this strategy to reduce recovery time if the database becomes damaged. For example, instead of applying multiple, large transaction logs, you can use a differential backup to apply the changes that were made to the database since the last full database backup.

File or Filegroup Backup Strategy

A file or filegroup backup strategy involves backing up individual files or filegroups on a regular basis. Usually, this strategy is implemented by alternately backing up each read/write file or filegroup. Additionally, you usually back up the transaction log between file or filegroup backups. However, this strategy is complicated and does not automatically maintain referential integrity.

Use this strategy for a very large database that is partitioned among multiple files. When combined with regular transaction log backups, this technique offers a time-sensitive alternative to full database backups. For example, if you have only one hour to perform a full database backup (which usually takes four hours), you could back up individual files each night and still ensure data consistency.

Mirrored Backups

SQL Server supports mirroring of backup media, increasing the reliability of backups by providing redundancy. Backup mirroring increases reliability by reducing the impact of backup-device malfunctions. These malfunctions are especially serious because backups are the last line of defense against data loss. Mirroring applies to both disk and tape. All backup devices for a single backup or restore operation must be of the same type (either disk or tape). Within these broader classes, you must use similar devices that have the same properties, such as drives with the same model number from the same manufacturer. Insufficiently similar devices generate an error message (3212).

NOTE: Mirrored backups can only be created or restored by using the BACKUP or RESTORE Transact-SQL statements. SQL Server Management Studio does not support these backups.

The following example shows how to make a mirrored backup (the FORMAT option is required when you create a new mirrored media set):

BACKUP DATABASE [Demo Database NAV (6-0)]
TO DISK = 'F:\Backups\DemoDB60.bak'
MIRROR TO DISK = 'M:\MirrorBackups\DemoDB60.bak'
WITH FORMAT

Database Access Using Third-Party Tools

It is much easier to access data in the database with third-party tools when you are running on the SQL Server Option for Microsoft Dynamics NAV.

Today, companies building client/server and Web-based database solutions seek maximum business advantage from the data and information distributed throughout their organizations. Microsoft Universal Data Access (UDA) is a platform, application, and tools initiative that defines and delivers both standards and technologies, and it is a key element in the Microsoft foundation for application development, the Microsoft Windows Distributed interNet Applications (DNA) architecture. Universal Data Access provides high-performance access to many different data and information sources on multiple platforms and an easy-to-use programming interface that works with almost any tool or language, taking advantage of the technical skills developers already have.

The technologies that support Universal Data Access enable organizations to create easy-to-maintain solutions and use their choice of best-of-breed tools, applications, and data sources on the client, middle tier, or server. Another benefit of Universal Data Access is that it does not require expensive and time-consuming movement of all corporate data into a single data store, nor does it require commitment to a single vendor's products. Universal Data Access is based on open industry specifications with broad industry support and works with all major established database products.

Universal Data Access is an evolutionary step from standard interfaces such as Open Database Connectivity (ODBC), Remote Data Objects (RDO), and Data Access Objects (DAO), and it significantly extends the functionality of these well-known and well-tested technologies. The Universal Data Access platform includes several technologies such as Microsoft Data Access Components and SQL Native Client. For more information about Data Access Technologies, see http://msdn.microsoft.com/en-us/library/ms810810.aspx.

SQL ODBC

SQL Server supports ODBC, via the SQL Server Native Client ODBC driver, as one of the native APIs for writing C, C++, and Microsoft Visual Basic applications that communicate with SQL Server. SQL Server Native Client contains the SQL OLE DB provider and the SQL ODBC driver in one native dynamic link library (DLL), supporting applications that use APIs such as ODBC, OLE DB, and ADO to gain access to Microsoft SQL Server.

In addition, the Windows Data Access Components (Windows DAC), which have become part of the Windows Server 2008 and Windows Vista operating systems, also provide technologies and components that can be used to build applications that access SQL Server. By providing these technologies, Microsoft makes SQL Server accessible to a large number of external applications. For more information, see the Data Platform Development Center (http://msdn.microsoft.com/en-us/data/default.aspx).

Many standard applications or application components provide SQL Server integration functionality by default. Typical examples are Microsoft Office Excel, Microsoft Office Word, Microsoft Office Access, Microsoft Office InfoPath, Microsoft Visual Studio, Microsoft SQL Server Integration Services, and so on.

Microsoft Dynamics NAV uses the DAC ODBC driver installed with Windows. The SQL Server ODBC driver complies with the Microsoft Win32 ODBC 3.51 specification.

Performance Monitoring

Another area in which the two server options differ is how you monitor performance. The Client Monitor can be used with both server options. When you use the Client Monitor with the SQL Server Option, it contains some additional options and fields that give you more insight into how your application is performing. The Client Monitor is an important tool for troubleshooting performance and locking problems. You can also use it to identify the worst server calls and to identify index and filter problems in the SQL Server Option. The Client Monitor and the Code Coverage tool now work closely together, allowing you to easily identify, for example, the code that generated a particular server call.

Microsoft Dynamics NAV also contains a debugger that you can use to refine functions that you write in C/AL code. The debugger can also be used with both server options.

When you are using the SQL Server Option, you can supplement these tools with the SQL Server Error Log. By enabling trace flags 1204 and 3605, you generate additional diagnostic error messages in the error log. These give you information about the type of locks that are involved in a deadlock.

For a detailed description of how to use these tools and of performance troubleshooting in general, see the Performance Troubleshooting Guide that is available on the Microsoft Dynamics NAV Tools CD. The Tools CD also contains some additional tools that you can use for troubleshooting.
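As an illustration (not part of the original text), trace flags 1204 and 3605 can be enabled from a query window with DBCC TRACEON; the -1 argument applies the flags globally to all connections:

-- Enable deadlock reporting to the SQL Server error log for all sessions
DBCC TRACEON (1204, 3605, -1)

-- Turn the trace flags off again when you are finished troubleshooting
DBCC TRACEOFF (1204, 3605, -1)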

Client Monitor

The Client Monitor is an important tool for troubleshooting performance and locking problems. The tool monitors the communication between an individual client computer and the database, whether that is the Classic Database Server, a locally stored database, or the SQL Server Option. The log shows the calls that are made to the database and provides an overview of the sequence in which the calls are made and the database objects that are called.

FIGURE 3.1 THE CLIENT MONITOR WINDOW

You can also use it to identify the worst server calls and to identify index and filter problems in the SQL Server Option.

Session Monitor

Session Monitor is a powerful tool for diagnosing performance problems. It assists analysis efforts by providing information on the computer's performance during a specific period of time.

FIGURE 3.2 THE SESSION MONITOR

The Session Monitor (Microsoft Dynamics NAV Database Server) helps you identify the clients that cause performance problems in a Microsoft Dynamics NAV environment.
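On the SQL Server Option, comparable per-client session information can also be retrieved directly from SQL Server. The following query is an illustrative sketch only (it is not part of the original material) and uses the sys.dm_exec_sessions view that is introduced later in this chapter:

-- List user sessions with the client program, host, and basic resource usage
SELECT session_id, login_name, host_name, program_name,
       cpu_time, memory_usage, reads, writes
FROM sys.dm_exec_sessions
WHERE is_user_process = 1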

Chapter 3: Advantages of SQL Server Option Performance Monitor (Perfmon.exe) Perfmon.exe allows you to open a Performance console configured with the System Monitor ActiveX control and Performance Logs and Alerts Service. FIGURE 3.3 SYSTEM MONITOR AND PERFORMANCE LOGS AND ALERTS System Monitor With System Monitor, you can measure the performance of your own computer or other computers on a network in the following ways: Collect and view the real-time performance data of a local computer or several remote computers: Usability has been modified in the Windows Server 2003 family. For example, you can delete multiple counters at the same time and display the data properties page for a counter directly from the list window. You can save selected data from a performance log file or an SQL database to a new file for analysis for later use. Also new in the Windows Server 2003 family are two new security groups that help you ensure that only trusted users can access and manipulate sensitive performance data. These are the Performance Log Users group and the Performance Monitor Users group. View data collected either currently or previously in a counter log. With the Windows Server 2003 family, you can now concurrently view data from multiple log files. Present data in a printable graph, histogram, or report view. 3-13

Incorporate System Monitor functionality into applications that support ActiveX controls, for example, Web pages, Microsoft Word, or other applications in Microsoft Office.
Create HTML pages from performance views. Views stored in HTML format can be displayed by a browser.
Create reusable monitoring configurations that can be installed on other computers using Microsoft Management Console (MMC).

With System Monitor, you can collect and view extensive data about the usage of hardware resources and the activity of system services on computers you administer. You can define the data you want System Monitor to collect in the following ways:

Type of data. To select the data to be collected, you specify performance objects, performance counters, and performance object instances. Some objects provide data on system resources (such as memory); others provide data on the operation of applications (for example, system services).
Source of data. System Monitor can collect data from your local computer or from other computers on the network for which you have administrative credentials. By default, administrative credentials are required. In addition, you can include real-time data or data collected previously using counter logs. With the Windows Server 2003 family, you can now view performance data that was previously collected and stored in an SQL database by the Performance Logs and Alerts service.
Sampling parameters. System Monitor supports manual, on-demand sampling or automatic sampling based on a time interval you specify; this functionality applies to real-time data only. When viewing logged data, you can also choose starting and stopping times so that you can view data spanning a specific time range.

In addition to options for defining data content, you have considerable flexibility in designing the appearance of your System Monitor views:

Type of display. System Monitor supports graph, histogram, and report views. The graph view is the default view; it offers the widest variety of optional settings.
Display characteristics. For any of the three views, you can define the colors and fonts for the display. In graph and histogram views, you can select from many options when you view performance data:

o Provide a title for your graph or histogram and label the vertical axis.

o Set the range of values depicted in your graph or histogram.
o Adjust the characteristics of lines or bars plotted to indicate counter values, by using color, width, style, and other graphical features.

For more information about the performance-monitoring process and interface, see Performance console overview (http://technet.microsoft.com/en-us/library/cc785096(ws.10).aspx) and System Monitor interface (http://technet.microsoft.com/en-us/library/cc776933(ws.10).aspx).

Performance Logs and Alerts

With Performance Logs and Alerts you can collect performance data automatically from local or remote computers. You can view logged counter data using System Monitor or export the data to spreadsheet programs or databases for analysis and report generation. Performance Logs and Alerts offers the following capabilities:

Performance Logs and Alerts collects data in a comma-separated or tab-separated format for easy import into spreadsheet programs. A binary log-file format is also provided for circular logging or for logging instances such as threads or processes that may begin after the log starts to collect data. (Circular logging is the process of continuously logging data to a single file, overwriting previous data with new data.)
You can also collect data in an SQL database format. This option defines the name of an existing SQL database and log set within the database where the performance data will be read or written. This format is useful when collecting and analyzing performance data at an enterprise level rather than on a per-server basis.
Counter data collected by Performance Logs and Alerts can be viewed during collection and after collection has stopped.
Because logging runs as a service, data collection occurs regardless of whether any user is logged on to the computer being monitored.
You can define start and stop times, file names, file sizes, and other parameters for automatic log generation.
You can manage multiple logging sessions from a single console window.
You can set an alert on a counter, thereby defining that a message be sent, a program be run, an entry made to the application event log, or a log be started when the selected counter's value exceeds or falls below a specified setting.
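NOTE: SQL Server also exposes its own performance counters inside the database engine through the sys.dm_os_performance_counters dynamic management view (introduced in the next topic). The following query is an illustrative sketch only; the counter name shown is an example:

-- View the Page life expectancy counter of the buffer manager from inside the engine
SELECT object_name, counter_name, instance_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name LIKE 'Page life expectancy%'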

Similar to System Monitor, Performance Logs and Alerts supports defining performance objects, performance counters, and object instances, and setting sampling intervals for monitoring data about hardware resources and system services. Performance Logs and Alerts also offers other options related to recording performance data:

Start and stop logging either manually on demand or automatically based on a user-defined schedule.
Configure additional settings for automatic logging, such as automatic file renaming, and set parameters for stopping and starting a log based on the elapsed time or the file size.
Create trace logs. Using the default system data provider or another application provider, trace logs record detailed system application events when certain activities, such as a disk I/O operation or a page fault, occur. When the event occurs, the data is logged to a file specified by the Performance Logs and Alerts service. This differs from the operation of counter logs; when counter logs are being used, the service obtains data from the system when the update interval has elapsed, rather than waiting for a specific event. A parsing tool is required to interpret the trace log output. Developers can create such a tool using application programming interfaces (APIs) provided by the Performance Logs and Alerts interface.

Dynamic Management Views and Functions

Introduced in Microsoft SQL Server 2005, dynamic management views (DMV) and functions (DMF) return server state information that can be used to monitor the health of a server instance, diagnose problems, and tune performance. There are two types of dynamic management views and functions:

Server-scoped dynamic management views and functions.
Database-scoped dynamic management views and functions.

All dynamic management views and functions exist in the sys schema and follow this naming convention: dm_*. When you use a dynamic management view or function, you must prefix the name of the view or function with the sys schema. Dynamic management views can be referenced in Transact-SQL statements by using two-part, three-part, or four-part names. Dynamic management functions, on the other hand, can be referenced in Transact-SQL statements by using either two-part or three-part names.
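For illustration only (this example is not part of the original text), the following query uses the sys.dm_exec_requests view, referenced with a two-part name, to list the requests that are currently executing on the instance:

-- Show active requests, their status, and the resource they are waiting for
SELECT session_id, status, command, wait_type, wait_time, blocking_session_id
FROM sys.dm_exec_requests
WHERE session_id > 50   -- session IDs above 50 are typically user sessions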

Dynamic management views and functions have been organized into a number of categories:

Change Data Capture Related Dynamic Management Views and Functions
Common Language Runtime Related Dynamic Management Views
Database Mirroring Related Dynamic Management Views
Database Related Dynamic Management Views
Execution Related Dynamic Management Views and Functions
Full-Text Search Related Dynamic Management Views
Index Related Dynamic Management Views and Functions
I/O Related Dynamic Management Views and Functions
Object Related Dynamic Management Views and Functions
Query Notifications Related Dynamic Management Views
Replication Related Dynamic Management Views
Resource Governor Dynamic Management Views
Service Broker Related Dynamic Management Views
SQL Server Extended Event Related Dynamic Management Views
SQL Operating System Related Dynamic Management Views
Transaction Related Dynamic Management Views and Functions
Security Related Dynamic Management Views

For Microsoft Dynamics NAV 2009, the most important views are the I/O Related, Index Related, Execution Related, and Database Related DMVs. In the Performance Audits chapter we will cover some of these DMVs in more detail.

For more information about DMVs and DMFs in SQL Server, see Dynamic Management Views and Functions (http://msdn.microsoft.com/en-us/library/ms188754.aspx).

SQL Server Performance Dashboard Reports

The Microsoft SQL Server 2005 Performance Dashboard Reports are used to monitor and resolve performance problems on your SQL Server 2005 database server. The SQL Server instance being monitored and the Management Studio client used to run the reports must both be running SP2 or a later version.

The SQL Server 2005 Performance Dashboard Reports are Reporting Services report files designed to be used with the Custom Reports feature that was introduced in the SP2 release of SQL Server Management Studio. The reports allow a database administrator to quickly identify whether there is a current bottleneck on the system and, if a bottleneck is present, to capture the additional diagnostic data that may be required to resolve the problem. For example, if the system is experiencing waits for disk I/O, the dashboard allows the user to quickly see which sessions are performing the most I/O, what query is running on each session, and the query plan for each statement.

FIGURE 3.4 THE PERFORMANCE DASHBOARD REPORTS

Common performance problems that the dashboard reports may help resolve include:

CPU bottlenecks (and what queries are consuming the most CPU).
I/O bottlenecks (and what queries are performing the most I/O).
Index recommendations generated by the query optimizer (missing indexes).
Blocking.
Latch contention.
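The missing-index recommendations surfaced by the dashboard come from the missing-index DMVs. As an illustration only (not part of the original text), a minimal query against sys.dm_db_missing_index_details:

-- List tables and columns for which the optimizer has recorded missing indexes
SELECT database_id, [statement] AS full_table_name,
       equality_columns, inequality_columns, included_columns
FROM sys.dm_db_missing_index_details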

The information captured in the reports is retrieved from SQL Server's dynamic management views. No additional tracing or data capture is required, which means the information is always available, and this is a very inexpensive means of monitoring the server. Reporting Services does not have to be installed to use the Performance Dashboard Reports.

The Performance Dashboard Reports can be downloaded from http://www.microsoft.com/downloads/details.aspx?familyid=1d3a4a0d-7e0c-4730-8204-e419218c1efc&displaylang=en.

SQL Server Activity Monitor

Use Activity Monitor to obtain information about SQL Server processes and how these processes affect the current instance of SQL Server.

FIGURE 3.5 THE SQL SERVER ACTIVITY MONITOR

Activity Monitor is a tabbed document window that has the following expandable and collapsible panes: Overview, Active User Tasks, Resource Waits, Data File I/O, and Recent Expensive Queries. When any pane is expanded, Activity Monitor queries the instance for information. When a pane is collapsed, all querying activity stops for that pane. You can also expand one or more panes at the same time to view different kinds of activity on the instance.

For the columns that are included in the Active User Tasks, Resource Waits, Data File I/O, and Recent Expensive Queries panes, you can customize the display in the following ways:

To rearrange the order of the columns, click the column heading and drag it to another location in the heading ribbon.
To sort a column, click the column name.
To filter on one or more columns, click the drop-down arrow in the column heading, and then select a value.

To view the Activity Monitor in SQL Server 2005 and SQL Server 2008, a user must have VIEW SERVER STATE permission. To open the Activity Monitor:

1. On the SQL Server Management Studio standard toolbar, click Activity Monitor.
2. In the Connect to Server dialog box, select the server name and authentication mode, and then click Connect.

You can also open Activity Monitor at any time by pressing CTRL+ALT+A. To open Activity Monitor in Object Explorer, right-click the instance name, and then select Activity Monitor.

Database Engine Tuning Advisor (DTA)

Database Engine Tuning Advisor is a tool that analyzes the performance effects of workloads run against one or more databases. A workload is a set of Transact-SQL statements that executes against databases you want to tune. After analyzing the effects of a workload on your databases, Database Engine Tuning Advisor provides recommendations to add, remove, or modify physical design structures in Microsoft SQL Server databases. These physical performance structures include clustered indexes, nonclustered indexes, indexed views, and partitioning. When implemented, Database Engine Tuning Advisor recommendations enable the query processor to perform workload tasks in the shortest period of time. Tuning your databases with Database Engine Tuning Advisor requires no expertise in database structure, workloads, or the internal workings of SQL Server.

Database Engine Tuning Advisor provides two interfaces:

A stand-alone graphical user interface tool for tuning databases, and for viewing tuning recommendations and reports.
A command-line utility program, dta.exe, for using Database Engine Tuning Advisor functionality in software programs and scripts.

In previous releases of SQL Server, some Database Engine Tuning Advisor functionality was provided by the Index Tuning Wizard. Database Engine Tuning Advisor evaluates more types of events and structures, and provides better quality recommendations. For more information, see Tuning the Physical Database Design (http://msdn.microsoft.com/en-us/library/ms191531.aspx).

SQL Server Profiler

SQL Server Profiler is a tool that captures SQL Server events from a server. The events are saved in a trace file that can later be analyzed or used to replay a specific series of steps when trying to diagnose a problem. SQL Server Profiler is used for activities such as:

Stepping through problem queries to find the cause of the problem.
Finding and diagnosing slow-running queries.
Capturing the series of Transact-SQL statements that lead to a problem. The saved trace can then be used to replicate the problem on a test server where the problem can be diagnosed.
Monitoring the performance of SQL Server to tune workloads. For information about how to tune the physical database design for database workloads, see Database Engine Tuning Advisor Overview (http://msdn.microsoft.com/en-us/library/ms173494.aspx).
Correlating performance counters to diagnose problems.

SQL Server Profiler also supports auditing the actions performed on instances of SQL Server. Audits record security-related actions for later review by a security administrator.

Performance Data Collector (SQL 2008)

SQL Server 2008 introduced a new performance monitoring tool called the data collector. The data collector stores data in the management data warehouse (MDW). There are four collector types: T-SQL Query, SQL Trace, Performance Counters, and Query Activity. Out of the box, SQL Server 2008 provides the following system data collection definitions:

Disk Usage. Collects local disk usage information for all the databases of the SQL Server instance. This information can help you determine space usage and space requirements for disk capacity planning.
Server Activity. Collects SQL Server instance-level resource usage information such as CPU, memory, and I/O. This information can help you monitor short-term to long-term resource usage trends and identify potential resource bottlenecks on the system. It can also be used for resource capacity planning.

Query Statistics. Collects individual statement-level query statistics, including query text and query plans. This information can help you identify the top resource-consuming queries for performance tuning.

The data collector is implemented as SQL Server Integration Services (SSIS) packages. These packages can be configured to run manually, continuously, or scheduled as SQL Server Agent jobs to periodically collect and upload data to a central database referred to as the management data warehouse (MDW). The MDW is simply a database that serves the purpose of storing the collected data for viewing and reporting. The following figure shows the architecture of the feature.

FIGURE 3.6 DATA COLLECTION ARCHITECTURE

A single MDW database can serve as the central repository for data collectors running on multiple target SQL Server instances. The data collectors are configured on each target server, and they collect and upload data to the MDW database, which can be on a remote server. Between the time that the data is captured and the time it is uploaded, the data collector can write temporary data into cache files on the target server.

The collection sets are usually run as SQL Server Agent jobs, so metadata about collection frequency, what items to collect, and so on, is stored in the msdb database. The system collection sets have predefined reports, which are accessed through SQL Server Management Studio and used to visualize the captured data.

You can change the schedule of collection sets if they are configured as scheduled jobs. You can also specify how long to keep the collected data and where to store cached data prior to upload. For example, by default, the Server Activity collection set collects data every minute and uploads data every 15 minutes. Depending on the time frame of your target window, you might choose to decrease the collection frequency to every 5 minutes, which will capture only one-fifth of the data (mostly performance counters).
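As an illustration only (not part of the original text), the collection sets defined on an instance can be listed from the data collector catalog views in msdb; the view and column names shown here are believed to match SQL Server 2008, but verify them against Books Online for your build:

-- List the data collection sets that are defined in msdb
SELECT collection_set_id, name, description
FROM msdb.dbo.syscollector_collection_sets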

The definition of the system collection sets cannot be modified. However, you can define your own collection sets, store that information in the MDW (tables will be created in the custom_snapshots schema), and define your own custom reports for this data.

Scalability

Another one of the main differences between the two server options is scalability, not only in terms of hardware resources but also in terms of concurrent users. Before starting a new Microsoft Dynamics NAV implementation, we strongly recommend that you consider the scalability requirements and select adequate hardware resources and software platforms.

Microsoft SQL Server provides better scalability than the Classic database server. While the Classic database server supports only one processor and up to 1 GB of memory, SQL Server can be configured to use multiple processors and more memory. In addition, the maximum database size for the Classic database server is 132 GB, whereas SQL Server (except the Express and Compact editions) can scale to much larger databases. For detailed feature specifications, see http://www.microsoft.com/sqlserver/2008/en/us/editions-compare.aspx.

The SQL Server Option also supports more concurrent users than the Classic Database Server. In August 2008, Microsoft ran tests to compare the performance of Microsoft Dynamics NAV 5.0 SP1 on SQL Server 2005 SP2 with that of Microsoft Dynamics NAV 5.0 SP1 on a classic database. The test results clearly show that Microsoft Dynamics NAV 5.0 SP1 performs better with Microsoft SQL Server than with a classic database, and in particular that SQL Server outperforms the classic database at greater levels of concurrent use. This means that Microsoft Dynamics NAV customers can fully enjoy the benefits of SQL Server 2005 while improving the performance and scalability of Microsoft Dynamics NAV. The combination of Microsoft Dynamics NAV 5.0 SP1 and SQL Server 2005 SP2 allows users to process documents faster and more efficiently, which means that customers can push more business through the system.

Summary

In this chapter you have seen the differences between the two database platforms. The different types of backups in Microsoft SQL Server were explained, as were the tools you can use to monitor performance for a Microsoft Dynamics NAV 2009 database on SQL Server. In the next chapter, Performance Audits, you will find more detailed information, scripts, procedures, and examples on how to conduct a performance review by using the tools introduced in this chapter.

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Test Your Knowledge Test your knowledge with the following questions. 1. Which recovery model is required to enable point-in-time recovery? (Select all that apply) ( ) Simple ( ) Bulk-Logged ( ) Full ( ) All of the above 2. Which backup strategy allows point-in-time recovery? (Select all that apply) ( ) Full Backup Strategy ( ) Database and Transaction Log Backup Strategy ( ) File or Filegroup Backup Strategy ( ) Differential Backup Strategy, combined with Transaction Log Backups 3. What is true about SQL Server Native Client ODBC driver? ( ) SQL Server Native Client ODBC driver is installed automatically with the client. ( ) SQL Server Native Client ODBC driver can write to the database. ( ) SQL Server Native Client ODBC driver ensures data consistency and validation by firing C/AL code triggers. ( ) SQL Server Native Client ODBC driver can be used to access Classic database and SQL Server database. 4. Which tools can be used to monitor performance on a classic database? (Select all that apply) ( ) Client Monitor ( ) Session Monitor ( ) Performance Data Collector ( ) Perfmon 3-24

Chapter 3: Advantages of SQL Server Option 5. What are Dynamic Management Views and Functions used for and how can you use them? 6. One of your customers wants to upgrade to Microsoft Dynamics NAV 2009. You have calculated the initial size of the database to be 110 GB, with an expected annual growth of 20%. Which server option will you recommend and why? 3-25

Lab 3.1 - Create a Backup and Restore to a Point in Time

In this lab you will use the point-in-time restore option of Microsoft SQL Server.

Scenario

To update some fields in the Customer table, Mort runs a processing-only report that is supposed to execute a DELETEALL operation on a temporary instance of the Customer table. After running the report, Mort notices that his report contained an error: he forgot to use a temporary instance, and as a result, all customers in the production environment have disappeared.

In order to resolve this issue, Tim, the IT manager, starts by creating a backup of the production environment. Next, he restores the backup to a point in time shortly before the DELETEALL operation. In the next lab, Tim will copy the customers from the test database to the live database.

Challenge Yourself!

Make a full backup and a transaction log backup of the Demo Database NAV (6-0) database. Next, delete all customers from the database by running a small processing-only report, to simulate a disaster scenario. Make a new database and transaction log backup. Restore the latest backup to a new database, Test Database NAV (6-0), and reapply the transaction log to a point in time that precedes the DELETEALL statement. Check the Customer table in the test database.

Need a Little Help?

1. Enable database and transaction log backups for the Demo Database NAV (6-0) database.
2. Create a report to delete all customers.
3. Make a database backup and a transaction log backup of the Demo Database NAV (6-0) database.
4. Restore the database backup to a new test database to a point in time.
5. Check the Customer table.

Step by Step

Create Database Backup for the Demo Database NAV (6-0) database

1. Open SQL Server Management Studio.
2. In the Object Explorer pane, expand the Databases tree.

Chapter 3: Advantages of SQL Server Option 3. Right-click the Demo Database NAV (6-0) database and select Tasks > Back Up. 4. In the Back Up Database window, enter the following settings to create a full database backup: FIGURE 3.7 THE BACKUP DATABASE WINDOW FOR A FULL BACKUP 5. Click OK to start the backup. Create a Transaction Log Backup for the Demo Database NAV (6-0) database 1. Open SQL Server Management Studio. 2. In the Object Explorer pane, expand the Databases tree. 3-27

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 3. Right-click the Demo Database NAV (6-0) database and select Tasks > Back Up. 4. In the Back Up Database window, enter the following settings to create a transaction log backup: FIGURE 3.8 THE BACKUP DATABASE WINDOW FOR A TRANSACTION LOG BACKUP 5. Click OK to start the backup. Delete all Customers 1. Open Microsoft Dynamics NAV 2009 Classic client. 2. In the Tools menu, select Object Designer. 3. Click Report. 4. Click New. 5. In the New Report window, select Create a blank report. 6. Click OK. 7. In the Report Designer window, on the first blank line, press Shift+F4 to open the Properties window. 8. In the Properties window, set the ProcessingOnly property to Yes. 3-28

Chapter 3: Advantages of SQL Server Option 9. In the Report Designer window, on the first blank line, enter Customer in the DataItem column. 10. Select the newly added data item and press F9 to open the C/AL Editor window. 11. Enter the following lines of code in the OnPreDataItem() trigger. FIGURE 3.9 THE CODE TO DELETE ALL CUSTOMERS 12. On the File menu, select Save As... 13. Save the report as 123456701, Delete All Customers. 14. Close the Report Designer window. 15. In the Object Designer window, select report 123456701 and then click Run. 16. In the Report Request form, click the OK button. A dialog box will be displayed. 17. Write down the time displayed in the dialog box and then click Yes. The customers will now be deleted. 18. In the Object Designer, click Table. 19. Select table 18, Customer. 20. Click Run to open the Customer table. As you can see, all customers have been deleted. Make a Transaction Log Backups for the Demo Database NAV (6-0) database 1. Open SQL Server Management Studio. 2. In the Object Explorer pane, expand the Databases tree. 3-29

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 3. Right-click the Demo Database NAV (6-0) database and select Tasks > Back Up. 4. In the Back Up Database window, enter the following settings to create a transaction log backup: FIGURE 3.10 THE BACKUP DATABASE WINDOW FOR A TRANSACTION LOG BACKUP 5. Click OK to start the backup. Restore the Database Backup to a New Test Database To a Point In Time The backup set Lab_3_2.trn contains two transaction log backups: one executed before the deletion, and one made after the deletion. When restoring the test database, you can safely restore the entire first transaction log, as it contains database changes made before the deletion. However, you must make sure not to restore the complete transaction log. To stop the restore at a specific point in time, you can add the STOPAT parameter to the RESTORE command. 1. Open SQL Server Management Studio. 3-30

Chapter 3: Advantages of SQL Server Option 2. In the Object Explorer pane, right-click the Databases node and select Restore database. 3. In the Restore database window, specify the following restore options: FIGURE 3.11 THE RESTORE DATABASE WINDOW 4. In the Restore Database window, click the Script button. An SQL script for the restore operation will now be generated. RESTORE DATABASE [Test Database NAV (6-0)] FROM DISK = N'C:\Lab_3_2.bak' WITH FILE = 1, MOVE N'Demo Database NAV (6-0)_data' TO N'C:\Program Files\Microsoft Dynamics NAV\60\Database\Test Database NAV (6-0).MDF', MOVE N'Demo Database NAV (6-0)_log' TO N'C:\Program Files\Microsoft Dynamics NAV\60\Database\Test Database NAV (6-0)_1.LDF', NORECOVERY, NOUNLOAD, STATS = 10 GO RESTORE LOG [Test Database NAV (6-0)] FROM DISK = N'C:\Lab_3_2.trn' WITH FILE = 1, 3-31

NORECOVERY, NOUNLOAD, STATS = 10
GO
RESTORE LOG [Test Database NAV (6-0)] FROM DISK = N'C:\Lab_3_2.trn' WITH FILE = 2, NOUNLOAD, STATS = 10, RECOVERY, STOPAT=N'YYYY-MM-DDTHH:MM:SS'
GO

At the bottom of the script, you will see the RESTORE command for the second transaction log.

5. To this RESTORE command, add the RECOVERY and STOPAT=N'YYYY-MM-DDTHH:MM:SS' parameters, where YYYY-MM-DD and HH:MM:SS represent the date and time to which you want to restore the transaction log. In this case, replace the time with the start time of the DELETEALL that you wrote down earlier, minus one second. (The script above already shows the command with these parameters added.)
6. Click the Execute button to run the SQL script.

When the database has been restored, right-click the Databases tree in the Object Explorer and then select Refresh. The Test Database NAV (6-0) database will appear in the Databases tree.

Check the Customer table in the Test Database NAV (6-0) database

1. Open SQL Server Management Studio.
2. In the Object Explorer pane, expand the Databases tree.
3. In the Test Database NAV (6-0) database, expand the Tables node.
4. Right-click the CRONUS International Ltd_$Customer table and choose Select Top 1000 Rows.

You will notice that all customers are still in the test database. You can now start transferring the customer data from the Test Database NAV (6-0) to the Demo Database NAV (6-0) database.
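As an optional extra check (not part of the original lab), you can count the rows in the restored table with a simple query; the database and table names are the ones used in this lab:

-- Count the customers in the restored test database
SELECT COUNT(*) AS CustomerCount
FROM [Test Database NAV (6-0)].[dbo].[CRONUS International Ltd_$Customer]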

Lab 3.2a - Transfer Data from Test to Live Database (Transact-SQL)

In this lab, you will use SQL Server features to transfer data between databases.

Scenario

After restoring the database to a point in time, Tim needs to get the lost Customer data back into the live database. Tim does not want to lose time developing C/SIDE objects to export and import the data. Instead, he will use built-in features of SQL Server to transfer the data. In this lab, Tim will use a Transact-SQL statement.

Before copying the data, Tim will make sure he is the only person working in the Demo Database NAV (6-0) database. He instructs the other users to leave the database and opens the database in single-user mode. After copying the data, Tim removes the single-user flag.

Challenge Yourself!

Open the Demo Database NAV (6-0) database in single-user mode. Use a Transact-SQL statement to move the Customer data from the Test Database NAV (6-0) to the Demo Database NAV (6-0) database. Remove the single-user mode flag.

Need a Little Help?

1. Open the Demo Database NAV (6-0) database in single-user mode.
2. Create the Transact-SQL query.
3. Design the Transact-SQL query.
4. Execute the Transact-SQL query.
5. Check the Customer table in Demo Database NAV (6-0).
6. Remove the single-user mode flag.

Step by Step

Open the Demo Database NAV (6-0) database in single-user mode

1. Open the Microsoft Dynamics NAV 2009 Classic client.
2. In the File menu, select Database, Information to open the Database Information window.

3. On the Sessions tab, check that the Current Sessions field displays 1. You can click the lookup button to see a list of all active database sessions. In the list, you can see the Host Name and User ID of all active database sessions. If the Current Sessions field does not display 1, you will be unable to open the database in single-user mode.
4. Close the Database Information window.
5. In the File menu, select Database, Alter to open the Alter Database window.
6. On the Options tab, select the Single-user option.
7. Click OK to close the Alter Database window and activate single-user mode.
8. Exit the Microsoft Dynamics NAV Classic client.

If you look at the list of databases in SQL Server Management Studio, you will see the Demo Database NAV (6-0) listed as Demo Database NAV (6-0) (Single-user).

FIGURE 3.12 DATABASE MARKED AS SINGLE-USER
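NOTE: The lab uses the Classic client to switch the database to single-user mode. As an alternative sketch (not part of the original lab), the same can be done directly from a SQL Server Management Studio query window; ROLLBACK IMMEDIATE disconnects any remaining sessions by rolling back their open transactions:

ALTER DATABASE [Demo Database NAV (6-0)]
SET SINGLE_USER WITH ROLLBACK IMMEDIATE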

Chapter 3: Advantages of SQL Server Option In addition, in the Database Properties window in SQL Server Management Studio, you will see that the Restrict Access property is now set to SINGLE_USER. Be aware that you must exit the Microsoft Dynamics NAV Classic client before you can look at the database properties. Create the Transact-SQL query 1. Open SQL Server Management Studio. 2. Connect to the NAV-SRV-01 Database Engine. 3. Click the New Query button. 4. In the new query window, enter the following SQL statement: SELECT * FROM [Test Database NAV (6-0)].[dbo].[CRONUS International Ltd_$Customer] Design the Transact-SQL query 1. In the New Query window, select the entire SQL statement. 2. Right-click the selected line and select Design Query in Editor. 3. In the Query Designer window, in the top pane, remove the check mark in front of the time stamp column, to exclude the time stamp column. FIGURE 3.13 THE QUERY DESIGNER WINDOW In the bottom pane, the time stamp will be removed from the SELECT clause. 4. Click OK to close the Query Designer window and return to the query window. 3-35

5. In the Query window, select the list of fields between the SELECT and FROM keywords.
6. Press Ctrl+C to copy the list to the Clipboard.
7. Place the cursor before the SELECT keyword.
8. Press Enter to insert a new line.
9. On the new line, enter the following command:

INSERT INTO [Demo Database NAV (6-0)].dbo.[CRONUS International Ltd_$Customer] ()

10. Place the cursor between the parentheses.
11. Press Ctrl+V to insert the field list from the Clipboard.

The query window will now look like this:

FIGURE 3.14 THE QUERY WINDOW WITH THE INSERT INTO... SELECT... STATEMENT

Chapter 3: Advantages of SQL Server Option Execute the Transact-SQL query Click the Execute button to run the query. The data is now transferred from the Test Database NAV (6-0) to the Demo Database NAV (6-0) database. Check the Customer Table in Demo Database NAV (6-0) 1. In the Object Explorer window in SQL Server Management Studio, expand the Databases node. 2. Select and expand Demo Database NAV (6-0). 3. In the Tables node, right-click the CRONUS International Ltd_$Customer table and select Select Top 1000 Rows. A new query window is opened, showing all customers. Remove the Single User flag 1. Open SQL Server Management Studio. 2. Click the New query button to open a new query window. 3. In the Query window, enter the following command: ALTER DATABASE [Demo Database NAV (6-0)] SET MULTI_USER WITH NO_WAIT 4. Click the Execute button. The single-user flag will be removed. If you refresh the list of databases in SQL Server Management Studio, you see that the (Single-user) tag has been removed for the Demo Database NAV (6-0) database. In addition, in the Database Properties window, the Restrict Access property is reset to MULTI_USER. In the Alter Database window in the Classic client, the single-user option is no longer selected. 3-37

Lab 3.2b - Transfer Data from Test to Live Database (SSIS)

In this lab, you will use SQL Server features to transfer data between databases. The method described in this lab can be used as an alternative to the one described in the previous lab.

Scenario

After restoring the database to a point in time, Tim needs to get the lost Customer data back into the live database. Tim does not want to lose time developing C/SIDE objects to export and import the data. Instead, he will use built-in features of SQL Server to transfer the data. In this lab, Tim will use the SQL Server Import and Export Wizard to create a SQL Server Integration Services (SSIS) package.

Before copying the data, Tim will make sure that nobody is working in the Demo Database NAV (6-0) database. He instructs the other users to leave the database and opens the database in restricted user mode. After copying the data, Tim removes the restricted user flag.

Challenge Yourself!

Open the Demo Database NAV (6-0) database in restricted user mode. Use the SQL Server Import and Export Wizard to move the Customer data from the Test Database NAV (6-0) to the Demo Database NAV (6-0) database. Check the data and remove the restricted user mode flag.

Need a Little Help?

1. Open the Demo Database NAV (6-0) database in restricted user mode.
2. Execute the SQL Server Import and Export Wizard.
3. Select the Customer table in Demo Database NAV (6-0).
4. Remove the restricted user mode flag.

Open the Demo Database NAV (6-0) Database in Restricted User Mode

1. Open the Microsoft Dynamics NAV 2009 Classic client.
2. In the File menu, select Database, Information to open the Database Information window.
3. On the Sessions tab, check that the Current Sessions field displays 1.
4. Exit the Microsoft Dynamics NAV Classic client.
5. Open SQL Server Management Studio.
6. Connect to the Database Engine.

7. Click the Activity Monitor button in the menu bar or press Ctrl+Alt+A to open the Activity Monitor window.
8. Double-click the Processes tab to see a list of the active SQL Server processes.
9. In the Application Name column, check for Microsoft Dynamics NAV Classic client processes. The session information that you see here corresponds to the information displayed in the Database Information and Database Sessions windows. If Microsoft Dynamics NAV Classic client processes appear, it means that there are active users connected to the database. In that case, the Login column contains the user who opened the session. Contact the user and ask the user to exit Microsoft Dynamics NAV. If necessary, you can terminate a session by right-clicking the session and then selecting Kill Process. Note that killing a process should only be done in emergency cases. Always verify the current user's activity before you kill the process.
10. When no Microsoft Dynamics NAV Classic client processes are active, close Activity Monitor.
11. Click the New query button to open a new query window.
12. In the query window, enter the following command:

ALTER DATABASE [Demo Database NAV (6-0)]
SET RESTRICTED_USER WITH NO_WAIT

13. Click the Execute button.

Execute the SQL Server Import and Export Wizard

1. In the Object Explorer, select the Demo Database NAV (6-0) database.
2. Right-click the database and select Tasks > Import Data. The SQL Server Import and Export Wizard will appear.
3. On the Welcome page, click Next.
4. On the Choose a Data Source page, select SQL Server Native Client 10.0 as the Data source.
5. In the Server Name field, enter NAV-SRV-01.
6. Select Windows Authentication.
7. In Database, select Test Database NAV (6-0) from the drop-down list.
8. Click the Next button.
9. On the Choose a Destination page, select SQL Server Native Client 10.0 as the Destination.
10. In the Server Name field, enter NAV-SRV-01.
11. Select Windows Authentication.

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 12. In Database, select Demo Database NAV (6-0) from the drop-down list. 13. Click the Next button. 14. On the Specify Table Copy or Query page, select the Write a query to specify the data to transfer option. 15. Click the Next button. 16. On the Provide a Source Query page, enter the following SQL statement: SELECT [No_],[Name],[Search Name],[Name 2],[Address 2], [Address],[City],[Contact],[Phone No_],[Telex No_], [Our Account No_],[Territory Code], [Global Dimension 1 Code],[Global Dimension 2 Code], [Chain Name],[Budgeted Amount],[Credit Limit (LCY)], [Customer Posting Group],[Currency Code], [Customer Price Group],[Language Code], [Statistics Group],[Payment Terms Code], [Fin_ Charge Terms Code],[Salesperson Code], [Shipment Method Code],[Shipping Agent Code], [Place of Export],[Invoice Disc_ Code], [Customer Disc_ Group],[Country_Region Code], [Collection Method],[Amount],[Blocked],[County], [Invoice Copies],[Last Statement No_],[Priority], [Print Statements],[Bill-to Customer No_], [Payment Method Code],[Last Date Modified], [Application Method],[Prices Including VAT], [Location Code],[Fax No_],[Telex Answer Back], [VAT Registration No_],[Combine Shipments], [Gen_ Bus_ Posting Group],[Picture],[Post Code], [E-Mail],[Home Page],[Reminder Terms Code], [No_ Series],[Tax Area Code],[Tax Liable], [VAT Bus_ Posting Group],[Service Zone Code], [Block Payment Tolerance],[IC Partner Code], [Prepayment %],[Primary Contact No_], [Responsibility Center],[Shipping Advice], [Shipping Time],[Shipping Agent Service Code], [Reserve],[Allow Line Disc_],[Base Calendar Code], [Copy Sell-to Addr_ to Qte From] FROM [Test Database NAV (6-0)].[dbo].[CRONUS International Ltd_$Customer] You can use the Parse button to check the syntax of your query. 17. Click the Next button. 18. On the Select Source Table and Views page, in the Destination column, select [dbo].[cronus International Ltd_$Customer]. Optionally, you can click the Edit Mappings button to check the field mappings in the tables. 19. Click the Next button. 3-40

Chapter 3: Advantages of SQL Server Option 20. On the Review Data Type Mapping page, click the Next button. 21. On the Save and Run Package page, you can choose to Run the package immediately and/or Save the SSIS package. Select the Run Immediately option. The Save SSIS package option allows you to save the query (either in SQL Server or on the File system), so you can rerun the package afterwards. (You can run the package manually or schedule the SSIS package through the SQL Server Agent.) 22. Click the Next button. 23. On the Complete the Wizard page, click the Finish button. The package is executed and the Customer data is transferred to the Demo Database NAV (6-0) database. After the package has run, a status window with the execution results is shown. FIGURE 3.15 SQL SERVER IMPORT AND EXPORT WIZARD 3-41

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Check the Customer Table in Demo Database NAV (6-0) 1. Open SQL Server Management Studio. 2. Click the New Query button to open a new query window. 3. In the New Query window, enter the following SQL statement SELECT * FROM [Demo Database NAV (6-0)].[dbo].[CRONUS International Ltd_$Customer] 4. Click the Execute button. Remove the Restricted User Mode Flag 1. Open SQL Server Management Studio. 2. Connect to the NAV-SRV-01 Database Engine. 3. Click the New Query button. 4. In the new query window, enter the following SQL statement: ALTER DATABASE [Demo Database NAV (6-0)] SET MULTI_USER WITH NO_WAIT 5. Click the Execute button. 3-42

Quick Interaction: Lessons Learned

Take a moment and write down three Key Points you have learned from this chapter:

1.
2.
3.

Solutions

Test Your Knowledge

1. Which recovery model is required to enable point-in-time recovery? (Select all that apply)
( ) Simple
( ) Bulk-Logged
( ) Full
( ) All of the above

2. Which backup strategy allows point-in-time recovery? (Select all that apply)
( ) Full Backup Strategy
( ) Database and Transaction Log Backup Strategy
( ) File or Filegroup Backup Strategy
( ) Differential Backup Strategy, combined with Transaction Log Backups

3. What is true about SQL Server Native Client ODBC driver?
( ) SQL Server Native Client ODBC driver is installed automatically with the client.
( ) SQL Server Native Client ODBC driver can write to the database.
( ) SQL Server Native Client ODBC driver ensures data consistency and validation by firing C/AL code triggers.
( ) SQL Server Native Client ODBC driver can be used to access Classic database and SQL Server database.

4. Which tools can be used to monitor performance on a classic database? (Select all that apply)
( ) Client Monitor
( ) Session Monitor
( ) Performance Data Collector
( ) Perfmon

5. What are Dynamic Management Views and Functions used for and how can you use them?

MODEL ANSWER: Dynamic management views (DMV) and functions (DMF) were introduced as a new feature in SQL Server 2005. They return server state information that can be used to monitor the health of a server instance, diagnose problems, and tune performance. All dynamic management views and functions exist in the sys schema and follow the naming convention dm_*. Dynamic management views can be referenced in Transact-SQL statements by using two-part, three-part, or four-part names. Dynamic management functions, on the other hand, can be referenced in Transact-SQL statements by using either two-part or three-part names.

6. One of your customers wants to upgrade to Microsoft Dynamics NAV 2009. You have calculated the initial size of the database to be 110 GB, with an expected annual growth of 20%. Which server option will you recommend and why?

MODEL ANSWER: Given the initial database size and the expected growth, the SQL Server Option offers better opportunities for future growth. The maximum database size with Classic Database Server is 132 GB, while SQL Server can scale out to much larger databases. If you recommend Classic Database Server, the maximum database size will be reached after only one year, and the customer will then have to switch to the SQL Server Option.
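As an illustration of the model answer to question 5, the following query combines a dynamic management view (sys.dm_exec_query_stats) with a dynamic management function (sys.dm_exec_sql_text) to list the statements that have consumed the most CPU time since the instance was last started. The query is an added example, not part of the original courseware, and requires the VIEW SERVER STATE permission.

   -- Ten statements that have consumed the most CPU since the instance started
   SELECT TOP (10)
          qs.execution_count,
          qs.total_worker_time AS total_cpu_microseconds,
          st.text AS statement_text
   FROM sys.dm_exec_query_stats AS qs
   CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
   ORDER BY qs.total_worker_time DESC;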


CHAPTER 4: PERFORMANCE AUDITS

Objectives

The objectives are:

Create and set up a test environment for troubleshooting purposes.
Run a performance audit using System Monitor.
Monitor performance using the Client Monitor tool.
Identify the clients that cause performance problems using Session Monitor.
Create trace files using Microsoft SQL Server Profiler and analyze the trace files.
Use the Database Engine Tuning Advisor to analyze indexes and data partitioning.
Use Dynamic Management Views to analyze performance data.
Collect performance-related data using additional scripts, tools and reports.

Introduction

Monitoring server performance helps keep your servers functioning optimally and helps you identify bottlenecks in the system. You can use the performance monitoring data to identify problems and apply corrective action. You can also use the monitoring data to enhance the performance of your servers by identifying areas that need additional resources. For example, you may need to increase your storage capacity to handle the growing number of users in your organization.

If your organization is small, or if you rely on one server for most of the Microsoft SQL Server operations, you may need to monitor only one server. If you have a larger organization, or if you want to monitor the performance of all servers and components in SQL Server, you can use System Monitor, which is a Windows Server 2003 component. You can also use the Windows Performance Monitor, a Windows Server 2003 snap-in, to verify that your operating system is functioning correctly.

The Windows Performance Monitor, which is made up of the System Monitor and Performance Logs and Alerts snap-ins, is the primary tool set used to analyze and maintain SQL Server and operating system performance levels. The Windows Performance Monitor is quite flexible and can be used to collect data interactively from a single server or automated to collect data from many servers. For more information about how to use the Performance console, see the Windows Server 2003 documentation (http://www.microsoft.com/windowsserver2003/proddoc/default.mspx).

You can also use Task Manager (Taskmgr.exe) to obtain information about the processes and programs that are running on your local computer. There are important differences between Task Manager and the Windows Performance Monitor: for example, the Windows Performance Monitor captures data to a file, whereas Task Manager can end a process. Task Manager is primarily a troubleshooting aid, and the Windows Performance Monitor is used for more detailed troubleshooting and analysis.

Set up a Test Environment

Before troubleshooting and solving performance problems that exist in a working installation, you generally need to set up a separate test environment. Setting up a test environment means the following:

1. Setting up a separate database server. The test server should be set up on a separate computer and not on the computer used by the production system. Using a separate test environment gives you complete control over the system and over who has access to it. It also means that the customer can continue to use the production system.

2. Copying the production system database to the test server. If you are running on SQL Server, use the backup/restore functions in SQL Server Management Studio to make a fast copy of the database to the test server. If you are running on Classic Database Server, you can use the server-based backup program HotCopy.

3. Warming up the server, to ensure that you get realistic measurements. You must warm up the test server regardless of which server option you are using.

Warming Up SQL Server

If you have just turned SQL Server on or if you have just created the database or company, you must warm up SQL Server by using the database and the company. This ensures that the system resembles the actual customer installation and means that you can generate realistic performance measurements. You only need to run an initial test to warm up SQL Server.

When SQL Server is warmed up, the execution plans for most queries have already been generated and are ready for use. Furthermore, the most frequently used data is now available in memory. When SQL Server is not warmed up, you will, for example, see that inserting, modifying, or deleting records in a table can take more time to finish. This would usually be done much faster in an active SQL Server installation.

To warm up SQL Server, you can use a SQL script that contains a specific workload, or create a SQL Server Profiler trace file and replay the file afterward.
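What such a warm-up script looks like depends entirely on the customer's workload. The following is only a minimal sketch against the demo Customer table used earlier in this course; a realistic script would touch the tables and code paths the customer actually uses.

   -- Read the Customer table a few times so that its pages are loaded into the
   -- buffer cache and execution plans are compiled before measurements start.
   USE [Demo Database NAV (6-0)];
   GO
   SELECT COUNT(*) FROM [dbo].[CRONUS International Ltd_$Customer];
   SELECT * FROM [dbo].[CRONUS International Ltd_$Customer]
   WHERE [No_] LIKE '1%';
   GO 5   -- repeat the batch five times (supported by SSMS and sqlcmd)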

Chapter 4: Performance Audits Performance Indicators When you connect a client to Classic Database Server, you can see status information for server calls that take two or more seconds in the status bar. Typical server calls that generate status information are as follows: Server calls that modify or delete sets of records. Server calls that scan an index or an entire table to find some data. Server calls that need to lock a record or a table can be forced to wait until other transactions are committed and release the locks that they placed. Therefore, you should monitor the indicator in the status bar when trying to identify problematic tasks because this information may be all you need to break down a performance problem. However, if you are using the Microsoft SQL Server Option for Microsoft Dynamics NAV, the client's user interface has no indicators to tell you how much time is spent on long-running tasks. You therefore need some other procedures to break down a performance problem on SQL Server. Performance Problems Performance issues can occur at any time and can be caused by many things. Some of the most common causes are as follows: System resource contention Application design problems Queries or procedures that have long execution times Blocking Performance issues normally occur unexpectedly, but sometimes they can be predicted. A disk or memory failure, for example, is difficult to predict and is frequently detected after the problem arises. However, a performance issue that is caused by a missing index can be detected proactively, by monitoring the system resources and the database activity. Often, the monitoring information will show a specific trend. In the beginning, when there is little data in the system, the issue will not be visible in the counters. As more data is added to the database, one or more counters will start behaving differently. Troubleshooting performance issues involves the use of a series of steps to isolate and determine the cause. Identifying these causes is typically very timeconsuming, and you may spend several days evaluating the collected information. If there are performance issues, you can use the following troubleshooting scenario: 1. Check the hardware configuration. 4-3

2. Check the hardware performance.
3. Check the database design.

When you are trying to identify any bottlenecks that exist in an installation, we recommend that you start by checking the hardware that the installation is running on. The first thing to verify is whether the hardware meets the system and configuration requirements. Identify which RAID level is used, and whether separate disks are used for the database and transaction log files (or whether everything is placed on the same disk).

If the configuration seems correct, you should start to look at the performance of the hardware. For example, does the server have sufficient memory to serve all requests, or is the server constantly paging because of a memory shortage, causing excessive disk I/O and CPU usage?

Finally, you can analyze the queries that are running on SQL Server. By monitoring the activity inside the database, you can see which queries cause the performance issue. Afterward, you can take the steps that are required to solve the problem.

Performance problems that are related to specific tasks should always be tested in the test environment, when no other users are logged on to the database server. This helps you determine whether the performance problem is related to the task itself, or if the problem only occurs when the task is executed in combination with other tasks on the same server.

Deadlocks occur when concurrent transactions try to lock the same resources but do not lock them in the same order. This can be solved either by always using the same locking order or by using a "locking semaphore" that prevents these transactions from running concurrently.

To monitor the SQL Server installation, you can use several tools that are described in the next sections.

System Monitor

You can use System Monitor to obtain comprehensive information about your computer and about instances of SQL Server running on your computer. You can then use this information to diagnose performance issues and identify bottlenecks in the system. In this lesson, you learn how to use the System Monitor tool to collect and view real-time or logged data for memory, disk, processor and SQL Server activity.

With Performance Logs and Alerts, you can collect performance data automatically from local or remote computers. You can view logged counter data using System Monitor or import the data into spreadsheet programs or databases for analysis and report generation.

Chapter 4: Performance Audits System Monitor System Monitor is an MMC snap-in that can be used to view system performance metrics such as processor and memory utilization or disk activity statistics. You can view System Monitor by starting the Performance application in the Administrative Tools program group. Alternatively, you can click Start, and then Run in the Windows Taskbar and enter perfmon.exe in the Run window. System Monitor categorizes information into objects, counters and instances. Understanding how objects, counters and instances are related to one another is critical to using System Monitor effectively. Objects - in System Monitor, objects are major components or subsystems of the computer system. Objects can be hardware (for example, a hard disk), software (for example, a process), or applications (for example, an instance of SQL Server). There are a fixed number of objects in Windows Server 2003, and installing SQL Server adds more objects specific to SQL Server to the list. Counters - counters collect data about different aspects of objects. For example, for the Process object, counters collect data on the % processor time and the user time. Counters are built into the operating system and continually read performance data, whether it is visible in System Monitor or not. If an object has multiple instances, counters track statistics for each instance or for the total of all instances. Instances - instances are multiples of the same object type. For example, if a system has multiple processors, the Processor object type will have multiple instances. When you view performance information in System Monitor, you can choose to view the values for an individual instance of an object (for example, the utilization of a single processor) or the combined values for all instances (for example, the overall processor utilization across all processors in the system). In System Monitor, you can specify which counters are to be displayed. You can display the counter values as a graph, a histogram (bar chart), or a report. Graphs, histograms and reports can be viewed in a browser and printed when performance data is saved as an HTML file. Reports can be exported into a spreadsheet, such as Microsoft Office Excel for further analysis. Performance Logs and Alerts In addition to the System Monitor MMC snap-in, the Performance tool includes a second snap-in named Performance Logs and Alerts. You can use this snap-in to capture performance information to a log file for later viewing in System Monitor. There are two types of log files: counter logs and trace logs. Counter logs record data about hardware resources and system resources based on performance over time. Counter logs are useful for tracking trends. Trace logs are used to record memory and resource events. 4-5

Log files provide useful information for troubleshooting and planning. While charts, alerts, and reports on current activity provide instant feedback, log files enable you to track counters over a long period of time. Thus, you can examine information more thoroughly and document system performance.

You can configure alerts that fire when a counter reaches a specific threshold value. Alerts are useful if you are not actively monitoring a particular counter but want to be notified when it exceeds or falls below a specified value so that you can investigate and determine the cause of the change. For example, you can set an alert to fire when the percentage of disk space used exceeds 80 percent or when the number of failed logon attempts exceeds a specific number.

NOTE: Alerts in System Monitor, though they perform a similar function, are not related to alerts in SQL Server. For more information about how to create and configure alerts in Windows Server 2003, see Microsoft Knowledge Base article 324752, How to Create and Configure Performance Monitor Alerts in Windows Server 2003 (http://go.microsoft.com/fwlink/?linkid=3052&kbid=324752).

NOTE: The alert functionality depends on the Windows Server 2003 Messenger Service, the Windows Server 2003 Alerter Service, and the existence of the recipient account registration in the Windows Internet Name Service (WINS). The Messenger and Alerter services are disabled by default and must be enabled and started to allow network messages to be transmitted.

SQL Server Performance Objects

SQL Server provides objects and counters that can be used by System Monitor to monitor activity in computers running an instance of SQL Server. An object is any SQL Server resource, such as the SQL Server Lock Manager. Each object contains one or more counters that determine various aspects of the objects to monitor. For example, the SQL Server Locks object contains the Number of Deadlocks/sec and Lock Timeouts/sec counters.

SQL Server Objects

The following table describes the most commonly used SQL Server objects.

Performance object: SQLServer:Access Methods
Description: Searches through and measures allocation of SQL Server database objects (for example, the number of index searches or number of pages that are allocated to indexes and data).

Performance object: SQLServer:Buffer Manager
Description: Provides information about the memory buffers used by SQL Server, such as free memory and buffer cache hit ratio.

Performance object: SQLServer:Databases
Description: Provides information about a SQL Server database, such as how much log space is available or the number of active transactions in the database. There can be multiple instances of this object.

Performance object: SQLServer:General Statistics
Description: Provides information about general server-wide activity, such as the number of users who are connected to an instance of SQL Server.

Performance object: SQLServer:Locks
Description: Provides information about the individual lock requests made by SQL Server, such as lock timeouts and deadlocks. There can be multiple instances of this object.

Performance object: SQLServer:Memory Manager
Description: Provides information about SQL Server memory usage, such as the total number of lock structures currently allocated.

Performance object: SQLServer:Plan Cache
Description: Provides information about the SQL Server cache used to store objects such as stored procedures, triggers and query plans.

Performance object: SQLServer:SQL Errors
Description: Provides information about SQL Server errors.

Performance object: SQLServer:Transactions
Description: Provides information about the active transactions in SQL Server, such as the overall number of transactions and the number of snapshot transactions.

There are many other SQL Server objects and some SQL Server Agent objects. When multiple instances of SQL Server are installed on the same computer, each instance has its own set of performance objects.

Considerations for Monitoring SQL Server

Monitoring an instance of SQL Server requires analysis of some key aspects of the system, such as the disk system, memory, and CPU. Eliminating the physical bottlenecks can immediately affect performance and further isolate the design issues in the database, Transact-SQL queries, or client applications.

It is important to monitor SQL Server performance so that you can identify bottlenecks, determine their cause, and eliminate them. Bottlenecks can be eliminated by upgrading or optimizing hardware, by distributing server load among other SQL Servers, or by tuning SQL Server databases, indexes and queries.

While System Monitor and Performance Logs and Alerts show information about system resources, they do not show any information on the database activity that is going on at the same time.

System Monitor shows you the current activity. Performance Logs and Alerts allows you to create counter logs containing specific counters. The counter logs can run in the background and collect information over a longer period, so you can see when the performance issues occur and analyze trends. Counter logs can be scheduled to run automatically in a specific time interval and can be used to document system performance. Measurements can be saved either to a file or to a SQL database.

Monitoring Disk Activity

SQL Server uses Windows operating system input/output (I/O) calls to perform read and write operations on your disk subsystems. SQL Server manages how and when disk I/O is performed, but the Windows operating system performs the underlying I/O operations. The I/O subsystem includes the system bus, disk controller cards, disks, tape drives, CD-ROM drive, and many other I/O devices. Disk I/O is a frequent cause of system bottlenecks.

Monitoring disk activity involves two areas of focus:

Monitoring disk I/O and detecting excess paging
Isolating disk activity that SQL Server creates

You can monitor the following counters in the PhysicalDisk object to determine disk I/O and detect excess paging.

Counter: % Disk Time
Description: Monitors the percentage of time that the disk is busy with read/write activity.
Guidelines: If this counter is high (more than 90 percent), check the Current Disk Queue Length counter to see how many system requests are waiting for disk access.

Counter: Avg. Disk Queue Length
Description: Monitors the average number of read/write requests that are queued.
Guidelines: This counter should be no more than twice the number of spindles that make up the physical disk.

Counter: Current Disk Queue Length
Description: Monitors the current number of read/write requests that are queued.
Guidelines: This counter should be no more than twice the number of spindles that make up the physical disk.

The Disk Read Bytes/sec and Disk Write Bytes/sec counters indicate the maximum throughput of a disk. Use the values of the Current Disk Queue Length and % Disk Time counters to detect bottlenecks within the disk subsystem. If the Current Disk Queue Length and % Disk Time counter values are consistently high, consider taking one of the following actions:

Use a faster disk drive

Move some files to an additional disk or server
Add disks to a RAID array (if one is used)

NOTE: If you have more than one logical partition on the same hard disk, use the LogicalDisk counters rather than the PhysicalDisk counters.

Monitor the Page Faults/sec counter in the Memory object to make sure that the disk activity is not caused by paging. A page fault occurs when the operating system cannot find the requested information in its physical memory, forcing the operating system to seek the information at the disk level. A soft page fault is when a page is found elsewhere in the physical memory, and a hard fault requires disk access. Most processors can handle large numbers of soft faults without any significant consequences. However, hard faults, which require disk access, can cause significant delays. In general, the value of this counter should stay below 20-25 pages per second (per processor).

You can monitor the following counters in the SQL Server:Buffer Manager object to isolate the disk activity generated by SQL Server components.

Counter: Page reads/sec
Description: Number of physical database page reads that are issued per second. This statistic displays the total number of physical page reads across all databases.
Guidelines: Minimize the number of reads either by using a larger data cache, intelligent indexes and more efficient queries, or by changing the database design.

Counter: Page writes/sec
Description: Number of physical database page writes issued per second.
Guidelines: Minimize the number of writes either by using a larger data cache, intelligent indexes and more efficient queries, or by changing the database design.

If the values for these counters approach the capacity limit of the hardware I/O subsystem, try to reduce the values by tuning your application or database to reduce I/O operations (such as index coverage, better indexes, or normalization), increasing the I/O capacity of the hardware, or adding memory.

Monitoring CPU Usage

Monitor an instance of SQL Server periodically to determine whether CPU usage rates are within usual ranges. A continually high rate of CPU usage may indicate the need to upgrade the CPU or add multiple processors. Alternatively, a high CPU usage rate may indicate a poorly tuned or designed application. Optimizing the application can reduce CPU use.

Use the counters described in the following table to monitor CPU usage.

Counter: Processor - % Processor Time
Description: Monitors how long the CPU spends executing a thread that is not idle.
Guidelines: A consistent state of 80 to 90 percent may indicate the need to upgrade your CPU or add more processors. For multiprocessor systems, monitor a separate instance of this counter for each processor.

Counter: Process - % Processor Time (sqlservr instance)
Description: Monitors how long the CPU spends executing a thread in the SQL Server process.
Guidelines: Use this counter to assess the SQL Server contribution to overall processor utilization.

Counter: System - Processor Queue Length
Description: Monitors the queue length per processor.
Guidelines: Recommended maximum value per processor is 2. Higher values usually indicate a CPU bottleneck.

Monitoring Memory Usage

You can monitor an instance of SQL Server periodically to confirm that memory usage is within typical ranges. You need to be sure that no processes, including SQL Server, consume too much memory or are constrained by insufficient memory. To monitor for a low-memory condition, use the object counters described in the following table.

Counter: Memory - Available Bytes
Description: Indicates how many bytes of memory are currently available for use by processes.
Guidelines: Low values of the Available Bytes counter can indicate an overall shortage of memory on the computer or that an application is not releasing memory.

Counter: Memory - Pages/sec
Description: Indicates the number of pages that either were retrieved from disk because of hard page faults or written to disk to free up space in the working set because of page faults.
Guidelines: A high rate for the Pages/sec counter could indicate excessive paging. Monitor the Memory:Page Faults/sec counter to make sure that disk activity is not caused by paging.

Counter: Process - Page Faults/sec (sqlservr instance)
Description: Windows Virtual Memory Manager takes pages from SQL Server and other processes as it trims the working-set sizes of those processes.
Guidelines: The counter will typically be high after SQL Server is restarted, but show a decreasing trend afterward. A high number of this counter indicates excessive paging and disk thrashing. Use this counter to check whether SQL Server or another process is causing the excessive paging.

Counter: Process - Working set (sqlservr instance)
Description: Shows the amount of memory that is used by a process.
Guidelines: If this number is consistently below the amount of memory that is set by the min server memory and max server memory server options, SQL Server is configured to use too much memory.

Counter: SQL Server Buffer Manager - Buffer Cache Hit Ratio
Description: Monitors the percentage of required pages found in the buffer cache, without reading from the hard disk. Does not differentiate between physical memory and paging file memory that is used for buffer cache.
Guidelines: Add more memory until the value is consistently greater than 90 percent.

Counter: SQL Server Buffer Manager - Page Life Expectancy
Description: Indicates the number of seconds a page remains in physical memory.
Guidelines: If this value is below 300 seconds, SQL Server is experiencing a memory shortage.

Counter: SQL Server Buffer Manager - Total Pages
Description: Monitors the total number of pages in the buffer cache, including database, free and stolen pages from other processes.
Guidelines: A low number may indicate frequent disk I/O or thrashing. Consider adding more memory.

Counter: SQL Server Memory Manager - Total Server Memory (KB)
Description: Monitors the total amount of dynamic memory that the server is using.
Guidelines: If this counter is consistently high in comparison to the amount of physical memory available, more memory may be required.

Miscellaneous Counters

Other than the hardware monitoring counters described in the previous sections, you can add various other counters that monitor the performance and that help identify bottlenecks in the system.

Counter: SQL Server:Locks - Lock Waits/Sec
Description: Indicates the number of lock requests per second that require the caller to wait.
Guidelines: This number should be near 0. If the value is greater than 0 for a considerable period, the performance drop is most likely caused by locking and blocking issues within the database.

Counter: SQL Server:Locks - Lock Timeouts (Timeout > 0)/sec
Description: Indicates the number of lock timeouts per second.
Guidelines: Recommended value is 0.

Counter: SQL Server:Locks - Number of Deadlocks/sec
Description: Indicates the number of deadlocks per second.
Guidelines: Recommended value is 0.

Counter: SQL Server:Wait Statistics
Description: Provides general information about the time SQL Server spends waiting due to several reasons. The different counters can help you identify bottlenecks in the performance of the log files, the data files, the network and so on.
Guidelines: In general, high values indicate that specific system components are not performing optimally. Further monitoring of the individual component is necessary.

Counter: SQLServer:Access Methods - Full Scans/sec
Description: Monitors the number of full table scans that SQL Server performs to retrieve data.
Guidelines: A high number can indicate insufficient indexes. Recommended value is less than 2. Use the sys.dm_db_index_usage_stats dynamic management view to view the index statistics or run Database Engine Tuning Advisor to analyze current indexes.

Counter: SQLServer:Access Methods - Page Splits/sec
Description: Monitors the number of times a page is full and split between the current page and a newly allocated page.
Guidelines: Page splits occur when a data or index page is full. Excessive splitting can cause high disk I/O. Consider adjusting the fill factor of your indexes.

Counter: SQLServer:Access Methods - Table Lock Escalations/sec
Description: Reports the number of times a table lock was requested in a second.
Guidelines: A high number can indicate insufficient indexes.

Counter: SQLServer:Access Methods - Worktables created/sec
Description: Indicates the number of temporary work tables that SQL Server is creating.
Guidelines: A high number can indicate insufficient indexes.

Counter: SQLServer:SQL Statistics - Batch Requests/sec
Description: Monitors the number of Transact-SQL command batches received per second.
Guidelines: High batch requests indicate good throughput. Recommended values frequently depend on hardware. In general, values around 1000 indicate a CPU bottleneck. If values grow higher, consider adding a faster network card.

Counter: SQLServer:SQL Statistics - SQL Re-Compilations/sec
Description: Monitors the number of statement recompiles per second.
Guidelines: As recompiles consume CPU time, try to reduce the number of recompiles. A value of 100 indicates unnecessary compilation overhead. This counter should be compared with Batch Requests/sec.
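Most of the SQL Server counters listed in the preceding tables can also be read directly from Transact-SQL through the sys.dm_os_performance_counters dynamic management view. The following query is an illustrative addition to the course material; note that the per-second counters in this view are cumulative values, so to obtain a rate you must sample them twice and compute the difference yourself.

   -- Read a few of the counters discussed above directly from SQL Server
   SELECT object_name, counter_name, instance_name, cntr_value
   FROM sys.dm_os_performance_counters
   WHERE counter_name IN ('Page reads/sec',
                          'Page writes/sec',
                          'Page life expectancy',
                          'Number of Deadlocks/sec',
                          'Full Scans/sec');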

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Demonstration: Setup Disk Activity Monitoring Recently, Mort, the IT Systems developer, received several support requests about server performance. Users have been complaining about long execution times, sometimes resulting in deadlocks. Mort reports this to Tim, the IT Manager, who will setup some monitoring on the SQL Server. Tim will monitor disk activity, memory and CPU usage. He will also monitor the number of deadlocks in SQL Server. 1. On the Windows taskbar, select Start and then Run. 2. In the Run window, enter perfmon and then click OK. The Performance Microsoft Management Console including the System Monitor and Performance Logs and Alerts snap-ins will appear. FIGURE 4.1 THE PERFMON TOOL The bottom right pane shows the resources currently being monitored. In the upper right pane, you see the visualization of the counter values. By default, there are three counters, whose values are displayed in a graph object. 3. Right-click the graph object in the upper right pane and select Add Counters. 4. In the Add Counters window, select the computer that you want to monitor. You can either select the local computer or other computers in the network. In this case, Tim selects the local computer. 4-14

Chapter 4: Performance Audits 5. In the Performance object field, select PhysicalDisk. 6. Choose the Select counters from List option. 7. In the left list box, select % Disk Time. 8. Choose the Select Instances from List option. 9. In the right list box, select the C drive. 10. Click the Add button. 11. Repeat steps 7 to 10 for the Current Disk Queue Length and Avg. Disk Queue Length counters. To monitor the disk activity generated by SQL Server, Tim adds two more counters from the SQL Server:Buffer Manager object. 1. In the Performance object field, select SQL Server:Buffer Manager. 2. Choose the Select counters from List option. 3. In the left list box, select Page reads/sec. 4. Click the Add button. 5. In the left list box, select Page writes/sec. 6. Click the Add button. Finally, to detect the I/O caused by paging, Tim will add the Page Faults/sec counter from the Memory object. 1. In the Performance object field, select Memory. 2. Choose the Select counters from List option. 3. In the left list box, select Page Faults/sec. 4. Click the Add button. Demonstration: Setup CPU Monitoring As a next step, Tim will add counters to monitor the CPU. The % Processor Time counter from the Processor object is displayed by default. To monitor the CPU time claimed by SQL Server, Tim adds the % Processor Time from the Process object. 1. In the Performance object field, select Process. 2. Choose the Select counters from List option. 3. In the left list box, select % Processor Time. 4. Choose the Select Instances from List option. 5. In the right list box, select the sqlservr process. 6. Click the Add button. 4-15

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Demonstration: Setup Memory Monitoring As a third step, Tim will add counters to monitor the server memory. The Pages/sec counter is added by default. The number of page faults per second has been added when monitoring disk activity. Tim wants to know whether SQL Server has sufficient memory, so he decides to add the Buffer Cache Hit Ratio counter. 1. In the Performance object field, select SQL Server:Buffer Manager. 2. Choose the Select counters from List option. 3. In the left list box, select Buffer Cache Hit Ratio. 4. Click the Add button. Demonstration: Setup Lock Monitoring As a final step, Tim will add counters to monitor the number of locks and deadlocks in SQL Server. 1. In the Performance object field, select SQL Server:Locks. 2. Choose the Select counters from List option. 3. In the left list box, select Number of Deadlocks/sec. 4. Choose the Select Instances from List option. 5. In the right list box, select the Database option. 6. Click the Add button. 4-16

Chapter 4: Performance Audits The result should look as follows: FIGURE 4.2 COUNTERS IN THE PERFMON TOOL WINDOW Demonstration: Create a Counter Log Tim has modified the default counters of System Monitor. However, he cannot watch System Monitor all the time. He wants to save the measurements in a counter log file so that he can analyze the data when he has time. Tim will also use counter logs to track trends. If necessary, he can go back months to confirm a change in resource usage. Moreover, by keeping logs, he can see exactly when a problem originated. Perform the following steps to create a counter log. 1. On the Windows taskbar, select Start and then Run. 2. In the Run window, enter perfmon and then click OK. The Performance Microsoft Management Console including the System Monitor and Performance Logs and Alerts snap-ins will appear. 3. In the left pane, under Performance Logs and Alerts, right-click Counter Logs and select New Log Settings. 4. In the New Log Settings window, enter a name for the log. For example, the name of the server or object to measure. In this case, Tim enters NAV-SRV-01. 4-17

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 5. Click OK to close the New Log Settings window. The NAV-SRV- 01 window appears. 6. In the Current Log File Name field, enter a path and a name for the counter log files. The default folder for the log files is C:\PerfLogs. You can change the default folder and log name if you want. 7. Click the Add Counters button to add individual counters, as shown in the previous demonstrations. To select multiple counters in an object, press the Ctrl key at the same time you select the counters. 8. In the Interval and Units fields, leave the default values. Be aware that smaller intervals imply more accurate event information, but also require more disk space. 9. Click the Apply button to create and activate the counter log. To modify the counter log, right-click the counter log and select Properties. To stop or start a counter log, right-click the log and select either Stop or Start. Tim has defined and activated a counter log to monitor the system resources. While watching the System Monitor, he sees the % Processor Time counter peaking at almost 100%. FIGURE 4.3 THE SYSTEM MONITOR WINDOW SHOWS INCREASED % PROCESSOR TIME He needs to find out what is happening on the server. In the meantime, he leaves the counter log running. 4-18
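The counter log tells Tim that the processor is saturated, but not what is running. The Client Monitor described in the next lesson answers that question from the Microsoft Dynamics NAV side; on the SQL Server side, a quick look at the currently executing requests can be taken with dynamic management views. This query is an illustrative addition, not part of the original demonstration, and requires the VIEW SERVER STATE permission.

   -- Show the requests that are currently executing, together with their
   -- statement text, ordered by the CPU time they have consumed so far.
   SELECT r.session_id,
          r.status,
          r.cpu_time,
          r.wait_type,
          r.blocking_session_id,
          t.text AS statement_text
   FROM sys.dm_exec_requests AS r
   CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
   WHERE r.session_id <> @@SPID
   ORDER BY r.cpu_time DESC;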

Chapter 4: Performance Audits Client Monitor The Client Monitor is an important tool for troubleshooting performance and locking problems. You can also use it to identify the worst server calls and to identify index and filter problems in the SQL Server Option. The Client Monitor and the Code Coverage tool now work closely together allowing you to easily identify, for example, the code that generated a particular server call. Import Client Monitor The Client Monitor displays all the details of the server calls made by the current client, including the time spent on each server call. All the trace information is stored in the Monitor table (a virtual table stored in the Microsoft Dynamics NAV database). This makes it an invaluable tool when analyzing a particular task and studying the server calls that the task makes as well as the code that initiates the server calls. This tool can be used with both server options. When you use the Client Monitor with the SQL Server Option it contains some additional options and fields that provide more insight into how your application is performing. The Client Monitor tool is available on the Tools menu. However, to profile and analyze a given task in Microsoft Dynamics NAV using the Client Monitor, you must have some Client Monitor helper objects, which allow better analysis. The helper objects are part of the Microsoft Dynamics NAV Database Resource Kit, which is available on PartnerSource(https://mbs.microsoft.com/partnersource/downloads/supplements/ databaseresourcekit.htm). Client Monitor requires a Microsoft Dynamics NAV Classic client and only captures information about the server requests (SQL statements) from the current client. It does not contain information about the system resources. You can either analyze the Client Monitor information in Client Monitor or you can export the output to a file and analyze the data in Microsoft Office Excel. Be aware that in the standard version of the Client Monitor, the virtual table Monitor is cleared when you exit the Microsoft Dynamics NAV Classic client. If you use the extended version, remember to open form 150020, Client Monitor before exiting the Classic client. (Opening the form copies the data from the virtual table to a regular database table, so the information will be available for later analysis.) If you run the Code Coverage and Client Monitor tools, you can link the output of both tools and see which line of C/AL code a specific measurement is linked to. Analyzing locking problems requires multiple clients running the Client Monitor. To find deadlocks, you must enter the locking rules. 4-19

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Profile a Task After you have imported and compiled the Client Monitor helper objects, you can start profiling a task. 1. On the Tools menu, click Debugger and then Code Coverage to open the Code Coverage window. 2. Start the Code Coverage tool and then start the Client Monitor just before you are ready to perform the task that you want to investigate. 3. Perform the task that you want to test. 4. When you have finished the task, stop the Client Monitor and then stop the Code Coverage tool. The Client Monitor uses many lines to describe a single server call, and this makes it difficult to use for data analysis. 5. Run form 150020, Client Monitor. This processes the data from the Client Monitor and displays it in a new window. The Client Monitor window displays and formats the data that has been collected by the Client Monitor so that it can be more easily analyzed. It performs a type of cross tabulation of the operations and parameters and uses one line per server call. After you have profiled a specific task, you can start analyzing the information in the Client Monitor window. You can use the information to find the worst server calls and to detect locking problems. Best Practices when Profiling a Task When using these tools, make sure that your test tasks are focused on the area that you are interested in testing. If you spend time doing other tasks, both the Client Monitor and the Code Coverage tool will fill up with irrelevant information. If you are analyzing a lengthy task that takes an hour or more to run, consider restricting the scope of the task. You can limit the task by applying filters that will make the task handle less data, or by stopping the task after several minutes. You can then use the Client Monitor data from the part of the task that was performed as the basis for your analysis. You can analyze the Client Monitor data within Microsoft Dynamics NAV, or you can perform a more detailed analysis by importing the data into pivot tables in Excel. Problematic Server Calls Perform the following steps to identify the most problematic server calls: 1. Profile your task as described in the previous section. 4-20

Chapter 4: Performance Audits 2. Run form 150020, Client Monitor. 3. Click View and then Sort to sort the data in the Client Monitor window. Sorting by Elapsed Time (ms) in descending order is a useful way to view the data. The server calls that took the longest time will then be listed at the top. This helps you identify the worst server calls. After you have identified the problematic server calls, you can optimize the slow queries that are caused by filters and keys that do not match (especially on SQL Server) by using the appropriate keys in the queries or possibly by changing the existing keys. NOTE: Rearranging the fields in a key, for example, by moving the first field in a key to the end and by changing the references to the key (both in the code and in the properties), can solve a performance problem. Furthermore, any FlowFields in the key that are calculating sums are guaranteed to work as long as all the original fields are left in the key. If you remove some of the fields from a key, you can cause some FlowFields that are calculating sums to produce run time errors. When you are developing an application, you will not encounter problems such as the one described here, unless you enter some pseudo-realistic amounts of data into the database. Locking Problems You can also use the Client Monitor to see whether locking prevents concurrent tasks from being executed at the same time and to identify where deadlocks occur in a specific multi-user scenario. Before you use the Client Monitor to identify potential locking problems, you must import the Client Monitor helper objects as described earlier in this lesson. Before you try to identify locking problems, you must make sure the clocks are synchronized on all the client machines. You can set up computers running most Windows operating systems so that their clocks are automatically synchronized with the time on a server when they log on by using the following command: net time \\computername /SET. 1. Start the Client Monitor on all the computers involved in the multiuser test. 2. Perform the tasks that you want to test. 3. Stop the Client Monitor on all the computers. 4. On each client computer, process the common client monitor data by running form 150020, Client Monitor. 5. Run form 150024, Client Monitor (Multi-User) on one of the client computers. 4-21

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 The Client Monitor (Multi-User) window contains information about the transactions that might have blocked other clients. The COMMIT in the transactions that might have blocked other clients is shown with the server calls made by the clients that are potentially waiting for the COMMIT to be completed. The other server calls are listed before the COMMIT. Deadlocks 1. Check the values in the Elapsed Time (ms) field to see whether there are any server calls that are taking longer than usual. A high value in the Elapsed Time (ms) field indicates that a server call is waiting for locks to be released. Use the filtering features in Microsoft Dynamics NAV to see all the details of the locking scenarios. The value in the Locking field is Yes when a server call locks data. You should put a filter on this field to limit the data. 2. If a deadlock has occurred on SQL Server, the SQL Error field in the Client Monitor (Multi-User) window will show the error message that is generated by SQL Server. To see all of these lines, set the filter "SQL Error<>''. These lines are marked red and bold. Two transactions can only cause a deadlock if they both lock some of the same tables. However, if both of the transactions are defined so that the first lock they place is on the same table, no deadlock will occur. In other words, a deadlock occurs when two or more transactions have a conflicting locking order. To identify potential locking problems, you only need to use one client. You run the tasks on the client that you think might cause locking problems and collect all of the relevant data in the Client Monitor and then open a special form to see whether there are any potential deadlocks. To find potential deadlocks: 1. Import the Client Monitor.fob file, if you have not already imported it. 2. Compile all the objects that are imported. This must be done because some of the field definitions are different on the two database server options. 3. Prepare the tasks that you want to run concurrently without any deadlocks occurring. 4. Open and start the Code Coverage tool and then open and start the Client Monitor. 5. Perform the tasks. 6. Stop the Client Monitor and then stop the Code Coverage tool. 7. Run form 150030, Potential Deadlocks (Navision). 4-22

The Potential Deadlocks (Navision) window lists all the potential deadlocks or locking order conflicts that occurred during the tasks you performed. It is based on an analysis of the locking order used in each write transaction that was carried out.

Each line in the window contains information about two transactions that represent a potential deadlock. These transactions represent a potential deadlock because they both lock some of the same tables but lock them in a different order. Sets of transactions that do not contain a potential deadlock are not displayed. From this form, you can access more detailed information about the locks that were placed by each transaction, as well as the code that was used.

Locking Order Rules

As stated in the previous section, a deadlock occurs when two or more transactions have a conflicting locking order, and no deadlock can occur if the first lock that the transactions place is on the same table. So if you have an agreed set of rules that determine the locking order that must be used in your application, no deadlocks will occur. The problem is that agreeing to a set of rules is one thing; adhering to the rules is another thing completely. Remembering the rules is not as easy as it sounds - there could be lots of them.

There is also a tool that can help you see whether your application follows the locking rules specified. This involves determining which rules must apply in your application, entering them into the system, and then checking your application to see whether it violates the rules.

Entering Locking Rules

The most practical way to identify the locking order rules is to focus on the key procedures and document the order in which they lock the various tables. After you determine the rules that govern the locking order in your application, you can enter them into the system.

1. Run form 150029, Locking Order Rules.
2. Enter the rules that you want your application to follow.

Each entry represents a rule, and you can enter as many rules as are needed. Each rule specifies that one table must be locked before another table. Needless to say, your rules must not contain any conflicts.
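The locking order analysis above is designed to catch potential deadlocks before they happen. When a deadlock does occur on a live SQL Server, the server itself can be asked to record the full deadlock graph in its error log. The following sketch uses trace flag 1222, which is available from SQL Server 2005 onward; it complements, rather than replaces, the Client Monitor analysis and is not part of the original courseware.

   -- Write detailed deadlock information (the deadlock graph) to the SQL Server
   -- error log for all sessions. Turn it off again when the investigation is done.
   DBCC TRACEON (1222, -1);
   -- ... reproduce the workload, then review the SQL Server error log ...
   DBCC TRACEOFF (1222, -1);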

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Verifying Locking Rules The consistency of the rules is checked when you test your application to see whether it follows the rules. If the rules contain a conflict you will receive an error message. After you have entered the rules, you can test whether your application follows the rules. 1. Open and start the Code Coverage tool and then open and start the Client Monitor. 2. Perform the tasks that you want to test. 3. Stop the Client Monitor and then stop the Code Coverage tool. 4. Run form 150027, Transactions (Locking Rules). The Transactions (Locking Rules) window appears listing the transactions that you performed. 5. In the Transactions (Locking Rules) window, check the Locking Rule Violations field. If any of the transactions violated the rules that you specified earlier a check mark is displayed in the Locking Rule Violations field. 6. Click Transaction, Locking Rules Violations to check the violation. 7. In Locking Rules Violations window, you can see whether and how your transaction violated the locking order rules. You can fix the code, amend the locking order rules or decide that the probability of the two processes running concurrently is minimal and ignore the violation. Furthermore, the Transactions (Locking Rules) window offers the following functions: To see the C/AL code that broke the locking rule, select the transaction in question and then click C/AL Code. The Code Coverage window appears displaying the relevant code. To see all of the operations and tables involved in a particular transaction, select the transaction and then click Transaction and then Client Monitor. To see only the locking operations and the tables that were locked in a particular transaction, select the transaction and click Transaction and then Client Monitor (Locking Operations Only). To see the order in which tables were locked by a particular transaction, select the transaction and click Transaction and then Locking Order. To see the locking rules that were violated by a particular transaction and then select the transaction and click Transaction, Locking Rules Violated. 4-24

Chapter 4: Performance Audits When testing the locking order rules, you can run the Code Coverage tool to help you identify the code that is causing the conflicts. However, the Code Coverage tool is quite a time intensive tool and should not be used for multiple transactions, or run for more than 10 minutes. It takes a long time to perform complex procedures when the Code Coverage tool is running and it also takes time to process the information in the tool afterward. If you only use the Client Monitor without the Code Coverage tool, you will most likely get enough information to identify the code that is causing the conflict. Be aware that the only way to lock a record on SQL Server is to actually read it, whereas on Classic database server, you can use the LOCKTABLE command to dictate and define the locking order. The LOCKTABLE command on SQL Server Option does not do anything except flag internally that locks will be placed later when records are read. In summary, you can run the Client Monitor on its own in a real-life environment without experiencing any major decrease in performance and still get the information needed to resolve incorrect locking order problems. Keys When you are using the SQL Server Option, it is important that any customizations that you develop contain keys and filters that are designed to run optimally on SQL Server. Microsoft has therefore developed a tool that helps you test your keys and filters in a development environment and ensure that they comply with the demands made by SQL Server. To see whether an application contains any keys that might cause problems, you need a demonstration database and not a copy of the customer's database. Inserting, modifying, and deleting records take a similar amount of time in both large and small databases. However, the time that it takes to read will be very different, especially for tables that become very large in the customer's database. This means that an analysis based on the Elapsed Time (ms) field in the Client Monitor is not enough when you are troubleshooting performance problems in a small database. To check whether the keys and filters are designed correctly: 1. Open and start the Code Coverage tool and then open and start the Client Monitor. 2. Perform the task that you want to test. 3. Stop the Client Monitor and then stop the Code Coverage tool. 4. Open form 150022, Client Monitor (Key Usage). Queries with filters that do not use the keys appropriately are shown in this window. 4-25

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 The key that is being used is split into two fields: Good Filtered Start of Key and Key Remainder. Fields that are filtered to a single value, but are not used efficiently on SQL Server because of the selection and ordering of fields in the key that is used, are shown in the Key Candidate Fields field. Remember that SQL Server always wants a single-value field as the filter or as the first field in the filter. The information in the Client Monitor (Key Usage) window is sorted by table name and only displays the queries with filters that can potentially cause problems. You will need to use your knowledge of the application being developed and the theory behind the design of keys for SQL Server to decide which of the queries can be ignored and which will have to be modified. In general, consider the following: Ignore queries that use small tables that will not grow very large in the customer's database. An example of a small table that you can readily ignore is table 95, G/L Budget Name. Focus on the large tables and the tables that will grow quickly in the customer's database. Focus on the Key Candidate Fields and the Good Filtered Start of Key fields. As mentioned earlier, you should look at the Good Filtered Start of Key field. If this field is empty, check the Key Candidate Fields field and decide whether the fields shown here would have made a difference if they had been used efficiently. This is where your understanding of the application helps you. You need to decide whether the suggested key will make the query run more efficiently or not. If the suggested filter is a field that contains many different values, it will probably help. If you understand the theory behind the design of efficient queries, you will know whether it makes sense to change the application. However, if you are uncertain about the theory you will have to test the suggested query. This means that you must use a database that contains a realistic amount of data and then test the existing filter and the suggested new filter to see which one works more efficiently. NEXT Statements When running SQL Server and reading data, some NEXT function calls can generate separate SQL statements instead of using the data that is stored in cache. This can cause performance problems. The Client Monitor also contains a tool that can help you locate the C/AL code that generated these problematic SQL NEXT statements. 4-26

Chapter 4: Performance Audits To locate this C/AL code, you must perform the tasks in question, identify the problematic NEXT statements, and then locate the C/AL code that generated these statements. 1. Prepare the tasks that you want to perform. 2. Open and start the Code Coverage tool and then open and start the Client Monitor. 3. Perform the tasks that you want to test. 4. Stop the Client Monitor and then stop the Code Coverage tool. 5. Run form 150023, Client Monitor (Cache Usage). The Client Monitor (Cache Usage) window lists the problematic NEXT statements generated by the tasks that you performed. All of the normal NEXT statements are ignored. These NEXT statements are problematic because they generate their own SQL statement and database call to retrieve data from the server and do not use the data that is already cached on the client. It is difficult to know with certainty why these NEXT statements behave in this manner. It might be due to the following: Because C/SIDE is unable to optimize this function. A result of the way that the code is written. However these NEXT statements only cause problems if they are run repeatedly as part of a long-running batch job and generate a large number of additional server calls. To see the C/AL code that generated the SQL NEXT statements, select the line you are interested in and then click C/AL Code. The code that generated the statement is displayed in the Code Overview window. Analysis in Microsoft Office Excel You can use the Client Monitor together with Microsoft Excel to analyze the time that is spent by tasks that make many server calls (100+). You must begin by profiling the task as described earlier in this lesson. The data must then be transferred into Microsoft Office Excel. Perform the following steps to transfer the data into Excel: 1. Run form 150020, Client Monitor. 2. Click Export and save as a.txt file. 3. In Excel, import the.txt file that you have just saved. You now have a spreadsheet that contains the basic Client Monitor data. 4-27

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Use the pivot tables in Microsoft Office Excel to generate a sorted list of the tables that take the most time. The pivot table must also list the functions that are used, the search method, and the search result for each table. You must check the server calls that generated the sums to see the average amount of elapsed time for each server call. You can also create new spreadsheets that summarize different operations on various tables by using the Pivot Table feature. To create a pivot table: 1. Click Data, PivotTable, PivotChart Report and then click Finish in the wizard that appears. You can now choose which breakdown of the Client Monitor elements you want to analyze. This example uses the typical elements. 2. In the PivotTable window, select the Table Name button and drag it over to the range that says "Drop Row Fields Here". 3. Repeat this procedure for Function Name, Search Method and Search Result placing each field to the right of the previous field. 4. Drag Elapsed Time (ms) over to the range that says "Drop Data Items Here". A sum of the breakdown of timings per table/function, etc. is shown. Perform the following steps to list the most important tables first: 1. Double-click the Table Name field heading. 2. Select Advanced. 3. In the AutoSort options, select Descending. In the Using drop-down list select Sum of Elapsed Time (ms). To list the most important functions per table first, repeat this procedure for the Function Name field. If there are any totals that you do not want to see, rightclick the field that contains the word Total and then click Hide. For more information about pivot tables, see the online help in Microsoft Office Excel. If time is spent on modifications (INSERT, MODIFY, DELETE, MODIFYALL, DELETEALL) and the average time that is spent on modification server calls is high, you should check the keys in the table. The number and length of the keys influence the time it takes to make modifications on both database servers. On SQL Server, if the average time that is spent on modification server calls is very long, check whether there are SumIndexFields in the keys and whether the MaintainSIFTIndex property is set to Yes for these keys. Setting the MaintainSIFTIndex property to No for these keys can greatly improve the speed of modification server calls, but there will be some loss of performance for those tasks that generate sums using these keys. 4-28
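If modification server calls look slow, it can also help to see directly on SQL Server how many indexes a table carries and how wide their keys are, because every additional index (and every SIFT structure that is maintained) must be updated on INSERT, MODIFY, and DELETE. The following query is only a sketch; the table name is an example and follows the <Company>$<Table> naming that Microsoft Dynamics NAV uses on SQL Server. Note that SIFT structures, which Microsoft Dynamics NAV 2009 implements as indexed views, are not listed by this query.

-- A sketch: list the indexes on one NAV table and count the key columns in each.
-- The table name is an example only.
SELECT i.name AS index_name,
       i.type_desc,
       COUNT(ic.column_id) AS key_column_count
FROM sys.indexes AS i
JOIN sys.index_columns AS ic
  ON ic.object_id = i.object_id
 AND ic.index_id = i.index_id
WHERE i.object_id = OBJECT_ID(N'[CRONUS International Ltd_$Customer]')
  AND ic.is_included_column = 0
GROUP BY i.name, i.type_desc
ORDER BY key_column_count DESC;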

Chapter 4: Performance Audits Demonstration: Monitoring Code with Client Monitor Mort needs to initialize a new field "Entry No. 2" in the Ledger Entry Dimension table. To do this, he writes a codeunit to copy the value from the Entry No. field to the Entry No. 2 field. The C/AL code is as follows: LEDim.SETRANGE("Entry No. 2", 0); WHILE LEDim.FIND('-') DO BEGIN LEDim."Entry No. 2" := LEDim."Entry No."; LEDim.MODIFY; END; When he runs the code on the Test Database NAV (6-0) database, he notices that the code runs rather slowly. Before he runs the code in the live environment, he checks the performance of his code in the extended version of the Client Monitor as follows. 1. Open Microsoft Dynamics NAV 2009 Classic client. 2. In the Tools menu, select Object Designer to open the Object Designer window. Mort selects his codeunit. Before running his codeunit, he starts the Code Coverage tool and the Client Monitor tool, so he can link the performance data from the Client Monitor to the code information from the Code Coverage tool. 1. In the Tools menu, select Debugger > Code Coverage. 2. In the Code Coverage window, click the Start button. 3. In the Tools menu, select Client Monitor. 4. In the Client Monitor window, on the Options tab, and select all options. 5. Click the Start button. Now that both monitoring tools have been started, Mort starts his codeunit. As soon as the codeunit finishes, Mort stops both monitoring tools. 1. In the Code Coverage window, click the Stop button. 2. In the Client Monitor window, click the Stop button. Mort now opens the extended version of the Client Monitor tool. As he is focused on the duration of the transaction, he sorts the Client Monitor information based on the Elapsed Time (ms) field. 1. In the Object Designer window, select Form 150020, Client Monitor. 2. Click Run to open the Client Monitor window. 3. On the View menu, select Sort. 4-29

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 4. In the Client Monitor - Sort window, in the Key field, select Login Date, Login Time, Connection ID, and Elapsed Time (ms). 5. In the Order field, select the Descending option. 6. Click OK to close the window. The information will be sorted on Elapsed Time. The statements with the highest value will be displayed at the top. FIGURE 4.4 THE CLIENT MONITOR WINDOW When looking at the top lines, Mort notices that a lot of time is spent on MODIFY and FIND/NEXT statements. Mort checks the SourceText column and sees that the Client Monitor line is linked to the LEDim.MODIFY statement. FIGURE 4.5 THE SOURCETEXT COLUMN Mort cannot do much about the MODIFY statement. Leaving it out is not an option, and he cannot use a MODIFYALL statement. 4-30

Mort decides to examine the FIND/NEXT function lines. He selects the second line in the Client Monitor window and clicks the C/AL Code button.

FIGURE 4.6 THE CODE COVERAGE WINDOW

The FIND/NEXT function is linked to the WHILE DO statement. Mort needs to find a faster way to go through the records.

Session Monitor

You use the Session Monitor (SQL Server) to identify the clients that cause performance problems. As with the Client Monitor, you must import some helper objects before identifying the clients causing performance problems. Before you can use the tool, you must read the following KB article: KB 933042 - Error message when you use the Session Monitor feature in Microsoft Dynamics NAV: "Invalid length parameter passed to the substring function" (https://mbs.microsoft.com/knowledgebase/kbdisplay.aspx?wtntzsmnwukNTMMYLSVQUSPTNTNSMQPYLWVKVPNNSOMLNKYXSOONZRWVUTPWQPVL).

Import the Session Monitor

The Session Monitor (SQL Server) window displays updated information from the virtual Session table. Before identifying the clients that are causing performance problems, you must import some helper objects in the database. In addition, you need to run an SQL script to update the existing Session (SQL) view. The helper objects and the SQL script are part of the Microsoft Dynamics NAV Database Resource Kit, which is available on PartnerSource (https://mbs.microsoft.com/partnersource/downloads/supplements/databaseresourcekit.htm).
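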

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Perform the following steps to import the Session Monitor: 1. Ensure that you have installed the client components for SQL Server from the Microsoft SQL Server CD. 2. Open the Microsoft SQL Server Management Studio and click New Query, File, and then Open. 3. Browse to the folder where you have stored the session monitor tools and open the Session Monitor (SQL Server).sql file. 4. In the toolbar, select the Microsoft Dynamics NAV database to monitor from the drop-down list of available databases. 5. Click Query and then Execute. The Session Monitor (SQL Server).sql script drops the current Session (SQL) view and creates a new view to replace it. 6. In Microsoft Dynamics NAV, open Object Designer and import the Session Monitor (SQL Server).fob file. 7. Compile all imported objects. Start the Session Monitor Perform the following steps to start the Session Monitor. 1. Run form 150014, Session Monitor (SQL Server), and click Monitor, and then Setup. The Session Monitor Setup (SQL) window opens. 2. Click the Log tab. 3. Select the Log Session Activity check box. Specify a time interval in the Log Interval (sec) field. If you are only interested in identifying blocks that occur, select the Log only Blocks check box. You must enter a value in the Log Interval (sec) field, for example 15 seconds. Microsoft Dynamics NAV will now log the current level of activity once every 15 seconds. When Microsoft Dynamics NAV logs the activity, it runs through the active sessions and creates one entry per session. These entries are logged to the Session Information History table. If you select the Log only Blocks option, Microsoft Dynamics NAV only logs the sessions that are involved in blocking. This includes the sessions that are blocked and the sessions blocking others. Finding the correct setting for the Log Interval (sec) option is a matter of achieving the right balance between how accurately you want Microsoft Dynamics NAV to log activity, and how large a performance overhead you will accept. Fifteen seconds would seem to give a reasonable balance. 4-32

Chapter 4: Performance Audits The Session Monitor creates a log entry every time the specified interval has elapsed and therefore any blocks that occur within the specified interval are not logged. If you want to make sure that the Session Monitor catches as many blocks as possible, decrease the interval to, for example, 5 seconds or less. To simultaneously decrease the performance overhead, select the Log only Blocks option so that only the sessions involved in a block are logged. 4. In the Session Monitor Setup (SQL) window, click Functions and then Start Logging. The Log Session Activity codeunit runs as a single instance codeunit. This means that the only way to stop the codeunit is to close and reopen the current company. However, you can suspend it by removing the check mark from the Log Session Activity option. The codeunit will continue to run in the background, but it will not log anything. To resume logging, you must restart the session and start logging again. Viewing the Session Monitor Log You can view the information collected by the Session Monitor in the Session Monitor (SQL Server) window. The Session Monitor (SQL Server)window displays updated information from a SQL Server view. This view is similar to the one linked to the Session table. The Session Monitor (SQL Server) window tells you which clients are currently connected to the server and the current load on the server. The information is refreshed every second by default. This setting can be changed by clicking Monitor, Setup, and changing the number in the Refresh Rate (sec) field. By default, the most active sessions in terms of the number of records scanned are shown at the top of the list. Follow the guidelines in the other sections to investigate these sessions. By default, the most active sessions in terms of physical I/O (the number of records scanned), are listed at the top of the Session Monitor (SQL Server) window. The Records Scanned field shows how many records the database server has scanned to determine the records that this session wanted. The sessions with the largest number of scanned records are the sessions that should be investigated first. You can also list the sessions according to memory usage, as this is a good indicator of activity. SQL Server can also give you information about the CPU usage. NOTE: In the Session Monitor (SQL Server) window, if the value in the Found/Scanned Ratio field is high, this indicates that the indexes and queries match. A value of 30-50% is normal, while 3% is low. To investigate the most active sessions, follow the guidelines described in the following sections. 4-33
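The Session Monitor reads its data from a SQL Server view, so the same kind of per-session activity can also be cross-checked directly on the server with the dynamic management views described later in this chapter. The following query is only a sketch; the column selection is illustrative.

-- Per-session activity, including blocking information for currently executing requests.
SELECT s.session_id,
       s.login_name,
       s.host_name,
       s.program_name,
       s.cpu_time,            -- milliseconds of CPU used by the session
       s.memory_usage,        -- memory used, in 8-KB pages
       r.logical_reads,       -- reads for the currently executing request, if any
       r.blocking_session_id, -- 0 if the request is not blocked
       r.wait_type,
       r.wait_time
FROM sys.dm_exec_sessions AS s
LEFT JOIN sys.dm_exec_requests AS r
  ON r.session_id = s.session_id
WHERE s.is_user_process = 1
ORDER BY s.cpu_time DESC;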

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 SQL Server Profiler The Session Monitor (SQL Server) window also lists information about the clients that are waiting for locks held by other clients to be released, as well as the identity of the clients that placed the locks. If you want to concentrate on this area only, examine and/or filter the fields starting with Blocked (Blocked, Blocked by Connection ID, Blocked by User ID, Blocked by Host Name). Analysis in Microsoft Office Excel As mentioned earlier, the information collected by the Session Monitor is inserted into the Session Information History table. You can also use the Session Information History window to view this data. The Session Information History window shows you all the entries that have been logged. You can analyze this information in Microsoft Dynamics NAV or export it to a.csv file and analyze it in Microsoft Office Excel. Perform the following steps to export the log to and analyze it in Microsoft Office Excel: 1. Open Object Designer. 2. Run form 90015, Session Information History. 3. Click Functions and then Export, and in the Options tab specify a file name and the location where you want to save the log file. Remember to use the extension.csv, for example, Log.csv. 4. Click OK to export the log file. 5. Locate the file and double-click it to open it in Microsoft Office Excel. Excel has a limitation of 1,048,576 rows. If the log contains more entries than this, you cannot open the file. This can be remedied by applying a filter and only exporting some of the entries. Alternatively, you can delete some of the entries and then export the rest. If you are close to the limit of 1,048,576 records, click Functions and then Count, to check how many entries are within the current filter. SQL Server Profiler provides the ability to trace server and database activity such as login, user and application activity. You can capture the data in a table, a file or a Transact-SQL script for later analysis. What is SQL Server Profiler? Microsoft SQL Server Profiler is a graphical user interface tool for monitoring an instance of the SQL Server Database Engine or Analysis Services. You can capture and save data about each event to a file or table to analyze later. For example, you can monitor a production environment to see which queries or stored procedures are affecting performance by executing too slowly. 4-34

Chapter 4: Performance Audits Functions of SQL Server Profiler SQL Server Profiler shows how SQL Server resolves queries internally, enabling administrators to view exactly which Transact-SQL statements are submitted to the server and how the server accesses the database to return result sets. By using SQL Server Profiler, you can do the following: Create a trace based on a reusable template Watch the trace results as the trace runs Store the trace results in a table or file for further analysis Start, stop, pause and modify the trace results as necessary Replay the trace results Use SQL Server Profiler to monitor only the events in which you are interested. If there is too much activity to examine easily, you can filter activity based on the information you want so that only a subset of the event data is collected. Monitoring too many events adds overhead to the server and the monitoring process, and can cause the trace file or trace table to grow very large, especially when the monitoring process occurs over a long period. To minimize the effect of monitoring on the SQL Server, you can start SQL Server Profiler on a second SQL Server. Tracing SQL Server Activity by using SQL Server Profiler To use SQL Server Profiler, first decide what to trace and then select the criteria. Activity that you might want to monitor includes the following: Poorly performing queries Queries that cause table scans Activities of individual users or applications Performance of the tempdb database Deadlock problems Login attempts, failures, connections and disconnections Logical disk reads and writes CPU Use at statement level Wait time for all post execution events 4-35
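Because every running trace adds some overhead, it can be worth checking what is already being traced on the instance before you add a new trace (SQL Server normally runs a lightweight default trace, and forgotten traces keep collecting data). A quick way to do this, shown here as a sketch, is to query the sys.traces catalog view:

SELECT id,
       status,              -- 1 = running, 0 = stopped
       path,                -- NULL for rowset traces such as a live Profiler session
       is_default,
       start_time,
       last_event_time,
       event_count,
       dropped_event_count
FROM sys.traces;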

SQL Server Profiler Trace Options

When using SQL Server Profiler to create a trace, there are several options for defining the activity that it will record and where the logged trace activity will be stored.

Specifying a Trace Template

The events included in a trace are determined by specifying the event classes to monitor and the individual data values (columns) to record. Do this by selecting the template on which you want to base your trace, and then adding or removing individual event classes or columns and applying filters to limit the data collected based on specific criteria. SQL Server Profiler offers several predefined templates that enable you to easily configure the events you are most likely to need for specific kinds of activity. The Standard template, for example, helps you create a generic trace for recording logins, logouts, batches completed, and connection information. Use this template to run traces without modification or as a starting point for additional templates with different event configuration. You can also create your own user-defined templates or modify the predefined templates.

SP_Counts - Captures stored procedure execution behavior over time.

Standard - Generic starting point for creating a trace. Captures all stored procedures and Transact-SQL batches that are run. Use to monitor general database server activity.

TSQL - Captures all Transact-SQL statements submitted to SQL Server by clients and the time issued. Use to debug client applications.

TSQL_Duration - Captures all Transact-SQL statements submitted to SQL Server by clients, their execution time (in milliseconds), and groups them by duration. Use to identify slow queries.

TSQL_Grouped - Captures all Transact-SQL statements submitted to SQL Server and the time that they were issued. Groups information by user or client that submitted the statement. Use to investigate queries from a particular client or user.

TSQL_Locks - Captures all of the Transact-SQL statements that are submitted to SQL Server by clients together with exceptional lock events. Use to troubleshoot deadlocks, lock time-out, and lock escalation events.

TSQL_Replay - Captures detailed information about Transact-SQL statements that is required if the trace will be replayed. Use to perform iterative tuning, such as benchmark testing.

TSQL_SPs - Captures detailed information about all executing stored procedures. Use to analyze the component steps of stored procedures. Add the SP:Recompile event if you suspect that procedures are being recompiled.

Tuning - Captures information about stored procedures and Transact-SQL batch execution. Use to produce trace output that Database Engine Tuning Advisor can use as a workload to tune databases.

When tuning a Microsoft Dynamics NAV environment, we recommend that you start from the Tuning template and include the SP:StmtCompleted (in the Stored Procedures category) and SQL:BatchCompleted (in the TSQL category) events. To do benchmark testing, you can use the TSQL_Replay template. For more information about trace templates, see SQL Server Profiler Trace Templates (http://msdn.microsoft.com/en-us/library/ms190176.aspx).

Saving Trace Data

Save captured event data to a file or a SQL Server table when you need to analyze or replay the captured data at a later time. By saving a trace, you can do the following:

Use a trace file or trace table to create a workload that is used as input for Database Engine Tuning Advisor.

Use a trace file to capture events and send the trace file to the support provider for analysis.

Use the query processing tools in SQL Server to access the data or to view the data in SQL Server Profiler. Only members of the sysadmin fixed server role or the table creator can access the trace table directly.

The following options are available when saving a trace to a table:

The location and name of the table.

The maximum number of rows to store in the table.

The following options are available when saving a trace to a file:

The location and name of the file.

The maximum file size.

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Trace behavior when the file is full (roll over to start at the beginning of the file again or create a new file). Trace processing by server or SQL Server Profiler. Configuring the server to process the trace can reduce the performance effect of tracing. SQL Server Profiler traces can be combined with counter logs from the Performance Logs and Alerts snap-in. This way, you can link the performance information to specific database activities and analyze which query is being executed when performance drops. Furthermore, SQL Server Profiler traces can be analyzed by Database Engine Tuning Advisor. Specifying a Trace Stop Time Although SQL Server Profiler cannot be scheduled, you can set a trace stop time, which allows you to start a trace and have it run until a specific date and time. The ability to specify a stop time is useful when you want to record SQL Server activity for a predetermined period. Trace Categories, Events and Columns The information recorded in a trace is divided into categories. Categories contain events, each of which has attributes further defined by columns. Trace Categories In SQL Server Profiler, a category is a group of related event classes. Event classes consist of types of events that can be traced. The event class contains all the data columns that can be reported by an event. Categories listed by default are as follows: Security Audit - includes event classes that are used to audit server activity. Sessions - includes event classes produced by clients connecting to and disconnecting from an instance of SQL Server. Stored procedures - includes event classes produced by the execution of stored procedures. TSQL - includes event classes produced by the execution of Transact-SQL statements passed to an instance of SQL Server from the client. Each of the trace templates contains specific categories to monitor. 4-38

Events

An event is defined as the occurrence of an action within an instance of the SQL Server Database Engine. Events are further defined by their attributes, which are listed in data columns. The default events available are described in the following table.

Security Audit - Audit Login - Indicates that a user has successfully logged in to SQL Server.

Security Audit - Audit Logout - Indicates that a user has successfully logged out of SQL Server.

Sessions - ExistingConnection - Indicates the properties of existing user connections when the trace was started. The server raises one ExistingConnection event per existing user connection.

Stored Procedures - RPC:Completed - Indicates that a remote procedure call has been completed.

TSQL - SQL:BatchCompleted - Indicates that a Transact-SQL batch has been completed.

TSQL - SQL:BatchStarting - Indicates that a Transact-SQL batch is starting.

Columns

Data columns contain the attributes of events. SQL Server Profiler uses data columns in the trace output to describe events that are captured when the trace runs. You can manage columns by using column filters to control what data is being collected. For example, use the Application Name filter to exclude any data generated by SQL Server Profiler itself. You can also organize columns into related groups by using the Organize Columns function.

Create a Trace

Perform the following steps to create a new trace in SQL Server Profiler.

1. On the File menu, click New Trace, and connect to an instance of SQL Server. The Trace Properties dialog box appears.

2. In the Trace name box, type a name for the trace.

3. In the Use the template list, select a trace template on which to base the trace, or select Blank if you do not want to use a template.

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 4. To save the trace results, do one of the following: a. Click Save to file to capture the trace to a file. Specify a value for Set maximum file size. The default value is 5 megabytes (MB). b. Optionally, select Enable file rollover to automatically create new files when the maximum file size is reached. You can also select Server processes trace data, which causes the service that is running the trace to process trace data instead of the client application. When the server processes trace data, no events are skipped even under stress conditions, but server performance may be affected. 5. Optionally, select the Enable trace stop time check box, and specify a stop date and time. 6. To add or remove events, data columns or filters, click the Events Selection tab. 7. Click Run to start the trace. Specify Events and Data Columns for a Trace This topic describes how to specify event classes and data columns for traces by using SQL Server Profiler. On the Trace Properties or Trace Template Properties dialog box, click the Events Selection tab. The Events Selection tab contains a grid control. The grid control is a table that contains each of the traceable event classes. The table contains one row for each event class. The event classes can differ slightly, depending on the type and version of server to which you are connected. The event classes are identified in the Events column of the grid and are grouped by event category. The remaining columns list the data columns that can be returned for each event class. On the Events Selection tab, use the grid control to add or remove events and data columns from the trace file. The events checked by default depend on the trace template you select. To see a list of all events, check the Show all events field. To remove events from the trace, clear the check box in the Events column for each event class. To include events in a trace, select the box in the Events column for each event class, or check a data column that corresponds to an event. The following are recommended columns to include in the trace: TextData, ApplicationName, Reads, Writes, CPU, DatabaseName, HostName, SPID, StartTime, EndTime and Duration. You can apply a filter to the Duration field, to only include events that take longer than 50 milliseconds. 4-40
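A saved trace file does not have to be opened in the SQL Server Profiler GUI; it can also be queried with Transact-SQL through the built-in fn_trace_gettable function. The sketch below assumes a trace file path (it matches the file created in the demonstration later in this chapter, but any saved .trc file will do); note that Duration is stored in microseconds in the trace file, even though the Profiler GUI displays milliseconds by default.

-- The file path is an example; point it at your own saved trace file.
SELECT TextData,
       DatabaseName,
       HostName,
       Reads,
       Writes,
       CPU,
       Duration / 1000 AS duration_ms,   -- Duration is stored in microseconds
       StartTime
FROM sys.fn_trace_gettable(N'C:\PerfLogs\NAV-SRV-01_PctProcTime.trc', DEFAULT)
WHERE Duration > 50000                    -- only events longer than 50 milliseconds
ORDER BY Duration DESC;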

Chapter 4: Performance Audits NOTE: If the trace will be correlated with System Monitor or Performance Monitor data, both Start Time and End Time data columns must be included in the trace. Filter Events in a Trace Filters limit the events collected in a trace. If a filter is not set, all events of the selected event classes are returned in the trace output. It is not mandatory to set a filter for a trace. However, a filter minimizes the overhead that is incurred during tracing. For example, you can filter the DatabaseName column to include data from a specific database only, or you can filter the Duration column to include only statements with a minimum duration. You add filters to trace definitions by using the Events Selection tab of the Trace Properties or Trace Template Properties dialog box. 1. In the Trace Properties or Trace Template Properties dialog box, click the Events Selection tab. The Events Selection tab contains a grid control. The grid control is a table that contains each of the traceable event classes. The table contains one row for each event class. The event classes may differ slightly, depending on the type and version of server to which you are connected. The event classes are identified in the Events column of the grid and are grouped by event category. The remaining columns list the data columns that can be returned for each event class. 2. Click Column Filters. The Edit Filter dialog box appears. The Edit Filter dialog box contains a list of comparison operators you can use to filter events in a trace. 3. To apply a filter, click the comparison operator, and type a value to use for the filter. 4. Click OK. Considerations If you set filter conditions on the StartTime and EndTime data columns of the Events Selection tab, then be sure to do the following: The date you enter matches this format: YYYY/MM/DD HH:mm:sec. -OR- Use regional settings to display date and time values is checked in the General Options dialog box. To view the General Options dialog box, on the SQL Server Profiler Tools menu, click Option. -AND- The date you enter is between January 1, 1753 and December 31, 9999. 4-41

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 When filtering the Duration field, check whether the duration is mentioned in milliseconds or in microseconds in the General Options dialog box. To view the General Options dialog box, on the SQL Server Profiler Tools menu, click Options. Demonstration: Create a SQL Server Profiler Trace Tim has noticed peaks in the % Processor Time counter. To identify the cause, he will create a new trace in SQL Server Profiler to monitor the queries. 1. Open SQL Server Management Studio. 2. On the Tools menu, click SQL Server Profiler. 3. Connect to the SQL Server that you want to run the trace on. SQL Server Profiler will open. By default, the Trace Properties window displays. If it does not open, you can start a new trace by selecting New trace in the File menu. FIGURE 4.7 THE TRACE PROPERTIES WINDOW 4. In the Trace Properties window, in the Trace Name field, enter a name for your trace. You can specify any name you want. We recommend that you use a meaningful name, such as the name of the server to trace, the starting date of the trace, or a combination of both. 5. In the Use the template field, you can choose to start from a blank trace, or you can select a template to base the trace on. In this case, Tim starts from a blank trace. 4-42

Chapter 4: Performance Audits 6. Click the Save to File option to save the trace information to a file. The Save As dialog box appears. 7. In the Save As dialog box, browse to the folder where you want to store the trace information and specify the name for the trace file. In this case, Tim decides to store the trace file as C:\PerfLogs\NAV- SRV-01_PctProcTime.trc. 8. Click the Save as button to close the dialog box. 9. Select the Server processes trace data option to make sure that all data is logged. If selected, no events are skipped even under stress conditions. If this check box is cleared, processing is performed by SQL Server Profiler, and there is a possibility that some events are not traced under stress conditions. 10. Click the Events Selection tab. 11. In the Events column, select and expand the Stored Procedures category. 12. Select the SP:StmtCompleted event. 13. In the Events column, select and expand the TSQL category. 14. Select the SQL:BatchCompleted event. You can use the Show all event and Show all columns to show more or less events and columns. 15. For each of the selected events, select the following columns: ApplicationName, CPU, DatabaseName, Duration, EndTime, HostName, Reads, SPID, StartTime, TextData and Writes. FIGURE 4.8 EVENTS AND COLUMNS SELECTION 16. Click the Column Filters button. 17. In the Edit Filter window, select the DatabaseName column in the left pane. 4-43

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 18. In the right pane, double-click the Like operator and enter Demo Database NAV (6-0). FIGURE 4.9 DEFINE A TRACE FILTER As a consequence, the trace will only gather event information for the selected database. 19. Click OK to close the Edit Filters window. 20. Click Run to start the trace. Correlate Perfmon and SQL Server Profiler Data Although SQL Server Profiler trace files give you an overview of what is occurring inside a SQL Server database, they give no details about hardware resource consumption. However, the perfmon tool contains details about hardware resource consumption, but does not show which queries effectively cause the resource consumption. As of SQL Server 2005, it is possible to correlate trace information from SQL Server Profiler with trace data collected by the Perfmon tool. You cannot correlate a running trace that is still collecting data. You can only correlate saved trace files. NOTE: To correlate the SQL Trace file and the PerfMon log file, SQL Server Profiler trace files must include Start Time and End Time information. For a correct matching, both files must be time synchronized. If both trace files are captured on different computers, make sure to synchronize the system time on both computers. 1. In SQL Server Profiler, open a saved trace file or trace table. You cannot correlate a running trace that is still collecting event data. For accurate correlation with System Monitor data, the trace must contain both StartTime and EndTime data columns. 4-44

Chapter 4: Performance Audits Database Engine Tuning Advisor 2. On the SQL Server Profiler File menu, click Import Performance Data. 3. In the Open dialog box, select a file that contains a performance log. The performance log data must have been captured during the same time period in which the trace data is captured. 4. In the Performance Counters Limit dialog box, select the check boxes that correspond to the System Monitor objects and counters that you want to display along with the trace. 5. Click OK. 6. Select an event in the trace events window, or move through several adjacent rows in the trace events window by using the arrow keys. The vertical red bar in the System Monitor data window indicates the performance log data that is correlated with the selected trace event. 7. Click a point of interest in the System Monitor graph. The corresponding trace row that is nearest in time is selected. To zoom in on a time range, press and drag the mouse pointer in the System Monitor graph. The Database Engine Tuning Advisor enables you to tune databases for improved query processing. Database Engine Tuning Advisor examines how queries are processed in the databases you specify and then it recommends how you can improve query processing performance by modifying physical design structures such as indexes, indexed views, and partitioning. Database Engine Tuning Advisor provides a graphical user interface (GUI) and the dta command prompt utility. The GUI makes it easy to quickly view the results of tuning sessions. The dta utility makes it easy to incorporate Database Engine Tuning Advisor functionality into scripts for automated tuning. In this lesson, you will use the GUI interface. Launch DTA To begin, open the Database Engine Tuning Advisor graphical user interface (GUI). On first use, a member of the sysadmin fixed server role must launch Database Engine Tuning Advisor to initialize the application. After initialization, members of the db_owner fixed database role can use Database Engine Tuning Advisor to tune databases that they own. Perform the following steps to launch the DTA: 1. On the Windows Start menu, point to All Programs, point to Microsoft SQL Server, point to Performance Tools, and then click Database Engine Tuning Advisor. 2. In the Connect to Server dialog box, verify the default settings, and then click Connect. 4-45

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 By default, Database Engine Tuning Advisor opens to the configuration in the following illustration: FIGURE 4.10 THE DATABASE ENGINE TUNING ADVISOR WINDOW Two main panes are displayed in the Database Engine Tuning Advisor GUI when it is first opened. Database Engine Tuning Advisor Left pane The left pane contains the Session Monitor. This lists all tuning sessions that have been performed on this Microsoft SQL Server instance. When you open Database Engine Tuning Advisor, it displays a new session at the top of the pane. You can name this session in the adjacent pane. Initially, only a default session is listed. This is the default session that Database Engine Tuning Advisor automatically creates for you. After you have tuned databases, all tuning sessions for the SQL Server instance to which you are connected are listed below the new session. You can right-click a tuning session to rename it, close it, delete it, or clone it. If you right-click in the list, you can sort the sessions by name, status, or creation time, or create a new session. In the bottom section of this pane, details of the selected tuning session are displayed. You can choose to display the details organized into categories with the Categorized button, or you can display them in an alphabetized list by using the Alphabetical button. You can also hide Session Monitor by dragging the right pane border to the left side of the window. To view it again, drag the pane border back to the right. Session Monitor enables you to view previous tuning sessions, or to use them to create new sessions with similar definitions. 4-46

Chapter 4: Performance Audits You can also use Session Monitor to evaluate tuning recommendations. For more information, see Using Session Monitor to Evaluate Tuning Recommendations (http://msdn.microsoft.com/en-us/library/ms175080.aspx). Database Engine Tuning Advisor Right pane The right pane contains the General and the Tuning Options tabs. Here you define your Database Engine Tuning session. In the General tab, you type the name for your tuning session, specify the workload file or table to use, and select the databases and tables you want to tune in this session. A workload is a set of Transact-SQL statements that execute against a database or databases that you want to tune. Database Engine Tuning Advisor uses trace files, trace tables, Transact-SQL scripts, or XML files as workload input when tuning databases. On the Tuning Options tab, you select the physical database design structures (indexes or indexed views) and the partitioning strategy that you want Database Engine Tuning Advisor to consider during its analysis. On this tab, you can also specify the maximum time the Database Engine Tuning Advisor takes to tune a workload. By default, Database Engine Tuning Advisor will tune a workload for one hour. Setup DTA You can set options that configure what the Database Engine Tuning Advisor graphical user interface (GUI) displays on startup, the font it uses, and other tool functionality to best support how you use it. Perform the following steps to start the Database Engine Tuning Advisor: 1. On the Windows Start menu, point to All Programs, point to Microsoft SQL Server, point to Performance Tools, and then click Database Engine Tuning Advisor. 2. On the Tools menu, click Options. 3. In the Options dialog box, view the following options: o Expand the On startup list to view what Database Engine Tuning Advisor can display when it is started. By default, Show a new session is selected. o Click Change Font to see what fonts you can choose for the lists of databases and tables on the General tab. The fonts you choose for this option are also used in Database Engine Tuning Advisor recommendation grids and reports after you have performed tuning. By default, Database Engine Tuning Advisor uses system fonts. o The Number of items in most recently used lists can be set between 1 and 10. This sets the maximum number of items in the lists displayed by clicking Recent Sessions or Recent Files on the File menu. By default, this option is set to 4. 4-47

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 o o o When Remember my last tuning options is selected, by default Database Engine Tuning Advisor uses the tuning options you specified for your last tuning session for the next tuning session. Clear this check box to use Database Engine Tuning Advisor tuning option defaults. By default, this option is selected. By default, Ask before permanently deleting sessions is selected to avoid accidentally deleting tuning sessions. By default, Ask before stopping session analysis is selected to avoid accidentally stopping a tuning session before Database Engine Tuning Advisor has finished analyzing a workload. Demonstration: Monitor a Query Using DTA When analyzing the counter log data and the SQL Server Profiler data, Tim notices that processor time peaks when the following query is executed: SELECT [Entry No_] FROM [Demo Database NAV (6-0)].dbo. [CRONUS International Ltd_$Ledger Entry Dimension] WHERE [Entry No_ 2] > 0 In the SQL Server Profiler trace, in the HostName field, he can see that the query is issued from Mort's computer. Tim quickly checks whether this query can be optimized by adding indexes. To do this, he creates an SQL script with the query, which he analyzes with the Database Engine Tuning Advisor. 1. Open SQL Server Management Studio. 2. In the database drop-down list, select Demo Database NAV (6-0). 3. Click the New Query button to open a new query window. 4. In the Query window, enter the query listed above. 5. Right-click the Query window and select Analyze Query in Database Engine Tuning Advisor. The Database Engine Tuning Advisor application opens. 6. In the Session Name field, enter a meaningful name for the tuning session. For example, enter MortsQuery. 7. In the Select databases to tune list, in the Name column, select the Demo Database NAV (6-0) database. In the Selected tables column, you see the number of tables that have been selected for tuning. By default all tables are selected. 8. Click the drop-down button to open a list of all tables. 4-48

Chapter 4: Performance Audits 9. In the list of tables, select the CRONUS International Ltd_$Ledger Entry Dimension table. Select the check box in the upper-left corner to (de)select all tables. To close the list of tables and return to the session window, click the dropdown button again. The Selected Tables column will contain the text : XXX of YYY, where XXX represents the number of selected tables and YYY represents the total number of tables. FIGURE 4.11 DATABASE ENGINE TUNING ADVISOR SESSION 10. Click the Tuning Options tab. Here you can specify some settings that will influence the recommendations made by the Database Engine Tuning Advisor, as described earlier in this lesson. 11. Click the Start Analysis button in the toolbar. The query will now be analyzed. 4-49

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 While the Database Engine Tuning Advisor analyzes the query, a Progress tab appears, showing the overall progress of the analysis. When the analysis is finished, the recommendations and the estimated improvement are displayed on the Recommendations tab. FIGURE 4.12 DATABASE ENGINE TUNING ADVISOR RECOMMENDATIONS The Recommendations tab contains all indexing and partitioning recommendations from the Database Engine Tuning Advisor, depending on the selected tuning options. The overview contains information such as Database Name, Object Name, Recommendation and Definition. In this example, DTA suggests that you create an additional index on the Entry No. 2 field. The index fields are listed in the Definition column. To implement the suggestions, first select all of the recommendations you want to apply. Next, on the Tools menu, select Apply recommendations. However, we recommend applying the recommendations from within Microsoft Dynamics NAV. Dynamic Management Views You can use dynamic management views and dynamic management functions to query dynamic metadata in SQL Server. They provide information about the current state of SQL Server (such as locks currently held within a database); some views can depend on how long the server is running. Dynamic Management Views are listed in the System Views folder in SQL Server Management Studio. Dynamic Management Views return the current status of activity in SQL Server. Their names generally contain the dm prefix to distinguish them from other views. 4-50

Chapter 4: Performance Audits The information contained in the DMVs can also be obtained by using other monitoring tools. However, DMVs allow you to instantly check specific values, such as checking for missing indexes. You can join different dynamic management views to easily obtain even more detailed performance counter information. This lesson provides an overview of the most important dynamic management views and functions. For more detailed information about each of the views, see Dynamic Management Views and Functions (http://msdn.microsoft.com/enus/library/ms188754.aspx). sys.dm_db_file_space_usage This view returns space usage information for each file in the database. In SQL Server 2005, this view is only applicable to the tempdb database. SELECT * FROM sys.dm_db_file_space_usage Running out of disk space in tempdb can cause significant disruptions in the SQL Server production environment and can prohibit applications that are running from completing operations. For more information, see Troubleshooting Insufficient Disk Space in tempdb (http://msdn.microsoft.com/en-us/library/ms176029(sql.90).aspx). sys.dm_db_index_usage_stats This view gives statistics on how an index has been used to resolve queries. The view shows the number of times a query was used to find a single row (user_seeks), a range of values, or to resolve a non-unique query (user_scans). It also shows how many times the index has been updated (user_updates). Notice that sys.dm_db_index_operational_stats will give the details of how it has been modified. The view is very useful for performance tuning, as it tells you when indexes are NOT being used. Using this dynamic management view, you can see which indexes are used, which ones are not used, and which indexes are updated many times and never being used. sys.dm_db_index_physical_stats This view returns size and fragmentation information for the data and indexes of the specified table or view. 4-51

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Fragmentation occurs through the process of data modifications (INSERT, UPDATE, and DELETE statements) that are made against the table and therefore, to the indexes defined on the table. Because these modifications are not ordinarily distributed equally among the rows of the table and indexes, the fullness of each page can vary over time. For queries that scan part or all of the indexes of a table, this kind of fragmentation can cause additional page reads. This hinders parallel scanning of data. Before optimizing the fragmentation of an index, it is useful to analyze the usage of an index. The following example returns detailed index statistics for the index with ID 1 in the Customer table: SELECT * FROM sys.dm_db_index_physical_stats( DB_ID(N'Demo Database NAV (6-0)'), OBJECT_ID(N'[CRONUS International Ltd_$Customer]'), 1, NULL, 'DETAILED'); sys.dm_db_index_operational_stats This view returns current low-level I/O, locking, latching, and access method activity for each partition of a table or index in the database. The following query shows the operational statistics for all indexes in the Customer table: SELECT * FROM sys.dm_db_index_operational_stats( DB_ID(N'Demo Database NAV (6-0)'), OBJECT_ID(N'[CRONUS International Ltd_$Customer]'), NULL, NULL) The last two parameters can be replaced by the index id or partition id. sys.dm_db_missing_index_details This view returns detailed information about missing indexes (excluding spatial indexes). SELECT * FROM sys.dm_db_missing_index_details After querying the sys.dm_db_missing_index_details dynamic management view, you can create the missing index by using information that is returned in the equality_columns, included_columns, and statement columns. 4-52
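The two index-related views described above can be turned into practical checklists. The first sketch joins sys.dm_db_index_usage_stats to sys.indexes to highlight indexes in the current database that are being maintained (user_updates) but rarely or never used for reads; the second combines the missing-index views to rank suggestions by a rough estimate of their value. Both queries are illustrative, and the underlying statistics are reset when the SQL Server service restarts, so judge them only after a representative period of activity.

-- Indexes that are updated but seldom read (run in the NAV database).
SELECT OBJECT_NAME(i.object_id) AS table_name,
       i.name AS index_name,
       ISNULL(s.user_seeks, 0)   AS user_seeks,
       ISNULL(s.user_scans, 0)   AS user_scans,
       ISNULL(s.user_lookups, 0) AS user_lookups,
       ISNULL(s.user_updates, 0) AS user_updates
FROM sys.indexes AS i
LEFT JOIN sys.dm_db_index_usage_stats AS s
  ON s.object_id = i.object_id
 AND s.index_id = i.index_id
 AND s.database_id = DB_ID()
WHERE OBJECTPROPERTY(i.object_id, 'IsUserTable') = 1
ORDER BY ISNULL(s.user_updates, 0) DESC;

-- Missing-index suggestions, ranked by a rough estimate of their value.
SELECT d.statement AS table_name,
       d.equality_columns,
       d.inequality_columns,
       d.included_columns,
       gs.user_seeks,
       gs.avg_total_user_cost,
       gs.avg_user_impact
FROM sys.dm_db_missing_index_details AS d
JOIN sys.dm_db_missing_index_groups AS g
  ON g.index_handle = d.index_handle
JOIN sys.dm_db_missing_index_group_stats AS gs
  ON gs.group_handle = g.index_group_handle
ORDER BY gs.user_seeks * gs.avg_total_user_cost * gs.avg_user_impact DESC;

Keep in mind that index suggestions for Microsoft Dynamics NAV tables are normally implemented as keys in the NAV table designer rather than created directly in SQL Server.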

sys.dm_io_pending_io_requests Chapter 4: Performance Audits This view returns one row for each pending I/O request. On very active SQL Server computers, the view can return large result sets. Execute this query periodically to check the health of I/O subsystems and to isolate physical disk(s) that are involved in the I/O bottlenecks. SELECT * FROM sys.dm_io_pending_io_requests sys.dm_io_virtual_file_stats This view returns I/O statistics for data and log files. The view has two parameters: a database ID (or NULL) and a file ID (or NULL). If you do not specify a database, information for all databases will be displayed. If you do not specify a database file, information for all database files will be displayed. SELECT * FROM sys.dm_io_virtual_file_stats(db_id('demo Database NAV (6-0)'), NULL) Important information is in the io_stall column which indicates the total time, in milliseconds, that users waited for I/O to be completed on the file. The view can help you identify I/O bottlenecks. sys.dm_exec_cached_plans This view returns a row for each query plan that is cached by SQL Server for faster query execution. You can use this dynamic management view to find cached query plans, cached query text, the amount of memory taken by cached plans, and the reuse count of the cached plans. sys.dm_exec_query_optimizer_info This view returns detailed statistics about the operation of the SQL Server query optimizer. You can use this view when tuning a workload to identify query optimization problems or improvements. SELECT * FROM sys.dm_exec_query_optimizer_info For example, you can use the total number of optimizations, the elapsed time value, and the final cost value to compare the query optimizations of the current workload and any changes observed during the tuning process. Some counters provide data that is relevant only for SQL Server internal diagnostic use. These counters are marked as "Internal only." 4-53
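For the plan cache information mentioned above (sys.dm_exec_cached_plans), a common pattern is to join the view to sys.dm_exec_sql_text so that the reuse count and memory consumption can be seen next to the query text. The following is a sketch; the column selection is illustrative.

SELECT TOP (20)
       cp.usecounts,        -- how many times the cached plan has been reused
       cp.size_in_bytes,    -- memory taken by the cached plan
       cp.objtype,          -- for example Adhoc, Prepared, or Proc
       st.text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
ORDER BY cp.usecounts DESC;

The plan_handle column returned by sys.dm_exec_cached_plans is also the value that sys.dm_exec_query_plan, described next, takes as its parameter.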

sys.dm_exec_query_plan

Use this view to collect information about the query plans that are used to execute a particular Transact-SQL query. Query plans for various types of Transact-SQL batches, such as ad hoc batches, stored procedures, and user-defined functions, are cached in an area of memory called the plan cache. Each cached query plan is identified by a unique identifier called a plan handle. If a Transact-SQL query or batch runs a long time on a particular connection to SQL Server, you can retrieve the execution plan for that query or batch to discover what is causing the delay. The following example shows how to retrieve the execution plan for a query that is identified by the plan handle 0x06000100A27E7C1FA821B10600:

SELECT * FROM sys.dm_exec_query_plan(0x06000100A27E7C1FA821B10600)

sys.dm_exec_requests

This view returns one row per request that is executed on SQL Server. The view includes database information, system resource information such as wait time, elapsed time, and CPU time, but also some general information about open transactions and blocked sessions.

SELECT * FROM sys.dm_exec_requests

sys.dm_exec_sessions

This view returns one row per authenticated session on SQL Server. sys.dm_exec_sessions is a server-scope view that shows information about all active user connections and internal tasks. This information includes client version, client program name, client login time, login user, current session setting, and more. Use sys.dm_exec_sessions to first view the current system load and to identify a session of interest, and then learn more information about that session by using other dynamic management views or dynamic management functions.

SELECT * FROM sys.dm_exec_sessions

sys.dm_os_performance_counters

This view makes it possible to capture the SQL Server performance counters for the instance directly with a query.

SELECT * FROM sys.dm_os_performance_counters

Chapter 4: Performance Audits Depending on the services and applications installed the number of counters can vary. These counters range from memory usage to SQL Server application specific counters to include the following: MSSQL:Access Methods MSSQL:Broker Activation MSSQL:Broker/DBM Transport MSSQL:Broker Statistics MSSQL:Buffer Manager MSSQL:Buffer Node MSSQL:Buffer Partition MSSQL:Catalog Metadata MSSQL:CLR MSSQL:Cursor Manager by Type MSSQL:Cursor Manager Total MSSQL:Databases MSSQL:Exec Statistics MSSQL:General Statistics MSSQL:Latches MSSQL:Locks MSSQL:Memory Manager MSSQL:Plan Cache MSSQL:SQL Errors MSSQL:SQL Statistics MSSQL:Transactions MSSQL:User Settable MSSQL:Wait Statistics The view is limited to SQL Server performance counters. You cannot use this view to monitor physical disks, network I/O, and so on. sys.dm_os_wait_stats This view returns information about all the waits encountered by threads that executed. You can use this aggregated view to diagnose performance issues with SQL Server and also with specific queries and batches. sys.dm_os_wait_stats shows the time for waits that have completed. This dynamic management view does not show current waits. 4-55

Specific types of wait times during query execution can indicate bottlenecks or stall points within the query. Similarly, high wait times or wait counts server wide can indicate bottlenecks or hot spots in query interactions within the server instance. For example, lock waits indicate data contention by queries, page IO latch waits indicate slow IO response times, and page latch update waits indicate incorrect file layout. For more information, see sys.dm_os_wait_stats (http://msdn.microsoft.com/en-us/library/ms179984.aspx).

You can query dynamic management views using a standard SELECT statement. For example, the following code returns details about the current locking status in the system:

SELECT * FROM sys.dm_tran_locks

The following code returns all sessions that lock the Customer table:

SELECT DISTINCT request_session_id
FROM sys.dm_tran_locks
WHERE resource_associated_entity_id = OBJECT_ID('CRONUS International Ltd_$Customer')

Useful Scripts, Tools, and Reports

In addition to the monitoring tools described in the previous lessons, there are several other tools (scripts and reports) that can be used to analyze or troubleshoot performance issues.

The sp_who stored procedure provides the following information about current users, sessions, and processes in an instance of the Microsoft SQL Server Database Engine:

Process ID

Process status

User login name

User name

Blocking process SPID (for blocked processes)

Database used by the process

Command being executed

The information can be filtered to return only those processes that are not idle, that belong to a specific user, or that belong to a specific session. You can use sp_who to analyze the server load, by analyzing the number of database connections.

The sp_who2 stored procedure retrieves the same information as sp_who, but adds the following important columns:

Total CPU time per process

Total amount of disk reads per process

Last time a client called a procedure or executed a query

Connected application

To run the stored procedures, enter the following Transact-SQL in a new query window in SQL Server Management Studio:

-- To execute sp_who, enter:
EXEC sp_who

-- To execute sp_who2, enter:
EXEC sp_who2

The following query, which is based on Dynamic Management Views, has proven to be very useful in general performance troubleshooting of SQL installations. The query is read-only. It does not cause any locks or any noticeable overhead to SQL Server.

SELECT TOP 30
    st.text,
    SUBSTRING(st.text, (qs.statement_start_offset/2) + 1,
        ((CASE qs.statement_end_offset
            WHEN -1 THEN DATALENGTH(st.text)
            ELSE qs.statement_end_offset
          END - qs.statement_start_offset)/2) + 1) AS statement_text,
    execution_count,
    CASE WHEN execution_count = 0 THEN NULL
         ELSE total_logical_reads/execution_count END AS avg_logical_reads,
    last_logical_reads,
    min_logical_reads,
    max_logical_reads,
    CASE WHEN execution_count = 0 THEN NULL
         ELSE total_logical_writes/execution_count END AS avg_logical_writes,
    last_logical_writes,
    min_logical_writes,
    max_logical_writes,
    max_elapsed_time,
    -- diff_quota is referred to in the discussion that follows
    CASE WHEN min_logical_reads = 0 THEN NULL
         ELSE max_logical_reads/min_logical_reads END AS diff_quota
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY max_logical_reads DESC
-- change ORDER BY to sort by max_logical_writes

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 The query gives you an immediate view of the top 30 plans that are currently in cache, ordered by number of reads (or writes with a small change). It gives you a view of the queries most likely to cause the most performance problems. In this way, it does what you would have otherwise had to use SQL Profiler for, but without the overhead of SQL Profiler or the need to spend hours browsing through thousands of lines of details in SQL Server Profiler traces. Every time a query is sent to SQL Server, SQL Server makes a query-plan for that query. Then it caches this plan to re-use it for identical queries. This plancache also collects statistics about how efficiently each query-plan was run. This query looks into the plan-cache and retrieves the plans, with the one causing the most reads at the top. It returns (among other things) the following information: Text and Statement_text - this shows the query that this plan is being used for. Remember, the same plan can be used again for identical queries. Execution_count - shows how many times the plan was used. If this shows 1, the plan may have been for a one-off query, and it may not be relevant to investigate it further. If it shows a high count, then the plan is for a common query, and you may want to investigate further where this query came from. SQL Server's plan cache changes all the time, depending on what queries SQL Server runs, so you may get different results, depending on what time of the day you run the query. The plan cache is also reset when SQL Server restarts. What to look for Being sorted by "max_logical_reads", you have the "worst" query at the top. But also look at execution_count. If a query ran just once, it might have been a batch job, or something else that is not really causing any problems. Queries that have an execution_count in the hundreds or thousands may be more relevant to look at. Also see whether the queries (statement_text) look similar, or if many of them are in the same area (same tables). The column diff_quota shows max_logical_reads divided with min_logical_reads. If this number is high, it means that the query plan is inconsistent. This can be either because of inconsistent use of Microsoft Dynamics NAV (for example users applying different filters on the same table). Or, it may be because a query plan is good for some queries but bad for others. In this case, you can affect the way that SQL Server creates query plans, by adding RECOMPILE hints, plan guides, or index hints. Or by upgrading to a newer version of Microsoft Dynamics NAV client (for example see the post "SQL Preprocessing in Microsoft Dynamics NAV 5.0 SP1" for how NAV 5 SP1 will cause different query plans). 4-58

Chapter 4: Performance Audits If diff_quota is low, it means that the query plan is just consistently bad, which means that it is more likely that the query itself is bad. You will then have to look for reasons why that query consistently causes the number of reads that it does. The "TOP 30" query cannot help you determine why a query is causing many reads. But it can identify which queries to investigate first, which can otherwise be a very time-consuming task (collecting and analyzing profiler traces, and so on). You can also look at max_elapsed_time, but be aware that when a query takes a long time to run because it is being blocked, the real problem is elsewhere (in the blocking query). So, if a query has a high max_elapsed_time, see whether the query contains a lock hint (WITH UPDLOCK). If it does, then you are most likely looking at a blocking problem, which requires a wider investigation and frequently cannot be solved by changing the query you see.

Performance Data Collector

The data collector is a component of SQL Server 2008 that collects different sets of data. Data collection either runs constantly or on a user-defined schedule. The data collector stores the collected data in a relational database known as the management data warehouse. The data collector is a core component of the data collection platform for SQL Server 2008 and the tools that are provided by SQL Server. The data collector provides one central point for data collection across your database servers and applications. This collection point can obtain data from a variety of sources and is not limited to performance data, unlike SQL Trace. The data collector enables you to adjust the scope of data collection to suit your test and production environments. The data collector also uses a data warehouse, a relational database that enables you to manage the data that you collect by setting different retention periods for your data. The Data Collector can be started and managed from SQL Server Management Studio > Management > Data Collection. To set up and configure storage for collected data, you can use the Configure Management Data Warehouse wizard. The wizard provides an easy way to do the following:
- Create the management data warehouse. You can install the management data warehouse on the same instance of SQL Server that runs the data collector. However, if server resources or performance are an issue on the server that is being monitored, you can install the management data warehouse on a different computer.
- Install the predefined System Data collection sets.
- Map logins to management data warehouse roles.
- Enable data collection.
- Start the System Data collection sets.
4-59
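Data collection can also be inspected and controlled from Transact-SQL. The following sketch assumes the msdb data collector catalog view and stored procedure shown here are available in your SQL Server 2008 installation (verify the names against Books Online); it lists the installed collection sets and then starts one of the predefined System Data collection sets by its id.

-- List the collection sets that have been installed, and whether they are running
SELECT collection_set_id, name, is_running
FROM msdb.dbo.syscollector_collection_sets;

-- Start a collection set by its id (take the id from the query above)
EXEC msdb.dbo.sp_syscollector_start_collection_set @collection_set_id = 1;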

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 When data collection is enabled, you can run reports on the management data warehouse database. To run the reports, right-click the database in SQL Server Management Studio, and select Reports and then Management Data Warehouse. For more information, see Data Collection (http://msdn.microsoft.com/en-us/library/bb677179.aspx) and Data Collector Architecture and Processing (http://msdn.microsoft.com/en-us/library/bb677355.aspx).

Summary

This chapter explains how to create a test environment that can be used to troubleshoot performance issues. It also explains different tools that can be used to monitor system performance and code execution. 4-60

Chapter 4: Performance Audits Test Your Knowledge Test your knowledge with the following questions.

1. Performance troubleshooting involves several steps. Put the following steps in the correct order.
Step:
___ : Check database structure and indexes
___ : Check hardware performance
___ : Check hardware configuration

2. Which monitoring tools are part of the perfmon tool? (Select all that apply)
( ) System Information
( ) System Monitor
( ) Performance Logs and Alerts
( ) Task Manager

3. Information collected by the Client Monitor tool can be linked to information from another tool. Which tool can Client Monitor be linked to?
( ) Session Monitor
( ) SQL Server Profiler
( ) Code Coverage
( ) System Monitor

4. Which of the following statements about SQL Server Profiler are false? (Select all that apply)
( ) You can use customizable templates to trace performance.
( ) SQL Server Profiler can be scheduled to run automatically during a specific period of time.
( ) You can save trace information to a file and a database.
( ) You can use SQL Server Profiler to analyze current system performance with current database activity.
4-61

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009
5. Database Engine Tuning Advisor provides information about the performance of which of the following: (Select all that apply)
( ) memory
( ) CPU
( ) indexes
( ) partition structures

6. The sys.dm_os_performance_counters dynamic management view captures information related to __________.

7. The sys.dm_db_file_space_usage DMV returns information related to the __________ database.
4-62

Lab 4.1 - Set Up and Schedule System Monitor Chapter 4: Performance Audits In this lab you set up several monitoring tools to monitor specific system resources.

Scenario
Recently, users have reported that systems are occasionally performing poorly. Tim decides to start monitoring the hardware of the SQL Server to identify the cause of the problem. He starts monitoring CPU, disk activity, network activity, and memory, with immediate effect and for the next 72 hours.

Challenge Yourself!
Use Perfmon to monitor the following system resources for 72 hours:
- % Processor Time (total)
- % Processor Time (used by SQL Server)
- % Disk Time, Disk Reads/sec, Disk Writes/sec, Average Disk Queue Length, Current Disk Queue Length per drive
- SQL Server Buffer cache hit ratio
- Network packets sent and received per second
- Available memory (in MB)
- Page Faults/sec

Need a Little Help?
Perform the following steps to complete this lab:
1. Start Perfmon.
2. Create a New Counter Log.
3. Enter Log Settings.
4. Add Disk Activity Counters to the Counter Log.
5. Add Network Interface Counters to the Counter Log.
6. Add Processor Counters to the Counter Log.
7. Add Memory Counters to the Counter Log.
8. Add SQL Server Memory Counters to the Counter Log.
9. Schedule the Counter Log.
10. Activate the Counter Log.
4-63

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Step by Step Start Perfmon 1. In the Windows Taskbar, click Start. 2. On the Start menu, select Run. 3. In the Open field, enter perfmon. 4. Click OK to execute the command. Create a New Counter Log 1. Click to expand Performance Logs and Alerts. 2. Right-click Counter Logs, then select New Log Settings. 3. In the New Log Settings dialog box, enter Lab 4-1. 4. Click OK to continue. Enter Log Settings In the Current Log file Name field, enter the path and location for the log files. Add Disk Activity Counters to the Counter Log 1. Click the Add Counters button. 2. In the Add Counters window, select the Select counters from Computer option. 3. In the computer name drop-down list, select NAV-SRV-01. 4. In the Performance object field, select PhysicalDisk. 5. Choose the Select counters from List option. 6. In the left list box, select % Disk Time. 7. Choose the Select Instances from List option. 8. In the right list box, select the C drive. 9. Keep the Ctrl key pressed. 10. In the left list box, click Avg. Disk Queue Length, Current Disk Queue Length, Disk Reads/sec, and Disk Writes/sec. 11. Click the Add button. Add Network Interface Counters to the Counter Log 1. In the Performance object field, select Network Interface. 2. Choose the Select counters from List option. 3. In the left list box, select Packets Received/sec. 4. Choose the Select Instances from List option. 5. In the right list box, select a network card. 4-64

Chapter 4: Performance Audits 6. Keep the Ctrl key pressed. 7. In the left list box, click Packets Sent/sec. 8. Click the Add button.

Add Processor Counters to the Counter Log
1. In the Performance object field, select Processor.
2. Choose the Select counters from List option.
3. In the left list box, select % Processor Time.
4. Choose the Select Instances from List option.
5. In the right list box, select Total.
6. Click the Add button.
7. In the Performance object field, select Process.
8. In the left list box, select % Processor Time.
9. Choose the Select Instances from List option.
10. In the right list box, select sqlservr.
11. Click the Add button.

Add Memory Counters to the Counter Log
1. In the Performance object field, select Memory.
2. Choose the Select counters from List option.
3. In the left list box, select Available MBytes.
4. Keep the Ctrl key pressed.
5. In the left list box, select Page Faults/sec.
6. Click the Add button.

Add SQL Server Memory Counters to the Counter Log
1. In the Performance object field, select SQLServer:Buffer Manager.
2. Choose the Select counters from List option.
3. In the left list box, select Buffer Cache Hit Ratio.
4. Keep the Ctrl key pressed.
5. In the left list box, click Page reads/sec and Page writes/sec.
6. Click the Add button.
7. Click OK to close the Add Counters window.
8. Click the Apply button.
4-65

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Schedule the Counter Log 1. On the Schedule tab, in the Start Log field, choose the first option. 2. In the Stop log field, select the second option (After... units). 3. In the After field, enter 72. 4. In the Units field, select hours. 5. Click the OK button. Activate the Counter Log 1. In the Microsoft Management Console, the new Lab 4.1 counter log appears. 2. Right-click the counter log. 3. Select Start. 4-66
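The SQL Server-specific counters that were added to the counter log, such as Buffer cache hit ratio, can also be read directly from the sys.dm_os_performance_counters dynamic management view mentioned in this chapter's review questions. The following query is an optional sketch, not part of the lab; note that the hit ratio counter must be divided by its base counter to obtain a percentage.

SELECT object_name, counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name IN ('Buffer cache hit ratio', 'Buffer cache hit ratio base');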

Lab 4.2 - Create a SQL Server Profiler Trace Chapter 4: Performance Audits In this lab you set up an SQL Server Profiler trace to identify database activity. Scenario To find the cause of the performance problems as soon as possible, Tim also starts monitoring the database activity using SQL Server Profiler. In case performance problems arise, he can see the database activity running on the server at the same time. While Tim is monitoring, Susan runs the badly performing codeunit written by Mort (see the Client Monitor lesson in this chapter). Challenge Yourself! Trace the database activity for the Demo Database NAV (6-0) database for a period of 72 hours. Use the Tuning template, and make sure the trace data can be correlated to the counter log setup in Lab 4.1. Need a Little Help? Perform the following steps to complete the lab: 1. Open SQL Server Profiler. 2. Define the Trace. 3. Filter the Trace. 4. Run Codeunit in Microsoft Dynamics NAV. 5. Check the Trace. Step by Step Open SQL Server Profiler 1. Open SQL Server Management Studio. 2. In the Tools menu, select SQL Server Profiler. 3. Connect to the NAV-SRV-01 Database Engine. Define the Trace 1. In the Trace Properties window, in the trace name, enter Lab 4.2. 2. In the Use the template list, select the Tuning template. 3. Check the Save to File option. 4. In the Save As dialog box, enter the path and file name for the SQL Server Profiler trace. In this example, enter C:\PerfLogs\Lab 4.2.trc. 5. Clear the Enable file rollover option. 6. Select the Server processes trace data option. 4-67

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 7. Select the Enable trace stop time option. 8. Set the date and time to the current system time + 72 hours. 9. On the Events Selection tab, right-click the header for the DatabaseID column and choose Deselect column. 10. Repeat step 9 for the ObjectType column. 11. Click the Show all columns option. 12. Right-click the header for the ApplicationName column and choose Select column. 13. Repeat step 12 for the following columns: CPU, EndTime, HostName, Reads, StartTime, and Writes.

Filter the Trace
1. On the Events Selection tab, click the Column Filters button.
2. In the Edit Filter window, in the left pane, select DatabaseName.
3. In the right pane, double-click the Like operator.
4. In the text box, enter Demo Database NAV (6-0).
5. Click OK to close the Edit Filter window.
6. Click Run to start the SQL Server Profiler trace.

Run Codeunit 123456710 in Microsoft Dynamics NAV
1. In the Windows Taskbar, go to Start > All Programs.
2. Right-click Microsoft Dynamics NAV 2009 Classic with Microsoft SQL Server and select Run as.
3. In the Run as window, select The following user.
4. In the User name field, enter CONTOSO\Susan.
5. In the Password field, enter pass@word1.
6. Click OK to run the Microsoft Dynamics NAV 2009 Classic client.
7. On the Tools menu, select Object Designer.
8. Click Codeunit.
9. Select codeunit 123456710.
10. Click Run.
4-68

Chapter 4: Performance Audits Check the Trace Switch to the SQL Server Profiler window. While the codeunit is running, events are added to the SQL Server Profiler trace. The LoginName column shows that the database activity is caused by CONTOSO\SUSAN. Stop the Trace When defining the trace, you have entered a trace stop time. Under usual circumstances, the trace will continue to run until the stop time is reached. For this lab, you can stop the trace here. In the SQL Server Profiler, right-click the trace window and select Stop Trace. 4-69
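Because the trace was saved to a file, it can also be loaded into a table for ad hoc querying instead of being browsed in the SQL Server Profiler window. A minimal sketch, assuming the file path used in this lab and the built-in sys.fn_trace_gettable function:

-- Load the saved trace file and show the heaviest statements first
SELECT TextData, LoginName, Duration, CPU, Reads, Writes
FROM sys.fn_trace_gettable('C:\PerfLogs\Lab 4.2.trc', DEFAULT)
ORDER BY Reads DESC;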

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Lab 4.3 - Correlate System Monitor and SQL Server Profiler Data In this lab you use SQL Server Profiler to analyze the events in the trace file and to link this database activity information to the data in the performance counter logs. Scenario Counter Logs and SQL Server Profiler have been running for 72 hours. Both the log and the trace file now contain a lot of information about system resources performance and database activity. Tim is curious to see the results of the monitoring processes. Challenge Yourself! In this lab you analyze the monitoring information from the counter log (Lab 4.1) and the trace file (Lab 4.2) in SQL Server Profiler. Need a Little Help? Perform the following steps to complete the lab: 1. Stop the Counter Log. 2. Start SQL Server Profiler. 3. Open SQL Server Profiler Trace. 4. Import Performance Data. 5. Analyze Performance Events. Step by Step Stop the Counter Log 1. In the Windows Taskbar, Click Start > Run. 2. In the Run window, enter perfmon. 3. Click OK to start System Monitor. 4. Under Performance Logs and Alerts, select Counter Logs. 5. In the right pane, right-click the Lab 4.1 counter log and select Stop. 4-70

Chapter 4: Performance Audits Start SQL Server Profiler 1. Open SQL Server Management Studio. 2. Connect to the NAV-SRV-01 Database Engine. 3. In the Tools menu, select SQL Server Profiler. If the Trace Properties window appears, click the Close button to close the window. Open SQL Server Profiler Trace 1. In the File menu, select Open > Trace File. 2. In the Open File dialog box, browse to the SQL Server Profiler Trace file that you created in Lab 4.2: C:\PerfLogs\Lab 4.2.trc 3. Click Open to load the trace file. Import Performance Data 1. In the File menu, select Import Performance Data. 2. Browse to the Counter Log log file created in Lab 4.1: C:\PerfLogs\Lab 4.1_000001.blg. 3. Click Open. 4. In the Performance Counters Limit Dialog window, select the Processor and Process categories. 5. Click OK to import the performance data. The monitoring information in both files is now correlated. The result should look as follows: FIGURE 4.13 CORRELATING SYSTEM RESOURCE INFORMATION WITH DATABASE ACTIVITY 4-71

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Analyze Performance Events The SQL Server Profiler window is now divided into four panes: The top pane shows the database queries run on the database. The second pane shows the system monitor graph. The third pane shows the list of counters that are displayed in the graph. The bottom pane shows the SQL statement that is run. To analyze the data, you can click any point in the graph or any event in the top pane. If you click an event in the top pane, the red line in the graph object will move to the corresponding point in time. Alternatively, you can click any point in the graph, after which the corresponding statement is selected in the top pane. In the top pane, in the LoginName column, you can see the login that caused the database activity. FIGURE 4.14 ANALYZING PERFORMANCE IN THE SQL SERVER PROFILER WINDOW 1. In the graph object, click a point at the left side. 2. Verify the TextData, LoginName, Duration, CPU, Reads and Writes columns in the events pane. 3. Use the right arrow key to move the red line farther to the right. Watch the columns in the top pane. 4. Move the red line to the point where both graph lines show a sudden increase. 5. Verify the Reads column in the top pane. 4-72

Chapter 4: Performance Audits 6. Click the ascending part of the graph lines. 7. Verify the Reads column in the top pane. 8. Repeat steps 6 and 7 several times. 9. Click the part of the graph where both lines reach the top limit. 10. Check the Reads column in the top pane. You notice that the number of reads goes up. 4-73

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Lab 4.4a - Analyze a SQL Server Profiler Trace using DTA In this lab, you analyze a SQL Server Profiler trace using Database Engine Tuning Advisor in order to find missing indexes.

Scenario
When correlating system monitor and counter log information, Tim detects that performance degrades when Susan is executing her piece of code. He can see that many UPDATE and SELECT statements are run on the Ledger Entry Dimension table and that the number of reads is increasing toward the end of the batch. Tim checks the indexes of the table.

Challenge Yourself!
Analyze the SQL Server Profiler trace created in Lab 4.2 using Database Engine Tuning Advisor.

Need a Little Help?
Perform the following steps to complete the lab:
1. Open Database Engine Tuning Advisor.
2. Configure Database Engine Tuning Advisor.
3. Start Database Engine Tuning Advisor.

Step by Step
Open Database Engine Tuning Advisor
1. Open SQL Server Management Studio.
2. On the Tools menu, select Database Engine Tuning Advisor.
3. Connect to the NAV-SRV-01 Database Engine.

Configure Database Engine Tuning Advisor
1. In the new tuning session, in the Session Name field, enter a meaningful name for the tuning session. For example, enter Lab 4.4a.
2. In the Workload field, select the File option.
3. Click the Browse for a workload file button.
4. In the Open File dialog box, browse to the SQL Server Profiler trace created in Lab 4.2: C:\PerfLogs\Lab 4.2.trc.
5. Click Open.
4-74

Chapter 4: Performance Audits 6. In the Select databases to tune list, in the Name column, select the Demo Database NAV (6-0) database. In the Selected tables column, you see the number of tables that have been selected for tuning. By default all tables are selected. 7. Click the drop-down button to open a list of all tables. 8. In the list of tables, click the check box in the upper-left corner to deselect all tables. 9. In the list of tables, select the CRONUS International Ltd_$Ledger Entry Dimension table. 10. Click the drop-down button again to close the list of tables and return to the session window. FIGURE 4.15 THE DATABASE ENGINE TUNING ADVISOR 11. On the Tuning Options tab, leave the default settings. Start Database Engine Tuning Advisor To start the Database Engine Tuning Advisor, click the Start Analysis button. Wait for the analysis to finish, and check the recommendations. 4-75

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Lab 4.4b - Analyze an SQL query using DTA In this lab, you analyze an SQL query using Database Engine Tuning Advisor, in order to find missing indexes. Scenario When correlating system monitor and counter log information, Tim detects that performance is decreased when Susan is executing her piece of code. He can see that many UPDATE and SELECT statements are run on the Ledger Entry Dimension table and that the number of reads increases toward the end of the batch. Tim checks whether the necessary indexes exist for the SELECT query. Challenge Yourself! Analyze a query using Database Engine Tuning Advisor. Need a Little Help? Perform the following steps to complete the lab: 1. Select the query to analyze. 2. Configure Database Engine Tuning Advisor. 3. Start Analysis. Step by Step Select the query to analyze 1. Open SQL Server Management Studio. 2. In the database drop-down list, select Demo Database NAV (6-0). 3. Click the New Query button to open a new query window. 4. In the Query window, enter the following query: SELECT * FROM [Demo Database NAV (6-0)].dbo.[CRONUS International Ltd_$Ledger Entry Dimension] Instead of entering the query manually, you can copy the query from SQL Server Profiler. Select the query you want to analyze, and select Edit > Copy cell. Paste the query in the new query window and remove all unknown variables and parameters. The resulting query should look like the query listed above. 5. Right-click the Query window and select Analyze Query in Database Engine Tuning Advisor. 4-76

Chapter 4: Performance Audits Configure Database Engine Tuning Advisor 1. In the Session Name field, enter a meaningful name for the tuning session. For example, enter Lab 4.4b. 2. In the Select databases to tune list, in the Name column, select the Demo Database NAV (6-0) database. The Selected tables column shows the number of tables that have been selected for tuning. By default all tables are selected. 3. Click the drop-down button to open a list of all tables. 4. In the list of tables, select the check box in the upper-left corner to deselect all tables. 5. In the list of tables, select the CRONUS International Ltd_$Ledger Entry Dimension table. 6. Click the dropdown button again to close the list of tables and return to the session window. 7. On the Tuning Options tab, leave the default settings. Start Database Engine Tuning Advisor To start the Database Engine Tuning Advisor, click the Start Analysis button. Wait for the analysis to finish, and check the recommendations. FIGURE 4.16 DATABASE ENGINE TUNING ADVISOR RECOMMENDATIONS 4-77

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Lab 4.4c - Find Missing Indexes using DMVs In this lab you find missing index information using dynamic management views.

Scenario
Tim checks for missing indexes using the dynamic management views.

Challenge Yourself!
Check for missing indexes using dynamic management views.

Need a Little Help?
Perform the following steps to complete the lab:
1. Create a new query in SQL Server Management Studio.
2. Query Dynamic Management Views.

Step by Step
Create a new query in SQL Server Management Studio
1. Open SQL Server Management Studio.
2. Click the New Query button.

Query Dynamic Management Views
1. In the New Query window, enter the following statement:

SELECT * FROM sys.dm_db_missing_index_details

2. Click the Execute button.
4-78

Chapter 4: Performance Audits Instead of waiting for the SQL Server Profiler or the Database Engine Tuning Advisor to process the load, Tim immediately sees the results of the query. The result should look as follows: FIGURE 4.17 QUERYING THE SYS.DM_DB_MISSING_INDEX_DETAILS DMV 4-79
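The sys.dm_db_missing_index_details view only lists the column combinations that SQL Server would have liked to see indexed. To judge which of the suggestions matter most, the details can be joined to the related missing-index DMVs that track how often, and at what cost, each suggestion was missed. The query below is a sketch built on the documented sys.dm_db_missing_index_groups and sys.dm_db_missing_index_group_stats views; treat the ranking expression as a rough heuristic rather than a rule.

-- Rank missing-index suggestions by a rough "impact" score
SELECT TOP 20
    d.statement AS table_name,
    d.equality_columns,
    d.inequality_columns,
    d.included_columns,
    s.user_seeks,
    s.avg_total_user_cost,
    s.avg_user_impact,
    s.user_seeks * s.avg_total_user_cost * (s.avg_user_impact / 100.0) AS impact_score
FROM sys.dm_db_missing_index_details AS d
JOIN sys.dm_db_missing_index_groups AS g
    ON g.index_handle = d.index_handle
JOIN sys.dm_db_missing_index_group_stats AS s
    ON s.group_handle = g.index_group_handle
ORDER BY impact_score DESC;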

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Quick Interaction: Lessons Learned Take a moment and write down three Key Points you have learned from this chapter 1. 2. 3. 4-80

Chapter 4: Performance Audits Solutions Test Your Knowledge

1. Performance troubleshooting involves several steps. Put the following steps in the correct order.
Step:
3 : Check database structure and indexes
2 : Check hardware performance
1 : Check hardware configuration

2. Which monitoring tools are part of the perfmon tool? (Select all that apply)
( ) System Information
( ) System Monitor
( ) Performance Logs and Alerts
( ) Task Manager

3. Information collected by the Client Monitor tool can be linked to information from another tool. Which tool can Client Monitor be linked to?
( ) Session Monitor
( ) SQL Server Profiler
( ) Code Coverage
( ) System Monitor

4. Which of the following statements about SQL Server Profiler are false? (Select all that apply)
( ) You can use customizable templates to trace performance.
( ) SQL Server Profiler can be scheduled to run automatically during a specific period of time.
( ) You can save trace information to a file and a database.
( ) You can use SQL Server Profiler to analyze current system performance with current database activity.

5. Database Engine Tuning Advisor provides information about the performance of which of the following: (Select all that apply)
( ) memory
( ) CPU
( ) indexes
( ) partition structures
4-81

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009
6. The sys.dm_os_performance_counters dynamic management view captures information related to SQL Server.
7. The sys.dm_db_file_space_usage DMV returns information related to the tempdb database.
4-82

Chapter 5: Improving Application Performance CHAPTER 5: IMPROVING APPLICATION PERFORMANCE

Objectives
The objectives are:
- Write optimized C/AL code.
- Optimize Sum Index Field Technology (SIFT) tables.
- Optimize cursors by using the right C/AL code statements.
- Optimize key design and usage in Microsoft Dynamics NAV.
- Avoid locks and deadlocks.
- Troubleshoot performance issues related to the graphical user interface.
- Set up index hinting.
- Optimize data entry using the Bulk Insert feature.
- Use some tips and tricks that are useful when optimizing Microsoft Dynamics NAV on Microsoft SQL Server.

Introduction
This chapter describes how to solve and avoid performance issues by optimizing C/AL code and indexes.
5-1

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Optimizing C/AL Code There are several areas where developers need to focus when optimizing Microsoft Dynamics NAV applications. These areas are as follows, in order of importance (based on the processing costs):
- SIFT
- Keys and Indexes
- Cursors
- Locks
- Suboptimum Code
- Graphical User Interface (GUI)

This lesson contains general guidelines for writing C/AL code. It explains which areas to pay extra attention to.

Keys
Keys define the order in which data is stored in your tables. Logically, records are stored sequentially in ascending order, sorted by the clustered index. Physically, records can be stored on disk in a different order. You can speed up searches in tables by defining several keys, which sort information in different ways. Defining keys is one thing; using the correct keys is also very important for performance. When writing code or creating reports, you must use the appropriate keys to get maximum performance. If you do not specify an adequate key, Microsoft SQL Server will try to find the best index. When a key is created in Microsoft Dynamics NAV, an index is created for that key in the corresponding SQL Server table. By default, the primary key is translated into the clustered unique index, and secondary keys become nonclustered unique indexes. The only time that data in a SQL Server table is stored in a sorted manner is when a clustered index is defined on the table. The data is then stored sorted according to the fields in the clustered index. The primary key does not always provide the best sorting order for records. A typical example is a ledger entry table. The primary key of a ledger entry table is a single field, Entry No. However, most queries on ledger entry tables use fields other than the primary key, such as Posting Date, No., or Status. Data manipulation and retrieval on these tables can be optimized by physically storing the records in the order in which they are often queried. The physical storage order of records is determined by the clustered index. By default, if you do not specify a clustered index, the primary key will be used as the clustered index. To select a key as the clustered index, set the Clustered property of the key to Yes. You can have only one clustered index per table. Tables without clustered indexes are called "heaps."
5-2
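To see how a Microsoft Dynamics NAV key ends up on SQL Server, you can list the indexes of the corresponding table and check which one is the clustered index. The following query is a minimal sketch; it assumes the standard demonstration database and the CRONUS Customer table (substitute your own database, company, and table name, since the exact object names depend on your installation).

USE [Demo Database NAV (6-0)];

-- List the indexes on the NAV Customer table and show which one is clustered
SELECT i.name AS index_name,
       i.type_desc,      -- CLUSTERED, NONCLUSTERED, or HEAP
       i.is_unique
FROM sys.indexes AS i
WHERE i.object_id = OBJECT_ID('[CRONUS International Ltd_$Customer]')
ORDER BY i.index_id;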

Chapter 5: Improving Application Performance A heap can be considered an unordered table, or a collection of unordered data pages. This means that rows are inserted into the heap anywhere there is space. Because the data is fragmented, retrieving data from a heap causes many data page reads, and often results in reading the complete table. For this reason, heaps should be avoided. Indexes that are designed only for sorting purposes can create overhead during insert, update, and delete statements on SQL Server. That is why we sometimes recommend not maintaining these indexes on SQL Server. On the other hand, Microsoft Dynamics NAV will request a dynamic cursor most of the time and, in general, it is a good idea that an index fits both the ORDER BY and the WHERE clause.

Cursors
When writing code to retrieve or browse through record sets in Microsoft Dynamics NAV, you can use a number of instructions. Retrieving data can be done for different reasons: for example, you want to modify the data, or you just want to check whether records exist that meet specific criteria (without doing anything with the records). In the SQL Server Option, each FIND instruction is translated into one or more SQL statements that read data in a particular way (using a specific cursor and isolation level). Very often, performance issues are caused by improper use of FIND statements, causing the wrong cursors and isolation levels to be used. As a result, data is returned but not used, or data is returned as read-only when it must be modified. Using the correct FIND statement improves data retrieval and processing.

Locks
There are additional considerations to make when working with Microsoft Dynamics NAV on SQL Server. Microsoft Dynamics NAV is designed to read without locks and to lock only when necessary, following optimistic concurrency recommendations. If records are to be modified, that intent should be indicated to the Database Management System (DBMS) by using explicit locking, so that the data is read properly.

Implicit Locking
The following table demonstrates implicit locking. The C/AL code is mapped to the equivalent action on SQL Server:

C/AL Code: Cust.FIND('-');
SQL Server: SELECT * FROM Customer WITH (READUNCOMMITTED) (the retrieved record timestamp = TS1)

C/AL Code: Cust.Name := 'John Doe';
SQL Server: (no statement)
5-3

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009

C/AL Code: Cust.MODIFY;
SQL Server: SELECT * FROM Customer WITH (UPDLOCK, REPEATABLEREAD) (the retrieved record timestamp = TS2), then performs the update: UPDATE Customer SET Name = 'John Doe' WITH (REPEATABLEREAD) WHERE TimeStamp <= TS1

The reason for such a complex execution is that:
- The data is read with the READUNCOMMITTED isolation level, but because users will update it, they need to ensure that they read committed data and issue an update lock on it to prevent other users from updating the same records.
- The data is originally read uncommitted. Users need to lock the record and ensure that they update the record with the original timestamp. If somebody else has modified the record in the meantime, the user receives the following error message: "Another user has modified the record after you retrieved it from the database."

Explicit Locking
If developers indicate that their intention is to modify the record by using explicit locking, they can eliminate the unacceptable behavior completely, as shown in the following table:

C/AL Code: Cust.LOCKTABLE;
SQL Server: Indicates intention of modification.

C/AL Code: Cust.FIND('-');
SQL Server: SELECT * FROM Customer WITH (UPDLOCK) (the retrieved record timestamp = TS1)

C/AL Code: Cust.Name := 'John Doe';
SQL Server: (no statement)

C/AL Code: Cust.MODIFY;
SQL Server: UPDATE Customer SET Name = 'John Doe' WITH (REPEATABLEREAD) (the retrieved record timestamp is guaranteed to be TS1)
5-4

Chapter 5: Improving Application Performance Instead of re-reading the data, Microsoft Dynamics NAV can immediately proceed to issue an UPDATE statement. This behavior explains the many occurrences of the following piece of code in the standard application:

IF Rec.RECORDLEVELLOCKING THEN
  Rec.LOCKTABLE;

The LOCKTABLE instruction is used to read the data with the correct isolation level. On SQL Server, LOCKTABLE will not lock any records. It will change the way records are accessed. The RECORDLEVELLOCKING function returns TRUE if the code is being executed on SQL Server; otherwise it returns FALSE.

Suboptimum Code
Taking performance into consideration often influences programming decisions. Often, users pay a high price in terms of performance because the code is not optimized: for example, because developers do not use explicit locking, or because they program bad loops and provoke problems with NEXT. Therefore, developers must review their code and check for the presence of performance-degrading statements or scenarios, such as the following:
- Reading the same table multiple times.
- Using COUNT to check whether records exist that meet specific criteria.
- Using MARKEDONLY instead of pushing records to a temporary table and reading them from there.
- Using WHILE FIND to browse through a record set. (WHILE FIND always looks for the first or last record in a set and therefore automatically disables the read-ahead mechanism.)
- Using IF NOT INSERT THEN MODIFY.

Additionally, there are features in the application that require special attention. For example, the advanced dimensions functionality can be costly when you use a lot of dimensions and have analysis views updated automatically on posting. Other functionalities that require attention are as follows:
- Automatic Cost Posting, Automatic Cost Adjustment, Expected Cost Posting to G/L
- Discount Posting = "All Discounts"
- Credit Warnings = "Both Warnings"
- Stockout Warnings

Users should review the application setup from a performance perspective and take corrective actions where possible.
5-5

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 BLOBs BLOBs can also cause performance issues because they are stored in a specific way in the SQL Server database. First, BLOB fields in Microsoft Dynamics NAV have a Compressed property, which indicates whether the data will be saved compressed. By default, the property is set to True, which means that Microsoft Dynamics NAV uses a special algorithm to compact the BLOB data. However, SQL Server does not know this algorithm and needs additional operations to handle the data. Second, because Microsoft Dynamics NAV always generates SELECT * FROM queries, all data from a table is returned, including BLOB fields. Because the BLOB is stored separately from the other table columns, SQL Server needs extra operations (page reads and CPU power) to collect the BLOB data. (Because a BLOB can take up to 2 GB, BLOBs are often spread over several pages.) However, the BLOB data is not relevant for many transactions and processes (such as item lookups, order posting, and item journal posting). Nevertheless, the BLOB data is always read. When using BLOBs, we recommend that you:
- Set the Compressed property on BLOB fields to False. As a consequence, SQL Server needs fewer operations to retrieve the BLOB data.
- Keep BLOBs away from transactions and processes by storing the BLOB data in separate tables. You do not have to delete or disable the BLOB field; simply not using the field in those processes is enough.

Problems with NEXT
In some cases, the NEXT command causes the biggest performance problem in Microsoft Dynamics NAV. The problem is that NEXT uses a cursor, and you cannot change the isolation level in the middle of the data retrieval. This means that the data has to be retrieved again, using a new SELECT statement. This imposes a serious performance penalty in SQL Server and, in some cases, leads to very lengthy execution. Typical scenarios where NEXT causes problems are as follows:
- The filtering of a record set is changed.
- The sorting of a record set is changed.
- A key value is changed.
- The isolation level is changed.
- NEXT is called out of context (on a record that was retrieved using GET or in another way).
- NEXT is used in combination with FINDFIRST or FINDLAST.

The following examples show some of these scenarios.
5-6

Chapter 5: Improving Application Performance Change Filtered Values In the following example, a field used to filter a table is assigned a new value in the record set.

SETRANGE(FieldA, Value);
FIND('-');
REPEAT
  ...
  FieldA := NewValue;
  ...
  MODIFY;
  ...
UNTIL NEXT = 0;

By changing the filtered field, the record is moved outside the current record set. When calling the NEXT instruction, Microsoft Dynamics NAV needs to execute extra queries to detect its cursor position in the original record set and will finally retrieve a completely new record set. This is no longer the case in version 5.0 and later, because Microsoft Dynamics NAV instead requests a dynamic cursor for this type of statement. However, with, for example, FINDSET(FALSE), Microsoft Dynamics NAV will still reissue the queries to find the data.

Change Sorting
The same happens when you change the sorting of a record set after it was retrieved.

SETCURRENTKEY(FieldA);
FIND('-');
REPEAT
  SETCURRENTKEY(FieldB);
  FIND('-');
  ...
UNTIL NEXT = 0;

When you change the sorting of a record set and retrieve the first record, Microsoft Dynamics NAV calls for a new record set (and an extra cursor).

Change Isolation Level
The following code shows an example of a changed isolation level.

FINDFIRST;
REPEAT
  ...
  MODIFY;
UNTIL NEXT = 0;
5-7

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Initially, the record set is retrieved with the READUNCOMMITTED isolation level. However, the MODIFY instruction requires a higher isolation level to modify the data. Because it is not possible to change the isolation level in the middle of the process, Microsoft Dynamics NAV requests a new record set.

Jumping Through Record Sets
The following code is a typical example of "jumping through a record set".

SETRANGE(FieldA, Value);
FIND('-');
...
REPEAT
  SETRANGE(FieldB, Value2);
  ...
  FIND('+');
  ...
  SETRANGE(FieldB);
  ...
UNTIL NEXT = 0;

In this example, an extra filter is applied to a record set, which leads to a new record set. The FIND('+') instruction requires a new cursor. When the NEXT statement is reached, Microsoft Dynamics NAV needs several queries to reposition its cursor in the original record set.

FINDFIRST/FINDLAST with NEXT
When NEXT is used in combination with FINDFIRST or FINDLAST, you go from a non-cursor to a cursor situation. The FINDFIRST instruction retrieves the data without cursors; NEXT causes a re-read with a cursor.

SETRANGE(FieldA);
FINDFIRST;
REPEAT
  ...
UNTIL NEXT = 0;

Change Key Values
In the following code, a field that is part of the active key is changed.

SETCURRENTKEY(FieldA);
FIND('-');
REPEAT
  ...
  FieldA := NewValue;
  ...
UNTIL NEXT = 0;
5-8

Chapter 5: Improving Application Performance By changing the key field, you disturb the current sorting and at the same time disable the benefits of SQL Server's read-ahead mechanism. When calling the NEXT instruction, SQL Server tries to read the next record based on the new key value. Solutions To eliminate performance problems with NEXT, consider the following solutions: Browse sets nicely (without jumping) and by preference use a separate looping variable. Restore original key values, sorting, and filters before the NEXT statement. If you modify them multiple times then read records to temporary tables, modify within, and write back afterward. Use FINDSET(TRUE,TRUE). Note also that FINDSET(TRUE,TRUE) is not a "real solution." It is merely a reduction of the costs and should be used only as a last resort. Graphical User Interface (GUI) The GUI overhead can slow down the client, if, for example, a dialog is refreshed 1000 times in a loop. GUI overhead can also cause increased server trips. When users use the default SourceTablePlacement = <Saved> on forms, it costs more than using Last or First. Users should review all forms showing data from large tables to look for performance problems. Another big overhead may come from the "Find As You Type" feature. When Find As You Type is enabled, the system is forced to do another query for each keystroke. This causes extra queries to be sent to the server. Finally, displaying many FlowFields on normal forms such as Customer Card, Item Card, and so on, can adversely affect form retrieval time, as the FlowFields have to be calculated. This can be a problem especially on list forms (showing multiple records at a time). The basic principle is to display these FlowFields on demand rather than by default when the user is not even interested in the information provided. If you need to display many FlowFields, use special forms such as Customer Statistics, Item Statistics, Customer Entry Statistics, Item Entry Statistics, Customer Sales, Item Turnover, and so on. 5-9

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 SIFT SIFT was originally implemented on SQL Server by using extra summary tables, called SIFT tables, that were maintained through table triggers directly on the table definitions in SQL Server. When an update was performed on a table that contains SIFT indexes, a series of additional updates was necessary to update the associated SIFT tables. This imposed an extra performance penalty, one that grew as the number of SIFT indexes on a table increased. With regard to performance, SIFT tables are one of the biggest Microsoft Dynamics NAV performance problems on SQL Server, because one record update in the base table produces a potentially massive stream of Input/Output (I/O) requests with updates to the records in the SIFT tables, possibly blocking other users during that time. In Microsoft Dynamics NAV 5.0 SP1, Microsoft replaced SIFT tables with VSIFT, which are indexed views. However, Microsoft Dynamics NAV developers will likely be involved with older versions, where they may encounter performance issues related to the SIFT tables. It is important for Microsoft Dynamics NAV developers to know how SIFT tables worked before version 5.0 SP1 and how to troubleshoot performance issues related to these tables.

Optimizing SIFT Tables
SIFT tables are used in Microsoft Dynamics NAV 5.0 and older to implement SIFT on SQL Server, and they store aggregate values for SumIndexFields for keys in the source tables. The overhead of the separate SIFT tables is massive, so their activation should be carefully considered. By default, Microsoft Dynamics NAV activates the SIFT tables when developers create a new index with SumIndexFields. Developers should review all of the existing SIFT indexes and determine whether they need to keep them activated. Developers can deactivate the creation and maintenance of a SIFT table by using the MaintainSIFTIndex property in the Microsoft Dynamics NAV key designer. If they set the property to No, and there is no other maintained SIFT index supporting the retrieval of the cumulative sum, Microsoft Dynamics NAV asks SQL Server to calculate the sum itself. For example, if developers have a Sales Line table and put Amount in the SumIndexFields for the primary key ("Document Type, Document No., Line No."), a new SIFT table "CRONUS International Ltd_$37$0" is created and maintained. When a CALCSUM is used to display a FlowField in Microsoft Dynamics NAV showing the sum of all Sales Lines for a specific Sales Header (Order ORD-980001), the resulting query looks as follows:

SELECT SUM(s29)
FROM "CRONUS International Ltd_$37$0"
WHERE "bucket" = 2 AND "f1" = 1 AND "f3" = 'ORD-980001'
5-10

Chapter 5: Improving Application Performance If developers disable the SIFT table by clearing the MaintainSIFTIndex check box, Microsoft Dynamics NAV still works, and the resulting query looks as follows:

SELECT SUM("Amount")
FROM "CRONUS International Ltd_$Sales Line"
WHERE "Document Type" = 1 AND "Document No_" = 'ORD-980001'

This is a relatively light CPU load compared to the massive cost of maintaining the SIFT table. SIFT tables are very beneficial when users need to sum a large number of records. With that in mind, developers can check existing SIFT tables and see whether they need all of the levels of detail. There is no need, for example, to store a cumulative sum of just a few records. Developers can use the SIFTLevels property and disable specific levels by clearing the Maintain check box for a specific bucket, thus reducing the overall overhead of the SIFT table while still keeping the SIFT table in place for summing larger numbers of records. There is also no need to keep cumulative sums on top-level buckets if they are never used, such as a total of Quantity per Location Code in the Item Ledger Entry table, since users always filter on Item No.

Table Optimization
As explained earlier, every time you update a key or a SumIndexField in a base table, all of the SIFT tables associated with the base table must also be updated. This means that the number of SIFT tables that you create, as well as the number of SIFT levels that you maintain, affects performance. If you have a very dynamic base table that constantly has records inserted, modified, and deleted, the SIFT tables that are associated with it will constantly need to be updated. As a consequence, the SIFT tables can get very large, both because of the new records that are entered and because the records that are deleted from the base table are not removed from the SIFT tables. This can badly affect performance, especially when the SIFT tables are queried to calculate sums.
5-11

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 To keep the SIFT tables from growing very large and to maintain performance, it is important to optimize the tables regularly. To initiate the optimization process, click File, Database, Information, Tables, Optimize in the Microsoft Dynamics NAV client. FIGURE 5.1 TABLE OPTIMIZATION ON LARGE TABLES The optimization process removes any entries that contain zero values in all numeric fields from each SIFT table. The removal of these redundant entries frees space and makes updating and summing SIFT information more efficient. At the same time, the optimization process rebuilds all indexes. As an alternative, you can run an SQL query on the SIFT table to determine how many records there are with zero values in all the sum fields in the table. If there are a large number of these records, you can either initiate the optimization process in Microsoft Dynamics NAV and remove them or schedule a query to delete these records on SQL Server. VSIFT Starting with Microsoft Dynamics NAV 5.0 SP 1, the SIFT tables are replaced by indexed views. Separate SIFT tables are no longer part of Microsoft Dynamics NAV on SQL Server. Microsoft Dynamics NAV 5.0 SP1 uses "indexed views" to maintain SIFT totals. Indexed views are a standard SQL Server feature. An indexed view is similar to a normal SQL Server view except that the contents have been materialized (computed and stored) to disk to speed up the retrieval of data. One indexed view is created for each SIFT key that is enabled. When you create a SIFT key for a table, you must set the MaintainSIFTIndex property for that key to Yes to enable the SIFT key and create the indexed view. 5-12
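To make the difference from the old SIFT tables concrete, the following is a rough sketch of what an indexed view for a SIFT key looks like. The view and column names are hypothetical (Microsoft Dynamics NAV generates the real VSIFT views automatically, and you should never create or modify them yourself); the sketch only illustrates the standard SQL Server requirements for an indexed view: schema binding, COUNT_BIG(*), and a unique clustered index that materializes the aggregate.

-- Hypothetical illustration of a SIFT-style indexed view (NAV generates these for you)
CREATE VIEW dbo.[Example$G_L Entry$VSIFT$1]
WITH SCHEMABINDING
AS
SELECT  [G_L Account No_],
        [Posting Date],
        SUM([Amount])   AS [SUM$Amount],   -- Amount is assumed NOT NULL, as NAV columns are
        COUNT_BIG(*)    AS [Count]         -- required by SQL Server for indexed views with GROUP BY
FROM dbo.[Example$G_L Entry]
GROUP BY [G_L Account No_], [Posting Date];
GO
-- Materialize the view: the unique clustered index stores the aggregated rows on disk
CREATE UNIQUE CLUSTERED INDEX [VSIFTIDX]
ON dbo.[Example$G_L Entry$VSIFT$1] ([G_L Account No_], [Posting Date]);

Because the totals are materialized in the clustered index, a CALCSUM can be answered from a few aggregated rows instead of from the base table, which is the same benefit the old SIFT tables provided, but with maintenance handled entirely by the SQL Server engine.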

Chapter 5: Improving Application Performance After the indexed view has been created, the contents of the view are maintained as changes are made to the base table. If you set the MaintainSIFTIndex property for that key to No, the indexed view is dropped and totals are no longer maintained. The indexed view that is used for a SIFT key is always created at the most finely-grained level. Therefore, if you create a SIFT key for Account No., Posting Date, the database will store an aggregated value for each account for each date. This means that, in the worst-case scenario, 365 records per unique Account No. must be summed to generate the total for each account for a year.

Tuning and Tracing VSIFT
As a result of using indexed views, SIFT keys are exposed to SQL Server tracing and tuning tools. For example, SQL Server Profiler can display information about which indexed views are maintained and the cost associated with maintaining them. This makes it easier for you to make informed decisions about which SIFT indexes are required for optimal performance.

Demonstration: Analyzing SIFT Configuration with SQL Server Profiler
Perform the following steps to use SQL Server Profiler to determine the best SIFT index configuration.
1. In the Windows Taskbar, click Start > All Programs > Microsoft SQL Server 2008.
2. Open Performance Tools.
3. Open SQL Server Profiler.
4. Connect to the NAV-SRV-01 Database Engine.
5. Choose a trace template, for example: Tuning.
6. Go to Events Selection.
7. Expand Performance.
8. Select Showplan XML. By default, this information is not included.
9. Press Run to start the trace.
5-13

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 There are different possibilities when setting up a new trace. For this example, the output is not saved and you will need to manually stop the trace. FIGURE 5.2 TRACING SIFT CONFIGURATION It is possible to customize the SQL Profiler trace with options. For this example, Showplan XML is added to the Standard (default) template. Showplan XML provides greater flexibility in viewing query plans. In addition to the usability benefits, Showplan XML also provides an overview of certain plan specific information, such as cached plan size, memory fractions (grants distributed across operators in the query plan), parameter list with values used during optimization, and missing indexes. When data is inserted, updated, or deleted in a table, the SIFT keys that have been defined and enabled for the specific table are maintained. Maintaining these SIFT indexes has a performance overhead. The size of the performance overhead depends on the number of keys and the SumIndexFields defined for each table. Defining Efficient Indexes There are several things to consider when designing SIFT indexes. It is important to only create the needed SIFT indexes but at the same time be sure that these indexes cover the sum queries required by Microsoft Dynamics NAV. If a table does not contain a large number of records, there is no need to maintain any SIFT indexes for that table. In this case, set the MaintainSIFTIndex property to No. Be sure to notice the number of SIFT keys defined in the system to ensure that you only maintain the SIFT keys that are important. Combine SIFT indexes, if possible. 5-14

Chapter 5: Improving Application Performance You do not have to maintain a SIFT key for a total that is only used periodically. Periodically generated totals can easily be generated by a report instead.

FIGURE 5.3 SHOWPLAN XML EXECUTION PLAN DETAILS

Even though indexed views are used to support SIFT indexes, when a sum is requested, the SIFT index that best matches the filter or sum fields will be used. In this case, single SIFT indexes that contain all key fields and all sum fields will be used. If such a SIFT index does not exist, the sum will be calculated from the base table (SIFT indexes will not be used). As with regular indexes, the key fields in the SIFT index that are used most regularly in queries will be positioned to the left in the SIFT index. As a general rule, the field that contains the greatest number of unique values is placed on the left, with the field that contains the second greatest number of unique values positioned to its right, and so on. Integer fields generally contain the greatest number of unique values; Option fields contain a fairly small number of values. Even if a specified filter does not supply values for the leftmost columns in the SIFT index, the index can still be used and add value. The reason is that the algorithm will use the SIFT index, and the SIFT index (indexed view) contains only the sums, so the data that needs to be traversed to calculate a total is far less than when going to the base table.

FIND Instructions
Unlike the Microsoft Dynamics NAV Classic database server, SQL Server can be characterized as a set-based engine. This means that SQL Server is very efficient when retrieving a set of records from a table, but less so when records are accessed one at a time.
5-15

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Cursors Because SQL Server is set-based, it does not provide a fast way to do this kind of record-at-a-time retrieval. SQL Server uses mechanisms called cursors for record-level access. There are different types of cursors, which have different capabilities. For example, the forward-only cursor allows fast reading from top to bottom, while the dynamic cursor allows retrieval in either direction. Typically, the FIND instruction is used to retrieve records in a table or a filtered record set. Often, the FIND instruction is used in combination with a filter instruction such as SETFILTER or SETRANGE (to check whether records exist that meet specific criteria) or in combination with the NEXT instruction (to loop through a set of records). In both cases, the FIND instructions will be translated into a Transact-SQL statement that uses a cursor and returns data. Compared to retrieving sets of records, cursors are very expensive. When writing C/AL code to retrieve records, it is important to consider the purpose of the code and to use the correct instructions (and, as a consequence, the correct cursors). By default, the way the dynamic cursors are used is not very efficient. Because cursors have a big effect on performance, handling them in a different way can yield significant improvements. For example, there is no reason to create a cursor for retrieving a single record. To optimize cursors, the following four Microsoft Dynamics NAV commands can be used:
- ISEMPTY
- FINDFIRST
- FINDLAST
- FINDSET

ISEMPTY
The ISEMPTY function allows you to determine whether a C/SIDE table or a filtered set of records is empty. The following code samples check for the presence of a Master record in the Customer table:

// Code Sample 1
Customer.SETRANGE(Master, TRUE);
IF NOT Customer.FIND('-') THEN
  ERROR('No Master Customer record has been defined.');

// Code Sample 2
Customer.SETRANGE(Master, TRUE);
IF Customer.ISEMPTY THEN
  ERROR('No Master Customer record has been defined.');
5-16

Chapter 5: Improving Application Performance When executed, the first code sample will be translated into an SQL statement that uses cursors. In addition, if a record exists, it is returned to the client, causing extra network traffic and disk reads. However, in this case, you do not want to retrieve a record. To avoid this, you can use the ISEMPTY instruction, as shown in the second code sample. When executed, this code results in the following T-SQL command:

SELECT TOP 1 NULL FROM ...

The ISEMPTY instruction will not cause cursors to be used. Note that NULL is used, which means that no record columns are retrieved from the database (as opposed to '*', which would get all columns). This makes it a very efficient command that causes just a few bytes to be sent over the network. This can be a significant improvement as long as subsequent code does not use the values from the found record.

FINDFIRST
Retrieving the first record in a table can also be an unnecessarily expensive command. Consider the following code samples:

// Code Sample 1
Customer.SETRANGE(Master, TRUE);
IF NOT Customer.FIND('-') THEN
  ERROR('No Master Customer record has been defined.');

// Code Sample 2
Customer.SETRANGE(Master, TRUE);
IF NOT Customer.FINDFIRST THEN
  ERROR('No Master Customer record has been defined.');

In the first code sample, the FIND instruction will generate a cursor. To avoid this cost, you can use the FINDFIRST instruction, as shown in code sample 2. The FINDFIRST instruction retrieves the first record in a set based on the current key and filters. As with ISEMPTY, FINDFIRST does not use cursors. When executed, the following T-SQL statement is generated:

SELECT TOP 1 * FROM ... ORDER BY ...

Note that in this case the * is used, so all columns of the record are returned. Use this function instead of FIND('-') when you need only the first record.
5-17

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 WARNING: If using a REPEAT/UNTIL NEXT loop, do not use this command, because the NEXT will need to create a cursor for fetching the subsequent records. FINDLAST FINDLAST works in the same manner as the FINDFIRST instruction. The FINDLAST command retrieves the last record in a set (based on the current key and filters), but, like FINDFIRST, FINDLAST does not use cursors. Consider the following two code samples: // Code Sample 1 Message.SETCURRENTKEY(Date); IF Message.FIND('+') THEN MESSAGE('Last message is dated %1.', FORMAT(Message.Date)); // Code Sample 2 Message.SETCURRENTKEY(Date); IF Message.FINDLAST THEN MESSAGE('Last message is dated %1.', FORMAT(Message.Date)); This second code sample retrieves the last record in the set, and does not use cursors. When executed, this T-SQL is generated: SELECT TOP 1 * FROM... ORDER BY... DESC You should use this function instead of FIND('+') when you need only the last record in a table or set. WARNING: If doing a REPEAT/UNTIL NEXT(-1) loop, do not use this command, because the NEXT will have to create a cursor for fetching the subsequent records. FINDSET FINDSET retrieves a set of records based on the current key and filter and can be used when you need to browse through a set of records. Very often, the scenario of code sample 1 is used: // Code Sample 1 IF RecordVariable.FIND('-') THEN REPEAT UNTIL RecordVariable.NEXT = 0; // Code Sample 2 IF RecordVariable.FINDSET THEN REPEAT UNTIL RecordVariable.NEXT = 0; 5-18

Chapter 5: Improving Application Performance From previous paragraphs, you know that FIND will generate a cursor in code sample 1, which is to be avoided. Therefore, it is better to use the FINDSET instruction, as shown in code sample 2. Unlike the FIND('-') command, FINDSET does not use cursors. When executed, the T-SQL result looks as follows: SELECT TOP 500 * FROM... The value 500 in the code snippet here comes from the database setup. Its default value has been set to 50. The recommended value for this parameter is the average number of sales lines on a sales order. REPEAT/UNTIL NEXT browses through the records locally on the client machine. This is the recommended way to retrieve sets quickly, without any cursor overhead. Note that FINDSET only allows you to loop through the record set from the top down. If you want to loop from the bottom up, use FIND('+'). Use this function only when you explicitly want to loop through a record set. You should only use this function in combination with REPEAT/UNTIL. The complete syntax for the FINDSET instruction is as follows: Ok := Record.FINDSET([ForUpdate][, UpdateKey]) Although you can use it without, the FINDSET instruction has two optional parameters which might improve performance. The ForUpdate parameter indicates whether you want to modify the records or not. The UpdateKey parameter indicates whether you want to modify a field in the current key. The UpdateKey parameter does not apply when ForUpdate is FALSE. Using FINDSET without parameters corresponds to FINDSET(FALSE, FALSE). You can use it to obtain a read-only record set. This uses no server cursors and the record set is read with a single server call. NOTE: FINDSET only allows you to loop through the record set from the top down. If you want to loop from the bottom up, use FIND('+'). We recommend that you use FINDSET to loop through a set without updating it, as shown in the following example. SalesLine.SETFILTER("Purch. Order Line No.",'<>0'); IF SalesLine.FINDSET THEN BEGIN REPEAT CopyLine(SalesLine); UNTIL SalesLine.NEXT = 0; END; If you set any or both of the parameters to FALSE, you can still modify the records in the set but these updates will not be performed optimally. 5-19
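As a hedged sketch of the bottom-up alternative mentioned in the note above, a descending loop uses FIND('+') together with NEXT(-1); the filter and the CopyLine call are reused from the example above and are illustrative only.
SalesLine.SETFILTER("Purch. Order Line No.",'<>0');
IF SalesLine.FIND('+') THEN
  REPEAT
    CopyLine(SalesLine); // illustrative processing, as in the example above
  UNTIL SalesLine.NEXT(-1) = 0;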

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 The variations of the FINDSET instruction will be discussed in the next sections. FINDSET(TRUE) We recommend that you set the ForUpdate parameter to TRUE to modify any records in the set. If you set the parameter to TRUE, the LOCKTABLE command is issued immediately before the records are read. This variation of FINDSET locks the set that is read, so it is equivalent to a LOCKTABLE followed by FINDSET. FINDSET(TRUE) uses the read-ahead mechanism to retrieve several records instead of just one. Unlike the FINDSET instruction, FINDSET(TRUE) uses a dynamic cursor. The main purpose of this command is to raise the isolation level before reading the set because the resulting records are to be modified. This example shows how to use the FINDSET function to loop through a set and update a field that is not within the current key. SalesLine.SETRANGE("Document Type",DocumentType); SalesLine.SETRANGE("Document No.",DocumentNo); IF SalesLine.FINDSET(TRUE, FALSE) THEN BEGIN REPEAT SalesLine."Location Code" := GetNewLocation(SalesLine); SalesLine.MODIFY; UNTIL SalesLine.NEXT = 0; END; We recommend that LOCKTABLE be used with FINDSET for small sets, and that the FINDSET(TRUE) command be used for sets larger than 50 records (the Record Set parameter in the Alter Database window). We do not recommend that you use FINDSET with a large result set, that is, a result set that is larger than the Record Set Size parameter. In this case, you should use FIND('-'). The reason is that FINDSET implies that you are working with a confined set of records, and Microsoft Dynamics NAV uses this information to optimize what is being done on SQL Server. A good example of using the FINDSET(TRUE) command is reading a large set of records that must also be modified. For example, when going through all G/L entries for a specific account, and changing a field value based on the record condition, the filtered set will probably have more than 50 records. This might be done as follows: GLEntry.SETRANGE("G/L Account No.", '6100'); IF GLEntry.FINDSET(TRUE) THEN REPEAT IF (GLEntry.Amount > 0) THEN BEGIN GLEntry."Debit Amount" := GLEntry.Amount; 5-20

Chapter 5: Improving Application Performance GLEntry."Credit Amount" := 0; END ELSE BEGIN GLEntry."Debit Amount" := 0; GLEntry."Credit Amount" := -GLEntry.Amount; END; GLEntry.MODIFY; UNTIL GLEntry.NEXT = 0; A good example of using the LOCKTABLE and FINDSET commands (as opposed to using the FINDSET(TRUE) command) is reading a small set of records that must also be modified. For example, when going through all sales lines for a specific order, and changing the value of several fields, the filtered set will probably have fewer than 50 records. SalesLine.SETRANGE("Document Type","Document Type"::Order); SalesLine.SETRANGE("Document No.",'S-ORD-06789'); SalesLine.LOCKTABLE; IF SalesLine.FINDSET THEN REPEAT SalesLine."Qty. to Invoice" := SalesLine."Outstanding Quantity"; SalesLine."Qty. to Ship" := SalesLine."Outstanding Quantity"; SalesLine.MODIFY; UNTIL SalesLine.NEXT = 0; FINDSET(TRUE, TRUE) This variation of FINDSET(TRUE) allows you to modify a field that is part of the current key (the current sorting order) of the set. The following example shows how to use the FINDSET function to loop through a set and update a field that is within the current key. SalesShptLine.SETRANGE("Order No.",SalesLine."Document No."); SalesShptLine.SETRANGE("Order Line No.",SalesLine."Line No."); SalesShptLine.SETCURRENTKEY("Order No.","Order Line No."); IF SalesShptLine.FINDSET(TRUE, TRUE) THEN BEGIN REPEAT SalesShptLine."Order Line No." := SalesShptLine."Order Line No." + 10000; SalesShptLine.MODIFY; UNTIL SalesShptLine.NEXT = 0; END; 5-21

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Like FINDSET(TRUE), the command uses a dynamic cursor, with the main purpose of raising the isolation level before starting to read the set (because the set needs to be modified). However, it does not use the read-ahead mechanism. Instead it retrieves one record at a time because the set is expected to be invalidated within the loop. Avoid using this command, since the loop code should be changed to a more efficient method of working, such as using a different variable for browsing through the set. Important to understand is that the isolation level and the cursor type have nothing to do with one another. The isolation level is connection wide whereas the cursor type influences the current statement. Microsoft Dynamics NAV uses a dynamic cursor to get a dynamic result set, a result set that contains our own changes. This is done to be consistent with the old classic database. If Microsoft Dynamics NAV used another cursor type or no cursor at all then it might have to reissue the query whenever the code was changing data in the current table. A good example of using the FINDSET(TRUE,TRUE) command (as opposed to using FIND command) is for the read of a set of records and the need to modify a key value. This should be avoided. If there is not a way to avoid this, use FINDSET(TRUE,TRUE). For example, going through all sales lines for a specific order, and changing key value, the filtered set will probably have less than 50 records in the set. This can be done as follows: SalesLine.SETCURRENTKEY("Document Type","Document No.","Location Code"); SalesLine.SETRANGE("Document Type","Document Type"::Order); SalesLine.SETRANGE("Document No.",'S-ORD-06789'); SalesLine.SETFILTER("Location Code",''); IF SalesLine.FINDSET(TRUE,TRUE) THEN REPEAT IF SalesLine.Type = SalesLine.Type::Item THEN SalesLine."Location Code" := 'GREEN'; IF SalesLine.Type = SalesLine.Type::Resource THEN SalesLine."Location Code" := 'BLUE'; SalesLine.MODIFY; UNTIL SalesLine.NEXT = 0; Note that the example can be easily changed into more efficient code using FINDSET as opposed to FINDSET(TRUE,TRUE) and using a separate variable to modify the records. This can be done as follows: SalesLine.SETCURRENTKEY("Document Type","Document No.","Location Code"); SalesLine.SETRANGE("Document Type","Document Type"::Order); SalesLine.SETRANGE("Document No.",'S-ORD-06789'); SalesLine.SETFILTER("Location Code",''); SalesLine.LOCKTABLE; IF SalesLine.FINDSET THEN REPEAT SalesLine2 := SalesLine; 5-22

Chapter 5: Improving Application Performance IF SalesLine.Type = SalesLine.Type::Item THEN SalesLine2."Location Code" := 'GREEN'; IF SalesLine.Type = SalesLine.Type::Resource THEN SalesLine2."Location Code" := 'BLUE'; SalesLine2.MODIFY; UNTIL SalesLine.NEXT = 0; There is a parameter in Microsoft Dynamics NAV that is used to set up the maximum number of records retrieved from the database (File, Database, Alter, Advanced tab, Caching, Record Set = 50). If the set is bigger than this maximum, Microsoft Dynamics NAV will continue to work, but it will replace the reading mechanism with a dynamic cursor. If there is an indication that this will occur, use the 'old' FIND('-') command as opposed to FINDSET. Use FINDSET for the forward direction only; it will not work for REPEAT/UNTIL NEXT(-1). Also, if the LOCKTABLE command is used prior to the FINDSET, the set is locked, and records can be modified within the loop. A good example of an efficient use of cursors (using the 'old' FIND command) is reading a big set of records, for example all G/L entries for a specific account, probably with more than 50 records in the set. GLEntry.SETRANGE("G/L Account No.", '6100'); IF GLEntry.FIND('-') THEN REPEAT UNTIL GLEntry.NEXT = 0; A good example of using the new FINDSET command (as opposed to using the 'old' FIND command) is reading a small set of records, such as all sales lines in a sales order, probably always with fewer than 50 records. This can be done as follows: SalesLine.SETRANGE("Document Type","Document Type"::Order); SalesLine.SETRANGE("Document No.",'S-ORD-06789'); IF SalesLine.FINDSET THEN REPEAT TotalAmount := TotalAmount + SalesLine.Amount; UNTIL SalesLine.NEXT = 0; Keys One of the largest typical Microsoft Dynamics NAV overheads is the cost of indexes. The Microsoft Dynamics NAV database is over-indexed, since customers require certain reports to be ordered in different ways, and the only way to do this is to create a Microsoft Dynamics NAV key that sorts data in these specific ways. SQL Server can sort results quickly if the set is small, so there is no need to keep indexes for sorting purposes only. For example, in the Warehouse Activity Line 5-23

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 table, there are several keys that begin with the Activity Type and No. fields, such as the following: Activity Type,No.,Sorting Sequence No. Activity Type,No.,Shelf No. Activity Type,No.,Action Type,Bin Code The issue here is that these indexes are not needed on SQL Server, because the Microsoft Dynamics NAV code always filters on Activity Type and No. when using these keys. With SQL Server, the Query Optimizer looks at the filter and realizes that the clustered index is Activity Type,No_,Line No_, that the set is small, and that there is no need to use an additional index to retrieve the set and return it in that specific order. It will use only the clustered index for these operations. Additionally, not all of this functionality is used by every customer, so if they never pick the stock by Sorting Sequence No., for example, there is no need to maintain that index. Developers should analyze the existing indexes with a focus on their use and benefits compared to the overheads, and decide what action is needed. They can disable an index completely by using the Enabled key property, the KeyGroups property, or the MaintainSQLIndex property. Indexes that remain active can be restructured by using the SQLIndex property. Developers can also cluster the table by a different index. If an index exists, sorting by the fields matching the index will be faster, but modifications to the table will be slower. When you write a query that searches through a subset of the records in a table, be careful when defining the keys both in the table and in the query so that Microsoft Dynamics NAV can quickly identify this subset. For example, the entries for a specific customer will usually be a small subset of a table that contains entries for all the customers. The time that is required to complete a query depends on the size of the subset. If a subset cannot be located and read efficiently, performance will deteriorate. To maximize performance, you must define the keys in the table so that they facilitate the queries that you will have to run. These keys must then be specified correctly in the queries. For example, you want to retrieve the entries for a specific customer. To do this, you apply a filter to the Customer No. field in the Cust. Ledger Entry table. SQL Server makes stricter demands than Classic Database Server on the way that keys are defined in tables and on the way they are used in queries. Microsoft Dynamics NAV Classic Database Server has been optimized for low-selectivity keys. For example, if there is an index that consists of the Document Type and Customer No. fields and the application filters on the Customer No. field only, 5-24

Chapter 5: Improving Application Performance Classic Database Server will search through the index branches and retrieve the result set quickly. However, SQL Server is not optimized to do that, so it scans from the beginning to the end of a range and, in many cases, this results in a nonclustered index scan. To run the query efficiently on SQL Server, you need to define a key in the table that has Customer No. as the first field. You must also specify this key in the query. Otherwise, SQL Server will be unable to answer this query efficiently and will read through the entire table. Define your keys and queries with SQL Server in mind, as this will ensure that your application can run as efficiently on both server options. When designing keys, the following guidelines can be considered: Redesign keys so that their selectivity becomes higher by putting Boolean, Option, and Date fields toward the end of the index. Set the MaintainSIFTIndex property to No on small tables or temporary tables (such as Sales Line, Purchase Line and Warehouse Activity Line). Set the MaintainSQLIndex property to No for indexes that are only used for sorting purposes. Reduce the number of keys on hot tables. Use the SQLIndex property to optimize a key on SQL Server. But be careful with this property. If the SQL Server index differs from the Microsoft Dynamics NAV index it can lead to problems with the ORDER BY and the WHERE CLAUSE not fitting the same index. This is largely a problem with dynamic cursors. Reduce the number of records in static tables (by archiving or using data partitioning). Keys and Performance Searching for specific data is usually easier if several keys have been defined and maintained for the table holding the desired data. The indexes for each of the keys provide specific views that enable quick flexible searches. However, there are both advantages and drawbacks to using a large number of keys. If you increase the number of secondary keys marked as active, performance will improve when you read data, but will deteriorate when updating information (because indexes must be maintained for each secondary key). When you decrease the number of active sortings, performance will slow down when reading data, but updates will be faster. 5-25

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 The decision whether to use few or many keys is not easy. The choice of appropriate keys and the number of active keys to use should be the best compromise between maximizing the speed of data retrieval and maximizing the speed of data updates (operations that insert, delete, or modify data). In general, it may be worthwhile to deactivate complex keys if they are rarely used. The overall speed of C/SIDE depends on the following factors: The size of the database The number of active keys The complexity of the keys The number of records in your tables The speed of your computer and its disk system Key Properties The keys associated with a table have properties that describe their behavior, just as tables and fields do. When you create a key, C/SIDE automatically suggests several default values for these properties. Depending on the purpose of the key, you may want to change these default values. Enabled Property Enabled property simply turns the specific key on and off. It might have been there for temporary reasons and is no longer needed. If a key is not enabled and is referenced by a C/AL code or CALCSUMS function, users will get a run-time error. KeyGroups Property Use this property to select the (predefined) key groups to which the key belongs. As soon as developers assign this key to one or more key groups, they can selectively activate or deactivate the keys of various groups by enabling and disabling the key groups. To make use of the key groups for sorting, choose the Key Groups option on the Database Information window which appears when they select File, Database, Information, and then press the Tables button. There are key groups that are defined already, such as Acc(Dim), Item(MFG), and so on, but users can create more and assign them to keys they want to control this way. The purpose of key groups is to make it possible to set up a set of special keys that are used rarely (such as for a special report that is run once every year). Since adding lots of keys to tables will eventually decrease performance, using key groups makes it possible to have the necessary keys defined, but only active when they are really going to be used. 5-26
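As a hedged illustration of the run-time error mentioned for the Enabled property above, the following sketch assumes a Customer key starting with the Salesperson Code field; the key and the message text are illustrative only.
// If the key beginning with "Salesperson Code" has Enabled = No,
// the SETCURRENTKEY call below fails at run time.
Customer.SETCURRENTKEY("Salesperson Code");
IF Customer.FINDFIRST THEN
  MESSAGE('First customer in this sorting: %1', Customer."No.");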

Chapter 5: Improving Application Performance MaintainSQLIndex Property This property determines whether a SQL Server index that corresponds to the Microsoft Dynamics NAV key should be created (when set to Yes) or dropped (when set to No). A Microsoft Dynamics NAV key is created to sort data in a table by the required key fields. However, SQL Server can sort data without an index on the fields to be sorted. If an index exists, sorting by the fields matching the index will be faster, but modifications to the table will be slower. The more indexes there are on a table, the slower the modifications become. In situations where a key must be created to allow only occasional sorting (for example, when running infrequent reports), developers can disable this property to prevent slow modifications to the table. SQLIndex Property This property allows users to define the fields that are used in the SQL index. The fields in the SQL index can: Differ from the fields defined in the key in Microsoft Dynamics NAV. Be arranged in a different order. If the key in question is not the primary key, and the SQLIndex property is used to define the index on SQL Server, the index that is created contains exactly the fields that users specify and may not be a unique index. It will only be a unique index if it contains all the fields from the primary key. When users define the SQL index for the primary key, it must include all the fields defined in the Microsoft Dynamics NAV primary key. Users can add extra fields and rearrange the fields to suit their needs. Be careful when using the property SQLIndex. It can backfire because the SQL Server query and the SQL index will per definition have a mismatch, as described at the following location (http://blogs.msdn.com/nav_developer/archive/2009/04/10/beware-the-sql-indexproperty-on-nav-5-0-sp1.aspx): Clustered Property Use this property to determine which index is clustered. By default, the index that corresponds to Microsoft Dynamics NAV primary key will be made clustered. We recommend that you make sure the primary key and the clustered key is the same. If they are not then SQL Server will add the clustered index fields to every index internally while Microsoft Dynamics NAV will add all the primary key fields causing keys to be very long. 5-27

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 MaintainSIFTIndex This property allows you to determine whether SIFT structures should be created in SQL Server to support the corresponding SumIndexFields for the Microsoft Dynamics NAV key. SumIndexFields are created in Microsoft Dynamics NAV to support FlowField calculations and other fast summing operations. SQL Server can sum numeric data by scanning the table. If the SIFT structures exist for the SumIndexFields, summing the fields is faster, especially for large sets of records, but modifications to the table are slower because the SIFT structures must also be maintained. In situations where SumIndexFields must be created on a key to enable FlowField calculations, but the calculations are performed infrequently or on small sets of data, you can disable this property to prevent slow modifications to the table. Also be aware that even the new implementation that uses indexed views will cause blocking in the database. For example if two users update the total for an account on the same date. Enable/Disable Keys using C/AL Code To make the information in the tables as useful as possible, many of the tables have several predefined sorting keys. Keys can be set up as part of a key group, which you can enable and disable without risk. To add a key to a key group, set the KeyGroups property to the name of an existing key group. Microsoft Dynamics NAV will generally perform better when you disable key groups (because the system does not have to maintain the keys included in the key group). Adding a large number of keys to database tables decreases performance. However, by making the keys members of predefined key groups you can have the necessary keys defined and only activate them when they will be used. 5-28

Chapter 5: Improving Application Performance Key groups can be maintained in the Database Key Groups window. To open the window, select File > Database Information > Tables and click the Key groups button. FIGURE 5.4 THE DATABASE KEY GROUPS WINDOW In this window you can enable and disable existing key groups. You can also add new and delete existing key groups. If you delete an existing key group, all keys that belong to this group will be disabled. In Microsoft Dynamics NAV 5.0, key groups can also be enabled and disabled using C/AL Code. To do this, the following instructions have been introduced: KEYGROUPDISABLE KEYGROUPENABLE KEYGROUPENABLED KEYGROUPDISABLE allows you to disable a key group and all related keys in all tables. KEYGROUPENABLE does the opposite, it allows you to enable a key group and all related keys. KEYGROUPENABLED allows you to check whether a key group is currently enabled. The following code sample shows how to activate the ABC key group, run some code, and disable the key group again. KEYGROUPENABLE('ABC');... KEYGROUPDISABLE('ABC'); Enabling a key group can take some time, depending on the number of keys and the amount of data. 5-29
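As a hedged extension of the sample above, KEYGROUPENABLED can be used to enable a key group only for the duration of a rarely used report and then restore the original state; the group name 'ABC' and the report ID are illustrative.
// WasEnabled is a Boolean variable.
WasEnabled := KEYGROUPENABLED('ABC');
IF NOT WasEnabled THEN
  KEYGROUPENABLE('ABC');
REPORT.RUN(50000); // a report that needs the keys in the ABC group
IF NOT WasEnabled THEN
  KEYGROUPDISABLE('ABC');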

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Locks, Blocks and Deadlocks When data is read from the database, Microsoft Dynamics NAV, outside a transaction or in the Browse/Update No Locks transaction mode, uses the READUNCOMMITTED isolation level, meaning that any other user can modify the records that are currently being read. Data that is read is considered "dirty" because it can be modified by another user. When data is modified, Microsoft Dynamics NAV reads the record again with the UPDLOCK isolation level and compares the timestamp of the record. If the record is 'old,' the following Microsoft Dynamics NAV error displays: "Another user has modified the record after you retrieved it from the database." The tradeoff is that care must be taken when writing code that modifies the data. This requires that locking and blocking be employed to synchronize access, but deadlocks - a condition where one or more competing processes are queued indefinitely - can occur as a side-effect. The following subtopics discuss strategies for synchronizing data access while avoiding deadlocks. Every write transaction implies an automatic implicit lock and unlock. Explicit locking is also possible using the LOCKTABLE instruction. Explicit locking is necessary to preserve data consistency during complex processes, such as the Posting function. The isolation level can be changed to a more restrictive setting, such as UPDLOCK. At this level, records that are read are locked, meaning that no other user can modify them. This is referred to as pessimistic locking, and it causes the server to protect the record in case there is a need to modify it - making it impossible for others to modify. An example of a lock of a customer record is as follows: Cust.LOCKTABLE; Cust.GET('10000'); // Customer 10000 is locked Cust.Blocked := TRUE; Cust.MODIFY; COMMIT; // Lock is removed If the record is not locked up front, the following situation can occur:
User A: Cust.GET('10000'); -- User A reads the record without any lock.
User B: Cust.GET('10000'); -- User B reads the same record without any lock.
User B: Cust.Blocked := TRUE; Cust.MODIFY; COMMIT; -- SUCCESS. User B modifies the record.
5-30

Chapter 5: Improving Application Performance
User A: Cust.Blocked := FALSE; Cust.MODIFY; -- ERROR. User A gets an error: "Another user has modified the record after you retrieved it from the database."
Blocking When other users try to lock data that is currently locked, they are blocked and have to wait. If they wait longer than the defined time-out, they receive the following Microsoft Dynamics NAV error: "The ABC table cannot be locked or changed because it is already locked by the user who has User ID XYZ." If you receive this error, you can change the default time-out with File, Database, Alter, Advanced tab, using the Lock Timeout check box and the Timeout duration (sec) value. Based on the previous example, where two users try to modify the same record, the data that is intended to be modified can be locked up front. This prevents other users from doing the same. This is shown in the following example:
User A: Cust.LOCKTABLE; Cust.GET('10000'); -- User A reads the record with a lock.
User B: Cust.LOCKTABLE; Cust.GET('10000'); -- User B tries to read the same record with a lock. User B waits and is blocked, because the record is locked by User A.
User A: Cust.Blocked := FALSE; Cust.MODIFY; COMMIT; -- SUCCESS. User A modifies the record. User B is kept waiting. The lock is released and the data is now sent to User B.
User B: Cust.Blocked := TRUE; Cust.MODIFY; COMMIT; -- SUCCESS. User B successfully modifies the record. The lock is released.
There is a potential situation in which blocking cannot be resolved by the server in a good way. The situation arises when one process is blocked because another process has locked some data, while the other process is also blocked because it tries to lock the first process's data. Only one of the transactions can be finished. SQL Server terminates the other and sends the following error message to the client: "Your activity was deadlocked with another user" 5-31

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 For example, consider a case in which two users are working concurrently and each tries to get a record that the other has blocked, as shown in this pseudo code:
User A: Cust.LOCKTABLE; Vend.LOCKTABLE;  User B: Cust.LOCKTABLE; Vend.LOCKTABLE;  -- LOCKTABLE indicates that the next read will use UPDLOCK.
User A: Cust.FINDFIRST; -- A blocks Record 1 from the Customer table.
User B: Vend.FINDFIRST; -- B blocks Record 1 from the Vendor table.
User A: Vend.FINDFIRST;  User B: Cust.FINDFIRST;  -- A wants B's record, while B wants A's record. A conflict occurs.
SQL Server detects the deadlock and arbitrarily chooses one process over the other, so one of them will receive an error. In this example, User A's read succeeds (SUCCESS), while User B receives the error "Your activity was deadlocked with another user" (ERROR). Because SQL Server supports record-level locking, there may be situations where these two activities bypass one another without any problems. Suppose that, in this example, user A and user B each try to read the last record in the other table: no conflict arises, because no records are in contention. The deadlock will only occur if they lock the same rows but in a different order. Note that a deadlock would still occur if one of the tables were empty or contained only a few records. A large number of deadlocks can lead to major customer dissatisfaction, but deadlocks cannot always be avoided completely. To minimize the number of deadlocks, do the following: Process tables in the same sequence. Process records in the same order. Keep the transaction length to a minimum. If all code always processed the data in the same order, then there would be no deadlocks, only locks (a sketch of this rule follows below). One reason why deadlocks are so expensive is that SQL Server does not immediately discover a deadlock. The initial deadlock discovery frequency is 5 seconds. 5-32
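The "process tables in the same sequence" guideline can be sketched as follows: if every routine that needs both tables always reads and locks the Customer record before the Vendor record, two sessions can still block each other, but they cannot deadlock. The tables and the use of FINDFIRST are illustrative.
// Every routine uses the same locking order: Customer first, then Vendor.
Cust.LOCKTABLE;
IF NOT Cust.FINDFIRST THEN
  EXIT;
Vend.LOCKTABLE;
IF NOT Vend.FINDFIRST THEN
  EXIT;
// ... modify Cust and Vend here ...
COMMIT; // all locks are released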

Chapter 5: Improving Application Performance Graphical User Interface Although locking and blocking are necessary to support concurrency, it can lead to decreased performance, especially when tables are locked longer than necessary. To reduce locking time, you can do the following: Keep the lock time to a minimum by locking the resources as late as possible and releasing the locks as soon as possible. Test data validity for an entire transaction before starting the transaction. Keep transactions as short as possible. Use adequate keys. Never allow user input during a transaction. Test conditions of data validity before the start of locking. Allow some time gap between heavy processes so that other users are less affected. Make sure SQL Server has sufficient memory. If the transaction is too complex or there is limited time, consider discussing with the customer the possibility of over-night processing of heavy jobs by using a Microsoft Dynamics NAV Application Server. This avoids the daily concurrency complexity and the high costs of rewriting the code. If the over-night processing is not possible because of the complexity of the processes, as a last resort, revert to serializing the code by ensuring that conflicting processes cannot execute in parallel. This can be done by creating a so-called locking semaphore table that is locked at the beginning of a transaction. As long as the table remains locked, other users cannot start the same transaction. This avoids deadlocks, but at the same time, it affects concurrency. For more information about locking order rules, see the Performance Audits chapter. When upgrading older installations to SQL Server 2005, users can experience poor performance when they search and filter on data in Microsoft Dynamics NAV and when they open and browse lists. Two main problems were identified: SQL Server has a feature called "parameter sniffing," which may cause suboptimal plans to be used by SQL Server. Microsoft Dynamics NAV queries that inherently return an empty result set may cause poor response times. 5-33

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Parameter Sniffing Microsoft Dynamics NAV uses queries that contain parameters on SQL Server. The first time a Microsoft Dynamics NAV query is executed, SQL Server calculates a plan for accessing the data in the most efficient way. This plan is based on the actual values in the search criteria and parameters. SQL Server then caches this plan and reuses it for identical queries. Because the cached plan is built from the parameter values of the first execution, this behavior is known as "parameter sniffing." It may lead to suboptimal performance because the first values sent may not be representative of subsequent queries. In Microsoft Dynamics NAV 5.0 SP1 the way that queries are sent to SQL Server has been restructured. As of this version, SQL Server will make query plans that are optimized for average parameter values rather than extreme parameter values. This method ensures that SQL Server makes the plan itself, without forcing it in a certain direction with index hints or the recompile option. Microsoft Dynamics NAV 5.0 SP1 issues statements that disable parameter sniffing. This is done by forcing the SQL client not to defer the plan calculation until the statement is executed. This requires an extra database roundtrip the first time an SQL statement is constructed or sent. A built-in statement cache means that the extra roundtrip will only occur once if users are working in an isolated area of the application. This change guards against suboptimal plans, but at the same time it inflicts an extra roundtrip on the database. Also, an installation may happen to issue only queries whose parameters fit the "suboptimal" plan, in which case the new behavior may actually be worse. This method guarantees that SQL Server's query plan will not be affected by the parameter values. It means that sometimes SQL Server is prevented from making the optimal query plan for a certain set of parameter values. But remember that the query plan will be reused for other parameter values. So, at the expense of having a few highly optimized queries, the method provides optimized queries with better consistency. Another cost of this method is that Microsoft Dynamics NAV now requires an extra roundtrip to SQL Server. However, this only happens the first time the query is run. If the same query is run again, Microsoft Dynamics NAV will only run the second query (sp_cursorexecute). You can revert to the old behavior by modifying the contents of the $ndo$dbproperty table as follows. UPDATE [$ndo$dbproperty] SET [diagnostics]=[diagnostics]+1048576 5-34

Chapter 5: Improving Application Performance Adding the 1048576 value to the Diagnostics column of the row in ndo$dbproperty will turn off the No Deferred Prepare behavior. The behavior will be the same as in Microsoft Dynamics NAV 4.0 SP3. As an alternative solution, you can manually implement plan guides for poorly performing queries. To implement a plan guide, you should know the combination of the query and the parameters for which you implemented the plan guide. Another possibility is using the $ndo$dbconfig table to add the OPTION (RECOMPILE) query hint. The OPTION (RECOMPILE) query hint instructs the instance of SQL Server to compile a new query plan for the query instead of using a cached plan. Each plan guide works for a specified query. However, you can use a $ndo$dbconfig table to add the OPTION (RECOMPILE) query hint for all queries. To do this, perform the following steps: 1. Run the following script to create the $ndo$dbconfig table in the Microsoft Dynamics NAV database: -- Step 1 CREATE TABLE [$ndo$dbconfig] (config VARCHAR(512) NOT NULL) -- Step 2 GRANT SELECT ON [$ndo$dbconfig] TO public -- Step 3 INSERT INTO [$ndo$dbconfig] VALUES('UseRecompileForTable="G/L Entry"; Company="CRONUS International Ltd."; RecompileMode=1;') 2. Grant the SELECT permission to the Public role for the $ndo$dbconfig table, as shown in Step 2 in the script here. 3. Use the $ndo$dbconfig table to specify the tables for which you want to add the OPTION (RECOMPILE) query hint. Step 3 in the script here shows how to add the OPTION (RECOMPILE) query hint for queries in the G/L Entry table. To add the OPTION (RECOMPILE) query hint for other tables, you can create a new line in the $ndo$dbconfig table for each table. You do not need to specify the value of the Company parameter if you want to add the OPTION (RECOMPILE) query hint for all companies in the database. 5-35
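For example, an entry that applies the hint to the Item Ledger Entry table in every company could be added by simply omitting the Company parameter, as in this hedged sketch (the table choice is illustrative):
INSERT INTO [$ndo$dbconfig]
VALUES('UseRecompileForTable="Item Ledger Entry"; RecompileMode=1;')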

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Index and Rowlock Hinting The RecompileMode parameter can be a value from 0 to 3. The RecompileMode parameter values represent the following modes: o 0: Do not use the OPTION (RECOMPILE) query hint. o 1: Use the OPTION (RECOMPILE) query hint when you browse the table in a form. The default value is 1. o 2: Use the OPTION (RECOMPILE) query hint with operations that are caused by the C/AL code. o 3: Always use the OPTION (RECOMPILE) query hint. Use the default value of the RecompileMode parameter unless you need to use other recompile modes. RECOMPILE consumes CPU time, so it can also degrade performance, especially with RecompileMode 3, but also with the other recompile modes. Queries Returning Empty Result Sets It is possible to force SQL Server to use a particular index when executing queries for FIND and GET statements. This can be used as a workaround when SQL Server's Query Optimizer picks the wrong index for a query. What is Index Hinting? Index hinting can help avoid situations where SQL Server's Query Optimizer chooses an index access method that requires many page reads and generates long-running queries with response times that vary from seconds to several minutes. Selecting an alternative index can give instant 'correct' query executions with response times of milliseconds. This problem usually occurs only for particular tables and indexes that contain certain data spreads and index statistics. In the rare situations where it is necessary, you can direct Microsoft Dynamics NAV to use index hinting for such problematic queries. When you use index hinting, Microsoft Dynamics NAV adds commands to the SQL queries that are sent to the server. These commands bypass the usual decision-making of SQL 5-36

Chapter 5: Improving Application Performance Server's Query Optimizer and force the server to choose a particular index access method. WARNING: This feature should only be used after all the other possibilities have been exhausted, for example, updating statistics, optimizing indexes or reorganizing column order in indexes. Setup Index Hinting To set up index hinting, you must first create a configuration parameter table in the Microsoft Dynamics NAV database that contains the index hints. To create the table, you can use the following script: CREATE TABLE [$ndo$dbconfig] (config VARCHAR(512) NOT NULL) GRANT SELECT ON [$ndo$dbconfig] TO public Next, you need to enter parameters into the table that will determine some of the behavior of Microsoft Dynamics NAV when it is using this database. You can add additional columns to this table. The length of the config column should be large enough to contain the necessary configuration values, as explained, but does not have to be 512. The following examples show how you can add index hints for specific tables. INSERT INTO [$ndo$dbconfig] VALUES('IndexHint=Yes; Company="CRONUS International Ltd."; Table="Item Ledger Entry"; Key="Item No.","Variant Code"; Search Method="-+";Index=3') This statement will hint the use of the $3 index of the CRONUS International Ltd_$Item Ledger Entry table for FIND('-') and FIND('+') statements when the Item No.,Variant Code key is set as the current key for the Item Ledger Entry table in the CRONUS International Ltd. company. To disable the hint, either delete the hint or execute the same statement with IndexHint=No; Note that the following: If the company is not supplied, the entry will match all the companies. If the search method is not supplied, the entry will match all the search methods. If the index ID is not supplied, the index hinted is the one that corresponds to the supplied key. In most cases this is the desired behavior. If the company/table/fields are renamed or the table's keys redesigned, the IndexHint entries must be modified manually. 5-37
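As described above, the hint can be disabled by executing the same statement with IndexHint=No; a sketch of that disabling entry looks as follows:
INSERT INTO [$ndo$dbconfig]
VALUES('IndexHint=No; Company="CRONUS International Ltd."; Table="Item Ledger Entry"; Key="Item No.","Variant Code"; Search Method="-+";Index=3')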

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Bulk Insert The following statement will hint the use of the $3 index of the CRONUS International Ltd_$Item Ledger Entry table for every search method when the Item No.,Variant Code key is set as the current key for the Item Ledger Entry table in the CRONUS International Ltd. company. INSERT INTO [$ndo$dbconfig] VALUES('IndexHint=Yes; Company="CRONUS International Ltd."; Table="Item Ledger Entry"; Key="Item No.","Variant Code"; Search Method=;Index=3') Rowlock Hinting SQL Server has an advanced locking mechanism that decides how data will be locked, either by table, by page, or by row. Although SQL Server tends to apply record-level locking, SQL Server can decide to escalate multiple row locks into a page or a table lock to free system resources. This is better for performance, but not for concurrency (as table and page locks might lock too much data). To prevent SQL Server from choosing a locking method, you can activate the Always Rowlock database option, so that Microsoft Dynamics NAV will send ROWLOCK hints to the SQL Server with every query. By default, the Always rowlock option is not enabled. Without ROWLOCK hints, SQL Server can decide at what level it will lock. The advantage is that SQL Server requires less memory to maintain all the locks, so performance will increase. The disadvantage is that page locking is not as finegrained as record locking. Therefore a user may be locking too many records. The probability of getting blocks is greater, so concurrency is reduced. Activating row locking keeps the lock granularity small and reduces the probability of blocks and locks. The disadvantage is that administering all row locks requires more system memory and creates an additional load on the master database. If you have a high transaction volume dealing with large result sets, row locking can cause an overall decrease of performance (if the master database reacts too slowly due to the high number of lock administrations). We do not recommend activating this option. If you do activate it, make sure that SQL Server has sufficient memory to maintain the locks as row locks require more memory. Analysis of customer feedback has shown that many performance problems are related to locking and long-running transactions. In particular, INSERTs were causing poor performance because of many server roundtrips, update of SIFT tables (one INSERT to the General Ledger Entry table caused 24 additional INSERT calls), and many indexes. 5-38

Chapter 5: Improving Application Performance To resolve the scalability and general performance issues, changes were made to include automatic bulk inserts. The automatic bulk insert feature has nothing to do with SQL Server bulk inserts. Microsoft Dynamics NAV 5.0 SP1 automatically buffers inserts to send them to SQL Server at the same time. By using bulk inserts, the number of server calls is reduced and performance is improved. This feature also improves scalability by delaying the insert until the last possible moment in the transaction. This reduces the time that records are locked and also delays the implicit contention on SIFT indexes. Software developers who want to write high performance code which uses this feature should understand the following. Records are sent to SQL Server: When COMMIT is called, either explicitly or when execution of a code unit ends. When you call MODIFY or DELETE on the table. When you call any FIND, CALCFIELDS, or CALCSUMS on the table. Records are not buffered if you are using the return value from an INSERT call. For example, if you write "IF (GLEntry.INSERT) THEN", records are not buffered if any of the following conditions are true: The table where you insert the records contains BLOB fields. The table where you insert the records contains Variant fields. The table where you insert the records contains RecordID fields. The table where you insert the records contains fields that have the AutoIncrement property set to Yes. 5-39
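The return-value rule above can be illustrated with a hedged sketch: the first INSERT can be buffered and sent later in bulk, while the second cannot be buffered because its return value is consumed immediately. The record variables and the message are illustrative only.
GLEntry.INSERT; // can be buffered and sent together with the other inserts
IF GLEntry2.INSERT THEN // return value is used, so this row is sent to SQL Server immediately
  MESSAGE('Entry %1 was inserted.', GLEntry2."Entry No.");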

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Best Practices Bulk Insert Example The following code shows a loop of INSERTs into the General Ledger Entry table. Nothing is inserted into the table before the code reaches the COMMIT statement at the end of the example. IF (JnlLine.FINDSET) THEN BEGIN GLEntry.LOCKTABLE; IF NOT GLEntry.FINDLAST THEN GLEntry."Entry No." := 0; REPEAT GLEntry."Entry No." := GLEntry."Entry No." + 1; GLEntry.INSERT; UNTIL JnlLine.NEXT = 0; END; COMMIT; //All INSERTs are sent. This lesson contains some general guidelines on how to optimize Microsoft Dynamics NAV on SQL Server. Performance Strategies and Tuning Checklist When you are responsible for maintaining a Microsoft Dynamics NAV database and keeping it running as efficiently as possible, it is best to follow a tuning methodology. This section describes a basic tuning methodology that can be used as a guideline and can be adapted to your individual needs. The methodology consists of the following eight steps: 1. Define the problem/issue 2. Monitor the system 3. Analyze monitoring results 4. Create a hypothesis 5. Propose a solution 6. Implement changes 7. Test the solution 8. Return to step 2 5-40

Chapter 5: Improving Application Performance Define the Problem/Issue You first need to understand and document the problem and the environment. Determine the problem. Document and validate parameters: database size, tuning parameters, There are many checklists available or you can create your own. Look at the system as a whole. This step is important to determine how you will approach the problem or where the bottleneck might be located. Remember to talk to both the IT staff and the end-users. Inquire what they were doing when the problem occurs. Investigate the problem. Monitor the System Monitoring the system is used for discovering the problem or tuning the system. The goal of this step is to collect baseline information and to make an initial determination of the possible problem(s). Use the following tool(s) to monitor the system: Operating system tools: Performance Monitor, Task Manager, Event Viewer. SQL Server tools: Error log, system tables, Dynamic Management Views, Activity Monitor, SQL Server Profiler, Database Engine Tuning Advisor. Analyze SQL Server, the Operating system and hardware parameters. Baseline information is very important to determine initial issues and to determine whether changes make an improvement. This step should be documented in detail. Analyze monitoring results When you have completed an initial assessment and collected data, you must analyze and interpret this data. This analysis is important because it allows you to determine the problem and its cause. The analysis should be done carefully and should include the following areas of study: Analyze monitoring data. Review error logs. View customer performance data from their monitoring software. 5-41

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 This assessment should be documented. It will be the basis of your report to the customer or management. It should also include data about the following: CPU utilization. I/O utilization and response time. Memory utilization. Errors reported in the error log. Wait statistics (if available). By carefully analyzing performance data, you may be able to determine the problem immediately, or you might be able to create a theory about possible contributing factors of the problem. This step and the next may benefit from having more than one person participate to provide ideas, experience, and guidance. Create a Hypothesis When you have analyzed the monitoring and log data, you are ready to formulate a theory about the cause of the problem. This may sound more complex than it actually is. Creating a hypothesis is as simple as formulating a theory and documenting it. If you do not document the hypothesis, it can be easy to stray from proper testing of that hypothesis. The goal is to determine what the problem is. Create a theory: an I/O problem, a locking problem. Document your theory. Back up the theory with data. Propose a Solution After you have created the hypothesis, you are ready to develop a solution to the performance problem. In many cases, you will be unable to immediately solve the problem. Instead, you may need to develop a test to further narrow down the problem. Your test can be designed to isolate part of the problem or to improve some aspect of the system. The solution consists of the following topics: Develop a solution. Develop a validation plan. Document expected results. 5-42

Chapter 5: Improving Application Performance Implement Changes After you have theorized the problem and developed a solution or test, it is time to implement changes. These change implementations can take the following forms: A hardware change. A configuration parameter change. Adding an index. Changing C/AL code. Implementing change should be done very carefully. Changes should be categorized to no risk, moderate risk and high risk. If possible, first test the change on a test system, before you implement it on production. Test the Solution The final step is to actually run the test. Some tips and best practices for changes are as follows: Change only one thing at a time. Document the result of the change. Compare performance after the test to the baseline metrics. If possible, test the change in a nonproduction environment. If possible, run load tests. Return to Step 2 After you have started testing the solution, return to step 2: Monitor the System, to collect data about the state of the system while the test is running. Follow the methodology until you run out of time, budget, or problems. By documenting each step, you will get better results and be better able to create professional and complete reports on the engagement, the problem, the solution and the results. Storage Top 10 Best Practices Correct configuration of I/O subsystems is critical for optimal performance and operation of SQL Server systems. The following are some of the most common best practices that the SQL Server team recommends with respect to storage configuration for SQL Server. 5-43

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Understand the I/O characteristics of SQL Server and the specific I/O requirements / characteristics of your application To be successful in designing and deploying storage for your application, you need to understand your application's I/O characteristics and SQL Server I/O patterns. Performance monitor is the best place to capture this information for an existing application. Some of the questions you should ask yourself are as follows: What is the read versus write ratio of the application? What are the typical I/O rates (I/O per second, MB/s and size of the IOs)? Monitor the perfmon counters: o Average read bytes/sec, average write bytes/sec o Reads/sec, writes/sec o Disk read bytes/sec, disk write bytes/sec o Average disk sec/read, average disk sec/write o Average disk queue length More / faster spindles are better for performance Ensure that you have a sufficient number of spindles to support your I/O requirements with an acceptable latency. Use filegroups for administration requirements such as backup / restore, partial database availability, and so on. Use data files to "stripe" the database across your specific I/O configuration (physical disks, LUNs, and so on). Try not to "over" optimize the design of the storage; simpler designs generally offer good performance and more flexibility Unless you understand the application very well avoid trying to over optimize the I/O by selectively placing objects on separate spindles. Be sure to consider the growth strategy up front. As your data size grows, how will you manage growth of data files / LUNs / RAID groups? It is much better to design for this in the beginning than to rebalance data files or LUN(s) later in a production deployment. Validate configurations before deployment Do basic throughput testing of the I/O subsystem before deploying SQL Server. Make sure these tests achieve your I/O requirements with an acceptable latency. Understand that the of purpose running the SQLIO tests is not to simulate SQL Server's exact I/O characteristics but to test maximum throughput achievable by the I/O subsystem for common SQL Server I/O types. 5-44

Chapter 5: Improving Application Performance Always put log files on RAID 1+0 (or RAID 1) disks This provides the following: Better protection from hardware failure. Better write performance. In general RAID 1+0 provides better throughput for write-intensive applications. The performance gained varies based on the hardware vendor's RAID implementations. The most common alternative to RAID 1+0 is RAID 5. Generally, RAID 1+0 provides better write performance than any other RAID level providing data protection. This includes RAID 5. Isolate log from data at the physical disk level When this is not possible (for example, consolidated SQL environments) consider I/O characteristics and group similar I/O characteristics (all logs) on common spindles. Combining heterogeneous workloads (workloads with very different I/O and latency characteristics) can have negative effects on overall performance (for example, placing Exchange and SQL data on the same physical spindles). Consider configuration of TEMPDB database Be sure to move TEMPDB to adequate storage and pre-size after you install SQL Server. Performance may benefit if TEMPDB is placed on RAID 1+0 (dependent on TEMPDB usage). For the TEMPDB database, create 1 data file per CPU. Lining up the number of data files with CPUs has scalability advantages for allocation intensive workloads We recommend that you have 0.25 to 1 data files (per filegroup) for each CPU on the host server. This is especially true for TEMPDB where the recommendation is 1 data file per CPU. Dual core processors count as two CPUs; logical processors (hyperthreading) do not. 5-45
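As a hedged sketch of the TEMPDB recommendation above, an additional, equally sized data file can be added with ALTER DATABASE; the file name, path, size, and growth values are placeholders that should match your own storage layout and CPU count.
-- Add one extra TEMPDB data file; repeat (tempdev3, tempdev4, ...) as needed per CPU.
ALTER DATABASE tempdb
ADD FILE (NAME = tempdev2,
          FILENAME = 'T:\TempDB\tempdev2.ndf',
          SIZE = 4096MB,
          FILEGROWTH = 512MB);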

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Do not overlook some of SQL Server basics Data files should be of equal size - SQL Server uses a proportional fill algorithm that favors allocations in files with more free space. Pre-size data and log files. Do not rely on AUTOGROW, instead manage the growth of these files manually. You may leave AUTOGROW ON for safety reasons, but you should proactively manage the growth of the data files. Do not overlook storage configuration bases Use up-to-date drivers recommended by the storage vendor. Use storage vendor specific drivers from the manufacture's Web site. Ensure that the storage array firmware is up to the latest recommended level. Use multipath software to achieve balancing across HBAs and LUNs and ensure this is functioning correctly. Focus on Application Speed Performance issues are not always caused by missing indexes. Sometimes the more indexes you define, the more indexes need to be maintained, resulting in faster reads and slower updates. Many performance issues are caused by long-running transactions. If the C/AL code is not optimized, adding indexes will not solve the problem, and the transaction will still take longer than necessary. Therefore, it is very important to focus on the application speed before adding indexes. Keeping the application as fast as possible will solve or avoid many problems. Instead of only focusing on getting the right results, developers should always keep performance in their mind and focus on getting the right results the right way. Minimize the Number of Keys Do not maintain indexes that are only used for sorting purposes. SQL Server will sort the result set. The main focus should be on quickly retrieving the result set. If you, for example, have several indexes that start with the same combination of keys (index fields), you should maintain only one of them on SQL Server. Hopefully you can identify the one that is most used in the most situations. In Microsoft Dynamics NAV, you should then use the MaintainSQLIndex property of the other indexes to specify that they should no longer be maintained on SQL Server (without disabling the key). 5-46

Chapter 5: Improving Application Performance Avoid having too many indexes on hot tables because each record update means an index update producing more disk I/Os. For example, if the Item Ledger Entry table is growing by 1000 records per day and has 20 indexes, it can easily produce more than 20000 disk I/Os per day. However, if you reduce the number of indexes to for example 5, it greatly reduces the number of disk I/Os. Experience has shown that it is always possible to reduce the number of indexes to between 5 and 7 and even less on hot tables. When minimizing the number of keys, you can use two methods: Keep the existing indexes and gradually disable them one by one, until you experience performance issues. This way, you know which index caused the performance issue and you can enable it again. Disable all indexes and activate them one by one. This way, you know which index caused the performance increase and you can disable the other indexes. Indexes per Table The following script lists all the indexes in a Microsoft Dynamics NAV database by table: SELECT OBJECT_NAME(id) AS [Object Name], name AS [Index Name] FROM sysindexes WHERE (name NOT LIKE '_WA_%') AND (id > 255) ORDER BY [Object Name] Number of Indexes per Table The following query will show the number of indexes per table in a Microsoft Dynamics NAV database sorted by the number of indexes: SELECT OBJECT_NAME(id) AS [Object Name], COUNT (id) AS [No. of Indexes] FROM sysindexes WHERE (name NOT LIKE '_WA_%') AND (id > 255) GROUP BY id ORDER BY [No. of Indexes] DESC, [Object Name] 5-47
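On SQL Server 2005 and later, the same count is also available from the newer catalog views; the following is a hedged equivalent of the second query above.
SELECT OBJECT_NAME(i.object_id) AS [Object Name],
       COUNT(*) AS [No. of Indexes]
FROM sys.indexes i
JOIN sys.tables t ON t.object_id = i.object_id
WHERE i.name IS NOT NULL
  AND i.name NOT LIKE '_WA_%'
GROUP BY i.object_id
ORDER BY [No. of Indexes] DESC, [Object Name]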

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009

Minimize the Number of SIFT Buckets

In versions before Microsoft Dynamics NAV 5.0 SP1, minimize the number of buckets maintained for each SIFT index. There is, for example, no reason to maintain the daily bucket for the Cust. Ledger Entry table if you only post several invoices and payments a month for the same customer. Furthermore, if you have several SIFT indexes defined on a table that you design, investigate whether some of the buckets are already maintained by another index. For example, if you have two indexes Customer No.,Currency Code and Customer No.,Open which both maintain Amount (LCY) sums, you could disable the bucket that maintains the totals per Customer No. in one of the indexes.

Consider the following example. If a SIFT table contains a lot of buckets, every single update of the 'source' table produces a large number of bucket updates. Every time you insert a record into the source table, Microsoft Dynamics NAV must update the 'source' record, its indexes (all or some keys), and all the associated buckets. A single insert could produce more than 100 I/Os on the disk subsystem. Obviously, the smaller the records in the table, the smaller the problems associated with the indexes and the SIFT indexes become.

To see the number of SIFT buckets per table, use the following script.

SET NOCOUNT ON
-- DROP TABLE ##SIFTtables
CREATE TABLE ##SIFTtables (
  [table_name] VARCHAR(255) DEFAULT '',
  [bucks] INT DEFAULT 0
)
DECLARE @Statement CHAR (255)
DECLARE @tname sysname
DECLARE Get_Curs CURSOR FOR
  SELECT name FROM sysobjects
  WHERE OBJECTPROPERTY(id, N'IsUserTable') = 1
    AND name LIKE '%'+'$'+'[0-9]'+'%'
  ORDER BY name
OPEN Get_Curs
FETCH NEXT FROM Get_Curs INTO @tname
WHILE @@FETCH_STATUS = 0
BEGIN
  INSERT INTO ##SIFTtables (bucks)
    EXEC('SELECT COUNT(DISTINCT bucket) AS bucks FROM ['+@tname+']')
  UPDATE ##SIFTtables SET table_name = @tname WHERE table_name = ''
  FETCH NEXT FROM Get_Curs INTO @tname
END

5-48

Chapter 5: Improving Application Performance

CLOSE Get_Curs
DEALLOCATE Get_Curs
SELECT table_name AS [Table Name], bucks AS [No. Of Buckets]
FROM ##SIFTtables

However, if your database has no data, or some SIFT source tables have not been populated, the query will show few or no buckets.

Use Key Groups

When creating new keys, make sure to add them to a key group. This allows other developers to see what a key is used for. If necessary, you can create additional key groups. By using key groups, you can also enable or disable the keys (manually or with C/AL code). Disabling a key group often results in better performance, because fewer indexes need to be maintained.

Key Selectivity

Redesign indexes so that their selectivity becomes higher. Do not place Boolean and option fields at the beginning of an index, and always put date fields toward the end of the index. An index like Document Type,Customer No. has very low selectivity on the first key. You can create a new index Customer No.,Document Type, maintain it on SQL Server, and turn off the maintenance of the original index on SQL Server. With the changes in Microsoft Dynamics NAV 5.0 SP1 to avoid parameter sniffing, it is no longer as important to have the most selective fields at the left of the index, because the selectivity of the entire index is now what matters. It used to be important because SQL Server would only consider the histogram of the first field in the index.

Ask for Assistance

If you encounter performance issues and cannot locate the cause of the problems, ask for assistance before changing the hardware configuration or the database design.

5-49
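Returning to the Key Selectivity guidance above, the following is a hedged T-SQL sketch for estimating how selective a field is as the leading index column, by comparing the number of distinct values to the total number of rows. The table and field names are examples only and are not taken from a specific exercise in this course.

-- Sketch: estimate the selectivity of a candidate leading index field.
-- Table and column names are examples; substitute your own.
SELECT
  COUNT(DISTINCT [Document Type]) AS DistinctValues,
  COUNT(*) AS TotalRows,
  CAST(COUNT(DISTINCT [Document Type]) AS float) / NULLIF(COUNT(*), 0)
    AS Selectivity   -- closer to 1 means more selective
FROM [CRONUS International Ltd_$Cust_ Ledger Entry];

Alternatively, DBCC SHOW_STATISTICS on an existing index shows the density of its statistics, which gives a similar indication of selectivity.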

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Summary In this chapter you learn the key areas in application development that are important for performance. The chapter explains how to create optimized keys and how to read data in an optimal way using FINDFIRST, FINDLAST, and FINDSET. In addition, the chapter explains how you can avoid deadlocks. Problem solving is considered one of the most complex of all intellectual functions. This section provides tips, techniques, and methods to more easily perform troubleshooting and tuning exercises. With all these tasks, process is very important. It is through a systematic approach that you can determine the cause and solution(s) to any type of problem. 5-50

Chapter 5: Improving Application Performance

Test Your Knowledge

Test your knowledge with the following questions.

1. Put the following tuning methodology steps in the correct order:

Step:
___ : Return to step 2
___ : Analyze monitoring results
___ : Test the solution
___ : Propose a solution
___ : Create a hypothesis
___ : Monitor the system
___ : Define the issue.
___ : Implement changes

2. Why is the Find As You Type feature bad for performance on SQL Server?

5-51

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009

3. How is SIFT implemented in Microsoft Dynamics NAV?

4. What do you know about RowLock Hinting?

5. What determines the physical storage order of records in a table?
( ) The Primary Key
( ) The Clustered Index
( ) The Timestamp column
( ) The Record No. column

6. Which statement is recommended when you want to browse a record set and modify a field that is not part of the current sorting?
( ) FINDSET(TRUE, TRUE)
( ) FINDSET(TRUE, FALSE)
( ) FINDSET(FALSE, TRUE)
( ) FINDSET(FALSE, FALSE)

5-52

Chapter 5: Improving Application Performance

7. What is true about the FINDSET statement from a performance point of view? (Select all that apply)
( ) FINDSET should be used to check whether one or more records exist that meet specific criteria.
( ) FINDSET returns a read-only record set.
( ) FINDSET only allows you to browse a record set from the top down.
( ) LOCKTABLE and FINDSET should be used to read large record sets, as opposed to using FINDSET(TRUE).

Fill in the blanks to test your knowledge of this section.

8. The _______ function allows you to determine whether a C/SIDE table or a filtered set of records is empty.
9. To enable a key group in C/AL code, you must use the _______ function.
10. In Microsoft Dynamics NAV 5.0 SP1, SIFT tables have been replaced by _______.
11. _______ should never be used in combination with REPEAT/UNTIL NEXT.
12. The _______ property can be used to optimize a Microsoft Dynamics NAV key on SQL Server.
13. SQL Server prefers keys with a high _______.
14. A table without a clustered index is called a _______.

5-53

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009

Lab 5.1 - Optimize C/AL Code for Performance

In this lab you optimize a piece of C/AL code for performance. Although this lab can be done individually by the students, it is meant to start a class discussion. The optimization of this piece of code depends on several factors, such as the number of sales headers and the number of sales lines per sales header. Multiple solutions are possible.

Scenario

Using SQL Server Profiler and Windows Performance Monitor, Tim, the IT manager, has found that a particular function in Microsoft Dynamics NAV causes serious performance issues. Tim asks Mort, the IT Systems Developer, to review this function.

Challenge Yourself!

Optimize the following code for best performance:

SalesHdr.SETRANGE("Shipment Date", 0D);
SalesHdr.SETFILTER(
  "Document Type", '%1', SalesHdr."Document Type"::Order);
SalesHdr.LOCKTABLE;
SalesHdr.FIND('-');
NoOfRecords := SalesHdr.COUNT;
IF CONFIRM('Do you want to update all %1 Sales Orders?',
  FALSE, NoOfRecords)
THEN BEGIN
  REPEAT
    SalesHdr.TESTFIELD(Status, SalesHdr.Status::Released);
    SalesLine.SETRANGE(
      "Document Type", SalesHdr."Document Type");
    SalesLine.SETRANGE("Document No.", SalesHdr."No.");
    SalesLine.SETRANGE("Shipment Date", 0D);
    WHILE SalesLine.FIND('-') DO BEGIN
      SalesLine.LOCKTABLE(FALSE, TRUE);
      IF SalesLine."Shipment Date" = 0D THEN BEGIN
        SalesLine."Shipment Date" := WORKDATE;
        SalesLine.MODIFY;
      END;
      IF SalesLine."Qty. Shipped (Base)" <> 0 THEN
        FoundError := TRUE;
      COMMIT;
    END;
  UNTIL SalesHdr.NEXT = 0;
END;
COMMIT;
IF FoundError = TRUE THEN
  ERROR('At least one Sales Line was found with %1 <>0.',
    SalesLine.FIELDCAPTION("Qty. Shipped (Base)"));

5-54

Chapter 5: Improving Application Performance

Need a Little Help?

Discuss the following topics:

1. The use of keys and filters. In case one or more keys must be added, discuss the structure of the keys.
2. Data retrieval (FIND, GET) and repetitive statements.
3. Locking and unlocking (implicit and explicit locking).
4. Data manipulation statements (INSERT, DELETE, MODIFY). Can multiple DELETE or MODIFY statements be replaced by DELETEALL or MODIFYALL statements?
5. Other statements that can have an adverse effect on performance (COUNT).
6. The order in which the statements are executed.

The following code can be used as an alternative for the previous code:

SalesHdr.SETCURRENTKEY(
  "No.", "Document Type", Status, "Shipment Date");
SalesHdr.SETRANGE(
  "Document Type", SalesHdr."Document Type"::Order);
SalesHdr.SETRANGE(Status, SalesHdr.Status::Released);
SalesHdr.SETRANGE("Shipment Date", 0D);
IF SalesHdr.FINDSET THEN
  IF CONFIRM('Do you want to update all Sales Orders?', FALSE) THEN BEGIN
    SalesLine.SETCURRENTKEY(
      "Document No.", "Document Type", "Shipment Date");
    REPEAT
      SalesLine.SETRANGE("Document No.", SalesHdr."No.");
      SalesLine.SETRANGE("Document Type", SalesHdr."Document Type");
      SalesLine.SETRANGE("Shipment Date", 0D);
      SalesLine.LOCKTABLE;
      IF SalesLine.FINDSET THEN  // For small record sets;
                                 // otherwise use FIND('-')
        REPEAT
          SalesLine2 := SalesLine;
          SalesLine2."Shipment Date" := WORKDATE;
          SalesLine2.MODIFY;
          IF SalesLine."Qty. Shipped (Base)" <> 0 THEN
            FoundError := TRUE;
        UNTIL SalesLine.NEXT = 0;
    UNTIL SalesHdr.NEXT = 0;
  END;
IF FoundError = TRUE THEN
  MESSAGE('At least one Sales Line was found with %1 <>0.',
    SalesLine.FIELDCAPTION("Qty. Shipped (Base)"));

Discuss the differences between both code fragments.

5-55

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009

Lab 5.2a - Find Index Usage

In this lab you use dynamic management views to retrieve information about the usage of all indexes. Index usage statistics show how frequently indexes are used and updated, and this information can be used to decide which indexes to disable on SQL Server.

Scenario

Mort finds out that the Microsoft Dynamics NAV database is over-indexed. To know which indexes he can disable on SQL Server, he runs a query showing index usage information.

Challenge Yourself!

Run a query to find index usage information for the Demo Database NAV (6-0) database. Use the dynamic management views as a base. Sort the information so that indexes causing the biggest overhead are listed first.

Need a Little Help?

Perform the following list of steps:

1. Open SQL Server Management Studio.
2. Run the query.
3. Analyze the results.

Step by Step

Open SQL Server Management Studio

1. Open SQL Server Management Studio.
2. Connect to the NAV-SRV-01 Database Engine.
3. In the database dropdown list, select the Demo Database NAV (6-0) database.

5-56

Chapter 5: Improving Application Performance

Run the query

1. Click the New Query button to open a new query window.
2. In the New Query window, enter the following query:

SELECT
  db_name(database_id) AS [DatabaseName],
  object_name(object_id) AS [TableName],
  *
FROM sys.dm_db_index_usage_stats
WHERE db_name(database_id) = 'Demo Database NAV (6-0)'
  AND object_name(object_id) LIKE 'CRONUS%'
ORDER BY user_updates DESC

3. Click the Execute button to run the query.

Analyze the Query Results

Check the columns user_seeks, user_scans, and user_lookups to understand how often an index is used. Then compare with the column user_updates to see how often the index is being updated (maintained). There are several other columns available, and the query can easily be modified to show only the columns you are interested in; because of the ORDER BY user_updates DESC clause, the indexes causing the largest overhead are listed first, and you can then check the actual usage of these indexes.

5-57
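For example, a sketch of such a trimmed-down variant (not part of the original lab) that keeps only the usage counters discussed above:

-- Sketch: compact view of index usage, most frequently updated first.
SELECT
  OBJECT_NAME(us.object_id) AS TableName,
  us.index_id,
  us.user_seeks, us.user_scans, us.user_lookups,
  us.user_seeks + us.user_scans + us.user_lookups AS TotalReads,
  us.user_updates
FROM sys.dm_db_index_usage_stats AS us
WHERE DB_NAME(us.database_id) = 'Demo Database NAV (6-0)'
  AND OBJECT_NAME(us.object_id) LIKE 'CRONUS%'
ORDER BY us.user_updates DESC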

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009

Lab 5.2b - Find Unused Indexes

In this lab you write a query to retrieve information about unused indexes. The query shows general information about the Microsoft Dynamics NAV keys, such as the number of updates, the number of reads, or when an index was last used for reading. The idea is to show a list of indexes being maintained, but never or rarely being used. In the next lab, you process the information from this query.

Scenario

In the previous lab, Mort has retrieved index usage statistics for the Microsoft Dynamics NAV database. Now he knows how many times the table indexes are used and he can start disabling the keys. However, before disabling used keys, he wants to start by cleaning up keys that are frequently updated and never read.

Challenge Yourself!

Run a dynamic management view-based query that shows the unused indexes for the Demo Database NAV (6-0) database. Sort the information in descending order by number of updates.

Need a Little Help?

Perform the following list of steps:

1. Open SQL Server Management Studio.
2. Run the query.
3. Analyze the results.

Step by Step

Open SQL Server Management Studio

1. Open SQL Server Management Studio.
2. Connect to the NAV-SRV-01 Database Engine.
3. In the database dropdown list, select the Demo Database NAV (6-0) database.

Run the query

1. Click the New Query button to open a new query window.
2. In the New Query window, enter the following query:

IF OBJECT_ID ('z_iuq_temp_index_keys', 'U') IS NOT NULL
  DROP TABLE z_iuq_temp_index_keys;
IF OBJECT_ID ('ziuq_temp_index_usage', 'U') IS NOT NULL
  DROP TABLE ziuq_temp_index_usage;

-- Generate list of indexes with key list

5-58

Chapter 5: Improving Application Performance

CREATE TABLE z_iuq_temp_index_keys(
  [F_Obj_ID] [int] NOT NULL,
  [F_Obj_Name] [nvarchar] (128) NULL,
  [F_Ind_ID] [int] NOT NULL,
  [Index_Column_ID] [int] NOT NULL,
  [Index_Key] [nvarchar] (128) NULL,
  [Index_Key_List] [nvarchar] (MAX) NULL,
  CONSTRAINT [z_iuq_temppk] PRIMARY KEY(
    [F_Obj_ID], [F_Ind_ID], [Index_Column_ID]));

INSERT INTO z_iuq_temp_index_keys
SELECT object_id, object_name(object_id), index_id, Index_Column_ID,
  index_col(object_name(object_id), index_id, index_column_id), ''
FROM sys.index_columns;
GO

-- populate key string
DECLARE IndexCursor CURSOR FOR
  SELECT F_Obj_ID, F_Ind_ID FROM z_iuq_temp_index_keys
  FOR UPDATE OF Index_Key_List;
DECLARE @ObjID int;
DECLARE @IndID int;
DECLARE @KeyString VARCHAR(MAX);
SET @KeyString = NULL;
OPEN IndexCursor;
SET NOCOUNT ON;
FETCH NEXT FROM IndexCursor INTO @ObjID, @IndID;
WHILE @@fetch_status = 0
BEGIN
  SET @KeyString = '';
  SELECT @KeyString = COALESCE(@KeyString, '') + Index_Key + ', '
  FROM z_iuq_temp_index_keys
  WHERE F_Obj_ID = @ObjID and F_Ind_ID = @IndID
  ORDER BY F_Ind_ID, Index_Column_ID;
  SET @KeyString = LEFT(@KeyString, LEN(@KeyString) - 2);
  UPDATE z_iuq_temp_index_keys SET Index_Key_List = @KeyString
  WHERE CURRENT OF IndexCursor;
  FETCH NEXT FROM IndexCursor INTO @ObjID, @IndID;
END;
CLOSE IndexCursor;
DEALLOCATE IndexCursor;

-- Generate list of Index usage
CREATE TABLE ziuq_temp_index_usage(
  [F_Table_Name] [nvarchar](128) NOT NULL,
  [F_Ind_ID] [int] NOT NULL,
  [F_Index_Name] [nvarchar](128) NULL,
  [No_Of_Updates] [int] NULL,

5-59

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009

  [User_Reads] [int] NULL,
  [Last_Used_For_Reads] [datetime] NULL,
  [Index_Type] [nvarchar](56) NOT NULL,
  [last_user_seek] [datetime] NULL,
  [last_user_scan] [datetime] NULL,
  [last_user_lookup] [datetime] NULL,
  [Index_Keys] [nvarchar] (255) NULL);

INSERT INTO ziuq_temp_index_usage
SELECT
  object_name(us.object_id) Table_Name,
  US.index_id Index_ID,
  SI.name Index_Name,
  US.user_updates No_Of_Updates,
  US.user_seeks + US.user_scans + US.user_lookups User_Reads,
  CASE
    WHEN (ISNULL(US.last_user_seek,'00:00:00.000') >=
          ISNULL(US.last_user_scan,'00:00:00.000'))
     AND (ISNULL(US.last_user_seek,'00:00:00.000') >=
          ISNULL(US.last_user_lookup,'00:00:00.000'))
      THEN US.last_user_seek
    WHEN (ISNULL(US.last_user_scan,'00:00:00.000') >=
          ISNULL(US.last_user_seek,'00:00:00.000'))
     AND (ISNULL(US.last_user_scan,'00:00:00.000') >=
          ISNULL(US.last_user_lookup,'00:00:00.000'))
      THEN US.last_user_scan
    ELSE US.last_user_lookup
  END AS Last_Used_For_Reads,
  SI.type_desc Index_Type,
  US.last_user_seek,
  US.last_user_scan,
  US.last_user_lookup,
  ''
FROM sys.dm_db_index_usage_stats US, sys.indexes SI
WHERE SI.object_id = US.object_id AND SI.index_id = US.index_id
ORDER BY No_Of_Updates DESC;
GO

-- Select and join the two tables.
SELECT
  TIU.F_Table_Name Table_Name,
  --TIU.F_Ind_ID Index_ID,
  --TIU.F_Index_Name Index_Name,
  TIK.Index_Key_List,
  TIU.No_Of_Updates,
  TIU.User_Reads,
  CASE
    WHEN TIU.User_Reads = 0 THEN TIU.No_Of_Updates
    ELSE TIU.No_Of_Updates / TIU.User_Reads
  END AS Cost_Benefit,

5-60

Chapter 5: Improving Application Performance

  TIU.Last_Used_For_Reads,
  TIU.Index_Type
FROM ziuq_temp_index_usage TIU, z_iuq_temp_index_keys TIK
WHERE TIK.F_Obj_Name = TIU.F_Table_Name
  AND TIK.F_Ind_ID = TIU.F_Ind_ID
  AND TIK.Index_Column_ID = 1
  AND TIU.F_Table_Name NOT IN ('ziuq_temp_index_usage', 'z_iuq_temp_index_keys')
  AND TIU.F_Table_Name LIKE 'CRONUS%'
ORDER BY No_Of_Updates DESC;
--order by Cost_Benefit desc

3. Click the Execute button to run the query. Depending on the size of your database, it may take a few minutes to run the query. The first time you run it, it is recommended that you do so when SQL Server is not too busy, until you know how long it takes.

Analyze the Query Results

The query shows you one line for each index in the SQL database. It includes the table name and the list of fields in the index. Note that a non-clustered index also contains the clustered index. For example, on SQL Server, the key Document No. in the Cust. Ledger Entry table is Document No.,Entry No. Also note that the indexes shown by SQL Server are not always shown in the same order as you have defined them in Microsoft Dynamics NAV.

The No_Of_Updates column shows the cost of this index (because every update requires a lock as well as a write to the database). The User_Reads column displays how often this index has been used, either from the user interface or by C/AL code. The Cost_Benefit column (which is No_Of_Updates / User_Reads, or No_Of_Updates when User_Reads = 0) allows you to compare the index costs to the benefits. The Last_Used_For_Reads column shows you when an index was actually used for reading. The query sorts the indexes by No_Of_Updates, with the most updated index (the biggest cost) first. On the last line of the query, you can change the sorting to order by Cost_Benefit DESC. The query also shows you whether each index is clustered or non-clustered.

The query will create two new tables called z_iuq_temp_index_keys and ziuq_temp_index_usage to collect index usage statistics. If you already have tables with these names in your database, the query will overwrite those without warnings.

5-61

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 NOTE: If you need to run the query again, for example, because you lost the results or because you want to run it with a different sorting, you do not have to run the whole query. In that case, you can simply run the last part of the query - from Select and join the two tables section, and it will run much faster. You need to run the whole query again if you want an updated view of index usage (for example after you have changed some indexes). The data shown by the query is reset every time SQL Server restarts. If you have recently restarted SQL Server, the query may not show you the most precise picture of how the indexes are being used over time. Also consider that some indexes may only ever be used for example at the end of the month or fiscal year. Although the query shows that an index was not used (since SQL Server was last restarted), this index may still be required for specific jobs. Typically, indexes with high cost (number of updates) and low benefits are subject to be changed. 5-62

Lab 5.2c - Disable Unused Keys Chapter 5: Improving Application Performance In this lab you disable unused keys in Microsoft Dynamics NAV. Scenario When he runs the unused indexes query from the previous lab, Mort notices that the Entry No. index in the Bank Account Ledger Entry table has a high cost and no benefit (user_reads = 0) FIGURE 5.5 UNUSED INDEXES He decides to disable this key on SQL Server and change the physical storage order of the ledger entries to Bank Account No., Posting Date. Challenge Yourself! Disable maintenance of an Entry No. key on Microsoft SQL Server and change the clustered index for the Bank Account Ledger Entry table to Bank Account No., Posting Date. Need a Little Help? Perform the following list of steps: 1. Change the clustered index of the Bank Account Ledger Entry table. 2. Disable maintenance of the Entry No. key on SQL Server. 3. Check the table keys using sp_helpindex. Step by Step Change the clustered index of the Bank Account Ledger Entry table 1. Start Microsoft Dynamics NAV 2009 Classic with Microsoft SQL Server. 2. On the Tools menu, select Object Designer. 3. Click Table. 5-63

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 4. In the list of tables, select table Bank Account Ledger Entry. 5. Click Design. 6. On the View menu, select Keys to open the Keys window. 7. Select the Entry No. key. 8. On the View menu, select Properties to open the Key - Properties window. 9. Set the Clustered property to No. FIGURE 5.6 THE KEY PROPERTIES WINDOW 10. In the Keys window, select the Bank Account No., Posting Date key. 11. In the Key - Properties window, set the Clustered property to Yes. You have now changed the clustered index of the table on SQL Server. Disable maintenance of the Entry No. key on SQL Server 1. In the Keys window, select the Entry No. key. 2. In the Key - Properties window, set the MaintainSQLIndex property to No. 3. Close the Keys window. 4. Save and compile the Bank Account Ledger Entry table. Check the table keys using sp_helpindex 1. Open SQL Server Management Studio. 2. Click the New Query button to open a new query window. 5-64

Chapter 5: Improving Application Performance 3. In the Query window, enter the following statement: sp_helpindex 'CRONUS International Ltd_$Bank Account Ledger Entry' 4. Click the Execute button to run the query. The result will look as follows: FIGURE 5.7 SP_HELPINDEX RESULTS You notice that the Bank Account No., Posting Date, Entry No. index now is the clustered index. Note that the Entry No. index is still maintained on SQL Server. This is because the MaintainSQLIndex property must always be Yes for the primary key. Even if you set it to No (as in this lab), Microsoft Dynamics NAV will change the property back to Yes. Remember that the MaintainSQLIndex property can only be used on secondary keys. 5-65

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Lab 5.3 - Create a Deadlock Trace In this lab you set up a lock trace in SQL Server Profiler to detect locks and deadlocks in the Microsoft Dynamics NAV database. Scenario Recently, Mort receives a lot of requests from users complaining about deadlocks. When they work in Microsoft Dynamics NAV, users get the following error message: FIGURE 5.8 DEADLOCK ERROR MESSAGE Mort asks Tim to set up a lock trace, to see when locks and deadlocks occur. The same information can be obtained by using the dm_tran_locks dynamic management view. However, the dm_tran_locks view shows only the current server state (no historical information). Challenge Yourself! Set up a deadlock trace in SQL Server Profiler. Create two deadlock situations in Microsoft Dynamics NAV involving two different users and analyze the trace. Need a Little Help? Perform the following steps to complete this lab: 1. Open SQL Server Profiler. 2. Define the Trace. 3. Filter the Trace. 4. Open the first Microsoft Dynamics NAV Client session (for the current Windows user). 5. Open the second Microsoft Dynamics NAV Client session (for Susan). 6. Run the codeunits in each Microsoft Dynamics NAV session (first Susan, then Administrator). 7. Run the codeunits again in each Microsoft Dynamics NAV session (first Administrator, then Susan). 8. Check the Trace. 9. Stop the Trace. 10. Analyze the Deadlock Trace. 5-66
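The scenario above mentions the dm_tran_locks dynamic management view as a way to inspect the current lock state (no history). A minimal sketch of such a query, not part of the original lab, which you could run in a separate Management Studio session while the codeunits are blocking each other:

-- Sketch: show current lock requests in the NAV database, including
-- sessions that are waiting (a deadlock normally starts as blocking).
SELECT
  request_session_id,
  resource_type,
  resource_database_id,
  request_mode,        -- for example S, U, X, IX
  request_status       -- GRANT, WAIT, or CONVERT
FROM sys.dm_tran_locks
WHERE resource_database_id = DB_ID('Demo Database NAV (6-0)')
ORDER BY request_session_id;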

Chapter 5: Improving Application Performance

Step by Step

Open SQL Server Profiler

1. Open SQL Server Management Studio.
2. In the Tools menu, select SQL Server Profiler.
3. Connect to the NAV-SRV-01 Database Engine.

Define the Trace

1. In the Trace Properties window, in the trace name, enter Locks and Deadlocks.
2. In the Use the template list, select the TSQL_Locks template.
3. Select the Save to File option.
4. In the Save As dialog box, enter the path and file name for the SQL Server Profiler trace. In this example, enter C:\PerfLogs\Lab 5.3.trc.
5. Clear the Enable file rollover option.
6. Select the Server processes trace data option.

Filter the Trace

1. On the Events Selection tab, click the Column Filters button.
2. In the Edit Filter window, in the left pane, select DatabaseName.
3. In the right pane, double-click the Like operator.
4. In the text box, enter Demo Database NAV (6-0).
5. Click OK to close the Edit Filter window.
6. Click Run to start the SQL Server Profiler trace.

Next, to simulate the deadlock situation, you need to start two Microsoft Dynamics NAV client sessions and then run a codeunit in each session.

Open the first Microsoft Dynamics NAV Client session (for the current Windows user)

1. In the Windows Taskbar, click Start > All Programs > Microsoft Dynamics NAV 2009 Classic with Microsoft SQL Server to start the Microsoft Dynamics NAV client.
2. On the Tools menu, select Object Designer.
3. Click Codeunit.
4. Create a new codeunit 123456703, Locking - Administrator with the following code:

Cust.LOCKTABLE;
IF Cust.FINDFIRST THEN BEGIN
  Cust.Name := Cust.Name;
  Cust.Address := Cust.Address;

5-67

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009

  Cust.MODIFY;
END;
Vend.LOCKTABLE;
SLEEP(30000);
IF Vend.FINDFIRST THEN;

Open the second Microsoft Dynamics NAV Client session (for Susan)

1. In the Windows Taskbar, click Start > All Programs.
2. Right-click Microsoft Dynamics NAV 2009 Classic with Microsoft SQL Server and select Run as.
3. In the Run as window, select The following user.
4. In the User name field, enter CONTOSO\Susan.
5. In the Password field, enter pass@word1.
6. Click OK to run the Microsoft Dynamics NAV 2009 Classic client.
7. On the Tools menu, select Object Designer.
8. Click Codeunit.
9. Create a new codeunit 123456702, Locking - Susan, with the following code:

Vend.LOCKTABLE;
IF Vend.FINDFIRST THEN BEGIN
  Vend.Name := Vend.Name;
  Vend.Address := Vend.Address;
  Vend.MODIFY;
END;
Cust.LOCKTABLE;
SLEEP(30000);
IF Cust.FINDFIRST THEN;

Run the codeunits in each Microsoft Dynamics NAV session (first Susan, then Administrator)

1. In Susan's session, click Run to execute the codeunit 123456702.
2. Immediately, switch to the current Windows user's session.
3. Click Run to execute codeunit 123456703.
4. Wait for the deadlock error message to appear in the current session. The current Windows user is now chosen as a deadlock victim.

5-68

Chapter 5: Improving Application Performance Run the codeunits again in each Microsoft Dynamics NAV session (first Administrator, then Susan) 1. In the current Windows user's session, click Run to execute codeunit 123456703. 2. Switch to Susan's session. 3. Click Run to execute codeunit 123456702. 4. Wait for the deadlock error message to appear in Susan's session. Susan is now chosen as a deadlock victim. Check the Trace Switch to the SQL Server Profiler window. While the codeunits are running, events are added to the SQL Server Profiler trace. The LoginName column shows that the database activity is caused by CONTOSO\SUSAN and CONTOSO\Administrator. Stop the Trace In the SQL Server Profiler, right-click the trace window and select Stop Trace. Analyze the Deadlock Trace 1. In the SQL Server Profiler trace, select the first event with Deadlock graph in the EventClass column. FIGURE 5.9 DEADLOCK GRAPH EVENT 5-69

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 2. In the SQL Server Profiler trace, select the second event with Deadlock graph in the EventClass column. 3. Compare the data for both Deadlock graph events. If another user runs either of the two codeunits while a deadlock situation is ongoing, a lock time-out event will appear in the SQL Server Profiler and the user will get a corresponding error message. 5-70

Quick Interaction: Lessons Learned Chapter 5: Improving Application Performance Take a moment and write down three Key Points you have learned from this chapter 1. 2. 3. 5-71

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009

Solutions

Test Your Knowledge

1. Put the following tuning methodology steps in the correct order:

Step:
8 : Return to step 2
3 : Analyze monitoring results
7 : Test the solution
5 : Propose a solution
4 : Create a hypothesis
2 : Monitor the system
1 : Define the issue.
6 : Implement changes

2. Why is the Find As You Type feature bad for performance on SQL Server?

MODEL ANSWER: When Find As You Type is enabled, Microsoft Dynamics NAV sends a LIKE query to SQL Server for every keystroke. This causes unnecessary queries to be sent to SQL Server.

3. How is SIFT implemented in Microsoft Dynamics NAV?

MODEL ANSWER: SIFT was originally implemented on SQL Server by using extra summary tables called SIFT tables, which were maintained through table triggers directly in the table definitions on SQL Server. When an update was performed on a table that contains SIFT indexes, a series of additional updates was necessary to update the associated SIFT tables. In Microsoft Dynamics NAV 5.0 SP1, Microsoft replaced SIFT tables with V-SIFT, which are indexed views.

5-72

Chapter 5: Improving Application Performance

4. What do you know about RowLock Hinting?

MODEL ANSWER: Rowlock hinting is a tuning technique that can be used to influence SQL Server's default locking behavior. When you activate rowlock hinting, rowlock hints are sent to SQL Server to make sure SQL Server always locks data by row. Rowlock hinting can be activated by checking the Always Rowlock database option. Rowlock hinting is to be avoided because of its high memory requirements.

5. What determines the physical storage order of records in a table?
( ) The Primary Key
( ) The Clustered Index
( ) The Timestamp column
( ) The Record No. column

6. Which statement is recommended when you want to browse a record set and modify a field that is not part of the current sorting?
( ) FINDSET(TRUE, TRUE)
( ) FINDSET(TRUE, FALSE)
( ) FINDSET(FALSE, TRUE)
( ) FINDSET(FALSE, FALSE)

7. What is true about the FINDSET statement from a performance point of view? (Select all that apply)
( ) FINDSET should be used to check whether one or more records exist that meet specific criteria.
( ) FINDSET returns a read-only record set.
( ) FINDSET only allows you to browse a record set from the top down.
( ) LOCKTABLE and FINDSET should be used to read large record sets, as opposed to using FINDSET(TRUE).

Fill in the blanks to test your knowledge of this section.

8. The ISEMPTY function allows you to determine whether a C/SIDE table or a filtered set of records is empty.
9. To enable a key group in C/AL code, you must use the KEYGROUPENABLE function.
10. In Microsoft Dynamics NAV 5.0 SP1, SIFT tables have been replaced by indexed views.
11. FINDFIRST should never be used in combination with REPEAT/UNTIL NEXT.

5-73

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009

12. The SQLIndex property can be used to optimize a Microsoft Dynamics NAV key on SQL Server.
13. SQL Server prefers keys with a high selectivity.
14. A table without a clustered index is called a heap.

5-74

Chapter 6: Maintenance CHAPTER 6: MAINTENANCE Objectives Introduction The objectives are: Evaluate how to optimize a Microsoft Dynamics NAV database. Set up a maintenance plan for a Microsoft Dynamics NAV database to optimize indexes, update Microsoft SQL Server statistics and set index fill factors. Maintain an SQL Server by monitoring performance, database and transaction log growth. Typically, in a database system there are several key items that affect performance. These items include the following: The database design The code Correct indexes Current statistics Defragmented indexes and data In the previous chapters you have learned how to optimize the database design, the C/AL code, and the indexes. However, to ensure that SQL statements run as efficiently as possible, you need to run maintenance tasks periodically. The core maintenance tasks should include index rebuilds, statistics updates, and defragmenting indexes and data. This chapter describes how to implement maintenance on your SQL Server by using maintenance plans. Next, it explains how to execute maintenance tasks manually (by using Transact-SQL script). 6-1

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009

Optimizing a Microsoft Dynamics NAV Database

The previous chapter explained that SIFT tables and indexes are the two main causes of performance issues in Microsoft Dynamics NAV 5.0 (and earlier). To maintain performance in a Microsoft Dynamics NAV 5.0 database, it is necessary to optimize the tables regularly. The optimization process removes any entries that contain zero values in all numeric fields from each SIFT table. The removal of these redundant entries frees space and makes updating and summing SIFT information more efficient. At the same time, the optimization process rebuilds all indexes.

In Microsoft Dynamics NAV 5.0 SP1, SIFT tables have been replaced by indexed views that are maintained automatically. However, this does not mean that optimizing tables is no longer necessary in Microsoft Dynamics NAV 5.0 SP1, Microsoft Dynamics NAV 2009, or later versions. Although the indexed views solve the zero SIFT record issue, indexes will become fragmented as data is entered in the database, and they still need optimization to maintain performance. In Microsoft Dynamics NAV 5.0 SP1 and later versions, table optimization still allows you to rebuild table indexes from the Microsoft Dynamics NAV client. Be aware that rebuilding indexes using the Microsoft Dynamics NAV client requires manual action and adversely affects performance (because indexes are dropped and rebuilt), so you do not want all users to be able to do this ad hoc.

SQL Server facilitates database maintenance by offering features that allow administrators to configure and schedule database maintenance tasks. These maintenance tasks are defined as maintenance plans and can be scheduled to run automatically at any time during the day.

Implementing Maintenance on SQL Server

The SQL Server Database Engine is the core service for storing, processing, and securing data. You can use the Database Engine to create relational databases for online transaction processing (OLTP) or online analytical processing (OLAP) data. This includes creating tables for storing data, and database objects such as indexes, views, and stored procedures for viewing, managing, and securing data. After you have designed the database, you can start feeding data to the Database Engine.

But like the engine of a car, the Database Engine also needs maintenance to continue to run optimally. If the car is not maintained well, it will not do its job, which is to bring you to your destination on time. The same applies to the SQL Server Database Engine. When you use SQL Server, there are several important maintenance tasks that need to be executed proactively and regularly. The most important tasks, after making backups, are defragmenting indexes, updating SQL Server statistics, and managing the index fill factor.

6-2

Chapter 6: Maintenance

SQL Server offers built-in features that allow administrators to configure and automatically execute maintenance activities by means of maintenance plans. You can set up a maintenance plan that includes several tasks, or you can schedule separate Transact-SQL scripts to perform maintenance tasks. Which method you choose is not important, as long as the necessary maintenance occurs.

In SQL Server certain maintenance tasks can be done automatically without any maintenance plan. For example, if you activate the Auto Update Statistics option on a database, SQL Server will automatically update statistics. However, before activating these options, consider when you want maintenance to run on the server. Be aware that automatic tasks that start during daily work can lead to unexpected behavior.

Creating and Using Maintenance Plans

Maintenance plans create a workflow of the tasks required to make sure that your database is optimized, is regularly backed up, and is free of inconsistencies. Maintenance plans can be created in two ways: you can use the Maintenance Plan Wizard, or you can build your own maintenance plan on the design surface. Often, maintenance plans are created by using the wizard and then edited by using the design surface. The Maintenance Plan Wizard is best for creating basic maintenance plans, whereas creating a plan by using the design surface allows you to utilize enhanced workflow and provides more flexibility.

In SQL Server 2005 Database Engine, maintenance plans create a SQL Server Integration Services (SSIS) package, which is run by a SQL Server Agent job. These maintenance tasks can be run manually or automatically at scheduled intervals. Maintenance plans only run against databases set to compatibility level 80 or higher. The maintenance plan designer in SQL Server Management Studio does not display databases set to compatibility level 70 or lower. To create or manage maintenance plans, you must be a member of the sysadmin fixed server role. Note that Object Explorer only displays maintenance plans if the user is a member of the sysadmin fixed server role.

The Maintenance Plan Wizard

The Database Maintenance Plan Wizard can be used to help you set up the core maintenance tasks that are required to ensure that your database performs well, is regularly backed up in case there is a system failure, and is checked for inconsistencies. The Database Maintenance Plan Wizard creates a Microsoft SQL Server job that performs these maintenance tasks automatically at scheduled intervals.

6-3

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 The following maintenance tasks can be scheduled to run automatically: Reorganizing the data on the data and index pages by rebuilding indexes with a new fill factor. This ensures that database pages contain an equally distributed amount of data and free space, which allows future growth to be faster. Compressing data files by removing empty database pages. Updating index statistics to ensure the query optimizer has up-to-date information about the distribution of data values in the tables. This allows the query optimizer to make better judgments about the best way to access data because it has more information about the data stored in the database. Although index statistics are automatically updated by SQL Server periodically, this option can force the statistics to be updated immediately. Performing internal consistency checks of the data and data pages within the database to ensure that a system or software problem has not damaged data. Backing up the database and transaction log files. Database and log backups can be retained for a specified period. This allows you to create a history of backups to be used in case you need to restore the database to a time earlier than the last database backup. Setting up log shipping. Log shipping allows the transaction logs from one database (the source) to be constantly fed to another database (the destination). Keeping the destination database synchronized with the source database allows you to have a standby server, and provides a way to offload query processing from the main computer (source server) to read-only destination servers. The results generated by the maintenance tasks can be written as a report to a text file, HTML file, or the sysdbmaintplan_history tables in the msdb database. The report can also be e-mailed to an operator. Scheduling a Maintenance Plan Scheduling a maintenance plan is part of the maintenance plan creation. In the Maintenance Plan Wizard you can select a single schedule for all tasks or a separate schedule for each individual task. Depending on your selection, SQL Server will create one or more subplans that can be scheduled individually. If you want to (re)schedule a maintenance plan that was initially created by using the wizard, you have to schedule it from the design surface. 6-4

Demonstration: Creating a Maintenance Plan Chapter 6: Maintenance To optimize the Demo Database NAV (6-0) database, Tim, the IT manager, defines a maintenance plan for the database which performs the following tasks: Update Statistics Rebuild Indexes Tim wants to schedule the maintenance plan to run at night. However, he does not want the maintenance process to interfere with other scheduled tasks. Before he can define the schedule, he needs to determine when the other tasks are run. He decides to create the maintenance plan and schedule it later. Perform the following steps when creating a Maintenance Plan: 1. Open SQL Server Management Studio. 2. Connect to the NAV-SRV-01 Database Engine by using a login with sysadmin credentials. 3. In Object Explorer, click Management. 4. Right-click Maintenance Plans. 5. Select Maintenance Plan Wizard to start the wizard. 6. On the SQL Server Maintenance Plan Wizard page, click Next. 7. On the Select Plan Properties page, in the Name field, enter a meaningful name for the maintenance plan, such as the name of the server to which the plan applies or the execution interval of the maintenance plan (daily, weekly, and so on), or any combination. For example, enter NAV-SRV-01 - Daily MP. Optionally, in the Description field, you can enter a short description for the maintenance plan, describing the purpose or the subtasks of the maintenance plan. 8. At the bottom of the window, select the scheduling option for the maintenance plan. There are two options: Separate schedules for each task or Single schedule for the entire plan or no schedule. If you select Separate schedules for each task, you can define the schedule for each task later in the Wizard, when you configure the individual tasks. If you select Single schedule for the entire plan or no schedule, you have to define the schedule here (before you select and configure the individual tasks). To define the schedule, click the Change button. In this demonstration, select the second option. In the next demonstration, you will change the schedule. 9. Click the Next button. 6-5

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 10. On the Select Maintenance Tasks page, select the tasks to include in the maintenance plan. Select the tasks as shown in the following figure. FIGURE 6.1 SELECT MAINTENANCE TASKS 11. Click the Next button. 12. On the Select Maintenance Task Order page, you can change the order in which the tasks must be executed. You can use the Move Up and Move Down buttons to rearrange the tasks. 13. Click the Next button to start to configure the individual tasks. 14. On the Define Rebuild Index Task page, in the Database field, select the database for which the task must be executed. To select one or more databases, click the drop-down list, select the These databases option, and select the databases to maintain. In this case, select the Demo Database NAV (6-0) database. Click OK. 15. In the Objects field, select the object type to maintain: Table, View or Tables and Views. To maintain all tables and views in the database, select Tables and Views. To reorganize only selected tables or views, select either the Table or View option and select the individual tables or views in the Selection field. 6-6

Chapter 6: Maintenance 16. In the Free space options field, select how you want to rebuild the indexes. If you select Reorganize pages with the default amount of free space, SQL Server will drop the indexes and re-create them with the fill factor that was specified when the indexes were created. If you select Change free space per page percentage to, SQL Server drops the indexes in the database and re-creates them with a new, automatically calculated fill factor, thereby reserving the specified amount of free space on the index pages. The greater the percentage, the more free space is reserved on the index pages, and the larger the index grows. Valid values are from 0 through 100. 17. Select the Sort results in tempdb option. This option determines where the intermediate sort results, generated during index creation, are temporarily stored. If a sort operation is not required, or if the sort can be performed in memory, this option is ignored. 18. Select the Keep index online while reindexing option to allow users to access the underlying table or clustered index data and any associated nonclustered indexes during index operations. NOTE: Online index operations are available only in SQL Server Developer, Evaluation, and Enterprise editions. 19. Click the Next button. 20. On the Define Update Statistics Task page, in the Database field, select the database the task applies to. In this case, select the Demo Database NAV (6-0) database. 21. In the Objects field, select the object type to maintain (Table, View or Tables and Views) and select the corresponding views or tables in the Selection field. 22. In the Update field, select the statistics to update: All statistics, Column statistics or Index statistics. In this case, select All statistics. 23. In the Scan type field, select the type of scan used to collect updated statistics. If you select Full scan, SQL Server will read all rows in the table or view to collect the statistics. If you select Sample by, SQL Server will update statistics based on a sample of data. In this case, you must specify the percentage of the table or indexed view, or the number of rows to sample when collecting statistics for larger tables or views. 24. Click the Next button. 6-7

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 25. On the Select Report Options page, you can define a report of the task activity, and configure the task to notify people when the task is completed. The report contains details of the steps executed by the maintenance plan. This includes any error information. If you select Write a report to a text file (and specify a path for the text file in the Folder Location field), SQL Server saves the report to a text file. If you want an e-mail message to be sent when a task fails, you must select the E-mail report option. To use this task, you must have Database Mail enabled and configured correctly with MSDB as a Mail Host Database, and have a SQL Server Agent operator with a valid e-mail address. In this case, leave the default option Write a report to a text file. For more information about how to set up database mail, see Database Mail (http://msdn.microsoft.com/enus/library/ms175887.aspx), Defining Operators (http://msdn.microsoft.com/en-us/library/ms179336.aspx) and How To: Create an Operator (SQL Server Management Studio) (http://msdn.microsoft.com/en-us/library/ms175962.aspx). 26. Click the Next button. 27. On the Complete the Wizard page, you can verify the choices made in the wizard. 28. Click the Finish button. 6-8

Chapter 6: Maintenance As a final step, the SQL Server Maintenance Plan Wizard creates the necessary SSIS packages for the maintenance plan and shows the progress. When the Wizard finishes, the results are displayed: FIGURE 6.2 RESULTS OF THE SQL SERVER MAINTENANCE PLAN WIZARD 29. Click Close to exit the wizard. Demonstration: Scheduling a Maintenance Plan Tim has found out that batch processes run between 11:00PM and 1:00AM. He can now schedule his maintenance plan to run once every night at 01:30AM, starting January 1, 2010. As he cannot use the wizard to reschedule the existing maintenance plan, he schedules it from the design surface. This is done by performing following steps: 1. Open SQL Server Management Studio. 2. Connect to the NAV-SRV-01 Database Engine by using a login with sysadmin credentials. 3. In Object Explorer, click Management. 6-9

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009

4. Click Maintenance Plans.
5. Double-click the NAV-SRV-01 - Daily MP plan or right-click it, and then select Modify. The design surface opens.

FIGURE 6.3 SCHEDULE A MAINTENANCE PLAN FROM THE DESIGN SURFACE

The top of the design surface shows the name and the description for the maintenance plan. The middle pane shows a grid with all subplans that are part of the maintenance plan. The number of subplans created depends on the scheduling option selected in the wizard. If you select Separate schedules for each task, a separate subplan is created for each task in the maintenance plan. If you select Single schedule for the entire plan or no schedule, only one subplan is created, including all tasks. The bottom pane shows the different tasks in the selected subplan and the relationship between these tasks.

6. In the grid, select Subplan_1.
7. In the toolbar, click the Subplan Schedule button or click the Schedule icon next to the subplan in the grid. You can also double-click the subplan, and then click the Subplan Schedule icon in the Subplan Properties dialog box.

6-10

Chapter 6: Maintenance 8. In the Job Schedule Properties window, you can define the schedule for the subplan. Define the schedule as shown in the following figure: FIGURE 6.4 SCHEDULING A SUBPLAN 9. Click OK to close the Job Schedule Properties window and return to the design surface. Notice that the Schedule column for the subplan now contains schedule information. 10. In the toolbar, click the Save button to save the maintenance plan. To remove the schedule for a specific subplan, select a subplan in the grid, click Remove Schedule in the toolbar or click the Remove Schedule icon next to the subplan in the grid. You can also double-click the subplan, and then click the Remove Schedule icon in the Subplan Properties dialog box. 6-11
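Because a maintenance plan is executed by SQL Server Agent jobs, you can also verify that the subplan and its schedule were created by querying msdb. The following is a hedged sketch; the job name filter is an assumption based on the plan name used in this demonstration, and maintenance plan jobs are usually named after the plan and subplan.

-- Sketch: list the Agent jobs and schedules behind the maintenance plan.
SELECT
  j.name AS JobName,
  s.name AS ScheduleName,
  s.enabled,
  s.active_start_time   -- HHMMSS as an integer, e.g. 13000 = 01:30:00
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobschedules AS js ON js.job_id = j.job_id
JOIN msdb.dbo.sysschedules AS s ON s.schedule_id = js.schedule_id
WHERE j.name LIKE 'NAV-SRV-01 - Daily MP%';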

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009

Index Fragmentation

The SQL Server Database Engine automatically maintains indexes whenever insert, update, or delete operations are made to the underlying data. Over time these modifications can cause the information in the index to become scattered in the database (fragmented). Fragmentation exists when indexes have pages in which the logical ordering, based on the key value, does not match the physical ordering inside the data file. Heavily fragmented indexes can decrease query performance and cause your application to respond slowly.

In SQL Server 2005 you can correct index fragmentation by either reorganizing an index or by rebuilding an index. For partitioned indexes built on a partition scheme, you can use either of these methods on a complete index or on a single partition of an index.

Detecting Fragmentation

The first step in deciding which defragmentation method to use is to analyze the index to determine the degree of fragmentation. By using the system function sys.dm_db_index_physical_stats, you can detect fragmentation in a specific index, all indexes on a table or indexed view, all indexes in a database, or all indexes in all databases. For partitioned indexes, sys.dm_db_index_physical_stats also provides fragmentation information for each partition.

The algorithm for calculating fragmentation is more precise in SQL Server 2005 than in SQL Server 2000. As a result, the fragmentation values will appear higher. For example, in SQL Server 2000, a table is not considered fragmented if it has page 11 and page 13 in the same extent but not page 12. However, to access these two pages would require two physical I/O operations, so this is counted as fragmentation in SQL Server 2005.

The result set returned by the sys.dm_db_index_physical_stats function includes the following columns.

Column                          Description
avg_fragmentation_in_percent    The percent of logical fragmentation (out-of-order pages in the index).
fragment_count                  The number of fragments (physically consecutive leaf pages) in the index.
avg_fragment_size_in_pages      Average number of pages in one fragment in an index.

6-12
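As an illustration of the columns described above, the following sketch returns them for a single table. The table name is only an example, and the LIMITED scan mode (the cheapest option, which reads only the parent-level pages) is assumed here; heaps are excluded because some of these columns are not reported for heaps in LIMITED mode.

-- Sketch: fragmentation details for one table, LIMITED scan mode.
SELECT
  OBJECT_NAME(ips.object_id) AS TableName,
  ips.index_id,
  ips.avg_fragmentation_in_percent,
  ips.fragment_count,
  ips.avg_fragment_size_in_pages
FROM sys.dm_db_index_physical_stats(
       DB_ID('Demo Database NAV (6-0)'),
       OBJECT_ID(N'dbo.[CRONUS International Ltd_$Contact]'),
       NULL, NULL, 'LIMITED') AS ips
WHERE ips.index_id > 0;   -- skip the heap entry, if any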

Chapter 6: Maintenance

After the degree of fragmentation is known, use the following table to determine the best method to correct the fragmentation.

avg_fragmentation_in_percent value    Corrective statement
> 5% and <= 30%                       ALTER INDEX REORGANIZE
> 30%                                 ALTER INDEX REBUILD WITH (ONLINE = ON)

Note that rebuilding an index can be executed online or offline. Reorganizing an index is always executed online. To achieve availability similar to the reorganize option, you should rebuild indexes online.

These values provide a rough guideline for determining the point at which you should switch between ALTER INDEX REORGANIZE and ALTER INDEX REBUILD. However, the actual values may vary from case to case. It is important that you experiment to determine the best threshold for your environment. Be aware that re-creating a heavily fragmented index can be faster than updating the existing index. Very low levels of fragmentation (less than 5 percent) should not be addressed by either of these commands because the benefit of removing such a small amount of fragmentation is almost always significantly outweighed by the cost of reorganizing or rebuilding the index.

Reorganizing an Index

To reorganize one or more indexes, use the ALTER INDEX statement with the REORGANIZE clause. This statement replaces the DBCC INDEXDEFRAG statement. To reorganize a single partition of a partitioned index, use the PARTITION clause of ALTER INDEX.

Reorganizing an index defragments the leaf level of clustered and nonclustered indexes on tables and views by physically reordering the leaf-level pages to match the logical order (left to right) of the leaf nodes. Having the pages in order improves index-scanning performance. The index is reorganized within the existing pages allocated to it and no new pages are allocated. If an index spans more than one file, the files are reorganized one at a time. Pages do not migrate between files.

Reorganizing also compacts the index pages. Any empty pages created by this compaction are removed, providing additional available disk space. Compaction is based on the fill factor value in the sys.indexes catalog view. The reorganize process uses minimal system resources. Also, reorganizing is automatically performed online. The process does not hold long-term blocking locks. Therefore, it will not block running queries or updates.

6-13

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009

Reorganize an index when the index is not heavily fragmented. See the previous table for fragmentation guidelines. However, if the index is heavily fragmented, you will achieve better results by rebuilding the index.

Rebuilding an Index

Rebuilding an index drops the index and creates a new one. In doing this, fragmentation is removed, disk space is reclaimed by compacting the pages using the specified or existing fill factor setting, and the index rows are reordered in contiguous pages (allocating new pages as needed). This can improve disk performance by reducing the number of page reads required to obtain the requested data.

The following methods can be used to rebuild clustered and nonclustered indexes:

ALTER INDEX with the REBUILD clause. This statement replaces the DBCC DBREINDEX statement.

CREATE INDEX with the DROP_EXISTING clause.

Each method performs the same function, but there are advantages and disadvantages to consider. For example, creating an index allows you to change the index definition by adding or removing key columns, changing column order, or changing the column sort order. You can also change filegroup or partition. However, creating multiple indexes is not possible within a single transaction. You can also rebuild an index by first dropping the index with the DROP INDEX statement and re-creating it with a separate CREATE INDEX statement. Performing these operations as separate statements has several disadvantages, and it is not recommended.

The following sample code displays all indexes with an average fragmentation percentage bigger than 10.

SELECT
  vw.object_id AS objectid,
  object_name(vw.object_id) AS objectname,
  vw.index_id AS indexid,
  tbl.name,
  vw.partition_number AS partitionnum,
  vw.avg_fragmentation_in_percent AS frag
FROM sys.dm_db_index_physical_stats (
  DB_ID('Demo Database NAV (6-0)'), NULL, NULL, NULL, 'LIMITED') vw
JOIN sys.indexes tbl
  ON vw.object_id = tbl.object_id AND vw.index_id = tbl.index_id
WHERE vw.avg_fragmentation_in_percent > 10.0
  AND vw.index_id > 0
-- AND object_name(object_id) =
-- 'CRONUS International Ltd_$Contact'

6-14

By enabling the last two lines in the script, you can limit the result set to the indexes for a specific table (in this case, the Contact table). You can now reorganize or rebuild all or specific indexes based on the fragmentation values, as shown in the following code sample.

-- Rebuild Index $1 in the Contact Table
ALTER INDEX [$1] ON dbo.[CRONUS International Ltd_$Contact] REBUILD

-- Reorganize All Indexes in the Contact Table
ALTER INDEX ALL ON dbo.[CRONUS International Ltd_$Contact] REORGANIZE

The following example automatically reorganizes or rebuilds all partitions in a database that have an average fragmentation over 10 percent. The decision point at which the switch is made between reorganize and rebuild has been defined in the script.

SET NOCOUNT ON;
DECLARE @objectid int;
DECLARE @indexid int;
DECLARE @partitioncount bigint;
DECLARE @schemaname nvarchar(130);
DECLARE @objectname nvarchar(130);
DECLARE @indexname nvarchar(130);
DECLARE @partitionnum bigint;
DECLARE @partitions bigint;
DECLARE @frag float;
DECLARE @command nvarchar(4000);

-- Conditionally select tables and indexes from the
-- sys.dm_db_index_physical_stats function and
-- convert object and index IDs to names.
SELECT
    object_id AS objectid,
    index_id AS indexid,
    partition_number AS partitionnum,
    avg_fragmentation_in_percent AS frag
INTO #work_to_do
FROM sys.dm_db_index_physical_stats
    (DB_ID('Demo Database NAV (6-0)'), NULL, NULL, NULL, 'LIMITED')
WHERE avg_fragmentation_in_percent > 10.0
    AND index_id > 0;

-- Declare and open the cursor for the list of partitions
-- to be processed.
DECLARE partitions CURSOR FOR SELECT * FROM #work_to_do;
OPEN partitions;

-- Loop through the partitions.
WHILE (1=1)
BEGIN;
    FETCH NEXT FROM partitions
        INTO @objectid, @indexid, @partitionnum, @frag;
    IF @@FETCH_STATUS < 0 BREAK;

    SELECT @objectname = QUOTENAME(o.name), @schemaname = QUOTENAME(s.name)
    FROM sys.objects AS o
    JOIN sys.schemas AS s ON s.schema_id = o.schema_id
    WHERE o.object_id = @objectid;

    SELECT @indexname = QUOTENAME(name)
    FROM sys.indexes
    WHERE object_id = @objectid AND index_id = @indexid;

    SELECT @partitioncount = count(*)
    FROM sys.partitions
    WHERE object_id = @objectid AND index_id = @indexid;

    -- 30 is an arbitrary decision point at which to
    -- switch between reorganizing and rebuilding.
    IF @frag < 30.0
        SET @command = N'ALTER INDEX ' + @indexname + N' ON '
            + @schemaname + N'.' + @objectname + N' REORGANIZE';
    IF @frag >= 30.0
        SET @command = N'ALTER INDEX ' + @indexname + N' ON '
            + @schemaname + N'.' + @objectname + N' REBUILD';
    IF @partitioncount > 1
        SET @command = @command + N' PARTITION='
            + CAST(@partitionnum AS nvarchar(10));
    EXEC (@command);
    PRINT N'Executed: ' + @command;
END;

-- Close and deallocate the cursor.
CLOSE partitions;
DEALLOCATE partitions;

-- Drop the temporary table.
DROP TABLE #work_to_do;

Statistical Information

Microsoft SQL Server allows statistical information about the distribution of values in a column to be created. This statistical information can be used by the query processor to determine the optimal strategy for evaluating a query. When you create an index, SQL Server automatically stores statistical information about the distribution of values in the indexed column(s). The query optimizer in SQL Server uses these statistics to estimate the cost of using the index for a query. Additionally, when the AUTO_CREATE_STATISTICS database option is set to ON (default), SQL Server automatically creates statistics for columns without indexes that are used in a predicate.
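You can quickly verify how the statistics-related options are currently set for a database by querying the sys.databases catalog view. The following is a small sketch of such a check; substitute your own database name as needed.

SELECT name,
       is_auto_create_stats_on,
       is_auto_update_stats_on,
       is_auto_update_stats_async_on
FROM sys.databases
WHERE name = 'Demo Database NAV (6-0)';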

As the data in a column changes, index and column statistics can become out-of-date and cause the query optimizer to make less-than-optimal decisions on how to process a query. For example, if you create a table with an indexed column and 1,000 rows of data, all with unique values in the indexed column, the query optimizer considers the indexed column a good way to collect the data for a query. If you update the data in the column so there are many duplicated values, the column is no longer an ideal candidate for use in a query. However, the query optimizer still considers it a good candidate based on the index's outdated distribution statistics, which are based on the data before the update.

Statistics that are maintained on each table in SQL Server to aid the optimizer in cost-based decision-making include the number of rows, the number of pages used by the table, and the number of modifications made to the keys of the table since the last statistics update. In addition to maintaining statistics on indexed columns, it is possible to maintain statistics on columns that are not indexed.

When the AUTO_UPDATE_STATISTICS database option is set to ON (the default), SQL Server automatically updates this statistical information periodically as the data in the tables changes. The sampling is random across data pages, and taken from the table or the smallest nonclustered index on the columns needed by the statistics. After a data page has been read from disk, all the rows on the data page are used to update the statistical information. The frequency at which the statistical information is updated is determined by the volume of data in the column or index and the amount of changing data. For example, the statistics for a table that contains 10,000 rows may have to be updated when 1,000 index values have changed, because 1,000 values may represent a significant percentage of the table. However, for a table that contains 10 million index entries, 1,000 changing index values is less significant, and so the statistics may not be automatically updated. SQL Server, however, always ensures that a minimum number of rows are sampled; tables that are smaller than 8 megabytes (MB) are always fully scanned to collect statistics.

The cost of this automatic statistical update is minimized by sampling the data, rather than analyzing all of it. Under certain circumstances, statistical sampling will not accurately characterize the data in a table. You can control the amount of data sampled during manual statistics updates on a table-by-table basis by using the SAMPLE and FULLSCAN clauses of the UPDATE STATISTICS statement. The FULLSCAN clause specifies that all of the data in the table is scanned to collect statistics, whereas the SAMPLE clause can be used to specify either the percentage of rows to sample or the number of rows to sample.

Out-of-date or missing statistics are indicated by warnings (a warning icon on the affected operator) when the execution plan of a query is displayed graphically in SQL Server Management Studio. You can also monitor the Missing Column Statistics event class by using SQL Server Profiler so that you know when statistics are missing. Use the UPDATE STATISTICS command or the sp_updatestats system stored procedure to manually update statistics after large changes in data, or daily, if there is a daily window available. If you instruct SQL Server not to maintain statistics automatically, you must manually update the statistical information.
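If you suspect that statistics are stale, you can check when each statistics object on a table was last updated before deciding to run UPDATE STATISTICS. The following sketch uses the STATS_DATE function against the Contact table from the demo database; replace the table name with any table you want to inspect.

SELECT s.name AS statistics_name,
       STATS_DATE(s.object_id, s.stats_id) AS last_updated,
       s.auto_created
FROM sys.stats AS s
WHERE s.object_id = object_id('CRONUS International Ltd_$Contact')
ORDER BY last_updated;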

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 We recommend that you set the Auto Update Statistics and Auto Create Statistics database options to False and use a database maintenance plan to create and update statistics. As an alternative for the database maintenance plan, you can use the following Transact-SQL query to create statistics for all indexes in the database: sp_createstats 'indexonly', 'fullscan' Depending on the database size and the amount of data, this query can take some time. Therefore, we recommend that you schedule the index maintenance after working hours. Existing statistics will not be updated by sp_createstats. To update existing statistics, either drop all statistics before using sp_createstats (as shown in Lab 6.3), or execute sp_updatestats. Demonstration: Activate Update Statistics Perform the following steps to enable automatic statistics update for a database by using the Database Properties window. 1. Open SQL Server Management Studio. 2. Connect to the NAV-SRV-01 server. 3. Select the database for which you want to update statistics. 4. Right-click the database and then select Properties. 5. On the Options page, set the Auto Update Statistics property to True. 6. Click OK. Demonstration: Activate Update Statistics Using T-SQL Perform the following steps to enable automatic statistics update for a database by using a Transact-SQL query. 1. Open SQL Server Management Studio. 2. Connect to the NAV-SRV-01 server. 3. Select the database for which you want to update statistics. 4. Click New Query to open a new query window. 5. In the New Query window, enter the following command: ALTER DATABASE [Demo Database NAV (6-0)] SET AUTO_UPDATE_STATISTICS ON WITH NO_WAIT 6. Click Run to execute the query. To update the statistics for all user-defined and internal tables in the current database, you can run the sp_updatestats system stored procedure. 6-18

Chapter 6: Maintenance Demonstration: Update Table Statistics Using T-SQL Perform the following steps to update statistics for the Customer table based on 80 percent of the data. 1. Open SQL Server Management Studio. 2. Connect to the NAV-SRV-01 server. 3. Select the database for which you want to update statistics. 4. Click New Query to open a new query window. 5. In the New Query window, enter the following command: UPDATE STATISTICS [CRONUS International Ltd_$Customer] WITH SAMPLE 80 PERCENT 6. Click Run to execute the query. Demonstration: Update Index Statistics Using T-SQL Perform the following steps to update statistics for a specific index in the Customer table. SQL Server will use 80 percent of the data to update index statistics. 1. Open SQL Server Management Studio. 2. Connect to the NAV-SRV-01 server. 3. Select the database for which you want to update statistics. 4. Click New Query to open a new query window. 5. In the New Query window, enter the following command: UPDATE STATISTICS [CRONUS International Ltd_$Customer] [$1] WITH SAMPLE 80 PERCENT 6. Click Run to execute the query. Index Fill Factor When you create a clustered index, the data in the table is stored in the data pages of the database according to the order of the values in the indexed columns. When new rows of data are inserted into the table or the values in the indexed columns are changed, Microsoft SQL Server may have to reorganize the storage of the data in the table to make room for the new row and maintain the ordered storage of the data. This also applies to nonclustered indexes. When data is added or changed, SQL Server may have to reorganize the storage of the data in the nonclustered index pages. When a new row is added to a full index page, SQL Server moves approximately half the rows to a new page to make room for the new row. This reorganization is known as a page split. Page splitting can impair performance and fragment the storage of the data in a table. 6-19

When you create an index, you can specify a fill factor to reserve a percentage of free space on each leaf-level page of the index to accommodate future expansion in the storage of the table's data and reduce the potential for page splits. The fill factor value is a percentage from 0 to 100 that specifies how much to fill the data pages after the index is created. A value of 100 means the pages will be full and will take the least amount of storage space. This setting should be used only on a read-only table, to which additional data is never added. A lower value leaves more empty space on the data pages, which reduces the need to split data pages as indexes grow but requires more storage space. For example, a fill factor value of 80 means that 20 percent of each leaf-level page will be left empty, providing space for index expansion as data is added to the underlying table. This setting is more appropriate when there will be changes to the data in the table.

The fill factor option is provided for fine-tuning performance. The server-wide default of 0 is the optimal choice in most situations. When fill factor is set to 0, the leaf level is filled to capacity. You can use the CREATE INDEX or ALTER INDEX statements to set the fill factor value for individual indexes. To modify the server-wide default value, use the sp_configure system stored procedure. To view the fill factor value of one or more indexes, use the sys.indexes catalog view.

NOTE: Even for an application oriented for many insert and update operations, the number of database reads typically outnumbers database writes by a factor of 5 to 10. Therefore, specifying a fill factor other than the default can decrease database read performance by an amount inversely proportional to the fill factor setting. For example, a fill factor value of 50 percent can cause database read performance to decrease by two times.

It is useful to set the fill factor option to another value only in the following situations:

When a new index is created on a table with existing data.
When future changes in that data can be accurately predicted.

The fill factor is implemented only when the index is created. It is not maintained after the index is created as data is added, deleted, or updated in the table. Trying to maintain the additional space on the data pages would defeat the purpose of originally using the fill factor, because SQL Server would have to perform page splits to maintain the percentage of free space, specified by the fill factor, on each page as data is entered. Therefore, if the data in the table is significantly modified and new data added, the empty space in the data pages can fill. In this situation, the index can be re-created and the fill factor specified again to redistribute the data.
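One way to re-create an index with a new fill factor is CREATE INDEX with the DROP_EXISTING clause, sketched below. The index name IX_City, table MyTable, and column City are hypothetical placeholders, and the index must already exist for DROP_EXISTING = ON to succeed.

CREATE NONCLUSTERED INDEX [IX_City]
ON dbo.[MyTable] ([City])
WITH (FILLFACTOR = 80, DROP_EXISTING = ON);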

Page Splits and Performance Considerations

When a new row is added to a full index page, the Database Engine moves approximately half the rows to a new page to make room for the new row. This reorganization is known as a page split. A page split makes room for new records, but can take time to perform and is a resource-intensive operation. Also, it can cause fragmentation that causes increased I/O operations. A correctly chosen fill factor value can reduce the potential for page splits by providing sufficient space for index expansion as data is added to the underlying table. When frequent page splits occur, the index can be rebuilt by using a new or existing fill factor value to redistribute the data. For more information, see Reorganizing and Rebuilding Indexes.

Although a low fill factor value, other than 0, may reduce the requirement to split pages as the index grows, the index will require more storage space and can decrease read performance. Even for an application oriented for many insert and update operations, the number of database reads typically outnumbers database writes by a factor of 5 to 10. Therefore, specifying a fill factor other than the default can decrease database read performance by an amount inversely proportional to the fill factor setting. For example, a fill factor value of 50 can cause database read performance to decrease by two times. Read performance is decreased because the index contains more pages, therefore increasing the disk I/O operations required to retrieve the data.

Adding Data to the End of the Table

A fill factor other than 0 or 100 can be good for performance if the new data is evenly distributed throughout the table. However, if all the data is added to the end of the table, the empty space in the index pages will not be filled. For example, if the index key column is an IDENTITY column, the key for new rows is always increasing and the index rows are logically added to the end of the index. If existing rows will be updated with data that lengthens the size of the rows, use a fill factor of less than 100. The additional bytes on each page will help minimize page splits caused by additional length in the rows.

The following code sample rebuilds the index [$1] in the Contact table with a fill factor of 80.

ALTER INDEX [$1]
ON dbo.[CRONUS International Ltd_$Contact]
REBUILD WITH (FILLFACTOR = 80)

When adjusting the fill factor, you can add the SORT_IN_TEMPDB = ON parameter to have the intermediate sort results stored in the tempdb. If you set SORT_IN_TEMPDB to OFF, the sort results are stored in the filegroup or partition scheme in which the resulting index is stored. You can add the STATISTICS_NORECOMPUTE parameter to specify whether out-of-date index statistics should be automatically recomputed.
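Before changing fill factors, it can help to confirm that page splits are actually occurring at a high rate. One option, sketched here, is to read the Page Splits/sec counter from the sys.dm_os_performance_counters view (or watch the same counter in Performance Monitor). The value is cumulative since the server started, so sample it twice and compare the difference.

SELECT object_name, counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name = 'Page Splits/sec'
  AND object_name LIKE '%Access Methods%';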

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Demonstration: Define Server-wide Fill Factor Perform the following steps to change the server-wide default index fill factor. 1. Open SQL Server Management Studio. 2. Connect to the NAV-SRV-01 server. 3. In Object Explorer, right-click the server node and select Properties. The Server Properties window appears. 4. Click the Database Settings page. 5. Set the Default index fill factor to 80. FIGURE 6.5 SERVER-WIDE DATABASE SETTINGS 6. Click OK to apply the new setting and to close the window. 6-22

Demonstration: Define Server-wide Fill Factor Using Code

The server-wide default fill factor can also be changed by using the sp_configure stored procedure. Some options supported by sp_configure, such as Default index fill factor, are designated as Advanced. By default, these options are not available for viewing and changing. Setting the Show Advanced Options configuration option to 1 makes these options available. Perform the following steps to set the default fill factor to 80.

1. Open SQL Server Management Studio.
2. Connect to the NAV-SRV-01 server.
3. Click New Query to open a new query window.
4. In the New Query window, enter the following command:

sp_configure 'show advanced options', 1
RECONFIGURE WITH OVERRIDE;
GO
sp_configure 'fill factor (%)', 80
RECONFIGURE WITH OVERRIDE;
GO
sp_configure 'show advanced options', 0
RECONFIGURE WITH OVERRIDE;

5. Click Execute to run the query.

Demonstration: Define Index Fill Factor Using T-SQL

Perform the following steps to rebuild the indexes of the Contact table with a fill factor of 75.

1. Open SQL Server Management Studio.
2. Connect to the NAV-SRV-01 server.
3. Select the database on which you want to update indexes.
4. Click New Query to open a new query window.
5. In the New Query window, enter the following command:

ALTER INDEX ALL
ON dbo.[CRONUS International Ltd_$Contact]
REBUILD WITH (FILLFACTOR = 75)

6. Click Execute to run the query.

Note that FILLFACTOR can only be specified with REBUILD (not with REORGANIZE). After all indexes have been rebuilt with the specified fill factor, you can use the following code to view the fill factor for all indexes in the table:

SELECT name, index_id, type, is_primary_key, type_desc, fill_factor
FROM sys.indexes
WHERE object_id = object_id('CRONUS International Ltd_$Contact')

The result will look as follows:

FIGURE 6.6 THE ADJUSTED FILL FACTOR

Monitoring

Monitoring is an important aspect of database administration, because Microsoft SQL Server provides services in a dynamic environment. The data in the application changes, the type of access that users require changes, and the way that users connect changes. SQL Server automatically manages system-level resources such as memory and disk space, but monitoring lets administrators identify performance trends to determine whether changes are necessary. Monitoring an instance of SQL Server requires analysis of some key aspects of the system. Eliminating the physical bottlenecks can immediately affect performance and further isolate the design issues in the database, Transact-SQL queries, or client applications. It is important to monitor SQL Server performance so that you can identify bottlenecks (or the symptoms of upcoming bottlenecks) at an early stage, determine their cause, and eliminate them. Bottlenecks can be eliminated by upgrading hardware, by distributing server load among other instances of SQL Server, or by tuning SQL Server databases, indexes, and queries. To determine performance trends, we recommend that you store historical monitoring information in a separate database. Keep in mind that the more monitoring data you keep, the bigger the load for the monitoring server.

Chapter 6: Maintenance Performance The goal of monitoring databases is to assess how a server is performing. Effective monitoring involves taking periodic snapshots of current performance to isolate processes that are causing problems, and collecting data continuously over time to track performance trends. Microsoft SQL Server and the Microsoft Windows operating system provide utilities that let you view the current condition of the database and to track performance as conditions change. Monitoring SQL Server lets you do the following: Determine whether you can improve performance. For example, by monitoring the response times for frequently used queries, you can determine whether changes to the query or indexes on the tables are required. Evaluate user activity. For example, by monitoring users who are trying to connect to an instance of SQL Server, you can determine whether security is set up adequately and test applications or development systems. For example, by monitoring SQL queries as they are executed, you can determine whether they are written correctly and producing the expected results. Troubleshoot any problems or debug application components, such as stored procedures. Monitoring is important because SQL Server provides a service in a dynamic environment. The data in the application changes, the type of access that users require changes, and the way that users connect changes. The types of applications accessing SQL Server may even change, but SQL Server automatically manages system-level resources such as memory and disk space so that the need for extensive system-level manual tuning is minimized. But monitoring lets administrators identify performance trends to determine whether changes are necessary. Ongoing evaluation of the database performance helps you minimize response times and maximize throughput yielding optimal performance. Efficient network traffic, disk I/O, and CPU usage are key to peak performance. You need to thoroughly analyze the application requirements, understand the logical and physical structure of the data, assess database usage, and negotiate tradeoffs between conflicting uses such as online transaction processing (OLTP) versus decision support. 6-25
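One way to find the most expensive or most frequently executed statements, without setting up a trace, is to query the cached plan statistics. The following is only a sketch that lists the top statements by total elapsed time from sys.dm_exec_query_stats; it can only see statements whose plans are still in the plan cache.

SELECT TOP (20)
    qs.execution_count,
    qs.total_elapsed_time / 1000 AS total_elapsed_ms,
    qs.total_logical_reads,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
             WHEN -1 THEN DATALENGTH(st.text)
             ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_elapsed_time DESC;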

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Changing conditions result in changing performance. In your evaluations, you can see performance changes as the number of users increases, user access and connection methods change, database contents grow, client applications change, data in the applications changes, queries become more complex, and network traffic increases. By using SQL Server tools to monitor performance, you can associate some changes in performance with changing conditions and complex queries. The following scenarios provide examples: By monitoring the response times for frequently used queries, you can determine whether changes to the query or indexes on the tables where the queries execute are required. By monitoring Transact-SQL queries as they are executed, you can determine whether the queries are written correctly and producing the expected results. By monitoring users who try to connect to an instance of SQL Server, you can determine whether security is set up adequately and test applications or development systems. Response time is the length of time required for the first row of the result set to be returned to the user in the form of visual confirmation that a query is being processed. Throughput is the total number of queries handled by the server during a specified period of time. As the number of users increases, so does the competition for a server's resources, which in turn increases response time and decreases overall throughput. Database Growth In addition to monitoring system resources and database activity, database growth, or free disk space should be monitored when using SQL Server. Monitoring free disk space on the SQL Server is very important. If SQL Server runs out of disk space on a specific drive, it means that SQL Server has no space left to write transactions to the database, the transaction log or the tempdb database. As a consequence, SQL Server does the following: Reports error message 9002 or 1105 in the Microsoft SQL Server error log. Marks the database as suspect. Takes the database offline. A lack of disk space can cause serious disruptions in the SQL Server production environment and will prevent running applications from completing transactions. In either case, user action is required to make disk space available, as databases marked as suspect are inaccessible to the end-user. 6-26
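As part of routine monitoring, you can also check the state of each database and what, if anything, is preventing transaction log space from being reused. This is a small sketch using the sys.databases catalog view; any database whose state_desc is not ONLINE deserves immediate attention, and a log_reuse_wait_desc value such as LOG_BACKUP points to the cause of log growth.

SELECT name, state_desc, recovery_model_desc, log_reuse_wait_desc
FROM sys.databases
ORDER BY name;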

Chapter 6: Maintenance For more information about how to troubleshoot any of these situations, see Troubleshooting a Full Transaction Log (Error 9002) (http://msdn.microsoft.com/en-us/library/ms175495.aspx), Troubleshooting Insufficient Data Disk Space (http://msdn.microsoft.com/enus/library/ms366198.aspx), or Troubleshooting Insufficient Disk Space in tempdb (http://msdn.microsoft.com/en-us/library/ms176029.aspx). How to Monitor Database Growth? Monitoring database growth becomes more important as data is entered in the database and databases are added to the SQL Server. Instead of measuring growth of individual databases, you can measure the free space on the drives of the SQL Server. Although Windows Script Host (WSH), Visual Basic Script (VBS) or Windows Management Instrumentation (WMI) offer better ways of collecting disk space information, it can also be done using Transact-SQL. To do this, use the xp_fixeddrives stored procedure, as shown in the following code: xp_fixeddrives The results of running this query will look as follows: FIGURE 6.7 XP_FIXEDDRIVES RETURNS FREE DISK SPACE INFORMATION To monitor disk space daily, you can schedule this query and store the results of the query (with a time stamp) in a separate table. This will allow you to analyze trends in disk space information and predict hardware bottlenecks (by using larger hard disks). This information will show you when the free disk space started decreasing significantly and allows you to calculate when you will run out of space. 6-27

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 The following script shows how to create a table DiskSpaceInfo in a monitoring database (Monitoring Database) and how to insert the xp_fixeddrives information into the table. Use [MonitorDB]; -- Create Table (One Time Only) IF NOT EXISTS (SELECT * FROM sys.sysobjects WHERE id = object_id(n'[monitordb].dbo.[diskspaceinfo]')) CREATE TABLE [MonitorDB].dbo.DiskSpaceInfo (EntryNo int IDENTITY, MetricDate datetime default getdate(), DriveName varchar(2), FreeSpaceMB int ); -- Add Monitoring Data INSERT INTO [MonitorDB].dbo.DiskSpaceInfo (DriveName, FreeSpaceMB) EXEC xp_fixeddrives; -- Show Records in Table -- SELECT * FROM [MonitorDB].dbo.DiskSpaceInfo; This script can be scheduled on SQL Server to collect drive information at specific time intervals, for example every hour or every day at 12:00PM. The smaller the interval, the more accurate the data become and the bigger the monitoring database grows. Database Mail Alerts When measuring the disk space information, you can also setup alerts that will be triggered automatically when a specific limit is reached. You can set up alerts in Windows Performance Monitor or you can set them up in SQL Server. If you want to send alerts through e-mail, you must enable database mail in SQL Server. To enable database mail in SQL Server 2008, use the following procedure: 1. Open SQL Server Management Studio. 2. Connect to the NAV-SRV-01 Database Engine. 3. In Object Explorer, right-click the server name and select Facets. 6-28

Chapter 6: Maintenance 4. In the View Facets window, in the Facet field, select Surface Area Configuration from the drop-down list. 5. Set the DatabaseMailEnabled property to True. FIGURE 6.8 SURFACE AREA CONFIGURATION IN SQL SERVER 2008. 6. Click OK to close the View Facets window. To enable database mail in SQL Server 2005, perform the following procedure: 1. In the Windows taskbar, click Start > All Programs > Microsoft SQL Server 2005 > Configuration Tools > SQL Server Surface Area Configuration. 2. Select Surface Area Configuration for Features. 3. In the Surface Area Configuration for Features window, select the instance of SQL Server to configure. 4. Select Database Engine. 5. On the Database Mail page, check the Enable Database Mail stored procedures field. 6. Click OK. NOTE: Database Mail is a component for sending e-mail messages from the Database Engine with SMTP. Enable Database Mail stored procedures only if you plan to configure and use Database Mail. 6-29
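After Database Mail has been enabled, it still needs at least one account and one profile before it can send messages. The following is only a sketch with fictitious SMTP server and e-mail addresses; substitute your own values, or use the Database Mail Configuration Wizard in SQL Server Management Studio instead.

USE msdb;
GO
-- Create a Database Mail account (server and addresses are placeholders)
EXEC dbo.sysmail_add_account_sp
    @account_name = 'NAV Monitoring Account',
    @email_address = 'sqlalerts@cronus.example',
    @display_name = 'NAV-SRV-01 Alerts',
    @mailserver_name = 'smtp.cronus.example';

-- Create a profile and add the account to it
EXEC dbo.sysmail_add_profile_sp
    @profile_name = 'NAV Monitoring Profile';

EXEC dbo.sysmail_add_profileaccount_sp
    @profile_name = 'NAV Monitoring Profile',
    @account_name = 'NAV Monitoring Account',
    @sequence_number = 1;

-- Send a test message
EXEC dbo.sp_send_dbmail
    @profile_name = 'NAV Monitoring Profile',
    @recipients = 'dba@cronus.example',
    @subject = 'Database Mail test from NAV-SRV-01',
    @body = 'Database Mail is configured.';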

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 For more information about how to set up database mail, see Database Mail How-to Topics (http://msdn.microsoft.com/en-us/library/ms188298.aspx). Measuring Total Disk Capacity The xp_fixeddrives extended stored procedure returns free disk space in megabytes, so you can set up alerts that trigger when the amount of free disk reaches or falls under a specific threshold value in megabytes. If you want to express the threshold value as a percentage of the total disk capacity, you have to use other methods outside SQL Server. The following code shows you how to measure the total disk capacity and store it in a table DiskSpaceInfo2 in the MonitorDB database. Use [MonitorDB]; SET NOCOUNT ON; DECLARE @hr int; DECLARE @fso int; DECLARE @drive char(1); DECLARE @odrive int; DECLARE @TotalSize varchar(20); DECLARE @MB bigint; SET @MB = 1048576; -- Create Temporary Table CREATE TABLE #drives (ServerName varchar(15), drive char(1), FreeSpace int NULL, TotalSize int NULL, FreespaceTimestamp DATETIME NULL); -- Create Table (One Time Only) IF NOT EXISTS (SELECT * FROM sys.sysobjects WHERE id = object_id(n'[monitordb].dbo.[diskspaceinfo2]')) create TABLE DiskSpaceInfo2 (EntryNo int identity, ServerName varchar(15), drive varchar(1), FreeSpace int NULL, TotalSize int NULL, FreePct int, FreespaceTimestamp DATETIME NULL); INSERT #drives(drive,freespace) EXEC master.dbo.xp_fixeddrives; EXEC @hr=sp_oacreate 'Scripting.FileSystemObject',@fso OUT; IF @hr <> 0 EXEC sp_oageterrorinfo @fso; DECLARE dcur CURSOR LOCAL FAST_FORWARD FOR 6-30

Chapter 6: Maintenance SELECT drive from #drives ORDER by drive; OPEN dcur; FETCH NEXT FROM dcur INTO @drive; WHILE @@FETCH_STATUS=0 BEGIN EXEC @hr = sp_oamethod @fso,'getdrive', @odrive OUT, @drive; IF @hr <> 0 EXEC sp_oageterrorinfo @fso; EXEC @hr = sp_oagetproperty @odrive,'totalsize', @TotalSize OUT; IF @hr <> 0 EXEC sp_oageterrorinfo @odrive; UPDATE #drives SET TotalSize=@TotalSize/@MB, ServerName = host_name(), FreespaceTimestamp = (GETDATE()) WHERE drive=@drive; FETCH NEXT FROM dcur INTO @drive; END; CLOSE dcur; DEALLOCATE dcur; EXEC @hr=sp_oadestroy @fso; IF @hr <> 0 EXEC sp_oageterrorinfo @fso; INSERT INTO DiskSpaceInfo2 (ServerName,drive,TotalSize,FreeSpace,FreePct, FreespaceTimestamp) SELECT ServerName, drive, TotalSize as 'Total(MB)', FreeSpace as 'Free(MB)', CAST((FreeSpace/(TotalSize*1.0))*100.0 as int) as 'Free(%)', FreespaceTimestamp FROM #drives ORDER BY drive; DROP TABLE #drives; SELECT * FROM DiskSpaceInfo2 To use the File Script Object set the OleAutomationEnabled property to True. To do this follow the same procedure as for the DatabaseMailEnabled property. NOTE: The OLE Automation extended stored procedures (XPs) allow Transact- SQL batches, stored procedures, and triggers to reference custom OLE Automation objects. Enable OLE Automation only if applications or Transact- SQL scripts use OLE Automation XPs. 6-31
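The disk space collection shown in the preceding sections can be scheduled with a SQL Server Agent job instead of being run manually. The following is only a sketch; it assumes that the MonitorDB database and the DiskSpaceInfo table from the earlier script exist and that the SQL Server Agent service is running, and the job name and schedule are arbitrary.

USE msdb;
GO
EXEC dbo.sp_add_job
    @job_name = N'Collect disk space info',
    @enabled = 1;

EXEC dbo.sp_add_jobstep
    @job_name = N'Collect disk space info',
    @step_name = N'Insert xp_fixeddrives output',
    @subsystem = N'TSQL',
    @database_name = N'MonitorDB',
    @command = N'INSERT INTO dbo.DiskSpaceInfo (DriveName, FreeSpaceMB)
                 EXEC master.dbo.xp_fixeddrives;';

EXEC dbo.sp_add_jobschedule
    @job_name = N'Collect disk space info',
    @name = N'Daily at noon',
    @freq_type = 4,               -- daily
    @freq_interval = 1,           -- every day
    @active_start_time = 120000;  -- 12:00:00

EXEC dbo.sp_add_jobserver
    @job_name = N'Collect disk space info',
    @server_name = N'(local)';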

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Although it is not mandatory, we recommend to store the monitored data in a performance data warehouse. This is a (small) database that only contains performance data from one or more servers, collected by different monitoring tools. While data is collected and entered into the database, you can create views or Microsoft SQL Server Reporting Services reports to view the trends in time. Doing this, you can easily see when disk space starts changing considerably. When you see a significant change in database growth, you should closely follow-up the growth and evaluate whether you have to take some action. It is a good practice to find out what is causing the database growth. At the same time, you should plan for future changes. The information will serve as a historical system documentation that can be analyzed later. Shrinking the Database Shrinking the database file daily to recover disk space is not feasible. If you shrink the database file, you have the gain of the empty disk space, but keep in mind that expanding the database during daily work requires system resources. You should only consider shrinking the database file when the database is significantly bigger than the amount of data in the database (for example, after you have archived or removed some of the data) and when the database is either read-only (archive purpose) or rather static (no or only a small amount of data is added). For example, if you have a 3GB database that contains only 1GB of data, you can shrink the database file. If you need to shrink the database file, we recommend not to use a maintenance plan to do it. Instead, shrink the database manually by using the DBCC SHRINKDATABASE or DBCC SHRINKFILE statement. To shrink all data and log files for a specific database, execute the DBCC SHRINKDATABASE command. To shrink one data or log file at a time for a specific database, execute the DBCC SHRINKFILE command. The following example reduces the size of the data and log files in the Demo Database NAV (6-0) database to allow for 10 percent free space in the database. DBCC SHRINKDATABASE ('Demo Database NAV (6-0)', 10); To view the current amount of free (unallocated) space in the database, run the sp_spaceused stored procedure. Consider the following information when you plan to shrink a file: A shrink operation is most effective after an operation that creates a lot of unused space, such as a truncate table or a drop table operation. Most databases require some free space to be available for regular day-to-day operations. If you shrink a database repeatedly and notice that the database size grows again, this indicates that the space that was shrunk is required for regular operations. In these cases, repeatedly shrinking the database is a wasted operation. 6-32

A shrink operation does not preserve the fragmentation state of indexes in the database, and generally increases fragmentation to a degree. This is another reason not to repeatedly shrink the database.

Unless you have a specific requirement, do not set the AUTO_SHRINK database option to ON.

Troubleshooting File Shrinking

Note that, unlike DBCC SHRINKFILE, DBCC SHRINKDATABASE does not allow you to make a database smaller than the minimum size of the database. The minimum size is the size specified when the database was originally created, or the last size explicitly set by using a file size changing operation such as DBCC SHRINKFILE or ALTER DATABASE. For example, if a database is originally created with a size of 10 MB and grows to 100 MB, the smallest the database can be reduced to is 10 MB, even if all the data in the database has been deleted.

Typically it is the log file that appears not to shrink. This is usually the result of a log file that has not been truncated. You can truncate the log by setting the database recovery model to SIMPLE, or by backing up the log and then running the DBCC SHRINKFILE operation again. If insufficient free space is available, the shrink operation cannot reduce the file size any further. Run the DBCC SQLPERF command to return the space used in the transaction log, as shown in the following code.

DBCC SQLPERF(LOGSPACE)

For more information about shrinking databases and log files, see DBCC SHRINKFILE (http://msdn.microsoft.com/en-us/library/ms189493.aspx), DBCC SHRINKDATABASE (http://msdn.microsoft.com/en-us/library/ms190488.aspx), and Shrinking the Transaction Log (http://msdn.microsoft.com/en-us/library/ms178037.aspx).
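As a sketch of the log-shrinking steps described above, the following first lists the logical file names (which DBCC SHRINKFILE needs), then shrinks the log file of the demo database to a target of 500 MB after a log backup or after switching to the SIMPLE recovery model. The logical name 'Demo Database NAV (6-0)_Log' is an assumption; check the output of the first query for the actual name in your database.

USE [Demo Database NAV (6-0)];
GO
-- Find the logical file names and current sizes (size is in 8-KB pages)
SELECT name, type_desc, size
FROM sys.database_files;

-- Shrink the transaction log file to a target size of 500 MB
DBCC SHRINKFILE ('Demo Database NAV (6-0)_Log', 500);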

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Summary To reduce the size of the transaction log, implement a sufficient backup strategy. If it is necessary, you can shrink the log files manually by using DBCC SHRINKFILE. Database and transaction log backups can be included as a subtask in a maintenance plan. However, depending on your needs, backups will frequently be scheduled to run during the day (especially transaction log backups), so the need to include backups in a maintenance plan is low. In fact, for schedule maintenance reasons, it is more practical to keep the backup schedule separated from the maintenance plan. In this chapter you learn how to set up and schedule maintenance plans to perform important maintenance tasks on SQL Server. In addition, the chapter shows how to execute specific maintenance tasks by using Transact-SQL scripts. 6-34

Chapter 6: Maintenance Test Your Knowledge Test your knowledge with the following questions. 1. Which dynamic management view or function allows retrieving index fragmentation? ( ) sys.dm_db_index_operational_stats ( ) sys.dm_db_index_usage_stats ( ) sys.dm_db_index_physical_stats ( ) sys.dm_db_index_frag_stats 2. What does the ALTER INDEX [$1] ON [CRONUS International Ltd_$Contact] REBUILD statement do? (Select all that apply) ( ) Updates the [$1] index for the Contact table ( ) Rebuilds the [$1] index for the Contact table with the most recently set FILLFACTOR value for that table. ( ) Rebuilds the [$1] index for the Contact table with a Fill Factor of 100. ( ) Drops the [$1] index for the Contact table 3. What does the ALTER INDEX ALL ON [CRONUS International Ltd_$Contact] REORGANIZE WITH (FILLFACTOR= 75) statement do? ( ) Updates all indexes for the Contact table ( ) Reorganizes all indexes for the Contact table with a FILLFACTOR value of 75. ( ) Drops the all indexes for the Contact table ( ) This statement is not correct. 4. Which stored procedure runs the UPDATE STATISTICS against all userdefined and internal tables in the current database? ( ) sp_autostats ( ) sp_updatestats ( ) sp_createstats ( ) sp_updatestats ALL 6-35

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 5. Name two reasons why we recommend not to include database shrinking in a maintenance plan. 6. How can you control the growth of transaction log files for a specific database? 6-36

Chapter 6: Maintenance Lab 6.1 - Create a Maintenance Plan In this lab you set up a maintenance plan in SQL Server to optimize the maintenance for a Microsoft Dynamics NAV database. Scenario Tim, the IT manager, has run some tests on the Demo Database NAV (6-0) database. The tests have shown that the indexes for the following warehouse activity tables are fragmented: Warehouse Request Warehouse Activity Header Warehouse Activity Line Registered Whse. Activity Header Registered Whse. Activity Line Warehouse Shipment Header Warehouse Shipment Line Posted Whse. Shipment Header Posted Whse. Shipment Line Tim creates a maintenance plan to defragment (reorganize) all indexes in the warehouse activity tables. He schedules the maintenance plan so that it is run automatically every night at 8:00 P.M. (except on weekends). Finally, he will test the maintenance plan. Challenge Yourself! Create a maintenance plan that reorganizes all indexes of the following warehouse activity tables: Warehouse Request Warehouse Activity Header Warehouse Activity Line Registered Whse. Activity Header Registered Whse. Activity Line Warehouse Shipment Header Warehouse Shipment Line Posted Whse. Shipment Header Posted Whse. Shipment Line Schedule the maintenance plan to run every night at 8:00 P.M. (except on weekends). Test the plan. 6-37

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Need a Little Help? Perform the following steps to complete this lab: 1. Create and Schedule the Maintenance Plan. 2. Start SQL Server Agent. 3. Test the Maintenance Plan. Step by Step Perform the following steps to complete the lab: Create and Schedule the Maintenance Plan 1. Open SQL Server Management Studio. 2. Connect to the NAV-SRV-01 Database Engine by using a login with sysadmin credentials. 3. Expand the NAV-SRV-01 server. 4. Click Management. 5. Right-click Maintenance Plans and select Maintenance Plan Wizard. 6. On the SQL Server Maintenance Plan Wizard page, click Next. 7. On the Select Plan Properties page, in the Name field, enter the name for the maintenance plan: Lab 6_1. 8. In the Description field, enter the following description: Defragment Indexes of Warehouse Activity tables. 9. Select the Single schedule for the entire plan or no plan option. 10. Click the Change button to define the schedule. 11. In the Job Schedule Properties window, verify that the Schedule type is set to Recurring and that Enabled is selected. 12. Verify that the Frequency has been set to Weekly and that the Recurs every week(s) on is set to 1. 6-38

Chapter 6: Maintenance 13. Select the weekdays Monday to Friday. 14. Clear Saturday and Sunday. FIGURE 6.9 JOB SCHEDULE PROPERTIES 15. Set the Occurs once at field to 8:00PM. 16. Set the Start Date to the current date and select the No end date option. 17. Click OK to close the Job Schedule Properties window. 18. Click Next. 19. On the Select Maintenance Tasks page, select Reorganize Index. 20. Click Next. 21. On the Select Maintenance Task Order page, click Next. 22. On the Define Reorganize Index Task page, in the Database field, select the Demo Database NAV (6-0) database. 23. In the Objects field, select Tables. 24. In the Selection field, select the following tables: a. Warehouse Request b. Warehouse Activity Header c. Warehouse Activity Line d. Registered Whse. Activity Header e. Registered Whse. Activity Line 6-39

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 f. Warehouse Shipment Header g. Warehouse Shipment Line h. Posted Whse. Shipment Header i. Posted Whse. Shipment Line 25. Click Next. 26. On the Select Report Options page, verify that the Write a report to a text file option is selected. Leave the default Folder Location. 27. Click Next. 28. On the Complete the Wizard page, verify the options selected for the maintenance plan. 29. Click Finish to create the maintenance plan. 30. On the Maintenance Plan Wizard Progress page, click Close. Start SQL Server Agent 1. In SQL Server Management Studio, expand Management. 2. Select SQL Server Agent. 3. Verify that SQL Server Agent is running. If SQL Server Agent is not running, an icon with a small red square displays and no items are displayed under the SQL Server Agent node. If SQL Server Agent is running, an icon with a small green arrow displays and the SQL Server Agent node will contain items. FIGURE 6.10 SQL SERVER AGENT NODE IN SSMS To start SQL Server Agent, right-click SQL Server Agent and select Start. Click Yes to confirm starting the SQL Server Agent Service (SQLSERVERAGENT) on NAV-SRV-01. NOTE: You can start and stop the SQL Server Agent service in SQL Server Management Studio. However, to configure the SQL Server Agent service, we recommend that you use the SQL Server Configuration Manager. 6-40

Chapter 6: Maintenance Test the Maintenance Plan The maintenance plan can be run in two ways. Either you can run the maintenance plan from the Maintenance Plans node in SQL Server Management Studio, or you can run the SQL Server job that corresponds to the maintenance task in the SQL Server Agent node. Method 1: Start the Maintenance Plan under Maintenance Plans. 1. In SQL Server Management Studio, select Maintenance Plans. 2. In the Object Explorer Details pane, right-click the Lab 6_1 maintenance plan and select Execute. FIGURE 6.11 MAINTENANCE PLAN EXECUTION PROGRESS The Execute Maintenance Plan dialog opens, showing the progress of the maintenance plan. 3. Click Close to exit the Execute Maintenance Plan dialog box. Method 2: Start the Maintenance Plan under SQL Server Agent > Jobs. 1. In SQL Server Management Studio, select SQL Server Agent. 6-41

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 2. Select Jobs. 3. In the Object Explorer Details pane, right-click the Lab 6_1.Subplan_1 job and select Start Job at Step. FIGURE 6.12 SQL SERVER AGENT JOB EXECUTION PROGRESS The Start Jobs dialog opens, showing the progress of the job. 4. Click Close to exit the Start Jobs dialog box. 6-42

Lab 6.2 - Change the Fill Factor for Hot Tables Chapter 6: Maintenance In this lab you change the fill factor for the indexes of specific tables to reduce the number of page splits. Scenario While monitoring the SQL Server, Tim detects an increased number of page splits when users are querying the following tables: Sales Header Sales Line Purchase Header Purchase Line Document Dimension To reduce the number of page splits, Tim decides to change the fill factor to 70% so that 30% of free space per page is available for data updates. While indexes are being re-created, the intermediate sort results must be stored in the tempdb and out-of-data statistics must be recomputed. Verify the index fill factor for the tables afterward. Challenge Yourself! Use a Transact-SQL script to change the fill factor of all indexes for the following tables to 70%: Sales Header Sales Line Purchase Header Purchase Line Document Dimension Make sure that you use the tempdb to store the intermediate sort results and make sure that out-of-date statistics are automatically recomputed. Verify the index fill factor for the tables afterward. Need a Little Help? Perform the following steps to complete the lab: 1. Open SQL Server Management Studio. 2. Run the Query. 3. Verify the index fill factor for the tables. 6-43

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Step by Step Perform the following steps to complete the lab: Open SQL Server Management Studio 1. In the Windows Taskbar, click Start > All Programs > Microsoft SQL Server 2008 > SQL Server Management Studio. 2. Connect to the NAV-SRV-01 Database Engine. 3. In the database drop-down list, select the Demo Database NAV (6-0) database. Run the Query 1. In SQL Server Management Studio, click the New Query button to open a new query window. 2. In the New Query window, enter the following command: ALTER INDEX ALL ON [CRONUS International Ltd_$Sales Header] REBUILD WITH (FILLFACTOR = 70, SORT_IN_TEMPDB = ON, STATISTICS_NORECOMPUTE = ON); ALTER INDEX ALL ON [CRONUS International Ltd_$Sales Line] REBUILD WITH (FILLFACTOR = 70, SORT_IN_TEMPDB = ON, STATISTICS_NORECOMPUTE = ON); ALTER INDEX ALL ON [CRONUS International Ltd_$Purchase Header] REBUILD WITH (FILLFACTOR = 70, SORT_IN_TEMPDB = ON, STATISTICS_NORECOMPUTE = ON); ALTER INDEX ALL ON [CRONUS International Ltd_$Purchase Line] REBUILD WITH (FILLFACTOR = 70, SORT_IN_TEMPDB = ON, STATISTICS_NORECOMPUTE = ON); ALTER INDEX ALL ON [CRONUS International Ltd_$Document Dimension] REBUILD WITH (FILLFACTOR = 70, SORT_IN_TEMPDB = ON, STATISTICS_NORECOMPUTE = ON); 3. Click Execute to run the query. Verify the fill factor of the tables. 1. In SQL Server Management Studio, click the New Query button to open a new query window. 6-44

Chapter 6: Maintenance 2. In the New Query window, enter the following command: USE [Demo Database NAV (6-0)]; SELECT name, index_id, type, is_primary_key, type_desc, fill_factor FROM sys.indexes WHERE fill_factor = 70 3. Click Execute to run the query. The statement will list all indexes with a fill factor of 70. 6-45

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Lab 6.3 - Compare Query Execution Before and After Deleting Statistics In this lab you investigate the importance of SQL Server statistics for query execution. Scenario To reduce the overhead on the server, Mort, the IT systems developer, has disabled the Auto Update Statistics and Auto Create Statistics options on the Demo Database NAV (6-0) database. As a consequence, statistical information is not updated for the database. Tim, the IT manager, explains that updated index statistics are really important for performance on SQL Server. As Mort is not convinced, Tim shows Mort how important statistics are, by running queries with and without SQL Server statistics. Afterward, he verifies that statistics exist for a specific table and compares the execution plans with and without statistics. Challenge Yourself! Prove the importance of SQL Server statistics for query execution. Use the following query as a base. SELECT * FROM [CRONUS International Ltd_$Sales Line] WHERE ([Document Type]=1) AND ([Type]=2) AND ([No_]='1896-S') ORDER BY [Document Type],[Type],[No_],[Variant Code], [Drop Shipment],[Location Code],[Shipment Date] This is the query that executes when you click Sales, Orders on the Item Card for item 1896-S. Verify the presence of index statistics for the Sales Line table and compare the execution plans. Need a Little Help? Perform the following steps to complete the lab: 1. Open SQL Server Management Studio. 2. Remove all Indexes. 3. Run the Sales Line Query. 4. Re-create Index Statistics. 5. Run the Sales Line Query Again. 6-46

Chapter 6: Maintenance 6. Verify Statistics for the Sales Line Table. 7. Compare Execution Plans. Step by Step Perform the following steps to complete the lab: Open SQL Server Management Studio 1. In the Windows Taskbar, click Start > All Programs > Microsoft SQL Server 2008 > SQL Server Management Studio. 2. Connect to the NAV-SRV-01 Database Engine. 3. In the database drop-down list, select the Demo Database NAV (6-0) database. Remove All Statistics 1. Click the New Query button to open a new query window. 2. In the New Query window, enter the following query to remove statistics for all tables and views in the database: DECLARE @id int; DECLARE @name varchar(128); DECLARE @statement nvarchar(1000); DECLARE stat_cur CURSOR FAST_FORWARD FOR SELECT [id], [name] FROM sys.sysindexes WHERE ((indexproperty(id, name, N'IsStatistics')=1) OR (indexproperty(id, name, N'IsAutoStatistics')=1)) AND (isnull(objectproperty([id], N'IsUserTable'),0)=1); OPEN stat_cur; FETCH NEXT FROM stat_cur INTO @id, @name; WHILE @@fetch_status = 0 BEGIN SET @statement = 'DROP STATISTICS [' + object_name(@id) + '].[' + @name + ']'; BEGIN TRANSACTION; EXEC sp_executesql @statement; COMMIT TRANSACTION; FETCH NEXT FROM stat_cur INTO @id, @name; END; CLOSE stat_cur; DEALLOCATE stat_cur; 3. Click Execute to run the query. 4. Close the query window without saving the query. 6-47

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Run the Sales Line Data Query 1. Click the New Query button to open a new query window. 2. In the New Query window, enter the following query: SELECT * FROM [CRONUS International Ltd_$Sales Line] WHERE ([Document Type]=1) AND ([Type]=2) AND ([No_]='1896-S') ORDER BY [Document Type],[Type],[No_],[Variant Code], [Drop Shipment],[Location Code],[Shipment Date] 3. Right-click the query window and select Display Estimated Execution Plan. SQL Server will now run the query and show the estimated execution plan. FIGURE 6.13 EXECUTION PLAN WITHOUT STATISTICS Re-create Index Statistics 1. Click the New Query button to open a new query window. 2. In the New Query window, enter the following query to create index statistics: sp_createstats 'indexonly'; 3. Run the query. 4. Close the query window. 6-48

Chapter 6: Maintenance Run the Sales Line Data Query Again 1. Click the New Query button to open a new query window. 2. In the New Query window, enter the same query as before: SELECT * FROM [CRONUS International Ltd_$Sales Line] WHERE ([Document Type]=1) AND ([Type]=2) AND ([No_]='1896-S') ORDER BY [Document Type],[Type],[No_],[Variant Code], [Drop Shipment],[Location Code],[Shipment Date] 3. Right-click the query window and select Display Estimated Execution Plan. SQL Server will now run the query and show another execution plan. FIGURE 6.14 EXECUTION PLAN WITH STATISTICS Verify Statistics for the Sales Line Table 1. Click the New Query button to open a new query window. 2. In the New Query window, enter the following query to verify the existing index statistics for the Sales Line table: SELECT [id], object_name([id]) AS [tabname], [name], isnull(indexproperty(id, name, N'IsStatistics'),0) AS [stats], isnull(indexproperty(id, name, N'IsAutoStatistics'),0) AS [auto] FROM sys.sysindexes WHERE ((indexproperty(id, name, N'IsStatistics') = 1) OR (indexproperty(id, name, N'IsAutoStatistics') = 1)) AND (isnull(objectproperty([id], N'IsUserTable'),0) = 1) AND object_name([id]) LIKE '%Sales Line' ORDER BY tabname 3. Run the query. 4. Close the query window. 6-49

Compare Execution Plans

When comparing the execution plans, notice that in the second execution plan SQL Server spends more time executing index seeks and less time doing key lookups. The use of a Key Lookup operator in a query plan indicates that the query might benefit from performance tuning. For example, query performance might be improved by adding a covering index. For more information about key lookups, see Key Lookup Showplan Operator (http://msdn.microsoft.com/en-us/library/bb326635.aspx).

Furthermore, if you hover over the Index Seek icon in the first execution plan, you will notice a warning message stating that there are columns with no statistics.

You can repeat this procedure for other queries in the Microsoft Dynamics NAV database and compare the execution plans with and without statistics. In general, you will notice better performance when statistics are available, especially when the amount of data grows.
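Purely as an illustration of what a covering index for the query above could look like, the following statement creates a nonclustered index on the filtered columns and INCLUDEs two of the selected columns. The index name and the choice of included columns are hypothetical; in a Microsoft Dynamics NAV database you would normally define additional keys from the NAV table designer rather than directly in SQL Server, so treat this only as a demonstration of the concept.

CREATE NONCLUSTERED INDEX [IX_Demo_Covering]
ON [CRONUS International Ltd_$Sales Line] ([Document Type], [Type], [No_])
INCLUDE ([Location Code], [Shipment Date]);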

Chapter 6: Maintenance Quick Interaction: Lessons Learned Take a moment and write down three Key Points you have learned from this chapter 1. 2. 3. 6-51

SQL Server Installation and Optimization for Microsoft Dynamics NAV 2009 Solutions Test Your Knowledge 1. Which dynamic management view or function allows retrieving index fragmentation? ( ) sys.dm_db_index_operational_stats ( ) sys.dm_db_index_usage_stats ( ) sys.dm_db_index_physical_stats ( ) sys.dm_db_index_frag_stats 2. What does the ALTER INDEX [$1] ON [CRONUS International Ltd_$Contact] REBUILD statement do? (Select all that apply) ( ) Updates the [$1] index for the Contact table ( ) Rebuilds the [$1] index for the Contact table with the most recently set FILLFACTOR value for that table. ( ) Rebuilds the [$1] index for the Contact table with a Fill Factor of 100. ( ) Drops the [$1] index for the Contact table 3. What does the ALTER INDEX ALL ON [CRONUS International Ltd_$Contact] REORGANIZE WITH (FILLFACTOR= 75) statement do? ( ) Updates all indexes for the Contact table ( ) Reorganizes all indexes for the Contact table with a FILLFACTOR value of 75. ( ) Drops the all indexes for the Contact table ( ) This statement is not correct. 4. Which stored procedure runs the UPDATE STATISTICS against all userdefined and internal tables in the current database? ( ) sp_autostats ( ) sp_updatestats ( ) sp_createstats ( ) sp_updatestats ALL 6-52

5. Name two reasons why we recommend not to include database shrinking in a maintenance plan.

MODEL ANSWER:

1. Most databases require some free space to be available for regular day-to-day operations. If you remove the free space from the database, SQL Server has to expand the database to execute its regular operations. In these cases, repeatedly shrinking the database is a wasted operation.
2. A shrink operation does not preserve the fragmentation state of indexes in the database, and generally increases fragmentation to a degree.

6. How can you control the growth of transaction log files for a specific database?

MODEL ANSWER:

You can set the recovery model of the database to SIMPLE. If you require a different recovery model, you have to implement a decent backup strategy (because backing up the transaction log frees up space in the log file). Occasionally, you can use the DBCC SHRINKFILE or DBCC SHRINKDATABASE statements to manually shrink the files.


CHAPTER 7: APPENDIX

Objectives

The objectives are:

    Define an adequate backup strategy.
    Set up a connection from Microsoft Office Excel to the Microsoft Dynamics NAV database and retrieve data for analysis.

Introduction

The purpose of making backups is to make sure that you always have an additional copy of your data. If a problem arises with the original data, you can import the backup data into the program. It is in your own interest to make backups, and in many countries it is also a legal requirement.

Backup Facilities

The following describes the tools that the Microsoft Dynamics NAV Classic client provides for exporting backups. You can make backups by copying the database directly with an operating system command, but there are four main advantages to using the Microsoft Dynamics NAV backup function:

    The system tests the database for errors so that useless information is not copied to a backup.
    The data is packed so that it uses as little space as possible.
    The system calculates how much space the backup will use.
    A message appears when a disk is full, telling you to insert a new one.

Whenever you create a new database, you must always restore a Microsoft Dynamics NAV database backup into it. When you do this, the data common to all companies and the application objects are restored into the new, empty database. The data common to all companies includes the report list, permission roles, user IDs, and printer selections, but no real company data.

A database backup can consist of:

    Entire Database (including all companies in the database, data common to all companies, and application objects)
    All Companies (including data common to all companies)
    Custom (whatever you select)

Microsoft Dynamics NAV Backup

If you are running the Classic client, you can choose between two kinds of backup: a client-based backup and a SQL Server backup. The Microsoft Dynamics NAV client-based backup is started by clicking TOOLS > BACKUP. You can make a backup while other users are using the database because, when you do, the program backs up the latest version of the database. If a user enters something into the database, a new version of the database is generated, but the backup program continues to back up the version that existed just before this new version was created.
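The SQL Server backup is made with the BACKUP statement, either interactively in SQL Server Management Studio or from a scheduled SQL Server Agent job. The following is a minimal sketch; the database name matches the demonstration database used in this course, while the backup folder is an assumption that you must create or change beforehand.

    -- Full database backup of the demonstration database.
    BACKUP DATABASE [Demo Database NAV (6-0)]
    TO DISK = N'D:\Backup\NAVDemo_Full.bak'
    WITH INIT, CHECKSUM, STATS = 10;

    -- Transaction log backup (only valid under the FULL or BULK_LOGGED
    -- recovery model, and after at least one full backup has been taken).
    BACKUP LOG [Demo Database NAV (6-0)]
    TO DISK = N'D:\Backup\NAVDemo_Log.trn'
    WITH INIT, CHECKSUM;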

Be aware that client-side backups can increase network traffic and cannot be scheduled to run automatically in the standard application. Also keep in mind that, when carrying out a Microsoft Dynamics NAV backup in the SQL Server Option for Microsoft Dynamics NAV, every object that is backed up is locked and other users are given read-only access. This means that, depending on what is being backed up, other users may not be able to work in the database.

Restore

Whenever you create a new database, you must always restore a Microsoft Dynamics NAV backup to retrieve the data common to all companies and the application objects, and place them in the new, empty database. Data common to all companies includes the report list, permission roles, user IDs, and printer selections, but no real company data.

Microsoft Dynamics NAV backups do not allow a restore to a specific point in time. If the database fails, all modifications since the last successful backup are lost and must be reapplied after restoring the backup. (A SQL Server backup chain does support point-in-time recovery; see the sketch at the end of this section.)

After the data is restored, Microsoft Dynamics NAV starts recreating the active secondary keys for all tables. Depending on the number of keys and the amount of data, this can take a considerable amount of time.

Summary

For more information about backups in Microsoft Dynamics NAV, see the Installation and System Management manual for either Microsoft Dynamics NAV Database Server or the Microsoft Dynamics NAV SQL Server Option.
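As mentioned above, a chain of SQL Server full and transaction log backups can be restored to a specific point in time, unlike a Microsoft Dynamics NAV backup. The following is a minimal sketch that assumes backup files like those in the earlier backup example; the file paths and the timestamp are placeholders.

    -- Restore the full backup without recovering, so log backups can follow.
    RESTORE DATABASE [Demo Database NAV (6-0)]
    FROM DISK = N'D:\Backup\NAVDemo_Full.bak'
    WITH NORECOVERY, REPLACE;

    -- Roll the database forward to a specific moment and bring it online.
    RESTORE LOG [Demo Database NAV (6-0)]
    FROM DISK = N'D:\Backup\NAVDemo_Log.trn'
    WITH STOPAT = N'2009-06-15T12:00:00', RECOVERY;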

Lab 7.1 - Set Up a Connection from Microsoft Excel

The objective of this lab is to learn how to set up a connection from Microsoft Office Excel to the Microsoft Dynamics NAV database and retrieve data for analysis.

Scenario

Kevin, the sales manager, wants a statistical report on the sales achieved by the sales team. Because Kevin is not a developer, he does not know how to design the report, and Mort, the business application developer, is currently not available. In addition, Kevin needs the figures for a management presentation that starts within 30 minutes. Kevin decides to use Microsoft Office Excel to retrieve the data from the Microsoft Dynamics NAV database.

Challenge Yourself!

Create a report in Microsoft Office Excel that shows the total profit amount (in local currency) and the total sales amount (in local currency), grouped by currency code and salesperson code. The report is based on the Customer Ledger Entry table (table 21, Cust. Ledger Entry). Moreover, the report must meet the following specifications:

    It must be possible to refresh the data in the report.
    It must be possible to visualize totals and subtotals.
    The data must be visualized in a table and in a chart.

The report should look as follows:

FIGURE 7.1 REPORT IN MICROSOFT OFFICE EXCEL

Need a Little Help?

1. Create a database connection from Microsoft Office Excel.
2. Configure the data visualization.

Step by Step

Create a Connection from Microsoft Office Excel to Microsoft SQL Server

1. Start Microsoft Office Excel.
2. On the Data tab, in the Get External Data group, click From Other Sources, and then click From SQL Server.

FIGURE 7.2 GET EXTERNAL DATA

3. In the Data Connection Wizard window, enter the name of the computer that is running SQL Server (NAV-SRV-01) in the Server name box.
4. Under Log on credentials, click Use Windows Authentication to use your current Microsoft Windows user name and password.
5. Click the Next button.
6. In the Select the database that contains the data you want drop-down list, select the Demo Database NAV (6-0) database.
7. Under Connect to a specific table, select the CRONUS International Ltd_$Cust_ Ledger Entry table.
8. Click the Next button.
9. On the Save Data Connection File and Finish page, select the Always attempt to use this file to refresh this data check box to ensure that the connection file is always used when the data is updated.
10. Click Finish to close the Data Connection Wizard window. The Import Data dialog box appears.
11. Under Select how you want to view this data in your workbook, select the PivotChart and PivotTable Report option.
12. Under Where do you want to put the data?, select the New worksheet option.
13. Click the OK button.

Now that you have connected the Excel worksheet to the Customer Ledger Entry table in the Microsoft Dynamics NAV database, you can configure how the data is displayed in Office Excel.

Configure Data Visualization

In step 11, you selected the PivotChart and PivotTable Report option. As a result, the Excel worksheet contains a placeholder for a pivot chart and a pivot table. At the lower right of the screen, in the PivotTable Field List pane, you can decide which fields to display in each area of the pivot table and pivot chart. By default, the pivot chart object is selected. If you select the pivot table object, the names of the areas in the PivotTable Field List pane change. However, the settings apply to both the pivot table and the pivot chart.

1. In the PivotTable Field List pane, select the Customer_No, Salesperson_Code, Currency_Code, Profit (LCY), and Sales (LCY) fields. To select the fields, place a check mark in the check box in front of the corresponding fields. The selected fields are added automatically to the areas at the bottom of the pane. You can now rearrange the fields to fit your needs by dragging them to one of the areas.
2. In the Axis Fields area, select the Customer No_ field and drag it to the Report Filter area.

3. Leave the Salesperson Code and Currency Code fields in the Axis Fields area. The order of the fields in the Axis Fields area determines the grouping levels of the data. In this example, Salesperson Code is the first grouping level and Currency Code is the second, which means that Salesperson Code must be located above the Currency Code field in the Axis Fields area. To change the order of the fields, click a field and drag it to the correct position. Alternatively, open the drop-down menu for the field and select Move Up, Move Down, Move to Beginning, or Move to End.

FIGURE 7.3 PIVOT TABLE FIELD LIST

4. Leave the Sum of Profit (LCY) and Sum of Sales (LCY) fields in the Values area.
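If you want to cross-check the pivot totals directly in SQL Server Management Studio, an aggregation query over the same table returns the same figures. The following is a minimal sketch; the table and column names assume the standard SQL Server naming that Microsoft Dynamics NAV generates for the demonstration company, so verify them in your own database before running the query.

    -- Total profit and sales in local currency per salesperson and currency,
    -- mirroring the grouping levels used in the pivot table.
    SELECT [Salesperson Code],
           [Currency Code],
           SUM([Profit (LCY)]) AS [Sum of Profit (LCY)],
           SUM([Sales (LCY)])  AS [Sum of Sales (LCY)]
    FROM [CRONUS International Ltd_$Cust_ Ledger Entry]
    GROUP BY [Salesperson Code], [Currency Code]
    ORDER BY [Salesperson Code], [Currency Code];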

The result should look like the figure at the beginning of this lab.

Quick Interaction: Lessons Learned

Take a moment and write down three key points you have learned from this chapter:

1.
2.
3.
