20462C
Administering Microsoft®
SQL Server® Databases
Information in this document, including URL and other Internet Web site references, is subject to change
without notice. Unless otherwise noted, the example companies, organizations, products, domain names,
e-mail addresses, logos, people, places, and events depicted herein are fictitious, and no association with
any real company, organization, product, domain name, e-mail address, logo, person, place or event is
intended or should be inferred. Complying with all applicable copyright laws is the responsibility of the
user. Without limiting the rights under copyright, no part of this document may be reproduced, stored in
or introduced into a retrieval system, or transmitted in any form or by any means (electronic, mechanical,
photocopying, recording, or otherwise), or for any purpose, without the express written permission of
Microsoft Corporation.
Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property
rights covering subject matter in this document. Except as expressly provided in any written license
agreement from Microsoft, the furnishing of this document does not give you any license to these
patents, trademarks, copyrights, or other intellectual property.
The names of manufacturers, products, or URLs are provided for informational purposes only and Microsoft makes no representations or warranties, either expressed, implied, or statutory, regarding these manufacturers or the use of the products with any Microsoft technologies. The inclusion of a manufacturer or product does not imply endorsement by Microsoft of the manufacturer or product. Links may be provided to third party sites. Such sites are not under the control of Microsoft and Microsoft is not responsible for the contents of any linked site or any link contained in a linked site, or any changes or updates to such sites. Microsoft is not responsible for webcasting or any other form of transmission received from any linked site. Microsoft is providing these links to you only as a convenience, and the inclusion of any link does not imply endorsement by Microsoft of the site or the products contained therein.
© 2015 Microsoft Corporation. All rights reserved.
Released: 03/2015
MICROSOFT LICENSE TERMS
MICROSOFT INSTRUCTOR-LED COURSEWARE
These license terms are an agreement between Microsoft Corporation (or based on where you live, one of its
affiliates) and you. Please read them. They apply to your use of the content accompanying this agreement which
includes the media on which you received it, if any. These license terms also apply to Trainer Content and any
updates and supplements for the Licensed Content unless other terms accompany those items. If so, those terms
apply.
BY ACCESSING, DOWNLOADING OR USING THE LICENSED CONTENT, YOU ACCEPT THESE TERMS.
IF YOU DO NOT ACCEPT THEM, DO NOT ACCESS, DOWNLOAD OR USE THE LICENSED CONTENT.
If you comply with these license terms, you have the rights below for each license you acquire.
1. DEFINITIONS.
a. “Authorized Learning Center” means a Microsoft IT Academy Program Member, Microsoft Learning
Competency Member, or such other entity as Microsoft may designate from time to time.
b. “Authorized Training Session” means the instructor-led training class using Microsoft Instructor-Led
Courseware conducted by a Trainer at or through an Authorized Learning Center.
c. “Classroom Device” means one (1) dedicated, secure computer that an Authorized Learning Center owns
or controls that is located at an Authorized Learning Center’s training facilities that meets or exceeds the
hardware level specified for the particular Microsoft Instructor-Led Courseware.
d. “End User” means an individual who is (i) duly enrolled in and attending an Authorized Training Session or Private Training Session, (ii) an employee of an MPN Member, or (iii) a Microsoft full-time employee.
e. “Licensed Content” means the content accompanying this agreement which may include the Microsoft
Instructor-Led Courseware or Trainer Content.
f. “Microsoft Certified Trainer” or “MCT” means an individual who is (i) engaged to teach a training session
to End Users on behalf of an Authorized Learning Center or MPN Member, and (ii) currently certified as a
Microsoft Certified Trainer under the Microsoft Certification Program.
g. “Microsoft Instructor-Led Courseware” means the Microsoft-branded instructor-led training course that
educates IT professionals and developers on Microsoft technologies. A Microsoft Instructor-Led
Courseware title may be branded as MOC, Microsoft Dynamics or Microsoft Business Group courseware.
h. “Microsoft IT Academy Program Member” means an active member of the Microsoft IT Academy
Program.
i. “Microsoft Learning Competency Member” means an active member of the Microsoft Partner Network
program in good standing that currently holds the Learning Competency status.
j. “MOC” means the “Official Microsoft Learning Product” instructor-led courseware known as Microsoft
Official Course that educates IT professionals and developers on Microsoft technologies.
k. “MPN Member” means an active Microsoft Partner Network program member in good standing.
l. “Personal Device” means one (1) personal computer, device, workstation or other digital electronic device
that you personally own or control that meets or exceeds the hardware level specified for the particular
Microsoft Instructor-Led Courseware.
m. “Private Training Session” means the instructor-led training classes provided by MPN Members for
corporate customers to teach a predefined learning objective using Microsoft Instructor-Led Courseware.
These classes are not advertised or promoted to the general public and class attendance is restricted to
individuals employed by or contracted by the corporate customer.
n. “Trainer” means (i) an academically accredited educator engaged by a Microsoft IT Academy Program Member to teach an Authorized Training Session, and/or (ii) an MCT.
o. “Trainer Content” means the trainer version of the Microsoft Instructor-Led Courseware and additional
supplemental content designated solely for Trainers’ use to teach a training session using the Microsoft
Instructor-Led Courseware. Trainer Content may include Microsoft PowerPoint presentations, trainer
preparation guide, train-the-trainer materials, Microsoft OneNote packs, classroom setup guide and Pre-
release course feedback form. To clarify, Trainer Content does not include any software, virtual hard
disks or virtual machines.
2. USE RIGHTS. The Licensed Content is licensed, not sold. The Licensed Content is licensed on a one copy per user basis, such that you must acquire a license for each individual that accesses or uses the Licensed Content.
2.1 Below are five separate sets of use rights. Only one set of rights applies to you.
2.2 Separation of Components. The Licensed Content is licensed as a single unit and you may not separate its components and install them on different devices.
2.3 Redistribution of Licensed Content. Except as expressly provided in the use rights above, you may
not distribute any Licensed Content or any portion thereof (including any permitted modifications) to any
third parties without the express written permission of Microsoft.
2.4 Third Party Notices. The Licensed Content may include third party code that Microsoft, not the third party, licenses to you under this agreement. Notices, if any, for the third party code are included for your information only.
2.5 Additional Terms. Some Licensed Content may contain components with additional terms, conditions, and licenses regarding its use. Any non-conflicting terms in those conditions and licenses also apply to your use of that respective component and supplement the terms described in this agreement.
3. PRE-RELEASE LICENSED CONTENT. If the Licensed Content is based on a Pre-release version of Microsoft technology, the terms in this section also apply:
a. Pre-Release Licensed Content. This Licensed Content is based on the Pre-release version of
the Microsoft technology. The technology may not work the way a final version of the technology will
and we may change the technology for the final version. We also may not release a final version.
Licensed Content based on the final version of the technology may not contain the same information as
the Licensed Content based on the Pre-release version. Microsoft is under no obligation to provide you
with any further content, including any Licensed Content based on the final version of the technology.
b. Feedback. If you agree to give feedback about the Licensed Content to Microsoft, either directly or
through its third party designee, you give to Microsoft without charge, the right to use, share and
commercialize your feedback in any way and for any purpose. You also give to third parties, without
charge, any patent rights needed for their products, technologies and services to use or interface with
any specific parts of a Microsoft technology, Microsoft product, or service that includes the feedback.
You will not give feedback that is subject to a license that requires Microsoft to license its technology,
technologies, or products to third parties because we include your feedback in them. These rights
survive this agreement.
c. Pre-release Term. If you are a Microsoft IT Academy Program Member, Microsoft Learning
Competency Member, MPN Member or Trainer, you will cease using all copies of the Licensed Content on
the Pre-release technology upon (i) the date which Microsoft informs you is the end date for using the
Licensed Content on the Pre-release technology, or (ii) sixty (60) days after the commercial release of the
technology that is the subject of the Licensed Content, whichever is earlier (“Pre-release term”).
Upon expiration or termination of the Pre-release term, you will irretrievably delete and destroy all copies
of the Licensed Content in your possession or under your control.
4. SCOPE OF LICENSE. The Licensed Content is licensed, not sold. This agreement only gives you some
rights to use the Licensed Content. Microsoft reserves all other rights. Unless applicable law gives you more
rights despite this limitation, you may use the Licensed Content only as expressly permitted in this
agreement. In doing so, you must comply with any technical limitations in the Licensed Content that only allow you to use it in certain ways. Except as expressly permitted in this agreement, you may not:
• access or allow any individual to access the Licensed Content if they have not acquired a valid license
for the Licensed Content,
• alter, remove or obscure any copyright or other protective notices (including watermarks), branding
or identifications contained in the Licensed Content,
• modify or create a derivative work of any Licensed Content,
• publicly display, or make the Licensed Content available for others to access or use,
• copy, print, install, sell, publish, transmit, lend, adapt, reuse, link to or post, make available or
distribute the Licensed Content to any third party,
• work around any technical limitations in the Licensed Content, or
• reverse engineer, decompile, remove or otherwise thwart any protections or disassemble the
Licensed Content except and only to the extent that applicable law expressly permits, despite this
limitation.
5. RESERVATION OF RIGHTS AND OWNERSHIP. Microsoft reserves all rights not expressly granted to
you in this agreement. The Licensed Content is protected by copyright and other intellectual property laws
and treaties. Microsoft or its suppliers own the title, copyright, and other intellectual property rights in the
Licensed Content.
6. EXPORT RESTRICTIONS. The Licensed Content is subject to United States export laws and regulations.
You must comply with all domestic and international export laws and regulations that apply to the Licensed
Content. These laws include restrictions on destinations, end users and end use. For additional information,
see www.microsoft.com/exporting.
7. SUPPORT SERVICES. Because the Licensed Content is “as is”, we may not provide support services for it.
8. TERMINATION. Without prejudice to any other rights, Microsoft may terminate this agreement if you fail
to comply with the terms and conditions of this agreement. Upon termination of this agreement for any
reason, you will immediately stop all use of and delete and destroy all copies of the Licensed Content in
your possession or under your control.
9. LINKS TO THIRD PARTY SITES. You may link to third party sites through the use of the Licensed
Content. The third party sites are not under the control of Microsoft, and Microsoft is not responsible for
the contents of any third party sites, any links contained in third party sites, or any changes or updates to
third party sites. Microsoft is not responsible for webcasting or any other form of transmission received
from any third party sites. Microsoft is providing these links to third party sites to you only as a
convenience, and the inclusion of any link does not imply an endorsement by Microsoft of the third party
site.
10. ENTIRE AGREEMENT. This agreement, and any additional terms for the Trainer Content, updates and
supplements are the entire agreement for the Licensed Content, updates and supplements.
12. LEGAL EFFECT. This agreement describes certain legal rights. You may have other rights under the laws
of your country. You may also have rights with respect to the party from whom you acquired the Licensed
Content. This agreement does not change your rights under the laws of your country if the laws of your
country do not permit it to do so.
13. DISCLAIMER OF WARRANTY. THE LICENSED CONTENT IS LICENSED "AS-IS" AND "AS AVAILABLE." YOU BEAR THE RISK OF USING IT. MICROSOFT AND ITS RESPECTIVE AFFILIATES GIVE NO EXPRESS WARRANTIES, GUARANTEES, OR CONDITIONS. YOU MAY HAVE ADDITIONAL CONSUMER RIGHTS UNDER YOUR LOCAL LAWS WHICH THIS AGREEMENT CANNOT CHANGE. TO THE EXTENT PERMITTED UNDER YOUR LOCAL LAWS, MICROSOFT AND ITS RESPECTIVE AFFILIATES EXCLUDE ANY IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT.
14. LIMITATION ON AND EXCLUSION OF REMEDIES AND DAMAGES. YOU CAN RECOVER FROM
MICROSOFT, ITS RESPECTIVE AFFILIATES AND ITS SUPPLIERS ONLY DIRECT DAMAGES UP
TO US$5.00. YOU CANNOT RECOVER ANY OTHER DAMAGES, INCLUDING CONSEQUENTIAL,
LOST PROFITS, SPECIAL, INDIRECT OR INCIDENTAL DAMAGES.
This limitation also applies even if Microsoft knew or should have known about the possibility of the damages. The
above limitation or exclusion may not apply to you because your country may not allow the exclusion or
limitation of incidental, consequential or other damages.
Please note: As this Licensed Content is distributed in Quebec, Canada, some of the clauses in this
agreement are provided below in French.
Remarque : Le contenu sous licence étant distribué au Québec, Canada, certaines des clauses de ce contrat sont fournies ci-dessous en français.
EXONÉRATION DE GARANTIE. Le contenu sous licence visé par une licence est offert « tel quel ». Toute utilisation de ce contenu sous licence est à vos seuls risques et périls. Microsoft n’accorde aucune autre garantie expresse. Vous pouvez bénéficier de droits additionnels en vertu du droit local sur la protection des consommateurs, que ce contrat ne peut modifier. Là où elles sont permises par le droit local, les garanties implicites de qualité marchande, d’adéquation à un usage particulier et d’absence de contrefaçon sont exclues.
EFFET JURIDIQUE. Le présent contrat décrit certains droits juridiques. Vous pourriez avoir d’autres droits prévus par les lois de votre pays. Le présent contrat ne modifie pas les droits que vous confèrent les lois de votre pays si celles-ci ne le permettent pas.
Acknowledgments
Microsoft Learning would like to acknowledge and thank the following for their contribution towards developing this title. Their efforts at various stages in the development have ensured that you have a good classroom experience.
Contents
Module 1: Introduction to SQL Server 2014 Database Administration
Module Overview
Module 13: Monitoring SQL Server 2014 with Notifications and Alerts
Module Overview
Course Description
This five-day instructor-led course provides students with the knowledge and skills to maintain a
Microsoft SQL Server 2014 database. The course focuses on teaching individuals how to use SQL Server
2014 product features and tools related to maintaining a database.
Audience
The primary audience for this course is individuals who administer and maintain SQL Server databases.
These individuals perform database administration and maintenance as their primary area of
responsibility, or work in environments where databases play a key role in their primary job.
The secondary audience for this course is individuals who develop applications that deliver content from
SQL Server databases.
Student Prerequisites
This course requires that you meet the following prerequisites:
Basic knowledge of the Microsoft Windows operating system and its core functionality.
Students who attend this training can meet the prerequisites by attending the following courses, or
obtaining equivalent knowledge and skills:
20461C: Querying Microsoft SQL Server
Course Objectives
After completing this course, students will be able to:
Course Outline
This section provides an outline of the course:
Module 13: Monitoring SQL Server 2014 with Notifications and Alerts
Course Materials
The following materials are included with your kit:
Course Handbook. A succinct classroom learning guide that provides all the critical technical
information in a crisp, tightly-focused format, which is just right for an effective in-class learning
experience.
Lessons: Guide you through the learning objectives and provide the key points that are critical to
the success of the in-class learning experience.
Labs: Provide a real-world, hands-on platform for you to apply the knowledge and skills learned
in the module.
Module Reviews and Takeaways: Provide improved on-the-job reference material to boost
knowledge and skills retention.
Lab Answer Keys: Provide step-by-step lab solution guidance at your fingertips when it’s
needed.
Lessons: Include detailed information for each topic, expanding on the content in the Course
Handbook.
Labs: Include complete lab exercise information and answer keys in digital form to use during lab time.
Resources: Include well-categorized additional resources that give you immediate access to the most up-to-date premium content on TechNet, MSDN, and Microsoft Press.
Student Course Files: Include the Allfiles.exe, a self-extracting executable file that contains all
the files required for the labs and demonstrations.
Course evaluation. At the end of the course, you will have the opportunity to complete an
online evaluation to provide feedback on the course, training facility, and instructor.
The following table shows the role of each virtual machine used in this course:
Software Configuration
The following software is installed on each VM:
Course Files
There are files associated with the labs in this course. The lab files are located in the folder D:\Labfiles on
the student computers.
Classroom Setup
Each classroom computer will have the same virtual machine configured in the same way.
In addition, the instructor computer must be connected to a projection display device that supports SVGA resolution (1024 x 768 pixels) and 16-bit color.
Note: For the best classroom experience, a computer with solid state disks (SSDs) is recommended. For
optimal performance, adapt the instructions below to install the 20462C-MIA-SQL virtual machine on a
different physical disk than the other virtual machines to reduce disk contention.
Module 1
Introduction to SQL Server 2014 Database Administration
Contents:
Module Overview
Module Overview
This module introduces the Microsoft® SQL Server® 2014 platform. It describes the components,
editions, and versions of SQL Server 2014, and the tasks that a database administrator commonly
performs for a SQL Server instance.
Objectives
After completing this module, you will be able to:
• Describe the SQL Server platform.
Lesson 1
Database Administration Overview
Most organizations use software applications to manage business processes and activities, and these
applications generally store data in a database. Organizations are increasingly reliant on applications and
the data they store, and often databases are a “mission-critical” component of a business’s information
technology (IT) infrastructure.
The role of a database administrator (DBA) includes a wide range of responsibilities and tasks that ensure
that the databases an organization relies on are maintained and kept at optimum efficiency. This lesson
describes some of these responsibilities and tasks, which will be explored in greater detail throughout the
rest of this course.
Lesson Objectives
After completing this lesson, you will be able to:
• Describe common characteristics of a database administrator.
• Business-awareness. While a DBA is a technical role, a good DBA typically understands the business
context within which the database operates, and its role in supporting the business.
• Organizational skills. Database systems can be complex, with a lot of components and subsystems
to manage. Some tasks need to be performed at specific times, and a good DBA must keep track of
these tasks while also responding to unexpected issues as they arise.
• Ability to prioritize. When unexpected problems affect a database, application users and business
stakeholders typically make demands on the DBA to resolve the situation based on their individual
requirements. A good DBA must prioritize the resolution of issues based on factors such as service-
level agreements (SLAs) with the business for database services, the number of users and systems
affected, and the degree to which the problems are affecting ongoing operations.
• Maintaining database files and objects. After a database has been created and populated with
data in tables and indexes, it requires ongoing maintenance to ensure it continues to perform
optimally. This involves reducing any fragmentation that occurs in data files as records are added and
deleted, ensuring that data files are kept at an appropriate size, and ensuring that the logical and
physical data structures remain consistent.
• Managing recovery in the event of database failure. Databases are often critical to business
operations, and a core responsibility for a DBA is to plan an appropriate backup and recovery strategy
for each database, ensure backups are performed, and restore the database in the event of a failure.
• Importing and exporting data. Data is often transferred between systems, so DBAs often need to
extract data from, or import data to, databases.
• Applying security to data. An organization’s database servers often contain its most valuable asset – the data that enables the business to operate. Security breaches can be costly and time-consuming to trace and repair, and damaging to customer trust and confidence. A DBA must implement security policies that enable users to access the data they need, while ensuring that the business meets its legal compliance obligations, protects its assets, and mitigates the risks associated with security breaches.
• Monitoring and troubleshooting database systems. Many database administration operations are
reactive, in the sense that they involve taking action to troubleshoot and remediate a problem that
has been identified. Successful DBAs also undertake a proactive approach, in which they monitor
systems against an established baseline to try to detect potential problems before they impact data
operations.
While it may be unexciting, maintaining documentation for the database system is an important part of
database administration. A detailed run book can be invaluable when a new DBA must take over
responsibility for managing a database, or when an unexpected emergency occurs and the DBA is not
present to deal with it. Even when the DBA is available to respond to a disaster such as the failure of a
database server, having clearly documented steps to recover the database reduces the sense of panic and
pressure, and enables a faster resolution of the problem.
Lesson 2
Introduction to the SQL Server Platform
As a DBA, it is important to be familiar with the database management system used to store your data.
SQL Server is a platform for developing business applications that are data focused. Rather than being a
single monolithic application, SQL Server is structured as a series of components. It is important to
understand the use of each of these.
You can install more than one copy of SQL Server on a server. Each of these is called an instance and can
be configured and managed independently.
SQL Server ships in a variety of editions, each with a different set of capabilities for different scenarios. It is
important to understand the target business cases for each of the SQL Server editions and how the
evolution through a series of improving versions over many years results in today’s stable and robust
platform.
Lesson Objectives
After completing this lesson, you will be able to:
• Explain the role of each component that makes up the SQL Server platform.
• Describe the functionality that SQL Server instances provide.
SQL Server 2014 also includes a memory-optimized database engine, which uses in-memory technology to improve performance for short-running transactions.
• Some of your applications may require server configurations that are inconsistent or incompatible
with the server requirements of other applications. You can configure each instance of SQL Server
independently.
• Your application databases might need different levels of service, particularly in relation to availability.
You can use SQL Server instances to separate workloads with differing service level agreements (SLAs).
• Your applications might require different server-level collations. Although each database can have
different collations, an application might be dependent on the collation of the tempdb database
when the application is using temporary objects.
Different versions of SQL Server can also be installed side-by-side using multiple instances. This can assist when testing upgrade scenarios or performing upgrades.
Additional instances of SQL Server require an instance name, which is used in conjunction with the server name; these are known as named instances. If you want all your instances to be named instances, you do not need to install a default instance first. Note that not all components of SQL Server can be installed in more than one instance. To access a named instance, client applications use the address Server-Name\Instance-Name. For example, a named instance called Test on a Windows server called APPSERVER1 would be addressed as APPSERVER1\Test.
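For example, assuming the hypothetical server and instance names above, you could connect to the default instance and to the named instance from the sqlcmd command-line utility (described later in this module) as follows:

sqlcmd -S APPSERVER1 -E
sqlcmd -S APPSERVER1\Test -E

The -S parameter specifies the server (and, optionally, instance) name, and -E requests Windows authentication.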
There is no need to install SQL Server tools and utilities more than once on a server. You can use a single
installation of the tools to manage and configure all instances.
Early Versions
The earliest versions (1.0 and 1.1) were based on the
OS/2 operating system. SQL Server 4.2 and later
moved to the Microsoft Windows® operating
system, initially on Windows NT.
Subsequent Versions
SQL Server 7.0 saw a significant rewrite of the product. Substantial advances were made in reducing the
administration workload and OLAP Services (which later became Analysis Services) was introduced.
SQL Server 2000 featured support for multiple instances and collations. It also introduced support for data
mining. After the product release, SQL Server Reporting Services (SSRS) was introduced as an add-on
enhancement to the product, along with support for 64-bit processors.
SQL Server 2005 provided another significant rewrite of many aspects of the product. Enhancements included the following:
• SQL Server Management Studio (SSMS) was released to replace several previous administrative tools.
• Another key addition was the introduction of support for objects created using the Common
Language Runtime (CLR).
• The T-SQL language was substantially enhanced, including structured exception handling.
• Dynamic Management Views (DMVs) and Dynamic Management Functions (DMFs) were introduced
to enable detailed health monitoring, performance tuning, and troubleshooting.
• Substantial high availability improvements were included in the product; in particular, database
mirroring was introduced.
The enhancements and additions to the product in SQL Server 2008 included:
• Full-text indexing was integrated directly within the database engine. (Previously, full-text indexing was based on interfaces to operating system level services.)
• A policy-based management framework was introduced to assist with a move to more declarative-
based management practices, rather than reactive practices.
• A Windows PowerShell® provider for SQL Server was introduced.
The enhancements and additions to the product in SQL Server 2008 R2 included:
• Support for managing reference data was provided with the introduction of Master Data Services.
• StreamInsight provided the ability to query data that is arriving at high speed, before storing it in a database.
• Data-tier applications assist with packaging database applications as part of application development
projects.
The enhancements and additions to the product in SQL Server 2012 included:
• The introduction of tabular data models into SQL Server Analysis Services (SSAS).
• Strong enhancements to the T-SQL language such as the addition of sequences, new error-handling
capabilities, and new window functions.
Lesson 3
Database Management Tools and Techniques
A DBA for a SQL Server database has a range of tools for managing different aspects of the database
solution at their disposal. It is important to be familiar with the available tools, and the techniques you can
use within them to manage a SQL Server database.
Lesson Objectives
After completing this lesson, you will be able to:
• Use SQL Server Management Studio to manage a database server and databases.
• SQL Server Configuration Manager (SSCM). You can use SSCM to configure and control SQL Server
services, and to manage server and client network protocols and aliases.
• SQL Server Profiler. When you need to examine activity in a SQL Server database or SSAS data model, you can use SQL Server Profiler to record a trace that can be viewed or replayed. This enables you to troubleshoot problems or optimize database configuration based on actual usage patterns. Note that this tool is deprecated for database engine workloads, and is replaced by Extended Events.
• SQL Server Database Engine Tuning Advisor (DTA). A properly optimized database uses indexes
and other structures to improve query performance. The DTA provides schema recommendations
based on analysis of representative workloads, and can provide a useful starting point for database
optimization.
• SQL Server Import and Export. This tool is a graphical wizard that simplifies the process of
transferring data in or out of a SQL Server database.
• The sqlcmd utility. Pronounced “SQL Command”, this is a command line tool that you can use to
connect to a SQL Server instance and run Transact-SQL statements or scripts.
• The bcp utility. BCP stands for Bulk Copy Program, and the bcp utility is a command line tool for
importing and exporting data to and from SQL Server.
Additionally, SQL Server includes configuration and management tools for specific components such as
Analysis Services, Reporting Services, Data Quality Services, and Master Data Services. You can also install
SQL Server Data Tools (SSDT) and SQL Server Data Tools for Business Intelligence (SSDT-BI) add-ins in
Microsoft Visual Studio, and use them to develop database and business intelligence (BI) solutions based
on SQL Server components.
• Code Editor. You can manage database servers and objects using graphical interfaces (typically
opened from Object Explorer), or you can enter and run Transact-SQL statements in the code editor
pane. Using Transact-SQL code to perform management tasks enables you to save the commands as
scripts, which can be re-executed at a later time or scheduled to run automatically. The code editor in
SSMS supports IntelliSense, which provides auto-completion of statements and color-coding of
keywords to improve script readability. You can also use snippets to simplify the creation of
commonly used statements. SSMS also provides the ability to generate Transact-SQL code for most tasks that can be performed using graphical tools, making it easier to create reusable scripts for administrative tasks (see the example script after this list).
• Solutions and Projects. You can use projects and solutions to keep related scripts, connections, and
other documents together. This can make it easier to keep track of all the script files required to
create and manage a database solution.
• Reports. SSMS includes an extensible report interface that you can use to view detailed configuration
and status information about SQL Server instances, databases, and other objects.
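As an illustration of the generated Transact-SQL mentioned above, the following script is similar to what SSMS produces when you select Script Action to Clipboard in the Back Up Database dialog box; the database name and backup path are hypothetical:

BACKUP DATABASE [AWDatabase]
TO DISK = N'D:\Backups\AWDatabase.bak'
WITH NOFORMAT, NOINIT, NAME = N'AWDatabase-Full Database Backup',
SKIP, NOREWIND, NOUNLOAD, STATS = 10;
GO

Saving scripts such as this one makes it possible to repeat an administrative task consistently, or to schedule it with SQL Server Agent.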
Parameters
The sqlcmd utility provides parameters that you can use to configure connections and perform tasks. These parameters include:

-Q "Transact-SQL query" (run the specified query and exit)
Transact-SQL commands that you can use to perform management tasks include:
• Explicit data definition language (DDL) statements. For example, you can use the Transact-SQL
CREATE DATABASE statement to create a database, and the corresponding DROP DATABASE
statement to delete a database.
• System stored procedures and functions. SQL Server provides system stored procedures and
functions that encapsulate common system configuration and management tasks. For example, you
can use the sp_configure system stored procedure to set SQL Server instance configuration settings.
• DBCC (Database Console Commands). DBCC commands are used to perform specific configuration
and maintenance tasks, and to perform verification checks in a SQL Server database. For example, you
can use the DBCC CHECKDB command to verify the logical and physical integrity of the objects in a
database.
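The following short sketch illustrates all three categories; the database name is hypothetical:

CREATE DATABASE SalesDB;
GO
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
GO
DBCC CHECKDB (SalesDB);
GO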
Cmdlets have a recognizable name format—a verb and noun separated by a dash (-), such as Get-Help,
Get-Process, and Start-Service. The verb defines the action that the cmdlet performs; for example, "get"
cmdlets retrieve data, "set" cmdlets establish or change data, "format" cmdlets format data, and "out"
cmdlets direct the output to a specified destination. The noun specifies the object being acted upon; for
example, Get-Service retrieves information about services.
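For example, the following standard Windows PowerShell pipeline combines a "get" cmdlet with filtering and formatting cmdlets to display the SQL Server-related services on a computer:

Get-Service | Where-Object { $_.Name -like 'MSSQL*' } | Format-Table Name, Status

The MSSQL* pattern matches the default database engine service name (MSSQLSERVER); named instances use service names in the form MSSQL$InstanceName.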
Cmdlets are packaged in modules, which can be installed on a computer and loaded into the PowerShell
environment as required. You can use the Get-Module cmdlet to list available modules on a computer,
and you can use Import-Module to load the modules you need. When you install SQL Server 2014 on a
computer, the installation includes the SQLPS module, which includes cmdlets that you can use to
manage SQL Server instances and objects.
• A SQL Server provider. This enables a simple navigation mechanism similar to file system paths, where the drive is associated with a SQL Server management object model, and the nodes are based on the object model classes. You can then use PowerShell cmdlets such as Get-ChildItem to retrieve objects in the SQL Server object model. You can also use commands such as cd and dir to navigate the paths, similar to the way you navigate folders in a command prompt window.
• A set of SQL Server cmdlets. The SQL Server cmdlets support actions such as running Transact-SQL
statements with the Invoke-Sqlcmd cmdlet.
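The following sketch shows both mechanisms together, assuming a default database engine instance on the MIA-SQL server used in this course’s labs:

Import-Module SQLPS
Set-Location SQLSERVER:\SQL\MIA-SQL\DEFAULT\Databases
Get-ChildItem
Invoke-Sqlcmd -ServerInstance "MIA-SQL" -Query "SELECT @@VERSION;"

Get-ChildItem lists the databases at the current provider path, and Invoke-Sqlcmd runs a Transact-SQL statement against the instance.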
You can use the SQL Server 2014 Windows PowerShell components to manage instances of SQL Server
2000 or later. Instances of SQL Server 2005 must be running SP2 or later. Instances of SQL Server 2000
must be running SP4 or later. When the SQL Server 2014 Windows PowerShell components are used with
earlier versions of SQL Server, they are limited to the functionality available in those versions.
PowerShell Interfaces
You can use Windows PowerShell to manage SQL Server in the following user interfaces:
• The Windows PowerShell command prompt. This provides a console window in which you can run
PowerShell cmdlets.
• The Windows PowerShell Integrated Scripting Environment (ISE). This provides a PowerShell script development environment that supports IntelliSense and other features to simplify script development.
• SQL Server PowerShell. SQL Server includes a PowerShell console named SQLPS.exe in which the
SQLPS module is pre-loaded. You can open this console from within SQL Server Management Studio
and use it to run cmdlets interactively.
Objectives
After completing this lab, you will be able to:
Password: Pa$$w0rd
3. Create a Database
5. Create a Project
2. Ensure that Object Explorer is visible, and expand the Databases folder to view the databases that are
hosted on the MIA-SQL instance.
3. View the Server Dashboard standard report for the MIA-SQL instance.
2. View the databases listed under the Database folder and verify that the new database has been
created.
2. View the query results, noting that they include information about the AWDatabase you created in
the previous task.
2. Ensure that Solution Explorer is visible, and add a new connection to the project. The new connection
should be to the MIA-SQL database engine instance and it should use Windows authentication.
3. Add a new query to the project, and change its name to BackupDB.sql.
4. In Object Explorer, right-click the AWDatabase database you created previously, point to Tasks, and
click Back Up.
5. In the Back Up Database – AWDatabase dialog box, in the Script drop-down list, select Script
Action to Clipboard. Then cancel the backup operation.
6. Paste the contents of the clipboard into the empty BackupDB.sql script.
8. Save all of the files in the solution, and then close the solution and minimize SQL Server Management
Studio.
Results: At the end of this exercise, you will have created a SQL Server Management Studio project
containing script files.
sqlcmd -?
2. Enter the following command to start sqlcmd and connect to MIA-SQL using Windows
authentication:
sqlcmd -S MIA-SQL -E
3. In the sqlcmd command line, enter the following commands to view the databases on MIA-SQL.
Verify that these include the AWDatabase database you created in the previous exercise:
Exit
Note that the query results are returned, but they are difficult to read in the command prompt screen.
2. Enter the following command to store the query output in a text file:
Results: At the end of this exercise, you will have used sqlcmd to manage a database.
Get-Process
3. Review the list of processes. In the ProcessName column, note the SQL Server processes. Then enter the following command to list only the processes with names beginning “SQL”:
Get-Process SQL*
Get-Help Sort
6. Verify that the list is now sorted by number of handles, and close Windows PowerShell.
2. Enter the following command to show which modules are loaded, and verify that they include SQLPS
and SQLASCMDLETS:
Get-Module
3. Enter the following command to set the current location to the MIA-SQL server:
Set-Location SQLSERVER:\SQL\MIA-SQL
4. Use the following command to display the SQL Server database engine instances on the server:
Get-ChildItem
7. Use the following command to execute a Transact-SQL statement that retrieves the server version:
8. Close the SQL Server PowerShell window and close SQL Server Management Studio without saving any files.
2. In the PowerShell command prompt, enter the following command to verify that the SQLPS module
is not loaded:
Get-Module
3. Use the following command to load the SQLPS module, and then use the Get-Module cmdlet to
verify that it has been loaded:
4. If the Commands pane is not visible, on the View menu, click Show Command Add-on. Then in the
Commands pane, in the Modules list, select SQLPS and view the cmdlets in the module, noting that
they include cmdlets to perform tasks such as backing up databases and starting SQL Server
instances.
5. If the Script pane is not visible, click the Script drop-down arrow. Then, in the Script pane, type the
following commands. (Hint: Use the IntelliSense feature.)
6. Click Run Script. Then view the results in the window that is opened. (The script may take a few
minutes to run.)
7. Close the output window, and modify the script as shown in the following example:
8. Save the script as GetDatabases.ps1 in the D:\Labfiles\Lab01\Starter folder. Then close the
PowerShell ISE.
9. In the D:\Labfiles\Lab01\Starter folder, right-click GetDatabases.ps1 and click Run with PowerShell.
10. When the script has completed, open Databases.txt in Notepad to view the results. Then close
Notepad.
Results: At the end of this task, you will have a PowerShell script that retrieves information about
databases from SQL Server.
Review Question(s)
Question: When might you use each of the management tools you explored in the lab?
Module 2
Installing and Configuring SQL Server 2014
Contents:
Module Overview
Module Overview
One of the key responsibilities of a database administrator (DBA) is to provision database servers and databases. This includes planning and performing the installation of SQL Server on physical servers and
virtual machines.
This module explains how to assess resource requirements for SQL Server 2014 and how to install it.
Objectives
After completing this module, you will be able to:
Lesson 1
Planning SQL Server Installation
Before starting the installation process for SQL Server, it is important to discover how each of the
requirements for a successful installation can be met. In this lesson, you will consider the specific
requirements that SQL Server places on the hardware and software platforms on which it runs and learn
about tools which you can use to pre-test your systems before installing SQL Server.
Lesson Objectives
After completing this lesson, you will be able to:
• Describe the hardware and software requirements for SQL Server 2014.
• Describe considerations for storage I/O and performance requirements for SQL Server.
• Service account identities. Each SQL Server service is configured to run within the context of a
specific Windows account. This account provides an identity for the service, and is used to authorize
access to system resources by the service. You should plan the accounts that will be used by the
services in your SQL Server installation carefully to ensure that they meet the requirements of the
“principle of least privilege”, which states that all authenticated accounts should have the minimum
permissions and system rights they need to fulfil their function.
• Default data file locations. The SQL Server database engine is used to store and manage data in
databases, and these databases exist as files on storage devices. While you can specify the location of
data files when you create a database, SQL Server uses a default location for internal system
databases and new databases with no file location specified. You should set the default location to a
folder on an appropriate storage device when you install SQL Server.
• Server collation. SQL Server databases use a specified collation to control how character data is sorted and stored. You specify a collation at the server (instance) level, and unless you have specific character sort-order requirements, you should choose a collation that matches the one used by the server on which you are installing SQL Server. You can override the server collation in individual databases if required, but the server collation is used for system databases including tempdb, which all databases use to store temporary objects (see the example query after this list).
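On an existing instance, you can verify collation settings with a query such as the following:

SELECT SERVERPROPERTY('Collation') AS ServerCollation;
SELECT name, collation_name FROM sys.databases;

The first statement returns the instance-level collation; the second lists the collation of each database, including any that override the server default.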
Hardware Requirements
In earlier versions of SQL Server, it was necessary to
focus on minimum requirements for processor, disk
space and memory. Nowadays, discussing minimum
processor speeds and disk space requirements for
the SQL Server components is pointless. Even the
slowest processor in a new laptop is fast enough to
meet the minimum requirements for SQL Server.
Processors
In enterprise environments, the number of processors is now a much more significant issue. While it might
seem desirable to add as many CPUs as possible, it is important to consider that there is a tradeoff
between the number of CPUs and license costs. Also, not all computer architectures support the addition
of CPUs. Adding CPU resources might then require architectural upgrades to computer systems, not just
the additional CPUs.
In terms of processor type, the x64 architecture requires at least a 1.4 GHz AMD Opteron, AMD Athlon 64, Intel Xeon, or Intel Pentium processor with EM64T support.
Disk
The hardware requirements for SQL Server list the required disk space to install the product. These values
are not the only ones to consider though, because the size of user databases can have a greater impact.
Disk subsystem performance, however, is critical. A typical SQL Server system today is I/O bound, if it is
configured and working correctly. Note that a bottleneck is, in itself, not a bad thing. Any computer
system with any task that it needs to perform will have a bottleneck somewhere. If another component of
the server is the bottleneck (rather than the I/O subsystem), there is usually an underlying issue to resolve.
It could be a lack of memory or something more subtle like a recompilation issue (that is, a situation
where SQL Server is constantly recompiling code). Memory requirements for SQL Server are discussed in
the next topic.
Software Requirements
Like any server product, SQL Server requires specific combinations of operating system and software in
order to install.
Operating System
Even though it is possible to install versions of SQL Server on client operating systems, such as Windows 7® (SP1) and Windows Vista® (SP2), the product is really designed for use on server operating systems such as the Windows Server series.
Similarly, many higher-end editions of SQL Server also require higher-end editions of Windows. SQL
Server Books Online provides a precise list of supported versions and editions.
It is strongly recommended that you avoid installing SQL Server on a domain controller. If you attempt this, the installation is not blocked, but the following limitations apply:
• You cannot run SQL Server services on a domain controller under a local service account or a network
service account.
• After SQL Server is installed on a computer, you cannot change the computer from a domain member
to a domain controller. You must uninstall SQL Server before you change the host computer to a
domain controller.
• After SQL Server is installed on a computer, you cannot change the computer from a domain
controller to a domain member. You must uninstall SQL Server before you change the host computer
to a domain member.
• SQL Server Setup cannot create security groups or provision SQL Server service accounts on a read-
only domain controller. In this scenario, Setup will fail.
SQL Server failover cluster instances are not supported where cluster nodes are domain controllers, and installation is blocked in this scenario.
Prerequisite Software
In earlier versions, the installer for SQL Server would preinstall most requirements as part of the installation process. This is no longer the case—the .NET Framework and Windows PowerShell need to be preinstalled before running setup. The installer for SQL Server will install the SQL Server Native Client (SNAC) and the SQL Server setup support files. However, to minimize the installation time for SQL Server, particularly in busy production environments, it is useful to have preinstalled these components during any available planned downtime. Components such as the .NET Framework often require a reboot after installation, so the pre-installation of these components can further reduce downtime during installations or upgrades.
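As a sketch of this preinstallation step, on Windows Server 2012 or later you could install the .NET Framework 3.5 feature with Windows PowerShell before running SQL Server setup (the source path is hypothetical and depends on your installation media):

Install-WindowsFeature NET-Framework-Core -Source D:\sources\sxs

Performing this step during planned downtime avoids an unexpected reboot requirement during the SQL Server installation itself.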
Several components of SQL Server have a requirement for the Internet Explorer® browser. These
components include the Microsoft Management Console (MMC) add-in, SSMS, Business Intelligence
Design Studio (BIDS), the Report Designer in BIDS, and any use of HTML Help.
CPU
CPU utilization for a SQL Server system largely depends upon the types of queries that are running on it. Processor planning is often considered relatively straightforward, in that few system architectures provide fine-grained control of the available processor resources. Testing with realistic workloads is the best option.
Increasing the number of available CPUs will provide SQL Server with more scope for creating parallel
query plans. Even without parallel query plans, SQL Server workloads will make good use of multiple
processors when working with simple query workloads from a large number of concurrent users. Parallel
query plans are particularly useful when large amounts of data are being scanned within large data
warehouses.
Try to ensure that your server is dedicated to SQL Server whenever possible. Most servers that are running
production workloads on SQL Server should have no other significant services running on the same
system. This particularly applies to other server applications such as Microsoft Exchange Server.
Many new systems are based on Non-Uniform Memory Access (NUMA) architectures. In a traditional
symmetric multiprocessing (SMP) system, all CPUs and memory are bound to a single system bus. The bus
can become a bottleneck when additional CPUs are added. On a NUMA-based system, each set of CPUs
has its own bus, complete with local memory. In some systems, the local bus might also include separate
I/O channels. These CPU sets are called NUMA nodes. Each NUMA node can access the memory of other
nodes but the local access to local memory is much faster. The best performance is achieved if the CPUs
mostly access their own local memory. Windows and SQL Server are both NUMA aware and try to make
use of these advantages.
Optimal NUMA configuration is highly dependent on the hardware. Special configurations in the system
BIOS might be needed to achieve optimal performance. It is crucial to check with the hardware vendor for
the optimal configuration for a SQL Server on the specific NUMA-based hardware.
Memory
The availability of large amounts of memory for SQL Server to use is now one of the most important
factors when sizing systems.
While SQL Server will operate in relatively small amounts of memory, when memory configuration
challenges arise, they tend to relate to the maximum, not the minimum, values. For example, the Express
Edition of SQL Server will not utilize more than 1 GB of memory, regardless of how much is physically
installed in the system.
The majority of servers being installed today are 64-bit, with a single address space that can directly
access large amounts of memory. The biggest challenge with 32-bit servers is that memory outside of the
4-GB "visible" address space (that is the memory that can be directly accessed) is retrieved by using
Address Windowing Extensions (AWE). While earlier versions of SQL Server allowed the use of AWE-based
memory for the caching of data pages, SQL Server 2012 onwards no longer supports the use of AWE-
based memory to increase the address space for 32-bit systems.
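Although not covered in the text above, a common related configuration task is to cap the memory available to an instance so that the operating system and other processes are not starved of memory. A minimal Transact-SQL sketch follows; the 8192 MB value is purely illustrative:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 8192;
RECONFIGURE;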
Determining Requirements
In the first phase of planning, the requirements of
the application must be determined, including the
I/O patterns that need to be satisfied. These include
the frequency and size of reads and writes sent by
the application. As a general rule, OLTP systems
produce a high number of random I/O operations
on the data files and sequential write operations on database log files. By comparison, data warehouse-
based applications tend to generate large scans on data files.
Storage Styles
The second planning phase involves determining the style of storage to be used. With direct attached
storage (DAS), it is easier to get good predictable performance. On storage area network (SAN) systems,
more work is often required to get good performance, but SAN storage typically provides a wide variety
of management capabilities and storage consolidation.
One particular challenge for SQL Server administrators is that SAN administrators are generally more
concerned with the disk space that is allocated to applications rather than the performance requirements
of individual files. Rather than attempting to discuss file layouts with a SAN administrator, try to
concentrate on your performance requirements for specific files. Leave the decisions about how to
achieve those goals to the SAN administrator. That is, focus on what is needed in these discussions rather
than on how it can be achieved.
RAID Systems
In SAN-based systems, you will not often be concerned about the redundant array of independent disks
(RAID) levels being used. If you have specified the required performance on a file basis, the SAN
administrator will need to select appropriate RAID levels and physical disk layouts to achieve that.
For DAS storage, you should become familiar with the different RAID levels. While other RAID levels exist, RAID levels 1, 5, and 10 are the most common ones used in SQL Server systems.
Number of Spindles
For most current systems, the number of drives (or spindles, even though the term is now somewhat
dated), will matter more than the size of the disk. It is easy to find large disks that will hold substantial
databases but often a single large disk will not be able to provide sufficient I/O operations per second or
enough data throughput (MB/sec) to be workable. Solid state drive (SSD) based systems are quickly
changing the available options in this area.
Drive Caching
Read caching within disk drives is not particularly useful because SQL Server already manages its own
caching system. It is unlikely that SQL Server will need to re-read a page from disk that it has recently
written, unless the system is low on memory. Write caches can substantially improve SQL Server I/O
performance, but make sure that hardware caches guarantee a write, even after a system failure. Many
drive write caches cannot survive failures and this can lead to database corruptions.
SQLIOSim
SQLIOSim is an unsupported utility that you can download from the Microsoft download site. It is
designed to simulate the activity generated by SQL Server without the need to have SQL Server installed.
This capability makes SQLIOSim a good tool for pre-testing server systems that are targets for running
SQL Server.
SQLIOSim is a stand-alone tool that you can copy on to the server and run. It does not need to be
installed on the target system by using an installer. SQLIOSim has both GUI and command-line execution
options.
While SQLIOSim is useful for stress-testing systems, it is not good for general performance testing. The
tasks it performs vary between each execution of the utility, so no attempt should be made to compare
the output of multiple executions directly, particularly in terms of timing.
SQLIO
SQLIO is another unsupported utility that you can download from the Microsoft download site. Unlike
SQLIOSIM, which is used for non-repeatable stress testing, SQLIO is designed to create entirely repeatable
I/O patterns. You use a configuration file to determine the types of I/O operations to test. SQLIO then
tests those types of operations specifically.
A common way to use SQLIOSIM and SQLIO together is to use SQLIOSIM to find issues with certain types
of I/O operations. SQLIO can then be used to generate those specific problematic types of I/O operations,
while attempting to resolve the issues. SQLIO is also a stand-alone tool that does not require SQL Server
to be installed on the system; ideally, you should run these tests before installing SQL Server.
Because SQLIO checks only one I/O type at a time, interpreting the results is the most important task.
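For illustration, a typical SQLIO invocation might resemble the following; the switch values here are
assumptions for a hypothetical test of random 8 KB reads, and you should adapt them to your own test
plan:

sqlio -kR -s300 -frandom -o8 -b8 -LS -Fparam.txt

Here, -kR requests read operations, -s300 runs the test for 300 seconds, -frandom uses a random access
pattern, -o8 issues eight outstanding I/O requests per thread, -b8 uses an 8 KB block size, -LS captures
latency statistics, and -Fparam.txt names the file that lists the test files to use.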
• SQL Server Database Services. This is the service for the SQL Server Database Engine.
• SQL Server Agent. This service is responsible for automation, including running jobs and issuing
alerts.
• Analysis Services. This service provides online analytical processing (OLAP) and data-mining
functionality.
• Reporting Services. This service is involved in the creation and execution of reports.
• Integration Services. This service performs the tasks associated with extract, transform, and load
(ETL) operations.
• Master Data Services. This service enables you to maintain master records of data items to ensure
that data is represented consistently across the enterprise.
• Data Quality Services. This service enables you to maintain data quality through operations such as
data cleansing and de-duplication.
• SQL Server Browser. By default, named instances use dynamic TCP port assignment, which means
that when the SQL Server service starts, the instance selects a TCP port from the available ports. The
SQL Server Browser service enables connectivity to named SQL Server instances when the connection
uses dynamic port assignment in this way. The service is not required for connections to the default
SQL Server instance, which uses port 1433, the well-known port number for SQL Server. For
connections to named instances that use a specific TCP port number, or which specify a port number
in the connection string, the SQL Server Browser service is not required and you can disable it.
• Full-text search. This service enables the indexing and searching of unstructured and semi-
structured data.
• SQL Writer. This service enables you to back up and restore by using the Windows Volume Shadow
Copy service (VSS). This service is disabled by default, and you should only enable it if you intend to
use VSS backups.
The different types of accounts that you can use for SQL Server services include:
• Domain user account. A non-administrator domain user account is a secure choice for service
accounts in a domain environment.
• Local user account. A non-administrator local user account is a secure choice for service accounts in a
non-domain environment, such as a perimeter network.
• Local System account. The local system account is a highly privileged account that is used by various
Windows services. Consequently, you should avoid using this account to run SQL Server services.
• Local Service Account. The local service account is a pre-defined account with restricted
privileges that can be used to access local resources. This account is used by Windows services
and other applications that do not require access to remote resources, and generally a
dedicated service account for each SQL Server service is preferred. If your database server runs
on Windows Server 2008 R2 or later, you can use a virtual service account instead (see below).
• Network Service account. The Network Service account has fewer privileges than the Local System
account, but it does enable a service to have access to network resources. However, because this
account is often used by multiple services, including Windows services, you should avoid using it
where possible. If your database server runs on Windows Server 2008 R2 or later, you can use a virtual
service account instead (see below).
• Managed service account. Managed service accounts are available if the host operating system is
Windows Server 2008 R2 or later (Windows 7 also supports managed service accounts). SQL Server
support for managed service accounts was introduced in SQL Server 2012. A managed service
account is a type of domain account that is associated with a single server and which you can use to
manage services. You cannot use a managed service account to log on to a server, so it is more
secure than a domain user account. Additionally, unlike a domain user account, you do not need to
manually manage passwords for managed service accounts. A domain administrator needs to create
and configure a managed service account before you can use it.
• Virtual service account. Virtual service accounts are available if the host operating system is
Windows Server 2008 R2 or later (Windows 7 also supports virtual service accounts). SQL Server
support for virtual service accounts was introduced in SQL Server 2012. A virtual service account is
similar to a managed service account, except that it is a type of local account that you can use to
manage services rather than a domain account. Unlike managed service accounts, an administrator
does not need to create or configure a virtual service account. This is because a virtual service account
is simply a virtualized instance of the built-in Network Service account with its own unique identifier.
Lesson 2
Installing SQL Server 2014
After making the decisions about your SQL Server configuration, you can proceed to installation. In this
lesson, you will see the phases that installation proceeds through and how SQL Server checks your system
for compatibility by using a tool known as the System Configuration Checker.
For most users, the setup program will report that all was installed as expected. For the rare situations
where this does not occur, you will also learn how to carry out post-installation checks and
troubleshooting.
Lesson Objectives
After completing this lesson, you will be able to:
Installation Wizard
The SQL Server installation wizard provides a simple
user interface for installing SQL Server. It comprises
multiple pages that gather the information
required to install the product.
The installation wizard enables you to select all the components of SQL Server that you want to install. As
well as using it to create a new installation on the server, you can also use it to add components to an
existing one.
Note: You must be a local administrator to run the installation wizard on the local
computer. When installing from a remote share, you need read and execute permissions.
Command Prompt
You can also run the SQL Server setup program from the command prompt, using switches to specify the
options that you require. You can enable users to fully interact with the setup program, to view the
progress without requiring any input, or to run it in quiet mode without any user interface. Unless you are
using a volume licensing or third-party agreement, users will always be required to confirm acceptance of
the software license terms.
Configuration File
As well as using switches to provide information to the command prompt setup, you can also use a
configuration file. This can simplify the task of installing identically-configured instances across your
enterprise.
The configuration file is a text file containing name/value pairs. Rather than writing this file by hand, you
can generate it by running the installation wizard, selecting all your required options, and then, instead
of installing the product, generating a configuration file of those options—or you can take the
configuration file from a previously successful installation.
If you use a configuration file in conjunction with command prompt switches, the command prompt
values will override any values in your configuration file.
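As a sketch, a minimal configuration file might contain entries such as the following; the option names
come from the setup documentation, and the values shown here are illustrative:

[OPTIONS]
ACTION="Install"
FEATURES=SQLENGINE
INSTANCENAME="MSSQLSERVER"
QUIET="True"

You would then run Setup.exe /ConfigurationFile=ConfigurationFile.ini, supplying any remaining
switches—such as service account credentials, which are typically provided at the command prompt
rather than stored in the file.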
• Install Setup Files. The installation process installs the setup files that it requires to install SQL Server.
• Install Rules. The installation process checks for known potential issues that can occur during setup
and requires you to rectify any that it finds before continuing.
• Setup Role. You must select the type of installation that you need to ensure that the process includes
the correct feature components you require. The options are:
o SQL Server Feature Installation. This option installs the key components of SQL Server,
including the database engine, Analysis Services, Reporting Services, and Integration Services.
o SQL Server PowerPivot for SharePoint. This option installs PowerPivot for SharePoint on a new
or existing instance of SharePoint server.
o All Features with Defaults. This option installs all SQL Server features and uses the default
options for the service accounts.
After you select one of these options, you can further customize the features to install on the next page of
the wizard.
• Feature Selection. You can use this page to select the exact features that you want to install. You can
also specify where to install the instance features and the shared features.
• Instance Configuration. You must specify whether to install a default or named instance (on the first
installation) and if a named instance, the name that you want to use.
• Server Configuration. Specify the service account details and startup type for each service that you
are installing.
• <Component> Configuration. You must configure component-specific settings for each component
that you have selected to install.
• Ready to Install. Use this page to review the options you have selected throughout the wizard prior
to performing the installation.
• Complete. When the installation is complete, you are likely to need to reboot the server.
In both examples on the slide, the second method has been used. The first example shows a typical
installation command and the second shows how an upgrade could be performed using the same
method.
/q Switch
The "/q" switch shown in the examples specifies "quiet mode" – no user interface is provided. An
alternative switch "/qs" specifies "quiet simple" mode. In the quiet simple mode, the installation runs and
shows progress in the UI but does not accept any input.
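For illustration, a quiet installation and a quiet upgrade might be performed with commands such as
these (the feature list, instance name, and administrator account are illustrative):

Setup.exe /q /ACTION=Install /FEATURES=SQLENGINE /INSTANCENAME=MSSQLSERVER
/SQLSYSADMINACCOUNTS="ADVENTUREWORKS\Student" /IACCEPTSQLSERVERLICENSETERMS

Setup.exe /q /ACTION=Upgrade /INSTANCENAME=MSSQLSERVER /IACCEPTSQLSERVERLICENSETERMS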
In-place Upgrades
In-place upgrades occur when the installed version
of SQL Server is directly replaced by a new version
by the SQL Server setup program. This is an easier
and highly automated, though riskier, method of
upgrading. If an upgrade fails, it is much harder to return to the previous operating state. For most
customers, the risk of this cannot be ignored.
When you are weighing up the risk, you need to consider that it may not be the SQL Server upgrade
itself that fails. Even if the upgrade works as expected, the application might fail to operate as anticipated
on the new version of SQL Server, and the need to recover the situation quickly will be just as important.
In-place upgrades have the added advantage of minimizing the need for additional hardware resources
and avoid the need to redirect client applications that are configured to work with the existing server.
Before performing an in-place upgrade, you should use the Upgrade Advisor tool to analyze existing
instances and identify any issues that need to be addressed before upgrading. You can install the Upgrade
Advisor from the Planning page of the SQL Server 2014 Installation Center.
Side-by-side Upgrades
Side-by-side upgrades are a safer alternative, as the original system is left in place and can be quickly
returned to production should an upgrade issue arise. However, side-by-side upgrades involve extra work
and more hardware resources.
To perform a side-by-side upgrade, you will need enough hardware resources to provide for both the
original and the new systems. Two common risks associated with side-by-side upgrades relate to the time
taken to copy all the user databases to a new location and the space required to hold these copies.
While most side-by-side upgrades are performed on separate servers, it is possible to install both versions
of SQL Server on the same server during a side-by-side upgrade. However, side-by-side upgrades of
versions with the same major version number (for example, SQL Server 2008 and SQL Server 2008 R2) on
the same server are a special case. Because the major version number is identical, separate versions of
the shared components cannot co-exist on the same server, and the shared components will be
upgraded.
Not all versions of SQL Server are supported when installed side-by-side. Consult SQL Server Books Online
for a matrix of versions that are supported when installed together.
Hybrid Options
It is also possible to use some elements of an in-place upgrade and a side-by-side upgrade together. For
example, rather than copying all the user databases, after installing the new version of SQL Server beside
the old version, and migrating all the server objects such as logins, you could detach user databases from
the old server instance and reattach them to the new one.
Once user databases have been attached to a newer version of SQL Server, they cannot be reattached to
an older version again, even if the database compatibility settings have not been upgraded. This is a risk
that needs to be considered when using a hybrid approach.
Lesson 3
Post-Installation Configuration
When you have completed the installation of SQL Server, you can perform post-installation checks and
configuration tasks to meet your specific requirements.
Lesson Objectives
After completing this lesson, you will be able to:
You do not usually need to check the contents of the SQL Server setup log files after installation, because
the installer program will indicate any errors that occur and attempt to reverse any of the SQL Server
setup that has been completed to that point. Note, however, that when errors occur during the SQL
Server Setup phase, the installation of the SQL Server Native Client and the Setup Components is not
reversed.
Typically, you only need to view the setup log files in two scenarios:
• If setup is failing and the error information displayed by the installer does not help you to resolve the
issue.
• If you contact Microsoft Product Support and they ask for detailed information.
If you do require the log files, you will find them in the %programfiles%\Microsoft SQL Server\120\Setup
Bootstrap\Log folder.
Configuring Services
You can use SSCM to control (that is, start, stop,
and configure) each service independently, to set
the startup mode (automatic, manual, or disabled)
of each service, and to set the service account
identity for each service.
You can also set startup parameters to start SQL Server services with specific configuration settings for
troubleshooting purposes.
You also use SSCM to configure both server and client protocols and ports. SSCM provides two sets of
network configurations—protocols that the server exposes and those used for making connections. Both
sets support the following protocols:
• TCP/IP
• Named pipes
• Shared memory
The configuration for the TCP/IP protocol allows for different settings on each configured IP address if
required, or a general set of configurations that are applied to all IP addresses.
Aliases
Connecting to a SQL Server service can involve multiple settings such as server address, protocol, and
port. If you hard-code these connection details in your client applications, and then any of the details
change, your application will no longer work. To avoid this issue and to make the connection process
simpler, you can use SSCM to create aliases for server connections.
You create a server alias and associate it with a server, protocol, and port (if required). Client applications
can then connect to the alias without being concerned about how those connections are made.
Each client system that utilizes SNAC (including the server itself) can have one or more aliases configured.
Aliases for 32-bit applications are configured independently of those for 64-bit applications.
• Cumulative Updates (CUs) are periodic roll-up releases of hotfixes that have received further testing
as a group.
• Service Packs (SPs) are periodic releases where full regression testing has been performed. Microsoft
recommends applying SPs to all systems after appropriate levels of organizational testing.
The simplest way to keep SQL Server up to date is to enable automatic updates from the Microsoft
Update service. Larger organizations, or those with strong change control processes, should exercise
caution in
applying automatic updates. It is likely that the updates should be applied to test or staging environments
before being applied to production environments.
SQL Server 2014 can also have product SPs slipstreamed into the installation process to avoid the need to
apply them after installation.
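Assuming the product update parameters of the setup program, a slipstreamed installation might be
launched as follows (paths and options are illustrative):

Setup.exe /q /ACTION=Install /FEATURES=SQLENGINE /INSTANCENAME=MSSQLSERVER
/UpdateEnabled=True /UpdateSource="D:\Updates" /IACCEPTSQLSERVERLICENSETERMS

Here, /UpdateSource points to a folder containing the extracted service pack or cumulative update
packages, which setup applies as part of the installation.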
Objectives
After completing this lab, you will be able to:
Virtual machine: 20462C-MIA-SQL
User name: ADVENTUREWORKS\Student
Password: Pa$$w0rd
2. In the SQL Server Installation Center, on the Planning page, view the Hardware and Software
Requirements.
2. Keep the SQL Server Installation Center window open. You will use it again in a later exercise.
Results: After this exercise, you should have run the SQL Server setup program and used the tools in the
SQL Server Installation Center to assess the computer’s readiness for SQL Server installation.
Item            Configuration
Startup         Both SQL Server and SQL Server Agent should start manually
SA Password     Pa$$w0rd
o On the Product Key page, select Evaluation edition, which does not require a product key.
o On the Feature Selection page, select only the features that are required.
o On the Server Configuration page, configure the service account name and password, the
startup type for the SQL Server Agent and SQL Server Database Engine services, and verify the
collation.
o On the Database Engine Configuration page, configure the authentication mode and the SA
password; add the current user (Student) to the SQL Server administrators list, specify the
required data directories, and verify that Filestream is not enabled.
Results: After this exercise, you should have installed an instance of SQL Server.
2. Verify that the service is configured to log on as ADVENTUREWORKS\ServiceAcct. Then start
the service.
2. View the SQL Server Native Client 32-bit client protocols and verify that the TCP/IP protocol is
enabled. Then create an alias named Test that uses TCP/IP to connect to the MIA-SQL\SQLTEST
instance from 32-bit clients.
3. View the SQL Server Native Client protocols and verify that the TCP/IP protocol is enabled. Then
create an alias named Test that uses TCP/IP to connect to the MIA-SQL\SQLTEST instance from 64-
bit clients.
SELECT @@ServerName;
GO
a. View the properties of the Test instance and verify that the value of the Name property is MIA-
SQL\SQLTEST.
Results: After this exercise, you should have started the SQL Server service and connected using SSMS.
Review Question(s)
Question: What additional considerations do you think there are for installing additional
named instances on a server where SQL Server is already installed?
Module 3
Working with Databases and Storage
Contents:
Module Overview
Module Overview
One of the most important roles for database administrators working with Microsoft® SQL Server® is the
management of databases and storage. It is important to know how data is stored in databases, how to
create databases, how to manage database files, and how to move them. Other tasks related to storage
include managing the tempdb database and using fast storage devices to extend the SQL Server buffer
pool cache.
Objectives
After completing this module, you will be able to:
Lesson 1
Introduction to Data Storage with SQL Server
Before you can create and manage databases effectively, you must understand how data is stored in
them, know about the different types of files that SQL Server can use and where they should be placed,
and learn how to plan for ongoing file growth.
Lesson Objectives
After completing this lesson, you will be able to:
• Explain how specific redundant array of independent disks (RAID) systems work.
• Determine appropriate file placement and the number of files for SQL Server databases.
Database Files
There are three types of database file used by SQL
Server—primary data files, secondary data files, and
transaction log files.
Transaction log files hold the information that is required to maintain the
integrity of the database in case of a failure and to support rollbacks of transactions. The recommended
extension for log files is .ldf.
When data pages need to be changed, they are fetched into memory and changed there; the changed
pages in memory are known as “dirty pages”. Log records describing the changes are written to the
transaction log in a synchronous manner. Later, during a background process known as a “checkpoint”,
the dirty pages are written to the database files. For this reason, the contents of the transaction log are
critical to the ability of SQL Server to recover the database to a known committed state. Transaction logs
are discussed in detail later in this course.
Note: The log file is also used by other SQL Server features, such as transactional
replication, database mirroring, and change data capture. These are advanced topics and beyond
the scope of this course.
Extents
Groups of eight contiguous pages are referred to as an extent. SQL Server uses extents to simplify the
management of data pages. There are two types of extents:
• Uniform extents. All pages within the extent contain data from only one object.
• Mixed extents. The pages of the extent can hold data from different objects.
The first allocation for an object is at the page level, and always comes from a mixed extent. If they are
free, other pages from the same mixed extent will be allocated to the object as needed. Once the object
has grown bigger than its first extent, then all future allocations are from uniform extents.
In both primary and secondary data files, a small number of pages is allocated to track the usage of
extents in the file.
RAID Levels
Many storage solutions use RAID hardware to
provide fault tolerance through data redundancy,
and in some cases, to improve performance. You
can also implement software-controlled RAID 0,
RAID 1, and RAID 5 by using the Windows Server
operating system, and other levels may be
supported by third-party SANs. Commonly used
types of RAID include:
• RAID 1, disk mirroring. A mirror set is a logical storage volume that is based on space from two
disks, with one disk storing a redundant copy of the data on the other. Mirroring can provide good
read performance, but write performance can suffer. RAID 1 is expensive in terms of storage because
50 percent of the available disk space is used to store redundant data.
• RAID 5, disk striping with parity. RAID 5 offers fault tolerance through the use of parity data that is
written across all the disks in a striped volume that is comprised of space from 3 or more disks. RAID
5 typically performs better than RAID 1. However, if a disk in the set fails, performance degrades.
RAID 5 is less costly in terms of disk space than RAID 1 because parity data only requires the
equivalent of one disk in the set to store it. For example, in an array of five disks, four would be
available for data storage, which represents 80 percent of the total disk space.
• RAID 10, mirroring with striping. In RAID 10, a non-fault tolerant RAID 0 stripe set is mirrored. This
arrangement delivers the excellent read/write performance of RAID 0, combined with the fault
tolerance of RAID 1. However, RAID 10 can be expensive to implement because, like RAID 1, 50
percent of the total space is used to store redundant data.
• Generally, RAID 10 offers the best combination of read/write performance and fault tolerance, but is
the most costly solution.
• Write operations on RAID 5 can sometimes be relatively slow compared to RAID 1 because of the
need to calculate parity data. If you have a high proportion of write activity, therefore, RAID 5 might
not be the best candidate.
• Consider the cost per GB. For example, implementing a 500 GB database on a RAID 1 mirror set
would require (at least) two 500 GB disks. Implementing the same database on a RAID 5 array would
require substantially less storage space.
• Many databases use a SAN, and the performance characteristics can vary between SAN vendors. For
this reason, if you use a SAN, you should consult with your vendors to identify the optimal solution
for your requirements.
• Windows storage spaces enable you to create extensible RAID storage solutions that use commodity
disks. This solution offers many of the benefits of a specialist SAN hardware solution, at a significantly
lower cost.
Access Patterns
The access patterns of log and data files are very different. Access to log files consists primarily of
sequential, synchronous writes, with occasional random reads. Access to data files is predominantly
random and asynchronous. A single physical storage device does not tend to provide good response
times when these two types of data access are combined.
Recovery
While RAID volumes provide some protection from physical storage device failures, complete volume
failures can still occur. If a SQL Server data file is lost, the database can be restored from a backup and the
transaction log reapplied to recover the database to a recent point in time. If a SQL Server log file is lost,
the database can be forced to recover from the data files, with the possibility of some data loss or
inconsistency in the database. However, if both the data and log files are on a single disk subsystem that is
lost, the recovery options usually involve restoring the database from an earlier backup and losing all
transactions since that time. Isolating data and log files can help to avoid the worst impacts of drive
subsystem failures.
Note: Storage solutions use logical volumes as units of storage, and a common mistake is
to place data files and log files on different volumes that are actually based on the same physical
storage devices. When isolating data and log files, ensure that volumes on which you store data
and log files are based on separate underlying physical storage devices.
Using multiple database files can provide benefits that include:
• A reduction in recovery time when separately restoring a database file (for example, if only part of the
data is corrupt).
• The ability to have databases larger than the maximum size of a single Windows file.
Many administrators are concerned that larger database files will increase the time it takes to perform
backups. However, the size of a SQL Server backup is not directly related to the size of the database files,
because only pages that actually contain data are backed up.
One significant issue that arises with autogrowth is a trade-off related to the size of the growth
increments. If a large increment is specified, a significant delay can be experienced in the execution of the
Transact-SQL statement that triggers the need for growth. If the specified increment is too small, the
filesystem can become very fragmented and the database performance can suffer because the data files
have been allocated in small chunks all over a disk subsystem.
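For example, you can set a fixed growth increment on a data file; the database and logical file names
here are illustrative:

ALTER DATABASE Sales
MODIFY FILE (NAME = Sales_data, FILEGROWTH = 100MB);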
As well as expanding the size of the transaction log, you can also truncate a log file. Truncating the log
purges the file of inactive, committed, transactions and enables the SQL Server database engine to reuse
this part of the transaction log. However, you should be careful when truncating the transaction log as
doing so may affect the recoverability of the database in the event of a failure. Generally, log truncation is
managed as part of a backup strategy.
Lesson 2
Managing Storage for System Databases
SQL Server uses system databases to maintain internal metadata. Database administrators should be
familiar with the SQL Server system databases and how to manage them.
Lesson Objectives
After completing this lesson, you will be able to:
• Configure tempdb.
master
The master database contains all system-wide
information. Anything that is defined at the server
instance level is typically stored in the master
database. If the master database is damaged or
corrupted, SQL Server will not start, so it is
imperative to back it up on a regular basis.
msdb
The msdb database holds information about database maintenance tasks, and in particular it contains
information used by the SQL Server Agent for maintenance automation, including jobs, operators, and
alerts. It is also important to regularly back up the msdb database, to ensure that jobs, schedules, history
for backups, restores, and maintenance plans are not lost. In earlier versions of SQL Server, SQL Server
Integration Services (SSIS) packages were often stored within the msdb database. In SQL Server 2014, you
should store them in the dedicated SSIS catalog database instead.
model
The model database is the template on which all user databases are established. Any new database uses
the model database as a template. If you create any objects in the model database, they will then be
present in all new databases on the server instance. Many sites never modify the model database. Note
that, even though the model database does not seem overly important, SQL Server will not start if the
model database is not present.
tempdb
The tempdb database holds temporary data. SQL Server re-creates this database every time it starts, so
there is no need to perform a backup. In fact, there is no option to perform a backup of the tempdb
database.
resource
The resource database is a read-only hidden database that contains system objects mapped to the sys
schema in every database. This database also holds all system stored procedures, system views and system
functions. In SQL Server versions before SQL Server 2005, these objects were defined in the master
database.
2. In the SQL Server Services node, right-click the instance of SQL Server, click Properties, and then click
the Startup Parameters tab.
3. Edit the Startup Parameters values to point to the planned location for the master database data (-
d parameter) and log (-l parameter) files.
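For example, the edited startup parameters for a relocated master database might look like the
following (the drive letter and folder are illustrative):

-dT:\SystemData\master.mdf
-lT:\SystemData\mastlog.ldf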
• Internal objects
Note: Working with internal objects is an advanced concept beyond the scope of this
course.
• Row versions
Transactions that are associated with snapshot-related transaction isolation levels can cause alternate
versions of rows to be briefly maintained in a special row version store within tempdb. Row versions
can also be produced by other features, such as online index rebuilds, Multiple Active Result Sets
(MARS), and triggers.
• User objects
Most objects that reside in the tempdb database are user-generated and consist of temporary tables,
table variables, result sets of multi-statement table-valued functions, and other temporary row sets.
Because tempdb is used for so many purposes, it is difficult to predict its required size in advance. You
should carefully test and monitor the sizes of your tempdb database in real-life scenarios for new
installations. Running out of disk space in the tempdb database can cause significant disruptions in the
SQL Server production environment and prevent applications that are running from completing their
operations. You can use the sys.dm_db_file_space_usage dynamic management view to monitor the disk
space that the files are using. Additionally, to monitor the page allocation or deallocation activity in
tempdb at the session or task level, you can use the sys.dm_db_session_space_usage and
sys.dm_db_task_space_usage dynamic management views.
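As a simple sketch, the following query reports how much space remains unallocated in tempdb (each
page is 8 KB):

SELECT SUM(unallocated_extent_page_count) AS free_pages,
       SUM(unallocated_extent_page_count) * 8 AS free_space_kb
FROM tempdb.sys.dm_db_file_space_usage;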
By default, the tempdb database automatically grows as space is required because the MAXSIZE of the
files is set to UNLIMITED. Therefore, tempdb can continue growing until space on the disk that contains it
is exhausted.
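To move the tempdb files—as in the demonstration that follows—you can modify the file locations and
then restart the service; the default logical file names tempdev and templog are assumed here:

ALTER DATABASE tempdb
MODIFY FILE (NAME = tempdev, FILENAME = 'T:\tempdb.mdf');
ALTER DATABASE tempdb
MODIFY FILE (NAME = templog, FILENAME = 'T:\templog.ldf');

The new locations take effect the next time the SQL Server service starts.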
Demonstration Steps
Move tempdb files
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to
20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. In the D:\Demofiles\Mod03 folder, run Setup.cmd as Administrator.
3. Start SQL Server Management Studio and connect to the MIA-SQL database engine using Windows
authentication.
4. In Object Explorer, expand Databases, expand System Databases, and then right-click tempdb and
click Properties.
5. In the Database Properties dialog box, on the Files page, note the current files and their location.
Then click Cancel.
7. View the code in the script, and then click Execute. Note the message that is displayed after the code
has run.
8. View the contents of T:\ and note that no files have been created in that location, because the SQL
Server service has not yet been restarted.
9. In Object Explorer, right-click MIA-SQL and click Restart. When prompted to allow changes, to
restart the service, and to stop the dependent SQL Server Agent service, click Yes.
10. View the contents of T:\ and note that the tempdb.mdf and tempdb.ldf files have been moved to
this location.
11. Keep SQL Server Management Studio open for the next demonstration.
Lesson 3
Managing Storage for User Databases
User databases are non-system databases that you create for applications. Creating databases is a core
competency for database administrators working with SQL Server. As well as understanding how to create
them, you need to be aware of the impact of file initialization options and know how to alter existing
databases.
When creating databases, you also need to consider where the data and logs will be stored on the file
system. You may also want to change this or provide additional storage when the database is in use.
When databases become larger, there is a need to allocate the data across different volumes, rather than
storing it in a single large disk volume. This allocation of data is configured by using filegroups, which
are used to address both performance and ongoing management needs within databases.
Lesson Objectives
After completing this lesson, you will be able to:
CREATE DATABASE
Database names must be unique within an instance
of SQL Server and comply with the rules for
identifiers. A database name is of data type sysname, which is defined as nvarchar(128). This means that
up to 128 characters can be present in the database name and that each character can be chosen from
the double-byte Unicode character set. While database names can be quite long, you will find that long
names become awkward to work with.
Data Files
As discussed earlier in this module, a database must have a single primary data file and at least one log
file. The ON and LOG ON clauses of the CREATE DATABASE statement specify the name and path to use.
In the following code, a database named Sales is being created, comprising of two files—a primary data
file located at M:\Data\Sales.mdf and a log file located at L:\Logs\Sales.ldf:
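CREATE DATABASE Sales
ON PRIMARY
    (NAME = Sales_data,
     FILENAME = 'M:\Data\Sales.mdf',
     SIZE = 100MB,
     MAXSIZE = 500MB,
     FILEGROWTH = 20%)
LOG ON
    (NAME = Sales_log,
     FILENAME = 'L:\Logs\Sales.ldf',
     SIZE = 20MB,
     MAXSIZE = UNLIMITED,
     FILEGROWTH = 10MB);

In this reconstruction, the logical file names Sales_data and Sales_log are illustrative.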
Each file includes a logical file name as well as a physical file path. Because operations in SQL Server use
the logical filename to reference the file, the logical file name must be unique within each database.
In this example, the primary data file has an initial file size of 100 MB and a maximum file size of 500 MB.
It will grow by 20 percent of its current size whenever autogrowth needs to occur. The log file has an
initial file size of 20 MB and has no limit on maximum file size. Each time it needs to autogrow, it will grow
by a fixed 10 MB allocation.
While it is possible to create a database by providing just the database name, this results in a database
that is based on the model database—with the data and log files in the default locations—which is
unlikely to be the configuration that you require.
Deleting Databases
To delete (or “drop”) a database, right-click it in Object Explorer and click Delete or use the DROP
DATABASE Transact-SQL statement. Dropping a database automatically deletes all of its files.
Dropping a database.
DROP DATABASE Sales;
Categories of Options
There are several categories of database options:
• Auto options. Control certain automatic behaviors. As a general guideline, Auto Close and Auto
Shrink should be turned off on most systems but Auto Create Statistics and Auto Update
Statistics should be turned on.
• Cursor options. Control cursor behavior and scope. In general, the use of cursors when working with
SQL Server is not recommended, apart from particular applications such as utilities. Cursors are not
discussed further in this course but it should be noted that their overuse is a common cause of
performance issues.
• Database availability options. Control whether the database is online or offline, who can connect to it,
and whether or not it is in read-only mode.
o Recovery model. Database recovery models will be discussed in Module 4 of this course.
o Page verify. Early versions of SQL Server offered an option called Torn Page Detection. This
option caused SQL Server to write a small bitmap across each disk drive sector within a database
page. There are 512 bytes per sector, meaning that there are 16 sectors per database page (8 KB).
This was a fairly crude yet reasonably effective way to detect a situation where only some of the
sectors required to write a page were in fact written. In SQL Server 2005, a new CHECKSUM
verification option was added. The use of this option causes SQL Server to calculate and add a
checksum to each page as it is written and to recheck the checksum whenever a page is retrieved
from disk.
Note: Page checksums are only added the next time that any page is written. Enabling the
option does not cause every page in the database to be rewritten with a checksum.
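For example, to enable the CHECKSUM verification option described above on the Sales database used
in earlier examples:

ALTER DATABASE Sales SET PAGE_VERIFY CHECKSUM;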
Demonstration Steps
Create a Database by Using SQL Server Management Studio
1. Ensure that you have completed the previous demonstration. If not, start the 20462C-MIA-DC and
20462C-MIA-SQL virtual machines, log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student
with the password Pa$$w0rd, and run D:\Demofiles\Mod03\Setup.cmd as Administrator.
2. If SQL Server Management Studio is not open, start it and connect to the MIA-SQL database engine
using Windows authentication.
• DemoDB1
o Path: M:\Data\
• DemoDB1_log
o Path: L:\Logs\
8. On the Options tab, review the database options. Then click Cancel.
1. In SQL Server Management Studio, open the CreatingDatabases.sql script file from the
D:\Demofiles\Mod03 folder.
2. Select the code under the comment Create a database and click Execute to create a database
named DemoDB2.
3. Select the code under the comment View database info and click Execute. Then view the
information that is returned.
4. Keep SQL Server Management Studio open for the next demonstration.
For example, you can set a database to read only or read write.
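Using the Sales database from earlier examples:

ALTER DATABASE Sales SET READ_ONLY;
ALTER DATABASE Sales SET READ_WRITE;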
Note: Many of the database set options that you configure by using the ALTER DATABASE
statement can be overridden using a session level set option. This enables users or applications to
execute a SET statement to configure the setting just for the current session.
Additional Reading: For more information about database set options, see ALTER
DATABASE SET Options (Transact-SQL) in SQL Server Books Online.
The value that you specify for the compatibility level defines the version of SQL Server whose behavior
the database should be compatible with.
The values you can use are described in the following table:

Value    Compatible SQL Server version
100      SQL Server 2008 and SQL Server 2008 R2
110      SQL Server 2012
120      SQL Server 2014
If a database has already exhausted the space allocated to it and it cannot grow a data file automatically,
error 1105 is raised. (The equivalent error number for the inability to grow a transaction log file is 9002.)
This can happen if the database is not set to grow automatically or if there is not enough disk space on
the hard drive.
Adding Files
One option for expanding the size of a database is to add files. You can do this by using either SSMS or
the ALTER DATABASE … ADD FILE statement.
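For example—the file name, path, and sizes are illustrative:

ALTER DATABASE Sales
ADD FILE (NAME = Sales_data2,
          FILENAME = 'N:\Data\Sales2.ndf',
          SIZE = 100MB,
          FILEGROWTH = 100MB);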
Expanding Files
When expanding a database, you must increase its size by at least 1 MB. Ideally, any file size increase
should be much larger than this. Increases of 100 MB or more are common.
When you expand a database, the new space is immediately made available to either the data or
transaction log file, depending on which file was expanded. When you expand a database, you should
specify the maximum size to which the file is permitted to grow. This prevents the file from growing until
disk space is exhausted. To specify a maximum size for the file, use the MAXSIZE parameter of the ALTER
DATABASE statement, or use the Restrict filegrowth (MB) option when you use the Properties dialog box
in SSMS to expand the database.
Transaction Log
If the transaction log is not set up to expand automatically, it can run out of space when certain types of
activity occur in the database. In addition to expanding the size of the transaction log, the log file can be
truncated. Truncating the log purges the file of inactive, committed transactions and allows the SQL
Server database engine to reuse this unused part of the transaction log. If there are active transactions,
the log file might not be able to be truncated and expanding it may be the only available option.
Shrinking a Database
You can reduce the size of the files in a database by removing unused pages. Although the database
engine will reuse space effectively, there are times when a file no longer needs to be as large as it once
was. Shrinking the file may then become necessary, but it should be considered a rarely used option. You
can shrink both data and transaction log files—this can be done manually, either as a group or
individually, or you can set the database to shrink automatically at specified intervals.
Note: Shrinking a file usually involves moving pages within the files, which can take a
long time.
Regular shrinking of files tends to lead to regrowth of files. For this reason, even though SQL
Server provides an option to automatically shrink databases, it should only rarely be used. In
most databases, enabling this option will cause substantial fragmentation issues on the disk
subsystem. It is best practice to perform shrink operations only if absolutely necessary.
TRUNCATEONLY
TRUNCATEONLY is an additional option of DBCC SHRINKFILE that releases all free space at the end of
the file to the operating system, but does not perform any page movement inside the file. The data file is
shrunk only to the last allocated extent. This option often does not shrink the file as effectively as a
standard DBCC SHRINKFILE operation, but it is less likely to cause substantial fragmentation and is much
faster.
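For example, to release only the trailing free space of a data file (the logical file name is illustrative):

DBCC SHRINKFILE (N'Sales_data', TRUNCATEONLY);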
Introduction to Filegroups
As you have seen in this module, databases consist
of at least two files: a primary data file and a log
file. To improve performance and manageability of
large databases, you can add secondary files.
Every database has a primary filegroup (named PRIMARY), and when you add secondary data files to the
database, they automatically become part of the primary filegroup, unless you specify a different
filegroup.
Note: You will learn more about partial backups and piecemeal restores later this course.
When a filegroup contains multiple files, SQL Server can write to all of the files simultaneously, and it
populates them by using a proportional fill strategy. Files that are the same size will have the same
amount of data written to them, ensuring that they fill at a consistent rate. Files that are different sizes will
have different amounts of data written to them to ensure that they fill up at a proportionally consistent
rate. The fact that SQL Server can write to filegroup files simultaneously enables you to use a filegroup to
implement a simple form of striping. You can create a filegroup that contains two or more files, each of
which is on a separate disk. When SQL Server writes to the filegroup, it can use the separate I/O channel
for each disk concurrently, which results in faster write times.
Note: Generally, you should use filegroups primarily to improve manageability and rely on
storage device configuration for I/O performance. However, when a striped storage volume is not
available, using a filegroup to spread data files across physical disks can be an effective
alternative.
Creating Filegroups
You can create additional filegroups and assign files
to them as you create a database, or you can add
new filegroups and files to an existing database.
In the following code example, the Sales database includes a primary filegroup containing a single file
named sales.mdf, a filegroup named Transactions containing two files named sales_tran1.ndf and
sales_tran2.ndf, and a filegroup named Archive containing two files named sales_archive1.ndf and
sales_archive2.ndf.
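CREATE DATABASE Sales
ON PRIMARY
    (NAME = sales, FILENAME = 'M:\Data\sales.mdf'),
FILEGROUP Transactions
    (NAME = sales_tran1, FILENAME = 'N:\Data\sales_tran1.ndf'),
    (NAME = sales_tran2, FILENAME = 'N:\Data\sales_tran2.ndf'),
FILEGROUP Archive
    (NAME = sales_archive1, FILENAME = 'O:\Data\sales_archive1.ndf'),
    (NAME = sales_archive2, FILENAME = 'O:\Data\sales_archive2.ndf')
LOG ON
    (NAME = sales_log, FILENAME = 'L:\Logs\sales_log.ldf');

In this reconstruction, the drive letters and the log file specification are not given in the description
above and are illustrative.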
To add filegroups to an existing database, you can use the ALTER DATABASE … ADD FILEGROUP
statement. You can then use the ALTER DATABASE … ADD FILE statement to add files to the new
filegroup.
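For example—the filegroup, file, and path names are illustrative:

ALTER DATABASE Sales
ADD FILEGROUP Archive2;
GO
ALTER DATABASE Sales
ADD FILE (NAME = sales_archive3,
          FILENAME = 'O:\Data\sales_archive3.ndf')
TO FILEGROUP Archive2;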
The following code example changes the default filegroup in the Sales database to the Transactions
filegroup:
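ALTER DATABASE Sales
MODIFY FILEGROUP Transactions DEFAULT;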
To make a filegroup read-only, use the ALTER DATABASE … MODIFY FILEGROUP statement with the
READONLY option.
To make a read-only filegroup writable, use the ALTER DATABASE … MODIFY FILEGROUP statement with
the READWRITE option.
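For example:

ALTER DATABASE Sales
MODIFY FILEGROUP Archive READONLY;

ALTER DATABASE Sales
MODIFY FILEGROUP Archive READWRITE;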
Lesson 4
Moving Database Files
As well as adding and removing files from a database, you may sometimes need to move database files,
or even whole databases. You may also need to copy a database.
Lesson Objectives
After completing this lesson, you will be able to:
The following example shows how to move the data file for the AdventureWorks database:
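ALTER DATABASE AdventureWorks SET OFFLINE;

-- Move the physical file to the new location, then update the file metadata:
ALTER DATABASE AdventureWorks
MODIFY FILE (NAME = AdventureWorks_Data,
             FILENAME = 'N:\Data\AdventureWorks_Data.mdf');

ALTER DATABASE AdventureWorks SET ONLINE;

In this sketch, the logical file name AdventureWorks_Data and the target path are assumptions; the
physical file must be moved while the database is offline.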
Detaching Databases
You detach databases from an instance of SQL
Server by using SSMS or the sp_detach_db stored
procedure. Detaching a database does not remove
the data from the data files or remove the data files
from the server. It simply removes the metadata
entries for that database from the system databases
on that SQL Server instance. The detached database then no longer appears in the list of databases in
SSMS or in the results of the sys.databases catalog view. After you have detached a database, you can
move or copy it, and then attach it to another instance of SQL Server.
UPDATE STATISTICS
SQL Server maintains a set of statistics on the distribution of data in tables and indexes. As part of the
detach process, you can specify an option to perform an UPDATE STATISTICS operation on table and
index statistics. While this is useful if you are going to reattach the database as a read-only database, in
general it is not a good option to use while detaching a database.
Detachable Databases
Not all databases can be detached. Databases that are configured for replication, are mirrored, or are in
a suspect state cannot be detached.
Note: Replicated and mirrored databases are advanced topics beyond the scope of this
course.
A more common problem that prevents a database from being detached at the time that you attempt to
perform the operation, is that connections are open to the database. You must ensure that all connections
are dropped before detaching the database. SSMS offers an option to force connections to be dropped
during this operation.
Attaching Databases
SSMS also provides you with the ability to attach databases. You can also do this by using the CREATE
DATABASE … FOR ATTACH statement.
Note: You may find many references to the sp_attach_db and sp_attach_single_file_db
stored procedures. These older procedures are replaced by the FOR ATTACH option of the
CREATE DATABASE statement. Note also that there is no equivalent replacement for the
sp_detach_db procedure.
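For example, to attach the Sales database files used in the earlier examples (the paths are illustrative):

CREATE DATABASE Sales
ON (FILENAME = 'M:\Data\Sales.mdf'),
   (FILENAME = 'L:\Logs\Sales.ldf')
FOR ATTACH;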
A common problem when databases are reattached is that database users can become orphaned. You will
see how to deal with this problem in a later module.
• Detach a database.
• Attach a database.
Demonstration Steps
Detach a Database
1. Ensure that you have completed the previous demonstrations in this module, and that you have
created a database named DemoDB2.
2. In Object Explorer, right-click the Databases folder and click Refresh; and verify that the DemoDB2
database is listed.
3. Right-click DemoDB2, point to Tasks, and click Detach. Then in the Detach Database dialog box,
select Drop Connections and Update Statistics, and click OK.
4. View the M:\Data and L:\Logs folders and verify that the DemoDB2.mdf and DemoDB2.ldf files have
not been deleted.
Attach a Database
1. In SQL Server Management Studio, in Object Explorer, in the Connect drop-down list, click Database
Engine. Then connect to the MIA-SQL\SQL2 database engine using Windows authentication.
2. In Object Explorer, under MIA-SQL\SQL2, expand Databases and view the databases on this
instance.
3. In Object Explorer, under MIA-SQL\SQL2, right-click Databases and click Attach.
4. In the Attach Databases dialog box, click Add. Then in the Locate Database Files dialog box, select
the M:\Data\DemoDB2.mdf database file and click OK.
5. In the Attach Databases dialog box, after you have added the primary data file (.mdf), note that all of
the database files are listed. Then click OK.
6. In Object Explorer, under MIA-SQL\SQL2, under Databases, verify that DemoDB2 is now listed.
Lesson 5
Configuring the Buffer Pool Extension
The topics in this module so far have discussed the storage of system and user database files. However,
SQL Server 2014 also supports the use of high-performance storage devices, such as solid-state disks
(SSDs), to extend the buffer pool (the in-memory cache of data pages).
Lesson Objectives
After completing this lesson, you will be able to:
• Explain the considerations needed when working with the Buffer Pool Extension.
Only clean pages, containing data that is committed, are stored in the Buffer Pool Extension, ensuring that
there is no risk of data loss in the event of a storage device failure. Additionally, if a storage device
containing the Buffer Pool Extension fails, the extension is automatically disabled. You can easily re-enable
the extension when the failed storage device has been replaced.
• Performance of online transaction processing (OLTP) applications with a high proportion of read
operations can improve significantly.
• SSD devices are often less expensive per megabyte than physical memory, making this a cost-
effective way to improve performance in I/O-bound databases.
• The Buffer Pool Extension can be enabled easily, and requires no changes to existing applications.
Note: The Buffer Pool Extension is only available in 64-bit installations of SQL Server 2014
Enterprise, Business Intelligence, and Standard Editions.
Scenarios where the Buffer Pool Extension is unlikely to significantly improve performance include data
warehouse workloads, applications that perform a high volume of write operations, and servers that
already have large amounts of physical memory available.
You can view the status of the buffer pool extension by querying the
sys.dm_os_buffer_pool_extension_configuration dynamic management view. You can monitor its
usage by querying the sys.dm_os_buffer_descriptors dynamic management view.
To disable the Buffer Pool Extension, use the ALTER SERVER CONFIGURATION statement with the SET
BUFFER POOL EXTENSION OFF clause.
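A minimal sketch, using the same file location as the demonstration below (the size shown is
illustrative):

ALTER SERVER CONFIGURATION
SET BUFFER POOL EXTENSION ON
    (FILENAME = 'S:\MyCache.bpe', SIZE = 16 GB);

ALTER SERVER CONFIGURATION
SET BUFFER POOL EXTENSION OFF;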
To resize or relocate the Buffer Pool Extension file, you must disable the Buffer Pool Extension, and then
re-enable it with the required configuration. When you disable the Buffer Pool Extension, SQL Server will
have less buffer memory available, which may cause an immediate increase in memory pressure and I/O,
resulting in performance degradation. You should therefore carefully plan reconfiguration of the Buffer
Pool Extension to minimize disruption to application users.
Demonstration Steps
Enable the Buffer Pool Extension
1. Ensure that you have completed the previous demonstration. If not, start the 20462C-MIA-DC and
20462C-MIA-SQL virtual machines, log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student
with the password Pa$$w0rd, and run D:\Demofiles\Mod03\Setup.cmd as Administrator.
2. If SQL Server Management Studio is not open, start it and connect to the MIA-SQL database engine
using Windows authentication.
3. Open the script file ConfiguringBPE.sql in the D:\Demofiles\Mod03 folder.
4. Review the code under the comment Enable buffer pool extension, and note that it creates a Buffer
Pool Extension file named MyCache.bpe in S:\. On a production system, this file location would
typically be on an SSD device.
5. Use File Explorer to view the contents of the S:\ folder and note that no MyCache.bpe file exists.
6. In SQL Server Management Studio, select the code under the comment Enable buffer pool
extension, and click Execute.
1. View the contents of the S:\ folder and note that the MyCache.bpe file now exists.
2. In SQL Server Management Studio, select the code under the comment View buffer pool extension
details, and click Execute. Then note the information about the Buffer Pool Extension that is returned
from the dynamic management view.
3. Select the code under the comment Monitor buffer pool extension, and click Execute. This
dynamic management view shows all buffered pages, and the is_in_bpool_extension column
indicates pages that are stored in the Buffer Pool Extension.
1. In SQL Server Management Studio, select the code under the comment Disable buffer pool
extension, and click Execute.
2. Select the code under the comment View buffer pool extension details, and click Execute. Then
note the information about the Buffer Pool Extension that is returned from the dynamic management
view.
3. Use File Explorer to view the contents of the S:\ folder and note that the MyCache.bpe file has been
deleted.
Objectives
After completing this lab, you will be able to:
• Configure tempdb.
• Create databases.
• Attach a database.
Estimated Time: 45 minutes
2. Alter tempdb so that the database files match the following specification:
o tempdev
Size: 10MB
File growth: 5MB
Maximum size: Unlimited
File Name: T:\tempdb.mdf
o templog
Size: 5MB
File growth: 1MB
Maximum size: Unlimited
File Name: T:\templog.ldf
3. Restart the SQL Server service and verify that the changes have taken effect.
Results: After this exercise, you should have inspected and configured the tempdb database.
• The Human Resources application is a simple solution for managing employee data. It is not
expected to be used heavily or to grow substantially.
• The Internet Sales application is a new e-commerce website, and must support a heavy workload that
will capture a large volume of sales order data.
For each database, specify the files using the following properties: Logical Name, Filegroup, Initial Size,
Growth, and Path.
2. Execute the code under the comment View page usage and note the UsedPages and TotalPages
values for the SalesData filegroup.
3. Execute the code under the comments Create a table on the SalesData filegroup and Insert
10,000 rows.
4. Execute the code under the comment View page usage again and verify that the data in the table is
spread across the files in the filegroup.
Results: After this exercise, you should have created a new HumanResources database and an
InternetSales database that includes multiple filegroups.
2. Configure Filegroups
o AWDataWarehouse.mdf
o AWDataWarehouse_archive.ndf
o AWDataWarehouse_current.ndf
3. View the properties of the dbo.FactInternetSales table and verify that it is stored in the Current
filegroup.
4. View the properties of the dbo.FactInternetSalesArchive table and verify that it is stored in the
Archive filegroup.
5. Edit the dbo.FactInternetSales table and modify a record to verify that the table is updateable.
6. Edit the dbo.FactInternetSalesArchive table and attempt to modify a record to verify that the table
is read-only.
Results: After this exercise, you should have attached the AWDataWarehouse database to MIA-SQL.
Best Practice: When working with database storage, consider the following best practices:
• Create the database with an appropriate initial size so that it does not have to be expanded too often.
Review Question(s)
Question: Why is it typically sufficient to have one log file in a database?
Question: Why should only temporary data be stored in the tempdb system database?
Module 4
Planning and Implementing a Backup Strategy
Contents:
Module Overview 4-1
Module Overview
One of the most important aspects of a database administrator's role is ensuring that organizational data
is reliably backed up so that it is possible to recover the data if a failure occurs. Even though the
computing industry has known about the need for reliable backup strategies for decades, and has discussed
it at great length, tragic stories regarding data loss are still commonplace. A further problem is
that, even when the strategies in place work as they were designed, the outcomes still regularly fail to
meet an organization’s operational requirements.
In this module, you will consider how to create a strategy that is aligned with organizational needs, and
learn how to perform the backup operations required by that strategy.
Objectives
After completing this module, you will be able to:
• Describe how database transaction logs function, and how they affect database recovery.
Lesson 1
Understanding SQL Server Recovery Models
Before you can plan a backup strategy, you must understand how SQL Server uses the transaction log to
maintain data consistency, and how the recovery model of the database affects transaction log operations
and the available backup options.
Lesson Objectives
After completing this lesson, you will be able to:
Write-Ahead Logging
When SQL Server needs to modify the data in a
database page, it first checks if the page is present in the buffer cache. If the page is not present, it is read
into the buffer cache. SQL Server then modifies the page in memory, writing redo and undo information
to the transaction log. While this write is occurring, the “dirty” page in memory is locked until the write to
the transaction log is complete. At regular intervals, a background checkpoint process flushes the dirty
pages to the database, writing all the modified data to disk.
This process is known as write-ahead logging (WAL) because all log records are written to the log before
the affected dirty pages are written to the data files and the transaction is committed. The WAL protocol
ensures that the database can always be set to a consistent state after a failure. This recovery process will
be discussed in detail later in this course, but its effect is that transactions that were committed before the
failure occurred are guaranteed to be applied to the database. Those transactions that were “in flight” at
the time of the failure, where work is partially complete, are undone.
Writing all changes to the log file in advance also makes it possible to roll back transactions if required.
Transaction Rollback
SQL Server can use the information in the transaction log to roll back transactions that have only been
partially completed. This ensures that transactions are not left in a partially-completed state. A transaction
rollback may occur because of a request from a user or client application (such as the execution of a
ROLLBACK statement), or because an error prevents the transaction from completing.
When a transaction log file grows, the space added is divided into new virtual log files (VLFs). The number of VLFs created depends on the amount of space added:

Amount of space added     Number of new VLFs
Less than 64 MB           4
Between 64 MB and 1 GB    8
Greater than 1 GB         16
When a log file write reaches the end of the existing log file, SQL Server starts writing again at the
beginning, overwriting the log records currently stored. This mechanism works well, providing that the
previous log records in that section of the log file have already been written to the database and freed up,
or “truncated”. If they have not been truncated and the data is required, SQL Server tries to grow the size
of the log file. If this is not possible (for example, if it is not configured for automatic growth or the disk is
full) SQL Server fails the transaction and returns an error. If it is possible to grow the log file, SQL Server
allocates new virtual log files, using the auto growth size increment in the log file configuration.
Note: Instant File Initialization (IFI) cannot be used with transaction log files. This means
that transactions can be blocked while log file growth occurs.
Note: Replication is beyond the scope of this course but it is important to be aware that
the configuration and state of replicated data can affect transaction log truncation.
When choosing a recovery model for your database, you will need to consider the size of the database,
the potential maintenance overhead, and the level of acceptable risk with regards to potential data loss.
If you decide to use the simple recovery model, you should ensure that you perform regular database
backups to reduce any potential loss. However, the intervals should be long enough to keep the backup
overhead from affecting production work. You can include differential backups in your strategy to help
reduce the overhead.
The full recovery model requires log backups. It fully logs all transactions and retains the transaction log
records until after they are backed up. The full recovery model enables a database to be recovered to the
point of failure, assuming that the tail of the log can be backed up afterwards. The full recovery model
also supports an option to restore individual data pages or restore to a specific point in time.
The bulk-logged recovery model still requires transaction log backups because, like the full recovery
model, the bulk-logged recovery model retains transaction log records until after they are backed up. The
trade-offs are bigger log backups and increased work-loss exposure because the bulk-logged recovery
model does not support point-in-time recovery.
Note: One potentially surprising outcome is that the log backups can often be larger than
the transaction logs. This is because SQL Server retrieves the modified extents from the data files
while performing a log backup for minimally-logged data.
Another common misconception is that the log file of a database in simple recovery model will not grow.
This is also not the case. In simple recovery model, the transaction log needs to be large enough to hold
all details from the oldest active transaction. Large or long-running transactions can cause the log file to
need additional space.
Note: Database mirroring, transactional replication, and change data capture are beyond
the scope of this course.
You can use the log_reuse_wait_desc column in the sys.databases table to identify the reason why you
cannot truncate a log.
The numeric log_reuse_wait values, with their corresponding log_reuse_wait_desc descriptions, include:
0 = Nothing
1 = Checkpoint
2 = Log backup
4 = Active transaction
5 = Database mirroring
6 = Replication
9 = Other (transient)
After resolving the reason that is shown, perform a log backup (if you are using the full recovery model) to truncate the log file; you can then use DBCC SHRINKFILE to reduce the physical size of the log file.
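For example, a minimal sketch of this process, assuming a hypothetical database named SalesDB whose log file has the logical name SalesDB_log:

-- Identify why the log cannot be truncated
SELECT name, log_reuse_wait_desc
FROM sys.databases
WHERE name = 'SalesDB';

-- After resolving the wait, back up the log (full recovery model)...
BACKUP LOG SalesDB TO DISK = 'R:\Backups\SalesDB_log.trn';

-- ...and then shrink the log file to a target size (in MB)
USE SalesDB;
DBCC SHRINKFILE (SalesDB_log, 100);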
Note: If the log file does not reduce in size when using DBCC SHRINKFILE as part of the
above steps, the active part of the log file must have been at the end at that point in time.
• Manual checkpoints are issued when you execute a Transact-SQL CHECKPOINT command. The
manual checkpoint occurs in the current database for your connection. By default, manual
checkpoints run to completion. The optional checkpoint duration parameter specifies a requested
amount of time, in seconds, for the checkpoint to complete.
• Internal checkpoints are issued by various server operations, such as backup and database snapshot
creation, to guarantee that disk images match the current state of the log.
You can configure the target duration of a checkpoint operation by executing the CHECKPOINT
statement.
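For example, the following statements issue a manual checkpoint in the current database, first with the default behavior and then with a requested duration of 10 seconds:

CHECKPOINT;
CHECKPOINT 10;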
Demonstration Steps
Observe Log File Behavior in the Full Recovery Model
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to
20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
3. Start SQL Server Management Studio and connect to the MIA-SQL database engine using Windows
authentication.
4. In Object Explorer, expand Databases, right-click the LogTest database, and click Properties.
5. In the Database Properties - LogTest dialog box, on the Options page, verify that the Recovery
model is set to Full. Then click Cancel.
8. Select the code comment View database file space and click Execute. Note the space used in the
LogTest_log file (and note that log files have a type value of 1).
9. Select the code comment Insert data and click Execute to insert 5000 rows.
10. Select the code comment View log file space and click Execute. Note the space used in the
LogTest_log file has increased.
11. Select the code comment Issue a checkpoint and click Execute to force SQL Server to perform a
checkpoint and flush the modified pages to disk.
12. Select the code comment View log file space again and click Execute. Note the space used in the
LogTest_log file has not decreased.
13. Select the code comment Check log status and click Execute. Note that SQL Server is awaiting a log
backup before the log file can be truncated.
14. Select the code comment Perform a log backup and click Execute.
15. Select the code comment Verify log file truncation and click Execute. Note the space used in the
LogTest_log file has decreased because the log has been truncated.
16. Keep SQL Server Management Studio open for the next demonstration.
Lesson 2
Planning a Backup Strategy
Now that you have an understanding of SQL Server transaction logs and database recovery models, it is time to consider the types of backups that are available with SQL Server.
To effectively plan a backup strategy, you need to align your chosen combination of backup types to your business recovery requirements. Most organizations will need to use a combination of backup types rather than relying on just one.
Lesson Objectives
After completing this lesson, you will be able to:
Backup Types
SQL Server supports several backup types, which
you can combine to implement the right backup
and recovery strategy for a particular database
based on business requirements and recovery
objectives.
Full Backups
A full backup of a database includes the data files
and the active part of the transaction log. The first
step in the backup is performing a CHECKPOINT
operation. The active part of the transaction log
includes all details from the oldest active
transaction forward. A full backup represents the
database at the time that the data reading phase of the backup was completed and serves as your
baseline in the event of a system failure. Full backups do not truncate the transaction log.
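A minimal full backup sketch (the backup file path is illustrative):

BACKUP DATABASE AdventureWorks
TO DISK = 'R:\Backups\AW_Full.bak'
WITH INIT, NAME = 'AdventureWorks-Full Database Backup';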
Differential Backups
A differential backup saves the data that has been changed since the last full backup. Differential backups
are based on the data file contents rather than log file contents and contain extents that have been
modified since the last full database backup. Differential backups are generally faster to restore than
transaction log backups, but they offer fewer options. For example, point-in-time recovery is not
available unless differential backups are also combined with log file backups.
Transaction Log Backups
A transaction log backup copies the log records generated since the previous log backup. After the log is
backed up, the log records that have been backed up and are not in the currently active portion of the
transaction log are truncated. Transaction log backups are not available in the simple recovery model.
Partial Backups
If the database includes some read-only filegroups, you can simplify the backup process by using a partial
backup. A partial backup is similar to a full backup, but it contains only the data in the primary filegroup,
every read/write filegroup, and any specified read-only files. A partial backup of a read-only database
contains only the primary filegroup.
Tail-log Backups
A transaction log backup taken just before a restore operation is called a tail-log backup. Typically, tail-
log backups are taken after a disk failure that affects data files only. From SQL Server 2005 onwards, SQL
Server has required that you take a tail-log backup before it will allow you to restore a database, to
protect against inadvertent data loss.
Additionally, tail-log backups are often possible even when the data files from the database are no longer
accessible.
For example, consider the backup requirements of a major online bank. If the bank was unable to access
any of the data in their systems, how long could it continue to function?
Now imagine that the bank was making full copies of all its data, but a full restore of that data would take
two weeks to complete. This time includes finding the correct backup media, identifying a person with the
authority to perform the restore, locating documentation related to the restore, and actually restoring the
backups. What impact would a two-week outage have on the bank? The more important question is how
long would an interruption to data access need to be, before the bank ceased to be viable?
The key message with RTO is that a plan involving quick recovery with a small data loss might be more
palatable to an organization than one that eliminates data loss but takes much longer to implement.
For example, while a small business might conclude that restoring a backup from the previous night, with
the associated loss of up to a day's work is an acceptable risk trade-off, a large business might see the
situation very differently. It is common for large corporations to plan for zero committed data loss. This
means that work that was committed to the database must be recovered but that it might be acceptable
to lose work that was in process at the time a failure occurred.
Business requirements will determine all aspects of the backup strategy, including how frequently backups
need to occur, how much data is to be backed up each time, the type of media that the backups will be
held on, and the retention and archival plans for the media.
storage space. Therefore, for a large database, you might want to supplement full database backups with other forms of backup.
Following each backup, under the simple recovery model, the database is exposed to potential work loss if
a disaster was to occur. The work-loss exposure increases with each update until the next full backup,
when it returns to zero and a new cycle of work-loss exposure begins.
Scenarios that might be appropriate for using a full database backup strategy include:
• Test systems.
• Data warehouses where the data can be recovered from a source system and where the data in the
data warehouse does not change regularly.
When it is necessary to recover a database, the latest full database backup needs to be restored, along
with the most recent differential backup (if one has been performed). After the database has been
restored, transaction logs that have been backed up since that time are also then restored, in order.
Because the restore works on a transactional basis, it is possible to restore a database to a specific point in
time from the transactions stored in the log backup.
In addition to providing capabilities that let you restore the transactions that have been backed up, a
transaction log backup truncates the transaction log. This enables VLFs in the transaction log to be reused.
If you do not back up the log frequently enough, the log files can fill up.
Because log backups typically take longer to restore than other types, it is often advisable to combine
transaction log backups with periodic differential backups. During recovery, only the transaction log
backups that were taken after the last differential backup need restoring.
Differential Backups
From the time that a full backup occurs, SQL Server
maintains a map of extents that have been
modified. In a differential backup, SQL Server backs
up only those extents that have changed. It is important to realize though that, after the differential
backup is performed, SQL Server does not clear that map of modified extents. The map is only cleared
when full backups occur. This means that a second differential backup performed on a database will
include all changes since the last full backup, not just those changes since the last differential backup.
Differential database backups are especially useful when a subset of a database is modified more
frequently than the remainder of the database. In these situations, differential database backups enable
you to back up frequently without the overhead of full database backups.
Combinations of Backups
Differential backups must be combined with other forms of backup. Because a differential backup saves
all data changed since the last full backup was made, it cannot be taken unless a full backup has already
been performed.
Another important aspect to consider is that, when a recovery is needed, multiple backups need to be
restored to bring the system back online—rather than a single backup. This increases the risk exposure for
an organization and must be considered when planning a backup strategy.
Differential backups can also be used in combination with both full and transaction log backups.
Managing file and filegroup backups can be complex, and the loss of a single data file backup can cause
serious problems, including making a database unrecoverable.
One way to simplify the process of backing up parts of a database is to use a partial backup, which backs
up only the primary filegroup and the read/write filegroups. However, this is only recommended when the
database contains enough data in read-only filegroups to make a substantial time and administrative
saving. It is also recommended that you use partial backups in conjunction with the simple recovery
model.
Additional Reading: For more information about partial backups, see the topic Partial
Backups (SQL Server) in SQL Server Books Online.
One of the key benefits of a partial backup strategy is that in the event of a failure, you can perform a
piecemeal restore that makes data in the read/write filegroups available before the read-only filegroups
have been restored. This enables you to reduce the time to recovery for workloads that do not require the
data in the read-only filegroups.
Note: Piecemeal restores are discussed in the next module of this course.
Database Characteristics
As a DBA for Adventure Works Cycles, you must
manage the following databases:
• AWDataWarehouse. This database is 150 GB in size, including 100 GB of archive data stored on a
read-only filegroup, 50 GB of data stored on a writable secondary filegroup, and negligible system
tables on the primary filegroup. Around 5 GB of new data is loaded into tables in the writable
secondary filegroup each week in a single batch operation that starts at 5:00 on Saturday and takes
two hours to complete. Each month after the last weekly load, 20 GB of old data in the writable
secondary filegroup is moved to the read-only filegroup in an operation that takes around an hour.
The storage solution used for SQL Server backup devices supports a backup throughput of 150 MB per
minute, and a restore throughput of 100 MB per minute.
Business Requirements
The following business requirements have been identified for the databases:
• HumanResources. This database must never be unavailable for longer than an hour. In the event of a
failure, the database must be recovered so that it includes all transactions that were completed up to
the end of the previous working day.
• InternetSales. This database must never be unavailable for more than two hours, and no more than
30 minutes of transactions can be lost.
• AWDataWarehouse. This database must never be unavailable for more than 48 hours. In the event
of failure, the database should be recovered to include the data loaded by the most recent batch load
operation.
Lesson 3
Backing up Databases and Transaction Logs
Now that you have seen how to plan a backup strategy for a SQL Server system, you can learn how to perform
SQL Server backups, including full and differential database backups, transaction log backups, and partial
backups.
Lesson Objectives
After completing this lesson, you will be able to:
• Use SQL Server Management Studio and the BACKUP command to perform backups.
• Back up databases.
The SSMS graphical user interface includes the following pages, on which you can configure backup
options:
• General. Use this page to specify the database to be backed up, the backup type, the backup
destination, and other general settings.
• Media Options. Use this page to control how the backup is written to the backup device(s), for
example overwriting or appending to existing backups.
• Backup Options. Use this page to configure backup expiration, compression, and encryption.
In SQL Server, you can perform backups while other users continue working with the database, although those users might experience a performance impact due to the I/O load placed on the system by the backup operation. SQL Server does place some limitations on the types of commands you can execute
while a backup is being performed. For example, you cannot use the ALTER DATABASE command with the
ADD FILE or REMOVE FILE options or shrink a database during a backup. Additionally, you cannot include
the BACKUP command in either an explicit or an implicit transaction or roll back a backup statement.
You can only back up databases when they are online, but it is still possible to perform a backup of the
transaction log when a database is damaged, if the log file itself is still intact. This is why it is so important
to split data and log files onto separate physical media.
Backup Timing
An important consideration when making a backup is to understand the timing associated with its
contents—the database may be in use while the backup is occurring. For example, if a backup starts at
22:00 and finishes at 01:00, does it contain a copy of the database as it was at 22:00, a copy as it was at
01:00, or a copy from a time between the start and finish?
SQL Server writes all data pages to the backup device in sequence, but uses the transaction log to track
any pages that are modified while the backup is occurring. SQL Server then writes the relevant portion of
the transaction log to the end of the backup. This process makes the backups slightly larger than in earlier
versions, particularly if heavy update activities are happening at the same time as the backup. This altered
process also means that the backup contains a copy of the database as it was at a time just prior to the
completion of the backup—not as it was at the time the backup started.
Note: Direct backup to tape is not supported. If you want to store backups on tape, you should first write the backup to disk, and then copy the disk backup to tape.
If a media set spans several backup devices, the backups will be striped across the devices.
Note: No parity device is used while striping. If two backup devices are used together, each
receives half the backup. Both must also be present when attempting to restore the backup.
Every backup operation to a media set must write to the same number and type of backup devices. Media
sets and the backup devices are created the first time a backup is attempted on them. Media sets and
backup sets can also be named at the time of creation and given a description.
The backups on an individual device within a media set are referred to as a media family. The number of
backup devices used for the media set determines the number of media families in a media set. For
example, if a media set uses two backup devices, it contains two media families.
• FORMAT / NOFORMAT. The FORMAT option is used to write a new media header on the backup
devices used in the backup. This creates a new media set, breaking any existing media sets to which
the backup devices currently belong and deleting any existing backup sets they contain. When you
perform a backup to an existing backup device, SQL Server uses a default value of NOFORMAT to
safeguard against accidental backup deletion. In SQL Server Management Studio, the FORMAT option
is specified by selecting Back up to a new media set, and erase all existing backup sets in the
Backup Database dialog box.
• INIT / NOINIT. The INIT option retains the existing media header, but overwrites all existing backup
sets in the media set. By default, SQL Server uses the NOINIT option to avoid accidental backup
deletion. In SQL Server Management Studio, you can select Backup to the existing media set, and
then select Append to the existing backup set to use the NOINIT option, or Overwrite all existing
backups to use the INIT option.
As an example, consider the following code, which backs up a database to a media set that consists of two
files. Assuming the files do not already exist, SQL Server creates them and uses them to define a new
media set. The data from the backup is striped across the two files:
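A sketch of such a command, assuming illustrative device names AW_1.bak and AW_2.bak:

BACKUP DATABASE AdventureWorks
TO DISK = 'R:\Backups\AW_1.bak', DISK = 'R:\Backups\AW_2.bak'
WITH MEDIANAME = 'AWMediaSet';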
Another backup could be made at a later time, to the same media set. The data from the second backup
is again striped across the two files and the header of the media set is updated to indicate that it now
contains the two backups:
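Continuing the sketch, the second backup appends to the same two hypothetical devices:

BACKUP DATABASE AdventureWorks
TO DISK = 'R:\Backups\AW_1.bak', DISK = 'R:\Backups\AW_2.bak'
WITH NOINIT;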
Later, another backup is performed using the following command. This overwrites the two previous
backups in the media set, so that it now contains only the new backup:
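In the sketch, the INIT option retains the media header but overwrites the existing backup sets:

BACKUP DATABASE AdventureWorks
TO DISK = 'R:\Backups\AW_1.bak', DISK = 'R:\Backups\AW_2.bak'
WITH INIT;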
If a user then tries to create another backup to only one of the backup files in the media set using the
following code, SQL Server will return an error, because all backup sets to a media set must use the same
backup devices:
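Continuing the sketch, a backup to only one of the two hypothetical devices fails:

BACKUP DATABASE AdventureWorks
TO DISK = 'R:\Backups\AW_1.bak'
WITH NOINIT;
-- Fails: every backup to this media set must use both devices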
Before the member of the media set can be overwritten, the FORMAT option needs to be added to the WITH clause of the backup command. This creates a new media set that contains a single file. The original media set, together with all of the backup sets it contains, is no longer valid.
Use the FORMAT option to overwrite the contents of a backup file and split up the media set, but use it very carefully. Formatting one backup file of a media set renders the entire media set unusable.
each backup. In this scenario, differential backups are a sensible consideration. You can perform a
differential backup by using SSMS or by using the DIFFERENTIAL option of the BACKUP statement.
The following code makes a differential database backup of the AdventureWorks database and stores it
in a file named 'R:\Backups\AW.bak'. The NOINIT option appends the backup to any existing backups in
the media set.
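A sketch matching that description:

A Differential Backup
BACKUP DATABASE AdventureWorks
TO DISK = 'R:\Backups\AW.bak'
WITH DIFFERENTIAL, NOINIT;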
Note: You cannot create a differential database backup unless a full database backup has
been taken first.
A transaction log backup finds the MaxLSN of the last successful transaction log backup, and saves all log
entries beyond that point to the current MaxLSN. The process then truncates the transaction log as far as
is possible (unless the COPY_ONLY or NO_TRUNCATE option is specified). The longest-running active
transaction must be retained, in case the database needs to be recovered after a failure.
For example, imagine a scenario where you create a database, and later take a full backup. At this point,
the database can be recovered. If the recovery model of the database is then changed to simple and
subsequently switched back to full, a break in the log file chain has occurred. Even though a previous full
database backup exists, the database can only be recovered up to the point of the last transaction log
backup, prior to the change to simple recovery model.
After switching from simple to full recovery model, you must perform a full database backup to create a
starting point for transaction log backups.
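For example, a minimal sketch of re-establishing the log chain for a hypothetical SalesDB database:

ALTER DATABASE SalesDB SET RECOVERY FULL;
BACKUP DATABASE SalesDB
TO DISK = 'R:\Backups\SalesDB.bak' WITH INIT;
-- Log backups are now possible again
BACKUP LOG SalesDB
TO DISK = 'R:\Backups\SalesDB_log.trn';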
Note: Not all restore scenarios require a tail-log backup. You do not need to have a tail-log backup if the recovery point is contained in an earlier log backup, or if you are moving or replacing (overwriting) the database and do not need to restore it to a point in time after the most recent backup.
When performing a tail-log backup of a database that is currently online, you can use the NORECOVERY option to immediately place the database into a restoring state, preventing any more transactions from occurring until the database is restored.
If the database is damaged, you can use the NO_TRUNCATE option, which causes the database engine to
attempt the backup, regardless of the state of the database. This means that a backup taken while using
the NO_TRUNCATE option might have incomplete metadata.
If you are unable to back up the tail of the log using the NO_TRUNCATE option when the database is
damaged, you can attempt a tail-log backup by specifying the CONTINUE_AFTER_ERROR option.
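A sketch of a tail-log backup of a damaged database (the file name is illustrative):

BACKUP LOG AdventureWorks
TO DISK = 'R:\Backups\AW_Tail.trn'
WITH NO_TRUNCATE;
-- If this fails, retry adding the CONTINUE_AFTER_ERROR option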
There are two techniques used to implement this kind of backup solution:
• Partial backup. A partial backup backs up only the primary filegroup and filegroups that are set to
read-write. You can also include specific read-only filegroups if required. The purpose of a partial
backup is to enable you to easily back up the parts of a database that change, without having to plan
the backup of specific files or filegroups. You can perform a full or differential partial backup.
• File and Filegroup backups. A filegroup backup enables you to back up only selected files and
filegroups in a database. This can be useful with very large databases that would take a long time to
back up in full, because it enables you to back them up in phases. It’s also useful for databases that
contain some read-only data, or data that changes at different rates, because it enables you to back
up only the read-write data or to back up more frequently updated data more often.
The following code example performs a partial backup that includes the primary filegroup and all
read/write filegroups:
A Partial Backup
BACKUP DATABASE LargeDB
READ_WRITE_FILEGROUPS
TO DISK = 'R:\Backups\LrgRW.bak'
WITH INIT;
The following code example backs up specific filegroups. You can also use the FILE parameter to back up
individual files:
A Filegroup Backup
BACKUP DATABASE LargeDB
FILEGROUP = 'LrgFG2'
TO DISK = 'R:\Backups\LrgFG2.bak'
WITH INIT;
Demonstration Steps
Perform a Full Database Backup
1. Ensure that you have performed the previous demonstration in this module. If not, start the 20462C-
MIA-DC and 20462C-MIA-SQL virtual machines, log on to 20462C-MIA-SQL as
ADVENTUREWORKS\Student with the password Pa$$w0rd, and in the D:\Demofiles\Mod04 folder,
run Setup.cmd as Administrator.
2. If SQL Server Management Studio is not already open, start it and connect to the MIA-SQL database
engine using Windows authentication.
3. In Object Explorer, under Databases, right-click AdventureWorks, point to Tasks, and click Back
Up.
4. In the Back Up Database – AdventureWorks dialog box, ensure that Backup type is set to Full,
and in the Destination section, select each existing file path and click Remove. Then click Add and in
the Select Backup Destination dialog box, enter the file name D:\Demofiles\Mod04\AW.bak and
click OK.
5. In the Back Up Database – AdventureWorks dialog box, on the Media Options page, note that
the default option is to append to an existing media set. In this case, there is no existing media set so
a new one will be created, and there are no existing backup sets to overwrite.
6. In the Back Up Database – AdventureWorks dialog box, on the Backup Options page, note the
default backup name and expiration settings.
7. In the Back Up Database – AdventureWorks dialog box, in the Script drop-down list, click Script
Action to a New Query Window. Then click OK.
9. In the query pane, view the Transact-SQL BACKUP statement that was used to back up the database.
10. View the D:\Demofiles\Mod04 folder and note the size of the AW.bak file.
1. In SQL Server Management Studio, open the UpdatePrices.sql script file from the
D:\Demofiles\Mod04 folder and click Execute. This script updates the Production.Product table in
the AdventureWorks database.
2. In Object Explorer, under Databases, right-click AdventureWorks, point to Tasks, and click Back
Up.
3. In the Back Up Database – AdventureWorks dialog box, in the Backup type list, select
Differential. Then in the Destination section, ensure that D:\Demofiles\Mod04\AW.bak is the only
backup device listed.
4. In the Back Up Database – AdventureWorks dialog box, on the Media Options page, verify that
the option to append to the existing media set is selected.
5. In the Back Up Database – AdventureWorks dialog box, on the Backup Options page, change
the Name to AdventureWorks-Diff Database Backup.
6. In the Back Up Database – AdventureWorks dialog box, in the Script drop-down list, click Script
Action to a New Query Window. Then click OK.
8. In the query pane, view the Transact-SQL BACKUP statement that was used to back up the database.
Note that it includes the WITH DIFFERENTIAL option.
9. View the D:\Demofiles\Mod04 folder and note that the size of the AW.bak file has increased, but not
much—the second backup only includes the extents containing pages that were modified since the
full backup.
1. In SQL Server Management Studio, switch to the UpdatePrices.sql script you opened previously and
click Execute to update the Production.Product table in the AdventureWorks database again.
2. In Object Explorer, under Databases, right-click AdventureWorks, point to Tasks, and click Back
Up.
3. In the Back Up Database – AdventureWorks dialog box, in the Backup type list, select
Transaction Log. Then in the Destination section, ensure that D:\Demofiles\Mod04\AW.bak is the
only backup device listed.
4. In the Back Up Database – AdventureWorks dialog box, on the Media Options page, verify that
the option to append to the existing media set is selected. Also verify that the option to truncate the
transaction log is selected.
5. In the Back Up Database – AdventureWorks dialog box, on the Backup Options page, change
the Name to AdventureWorks-Transaction Log Backup.
6. In the Back Up Database – AdventureWorks dialog box, in the Script drop-down list, click Script
Action to a New Query Window. Then click OK.
9. View the D:\Demofiles\Mod04 folder and note that the size of the AW.bak file has increased, but not
much—the third backup only includes the transaction log entries for data modifications since the full
backup.
10. Keep SQL Server Management Studio open for the next demonstration.
Lesson 4
Using Backup Options
SQL Server Backup provides a range of options that can help optimize your backup strategy, including the
ability to perform a copy-only backup, compress backups, and encrypt backups.
Lesson Objectives
After completing this lesson, you will be able to:
Copy-Only Backups
A copy-only SQL Server backup is independent of
the sequence of conventional SQL Server backups.
Usually, taking a backup changes the database and
affects how later backups are restored. However,
there may be a need to take a backup for a special
purpose without affecting the overall backup and
restore procedures for the database.
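A minimal copy-only backup sketch (the file path is illustrative); the COPY_ONLY option leaves the differential base and the log chain untouched:

BACKUP DATABASE AdventureWorks
TO DISK = 'R:\Backups\AW_Copy.bak'
WITH COPY_ONLY;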
Compressing Backups
Backup files can quickly become very large, so SQL
Server enables you to compress them. You can set
the default backup compression behavior and also
override this setting for individual backups. The
following restrictions apply to compressed backups:
You can use the property pages for the server to view and configure the default backup compression
setting.
To compress a backup, you can use the WITH COMPRESSION option of the BACKUP statement.
Compressing a Backup
BACKUP DATABASE AdventureWorks
TO DISK = 'R:\Backups\AW_Comp.bak'
WITH COMPRESSION;
If your default setting is to compress backups and you want to override this, use the NO_COMPRESSION
option.
The compression level that can be achieved depends entirely upon how compressible the data in the
database is. Some data compresses well, other data does not. A reduction in I/O and backup size of 30 to
50 percent is not uncommon in typical business systems.
Note: Any form of compression tends to increase central processing unit (CPU) usage. The additional
CPU resources that are consumed by the compression process may adversely impact concurrent
operations on systems that are CPU bound. Most current SQL Server systems are I/O bound, rather than
CPU bound, so the benefit of reducing I/O usually outweighs the increase in CPU requirements by a
significant factor.
Demonstration Steps
Use Backup Compression
1. Ensure that you have performed the previous demonstration in this module.
3. In the Back Up Database – AdventureWorks dialog box, ensure that Backup type is set to Full,
and in the Destination section, select the existing file path and click Remove. Then click Add and in
the Select Backup Destination dialog box, enter the file name
D:\Demofiles\Mod04\AW_Comp.bak and click OK.
4. In the Back Up Database – AdventureWorks dialog box, on the Media Options page, note that
the default option is to append to an existing media set. In this case, there is no existing media set so
a new one will be created, and there are no existing backup sets to overwrite.
5. In the Back Up Database – AdventureWorks dialog box, on the Backup Options page, change
the Name to AdventureWorks-Compressed Backup and in the Set backup compression list,
select Compress backup.
6. In the Back Up Database – AdventureWorks dialog box, in the Script drop-down list, click Script
Action to a New Query Window. Then click OK.
7. When the backup has completed successfully, click OK.
8. In the query pane, view the Transact-SQL BACKUP statement that was used to back up the database,
noting that the COMPRESSION option was specified.
9. View the D:\Demofiles\Mod04 folder and note the size of the AW_Comp.bak file. This should be
significantly smaller than the AW.bak file was after the full database backup in the previous
demonstration.
10. Keep SQL Server Management Studio open for the next demonstration.
Encrypting Backups
Backups are a fundamental requirement for
protecting an organization’s data against hardware
failure or natural disaster. However, the data in the
backup may be sensitive, and you must ensure that
the backup media is secured against unauthorized
access to the data it contains. In most organizations,
you can accomplish this goal by storing backup
media in secured file system locations. However, it
is common for organizations to use an off-site
storage solution for backups to protect against loss
of data in the event of a disaster that affects the
entire site (for example, a flood or fire). In this kind
of scenario, or when the data in the backup requires additional security for compliance reasons, you can
encrypt backups so that they can only be restored on a SQL Server instance that contains the correct
encryption key. Backup encryption in SQL Server is based on standard encryption algorithms, including
AES 128, AES 192, AES 256, and Triple DES. To encrypt a backup, you must specify the algorithm you want
to use and a certificate or asymmetric key that can be used to encrypt the data.
To encrypt a backup, use the following process:
1. Create a database master key in the master database. This is a symmetric key that is used to protect all other encryption keys and certificates in the database.
2. Create a certificate or asymmetric key with which to encrypt the backup. You can create a certificate or asymmetric key in a SQL Server database engine instance by using the CREATE CERTIFICATE or CREATE ASYMMETRIC KEY statement. Note that asymmetric keys must reside in an extended key management (EKM) provider.
3. Perform the backup using the ENCRYPTION option (or selecting Encryption in the Backup Database
dialog box), and specifying the algorithm and certificate or asymmetric key to be used. When using
the Backup Database dialog box, you must select the option to back up to a new media set.
You should back up the database master key and encryption keys to a secure location (separate from the
backup media location) to enable you to restore the database to a different SQL Server instance in the
event of a total server failure.
The following example code backs up the AdventureWorks database using the AES 128 encryption
algorithm and a certificate named BackupCert:
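A sketch of such a statement (the backup file path is illustrative; FORMAT is included here to start a new media set, matching the dialog box requirement noted above):

Encrypting a Backup
BACKUP DATABASE AdventureWorks
TO DISK = 'R:\Backups\AW_Encrypt.bak'
WITH FORMAT,
ENCRYPTION (ALGORITHM = AES_128, SERVER CERTIFICATE = BackupCert);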
Note: You can create an encrypted backup in SQL Server 2014 Enterprise, Business
Intelligence, and Standard Edition. You can restore an encrypted backup to any edition of SQL
Server 2014.
• Create a certificate.
Demonstration Steps
Create a Database Master Key
1. Ensure that you have performed the previous demonstration in this module.
2. In SQL Server Management Studio, open the EncyptionKeys.sql script file in the
D:\Demofiles\Mod04 folder.
3. Select the code under the comment Create a database master key and click Execute.
4. Select the code under the comment Back up the database master key and click Execute.
Create a Certificate
1. Select the code under the comment Create a certificate and click Execute.
2. Select the code under the comment Back up the certificate and its private key and click Execute.
1. In Object Explorer, under Databases, right-click AdventureWorks, point to Tasks, and click Back
Up.
2. In the Back Up Database – AdventureWorks dialog box, ensure that Backup type is set to Full,
and in the Destination section, select the existing file path and click Remove. Then click Add and in
the Select Backup Destination dialog box, enter the file name
D:\Demofiles\Mod04\AW_Encrypt.bak and click OK.
3. In the Back Up Database – AdventureWorks dialog box, on the Media Options page, select
Back up to a new media set, and erase all existing backup sets. Then enter the new media set
name Encrypted Backup.
4. In the Back Up Database – AdventureWorks dialog box, on the Backup Options page, change
the Name to AdventureWorks-Encrypted Backup and in the Set backup compression list, select
Compress backup.
5. In the Encryption section, select Encrypt backup. Then ensure that the AES 128 algorithm is
selected, and select the BackupCert certificate you created previously.
6. In the Back Up Database – AdventureWorks dialog box, in the Script drop-down list, click Script
Action to a New Query Window. Then click OK.
8. In the query pane, view the Transact-SQL BACKUP statement that was used to back up the database,
noting that the ENCRYPTION option was specified.
9. Keep SQL Server Management Studio open for the next demonstration.
Lesson 5
Ensuring Backup Reliability
No matter how many backups you perform, it is essential that you ensure they are readable and
restorable, otherwise the entire backup system is flawed. It is also important to be able to query
information about your backups so you can access the correct data when required.
In this lesson, you will learn how to verify a backup and ensure its integrity and how to retrieve backup
history and header information.
Lesson Objectives
After completing this lesson, you will be able to:
The worst option is generally regarded as creating a backup over your most recent backup. If the system
fails during the backup, you will often then have lost both your data and your backup.
• Insufficient Data on the Backups. Company A performed regular backups, yet no testing of
recovery was ever made. The first time a real recovery was attempted, it was discovered that not all
files that needed to be backed up were in fact being backed up.
• Unreadable Backups. Company B performed regular backups but did not test them. When recovery
was attempted, none of the backups were readable. This is often caused by hardware failures, but it can also result from inappropriate storage of media.
Avoidance strategy: Regular backup recovery testing and use of redundant backup hardware.
• Unavailable Hardware. Company C purchased a special tape drive to perform backups. When the
time came to restore the backups, that special device no longer worked and the organization had no
other way to read the backups, even if they were valid.
• Old Hardware. Company D performed regular backups and retained them for an appropriate period.
When the time came to restore the backups, the company no longer possessed equipment that was
capable of performing that operation.
Avoidance strategy: Regular backup recovery testing, combined with recovery and backup onto current
devices.
• Misaligned Hardware. Company E performed regular backups and even tested that they could
undertake restore operations from the backups. However, because they tested the restores on the
same device that performed the backups, they did not realize that the device was misaligned and it
was the only one that could read those backups. When a restore was needed, the device that the
backups were performed on had failed.
Avoidance strategy: Regular backup recovery testing on a separate system and physical device.
General Considerations
There are several general points to consider regarding the retention period of backups.
• When a backup strategy calls for you to perform multiple types of backups, it is important to work
out the combination of backups you will require.
• Organizations might need to fulfill legal or compliance requirements regarding the retention of
backups. In most cases, full database backups are kept for a longer time than other types.
• Checking the consistency of databases by using the DBCC CHECKDB statement is a crucial part of
database maintenance, and is discussed later in the course.
• As well as deciding how long backups need to be retained, you will need to determine where they are
kept. Part of the RTO needs to consider how long it takes to obtain the physical backup media if it
needs to be restored.
• You should also make sure that backups are complete. Are all files that are needed to recover the
system (including external operating system files) being backed up?
On the principle that it is better to have multiple copies of a backup than a single copy, mirroring a media set can increase the availability of your data. However, it is important to realize that mirroring a media set exposes your system to a higher level of hardware failure risk, because a malfunction of any of the backup devices causes the entire backup operation to fail.
You can create a mirrored backup set by using the MIRROR TO option of the BACKUP statement.
Note: The mirrored media set functionality is only available in SQL Server Enterprise
Edition.
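A sketch using the MIRROR TO option (device paths illustrative; FORMAT is required when creating a new mirrored media set):

BACKUP DATABASE AdventureWorks
TO DISK = 'R:\Backups\AW.bak'
MIRROR TO DISK = 'S:\BackupMirror\AW.bak'
WITH FORMAT;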
You enable the checksum option by using the WITH CHECKSUM clause of the BACKUP statement.
Using a Checksum
BACKUP DATABASE AdventureWorks
TO DISK = 'R:\Backups\AW.bak'
WITH CHECKSUM;
Backup Verification
To verify a backup, you can use the RESTORE VERIFYONLY statement which checks the backup for validity
but does not restore it. The statement performs the following checks:
• Page identifiers are correct (to the same level as if it were about to write the data).
The checksum value can only be validated if the backup was performed with the WITH CHECKSUM
option. Without the CHECKSUM option during backup, the verification options only check the metadata
and not the actual backup data.
The RESTORE VERIFYONLY statement is similar to the RESTORE statement and supports a subset of its
arguments.
Verifying a Backup
RESTORE VERIFYONLY
FROM DISK = 'R:\Backups\AW.bak';
You can also perform verification steps by using the backup database task in SSMS.
Best Practice: Consider verifying backups on a different system from the one where the backup was performed. This eliminates the situation where a backup is only readable on the source hardware.
Backup history is recorded in the following tables in the msdb database:
• backupfile
• backupfilegroup
• backupmediafamily
• backupmediaset
• backupset
You can query these tables to retrieve information
about backups that have been performed. SSMS
also provides reports and logs of backup
information.
Note: If a database is restored onto another server, the backup information is not restored
with the database, as it is held in the msdb database of the original system.
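For example, a minimal sketch that lists recent backups and the devices they were written to:

SELECT bs.database_name, bs.backup_start_date, bs.type,
       bmf.physical_device_name
FROM msdb.dbo.backupset AS bs
JOIN msdb.dbo.backupmediafamily AS bmf
    ON bs.media_set_id = bmf.media_set_id
ORDER BY bs.backup_start_date DESC;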
Demonstration Steps
View the Backup and Restore Events Report
1. Ensure that you have performed the previous demonstration in this module.
3. In the Backup and Restore Events [AdventureWorks] report, expand Successful Backup
Operations and view the backup operations that have been performed for this database.
4. In the Device Type column, expand each of the Disk (temporary) entries to view details of the
backup media set files.
2. Select the code under the comment View backup history, and click Execute.
3. View the query results, which show the backups that have been performed for the AdventureWorks
database.
1. Select the code under the comment Use RESTORE HEADERONLY, and click Execute.
2. View the query results, which show the backups in the AW.bak backup device.
3. Select the code under the comment Use RESTORE FILELISTONLY, and click Execute.
4. View the query results, which show the database files contained in the backups.
5. Select the code under the comment Use RESTORE VERIFYONLY, and click Execute.
6. View the message that is returned, which should indicate that the backup is valid.
Objectives
After completing this lab, you will be able to:
• Implement a backup strategy based on full, differential, and transaction log backups.
2. Use SQL Server Management Studio to check the current recovery model of the database, and
change it if necessary.
o Create a new media set with the name “Human Resources Backup”.
2. Verify that the backup file has been created, and note its size.
UPDATE HumanResources.dbo.Employee
SET PhoneNumber='151-555-1234'
o Back up the database to the existing media set, and append the backup to the existing backup
sets.
2. Verify that the report shows the two backups you have created (HumanResources-Full Database Backup and HumanResources-Full Database Backup 2).
Results: At the end of this exercise, you will have backed up the HumanResources database to
R:\Backups\HumanResources.bak.
2. Use SQL Server Management Studio to check the current recovery model of the database, and
change it if necessary.
o Create a new media set with the name “Internet Sales Backup”.
2. Verify that the backup file has been created, and note its size.
UPDATE InternetSales.dbo.Product
o Back up the log to the existing media set, and append the backup to the existing backup sets.
UPDATE InternetSales.dbo.Product
WHERE ProductSubcategoryID = 2;
This code is in the Update InternetSales.sql file in the D:\Labfiles\Lab04\Starter folder.
o Back up the log to the existing media set, and append the backup to the existing backup sets.
UPDATE InternetSales.dbo.Product
WHERE ProductSubcategoryID = 3;
o Back up the log to the existing media set, and append the backup to the existing backup sets.
o Name the backup set “InternetSales-Transaction Log Backup 2”.
RESTORE HEADERONLY
FROM DISK = 'R:\Backups\InternetSales.bak';
GO
2. Use the following query to identify the database files that are included in the backups:
RESTORE FILELISTONLY
FROM DISK = 'R:\Backups\InternetSales.bak';
GO
3. Use the following query to verify that the backups are valid:
RESTORE VERIFYONLY
FROM DISK = 'R:\Backups\InternetSales.bak';
GO
Results: At the end of this exercise, you will have backed up the InternetSales database to
R:\Backups\InternetSales.bak.
2. Use SQL Server Management Studio to check the current recovery model of the database, and
change it if necessary.
TO DISK = 'R:\Backups\AWDataWarehouse-Read-Only.bak'
2. Verify that the backup file AWDataWarehouse-Read-Only.bak has been created in the R:\Backups
folder.
READ_WRITE_FILEGROUPS
TO DISK = 'R:\Backups\AWDataWarehouse-Read-Write.bak'
WITH FORMAT, INIT, NAME = 'AWDataWarehouse-Active Data', COMPRESSION;
2. Verify that the backup file AWDataWarehouse-Read-Write.bak has been created in the R:\Backups
folder.
READ_WRITE_FILEGROUPS
TO DISK = 'R:\Backups\AWDataWarehouse-Read-Write.bak'
GO
2. Use the following query to view the backups on AWDataWarehouse-Read-Write.bak, and scroll to
the right to view the BackupTypeDescription column:
RESTORE HEADERONLY
FROM DISK = 'R:\Backups\AWDataWarehouse-Read-Write.bak';
GO
Results: At the end of this exercise, you will have backed up the read-only filegroup in the AWDataWarehouse database to R:\Backups\AWDataWarehouse-Read-Only.bak, and you will have backed up the writable filegroups in the AWDataWarehouse database to R:\Backups\AWDataWarehouse-Read-Write.bak.
Review Question(s)
Question: What are the unique features of transaction log restores?
Module 5
Restoring SQL Server 2014 Databases
Contents:
Module Overview 5-1
Module Overview
In the previous module, you learned how to create backups of Microsoft® SQL Server® 2014 databases.
A backup strategy might involve many different types of backup so it is essential that you can effectively
restore them.
You will often be restoring a database in an urgent situation. You must, however, ensure that you have a
clear plan of how to proceed and successfully recover the database to the required state. A good plan and
understanding of the restore process can help avoid making the situation worse.
Some database restores are related to system failure. In these cases, you will want to return the system as
close as possible to the state it was in prior to the failure. Some failures, though, are related to human
error and you may wish to recover the system to a point prior to the error. The point-in-time recovery
features of SQL Server 2014 can help you to achieve this.
User databases are more likely to be affected by system failures than system databases because they are
typically much larger. However, system databases can be affected by failures, and special care needs to be
taken when recovering them. In particular, you need to understand how to recover each system database
because you cannot use the same process for all system databases.
In this module, you will see how to restore user and system databases and how to implement point-in-
time recovery.
Objectives
After completing this module, you will be able to:
Lesson 1
Understanding the Restore Process
When you need to recover a database, it is essential to have a good plan to avoid causing further
damage. After you have completed the preliminary step of attempting to create a tail-log backup, it is
most important to determine which database backups to restore—and in which order.
Lesson Objectives
After completing this lesson, you will be able to:
Data Copy
The data copy phase is typically the longest in a
database restore. Firstly, the data files from the
database need to be retrieved from the backups.
Before any data pages are restored, the restore
process reads the header of the backup and SQL
Server recreates the required data and log files. If
instant file initialization (IFI) has not been enabled
by granting rights to the SQL Server service account, the rewriting of the data files can take a substantial
amount of time.
After the data and log files are recreated, the data files are restored from the full database backup. Data
pages are retrieved from the backup in order and written to the data files. The log files need to be zeroed
out before they can be used. This process can also take a substantial time if the log files are large.
If a differential backup is also being restored, SQL Server overwrites the extents in the data files with those
contained in the differential backup.
Redo Phase
At the start of the redo phase, SQL Server retrieves details from the transaction log. In the simple recovery
model, these details are retrieved from either the full database backup or the differential backup. In the
full or bulk-logged recovery model, these log file details are supplemented by the contents of any
transaction log backups that were taken after the full and differential database backups.
In the redo phase, SQL Server rolls all changes that are contained within the transaction log details into
the database pages, up to the recovery point. The recovery point is typically the latest time for which
transactions exist in the log.
Undo Phase
The transaction log will likely include details of transactions that were not committed at the recovery
point, which is typically the time of the failure. In the undo phase, SQL Server rolls back any of these
uncommitted transactions.
Because the action of the undo phase involves rolling back uncommitted transactions and placing the
database online, no more backups can be restored.
During the undo phase, the Enterprise edition of SQL Server will allow the database to come online and
users to begin to access it. This capability is referred to as the fast recovery feature. Queries that attempt
to access data that is still being undone are blocked until the undo phase is complete. This can potentially
cause transactions to time out, but does mean that users can access the database sooner.
In general, you cannot bring a database online until it has been recovered. The one exception is the fast recovery feature, which allows users to access the database while the undo phase is still in progress.
Recovery does not only occur during the execution of RESTORE commands. If a database is taken offline
and then placed back into an ONLINE state, recovery of the database will also occur. The same recovery
process takes place when SQL Server restarts.
Note: Other events that lead to database recovery include clustering or database mirroring
failovers. Failover clustering and database mirroring are advanced topics that are beyond the
scope of this course.
Types of Restores
The restore scenarios available for a database
depend on its recovery model and the edition of
SQL Server you are using.
In most scenarios involving the simple recovery model, no differential backups are performed. In these
cases, you only restore the last full database backup, and then the recovery phase returns the database to
the state it was in at the time just prior to the full database backup being completed.
System databases require special handling; for example, SQL Server cannot start until the master database is recovered. The recovery of system databases will be discussed later in this module.
File or Filegroup Restore
SQL Server includes the functionality to restore filegroups or individual files, so if specific files in a
database are corrupt or lost you can potentially reduce the overall time to recover the database. The
recovery of individual files is only supported for read-only files when operating in simple recovery model,
but you can use it for read-write files when using the bulk-logged or full recovery models. The recovery of
individual files uses a process that is similar to the complete database restore process. It is discussed later
in this module.
Piecemeal Restore
A piecemeal restore is used to restore and recover the database in stages, based on filegroups, rather than
restoring the entire database at a single time. The first filegroup that must be restored is the primary
filegroup, usually along with the read/write filegroups for which you want to prioritize recovery. You can
then restore read-only filegroups. In SQL Server 2014, online piecemeal restore (in which the database remains online during the later stages) is only available in the Enterprise edition; other editions support offline piecemeal restore.
Page Restore
Another advanced option is the ability to restore an individual data page. If an individual data page is
corrupt, users will usually see either an 823 error or an 824 error when they execute a query that tries to
access the page. You can try to recover the page by using a page restore. If a user query tries to access the page after the restore starts, it will fail with error 829, which indicates that the page is being restored. If the page restore is successful, user queries that access the page will again return results as expected. Page restores
are supported under full and bulk-logged recovery models, but not under simple recovery model.
Online Restore
Online restore involves restoring data while the database is online. This is the default option for File, Page,
and Piecemeal restores. In SQL Server 2014, online restore is only available in the Enterprise edition.
To restore a database, use the following general sequence:
1. Restore the latest full database backup as a base to work from. If only individual files are damaged or missing, you may be able to restore just those files.
2. If differential backups exist, you only need to restore the latest differential backup.
3. If transaction log backups exist, you need to restore all transaction log backups since the last differential backup. You also need to include the tail-log backup created at the start of the restore process, if the tail-log backup was successful. (This step does not apply to databases using the simple recovery model.) A sketch of this sequence is shown below.
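The following code is a minimal sketch of such a restore sequence; the backup file names are illustrative.
Restore Sequence
RESTORE DATABASE AdventureWorks
FROM DISK = 'R:\Backups\AW-Full.bak'
WITH NORECOVERY;

RESTORE DATABASE AdventureWorks
FROM DISK = 'R:\Backups\AW-Diff.bak'
WITH NORECOVERY;

RESTORE LOG AdventureWorks
FROM DISK = 'R:\Backups\AW-Log.bak'
WITH NORECOVERY;

RESTORE LOG AdventureWorks
FROM DISK = 'R:\Backups\AW-TailLog.bak'
WITH RECOVERY;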
Lesson 2
Restoring Databases
Most restore operations involve restoring a full database backup, often followed by a differential backup
and a sequence of transaction log backups. In this lesson, you will learn how to restore these types of
backup and recover a database.
Lesson Objectives
After completing this lesson, you will be able to:
• Restore a database from a full database backup.
• Restore differential backups.
• Restore transaction log backups.
Restoring a Database
The simplest recovery scenario is to restore a
database from a single full database backup. If no
subsequent differential or transaction log backups
need to be applied, you can use the RECOVERY
option to specify that SQL Server should complete
the recovery process for the database and bring it
online. If additional backups must be restored, you
can prevent recovery from occurring by specifying
the NORECOVERY option. If you do not specify either of these options, SQL Server uses RECOVERY as the
default behavior.
In the following example, the AdventureWorks database is restored from the AW.bak backup media:
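A minimal form of this statement might look like the following; the backup path is illustrative.
Restoring a Database
RESTORE DATABASE AdventureWorks
FROM DISK = 'R:\Backups\AW.bak'
WITH RECOVERY;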
In the following code example, the existing AdventureWorks database is replaced with the database in
the AW.bak backup media:
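A sketch of such a statement, using the same illustrative path:
Restoring a Database with REPLACE
RESTORE DATABASE AdventureWorks
FROM DISK = 'R:\Backups\AW.bak'
WITH REPLACE;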
Note: The WITH REPLACE option needs to be used with caution as it can lead to data loss.
In this example, the AdventureWorks database is being restored from another server. As well as
specifying the source location for the media set, new locations for each database file are also specified in
the RESTORE statement. Note that the MOVE option requires the specification of the logical file name,
rather than the original physical file path.
WITH MOVE
RESTORE DATABASE AdventureWorks
FROM DISK = 'R:\Backups\AW.bak'
WITH MOVE 'AdventureWorks_Data' TO 'Q:\Data\AdventureWorks.mdf',
MOVE 'AdventureWorks_Log' TO 'U:\Logs\AdventureWorks.ldf';
There is no way to restore additional backups after a restore WITH RECOVERY has been processed. If you accidentally perform a restore using the WITH RECOVERY option, you must restart the entire restore sequence.
In this example, the AdventureWorks database is restored from the first file in the media set containing a
full database backup. This media set is stored in the operating system file R:\Backups\AW.bak. The second
file in the media set is the first differential backup, but the changes in this are also contained in the
second differential backup in the third file. Therefore, the second RESTORE statement only needs to
restore the contents of the third file.
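The two RESTORE statements described here might look like the following sketch, which uses the FILE option to identify backup sets within the media set:
Restoring from a Media Set
RESTORE DATABASE AdventureWorks
FROM DISK = 'R:\Backups\AW.bak'
WITH FILE = 1, NORECOVERY;

RESTORE DATABASE AdventureWorks
FROM DISK = 'R:\Backups\AW.bak'
WITH FILE = 3, RECOVERY;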
If both the full and differential backup sets are on the same backup media, SSMS automatically selects the required backups in the Restore Database dialog box and ensures that the appropriate recovery settings are applied.
Note: In the previous example, the log file was available after the database failed, so a tail-
log backup could be taken. This enables the database to be recovered to the point of failure. Had
the log file not been available, the last planned transaction log backup (backup set 5 in AW.bak)
would have been restored using the RECOVERY option, and all transactions since that backup
would have been lost.
Demonstration Steps
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. In the D:\Demofiles\Mod05 folder, run Setup.cmd as Administrator.
3. Start SQL Server Management Studio and connect to the MIA-SQL database engine using Windows authentication.
4. In Object Explorer, expand Databases, and note that the AdventureWorks database is in a
Recovery Pending state.
5. Click New Query and execute the following Transact-SQL code to attempt to bring the database online:
ALTER DATABASE AdventureWorks SET ONLINE;
6. Note the error message that is displayed. The AdventureWorks.mdf data file has been lost, so the
database cannot be brought online.
7. Delete the ALTER DATABASE statement, and replace it with the following code to perform a tail-log backup (the backup file name matches the one used in the restore steps that follow):
BACKUP LOG AdventureWorks
TO DISK = 'D:\Demofiles\Mod05\AW-TailLog.bak'
WITH NO_TRUNCATE;
8. Click Execute, and view the resulting message to verify that the backup is successful.
Restore a Database
1. In Object Explorer, right-click the AdventureWorks database, point to Tasks, point to Restore, and
click Database.
2. In the Restore Database – AdventureWorks dialog box, in the Source section, select Device and
click the ellipses (...) button.
3. In the Select backup devices dialog box click Add, and then in the Locate backup File – MIA-SQL
dialog box, select D:\Demofiles\Mod05\AW.bak and click OK.
4. In the Select backup devices dialog box, ensure that D:\Demofiles\Mod05\AW.bak is listed, and
then click Add.
5. In the Locate backup File – MIA-SQL dialog box, select D:\Demofiles\Mod05\AW-TailLog.bak and
click OK.
6. In the Select backup devices dialog box, ensure that both D:\Demofiles\Mod05\AW.bak and
D:\Demofiles\Mod05\AW-TailLog.bak are listed and click OK.
7. Note that the backup media contains a full backup, a differential backup, and a transaction log
backup (these are the planned backups in AW.bak); and a copy-only transaction log backup (which is
the tail-log backup in AW-TailLog.bak). All of these are automatically selected in the Restore
column.
8. On the Options page, ensure that the Recovery state is set to RESTORE WITH RECOVERY.
9. In the Script drop-down list, click New Query Editor Window. Then click OK.
10. When the database has been restored successfully, click OK.
11. View the Transact-SQL code that was used to restore the database, noting that the full backup, the
differential backup, and the first transaction log backup were restored using the NORECOVERY
option. The restore operation for the tail-log backup used the default RECOVERY option to recover
the database.
12. In Object Explorer, verify that the AdventureWorks database is now recovered and ready to use.
13. Close SQL Server Management Studio without saving any files.
Lesson 3
Advanced Restore Scenarios
The techniques discussed in the previous lesson cover most common restore scenarios. However, there are
some more complex restore scenarios for which a DBA must be prepared.
This lesson discusses restore scenarios for file and filegroup backups, encrypted backups, individual data
pages, and system databases.
Lesson Objectives
After completing this lesson, you will be able to:
• Restore files and filegroups.
• Restore an encrypted backup.
• Restore individual data pages.
• Recover system databases.
To restore damaged files or filegroups:
1. Create a tail-log backup of the active transaction log.
2. Restore each damaged file from the most recent file backup of that file.
3. Restore the most recent differential file backup, if any, for each restored file.
4. Restore transaction log backups in sequence, starting with the backup that covers the oldest of the
restored files and ending with the tail-log backup created in step 1.
You must restore the transaction log backups that were created after the file backups to bring the
database back to a consistent state. The transaction log backups can be rolled forward quickly, because
only the changes that relate to the restored files or filegroups are applied. Undamaged files are not
copied and then rolled forward. However, you do still need to process the whole chain of log backups.
In a partial backup strategy, each read-only filegroup is backed up once, and only read/write filegroups are included in subsequent backups—significantly reducing the time taken to perform a full or differential backup.
One of the advantages of including read-only filegroups in a partial backup strategy is that it enables you
to perform a piecemeal restore. In a piecemeal restore, you can recover read/write filegroups and make
them available to users for querying before the recovery of read-only filegroups is complete. To perform a
piecemeal restore:
1. Restore the latest partial full database backup, specifying the read/write filegroups to be restored and
using the PARTIAL option to indicate that read-only filegroups will be restored separately.
2. Restore the latest partial differential backup, and log file backups if they exist. Use the RECOVERY
option with the last RESTORE operation to recover the database. Data in read/write filegroups is now
available.
3. Restore each read-only filegroup backup with the RECOVERY option to bring it online, as shown in the sketch below.
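The following sketch shows the shape of a piecemeal restore; the filegroup names and backup file names are illustrative.
Piecemeal Restore
-- Restore the primary filegroup and the priority read/write filegroups
RESTORE DATABASE AdventureWorks
FILEGROUP = 'PRIMARY', FILEGROUP = 'CurrentData'
FROM DISK = 'R:\Backups\AW-ReadWrite.bak'
WITH PARTIAL, NORECOVERY;

-- Restore any log backups; the last restore recovers the read/write data
RESTORE LOG AdventureWorks
FROM DISK = 'R:\Backups\AW-Log.bak'
WITH RECOVERY;

-- Restore each read-only filegroup to bring it online
RESTORE DATABASE AdventureWorks
FILEGROUP = 'ArchiveData'
FROM DISK = 'R:\Backups\AW-ReadOnly.bak'
WITH RECOVERY;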
To restore an encrypted backup on a different server instance:
1. Create a database master key for the master database. This does not need to be the same database master key that was used in the original instance, but if you are recovering from a complete server failure you can restore the original database master key from a backup.
2. Create a certificate or key from a backup. Use the CREATE CERTIFICATE or CREATE ASYMMETRIC
KEY statement to create a certificate or key from the backup you created of the original key used to
encrypt the database. The new certificate or key must have the same name as the original, and if you
used a certificate, you must restore both the public certificate and the private key.
3. Restore the database. Now that the encryption key is available on the SQL Server instance, you can
restore the database as normal.
The following code sample shows how to restore an encrypted database backup on a new SQL Server instance; the certificate name, passwords, and backup file paths are illustrative:
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Pa$$w0rd';
GO
CREATE CERTIFICATE BackupCert
FROM FILE = 'K:\Backups\Backup.cer'
WITH PRIVATE KEY (FILE = 'K:\Backups\Backup.key',
DECRYPTION BY PASSWORD = 'Pa$$w0rd');
GO
RESTORE DATABASE AdventureWorksEncrypt
FROM DISK = 'K:\Backups\AdventureWorksEncrypt.bak'
WITH RECOVERY;
GO
Demonstration Steps
Restore an Encrypted Backup
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to
20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. If you did not complete the previous demonstration, in the D:\Demofiles\Mod05 folder, run
Setup.cmd as Administrator.
3. Start SQL Server Management Studio and connect to the MIA-SQL\SQL2 database engine using
Windows authentication.
4. In Object Explorer, under the MIA-SQL\SQL2 instance, expand Databases and view the existing
databases on this instance.
5. Open the Restore Encrypted Backup.sql script file in the D:\Demofiles\Mod05 folder.
6. Select the code under the comment Try to restore an encrypted backup and click Execute. Note
that this fails because the required certificate is not present.
7. Select the code under the comment Create a database master key for master and click Execute.
This creates a database master key for the master database on MIA-SQL\SQL2.
8. Select the code under the comment Import the backed up certificate and click Execute. This
creates a certificate from public and private key backups that were taken from the MIA-SQL instance.
9. Select the code under the comment Restore the encrypted database and click Execute. Note that
this time the restore operation succeeds.
10. In Object Explorer, refresh the Databases folder and verify that the AdventureWorksEncrypt
database has been restored.
To restore damaged pages, use the following general procedure (online in the Enterprise edition, offline in other editions):
1. Obtain the IDs of the damaged pages (for example, from the msdb.dbo.suspect_pages table, or from 823 and 824 error messages).
2. Restore the damaged pages from a full database or file backup, specifying the PAGE option and the NORECOVERY option.
3. Restore any subsequent differential backups with the NORECOVERY option.
4. Restore each subsequent transaction log backup with the NORECOVERY option.
5. Back up the transaction log.
6. Restore the new transaction log backup with the RECOVERY option.
The final two steps of backing up and restoring the log are required to ensure that the final log sequence number (LSN) of the restored pages is set as the REDO target of the transaction log. Online page restore, in which the database remains online while pages are restored, is only supported in SQL Server Enterprise Edition.
Restoring a Page
-- Restore pages from the full backup
RESTORE DATABASE AdventureWorks PAGE='1:55, 1:207'
FROM DISK = 'R:\Backups\AdventureWorks.bak'
WITH FILE=1, NORECOVERY;
-- Restore the log to set the correct REDO LSN and recover
RESTORE LOG AdventureWorks
FROM DISK = 'R:\Backups\AW-Log.bak'
WITH RECOVERY;
master
The master database holds all system-level configurations. SQL Server cannot start without the master database, so if it is missing or corrupt, you cannot execute a standard RESTORE DATABASE command to restore it. Before starting to recover the master database, you must have access to a temporary master database so that the SQL Server instance will start. This temporary master database does not need to have the correct configuration, because it is only used to start the instance so that you can initiate the recovery process and restore the correct version of your master database. There are two ways that you can obtain a temporary master database:
• You can use the SQL Server setup program to rebuild the system databases, either from the location
that you installed SQL Server from or by running the setup program found at Microsoft SQL
Server\120\Setup\Bootstrap\SQL2014\setup.exe.
Note: Re-running the setup program will overwrite all your system databases, so you must
ensure that they are regularly backed up and able to be restored after you have restored the
master database.
• You can use a file-level backup of the master database files to restore the master database. You
must take this file-level backup when the master database is not in use—that is, when SQL Server is
not running—or by using the VSS service.
Note: Copying the master database from another instance is not supported. The VSS
service is beyond the scope of this course.
When you have created a temporary version of the master database, you can use the following
procedure to recover the correct master database:
1. Start the server instance in single-user mode by using the -m startup option.
2. Use a RESTORE DATABASE statement to restore a full database backup of the master database. It is
recommended that you execute the RESTORE DATABASE statement by using the sqlcmd utility.
After restoring the master database, the instance of SQL Server will shut down and terminate your sqlcmd
connection.
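For example, the following command sketches such a restore for a default instance named MIA-SQL; the backup path is illustrative:
sqlcmd -S MIA-SQL -E -Q "RESTORE DATABASE master FROM DISK = 'R:\Backups\master.bak' WITH REPLACE;"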
model
The model database is the template for all databases that are created on the instance of SQL Server.
When the model database is corrupt, the instance of SQL Server cannot start. This means that a normal
restore command cannot be used to recover the model database if it becomes corrupted. In the case of a
corrupt model database, you must start the instance with the -T3608 trace flag as a command-line
parameter. This trace flag only starts the master database. When SQL Server is running, you can restore
the model database by using the normal RESTORE DATABASE command.
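For example, the following commands sketch this procedure for a default instance; the backup path is illustrative:
NET START MSSQLSERVER /T3608

RESTORE DATABASE model
FROM DISK = 'R:\Backups\model.bak'
WITH REPLACE;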
msdb
SQL Server Agent uses the msdb database for scheduling alerts and jobs, and for recording details of
operators. The msdb database also contains history tables, such as those that record details of backup
and restore operations. If the msdb database becomes corrupt, SQL Server Agent will not start. You can
restore the msdb database by using the RESTORE DATABASE statement as you would a user database,
and then the SQL Server Agent service can be restarted.
resource
The resource database is read-only and contains copies of all system objects that ship with Microsoft SQL
Server. This is a hidden database and you cannot perform backup operations on it. It can, however, be
corrupted by failures in areas such as I/O subsystems or memory. If the resource database is corrupt, it
can be restored by a file-level restore in Windows or by running the setup program for SQL Server.
tempdb
The tempdb database is a workspace for holding temporary or intermediate result sets. This database is
recreated every time an instance of SQL Server starts, so there is no need to back up or restore it. When
the server instance is shut down, any data in tempdb is permanently deleted.
Lesson 4
Point-in-Time Recovery
In the previous lesson, you learned how to recover a database to the latest point in time possible.
However, there are occasions when you may need to recover the database to an earlier point in time. You
have also learned that you can stop the restore process after any of the backups are restored and initiate
the recovery of the database. While stopping the restore process after restoring an entire backup file
provides a coarse level of control over the recovery point, SQL Server provides additional options that
allow for more fine-grained control.
In this lesson, you will learn about how point-in-time recovery works and how to use the options that it
provides.
Lesson Objectives
After completing this lesson, you will be able to:
• Describe point-in-time recovery.
• Use the STOPAT option.
• Use the STOPATMARK option.
Note: If a user error causes the inadvertent deletion of some data, you may not be aware of
when the error actually occurred. Therefore, you will not know which log file contains the
deletion and the point at which to recover the database. You can use the WITH STANDBY option
on each log file restore and inspect the state of the database after each restore operation, to
determine when the error occurred and when to recover the database.
STOPAT Option
You use the STOPAT option to specify a recovery point that is based on a datetime value. You might not know in advance which transaction log backup file contains transactions from the time at which the recovery needs to occur. Therefore, the syntax of the RESTORE LOG command enables you to specify both the STOPAT and RECOVERY options for each log restore command in the sequence:
• If the specified time is contained within the period covered by the transaction log backup, the restore
command recovers the database at that time.
• If the specified time is later than the last time contained in the transaction log backup, the restore
command restores the logs, sends a warning message, and the database is not recovered, so that
additional transaction log backups can be applied.
This behavior ensures that the database is recovered up to the requested point, even when STOPAT and
RECOVERY are both specified with every restore.
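For example, the following sketch recovers the database to a specific point in time; the backup path and datetime value are illustrative.
Using STOPAT
RESTORE LOG AdventureWorks
FROM DISK = 'R:\Backups\AW-Log.bak'
WITH STOPAT = '2015-03-01 14:30:00', RECOVERY;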
STOPATMARK Option
If you require more precise control over the
recovery point, you can use the STOPATMARK
option of the RESTORE Transact-SQL statement.
Marking a Transaction
BEGIN TRAN UpdPrc WITH MARK 'Start of nightly update process';
If you do not know the name of a transaction that was marked, you can query the dbo.logmarkhistory
table in the msdb database.
The STOPATMARK option is similar to the STOPAT option for the RESTORE command. SQL Server will stop
at the named transaction mark and include the named transaction in the redo phase. If you wish to
exclude the transaction (that is, restore everything up to the beginning of the named transaction), you can
use the STOPBEFOREMARK option instead. If the transaction mark is not found in the transaction log
backup that is being restored, the restore completes and the database is not recovered, so that other
transaction log backups can be restored.
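For example, the following sketch stops at the UpdPrc mark shown earlier; the backup path is illustrative.
Using STOPATMARK
RESTORE LOG AdventureWorks
FROM DISK = 'R:\Backups\AW-Log.bak'
WITH STOPATMARK = 'UpdPrc', RECOVERY;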
The main use for the STOPATMARK feature is to restore an entire set of databases to a mutually
consistent state, at some earlier point in time. If you need to perform a backup of multiple databases, so
that they can all be recovered to a consistent point, consider marking all the transaction logs before
commencing the backups.
Note: You cannot use the stop at mark functionality in SSMS; it is only available by using
the Transact-SQL statement.
Demonstration Steps
Perform a Point-In-Time Recovery
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to
20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. Start SQL Server Management Studio and connect to the MIA-SQL database engine using Windows
authentication.
3. In SQL Server Management Studio, open the Point-in-Time Restore.sql script file in the
D:\Demofiles\Mod05 folder.
4. Select and execute the code under the comment Create a database and back it up. This creates a
database with a single table, and performs a full backup.
5. Select and execute the code under the comment enter some data. This inserts a record into the
Customers table.
6. Select and execute the code under the comment get the current time. This displays the current date
and time. Make a note of the current time.
7. Wait until a minute has passed, and then select and execute the code under the comment get the
current time again to verify that it is now at least a minute since you noted the time.
8. Select and execute the code under the comment enter some more data. This inserts a second record
into the Customers table.
9. Select and execute the code under the comment backup the transaction log. This performs a
transaction log backup of the database.
11. In Object Explorer, expand Databases and verify that BackupDemo is listed (if not, right-click the Databases folder and click Refresh). Then right-click the BackupDemo database, point to Tasks, point to Restore, and click Database.
12. In the Restore Database – BackupDemo dialog box, click Timeline.
13. In the Backup Timeline: BackupDemo dialog box, select Specific date and time and set the Time value to the time you noted earlier (after the first row of data was inserted). Then click OK.
14. In the Restore Database – BackupDemo dialog box, click OK. When notified that the database has
been restored successfully, click OK.
15. In Object Explorer, expand the BackupDemo database and its Tables folder. Then right-click
dbo.Customers and click Select Top 1000 Rows. When the results are displayed, verify that the
database was restored to the point in time after the first row of data was inserted, but before the
second row was inserted.
16. Close SQL Server Management Studio without saving any files.
Objectives
After completing this lab, you will be able to:
• Restore a database from a full database backup.
• Restore full, differential, and tail-log backups.
• Perform a piecemeal restore.
2. Use the following Transact-SQL query to try to bring the database online:
ALTER DATABASE HumanResources SET ONLINE;
3. Review the error message, and then check the contents of the M:\Data folder to determine if the
HumanResources.mdf file is present. If not, the database cannot be brought online because the
primary data file is lost.
SQL Server should have retained the backup history for this database, and the Restore Database tool in
SQL Server Management Studio should automatically select the second backup set in the
R:\Backups\HumanResources.bak backup media set.
Results: After this exercise, you should have restored the HumanResources database.
2. Use the following Transact-SQL query to try to bring the database online:
ALTER DATABASE InternetSales SET ONLINE;
3. Review the error message, and then check the contents of the M:\Data folder to verify that the
InternetSales.mdf file is present. This file has become corrupt, and has rendered the database
unusable.
2. Use the following Transact-SQL code to back up the tail of the transaction log (the backup file name shown is illustrative):
USE master;
BACKUP LOG InternetSales
TO DISK = 'R:\Backups\InternetSales-TailLog.bak'
WITH NO_TRUNCATE;
In this case, the backup history for the database has been lost, so you must specify the backup media sets
for the existing planned backups as well as the tail-log backup you just took.
The planned backups should be restored using the NORECOVERY option, and then the tail-log backup
should be restored using the RECOVERY option.
Results: After this exercise, you should have restored the InternetSales database.
Use the following Transact-SQL code to begin the piecemeal restore; the backup folder shown is assumed to match the one used earlier in this lab:
USE master;
RESTORE DATABASE AWDataWarehouse
FILEGROUP = 'PRIMARY', FILEGROUP = 'Current'
FROM DISK = 'R:\Backups\AWDataWarehouse_Read-Write.bak'
WITH PARTIAL, NORECOVERY;
This code restores the primary filegroup and the Current filegroup from a full database backup on the
AWDataWarehouse_Read-Write.bak media set. The PARTIAL option indicates that only the primary and
named read-write filegroups should be restored, and the NORECOVERY option leaves the database in a
restoring state, ready for subsequent restore operations of the read-write filegroup data.
3. Refresh the Databases folder in Object Explorer to verify that the database is in a restoring state.
2. Verify that the database is now shown as online in Object Explorer, and that you can query the
dbo.FactInternetSales table
3. Verify that you cannot query the dbo.FactInternetSalesArchive table, because it is stored in a
filegroup that has not yet been brought online.
Use the following Transact-SQL code to restore the read-only filegroup and bring it online; the filegroup and backup file names shown are illustrative:
RESTORE DATABASE AWDataWarehouse
FILEGROUP = 'Archive'
FROM DISK = 'R:\Backups\AWDataWarehouse_Read-Only.bak'
WITH RECOVERY;
Results: After this exercise, you will have restored the AWDataWarehouse database.
When planning a database recovery solution, consider the following best practices.
• Don't forget to back up the tail of the log before starting a restore sequence.
• If available, use differential restore to reduce the time taken by the restore process.
• Use file level restore to speed up restores when not all database files are corrupt.
• Perform regular database backups of master, msdb and model system databases.
• Create a disaster recovery plan for your SQL Server and test restoring databases regularly.
Review Question(s)
Question: What are the three phases of the restore process?
Module 6
Importing and Exporting Data
Module Overview
While a great deal of data residing in a Microsoft® SQL Server® system is entered directly by users who are running application programs, there is often a need to move data between SQL Server and other locations.
SQL Server provides a set of tools that you can use to transfer data in and out. Some of these tools, such
as the bcp utility and SQL Server Integration Services (SSIS), are external to the database engine. Other
tools, such as the BULK INSERT statement and the OPENROWSET function, are implemented in the
database engine. SQL Server also enables you to create data-tier applications (DACs), which package all the tables, views, and instance objects associated with a user database into a single unit of deployment.
In this module, you will explore these tools and techniques so that you can import and export data to and
from SQL Server.
Objectives
After completing this module, you will be able to:
• Describe tools and techniques for transferring data.
• Import and export data.
• Copy or move a database.
Lesson 1
Introduction to Transferring Data
The first step in learning to transfer data in and out of SQL Server is to become familiar with the processes
involved, and with the tools that SQL Server provides to implement data transfer.
When large amounts of data need to be inserted into SQL Server tables, the default settings for
constraints, triggers, and indexes are not likely to provide the best performance possible. You may achieve
improved performance by controlling when the checks that are made by constraints are carried out and
when the index pages for a table are updated.
Lesson Objectives
After completing this lesson, you will be able to:
• Describe the tools that SQL Server provides for data transfer.
A data transfer operation is often described as an extract, transform, and load (ETL) process, which involves:
• Extracting the data from its source.
• Transforming the data in some way to make it suitable for the target system.
• Loading the data into the target system.
Note: In some situations, an Extract, Load, Transform (ELT) process might be more
appropriate than an ETL process. For example, you may decide to perform data transformations
after the data has been loaded into the database engine rather than before.
Extracting Data
While there are other options, extracting data typically involves executing queries on a source system to
retrieve the data, or opening and reading source files.
When extracting data, take care:
• To avoid excessive impact on the source system. For example, do not read entire tables of data when
you only need to read selected rows or columns. Also, do not continually re-read the same data, and
avoid the execution of statements that block users of the source system in any way.
• To ensure the consistency of the data extraction. For example, do not include one row from the
source system more than once in the output of the extraction.
Transforming Data
The transformation phase of an ETL process will generally involve several steps, such as the following:
• Data might need to be cleansed. For example, you might need to remove erroneous data or provide
default values for missing columns.
• Lookups might need to be performed. For example, the input data might include the name of a
customer, but the database might need an ID for the customer.
• Data might need to be aggregated. For example, the input data might include every transaction that
occurred on a given day, but the database might need only daily summary values.
• Data might need to be de-aggregated. This is often referred to as data allocation. For example, the
input data might include quarterly budgets, but the database might need daily budgets.
In addition to these common operations, data might need to be restructured in some way, for example by
pivoting the data so that columns become rows, concatenating multiple source columns into a single
column, or splitting a single source column into multiple columns.
Loading Data
After data is transformed into an appropriate format, you can load it into the target system. Instead of
performing row-by-row insert operations for the data, you can use special options for loading data in
bulk. Additionally, you can make temporary configuration changes to improve the performance of the
load operation.
BULK INSERT
You can use the BULK INSERT Transact-SQL statement to import data directly from an operating system
data file into a database table. The BULK INSERT statement differs from bcp in a number of ways. First,
you execute the BULK INSERT statement from within Transact-SQL, whereas the bcp utility is a command-line utility. Also, while the bcp utility can be used for both import and export, the BULK INSERT statement can only be used for data import.
OPENROWSET (BULK)
OPENROWSET is a table-valued function that you can use to connect to and retrieve data from OLE-DB
data sources. Full details of how to connect to the data source need to be provided as parameters to the
OPENROWSET function. You can use OPENROWSET to connect to other types of database engine.
SQL Server offers a special OLE-DB provider called BULK that you can use with the OPENROWSET
function.
The BULK provider enables the import of entire documents from the file system.
Disabling constraint checking during a large import, and validating the data afterwards, can significantly improve performance.
For example, consider a FOREIGN KEY constraint that ensures that the relevant customer does exist
whenever a customer order is inserted into the database. While you could check this reference for each
customer order, it is possible that a customer may have thousands of orders, resulting in thousands of
checks. Instead of checking each value as it is inserted, you can check the customer reference as a single
lookup after the overall import process completes—to cover all customer orders referring to that
customer.
Only CHECK and FOREIGN KEY constraints can be disabled. The process for disabling and re-enabling
constraints will be discussed later in this lesson.
Similar to the way that avoiding lookups for FOREIGN KEY constraints during data import can improve performance, avoiding constant updating of indexes can have the same effect. In many cases, rebuilding the indexes after the import process is complete is much faster than updating the indexes as the rows are imported. The exception to this situation is when there is a much larger number of rows already in the table than are being imported.
Triggers are database objects that execute code when data is modified. It is important to decide whether the processing that the triggers perform would be better carried out in bulk after the import, rather than as each insert occurs.
When a database uses the simple or bulk-logged recovery model, bulk import operations can be minimally logged. Whether data and index pages are minimally logged depends on the indexes defined on the target table, and on whether the table is empty:
• If the table has no clustered index but has one or more nonclustered indexes, data pages are always minimally logged. How index pages are logged, however, depends on whether the table is empty: if the table is empty, index pages are minimally logged; if the table is non-empty, index pages are fully logged.
• If the table has a clustered index and is empty, both data and index pages are minimally logged.
• If a table has a clustered index and is non-empty, data pages and index pages are both fully logged, regardless of the recovery model.
Disabling Indexes
In SQL Server 2005 and later, you can disable an index. Rather than totally dropping the index details from the database, this option leaves the metadata about the index in place and just stops the index from being updated. Queries that are executed by users will not use disabled indexes. You can disable an index by using the graphical interface in SQL Server Management Studio (SSMS) or by using the ALTER INDEX Transact-SQL statement.
The following code example disables an index named idx_emailaddress on the dbo.Customer table:
Disabling an Index
ALTER INDEX idx_emailaddress ON dbo.Customer
DISABLE;
You can also disable all of the indexes on a table, as shown in the following code example:
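Disabling All Indexes on a Table
ALTER INDEX ALL ON dbo.Customer
DISABLE;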
Note: A clustered index defines how a table is structured. If a clustered index is disabled,
the table becomes unusable until the index is rebuilt.
The major advantage of disabling an index instead of dropping it is that you can put the index back into operation by using a rebuild operation. When you rebuild an index, you do not need to know details of
how it is configured. This makes it much easier to create administrative scripts that stop indexes being
updated while large import or update operations are taking place, and that put the indexes back into
operation after those operations have completed.
Rebuilding Indexes
After data has been imported, you can rebuild the indexes on a table by using the graphical tools in SSMS
or by using the ALTER INDEX Transact-SQL statement or the DBCC DBREINDEX command.
The following code example shows how to rebuild the idx_emailaddress index on the dbo.Customer table:
Rebuilding an Index
ALTER INDEX idx_emailaddress ON dbo.Customer
REBUILD;
You can also use the ALL keyword with the ALTER INDEX statement to rebuild all indexes on a specified
table, similarly to disabling an index.
If a large volume of data has been loaded, it may be more efficient to recreate the index, dropping
existing indexes in the process. To recreate an index, replacing the existing one, you can use the CREATE
INDEX statement with the DROP_EXISTING option as shown in the following example.
Recreating an Index
CREATE INDEX idx_emailaddress ON dbo.Customer(EmailAddress)
WITH (DROP_EXISTING = ON);
Note: If a table has a primary key enforced with a clustered index, disabling the index
associated with the constraint prevents access to any data in the table.
Disabling Constraints
You can use check constraints to limit the values that can be contained in a column or the relationship
between the values in multiple columns in a table.
You can disable and enable both foreign key and check constraints by using the CHECK and NOCHECK
options of the ALTER TABLE statement.
You can also disable or enable all constraints by replacing the constraint name with the ALL keyword.
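The following sketch shows the pattern; the table and constraint names are illustrative.
Disabling and Re-enabling Constraints
-- Disable a single foreign key constraint before a bulk import
ALTER TABLE dbo.SalesOrder NOCHECK CONSTRAINT FK_SalesOrder_Customer;

-- Re-enable the constraint, validating existing rows as part of the operation
ALTER TABLE dbo.SalesOrder WITH CHECK CHECK CONSTRAINT FK_SalesOrder_Customer;

-- Disable or re-enable all constraints on the table
ALTER TABLE dbo.SalesOrder NOCHECK CONSTRAINT ALL;
ALTER TABLE dbo.SalesOrder WITH CHECK CHECK CONSTRAINT ALL;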
Lesson 2
Importing and Exporting Data
SQL Server provides a range of tools and techniques for importing and exporting data. In this lesson, you
will explore these tools and learn how to use them to import and export data to and from a SQL Server
database.
Lesson Objectives
After completing this lesson, you will be able to:
• Use the SQL Server Import and Export Wizard.
• Use the bcp utility to import and export data.
• Use the BULK INSERT statement to import data.
• Use the OPENROWSET function to import data.
You can use the wizard to perform the data transfer immediately, and you can also save the SSIS package it generates for execution at a later time. Saved packages can be run in several ways:
• DTExec utility
You can use DTExec to run SSIS packages from the command line. You need to specify parameters
including the server to use, the location of the package, environment variables, and input parameters. The
utility reads the command line parameters, loads the package, configures the package options based on
the parameters passed, and then runs the package. It returns an exit code signifying the success or failure
of the package.
• DtExecUI utility
The Execute Package Utility (DtExecUI) can run SSIS packages from SQL Server Management Studio
(SSMS) or from the command prompt and is a GUI for the DTExec command prompt utility. The GUI
simplifies the process of passing parameters to the utility and receiving exit codes.
You can also run SSIS packages from SQL Server Agent jobs. This enables you to automate and schedule
the execution, either independently or as part of a larger job. You can configure the parameters for the
package by using the New Job Step dialog box.
Note: SQL Server Agent jobs are described in detail later in this course.
The Import and Export Wizard is based on SQL Server Integration Services (SSIS), which provides a
comprehensive platform for building ETL solutions. The Import and Export Wizard itself provides minimal
transformation capabilities. Except for setting the name, the data type, and the data type properties of
columns in new destination tables and files, the SQL Server Import and Export Wizard supports no
column-level transformations. If you need to develop a more complex ETL solution, you should use the
SQL Server Data Tools for BI (SSDT-BI) add-in for Visual Studio to create an SSIS project that consists of one or more SSIS packages.
Note: To learn more about using SSDT-BI to develop SSIS projects, attend course 20463C:
Implementing a Data Warehouse with Microsoft SQL Server.
Demonstration Steps
Use Import and Export Wizard to Export Data
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to
MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. In the D:\Demofiles\Mod06 folder, run Setup.cmd as Administrator.
3. Start SQL Server Management Studio and connect to the MIA-SQL database engine using Windows
authentication.
4. In Object Explorer, expand Databases. Then right-click the AdventureWorks database, point to
Tasks, and click Export Data.
5. On the Welcome to SQL Server Import and Export Wizard page, click Next.
6. On the Choose a Data Source page, in the Data source drop-down list, select SQL Server Native
Client 11.0. Then ensure that the MIA-SQL server is selected, that Use Windows Authentication is
selected, and that the AdventureWorks database is selected; and click Next.
7. On the Choose a Destination page, in the Destination drop-down list, select Flat File Destination. Then in the File name box type D:\Demofiles\Mod06\Currency.csv, clear the Column names in the first data row checkbox, and click Next.
8. On the Specify Table Copy or Query page, ensure that Copy data from one or more tables or
views is selected, and click Next.
9. On the Configure Flat File Destination page, in the Source table or view list, select [Sales].[Currency]. Then ensure that the Row delimiter is {CR}{LF} and the Column delimiter is Comma {,}, and click Next.
10. On the Save and Run Package page, ensure that Run immediately is selected, and click Next.
11. On the Complete the Wizard page, click Finish. Then, when the execution is successful, click Close.
12. Start Excel and open the Currency.csv file in the D:\Demofiles\Mod06 folder and view the data that
has been exported. Then close Excel without saving the file.
bcp Syntax
The syntax for the bcp utility is highly versatile, and includes a large number of options. The general form of a bcp command specifies:
• The name of the table or view that is the source or target of the transfer.
• The direction of the transfer: in to import data, or out to export it.
• A local file name for the source (when importing) or destination (when exporting).
• Parameters that control the connection and the format of the data.
You can also use the queryout direction to specify that data is to be extracted from the database based
on a Transact-SQL query. Additionally, the bcp utility supports the following commonly used parameters.
Note that these parameters are case-sensitive:
• -d: The database containing the table or view (you can also specify a fully-qualified table or view
name that includes the database and schema – for example AdventureWorks.Sales.Currency).
• -T: Specifies that a trusted connection should be used to connect using Windows authentication.
• -c: Specifies that the data file stores data in character format.
• -w: Specifies that the data file stores data in Unicode character format.
• -n: Specifies that the data file stores data in SQL Server native format.
• -f format_file: Specifies a format file that defines the schema for the data.
• -t delimiter: Specifies a field terminator for data in character format. The default is a tab.
• -r delimiter: Specifies a row terminator for data in character format. The default is a new line.
Note: For a full list of parameters and syntax options, you can enter the command bcp -? at a command prompt.
The following code example connects to the MIA-SQL SQL Server instance using Windows authentication,
and exports the contents of the Sales.Currency table in the AdventureWorks database to a text file
named Currency.csv in which the data is saved in comma-delimited character format with a new line for
each row.
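A sketch of such a command; the output folder is illustrative.
Exporting Data with bcp
bcp AdventureWorks.Sales.Currency out D:\ExportData\Currency.csv -S MIA-SQL -T -c -t, -r \n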
To preemptively create a format file, use the format nul direction and specify the name of the format file
you want to create. You can then interactively specify the data type, prefix length, and delimiter for each
field in the specified table or view, and save the resulting schema in the format file. The default format file
type is text, but you can use the -x parameter to create an XML format file. If you want to create a format
file for character data with specific field and row terminators, you can specify them with the -c, -t, and -r
parameters.
The following example shows how to use the bcp utility to create an XML-based format file named
CurrencyFmt.xml based on the AdventureWorks.Sales.Currency table.
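A sketch of such a command, reusing the illustrative output folder from the previous example.
Creating a Format File with bcp
bcp AdventureWorks.Sales.Currency format nul -S MIA-SQL -T -c -t, -r \n -x -f D:\ExportData\CurrencyFmt.xml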
To use a format file when importing or exporting data, use the -f parameter.
The following example shows how to import the contents of Currency.csv into the Finance.dbo.Currency
table. The in parameter specifies the file to read and the -f parameter specifies the format file to use.
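A sketch of such a command; the folder is illustrative.
Importing Data with bcp
bcp Finance.dbo.Currency in D:\ExportData\Currency.csv -S MIA-SQL -T -f D:\ExportData\CurrencyFmt.xml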
Demonstration Steps
Use bcp to Create a Format File
1. Ensure that you have completed the previous demonstration in this module.
2. Open a command prompt and type the following command to view the bcp syntax help:
bcp -?
3. In the command prompt window, enter the following command to create a format file:
4. Start Notepad and open TaxRateFmt.xml in the D:\Demofiles\Mod06 folder. Then view the XML
format file and close notepad.
1. In the command prompt window, enter the following command to export data from SQL Server:
A key consideration for using the BULK INSERT statement is that file paths to source data must be accessible from the server where the SQL Server instance is running, and must use the correct drive letters for volumes as they are defined on the server. For example, when running the BULK INSERT statement in SQL Server Management Studio or sqlcmd from a client computer, the path C:\data\file.txt references a file on the C: volume of the server, not the client.
Also, unlike bcp, you can execute the BULK INSERT statement from within a user-defined transaction, which gives you the ability to group BULK INSERT with other operations in a single transaction. Care must be taken, however, to ensure that the data batches you import within a single transaction are not excessive; otherwise, significant log file growth might occur, even when the database is using the simple recovery model.
In the following example, new orders are inserted into the Sales.OrderDetail table from a text file on the
file system:
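A sketch of such a statement; the file path and format options are illustrative.
Using BULK INSERT
BULK INSERT AdventureWorks.Sales.OrderDetail
FROM 'F:\ImportData\NewOrders.txt'
WITH (
    FIELDTERMINATOR = '|',
    ROWTERMINATOR = '\n',
    BATCHSIZE = 10000,
    TABLOCK
);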
Demonstration Steps
Use the BULK INSERT Statement to Import Data
1. Ensure that you have completed the previous demonstration in this module.
2. In SQL Server Management Studio, in Object Explorer, under Databases, expand Finance. Then
expand the Tables folder, right-click dbo.Currency, and click Select Top 1000 Rows.
3. View the query results, and verify that the dbo.Currency table is currently empty.
4. Click New Query, and in the new query pane, enter the following Transact-SQL code:
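BULK INSERT Finance.dbo.Currency
FROM 'D:\Demofiles\Mod06\Currency.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
(The file name and format options shown here are assumed to follow the earlier export demonstration.)
5. Click Execute to run the BULK INSERT statement, and note the number of rows affected.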
6. Switch to the query pane that retrieves the top 1000 rows from the dbo.Currency table and click
Execute to re-run the SELECT query. Note that the table is now populated with the same number of
rows as you noted in the previous step.
Note: Note that the results returned by the OPENROWSET function must have a correlation
name specified in an AS clause.
Similarly to the BULK INSERT statement, file paths used with the OPENROWSET function refer to volumes
that are defined on the server.
Two key advantages of OPENROWSET, compared to bcp, are that it can be used in a query with a WHERE
clause (to filter the rows that are loaded), and that it can be used in a SELECT statement that is not
necessarily associated with an INSERT statement.
The BULK provider supports the following options for importing an entire file as a single value:
• SINGLE_CLOB. This option reads an entire character-based file as a single value of data type varchar(max).
• SINGLE_NCLOB. This option reads an entire double-byte character-based file as a single value of data
type nvarchar(max).
• SINGLE_BLOB. This option reads an entire binary file as a single value of data type varbinary(max).
In the following example, the data in the SignedAccounts.pdf file is inserted into the Document column
of the dbo.AccountsDocuments table:
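A sketch of such a statement; the file's folder is illustrative.
Importing a Document with OPENROWSET
INSERT INTO dbo.AccountsDocuments (Document)
SELECT doc.BulkColumn
FROM OPENROWSET(BULK 'F:\Documents\SignedAccounts.pdf', SINGLE_BLOB) AS doc;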
Note: To use OPENROWSET with OLE-DB providers other than BULK, the ad hoc
distributed queries system configuration option must be enabled and the
DisallowAdhocAccess registry entry for the OLE-DB provider must be explicitly set to 0. This
registry key is typically located at
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSSQLServer\Providers\MSDASQL.
When these options are not set, the default behavior does not allow for the ad hoc access that is
required by the OPENROWSET function when working with external OLE-DB providers.
Demonstration Steps
Use the OPENROWSET Function to Import Data
1. Ensure that you have completed the previous demonstrations in this module.
2. In SQL Server Management Studio, in Object Explorer, right-click the dbo.SalesTaxRate table in the
Finance database, and click Select Top 1000 Rows.
3. View the query results, and verify that the dbo.SalesTaxRate table is currently empty.
4. Click New Query, and in the new query pane, enter the following Transact-SQL code:
Lesson 3
Copying or Moving a Database
The techniques discussed in this module so far enable you to import and export data to and from
individual tables. However, in some scenarios you must copy or move an entire database from one SQL
Server instance to another.
Lesson Objectives
After completing this lesson, you will be able to:
• Copy or move a database by using detach/attach, backup/restore, or the Copy Database Wizard.
• Export and import a data-tier application.
You can copy or move a database to a different SQL Server instance by using several techniques:
• Detach/Attach. You can detach a database, copy or move its files, and attach them on a different SQL Server instance.
• Backup/Restore. You can back up a database and restore it on a different SQL Server instance.
• The Copy Database Wizard. You can use this wizard to copy or move a database between instances.
• Data-Tier Applications. You can export a database as a data-tier application, and import it on a different SQL Server instance.
The detach/attach and backup/restore techniques are easy to implement. However, only the database is
copied, and the DBA needs to take care of all dependent objects such as logins, certificates, and other
server-level objects.
If you select the move option in the wizard, it deletes the source database afterwards. If you select copy,
the source database is left intact. The wizard performs the task by detaching the database, moving the
data and log files on the file system, and then attaching the database back to an instance of SQL Server.
Note: If the source database is in use when the wizard tries to move or copy a database,
the operation is not performed.
Running the Copy Database Wizard requires sysadmin privileges on both instances, and a network connection between the two servers must be present.
Demonstration Steps
Use the Copy Database Wizard
1. Ensure that you have completed the previous demonstrations in this module.
2. In SQL Server Management Studio, in Object Explorer, in the Connect drop-down list click Database
Engine. Then connect to the MIA-SQL\SQL2 instance using Windows authentication.
3. In Object Explorer, under the MIA-SQL\SQL2 instance, expand Databases and verify that the
AdventureWorks database is not listed.
4. In Object Explorer, under the MIA-SQL instance, right-click the AdventureWorks database, point to Tasks, and click Copy Database.
5. On the Welcome to the Copy Database Wizard page, click Next.
6. On the Select a Source Server page, ensure that MIA-SQL is selected with the Use Windows Authentication option, and click Next.
7. On the Select a Destination Server page, change the Destination server to MIA-SQL\SQL2 and
select the Use Windows Authentication option. Then click Next.
8. On the Select the Transfer Method page, ensure that Use the detach and attach method is
selected, and click Next.
9. On the Select Databases page, in the Copy column, ensure that the AdventureWorks database is
selected. Then click Next.
10. On the Configure Destination Database (1 of 1) page, note the default settings for the database
name and file locations. Then click Next.
11. On the Select Server Objects page, verify that Logins is listed in the Selected related objects list.
Then click Next.
12. On the Configure the Package page, note the default package name and logging options. Then
click Next.
13. On the Scheduling the Package page, ensure that Run immediately is selected, and click Next.
14. On the Complete the Wizard page, click Finish.
15. When the package has completed successfully, click Close.
16. In Object Explorer, under the MIA-SQL\SQL2 instance, right-click Databases and click Refresh. Then
verify that the AdventureWorks database has been copied to this instance.
Data-Tier Applications
Data-Tier applications (known as DACs) provide a
way to simplify the development, deployment, and
management of database applications and their
SQL Server instance-level dependencies. They
provide a useful way to package application
databases for deployment. Installation and upgrade of DACs is automated. DACs are not designed for large line-of-business applications; rather, the intention is to make it easy to install and upgrade large numbers of simpler applications.
Developers can define a DAC by using a database project in SQL Server Data Tools (SSDT). When the DAC project is built, the output is a .dacpac file containing the database schema and its dependent objects, which can be delivered to the database administrator for deployment. A single .dacpac file can be used to both install and upgrade an application, and is portable across different environments such as development, test, staging, and production.
Alternatively, DBAs can export the database to a .bacpac package, which is a DAC package that also
includes the data already in the database. The export operation is performed in two phases. First, the
package file is created in the same way as a .dacpac file, and then the data is bulk exported from the
database into the package file.
If you have previously exported a database, including its data as a .bacpac package, you can import it into
a SQL Server instance. In this case, the DAC is deployed to the server along with its data.
Demonstration Steps
Export a Data-Tier Application
1. Ensure that you have completed the previous demonstrations in this module.
2. In SQL Server Management Studio, in Object Explorer, under the MIA-SQL instance, right-click the Finance database, point to Tasks, and click Export Data-tier Application. (Be careful to click Export Data-tier Application, and not Extract Data-tier Application.)
3. In the Export Data-tier Application wizard, on the Introduction page, click Next.
4. On the Export Settings page, ensure that Save to local disk is selected, enter the path D:\Demofiles\Mod06\Finance.bacpac, and click Next.
5. On the Summary page, click Finish.
6. Wait for the export operation to complete, and then click Close.
Import a Data-Tier Application
1. In SQL Server Management Studio, in Object Explorer, under the MIA-SQL\SQL2 instance, right-click the Databases folder and click Import Data-tier Application.
2. In the Import Data-tier Application wizard, on the Introduction page, click Next.
3. On the Import Settings page, ensure that Import from local disk is selected, enter the path D:\Demofiles\Mod06\Finance.bacpac, and then click Next.
4. On the Database Settings page, review the default settings for the database name and file paths,
and then click Next.
5. On the Summary page, click Finish.
6. When the import operation has completed successfully, click Close.
7. In Object Explorer, under the MIA-SQL\SQL2 instance, if necessary, refresh the Databases folder and verify that the Finance database has been imported.
Objectives
After completing this lab, you will be able to:
• Use the SQL Server Import and Export Wizard to export data.
• Use the bcp utility to export and import data.
• Use the BULK INSERT statement to import data.
• Use the OPENROWSET function to import data.
Virtual machines: 20462C-MIA-DC, 20462C-MIA-SQL
User name: ADVENTUREWORKS\Student
Password: Pa$$w0rd
You have created a Transact-SQL query, and the production manager has confirmed that it returns the
required data. Now the production manager has asked you to provide the data in a Microsoft Excel
workbook.
Task 2: Use the SQL Server Import and Export Wizard to Export Data
1. Use the SQL Server Import and Export Wizard to export the required sales data from the
InternetSales database on the MIA-SQL instance of SQL Server.
o Export the data to an Excel file named Sales.xls in the D:\Labfiles\Lab06\Starter folder.
o Use the Transact-SQL query in the Query.sql script file (which is in the D:\Labfiles\Lab06\Starter
folder) to extract the required data.
Results: After this exercise, you should have exported data from InternetSales to an Excel workbook
named Sales.xls.
o Note that the table includes some columns that contain Unicode characters.
2. Use the bcp utility to create an XML format file for the JobCandidate table.
o Ensure that the format file uses Unicode character data with a tab (\t) field terminator and a new
line (\n) row terminator.
Results: After this exercise, you should have created a format file named JobCandidateFmt.xml, and
imported the contents of the JobCandidates.txt file into the HumanResources database.
1. Disable Indexes
2. Import Data by Using the BULK INSERT Statement
3. Rebuild Indexes
2. In Object Explorer, view the indexes that are defined on the dbo.CurrencyRates table.
2. In SQL Server Management Studio, use the BULK INSERT Transact-SQL statement to import the data
from the CurrencyRates.csv file into the dbo.CurrencyRates table in the InternetSales database.
3. Verify that the data has been imported.
Results: After this exercise, you should have used the BULK INSERT statement to load data into the
CurrencyRates table in the InternetSales database.
2. In Object Explorer, view the indexes and constraints that are defined on the dbo.JobCandidate table.
o Use a WHERE clause to include only records where EmailAddress is not null.
2. Verify that the data has been imported.
Results: After this exercise, you should have imported data from JobCandidates2.txt into the
dbo.JobCandidates table in the HumanResources database.
Question: Why was it not necessary to disable constraints when importing the currency
rates?
Question: If the dbo.JobCandidate table has included a column for a resume in Microsoft
Word document format, which tool or command could you use to import the document into
a column in a table?
When planning a data transfer solution, consider the following best practices:
Review Question(s)
Question: What other factors might you need to consider when importing or exporting
data?
Module 7
Monitoring SQL Server 2014
Contents:
Module Overview 7-1
Module Overview
A large amount of the time spent by a database administrator (DBA) involves monitoring activity in
databases and database servers in order to diagnose problems and identify changes in resource utilization
requirements. SQL Server provides a range of tools and functionality you can use to monitor current
activity and to record details of previous activity. This module explains how to use three of the most
commonly used tools: Activity Monitor, dynamic management views and functions (DMVs and DMFs), and
Performance Monitor.
Objectives
After completing this module, you will be able to:
• Describe considerations for monitoring SQL Server and use Activity Monitor.
Lesson 1
Introduction to Monitoring SQL Server
SQL Server is a sophisticated software platform with a large number of subsystems and potential
workloads, all of which make demands on system resources and affect how database applications perform
in a multi-user environment. To effectively manage a SQL Server database solution, a DBA must be able to
monitor the key metrics that affect the workloads that the solution must support, and use the information
gained from monitoring to plan hardware capacity and troubleshoot performance problems.
Lesson Objectives
After completing this lesson, you will be able to:
Why Monitor?
Important reasons to monitor SQL Server workloads
include:
• Diagnosing causes of performance issues. It
is not uncommon for users to complain of slow performance when using a database application. By
performance, users usually mean that the response time of the application (the time between a user
submitting a request and seeing a response from the application) is slower than
acceptable, though often the problem can be attributed to a bottleneck that affects the overall
throughput of the system (the amount of data that can be processed for all concurrent workloads
simultaneously). Effective monitoring is a key part of diagnosing and resolving these problems. For
example, you might identify long-running queries that could be optimized by creating indexes or
improving Transact-SQL code (to improve response time), or you might find that at peak times the
server has insufficient physical memory to cope with demand (reducing throughput).
• Detecting and resolving concurrency issues. When multiple users and applications access the same
data, SQL Server uses locks to ensure data consistency. This can cause some requests to be blocked
while waiting for another to complete, and can occasionally result in deadlock where two operations
are blocking one another. By monitoring current activity, you can identify processes that are blocking
one another, and take action to resolve the problem if necessary. If your monitoring reveals
consistent locking issues, you can then troubleshoot them by tracing workloads and analyzing the
results. Techniques for accomplishing this are discussed in the next module.
• Identifying changing trends in resource utilization. A database server is usually provisioned
after careful capacity planning to identify the hardware resources required to
support the database workload. However, business processes, and the database workloads that
support them, can change over time; so it is important to monitor resource utilization changes so that
you can proactively upgrade hardware or consolidate workloads as the needs of the organization
change.
• Understand the workloads you need to support. Every workload has different requirements, both
in terms of the resources required to support it and in terms of trends in demand between quiet and
peak times. Before you can effectively support a database solution, you need to understand its
workloads so that you can identify the relevant metrics to monitor, and prioritize resources based on
the importance of each workload to the business.
• Establish a baseline. A common mistake is to wait until there is a problem before monitoring the
SQL Server solution. The problem with this approach is that without something to use as a
comparison, the values obtained from monitoring are unlikely to help you identify what has changed
since the system was operating acceptably. A better approach is to identify the key metrics that your
workloads rely on, and record baseline values for these metrics when the system is operating
normally. If you experience performance problems later, you can monitor the system and compare
each metric to its baseline in order to identify significant changes that warrant further investigation to
try to diagnose the problem.
• Monitor regularly to track changes in key metrics. Instead of waiting for a problem to arise, it is
generally better to proactively monitor the system on a regular basis to identify any trends that
signify changes in the way the workloads consume resources. With this approach, you can plan server
upgrades or application optimizations before they become critical.
Activity Monitor
Activity Monitor is a tool in SQL Server
Management Studio (SSMS) that shows information
about processes, waits, I/O resource performance,
and recent expensive queries. You can use it to
investigate both current and recent historical issues.
You must have the VIEW SERVER STATE permission
to use Activity Monitor.
• The Processes section includes detailed information on processes, their IDs, logins, databases, and
commands. This section also shows details of processes that are blocking other processes.
• The Resource Waits section shows categories of processes that are waiting for resources and
information about the wait times.
• The Data File I/O section shows information about the physical database files in use and their recent
performance.
• The Recent Expensive Queries section shows detailed information about the most expensive recent
queries, and resources consumed by those queries. You can right-click the queries in this section to
view either the query or an execution plan for the query.
You can filter data by clicking column headings and choosing the parameter for which you want to view
information.
Demonstration Steps
View Server Activity in Activity Monitor
1. Ensure that the 20462C-MIA-DC, and 20462C-MIA-SQL virtual machines are running, and log on to
20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
3. Start SQL Server Management Studio and connect to the MIA-SQL database engine instance using
Windows authentication.
4. In Object Explorer, right-click the MIA-SQL SQL Server instance and click Activity Monitor.
5. In Activity Monitor, view the charts in the Overview section, which show background activity in the
SQL Server instance.
6. Expand the Processes section and view the processes currently running in the SQL Server instance.
7. Click the filter icon for the Application column header, and filter the data to show only processes for
the Microsoft SQL Server Management Studio application (you may need to widen the columns to
read the headers).
9. Expand the Resource Waits section and view the statistics for processes waiting on resources.
10. Expand the Data File I/O section and view the details of the database file I/O activity (you may need
to wait for a few seconds while the data is collected and displayed).
11. Expand the Recent Expensive Queries section and view the list of queries that have consumed query
processing resources.
1. With Activity Monitor still open in SQL Server Management Studio, in the D:\Demofiles\Mod07 folder,
run ActivityWorkload.cmd.
2. In SQL Server Management Studio, in Object Explorer, expand Databases, expand AdventureWorks,
and expand Tables. Then right-click Production.Product and click Select Top 1000 Rows.
3. In the status bar at the bottom of the query pane, note that the query continues executing. Another
process is preventing it from completing.
4. In the Activity Monitor pane, in the Processes section, filter the Task State column to show processes
that are in a SUSPENDED state.
5. In the Blocked By column for the suspended process, note the ID of the process that is blocking this
one.
6. Remove the filter on the Task State column to show all processes, and find the blocking process with
the ID you identified in the previous step.
7. Note the value in the Head Blocker column for the blocking process. A value of 1 indicates that this
process is the first one in a chain that is blocking others.
8. Right-click the blocking process and click Details. This displays the Transact-SQL code that is causing
the block—in this case, a transaction that has been started but not committed or rolled back.
9. Click Kill Process, and when prompted to confirm, click Yes. After a few seconds, the Processes list
should update to show no blocked processes.
10. Close the Activity Monitor pane, and verify that the query to retrieve the top 1000 rows from
Production.Product has now completed successfully.
11. Close the command prompt window, but keep SQL Server Management Studio open for the next
demonstration.
Lesson 2
Dynamic Management Views and Functions
SQL Server provides a range of tools and features to support monitoring of server activity. Dynamic
management views (DMVs) and dynamic management functions (DMFs) provide insights directly into the
inner operations of the SQL Server database engine and are useful for monitoring.
Lesson Objectives
After completing this lesson, you will be able to:
Note: The information exposed by DMVs and DMFs is generally not persisted in the
database as is the case with catalog views. The views and functions are virtual objects that return
state information. That state is cleared when the server instance is restarted.
DMVs and DMFs return server state information that you can use to monitor the health of a server
instance, diagnose problems, and tune performance. There are two types of DMV and DMF:
• Server-scoped. These objects provide server-wide information. To use these objects, users require
VIEW SERVER STATE permission on the server.
• Database-scoped. These objects provide database-specific information. To use these objects, users
require VIEW DATABASE STATE permission on the database.
All DMVs and DMFs exist in the sys schema and follow the naming convention dm_%. They are defined in
the hidden resource database, and are then mapped to the other databases. There are a great many
DMVs and DMFs, covering a range of categories of system information. Some commonly used categories
of DMVs and DMFs are in the following table:
Category Description
sys.dm_os_% These objects provide access to SQL OS-related information. For example,
sys.dm_os_performance_counters provides access to SQL Server
performance counters without needing to reach them using operating
system tools.
In addition to these core categories, there are DMVs and DMFs that provide information about specific
SQL Server subsystems, such as security, Resource Governor, AlwaysOn Availability Groups, Service Broker,
memory-optimized tables, and others.
The following example shows how to use the sys.dm_exec_sessions and sys.dm_os_waiting_tasks views
to return a list of user tasks that have been waiting for longer than three seconds:
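A query of the following general form returns that information. This is a sketch, and the exact columns in the original sample may differ.

Finding Waiting User Tasks
SELECT s.session_id, s.login_name, w.wait_type,
       w.wait_duration_ms, w.blocking_session_id
FROM sys.dm_os_waiting_tasks AS w
JOIN sys.dm_exec_sessions AS s
    ON w.session_id = s.session_id
WHERE s.is_user_process = 1       -- user tasks only
  AND w.wait_duration_ms > 3000;  -- waiting longer than three seconds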
In many cases, when a task is waiting, the cause of the wait will be some form of lock.
Note: Whenever a task has to wait for any resource, the task is sent to a waiting list. The
task remains on that list until it receives a signal telling it that the requested resource is now
available. The task is then returned to the running list, where it waits to be scheduled for
execution again. This type of wait analysis is very useful when tuning system performance, as it
helps you to identify bottlenecks within the system.
Demonstration Steps
View SQL Server Service Configuration Settings
1. If you did not complete the previous demonstration in this module, start the 20462C-MIA-DC, and
20462C-MIA-SQL virtual machines, log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student
with the password Pa$$w0rd, and in the D:\Demofiles\Mod07 folder, run Setup.cmd as
Administrator.
2. If SQL Server Management Studio is not already open, start it and connect to the MIA-SQL database
engine instance using Windows authentication.
3. In SQL Server Management Studio, open the DMVs.sql script file in the D:\Demofiles\Mod07 folder.
4. Highlight the Transact-SQL statement under the comment View service information, and then click
Execute.
5. Review the results, noting the values in the startup_type column for the SQL Server and SQL Server
Agent services.
6. Highlight the Transact-SQL statement under the comment View registry information, and then click
Execute.
7. Review the results. Note the value Start in the value_name column for the MSSQLSERVER and
SQLSERVERAGENT registry keys and the corresponding values in the value_data column. These
values are the registry equivalents of the startup_type column values returned by the
sys.dm_server_services dynamic management view.
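The queries in the DMVs.sql script are not reproduced here; equivalent queries might look like the following sketch (the column choices are assumptions).

-- View service information.
SELECT servicename, startup_type_desc, status_desc, service_account
FROM sys.dm_server_services;

-- View registry information (the Start value corresponds to startup_type).
SELECT registry_key, value_name, value_data
FROM sys.dm_server_registry
WHERE value_name = N'Start';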
1. Select the Transact-SQL code under the comment View volume stats, noting that it retrieves data
from the sys.sysdatabases and sys.master_files system tables as well as the
sys.dm_os_volume_stats dynamic management function.
2. Click Execute and review the query results, which show the files for all databases in the instance,
together with details about the disk volume on which they are hosted.
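A simplified version of this kind of query might look like the following sketch, which uses sys.master_files alone rather than the combination of system tables used in the demonstration script.

SELECT DB_NAME(mf.database_id) AS database_name,
       mf.name AS logical_file_name,
       vs.volume_mount_point,
       vs.total_bytes / 1048576 AS volume_size_mb,
       vs.available_bytes / 1048576 AS free_space_mb
FROM sys.master_files AS mf
CROSS APPLY sys.dm_os_volume_stats(mf.database_id, mf.file_id) AS vs;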
1. Highlight the Transact-SQL statement under the comment Empty the cache and then click Execute.
2. Highlight the Transact-SQL statement under the comment get query stats, noting that this code uses
the sys.dm_exec_query_stats DMV and the sys.dm_exec_sql_text DMF to return details about
Transact-SQL queries that have been executed.
3. Click Execute and review the results, which show some background system queries, noting the
various columns that are returned.
4. Highlight the Transact-SQL statement under the comment Execute a query, and then click Execute.
5. Re-highlight the Transact-SQL statement under the comment get query stats, and then click
Execute to get the query stats again.
6. In the results, find the row with a SQLText column that contains the Transact-SQL statement that you
executed and note the statistics returned for the query.
7. Re-highlight the Transact-SQL statement under the comment Execute a query, and then click
Execute to run the query again.
8. Re-highlight the Transact-SQL statement under the comment get query stats, and then click
Execute to get the query stats again.
9. Review the results for the query, noting that the query you executed now has an execution_count
value of 2.
10. Close SQL Server Management Studio without saving any files.
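For reference, the pattern behind the get query stats step is commonly used to identify expensive queries. The following is a sketch; the demonstration script's exact code may differ, and the Empty the cache step typically uses DBCC FREEPROCCACHE, which you should only run on test systems.

SELECT TOP 10
       qs.execution_count,
       qs.total_logical_reads,
       SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
           ((CASE qs.statement_end_offset
                 WHEN -1 THEN DATALENGTH(st.text)
                 ELSE qs.statement_end_offset
             END - qs.statement_start_offset) / 2) + 1) AS SQLText
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_logical_reads DESC;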
Lesson 3
Performance Monitor
SQL Server Management Studio (SSMS) provides Activity Monitor, which you can use to investigate
both current and recent historical issues. The SQL Server processes also expose a set of performance-
related objects and counters to the Windows Performance Monitor. These objects and counters enable
you to monitor SQL Server as part of monitoring the entire server.
Lesson Objectives
After completing this lesson, you will be able to:
• Counters. An object contains one or more counters, which are the metrics that you can monitor. For
example, the Processor object includes the % Processor Time counter, which measures processor
utilization.
• Instances. Some counters can be captured for specific instances of objects. For example, when
monitoring the % Processor Time counter in a multi-CPU server, you capture measurements for
each individual CPU. Additionally, most multi-instance counters provide a _Total instance, which
measures the counter for all instances combined.
When describing a system counter, the format Object: Counter (instance) is used. For example, the %
Processor Time counter for the _Total instance of the Processor object is described as Processor: %
Processor Time (_Total). When describing the _Total instance, or if an object has only one instance, the
(instance) part is often omitted; for example Processor: % Processor Time.
• A line chart.
• A text-based report.
When viewing a chart in Performance Monitor, you can select individual counters in a list below the chart
and see the last, average, minimum, and maximum values recorded for that counter within the selected
timeframe. You can also highlight an individual counter to make it easier to identify in the chart.
You can export data from all charts and reports, and you can save them as images.
You can also access the same counter values by using the sys.dm_os_performance_counters DMV.
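For example, the following query returns the current value of a single counter (the counter shown is illustrative):

SELECT object_name, counter_name, instance_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name = N'Batch Requests/sec';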
There are too many SQL Server objects and counters to discuss in detail in this course, and the specific
counters you should monitor depend on the workload of your specific database environment. However,
some commonly monitored SQL Server performance counters include:
• SQLServer: Buffer Manager: Buffer cache hit ratio. This measures how frequently SQL Server
found data pages in cache instead of having to fetch them from disk. Generally a high figure for this
counter is desirable.
• SQLServer: Plan Cache: Cache hit ratio. This measures how often a pre-compiled execution plan for
a query could be used, saving the query processor from having to compile an execution plan. A high
value for this is generally desirable.
• SQLServer: SQL Statistics: Batch Requests / sec. This measures the number of Transact-SQL
batches per second, and provides a good measure of how active SQL Server is.
• SQLServer: Locks: Lock Requests / sec. This measures the number of lock requests SQL Server
receives each second. A high value for this counter indicates that a lot of transactions are locking
pages (usually to update data), and on its own is not necessarily an indication of a problem.
• SQLServer: Locks: Average Wait Time (ms). This measures the average time that requests are
waiting for locks to be released. A high value for this counter indicates that concurrent processes are
blocking one another.
• SQLServer: Memory Manager: Database Cache Memory (KB). This measures how much of its memory allocation
SQL Server is using to cache data.
• SQLServer: Memory Manager: Free Memory (KB). This measures how much of its memory allocation SQL Server is
currently not using.
Performance Logs
Each data collector set is configured to use a specific folder in which to save its logs. Logs are files
containing the recorded measurements, and usually have dynamically generated names that include the
date and time the log was saved. You can open log files in Performance Monitor, filtering them to a
specific timespan if desired, and then view the logged counter values as charts or a report.
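You can create data collector sets in the Performance Monitor user interface, or from the command line by using the logman utility. The following is a sketch; the collector set name, counter paths, sample interval, and output path are illustrative.

logman create counter "SQL Server Workload" -c "\Processor(_Total)\% Processor Time" "\SQLServer:SQL Statistics\Batch Requests/sec" -si 15 -o "D:\Logs\SQLWorkload"
logman start "SQL Server Workload"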
Demonstration Steps
View Performance Counters
1. If you did not complete the previous demonstration in this module, start the 20462C-MIA-DC, and
20462C-MIA-SQL virtual machines, log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student
with the password Pa$$w0rd, and in the D:\Demofiles\Mod07 folder, run Setup.cmd as
Administrator.
3. In Computer Management, expand Performance, expand Monitoring Tools, and click Performance
Monitor.
5. In the list of objects, expand the Processor object, and select only the % Processor Time counter.
Then in the Instances of selected object list, ensure that _Total is selected and click Add.
6. In the list of objects, expand the Memory object and select the Page Faults/sec counter. Then click
Add.
7. In the list of objects, expand the SQLServer:Locks object, click the Average Wait Time (ms) counter,
and then hold the Ctrl key and click the Lock Requests/sec and Lock Waits/sec counters. Then in
the Instances of selected object list, ensure that _Total is selected and click Add.
8. In the list of objects, expand the SQLServer:Plan Cache object, and select the Cache Hit Ratio
counter. Then in the Instances of selected object list, ensure that _Total is selected and click Add.
9. In the list of objects, expand the SQLServer:Transactions object, and select the Transactions
counter. Then click Add.
10. In the Add Counters dialog box, click OK. Then observe the counters as they are displayed in
Performance Monitor.
11. On the toolbar, click Freeze Display and note that the chart is paused. Then click Unfreeze Display
to resume the chart.
12. On the toolbar, in the Change Graph Type list, select Histogram bar and view the resulting chart.
Then in the Change Graph Type list, select Report and view the text-based report.
13. On the toolbar, in the Change Graph Type list, select Line to return to the original line chart view.
14. Click any of the counters in the list below the chart and on the toolbar click Highlight so that the
selected counter is highlighted in the chart. Press the up and down arrow keys on the keyboard to
change the selected counter.
16. In Performance Monitor, observe the effect on the counters as the workloads run.
17. Close both command prompt windows, and observe the effect that ending the workloads has on the
Performance Monitor counters.
Objectives
After completing this lab, you will be able to:
Password: Pa$$w0rd
2. Create the data collector set manually, and include the following performance counters:
o SQLServer:Transactions: Transactions
3. Configure the data collector set to save its logs in the D:\Labfiles\Lab07\Starter\Logs folder.
2. Run Baseline.ps1 in the D:\Labfiles\Lab07\Starter folder, changing the execution policy if prompted.
This starts a baseline workload process that takes three minutes to run.
3. When the baseline workload has finished, stop the SQL Server Workload data collector set.
2. Add all counters from the log to the display, and explore the data in the available chart and report
formats.
3. Save the Report view as an image in the D:\Labfiles\Lab07\Starter folder for future reference.
o The top 5 queries by average reads. You can retrieve this information from the
sys.dm_exec_query_stats DMV cross applied with the sys.dm_exec_sql_text DMF based on the
sql_handle column.
o I/O stats for the files used by the InternetSales database. You can retrieve this from the
sys.dm_io_virtual_file_stats DMF, joined with the sys.master_files system table on the
database_id and file_id columns. Note that you can use the DB_NAME function to retrieve a
database name from a database ID.
If you have difficulty creating the required queries, you can use the ones in the Query DMV.sql file in the
D:\Labfiles\Lab07\Starter folder.
2. Save the results of your queries as comma-separated values (CSV) files in the D:\Labfiles\Lab07\Starter
folder.
Results: At the end of this exercise, you will have a data collector set named SQL Server Workload, a log
containing baseline measurements, and query and I/O statistics obtained from DMVs and DMFs.
2. Run Workload.ps1 in the D:\Labfiles\Lab07\Starter folder. This starts a workload process that takes
three minutes to run.
3. When the workload has finished, stop the SQL Server Workload data collector set.
2. Explore the data in the available chart and report formats, noting any values that look consistently
high.
3. Compare the Report view with the baseline image you saved previously, and identify which counters
have changed significantly.
Results: At the end of this exercise, you will have a second log file containing performance metrics for the
revised workload.
Question: Based on the results of your monitoring, what aspect of the database solution is
most significantly affected by the changes to the workload?
Best Practice: When monitoring SQL Server, consider the following best practices:
• Identify the system resources that your database workload uses, and determine the key performance
metrics that indicate how your database and server are performing.
• Record baseline measurements for typical and peak workloads so that you have a basis for
comparison when troubleshooting performance problems later.
• Identify the DMVs and DMFs that return appropriate performance information for your workloads,
and create reusable scripts that you can use to quickly check system performance.
• Monitor the overall system periodically and compare the results with the baseline. This can help you
detect trends that will eventually result in resource over-utilization before application performance is
affected.
Review Question(s)
Question: How are dynamic management views and functions different from system tables?
Module 8
Tracing SQL Server Activity
Contents:
Module Overview 8-1
Module Overview
While monitoring performance metrics provides a great way to assess the overall performance of a
database solution, there are occasions when you need to perform more detailed analysis of the activity
occurring within a SQL Server instance in order to troubleshoot problems and identify ways to optimize
workload performance.
This module describes how to use SQL Server Profiler and SQL Trace stored procedures to capture
information about SQL Server, and how to use that information to troubleshoot and optimize SQL Server
workloads.
Objectives
After completing this module, you will be able to:
• Trace activity in SQL Server.
Lesson 1
Tracing SQL Server Workload Activity
Application workloads generate activity in SQL Server, which you can think of as a sequence of events that
begin and end as the workload progresses. The ability to trace these events is a valuable tool for
performance tuning, for troubleshooting and diagnostic purposes, and for replaying workloads in order to
check the impact of performance changes against test systems or to test application workloads against
newer versions of SQL Server.
Lesson Objectives
After completing this lesson, you will be able to:
Capture to Files
Capturing to an operating system file is the most efficient option for SQL Server Profiler traces. When
configuring file output, you need to supply a filename for the trace. The default file extension for a trace
file is .trc. SQL Server Profiler defaults to a file size of 5 MB, which can be too small for some traces. A more
realistic value on most large systems is between 500 MB and 5,000 MB, depending upon the volume of
activity to record.
When the maximum file size is reached, SQL Server Profiler opens a new file using the previous filename with an
integer appended to it and starts writing to the new file. This is called file rollover and is the default
behavior for SQL Server Profiler. You can disable and enable file rollover in the Trace Properties dialog
box. It is considered good practice to work with a large maximum file size and avoid the requirement for
rollover files, unless there is a need to move the captured traces onto media such as DVDs or onto
download sites that cannot work with larger files.
Capturing to Tables
SQL Server Profiler can also capture trace data to database tables. The underlying SQL Trace
programming interface does not directly support output to tables. However, the SQL Server Profiler
program can retrieve the event data into its graphical grid, and then write those rows to the specified
database table.
Note: Writing trace data directly back to the SQL Server system that is being monitored
can impact performance.
SQL Server Profiler also provides an option for saving existing captured event data displayed in the
graphical grid to a database table.
Trace Events
The information that a trace records consists of sets
of events. An event is an occurrence of an action in
an instance of the SQL Server database engine.
Events contain attributes which are listed as data
columns. The events are grouped into categories of
related event classes. The following table describes the most commonly traced events.
Event Description
… possible to retrieve details of each individual statement contained within the batch.
Note: You can group the output in the SQL Server Profiler graphical grid, based on column
values.
You can define which columns to capture when you create the trace and you should minimize the number
of columns that you capture to help reduce the overall size of the trace. You can also organize columns
into related groups by using the Organize Columns function.
One of the more interesting columns is TextData. Many events do not include it by default, but the
values that it contains are very useful. For example, in the RPC:Completed event, the TextData column
contains the Transact-SQL statement that executed the stored procedure.
Filtering Columns
You can set filters for each of the columns that you capture in a trace. It is important to ensure that you
are only capturing events of interest by using filters to limit the events. Effective use of filters helps to
minimize the overall size of the captured trace, helps to avoid overwhelming the server with tracing
activity, and decreases the number of events that are contained in the trace to reduce complexity during
analysis. Also, smaller traces are typically faster to analyze.
The filters that you configure are only used if the event writes that particular column to the trace. For
example, if you set a filter for DatabaseName = AdventureWorks and capture the Deadlock Graph
event, all deadlock events will be shown because the DatabaseName column is not exposed.
Text-based columns can be filtered using a LIKE operator and wildcard characters. For example, you could
filter the DatabaseName column on the expression LIKE Adventure%, which would include events from
all databases with a name beginning with “Adventure”.
Note: If you need to create a filter based on the database that you want to trace activity
against, tracing by DatabaseID is more efficient than tracing by DatabaseName. However, trace
templates that filter by DatabaseID are less portable than those that do so by the name, because
a database restored on another server will typically have a different DatabaseID.
Trace Templates
SQL Server Profiler offers predefined trace
templates that enable you to easily configure the
event classes you need for specific types of traces.
The Standard template, for example, helps you to
create a generic trace for recording logins, logouts,
batches completed, and connection information.
You can use this template to run traces without
modification or as a starting point for additional
ones with different event configurations.
You can create templates in SQL Server Profiler by creating a trace using the graphical interface—starting
and stopping the trace at least once—and then saving the trace as a template.
The following Transact-SQL code retrieves all columns from a trace file:
Using fn_trace_gettable
SELECT * FROM fn_trace_gettable ('D:\Traces\adworks.trc', default);
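To persist the trace data for further analysis with Transact-SQL queries, you can load it into a table (the table name here is illustrative):

SELECT * INTO dbo.TraceData
FROM fn_trace_gettable('D:\Traces\adworks.trc', default);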
Demonstration Steps
Use SQL Server Profiler to Create a Trace
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to
20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
3. Start SQL Server Management Studio, and connect to the MIA-SQL database engine instance using
Windows authentication.
5. When SQL Server Profiler starts, connect to the MIA-SQL database engine instance using Windows
authentication.
6. In the Trace Properties dialog box, on the General tab, set the following properties:
7. In the Trace Properties dialog box, on the Events Selection tab, note the events and columns that
were automatically selected from the TSQL template.
8. Select Show all events, and under TSQL, select SQL:StmtCompleted. Then clear Show all events so
that only the selected events, including the one you just selected, are shown.
9. Select Show all columns and select the Duration column for the SQL:StmtCompleted event.
10. Click the column header for the Database Name column, and in the Edit Filter dialog box, expand
Like, enter AdventureWorks, and click OK. Then clear Show all columns so that only the selected
columns are shown.
3. Switch back to SQL Server Management Studio, open the Query.sql script file in the
D:\Demofiles\Mod08 folder, and click Execute. This script runs a query in the AdventureWorks
database twenty times.
4. While the query is executing, switch back to SQL Server Profiler and observe the activity.
5. When the query has finished, in SQL Server Profiler, on the File menu, click Stop Trace.
6. In the trace, select any of the SQL:StmtCompleted events and note that the Transact-SQL code is
shown in the bottom pane.
7. Keep SQL Server Profiler and SQL Server Management Studio open for the next demonstration.
SQL Trace
SQL Server Profiler is a graphical tool and it is
important to realize that, depending upon the
options you choose, it can have significant
performance impacts on the server being traced.
SQL Trace is a library of system stored procedures
that you can use for tracing when minimizing the
performance impacts of the tracing is necessary.
Internally, SQL Server Profiler uses the
programming interface provided by SQL Trace.
Traces run in the SQL Server database engine process and can write events to a file or to an application
using SQL Server Management Objects (SMO). The information you learned about how events,
columns, and filtering work in SQL Server Profiler, is directly applicable to how the same objects work
within SQL Trace.
At first, implementing traces can appear difficult, as you need to make many stored procedure calls to
define and run a trace. However, you can use the graphical interface in SQL Server Profiler to create a
trace, and then script it for use with SQL Trace. Typically, very few changes need to be made to the SQL
Trace script files that SQL Server Profiler creates—this generally involves such things as the path to output
files.
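In outline, a scripted trace consists of calls such as the following. This is a simplified sketch; the file path is illustrative, and the event and column IDs shown (12 = SQL:BatchCompleted, 1 = TextData) are just one example of many possible combinations.

DECLARE @TraceID int, @maxfilesize bigint = 50, @on bit = 1;
EXEC sp_trace_create @TraceID OUTPUT, 0, N'D:\Traces\DemoTrace', @maxfilesize, NULL;
EXEC sp_trace_setevent @TraceID, 12, 1, @on;  -- capture TextData for SQL:BatchCompleted
EXEC sp_trace_setstatus @TraceID, 1;          -- 1 = start the trace
-- Later: setting the status to 0 stops the trace, and setting it to 2
-- closes the file and deletes the trace definition on the server.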
Key differences between SQL Trace and SQL Server Profiler include:
• You need to use system stored procedures to configure SQL Trace, whereas SQL Server Profiler
provides a graphical interface for configuration and for controlling the tracing activity.
• SQL Trace runs directly inside the database engine, whereas SQL Server Profiler runs on a client
system (or on the server) and communicates to the database engine by using the SQL Trace
procedures.
• SQL Trace can write events to files or to applications using SMO, whereas SQL Server Profiler can write
events to files or database tables.
• SQL Trace is useful for long-running, performance-critical traces, or for very large traces that would
significantly impact the performance of the target system. SQL Server Profiler is more commonly used
for debugging on test systems, performing short-term analysis, or capturing small traces.
Note: The Server processes trace data option in SQL Server Profiler is not the same as
scripting a trace and starting it directly through stored procedures. The option creates two
traces—one that writes directly to a file and a second that sends the events through SMO to SQL
Server Profiler.
Traces do not automatically restart after the server instance restarts. Therefore, if a trace needs to be run
constantly, you should script the trace, write a stored procedure to launch it, and then mark the stored
procedure as a startup procedure.
Demonstration Steps
Export a Trace Definition
1. Ensure that you have completed the previous demonstration in this module.
2. In SQL Server Profiler, with the Demo Trace still open, on the File menu, point to Export, point to
Script Trace Definition, and click For SQL Server 2005 - 2014.
3. Save the exported trace script as DemoTrace.sql in the D:\Demofiles\Mod08 folder, and click OK
when notified that the script has been saved.
1. In SQL Server Management Studio, open the DemoTrace.sql script file in the D:\Demofiles\Mod08
folder (which you exported from SQL Server Profiler in the previous task).
2. View the Transact-SQL code, and in the line that begins exec @rc = sp_trace_create, replace
InsertFileNameHere with D:\Demofiles\Mod08\SQLTraceDemo.
3. Click Execute to start the trace, and note the TraceID value that is returned.
4. Switch back to the Query.sql tab and click Execute to run the workload query.
5. When the query has finished, open StopTrace.sql in the D:\Demofiles\Mod08 folder.
6. In the StopTrace.sql script, under the comment Stop the trace, if necessary, modify the DECLARE
statement to specify the TraceID value for the trace you started previously.
7. Select the code under the comment Stop the trace and click Execute. Setting the trace status to 0
stops the trace, and setting it to 2 closes the file and deletes the trace definition on the server.
8. Select the code under the comment View the trace and click Execute. Then review the traced events.
9. Close the StopTrace.sql script and DemoTrace.sql tabs without saving any changes so that only the
Query.sql tab remains open.
10. Keep SQL Server Management Studio open for the next demonstration.
Lesson 2
Using Traces
After you have used SQL Server Profiler or SQL Trace to capture a trace, you can use the trace to analyze
workload activity and use the results of this analysis to troubleshoot problems and optimize performance.
SQL Server provides a range of ways in which you can use traces, and these are discussed in this lesson.
Lesson Objectives
After completing this lesson, you will be able to:
• Replay traces.
• Use the Database Engine Tuning Advisor to generate recommendations from a trace.
Replaying Traces
You can replay traces captured with SQL Server
Profiler or SQL Trace to repeat the workload on a
SQL Server instance, enabling you to validate
changes that you are considering making to a
system or to test a workload against new hardware,
indexes, or physical layout changes. You can also
use it to test if corrections that you implement
really do solve the problem.
The replay does not need to be performed against the same system that the trace events were captured
on, but the system must be configured in a very similar way. This particularly applies to objects such as
databases and logins.
• Administration tool. Distributed Replay provides a console tool that you can use to start, monitor, and
cancel replaying.
• Replay controller. The controller is a Windows service which orchestrates the actions of the replay
clients.
• Replay clients. The replay client or clients are computers that run a Windows service to replay the
workload against an instance of SQL Server.
• Target server. The target server is the instance of SQL Server upon which the clients replay the traces.
Distributed Replay uses configuration information stored in XML files on the controller, the clients, and
the computer running the administration tool. Each XML file contains configuration data for the relevant
component of the system.
The controller configuration includes information about the level of logging to use. By default, only
critical messages are logged, but you can configure the controller to log information and warning messages.
The client configuration includes information about the name of the controller, the folders in which to
save dispatch and result files, and again, the logging level to use.
The administration tool configuration includes information on whether to include system session
activities during the replay, and the value at which to cap idle time during the trace.
The replay configuration includes information on the name of the target server, whether to use
connection pooling, and how many threads to use per client. It also contains elements to control the
number of rows and result sets to record.
Additional Reading: For more information about the configuration files, their contents,
and their locations, go to Configure Distributed Replay at
http://technet.microsoft.com/library/ff878359(v=sql.120).aspx.
Workloads
The Database Engine Tuning Advisor utility analyzes the performance effects of workloads run against one
or more databases. Typically, these workloads are obtained from traces captured by SQL Server Profiler or
SQL Trace. After analyzing the effects of a workload on your databases, Database Engine Tuning Advisor
provides recommendations for improving system performance.
A workload is a set of Transact-SQL statements that executes against databases you want to tune. The
workload source can be a file containing Transact-SQL statements, a trace file generated by SQL Server Profiler,
or a table of trace information, also generated by SQL Server Profiler. You can also use SQL Server
Management Studio (SSMS) to launch Database Engine Tuning Advisor to analyze an individual
statement.
Recommendations
The recommendations that the Database Engine Tuning Advisor produces include suggested changes to
the database such as new indexes, indexes that should be dropped, and depending on the tuning options
you set, partitioning recommendations. The recommendations appear as a set of Transact-SQL statements
that will implement the suggested changes. You can view the Transact-SQL and save it for later review
and execution, or you can choose to implement the recommended changes immediately.
The Database Engine Tuning Advisor provides a rich set of configuration options that enable you to
customize the analysis to perform and how the optimization recommendations should be made.
Running the Database Engine Tuning Advisor on sizeable workloads can take a long time, particularly on
systems that also have large numbers of database objects. You can configure Database Engine Tuning
Advisor to limit the time it will spend on analysis and to return the results it has obtained up to the time
limit.
You can also configure which types of recommendations should be made, along with whether or not you
wish to see recommendations that involve dropping existing objects.
Exploratory Analysis
You can also use the Database Engine Tuning Advisor to perform exploratory analysis, which involves a
combination of manual and tool-assisted tuning. To perform exploratory analysis with the Database
Engine Tuning Advisor, use the user-specified configuration feature. This enables you to specify the tuning
configurations for existing and hypothetical physical design structures, such as indexes, indexed views, and
partitioning. The benefit of specifying hypothetical structures is that you can evaluate their effects on your
databases without incurring the overhead of implementing them first.
You can create an XML configuration file to specify a hypothetical configuration, and then use it for
analysis. You can perform the analysis, either in isolation or relative to the current configuration. You can
also perform this type of analysis by using a command-line interface.
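For example, the dta command-line utility can run an analysis without the graphical interface. The following is a sketch; the session name, workload file, and output path are illustrative.

dta -S MIA-SQL -E -D InternetSales -if D:\Traces\Workload.trc -s TuneSession1 -of D:\Traces\Recommendations.sql

In this command, -E connects by using Windows authentication, -if identifies the workload trace file, and -of saves the recommendations as a Transact-SQL script.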
• Generate recommendations.
• Validate recommendations.
Demonstration Steps
Configure a Tuning Session
1. Ensure that you have completed the previous demonstration in this module.
2. In SQL Server Profiler, on the Tools menu, click Database Engine Tuning Advisor.
3. When the Database Engine Tuning Advisor starts, connect to the MIA-SQL database engine instance
using Windows authentication.
4. In the Database Engine Tuning Advisor, in the Session name box, type Tuning Demo.
5. Under Workload, ensure that File is selected, and browse to the D:\Demofiles\Mod08\Demo
Trace.trc file (which is where you saved the trace from SQL Server Profiler in the previous
demonstration).
o Product
o ProductCategory
o ProductSubcategory
o SalesOrderDetail
o SalesOrderHeader
8. On the Tuning Options tab, review the default options for recommendations. Then click Advanced
Options, select Generate online recommendations where possible, and click OK.
Generate Recommendations
1. In the Database Engine Tuning Advisor, on the toolbar, click Start Analysis.
2. When the analysis is complete, on the Recommendations tab, review the index recommendations
that the DTA has generated.
3. On the Reports tab, view the tuning summary and in the Select report list, select Statement detail
report.
4. View the report and compare the Current Statement Cost value to the Recommended Statement
Cost value (cost is an internal value that the SQL Server query processor uses to quantify the work
required to process a query).
5. On the Actions menu, click Save Recommendations, save the recommendations script as DTA
Recommendations.sql in the D:\Demofiles\Mod08 folder, and click OK when notified that the file
was saved.
Validate Recommendations
1. In SQL Server Management Studio, highlight the SELECT statement in the Query.sql script, taking
care not to highlight the GO 20 statement that follows it.
2. On the Query menu, click Display Estimated Execution Plan. This displays a breakdown of the tasks
that the query processor will perform to process the query.
3. Note that the query processor suggests that there is at least one missing index that would improve
query performance. Then hold the mouse over the SELECT icon at the left side of the query plan
diagram and view the Estimated Subtree Cost value that is displayed in a tooltip.
4. In SQL Server Management Studio, open the DTA Recommendations.sql script you saved from the
Database Engine Tuning Advisor in the D:\Demofiles\Mod08 folder. Then click Execute to implement
the recommended indexes.
5. Switch back to the Query.sql tab, and highlight the SELECT statement, taking care once again not to
highlight the GO 20 statement that follows it.
6. On the Query menu, click Display Estimated Execution Plan.
7. Note that the query processor no longer suggests that there is a missing index. Then hold the mouse
over the SELECT icon at the left side of the query plan diagram and view the Estimated Subtree Cost
value that is displayed in a tooltip.
To correlate a trace with performance counters, open a trace file or table that contains the StartTime and
EndTime data columns, and then in SQL Server Profiler, on the File menu, click Import Performance
Data. You can then open a performance log, and select the System Monitor objects and counters that you
want to correlate with the trace.
Demonstration Steps
Correlate a Trace with Performance Data
1. Ensure that you have completed the previous demonstration in this module.
2. In the D:\Demofiles\Mod08 folder, double-click AWCounters.blg to open the log file in Performance
Monitor.
3. View the line chart, noting the times shown along the bottom axis. Then close Performance Monitor.
4. In SQL Server Profiler, open the AWTrace.trc file in the D:\Demofiles\Mod08 folder and view the
traced events, noting that the event times in the StartTime column match those in the Performance
Monitor log.
5. In SQL Server Profiler, on the File menu, click Import Performance Data. Then open the
AWCounters.blg log file in the D:\Demofiles\Mod08 folder.
6. In the Performance Counters Limit Dialog dialog box, select \\MIA-SQL (which selects all of the
counters in the log file) and click OK.
7. Click the line chart at approximately the 3:15:35 PM marker. Note that the event in the trace that
occurred at that time is selected, and the Transact-SQL statement that was executed is shown in the
bottom pane.
8. Keep SQL Server Management Studio open for the next demonstration.
No transaction can be granted a lock that conflicts with the mode of a lock that has already been granted
to another transaction on that same data. If a transaction requests a lock mode that does conflict, the
database engine blocks the requesting transaction until the first lock is released.
For UPDATE operations, SQL Server always holds locks until the end of the transaction. For SELECT
operations, it holds the lock protecting the row for a period that depends upon the transaction isolation
level setting. All locks still held when a transaction completes are released, regardless of whether the
transaction commits or rolls back.
Locking is crucial for transaction processing and is normal behavior for the system. Problems only occur
when locks are held for too long and other transactions are blocked in a similar way because of the locks
being held.
Blocking
Blocking is what happens to one process when it needs to wait for a resource that another process has
locked. Blocking is a normal occurrence for systems and is not an issue, except when it is excessive.
You can monitor blocking in real time by using SQL Server Activity Monitor and by querying dynamic
management views. Additionally, you can use the TSQL_Locks trace template in SQL Server Profiler to
capture details of lock-related events in order to troubleshoot excessive blocking.
Deadlocks
Deadlock errors are a special type of blocking error where SQL Server needs to intervene; otherwise the
locks would never be released. The most common form of deadlock occurs when two transactions have
locks on separate objects and each transaction requests a lock on the other transaction’s object. For
example:
• Task 1 requests an exclusive lock on row 2, but it cannot be granted until Task 2 releases the shared
lock.
• Task 2 requests an exclusive lock on row 1, but it cannot be granted until Task 1 releases the shared
lock.
• Each task must wait for the other to release the lock, which will never happen.
A deadlock can occur when several long-running transactions execute concurrently in the same database.
A deadlock can also happen as a result of the order in which the optimizer processes a complex query,
such as a join.
• Chooses a deadlock victim. SQL Server gives priority to the process that has the highest deadlock
priority. If both processes have the same deadlock priority, SQL Server rolls back the transaction that
is the least costly to roll back.
Note: In a multi-user environment, each client should check for message number 1205,
which indicates that the transaction was rolled back. If message 1205 is found, the application
should reconnect and try the transaction again.
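A minimal retry pattern for handling error 1205 might look like the following sketch. The table, column, and values are hypothetical placeholders for the real transactional work.

DECLARE @retries int = 0;
WHILE @retries < 3
BEGIN
    BEGIN TRY
        BEGIN TRANSACTION;
        -- Hypothetical work that might be chosen as a deadlock victim.
        UPDATE dbo.SomeTable SET SomeColumn = SomeColumn + 1 WHERE SomeID = 1;
        COMMIT TRANSACTION;
        BREAK;  -- success
    END TRY
    BEGIN CATCH
        IF XACT_STATE() <> 0 ROLLBACK TRANSACTION;
        IF ERROR_NUMBER() = 1205
            SET @retries += 1;  -- deadlock victim: try again
        ELSE
            THROW;  -- any other error: re-raise
    END CATCH;
END;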
You can monitor deadlocks by using SQL Server Profiler and/or SQL Trace. There are several deadlock
events available, including the Deadlock Graph event.
Demonstration Steps
Capture a Trace Based on the TSQL_Locks Template
1. Ensure that you have completed the previous demonstrations in this module.
2. In SQL Server Profiler, on the File menu, click New Trace. Then connect to the MIA-SQL database
engine instance using Windows authentication.
3. In the Trace Properties dialog box, in the Trace name box, type Locks. Then in the Use the
template drop-down list, select TSQL_Locks.
4. On the Events Selection tab, view the events that are selected in this template. Then click the column
header for the Database Name column, and in the Edit Filter dialog box, expand Like, enter
AdventureWorks, and click OK.
6. While the trace is running, in the D:\Demofiles\Mod08 folder, run Deadlock.cmd. This will open two
command prompt windows.
7. When both command prompt windows close, in SQL Server Profiler, on the File menu, click Stop Trace.
1. In SQL Server Profiler, in the Locks trace, find the Deadlock graph event and select it.
2. In the bottom pane, view the deadlock graph, which shows that a deadlock occurred and one process
was selected as the victim.
3. On the File menu, point to Export, point to Extract SQL Server Events, and click Extract Deadlock
Events. Then save the deadlock events as Deadlocks in the D:\Demofiles\Mod08 folder.
4. Close SQL Server Profiler, and in SQL Server Management Studio, open the Deadlocks_1.xdl file in
the D:\Demofiles\Mod08 folder. Note that you can view deadlock graph files in SQL Server
Management Studio.
5. Hold the mouse pointer over each of the process circles to see the statements that they were
executing as a tooltip. Note that the deadlock occurred because one process used a transaction to
update records in the Production.Product table and then the Sales.SpecialOffer table, while the
other process tried to update the same records in the opposite order.
Objectives
After completing this lab, you will be able to:
2. Name the trace InternetSales Workload, and configure it to save the results to
D:\Labfiles\Lab08\Starter\InternetSales Workload.trc.
3. Base the trace on the TSQL trace template, and add the SQL:StmtCompleted event.
5. Filter the trace to include events where the DatabaseName column is like InternetSales.
2. While the trace is running, run the Workload.ps1 PowerShell script in the D:\Labfiles\Lab08\Starter
folder. This starts a workload in the InternetSales database that lasts for approximately three
minutes.
3. Observe the activity in SQL Server Profiler while the workload is running.
5. View the details of some of the SQL:StmtCompleted events that were captured to identify
Transact-SQL statements in the workload.
Results: After this exercise, you should have captured a workload using SQL Server Profiler.
2. Generate Recommendations
2. Create a tuning session named Tune InternetSales based on the trace file you captured previously.
The session should analyze the workload in the InternetSales database and tune all tables in this
database apart from dbo.CurrencyRate.
3. Configure the advanced tuning options to generate online recommendations where possible.
2. When analysis is complete, review the recommendations and save them as a Transact-SQL script in
the D:\Labfiles\Lab08\Starter folder.
2. View the Statement detail report and compare the Current Statement Cost and Recommended
Statement Cost values for the query you identified as being most frequently used.
3. Copy the statement string for the most frequently used statement to the clipboard, and then use SQL
Server Management Studio to create a new query with a connection to the InternetSales database in
the MIA-SQL instance and paste the copied statement.
4. Display the estimated execution plan for the query, and note any warnings about missing indexes and
the estimated subtree cost for the root SELECT statement.
5. Open the recommendations script you saved from the Database Engine Tuning Advisor and run it to
create the recommended indexes and statistics.
6. Return to the most frequently executed query, display the estimated execution plan again, and note
any differences.
Results: After this exercise, you should have analyzed the trace in the Database Engine Tuning Advisor,
and reviewed the recommendations.
2. Run the script to start the trace, and note the returned TraceID value.
3. While the trace is running, run the Workload.ps1 PowerShell script in the D:\Labfiles\Lab08\Starter
folder. This starts a workload in the InternetSales database that lasts for approximately three
minutes.
4. While the workload is running, in SQL Server Management Studio create a new query that uses the following
Transact-SQL code (replacing TraceID with the TraceID value for your trace). Do not execute the code
until the workload has finished:
5. When the workload finishes, run the Transact-SQL query you created in the previous step to stop the
trace.
Results: After this exercise, you should have captured a trace using SQL Trace.
Question: How do you think the Database Engine Tuning Advisor recommendations you
implemented will affect overall performance of the workload?
Question: The workload you traced was defined to reflect common reporting functionality
and includes only SELECT queries. Alongside this workload, the InternetSales database must
process INSERT and UPDATE operations submitted by the e-commerce site. How will the
recommendations you implemented affect these workloads?
Best Practice: When tracing activity in SQL Server, consider the following best practices:
• Use SQL Server Profiler to perform short traces for debugging and other purposes.
• Use SQL Trace for large and long-running traces.
• Use SQL Server Profiler to define traces and script them for SQL Trace.
• Use Database Engine Tuning Advisor to analyze the database based on the overall workload you want
to optimize, rather than focusing on individual queries.
Review Question(s)
Question: In what situations would you use SQL Trace rather than SQL Server Profiler?
Module 9
Managing SQL Server Security
Contents:
Module Overview 9-1
Module Overview
Appropriate protection of data is vital in any application, and understanding how to implement security at
the server and individual database level is a key requirement for a database administrator (DBA).
In this module, you will learn about the core concepts on which the SQL Server security architecture is
based, and how to manage security at the server and database levels.
Objectives
After completing this module, you will be able to:
Lesson 1
Introduction to SQL Server Security
SQL Server is designed to be a secure data platform, and includes a range of security features. The security
architecture in SQL Server is based on well-established principles, with which you should be familiar
before configuring individual security settings.
Lesson Objectives
After completing this lesson, you will be able to:
Security Concepts
Before learning how to configure security in SQL
Server, it may be useful to explore some basic
security concepts, and identify how they relate to
SQL Server. Security is a major feature of all
enterprise software systems, and many of the
concepts relating to security are similar across
multiple systems.
Security Hierarchies
Security architectures are often hierarchical, primarily to simplify management of permissions. In a
hierarchical security architecture, securables can contain other securables, for example a folder can
contain files; and principals can contain other principals, for example users can be added to a group.
Permissions are usually inherited, both by hierarchies of securables (for example, granting “read” permission on a folder implicitly grants “read” permission on the files it contains), and by hierarchies of principals (for example, granting “read” permission to a group implicitly grants “read” permission to all users who are members of that group). Generally, inherited permissions can be explicitly overridden at different hierarchy levels to fine-tune access. This hierarchical approach has two main benefits:
• Fewer individual permissions need to be granted, reducing the risk of misconfiguration. You can set
the general permissions that are required at the highest level in the hierarchy, and only apply explicit
overriding permissions further down the hierarchy to handle exceptional cases.
• After the permissions have been set, they can be controlled through group membership. This makes it
easier to manage permissions in environments where new users arrive and existing users leave or
change roles.
Best Practice: When planning a security solution, consider the following best practices:
• Provide each principal with only the permissions they actually need.
• Use securable inheritance to minimize the number of implicit permissions that must be set in order to
enable the required level of access.
• Use principal containers such as groups or roles to create a layer of abstraction between principals
and permissions to access securables. Then use membership of these groups to control access to
resources via the permissions you have defined. Changes in personnel should not require changes to
permissions.
Server-Level Securables
In SQL Server, securables at the top level of the SQL
Server instance are referred to as server-level
objects. Some examples of server-level objects
include:
• Credentials. These can be used by SQL Server to access external resources, such as Microsoft Azure
storage.
Note: Some securables are also principals. For example, a login is a principal that enables
access to the SQL Server instance; but it is also a securable because there are actions that can be
performed on it (such as disabling, or deleting it) that require permissions.
Database-Level Securables
In a database, there are objects that must be secured. These include:
• Certificates. These are cryptographic keys that can be used to encrypt data or authenticate
connections.
• Users. These are principals that enable access to a database and the objects it contains.
• Schemas. These are namespaces that are used to organize database objects.
Schemas define namespaces for database objects. Every database contains a schema named dbo, and
database developers can create additional schemas to keep related objects together and simplify
permissions management. Schemas contain core database objects, including:
• Tables. These are the data structures that contain application data.
• Views. These are pre-defined Transact-SQL queries that are used as a layer of abstraction over tables.
• Indexes. These are structures that are used to improve query performance.
• Stored procedures and functions. These are pre-defined Transact-SQL statements that are used to
implement business logic in a database application.
Server-Level Principals
In order to gain access to a SQL Server instance, a
user must use a server-level principal called a login.
Logins can then be added to server-level roles.
Logins
SQL Server supports two kinds of login:
• SQL Server logins. These are logins with
security credentials that are defined in the
master database. SQL Server authenticates these logins by verifying a password.
• Windows logins. These reference security accounts that are managed by Windows, such as Windows
users or groups. SQL Server does not authenticate these logins, but rather trusts Windows to verify
their identity. For this reason, connections made to SQL Server using a Windows login are often
referred to as trusted connections.
Note: It is also possible for logins to be created from certificates and keys, but this is an
advanced topic beyond the scope of this course.
When using Windows logins, it is important to note that a Windows login in SQL Server can reference an
individual Windows user, a domain global group defined in Active Directory, or a local group (either a
domain local group in Active Directory or a local group defined on the Windows server hosting SQL
Server). A Windows login that references a group implicitly enables all Windows users in that group to
access the SQL Server instance.
Using Windows group logins can greatly simplify ongoing administration. Windows users are added to
global groups based on their role within the organization, and global groups are added to local groups
based on specific SQL Server access requirements. As new users arrive, and existing users change roles or
leave the organization, access to SQL Server is controlled through group membership in Active Directory,
and no additional changes need to be made within SQL Server. You should also note that basing logins
on Windows groups can make testing and troubleshooting permissions issues within SQL Server more
complex; but generally the long-term manageability benefits make this a worthwhile tradeoff.
Server-Level Roles
Server-level roles are security principals to which you can add logins in order to simplify permissions
management. SQL Server 2014 supports two kinds of server-level role:
• Fixed server-level roles. These are system-defined roles that are automatically granted the required
permissions to perform specific server-level management tasks.
• User-defined server roles. These are roles that DBAs can create in order to define custom server-
level management groups.
Adding a login to a server role implicitly grants that login all of the permissions assigned to the role.
Database-Level Principals
Having access to the server does not (in itself) mean that a login has any access to user databases on the server. To enable a login to access a database, a mapping must exist between the login and a
database user in the database. You can add database users to database-level roles to simplify permissions
management.
Database Users
A database user is a database-level principal that is usually mapped to a login at the server-level.
Database users often have the same name as the logins that they are mapped to, but this is not required.
You can even map logins to different user names in each database. You can think of a database user as
being the identity that a login uses when accessing resources in a particular database.
Note: Mapping login names to database user names is considered a best practice.
Database-Level Roles
You can add database users (in both regular and contained databases) to database roles in order to
simplify permissions management. SQL Server supports two kinds of database-level role:
• Fixed database-level roles. These are system-defined roles that encapsulate permissions required to
perform common tasks.
• User-defined database roles. These are custom roles that DBAs can create to group users with
similar access requirements.
Note: In an environment where all logins are based on Windows groups, database users
based on these group logins behave in much the same way as roles, so you may choose to grant
permissions directly to database users based on Windows groups rather than create user-defined
roles. However, user-defined roles can be useful to combine users with similar permissions
requirements when you are using a mixture of individual Windows logins and SQL Server logins.
Application Roles
An application role is a database-level principal that an application can activate in order to change its
security context within the database. When an application role is active, SQL Server enforces the
permissions that are applied to the application role and not those of the current database user.
GRANT
A user who has not been granted a permission is unable to perform the associated action. For example, users cannot SELECT data from tables unless they have been granted permission to do so. In SQL Server, permissions are granted by using the GRANT statement.
You can grant multiple permissions to multiple principals in a single GRANT statement, as shown in the
following pseudo-code:
-- Object permission
GRANT permission, permission, …n ON securable_class::securable TO principal, principal, …n;
Permissions can be granted explicitly or they can be inherited. Permission inheritance applies to principal
hierarchies (in other words, if a database role is granted SELECT permission on a schema, all database
users who are members of that role are implicitly granted SELECT permission on the schema); and also to
securable hierarchies (for example, the database role that was granted SELECT permission at the schema
level implicitly receives SELECT permission on all objects within the schema that support that permission).
Inherited permissions are cumulative. For example, if a database role has been granted SELECT permission
on a schema, and a user who is a member of that database role has explicitly been granted UPDATE
permission on a table within the schema, the user receives SELECT permission on the table (inherited
through membership of a role that has SELECT permission on the parent schema) and UPDATE permission
(directly granted to the user on the table).
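The following sketch illustrates this scenario (the role, user, and object names are illustrative):

GRANT SELECT ON SCHEMA::sales TO sales_reader;      -- inherited by all members of the role, on all objects in the schema
GRANT UPDATE ON OBJECT::sales.Orders TO DanDrayton; -- granted directly to the user
-- A member of sales_reader named DanDrayton now has both SELECT and UPDATE on sales.Orders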
DENY
An exception can be made to cumulative inherited permissions by using the DENY statement. A DENY
statement explicitly denies a specific permission on a securable to a principal, and overrides any other
explicit or inherited permissions that the principal may have been granted.
The format of a DENY statement mirrors that of a GRANT statement, as shown in the following pseudo-
code:
-- Object permission
DENY permission, permission, …n ON securable_class::securable TO principal, principal, …n;
For example, a user who is a member of a database role that has SELECT permission on a schema automatically has SELECT permission on all tables and views in that schema. If the schema contains a table
to which you do not want the user to have access, you can DENY select permission on the table to the
user. Even though the user has inherited SELECT permission through membership of the database role
(which in turn has inherited SELECT permission from the parent schema), the user will not be able to query
the table.
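For example (the table and user names are illustrative):

DENY SELECT ON OBJECT::sales.Salaries TO DanDrayton; -- overrides SELECT inherited through role membership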
Note that DENY permissions are inherited, and cannot be overridden further down the hierarchy. For
example, if you deny a user SELECT permission on a schema, granting SELECT permission on an individual
table in the schema will not allow the user to query that table. Additionally, note that a DENY permission
cannot be used to prevent access to a securable by its owner or members of the sysadmin server-level
role.
Note: You should use DENY sparingly. A need to DENY many permissions tends to indicate
a potential problem with your security design.
REVOKE
To remove a previously granted or denied permission, you can use the REVOKE statement. Note that REVOKE removes only a specified explicit permission; it cannot be used to override inherited permissions.
When you revoke a permission, you revoke it from a principal, as shown in the following pseudo-code:
-- Object permission
REVOKE permission, permission, …n ON securable_class::securable FROM principal, principal, …n;
If you have granted a principal a permission by using the WITH GRANT OPTION clause (so the principal can grant the same permission to others), you can use the GRANT OPTION FOR clause of the REVOKE statement to revoke the ability to grant the permission without revoking the permission itself. You can also use the REVOKE statement with the CASCADE clause to revoke the permission from others who have been granted the permission by the specified principal.
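For example, the following sketch (with illustrative names) revokes only the ability to re-grant a permission:

REVOKE GRANT OPTION FOR SELECT ON SCHEMA::sales FROM sales_admin CASCADE;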
Effective Permissions
The effective permissions for a given principal on a specific securable are the actual permissions that SQL Server will enforce, taking into account permissions granted or denied explicitly, and permissions inherited through principal and securable hierarchies.
You can view the effective permissions for SQL Server logins and individual Windows logins in two ways:
• In SSMS, view the Permissions tab of the properties dialog box for the securable, or the Securables
tab of the properties dialog for the principal. Here you can select the combination of principal and
securable you want to check and view the Effective tab of the Permissions pane.
• In Transact-SQL, use the EXECUTE AS statement to impersonate a login (at the server level) or a user
(at the database level), and query the sys.fn_my_permissions system function, specifying the
securable for which you want to view effective permissions.
The following code sample shows how to impersonate the login ADVENTUREWORKS\RosieReeves and
view the effective permissions on a table named dbo.Products:
EXECUTE AS LOGIN = 'ADVENTUREWORKS\RosieReeves';
SELECT * FROM sys.fn_my_permissions('dbo.Products', 'OBJECT');
REVERT;
Windows group logins (and users based on them) cannot be impersonated; so you cannot use either of
these techniques to view effective permissions for a login based on a Windows group. Each Windows user
can log in and query the sys.fn_my_permissions system function for themselves, but since Windows
users can be added to more than one Windows group, the results for each Windows user may vary
depending on the groups to which they belong.
Lesson 2
Managing Server-Level Security
Security implementation for SQL Server usually begins at the server level, where users are authenticated
based on logins and organized into server-level roles to make it easier to manage permissions.
Lesson Objectives
After completing this lesson, you will be able to:
• Manage logins.
• Manage server-level roles.
Note: In technical terms, an application that retrieves data based on the user’s credentials
uses impersonation to access a SQL Server instance on the same server, and delegation to access
SQL Server on a remote server. Delegation requires substantial configuration, a discussion of
which is beyond the scope of this course.
• SQL Server and Windows authentication mode. In SQL Server and Windows authentication, users
with Windows logins can access SQL Server, and users with SQL Server logins, which are directly
authenticated by SQL Server, can access the instance. This mode is often called mixed authentication.
You can specify which type of authentication to use when you install SQL Server, and you can also change the mode after installation, although the change requires a restart of the instance to take effect.
Note: You generally change the configuration by using SQL Server Management Studio (SSMS). However, the setting is stored in a single registry key, so you can also configure it by using Windows Group Policy.
If you install SQL Server using mixed mode authentication, setup enables a SQL Server login called sa. It is
important to create a complex password for this login because it has administrative rights at database
server level. If you install using Windows authentication mode, then changing to mixed authentication
later does not enable the sa login.
The SQL Server Native Client (SNAC) provides encrypted authentication for SQL Server logins. If
SQL Server does not have a Secure Sockets Layer (SSL) certificate installed by an administrator, SQL Server
generates and self-signs a certificate for encrypting authentication traffic.
Note: Encrypted authentication only applies to clients running the SQL Server 2005 version
of SNAC or later. If an earlier client that does not understand encrypted authentication tries to
connect, by default SQL Server does not allow the connection. If this is a concern, you can use
SQL Server Configuration Manager to disallow unencrypted authentication from down-level
clients.
Managing Logins
You can create logins by using either Transact-SQL
code or the GUI in SSMS. Because creating logins is a common task, you may find it faster, more repeatable, and more accurate to use a Transact-SQL script.
Creating Logins
To create a login by using SSMS, expand the
Security node for the relevant server instance,
right-click Logins, and then click New Login.
Complete the details in the Login - New dialog box
to configure the login which you require.
Alternatively, you can create logins by using the
CREATE LOGIN Transact-SQL statement.
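For example, a sketch of such a statement:

CREATE LOGIN [ADVENTUREWORKS\SalesReps]
FROM WINDOWS
WITH DEFAULT_DATABASE = salesdb;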
In this example, a login named ADVENTUREWORKS\SalesReps is created for the Windows group of the
same name. The default database for the user will be salesdb. If you do not specify this option, the
default database is set to master.
Note: Windows user and group names must be enclosed in square brackets because they
contain a backslash character.
You create SQL Server logins in the same way. There are, however, additional arguments that are only
relevant to SQL Server logins (for example, the PASSWORD argument).
The following example shows how to create a login named DanDrayton and assign a password of
Pa$$w0rd:
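CREATE LOGIN DanDrayton
WITH PASSWORD = 'Pa$$w0rd';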
When you create a SQL Server login, you can specify the following options to control how the password
policy is enforced:
• MUST_CHANGE: SQL Server will prompt the user to change their password the next time they log
on. You must ensure that whatever client application the user will use to connect to SQL Server
supports this. The default value for this setting is ON when you use the user interface to create a login, but OFF when you use the Transact-SQL CREATE LOGIN statement.
• CHECK_POLICY = {ON | OFF}: Setting this value ON enforces the password complexity policy for this
user. The default value for this setting is ON.
• CHECK_EXPIRATION = {ON | OFF}: Setting this value ON enables password expiration, forcing the
user to change their password at regular intervals. The default value for this setting is ON when you use the user interface to create a login, but OFF when you use the Transact-SQL CREATE LOGIN statement.
You can configure policy settings for a SQL Server login in SSMS, or in the CREATE LOGIN or ALTER LOGIN
statement. The following code example modifies the DanDrayton login created earlier to explicitly
disable policy checking:
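ALTER LOGIN DanDrayton
WITH CHECK_POLICY = OFF;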
The full application of account policy is not always desirable. For example, some applications use fixed
credentials to connect to the server. Often, these applications do not support regular changing of login
passwords. In these cases, it is common to disable password expiration for those logins.
You can reset passwords by using SSMS or the ALTER LOGIN Transact-SQL statement.
Changing a Password
ALTER LOGIN DanDrayton
WITH OLD_PASSWORD = 'Pa$$w0rd',
PASSWORD = 'NewPa$$w0rd';
Disabling a Login
The following code shows how to use the ALTER LOGIN statement to disable a login:
ALTER LOGIN DanDrayton DISABLE;
You can remove logins from a server by using the DROP LOGIN statement or SSMS. If a user is currently
logged in, you cannot drop their login without first ending their session.
In this example, a login is dropped from the server instance:
Dropping a Login
DROP LOGIN DanDrayton;
It is very important that you follow the principle of least privilege when assigning roles to security principals. For example, imagine a user who needs permissions to shut down the server, end processes, manage disk files, and create, alter, and drop databases. It might seem more straightforward to add the user's login to the sysadmin role rather than to the four roles that would grant the required permissions. However, adding the login to the sysadmin role would confer excessive permissions, including the ability to add other members to the sysadmin role.
Following the principle of least privilege prevents the awarding of unintended rights, and helps to keep
servers secure.
Note: Unlike in earlier versions of SQL Server, the BUILTIN\administrators and Local System
(NT AUTHORITY\SYSTEM) accounts are not automatically added as members of the sysadmin
role, although you can add them manually if required. Note that this does not affect the ability of
local administrators to access the database engine when it is in single user mode.
In the following code example, the ADVENTUREWORKS\WebAdmins login is added to the app_admin
server-level role:
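ALTER SERVER ROLE app_admin
ADD MEMBER [ADVENTUREWORKS\WebAdmins];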
To remove a role member, use the ALTER SERVER ROLE statement with the DROP MEMBER clause.
To view membership of fixed server-level roles and user-defined server roles, you can query the
sys.server_role_members system view.
Note: By default, each member of a fixed server role can add other logins to that role.
However, members of user-defined server roles cannot add other server principals to the role.
In general, you should manage server-level permissions based on membership of fixed server-level roles
or by granting permissions to user-defined server roles rather than directly to logins. The only exception
to this recommendation is that, if all logins are based on Windows groups, you may choose to grant
custom permissions directly to logins since they offer the same manageability benefits as user-defined
server-level roles.
• Create logins.
Demonstration Steps
Set the Authentication Mode
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to
20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
3. Start SQL Server Management Studio, and connect to the MIA-SQL database engine using Windows
authentication.
In the Server Properties – MIA-SQL dialog box, on the Security page, verify that SQL Server and
Windows Authentication mode is selected. Then click Cancel.
Create Logins
1. In Object Explorer, expand Security, and expand Logins to view the logins that are currently defined
on this server instance.
2. Right-click Logins and click New Login. Then in the Login – New dialog box, next to the Login
name box, click Search.
3. In the Select User or Group dialog box, click Object Types. Then in the Object Types dialog box,
select only Users and Groups and click OK.
4. In the Select User or Group dialog box, click Locations. Then in the Locations dialog box, expand
Entire Directory, select adventureworks.msft and click OK.
5. In the Select User, Service Account, or Group dialog box, click Advanced. Then click Find Now.
This produces a list of all users and groups in the Active Directory domain.
6. In the list of domain objects, select HumanResources_Users (this is a domain local group that
contains multiple global groups, each of which in turn contains users), then click OK.
7. In the Select User, Service Account, or Group dialog box, ensure that HumanResources_Users is
listed, and click OK.
8. In the Login – New dialog box, in the Default database drop-down list, select AdventureWorks.
Then click OK and verify that the ADVENTUREWORKS\HumanResources_Users login is added to
the Logins folder in Object Explorer.
9. Right-click Logins and click New Login. Then in the Login – New dialog box, enter the name
Payroll_Application and select SQL Server authentication.
10. Enter and confirm the password Pa$$w0rd, and then clear the Enforce password expiration check
box (which automatically clears the User must change password at next login check box).
11. In the Default database drop-down list, select AdventureWorks. Then click OK and verify that the
Payroll_Application login is added to the Logins folder in Object Explorer.
12. Open the CreateLogins.sql script file in the D:\DemoFiles\Mod09 folder and review the code it
contains, which creates a Windows login for the ADVENTUREWORKS\AnthonyFrizzell user and the
ADVENTUREWORKS\Database_Managers local group, and a SQL Server login named
Web_Application.
13. Click Execute. Then, when the script has completed successfully, refresh the Logins folder in Object
Explorer and verify that the logins have been created.
1. In Object Explorer, expand Server Roles and view the server roles that are defined on this instance.
2. Right-click the serveradmin fixed server-level role and click Properties. Then in the Server Role
Properties – serveradmin dialog box, click Add.
3. In the Select Server Login or Role dialog box, click Browse, and in the Browse for Objects dialog
box, select [ADVENTUREWORKS\Database_Managers] and click OK. Then in the Select Server
Login or Role dialog box, click OK.
5. Open the ServerRoles.sql script file in the D:\DemoFiles\Mod09 folder and review the code it
contains, which creates a user-defined server role named AW_securitymanager and adds the
ADVENTUREWORKS\AnthonyFrizzell user to the new role.
6. Click Execute. Then, when the script has completed successfully, refresh the Server Roles folder in
Object Explorer and verify that the role has been created.
7. Right-click the AW_securitymanager role and click Properties, and verify that
ADVENTUREWORKS\AnthonyFrizzell is listed as a member. Then click Cancel.
1. Open the ServerPermissions.sql script file in the D:\DemoFiles\Mod09 folder and review the code it
contains, which grants ALTER ANY LOGIN permission to the AW_securitymanager server role.
2. Click Execute. Then, when the script has completed successfully, in Object Explorer, right-click the
AW_securitymanager role and click Properties.
3. In the Server Role Properties - AW_securitymanager dialog box, on the General tab, view the
selected securables. Then click Cancel.
7. In the Permissions for Payroll_Application list, on the Explicit tab, note that no explicit permissions
on this login have been granted to ADVENTUREWORKS\AnthonyFrizzell. Then click the Effective
tab and note that the ALTER permission has been inherited through membership of the
AW_securitymanager role.
9. Leave SQL Server Management Studio open for the next demonstration.
Lesson 3
Managing Database-Level Principals
After creating logins, you must provide each login with access to at least one database before users can work with data. Generally, you only need to enable logins to access the databases that they need
to work with. You can do this by creating a database user for the login in each database that it must
access.
In this lesson, you will see how to create and manage database-level principals, including database users
and database roles.
Lesson Objectives
After completing this lesson, you will be able to:
Note: The names of Windows logins must be enclosed in square brackets because they
contain a backslash character.
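For example, the following sketch creates three database users (the user and login names are illustrative):

CREATE USER SalesUsers FOR LOGIN [ADVENTUREWORKS\SalesReps]
WITH DEFAULT_SCHEMA = Sales;

CREATE USER DanDrayton FOR LOGIN DanDrayton;

CREATE USER WebApp FOR LOGIN Web_Application;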
Note that the first example includes a default schema. Schemas are namespaces used to organize objects
in the database. If no default schema is specified, the user’s default schema will be the built-in dbo
schema. Note that in the third example, the username is different from the login with which it is associated.
You can remove users from a database by using the DROP USER statement or SSMS. However, you cannot
drop a database user who owns any securable object (for example, tables or views).
To resolve the issue, you need to update the database user to link it to the new login on the server by
using the ALTER USER statement with the WITH LOGIN clause.
This solves the issue but, if you later restore the database on the same or a different server, the problem
will arise again. A better way of dealing with it is to avoid the problem overall by using the WITH SID
clause when you create the login.
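As a sketch (the names and the SID value are illustrative):

-- Remap an orphaned database user to its new login
ALTER USER DanDrayton WITH LOGIN = DanDrayton;

-- Avoid the problem by creating the login with a known SID
CREATE LOGIN DanDrayton
WITH PASSWORD = 'Pa$$w0rd',
SID = 0x241C11948AEEB749B0D22646DB1A19F2;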
dbo User
The dbo user is a special user who has permissions
to perform all activities in the database. Any
member of the sysadmin fixed server role
(including the sa user when using mixed mode
authentication) who uses a database, is mapped to
the special database user called dbo. You cannot delete the dbo database user, and it is always present in every database.
Database Ownership
Like other objects in SQL Server, databases also have owners, which are mapped to the dbo user.
The following example shows how you can modify the owner of a database (here, a hypothetical database named SalesDB) by using the ALTER AUTHORIZATION statement:
ALTER AUTHORIZATION ON DATABASE::SalesDB
TO [ADVENTUREWORKS\Database_Managers];
Any schema created by a login mapped to the dbo user will automatically have dbo as its owner. By
default, objects within a schema have their owner set to NULL and inherit the owner of the schema in
which they are defined. Owners of objects have full access to the objects and do not require explicit
permissions before they can perform operations on those objects.
guest User
The guest user account enables logins that are not mapped to a database user in a particular database to
gain access to that database. Login accounts assume the identity of the guest user when the following
conditions are met:
• The login has access to SQL Server but not the database, through its own database user mapping.
You can add the guest account to a database to enable anyone with a valid SQL Server login to access it.
The guest user is automatically a member of the public role. (Database roles were discussed earlier in this lesson.)
When a login attempts to access a database, the following checks occur:
• SQL Server checks to see whether the login that is trying to access the database is mapped to a database user in that database. If it is, SQL Server grants the login access to the database as that database user.
• If the login is not mapped to a database user, SQL Server then checks to see whether the guest
database user is enabled. If it is, the login is granted access to the database as guest. If the guest
account is not enabled, SQL Server denies access to the database for that login.
You cannot drop the guest user from a database, but you can prevent it from accessing the database by
using the REVOKE CONNECT statement. Conversely, you can enable the guest account by using the
GRANT CONNECT statement.
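For example (the database name is illustrative):

USE SalesDB;
REVOKE CONNECT FROM guest;  -- prevent guest access to the database
-- ...or re-enable guest access:
GRANT CONNECT TO guest;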
Note: By default, the guest user is enabled in the master, msdb, and tempdb databases.
You should not try to revoke the guest access in these databases.
Fixed Database-Level Role | Description
db_ddladmin | Members can run any data definition language (DDL) Transact-SQL commands in the database. DDL is the portion of the Transact-SQL language that deals with creating, altering, and deleting database and SQL Server objects.
In the following code example, the WebApp user is added to the product_reader role:
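ALTER ROLE product_reader ADD MEMBER WebApp;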
To remove a role member, use the ALTER ROLE statement with the DROP MEMBER clause.
To view membership of fixed database-level roles and user-defined database roles, you can query the sys.database_role_members system view.
Demonstration Steps
Create Database Users
1. Ensure that you have completed the previous demonstration in this module.
2. In SQL Server Management Studio, in Object Explorer, expand Databases, expand the
AdventureWorks database, and expand its Security folder. Then expand the Users folder and view
the users currently defined in the database.
3. Right-click Users and click New User. Then, in the Database user – New dialog box, enter the user
name Web_Application, the login name Web_Application, and the default schema Sales; and click
OK.
4. Open the CreateUsers.sql script file in the D:\DemoFiles\Mod09 folder and review the code it
contains, which creates users for the Payroll_Application,
ADVENTUREWORKS\HumanResources_Users, and ADVENTUREWORKS\AnthonyFrizzell logins.
5. Click Execute. Then, when the script has completed successfully, refresh the Users folder in Object
Explorer and verify that the users have been created.
2. Right-click the db_datareader role and click Properties. This is a fixed database-level role.
3. In the Database Role Properties – db_datareader dialog box, click Add. In the Select Database
User or Role dialog box, enter AnthonyFrizzell and click OK. Then verify that AnthonyFrizzell is listed and click OK.
4. Right-click Database Roles and click New Database Role.
5. Enter the role name hr_reader, and click Add. In the Select Database User or Role dialog box, enter
HumanResources_Users; Payroll_Application and click OK. Then verify that
HumanResources_Users and Payroll_Application are listed and click OK.
6. Open the DatabaseRoles.sql script file in the D:\DemoFiles\Mod09 folder and review the code it
contains, which creates roles named hr_writer and web_customer, and adds the
HumanResources_Users to the hr_writer role and Web_Application to the web_customer role.
7. Click Execute. Then, when the script has completed successfully, refresh the Database Roles folder in
Object Explorer and verify that the roles have been created.
8. Keep SQL Server Management Studio open for the next demonstration.
After you have created an application role and assigned it the required permissions, it can be activated by
executing the sp_setapprole stored procedure.
The following code example shows how to activate an application role:
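-- The role name and password are illustrative
EXEC sp_setapprole @rolename = 'pay_admin', @password = 'Pa$$w0rd';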
An application role remains active until the user disconnects from SQL Server or it is deactivated by using
the sp_unsetapprole stored procedure. However, to use sp_unsetapprole, you must specify a cookie that
was generated when the application role was activated.
The following code example shows how to create a cookie when activating an application role, and how
to use the cookie to deactivate the application role when it is no longer required:
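-- The role name and password are illustrative
DECLARE @cookie varbinary(8000);
EXEC sp_setapprole @rolename = 'pay_admin',
    @password = 'Pa$$w0rd',
    @fCreateCookie = true,
    @cookie = @cookie OUTPUT;

-- ...perform work in the application role's security context...

EXEC sp_unsetapprole @cookie;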
When the application has completed the operation for which the application role’s permissions are
required, it can deactivate the role and revert to the current user’s security context by executing the
sp_unsetapprole stored procedure.
Demonstration Steps
Create an Application Role
1. Ensure that you have completed the previous demonstration in this module.
2. In SQL Server Management Studio, under the Roles folder for the AdventureWorks database, right-
click Application Roles and click New Application Role.
3. In the Application Role – New dialog box, enter the role name pay_admin, enter the default
schema HumanResources, enter and confirm the password Pa$$w0rd, and click OK.
1. Open the ApplicationRole.sql script file in the D:\DemoFiles\Mod09 folder. The code in this file
displays the identity of the current user and login before, during, and after the activation of the
pay_admin application role.
2. Right-click anywhere in the script window, point to Connection, and click Change Connection. Then
connect to the MIA-SQL database engine using SQL Server authentication as Payroll_Application
with the password Pa$$w0rd.
3. Click Execute and view the results. Note that the System Identity does not change (which may be
important for auditing reasons), but that the DB Identity switched to pay_admin while the
application role was active.
4. Close the ApplicationRole.sql query pane, but keep SQL Server Management Studio open for the
next demonstration.
• When a database is in development and the developer does not know which instance will ultimately
host the database.
• When a database that participates in an AlwaysOn availability group is mirrored on multiple server
instances, and it is useful to be able to fail over to a secondary instance without having to synchronize
server-level logins required to access the database.
A contained database is a database that is hosted on an instance of SQL Server, but which has no
dependencies on the server instance. Because there are no dependencies, you can move the database
between servers, or use it in availability group scenarios without having to consider external factors, such
as logins.
Note: To learn about AlwaysOn Availability Groups and other high-availability techniques,
attend course 20465C: Designing a Data Solution with Microsoft SQL Server.
• Contained databases store the metadata that defines the database. This information is usually stored
only in the master database, but in a contained database it is also stored in the contained database
itself.
Contained Users
After creating a contained database, you can create contained users for that database. These users can be
one of two types:
• Users with associated password. These users are authenticated by the database.
• Users that are mapped to Windows user accounts. These users exist only in the database with no
associated server-level login, and do not require the user to maintain a separate password. Instead of
performing its own authentication, the database trusts Windows authentication.
You can create a contained user by using the CREATE USER statement in the context of a contained
database.
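For example, the following sketch creates one user of each type (the names are assumptions, chosen to match those used in the demonstration that follows):

USE ContainedDB;

-- A contained user with an associated password, authenticated by the database
CREATE USER SalesApp WITH PASSWORD = 'Pa$$w0rd';

-- A contained user mapped to a Windows account
CREATE USER [ADVENTUREWORKS\AnthonyFrizzell];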
Demonstration Steps
Create a Contained Database
1. Ensure that you have completed the previous demonstrations in this module.
2. In SQL Server Management Studio, open the ContainedDatabase.sql script file in the
D:\Demofiles\Mod09 folder.
3. Select the code under the comment Enable contained databases and click Execute. This code
configures the server options to enable contained databases.
4. Select and execute the code under the comment Create a contained database. This code creates a
database named ContainedDB with a CONTAINMENT setting of PARTIAL.
5. In Object Explorer, under MIA-SQL, refresh the Databases folder and verify that ContainedDB is
listed.
6. Right-click ContainedDB and click Properties. Then, in the Database Properties – ContainedDB
dialog box, on the Options tab, note that the Containment type is set to Partial, and click Cancel.
1. In the query window, select and execute the code under the comment Create contained users. This
code creates two users in the ContainedDB database: A SQL Server user with a password, and a
Windows user.
2. In Object Explorer, expand the ContainedDB database, expand Security, and expand Users. Note
that the two contained users you created are listed.
3. In Object Explorer, under the server-level Security folder, refresh the Logins folder. Note that there
are no logins for the users you created in the contained database.
4. Right-click anywhere in the query window, point to Connection, and click Change Connection.
5. In the Connect to Database Engine dialog box, ensure MIA-SQL is selected, in the Authentication
drop-down list, select SQL Server Authentication, enter the login SalesApp and the password
Pa$$w0rd, and then click Options.
6. On the Connection Properties tab, in the Connect to database box, ensure <default> is selected
and click Connect. An error occurs because there is no login named SalesApp. In the Connect
to Database Engine window, click OK.
7. In the Connect to database box, type ContainedDB. Then click Connect. This connection succeeds
because the user is defined in the database.
8. Close the ContainedDatabase.sql query window, but keep SQL Server Management Studio open for
the next demonstration.
Lesson 4
Managing Database Permissions
After you have enabled access to a database by creating users, and organized users into roles, you can
apply permissions to control how users access data and perform tasks in the database.
The fixed database-level roles provided with SQL Server already have some pre-defined permissions, and
it’s possible that you may be able to implement the security you need using only membership of these
roles. However, most databases have more fine-grained security requirements than the fixed database-
level roles alone provide. You should endeavor to use database roles to group users and minimize the
number of individual explicit permissions you need to assign in order to secure the database.
Lesson Objectives
After completing this lesson, you will be able to:
Database-Level Permissions
Similarly to the server level, permissions in a
database can be statement permissions or object
permissions. You manage both of these kinds of
permission by using the GRANT, DENY, and
REVOKE statements as discussed previously in this
module.
Statement permissions at the database level
generally govern data definition language (DDL)
tasks, such as creating or altering users or roles.
At the database level, you can configure permissions on the following securables:
• Users
• Database roles
• Application roles
• Certificates
• Asymmetric keys
• Symmetric keys
• Schemas
Additionally, you can configure permissions on database objects that are contained within schemas, such
as:
• Tables
• Functions
• Stored procedures
• Views
• Indexes
• Constraints
You can use permissions to allow DDL operations on specific securables. In the following code example, the sales_admin database role is granted permission to alter the sales_supervisor application role:
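GRANT ALTER ON APPLICATION ROLE::sales_supervisor TO sales_admin;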
You can also use permissions to allow data manipulation language (DML) operations on database objects.
The following example grants SELECT permission on the dbo.ProductCategory and dbo.Product tables
to the product_reader database role:
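GRANT SELECT ON OBJECT::dbo.ProductCategory TO product_reader;
GRANT SELECT ON OBJECT::dbo.Product TO product_reader;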
Note that for database objects that belong to a schema, the object type prefix OBJECT:: can be used, but
this prefix is optional.
Schemas
Schemas are naming and security boundaries within
a database that contain database objects such as:
• Tables
• Functions
• Stored procedures
• Views
• Indexes
• Constraints
Creating a Schema
CREATE SCHEMA sales;
There are a few built-in schemas in SQL Server. The dbo and guest users have associated schemas of their own names. The sys and INFORMATION_SCHEMA schemas are reserved for system objects; you cannot drop these schemas or create objects in them. You can return a list of all the schemas in a database by querying the sys.schemas view.
Although it is best practice to explicitly state at least the schema and object name when referencing a
database object, you can specify only the object name (for example Product) and rely on SQL Server to
resolve the name to the correct object. When you create a user, you can optionally specify a default
schema for that user. When a user executes Transact-SQL code that references an unqualified object
name, SQL Server first tries to resolve the object name in the user’s default schema, and if it is not found
there SQL Server tries to find it in the dbo schema.
Note: A database can potentially contain multiple objects with the same unqualified name
(for example, production.product, sales.product, and dbo.product). For this reason, you should
use explicit two-part schema.object names, three-part database.schema.object names, or fully-
qualified server.database.schema.object names when referencing objects.
The following code example grants INSERT permission on the sales schema to the sales_writer database
role. Members of this role will implicitly be granted INSERT permission on all tables and views in the sales
schema:
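GRANT INSERT ON SCHEMA::sales TO sales_writer;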
• DELETE. Principals need this permission to remove rows from a table or view by using the DELETE
statement.
• REFERENCES. Principals need this permission in order to create a foreign-key relationship to a table if
they have no other permissions on the table.
Note: In addition to these DML-related permissions, tables and views have DDL and
administrative permissions, such as ALTER, CONTROL, TAKE OWNERSHIP, and VIEW DEFINITION.
You can view the effective permissions that selected users and roles have on a specific object by viewing
the Permissions page in the Properties dialog box for that object in SSMS. Alternatively, you can view
the effective permissions on selected objects that a specific database principal has by viewing the
Securables tab of the Properties dialog box for that principal.
Column-Level Permissions
In addition to assigning permissions at table or view level, you can also allocate column-level permissions.
This provides a more granular level of security for data in your database.
You do not need to execute separate GRANT or DENY statements for every column where you wish to
assign permissions. Where a set of columns needs to be controlled in the same way, you can provide a list
of columns in a single GRANT or DENY statement.
If you execute a DENY statement at table level for a user, and then execute a GRANT statement at column
level, the DENY permission is removed and the user can access the columns to which you grant access.
However, if you then execute the table-level DENY statement again, the user is denied all permissions on
the table, including on the columns to which they previously had access.
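A sketch of this sequence (the table, column, and user names are illustrative):

DENY SELECT ON Customers.Customer TO WebApp;

-- The column-level GRANT removes the table-level DENY for the listed columns
GRANT SELECT (FirstName, LastName) ON Customers.Customer TO WebApp;

-- Re-issuing the table-level DENY removes the column-level access again
DENY SELECT ON Customers.Customer TO WebApp;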
Stored Procedures
By default, users cannot execute stored procedures
that other users create unless you grant them the
EXECUTE permission on the stored procedure. In
addition, they may also need permissions to access
the objects that the stored procedure uses. You will
discover more about this issue later in the lesson.
In the following example, the web_customer role is granted execute permission on the
sales.insert_order stored procedure:
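GRANT EXECUTE ON sales.insert_order TO web_customer;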
User-Defined Functions
You also need to assign users permissions to execute user-defined functions (UDFs). The permissions that
you need to assign depend on the type of UDF you are working with.
• Scalar UDFs return a single value. Users accessing these functions require EXECUTE permission on the
UDF.
• Table-valued UDFs (TVFs) return a table of results rather than a single value. Accessing a TVF requires
SELECT permission rather than EXECUTE permission, similar to the permissions on a table.
• It is uncommon to directly update a TVF. It is possible, however, to assign INSERT, UPDATE, and
DELETE permissions on one form of TVF known as an inline TVF—this particular form can be updated
in some cases.
In addition to these permissions, there are scenarios where you also need to assign the REFERENCES
permission to users so that they can correctly execute a UDF. These scenarios include functions which:
• Are used in CHECK constraints.
The following example shows how to grant execute permissions to the web_customer role on the
dbo.calculate_tax function:
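GRANT EXECUTE ON dbo.calculate_tax TO web_customer;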
Managed Code
Managed code is .NET Framework code that ships in assemblies. Assemblies can exist as DLL or EXE files;
however, you can only load assemblies in DLL files in SQL Server by using SQL Server CLR integration.
Assemblies are registered in a SQL Server database by using the CREATE ASSEMBLY statement.
After you load an assembly, the procedures, functions, and other managed code objects appear as
standard objects in SQL Server and the standard object permissions apply. For example, users require
EXECUTE permissions to run a stored procedure, whether it originates in an assembly or in Transact-SQL
code.
Permission Sets
No matter what .NET Framework code is included in an assembly, the actions the code can execute are
determined by the permission set specified when creating the assembly.
• The SAFE permission set strictly limits the actions that the assembly can perform and inhibits it from
accessing external system resources. Code using this permission set can access the local instance of
SQL Server by using a direct access path, called a context connection. The SAFE permission set is the
default.
• The EXTERNAL_ACCESS permission set allows the code to access local and network resources,
environment variables, and the registry. EXTERNAL_ACCESS is even necessary for accessing the same
SQL Server instance if a connection is made through a network interface.
• The UNSAFE permission set relaxes many standard controls over code and you should avoid using it.
Note that this permission set is listed as Unrestricted in the SQL Server Management Studio user
interface.
The EXTERNAL_ACCESS and UNSAFE permission sets require additional setup. It is not sufficient simply to specify the EXTERNAL_ACCESS permission set when executing the CREATE ASSEMBLY statement. You also need to flag the database as TRUSTWORTHY (which is easy, but not recommended), or create an asymmetric key from the assembly file in the master database, create a login that maps to the key, and grant that login the EXTERNAL ACCESS ASSEMBLY permission.
Ownership Chains
All database objects have owners. By default, the
principal_id (owner) property is set to NULL for new objects, and the owner of a schema automatically owns schema-scoped objects. The best practice is to have all objects owned by the schema owner; an object with a NULL principal_id property therefore inherits its ownership from the schema in which it is contained.
Having the same owner for all objects in a schema (which itself also has an owner) simplifies permission
management, but it is still important to understand that ownership chain problems can occur and how to
resolve them.
Ownership chaining applies to stored procedures, views, and functions. The slide shows an example of
how ownership chaining applies to views or stored procedures.
1. User2 creates a table.
2. User2 creates a view that accesses the table and grants User1 permission to access the view. Access is
granted as User2 is the owner of both the top level object (the view) and the underlying object (the
table).
3. User2 then creates a view that accesses a table owned by User3. Even if User2 has permission to
access the table and grants User1 permission to use the view, User1 will be denied access because of
the broken chain of ownership from the top level object (the view) to the underlying object (the
table).
4. However, if User3 grants User1 permissions directly on the underlying table, User1 can then access the view that User2 created to access that table.
• Set permissions.
Demonstration Steps
Set Permissions
1. Ensure that you have completed the previous demonstrations in this module.
2. In SQL Server Management Studio, open the DatabasePermissions.sql script file in the
D:\Demofiles\Mod09 folder.
3. Select the code under the comment Grant schema permissions and click Execute. This code grants
SELECT permission on the HumanResources schema to the hr_reader database role, and INSERT,
UPDATE, and EXECUTE permission on the HumanResources schema to the hr_writer database role.
4. Select the code under the comment Grant individual object permissions and click Execute. This
code grants EXECUTE permission on the dbo.uspGetEmployeeManagers stored procedure to the
hr_reader database role; INSERT permission on the Sales.SalesOrderHeader and
Sales.SalesOrderDetail tables to the web_customer database role; and SELECT permission on the
Production.vProductAndDescription view to the web_customer database role.
5. Select the code under the comment Override inherited permissions and click Execute. This code grants INSERT and UPDATE permission on the Sales schema to the AnthonyFrizzell user,
grants UPDATE permission on the HumanResources.EmployeePayHistory table to the
[Payroll_Application] user; grants UPDATE permission on the SalariedFlag column in the
HumanResources.Employee table to the [Payroll_Application] user; and denies SELECT on the
HumanResources.EmployeePayHistory table to the AnthonyFrizzell user.
2. In the Table Properties – Employee dialog box, on the Permissions tab, note that the
[Payroll_Application] user has been explicitly granted Update permission.
3. With the Payroll_Application user selected, view the permissions in the Effective tab, and note that
this user has SELECT permission on the table, and UPDATE permission on the SalariedFlag column.
The SELECT permission has been implicitly granted through membership of the hr_reader database
role, which has inherited SELECT permission from permissions on the parent schema. The UPDATE
permission was granted explicitly.
4. In the Table Properties – Employee dialog box, click Cancel. Then close SQL Server Management
Studio.
• Sales managers.
• ADVENTUREWORKS\Database_Managers:
o ADVENTUREWORKS\IT_Support
• ADVENTUREWORKS\InternetSales_Users:
o ADVENTUREWORKS\Sales_Asia
o ADVENTUREWORKS\Sales_Europe
o ADVENTUREWORKS\Sales_NorthAmerica
• ADVENTUREWORKS\InternetSales_Managers:
o ADVENTUREWORKS\Sales_Managers
The main tasks for this exercise are as follows:
3. Create Logins
2. Ensure that the authentication mode for the MIA-SQL SQL Server instance is set appropriately to
support the requirements.
o Any SQL Server logins should use the password Pa$$w0rd and should be subject to password
policy restrictions. However, their passwords should not expire and they should not be required
to change the password when they next log in.
Note: A suggested solution for this exercise is provided in the CreateLogins.sql file in the
D:\Labfiles\Lab09\Solution folder.
2. Create any required user-defined server-level roles, add logins to server-level roles, and grant
appropriate permissions to meet the requirements.
Note: A suggested solution for this exercise is provided in the ServerRoles.sql file in the
D:\Labfiles\Lab09\Solution folder.
Results: After this exercise, the authentication mode for the MIA-SQL SQL Server instance should support
the scenario requirements, you should have created the required logins and server-level roles, and you
should have granted the required server-level permissions.
• dbo schema:
o System objects
• Sales schema:
o SalesOrderHeader table
o SalesOrderDetail table
• Products schema:
o Product table
o ProductSubcategory table
o ProductCategory table
o vProductCatalog view
• Customers schema:
o Customer table
• All sales employees and managers must be able to read all data in the Sales schema.
• Sales managers must be able to insert and update any data in the Sales schema.
• Sales managers must be able to execute any stored procedures in the Sales schema.
• All sales employees, sales managers, and the marketing application must be able to read all data in
the Customers schema.
• The e-commerce application must be able to read data from the Products.vProductCatalog view.
• The e-commerce application must be able to insert rows into the Sales.SalesOrderHeader and
Sales.SalesOrderDetail tables.
• Sales managers must be able to read all data in the Products schema.
• Sales managers must be able to execute any stored procedures in the Products schema.
• The marketing application must be able to read any data in the Products schema.
• Data in the Sales schema must only be deleted by an application that has elevated privileges based
on an additional password. The elevated privileges must enable the application to read, insert,
update, and delete data as well as execute stored procedures in the Sales schema.
3. Assign Permissions
2. Create the required database users in the InternetSales database. Use the following default schemas:
Note: A suggested solution for this exercise is provided in the CreateUsers.sql file in the
D:\Labfiles\Lab09\Solution folder.
Note: A suggested solution for this exercise is provided in the DatabaseRoles.sql file in the
D:\Labfiles\Lab09\Solution folder.
2. Apply the required permissions, granting the minimum number of explicit permissions possible while
ensuring that users have only the privileges they require.
Note: A suggested solution for this exercise is provided in the DatabasePermissions.sql file in the
D:\Labfiles\Lab09\Solution folder.
Results: After this exercise, you should have created the required database users and database-level roles,
and assigned appropriate permissions.
3. In the SQLCMD window, enter the following commands to verify your identity:
SELECT suser_name();
GO
4. Note that SQL Server identifies a user who connects through a Windows group login by their individual user account, even though there is no individual login for that user. ADVENTUREWORKS\AnthonyFrizzell is a member
of the ADVENTUREWORKS\IT_Support global group, which is in turn a member of the
ADVENTUREWORKS\Database_Managers domain local group.
5. In the SQLCMD window, execute an ALTER LOGIN to change the password of the login for the
marketing application. Your code should look similar to this:
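A minimal sketch, assuming the marketing application login is named MarketingApplication; the new password shown is illustrative:
ALTER LOGIN MarketingApplication WITH PASSWORD = 'NewPa$$w0rd';
GO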
6. In the SQLCMD window, enter an ALTER LOGIN command to disable the login for the e-commerce
web application. Your command should look like this:
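A minimal sketch, assuming the e-commerce application login is named ECommerceApplication:
ALTER LOGIN ECommerceApplication DISABLE;
GO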
7. Close the SQLCMD window and maximize SQL Server Management Studio.
8. In SQL Server Management Studio, view the properties of the e-commerce application login and verify that the login is disabled. Then re-enable it.
2. Use the EXECUTE AS Transact-SQL statement to impersonate the login for the marketing application,
and use the suser_name function to verify that the connection has changed security identity context.
Your code should look similar to this:
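A minimal sketch, assuming the marketing application login is named MarketingApplication (substitute the login name you created in the earlier exercise):
EXECUTE AS LOGIN = 'MarketingApplication';
SELECT suser_name();
GO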
3. Use the sys.fn_my_permissions function to view the effective permissions of the marketing application on the Customers.Customer table. For example:
USE InternetSales;
SELECT * FROM sys.fn_my_permissions('Customers.Customer', 'object');
GO
4. Execute a SELECT statement to verify that the marketing application can query the
Customers.Customer table. For example:
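Any simple query against the table will do, for example:
SELECT * FROM Customers.Customer;
GO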
5. Execute an UPDATE statement and verify that the marketing application cannot update the
Customers.Customer table. For example:
UPDATE Customers.Customer
SET EmailAddress = NULL
WHERE CustomerID = 1;
GO
6. Execute a SELECT statement to verify that the marketing application can query the Products.Product
table. For example:
7. Execute a SELECT statement to verify that the marketing application cannot query the
Sales.SalesOrderHeader table. For example:
3. Verify that you can query the Product.vProductCatalog view. For example, execute the following
query:
4. Verify that you cannot query the Products.Product table. For example, execute the following query:
3. Verify that you can query the Sales.SalesOrderHeader table. For example, execute the following
query:
4. Verify that you cannot update the Sales.SalesOrderHeader table. For example, execute the following
query:
3. Verify that you can query the Sales.SalesOrderHeader table. For example, execute the following
query:
4. Verify that you can update the Sales.SalesOrderHeader table. For example, execute the following
query:
5. Verify that you cannot update the Product.Product table. For example, execute the following query:
6. Verify that you can use the Products.ChangeProductPrice stored procedure to update the
Product.Product table. For example, execute the following query:
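A sketch only; the parameter names here are hypothetical, so check the procedure definition in the lab database for its actual signature:
EXEC Products.ChangeProductPrice @ProductID = 1, @NewPrice = 99.99;
GO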
7. Verify that you cannot delete data from the Sales.SalesOrderDetail table. For example, execute the
following query:
8. If you created an application role to enable deletions of sales data, test it by using code like this:
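A sketch, assuming you named the application role sales_admin with the password Pa$$w0rd, and using an arbitrary order ID (substitute the names and values from your solution):
EXEC sp_setapprole @rolename = 'sales_admin', @password = 'Pa$$w0rd';
DELETE FROM Sales.SalesOrderDetail WHERE SalesOrderID = 1;
GO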
Results: After this exercise, you should have verified effective permissions in the MIA-SQL instance and
the InternetSales database.
Question: What sort of login would be required for a user in a Windows domain that is not
trusted by the domain in which SQL Server is installed?
Best Practice: When implementing security in SQL Server, consider the following best
practices:
• Disable logins rather than dropping them if there is any chance that they will be needed again.
• Ensure that expiry dates are applied to logins that are created for temporary purposes.
• Use fixed server-level roles to delegate server-level management responsibility, and only create user-
defined server-level roles if your specific administrative delegation solution requires them.
• Disable the guest user in user databases unless you specifically require guest access.
• Aim to grant the minimum number of explicit permissions possible to meet the security requirements,
and use membership of roles and inheritance to ensure the correct effective permissions.
• Ensure every user has only the permission they actually require.
Review Question(s)
Question: Your organization needs to track data access by individual Windows users. Does
this mean you cannot base logins on Windows groups?
Module 10
Auditing Data Access and Encrypting Data
Contents:
Module Overview
Module Overview
When configuring security for your Microsoft® SQL Server® systems, you need to ensure that you meet
any of your organization’s compliance requirements for data protection. Organizations often need to
adhere to industry-specific compliance policies, which mandate auditing of all data access. To address this
requirement, SQL Server provides a range of options for implementing auditing. Another common
compliance requirement is the encryption of data to protect against unauthorized data access in the
event that access to the database files themselves is compromised. SQL Server supports this requirement
by providing transparent data encryption (TDE).
This module describes the available options for auditing in SQL Server, how to use and manage the SQL
Server audit feature, and how to implement encryption.
Objectives
After completing this module, you will be able to:
Lesson 1
Auditing Data Access in SQL Server
SQL Server provides a variety of tools that you can use to audit data access. In general, no single tool meets all possible auditing requirements, so a combination of features often needs to be used.
In this lesson, you will learn about the auditing options available.
Lesson Objectives
After completing this lesson, you will be able to:
The common criteria compliance enabled option is available in the Enterprise edition of SQL Server for production use. (It is also available in the Developer and Evaluation editions for non-production use.) In addition to
enabling the common criteria compliance enabled option, you must also download and run a script
that finishes configuring SQL Server to comply with Common Criteria Evaluation Assurance Level 4+
(EAL4+). You can download this script from the Microsoft SQL Server website.
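You enable the option itself with sp_configure; the instance must then be restarted for the setting to take effect:
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'common criteria compliance enabled', 1;
RECONFIGURE;
GO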
When the option is enabled and the script is run, three changes occur to how SQL Server operates:
• Residual Information Protection (RIP). Memory is always overwritten with a known bit pattern before being reused.
• Login auditing is enabled. Information about the most recent successful and unsuccessful login attempts is recorded, and can be viewed in the sys.dm_exec_sessions view.
• Column GRANT does not override table DENY. This changes the default behavior of the permission system.
Note: The implementation of RIP increases security, but can negatively impact the
performance of the system.
SQL Trace
Many users attempt to use SQL Server Profiler for auditing because it enables tracing of the commands sent to SQL Server, as well as of the errors returned. However, SQL Server Profiler can have a significant negative performance impact when it is run interactively on production systems.
An alternative is to use SQL Trace, which is a set of
system stored procedures that SQL Server Profiler
can utilize. Executing these procedures to manage
tracing offers a much more lightweight method of
tracing, particularly when the events are well-
filtered. SQL Trace can then have a role in
auditing—because it can capture commands that are sent to the server, you can use it to audit those
commands.
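The following sketch shows the general pattern; the trace file path is an assumption (sp_trace_create appends .trc to the name you supply):
-- Create a stopped trace that writes to a file
DECLARE @TraceID int;
EXEC sp_trace_create @TraceID OUTPUT, 0, N'D:\Traces\AuditTrace';
-- Capture the TextData column (1) for the Audit Login event (14)
EXEC sp_trace_setevent @TraceID, 14, 1, 1;
-- Start the trace
EXEC sp_trace_setstatus @TraceID, 1;
GO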
SQL Trace uses a server-side tracing mechanism to guarantee that no events are lost, as long as there is space available on the disk and no write errors occur. If the disk fills or write errors occur, the trace stops; SQL Server itself continues to run unless C2 audit mode is also enabled, in which case the instance shuts down. The possibility of missing events needs to be considered when evaluating the use of SQL Trace for auditing purposes.
DML Triggers
Triggers can play an important role in auditing. SQL Server supports a variety of trigger types, including data manipulation language (DML) triggers, which run when a user modifies data, and logon triggers, which enable you to track details of logons and roll back logons based on business or administrative logic.
-- Trigger header reconstructed for completeness; the trigger and table
-- names (TR_Employee_SalaryAudit, dbo.Employee) are illustrative
CREATE TRIGGER TR_Employee_SalaryAudit ON dbo.Employee
AFTER UPDATE
AS
BEGIN
    IF UPDATE(Salary)
    BEGIN
        INSERT dbo.EmployeeSalaryAudit (EmployeeID, OldSalary, NewSalary, UpdatedBy, UpdatedAt)
        SELECT i.EmployeeID, d.Salary, i.Salary, suser_name(), getdate()
        FROM inserted AS i
        INNER JOIN deleted AS d
            ON i.EmployeeID = d.EmployeeID;
    END;
END;
GO
• System performance can be significantly impacted by triggers running alongside the usual load on
the server.
• Users with appropriate permissions can disable triggers. This can cause a significant issue for auditing
requirements.
• You cannot create triggers that run in response to a SELECT statement.
• Triggers have a nesting limit of 32 levels, beyond which they do not work.
• Only limited ability to control trigger-firing order is provided. To make sure that it captures all the
changes made by other triggers, auditing would normally need to be the last trigger that fires, which
you can only specify by using the sp_settriggerorder system procedure.
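For example, using the illustrative audit trigger shown earlier in this lesson:
EXEC sp_settriggerorder
    @triggername = N'TR_Employee_SalaryAudit',
    @order = N'Last',
    @stmttype = N'UPDATE';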
Demonstration Steps
Create a DML Trigger for Auditing
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to
20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
3. Start SQL Server Management Studio and connect to the MIA-SQL database engine instance using
Windows authentication.
5. Select the code under the comment Create a log table, and click Execute. This creates a table
named AuditRateChange in the HumanResources schema of the AdventureWorks database.
6. Select the code under the comment Create a trigger, and click Execute. This creates a trigger on the
EmployeePayHistory table that fires on updates. When the Rate column is updated, a row is inserted
into the AuditRateChange table.
7. Select the code under the comment Update a rate, and click Execute. This updates a rate in the
EmployeePayHistory table.
8. Select the code under the comment View the audit log, and click Execute. This retrieves the logged
details from the AuditRateChange table.
9. Keep SQL Server Management Studio open for the next demonstration.
Extended Events
SQL Server audit is based on an eventing engine called Extended Events.
A wide variety of events occur within the SQL Server database engine. For example, when a user executes
a query, the database engine may need to request additional memory or check permissions before the
actual query is allowed to run. SQL Server provides the Extended Events feature, which enables you to define the actions that SQL Server should take when events occur. When SQL Server executes its internal code, it
checks to see if a user has defined an action that should be taken at that point in the code. If they have,
SQL Server fires an event and sends details to a target location. Targets can be operating system files,
memory-based ring buffers, or Windows® event logs.
Extended Events is a lightweight eventing engine that has very little performance impact on the database
engine that it is monitoring. You can use Extended Events for many purposes where you may previously
have used SQL Trace.
Extended Events are important because SQL Server Audit is based on the Extended Events infrastructure.
The eventing engine that Extended Events provides is not tied to particular types of events—the engine is
written in such a way that it can process any type of event.
Configurations of Extended Events ship in .exe or .dll files called packages. Packages are the unit of
deployment and installation for Extended Events and contain all the objects that are part of a particular
Extended Events configuration. SQL Server audit is a special package within Extended Events so you
cannot change its internal configuration.
Extended Events uses specific terminology for the objects that it uses, as described in the following table:
Object Description
Targets Places to which event data is sent, such as operating system files.
Actions Responses that SQL Server can make to an event (for example,
capturing execution plans to include in a trace).
Predicates Dynamic filters that SQL Server applies to the event capture.
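Although you cannot change the SQL Server audit package, you can create your own event sessions on the same engine. A minimal sketch, in which the session name and file path are assumptions:
CREATE EVENT SESSION CaptureStatements ON SERVER
ADD EVENT sqlserver.sql_statement_completed
ADD TARGET package0.event_file (SET filename = N'D:\XE\CaptureStatements.xel');
GO
ALTER EVENT SESSION CaptureStatements ON SERVER STATE = START;
GO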
Lesson 2
Implementing SQL Server Audit
Preparing SQL Server audit for use requires that you configure a number of objects before you can create
and run your audit, and then view the results. In this lesson, you will learn how to configure SQL Server
audit, and how to create and use audits.
Lesson Objectives
After completing this lesson, you will be able to:
• Create audits.
• Actions and Action Groups. The events that can be included in an audit specification are based on
pre-defined actions, which are grouped into action groups. SQL Server provides a comprehensive set
of server-level action groups and database-level action groups, and you can audit user-defined
actions by adding a user-defined action group to an audit specification at the server and database
levels. Additionally, there are audit-level action groups that track changes to the auditing
configuration itself. This ensures that when an administrator disables auditing, a record of this change
is logged.
Creating an Audit
You can create audits by using the Transact-SQL CREATE SERVER AUDIT statement or SQL Server Management Studio (SSMS). There are several options you can configure, the key ones being listed in the following table:
Option Description
Note: The value you configure for the queue delay needs to be a trade-off between
security and performance. A low value ensures that events are logged quickly and avoids the risk
of losing items from the audit trail in the event of failure, but can result in a significant
performance overhead.
Audit Targets
Audits can be sent to one of the following three targets:
• A file. File output provides the highest performance and is the easiest option to configure.
• Windows Application Event Log. Avoid sending too much detail to this log as network administrators
tend to dislike applications that write too much content to any of the event logs. Do not use this
target for sensitive data because any authenticated user can view the log.
• Windows Security Event Log. This is the most secure option for auditing data, but you need to add the
SQL Server service account to the Generate Security Audits policy before using it.
You should review the contents of the target that you use and archive its contents on a periodic basis.
The following code example creates and enables a server audit that uses a binary file as the target:
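Creating and Enabling a Server Audit
-- A sketch; the target folder is an assumption and must already exist
USE master;
GO
CREATE SERVER AUDIT SecurityAudit
TO FILE (FILEPATH = 'D:\Audits\', MAXSIZE = 100 MB)
WITH (QUEUE_DELAY = 1000, ON_FAILURE = CONTINUE);
GO
ALTER SERVER AUDIT SecurityAudit
WITH (STATE = ON);
GO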
Note: The filename that you provide to the FILEPATH parameter when creating a server
audit is actually a path to a folder. SQL Server generates log files automatically and stores them in
this location.
Note: For a full list of server-level audit action groups, see “SQL Server Audit Action Groups and Actions” in SQL Server Books Online.
The following example shows how to create and enable an audit to track failed and successful login
attempts:
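(A sketch; the folder path is an assumption, and the audit name matches the LoginsAudit file name pattern discussed later in this lesson.)
USE master;
GO
CREATE SERVER AUDIT LoginsAudit
TO FILE (FILEPATH = 'D:\Audits\');
GO
ALTER SERVER AUDIT LoginsAudit WITH (STATE = ON);
GO
CREATE SERVER AUDIT SPECIFICATION LoginsAuditSpec
FOR SERVER AUDIT LoginsAudit
ADD (FAILED_LOGIN_GROUP),
ADD (SUCCESSFUL_LOGIN_GROUP)
WITH (STATE = ON);
GO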
Note: For a full list of database-level audit actions and action groups, see “SQL Server Audit Action Groups and Actions” in SQL Server Books Online.
The following example shows how to create an audit specification that includes all database principal
changes and all SELECT queries on objects in the HumanResources schema by members of the
db_datareader fixed database-level role.
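(A sketch, assuming the SecurityAudit server audit created in the earlier example.)
USE AdventureWorks;
GO
CREATE DATABASE AUDIT SPECIFICATION HRAuditSpec
FOR SERVER AUDIT SecurityAudit
ADD (DATABASE_PRINCIPAL_CHANGE_GROUP),
ADD (SELECT ON SCHEMA::HumanResources BY db_datareader)
WITH (STATE = ON);
GO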
The following example shows how to call the sp_audit_write stored procedure from an insert trigger.
Calling sp_audit_write
CREATE TRIGGER HR.BonusChecker ON HR.EmployeeBonus
AFTER INSERT
AS
DECLARE @bonus money, @empid integer, @msg nvarchar(4000);
-- Body reconstructed; the column names and event id (27) are illustrative
SELECT @bonus = Bonus, @empid = EmployeeID FROM inserted;
SET @msg = N'Bonus ' + CAST(@bonus AS nvarchar(20)) + N' inserted for employee ' + CAST(@empid AS nvarchar(10));
EXEC sys.sp_audit_write 27, 1, @msg;
Note: You must ensure that all principals who may trigger custom audit actions have been
granted EXECUTE permission on the sys.sp_audit_write stored procedure in the master
database. The easiest way to ensure this is to grant EXECUTE permission on sys.sp_audit_write to
public.
You can read file-based audit logs by using the sys.fn_get_audit_file table-valued function, passing a file pattern such as <path>\LoginsAudit_{GUID}, which collects all audit files that have the specified name and GUID pair.
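For example (the folder path is an assumption):
SELECT event_time, server_principal_name, statement
FROM sys.fn_get_audit_file('D:\Audits\LoginsAudit_*.sqlaudit', DEFAULT, DEFAULT);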
The audit records produced by SQL Server need to be in a format that fits in system event logs, as well as
in files. Because of this requirement, the record format is limited in size by the rules related to those event
logging systems. Character fields will be split into 4,000-character chunks that may be spread across a
number of entries. This means that a single event can generate multiple audit entries, and a sequence_no column is provided to indicate the order of the rows.
Disabling an Audit
USE master;
ALTER SERVER AUDIT SecurityAudit
WITH (STATE = OFF);
View Description
sys.dm_server_audit_status Returns one row for each server audit, indicating the
current state of the audit.
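For example:
SELECT name, status_desc, audit_file_path
FROM sys.dm_server_audit_status;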
• Each audit is identified by a GUID. If you restore or attach a database on a server, SQL Server attempts
to match the GUID in the database with the GUID of the audit on the server. If no match occurs,
auditing will not work until you correct the issue by executing the CREATE SERVER AUDIT command
to set the appropriate GUID.
• If databases are attached to editions of SQL Server that do not support the same level of audit
capability, the attach works but the audit is ignored.
• Mirrored servers introduce a similar issue of mismatched GUIDs. The mirror partner must have a server audit with the same GUID. You can create this by using the CREATE SERVER AUDIT command and supplying the GUID value to match the one on the principal server.
• You should consider the performance impact of audit writes and whether you need to minimize your
audit list to maximize performance.
• If disk space fills up, SQL Server may not start. In this situation, you may need to force entry to it by starting SQL Server in minimal configuration mode with the -f startup parameter.
• Create an audit
Demonstration Steps
Create an Audit
1. If you did not complete the previous demonstration, start the 20462C-MIA-DC and 20462C-MIA-SQL
virtual machines, log onto 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password
Pa$$w0rd, and run Setup.cmd in the D:\Demofiles\Mod10 folder as Administrator. Then start SQL
Server Management Studio, and connect to MIA-SQL using Windows authentication.
2. In SQL Server Management Studio, open the Audit.sql script file in the D:\Demofiles\Mod10 folder.
3. Select the code under the comment Create an audit, and click Execute. This creates an audit that
logs events to files in D:\Demofiles\Mod10\Audits.
4. In Object Explorer, expand Security, and expand Audits (if Audits is not expandable, refresh it and
try again).
5. Double-click the AW_Audit audit you created and view its properties. Then click Cancel.
1. In SQL Server Management Studio, in the Audit.sql script, select the code under the comment
Create a server audit specification and click Execute. This creates an audit specification for the
AW_Audit audit that logs failed and successful login attempts.
2. In Object Explorer, refresh the Server Audit Specifications folder and expand it. Then double-click
AW_ServerAuditSpec, view its properties, and click Cancel.
1. In SQL Server Management Studio, in the Audit.sql script, select the code under the comment
Create a database audit specification and click Execute. This creates an audit specification for the
AW_Audit audit that logs specific actions by individual principals on the HumanResources schema
in the AdventureWorks database.
2. In Object Explorer, expand Databases, expand AdventureWorks, expand Security, and expand
Database Audit Specifications (if Database Audit Specifications is not expandable, refresh it and
try again).
1. Open a command prompt and enter the following command to run sqlcmd as
ADVENTUREWORKS\ChadCorbitt. This user is a member of the ADVENTUREWORKS\Personnel
global group, which in turn is a member of the ADVENTUREWORKS\HumanResources_Users
domain local group.
3. In the SQLCMD window, enter the following commands to query the HumanResources.Employee
table:
5. In the D:\Demofiles\Mod10\Audits folder, verify that an audit file has been created.
6. In SQL Server Management Studio, in the Audit.sql script, select the code under the comment View
audited events and click Execute. This queries the files in the audit folder and displays the audited
events (to simplify this demonstration, events logged for the Student user and the service account for
SQL Server have been excluded).
7. Note that all events are logged with the server principal name ADVENTUREWORKS\ChadCorbitt
despite the fact that this user accesses SQL Server through membership of a Windows group and
does not have an individual login.
8. Keep SQL Server Management Studio open for the next demonstration.
Lesson 3
Encrypting Databases
With the focus of most data security being on threats posed by hackers or social engineers, an often
overlooked aspect is the risk of the physical theft of data storage media such as disks and backup tapes.
For this reason, many organizations are obliged by their security compliance policies to protect data by
encrypting it. SQL Server 2014 includes two ways of encrypting data: Transparent Data Encryption (TDE)
and Extensible Key Management (EKM).
This lesson describes the considerations for using these encryption technologies.
Lesson Objectives
After completing this lesson, you will be able to:
• Configure TDE.
Note: When a database is configured to use TDE, CPU utilization for SQL Server may
increase due to the overhead of encrypting and decrypting data pages.
• Service Master Key (SMK). The SMK is created at the time of the installation of the SQL Server
instance by Setup. The SMK encrypts and protects the Database Master Key for the master database.
The SMK is itself encrypted by the Windows operating system Data Protection Application
Programming Interface (DPAPI).
• Database Master Key (DMK). The DMK for the master database is used to generate a certificate in
the master database. SQL Server uses the SMK and a password that you specify to generate the DMK,
and stores it in the master database. Note: You can use a password without the SMK to generate a
DMK, although this is less secure.
• Server Certificate. A server certificate is generated in the master database, and is used to encrypt an
encryption key in each TDE-enabled database.
• Database Encryption Key (DEK). A DEK in the user database is used to encrypt the entire database.
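The first links in this chain are typically created as follows; this is a sketch in which the password is illustrative and the certificate name matches the one used in the lab for this module:
USE master;
GO
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Pa$$w0rd';
GO
CREATE CERTIFICATE TDE_Server_Cert
WITH SUBJECT = 'TDE server certificate';
GO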
You should back up both the server certificate and its associated private key as soon as you create them.
This minimizes the risk of data loss that could occur if you encrypted a database and then lost access to
the certificate and private key.
The following code example shows how to back up the server certificate and its private key.
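Backing Up the Server Certificate
-- A sketch; the file paths and private key password are assumptions
USE master;
GO
BACKUP CERTIFICATE TDE_Server_Cert
TO FILE = 'D:\Backups\TDE_Server_Cert.cer'
WITH PRIVATE KEY (
    FILE = 'D:\Backups\TDE_Server_Cert.pvk',
    ENCRYPTION BY PASSWORD = 'Pa$$w0rd');
GO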
The code example below creates a database encryption key that uses the AES_128 algorithm in the
AdventureWorks database. The key is encrypted using the server certificate created in the previous step.
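Creating a Database Encryption Key (assuming the TDE_Server_Cert certificate from the previous example)
USE AdventureWorks;
GO
CREATE DATABASE ENCRYPTION KEY
WITH ALGORITHM = AES_128
ENCRYPTION BY SERVER CERTIFICATE TDE_Server_Cert;
GO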
The Transact-SQL statement below enables encryption for the AdventureWorks database.
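Enabling Encryption
ALTER DATABASE AdventureWorks SET ENCRYPTION ON;
GO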
To check whether a database is encrypted, you can query sys.databases. A value of 0 in the is_encrypted
column indicates that the database is not encrypted. A value of 1 in the is_encrypted column indicates
that the database is encrypted.
The code example below queries sys.databases to show the encryption status of all databases on the
server instance.
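Checking Encryption Status
SELECT name, is_encrypted
FROM sys.databases;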
1. On the source server, detach the database that you want to move.
2. Copy or move the database files to the same location on the destination server.
3. If the destination server does not already have one, create a database master key in its master database.
4. Use a CREATE CERTIFICATE Transact-SQL statement to generate a server certificate on the destination server from the backup of the original server certificate and its private key.
5. Attach the database files to the destination instance and verify that you can access the data.
Demonstration Steps
Create a Database Master Key
1. If you did not complete the previous demonstration, start the 20462C-MIA-DC and 20462C-MIA-SQL
virtual machines, log onto 20462C-MIA-SQL as ADVENTUREWORKS\Student with the password
Pa$$w0rd, and run Setup.cmd in the D:\Demofiles\Mod10 folder as Administrator. Then start SQL
Server Management Studio, and connect to MIA-SQL using Windows authentication.
2. In SQL Server Management Studio, open the TDE.sql script file in the D:\Demofiles\Mod10 folder.
3. Select the code under the comment Create DMK and click Execute. This creates a database master
key in the master database.
1. In the TDE.sql script, select the code under the comment Create DEK and click Execute. This creates
a database encryption key in the ConfidentialDB database.
1. In the TDE.sql script, select the code under the comment Enable encryption and click Execute. This
enables encryption for the ConfidentialDB database, and retrieves database encryption status from
the sys.databases table in the master database.
2. Review the query results, and verify that the is_encrypted value for ConfidentialDB is 1.
To use keys from a third-party EKM provider to implement TDE, you must perform the following tasks:
1. Enable the EKM provider enabled option in SQL Server. This is an advanced configuration option, so
you will need to use sp_configure with the show advanced options option before you enable EKM,
as shown in the following code example.
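EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
GO
EXEC sp_configure 'EKM provider enabled', 1;
RECONFIGURE;
GO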
2. Create a cryptographic provider from the file provided by the EKM provider.
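For example (the provider file path is an assumption; the provider name matches the one used in the following example):
CREATE CRYPTOGRAPHIC PROVIDER EKM_Provider
FROM FILE = 'C:\EKM\EKMProvider.dll';
GO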
3. Create an asymmetric key that is sourced from the EKM provider.
USE master ;
GO
CREATE ASYMMETRIC KEY EKM_Login_Key
FROM PROVIDER EKM_Provider
WITH ALGORITHM = RSA_512,
PROVIDER_KEY_NAME = 'SQL_Server_Key' ;
6. Create a credential for the database engine to use when performing encryption and decryption.
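A sketch; the credential name, identity, and secret are assumptions that depend on your EKM provider:
CREATE CREDENTIAL EKM_Credential
WITH IDENTITY = 'EKM_User', SECRET = 'Pa$$w0rd'
FOR CRYPTOGRAPHIC PROVIDER EKM_Provider;
GO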
7. Create a database encryption key that is encrypted by the EKM asymmetric key.
USE AdventureWorks ;
CREATE DATABASE ENCRYPTION KEY
WITH ALGORITHM = AES_128
ENCRYPTION BY SERVER ASYMMETRIC KEY EKM_Login_Key ;
Objectives
After completing this lab, you will be able to:
• Implement auditing.
Password: Pa$$w0rd
2. Create an Audit
o Be enabled immediately.
o Audit SELECT actions on the Customers schema by the customers_reader database role.
o Audit SELECT, INSERT, UPDATE, and DELETE actions on the Customers schema by the
sales_admin application role.
o Be enabled immediately.
2. Grant EXECUTE permission on the sys.sp_audit_write system stored procedure to the public
database role in the master database.
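For example:
USE master;
GO
GRANT EXECUTE ON sys.sp_audit_write TO public;
GO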
4. In sqlcmd, use the following commands to activate the sales_admin application role and update the
Customers.Customer table:
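A sketch, assuming the role password is the course password Pa$$w0rd and updating an arbitrary row:
EXEC sp_setapprole @rolename = 'sales_admin', @password = 'Pa$$w0rd';
UPDATE Customers.Customer
SET EmailAddress = EmailAddress
WHERE CustomerID = 1;
GO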
5. In the D:\Labfiles\Lab10\Starter\Audits folder, verify that an audit file has been created.
6. In SQL Server Management Studio, use the following Transact-SQL code to query the files in the audit folder and display the audited events (events logged for the Student user and the service account for SQL Server have been excluded to simplify the results).
7. Note that all events are logged with the server principal name ADVENTUREWORKS\VictoriaGray
despite the fact that this user accesses SQL Server through membership of a Windows group and
does not have an individual login. This identity is audited even when executing statements in the
security context of an application role.
Results: After this exercise, you should have created an audit, a server audit specification, and a database audit specification.
o Encrypt the key using the server certificate you created in the previous step.
2. Query the sys.databases table to verify that encryption is enabled for the HumanResources
database.
o Attaching the database should fail because the certificate with which the database encryption key
is protected does not exist on the MIA-SQL\SQL2 instance.
3. Create a database master key for the master database on the MIA-SQL\SQL2 instance.
4. Create a certificate named TDE_Server_Cert in the master database on the MIA-SQL\SQL2 instance
from the backup certificate and private key files you created previously.
5. Attach the HumanResources database to the MIA-SQL\SQL2 instance and verify that you can access
the data it contains.
Results: After completing this exercise, you should have configured TDE and moved the encrypted HumanResources database to another instance of SQL Server.
• Choose the option to shut down SQL Server on audit failure. There is usually no point in setting up
auditing, and then having situations where events can occur but are not audited. This is particularly
important in high-security environments.
• Make sure that file audits are placed on drives with large amounts of free disk space and ensure that
the available disk space is monitored on a regular basis.
Best Practice: When planning to implement database encryption, consider the following
best practices:
• Use a complex password to protect the database master key for the master database.
• Ensure you back up certificates and private keys used to implement TDE, and store the backup files in
a secure location.
• If you need to implement data encryption on multiple servers in a large organization, consider using
an EKM solution to manage encryption keys.
Review Question(s)
Question: What are the three targets for SQL Server audits?
Question: You may wish to audit actions by a DBA. How would you know if the DBA
stopped the audit while performing covert actions?
Module 11
Performing Ongoing Database Maintenance
Contents:
Module Overview
Module Overview
The Microsoft® SQL Server® database engine is capable of running for long periods of time with minimal
ongoing maintenance. However, obtaining the best outcomes from the database engine requires a
schedule of routine maintenance operations.
Database corruption is relatively rare but one of the most important tasks in the ongoing maintenance
schedule is to check that no corruption has occurred in the database. Recovering from corruption
depends upon its detection soon after it occurs. SQL Server indexes can also continue to work without any
maintenance, but they will perform better if you periodically remove any fragmentation that occurs within
them. SQL Server includes a Maintenance Plan Wizard to assist in creating SQL Server Agent jobs that
perform these and other ongoing maintenance tasks.
Objectives
After completing this module, you will be able to:
• Ensure database integrity by using DBCC CHECKDB.
• Maintain indexes.
Lesson 1
Ensuring Database Integrity
It is rare for the database engine to cause corruption directly. However, the database engine depends
upon the hardware platform that it runs on—and that can cause corruption. In particular, issues in the
memory and I/O subsystems can lead to corruption within databases.
If you do not detect corruption soon after it has occurred, further (and significantly more complex or
troublesome) issues can arise. For example, there is little point attempting to recover a corrupt database
from a set of backups where every backup contains a corrupted copy of the database.
You can use the DBCC CHECKDB command to detect, and in some circumstances correct, database
corruption. It is therefore important that you are familiar with how DBCC CHECKDB works.
Lesson Objectives
After completing this lesson, you will be able to:
• Describe database integrity.
DBCC CHECKDB
The CHECKDB option in the DBCC utility makes a
thorough check of the structure of a database, to detect almost all forms of potential corruption. The
functions that DBCC CHECKDB contains are also available as options that can be performed separately if
required. The most important of these options are described in the following table:
Option Description
DBCC CHECKDB also performs checks on other types of objects, such as the links for FILESTREAM objects
and consistency checks on the Service Broker objects.
Note: FILESTREAM and Service Broker are advanced topics that are beyond the scope of
this course.
Repair Options
Even though DBCC CHECKDB has repair options, it is not always possible to repair a database without
data loss. Usually, the best method for database recovery is to restore a backup of the database. This
means that you should synchronize the execution of DBCC CHECKDB with your backup retention policy. This ensures that you can always restore a database from an uncorrupted backup and that all required log backups since that time are available.
DBCC CHECKDB uses internal database snapshots to ensure that the utility works with a consistent view of the database. If the
performance needs for the database activity running while DBCC CHECKDB is executing are too high,
running DBCC CHECKDB against a restored backup of your database is an alternative option. This is not
ideal, but is better than not running DBCC CHECKDB at all.
Disk Space
The use of an internal snapshot causes DBCC CHECKDB to need additional disk space. DBCC CHECKDB
creates hidden files (using NTFS Alternate Streams) on the same volumes as the database files are located.
Sufficient free space on the volumes must be available for DBCC CHECKDB to run successfully. The
amount of disk space required on the volumes depends upon how much data is changed during the
execution of DBCC CHECKDB.
DBCC CHECKDB also uses space in the tempdb database while executing. To provide an estimate of the
amount of space required in tempdb, DBCC CHECKDB offers an ESTIMATEONLY option.
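For example:
DBCC CHECKDB (AdventureWorks) WITH ESTIMATEONLY;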
• You can only use the EXTENDED_LOGICAL_CHECKS option when the database is in database
compatibility level 100 (SQL Server 2008) or above. It performs detailed checks of the internal
structure of objects such as CLR user-defined data types and spatial data types.
• You can use the TABLOCK option to request that DBCC CHECKDB takes a table lock on each table
while performing consistency checks, rather than using the internal database snapshots. This reduces
the disk space requirements at the cost of preventing other users from updating the tables.
• The ALL_ERRORMSGS and NO_INFOMSGS options only affect the output from the command, not the
operations that the command performs.
• The ESTIMATEONLY option estimates the space requirements in the tempdb database.
• REPAIR_REBUILD performs repairs that have no possibility of data loss, such as rebuilding corrupt nonclustered indexes. This option only works with certain mild forms of corruption.
• REPAIR_ALLOW_DATA_LOSS will almost always produce data loss. It de-allocates the corrupt pages
and changes others that reference the corrupt pages. After the operation completes, the database will
be consistent, but only from a physical database integrity point of view. Significant loss of data could
have occurred. Also, repair operations do not consider any of the constraints that may exist on or
between tables. If the specified table is involved in one or more constraints, it is recommended that
you execute DBCC CHECKCONSTRAINTS after running the repair operation.
In the example on the slide, four consistency errors were found and the REPAIR_ALLOW_DATA_LOSS
option is needed to repair the database.
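A typical repair sequence looks like the following sketch; repair options require the database to be in single-user mode (the database name here is taken from the demonstration later in this lesson):
ALTER DATABASE CorruptDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
GO
DBCC CHECKDB (CorruptDB, REPAIR_ALLOW_DATA_LOSS);
GO
ALTER DATABASE CorruptDB SET MULTI_USER;
GO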
If the transaction log becomes corrupt, you can use a special option called an emergency mode repair.
However, in that situation, it is strongly recommended to restore the database and you should only use
the emergency mode repair when no backup is available.
Demonstration Steps
Use the DBCC CHECKDB Command
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to
20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
2. In the D:\Demofiles\Mod11 folder, run Setup.cmd as Administrator.
3. Start SQL Server Management Studio and connect to the MIA-SQL database engine instance using
Windows authentication.
4. Open the DBCCCHECKDB.sql script file in the D:\Demofiles\Mod11 folder.
5. Select the code under the comment Run DBCC CHECKDB with default options and click Execute.
This checks the integrity of the AdventureWorks database and displays detailed informational
messages.
6. Select the code under the comment Run DBCC CHECKDB without informational messages and
click Execute. This checks the integrity of the AdventureWorks database and only displays messages
if errors are found.
7. Select the code under the comment Run DBCC CHECKDB against CorruptDB and click Execute.
This checks the integrity of the CorruptDB database and identifies some consistency errors in the
dbo.Orders table in this database. The last line of output tells you the minimum repair level required.
8. Select the code under the comment Try to access the Orders table and click Execute. This attempts
to query the dbo.Orders table in CorruptDB, and returns an error because of a logical consistency
issue.
9. Select the code under the comment Access a specific order and click Execute. This succeeds,
indicating that only some data pages are affected by the consistency issue.
10. Select the code under the comment Repair the database and click Execute. Note that this technique
is used only as a last resort when no valid backup is available. No guarantee on logical consistency in
the database (such as foreign key constraints) is provided.
11. Select the code under the comment Access the Orders table and click Execute. This succeeds,
indicating that the consistency issue has been repaired.
12. Select the code under the comment Check the internal database structure and click Execute. No error messages are displayed, indicating that the database structure is now consistent.
13. Select the code under the comment Check data loss and click Execute. Note that a number of order
details records have no matching order records. The foreign-key constraint between these tables
originally enforced a relationship, but some data has been lost.
Lesson 2
Maintaining Indexes
Another important aspect of SQL Server that requires ongoing maintenance for optimal performance is
the management of indexes. Indexes are used to speed up operations where SQL Server needs to access
data in a table. Over time, indexes can become fragmented so the performance of database applications
using the indexes will be reduced. Defragmenting or rebuilding the indexes will restore the performance
of the database.
Index management options are often included in regular database maintenance plan schedules. Before
learning how to set up the maintenance plans, it is important to understand more about how indexes
work and how to maintain them.
Lesson Objectives
After completing this lesson, you will be able to:
• Explain how indexes affect performance.
Indexes can help to improve searching, sorting, and join performance, but they can also impact data
modification performance, they require ongoing management, and they demand additional disk space.
Occasionally, SQL Server will create its own temporary indexes to improve query performance. However,
doing so is up to the optimizer and beyond the control of the database administrator or programmer, so
these temporary indexes will not be discussed in this module. The temporary indexes are only used to improve a query plan if no proper indexing already exists.
Clustered Index
A table with a clustered index has a predefined
order for rows within a page and for pages within
the table. The order is based on a key made up of
one or more columns. The key is commonly called a
clustering key.
Because the rows of a table can only be in a single order, there can only be a single clustered index on a
table. SQL Server uses an Index Allocation Map entry to point to a clustered index.
There is a common misconception that pages in a clustered index are physically stored in order. While this
is possible in rare situations, it is not commonly the case. If it was true, fragmentation of clustered indexes
would not exist. SQL Server tries to align physical and logical order while creating an index, but disorder
can arise as data is modified.
Consider searching for an OrderID with a value of 23678 in an index for OrderID. In the root page, SQL
Server searches for the range that contains the value 23678. The entry for the range in the root page
points to an index page in the next level. In this level, the range is divided into smaller ranges, again
pointing to pages on the following level. This is done up to a point where every row can be referenced on
its own. This final level is called the leaf node.
Index and data pages are linked within a logical hierarchy and also double-linked across all pages at the
same level of the hierarchy, to assist when scanning across an index. For example, imagine a table with 10 extents and allocated page numbers 201 to 280, all linked in order. (Each extent contains eight pages.) If a
page needed to be placed into the middle of the logical order, SQL Server finds an extent with a free page
or allocates a new extent for the index. The page is logically linked into the correct position but it could
be located anywhere within the database pages.
Nonclustered Index
A nonclustered index is a type of index that does not affect the layout of the data in the table in the way
that a clustered index does. If the underlying table is a heap (that is, it has no clustered index), the leaf
level of a nonclustered index contains pointers to where the data rows are stored. The pointers include a
file number, a page number, and a slot number on the page.
If the underlying table has a clustered index (that is, the pages and the data are logically linked in the
order of a clustering key), the leaf level of a nonclustered index contains the clustering key. This is then
used to seek through the pages of the clustered index to locate the desired rows.
• Integrated full-text search (iFTS) uses a special type of index that provides flexible searching of text.
• The GEOMETRY and GEOGRAPHY data types use spatial indexes.
• Primary and secondary XML indexes assist when querying XML data.
Index Fragmentation
Index fragmentation occurs over time as you insert
and delete data in the table. For operations that
read data, indexes perform best when each page is
as full as possible. However, if your indexes initially
start full (or relatively full), adding data to the table
can cause the index pages to need splitting. Adding
a new index entry to the end of an index is easy but
the process is more complicated if the entry needs
to be made in the middle of an existing full index
page.
• Internal fragmentation occurs when index pages contain excessive free space, so the index occupies more pages than necessary and scans require more I/O.
• External fragmentation occurs when pages that are logically sequenced are not held in sequenced
page numbers. If a new index page needs to be allocated, it would be logically inserted into the
correct location in the list of pages. In reality, though, it is likely to be placed at the end of the index.
That means that a process needing to read the index pages in order must follow pointers to locate
the pages. The process then involves accessing pages that are not sequential within the database.
Detecting Fragmentation
SQL Server provides a useful measure in the avg_fragmentation_in_percent column of the
sys.dm_db_index_physical_stats dynamic management view. You can use this to analyze the level of
fragmentation in your index and to decide whether to rebuild it.
SQL Server Management Studio (SSMS) also provides details of index fragmentation in the properties
page for each index.
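For example (the database, table, and index are illustrative; a SAMPLED or DETAILED mode is required to populate the page-density column):
SELECT index_id, index_type_desc, avg_fragmentation_in_percent, avg_page_space_used_in_percent
FROM sys.dm_db_index_physical_stats(
    DB_ID(N'AdventureWorks'),
    OBJECT_ID(N'Sales.SalesOrderHeader'),
    NULL, NULL, 'DETAILED');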
The following example shows how to create an index that is 70 percent full, leaving 30 percent free space
on each page:
Using FILLFACTOR
ALTER TABLE Person.Contact
ADD CONSTRAINT PK_Contact_ContactID
PRIMARY KEY CLUSTERED
(
ContactID ASC
) WITH (PAD_INDEX = OFF, FILLFACTOR = 70);
GO
Note: The difference between the values zero and 100 can seem confusing. While both
lead to the same outcome, 100 indicates that a specific FILLFACTOR value has been requested.
The value zero indicates that no FILLFACTOR has been specified.
By default, the FILLFACTOR option only applies to leaf level pages in an index. You can use it in
conjunction with the PAD_INDEX = ON option to cause the same free space to be allocated in the non-
leaf levels of the index.
REBUILD
Rebuilding an index drops and recreates the index.
This removes fragmentation, reclaims disk space by
compacting the pages based on the specified or
existing fill factor setting, and reorders the index
rows in contiguous pages. When the option ALL is
specified, SQL Server drops all indexes on the table
and rebuilds them in a single operation. If any part of that fails, it rolls back the entire operation.
Because SQL Server performs rebuilds as logged, single operations, a single rebuild operation can use a
large amount of space in the transaction log. To avoid this, you can change the recovery model of the
database to use the BULK_LOGGED or SIMPLE recovery models before performing the rebuild operation,
so that it is a minimally-logged operation. A minimally-logged rebuild operation uses much less space in
the transaction log and completes faster.
Rebuilding an Index
ALTER INDEX CL_LogTime ON dbo.LogTime REBUILD;
REORGANIZE
Reorganizing an index uses minimal system resources. It defragments the leaf level of clustered and
nonclustered indexes on tables by physically reordering the leaf-level pages to match the logical, left to
right order of the leaf nodes. Reorganizing an index also compacts the index pages. The compaction is
based on the existing fill factor value. It is possible to interrupt a reorganize without losing the work
performed so far. For example, this means that, on a large index, you could configure partial
reorganization to occur each day.
For heavily fragmented indexes (more than 30 percent) rebuilding is usually the most appropriate option
to use. SQL Server maintenance plans include options to rebuild or reorganize indexes. If you do not use
maintenance plans, it is important to build a job that regularly performs defragmentation of the indexes
in your databases.
You also use the ALTER INDEX statement to reorganize an index.
Reorganizing an Index
ALTER INDEX ALL ON dbo.LogTime REORGANIZE;
Because of the extra work that needs to be performed, online index rebuild operations are typically slower
than their offline counterparts.
The following example shows how to rebuild an index online:
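Rebuilding an Index Online (using the same index as the earlier example)
ALTER INDEX CL_LogTime ON dbo.LogTime REBUILD
WITH (ONLINE = ON);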
Note: Some indexes cannot be rebuilt online, including clustered and nonclustered indexes
with large object data.
Updating Statistics
One of the main tasks that SQL Server performs
when it is optimizing queries is deciding which
indexes to use. This is based on statistics that SQL
Server keeps about the distribution of the data in
the index. SQL Server automatically creates statistics
for indexed columns, and creates them for non-
indexed columns when the
AUTO_CREATE_STATISTICS database option is
enabled.
For large tables, the AUTO_UPDATE_STATISTICS_ASYNC option instructs SQL Server to update statistics
asynchronously instead of delaying query execution, where it would otherwise update an outdated
statistic prior to query compilation.
You can also update statistics on demand. Executing the command UPDATE STATISTICS against a table
causes all statistics on the table to be updated. You can also run the sp_updatestats system stored
procedure to update all statistics in a database.
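For example, using the table from the earlier index examples:
UPDATE STATISTICS dbo.LogTime;
GO
EXEC sp_updatestats;
GO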
Demonstration Steps
Maintain Indexes
1. If you did not complete the previous demonstration in this module, ensure that the 20462C-MIA-DC
and 20462C-MIA-SQL virtual machines are running, and log on to 20462C-MIA-SQL as
ADVENTUREWORKS\Student with the password Pa$$w0rd. Then, in the D:\Demofiles\Mod11
folder, run Setup.cmd as Administrator.
2. If SQL Server Management Studio is not already open, start it and connect to the MIA-SQL database
engine instance using Windows authentication.
4. Select the code under the comment Create a table with a primary key and click Execute. This
creates a table with a primary key, which by default creates a clustered index on the primary key field.
5. Select the code under the comment Insert some data into the table and click Execute. This inserts
10,000 rows into the table.
6. Select the code under the comment Check fragmentation and click Execute. In the results, note the
avg_fragmentation_in_percent and avg_page_space_used_in_percent values for each index level.
7. Select the code under the comment Modify the data in the table and click Execute. This updates
the table.
8. Select the code under the comment Re-check fragmentation and click Execute. In the results, note
that the avg_fragmentation_in_percent and avg_page_space_used_in_percent values for each
index level have changed as the data pages have become fragmented.
9. Select the code under the comment Rebuild the table and its indexes and click Execute. This
rebuilds the indexes on the table.
10. Select the code under the comment Check fragmentation again and click Execute. In the results,
note that the avg_fragmentation_in_percent and avg_page_space_used_in_percent values for
each index level indicate less fragmentation.
Lesson 3
Automating Routine Database Maintenance
You have seen how to manually perform some of the common database maintenance tasks that you will
need to execute on a regular basis. SQL Server provides the Maintenance Plan Wizard you can use to
create SQL Server Agent jobs that perform the most common database maintenance tasks.
While the Maintenance Plan Wizard makes this process easy to set up, it is important to realize that you
can use the output of the wizard as a starting point for creating your own maintenance plans or you could
create plans from scratch.
Lesson Objectives
After completing this lesson, you will be able to:
• Backup tasks.
• Cleanup tasks.
Note: You can create maintenance plans using one schedule for all tasks or with individual
schedules for each one.
Note: You can use the cleanup tasks available in the maintenance plans to implement a
retention policy for backup files, job history, maintenance plan report files, and msdb database
table entries.
Demonstration Steps
Create a Maintenance Plan
1. If you did not complete the previous demonstration in this module, ensure that the 20462C-MIA-DC
and 20462C-MIA-SQL virtual machines are running, and log on to 20462C-MIA-SQL as
ADVENTUREWORKS\Student with the password Pa$$w0rd. Then, in the D:\Demofiles\Mod11
folder, run Setup.cmd as Administrator.
2. If SQL Server Management Studio is not already open, start it and connect to the MIA-SQL database
engine instance using Windows authentication.
3. In Object Explorer, under MIA-SQL, expand Management, right-click Maintenance Plans, and click
Maintenance Plan Wizard.
5. In the Select Plan Properties window, in the Name textbox type Daily Maintenance. Note the
available scheduling options and click Change.
6. In the New Job Schedule window, in the Name textbox type "Daily". In the Occurs drop down list,
click Daily. In the Occurs once at textbox, change the time to 3:00 AM, and click OK.
7. In the Select Plan Properties window, click Next. Then in the Select Maintenance Tasks page,
select the following tasks and click Next.
o Reorganize Index
o Update Statistics
9. On the Define Database Check Integrity Task page, select the AdventureWorks database and click
OK. Then click Next.
10. On the Define Reorganize Index Task page, select the AdventureWorks database and click OK,
ensure that Tables and Views is selected, and click Next.
11. On the Define Update Statistics Task page, select the AdventureWorks database and click OK,
ensure that Tables and Views is selected, and click Next.
12. On the Define Backup database (Full) Task page, select the AdventureWorks database and click
OK. Then on the Destination tab, ensure that Create a backup file for every database is selected,
change the Folder value to D:\Demofiles\Mod11\Backups\ and click Next.
13. On the Select Report Options page, ensure that Write a report to a text file is selected, change the
Folder location to D:\Demofiles\Mod11\ and click Next.
14. On the Complete the Wizard page, click Finish. Then when the operation has completed, click
Close.
15. In Object Explorer, under Maintenance Plans, right-click Daily Maintenance and click Execute.
16. Wait a minute or so until the maintenance plan succeeds, and in the Execute Maintenance Plan
dialog box, click Close. Then right-click Daily Maintenance and click View History.
17. In the Log File Viewer - MIA-SQL dialog box, expand the Date value for the Daily Maintenance
plan to see the individual tasks.
18. Keep clicking Refresh and expanding the tasks until four tasks have been completed. Then click
Close.
19. In the D:\Demofiles\Mod11 folder, view the Daily Maintenance_Subplan_1_xxxxx.txt file that has
been created.
20. In the Backups folder, verify that a backup of the AdventureWorks database has been created.
Objectives
After completing this lab, you will be able to:
• Defragment indexes.
o AWDataWarehouse
o HumanResources
o InternetSales
2. Note any issues reported, and determine the minimum repair level required.
b. Run the DBCC CHECKDB command with the appropriate repair option.
2. Use the DBCC CHECKDB command to verify that the integrity issues have been resolved.
Results: After this exercise, you should have used the DBCC CHECKDB command to check database
consistency, and corrected any issues that were found.
2. Defragment Indexes
Results: After this exercise, you should have rebuilt fragmented indexes.
o Reorganize all indexes on all tables and views in the HumanResources database.
o Perform a full backup of the HumanResources database, storing the backup in the R:\Backups\
folder.
2. View the history of the maintenance plan in the Log File Viewer in SQL Server Management Studio.
3. When the maintenance plan has performed its four tasks, view the report it has generated.
4. Verify that a backup of the HumanResources database has been created in the R:\Backups folder.
Results: After this exercise, you should have created the required database maintenance plan.
Question: After discovering that the InternetSales database contained corrupt pages, what
would have been a preferable solution to repairing the database with potential data loss?
Question: If you need to execute a maintenance plan with timing that cannot be
accommodated by a single schedule, what can you do?
Best Practice: When planning ongoing database maintenance, consider the following best
practices.
• If corruption occurs, consider restoring the database from a backup, and only repair the database as a
last resort.
• Defragment your indexes when necessary.
• Update statistics on a schedule if you don’t want it to occur during normal operations.
Review Question(s)
Question: What option should you consider using when running DBCC CHECKDB against
large production databases?
Module 12
Automating SQL Server 2014 Management
Contents:
Module Overview
Module Overview
The tools provided by Microsoft® SQL Server® make administration easy when compared with other
database engines. Even when tasks are easy to perform though, it is common to need to repeat a task
many times. Efficient database administrators learn to automate repetitive tasks. This can help avoid
situations where an administrator forgets to execute a task at the required time. Perhaps more important
though, is that the automation of tasks helps to ensure they are performed consistently, each time they
are executed.
This module describes how to use SQL Server Agent to automate jobs, how to configure security contexts
for jobs, and how to implement multi-server jobs.
Objectives
After completing this module, you will be able to:
Lesson 1
Automating SQL Server Management
There are many benefits that you can gain from the automation of SQL Server management. Most of the
benefits center on the reliable, consistent execution of routine management tasks. SQL Server is a flexible
platform that provides a number of ways to automate management, but the most important tool for this
is the SQL Server Agent. All database administrators working with SQL Server need to be familiar with the
configuration and ongoing management of SQL Server Agent.
Lesson Objectives
After completing this lesson, you will be able to:
• Describe the available options for automating SQL Server management and the framework that SQL
Server Agent provides.
The same sort of situation occurs with routine tasks in SQL Server. While you can perform these tasks
individually or manually, efficient database administrators do not do this. They automate all their routine
and repetitive tasks. Automation removes the repetitive workload from the administrators and enables
them to manage larger numbers of systems or to perform higher-value tasks for the organization.
• Reliable execution of routine tasks. When you perform routine tasks manually, there is always a
chance that you might overlook a vital task. For example, a database administrator could forget to
perform database backups. Automation enables administrators to focus on exceptions that occur
during the routine tasks, rather than on the execution of the tasks.
• Consistent execution of routine tasks. Another problem that can occur when you perform routine
tasks manually is that you may not perform the tasks the same way each time. Imagine a situation
where a database administrator archives some data from a set of production tables into a set of
history tables every Monday morning. The new tables need to have the same name as the originals
with a suffix that includes the current date.
While the administrator might remember to perform this task every Monday morning, there is a likelihood
that one or more of the following errors could occur:
Anyone who has been involved in ongoing administration of systems will tell you that these and other
problems would occur from time to time, even when the tasks are executed by experienced and reliable
administrators. Automating routine tasks can assist greatly in making sure that they are performed
consistently every time.
Proactive Management
After you automate routine tasks, it is possible that their execution fails but no one notices. For example, an automated backup of databases may fail, but this is not noticed until one of the backups is needed.
As well as automating your routine tasks, you need to ensure that you create notifications telling you
when the tasks fail, even if you cannot imagine a situation where they could. For example, you may create
a backup strategy that produces database backups in a given folder. The job may run reliably for years
until another administrator inadvertently deletes or renames the target folder. You need to know as soon
as this problem occurs so that you can rectify the situation.
A proactive administrator will try to detect potential problems before they occur. For example, rather than
receiving a notification that a job failed because a disk was full, an administrator might schedule regular
checks of available disk space and make sure a notification is received when it is starting to get too low.
SQL Server provides alerts on system and performance conditions for this type of scenario.
You can configure the start mode for SQL Server Agent in the properties of the SQL Server Agent service
in SQL Server Configuration Manager. There are three available start modes:
• Disabled. The service will not start, even if you attempt to do so manually.
• Manual. The service needs to be started manually.
• Automatic. The service starts automatically when the operating system starts.
You can also configure the SQL Server Agent service to restart automatically if it stops unexpectedly, by
using the properties page for the SQL Server Agent in SQL Server Management Studio (SSMS). To restart
automatically, the SQL Server Agent service account must be a member of the local Administrators group
for the computer where SQL Server is installed—but this is not considered a best practice. A better option
would be to use an external monitoring tool such as System Center Operations Manager to monitor and
restart the SQL Server Agent service if necessary.
Jobs
You can use jobs to schedule command-line scripts, Windows PowerShell® scripts, Transact-SQL scripts, SQL Server Integration Services (SSIS) packages, and so on. You can also use them to schedule a wide variety of task types, including tasks involved in the implementation of other SQL Server features. These include replication, Change Data Capture (CDC), Data Collection, and Policy-Based Management (PBM).
Note: Replication, CDC, and PBM are advanced topics that are beyond the scope of this
course.
Alerts
The alert system provided by SQL Server Agent is capable of responding to a wide variety of alert types,
including SQL Server error messages, SQL Server performance counter events, and Windows Management
Instrumentation (WMI) alerts.
Operators
You can configure an action to happen in response to an alert, such as the execution of a SQL Server
Agent job or sending a notification to an administrator. In SQL Server Agent, administrators that you can
notify are called operators. One common way of notifying operators is by using Simple Mail Transfer
Protocol (SMTP)-based email. (Alerts and operators are discussed later in this course.)
Note: There are other SQL Server features that you can use to automate complex
monitoring tasks, for example, Extended Events—but this is beyond the scope of this course.
Lesson 2
Implementing SQL Server Agent Jobs
Because SQL Server Agent is the primary tool for automating tasks within SQL Server, database
administrators need to be proficient at creating and configuring SQL Server Agent jobs. You can create
jobs to implement a variety of different types of task and categorize them for ease of management.
In this lesson, you will learn how to create, schedule, and script jobs.
Lesson Objectives
After completing this lesson, you will be able to:
It is important to learn to script jobs that have been created. This enables you to quickly recreate the job if
a failure occurs and to reconstruct it in other environments. For example, you may create your jobs in a
test environment, but then need to move them to your production environment.
• Executing SQL Server Integration Services and Analysis Services commands and queries.
Note: While the ability to execute ActiveX® scripts is retained for backwards compatibility,
this option is deprecated and you should not use it for new development.
Creating Jobs
You can use SSMS to create jobs or you can execute the sp_add_job system stored procedure, as well as
other system stored procedures, to add steps and schedules to the job. After you create the job, SQL
Server stores the job definition in the msdb database, alongside all the SQL Server Agent configuration.
Using sp_add_job
USE msdb;
GO
EXEC sp_add_job
    @job_name = 'HR database backup',
    @enabled = 1,
    @description = 'Backup the HR database';
GO
Job Categories
You can organize your jobs into categories either by using the SQL Server built-in categories, such as
Database Maintenance, or by defining your own.
This is useful when you need to perform actions that are associated with jobs in a specific category. For
example, you could create a job category called SQL Server 2005 Policy Check and write a PowerShell
script to execute all the jobs in that category against your SQL Server 2005 servers.
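As a sketch, such a category can be created with the sp_add_category system stored procedure (the category name is taken from the example above):

USE msdb;
GO
EXEC dbo.sp_add_category
    @class = N'JOB',
    @type = N'LOCAL',
    @name = N'SQL Server 2005 Policy Check';
GO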
Argument    Description
@job_id     Unique identification number of the job to which to add the step (specify only this or @job_name, not both).
@step_id    Unique identification number of the job step, starting at 1 and incrementing by 1.
@command    Command to execute.
By default, SQL Server advances to the next job step upon success and stops when a job step fails.
However, job steps can continue with any step defined in the job, using the success or failure flags. By
configuring the action to occur on the success and failure of each job step, you can create a workflow that
determines the overall logic flow of the job. Note that, as well as each job step having a defined outcome,
the overall job reports an outcome. This means that, even though some job steps may succeed, the overall
job might still report failure.
You can specify the number of times that SQL Server should attempt to retry execution of a job step if the
step fails. You also can specify the retry intervals (in minutes). For example, if the job step requires a
connection to a remote server, you could define several retry attempts in case the connection fails.
Using sp_add_jobstep
USE msdb;
GO
EXEC sp_add_jobstep
    @job_name = 'HR database backup',
    @step_name = 'Set HR database to read only',
    @subsystem = 'TSQL',
    @command = 'ALTER DATABASE HR SET READ_ONLY',
    @retry_attempts = 2,
    @retry_interval = 2;
GO
Even though a job may have multiple schedules, SQL Server will limit it to a single concurrent execution. If
you try to run a job manually while it is running as scheduled, SQL Server Agent refuses the request.
Similarly, if a job is still running when it is scheduled to run again, SQL Server Agent refuses to let it do so.
The following example shows how to create a schedule for Shift 1 and attach it to the job created in earlier examples:
EXEC sp_add_schedule
    @schedule_name = 'Shift 1',
    @freq_type = 4,
    @freq_interval = 1,
    @freq_subday_type = 0x8,
    @freq_subday_interval = 1,
    @active_start_time = 080000,
    @active_end_time = 170000;
GO
EXEC sp_attach_schedule
    @job_name = 'HR database backup',
    @schedule_name = 'Shift 1';
GO
• Create a job.
Demonstration Steps
Create a Job
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to
20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
3. Start SQL Server Management Studio and connect to the MIA-SQL database engine instance using
Windows authentication.
4. In Object Explorer, expand SQL Server Agent and Jobs to view any existing jobs. Then right-click
Jobs and click New Job.
5. In the New Job dialog box, on the General page, in the Name box, type Check AdventureWorks
DB.
6. In the New Job dialog box, on the Steps page, click New.
7. In the New Job Step dialog box, on the General page, in the Step name box, type Make Folder.
Then ensure that Operating system (CmdExec) is selected in the Type drop-down list and in the
Command area, type the following command, which calls a batch file to create an empty folder
named AdventureWorks in the D:\Demofiles\Mod12 folder.
D:\Demofiles\Mod12\MakeDir.cmd
9. In the New Job dialog box, on the Steps page, click New.
10. In the New Job Step dialog box, on the General page, in the Step name box, type Get DB Info.
Then ensure that Transact-SQL script (T-SQL) is selected in the Type drop-down list and in the
Command area, type the following command.
11. In the New Job Step dialog box, on the Advanced page, in the Output file box, type
D:\Demofiles\Mod12\AdventureWorks\DB_Info.txt. Then click OK.
12. In the New Job dialog box, on the Steps page, click New.
13. In the New Job Step dialog box, on the General page, in the Step name box, type Check DB. Then
ensure that Transact-SQL script (T-SQL) is selected in the Type drop-down list and in the
Command area, type the following command.
14. In the New Job Step dialog box, on the Advanced page, in the Output file box, type
D:\Demofiles\Mod12\AdventureWorks\CheckDB.txt. Then click OK.
15. In the New Job dialog box, on the Steps page, verify that the Start step is set to 1:Make Folder and
note the On Success and On Failure actions for the steps in the job.
16. In the New Job dialog box, on the Schedules page, click New.
17. In the New Job Schedule dialog box, in the Name box, type Weekly Jobs; in the Frequency area,
ensure that only Sunday is selected; and in the Daily frequency area, ensure that the option to occur
once at 12:00 AM is selected. Then click OK.
18. In the New Job dialog box, click OK. Then verify that the job appears in the Jobs folder in Object
Explorer.
1. In Object Explorer, expand Databases. Then right-click AdventureWorks, point to Tasks, and click
Back Up.
2. In the Back Up Database - AdventureWorks dialog box, select the existing backup destination and
click Remove. Then click Add and in the Select Backup Destination dialog box, in the File name
box, type D:\Demofiles\Mod12\Backups\AdventureWorks.bak and click OK.
3. In the Back Up Database - AdventureWorks dialog box, in the Script drop-down list, select Script
Action to Job.
4. In the New Job dialog box, on the General page, note the default name for the job (Back Up
Database - AdventureWorks). Then on the Steps page, note that the job includes one Transact-SQL
step named 1.
5. In the New Job dialog box, on the Schedules page, click Pick. Then in the Pick Schedule for Job -
Back Up Database - AdventureWorks dialog box, select the Weekly Jobs schedule you created
previously and click OK.
6. In the New Job dialog box, click OK. Then, in the Back Up Database - AdventureWorks dialog box,
click Cancel.
7. Verify that the job appears in the Jobs folder in Object Explorer.
3. Save the Transact-SQL script as Create Jobs.sql in the D:\Demofiles\Mod12 folder. Using this
technique to generate scripts to create jobs is a common way to ensure that jobs can be recreated if
they are accidentally deleted or required on a different server.
4. Keep SQL Server Management Studio open for the next demonstration.
Lesson 3
Managing SQL Server Agent Jobs
When you automate administrative tasks, you need to ensure that they execute correctly. To help with
this, SQL Server writes entries to history tables in the msdb database on the completion of each job.
In this lesson, you will learn how to query the history tables, as well as how to troubleshoot any issues that
may occur.
Lesson Objectives
After completing this lesson, you will be able to:
The Object Explorer window in SSMS also provides a Job Activity Monitor. This displays a view of currently
executing jobs and data showing the results of the previous execution, along with the scheduled time for
the next execution of the job.
Note: The WHERE clause specifies a step_id of 0. Job steps begin at one, not zero, but an
entry in the dbo.sysjobhistory table is made with a job step_id of zero to record the overall
outcome. The outcome of individual job steps can be obtained by querying step_id values
greater than zero.
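A query of this kind, returning the overall outcome of each run of every job, might look like the following sketch:

SELECT j.name AS job_name, h.run_date, h.run_time, h.run_status, h.message
FROM msdb.dbo.sysjobhistory AS h
JOIN msdb.dbo.sysjobs AS j ON h.job_id = j.job_id
WHERE h.step_id = 0
ORDER BY h.run_date DESC, h.run_time DESC;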
Additional Reading: For more information about the system tables which store SQL Server
Agent data, see SQL Server Agent Tables (Transact-SQL) in SQL Server Books Online.
• That the service account for the service is valid, that the password for the account has not changed,
and that the account is not locked out. If any of these checks are the issue, the service will not start
and details about the problem will be written to the computer’s System event log.
• That the msdb database is online. If the msdb database is corrupt, suspect, or offline, SQL Server
Agent will not start.
• That the job is scheduled. The schedule may be incorrect or the time for the next scheduled execution
may be in the future.
• That the schedule is enabled. Both jobs and schedules can be disabled and a job will not run on a
disabled schedule.
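Both the job and schedule enabled flags can be checked directly in the msdb tables; a sketch:

SELECT j.name AS job_name, j.enabled AS job_enabled,
    s.name AS schedule_name, s.enabled AS schedule_enabled
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobschedules AS js ON j.job_id = js.job_id
JOIN msdb.dbo.sysschedules AS s ON js.schedule_id = s.schedule_id;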
Demonstration Steps
Run Jobs
1. Ensure that you have completed the previous demonstration in this module.
2. In SQL Server Management Studio, in Object Explorer, right-click the Back Up Database -
AdventureWorks job and click Start Job at Step. Then, when the job has completed successfully,
click Close.
3. In Object Explorer, right-click the Check AdventureWorks DB job and click Start Job at Step. Then
select step 1 and click Start. Note that the job fails, and click Close.
1. In Object Explorer, right-click the Back Up Database - AdventureWorks job and click View History.
2. In the Log File Viewer - MIA-SQL dialog box, expand the date for the most recent instance of the
job, and note that all steps succeeded. Then click Close.
3. In Object Explorer, right-click the Check AdventureWorks DB job and click View History.
4. In the Log File Viewer - MIA-SQL dialog box, expand the date for the most recent instance of the
job, and note that the third step failed.
5. Select the step that failed, and in the pane at the bottom of the dialog box, view the message that
was returned. Then click Close.
6. In Object Explorer, double-click the Check AdventureWorks DB job. Then in the Job Properties -
Check AdventureWorks DB dialog box, on the Steps page, select step 3 (Check DB) and click Edit.
7. In the Job Step Properties - Check DB dialog box, modify the command as follows and click OK.
9. In Object Explorer, right-click the Check AdventureWorks DB job and click Start Job at Step. Then
select step 1 and click Start.
10. In Object Explorer, double-click Job Activity Monitor and note the Status of the Check
AdventureWorks DB job.
11. Click Refresh until the Status changes to Idle, and verify that the Last Run Outcome for the job is
Succeeded. Then click Close to close the Job Activity Monitor.
12. In the Start Jobs - MIA-SQL dialog box (which may be behind SQL Server Management Studio),
verify that the job completed with a status of Success, and click Close.
13. In the D:\Demofiles\Mod12 folder, view the text files generated by the Check AdventureWorks DB
job in the AdventureWorks folder, and verify that a backup file was created in the Backups folder by
the Back Up Database - AdventureWorks job.
14. Keep SQL Server Management Studio open for the next demonstration.
Lesson 4
Managing Job Step Security Contexts
By default, SQL Server Agent executes job steps in the context of the SQL Server Agent service account.
When a wide variety of job types must be automated using jobs, this account requires the appropriate permissions to perform every job step, which can result in a highly privileged account. To avoid the
potential security implications of having a single account with a wide range of permissions, you can create
proxy accounts with the minimal permissions required to perform specific tasks, and use them to run
different categories of job step.
This lesson discusses ways to control the security context of the steps in your jobs.
Lesson Objectives
After completing this lesson, you will be able to:
Proxy Accounts
As an alternative to using the SQL Server Agent account, you can use a proxy account to associate a job
step with a Windows identity by using an object called a credential. You can create proxy accounts for all
available subsystems other than Transact-SQL steps. Using proxy accounts means that you can use
different Windows identities to perform the various tasks required in jobs. It provides tighter security by
avoiding the need for a single account to have all the permissions required to execute all jobs.
Credentials
A credential is a SQL Server object that contains the authentication information required to connect to a resource outside SQL Server. Most credentials contain a Windows user name and password. If multiple SQL Server logins require the same level of access to the same set of resources, you can create a single credential and map it to them. However, you cannot map a SQL Server login to more than one credential.
Creating Credentials
You create credentials by using the Transact-SQL CREATE CREDENTIAL statement or SSMS.
When you create a credential, you specify the password (termed the secret), and SQL Server encrypts it by using the service master key.
Creating a Credential
USE master;
GO
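-- The statement below completes the example; the identity and secret are placeholder values.
CREATE CREDENTIAL Agent_Export
WITH IDENTITY = 'ADVENTUREWORKS\FileAgent',
    SECRET = 'Pa$$w0rd';
GO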
After you create the credential, you can map it to a login by using the ALTER LOGIN statement.
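For example, the following maps the credential to a hypothetical SQL Server login named AppLogin:

ALTER LOGIN AppLogin WITH CREDENTIAL = Agent_Export;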
Managing Credentials
SQL Server provides the sys.credentials catalog view, which returns information about existing credentials.
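For example:

SELECT credential_id, name, credential_identity, create_date
FROM sys.credentials;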
Because the password for a Windows account may change over time, you can update a credential with
new values by using the ALTER CREDENTIAL statement.
You need to supply both the user name and password (that is, the secret) to the ALTER CREDENTIAL
statement.
Altering Credentials
-- The identity and secret shown are placeholder values; supply the account's actual values.
ALTER CREDENTIAL Agent_Export
WITH IDENTITY = 'ADVENTUREWORKS\FileAgent',
    SECRET = 'NewPa$$w0rd';
Proxy Accounts
For a job step in a SQL Server Agent job to use a credential, you need to map the credential to a proxy account in SQL Server. There is a built-in set of proxy accounts and you can also create your own. SQL Server proxy accounts define the security context for a job step. SQL Server Agent uses the proxy account to access the security credentials for a Microsoft Windows user.
Creating a proxy account does not change existing permissions for the Windows account that is specified
in the credential. For example, you can create a proxy account for a Windows account that does not have
permission to connect to an instance of SQL Server. Job steps using that proxy account are still unable to
connect to SQL Server.
A user must have permission to use a proxy account before they can specify it in a job step. By default, only members of the sysadmin fixed server role have permission to access all proxy accounts, but you can grant permissions to three types of security principals:
• SQL Server logins.
• Server roles.
• Roles within the msdb database.
Each job step runs in a subsystem that it utilizes. You can associate each proxy account with one or more subsystems.
Subsystems assist in providing security control because they segment the functions that are available to a
proxy account. A job step that uses a proxy account can access the specified subsystems by using the
security context of the Windows user. When SQL Server Agent runs a job step that uses a proxy account, it
impersonates the credentials defined in the proxy account and runs the job step by using that security
context. SQL Server Agent checks subsystem access for a proxy account every time a job step runs. If the
security environment has changed and the proxy account no longer has access to the subsystem, the job
step fails.
The following code example grants the Export_Proxy proxy account access to the SSIS subsystem.
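A sketch of such a statement, using the sp_grant_proxy_to_subsystem system stored procedure in msdb:

EXEC msdb.dbo.sp_grant_proxy_to_subsystem
    @proxy_name = N'Export_Proxy',
    @subsystem_name = N'SSIS';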
The following example shows how to return related information about credentials and their proxy
accounts.
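A sketch of such a query, joining the credential catalog view to the SQL Server Agent proxy table in msdb:

SELECT c.name AS credential_name, c.credential_identity,
    p.name AS proxy_name, p.enabled
FROM sys.credentials AS c
JOIN msdb.dbo.sysproxies AS p ON c.credential_id = p.credential_id;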
• Create a credential.
Demonstration Steps
Create a Credential
1. Ensure that you have completed the previous demonstration in this module.
2. In SQL Server Management Studio, in Object Explorer, under MIA-SQL, expand Security. Then right-
click Credentials and click New Credential.
3. In the New Credential dialog box, enter the following details and click OK.
o Identity: MIA-SQL\FileAgent
o Password: Pa$$w0rd
3. In the New Proxy Account dialog box, in the Proxy name box type FileAgentProxy, and in the
Credential name box type FileAgent. Then ensure that only the Operating system (CmdExec)
subsystem is selected and click OK.
1. In Object Explorer, under SQL Server Agent, in the Jobs folder, double-click the Check
AdventureWorks DB job.
2. In the Job Properties - Check AdventureWorks DB dialog box, on the Steps page, click step 1
(Make Folder) and click Edit.
3. In the Job Step Properties - Make Folder dialog box, in the Run as drop-down list, select
FileAgentProxy. Then click OK.
7. In the D:\Demofiles\Mod12 folder, right-click the AdventureWorks folder that was created by your
job, and click Properties.
8. In the AdventureWorks properties dialog box, on the Security tab, click Advanced.
9. Note that the owner of the folder is MIA-SQL\FileAgent. This is the account that was used to create
the folder. Then click Cancel to close the Advanced Security Settings for AdventureWorks dialog
box, and click Cancel again to close the AdventureWorks Properties dialog box.
10. Keep SQL Server Management Studio open for the next demonstration.
Lesson 5
Managing Jobs on Multiple Servers
You may have jobs that run across multiple servers which would benefit from being automated. SQL
Server provides multiserver administration functionality that enables you to distribute jobs across your
enterprise.
In this lesson, you will learn about the concepts behind multiserver administration and how to implement
jobs across multiple servers.
Lesson Objectives
After completing this lesson, you will be able to:
• The master server identifies each target server by name. Therefore, if you want to change the name of a target server, you must first defect it from the master server, rename it, and then enlist it again.
• Because they need to communicate across the multiple servers, the SQL Server Agent service and SQL
Server service must run using Windows domain accounts.
Using Transact-SQL
You use the sp_msx_enlist stored procedure to enlist target servers with a master server. You run the procedure on each target server, specifying the name of the master server.
The following example, run on each of the target servers AWTarget2 and AWTarget3, enlists them with a master server named AWMaster.
Using sp_msx_enlist
USE msdb;
GO
-- Run this on each target server; the @location text is a free-form description (placeholder shown).
EXEC dbo.sp_msx_enlist
    @msx_server_name = N'AWMaster',
    @location = N'Building 2, Server Room';
GO
If you change the definition of a multiserver job after distributing it to the target servers, you need to
ensure that SQL Server adds the change to the download list for the target servers to be updated.
You can do this by executing the sp_post_msx_operation stored procedure as shown in the following
example:
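A minimal sketch follows; it posts an INSERT operation for a job whose job_id is looked up by name (the job name here is borrowed from the demonstration later in this lesson):

DECLARE @job_id uniqueidentifier;
SELECT @job_id = job_id
FROM msdb.dbo.sysjobs
WHERE name = N'Backup master database';

EXEC msdb.dbo.sp_post_msx_operation
    @operation = N'INSERT',
    @object_type = N'JOB',
    @job_id = @job_id;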
You only need to execute this code when you add, update, or remove job steps or schedules;
sp_update_job and sp_delete_job automatically add entries to the download list.
Note: You can locate the job_id property by querying the dbo.sysjobhistory or
dbo.sysjobs tables or by viewing the job properties in SSMS.
Demonstration Steps
Create Master and Target Servers
1. Ensure that you have completed the previous demonstrations in this module.
2. In SQL Server Management Studio, in Object Explorer, under MIA-SQL, right-click SQL Server Agent,
point to Multi Server Administration, and then click Make this a Master.
3. In the Master Server Wizard – MIA-SQL window, on the Welcome to the Master Server Wizard
page, click Next.
4. On the Master Server Operator page, in the E-mail address box, type
[email protected], and then click Next.
6. In the Connect to Server dialog box, in the Server name box, type MIA-SQL\SQL2, and then click
Connect.
7. On the Target Servers page, click Next.
1. In Object Explorer, expand SQL Server Agent (MSX), and then expand Jobs.
3. In the New Job window, in the Name box, type Backup master database.
Because no folder path is specified, the command will store the backup in the default backup folder for the SQL Server instance.
2. In Object Explorer, in the Connect drop-down list, click Database Engine. Then connect to the MIA-
SQL\SQL2 instance using Windows authentication.
3. In Object Explorer, under MIA-SQL\SQL2, expand SQL Server Agent (TSX: MIA-SQL) and expand
Jobs.
4. Right-click the Backup master database job on MIA-SQL\SQL2 and click View History.
5. Review the job history to verify that it has been executed successfully, and then click Close.
Objectives
After completing this lab, you will be able to:
• Create jobs.
• Schedule jobs.
2. Add a Transact-SQL step that runs in the HumanResources database and executes the following
command. The output from the command should be saved as a text file in the
D:\Labfiles\Lab12\Starter folder.
3. Add an operating system command step that runs the following command to copy the backup file to
the D:\Labfiles\Lab12\Starter folder.
4. Ensure that the job is configured to start with the Transact-SQL backup step.
2. View the history for the job and verify that it succeeded.
3. View the contents of the D:\Labfiles\Lab12\Starter folder and verify that it contains a text file
containing the output from the backup step and a copy of the backup file.
Results: After this exercise, you should have created a job named Backup HumanResources.
2. Add a schedule to the Backup HumanResources job so that the job runs every day, starting one minute after the current system time.
3. Wait for the scheduled time, and then proceed with the next task.
2. When the job is idle, verify that the Last Run Outcome for the job is Succeeded, and that the Last
Run time is the time that you scheduled previously.
Results: After this exercise, you should have created a schedule for the Backup HumanResources job.
1. Create a Credential
2. The proxy account should use the credential you created in the previous step.
2. Configure the second step (which uses an operating system command to copy the backup file) in the
Backup HumanResources job to run as the FileAgent_Proxy proxy account.
Results: After this exercise, you should have configured the Back Up Database step of the Backup
HumanResources job to run as the Backup_User SQL Server user. You should have also created a
credential named FileAgent_Credential and a proxy named FileAgent_Proxy to perform the Copy
Backup File step of the Backup HumanResources job.
Question: Assuming you have administrative control over the local Windows user for the
proxy account in this scenario, what considerations would apply to its properties?
Best Practice: When planning SQL Server Agent jobs, consider the following best practices.
• Script your jobs for remote deployment, or in case you need to recreate them.
• Apply the “principle of least privilege” when configuring job step execution identities to avoid
creating highly-privileged user accounts.
Review Question(s)
Question: What functions do you currently perform manually that could be placed in a job?
Module 13
Monitoring SQL Server 2014 with Notifications and Alerts
Contents:
Module Overview 13-1
Module Overview
One key aspect of managing Microsoft® SQL Server® in a proactive manner is to make sure you are
aware of events that occur in the server, as they happen. SQL Server logs a wealth of information about
issues and you can configure it to advise you automatically when these issues occur, by using alerts and
notifications. The most common way that SQL Server database administrators receive details of events of
interest is by email message. This module covers the configuration of Database Mail, alerts, and
notifications.
Objectives
After completing this module, you will be able to:
• Monitor SQL Server errors.
Lesson 1
Monitoring SQL Server Errors
It is important to understand the core aspects of errors as they apply to SQL Server. In particular, you need
to consider the nature and locations of errors, as well as the data that they return. SQL Server records
severe errors in the SQL Server error log, so it is important to know how to configure the log.
Lesson Objectives
After completing this lesson, you will be able to:
What Is in an Error?
It might not be immediately obvious that a SQL Server error (or exception) is itself an object, and therefore has properties that you can access.
Property Description
Error numbers are helpful when trying to locate information about the specific error, particularly when
searching for information online.
The following example shows how to use the sys.messages catalog view to retrieve a list of system
supplied error messages, showing the properties described in the table above:
Error messages can be localized and are returned in a number of languages, so the WHERE clause of this
example limits the results to view only the English version.
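A sketch of such a query:

SELECT message_id, language_id, severity, is_event_logged, text
FROM sys.messages
WHERE language_id = 1033;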
Values from 0 to 10
Values from 0 to 10 are informational messages raised when SQL Server needs to provide information associated with the running of a query. For example, consider the query SELECT COUNT(Color) FROM Production.Product. This query returns a count, but on the Messages tab in SQL Server Management Studio (SSMS), the message “Warning: Null value is eliminated by an aggregate or other SET operation” is also displayed. No error has occurred, but SQL Server warns that it ignored NULL values when counting the rows.
Values from 11 to 16
Values from 11 to 16 are used for errors that the user can correct. Typically, SQL Server uses them when it
asserts that the code being executed contains an error. Errors in this range include:
Values from 17 to 19
Values from 17 to 19 are serious software errors that the user cannot correct. For example, severity 17
indicates that SQL Server has run out of resources (for example, memory or disk space).
Values above 19
Values above 19 tend to be very serious errors that normally involve either the hardware or SQL Server
itself. It is common to ensure that all errors above 19 are logged and alerts generated on them.
By default, SQL Server retains backups of the previous six logs and gives the most recent log backup the
extension .1, the second most recent the extension .2, and so on. The current error log has no extension.
You can configure the number of log files to retain by right-clicking the SQL Server Logs node in Object Explorer and clicking Configure.
The log file cycles with every restart of the SQL Server instance. Occasionally, you might want to remove
excessively large log files. You can use the sp_cycle_errorlog system stored procedure to close the
existing log file and open a new one on demand. If there is a regular need to recycle the log file, you
could create a SQL Server Agent job to execute the system stored procedure on a schedule. Cycling the
log can help you to prevent the current error log from becoming too large.
Demonstration Steps
View the SQL Server Error Log
1. Ensure that the 20462C-MIA-DC and 20462C-MIA-SQL virtual machines are running, and log on to
20462C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa$$w0rd.
3. Start SQL Server Management Studio and connect to the MIA-SQL database engine instance using
Windows authentication.
4. In Object Explorer, under the MIA-SQL instance, expand Management and expand SQL Server
Logs. Then right-click Current and click View SQL Server Log.
5. Maximize the Log File Viewer - MIA-SQL window and view the log entries. Note that when you
select a log entry, its details are shown in the bottom pane.
6. In the Select logs pane, expand SQL Server Agent and select Current. Then scroll the main log
entries pane to the right until you can see the Log Type column and scroll down to find an entry with
the log type SQL Server Agent.
7. When you have finished viewing the log entries, click Close.
8. Minimize SQL Server Management Studio and view the contents of the C:\Program Files\Microsoft
SQL Server\MSSQL12.MSSQLSERVER\MSSQL\Log folder. If you are prompted to change your
permissions to get access to the folder, click Continue. Note that the current SQL Server log is stored
here in the file named ERRORLOG, and the current SQL Server Agent log is stored as SQLAGENT.OUT.
The remaining log files contain log entries for other SQL Server components and services.
Cycle the Log File
1. In SQL Server Management Studio, click New Query.
2. In the new query window, enter the following command:
EXEC sys.sp_cycle_errorlog;
3. Click Execute.
4. In Object Explorer, right-click the Current SQL Server log and click View SQL Server Log.
5. Note that the log has been reinitialized, and then click Close.
Lesson 2
Configuring Database Mail
SQL Server needs to be able to advise administrators when issues arise that require their attention, such as
the failure of a scheduled job or a significant error. Email is the most commonly-used mechanism for
notifications from SQL Server. You can use the Database Mail feature of SQL Server to connect to an
existing Simple Mail Transport Protocol (SMTP) server when SQL Server needs to send email.
You can configure SQL Server with multiple email profiles, and control which users can utilize the email features of the product. It is important to be able to track and trace emails that have been sent, so SQL Server enables you to configure a policy for their retention.
Lesson Objectives
After completing this lesson, you will be able to:
You can use Database Mail to send emails as part of a SQL Server Agent job, in response to an alert, or on
behalf of a user from a stored procedure.
You use the Database Mail Configuration Wizard to enable Database Mail and to configure accounts and
profiles. A Database Mail account contains all the information that SQL Server needs to send an email
message to the mail server. You must specify what type of authentication to use (Windows, basic, or
anonymous), the email address, the email server name, type, and port number, and if using
authentication, the username and password.
SQL Server stores the configuration details in the msdb database, along with all other SQL Server Agent
configuration data. SQL Server Agent also caches the profile information in memory so it is possible to
send email if the SQL Server database engine is no longer available.
Mail Profiles
You can create multiple configurations by using different profiles. For example, you could create one
profile to send mail to an internal SMTP server, using an internal email address, for mails sent by SQL
Server Agent and a second profile for a database application to send external email notifications to
customers.
Each database user can have access to multiple profiles. If you do not specify a profile when sending an
email, Database Mail uses the default profile. If both private and public profiles exist, precedence is given
to a private default profile over a public one. If no default profile exists, or if you want to use a non-default profile, you must specify the profile name as a parameter when sending mail.
The following example shows how to use the sp_send_dbmail system stored procedure to send an email
using a specific profile.
Sending Mail
EXEC msdb.dbo.sp_send_dbmail
@profile_name = 'HR Administrator',
@recipients = '[email protected]',
@body = 'Daily backup completed successfully.',
@subject = 'Daily backup status';
The system stored procedures used by Database Mail are disabled by default. When you run the Database Mail Configuration Wizard, it automatically
enables the procedures. If you wish to configure Database Mail manually, you can enable the Database
Mail system extended stored procedures by using the sp_configure stored procedure, setting the
Database Mail XPs option to the value of 1.
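For example (Database Mail XPs is an advanced option, so advanced options must be made visible first):

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Database Mail XPs', 1;
RECONFIGURE;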
You can also limit the types and size of attachments that users can send in emails by Database Mail. You
can configure this limitation by using the Database Mail Configuration Wizard or by calling the
dbo.sysmail_configure_sp system stored procedure in the msdb database.
You configure the logging level parameter by using the Configure System Parameters dialog box of the
Database Mail Configuration Wizard or by calling the dbo.sysmail_configure_sp stored procedure in the
msdb database. You can view the logged messages by querying the dbo.sysmail_event_log table.
Internal tables in the msdb database also hold copies of the email messages and attachments that
Database Mail sends, together with the current status of each message. Database Mail updates these
tables as it processes each message. You can track the delivery status of an individual message by viewing
information in the following:
• dbo.sysmail_allitems
• dbo.sysmail_sentitems
• dbo.sysmail_unsentitems
• dbo.sysmail_faileditems
Because Database Mail retains the outgoing messages and their attachments, you need to plan a
retention policy for this data. If the volume of Database Mail messages and related attachments is high,
plan for substantial growth of the msdb database.
You can periodically delete messages to regain space and to comply with your organization's document
retention policies.
The following example shows how to delete messages, attachments, and log entries that are more than
one month old:
DECLARE @CutoffDate datetime = DATEADD(MONTH, -1, GETDATE());
EXECUTE dbo.sysmail_delete_mailitems_sp @sent_before = @CutoffDate;
EXECUTE dbo.sysmail_delete_log_sp @logged_before = @CutoffDate;
GO
You could schedule these commands to be executed periodically by creating a SQL Server Agent job.
Demonstration Steps
Create a Database Mail Profile
1. If you did not complete the previous demonstration in the module, start the 20462C-MIA-DC and
20462C-MIA-SQL virtual machines, log on to 20462C-MIA-SQL as ADVENTUREWORKS\Student
with the password Pa$$w0rd, and in the D:\Demofiles\Mod13 folder, run Setup.cmd as
Administrator.
2. If SQL Server Management Studio is not already open, start it and connect to the MIA-SQL database
engine instance using Windows authentication.
3. In Object Explorer, under the MIA-SQL instance, under Management, right-click Database Mail,
and click Configure Database Mail.
5. In the Select Configuration Task page, select the option to set up Database Mail and click Next.
6. In the New Profile page, in the Profile name textbox type SQL Server Agent Profile, and click Add.
Then, in the Add Account to profile 'SQL Server Agent Profile' dialog box, click New Account.
7. In the New Database Mail Account dialog box, enter the following details and click OK:
9. In the Manage Profile Security page, select Public for the SQL Server Agent Profile profile, and set
its Default Profile setting to Yes. Then click Next.
10. In the Configure System Parameters page, click Next. Then, in the Complete the Wizard page,
click Finish and when configuration is complete, click Close.
1. In Object Explorer, right-click Database Mail and click Send Test E-Mail.
2. In the Send Test E-Mail from MIA-SQL dialog box, ensure that the SQL Server Agent Profile
database mail profile is selected, and in the To textbox, enter [email protected]. Then
click Send Test Email.
3. View the contents of the C:\inetpub\mailroot\Drop folder, and verify that an email message has been
created here.
4. Double-click the message to view it in Outlook. When you have read the message, close it and
minimize the Drop folder window.
5. In the Database Mail Test E-Mail dialog box (which may be behind SQL Server Management
Studio), click OK.
3. View the results. The first result shows system events for Database Mail, and the second shows records
of e-mail messages that have been sent.
4. Keep SQL Server Management Studio open for the next demonstration.
Lesson 3
Configuring Operators, Notifications, and Alerts
Many SQL Server systems have multiple administrators. SQL Server Agent enables you to configure
operators that are associated with one or more administrators and to determine when to contact each of
the operators—along with the method to use for that contact.
SQL Server can also detect many situations that might be of interest to administrators. You can configure
alerts that are based on SQL Server errors or on system events such as low disk space availability, and then
configure SQL Server to notify you of these situations.
Lesson Objectives
After completing this lesson, you will be able to:
• Create alerts.
Configuring Operators
You can define new operators using either SSMS or the dbo.sp_add_operator system stored procedure. After you define the operator, you can view the definition by querying the dbo.sysoperators system table in the msdb database.
You can configure three types of contact methods for each operator:
• Email. An SMTP email address where notifications are sent. Where possible, it is desirable to use
group email addresses rather than individual ones. You can also list multiple email addresses by
separating them with a semicolon.
• Pager email. An SMTP email address where a message can be sent during specified times (and days)
during a week.
Note: Pager and Net send notifications are deprecated, and should not be used for new
development as they will be removed in a future version of SQL Server.
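As a sketch, the following creates an operator with an e-mail contact method by using dbo.sp_add_operator (the address shown is a placeholder):

USE msdb;
GO
EXEC dbo.sp_add_operator
    @name = N'DBA Team',
    @enabled = 1,
    @email_address = N'dba.team@adventure-works.com';
GO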
Fail-Safe Operator
You can also define a fail-safe operator that is notified in the following circumstances:
• The SQL Server Agent cannot access the tables that contain settings for operators and notifications in
the msdb database.
• A pager notification must be sent at a time when no operators configured to receive pager alerts are
on duty.
Job Notifications
You can configure SQL Server Agent jobs to send messages to an operator on completion, failure, or
success. Configuring jobs to send notifications on completion or success might lead to a large volume of e-mail notifications, so many DBAs prefer to be notified only if a job fails. However, for business-critical
jobs, you might want to be notified regardless of the outcome to remove any doubt over the notification
system itself.
• Create an operator.
Demonstration Steps
Enable a SQL Server Agent Mail Profile
1. Ensure that you have completed the previous demonstration in this module.
2. In SQL Server Management Studio, in Object Explorer, right-click SQL Server Agent and click
Properties.
3. In the SQL Server Agent Properties dialog box, on the Alert System page, select Enable mail
profile and in the Mail profile drop-down list, select SQL Server Agent Profile. Then click OK.
4. In Object Explorer, right-click SQL Server Agent and click Restart. When prompted to confirm, click
Yes.
Create an Operator
1. In Object Explorer, under SQL Server Agent, right-click Operators and click New Operator.
2. In the New Operator dialog box, in the Name box type Student, in the E-mail name box type
[email protected], and click OK.
1. In Object Explorer, under SQL Server Agent, expand Jobs and view the existing jobs.
3. In the Job Properties - Back Up Database - AdventureWorks dialog box, on the Notifications tab,
select E-mail, select Student, and select When the job completes. Then click OK.
4. In Object Explorer, expand the Operators folder, right-click Student and click Properties. On the
Notifications page, select Jobs, and note the job notifications that have been defined for this operator.
Then click Cancel.
5. Right-click the Back Up Database - AdventureWorks job and click Start Job at Step. Then, when
the job has completed, click Close.
6. Under the Operators folder, right-click Student and click Properties. On the History page, note the
most recent notification by e-mail attempt. Then click Cancel.
7. In the C:\inetpub\mailroot\Drop folder, verify that a new email message has been created.
8. Double-click the most recent message to view it in Outlook. Then, when you have read the message,
close it and minimize the Drop window.
9. Keep SQL Server Management Studio open for the next demonstration.
When the Application Log notifies SQL Server Agent of a logged event, SQL Server Agent compares the
event to the alerts that you have defined. When SQL Server Agent finds a match, it fires the alert, which is
an automated response to an event.
Note: You must configure SQL Server Agent to write messages to the Windows Application
Event Log if they are to be used for SQL Server Agent alerts.
Alert Actions
You can create alerts to respond to individual error numbers or to all errors of a specific severity level. You
can define the alert for all databases or for a specific database. You can also define the time delay
between responses.
Note: It is considered good practice to configure notifications for all error messages with
severity level 19 and above.
System Events
In addition to monitoring SQL Server events, SQL Server Agent can also check conditions that are
detected by Windows Management Instrumentation (WMI) events. The WMI Query Language (WQL)
queries that retrieve the performance data execute several times each minute, so it can take a few seconds
for these alerts to fire. You can also configure performance condition alerts on any of the performance
counters that SQL Server exposes.
Creating Alerts
You can create alerts by using SSMS or by calling the dbo.sp_add_alert system stored procedure. When defining an alert, you can also specify a SQL Server Agent job to start when the alert occurs.
Using sp_add_alert
EXEC msdb.dbo.sp_add_alert
    @name = N'AdventureWorks Transaction Log Full',
    @message_id = 9002,
    @delay_between_responses = 0,
    @database_name = N'AdventureWorks';
GO
Logged Events
You have seen that alerts will only fire for SQL Server errors if the error messages are written to the
Windows Application Event Log. In general, error severity levels from 19 to 25 are automatically written to
the Application Log but this is not always the case. To check which messages are automatically written to
the log, you can query the is_event_logged column in the sys.messages table.
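For example, the following sketch lists which high-severity English-language messages are automatically logged:

SELECT message_id, severity, is_event_logged
FROM sys.messages
WHERE language_id = 1033 AND severity >= 19;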
Most events with severity levels less than 19 will only trigger alerts if you perform one of the following
steps:
• Modify the error message by using the dbo.sp_altermessage system stored procedure to make it a
logged message.
• Raise the error in code by using the RAISERROR WITH LOG option.
• Use the xp_logevent system extended stored procedure to force entries to be written to the log.
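As a sketch of the first option, the following marks a hypothetical user-defined message number as a logged message:

EXEC dbo.sp_altermessage 50001, 'WITH_LOG', 'true';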
• Execute a job
• Notify operators
You can define a list of operators to notify in response to an alert by running the dbo.sp_add_notification system stored procedure. When sending messages to operators about alerts, it is important to provide the operator with sufficient context so that they can determine the appropriate action to take.
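A sketch, using the alert and operator names from the demonstrations in this module:

EXEC msdb.dbo.sp_add_notification
    @alert_name = N'Log Full Alert',
    @operator_name = N'Student',
    @notification_method = 1; -- 1 = e-mail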
You can include tokens in the message to add detail. There are special tokens available for working with
alerts, including:
• A-DBN. Database name.
• A-ERR. Error number.
• A-MSG. Message text.
• A-SEV. Error severity.
• A-SVR. Server name.
By default, the inclusion of tokens is disabled for security reasons, but you can enable it in the properties
of SQL Server Agent.
If the alert does not appear to be raised, make sure that the setting for the delay between responses is not
set to too high a value.
Check that the job that is configured to respond to the alert functions as expected. For operator notifications,
check that Database Mail is working and that the SMTP server configuration is correct. Test the Database
Mail profile that is sending the notifications by manually sending mail from the profile used by SQL Server
Agent.
• Create an alert.
• Test an alert.
Demonstration Steps
Create an Alert
1. In SQL Server Management Studio, in Object Explorer, under SQL Server Agent, right-click Alerts
and click New Alert.
2. In the New Alert dialog box, on the General page, enter the name Log Full Alert. In the Type drop-
down list, note that you can configure alerts on WMI events, performance monitor conditions, and
SQL Server events. Then select SQL Server event alert, select Error number, and enter the number
9002 (which is the error number raised by SQL Server when a database transaction log becomes full).
3. In the New Alert dialog box, on the Response page, select Notify operators and select the E-mail
checkbox for the Student operator.
4. In the New Alert dialog box, on the Options page, under Include alert error text in, select E-mail.
Then click OK.
Test an Alert
1. In SQL Server Management Studio, open the TestAlert.sql script file in the D:\Demofiles\Mod13
folder.
2. Click Execute and wait while the script fills a table in the TestAlertDB database. When the log file for
that database is full, error 9002 occurs.
3. In Object Explorer, under the Alerts folder, right-click Log Full Alert and click Properties. Then on
the History page, note the Date of last alert and Date of last response values and click Cancel.
4. In the C:\inetpub\mailroot\Drop folder, verify that a new email message has been created.
5. Double-click the most recent message to view it in Outlook. Then, when you have read the message,
close it and minimize the Drop window.
Objectives
After completing this lab, you will be able to:
• Configure alerts.
Password : Pa$$w0rd
2. Verify that the test e-mail message is successfully delivered to the C:\inetpub\mailroot\Drop folder.
3. Query the dbo.sysmail_event_log and dbo.sysmail_mailitems tables in the msdb database to view
Database Mail events and e-mail history.
Results: After this exercise, you should have configured Database Mail with a new profile named SQL
Server Agent Profile.
1. Create Operators
2. Configure the SQL Server Agent Mail Profile
2. Create a second operator named DBA Team with the e-mail address [email protected].
2. Configure the Back Up Database -InternetSales and Back Up Log - InternetSales jobs to notify the
Student operator on completion.
3. Verify the job notifications assigned to the Student operator by viewing its notification properties.
2. View the history properties of the Student operator to verify the most recent notification that was
sent.
3. Verify that notification e-mail messages for the failure of the Back Up Database - AWDataWarehouse job and the completion of the Back Up Database - InternetSales job were
successfully delivered to the C:\inetpub\mailroot\Drop folder.
Results: After this exercise, you should have created operators named Student and DBA Team, configured
the SQL Server Agent service to use the SQL Server Agent Profile Database Mail profile, and configured
the Back Up Database - AWDataWarehouse, Back Up Database - HumanResources, Back Up
Database - InternetSales, and Back Up Log - InternetSales jobs to send notifications.
1. Create an Alert
2. Test the Alert
2. Configure the alert to run the Backup Log - InternetSales job and send an e-mail that includes the
error message to the Student operator if error number 9002 occurs in the InternetSales database.
2. View the history properties of the InternetSales Log Full Alert alert to verify the most recent alert
and response.
3. Verify that notification e-mail messages for the full transaction log error and the completion of the
Back Up Log - InternetSales job were successfully delivered to the C:\inetpub\mailroot\Drop folder.
Results: After this exercise, you should have created an alert named InternetSales Log Full Alert.
Question: Under what circumstances would e-mail notifications have been sent to the DBA
Team operator you created?
Best Practice: When planning notifications and alerts in SQL Server, consider the following
best practices:
• Provide limited access to the ability to send e-mail messages from the database engine.
• Implement a retention policy for Database Mail log and mail auditing.
Review Question(s)
Question: You want to designate a colleague in the IT team as an operator, but this
colleague does not have a login in the SQL Server instance. Should you create one?
Question: You are planning to send notifications from SQL Server, and think it might be
easier to use NET SEND notifications instead of e-mail. Why should you not do this?
Course Evaluation
Your evaluation of this course will help Microsoft understand the quality of your learning experience.
Please work with your training provider to access the course evaluation form.
Microsoft will keep your answers to this survey private and confidential and will use your responses to
improve your future learning experience. Your open and honest feedback is valuable and appreciated.
2. In the D:\Labfiles\Lab01\Starter folder, right-click Setup.cmd and then click Run as administrator.
3. Click Yes when prompted to confirm that you want to run the command file, and then wait for the
script to finish.
2. When prompted, connect to the MIA-SQL database engine using Windows authentication.
3. If Object Explorer is not visible, on the View menu, click Object Explorer.
4. In Object Explorer, under MIA-SQL, expand Databases and note the databases that are hosted on
this database engine instance.
5. Right-click MIA-SQL, point to Reports, point to Standard Reports, and click Server Dashboard.
Then view the server dashboard report for this instance.
3. View the databases listed under the Databases folder and verify that the new database has been
created.
3. Click Execute and view the results, which include information about the AWDatabase you created in
the previous task.
2. In the New Project dialog box, select SQL Server Scripts. Then save the project as AWProject in
the D:\Labfiles\Lab01\Starter folder.
3. If Solution Explorer is not visible, on the View menu, click Solution Explorer.
4. In Solution Explorer, right-click Connections and click New Connection. Then connect to the MIA-
SQL database engine using Windows authentication.
5. In Solution Explorer, right-click Queries and click New Query. Then, when the query is created, right-
click SQLQuery1.sql, click Rename, and rename it to BackupDB.sql.
6. In Object Explorer, right-click the AWDatabase database you created previously, point to Tasks, and
click Back Up.
7. In the Back Up Database – AWDatabase dialog box, in the Script drop-down list, select Script
Action to Clipboard. Then click Cancel.
8. Paste the contents of the clipboard into the empty BackupDB.sql script.
Results: At the end of this exercise, you will have created a SQL Server Management Studio project
containing script files.
L1-3
2. In the command prompt window, enter the following command to view details of all sqlcmd
parameters:
sqlcmd -?
3. Enter the following command to start sqlcmd and connect to MIA-SQL using Windows
authentication:
sqlcmd -S MIA-SQL -E
4. In the sqlcmd command line, enter the following commands to view the databases on MIA-SQL.
Verify that these include the AWDatabase database you created in the previous exercise.
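The query itself is not shown above; a representative form, assuming the standard sys.databases catalog view, is the following. Typing Exit, as shown after the query, then closes the sqlcmd session.
SELECT name FROM sys.databases;
GO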
Exit
2. Note that the query results are returned, but they are difficult to read in the command prompt
screen.
3. Enter the following command to store the query output in a text file:
4. Enter the following command to view the text file that was created by sqlcmd:
Notepad D:\Labfiles\Lab01\Starter\DBinfo.txt
5. View the results in the text file, and then close Notepad.
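The command in step 3 is not shown above; a representative form (the query text is illustrative; the -Q switch runs the query and exits, and -o redirects the output to the file) is:
sqlcmd -S MIA-SQL -E -Q "SELECT name FROM sys.databases;" -o D:\Labfiles\Lab01\Starter\DBinfo.txt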
Results: At the end of this exercise, you will have used sqlcmd to manage a database.
L1-4 Administering Microsoft® SQL Server® Databases
Get-Process
3. Review the list of processes. In the ProcessName column, note the SQL Server processes.
4. Enter the following command to list only the processes with names beginning "SQL":
Get-Process SQL*
Get-Help Sort
Get-Module
Set-Location SQLServer:\SQL\MIA-SQL
5. At the Windows PowerShell prompt, enter the following command to display the SQL Server database
engine instances on the server:
Get-ChildItem
Set-Location SQLServer:\SQL\MIA-SQL\DEFAULT\Databases
7. At the Windows PowerShell prompt, enter the following command to display the databases on the
default instance:
Get-ChildItem
10. Close the SQL Server PowerShell window and close SQL Server Management Studio without saving
any files.
Get-Module
3. Verify that the SQLPS module is not loaded. Then enter the following command to load it:
Import-Module SQLPS
4. Enter the following command to verify that the SQLPS module is now loaded:
Get-Module
5. If the Commands pane is not visible, on the View menu, click Show Command Add-on. Then in the
Commands pane, in the Modules list, select SQLPS.
6. View the cmdlets in the module, noting that they include cmdlets to perform tasks such as backing
up databases and starting SQL Server instances (an example follows this task's steps).
7. If the Script pane is not visible, click the Script drop-down arrow.
8. In the Script pane, type the following commands. (Hint: Use the IntelliSense feature.)
9. Click Run Script. Then view the results in the window that is opened. (The script may take a few
minutes to run.)
10. Close the window, and modify the script as shown in the following example:
11. Save the script as GetDatabases.ps1 in the D:\Labfiles\Lab01\Starter folder. Then close the
PowerShell ISE.
12. In the D:\Labfiles\Lab01\Starter folder, right-click GetDatabases.ps1 and click Run with PowerShell.
13. When the script has completed, open Databases.txt in Notepad to view the results.
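As an illustration of one such cmdlet from step 6 (this command is not part of the lab steps), a full database backup can be run directly from the SQLPS module:
# Back up the AWDatabase database on MIA-SQL to the instance's default backup directory
Backup-SqlDatabase -ServerInstance "MIA-SQL" -Database "AWDatabase"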
Results: At the end of this task, you will have a PowerShell script that retrieves information about
databases from SQL Server.
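The script built in steps 8 through 11 is not reproduced above; a minimal sketch matching the behavior described (module load, provider navigation, and output to Databases.txt; the exact property list is an assumption) is:
# Load the SQLPS module and navigate the SQL Server provider
Import-Module SQLPS -DisableNameChecking
Set-Location SQLServer:\SQL\MIA-SQL\DEFAULT\Databases
# Write the list of databases to a text file
Get-ChildItem | Select-Object Name, Status | Out-File D:\Labfiles\Lab01\Starter\Databases.txt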
L1-6 Administering Microsoft® SQL Server® Databases
L2-1
2. In the D:\Labfiles\Lab02\Starter folder, right-click Setup.cmd and then click Run as administrator.
3. Click Yes when prompted to confirm that you want to run the command file, and then wait for the
script to finish.
2. In the SQL Server Installation Center, on the Planning page, click Hardware and Software
Requirements.
3. In Internet Explorer, note that the documentation provides detailed information about hardware and
software requirements for SQL Server 2014. Then close Internet Explorer.
2. When the tool has run, click Show Details to view the checks that were performed.
3. Click OK to close SQL Server 2014 Setup.
4. Keep the SQL Server Installation Center window open. You will use it again in a later exercise.
Results: After this exercise, you should have run the SQL Server setup program and used the tools in the
SQL Server Installation Center to assess the computer’s readiness for SQL Server installation.
L2-2 Administering Microsoft® SQL Server® Databases
2. Verify that the M:\SQLTEST\Data and L:\SQLTEST\Logs folders exist (if not, create them).
2. If the Microsoft Updates and Product Updates pages are displayed, clear any checkboxes and click
Next.
3. On the Install Rules page, click Show details and note the list of rules that was checked. If
a warning about Windows Firewall is displayed, you can ignore it.
8. On the Setup Role page, ensure that SQL Server Feature Installation is selected, and then click
Next.
9. On the Feature Selection page, under Instance Features, select Database Engine Services, and then
click Next.
10. On the Instance Configuration page, ensure that Named instance is selected, type SQLTEST in the
Named instance box, and then click Next.
11. On the Server Configuration page, on the SQL Server Agent and SQL Server Database Engine rows,
enter the following values:
o Account Name: ADVENTUREWORKS\ServiceAcct
o Password: Pa$$w0rd
o Startup Type: Manual
12. On the Collation tab, ensure that SQL_Latin1_General_CP1_CI_AS is selected and click Next.
13. On the Database Engine Configuration page, on the Server Configuration tab, in the
Authentication Mode section, select Mixed Mode (SQL Server authentication and Windows
authentication). Then enter and confirm the password Pa$$w0rd.
14. Click Add Current User; this will add the user ADVENTUREWORKS\Student (Student) to the list of
administrators.
15. On the Data Directories tab, change the User database directory to M:\SQLTEST\Data.
18. On the Ready to Install page, review the summary, and then click Install and wait for the installation
to complete.
Results: After this exercise, you should have installed an instance of SQL Server.
L2-4 Administering Microsoft® SQL Server® Databases
2. In the left-hand pane of the SQL Server Configuration Manager window, click SQL Server Services.
4. In the SQL Server (SQLTEST) Properties dialog box, verify that the service is configured to log on as
ADVENTUREWORKS\ServiceAcct and click Start. Then, when the service has started, click OK.
2. In SQL Server Configuration Manager, expand SQL Native Client 11.0 Configuration (32-bit), click
Client Protocols, and verify that the TCP/IP protocol is enabled for 32-bit client applications.
3. Click Aliases, and note that there are currently no aliases defined for 32-bit clients. Then right-click
Aliases and click New Alias.
4. In the Alias – New window, in the Alias Name text box, type Test.
8. Click Aliases, and note that there are currently no aliases defined for 64-bit clients. Then right-click
Aliases and click New Alias.
9. In the Alias – New window, in the Alias Name text box, type Test.
10. In the Protocol drop-down list box, ensure that TCP/IP is selected.
11. In the Server text box, type MIA-SQL\SQLTEST and click OK.
2. At the command prompt, enter the following command to connect to the MIA-SQL\SQLTEST instance
of SQL Server:
sqlcmd -S MIA-SQL\SQLTEST -E
3. At the sqlcmd prompt, enter the following command to display the SQL Server instance name:
SELECT @@ServerName;
GO
5. Start SQL Server Management Studio, and when prompted, connect to the database engine named
Test using Windows Authentication.
L2-5
6. In Object Explorer, right-click Test and click Properties. Then verify that the value of the Name
property is MIA-SQL\SQLTEST and click Cancel.
7. In Object Explorer, right-click Test and click Stop. In the User Account Control message box, click
Yes. Then when prompted to confirm that you want to stop the MSSQL$SQLTEST service, click Yes.
8. When the service has stopped, close SQL Server Management Studio.
Results: After this exercise, you should have started the SQL Server service and connected using SSMS.
L2-6 Administering Microsoft® SQL Server® Databases
L3-1
2. In the D:\Labfiles\Lab03\Starter folder, right-click Setup.cmd, and then click Run as administrator.
3. Click Yes when prompted to confirm you want to run the command file, and wait for the script to
finish.
2. In Object Explorer, expand Databases, expand System Databases, right-click tempdb, and click
Properties.
3. On the Files page, view the current file settings. Then click Cancel.
5. Enter the following statements and click Execute (alternatively, you can open the Configure
TempDB.sql script file in the D:\Labfiles\Lab03\Solution folder):
USE master;
GO
ALTER DATABASE tempdb
MODIFY FILE (NAME = tempdev, SIZE = 10MB, FILEGROWTH = 5MB, FILENAME =
'T:\tempdb.mdf');
ALTER DATABASE tempdb
MODIFY FILE (NAME = templog, SIZE=5MB, FILEGROWTH = 1MB, FILENAME =
'T:\templog.ldf');
GO
6. In Object Explorer, right-click MIA-SQL and click Restart. When prompted to allow changes, to
restart the service, and to stop the dependent SQL Server Agent service, click Yes.
7. View the contents of T:\ and note that the tempdb.mdf and templog.ldf files have been moved to
this location.
8. In SQL Server Management Studio, in Object Explorer, right-click tempdb, and click Properties.
9. On the Files page, verify that the file settings have been modified. Then click Cancel.
10. Save the script file as Configure TempDB.sql in the D:\Labfiles\Lab03\Starter folder.
11. Keep SQL Server Management Studio open for the next exercise.
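As an optional check (not part of the lab steps), the new file locations can also be confirmed by querying sys.master_files:
-- List the physical file locations for tempdb
SELECT name, physical_name
FROM sys.master_files
WHERE database_id = DB_ID('tempdb');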
Results: After this exercise, you should have inspected and configured the tempdb database.
L3-2 Administering Microsoft® SQL Server® Databases
2. Enter the following statements and click Execute (alternatively, you can open the Create
HumanResources.sql script file in the D:\Labfiles\Lab03\Solution folder):
3. In Object Explorer, right-click the Databases folder and click Refresh to confirm that the
HumanResources database has been created.
4. Save the script file as Create HumanResources.sql in the D:\Labfiles\Lab03\Starter folder.
2. Enter the following statements and click Execute (alternatively, you can open the Create
InternetSales.sql script file in the D:\Labfiles\Lab03\Solution folder):
3. Under the existing code, enter the following statements. Then select the statements you have just
added, and click Execute.
4. In Object Explorer, right-click the Databases folder and click Refresh to confirm that the
InternetSales database has been created.
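The creation script itself is not reproduced above; a minimal sketch of a database with a SalesData filegroup spread across two files (file names, sizes, and paths are illustrative) is:
CREATE DATABASE InternetSales
ON PRIMARY
 (NAME = 'InternetSales', FILENAME = 'M:\Data\InternetSales.mdf'),
FILEGROUP SalesData
 (NAME = 'InternetSales_data1', FILENAME = 'M:\Data\InternetSales_data1.ndf'),
 (NAME = 'InternetSales_data2', FILENAME = 'M:\Data\InternetSales_data2.ndf')
LOG ON
 (NAME = 'InternetSales_log', FILENAME = 'L:\Logs\InternetSales.ldf');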
2. Select the code under the comment View page usage and click Execute. This query retrieves data
about the files in the InternetSales database.
L3-3
3. Note the UsedPages and TotalPages values for the SalesData filegroup.
4. Select the code under the comment Create a table on the SalesData filegroup and click Execute.
5. Select the code under the comment Insert 10,000 rows and click Execute.
6. Select the code under the comment View page usage again and click Execute.
7. Note the UsedPages value for the SalesData filegroup, and verify that the data in the table is spread
across the files in the filegroup.
Results: After this exercise, you should have created a new HumanResources database and an
InternetSales database that includes multiple filegroups.
L3-4 Administering Microsoft® SQL Server® Databases
2. Move the following files from the D:\Labfiles\Lab03\Starter\ folder to the M:\Data\ folder:
o AWDataWarehouse.mdf
o AWDataWarehouse_archive.ndf
o AWDataWarehouse_current.ndf
3. In SQL Server Management Studio, in Object Explorer, right-click Databases and click Attach.
4. In the Attach Databases dialog box, click Add. Then in the Locate Database Files dialog box, select
the M:\Data\AWDataWarehouse.mdf database file and click OK.
5. In the Attach Databases dialog box, after you have added the master database file, note that all of
the database files are listed. Then click OK.
3. Select the Read-Only checkbox for the Archive filegroup and click OK.
4. In Object Explorer, expand AWDataWarehouse, and expand Tables. Then right-click the
dbo.FactInternetSales table and click Properties.
5. On the Storage page, verify that the dbo.FactInternetSales table is stored in the Current filegroup.
Then click Cancel.
7. On the Storage page, verify that the dbo.FactInternetSalesArchive table is stored in the Archive
filegroup. Then click Cancel.
8. In Object Explorer, right-click the dbo.FactInternetSales table and click Edit Top 200 Rows.
9. Change the SalesAmount value for the first record to 2500 and press Enter to update the record.
Then close the dbo.FactInternetSales table.
10. In Object Explorer, right-click the dbo.FactInternetSalesArchive table and click Edit Top 200 Rows.
11. Change the SalesAmount value for the first record to 3500 and press Enter to update the record.
12. View the error message that is displayed and click OK. Then press Esc to cancel the update and close
the dbo.FactInternetSalesArchive table.
Results: After this exercise, you should have attached the AWDataWarehouse database to MIA-SQL.
L4-1
2. In the D:\Labfiles\Lab04\Starter folder, right-click Setup.cmd, and then click Run as administrator.
3. Click Yes when prompted to confirm you want to run the command file, and wait for the script to
finish.
3. In the Database Properties – HumanResources dialog box, on the Options page, in the Recovery
model drop-down list, select Simple. Then click OK.
2. In the Back Up Database – HumanResources dialog box, ensure that Backup type is set to Full,
and in the Destination section, select the existing file path and click Remove. Then click Add and in
the Select Backup Destination dialog box, enter the file name R:\Backups\HumanResources.bak
and click OK.
3. In the Back Up Database – HumanResources dialog box, on the Media Options page, select
Back up to a new media set, and erase all existing backup sets. Then enter the new media set
name HumanResources Backup.
4. In the Back Up Database – HumanResources dialog box, on the Backup Options page, note the
default backup name. Then in the Set backup compression list, select Compress backup and click
OK.
6. Verify that the backup file HumanResources.bak has been created in the R:\Backups folder, and note
its size.
2. Enter the following Transact-SQL code in the query pane, and then click Execute. Alternatively, you
can open the Update HumanResources.sql file in the D:\Labfiles\Lab04\Starter folder.
UPDATE HumanResources.dbo.Employee
SET PhoneNumber='151-555-1234'
L4-2 Administering Microsoft® SQL Server® Databases
3. Note the number of rows affected, and then close the query pane without saving the file.
2. In the Back Up Database – HumanResources dialog box, ensure that Backup type is set to Full,
and in the Destination section, verify that R:\Backups\HumanResources.bak is listed.
3. In the Back Up Database – HumanResources dialog box, on the Media Options page, ensure
that Back up to the existing media set and Append to the existing backup set are selected.
4. In the Back Up Database – HumanResources dialog box, on the Backup Options page, change
the backup name to HumanResources-Full Database Backup 2. Then in the Set backup
compression list, select Compress backup and click OK.
5. When the backup has completed successfully, click OK.
6. View the HumanResources.bak backup file in the R:\Backups folder, and verify that its size has
increased.
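For reference, the same append-to-media-set backup can be expressed in Transact-SQL; a representative form (the backup name is taken from step 4; exact options mirror the dialog settings) is:
BACKUP DATABASE HumanResources
TO DISK = 'R:\Backups\HumanResources.bak'
WITH NOINIT, COMPRESSION, NAME = 'HumanResources-Full Database Backup 2';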
3. In the Device Type column, expand each of the Disk (temporary) entries to view details of the
backup media set file.
Results: At the end of this exercise, you will have backed up the HumanResources database to
R:\Backups\HumanResources.bak.
L4-3
2. In the Database Properties – InternetSales dialog box, on the Options page, in the Recovery
model drop-down list, ensure that Full is selected. Then click OK.
2. In the Back Up Database – InternetSales dialog box, ensure that Backup type is set to Full, and
in the Destination section, select the existing file path and click Remove. Then click Add and in the
Select Backup Destination dialog box, enter the file name R:\Backups\InternetSales.bak and click
OK.
3. In the Back Up Database – InternetSales dialog box, on the Media Options page, select Back
up to a new media set, and erase all existing backup sets. Then enter the new media set name
InternetSales Backup.
4. In the Back Up Database – InternetSales dialog box, on the Backup Options page, note the
default backup name. Then in the Set backup compression list, select Compress backup and click
OK.
6. Verify that the backup file InternetSales.bak has been created in the R:\Backups folder, and note its
size.
2. Enter the following Transact-SQL code in the query pane, and then click Execute. Alternatively, you
can open the Update InternetSales.sql file in the D:\Labfiles\Lab04\Starter folder, and select the first
batch of code.
UPDATE InternetSales.dbo.Product
SET ListPrice = ListPrice * 1.1
WHERE ProductSubcategoryID = 1;
3. Note the number of rows affected. Keep the script open; you will use it again in a later task.
2. In the Back Up Database – InternetSales dialog box, in the Backup type list, select Transaction
Log, and in the Destination section, verify that R:\Backups\InternetSales.bak is listed.
3. In the Back Up Database – InternetSales dialog box, on the Media Options page, ensure that
Back up to the existing media set and Append to the existing backup set are selected.
4. In the Back Up Database – InternetSales dialog box, on the Backup Options page, change the
backup name to InternetSales-Transaction Log Backup. Then in the Set backup compression list,
select Compress backup and click OK.
6. View the InternetSales.bak backup file in the R:\Backups folder, and verify that its size has increased.
UPDATE InternetSales.dbo.Product
SET ListPrice = ListPrice * 1.1
WHERE ProductSubcategoryID = 2;
2. Note the number of rows affected. Keep the script open; you will use it again in a later task.
2. In the Back Up Database – InternetSales dialog box, in the Backup type list, select Differential,
and in the Destination section, verify that R:\Backups\InternetSales.bak is listed.
3. In the Back Up Database – InternetSales dialog box, on the Media Options page, ensure that
Back up to the existing media set and Append to the existing backup set are selected.
4. In the Back Up Database – InternetSales dialog box, on the Backup Options page, change the
backup name to InternetSales-Differential Backup. Then in the Set backup compression list,
select Compress backup and click OK.
6. View the InternetSales.bak backup file in the R:\Backups folder, and verify that its size has increased.
UPDATE InternetSales.dbo.Product
2. Note the number of rows affected. Then close the query pane without saving the file.
2. In the Back Up Database – InternetSales dialog box, in the Backup type list, select Transaction
Log, and in the Destination section, verify that R:\Backups\InternetSales.bak is listed.
3. In the Back Up Database – InternetSales dialog box, on the Media Options page, ensure that
Back up to the existing media set and Append to the existing backup set are selected.
4. In the Back Up Database – InternetSales dialog box, on the Backup Options page, change the
backup name to InternetSales-Transaction Log Backup 2. Then in the Set backup compression
list, select Compress backup and click OK.
6. View the InternetSales.bak backup file in the R:\Backups folder, and verify that its size has increased.
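For reference, the log and differential backups performed in this exercise correspond to Transact-SQL of roughly this form (options mirror the dialog settings):
BACKUP LOG InternetSales
TO DISK = 'R:\Backups\InternetSales.bak'
WITH NOINIT, COMPRESSION;
BACKUP DATABASE InternetSales
TO DISK = 'R:\Backups\InternetSales.bak'
WITH DIFFERENTIAL, NOINIT, COMPRESSION;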
L4-5
2. Enter the following Transact-SQL code in the query pane, and then click Execute.
RESTORE HEADERONLY
FROM DISK = 'R:\Backups\InternetSales.bak';
GO
3. Verify that the backups you performed in this exercise are all listed.
RESTORE FILELISTONLY
FROM DISK = 'R:\Backups\InternetSales.bak';
GO
RESTORE VERIFYONLY
FROM DISK = 'R:\Backups\InternetSales.bak';
7. Verify that the backup is valid. Then close the query pane without saving the file.
Results: At the end of this exercise, you will have backed up the InternetSales database to
R:\Backups\InternetSales.bak.
L4-6 Administering Microsoft® SQL Server® Databases
2. In the Database Properties – AWDataWarehouse dialog box, on the Options page, in the
Recovery model drop-down list, select Simple. Then click OK.
2. Enter the following Transact-SQL code in the query pane, and then click Execute.
BACKUP DATABASE AWDataWarehouse
FILEGROUP = 'Archive'
TO DISK = 'R:\Backups\AWDataWarehouse-Read-Only.bak';
3. Verify that the backup file AWDataWarehouse-Read-Only.bak has been created in the R:\Backups
folder.
2. Enter the following Transact-SQL code in the query pane, and then click Execute.
BACKUP DATABASE AWDataWarehouse
READ_WRITE_FILEGROUPS
TO DISK = 'R:\Backups\AWDataWarehouse-Read-Write.bak';
2. Enter the following Transact-SQL code in the query pane, and then click Execute. Alternatively, you
can open the Update AWDataWarehouse.sql file in the D:\Labfiles\Lab04\Starter folder.
VALUES
3. Note the number of rows affected, and then close the query pane without saving the file.
2. Enter the following Transact-SQL code in the query pane, and then click Execute.
BACKUP DATABASE AWDataWarehouse
READ_WRITE_FILEGROUPS
TO DISK = 'R:\Backups\AWDataWarehouse-Read-Write.bak'
WITH DIFFERENTIAL;
2. Enter the following Transact-SQL code in the query pane, and then click Execute.
RESTORE HEADERONLY
3. View the backups on this backup media, and scroll to the right to view the BackupTypeDescription
column.
4. Modify the Transact-SQL code as follows, and then click Execute.
RESTORE HEADERONLY
5. View the backups on this backup media, and scroll to the right to view the BackupTypeDescription
column.
6. Close SQL Server Management Studio without saving any script files.
Results: At the end of this exercise, you will have backed up the read-only filegroup in
AWDataWarehouse to R:\Backups\AWDataWarehouse-Read-Only.bak, and the writable filegroups in
AWDataWarehouse to R:\Backups\AWDataWarehouse-Read-Write.bak.
L4-8 Administering Microsoft® SQL Server® Databases
L5-1
2. In the D:\Labfiles\Lab05\Starter folder, right-click the Setup.cmd file, and then click Run as administrator.
3. Click Yes when prompted to confirm you want to run the command file, and wait for the script to
finish.
2. In Object Explorer, expand Databases, and note that the HumanResources database is in a
Recovery Pending state.
3. In SQL Server Management Studio, click New Query and execute the following Transact-SQL code to
attempt to bring the database online:
ALTER DATABASE HumanResources SET ONLINE;
4. Note the error message that is displayed. The database cannot be brought online because the
primary data file is lost.
5. View the contents of the M:\Data folder to verify that the HumanResources.mdf file is not present.
2. In the Restore Database – HumanResources dialog box, note that the backup history for the
database has been retained, and the most recent full backup is automatically selected.
3. In the Script drop-down list, click New Query Editor Window. Then click OK.
5. View the Transact-SQL code that was used to restore the database, noting that the full backup was
restored from file 2 in the R:\Backups\HumanResources.bak backup media set.
6. In Object Explorer, verify that the HumanResources database is now recovered and ready to use.
Results: After this exercise, you should have restored the HumanResources database.
L5-2 Administering Microsoft® SQL Server® Databases
2. Click New Query and execute the following Transact-SQL code to attempt to bring the database
online:
ALTER DATABASE InternetSales SET ONLINE;
3. Note the error message that is displayed. There is a problem with the primary data file.
4. View the contents of the M:\Data folder to verify that the InternetSales.mdf file is present. This file
has become corrupt, and has rendered the database unusable.
2. In SQL Server Management Studio, click New Query and enter the following Transact-SQL code to
back up the tail of the transaction log:
USE master;
BACKUP LOG InternetSales
TO DISK = 'R:\Backups\IS-TailLog.bak'
WITH NO_TRUNCATE;
3. Click Execute, and view the resulting message to verify that the backup is successful.
2. In the Restore Database – InternetSales dialog box, note that only the tail-log backup is listed. The
backup history for this database has been deleted.
3. In the Restore Database – InternetSales dialog box, in the Source section, select Device and click
the ellipsis (...) button.
4. In the Select backup devices dialog box click Add, and then in the Locate Backup File – MIA-SQL
dialog box, select R:\Backups\InternetSales.bak and click OK.
5. In the Select backup devices dialog box, ensure that R:\Backups\InternetSales.bak is listed, and
then click Add.
6. In the Locate Backup File – MIA-SQL dialog box, select R:\Backups\IS-TailLog.bak and click OK.
7. In the Select backup devices dialog box, ensure that both R:\Backups\InternetSales.bak and
R:\Backups\IS-TailLog.bak are listed and click OK.
8. Note that the backup media contains a full backup, a differential backup, and a transaction log
backup (these are the planned backups in InternetSales.bak); and a copy-only transaction log
backup (which is the tail-log backup in IS-TailLog.bak). All of these are automatically selected in the
Restore column.
9. On the Options page, ensure that the Recovery state is set to RESTORE WITH RECOVERY.
10. In the Script drop-down list, click New Query Editor Window. Then click OK.
L5-3
11. When the database has been restored successfully, click OK.
12. View the Transact-SQL code that was used to restore the database, noting that the full backup, the
differential backup, and the first transaction log backup were restored using the NORECOVERY
option. The restore operation for the tail-log backup used the default RECOVERY option to recover
the database.
13. In Object Explorer, verify that the InternetSales database is now recovered and ready to use.
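The generated script follows the standard restore-sequence pattern; a representative sketch (the FILE positions are illustrative) is:
RESTORE DATABASE InternetSales FROM DISK = 'R:\Backups\InternetSales.bak' WITH FILE = 1, NORECOVERY;
RESTORE DATABASE InternetSales FROM DISK = 'R:\Backups\InternetSales.bak' WITH FILE = 2, NORECOVERY;
RESTORE LOG InternetSales FROM DISK = 'R:\Backups\InternetSales.bak' WITH FILE = 3, NORECOVERY;
RESTORE LOG InternetSales FROM DISK = 'R:\Backups\IS-TailLog.bak' WITH RECOVERY;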
Results: After this exercise, you should have restored the InternetSales database.
L5-4 Administering Microsoft® SQL Server® Databases
2. Click New Query and enter the following Transact-SQL code to start a partial restore of the database
from the full backup set in position 1 in the AWDataWarehouse-Read-Write.bak media set:
USE master;
3. Click Execute, and view the resulting message to verify that the restore is successful.
4. In Object Explorer, right-click the Databases folder and click Refresh; and verify that
AWDataWarehouse is listed with a “Restoring” status.
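A representative form of the partial restore statement used in step 2 (the filegroup name follows the earlier labs; exact options are assumptions) is:
RESTORE DATABASE AWDataWarehouse
FILEGROUP = 'Current'
FROM DISK = 'R:\Backups\AWDataWarehouse-Read-Write.bak'
WITH FILE = 1, PARTIAL, NORECOVERY;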
2. Select the code you just entered and click Execute, and view the resulting message to verify that the
restore is successful.
3. In Object Explorer, right-click the Databases folder and click Refresh; and verify that
AWDataWarehouse is now shown as online.
4. Expand the AWDataWarehouse database and its Tables folder. Then right-click
dbo.FactInternetSales and click Select Top 1000 Rows. Note that you can retrieve data from this
table, which is stored in the read/write Current filegroup.
5. In Object Explorer, right-click dbo.FactInternetSalesArchive and click Select Top 1000 Rows. Note
that you cannot retrieve data from this table, which is stored in the read-only Archive filegroup.
WITH RECOVERY;
2. Select the code you just entered and click Execute, and view the resulting message to verify that the
restore is successful.
3. In Object Explorer, right-click dbo.FactInternetSalesArchive and click Select Top 1000 Rows. Note
that you can now retrieve data from this table, which is stored in the read-only Archive filegroup.
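Only the WITH RECOVERY clause of this task's restore statement is shown above; a representative full form for bringing the read-only filegroup online (the backup file name follows Lab 4 and is an assumption) is:
RESTORE DATABASE AWDataWarehouse
FILEGROUP = 'Archive'
FROM DISK = 'R:\Backups\AWDataWarehouse-Read-Only.bak'
WITH RECOVERY;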
L5-5
Results: After this exercise, you will have restored the AWDataWarehouse database.
L5-6 Administering Microsoft® SQL Server® Databases
L6-1
3. Click Yes when prompted to confirm you want to run the command file, and wait for the script to
finish.
Task 2: Use the SQL Server Import and Export Wizard to Export Data
1. Start SQL Server Management Studio and connect to the MIA-SQL database engine using Windows
authentication.
2. In Object Explorer, expand Databases. Then right-click the InternetSales database, point to Tasks,
and click Export Data.
3. On the Welcome to SQL Server Import and Export Wizard page, click Next.
4. On the Choose a Data Source page, in the Data source drop-down list, select SQL Server Native
Client 11.0. Then ensure that the MIA-SQL server is selected, that Use Windows Authentication is
selected, and that the InternetSales database is selected; and click Next.
5. On the Choose a Destination page, in the Destination drop-down list, select Microsoft Excel. Then
in the Excel file path box type D:\Labfiles\Lab06\Starter\Sales.xls, ensure that First row has
column names is selected, and click Next.
6. On the Specify Table Copy or Query page, select Write a query to specify the data to transfer
and click Next.
7. On the Provide a Source Query page, click Browse and open the Query.sql script file in the
D:\Labfiles\Lab06\Starter folder. Then, on the Provide a Source Query page, click Next.
8. On the Select Source Tables and Views page, replace ‘Query’ in the Destination column with
‘Sales’. Then click Next.
9. On the Review Data Type Mapping page, review the default mappings and click Next.
10. On the Save and Run Package page, ensure that Run immediately is selected, and click Next.
11. On the Complete the Wizard page, click Finish. Then, when the execution is successful, click Close.
12. Start Excel and open the Sales.xls file in the D:\Labfiles\Lab06\Starter folder and view the data that
has been exported. Then close Excel without saving the file.
Results: After this exercise, you should have exported data from InternetSales to an Excel workbook
named Sales.xls.
L6-2 Administering Microsoft® SQL Server® Databases
2. View the existing data in the table, noting that some of the columns include Unicode characters.
3. Open a command prompt and enter the following command to create a format file:
4. Start Notepad and open JobCandidateFmt.xml in the D:\Labfiles\Lab06\Starter folder. Then view
the XML format file and close Notepad.
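The bcp command in step 3 is not shown above; a representative form (the -x switch produces an XML format file, -w specifies Unicode character data, and -T uses Windows authentication) is:
bcp HumanResources.dbo.JobCandidate format nul -x -w -f D:\Labfiles\Lab06\Starter\JobCandidateFmt.xml -T -S MIA-SQL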
4. In SQL Server Management Studio, re-execute the query that retrieves the top 1000 rows from the
dbo.JobCandidate table and verify that the new data has been imported.
Results: After this exercise, you should have created a format file named JobCandidateFmt.xml, and
imported the contents of the JobCandidates.txt file into the HumanResources database.
L6-3
2. Expand the dbo.CurrencyRate table, and then expand its Indexes folder. Note that the table has
indexes defined.
3. Click New Query, and then in the new query pane, enter the following Transact-SQL code to disable
indexes:
4. Click Execute.
2. Click Execute.
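The BULK INSERT statement itself is not reproduced above; a minimal sketch (the data file name and options are illustrative) is:
BULK INSERT InternetSales.dbo.CurrencyRate
FROM 'D:\Labfiles\Lab06\Starter\CurrencyRates.txt'
WITH (DATAFILETYPE = 'widechar', FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');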
Results: After this exercise, you should have used the BULK INSERT statement to load data into the
CurrencyRates table in the InternetSales database.
L6-4 Administering Microsoft® SQL Server® Databases
Note: In this lab environment, the client and server are the same. However, in a real environment you
would need to upload data and format files from your local workstation to a volume that is accessible
from the server. In this scenario, M: represents a volume in a SAN that would be accessible from the
server.
3. Click New Query, and then in the new query pane, enter the following Transact-SQL code to disable
the non-clustered indexes and constraints:
4. Click Execute.
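The disabling code for step 3 is not shown above; a representative form, mirroring the re-enabling statements later in this exercise, is:
ALTER INDEX idx_JobCandidate_CountryRegion ON HumanResources.dbo.JobCandidate DISABLE;
GO
ALTER TABLE HumanResources.dbo.JobCandidate NOCHECK CONSTRAINT ALL;
GO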
3. Switch to the query pane that retrieves the top 1000 rows from the dbo.JobCandidate table and
click Execute to re-run the SELECT query. Verify that the records for candidates with an email address
have been inserted.
GO
ALTER INDEX idx_JobCandidate_CountryRegion ON HumanResources.dbo.JobCandidate
REBUILD;
GO
ALTER TABLE HumanResources.dbo.JobCandidate
CHECK CONSTRAINT ALL;
GO
2. Click Execute.
Results: After this exercise, you should have imported data from JobCandidates2.txt into the
dbo.JobCandidates table in the HumanResources database.
L6-6 Administering Microsoft® SQL Server® Databases
L7-1
3. Click Yes when prompted to confirm you want to run the command file, and wait for the script to
finish.
2. In Computer Management, expand Performance, expand Monitoring Tools, expand Data Collector
Sets, and expand User Defined.
3. If a data collection set named SQL Server Workload already exists (because you completed this lab
previously), right-click it and click Delete. Then click Yes when prompted.
4. Right-click User Defined, point to New, and click Data Collector Set.
5. In the Create new Data Collector Set dialog box, enter the name SQL Server Workload. Then select
Create manually (Advanced) and click Next.
6. On the What type of data do you want to include? page, under Create data logs, select
Performance counter, and then click Next.
7. On the Which performance counters would you like to add? page, click Add.
8. In the list of objects, expand the Processor object, and select only the % Processor Time counter.
Then in the Instances of selected object list, ensure that _Total is selected and click Add.
9. In the list of objects, expand the Memory object and select the Page Faults/sec counter. Then click
Add.
10. In the list of objects, expand the SQLServer:Locks object, click the Average Wait Time (ms) counter,
and then hold the Ctrl key and click the Lock Requests/sec and Lock Waits/sec counters. Then in
the Instances of selected object list, ensure that _Total is selected and click Add.
11. In the list of objects, expand the SQLServer:Memory Manager object, click the Database Cache
Memory (KB) counter, and then hold the Ctrl key and click the Free memory (KB) counter. Then
click Add.
12. In the list of objects, expand the SQLServer:Plan Cache object, and select the Cache Hit Ratio
counter. Then in the Instances of selected object list, ensure that _Total is selected and click Add.
13. In the list of objects, expand the SQLServer:Transactions object, and select the Transactions
counter. Then click Add.
14. In the Add Counters dialog box, click OK. Then in the Create new Data Collector Set dialog box, on
the Which performance counters would you like to add? page, click Next.
15. On the Where would you like the data to be saved? page, in the Root directory box, type
D:\Labfiles\Lab07\Starter\Logs and then click Next.
L7-2 Administering Microsoft® SQL Server® Databases
16. On the Create the data collector set? page, ensure that Save and close is selected and click Finish.
2. In the D:\Labfiles\Lab07\Starter folder, right-click Baseline.ps1 and click Run with PowerShell. If you
are prompted to change the execution policy, enter Y. This starts a baseline workload process that
takes three minutes to run.
3. When the PowerShell window closes, in Computer Management, right-click the SQL Server Workload
data collector set and click Stop.
4. In Performance Monitor, in the toolbar, click the Add button (a green +).
5. In the Add Counters dialog box, click Memory, and then hold the Ctrl key and click each other
object to select them all. Then click Add to add all of the counters for all of the objects, and click OK.
6. Click any of the counters in the list below the chart and on the toolbar click Highlight so that the
selected counter is highlighted in the chart. Press the up and down arrow keys on the keyboard to
change the selected counter. As you highlight the counters, note the Last, Average, Minimum, and
Maximum values.
7. On the toolbar, in the Change Graph Type list, select Report and view the text-based report, which
shows the average value for each counter.
8. Right-click anywhere in the report and click Save Image As. Then save the report image as
BaselineAverages.gif in the D:\Labfiles\Lab07\Starter folder.
9. On the toolbar, in the Change Graph Type list, select Line to return to the original line chart view.
2. In SQL Server Management Studio, open the Query DMV.sql script file in the
D:\Labfiles\Lab07\Starter folder.
3. Highlight the Transact-SQL statement under the comment Get top 5 queries by average reads, and
then click Execute.
4. View the results. They include SELECT queries that retrieve data from tables in the InternetSales
database.
5. In the Results pane, right-click any cell and click Save Results As. Then save the results as
TopBaselineQueries.csv in the D:\Labfiles\Lab07 folder.
L7-3
6. In the query pane, highlight the Transact-SQL statement under the comment View IO Stats, and then
click Execute.
7. View the results. They include details of I/O activity for the files used by the InternetSales database.
8. In the Results pane, right-click any cell and click Save Results As. Then save the results as
BaselineIO.csv in the D:\Labfiles\Lab07 folder.
9. Minimize SQL Server Management Studio.
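The DMV query for the top queries by average reads is not reproduced above; a representative sketch using sys.dm_exec_query_stats is:
SELECT TOP 5
  qs.total_logical_reads / qs.execution_count AS avg_reads,
  SUBSTRING(st.text, 1, 100) AS query_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY avg_reads DESC;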
Results: At the end of this exercise, you will have a data collector set named SQL Server Workload, a log
containing baseline measurements, and query and I/O statistics obtained from DMVs and DMFs.
L7-4 Administering Microsoft® SQL Server® Databases
2. In the D:\Labfiles\Lab07\Starter folder, right-click Workload.ps1 and click Run with PowerShell. This
starts a database workload process that takes three minutes to run.
3. When the PowerShell window closes, in Computer Management, right-click the SQL Server Workload
data collector set and click Stop.
2. In the Performance Monitor Properties dialog box, on the Source tab, ensure that Log files is
selected, select any existing files and click Remove.
3. In the Performance Monitor Properties dialog box, on the Source tab, click Add. Browse to the
D:\Labfiles\Lab07\Starter\Logs folder, open the folder with a name similar to MIA-SQL_2014010101-
000002 and open the DataCollector01.blg log file. Then, in the Performance Monitor Properties
dialog box, click OK.
4. View the line chart report, noting any counters that look consistently high.
5. On the toolbar, in the Change Graph Type list, select Histogram bar and view the resulting chart.
Then in the Change Graph Type list, select Report and view the text-based report.
2. View the results. Then start Microsoft Excel and open the TopBaselineQueries.csv file you saved in
the D:\Labfiles\Lab07 folder and compare the results to the queries that were identified during the
baseline workload.
3. In SQL Server Management Studio, in the query pane, highlight the Transact-SQL statement under
the comment View IO Stats, and then click Execute.
4. View the results. Then in Microsoft Excel, open the BaselineIO.csv file you saved in the
D:\Labfiles\Lab07 folder and compare the results to the I/O statistics that were identified during the
baseline workload.
5. Close Excel and SQL Server Management Studio without saving any files.
Results: At the end of this exercise, you will have a second log file containing performance metrics for the
revised workload.
L8-1
2. In the D:\Labfiles\Lab08\Starter folder, right-click Setup.cmd and then click Run as administrator.
3. Click Yes when prompted to confirm that you want to run the command file, and then wait for the
script to finish.
2. In SQL Server Profiler, on the File menu, click New Trace. Then connect to the MIA-SQL database
engine instance using Windows authentication.
3. In the Trace Properties dialog box, on the General tab, set the following properties:
4. In the Trace Properties dialog box, on the Events Selection tab, note the events and columns that
were automatically selected from the TSQL template.
5. Select Show all events, and under TSQL, select SQL:StmtCompleted. Then clear Show all events so
that only the selected events, including the one you just selected, are shown.
6. Select Show all columns and select the Duration column for the SQL:StmtCompleted event.
7. Click the column header for the Database Name column, and in the Edit Filter dialog box, expand
Like, enter InternetSales, and click OK. Then clear Show all columns so that only the selected
columns are shown.
2. In the D:\Labfiles\Lab08\Starter folder, right-click Workload.ps1 and click Run with PowerShell. This
starts a workload in the InternetSales database that lasts for approximately three minutes.
3. While the workload is running, switch back to SQL Server Profiler and observe the activity.
4. When the workload has finished, in SQL Server Profiler, on the File menu, click Stop Trace.
5. In the trace, select any of the SQL:StmtCompleted events and note that the Transact-SQL code is
shown in the bottom pane.
Results: After this exercise, you should have captured a workload using SQL Server Profiler.
L8-2 Administering Microsoft® SQL Server® Databases
2. When the Database Engine Tuning Advisor starts, connect to the MIA-SQL database engine instance
using Windows authentication.
3. In the Database Engine Tuning Advisor, in the Session name box, type Tune InternetSales.
6. In the Select databases and tables to tune list, select InternetSales and note that all of its tables
are selected. Then in the drop-down list of tables, clear the checkbox for the CurrencyRate table.
7. On the Tuning Options tab, review the default options for recommendations. Then click Advanced
Options, select Generate online recommendations where possible, and click OK.
2. When the analysis is complete, on the Recommendations tab, review the index recommendations
that the DTA has generated.
3. On the Actions menu, click Save Recommendations, save the recommendations script as DTA
Recommendations.sql in the D:\Labfiles\Lab08\Starter folder, and click OK when notified that the
file was saved.
2. View the report and identify the most frequently used query in the workload.
3. In the Select report list, select Statement detail report.
4. View the report and compare the Current Statement Cost and Recommended Statement Cost
values for the query you identified as being most frequently used.
5. Select the Statement String cell for the most frequently used statement, and then right-click the
selected cell and click Copy.
6. Minimize the Database Engine Tuning Advisor and start SQL Server Management Studio. When
prompted, connect to the MIA-SQL database engine instance using Windows authentication.
7. In SQL Server Management Studio, click New Query and paste the statement you copied previously
into the query window.
8. In the Available Databases drop-down list, select InternetSales. Then on the Query menu, click
Display Estimated Execution Plan. This displays a breakdown of the tasks that the query processor
will perform to process the query.
9. Note that the query processor suggests that there is at least one missing index that would improve
query performance. Then hold the mouse over the SELECT icon at the left side of the query plan
diagram and view the Estimated Subtree Cost value that is displayed in a tooltip.
L8-3
10. In SQL Server Management Studio, open the DTA Recommendations.sql script you saved from the
Database Engine Tuning Advisor in the D:\Labfiles\Lab08\Starter folder. Then click Execute to
implement the recommended indexes.
11. Close the DTA Recommendations.sql tab, and return to the query and its estimated execution plan.
12. On the Query menu, click Display Estimated Execution Plan again, and note that the query
processor no longer suggests that there is a missing index. Then hold the mouse over the SELECT icon
at the left side of the query plan diagram and view the Estimated Subtree Cost value.
Results: After this exercise, you should have analyzed the trace in the Database Engine Tuning Advisor,
and reviewed the recommendations.
L8-4 Administering Microsoft® SQL Server® Databases
2. Save the exported trace script as InternetSales Trace.sql in the D:\Labfiles\Lab08\Starter folder, and
click OK when notified that the script has been saved.
2. View the Transact-SQL code, and in the line that begins exec @rc = sp_trace_create, replace
InsertFileNameHere with D:\Labfiles\Lab08\Starter\InternetSales.
3. Click Execute to start the trace, and note the TraceID value that is returned.
4. In the D:\Labfiles\Lab08\Starter folder, right-click Workload.ps1 and click Run with PowerShell. This
starts a workload in the InternetSales database that lasts for approximately three minutes.
5. While the workload is running, switch back to SQL Server Management Studio and click New Query.
Then, in the new query window, enter the code to stop and close the trace (a representative form is
shown after these steps), replacing TraceID with the TraceID value you noted when you started the
trace. Do not execute the code yet.
6. When the Windows PowerShell window closes (indicating that the workload has finished), in SQL
Server Management Studio, click Execute to stop the trace.
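The trace-control code referenced in step 5 generally takes this form (TraceID is a placeholder for the value returned when the trace was created):
EXEC sp_trace_setstatus @traceid = TraceID, @status = 0; -- stop the trace
EXEC sp_trace_setstatus @traceid = TraceID, @status = 2; -- close the trace and delete its definition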
2. Select the code you added in the previous step and click Execute. Then view the results.
3. Close SQL Server Management Studio, SQL Server Profiler, and the Database Engine Tuning Advisor
without saving any files.
Results: After this exercise, you should have captured a trace using SQL Trace.
L9-1
2. In the D:\Labfiles\Lab09\Starter folder, right-click Setup.cmd and then click Run as administrator.
3. Click Yes when prompted to confirm that you want to run the command file, and wait for the script
to finish.
o [ADVENTUREWORKS\Database_Managers]
o [ADVENTUREWORKS\WebApplicationSvc]
o [ADVENTUREWORKS\InternetSales_Users]
o [ADVENTUREWORKS\InternetSales_Managers]
o Marketing_Application
4. Click Execute. Then, when the script has completed successfully, in Object Explorer, right-click Logins
and click Refresh to verify that the logins have been created.
o Grants ALTER ANY LOGIN and VIEW ANY DATABASE permissions to the application_admin role.
4. Click Execute. Then, when the script has completed successfully, in Object Explorer, right-click Server
Roles and click Refresh to verify that the new role has been created.
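The role-creation script is not reproduced above; a minimal sketch matching the described permissions is:
CREATE SERVER ROLE application_admin;
GO
GRANT ALTER ANY LOGIN TO application_admin;
GRANT VIEW ANY DATABASE TO application_admin;
GO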
L9-2 Administering Microsoft® SQL Server® Databases
Results: After this exercise, the authentication mode for the MIA-SQL SQL Server instance should support
the scenario requirements, you should have created the required logins and server-level roles, and you
should have granted the required server-level permissions.
L9-3
o Marketing_Application
o WebApplicationSvc
o InternetSales_Users
o InternetSales_Managers
o Database_Managers
4. Click Execute. Then, when the script has completed successfully, in Object Explorer, right-click Users
and click Refresh to verify that the new users have been created.
4. Click Execute. Then, when the script has completed successfully, in Object Explorer, right-click
Database Roles and click Refresh and expand Application Roles to verify that the new roles have
been created.
o SELECT, INSERT, UPDATE, DELETE, and EXECUTE on the Sales schema to sales_admin.
L9-4 Administering Microsoft® SQL Server® Databases
3. Click Execute.
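The GRANT statement for the schema-level permissions in the bullet above is not shown; a representative form is:
GRANT SELECT, INSERT, UPDATE, DELETE, EXECUTE ON SCHEMA::Sales TO sales_admin;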
Results: After this exercise, you should have created the required database users and database-level roles,
and assigned appropriate permissions.
L9-5
2. In the command prompt window, enter the following command (which opens the sqlcmd utility as
ADVENTUREWORKS\AnthonyFrizzell):
4. In the SQLCMD window, enter the following commands to verify your identity:
SELECT suser_name();
GO
5. Note that SQL Server identifies Windows group logins using their individual user account, even
though there is no individual login for that user. ADVENTUREWORKS\AnthonyFrizzell is a member
of the ADVENTUREWORKS\IT_Support global group, which is in turn a member of the
ADVENTUREWORKS\Database_Managers domain local group for which you created a login.
6. In the SQLCMD window, enter the following commands to alter the password of the
Marketing_Application login.
8. Close the SQLCMD window and maximize SQL Server Management Studio.
9. In Object Explorer, right-click the Logins folder and click Refresh. Then right-click the
ADVENTUREWORKS\WebApplicationSvc login and click Properties.
2. In the new query window, enter the following Transact-SQL code to impersonate the
Marketing_Application login.
3. Click Execute and verify that the connection is executing in the context of the
Marketing_Application login.
USE InternetSales;
SELECT * FROM sys.fn_my_permissions('Customers.Customer', 'object');
L9-6 Administering Microsoft® SQL Server® Databases
GO
5. Select the code you just entered and click Execute to view the effective permissions for
Marketing_Application on the Customers.Customer table.
7. Select the code you just entered and click Execute to verify that the user can query the
Customers.Customer table.
UPDATE Customers.Customer
SET EmailAddress = NULL
WHERE CustomerID = 1;
GO
9. Select the code you just entered and click Execute to verify that the user does not have UPDATE
permission on the Customers.Customer table.
10. Enter the following Transact-SQL code under the previous code:
11. Select the code you just entered and click Execute to verify that the user can query the
Products.Product table.
12. Enter the following Transact-SQL code under the previous code:
13. Select the code you just entered and click Execute to verify that the user does not have SELECT
permission on the Sales.SalesOrderHeader table.
14. Close SQL Server Management Studio without saving any files.
3. In the SQLCMD window, enter the following commands to query the Products.vProductCatalog
view:
5. In the SQLCMD window, enter the following commands to query the Products.Product table:
GO
6. Verify that the user does not have SELECT permission on the Products.Product table.
3. In the SQLCMD window, enter the following commands to query the Sales.SalesOrderHeader table:
5. In the SQLCMD window, enter the following commands to update the Sales.SalesOrderHeader
table:
6. Verify that the user does not have UPDATE permission on the Sales.SalesOrderHeader table.
7. Close the SQLCMD window.
3. In the SQLCMD window, enter the following commands to query the Sales.SalesOrderHeader table:
5. In the SQLCMD window, enter the following commands to update the Sales.SalesOrderHeader
table:
7. In the SQLCMD window, enter the following commands to update the Products.Product table:
L9-8 Administering Microsoft® SQL Server® Databases
9. In the SQLCMD window, enter the following commands to call the Products.ChangeProductPrice
stored procedure:
10. Verify that one row is affected (because the user has EXECUTE permission on the
Products.ChangeProductPrice stored procedure).
11. In the SQLCMD window, enter the following commands to delete a row from the
Sales.SalesOrderDetail table:
12. Verify that the user cannot delete rows from the Sales.SalesOrderDetail table.
13. In the SQLCMD window, enter the following commands to activate the sales_admin application role
and delete a row from the Sales.SalesOrderDetail table:
14. Verify that one row is affected. This is possible because the sales_admin application role has
DELETE permission on the Sales schema.
Results: After this exercise, you should have verified effective permissions in the MIA-SQL instance and
the InternetSales database.
L10-1
2. In the D:\Labfiles\Lab10\Starter folder, right-click Setup.cmd and then click Run as administrator.
3. Click Yes when prompted to confirm that you want to run the command file, and then wait for the
script to finish.
2. In SQL Server Management Studio, open the Audit.sql script file in the D:\Labfiles\Lab10\Solution
folder.
3. Select the code under the comment Create an audit, and click Execute. This creates an audit that
logs events to files in D:\Labfiles\Lab10\Starter\Audits.
4. In Object Explorer, expand Security, and expand Audits (if Audits is not expandable, refresh it and
try again).
5. Double-click the AW_Audit audit you created and view its properties. Then click Cancel.
2. In Object Explorer, expand Databases, expand InternetSales, expand Security, and expand
Database Audit Specifications.
2. Select the code under the comment Grant permission on sp_audit_write and click Execute. This
grants EXECUTE permission on the sp_audit_write stored procedure to the public role in the master
database.
4. In the SQLCMD window, enter the following commands to activate the sales_admin application role
and update the Customers.Customer table:
6. In the D:\Labfiles\Lab10\Starter\Audits folder, verify that an audit file has been created.
7. In SQL Server Management Studio, in the Audit.sql script, select the code under the comment View
audited events and click Execute. This queries the files in the audit folder and displays the audited
events (events logged for the Student user and the service account for SQL Server have been
excluded to simplify the results).
8. Note that all events are logged with the server principal name ADVENTUREWORKS\VictoriaGray
despite the fact that this user accesses SQL Server through membership of a Windows group and
does not have an individual login. This identity is audited even when executing statements in the
security context of an application role.
9. Keep SQL Server Management Studio open for the next exercise.
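The query under the View audited events comment typically reads the audit files with sys.fn_get_audit_file; a representative sketch is:
SELECT event_time, server_principal_name, database_principal_name, statement
FROM sys.fn_get_audit_file('D:\Labfiles\Lab10\Starter\Audits\*', DEFAULT, DEFAULT);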
Results: After this exercise, you should have created an audit, a server audit specification, and a database
audit specification.
L10-3
2. Select the code under the comment Create DMK and click Execute. This creates a database master
key in the master database.
3. Select the code under the comment Create server certificate and click Execute. This creates a
certificate.
4. Select the code under the comment Back up the certificate and click Execute. This backs up the
certificate and its private key.
5. Select the code under the comment Create DEK and click Execute. This creates a database
encryption key in the HumanResources database.
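The TDE statements referenced in steps 2 through 5 follow the standard pattern; a minimal sketch (the certificate name, file paths, and password are illustrative) is:
USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Pa$$w0rd';
CREATE CERTIFICATE TDE_Cert WITH SUBJECT = 'TDE certificate';
BACKUP CERTIFICATE TDE_Cert TO FILE = 'D:\Labfiles\Lab10\Starter\TDE_Cert.cer'
 WITH PRIVATE KEY (FILE = 'D:\Labfiles\Lab10\Starter\TDE_Cert.pvk',
 ENCRYPTION BY PASSWORD = 'Pa$$w0rd');
GO
USE HumanResources;
CREATE DATABASE ENCRYPTION KEY
 WITH ALGORITHM = AES_128
 ENCRYPTION BY SERVER CERTIFICATE TDE_Cert;
ALTER DATABASE HumanResources SET ENCRYPTION ON;
GO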
3. Review the query results, and verify that the is_encrypted value for HumanResources is 1.
2. In Object Explorer, in the Connect drop-down list, click Database Engine. Then connect to MIA-
SQL\SQL2 using Windows authentication.
3. In Object Explorer, under the MIA-SQL\SQL2 instance, right-click Databases and click Attach.
4. In the Attach Databases dialog box, click Add, select the HumanResources.mdf file in the M:\Data
folder, and click OK. Then click OK on the error message that is displayed because the certificate with
which the database encryption key is protected does not exist on the MIA-SQL\SQL2 instance.
7. Select the code under the comment Create certificate from backup and click Execute. This creates
a certificate in the master database on MIA-SQL\SQL2 from the backup files you created previously.
8. Select the code under the comment Attach database and click Execute. This attaches the
HumanResources database on MIA-SQL\SQL2.
9. Select the code under the comment Test database and click Execute. This queries the
Employees.Employee table in the HumanResources database.
10. Review the query results. Then close SQL Server Management Studio without saving any files.
L10-4 Administering Microsoft® SQL Server® Databases
Results: After completing this exercise, you should have configured TDE and moved the encrypted
HumanResources database to another instance of SQL Server.
L11-1
2. In the D:\Labfiles\Lab11\Starter folder, right-click the Setup.cmd file and then click Run as
administrator.
3. Click Yes when prompted to confirm that you want to run the command file, and wait for the script
to finish.
3. Select the code under the comment Check AWDataWarehouse and click Execute. This checks the
integrity of the AWDataWarehouse database.
4. Select the code under the comment Check HumanResources and click Execute. This checks the
integrity of the HumanResources database.
5. Select the code under the comment Check InternetSales and click Execute. This checks the integrity
of the InternetSales database and identifies some consistency errors in the dbo.Orders table in this
database. The last line of output tells you the minimum repair level required.
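The repair code executed later in this exercise is not reproduced above; it follows the usual pattern of switching to single-user mode before repairing, for example (the repair level shown is illustrative; use the minimum level that DBCC CHECKDB reports):
ALTER DATABASE InternetSales SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
DBCC CHECKDB (InternetSales, REPAIR_REBUILD);
ALTER DATABASE InternetSales SET MULTI_USER;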
2. Select the code under the comment Check the internal database structure and click Execute. No
error messages are displayed, indicating that the database structure is now consistent.
Results: After this exercise, you should have used the DBCC CHECKDB command to check database
consistency, and corrected any issues that were found.
L11-2 Administering Microsoft® SQL Server® Databases
2. Select the code under the comment Check fragmentation and click Execute.
2. Select the code under the comment Check fragmentation again and click Execute.
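The fragmentation check typically uses the sys.dm_db_index_physical_stats DMF; a representative sketch, with a rebuild for a fragmented table's indexes (the database and table names are illustrative), is:
USE InternetSales;
GO
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name AS index_name,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON ips.object_id = i.object_id AND ips.index_id = i.index_id
ORDER BY ips.avg_fragmentation_in_percent DESC;

ALTER INDEX ALL ON dbo.Orders REBUILD;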
Results: After this exercise, you should have rebuilt fragmented indexes.
L11-3
3. In the Select Plan Properties window, in the Name text box, type HumanResources Maintenance.
Note the available scheduling options and click Change.
4. In the New Job Schedule window, in the Name text box, type "Daily". In the Occurs drop-down list,
click Daily. In the Occurs once at box, change the time to 6:00 PM, and click OK.
5. In the Select Plan Properties window, click Next. Then in the Select Maintenance Tasks page,
select the following tasks and click Next.
o Reorganize Index
o Update Statistics
o Back up Database (Full)
7. On the Define Database Check Integrity Task page, select the HumanResources database and click
OK. Then click Next.
8. On the Define Reorganize Index Task page, select the HumanResources database and click OK,
ensure that Tables and Views is selected, and click Next.
9. On the Define Update Statistics Task page, select the HumanResources database and click OK,
ensure that Tables and Views is selected, and click Next.
10. On the Define Backup database (Full) Task page, select the HumanResources database and click
OK. Then on the Destination tab, ensure that Create a backup file for every database is selected,
change the Folder value to R:\Backups\ and click Next.
11. On the Select Report Options page, ensure that Write a report to a text file is selected, change the
Folder location to D:\Labfiles\Lab11\Starter and click Next.
12. On the Complete the Wizard page, click Finish. Then when the operation has completed, click
Close.
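Note: The wizard generates a maintenance plan package, but the four tasks it schedules correspond to standard Transact-SQL operations. The following sketch shows approximate equivalents; the WITH INIT backup option is illustrative.

    USE HumanResources;
    GO
    -- Check Database Integrity
    DBCC CHECKDB ('HumanResources');
    GO
    -- Reorganize Index (shown for one table in this database)
    ALTER INDEX ALL ON Employees.Employee REORGANIZE;
    GO
    -- Update Statistics
    EXEC sp_updatestats;
    GO
    -- Back Up Database (Full)
    BACKUP DATABASE HumanResources
    TO DISK = 'R:\Backups\HumanResources.bak'
    WITH INIT;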
2. Wait a minute or so until the maintenance plan succeeds, and in the Execute Maintenance Plan
dialog box, click Close. Then right-click HumanResources Maintenance and click View History.
3. In the Log File Viewer - MIA-SQL dialog box, expand the Date value for the Daily Maintenance
plan to see the individual tasks.
4. Keep clicking Refresh and expanding the tasks until four tasks have been completed. Then click
Close.
6. In the R:\Backups\ folder, verify that a backup of the HumanResources database has been created.
Results: After this exercise, you should have created the required database maintenance plan.
Lab 12 Answer Key
2. In the D:\Labfiles\Lab12\Starter folder, right-click the Setup.cmd file and then click Run as
administrator.
3. Click Yes when prompted to confirm that you want to run the command file, and wait for the script
to finish.
3. In the New Job dialog box, on the General page, in the Name box, type Backup HumanResources.
4. In the New Job dialog box, on the Steps page, click New.
5. In the New Job Step dialog box, on the General page, in the Step name box, type Back Up
Database. Then ensure that Transact-SQL script (T-SQL) is selected in the Type drop-down list,
select HumanResources in the Database drop-down list, and in the Command area, type the
following command.
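Note: The original answer key lists the exact command at this point. A representative backup command, in which the R:\Backups destination and the WITH INIT option are assumptions consistent with the rest of these labs, would be:

    BACKUP DATABASE HumanResources
    TO DISK = 'R:\Backups\HumanResources.bak'  -- assumed destination path
    WITH INIT;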
6. In the New Job Step dialog box, on the Advanced page, in the Output file box, type
D:\Labfiles\Lab12\Starter\BackupLog.txt. Then click OK.
7. In the New Job dialog box, on the Steps page, click New.
8. In the New Job Step dialog box, on the General page, in the Step name box, type Copy Backup
File. Then ensure that Operating system (CmdExec) is selected in the Type drop-down list and in
the Command area, type the following command, which copies the backup file to the
D:\Labfiles\Lab12\Starter folder.
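Note: The exact command also appears in the original answer key here. A representative CmdExec command, assuming the backup was written to R:\Backups as sketched in the previous step, would be:

    copy R:\Backups\HumanResources.bak D:\Labfiles\Lab12\Starter /Y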
10. In the New Job dialog box, on the Steps page, verify that the Start step is set to 1:Back Up
Database and note the On Success and On Failure actions for the steps in the job.
11. In the New Job dialog box, click OK. Then verify that the job appears in the Jobs folder in Object
Explorer.
2. In the Start Job on 'MIA-SQL' dialog box, ensure that step 1 (Back Up Database) is selected, and
click Start. Then, when the job has completed successfully, click Close.
3. In Object Explorer, right-click the Backup HumanResources job and click View History.
4. In the Log File Viewer - MIA-SQL dialog box, expand the date for the most recent instance of the
job, and note that all steps succeeded. Then click Close.
5. View the contents of the D:\Labfiles\Lab12\Starter folder and verify that it contains a text file named
BackupLog.txt and a backup file named HumanResources.bak.
Results: After this exercise, you should have created a job named Backup HumanResources.
2. In the Notification Area Icons window, click Turn system icons on or off. Then, in the System
Icons window, set the behavior for the Clock system icon to On and click OK.
3. Click OK to close the Notification Area Icons window, and click OK again to close the Taskbar and
Navigation properties dialog box.
4. Note the time in the clock. This may not be correct for your geographical location.
5. In SQL Server Management Studio, in Object Explorer, double-click the Backup HumanResources
job.
6. In the Job properties - Backup HumanResources dialog box, on the Schedules page, click New.
7. In the New Job Schedule dialog box, in the Name box, type Daily Backup. In the Frequency area,
in the Occurs list, select Daily; ensure that the Occurs once at option is selected; and set the time to
one minute from the current system time as shown in the clock in the notification area. Then click OK.
8. In the Job properties - Backup HumanResources dialog box, click OK. Then wait until the system
clock shows the scheduled time.
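Note: The same schedule can be created programmatically with the msdb scheduling procedures. A minimal sketch follows; the fixed start time is illustrative, whereas the lab sets a time one minute ahead of the system clock.

    EXEC msdb.dbo.sp_add_jobschedule
        @job_name = N'Backup HumanResources',
        @name = N'Daily Backup',
        @freq_type = 4,               -- 4 = daily
        @freq_interval = 1,           -- every 1 day
        @active_start_time = 180000;  -- 18:00:00 (illustrative)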
2. If the job is still running, click Refresh until the Status changes to Idle.
3. Verify that the Last Run Outcome for the job is Succeeded, and that the Last Run time is the time
that you scheduled previously. Then click Close to close the Job Activity Monitor.
Results: After this exercise, you should have created a schedule for the Backup HumanResources job.
2. In the New Credential dialog box, enter the following details and click OK.
o Credential name: FileAgent_Credential
o Identity: MIA-SQL\FileAgent
o Password: Pa$$w0rd
2. In the Job Properties - Backup HumanResources dialog box, on the Steps page, click step 1 (Back
Up Database) and click Edit.
3. In the Job Step Properties - Back Up Database dialog box, on the Advanced page, in the Run as
user box, click the ellipsis (…).
4. In the Select User dialog box, click Browse, and in the Browse for Objects dialog box, select
[Backup_User] and click OK. Then click OK in the Select User dialog box and the Job Step
Properties - Back Up Database dialog box.
5. In the Job Properties - Backup HumanResources dialog box, on the Steps page, click step 2 (Copy
Backup File) and click Edit.
6. In the Job Step Properties - Copy Backup File dialog box, in the Run as drop-down list, select
FileAgent_Proxy. Then click OK.
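Note: The credential and proxy configured in this exercise can also be created in Transact-SQL. A minimal sketch, using the names from the Results statement below:

    -- Create the credential that maps to the Windows account used by the job step
    CREATE CREDENTIAL FileAgent_Credential
    WITH IDENTITY = 'MIA-SQL\FileAgent',
         SECRET = 'Pa$$w0rd';
    GO
    -- Create a proxy from the credential and allow it to run CmdExec job steps
    EXEC msdb.dbo.sp_add_proxy
        @proxy_name = N'FileAgent_Proxy',
        @credential_name = N'FileAgent_Credential';

    EXEC msdb.dbo.sp_grant_proxy_to_subsystem
        @proxy_name = N'FileAgent_Proxy',
        @subsystem_name = N'CmdExec';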
Results: After this exercise, you should have configured the Back Up Database step of the Backup
HumanResources job to run as the Backup_User SQL Server user. You should have also created a
credential named FileAgent_Credential and a proxy named FileAgent_Proxy to perform the Copy
Backup File step of the Backup HumanResources job.
Lab 13 Answer Key
2. In the D:\Labfiles\Lab13\Starter folder, right-click the Setup.cmd file and then click Run as
administrator.
3. Click Yes when prompted to confirm that you want to run the command file, and wait for the script
to finish.
2. In Object Explorer, under the MIA-SQL instance, expand Management, right-click Database Mail,
and click Configure Database Mail.
3. In the Welcome to Database Mail Configuration Wizard page, click Next.
4. In the Select Configuration Task page, select the option to set up Database Mail and click Next.
5. In the New Profile page, in the Profile name textbox type SQL Server Agent Profile, and click Add.
Then, in the Add Account to profile 'SQL Server Agent Profile' dialog box, click New Account.
6. In the New Database Mail Account dialog box, enter the following details and click OK:
8. In the Manage Profile Security page, select Public for the SQL Server Agent Profile profile, and set
its Default Profile setting to Yes. Then click Next.
9. In the Configure System Parameters page, click Next. Then, in the Complete the Wizard page,
click Finish and when configuration is complete, click Close.
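Note: Equivalent configuration can be scripted with the msdb Database Mail procedures. In the sketch below, the account name and SMTP server are assumptions, because step 6 does not list those details here.

    -- Create a Database Mail account (account name and SMTP server are assumptions)
    EXEC msdb.dbo.sysmail_add_account_sp
        @account_name = N'SQL Server Agent Account',
        @email_address = N'[email protected]',
        @mailserver_name = N'mia-sql.adventureworks.msft';

    -- Create the profile and add the account to it
    EXEC msdb.dbo.sysmail_add_profile_sp
        @profile_name = N'SQL Server Agent Profile';

    EXEC msdb.dbo.sysmail_add_profileaccount_sp
        @profile_name = N'SQL Server Agent Profile',
        @account_name = N'SQL Server Agent Account',
        @sequence_number = 1;

    -- Make the profile public and set it as the default
    EXEC msdb.dbo.sysmail_add_principalprofile_sp
        @profile_name = N'SQL Server Agent Profile',
        @principal_name = N'public',
        @is_default = 1;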
2. In the Send Test E-Mail from MIA-SQL dialog box, ensure that the SQL Server Agent Profile
database mail profile is selected, and in the To textbox, enter [email protected]. Then
click Send Test E-Mail.
3. View the contents of the C:\inetpub\mailroot\Drop folder, and verify that an email message has been
created here.
4. Double-click the message to view it in Outlook. When you have read the message, close it and delete
it, and then minimize the Drop folder window.
5. In the Database Mail Test E-Mail dialog box (which may be behind SQL Server Management
Studio), click OK.
8. View the results. The first result shows system events for Database Mail, and the second shows records
of e-mail messages that have been sent.
Results: After this exercise, you should have configured Database Mail with a new profile named SQL
Server Agent Profile.
2. In the New Operator dialog box, in the Name box type Student, in the E-mail name box type
[email protected], and click OK.
3. In Object Explorer, under SQL Server Agent, right-click Operators and click New Operator.
4. In the New Operator dialog box, in the Name box type DBA Team, in the E-mail name box type
[email protected], and click OK.
2. In the SQL Server Agent Properties dialog box, on the Alert System page, select Enable mail
profile and in the Mail profile drop-down list, select SQL Server Agent Profile.
3. In the SQL Server Agent Properties dialog box, select Enable fail-safe operator, in the Operator
drop-down list select DBA Team, and for the Notify using setting, select E-mail. Then click OK.
4. In Object Explorer, right-click SQL Server Agent and click Restart. When prompted to confirm, click
Yes.
3. In the Job Properties - Back Up Database - AWDataWarehouse dialog box, on the Notifications
tab, select E-mail, select Student, and select When the job fails. Then click OK.
5. In the Job Properties - Back Up Database - HumanResources dialog box, on the Notifications
tab, select E-mail, select Student, and select When the job fails. Then click OK.
7. In the Job Properties - Back Up Database - InternetSales dialog box, on the Notifications tab,
select E-mail, select Student, and select When the job completes. Then click OK.
8. Right-click the Back Up Log - InternetSales job and click Properties.
9. In the Job Properties - Back Up Log - InternetSales dialog box, on the Notifications tab, select
E-mail, select Student, and select When the job completes. Then click OK.
10. Expand the Operators folder, right-click Student and click Properties. On the Notifications page,
select Jobs and note the job notifications that have been defined for this operator. Then click Cancel.
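Note: The operators and job notifications in this exercise can also be created in Transact-SQL, as in this minimal sketch for one operator and one job:

    -- Create an operator
    EXEC msdb.dbo.sp_add_operator
        @name = N'Student',
        @email_address = N'[email protected]';

    -- E-mail the operator when the job fails
    EXEC msdb.dbo.sp_update_job
        @job_name = N'Back Up Database - HumanResources',
        @notify_level_email = 2,  -- 2 = notify on failure
        @notify_email_operator_name = N'Student';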
2. In Object Explorer, right-click the Back Up Database - HumanResources job and click Start Job at
Step. Then, when the job has completed, note that it succeeded and click Close.
3. In Object Explorer, right-click the Back Up Database - InternetSales job and click Start Job at Step.
Then, when the job has completed, note that it succeeded and click Close.
4. Under the Operators folder, right-click Student and click Properties. On the History page, note the
most recent e-mail notification attempt. Then click Cancel.
5. View the contents of the C:\inetpub\mailroot\Drop folder and verify that new e-mail messages have
been created.
6. Open each of the messages and verify that they include a failure notification for the Back Up
Database - AWDataWarehouse job and a completion notification for the Back Up Database -
InternetSales job, but no notification regarding the Back Up Database - HumanResources job.
Then close all e-mail messages and minimize the Drop window.
Results: After this exercise, you should have created operators named Student and DBA Team, configured
the SQL Server Agent service to use the SQL Server Agent Profile Database Mail profile, and configured
the Back Up Database - AWDataWarehouse, Back Up Database - HumanResources, Back Up
Database - InternetSales, and Back Up Log - InternetSales jobs to send notifications.
2. In the New Alert dialog box, on the General page, enter the name InternetSales Log Full Alert. In
the Database name drop-down list, select InternetSales. Then select Error number and enter the
number 9002.
3. In the New Alert dialog box, on the Response page, select Execute job, and select the Back Up Log
- InternetSales ([Uncategorized (Local)]) job. Then select Notify operators and select the E-mail
checkbox for the Student operator.
4. In the New Alert dialog box, on the Options page, under Include alert error text in, select E-mail.
Then click OK.
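Note: The alert and its response can also be scripted with the msdb alert procedures, as in this minimal sketch:

    -- Create an alert for error 9002 (transaction log full) in InternetSales
    EXEC msdb.dbo.sp_add_alert
        @name = N'InternetSales Log Full Alert',
        @message_id = 9002,
        @database_name = N'InternetSales',
        @job_name = N'Back Up Log - InternetSales',
        @include_event_description_in = 1;  -- 1 = include the alert error text in e-mail

    -- Notify the Student operator by e-mail when the alert fires
    EXEC msdb.dbo.sp_add_notification
        @alert_name = N'InternetSales Log Full Alert',
        @operator_name = N'Student',
        @notification_method = 1;  -- 1 = e-mail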
3. In Object Explorer, under the Alerts folder, right-click InternetSales Log Full Alert and click
Properties. Then on the History page, note the Date of last alert and Date of last response values
and click Cancel.
4. View the contents of the C:\inetpub\mailroot\Drop folder and verify that two new e-mail messages
have been created.
5. Double-click the new email messages to view them in Outlook. They should include a notification that
the transaction log was filled, and a notification that the Back Up Log - InternetSales job completed.
6. When you have read the messages, close them and close the Drop window.
Results: After this exercise, you should have created an alert named InternetSales Log Full Alert.