
Tuesday, December 29, 2015

Memory is one of the most important resources for any database management system, and for any IT service in general, and controlling this critical resource is one of a DBA's major challenges.

Most database management systems have one or more parameters to define their share of the total memory installed on the server. This maintains a hard limit that the DBMS should not exceed, while the rest is left for the operating system itself (assuming no other software, such as a web server, is installed on the machine, which is by default the best practice for a production environment).
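
In SQL Server, for example, this limit is the "max server memory (MB)" setting. Below is a minimal sketch of how to inspect it (it is an advanced option, so "show advanced options" must be enabled first, and sufficient permissions are assumed):

-- A minimal sketch: inspecting the configured memory limit in SQL Server
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)';
GO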

This parameter is very useful for defining a hard limit on the memory the database can use, leaving the rest for OS operations in order to maintain server performance. However, it is useless for measuring the amount of memory actually utilized, for the simple fact that it is a static value. To measure the memory usage and its peaks we need another figure: the buffer pool utilization.

The buffer pool is the area inside SQL Server memory used for hosting data and index pages for both read and write operations, and it represents about 99% of SQL Server's total memory. By measuring the buffer pool consumption we can therefore measure the actual memory utilization of a SQL Server instance.
The snapshot below shows the difference between the SQL Server memory utilization as a whole and the buffer pool utilization.

[Snapshot: total SQL Server memory vs. buffer pool utilization over time]

The red line represents the total memory utilization and the blue one represents the buffer pool utilization. The total memory utilization stays the same because it is a static value, while the buffer pool utilization changes with the read/write activity on the system; that is why at some points the utilization reaches the total assigned memory and at other points it drops below it.
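
As a quick illustration (a rough sketch only, not the collection method that will be detailed in the next post), the two figures can be approximated by comparing the configured "max server memory" value with the current number of buffer pool pages (8 KB each):

-- Rough sketch: configured memory limit vs. current buffer pool size
SELECT
(SELECT CONVERT(INT, value_in_use) FROM sys.configurations WHERE name = 'max server memory (MB)') AS [Max_Server_Memory_MB],
(SELECT COUNT_BIG(*) * 8 / 1024 FROM sys.dm_os_buffer_descriptors) AS [Buffer_Pool_Used_MB]
GO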

The next step is to collect the memory and buffer pool utilization figures, and this will be explained in depth in the next post.

Sunday, December 13, 2015

This is one of those critical, unpleasant situations any DBA can face: finding yourself unable to start your SQL instance after the NT AUTHORITY login was disabled, for whatever reason.

In my case it was disabled by the network administrator by mistake, and I did not realize that until restarting my production SQL instance for a planned maintenance operation!

So the situation is very critical: your production instance fails to start with the error message:
“Login failed for user 'NT AUTHORITY\SYSTEM'. Reason: The account is disabled”.

The first positive point here is that the error message is clear enough, so we can start troubleshooting on the right track.

What we need to do in this situation is simply re-enable the NT AUTHORITY\SYSTEM login and re-grant it permission to connect. But the question is: how can we connect to the instance while it is stopped? The answer is to start it in single-user mode, and to do that we need to go through the following steps:

1- Open command prompt in administrator mode.

2- Navigate to the "Binn" folder of the SQL Server installation directory.

3- Start the SQL Server database engine service in single-user mode using the command "sqlservr.exe -s SQL2012 -m", where SQL2012 is the instance name. Here is a complete example:
"C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2012\MSSQL\Binn>sqlservr.exe -s SQL2012 -m"
where "-s SQL2012" specifies the instance name and "-m" starts the instance in single-user mode.

The next step is to enable the NT AUTHORITY\SYSTEM login and grant it CONNECT permission again. To do that we connect to the instance, after starting it in single-user mode, from another command prompt; the commands are as below:

SQL Authentication:
SQLCMD -S <instance IP>,<port> -U <SQL login> -P <password>
ALTER LOGIN [NT AUTHORITY\SYSTEM] ENABLE
GO
GRANT CONNECT SQL TO [NT AUTHORITY\SYSTEM]
GO

Windows Authentication:
SQLCMD -S <instance IP>,<port> -E
ALTER LOGIN [NT AUTHORITY\SYSTEM] ENABLE
GO
GRANT CONNECT SQL TO [NT AUTHORITY\SYSTEM]
GO

After that we can close the two open command prompts and restart the SQL Server instance in normal mode.
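
As an optional sanity check after the restart (a quick query, not part of the original steps), we can confirm that the login is enabled again:

-- is_disabled should now be 0 for the NT AUTHORITY\SYSTEM login
SELECT name, is_disabled FROM sys.server_principals WHERE name = 'NT AUTHORITY\SYSTEM'
GO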

Thursday, April 30, 2015

During my journey with SQL Server, I have often needed to query the SQL Server Agent jobs metadata: getting information about the jobs defined on the instance, which ones are currently running, details like the steps defined per job, and so on.

Because of all the above situations, I used to design a lot of complex queries related to SQL Agent jobs and thought I was doing the right thing... but unfortunately I was not.

The secret behind this is simply the system stored procedure sp_help_job, which literally saves a lot of the time and effort of writing such long, complex queries. sp_help_job is a system stored procedure located in the msdb system database and has a set of parameters that cover most of the information required about SQL Agent jobs.

We can use sp_help_job stored procedure in the below scenarios:

1- Getting all jobs on a SQL instance with some metadata like server, job name, owner, notification methods, create date and last run date … this can be done by executing the stored procedure without specifying any parameters.
exec msdb.dbo.sp_help_job;

2- Getting all current running jobs, this can be done by specifying the parameter @execution_status with value 1.
exec msdb.dbo.sp_help_job @execution_status=1;

3- Getting details for a specific job, by using parameter @job_name and specifying the job name we can get information about the job itself, the jobs steps, the job schedule and the job outcome.
exec msdb.dbo.sp_help_job @job_name = N'Job Name';

4- Getting the jobs owned by a specific login, by specifying parameter @owner_login_name.
exec msdb.dbo.sp_help_job @owner_login_name = N'Login Name';

5- Getting the enabled or disabled jobs on the instance by specifying parameter @enabled with 0 value for disabled jobs and 1 for the enabled ones.
exec msdb.dbo.sp_help_job @enabled=0;
exec msdb.dbo.sp_help_job @enabled=1;

There are some other parameters that can be specified, like the subsystem name, the category name and the date created/modified.
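
For example (the category name below is just a placeholder; the values depend on your environment), filtering by subsystem or by category looks like this:

exec msdb.dbo.sp_help_job @subsystem = N'TSQL';
exec msdb.dbo.sp_help_job @category_name = N'Database Maintenance';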

The below snapshot shows all available parameters for sp_help_job stored procedure.

Sunday, April 12, 2015

This post is part of the "Quick Tips" series, which gives quick guides on how to handle simple tasks in a DBA's day-to-day life. In this post I will show how to add a new clustered disk to an existing SQL Server cluster.

Adding new disks to a running SQL Server is a task any DBA may need to do many times during his/her career. Adding a disk to a standalone SQL instance is a straightforward task, but what about adding disks to a clustered instance?

It is still a straightforward task, even if it takes some extra steps, so let us go through it...

The first step is to add the disk to the Windows cluster. This can be done by opening Failover Cluster Manager, expanding the cluster name, right-clicking the "Storage" folder, selecting "Add Disk" and then selecting the disk to be added to the Windows cluster.

The second step is to add the clustered disk to the targeted SQL Server instance's resource group and then add it to the dependency list of the SQL Server database engine resource.

The third and final step is to make sure that SQL Server now controls the disk and can access it normally. This can be done by querying the DMV sys.dm_io_cluster_shared_drives, which lists the clustered disks attached to SQL Server, or by creating a test database on the newly attached disk.

-- Queries
-- Checking with DMV
select * from sys.dm_io_cluster_shared_drives

-- Checking with creating a testing database
-- Assuming that the clustered disk letter is "I"
CREATE DATABASE [TestDB] ON  PRIMARY
( NAME = N'TestDB', FILENAME = N'I:\TestDB.mdf' , SIZE = 3072KB , FILEGROWTH = 1024KB )
 LOG ON
( NAME = N'TestDB_log', FILENAME = N'I:\TestDB_log.ldf' , SIZE = 1024KB , FILEGROWTH = 10%)
GO
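
If the test database was created only for this verification, it can be dropped afterwards:

-- Cleanup: remove the test database created above
DROP DATABASE [TestDB]
GO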

Monday, April 6, 2015

This is the second part of our series "SQL Server Daily Operational Tasks", and in this post we will focus on the disk space part of our daily checklist.

Disks are among the most valuable resources in any database system; they hold the data and log files that contain our data and transactions, and that is why we need to keep them available and with enough free space for normal database operations.

We will demonstrate here some alternatives for collecting and reporting the SQL Server host's disks and their current free space.

In fact, the way to collect disk space information differs from one SQL Server version to another, simply because the objects used for collecting these figures are not available in all SQL Server versions.

If the targeted version is SQL Server 2008 R2 SP1 or above, we can use the dynamic management function (DMF) sys.dm_os_volume_stats(), which requires two parameters (database_id and file_id). By querying this DMF alongside the catalog view sys.master_files we get the required result.
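
Before building the full report, here is a minimal sketch of the DMF's basic usage: one row per database file with its volume's total and free space in MB.

-- Minimal example: volume space per database file
SELECT DB_NAME(F.database_id) AS [Database], F.name AS [File],
V.volume_mount_point, V.total_bytes/1048576 AS [Total_MB], V.available_bytes/1048576 AS [Available_MB]
FROM sys.master_files F cross apply sys.dm_os_volume_stats(F.database_id, F.file_id) V
GO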

In the script below we will simply go through the following steps:

1- Querying sys.dm_os_volume_stats() and the catalog view sys.master_files and inserting the result set into a temp table.
-- Step 1
SELECT DISTINCT SUBSTRING(volume_mount_point,1,1) AS [Drive], total_bytes/1048576 AS [Total_MB], available_bytes/1048576 AS [Available_MB],
(total_bytes/1048576)-(available_bytes/1048576) AS [Used_MB],
CONVERT(DECIMAL(18,2),CONVERT(DECIMAL(18,2),available_bytes/1048576)/CONVERT(DECIMAL(18,2),total_bytes/1048576)*100 )AS [Available_Space_%],
CONVERT(DECIMAL(18,2),CONVERT(DECIMAL(18,2),(total_bytes/1048576)-(available_bytes/1048576))/CONVERT(DECIMAL(18,2),total_bytes/1048576)*100 )AS [Used_Space_%]
INTO #SpaceUsed
FROM sys.master_files F cross apply sys.dm_os_volume_stats (F.database_id,F.file_id) V
ORDER BY SUBSTRING(volume_mount_point,1,1)
GO

2- Getting the space used by databases on each drive into a common table expression (CTE).
-- Step 2
WITH dbs_size (dbs_size_MB, Drive) AS (
SELECT SUM((size * 8)/1024)AS [dbs_size_MB],SUBSTRING(physical_name,1,1) AS [Drive] FROM sys.master_files
GROUP BY SUBSTRING(physical_name,1,1))

3- Querying the temp table #SpaceUsed and the common table expression dbs_size and adding the current date for the final result set.
-- Step 3
SELECT S.Drive, S.Total_MB, S.Available_MB, S.Used_MB, S.[Available_Space_%], S.[Used_Space_%], D.DBs_Size_MB,
S.Used_MB-D.DBs_Size_MB AS [Other_Files_Size_MB],
CONVERT(DECIMAL(18,2),CONVERT(DECIMAL(18,2),D.DBs_Size_MB)/CONVERT(DECIMAL(18,2),S.Used_MB)*100) AS [DB_Size_%],
CONVERT(DECIMAL(18,2),CONVERT(DECIMAL(18,2),(S.Used_MB-D.DBs_Size_MB))/CONVERT(DECIMAL(18,2),S.Used_MB)*100) AS [Other_Files_Size_%],
CONVERT(DATE,GETDATE()) AS [date]
FROM #SpaceUsed S join dbs_size D ON S.Drive=D.Drive
GO

4- Dropping the temp table #SpaceUsed.
-- Step 4
DROP TABLE #SpaceUsed
GO

The complete script:
--  SQL Server version: SQL Server 2008 R2 SP1 and above
USE MASTER
GO
SELECT DISTINCT SUBSTRING(volume_mount_point,1,1) AS [Drive], total_bytes/1048576 AS [Total_MB], available_bytes/1048576 AS [Available_MB],
(total_bytes/1048576)-(available_bytes/1048576) AS [Used_MB],
CONVERT(DECIMAL(18,2),CONVERT(DECIMAL(18,2),available_bytes/1048576)/CONVERT(DECIMAL(18,2),total_bytes/1048576)*100 )AS [Available_Space_%],
CONVERT(DECIMAL(18,2),CONVERT(DECIMAL(18,2),(total_bytes/1048576)-(available_bytes/1048576))/CONVERT(DECIMAL(18,2),total_bytes/1048576)*100 )AS [Used_Space_%]
INTO #SpaceUsed
FROM sys.master_files F cross apply sys.dm_os_volume_stats (F.database_id,F.file_id) V
ORDER BY SUBSTRING(volume_mount_point,1,1)
GO
WITH dbs_size (dbs_size_MB, Drive) AS (
SELECT SUM((size * 8)/1024)AS [dbs_size_MB],SUBSTRING(physical_name,1,1) AS [Drive] 
FROM sys.master_files GROUP BY SUBSTRING(physical_name,1,1))
SELECT S.Drive, S.Total_MB, S.Available_MB, S.Used_MB, S.[Available_Space_%], S.[Used_Space_%], D.DBs_Size_MB,
S.Used_MB-D.DBs_Size_MB AS [Other_Files_Size_MB],
CONVERT(DECIMAL(18,2),CONVERT(DECIMAL(18,2),D.DBs_Size_MB)/CONVERT(DECIMAL(18,2),S.Used_MB)*100) AS [DB_Size_%],
CONVERT(DECIMAL(18,2),CONVERT(DECIMAL(18,2),(S.Used_MB-D.DBs_Size_MB))/CONVERT(DECIMAL(18,2),S.Used_MB)*100) AS [Other_Files_Size_%],
CONVERT(DATE,GETDATE()) AS [date]
FROM #SpaceUsed S join dbs_size D ON S.Drive=D.Drive
GO
DROP TABLE #SpaceUsed
GO

The result set should be as below:

[Snapshot: disk space report result set]

As we can see, the result set is really amazing: we can check the total drive space, the used amount, the free space, and also the space used by databases versus the space used by other files (which is good to know in case some non-database files are consuming a lot of space on the database drives). All these values are presented in MB and also as percentages.

Unfortunately, there are two main problems with this DMF. The first, as mentioned above, is that sys.dm_os_volume_stats is not available in SQL Server versions prior to SQL Server 2008 R2 SP1. The second is that it does not include disks without database files, for example the local disks of clustered SQL Servers or any drive that is not used as a database drive.

The alternative in these two situations is to use the undocumented extended stored procedure sys.xp_fixeddrives to collect drive information, but the bad luck here is the limited information this procedure delivers, as it only shows the drive letter and the free space in MB.
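
Running it directly shows that limited output, just the drive letter and the free MB:

-- Returns one row per fixed drive: drive letter and free space in MB
EXEC sys.xp_fixeddrives
GO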

If we use this extended stored procedure, the collection steps are as below:


1- Creating a temp table to insert the SP data in.
-- Step 1
CREATE TABLE #AllDrives (DriveName CHAR(3),FreeSpace_MB INT,[date] DATE)
GO

2- Inserting the SP data into the temp table and updating the temp table with the current date.
-- Step 2
INSERT INTO #AllDrives (DriveName,FreeSpace_MB)
EXEC sys.xp_fixeddrives
UPDATE #AllDrives SET [date]=CONVERT(DATE,GETDATE())
GO

3- Selecting the drives data and then dropping the temp table.
-- Step 3
SELECT * FROM #AllDrives
DROP TABLE #AllDrives
GO

The complete script:
-- SQL Server version: SQL Server 2005
USE MASTER
GO
CREATE TABLE #AllDrives (DriveName CHAR(3),FreeSpace_MB INT,[date] SMALLDATETIME)
GO
INSERT INTO #AllDrives (DriveName,FreeSpace_MB)
EXEC sys.xp_fixeddrives
UPDATE #AllDrives SET [date]=CONVERT(SMALLDATETIME,GETDATE())
GO
SELECT * FROM #AllDrives
DROP TABLE #AllDrives
GO

-- SQL Server version: SQL Server 2008 & 2008 R2 RTM
USE MASTER
GO
CREATE TABLE #AllDrives (DriveName CHAR(3),FreeSpace_MB INT,[date] DATE)
GO
INSERT INTO #AllDrives (DriveName,FreeSpace_MB)
EXEC sys.xp_fixeddrives
UPDATE #AllDrives SET [date]=CONVERT(DATE,GETDATE())
GO
SELECT * FROM #AllDrives
DROP TABLE #AllDrives
GO

The result set should be as below:

[Snapshot: xp_fixeddrives result set]

Next Step:

There is another way to collect the drive space information outside of SQL Server: using the powerful Microsoft shell scripting tool, PowerShell. How can we do that? That will be the next post in our series.