Friday, 30 December 2011

Delete files older than retention period from a folder with parameters using Script Task in SQL Server - SSIS by serverku

We have already covered a few ways to delete files from a particular folder, such as deleting files with a Script Task and with a File System Task in SSIS. In those posts we used a Foreach Loop Container to pick up the files one by one and delete them, and applied the same logic in the scripts as well. We also looked at deleting files older than a retention period with a Script Task.

In the earlier posts we did not use any parameters in the scripts; the folder path and retention period were hard-coded. Here I would like to pass the folder path and retention period as parameters into the script and delete the files based on that condition.

1. First, check the files in the target folder.


2. Create the parameters (variables) and set their values. In this case the package will delete files older than 3 days from the E:\ImagesBackup folder.


3. Drag and drop a Script Task.


4. Add the parameters as ReadOnlyVariables in the Script Task.


5. Apply the attached script in the editor. It contains the additional logic that checks each file's last modified date against the specified retention period. Here you can see the parameters used in the script.


Please note that we need to import the System.IO namespace here.

6. Move on to the final step and run the package. Files older than the specified retention period get deleted.



You can use the script below, the same one shown in the image above:

// You need to apply the below line in the "namespaces" region.
using System.IO;

// You need to apply the below lines in place of "// TODO: Add your code here".
int RetentionPeriod = Convert.ToInt32(Dts.Variables["User::Period"].Value.ToString());
string directoryPath = Dts.Variables["User::BackupFolder"].Value.ToString();
string[] oldFiles = System.IO.Directory.GetFiles(directoryPath, "*.*");

foreach (string currFile in oldFiles)
{
    FileInfo currFileInfo = new FileInfo(currFile);

    // Delete the file only if it is older than the retention period.
    if (currFileInfo.LastWriteTime < DateTime.Now.AddDays(-RetentionPeriod))
    {
        currFileInfo.Delete();
    }
}
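For reference, here is a minimal sketch of how the snippet above might sit inside the Script Task's Main() method, with basic error handling added. The variable names User::Period and User::BackupFolder match the ones used above; the try/catch, the FireError call and the ScriptResults enum (which comes from the Script Task template) are assumptions added for illustration, not part of the original package.

public void Main()
{
    try
    {
        // Read the package variables configured as ReadOnlyVariables.
        int retentionPeriod = Convert.ToInt32(Dts.Variables["User::Period"].Value.ToString());
        string directoryPath = Dts.Variables["User::BackupFolder"].Value.ToString();

        foreach (string currFile in Directory.GetFiles(directoryPath, "*.*"))
        {
            FileInfo currFileInfo = new FileInfo(currFile);
            if (currFileInfo.LastWriteTime < DateTime.Now.AddDays(-retentionPeriod))
            {
                currFileInfo.Delete();
            }
        }

        Dts.TaskResult = (int)ScriptResults.Success;
    }
    catch (Exception ex)
    {
        // Report the failure back to the package so the task fails visibly.
        Dts.Events.FireError(0, "Delete old files", ex.Message, string.Empty, 0);
        Dts.TaskResult = (int)ScriptResults.Failure;
    }
}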
Hope you liked this post. Stay tuned for more.

Friday, 23 December 2011

Delete files older than retention period from specified folder using Script Task in SQL Server - SSIS by serverku

Earlier we saw how to delete files using a Script Task and using a File System Task in SSIS. For the Script Task we wrote the logic to get the files from the specified folder and delete them directly in the script. For the File System Task we used a Foreach Loop Container and a variable to hold each file name, which was then passed to the File System Task to delete the file.

Here we do the same thing, but instead of deleting all the files in the given folder, we delete only the files older than the specified number of days. This is nothing but a minor change in the script, where we set the folder path and retention period. Let's go through the steps:

1. First, check the files in the target folder.


2. Drag and drop a Script Task.


3. Open the script editor from the task properties.


4. Apply the attached script in the editor. It contains the additional logic that checks each file's last modified date against the specified retention period.


Please note that we need to import the System.IO namespace here.

5. Move on to the final step and run the package. Files older than the specified retention period get deleted.

You can also find the script code here:

// You need to apply the below line in the "namespaces" region.
using System.IO;

// You need to apply the below lines in place of "// TODO: Add your code here".
int RetentionPeriod = 3;
string directoryPath = @"E:\ImagesBackup";
string[] oldFiles = System.IO.Directory.GetFiles(directoryPath, "*.png");

foreach (string currFile in oldFiles)
{
    FileInfo currFileInfo = new FileInfo(currFile);

    // Delete the file only if it is older than the retention period.
    if (currFileInfo.LastWriteTime < DateTime.Now.AddDays(-RetentionPeriod))
    {
        currFileInfo.Delete();
    }
}
Hope you liked this post. Stay tuned for more.

Friday, 16 December 2011

Archive old database backup files using TSQL Script - SQL Server by serverku

I ran into an issue where database backups were failing due to lack of space on the disk drive. This drive is allocated specifically for database backups, and the space was being eaten up by many old backup files. I deleted the old files manually and kept doing this for many days. If you create a maintenance plan, it has an option to delete old backup files, but I take backups through a stored procedure, so I did not have a built-in option to delete old and unused backup files.

Finally, I created a script to clean up those old backups. It is a stored procedure that takes three parameters: the backup type (full, differential or transaction log backups to delete), the start day (from today) and the end day (from today).

Please make sure XP_CMDSHELL is enabled on the instance, as it is required to physically delete the backup files from disk. Here is a query to enable it.
USE MASTER
GO

EXEC SP_CONFIGURE 'show advanced options',1
GO
RECONFIGURE
GO
EXEC SP_CONFIGURE 'XP_CMDSHELL',1
GO
RECONFIGURE
GO
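If you want to confirm that the setting took effect, an optional check (not part of the original post) is to query sys.configurations; value_in_use should be 1 for both options.

-- Optional verification of the configuration options.
SELECT name, value, value_in_use
FROM sys.configurations
WHERE name IN ('show advanced options', 'xp_cmdshell')
GO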
As mentioned, I have created a script to delete the old database backups; please find the stored procedure below.
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE [dbo].[DeleteDBBackupOldFiles]
@BackupType char(1) = 'D', -- 'D'-Full, 'I'-Differential, 'L'-Log backup type
@StartDayFromToday int,
@EndDayFromToday int
AS
BEGIN
SET NOCOUNT ON

DECLARE @IsFileExists int
DECLARE @DeletedFile VARCHAR(500)
DECLARE @OldFiles VARCHAR(500)

DECLARE OldFiles CURSOR FAST_FORWARD FOR
SELECT
bmf.physical_device_name
from msdb.dbo.backupset bs
INNER JOIN msdb.dbo.backupmediafamily bmf
ON (bs.media_set_id=bmf.media_set_id)
WHERE DEVICE_TYPE = 2
AND TYPE = @BackupType
AND BACKUP_START_DATE <= GETDATE() - @StartDayFromToday
AND BACKUP_START_DATE >= GETDATE() - @EndDayFromToday
-- AND DATEDIFF(DAY, BACKUP_START_DATE, GETDATE()) BETWEEN @StartDayFromToday AND @EndDayFromToday

OPEN OldFiles

FETCH NEXT FROM OldFiles INTO @OldFiles

WHILE @@FETCH_STATUS =0
BEGIN

EXEC master.dbo.XP_FILEEXIST @OldFiles, @IsFileExists OUTPUT
IF @IsFileExists = 1
BEGIN

SET @DeletedFile = 'DEL ' + @OldFiles

EXEC master.dbo.XP_CMDSHELL @DeletedFile

PRINT 'Deleted File : ' + @OldFiles

END
FETCH NEXT FROM OldFiles INTO @OldFiles
END
CLOSE OldFiles
DEALLOCATE OldFiles
END
Finally, after creating the stored procedure, it is time to run it and clean up the old database backups. It will remove the old backup files according to the specified parameters. For example, passing 'D', 3 and 10 deletes full database backups that were created between 3 and 10 days ago.
EXEC [DeleteDBBackupOldFiles] 
@BackupType = 'D',
@StartDayFromToday = 3,
@EndDayFromToday = 10
Which method do you use for old database backup maintenance? You can read my earlier posts on deleting files using the File System Task and the Script Task in SSIS.

Wednesday, 14 December 2011

Delete files from specified folder using Script Task in SQL Server - SSIS by serverku

Earlier we saw how to delete files using the File System Task in SSIS. We used it inside a Foreach Loop Container, with a variable holding each file name passed from the loop one by one and finally used by the File System Task to delete the file.

Now I am going to use a Script Task to delete all the files from a specified folder. The script gets each file from the folder and deletes it, as you can see in the following steps.

1. Drag and drop a Script Task.


2. Open the script editor from the task properties.


3. Apply the attached script in the editor and save it.


Please note that we need to import the System.IO namespace here.

4. Move on to the final step and run the package. The files get deleted.


You can also find the script code here:
// You need to apply the below line in the "namespaces" region.
using System.IO;

// You need to apply the below lines in place of "// TODO: Add your code here".
string directoryPath = @"E:\TestFolder";
string[] oldFiles = System.IO.Directory.GetFiles(directoryPath, "*.txt");

foreach (string currFile in oldFiles)
{
    FileInfo currFileInfo = new FileInfo(currFile);
    currFileInfo.Delete();
}
Hope you liked this post. Stay tuned for more.

Saturday, 10 December 2011

Delete files from specified folder using File System Task in SQL Server - SSIS by serverku

We already know various methods to delete particular files, or all files, from a specified folder, such as xp_delete_file, Ole Automation Procedures, and the xp_cmdshell command-line utility, which we use for archiving or cleaning up old files. Here I am going to share the screens for deleting files in SSIS using the File System Task, one by one.
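As a quick point of comparison with those T-SQL options, here is a minimal sketch of the xp_cmdshell approach; the folder path and extension are placeholders, and xp_cmdshell must already be enabled on the instance.

-- Hypothetical example: delete all .txt files in a folder via xp_cmdshell.
EXEC master.dbo.xp_cmdshell 'DEL /Q "E:\TestFolder\*.txt"'
GO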

1. The target folder has some test files.


2. Drag and drop a Foreach Loop Container and a File System Task. The Foreach Loop Container is used to get all the files inside the folder one by one and process them with the File System Task. Open the Foreach Loop Container properties, go to the Collection tab, and select the Foreach File Enumerator as shown in the screen below.


3. Select the folder from which the files need to be deleted. Also apply a file extension filter if you need one.


4. For the file assignment we need one variable, populated by the Foreach Loop Container on each iteration. Assign it under Variable Mappings as in the screenshot below.



5. Configure the File System Task: select the Delete File operation and define the Source Connection.



6. Finally, run the package and the files will get deleted.


Please write your comments.

Friday, 2 December 2011

My recent blog posts and just learned tips - SQL Server by serverku

Recently I have published some articles and some 'just learned' tips during the month of October. This post simply summarizes the recently published posts; you can find the links below for the details.

Blog Posts:
Just Learned tips:
Hope these posts help you.

Wednesday, 30 November 2011

Backup Statistics and History - SQL Server by serverku

Reviewing and monitoring database backups is a daily activity for every DBA, since these backups are used to restore databases elsewhere, for reporting purposes, or for log shipping. Database backups, whatever their type, are the most important factor and the first option in case of disaster recovery, and transaction log backups can further reduce data loss.

You can read my earlier posts: Database Backup Compression, Database Backup files Verification, Automated All Databases Backups Script and Split Database Full Backup to Multiple files.

I would like to share a script that shows the database backup status and history: when the backups ran, where they were taken, the backup types, the physical devices, the backup sizes, the time taken to perform the backups, and other related backup statistics. Here is the script to collect the database backup statistics and status information.
USE MSDB
GO

SELECT
bs.server_name AS Server, -- Server name
bs.database_name AS DatabaseName , -- Database name
CASE bs.compatibility_level
WHEN 80 THEN 'SQL Server 2000'
WHEN 90 THEN 'SQL Server 2005 '
WHEN 100 THEN 'SQL Server 2008'
WHEN 110 THEN 'SQL Server 2012'
END AS CompatibilityLevel , -- Return backup compatibility level
recovery_model AS Recoverymodel , -- Database recovery model
CASE bs.type
WHEN 'D' THEN 'Full'
WHEN 'I' THEN 'Differential'
WHEN 'L' THEN 'Log'
WHEN 'F' THEN 'File or filegroup'
WHEN 'G' THEN 'Differential file'
WHEN 'P' THEN 'Partial'
WHEN 'Q' THEN 'Differential partial'
END AS BackupType, -- Type of database backup
bs.backup_start_date AS BackupStartDate, -- Backup start date
bs.backup_finish_date AS BackupFinishDate, -- Backup finish date
bmf.physical_device_name AS PhysicalDevice, -- Backup physical location
CASE device_type
WHEN 2 THEN 'Disk - Temporary'
WHEN 102 THEN 'Disk - Permanent'
WHEN 5 THEN 'Tape - Temporary'
WHEN 105 THEN 'Tape - Permanent'
ELSE 'Other Device'
END AS DeviceType, -- Device type
bs.backup_size AS [BackupSize(In bytes)], -- Uncompressed backup size (in bytes)
bs.compressed_backup_size AS [CompressedBackupSize(In bytes)] -- Compressed backup size (in bytes)
FROM msdb.dbo.backupset bs WITH (NOLOCK)
INNER JOIN msdb.dbo.backupmediafamily bmf WITH (NOLOCK)
ON (bs.media_set_id=bmf.media_set_id)
ORDER BY bs.backup_start_date DESC

GO

(Click on image to enlarge)
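If you only want recent history for a particular database, a filter along these lines can be added; the database name and the 7-day window below are illustrative values only.

-- Example: backups for one database taken in the last 7 days.
SELECT
bs.database_name,
bs.type,
bs.backup_start_date,
bs.backup_finish_date,
bmf.physical_device_name,
bs.backup_size
FROM msdb.dbo.backupset bs
INNER JOIN msdb.dbo.backupmediafamily bmf
ON (bs.media_set_id = bmf.media_set_id)
WHERE bs.database_name = 'ReportServer' -- assumed database name
AND bs.backup_start_date >= DATEADD(DAY, -7, GETDATE())
ORDER BY bs.backup_start_date DESC
GO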

Hope you liked this post.

Friday, 25 November 2011

NOLOCK Hint & READ UNCOMMITTED Isolation level on table and Query/Session level - SQL Server by serverku

When we create a new database, it is created with the default isolation level, which is READ COMMITTED. If an update transaction is holding locks on table rows under the READ COMMITTED isolation level, how can we read data from that table in another session while the update transaction is still running?

How?
The NOLOCK hint or the READ UNCOMMITTED isolation level can help here, as they behave the same way. There are other options as well, but I am going to present the NOLOCK hint and the READ UNCOMMITTED isolation level here.

The NOLOCK hint is applied at the table level, so it has to be added to every table referenced in the query, which is lengthy and time consuming. With READ UNCOMMITTED we do not need to touch every table; we simply set it at the session or query level, at the top of the query or stored procedure. Let us look at a small demo to illustrate this. First, check the database's default isolation level:

USE DEMO
GO
DBCC USEROPTIONS


Start by creating the table object.

IF (OBJECT_ID('TrnTable','U') > 0)
DROP TABLE TrnTable

CREATE TABLE TrnTable
(
TrnId INT ,
TrnData VARCHAR(100),
TrnDate DATETIME
)

GO

-- Inserting some sample records in table

INSERT INTO TrnTable(TrnId,TrnData,TrnDate)
SELECT 1,'TrnData-1',GETDATE()
UNION ALL
SELECT 2,'TrnData-2',GETDATE()
UNION ALL
SELECT 3,'TrnData-3',GETDATE()
UNION ALL
SELECT 4,'TrnData-4',GETDATE()
UNION ALL
SELECT 5,'TrnData-5',GETDATE()

GO

Now, for the demo, run the script below in session 1:

-- Script in session 1
-- Running query with transaction named TRAN1
BEGIN TRANSACTION TRAN1

UPDATE TrnTable
SET TrnData = 'Changed TrnData'
WHERE TrnId = 3
-- Not Committed/Rollback this transaction

After that, try to select the rows updated (but not yet committed) by the session above from another session. The query keeps waiting for the lock held by session 1 to be released:


We are not closing this transaction. Instead, open a new session and run the following scripts, one with the NOLOCK hint at the table level and one with the READ UNCOMMITTED isolation level at the query level.

-- Script in session 3
-- With NOLOCK hint
SELECT
TrnId,
TrnData,
TrnDate
FROM TrnTable WITH (NOLOCK)
WHERE TrnId = 3

GO

-- With READ UNCOMMITTED isolation level
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED

SELECT
TrnId,
TrnData,
TrnDate
FROM TrnTable
WHERE TrnId = 3

GO


Do not forget to commit or roll back transaction TRAN1:
Commit Transaction TRAN1
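If you changed a session with SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED, you may also want to switch that session back afterwards; a small reminder snippet, assuming the default shown by DBCC USEROPTIONS above is READ COMMITTED.

-- Put the session back to the default isolation level when done.
SET TRANSACTION ISOLATION LEVEL READ COMMITTED
GO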

I hope you liked this post. Please let me know which of these you use, or whether you use something else.

Saturday, 19 November 2011

"Can not add a shared registered server with the same name as configuration Server" - Central Management Servers register error in SQL Server 2008 by serverku

You may know how to register SQL Server instances with Central Management Servers (CMS) and how to run a multi-server query against all of the registered SQL Server instances.

There you can see that I registered one shared SQL Server Denali (2011) instance as the CMS and named it Denali. Under that shared instance, I registered a SQL Server 2008 instance and the same Denali instance that is already acting as the CMS. The SQL Server 2008 instance registered successfully, but the Denali instance encountered an error. How I eventually registered it is what I am going to explain here.


This is because the same SQL Server instance is already registered as the shared (CMS) server.

How can I register the same SQL Server Denali (2011) instance again?

I changed the port and applied a static port from the SQL Server TCP/IP properties, which are available in SQL Server Configuration Manager.


Then see the screenshot below, where I registered the SQL Server Denali instance using the server name together with the port.


Now both SQL Server instances are registered successfully.


Have you hit this error before? How did you resolve it?

Wednesday, 16 November 2011

Database Backup Compression, Amazing feature for DBA - SQL Server 2008 by serverku


Production database servers may have databases that are heavy in size. Maintaining backups of those databases is hard and lengthy, as the backups take more time to execute and consume a lot of CPU, memory and IO. And importantly, the backup activity should be completed within the downtime window, or at a time when few users are connected to the databases.

What is the solution?
One alternative is to use a third-party backup tool that can help in this situation. But SQL Server itself provides a great feature, supported in SQL Server 2008 and newer versions: "Backup Compression". You can read my earlier posts on Automated All Databases Backups, Database Backup files Verification and Details, and Split Database Full Backup to Multiple files.

Using this feature we can take a database backup with the compression option. It really reduces the time required for the backup and the server IO (at the cost of some extra CPU during the backup), and it is a very useful feature for the DBA. Let us look at the example below, which makes clear the difference between uncompressed and compressed backups. We will first perform a non-compressed backup of a database of about 4 GB in size.
Using Management Studio :


Using TSQL:

#1. Performing a non-compressed backup.

SET STATISTICS IO ON
SET STATISTICS TIME ON

BACKUP DATABASE ReportServer TO
DISK = N'D:\DBBackups\Compressed\ReportServer_NonCompressedBackup.bak'
WITH NAME = N'ReportServer-Full NonCompressed Database Backup',
NO_COMPRESSION -- Specifying option here

SET STATISTICS IO OFF
SET STATISTICS TIME OFF
GO


#2. Performing a compressed backup.

SET STATISTICS IO ON
SET STATISTICS TIME ON

BACKUP DATABASE ReportServer TO
DISK = N'D:\DBBackups\Compressed\ReportServer_CompressedBackup.bak'
WITH NAME = N'ReportServer-Full Compressed Database Backup',
COMPRESSION -- Specifying option here

SET STATISTICS IO OFF
SET STATISTICS TIME OFF
GO


From the result output you can view the backup execution time and CPU usage. Below is a screen showing the sizes of both backups.

You can use the query below to get the backup statistics:

SELECT 
bs.database_name AS DatabaseName , -- Database name
backup_size/compressed_backup_size as CompressionRatio,
CASE bs.type
WHEN 'D' THEN 'Full'
WHEN 'I' THEN 'Differential'
WHEN 'L' THEN 'Log'
WHEN 'F' THEN 'File or filegroup'
WHEN 'G' THEN 'Differential file'
WHEN 'P' THEN 'Partial'
WHEN 'Q' THEN 'Differential partial'
END AS BackupType, -- Type of database backup
bs.backup_start_date AS BackupstartDate, -- Backup start date
bs.backup_finish_date AS BackupFinishDate, -- Backup finish date
bmf.physical_device_name AS PhysicalDevice, -- Backup physical location
bs.backup_size AS [BackupSize(In bytes)], -- Normal backup size (In bytes)
compressed_backup_size AS [CompressedBackupSize(In bytes)] -- Compressed backup size (In bytes)
FROM msdb.dbo.backupset bs
INNER JOIN msdb.dbo.backupmediafamily bmf
ON (bs.media_set_id=bmf.media_set_id)
AND database_name = 'ReportServer'
ORDER BY bs.backup_start_date DESC


You can set the default backup setting to compressed as follows.

By TSQL :

USE MASTER
GO

EXEC SP_CONFIGURE 'backup compression default', 1
GO
RECONFIGURE WITH OVERRIDE;
GO
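To confirm the instance-wide default afterwards, you can check sys.configurations; this is only an optional verification query, with value_in_use = 1 meaning compression is the default.

-- Optional check of the instance-wide default.
SELECT name, value, value_in_use
FROM sys.configurations
WHERE name = 'backup compression default'
GO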

From UI :


I hope you like this feature.

Saturday, 5 November 2011

Review of some replication issues and workaround - SQL Server by serverku


After posting the screenshots for configuring transactional replication in SQL Server with the publisher acting as its own distributor, I want to share some of the issues I ran into while working with it. You might have faced and resolved the same issues as well. Please comment if you have faced any other replication issues, along with their solutions.

1."There is insufficient system memory to run this query" - Distributor agent error for subscriber database : During working with replication this is one of the errors which I encountered and get resolved it. This is coming with a Distributor agent, and it stopped working. It's due to the memory issue. I have flashed the unused memory consumed by Adhoc queries and system cache.

The error,
" There is insufficient system memory to run this query. (Source: MSSQL_REPL, Error number: MSSQL_REPL27453) "
To flush the memory, use the commands below as a workaround for the above error:

DBCC FREESESSIONCACHE
DBCC FREEPROCCACHE
DBCC FREESYSTEMCACHE('ALL')

2. "Could not find stored procedure 'sp_MSins_TableName' in replication in sql server" - Distributor reader agent error: It came after reinitialize all subscriptions for one publication and it stopped working with error,
"Could not find stored procedure 'sp_MSins_TableName' in replication in sql server"
This is because the replication insert procedure, named like 'sp_MSins_TableName', was not created at the subscriber. To solve this we need to create the missing replication procedures using 'sp_scriptpublicationcustomprocs', which scripts the custom INSERT, UPDATE, and DELETE procedures for all table articles in a publication. The workaround is:

EXEC sp_scriptpublicationcustomprocs 'PublicationName'

After running it at the publisher and executing the generated scripts in the subscriber database, the agent started working again.

3."The row was not found at the Subscriber when applying the replicated command" : It encountered it with Distributor agent and it was stopped due to error as,
"Command attempted:
if @@trancount > 0 rollback tran
(Transaction sequence number: 0x000CAB2A00038CB6003B00000000, Command ID: 2)
Error messages:
The row was not found at the Subscriber when applying the replicated command. (Source: MSSQLServer, Error number: 20598)"
This row conflict was occurring for some of the tables. We can get the details of the affected article with the following query (run in the distribution database) to work towards a solution.
SELECT
*
FROM dbo.MSarticles m
WHERE EXISTS
(
SELECT mc.Article_id from MSrepl_commands mc
WHERE mc.xact_seqno = 0x000CAB2A00038CB6003B00000000
and mc.Article_id = m.article_id
)
Hope you found this useful. I will share more replication errors if I run into them in the future.

Thursday, 3 November 2011

Multi Monitor Support of SSMS - A new feature of SQL Server Denali by serverku

I have drafted all the new features and enhancements introduced by SQL Server Denali CTP1 and CTP3. Multi-monitor support is one of the new features among them.

What can we do?
1. Using this feature we can open multiple query editor windows on separate screens.
2. You can work in each one separately and run and view the output individually.
3. You can resize all the windows.
4. You can drag and drop them wherever you want.
5. You can use Registered Servers, Object Explorer and Object Explorer Details across multiple screens.

For a better idea, see the screenshots below, which I captured while playing with it.

1. This screenshot shows multiple query editor windows in SSMS.


2. This screenshot was captured while docking/undocking the windows.


Hope you like this post.

Wednesday, 26 October 2011

"The database owner SID recorded in the master database differs from the database owner SID recorded in database" - SQL CLR DLL Register error in SQL Server by serverku

Recently I was working with SQL CLR functionality and created a DLL for it. But while registering this DLL in the database I got a surprising error. Let me show you the script so you have a better idea. Here is the script to register the DLL:
SP_CONFIGURE 'clr enabled',1
GO
RECONFIGURE
GO

USE SQLCLRDb
GO

CREATE ASSEMBLY [SQLCLR_ASSEMBLY]
FROM 'C:\SQLCLR_ASSEMBLY.dll' WITH permission_set = UNSAFE

GO
The error is,
The database owner SID recorded in the master database differs from the database owner SID recorded in database.
You should correct this situation by resetting the owner of database using the ALTER AUTHORIZATION statement.
The solution for this issue is the script below. It changes the database owner of the database in question and turns TRUSTWORTHY on.
USE SQLCLRDb 
GO

ALTER DATABASE SQLCLRDb SET TRUSTWORTHY ON
go

EXEC SP_CHANGEDBOWNER 'UserName'
GO
After running the above query I got past the issue and registered the SQL CLR DLL successfully. You may have hit the same or different issues with SQL CLR; please comment with your issues and solutions.
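The error message itself points at ALTER AUTHORIZATION, so an equivalent way to reset the owner is shown below as a sketch; the database name SQLCLRDb matches the demo above, while the login name is only a placeholder.

-- Equivalent owner reset using ALTER AUTHORIZATION (login name is a placeholder).
USE master
GO
ALTER AUTHORIZATION ON DATABASE::SQLCLRDb TO [UserName]
GO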

Friday, 21 October 2011

Database Backup files Verification and Details - SQL Server by serverku

As a best practice, the DBA needs to verify whether each database backup completed properly, and also make sure the backups are readable and can be restored. Many databases are scheduled for full and differential backups weekly or daily and transaction log backups every hour, or whatever the requirement is, to minimize data loss.

You can read my earlier articles on the script to automate all types of database backups and on splitting a database full backup into multiple files.

How can we verify the backup files?
SQL Server provides the VERIFYONLY clause, which we can use with the RESTORE command. Please see the details below for how it works.

#1. VERIFYONLY
This verifies database backup integrity and checks whether the backup is corrupted. Per SQL Server Books Online, it verifies the backup but does not restore it, and checks that the backup set is complete and the entire backup is readable. However, RESTORE VERIFYONLY does not attempt to verify the structure of the data contained in the backup volumes. In Microsoft SQL Server, RESTORE VERIFYONLY has been enhanced to do additional checking on the data to increase the probability of detecting errors; the goal is to be as close to an actual restore operation as practical.
If the backup is valid, the SQL Server Database Engine returns a success message. Let us run the query to verify the full, differential and transaction log backups and look at the output.
RESTORE VERIFYONLY FROM DISK =  'D:\DBBackups\Backups\ReportServer_Backup_20110517_182654.bak'
GO
RESTORE VERIFYONLY FROM DISK = 'D:\DBBackups\Backups\ReportServer_Backup_20110517_183348.bak'
GO
RESTORE VERIFYONLY FROM DISK = 'D:\DBBackups\Backups\ReportServer_Backup_20110517_183843.trn'
GO

How can we get the backup file details?
Using HEADERONLY with the RESTORE command we get the details of the database backup files. Let us run the query against the full, differential and transaction log backups and look at the output.

#2. HEADERONLY
Per SQL Server Books Online, it returns a result set containing the backup header information for all backup sets on a particular backup device.
RESTORE HEADERONLY FROM DISK =  'D:\DBBackups\Backups\ReportServer_Backup_20110517_182654.bak'
GO
RESTORE HEADERONLY FROM DISK = 'D:\DBBackups\Backups\ReportServer_Backup_20110517_183348.bak'
GO
RESTORE HEADERONLY FROM DISK = 'D:\DBBackups\Backups\ReportServer_Backup_20110517_183843.trn'
GO

There is another option, FILELISTONLY, which returns the logical and physical file details of the backup files.

#3. FILELISTONLY
Per SQL Server Books Online, it returns a result set containing a list of the database and log files contained in the backup set.
RESTORE FILELISTONLY FROM DISK =  'D:\DBBackups\Backups\ReportServer_Backup_20110517_182654.bak'
GO
RESTORE FILELISTONLY FROM DISK = 'D:\DBBackups\Backups\ReportServer_Backup_20110517_183348.bak'
GO
RESTORE FILELISTONLY FROM DISK = 'D:\DBBackups\Backups\ReportServer_Backup_20110517_183843.trn'
GO
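The FILELISTONLY output is especially handy when restoring to a server with a different drive layout, since it gives the logical file names to use with MOVE. Below is a minimal sketch only; the logical names and target paths are assumptions for illustration, not values taken from the actual backup.

-- Example restore using the logical names reported by FILELISTONLY
-- (logical names and paths below are assumed placeholders).
RESTORE DATABASE ReportServer_Copy
FROM DISK = 'D:\DBBackups\Backups\ReportServer_Backup_20110517_182654.bak'
WITH MOVE 'ReportServer' TO 'E:\Data\ReportServer_Copy.mdf',
MOVE 'ReportServer_log' TO 'E:\Data\ReportServer_Copy_log.ldf',
STATS = 10
GO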

What checks do you perform on your backups?

Saturday, 15 October 2011

Grouping Sets vs Native method of Group By - Performance review in SQL Server 2008 by serverku

Previously I posted an overview of GROUPING SETS and how we can get aggregated data for different grouping sets with GROUPING SETS versus the native GROUP BY method.

If you have not read my earlier post, please read it before going ahead with this demonstration. In this demo I am going to compare the performance of GROUPING SETS and the native GROUP BY method. Let us start the demo.
-- Creating database and table
CREATE DATABASE GroupingDB

GO

USE GroupingDB

GO

IF (OBJECT_ID('GroupingTable') > 0)
DROP TABLE GroupingTable

CREATE TABLE GroupingTable
(
MainCategoryName VARCHAR(100)
,SubCategoryId VARCHAR(100)
,VALUE BIGINT
)

GO
Now we insert a number of records into the table for the demo.
-- Insert some demo records for different group sets
INSERT INTO GroupingTable
SELECT 'Main-1','Sub-1',100
GO 500

INSERT INTO GroupingTable
SELECT 'Main-1','Sub-2',200
GO 500

INSERT INTO GroupingTable
SELECT 'Main-2','Sub-1',300
GO 500

INSERT INTO GroupingTable
SELECT 'Main-3','Sub-1',300
GO 500

INSERT INTO GroupingTable
SELECT 'Main-3','Sub-2',400
GO 500
As mentioned, run both scripts and compare their execution plans.
-- Get aggregate data using native method of group by of different sets
SELECT NULL, NULL,
SUM(VALUE) as Total
FROM GroupingTable
Union all
SELECT  MainCategoryName
,NULL
,SUM(VALUE) as Total
FROM GroupingTable
GROUP BY MainCategoryName
Union all
SELECT NULL,
SubCategoryId
,SUM(VALUE) as Total
FROM GroupingTable
GROUP BY SubCategoryId
union all
SELECT
MainCategoryName
,SubCategoryId
,SUM(VALUE) as Total
FROM GroupingTable
GROUP BY
MainCategoryName
,SubCategoryId

-- Get aggregate data using Grouping Sets of different sets
SELECT
MainCategoryName
,SubCategoryId
,SUM(VALUE) as Total
FROM GroupingTable
GROUP BY
GROUPING SETS
(
(MainCategoryName ,SubCategoryId),
(MainCategoryName),
(SubCategoryId),
()

)
ORDER BY MainCategoryName,SubCategoryId


You can see that both scripts return the same rows and the same result set. Now look at the execution plans.


(Click on image to enlarge)

This performance review is entirely data-dependent and depends on your business requirement. Before implementing this feature, please check the execution plans yourself and draw your own conclusion.
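One related tip, added here purely as an illustration rather than part of the original comparison: in the GROUPING SETS output, the NULLs that mark subtotal rows can be distinguished from real NULL data by using the GROUPING() function.

-- GROUPING() returns 1 when the column is aggregated away in that row (a subtotal/total row).
SELECT
MainCategoryName
,SubCategoryId
,SUM(VALUE) AS Total
,GROUPING(MainCategoryName) AS MainCategoryIsRolledUp
,GROUPING(SubCategoryId) AS SubCategoryIsRolledUp
FROM GroupingTable
GROUP BY
GROUPING SETS
(
(MainCategoryName, SubCategoryId),
(MainCategoryName),
(SubCategoryId),
()
)
ORDER BY MainCategoryName, SubCategoryId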

Sunday, 2 October 2011

Multi server Query with Central Management Servers - SQL Server 2008 by serverku

Before SQL Server 2008, when we needed to gather server-level or database-level information, we had to run the script individually by connecting to each SQL Server instance. SQL Server 2008 made this easier by introducing a new feature: Central Management Servers (CMS).

With Central Management Servers we can register SQL Server instances under a shared SQL Server instance. Then we can run a query against all the registered instances at once and get the details for all of them. Let me demonstrate this in detail here.

1. How to open Central Management Servers?

Go to View --> Registered Servers 
or
press Ctrl + Alt + G.

2. How do we register SQL Server instances in CMS?

Expand Database Engine under Registered Servers. Right-click on Central Management Servers and choose the option to register a central management server. A new screen will appear, as below:


In the screen above I registered the SQL Server Denali instance, which will be the shared SQL Server instance. Next I created a New Server Group under the CMS and then registered the SQL Server 2008 and SQL Server Denali (2011) instances by right-clicking on the group, choosing the registration option, and following the same steps as for the Denali instance.


3. How can we run a multi-server query against all the SQL Server instances?

Right-click on the shared SQL Server instance (or a group) under CMS and click New Query.


Let's do that here: execute a query and see what the result looks like.
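As an illustration of the kind of query that can be run from that multi-server connection (SSMS adds a server name column to multi-server results by default), here is a simple example returning one row per registered instance:

-- Run against the CMS group: each registered instance returns its own row.
SELECT
SERVERPROPERTY('ServerName') AS ServerName,
SERVERPROPERTY('ProductVersion') AS ProductVersion,
SERVERPROPERTY('Edition') AS Edition
GO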


This feature is very helpful for running a script against all the registered SQL Server instances. Hope you liked this post.

Thursday, 29 September 2011

How to invoke a Job through SQL script - SQL Server by serverku

We mostly create SQL jobs and schedule them to run automatically. But sometimes, as per requirement, we need to invoke a job on demand rather than on a schedule. I have done the same thing, and the following is a small demo. Let's create one stored procedure and schedule it in a job.
USE [Master]
GO

CREATE PROCEDURE JobInvokeProc
AS
BEGIN
SET NOCOUNT ON

WAITFOR DELAY '00:00:15';

END
GO
Now we will create the job that executes this stored procedure. The scripted job definition is as follows:
USE [msdb]
GO

/****** Object: Job [Start_JobInvokeProc] Script Date: 05/05/2011 07:51:53 ******/
BEGIN TRANSACTION
DECLARE @ReturnCode INT
SELECT @ReturnCode = 0
/****** Object: JobCategory [[Uncategorized (Local)]]] Script Date: 05/05/2011 07:51:53 ******/
IF NOT EXISTS (SELECT name FROM msdb.dbo.syscategories WHERE name=N'[Uncategorized (Local)]' AND category_class=1)
BEGIN
EXEC @ReturnCode = msdb.dbo.sp_add_category @class=N'JOB', @type=N'LOCAL', @name=N'[Uncategorized (Local)]'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback

END

DECLARE @jobId BINARY(16)
EXEC @ReturnCode = msdb.dbo.sp_add_job @job_name=N'Start_JobInvokeProc',
@enabled=1,
@notify_level_eventlog=0,
@notify_level_email=0,
@notify_level_netsend=0,
@notify_level_page=0,
@delete_level=0,
@description=N'No description available.',
@category_name=N'[Uncategorized (Local)]',
@owner_login_name=N'paresh-PC\paresh', @job_id = @jobId OUTPUT
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
/****** Object: Step [Step_JobInvokeProc] Script Date: 05/05/2011 07:51:54 ******/
EXEC @ReturnCode = msdb.dbo.sp_add_jobstep @job_id=@jobId, @step_name=N'Step_JobInvokeProc',
@step_id=1,
@cmdexec_success_code=0,
@on_success_action=1,
@on_success_step_id=0,
@on_fail_action=2,
@on_fail_step_id=0,
@retry_attempts=0,
@retry_interval=0,
@os_run_priority=0, @subsystem=N'TSQL',
@command=N'exec JobInvokeProc',
@database_name=N'master',
@flags=0
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_update_job @job_id = @jobId, @start_step_id = 1
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_add_jobserver @job_id = @jobId, @server_name = N'(local)'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
COMMIT TRANSACTION
GOTO EndSave
QuitWithRollback:
IF (@@TRANCOUNT > 0) ROLLBACK TRANSACTION
EndSave:

GO
Now I have one more procedure, which I took from the msdb database and customized; it is used to check the SQL job status, i.e. whether the job is running or not. Here is the script to get the job status.
USE [master]
GO

CREATE PROCEDURE GetJobStatus
@job_Name varchar(max),
@status INT OUTPUT
AS
BEGIN
DECLARE @can_see_all_running_jobs INT ,
@job_type VARCHAR(12) , -- LOCAL or MULTI-SERVER
@owner_login_name sysname ,
@subsystem NVARCHAR(40) ,
@category_id INT ,
@enabled TINYINT ,
@execution_status INT , -- 0 = Not idle or suspended, 1 = Executing, 2 = Waiting For Thread, 3 = Between Retries, 4 = Idle, 5 = Suspended, [6 = WaitingForStepToFinish], 7 = PerformingCompletionActions
@date_comparator CHAR(1) , -- >, < or =
@date_created DATETIME ,
@date_last_modified DATETIME ,
@description NVARCHAR(512) , -- We do a LIKE on this so it can include wildcards
@schedule_id INT

set @job_type = NULL -- LOCAL or MULTI-SERVER
set @owner_login_name = NULL
set @subsystem = NULL
set @category_id = NULL
set @enabled = NULL
set @execution_status = NULL -- 0 = Not idle or suspended, 1 = Executing, 2 = Waiting For Thread, 3 = Between Retries, 4 = Idle, 5 = Suspended, [6 = WaitingForStepToFinish], 7 = PerformingCompletionActions
set @date_comparator = NULL -- >, < or =
set @date_created = NULL
set @date_last_modified = NULL
set @description = NULL -- We do a LIKE on this so it can include wildcards
set @schedule_id = NULL

DECLARE @job_owner sysname

declare @job_id UNIQUEIDENTIFIER
set @job_id = (select job_id from msdb.dbo.sysjobs where name = @job_Name)

SET NOCOUNT ON

-- By 'composite' we mean a combination of sysjobs and xp_sqlagent_enum_jobs data.
-- This proc should only ever be called by sp_help_job, so we don't verify the
-- parameters (sp_help_job has already done this).

-- Step 1: Create intermediate work tables
DECLARE @job_execution_state TABLE (job_id UNIQUEIDENTIFIER NOT NULL,
date_started INT NOT NULL,
time_started INT NOT NULL,
execution_job_status INT NOT NULL,
execution_step_id INT NULL,
execution_step_name sysname COLLATE database_default NULL,
execution_retry_attempt INT NOT NULL,
next_run_date INT NOT NULL,
next_run_time INT NOT NULL,
next_run_schedule_id INT NOT NULL)
DECLARE @filtered_jobs TABLE (job_id UNIQUEIDENTIFIER NOT NULL,
date_created DATETIME NOT NULL,
date_last_modified DATETIME NOT NULL,
current_execution_status INT NULL,
current_execution_step sysname COLLATE database_default NULL,
current_retry_attempt INT NULL,
last_run_date INT NOT NULL,
last_run_time INT NOT NULL,
last_run_outcome INT NOT NULL,
next_run_date INT NULL,
next_run_time INT NULL,
next_run_schedule_id INT NULL,
type INT NOT NULL)
DECLARE @xp_results TABLE (job_id UNIQUEIDENTIFIER NOT NULL,
last_run_date INT NOT NULL,
last_run_time INT NOT NULL,
next_run_date INT NOT NULL,
next_run_time INT NOT NULL,
next_run_schedule_id INT NOT NULL,
requested_to_run INT NOT NULL, -- BOOL
request_source INT NOT NULL,
request_source_id sysname COLLATE database_default NULL,
running INT NOT NULL, -- BOOL
current_step INT NOT NULL,
current_retry_attempt INT NOT NULL,
job_state INT NOT NULL)

-- Step 2: Capture job execution information (for local jobs only since that's all SQLServerAgent caches)
SELECT @can_see_all_running_jobs = ISNULL(IS_SRVROLEMEMBER(N'sysadmin'), 0)
IF (@can_see_all_running_jobs = 0)
BEGIN
SELECT @can_see_all_running_jobs = ISNULL(IS_MEMBER(N'SQLAgentReaderRole'), 0)
END
SELECT @job_owner = SUSER_SNAME()

IF ((@@microsoftversion / 0x01000000) >= 8) -- SQL Server 8.0 or greater
INSERT INTO @xp_results
EXECUTE master.dbo.xp_sqlagent_enum_jobs @can_see_all_running_jobs, @job_owner, @job_id
ELSE
INSERT INTO @xp_results
EXECUTE master.dbo.xp_sqlagent_enum_jobs @can_see_all_running_jobs, @job_owner

INSERT INTO @job_execution_state
SELECT xpr.job_id,
xpr.last_run_date,
xpr.last_run_time,
xpr.job_state,
sjs.step_id,
sjs.step_name,
xpr.current_retry_attempt,
xpr.next_run_date,
xpr.next_run_time,
xpr.next_run_schedule_id
FROM @xp_results xpr
LEFT OUTER JOIN msdb.dbo.sysjobsteps sjs ON ((xpr.job_id = sjs.job_id) AND (xpr.current_step = sjs.step_id)),
msdb.dbo.sysjobs_view sjv
WHERE (sjv.job_id = xpr.job_id)

-- Step 3: Filter on everything but dates and job_type
IF ((@subsystem IS NULL) AND
(@owner_login_name IS NULL) AND
(@enabled IS NULL) AND
(@category_id IS NULL) AND
(@execution_status IS NULL) AND
(@description IS NULL) AND
(@job_id IS NULL))
BEGIN
-- Optimize for the frequently used case...
INSERT INTO @filtered_jobs
SELECT sjv.job_id,
sjv.date_created,
sjv.date_modified,
ISNULL(jes.execution_job_status, 4), -- Will be NULL if the job is non-local or is not in @job_execution_state (NOTE: 4 = STATE_IDLE)
CASE ISNULL(jes.execution_step_id, 0)
WHEN 0 THEN NULL -- Will be NULL if the job is non-local or is not in @job_execution_state
ELSE CONVERT(NVARCHAR, jes.execution_step_id) + N' (' + jes.execution_step_name + N')'
END,
jes.execution_retry_attempt, -- Will be NULL if the job is non-local or is not in @job_execution_state
0, -- last_run_date placeholder (we'll fix it up in step 3.3)
0, -- last_run_time placeholder (we'll fix it up in step 3.3)
5, -- last_run_outcome placeholder (we'll fix it up in step 3.3 - NOTE: We use 5 just in case there are no jobservers for the job)
jes.next_run_date, -- Will be NULL if the job is non-local or is not in @job_execution_state
jes.next_run_time, -- Will be NULL if the job is non-local or is not in @job_execution_state
jes.next_run_schedule_id, -- Will be NULL if the job is non-local or is not in @job_execution_state
0 -- type placeholder (we'll fix it up in step 3.4)
FROM msdb.dbo.sysjobs_view sjv
LEFT OUTER JOIN @job_execution_state jes ON (sjv.job_id = jes.job_id)
WHERE ((@schedule_id IS NULL)
OR (EXISTS(SELECT *
FROM msdb.dbo.sysjobschedules as js
WHERE (sjv.job_id = js.job_id)
AND (js.schedule_id = @schedule_id))))
END
ELSE
BEGIN
INSERT INTO @filtered_jobs
SELECT DISTINCT
sjv.job_id,
sjv.date_created,
sjv.date_modified,
ISNULL(jes.execution_job_status, 4), -- Will be NULL if the job is non-local or is not in @job_execution_state (NOTE: 4 = STATE_IDLE)
CASE ISNULL(jes.execution_step_id, 0)
WHEN 0 THEN NULL -- Will be NULL if the job is non-local or is not in @job_execution_state
ELSE CONVERT(NVARCHAR, jes.execution_step_id) + N' (' + jes.execution_step_name + N')'
END,
jes.execution_retry_attempt, -- Will be NULL if the job is non-local or is not in @job_execution_state
0, -- last_run_date placeholder (we'll fix it up in step 3.3)
0, -- last_run_time placeholder (we'll fix it up in step 3.3)
5, -- last_run_outcome placeholder (we'll fix it up in step 3.3 - NOTE: We use 5 just in case there are no jobservers for the job)
jes.next_run_date, -- Will be NULL if the job is non-local or is not in @job_execution_state
jes.next_run_time, -- Will be NULL if the job is non-local or is not in @job_execution_state
jes.next_run_schedule_id, -- Will be NULL if the job is non-local or is not in @job_execution_state
0 -- type placeholder (we'll fix it up in step 3.4)
FROM msdb.dbo.sysjobs_view sjv
LEFT OUTER JOIN @job_execution_state jes ON (sjv.job_id = jes.job_id)
LEFT OUTER JOIN msdb.dbo.sysjobsteps sjs ON (sjv.job_id = sjs.job_id)
WHERE ((@subsystem IS NULL) OR (sjs.subsystem = @subsystem))
AND ((@owner_login_name IS NULL)
OR (sjv.owner_sid = msdb.dbo.SQLAGENT_SUSER_SID(@owner_login_name)))--force case insensitive comparation for NT users
AND ((@enabled IS NULL) OR (sjv.enabled = @enabled))
AND ((@category_id IS NULL) OR (sjv.category_id = @category_id))
AND ((@execution_status IS NULL) OR ((@execution_status > 0) AND (jes.execution_job_status = @execution_status))
OR ((@execution_status = 0) AND (jes.execution_job_status <> 4) AND (jes.execution_job_status <> 5)))
AND ((@description IS NULL) OR (sjv.description LIKE @description))
AND ((@job_id IS NULL) OR (sjv.job_id = @job_id))
AND ((@schedule_id IS NULL)
OR (EXISTS(SELECT *
FROM msdb.dbo.sysjobschedules as js
WHERE (sjv.job_id = js.job_id)
AND (js.schedule_id = @schedule_id))))
END



-- Step 3.1: Change the execution status of non-local jobs from 'Idle' to 'Unknown'
UPDATE @filtered_jobs
SET current_execution_status = NULL
WHERE (current_execution_status = 4)
AND (job_id IN (SELECT job_id
FROM msdb.dbo.sysjobservers
WHERE (server_id <> 0)))

-- Step 3.2: Check that if the user asked to see idle jobs that we still have some.
-- If we don't have any then the query should return no rows.
IF (@execution_status = 4) AND
(NOT EXISTS (SELECT *
FROM @filtered_jobs
WHERE (current_execution_status = 4)))
BEGIN
DELETE FROM @filtered_jobs
END

-- Step 3.3: Populate the last run date/time/outcome [this is a little tricky since for
-- multi-server jobs there are multiple last run details in sysjobservers, so
-- we simply choose the most recent].
IF (EXISTS (SELECT *
FROM msdb.dbo.systargetservers))
BEGIN
UPDATE @filtered_jobs
SET last_run_date = sjs.last_run_date,
last_run_time = sjs.last_run_time,
last_run_outcome = sjs.last_run_outcome
FROM @filtered_jobs fj,
msdb.dbo.sysjobservers sjs
WHERE (CONVERT(FLOAT, sjs.last_run_date) * 1000000) + sjs.last_run_time =
(SELECT MAX((CONVERT(FLOAT, last_run_date) * 1000000) + last_run_time)
FROM msdb.dbo.sysjobservers
WHERE (job_id = sjs.job_id))
AND (fj.job_id = sjs.job_id)
END
ELSE
BEGIN
UPDATE @filtered_jobs
SET last_run_date = sjs.last_run_date,
last_run_time = sjs.last_run_time,
last_run_outcome = sjs.last_run_outcome
FROM @filtered_jobs fj,
msdb.dbo.sysjobservers sjs
WHERE (fj.job_id = sjs.job_id)
END

-- Step 3.4 : Set the type of the job to local (1) or multi-server (2)
-- NOTE: If the job has no jobservers then it wil have a type of 0 meaning
-- unknown. This is marginally inconsistent with the behaviour of
-- defaulting the category of a new job to [Uncategorized (Local)], but
-- prevents incompletely defined jobs from erroneously showing up as valid
-- local jobs.
UPDATE @filtered_jobs
SET type = 1 -- LOCAL
FROM @filtered_jobs fj,
msdb.dbo.sysjobservers sjs
WHERE (fj.job_id = sjs.job_id)
AND (server_id = 0)
UPDATE @filtered_jobs
SET type = 2 -- MULTI-SERVER
FROM @filtered_jobs fj,
msdb.dbo.sysjobservers sjs
WHERE (fj.job_id = sjs.job_id)
AND (server_id <> 0)

-- Step 4: Filter on job_type
IF (@job_type IS NOT NULL)
BEGIN
IF (UPPER(@job_type collate SQL_Latin1_General_CP1_CS_AS) = 'LOCAL')
DELETE FROM @filtered_jobs
WHERE (type <> 1) -- IE. Delete all the non-local jobs
IF (UPPER(@job_type collate SQL_Latin1_General_CP1_CS_AS) = 'MULTI-SERVER')
DELETE FROM @filtered_jobs
WHERE (type <> 2) -- IE. Delete all the non-multi-server jobs
END

-- Step 5: Filter on dates
IF (@date_comparator IS NOT NULL)
BEGIN
IF (@date_created IS NOT NULL)
BEGIN
IF (@date_comparator = '=')
DELETE FROM @filtered_jobs WHERE (date_created <> @date_created)
IF (@date_comparator = '>')
DELETE FROM @filtered_jobs WHERE (date_created <= @date_created)
IF (@date_comparator = '<')
DELETE FROM @filtered_jobs WHERE (date_created >= @date_created)
END
IF (@date_last_modified IS NOT NULL)
BEGIN
IF (@date_comparator = '=')
DELETE FROM @filtered_jobs WHERE (date_last_modified <> @date_last_modified)
IF (@date_comparator = '>')
DELETE FROM @filtered_jobs WHERE (date_last_modified <= @date_last_modified)
IF (@date_comparator = '<')
DELETE FROM @filtered_jobs WHERE (date_last_modified >= @date_last_modified)
END
END

-- Return the result set (NOTE: No filtering occurs here)
SELECT @status = ISNULL(fj.current_execution_status, 0) -- This column will be NULL if the job is non-local

FROM @filtered_jobs fj
LEFT OUTER JOIN msdb.dbo.sysjobs_view sjv ON (fj.job_id = sjv.job_id)
LEFT OUTER JOIN msdb.dbo.sysoperators so1 ON (sjv.notify_email_operator_id = so1.id)
LEFT OUTER JOIN msdb.dbo.sysoperators so2 ON (sjv.notify_netsend_operator_id = so2.id)
LEFT OUTER JOIN msdb.dbo.sysoperators so3 ON (sjv.notify_page_operator_id = so3.id)
LEFT OUTER JOIN msdb.dbo.syscategories sc ON (sjv.category_id = sc.category_id)
ORDER BY sjv.job_id

return @status
END

GO
Now we will run the script to invoke the SQL job from a query window and check the job status. Let's run the code below and see the output:
USE [master]
GO

-- Get the job status before invoke the job
DECLARE @status INT
DECLARE @JobName VARCHAR(500)

SET @status = 0
set @JobName = 'Start_JobInvokeProc'

EXEC GetJobStatus
@JobName,
@status OUTPUT

SELECT @status AS JobStatus
------------------------------------------
-- Run this query to invoke the job
EXEC msdb.dbo.Sp_start_job @JobName
------------------------------------------
-- Get the job status after invoke the job
WAITFOR DELAY '00:00:03';

SET @status = 0
set @JobName = 'Start_JobInvokeProc'

EXEC GetJobStatus
@JobName,
@status OUTPUT

SELECT @status AS JobStatus
GO

If you run the script again while the job is still executing, it will raise an error when it tries to invoke the job again.

Msg 22022, Level 16, State 1, Line 0
SQLServerAgent Error: Request to run job Start_JobInvokeProc (from User paresh-PC\paresh) refused because the job is already running from a request by User paresh-PC\paresh.


So the better option is to check the job status first and only then invoke the job. The revised script is:
USE [master]
GO

/* Get the job status before invoke the job */
DECLARE @status INT
DECLARE @JobName VARCHAR(500)

SET @status = 0
set @JobName = 'Start_JobInvokeProc'

EXEC GetJobStatus
@JobName,
@status OUTPUT

SELECT @status AS JobStatus

-- Added condition here to check the job is already running or not.
IF (@status <> 1)
BEGIN
/* Run this query to invoke the job*/
EXEC msdb.dbo.Sp_start_job @JobName
END
GO
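As a lighter-weight alternative to the long helper procedure above, you could also check whether the job is currently running straight from msdb's job activity tables. This is only a sketch using the demo job name; a running job has a start time but no stop time in its latest activity row.

-- Check whether the demo job is currently running (latest Agent session only).
SELECT j.name, ja.start_execution_date
FROM msdb.dbo.sysjobactivity ja
INNER JOIN msdb.dbo.sysjobs j
ON (ja.job_id = j.job_id)
WHERE j.name = 'Start_JobInvokeProc'
AND ja.session_id = (SELECT MAX(session_id) FROM msdb.dbo.syssessions)
AND ja.start_execution_date IS NOT NULL
AND ja.stop_execution_date IS NULL
GO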