Thursday, 30 April 2015

How to manual failover mirroring without affecting replication - SQL Server by serverku

A couple of days ago we planned a manual failover of our production server, moving all live databases on that instance to the mirror instance, and carried it out. It was a good experience: the failover completed without breaking anything, including replication, scheduled jobs, linked servers, SSIS packages, reports, Windows tasks and other dependencies. Careful planning and teamwork were the keys to that successful failover. This post covers database mirroring without a witness server (no automatic failover) combined with transactional replication, where the production database acts as both publisher and principal. The plan was as follows.

Planning
  • Configured mirroring for all production databases to the mirror instance, without a witness server.
  • Created a DNS alias for the production server.
  • Used that alias as the data source for connecting to the production SQL Server instance in linked servers, reports and SSIS packages on all servers pointing to the production instance, and in the application too.
  • Created all scheduled jobs in a disabled state on the mirror instance.
  • Created all linked servers from the production instance on the mirror instance.
  • Created all logins from the production instance on the mirror instance.
  • Created all Database Mail profiles from the production instance on the mirror instance.
  • Created the SQL Server Agent operators from the production instance on the mirror instance.
  • Created Windows scheduled tasks in a disabled state on the mirror server.
All of the above steps were applied in advance, with recent changes refreshed just before starting the failover. During the failover itself we only needed to point the DNS alias to the mirror server and enable the scheduled jobs and Windows tasks.
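The failover itself is a single statement per database, issued on the current principal. A minimal sketch, assuming a mirrored database named ProductionDB (the name is a placeholder):

```sql
-- Run on the current principal instance, once per mirrored database.
-- The mirroring session must be in the SYNCHRONIZED state for a
-- manual failover without data loss.
ALTER DATABASE [ProductionDB] SET PARTNER FAILOVER;
```

After this command the roles swap: the former mirror becomes the principal and starts accepting connections.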

Problem
But there was a bit of confusion around replication: how do you manually fail over mirroring without affecting replication? That was the issue. I applied a solution and made it work. After the manual failover, transactional replication raised an error and stopped working, because it kept trying to connect to the publisher database, which had become the mirror after failover:

The process could not execute 'sp_replcmds' on '<original Publisher Server>'.

Workaround
There is one more step beyond those listed above.
  • Add the failover partner as a parameter (-PublisherFailoverPartner) to the Snapshot, Log Reader and Queue Reader Agents.
How to add parameter?
I am sharing some screenshots to walk through the explanation.
1. Go to Replication Monitor and open the Agents tab. Select an agent type from the Agent types drop-down list, select the agent from the list, then right-click and choose Agent Profile.


2. Under the agent's profile settings, create a new user profile based on the system profile; you just need to add the failover partner's name as the value of the -PublisherFailoverPartner parameter.


3. Add the -PublisherFailoverPartner parameter value for all agents: Snapshot, Log Reader, Queue Reader and, since we have merge replication configured, the Merge Agent.


Note: After creating the new user agent profile, check the box “Use for this agent”. I created a new agent profile because the user interface does not allow adding the -PublisherFailoverPartner parameter to a system profile. We can, however, add it with system stored procedures via a T-SQL script.
USE distribution 
GO

-- For Snapshot Agent
EXEC sp_add_agent_parameter
@profile_id = 1,
@parameter_name = N'-PublisherFailoverPartner',
@parameter_value = N'<Failover Partner>'

-- For Log Reader Agent
EXEC sp_add_agent_parameter
@profile_id = 2,
@parameter_name = N'-PublisherFailoverPartner',
@parameter_value = N'<Failover Partner>'

-- For Distribution Agent
EXEC sp_add_agent_parameter
@profile_id = 3,
@parameter_name = N'-PublisherFailoverPartner',
@parameter_value = N'<Failover Partner>'

-- For Merge Agent
EXEC sp_add_agent_parameter
@profile_id = 4,
@parameter_name = N'-PublisherFailoverPartner',
@parameter_value = N'<Failover Partner>'

-- Queue Reader Agent
EXEC sp_add_agent_parameter
@profile_id = 9,
@parameter_name = N'-PublisherFailoverPartner',
@parameter_value = N'<Failover Partner>'
T-SQL does allow adding this parameter to system profiles; just change the profile_id to whichever system or user profile the agent actually uses, which you can find with the sp_help_agent_profile system procedure in the distribution database.
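To see which profiles are available and which profile_id to target, the sp_help_agent_profile procedure mentioned above can be run at the distributor, for example:

```sql
USE distribution
GO
-- Lists all agent profiles; the profile_id and agent_type columns
-- identify which profile to target with sp_add_agent_parameter.
EXEC sp_help_agent_profile
```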

How to confirm?
Using the following script, we can confirm the parameter values for these agents. Run it in the msdb database on the distributor server.
USE msdb 
GO

SELECT a.profile_id,
a.profile_name,
a.description,
a.def_profile,
b.parameter_name,
b.value
FROM msagent_profiles a
INNER JOIN msagent_parameters b
ON ( a.profile_id = b.profile_id )
WHERE b.parameter_name = '-PublisherFailoverPartner'

Hope you enjoyed this case study and that it helps you. Did you face this issue, or do you have a different solution? Is something missing from the failover plan? Please share your ideas and opinions; your comments are most welcome!

The row was not found at the Subscriber when applying the replicated command - Replication error in SQL Server by serverku

Earlier I wrote about the same error in "Review of some replication issues and workaround", but that post was only an overview covering several replication errors. Please read it if you have not yet; I hope you will find it helpful. Let me elaborate on this error with a proper example. I have already created the objects and configured replication, so we just need to use them. Let me introduce them,
    • Primary database : Test
    • Secondary database : Test1
    • Replicated table : dbo.sample1
I am using the same server in this example as publisher, distributor and subscriber. First we need to check the records in the tables of both databases.
-- Selecting  records from table of publisher database 
SELECT *
FROM test.dbo.sample1 (nolock)

-- Selecting records from table of subscriber database
SELECT *
FROM test1.dbo.sample1 (nolock)

How to raise the error manually?
To raise the error mentioned in the title, I will delete one record from the table in the subscriber database to create an inconsistency, then update that same record and insert one more record into the table in the publisher database. Let us see what happens,
-- Deleting one record from table in subscriber database 
DELETE FROM test1.dbo.sample1
WHERE id = 2

-- Updating same record from table in publisher database
UPDATE test.dbo.sample1
SET name = 'test5'
WHERE id = 2

-- Inserting new record in table in publisher database
INSERT test.dbo.sample1
SELECT 4,
'test4'
After running the above script, open the Replication Monitor window and you will see an error, because replication is trying to update a row at the subscriber that no longer exists. As a result, the newly inserted record (id = 4) will not be delivered to the subscriber either.


Solution 
Now we need a workaround to get this resolved, but first we must identify the missing row that was deleted at the subscriber. The sequence number shown in the image above lets us retrieve that information. Run the following script in the distribution database and review the output,
USE distribution 

go

SELECT *
FROM dbo.msarticles m
WHERE EXISTS (SELECT mc.article_id
FROM msrepl_commands mc
WHERE mc.xact_seqno = 0x0000002200000048000300000000
AND mc.article_id = m.article_id)

EXEC Sp_browsereplcmds
@xact_seqno_start = '0x0000002200000048000300000000',
@xact_seqno_end = '0x0000002200000048000300000000'

The result clearly shows the article and the missing row details. Let me apply the missing row at the subscriber and then review the Replication Monitor again.
-- Inserting missing record in table in subscriber database 
INSERT test1.dbo.sample1
SELECT 2,
'test2'
After inserting the record above, the issue is resolved, as you can see in the image below: both the missing record and the newly inserted record (id = 4) have been applied,


Hope you enjoyed this post. Please share your ideas and opinions about any alternative solutions here.

Replicated transactions are waiting for next Log backup or for mirroring partner to catch up - Issue in SQL Server Replication by serverku

Hope you read my earlier post on the "Replication components are not installed on this server" issue; you may like it. One day replication suddenly went into high latency, and I noticed it during monitoring. I opened the Replication Monitor and, during analysis, found a message showing latency from publisher to distributor. The message is as follows,

“Replicated transactions are waiting for next Log backup or for mirroring partner to catch up.”


As the message suggested, I first checked whether transaction log backups were happening; they were. That left one thing to review: mirroring. I checked the mirroring status and saw that the principal database was in the SYNCHRONIZING state, perhaps because a heavy volume of transactions was still in transit to be applied at the mirror database. Replication was therefore waiting for the principal database to reach the SYNCHRONIZED state. I do not know exactly why replication kept waiting even though transaction log backups were running. I searched online and found the following suggestions,

1. EXEC sp_replicationdboption @dbname = N'PublisherDB', @optname = N'sync with backup', @value = N'false'
This means transactions no longer need to be backed up before being delivered to the distribution database. This stored procedure sets a replication database option for the specified database and is executed at the publisher or subscriber.

2. Enable trace flag 1448
With this trace flag enabled, the Log Reader Agent can continue replicating changes regardless of the mirroring state. Please read more here.
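A minimal sketch of enabling the trace flag at runtime (it can also be added as a -T1448 startup parameter so it survives service restarts):

```sql
-- Enable trace flag 1448 globally so the Log Reader Agent can
-- advance its log scan regardless of the mirroring state.
DBCC TRACEON (1448, -1);

-- Verify the flag is active
DBCC TRACESTATUS (1448);
```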

I did not apply either suggestion. Replication and mirroring both use the transaction log, which is likely why replication showed high latency while mirroring was stuck synchronizing the mirror database. So I removed mirroring and monitored the replication status. Replication caught up, removed the delay and applied all pending commands. Then, once replication was in sync, I configured mirroring again.
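Removing mirroring is a single statement on the principal; a sketch, assuming the publisher database is named PublisherDB (a placeholder):

```sql
-- Run on the principal: ends the mirroring session for this database.
-- The former mirror copy is left in the RESTORING state.
ALTER DATABASE [PublisherDB] SET PARTNER OFF;
```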

Conclusion: We have two options: wait for the principal database to reach the SYNCHRONIZED state, or remove mirroring, let replication sync properly and reconfigure mirroring afterwards; which to choose depends on priority and importance. It would be better still to find the root cause and avoid the issue altogether. But the question remains: why did replication wait for mirroring to synchronize even though transaction log backups were happening? Please share your experience if you have hit this issue, along with any solution or opinion you applied to resolve it. That may help me and everyone else facing the same problem.

Script to get table size statistics - SQL Server by serverku

Recently we saw the post on database size information and the script used for it. Hope you liked that post. We sometimes need details like database size for growth and disk planning, as covered there; this post repeats the exercise at the table level. In the earlier post we used sp_spaceused to get database size information, but if we pass a table name as a parameter, it returns size statistics for that particular table in the current user database,
USE [AdventureWorksLT2008]
GO
EXEC sp_spaceused '[SalesLT].[Product]';

/* Output :

name rows reserved data index_size unused
-------- ----- --------- ------- ----------- ------
Product 295 928 KB 808 KB 96 KB 24 KB

*/

The same information can be retrieved for all tables using a loop (WHILE loop or cursor) or the sp_msforeachtable system procedure,
USE [AdventureWorksLT2008]
GO
CREATE TABLE #tablesizestatistics
(
Tablename VARCHAR(100),
[rows] VARCHAR(100),
reserved VARCHAR(50),
data VARCHAR(50),
index_size VARCHAR(50),
unused VARCHAR(50)
)

EXEC Sp_msforeachtable
@command1='INSERT INTO #TableSizeStatistics exec sp_spaceused ''?'''

SELECT *
FROM #tablesizestatistics

/* Output :

Tablename rows reserved data index_size unused
-------------------------------- ------ --------- ------- ----------- -------
BuildVersion 1 16 KB 8 KB 8 KB 0 KB
Address 450 336 KB 72 KB 160 KB 104 KB
Customer 847 512 KB 272 KB 136 KB 104 KB
CustomerAddress 417 80 KB 32 KB 48 KB 0 KB
Product 295 928 KB 808 KB 96 KB 24 KB
ProductCategory 41 48 KB 8 KB 40 KB 0 KB
ProductDescription 762 248 KB 144 KB 56 KB 48 KB
ProductModel 128 248 KB 56 KB 104 KB 88 KB
ProductModelProductDescription 762 112 KB 48 KB 64 KB 0 KB
SalesOrderDetail 542 120 KB 40 KB 80 KB 0 KB
SalesOrderHeader 32 64 KB 8 KB 56 KB 0 KB
ErrorLog 0 0 KB 0 KB 0 KB 0 KB

*/

You could use a loop instead of the above method and run sp_spaceused inside it for each table. But even without a loop, sp_msforeachtable or sp_spaceused, there is another script that returns size statistics for each table,
USE [AdventureWorksLT2008]
GO
SELECT Schema_name(t.schema_id) + '.' + t.name AS TableName,
p.rows AS Rows,
Sum(a.total_pages) * 8 AS TotalSpace_KB,
Sum(a.used_pages) * 8 AS UsedSpace_KB,
(Sum(a.total_pages) - Sum(a.used_pages)) * 8 AS UnusedSpace_KB
FROM sys.tables t
INNER JOIN sys.indexes i
ON t.object_id = i.object_id
INNER JOIN sys.partitions p
ON i.object_id = p.object_id
AND i.index_id = p.index_id
INNER JOIN sys.allocation_units a
ON p.partition_id = a.container_id
GROUP BY Schema_name(t.schema_id) + '.' + t.name,
p.rows
ORDER BY totalspace_kb DESC

/* Output :

TableName Rows TotalSpace_KB UsedSpace_KB UnusedSpace_KB
---------------------------------------- ---------- -------------- ------------- --------------
SalesLT.Product 295 928 904 24
SalesLT.Customer 847 512 408 104
SalesLT.Address 450 336 232 104
SalesLT.ProductDescription 762 248 200 48
SalesLT.SalesOrderDetail 542 120 120 0
SalesLT.ProductModelProductDescription 762 112 112 0
SalesLT.ProductModel 128 112 104 8
SalesLT.CustomerAddress 417 80 80 0
SalesLT.SalesOrderHeader 32 64 64 0
SalesLT.ProductCategory 41 48 48 0
dbo.BuildVersion 1 16 16 0
dbo.ErrorLog 0 0 0 0

*/


Hope you enjoyed this post and that it helps you gather table size statistics. Please share any other scripts or additional table information you use.

Wednesday, 29 April 2015

SQL Server is unable to complete the New Publication Wizard - Replication Error by serverku

Last time we looked at an issue faced while setting up replication, caused by a change of the original server name. We got past that one, but today I am writing about another issue I faced during the same replication configuration. I suspect something went wrong during the earlier workaround, when the old server name was dropped by mistake while it was still used in replication. We will look at the solution for this too. So let us move on to the error I got at the initial stage of replication configuration while adding a publication using the New Publication Wizard,

“TITLE: New Publication Wizard
------------------------------
SQL Server is unable to complete the New Publication Wizard.

------------------------------
ADDITIONAL INFORMATION:
An exception occurred while executing a Transact-SQL statement or batch. (Microsoft.SqlServer.ConnectionInfo)

------------------------------
Invalid object name 'msdb.dbo.MSdistributiondbs'. (Microsoft SQL Server, Error: 208)”
It seems these objects were destroyed during the workaround for the earlier error. I eventually found a solution online describing how to recreate them in the msdb database,
USE MSDB
GO

CREATE TABLE [MSdistributiondbs] (
[name] [sysname] NOT NULL
,[min_distretention] [int] NOT NULL
,[max_distretention] [int] NOT NULL
,[history_retention] [int] NOT NULL
)
GO

CREATE TABLE [dbo].[MSdistpublishers] (
[name] [sysname] NOT NULL
,[distribution_db] [sysname] NOT NULL
,[working_directory] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL
,[security_mode] [int] NOT NULL
,[login] [sysname] NOT NULL
,[password] [nvarchar] (524) COLLATE SQL_Latin1_General_CP1_CI_AS NULL
,[active] [bit] NOT NULL
,[trusted] [bit] NOT NULL
,[thirdparty_flag] [bit] NOT NULL
)
GO
With the objects above created, I was finally able to proceed to my next steps!
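A quick sanity check that the recreated objects exist (a simple verification sketch, nothing more):

```sql
USE msdb
GO
-- Both rows should be returned once the tables are recreated
SELECT name
FROM sys.tables
WHERE name IN ('MSdistributiondbs', 'MSdistpublishers')
```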

Changing Columns to Rows Using UNPIVOT - SQL Server by serverku

I have already posted an article about PIVOT. UNPIVOT works like PIVOT but in reverse, so I did not want to skip a post on it. With PIVOT we can aggregate data and present rows as columns; with UNPIVOT we can change columns to rows without aggregation.

Let me prepare a small example to explain how we can use UNPIVOT to change columns into rows. Before running the UNPIVOT script, we will create the table object and some sample data.
USE DEMO
GO

-- Creating table for demo
IF OBJECT_ID('TblUnPivot', 'U') IS NOT NULL
DROP TABLE TblUnPivot


CREATE TABLE TblUnPivot
(
ItemCode int,
ItemName varchar(100),
Red int,
Green int,
Blue int
)

GO

-- Inserting some sample records

INSERT INTO TblUnPivot
SELECT 1,'Samsung Mobile', 1, 1, 1
UNION ALL
SELECT 2,'Nokia Mobile', 1, 2, 3
UNION ALL
SELECT 3,'Motorola Mobile', 2, 3, 2

GO
Now we will look at the original table data and at the UNPIVOT result, so we will run both scripts.
-- Getting table data
select
ItemCode,
ItemName,
Red,
Green,
Blue
from TblUnPivot

GO

-- Using UNPIVOT to change columns to rows
SELECT
ItemCode,
ItemName,
ItemColour,
ItemCount
FROM
(
SELECT
ItemCode,
ItemName,
Red,
Green,
Blue
FROM TblUnPivot
) AS UP
UNPIVOT
(
ItemCount FOR ItemColour IN (Red, Green, Blue)
) AS upv

GO
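For reference, with the three sample rows above the UNPIVOT query returns nine rows, one per item and colour (output reproduced by hand from the sample data, so treat the layout as illustrative):

```sql
/* Output :

ItemCode ItemName        ItemColour ItemCount
-------- --------------- ---------- ---------
1        Samsung Mobile  Red        1
1        Samsung Mobile  Green      1
1        Samsung Mobile  Blue       1
2        Nokia Mobile    Red        1
2        Nokia Mobile    Green      2
2        Nokia Mobile    Blue       3
3        Motorola Mobile Red        2
3        Motorola Mobile Green      3
3        Motorola Mobile Blue       2

*/
```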

You can review the output and see how UNPIVOT works. Please share your experience with UNPIVOT.

Replication components are not installed on this server - Error while adding subscriber in replication by serverku

Recently I was working on a replication task: simply adding new subscribers to an existing publication. Some were added successfully, but some failed because a replication component was missing. Before going into the details, you may like to read some other tips and posts related to replication,
  1. Review of some replication issues and workaround
  2. Finding objects replicated schema/data or schema only published in replication
  3. Get Publications, Articles, Distributors and Subscribers for Replication
  4. Finding Stored Procedures, Functions and Views published in replication
  5. Find the details of publications, subscribers and articles at once in replication
Now, on to the issue I faced while adding new subscribers to the publication. The following message was raised,

Replication components are not installed on this server. Run SQL Server setup again and select the option to install replication.


This message gives complete and clear information about the exact issue and even suggests what to do to complete the task: install the missing replication component. Run SQL Server setup and select the Replication feature,
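For reference, the same feature can also be added unattended from the installation media's command line; a hedged sketch (the instance name is a placeholder, and the exact switches vary by SQL Server version):

```bat
:: Run from the SQL Server installation media folder
Setup.exe /q /ACTION=Install /FEATURES=Replication /INSTANCENAME=MSSQLSERVER /IACCEPTSQLSERVERLICENSETERMS
```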


Adding feature to an existing instance,


After completing the installation steps above, I was able to add subscribers to the publication. This is a common issue, but I hope you found the post useful.

Database size information - SQL Server by serverku

This is common information that a DBA needs on a daily basis for monitoring or reporting purposes. These basic details help us track database growth daily, weekly and monthly, so we can plan disk space. We can dump daily database statistics and build database size and growth reports. You probably know these scripts and may already use them, but I would like to share them anyway. System stored procedures return this database information; for example, the sp_helpdb stored procedure gives details,
EXEC sp_helpdb 'DEMO';

/* Output :

name db_size owner dbid created status compatibility_level
------------------------ ------ ------ ----------- ---------------- -------------------
DEMO 13.25 MB sa 19 Apr 28 2015 Status=ONLINE,... 110


name fileid filename filegroup size maxsize growth usage
------- -------- ------------------------------------------------------------------------------ --------- ------- ------------ -------- -------
DEMO 1 C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\DEMO.mdf PRIMARY 9280 KB Unlimited 1024 KB data only
DEMO_log 2 C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\DEMO_log.ldf NULL 4288 KB 2147483648 KB 10% log only

*/

This is the information returned by sp_helpdb when passing the database name as a parameter. If you do not pass a database name, it returns all databases without physical file details. The sp_spaceused system stored procedure takes a similar approach when called without parameters.
USE DEMO
GO
EXEC sp_spaceused;

/* Output :

database_name database_size unallocated space
-------------- ------------------ ------------------
DEMO 13.25 MB 3.52 MB

reserved data index_size unused
------------- ------------------ ------------------ ---------
5680 KB 3064 KB 2024 KB 592 KB


*/

Now let us get the same details with another script that lists each database and its physical files.
USE master 
GO

SELECT db.[name] AS 'DBName',
af.name AS 'LogicalName',
af.[filename] AS 'PhysicalName',
( Cast(( ( ( Cast(af.[size] AS NUMERIC(18, 4)) * 8192 ) / 1024 ) / 1024 )
AS
NUMERIC(18, 2)) ) AS 'FileSize_MB'
FROM sys.sysdatabases db
INNER JOIN sys.sysaltfiles af
ON db.dbid = af.dbid

/* Output :

DBName LogicalName PhysicalName FileSize_MB
------------------- --------------------------- ---------------------------------------------------------------------------------------- ---------------
master master C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\master.mdf 4.88
master mastlog C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\mastlog.ldf 1.75
tempdb tempdev C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\tempdb.mdf 8.00
tempdb templog C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\templog.ldf 0.50
model modeldev C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\model.mdf 4.06
model modellog C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\modellog.ldf 1.00
msdb MSDBData C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\MSDBData.mdf 22.31
msdb MSDBLog C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\MSDBLog.ldf 19.63
test test C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\test.mdf 4.06
test test_log C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\test_log.ldf 1.31
test1 test1 C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\test1.mdf 8.00
test1 test1_log C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\test1_log.ldf 19.63
Repl1 Repl1 C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\Repl1.mdf 23.00
Repl1 Repl1_log C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\Repl1_log.ldf 2.50
SBExternal SBExternal C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\SBExternal.mdf 4.06
SBExternal SBExternal_log C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\SBExternal_log.ldf 1.02
SBExternalReceiver SBExternalReceiver C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\SBExternalReceiver.mdf 4.06
SBExternalReceiver SBExternalReceiver_log C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\SBExternalReceiver_log.ldf 1.02
importdb importdb C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\importdb .mdf 4.06
importdb importdb _log C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\importdb _log.ldf 1.02
import import C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\import.mdf 9.06
import import_log C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\import_log.ldf 2.81
SchemaSync SchemaSync C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\SchemaSync.mdf 4.06
SchemaSync SchemaSync_log C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\SchemaSync_log.ldf 1.02
SchemaSync_New SchemaSync_New C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\SchemaSync_New.mdf 4.06
SchemaSync_New SchemaSync_New_log C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\SchemaSync_New_log.ldf 1.02
hMailServer hMailServer C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\hMailServer.mdf 4.06
hMailServer hMailServer_log C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\hMailServer_log.ldf 1.02
AdventureWorksLT2008 AdventureWorksLT2008_Data D:\Data\AdventureWorksLT2008_Data.mdf 8.31
AdventureWorksLT2008 AdventureWorksLT2008_Log D:\Data\AdventureWorksLT2008_Log.ldf 18.00
powershelldb powershelldb C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\powershelldb.mdf 4.06
powershelldb powershelldb_log C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\powershelldb_log.ldf 1.02
SMOSimple_DB SMOSimple_DB C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\SMOSimple_DB.mdf 4.06
SMOSimple_DB SMOSimple_DB_log C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\SMOSimple_DB_log.ldf 1.02
Merge_Repl_Sub Merge_Repl_Sub C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\Merge_Repl_Sub.mdf 9.00
Merge_Repl_Sub Merge_Repl_Sub_log C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\Merge_Repl_Sub_log.ldf 3.75
DEMO DEMO C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\DEMO.mdf 9.06
DEMO DEMO_log C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2K12\MSSQL\DATA\DEMO_log.ldf 4.19

*/

These are the details returned for each physical file of every database; with minor changes, the same script returns the total database size without per-file details.
USE master 
GO

SELECT db.[name] AS 'DBName',
Sum(( Cast(( ( ( Cast(af.[size] AS NUMERIC(18, 4)) * 8192 ) / 1024 ) /
1024 ) AS
NUMERIC(18, 2)) )) AS 'FileSize_MB'
FROM sys.sysdatabases db
INNER JOIN sys.sysaltfiles af
ON db.dbid = af.dbid
GROUP BY db.[name]

/* Output :

DBName FileSize_MB
--------------------- -------------
master 6.63
tempdb 8.50
model 5.06
msdb 41.94
test 5.37
test1 27.63
Repl1 25.50
SBExternal 5.08
SBExternalReceiver 5.08
importdb 5.08
import 11.87
SchemaSync 5.08
SchemaSync_New 5.08
hMailServer 5.08
AdventureWorksLT2008 26.31
powershelldb 5.08
SMOSimple_DB 5.08
Merge_Repl_Sub 12.75
DEMO 13.25

*/

These are the scripts I wanted to share, and I hope they help. Please share your thoughts and the scripts you use for database size and growth statistics. You may also read my earlier post on database backup statistics and history.

SQL Server is unable to connect to server 'ServerName' - Error while setup replication by serverku

One day, when I was setting up replication on my local machine and clicked to create a new publication from the replication tab, I faced an error. The details,

Error :
TITLE: New Publication Wizard
------------------------------

SQL Server is unable to connect to server 'PARESH\MSSQLSERVER2012'.
------------------------------
ADDITIONAL INFORMATION:

SQL Server replication requires the actual server name to make a connection to the server.
Specify the actual server name, 'ADMIN\MSSQLSERVER2012'. (Replication.Utilities)


This is because my machine was earlier named “ADMIN”, and SQL Server was installed at that time. Later I changed the machine name to “PARESH”. SSMS now shows the instance as “PARESH\MSSQLSERVER2012”, while running @@SERVERNAME from a query window still returns “ADMIN\MSSQLSERVER2012”. That mismatch is the reason for the error above. The same error is raised while trying to configure replication when the default SQL instance is connected as “(local)” or “.”.
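The mismatch is easy to confirm with a quick comparison: SERVERPROPERTY('ServerName') always reflects the current machine and instance name, while @@SERVERNAME reflects the configured name.

```sql
-- After a machine rename these two values differ until the server
-- name metadata is corrected and the SQL Server service restarted.
SELECT @@SERVERNAME                 AS ConfiguredName,
       SERVERPROPERTY('ServerName') AS ActualName;
```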

Solution :
The fix is to connect to SQL Server with the actual server name, which requires first updating the stored SQL Server name to the new machine name\instance name. You can change it with the following script,
EXEC sp_dropserver 'ADMIN\MSSQLSERVER2012' -- <Old Name\Instance>
GO

EXEC sp_addserver 'PARESH\MSSQLSERVER2012','local' -- <New Name\Instance>
GO
When you run the first statement of the script above, you may face the following error too,

Msg 15190, Level 16, State 1, Procedure sp_dropserver, Line 56
There are still remote logins or linked logins for the server 'ADMIN\MSSQLSERVER2012'.


And that can be solved as well,
EXEC sp_dropremotelogin 'ADMIN\MSSQLSERVER2012'; -- <Old Name\Instance>
GO
After running all the steps, restart the SQL Server service and check the renamed server with @@SERVERNAME; then we can go ahead with the next step of replication. Have you faced this issue too? Are there other options for solving it?

Archive old database backup files using forfiles.exe and FOR /F by serverku

Quite a few posts ago I wrote about archiving old database backup files using T-SQL and using SSIS. Hope you visited those posts and liked them. I am revisiting the same topic here with two different methods: the first uses the FOR /F command, which we also saw last time for copying database backups to another location, and the second uses forfiles.exe. Let us start with the first method.

FOR /F: As discussed in the earlier post, it is used to traverse the rows of a file generated by the SQLCMD command. Here we have no comma-delimited values; there is only one value per row to grab and assign to the variable.
SQLCMD -Udba -Pdba@1234 -S"PARESH\MSSQLSERVER2012" -dmsdb -Q"set nocount on ; 
SELECT DISTINCT bmf.Physical_device_name FROM msdb.dbo.backupset (nolock) bs
INNER JOIN msdb.dbo.backupmediafamily (nolock) bmf on (bs.media_set_id = bmf.media_set_id)
WHERE bs.backup_finish_date < DATEADD(Day,-7,GETDATE()) " -o "C:\BackupFile.txt"

FOR /F "tokens=1 skip=2 delims=," %%G IN (C:\BackupFile.txt) DO del %%G
Please make sure the SQLCMD and FOR /F commands are each on a single line. Running the above code produced the snapshot below,


forfiles.exe: It can process (here, delete) files in a given directory and its subdirectories older than a specified number of days or a date. The syntax is as follows: /P is the directory, /M the search mask, /S includes subdirectories, /D the days or date, and /C the command to execute. Please visit this site for more detail.
forfiles [/p <Path>] [/m <SearchMask>] [/s] [/c "<Command>"] [/d [{+|-}][{<Date>|<Days>}]]
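For example, a hedged sketch that deletes .bak files older than seven days under an assumed backup folder (the path is a placeholder):

```bat
:: Delete .bak files older than 7 days in C:\DatabaseBackup and subfolders
forfiles /p "C:\DatabaseBackup" /m *.bak /s /d -7 /c "cmd /c del @path"
```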
Let us run it and capture a snapshot of this method as well,


If you have other methods to archive old database backups, please share them here!

Copy database backup files using SQLCMD and FOR /F commands by serverku

Recently we covered the same topic: copying database backup files to an external drive using XCOPY alone. That script copies the database backup files created on the current day from source to destination, so it should be scheduled to run only once per day; otherwise duplicate files may be copied on subsequent runs. So I used an alternative solution that can be scheduled repeatedly with no chance of duplicate copies, and I am sharing it here.

In this method I use the backupset and backupmediafamily system tables in the msdb database to get the database backup details for a particular period and process them for copying.
SQLCMD -Udba -Pdba@1234 -S"PARESH\MSSQLSERVER2012" -dmsdb 
-Q"set nocount on ;
SELECT DISTINCT bmf.Physical_device_name +','+'\\ExternalDrivePath\DBBackup\'+
HOST_NAME()+'\'+ bs.Database_name + '\' +
CASE WHEN BS.TYPE = 'D' then 'FULL' WHEN BS.TYPE = 'I' then 'DIFF' else 'TRN' End + '\' as BackupFiles
FROM msdb.dbo.backupset (nolock) bs
INNER JOIN msdb.dbo.backupmediafamily (nolock) bmf
on (bs.media_set_id = bmf.media_set_id)
WHERE bs.backup_finish_date > DATEADD(HOUR,-6,GETDATE()) "
-o "D:\Batchfiles\BackupFile.txt"

FOR /F "tokens=1,2 skip=2 delims=," %%G IN (D:\Batchfiles\BackupFile.txt) DO xcopy /Y %%G %%H
I created the above code in a batch file. Note that the SQLCMD command and the FOR /F command must each be written on a single line.

SQLCMD generates a text file containing the full source file path and the destination path, comma separated. The destination path is built dynamically with the ServerName + DatabaseName + BackupType pattern. If the destination path does not exist, it will be created by the copy process.

FOR /F traverses each row of the generated text file and grabs the values to process for copying. The options used are: tokens to read the first and second comma-separated columns, skip to skip the first two lines (the header and the separator line), delims (here a comma) to split the values, and variables to hold them. XCOPY copies the files from source to destination; I used XCOPY instead of the COPY command because XCOPY creates the destination path dynamically if it does not exist.
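As a cross-check of the logic described above, here is a small Python sketch of the same copy loop. The helper name is hypothetical; it assumes the comma-separated layout produced by the SQLCMD step (source file, destination folder) and, like XCOPY, creates the destination folder when it is missing.

```python
import os
import shutil

def copy_listed_backups(list_file):
    # Mirror FOR /F "tokens=1,2 skip=2 delims=," ... DO xcopy /Y:
    # skip the two header lines, split each row on the first comma
    # into source path and destination folder, then copy.
    copied = []
    with open(list_file) as f:
        for line in list(f)[2:]:
            line = line.strip()
            if not line:
                continue
            src, dest_dir = line.split(",", 1)
            os.makedirs(dest_dir, exist_ok=True)  # XCOPY creates missing folders
            shutil.copy(src, dest_dir)
            copied.append((src, dest_dir))
    return copied
```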

Let's run the batch file against the database backups taken in the last 6 hours, and review the generated text file and the copy process.
Text file (BackupFile.txt)
BackupFiles
----------------------------------------------------------------------------------------------------------------
C:\DatabaseBackup\DemoDB\Full\DemoDB_20121019_234500.bak,\\ExternalDrivePath\DBBackup\PARESH\DemoDB\FULL\
C:\DatabaseBackup\SampleDB\Full\SampleDB_20121019_133000.bak,\\ExternalDrivePath\DBBackup\PARESH\SampleDB\FULL\
C:\DatabaseBackup\DemoDB\Trn\DemoDB_20121019_051500.trn,\\ExternalDrivePath\DBBackup\PARESH\DemoDB\TRN\
The above file is generated by the SQLCMD command with the backup details, which the FOR /F command then processes as the immediate next step. It skips the first two lines, processes the rows one by one until the end of the file, splits the values on the comma, assigns them to the source and destination variables, and copies each file.


This routine copies all database backup files created in the last 6 hours, so we can schedule it to run every 6 hours. I have shared two methods: the first in an earlier post and the second today. Let me know if you have another. We could also use the xp_cmdshell command in SQL Server for the same task, but it should be avoided for security reasons.

Tuesday, 28 April 2015

Alerts for SQL Server security events by serverku

Security is a key concern for servers, and for SQL Server we need to track it for security management: who is using the instance, what they are doing, and when they did it. SQL Server 2005 and later versions provide DDL events which can help us achieve this. So let us catch up on it. But before going ahead, please visit DDL Trigger and DDL Auditing in SQL Server 2005 and Logon trigger in SQL Server for more information about DDL triggers.
Here I present DDL triggers that fire on security events. We could also log all the event data to a table, but in this post I want to raise an alert whenever one of these security events occurs. To implement it, I create a DDL trigger in the master database and evaluate scenarios such as creating a login, adding server roles, and dropping a login. The trigger created for the same is as follows:
CREATE TRIGGER [Trg_TrackLoginManagement]
ON ALL SERVER
FOR
DDL_SERVER_SECURITY_EVENTS
AS
BEGIN
SET NOCOUNT ON

DECLARE @data xml,
@EventType varchar(100),
@EventTime datetime,
@ServerName varchar(100),
@AffectedLoginName varchar(100),
@WhoDidIt varchar(100),
@EmailSubject varchar(500),
@EmailBody varchar(800),
@EmailRecipients varchar(300),
@TSQL varchar(4000)

SET @EmailRecipients = 'prajapatipareshm@gmail.com'

SET @data = EVENTDATA()
SET @EventType = @data.value('(/EVENT_INSTANCE/EventType)[1]', 'varchar(100)')
SET @EventTime = @data.value('(/EVENT_INSTANCE/PostTime)[1]','datetime')
SET @ServerName = @data.value('(/EVENT_INSTANCE/ServerName)[1]','varchar(100)')
SET @AffectedLoginName = @data.value('(/EVENT_INSTANCE/ObjectName)[1]','varchar(100)')
SET @WhoDidIt = @data.value('(/EVENT_INSTANCE/LoginName)[1]','varchar(100)')
SET @TSQL = @data.value('(/EVENT_INSTANCE/TSQLCommand)[1]','varchar(4000)')

SET @EmailSubject = @EventType + ' occurred by ' + @WhoDidIt + ' on ' +
@ServerName + ' at: ' + CONVERT(varchar, @EventTime)
SET @EmailBody = @TSQL

EXEC msdb.dbo.sp_send_dbmail
@recipients = @EmailRecipients
, @subject = @EmailSubject
, @body = @EmailBody
, @profile_name = '<ProfileName>' -- Put profile name here
, @body_format = 'HTML' ;

END
We capture the full details of each event: the affected login name, the event that occurred, when it occurred, and who performed it. I collected some snapshots while testing a few scenarios: creating a login, assigning server roles, and finally deleting the test login. Let me share them here.
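To make the XQuery `.value()` calls in the trigger easier to follow, here is a small Python sketch that extracts the same fields from an EVENT_INSTANCE document. The sample XML is a simplified stand-in of my own making (real EVENTDATA() output contains additional elements), and the function name is illustrative.

```python
import xml.etree.ElementTree as ET

# Simplified stand-in for the EVENTDATA() XML the trigger reads
# (real event XML contains additional elements).
SAMPLE_EVENT = """<EVENT_INSTANCE>
  <EventType>CREATE_LOGIN</EventType>
  <PostTime>2015-04-28T10:15:00</PostTime>
  <ServerName>PARESH\\MSSQLSERVER2012</ServerName>
  <ObjectName>TestLogin</ObjectName>
  <LoginName>sa</LoginName>
  <TSQLCommand><CommandText>CREATE LOGIN TestLogin WITH PASSWORD = '...'</CommandText></TSQLCommand>
</EVENT_INSTANCE>"""

def parse_event(xml_text):
    # Each lookup corresponds to one @data.value() call in the trigger.
    root = ET.fromstring(xml_text)
    return {
        "EventType": root.findtext("EventType"),
        "EventTime": root.findtext("PostTime"),
        "ServerName": root.findtext("ServerName"),
        "AffectedLoginName": root.findtext("ObjectName"),
        "WhoDidIt": root.findtext("LoginName"),
        "TSQL": root.findtext("TSQLCommand/CommandText"),
    }
```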


Above are the alert emails I received for the events: creating a login, assigning a server role, and finally deleting the login. If we no longer want to receive these alerts, the DDL trigger can be disabled on the server with the following statement:
DISABLE TRIGGER [Trg_TrackLoginManagement] ON ALL SERVER
GO
Did you configure any alerts for such DDL events? Share your thoughts here.

Logon trigger in SQL Server by serverku

As part of securing an instance, I recently worked with login details and auditing for the same. I needed to capture each logon event's statistics: the login name, the logon time, the program through which the connection was established, the session, and the host or client IP. The LOGON event is raised whenever a user session connects to a SQL Server instance on which a logon trigger is configured, which makes it helpful for auditing the logon details of every connection. For a small demo, let us create the required objects and implement it.
USE DemoDB
GO
-- Creating audit table
CREATE TABLE LogonAuditing
(
SessionId int,
LogonTime datetime,
HostName varchar(50),
ProgramName varchar(500),
LoginName varchar(50),
ClientHost varchar(50)
)
GO
USE Master
GO
-- Creating DDL trigger for logon
CREATE TRIGGER LogonAuditTrigger
ON ALL SERVER
FOR LOGON
AS
BEGIN
DECLARE @LogonTriggerData xml,
@EventTime datetime,
@LoginName varchar(50),
@ClientHost varchar(50),
@LoginType varchar(50),
@HostName varchar(50),
@AppName varchar(500)

SET @LogonTriggerData = eventdata()

SET @EventTime = @LogonTriggerData.value('(/EVENT_INSTANCE/PostTime)[1]', 'datetime')
SET @LoginName = @LogonTriggerData.value('(/EVENT_INSTANCE/LoginName)[1]', 'varchar(50)')
SET @ClientHost = @LogonTriggerData.value('(/EVENT_INSTANCE/ClientHost)[1]', 'varchar(50)')
SET @HostName = HOST_NAME()
SET @AppName = APP_NAME()--,program_name()

INSERT INTO DemoDB.dbo.LogonAuditing
(
SessionId,
LogonTime,
HostName,
ProgramName,
LoginName,
ClientHost
)
SELECT
@@spid,
@EventTime,
@HostName,
@AppName,
@LoginName,
@ClientHost

END
GO
The audit table and the trigger that fills it are created. Now it is time to test it and review the audit table.

We have all the details of the logon events in the table. One more interesting point is that we can prevent unwanted user logins and connections to SQL Server. This prohibition can be based on the login, program, or host; we just need to add conditions to the logon trigger that roll back the connection, like the following:
-- Preventing 'sa' login
IF @LoginName = 'sa'
BEGIN
ROLLBACK;
END

-- Preventing the connections from SSMS
IF @AppName = 'Microsoft SQL Server Management Studio'
BEGIN
ROLLBACK;
END
For the first criterion, preventing the 'sa' login, this is the message fired when such a connection is attempted:


You can define criteria per your requirements, as I did for login and program connections. Are you using a logon trigger?

Apply discrepancies at destination using SSIS - tablediff Utility in SQL Server by serverku

Last time we saw the tablediff utility basics and the tablediff utility for multiple tables using SSIS. As you know, that SSIS package generated discrepancy log files for the database changes, which we have to apply on the destination servers\databases to make them identical. This post extends the same work: we will automatically apply the discrepancies from each generated file at the destination.
I am writing here the next portion of the previous post's SSIS package for multiple tables and the changes it generates. Let's create the remaining part and add it to the existing package.


In the previous post we visited the first three steps; here I add the last three (steps 4 to 6). For steps 1 to 3, please visit that post. Let's continue with the remaining steps.

Step 4 : For each loop container
Retrieving each log file for processing:


Here we add one more variable, named "Filename", to capture the full log file path; the following process populates it:


Step 5 : Execute SQL Task
Taking the destination database connection and using a file connection:


To apply changes from multiple files to the destination, an expression is used for the FileConnection:


Step 6 : File System Task
Moving files to another location after processing, so they are not processed again next time:


After completing and running all the steps, the destination database will be the same as the source. Let us run the first three steps (1 to 3) again after the changes are applied and verify whether any discrepancies remain:


You can see that after this run the tablediff utility reports "Source table and destination table are identical" and no longer generates any discrepancy log files. Are you using steps like these to apply changes to the destination using SSIS? Waiting for your comments!

Copy database backup files using XCOPY command by serverku

Recently, while working on copying database backups to external drives or other destinations, I tried a new solution using the XCOPY command. Before that I was using the XP_CMDSHELL command in SQL Server to copy backup files to an external drive, but XP_CMDSHELL should be avoided for security reasons. So, outside of SQL Server scripts and other tools, the solution came down to the XCOPY command, which copies the latest files, or files created on or after a specified date; here, that means the most recent database backup files. You can read about the XCOPY command here.
Let me first share the command I am using to copy the latest database backup files. I use some variables in the code; the script comes first, then I will explain the variables.
SET dwMONTH=%DATE:~4,2%
SET dwDAY=%DATE:~7,2%
SET dwYEAR=%DATE:~10,4%
SET dwDate=%dwMONTH%-%dwDAY%-%dwYEAR%

SET source=C:\DatabaseBackup
SET destination=\\externaldrive\DatabaseBackup\
SET extension=*.bak

XCOPY /Y %source%\%extension% %destination% /s /i /D:%dwDate%
You can see the batch file created with the above code, the variables used inside it, and how they are integrated into the final XCOPY command. When run, it copies the database backup files to an external drive or any other destination, using the source, destination, file extension (.bak or .trn here), and the date on or after which the backup files were created. Let's implement and run it.
This batch code copies all the files of the specified type created today, from the source folder including subfolders, to the destination. So let's run the batch and see the result, with a sample example and local drives:
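The substring expressions at the top of the batch file can be confusing, so here is a Python sketch of the same extraction. It assumes %DATE% expands to a format like "Thu 04/30/2015", where offset 4 length 2 is the month, offset 7 length 2 the day, and offset 10 length 4 the year; the %DATE% format depends on the Windows regional settings, so the offsets may differ on other machines.

```python
def build_xcopy_date(date_str):
    # Mimic the batch substring syntax %DATE:~offset,length%
    # for a %DATE% value like "Thu 04/30/2015".
    month = date_str[4:6]    # %DATE:~4,2%
    day   = date_str[7:9]    # %DATE:~7,2%
    year  = date_str[10:14]  # %DATE:~10,4%
    return "%s-%s-%s" % (month, day, year)
```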

Apart from the XP_CMDSHELL command in SQL Server, which methods are you using to copy database backup files to external drives? Please share here. I will cover a different method for the same task in a further post. Hope you enjoyed it.

SQL Server tablediff utility for multiple tables using SSIS by serverku

In the last post we saw the basic concept of the tablediff utility with one example using a static table. Now I would like to drive it from an SSIS package, comparing multiple tables. So let's start with some demo objects, created as follows. Here I am creating two tables in different databases on the same SQL Server instance.
USE SourceDB
GO

CREATE TABLE dbo.SourceObj1
(
id int PRIMARY KEY ,
name varchar(10),
CreatedDate DATETIME DEFAULT GETDATE()
)

CREATE TABLE dbo.SourceObj2
(
id int PRIMARY KEY ,
name varchar(10),
CreatedDate DATETIME DEFAULT GETDATE()
)

INSERT dbo.SourceObj1
(
id,
name
)
SELECT 1,'test1'
UNION ALL
SELECT 2,'test2'
GO

INSERT dbo.SourceObj2
(
id,
name
)
SELECT 1,'test3'
UNION ALL
SELECT 2,'test4'
GO


USE DestDB
GO

CREATE TABLE dbo.DestObj1
(
id int PRIMARY KEY ,
name varchar(10),
CreatedDate DATETIME DEFAULT GETDATE()
)

CREATE TABLE dbo.DestObj2
(
id int PRIMARY KEY ,
name varchar(10),
CreatedDate DATETIME DEFAULT GETDATE()
)

INSERT dbo.DestObj1
(
id,
name
)
SELECT 1,'test1'
UNION ALL
SELECT 3,'test3'
GO

INSERT dbo.DestObj2
(
id,
name
)
SELECT 1,'test3'
UNION ALL
SELECT 3,'test4'
GO


SELECT
'Sourcedb.dbo.SourceObj1' as ObjectName,
*
FROM Sourcedb.dbo.SourceObj1
SELECT
'Destdb.dbo.DestObj1' as Objectname,
*
FROM Destdb.dbo.DestObj1
SELECT
'Sourcedb.dbo.SourceObj2' as ObjectName,
*
FROM Sourcedb.dbo.SourceObj2
SELECT
'Destdb.dbo.DestObj2' as ObjectName,
*
FROM Destdb.dbo.DestObj2
GO
Let's see the data that was inserted:




Now I will create an SSIS package that passes dynamic table names and other required details. But before that, I need to populate a table with the details we require as arguments to be passed to a batch file:
USE Maintenance
GO

CREATE TABLE DatasyncDetails
(
Id int identity(1,1),
SourceDatabase varchar(50),
DestDatabase varchar(50),
SourceSchema varchar(20),
DestSchema varchar(50),
SourceTable varchar(50),
DestTable varchar(20)
)


INSERT INTO DatasyncDetails
(
SourceDatabase,
DestDatabase,
SourceSchema,
DestSchema,
SourceTable,
DestTable
)

SELECT
'SourceDB',
'DestDB',
'DBO',
'DBO',
'SourceObj1',
'DestObj1'

UNION ALL

SELECT
'SourceDB',
'DestDB',
'DBO',
'DBO',
'SourceObj2',
'DestObj2'

SELECT
SourceDatabase,
SourceSchema,
SourceTable,
DestDatabase,
DestSchema,
DestTable
FROM Maintenance.dbo.DatasyncDetails
GO
Let's check the populated data which we have to process for discrepancies:


All the required demo objects are created. Now let's prepare the SSIS package,


and the variables used:


I have presented two snapshots with details of the whole flow. Now we will walk through all the steps one by one.

Step 1 : Execute SQL Task



Step 2 : For Each Loop Container



Step 3 :  Execute Process Task

Here we will create a batch file and call it from this task, with arguments passed from the parameters we mapped in the earlier task. You can see the content of the batch file and the arguments used inside:
"C:\Program Files\Microsoft SQL Server\90\COM\tablediff.exe" 
-sourceserver [PARESH\MSSQLSERVER2012]
-sourcedatabase [%1]
-sourceschema [%2]
-sourcetable [%3]
-sourceuser [dba]
-sourcepassword [dba@1234]
-destinationserver [PARESH\MSSQLSERVER2012]
-destinationdatabase [%4]
-destinationschema [%5]
-destinationtable [%6]
-destinationuser [dba]
-destinationpassword [dba@1234]
-et Difference
-f C:\DiffOutput\%7
Now I will use this batch file in the Execute Process Task, with the arguments passed by the For Each Loop container.






Expression of Arguments :
@[User::SourceDB]  +" "+  @[User::SourceSchema] + " " +  @[User::SourceTable] +" "+  @[User::DestDB]  +" "+  @[User::DestSchema] + " " +  @[User::DestTable] +" "+ @[User::SourceTable]
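To show what the Arguments expression produces, here is a small Python sketch that joins the same variables. The function is hypothetical and exists only to illustrate the argument order the batch file expects (%1 through %7), with the source table name repeated at the end to serve as the output file name.

```python
def build_tablediff_args(source_db, source_schema, source_table,
                         dest_db, dest_schema, dest_table):
    # Join the SSIS variables with spaces, as the Arguments
    # expression does; the source table is repeated as %7
    # so tablediff can use it for the output file name.
    return " ".join([source_db, source_schema, source_table,
                     dest_db, dest_schema, dest_table, source_table])
```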
Finally, with all the steps done, we will run the package and review the resulting SQL script logs for discrepancies. So let's run it and review the output files.
Running…


Output files,
-- Host: PARESH\MSSQLSERVER2012
-- Database: [DestDB]
-- Table: [DBO].[DestObj1]
UPDATE [DBO].[DestObj1] SET [CreatedDate]='2012-10-06 11:39:10.620' WHERE [id] = 1
INSERT INTO [DBO].[DestObj1] ([CreatedDate],[id],[name]) VALUES ('2012-10-06 11:39:10.620',2,'test2')
DELETE FROM [DBO].[DestObj1] WHERE [id] = 3

-- Host: PARESH\MSSQLSERVER2012
-- Database: [DestDB]
-- Table: [DBO].[DestObj2]
UPDATE [DBO].[DestObj2] SET [CreatedDate]='2012-10-06 11:39:10.627' WHERE [id] = 1
INSERT INTO [DBO].[DestObj2] ([CreatedDate],[id],[name]) VALUES ('2012-10-06 11:39:10.627',2,'test4')
DELETE FROM [DBO].[DestObj2] WHERE [id] = 3
We can compare more tables to find the differences between them, as we did here. You can also pass the server name, username and password as arguments to make it fully dynamic, provided those SQL Server instances can be connected to from there. In the next post I will add a task to apply the differences automatically at the destination servers\databases. Hope you will like and share it.