Sunday, 26 July 2015

Missing index suggestion by Execution Plan - SQL Server by serverku

Indexes are among the most important factors in optimization and performance. Whenever we work on query optimization, we mostly look at indexes first, then check other factors. We also schedule jobs to maintain the indexes on a periodic basis. From the index usage and statistics reports we can get an idea of where index optimization is needed, but the SQL Server query execution plan also recommends indexes and gives a suggestion as an index hint. We will look here at a demonstration of the same with the execution plan.
--- Creating tables
IF OBJECT_ID('ItemTypes','U') IS NOT NULL
DROP TABLE ItemTypes
GO

IF OBJECT_ID('ItemDetails','U') IS NOT NULL
DROP TABLE ItemDetails
GO

CREATE TABLE ItemTypes
(
ItemType varchar(100),
ItemTypeDesc varchar(100)
)
GO

CREATE TABLE ItemDetails
(
ItemId int not null,
ItemName varchar(100),
ItemType varchar(10),
ItemDescription varchar(200)
)
GO


-- Inserting sample records
INSERT INTO ItemDetails
SELECT
a.id,
a.name,
a.xtype,
'ItemDesc'
FROM sys.sysobjects a
CROSS JOIN sys.sysobjects b
GO

INSERT INTO ItemTypes
SELECT distinct
type,
type_desc
FROM sys.objects
GO

-- Review the execution plan.
SELECT
id.ItemId,
id.ItemName,
id.ItemType,
it.ItemTypeDesc as ItemTypeDesc,
id.ItemDescription
FROM ItemDetails id
INNER JOIN ItemTypes it
ON (it.ItemType = id.ItemType)
GO

You can see the index suggestion in the execution plan, and the following is the script it generates.
/*
Missing Index Details from SQLQuery.sql - ReportServer (dba (56))
The Query Processor estimates that implementing the following index could improve the query cost by 89.0255%.
*/

/*
USE [ReportServer]
GO
CREATE NONCLUSTERED INDEX []
ON [dbo].[ItemDetails] ([ItemType])
INCLUDE ([ItemId],[ItemName],[ItemDescription])
GO
*/
Give the index a name, apply it and enjoy!
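Beyond the plan for a single query, SQL Server also records missing-index suggestions in dynamic management views, which is handy when you cannot capture every plan by hand. As a rough sketch (the column selection and ordering are my own choices, and note these DMVs reset on instance restart), something like this lists the outstanding suggestions:

-- Missing-index suggestions recorded since the last instance restart
SELECT
DB_NAME(mid.database_id) AS DatabaseName,
mid.statement AS TableName,
mid.equality_columns,
mid.inequality_columns,
mid.included_columns,
migs.user_seeks,
migs.avg_user_impact -- estimated % improvement, like the hint in the plan
FROM sys.dm_db_missing_index_details mid
INNER JOIN sys.dm_db_missing_index_groups mig
ON (mig.index_handle = mid.index_handle)
INNER JOIN sys.dm_db_missing_index_group_stats migs
ON (migs.group_handle = mig.index_group_handle)
ORDER BY migs.avg_user_impact DESC
GO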

Sunday, 19 July 2015

Insert default values for all columns in table - SQL Server by serverku

I was asked by some of my job colleagues how to insert all DEFAULT values into a table when every column of the table is defined with a default property. If the table has at least one column without a default or identity, then we can easily insert a row by supplying that column and letting the rest take their defaults. But how can we insert default values for all columns in the table? For that we will see a small demo, where we will create one table with all columns defined with a default (or identity) and then see how to insert default values into it.
USE DEMO
GO

-- Creating table
IF OBJECT_ID('DefaultTable','U') IS NOT NULL
DROP TABLE DefaultTable
GO

CREATE TABLE DefaultTable
(
TransactionId INT IDENTITY(1,1) NOT NULL,
TransactionName VARCHAR(100) NOT NULL DEFAULT 'Test Transaction',
TransactionDate DATETIME NOT NULL DEFAULT GETDATE(),
TransactionType SMALLINT NOT NULL DEFAULT 0
)
GO

-- Insert Default values for all columns

INSERT DefaultTable
DEFAULT VALUES
GO

-- Reviewing records in table
SELECT
*
FROM DefaultTable
GO

Let us try a few more inserts and review the table again,
INSERT DefaultTable
DEFAULT VALUES
GO 100

-- Reviewing records in table
SELECT
*
FROM DefaultTable
GO

You might come across this type of need.
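As a side note, the DEFAULT keyword can also be used per column in the VALUES clause, which is handy when only some columns should fall back to their defaults. A small sketch against the same table (the literal value is my own example):

-- DEFAULT per column: override one default, keep the rest
INSERT INTO DefaultTable (TransactionName, TransactionDate, TransactionType)
VALUES ('Manual Transaction', DEFAULT, DEFAULT)
GO

SELECT
*
FROM DefaultTable
GO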

Sunday, 12 July 2015

Script to get Database Files detail - SQL Server by serverku

The DBA needs to perform some day-to-day activities to monitor SQL Servers and databases and verify that everything is OK. Some of these tasks run daily, some weekly and some monthly. But database monitoring and its details are especially important, because databases grow day by day and accordingly we need to keep an eye on disk space.

For that we should have all the database details and statistics, like each database file's current size, growth setting and maximum size. We can check the database properties and file sizes for each of them manually, but we should have a script that checks all (or specific) databases, which makes it easier to collect the details and monitor them. Here I will write a simple script which returns the files and information for a specific database. Let us dive into it: we will create one database and add one filegroup with a secondary file, plus a log file.
 
USE MASTER
GO

-- Creating database
CREATE DATABASE DatabASeFileDetails
GO

-- Adding a new filegroup to database
ALTER DATABASE DatabASeFileDetails
ADD FILEGROUP NewFileGroup;
GO

-- Adding a new secondary file to the above created filegroup
ALTER DATABASE DatabASeFileDetails
ADD FILE
(
-- New secondary data file added here
NAME = FileGroupDB_Data_1,
FILENAME = 'C:\DatabASeFileDetails_Data_1.ndf',
SIZE = 15 MB,
MAXSIZE = 100 MB,
FILEGROWTH = 5 MB
)
TO FILEGROUP NewFileGroup; -- Defining filegroup name here
GO

-- Adding a new log file (log files do not belong to a filegroup,
-- so ADD LOG FILE is used instead of ADD FILE ... TO FILEGROUP)
ALTER DATABASE DatabASeFileDetails
ADD LOG FILE
(
NAME = FileGroupDB_Log_1,
FILENAME = 'C:\DatabASeFileDetails_Log_1.ldf',
SIZE = 5 MB,
MAXSIZE = 100 MB,
FILEGROWTH = 5 MB
);
GO
Now we have created the database with a new filegroup and database files. It is our turn to run the script we actually want, which returns the database file details.
 
-- Using that database
USE DATABASEFILEDETAILS
GO

SELECT
DB_NAME(DBID) AS DatabaseName,
Name AS LogicalFileName,
CASE
WHEN FILEID = 1
THEN 'Primary File'
WHEN FILEID = 2 THEN 'Log File'
ELSE 'Secondary File'
END AS FileDescription,
FILEGROUP_NAME(groupid) AS FileGroup,
CAST( ((CAST (SIZE AS NUMERIC(18,2) )*8)/1024) AS NUMERIC(18,2)) AS [FileSize(MB)],
CASE
WHEN growth = 0
THEN 'No growth'
WHEN status & 0x100000 = 0 -- bit set when growth is a percentage
THEN CAST((CAST(growth AS INT)*8)/1024 AS VARCHAR(1000)) + ' [growth(MB)]'
ELSE CAST(growth AS VARCHAR(1000)) + ' [growth(%)]'
END AS Growth,
CASE maxsize
WHEN 0
THEN 'No growth'
WHEN -1
THEN 'File will grow until the disk is full'
ELSE CAST(CAST(((CAST (maxsize AS NUMERIC(18,2) )*8)/1024) AS NUMERIC(18,2)) AS VARCHAR(1000))
END AS [MaxFileSize(MB)],
FileName AS PhysicalFilePath
FROM SYS.SYSALTFILES
WHERE DBID > 4
AND DB_NAME(DBID) = 'DatabASeFileDetails'
ORDER BY DBID,FileId

GO
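The script above uses sys.sysaltfiles, which is a deprecated compatibility view. As an alternative sketch for SQL Server 2005 and later (the column choices mirror the script above but are my own), the same details for the current database can be pulled from the supported catalog view sys.database_files:

-- Same file details from the supported catalog view (current database)
SELECT
DB_NAME() AS DatabaseName,
df.name AS LogicalFileName,
df.type_desc AS FileDescription, -- ROWS or LOG
FILEGROUP_NAME(df.data_space_id) AS FileGroup, -- NULL for log files
CAST(df.size * 8 / 1024.0 AS NUMERIC(18,2)) AS [FileSize(MB)],
CASE
WHEN df.growth = 0 THEN 'No growth'
WHEN df.is_percent_growth = 1
THEN CAST(df.growth AS VARCHAR(10)) + ' [growth(%)]'
ELSE CAST(df.growth * 8 / 1024 AS VARCHAR(10)) + ' [growth(MB)]'
END AS Growth,
CASE df.max_size
WHEN 0 THEN 'No growth'
WHEN -1 THEN 'File will grow until the disk is full'
ELSE CAST(CAST(df.max_size * 8 / 1024.0 AS NUMERIC(18,2)) AS VARCHAR(20))
END AS [MaxFileSize(MB)],
df.physical_name AS PhysicalFilePath
FROM sys.database_files df
ORDER BY df.file_id
GO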

In the above snapshot we have compared the database script results with the database properties, so we can be sure everything is OK and matches. What are you using? Please share your comments if I missed something.

Sunday, 5 July 2015

Purge SQL Server Logs and Agent Error Logs by serverku

One day when I checked the disk space on the C:\ drive where I have SQL Server installed, the drive was full and no free space was available. I checked all the folders and files on the C:\ drive, but the problem was different: the SQL Server error log file had grown very large, and that caused the disk to fill up. I reviewed it and found the current error log file was the cause. I tried to remove it to free some space on the disk, but I could not delete it as it was in use by SQL Server. Then I applied a script to cycle the current error log, which closes it and starts a new error log file.

Generally, a new error log file is created when the SQL Server service restarts. The script looks as follows; here I have given a sample example of how to cycle the active error logs in SQL Server, so that afterwards we can remove the previous error log files. Have a look at the screenshots of the SQL Server error log files and Agent error log files before the script runs.


 
-- Using MSDB database to run the command
USE MSDB
GO

-- Removing SQL Server error logs
EXEC SP_CYCLE_ERRORLOG
GO

-- Removing SQL Server Agent error logs
EXEC SP_CYCLE_AGENT_ERRORLOG
GO
 

Let us go through the screen below, which shows the new files created for the same. What do you think; have you applied any script or schedule to create new error log files and remove the previous ones?
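To answer my own question, one option is a SQL Server Agent job that cycles both logs on a schedule. A rough sketch (the job name and the weekly Sunday-midnight schedule are my own choices):

USE MSDB
GO

-- Job that cycles the SQL Server and Agent error logs weekly
EXEC sp_add_job @job_name = N'Cycle Error Logs'
GO

EXEC sp_add_jobstep
@job_name = N'Cycle Error Logs',
@step_name = N'Cycle SQL Server and Agent error logs',
@subsystem = N'TSQL',
@command = N'EXEC sp_cycle_errorlog; EXEC sp_cycle_agent_errorlog;'
GO

EXEC sp_add_jobschedule
@job_name = N'Cycle Error Logs',
@name = N'Weekly - Sunday midnight',
@freq_type = 8,              -- weekly
@freq_interval = 1,          -- Sunday
@freq_recurrence_factor = 1, -- every week
@active_start_time = 0       -- 00:00:00
GO

-- Target the job at the local server so it actually runs
EXEC sp_add_jobserver @job_name = N'Cycle Error Logs'
GO

Remember that the number of error log files SQL Server retains is configured separately (under SQL Server Logs in SSMS), so cycling more often than the retention allows will discard older logs.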