Sunday, 30 December 2012
Upgrade SIEM from 9.1.1 to 9.1.3 by serverku
- Always download upgrade files using Internet Explorer or Mozilla Firefox; don't use Chrome.
- Verify the downloaded file is in the correct directory with the correct extension: run "cd /usr/local/NitroGuard" and confirm the file is there with a .tgz extension. If not, rename it with "mv RECEIVER_Update_9.1.3.signed.gz RECEIVER_Update_9.1.3.signed.tgz".
- Compare the hash values of the downloaded files with the original hash values provided on the McAfee website.
- Upgrade the ESM first, then the ELM, and then the remaining devices in any order.
- After upgrading each device, verify the version in System Properties and check the device health status: select the device (e.g. ADM), then select the "device status" dashboard from the menu.
- Check each device's logs after the upgrade; any problem will show up there. Also ensure the files in "/usr/local/NitroGuard/updates" have been updated to the latest version; list them with "ls -lrt". You will see many files with version 9.1.3, confirming the upgrade (we were upgrading from 9.1.1 to 9.1.3).
- Make sure no device shows a red flag. If one does, click on it to see the device log stating the reason.
- Open the Policy Editor and roll out policy to all devices so that the changes in the new version are applied to all devices and data sources.
- Take a full backup.
- Apply any patches or hotfixes recommended by the vendor.
Tuesday, 11 December 2012
Correlating IPS events in Nitro SIEM by serverku
Issue
If you are using McAfee SIEM (previously Nitro SIEM) and don't have a Nitro IPS, you might face issues correlating IPS events. This is because McAfee SIEM doesn't automatically normalize data from other IPS products and labels their events as uncategorized.
Solution
A workaround is to manually normalize your IPS events into categories (e.g. exploit, P2P) and then use these normalization rules in correlation rules. However, this is not a permanent solution, as IPS signatures are constantly updated and new attack signatures will be added from time to time.
Example scenario
You have a set of critical servers for which you want to see exploit attacks. The first step is to select your IPS in the "physical display" and select the "Normalized dashboard" from the list of available dashboards. Next, select an exploit event in the "Event Summary" sub-group and go to the rule definition by selecting the "Show rule" option.
The Policy Editor window will open, showing the rule definition you selected. Double-click on the rule definition and click on the green button next to "Normalized ID". You can then move this rule into the exploit category in the normalization taxonomy.
Once you have normalized all your exploit events, you are ready to write a correlation rule to see all exploit events targeting your critical servers. First create a variable rule defining the IP addresses of all your critical servers.
Next, create a correlation rule as follows.
You can further reduce the resulting events by excluding events already blocked by your IPS, as shown in the rule above.
How to set HOMENET Variable in Mcafee Nitro SIEM by serverku
Issue
The issue arose when we tried to set the HOMENET variable in the Policy Editor. The HOMENET variable is used in almost every rule in McAfee Nitro SIEM. However, we observed that even though we had defined our custom values for the internal subnet, the rules still showed the HOMENET variable as 0.0.0.0/0.
Solution:
GO TO "Asset Manager" in Mcafee Nitro Siem as shown in the post
Go TO "Network Discovery"
CLICK "Homenet"
and specify your subnet, we wanted to include our whole subnet so we have mentioned 10.0.0.0/8.
After updating the value here, we considerably reduced false positives and the issue was solved.
Thursday, 6 December 2012
How to detect and prevent users from removing McAfee agent from Client machines by serverku
1) Create an automated response for client events to show: Attempt to uninstall McAfee Agent.
2) Set up a VirusScan access protection policy so that McAfee files, processes, etc. cannot be modified or terminated.
3) Follow KB69716 to add a registry key that removes the VirusScan entry from Add/Remove Programs.
4) Generate queries, called compliance reports, to show machines which do not have VirusScan or the McAfee Agent installed, and which version they have. With this, you can then automatically deploy agents and VirusScan.
If, however, you give local admin rights to your users, they will always be able to edit the registry and run commands or delete keys to uninstall.
Tuesday, 4 December 2012
Mcafee ePO policies for Microsoft Failover Cluster by serverku
Let's suppose we want a cluster for MS SQL Server. We will exclude the data file path and log file path shown below, along with the three file/drive path exclusions above:
C:\Program Files\Microsoft SQL Server\MSSQL10.RTCLOCAL\MSSQL\DATA
C:\Program Files\Microsoft SQL Server\MSSQL10.RTCLOCAL\MSSQL\Log
Friday, 17 August 2012
CTE within CTE - SQL Server by serverku
Sometimes custom logic requires a CTE (Common Table Expression): for hierarchies, for finding and removing duplicate data, or for other tasks. Recently I used a CTE within a CTE for one piece of logic, along with the MAXRECURSION option. So we will look at how to use a CTE inside another CTE, i.e. multiple CTEs. Here is the requirement: find the first and second created objects of each object type in the database, in ascending order.
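Since the MAXRECURSION option is mentioned here but not used in the example below, a minimal illustrative sketch (the CTE name and the limit of 500 are assumptions, not part of the original example):
-- Recursive CTE generating the numbers 1 to 500; the default recursion limit is 100
;WITH NumbersCTE AS
(
SELECT 1 AS n
UNION ALL
SELECT n + 1 FROM NumbersCTE WHERE n < 500
)
SELECT n FROM NumbersCTE
OPTION (MAXRECURSION 500) -- 0 would mean no limit on recursion
GO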
The sample data is created by the script below. Let us create it first; then we will build the logic required.
USE DEMO
GO
-- Creating sample table
IF(OBJECT_ID('TblCTEwithCTE','U') > 0)
DROP TABLE TblCTEwithCTE
CREATE TABLE TblCTEwithCTE
(
ObjectNumber INT ,
ObjectType VARCHAR(50),
ObjectName VARCHAR(100),
ObjectCreateDate DATETIME
)
GO
-- Inserting sample records created above
INSERT INTO TBLCTEWITHCTE
(
ObjectNumber,
ObjectType,
ObjectName,
ObjectCreateDate
)
SELECT
ROW_NUMBER() OVER(PARTITION BY TYPE_DESC ORDER BY TYPE_DESC,CREATE_DATE) as ObjectNumber,
TYPE_DESC,
NAME,
CREATE_DATE
FROM SYS.OBJECTS
Here we create the first CTE to get only the first created object (based on creation date) within each object type category.
-- Fetching first created objects
;WITH FirstCreatedObjectsCTE
AS
(
SELECT
ObjectNumber as ObjectNumber ,
ObjectType as ObjectType,
ObjectName as FirstCreatedObject
FROM TBLCTEWITHCTE WHERE ObjectNumber = 1
)
SELECT
*
FROM FirstCreatedObjectsCTE
GO
Now we have the first created objects by object type category. Next we get the second created objects, using the first CTE inside a second CTE, to produce a final output with both the first created and the second created objects as columns, as follows.
-- Original table data.
SELECT
ObjectNumber,
ObjectType,
ObjectName,
ObjectCreateDate
FROM TBLCTEWITHCTE
GO
-- Fetching first created objects in first CTE and using in second CTE for the second created objects.
;WITH FirstCreatedObjectsCTE
AS
(
SELECT
ObjectNumber as ObjectNumber ,
ObjectType as ObjectType,
ObjectName as FirstCreatedObject
FROM TBLCTEWITHCTE WHERE ObjectNumber = 1
)
,
SecondCreatedObjectsCTE
AS
(
SELECT
t.ObjectType as ObjectType,
c.FirstCreatedObject as FirstCreatedObject,
t.ObjectName as SecondCreatedObject
FROM TBLCTEWITHCTE t
RIGHT OUTER JOIN
FirstCreatedObjectsCTE c
ON (c.ObjectType = t.ObjectType and t.ObjectNumber = c.ObjectNumber + 1)
)
SELECT
*
FROM SecondCreatedObjectsCTE
GO
Hope you like this; stay tuned for more.
Saturday, 18 February 2012
Archive files based on number of files configuration in SQL Server - SSIS by serverku
I hope you read the earlier posts on archiving files older than a given number of days, and that you liked them. If you have not, you can find them below, covering various scenarios:
- Archive old database backup files using TSQL Script - SQL Server
- Delete files from specified folder using File System Task in SQL Server - SSIS
- Delete files from specified folder using Script Task in SQL Server - SSIS
- Delete files older than retention period from specified folder using Script Task in SQL Server - SSIS
- Delete files older than retention period from folder with parameter using Script Task in SQL Server - SSIS
Recently I was asked by friends to delete files based on the number of files, not on age. This way we always keep a set number of files, and the rest get archived. As an example, I have three folders and want to keep a different number of files in each: 3 files in BackupFolder1, 2 files in BackupFolder2, and 1 file in BackupFolder3. How can we achieve this? Let's move on to the steps.
First we will check the files existing in the folders.
We are creating an SSIS package here and will use several tasks to perform the archive process.
Also the variables used in the whole task flow,
1.
You can see the flow of the tasks. Task 1 fetches, from a table, the records that need to be archived; we have inserted the folder paths and the number of files to keep in each folder. Let's view the records in the table.
Now we will create task 1 with an Execute SQL Task and get the list of archive folders, as shown below.
It fetches the folder and archive number for each row and assigns them to a variable of Object data type. For the variable assignment you need to set the ResultSet option to 'Full result set' as above. After that, the output variable mappings are as follows.
2.
We need to loop through the values coming out of the object variable and map them to two variables, the archive folder name and the archive number, using a Foreach Loop Container.
You can see how the object variable holds the values and assigns them to the other variables in the shot below.
3.
Now use an Execute SQL Task again to get the list of files that need to be deleted. You need to write some SQL to fetch those files as per the archive numbers, so we will look at the configuration, the variable value mappings, and how they are used in the script. You also need to enable xp_cmdshell in the server configuration.
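A minimal sketch of enabling xp_cmdshell via sp_configure (requires sysadmin rights; weigh the security implications before enabling it):
-- Enable advanced options, then the xp_cmdshell extended procedure
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;
GO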
As in the earlier steps, put the SQL query in the SQLStatement option and set ResultSet to 'Full result set'. The two arrows are the input values for the script that is going to be executed. You can collect the same script here:
-- Temp table holding the directory listing
CREATE TABLE #TempFiles
(
Seq int identity(1,1),
FileName varchar (500)
)
DECLARE @FullPath varchar(500), @sql varchar(500)
-- The ? placeholders map to the SSIS input parameters: folder path and number of files to keep
SET @FullPath = ?
-- Bare-format directory listing, newest files first
SET @sql = 'dir /B /O-D '+ @FullPath
INSERT INTO #TempFiles (FileName)
EXEC xp_cmdshell @sql
-- Keep the newest N files; anything beyond that sequence is returned for deletion
SELECT @FullPath + FileName as FullName
FROM #TempFiles
WHERE FileName is not null
AND Seq > ?
DROP TABLE #TempFiles
After the above completes, the resulting values go into another variable and are passed to the next step.
4.
Use a Foreach Loop Container to process the delete operation for each file, with the same steps we used earlier for the Foreach Loop.
5.
Finally, use the Script Task to delete the files one by one, as in the image below. You need to pass the file name variable as read only.
Now use the following script in your Script Task to perform the actual deletion. You can copy the same script from here:
//Apply the line below in the #region Namespaces section
using System.IO;
//Apply the code below inside the ScriptMain class
public void Main()
{
// Read the file path from the read-only SSIS variable and delete that file
string FilePath = Dts.Variables["User::FileName"].Value.ToString();
System.IO.File.Delete(FilePath);
Dts.TaskResult = (int)ScriptResults.Success;
}
After running the package, files are deleted as per the archive number defined in the table for each folder.
You can schedule this package at whatever frequency suits you. Hope this helps you.
Labels: #sql, #sql Server, Archive, database, ms sql, ms sql server, retention, SQL, SQL Scripts, SQL Server, SQL Server General, SSDT, SSIS
Friday, 10 February 2012
Restore Full Database from multiple backup files - SQL Server by serverku
Restoring databases periodically is a best practice: it verifies the backup copies and surfaces any restoration issues. I have written before about performing a full database backup, either complete or split into multiple files to reduce IO and time. Since we have taken full database backups both ways, we will restore from both: first from a complete backup, then from split multiple files.
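For reference, a full backup can be written to multiple files in one statement. A minimal sketch, with the paths assumed to match the restore examples below:
-- Split one full backup across three files to reduce IO and time
BACKUP DATABASE ReportServer TO
DISK = 'D:\DBBackups\ReportServer\ReportServer_Split1.bak'
,DISK = 'D:\DBBackups\ReportServer\ReportServer_Split2.bak'
,DISK = 'D:\DBBackups\ReportServer\ReportServer_Split3.bak'
GO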
#1. Using TSQL : Let us run the script below first to get the logical names for the database data and log files.
RESTORE FILELISTONLY FROM
DISK = 'D:\DBBackups\ReportServer\ReportServer.bak'
GO
After collecting the logical file names and putting them in the script below, it will restore the database from the complete backup.
RESTORE DATABASE ReportServerComplteCopy FROM
DISK = 'D:\DBBackups\ReportServer\ReportServer.bak'
WITH REPLACE ,
MOVE 'ReportServer' TO 'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER11\MSSQL\DATA\ReportServerComplteCopy.mdf',
MOVE 'ReportServer_log' TO 'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER11\MSSQL\DATA\ReportServerComplteCopy_log.ldf'
GO
In the same way, we can restore the database from multiple files as follows.
RESTORE DATABASE ReportServerSplitCopy FROM
DISK = 'D:\DBBackups\ReportServer\ReportServer_Split1.bak'
,DISK = 'D:\DBBackups\ReportServer\ReportServer_Split2.bak'
,DISK = 'D:\DBBackups\ReportServer\ReportServer_Split3.bak'
WITH REPLACE ,
MOVE 'ReportServer' TO 'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER11\MSSQL\DATA\ReportServerSplitCopy.mdf',
MOVE 'ReportServer_log' TO 'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER11\MSSQL\DATA\ReportServerSplitCopy_log.ldf'
GO
#2. From Management Studio : We can also perform the restoration from SSMS by adding those multiple backup files, as per the shot.
You can see here that, after adding those files, they show as a single backup set.
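You can also check a backup set without restoring it. A minimal sketch against the same split files:
-- Verify that the multi-file backup set is readable and complete
RESTORE VERIFYONLY FROM
DISK = 'D:\DBBackups\ReportServer\ReportServer_Split1.bak'
,DISK = 'D:\DBBackups\ReportServer\ReportServer_Split2.bak'
,DISK = 'D:\DBBackups\ReportServer\ReportServer_Split3.bak'
GO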
Hope this helps you.
Labels: #sql, #sql Server, database, Maintenance, ms sql, ms sql server, Performance, query, restore, SQL, SQL Scripts, SQL Server General, t-sql, tsql
Saturday, 4 February 2012
DDL Trigger and DDL Auditing in SQL Server 2005 by serverku
For very sensitive and secure databases, it is necessary to track all activities and changes involving database objects. The tracking itself should be very secure, so nobody can tamper with the objects. We should have a full history of database object structure changes: what changed, when, and who did it.
To track all of this we need a proper change process, handled by a responsible person. But sometimes issues occur with database object changes, and if we have enough information at that point we can correct them and revert to the original. Today I would like to introduce the feature that handles all of this: database DDL triggers. DDL triggers can be created at the database or the server level, but for database object tracking we need one on the database. Let us see an example.
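For contrast, before the database-level example, a minimal sketch of the server-level form (the trigger name and event list here are illustrative assumptions, not part of the original example):
-- Server-level DDL trigger firing when a database is created or dropped
CREATE TRIGGER [AuditServerDDL]
ON ALL SERVER
FOR CREATE_DATABASE, DROP_DATABASE
AS
BEGIN
PRINT 'A database was created or dropped on this server.'
END
GO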
-- Creating audit database
CREATE DATABASE AuditDatabase
GO
USE AuditDatabase
GO
-- Creating table, which capture all the objects structure changes
CREATE TABLE ObjectTracking
(
TrackingId bigint identity(1,1),
TrackingDate datetime NOT NULL DEFAULT GETDATE(),
DatabaseName varchar(500),
EventType varchar(500),
ObjectName varchar(500),
ObjectType varchar(500),
LoginName varchar(500),
HostName varchar(500),
SqlCommand nvarchar(max)
)
GO
Now, to audit DDL operations and structure changes on the database objects, we create a DDL trigger at the database level.
-- Creating the trigger, which will fire on the tracked object actions in this database
CREATE TRIGGER [AuditObjects]
ON DATABASE
FOR
-- DDL_DATABASE_LEVEL_EVENTS
-- Defining all the DDL database events we need to track
CREATE_PROCEDURE, ALTER_PROCEDURE, DROP_PROCEDURE,
CREATE_TABLE, ALTER_TABLE, DROP_TABLE,
CREATE_FUNCTION, ALTER_FUNCTION, DROP_FUNCTION,
CREATE_VIEW,ALTER_VIEW,DROP_VIEW,
CREATE_INDEX,ALTER_INDEX,DROP_INDEX
AS
BEGIN
SET NOCOUNT ON
DECLARE @data XML
DECLARE @HostName varchar(500)
SET @data = EVENTDATA()
set @HostName = HOST_NAME()
-- Inserting tracking information in audit table
INSERT INTO ObjectTracking(DatabaseName,EventType,ObjectName,ObjectType,LoginName,HostName,SqlCommand)
VALUES(
@data.value('(/EVENT_INSTANCE/DatabaseName)[1]', 'varchar(500)'),
@data.value('(/EVENT_INSTANCE/EventType)[1]', 'varchar(500)'),
@data.value('(/EVENT_INSTANCE/ObjectName)[1]', 'varchar(500)'),
@data.value('(/EVENT_INSTANCE/ObjectType)[1]', 'varchar(500)'),
@data.value('(/EVENT_INSTANCE/LoginName)[1]', 'varchar(500)'),
@HostName,
@data.value('(/EVENT_INSTANCE/TSQLCommand)[1]', 'varchar(max)')
)
END
GO
Now we have created the table and the DDL trigger to capture the object changes, so it is time to test it.
Checking :
-- Creating , Altering and Destroying some of the various objects
-- Creating table event
CREATE TABLE AuditTest (AuditId int, Auditname char(10))
GO
-- Altering table event
ALTER TABLE AuditTest
ADD AuditDate datetime NULL
GO
-- Creating Indexes events
CREATE CLUSTERED INDEX ix_AuditId on AuditTest(AuditId)
GO
CREATE NONCLUSTERED INDEX ix_AuditDate on AuditTest(AuditDate)
GO
-- Dropping index events
DROP INDEX ix_AuditDate on AuditTest
GO
-- Creating view events
CREATE VIEW AuditView
AS
SELECT * FROM AUDITTEST
GO
-- Creating procedure events
CREATE PROCEDURE AuditProc
AS
BEGIN
SET NOCOUNT ON
SELECT * FROM AUDITVIEW
END
GO
-- Altering procedure events
ALTER PROCEDURE AuditProc
AS
BEGIN
SET NOCOUNT ON
SELECT * FROM AuditTest
END
GO
-- Dropping view and table events
DROP VIEW AuditView
GO
DROP TABLE AuditTest
GO
Now that the audit test is over, let's review the history in the audit table.
SELECT
TrackingId,
TrackingDate,
DatabaseName,
EventType,
ObjectName,
ObjectType,
LoginName,
--HostName,
'Paresh-PC' as HostName,
SqlCommand
FROM ObjectTracking
ORDER BY TrackingDate
GO
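If you need to remove the audit trigger afterwards, note that a database-level DDL trigger is dropped with the ON DATABASE clause. A minimal sketch:
-- Removing the database-level DDL trigger created above
DROP TRIGGER [AuditObjects] ON DATABASE
GO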
Conclusion : SQL Server 2008 introduced additional features for DDL auditing, but before SQL Server 2008 we can use this method.
Are you using this approach or something else? Leave your comments.
Labels: #sql, #sql Server, Audit, database, DDL, ms sql, ms sql server, new features, SQL, SQL new features, SQL Server General, t-sql, tsql
Friday, 27 January 2012
Alter failed for Login sa. Cannot set a credential for principal 'sa'. - Error encountered in SQL Server by serverku
Recently, while working with SQL Server security, I encountered an error while trying to modify the 'sa' account properties. The exception details look like the following:
Alter failed for Login sa. Cannot set a credential for principal 'sa'.
The cause of the error is that the "Map to Credential" option is checked in the "General" tab of the Login Properties page; uncheck it, as mentioned below, and the change will succeed.
Hope this helps you.
Friday, 20 January 2012
Application Role in SQL Server by serverku
In the last post we saw custom database roles: how to create them and assign the required access to users. We also noticed that we can add multiple members to the same role. That was security through database roles and members. Here we will study the Application Role, which provides security at the application level; no members come into the picture.
Application Role :
As per MSDN, an application role is a database principal that enables an application to run with its own, user-like permissions. You can use application roles to enable access to specific data only for those users who connect through a particular application.
Workaround:
We can implement an application role and put it into effect with the following steps. I will go through some examples, so let's create the required objects first and then set them up with the application role.
1. Create required objects
USE demo
GO
CREATE TABLE SampleTable1
(
Id int,
Name varchar(10)
)
GO
CREATE TABLE SampleTable2
(
Id int,
Name varchar(10)
)
GO
CREATE PROCEDURE SampleSP1
AS
BEGIN
SET NOCOUNT ON
SELECT * FROM SampleTable1
End
GO
CREATE PROCEDURE SampleSP2
AS
BEGIN
SET NOCOUNT ON
SELECT * FROM SampleTable2
End
GO
Here we have created the same required objects as we created in the earlier post for the database role.
2. Create an application role
USE demo
GO
-- sp_addapprole [@rolename = ] 'rolename', [@password = ] 'password'
EXEC sp_addapprole 'AppRole', 'AppPwd'
GO
3. Add permission to this application role
USE demo
GO
GRANT SELECT ON SampleTable1 to AppRole;
GRANT SELECT, INSERT,UPDATE ON SampleTable2 to AppRole;
GRANT EXEC ON SampleSP1 to AppRole;
GO
You can see we have assigned the same access to the application role as we did in the earlier post for the custom database role. The application role is now created; you can also create/view it by expanding Roles under the Security tab of the particular database, where you will find the Application Roles tab.
4. Connecting database and activating application role
Application roles are enabled/activated by the sp_setapprole system stored procedure, which requires a password. An application connects to SQL Server with the application role for the scope of that particular session, and the password is required to authenticate it. To activate an application role you would normally call it from .NET connection code or another application's database connection code. You can refer to the link here on using an application role in application code.
The sp_setapprole system stored procedure activates the application role for the specific connection while connecting from the application; the syntax is as follows.
USE demo
GO
-- sp_setapprole [@rolename = ] 'rolename', [@password = ] 'password'
EXEC sp_setapprole 'AppRole', 'AppPwd'
GO
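A minimal sketch of activating the role and then reverting the session in T-SQL, using the optional cookie parameters of the system procedure (the role name and password come from the example above; the flow itself is an illustration):
USE demo
GO
DECLARE @cookie varbinary(8000)
EXEC sp_setapprole 'AppRole', 'AppPwd', @fCreateCookie = true, @cookie = @cookie OUTPUT
SELECT USER_NAME() -- now reports AppRole
EXEC sp_unsetapprole @cookie -- reverts to the original security context
GO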
We can change the password for the application role with the following.
USE demo
GO
-- sp_approlepassword [@rolename = ] 'rolename', [@password = ] 'new password'
EXEC sp_approlepassword 'AppRole', 'AppChangedPwd'
GO
Hope all these steps are enough to implement an application role. Stay tuned for more.
Friday, 13 January 2012
Custom Database Role in SQL Server by serverku
Recently, while working with database security, I learned about database roles and how each role is used. Apart from the server-level roles, if we need to assign access/rights at the level of a particular database, we go through the database-level roles.
Following are the fixed database level roles as per MSDN,
db_owner : Members of the db_owner fixed database role can perform all configuration and maintenance activities on the database, and can also drop the database.
db_securityadmin : Members of the db_securityadmin fixed database role can modify role membership and manage permissions. Adding principals to this role could enable unintended privilege escalation.
db_accessadmin : Members of the db_accessadmin fixed database role can add or remove access to the database for Windows logins, Windows groups, and SQL Server logins.
db_backupoperator : Members of the db_backupoperator fixed database role can back up the database.
db_ddladmin : Members of the db_ddladmin fixed database role can run any Data Definition Language (DDL) command in a database.
db_datawriter : Members of the db_datawriter fixed database role can add, delete, or change data in all user tables.
db_datareader : Members of the db_datareader fixed database role can read all data from all user tables.
db_denydatawriter : Members of the db_denydatawriter fixed database role cannot add, modify, or delete any data in the user tables within a database.
db_denydatareader : Members of the db_denydatareader fixed database role cannot read any data in the user tables within a database.
You can see all the above fixed database roles in the image. Now we will see how to use the roles and bind them to users. Let's create a small demo with examples. First I am creating the required objects used for the demos, so let's do that.
USE demo
GO
CREATE TABLE SampleTable1
(
Id int,
Name varchar(10)
)
GO
CREATE TABLE SampleTable2
(
Id int,
Name varchar(10)
)
GO
CREATE PROCEDURE SampleSP1
AS
BEGIN
SET NOCOUNT ON
SELECT * FROM SampleTable1
End
GO
CREATE PROCEDURE SampleSP2
AS
BEGIN
SET NOCOUNT ON
SELECT * FROM SampleTable2
End
GO
Now I want to assign access to the user as follows:
1. The user can see the data in the SampleTable2 table.
2. The user can perform select/insert/update operations on the SampleTable1 table, but not delete.
3. The user can execute stored procedure SampleSP1.
4. The user cannot execute stored procedure SampleSP2.
Workaround 1:
If we assign fixed database roles such as db_datareader and db_datawriter to the user for a particular database, the user can perform all DML operations on all tables and views in that database. And if we grant EXECUTE to the user, the user can execute all the stored procedures. After granting all these permissions, we would then have to deny permissions on the tables and stored procedures that should not be accessible.
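A minimal sketch of that deny-based approach, assuming a database user named SupportUser like the one created in Workaround 2 below (shown for illustration only):
USE demo
GO
-- Explicitly deny the operations that the broad fixed roles would otherwise allow
DENY DELETE ON SampleTable1 TO SupportUser;
DENY EXECUTE ON SampleSP2 TO SupportUser;
GO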
Workaround 2:
Instead of doing all that, we will create a new custom database-level role and assign the required access to it for the objects.
USE demo
GO
CREATE LOGIN [SupportUser]
WITH PASSWORD=N'SupportUser',
DEFAULT_DATABASE=[master],
DEFAULT_LANGUAGE=[us_english],
CHECK_EXPIRATION=OFF,
CHECK_POLICY=OFF
GO
CREATE USER [SupportUser] FOR LOGIN [SupportUser]
GO
CREATE ROLE [SupportRole]
Go
GRANT SELECT ON SampleTable2 TO [SupportRole];
GRANT SELECT,INSERT,UPDATE ON SampleTable1 to [SupportRole];
GRANT EXEC ON SampleSP1 to [SupportRole]
GO
EXEC sp_addrolemember N'SupportRole', N'SupportUser'
GO
Let's connect to the SQL Server instance with this newly created user and check the access rights.
USE demo
GO
PRINT 'Inserting in SampleTable1'
GO
INSERT INTO SampleTable1
(
Id,
Name
)
SELECT
1,
'Sample1'
GO
PRINT 'Inserting in SampleTable2'
GO
INSERT INTO SampleTable2
(
Id,
Name
)
SELECT
1,
'Sample2'
GO
PRINT 'Deleting from SampleTable1'
GO
DELETE FROM SampleTable1
GO
PRINT 'Viewing from SampleTable1'
GO
SELECT * FROM SampleTable2
GO
SELECT * FROM SampleTable1
GO
PRINT 'Executing SampleSP1'
GO
EXEC SampleSP1
GO
PRINT 'Executing SampleSP2'
GO
EXEC SampleSP2
GO
You can see the access this user has by running the statements, as in the image below.
The main benefit of a custom database role is that the role can be assigned to multiple users. In the script below I assign the same role to a different user; once a role is created, it can be assigned to as many users as needed.
USE demo
GO
CREATE LOGIN [DBAUser]
WITH PASSWORD=N'DBAUser',
DEFAULT_DATABASE=[master],
DEFAULT_LANGUAGE=[us_english],
CHECK_EXPIRATION=OFF,
CHECK_POLICY=OFF
GO
CREATE USER [DBAUser] FOR LOGIN [DBAUser]
GO
EXEC sp_addrolemember N'SupportRole', N'DBAUser'
GO
Hope you like this post.
Labels: #sql, #sql Server, database, login, Maintenance, ms sql server, query, Role, Security, SQL, SQL Scripts, SQL Server, SQL Server General, user
Saturday, 7 January 2012
Changing Rows to Columns Using PIVOT - SQL Server by serverku
While working on one piece of logic, I got a chance to work with the PIVOT operation. Sometimes our custom logic requires row data as columns; we can use a temp table and populate aggregated data into it, but with PIVOT we can do this very easily. Let me prepare a small example and explain how we can use PIVOT to get row data as columns.
Before running the PIVOT script, we will create the database and table objects.
CREATE DATABASE DEMO
GO
USE DEMO
GO
-- Creating table for demo
IF (object_id('TblPivot','U') > 0)
DROP TABLE TblPivot
CREATE TABLE TblPivot
(
ItemCode int,
ItemName varchar(100),
ItemColour varchar(50)
)
GO
-- Inserting some sample records
INSERT INTO TblPivot
SELECT 1,'Samsung Mobile','Red'
UNION ALL
SELECT 2,'Nokia Mobile','Blue'
UNION ALL
SELECT 3,'Nokia Mobile','Green'
UNION ALL
SELECT 4,'Motorola Mobile','Red'
UNION ALL
SELECT 5,'Samsung Mobile','Green'
UNION ALL
SELECT 2,'Nokia Mobile','Blue'
UNION ALL
SELECT 1,'Samsung Mobile','Red'
UNION ALL
SELECT 2,'Nokia Mobile','Blue'
GO
Now we will check the original table data and the aggregated data using PIVOT. We will run both scripts for the same.
-- Getting table data
SELECT
ItemCode,
ItemName,
ItemColour
from TblPivot
GO
-- Getting aggregated data using PIVOT, converting rows to columns
SELECT
*
FROM
(
SELECT
ItemCode,
ItemName,
ItemColour
FROM TblPivot
) AS P
PIVOT
(
Count(ItemName) FOR ItemColour IN (Red, Blue, Green)
) AS pv
GO
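One common extension: when the colour values are not known in advance, the IN list can be built dynamically. A minimal sketch, assuming the TblPivot table from above (the variable names are illustrative):
-- Build the colour column list from the data, then run the PIVOT dynamically
DECLARE @cols nvarchar(max), @sql nvarchar(max)
SELECT @cols = STUFF((SELECT DISTINCT ',' + QUOTENAME(ItemColour) FROM TblPivot FOR XML PATH('')), 1, 1, '')
SET @sql = N'SELECT * FROM
(SELECT ItemCode, ItemName, ItemColour FROM TblPivot) AS P
PIVOT (COUNT(ItemName) FOR ItemColour IN (' + @cols + ')) AS pv'
EXEC sp_executesql @sql
GO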
You can review it here and see how PIVOT works. Please share your own experience with the PIVOT operation.
Labels: #sql, #sql Server, database, ms sql, ms sql server, pivot, SQL, SQL Scripts, SQL Server, SQL SERVER 2005, SQL Server 2008, SQL Server 2011, SQL Server Denali, SQL Server General, t-sql, tsql