Just when you thought the worst of 2020 was past

Today I got an e-mail from the PASS board indicating that the organization is dissolving.  It reads:

Dear Bennett,

 

PASS could not have anticipated the impact COVID-19 would have on the world, and on our organization. We are all reeling from what this year has been and done, and it is with heavy hearts that we must share yet one more bit of bad news from the annus horribilis that was 2020.


We are saddened to tell you that, due to the impact of COVID-19, PASS is ceasing all regular operations, effective January 15, 2021.


With the registration shortfall from PASS Virtual Summit, a lack of cashflow, and considering future obligations that PASS has on the books (i.e. 5 years’ worth of convention center and hotel agreements), PASS has no choice but to cease its operations and pursue dissolution. During our December 3rd, 2020 Board meeting, the PASS Board unanimously approved taking next steps in this unfortunate, but necessary, direction.


PASS has engaged experienced insolvency counsel and other professionals. All steps that PASS takes now are under the advice of our independent legal counsel as a part of the insolvency process. This includes communications, debt repayment, asset disbursement, conducting Board business, and all other actions.


We encourage you to take full advantage of any access that you have to PASS content between now and January 15, 2021. After that point, PASS servers will cease to function and members will no longer be able to access any PASS resources. So, in the meantime, you can watch sessions on our website, and download session recordings that you’ve purchased or that you have access to as a PASS Pro member. Please take full advantage of this exclusive content while you can.


Thank you for your support and engagement over the years. We wish you all the best.


The PASS Board

 

Posted in Uncategorized | Comments Off on Just when you thought the worst of 2020 was past

PowerShell String Reverse Function

Strangely, PowerShell does not include a string reverse function.  After some googling, I wrote a simple reverse function based on regular expressions:

function Reverse-String ($inputstring)
{
    # match every character right to left, then join the matches back into a single string
    -join [regex]::Matches($inputstring, '.', 'RightToLeft')
}
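A quick usage check:

Reverse-String "PowerShell"   # returns "llehSrewoP"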

Posted in Powershell | Comments Off on PowerShell String Reverse Function

Photography Hobby Stuff

A few weeks ago I bought a Pentax Spotmatic at a thrift store. It ran around 15 bucks.  I bought it because it appeared to have a nearly pristine Super Takumar 50mm F1.4 lens.  Post-purchase and on closer inspection, I found that both the lens and camera body are nearly pristine, and the lens is optically and functionally perfect.

I’m currently running a roll of film through the camera before I send it off to Eric Hendrickson for CLA and possible repairs.

The Super Takumar 50/1.4 is radioactive and has yellowed over the years.  Other hobbyists have reported restoring the lens with UV exposure, so after some research I bought a $20 fluorescent UV glue-curing lamp off eBay.  After a few days of treatment there wasn’t much reduction in yellowing, so I did a little more research.  Many users have had good luck with an LED desk lamp.  I have plenty of LED MR-11 lamps, so I set up the jig shown below.  After about two days the yellow was gone and the brightness was approximately 90% restored.

Posted in Personal, Uncategorized | Comments Off on Photography Hobby Stuff

Synchronizing Logins, Jobs, and Dependent Objects between Replicas with PowerShell

Here’s a simple PowerShell Script that I developed to synchronize logins, credentials, proxies, operators, and jobs between AlwaysOn replicas using PowerShell, dbatools, and a little bit of T-SQL.

 

#Powershell Script to copy logins, credentials, agent proxies, agent operators, and agent jobs between replica servers
#Author: Bennett Scharf
 
 
$server = $env:COMPUTERNAME
 
#debug variable assignment - in case we are not running the script on a replica server
$server = "node4"
 
$Commands = @()
 
$cmdlets = @("Copy-DbaLogin", "Copy-DbaCredential", "Copy-DbaAgentProxyAccount", "Copy-DbaAgentOperator", "Copy-DbaAgentJob" )
 
foreach ($cmdlet in $cmdlets)
#note: we're essentially using SQL and the system view sys.availability_replicas to generate the dbatools commands
{
    $query = "select distinct '$cmdlet -source ' + t1.replica_server_name + ' -destination ' +  t2.replica_server_name + '  -ErrorVariable oops'
    FROM sys.availability_replicas t1
    CROSS JOIN sys.availability_replicas t2
    WHERE t1.replica_server_name <>  t2.replica_server_name
    order by 1
    "
    $Commands += Invoke-Sqlcmd -ServerInstance $server -Database 'master' -Query $query | Select-Object -ExpandProperty Column1
}
 
foreach ($command in $Commands)
{
 
    try
    {
        $command
        #comment out next line to only view command strings
        Invoke-Expression $command 
    }
 
    Catch
    {
        $to = "DBA Bob <bob@denvergov.org>"
        $from = "DBAMailAccount <alerts@denvergov.org>"
        $message = "An error was encountered while executing command: $command  Please check SQL Agent job log. `n Error: $oops"
        $subject = $env:COMPUTERNAME + ': Error replicating objects'
        Send-MailMessage -To $to -From $from -Subject $subject  -Body $message -SmtpServer "somerelay.domain.org" 
    }
}
Posted in AlwaysOn, Powershell | Comments Off on Synchronizing Logins, Jobs, and Dependent Objects between Replicas with PowerShell

High Availability Lab: Adding dual Ethernet NIC yields big improvements

I have a lab that includes:

  1. Two standalone SQL Server 2014 instances
  2. One standalone SQL Server 2017 instance running on Linux
  3. Three 2016 instances that are part of an availability group
  4. A 2016 failover cluster instance with two nodes
  5. A domain controller
  6. A file server that also hosts an iSCSI target
  7. A WAN emulator to simulate network latency and congestion

Items two through six are hosted on a Hyper-V server, while everything else is on standalone hardware.  Item seven is useful for testing network connections between geographically isolated data centers.

Until today I was having issues finding a way to realistically simulate network outages between availability group nodes, since they reside within the same Hyper-V host.  Finally today something in my dim little brain clicked.  All I had to do was bind each of the nodes in my availability group to its own physical network interface.

To accomplish this I found a couple of spare dual-Ethernet NICs, installed them in the Hyper-V host, and connected each port to the (physical) desktop switch. From there I booted the server, went into Virtual Switch Manager, and created three virtual switches, one for each port.  After that I selected two guest OS instances for modification.  On each instance I removed the existing network adapters and created a new network adapter tied to one of the new virtual switches.  Here is a simplified drawing of the layout:

With this new configuration, creating a network outage is as simple as disconnecting a network cable.  With two cables disconnected on a majority-node, three-node cluster, I was able to lose quorum (as expected) and watch the availability group role shut down as expected.  Restoring network connectivity produced the expected behavior.  Alternatively, I was able to use the failed-quorum scenario to perform a forced quorum recovery.

Posted in AlwaysOn, SQL Server | Tagged , , | Comments Off on High Availability Lab: Adding dual Ethernet NIC yields big improvements

WAN Emulator for Availability Group Testing

I had an unmet need to be able to test my AlwaysOn Availability Groups under less than ideal network conditions.  More specifically I needed a way to add network latency in order to create more realistic monitoring and troubleshooting scenarios.

After some searching, I found the WANem project on SourceForge.  WANem comes in two flavors: a live CD based on Knoppix and a virtual appliance to run under VMWare.  For my project I chose the live CD and ran it on an old laptop.  One very nice thing about WANem is that you only need a single network adapter on your test machine.

The documentation is quite adequate to get started; however, I recommend using this guide to set up your testing environment.  For my setup I added static routes so that the route between each server used the IP address of the WANem machine as its gateway.
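On Windows, such a static route might look like this (the subnet and WANem address are illustrative):

rem hedged example: send traffic for the remote 192.168.2.0/24 subnet through the WANem box
route -p add 192.168.2.0 mask 255.255.255.0 192.168.1.50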

With the WAN emulator inserted into my test lab, my Availability Group lab setup logically looks like this:

Dialing in a network latency of 30 milliseconds easily causes the asynchronous secondary to fall behind.  One can easily see the send queue values ramp up.

 

Posted in Open Source, SQL Server | Tagged , , , , | Comments Off on WAN Emulator for Availability Group Testing

How to quickly and easily simulate a low disk space scenario

It turns out that the built-in fsutil OS utility can quickly create an empty file of any size.  For example, fsutil file createnew test.txt 1300000000 will create a file 1,300,000,000 bytes (roughly 1.3 GB) in size.
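To size larger files, just multiply out the bytes; a hedged example that creates a 10 GB filler file (the path is illustrative):

rem 10 GB = 10 * 1024^3 = 10,737,418,240 bytes
fsutil file createnew D:\filler_10GB.tmp 10737418240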

For my scenario I needed to create a low disk space situation on an Always On Availability Group secondary replica.  I created a couple of files that reduced the free space to a few hundred megabytes and then ran a simple query on the primary to do a bunch of large row inserts.  Once the disk space was exhausted and the data file could no longer autogrow, AlwaysOn suspended data movement as expected.  Several alerts were raised including:

C:\Program Files\Microsoft SQL Server\MSSQL13.MSSQLSERVER\MSSQL\DATA\CreditLog.ldf: Operating system error 112(There is not enough space on the disk.) encountered.

I corrected the low disk space scenario by deleting the newly-created file.  I then manually resumed data movement.

 

Posted in Performance, SQL Server | Tagged , , | Comments Off on How to quickly and easily simulate a low disk space scenario

Azure Cert

Today I finished and passed the Microsoft/edX DAT219x: Provisioning SQL Server and Azure SQL Databases certification course.  It covered provisioning and configuring Azure SQL databases, i.e., PaaS, as well as building SQL Server instances on both Azure Windows and Linux virtual machines, in other words, IaaS.

After that I went on to provision an Azure BLOB store and use it to transfer databases from my on-premises lab to the Azure servers.  Whee!

Posted in Uncategorized | Tagged , , | Comments Off on Azure Cert

SQL Server 2017 on Linux POC

Fun, new stuff today. I installed an instance of Ubuntu on my Oracle VirtualBox hypervisor and then installed SQL Server 2017 using the directions found here. Once installation was complete I used systemctl status mssql-server to validate that the services were running and then connected to the database engine using SSMS on my Windows desktop.  Two VirtualBox settings of note:

  • The network adapter was set to the Bridged Adapter – the actual physical adapter in my lab PC.
  • The minimum memory required to install SQL Server is 3.5 GB.

For my home lab, I really like being able to install SQL Server 2017 on a Linux machine.  It eliminates having to consume a Windows Server license activation.

Update 6/25/2017: Today I provisioned an instance of Ubuntu server on Azure and went on to install SQL Server 2017 on that instance. One key difference with Azure is that you need to add an inbound security rule to the network interface on port 1433 in order to remotely manage the SQL Server instance.
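With the Azure CLI, such a rule might be added like this (resource names are illustrative; in practice you would also restrict the source address range):

az network nsg rule create --resource-group myRG --nsg-name myVmNsg \
    --name AllowSqlInbound --priority 1010 --direction Inbound --access Allow \
    --protocol Tcp --destination-port-ranges 1433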

 

Posted in Uncategorized | Comments Off on SQL Server 2017 on Linux POC

AlwaysOn Availability Group Aware SQL Agent Job Step

The SQL Agent job step that I present here is intended to make a job execute only if the node currently owns the primary replica of a specified availability group. The T-SQL for this job step leverages a function that was published in SQL Server Magazine and can be found here. The function, fn_hadr_group_is_primary, takes the name of the availability group as a single parameter and returns true if the primary replica is currently housed on the node, false otherwise. I strongly recommend reading the SQLMag article as well as the articles in the larger series.

My job step calls the function to determine the status of the node. If the function returns true, the job continues to the next step. If the function returns false, the step calls msdb..sp_stop_job and the job terminates. The advantage to this approach is simplicity. You can simply insert this as the first job step into an existing SQL Agent job. No further modifications are necessary. Here is my code:

DECLARE @jobid uniqueidentifier;

IF master.dbo.fn_hadr_group_is_primary('ophelia') = 1
          PRINT 'This server houses the primary replica, so continue to next step';
ELSE
BEGIN
	-- we're not on the primary - exit gracefully:
	PRINT 'This server does not house the primary replica, so terminate job';
	SET @jobid = CONVERT(uniqueidentifier, $(ESCAPE_NONE(JOBID)));
	EXEC msdb..sp_stop_job @job_id = @jobid;
END
Posted in SQL Server, T-SQL, Uncategorized | Tagged | Comments Off on AlwaysOn Availability Group Aware SQL Agent Job Step

24 Hours of PASS Spring 2017

The Spring 2017 24HOP focused on data security, a subject near and dear to me. One of my favorite presentations was Bob Pusateri’s Passive Security for Hostile Environments. He showed what measures you can take when users who shouldn’t have sysadmin rights have those rights nonetheless.

I’ve faced similar situations and taken similar measures in the past. Bob presents the material in a way that is systematic and easily digestible, and it is definitely worth watching. You can find the slide deck and demo scripts on Bob’s blog.

Posted in Uncategorized | Comments Off on 24 Hours of PASS Spring 2017

Controlling Large Deletions and Transaction Log Growth

Deleting from very large tables can cause uncontrolled transaction log growth.  The way around this is to perform an iterative TOP(n) delete where you limit the number of rows to be deleted in each batch.  I got this idea from an article by Hugo Kornelis in the first volume of SQL Server MVP Deep Dives.  It’s titled Set-based iteration, the third alternative.  It’s a worthwhile read.

A basic iterative deletion script might look something like this. We set a batch-size variable, @BatchSize, equal to the number of rows to delete per iteration.  As long as the number of rows deleted equals the batch size, the WHILE loop continues; otherwise it terminates:

DECLARE @BatchSize int, @RowCnt int;
SET @BatchSize = 10000;
SET @RowCnt = @BatchSize;
WHILE @RowCnt = @BatchSize
BEGIN;
DELETE TOP (@BatchSize)
FROM   dbo.sometable
WHERE  somecolumn < 2395;
SET @RowCnt = @@ROWCOUNT;
END;

The script shown above isn’t really suitable for production.  Below is a more production-ready script.  I’ve incorporated similar scripts into scheduled SQL Agent jobs to maintain a specific data retention target.  For example, seven years.  The script has a couple of noteworthy features.  First, it incorporates a WAITFOR DELAY into the loop.  This gives other processes a certain amount of time to use the physical resources of the computer.  It also gives processes a chance to access the table you are deleting.  Second, it incorporates a cutoff time of day.  After that time, the loop will terminate.  For example, you might use SQL Agent to schedule this process every Sunday at 4:00 PM and set the cutoff time to 8:00 PM, because 8:00 PM is when other maintenance processes start up.  Note that you will have to tune the @BatchSize and WAITFOR DELAY parameters to suit your particular situation.

--controlled deletion prevents long-term locks & blowing out transaction log
 
DECLARE @BatchSize INT ,
    @sPurgeMessage VARCHAR(64) ,
    @RowCnt INT ,
    @StartRowCount BIGINT ,
    @EndRowCount BIGINT ,
    @sMessage VARCHAR(512) ,
	@dStart DATETIME, 
    @StopTime DATETIME ,
	@cutoffDate DATETIME;
 
	--Set the stopping time for 16:00 on the current day
        SELECT  @StopTime = DATEADD(HOUR, 16,
                                    CAST(FLOOR(CAST(GETDATE() AS FLOAT)) AS DATETIME))
 
	-- @cutoffDate is the date value below which rows are purged
        SELECT  @cutoffDate = DATEADD(DAY, -2555, GETDATE())
 
 
	--informational - get & store starting row count
	--BTW, this method of counting rows is considered to be optimal in terms of speed
        SELECT  @StartRowCount = SUM(row_count)
        FROM    sys.dm_db_partition_stats
        WHERE   object_id = OBJECT_ID('dbo.SomeGynormousTable')
                AND ( index_id = 0
                      OR index_id = 1
                    );
 
        SET NOCOUNT ON;
	--set number of deletions per batch (tune for your workload)
        SET @BatchSize = 10000;
 
        SET @RowCnt = @BatchSize
 
        WHILE ( @StopTime > GETDATE() )
            AND @RowCnt > 0
            BEGIN
                SELECT  @dStart = GETDATE()
                BEGIN TRAN
                DELETE TOP ( @BatchSize )
                FROM    dbo.SomeGynormousTable
                WHERE   [Date] < @cutoffDate;
                SET @RowCnt = @@ROWCOUNT
                COMMIT;
                SELECT  @sPurgeMessage = 'Purged '
                        + CAST(@RowCnt AS VARCHAR(10)) + ' rows in '
                        + CAST(DATEDIFF(SECOND, @dStart, GETDATE()) AS VARCHAR(6))
                        + ' seconds.'
                RAISERROR (@sPurgeMessage, 1, 1) WITH NOWAIT  
                WAITFOR DELAY '00:00:05' -- give other spids a chance to work
            END;
 
		--informational - get & store ending row count
        SELECT  @EndRowCount = SUM(row_count)
        FROM    sys.dm_db_partition_stats
        WHERE   object_id = OBJECT_ID('dbo.SomeGynormousTable')
                AND ( index_id = 0
                      OR index_id = 1
                    );
	-- optionally send out a completion notice
        SET @sMessage = CAST(( @StartRowCount - @EndRowCount ) AS VARCHAR(32))
            + ' rows were deleted'
 
        EXEC msdb.dbo.sp_send_dbmail @profile_name = 'someprofile',
            @recipients = 'somedba@gmail.com', @body = @sMessage,
            @subject = 'purge of dbo.SomeGynormousTable done';
Posted in Uncategorized | Comments Off on Controlling Large Deletions and Transaction Log Growth

Collecting Database File Stats with PowerShell

Here is a fairly simple PowerShell script that can be used to collect database file statistics across multiple servers. My version of the script is incorporated into a scheduled SQL Agent job that runs daily. It is very useful for trend analysis.

You will need two tables to support this script. The first table is simply a list of servers that you wish to monitor. There is a name column and an active flag column. You can use the active flag column to temporarily or permanently exclude a server from monitoring. The DDL looks like this:

CREATE TABLE [dbo].[ServerList](
	[ServerName] [nvarchar](128) NOT NULL,
	[IsActive] [bit] NOT NULL
) ON [Data]

The second table stores the captured data. Its DDL looks like this:

CREATE TABLE [dbo].[DatabaseFileStats](
	[ID] [int] IDENTITY(1,1) NOT NULL,
	[TIME_TAKEN] [smalldatetime] NULL,
	[DATABASE] [nvarchar](128) NULL,
	[FILEID] [smallint] NULL,
	[FILE_SIZE_MB] [decimal](12, 2) NULL,
	[SPACE_USED_MB] [decimal](12, 2) NULL,
	[FREE_SPACE_MB] [decimal](12, 2) NULL,
	[GROWTH_MB] [decimal](12, 2) NULL,
	[DISK_VOLUME] [nvarchar](2) NULL,
	[NAME] [sysname] NOT NULL,
	[FILENAME] [nvarchar](260) NOT NULL,
	[FILEGROUP] [sysname] NULL,
	[FG_File_Size_GB] [decimal](38, 2) NULL,
	[FG_SPACE_USED_GB] [decimal](38, 2) NULL
)

…note that you can throw a clustered index on the identity column if you want.

The PowerShell is quite straightforward. We grab the list of servers and then iterate through that list in an outer loop. For each server in the outer loop, we grab the list of online databases from master.sys.databases and iterate through it. One thing that is missing from the script is a routine to ping each server before trying to hit it; a sketch of that check appears below, and I’ll fold it in as time permits.
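A minimal reachability check, assuming the classic Test-Connection cmdlet, could sit at the top of the outer loop:

# hedged sketch: skip servers that do not answer a single ping
if (-not (Test-Connection -ComputerName $Server.ServerName -Count 1 -Quiet)) {
    Write-Warning "$($Server.ServerName) is unreachable - skipping"
    continue
}

The collection script itself: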

#File: Get Database File Stats
Import-Module SqlServer
 
$Servers = invoke-sqlcmd -ServerInstance $env:computername -Database "DBA" -Query "SELECT  ServerName FROM dbo.ServerList where IsActive = 1 order by ServerName"
write-host $Servers.ServerName
 
foreach ($Server in $Servers)
{
    $Databases = invoke-sqlcmd -ServerInstance $Server.ServerName -Database "master" -Query "SELECT name FROM sys.databases where (database_id > 4 or database_id = 2) and state_desc = 'online' order by name"
 
    foreach ($database in $Databases)
    {
            invoke-sqlcmd -ServerInstance $Server.ServerName -Database $database.name -Query "INSERT INTO [dba].[dbo].[DatabaseFileStats]
               ([TIME_TAKEN]
               ,[DATABASE]
               ,[FILEID]
               ,[FILE_SIZE_MB]
               ,[SPACE_USED_MB]
               ,[FREE_SPACE_MB]
               ,[GROWTH_MB]
               ,[DISK_VOLUME]
               ,[NAME]
               ,[FILENAME]
               ,[FILEGROUP]
               ,[FG_File_Size_GB]
               ,[FG_SPACE_USED_GB])
 
    SELECT
	    CAST(GETDATE() AS SMALLDATETIME) AS TIME_TAKEN,
	    DB_NAME() AS [DATABASE],
	    a.FILEID,
	    [FILE_SIZE_MB] = 
		    CONVERT(DECIMAL(12,2),ROUND(a.size/128.000,2)),
	    [SPACE_USED_MB] =
		    CONVERT(DECIMAL(12,2),ROUND(FILEPROPERTY(a.name,'SpaceUsed')/128.000,2)),
	    [FREE_SPACE_MB] =
		    CONVERT(DECIMAL(12,2),ROUND((a.size-FILEPROPERTY(a.name,'SpaceUsed'))/128.000,2)) ,
	    [GROWTH_MB] =
		    CONVERT(DECIMAL(12,2),ROUND(a.growth/128.000,2)),
	    [DISK_VOLUME] = LEFT(A.FILENAME, 2),
	    a.NAME,
	    a.FILENAME, 
	    fg.name AS FILEGROUP,
	    [FG_File_Size_GB] = 
	    SUM(CONVERT(DECIMAL(12,2),ROUND(a.size/131072.0,2))) OVER(PARTITION BY fg.name),
	    [FG_SPACE_USED_GB] =
	    SUM(CONVERT(DECIMAL(12,2),ROUND(FILEPROPERTY(a.name,'SpaceUsed')/131072.0,2))) OVER(PARTITION BY fg.name)
    FROM
	    sys.sysfiles a
	    LEFT OUTER JOIN sys.filegroups fg
	    ON fg.data_space_id = a.groupid
    ORDER BY fg.name --LEFT(A.FILENAME, 2) , SIZE DESC" -QueryTimeout 3600
    }
}
Posted in Powershell, SQL Server | Comments Off on Collecting Database File Stats with PowerShell

Leveraging SP_Whoisactive for Monitoring and Troubleshooting BI/DW Workloads

Wait stats analysis is often my tool of first resort for triaging performance problems. Adam Machanic’s SP_Whoisactive utility is my tool of choice for getting insight into what is happening on a server.

Data Warehouse ETL processes often occur overnight. Since I do not have the stamina or desire to stay up all night every night to get insight into what happens during overnight loads, I run a SQL Agent job every five minutes to persist snapshots from SP_Whoisactive into a table. That way I can review the table later in the day.
SP_Whoisactive has a parameter, @destination_table, that will redirect the output to a table. There is also a parameter to generate the DDL to create the destination table. See Adam’s blog post for more details. As an example, I want my output to include lock, task, and transaction information. A query that will generate the DDL for the output table looks like this:

DECLARE @s VARCHAR(MAX)
EXEC sp_WhoIsActive @get_locks = 1
	,@get_task_info = 2
	,@get_transaction_info = 1
	,@return_schema = 1
	,@schema = @s OUTPUT
SELECT @s

I can take the DDL created from the previous example, specify a table name, and run the query. For example:

CREATE TABLE dbo.WhoisactiveTrace (
	[dd hh:mm:ss.mss] VARCHAR(8000) NULL
	,[session_id] SMALLINT NOT NULL
	,[sql_text] XML NULL
	,[login_name] NVARCHAR(128) NOT NULL
	,[wait_info] NVARCHAR(4000) NULL
	,[tasks] VARCHAR(30) NULL
	,[tran_log_writes] NVARCHAR(4000) NULL
	,[CPU] VARCHAR(30) NULL
	,[tempdb_allocations] VARCHAR(30) NULL
	,[tempdb_current] VARCHAR(30) NULL
	,[blocking_session_id] SMALLINT NULL
	,[reads] VARCHAR(30) NULL
	,[writes] VARCHAR(30) NULL
	,[context_switches] VARCHAR(30) NULL
	,[physical_io] VARCHAR(30) NULL
	,[physical_reads] VARCHAR(30) NULL
	,[locks] XML NULL
	,[used_memory] VARCHAR(30) NULL
	,[status] VARCHAR(30) NOT NULL
	,[tran_start_time] DATETIME NULL
	,[open_tran_count] VARCHAR(30) NULL
	,[percent_complete] VARCHAR(30) NULL
	,[host_name] NVARCHAR(128) NULL
	,[database_name] NVARCHAR(128) NULL
	,[program_name] NVARCHAR(128) NULL
	,[start_time] DATETIME NOT NULL
	,[login_time] DATETIME NULL
	,[request_id] INT NULL
	,[collection_time] DATETIME NOT NULL
	)

The companion query to output to the newly-created table looks like this:

EXEC sp_WhoIsActive 
	@get_locks = 1,
	@get_task_info = 2,
	@get_transaction_info = 1,
	@destination_table = 'dbo.WhoisactiveTrace'

For my part, the destination table lives in a DBA repository database, and the previous query runs in a SQL Agent job step scheduled approximately every five minutes.  Note that in production there is a second step to clean up old log entries.
I found this to be a useful aid in troubleshooting BI data warehouse loads where very large tables are loaded on a nightly basis.
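The cleanup step isn’t shown here; a minimal sketch, assuming a two-week retention window, could be as simple as:

-- hedged sketch: purge snapshot rows older than 14 days (retention period is an assumption)
DELETE FROM dbo.WhoisactiveTrace
WHERE collection_time < DATEADD(DAY, -14, GETDATE());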

Posted in SQL Server, T-SQL | Comments Off on Leveraging SP_Whoisactive for Monitoring and Troubleshooting BI/DW Workloads

Missing Index Stats Enhancements

Here is my script to find missing indexes using the missing index DMVs. It’s pretty much like others with the exception that I first get the instance uptime and then use that to calculate the number of seeks and scans per day. That way I can make my index decisions both in terms of impact and frequency of use.

DECLARE @uptime INT;  -- days since tempdb was created, i.e., since the instance started
SELECT  @uptime = DATEDIFF(DAY, create_date, GETDATE())
FROM    sys.databases
WHERE   name = 'tempdb';
 
 
SELECT  @@SERVERNAME AS [server],
		@uptime AS [uptime_days] ,
		DB_NAME() AS [DATABASE],
		SCHEMA_NAME(schema_id) AS [schema],
        sys.objects.name AS [TABLE] ,
        user_seeks ,
        user_scans ,
        ( user_seeks + user_scans ) / @uptime AS [seeks&scans/DAY] ,
        avg_total_user_cost ,
        ( avg_total_user_cost * avg_user_impact ) * ( user_seeks + user_scans ) AS Impact ,
        'CREATE NONCLUSTERED INDEX  ix_' + sys.objects.name + '_ ON '
        + SCHEMA_NAME(schema_id) + '.' + sys.objects.name COLLATE DATABASE_DEFAULT + ' ( '
        + ISNULL(mid.equality_columns, '')
        + CASE WHEN mid.inequality_columns IS NULL THEN ''
               ELSE CASE WHEN mid.equality_columns IS NULL THEN ''
                         ELSE ','
                    END + mid.inequality_columns
          END + ' ) ' + CASE WHEN mid.included_columns IS NULL THEN ''
                             ELSE 'INCLUDE (' + mid.included_columns + ')'
                        END
        + ' WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 95) on [PRIMARY];' AS CreateIndexStatement ,
        mid.equality_columns ,
        mid.inequality_columns ,
        mid.included_columns
FROM    sys.dm_db_missing_index_group_stats AS migs
        INNER JOIN sys.dm_db_missing_index_groups AS mig ON migs.group_handle = mig.index_group_handle
        INNER JOIN sys.dm_db_missing_index_details AS mid ON mig.index_handle = mid.index_handle
                                                             AND mid.database_id = DB_ID()
        INNER JOIN sys.objects WITH ( NOLOCK )  ON mid.object_id = sys.objects.object_id
WHERE   ( migs.group_handle IN (
          SELECT TOP ( 500 )
                    group_handle
          FROM      sys.dm_db_missing_index_group_stats WITH ( NOLOCK )
          ORDER BY  ( avg_total_user_cost * avg_user_impact ) * ( user_seeks
                                                              + user_scans ) DESC ) )
        AND OBJECTPROPERTY(sys.objects.object_id, 'isusertable') = 1
ORDER BY 8 DESC;
Posted in Performance, SQL, SQL Server | Comments Off on Missing Index Stats Enhancements

Columnstore Index Performance Improvements

We have a dimensional data mart running on SQL Server 2012.  The larger fact tables vary between 7 and 17 billion rows.  No fancy data types, no LOB data types, and all the join keys are ints.  There were already well-tuned conventional covering indexes on the fact tables. Clustered indexes and column statistics are kept up to date. Non-clustered indexes are disabled and rebuilt with each daily data load.  Queries always have some sort of date range in the WHERE clause.  In implementing columnstore indexes, we experienced an average performance improvement of around 7:1.  We were very happy, considering the level of tuning already in place.

We did find one query where the performance actually worsened when using a columnstore index.  I dug into the query and found that it utilized a view that was doing aggregations.  It’s the classic anti-pattern of having aggregates in an intermediate result set without using a temp table: there’s a later join against those aggregates, and the query optimizer cannot produce a good cardinality estimate.  The view met a legitimate business requirement, so we created a regular table that materialized the view and added some code to load the new table as part of the nightly ETL process.  After that we pointed the query to the new table and tested conventional rowstore covering and columnstore indexes.  Again, about a 7:1 performance improvement between the two.

Posted in Uncategorized | Comments Off on Columnstore Index Performance Improvements

Partition-Level Lock Escalation

I ran into a situation at work where partition lock escalation solved a particular problem.  A table with about 300 million rows has a primary key that contains something called a ProcID, or Process ID. The process ID is tied to a monthly analytic unit of work for a specific client – more specifically, it identifies a client’s loan portfolio reported on a monthly basis. Our clients’ portfolios vary greatly in size, from roughly a hundred to several million loans. (Before I go any further, let me explain that this is part of a third-party application. I have no influence over the data model and will not discuss its shortcomings here.)

The clustered index includes the ProcID.  Because analysts work on one portfolio or ProcID at a time, clustering on ProcID improves concurrency – except on the very largest client where locks get escalated up to the table level. If an analyst was working on a portfolio for our largest client, other analysts were blocked from their tasks due to table-level lock escalation.

To fix this problem I created a partition function that put our mega-client in one partition and everyone else in the other partition.  After that I created an appropriate partition scheme and then enabled partition lock escalation. (See Paul Randal’s excellent article here.)  With partition lock escalation in place, concurrency has improved dramatically. If an analyst is working on our largest client’s portfolio, other analysts can now work on their portfolios with no blocking. Because of our analysts’ usage patterns, deadlocks are not a possibility.
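The post doesn’t include the DDL, but the shape of the change looks roughly like this (table name, column, and boundary value are illustrative; the clustered index must also be rebuilt on the partition scheme):

-- hedged sketch: isolate the mega-client in its own partition
CREATE PARTITION FUNCTION pfProcID (int)
    AS RANGE LEFT FOR VALUES (499999);   -- assumed boundary: mega-client ProcIDs fall at or below it

CREATE PARTITION SCHEME psProcID
    AS PARTITION pfProcID ALL TO ([PRIMARY]);

-- allow lock escalation to stop at the partition instead of the table
ALTER TABLE dbo.LoanDetail SET (LOCK_ESCALATION = AUTO);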

Posted in SQL Server | Comments Off on Partition-Level Lock Escalation

Troubleshooting beyond the database

Users were seeing transient performance slowdowns every day between 15:00 and 16:00.  The slowdowns would typically last 2-3 minutes and occur one or more times during this window.  After looking at things on the SQL Server and database side, it was clear the problem lay elsewhere, almost certainly network congestion, but I needed documentation.

The first thing I did was use the SysInternals PSPing utility and PowerShell to better define the scope of the problem.  PSPing has some cool features, including the ability to ping over specific TCP ports, a corresponding server mode, and an optional histogram display.  I won’t bore you with the details here; go read about it if you’re interested.  PowerShell was used to iteratively run PSPing and redirect the output to a file, with a timestamp written on each iteration.  PSPing and PowerShell clearly showed the dimension of the problem as well as the times it occurred.  The output showed multiple latencies in excess of 3,000 ms, and the periods of high latency typically lasted between 1 and 10 seconds.
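The post doesn’t show the loop itself; a minimal sketch, assuming psping.exe is on the path and testing the SQL Server TCP port (server name, port, and paths are illustrative):

# hedged sketch: timestamped PSPing runs appended to a log file
while ($true) {
    Get-Date -Format 'yyyy-MM-dd HH:mm:ss' | Out-File C:\Temp\psping.log -Append
    psping.exe -n 20 dbserver01:1433 | Out-File C:\Temp\psping.log -Append
    Start-Sleep -Seconds 60
}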

I also used tracetcp and tracert to further isolate the problem.  Both utilities show latency by router hop.  High latency values started at the first MPLS router hop.

I needed to figure out what was driving the congestion, so I downloaded and installed a 30-day demo of a NetFlow analyzer.  I configured the core router and switches to export NetFlow data to the PC running the NetFlow analyzer software.  From there it was pretty easy to drill down and find the source of the problem.  The image below shows what was happening: SMTP traffic was maxing out a 20 Mbit MPLS circuit.  After querying a few users, I found the cause: a daily client mass-mailing.  Pretty simple, really.

NetFlow capture

Posted in Performance | Comments Off on Troubleshooting beyond the database

Powershell file mover

This is a pretty simple script I wrote a while back. It moves files between two paths; for example, to sweep files from a DMZ server. It can optionally rename files on the target system by appending a date code, which avoids name collisions. Pay close attention to $FileAge, which sets a lower limit on the age of files to copy, and $RenameDelay, which sets an upper limit on the age of files to rename in the destination path. Change Copy-Item to Move-Item before going to production.

set-StrictMode -version 2
 
$FileAge = 30 #age of files in minutes
$RenameDelay = 5 #delay value for renaming files
$Source = "\\server1\path1\" #source folder
$Destination ="\\server2\path2\" #destination folder
 
 
$Past = (Get-Date).AddMinutes($FileAge * -1) 
$Past2 = (Get-Date).AddMinutes($RenameDelay * -1)
 
#first pass - move files from source to destination
$List = dir $Source -recurse | Where {$_.LastWriteTime -lt $Past -and $_.name -notlike "someString*" -and $_.name -notlike "SomeOtherString*"}
if ($list -ne $null){
	write-host $list.count "items found to move"
	foreach($file in $list){
		#$file.Fullname
		If ($file.PSIsContainer -eq $false) {
			$MoveDirectory = (join-path -path $Destination -childpath $file.DirectoryName.SubString(14))
			new-item -path $MoveDirectory -type directory -ErrorAction:SilentlyContinue #create destination folder if it doesn't already exist
		}	
		#note: the SubString offsets strip the $Source prefix - adjust them if the source path length changes
		$MovePath = (join-path -path $Destination -childpath $file.FullName.SubString(15))
		copy-item -Path $file.FullName -destination $MovePath -force #move file to destination folder
	}
}
else {write-host "no matching items found to move"}
 
#second pass - rename files in destination
$List = dir $destination -recurse | Where {$_.CreationTime -gt $Past2 -and $_.PSIsContainer -eq $false -and $_.name -notlike "SomeString*" -and $_.name -notlike "SomeOtherString*" }
if ($list -ne $null){
	write-host $list.count "items found to rename"
	foreach($file in $list){
		$file = ( rename-item -force -Path $file.Fullname -NewName ( $file.basename + (get-date -format '_yyyy-MM-dd_hhmm') + $file.extension))
	}
}
else {write-host "no matching items found to rename"}
 
set-StrictMode -Off
Posted in Powershell, Windows OS | 1 Comment

Way too Many Virtual Log Files

Holy cow! Today I encountered a database with nearly 10,000 VLFs! I should have taken a screenshot. I used Kimberly Tripp’s guidance to get things under control.
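For reference (not from the original post), you can count a database’s VLFs with the undocumented DBCC LOGINFO command; each row returned represents one VLF:

USE MyDatabase;   -- illustrative database name
DBCC LOGINFO;     -- number of rows returned = number of VLFs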

Posted in Uncategorized | Comments Off on Way too Many Virtual Log Files

Deduplicating files with LogParser and SQL Server

Perhaps this should be categorized as a stupid people trick. On the other hand, it’s a good, cheap way to deduplicate files assuming you’re running SQL Server and LogParser. This post assumes at least a basic knowledge of T-SQL and LogParser command line syntax.

Basically, you use LogParser to create a two-column table containing file paths and MD5 hashes like this:

LogParser "SELECT path, HASHMD5_FILE(Path) AS MD5 INTO T_FILE_INFO FROM D:\some_path\*.*" -i:FS -o:SQL -database:files -driver:"SQL Server" -createTable:ON

Note: the -o parameter directs LogParser to write the output to a SQL Server database table.

The T-SQL ranking functions make it very easy to pick out duplicate rows; here, I use one to generate statements that delete duplicate files. The entire script looks something like this:

USE [files]
GO
 
-- Create table to store file paths and MD5 hashes
-- (a permanent table rather than a temp table: LogParser connects in its own
-- session and cannot see a session-scoped #temp table)
IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[T_FILE_INFO]') AND type IN (N'U'))
    DROP TABLE [dbo].[T_FILE_INFO]
GO
CREATE TABLE [dbo].[T_FILE_INFO](
    [PATH] [varchar](512) NULL,
    [MD5] [varchar](255) NULL
) ON [PRIMARY]
GO
 
-- Build and run the LogParser command to populate the table.
-- Either run it from a CMD prompt or use xp_cmdshell.
-- Note: this step can be time consuming, since LogParser must generate an MD5 hash for every file specified.
DECLARE @LogParserCMD VARCHAR(8000);
SET @LogParserCMD = 'LogParser "SELECT PATH, HASHMD5_FILE(Path) AS MD5 INTO T_FILE_INFO FROM ''E:\some_path\*.*''" -i:FS -recurse:1 -o:SQL -database:files -driver:"SQL Server" -createTable:ON';
EXEC xp_cmdshell @LogParserCMD;
 
-- Get duplicate hashes; generate deletion commands
WITH tblCTE (MD5, PATH, Ranking) AS
(
    SELECT MD5, PATH,
           Ranking = DENSE_RANK() OVER (PARTITION BY MD5 ORDER BY NEWID() ASC)
    FROM dbo.T_FILE_INFO
)
SELECT 'DEL "' + PATH + '"'
FROM tblCTE
WHERE Ranking > 1
  AND MD5 IS NOT NULL;
-- Review the result set and run it from a command prompt
 
-- Clean up after yourself
DROP TABLE dbo.T_FILE_INFO

Finally, note that the MD5 hashing algorithm is subject to hash collisions: two different files can produce the same hash, so in this context there is a risk of unintentionally deleting a file. You could tweak these scripts to also capture and match on file size, which would reduce that risk.
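A hedged sketch of that tweak, assuming the LogParser query is extended to capture the FS format’s Size field into a SIZE column:

-- partition on both hash and size, so only same-hash, same-size files count as duplicates
WITH tblCTE (MD5, SIZE, PATH, Ranking) AS
(
    SELECT MD5, SIZE, PATH,
           Ranking = DENSE_RANK() OVER (PARTITION BY MD5, SIZE ORDER BY NEWID() ASC)
    FROM dbo.T_FILE_INFO
)
SELECT 'DEL "' + PATH + '"'
FROM tblCTE
WHERE Ranking > 1
  AND MD5 IS NOT NULL;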

Posted in T-SQL, Utilities, Windows OS | Comments Off on Deduplicating files with LogParser and SQL Server

The final voyage of the USNS H. H. Hess

My first ship, the USNS H. H. Hess, is currently en route to Brownsville, TX, where it will be cut up into scrap metal and recycled.  The Hess was an oceanographic survey vessel operated by the Military Sealift Command from roughly 1978-1992.  The ship was originally built in 1965 as the SS Canada Mail, a C4 Mariner-class cargo ship operated by American Mail Lines.  The ship was converted into its survey role in 1976.

In its time it was the world’s largest floating data center, with all sorts of cool computers and electronic instrumentation.  It performed bathymetric, magnetic field, and gravity surveys in support of the Navy’s FBM program.  The crew consisted of roughly 40 civilian mariners, 10 oceanographers, two commissioned naval officers, 18 enlisted Navy technicians, and several defense contractor representatives.

The ship was big, comfortable and well-appointed.  There were three dining rooms, two libraries, a conference room that doubled as a movie theater, nice staterooms, a workout room, a crew’s lounge and a large basketball court.   My favorite parts of the ship were the electronics shop – I pretty much had it to myself and built my first Z80-based computer there – and the bow, which offered sun, fresh air, and quiet solitude – a great place to study and contemplate life.

The photo below is a webcam capture that shows the Hess entering the Gatun locks of the Panama Canal on 2/18/2011.

hess-final-voyage-8.jpg

Posted in Personal | 2 Comments

Free SQL Server training videos

The people over at SQLSkills.com have made their MCM training videos available online for free.  They’re pretty incredible.  Subject areas include database internals and performance, high availability and disaster recovery, performance tuning,  security and development support.  Link here:  http://www.sqlskills.com/T_MCMVideos.asp

 

There are also a plethora of wonderful videos over at SQLBits.com.  One of my favorites is a talk by Thomas Kejser on designing I/O systems for SQL Server.

 

SQLWorkshops.com has a collection of wonderful videos on advanced query tuning.  The instructor provides live demos with runtime statistics to fully demonstrate his tuning techniques.  Free and very worthwhile.

Posted in Performance, SQL Server | 1 Comment

Alert for long-running SQL database backups

One of my daily tasks is to do a quick check of each SQL Server instance using Activity Monitor, sp_who2, or a DMV-based script.  Sometimes I get busy and forget to do this task.  Today I broke down and wrote a simple script that is executed by a SQL Agent job.  It runs at 8:00 AM and goes out to all the instances to check whether any backups are still running; if any are, an e-mail alert is raised.

The essence of the script follows.  You might want to modify it to iterate through a list of instances.

IF EXISTS (
		SELECT *
		FROM instance.MASTER.sys.sysprocesses -- 'instance' is a linked-server name placeholder
		WHERE cmd = 'backup database'
			AND program_name = 'SQL Management'
		)
BEGIN
	EXEC msdb.dbo.sp_send_dbmail @profile_name = 'Master'
		,@recipients = 'mailbox@domain.com'
		,@body = 'Backup job is still running on instance'
		,@subject = 'Backup job is still running on instance'
		,@importance = 'high';
END

Posted in SQL Server | 1 Comment

Learning SMO & Powershell

I created a small script to collect SQL Server file utilization data into a repository.  I’m collecting the space used by the files as well as the space used internally by SQL Server.  I’m doing this across all production databases and servers, so the T-SQL fileproperty function was not all that useful since it only works against the current database.  The solution was to use SMO in a PowerShell script to collect the data since it’s very easy to iterate across multiple servers and databases.  I’ll write up the solution at a later date, but for now I wanted to mention a minor fact that I learned today.

The SMO Database class has a Status property which returns at least two values, “Normal” and “Offline”.  You can use it to avoid trying to get info for offline databases.  For example:

if ($db.Status -eq "Normal") { #exclude offline databases
    #do something
}
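For context, a minimal SMO sketch that walks every database on an instance and reads file sizes might look something like this (the server name is illustrative; Size and UsedSpace are the SMO DataFile properties, reported in KB):

# hedged sketch: enumerate data files via SMO, skipping offline databases
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo")
$srv = New-Object Microsoft.SqlServer.Management.Smo.Server "node1"   # illustrative name
foreach ($db in $srv.Databases) {
    if ($db.Status -eq "Normal") {   # exclude offline databases
        foreach ($fg in $db.FileGroups) {
            foreach ($f in $fg.Files) {
                "{0}: {1} uses {2:N0} KB of {3:N0} KB" -f $db.Name, $f.Name, $f.UsedSpace, $f.Size
            }
        }
    }
}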

Posted in Powershell, SQL Server | 1 Comment

SQL Generators for moving database files

Here are several trivial scripts for updating the system catalog, moving files via robocopy, and adjusting filegrowth values. I hope someone finds them useful.

-- generate system catalog changes
-- *** note: need to manually edit destination path volume letter in output ***
SELECT 'ALTER DATABASE ' + DB_NAME() + ' SET OFFLINE'
 
SELECT 'ALTER DATABASE ' + DB_NAME() + ' MODIFY FILE
(
NAME = ''' + NAME + ''',
FILENAME = N''' + filename + '''
);'
FROM dbo.sysfiles
-- where filename like 'i:\%'
-- order by size desc
 
SELECT 'ALTER DATABASE ' + DB_NAME() + ' SET ONLINE'
 
-- generate robocopy statements to be executed under command prompt
-- *** note: need to manually edit destination path volume letter in output ***
SELECT 'RoboCopy ' +
LEFT(filename,(LEN(filename) - CHARINDEX('\', REVERSE(filename)))+1) + ' ' +
LEFT(filename,(LEN(filename) - CHARINDEX('\', REVERSE(filename)))+1) + ' ' +
RIGHT(filename,(CHARINDEX('\', REVERSE(filename)))-1)
from sysfiles
 
 
-- modify filegrowth
SELECT 'ALTER DATABASE ' + DB_NAME() + ' MODIFY FILE
(
NAME = ''' + NAME + ''',
Filegrowth = 128MB
);'
FROM dbo.sysfiles
Posted in T-SQL | 1 Comment

Index to Filegroup mapping

Here is a trivial script to show where a particular index resides. It saves clicking around the SSMS GUI.

SELECT i.name, i.type_desc, i. is_primary_key, i.is_unique, s.name AS [Filegroup]
	FROM sys.indexes i
	INNER JOIN sys.data_spaces s
	ON i.data_space_id = s.data_space_id
	WHERE i.name IS NOT NULL
	AND i.name NOT IN ('clust', 'clst', 'nc1', 'nc2', 'nc3', 'nc', 'cl')
	ORDER BY s.data_space_id, i.name
Posted in SQL Server, T-SQL | Comments Off on Index to Filegroup mapping

PowerShell Script to Clean Up Old Files Based on Age

Here’s an extremely simple PowerShell script to remove old files based on age. You could use it to enforce a policy that keeps 90 days of ETL input files on a server; files older than that would be purged by this script. The script would be invoked by a SQL Agent CmdExec step or a Windows Task Scheduler job. Be sure to test with the -WhatIf switch before deploying.

$intFileAge = 15  #age of files in days
$strCleanupPath = "filesystem::\\Server\Share..." #path to clean up - UNC paths permissible and probably preferred
 
#create filter to exclude files newer than specified age as well as all folder objects
Filter Select-FileAge 
      {
      param($days)
      If ($_.PSisContainer) {}
              # Exclude folders from result set
      ElseIf ($_.LastWriteTime -lt (Get-Date).AddDays($days * -1))
            {$_}
      }
 
get-Childitem -recurse $strCleanupPath | Select-FileAge $intFileAge | remove-item
Posted in Utilities, Windows OS | 3 Comments

Trigger Mass Enable / Disable

Pretty trivial, really:

SELECT 'ALTER TABLE ' + OBJECT_NAME(PARENT_ID) + ' ENABLE TRIGGER ' + [NAME]
FROM sys.triggers

…add a where clause if needed.  Paste the output back into the query window and modify as needed.  You can also slap some code around it to generate some dynamic SQL like this:

DECLARE @SQL VARCHAR(MAX)
SELECT @SQL = COALESCE(@SQL,'') + 'ALTER TABLE ' + OBJECT_NAME(PARENT_ID) + ' ENABLE TRIGGER ' + [NAME] + '; '
FROM sys.triggers
PRINT @SQL
  

Posted in T-SQL | Comments Off on Trigger Mass Enable / Disable

Moving Master and Resource databases

You may encounter the error message “SQL SERVER is in SINGLE USER MODE and only one administrator can connect” when moving the Master and Resource databases while following the procedure in BOL.  It happens right after you bring up SQL Server in Master-only, single-user mode using the T3608 trace flag, like this:

NET START MSSQLSERVER /f /T3608

For me the problem was that a remote process was grabbing the lone connection faster than I could.  The fix was to go into SQL Server Configuration Manager, expand SQL Server Network Configuration, drill down into Protocols for MSSQLSERVER, and then change the TCP/IP settings to point to a different port number.  In the example below, I pointed to port 14331 instead of 1433.

screencap.jpg

Once the port settings were changed, I stopped and restarted SQL Server and was able to finish moving the system databases.  Don’t forget to change back to port 1433 once you’re done!

References:

http://msdn.microsoft.com/en-us/library/ms345408(SQL.90).aspx

Posted in Uncategorized | Comments Off on Moving Master and Resource databases

Quick and Dirty CSV import to SQL Server

You can use LogParser (a free, unsupported utility from Microsoft) to quickly import a CSV file into a SQL Server table using syntax like this:

logparser "SELECT * INTO MyTable FROM d:\MyFile.csv" -i:csv -o:SQL -server:MyServer -database:MyDatabase -createTable:ON

There are many other uses for LogParser.  Two of the most common are analyzing Windows Event Logs and IIS logs. 
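For instance, a hedged example (the EVT input format is real; the query itself is illustrative) that summarizes the System event log by source:

logparser "SELECT TOP 10 SourceName, COUNT(*) AS Total FROM System GROUP BY SourceName ORDER BY Total DESC" -i:EVT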

Posted in SQL Server, Windows OS | Comments Off on Quick and Dirty CSV import to SQL Server

Getting data file space usage

Here is a simple query to get space usage by data file:

SELECT
    a.FILEID,
    [FILE_SIZE_MB]  = CONVERT(decimal(12,2), ROUND(a.size/128.000, 2)),
    [SPACE_USED_MB] = CONVERT(decimal(12,2), ROUND(FILEPROPERTY(a.name,'SpaceUsed')/128.000, 2)),
    [FREE_SPACE_MB] = CONVERT(decimal(12,2), ROUND((a.size - FILEPROPERTY(a.name,'SpaceUsed'))/128.000, 2)),
    NAME     = LEFT(a.NAME, 24),
    FILENAME = LEFT(a.FILENAME, 55)
FROM dbo.sysfiles a

Posted in SQL Server, T-SQL | Comments Off on Getting data file space usage

Useful Multipliers

Multiply SQL database pages by 0.0078125 to get space in megabytes
Multiply SQL database pages by 0.00000762939453125 to get space in gigabytes
(A SQL Server page is 8 KB, so there are 128 pages per MB and 1/128 = 0.0078125; dividing by a further 1,024 gives the gigabyte multiplier.)

Here are a couple of examples of where this is useful:

This query uses a DMV to return TempDB utilization by object category such as user, internal, and version store:

SELECT
    SUM(user_object_reserved_page_count) * 0.0078125     AS usr_obj_mb,
    SUM(internal_object_reserved_page_count) * 0.0078125 AS internal_obj_mb,
    SUM(version_store_reserved_page_count) * 0.0078125   AS version_store_mb,
    SUM(unallocated_extent_page_count) * 0.0078125       AS freespace_mb,
    SUM(mixed_extent_page_count) * 0.0078125             AS mixedextent_mb
FROM sys.dm_db_file_space_usage

The next query returns space utilization, in megabytes, by file, with a grand total:

SELECT name, filename, CAST(size * 0.0078125 AS int) AS size_mb
FROM sysfiles
ORDER BY filename
COMPUTE SUM(CAST(size * 0.0078125 AS int))

Posted in SQL Server, T-SQL | Comments Off on Useful Multipliers

One-off backups in SQL Server 2005

I am frequently asked to refresh development databases with production data.  The usual way to do this is to back up the production database and then restore over the development database.  SQL Server 2005 has a new backup option, “WITH COPY_ONLY”.  This option allows you to perform a full backup without affecting the backup chain.  The COPY_ONLY option is not supported in the SQL Server Management Studio (SSMS) GUI, so you have to perform the backup via a script.

It’s important to note that a backup that is created with the COPY_ONLY option cannot be restored with the SSMS GUI.  Instead you have to restore via a script.

SQL Server 2008 supports the COPY_ONLY option in the SSMS GUI.

Posted in SQL Server | 1 Comment

Converting an existing Perfmon .BLG file to CSV

You may run across the situation where you need to import an existing Perfmon (or System Monitor) binary log file into an Excel spreadsheet or a SQL database.  The obvious first step is to convert it to CSV format.  It took googling numerous forums to figure out how to do this, so I thought I’d document it here for future reference.

The relog.exe utility, which ships with the OS, can convert between Perfmon log file formats and merge several log files into one.  It has some nice filtering capabilities, including filtering by time, counters, and nth record.  Example syntax for simply converting between file formats:

relog -f csv inputfile.blg -o outputFile.csv

You can display all the options by simply typing relog -? at the command prompt.
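A couple of hedged filtering examples (file and counter names are illustrative):

rem keep only every 5th sample
relog inputfile.blg -f csv -o thinned.csv -t 5

rem extract a single counter
relog inputfile.blg -f csv -o cpu.csv -c "\Processor(_Total)\% Processor Time"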

Posted in Uncategorized, Utilities, Windows OS | Comments Off on Converting an existing Perfmon .BLG file to CSV

Dynamic SQL – must declare scalar variable

I ran into a situation today where I was generating dynamic SQL that appeared to be valid.  I could take the generated SQL strings and execute them with no issues, but when executed dynamically, I got the error message “Must declare the scalar variable “@someVar”, even though the variable in question was declared properly.  I learned that the problem occurred because the variables were declared outside of the scope of the dynamic SQL that I was executing.  I was pointed to the following article:  http://support.microsoft.com/kb/262499.  This article describes how to use input and output parameters with the sp_executesql stored procedure.  Another good article on dynamic SQL parameters can be found on the SimpleTalk web site here.
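The pattern from the KB article, as a minimal hedged sketch (table and variable names are illustrative):

-- pass the outer variable into the dynamic batch as a typed parameter
DECLARE @someVar INT;
SET @someVar = 42;
DECLARE @sql NVARCHAR(500);
SET @sql = N'SELECT * FROM dbo.SomeTable WHERE SomeColumn = @p;';
EXEC sp_executesql @sql, N'@p INT', @p = @someVar;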

Posted in T-SQL | Comments Off on Dynamic SQL – must declare scalar variable

SQL Server “Max Server Memory” Config Value

I learned an interesting tidbit while reading Inside Microsoft SQL Server 2005: Query Tuning and Optimization.  The Max Server Memory configuration value applies to the cache buffer pool and not SQL Server as a whole.  Here is a link to an MSDN article that explains it:  http://msdn.microsoft.com/en-us/library/ms180797.aspx

This book goes on to discuss how to detect, measure, and remedy various forms of memory pressure.  I followed some of the examples in the book and came up with these statistics on one server:

memory_stats1.html

One thing that is kind of interesting about these statistics is that they clearly show the configured server memory is significantly less than the memory used by sqlserver.exe.

Posted in SQL Server | Comments Off on SQL Server “Max Server Memory” Config Value

Shrinking all log files

I’m not a big proponent of shrinking files. There are a lot of postings on DBA forums citing why this practice should be avoided; Tibor Karaszi in particular has an article on why it is a bad idea. Shrinking log files is especially problematic, as log files cannot leverage instant file initialization; they have to be zeroed as they are grown.

Having said that, we have a development server with a lot of databases and no budget for storage upgrades. Consequently something has to give, and so we shrink the log files on a pretty regular basis.

I wrote a little query to dynamically build the list of databases and logical filenames and then run DBCC SHRINKFILE to shrink all the log files. Here’s the script:

SET NOCOUNT ON;
CREATE TABLE #tempFileInfo
(
  dbName VARCHAR(256),
  logicalName VARCHAR(256)
)
 
DECLARE @DBName VARCHAR(256)
DECLARE @SQL VARCHAR(MAX)
 
DECLARE CursorDB CURSOR FOR
SELECT [name] FROM sys.databases WHERE database_id > 4 AND state_desc = 'ONLINE'
OPEN CursorDB
 
FETCH NEXT FROM CursorDB INTO @DBName
WHILE @@FETCH_STATUS = 0
  BEGIN
    SET @DBName = '[' + @DBName + ']'
    SET @SQL = 'SELECT ' + CHAR(39) + @DBName + CHAR(39) + ', [name] FROM ' + @DBName + '.sys.sysfiles WHERE filename LIKE ' + CHAR(39) + '%ldf' + CHAR(39)
    PRINT @SQL
    INSERT INTO #tempFileInfo
    EXEC (@SQL)
    FETCH NEXT FROM CursorDB INTO @DBName
  END
CLOSE CursorDB
DEALLOCATE CursorDB
 
SET @SQL = NULL
SELECT @SQL = COALESCE(@SQL,'') + 'USE ' + dbName + '; DBCC SHRINKFILE(' + CHAR(39) + logicalName + CHAR(39) + ', 1); '
FROM #tempFileInfo
 
PRINT @SQL
EXEC (@SQL)
 
DROP TABLE #tempFileInfo
Posted in SQL Server, T-SQL | Comments Off on Shrinking all log files

How to determine if TCP Chimney Offloading is working

The TCP Chimney Offload feature was introduced in Windows Server 2003 SP2.  It can offload certain functions from the OS to the network adapter.  A lot of our newer HP servers have a Broadcom Ethernet chipset that supports this function.  It isn’t without its issues.  To determine whether it is in use, you can run the netstat -nt command like this:

C:\>netstat -nt

 Active Connections

   Proto  Local Address          Foreign Address        State           Offload State
  TCP    192.168.4.97:445        192.168.4.13:2167       ESTABLISHED     InHost
  TCP    192.168.4.97:1433       192.168.4.95:3560       ESTABLISHED     InHost
  TCP    192.168.4.97:1433       192.168.4.95:3617       ESTABLISHED     InHost

The last column shows the offload status.  In this example the offload state is shown as InHost for all connections, so TCP offload is not in effect for any of them.

Posted in Windows OS | Comments Off on How to determine if TCP Chimney Offloading is working

Oracle Sysdate formatting

You can use the PL/SQL to_char function to format the output of SYSDATE like this:

select to_char(sysdate,'DD/MM/YYYY HH24:MI:SS') from dual;

Posted in SQL | Comments Off on Oracle Sysdate formatting

The Good Books

I used several books to study for the ICCP DBA exam including:

  • Modern Database Management, 8th Edition by Hoffer, Prescott and McFadden
  • Database Systems: Design, Implementation & Management 6th Edition by Rob and Coronel
  • Database Administration: The complete Guide to Practices and Procedures by Craig S. Mullins
  • An Introduction to Database Systems 5th Edition by C.J. Date

Of these, Modern Database Management was my favorite. Chapter 6 covers physical database design and performance, and includes a section with guidelines for improving query performance. Here are a few of the recommendations:

  • Use compatible data types, i.e., avoid type conversions
  • Write simple queries
  • Break complex queries into multiple, simple parts
  • Don’t nest one query inside another (just because you can use a subquery doesn’t always mean that you should)
  • Don’t combine a table with itself, i.e. avoid self-joins.
  • Create temporary tables for groups of queries.
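
As a quick sketch of that first point (the table and index here are hypothetical): comparing a VARCHAR column against an NVARCHAR literal forces SQL Server to implicitly convert the column, which can prevent an index seek, while a literal of the matching type does not.

-- hypothetical table: Customers(AccountNumber VARCHAR(20)), indexed on AccountNumber
SELECT * FROM Customers WHERE AccountNumber = N'12345' -- implicit conversion; may force a scan
SELECT * FROM Customers WHERE AccountNumber = '12345'  -- matching type; allows an index seek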

Posted in T-SQL | Comments Off on The Good Books

Remote Reboot

Today while applying patches to a couple of mission-critical servers, I was reminded of something:  reboots issued through the terminal services client are not always successful.  The server can get stuck in an odd state where it is running but terminal services is not working.  For me the workaround has been to issue a remote reboot command using the shutdown utility:

shutdown /r /m \\server_name_to_reboot /t 0 /f

Posted in Windows OS | Comments Off on Remote Reboot

Ongoing Lessons from the Cluster Lab

It’s been a while since I worked on my cluster lab. The first thing that I did was to tear down the old cluster and start over from scratch. The first lesson from this round of testing came when the cluster administrator scanned the hardware resources and threw a warning stating that only one network adapter was found on the node. This warning is caused by the fact that the two cluster nodes are interconnected via a crossover cable and the second node was shut down. With the node shut down, there is no physical-layer link on the Ethernet card, so the OS thinks that the cable is disconnected.

I got past the error by simply plugging the cluster member’s private network connection into a spare switch that I had lying around. This provided a physical layer link which was what the OS needed to report a valid network connection. After that I rescanned the resources and the cluster installation proceeded.

Posted in Windows OS | Comments Off on Ongoing Lessons from the Cluster Lab

Getting HH:MM:SS value from Seconds

Here is another handy date manipulation function. This one converts a number of seconds, expressed as an int, into HH:MM:SS format:

DECLARE @seconds INT
SET @seconds = 3725 -- sample value

SELECT
CASE
WHEN @seconds/3600 < 10 THEN '0'
ELSE ''
END
+ RTRIM(@seconds/3600)
+ ':' + RIGHT('0' + RTRIM((@seconds % 3600) / 60), 2)
+ ':' + RIGHT('0' + RTRIM((@seconds % 3600) % 60), 2)

Rewritten as a function, we have:

CREATE FUNCTION SecToHHMMSS
(
@seconds INT
)
RETURNS VARCHAR(10)
AS
BEGIN
RETURN
CASE
WHEN @seconds/3600 < 10 THEN '0'
ELSE ''
END
+ RTRIM(@seconds/3600)
+ ':' + RIGHT('0' + RTRIM((@seconds % 3600) / 60), 2)
+ ':' + RIGHT('0' + RTRIM((@seconds % 3600) % 60), 2)
END
GO
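
A quick sanity check of the function:

SELECT dbo.SecToHHMMSS(3725) -- returns 01:02:05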

Posted in T-SQL | Comments Off on Getting HH:MM:SS value from Seconds

Stripping the time component from a datetime value

Here is a very simple and elegant way to strip the time component from a datetime value:

CAST(FLOOR(CAST(@date AS float)) AS datetime)

Written as a function:

CREATE FUNCTION BareDate
(
@date datetime
)
RETURNS datetime
AS
BEGIN
RETURN
CAST(FLOOR(CAST(@date AS float)) AS datetime)
END
GO

When we pass in the value ‘2008-02-12 13:25:33.3’, it returns the value 2008-02-12 00:00:00.000

Other ways of stripping the time component:

SELECT DATEADD(dd,(DATEDIFF(dd,0,backup_start_date)),0)

or:

SELECT CAST(DATEDIFF(dd,0,backup_start_date) AS Datetime)

or:

SELECT CONVERT(Datetime, CONVERT(NCHAR(10), backup_start_date, 121))

The CONVERT approach has performance issues; I’ll have to come up with numbers for all of these.
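
As an aside: on SQL Server 2008 and later, which introduced the date type, the cleanest option is a straight cast (cast back to datetime if you need the midnight timestamp):

SELECT CAST(GETDATE() AS date)
SELECT CAST(CAST(GETDATE() AS date) AS datetime)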

Posted in SQL Server | Comments Off on Stripping the time component from a datetime value

Moving

I originally blogged on Yahoo 360, but it is a lame duck, so I’m moving my postings over to WordPress.

Posted in Uncategorized | Comments Off on Moving

The IIS Metabase Explorer Rocks

Today I was reminded of something that I knew but forgot. The IIS Metabase Explorer for IIS6 (or MetaEdit for IIS 4 and 5) is a very useful tool. I needed to blow away an inheritable attribute of the HttpRedirect property. The Metabase explorer made the task ridiculously simple.

The Metabase Explorer is part of the IIS 6.0 Resource Kit.

Posted in IIS | Comments Off on The IIS Metabase Explorer Rocks

Partition Misalignment can Cause Poor Performance

Here is an interesting Microsoft Knowledgebase article titled Disk performance may be slower than expected when you use multiple disks in Windows Server 2003, in Windows XP, and in Windows 2000.

It explains how partition misalignment can cause significant extra I/O by making reads and writes straddle track boundaries. The fix is to use the diskpart.exe utility to set an alignment offset when you first create the partition.

This is similar to what you do with SAN storage to align the starting offset to “significant” boundaries. See the 3/31/08 post for more details.

Posted in Performance, SQL Server | Comments Off on Partition Misalignment can Cause Poor Performance

Using Diskpart to Align a Windows Disk Partition

There is an EMC whitepaper titled Using diskpar and diskpart to align partitions on Windows Basic and Dynamic Disks. This paper describes how legacy Windows issues cause a single disk I/O to span multiple tracks. This puts an unnecessary load on the SAN and reduces performance. The fix is to align the partition. Below is a sample session using the DISKPART utility, with an explanatory comment before each command.

Invoke Diskpart

C:\>diskpart

Microsoft DiskPart version 5.2.3790.3959

Copyright (C) 1999-2001 Microsoft Corporation.

On computer: MYTESTBOX

You must select the disk to act on

DISKPART> select disk 5

Disk 5 is now the selected disk.

Create a partition using the EMC-recommended alignment offset

DISKPART> create partition primary align=64

DiskPart succeeded in creating the specified partition.

Select the partition / volume that you just created

DISKPART> select volume=0

Volume 0 is the selected volume.

Assign a drive letter

DISKPART> assign letter=o

DiskPart successfully assigned the drive letter or mount point.

Exit Diskpart

DISKPART> exit

Leaving DiskPart…

C:\>

You can now use the disk management MMC to format the partition

Posted in Performance, Windows OS | 6 Comments

MD5 Hash Utilities

FCIV.EXE is a handy little utility to snap an MD5 hash of a single file or multiple files. Optionally, it can recurse directories and produce XML output. http://support.microsoft.com/kb/841290

MD5SUM.EXE is a more commonly used utility that runs on a number of different platforms. A good Wikipedia article with links can be found here and a Win32 binary can be found here.

Posted in Utilities | Comments Off on MD5 Hash Utilities

Simple SQL Server Log Filter / Review Script

Here’s a very basic log filtering/review script that you can run on a daily basis to review your SQL Server logs. It shows the last two days of log entries, filtering out logon and backup messages.

-- daily log review script
SET nocount ON
CREATE TABLE
#Errors
(
LogDate DateTime,
ProcessInfo NVARCHAR(40),
ErrorText NVARCHAR(2048)
)
INSERT INTO #Errors
EXEC master..sp_readerrorlog -1

SELECT * FROM #Errors
WHERE DATEDIFF(DAY, LogDate, GETDATE())<2
AND ProcessInfo <> 'Logon'
AND ProcessInfo <> 'Backup'
DROP TABLE #Errors
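
If you only care about entries containing a particular string, the underlying (undocumented) xp_readerrorlog procedure accepts search arguments on SQL Server 2005 and later; a minimal sketch:

-- read the current log (0), SQL Server log type (1), filtering on a search string
EXEC master..xp_readerrorlog 0, 1, N'error'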

I’m currently cooking up some VBScript to filter and e-mail log entries on a daily basis. I’ll publish that soon.

Posted in T-SQL | Comments Off on Simple SQL Server Log Filter / Review Script

Basic Database Inventory Script

This script lists each database with its size and key settings, along with the date of its most recent full or differential backup:

CREATE TABLE #tempresults2
(
database_name NVARCHAR(128),
backup_start_date DATETIME
)
INSERT INTO #tempresults2
SELECT database_name, backup_start_date
FROM msdb.dbo.backupset AS b1
WHERE backup_start_date =
(
SELECT MAX(backup_start_date)
FROM msdb.dbo.backupset AS b2
WHERE b1.database_name = b2.database_name
AND TYPE IN ('D', 'I')
);
WITH TBL_CTE(name, Size_MB, create_date, compatibility_level, [state], recovery_model, page_verify_option) AS
(
SELECT D.name, CONVERT(DECIMAL(12,2), ROUND(SUM(f.[size])/128.000, 2)) AS Size_MB, d.create_date, d.compatibility_level, d.state_desc, d.recovery_model_desc, d.page_verify_option_desc
FROM SYS.DATABASES D
INNER JOIN SYS.MASTER_FILES F
ON F.database_id = D.database_id
GROUP BY D.name, d.create_date, d.compatibility_level, d.state_desc, d.recovery_model_desc, d.page_verify_option_desc
)
SELECT name, Size_MB, create_date, compatibility_level, [state], recovery_model, page_verify_option,
tr2.backup_start_date AS last_backup, GETDATE() AS inventory_date
FROM TBL_CTE D
LEFT OUTER JOIN #tempresults2 tr2
ON D.name = tr2.database_name;
DROP TABLE #tempresults2
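
As a possible extension (a sketch that reuses #tempresults2, so it must run before the DROP TABLE above), you could flag databases that have never been backed up or have no backup in the last day:

SELECT d.name
FROM sys.databases d
LEFT JOIN #tempresults2 tr2
ON d.name = tr2.database_name
WHERE tr2.backup_start_date IS NULL
OR tr2.backup_start_date < DATEADD(DAY, -1, GETDATE());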

Posted in SQL Server | Comments Off on Basic Database Inventory Script

Starting Windows Performance Logging at boot time

Performance logging will not start on its own at boot time. Here’s how to get around that problem.

 

Applies to: W2k3, XP

  1. Using the Performance MMC, create a set of counter logs.
  2. Use a specific name to save the settings, e.g., SQLPERF.
  3. At the command prompt, try starting and stopping performance logging using the logman.exe utility:
    1. logman start sqlperf
    2. logman stop sqlperf
  4. Set the Windows registry to automatically start performance counters at system startup
    1. Open the Registry Editor, regedit.exe.
    2. Navigate to HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Run
    3. Add a string value. A good name would be logman.
    4. Set the value equal to logman start sqlperf (or the command in step 3.a. that successfully started performance logging).

References:

  1. http://support.microsoft.com/kb/303133
  2. http://www.jsifaq.com/SF/Tips/Tip.aspx?id=4455
Posted in Performance, Windows OS | Comments Off on Starting Windows Performance Logging at boot time

Query to get last database backup dates

Here’s a simple query to find the last backup dates:

 

SELECT backup_start_date, database_name
FROM msdb.dbo.backupset AS b1
WHERE backup_start_date =
(
SELECT MAX(backup_start_date)
FROM msdb.dbo.backupset AS b2
WHERE b1.database_name = b2.database_name
AND type = 'D'
)
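
One caveat: databases that have never been backed up won't appear at all. A variation that drives the query from sys.databases (assuming SQL Server 2005 or later) returns NULL for those:

SELECT d.name, MAX(b.backup_start_date) AS last_backup
FROM sys.databases d
LEFT JOIN msdb.dbo.backupset b
ON d.name = b.database_name AND b.type = 'D'
GROUP BY d.name
ORDER BY d.name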

Posted in SQL Server | Comments Off on Query to get last database backup dates

SQL Server Best Practices Analyzer (BPA)

A new version of the SQL Server 2005 Best Practices Analyzer (BPA) is here.

On a related note, the BPA reported several operating system issues that I was unaware of, including an issue where SQL Server’s working set gets paged to disk during large file copies. More info and a link to the Technet article discussing this issue can be found here.

Posted in SQL Server, Windows OS | Comments Off on SQL Server Best Practices Analyzer (BPA)

First stab at a database inventory script

Compatible with SQL Server 2005:

USE master
CREATE TABLE #tempresults
(
[name] sysname,
db_size NVARCHAR(13),
[owner] sysname,
[dbid] smallint,
created NVARCHAR(11),
[status] NVARCHAR(600),
compatibility_level tinyint
)
INSERT INTO #tempresults
EXEC sp_helpdb
SELECT db.name, tr.db_size, db.compatibility_level, is_auto_shrink_on, state_desc, recovery_model_desc, page_verify_option_desc
FROM sys.databases db
JOIN #tempresults tr
ON db.name = tr.name
WHERE dbid > 4 -- excludes the system databases: master, model, msdb, and tempdb
ORDER BY db.name
DROP TABLE #tempresults

Posted in SQL Server, T-SQL | Comments Off on First stab at a database inventory script

Code Prettifier

I found an easy way to render code snippets in HTML. It is called the Simple Talk Code Prettifier. In addition to rendering, it converts SQL keywords to uppercase and gives you proper indentation. It also supports VB.  http://www.simple-talk.com/prettifier/default.php

By the way, the Simple Talk blog http://www.simple-talk.com is pretty cool.  My puny effort here pales by comparison.

Posted in SQL Server | Comments Off on Code Prettifier

Searching for a column name in a SQL database

Here are a couple of different ways to search for a column name within a SQL Server database:

-- using information_schema views (preferred method)
SELECT sc.table_name
FROM information_schema.columns sc
INNER JOIN information_schema.tables st
ON sc.table_name = st.table_name
WHERE st.table_type = 'base table'
AND sc.column_name = 'ColumnToFind'
ORDER BY sc.table_name

-- old school
SELECT name
FROM sysobjects
WHERE id IN
(
SELECT id
FROM syscolumns
WHERE name = 'ColumnToFind'
)
AND xtype = 'U'
ORDER BY name
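
On SQL Server 2005 and later, the catalog views offer a third route; a minimal sketch:

-- catalog-view variant
SELECT t.name AS table_name
FROM sys.columns c
INNER JOIN sys.tables t
ON c.object_id = t.object_id
WHERE c.name = 'ColumnToFind'
ORDER BY t.name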

Posted in SQL Server, T-SQL | Comments Off on Searching for a column name in a SQL database

Graduated

I am happy to report that I graduated from Excelsior College with a BS in Computer Information Systems and a GPA of 4.0!

Posted in Personal | Comments Off on Graduated

ICCP DBA Exam

On a personal note, I am happy to report that I passed the ICCP Database Administration Exam at the mastery level. Passing this exam completes my degree requirements.

Posted in Personal | Comments Off on ICCP DBA Exam

Windows shell script: putting date & time into a file

SET mydate=%DATE:~4,2%_%DATE:~7,2%_%DATE:~10,4%
echo %mydate% >> somefile.txt

Note that the substring offsets assume the default US date format.

Posted in Windows OS | Comments Off on Windows shell script: putting date & time into a file

Open Source Diff Tool

Interesting tool: http://winmerge.org/

To quote the authors: WinMerge is an Open Source visual text file differencing and merging tool for Win32 platforms. It is highly useful for determining what has changed between project versions, and then merging changes between versions.

I previously used windiff.exe to diff two files; it is (or was) part of the Windows Resource Kit. More recently I have used the freeware Context editor http://context.cx, which has a very good diff tool built in.

On a related note, here’s a cool trick for finding registry differences between two machines: export the subkey to a file on each machine and use a file diff tool to compare the results. By default, the registry editor exports Unicode, so either your diff tool has to support Unicode, or you need to use the optional Win NT4/9x output format for your export.

Posted in Open Source, Utilities | Comments Off on Open Source Diff Tool

Script to Return Space Used for All Tables in a Database

Here’s a handy script to return space used by table:  spaceused.html

Posted in SQL Server, T-SQL | Comments Off on Script to Return Space Used for All Tables in a Database

Clustering Lab

I recently built a Windows Server 2003 cluster lab using a couple of PCs, two Adaptec AHA2940UW controllers, and an old SCSI drive. When configuring the SCSI HBAs, it is important to remember to disable SCSI resets.

Note that when shopping for surplus SCSI HBAs, the AHA2940UW is probably the oldest card that you can use. It has a flash-programmable BIOS.   You can download the last BIOS version produced for that card from Adaptec’s website.  The older 2940 (without the U in the part number) uses pluggable EPROMS.

Posted in Windows OS | Comments Off on Clustering Lab

Tarpits

While working on a client’s Exchange server, I stumbled across an interesting knowledgebase article: SMTP tar pit feature for Microsoft Windows Server 2003. Tarpits insert a delay into certain SMTP communications that are associated with spam or with other unwanted traffic. They work by slowing all responses that contain SMTP protocol 5.x.x error codes. For the Microsoft SMTP server the delay value is configurable. There are some caveats as to whether or not this will be effective in your environment and whether or not it will slow legitimate traffic. It is definitely worth checking out.

On a related note, it would be interesting to put a LaBrea Tarpit on the top and bottom IPs of an Internet-facing subnet. For example, if I owned subnet 207.103.204.0/24, I would bind addresses 207.103.204.1 and 207.103.204.254 to a LaBrea host. Zombie computers often do linear address space scans looking for victim machines. Having a tarpit at either end of the subnet could significantly slow zombie activity. It would also tend to clean up much of the junk that you see in your HTTP or IDS logs.

Posted in Windows OS | Comments Off on Tarpits

Automated, Scheduled Domain Controller Diagnostics

Wouldn’t it be nice to have the time once a week to run dcdiag against all of your domain controllers and review the results? (Yeah, right.) Well, I wrote a script to automate that process. It works like this: first, a subroutine enumerates the DCs and puts the result into an array of strings. Next, a function runs dcdiag against the list of DCs, looking for the string “failure”. If that string is found, it concatenates the server name and failure string to a global string variable, and the function returns True. If the function returns True, the script e-mails an alert with the necessary information.

Posted in Windows OS | Comments Off on Automated, Scheduled Domain Controller Diagnostics

Retina WiFi Scanner

I audited my Wireless Access Point using the free Retina WiFi Scanner from eEye Digital Security. According to it, my WPA PSK key length was too short, so I increased it to around 25 characters. If you look at the scanner’s installation directory, there is a file named wepdic.txt. This is the wordlist that the scanner uses to attempt to brute force WEP keys. In theory you should be able to replace it with a larger wordlist.

Cain & Abel has been added to my list of “must have” security tools.  Essential for doing password audits.

Posted in Uncategorized | Comments Off on Retina WiFi Scanner

Troubleshooting SQL Server with Filemon

Today I came across an interesting problem while changing the SQL Server service to run under a non-administrative domain account. Previously the service was configured to log on as LocalSystem, which has full rights locally but no domain rights. After the change, the service would start and then terminate. Here is a SQL error log snippet:

2007-02-13 15:05:01.29 spid5 Clearing tempdb database.
2007-02-13 15:05:01.58 spid5 Encountered an unexpected error while checking the sector size for file ‘k:\mssql\MSSQL\data\tempdb.mdf’. Check the SQL Server error log for more information.
2007-02-13 15:05:01.62 spid11 Starting up database ‘Student’.
2007-02-13 15:05:01.72 spid10 Starting up database ‘Diners’.
2007-02-13 15:05:01.80 spid9 Starting up database ‘distribution’.
2007-02-13 15:05:02.10 spid8 Starting up database ‘Inventory’.
2007-02-13 15:05:02.72 spid11 Starting up database ‘NorthwindReportData’.
2007-02-13 15:05:02.80 spid10 Starting up database ‘TSQLDB’.
2007-02-13 15:05:02.83 spid9 Starting up database ‘SSEMDB’.
2007-02-13 15:05:03.57 spid5 CREATE DATABASE failed. Some file names listed could not be created. Check previous errors.
2007-02-13 15:05:03.57 spid5 WARNING: problem activating all tempdb files. See previous errors. Restart server with -f to correct the situation.

Making the service account a member of the administrators group fixed the problem, but of course this is not a good idea. I decided to use the SysInternals File Monitor, filemon.exe, to see if I could find the cause of the problem. What I found was that sqlservr.exe was trying to access the root of the volume where the SQL data files exist and was getting an access-denied error. The fix was to grant the service account list access to the root of that volume. With that change, the SQL Server service came up and stayed up. I also found a Microsoft article that points to the same solution for a similar problem: PRB: Error 5177 May Be Raised When Creating Databases

By the way, SysInternals was recently acquired by Microsoft and Mark Russinovich is now a Microsoft Employee. It appears that the great SysInternals utilities continue to be maintained.

Posted in SQL Server | Comments Off on Troubleshooting SQL Server with Filemon

Earlier Postings

…stuff imported from an earlier journal



2/3/07  I stumbled across a number of useful SQL security tools at SQLSecurity.com: http://www.sqlsecurity.com/Tools/FreeTools/tabid/65/Default.aspx


1/23/07  I ran into a problem configuring SQL XML support in IIS 6.0. When I went to do a query It was coming back with a 404 (page not found). The IIS log entry looks like this:

2007-01-24 01:04:07 127.0.0.1 GET /Northwind sql=select+*+from+Customers+For+XML+Auto&root=Customers 80 MACBETH\Administrator 127.0.0.1 Mozilla/4.0+(compatible;+MSIE+7.0;+Windows+NT+5.2;+.NET+CLR+1.1.4322;+.NET+CLR+2.0.50727) 404 2 1260

The last part of the string, ‘1260’, is the Win32 status. Running net helpmsg 1260 from the command prompt shows that the problem is a software restriction policy:

Windows cannot open this program because it has been prevented by a software restriction policy.

Doing some searching, I found the following useful articles:

HTTP 404.x-File or Directory Not Found (IIS 6.0)
http://www.microsoft.com/technet/prodtechnol/WindowsServer2003/Library/IIS/0f4ac79a-dc2b-4a5f-89c1-d57266aa6ffe.mspx?mfr=true

IIS 6.0: ASP.NET Is Not Automatically Installed on Windows Server 2003
http://support.microsoft.com/kb/332124/

“HTTP Error 404 – File or Directory not found” error message when you request dynamic content with IIS 6.0
http://support.microsoft.com/?id=315122

Enabling Web Service Extensions (IIS 6.0)
http://www.microsoft.com/technet/prod … 2f-b029-aea2b0418bea.mspx


12/15/2006  I finally caught up with the times and replaced my AC97 audio with a SoundBlaster Audigy. The difference is quite noticeable. I also recently discovered EAC or Exact Audio Copy, a freeware program that accurately transcribes your audio CDs to WAV files. The combination of EAC and a decent sound card essentially turns your PC into a high-end CD player.


11/15/06  I purchased my fourth pneumatic nailer. This beauty is a framing nailer that shoots 2″ to 3 1/2″ nails. It makes quick work of many tasks around the home. By the way, did you know that “he who dies with the most tools wins”?


11/3/06  Today is my last day at Citigroup, two days after my 7th anniversary. Company management decided to eliminate the Diners Club systems development group based here in Denver. Mine was one of the 168 jobs affected. It’s a shame, because it was probably the best software development team within Citigroup.


7/15/06   I have decided to install security cameras around the house. One thing that I needed was connectors for 75 ohm RG6 cable. Most RG6 connectors are the ‘F’ style, while most security cameras use BNC connectors. I also needed replacement solder-tab batteries for our Interplak toothbrushes. After checking with several suppliers, I placed my order with All Electronics Wednesday evening. The parts arrived via USPS Saturday afternoon — very quick turnaround!


6/25/06  We have to relocate our gas meter. The utility and the building department have pretty stringent requirements for this, so Friday I took the Denver homeowner’s plumbing exam and pulled a plumbing permit. The plumbing inspection requires that you pressurize the gas line to 10 PSI and connect a gauge. You must demonstrate to the inspector that the gas piping can maintain pressure for at least 15 minutes. Not having a suitable gauge or test fixture, I went to General Hardware, purchased a few parts, and built the one shown here. The parts include a 0-30 PSI gauge, a Schrader valve for applying and bleeding the compressed air, a 3/4″ tee, and the necessary reducers for the gauge and the Schrader valve.


6/19/06  A colleague recommended a really useful site for finding security tools: http://sectools.org/.  Fyodor, the creator of Nmap and maintainer of http://www.insecure.org, surveyed security professionals and compiled a list of the 100 most popular security tools.  Many of these tools have multiple uses; for example, the Paros proxy is a useful tool if you maintain and troubleshoot web sites.


6/1/06  Are your favorite troubleshooting tools not installed on the machine that you are currently working on? One solution is to install your applications on a USB drive. A site called PortableApps.com has a number of applications that have been built to run from a USB or CD-ROM drive.


4/20/06  Hooray for the retarded kid! I finished up the OS course and took a DANTES exam. Only 7 semester hours to go!


2/21/06  Two useful measures of energy density are watt-hours per liter (volumetric energy density) and watt-hours per kilogram (gravimetric energy density).  Gasoline has a volumetric energy density of about 9000 watt-hours per liter, while a lead-acid battery has an energy density of 40 watt-hours per liter (more than a 200-fold difference).  This helps explain why electric cars are not yet common.
Liquid hydrogen has a substantially lower volumetric energy density than gasoline, only 2600 watt-hours per liter.  Don Lancaster has an excellent tutorial on energy fundamentals: http://tinaja.com/glib/energfun.pdf.


1/6/06  Downloaded and installed GIMP on Linda’s XP computer. GIMP stands for GNU Image Manipulation Program. It seems to be a nice alternative to Photoshop.


1/2/06  Hooray for the retarded kid!  I’m finishing up my last fourteen semester hours of undergraduate work.  Today I am starting a course on operating systems.


12/28/05  Used an updated version of a familiar utility, autoruns.exe, from the sysinternals.com website.  There are all sorts of places within Windows from which a program can be automatically launched, including services as well as programs launched as the logged-in user.  These guys seem to have figured them all out.


12/27/05  ps command usage on AIX:

  • to display all processes with memory utilization use ps gv
  • to display all processes with memory utilization and command line ps gev

Interesting options from the man page (none of these are preceded by a - sign):

  • e  Displays the environment as well as the parameters to the command, up to a limit of 80 characters.
  • ew  Wraps display from the e flag one extra line.
  • eww  Wraps display from the e flag as many times as necessary.
  • g  Displays all processes.
  • u  Displays user-oriented output. This includes the USER, PID, %CPU, %MEM, SZ, RSS, TTY, STAT, STIME, TIME, and COMMAND fields.
  • v  Displays the PGIN, SIZE, RSS, LIM, TSIZ, TRS, %CPU, %MEM fields.
  • w  Specifies a wide-column format for output (132 columns rather than 80). If repeated, (for example, ww), uses arbitrarily wide output. This information is used to decide how much of long commands to print.

12/5/05  Grep Examples (in a Win32 environment)

  • Search a file for lines containing “cat” or “dog”:   grep -i "cat\|dog" file.txt   (here the escaped | is a logical OR within the pattern)
  • Search a file for lines containing both “cat” and “dog”:   grep "cat" file.txt | grep "dog"   (here the unescaped | is a shell pipe feeding the first grep’s output into the second)

11/18/05  Found a useful Win32 utilities site: http://www.nirsoft.net/. Of particular note: MyUninstaller, which is way more useful than the stock Add/Remove Programs control panel applet.


11/17/05  The Unix chsh utility can be used to change your working shell, but

  • This will only work if you have permissions to the passwd file
  • BOKS managed systems periodically push down the passwd file, which will overwrite your changes
Posted in Uncategorized | Comments Off on Earlier Postings