Exporting Microsoft SQL Server Data Using PowerShell

June 12, 2014 | code, links, productivity

One of the repetitive tasks that has come across my desk over the past few years has been regularly generating a CSV file from data in our Microsoft SQL Server instances for use by third-party providers. While reading a newsletter from SQLServerCentral.com, I saw a post by Louis Li titled “Extracting data with headers using PowerShell”.

I’ve been working a bit with PowerShell and SharePoint 2013, so the article piqued my interest. Ways to dump data out of MS SQL are a dime a dozen, but I liked the simplicity of being able to simply hand a T-SQL file containing one or more SELECT statements to a PowerShell script and have the data saved out as a CSV file. This also makes scheduling the process easy by just using Task Scheduler to call the PowerShell script.
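For instance, scheduling the export with Task Scheduler just means pointing a task at powershell.exe. The task name, schedule, and paths below are placeholders for illustration, not values from my setup:

```
schtasks /Create /TN "Nightly CSV Export" /SC DAILY /ST 06:00 ^
  /TR "powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\ExportToCSV.ps1 -serverName SQL01 -databaseName Sales -scriptFilePath C:\Scripts\export.sql -saveDirectory C:\Exports"
```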

Before going through the rest of this post, please visit Louis Li’s article to get a basic understanding of what the script does.

I decided to take what Louis had done and build upon it a bit and add some extra features to make life a little easier for me and anyone else who wanted to take the script and run with it. If you want to just grab the file and go, here it is.

The first thing I did was to add some named parameters to the file.

[CmdletBinding()]
Param(
	[Parameter(Mandatory=$true)]
	[string]$serverName,
	
	[Parameter(Mandatory=$true)]
	[string]$databaseName,
	
	[Parameter(Mandatory=$true)]
	[string]$scriptFilePath,	
	
	[Parameter(Mandatory=$true)]
	[string]$saveDirectory,
	
	[Parameter(Mandatory=$false)]
	[string]$fileNames	
)

The only additions were the -saveDirectory and -fileNames parameters. The -saveDirectory parameter lets you pass the location where you want the CSV files saved. The -fileNames parameter takes a comma-delimited set of strings and uses each string as a file name. The file names are used in order, so make sure the number of file names matches the number of SELECT statements and that they are in the same order.
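With those parameters in place, a call to the script looks something like this. The script name, server, database, and paths here are placeholders of my own, not values from the article:

```
# Hypothetical invocation -- adjust the names and paths to your environment.
.\ExportToCSV.ps1 -serverName "SQL01" `
	-databaseName "Sales" `
	-scriptFilePath "C:\Scripts\export.sql" `
	-saveDirectory "C:\Exports" `
	-fileNames "customers,orders"
```

If the .sql file contains two SELECT statements, this would produce customers.csv and orders.csv in C:\Exports.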

Next, I needed to add some error checking to make sure the output directory exists and has a trailing slash. I also wanted to verify that the .sql file being passed in exists and has a .sql extension.

#test if the save directory exists
if ((Test-Path -Path $saveDirectory) -ne $true)
{
	throw "The -saveDirectory does not exist"
}
else
{
	#Add a trailing slash if there isn't one supplied on the saveDirectory param.
	if (-not $saveDirectory.EndsWith("\")) { $saveDirectory += "\" }
}

#test SQL Script exists
if ((Test-Path -Path $scriptFilePath -Include "*.sql") -ne $true)
{
	throw "The -scriptFilePath either does not exist or is not a .sql file"
}
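One caveat: in some PowerShell versions, Test-Path only applies -Include reliably when the path contains a wildcard. An explicit extension check is a more defensive alternative; this is my own sketch, not what the script uses:

```
# Alternative check: verify the file exists, then test its extension
# explicitly instead of relying on Test-Path -Include.
if (-not (Test-Path -Path $scriptFilePath))
{
	throw "The -scriptFilePath does not exist"
}
if ([System.IO.Path]::GetExtension($scriptFilePath) -ne ".sql")
{
	throw "The -scriptFilePath is not a .sql file"
}
```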

If those checks pass, I check whether a string of file names was passed. If so, it needs to be split into an array for use later. If the number of items in the array doesn’t match the number of tables returned by the SQL script, I reset the array to empty and just use numbers for file naming.

#If file names are supplied and the count doesn't match, default to number naming.
#$fileNames always exists as a declared parameter, so test its value
#(empty string when not supplied) rather than the variable itself.
$arrNames = @()
if ($fileNames)
{
	$arrNames = $fileNames.Split(',')
	if ($arrNames.Count -ne $dataset.Tables.Count)
	{
		$arrNames = @()
	}
}

I modified the Foreach loop to accommodate the custom file naming as follows.

#initialize the counter explicitly rather than relying on $null coercing to 0
$cnt = 0
Foreach ($dt in $dataset.Tables)
{
	if ($arrNames)
	{
		$saveName = $arrNames[$cnt]
		$cnt++
	}
	else
	{
		$cnt++
		$saveName = $cnt.ToString()
	}
	$file = $saveDirectory + $saveName + ".csv"
	$dt | Export-Csv $file -NoTypeInformation
}

Lastly, I added some comment-based help to the script to give a little more detail as to what each parameter does.
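A comment-based help block along these lines makes the parameters discoverable through Get-Help. The wording below is my own sketch, not the script's exact text:

```
<#
.SYNOPSIS
	Runs a .sql file against a SQL Server database and saves each
	result set as a CSV file.
.PARAMETER serverName
	The SQL Server instance to connect to.
.PARAMETER databaseName
	The database to run the script against.
.PARAMETER scriptFilePath
	Path to a .sql file containing one or more SELECT statements.
.PARAMETER saveDirectory
	Directory where the CSV files will be written.
.PARAMETER fileNames
	Optional comma-delimited list of file names, one per SELECT statement.
#>
```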

Many thanks to Louis Li for his article and enjoy! 

Download the script and some examples here: ExportToCSV.zip (2.48 kb)

If you know of any ways to make the script easier or better, let me know.