Continuing from the first step of a legacy SSIS migration project – exporting MSDB packages – the next is to export SSISDB projects. As I mentioned in the previous post, Microsoft has included a handy wizard to do this for you since SQL 2016. However, it might not be appropriate for you to use, as it wasn’t at my company – we were ditching a lot of the existing projects, editing some of the others, and wanted a clean slate.
SSISDB projects, unlike their MSDB ancestors, are relatively sophisticated beasts and exported as .ispac (project) files. They can be exported easily via SSMS one at a time – but if you need to export hundreds, it’s not as simple as dumping out a VARBINARY conversion of column data.
Thanks to the SSIS .NET API it’s still pretty easy, though. We can access the SSIS instance, enumerate the folders and projects, then call the GetProjectBytes() function on each to get the .ispac data.
The below Powershell function from the Powershell SSIS repo I’m working on takes an instance name and output directory and outputs all of the projects as .ispac files, creating a subfolder structure in the output directory mirroring the SSISDB folders.
For example, for an SSISDB with three folders – QuarterlyReports, ChurnDataPointlessly, AwfulQueryGreatestHits – each with a project called IntegrationServicesProject1, and given an output dir of D:\DBA\, the result would be the following subfolders and files;
D:\DBA\QuarterlyReports\IntegrationServicesProject1.ispac
D:\DBA\ChurnDataPointlessly\IntegrationServicesProject1.ispac
D:\DBA\AwfulQueryGreatestHits\IntegrationServicesProject1.ispac
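Until that repo function is polished, here’s a minimal sketch of the approach described above, assuming the Integration Services management assembly is present (it ships with SSMS / the SQL Server client tools). The instance name and output path are placeholders – adjust for your environment.

```powershell
# Load the SSIS management assembly (assumes it's installed locally)
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Management.IntegrationServices") | Out-Null

# Placeholders - substitute your own instance and output directory
$conn = New-Object System.Data.SqlClient.SqlConnection("Data Source=MYSSISSERVER;Integrated Security=SSPI;")
$ssis = New-Object Microsoft.SqlServer.Management.IntegrationServices.IntegrationServices($conn)
$catalog = $ssis.Catalogs["SSISDB"]

foreach($folder in $catalog.Folders)
{
    # Mirror the SSISDB folder structure under the output directory
    $dir = Join-Path "D:\DBA" $folder.Name
    New-Item -ItemType Directory -Path $dir -Force | Out-Null
    foreach($project in $folder.Projects)
    {
        # GetProjectBytes() returns the raw .ispac content as a byte array
        $ispacPath = Join-Path $dir "$($project.Name).ispac"
        [System.IO.File]::WriteAllBytes($ispacPath, $project.GetProjectBytes())
    }
}
```

This is just the core enumerate-and-export loop; the full function adds parameter validation and error handling.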
I was recently faced with the task of a fairly massive migration of SSIS packages to SQL 2017, mostly from 2012/2014 but including a few 2008. Cool! 2017 is new, not ‘shiny shiny new’ but ‘still quite shiny, not so many catastrophic early-life bugs’ new, and that’s good. But we decided against the ‘SSISDB upgrade’ feature present in SQL2016+ for reasons, so, ouch, that’s a lot of work. Obviously, I immediately did what any lazy sysadmin worth their salt would do – try to automate the crap out of it.
A few Powershell scripts later and the mission was mostly complete. So I’ll share them for the handful of DBAs potentially having to do this kind of job because their SSIS servers were running critical, bespoke software locked to the server on an ‘if it ain’t broke, don’t fix it’ adage from above.
The first one is for exporting packages from MSDB. I feel dirty saying ‘package’ and ‘MSDB’ in the same sentence. Hopefully this blog post is the last time I ever do.
The script below is a function that takes the instance name and either an output directory or DisplayOnly switch. Using the DisplayOnly switch means it just returns a list of folders and package names. Specifying an output directory exports all packages on the instance as .dtsx files by pulling the binary package data from the relevant MSDB tables.
function Export-SSISPackageMSDB
{
param
(
[Parameter(position=0, mandatory=$true)][string]$Instance,
[Parameter(position=1, mandatory=$false)][string]$OutputDir,
[Parameter(position=2, mandatory=$false)][switch]$DisplayOnly
)
Import-Module SqlServer
# Exit if OutputDir not specified or invalid when not using DisplayOnly switch
if(!$DisplayOnly)
{
if(!$OutputDir)
{
Write-Output "Error - must specify OutputDir unless using DisplayOnly switch"
return
}
if(!(Test-Path $OutputDir))
{
Write-Output "Error - invalid path specified in OutputDir"
return
}
# Sanitise path input - ensure a trailing backslash
if(!$OutputDir.EndsWith('\'))
{
$OutputDir = $OutputDir + '\'
}
}
$versionquery = "SELECT CAST(LEFT(CAST(SERVERPROPERTY('ProductVersion') AS VARCHAR(4000)),CHARINDEX('.',CAST(SERVERPROPERTY('ProductVersion') AS VARCHAR(4000))) - 1) AS DECIMAL(9,1)) AS 'Version'"
# Get SQL version, exit if cannot connect to instance to perform this query
try
{
$version = (Invoke-Sqlcmd -ServerInstance $Instance -Query $versionquery -ConnectionTimeout 3 -QueryTimeout 3 -ErrorAction Stop).Version
}
catch
{
Write-Output "Error connecting to server."
Write-Output $error[0]
return
}
# Set SSIS table names dependent on version
if ($version -gt 9)
{
Write-Debug "Version -gt 9"
$SSISFoldersTable = 'sysssispackagefolders'
$SSISPackageTable = 'sysssispackages'
}
else
{
Write-Debug "Version -le 9"
$SSISFoldersTable = 'sysdtspackagefolders90'
$SSISPackageTable = 'sysdtspackages90'
}
$PackagesQuery = "WITH cte AS
(
SELECT CAST(foldername AS VARCHAR(MAX)) AS 'FolderPath', folderid
FROM msdb.dbo.$($SSISFoldersTable)
WHERE parentfolderid = '00000000-0000-0000-0000-000000000000'
UNION ALL
SELECT CAST(c.folderpath + '\' + f.foldername AS VARCHAR(MAX)), f.folderid
FROM msdb.dbo.$($SSISFoldersTable) f
INNER JOIN cte c
ON c.folderid = f.parentfolderid
)
SELECT c.FolderPath, p.name, CAST(CAST(packagedata AS VARBINARY(MAX)) AS VARCHAR(MAX)) as 'pkg'
FROM cte c
INNER JOIN msdb.dbo.$SSISPackageTable p
ON c.folderid = p.folderid
WHERE c.FolderPath NOT LIKE 'Data Collector%'
UNION
SELECT NULL, p.name, CAST(CAST(p.packagedata AS VARBINARY(MAX)) AS VARCHAR(MAX)) AS 'pkg'
FROM msdb.dbo.$($SSISFoldersTable) f
INNER JOIN msdb.dbo.$SSISPackageTable p
ON f.folderid = p.folderid
WHERE f.folderid = '00000000-0000-0000-0000-000000000000'"
$PackagesQueryDisplayOnly = "WITH cte AS
(
SELECT CAST(foldername AS VARCHAR(MAX)) AS 'FolderPath', folderid
FROM msdb.dbo.$($SSISFoldersTable)
WHERE parentfolderid = '00000000-0000-0000-0000-000000000000'
UNION ALL
SELECT CAST(c.FolderPath + '\' + f.foldername AS VARCHAR(MAX)), f.folderid
FROM msdb.dbo.$($SSISFoldersTable) f
INNER JOIN cte c
ON c.folderid = f.parentfolderid
)
SELECT c.FolderPath, p.name
FROM cte c
INNER JOIN msdb.dbo.$SSISPackageTable p
ON c.folderid = p.folderid
WHERE c.FolderPath NOT LIKE 'Data Collector%'
UNION
SELECT NULL, p.name
FROM msdb.dbo.$($SSISFoldersTable) f
INNER JOIN msdb.dbo.$SSISPackageTable p
ON f.folderid = p.folderid
WHERE f.folderid = '00000000-0000-0000-0000-000000000000'"
Write-Output "SSIS Packages being retrieved;"
if($DisplayOnly)
{
try
{
$packages = Invoke-Sqlcmd -ServerInstance $Instance -Database msdb -Query $PackagesQueryDisplayOnly -QueryTimeout 10 -ConnectionTimeout 10 -ErrorAction Stop
$i = 0
foreach($package in $packages)
{
Write-Host $package.FolderPath "/" $package.Name
$i++
}
Write-Output "Total $($i) packages."
}
catch
{
Write-Output "Error retrieving packages."
Write-Output $error[0]
return
}
}
else
{
try
{
$packages = Invoke-Sqlcmd -ServerInstance $Instance -Database msdb -Query $PackagesQuery -MaxBinaryLength 100000 -MaxCharLength 10000000 -ErrorAction Stop
try
{
$i = 0
foreach($package in $packages)
{
$package.pkg | Out-File -Force -Encoding ASCII -FilePath ("" + $($OutputDir) + $($package.Name) + ".dtsx")
$i++
}
Write-Output "$($i) packages successfully written to $($OutputDir)."
}
catch
{
Write-Output "Error writing .dtsx files to specified location;"
Write-Output $error[0]
return
}
}
catch
{
Write-Output "Error retrieving packages from MSDB;"
Write-Output $error[0]
return
}
}
<#
.SYNOPSIS
Retrieves all SSIS packages from MSDB database.
.DESCRIPTION
Retrieves all SSIS packages from MSDB database, optionally saving the .dtsx package files to
a designated output.
.PARAMETER Instance
Specifies the SQL instance name
.PARAMETER OutputDir
Specifies the full output directory for SSIS packages to be exported to.
.PARAMETER DisplayOnly
Switch parameter that causes function to just output a list of SSIS package folders and names.
.OUTPUTS
No direct outputs from function - returns list of SSIS packages or writes .dtsx files.
.EXAMPLE
PS> Export-SSISPackageMSDB -Instance MYSQL2008SERVER -OutputDir "C:\DBA\SSIS\Export\"
Exports all SSIS packages from MYSQL2008SERVER as .dtsx files to C:\DBA\SSIS\Export\
.EXAMPLE
PS> Export-SSISPackageMSDB -Instance MYSQL2008SERVER -DisplayOnly
Displays list of SSIS packages on MYSQL2008SERVER
#>
}
On-prem, connecting to SQL Server with AD authentication from Powershell or .NET code is an easy and long-established task – Invoke-SqlCmd just works and .NET SqlConnections support Integrated Security=SSPI in a connection string – just run your code with an authorised service account et voila. With Azure SQL DB, although SQL authentication remains simple, Azure Active Directory introduces additional complexity.
I recently spent a not-insignificant amount of time figuring out the methods and limitations of doing this as part of the never-ending DBA quest for automation, not helped by somewhat patchy documentation and not being able to find a central resource on the topic. So I thought I’d try and distil this information into one post covering everything. I’ll demo examples for connecting via AAD with either Powershell or .NET (using C#). Note – I’ve omitted try/catch logic from the code snippets for clarity, of course they should be used when acquiring tokens, connecting, running queries etc.
Auth Type               | Powershell                                   | .NET (C#)
----------------------- | -------------------------------------------- | ----------------------------------
SQL                     | Yes, with SqlServer module                   | Yes, with System.Data.SqlClient
AAD – User Account      | Yes, with System.Data.SqlClient              | Yes, with System.Data.SqlClient
AAD – Service Principal | Yes, with ADAL.PS* and System.Data.SqlClient | Yes, with ADAL.NET*
AAD – User w/ MFA       | No…well, kinda                               | Yes, with Microsoft.Data.SqlClient
*I know, ADAL has been superseded by MSAL – I’ll cover that next time.
SQL Authentication
I won’t go over this as it’s not AAD and has long been well-documented. It’s also long been a security risk compared to (A)AD authentication, which is why I find it disappointing that MS have made SQL logins compulsory for parts of Azure SQL DB (exporting/importing databases), frequently used them as the primary examples of connectivity to Azure SQL DB, and lagged on AAD support in aspects like the SqlServer Powershell module or the lack of a sysadmin server role… but I digress. I daresay the ‘virtual master’ underlying architecture will have made things complicated in this regard.
AAD – User account
If you have an AAD user account without MFA enabled, you can use the existing System.Data.SqlClient class – just specify Authentication=Active Directory Password and pass a User ID and Password in the connection string. I’ll only show a Powershell example for this, as usage of the .NET classes is identical.
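A minimal sketch of that approach – the server, database and credentials below are placeholders, and the account must already exist as a user in the target database:

```powershell
# Connection string using AAD password authentication - placeholders throughout
$ConnectionString = "Server=tcp:mytestazuresqldb.database.windows.net,1433;" +
    "Initial Catalog=testdb;" +
    "Authentication=Active Directory Password;" +
    "User ID=someuser@mytenant.onmicrosoft.com;Password=NotMyRealPassword;" +
    "Encrypt=True;TrustServerCertificate=False;Connect Timeout=30"

# System.Data.SqlClient handles the AAD authentication internally - no token code needed
$SQLConnection = New-Object System.Data.SqlClient.SqlConnection($ConnectionString)
$SQLConnection.Open()
$command = New-Object System.Data.SqlClient.SqlCommand("SELECT SUSER_SNAME()", $SQLConnection)
Write-Output $command.ExecuteScalar()
$SQLConnection.Close()
```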
AAD – Service Principal
We can use a Service Principal as a pseudo-service account, secured by either a password (‘Client Secret’) or a certificate. Instructions on the ‘App Registration’ process are here – just follow the steps to create an App Registration and a Client Secret to go with it (make a note of the secret before saving!), skipping the RBAC section, as we don’t need to give the account RBAC permissions just to connect to SQL DB. Make a note of the Display Name, Application (client) ID, Client Secret and Directory (tenant) ID.
Then we need to add the account as a SQL DB user, using the EXTERNAL PROVIDER syntax. The example below assumes the Display Name of the service principal is MyTestServicePrincipal;
CREATE USER [MyTestServicePrincipal] FROM EXTERNAL PROVIDER
Now to connecting…
Powershell
To connect using a Service Principal with Powershell you’ll need to install the ADAL.PS module, a community-developed wrapper for the ADAL.NET library. Github and Powershell Gallery links.
Import-Module ADAL.PS
Import-Module SQLServer
# Variables for Azure AD Authentication.
# The Tenant ID is available in 'Tenant Properties' in the Azure Portal or retrievable via Get-AzSubscription for the required subscription.
# The Client ID and Client Secret are taken from the Service Principal under App Registrations in the Portal.
# The SQLResourceURL is standard for any Azure SQL DB connection, at least for now.
$TenantID = 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx'
$ClientID = 'cccccccc-cccc-cccc-cccc-cccccccccccc'
$ClientSecret = 'verylongsecretkey'
$SQLResourceURL = 'https://database.windows.net/'
# Combine the standard AAD authorisation URL with Tenant ID to get the correct URL for authentication
$AADAuthorityURL = 'https://login.windows.net/' + $TenantID
# Client Secret needs to be in secure form, so convert from plain text
$ClientSecretSecure = ConvertTo-SecureString $ClientSecret -AsPlainText -Force
# Create a standard ConnectionString
$SQLServerURL = "mytestazuresqldb.database.windows.net"
$Database = "testdb"
$ConnectionString = "Server=tcp:$($SQLServerURL),1433;Initial Catalog=$($Database);Persist Security Info=False;Connect Timeout=30;Encrypt=True;TrustServerCertificate=False"
# Acquire an AccessToken
$AuthResult = Get-AdalToken -Authority $AADAuthorityURL -Resource $SQLResourceURL -ClientId $ClientID -ClientSecret $ClientSecretSecure
$token = $AuthResult.AccessToken
# Create a SqlConnection using the connection string and our AccessToken
$SQLConnection = New-Object System.Data.SqlClient.SqlConnection($ConnectionString)
$SQLConnection.AccessToken = $token
# Connect to Azure SQL DB
$SQLConnection.Open()
$query = "SELECT * FROM sys.databases"
$command = New-Object System.Data.SqlClient.SqlCommand($query, $SQLConnection)
$reader = $command.ExecuteReader()
while($reader.Read())
{
Write-Output $reader[0]
}
C#
To connect with C# we can just use the ADAL.NET library itself. Github and NuGet links.
using System;
using Microsoft.IdentityModel.Clients.ActiveDirectory;
using Microsoft.Data.SqlClient;
namespace AzureSQL_ServicePrincipal_Test
{
class Program
{
static void Main(string[] args)
{
string authorityURL = "https://login.windows.net/";
string resourceURL = "https://database.windows.net/";
// SQL DB and Service Principal variables
string serverName = "mytestazuresqldb.database.windows.net";
string databaseName = "testdb";
string clientID = "cccccccc-cccc-cccc-cccc-cccccccccccc";
string tenantID = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx";
string clientSecret = "verylongsecretkey";
// Add Tenant ID to authority URL
authorityURL += tenantID;
string sqlConnectionString = String.Format("Data Source=tcp:{0},1433;Initial Catalog={1};Persist Security Info=False;Connect Timeout=30;Encrypt=True;TrustServerCertificate=False", serverName, databaseName);
// Create an AuthenticationContext from our authorisation URL, and a ClientCredential from our Client ID and Secret
AuthenticationContext authenticationContext = new AuthenticationContext(authorityURL);
ClientCredential clientCredential = new ClientCredential(clientID, clientSecret);
Console.WriteLine("Trying to acquire token...");
// Call AcquireTokenAsync to get our token
AuthenticationResult authResult = authenticationContext.AcquireTokenAsync(resourceURL, clientCredential).Result;
Console.WriteLine("Token acquired.");
// Now try to connect to Azure SQL
using (var sqlConn = new SqlConnection(sqlConnectionString))
{
sqlConn.AccessToken = authResult.AccessToken;
sqlConn.Open();
var sqlCmd = new SqlCommand("SELECT * FROM sys.databases;",sqlConn);
var reader = sqlCmd.ExecuteReader();
while(reader.Read())
{
string result = reader.GetString(0);
Console.WriteLine(result);
}
}
Console.WriteLine("Success!");
Console.ReadKey();
}
}
}
AAD with MFA
Multi-factor authentication introduces a prompt to login to your Azure tenancy with some sort of additional authentication measure like a text to a mobile phone.
Powershell
I’ve not found an easy way to connect via Powershell using MFA. It’s not supported by the SqlServer module’s Invoke-SqlCmd, nor by System.Data.SqlClient. It appears to be possible with the new Microsoft.Data.SqlClient assembly (see the C# example below), but there is no easy way to reference this library from Powershell due to its dependencies.
C#
Note that at the point of opening the connection, a browser window should open where we can enter MFA credentials. Once this is complete the rest of the code will execute.
using System;
using Microsoft.Data.SqlClient;
using System.Text;
namespace ConnectToAzureSQLDB
{
class Program
{
static void Main(string[] args)
{
Console.WriteLine("ConnectToAzureSQLDB");
SqlConnectionStringBuilder builder = new SqlConnectionStringBuilder();
builder.DataSource = "mytestazuresqldb.database.windows.net";
builder.InitialCatalog = "master";
builder.Authentication = SqlAuthenticationMethod.ActiveDirectoryInteractive;
builder.ConnectTimeout = 30;
Console.WriteLine(builder.ConnectionString);
Console.WriteLine("Connecting to server - authentication prompt should open in browser window...");
using (SqlConnection sqlconn = new SqlConnection(builder.ConnectionString))
{
SqlCommand sqlcmd = new SqlCommand("SELECT * FROM sys.databases", sqlconn);
sqlconn.Open();
var reader = sqlcmd.ExecuteReader();
while (reader.Read())
{
Console.WriteLine(reader.GetString(0));
}
}
Console.WriteLine("Success");
Console.ReadKey();
}
}
}
And that’s enough connecting to Azure SQL for today. I’ll update/create a further post on this topic about using the new Active Directory authentication library MSAL.NET – and its Powershell wrapper module – once I’ve taken a proper look at it.
COVID-19, in comparison to previous pandemic diseases (Swine Flu, Bird Flu etc), sounds more like a computer virus than a mammalian one. It sounds like a virus Richard Morgan might write about in one of his cyberpunk novels, that is, a software virus that infects the ‘wetware’ of the inevitable man-machine hybrid we will become, with all the new horror that entails.
Which is why I find it apt that it has forced a culture of remote working on those for whom it’s possible, bringing us one step closer to a cyberpunk uto/dystopia. If there are to be silver linings from this pandemic, I hope a persistent change in working practice is one of them.
Many people will have embraced the change willingly, and many will have already worked a day or two per week remotely, but there will also have been many managers anxious at losing physical visibility and face-to-face contact. “If I can’t see them, they aren’t going to do any work!!” I hear them cry. Even for roles without easily measurable KPIs – tickets closed, backlog items/cases completed etc – there must be some deliverables that can be measured; if there exists a job someone could completely slack off on without anyone ever finding out, I’d say the need for that job to exist should be looked at closely. The future is here, and the concept of work as a location, as opposed to an activity, must change.
There are drawbacks of 100% remote working, for sure. It’s absolutely not ideal for new staff who benefit greatly from intense contact with coworkers as they form their future working relationships (though with improvements in VR this will eventually cease to be relevant). It’s not good for humans – social beings that we are – to be completely physically isolated, particularly if one happens to live alone or have few friends. And depending on available remote tooling – multiple screens, limitations of backend VPNs or RDP, phonelines – some may struggle to be as productive as in the office. And oh yeah, office romances? Good luck with that over Teams, though that’s probably for the best in many cases…
But for established, well-equipped workers who don’t have a need for the social aspect of work, full-time or mostly-full-time remote working makes sense. The benefits to people and the environment are myriad, here are some that spring to mind.
Saving time. No more travel to work – and some people commute for multiple hours per day. Time is truly precious – the only resource that can never be regained, only lost.
Better health. With more free time and no commute, stress levels drop, potentially leading to better mental health and more time to improve physical health – either by exercise or using some of that sweet free time to cook healthier food.
Saving the environment. No need to burn that litre of petrol/diesel just to do something you could have done in your underwear. And you can keep that trusty old car for years longer. This effect could be enormous.
Saving money. Either no public transport cost, or less car maintenance/fuel cost. Hey, maybe you can even get rid of the car? Or swap it for a motorbike. They’re fun, trust me.
Cutting traffic. Ever wanted to go somewhere on a day off and had to remember not to be anywhere near a road by 5 PM? Yeah, we have terrible traffic, traffic that has got worse over time as more cars join the roads but the roads stay the same size, but no longer. This would also make the commute for people who can’t work remotely much more bearable.
Less risk of burglary. You’re in the house all day, which would put off all but the most reckless good-fer-nothin’s from stealing your stuff. And if someone does have a go, that’s OK because you spent some of that saved travel money on medieval weaponry. “Yes your honour, I have always kept a claymore next to my PC”
Company saves money. No need to rent, electrify, heat, clean, secure and insure that huge, fancy office building for six or seven figures per year. Portacabins are cheap.
Cheaper car insurance. If you’re keeping the car but don’t need to commute anymore, car insurance will be cheaper. …Haha, just kidding. Car insurance will never be cheaper.
Location independence. This is a big one. Stuck renting an attic cupboard for a grand a month in London because the office is a mere 3 train journeys away? Imagine if you could keep your job but live cheaply, hundreds of miles away. Or what if you’ve always wanted to live in France but can’t speak the language beyond ‘deux croissants, s’il vous plaît’? Remote working could make that dream a reality.
Better equipment. Hey, work? The 90s called – they want their screen size back. We all know that a typical company probably spends more on design consultancy for a new website font than on quality computer hardware. I personally enjoy a spacious 144hz monitor and a delightful mechanical keyboard at home – a much better experience than squinting at a 22″ screen as the third owner of a Dell OfficeDronePro keyboard in the office. With all the money they’ve saved downsizing the office, they can afford to buy you some decent kit.
At some point in the future, the closest thing to ‘going to work’ will be putting on a sleek VR headset – or plugging a cable into the base of your neck if Elon Musk has anything to do with it – and being in a life-like virtual office. It could be locationless, or the company could change the ‘theme’ at will – ‘This month we’re in Tokyo – if you look out of the window from the fourth floor next to the Network team you can see the Imperial Palace’.
For now, not having to put pants on will have to do.
I recently had an issue sending mail for the first time from a SQL 2016 server – mails were queuing with sent_status = unsent in msdb.dbo.sysmail_allitems.
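A quick way to spot the symptom – a hedged one-liner assuming the SqlServer module is installed and the instance name is a placeholder:

```powershell
Import-Module SqlServer

# List mails stuck in the queue - sent_status stays 'unsent' when Database Mail is broken
Invoke-Sqlcmd -ServerInstance "MYSQL2016SERVER" -Database msdb -Query @"
SELECT mailitem_id, subject, sent_status, send_request_date
FROM dbo.sysmail_allitems
WHERE sent_status = 'unsent'
ORDER BY send_request_date;
"@
```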
A search led to a bug relating to SQL 2016 being installed on a server without .NET 3.5 – indeed, the server in question had .NET 4.6.x but not 3.5. I performed the workaround listed below, then restarted SQL Agent (a step not mentioned in the article), and the mails were sent immediately;
Create the DatabaseMail.exe.config and drop it next to the DatabaseMail.exe under the Binn folder. You can use notepad.exe or any other editor to edit it. Just make sure you save it using UTF-8 encoding (in notepad.exe, select Save As… and in the Encoding combo box, select UTF-8):
<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <startup useLegacyV2RuntimeActivationPolicy="true">
    <supportedRuntime version="v4.0"/>
    <supportedRuntime version="v2.0.50727"/>
  </startup>
</configuration>
The reason for this post is that in the KB article, MS says the bug was fixed in RTM CU2 and SP1 CU2. However, the server I had the problem with was on SP2 CU3. So either the bug wasn’t fixed until later or, more likely, applying CU2 or later doesn’t retrospectively fix the problem – it just prevents it on a fresh install of CU2 or later, and you still need to fix/create the DatabaseMail.exe.config if you already had the problem.