How To Manage SQL Server Job Failure Emails with MS Outlook Conditional Formatting

Some scheduled jobs send failure emails upon an error, and DBAs need to remember to fix the job and re-execute it. I use filter rules in MS Outlook to organize my inbox by redirecting all the scheduled job emails to a separate folder. In addition to that, I use conditional formatting rules to keep track of action items as pending or done. The conditional formatting rules highlight the job failure emails in red and turn them green when they are marked complete, as shown in the following image.

JobEmailsAsTasksExpected

Create a Conditional Formatting Rule to Highlight a Failed Job Email in Red:

Navigate to the folder where you would like to modify the view. You can even customize your main Inbox folder view, if that is where your emails are. Right-click on the header row and go to View Settings.

JobEmailsAsTasks1

Click on Conditional Formatting.

JobEmailsAsTasks2

In the Conditional Formatting window, click on Add and rename the new untitled rule. Then change the font color to red and click on Condition.

JobEmailsAsTasks3

In the Filter window, put a sufficiently unique text phrase from the error email that can distinguish it from good emails. Also specify in the drop-down where exactly the filter should look for the text phrase, i.e. in the email subject only, in the email body, or in both.

My emails have the following text in the email body that I can use for identification –

STATUS:  Failed

Now this is a simple example with a text phrase for filtering, but you can use other criteria in the More Choices and Advanced tabs for more complex scenarios as necessary. If you get emails with different text patterns, you can always create a new rule for each phrase.

JobEmailsAsTasks4

That is it. Click on OK to come back to your inbox and verify that the error emails have a red font.

Create a Conditional Formatting Rule to Highlight a Failed Job Email in Green After Resolution:

Now we want another rule that will change the red email to a green one when it is marked as complete. Add another Conditional Formatting rule, rename it, change the font color to green, and click on Condition.

JobEmailsAsTasks5

In the Filter window, follow the same steps as in the previous red rule. Then go to the More Choices tab.

JobEmailsAsTasks6

In the More Choices tab, check the Only items which: box and select the drop-down value are marked complete.

JobEmailsAsTasks7

Back in the Conditional Formatting window, move the green rule up. This is important; otherwise the red rule will override the effects of the green rule.

JobEmailsAsTasks8

The final window will look like this.

JobEmailsAsTasks9Final

Now whenever the issue from the failure email is resolved, just right-click on the email, go to Follow Up and select Mark Complete. The email will turn green.

JobEmailsAsTasks10

How To Connect SSMS to ALWAYSON Read-Only Secondary Database

The databases on the PRIMARY replica of an availability group can be used for read-write access. The databases on a SECONDARY replica can be used for read-only access only.

An attempt to connect to a database on a SECONDARY replica with a normal connection, which is read-write by default, shows the following error message –

Msg 978, Level 14, State 1, Line 1

The target database ('AGDemoDB') is in an
availability group and is currently accessible
for connections when the application intent is
set to read only. For more information about
application intent, see SQL Server Books Online.

To resolve the issue, the connection string needs to have the ApplicationIntent=ReadOnly parameter. But how do you pass connection string parameters in an SSMS connection?

SSMS has many options that are not too obvious. One of them is the ability to provide additional connection parameters. All that is needed to resolve the above error is to go to the Additional Connection Parameters tab of the connection dialog and put the parameter there.

AG_ReadOnlyIntent01

The keyword should not have any spaces.

AG_ReadOnlyIntent02
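Outside SSMS, the same keyword goes into the client connection string. A minimal sketch, assuming an availability group listener named AGListener and the demo database from the error above –

Server=tcp:AGListener,1433;Database=AGDemoDB;Integrated Security=SSPI;ApplicationIntent=ReadOnly

The sqlcmd utility exposes the same setting through its -K switch, e.g. sqlcmd -S AGListener -d AGDemoDB -E -K ReadOnly.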

Further reading:

The AlwaysOn Professional MSDN blog has more examples of connection strings for various applications.

You may also like to review the Application Intent Filtering feature of AlwaysOn at this and this link.

How to Avoid Orphan Users in SQL Server ALWAYSON, Create Logins Correctly

SQL Server logins are stored in the [master] database. System databases (master, model, msdb and tempdb) cannot be included in an availability group so a login created on the PRIMARY replica will not show up on the SECONDARY replica automatically. It has to be created manually on the SECONDARY replica.

Logins have an SID associated with them. When a Windows authentication login is created in SQL Server, it uses the SID from the Windows Active Directory. So the SID for such a login will be the same on every SQL Server in the network domain.

When a SQL authentication login is created, it gets a new auto-generated SID assigned by the SQL Server. This auto-generated SID will be different in each SQL Server even if the login name and the password combination are the same.

The database users are mapped to the logins internally using the SID, not the login name/user name. There are some situations where the SID may have a mismatch, e.g. when a database is restored to a different server where the supposedly matching login already exists, or a login is dropped and recreated without consideration for the mapped users, or a login is recreated between a database detach and reattach, etc. Users that do not have a login with a matching SID are known as orphan users. This SID mismatch means that although an application or a user can connect to the server using the login, it cannot access the database.
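A quick way to spot orphan users is to look for database users whose SID has no matching server login. A minimal sketch, run in the user database (the filter on built-in principals is illustrative) –

-- List database users whose SID does not match any server login
SELECT dp.name AS orphan_user, dp.type_desc, dp.sid
FROM sys.database_principals AS dp
LEFT JOIN sys.server_principals AS sp
    ON dp.sid = sp.sid
WHERE sp.sid IS NULL
AND dp.type IN ('S', 'U')  -- SQL and Windows users
AND dp.name NOT IN ('guest', 'dbo', 'INFORMATION_SCHEMA', 'sys');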

The following options, with varying degrees of effectiveness, can be used to fix the SID mapping between a login and an orphan user –

  • Drop and recreate the user in the restored database. Of course the user permissions will get deleted too and have to be granted again.
  • Drop and recreate the login with the same SID as in the restored database. This is a definite no-no if there are other databases on the server linked to that login. It will only complicate matters.
  • Run the system stored procedure sp_change_users_login. It has parameters to just report, fix one, or fix all orphans. But the stored procedure is now marked as deprecated, so there are no guarantees of future availability.
  • ALTER the user (there are some restrictions, like there cannot be a one-to-many mapping, etc.) –
    USE [MyUserDBName]
    GO
    ALTER USER someuser WITH LOGIN = somelogin
    

    As the user databases on the SECONDARY replica are read-only, the role of the SECONDARY server has to be changed to PRIMARY by doing a failover before the above ALTER command can be executed.

All of this can be avoided if the login is created correctly on the SECONDARY replica. We just have to make sure that the SID for the login on the SECONDARY replica matches the PRIMARY replica.

Let us begin with creating a new login on the PRIMARY replica of an existing Availability Group –

/* On the PRIMARY replica */

-- Create the Login
USE [master]
GO
CREATE LOGIN [TestLogin]
WITH PASSWORD=N'abc123#'
, DEFAULT_DATABASE=[master]
GO

Grant privileges to this login if necessary. They will NOT automatically replicate to the SECONDARY replica. Now find the SID of this new login.

-- Get the SID for the new Login
SELECT name, sid 
FROM sys.server_principals 
WHERE name = 'TestLogin'

/* Results:
name          sid
TestLogin     0x8EA0E033BD83524180CF813A20C5265B
*/

On the SECONDARY replica, create the login with the same SID. The GUI wizard to create logins does not have an option to specify the SID, so the login has to be created using TSQL with an additional parameter.

/* On the SECONDARY replica */
-- Create the Login with the same SID as
-- the PRIMARY replica
CREATE LOGIN [TestLogin]
WITH PASSWORD=N'abc123#'
-- use the SID retrieved above
, SID = 0x8EA0E033BD83524180CF813A20C5265B
, DEFAULT_DATABASE=[master]
GO

Grant the same privileges to this login as done on the PRIMARY replica.

Now go back to the PRIMARY replica, create a database user mapped to the login, and grant the required permissions at the database level. This new database user will be automatically replicated to the SECONDARY replica with its permissions and will correctly map to the login. No action is required on the SECONDARY because the user database is in an Availability Group that is synced across replicas.
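For example, on the PRIMARY replica (the database name and the role membership are illustrative) –

/* On the PRIMARY replica */
USE [AGDemoDB]  -- a database that belongs to the Availability Group
GO
-- Create the database user mapped to the login
CREATE USER [TestLogin] FOR LOGIN [TestLogin]
GO
-- Grant whatever database-level permissions are needed, e.g.
ALTER ROLE [db_datareader] ADD MEMBER [TestLogin]
GO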

Further reading:
For the sake of completeness, I must mention the widely cited KB 918992 article (How to transfer logins and passwords between instances of SQL Server) which provides a stored procedure [sp_help_revlogin] to move the logins from one server to another. This stored procedure generates the CREATE LOGIN script with the password hash and the SID. You would need that stored procedure only if you don’t have access to the clear text passwords or if you want to include that script in a scheduled job but not hardcode the password in the job. If you do not have those constraints then you can simply use the steps described in my blog post here.

How to Concatenate SSIS Variables with a For Loop Container

Variable value concatenation in SSIS is usually done with a Script Task. In this post I am going to demonstrate variable concatenation with a For Loop Container while working on a real-world scenario of sending all files from a folder as email attachments.

Drag a ForEach Loop Container into the Control Flow. Then drag a For Loop Container inside the ForEach Loop Container. There are no other tasks in these two containers. Finally, add a Send Mail Task to the Control Flow and join it to the ForEach Loop Container with a precedence constraint.

The final package looks like this.

Concat1-Package

The ForEach Loop Container iterates through all files in a folder and returns their full paths. The For Loop Container concatenates the file paths into a variable. The Send Mail Task uses the concatenated file paths as the attachment list in an email.

Declare two variables of string type at the package level. Variable @[User::FELC_Iterate] will hold the single file path returned by the ForEach Loop. Variable @[User::FLC_Concat] will hold the concatenated list of file paths passed from the other variable.

Concat2-Variables

Double-click the ForEach Loop Container to open the properties editor. In the Collections tab, give the folder path and select the radio button for fully qualified file name. You can also set a filter for the file types (e.g. *.txt, *.xls etc.). I left it at its default of all files (i.e. *.*).

Concat3-FELCEditorCollectionsTab

In the Variable Mappings tab, set the variable @[User::FELC_Iterate] to index zero so that it can hold the file paths returned by the ForEach Loop Container. Click OK to close the editor.

Concat4-FELCEditorVariableMappingsTab

Now double click the For Loop Container to open its properties editor. Set the InitExpression to –

@[User::FLC_Concat] = @[User::FLC_Concat] + @[User::FELC_Iterate] + "|"

This expression appends the @[User::FELC_Iterate] value coming from the outer ForEach Loop, along with a pipe symbol, to @[User::FLC_Concat] on each iteration of the outer loop. The pipe symbol is the separator required by the Send Mail Task when multiple files are attached.

The concatenation happens in the InitExpression, which runs once every time the For Loop Container starts, so the loop body does not need to iterate at all. Set EvalExpression to 1<0, or any other condition that always evaluates to FALSE. The container checks this condition before each pass, so the (empty) loop body never executes and the container finishes right after the InitExpression. Click OK to close the editor.

Concat5-ForLoopEditor

Double-click the Send Mail Task to open its editor. Set up the SMTP connection and other fields in the Mail tab. In the Expressions tab, add an expression for FileAttachments that uses the @[User::FLC_Concat] variable. Click OK to close the editor.

Concat6-SendMailTaskEditorExpressions
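In text form, the property expression is simply the concatenation variable –

FileAttachments : @[User::FLC_Concat]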

Execute the package and the file paths are concatenated as shown in the Locals window. These files are then emailed as attachments.

Concat7-LocalsWindowVariableValues

How to Log SSIS Variable Values During Execution in the Event Log

Log entries created during the execution of an SSIS package help in monitoring, analysis, and issue resolution. In addition to logging events, you might want to capture the run-time values of variables in the package. I couldn’t find a native feature to log SSIS variable values, so this post shows how I did it.

The Setup

For this demo, I’ll do an INSERT operation on a table named TableA and use variables to save the before and after INSERT row counts. TableA is always blank at the beginning of the package, i.e. the row count is zero. The Execute SQL Task named ESQLT-InsertRowsInTableA inserts 502 rows into TableA. The simple package looks like the image below. If you are wondering about the funny prefixes in the object names, you can read about my naming conventions in my other blog post.

Log SSIS Variable Value 1 - Package

I have two integer variables named rcTableA_PreRefresh and rcTableA_PostRefresh scoped at the package level.

Log SSIS Variable Value 2 - Variables Init Zero
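For context, each row count is captured by an Execute SQL Task that runs a query along these lines, with the single-row result mapped to one of the two variables (the exact query is an assumption for this demo) –

-- Result mapped to User::rcTableA_PreRefresh before the insert,
-- and to User::rcTableA_PostRefresh after the insert
SELECT COUNT(*) AS RowCountTableA FROM dbo.TableA;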

Enable the Event

A log entry is generated when an event is triggered. Each object in SSIS has its own events that can be logged. I’ll use the event called OnVariableValueChanged, which, as the name denotes, is triggered whenever the value of a variable changes. This event is disabled by default. To enable it, go to the Properties window of the variable and set the RaiseChangedEvent property to True. It must be enabled for each variable individually.

Log SSIS Variable Value 3 - Variable Properties

Next, I include the OnVariableValueChanged event in the logging configuration. It has to be included at the container level at which the variables are scoped; in my case, the package level. I’m using the SSIS Log Provider for SQL Server in this package.

Log SSIS Variable Value 44 - Configure Log

Then I execute the package and look at the [dbo].[sysssislog] table for the log entries.
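Something along these lines, assuming the default SSIS logging table in the logging database –

SELECT id, event, source, starttime, message
FROM [dbo].[sysssislog]
ORDER BY id;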

Log SSIS Variable Value 5 - Log Entries 1

There are some log entries but something is missing. I see the OnVariableValueChanged event logged for the Post Refresh variable but not the Pre Refresh variable.

The reason is that the initial value of the variable is set to zero in the package. The row count of a brand new empty table is also zero. So there was no change in variable value. The OnVariableValueChanged event fires only when the value actually changes! Overwriting with the same value doesn’t fulfill this condition.

To resolve that, I change the initial values in the package to -1. Now even if the row count turns out to be zero, the variable value will still change from -1 to zero. The COUNT function can’t count below zero, can it?

Log SSIS Variable Value 6 - Variables Init Minus One

I run the package again and check out the [dbo].[sysssislog] table.

Log SSIS Variable Value 7 - Log Entries 2

Things are better. The OnVariableValueChanged events for both variables show up in the log. But the variable values are still not there.


The reason for the missing values is that the event logging just captures the fact that the variable value changed. It doesn’t capture the value itself. I’ll make an addition to the event handler to get the variable values too: I add an Execute SQL Task to the package-level event handler for the OnVariableValueChanged event.

Log SSIS Variable Value 8 - Event Handler

The General tab of the Execute SQL Task has the following properties and SQL command –

Log SSIS Variable Value 8 - Event Handler - ESQLT 1

INSERT INTO [dbo].[sysssislog]
([event]
,[computer]
,[operator]
,[source]
,[sourceid]
,[executionid]
,[starttime]
,[endtime]
,[datacode]
,[databytes]
,[message])
VALUES
('*SSIS-OnVariableValueChanged' -- Custom event name
,? -- param 0
,? -- param 1
,? -- param 2
,? -- param 3
,? -- param 4
,? -- param 5
,? -- param 6
,0 -- Zero
,'' -- Blank string
,?) -- param 7

Notice that I precede the custom event name with an asterisk to differentiate it from the log entries created by the system.

The Parameter Mapping tab of the Execute SQL Task has the following properties –

Log SSIS Variable Value 9 - Event Handler - ESQLT 2

Pay attention to the System::VariableValue (last variable) in this screen. Its data type is LONG, which is appropriate for the numeric row counts in my example. You may have different data types for your variables. Do adjust the data type and length appropriately. Using a wrong type could lead to no value logged at all.

I run the package again and this time, the variable values are also logged in the table.

Log SSIS Variable Value 10 - Log Entries 2

Summary

A single event handler will take care of all variables in that scope. In my case, two package-scoped variables are handled by a single package-level event handler.

The variable value really has to change to fire the event.

The OnVariableValueChanged event is triggered for the container that has the variable in its scope. The container triggering the change in variable value could be different from the container that has the variable in its scope. In my demo, the variables were scoped to the package. Even though an Execute SQL Task changes the variable values, I still put the event handler at the package level. As another example, assume there is a variable declared in the scope of a ForEachLoop container and there is a Script Task inside the ForEachLoop. The Script Task changes the variable value. The OnVariableValueChanged event will be triggered for the ForEachLoop container.

I have used the default [sysssislog] logging table to log my variable values. You can easily use a different custom table by changing the OLEDB connection and making appropriate changes to the INSERT statement.
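An illustrative sketch of such a custom table (the names and data types are assumptions; the INSERT in the event handler would have to be changed to match) –

CREATE TABLE dbo.SSISVariableValueLog
(
 log_id         int IDENTITY(1,1) NOT NULL PRIMARY KEY
,event_name     nvarchar(128)     NOT NULL
,machine_name   nvarchar(128)     NULL
,execution_id   uniqueidentifier  NULL
,variable_name  nvarchar(128)     NULL
,variable_value nvarchar(4000)    NULL
,logged_at      datetime          NOT NULL DEFAULT (GETDATE())
);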


TSQL Gotcha – Order of expressions for a range search with BETWEEN

Do you trust your users to always pass range search parameters in the correct order? Or do the users trust that the code will take care of a small thing like parameter ordering?

If you have a stored procedure or a script that accepts two parameters to do a range search using the BETWEEN keyword (or maybe with >= and <=), then it is important to verify that the start expression and the end expression are in the correct order. The correct order for numeric values is the smaller value before the larger value. The correct order for character values is dictionary order. A user might pass the expression values the wrong way around and see no results returned.

The following demonstration includes a small check for the correct order of values and a quick reorder to fix any issue. To help things a bit more, instead of using generic variable names like @param1 and @param2, the parameters can be made self-documenting with descriptive names like @begin_param and @end_param.
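A minimal sketch of such a range-search procedure with descriptive parameter names and the reorder check (the table and column names are illustrative) –

create procedure dbo.GetValuesInRange
	@begin_param int,
	@end_param int
as
begin
	-- reorder the parameters if they were passed the wrong way around
	if @begin_param > @end_param
	begin
		declare @temp int
		set @temp = @begin_param
		set @begin_param = @end_param
		set @end_param = @temp
	end

	select col1
	from dbo.DemoTableNumeric
	where col1 between @begin_param and @end_param
end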

Demonstration with numeric values:

declare @DemoTableNumeric table 
(col1 int)

-- Insert 10 rows in the demo table
insert into @DemoTableNumeric (col1)
values 
(1),(2),(3),(4),(5),
(6),(7),(8),(9),(10)

-- verify data
select * 
from @DemoTableNumeric

declare 
@param1 int, 
@param2 int

-- Assign values
-- Note: Param1 > Param2
select 
@param1 = 7, 
@param2 = 4

-- The following return zero rows
-- because the first expression is
-- greater than the second expression
-- It is a wrong order of values.
select * 
from @DemoTableNumeric
where 
col1 between @param1 and @param2

select * 
from @DemoTableNumeric
where 
col1 >= @param1 and col1 <= @param2

-- It is important to verify the expression
-- values and reorder them if necessary
-- The following IF condition does that
if @param1 > @param2
begin
	declare @temp int
	set @temp = @param1
	set @param1 = @param2
	set @param2 = @temp
end

-- Now both queries return rows
select * 
from @DemoTableNumeric
where 
col1 between @param1 and @param2

select * 
from @DemoTableNumeric
where 
col1 >= @param1 and col1 <= @param2

Demonstration with character values:

declare @DemoTableChar table 
(col1 varchar(10))

insert into @DemoTableChar (col1)
values
('Alpha'),
('Golf'),
('Kilo'),
('Oscar'),
('Tango'),
('Zulu')

-- verify data
select * 
from @DemoTableChar

declare 
@param1 varchar(10), 
@param2 varchar(10)

-- Note: Param1 > Param2
select 
@param1 = 'Tango', 
@param2 = 'Golf'

-- This returns zero rows
-- because the first expression is
-- greater than the second expression
-- It is a wrong order of values.
select * 
from @DemoTableChar
where 
col1 between @param1 and @param2

select * 
from @DemoTableChar
where 
col1 >= @param1 and col1 <= @param2

-- It is important to verify the expression
-- values and reorder them if necessary
if @param1 > @param2
begin
	declare @temp varchar(10)
	set @temp = @param1
	set @param1 = @param2
	set @param2 = @temp
end

-- Now both queries return rows
select * 
from @DemoTableChar
where 
col1 between @param1 and @param2

select * 
from @DemoTableChar
where 
col1 >= @param1 and col1 <= @param2

Suppress the Error Number, Severity Level and State Number in the Error output

If, for any reason, you don’t want to show the Error Number, Severity Level and State Number along with the Error Message, use severity level 0 (zero) or 10. Severity 10 is converted to zero internally.

Demonstration:

A severity number other than 0 or 10 displays the Error Number, Severity and State information.

SuppressErrorNumber1

A severity number 0 or 10 suppresses the Error Number, Severity and State information.

SuppressErrorNumber2
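You can see this behavior with RAISERROR, for example (the message text is illustrative) –

-- Severity 16: the output includes the Msg/Level/State header line
RAISERROR('The nightly load failed.', 16, 1);

-- Severity 10 (or 0): only the message text is returned
RAISERROR('The nightly load failed.', 10, 1);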

Further Reading:

Database Engine Error Severities
http://msdn.microsoft.com/en-us/library/ms164086(v=sql.105).aspx

How to Use SSIS to Export Clickable URLs to Excel

When SSIS is used to export data to an MS Excel workbook, Excel tries to guess the data type from the first few rows of data. (Tip: the IMEX=1 setting in the Excel connection string's Extended Properties tells the driver to treat intermixed data as text, and the number of rows scanned for type guessing is controlled by the TypeGuessRows registry setting.) As it happens, Excel cannot always win the guessing game (or maybe it just plays it safe) and treats data as plain text. Sometimes even numbers and URLs are treated as text and an apostrophe is prepended to the data value.
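A typical Excel connection string with that setting might look like this (the file path and provider version are illustrative) –

Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Temp\URLDemo.xlsx;Extended Properties="Excel 12.0 XML;HDR=YES;IMEX=1";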

If you have a requirement to export hot (i.e. clickable) URLs to Excel, you can use the workaround that I am going to demonstrate below.

We will use the following SQL table. The sample data is intentionally in a bad format. We will fix it with a query –

create table URLDemo
(
 id int identity
,url varchar(100)
,friendlyname varchar(100)
);

insert into URLDemo
(url, friendlyname)
values
 ('www.google.com', 'Google')
,('http://www.yahoo.com', 'Yahoo')
,('bing.com', NULL);
-- Data in table ------------------------
select id, url, friendlyname 
from URLDemo

-- Clean data with query ----------------
select
id,
convert(varchar(100),
case
when left(url, 4) = 'www.' then 'http://'+url
when left(url, 11) <> 'http://www.' then 'http://www.'+url
when left(url, 7) <> 'http://' then 'http://'+url
else url
end
) as url,
convert(varchar(100),
case
when isnull(friendlyname, '') = '' then url
else friendlyname
end
) as friendlyname
from URLDemo

ExcelURLDemo1

The Clean Data query attempts to add any missing http://www. prefix to the URL and to fall back to the URL as the friendly name when one is missing. Both will be useful later for the Excel HYPERLINK formula.

The problem –

We can export this data to Excel and even try to encode the Excel HYPERLINK formula as a derived column, but the URL and the formula will show up as plain text, that is, they will not be clickable.

The workaround –

Create an Excel workbook that will be the template for the data export. Type in the column names [id], [url] and [friendlyname] in columns A, B and C respectively. Put the following Excel formula in column D –

=IF( AND(B1<>"", C1<>""), HYPERLINK(B1,C1), "")

Drag this formula cell down for as many rows as appropriate for your expected data volume. I did it for the first 20 rows. The idea here is to populate the [url] and [friendlyname] columns via SSIS and let the formula create a clickable link for us.

ExcelURLDemo2

Then select cells A1 to C1 and give this range a name, e.g. MyDataRange.

ExcelURLDemo3

Now create an SSIS package. Add a Data Flow Task and double-click to open its designer. Add an OLE DB Source in the Data Flow designer. The query in its editor is the Clean Data query above.

ExcelURLDemo4

Add a Data Conversion transformation. It will be used to convert the non-Unicode (VARCHAR) columns, namely [url] and [friendlyname], to Unicode. Excel likes to have character data in Unicode.

ExcelURLDemo5

Add an Excel Destination. It will use the data range name. Verify the column mappings.

ExcelURLDemo6

A bit of gyaan (background): Excel tries to identify the last “dirty” row in the worksheet and puts new data from SSIS in the next row. In my case it will be row 21 because I dragged the formula till row 20. Even if cells once used are emptied by deleting the values/formulas, they are still considered “dirty”. The only way to remove the “dirty” flag from a cell is to delete the whole row! If the package puts data in row 21 and beyond, we miss out on the formulas that we put in the first 20 rows. Using the named range tells Excel to look for dirty cells only within the range, thus allowing us to use the formulas in the adjacent cells.

Now verify the column mappings. Note that Excel columns [url] and [friendlyname] are mapped to the [urlWSTR] and [friendlynameWSTR] of the Data Conversion.

ExcelURLDemo7

That is it, execute the package and click away at the URLs!

ExcelURLDemo8

Example of SSIS Foreach Loop Item Enumerator and Dynamic OLEDB Connection

The Foreach Loop Container is one of the looping constructs available in SSIS packages. One of its enumerator options is the Foreach Item Enumerator. It enumerates through a list of items populated at design time. The list is static at execution time, i.e. you cannot load the list items dynamically from a configuration file or another source like a SQL table. It is probably due to the static nature of the item list that this enumerator type is not used very often in packages.

On the other hand, there are some scenarios where a static list can prove to be useful. For example –

  • no need to create a database table just to pass on values for the Foreach Loop
  • enforce a specific order to the parameters passed to the loop
  • initialize variable values with the Item Enumerator instead of doing that in a Script Task

Another frequent requirement in SSIS packages is to execute a set of operations against different SQL Servers or databases. The package should change the OLE DB connection string dynamically during execution to point towards the appropriate target. A looping construct iterates through the list of servers and databases, and the package Expressions change the OLE DB connection properties.

I am creating this step-by-step scenario to demonstrate the use of a Foreach Item Enumerator and to use it to make a dynamic OLEDB connection. I will execute a single query in a loop against three different SQL databases and export the results to a single text file.

The Steps

Add a Foreach Loop Container to your SSIS package. Double-click on it to open its editor. Click on the Collections tab on the left of the editor. Select the enumerator as Foreach Item Enumerator. Click on the Columns… button. Add two columns of String type. As you add columns, the editor gives them names like Column0, Column1 and so on, and there is no way to customize these names.

ItemEnumeratorDemo1

Populate the column values by simply typing in the rows. Use Column0 for server names and Column1 for corresponding database names on that server. I have used my local server for all three rows but you can use your own server names here. This is the list of items that are iterated by the Foreach Loop.

ItemEnumeratorDemo2

Now you need two variables that will hold the values shredded from the item list as the Foreach Loop iterates through the rows. These variables are used later in the Property Expressions of the OLEDB connection to make it dynamic.

Click on the Variable Mappings tab on the left and then on <New Variable…>.

ItemEnumeratorDemo3

Add two variables of String type and name them ServerName and DatabaseName.

ItemEnumeratorDemo4

Index 0 and 1 of the variables correspond to Column0 and Column1 of the Foreach Item list respectively.

ItemEnumeratorDemo5

That is all there is to it. Now the loop is ready to iterate through the item rows.

Let us move on to the dynamic OLEDB connection. Add a Data Flow Task to the Foreach Loop Container.

ItemEnumeratorDemo6

Double-click the Data Flow Task to view its design surface. Now add one OLE DB Source.

ItemEnumeratorDemo7

Double-click the OLE DB Source to set its properties. The first window is its connection manager.

ItemEnumeratorDemo8

Once the connection information is done, view its properties editor. Set the Data Access Mode to SQL Command and put the following query in the SQL Command Text –

select
@@SERVERNAME as ServerName
,DB_NAME() as DatabaseName
,GETDATE() as LogDateTime
,COUNT(*) as NumberOfIndexes
from sys.indexes;

ItemEnumeratorDemo9

Clicking the Columns tab is important so that SSIS can pick up metadata like column names and data types from the underlying tables/views. If you don’t do this, you’ll find later that SSIS does not know about this metadata.

ItemEnumeratorDemo10

Now add a Flat File Destination. It is a good idea to drag the connector from the OLE DB Source to the Flat File Destination before modifying the destination properties. In this way, the SSIS Designer helps you by matching the destination with the source metadata automatically. Otherwise, you have to perform some extra steps to match the columns. I hope you like to save time as much as I do!

ItemEnumeratorDemo11

Give a file path and check the box for Column Names in the First Row.

ItemEnumeratorDemo12

Verify the column mappings in the Mappings tab.

ItemEnumeratorDemo13

The trick to make the OLEDB connection string dynamic lies in the next couple of steps where you edit the properties of the OLEDB connection and assign expressions to its ServerName and InitialCatalog properties.

ItemEnumeratorDemo14

The property ServerName gets the variable @[User::ServerName] and the property InitialCatalog gets the variable @[User::DatabaseName]. Remember that these variables get new values during each iteration of the Foreach Loop and, as a result, the OLEDB connection string gets modified too.

ItemEnumeratorDemo15
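In text form, the two property expressions are simply –

ServerName     : @[User::ServerName]
InitialCatalog : @[User::DatabaseName]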

As the last step, make sure that data is appended to the destination text file by leaving the Overwrite data in the file option unchecked in the Flat File Destination editor. Otherwise, each iteration will overwrite the contents written by the previous one.

ItemEnumeratorDemo16