Log providers define the destination for the log information that is generated when a package executes. For instance, if you need a record of a package’s execution, a log provider can persist the events and actions that occurred to a log file, capturing not only the fact that the package ran but also, if required, the values and results of that execution.
Defining what should be logged during a package’s execution is a two-step process. First, you must define which log providers to use. You can define multiple providers in a single package. Second, you need to define what information should be sent to the defined log providers.
Keep in mind that if you’re executing a package stored in a SQL Server 2014 catalog, the logging of events is done automatically and can be seen in the catalog.executions, catalog.operations, and catalog.operation_messages views. To retrieve the logs programmatically, you use the Executions and Operations properties of the Catalog class. We will see an example of that later in this section.
To demonstrate how you would do this using the UI, open the MomDemo project and make sure package.dtsx is loaded. To configure logging for the test SSIS package, select SSIS ⇒ Logging from the menu. The Configure SSIS Logs dialog that is displayed shows all the containers that currently exist in the package. The first step is completed by configuring the SSIS log providers on the Providers and Logs tab, shown in the screenshot below.
SQL Server Integration Services includes several default log providers. These providers are selected in the Provider Type combo box and are defined as follows:
>> SSIS Log Provider for Text Files: Used to store log information to a CSV file on the file system. This provider requires you to configure a File Connection object that defines the location of the file. Storing log information in a text file is the easiest way to persist a package’s execution. Text files are portable, and the CSV format is a simple-to-use industry-wide standard.
>> SSIS Log Provider for SQL Profiler: This provider produces a SQL Profiler trace file. The file must be specified with a .trc file extension so that you can open it using the SQL Profiler diagnostic tool. Using SQL Profiler trace files is an easy way for DBAs to view log information. Using Profiler, you can view the execution of the package step by step, even replaying the steps in a test environment.
>> SSIS Log Provider for SQL Server: This provider sends package log events to a table in the specified SQL Server database. The database is defined using an OLE DB connection. The first time the package is executed with this provider, a table called sysssislog is created automatically. Storing log information in a SQL Server database inherits the benefits of persisting information in a relational database system: you can easily retrieve log information for analysis across multiple package executions (see the query sketch after this list).
>> SSIS Log Provider for Windows Event Log: This provider sends log information to the Application event log. The entries created appear under the source name SQLISPackage. No additional configuration is required for this provider. Logging package execution to the Windows Event Log is possibly the easiest way to store log events. The Windows Event Log is easy to view and can be viewed remotely if required.
>> SSIS Log Provider for XML Files: This provider stores log information in a specified XML file on the file system. The file is specified through a File Connection object. Make sure you save the file with an .xml file extension. Logging events to XML inherits the advantages of the XML specification. XML files are very portable across systems and can be validated against a schema definition.
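As a quick illustration of the SQL Server provider mentioned above, the following is a minimal sketch of reading logged events back out of the sysssislog table with ADO.NET. The connection string and database name are assumptions for illustration; the table is created in whatever database the provider’s OLE DB connection points to.
C#
using System;
using System.Data.SqlClient;

class SysSsisLogReader
{
    static void Main()
    {
        // Assumed connection string; point it at the database that holds dbo.sysssislog
        string connStr = @"Data Source=localhost;Initial Catalog=SSISLogs;Integrated Security=SSPI;";

        string sql = @"SELECT [event], [source], [starttime], [message]
                       FROM dbo.sysssislog
                       ORDER BY [starttime] DESC;";

        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0} | {1} | {2} | {3}",
                        reader["starttime"], reader["event"],
                        reader["source"], reader["message"]);
                }
            }
        }
    }
}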
Once you have configured the log providers you wish to employ, you must define which events in the package to log. This is done on the Details tab of the log configuration dialog (see the screenshot below). To enable an event to be logged, check the box next to its name. For instance, in the screenshot below, the OnError event for the package has been selected for logging. By selecting other containers on the left-hand side of the dialog, additional events can be selected, down to an individual task or Data Flow event level. To select all events at once, check the box in the header row of the table. By selecting individual containers in the tree view on the left, you can configure the logging of events at the individual task level; when logging is configured at the task level, the special events exposed by a task can also be included in the log.
This is the way to set up a log file using the UI. The next section describes how to examine log providers programmatically.
Programming to Log Providers
The Package object exposes the LogProviders collection object, which contains the configured log providers in a package. The LogProvider object encapsulates a provider’s configuration information.
The LogProvider object exposes the following key properties, which are enumerated in the example below: ConfigString, CreationName, DelayValidation, Description, HostType, ID, InnerObject, and Name.
The next two examples (see the CSharp_GetPkgLogs folder) enumerate all the log providers that have been configured in a package and write the results to the console window. To get extra mileage out of these examples, the C# version loads the package from a file, whereas the VB.NET version loads the package from an Integration Services server:
C#
// Requires a reference to Microsoft.SqlServer.ManagedDTS and
// using Microsoft.SqlServer.Dts.Runtime;
private static void GetPackageLogsForPackage(string PackagePath)
{
    Application dtsapp = new Application();
    Package p = dtsapp.LoadPackage(PackagePath, null);

    Console.WriteLine("Executing Package {0}", PackagePath);
    p.Execute();
    Console.WriteLine("Package Execution Complete");

    // Enumerate the log providers configured in the package
    Console.WriteLine("LogProviders");
    LogProviders logProviders = p.LogProviders;
    Console.WriteLine("LogProviders Count: {0}", logProviders.Count);

    LogProviderEnumerator logProvidersEnum = logProviders.GetEnumerator();
    while (logProvidersEnum.MoveNext())
    {
        LogProvider logProv = logProvidersEnum.Current;
        Console.WriteLine("ConfigString: {0}", logProv.ConfigString);
        Console.WriteLine("CreationName {0}", logProv.CreationName);
        Console.WriteLine("DelayValidation {0}", logProv.DelayValidation);
        Console.WriteLine("Description {0}", logProv.Description);
        Console.WriteLine("HostType {0}", logProv.HostType);
        Console.WriteLine("ID {0}", logProv.ID);
        Console.WriteLine("InnerObject {0}", logProv.InnerObject);
        Console.WriteLine("Name {0}", logProv.Name);
        Console.WriteLine("------------------");
    }
}
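A minimal sketch of calling this method from the console application’s entry point follows; the package path shown is a placeholder, so point it at any .dtsx file that has log providers configured.
C#
static void Main(string[] args)
{
    // Hypothetical path for illustration; substitute your own package
    GetPackageLogsForPackage(@"C:\ssis\MomDemo\package.dtsx");
}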
You can, of course, dynamically configure a package’s log providers. To do so, a valid connection must first be created to support communication with the log destination. In the following code (see the CSharp_GetPkgLogs folder), a package is first loaded into memory. Then a File Connection Manager is created for the mytest.xml file and named. This name is used later as the ConfigString for the log provider, connecting the provider’s output to the File Connection Manager.
C#
// Requires using Microsoft.SqlServer.Dts.Runtime;
public static void CreatePackageLogProvider(string PackagePath)
{
    Application dtsapp = new Application();
    Package p = dtsapp.LoadPackage(PackagePath, null);

    // Create the File Connection Manager that the XML log provider will use
    ConnectionManager myConnMgr = p.Connections.Add("FILE");
    myConnMgr.Name = "mytest.xml";
    myConnMgr.ConnectionString = @"c:\ssis\mytest.xml";

    // Add the XML file log provider and point it at the connection manager
    LogProvider logProvider = p.LogProviders.Add("DTS.LogProviderXMLFile.2");
    logProvider.ConfigString = "mytest.xml";

    // Select the provider and choose which events to log
    p.LoggingOptions.SelectedLogProviders.Add(logProvider);
    p.LoggingOptions.EventFilterKind = DTSEventFilterKind.Inclusion;
    p.LoggingOptions.EventFilter =
        new string[] { "OnError", "OnWarning", "OnInformation" };
    p.LoggingMode = DTSLoggingMode.Enabled;

    logProvider.OpenLog();
    p.Execute();
}
Next, the log provider is instantiated by passing the ProgID of the provider you wish to create; in the preceding code, DTS.LogProviderXMLFile.2 identifies the XML file log provider. Each of the log provider types described earlier has its own ProgID.
As mentioned earlier in the chapter, the execution log and messages for packages executed on a SQL Server 2014 server within a catalog are automatically stored in log tables in the catalog database (for example, SSISDB). To retrieve the execution logs and messages, you can either query those tables through the catalog views or use the properties available on the Catalog class.
If you’d like to see the execution results and messages, you can query the following views in the SSISDB catalog: catalog.executions, catalog.operations, and catalog.operation_messages.
The operation_type column in the catalog.operations view stores the operation performed for the record. Some of the most common operation types are:
101: Deploy project
200: Create execution or start execution
201: Validate package
202: Stop operation
The operation_id column in the catalog.operations view will match the execution_id column in the catalog.executions view when the operation is the execution of a package.
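If you prefer the view-based approach, the following is a minimal sketch of querying the catalog views directly with ADO.NET. The connection string is an assumption for illustration, and the join on operation_id = execution_id mirrors the relationship described above; verify the column names against your SSISDB instance.
C#
using System;
using System.Data.SqlClient;

class CatalogLogReader
{
    static void Main()
    {
        // Assumed connection string; point it at the server hosting SSISDB
        string connStr = @"Data Source=localhost;Initial Catalog=SSISDB;Integrated Security=SSPI;";

        // Join executions to their operation messages
        string sql = @"SELECT e.package_name, e.end_time, m.message
                       FROM catalog.executions AS e
                       JOIN catalog.operation_messages AS m
                           ON m.operation_id = e.execution_id
                       ORDER BY e.execution_id, m.message_time;";

        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0} ({1}): {2}",
                        reader["package_name"], reader["end_time"], reader["message"]);
                }
            }
        }
    }
}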
In the example that follows (see the MomDemo folder), you retrieve the execution results of packages programmatically. To do so, you iterate the Executions property of the Catalog class, which is a collection of ExecutionOperation objects. Then you use a LINQ query on the Operations property to match the Id property of each ExecutionOperation to the Id property of an Operation. Finally, you iterate through the Messages property of the Operation class to show the logged messages of the package execution.
C#
// Assumes: using System.Linq; using System.Text;
// using Microsoft.SqlServer.Management.IntegrationServices;
// and an IntegrationServices instance named isServer connected to the target server.
Catalog catalog = isServer.Catalogs["SSISDB"];
CatalogFolder folder = catalog.Folders["ProSSIS"];
ProjectInfo p = folder.Projects["My ProSSIS Project"];
catalog.Operations.Refresh();

StringBuilder messages = new StringBuilder();
foreach (ExecutionOperation exec in catalog.Executions)
{
    // Only report executions that have completed
    if (exec.Completed)
    {
        messages.AppendLine(exec.PackageName + " completed " +
            exec.EndTime.ToString());

        // Query the Operations property, matching the operation Id
        // to the execution Id
        var ops = from a in catalog.Operations
                  where a.Id == exec.Id
                  select a;

        foreach (Operation op in ops)
        {
            op.Refresh();
            foreach (OperationMessage msg in op.Messages)
            {
                messages.AppendLine("\t" + msg.Message);
            }
        }
    }
}
LogFileTextbox.Text = messages.ToString();
After running this example, you should see a window similar to the screenshot below.