
Package Log Providers - SSIS

Log providers are used to define the destination for the log information that is generated when a package executes. For instance, if you require a record of the execution of your package, a log provider could persist the events and actions that had transpired into a log file, recording not only the execution but also, if required, the values and results of the execution.

Defining what should be logged during a package’s execution is a two-step process. First, you must define which log providers to use. You can define multiple providers in a single package. Second, you need to define what information should be sent to the defined log providers.


Keep in mind that if you’re executing a package stored in a SQL Server 2014 catalog, event logging is done automatically, and the results can be seen in the catalog.executions, catalog.operations, and catalog.operation_messages views. To retrieve the logs programmatically, you use the Executions and Operations properties of the Catalog class. We will see an example of that in this section.

To demonstrate how you would do this using the UI, open the MomDemo project and make sure package.dtsx is loaded. Then, to configure logging for the test SSIS package, select SSIS ⇒ Logging from the menu. The Configure SSIS Logs dialog that appears shows all the containers that currently exist in the package. The first step is completed by configuring SSIS log providers on the Providers and Logs tab, shown in the screenshot below.

Configure SSIS Logs dialog

SQL Server Integration Services includes several default log providers. These providers are selected in the Provider Type combo box and are defined as follows:

>> SSIS Log Provider for Text Files: Used to store log information to a CSV file on the file system. This provider requires you to configure a File Connection object that defines the location of the file. Storing log information in a text file is the easiest way to persist a package’s execution. Text files are portable, and the CSV format is a simple-to-use industry-wide standard.

>> SSIS Log Provider for SQL Profiler: This provider produces a SQL Profiler trace file. The file must be specified with a .trc file extension so that you can open it using the SQL Profiler diagnostic tool. Using SQL Profiler trace files is an easy way for DBAs to view log information. Using Profiler, you can view the execution of the package step by step, even replaying the steps in a test environment.


>> SSIS Log Provider for SQL Server: This provider sends package log events to a table in the specified SQL Server database. The database is defined using an OLE DB connection. The first time this package is executed, a table called sysssislog is created automatically. Storing log information in a SQL Server database inherits the benefits of persisting information in a relational database system. You could easily retrieve log information for analysis across multiple package executions.

>> SSIS Log Provider for Windows Event Log: This provider sends log information to the Windows Application event log. The entries created appear under the source name SQLISPackage. No additional configuration is required for this provider. Logging package execution to the Windows Event Log is possibly the easiest way to store log events. The event log is easy to view and can be viewed remotely if required.


>> SSIS Log Provider for XML Files: This provider stores log information in a specified XML file on the file system. The file is specified through a File Connection object. Make sure you save the file with an xml file extension. Logging events to XML inherits the advantages of the XML specification. XML files are very portable across systems and can be validated against a Schema definition.
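To illustrate retrieval from the SQL Server provider described above, the following query reads from the dbo.sysssislog table that the provider creates in the target database. This is a sketch: the column names follow the default table definition, and the event filter is just an example.

```sql
-- Read SSIS log events written by the SSIS Log Provider for SQL Server.
-- Column names follow the default dbo.sysssislog table definition.
SELECT event, source, starttime, endtime, message
FROM dbo.sysssislog
WHERE event = 'OnError'          -- example filter; remove to see all events
ORDER BY starttime DESC;
```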

Specifying Events to Log

Once you have configured the log providers you wish to employ, you must define which events in the package to log. This is done on the Details tab of the log configuration dialog, shown in the screenshot below. To enable an event to be logged, check the box next to its name; for instance, in the screenshot below, the OnError event for the package has been selected for logging. To select all events at once, check the box in the header row of the table. By selecting individual containers in the tree view on the left-hand side of the dialog, you can configure logging down to an individual task or Data Flow event level. By configuring logging at the task level, the special events exposed by a task can additionally be included in the log.

Specifying Events to Log

This is the way to set up a log file using the UI. The next section describes how to examine log providers programmatically.

Programming to Log Providers

The Package object exposes the LogProviders collection object, which contains the configured log providers in a package. The LogProvider object encapsulates a provider’s configuration information.

The LogProvider object exposes the following key properties:

  • Name: This is a descriptive name for the log provider.
  • ConfigString: This is the name of a valid Connection object within the package that contains information about how to connect to the destination store.
  • CreationName: This is the ProgID of the log provider. This value is used in the creation of log providers dynamically.
  • Description: This describes the type of provider and optionally the destination to which it points.

The next two examples (see the CSharp_GetPkgLogs folder) enumerate all the log providers that have been configured in a package and write the results to the console window. To get extra mileage out of these examples, the C# version loads the package from a file; the VB.NET version loads the package from an Integration Services server:


private static void GetPackageLogsForPackage(string PackagePath)
{
    Application dtsapp = new Application();
    Package p = dtsapp.LoadPackage(PackagePath, null);

    Console.WriteLine("Executing Package {0}", PackagePath);
    p.Execute();   // run the package so there is an execution to log
    Console.WriteLine("Package Execution Complete");

    LogProviders logProviders = p.LogProviders;
    Console.WriteLine("LogProviders Count: {0}", logProviders.Count);

    LogProviderEnumerator logProvidersEnum = logProviders.GetEnumerator();
    while (logProvidersEnum.MoveNext())
    {
        LogProvider logProv = logProvidersEnum.Current;
        Console.WriteLine("ConfigString: {0}", logProv.ConfigString);
        Console.WriteLine("CreationName {0}", logProv.CreationName);
        Console.WriteLine("DelayValidation {0}", logProv.DelayValidation);
        Console.WriteLine("Description {0}", logProv.Description);
        Console.WriteLine("HostType {0}", logProv.HostType);
        Console.WriteLine("ID {0}", logProv.ID);
        Console.WriteLine("InnerObject {0}", logProv.InnerObject);
        Console.WriteLine("Name {0}", logProv.Name);
    }
}

You can, of course, dynamically configure a package’s log providers. To do so, a valid connection must first be created to support communication with the destination store. In the following code (see the CSharp_GetPkgLogs folder), a package is first loaded into memory. Then a connection is created for the mytest.xml file and named. This name is used later as the ConfigString of the log provider to connect its output to the File Connection Manager.


public static void CreatePackageLogProvider(string PackagePath)
{
    Application dtsapp = new Application();
    Package p = dtsapp.LoadPackage(PackagePath, null);

    // Create the File Connection Manager the provider will write to
    ConnectionManager myConnMgr = p.Connections.Add("FILE");
    myConnMgr.Name = "mytest.xml";
    myConnMgr.ConnectionString = @"c:\ssis\mytest.xml";

    // Instantiate the XML file log provider by its ProgID and point its
    // ConfigString at the connection manager created above
    LogProvider logProvider = p.LogProviders.Add("DTS.LogProviderXMLFile.3");
    logProvider.ConfigString = "mytest.xml";

    // Log only the selected events
    p.LoggingOptions.EventFilterKind = DTSEventFilterKind.Inclusion;
    p.LoggingOptions.EventFilter =
        new string[] { "OnError", "OnWarning", "OnInformation" };
    p.LoggingMode = DTSLoggingMode.Enabled;
}

Next, the log provider is instantiated by passing the ProgID of the provider you wish to create. The following is a list of the ProgIDs for each type of log provider available:

  • Text File Log Provider: DTS.LogProviderTextFile.3
  • SQL Profiler Log Provider: DTS.LogProviderSQLProfiler.3
  • SQL Server Log Provider: DTS.LogProviderSQLServer.3
  • Windows Event Log Provider: DTS.LogProviderEventLog.3
  • XML File Log Provider: DTS.LogProviderXMLFile.3
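As a minimal sketch of how these ProgIDs are used, the following swaps in the text file provider in place of the XML provider shown earlier. The file path and connection name are illustrative, and p is assumed to be a package loaded as in the preceding example.

```csharp
// Illustrative example: attach a text file log provider by its ProgID.
// The connection name and path are placeholders, not part of the original sample.
ConnectionManager logFileConn = p.Connections.Add("FILE");
logFileConn.Name = "mylog.csv";
logFileConn.ConnectionString = @"c:\ssis\mylog.csv";

LogProvider textProvider = p.LogProviders.Add("DTS.LogProviderTextFile.3");
textProvider.ConfigString = "mylog.csv";   // name of the File Connection Manager
p.LoggingMode = DTSLoggingMode.Enabled;
```

The same pattern applies to the other providers; only the ProgID and the kind of connection (file versus OLE DB) change.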

SQL Server 2014 Operation Logs

As mentioned earlier in the chapter, the execution log and messages for packages executed on a SQL Server 2014 server within a catalog are automatically stored in log tables in the catalog database (for example, SSISDB). To retrieve the execution logs and messages, you can either query the catalog views or use the properties available on the Catalog class.




If you’d like to see the execution results and messages, you can query the following views in the catalog database:

  • catalog.executions: This view retrieves the execution logs of any package executed on the catalog. The Id column from this view is also stored with the operation messages. The other key columns identify the project, package name, and folder of the execution.
  • catalog.operations: This view stores all of the operations performed for a project or execution. The operation_type column stores the type of operation performed for the record. Some of the most common operation types are

          101: Deploy project
          200: Create execution or start execution
          202: Stop operation
          301: Validate package

The Id column in the catalog.operations view will match the Id column from the catalog.executions view if the operation is from an execution of a package.

  • catalog.operation_messages: This view stores all of the operational messages from an operation. For executions, this includes the common messages you’re used to seeing when executing a package, like “Validation has started.”
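Assuming the standard SSISDB view and column names, the views above can be stitched together with a query like the following to see each execution alongside its logged messages:

```sql
-- Execution log with its operation messages, newest execution first.
-- Assumes the standard SSISDB catalog views and column names.
SELECT e.execution_id,
       e.folder_name,
       e.project_name,
       e.package_name,
       m.message_time,
       m.message
FROM   catalog.executions e
JOIN   catalog.operation_messages m
       ON m.operation_id = e.execution_id
ORDER  BY e.execution_id DESC, m.message_time;
```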

In the example that follows (see the MomDemo folder), you’re going to retrieve the execution results of packages programmatically. To do so you will iterate the Executions property of the Catalog class. The Executions property is a collection of ExecutionOperation objects. Then, you will use a LINQ query on the Operations property to match the Id property from the ExecutionOperation class to the Id property of the Operation class. Finally, you will iterate through the Messages property of the Operation class to show the logged messages of the package execution.

// Connect to the Integration Services server first; the connection
// string here is illustrative
IntegrationServices isServer = new IntegrationServices(
    new SqlConnection(@"Server=localhost;Integrated Security=SSPI;"));

Catalog catalog = isServer.Catalogs["SSISDB"];
CatalogFolder folder = catalog.Folders["ProSSIS"];
ProjectInfo p = folder.Projects["My ProSSIS Project"];
StringBuilder messages = new StringBuilder();

foreach (ExecutionOperation exec in catalog.Executions)
{
    // execution complete?
    if (exec.Completed)
    {
        messages.AppendLine(exec.PackageName + " completed " + exec.EndTime);

        // query the Operations property, matching the operation's
        // Id property to exec.Id
        var ops = from Operation a in catalog.Operations
                  where a.Id == exec.Id
                  select a;

        foreach (Operation op in ops)
        {
            foreach (OperationMessage msg in op.Messages)
            {
                messages.AppendLine("\t" + msg.Message);
            }
        }
    }
}
LogFileTextbox.Text = messages.ToString();

After running this example, you should see a window similar to the screenshot below.

log window for operations





Last updated: 03 Apr 2023
About Author

Ravindra Savaram is a Technical Lead at Mindmajix.com. His passion lies in writing articles on the most popular IT platforms including Machine learning, DevOps, Data Science, Artificial Intelligence, RPA, Deep Learning, and so on. You can stay up to date on all these technologies by following him on LinkedIn and Twitter.
