
LAB 04 : Implementing Control Flow in an SSIS Package

NOTE : Text written in red should be replaced with your own personal parameters.

Part 1 : Implementing Control Flow in an SSIS Package


Scenario
You are implementing an ETL solution for Adventure Works Cycles and must ensure that the data flows you
have already defined are executed as a workflow that notifies operators of success or failure by sending an
email message. You must also implement an ETL solution that transfers data from text files generated by the
company’s financial accounting package to the data warehouse.

Objectives
After completing this lab, you will be able to:
• Use tasks and precedence constraints.
• Use variables and parameters.
• Use containers.

Estimated Time: 60 minutes

Exercise 1 : Using Tasks and Precedence in a Control Flow

Scenario
You have implemented data flows to extract data and load it into a staging database as part of the ETL process
for your data warehousing solution. Now you want to coordinate these data flows by implementing a control
flow that notifies an operator of the outcome of the process.
The main tasks for this exercise are as follows:
1. Prepare the Lab Environment
2. View a Control Flow
3. Add Tasks to a Control Flow
4. Test the Control Flow

N.B. The following tasks assume that you have a local SMTP server for managing email. If you do not
have one on your machine, the steps shown in blue will not work. This is not a problem, since the goal of
this exercise is to learn how to implement the process of sending mail when data-loading errors occur.

Task 1 : Prepare the Lab Environment


1. Start SQL Server Management Studio and connect to the (local) instance of the SQL Server database
engine by using Windows authentication.
2. Restore all the database backup files in the « LABS-Atelier SID\Lab04\LabA\BackupFiles » folder.
   Delete any existing databases before restoring the new ones. Indication : refer to the « Préparation
   Environnement Labxx » file in the « LABS-Atelier SID » folder for help.

Task 2 : View a Control Flow


1. Start Visual Studio and open the AdventureWorksETL.sln solution in the « LABS-Atelier
SID\Lab04\LabA\Ex1 » folder.
2. In Solution Explorer, in the SSIS Packages folder, double-click Extract Reseller Data.dtsx. Then view
the control flow for the Extract Reseller Data package and note that it includes two Send Mail tasks
– one that runs when either the Extract Resellers or Extract Reseller Sales tasks fail, and one that
runs when the Extract Reseller Sales task succeeds.
3. Double-click the red dotted arrow connecting the Extract Resellers task to the Send Failure
   Notification task. In the Precedence Constraint Editor, in the Multiple Constraints section, note
   that Logical OR. One constraint must evaluate to True is selected, so that the Send Failure
   Notification task runs if either of the connected data flow tasks fails. Then click Cancel.
4. Double-click the Send Failure Notification task to view its settings. On the Mail tab, note that the
task uses an SMTP connection manager named Local SMTP Server to send a high-priority email
message with the subject Data Extraction Notification and the message “The reseller data
extraction process failed” to Student@adventureworks.msft. Then click Cancel.
5. Double-click the Send Success Notification task to view its settings. On the Mail tab, note that the
   task uses an SMTP connection manager named Local SMTP Server to send a normal-priority email
   message with the subject Data Extraction Notification and the message “The reseller data was
   successfully extracted” to Student@adventureworks.msft. Then click Cancel.
6. In the Connection Managers pane, double-click Local SMTP Server to view its settings, and note
that it connects to the localhost SMTP server. Then click Cancel.
7. On the Debug menu, click Start Debugging, and observe the control flow as the task executes. Then,
when the task has completed, on the Debug menu, click Stop Debugging.
8. View the contents of the C:\inetpub\mailroot\Drop folder and note the email messages that have
been received by the local SMTP server.
9. Double-click the most recent message to open it with Outlook, read the email message, and then
close Outlook.
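The Send Mail tasks in this package correspond to an ordinary SMTP message. As a hypothetical illustration (the From address is the one used later in Task 3; actually sending requires the local SMTP server mentioned in the note above, whose pickup folder is C:\inetpub\mailroot\Drop), the failure notification could be sketched in Python as:

```python
import smtplib
from email.message import EmailMessage

def build_failure_notification():
    # Mirrors the Send Failure Notification task's Mail tab settings
    msg = EmailMessage()
    msg["From"] = "ETL@adventureworks.msft"
    msg["To"] = "Student@adventureworks.msft"
    msg["Subject"] = "Data Extraction Notification"
    msg["X-Priority"] = "1"  # high priority
    msg.set_content("The reseller data extraction process failed")
    return msg

# Sending would require the local SMTP server from the note above:
# with smtplib.SMTP("localhost") as server:
#     server.send_message(build_failure_notification())
```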

Task 3 : Add Tasks to a Control Flow

1. In Visual Studio, in Solution Explorer, in the SSIS Packages folder, double-click Extract Internet Sales
Data.dtsx. Then view the control flow for the Extract Internet Sales Data package.
2. In the SSIS Toolbox, in the Common section, double-click Send Mail Task. Then position the new
Send Mail task below and to the right of the Extract Internet Sales task.
3. Double-click the Send Mail task on the control flow surface, to view its settings, and in the Send
Mail Task Editor dialog box, on the General tab, set the Name property to Send Success
Notification.
4. In the Send Mail Task Editor dialog box, on the Mail tab, in the SmtpConnection drop-down list,
click <New connection>. Then in the SMTP Connection Manager Editor dialog box, enter the
following settings and click OK:
• Name: Local SMTP Server
• SMTP Server: localhost
5. In the Send Mail Task Editor dialog box, on the Mail tab, enter the following settings, and then click
OK:
• From: ETL@adventureworks.msft
• To: Student@adventureworks.msft
• Subject: Data Extraction Notification
• MessageSourceType: Direct Input
• MessageSource: The Internet Sales data was successfully extracted
• Priority: Normal
6. On the Control Flow surface, click the Extract Internet Sales task, and then drag the green arrow
from the Extract Internet Sales task to the Send Success Notification task.
7. In the SSIS Toolbox, in the Common section, double-click Send Mail Task. Then position the new
Send Mail task below and to the left of the Extract Internet Sales task.
8. Double-click the Send Mail task on the control flow surface, to view its settings, and in the Send
Mail Task Editor dialog box, on the General tab, set the Name property to Send Failure
Notification.
9. In the Send Mail Task Editor dialog box, on the Mail tab, enter the following settings, and then click
OK:
• SmtpConnection: Local SMTP Server
• From: ETL@adventureworks.msft
• To: Student@adventureworks.msft
• Subject: Data Extraction Notification
• MessageSourceType: Direct Input
• MessageSource: The Internet Sales data extraction process failed
• Priority: High
10. On the Control Flow surface, click the Extract Customers task, and then drag the green arrow from
the Extract Customers task to the Send Failure Notification task. Then right-click the arrow and
click Failure.
11. On the Control Flow surface, click the Extract Internet Sales task, and then drag the green arrow
from the Extract Internet Sales task to the Send Failure Notification task. Then right-click the arrow
and click Failure.
12. Double-click the red arrow connecting the Extract Customers task to the Send Failure Notification
task. In the Precedence Constraint Editor, in the Multiple Constraints section, select Logical OR.
One constraint must evaluate to True. Then click OK.
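The Multiple Constraints setting chosen in step 12 is simple boolean logic over the incoming arrows; a minimal sketch:

```python
def task_should_run(constraint_results, logical_or=True):
    """Evaluate multiple precedence constraints on one task.

    Logical OR  -> the task runs if ANY incoming constraint is satisfied.
    Logical AND -> the task runs only if ALL incoming constraints are satisfied.
    """
    return any(constraint_results) if logical_or else all(constraint_results)

# Send Failure Notification with Logical OR: it runs if either
# Extract Customers or Extract Internet Sales fails.
extract_customers_failed = True
extract_internet_sales_failed = False
runs = task_should_run([extract_customers_failed, extract_internet_sales_failed])
```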

Task 4 : Test the Control Flow


1. In Visual Studio, on the Control Flow surface for the Extract Internet Sales Data package, click the
Extract Customers task, and press F4. Then in the Properties pane, set the ForceExecutionResult
property to Failure.
2. On the Debug menu, click Start Debugging, and observe the control flow as the task executes,
noting that the Extract Customer task fails. Then, when the task has completed, on the Debug menu,
click Stop Debugging.
3. View the contents of the C:\inetpub\mailroot\Drop folder and note the email messages that have
been received by the local SMTP server.
4. Double-click the most recent message to open it with Outlook, and read the email message, noting
that it contains a failure message. Then close the email message.
5. In Visual Studio, on the Control Flow surface for the Extract Internet Sales Data package, click the
Extract Customers task. Then in the Properties pane, set the ForceExecutionResult property to
None.
6. On the Debug menu, click Start Debugging, and observe the control flow as the task executes,
noting that the Extract Customer task succeeds. Then, when the task has completed, on the Debug
menu, click Stop Debugging.
7. View the contents of the C:\inetpub\mailroot\Drop folder and note the email messages that have
been received by the local SMTP server.
8. Double-click the most recent message to open it with Outlook, and read the email message, noting
that it contains a success message. Then close the email message.
9. Close Visual Studio, saving your changes if you are prompted.

Results: After this exercise, you should have a control flow that sends an email message if the Extract Internet
Sales task succeeds, or sends an email message if either the Extract Customers or Extract Internet Sales
tasks fail.

Exercise 2 : Using Variables and Parameters

Scenario
You need to enhance your ETL solution to include the staging of payments data that is generated in comma-
separated value (CSV) format from a financial accounts system. You have implemented a simple data flow that
reads data from a CSV file and loads it into the staging database. You must now modify the package to
construct the folder path and file name for the CSV file dynamically at run time instead of relying on a hard-
coded name in the data flow task settings.
The main tasks for this exercise are as follows:
1. View a Control Flow
2. Create a Variable
3. Create a Parameter
4. Use a Variable and a Parameter in an Expression

Task 1 : View a Control Flow


1. View the contents of the « LABS-Atelier SID\Lab04\LabA\BackupFiles\Accounts » folder and note the
files it contains. In this exercise, you will modify an existing package to create a dynamic reference to
one of these files.
2. Start Visual Studio and open the AdventureWorksETL.sln solution in the « LABS-Atelier
SID\Lab04\LabA\Ex2 » folder.
3. In Solution Explorer, in the SSIS Packages folder, double-click Extract Payment Data.dtsx. Then
view the control flow for the Extract Payment Data package, and note that it contains a single data
flow task named Extract Payments.
4. Double-click the Extract Payments task to view it in the Data Flow tab, and note that it contains a
flat file source named Payments File, and an OLE DB destination named Staging DB.
5. Double-click the Payments File source and note that it uses a connection manager named Payments
File. Then click Cancel.
6. In the Connection Managers pane, double-click Payments File, and note that it references the
Payments.csv file in the « LABS-Atelier SID\Lab04\LabA\BackupFiles\Accounts » folder. Then click
Cancel.
7. On the Debug menu, click Start Debugging and observe the data flow while the package runs. When
the package has completed, on the Debug menu, click Stop Debugging.
8. On the Execution Results tab, find the following line in the package execution log :
[Payments File [2]] Information : The processing of the file “…\LABS-Atelier SID\Lab04\LabA\BackupFiles\Accounts\Payments.csv”
has started (the drive and folder prefix will match your own Accounts folder location)
9. Click the Data Flow tab to return to the data flow design surface.

Task 2 : Create a Variable


1. In Visual Studio, with the Extract Payment Data.dtsx package open, on the View menu, click Other
   Windows, and then click Variables.
2. In the Variables pane, click the Add Variable button, and create a variable with the following
properties:
• Name: fName
• Scope: Extract Payment Data
• Data type: String
• Value: Payments - US.csv

Note that the value includes a space on either side of the “-“ character.

Task 3 : Create a Parameter


1. In Visual Studio, in Solution Explorer, double-click Project.params.
2. In the Project.params [Design] window, click the Add Parameter button, and add a parameter with
the following properties:
• Name: AccountsFolderPath
• Data type: String
• Value: « C:\Users\Ines\Desktop\[BADS] Atelier SID\LABS-Atelier
  SID\Lab04\LabA\BackupFiles\Accounts » (You should use your own path for the Accounts
  folder)
• Sensitive: False
• Required: True
• Description: Path to accounts files
Note : Be sure to include the trailing “\” in the Value property.

3. On the File menu, click Save All, and then close the Project.params [Design] window.

Task 4 : Use a Variable and a Parameter in an Expression


1. On the Data Flow design surface for the Extract Payment Data.dtsx package, in the Connection
Managers pane, click Payments File. Then press F4 to view the Properties pane.
2. In the Properties pane, in the Expressions property box, click the ellipsis (…) button. Then in the
Property Expressions Editor dialog box, in the Property box, select ConnectionString and in the
Expression box, click the ellipsis (…) button.
3. In the Expression Builder dialog box, expand the Variables and Parameters folder, and drag the
$Project::AccountsFolderPath parameter to the Expression box.
4. In the Expression box, type a plus (+) symbol after the $Project::AccountsFolderPath parameter.
5. Drag the User::fName variable to the Expression box to create the following expression:
@[$Project::AccountsFolderPath]+ @[User::fName]
6. In the Expression Builder dialog box, click Evaluate Expression and verify that the expression
   produces your Accounts folder path followed by the file name, ending in « LABS-Atelier
   SID\Lab04\LabA\BackupFiles\Accounts\Payments - US.csv ». Then click OK to close the
   Expression Builder dialog box, and in the Property Expressions Editor dialog box, click OK.
7. On the Debug menu, click Start Debugging and observe the data flow while the package runs. When
the package has completed, on the Debug menu, click Stop Debugging.
8. On the Execution Results tab, find the following line in the package execution log, noting that the
default values for the fName variable and AccountsFolderPath parameter were used:
[Payments File [2]] Information: The processing of the file “…\LABS-Atelier
SID\Lab04\LabA\BackupFiles\Accounts\Payments - US.csv” has started
9. Close Visual Studio, saving your changes if you are prompted.
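The expression assembled in this task is plain string concatenation, which is why the note in Task 3 insists on the trailing “\” in the parameter value. A minimal sketch, using a hypothetical folder path:

```python
# Hypothetical values; substitute your own Accounts folder path.
accounts_folder_path = "C:\\Accounts\\"  # parameter value; note the trailing "\"
f_name = "Payments - US.csv"             # variable value

# The SSIS expression @[$Project::AccountsFolderPath] + @[User::fName]
# is simple string concatenation:
connection_string = accounts_folder_path + f_name

# Without the trailing "\", the folder and file name run together:
broken = "C:\\Accounts" + f_name
```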

Results: After this exercise, you should have a package that loads data from a text file based on a parameter
that specifies the folder path where the file is stored, and a variable that specifies the file name.

Exercise 3 : Using Containers

Scenario
You have created a control flow that loads Internet sales data and sends a notification email message to
indicate whether the process succeeded or failed. You now want to encapsulate the data flow tasks for this
control flow in a sequence container so you can manage them as a single unit.
You have also successfully created a package that loads payments data from a single CSV file based on a
dynamically-derived folder path and file name. Now you must extend this solution to iterate through all the
files in the folder and import data from each one.
The main tasks for this exercise are as follows:
1. Add a Sequence Container to a Control Flow
2. Add a Foreach Loop Container to a Control Flow

Task 1 : Add a Sequence Container to a Control Flow


1. Start Visual Studio and open the AdventureWorksETL.sln solution in the « LABS-Atelier
SID\Lab04\LabA\Ex3 » folder.
2. In Solution Explorer, in the SSIS Packages folder, double-click Extract Internet Sales Data.dtsx.
Then view the control flow for the Extract Internet Sales Data package. You created the control flow
for this package in Exercise 1.
3. Right-click the red dotted arrow connecting the Extract Customers task to the Send Failure
Notification task, and click Delete. Repeat this step to delete the red dotted line connecting the
Extract Internet Sales task to the Send Failure Notification task, and the green arrow connecting
the Extract Internet Sales task to the Send Success notification task.
4. Drag a Sequence Container from the Containers section of the SSIS Toolbox to the control flow
surface. Right-click the new sequence container, click Rename, and change the container name to
Extract Customer Sales Data.
5. Click the Extract Customers task, hold the Ctrl key, and click the Extract Internet Sales task, then
drag both tasks into the Extract Customer Sales Data sequence container.
6. Click the Extract Customer Sales Data sequence container, and then drag the green arrow from the
Extract Customer Sales Data sequence container to the Send Success Notification task.
7. Click the Extract Customer Sales Data sequence container, and then drag the green arrow from the
Extract Customer Sales Data sequence container to the Send Failure Notification task. Right-click
the green arrow connecting the Extract Customer Sales Data sequence container to the Send
Failure Notification task, and click Failure.
8. On the Debug menu, click Start Debugging, and observe the control flow as the package executes.
Then, when package execution is complete, on the Debug menu, click Stop Debugging.

Task 2 : Add a Foreach Loop Container to a Control Flow


1. In Visual Studio, in Solution Explorer, in the SSIS Packages folder, double-click Extract Payment
Data.dtsx. Then view the control flow for the Extract Payment Data package, and note that it
contains a single data flow task named Extract Payments. This is the same data flow task you
updated in the previous exercise.
2. In the SSIS Toolbox, in the Containers section, double-click Foreach Loop Container. Then on the
control flow surface, click the Extract Payments task and drag it into the Foreach Loop Container.
3. Double-click the title area at the top of the Foreach Loop Container to view the Foreach Loop
Editor dialog box.
4. In the Foreach Loop Editor dialog box, on the Collection tab, in the Enumerator list, select the
Foreach File Enumerator. In the Expressions box, click the ellipsis (…) button.
5. In the Property Expressions Editor dialog box, in the Property list, select Directory and in the
Expression box click the ellipsis (…) button.
6. In the Expression Builder dialog box, expand the Variables and Parameters folder and drag the
$Project::AccountsFolderPath parameter to the Expression box. Click OK to close the Expression
Builder, and click OK again to close the Property Expression Editor.
7. In the Foreach Loop Editor dialog box, on the Collection tab, in the Retrieve file name section,
select Name and extension.
8. In the Foreach Loop Editor dialog box, on the Variable Mappings tab, in the Variable list, select
User::fName and in the Index column ensure that 0 is specified. Then click OK.
9. On the Debug menu, click Start Debugging and observe the data flow while the package runs. When
the package has completed, on the Debug menu, click Stop Debugging.
10. On the Execution Results tab, scroll through the package execution log, noting that the data flow
    was executed once for each file in the Accounts folder. The following files should have been
    processed:
    a. Payments - AU.csv
    b. Payments - CA.csv
    c. Payments - DE.csv
    d. Payments - FR.csv
    e. Payments - GB.csv
    f. Payments - US.csv
11. Close Visual Studio, saving your changes if prompted.
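Conceptually, the Foreach Loop configured above behaves like the following sketch (a simplified model: the real Foreach File Enumerator also supports file masks and recursive traversal):

```python
from pathlib import Path

def foreach_file_loop(accounts_folder_path, run_data_flow):
    """Rough sketch of the Foreach File Enumerator configured above.

    The directory comes from $Project::AccountsFolderPath; each file's
    name and extension is assigned to User::fName (index 0) before the
    enclosed Extract Payments data flow runs.
    """
    processed = []
    for entry in sorted(Path(accounts_folder_path).iterdir()):
        if entry.is_file():
            f_name = entry.name        # "Name and extension" retrieval option
            run_data_flow(f_name)      # data flow reads the fName variable
            processed.append(f_name)
    return processed
```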

Results: After this exercise, you should have one package that encapsulates two data flow tasks in a sequence
container, and another that uses a Foreach Loop to iterate through the files in a folder specified in a parameter
and uses a data flow task to load their contents into a database.
Part 2 : Using Transactions and Checkpoints
Scenario
You are concerned that the Adventure Works ETL data flow might fail, leaving you with a partially-loaded
staging database. To avoid this, you intend to use transactions and checkpoints to ensure data integrity.

Objectives
After completing this lab, you will be able to :
• Use transactions.
• Use checkpoints.

Estimated Time : 30 minutes

Exercise 1 : Using Transactions

Scenario
You have created an SSIS package that uses two data flows to extract, transform, and load Internet sales data.
You now want to ensure that package execution always results in a consistent data state, so that if any of the
data flows fail, no data is loaded.
The main tasks for this exercise are as follows:
1. Prepare the Lab Environment
2. View the Data in the Database
3. Run a Package to Extract Data
4. Implement a Transaction
5. Observe Transaction Behavior

Task 1 : Prepare the Lab Environment

1. Start SQL Server Management Studio and connect to the (local) instance of the SQL Server database
engine by using Windows authentication.
2. Restore all the database backup files in the « LABS-Atelier SID\Lab04\LabB\BackupFiles » folder.
   Delete any existing databases before restoring the new ones. Indication : refer to the « Préparation
   Environnement Labxx » file in the « LABS-Atelier SID » folder for help.
3. Execute the « Update Environment.sql » file in the « LABS-Atelier SID\Lab04\LabB\BackupFiles »
folder

Task 2 : View the Data in the Database

1. Start SQL Server Management Studio, and when prompted, connect to the localhost database
engine using Windows authentication.
2. In Object Explorer, expand Databases, Staging, and Tables.
3. Right-click dbo.Customers and click Select Top 1000 Rows. Note that the table is empty.
4. Right-click dbo.InternetSales and click Select Top 1000 Rows. Note that the table is also empty.
5. Minimize SQL Server Management Studio.

Task 3: Run a Package to Extract Data


1. Start Visual Studio and open the AdventureWorksETL.sln solution in the « LABS-Atelier
SID\Lab04\LabB\Ex1 » folder.
2. In Solution Explorer, in the SSIS Packages folder, double-click Extract Internet Sales Data.dtsx.
Then view the control flow for the Extract Internet Sales Data package.
3. On the Debug menu, click Start Debugging, and observe the control flow as the package is
executed, noting that the Extract Customers task succeeds, but the Extract Internet Sales task fails.
Then, when package execution has completed, on the Debug menu, click Stop Debugging.
4. Maximize SQL Server Management Studio and re-execute the queries you created earlier to view the
top 1,000 rows in the dbo.Customers and dbo.InternetSales tables. Verify that the
dbo.InternetSales table is still empty but the dbo.Customers table now contains customer records.
5. In SQL Server Management Studio, click New Query. Then in the new query window, enter the
following Transact-SQL and click Execute: TRUNCATE TABLE Staging.dbo.Customers;
6. Close the query tab containing the TRUNCATE TABLE statement without saving it, and minimize SQL
Server Management Studio.

Task 4: Implement a Transaction


1. In Visual Studio, on the control flow surface for the Extract Internet Sales Data.dtsx
package, click the Extract Customer Sales Data sequence container and press F4 to view the
Properties pane.
2. In the Properties pane, set the TransactionOption property of the Extract Customer Sales Data
sequence container to Required.
3. Click the Extract Customers task, and in the Properties pane, ensure that the TransactionOption
property value is set to Supported, and set the FailParentOnFailure property to True.
4. Repeat the previous step for the Extract Internet Sales task.
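Setting TransactionOption to Required on the container gives the two data flows all-or-nothing semantics. The following sketch models the behavior only; SSIS actually enlists the tasks in a distributed (MSDTC) transaction rather than snapshotting tables:

```python
def run_container_with_transaction(tasks, staging):
    """All-or-nothing execution, like TransactionOption=Required.

    `staging` is a dict of table name -> list of rows standing in for
    the Staging database; `tasks` are callables that load into it.
    """
    snapshot = {table: rows.copy() for table, rows in staging.items()}
    try:
        for task in tasks:
            task(staging)
    except Exception:
        # Any child failure rolls the whole container back.
        staging.clear()
        staging.update(snapshot)
        raise
```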

Task 5: Observe Transaction Behavior


1. In Visual Studio, on the Debug menu, click Start Debugging, and observe the control flow as the
package is executed, noting that once again the Extract Customers task succeeds, but the Extract
Internet Sales task fails. Then, when package execution has completed, on the Debug menu, click
Stop Debugging.
2. Maximize SQL Server Management Studio and re-execute the queries you created earlier to view the
top 1,000 rows in the dbo.Customers and dbo.InternetSales tables, and verify that both tables are
empty. Then minimize SQL Server Management Studio.
3. In Visual Studio, on the control flow surface for the Extract Internet Sales Data.dtsx package,
double-click the Extract Internet Sales task to view it in the Data Flow tab.
4. Double-click the Calculate Sales Amount transformation and modify the Expression value to
remove the text “/ (OrderQuantity % OrderQuantity)”. When the expression matches the following
code, click OK: UnitPrice * OrderQuantity
5. Click the Control Flow tab, and on the Debug menu, click Start Debugging. Observe the control
flow as the package is executed, noting that both the Extract Customers and Extract Internet Sales
tasks succeed. Then, when package execution has completed, on the Debug menu, click Stop
Debugging.
6. Maximize SQL Server Management Studio and re-execute the queries you created earlier to view the
top 1,000 rows in the dbo.Customers and dbo.InternetSales tables, and verify that both tables now
contain data. Minimize SQL Server Management Studio as you will use it again in the next exercise.
7. Close Visual Studio, saving changes if prompted.
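The text removed in step 4 is what made the task fail: for any non-zero quantity, OrderQuantity % OrderQuantity is 0, so the original expression always divides by zero. A sketch of both versions:

```python
def calculate_sales_amount(unit_price, order_quantity, broken=False):
    if broken:
        # The lab's deliberately failing expression:
        # UnitPrice * OrderQuantity / (OrderQuantity % OrderQuantity)
        return unit_price * order_quantity / (order_quantity % order_quantity)
    # The corrected expression: UnitPrice * OrderQuantity
    return unit_price * order_quantity
```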

Results: After this exercise, you should have a package that uses a transaction to ensure that all data flow
tasks succeed or fail as an atomic unit of work.
Exercise 2 : Using Checkpoints

Scenario
You have created an SSIS package that uses two data flows to extract, transform, and load reseller sales data.
You now want to ensure that if any task in the package fails, it can be restarted without re-executing the tasks
that had previously succeeded.
The main tasks for this exercise are as follows:
1. View the Data in the Database
2. Run a Package to Extract Data
3. Implement Checkpoints
4. Observe Checkpoint Behavior

Task 1 : View the Data in the Database


1. Maximize SQL Server Management Studio, and in Object Explorer, ensure that Databases, Staging,
and Tables are expanded for the localhost database engine instance.
2. Right-click dbo.Resellers and click Select Top 1000 Rows. Note that the table is empty.
3. Right-click dbo.ResellerSales and click Select Top 1000 Rows. Note that this table is also empty.
4. Minimize SQL Server Management Studio.

Task 2 : Run a Package to Extract Data


1. Start Visual Studio and open the AdventureWorksETL.sln solution in the « LABS-Atelier
SID\Lab04\LabB\Ex2 » folder.
2. In Solution Explorer, in the SSIS Packages folder, double-click Extract Reseller Data.dtsx. Then view
   the control flow for the Extract Reseller Data package.
3. On the Debug menu, click Start Debugging, and observe the control flow as the package is
executed, noting that the Extract Resellers task succeeds, but the Extract Reseller Sales task fails.
Then, when package execution has completed, on the Debug menu, click Stop Debugging.
4. Maximize SQL Server Management Studio and re-execute the queries you created earlier to view the
top 1,000 rows in the dbo.Resellers and dbo.ResellerSales tables. Verify that the dbo.ResellerSales
table is still empty but the dbo.Resellers table now contains records.
5. In SQL Server Management Studio, click New Query. In the new query window, enter the following
Transact-SQL code and click Execute: TRUNCATE TABLE Staging.dbo.Resellers;
6. Close the query tab containing the TRUNCATE TABLE statement without saving it, and minimize SQL
Server Management Studio.

Task 3 : Implement Checkpoints


1. In Visual Studio, click any empty area on the control flow surface for the Extract Reseller Data.dtsx
package, and press F4 to view the Properties pane.
2. In the Properties pane, set the following properties of the Extract Reseller Data package :
• CheckpointFileName: c:\...\LABS-Atelier SID\Lab04\LabB\Ex2\ETL\CheckPoint.chk (You
  should use your own path to the Ex2\ETL folder)
• CheckpointUsage: IfExists
• SaveCheckpoints: True
3. Click the Extract Resellers task, and in the Properties pane, set the FailPackageOnFailure property
to True.
4. Repeat the previous step for the Extract Reseller Sales task.
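The three properties set in step 2 can be modeled as a completed-task list persisted to disk (a conceptual sketch only; the real CheckPoint.chk also stores variable values):

```python
import json
import os

def run_package_with_checkpoints(tasks, checkpoint_file):
    """Sketch of CheckpointUsage=IfExists with SaveCheckpoints=True.

    `tasks` is a list of (name, callable); tasks recorded in the
    checkpoint file by a previous failed run are skipped.
    """
    completed = []
    if os.path.exists(checkpoint_file):          # IfExists: resume from the file
        with open(checkpoint_file) as f:
            completed = json.load(f)
    for name, task in tasks:
        if name in completed:
            continue                             # succeeded on the previous run
        task()                                   # on failure, the file is left behind
        completed.append(name)
        with open(checkpoint_file, "w") as f:
            json.dump(completed, f)              # SaveCheckpoints=True
    os.remove(checkpoint_file)                   # deleted when the package completes
```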

Task 4 : Observe Checkpoint Behavior


1. View the contents of the « c:\...\LABS-Atelier SID\Lab04\LabB\Ex2\ETL » folder and verify that no file
named CheckPoint.chk exists.
2. In Visual Studio, on the Debug menu, click Start Debugging. Observe the control flow as the Extract
   Reseller Data.dtsx package is executed, noting that once again the Extract Resellers task
succeeds, but the Extract Reseller Sales task fails. Then, when package execution has completed, on
the Debug menu, click Stop Debugging.
3. View the contents of the « c:\...\LABS-Atelier SID\Lab04\LabB\Ex2\ETL » folder and verify that a file
named CheckPoint.chk has been created.
4. Maximize SQL Server Management Studio and re-execute the queries you created earlier to view the
top 1,000 rows in the dbo.Resellers and dbo.ResellerSales tables, and verify that the
dbo.ResellerSales table is still empty, but the dbo.Resellers table now contains reseller records.
5. In Visual Studio, on the control flow surface for the Extract Reseller Data.dtsx package, double-click
the Extract Reseller Sales task to view it in the Data Flow tab.
6. Double-click the Calculate Sales Amount transformation and modify the Expression value to
remove the text “/ (OrderQuantity % OrderQuantity)”. When the expression matches the following
code, click OK : UnitPrice * OrderQuantity
7. Click the Control Flow tab, and on the Debug menu, click Start Debugging. Observe the control
flow as the package is executed, noting that the Extract Resellers task is not re-executed, and
package execution starts with the Extract Reseller Sales task, which failed on the last attempt. When
package execution has completed, on the Debug menu, click Stop Debugging.
8. View the contents of the « c:\...\LABS-Atelier SID\Lab04\LabB\Ex2\ETL » folder and verify that the
CheckPoint.chk file has been deleted.
9. Maximize SQL Server Management Studio and re-execute the queries you created earlier to view the
top 1,000 rows in the dbo.Resellers and dbo.ResellerSales tables. Verify that both now contain data,
then close SQL Server Management Studio.
10. Close Visual Studio, saving changes if prompted.

Results: After this exercise, you should have a package that uses checkpoints to enable execution to be
restarted at the point of failure on the previous execution.
