Source: http://www.all-interview.com/7867402muk/how_can_u_work_with_remote_database_in_informaticadid_you_work_directly_by_using_remote_connections.htm

Re: how can we run workflow with pmcmd?


First connect to the Integration Service from pmcmd:

pmcmd> connect -sv service_name -d domain_name -u user_name -p password

Then start the workflow (the workflow name is required after the folder name):

pmcmd> startworkflow -f folder_name workflow_name

Re: Why use the Lookup transformation?


Use a Lookup transformation to perform the following tasks:

Get a related value. For example, your source table includes an employee ID, but you want to include the employee name in your target table to make the summary data easier to read.

Perform a calculation. Many normalized tables include the values used in a calculation, such as gross sales per invoice or sales tax, but not the calculated value itself (such as net sales).

Update slowly changing dimension tables. You can use a Lookup transformation to determine whether records already exist in the target.
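The "get a related value" case above can be sketched outside Informatica as a simple keyed lookup. This is only an illustration with made-up employee data; in PowerCenter the Lookup transformation does the equivalent against a relational table or flat file:

```python
# Minimal sketch of a keyed lookup, using a hypothetical employees table.
# In PowerCenter the Lookup transformation does this against a relational
# table or flat file; here a dict plays that role.
employees = {101: "Alice", 102: "Bob"}  # lookup source: employee_id -> name

source_rows = [
    {"employee_id": 101, "invoice_total": 250.0},
    {"employee_id": 102, "invoice_total": 410.0},
]

# Enrich each source row with the related value from the lookup source.
target_rows = [
    {**row, "employee_name": employees.get(row["employee_id"])}
    for row in source_rows
]

print(target_rows[0]["employee_name"])  # Alice
```

An unmatched key would yield None here, which mirrors the NULL a Lookup transformation returns when no row matches.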

Re: How are the facts loaded? Explain


You can load data into the fact tables only after all the dimension tables have loaded successfully; if even one dimension fails to load, you cannot load the fact tables. Fact table size can be estimated from the lowest level of granularity of the dimension tables.
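That all-dimensions-first rule can be sketched as a simple status check. The dimension names and status strings below are hypothetical placeholders, not Informatica objects:

```python
# Minimal sketch of the rule above: load the fact table only when every
# dimension load has succeeded. Dimension names and statuses are hypothetical.
dimension_status = {
    "dim_customer": "SUCCEEDED",
    "dim_product": "SUCCEEDED",
    "dim_date": "SUCCEEDED",
}

def can_load_fact(statuses):
    """Return True only if all dimension loads reported success."""
    return all(status == "SUCCEEDED" for status in statuses.values())

print(can_load_fact(dimension_status))   # True
dimension_status["dim_product"] = "FAILED"
print(can_load_fact(dimension_status))   # False
```

In a real workflow this gating is usually done with session-level link conditions rather than code, but the logic is the same.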
Re: How to identify bottlenecks in sources, targets, mappings, workflows, and systems, and how to increase performance

Source:
Create a Filter transformation after all the Source Qualifiers and set the filter condition to FALSE so that no data passes beyond that transformation. Then run the session and measure the time taken on the source side. If the read performance looks poor, suggest creating the necessary indexes in a pre-session task.
Note: if the source is a flat file, a source-side performance problem is unlikely.

Target:
Remove the target table from the mapping and create the same structure as a flat-file target. Run the session and measure the time taken to write the file. If writing to the table is the bottleneck, drop the table's indexes before loading the data and recreate them in a post-session task.
Note: if the target is a flat file, a target-side performance problem is unlikely.
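The filter-FALSE and flat-file tricks above are both instances of the same idea: stub out everything except the stage you want to time. A generic sketch of that isolation technique, with made-up rows standing in for the session's data:

```python
import io
import time

def timed(label, fn):
    """Run fn once and report elapsed wall-clock time."""
    start = time.perf_counter()
    fn()
    elapsed = time.perf_counter() - start
    print(f"{label}: {elapsed:.3f}s")
    return elapsed

# Hypothetical source rows standing in for the session's source data.
rows = [f"{i},value_{i}\n" for i in range(200_000)]

# "Filter condition FALSE" trick: read every row but let nothing through.
t_read = timed("read only", lambda: sum(1 for _ in rows))

# "Flat-file target" trick: same rows, written to an in-memory flat file.
buf = io.StringIO()
t_write = timed("read + flat-file write", lambda: buf.writelines(rows))

# Comparing the two runs isolates the write cost from the read cost.
```

The same subtraction logic applies to the PowerCenter sessions: the difference between the full run and the stubbed run approximates the cost of the stage you removed.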

Mapping:
Consider the following steps:
#1. Delete all the transformations and make the mapping a single pass-through to establish a baseline.
#2. Avoid using an unnecessarily large number of transformations.
#3. If you would otherwise need several Filter transformations, use a single Router transformation instead.
#4. Calculate the index and data caches properly for Aggregator, Joiner, Rank, and Sorter transformations if the PowerCenter version is older; in more recent versions, PowerCenter itself can take care of this.
#5. Always pass sorted input to the Aggregator.
#6. Use incremental aggregation.
#7. Do not perform complex calculations in the Aggregator transformation.
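For point #4, a back-of-the-envelope cache estimate looks like the sketch below. Every number here is an assumed placeholder, not a figure from the Informatica documentation; check the session-property reference for the exact sizing formulas for your version:

```python
# Back-of-the-envelope Aggregator cache estimate. All numbers here are
# hypothetical placeholders, not values from the Informatica documentation.
group_by_key_bytes = 16      # assumed width of the group-by columns (index cache)
aggregate_row_bytes = 64     # assumed width of one aggregated row (data cache)
distinct_groups = 500_000    # assumed number of distinct groups
overhead = 1.2               # assumed padding for cache bookkeeping

index_cache = int(distinct_groups * group_by_key_bytes * overhead)
data_cache = int(distinct_groups * aggregate_row_bytes * overhead)

print(f"index cache ~ {index_cache // 1024} KB")
print(f"data cache  ~ {data_cache // 1024} KB")
```

The point is the shape of the arithmetic (rows times width plus overhead), which is why undersized caches spill to disk and hurt Aggregator performance.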

Session:
Increase the DTM buffer size.

System:
#1. Increase the RAM capacity
#2. Avoid paging
