Info Reading 1

DIFFERENCE BETWEEN INFORMATICA 7.1 AND 8.1
===========================================

*Object Permissions* Effective in version 8.1.1, you can assign object permissions to users when you add a user account, set user permissions, or edit an object.

*Gateway and Logging Configuration* Effective in version 8.1, you configure the gateway node and the location for log event files on the Properties tab for the domain. Log events describe operations and error messages for core and application services, workflows, and sessions. The Log Manager runs on the master gateway node in a domain. We can configure the maximum size of logs for automatic purge in megabytes. PowerCenter 8.1 also provides enhancements to the Log Viewer and to log event formatting.

*Unicode Compliance* Effective in version 8.1, all fields in the Administration Console accept Unicode characters. One can choose the UTF-8 character set as the repository code page to store multiple languages.

*Memory and CPU Resource Usage* You may notice an increase in memory and CPU resource usage on machines running PowerCenter services.

*Domain Configuration Database* PowerCenter stores the domain configuration in a database.

*License Usage* Effective in version 8.0, the Service Manager registers license information.

*High Availability* High availability is the PowerCenter option that eliminates a single point of failure in the PowerCenter environment and provides minimal service interruption in the event of failure. High availability provides the following functionality:
*Resilience.* Resilience is the ability of services to tolerate transient failures, such as loss of connectivity to the database or network failures.
*Failover.* Failover is the migration of a service process or task to another node when the node running the service process becomes unavailable.
*Recovery.* Recovery is the automatic or manual completion of tasks after an application service is interrupted.

*Web Services for Administration* The Administration Console is a browser-based utility that enables you to view domain properties and perform basic domain administration tasks, such as adding domain users, deleting nodes, and creating services.

*Repository Security* Effective in version 8.1, PowerCenter uses a more robust encryption algorithm. It also provides advanced purges for removing obsolete versions of repository objects.

*Partitioning* Database partitioning is now available. Apart from this, we can configure dynamic partitioning based on the nodes in a grid, on the number of partitions in the source table, or on the number-of-partitions option. The session creates the additional partitions and attributes at run time.

*Recovery* Recovery of workflows, sessions, and tasks is more robust now. The state of the workflow or session is stored in a shared file system rather than in memory.

*FTP* We now have options for partitioned FTP targets and indirect FTP file sources (with a file list).

*Performance* Pushdown optimization increases performance by pushing transformation logic to the database: the Integration Service analyzes the transformations and issues SQL statements against the sources and targets, and it processes only the transformation logic that it cannot push to the database.

*Flat File Performance* Data conversion is more efficient in the new version. We can concurrently write to multiple files in a session with partitioned targets, and we can specify a command for the source or target file in a session; the command can create the source, for example something like 'cat a file' (see the sketch after this section). An append option for flat file targets is also available now.

*pmcmd/infacmd* New features have been added to pmcmd. We can use infacmd to administer PowerCenter domains and the infasetup program to set up domain and node properties.

*Mappings* We can now build Custom transformation enhancements through the API using C++ and Java code, and we can assign partition threads to specific Custom transformations.
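As a rough illustration of the file command option mentioned above, the command configured for a flat file source could merge several extracts so the session reads them as a single stream. The paths below are made up:

# Example value for a session's source file command (paths are placeholders):
# concatenate the regional extract files into one input stream for the session.
cat /data/extracts/orders_east.dat /data/extracts/orders_west.dat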

Can you use the mapping parameters or variables created in one mapping in any other reusable transformation?

Yes.

Here are a few real-time examples of problems encountered while running Informatica mappings:
1) Informatica uses ODBC connections to connect to the databases. The production database passwords are changed periodically, and if the change is not made on the Informatica side as well, your mappings will fail with a database connectivity error.
2) If you are using an Update Strategy transformation in the mapping, you have to set Treat Source Rows As: Data Driven in the session properties. If you do not, the Integration Service ignores the updates and only inserts rows.
3) If a mapping loads multiple target tables, you have to define the Target Load Plan in the sequence in which you want them loaded.
4) "Snapshot too old" is a very common error when reading from Oracle tables, typically very large ones. Ideally you should schedule these loads when the server is not busy (that is, when no other loads are running).
5) You might see poor performance when reading from large tables. All the source tables should be indexed and have their statistics updated regularly (a rough sketch follows this list).
6) You may also see a "table or view does not exist" error, and a bulk load will fail if the target has an index defined on it.
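As a sketch of the housekeeping in point 5 (the credentials, TNS alias, schema, and table name are all placeholders), a pre-load step could refresh optimizer statistics on a large Oracle source table:

#!/bin/sh
# Refresh optimizer statistics on a large Oracle source table before the nightly load.
# Connection details, schema, and table name below are placeholders.
sqlplus -s etl_user/etl_pwd@PRODDB <<'EOF'
EXEC DBMS_STATS.GATHER_TABLE_STATS(ownname => 'SALES', tabname => 'ORDERS_SRC');
EXIT;
EOF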
What is the difference between PowerCenter 6 and PowerCenter 7?

1) You can look up flat files in Informatica 7.x, but you cannot look up flat files in Informatica 6.x.
2) The External Stored Procedure transformation is not available in Informatica 7.x, but it is included in Informatica 6.x.
3) The MQ Source Qualifier is used to convert mainframe source data.

Difference between normal load and bulk load?

Normal Load: A normal load writes information to the database log file, so the log is available if any recovery is needed. When the source is a text file and you are loading data into a table, you should use normal load only; otherwise the session will fail.

Bulk Mode: A bulk load does not write information to the database log file, so if any recovery is needed, nothing can be done. Comparatively, bulk load is much faster than normal load.

RE: What is the use of UNIX in Informatica?
Hi, these are some of the ways UNIX works with Informatica:
1) On the UNIX box we can create a script that calls an Informatica session and makes the session run (a sketch follows this list).
2) We can update statistics or clean up files by creating predefined functions.
3) We can also call SQL files through the scripts to create/drop staging tables, create indexes, and so on.
4) We can create dependencies among various sessions.
5) We can create parameter files and define session and mapping variables such as the DB connection, process date, bad file name, log file name, and so on.
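A minimal sketch of point 1, assuming made-up domain, service, folder, workflow, and parameter file names; the pmcmd options shown are the common ones, so verify them against the Command Reference for your release:

#!/bin/sh
# Wrapper script: run a workflow with a parameter file and report failure.
# Service, domain, credentials, folder, workflow, and paths are placeholders.
PARAMFILE=/etl/params/wf_load_orders.param

pmcmd startworkflow -sv IS_DEV -d Domain_Dev -u Administrator -p Password1 \
    -f SALES -paramfile "$PARAMFILE" -wait wf_load_orders

if [ $? -ne 0 ]; then
    echo "wf_load_orders failed, check the workflow and session logs" >&2
    exit 1
fi
echo "wf_load_orders completed successfully"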

I am not very sure how to install Informatica on UNIX.

Here are the new enhancements in 8.6 (copied from the help manual):

Administration Console
Generate and upload node diagnostics. You can use the Administration Console to generate node diagnostics for a node and upload them to the Configuration Support Manager on http://my.informatica.com. In the Configuration Support Manager, you can diagnose issues in your Informatica environment and maintain details of your configuration. For more information, see the PowerCenter Administrator Guide.

Command Line Programs
New pmrep command. The pmrep command line program includes a new command, MassUpdate. Use MassUpdate to update a set of sessions based on specified conditions. For more information, see the PowerCenter Command Reference.

Deployment Groups
Non-versioned repository. You can create and edit deployment groups and copy deployment groups to non-versioned repositories. You can copy deployment groups between versioned and non-versioned repositories. When you copy a dynamic deployment group, the Repository Service converts it to a static deployment group. For a non-versioned repository, if the objects in the deployment group exist in the target repository, the deployment operation deletes the existing objects and creates new objects.
Execute Deployment Groups privilege. The Execute Deployment Groups privilege enables you to copy a deployment group without write permission on the target folders. You still need read permission on the source folders and execute permission on the deployment group to copy the deployment group.
Post-deployment validation. Validate the target repository after you copy a deployment group to verify that the objects and dependent objects are valid.
For more information, see the PowerCenter Repository Guide.

Domains
Mixed-version domain. You can run multiple application service versions in a mixed-version domain. Run a mixed-version domain to manage services of multiple versions or to upgrade services in a phased manner. In a mixed-version domain, you can create and run 8.6 and 8.6.1 versions of the Repository Service, Integration Service, and Web Services Hub. The application service version appears on the Properties tab of application services in the domain. If you run a mixed-version domain, you can filter the versions of services that you want to view in the Navigator.
License Management Report. Monitors the number of software options purchased for a license and the number of times a license exceeds usage limits. The License Management Report displays license usage information such as CPU and repository usage and the node configuration details. For more information, see the PowerCenter Administrator Guide.

Integration Service
External Loading
IBM DB2 EE external loader. The IBM DB2 EE external loader can invoke the db2load95 executable located in the Integration Service installation directory. If the IBM DB2 client version 9.5 is installed on the machine where the Integration Service process runs, specify the db2load95 executable when you create the external loader connection.
Partitioning
Oracle subpartition support. PowerCenter supports Oracle sources that use composite partitioning. If you use dynamic partitioning based on the number of source partitions in a session with a subpartitioned Oracle source, the Integration Service sets the number of partitions to the total number of subpartitions at the source.
Pushdown Optimization
Pushdown compatible connections. When a transformation processes data from multiple connections, the Integration Service can push the transformation logic to the source or target database if the connections are pushdown compatible. Two connections are pushdown compatible if they connect to databases on the same database management system and the Integration Service can unambiguously identify the database tables that the connections access. Connection objects are always pushdown compatible with themselves. Additionally, some relational connections are pushdown compatible if they are of the same database type and have certain identical connection properties.
Subquery support. The Integration Service can produce SQL statements that contain subqueries. This allows it to push transformations in certain sequences in a mapping to the source or target when the session is configured for source-side or full pushdown optimization.
Netezza support. You can push transformation logic to Netezza sources and targets that use ODBC connections.
Real-time
JMS resiliency. Sessions that fail with JMS connection errors are now resilient if recovery is enabled. You can update the pmjmsconnerr.properties file, which contains a list of messages that define JMS connection errors.
Streaming XML. You can configure the Integration Service to stream XML data between a JMS or WebSphere MQ source and the XML Parser transformation. When the Integration Service streams XML data, it splits the data into multiple segments. You can configure a smaller input port in the XML Parser transformation and reduce the amount of memory that the XML Parser transformation requires to process large XML files.
Synchronized dynamic cache. You can configure multiple real-time sessions to access the same lookup source table. Each session has a Lookup transformation that shares the lookup source with the other sessions. When one session generates data and updates the lookup source, each session can update its own cache with the same information.

Metadata Manager
Business glossary. You can create business terms organized by category in a business glossary and add the business terms to metadata objects in the Metadata Manager metadata catalog. Use a business glossary to add business terms to the technical metadata in the metadata catalog. You can import and export business glossaries using Microsoft Excel. For more information, see the Metadata Manager Business Glossary Guide.
User interface. The Browse page in Metadata Manager contains the Catalog, Business Glossary, and Shortcuts views.
Use the views to browse and view details about metadata objects, categories and business terms, and shortcuts in Metadata Manager. In addition, the Details section contains a single view of all properties for metadata objects and business terms. For more information, see the Metadata Manager User Guide.
Resources. Metadata Manager added the following resources:
- ERwin. Extract logical models from ERwin.
- Oracle Business Intelligence Enterprise Edition. Extract metadata from OBIEE.
For more information, see the Metadata Manager Administrator Guide.
Metadata Manager Service. You can configure an associated Reference Table Manager Service to view reference tables from the Metadata Manager business glossary. For more information, see the PowerCenter Administrator Guide.
Load Monitor. The Load Monitor updates the current status and load events and contains summary information for objects, links, and errors as you load a resource. It also displays session statistics for PowerCenter workflows.
Error messages. Metadata Manager includes improved error messages to display the cause of errors when you create and configure resources.
Data lineage. You can publish a complete data lineage diagram to a Microsoft Excel file. For more information, see the Metadata Manager User Guide.
Migrate custom resource templates. You can migrate custom resource templates between Metadata Manager instances. For more information, see the Metadata Manager Custom Metadata Integration Guide.
Object Relationships Wizard. Use the Object Relationships Wizard to create relationships by matching metadata objects by any property. You can also use the Object Relationships Wizard to relate categories and business terms to technical metadata in the metadata catalog. For more information, see the Metadata Manager User Guide.

PowerExchange (PowerCenter Connect)
PowerExchange for Netezza
Parameters and variables. You can use parameters and variables for the Null Value, Escape Character, and Delimiter session properties.
PowerExchange for SAP NetWeaver
Row-level processing. The Integration Service can process each row of an outbound IDoc according to the IDoc metadata and pass it to a downstream transformation. Row-level processing can increase session performance.
Viewing ABAP program information. You can view ABAP program information for all mappings in the repository folder.
PowerExchange for Web Services
Importing WSDLs with orchestration. You can import a WSDL file generated by orchestration as a web service source, target, or transformation.

Reference Table Manager
Support for default schema for Microsoft SQL Server 2005. When you create a connection to a Microsoft SQL Server 2005 database, the Reference Table Manager connects to the default schema of the user account provided for the connection.
Code page for import and export file. When you import data from external reference files into Reference Table Manager, you can set the code page for the external file. When you export data from Reference Table Manager, you can set the code page for the export file.

Repository and Repository Services
Upgrade Services
Upgrade services to the latest version. You can use the Upgrade Services tab to upgrade the Repository Service and its dependent services to the latest version. For more information, see the PowerCenter Administrator Guide.

Transformations
Data Masking
Expression masking. You can create an expression in the Data Masking transformation to mask a column.
Key masking. You can configure key masking for datetime values. When you configure key masking for a datetime column, the Data Masking transformation returns deterministic data.
Parameterizing seed values. You can use a mapping parameter to define a seed value when you configure key masking. If the parameter is not valid, the Integration Service uses a default seed value from the defaultValue.xml file.
Masking Social Security numbers. You can mask a Social Security number in any format that includes nine digits. You can configure repeatable masking for Social Security numbers.
Company Names Sample Data. The Data Masking installation includes a flat file that contains company names.
You can configure a Lookup transformation to retrieve random company names from the file.

Web Services Hub
Web service security. The Web Services Hub security for protected web services is based on the OASIS web service security standards. By default, WSDLs generated by the Web Services Hub for protected web services contain a security header with the UsernameToken element. The web service client request can use plain text, hashed password, or digested password security. For more information, see the PowerCenter Web Services Provider Guide.

Changed Data Capture
CDC = Change Data Capture.
Purpose:
=======
1. CDC simplifies the ETL in data warehousing applications.
2. Data integration is an integral part of all data warehousing applications. Data is often extracted on a nightly basis from the transaction system and transported to the data warehouse.
3. All the data in the DW is refreshed with data extracted from the source system.
4. Since the data extraction takes place daily, it is much more efficient to extract and transport only the data that has changed since the last extraction.

SCD
====
SCD is a concept for handling recently modified data. In a DW it is necessary to maintain the history of the data for future decision-making analysis. In Informatica we have 3 types of SCDs:
Type 1: Keep the most recent values in the target - this does not maintain history.
Type 2: Keep a full history of changes in the target (a rough sketch follows below).
  2.1 Keep a version number in a separate column.
  2.2 Mark the current dimension record with a flag.
  2.3 Mark the dimension records with their effective date range.
Type 3: Keep the current and previous values in the target.
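As a rough sketch of Type 2 with an effective date range and a current-row flag (the connection, table, column, and sequence names below are made up, and in practice the same logic is usually built as an Informatica mapping with a Lookup and an Update Strategy rather than hand-written SQL), a wrapper script could run the following against the warehouse:

#!/bin/sh
# SCD Type 2 sketch: expire the current version of changed customers, then
# insert a new current version. All object names and credentials are placeholders.
sqlplus -s etl_user/etl_pwd@DWHDB <<'EOF'
-- Close out the current row when the incoming attribute differs.
UPDATE dim_customer d
   SET d.eff_end_date = SYSDATE,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id   = d.customer_id
                  AND s.customer_name <> d.customer_name);

-- Insert a new current row for changed and brand-new customers.
INSERT INTO dim_customer
       (customer_key, customer_id, customer_name,
        eff_start_date, eff_end_date, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.customer_name,
       SYSDATE, NULL, 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id   = s.customer_id
                      AND d.customer_name = s.customer_name
                      AND d.current_flag  = 'Y');

COMMIT;
EXIT;
EOF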
