
Table of Contents

Velocity ........................................................... 3
Enterprise Strategies .............................................. 3
Data Governance .................................................... 3
Integration Competency Center (ICC) and Lean Integration ........... 17
Service Oriented Architecture ...................................... 26
Data Governance
Description
Although its value is not represented on the balance sheet, data is one of the
most important assets in an organization. Data represents an organization's
customers, employees and suppliers; its products and services; its activities
and transactions; its outcomes and results; and a history of decisions and
actions. When managed correctly, data can become an organization's most
valuable asset. It can help the organization remain competitive and agile,
proactively meet customer needs, minimize compliance and corporate risk,
and keep costs in check.
As a practice with roots in corporate and Information Technology (IT)
governance, Informatica defines data governance as the processes, policies,
standards, organization and technologies required to manage and ensure the
availability, accessibility, quality, consistency, coherency, auditability and
security of data in an organization. Informatica's Data Governance Framework
is categorized into ten complementary facets.
This framework is not tied to any specific data management business objective, industry scenario, use case or technology investment. For example, it applies when implementing an MDM solution to provide a single view of customer for a multi-channel retailer, implementing an ILM solution to enforce data masking and data privacy policies for a financial services firm, or implementing a consolidated financial data warehouse to support financial reporting requirements for a manufacturing company. In all cases, these focus
areas play a role in determining overall success. This framework provides an outline and checklist for data management
specialists to successfully deliver data management capabilities within and across the organization.
The Ten Facets of Data Governance
Vision and Business Case
Data governance is not about the data; it's about the business processes, decisions and stakeholder
interactions that are to be enabled. The vision and business case must clearly articulate the business opportunity and not just
the "data opportunity." The data remains relevant as the "means" and not the "end." The vision defines the broader strategic
objective while the business case identifies specific business opportunities to focus efforts. In other words, the vision is the
wide view of what the solution can be. The business case defines which parts of the vision have the most immediate impact on
the business and therefore should be accomplished first. A vision statement is used to set an ultimate destination, but the
business case must outline how to start the journey to get there. The vision should look well into the future (3-5 years minimum)
in terms of the business value the data governance investments can deliver. Meanwhile, the business case must be pragmatic
and should identify, at least qualitatively, the parts of the business that will benefit from the improvements.
People
The right people are required to support, sponsor, steward, operationalize and ultimately deliver a positive return on data
assets. It is common practice for organizations to form an executive steering committee to coordinate communication,
prioritization, funding, conflict resolution and decision-making across the enterprise. In addition, the following roles are
essential in an effective data governance program:
* Executive Sponsor - The optimal Executive Sponsor will be senior CEO-level executive(s) whose responsibilities
span functional, line of business, application and geographic silos. In larger enterprises, multiple Executive Sponsors
may be required. For example, a multi-national manufacturer would need an Executive Sponsor at the corporate level, but
they would also need one at each division (Aircraft, Healthcare, etc.) and possibly at lower levels. Sponsors should be
identified early since they drive resource allocation, staffing, funding, business prioritization and cross-functional
collaboration. To be effective, a sponsor must be an active participant and evangelist.
* Data Steward/Data Quality Steward - Data Stewards are the business and IT subject-matter experts who can most
effectively translate how data and systems influence the business processes, decisions and interactions most relevant to
the organization. The business stewards must be IT-savvy, and the IT stewards must be business-savvy. Both must be
strong communicators and facilitators across the part of the organization they represent. Experienced business analysts
with the expertise to bridge business and IT communications often make the best business stewards, while data and
enterprise architects can provide critical perspectives as IT stewards.
* Data Governance Leader - The responsibility of the Data Governance Leader is to ensure that governance
processes and decisions are carried out impartially and that disputes that may naturally arise between various parties
within the enterprise are addressed fairly, in accordance with the governance program and without bias. The Data
Governance Leader coordinates tasks for data stewards, helps communicate decisions made by stewards to relevant
stakeholders, drives ongoing data auditing and metrics to assess program success and ROI, and is the primary point of
escalation to the executive sponsor and steering committee. A program or project management office (PMO) may
sometimes serve the role of Data Governance Program Leader.
Tools and Architecture
As the following graphic represents, enterprise and data architects should consider the full lifecycle of critical enterprise data.
This includes:
* Upstream, on-premises transactional/operational applications, systems and processes that create, update, import or
purchase data.
* Downstream, on-premises analytical applications, systems and processes that consolidate, reconcile, deliver and
consume data.
* Growth of off-premises sources and targets of data including Cloud-based applications and platforms, mobile devices,
third-party data feeds, remote sensor data and the increasing need to govern "Big Data" assets including social data,
repositories of unstructured data and machine-generated data.
* Supporting data management infrastructure that enables and ensures compliance with the organization's unique
requirements for delivery of the "right data" at the right time with all that this implies (the right latency, throughput, quality,
security, context, etc.).
* Assessment and delivery of the Shared Capabilities that must be made available across the enterprise data architecture
and not confined within specific applications or tools. A common approach includes an Integration Competency
Center (ICC) with responsibility for a unified data management platform. The shared capabilities may also involve
sharing business processes through a Business Competency Center, which could include an ICC as one component.
Specific enabling software capabilities that should be considered to help launch a data governance effort
include:
* Data profiling - Data profiling software helps business analysts and data stewards answer these questions: "what
does data look like today?"; "how does data in one system relate to data in another system?"; and "what rules and policies
should we consider defining to improve it?" (A minimal sketch of these checks follows this list.)
* Data discovery - While data profiling allows in-depth analysis of specified data sets, data discovery allows
organizations to identify where any data anomaly or business scenario occurs across all data sources. For example, the
data privacy organization may require the ability to identify where personally identifiable information is used, and how that
relates to specific business rules/processes where obfuscation or data masking needs to be used.
* Business glossary - A business glossary allows the business to capture and share the full business context around
critical data. In addition to the expected definitions of core data entities and attributes, context can also include rules,
policies, reference data, free-form annotation, links and data owners. Some organizations manage their shared definitions
in Word documents or spreadsheets and typically focus on the terms and definitions but miss the broader context. A
packaged business glossary enables collaboration across the business and IT roles that create, approve and consume
these definitions (which minimizes the risk of redundancy, definition stagnation and versioning conflicts).
* Metadata management/data lineage - The ability to reconcile and provide transparency and visibility to the
supporting metadata of critical data is a foundational element of a data management reference architecture. Data lineage
visualization and auditing capabilities allow data architects and stewards to effectively assess the impact of potential
changes to data definitions, rules or schemas, as well as perform root cause analysis when responding to a data quality
or security failure. IT staff ranging from data modelers, enterprise architects, business systems analysts and developers to
DBAs often manage the technical metadata, while business analysts and business stewards are often responsible for the
business-oriented metadata. Ideally, the business glossary should be well integrated with the metadata solution.
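As a rough illustration of the kind of checks that profiling and discovery tools automate at enterprise scale, the following Python sketch computes simple completeness and uniqueness statistics and flags a column that appears to contain personally identifiable information. The table, column names and the single email pattern are illustrative assumptions, not part of the Velocity methodology or any specific product.

    # Illustrative sketch only: basic profiling statistics and a naive PII scan.
    # The data, column names and email pattern are assumptions for the example.
    import re
    import pandas as pd

    customers = pd.DataFrame({
        "customer_id": [101, 102, 103, 103],
        "email":       ["a@example.com", None, "not-an-email", "c@example.com"],
        "birth_date":  ["1980-02-14", "1975-07-30", None, "1990-11-02"],
    })

    # Profiling: completeness (null %) and uniqueness (distinct values) per column.
    profile = pd.DataFrame({
        "null_pct": customers.isna().mean() * 100,
        "distinct": customers.nunique(),
    })
    print(profile)

    # Discovery: flag columns where most non-null values match an email pattern.
    email_pattern = r"^[^@\s]+@[^@\s]+\.[^@\s]+$"
    for col in customers.columns:
        values = customers[col].dropna().astype(str)
        if len(values) and values.str.match(email_pattern).mean() > 0.5:
            print(f"Column '{col}' looks like it holds email addresses (candidate PII)")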
In addition, architects should also consider where and how they want to manage their data modeling, process modeling, data
quality, data privacy, master data, data monitoring and auditing, workflow management, and collaboration
capabilities. They must determine how data stewards will be notified and how these stewards should mitigate exceptions to
any established data quality or privacy rules.
Policies
Business policies and standards are a critical path deliverable for any data governance function. Common policies that must be
agreed upon, documented and complied with include data accountability and ownership, organizational roles and
responsibilities, data capture and validation standards, information security and data privacy guidelines, data access and
usage, data retention, data masking and archiving policies. When it comes to defining these policies, there are opportunities
and challenges that must be understood. Common challenges often include:
* Analysis paralysis - Policies, by definition, set parameters on how employees and other impacted stakeholders are
expected to behave. People resist change, especially if they're the ones impacted and if the impacts are uncertain. The
politics often involved in defining a policy could stall efforts if not well facilitated to ensure only constructive debate. The
influence of an Executive Sponsor or steering committee can help to drive results.
* Non-compliance - Don't waste time defining policies that won't be used until there is top-down support to incent
compliance.
* Usability - It's one thing to have executive support to ensure compliance; it's quite another to understand how to
ensure compliance. Write the policy in an understandable way so that the business process, application and data
management stakeholders can deliver the necessary process, system and rule changes.
Many organizations have an opportunity to leverage existing governance programs. They often document business- or
IT-driven policies that set parameters on how enterprise data should be managed and used, but may not currently label them
"data governance policies." For example, IT governance, enterprise architecture (EA) standards and any number of IT-led
competency centers or centers of excellence may define standards and policies on how best to capture, update and share
relevant data and metadata across the organization. Many business-driven data policies likely come from the organization's
Chief Risk and Chief Financial Officers. These may include governance, risk and compliance policies, information security or
data privacy policies, and of course any number of policies created to ensure compliance with external government and
industry regulations. When the data governance effort is ready to scope and define what policies it must document and
implement, it is often easier to start with the work that's already been done and reconcile which policies should be owned by
the data governance effort, which should simply be recognized and complied with, and which should be replaced.
Common policies that a data governance initiative will be responsible for documenting and maintaining include:
* Data accountability and ownership - These policies clarify which senior business leaders - or combinations of
business leaders (e.g., a steering committee) - are accountable for the quality and security of critical data. The policy must
outline what ownership actually means and define the rights and responsibilities of the owners and their span of control
over the data.
* Organizational roles and responsibilities - These policies document and explain the responsibilities of business
and IT data stewards, the Data Governance Leader and other dependent stakeholders.
* Data capture and validation standards - These policies define minimum required data capture standards, data
validation rules, reference data rules, etc. The goal is to ensure the people, processes and systems that capture, import,
update, transform or purchase critical data do so in a consistent, standardized manner with a focus on quality, ensuring
fitness for enterprise use. (A minimal rule sketch follows this list.)
* Data access and usage - Data usage policies ensure appropriate use of data by appropriate stakeholders. Limiting
access to sensitive or confidential information ensures compliance with edicts, such as insider trading regulations and
the Payment Card Industry Data Security Standard (PCI-DSS). Usage policies extend beyond regulatory compliance to also
ensure optimal use of data assets. For example, contact management policies are used to coordinate, prioritize and
minimize multi-channel customer communications across sales, marketing and service organizations. This helps to reduce
lower-value contacts and avoid the risk of the customer feeling spammed, which may lead them to opt out of future
communication.
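As a hedged illustration of how a data capture and validation standard might be expressed as executable rules, the following Python sketch checks a single captured record against a required-field rule, a reference-data rule and a format rule. The field names, reference values and rules are assumptions made for the example, not policies defined by Velocity.

    # Illustrative sketch only: field names, reference data and rules are assumptions.
    import re

    APPROVED_COUNTRY_CODES = {"US", "CA", "GB", "DE"}   # sample reference data set

    def validate_customer_record(record: dict) -> list:
        """Return a list of policy violations found in one captured record."""
        violations = []
        # Required-field rule: a name must be supplied at the point of capture.
        if not record.get("customer_name", "").strip():
            violations.append("customer_name is required at point of capture")
        # Reference-data rule: country codes must come from the approved list.
        if record.get("country_code") not in APPROVED_COUNTRY_CODES:
            violations.append("country_code must come from the approved reference data")
        # Format rule: US postal codes must be 12345 or 12345-6789.
        if record.get("country_code") == "US" and not re.fullmatch(
                r"\d{5}(-\d{4})?", record.get("postal_code", "")):
            violations.append("US postal_code must match 12345 or 12345-6789")
        return violations

    print(validate_customer_record(
        {"customer_name": "Acme Corp", "country_code": "US", "postal_code": "9410"}))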
In addition to these policies, several policies focus on information security and data privacy. The risk, security and compliance
stakeholders that own and monitor compliance with these policies must be active contributors in the data governance program.
Security and privacy policies include:
* Customer communication privacy preferences - These privacy policies ensure compliance with anti-spam
legislation, "Do Not Call" registries and customer marketing contact management best practices. These policies are meant
to be transparent, customer-facing documents that should clarify whether the organization has adopted an opt-in policy
(will never send marketing communications unless the customer expressly permits) or an opt-out policy (will always send
marketing communications unless the customer expressly requests a stop). These policies can be as granular as necessary
and can request opt-in/opt-out at the marketing communication channel level (e.g., phone, mail, email, text, social). Also,
they can ask customers to set preferences for which products and/or services they are interested in learning more about.
* Data masking - These data privacy policies define and classify sensitive data, identify where it resides and clarify
when it needs to be encrypted appropriately and consistently across multiple applications and database instances. Data
masking policies are critical to ensure compliance with mandated data protection rules and standards, such as PCI-DSS and
those covering Personal Health Information (PHI) and Personally Identifiable Information (PII). Organizations demand the same data
security policies for archiving as they do for their testing and production environments, so the data governance effort must
include all of these instances in its scope when managing data privacy. (A minimal masking sketch follows this list.)
* Data archiving and subsetting - Organizations focus on these policies to dramatically reduce the inactive data in
production and legacy environments to cut down on storage costs. These policies, when effectively implemented, should
also reduce data storage costs in development and test environments.
* Data retention - Retention policies must balance the desire to archive and purge unused data to reduce storage
costs and business risk with business and legal discovery retention management requirements. These policies will clarify
what data needs to be stored for how long, in what format, applying what rules, with what level of masking or encryption,
and with what access guidelines.
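The sketch below illustrates, under simplifying assumptions, two common masking choices referenced above: deterministic pseudonymization of an identifier (so referential integrity is preserved without exposing the value) and partial redaction of a card number in the PCI-DSS style. The field names and techniques are examples only, not the behavior of any particular masking product.

    # Minimal sketch of data masking for non-production use; illustrative only.
    import hashlib

    def mask_record(record: dict) -> dict:
        masked = dict(record)
        # Deterministic pseudonym: the same input always yields the same surrogate,
        # so joins across systems still work but the real value is not exposed.
        masked["national_id"] = hashlib.sha256(
            record["national_id"].encode()).hexdigest()[:12]
        # Partial redaction: keep only the last four digits of the card number.
        masked["card_number"] = "**** **** **** " + record["card_number"][-4:]
        return masked

    print(mask_record({"national_id": "987-65-4321",
                       "card_number": "4111111111111111"}))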
Organizational Alignment
While the People facet focuses on the skills and responsibilities, the organizational alignment facet focuses on the working
relationships between these roles. In looking to define an optimal organizational structure to support the data governance
goals, questions that should be addressed include:
* Who will be the executive sponsor? The optimal executive sponsor will be very senior executive(s) whose
responsibilities span functional, line of business, application and geographic silos.
* Will there be an executive steering committee? For any organization larger than a few thousand employees or $1
billion in revenue, it is a best practice to form an executive steering committee or council to help support and drive the
cross-functional decision-making, prioritization, resourcing and change management. The steering committee will include
the executive sponsor(s), dependent business and IT leadership, as well as the data governance program manager who
will help to define the actionable agenda items for this forum.
* Who are the business data owners? Every line of business, functional group and geographic area has different
priorities for which business processes, decisions and interactions are critical to supporting their key performance
indicators (KPIs). The business leaders responsible for delivering those KPIs must accept accountability for their role in
ensuring the supporting data can meet their needs.
* What are the escalation paths for policy and data conflicts? Who are the stakeholders involved in mitigating an
exception to data quality or security policies, rules and standards?
* Are the data steward roles full-time or part-time? Finding the right data steward is a challenge facing most
organizations. The best data stewards are the top subject matter experts across business and IT, but if they've achieved
such a great reputation there is a high likelihood that they have little available capacity for additional duties. There is no
single right answer, but a combination of using a percentage of existing staff on a part-time basis plus some full-time data
stewards may be a good strategy to consider.
* Do the stewards hold solid- or dotted-line reporting relationships to the executive sponsors? It's unlikely the
data steward is going to report directly to the executive sponsor, but whatever the case, it is important to define the
relationships, obligations, and objectives between the data stewards and the key stakeholders in the organization.
A DACI or RACI roles and responsibilities matrix is a useful tool to help answer these questions, as illustrated in the sketch
below. (DACI defines the roles of Driver, Approver, Contributor and Informed; RACI suggests similar roles of Responsible,
Accountable, Consulted, and Informed.)
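For illustration, the sketch below captures a possible RACI assignment for two data governance activities; the activities, roles and letter assignments are assumptions invented for the example rather than a prescribed Velocity matrix.

    # Illustrative RACI assignments; activities, roles and letters are assumptions.
    raci = {
        "Approve data quality policy": {
            "Executive Sponsor": "A", "Data Governance Leader": "R",
            "Business Data Steward": "C", "IT Data Steward": "C", "Project Teams": "I",
        },
        "Resolve a data definition conflict": {
            "Executive Sponsor": "I", "Data Governance Leader": "A",
            "Business Data Steward": "R", "IT Data Steward": "C", "Project Teams": "I",
        },
    }
    for activity, assignments in raci.items():
        print(activity, "->", assignments)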
Measurement and Monitoring
Data governance must be measured at three distinct levels. First, program-level measurement identifies and highlights the
qualitative level of organizational influence and impact the data governance efforts deliver. Next, operational data monitoring
is required for the stewards to evaluate how the data is behaving against expected policy and validation baselines. Last (and
most important for sustaining data governance momentum) is the quantitative business value measurement that links data
management efforts to real business value such as revenue growth, cost savings, risk reduction, efficiency improvements and
customer satisfaction.
Velocity Best Practices recommend building the measurement strategy as follows:
* Start with data governance program effectiveness to satisfy sponsors - Early in the life of a data governance
initiative, the biggest struggle is often getting the attention of business and IT stakeholders. An important measure of
success is the level of engagement, participation and influence the data governance program is having. For example, measure
the number of lines of business, functional areas, system areas, project teams and other parts of the organization that have
committed stewardship resources or sponsorship. In addition, categorize and track the status of all issues that come in to the
data governance function, and capture all other types of value-added interactions such as training, consulting and project
implementation support. While these metrics may not demonstrate business value, they will help early-stage data governance
efforts show progress to their sponsors as they work to operationalize data management efforts.
* Develop operational data quality and policy auditing metrics to focus data stewards - While the business case and
ROI are not about the data, the resulting business rules, policies, processes and standards are.
The business and IT stewards alike are responsible for ensuring compliance with these standards and, when necessary,
are required to mitigate or reconcile a data quality, privacy or security issue. Data stewards need visibility to both
proactively monitor and reactively mitigate any data-related issues that are routed to them through the predefined
stewardship workflows. Themes for these operational metrics include data accuracy, completeness, integrity, uniqueness,
consistency, standardization, and audits ensuring compliance with privacy and security policies. Visibility to these metrics
must be developed and made available to stewards. This includes the metadata and data lineage analysis capabilities that
enable stewards to perform root cause and impact analysis, as well as provide the necessary auditability and transparency for
compliance.
* Model business value and ROI measures to maintain momentum with business leadership - This is the
critical factor in the difference between data governance as a one-off IT project versus data governance as a sustainable
part of how the organization does business. Business value from data governance investments can range across a variety
of benefits and can include reducing penalties by ensuring regulatory compliance, reducing enterprise risk (e.g.,
contractual, legal, financial, brand), lowering costs (e.g., business, labor, software, hardware), optimizing spending (e.g.,
procurement, supply chain, services, labor), improving operational efficiencies (e.g., employee, partner, contractor),
increasing top-line revenue growth and optimizing customer experience and satisfaction.
Some business opportunities are "easier" (a relative term) to quantify than others. If the data governance team can
provide a balanced report that blends at least some transparent quantitative business value on certain targeted processes, the
senior leadership at most organizations will be fine with more qualitative benefits for others (so long as enough anecdotal
support is presented to show how key influencers believe the data governance efforts are helping). At minimum, the ROI
measures should cover the cost of the resources dedicated to the program.
Change Management
No matter how compelling the vision and business case, making data a true corporate asset is a major culture shift for most
organizations. To accomplish this move from the current to a desired future state, significant behavioral change will likely be
required across the workforce (and maybe even the partner/supplier ecosystems). Support for these organizational, business
process and policy changes will require significant training, communication and education with a "carrot and stick" performance
management program. This program will incentivize good data practices while discouraging damaging behaviors from the past.
The data governance program must include time, resources and management commitment to invest in the necessary change
management. This will impact organizational behavior in a way that allows data to be valued.
Dependent Processes
Critical enterprise data cannot be fully governed until the full lifecycle of the data is understood. This lifecycle includes the
upstream business processes that create, update, transform, enrich, purchase or import data; the downstream operational and
analytical processes that consume and derive insight and value from data; and end-of-data-life processes that include the
retirement and destruction of data. The consuming processes are usually top of mind, as they often make up the crux of the
typical business case. To deliver trusted, secure data, however, the biggest change must often come from the upstream processes
that may be responsible for much of the "garbage in/garbage out" situation that the data governance effort is
chartered to resolve.
Dependent processes are categorized into three broad areas: upstream processes, stewardship processes and downstream
processes. An effective data governance organization will accept responsibility for assessing and improving all of the processes
that touch the data and influence its usability.
Upstream processes - These are the business processes that capture, create, import, purchase, transform, or update data and
introduce it into the organization's information ecosystem. One of the most common, and toughest to solve, data governance
challenges centers around the reality that those in the organization responsible for these upstream processes rarely have
visibility - or incentive to care - about who is consuming this data downstream and why. The Executive Sponsor must enforce
and evangelize the recommended data capture and maintenance policies generated by the data governance organization
amongst peers.
Stewardship processes - When referring to the lifecycle of the physical data itself, the stewardship process stage is the most
relevant. It is the actual "application," from a system- or human-centric workflow, of the data policies, business rules, standards
and definitions created as part of the data governance program. The automated application of these rules may appear as
service-enabled or application-specific rules that archive, cleanse, enrich, mask, match, merge, reconcile, repair, validate, verify
or otherwise improve the security and quality of data. The human-centric workflow refers to the end-to-end stewardship
processes defined to facilitate the identification, notification, escalation and mitigation of any exceptions to the data quality or
data privacy rules and policies listed above that require manual intervention to resolve.
Downstream processes - These are the operational and analytical processes that consume, protect, archive, purge and
otherwise draw insight and value from data. Executive Sponsors will only agree to support changes to upstream processes,
systems and organizational behaviors - and allow investment to create the organizational and technology improvements
needed to enable the stewardship processes - if significant business value and ROI can be delivered against these
downstream processes that form the foundation of the vision and business case.
Program Management
A multi-phase, multi-year plan for starting small and growing into cross-enterprise, self-sustaining holistic data governance
doesn't manage itself. Whether through an official PMO or a team of program drivers, the data governance efforts need skilled
project/program management professionals to coordinate the complex interactions, communications, facilitations, education,
training and measurement strategy. Effective program management can ensure adoption, visibility and momentum for future
improvements.
It is important to recognize that it will take significant coordination, facilitation and communication to evangelize, prioritize,
measure and evolve data governance from a pilot project to a foundational way of doing business; this must be someone's
full-time job. While the business must own and accept accountability for data governance and the resulting policies, rules and
standards, IT typically offers the strongest program/project management skills within an organization, and therefore will often
play a significant role in driving the effort.
Defined Processes
These are the stewardship processes that make up the data governance function and are where the
majority of Velocity Best Practices focus. These include the processes that cleanse, repair, mask, secure, reconcile, escalate,
and approve data discrepancies, policies and standards. There are over twenty distinct processes segmented into core
process stages of Discovery, Definition, Application (of rules/policies) and Measure and Monitor.
Business Drivers and Objectives
The most common business drivers for data governance initiatives are:
* Growing revenue
* Lowering costs
* Ensuring regulatory compliance
* Enabling mergers, acquisitions and divestitures
* Optimizing partnering and outsourcing
While these are among the basic goals of any business, it is useful to examine them in more detail and review the integral role
data governance can play in achieving these goals.
Growing Revenue
One of the most important goals of almost any commercial business is to grow revenue; an effective way to grow revenue is to
increase cross-sell/up-sell rates and improve retention among existing customers by providing optimal, differentiated customer
experiences. To do so, organizations need a broad and deep understanding of their existing customers, the products they've
purchased or expressed interest in, who these customers are influencing or are influenced by, and any other preferences or
propensities to purchase that can be uncovered. With this "single view of the customer," organizations can proactively provide
superior service and better target campaigns and offers based upon a specific customer's needs.
Relevant data about the customer is often scattered across dozens or even hundreds of different business systems. To resolve
these data issues, companies must address the underlying organizational, process and technical issues related to the data.
Data governance provides the framework for addressing complex issues, such as improving data quality
or developing a single view of the customer at an enterprise level.
Lowering Costs
Organizations can reduce costs and increase operational efficiency by automating business processes. For example,
organizations may automate their procurement processes to lower purchasing and administration costs. While business
process automation increases efficiency, problems with enterprise data prevent companies from capitalizing on the full
potential of operational efficiency initiatives. Streamlining business processes across multiple financial, human resource, sales
and other business systems requires that the structure and meaning of data be reconciled across those systems-a task that
has often been an afterthought in operational efficiency and business process optimization initiatives.
The need to lower costs is driving projects, such as supplier or product master data management, to streamline core business
processes (e.g., inventory and supply chain management; multi-channel product catalog and promotion management) by
rationalizing, cleansing and sharing key master data elements. Data governance plays a critical role in the success of such
projects, providing a structure for addressing the organizational and process issues around master data.
Ensuring Regulatory Compliance
Doing business today requires compliance with a growing number of external regulations as well as with internal corporate
governance policies designed to increase transparency and prevent corporate malfeasance and fraud. To ensure compliance
with regulations, such as Sarbanes-Oxley, Basel II, the U.S. Patriot Act and the U.S. Health Insurance Portability and
Accountability Act (HIPAA), and with internal policies and controls, companies must streamline the collection of reporting data.
For many regulations, they must also document the sources of the data being reported, certify its accuracy and implement
specific governance policies.
Data governance is an essential foundation for ensuring compliance. It establishes the rigorous data standards, policies and
processes that are required by regulations and corporate governance policies, and it helps to automate compliance-related
tasks (while lowering costs). It also helps to ensure auditability and accountability for the data.
Enabling Mergers, Acquisitions and Divestitures
As merger and acquisition (M&A) activity picks up, organizations are faced with the need to rationalize and reconcile the IT
environments from merged or acquired entities. Typically these IT environments have very different systems, data models and
business processes. Post-M&A, IT organizations are often pressed to meet very tight timelines for integration. The goal is to
accelerate the promised synergies from the merger, both in the form of cost reductions from eliminating redundancies and
revenue growth from increased cross-selling. Equally challenging is the effort to divest or spin off a portion of the business and
the need to decouple systems and relevant data from what was once one organization into two or more.
The process of migrating and consolidating the data after a merger or acquisition, or decoupling data as part of a divestiture, is
a huge task (one that is often underestimated initially). IT groups must deal with unknown systems, resolve quality issues and
provide detailed documentation on how the information has been merged. The task involves much more than technical
integration. IT organizations must not only reconcile different data definitions and models, but processes must also be put in
place to ensure alignment of the various entities. A data governance framework provides significant value in managing the
organizational and technical complexities of M&A consolidation and divestitures and accelerates positive business results.
Optimizing Partnering and Outsourcing
Another broad market trend is the use of partners and outsourcers to manage parts of an organization's value chain. Many
organizations focus on core competencies and hand off non-core functions and processes to partners and outsourcing
providers. Some examples:
* High tech equipment companies rely on contract manufacturers for production.
* Manufacturers turn to UPS and FedEx for logistics and warehouse management.
* Pharmaceutical companies rely on third-party clinical trials management firms.
* IT departments outsource application development and network management.
* HR groups outsource administrative functions such as payroll or benefits management.
As business processes and IT systems shift to outside providers, the data associated with those
processes and systems is relocated outside the boundaries of the organization. Organizations must ensure that the data is
correctly migrated and secured for proper use by the outside provider. The data must be complete and accurate, and it has to
be restructured to work in the third-party system. It is important to note that although the data has been moved to a third party, it
remains a core asset of the organization. While it sits outside the firewall, the organization cannot relinquish visibility into and
control over that data. Organizations that send customer or employee data to outside contractors must consider policies for
masking and encrypting sensitive data for anyone not required or authorized to view it to perform their role effectively. A robust
data governance framework is critical to managing data that is fragmented across the extended value chain, especially when
defining the standards and processes for interaction and collaboration with external partners and outsourcers.
Industry Perspective
Informatica commissioned an in-depth Data Governance benchmarking study to gather data from organizations with active
data governance initiatives and to drill into the detail of these to see just what organizations are really doing, how broad the
scope of these initiatives is, and how effective they are. The survey was targeted at senior business and IT leaders worldwide,
primarily from organizations larger than US $1 billion in annual revenues.
In summary, the study found that organizations that considered themselves to have a quite successful data governance
program had:
1. A data governance mission statement
2. A clear and documented process for resolving disputes
3. Good policies for controlling access to business data
4. An active risk register (A risk register is a formal element of a data governance program in which the business risks
associated with poor data quality and/or inconsistent data are recorded along with steps to mitigate them.)
5. Effective logical models for key business data domains
6. Either business processes defined at a high level or fully documented at several levels and available for data
governance
7. Data quality assessments that were undertaken on a regular basis
8. A documented business case
9. A link between program objectives and team or personal objectives
10. A comprehensive training program
11. A Web site alongside a broader range of communication methods
The list is empirically collected and provides a foundation of techniques that have been demonstrated to be effective and
represent a view of current industry best practices. The full report, Data Governance Benchmarking Survey 2010, The
Information Difference Company Ltd., November 2010, is available on the Informatica web site.
Implementation Approach
Implementing a Data Governance program is, at a high level, a three-step process:
1. Maturity Assessment
2. Gap Analysis, Target Definition and Business Opportunity Priority Assessment
3. Develop and Execute Implementation Roadmaps
Since a data governance program is an ongoing business capability, the three steps are not just a one-time event but rather an
iterative process that is repeated continuously when viewed over the long term.
Step 1: Maturity Assessment
Most organizations that wish to implement a Data Governance program are not starting from zero; there likely are some
capabilities already in place. Every organization has a unique profile of strengths and opportunities, so it is necessary to first
understand the current context in order to develop a customized plan that is relevant and executable.
Informatica defines five broad stages of maturity as outlined in the following figure (not including "Stage Zero: Unaware," which
some organizations unfortunately find themselves in). There are four primary characteristics which define the maturity level of
an organization, represented by the bullet points within each level in the graphic:
1. Leadership - Who is leading the effort - characterized as bottom-up, grass-roots in Stage 1 to top-down,
executive-level in Stage 5.
2. Scope - From an ad-hoc, siloed focus in Stage 1 to a sustainable core business function in Stage 5.
3. Measurement/Metrics - From tactical technology metrics in Stage 1 to enterprise-wide total impact to the business
in Stage 5.
4. Data Governance Management - From ad hoc, exception-based management in Stage 1 to business-function
based management in Stage 5.
As organizations evolve from Stage 1 through Stage 5, the Data Governance program moves from a fragmented, silo-based
initiative to a broad-based, holistic enterprise program. The natural evolution also moves from being an IT-driven to a
business-driven program. It is important to recognize that value can be delivered at every level, although the types of benefits
that accrue to the organization, how they are measured and managed, and the overall impact and scale of the benefits change
at each stage. At Stage 1, the benefits are generally related to IT efficiency and compliance. As organizations mature, the
benefits of data governance increasingly become a strategic differentiator.
The detailed process of assessing an organization's maturity varies depending on whether it is done as an internal effort by
leaders within the organization or as a formal consulting engagement with hired consultants. The process may include
extensive interviews of business and IT staff, business risk surveys, business analyst time and activity analysis, and other
techniques. In any case, the Velocity Best Practice is to capture the results in the Data Governance Maturity Assessment
worksheet. In the example below, the Total Maturity Score is 2.77. The worksheet provides a foundation for Step 2 as well as a
way to measure progress of the Data Governance Program over a period of years.
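As a hedged sketch of how the worksheet arithmetic could work - assuming the Total Maturity Score is simply the average of the twelve question scores on a 1-5 scale, which the actual worksheet may weight differently - the following Python example uses invented scores that produce a total close to the 2.77 mentioned above.

    # Assumption: the total is a simple average of twelve question scores (1-5 scale).
    # Question names and scores are invented for illustration.
    question_scores = {
        "DG program focus": 4, "Executive sponsorship": 2, "Policy documentation": 2,
        "Monitoring compliance to data policies": 1, "Stewardship organization": 3,
        "Data management reference architecture": 4, "Standardization of DG technology": 4,
        "Data profiling": 4, "Data quality management": 3, "Master data management": 1,
        "Metadata management": 3, "Measurement and metrics": 2,
    }
    total = sum(question_scores.values()) / len(question_scores)
    print(f"Total Maturity Score: {total:.2f}")   # 2.75 with these sample values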
Step 2: Gap Analysis, Target Definition and Business Opportunity Priority Assessment
While the Total Maturity Score can be used to measure progress of the program over a long period of time (years), the scores
for each of the 12 questions are more useful for planning purposes. Individual scores that are more than one point below the
total score identify gaps in maturity that should be closed (and may present opportunities to be addressed by leveraging
strength areas), while scores that are more than one point above the total represent strengths that can be leveraged.
In the previous example, using a threshold of one point above and below the total score, it is possible to identify two major gaps
and four strengths as follows (a short scoring sketch follows the lists below):
Major Gaps/Opportunities
* Monitoring Compliance to data policies
* Master Data Management (technology)
Strengths
* DG Program Focus
* Data Management Reference Architecture
* Standardization of DG Enabling Technology
* Data Profiling (technology)
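A short sketch of the Step 2 threshold logic, using the same invented question scores as the maturity example above: anything more than one point below the total is reported as a gap, anything more than one point above as a strength.

    # Threshold logic from Step 2; scores are the same invented example values.
    question_scores = {
        "DG program focus": 4, "Executive sponsorship": 2, "Policy documentation": 2,
        "Monitoring compliance to data policies": 1, "Stewardship organization": 3,
        "Data management reference architecture": 4, "Standardization of DG technology": 4,
        "Data profiling": 4, "Data quality management": 3, "Master data management": 1,
        "Metadata management": 3, "Measurement and metrics": 2,
    }
    total = sum(question_scores.values()) / len(question_scores)          # 2.75 here
    gaps      = sorted(q for q, s in question_scores.items() if s < total - 1.0)
    strengths = sorted(q for q, s in question_scores.items() if s > total + 1.0)
    print("Gaps/Opportunities:", gaps)        # the two questions scored 1
    print("Strengths:", strengths)            # the four questions scored 4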
Another tool that may be used in this step is the Business Opportunity Priority Assessment worksheet.
The steps for this activity result in a business opportunity "Heat Map" as shown in the previous figure, and the instructions for
leveraging the worksheet are as follows:
1. ln the "Opportunity Assessment" tab, complete each column to identify and assess one opportunity per column.
2. Only complete cells in green. The current workbook includes sample data to demonstrate how to complete. Delete all of
the results in green to complete.
3. All questions under "Business Value" and "Level of lnvestment and Effort" sections are equally weighted for each section.
The nine questions under Business Value each have a weight of 11.11% and the twelve questions under Level of
lnvestment and Effort each have a weight of 8.33%; each section should have a total weight adding up to 100%. The
weightings can be customized by "Unhiding" Column C, but each of the two sections must total 100% for the calculator to
work.
4. Score each question for each opportunity column.
The "Data Governance Heat Map" tab will automatically pull appropriate data from the "Opportunity Assessment" tab and
calculate and plot opportunities onto table and graphic. The only new information that should be added to this tab is a more
detailed "Opportunity Description" (in green) to provide more context for that opportunity.
Step 3: Develop and Execute Implementation Roadmaps
Step 3 is to develop one or more implementation plans to close the gaps or capitalize on the opportunities identified in Step 2.
We refer to the high-level summary of these plans as "Roadmaps," which have the following key characteristics:
* Each roadmap fits onto one 8.5x11 sheet of paper (commonly a PowerPoint slide). The one-page format is designed to
facilitate "big picture" communication by showing a summary of the plan. The roadmap is not a substitute for detailed
task plans or checklists, which may be appropriate depending on the scale and complexity of the initiative.
* Depending on the scale and complexity of the implementation program, one or more roadmaps may
be needed. Each roadmap should have a clearly identified owner (the individual responsible for leading the
implementation) and sponsor (the executive support individual who can help provide resources and overcome
organizational roadblocks). A key determinant of whether multiple roadmaps are required is whether there are different owners
for the activities; each roadmap should be associated with a single owner. For example, the following is a list of six roadmaps
for a comprehensive program:
- Data Governance Program Roadmap
- Data Quality Roadmap
- Business Glossary and Metadata Roadmap
- Trusted Information Hub Roadmap
- Application Portfolio Optimization Roadmap
- Project Methodology Roadmap
* Each roadmap is structured into three or four phases. Long-term strategic roadmaps may be 3 or 4 years (1 year per
phase) while shorter-term tactical roadmaps are typically 30-60-90 days or 4 quarters. Ideally, each roadmap should
include both a strategic perspective and a tactical perspective.
* The activities/milestones on the roadmaps are organized into three tracks:
- People and Organization
- Process and Policy
- Technology and Infrastructure
The three tracks help to keep a balanced focus on the plan and to ensure that all dimensions of the implementation progress in
coordination (for example, to ensure that the business glossary technical infrastructure doesn't get too far ahead of the people
and process dimensions or fall too far behind). The Informatica ten-facet framework serves as a useful checklist of
activities to consider at each level.
See Multi-Generation Roadmap for an example of a multi-generation roadmap that can also be used as a template.
Competencies
Enterprise Architecture
Metadata Management (including Metadata Manager and Business Glossary)
Information Life-Cycle Management
Financial Management
Data Quality Management
Templates
Data Governance Maturity Assessment
Business Opportunity Priority Assessment
Multi-Generation Roadmap
Integration Competency Center (ICC) and Lean Integration
Description
An Integration Competency Center (ICC) or Integration Center of Expertise
(COE) is a shared services function consisting of people, technology, policies,
best practices, and processes - all focused on rapid and cost-effective
deployment of data integration projects critical to meeting organizational
objectives. Lean Integration is a management system that emphasizes
creating value for end customers, continuous improvement, and eliminating
waste as a sustainable data and process integration practice. An ICC may be
implemented without adopting Lean practices, but the reverse is not generally
feasible; Lean Integration disciplines require the structure and focus of an ICC
in order to achieve the desired results.
Organizations have found that there is a direct connection between the caliber
of their ICC and their ability to respond quickly to changing processes, intense
competition and demanding customers. In short, the ICC is an enabler for the
delivery of trustworthy data in real time.
An ICC is considered to be an enterprise strategy in that it has multiple dimensions that include:
* An organizational structure (federated, centralized, or hybrid) and people with special skills and competencies.
* Defined service offerings supported by processes, best practices and data integration policies.
* Technology, tools and standards to support the capabilities it offers.
While the ICC may be a temporary unit focused on a specific program, it is generally used to describe a permanent part of an
organization that is responsible for maintaining repeatable methods, providing ongoing, sustaining data integration services,
and providing direction for the information movement within an organization.
Business Drivers and Objectives
There are many reasons that organizations may wish to establish an ICC. The following are just a few:
* To optimize the utilization of scarce resources by combining staff and tools into one group.
* To provide tighter governance and control over a critical aspect of the business or infrastructure.
* To improve the customer experience by integrating information and establishing consistent processes across channels and products.
* To reduce project delivery times as well as development and maintenance costs.
* To improve Return on Investment (ROI) through re-use of low-level building blocks, application components and codified business rules.
* To decrease duplication of data integration and data quality efforts and to promote the concept of re-use within projects and the enterprise.
* To build on past successes instead of reinventing the wheel with each project.
* To lower total technology cost of ownership by leveraging technology investments across multiple projects.
* To capture best practices and lessons learned from one project to the next and build a knowledge base to secure the above objectives.
In short, an ICC adds value whenever two or more functional groups need to collaborate and to sustain that
collaboration across multiple projects or over a long time. Capturing lessons learned through experience, and managing
knowledge of common patterns and procedures in order to achieve true integration of data, brings lasting value to the
organization.
Industry Perspective
The term Integration Competency Center and its acronym ICC were popularized by Roy Schulte of Gartner in a series of articles
and conference presentations beginning in 2001 with The Integration Competency Center [Ref SPA-14-0456]. He picked up
the term from one of his colleagues, Gary Long, who found some of his clients using it (they took the established term
"competency center" and applied it to integration). Prior to that (from 1997 to 2001), Gartner had been referring to it as the
"central integration team." The concept itself, even before it was given a label, originated in 1996 as part of one of
Gartner's first reports on integration.
From an industry perspective, a major milestone was the publication in 2005 of the first book on the topic: Integration
Competency Center: An Implementation Methodology by John Schmidt and David Lyle (hereafter referred to as the ICC
Guidebook). The book introduced five ICC organizational models and explored the people, process and technology
dimensions of ICCs. The concept of integration as a competency in the IT domain has been refined since then and
continues to gather momentum and broad-based acceptance.
The term Lean Integration was first introduced by Informatica in a series of blog articles starting in January 2009 entitled
10 Weeks To Lean Integration. This was followed by a white paper on the topic in April 2009 and the book Lean Integration: An
Integration Factory Approach to Business Agility by John Schmidt and David Lyle in May 2010. Part 3 of the book,
Implementation Practices, has much in common with Velocity since the ICC strategy and the eight competencies were used as
the basis for chapters 11-17.
These days ICCs are often referred to as Integration Centers of Excellence, SOA Centers of Excellence, the Data
Management Center of Excellence or other variants.
Integration Maturity Levels
Another way to look at integration is to examine how integration technologies and management practices have evolved and
matured over the past 50 years. The figure below summarizes four stages of evolution that have contributed to increasingly
higher levels of operational efficiency. Hand coding was the only technology available until around 1990 and is still a common
practice today, but it is gradually being replaced by modern methods. The movement to standard tools, more commonly known
as middleware, began in the 1990s, followed by industry consolidation of tool vendors during the first decade of the 2000s,
resulting in more "suites" of tools that provide the foundation for an enterprise integration platform.
As we look to the future, we see the emergence of the Integration Factory as the next wave of integration technology in
combination with formal management disciplines. This wave stems from the realization that, of the thousands of integration
points that are created in an enterprise, the vast majority are incredibly similar to each other in terms of their structure and
processing approach. In effect, most integration logic falls into one of a couple of dozen different "patterns" or "templates,"
where the exact data being moved and transformed may be different, but the general flow and error-handling approach is the
same. An Integration Factory adds a high degree of automation to the process of building and sustaining integration points.
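To make the pattern/template idea concrete, the toy Python sketch below shows one generic flow reused across integration points, with only the extract, transform and load steps varying; the function names and error handling are illustrative assumptions, not how any specific integration platform is implemented.

    # Toy sketch of a reusable integration "template"; names are illustrative only.
    from typing import Callable, Iterable

    def run_integration(extract: Callable[[], Iterable[dict]],
                        transform: Callable[[dict], dict],
                        load: Callable[[dict], None]) -> None:
        """Generic move-and-transform flow with a shared error-handling approach."""
        for record in extract():
            try:
                load(transform(record))
            except Exception as err:          # same handling for every instance
                print(f"Routing failed record to exception queue: {err}")

    # One concrete integration point is just a parameterization of the template.
    run_integration(
        extract=lambda: [{"id": 1, "amount": "10.50"}, {"id": 2, "amount": "bad"}],
        transform=lambda r: {"id": r["id"], "amount": float(r["amount"])},
        load=lambda r: print("loaded", r),
    )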
Management practices have also evolved from ad hoc or point-in-time projects, to broad-based programs (projects of
projects), to Integration Competency Centers (ICCs) and now to Lean Integration. A major paradigm shift began early in the
current century around the view of integration as a sustaining practice. The first wave of sustainable management practices is
encapsulated by the ICC. It focused primarily on standardizing projects, tools, processes and technology across the enterprise
and addressing organizational issues related to shared services and staff competencies. The second
wave of sustainable practices is the application of Lean principles and techniques to eliminate waste, optimize the entire value
chain and continuously improve. The management practice that optimizes the benefits of the Integration Factory is Lean
Integration. The combination of factory technologies and lean practices results in significant and sustainable business benefits.
The timeline shown on the bottom of the evolution graphic represents the period when the technology and management
practices achieved broad-based acceptance. There is no date on the first evolutionary stage since it has been with us since the
beginning of the computer software era. The earlier stages of evolution don't die off with the introduction of a new level of
maturity. In fact, there are times when hand coding for ad hoc integration needs still makes sense today. That said, each stage
of evolution borrows lessons from prior stages to improve its efficacy.
These four stages of evolutionary maturity are also used in the Velocity Enterprise Competency areas. The shorthand labels
used are as follows:
1. Project : disciplines that optimize integration solutions around time and scope boundaries related to a single
initiative.
2. Program : disciplines that optimize integration of specific cross-functional business collaborations, usually through
a related collection of projects.
3. Sustaining : disciplines that optimize information access and controls at the enterprise level and view integration as
an ongoing activity independent of projects, including an ICC.
4. Lean : disciplines that optimize the entire information delivery value chain through continuous improvement driven
by all participants in the process.
ICC Implementation Approach
Implementing an ICC involves multiple aspects of organization, people, process, technology and policy dimensions. The
implementation of an ICC may be very rapid in organizations that have the motivation and resources to achieve full results
quickly, while other organizations may 'grow' a full-function ICC over time. In either case, a key to success is to begin with a
solid plan and execute it with the long-term ICC objective in mind.
A high-level approach to selecting the appropriate ICC model and developing an implementation approach is shown in the
figure below. Refer to Selecting the Right ICC Model for a detailed explanation of steps 1-5 and refer to Planning the ICC
Implementation for a detailed explanation of steps 6 and 7.
Because each type of ICC has its own advantages, organizations considering implementing an ICC should begin by defining
the company's integration goals and the processes needed to accomplish them. Setting the goals and processes clarifies
which type of ICC would be most appropriate, and which roles and technologies it requires.
Starting an ICC that's right for your enterprise requires the development of an integration strategy. The challenge when starting
from scratch, however, is similar to the proverbial "chicken and egg" problem. Should you develop the strategy first and then
bring the resources on board, or should you first establish the core leadership team and let them develop the strategy? The
answer is both. Development of the integration strategy should involve an iterative process with the executive sponsor driving a
top-down process and the ICC team (once they are in place) building the bottom-up plan.
It's also possible that the ICC may be developed spontaneously as a grassroots movement and grow organically. Only in
hindsight does the organization recognize the ICC for what it is. In this scenario, the ICC team may in fact
be best suited to document the strategy and gain formal management agreement.
The recommended approach that has seen the most success starts with a top-down strategy led by an executive sponsor. The
executive sponsor begins with the presumption that integration is an enabler of the business strategy and that having a
permanent part of the organization focused on integration as a discipline is a critical success factor.
In summary, implementing an ICC is a seven-step process:
1. Evaluate Selection Criteria: Determine the recommended model based on best practice guidelines related to
organizational size, strategic alignment, business value and urgency for realizing significant benefits.
2. Document Objective and Potential Benefits: Define the business intent or purpose for the ICC and what the
desired or expected benefits are (this is not a business case at this stage).
3. Define Service Scope: Define the scope of services that will be offered by the ICC (detailed service definitions
occur in step 6 or 7).
4. Determine Organizational Constraints: Identify budget limitations or operational constraints imposed by factors
such as the geographic distribution of operations and the degree of independence of operational groups.
5. Select an ICC Model: Recommend a model and gain executive support or sponsorship to proceed. If there is an
urgent need to implement the ICC, move to step 6; otherwise proceed to step 7.
6. Develop a 120-Day Implementation Plan: Leverage the ICC Best Practice for implementing and launching an
ICC.
7. Evolve to Target State: Develop a future-state vision and begin implementing it. The 120-day planning best
practice may still provide useful guidance, but the time-frame may be measured in years rather than months.
Of course, it's not as simple as the seven-step process may indicate. The strategy needs to be reviewed and refreshed on a
regular basis, and there are a number of other elements which may be just as critical, including:
* Architecture Principles
* Outsourcing Strategy
* Financial Policies
* Business Alignment
* Supplier Partnerships
* Standards Selection
The ICC Organization Model and the people, process and technology considerations form the foundation. The detailed activities and
considerations for selecting the right organizational model, assembling and training staff, defining services, selecting
technologies and other dimensions are detailed in the related Velocity ICC Competencies and Best Practices.
Each ICC Competency is a component that can be used individually or in combination with other competencies, as required by
the defined scope and structure for a given ICC. For example, while Financial Management is not an essential competency for
a Best Practices ICC, it is essential for the Shared Services and Central Service ICC models.
TIP >>>
The ICC competencies for each discipline are designed to extend or leverage the existing body of knowledge and provide
information on how to apply that knowledge in an ICC context. For example, ICC Competencies and Best Practices for
Financial Management won't make you a finance expert, but they will help you to work more effectively with the finance
department in planning and managing the financial aspects of an ICC, and to use chargeback methods to create appropriate
organizational behavior.
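As a simple illustration of the chargeback idea, the sketch below allocates a shared ICC operating cost to consuming departments in proportion to their usage. The cost figure, departments and allocation basis are invented for this example; real chargeback models should be defined with the finance department.

# Hypothetical usage-based chargeback; all numbers and departments are illustrative only.
monthly_icc_cost = 120_000.00                       # shared ICC operating cost for the month
usage = {"Sales": 450, "Finance": 300, "HR": 50}    # e.g. integration jobs run per department

total_jobs = sum(usage.values())
chargeback = {dept: round(monthly_icc_cost * jobs / total_jobs, 2)
              for dept, jobs in usage.items()}
print(chargeback)   # {'Sales': 67500.0, 'Finance': 45000.0, 'HR': 7500.0}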
Lean Integration Implementation Approach
Lean Integration and ICCs are complementary. The ICC best practices address the core organizational, people, process,
technology and policy activities associated with establishing and operating a successful shared service function. Lean builds
on the ICC core practices by adding a management system for continuous improvement and factory automation. More
specifically, Lean Integration is the systematic use of seven principles in the ongoing operations of an ICC:
1. Focus on the customer and eliminate waste: Maintain a spotlight on customer value and use customer input as
the primary driver for the development of services and integration solutions. Waste elimination is
related to this principle, since waste consists of activities that don't add value from the customer's perspective rather than
from the supplier's perspective. Related concepts include optimizing the entire value stream in the interest of things that
customers care about and just-in-time delivery to meet customer demands.
2. Continuously improve: Use a data-driven cycle of hypothesis-validation-implementation to drive innovation and
continuously improve the end-to-end process. Related concepts include amplifying learning, institutionalizing lessons
learned and sustaining integration knowledge.
3. Empower the team: Share commitments across individuals and multifunctional teams and provide the support
they need to innovate and try new ideas without fear of failure. Empowered teams and individuals have a clear picture of
their role in the value chain, know exactly who their customers and suppliers are and have the information necessary to
make day-by-day and even minute-by-minute adjustments.
4. Optimize the whole: Make trade-offs on individual steps or activities in the interest of maximizing customer value
and bottom-line results for the enterprise. Optimizing the whole requires a big-picture perspective of the end-to-end
process and how the customer and enterprise value can be maximized even if it requires sub-optimizing individual steps
or activities.
5. Plan for change: Apply mass customization techniques to reduce the cost and time in both the build and run
stages of the integration life cycle. The development stage is optimized by focusing on reusable and parameter-driven
integration elements to rapidly build new solutions. The operations stage is optimized by leveraging automated tools and
structured processes to efficiently monitor, control, tune, upgrade and fix the operational integration systems.
6. Automate processes: Judiciously use investments to automate common manual tasks and provide an integrated
service delivery experience to customers. In mature environments this leads to the elimination of scale factors, the ability
to respond to large integration projects as rapidly and cost-effectively as small changes, and the removal of integration
dependencies from the critical implementation path.
7. Build quality in: Emphasize process excellence and build quality in, rather than trying to inspect it in. Related
concepts include error-proofing the process and reducing recovery time and costs.
The integration competencies (see list at bottom of this document) provide further details for how to put these principles into
practice. From an overall implementation strategy, there are two fundamental implementation styles for Lean Integration: top
down and bottom up. The top-down style starts with an explicit strategy with clearly defined (and measurable) outcomes and is
led by top-level executives. The bottom-up style, which is sometimes referred to as a "grassroots movement", is primarily driven
by front-line staff or managers with leadership qualities. The top-down approach generally delivers results more quickly, but
may be more disruptive. You can think of these styles as revolutionary versus evolutionary. Both are viable.
While Lean Integration is relevant to all large organizations that use information to run their business, there are several
prerequisites for a successful Lean journey. The following five questions provide a checklist to see if Lean Integration is
appropriate for a given organization and which style may be most appropriate:
* Is there senior executive support for improving how integration problems are solved for the business?
Support from a senior executive in the organization is necessary for getting change started and critical for sustaining
continuous improvement initiatives. Ideally the support should come from more than one executive, at a senior level such as
the CXO, and it should be "active" support. You want the senior executives to be leading the effort by example, pulling the
desired behaviors and patterns of thought from the rest of the organization. It might be sufficient if the support is from only one
executive, and if that person is one level down from C-level, but it gets harder and harder to drive the investments and
necessary changes as you water down the top-level support. The level of executive support should be as big as the
opportunity. Even with a bottom-up implementation style, you need some level of executive support or awareness. At some
point, if you don't have the support, you are simply not ready to formally tackle a Lean Integration strategy.
* Is there a committed practice leader?
The second prerequisite is a committed practice leader. By "committed" we don't mean that the leader needs to be an expert in
all the principles and competencies on day one, but the person does need to have the capability to become an expert and
should be determined to do so through sustained personal effort. Furthermore, it is ideal if this individual is somewhat
entrepreneurial, has a thick skin, is customer-oriented and has the characteristics of a change agent. If you don't have top
leadership support or a committed practice leader, there is little chance of success. This is not to suggest that a grassroots
movement isn't a viable way to get the ball rolling, but at some point the bottom-up movement needs to build support from the
top in order to institutionalize the changes that will be necessary to sustain the shift from siloed operations to integrated value
chains.
* ls the "Lean director an effective change agent?
Having a Lean director who is an effective change agent is slightly different from having one who is "committed. The Lean
champion for an organization may indeed have all the right motivations and intentions but simply have the wrong talents. For
example, an integrator needs to be able to check his or her ego at the door when going into a meeting to facilitate a resolution
between individuals, who have their own egos. Furthermore, a Lean perspective requires one to think outside the box-in fact,
to not even see a box and to think of his or her responsibilities in the broadest possible terms.
* Is the corporate culture receptive to cross-organizational collaboration and cooperation?
Many (maybe even most) organizations have entrenched views of independent functional groups, which is not a showstopper
for a Lean program. But if the culture is one where independence is seen as the source of the organization's success and
creativity, and variation is a core element of its strategy, a Lean approach will likely be a futile effort since Lean requires
cooperation and collaboration across functional lines. A corporate culture of autonomous functional groups with a strong
emphasis on innovation and variation typically has problems implementing Lean thinking.
* Can the organization take a longer-term view of the business?
A Lean strategy is a long-term strategy. This is not to say that a Lean program can't deliver benefits quickly in the short term; it
certainly can. But Lean is ultimately about long-term sustainable practices. Some decisions and investments that will be
required need to be made with a long-term payback in mind. If the organization is strictly focused on surviving quarter by
quarter and does little or no planning beyond the current fiscal year, a Lean program won't achieve its potential.
If you are in an organizational environment where you answered no to one or more of these questions, and you feel compelled
to implement a Lean program, you could try to start a grassroots movement and continue lobbying senior leadership until you
find a strong champion. There are indeed some organizational contexts in which starting a Lean program is the equivalent of
banging your head against the wall.
Lean requires a holistic implementation strategy or vision, but it can be implemented in incremental steps. In fact, it is virtually
impossible to implement it all at once, unless for some reason the CEO decides to assign an entire team with a big budget to
fast-track the implementation. The idea is to make Lean Integration a long-term sustainable process. "Long-term" is about 10 to
20 years, not just the next few years. When you take a long-term view, your approach changes. It certainly is necessary to have
a long-term vision and plan, but it is absolutely acceptable, and in many respects necessary, to implement it incrementally in
order to enable organizational learning. In the same way, an ICC can start with a small team and a narrow scope and grow
over time into a broad-based Lean Integration practice through excellent execution and positive business results.
Integration Factory
The Integration Factory is a core concept in an ICC that has adopted Lean Integration principles. The factory is a cohesive
integration technology platform that automates the flow of materials and information in the process of building and sustaining
integration points. More specifically, the Integration Factory is the collection of Informatica products that should be
considered for certain ongoing integration needs. For example, an enterprise that has an ongoing need (beyond just one
project) for application migration or consolidation in support of its M&A activities or application portfolio rationalization strategy
could implement an application migration factory using the collection of products highlighted in the first figure below.
Conversely, if the focus of the ICC is on B2B operations, rapid on-boarding of external customers or suppliers and ongoing
partner management, then a B2B Integration Factory could be implemented based on the products in the second figure below.
Competencies
Business Process Management
Data Quality Management
Enterprise Architecture
Financial Management
Information Lifecycle Management
Integration Methodology
Integration Systems
Metadata Management
Modeling Management
Service Oriented Architecture
Service Oriented Architecture
Description
Service Oriented Architecture (SOA) is an approach that promotes Enterprise
Integration with reusable, interoperable and modular components. The ability
of the Enterprise to respond to changing business problems and opportunities
with efficiency, agility and adaptability hinges on the successful adoption of
SOA during data integration activities. Data Virtualization for Data
Warehousing and Service Oriented Architecture enables key decision makers
such as Analysts (Subject Matter Experts (SMEs)) to directly access data
across diverse systems quickly, create reusable data services and data rules
and collaborate with IT to deliver new data to BI tools and enterprise
applications, up to five times faster than ever before.
Over the years, each new acquisition, application, data warehouse or other
project has helped to create the integration hairball. The many point-to-point
integrations across the enterprise make it hard to understand, access, and
manage data. This integration hairball results in a long, manual process to
deliver the data, where the business and IT iterate several times to fulfill the
business requirements, often failing to deliver in a timely fashion. For BI applications, this hairball makes it difficult to
accommodate new data sources; and changes to data warehouses often lead to a proliferation of data marts and custom
reports. For SOA, data services that don't provide a flexible data services layer can quickly create another hairball and
eventually turn SOA into yet another "Slow, Over-budget Application".
Informatica Data Services delivers fast access to new data for reporting, applications and SOA, with the industry's
next-generation data virtualization. The business can quickly access the data they need while IT retains control. Analysts can
now directly and quickly find, profile and test the data they need, and create data rules. Built-in data virtualization makes direct access to
trusted data on remote systems easy and transparent, overcoming the limitations and complexity of data federation. IT
architects and developers govern the process and provide sophisticated transformations, including data quality as needed, to
the analysts. Once done, IT can easily deploy data services that access data on-the-fly from the original systems, or move the
data into a physical data store such as a data warehouse. Both data service deployment modes are optimized to run on the
same high-performance engine, the Informatica platform.
Business Drivers and Objectives
Informatica Data Services transforms the way IT organizations work. With Informatica Data Services, IT organizations can:
* Access data consistently from a single location regardless of where the source data resides and deliver new data that the
business needs and trusts.
* Accelerate access to new data in days (not months) by exposing data services.
* Improve data governance and reduce risk by thwarting unwanted access to critical business and customer information.
* Insulate applications from underlying changes.
Reference Architecture
High-Level Reference Architecture - General
To enable IT organizations to respond effectively to changing business demands, data architectures need
to be revived and transformed with SOA data integration. These capabilities enable IT organizations to:
* Quickly find the data needed across heterogeneous data sources, including those that have been recently added to the
ecosystem or data that is time sensitive or external.
* Easily understand the structure of all the data, regardless of the type of data or where it resides.
* Proactively identify inconsistencies and inaccuracies across all data sources in real time.
* Define, enforce, and centrally manage data freshness, real-time data quality and data privacy policies.
* Provide a data abstraction layer that can insulate all consuming applications from changes in the underlying data.
* Represent data as the business thinks of it (as a business entity) and then create logical data models that provide
encapsulation from the physical implementation.
* Make smart decisions about whether data should be accessed by physically moving it, or by federating it on the fly, as
Web services or SQL, without rebuilding the logic (a minimal sketch of this idea follows this list).
* Deliver timely, trustworthy, and relevant data as a service, with support for the principles of service orientation, including
reuse, modularity, standards-based, and loose coupling.
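The sketch below illustrates the federate-versus-move decision referenced above: a single business-entity definition is reused both to serve a virtual, on-demand view and to materialize the same result for a physical load. The class and function names (LogicalDataObject, fetch_crm, fetch_billing) are assumptions invented for this example, not Informatica Data Services APIs.

# Illustrative sketch only; the names below are hypothetical, not product APIs.
from dataclasses import dataclass
from typing import Callable, Dict, List

Row = Dict[str, object]

@dataclass
class LogicalDataObject:
    """A business entity defined once, independent of where the data physically lives."""
    name: str
    sources: List[Callable[[], List[Row]]]   # each callable reads one source system
    key: str                                  # business key used to merge source rows

    def read(self) -> List[Row]:
        """Federate on the fly: merge rows from every source at request time."""
        merged: Dict[object, Row] = {}
        for fetch in self.sources:
            for row in fetch():
                merged.setdefault(row[self.key], {}).update(row)
        return list(merged.values())

def fetch_crm() -> List[Row]:        # stand-in for a CRM source system
    return [{"customer_id": 1, "name": "Acme Corp"}]

def fetch_billing() -> List[Row]:    # stand-in for a billing source system
    return [{"customer_id": 1, "credit_limit": 50000}]

customer = LogicalDataObject("CUSTOMER", [fetch_crm, fetch_billing], "customer_id")

# Same definition, two delivery modes, no logic rebuilt:
print(customer.read())               # 1) virtual: serve the merged view on demand
warehouse_load = customer.read()     # 2) physical: persist the same rows in a warehouse load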
High-Level Reference Architecture for Data Virtualization - Data Warehouse
Extension
Informatica Data Services complements the data warehouse by co-existing with and extending the data warehouse
architecture with the capability to quickly federate the new data that is requested, on-the-fly.
Informatica Data Services provides the ability to build a logical view of all of the underlying data by
representing all the data as business entities - such as CUSTOMER, ORDER, PRODUCT, and INVOICE - the way the
business thinks about data.
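As a rough analogy for such a logical view, the snippet below exposes two physical tables with cryptic legacy names as a single business-entity view. It uses sqlite3 only so the example is self-contained; the table and view names are invented for illustration and do not describe how Informatica Data Services is implemented.

import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    -- Stand-ins for two physical source tables with legacy column names.
    CREATE TABLE crm_cust_mast (cust_no INTEGER, cust_nm TEXT);
    CREATE TABLE erp_ord_hdr  (ord_no INTEGER, cust_no INTEGER, amt REAL);
    INSERT INTO crm_cust_mast VALUES (7, 'Acme Corp');
    INSERT INTO erp_ord_hdr  VALUES (1001, 7, 250.0);

    -- The 'logical view': physical details hidden behind the business entities
    -- (CUSTOMER, ORDER) that analysts actually think in.
    CREATE VIEW customer_order AS
    SELECT c.cust_no AS customer_id,
           c.cust_nm AS customer_name,
           o.ord_no  AS order_id,
           o.amt     AS order_amount
    FROM crm_cust_mast c
    JOIN erp_ord_hdr  o ON o.cust_no = c.cust_no;
""")

print(con.execute("SELECT * FROM customer_order").fetchall())
# [(7, 'Acme Corp', 1001, 250.0)]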
High-Level Reference Architecture for Data Virtualization - Enterprise Data Services
Layer in an SOA
Methodology
Informatica Data Services lets the business access the data when they need it, while IT retains control. It makes the business
more self-sufficient and IT more productive. A set of powerful, role-based tools enables analysts to work with business
entity-based models of data, called Logical Data Objects, and participate more proactively in data integration and data
governance processes. They can analyze data and define data rules themselves, decreasing dependence on limited IT
resources.
These role-based tools also provide IT organizations with a single, unified, highly productive environment for data profiling,
data cleansing, and applying and managing reusable data quality rules on Logical Data Objects. By using a common, unified set
of collaborative role-based tools and by sharing data rules and metadata across the enterprise, the business and IT can work
more efficiently together to complete projects in days rather than months.
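For a sense of what an analyst-defined, reusable data rule might look like, here is a minimal sketch; the rule and function names are invented for illustration and are not features of any specific tool.

import re
from typing import Dict, List

def rule_valid_email(row: Dict[str, str]) -> bool:
    # Illustrative rule: contact_email must look like an address.
    return bool(re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", row.get("contact_email", "")))

def profile(rows: List[Dict[str, str]], rules) -> Dict[str, float]:
    # Score each rule as the fraction of rows that pass it.
    return {rule.__name__: sum(rule(row) for row in rows) / len(rows) for rule in rules}

customers = [
    {"customer_id": "1", "contact_email": "ops@acme.example"},
    {"customer_id": "2", "contact_email": "not-an-address"},
]
print(profile(customers, [rule_valid_email]))   # {'rule_valid_email': 0.5}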
With Informatica Data Services, integrated profiling can streamline the process of analyzing data to identify issues. This layer
can be leveraged to receive a query from a data consumer such as a BI tool. The query can then be processed at run time to
get the data from the sources without moving it, and the data can be cleansed on-the-fly to remove inconsistencies and
inaccuracies. Finally, the results can be served dynamically.
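The toy sketch below walks through that run-time flow: rows are pulled from the original systems at query time, cleansed on the fly, and returned to the caller. All function names and the country-standardization rule are assumptions made for this illustration, not Informatica APIs.

from typing import Dict, Iterable, List

def fetch_from_sources() -> Iterable[Dict[str, str]]:
    # Pull rows from the original systems at query time; nothing is staged or copied.
    yield {"customer_id": "7", "country": "usa "}
    yield {"customer_id": "8", "country": "U.S."}

def cleanse(row: Dict[str, str]) -> Dict[str, str]:
    # Apply a shared data rule on the fly, e.g. standardize country codes.
    country = row["country"].strip().lower().replace(".", "")
    row["country"] = {"usa": "US", "us": "US"}.get(country, country.upper())
    return row

def serve_query(predicate) -> List[Dict[str, str]]:
    # Answer a consumer's query (e.g. from a BI tool) with cleansed rows.
    return [cleanse(row) for row in fetch_from_sources() if predicate(row)]

print(serve_query(lambda row: row["customer_id"] == "7"))
# [{'customer_id': '7', 'country': 'US'}]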
The data services layer can also abstract a legacy system so that, when it is retired, consuming applications such as data
warehouses, portals or composite applications are not impacted by the change. The same is true if a new system is added or transient data needs to
be on-boarded rapidly. If needed, at a later date, the results can be physically incorporated into the data warehouse.
Benefits
* First, create a logical data object that provides a single point for data access and handling change by delivering abstraction
and insulation.
* Next, shorten the analysis process by checking up-front whether the planned joins will be successful.
* Since this is all about meeting the business's needs, provide a purpose-built interface that a business user can leverage to
specify business rules and share the same metadata and logical data objects that the developer sees in the developer tool.
* Also, certify the quality of the data upfront, rather than reactively dealing with data inaccuracies and inconsistencies.
* Then reuse the logical data objects to deliver data using any protocol, such as SQL or Web
services, and at any speed, be it batch or real time, without rebuilding logic.
* The Informatica Platform is the same engine that supports both physical data integration and virtual data integration.
* The engine is optimized to support both ETL and data federation by leveraging numerous optimization techniques.
Leverage Data Virtualization for Optimizing the Data Integration Process in Data
Warehousing
Last but not least, as Informatica customers adopt an SOA strategy, they realize the importance of data
quality. They are building highly scalable data cleansing solutions in conjunction with Data Services functionality. Velocity Best
Practices for Data Services and Data Quality are the first resources to start exploring real-life examples of data integration
implementations using SOA.
Competencies
Business Process Management
Enterprise Architecture
Integration Methodology
Integration Systems
Metadata Management