1               Creating a Successful High-Level Data Model

As we’ve discussed in previous articles, a high-level data model is used to convey the core concepts and/or principles of an organization in a simple way, using concise descriptions. The advantage of developing the high-level model is that it facilitates arriving at common terminology and definitions of the concepts and principles.

There are ten steps that are required to successfully develop a high-level data model. Although you can start some of the steps out of sequence, they need to be completed in the order they appear. For example, you might find yourself jotting down stakeholders (Step 2) before identifying the purpose of the model (Step 1). However, you will need to revisit your model stakeholder list after finalizing the purpose of the model.

The ten steps for completing the high-level data model are as follows:

1.1         Step 1: Identify Model Purpose

Determine and agree on the primary reason for having a high-level data model. Always begin with the end in mind.

It is important to remember to focus the purpose of the high-level data model around a business need or process improvement. Data models are built to ensure that everyone has a precise understanding of terminology and business rules.

One of the fascinating outcomes of this first step is realizing that the model’s stakeholders see the world very differently from each other. It is not worth investing time and money in the other nine steps without a clear, agreed-upon reason for the model. That doesn’t mean the high-level data model cannot have more than one purpose, but there should be one primary purpose for building it.

Once there’s consensus on the purpose of the data model and it is documented, you need to determine whether a top-down, bottom-up or hybrid approach is ideal. Matching the right factors with the right modelling approach will dramatically increase the probability of having a successful model.

Here are the most common reasons for building a high-level data model (remember, communication is the main reason behind each of these):

·        Capture existing business terminology and rules.

·        Capture proposed business terminology and rules.

·        Capture existing application terminology and rules.

·        Capture proposed application terminology and rules.

1.2         Step 2: Identify Model Stakeholders

Document the names and departments of those who will be involved in building the high-level data model, as well as those who will use it after its completion.

A high-level data model stakeholder is someone who will be affected directly or indirectly by the model that is produced during the modelling sessions.

As you might expect, when the purpose of the high-level data model is to capture an existing or proposed section of the business, the builders tend to be people who know the business, such as business analysts and business users. Similarly, when the purpose of the high-level data model is to capture an existing or proposed application, the builders tend to be more technical, such as developers and database administrators. The users of the model though, could be anyone from business or IT.

Those with more of a business-oriented background can help build the business-focused view and those with more of a technical background can help build the application-focused view.

All or some of those users should also be your stakeholders and are required to sign off on the model.  

1.3         Step 3: Inventory Available Resources

Leverage the results of step 2 to determine which people will be involved in building the high-level data model, and identify any documentation that could provide useful content for the model.

Now that you have identified why you are building the model and who will be involved in building and using it, you need to identify the resources you will be using. The two types of resources are: people and documentation.

People include representatives from both business and IT. Businesspeople may be management and/or knowledge users. IT resources can span the entire IT spectrum, from analysts through developers, from program sponsors to team leads.

Documentation includes systems documentation or requirements documents. Systems documentation can take the form of standard vendor documentation for a packaged piece of software, or documentation written to support a legacy application. Requirements documents span business, functional and technical requirements and can be an essential input to building the high-level data model.

1.4         Step 4: Determine Type of Model

Determine which of the four types of high-level data models will work best based on the purpose of the model and the available resources.

The purpose of the model identified in step 1 aids in determining the type of model to build in step 4. The four different variations include:

Relational data model: A relational data model describes the operational databases that support business processes.

Dimensional data model: A dimensional model is used exclusively for reporting (a brief sketch contrasting the relational and dimensional shapes appears after this list).

Business perspective: A business perspective is a high-level data model of a defined portion of the business. Choose the business perspective for any of the following situations:

·        Understanding a business area.

·        Designing an enterprise model.

·        Starting a new development effort.

Application perspective: An application perspective is a high-level data model of a defined portion of a particular application. Choose the application perspective for any of the following situations:

·        Understanding an application.

·        Starting a new development effort.
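
To make the relational versus dimensional distinction above more concrete, here is a minimal, illustrative Python sketch; the entity and attribute names are hypothetical and not taken from the text:

```python
from dataclasses import dataclass
from datetime import date

# --- Relational (operational) shape: normalized entities that support transactions ---

@dataclass
class Customer:            # one row per customer
    customer_id: int
    name: str

@dataclass
class Order:               # one row per order, pointing back at the customer
    order_id: int
    customer_id: int       # foreign key into Customer
    order_date: date
    amount: float

# --- Dimensional (reporting) shape: a fact table surrounded by descriptive dimensions ---

@dataclass
class CustomerDim:         # attributes used to slice and group reports
    customer_key: int
    name: str
    region: str

@dataclass
class DateDim:
    date_key: int
    calendar_date: date
    month: str

@dataclass
class SalesFact:           # additive measures at the grain of one order
    customer_key: int      # foreign key into CustomerDim
    date_key: int          # foreign key into DateDim
    sales_amount: float
```

The operational shape is normalized to support day-to-day transactions, while the reporting shape gathers descriptive attributes into dimensions around a fact table of measures.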

1.5         Step 5: Select Approach

Choose a top-down, bottom-up or hybrid approach based on the purpose of the model and the available resources.

Even though the three approaches for building a high-level data model sound completely different from each other, they have a lot in common. In fact, the major difference between the approaches lies in the initial information-gathering step.

The top-down approach starts purely from a business-needs perspective. The business should aim for the sky. Ideas are accepted even if you know there is no way to deliver the requirement in today’s application environment.

 

The bottom-up approach, on the other hand, temporarily ignores what the business needs and instead focuses on the existing systems environment. You build an initial high-level data model by studying the systems that the business is using today. It can include operational systems that run the day-to-day business or it can include reporting systems that allow the business to view how well the organization is doing.

 

The hybrid approach is iterative: the initial information-gathering step usually begins with some top-down analysis, followed by some bottom-up analysis, then more top-down analysis, and so on, until the information gathering is complete.

The whole process is a constant loop of reconciling what the business needs with what information is available.

 

1.6         Step 6: Complete the Audience-View High-Level Data Model

Produce a high-level data model using the terminology and rules that are clear to those who will be using the model.

Once you are confident about which approach you should take, you need to build the audience-view high-level data model. This is the first high-level model to build. The purpose is to capture the viewpoint of the audience without complicating information capture by including how their viewpoint fits with other departments or with the organization as a whole.

The purpose here is to simply capture their view of the world; the next step will reconcile the deliverable from this step with enterprise terminology.

1.7         Step 7: Incorporate Enterprise Terminology

Now that the model is well-understood by the audience, ensure the terminology and rules are consistent with the organizational perspective.

Once you’ve captured your stakeholders’ view in the boxes and lines of the audience high-level model, you can move on to the enterprise perspective. To build the enterprise perspective, modify the audience model to be consistent with enterprise terminology and rules. Ideally, this enterprise perspective is captured within an enterprise data model.
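
As a small, hypothetical illustration of this step (the term names below are invented, not drawn from any particular enterprise model), an audience-view term can be reconciled against enterprise-standard terminology with a simple lookup:

```python
# Hypothetical mapping from audience-view entity names to enterprise-standard names.
AUDIENCE_TO_ENTERPRISE = {
    "Client": "Party",
    "Policy Holder": "Party",
    "Product Line": "Product",
    "Claim Case": "Claim",
}

def to_enterprise_term(audience_term: str) -> str:
    """Return the enterprise-standard name for an audience-view term,
    falling back to the original term if it has not been reconciled yet."""
    return AUDIENCE_TO_ENTERPRISE.get(audience_term, audience_term)

print(to_enterprise_term("Client"))   # -> Party
print(to_enterprise_term("Invoice"))  # -> Invoice (not yet reconciled)
```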

1.8         Step 8: Sign-Off

Require and obtain approval from the stakeholders that the model is correct and complete.

After the initial information gathering, make sure the model is reviewed both for data modelling best practices and to confirm that it meets the requirements. The sign-off process on a high-level data model does not require the same formality as signing off on a physical design, but it should still be taken seriously. Usually, email verification that the model looks accurate will suffice.

1.9         Step 9: Market

Similar to introducing a new product, advertise the data model so that all those who can benefit from it know about it.

Think of yourself as a product vendor of sorts – the best product on the market won’t necessarily sell unless it is marketed effectively.

In building a successful high-level modelling project, it is important to treat the marketing aspect as a project in and of itself. To that end, make sure to create a specific communication plan as part of your project’s deliverables. This communication plan outlines both the message and the target community.

1.10    Step 10: Maintain

High-level data models require little maintenance, but they do require some. Make sure the model is kept up-to-date.

Remember that even after the model is complete, there is still a maintenance task that you must stay on top of. The high-level data model will not change often, but it will change. You need to have formal processes for keeping the model up-to-date and aligned with the other model levels. You also want to make sure that the high-level data model is actively used by other groups and processes in the organization and doesn’t become a passive artefact.

Mastering the ten-step approach to building a high-level data model will increase awareness of a number of factors and constraints that will heavily influence the actual modelling process. Understanding and weighing these factors and constraints will help you choose the modelling approach that best suits your business’ needs.

 

2               Standards: A Shifting Landscape

2.1         With data exchange standards and technology maturing, the focus is moving toward implementation, architecture and advocacy

Bill Kenealy

Despite their rigid reputation, data standards are not entirely immutable. Much like the business processes and transactions they govern, standards, and the way they are utilized, evolve.

Historically, standards were developed to help the industry bring order to myriad proprietary formats, and to govern interactions between a carrier and an entity outside its walls, such as a producer or regulator. Now, a new breed of standards is emerging that focuses on data transactions made inside a carrier's walls, and on the overall architecture of systems. Ample proof of this evolution can be found in Pearl River, N.Y., where the Association for Cooperative Operations Research and Development (ACORD) has been crafting standards for three decades. Long-standing ACORD exchange formats AL3 and ACORD XML concentrated on helping insurers and producers conduct business-to-business transactions.

Standards are now becoming increasingly internalized. XML-based standards have grown from an efficiency play connecting the front office to the back office into the de facto interchange between systems and channels. Recently, in addition to its forms and data exchange offerings to the industry, the association has been working on the ACORD Framework, a series of five interrelated models (see chart) that can be used to develop consistent standards within an insurer's walls.

This foray into architectural issues only became feasible in recent years, as many of the disagreements surrounding the existing standards receded, says Lloyd Chumbley, VP of ACORD. With the major technology questions answered, it's now about the implementation and advocacy of standards. While the development of an ACORD standard was once protracted, the association can now produce one in as little as two or three weeks.

"With the architecture in place, we can turn around a standard pretty quickly," Chumbley says. "The development of standards has kind of become a factory. The reason for that is because XML and Web services technologies have matured. Now the debate is simply, 'Do you want that content or not?'"

Indeed, any discussion of standards must be viewed through the prism of technology. Where standards once reflected the paper and flat files used by insurers, they now must conform to an age in which service-oriented architectures and Web services have become prevalent.

Where, at one time, standards were primarily about commonality within a system, now they increasingly concern the interface between systems. By driving deeper into the enterprise, the ACORD Framework reflects this new, services-based reality. What's more, given its comprehensiveness, the Framework also can precede and catalyze the adoption of technologies rather than just conform to them.

To craft the Framework, ACORD has received input from both its carrier members and the vendor community. For example, in December 2009, IBM contributed the Business Object Model from its Insurance Application Architecture (IAA) to the Framework.

Chumbley says one of the association's primary goals is to ease implementation issues through the development of new tool sets. "We've made huge strides this year, and we'll be moving even further than that during the remainder of the year with updates to the data models, and a revised information model," Chumbley says.

2.2         Spurring Adoption

Even the best-devised standard is meaningless for want of adoption. But why should insurers adopt standards? There are solid business reasons for doing so. Agents who had to deal with multiple carriers were among the earliest adopters of standards. Carriers seeking their business often followed suit. In addition to greater ease of doing business, standards also promise lower transaction costs and better data quality.

Competitive pressures also enter the picture. Traditional writers are competing with a cadre of direct writers built on standards who reap the benefits in terms of speed, cost and efficiency.

Nonetheless, some equally compelling reasons work against standards adoption. Perceived up-front costs may spur some to delay, or opt not to adhere to a standard. An insurer may be rightfully reluctant to jettison a highly functional, if proprietary, process just to adhere to a standard. This may especially be true if the process is believed to be the source of differentiation in the market. Yet, while many are loath to admit it, the typical business processes performed by insurers are largely similar. Moreover, flexibility in standards enables adherents to tweak them according to their business needs.

Complexity is another reason often cited for lack of standards adoption. Martina Conlon, principal in the insurance practice at New York-based Novarica, says these concerns are unsubstantiated. "A lot of people don't understand what ACORD has to offer," she says. "They probably think it is more complex than it really is."

2.3         Banding Together

Different lines of business also tend to vary in their rate of standards adoption. For example, property/casualty insurers have traditionally been quicker to adopt standards than their life and health counterparts, notes Ki Nam Kim, VP of Boston-based Massachusetts Mutual Life Insurance Co.

"Because of the nature of the P&C business, they have to focus on efficiency," Kim says.

Yet, with the investment income life insurers long enjoyed eviscerated by the financial crisis, efficiency and the standards that beget it are receiving renewed attention. Other, more formal efforts to increase the adoption of standards by life and health insurers also are afoot. For example, the Plug and Play Consortium is made up of insurers interested in making ACORD standards "plug and play" ready. Member companies include: Mass Mutual, New York-based New York Life, Newport Beach, Calif.-based Pacific Life Insurance Co., Boston-based John Hancock Financial, New York-based AXA Equitable Life Insurance Co., and Omaha, Neb.-based Mutual of Omaha Insurance Co.

The goal of the consortium is to accelerate the adoption of standards by excising some of the ambiguity in the standards. A standard will typically provide, say, 90% of the information ultimately contained in an implementation, with the rest made up by extensions.

"While ACORD has made great progress toward best practices, there's a lot of interpretation in the ACORD standards as they stand today," adds David Williams, AVP center of excellence, lead for data analysis at New York-based AXA Equitable.

The devil is indeed in the details, Kim says. "Ambiguity leads to inconsistent implementation. We want to remove ambiguities and tighten the standards. Also, more importantly, we want to make it easier to adopt the standards."

To be sure, a great gulf lies between adopting and implementing standards. There are some companies that adopted standards, but are not doing large volumes of business with them because of implementation issues. The consortium was created to help address the problems insurers have had implementing ACORD standards in the past, Williams says. "We spend disproportionate effort integrating a new application into our environment."

To help surmount such issues and aid insurers, the consortium wants to make implementation artifacts available online. "To really understand what a standard says, somebody has to invest a lot of time and read through hundreds of pages of documentation to implement it," Kim says. "Instead of reading through a document, people can actually download a ready-made artifact for testing into their system."

Williams says the consortium is taking pains to make sure the plug-and-play certifications work in concert with the still-developing ACORD Framework. "As the framework becomes more available, we'll be able to accelerate how we create specific implementations of plug-and-play services," he says.

Kim says the consortium is still open to receiving new members. "We are very fortunate that many of the companies in our industry recognize the same need," he says. "Once they realize what they can get out of these standards, and also have access to the implementation artifacts, I'm sure they will be joining us left and right. What we really need now is for the industry to get together, remove the ambiguities, and make it easier to adopt standards so we can improve the efficiency of the industry at large. Ultimately, that benefits consumers."

Williams says that by lowering one of the primary barriers to entry, implementation costs, the plug-and-play approach will yield benefits for vendors as well. "They won't have to justify an expensive integration effort," he says. "They can just focus on the core offering and features."

Chumbley predicts consortium members can be effective advocates within the industry with the view that adoption of standards ultimately will save insurers both time and money. However, he cautions that there are limits to advocacy. "Standards are bought, not sold," he says.

Much as widespread adoption by producers induced reluctant carriers to adopt standards, a critical mass of carriers can convince more vendors to include standards in their products.

"Carriers getting together and forcing the issue will make a difference," Conlon says. "But the vendors need to step it up. You don't see as much adoption as you'd like to see from core system vendors. Very few provide ACORD XML support out of the box, so carriers purchase core systems then write translation routines to and from ACORD XML."

While it may be in a vendor's best interest to push a proprietary technology, carriers need to be wary of lock-in. "If you opt for a proprietary technology, you've built yourself another silo," says Neal Keene, VP of industry solutions for Irvine, Calif.-based Thunderhead. "Open standards future-proof you to an extent."

To be sure, the push for standardization in insurance is significantly buoyed by the technological tide. With best-of-breed solutions and componentization also gaining sway, XML-based standards are seen as the best way to ensure true interoperability. "The macro trend with SOA, and the advancement of technology, means a lot more life insurers are adopting standards," Kim says. "The momentum is building."

Williams agrees that tighter standards are a requirement for the ultimate realization of SOA. "At a certain point in our development at AXA Equitable, we realized we were going to hit some barriers with services that have external touch points unless we tightened up how some standards were interpreted," he says. "To take the benefit of what a service-oriented architecture gives you and extend it out to business partners, you have to be much clearer in what you want from the standards."

What's more, insurers, both in IT and on the business side, are becoming increasingly conversant in the underpinnings of Web services and the tenets of service-oriented architectures.

"What people know today is vastly more than they knew 10 years ago," says Neil Schapperd, president of Middletown, Conn.-based PilotFish Technology. "Now you have whole groups within organizations whose job it is to implement standards."

Schapperd says insurers are also the beneficiaries of more mature XML tools, and a good deal of hindsight. "Like any other new technology, some of the earliest adopters had unsuccessful outcomes that barely got past the pilot stage," he says. "Now you have more alignment between enterprise architects and business people. So, we're seeing a significant increase in the successful implementation of standards."

That is not to say any standards initiative is a slam dunk. Schapperd advises a pragmatic, ground-up approach, noting that it is important to let business needs drive standards adoption, rather than embracing the technology for its own sake. "Start with a very specific requirement, and get it implemented and perfected before rolling it out to the rest of the organization," he says.

While IT may view a standards initiative purely as an efficiency play, there also are large implications for the enterprise as more business moves to the Web, self-service becomes more prominent and insurers juggle multiple distribution channels. "You don't know how customers are going to approach you any longer; they may want to do it using an iPhone," Keene says. "The adopters of standards are better-positioned to receive input from all channels."

In addition to the rise of SOA and Web services, another broad trend driving standards adoption is the increasing use of analytics. Standards give visibility into what's occurring and where it's occurring, and help quantify performance and expedite data collection from every corner of the enterprise.

"At first, standards just connected the pipes to make the flow of data more efficient," Keene says. "Now, we're moving to a whole new level where standards foundationally enable you to build much larger implementations of service oriented architectures, which allow you to report on all the things going on within an organization."

Excepting areas of competitive differentiation, Williams says consortium members and vendors are moving to a collaborative work environment where companies are more comfortable sharing material for the common good.

"It's a very positive, energetic and really interesting time to be involved in standards right now," he says.

2.5         Five Framework Facets

The ACORD Framework has five facets:

1.      Business Dictionary contains standardized definitions of insurance concepts.

2.      Capability Model defines a baseline of insurance companies' functions, and includes a listing of process names for some of those capabilities, called Process Maps.

3.      Information Model provides the relationships among insurance concepts, such as policy, product, party and claim.

4.      Data Model is a logical-level entity-relationship model that can be used to create physical data models or data warehouses, or to validate a carrier's data models.

5.      Component Model is a set of reusable components for the various data services in the insurance industry.

 

3               Breaking the Bad Data Bottlenecks

3.1         How to improve Profits, Efficiency, Visibility and Agility

Information Management Magazine, May/June 2010

Across industries, organizations are experiencing an increasing need to do more with less while coping with fast-paced, constrained and demanding business environments. This casts a spotlight squarely on leveraging existing information assets and investments as enabling building blocks for overall business and IT agility. Greater collaboration demands among business partners, the constrained economic environment, today's technology convergence and business agility requirements all factor into the expectation that the business manage and use integrated data more effectively and efficiently across the enterprise.

Leading organizations have begun to address many of these data issues in response to strategic or tactical challenges with a variety of solutions, and much has already been written and discussed about common approaches to data management and architecture. However, many organizations are struggling to realize the promised value of delivery and integration due to issues related to business alignment, broader business relevance, technology proliferation, evolving technology maturity and sustaining momentum. Common data-centric or technology-centric approaches have yielded less than desired outcomes in most cases, and many companies are still searching for the right answers to problems that are spread across the entire enterprise.

The complexity surrounding unstructured data and the gradual shift of application infrastructure to the cloud are the next big frontier awaiting IT managers and CIOs. For many early adopters, a data strategy that looked solid a few years ago is quickly approaching the point where it requires a revamp, or at least a significant refresh.

Many organizations have discovered that their key business initiatives are hitting a common data management bottleneck in the form of poor data integration and poor quality information. There are several root causes of bad data, many of which stem from the growth and complexity that characterize leading enterprises today:

•      Multiple and overlapping applications (some of which hold the same data);

•      Lack of business data ownership for in-process data quality maintenance;

•      Growing size and complexity across the applications landscape;

•      Exchanging data with external vendors, business partners and customers;

•      Mergers and acquisitions among both customers and IT vendors;

•      Business unit-centric versus process-centric views that lead to fragmentation;

•      No clear “master” or authoritative source of core data; and

•      Lack of enterprise data services and a mechanism to enable operational data integration.

While the root causes of fragmentation and deterioration of data quality are easy to understand, their impact on a company’s ability to sell, service and market effectively and manage supply chain operations is severe. The consequences include:

•      Wasted effort and errors due to manual, repetitive data entry;

•      Lower customer satisfaction, because call center and service personnel do not have the information they need;

•      Business partner frustration and competitive disadvantage due to the inability to manage customized offerings in distribution channels;

•      Unrealized cost reduction opportunities due to the lack of accurate global spend analysis, which leads to uninformed vendor negotiations;

•      Unrealized revenue opportunities as a result of marketers’ inability to segment, analyze and target customers accurately; and

•      Increased compliance and legal risk.

With bad data and inaccurate information affecting so many areas of a business in so many ways, it stands to reason that companies must find a means to address bad data bottlenecks if they are to achieve a breakthrough in profitability or operational efficiency. Many companies have started to implement data management solutions that shift reliance from offline data warehouse systems to master data management and data services that support operational as well as analytical integration. Such solutions have typically taken the form of single-domain master data hubs and integration layers tasked with acquisition and syndication of specific information to different applications. Together the solution components are supposed to manage data quality and control master data distribution across the enterprise. For years, companies have invested heavily in MDM/data quality/enterprise application integration implementations and data warehouses in an attempt to attain reliable visibility, process integration and a single version of truth. But most have still not been able to fully solve their data issues, and data proliferation is still rampant.
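
To give a rough, hypothetical sense of what a single-domain master data hub does at its core, the Python sketch below matches overlapping customer records from two source applications on a simple natural key and survives a "golden record"; the field names and survivorship rules are assumptions for illustration only:

```python
# Hypothetical customer records arriving from two overlapping source applications.
crm_records = [
    {"source": "CRM", "email": "a.smith@example.com", "name": "A. Smith", "phone": None},
    {"source": "CRM", "email": "j.doe@example.com", "name": "Jane Doe", "phone": "555-0101"},
]
billing_records = [
    {"source": "Billing", "email": "a.smith@example.com", "name": "Alice Smith", "phone": "555-0199"},
]

def merge(records: list) -> dict:
    """Illustrative survivorship rule: keep the longest name and the
    first non-empty phone number seen across duplicate records."""
    golden = {"email": records[0]["email"], "name": "", "phone": None}
    for r in records:
        if len(r["name"]) > len(golden["name"]):
            golden["name"] = r["name"]
        if golden["phone"] is None and r["phone"]:
            golden["phone"] = r["phone"]
    return golden

# Match duplicates on a simple natural key (email) and merge each group.
by_email = {}
for rec in crm_records + billing_records:
    by_email.setdefault(rec["email"], []).append(rec)

golden_records = [merge(group) for group in by_email.values()]
print(golden_records)   # one consolidated record per customer
```

In practice the matching and survivorship rules are far richer, but the hub's basic job, consolidating overlapping sources into one controlled record and syndicating it back out, follows this pattern.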

3.2         An Integrated Single Version of Truth Remains Elusive

Many companies today face a collection of problems that lead to unrealized ROI or stalled data programs:

·        Divisional or Business Unit-specific data management platforms that are unable to scale enterprise-wide;

·        Inability to mobilize active data governance after pilot attempts;

·        Difficulty in achieving enterprise data integration;

·        Difficulty making decisions due to the quickly changing vendor technology landscape and tools that are still gradually maturing;

·        Architectural approach choices and options that have not yet fully matured;

·        A single-domain to multi-domain journey that isn’t clear;

·        Business adoption and traction that fizzles out after an initial victory.

As a result, early adopters are revisiting their data strategies and rethinking some of the common mantras of data program undertakings. The goal still remains the same as depicted in the diagram above – to support global end-to-end process integration and analytics in a way that allows all key data and content objects to be complete, accurate, timely and available. This includes all touch points where data supports processes or insights, whatever application the data is in, and however it is being used to drive business transactions or key decisions. By providing on-demand, loosely coupled, semantically integrated and traceable access to cleansed, current and comprehensive data, an overall information architecture can remediate the constraints on your key initiatives without forcing your entire company into a single application and set of business logic. The benefits of this end state are profound from a business and IT transformation standpoint, but the journey remains unique for each company.

Having collaborated with many leading vendors and worked with executives of many large companies, I have distilled some of the hard-learned execution themes as key considerations you may find helpful:

Think process first, technology later. Take a big-picture view of the solution itself. Start with the business processes, data lifecycle mappings and supporting data models and focus on making them work end-to-end with cross-functional participation. Only then should you move on to the technical components and their design. If you don’t seamlessly integrate data strategy into overall business process engineering and tie benefits to process key performance indicator improvements, the business will not adopt the solution, regardless of executive sponsorship, business case or architectural validity.

Consider the politics of data. Don’t ignore the political reality of information and data. A lot of people within your organization “own” data as they progress through their capture and consumption of information. Everyone would like to own the definitions, but not the data quality. Regardless of top-down sponsorship, do not expect a data governance program to click if it cannot enforce accountability at or near capture with controls that can tangibly improve the efficiency or effectiveness of a process area or function. There is no cookie-cutter approach to address specific cultural and organizational dynamics – choose your battles where there is immediate bang for the buck with the right change agents willing to evangelize tradeoffs for the common enterprise good.

Follow a transformation agenda for success. Focus on business enablement and visible value delivery, not just data or technology remediation. Data programs closely attached to significant business or IT transformation programs have much greater chances of success than those without such leverage. Replacing or overhauling the underpinnings of your enterprise applications requires more than a sound architectural approach or an appreciation of data-related problems; without that leverage, the broader disruption of the operational landscape is just too difficult or too painful to facilitate the change required in process, architecture and applications across the board. If tied directly and tangibly to the value statement of your key strategic enterprise initiatives, data programs can gather change momentum and an opportunity to effect the required overhaul.

Understand different approaches of data architecture. It is important to recognize that an MDM or data management architecture is not just an implementation of some central data hub. Instead it is a new, disciplined approach to data management that puts business process optimization and in-process governance first while taking advantage of a new class of architectures and data management services. New concepts and buzzwords for different approaches abound, but it simply boils down to what’s relevant for your industry, the specific set of challenges that you intend to address and how effectively you can leverage your enterprise standards and platforms to enable the required services.

Finally, don’t overlook quickly evolving next-generation needs. Just when you thought you had an answer for tackling structured data issues, many of you are probably now staring at the next frontiers of unstructured and ill-managed content such as emails, wikis, social computing, blogs, chats and proliferating Web content, as well as game-changing ideas for cloud computing, Web 2.0 or mobile solutions. Technology consolidation, changing infrastructure models and information dynamics make it imperative for companies to revisit and refresh their information strategies to stay current and relevant. Many of the subplots, themes and considerations have been touched on here to highlight the new dynamics of data management. I will explore some of these in more detail throughout the rest of this series.

 

 

4               Top 10 IM Vendors 2010

For 2010, Information Management commissioned a ranking of the top 50 revenue-producing product vendors in information management - those that supply the tools and knowledge used by organizations to collect, distribute, analyse and report on data. The list looks at the 2009 revenue and profit of the largest publicly held information management companies.

 

 

 

5               MDM: Talent In Demand

I'm just back from a couple of master data management events this month sponsored by Information Management magazine and the MDM Institute. These last two sessions in Toronto and San Francisco taught me a lot and added a few new questions to my list.

First comes the good news. Over time we've gained a much clearer understanding of what MDM is and why it exists. Almost all attendees are pretty well able to articulate what they want the business to accomplish with MDM and how it justifies expensive plans for customer, product, supplier or other masters. (That last sentence deserves an exclamation point!)

Also, data governance is finally understood well enough that it has moved ahead of the software discussion -- in that everyone gets it  -- and no one is happier about this than the tool providers (and consultants of course). It takes unjustified heat off vendors for companies to realize that MDM is a bunch of products, and governance is a framework of policies, controls, politics and discipline.

To paraphrase, 'It's the culture of the organization, stupid.' With that mindset, I'm surprised as ever at the number of companies struggling to get ahead constructively with MDM programs.

On a scale of one to four, most companies are at the bottom, the "anarchy" or unresolved stage of maturity. And where it's working, MDM is mostly driven by some compliance or regulatory deadline and not by business opportunity. It's not to say MDM is easy because it's not. Yes, there are enterprises at our conferences measuring success with master data. That's almost worth another exclamation point.

But then there is the great divide: the majority of companies are still unable to frame their operational inefficiencies compellingly enough to bring coordinated attention to the tangible benefits of MDM.

It's hard to remember we're just a couple years separated from cycles of large infrastructure capital investment and the marching orders that come with those dollars. With apologies, I'll make the point: captains and lieutenants listen to non-commissioned sergeants when the boss's objective is clear and fighting gets tough.

We have the objective and we have the weapons to take on MDM, but the fiefdoms of middle management are often distracted, too busy or not under orders to make it happen. It's never as easy as it looks, but where MDM is working, there is a vision, short-term direction and a business cause that allows self-funding requirements to be set aside, at least momentarily.

The irony of these conferences is that the MDM shortfalls mentioned here are counter to intense but uneven market demand. Again and again I have seen and heard that MDM talent is hard to come by. Our conference co-chairman Aaron Zornes warns that companies must protect native MDM talents, lest they be poached by systems integrators and sold back to customers at a higher rate. He also notes that SIs are likely to bait and switch MDM staffing competency until the talent curve improves, and that is not immediately happening.

Somewhere between the big picture of an enterprise architect and the self-funding project mandate to the middle manager is where MDM talent finds itself right now. Many of you already know this game: business owns the data; IT owns the tools. Where you want to be is the point where MDM and a credible financial output intersect.  

It can verge on the coincidental: one day you're a talented pawn, and the next you're the missing game piece from the family chess set.

Short-term success in MDM looks very good right now. If MDM is what you do and your immediate objective is clear, you are possibly or likely in the right place. If you understand MDM and your mission is unclear or your sponsorship feels vague or surreptitious, gather your CV, promote yourself -- and prepare to move on to better things. You're in demand.

 

6               TDWI: How To Make Data Integration Work

6.1         A checklist of 10 characteristics shows it’s time to raise the bar on integration expectations due to industry evolution

June 24, 2010 – A recent report from TDWI offers a checklist of 10 best practices that shows it’s time to raise the bar on integration expectations as the industry’s evolution has outpaced mindsets.

Philip Russom, senior manager for TDWI Research and author of the report, emphasizes that there's no silver bullet for "ideal" data integration. As the report explains, DI is a diverse discipline that makes use of a number of tools depending on one's business and technological requirements.

DI has undergone an expansion over the last decade and has reached a critical mass of multiple techniques used in diverse applications and business contexts. Vendor products have achieved maturity; users have grown their DI teams to epic proportions; competency centers regularly staff DI work; and DI as a discipline has earned its autonomy from related practices like data warehousing and database administration. Given all this change, it’s not surprising that many DI specialists and the colleagues who depend on them suffer misconceptions and out-of-date mindsets that need adjustment.

TDWI’s report explains a few things to understand about  DI:

 

1.      Data integration is a family of diverse but related techniques.

2.      Data integration practices reach across both analytics and operations.

3.      Data integration is an autonomous data management discipline.

4.      Data integration is the repurposing of data via transformation (see the short sketch after this list).

5.      Data integration is a value-adding process.

6.      Data integration is a green technology that makes data management more sustainable.

7.      A data integration solution should have architecture.

8.      A data integration solution should be the product of collaboration.

9.      Data integration must be coordinated with other data management disciplines.

10.   Data integration should be governed, but also contribute to governance.
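
As a small illustration of item 4 above, the repurposing of data via transformation, the hypothetical Python sketch below reshapes an operational record into the form an analytical target might expect; the field names and rules are assumptions, not taken from the TDWI report:

```python
from datetime import datetime

# A record as it might appear in an operational source system (hypothetical fields).
source_row = {"cust_name": "  jane  doe ", "order_dt": "2010-06-24", "amt": "1299.50", "currency": "usd"}

def transform(row: dict) -> dict:
    """Repurpose the operational record for analytical use: cleanse text,
    parse dates, convert types and derive attributes useful for reporting."""
    return {
        "customer_name": " ".join(row["cust_name"].split()).title(),  # normalize whitespace and casing
        "order_date": datetime.strptime(row["order_dt"], "%Y-%m-%d").date(),
        "order_amount": float(row["amt"]),                            # string -> numeric for aggregation
        "currency": row["currency"].upper(),
        "order_year": int(row["order_dt"][:4]),                       # derived attribute for reporting
    }

print(transform(source_row))
```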

 

"This report is an eye-opener, regardless of who the reader is," says Russom. "Many technical and business people are aware of data integration, but they’re not fully aware of all its capabilities and benefits." Russom explains that even DI specialists sometimes focus on specific tasks  for a deliverable and sometimes grope for meaningful ways to describe the goals of DI. "To help people voice these issues, the report reflects on DI’s true mission and altruistic goals. Hence, for any reader with an open mind, this report redefines data integration and its potential in modern, future-facing terms."

7               Data Integration Channel

Data Integration refers to the organization’s inventory of data and information assets as well as the tools, strategies and philosophies by which fragmented data assets are aligned to support business goals. Data integration can pursue several strategies, including single, federated or virtual compilations of data for a given business purpose. Increasingly, businesses are striving to deliver consistent data views through master data management (MDM), which is meant to deliver a near real-time, hub-based and synchronized presentation of information to any seat or point of view in the organization. Data integration today is still heavily focused on middleware, software and management tools that connect software and data end points through connectors and adaptors. Over time, companies are migrating to the philosophy of a service-oriented architecture (SOA) that applies Web protocols and standards for self-identifying application and data end points. This transition is proceeding slowly and selectively as companies are reluctant to abandon proven systems, including mainframes and traditional messaging, which remain mission-critical to business operations.
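
The connectors and adaptors mentioned above can be pictured with a minimal, hypothetical Python sketch in which two very different end points are exposed behind one common interface; the record layouts and class names are illustrative and not drawn from any vendor product:

```python
from abc import ABC, abstractmethod

class CustomerSourceAdapter(ABC):
    """The common interface a data service exposes, regardless of the system behind it."""
    @abstractmethod
    def get_customer(self, customer_id: str) -> dict: ...

class MainframeAdapter(CustomerSourceAdapter):
    def get_customer(self, customer_id: str) -> dict:
        # A real connector would read and parse a fixed-width legacy record here.
        raw = "0042Jane Doe      ACTIVE"
        return {"id": customer_id, "name": raw[4:18].strip(), "status": raw[18:].strip()}

class WebServiceAdapter(CustomerSourceAdapter):
    def get_customer(self, customer_id: str) -> dict:
        # A real connector would call a SOAP/REST end point and map its payload here.
        payload = {"customerId": customer_id, "fullName": "Jane Doe", "state": "ACTIVE"}
        return {"id": payload["customerId"], "name": payload["fullName"], "status": payload["state"]}

def customer_view(adapter: CustomerSourceAdapter, customer_id: str) -> dict:
    """Consumers code against the interface, so end points can be swapped without rework."""
    return adapter.get_customer(customer_id)

print(customer_view(MainframeAdapter(), "0042"))
print(customer_view(WebServiceAdapter(), "0042"))
```

This is the same idea that motivates the gradual move toward SOA-style data services: the consuming application does not need to know whether the data behind the interface lives on a mainframe or behind a Web service.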

 

8               Data Governance Remains Immature: Increase Focus on Business Process to Build Momentum

I just read a great blog post by Marty Moseley discussing the results of a data governance survey he and his team recently fielded. The feedback he collected matches recent data governance-related surveys and interviews I've done with my clients at Forrester - the general consensus being that most data governance programs - if they exist at all - remain extremely immature and fraught with risks. The most common roadblocks include minimal or no executive sponsorship (as Marty also noted), IT-driven efforts with limited or no business participation, lack of business justification and the ever-present likelihood of "de-prioritization" when a more compelling initiative or fire drill comes along.

But there is a silver lining here. As I shared with the survey results in my Oct 2009 research "Trends 2009: Master Data Management," while only 4 percent of my 113 survey respondents felt they had a very high level of data governance maturity (represented by a cross-enterprise, cross-functional data governance organization spanning both business and IT roles with top-down executive sponsorship and measurable value-add), the vast majority of these organizations also recognized "trusted data technologies" like data quality and MDM as being on the critical path to their organization's success. Most organizations admit there remain a number of inhibitors (mostly political, prioritization and ROI calculation-related) that make it difficult to support large investments in these technologies. But most also believe that data governance is the right approach to bridging this driver/inhibitor gap and are investing more time and resources to figure out how to operationalize data governance processes within their own organizational context and culture.

In my interactions with Forrester clients, I get the sense that data governance is receiving more senior management-level attention today than I've seen at any point in my 18+ year data management career. One of the biggest turning points has been the growing recognition that data governance is not – and should never have been – about the data. High-quality and trustworthy data sitting in some repository somewhere does not in fact increase revenue, reduce risk, improve operational efficiencies or strategically differentiate any organization from its competitors. It’s only when this trusted data can be delivered and consumed within the most critical business processes and decisions that run your business that these business outcomes can become reality. So what is data governance all about? It’s all about business process, of course.

My Forrester colleague Clay Richardson (who covers Business Process Management technologies and best practices) and I have been collaborating on a concept we call Process Data Management which recognizes that effective data management requires a focus on business processes – and vice versa. We kicked off our discussion on this topic in our research “Warning: Don’t Assume Your Business Processes Use Master Data” and will be publishing another piece later this quarter called “Align Process And Data Governance To Deliver Effective Process Data Management” which will focus on the need for data and process governance efforts to be more aligned to deliver real business value from either. Later this year I’ll also be publishing an update to my annual MDM trends piece “Trends 2010: Master Data Management” which will include new survey results reviewing data governance and MDM maturity.

I'm optimistic that we might finally see some real momentum building for data governance to be embraced as a legitimate competency within many large organizations – especially with a focus on business processes to help evangelize and secure business sponsorship. It will likely be a number of years before best practices outnumber worst practices, but any momentum in data governance adoption is good momentum!

 

9               What’s Next On The MDM Horizon?

Many large organizations have finally “seen the light” and are trying to figure out the best way to treat their critical data as the trusted asset it should be. As a result, master data management (MDM) strategies, and the enabling architectures, organizational and governance models, methodologies and technologies that support the delivery of MDM capabilities are…in a word…HOT! But the concept of MDM - and the homegrown or vendor-enabled technologies that attempt to deliver that elusive “single version of truth”, “golden record”, or “360-degree view” - has been around for decades in one form or another (data warehousing, BI, data quality, EII, CRM and ERP have all, at one time or another, promised to deliver that single version of truth).

The current market view of MDM has matured significantly over the past five years, and today many organizations are on their way to successfully delivering multi-domain/multi-form master data solutions across various physical and federated architectural approaches. But the long-term evolution of the MDM concept is far from over. There remains a tremendous gap between the limited business value most MDM efforts deliver today and what MDM and data management evangelists feel MDM is capable of delivering in terms of business optimization, risk mitigation and competitive differentiation.

What will the next evolution of the MDM concept look like in the next 3, 5 and 10 years? Will the next breakthrough be one that’s focused on technology enablement? How about information architecture? Data governance and stewardship? Alignment with other enterprise IT and business strategies?

 

·        What is the linkage between MDM and Data Warehousing?

·        Do Data Virtualization efforts jeopardize MDM strategies?

·        Is “Lean” or “Agile” MDM possible?

·        Who should own/drive the MDM strategy in the organization?

·        When will MDM become MIM (Master Information Management)?

·        Should business process improvement efforts lead MDM strategies?