In my previous post, we examined some of the challenges of developing data models for the master data entities managed within an MDM environment, and I suggested that achieving the holy grail of MDM, namely the (elusive, if not purely mythical) “golden record,” would first require a practice of business-oriented conceptual and logical data modeling.
We tend to have a “dimensional attitude” about master data, focusing on the “customer master,” “supplier master,” or “product master” as if the representation of each entity should be completely isolated from the representations of all the others. That is not how business processes actually work, though. Transactional and operational processes blend master entity data: “customers” buy “products,” and “suppliers” deliver “parts and materials.” Analytical processes blend master entity data as well: reports organize “product” sales by “customer type” and by “location.”
The use of a generic data model for any of these entity types is bound to limit their integration into the broad range of transactional, operational, and analytical processes and applications. There will be questions about which data attributes belong in the master model and which are to be managed outside of it, as well as which attributes are necessary to differentiate unique identities within the master environment. Furthermore, the usage models are going to be very different: an operational process may search for a single entity’s record at a time, while an analytical process might need master entity data organized within a star schema for dimensional analysis and reporting.
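To make the contrast concrete, here is a minimal sketch of the two usage patterns against the same master data; the entity names, attributes, and values are purely hypothetical placeholders, not a prescribed model. The operational pattern fetches one customer record by key, while the analytical pattern treats that same master data as a dimension that transaction facts join to for reporting by customer type.

```python
# Illustrative only: entity, attribute, and key names are hypothetical.

# Master customer records, as an operational store might hold them.
customer_master = {
    "C001": {"name": "Acme Corp", "customer_type": "Wholesale", "location": "Chicago"},
    "C002": {"name": "Beta LLC", "customer_type": "Retail", "location": "Austin"},
}

# Operational pattern: look up a single entity's record at a time.
def lookup_customer(customer_id):
    return customer_master[customer_id]

# Analytical pattern: sales facts reference the customer dimension,
# so reports can roll transactions up by a dimension attribute.
sales_facts = [
    {"customer_id": "C001", "product_id": "P10", "amount": 1200.0},
    {"customer_id": "C002", "product_id": "P10", "amount": 300.0},
    {"customer_id": "C001", "product_id": "P20", "amount": 450.0},
]

def sales_by_customer_type(facts, dim):
    totals = {}
    for row in facts:
        ctype = dim[row["customer_id"]]["customer_type"]
        totals[ctype] = totals.get(ctype, 0.0) + row["amount"]
    return totals

print(lookup_customer("C001")["name"])          # Acme Corp
print(sales_by_customer_type(sales_facts, customer_master))
# {'Wholesale': 1650.0, 'Retail': 300.0}
```

The same master records serve both patterns, but the access paths and the supporting structures they imply are quite different, which is exactly why a single generic model tends to fall short.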
Generic master models might be a good starting point for deliberation, but a better approach involves developing a unified conceptual master entity model based on the organization’s business processes. In this exercise, start with some number (say five, perhaps?) of the key processes for running the business. For each process, step through each stage to determine which master data entities are involved, what information about those entities is needed, and how the information about the different entities is blended to execute the process.
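One lightweight way to record the output of such a walkthrough is a process-to-entity matrix: for each stage of a process, note which master entities it touches and which of their attributes it needs. The sketch below illustrates the idea with an invented "order to cash" process; every stage, entity, and attribute name is a hypothetical placeholder.

```python
# Hypothetical walkthrough of one business process, stage by stage,
# noting the master entities and attributes each stage requires.
order_to_cash = {
    "capture order": {
        "customer": ["name", "billing_address", "credit_terms"],
        "product": ["sku", "list_price"],
    },
    "fulfill order": {
        "product": ["sku", "warehouse_location"],
        "supplier": ["name", "lead_time_days"],
    },
    "invoice": {
        "customer": ["name", "billing_address"],
        "product": ["sku", "list_price"],
    },
}

# Aggregate across stages: every attribute the process needs, per entity.
def entity_requirements(process):
    required = {}
    for stage_needs in process.values():
        for entity, attrs in stage_needs.items():
            required.setdefault(entity, set()).update(attrs)
    return required

print(entity_requirements(order_to_cash))
```

Repeating this for each key process, and then merging the per-process requirements, surfaces both the entities the master environment must manage and the attributes each consuming process actually depends on.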
Two things will become apparent through this evaluation. The first is the breadth of semantic variation associated with each entity. For example, the term “customer” might be used in business processes associated with sales, finance, and customer support, yet the perception of who the customer is, and what that customer’s attributes are, may differ across these three business functions. The second is how differently the attributes of the master entity’s data are used across the different business functions and processes. To continue the example, the customer data may be used to execute a single sales transaction in real time, grouped for risk analysis within the finance department, and retrieved as a complete transaction history for customer support.
As a result, a more complete view of what the business data consumers expect to see from a master customer (or part or product or supplier) record will emerge. At the same time, the relationships among the different master entities will become clearer as well, which will guide the design of the entity data models to support how the business processes consume the data.
In my next post, I will discuss how devising a more aligned set of master data models can drive the development of the application interfaces to ensure the quality and usability characteristics of the master data, while enabling applications to rapidly integrate with the master data environment.
Want to learn about ER/Studio? Try it for yourself free for 14 days!
You can also read this White Paper from David Loshin: Make the Most of Your Metadata
About the author:
David Loshin, president of Knowledge Integrity, Inc. (www.knowledge-integrity.com), is a recognized thought leader and expert consultant in the areas of analytics, big data, data governance, data quality, master data management, and business intelligence. Along with consulting on numerous data management projects over the past 15 years, David is also a prolific author on business intelligence best practices, with numerous books and papers on data management, including the second edition of “Business Intelligence – The Savvy Manager’s Guide”.