This blog post was authored by Todd Schraml.
Always Normalize Logical Data Models
The logical data model reflects the common language of the organization, bringing IT and business people into agreement on the terms used. Problem areas within an organization’s data and processes are often reflected in the semantics in use. Because of this spooky action between the language used and the systems built, a critical evaluation of the business semantics can uncover solutions to ongoing problems within business systems. A logical data model should always be normalized, because normalization applies a level of logic to the question, “What does my data mean?” When structures are denormalized, whether to achieve a dimensional design or for other reasons, the logical is left behind and the physical takes over. Without a normalized logical data model, the meaning of the data and the relationships within it go undocumented. What is left undocumented can cause trouble when defining the process steps needed to maintain the data; for example, problems can arise in maintaining updates on a dimension that merges many logical concepts.
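The update problem on a merged dimension can be made concrete with a small sketch. The example below is illustrative only: the table and attribute names are invented, and plain Python dictionaries stand in for database rows. It shows how a denormalized structure that merges two logical concepts (product and supplier) repeats supplier facts on every row, so a single business change requires many coordinated updates, while the normalized equivalent stores each fact once.

```python
# Hypothetical denormalized "product dimension" merging two logical
# concepts: product facts and supplier facts live in the same rows.
denormalized = [
    {"product_id": 1, "product_name": "Widget",
     "supplier_name": "Acme", "supplier_city": "Dayton"},
    {"product_id": 2, "product_name": "Gadget",
     "supplier_name": "Acme", "supplier_city": "Dayton"},
]

# Update anomaly: the supplier moves, but only one row is corrected.
# The repeated supplier fact on the other row is now stale.
denormalized[0]["supplier_city"] = "Columbus"

# Normalized equivalent: each fact is stored exactly once, so a single
# update keeps every product consistent with its supplier.
suppliers = {"Acme": {"supplier_city": "Dayton"}}
products = {
    1: {"product_name": "Widget", "supplier_name": "Acme"},
    2: {"product_name": "Gadget", "supplier_name": "Acme"},
}
suppliers["Acme"]["supplier_city"] = "Columbus"

# The denormalized rows now disagree with each other; the normalized
# design cannot disagree, because the supplier's city exists in one place.
print(denormalized[0]["supplier_city"], denormalized[1]["supplier_city"])
print(suppliers[products[1]["supplier_name"]]["supplier_city"],
      suppliers[products[2]["supplier_name"]]["supplier_city"])
```

Documenting the normalized form first, even when the physical design will be dimensional, records which updates must be coordinated when the concepts are later merged.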
Enforce Standardization with Logical Data Modeling
Logical data modeling is the tool for performing a detailed semantic evaluation. One learns the business while taking the journey of understanding the data; once the data is understood, a proper logical data model can be sculpted. Stepping through a data modeling process effectively applies and enforces standardization, both of object naming and of the object patterns instantiated for the enterprise. Once those standards are truly enforced, user communities slowly become more data literate and knowledgeable. Understanding the kind of data to expect within an attribute becomes clearer simply because a proper class word is always used. Other common patterns found across data models increase the confidence of users perusing their organization’s data content.
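The class-word convention mentioned above can be sketched as a tiny naming check. This is a hedged illustration, not a prescribed standard: the set of class words and the attribute names below are invented for the example, and real organizations maintain their own approved lists.

```python
# Hypothetical list of approved class words. Each attribute name ends in
# a class word that signals the kind of data the attribute holds.
CLASS_WORDS = {"amount", "date", "code", "name", "count", "identifier"}

def ends_with_class_word(attribute_name: str) -> bool:
    """Return True if the attribute's final word is an approved class word."""
    return attribute_name.lower().split("_")[-1] in CLASS_WORDS

# "order_total_amount" tells a user to expect a monetary value;
# a bare "customer" leaves the content of the attribute ambiguous.
print(ends_with_class_word("order_total_amount"))
print(ends_with_class_word("customer"))
```

A check like this could run against a model's attribute list during design review, which is one way the standardization described above gets enforced rather than merely suggested.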
Untangle Enterprise Data with Data Modeling
Data modeling needs to be done for many reasons. As the modeling team learns the business, it becomes a good partner to the business staff. Existing data models provide easy-to-use documentation that communicates plans across the business and development teams. For a developer, a data model provides a roadmap that focuses development effort while decreasing development time and errors. Data is an asset, and being able to easily share knowledge about that data and its structures is valuable; data models serve as a common tool for conveying that business value. As the one who puts data models together, the designer is well positioned to be a data expert for the organization; designers can thus become a corporate asset simply by doing the smart thing and establishing proper data models as needed.
Without data models, where does an end user go to understand the data that is available? How do end users determine how data elements relate across differing areas? Developers often believe that because they can open modules and look at code, that approach is all anyone needs; but an organization contains many more people, and not everyone can or should be digging through programming logic. Conversely, the idea that no one should need to know or understand their data is unwise. If no one knows or understands the data, then who gets to say what was done rightly or wrongly in managing and administering it?
Model Data Into the Future
As tools lean further into low-code/no-code approaches, generate themselves, and embrace whatever trendy futures come next, eventually it is only the data that will remain as the domain of knowledge to be leveraged and managed. It is imperative that we all become experts in understanding what our data means, how it interrelates, and how and why it may transform. In the context of all these realities, why wouldn’t an organization model data?
Data stakeholders — including various IT functions, data governance teams and business users — need a comprehensive solution to understand, control and distribute reliable data. ER/Studio does this better than anyone by providing unhindered access to enterprise metadata.
Experience for yourself how ER/Studio can help you use data models to be good business partners by scheduling a product demonstration with one of IDERA’s industry experts.
About the Author
Todd Schraml has over fifteen years of experience in application development and maintenance, including eleven years focused on data warehousing initiatives and five years of database administration on massively parallel processing database management systems. Positions held include Project Manager, Data Warehouse Architect, Technical Lead, Database Administrator, Business Analyst, Developer, and Teacher.
Todd’s focus is on data analysis and design; implementing databases for both operational applications and data warehouses; using new and emerging technologies; and seeking ways to integrate formalized quality practices into Information Technology arenas.