Do you trust your data? Are you confident that the reporting delivered to your business users provides the insight they need? Good quality data is a powerful tool for making business critical decisions. In the data-driven landscape of modern business, the quality of your data matters. Continue reading Understanding Your Enterprise Data Quality
Editor’s Note: This week, we’re recapping our first Tekathon, an immersive education session bringing together Knowledgent Informationists and Big Data MDM vendors to discuss real-world use cases. This post by Ashish Saxena, MDM Informationist at Knowledgent, discusses key benefits and features of Novetta solutions.
As Knowledgent Informationists, we help our clients break down the boundaries between business needs and information silos so they can focus on outcomes. Increasingly, exploration and innovation are demanding more from information assets. Data practitioners are being presented with messy data infrastructures that hide information of dubious quality. Value remains locked in unstructured data assets, leading to misinterpretations of both the content and the context of the data. We need a way to find value in data arriving in a variety of formats across our information ecosystem. Our clients want to reduce the effort required to extract, consolidate, and clean data to produce reports. Continue reading Tekathon: Novetta Review
You have invested in Master Data Management and are now reaping the benefits of MDM in your enterprise. This is great, but what do you do now? You may not have even thought about your MDM platform for a while.
Here are some reasons you may want to review it and consider whether an upgrade to your existing software is in your future: Continue reading When and Why to Consider MDM Upgrades
We’ve just released a new white paper on regulatory compliance! Continue reading White Paper: Transforming Regulatory Burden into Business Opportunity
The Big Data phenomenon is at least partly about answering questions – hard questions, and ideally, questions that predict the future in their own fuzzy way. But there is (at least) one question Big Data doesn’t necessarily have an answer for. It’s a question that comes up regularly and may be among the most feared questions heard in meetings: Continue reading One of the Most Feared Questions About Big Data
“How can we use the technologies we may already have to help our users find information?”
This is a question we’ve been hearing from several of our clients, many of whom are assessing big data and advanced analytic tools and technologies in the hope of using them to enable data-driven decision making. In many cases, they have moved beyond the stage of educating themselves about the strategic benefits of big data to implementing foundational data lakes and analytic sandboxes. Continue reading How MDM Enables Data-Driven Decision Making in a Big Data World
“Information is the new oil” has become a popular refrain, and like oil, crude data needs to be refined before it can be consumed. In other words, having big data won’t serve any purpose unless the data is good enough to be useful. With the potential for mismatching, duplication, and other quality threats from ingesting data across disparate sources, ensuring the accuracy and quality of data is more important than ever.
This is where big data meets Master Data Management (MDM). On the principle of “better safe than sorry,” MDM users can apply data matching techniques to resolve some data quality conflicts. These techniques let users determine which data is “most likely” to be correct and, if not perfect, at least at a “fit for purpose” level of quality. This post discusses two matching techniques, Deterministic Matching and Probabilistic, or “Fuzzy,” Matching, in the context of big data. Continue reading Deterministic versus Probabilistic Matching in Big Data
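The contrast between the two techniques can be sketched in a few lines of Python. This is a minimal illustration, not a production matching engine: the record fields, the weights, and the 0.8 acceptance threshold are all assumptions chosen for the example, and the standard library's `difflib.SequenceMatcher` stands in for the more sophisticated similarity measures (phonetic codes, edit distances, trained models) a real MDM hub would use.

```python
from difflib import SequenceMatcher

def deterministic_match(a, b, keys):
    """Deterministic: records match only if every key field is exactly equal
    (after trivial normalization such as trimming and lowercasing)."""
    return all(str(a.get(k, "")).strip().lower() == str(b.get(k, "")).strip().lower()
               for k in keys)

def probabilistic_match(a, b, weights, threshold=0.8):
    """Probabilistic ("fuzzy"): compute a weighted average of per-field
    similarity scores and accept the pair if it clears a threshold."""
    score = sum(
        SequenceMatcher(None, str(a.get(k, "")).lower(),
                        str(b.get(k, "")).lower()).ratio() * w
        for k, w in weights.items()
    ) / sum(weights.values())
    return score >= threshold

# Two records that plausibly describe the same person
r1 = {"name": "Jonathan Smith", "city": "New York"}
r2 = {"name": "Jon Smith",      "city": "New York"}

print(deterministic_match(r1, r2, ["name", "city"]))            # False: names differ exactly
print(probabilistic_match(r1, r2, {"name": 0.7, "city": 0.3}))  # True: similar enough
```

The deterministic check rejects the pair outright because the name strings are not identical, while the fuzzy check survives the "Jonathan"/"Jon" variation, which is exactly the trade-off the post's two techniques represent: precision and simplicity versus tolerance for the messy, duplicated data big data pipelines ingest.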
Want to know what it takes to work in data and analytics? Every other Friday, our own Informationists will share their thoughts, experiences, and advice on what they do and what they did to get there. Expect to see a wide range of answers from individuals in the same lines of work; our Informationists come from all walks of life, which only shows that there’s more than one way to get on the right career track.
Our first post featured Dip Kharod, Big Data Architect. Continuing the series is Reed Bradford, Data Quality Architect. Reed earned his MBA in Computer & Information Sciences from Temple University and his BS in Information Management from Brigham Young University (BYU). He has worked in information management and analytics for 28 years across the financial services, life sciences, hospitality, manufacturing, and retail industries.
Data Quality Management (DQM) is a major concern in most data-driven organizations, yet many struggle to improve and remediate their data quality. From the beginning, they wrestle with questions that impede the progress of their data quality efforts.
For example: What is DQM? How do I get started? Do I even need it? What metrics should I use? What data quality rules should I define? Continue reading White Paper: How to Build a Successful DQM Program
Whether mandated by regulatory considerations, driven by executive dashboards, or meant to enable personalized targeting of marketing messages to consumers, the rapidly increasing reliance on analytics has made Data Quality a higher priority than ever before. In turn, this new status has reshaped the very meaning of Data Quality. There was a time when Data Quality really meant one thing: a simple, binary assessment of the accuracy of data. That was the beginning and end of the Data Quality discussion. Today, however, the questions have grown more complex.
From “Is my data correct?” to “What does my data actually mean?”, the questions surrounding Data Quality are undergoing a rapid transformation. This change has been driven by four major factors: