A Journey To The Golden Record

Master Data Management: three simple words, yet a term that can be powerful and beneficial for an organization.  Some people might not understand what a golden record or a single source of truth means, but we techies get very excited when we think about data, especially when it is organized, standardized, clean, accurate, and documented.  Easy enough, right?  Master Data Management establishes operational processes, executed and controlled on a foundation of people, processes, and technology, to maintain and deliver master data that is trusted, regulated, understood, and fit for a business purpose.  It integrates data from multiple systems and datasets into a single, comprehensive golden record that the organization trusts as both accurate and correct.  Having this golden record allows for better business decisions and improved business results.

So how does an organization get to the top and achieve that golden record?  Some tackle it the way a homeowner tackles a big remodeling effort: they have been in the house for thirty years, everything has its place and is operating fine, but it is time to make things a little more organized and modern.  Like any renovation, the first phase is discovery and design.  You have your vision, your business goals are clearly defined, objectives are set, the use case is strong, and it aligns with the organizational goals.  Data sources need to be evaluated and a roadmap defined.  A blueprint, or data inventory, should be in place to document every source-system attribute that is commonly shared across domains and maps to the high-level master data fields; only fields that serve business value should be incorporated into this 360-degree view.  Now that the plan is in place, it is time to choose a solution that will help you make it all happen.
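Before choosing that solution, it helps to picture the blueprint itself. Below is a minimal, hypothetical sketch of such a data inventory in Python; the source systems (crm, billing) and field names are invented for illustration and are not tied to any particular product.

```python
# A minimal, hypothetical data inventory: each master data field lists the
# source-system attributes that feed it. Source names and fields are
# illustrative, not any real system's schema.
DATA_INVENTORY = {
    "party.full_name": {
        "crm.customers.cust_name": {"shared_across_domains": True},
        "billing.accounts.account_holder": {"shared_across_domains": True},
    },
    "party.date_of_birth": {
        "crm.customers.dob": {"shared_across_domains": True},
        "billing.accounts.birth_date": {"shared_across_domains": False},
    },
    "party.primary_address": {
        "crm.customers.mailing_addr": {"shared_across_domains": True},
    },
}

def fields_for(master_field: str) -> list[str]:
    """Return the source attributes documented for a master data field."""
    return sorted(DATA_INVENTORY.get(master_field, {}))

if __name__ == "__main__":
    for master_field in DATA_INVENTORY:
        print(master_field, "<-", ", ".join(fields_for(master_field)))
```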

When trying to incorporate a master data management program, the stakeholders must understand the value that a successful MDM program can bring and be open and honest about any apprehensions with any potential vendor they are looking to vet.  Understanding your architecture and knowing the data you will be bringing over is key.  If you have any doubts about the accuracy, completeness, and consistency of your data, you must secure a vendor that has a Data Quality solution in place.  For a successful MDM program, you need high quality, trusted, clean, and standardized data so that you can make important decisions.  Without high quality data, you cannot become data driven, because you cannot trust the data to make impactful business decisions, which leads to inefficiency, missed opportunities, and ultimately financial loss.  Another big decision you will have to make: do you want an on-premises solution or a cloud or SaaS approach?  Imagine a contractor who says they will do all the work versus a contractor who subcontracts it out.  Both have their advantages and disadvantages; my team happened to sample both.
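As a rough illustration of the kind of profiling a Data Quality step performs before data is trusted for MDM, the sketch below checks completeness and basic standardization on a few made-up records; the records, fields, and rules are assumptions for illustration, and real Data Quality tools apply far richer rule sets.

```python
import re

# Hypothetical source records; in practice these would come from the systems
# being evaluated for the MDM program.
records = [
    {"name": "JANE DOE", "dob": "1980-02-29", "zip": "07608"},
    {"name": "J. Doe",   "dob": "",           "zip": "7608"},
    {"name": "",         "dob": "1980-02-29", "zip": "07608-1234"},
]

def completeness(field: str) -> float:
    """Share of records with a non-empty value for the field."""
    filled = sum(1 for r in records if r.get(field, "").strip())
    return filled / len(records)

def zip_is_standard(value: str) -> bool:
    """Very rough US ZIP standardization check (5 digits or ZIP+4)."""
    return bool(re.fullmatch(r"\d{5}(-\d{4})?", value))

if __name__ == "__main__":
    for field in ("name", "dob", "zip"):
        print(f"{field}: {completeness(field):.0%} complete")
    print("non-standard ZIPs:", [r["zip"] for r in records if not zip_is_standard(r["zip"])])
```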

Some organizations prefer to have software on-premises.  They can have full control and ownership of all the internal components and allow their development teams to work collaboratively with other IT groups to design, develop, maintain, and deploy robust solutions that fit the needs of the organization.  This permits more internal controls to be in place while allowing for ownership of security and accessibility.  However, while the price point might be optimal at buy-in, long-term associated costs could be high.  A company needs to estimate hardware costs, as well as the resources needed to maintain components, perform upgrades, and monitor and patch servers.  A large organization with employees trained in the different areas could consider an on-premises solution, but most do not have that staff or do not want to invest the time to manage it all.

With a cloud-first approach, a company can configure a cloud architecture based upon its needs.  It allows for a heads-down design and implementation approach while your vendor takes care of the infrastructure.  Gone are the days of managing and maintaining hardware and servers.  Your staff will not have to spend time troubleshooting when technical issues arise; you will log a ticket with your provider instead.  With cloud-based solutions, there usually are not many hidden costs; you pay subscription fees as you go.  Some organizations might be concerned about security risks when it comes to their data being in someone else's house; however, most top cloud companies offer comprehensive, multi-layered security with constant threat monitoring, network protection, and data encryption, as well as failovers, backup, logging, and restoration.

Once your company has chosen which way it wants to go, it is time to get started.  Whether you keep things on-premises or move them up to the cloud, the fun is about to begin.  During an MDM implementation, it is critical to have business owners assisting from the beginning.  Ensuring that your data is as clean and standardized as possible, and that weights and matching criteria are customized for your data, is crucial to a successful program.  Usually with MDM implementations, after all the matching rules are set, a bulk load happens where the rules are finally applied to the data.  It is at that point that in-depth testing should be performed on the merged data, ensuring that the data is aligning as it should, that records are being merged properly into entities, and that no data elements are being missed.  After the data has been validated, incremental loads are turned on, load schedules are automated, and applications can finally start to consume those golden records.
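To make the idea of weights and matching criteria a little more concrete, here is a minimal, hypothetical scoring sketch in Python: each attribute comparison carries a configurable weight, and the total score informs the merge decision. The field names and weight values are illustrative only, not those of any specific MDM product.

```python
# Hypothetical match configuration: per-attribute weights tuned to the data.
MATCH_WEIGHTS = {"last_name": 0.35, "first_name": 0.20, "dob": 0.30, "ssn_last4": 0.15}

def compare(a: dict, b: dict) -> float:
    """Return a weighted match score between 0 and 1 for two source records."""
    score = 0.0
    for field, weight in MATCH_WEIGHTS.items():
        va, vb = a.get(field, "").strip().lower(), b.get(field, "").strip().lower()
        if va and vb and va == vb:  # exact agreement on a populated field
            score += weight
    return score

rec_a = {"first_name": "Maria", "last_name": "Lopez", "dob": "1975-06-01", "ssn_last4": "1234"}
rec_b = {"first_name": "MARIA", "last_name": "Lopez", "dob": "1975-06-01", "ssn_last4": ""}
print(compare(rec_a, rec_b))  # 0.85: agrees on name and DOB, SSN is missing on one side
```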

However, in some scenarios, merging data from different sources can pose headaches for data stewards.  People get linked together erroneously in the MDM application, more than likely because of that lovely technical term, "dirty data".  Some people might question, "What do you mean the data is dirty?  People have had no issues with the data in the decades our companies have been in business."  Unfortunately, many companies have not had a full, holistic understanding of their data elements, or of what has been entered in those fields behind the scenes.  When two entities get merged because of incorrect or, in some cases, fat-fingered data, the issues appear right there in front of the data stewards' eyes.  What happens when two completely different entities are linked together that clearly are not the same person?  It happens, more often than one would think.  You would hope your MDM tool is smart enough, when it is unsure, to assign a task stating that these entities need manual intervention to determine whether they are indeed the same entity.
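One common way a tool decides when to merge automatically versus raise a steward task is a pair of thresholds around the match score. The sketch below is a hypothetical illustration of that routing, building on the scoring idea from the previous example; the threshold values are assumptions, not defaults of any particular tool.

```python
# Hypothetical thresholds: above AUTO_MERGE the records link automatically,
# below NO_MATCH they stay separate, and the gray zone goes to a data steward.
AUTO_MERGE = 0.90
NO_MATCH = 0.60

def route(score: float) -> str:
    """Map a match score to a merge decision."""
    if score >= AUTO_MERGE:
        return "auto-merge"
    if score <= NO_MATCH:
        return "keep separate"
    return "create steward review task"

for s in (0.95, 0.75, 0.40):
    print(s, "->", route(s))
```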

To be successful in an MDM program, it is strongly recommended to have a Data Governance program in place, or to look into starting one immediately after MDM implementation.  Data Governance means accurate, cleansed, standardized data, which results in better business decisions and improved business results.  It should be a key driver of an organization's approach to data management.  It incorporates the people, processes, and technology required to make data handling accurate and reliable.  It also allows one to understand information assets across the organization while enabling decision making and accountability for the data.  Data is becoming the core organizational asset that will determine the success of your business, as is now shown in your MDM platform.  Digital transformation is evolving, and you can only exploit your data assets and have a successful transformation if you are able to govern your data.  For your organization to deliver good business results, your data must be accurate and understood, and the use of that data should be governed through policy and monitoring.  A Data Governance framework allows an organization to focus on and control the areas of its operational framework by defining clear roles and responsibilities that enable sustainable execution across the organization.  It allows for development, training, communication, and program metrics.  It creates standard definitions for key data elements that promote consistency and accuracy, improving data usability, quality, and timeliness to support data-based decision making.  It should be designed to prepare rules and regulations for the organization to handle any issues that may come up regarding data, ensuring compliance with policies, documentation of data ownership, and data function management.

When you have a successful MDM program coupled with Data Governance, your organization will see its data as a valuable asset, and any business objective can be achieved with confidence.  These two disciplines in data management are independent yet complementary.  Together they pave the way for the future of data-driven decision making, setting your company up for success with minimal risk.

In 2017, the New Jersey Courts embarked on a bipartisan journey to reform what many deemed a broken and punitive bail system.  Prior to January 2017, all persons charged with a crime in New Jersey were entitled to be released on bail, regardless of their criminal history or the threat they might pose to society.  More than 15,000 people were incarcerated on a given day with bail amounts under $2,500.  Individuals charged with more serious crimes could post bail for the simple fact that they had the means to do so.

Effective January 2017, L. 2014, c. 31 requires the preparation of a risk assessment and a recommendation on conditions of release for every eligible defendant arrested and transported to jail.  A judge must make a pretrial release determination for an eligible defendant without unnecessary delay, and no later than 48 hours from arrest.  It has been the judiciary's goal to prepare the recommendation and the pretrial release determination in under 24 hours.  A lot happens between the time the defendant is arrested and the time the determination is made.

After the defendant is arrested, law enforcement identifies the charged individual and creates an official arrest record on a summons or warrant.  The judiciary uses that record to sweep 40 million party records and over 200 million cases, return the data associated with the defendant, score it using a predictive algorithm, and present the results within five seconds of the submit button being clicked.  This public safety assessment is descriptive and diagnostic, predictive in using data to assess the level of risk, and prescriptive in making a release recommendation.
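The actual Public Safety Assessment factors and scoring are beyond the scope of this piece, but as a purely hypothetical sketch of how descriptive, predictive, and prescriptive stages can fit together, the flow looks something like this; every factor, weight, and threshold below is invented for illustration.

```python
# Purely illustrative: factor names, weights, and thresholds are invented for
# this sketch and are NOT the actual Public Safety Assessment model.
def describe(history: list) -> dict:
    """Descriptive/diagnostic: summarize the defendant's matched case history."""
    return {
        "pending_charges": sum(1 for c in history if c["status"] == "pending"),
        "prior_failures_to_appear": sum(c.get("failed_to_appear", 0) for c in history),
    }

def predict(summary: dict) -> float:
    """Predictive: turn the summary into a hypothetical risk score between 0 and 1."""
    return min(1.0, 0.2 * summary["pending_charges"] + 0.3 * summary["prior_failures_to_appear"])

def prescribe(risk: float) -> str:
    """Prescriptive: map the score to a hypothetical release recommendation."""
    return "release with conditions" if risk < 0.5 else "recommend detention hearing"

history = [{"status": "pending", "failed_to_appear": 1}, {"status": "closed", "failed_to_appear": 0}]
summary = describe(history)
print(summary, predict(summary), prescribe(predict(summary)))
```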

To achieve this goal, the NJ Courts has a Master Data Management process in place, in which demographic data from six of our case management systems is pulled, cleansed, and standardized via an Extract, Transform and Load (ETL) process, then weighted and probabilistically matched behind the scenes to form a golden record of a defendant, which this risk assessment consumes.  Without this master data management system in place, getting a 360-degree view of a defendant record would be challenging, potentially inaccurate, time consuming, and costly.  Since implementation, the New Jersey Courts has migrated its MDM from an on-premises solution to a fully managed cloud solution and has embarked on a full data governance and data warehousing initiative to support reporting and analytics.
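As a rough, hypothetical illustration of what "weighted and probabilistically matched" can mean, the sketch below uses Fellegi-Sunter-style agreement weights derived from assumed match and non-match probabilities; the fields, probabilities, and records are illustrative, and the actual pipeline is of course more involved.

```python
import math

# Assumed per-field probabilities: m = P(fields agree | same person),
# u = P(fields agree | different people). Values are illustrative only.
FIELD_PROBS = {
    "last_name": (0.95, 0.01),
    "dob":       (0.97, 0.002),
    "ssn_last4": (0.90, 0.0001),
}

def match_weight(rec_a: dict, rec_b: dict) -> float:
    """Sum log-likelihood ratios: agreement adds log(m/u), disagreement adds log((1-m)/(1-u))."""
    total = 0.0
    for field, (m, u) in FIELD_PROBS.items():
        if rec_a.get(field) and rec_a.get(field) == rec_b.get(field):
            total += math.log2(m / u)
        else:
            total += math.log2((1 - m) / (1 - u))
    return total

a = {"last_name": "rivera", "dob": "1990-01-15", "ssn_last4": "6789"}
b = {"last_name": "rivera", "dob": "1990-01-15", "ssn_last4": None}
print(round(match_weight(a, b), 2))  # high positive weight suggests the same defendant
```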

Between 2015 and 2023, New Jersey saw over a 23% decrease in its pretrial jail population, allowing individuals to carry on with their everyday lives and duties while awaiting trial.
