Guide to setting up an effective Data Quality Platform
What does it take to set up an effective Data Quality platform that enables proactive, dynamic, and meaningful data quality checks?
Gartner: “Effective data quality practices require more than a tool. A complete data quality solution includes built-in workflow, knowledge bases, collaboration, interactive analytics, and automation to support various use cases across different industries and disciplines.”
Recent studies have reported that 85% of policy and strategic decisions are data-driven. And with pharma companies increasingly turning to AI, ML, and analytics for effective patient care, investing in data quality has become more important than ever.
The Higher the Data Quality, the Lower the Operational Cost
Investing in data quality might seem like a costly affair, but in the long run it prevents the high operational costs caused by incorrect and inconsistent data affecting business decisions and outcomes. A minor data fault at an early stage of data processing can rapidly surge into a major business issue affecting multiple crucial operations such as sales, commercial, compliance, and regulatory functions. The after-effects can cascade to patient care, the drug supply chain, clinical trials, and other areas that are highly data-driven and process-oriented.
Bad data quality leads to rework, both manual and automated, costing companies time and money. All of this rework could have been avoided with a better data quality system.
In our experience across pharma customers, we have seen that higher data quality leads to higher operational efficiency, which ultimately lowers the cost associated with risk.
Journey from ‘unawareness of data quality’ to ‘treating data as an asset’
The first step in setting up a data quality framework is to assess current data quality levels. This baseline, the ‘current data quality’ or ‘fitness of the data for its purpose’, helps in understanding the status quo, identifying loopholes and pain points, and scoping the improvements needed to meet quality requirements for specific business use cases.
Data Quality maturity journey to assess and strengthen data governance:
There are many approaches to assessing the health of current data. To start with, conduct data quality surveys to understand and analyze the data present in the systems in scope. The outcomes of these surveys help in evaluating the data quality maturity level, which ranges from “Unaware” to “Aware” to “Effective”. While conducting these surveys, it is important to understand data consistency and the users’ level of trust in the data during decision making.
For instance, if business users and relevant stakeholders do not trust the data available in the system because information is fragmented and inconsistent, and strategic decisions are therefore taken without adequate and accurate information, then the data quality maturity level can be determined as “Unaware”.
Various factors decide the data quality maturity level: awareness of poor data quality, the degree of fragmented and inconsistent data, the presence of a formal information architecture and data governance mechanism, and the strategic management of metadata. To build more confidence in the assessment results, review the outcomes with business stakeholders through dedicated review sessions. Such sessions help in understanding the business perspective and the various dimensions of data quality.
Enable business-focused data quality
After the data quality maturity assessment, it is important to define data quality guidelines that take the maturity level into account. These guidelines help in setting up procedures to ensure that quality standards are followed throughout the data life cycle, so that business users can place more trust in the data when making strategic decisions.
These guidelines can be defined based on the type of audience, business use cases, business needs, and type of process. A few examples are listed below:
- ‘Data quality checks on vendor-provided data’ to ensure that vendors meet the quality standards defined by the organization.
- ‘Control total checks for deviation between raw and processed data’. For instance, the sales reported for a territory for a month in the raw data should be the same as in the processed data (see the sketch after this list).
- ‘Referential integrity checks’ to avoid linkage breaks between master data, such as customer and product, and transactional data, such as sales, calls, and claims.
- ‘Trend, threshold, and standard deviation checks on key data sets’ to catch deviations in sales numbers, TRx/NRx counts, variance at the territory level, etc. These help in identifying root causes such as an underlying business problem in the field, sales corrections or instatements sent by a data source, bad or inconsistent data in the source system, a technical issue in data processing, or a load failure.
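To make the control total check concrete, here is a minimal sketch in Python with pandas. The column names territory_id and sales_amount are illustrative assumptions, not a prescribed schema.

```python
import pandas as pd

def control_total_check(raw: pd.DataFrame, processed: pd.DataFrame,
                        key: str = "territory_id",
                        measure: str = "sales_amount") -> pd.DataFrame:
    """Compare per-territory totals between the raw and processed data.

    Returns the territories whose control totals deviate; these should
    be routed to the operations team for investigation.
    """
    raw_totals = raw.groupby(key)[measure].sum().rename("raw_total")
    proc_totals = processed.groupby(key)[measure].sum().rename("processed_total")
    merged = pd.concat([raw_totals, proc_totals], axis=1).fillna(0)
    merged["difference"] = merged["raw_total"] - merged["processed_total"]
    # Use a small tolerance so floating-point noise does not raise alerts.
    return merged[merged["difference"].abs() > 1e-6].reset_index()
```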
Pick data quality rules wisely
Before finalizing any data quality rules, it is important to review the data assets of the systems in scope against the data quality guidelines. These reviews help in understanding the data assets and identifying data quality rules that fit the business use cases and the defined guidelines. The rules can range from business-driven checks to data completeness, referential integrity, and basic quality checks such as null checks.
Business-driven data quality rules allow business users to see KPI-level deviations in sales, calls, incentive compensation, and other subject areas. For example, organizations can set up an agreed threshold on key data sets; any increase or decrease beyond that threshold can be reported to the relevant stakeholders for corrective action (a sketch of such a rule follows).
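One way to implement such a threshold rule, again assuming pandas and hypothetical column names (territory_id, trx_count): flag territories whose current KPI falls more than an agreed number of standard deviations away from their historical mean.

```python
import pandas as pd

def flag_kpi_deviations(history: pd.DataFrame, current: pd.DataFrame,
                        key: str = "territory_id",
                        kpi: str = "trx_count",
                        n_sigma: float = 3.0) -> pd.DataFrame:
    """Flag rows whose KPI deviates more than n_sigma standard
    deviations from the historical mean of the same territory."""
    stats = history.groupby(key)[kpi].agg(["mean", "std"]).reset_index()
    merged = current.merge(stats, on=key, how="left")
    deviation = (merged[kpi] - merged["mean"]).abs()
    # Territories with too little history have std = NaN and are not flagged here.
    return merged[deviation > n_sigma * merged["std"]][[key, kpi, "mean", "std"]]
```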
In the pharma industry there are multiple vendors for both primary and secondary data across business subject areas, and these data providers share data in various formats through various connectors. To ensure that any existing or new data vendor meets the quality standards for raw/input data sets, rules such as pre-defined file format checks, null checks on key attributes, data type and uniqueness checks, and checks for missing files in the shared location need to be in place.
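The following sketch shows what such vendor-file checks might look like in pandas. The expected schema (hcp_id, product_id, trx_count) and the key attributes are assumptions for illustration, to be adjusted per vendor contract.

```python
import pandas as pd

# Hypothetical expectations for a vendor feed; illustrative only.
EXPECTED_COLUMNS = {"hcp_id": "object", "product_id": "object", "trx_count": "int64"}
KEY_ATTRIBUTES = ["hcp_id", "product_id"]

def validate_vendor_file(df: pd.DataFrame) -> list[str]:
    """Return a list of data quality failures for a vendor-supplied file."""
    failures = []
    missing = set(EXPECTED_COLUMNS) - set(df.columns)
    if missing:
        failures.append(f"missing columns: {sorted(missing)}")
        return failures  # the remaining checks assume the full schema
    for col, dtype in EXPECTED_COLUMNS.items():
        if str(df[col].dtype) != dtype:
            failures.append(f"{col}: expected {dtype}, got {df[col].dtype}")
    for col in KEY_ATTRIBUTES:
        if df[col].isnull().any():
            failures.append(f"{col}: null values in a key attribute")
    if df.duplicated(subset=KEY_ATTRIBUTES).any():
        failures.append("duplicate records on key attributes")
    return failures
```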
Setting up data quality rules does not end with validating vendor data at the initial stage; rules should be configured at every stage of the data journey. Rules such as referential integrity checks between master and transactional data, data completeness checks such as record counts and aggregate matches between layers, and data integrity checks such as TRx being greater than or equal to NRx ensure the correctness, accuracy, and reliability of the data, and the trust placed in it.
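Two of these downstream rules, sketched under the same assumptions (pandas, hypothetical column names):

```python
import pandas as pd

def referential_integrity_check(transactions: pd.DataFrame,
                                master: pd.DataFrame,
                                key: str = "customer_id") -> pd.DataFrame:
    """Return transaction rows whose key has no match in the master data."""
    return transactions[~transactions[key].isin(master[key])]

def trx_nrx_integrity_check(rx: pd.DataFrame) -> pd.DataFrame:
    """TRx (total prescriptions) must be >= NRx (new prescriptions);
    return the rows that violate this business rule."""
    return rx[rx["trx_count"] < rx["nrx_count"]]
```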
From our experience with the pharma industry, we have learned that these deviations are caused by many factors, such as a new business strategy, market conditions, geography, and technical errors.
Data quality rules act as preventive and proactive measures that build trust in, and the reliability of, the commercial enterprise data warehouse, enabling successful data-driven business functions.
Operationalize a proactive data quality process for alerts/notifications and a business impact view
Once data quality rules are configured at the different levels, operationalizing the data quality process is the next step.
Data quality rules identify the anomalies and irregularities present in the data, caused by various internal and external factors. But if these anomalies are not reported to the relevant stakeholders and business users for resolution or corrective action, the purpose of setting up the checks is not fully served.
To proactively report data deviations and issues, alerts or notifications need to be enabled for the appropriate business users and the technical operations team. These alerts could be system-generated emails, or data quality dashboards that offer a more intuitive interface for business users.
Data quality dashboards can give business users deep insights into data quality across data exceptions, sales variance, sales health, claims deviations, etc. Alerts and notifications can have varying degrees of severity based on the nature of the data quality check, the severity of the data issue, and the control measures set up in the system. For example, if a vendor file does not meet the required data quality standards, further processing stops based on the alert raised by the data quality system.
Other notifications are informative in nature; these allow business users or decision makers to decide whether enforcement of a preventive measure or a quality rule is required.
For example, a CRM call activity dataset may list HCPs with call details even though those HCPs do not exist in the customer master. Depending on the business rules in place, the system may be excluding these HCPs. An informative check listing such HCPs can notify the relevant stakeholders, and a business user or data steward can then perform root cause analysis and decide whether there is a business need to include the HCPs, or whether the calls were logged incorrectly (a sketch of such a check follows).
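A minimal sketch of this pattern, with the same caveats as before (pandas, hypothetical column names, and print standing in for a real email or dashboard integration):

```python
import pandas as pd

def hcps_missing_from_master(calls: pd.DataFrame,
                             customer_master: pd.DataFrame) -> pd.DataFrame:
    """HCPs that appear in CRM call activity but not in the customer master."""
    return calls[~calls["hcp_id"].isin(customer_master["hcp_id"])]

def notify(check_name: str, violations: pd.DataFrame, severity: str) -> bool:
    """Route a data quality outcome by severity.

    'blocking' issues halt further processing; 'informative' issues
    only notify stakeholders. Returns True if processing may continue.
    """
    if violations.empty:
        return True
    print(f"[{severity.upper()}] {check_name}: {len(violations)} records affected")
    return severity != "blocking"

# A failed vendor-file check would be raised as 'blocking', while the
# HCP check above would be raised as 'informative' for the data steward.
```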
Extend data quality as an integral part of data governance to support information governance across operational business processes and decision making
Empower the enterprise data governance model by pairing it with the data quality platform. This gives the existing governance model an edge and brings benefits across the organization at multiple levels, whether for business users, sales or marketing heads, or the pharma sales force.
Every pharma company has its own organization-specific governance model, but the primary objective is similar: to build an ecosystem with standardized rules and policies regarding its data.
Operationalizing the data quality platform makes visible the types of data irregularities and the risks associated with the data. Building the governance model with data quality in mind increases the stability and sustainability of the ecosystem.
In the pharma and life science industry, with strict regulatory norms and frequent mergers, it is paramount to have a data governance model that not only protects sensitive data but also proactively ensures data integrity for risk management.
A holistic approach to data governance helps in strengthening data integrity and reducing risk, allowing pharma and life science companies to deliver insights for better market access. Robust and proactive data governance helps organizations understand emerging risks, manage costs better, collaborate effectively, and run efficient processes.
Learn More
Build trust in your data to make better decisions
DataZymes DZLumen is an enterprise-grade data quality management solution that ensures the accuracy and consistency of data across pharma commercial datasets and business processes. DZLumen moves and processes data with integrity, giving tomorrow’s market leaders the ability to make better decisions and, ultimately, build new possibilities. DZLumen standardizes data definitions enterprise-wide, enables advanced statistical and business-driven data quality checks to provide key data quality insights to business and IT users, and publishes all DQ validation outcomes on intuitive dashboards with event-based alerts and notifications for data quality issues.