Data is the lifeblood of the modern business. When data quality suffers, so does revenue, which is why it is important to take active steps to improve organisational data quality.


Why is Business Data Quality Important?

Many businesses recognise that their success hinges on their ability to leverage data to make decisions that streamline internal processes and capture market opportunities. However, this depends on those insights being delivered accurately and in a timely manner in the first place. High-quality, dependable business data, created through well-honed processes, is a vital prerequisite for all of this.

In short, to be good at analytics, organisations also need to be good at managing data quality. There are a number of reasons why businesses find this difficult, and the obstacles can seem hard to surmount. However, there are also several tried and tested processes and tools which can be used to start improving business data quality and, more generally, to boost the effectiveness of analytics projects.

Why do Businesses Find it Hard to Improve Their Data Quality?

Lack of Ownership

One major reason businesses often find it hard to improve their data quality is that they do not assign ownership over data assets. Without well-defined roles and responsibilities, nobody is accountable for questionable outcomes, and it can be hard to even identify where to begin getting to the root of a data quality issue.

Responsibility needs to be assigned jointly across a number of areas. During this process it is important to specifically identify:

  1. People who are best placed to define the issues
  2. People who can actually make changes to the data to improve its quality
  3. People who can direct changes to systems
  4. People who can direct changes to business processes

With these cohorts identified, an organisation is set up to actually be able to take meaningful action on improving data quality and following up on data quality initiatives.

Lack of Measurement and Tracking

It is hard to know whether progress is being made if there is nothing in place to quantify and track it. Organisations are often hampered by the lack of a central library of rules and conditions that represent an ideal level of quality. Not only is it important to have this library, it is also important to check it against the data on a regular basis, so that there is a history of improvement over time.
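
As a minimal, tool-agnostic sketch, such a library can simply be a set of named checks evaluated on a schedule, with each run's pass rates recorded so that improvement over time is visible. The rule names, fields and data below are hypothetical examples, not a prescribed structure.

```python
from dataclasses import dataclass
from datetime import date
from typing import Callable, Dict, List


@dataclass
class QualityRule:
    """A named data quality rule: the check returns True when a record passes."""
    name: str
    check: Callable[[dict], bool]


# Hypothetical central rule library for a customer dataset.
RULE_LIBRARY: List[QualityRule] = [
    QualityRule("email_present", lambda r: bool(r.get("email"))),
    QualityRule(
        "postcode_is_4_digits",
        lambda r: str(r.get("postcode", "")).isdigit() and len(str(r.get("postcode", ""))) == 4,
    ),
]


def run_rules(records: List[dict]) -> Dict[str, float]:
    """Evaluate every rule in the library and return the pass rate per rule."""
    return {
        rule.name: (sum(1 for r in records if rule.check(r)) / len(records)) if records else 1.0
        for rule in RULE_LIBRARY
    }


# Pass rates keyed by run date, so improvement can be demonstrated over time.
history: Dict[date, Dict[str, float]] = {}


def scheduled_check(records: List[dict]) -> None:
    """Intended to run on a regular schedule (e.g. nightly) against the latest data."""
    history[date.today()] = run_rules(records)
```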

Not Integrated Into Business Processes

Tackling a backlog of issues accumulated over a long period can feel overwhelming. It is much easier to resolve issues within minutes or days of them arising, before context about that particular area of data is lost. An approach that allows issues to build up so they can be handled periodically is less effective than one in which data quality resolution is built into the business process itself and handled as a matter of course.

Implementation can be supported by setting data quality-related KPIs for staff, as well as by reinforcing good discipline at the management level.

Methods to Improve Data Quality

Define Your Governance Organisational Structure

In order to overcome the aforementioned lack of accountability, it is important to formalise roles and responsibilities, as well as a decision-making framework, across multiple data stakeholders. Specifically, there are several roles which are useful to assign and delegate areas of responsibility to, as sketched after the list below:

  1. A Data Owner is responsible for guiding data management-related decisions across a particular business domain.
  2. A Data Steward is familiar with the technical details necessary to manage data on a day-to-day basis.
  3. A Data Originator is an operator performing data entry, under whose watch human error may be causing data quality issues.
  4. A Steering Committee is a group at the executive level who designate funds and resources specifically for data governance and quality projects.
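
One lightweight way to make these assignments explicit is to record them per data domain in a simple registry. The sketch below is purely illustrative; the domains and role holders are assumptions rather than a prescribed structure.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class GovernanceRoles:
    """Who is accountable for a given business data domain."""
    data_owner: str              # guides data management decisions for the domain
    data_steward: str            # handles the technical detail of day-to-day data management
    data_originators: List[str]  # operators performing data entry in this domain


# Hypothetical registry mapping each domain to its assigned roles.
GOVERNANCE_REGISTRY: Dict[str, GovernanceRoles] = {
    "customer": GovernanceRoles("Head of Sales", "CRM Administrator", ["Sales support team"]),
    "finance": GovernanceRoles("Finance Director", "Finance Systems Analyst", ["Accounts payable team"]),
}
```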

Work on Establishing a Data-Conscious Culture

An organisation in which data is produced and analytics are consumed in a conscious and informed way goes a long way towards overcoming the common pitfalls associated with poor data quality. Data entry operators who are aware of the repercussions that improperly filled fields can have down the line are more likely to be diligent in their work.

Furthermore, important context can be provided at the point of analytics consumption by badging reports according to the level of refinement that underpins the data. For example, a Bronze rating can signify data that has been auto-populated in a staging area, a Silver rating can indicate a prototype tabular model, and a Gold rating can mean a cube or tabular model from a key subject area.
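
To make the badging concrete, the tiers can be represented explicitly and attached to each published report. This sketch is illustrative only; the report name is hypothetical and the tier descriptions simply restate the example above.

```python
from dataclasses import dataclass
from enum import Enum


class RefinementBadge(Enum):
    """Level of refinement behind the data underpinning a report."""
    BRONZE = "Auto-populated data in a staging area"
    SILVER = "Prototype tabular model"
    GOLD = "Cube or tabular model from a key subject area"


@dataclass
class Report:
    name: str
    badge: RefinementBadge


# Example: a report badged so consumers know how refined the underlying data is.
sales_report = Report("Monthly Sales Summary", RefinementBadge.SILVER)
print(f"{sales_report.name} [{sales_report.badge.name}]: {sales_report.badge.value}")
```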

Establish Automated Monitoring and Actions

Creating a library of data quality rules and tracking progress over time is much easier with dedicated tools for the job. It is useful to set up automated alerts and communications that go directly to the relevant stakeholders, both to resolve new issues and to monitor outstanding ones. A tool like Loome Monitor can help you easily set up the triggers, as well as the associated actions, that let you act immediately on data quality issues as they come up.
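
The underlying pattern such tools implement can be sketched generically: a trigger condition evaluated against the data, paired with an action that fires when the condition is breached. The sketch below is not the Loome Monitor API; the rule, threshold and notification mechanism are illustrative assumptions. In practice the action would raise a ticket or notify the steward identified during the ownership exercise above.

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class MonitorRule:
    """A trigger condition paired with the action to take when it fires."""
    name: str
    trigger: Callable[[List[dict]], bool]  # returns True when the rule is breached
    action: Callable[[str], None]          # what to do about it, e.g. alert a steward


def alert_steward(message: str) -> None:
    # Placeholder for a real notification channel (email, ticket, chat message, etc.).
    print(f"ALERT to data steward: {message}")


# Hypothetical trigger: flag the batch if more than 5% of customer records lack an email address.
missing_email_rule = MonitorRule(
    name="missing_customer_emails",
    trigger=lambda records: sum(1 for r in records if not r.get("email")) / max(len(records), 1) > 0.05,
    action=alert_steward,
)


def run_monitor(rules: List[MonitorRule], records: List[dict]) -> None:
    """Evaluate each trigger against the latest data and fire its action when breached."""
    for rule in rules:
        if rule.trigger(records):
            rule.action(f"Rule '{rule.name}' breached on the latest batch of {len(records)} records.")


run_monitor([missing_email_rule], [{"email": ""}, {"email": "a@example.com"}])
```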

Example Data Environment

A data architecture visualisation of a system for monitoring business data quality.