
RPA Can Fix Your Data Quality Issues

According to Gartner, “the average financial impact of poor data quality on organizations is $9.7 million per year.” In 2016, IBM estimated the yearly cost of poor data quality in the US alone to be $3.1 trillion. Anyone who works with data at the end of its processing journey understands the impact, so why are we not talking more about data quality?

While there are many possible explanations for organizations not addressing data quality, two stand out: the relationship between data quality and business results is hard to quantify, and most business functions are unaware of the impact that poor data has on downstream processes.

The reality is that executives, managers, accountants, and other employees simply accommodate bad data, integrating work-arounds into their day-to-day work. They accept bad data because there is little or no incentive to fix it: accepting it is easier than tracing where it originates and then working with upstream teams to correct the behaviors that produce it.

These accommodations cost organizations both time and money. A large portion of employees’ time goes to low-value data correction and standardization rather than to the higher-value work of analyzing data for insights, risks, and opportunities. Even non-reporting tasks lose efficiency: because of undisciplined data creation practices, a simple task like selecting a vendor or supplier can turn into a series of trial-and-error lookups until the right entry is found.
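To make that trial-and-error concrete, here is a minimal Python sketch of near-duplicate detection against a vendor master. The vendor names and the 0.7 similarity cutoff are illustrative assumptions, not data or thresholds from any real system:

```python
import difflib
from itertools import combinations

# Invented vendor master illustrating undisciplined data creation:
# the same supplier entered several slightly different ways.
vendors = ["Acme Corporation", "ACME Corp.", "Acme Corp", "Globex LLC"]

def near_duplicates(names, cutoff=0.7):
    """Yield pairs of entries similar enough to be the same vendor.
    The cutoff is an illustrative assumption, not a recommended value."""
    for a, b in combinations(names, 2):
        if difflib.SequenceMatcher(None, a.lower(), b.lower()).ratio() >= cutoff:
            yield a, b

for a, b in near_duplicates(vendors):
    print(f"Possible duplicate: {a!r} ~ {b!r}")
```

Every near-duplicate pair a check like this surfaces represents a lookup a user may fail before stumbling on the entry their process expects.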

We use RPA to standardize data entry and to validate data, ensuring quality at the point of capture. Before applying RPA to a business process, we look for the root cause of data issues and correct the bad data practices there, so that we do not harden accommodations for bad data into the automation. Fixing data at its root means we do not have to automate those accommodations in every downstream process that uses the data. Accurate, standardized data also means faster processing and faster report generation, which accelerates Close, Financial Analysis, Analytics, and more.
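As a rough sketch of what entry-time standardization and validation can look like, consider the Python below. The field names, rules, and helper functions are hypothetical illustrations of the technique, not the API of any particular RPA product:

```python
import re
from datetime import datetime

# Illustrative normalization table; real rules would come from the
# organization's own data standards.
LEGAL_SUFFIXES = {"corp.": "Corporation", "corp": "Corporation",
                  "inc.": "Incorporated", "llc": "LLC"}

def standardize_vendor_name(raw: str) -> str:
    """Collapse whitespace, fix casing, and expand legal suffixes so the
    same vendor is always written the same way."""
    words = re.sub(r"\s+", " ", raw.strip()).split(" ")
    return " ".join(LEGAL_SUFFIXES.get(w.lower(), w.title()) for w in words)

def validate_record(record: dict) -> list:
    """Return a list of validation errors; empty means safe to post.
    These rules are examples, not an exhaustive policy."""
    errors = []
    if not record.get("vendor"):
        errors.append("vendor is required")
    try:
        datetime.strptime(record.get("invoice_date", ""), "%Y-%m-%d")
    except ValueError:
        errors.append("invoice_date must be a valid YYYY-MM-DD date")
    if not record.get("amount") or record["amount"] <= 0:
        errors.append("amount must be a positive number")
    return errors

record = {"vendor": standardize_vendor_name("  ACME   corp."),
          "invoice_date": "2024-02-30",   # invalid: February has no 30th
          "amount": 125.00}
print(record["vendor"])          # -> Acme Corporation
print(validate_record(record))   # -> ['invoice_date must be a valid YYYY-MM-DD date']
```

Embedding checks like these in the bot that performs the entry rejects bad values at the source, so they never have to be corrected, or worked around, in every downstream process that consumes the data.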

For more information about how hardening bad data practices adds data debt to your organization, read our Process Automation Approach blog.

Author:

Joshua Gotlieb

Intelligent Automation Practice Director, Vigilant Technologies