
Good AI in 2021 Starts with Great Data Quality

By Heine Krog Iversen


More and more companies want to use artificial intelligence (AI) to improve their operations and performance. Achieving good AI, however, is another story entirely.

AI initiatives can take a lot of time and effort to get up and running, often exceeding their initial budget and time targets. Even more alarming, this assessment (paywall) claimed that close to half of AI projects fail to even make it to production. Despite this risk, a growing number of companies are investing substantial resources in hopes of deriving value from AI.

Many of these projects are derailed because the data environment is simply not suitable for AI; the same issue often afflicts machine learning (ML) programs as well. The good news is that there are steps an organization can take to right the ship.

While AI projects fail for various reasons, one of the major data-related obstacles to achieving good AI is poor Data Quality. To understand why, first accept that poor Data Quality on its own can be responsible for a significant resource drain and a host of problems throughout an organization. As just one example, a marketing manager developing a direct-mail campaign will struggle mightily if the data used to target potential customers is outdated, missing addresses, or incorrectly labeled.

Having essential, trustworthy data in place is a prerequisite for AI to succeed in a company. Organizations that find themselves scrambling to prepare their data properly after commencing their AI design are left with budget and resources tied up in a delayed project, which in many cases means significant opportunity costs.

Adding to this is the shift toward self-service analytics over the last several years. It is a worthwhile endeavor that most now support, since business users need data "on the spot" to make rapid decisions; however, business users are not always the best custodians for judging the Data Quality of the information they retrieve.

Given these obstacles, many companies have chosen simply to extinguish their AI hopes. But is waving the AI white flag the best course of action? Various reports, such as this article in Harvard Business Review, indicate that a robust AI program can be a true driver of stronger corporate performance. The vision is a worthy one, and there is a way to stabilize an AI initiative so it advances corporate success. With that in mind, here are three steps to get you started.

First, strong Data Quality in your historical data is a must if you want a formidable predictive model. A business must also account for its more recent data and the volume of future data that will pour into the organization, all of which affect the predictive model as well. As companies combine the myriad data they have collected with unstructured data and more sophisticated data sets, the need for Data Quality grows further still. All of this data should meet strict Data Quality standards, and methods should be instituted to properly identify, label, describe, categorize, and de-duplicate it, as sketched below.
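
As a hedged illustration of those methods, here is a minimal sketch in Python with pandas, assuming a hypothetical customer extract; the file and column names (customers.csv, customer_id, email, segment) are illustrative, not tied to any specific product.

```python
import pandas as pd

# Hypothetical customer extract; file and column names are illustrative.
df = pd.read_csv("customers.csv")

# Identify: flag records missing a required identifier.
missing_ids = df[df["customer_id"].isna()]
print(f"{len(missing_ids)} rows are missing customer_id")

# Describe: profile completeness so stewards can see where quality erodes.
completeness = df.notna().mean().round(3)
print("Share of non-null values per column:\n", completeness)

# Categorize: normalize free-text labels into a consistent vocabulary.
df["segment"] = df["segment"].str.strip().str.lower()

# De-duplicate: drop exact duplicates, then duplicates on a business key.
df = df.drop_duplicates()
df = df.drop_duplicates(subset=["email"], keep="first")
```

Checks like these pay off most when they run automatically on every load, so bad records are caught before they ever reach a model.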

Second, organizations should assign a person or team to oversee corporate Data Quality. A quality assurance (QA) system should be in place that follows best practices, such as maintaining proper protocols for managing metadata; one small piece of such a protocol is sketched below.
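
To make that concrete, here is a minimal sketch of an automated metadata audit, assuming a simple in-code catalog; the required fields and the 90-day staleness window are assumptions chosen for illustration, not an industry standard.

```python
from datetime import date, timedelta

# Hypothetical catalog entries; field names and values are assumptions.
catalog = [
    {"name": "sales_orders", "owner": "ops-team",
     "description": "Daily order extracts", "last_validated": date(2020, 12, 20)},
    {"name": "web_leads", "owner": None,
     "description": "", "last_validated": date(2020, 6, 1)},
]

REQUIRED_FIELDS = ("owner", "description")  # metadata every dataset must carry
MAX_STALENESS = timedelta(days=90)          # revalidate at least quarterly

def audit(entry):
    """Return the metadata-policy violations for one dataset."""
    issues = [field for field in REQUIRED_FIELDS if not entry.get(field)]
    if date.today() - entry["last_validated"] > MAX_STALENESS:
        issues.append("validation is stale")
    return issues

for entry in catalog:
    problems = audit(entry)
    if problems:
        print(f"{entry['name']}: {', '.join(problems)}")
```

A report like this gives the Data Quality owner a running list of which datasets need attention, instead of relying on users to notice gaps.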

Third, there is technology on the market that helps businesses significantly reduce the time it takes to build and maintain analytics data. It leverages the power of automation to model, consolidate, integrate, and centralize all your organizational data into a single repository, making it easier to prepare data for AI. With modern innovation, companies can also use this type of automation to cleanse, transform, and model data for their preferred deployment, be it on-premises, hybrid, cloud, multi-cloud, or private cloud.
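
To show the consolidation pattern at toy scale, here is a hedged sketch that pulls two hypothetical source extracts, applies shared cleansing rules, and lands everything in a single SQLite file standing in for the central repository; commercial automation platforms generate and maintain pipelines like this for you, so treat this only as an outline of the idea.

```python
import sqlite3

import pandas as pd

# Hypothetical source extracts; file names and schemas are assumptions.
SOURCES = {"crm": "crm_export.csv", "erp": "erp_export.csv"}

def cleanse(df):
    """Shared cleansing rules applied uniformly to every source."""
    df = df.drop_duplicates()
    df.columns = [c.strip().lower() for c in df.columns]  # consistent names
    return df.dropna(how="all")                           # drop empty rows

# Consolidate and centralize: land every cleansed source in one repository.
with sqlite3.connect("warehouse.db") as conn:
    for name, path in SOURCES.items():
        frame = cleanse(pd.read_csv(path))
        frame.to_sql(f"staging_{name}", conn, if_exists="replace", index=False)
```

The design point is that cleansing rules live in one place and apply to every source, so the central repository stays consistent as new feeds are added.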

Generally speaking, companies are looking for faster, smarter ways to get to their data, and with the investment in AI programs, this need has become even more pressing. Companies are hungry for strategic ways to leverage their analytics data and AI, and to use predictive analytics to solve problems, forecast, and plan for the future.

But it doesn't end there: Data Quality is also essential for supporting data governance, that is, building safeguards to ensure proper access, control, security, and privacy. All of these are crucial in helping your organization stay compliant and meet existing and future regulatory requirements. One small example of such a safeguard is sketched below.
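
As one small, hedged example of a privacy safeguard, the sketch below one-way hashes personally identifiable columns before data leaves a governed zone; the column list and hashing policy are assumptions for illustration, not a compliance recipe.

```python
import hashlib

import pandas as pd

# Hypothetical PII columns; the masking policy is illustrative only.
PII_COLUMNS = ["email", "phone"]

def mask(value):
    """One-way hash so analysts can join on a field without seeing it."""
    return hashlib.sha256(str(value).encode("utf-8")).hexdigest()[:12]

def apply_privacy_policy(df):
    """Mask PII columns before a frame is shared outside the governed zone."""
    out = df.copy()
    for col in PII_COLUMNS:
        if col in out.columns:
            out[col] = out[col].map(mask)
    return out

# Example: the email column is masked, other columns pass through untouched.
safe = apply_privacy_policy(pd.DataFrame({"email": ["a@b.com"], "amount": [42]}))
print(safe)
```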

Looking ahead to 2021, here's a call to action: let's start the year off right by accounting for the importance of Data Quality, understanding how it affects your odds of analytics and AI success, and taking the necessary steps to ensure it is achieved throughout your organization. And if your goal is to leverage fully functional AI for your business, these Data Quality efforts will start you off on a much firmer foundation.

As a side note, this article provides a broader look, with tips on how to improve your odds of successful AI adoption.
