The most successful companies of this century are those that make the most of their knowledge of their customers and the market

Data, data, and more data. In recent years it has become essential to capture and store data, not only for the day-to-day running of a company, but also because of the pressing need to optimize processes and marketing campaigns. The growing need to store data only confirms the popular Spanish saying: information is power.

With omnichannel, more and more points of contact with customers are created, and each one generates data along the way. At the same time, companies use a growing number of applications and platforms, each of which typically maintains its own database. Data and databases therefore multiply, and this brings problems of its own, because data needs to be cleansed and linked. The “Juan González” who holds a loyalty card must be recognized as the same “Juan González” who buys from us through e-commerce; if that link does not exist, I will not be able to truly know or value the client. As we move toward automating processes and personalizing offers and content, the challenge of processing more data, faster, in some cases in real time, will demand standardized, clean, and linked data.
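To make the “Juan González” problem concrete, here is a minimal record-matching sketch in Python. The field names, example records, and the choice of normalized name plus e-mail as the matching key are illustrative assumptions, not a prescribed method; real matching usually involves more fields and fuzzy comparison.

```python
import unicodedata

def normalize(text: str) -> str:
    """Lowercase, strip accents, and collapse whitespace so that
    'Juan González' and ' juan gonzalez ' compare equal."""
    text = unicodedata.normalize("NFKD", text)
    text = "".join(c for c in text if not unicodedata.combining(c))
    return " ".join(text.lower().split())

def match_key(record: dict) -> tuple:
    # Hypothetical matching key: normalized name + normalized e-mail.
    return (normalize(record["name"]), normalize(record["email"]))

loyalty = {"name": "Juan González", "email": "JUAN@example.com"}
ecommerce = {"name": " juan gonzalez ", "email": "juan@example.com"}

print(match_key(loyalty) == match_key(ecommerce))  # True: same client
```

Without a normalization step like this, the two records would remain two different “clients” in two different systems.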

Poor data quality leads to poor customer experience

The multiplication of unconnected data sources without a Master Data Management program is increasingly common. A study[1] of more than 500 companies on their perception of data quality revealed that, on average, organizations manage 19 contact databases at a time.

Meanwhile, the customer data management platforms used by marketing departments, such as CRMs, are not capable of processing all of these sources, which in most cases are incomplete or of very low quality. The result is that different departments within the same company hold fragmented, or entirely different, knowledge of the same client, and without knowing our users, our decisions run the risk of being wrong.

85% of organizations[2] report that poor-quality customer contact data negatively impacts their operational processes and efficiency, and in turn hampers their ability to be flexible and agile. Many companies are exceeding the volume of data they can realistically manage, proving that more data is not always better. Without the necessary tools, resources, and strategy, too much data can impede a company’s ability to innovate and successfully deliver a unified customer experience.


Achieve a Single View of the Client

Imagine being able to see a client’s profile with everything you need to know about that person at your fingertips: all the fragments of information about a person fitted together into a single piece, giving you 360º knowledge.

This is what the Single Customer View is all about: the process of collecting data from different sources, then processing and comparing it until a single, accurate record is obtained for each client.

More and more companies are realizing the importance of unifying all of a client’s information in a single record, and in many cases linking every piece of data to that client to create a reliable view of them. Over the past three years, organizations have been investing in the people, technology, and processes needed to improve their data maturity.

The results speak for themselves. 91% of companies say that improving the quality of their data has had a positive impact on the customer experience. And those who are most proactive about the quality of their data are more than twice as likely to exceed their goals (44% versus 19%).

In the end, the goal should be a “Golden Record” for each client, unifying their most relevant information and built from the key fields of our various data masters. This “Golden Record” must contain all the information necessary to provide that Single View of the Client.
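One common way to build a Golden Record is survivorship by source priority: for each field, keep the value from the most trusted source that actually has one. The sketch below assumes invented source labels (“crm”, “loyalty”) and fields; it is an illustration of the idea, not a production merge.

```python
def build_golden_record(records: list[dict], priority: list[str]) -> dict:
    """Merge per-source records into one golden record, taking each
    field from the highest-priority source that has a non-empty value.
    Source labels and field names here are hypothetical."""
    ordered = sorted(records, key=lambda r: priority.index(r["source"]))
    golden = {}
    for rec in ordered:
        for field, value in rec.items():
            if field != "source" and value and field not in golden:
                golden[field] = value
    return golden

crm = {"source": "crm", "name": "Juan González",
       "phone": "", "email": "juan@example.com"}
loyalty = {"source": "loyalty", "name": "J. González",
           "phone": "+34 600 000 000", "email": ""}

golden = build_golden_record([crm, loyalty], priority=["crm", "loyalty"])
print(golden)
# {'name': 'Juan González', 'email': 'juan@example.com', 'phone': '+34 600 000 000'}
```

Note how the name comes from the higher-priority CRM, while the phone number, missing there, survives from the loyalty system: the Golden Record is more complete than any single source.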

Example of a Golden Record

Where to start

For all these reasons, knowledge of our clients rests on good data quality. You will need to identify all the main data sources, then format, normalize, and deduplicate records and databases, which in many cases will require professional solutions such as the MyDataQ modular data quality platform.

An evolution of the classic data quality processes is Data Enhancement: the cleansing and expansion of existing data. To achieve this, purification techniques are applied and missing information is completed, which in most cases requires the use of external data sources.

Specifically, Data Enhancement is a set of techniques and methods that can be summarized as follows:

  • Deletion of duplicate or incorrect records from the database.
  • Checking and correction of the information already held in any of the company’s databases.
  • Consultation of external resources, both private and public, to identify what information is missing and include it where necessary.
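The three steps above can be sketched as a small pipeline. The postal-code lookup table standing in for an “external source”, and the choice of e-mail as the deduplication key, are assumptions made for illustration only.

```python
# Hypothetical external reference source: postal code -> city.
postal_lookup = {"28001": "Madrid", "08001": "Barcelona"}

def enhance(records: list[dict]) -> list[dict]:
    seen, result = set(), []
    for rec in records:
        key = rec["email"].strip().lower()
        if not key or key in seen:          # 1. delete duplicate/incorrect records
            continue
        seen.add(key)
        rec = {**rec, "email": key}         # 2. correct existing information
        if not rec.get("city") and rec.get("zip") in postal_lookup:
            rec["city"] = postal_lookup[rec["zip"]]  # 3. fill gaps from external source
        result.append(rec)
    return result

raw = [
    {"email": "Ana@Example.com", "zip": "28001", "city": ""},
    {"email": "ana@example.com", "zip": "28001", "city": "Madrid"},  # duplicate
]
print(enhance(raw))  # one record, e-mail normalized and city filled in
```

Even in this toy form, the order matters: deduplicating first avoids correcting and enriching the same client twice.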

Data Enhancement often runs into a similar term: Data Enrichment. Both refer to improving data by consulting external sources, and whether you invest in Data Enhancement or Data Enrichment, the result will be similar: more consistent and reliable databases.

Both terms are also closely related to data cleansing. Why? Basically, because cleansing usually reveals the deficiencies in a database. Data Enhancement and Data Enrichment are then responsible for adding the missing information, as well as providing additional information of great value.

Virginia Pérez

Marketing at Deyde DataCentric

[1] EDQ

[2] EDQ