AI: What is the Reality? Interview with Mick Lévy

Numbers are promising, projects are flourishing, and imagination is seemingly without limit. Artificial intelligence has been on the agenda of most major organizations for a few years, and the buzzword has now reached the operational stage. But what is the reality today for those in the AI trenches? Which enterprises should be addressing these initiatives? How should enterprises prepare and ensure the success of AI projects? In this interview, Mick Lévy, Director of Business Innovation for the Business & Decision Group, answers the questions raised by this trend in the data world.


What is the mission of Business & Decision? 

My role is to help our clients accelerate their innovation projects around the topics of AI, data governance, big data, and data visualization, among others, in order to generate ROI.

Business & Decision has 2,500 employees in 11 countries. All our consultants are experts in the field of data - this is our specialty. We deliver upstream consulting and integration services for clients seeking to derive value from their enterprise data assets.

We talk a lot about AI, but what is today's reality of these initiatives?

All enterprises have a strong interest in topics that can bring them value. SMEs included!

The subject of AI is indeed high-profile. It is a huge buzzword at the moment, but real-world adoption varies greatly.

Some pioneers launched AI initiatives in recent years and are today greatly benefiting from their outcomes; they are at the stage of multiplying projects and applications. Other enterprises are still releasing their first projects into production. Many remain at the Proof of Concept stage, either because an attempt was unsuccessful or because they chose not to scale up. Finally, there is a group of enterprises that have not yet embarked on this journey.

How can you explain this trend?

AI is transforming society and businesses in a very profound way. If AI is so attractive at the moment, it is because of the value it creates and its promise of transformation. Many of these initiatives carry a phenomenal ROI; we have seen this with all the enterprises we have worked with.

What are the keys to success in an AI project?

First, enterprises must identify the right use case - what the AI should be applied to. Initially, people tend to think that AI initiatives should run primarily on tactical use cases, separate from and in parallel with the strategic activities of the enterprise. In fact, the idea is rather to bring value to the most central and strategic use cases first.

Enterprises also think they need a super data scientist. Data scientists obviously have a key role, but at its core, AI is based on algorithms and data. The most important part is the raw material feeding the algorithms: the data.

The key success factor is mastering the enterprise's critical data assets: knowing them, knowing their quality, and knowing their ability to serve as the foundation of the artificial intelligence initiative.

What are the challenges faced by the teams leading these AI projects in your partner companies?

If we look at recent projects, the key topic is, again, data. The enterprise needs to master its data ecosystem globally: mapping the available legacy data and extending that footprint with external data assets from outside the enterprise.

The second pitfall that we encounter most often, and in general the first source of failure, is the quality of the data. Data science requires a certain amount of data, but above all it requires quality data: to empower good decision-making, to automate processes properly, and more. Tomorrow, when AI takes a much more prominent place in everyday life, poor-quality data could cause real-world disasters.
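
To make this concrete, here is a minimal sketch in Python of the kind of basic upstream quality checks a team might run before feeding data to an algorithm. The record structure, field names, and rules are hypothetical, not a prescription.

```python
# A minimal sketch of upstream data-quality checks for an AI project.
# The fields and rules below are hypothetical examples.

from dataclasses import dataclass

@dataclass
class CustomerRecord:
    customer_id: str
    email: str
    country: str

def quality_report(records: list[CustomerRecord]) -> dict[str, float]:
    """Return the share of records passing each basic quality rule."""
    total = len(records) or 1
    complete = sum(1 for r in records if r.customer_id and r.email and r.country)
    valid_email = sum(1 for r in records if "@" in r.email)
    unique_ids = len({r.customer_id for r in records})
    return {
        "completeness": complete / total,
        "email_validity": valid_email / total,
        "id_uniqueness": unique_ids / total,
    }

records = [
    CustomerRecord("C1", "ada@example.com", "FR"),
    CustomerRecord("C2", "bad-address", "FR"),
    CustomerRecord("C2", "bob@example.com", ""),  # duplicate id, missing country
]
print(quality_report(records))
# each metric comes out around 0.67 for this sample
```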

You mention difficulties with upstream data quality. What main impacts do you observe?

The first impact is lost time across the entire enterprise - for the analysts and managers who need trusted, cleansed, and reliable data. It also translates into wasted time for AI projects, which must constantly compensate for poor data quality.

Even more important is the distortion of decision-making processes. Imagine that logistics data is wrong. If an enterprise dispatches a truck along a route chosen for stock optimization, the analysis behind that route selection is likely to be wrong as well, leading to lost time and money.
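
A toy illustration, with made-up numbers, of how a single bad data point can silently flip exactly this kind of routing decision:

```python
# Toy illustration with made-up numbers: one misrecorded value flips the decision.
true_km = {"route_A": 120.0, "route_B": 95.0}       # actual distances
recorded_km = {"route_A": 120.0, "route_B": 950.0}  # misplaced decimal on route B

best_true = min(true_km, key=true_km.get)              # 'route_B' - the right call
best_recorded = min(recorded_km, key=recorded_km.get)  # 'route_A' - the wrong one
print(best_true, best_recorded)
```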

Analysts are now introducing the new concept of the Data Hub. What are its key features, and how does it fit upstream of an AI project?

The concept of the data hub is gaining real traction in enterprise architecture models because we finally have tools that work and allow us to place data at the heart of the information system. This not only lets us collect and centralize the data, but above all govern, distribute, and expose it, making it available to all the functions of the enterprise.

For example, if we look at the customer data domain, the information can come from 20, 30, or even 50 different sources. The data hub centralizes this information, makes it consistent, de-duplicates it, and then shares it with all the other applications. The cleansed and governed data can then be accessed in real time across all channels and serve as the basis for most customer interactions.
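
As a rough sketch of that consolidation step, the snippet below merges customer records from two hypothetical sources, normalizes the matching key, and keeps the most recently updated row as the golden record. Real hubs apply far richer matching and survivorship rules; everything here is illustrative.

```python
# A simplified sketch of multi-source consolidation in a data hub:
# centralize, normalize, de-duplicate, and keep one golden record.
# Source names, fields, and the survivorship rule are illustrative only.

from datetime import date

def consolidate(sources: list[list[dict]]) -> dict[str, dict]:
    """Merge records from many sources into one golden record per customer,
    keyed on a normalized email, keeping the most recently updated row."""
    golden: dict[str, dict] = {}
    for source in sources:
        for record in source:
            key = record["email"].strip().lower()   # normalization
            current = golden.get(key)
            if current is None or record["updated"] > current["updated"]:
                golden[key] = record                # simple survivorship rule
    return golden

crm = [{"email": "Ada@Example.com", "name": "Ada L.", "updated": date(2020, 1, 5)}]
billing = [{"email": "ada@example.com", "name": "Ada Lovelace", "updated": date(2020, 3, 1)}]
print(consolidate([crm, billing]))
# the fresher billing record wins: {'ada@example.com': {... 'name': 'Ada Lovelace' ...}}
```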

This reasoning can be applied to all the other major critical data domains at the heart of the enterprise. Such data must be trusted, horizontally shared, and seamlessly integrated.

In summary, the features to expect from a Data Hub are:

  • Collecting data;
  • Interfacing and integrating with multiple sources;
  • Centralizing and storing data;
  • Enhancing the quality of data in a centralized and governed application with deduplication workflows and consistency validations;
  • Exposing this data to all other applications in the data landscape;
  • Managing large volumes of information in real time.

Another feature that is becoming more and more important is cataloging the data: knowing its lineage, documenting it, and making it as simple as possible to consume.
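
A minimal sketch of the metadata a catalog might keep per dataset so that lineage can be traced upstream. The structure and fields are assumptions for illustration, not any specific product's model.

```python
# A sketch of per-dataset catalog metadata with traceable lineage.
# Entry structure and example datasets are assumptions, not a product model.

from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    name: str
    description: str
    owner: str
    upstream: list[str] = field(default_factory=list)  # lineage: source datasets

def lineage(catalog: dict[str, CatalogEntry], name: str) -> list[str]:
    """Walk the upstream links to list everything a dataset derives from."""
    seen: list[str] = []
    stack = list(catalog[name].upstream)
    while stack:
        parent = stack.pop()
        if parent not in seen:
            seen.append(parent)
            stack.extend(catalog.get(parent, CatalogEntry(parent, "", "")).upstream)
    return seen

catalog = {
    "crm_raw": CatalogEntry("crm_raw", "Raw CRM export", "sales"),
    "customers_golden": CatalogEntry("customers_golden", "Deduplicated customers",
                                     "data-office", upstream=["crm_raw"]),
    "churn_features": CatalogEntry("churn_features", "Model inputs",
                                   "data-science", upstream=["customers_golden"]),
}
print(lineage(catalog, "churn_features"))  # ['customers_golden', 'crm_raw']
```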

Finally, there is one last feature that is becoming essential: the ability to use AI to automate some of the processes implemented in the Data Hub itself - for example, the reconciliation and discovery of data, which still involves a lot of tedious manual work. Modern technologies will integrate AI at their core to greatly improve the efficiency of enterprise users.
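
As a toy example of such automated reconciliation, the sketch below uses a simple string-similarity score to flag likely duplicate records that would otherwise be matched by hand. Production systems use much richer matching models; the threshold here is an arbitrary assumption.

```python
# A toy illustration of automated record reconciliation: a string-similarity
# score flags likely duplicates for review. The 0.85 threshold is arbitrary.

from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def candidate_matches(records: list[str], threshold: float = 0.85) -> list[tuple[str, str, float]]:
    """Return pairs of records similar enough to review as duplicates."""
    pairs = []
    for i, a in enumerate(records):
        for b in records[i + 1:]:
            score = similarity(a, b)
            if score >= threshold:
                pairs.append((a, b, round(score, 2)))
    return pairs

names = ["Business & Decision", "Business and Decision", "Semarchy", "Semarchy SAS"]
print(candidate_matches(names))
# [('Business & Decision', 'Business and Decision', 0.9)]
```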

Among the Data Hub solutions on the market, Business & Decision works a lot with Semarchy. For several years, we have delivered large-scale projects together for our clients. We particularly appreciate the Intelligent Data Hub by Semarchy as it enables agility and flexibility during implementation cycles. It also allows rapid, iterative deployments that quickly deliver value to our clients.