How to concretely get added value from AI?

Artificial Intelligence, Machine Learning and Deep Learning: what is the difference?

Was it Machine Learning (ML), Deep Learning (DL) or Artificial Intelligence (AI) after all? Anyone who looks beyond the buzzword bingo and tries to interpret the terms correctly quickly loses track.

What exactly do the different terms mean and how do they differ in practice? An overview.

AI, ML and DL demystified

In a nutshell, the situation looks like this:

  • AI is the overarching science that deals with the creation of machines that exhibit some form of intelligence.
  • Machine Learning is part of AI that focuses on techniques with which computers can learn on the basis of data and patterns.
  • Deep Learning is in turn the collective name for a group of techniques with which models adjust themselves independently on the basis of data.

Artificial intelligence

Artificial Intelligence is all about building machines that exhibit some form of intelligence. As a concept, AI is inextricably linked to the history of the first computers. And for that, we have to go back a long way in time.

Remains of the first mechanical computer

More than two thousand years ago, the Ancient Greeks designed a complex analogue machine to perform intricate astronomical calculations, although their invention was still very far removed from today's views on artificial intelligence.

The British mathematician and code cracker Alan Turing is generally regarded as the father of the modern computer. His pioneering work in the 1940s and 1950s not only heralded the era of lightning-fast calculators. Turing also laid the philosophical and practical basis for abstract concepts such as the self-awareness of intelligent computers. That groundbreaking thinking eventually led to the Turing test, which still constitutes an important, if somewhat outdated, measure of whether or not a machine can be considered intelligent and self-thinking.

Alan Turing

Today, AI includes a wide range of different concepts, including Machine Learning and Deep Learning. A lot of work is also being done on more general forms of artificial intelligence, which means the field may in time be expanded with, for example, human-like AI.

Machine Learning

Machine Learning is a part of AI that focuses on methods by which computers can learn based on data and patterns. In practice, this is done using 'data mining', a technique for extracting relevant information from databases. A Machine Learning algorithm does not need a structured database to do this - like an Excel file with neatly ordered data - but is smart enough to extract relevant data points from unstructured data. Many companies are already applying Machine Learning today. Consider Amazon, which automatically recommends products to its users based on their previous purchases. Another example is Netflix, which suggests series and movies to its subscribers based on previous viewing behaviour.
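The recommendation mechanism mentioned above can be sketched in a few lines of code. The data and product names below are invented for illustration, and this minimal co-occurrence approach is only a sketch of the idea, not how Amazon or Netflix actually implement their systems:

```python
from collections import Counter
from itertools import combinations

# Toy purchase histories (hypothetical data, for illustration only).
purchases = [
    {"laptop", "mouse", "laptop bag"},
    {"laptop", "mouse"},
    {"laptop", "laptop bag"},
    {"phone", "phone case"},
]

# Count how often each pair of products is bought together.
pair_counts = Counter()
for basket in purchases:
    for a, b in combinations(sorted(basket), 2):
        pair_counts[(a, b)] += 1
        pair_counts[(b, a)] += 1

def recommend(product, top_n=2):
    """Recommend products most often bought together with `product`."""
    scored = [(other, n) for (p, other), n in pair_counts.items() if p == product]
    return [other for other, _ in sorted(scored, key=lambda x: -x[1])[:top_n]]

print(sorted(recommend("laptop")))  # ['laptop bag', 'mouse']
```

Even this toy example shows the pattern-learning idea: the algorithm is never told which products belong together, it derives that from the data alone.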

Deep Learning

Deep Learning is essentially an advanced form of Machine Learning with one important distinguishing feature: independent adjustment. A Deep Learning model can adjust itself on the basis of external signals – that is, data – whereas a Machine Learning model can only be adjusted manually, for example in the underlying code of the algorithm.

Well-known examples of Deep Learning are found today in self-driving cars and in our own Trendskout platform. Neither of them requires explicit user feedback to adapt successfully. Deep Learning algorithms are fully focused on the requested end result and adjust themselves accordingly.

Certainly do not confuse Deep Learning with neural networks. A neural network is a technique that can be used for Machine Learning and Deep Learning as well as in overarching AI. Neural networks mimic the workings of the human brain in order to classify information on the basis of examples. They are known to the general public as, for example, a way to quickly categorise images based on a limited set of known pictures.
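As a toy illustration of learning from examples, the sketch below trains a single artificial neuron (a perceptron, the simplest building block of a neural network) on a handful of made-up points. Real image classification uses networks that are vastly larger, but the principle of adjusting weights based on examples is the same:

```python
# A single artificial neuron (perceptron) learning a toy classification:
# hypothetical data, label 1 if the point lies in the upper-right region.
def train(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x, y), label in samples:
            pred = 1 if w[0] * x + w[1] * y + b > 0 else 0
            err = label - pred
            # Adjust the weights a little in the direction of the error.
            w[0] += lr * err * x
            w[1] += lr * err * y
            b += lr * err
    return w, b

samples = [((0.1, 0.2), 0), ((0.9, 0.8), 1), ((0.3, 0.1), 0), ((0.7, 0.9), 1)]
w, b = train(samples)
classify = lambda x, y: 1 if w[0] * x + w[1] * y + b > 0 else 0
print(classify(0.95, 0.9), classify(0.05, 0.1))  # 1 0
```

Note that nobody wrote an explicit rule for the boundary; the neuron found its weights purely from the labelled examples.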

Conclusion

Appropriate type of AI for each project

We admit: the sometimes confusing terminology and the constantly changing AI landscape make it difficult to see the forest for the trees. Using the right technology in an organization is specialist work. That is exactly where the Trendskout platform proves its value. The platform automatically chooses the appropriate AI algorithm for each business case based on the relevant parameters.

And what about Business Intelligence?

Both Artificial Intelligence (AI) and Business Intelligence (BI) are still too often misused or even confused with each other. So where exactly are the differences between AI and BI? And why is it that smart companies use both together to make better decisions and strengthen their competitive position?

Reporting vs prediction

The terms BI and AI are often used interchangeably in a business context to describe tools that derive data-driven insight for decision-making purposes. While this definition is generally true for both technologies, it quickly becomes apparent that AI and BI are quite different indeed – both in theory and in practice. Summed up in one sentence: BI delivers comprehensible insight into past performance, while AI also predicts future trends and the most efficient actions.

In organisations, AI is aimed at analysing and interpreting large amounts of data and then acting on it. AI itself makes connections, makes predictions and can also suggest follow-up actions. In a business context, this delivers concrete advantages. For example, sales teams manage to follow up on their leads in a more targeted way and process operators can better estimate and manage the downtime and maintenance of their machinery.

BI makes the past accessible, while AI predicts the future.

BI explained

BI or business intelligence is technology used to collect data and display it in an understandable way. BI does not interpret data itself, but merely provides a comprehensible display of it. Interpreting the data and finding connections and possible follow-up actions is the responsibility of the person reading the reports. For example, BI can generate a slick report on sales leads just as well as AI, but the former does not then provide a prediction on which leads are best contacted first for maximum chance of making a sale.
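The contrast can be made concrete with a small sketch: the "BI" part merely summarises past sales figures, while the "AI" part (here simplified to a straight-line trend, far simpler than a real predictive model) forecasts the next period. All figures are invented for illustration:

```python
# Hypothetical monthly sales figures.
sales = [100, 110, 125, 130, 145, 150]

# BI-style: a comprehensible report on past performance.
report = {"total": sum(sales), "average": sum(sales) / len(sales), "best_month": max(sales)}
print(report)

# AI-style (simplified to a linear trend here): predict the next month
# with a least-squares fit over the historical points.
n = len(sales)
xs = range(n)
x_mean = sum(xs) / n
y_mean = sum(sales) / n
slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, sales)) / sum((x - x_mean) ** 2 for x in xs)
intercept = y_mean - slope * x_mean
forecast = slope * n + intercept
print(round(forecast, 1))  # estimate for month 7
```

The report only describes what happened; the forecast makes a claim about what will happen, which is exactly the dividing line between BI and AI.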

Overview

BI

Basic idea: Collecting and presenting data in a simple, readable way.
Focus: Answering questions about past performance.
Under the bonnet: Traditional statistical approaches and large amounts of spreadsheet-based data.
Concrete benefits: Data visualization and comprehensive overviews of historical data.
Key terms: Reporting, data warehousing, metrics, dashboards.

AI

Basic idea: Mimicking human intelligence and behaviour to support organizations with data-driven decision-making.
Focus: Making predictions about the future based on data from the past.
Under the bonnet: Advanced Machine Learning and Deep Learning algorithms.
Concrete benefits: Predictions about customers, competitiveness, and market changes, in addition to intelligence in machines.
Key terms: Predictive analysis, forecasting, natural language processing (NLP), image recognition.

How BI and AI are better together

Judged on their individual merits, both AI and BI provide plenty of business value. Nevertheless, both technologies can also be deployed side-by-side for even better results. Combined BI and AI are the perfect recipe for delivering analytical solutions in any business context. First, BI gets to work by analyzing historical data. Next, AI predicts future actions based on the available information. The best of both worlds.

In practical terms, BI provides comprehensive reports, while AI makes predictions and recommends actions

BI supported by AI is also called 'AI-enabled BI'. Such a combination can dig deep into complex problems and expose crucial insights in data that was previously inaccessible or unexplored. When used together, the two technologies can automatically review previous data and provide alerts on new and interesting events or insights. AI-powered BI platforms can also free up a lot of time for analysts and enable them to move into more effective data analysis projects. By combining BI with the best capabilities of AI, businesses have the opportunity to analyse data even more efficiently, gain actionable insights and anticipate the future. And that is exactly where organisations get added value.

Some example applications

The number of applications of AI is practically endless. To make everything somewhat manageable, for this e-book we have selected some common applications within three domains that are important for every organisation: Sales & Marketing, Production & Operations and Customer service.

Sales & Marketing

Artificial intelligence is having a major impact on just about every business department. Sales is no exception. More and more organisations are relying on AI to organise, streamline and make their sales teams more efficient. The need for AI software in sales is simple: salespeople sit on mountains of valuable data, but don't know where to look first to get started with it. At the same time, many sales people are drowning in repetitive work and lead qualification, which often contribute little to what really matters: closing new deals. So there are a lot of opportunities for improvement up for grabs.

  • Easily capture data
    Data on sales processes are relatively easy to record today. Contact moments or touchpoints with potential and existing customers are already automatically recorded in many companies. Quotations, order forms, pre-sales processes and invoicing are also part of the average sales cycle. Most modern ERP and CRM packages are also perfectly capable of storing this data indefinitely and generating neat historical charts from it. However, that is usually where it stops. Beautifully constructed sales charts from a CRM system may be valuable for measuring and comparing achieved performance, but the age-old stock market credo applies here too: past results do not guarantee future results. The analytical and predictive power of a CRM is therefore sorely lacking. And that's exactly where Trendskout's AI comes in.
  • Helping sales teams
    Even for the most experienced sales people, it is almost impossible to extract all relevant information from the endless sales-related data streams and then interpret it correctly. That is what Trendskout is used for. Its advanced data analyses uncover underlying sales dynamics and deliver concrete predictions about potential and existing customers. This allows salespeople to serve their prospects and customers in a more focused and better way, in less time. Opportunities hidden below the waterline suddenly surface with AI. Artificial intelligence notifies teams of new sales opportunities that would otherwise go unnoticed and provides data-based advice that supports sales people in their busy roles.

6 months after rollout, the total relative revenue increase was 8.5%

  • Key benefits of AI in sales
    There are a lot of ways companies are relying on sales AI to capture more opportunities for their sales team. This is happening for both multinationals and SMEs in different areas. In each of these domains, AI picks up where classic ERPs and analytics tools leave off. An overview:

Sales forecasting and sales prediction

AI for sales forecasting gets to work with thousands of data points from all possible data sources. The artificial intelligence links data from different databases and looks for insights and hidden patterns that are impossible to detect manually. This enables accurate forecasting of future figures and helps companies know better who will buy what, and when.

AI-based forecasting thus goes beyond typical forecasting via spreadsheets or reporting systems and introduces the power of AI into business forecasting. This results in much more accurate forecasts and can also predict erratic patterns. This is not only useful for sales teams, but also for their customers. In some companies, the intelligent forecasting model even goes so far as to detect the needs of end customers even before they realise it themselves.
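One classic step beyond plain spreadsheet totals is exponential smoothing, which weighs recent observations more heavily and therefore copes better with erratic demand. The sketch below applies it to invented weekly order figures; it is one basic technique among many, not the platform's actual forecasting model:

```python
# Hypothetical weekly order counts with an erratic pattern.
orders = [40, 42, 55, 38, 41, 60, 43]

def smooth_forecast(series, alpha=0.5):
    """Simple exponential smoothing: return the forecast for the next period.

    alpha controls how strongly recent values dominate older ones.
    """
    level = series[0]
    for value in series[1:]:
        level = alpha * value + (1 - alpha) * level
    return level

print(round(smooth_forecast(orders), 1))  # 47.0
```

With alpha = 0.5 the latest weeks dominate the estimate, so a sudden shift in demand shows up in the forecast within a few periods instead of being averaged away.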

Detecting sales opportunities

Why just predict sales when you can also influence them? That is exactly what opportunity detection does. Based on the data available in the company CRM and other tools, a sales opportunity detection algorithm goes to work and makes hidden opportunities visible. Often with spectacular results, as the case of Coeman Packaging also shows. AI informs sales teams about new opportunities that might otherwise be lost and ensures that companies can get more out of their leads and existing customers.

detection of sales opportunities

Estimating customer churn

A smart AI tool is connected to all underlying business software via plugins. It keeps an eye on all contact moments or touchpoints with potential and current customers in the background. Potential causes for increased customer dropout or customer churn are automatically identified and passed on to the responsible sales employee for further follow-up. That gives companies time to communicate in time and increase their customer retention. In other words: AI makes reactive sales proactive again.
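The monitoring idea can be sketched as follows. This is a deliberately simplified stand-in that only flags customers whose last touchpoint is older than a threshold; customer names and dates are hypothetical, and a real churn model weighs many more signals than recency alone:

```python
from datetime import date

# Hypothetical last touchpoints per customer.
last_contact = {
    "Acme NV": date(2024, 1, 5),
    "Globex BV": date(2024, 5, 20),
    "Initech": date(2023, 11, 2),
}

def churn_alerts(touchpoints, today, max_silence_days=90):
    """Flag customers who have been silent longer than the threshold."""
    return sorted(
        name for name, last in touchpoints.items()
        if (today - last).days > max_silence_days
    )

print(churn_alerts(last_contact, date(2024, 6, 1)))  # ['Acme NV', 'Initech']
```

Even this crude rule already turns a passive CRM record into a proactive alert list the sales team can act on.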

Impact analysis

Smart AI algorithms can uncover sales drivers. For example, the sales AI of Trendskout Sales Booster performs advanced impact analysis. It reveals the decisive factors that cause customers to purchase or reorder. Impact analysis thus provides answers to many why questions, for deeper business and sales insight.
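A very basic form of impact analysis is measuring how strongly each candidate factor correlates with the outcome. The sketch below does this with Pearson correlation on invented data; actual impact analysis goes well beyond simple correlation, but the direction of the insight is similar:

```python
# Hypothetical customer records: two candidate factors and the outcome.
discount = [0, 5, 10, 0, 15, 5]       # discount given (%)
response_h = [48, 24, 2, 72, 4, 36]   # hours to answer their last question
reordered = [0, 1, 1, 0, 1, 0]        # 1 = placed a repeat order

def correlation(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

print(round(correlation(discount, reordered), 2))   # strongly positive
print(round(correlation(response_h, reordered), 2)) # strongly negative
```

In this made-up data, discounts correlate positively and slow response times negatively with reordering - exactly the kind of "why" answer impact analysis is after, albeit derived here in the simplest possible way.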

smart AI algorithms

Next best actions

AI software can recommend the next steps or next best actions in a sales cycle. The algorithm does this based on historical data in your CRM system or other databases. Those data-based recommendations serve as a guide and increase the success rate of a phone call or e-mail from your sales people to existing or potential customers. That way, your team doesn't fly blind, but can make contact and dose its sales efforts in a targeted way.

Product recommendation

An AI tool like the Trendskout Sales Booster interprets previous sales and automatically recommends additional products or services that suit an existing customer. A company can have those suggestions automatically presented to the end customer or passed on to the relevant account or sales manager, depending on the business model of the company in question. In this way, organisations can maximise their upsell and cross-sell. Under the bonnet, product recommendations are done via AI with the help of a so-called recommendation engine. That is a clever clustering and classification algorithm that, again, connects unexplored data points to arrive at personalised sales suggestions.
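A minimal sketch of the recommendation-engine idea: compute cosine similarity between hypothetical purchase histories and suggest a product that the most similar customer already buys. This is an illustrative simplification, not the Sales Booster's actual algorithm:

```python
# Hypothetical purchase matrix: rows are customers, columns are products.
products = ["boxes", "tape", "foil", "labels"]
history = {
    "customer_a": [1, 1, 0, 1],
    "customer_b": [1, 1, 1, 0],
    "customer_c": [0, 0, 1, 1],
}

def cosine(u, v):
    """Cosine similarity between two purchase vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = (sum(a * a for a in u) ** 0.5) * (sum(b * b for b in v) ** 0.5)
    return dot / norm if norm else 0.0

def recommend(target):
    """Suggest products bought by the most similar other customer."""
    me = history[target]
    best = max((c for c in history if c != target), key=lambda c: cosine(me, history[c]))
    return [p for p, mine, theirs in zip(products, me, history[best]) if not mine and theirs]

print(recommend("customer_a"))  # ['foil']
```

The same similarity idea scales to thousands of customers and products; production engines mainly add smarter weighting, clustering and cold-start handling on top of it.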

Data-based personas

AI classifies and segments customer profiles into sales and marketing personas. It does this on the basis of objective sales and other data – and no longer on the basis of subjective criteria. Data-based profiles are always more accurate and can be deployed at the interface between sales and marketing, which in turn ensures targeted communication.
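Segmenting customers into data-based personas can be illustrated with a minimal k-means clustering sketch on invented (average order value, orders per year) pairs. Production systems use richer features and more robust initialisation than this toy version:

```python
# Hypothetical customer features: (average order value, orders per year).
customers = [(120, 2), (110, 3), (900, 12), (950, 10), (130, 1), (880, 11)]

def kmeans(points, k=2, iterations=10):
    """Minimal k-means: returns a cluster index per point."""
    centers = list(points[:k])  # naive initialisation from the first k points
    labels = [0] * len(points)
    for _ in range(iterations):
        # Assign each point to its nearest center (squared distance).
        labels = [
            min(range(k), key=lambda c: sum((p - q) ** 2 for p, q in zip(pt, centers[c])))
            for pt in points
        ]
        # Move each center to the mean of its assigned points.
        for c in range(k):
            members = [pt for pt, l in zip(points, labels) if l == c]
            if members:
                centers[c] = tuple(sum(dim) / len(members) for dim in zip(*members))
    return labels

print(kmeans(customers))  # small occasional buyers vs. large frequent buyers
```

The algorithm separates the low-value occasional buyers from the high-value frequent buyers without ever being told those personas exist, which is the essence of data-based segmentation.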

Production & Operations

Production and process optimisation is not new. Already in the mid-20th century, led by East Asian industries, statistical modelling was applied to various variables in the production process. This was done using classical mathematical techniques and required a huge amount of error-prone manual work, which meant it could only be applied by the largest corporations. Today's information flows are so large that statistical analysis as it was performed then can no longer be used in a cost-effective way.

Recent developments in AI and Machine Learning make it possible to automatically analyse this information and deploy it to optimise day-to-day operations. Leading industry players are already applying these techniques in several pilot projects and the race to deploy these techniques at scale has begun. Below, we provide some practical examples.

Predictive maintenance

The essence of predictive maintenance is simple. Complex equipment and industrial machines require regular maintenance. This preferably happens just before the end of the life of the machine or the part to be replaced.

In order to guarantee operationality, many machine parts in large production halls are still replaced on a regular basis based on their estimated service life. Often, as a precaution, a much too long buffer time is used and it would be more efficient to be able to intervene predictively.

AI-based predictive maintenance optimizes the timing of maintenance and thus ensures maximum cost savings. We use a number of specific algorithms for this, depending on the type of predictive maintenance that is required in practice.

Classification vs. anomaly detection

Behind the scenes there are two technical solutions to perform predictive maintenance via artificial intelligence. The choice of a specific AI algorithm for data analysis and training depends on the nature of the machines to be monitored. When it comes to devices that often experience breakdowns or downtime, classification is a logical option. If, on the other hand, it concerns devices that only rarely show defects, anomaly detection is usually the better choice.

Detect abnormal events

The inevitable disadvantage of reliable machines is that the data the machine provides often contains few traces of a concrete failure or indicators that can predict downtime. As long as all components are running smoothly, monitoring systems will indicate few abnormal values for the parameters they are monitoring. In that case, it is also difficult for the AI to learn to estimate which suspicious indications or data anomalies could point to a possible failure of a machine. Fortunately, there are several algorithms specially designed for anomaly detection - auto-encoders, for example. An auto-encoder is a special type of neural network that learns to recognise what exactly can be considered "normal behaviour". Anything that deviates from that standard pattern is by definition an irregularity and cause for alarm. A potential danger of auto-encoders is that, because of the way they are constructed, they could also come to consider recurring anomalies as "normal" if they crop up too frequently. It may therefore be a good idea to remove such excess data from the training data that feeds the algorithm.
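The reconstruction-error principle behind auto-encoders can be illustrated without an actual neural network. In the sketch below, "normal behaviour" is simply the median sensor profile (a crude stand-in for what an auto-encoder would learn), and each reading is scored by its distance from that profile. The sensor data and threshold rule are invented for illustration:

```python
# Hypothetical sensor readings (temperature, vibration) from a reliable
# machine: mostly normal behaviour plus one deviating reading.
readings = [(70, 0.2), (71, 0.3), (69, 0.2), (70, 0.25), (95, 1.4), (70, 0.3)]

def median(values):
    s = sorted(values)
    mid = len(s) // 2
    return s[mid] if len(s) % 2 else (s[mid - 1] + s[mid]) / 2

# "Learn" the normal profile, then score each reading by its
# reconstruction error (distance from the profile).
profile = tuple(median(dim) for dim in zip(*readings))
errors = [sum((x - p) ** 2 for x, p in zip(r, profile)) ** 0.5 for r in readings]

threshold = 3 * median(errors)  # simple automatically suggested threshold
anomalies = [i for i, e in enumerate(errors) if e > threshold]
print(anomalies)  # [4] - only the deviating reading is flagged
```

A real auto-encoder replaces the median profile with a learned non-linear reconstruction, but the detection logic is the same: large reconstruction error means cause for alarm.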

detecting anomalous data

The graphs above indicate whether data has been found that deviates from the normal situation and, if so, how large the deviation is. This leads to a mean average error. It is not so much about the absolute values, but about their relationship to each other.

The input field at the bottom of the settings screen shows the threshold value that the algorithm must use for anomaly detection. The algorithm also automatically suggests a value for this.

threshold algorithm

In the figure above, the red line represents the threshold. The blue bars are the mean average errors of all recorded data. These are, by the way, shown logarithmically to give a better overview. So in reality, the anomalous data points are even much larger outliers than one would think at first glance.

Thus, any blue bar above the red line counts as a reportable anomaly for the algorithm. The data is automatically forwarded via an API to the company’s external alert systems or can be added to a text report for further action.

Anomaly detection and quality control

Detecting exceptions, outliers or anomalies is a crucial part of any quality assurance process. This can be about detecting a failure that is not noticeable to human analysis, and often involves the interplay of several data points. By the time one data point has changed remarkably enough to be noticeable to human analysis, a whole history of anomalous behaviour has often preceded it.

Detecting this behaviour in advance is what anomaly detection is all about. This technique is used in predictive maintenance but also separately, for example to:

  • Detect subtle fluctuations in energy consumption,
  • Spot abnormalities in the production process that affect production quality,
  • Drive incident management systems,
  • ...

As in many other cases, statistical techniques were already used in the 20th century to try to determine the number of deviations in a process. Besides requiring time-consuming manual work, this is also a very error-prone process. The assumption made each time is that the conditions in the sample are representative on a larger scale, which in reality often turns out not to be the case. After all, the production process itself is subject to many other processes in HR, supply chain and IT that are constantly changing.

Since the rise of AI and Deep Learning, the technology is powerful enough to discover all the nuances in this data.

Impact analysis of production parameters

Impact analysis and Deep Propensity Modelling answer questions like "Why are production targets for a certain product line not being met?", "Why does a certain type of machine need more maintenance?", "What motivates my employees?" or "What drives my ROI?". This type of analysis looks for the underlying reasons why something happens - or just doesn't happen.

For that impact analysis, techniques such as propensity modelling are applied, combined with the latest Deep Learning technology. That way, it becomes possible to discover all the connections and insights in your data and in the processes that drive your organisation. This is impossible for a human brain to do in a realistic timeframe.

How does this work technically?

Goal selection

The first, and crucial, step in this type of analysis is to define a goal, something important to you or your organisation such as ROI, conversion rate, downtime, etc. The AI needs this information to start evaluating purposefully in the next steps what drives these goals, in positive or negative ways. This can be done directly in a UI based on your data, and you don't need to provide separately annotated data.

Data expansion

Unlike traditional systems, Trendskout can evaluate multiple types of data simultaneously. This is not only a technical advantage, but also ensures that you can expand your original data with all kinds of other data sources that can be evaluated for relationships. The original data, in which you selected your target, is expanded with other data that you upload. This allows you to examine on a very broad scale what drives your goals, without missing any connection. One of the technological pillars of Trendskout is a distributed computing platform, with a high degree of parallelization. This technology is used to process, denormalize, clean up and transform the different data sources into other formats so that they can be processed by neural networks and other Deep Learning techniques in Trendskout.

distributed computing platform trendskout

Deep Propensity Modelling

Propensity Modelling is a technique that has been used by statisticians for several decades. The problem with these classical techniques was often that the discovered connections could not be properly described by static mathematical formulas. Due to new developments in the field of Deep Learning, these relationships can now be modelled in a much more powerful way. By way of illustration, you can compare modelling with purely mathematical formulas to trying to draw a face with only straight lines: the result will be angular and only a rough indication of that person's appearance. Deep Learning techniques can also draw smooth lines, and will therefore paint a better picture. This is what happens with Deep Propensity Modelling: the relationships in your data are better understood by neural networks.

During the Deep Propensity Modelling step, Trendskout applies various types of Deep Learning algorithms to your data, and it is continually evaluated whether the discovered connections and insights actually have an impact on your goal. Apart from defining your goal in the first step and the data expansion afterwards, no interaction is required. As with other AI and Deep Learning analyses in Trendskout, Auto ML & Solution Space Exploration - data processing, algorithm selection and parameter hypertuning - automatically searches for the most efficient model.

After the Deep Propensity Modeling phase, the underlying relationships are extracted from the winning model. These relationships and results of simulations provide insight into how your business goal is influenced, in a positive or negative way. This report is one of the output options in Trendskout. In addition to direct consultation in Trendskout, the information in this report can also be linked to the business intelligence solution of your organization.


Case study: Team Industries
Lead times are key elements in cost and pricing calculations. They depend on correlating factors that are sometimes difficult to detect. Team Industries turned to Trendskout to address this recurring issue. The AI is capable of predicting lead times of specific orders based on available production data. This information enables Team Industries to make accurate estimates and quotes that are both profitable and competitive.

Customer service

Underlying technology: Natural Language Processing (NLP)

NLP is a collective term for techniques that understand text or speech data - and respond with their own text or speech - in much the same way as humans. NLP combines computational linguistics - rule-based modelling of human language - with machine learning and deep learning models. Together, these technologies enable computers to process human language in the form of text or speech data and 'understand' its full meaning, complete with the intent and sentiment of the speaker or writer.

NLP applications are thus made possible by a combination of techniques such as:

  • Part-of-speech tagging, also known as grammatical tagging, determines the word type of a particular word or piece of text based on its usage and context.
  • Named Entity Recognition (NER) identifies words or phrases as useful entities. NER identifies 'Europe' as a location or 'William' as a male name.
  • Sentiment analysis tries to extract subjective properties - attitudes, emotions, sarcasm, confusion, suspicion - from text.
  • Text classification groups and categorises pieces of text. E.g. quotation requests are grouped by subject, or requests are assigned to the right person.
  • Next best action is not an NLP technique per se but is often used to suggest answers that can be presented in a particular context. E.g. customer service employees are automatically suggested an answer to a certain question in terms of a goal to be achieved, such as quick problem resolution.
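The text classification step above can be sketched as a toy routing function. It uses hand-picked keyword sets instead of a trained NLP model, and the category names and keywords are invented purely for illustration:

```python
# Hypothetical keyword profiles per category; a real system would use a
# trained NLP model rather than hand-picked keywords.
categories = {
    "delivery": {"delivery", "shipment", "arrive", "track"},
    "returns": {"return", "refund", "broken", "exchange"},
    "quotation": {"quote", "quotation", "price", "offer"},
}

def classify(message):
    """Assign the message to the category with the most keyword hits."""
    words = set(message.lower().replace("?", "").replace(".", "").split())
    scores = {cat: len(words & kws) for cat, kws in categories.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "other"

print(classify("When will my shipment arrive?"))      # delivery
print(classify("I want a refund for a broken item"))  # returns
```

A trained classifier replaces the keyword sets with learned word representations, which lets it handle phrasings no one anticipated, but the routing decision it produces looks just like this.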

Automatic response to customer service requests

At customer service desks in various sectors, numerous questions from customers or intermediaries arise in various forms. Just think of questions about delivery dates and times or questions about returns, a question about a quotation, a request for clarification about a certain file, etc. These questions must of course always be answered, which is often time-consuming and therefore weighs on the bottom line of every organisation. At the same time, the quality of this process is important, as a balance must be struck between answering speed and monitoring the organisation's objectives. By using an AI-driven customer service, all these questions can be recognised and answered automatically.

In doing so, the AI can take into account certain objectives and response strategies determined by the organisation:

  • The right customer service by customer type,
  • Monitoring the profitability of the customer service process, with minimising the number of contact moments as a goal,
  • Detecting upsell opportunities during the call,
  • ...

For questions that require a personalised answer, such as questions about delivery times of goods or questions about specific orders, links to ERP or other software packages can be provided, through which the customer service AI searches the database for the specific info. In this way, even questions that require a highly personalised response still receive neat automatic replies.

This removes much of the workload for the customer support department. Only the questions that cannot be answered by the customer service AI are forwarded to the human staff, allowing them to deal with the more complex issues and focus more on personal service that really makes a difference.

Analysing customer service flows

A request often consists of several questions and is often handled by several people. Each conversation contains important information for the organisation about how its product or service is received and used, and how effective the customer service process itself is.

Therefore, many organisations are using AI to analyse customer communications, via mail or phone, and unlock the underlying information to support their strategic decisions or streamline the customer service process. Some questions that can be answered here include:

  • What questions are most common and what trends can we detect?
  • Which answers are most appropriate for certain questions?
  • How does our product or service compare in the market?
  • Which questions generate the most follow-up contact moments?
  • Can we derive scripts from best practices?
