
Video transcription

DIGITAL INFORMATION
When we think about the digital economy, when we think about "digital" in general, when we
think about information technology, we are interested in everything that is made possible by
the use of digital information.

To grasp digital culture, let's start with this fundamental element that is digital
information... and at least once, let's open the hood and look at how it works inside the
machine - very concretely.

A computer works with binary information.

It works with zeros. With ones. Is there anything else? No, nothing else.

Why limit ourselves to zeros and ones?


First of all, because it's easy to represent with electricity.

● Either there is no electricity in the wire - and we consider that we have a Zero.
● Or there is electricity in the wire - and we consider that we have a One.

It's easy: a wire, electricity, and a switch are all you need. You don't need to manage intensity
levels precisely to represent more values... Yet with just ones and zeros, we can do a lot.

We can represent everything we need for logical reasoning. We can easily represent True and
False. Without going into detail, we can deal with

● The OR operator, to signify that one or the other of two propositions is true;
● The AND operator - when two propositions must be true, one and the other;
● And finally the NOT operator, to reverse the value of a proposition - making it false if it is
true and vice versa.

Now it turns out that with OR, AND, NOT, we can represent all possible logical relationships.
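
This completeness is easy to check in a few lines. Here is a minimal sketch (in Python, for illustration) building XOR - "one or the other, but not both" - out of nothing but OR, AND, and NOT:

```python
# The three basic operators, modeled as tiny functions.
def NOT(a): return not a
def AND(a, b): return a and b
def OR(a, b): return a or b

def XOR(a, b):
    # "a or b, but not (a and b)" - a new operator built only from the three above.
    return AND(OR(a, b), NOT(AND(a, b)))

# Print the full truth table of the derived operator.
for a in (False, True):
    for b in (False, True):
        print(a, b, XOR(a, b))
```

The same construction method extends to any truth table: combine AND/NOT terms for each true row, then OR them together.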

Beyond logic, binary allows us to count
Mathematically, we say that the numbers are represented in base 2. Each bit here
corresponds to a 0 or a 1. Depending on its position, it contributes differently to the value of
the number represented. This value is calculated by multiplying each digit by a power of 2 and
adding the results: the digit - 1 or 0 - in position 0 is multiplied by 2 to the power of 0, and the
digit in position n is multiplied by 2 to the power of n. In this example, you have 0 multiplied by
2 to the power of 0, plus 1 multiplied by 2 to the power of 1, plus 1 multiplied by 2 to the power
of 2, plus 0 multiplied by 2 to the power of 3, plus 0 multiplied by 2 to the power of 4, plus 1
multiplied by 2 to the power of 5, which gives us… 38.

Thus, with zeros and ones, we can represent all numbers.
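
The calculation above can be replayed in a few lines of Python (the bit values are those of the example):

```python
# Digits of the example, indexed by position: bits[i] is the digit in position i.
bits = [0, 1, 1, 0, 0, 1]

# Each digit is multiplied by 2 to the power of its position, then summed.
value = sum(bit * 2**i for i, bit in enumerate(bits))
print(value)  # 0*1 + 1*2 + 1*4 + 0*8 + 0*16 + 1*32 = 38
```

Written most-significant bit first, the same number reads 100110; Python's built-in base-2 parsing confirms it: `int("100110", 2)` also gives 38.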

A simple addition
Now, let's see how to make a simple addition with electricity. We want to add 1 and 1. I did say
it was a simple addition! So we get 2. In binary, 2 is 1 0.

A simple adder

Here is the electronic circuit that makes this addition possible.

This part is what is called a "logic gate"; it corresponds to NOT. If there is electricity at the
input, there is no electricity at the output - and vice versa.

This rounded part is an AND gate: there has to be electricity in both incoming wires to have
electricity in the outgoing wire.

And here, finally, is an OR gate: there is always electricity in the outgoing wire, unless neither
of the incoming wires has electricity.

Since we want to add 1 and 1, we put electricity into the two incoming wires of the circuit. All
you have to do is follow the flow of the current.

● Let's start with the B-wire. Since it brings electricity to the input of the NOT gate,
there's no electricity coming out of that gate.
● The A-wire quietly propagates its electricity. Since there is electricity in one wire, but no
electricity in the other, there's no electricity out.

Let's go back to tracking the B-wire. It brings electricity to the AND gate.

The A-wire, on its side, brings electricity to the input of the NOT gate. So there's no electricity
coming out of that gate.

We find ourselves again with electricity at one of the inputs of the AND gate, but no electricity
at the other, so there's no electricity coming out of that gate… So there is no electricity at all
at the input of this OR gate. And therefore none at the output. We just got the 0 of the 1 0
result.

Back to the tracking of wires A and B...

B brings electricity to the input of this AND gate. A does the same. Since there's electricity at
both inputs of this AND gate, there's electricity coming out. Here is the carry, which can later
give the 1 of the result 1 0.
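
Putting the three gates together, the circuit just described can be simulated in a short Python sketch. The wiring follows the walkthrough above: the sum bit comes out of the final OR gate, the carry out of the last AND gate.

```python
# The three gates from the description, as functions.
def NOT(a): return not a
def AND(a, b): return a and b
def OR(a, b): return a or b

def half_adder(a, b):
    # Sum bit: electricity comes out when exactly one of the two inputs carries current.
    s = OR(AND(a, NOT(b)), AND(NOT(a), b))
    # Carry bit: electricity comes out only when both inputs carry current.
    carry = AND(a, b)
    return carry, s

# Adding 1 + 1: carry = 1, sum = 0, i.e. "1 0" - two, in binary.
print(half_adder(True, True))
```

Following the current by hand, as in the transcript, gives the same result as running the function.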

What about a previous carry?

If a previous carry is to be taken into account, the circuit becomes a bit more complex.
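
For the curious, here is one common way to handle that previous carry: chain two simple adders and OR their carries together. This is the standard full-adder construction, not necessarily the exact circuit shown in the video; Python's boolean operators stand in for the gates.

```python
def full_adder(a, b, carry_in):
    # First stage adds a and b (XOR built from OR/AND/NOT, plus an AND for the carry).
    s1 = (a or b) and not (a and b)
    c1 = a and b
    # Second stage adds the incoming carry to that intermediate sum.
    s = (s1 or carry_in) and not (s1 and carry_in)
    c2 = s1 and carry_in
    # A carry comes out if either stage produced one.
    carry_out = c1 or c2
    return carry_out, s

# 1 + 1 + a previous carry of 1 gives 3, i.e. "1 1" in binary.
print(full_adder(True, True, True))
```

Chaining one such circuit per bit position, carry to carry, is how wider additions are built.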

An adder at the transistor level

Moreover, each logic gate is itself an electronic circuit.

Here is the circuit corresponding to an AND gate - the transistor being a basic electronic
component.

The circuit of a NOT gate. And the circuit of an OR gate.

By replacing all the logic gates with their respective electronic circuits, we obtain the
transistor-based circuit, allowing a very simple addition to be made.

You can see that there is nothing magical about it: it's just wires, electricity, and transistors
that act as switches, so to speak.

This circuit enables - only - a simple addition to be made.

We then understand why computer processor manufacturers try to integrate more and more
transistors - today there are hundreds of millions of transistors, even over a billion.

32/64-bit microprocessor?

We also understand what it means for a processor to be a 32 or 64-bit processor. It’s simply the
number of incoming wires. In our example, there were two.
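
Whatever the exact hardware details, the number of wires (bits) determines how many distinct values the processor can handle at once. A quick illustration:

```python
# With n incoming wires, each carrying a 0 or a 1, a processor can
# distinguish 2**n different values at once.
for n in (2, 32, 64):
    print(n, "bits:", 2**n, "distinct values")
```

Two wires, as in our adder, give 4 combinations; 64 wires give more than 18 billion billion.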

Abstraction stack

Thus, at the machine level, at the computer level, it’s only about electricity and a lot of
switches. To use these circuits, programs are written in binary - that is, a series of 0's and 1's
which indicate whether or not there’s electricity in the incoming wires.

It works... but it's challenging for a human to use.

So for each type of processor, there is an assembly language. It’s more compact than binary.
It consists of small blocks of a few letters, plus 1's and 0's. An assembler translates this
language into binary. However, this is still very difficult to use.

The so-called "high-level languages" were therefore created. These are, for example, Java,
Python, C++, Perl, Lisp, Ada, etc. These languages are generally close to English, and much
easier for human developers to use. Compilers and interpreters are used to translate them
automatically into assembly language.
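
Python's standard `dis` module gives a glimpse of this translation chain: it shows the intermediate bytecode the interpreter actually executes (bytecode rather than true machine code, but the same idea of stacked translations):

```python
import dis

def add(a, b):
    return a + b

# Disassemble the high-level function into its lower-level instructions.
# Prints opcodes such as LOAD_FAST (the exact names vary by Python version).
dis.dis(add)
```

Each line of the output is one small step of the kind a processor-level program is made of.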

These languages are said to be "high level" in the abstraction stack because they are distant
from the details of the electronic circuits; they disregard the electronic specifics. A machine
language depends on the machine under consideration. A program in Java or Python is written
once, for all computers. Successive translations take into account the specificities of the
targeted machine.

The program written in a high-level language defines, in particular, an interface with the user
(click, voice command, etc.). This mode of interaction can be considered as a language that is
even more abstracted from the details of the machine.

This abstraction stack is at the heart of computing.

Conclusion
We're not interested in the details of the electronics. Nevertheless, we have to keep in mind
that in computing, in the end, everything comes down to electricity in wires. We can talk about
virtual reality immersion, artificial intelligence, Big Data, etc... at the end of the day, it's just
electricity in wires and a lot of switches.

It's dizzying. And at the same time very concrete.

Video transcription

WHY ARE DIGITAL TECHNOLOGIES


INTERESTING?
Digital information, digital technologies, are at the heart of the digitization of our economies
and societies. But basically, what is the point of these technologies? That's what we're going to
talk about in this video. Be ready: what I am about to say is obvious... but there are a few
elementary points that are worth remembering, as humans easily get used to what they
have, forgetting what they didn't have.

Digital has advantages


These information technologies, these digital technologies, bring several advantages. In
particular, it has never been easier to copy information. And it has never been easier to share
this information widely... If I send an email to all ESCP students with a document attached, this
document is almost instantly copied a few thousand times and sent to the few thousand
recipients! Let's remember that a few centuries ago it was a completely different story!

Quick reminder

Here we see a copyist. In order to reproduce a document, people like him were needed: they
copied, by hand, the contents of the document to be reproduced. To get an extra copy of a
book, it had to be copied in its entirety - again, by hand.

This is an example of a text reproduced by a copyist. The monasteries, in particular, devoted
much effort to this activity. If you wanted to access this text, or the book from which it is
taken, you had to access one of the few copies available in Europe. So you had to know where
to find such a copy - probably in a monastery or a seigniorial library - go there physically, and
get permission to see the highly-sought-after document.

Information processing
Information technologies, digital technologies, target information processing.

Here, information is at the center of attention. Information technologies make it possible to
store information; they allow the transport of information - hence its wide dissemination and
exchange - and the copying of information. They enable the modification of information and
its production - sometimes automatically, via sensors for example. They actually support
information processing in all its forms. Digitization has allowed a leap in the specific tooling
for handling information. It brings more automation of these processes - automation made
possible by the use of electricity and electronic circuits. Information technologies also allow
for greater fluidity. They support the ability to create and process much larger volumes of
information, to go much further in accuracy, at higher speeds. Overall, this digital tooling
allows much greater flexibility in information processing. Changing a sentence in the middle of
a page costs nothing; trying another way of analyzing accounting data takes only a few
moments - why not give it a shot? Sharing a report with 1,000 people is virtually free and
instantaneous.

From these new information processing capabilities, we obtain new tools for decision-making.
This is the most interesting aspect for a manager like you. Decision-making is based on the
processing of information, so logically, digital technologies are transforming decision-making.
And I want to say: who cares about digital if it doesn't support your decision-making? So it's up
to you to do something with it!

Conclusion
Digital technologies provide fabulous tools for information processing in general. There is still
much to be discovered in terms of tooling potential. You still have a lot to explore and invent in
terms of creating value from this tooling.

Video transcription

REPRESENTATION
In this video, we are interested in the notion of representation. It’s at the heart of digitization
and the role of managers in relation to digital technologies.

A detour through art


You probably recognize this work by Magritte, here reproduced on a stamp. It is definitely a
pipe... and yet the artist makes it explicit: this is not a pipe. Indeed, as realistic as this
painting is, it’s only a painting - whose subject is a pipe. We cannot smoke the painting. I
mean, I guess technically we could, but it won't have the same effect as smoking a pipe.

This is a representation of a pipe, not the pipe itself!

Computer science deals with representations


Why are we talking about this?

Because information technologies only deal with representations of the world.

A computer only has access to a representation of reality. A computer does not process reality
itself. The computer will grasp syntax - that is, the elements of representation in the
information world. The computer will in no way grasp the meaning of the representation.

The same reality has several representations


However, the same reality can have several representations. The computer will have to make
do with the representation transmitted to it.

This is an image of the Andromeda galaxy. This picture corresponds to
what a human eye can see (obviously using a very powerful telescope).

This is the same Andromeda galaxy. But this time, instead of showing radiation in the visible
range, the picture captures infrared radiation. For example, it can be noticed that there is an
outer ring of objects emitting a lot of infrared that does not correspond to an area of high
visible-light emission. Depending on the type of objects the astronomer wants to study, one
or the other of these pictures will be more useful.

It’s important to understand that these two images - different from each other - correspond to
the same reality, the same Andromeda galaxy, seen in different wavelengths.

There's no mistake here. There are only representation choices.

Representation and abstraction


Representing is abstracting certain details. If all the details of reality were in the representation,
the representation would be the actual reality.

● What should be kept in the representation and what can be omitted? What do we
need? To make decisions, for example.
● What distance between representation and reality is tolerable? If you prefer, what level
of approximation is acceptable?
● How often is it necessary to update?
● In a word, what are the details that can be abstracted?

Building a representation
Here are several examples.

Suppose I'm in charge of a chemical production facility. For example, there is a tank containing
a product that can explode if its temperature exceeds 63°C. I need to know the temperature of
that tank to an accuracy of a tenth of a degree. And to have this temperature reading every
second. These measurements will allow me to detect very quickly if an upward trend is taking
shape.

Suppose now I'm interested in the cash flow of a very large group: I can probably make do with
a weekly assessment, to within a few million dollars. If I'm still interested in cash flow, but this
time for a very small business, I probably have to follow it to the nearest thousand euros -
sometimes even to the nearest hundred euros - every day. Indeed, I sometimes have to choose
not to pay certain bills depending on the incoming cash flow.
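
These choices translate directly into code. A hypothetical sketch (the function names, rounding rules, and values are illustrative, not from the video):

```python
# Two representations of "an amount", tuned to two different decisions.

def tank_representation(celsius):
    # Safety monitoring: a tenth of a degree matters, read every second.
    return round(celsius, 1)

def group_cash_representation(dollars):
    # Large-group cash flow: the nearest million is enough, once a week.
    return round(dollars / 1_000_000) * 1_000_000

print(tank_representation(62.9481))              # 62.9
print(group_cash_representation(1_234_567_890))  # 1235000000
```

Same kind of underlying reality, two deliberately different abstraction levels: the decision to be made dictates the precision kept.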

Business information: abstraction


Let us now review the flow of information in the company. Here, we also find the concept of
abstraction.

This is the classic hierarchical pyramid. Each color represents different departments (HR,
marketing, operations, etc.).

There are several levels of action:

● The operational level,
● The management level,
● And the strategic level.

In parallel, we have different levels of information abstraction.

● At the operational level, field information. For example, in supermarkets, this means
having a price for each barcode, so that the customer can be charged at checkout.
● At the management level, information for decision-making and monitoring. In our
example, this could be data on the average basket and its comparison with the average
basket a year earlier.
● And finally, dashboards for strategic decisions.
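
The supermarket example can be sketched in a few lines; the data here is made up for illustration:

```python
# Field data: one record per item scanned at checkout.
receipts = [
    {"barcode": "301", "price": 2.5},
    {"barcode": "302", "price": 4.0},
    {"barcode": "301", "price": 2.5},
]

# Operational level: the price behind a barcode, to charge the customer.
prices = {r["barcode"]: r["price"] for r in receipts}
print(prices["302"])          # 4.0

# Management level: an aggregate view of the same data,
# e.g. the average amount per item sold.
average = sum(r["price"] for r in receipts) / len(receipts)
print(round(average, 2))      # 3.0
```

Same raw data, two abstraction levels: the operational lookup serves the cashier, the aggregate serves the manager.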

Role of the manager


All these representation questions are not technical questions, but questions of management
and business expertise.

● It's up to the manager to define the representation he or she needs.
● It is up to the manager to set the abstraction level, namely, to decide what to keep and
what to omit in a representation.
● In fact, it’s up to the manager to know how he or she makes his or her decisions and
therefore what information, with what degree of precision, is necessary for this decision
making.

Conclusion
We have seen in this video that representations are at the heart of digitization and that the
managers are in the best position to define these representations.

Video transcription

HARDWARE, SOFTWARE, NETWORKS


Let’s look at the main components of a computer system, to put the key ideas in place.

Computer system
A computer system consists of hardware and software. Hardware is where we find the
electrical wires, the laws of physics, electronic circuits, and actuators - in home automation or
robotics, for example. This is what makes it possible in practice to store information, transmit
information, and process information. A computer is hardware. A smartphone, too, or even an
autonomous car. The software part of the system is there to control the hardware. A computer
by itself does nothing; you install Windows, or Mac OS, and other software such as a word
processor or a strategy game, to coordinate the actions of the hardware in the desired
direction. Software is a computer program or a set of programs.

Everything is system
Computer science, informatics, is fundamentally systemic. This is certainly true from a technical
point of view and it’s also true from a business point of view.

Inside the hardware, if you zoom in a little more, you’ll find hardware elements connected in a
system. If you look inside your computer - make sure you know what you are doing before you
open the machine! and if it's a laptop, it's probably better not to open it - in short, inside the
computer, you'll find a hard drive, a microprocessor, some memory, maybe a DVD drive
(which is less and less common...). All of these elements are connected to each other. It’s in
fact this system of elements that forms the element "computer".

If, on the contrary, you zoom out, your computer is connected to other hardware elements. It
may be connected to a remote hard drive if you are using Google Drive, for example. It may be
connected by Wi-Fi to an Internet access point... and to another computer, a web server, via
the Internet, for example.

For the software part, it's very similar: when you program, the smallest "piece of code" - that
is to say, the smallest computer program - exploits existing programs written by other
developers.

Genericness vs. Specialization


Digital systems can be compared based on their generic nature or, on the contrary, their
specialization.

There's a relationship between, on the one hand, the degree of specialization of the hardware,
and on the other hand, the importance of software.

The more specialized the hardware, the lighter the software. To put it another way, the more
specialized the hardware is, the more rigid it is - in the extreme, it can only do one thing -, so
there is no need for software to coordinate anything. On the contrary, a piece of very flexible
equipment is able to perform many actions. It’s the choice and coordination of these actions
that will enable us to achieve the desired behavior. That's what the software does. The
computer is probably the most flexible machine invented by humans... and a computer needs a
lot of software to be used.

Networks
An important part of the computer system is the network. It takes at least two to communicate.
An ​emitter and a ​receiver​. Certainly, these two individuals must be connected in some way. But
that's not enough. It also requires a minimum of coordination. Here, for example, one speaks
while the other is not listening... it's not going to work. They also need to speak the same
language, for example. Between machines, it's the same thing but with more constraints.

Networks: Hardware + Software


Computer networks will contain a stack of hardware and software.

The hardware part is the network infrastructure. It is crucial. When we talk about Ethernet,
we're talking about the cables with which you can build a network in a building. Wi-Fi antennas
are necessary to establish Wi-Fi connections. We sometimes forget this, but to have Internet
connections between North America and Europe, there is a physical connection. This
transatlantic cable is extremely important. In 2018, there were approximately 450 submarine
cables in the world providing telecommunications, including Internet connections. Sometimes
these cables are damaged or broken, which can cut off Internet access for a region of the
world, or slow it down greatly if another access is available via another cable.

At the software level, we find the communication protocols. For example, HTTP, the
HyperText Transfer Protocol, is used to transfer web pages. FTP, the File Transfer Protocol,
was designed to transfer potentially large files. The encryption of a message is also a software
action.
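
A protocol is, at bottom, an agreed-upon message format. For illustration, here is what a minimal HTTP GET request looks like when built by hand (the host and path are placeholders):

```python
# An HTTP request is plain text with a strict layout: a request line,
# headers, and a blank line - each terminated by "\r\n".
host = "example.com"
request = (
    "GET /index.html HTTP/1.1\r\n"
    f"Host: {host}\r\n"
    "Connection: close\r\n"
    "\r\n"
)
print(request)
```

Both sides agreeing on this format - not any particular wire or antenna - is what makes the exchange work; that is the software half of the network.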

Conclusion
Going into more detail would be too technical. At the management level, in order to
understand certain aspects of the digital transformation, it’s interesting to know that every
computer system is made up of hardware and software, that there is a relationship between
the specialization of hardware and the importance of software, and finally that everything is
fundamentally systemic.

Video transcription

PROPRIETARY SYSTEMS VS OPEN SYSTEMS


In the digital world, there are two main types of systems: so-called proprietary systems and
so-called open systems.

Proprietary system
Let's start with proprietary systems.

There is of course a designer/manufacturer of this system. The system is designed entirely
in-house. More importantly, the details of the outcome of this design stage are not made
public. On this basis, the product - software or hardware - is developed and sold. The buyer
does not have access to the design details. He cannot know what's going on inside the product
and has no right to change it anyway. When the user complements his system, he often has to
acquire other products from the same supplier, because the supplier, in its proprietary
approach, may have defined its own internal connection methods - its own standards, so to
speak. This is the approach that Apple had in the '80s: to print with a Mac (hence an Apple
computer), you needed an Apple printer. You can still find this approach in the non-standard
connectors of iPhones.

Open system
An open system takes a different approach.

Obviously, the product under consideration has a designer/manufacturer. However, it uses
external ideas. For example, it will follow existing standards. It is, in any case, designed to
connect to or integrate third-party elements. The product obtained on this basis is more
transparent to the user. At the very least, for all the standards used, he has a clear idea of what
they encompass. Transparency can go further and allow the user to have access to all or part of
the product design, and allow him or her to modify it. When the user wants to complement his
or her system, he or she can easily use products from other manufacturers - different but
compatible, because they themselves are open and respect the same standards. This is mostly
the approach taken by Google, for example, with its Android system.
the approach taken by Google, for example, for its Android system.

Proprietary system
A proprietary system allows total control over the system by its developer. Indeed, the
system is a pure home-made product whose only constraint is that it must be compatible with
other home-made products. This generally results in a very high quality of use.

All the necessary know-how is developed and capitalized internally. The speed of development
is limited by the capabilities of the designer/manufacturer. For example, Blackberry, which had
a fully proprietary system, also developed most of its smartphone applications in-house. In the
end, there were many more applications for Android or the iPhone than for Blackberry,
because the first two opened their systems so that anyone could develop applications for
them.

With a proprietary system, customers are captive. Either they update and extend their system
with the same supplier - the only one to offer compatible components - or they restart from
scratch with a competitor. Of course, the flip side of the coin is that there may be more
hesitation before the first purchase, knowing that this first purchase takes us into a closed
universe.

There's a black box effect, of course. For some applications, this black box effect requires
greater confidence. Indeed, it’s impossible to check that there is nothing in the black box that
could harm the user. This raises questions about cybersecurity.

Open system
For an open system, it's quite the opposite.

Since the system can be made up of components from different manufacturers, each supplier
has only partial control over the whole. He has to rely on his competitors to respect the
different standards so that all these elements integrate correctly. Development efforts are
shared among all the suppliers contributing to the open system, so development goes much
faster. The clientele is non-captive. User interaction is unified: since the same standards are
followed, it’s likely that the interfaces are similar - at least in their logic. This can be
comforting for potential users. Since the user has access to all or part of the inside of the
system, there are greater inspection possibilities. This open approach ranges from compliance
with standards to free software or open data.

Conclusion
Of course, there is a gradation of the opening of a system. Furthermore, the same supplier may
develop proprietary systems and open-source software, depending on the strategic positioning
of the different products - this is the case with IBM, for example. The choice between a
proprietary solution or an open solution depends on the constraints specific to each project.

Video transcription

WHAT IS AI?
We talk a lot about AI these days in the business world. AI research started in the '50s. Let's try
to clarify what AI is and what it is not.

A.I., it's...
AI is Artificial Intelligence.

We could debate about a detailed definition of artificial intelligence, with differences between
several schools of thought, but this would not be very useful in our business perspective.

The simplest thing to remember is that artificial intelligence aims at giving more autonomy to
the computer system.

More autonomy
Why is system autonomy an important topic in computing?

In conventional computing, an algorithm tells the computer what to do in each situation. A
large part of a developer's job is to anticipate the possible situations, so as to indicate, for
each of them, what has to be done. For each situation, the solution is indicated.

Artificial intelligence algorithms do not say what to do; they say how to decide what to do.
Instead of listing the possible situations and indicating for each of them what to do, the
developer indicates how to build an appropriate action.

● With classical algorithms, what happens when faced with a new situation? When faced
with a situation not listed among the anticipated situations? The system fails. There's
nothing in its algorithm to indicate what it should do.
● With an artificial intelligence algorithm, what happens? An answer is constructed. It may
not be perfect, but something's happening!

Here we can clearly see the computer system’s increased autonomy.
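
The contrast can be caricatured in a few lines. The situations and rules below are invented for illustration:

```python
# Classical approach: every situation is anticipated and listed.
anticipated = {"sunny": "open the blinds", "rainy": "close the windows"}

def classical(situation):
    # Fails (KeyError) on any situation not in the list.
    return anticipated[situation]

# AI-style approach: a rule that constructs an answer from the situation.
def ai_style(temperature):
    return "heat" if temperature < 19 else "cool" if temperature > 26 else "idle"

# A temperature never listed anywhere still gets an answer.
print(ai_style(31.7))
```

`classical("snowy")` simply crashes; `ai_style` constructs an answer for any input, which is the increased autonomy the transcript describes, in miniature.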

Different applications

Artificial Intelligence is found in different forms, through various applications. AI applications
can be divided into two broad categories:

1. Situation assessment.
2. And decision-making.

Obviously both fields of application are linked. Decision-making requires knowing the situation
in which the system finds itself, and certain decisions can be made so that more information
can be gathered - and thus the assessment of the situation can be improved.

In decision-making, the system can decide on the next action to be taken, or plan a series of
actions to achieve a set of goals from a given initial situation.

Action planning can solve a satisfaction problem - that is, find an action plan that is a solution
to the problem, any solution - or find an optimal solution (one that minimizes costs or
maximizes profits). We can also plan under uncertainty, that is, without being sure that the
execution of an action will go as planned (the wheels of an autonomous car can spin, an object
can slip when a robot's clamp tries to grab it, or the sale of many L'Oréal shares can cause the
price to fall faster than expected...). Planning under uncertainty can be seen as a case of
optimization, where one tries to maximize the plan’s potential for success, whatever the
hazards.

On the situation-assessment side, we have image analysis (to find landmarks, or facial
recognition, for example). We also have the processing of natural languages (French, English,
Spanish, etc.). It’s worth mentioning localization - finding one's way on a map, in particular by
recognizing landmarks in an image of the system’s surroundings. Automatic mapping is the
creation of a map by exploring the surroundings. Little robot vacuum cleaners do this when
they arrive in a new apartment. Once the mapping is done, they optimize their movements. Of
course, there are other applications... and many are still to be defined.

Several approaches to decision-making


Let's just focus on decision-making. Here too, several approaches are available to us. We can
list:

● Game Theory,
● Decision Theory,
● Automated theorem proving,
● Action planning,
● And expert systems - very popular in the 1990s.

And there is also, of course, machine learning. AI is not just machine learning, but machine
learning is definitely AI.

The first five approaches require an explicit model: the system needs to be provided with a
model describing the world and what can be done. In addition, these approaches make it
possible to visualize how they work, which provides elements of justification when the AI
makes a proposal. Machine learning, on the other hand, does not need an explicit model. It
will build a model from the raw data…

Example of a planning algorithm


To better understand how an explicit approach works, we will look at a small example in action
planning.

We will detail machine learning in another transcript.

We are in a simple situation, with an environment consisting of 24 cells, organized in 4 rows
and 6 columns. Black squares are impassable obstacles.

Logical statements represent this universe - this is the explicit model that is provided to the
algorithm. There are 4 of them here (it's a very simple universe). They are used to define if a
cell is an obstacle, if a cell is empty, and the row and column of a robot.

These descriptive statements are complemented by actions. Here we represent only the
action of going north (GoNorth). Other actions are of course available. We assume north is up.

The action is for a robot. It has preconditions, all of which must be fulfilled for the action to be
carried out. In this example, in order for the robot to go north, the robot must be somewhere,
and the cell in the same column but one row above it must be empty (this is what
IsEmpty(x, y-1) expresses).

If this action is carried out, its effects are validated. The row of the robot's position is one row
above. The position where the robot was located becomes empty. The robot is no longer on
the row where it used to be. And the cell where it is now is no longer empty.

The search for a plan is done simply by simulating the possible positions and the execution of
the available actions. Here, our robot is in position number 11, that is to say, row 1 and
column 4. The preconditions for the North, East, South, and West movement actions are
fulfilled - these 4 actions are therefore executable.

Let's run the algorithm.

From the starting position, in (4,1), the robot can go north, east, south, and west (we will not
show this branch of the decision tree to avoid overloading the presentation). The algorithm
considers each of the positions reached, determines which actions are executable, and
simulates their executions. From position (4,0), the robot cannot go north (because there is no
cell to the north). But it can go east, south, and west. From position (5,1), the robot cannot go
east, but the other movements are accessible to it. From position (4,2), all movements are
possible. The algorithm continues in this way until a goal position is reached.
The construction of this decision tree allows us to understand the algorithm’s reasoning. The
branch that reaches the goal then indicates the path to follow from the starting node to the
goal node.
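To make this search concrete, here is a minimal sketch in Python. It uses the (column, row) coordinates from the example above, with North as row - 1; the obstacle layout and the goal cell are hypothetical, since the transcript does not specify them.

```python
from collections import deque

def plan(start, goal, obstacles, rows=4, cols=6):
    """Breadth-first search: simulate every executable action from each
    reached position until a goal position is found."""
    # Actions and their effect on (column, row); North is up (row - 1).
    actions = {"GoNorth": (0, -1), "GoSouth": (0, 1),
               "GoEast": (1, 0), "GoWest": (-1, 0)}
    frontier = deque([(start, [])])
    visited = {start}
    while frontier:
        (x, y), path = frontier.popleft()
        if (x, y) == goal:
            return path  # the sequence of actions leading to the goal
        for name, (dx, dy) in actions.items():
            nxt = (x + dx, y + dy)
            # Preconditions: the target cell exists and is empty (no obstacle).
            if (0 <= nxt[0] < cols and 0 <= nxt[1] < rows
                    and nxt not in obstacles and nxt not in visited):
                visited.add(nxt)
                frontier.append((nxt, path + [name]))
    return None  # no plan reaches the goal

# Hypothetical obstacles and goal, for illustration only:
plan_found = plan(start=(4, 1), goal=(0, 3), obstacles={(2, 1), (2, 2)})
```

Each element taken from the frontier corresponds to one node of the decision tree described above; the returned list of action names is the branch that leads from the start to the goal.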

Machine learning
Machine learning is a totally different approach. It does not use an explicit model. There are no
observable mechanics (in our example, the tree of the different possible movements
constituted these observable mechanics).

Machine learning starts with the analysis of observations. It suffices to provide data, without
a model. Machine learning has enabled highly successful business applications. That's why it’s
now in the spotlight.

Conclusion
You now know the main structural elements of Artificial Intelligence. We managed to put
machine learning in a broader context. This approach - machine learning - is developing
strongly in companies today. This is why it’s presented in more detail in another video.

Video transcription

FOCUS ON MACHINE LEARNING


Machine learning is a type of Artificial Intelligence approach that is currently getting a lot of
attention from companies. Machine learning is not all artificial intelligence, but it is probably
the best-known part of artificial intelligence today. We will discuss it in more detail, to better
understand exactly what it covers.

We are discussing Machine Learning, but you may also have heard of Deep Learning. It's a
subset of machine learning algorithms.

A machine learning algorithm studies datasets to detect regularities, to detect repeating
patterns. The aim is to detect correlations between factors. Since we are talking about
statistical approaches - advanced statistical approaches, certainly, but statistical approaches
nonetheless - the set of data studied must be very large. That's why machine learning is linked
to Big Data.

The data in the datasets are not processed for themselves, but for the relationships that are
identified. For example, with data from the sales of a supermarket, we are not interested in the
fact that at 10:58 a.m. a customer bought grated Swiss cheese, but in the fact that baby diapers
are often bought at the same time as grated Swiss cheese (this example is fictional).

This is why we talk about machine learning: the computer system constructs information that
has not been explicitly communicated to it. This information is hidden in the data, but it has not
been given explicitly to the system - most often because the human agent does not know it.

Artificial experience
Thus, machine learning, in artificial intelligence, is in fact artificial experience.

● Using machine learning means building automatically. Automatically, because it's an
algorithm, a computer, that does it...
● An experience. Because it’s based only on past observations, these observations
constituting the dataset under study.
● A shared experience. Because the number of past observations considered is gigantic.
This corresponds to many more observations than would be possible in a single human
experience.

Experience?
Let us return to this notion of experience because it’s at the heart of machine learning.

The experience is built from the data, from the observations. Thanks to this principle, machine
learning does not need a pre-existing model. Exploiting the available data, machine learning will
identify more complex relationships on its own.

It's extremely convenient, but it also raises a series of difficulties related to the availability and
quality of the data. Statistically identified relationships are correlations. For example, we can
see that two variables evolve in the same way by examining the data. But we don't know
anything about the possible causal relationship between them. It's a matter of experience: by
experience, I think I know whether it's going to rain or not during the day based on the weather
in the morning (color of the sky, wind, presence of clouds, etc.), but I don't know anything
about meteorology: I don't know why it's going to rain or not.

My past observations have taught me correlations between various factors and whether or not
it rains, but they have taught me nothing about the causes and mechanisms of rain.
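This distinction can be made concrete in a few lines of Python. The sketch below computes a correlation coefficient between two invented series of observations (morning cloud cover and rainfall, purely illustrative); a high value captures the "experience" while saying nothing about causes.

```python
# A correlation tells us that two variables move together - nothing about why.
# Invented observations: morning cloud cover (0-10) and rainfall (mm) that day.
clouds = [1, 3, 4, 6, 8, 9]
rain = [0, 0, 2, 5, 7, 9]

n = len(clouds)
mean_c = sum(clouds) / n
mean_r = sum(rain) / n
cov = sum((c - mean_c) * (r - mean_r) for c, r in zip(clouds, rain)) / n
std_c = (sum((c - mean_c) ** 2 for c in clouds) / n) ** 0.5
std_r = (sum((r - mean_r) ** 2 for r in rain) / n) ** 0.5
correlation = cov / (std_c * std_r)
# A value near +1 means the two series rise together; it still says
# nothing about the mechanism linking clouds and rain.
```

On these made-up figures the coefficient comes out close to 1: a strong correlation, learned from observations alone, with no meteorology involved.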

Finally, since machine learning builds an experience, its decisions are difficult to explain. It's a
bit like a colleague who would recommend an action to you using his experience as an
argument: "Trust my great experience: you should do this".

Workflow: Learning stage


How does it work in practice?

Let's take the example of learning how to recognize numbers in handwriting. This recognition is
very useful, for example, for the automation of postcode recognition in sorting centers.

The first stage is a learning phase. We present a handwritten number. The system
communicates its proposal. We tell the system whether its answer is correct or not. We can
also tell it what the correct answer is. Based on our feedback, the algorithm will modify many
parameters in the learning system so that the value presented leads to the correct answer. A
new observation is presented to the system. It presents its proposal. We give it our feedback.
And so on for the considerable amount of data in the learning corpus.

Little by little, the system will find the settings that allow it to answer all of the learning stage
questions correctly - it’s said to converge.
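Real digit recognition needs a large system, but the present/answer/feedback loop itself can be sketched on a toy task. The example below is an illustrative assumption, not the transcript's system: a single perceptron learning the logical AND function, adjusting its parameters after each piece of feedback until it converges.

```python
# Toy illustration of the learning loop: a perceptron learning AND.
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w1 = w2 = bias = 0.0  # the parameters the system will adjust
rate = 0.1

for _ in range(100):  # repeated passes over the learning corpus
    errors = 0
    for (x1, x2), target in examples:
        # We present an observation; the system communicates its proposal.
        proposal = 1 if w1 * x1 + w2 * x2 + bias > 0 else 0
        feedback = target - proposal  # we tell it the correct answer
        if feedback != 0:
            # Adjust the parameters so this value leads to the right answer.
            w1 += rate * feedback * x1
            w2 += rate * feedback * x2
            bias += rate * feedback
            errors += 1
    if errors == 0:  # every answer is correct: the system has converged
        break
```

After a handful of passes the settings stop changing: the system answers all of the learning-stage questions correctly.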

Workflow: after the learning stage


Once the system has learned, it knows the learning corpus by heart, but it can also recognize
numbers in examples that were not presented to it during the learning stage. This is what we
call generalization: some kind of model has been learned. On the one hand, this model contains
only correlations. On the other hand, this model is not explicit: it is dissolved in the fine-tuning
of all the parameters. Of course, the experience gained depends on the data used during the
learning stage.

Conclusion
We have just painted a broad picture of what machine learning is all about. This ability to create
experience automatically, to find previously unknown relationships in data sets has very strong
business potential.

Video transcription

IMPORTANCE OF DATA
An artificial intelligence algorithm, no matter how sophisticated, cannot do anything without
data. This is all the more true in machine learning, since this approach aims precisely at building
an artificial experience from past observations.

The absolutely necessary data


The first question to ask is: do we have these data? Either we have them, or we don't.

● When we do have them, in some cases it’s necessary to make sure that we also have
data to validate the answers for the learning stage. The second question to ask is: do we
have the right to have these data? Many laws and regulations may come into play.
● If we don't have them, we can collect them - through surveys, through sensors, or
through a new reporting process, for example. We may also simply acquire these data
from a third party. The data market is expected to grow in the coming years. In some
cases, it’s also possible to simulate the data - for example, the data we need for the
machine learning stage.

Protection of privacy
A special case of data availability that is becoming increasingly important is that of data related
to privacy.

There are many activities, many business models, that are based on processing individuals'
data. This collection of information necessarily interferes with the notion of privacy.

There are specific legal frameworks. In the European Union, for example, a brand-new legal
framework for the protection of personal data has been in force since 2018 - the GDPR. Of
course, our data processing must comply with the legal framework in force. However, this is not
enough. We have to be attentive to the perception of the data subjects. Indeed, you can very
well implement some data processing that is completely legal, and yet be considered
absolutely unacceptable by your customers or employees. This perception can be very different
from one culture to another, from one generation to another, and so on.

Data quality
Once we have ensured that we have the necessary data, the next step is to evaluate its quality.
Several pitfalls are possible.

● Some of the data may be missing. Taking into account the absence of part of the data
can be extremely complex. Missing data can lead to biases in the data set. The data may
also simply contain errors. In fact, they do contain errors, that's for sure - what needs to
be determined is whether they contain many or few errors, and whether there are ways
to correct those errors - and at what cost.
● Finally, the data may be quite correct, perhaps even complete... but absolutely
irrelevant to the type of situation we are concerned about. Here is an example. Suppose
we use a machine learning algorithm. By making it learn from a set of data about
consumption patterns in Uganda, we produce an artificial experience of consumption in
Uganda. Is this experience relevant for entering the South Korean market? Probably not,
because it’s reasonable to think that cultural differences between the two countries
have a strong impact on consumption habits. More generally, it’s necessary to ensure
that the data used represent reality and that they are not biased in one way or
another.

Conclusion
Data is crucial in the digital world - especially with the use of advanced technologies such as
machine learning. The management of these data, data quality control, and data
commoditization are essential aspects of the digital transformation.

Video transcription

BIG DATA: 3V
A Big Data system is generally characterized by three V's and two rhythms of use.

Here we focus on the three Vs:

1. V as in Volume;
2. V as in Variety;
3. V as in Velocity.

Volume
The ​Volume​ aspect is most obvious when it comes to Big Data. After all, it's in the name: "Big"!

Usually, we start talking about Big Data when several terabytes of data need to be processed.
Suppliers claim that they can handle exabytes.

What does this mean?

The base unit is a byte. A character, for example, corresponds to a single byte.

● It takes about 1,000 bytes to make one kilobyte;
● about 1,000 kilobytes to make a megabyte;
● and about 1,000 megabytes to make a gigabyte.

We're used to kilobytes, megabytes, and gigabytes.

● There are about 1,000 gigabytes in a terabyte,
● about 1,000 terabytes in a petabyte,
● and about 1,000 petabytes in an exabyte.

So one exabyte is the equivalent of a billion gigabytes... that is, a billion billion bytes... that's
huge!

We'll talk about Big Data when we work with terabytes and more.
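These orders of magnitude are easy to check with a few lines of Python, using the decimal convention above (each unit is 1,000 times the one below):

```python
# Each unit is 1,000 times the previous one (decimal convention).
units = ["byte", "kilobyte", "megabyte", "gigabyte",
         "terabyte", "petabyte", "exabyte"]
size_in_bytes = {name: 1000 ** power for power, name in enumerate(units)}

exabytes_in_gigabytes = size_in_bytes["exabyte"] // size_in_bytes["gigabyte"]
# → 1_000_000_000: one exabyte is indeed a billion gigabytes
```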

Let's see what these large numbers correspond to through two examples.

● Walmart is the world's leading retailer. When they're done shopping, customers go to
the checkout... generating over a million customer transactions per hour. Each of these
transactions generates data. These data are imported into databases estimated at over
2.5 petabytes. This is among the most extensive datasets held by a private player.
● In France, the social security system manages the partial reimbursement of health
expenses for all French citizens. The dataset stored is of the order of 500 terabytes.

These are two examples of what is known as the data deluge.

Dealing with such volumes of data has required the development of new computing
approaches. The main problem is not to store these data, but to be able to search for a subset
of data, according to certain criteria, within a reasonable time.

Variety
V as in Variety refers to the types of data that can be processed.

There are two main types of data:

● structured data,
● and unstructured data.

Structured data are what we are used to in a business context: they are structured in tables
and generally come from a traditional database extraction.

For example, in a human resources department, we can produce a list of employees. In this
example, the corresponding table consists of 4 columns. Each column is associated with a given
field:

Last name, First name, Recruitment date, and Annual salary.

Last name   First name   Recruitment date   Annual salary

Nicole      Badhar       01/09/2001         45 000 €

Pierre      Monnier      12/06/2010         26 000 €

Dolores     Alba         01/01/2015         60 000 €

Peter       Smith        09/10/2016         50 000 €

Here, the structure of the table conveys the meaning of the content. For example, if we want to
know the average salary, the computer only has to calculate the average of the values in the
fourth column. Of course, the computer does not know what a salary is... but can retrieve the
information from column 4.
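As a sketch, here is that four-column table as a Python list of rows. The average-salary computation needs only the column position (index 3), not any notion of what a salary is - exactly the point made above:

```python
# Structured data: the column position carries the meaning.
employees = [
    ("Nicole", "Badhar", "01/09/2001", 45000),
    ("Pierre", "Monnier", "12/06/2010", 26000),
    ("Dolores", "Alba", "01/01/2015", 60000),
    ("Peter", "Smith", "09/10/2016", 50000),
]

# The computer does not know what a "salary" is; it averages column 4.
average_salary = sum(row[3] for row in employees) / len(employees)
# → 45250.0
```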

Unstructured data, on the other hand, has no explicit structure - or, more precisely, the
structure managed by the computer does not help from a semantic point of view. This is the
case for images, videos, and common texts in natural languages (English, French, Spanish,
Chinese, etc.).

For example, consider the following email:

"Dear Yannick,

We just finished the interviews. We have selected Elodie Clark, freshly graduated from ESCP
Business School. As you would expect from that school, she's brilliant. She had salary
expectations a bit high for a junior... But she finally accepted our usual salary in exchange for a
few extra days off (she surfs...). She starts on January 1st.

Kind regards "

All the information is there: last name, first name, salary, date of recruitment... but it is not
explicitly indicated where it is. For the computer, this text is a sequence of characters. Finding
names, salaries, etc. is very difficult, as it requires a certain "understanding" of English. For this
reason, unstructured data is challenging for a computer to handle.
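A tiny Python sketch makes the contrast visible. Using a condensed version of the email above, a naive search for explicit field labels comes up empty: the facts are all there, but nothing marks where they are.

```python
# The same facts as the structured table, but as unstructured text.
email = (
    "Dear Yannick, We just finished the interviews. We have selected "
    "Elodie Clark, freshly graduated from ESCP Business School. "
    "She finally accepted our usual salary in exchange for a few extra "
    "days off. She starts on January 1st. Kind regards"
)

# No field markers to anchor an extraction on:
has_labels = any(label in email for label in
                 ("Last name:", "First name:", "Salary:", "Recruitment date:"))
# → False: recovering the fields would require some 'understanding' of English
```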

Same problem with an image: a computer "sees" a set of colored dots, while we humans
quickly structure the image according to the objects represented in it. Big Data technologies are
capable of processing unstructured data. They can analyze it to find structure.

This is very powerful, as we can see in security and safety, for video surveillance monitoring.
Videos can be analyzed by the system that isolates faces in the images and compares those
faces to a file of dangerous people. If one of them is detected, an alert is triggered.

These technologies also make it possible to detect unusual behavior, in order to prioritize the
attention of security agents to these behaviors.

In marketing, this makes it possible to cross-reference structured information (daily or weekly
sales of a given product...) with what is posted about the same product on consumer forums or
on Facebook. This makes it possible to detect a change in customer behavior more quickly.

Velocity
The last V is about Velocity, the processing speed. In a Big Data system, data must be
processed quickly, for two reasons.

Firstly, the pace of business is itself fast: when someone asks a question, he or she needs a
quick - ideally immediate - answer.

Second, if the flow of data entering the system is itself very fast (which is likely to be the case
when it comes to Big Data), if the analysis takes a long time, the result may already be out of
date given all the new data collected in the meantime.

Video transcription

2 RHYTHMS OF USE
A Big Data system is generally characterized by three V's and two rhythms of use.

We focus here on the two rhythms of use:

● Synchronous, or nearly synchronous, use;
● Asynchronous use.

Quasi-synchronous rhythm
The quasi-synchronous rhythm corresponds to the classical use of a database.

A query is defined, which allows a database to be interrogated, and the corresponding results
are retrieved. This makes it possible to extract a subset of data from the database - according to
several criteria, to combine data and cross-reference them with other data.
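As a minimal sketch of this rhythm, using a fictional sales table in SQLite (the small database engine built into Python): a query is defined, the database is interrogated, and the corresponding results are retrieved at once.

```python
import sqlite3

# Fictional sales data, to illustrate the quasi-synchronous rhythm.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (product TEXT, amount REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?)",
                [("cheese", 3.5), ("diapers", 12.0), ("cheese", 3.5)])

# Define a query, interrogate the database, retrieve the results:
rows = con.execute("SELECT product, SUM(amount) FROM sales "
                   "GROUP BY product ORDER BY product").fetchall()
# → [('cheese', 7.0), ('diapers', 12.0)]
```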

With Big Data technologies, it's the same thing: you start with a query. But this query can
interrogate a large number of databases at the same time,

● to process structured data,
● but also unstructured data such as images,
● or videos,
● sounds - why not?
● and texts in natural languages, from social networks, for example.

Thus, this first use is very close to what already exists in traditional corporate information
systems. Still, Big Data offers much greater potential in terms of data volume, type of data -
structured and unstructured, and connection to a large number of different databases.

Asynchronous rhythm
Now let's take a look at the other rhythm of use: the asynchronous rhythm. It’s this type of use
that brings the most attractive promises of Big Data, or at least the most prominent ones.

This is where machine learning comes into play. This is also known as "data mining". All of this
is a type of artificial intelligence.

The corresponding algorithms analyze the data to find correlations between factors.

A set of data is provided to the computer system.

It analyzes it, to build a correlational model - a set of correlations - some of which can be very
complex. It's really about identifying patterns in the data, repeating patterns. It’s called learning
because these patterns, these correlations, are not provided to the system: the system detects
them from the raw data.

We can use the purchases of customers in a store as an example. We can see that there is a
correlation between the purchase of grated Emmental cheese and the purchase of baby
diapers. Thus, we can promote grated Emmental cheese to attract interested customers, who,
in turn, will buy baby diapers in our store. This example is, of course, fictional!
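A minimal sketch of this kind of pattern detection, on invented baskets: count which pairs of products appear together, and the regularity emerges from the raw data alone - nobody told the system about cheese and diapers.

```python
from itertools import combinations
from collections import Counter

# Fictional customer baskets, as in the example above.
baskets = [
    {"emmental", "diapers", "milk"},
    {"emmental", "diapers"},
    {"bread", "milk"},
    {"emmental", "diapers", "bread"},
]

# Count how often each pair of products is bought together.
pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# The most frequent pair is a regularity found in the raw data alone:
top_pair, count = pair_counts.most_common(1)[0]
# → ('diapers', 'emmental'), appearing together in 3 baskets
```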

Analyzing the data to detect these regularities, these correlations, takes time - it can even take
a long time. This is why we are talking about asynchronous use here.

Both rhythms of use can be connected.

First, in asynchronous mode, correlations are established and regularities are detected from
the data - for example, consumption behaviors.

Then, synchronously, we monitor the current data to recognize the patterns that have been
learned - here, consumption behaviors - in order to make decisions specific to these patterns (in
my example, specific to these consumption behaviors).

Conclusion
We have just seen the two rhythms of use pertaining to Big Data technologies, one -
synchronous - which greatly increases the potential of database queries, the other -
asynchronous - which introduces machine learning, and of course the combination of the two
which allows incoming data streams to be processed according to what has been learned from
past data.

Video transcription

INTERNET OF THINGS
The Internet of Things is developing rapidly. It’s one of the structuring technological trends for
the future. Here we present its main characteristics. The Internet of Things starts from the
classic Internet network, which connects computers, or at least devices dedicated to data
processing, and extends it to all kinds of things. A house with a home automation system, cars,
and robots are examples. Production machines in a factory can also be connected in this way.

A better apprehension of the real world


This allows a better grasp of the real world - by the computer system! Indeed, since each thing
has a unique address on the network, each thing can be identified digitally. This is what allows a
computer system to apprehend the things of the real world.

Let's consider an example in clothing distribution. A classical system has information that 5
T-shirts were delivered to a store, and that, of those 5 T-shirts, 3 were sold. But with the
Internet of Things, we can identify exactly which T-shirt was sold, and match it to a specific
T-shirt among the deliveries. We can then use this precise identification to know that this
T-shirt was in storage for a while, was then positioned in the store at a particular location, and
went through a whole series of other steps before it was sold. Of course, the digital
identification of the T-shirt does not, in itself, create value... but it allows many uses that can
create value.
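One such use can be sketched in a few lines of Python. With a unique identifier per thing (the identifiers and event names below are invented for illustration), the system can keep a per-item history instead of mere aggregate counts:

```python
# Invented per-item events, as an IoT system might record them.
events = [
    ("tshirt-0042", "delivered"),
    ("tshirt-0042", "stored"),
    ("tshirt-0042", "shelved front of store"),
    ("tshirt-0043", "delivered"),
    ("tshirt-0042", "sold"),
]

history = {}
for item_id, event in events:
    # Each thing has its own digital identity, so its own history.
    history.setdefault(item_id, []).append(event)

# We no longer only know "2 delivered, 1 sold": we know *this* T-shirt's path.
```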

Interactions between objects


Beyond this identification of each item, the Internet of Things also allows the things themselves
to communicate - that is, to send and receive messages. These two characteristics -
identification of each thing and direct interactions between things - lead to a kind of merger
between the real world and the informational world.

Merger
There are strong links between the Internet of Things and Big Data. Indeed, the Internet of
Things makes it possible to finely track each item. These histories, across a population of items,
feed Big Data systems - and in particular the machine learning part.

On the downside, this raises the question of disconnection. If the objects we use are themselves
connected, it becomes difficult for us to disconnect.

Conclusion
The Internet of Things is the extension of the Internet to all things. It’s already very present in
areas such as logistics, and more prospective in other areas. In any case, it’s a major,
structuring trend for the future.

Video transcription

SAAS/CLOUD
SaaS, or the Cloud, are widely used terms that indicate a strong trend in the evolution of
corporate computing.

A business model
SaaS is above all a business model, made possible by the evolution of technologies. SaaS stands
for Software as a Service. Instead of buying software from a software publisher, installing it on
your computer, and using it, the publisher installs its own software on its own servers and you
buy the use of the software from this provider. IaaS stands for Infrastructure as a Service.
Google Drive is an example of this type of service: the company Google provides you with
space on its own hard drives. You rent the use of infrastructure. This is usually on a pay-per-use
basis. It can be a flat fee per user, or a pro-rata payment based on the time of use or the
mobilization of the servers in terms of memory, computing time, etc. All of this is also often
referred to as the "cloud".

Cloud?
So yes, with the "cloud", your software and data live somewhere remote, "in the clouds"... But
that's not directly what we're interested in here. Cloud is a more generic term, which covers
both software and infrastructure, provided that everything happens remotely. You don't use
your copy of the software on your computer; you connect to a remote server. The public cloud
is the best known: the service is available to everyone. Google Drive and Google Docs are
examples of public cloud offerings. The private cloud is the same for the user: he knows that
nothing is happening on his computer, that everything is remote, on a server somewhere.
However, the person in charge of the IT system knows that the applications or disk space
involved are on the company's own servers. He organizes a cloud offer open only to the
company's employees.

Load balancing
One of the advantages of a cloud offering is load balancing.

On this graph, we have represented the load as a function of time. For example, this can be a
number of simultaneous connections on a server. In this example, the load is irregular. In the
beginning, a small server would be more than sufficient. But during the strong rise in user
activity, it would be overloaded; a powerful server would be required. Then the load drops
sharply, so a powerful server is oversized, before being needed again, and so on and so forth.
Investing in a powerful server means being oversized for all periods of low activity - you pay for
a tool that you don't fully use. Having only a small server is insufficient for periods of high
activity, and will create customer dissatisfaction. With a cloud offer, since you pay per use, the
cost follows the activity curve.
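The figures below are invented, but they sketch the point: with pay-per-use, the bill follows the load, instead of paying for peak capacity at all times.

```python
# Invented figures: demand per period, a fixed server sized for the peak,
# and a cloud price proportional to use.
load = [10, 10, 80, 100, 20, 10, 90, 30]   # activity per period
big_server_cost = 50                        # per period, paid even when idle
price_per_unit = 0.6                        # cloud: pay only for what you use

fixed_total = big_server_cost * len(load)
cloud_total = sum(price_per_unit * demand for demand in load)
# With pay-per-use, the cost follows the activity curve instead of the peak.
```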

Impact on accounting
There is also an ​accounting impact​.

In a classic configuration, the company invests to buy a server. With the cloud, this investment
expense is transformed into operating expenses. It's a bit like the difference between buying an
office building and renting one. By the way, let's note that in a classic configuration there are
also, in addition to the investment, operating expenses such as the salaries of the team in
charge of server operation. These operating expenses disappear with a cloud offering, since the
server is managed by the provider.

Impact on security
Finally, the cloud has a strong impact on security.

There is a balance to be found between, on the one hand, the flexibility allowed by the cloud
and the lower level of cash mobilization, and, on the other hand, control over your computer
system. With a cloud offer, you accept that all or part of your data goes through your provider's
servers. As with any outsourcing contract, it’s up to you to evaluate the risks and to evaluate
the provider.

Conclusion
Cloud offers allow you to use the information processing power offered by digital technologies
without having to manage a computer system. This obviously has many advantages but also
some drawbacks.

Video transcription

DO NOT THINK TECHNOLOGY. THINK VALUE!


To question the impact of new technologies on the management of organizations, we question
here the link between technologies and value.

Management objective
Let's go back to basics. ​What is the objective of a manager?

Managers are here to create value and to protect this value. Value can come in many forms:

● It can be money,
● knowledge (in a scientific research institution, for example),
● more security,
● more ecological sustainability,
● well-being, happiness -
○ (Well, I always think of cake in these cases - because I have a sweet tooth - but
you can put whatever you want behind well-being and happiness.)
● The value can also be better health, increased life expectancy, or improved quality of
life.
● Love, maybe!

Who knows what else!

In short, you define what value is in your context and you - the manager - must work to create
more and protect it. This is important because it means that when dealing with technologies,
you must also focus on value. I do not want to disappoint you, but technology does not create
value. Value does not come from technology alone. The value comes from the way you use the
technology, in your context.

Need for transversality
This requires a certain transversality.

Indeed, it means that we need technology ​experts​, of course, but also managers. Because the
managers are the ones who lead people, they are the ones who design the organization, they
are the ones who know their business. And all these people have to work together. We can't
reduce the problem to a technological issue, because it doesn't work. Managers need to know
enough about technologies to understand what they can do with them and what they can't do
with them, so that they can relate these technologies to their business issues and business
specificities.

IT geeks can't do it alone: they know their technologies very well, but they don't know enough
about the business to know which use would have the most significant impact.

We need to be able to manage multidisciplinary teams, with different profiles - more or less
Tech-oriented, more or less Business-oriented - working together. Here again, we see the
importance of multicultural management. Here we are talking about different professional
cultures.

Finally, exploiting information technology to create value is a matter of frontiers. We need
more and more expertise on the boundaries between domains. In the past, training led to more
and more expertise in one's field. We still need these experts. But on top of that, today we
need "experts of the frontier": men and women who know enough about two fields to make
connections, to get the experts in those fields to work together.

Impact on decision making


Now, let's go a little deeper into what information technologies can change. Information
technologies affect decision-making - at different levels of decision-making.

The idea is to take advantage of these technologies to exploit information. That is, to create
value from this information.

There are several obvious impacts:

● Information technology helps to speed up decision making. We need to analyze how
this increased speed in decision making creates value.
● Information technology allows us to break free from geographical constraints. Again,
what this changes in terms of value creation will depend on the situation.
● Finally, of course, information technologies make it possible to increase the rationality
of decisions.

How to take advantage of technology?


With this in mind, we can distinguish two ways in which information technologies can be used
to create value.

The first way to benefit from information technology is to use new IT tools to support existing
processes in a given business model. This way, we reduce the number of errors, speed up
operations, and so on. This is a good first step.

However, you will create more value if you reinvent the processes based on what is now
possible with new technologies. You may even be able to reinvent your business model based
on the possibilities of new technologies.

That's more difficult. It probably requires a bit of intrapreneurial thinking. But this is the path
that leads towards much more value!

Conclusion
In this video, we insisted on the fact that technology in itself is of no interest to management.
What matters is its use to create value. And to define this use, the manager has a key role to
play.

Video transcription

DON'T WASTE IT! THINK MORE VALUE!


Most organizations under-exploit their information. Our focus here is on increasing value
creation.

The digital challenge


What is at stake with digital technologies is the creation of value from data: how to transform
business data into value.

Data are a resource to be valued.

● Do we have this resource?
● If not, can we get it? By collecting the data, buying them, etc.
● Is it legal to have (or obtain) them?
● Do these data need to be secured?
● What can we do with them?

All data users know how to answer these questions.

But an equally important and almost never considered question is whether we can do more.

It's like asking whether we can create more value from the same data. It's about maximizing
your gains. It’s also about minimizing your costs. Digital information is completely reusable -
let's not turn it into a one-time resource through our behavior!

Difficulty: too narrow project


One of the main difficulties is considering a project that is too narrow, with a scope that is
too small!

Is it indispensable to organize (and pay for!) the collection of the data we need? Aren't they
already available elsewhere in the company? If so, we should use these data instead of
collecting them again!

We know how we create value with the information, but can we create even more value? If we
can’t, is there anyone else in the company who might be interested in these data? Have we
checked?

Difficulty: unavailable information


Another difficulty is simply the non-availability of the data... Is it because of digital
constraints? For example, we can have these data on paper forms, without a digital equivalent.
Maybe the data system is fragmented, and we simply don't have access to the subsystem
containing the data we want.

● Is this because of legal constraints?
● Maybe this information has simply not been collected,
● or it was collected in the wrong form.
● Maybe the owner of these data doesn't want to share them...

There are a lot of data accessible, but not all of them!

Let's broaden our way of thinking


There are several ways of processing data.

● We can process them in detail, through fine tracking of the different items - what is
called per item information - by exploiting, for example, the Internet of Things.
● On the contrary, Big Data enables statistical processing, considering data in large
quantities.
● Data can be processed for what they are: they provide the information sought in a
query. For example, the list of sales for a given morning.
● They can also be used for the information they allow us to infer (for instance,
correlations between customer characteristics and consuming behavior). We would
then probably use machine learning.
Starting from an initial project, we need to broaden our way of thinking and always question the relevance of the data and the potential value created.
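These different processing modes can be sketched in a few lines of Python. The sales records, field names, and figures below are purely hypothetical, invented for illustration:

```python
from datetime import datetime

# Hypothetical per-item sales records (illustrative data only)
sales = [
    {"item": "T-shirt", "time": datetime(2024, 3, 1, 9, 30), "customer_age": 25},
    {"item": "Dress",   "time": datetime(2024, 3, 1, 11, 0), "customer_age": 34},
    {"item": "Jacket",  "time": datetime(2024, 3, 2, 15, 45), "customer_age": 41},
]

# Data used "for what they are": a direct query,
# e.g. the list of sales for the morning of March 1st.
morning_sales = [
    s["item"] for s in sales
    if s["time"].date() == datetime(2024, 3, 1).date() and s["time"].hour < 12
]

# Data used for what they allow us to infer: here, a crude aggregate
# linking a customer characteristic to behavior (a real system would
# use statistical methods or machine learning instead).
avg_age = sum(s["customer_age"] for s in sales) / len(sales)

print(morning_sales)
print(round(avg_age, 1))
```

The first computation answers a question the data were collected for; the second extracts information the data were not explicitly collected to provide.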

Example: initial draft

We want to integrate an RFID transponder into every garment in a garment distribution chain.

By doing so, we speed up inventory, because RFID tags can be read easily by a hand-held
reader. We have evaluated how this inventory acceleration can create value.

Necessary tools

For this project, we need several things. In particular, we need a lot of RFID tags. We also need
to change the checkout systems so that they can read these RFID tags instead of barcodes.
Finally, we also need portable RFID readers. All this is absolutely necessary for the project we
have in mind.

Create more value

But can we use this infrastructure, this investment, to create value differently?

We're starting from our original plan - to accelerate inventory. Once we put RFID chips on all
the clothes, we can use them as an anti-theft system. We can add an RFID reader to the fitting
rooms and know which garments are being tried on.

What would be the benefit of this application?

So far, we knew which clothes were the most and least sold. With the RFID reader in the fitting rooms, we can also find out which clothes are tried on the most. A particularly interesting category is clothing that is often tried on but rarely bought: for these items, something can probably be done during the season - without having to sell out massively at the end of the season.
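As a rough sketch, this "tried on but rarely bought" category could be detected as follows. The counts and the 20% threshold are hypothetical, chosen only for illustration:

```python
# Hypothetical counts from fitting-room RFID readers and checkout data
try_ons = {"red dress": 120, "blue jeans": 80, "green shirt": 15}
bought  = {"red dress": 6,   "blue jeans": 40, "green shirt": 12}

# A low purchase/try-on ratio flags garments that attract customers
# in the fitting room but fail at checkout (fit? price? cut?).
# The 20% threshold below is arbitrary.
to_investigate = [
    item for item in try_ons
    if bought[item] / try_ons[item] < 0.2
]
print(to_investigate)  # ['red dress']
```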

Let's go back to our diagram. Other ideas are possible. Why not a "social shopping" application?
It's like pairing shopping and social networks.

If I'm interested in a dress, I can read its RFID chip through my smartphone's reader. And
publish on social networks my purchase project. I will be able to collect opinions and advice
from my contacts. We can also test different layouts in the store, with the RFID chip allowing us
to precisely identify the item sold and know where it was in the store. We could also imagine
uses for the RFID chips after the sale. Be careful in this case to respect the customer's privacy...

Impact on ROI
Of course, I'm not claiming that these ideas are good ideas... However, in order to discuss their
value, one must at least have had these ideas. To have them, you have to go beyond the scope
of the initial project and work with other departments in the company!

These additional ideas have, of course, an impact on the project's Return On Investment. Indeed, all these ideas have little impact on the investment. On the other hand, each additional idea brings an additional return. Thus, the ROI of the initial project might be insufficient to implement it, but once enriched with additional ideas, the ROI justifies launching the project.
Thus, beyond the overall value created, working more transversally can make possible projects
that would not otherwise have been accepted.
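This ROI effect can be made concrete with a small worked example. All figures below are invented for illustration, not taken from the video:

```python
# Invented figures: initial RFID project over its full horizon
investment = 1_000_000        # tags, portable readers, checkout changes
gain_inventory = 900_000      # value created by faster inventory alone

roi_initial = (gain_inventory - investment) / investment
print(f"{roi_initial:.0%}")   # negative: insufficient on its own

# Additional ideas reuse the same infrastructure:
# little extra investment, but each brings an extra return.
extra_gains = {"anti-theft": 200_000, "fitting-room analytics": 150_000}
extra_cost = 50_000           # e.g. RFID readers in the fitting rooms

total_cost = investment + extra_cost
roi_enriched = (gain_inventory + sum(extra_gains.values()) - total_cost) / total_cost
print(f"{roi_enriched:.0%}")  # positive: the enriched project becomes viable
```

The initial project destroys value on paper; adding two ideas that share the same infrastructure flips the ROI and justifies launching it.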

Conclusion
In this video, we have highlighted the fact that information is most often under-exploited. That
it’s important to broaden the scope of a project to reduce costs and increase the value creation
from a digital project.

Video transcription

INFORMATION SYSTEM
Information technologies do not create value in themselves. It’s their use that may create
value. So something else is needed with information technologies. Not technologies alone.

This is where the concept of "information system" plays a crucial role. To understand this
concept, let us consider three definitions.

Definitions
● The first tells us that an information system is an ​"Organized set of resources:
hardware, software, staff, data, procedures, etc... that enables information (in the
form of data, text, images, sounds, and so on) to be acquired, processed, stored
within and between organizations.”

This definition lists the elements of an Information System: hardware (computers, smartphones, network, printers...), software, and data of course - with these three elements we have the computing system. But this definition also includes staff - someone has to use this hardware and software! And it also includes procedures, the way we are organized.

This is a key point:​ an information system is not just the computing system​.

This definition also emphasizes that our information system is not limited to the perimeter of
our organization. In the context of the extended enterprise, for example, information systems
between ordering parties and subcontractors can connect, or even integrate, to maximize value
creation along the value chain.

Here is a second definition, which takes a fundamental angle.

● It tells us that an information system is a "set of social actors that memorize and
transform representations via information technologies and operating modes.”

We find information technology. We find organization - here in the form of "operating procedures". We find the people - via social actors. The central idea of this definition is the notion of representation. An information system only manipulates representations of the
world. This must always be kept in mind. These representations can be erroneous, incomplete,
not completely up to date... at the very least, they are a simplified image of reality. And you are the one
who knows what needs to be included in these representations to allow you to do your job.
This is a business issue, not a technological issue.

● The third and final definition states that an information system is "a set of interrelated
components that collect (or retrieve) information, process, store, and disseminate
information to aid decision making, coordination, and control within an organization."

What is important about this definition is that it adds a purpose to the information system. An
Information System is there to "assist in decision making, coordination, and control within an
organization." Thus, two companies, even in the same sector, with different strategies, will lead
to different information systems because the decisions to be made will be different.

The 4 dimensions of the IS


If we summarize, an information system consists of:

● Information technology,
● People (this is where we find issues of competences, for example),
● And organization.

When analyzing a situation, it’s important to consider all three of these components in order to
make a diagnosis and to find solutions. It’s still all too common to see managers rushing
towards a purely technological solution whereas it is a question of organization, for example.

Finally, the information system is oriented by the company's objectives. It’s also important to
ensure that the information system is always aligned with the strategy.

IS in companies: a complex pyramid

In a company, the deployment of an information system must deal with many constraints.

In the hierarchical pyramid, there are several levels of decision making, to which correspond several levels of abstraction of information:

● Field information;
● Information for monitoring and decision support;
● And strategic decision dashboards.

At each level of the pyramid, the situation details are a little more abstracted. This pyramid is broken down into functional silos: HR, Marketing, Finance, etc. In each functional silo, the relevant information is not the same. Of course, in all of this, we have processes, which can be transversal to several functions and concern several decision-making levels.

The information system must guarantee that each stakeholder has access to all the necessary
information. And no more than is needed, at the right level of abstraction, and in respect of
access rights (for organizational reasons or because of legal constraints).
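The principle of "all the necessary information, and no more" can be sketched as a minimal access-control check. The roles and resources here are hypothetical, just to make the idea concrete:

```python
# Hypothetical mapping: each role sees only what it needs
ACCESS = {
    "hr_manager":    {"payroll", "staff_records"},
    "sales_analyst": {"sales_figures", "customer_stats"},
    "executive":     {"strategic_dashboard"},
}

def can_read(role: str, resource: str) -> bool:
    """Grant access only to resources explicitly listed for the role."""
    return resource in ACCESS.get(role, set())

print(can_read("hr_manager", "payroll"))     # True
print(can_read("sales_analyst", "payroll"))  # False
```

Real information systems implement this idea at scale (role-based access control, directory services), but the contract is the same: access is granted by explicit rule, and denied by default.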

Finally, to make sure that it’s not too simple, the information system can be linked to other
systems (of suppliers, customers, public administration, etc.).

Conclusion
In this video, we have briefly introduced the concept of information systems. It’s a key notion in
the analysis and management of value creation from information.

Bibliography in English

★ “Management Information Systems: Managing the Digital Firm”, 16th edition, by Kenneth Laudon and Jane Laudon, Pearson, 2020.

Bibliography in French

★ “Management des systèmes d’information”, 16e édition, par Kenneth Laudon, Jane Laudon,
Eric Fimbel et Serge Costa, éditions Pearson France, 2020.

Video transcription

INFORMATION TECHNOLOGIES AND ACCELERATION
Information technology is changing the relationship to time and distance. This video explores
the relationship of information technology to time and space and its effects on information
processing.

IT Changes the Experience of Time and Space


First of all, the distances are reduced:

It becomes possible to have a meeting with a person thousands of miles away with a video
conferencing tool (from professional installations to individual software such as Skype or
Google Hangouts for example).

It will even be possible to work together, at the same time, on the same document, thanks to
document sharing tools.

The same applies to the durations:

Sending and receiving a document by email is almost instantaneous, where at least one day
would have been necessary by traditional mail. Video conferencing, once again, avoids travel
time.

New parallelisms:

Information technologies also offer the possibility of new temporal parallelisms. An individual can be in a videoconference, while looking at his emails, while being in a written conversation via an instant messaging tool with another person. Within the same hour, he can handle business concerning different geographical sites, as long as he has access to the files specific to these different sites. He can work while on the move…

These phenomena:

● Reduction of distances,
● Reduction of the durations,
● And increased parallelism,

lead to the perception of an accelerated society, of a constant feeling of urgency, particularly felt in the workplace.

Decision-making is changing

Information technology has a direct impact on decision-making.

With the rise of Big Data, in particular, business analytics methods are being put forward. And
we can see information technology being used intensely and widely for decision making
support. As a result, decision making may be faster, but this is not systematically the case.

Most often, it’s more rational because it’s based on extensive and advanced data processing.
This processing is faster than before, but since it may be systematically required before a
decision is taken, it may ultimately lengthen the decision-making process.

More difficult disconnection

The development of the Internet of Things makes it more difficult to disconnect.

When information processing tools were computers - even laptops - and smartphones, it was still technically possible to switch them off.

Now that information processing is spreading everywhere: in cars, in factory machines, in the
wallet - via passports or credit cards - or even in a domotics system at home... it becomes
difficult or even impossible to disconnect from the informational world.

Worlds at different speeds?

Moreover, one may wonder whether humans are indeed accelerated by these technologies, or
whether they are rather excluded from a faster world of computer processing and direct
connections between computers.

Human impact

Technologies and their relationship to time do of course have a direct impact on humans.

This impact can be positive:

● Bringing more speed,
● Efficiency,
● Fulfillment (allowing a more diverse experience than ever before),
● And creativity (especially by automating less creative tasks).

But this impact can also be negative on humans:

● By making one feel like a failure if he or she can't keep up with the accelerated pace of
work, for example.
● This can lead to being overworked,
● Or even lead to cases of burn-out.
● Or, on the contrary, cases of "bore-out", that is, cases of extreme boredom - especially in functions where the human operator no longer has (or will no longer have) real control over what the computer does.

Digital divide

Additionally, one should not believe that the acceleration driven by information technology is homogeneous throughout the world! The world has never been so connected, but the digital divide has never been wider! Thus, if these technologies allow connections between individuals all over the world, they also exclude certain individuals much more than before.

Everything goes faster, except forgetting

Finally, if information technologies have brought acceleration, and more speed, they have also
brought greater persistence, even a shared memory. Digital memory does not have the gradual
erasing properties of human memory.

With digital memory, everything that is stored remains stored with the same sharpness. When a search engine lists about twenty links on the first page of the results for a
query, the date when the item was stored has no impact. Each element has the same
"sharpness" and each element is found with the same ease - regardless of when it was stored.
In a way, it's as if these elements - located at different positions in the depth of time - were all projected onto the query's timeline. At the time of the query, the time dimension is annihilated.
Old or recent digital memories, they are all there, on the same plane. This infallible memory
raises new questions - such as the right to be forgotten.

Conclusion
In this video, we have sketched out the new relationships to time, space, and memory that are
being shaped by information technology. These transformations are profound and have a
long-lasting impact on both management and society. They are also much more complex and
contrasted than they seem at first glance.

Bibliography in English

★ “Social Acceleration: A New Theory of Modernity”, by Hartmut Rosa (translated by Jonathan Trejo-Mathys), 2013, Columbia University Press.

Bibliography in French

★ “@la recherche du temps - Individus hyperconnectés, société accélérée : tensions et transformations”, ouvrage collectif sous la direction de Nicole Aubert, éditions érès, 2018 (avec la collaboration de Jean-Philippe Bouilloud, Isabelle Fortier, Yannick Meiller et Elisabeth Tissier-Desbordes).

★ En particulier, dans cet ouvrage, le chapitre “Technologies de l’information, temps et espace : nouvelle topographie du monde informationnel et nouvelles relations au monde réel.”, par Yannick Meiller.

★ “Accélération. Une critique sociale du temps”, par Hartmut Rosa, 2010, éditions La Découverte.

Video transcription

INFORMATION SECURITY, AN
UNDERESTIMATED NECESSITY
Information security is an essential aspect of the digital transformation, which is still often
insufficiently taken into account - both in large groups and in SMEs.

An inseparable duo
The digital transformation consists of an inseparable duo.

On the one hand, the creation of value from information, of course! And everyone who talks about digital transformation talks about value creation. On the other hand, the protection of the value thus created from the information. Not talking about this is madness! It's like pushing people to create value without protecting it. If you do that, you create value, but for others! Others, who will plunder the value created.

A real threat
The cyber threat is real.

Every day there are proven attacks… They can be targeted - against a particular company - or on a global scale - such as the WannaCry attack, which affected numerous companies in 150 countries.

A successful attack can slow down or prevent your activity - by blocking your servers, for example. It can lead to financial losses, such as when you are asked to pay a ransom to make your data accessible again, or when the attack makes you believe that a money transfer is legitimate (so that you authorize it) when in fact it is not.

A cyber-attack can result in severe image damage, and can cause market share loss or the loss of tenders. Finally, cybersecurity is also a sovereignty issue, not only because of the threats to the national economy, but also because cyber-attack is now also a military instrument.

The solution is not technological
There is a lot to be said on the subject, but as managers we must bear in mind that the solution is not only technological. When we talk about information security, we're actually talking about information system security… Now the information system is a triptych, composed of information technology of course, but also the organization - the different processes in place - and finally, people. People are highlighted here because they have a crucial role to play in information security... and unfortunately, they often do not play it, or play it incorrectly.

A matter of management
This shows that cybersecurity is a matter of management.

Indeed, everyone must be an actor in information security. It's the role of the manager to mobilize his or her teams. Information security awareness among teams is essential. Again, this is not a technological issue but a management issue. Finally, an appropriate organization is needed, and the managers are the ones who design the organization.

Conclusion

Information security is central to a successful digital transformation. And you, as managers,


have an active role to play in this.

Webography in French

★ Site de l’ANSSI (Agence Nationale de la Sécurité des Systèmes d’Information) : https://www.ssi.gouv.fr/

Video transcription

THE THREE ASPECTS OF INTEGRATING A NEW TECHNOLOGICAL TOOL
The digital transformation involves the integration of new technological tools - new software,
new devices, etc... The success of this type of integration is vital for the success of the digital
transformation and must be managed carefully. Here we address an analysis grid that allows for
a better understanding of this integration and to anticipate its major stages as well as potential
difficulties.

There are three aspects to the integration of a new technological tool.

● Adoption,
● Assimilation,
● And appropriation.

We'll take them in turn.

Adoption
Adoption happens at the organizational level. The organization, your company, for example,
decides whether or not to adopt a new technology.

This decision is made based on a set of analyses - usually including a prior ROI calculation. When the decision is positive, namely, when it’s decided to adopt a particular tool,
this adoption targets a potential gain. It may be to improve staff safety, increase productivity,
reduce errors... it doesn't matter what the specific case is. Still, it’s important to keep in mind
that the adoption of a technological tool has a purpose. It’s the benefit of achieving this goal
that offsets the cost of integration and therefore justifies the decision to adopt.

Assimilation
Another aspect of integrating a new technology is assimilation.

This time, it happens at the individual level, at the level of each person involved within the
organization. In the end, it’s the individual who will use the tool - not the organization as such.
Assimilation already involves simply becoming aware that this new tool exists or will exist, that
there is something new in the toolbox. It is also a question of knowing what this new tool is for.

And it is also in the context of this assimilation that potential users are trained. This training is,
of course, essential if the tool is to be used.

Appropriation
The final aspect of integrating a new technological tool is appropriation.

Here again, we consider the individual level. This is probably even the most individual and least
organized part of integrating a new technological tool. It's regarding how the user appropriates
the tool, how he or she uses it - what he or she does with it. We might think that a tool is
designed for a specific use and, therefore, it’s that use that is implemented when the tool is
used, but that would be wrong.

An individual, when faced with a tool, will have an idea of how to use it. This idea may or may
not correspond to the anticipated uses at the time of the tool's design. A simple pencil is
designed to write or draw... but is sometimes used to hold hair in a bun... and could most
certainly be used as a weapon.

The hallmark of information technology is that it offers great flexibility of use - that is its
strength. Think of a computer: the same tool can be used to make financial projections, play
games, watch a video, or control a domotics system. When faced with a digital tool,
appropriation is, therefore, a key element.

Beware of the distance between adoption and appropriation

This leads us to the conclusion that appropriation plays a central role in the success - or failure - of the integration of a new digital tool. At the time of the tool's adoption - by the organization - a
certain use is anticipated, for which a return on investment is assessed.

But at the individual level, each user can appropriate the tool by creating a different use.
Perhaps very different. This difference, this distance between what was anticipated and what is
finally achieved, can be negative. It contributes to many failures.

But beware: it can also be positive! It’s often the users' exploration of what can be done with
the tools that creates the most value - this is where digital transformation meets the
intrapreneurial spirit.

Example: Messenger
Let's consider, as a first example, an instant messenger integration.

One of the reasons often given for its adoption is to help break down the barriers between
functions, departments, and geographical locations. This enables greater cross-functionality.
However, it’s difficult to anticipate the tool’s actual use. If employees use the tool to make
rapid progress on an issue - it is perfectly in line with the reasons for its adoption. But what if
the tool is mainly used to decide whether to meet for lunch at 12:15 or 1:00 pm? Or to organize
a small after-work squash tournament?

Note that the answer is not obvious. As a first approximation, we could say that the tool will at
best be useless from a business point of view or even contribute to a loss of productivity.
However, maybe these informal interactions, these lunches taken together, these squash
games, etc. maybe all these little things facilitated by the messenger, in the end, contribute to a
greater cohesion and better work together.

Example: Extranet
Our second example is about the implementation of an extranet.

During the adoption stage​, it’s proposed to set up a file system accessible from outside the
company. This would facilitate teleworking for all employees (whether it is linked to a choice to
stay at home at certain times, or because the individual is on a mission away from his or her
usual office). The idea is discussed. Analyses are made.

Finally, the security department issues a negative opinion - because this company handles very
sensitive information and the implementation of this extranet would pose too many security
problems. It’s nevertheless decided to have webmail - that is an electronic messaging system
that can be consulted from any web browser - as this type of tool is now standard. The project
is deployed and a particular webmail appropriation emerges.

In the evening, before leaving the office, some employees send an email to themselves, with the documents they need to continue working from home as attachments. Since these files are not accessible via an extranet - which was finally not set up - they use the webmail to consult email - and documents - from the browser on their personal computer, at home.

Apart from the fact that the email system may not have been sized for such large document
exchanges, this ​appropriation ​is an information security nightmare. An extranet would
probably have been much more secure than having documents wandering around as
attachments, copies of which probably remain in the download folders of browsers, viewed on
any computer - potentially shared and infected with viruses.

Thus, the integration of a new technological tool has three aspects: adoption (at the organizational level) and, at the individual level, assimilation and appropriation.

Conclusion
These three aspects must be carefully considered. In particular, one must be aware of a
potential distance between the reasons for adoption and the result of appropriation - for better
or for worse.

Video transcription

ADVANCED TECHNOLOGIES POSE NEW MANAGERIAL QUESTIONS
Advanced technologies - particularly those based on artificial intelligence - are raising new
management issues. Of course, they are the result of technological advances, but in order to
make the most of them, the challenges that accompany these new tools are management
issues.

Malfunction

A tool can be dysfunctional. Three cases are possible:

● Either the tool works properly,
● Or the tool does not work at all,
● Or the tool malfunctions.

When a tool does not work or malfunctions, it’s repaired or replaced.

Artificial intelligence
An AI can make mistakes

Now, let's consider the case of Artificial Intelligence.

Its results can be good. Hopefully, this is the case most of the time!

Its results can be bad, for example, because - in the case of machine learning - the experience
acquired is unsuitable, or irrelevant, for the situation under consideration.

Its results can be bad because the data submitted to it are bad.

In all these cases, artificial intelligence can be fully functional. Yes, it makes mistakes, but without any bug being involved.

AI: tool or non-human colleague?

This raises the question of the status of AI. Is it a tool or a non-human colleague?

A functional tool only produces correct results - otherwise, it’s dysfunctional. A functional AI
can make mistakes... just like a colleague! A trainee or junior has less experience than a
colleague with 20 years in the company! When a trainee makes a mistake, we don't say he or
she is dysfunctional!

We see that he or she lacks experience, and we try to improve that experience. From this perspective, an AI is more like a non-human colleague than what is usually called a tool.

Need for trust


As a result, the integration of an AI's suggestions into the decision-making process is not a
straightforward matter. If the AI makes a ​counterintuitive suggestion​, what does the human
agent do?

● He can say "Wow! We did well to invest 2 M€ in this system: I would never have
thought of the solution it proposes!”
● On the contrary, he can say "Pffff! They wasted 2M€ by buying a buggy system! What
he's proposing doesn't make sense - nobody has ever done that before!”

Note that if you only agree to implement suggestions from the AI that correspond to what you
would have decided alone, there's no point in investing in an AI! Conversely, since the AI can
make mistakes, you can't always apply AI suggestions without questioning them.

The question is, therefore, to figure out when a human agent can trust what the AI proposes to
him or her. Several factors may come into play - research is being conducted on these topics.

● Of course, the ​degree of surprise plays an important role. If the AI's suggestion is what
you would have thought of on your own, it’s not difficult to trust that suggestion (at
least not more difficult than trusting your own decisions).
● The ​argumentation​, the explanation of the suggestion is important. Here, the type of
approach used in AI will have a substantial impact. Approaches based on automated
logical deduction can demonstrate their results, they can show the logical mechanisms
and logical rules at play. On the contrary, a machine learning system cannot explain its proposal, because it proposes it on the basis of the experience it has acquired. It's a bit
like a colleague saying "trust me, I have 20 years of experience in this job".
● The (more or less detailed) ​understanding of the AI algorithm can be a plus. For
example, one can roughly understand how a machine learning algorithm works, on
which data it has been trained. Of course, its decision is not explained, but we know
what the underlying mechanisms that led to it are. These questions of understanding
the algorithm and explaining the proposition are at the heart of what is called
"algorithm transparency".
● Of course, the stakes of the decision ​are important. If it's a matter of deciding on a 2%
discount on green T-shirts all next week, we can more easily risk the error than if it's a
matter of deciding whether or not to cut off a patient's left leg.
● The ​AI track-record​. Has it made many mistakes in the past? Does it often make
surprising suggestions? Were these cases pleasant surprises or disasters?
● Of course, there may be other criteria. For example, related to characteristics of the
human agent.
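Purely as an illustration, some of these factors could be combined into an explicit review policy. The thresholds, labels, and factor choices below are hypothetical, not an established method:

```python
# Illustrative sketch only: turning a few of the factors above
# (stakes, surprise, track record) into an explicit review policy.
def review_policy(stakes: str, surprising: bool, past_error_rate: float) -> str:
    """Decide how much human scrutiny an AI suggestion should get."""
    if stakes == "high":
        return "human decision required"      # e.g. medical decisions
    if surprising or past_error_rate > 0.10:
        return "human review recommended"     # counterintuitive, or shaky track record
    return "apply with routine monitoring"    # low stakes, trusted system

print(review_policy("low", False, 0.02))   # apply with routine monitoring
print(review_policy("high", False, 0.0))   # human decision required
```

The point is not the particular rules, but that such a policy makes the organization's trust criteria explicit and discussable - a managerial decision, not a technical one.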

AI Integration Management
Thus, the integration of AI must be managed. Part of the issues must be dealt with by
management. They are not the responsibility of the technical teams.

● For example, decision processes must be reworked. How much autonomy should be left
to an AI? How are its suggestions used in the decision-making process?
● How is the use of an AI integrated into the teams’ organization? Should an AI and a human agent be paired (a bit like a trainee paired with his or her training supervisor)?
What skills - including technical skills - should be included in the team?

There is a risk of forgetting the responsibilities of the decision-maker. The temptation to hide
behind the machine can be great. In the end, it’s the manager who is responsible for the
consequences - positive or negative - of the decision taken. These issues are indeed
management issues.

Conclusion
AI questions the nature of a tool and profoundly modifies the role of technological aid in
decision-making. Management must take these questions into consideration and provide
appropriate managerial answers. Solutions will not come from technology.
