10. 2 billion cameras
connected to AI
• (NVIDIA said that by 2025 there will
be 2 billion video cameras in cities
around the world, and the company wants to connect those to gather data about traffic flow, crime and so on)
11. 25% of companies will dedicate
workers to oversee AI systems
• (Gartner said, “AI continues to drive change in how businesses and governments interact with customers and constituents”. AI is already heavily used in spotting breast cancer, allowing it to be detected a year earlier than it would normally be diagnosed)
12. AI Market will be over
$60 billion in 2025
• (Research company Tractica says the
AI market will skyrocket from $1.4
billion in 2016 to nearly $60 billion
in 2025, and markets related to AI
will grow as well. For instance,
Machine Learning as a Service will
be worth $19.9 billion by the same
year)
13. Smart Voice Assistants will be
in 15 billion devices in 2025.
• (RBC Capital Markets predicts Alexa
alone could be worth $20 billion by 2025, both from sales and spending
on the Amazon AI platform)
14. AI in driverless cars could save 300,000
lives in America each decade
• (NVIDIA, Alphabet’s Waymo and
Tesla are already using computer
vision and deep learning
technologies to allow vehicles to
both react to what they see and learn
from their experiences. We’ll see a huge decline in the number of traffic fatalities on the road)
15. 30% of companies could have
their revenues reduced
• (Gartner predicts that companies
that perform software services right
now could become irrelevant, or at least less useful, once machines start
figuring out how to do things better
and faster than they can. AI will
cannibalize revenues from some
current market leaders)
16. 38% of jobs in the U.S. could be
vulnerable to AI by the early 2030s.
• (PricewaterhouseCoopers says that the most at-risk jobs are in transportation and storage, manufacturing, and wholesale and retail)
35. But a whole lot
of them…
• This simple model has:
• 8 inputs
• 3 hidden layers, each having 9 neurons
• 4 outputs
• 270 connections, and thus 270 weights
• 27 bias values
• In total 297 parameters…
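As a quick check of that count, here is a minimal sketch in Python; the 8-9-9-9-4 layer sizes come from the slide, and it assumes biases only on the hidden layers, which is what the slide's totals imply.

```python
# Count weights and biases for a fully connected 8-9-9-9-4 network.
# Assumption (to match the slide's totals): only the hidden layers get a bias.
layer_sizes = [8, 9, 9, 9, 4]           # inputs, 3 hidden layers, outputs

weights = sum(a * b for a, b in zip(layer_sizes, layer_sizes[1:]))
biases = sum(layer_sizes[1:-1])          # 9 + 9 + 9 = 27

print(weights)             # 8*9 + 9*9 + 9*9 + 9*4 = 270
print(biases)              # 27
print(weights + biases)    # 297 parameters in total
```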
36. What???
• Input: the numbers coming from the previous layer (or from your data)
• Bias: just a number added to the weighted sum
• Activation Function: a simple
math function to apply
• Next Layer: do it all over
again, or use it as output
$S = \sum_{i=0}^{n} w_i x_i + b$
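A minimal sketch of that weighted-sum step in Python; the input, weight and bias values are made up for illustration, and the sigmoid is just one possible activation function.

```python
import math

# One neuron: weighted sum of the inputs, plus a bias, then an activation function.
inputs = [0.5, -1.2, 3.0]        # made-up numbers coming from the previous layer
weights = [0.4, 0.1, -0.7]       # one weight per input
bias = 0.2                       # just a number to add

s = sum(w * x for w, x in zip(weights, inputs)) + bias

# Apply a simple activation function (sigmoid here) and pass the result on.
output = 1 / (1 + math.exp(-s))
print(output)
```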
37. Activation functions
• Binary step: 0 if x < 0, 1 if x ≥ 0
• Linear activation: f(x) = x
• Non-linear activation: f(x) = sin(x)
• ReLU (Rectified Linear Unit): f(x) = max(0, x)
• Parametric ReLU: f(x) = max(ax, x)
• And so on…
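A minimal sketch of these activation functions in Python; the slope a in the parametric ReLU is an arbitrary example value.

```python
import math

def binary_step(x):
    return 0 if x < 0 else 1

def linear(x):
    return x

def nonlinear_example(x):      # sin(x) as one example of a non-linear activation
    return math.sin(x)

def relu(x):
    return max(0, x)

def parametric_relu(x, a=0.01):  # a is a small, tunable slope for negative inputs
    return max(a * x, x)

for f in (binary_step, linear, nonlinear_example, relu, parametric_relu):
    print(f.__name__, f(-2.0), f(3.0))
```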
38. Workflow…
(part 1)
• Create a model
• Set up number of input neurons
• Define number of layers
• Define number of neurons per layer
• Define number of output neurons
• Define initial weights / activation functions
• Define optimization method (more on this later)
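A minimal sketch of this set-up step, assuming the Keras API as the tool of choice; the 8-9-9-9-4 layer sizes mirror the earlier example, and the activation functions and optimizer are arbitrary picks.

```python
# Sketch of the "create a model" step, using Keras as one possible tool.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(8,)),                # number of input neurons
    layers.Dense(9, activation="relu"),     # hidden layer 1
    layers.Dense(9, activation="relu"),     # hidden layer 2
    layers.Dense(9, activation="relu"),     # hidden layer 3
    layers.Dense(4, activation="softmax"),  # number of output neurons
])

# Define the optimization method (more on this later).
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
```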
39. Workflow…
(part 2)
• Feed data into system
• Run through all layers
• Check outcomes against known results
• Tweak weights in neurons
• Do it all over again
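A minimal sketch of that loop, again assuming Keras; the model is a smaller version of the earlier one and the training data is random, purely for illustration.

```python
# Sketch of the training loop: feed data in, compare with known results,
# let the optimizer tweak the weights, and do it all over again each epoch.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(8,)),
    layers.Dense(9, activation="relu"),
    layers.Dense(4, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])

# Made-up data: 1000 samples with 8 features each, and a known label (0-3) per sample.
x_train = np.random.rand(1000, 8)
y_train = np.random.randint(0, 4, size=1000)

# fit() runs the data through all layers, checks the outcome against the known
# labels, tweaks the weights, and repeats for each epoch.
model.fit(x_train, y_train, epochs=10, batch_size=32)
```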
45. FIND THE
RIGHT DATA
• Big data is the most-used
source
• Use sample data or pre-
selected data
• Open Data initiatives
• In the Netherlands: CBS (Statistics Netherlands) data
• Google for ML datasets and be amazed
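A minimal sketch of grabbing a ready-made sample dataset, here via scikit-learn; the Iris set is just one convenient example.

```python
# Load a small, pre-selected sample dataset instead of hunting for big data yourself.
from sklearn.datasets import load_iris

data = load_iris(as_frame=True)
df = data.frame                 # features plus the target column as a DataFrame

print(df.shape)                 # (150, 5): 150 samples, 4 features + 1 label
print(df.head())
```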
46. CLEAN UP
THE DATA
• Clean up missing values
• Substitute
• Remove rows
• Calculate medians / averages / and so on
• Remove outlier (extreme) values
• Clip data
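A minimal sketch of these clean-up steps with pandas; the column names, values and thresholds are made up for illustration.

```python
# Sketch of typical clean-up steps on a small, made-up DataFrame.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "age":    [25, np.nan, 43, 38, 150],          # one missing value, one wild outlier
    "income": [30000, 42000, np.nan, 51000, 48000],
})

# Substitute missing values with the column median (or mean, mode, and so on).
df["income"] = df["income"].fillna(df["income"].median())

# Or simply remove rows that still have missing values.
df = df.dropna(subset=["age"])

# Clip extreme values to a sensible range instead of deleting them.
df["age"] = df["age"].clip(lower=0, upper=100)

print(df)
```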
47. FIND THE
RIGHT
MODEL
• Anomaly detection
• Regression
• Clustering
• Two-class classification
• Multi-class classification
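A minimal sketch of what each of these looks like in scikit-learn; the specific estimators are common defaults, not the only options.

```python
# One commonly used scikit-learn estimator per problem type (many alternatives exist).
from sklearn.ensemble import IsolationForest, RandomForestClassifier
from sklearn.linear_model import LinearRegression, LogisticRegression
from sklearn.cluster import KMeans

models = {
    "anomaly detection":          IsolationForest(),
    "regression":                 LinearRegression(),
    "clustering":                 KMeans(n_clusters=3),
    "two-class classification":   LogisticRegression(),
    "multi-class classification": RandomForestClassifier(),
}

for task, model in models.items():
    print(task, "->", type(model).__name__)
```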
53. ChatGPT
• Chat… it’s about, uhm, chatting?
• Generative: it generates something
new
• Pre-trained: it has been trained
before
• Transformer: the model behind it all
54. Generative
• The system generates
something new.
• Other options are:
• Predictive
• Classification
• … well, we’ve talked about
those
55. Pre-Trained
• It has been trained on a large
dataset
• Training was unsupervised
• Then fine-tuned on known samples
• Other options:
• Supervised learning: labeled
data
• Unsupervised learning: used for clustering
• And others…
56. Transformer
• The model.
• Lots of jargon about stacked encoder and decoder layers, each with 2 sub-layers (attention and feed-forward) that feed into each other…
• If you really, really want to know:
[1706.03762] Attention Is All You
Need (arxiv.org)
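For a taste of what that paper describes, here is a minimal sketch of its core operation, scaled dot-product attention, in plain NumPy; the toy matrices and sizes are made up.

```python
# Scaled dot-product attention, the core building block of the Transformer:
# Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)   # how much each token attends to every other token
    weights = softmax(scores)         # turn the scores into probabilities
    return weights @ V                # weighted mix of the value vectors

# Toy example: a "sentence" of 4 tokens, each represented by an 8-dimensional vector.
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))

print(scaled_dot_product_attention(Q, K, V).shape)   # (4, 8)
```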
57. Some numbers
• Trained on 45 terabytes of text
• 175 billion parameters
• Since release, it has answered over 1 billion questions
• Backed by Microsoft for over $10 billion
• A single training run costs $12 million
• Hit 1 million users in the first 5
days….
60. How it works:
• Generate non-uniform random noise
• Feed the keywords into a reverse neural network
• Work back to see if parts of the
image fire certain regions
• Use an AI system to de-noise the
image
• Repeat de-noising until satisfied.
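A minimal sketch of that loop in practice, assuming the Hugging Face diffusers library, a publicly available Stable Diffusion checkpoint and a CUDA GPU; the prompt and step count are arbitrary examples.

```python
# Text-to-image by repeated de-noising, using the diffusers library as one example.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # an example pre-trained checkpoint
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")                  # a GPU is assumed here

# The pipeline starts from random noise, conditions on the keywords (the prompt),
# and repeats the de-noising step num_inference_steps times until an image emerges.
image = pipe("a windmill at sunset, oil painting", num_inference_steps=50).images[0]
image.save("windmill.png")
```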