1. Artificial Intelligence and Machine Learning – July 2018 1/14
Yves Caseau
National Academy of Technologies
Michelin CIO
Taking Advantage of AI
July 11th, 2018
V0.4
Slide 2
Outline
Artificial Intelligence and
Machine Learning « revolution »
A glance at the « toolbox »:
methods, protocols and assembly
A « how to guide » for corporations
How to grow « emergence » ?
1: AI “renewal” · 2: The « Toolbox » · 3: Call to action
Slide 3
AI « renewal » /
Technology academy workgroup
Spectacular Investment Acceleration
Major players and venture capital
Belief that major benefits are yet to
come
Spectacular Performance Acceleration
Image, speech recognition,
translation, …
Alpha Go, etc.
Moore’s law does not explain
everything
Workgroup questions
Revolution or evolution ?
AI Algorithms = commodity ?
« Exponential Organization » ?
Slide 4
Taking advantage of AI availability
Vaccine Manufacturing at Merck
5 Terabytes in a “Datalake”
Batch yield optimization
IHG Continental (hotels)
ultra-fine customer segmentation
Similar approach at Amadeus
Ejection fraction analysis (cardiology)
Contest prepared by doctors and cardiologists
DNN to compute a volume through image analysis
FAA
Long term delay forecast through a Bayesian
network
Taking the “avalanche effect” into account
5 years of data, 52 million flights – noisy data
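As a toy illustration of the Bayesian reasoning behind this kind of delay forecast (all probabilities are hypothetical, not FAA figures), a two-node network in plain Python:

```python
# Toy two-node Bayesian network: Weather -> Delay.
# All probabilities are illustrative, not real FAA data.

p_weather = {"good": 0.7, "bad": 0.3}          # P(Weather)
p_delay_given = {"good": 0.10, "bad": 0.45}    # P(Delay | Weather)

# Marginal probability of a delay: sum over weather states.
p_delay = sum(p_weather[w] * p_delay_given[w] for w in p_weather)

# Posterior P(Weather = bad | Delay) via Bayes' rule -- the same
# kind of inference that lets a Bayesian belief network estimate
# missing values from the observed ones.
p_bad_given_delay = p_weather["bad"] * p_delay_given["bad"] / p_delay

print(round(p_delay, 3))            # 0.205
print(round(p_bad_given_delay, 3))  # 0.659
```

A real network chains many such nodes (weather, airport congestion, upstream "avalanche" delays), but the inference step is the same.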
Slide 5
Most AI applications are built on top of a feedback loop
Iterative
Development
of AI Practice
Speed of learning
depends on
computing power
Smart Algorithms
Smart Engineering
Smart Services
Service Usage
Growing Large Datasets
Distributed Software
Engineering Practices
Management Vision
& Grit
Ease to collect
Trust & Acceptability
(Diagram: scorecard of enablers)
A: scientists, open source, startups
B-: lack of medium-sized software players
C+: risk-averse, lack of software culture
B+: market size / language
B-: GDPR, CNIL
B+: competitive access to GPU/TPU
Slide 6
Today’s Artificial Intelligence (& ML) makes an extended toolbox
(2×2 map:)
Open question vs. precise question
Few data vs. lots of data
classical
« Data Science »
methods
Rules
OR / NLP
Agents
Evolutionary Game Theory
Deep Learning
(CNN)
Semantics
(e.g., Watson)
• Rule-based and
constraint-
based (e.g.,
configuration)
• Fuzzy
boundary with
operations
research
• Most companies’
“AI use cases”
• Einstein /
TellMePlus /
Da Vinci Labs
• Moore’s Law &
Big Data
• Key role of
simulation
• Well suited to
complex
systems
• Continuous
but slow
progress
• News articles
written by
robots
• Pattern/situation
recognition
• This decade’s
inflection point
Slide 7
Some key pieces in the toolbox
l Linear / logistic
regression
l Bayesian networks
l Regularization
l (K-means) clustering
l Random Forest
l Gradient Boosting
l Support-Vector
Machines
l Neural Networks
l Ontologies
l Lexicographic tools
l Rule-based scripting
l ARMA, ARIMA, etc.
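Most of these pieces are a few lines away in open-source toolkits; as a minimal sketch of one of them, here is (K-means) clustering written out by hand on toy 1-D data (illustrative only):

```python
import random

def kmeans_1d(points, k, iters=20, seed=0):
    """Minimal K-means on 1-D data: assign each point to the
    nearest centroid, move each centroid to the mean of its
    cluster, and repeat."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # Empty clusters keep their previous centroid.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

data = [1.0, 1.2, 0.8, 9.8, 10.1, 10.4]
print(kmeans_1d(data, 2))  # two centroids near 1.0 and 10.1
```

In practice one would reach for a library implementation; the point is that the algorithm itself is a commodity, while the data and the training protocol are not.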
Slide 8
Meta-Heuristics to mix these components
l Reinforcement learning
l Transfer learning
l Natural language processing toolbox
l Large-scale Intelligent agents
communities
l Game theory to reason about
competition and cooperation
• Hybrid AI: to combine
different tools and
meta-heuristics
• Generative approaches
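As a minimal sketch of the reinforcement-learning idea (a toy corridor environment, nothing AlphaGo-scale), tabular Q-learning in plain Python:

```python
import random

# Toy tabular Q-learning: an agent on a 1-D corridor of 5 cells
# learns to walk right to reach a reward at the last cell.
N, ACTIONS = 5, (-1, +1)          # cells 0..4, move left/right
Q = {(s, a): 0.0 for s in range(N) for a in ACTIONS}
alpha, gamma, eps = 0.5, 0.9, 0.2
rng = random.Random(42)

for _ in range(500):               # training episodes
    s = 0
    while s != N - 1:
        # Epsilon-greedy action selection.
        a = rng.choice(ACTIONS) if rng.random() < eps else \
            max(ACTIONS, key=lambda x: Q[(s, x)])
        s2 = min(max(s + a, 0), N - 1)
        r = 1.0 if s2 == N - 1 else 0.0
        # Q-learning update toward reward + discounted best next value.
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, b)] for b in ACTIONS)
                              - Q[(s, a)])
        s = s2

# The greedy policy should now point right in every non-terminal cell.
policy = [max(ACTIONS, key=lambda x: Q[(s, x)]) for s in range(N - 1)]
print(policy)  # [1, 1, 1, 1]
```

The same learn-from-feedback loop, scaled up with deep networks and self-play, is what powered AlphaGo and Libratus.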
Slide 9
Cognitive Systems: Mixing various AI and
Machine Learning Techniques
Smart System
Components:
Perception / environment
Self-consciousness of
goals
Forecast and adjust
Growth through usage
Biomimicry
Develop through
reinforcement
Add layered capabilities
for resilience
Cognitive computing
“reason from a purpose” –
IBM
“systems grow by
machine learning, not by
programmatic design”
(Diagram: event-driven architecture for a smart system. Objects & sensors and the user feed a command center; modules: CEP reflexes, ACT (command center, state, react, history), THINK (decisions, AI), PLAN (execution logic, goals), REFLECT (evolutionary ML), ANALYZE (machine learning), ADAPT (reactive), LEARN (representation), VALUATION (emotions), FORECAST (anticipation); it decides from insights and connects to services, other systems and systems of systems.)
Slide 10
AI strategy starts with data collection
Data collection process
Do not forget meta-data !
Build qualified training sets
“System thinking” (loop):
collect tomorrow’s data as well as past data
Prepare the « machine vision » (& perception) revolution
by collecting images and video … as well as customer digital traces.
Data: lots of them, tagged
Algorithms: most often open-source
Integration & meta-heuristics
Training protocols
Time & resources
Skills / experience
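A minimal sketch of what a qualified, tagged training set with meta-data can look like in practice (all field names here are hypothetical, for illustration only):

```python
import random

# Keep the tag (label) AND the meta-data with every sample,
# and hold out a test split from the start.
samples = [
    {"image": f"batch_{i}.png",              # reference to raw data
     "label": "ok" if i % 3 else "defect",   # the tag
     "meta": {"line": i % 2, "date": "2018-07-11"}}  # the meta-data
    for i in range(100)
]

random.Random(0).shuffle(samples)
split = int(0.8 * len(samples))
train, test = samples[:split], samples[split:]

print(len(train), len(test))  # 80 20
```

Keeping the meta-data is what makes it possible later to detect biases (e.g. all defects coming from one production line) and to keep collecting tomorrow’s data in the same loop.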
Slide 11
Grow the success conditions for your teams
Leverage the
« technology wave »
Beware of « false positives »
Overfitting, spurious correlations, …
… and of biases in training data
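A small simulation of why « false positives » deserve caution: screen enough candidate features and one will correlate with the target by pure chance (toy data, plain Python):

```python
import random

# 200 random features, all independent of the target by construction.
rng = random.Random(0)
n = 30
target = [rng.gauss(0, 1) for _ in range(n)]

def corr(x, y):
    """Pearson correlation of two equal-length lists."""
    mx, my = sum(x) / n, sum(y) / n
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

best = max(abs(corr([rng.gauss(0, 1) for _ in range(n)], target))
           for _ in range(200))
print(round(best, 2))  # far from zero despite no real relationship
```

A held-out test set (or cross-validation) is the standard guard against both this effect and overfitting.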
Mindset: distributed and emergent innovation
Data collection / training sets
AI-friendly software environments
Lab Culture (Data Science)
Perseverance
Constant flow of software
It takes time to build skills
Slide 12
Time to act is now
Start right now with tools
that are easily available
Simple methods work
Take advantage of
« integrated/automated »
toolboxes
Einstein, Holmes,
TellMePlus, etc.
Secure access to large-scale computing
power to increase the speed of learning
(GPU & TPU)
(Matrix of AI use cases across the value chain: Research & Development, Digital Manufacturing, Deliver Product / Supply Chain, Assist Customer)
Pattern detection
Customer interaction (e.g. chatbots / smart assistants)
Operations support / information systems – digital traces – IoT
Fraud
Predictive maintenance
Quality assurance
Automation, forecast / optimization
Robotic Process Automation
Knowledge engineering
Search
Slide 13
To develop one’s situation potential (emergence)
Artificial Intelligence is not a service that you buy,
it is a practical skill that one must grow.
It takes time …
Learning curve
To develop the kind of AI that is suited to one’s business
To work within a small team with outside experts (e.g., from academia)
To organize contests with business training sets
To build a continuous improvement process
Think Platform
Large scope vision
(upstream & downstream value chain)
« Win/win » : learn to share data
Example: Today’s “stupid” chatbots
collect data that will be used to train
tomorrow’s smart assistants
Slide 14
Main take-away
These are the five domains that anyone
should start investigating without delay:
1. Smart Automation: RPA scripting tools, rule engines
2. Natural Language Processing: bots & ontologies, sentiment analysis APIs
3. Pattern recognition: Random Forests, neural nets
4. Forecasting: machine learning toolboxes / prediction APIs / Bayesian networks
5. Machine Vision: play with CNNs (TensorFlow)
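To see what a CNN layer actually computes before reaching for TensorFlow, here is a hand-written 2-D convolution (a sketch: slide a small kernel over an image and take dot products; the libraries do the same thing, vectorized, trainable, and on GPU/TPU):

```python
def conv2d(image, kernel):
    """Valid (no-padding) 2-D convolution of nested lists."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(image[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(out_w)]
            for i in range(out_h)]

# A vertical-edge detector applied to an image whose left half
# is dark (0) and right half bright (1).
img = [[0, 0, 1, 1]] * 4
edge = [[-1, 1],
        [-1, 1]]
print(conv2d(img, edge))  # strongest response at the 0 -> 1 boundary
```

A CNN stacks many such kernels, learns their weights by gradient descent, and interleaves them with non-linearities and pooling; this is the inflection point behind the image-recognition results cited earlier.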
Conclusion
Speaker notes
Hello everyone,
I am going to present the main messages of the report,
organized in three parts:
Context, relative to the President's speech
Recommendations to stakeholders / a complement to the Villani report
Advice to companies (the most original part of the report)
(1) Give other examples, with curves. The key point is that we have surpassed human performance.
ImageNet challenge: better than the 5% human error rate with ML for Google and MS
http://www.eetimes.com/document.asp?doc_id=1325712
(2) Cite key figures
Equity deals to startups in artificial intelligence — including companies applying AI solutions to verticals like healthcare, advertising, and finance as well as those developing general-purpose AI tech — increased nearly 6x, from roughly 70 in 2011 to nearly 400 in 2015.
https://www.cbinsights.com/blog/artificial-intelligence-startup-funding-trends/
GAFIM: Google, Amazon, Facebook, IBM, Microsoft: billions of R&D dollars over a few years
(3) Answer
Yes, it is a revolution, because it is a disruption => immediate and future effects. Caution: the field of AI is vast, and maturity is uneven.
Algorithms are commodities, data less so, and training protocols not at all!
Wait for our report … I come back to this in the conclusion.
The last example, the one that struck me most, is Merck's, since it applies to the optimization of industrial production processes, a topic of professional interest to me. « The manufacturing team used data science to conduct a large-scale analysis to integrate and analyze 5 terabytes of data using 15 billion calculations and more than 5.5 million batch-to-batch comparisons. They then created a “heat map” showing data clusters associated with high and low yields. Experts could look at the heat map, recommend changes, rework predictive models, and then run more analyses. … Merck uses a data lake for the petabytes of data its manufacturing plants generate. The data come in all formats, combining both in-house and outside data sets that extend backward up the production chain all the way to suppliers of raw materials. … In December 2016, it christened its first plant-wide analytics system in Singapore. A single dashboard will display real-time data flowing in from every part of the plant—manufacturing, tablet production, packaging, quality, warehousing, shipping, and so on.” From the CIO's point of view, the main difference with previous tools is having moved from a reactive to a proactive approach: « We want to look at the data now and not wait until we have a problem ».
A second interesting example is the IHG (Intercontinental) hotel chain, which used very large volumes of data to completely revisit its customer segmentation, producing tens of thousands of profiles: "We concluded that advanced computation could identify more hidden relationships between customer attributes and likelihood-to-respond than is possible with… traditional modeling methods".
Among these examples, I first noted a use of machine learning to evaluate a flow (the ejection fraction) from video images: « Within three months, many of the teams had devised algorithms that enabled computers to read MRI cross sections as quickly as they are taken. The machines learned to find the specific image that shows the heart in its totally relaxed state (full of blood) and another in its totally contracted state (during pumping). They then compared the two and calculated the ejection fraction.” What is interesting here is that, while the data were prepared and annotated by experts in the field, the team that produced the best algorithm knew nothing about cardiology: « The remarkable aspect about the winning team was that neither teammate knew anything about cardiology before the competition. Never before have organizations had at their disposal the global pool of talent to tackle the most complex problems of our time—including problems in fields of knowledge that data scientists know nothing about.”
The example of the FAA, which chose to analyze a massive volume of flight data globally, is illustrative: « At the FAA, the team applied its computing horsepower to a data sample of 52 million flights over five years. The sample included 5.25 million rows of data. The computations were even more complicated than anticipated because the data were not clean; the Bayesian belief network was needed because it can estimate missing values amid all that complexity”. I cite this example because I too often hear that machine learning or artificial intelligence can only be used on cleaned, exact data. That is not the case: we have long known how to apply learning methods to noisy data, but one must be aware of the noise! What causes problems is wrong data that we believe to be correct.
An intensional definition => a complex subject; a definition by extension leads to the toolbox
To sort things out, two axes: (a) precise / open question, (b) few data / lots of data
Panorama
1: broad boundaries => debatable conceptually but practical operationally
Cf. the La Tribune article by Bruno Maisonnier
https://www.latribune.fr/technos-medias/bruno-maisonnier-qualifier-d-intelligence-le-couple-deep-learning-et-reseaux-de-neurones-est-une-usurpation-773596.html
In the long run, hybridization is needed – cf. the TODAI Robot
One could go further down and find common elements (local optimization, gradient descent, etc.)
Add a photo of Sundar Pichai and Google Agent (ML and ontologies)
Reinforcement learning => in AlphaGo, but also in Libratus from Carnegie Mellon (poker)
Transfer learning: a modular view of deep networks. Example at Amadeus: learn with a complete context, then use with a more limited context
Constant progress: semantic-network axis + perception axis => coupled systems
Moore's law as well: see Yves Demazeau / 1 million agents; see also Cosmotech: modeling of complex systems
AXELROD
Todai Robot / also Andreessen Horowitz in 2017
This picture is taken from my blog post about biomimicry
Template for event-driven architecture for smart systems such as smart home
Need to find three stories about this picture
This is a fractal design: what you see here is a component!
Same capabilities at every scale. Even simple objects (smart sensors) have perception / goals / computing capabilities. Small systems are smart so as to generate fewer, higher-value events
(2) Smart systems are grown by machine learning! Reinforce what gets used! Similarity with muscles and bio-engineering
Layered architecture: dumb functions for dumb tasks, to reinforce resilience
Avoid the NEST catastrophe … or the Bill Gates house syndrome => PC down, no lights
(3) Quote from IBM's John Kelly … but also Kevin Kelly
Favorite quote by Kevin Kelly (20 years ago) :
« Investing machines with the ability to adapt on their own, to evolve in their own directions, and grow without human oversight is the next great advance in technology. Giving machines freedom is the only way we can have intelligent control. »
Importance of teleonomy in system design – cf. IBM
The experts' universal message: start with the data
Cf. Google's valuation of acquisitions
Pierre Haren: IP value = (annotated) test sets and training protocols
(2) McKinsey: tangible and significant results, but only 20% practice commercially, 40% are experimenting
(3) Example of the lambda architecture, a fundamental since 2010 / not really practiced
Benefit from the outside => create the conditions inside
Many successes with simple methods, lots of data and a great deal of know-how
Not AI-as-a-Service! On the contrary, strongly domain-dependent
4 examples:
Ford: massive collection of events from instrumented cars to capture insights
Measurement of the "ejection fraction" on echocardiography video images / obtained through a contest by non-specialists (on data produced by specialists/doctors)
Manufacturing process at Merck: massive collection of all parameters, enriched with environment data plus data from equipment suppliers (similar to GE)
"Segment of one" segmentation (or nearly) to build the loyalty program at IHG (Intercontinental Hotel Group) / a long-time big-data user / the new approach (algorithms & machines & massive data) gives much better results
A practical application of the preceding concepts for insurance:
- a matrix: customer processes / time axis x, from back office to customer contact
- topics everywhere! Varying levels of maturity
- from dark blue to light blue
https://en.wikipedia.org/wiki/Robotic_process_automation
Already quite a few applications
Large insurance companies
Fukoku Mutual Life Insurance http://mainichi.jp/english/articles/20161230/p2a/00m/0na/005000c
Alliance example: claim management
(2) A very rough diagram, given the variety and the constant evolution
So one must get into a position of constant evolution
Know how to integrate / use open source