Data Science Popup Austin: Surfing Silver Dynamic Bayesian Forecasting for Fun and Profit
Watch the talk ➟ http://bit.ly/1NJGRcb

2008 was a historic year in many ways, perhaps most prominently the election of the first African American president. But 2008 also saw an unlikely hero emerge from the record-setting presidential race: Nate Silver and his astonishingly accurate prediction of its results. More important than Nate's remarkable result, however, was the attention it drew to the potential of data and the importance of uncertainty (through Bayesian statistics). It was in that moment, with Nate's now-famous 538 blog, that our modern incarnation of data journalism was born (though, ironically, the field dates back to an attempt to predict the 1952 presidential election).

In this talk I walk through the approach that made Nate so successful in 2008, test its efficacy in predicting the early 2016 primary results, and show how these (relatively) simple concepts can be applied in novel ways to tangential fields, for fun and profit, by estimating the time to failure of industrial machines in our connected world of the IoT.

Published in: Data & Analytics
  1. DATA SCIENCE POP UP AUSTIN | Surfing Silver: Dynamic Bayesian Forecasting for Fun and Profit | Jonathan Dinu, Author and Teacher | clearspandex
  2. DATA SCIENCE POP UP AUSTIN | #datapopupaustin | April 13, 2016 | Galvanize, Austin Campus
  3. SURFING SILVER: DYNAMIC BAYESIAN FORECASTING FOR FUN AND PROFIT | Jonathan Dinu // April 13th, 2016 // @clearspandex
  4. whoami
  5. whoami
  6. Jonathan Dinu // April 13th, 2016 // @clearspandex
  7. THE 2008 ELECTION | let me tell you a little story...
  8. SPOILER ALERT... IT'S BEEN DONE BEFORE
  9. > Nate Silver > Drew Linzer > Josh Putnam > Simon Jackman
  10. ANDREW GELMAN
  11. ANDREW GELMAN (1995...)
  12. THE THEORY BEHIND THE MAGIC | Courtesy of 538 and Drew Linzer (Votamatic)
  13. CHALLENGES > Historical Predictions susceptible to Uncertainty > Sparse pre-election Poll Data > Sampling Error and House Effects Bias Polls
  14. WHAT DREW (AND NATE) DID DIFFERENTLY > State-level vs. National Polls > Online Updates as more data become available > Not All Polls are Created Equal (weights/averages) > (Probabilistic) Forecasting in addition to Estimation
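The "not all polls are created equal" idea on slide 14 can be sketched as a weighted poll average. This is a toy illustration with made-up numbers and a hypothetical decay parameter, not the weighting scheme 538 or Votamatic actually used:

```python
# Toy poll aggregation: weight each poll by sample size and recency.
# All numbers and the decay rate are illustrative, not from the talk.
polls = [
    # (candidate's share, sample size, days before election)
    (0.52, 1200, 30),
    (0.48,  400, 10),
    (0.50,  900,  3),
]

def weighted_average(polls, decay=0.05):
    """Bigger and fresher polls count more toward the estimate."""
    num = den = 0.0
    for share, n, days_out in polls:
        w = n * (1 - decay) ** days_out  # geometric decay with distance from election
        num += w * share
        den += w
    return num / den

print(round(weighted_average(polls), 4))
```

Real aggregators also adjust for house effects (a pollster's systematic lean), which this sketch omits.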
  15. DYNAMIC BAYESIAN FORECASTING | National: / State: / Forecasts: (model equations shown as images) | Not shown here: informative priors based on historical predictions
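The equation images on slide 15 did not survive extraction. As a hedged reconstruction from the Linzer paper cited in the references (notation may differ from the slides): poll results are binomial draws around a state-level support share, which decomposes into a state component and a national component, each evolving as a random walk anchored by a structural prior on election day.

```latex
% Sketch of Linzer's dynamic Bayesian model, reconstructed from the cited
% paper rather than the slide images; treat symbols as assumptions.
\begin{align*}
  \text{Polls:}     \quad & y_k \sim \mathrm{Binomial}\bigl(\pi_{i_k j_k},\, n_k\bigr) \\
  \text{State:}     \quad & \pi_{ij} = \mathrm{logit}^{-1}\bigl(\beta_{ij} + \delta_j\bigr),
                            \qquad \beta_{ij} \sim \mathcal{N}\bigl(\beta_{i,j+1},\, \sigma_\beta^2\bigr) \\
  \text{National:}  \quad & \delta_j \sim \mathcal{N}\bigl(\delta_{j+1},\, \sigma_\delta^2\bigr) \\
  \text{Forecasts:} \quad & \pi_{iJ} \text{ on election day } J,
                            \text{ anchored by an informative structural prior on } \beta_{iJ}
\end{align*}
```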
  16. SO WHY AM I TELLING YOU THIS THEN?
  17. STRUCTURED PREDICTION: SUPERVISED LEARNING ON SEQUENCES
  18. TRADITIONALLY
  19. TRADITIONALLY
  20. STATES + TIME + TRANSITIONS
  21. GRAPHICAL MODELS > Assess Risk (uncertainty) as Probability of Failure > Unobservable (hidden) Failure States > Proactive/Early Prediction > Interpretable Latent Properties > Online Algorithm (iteratively improve)
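Slides 20-21 describe states, time, transitions, and hidden failure states. A minimal sketch of that combination is a two-state hidden Markov model filtered online with the forward algorithm; all probabilities below are invented for illustration and are not from the talk:

```python
import numpy as np

# Hidden states: 0 = healthy, 1 = degraded (an unobservable failure precursor).
# Observations: 0 = normal sensor reading, 1 = anomalous reading.
# Transition, emission, and prior probabilities are made-up toy values.
A = np.array([[0.95, 0.05],   # healthy -> healthy / degraded
              [0.00, 1.00]])  # degraded is absorbing until repair
B = np.array([[0.9, 0.1],     # P(observation | healthy)
              [0.3, 0.7]])    # P(observation | degraded)
pi = np.array([0.99, 0.01])   # prior belief about the machine's state

def filter_belief(obs):
    """Forward-algorithm filtering: P(hidden state | observations so far)."""
    belief = pi * B[:, obs[0]]
    belief /= belief.sum()
    for o in obs[1:]:
        # One online update per new sensor reading: predict, then correct.
        belief = (A.T @ belief) * B[:, o]
        belief /= belief.sum()
    return belief

readings = [0, 0, 1, 1, 1]      # a run of anomalous readings
print(filter_belief(readings))  # P(degraded) rises with each anomaly
```

The same filtering step is what makes the approach proactive: risk is updated after every reading, rather than waiting for a labeled failure.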
  22. KEY IDEAS > Uncertainty > Point vs. Distribution (or confidence intervals) > Bayesian vs. Frequentist methods > Temporal variability | "All models are wrong, but some models are useful"... or something
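The "point vs. distribution" contrast on slide 22 is easiest to see in the textbook Beta-Binomial conjugate update; this is a generic sketch with toy numbers, not code from the talk:

```python
# Point estimate vs. posterior distribution for a success rate,
# via the Beta-Binomial conjugate update (toy numbers).
alpha, beta = 1.0, 1.0           # uniform Beta(1, 1) prior over the rate
successes, failures = 7, 3       # observed data

alpha += successes               # conjugate update: the posterior is
beta += failures                 # Beta(alpha + successes, beta + failures)

point = successes / (successes + failures)  # frequentist point estimate
post_mean = alpha / (alpha + beta)          # posterior mean (shrunk toward prior)
post_var = alpha * beta / ((alpha + beta) ** 2 * (alpha + beta + 1))

print(point, post_mean, post_var)
```

The posterior carries an uncertainty estimate (its variance) for free, which a bare point estimate does not.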
  23. KEY IDEAS (APPLIED)
  24. IOT IMPACT: DETECTING MACHINE FAILURES > Historical → Structural Predictions susceptible to Uncertainty (Supervised Learning) > Sparse pre-election Poll Data (costly to measure) > Sampling Error Biases Polls → Inspections (prediction in the absence of data) > Online Updates as more data become available > Not All Polls → Sensors are Created Equal (weights/averages) > (Probabilistic) Forecasting in addition to Estimation
  25.-27. REMEMBER THIS... (three build slides repeating the National: / State: / Forecasts: model equations as images)
  28. INDUSTRIAL MACHINES | http://www.citemaster.net/get/8bd1acc0-f04b-11e3-bbaf-00163e009cc7/salfner05predicting.pdf
  29. MORE INTERPRETABLE: WE HAVE TO ACTUALLY FIX THE MACHINES AFTER ALL...
  30. LATENT FACTORS
  31. CAUSALITY!
  32. REFERENCES > The Signal and the Noise > Data Journalism Handbook > Dynamic Bayesian Forecasting of Presidential Elections in the States (Drew A. Linzer) > Time for Change model (Alan Abramowitz) > Bayesian Data Analysis (Andrew Gelman) > Causality (Judea Pearl) > 538: How We Are Forecasting the 2016 Primaries > Predicting Time-to-Failure of Industrial Machines with Temporal Data Mining
  33. DATA SCIENCE POP UP AUSTIN @datapopup #datapopupaustin
