Discover what’s planned for Loud ML 1.5, the first deep learning software for predictive analytics available with a free plan. This next release will be faster on Intel architectures, and have a strong focus on horizontal scalability with cluster and multi-tenant deployment.
2. Our new Loud ML YouTube channel
https://youtu.be/FrBLXoZRc4Y
Do you want to see more videos on the channel? Let us know what you think, and what you need!
3. Overview
Loud ML 1.5 feature set in the roadmap
DISCLAIMER: The following information is being shared in order to outline some of our current product plans, but like everything else in
life, even the best laid plans get put to rest. We are hopeful that the following can shed some light on our roadmap, but it is important
to understand that it is being shared for INFORMATIONAL PURPOSES ONLY, and not as a binding commitment. Please do not rely on this
information in making purchasing decisions because ultimately, the development, release, and timing of any products, features or
functionality remains at our sole discretion, and is subject to change.
4. Overview
Loud ML 1.5 feature set in the roadmap
- Distributing the training load in a cluster
- Incremental retraining and checkpoints
- Dynamic data sources and data sinks
- JWT authentication
- Templates
- Better seasonality
- Better forecasts
5. New features to scale up your AIOps
• In release 1.5, users will be able to distribute the training load in a cluster
• Are you planning to scale horizontally?
• Do you have specific multi-tenant requirements? Let us know.
• Release 1.5 will support incremental retraining
• Automated training checkpoints will be created as new streaming data becomes available
Technology roadmap: what’s planned for our next release, Loud ML 1.5
[Diagram: New data → Train → Forecast → Reinforce]
6. Incremental retraining benefits
• When retraining, release 1.4 overwrites any existing training data with new training data. Old training data is lost.
• The solution: Loud ML 1.5 will introduce incremental training
• Users can save “checkpoints” with the training state
• Users can select the “active” checkpoint, and select the “best fit”
• Users can restore a previous checkpoint
• Do you have summer and winter patterns (e.g., cable car traffic on Mont Blanc)?
• Solution: save two checkpoints, and select one for production use – without losing information about the other checkpoint!
• Users can schedule training at regular intervals, and save the relevant checkpoints automatically
• The best part: all APIs will remain backward compatible
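The checkpoint workflow above can be modeled in miniature. This is an illustrative sketch only: the class and method names are hypothetical, not the Loud ML 1.5 API.

```python
# Minimal model of the checkpoint workflow: save, select "active",
# restore. Hypothetical names -- NOT the Loud ML 1.5 API.
import copy


class CheckpointStore:
    """Keeps named snapshots of a model's training state."""

    def __init__(self):
        self._checkpoints = {}   # name -> saved training state
        self.active = None       # checkpoint currently used in production

    def save(self, name, state):
        # Deep-copy so later training does not mutate the snapshot
        self._checkpoints[name] = copy.deepcopy(state)

    def select(self, name):
        # Choose which checkpoint serves predictions (the "best fit")
        if name not in self._checkpoints:
            raise KeyError(name)
        self.active = name

    def restore(self, name):
        # Roll the model back to a previously saved training state
        return copy.deepcopy(self._checkpoints[name])


# Summer/winter example: keep both seasonal states side by side
store = CheckpointStore()
store.save("summer", {"weights": [0.8, 0.1]})
store.save("winter", {"weights": [0.2, 0.9]})
store.select("winter")            # production uses the winter pattern
state = store.restore("summer")   # the summer state is not lost
```

The point of the deep copies is exactly the benefit the slide describes: selecting one checkpoint for production never destroys the other.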
7. Up to 8x faster
• Loud ML 1.5 will run up to 8 times faster on Intel architectures!
• It’s time to upgrade to TensorFlow version 1.11
• MKL-DNN, the Intel Math Kernel Library for Deep Neural Networks, will become our new standard for Intel architecture packages
• No ARM support yet – let us know on GitHub if you need it!
8. Sink, not source!
• A model will be able to output predictions to a “data sink”
• In release 1.4, the data sink is implicitly the same as the data source. Users have no option to override this.
• In release 1.5, users will be able to select the data sink for each model
• This will allow sophisticated chaining and easier integration with third-party alerting tools
• Your model’s data source could be InfluxDB
• Your model’s data sink could be Kapacitor
• Users will have the option to assign any name they want for output measurements
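To make the idea concrete, a model definition could name the two ends separately. The field names below (`default_datasource`, `default_datasink`) are assumptions for illustration, not the confirmed 1.5 schema:

```json
{
  "name": "cpu_forecast",
  "default_datasource": "influxdb_metrics",
  "default_datasink": "kapacitor_alerts"
}
```

Here predictions would be read from an InfluxDB source and written to a Kapacitor sink, ready for alerting, instead of being written back into the source database.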
9. Dynamic data sources and JWT
• In release 1.4, the /datasources API queries the active data sources, and this information is read-only
• This limitation is problematic in multi-tenant environments; therefore:
• In release 1.5, users will be able to dynamically add or delete data sources
• Users will be able to define JWT tokens if they want to authenticate all queries sent to the API
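For readers new to JWT, the sketch below builds and verifies an HS256 token with only the standard library. It shows the token format that "authenticate all queries" relies on; Loud ML 1.5's actual JWT handling may differ.

```python
# Conceptual JWT (HS256) signing/verification sketch, stdlib only.
import base64
import hashlib
import hmac
import json


def _b64url(data: bytes) -> str:
    # JWT uses URL-safe base64 without padding
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def make_jwt(payload: dict, secret: bytes) -> str:
    header = {"alg": "HS256", "typ": "JWT"}
    signing_input = ".".join(
        _b64url(json.dumps(part, separators=(",", ":")).encode())
        for part in (header, payload)
    )
    sig = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return f"{signing_input}.{_b64url(sig)}"


def verify_jwt(token: str, secret: bytes) -> bool:
    signing_input, _, sig = token.rpartition(".")
    expected = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(_b64url(expected), sig)


token = make_jwt({"sub": "tenant-42"}, b"shared-secret")
# A client would then send: Authorization: Bearer <token>
```

Each tenant can hold its own secret, which is why JWT pairs naturally with the multi-tenant data-source changes above.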
10. SIMD: Single Instruction, Multiple Data
• In release 1.5, users will be able to apply SIMD recipes to machine learning jobs
• Train your ML model once, using one data set
• Re-use the model for inference with other data sets, using distinct tags
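The "SIMD recipe" idea in miniature: fit one model on one data set, then apply the same fitted model to several tagged data sets. The names below are illustrative, not the Loud ML API.

```python
def train(series):
    # Toy "model": learn the mean of the training series
    return sum(series) / len(series)


def infer(model, series):
    # Flag points deviating from the trained baseline by more than 50%
    return [x for x in series if abs(x - model) > 0.5 * model]


model = train([10, 11, 9, 10])   # single instruction: train once

tagged_data = {                  # multiple data: distinct tags
    "host=web1": [10, 10, 25],
    "host=web2": [9, 11, 10],
}
anomalies = {tag: infer(model, s) for tag, s in tagged_data.items()}
```

One training run, many inference runs keyed by tag, which is what amortizes the expensive step across a fleet of similar hosts.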
11. Templates are {cool}
• If only tag values or measurement names differ between models, consider defining a template to save time
• loudml create-template template.json
• loudml delete-template <template_name>
• To define a new model, you only need to provide replacement keys and values in the template
• loudml create-model -t <template_name> [key=val list]
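A sketch of what such a `template.json` might contain. The `{measurement}` placeholder syntax and the surrounding fields are assumptions for illustration, not a confirmed 1.5 format:

```json
{
  "name": "{measurement}_model",
  "default_datasource": "influxdb_metrics",
  "features": [
    {
      "name": "avg_value",
      "metric": "avg",
      "measurement": "{measurement}",
      "field": "value"
    }
  ]
}
```

Instantiating a model would then only require the replacement values, e.g. `loudml create-model -t <template_name> measurement=cpu`.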
12. Better seasonality
Loud ML 1.4
• 1.4 handles seasonality in time series models if the user enables it during model creation
• Pros: users can switch it on if they need to
• Cons: users have to inspect the data first, then guess whether seasonality exists
Loud ML 1.5
• If you don’t know the seasonality within your time series in advance, 1.5 will have the option to detect it automatically
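One common way to auto-detect seasonality is to find the lag with the strongest autocorrelation. The sketch below illustrates that idea; it is not necessarily the detector Loud ML 1.5 will ship.

```python
import math


def autocorr(series, lag):
    # Normalized autocorrelation of the series at the given lag
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series)
    cov = sum((series[i] - mean) * (series[i + lag] - mean)
              for i in range(n - lag))
    return cov / var


def detect_period(series, max_lag):
    # The seasonal period is the lag whose autocorrelation is highest
    return max(range(2, max_lag + 1), key=lambda lag: autocorr(series, lag))


# Synthetic daily pattern repeating every 24 samples
series = [math.sin(2 * math.pi * t / 24) for t in range(240)]
period = detect_period(series, 48)
```

A detector like this removes the 1.4 burden of inspecting the data and guessing: the period falls out of the data itself.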
13. Better forecasts
Loud ML 1.4
• 1.4 is able to forecast future data points
• The future is uncertain: forecast accuracy drops over time
• Longer-range forecasts mean lower accuracy
Loud ML 1.5
• 1.5 will provide better confidence intervals via forecast simulations and distribution analysis of the simulations
• 1.5 will also provide an upper “bound” value for a maximum forecast date to deal with rising uncertainty levels. It will take the “bound” date into account, and cut the forecast if the uncertainty level is too high.
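The "forecast simulations" idea can be sketched with a Monte Carlo random walk: run many noisy forecast paths, read confidence intervals off the simulated distribution, and cut the forecast once the interval grows too wide. Illustrative only, not the 1.5 implementation.

```python
import random

random.seed(7)


def simulate_paths(last_value, horizon, n_paths=500, sigma=1.0):
    # Each path adds Gaussian noise per step, so uncertainty accumulates
    paths = []
    for _ in range(n_paths):
        value, path = last_value, []
        for _ in range(horizon):
            value += random.gauss(0, sigma)
            path.append(value)
        paths.append(path)
    return paths


def interval_width(paths, step):
    # Width of the empirical 5%..95% interval at a given forecast step
    values = sorted(p[step] for p in paths)
    return values[int(0.95 * len(values))] - values[int(0.05 * len(values))]


paths = simulate_paths(last_value=100.0, horizon=30)
widths = [interval_width(paths, s) for s in range(30)]

# The "bound": stop forecasting once the 90% interval is too wide
max_width = 10.0
usable_horizon = next((s for s, w in enumerate(widths) if w > max_width), 30)
```

The widths grow with the horizon, which is exactly why a bound date that truncates the forecast at an acceptable uncertainty level is useful.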