Using AI to Solve Data and IT
Complexity -- And Better Enable AI
A discussion on how the rising tidal wave of data must be better managed, and how new tools are
emerging to bring artificial intelligence to the rescue.
Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: Hewlett Packard
Enterprise.
Dana Gardner: Hello, and welcome to the next edition of the BriefingsDirect Voice of the
Innovator podcast series. I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host
and moderator for this ongoing discussion on the latest in IT innovation.
Our next discussion focuses on why the rising tidal wave of data must be better managed, and
how new tools are emerging to bring artificial intelligence (AI) to the rescue. Stay with us now as
we learn how the latest AI innovations improve both data and services management across a
cloud deployment continuum -- and in doing so set up an even more powerful way for
businesses to exploit AI.
To learn how AI will help conquer complexity to allow for higher abstractions of benefits from
across all sorts of data for better analysis, please join me in welcoming Rebecca Lewington,
Senior Manager of Innovation Marketing at Hewlett Packard Enterprise (HPE). Welcome to
BriefingsDirect, Rebecca.
Rebecca Lewington: Hi, Dana. It’s very nice to talk to you.
Gardner: We have been talking about massive amounts of data
for quite some time. What’s new about data buildup that requires
us to look to AI for help?
Lewington: Partly it is the sheer amount of data. IDC's Data Age Study predicts the global data
sphere will be 175 zettabytes by 2025, which is a rather large number. A zettabyte is a 1
followed by 21 zeros. But we have always been in an era of exploding data.
Yet, things are different. One, it’s not just the amount of data; it’s
the number of sources the data comes from. We are adding in
things like mobile devices, and we are connecting factories’
operational technologies to information technology (IT). There are more and more sources.
Also, the time we have to do something with that data is shrinking to the point where we expect
everything to be real-time or you are going to make a bad decision. An autonomous car, for
example, might do something bad. Or we are going to miss a market or competitive intelligence
opportunity.
So it’s not just the amount of data -- but what you need to do with it that is challenging.
Gardner: We are also at a time when AI and machine learning (ML) technologies have matured.
We can begin to turn them toward the data issue to better exploit the data. What is new and
interesting about AI and ML that make them more applicable for this data complexity issue?
Data gets smarter with AI
Lewington: A lot of the key algorithms for AI were actually invented long ago, in the 1950s, but
at that time the computers were hopeless relative to what we have today, so it wasn't possible
to harness them.
For example, you can train a deep-learning neural net to recognize pictures of kittens. To do
that, you need to run millions of images to train a working model you can deploy. That’s a huge,
computationally intensive task that only became practical a few years ago. But now that we
have hit that inflection point, things are just taking off.
Gardner: We can begin to use machines to better manage data that we can then apply to
machines. Does that change the definition of AI?
Lewington: The definition of AI is tricky. It’s malleable, depending on who you talk to. For
some people, it’s anything that a human can do. To others, it means sophisticated techniques,
like reinforcement learning and deep learning.
One useful definition is that AI is what you use when you know what the answer looks like, but
not how to get there.
Traditional analytics effectively does at scale what you could do with pencil and paper. You
could write the equations to decide where your data should live, depending on how quickly you
need to access it.
But with AI, it’s like the kittens example. You know what the answer looks like, it’s trivial for you
to look at the photograph and say, “That is a cat in the picture.” But it’s really, really difficult to
write the equations to do it. But now, it’s become relatively easy to train a black box model to do
that job for you.
Gardner: Now that we are able to train the black box, how can we apply that in a practical way
to the business problem that we discussed at the outset? What is it about AI now that helps
better manage data? What's changed that gives us better data because we are using AI?
Lewington: It’s a circular thing. The heart of
what makes AI work is good data; the right
data, in the right place, with the right
properties you can use to train a model,
which you can then feed new data into to get
results that you couldn’t get otherwise.
Now, there are many ways you can apply that. You can apply it to the trivial case of the cat we
just talked about. You can apply it to helping a surgeon review many more MRIs, for example,
by allowing him to focus on the few that are borderline, and to do the mundane stuff for him.
But, one of the other things you can do with it is use it to manipulate the data itself. So we are
using AI to make the data better -- to make AI better.
Gardner: Not only is it circular, and potentially highly reinforcing, but when we apply this to
operations in IT -- particularly complexity in hybrid cloud, multicloud, and hybrid IT -- we get an
additional benefit. You can make the IT systems more powerful when it comes to the application
of that circular capability -- of making better AI and better data management.
AI scales data upward and outward
Lewington: Oh, absolutely. I think the key word here is scale. When you think about data --
and all of the places it can be, all the formats it can be in -- you could do it yourself. If you want
to do a particular task, you could do what has traditionally been done. You can say, “Well, I
need to import the data from here to here and to spin up these clusters and install these
applications.” Those are all things you could do manually, and you can do them for one-off
things.
But once you get to a certain scale, you need to do them hundreds of times, thousands of times,
even millions of times. And you don’t have the humans to do it. It’s ridiculous. So AI gives you a
way to augment the humans you do have, to take the mundane stuff away, so they can get
straight to what they want to do, which is coming up with an answer instead of spending weeks
and months preparing to start to work out the answer.
Gardner: So AI directed at IT -- what some people call AIOps -- could be an accelerant to this
circular advantageous relationship between AI and data? And is that part of what you are doing
within the innovation and research work at HPE?
Lewington: That’s true, absolutely. The mission of Hewlett Packard Labs in this space is to
assist the rest of the company to create more powerful, more flexible, more secure, and more
efficient computing and data architectures. And for us in Labs, this tends to be a fairly specific
series of research projects that feed into the bigger picture.
For example, we are now doing the Deep Learning Cookbook, which allows customers to find
out ahead of time exactly what kind of hardware and software they are going to need to get to a
desired outcome. We are automating the experimenting process, if you will.
And, as we talked about earlier, there is the
shift to the edge. As we make more and
more decisions -- and gain more insights
there, to where the data is created -- there
is a growing need to deploy AI at the edge.
That means you need a data strategy to get
the data in the right place together with the AI algorithm, at the edge. That’s because there often
isn’t time to move that data into the cloud before making a decision and waiting for the required
action to return.
Once you begin doing that, once you start moving from a few clouds to thousands and millions
of endpoints, how do you handle multiple deployments? How do you maintain security and data
integrity across all of those devices? As researchers, we aim to answer exactly those questions.
And, further out, we are looking to move the learning phase itself to the edge, to do what we
call swarm learning, where devices learn from their environment and from each other, using a
distributed model that doesn't use a central cloud at all.
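The swarm-learning idea -- devices learning from each other with no central coordinator -- can be illustrated with a toy peer-averaging scheme. The gossip protocol below is a simplified, hypothetical stand-in, not HPE's actual swarm-learning implementation:

```python
import numpy as np

rng = np.random.default_rng(2)

# Each edge device fits a small linear model to its own local data;
# the devices then average parameters with random peers, so the
# "swarm" converges without any central cloud ever seeing the data.
true_w = np.array([1.5, -2.0])

def local_fit(n: int) -> np.ndarray:
    """One device's local training on its own n samples."""
    X = rng.normal(size=(n, 2))
    y = X @ true_w + 0.1 * rng.normal(size=n)
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

devices = [local_fit(30) for _ in range(8)]  # 8 devices, small local datasets

for _ in range(5):  # gossip rounds: each round, two random peers average
    i, j = rng.choice(len(devices), size=2, replace=False)
    avg = (devices[i] + devices[j]) / 2
    devices[i] = devices[j] = avg

swarm_w = np.mean(devices, axis=0)
print(swarm_w)  # close to the true weights, learned without a cloud
```

The design point is that only model parameters travel between peers; the raw data never leaves each device, which is what makes the approach attractive when there isn't time (or permission) to ship data to a cloud.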
Gardner: Rebecca, given your title is Innovation Marketing Lead, is there something about the
very nature of innovation that you have come to learn personally that’s different than what you
expected? How has innovation itself changed in the past several years?
Innovation takes time and space
Lewington: I began my career as a mechanical engineer.
For many years, I was offended by the term innovation
process, because that’s not how innovation works. You give
people the space and you give them the time and ideas
appear organically. You can’t have a process to have ideas.
You can have a process to put those ideas into reality, to
weed out the ones that aren't going to succeed, and to
promote the ones that work.
But the term innovation process to me is an oxymoron. And that’s the beautiful thing about
Hewlett Packard Labs. It was set up to give people the space where they can work on things
that just seem like a good idea when they pop up in their heads. They can work on these and
figure out which ones will be of use to the broader organization -- and then it’s full steam ahead.
Gardner: It seems to me that the relationship between infrastructure and AI has changed. It
wasn’t that long ago when we thought of business intelligence (BI) as an application -- above
the infrastructure. But the way you are describing the requirements of management in an edge
environment -- of being able to harness complexity across multiple clouds and the edge -- this is
much more of a function of the capability of the infrastructure, too. Is that how you are seeing it,
that only a supplier that’s deep in its infrastructure roots can solve these problems? This is not a
bolt-on benefit.
Lewington: I wouldn’t say it’s impossible as a bolt-on; it’s impossible to do efficiently and
securely as a bolt-on. One of the problems with AI is we are going to use a black box; you don’t
know how it works. There were a number of news stories recently about AIs becoming
corrupted, biased, and even racist, for example. Those kinds of problems are going to become
more common.
And so you need to know that your systems maintain their integrity and are not able to be
breached by bad actors. If you are just working on the very top layers of the software, it's going
to be very difficult to attest that the integrity of what's underneath has not been violated.
If you are someone like HPE, which has its fingers in lots of pies, either directly or through our
partners, it’s easier to make a more efficient solution.
Gardner: Is it fair to say that AI should be a new core competency, for not only data scientists
and IT operators, but pretty much anybody in business? It seems to me this is an essential core
competency across the board.
Lewington: I think that's true. Think of AI as another layer of tools that, as we go forward,
becomes increasingly sophisticated. We will add more and more tools to our AI toolbox. And
this is one set of tools that you just cannot afford not to have.
Gardner: Rebecca, it seems to me that there is virtually nothing within an enterprise that won't
be impacted in one way or another by AI.
Lewington: I think that’s true. Anywhere in our lives where there is an equation, there could be
AI. There is so much data coming from so many sources. Many things are now overwhelmed by
the amount of data, even if it’s just as
mundane as deciding what to read in the
morning or what route to take to work, let
alone how to manage my enterprise IT
infrastructure. All things that are rule-based
can be made more powerful, more flexible,
and more responsive using AI.
Gardner: Returning to the circular nature of using AI to make more data available for AI -- and
recognizing that the IT infrastructure is a big part of that -- what are you doing in your research and
development to make data services available and secure? Is there a relationship between
things like HPE OneView and HPE OneSphere and AI when it comes to efficiency and security
at scale?
Let the system deal with IT
Lewington: Those tools historically have been rules-based. We know that if a storage disk
gets to a certain percentage full, we need to spin up another disk -- those kinds of things. But to
scale flexibly, at some point that rules-based approach becomes unworkable. You want to have
the system look after itself, to identify its own problems and deal with them.
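The contrast Lewington draws -- a hand-written rule versus a system that learns its own baseline -- can be sketched in a few lines of Python. The threshold and the anomaly check below are illustrative stand-ins, not the logic of any HPE tool:

```python
import statistics

# Rules-based: a fixed threshold someone wrote down.
def needs_new_disk_rule(pct_full: float) -> bool:
    return pct_full > 90.0

# "Let the system look after itself": flag any reading that drifts far
# from this disk's own recent history, with no hand-written threshold.
# A simple statistical stand-in for the learned models in AIOps tools.
def is_anomalous(history: list[float], latest: float, n_sigmas: float = 3.0) -> bool:
    mean = statistics.fmean(history)
    sigma = statistics.pstdev(history)
    return abs(latest - mean) > n_sigmas * sigma

usage = [61.0, 62.5, 61.8, 63.0, 62.2, 61.9, 62.7, 62.1]
print(needs_new_disk_rule(88.0))  # False -- still under the fixed threshold
print(is_anomalous(usage, 88.0))  # True -- far outside this disk's own baseline
```

The fixed rule misses the sudden jump to 88 percent because it only knows one number; the baseline check catches it because it knows what *this* disk normally looks like. That per-system adaptivity is what stops the rulebook from exploding as the estate scales.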
Including AI techniques in things like HPE InfoSight, Aruba ClearPass, and network user identity
behavior software on the HPE Aruba side allows the AI algorithms to make those tools more
powerful and more efficient.
You can think of AI here as another class of analytics tools. It’s not magic, it’s just a different
and better way of doing IT analytics. The AI lets you harness more difficult datasets, more
complicated datasets, and more distributed datasets.
Gardner: If I’m an IT operator in a global 2000 enterprise, and I’m using analytics to help run
my IT systems, what should I be thinking about differently to begin using AI -- rather than just
analytics alone -- to do my job better?
Lewington: If you are that person, you don’t really want to think about the AI. You don’t want
the AI to intrude upon your consciousness. You just want the tools to do your job.
For example, I may have 1,000 people starting a factory in Azerbaijan, or somewhere, and I
need to provision for all of that. I want to be able to put on my headset and say, “Hey, computer,
set up all the stuff I need in Azerbaijan.” You don’t want to think about what’s under the hood.
Our job is to make those tools invisible and powerful.
Composable, invisible, and insightful
Gardner: That sounds a lot like composability. Is that another tangent that HPE is working on
that aligns well with AI?
Lewington: It would be difficult to have AI be part of the fabric of an enterprise without
composability, and without extending composability into more dimensions. It’s not just about
being able to define the amount of storage and computer networking with a line of code, it’s
about being able to define the amount of memory, where the data is, where the data should be,
and what format the data should be in. All of those things – from the edge to cloud – need to be
dimensions in composability.
You want everything to work behind the scenes for you in the best way with the quickest results,
with the least energy, and in the most cost-effective way possible. That’s what we want to
achieve -- invisible infrastructure.
Gardner: We have been speaking at a fairly abstract level, but let’s look to some examples to
illustrate what we’re getting at when we think about such composability sophistication.
Do you have any concrete examples or use cases within HPE that illustrate the business
practicality of what we’ve been talking about?
Lewington: Yes, we have helped a tremendous
number of customers either get started with AI in
their operations or move from pilot to volume use.
A couple of them stand out. One particular
manufacturing company makes electronic
components. They needed to improve the yields in
their production lines, and they didn’t know how to
attack the problem. We were able to partner with
them to use such things as vision systems and
photographs from their production tools to identify defects that only could be picked up by a
human if they had a whole lot of humans watching everything all of the time.
This gets back to the notion of augmenting human capabilities. Their machines produce
terabytes of data every day, and it just gets thrown away. They don't know what to do with it.
We began running some research projects with them to use some very sophisticated
techniques, visual autoencoders, that allow you, without having a training set, to characterize a
production line that is performing well versus one that is on the verge of moving away from the
sweet spot. Those techniques can fingerprint a good line and also identify when the lines go just
slightly bad. In that case, a human looking at the line would think it was working just perfectly.
This takes the idea of predictive maintenance further into what we call prescriptive maintenance,
where we have a much more sophisticated view into what represents a good line and what
represents a bad line. Those are a couple of examples from manufacturing that I think are relevant.
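The technique Lewington outlines -- fingerprint a known-good line without labels, then flag readings the fingerprint cannot explain -- can be sketched with a linear autoencoder, which is mathematically equivalent to PCA. The sensor data below is synthetic and the whole setup is illustrative, not the customer's actual system:

```python
import numpy as np

rng = np.random.default_rng(1)

# "Fingerprint" a healthy production line: correlated sensor readings
# from a line known to be good. No labels needed -- that is the point
# of the autoencoder approach.
good = rng.normal(size=(500, 2)) @ np.array([[2.0, 0.5], [0.5, 1.0]])

# A linear autoencoder learns a low-dimensional code; anomalies are
# readings that reconstruct poorly from that code.
mean = good.mean(axis=0)
_, _, vt = np.linalg.svd(good - mean, full_matrices=False)
code = vt[:1]  # keep the single strongest component as the "code"

def reconstruction_error(x: np.ndarray) -> float:
    z = (x - mean) @ code.T   # encode
    x_hat = z @ code + mean   # decode
    return float(np.sum((x - x_hat) ** 2))

# Threshold: almost all known-good readings reconstruct below this.
threshold = np.quantile([reconstruction_error(row) for row in good], 0.99)

drifted = np.array([5.0, -4.0])  # a reading the "good" fingerprint can't explain
print(reconstruction_error(drifted) > threshold)
```

A reading like `drifted` can sit well within each sensor's individual normal range -- a human watching the gauges would see nothing wrong -- yet violate the learned correlation between sensors, which is exactly the "line that looks perfect but has gone slightly bad" case.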
Gardner: If I am an IT strategist, a Chief Information Officer (CIO) or a Chief Technology Officer
(CTO), for example, and I’m looking at what HPE is doing -- perhaps at the HPE Discover
conference -- where should I focus my attention if I want to become better at using AI, even if
it’s invisible? How can I become more capable as an organization to enable AI to become a
bigger part of what we do as a company?
The new company man is AI
Lewington: For CIOs, their most important customers these days may be developers and
increasingly data scientists, who are basically developers working with training models as
opposed to programs and code. They don’t want to have to think about where that data is
coming from and what it’s running on. They just want to be able to experiment, to put together
frameworks that turn data into insights.
It’s very much like the programming world, where we’ve gradually abstracted things from bare-
metal, to virtual machines, to containers, and now to the emerging paradigm of serverless in
some of the walled-garden public clouds. Now, you want to do the same thing for that data
scientist, in an analogous way.
Today, it’s a lot of heavy lifting, getting these things ready. It’s very difficult for a data scientist to
experiment. They know what they want. They ask for it, but it takes weeks and months to set up
a system so they can do that one experiment. Then they find it doesn’t work and move on to do
something different. And that requires a complete re-spin of what’s under the hood.
Now, using things like software from the recent HPE BlueData acquisition, we can make all of
that go away. And so the CIO’s job becomes much simpler because they can provide their
customers the tools they need to get their work done without them calling up every 10 seconds
and saying, “I need a cluster, I need a cluster, I need a cluster.”
That’s what a CIO should be looking
for, a partner that can help them
abstract complexity away, get it done at
scale, and in a way that they can both
afford and that takes the risk out. This
is complicated, it’s daunting, and the
field is changing so fast.
Gardner: So, in a nutshell, they need to look to the innovation that organizations like HPE are
doing in order to then promulgate more innovation themselves within their own organization. It’s
an interesting time.
Containers contend for the future
Lewington: Yes, that’s very well put. Because it’s changing so fast they don’t just want a
partner who has the stuff they need today, even if they don’t necessarily know what they need
today. They want to know that the partner they are working with is working on what they are
going to need five to 10 years down the line -- and thinking even further out. So I think that’s one
of the things that we bring to the table that others can’t.
Gardner: Can you give us a hint as to what some of those innovations four or five years out might
be? How should we not limit ourselves in our thinking when it comes to that relationship, that
circular relationship between AI, data, and innovation?
Lewington: It was worth coming to HPE Discover in June, because we talked about some
exciting new things across many different areas. The discussion about increasing automation
and abstraction is just going to accelerate.
For example, containers still have a fairly small penetration rate across enterprises -- only about
10 percent adoption today -- because they are not the simplest thing in the world. But we are
going to get to the point where using containers is no more complicated than bare metal is
today, and that's really going to help simplify whole data pipelines.
Beyond that, the elephant in the room for AI is that model complexity is growing incredibly fast.
The compute requirements are going up, something like 10 times faster than Moore’s Law, even
as Moore’s Law is slowing down.
We are already seeing an AI compute gap between what we can achieve and what we need to
achieve -- and it's not just compute, it's also energy. The world's energy supply can only grow
slowly, but if we have exponentially more data and exponentially more compute, we need
exponentially more energy, and that's just not going to be sustainable.
So we are also working on something called Emergent Computing, a super-energy-efficient
architecture that moves data around wherever it needs to be -- or, better, doesn't move the data
at all and instead brings the compute to the data. That will help us close that gap.
And that includes some very exciting new accelerator technologies: special-purpose compute
engines designed specifically for certain AI algorithms. Not only are we using regular transistor
logic, we are using analog computing, and even optical computing to do some of these tasks,
yet hundreds of times more efficiently and using hundreds of times less energy. This is all very
exciting stuff, for a little further out in the future.
Gardner: I’m afraid we’ll have to leave it there. We have been exploring how the rising tidal
wave of data must be better managed and how new tools are emerging to bring AI to the
rescue. And we’ve heard how new AI approaches and tools create a virtuous adoption pattern
between better data and better analytics, and therefore better business outcomes.
So please join me in thanking our guest, Rebecca Lewington, Senior Manager for Innovation
Marketing at HPE. Thank you so much, Rebecca.
Lewington: Thanks Dana, this was fun.
Gardner: And thank you as well to our audience for joining this BriefingsDirect Voice of the
Innovator interview. I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host for
this ongoing series of Hewlett Packard Enterprise-sponsored discussions. Thanks again for
listening, please pass this along to your IT community, and do come back next time.
Copyright Interarbor Solutions, LLC, 2005-2019. All rights reserved.
You may also be interested in:
• How HCI forms a simple foundation for hybrid cloud, edge, and composable infrastructure
• How Ferrara Candy depends on automated IT intelligence to support rapid business growth
• How real-time data streaming and integration set the stage for AI-driven DataOps
• How the composable approach to IT aligns automation and intelligence to overcome mounting
complexity
• How Texmark Chemicals pursues analysis-rich, IoT-pervasive path to the ‘refinery of the future’
• How HPC supports 'continuous integration of new ideas' for optimizing Formula 1 car design
• Want to manage your total cloud costs better? Emphasize the ‘Ops’ in DevOps, says Futurum
analyst Daniel Newman
• A new Mastercard global payments model creates a template for an agile, secure, and compliant
hybrid cloud
• Where the rubber meets the road: How users see the IT4IT standard building competitive
business advantage


Mission Critical Use Cases Show How Analytics Architectures Usher in an Artif...
Mission Critical Use Cases Show How Analytics Architectures Usher in an Artif...Mission Critical Use Cases Show How Analytics Architectures Usher in an Artif...
Mission Critical Use Cases Show How Analytics Architectures Usher in an Artif...Dana Gardner
 
Industrializing Data Science: Transform into an End-to-End, Analytics-Oriente...
Industrializing Data Science: Transform into an End-to-End, Analytics-Oriente...Industrializing Data Science: Transform into an End-to-End, Analytics-Oriente...
Industrializing Data Science: Transform into an End-to-End, Analytics-Oriente...Dana Gardner
 
Manufacturer Gains Advantage by Expanding IoT Footprint from Many Machines to...
Manufacturer Gains Advantage by Expanding IoT Footprint from Many Machines to...Manufacturer Gains Advantage by Expanding IoT Footprint from Many Machines to...
Manufacturer Gains Advantage by Expanding IoT Footprint from Many Machines to...Dana Gardner
 
Crowdsourcing Wisdom
Crowdsourcing WisdomCrowdsourcing Wisdom
Crowdsourcing WisdomVantte
 
The Open Group Conference Panel Explores How the Big Data Era Now Challenges ...
The Open Group Conference Panel Explores How the Big Data Era Now Challenges ...The Open Group Conference Panel Explores How the Big Data Era Now Challenges ...
The Open Group Conference Panel Explores How the Big Data Era Now Challenges ...Dana Gardner
 
Internet of Things Brings On Development Demands That DevOps Manages, Say Exp...
Internet of Things Brings On Development Demands That DevOps Manages, Say Exp...Internet of Things Brings On Development Demands That DevOps Manages, Say Exp...
Internet of Things Brings On Development Demands That DevOps Manages, Say Exp...Dana Gardner
 
Converged IoT Systems: Bringing the Data Center to the Edge of Everything
Converged IoT Systems: Bringing the Data Center to the Edge of EverythingConverged IoT Systems: Bringing the Data Center to the Edge of Everything
Converged IoT Systems: Bringing the Data Center to the Edge of EverythingDana Gardner
 
Iot opportunities-challenges
Iot opportunities-challengesIot opportunities-challenges
Iot opportunities-challengesjohnkbutcher
 
Want a Data-Driven Culture? Start Sorting Out the BI and Big Data Myths Now
Want a Data-Driven Culture? Start Sorting Out the BI and Big Data Myths NowWant a Data-Driven Culture? Start Sorting Out the BI and Big Data Myths Now
Want a Data-Driven Culture? Start Sorting Out the BI and Big Data Myths NowDana Gardner
 
Hype vs. Reality: The AI Explainer
Hype vs. Reality: The AI ExplainerHype vs. Reality: The AI Explainer
Hype vs. Reality: The AI ExplainerLuminaryLabs1
 
top 10 Digital transformation Technologies in 2022.docx
top 10 Digital transformation Technologies in 2022.docxtop 10 Digital transformation Technologies in 2022.docx
top 10 Digital transformation Technologies in 2022.docxAdvance Tech
 
Advanced IoT systems provide analysis catalyst for the petrochemical refinery...
Advanced IoT systems provide analysis catalyst for the petrochemical refinery...Advanced IoT systems provide analysis catalyst for the petrochemical refinery...
Advanced IoT systems provide analysis catalyst for the petrochemical refinery...Dana Gardner
 
Why Data and Information Management Remain Elusive After Decades of Deployments
Why Data and Information Management Remain Elusive After Decades of DeploymentsWhy Data and Information Management Remain Elusive After Decades of Deployments
Why Data and Information Management Remain Elusive After Decades of DeploymentsDana Gardner
 
How the Modern Data Center Extends Across Remote Locations Due to Automation ...
How the Modern Data Center Extends Across Remote Locations Due to Automation ...How the Modern Data Center Extends Across Remote Locations Due to Automation ...
How the Modern Data Center Extends Across Remote Locations Due to Automation ...Dana Gardner
 
Top Strategic Technology Trends for 2022.docx
Top Strategic Technology Trends for 2022.docxTop Strategic Technology Trends for 2022.docx
Top Strategic Technology Trends for 2022.docxAdvance Tech
 
Hype vs. Reality: The AI Explainer
Hype vs. Reality: The AI ExplainerHype vs. Reality: The AI Explainer
Hype vs. Reality: The AI ExplainerLuminary Labs
 
Allaboutailuminarylabsjanuary122017 170112151616
Allaboutailuminarylabsjanuary122017 170112151616Allaboutailuminarylabsjanuary122017 170112151616
Allaboutailuminarylabsjanuary122017 170112151616Quang Lê
 
Exploring the Business Decision to Use Cloud Computing
Exploring the Business Decision to Use Cloud ComputingExploring the Business Decision to Use Cloud Computing
Exploring the Business Decision to Use Cloud ComputingDana Gardner
 

Ähnlich wie Using AI to Solve Data and IT Complexity -- And Better Enable AI (20)

Mission Critical Use Cases Show How Analytics Architectures Usher in an Artif...
Mission Critical Use Cases Show How Analytics Architectures Usher in an Artif...Mission Critical Use Cases Show How Analytics Architectures Usher in an Artif...
Mission Critical Use Cases Show How Analytics Architectures Usher in an Artif...
 
Industrializing Data Science: Transform into an End-to-End, Analytics-Oriente...
Industrializing Data Science: Transform into an End-to-End, Analytics-Oriente...Industrializing Data Science: Transform into an End-to-End, Analytics-Oriente...
Industrializing Data Science: Transform into an End-to-End, Analytics-Oriente...
 
Manufacturer Gains Advantage by Expanding IoT Footprint from Many Machines to...
Manufacturer Gains Advantage by Expanding IoT Footprint from Many Machines to...Manufacturer Gains Advantage by Expanding IoT Footprint from Many Machines to...
Manufacturer Gains Advantage by Expanding IoT Footprint from Many Machines to...
 
Crowdsourcing Wisdom
Crowdsourcing WisdomCrowdsourcing Wisdom
Crowdsourcing Wisdom
 
The Open Group Conference Panel Explores How the Big Data Era Now Challenges ...
The Open Group Conference Panel Explores How the Big Data Era Now Challenges ...The Open Group Conference Panel Explores How the Big Data Era Now Challenges ...
The Open Group Conference Panel Explores How the Big Data Era Now Challenges ...
 
Internet of Things Brings On Development Demands That DevOps Manages, Say Exp...
Internet of Things Brings On Development Demands That DevOps Manages, Say Exp...Internet of Things Brings On Development Demands That DevOps Manages, Say Exp...
Internet of Things Brings On Development Demands That DevOps Manages, Say Exp...
 
Converged IoT Systems: Bringing the Data Center to the Edge of Everything
Converged IoT Systems: Bringing the Data Center to the Edge of EverythingConverged IoT Systems: Bringing the Data Center to the Edge of Everything
Converged IoT Systems: Bringing the Data Center to the Edge of Everything
 
Iot opportunities-challenges
Iot opportunities-challengesIot opportunities-challenges
Iot opportunities-challenges
 
The Cognitive Digital Twin
The Cognitive Digital TwinThe Cognitive Digital Twin
The Cognitive Digital Twin
 
Want a Data-Driven Culture? Start Sorting Out the BI and Big Data Myths Now
Want a Data-Driven Culture? Start Sorting Out the BI and Big Data Myths NowWant a Data-Driven Culture? Start Sorting Out the BI and Big Data Myths Now
Want a Data-Driven Culture? Start Sorting Out the BI and Big Data Myths Now
 
Hype vs. Reality: The AI Explainer
Hype vs. Reality: The AI ExplainerHype vs. Reality: The AI Explainer
Hype vs. Reality: The AI Explainer
 
top 10 Digital transformation Technologies in 2022.docx
top 10 Digital transformation Technologies in 2022.docxtop 10 Digital transformation Technologies in 2022.docx
top 10 Digital transformation Technologies in 2022.docx
 
Advanced IoT systems provide analysis catalyst for the petrochemical refinery...
Advanced IoT systems provide analysis catalyst for the petrochemical refinery...Advanced IoT systems provide analysis catalyst for the petrochemical refinery...
Advanced IoT systems provide analysis catalyst for the petrochemical refinery...
 
Why Data and Information Management Remain Elusive After Decades of Deployments
Why Data and Information Management Remain Elusive After Decades of DeploymentsWhy Data and Information Management Remain Elusive After Decades of Deployments
Why Data and Information Management Remain Elusive After Decades of Deployments
 
How the Modern Data Center Extends Across Remote Locations Due to Automation ...
How the Modern Data Center Extends Across Remote Locations Due to Automation ...How the Modern Data Center Extends Across Remote Locations Due to Automation ...
How the Modern Data Center Extends Across Remote Locations Due to Automation ...
 
Top Strategic Technology Trends for 2022.docx
Top Strategic Technology Trends for 2022.docxTop Strategic Technology Trends for 2022.docx
Top Strategic Technology Trends for 2022.docx
 
About Machine and real
About Machine and realAbout Machine and real
About Machine and real
 
Hype vs. Reality: The AI Explainer
Hype vs. Reality: The AI ExplainerHype vs. Reality: The AI Explainer
Hype vs. Reality: The AI Explainer
 
Allaboutailuminarylabsjanuary122017 170112151616
Allaboutailuminarylabsjanuary122017 170112151616Allaboutailuminarylabsjanuary122017 170112151616
Allaboutailuminarylabsjanuary122017 170112151616
 
Exploring the Business Decision to Use Cloud Computing
Exploring the Business Decision to Use Cloud ComputingExploring the Business Decision to Use Cloud Computing
Exploring the Business Decision to Use Cloud Computing
 

Kürzlich hochgeladen

Artificial intelligence in cctv survelliance.pptx
Artificial intelligence in cctv survelliance.pptxArtificial intelligence in cctv survelliance.pptx
Artificial intelligence in cctv survelliance.pptxhariprasad279825
 
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptx
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptxPasskey Providers and Enabling Portability: FIDO Paris Seminar.pptx
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptxLoriGlavin3
 
DevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platformsDevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platformsSergiu Bodiu
 
DSPy a system for AI to Write Prompts and Do Fine Tuning
DSPy a system for AI to Write Prompts and Do Fine TuningDSPy a system for AI to Write Prompts and Do Fine Tuning
DSPy a system for AI to Write Prompts and Do Fine TuningLars Bell
 
DevoxxFR 2024 Reproducible Builds with Apache Maven
DevoxxFR 2024 Reproducible Builds with Apache MavenDevoxxFR 2024 Reproducible Builds with Apache Maven
DevoxxFR 2024 Reproducible Builds with Apache MavenHervé Boutemy
 
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek SchlawackFwdays
 
Digital Identity is Under Attack: FIDO Paris Seminar.pptx
Digital Identity is Under Attack: FIDO Paris Seminar.pptxDigital Identity is Under Attack: FIDO Paris Seminar.pptx
Digital Identity is Under Attack: FIDO Paris Seminar.pptxLoriGlavin3
 
Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 3652toLead Limited
 
Gen AI in Business - Global Trends Report 2024.pdf
Gen AI in Business - Global Trends Report 2024.pdfGen AI in Business - Global Trends Report 2024.pdf
Gen AI in Business - Global Trends Report 2024.pdfAddepto
 
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024BookNet Canada
 
How AI, OpenAI, and ChatGPT impact business and software.
How AI, OpenAI, and ChatGPT impact business and software.How AI, OpenAI, and ChatGPT impact business and software.
How AI, OpenAI, and ChatGPT impact business and software.Curtis Poe
 
Dev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio WebDev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio WebUiPathCommunity
 
Rise of the Machines: Known As Drones...
Rise of the Machines: Known As Drones...Rise of the Machines: Known As Drones...
Rise of the Machines: Known As Drones...Rick Flair
 
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024BookNet Canada
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfAlex Barbosa Coqueiro
 
The Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and ConsThe Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and ConsPixlogix Infotech
 
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptxUse of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptxLoriGlavin3
 
What is DBT - The Ultimate Data Build Tool.pdf
What is DBT - The Ultimate Data Build Tool.pdfWhat is DBT - The Ultimate Data Build Tool.pdf
What is DBT - The Ultimate Data Build Tool.pdfMounikaPolabathina
 
TeamStation AI System Report LATAM IT Salaries 2024
TeamStation AI System Report LATAM IT Salaries 2024TeamStation AI System Report LATAM IT Salaries 2024
TeamStation AI System Report LATAM IT Salaries 2024Lonnie McRorey
 
Moving Beyond Passwords: FIDO Paris Seminar.pdf
Moving Beyond Passwords: FIDO Paris Seminar.pdfMoving Beyond Passwords: FIDO Paris Seminar.pdf
Moving Beyond Passwords: FIDO Paris Seminar.pdfLoriGlavin3
 

Kürzlich hochgeladen (20)

Artificial intelligence in cctv survelliance.pptx
Artificial intelligence in cctv survelliance.pptxArtificial intelligence in cctv survelliance.pptx
Artificial intelligence in cctv survelliance.pptx
 
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptx
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptxPasskey Providers and Enabling Portability: FIDO Paris Seminar.pptx
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptx
 
DevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platformsDevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platforms
 
DSPy a system for AI to Write Prompts and Do Fine Tuning
DSPy a system for AI to Write Prompts and Do Fine TuningDSPy a system for AI to Write Prompts and Do Fine Tuning
DSPy a system for AI to Write Prompts and Do Fine Tuning
 
DevoxxFR 2024 Reproducible Builds with Apache Maven
DevoxxFR 2024 Reproducible Builds with Apache MavenDevoxxFR 2024 Reproducible Builds with Apache Maven
DevoxxFR 2024 Reproducible Builds with Apache Maven
 
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
 
Digital Identity is Under Attack: FIDO Paris Seminar.pptx
Digital Identity is Under Attack: FIDO Paris Seminar.pptxDigital Identity is Under Attack: FIDO Paris Seminar.pptx
Digital Identity is Under Attack: FIDO Paris Seminar.pptx
 
Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365
 
Gen AI in Business - Global Trends Report 2024.pdf
Gen AI in Business - Global Trends Report 2024.pdfGen AI in Business - Global Trends Report 2024.pdf
Gen AI in Business - Global Trends Report 2024.pdf
 
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
 
How AI, OpenAI, and ChatGPT impact business and software.
How AI, OpenAI, and ChatGPT impact business and software.How AI, OpenAI, and ChatGPT impact business and software.
How AI, OpenAI, and ChatGPT impact business and software.
 
Dev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio WebDev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio Web
 
Rise of the Machines: Known As Drones...
Rise of the Machines: Known As Drones...Rise of the Machines: Known As Drones...
Rise of the Machines: Known As Drones...
 
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdf
 
The Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and ConsThe Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and Cons
 
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptxUse of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
 
What is DBT - The Ultimate Data Build Tool.pdf
What is DBT - The Ultimate Data Build Tool.pdfWhat is DBT - The Ultimate Data Build Tool.pdf
What is DBT - The Ultimate Data Build Tool.pdf
 
TeamStation AI System Report LATAM IT Salaries 2024
TeamStation AI System Report LATAM IT Salaries 2024TeamStation AI System Report LATAM IT Salaries 2024
TeamStation AI System Report LATAM IT Salaries 2024
 
Moving Beyond Passwords: FIDO Paris Seminar.pdf
Moving Beyond Passwords: FIDO Paris Seminar.pdfMoving Beyond Passwords: FIDO Paris Seminar.pdf
Moving Beyond Passwords: FIDO Paris Seminar.pdf
 

operational technologies to information technology (IT).

There are more and more sources. Also, the time we have to do something with that data is shrinking, to the point where we expect everything to be real-time or you are going to make a bad decision. An autonomous car, for example, might do something bad. Or we are going to miss a market or competitive intelligence opportunity. So it's not just the amount of data -- but what you need to do with it that is challenging.
Gardner: We are also at a time when AI and machine learning (ML) technologies have matured. We can begin to turn them toward the data issue to better exploit the data. What is new and interesting about AI and ML that makes them more applicable to this data complexity issue?

Data gets smarter with AI

Lewington: A lot of the key algorithms for AI were actually invented long ago, in the 1950s, but at that time the computers were hopeless relative to what we have today, so it wasn't possible to harness them.

For example, you can train a deep-learning neural net to recognize pictures of kittens. To do that, you need to run millions of images through it to train a working model you can deploy. That's a huge, computationally intensive task that only became practical a few years ago. But now that we have hit that inflection point, things are just taking off.

Gardner: We can begin to use machines to better manage data that we can then apply to machines. Does that change the definition of AI?

How to Remove Complexity From Multicloud and Hybrid IT

Lewington: The definition of AI is tricky. It's malleable, depending on who you talk to. For some people, it's anything that a human can do. To others, it means sophisticated techniques, like reinforcement learning and deep learning.

One useful definition is that AI is what you use when you know what the answer looks like, but not how to get there. Traditional analytics effectively does at scale what you could do with pencil and paper. You could write the equations to decide where your data should live, depending on how quickly you need to access it.

But with AI, it's like the kittens example. You know what the answer looks like; it's trivial for you to look at the photograph and say, "That is a cat in the picture." But it's really, really difficult to write the equations to do it. Now, though, it has become relatively easy to train a black-box model to do that job for you.
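Lewington's kitten example is, at heart, supervised training of a classifier. A minimal sketch of that training loop is below -- everything in it is illustrative (synthetic "images" of bright versus dark pixel patches, a simple logistic model, made-up function names), not the deep neural net or dataset she describes:

```python
import numpy as np

# Toy stand-in for the "kitten classifier": logistic regression trained by
# gradient descent on tiny synthetic "images". Bright patches play the role
# of cat photos, dark patches of non-cats. A real deployment would train a
# deep net on millions of labeled photographs.
rng = np.random.default_rng(0)

def make_images(n, bright):
    """Generate n flattened 8x8 'images' clustered around a base brightness."""
    base = 0.8 if bright else 0.2
    return np.clip(base + 0.1 * rng.standard_normal((n, 64)), 0.0, 1.0)

X = np.vstack([make_images(200, True), make_images(200, False)])
y = np.array([1] * 200 + [0] * 200)

w, b = np.zeros(64), 0.0
for _ in range(500):                        # the computationally heavy part
    p = 1 / (1 + np.exp(-(X @ w + b)))      # predicted probability of "cat"
    grad = p - y                            # d(log loss)/d(logit) per sample
    w -= 0.1 * (X.T @ grad) / len(y)
    b -= 0.1 * grad.mean()

def is_cat(image):
    """Deployed model: you know what the answer looks like, not the equations."""
    return (1 / (1 + np.exp(-(image @ w + b)))) > 0.5

print(is_cat(make_images(1, True)[0]))   # a bright "cat-like" patch → True
```

The point of the sketch is the division of labor Lewington describes: the human supplies labeled examples and a success criterion; the training loop discovers the equations.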
Gardner: Now that we are able to train the black box, how can we apply that in a practical way to the business problem we discussed at the outset? What is it about AI now that helps better manage data? What's changed that gives us better data because we are using AI?

Lewington: It's a circular thing. The heart of what makes AI work is good data: the right data, in the right place, with the right properties you can use to train a model, which you can then feed new data into to get results that you couldn't get otherwise.
Now, there are many ways you can apply that. You can apply it to the trivial case of the cat we just talked about. You can apply it to helping a surgeon review many more MRIs, for example, by allowing him to focus on the few that are borderline and doing the mundane work for him.

But one of the other things you can do with it is use it to manipulate the data itself. So we are using AI to make the data better -- to make AI better.

Gardner: Not only is it circular, and potentially highly reinforcing, but when we apply this to operations in IT -- particularly complexity in hybrid cloud, multicloud, and hybrid IT -- we get an additional benefit. You can make the IT systems more powerful when it comes to the application of that circular capability -- of making better AI and better data management.

AI scales data upward and outward

Lewington: Oh, absolutely. I think the key word here is scale. When you think about data -- and all of the places it can be, all the formats it can be in -- you could do it yourself. If you want to do a particular task, you could do what has traditionally been done. You can say, "Well, I need to import the data from here to here and to spin up these clusters and install these applications."

Those are all things you could do manually, and you can do them for one-off tasks. But once you get to a certain scale, you need to do them hundreds of times, thousands of times, even millions of times. And you don't have the humans to do it.

So AI gives you a way to augment the humans you do have, to take the mundane work away, so they can get straight to what they want to do, which is coming up with an answer instead of spending weeks and months preparing to start working out the answer.

Gardner: So AI directed at IT -- what some people call AIOps -- could be an accelerant to this circular, advantageous relationship between AI and data? And is that part of what you are doing within the innovation and research work at HPE?
Lewington: That's true, absolutely. The mission of Hewlett Packard Labs in this space is to assist the rest of the company in creating more powerful, more flexible, more secure, and more efficient computing and data architectures. And for us in Labs, this tends to be a fairly specific series of research projects that feed into the bigger picture.

For example, we are now doing the Deep Learning Cookbook, which allows customers to find out ahead of time exactly what kind of hardware and software they are going to need to get to a desired outcome. We are automating the experimentation process, if you will.

And, as we talked about earlier, there is the shift to the edge. As we make more and more decisions -- and gain more insights -- there, where the data is created, there is a growing need to deploy AI at the edge. That means you need a data strategy to get the data in the right place, together with the AI algorithm, at the edge. That's because there often isn't time to move that data into the cloud before making a decision and waiting for the required action to return.
Once you begin doing that, once you start moving from a few clouds to thousands and millions of endpoints, how do you handle multiple deployments? How do you maintain security and data integrity across all of those devices? As researchers, we aim to answer exactly those questions.

And, further out, we are looking to move the learning phase itself to the edge, to do what we call swarm learning, where devices learn from their environment and each other, using a distributed model that doesn't use a central cloud at all.

Gardner: Rebecca, given that your title is Innovation Marketing Lead, is there something about the very nature of innovation that you have come to learn personally that's different from what you expected? How has innovation itself changed in the past several years?

Innovation takes time and space

Lewington: I began my career as a mechanical engineer. For many years, I was offended by the term innovation process, because that's not how innovation works. You give people the space and you give them the time, and ideas appear organically. You can't have a process to have ideas.

You can have a process to put those ideas into reality, to weed out the ones that aren't going to succeed, and to promote the ones that work. But the term innovation process, to me, is an oxymoron.

And that's the beautiful thing about Hewlett Packard Labs. It was set up to give people the space where they can work on things that just seem like a good idea when they pop up in their heads. They can work on these and figure out which ones will be of use to the broader organization -- and then it's full steam ahead.

How to Better Understand What AI Can Do for Your Business

Gardner: It seems to me that the relationship between infrastructure and AI has changed. It wasn't that long ago that we thought of business intelligence (BI) as an application -- above the infrastructure.
But the way you are describing the requirements of management in an edge environment -- of being able to harness complexity across multiple clouds and the edge -- this is much more a function of the capability of the infrastructure, too. Is that how you are seeing it, that only a supplier that's deep in its infrastructure roots can solve these problems? This is not a bolt-on benefit.

Lewington: I wouldn't say it's impossible as a bolt-on; it's impossible to do efficiently and securely as a bolt-on. One of the problems with AI is that we are going to use a black box; you don't know how it works. There were a number of news stories recently about AIs becoming corrupted, biased, and even racist, for example. Those kinds of problems are going to become more common.

And so you need to know that your systems maintain their integrity and cannot be breached by bad actors. If you are just working on the very top layers of the software, it's going to be very difficult to attest that the integrity of what's underneath has not been violated.
If you are someone like HPE, which has its fingers in lots of pies, either directly or through our partners, it's easier to make a more efficient solution.

Gardner: Is it fair to say that AI should be a new core competency, not only for data scientists and IT operators, but for pretty much anybody in business? It seems to me this is an essential core competency across the board.

Lewington: I think that's true. Think of AI as another layer of tools that, as we go forward, becomes increasingly sophisticated. We will add more and more tools to our AI toolbox. And this is one set of tools that you just cannot afford not to have.

Gardner: Rebecca, it seems to me that there is virtually nothing within an enterprise that won't be impacted in one way or another by AI.

Lewington: I think that's true. Anywhere in our lives where there is an equation, there could be AI. There is so much data coming from so many sources that many things are now overwhelmed by it -- even something as mundane as deciding what to read in the morning or what route to take to work, let alone how to manage an enterprise IT infrastructure. All things that are rule-based can be made more powerful, more flexible, and more responsive using AI.

Gardner: Returning to the circular nature of using AI to make more data available for AI -- and recognizing that the IT infrastructure is a big part of that -- what are you doing in your research and development to make data services available and secure? Is there a relationship between things like HPE OneView and HPE OneSphere and AI when it comes to efficiency and security at scale?

Let the system deal with IT

Lewington: Those tools historically have been rules-based. We know that if a storage disk gets to a certain percentage full, we need to spin up another disk -- those kinds of things. But to scale flexibly, at some point that rules-based approach becomes unworkable.
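As a minimal illustration of the difference Lewington is drawing (the variable names, thresholds, and data here are invented for the sketch, not taken from any HPE tool), a rules-based monitor is essentially a hard-coded threshold that someone must choose and maintain by hand, while even a simple statistical approach can learn what "normal" looks like from the data itself:

```python
import statistics

# Rules-based: a fixed threshold chosen and maintained by a human.
def needs_new_disk_rule(percent_full: float) -> bool:
    return percent_full > 80.0  # hard-coded rule

# Learned baseline: flag readings that deviate sharply from observed history,
# with no hand-tuned per-system threshold.
def is_anomalous(history: list[float], reading: float, z_cutoff: float = 3.0) -> bool:
    mean = statistics.fmean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return reading != mean
    return abs(reading - mean) / stdev > z_cutoff

usage = [41.0, 42.5, 40.8, 43.1, 42.0, 41.7, 42.9, 41.2]  # recent utilization %
print(needs_new_disk_rule(95.0))   # True: the rule fires only at its fixed line
print(is_anomalous(usage, 95.0))   # True: the learned baseline flags the outlier
```

The rule works until the environment changes; the learned baseline adapts with the data, which is the shift from rules to AI-style analytics described above, in miniature.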
You want to have the system look after itself, to identify its own problems and deal with them. Including AI techniques in things like HPE InfoSight, HPE ClearPass, and the network user behavior software on the HPE Aruba side allows the AI algorithms to make those tools more powerful and more efficient.

You can think of AI here as another class of analytics tools. It's not magic; it's just a different and better way of doing IT analytics. The AI lets you harness more difficult, more complicated, and more distributed datasets.

Gardner: If I'm an IT operator in a Global 2000 enterprise, and I'm using analytics to help run my IT systems, what should I be thinking about differently to begin using AI -- rather than just analytics alone -- to do my job better?
Lewington: If you are that person, you don't really want to think about the AI. You don't want the AI to intrude upon your consciousness. You just want the tools to do your job. For example, I may have 1,000 people starting a factory in Azerbaijan, or somewhere, and I need to provision for all of that. I want to be able to put on my headset and say, "Hey, computer, set up all the stuff I need in Azerbaijan." You don't want to think about what's under the hood. Our job is to make those tools invisible and powerful.

Composable, invisible, and insightful

Gardner: That sounds a lot like composability. Is that another tangent that HPE is working on that aligns well with AI?

Lewington: It would be difficult to have AI be part of the fabric of an enterprise without composability, and without extending composability into more dimensions. It's not just about being able to define the amount of storage, compute, and networking with a line of code; it's about being able to define the amount of memory, where the data is, where the data should be, and what format the data should be in. All of those things -- from the edge to the cloud -- need to be dimensions of composability.

You want everything to work behind the scenes for you in the best way, with the quickest results, with the least energy, and in the most cost-effective way possible. That's what we want to achieve -- invisible infrastructure.

Gardner: We have been speaking at a fairly abstract level, but let's look to some examples to illustrate what we're getting at when we think about such composability sophistication. Do you have any concrete examples or use cases within HPE that illustrate the business practicality of what we've been talking about?

Lewington: Yes, we have helped a tremendous number of customers either get started with AI in their operations or move from pilot to volume use. A couple of them stand out.
One particular manufacturing company makes electronic components. They needed to improve the yields in their production lines, and they didn't know how to attack the problem. We were able to partner with them to use such things as vision systems and photographs from their production tools to identify defects that could only be picked up by a human if they had a whole lot of humans watching everything all of the time. This gets back to the notion of augmenting human capabilities. Their machines produce terabytes of data every day, and it just gets thrown away. They don't know what to do with it.
We began running some research projects with them to use some very sophisticated techniques, such as visual autoencoders, that allow you, without a labeled training set, to characterize a production line that is performing well versus one that is on the verge of moving away from the sweet spot.

Those techniques can fingerprint a good line and also identify when a line goes just slightly bad -- a point at which a human looking at the line would think it was working perfectly. This takes the idea of predictive maintenance further, into what we call prescriptive maintenance, where we have a much more sophisticated view into what represents a good line and what represents a bad line. Those are a couple of examples from manufacturing that I think are relevant.

Gardner: If I am an IT strategist, a chief information officer (CIO) or a chief technology officer (CTO), for example, and I'm looking at what HPE is doing -- perhaps at the HPE Discover conference -- where should I focus my attention if I want to become better at using AI, even if it's invisible? How can I become more capable as an organization to enable AI to become a bigger part of what we do as a company?

The new company man is AI

Lewington: For CIOs, their most important customers these days may be developers and, increasingly, data scientists, who are basically developers working with training models as opposed to programs and code. They don't want to have to think about where that data is coming from and what it's running on. They just want to be able to experiment, to put together frameworks that turn data into insights.

It's very much like the programming world, where we've gradually abstracted things from bare-metal, to virtual machines, to containers, and now to the emerging paradigm of serverless in some of the walled-garden public clouds. Now, you want to do the same thing for that data scientist, in an analogous way. Today, it's a lot of heavy lifting getting these things ready.
It's very difficult for a data scientist to experiment. They know what they want. They ask for it, but it takes weeks and months to set up a system so they can do that one experiment. Then they find it doesn't work and move on to something different. And that requires a complete re-spin of what's under the hood.

Now, using things like software from the recent HPE BlueData acquisition, we can make all of that go away. The CIO's job becomes much simpler, because they can provide their customers the tools they need to get their work done without them calling up every 10 seconds and saying, "I need a cluster, I need a cluster, I need a cluster."

That's what a CIO should be looking for: a partner that can help them abstract complexity away, get it done at scale, and in a way that they can both afford and that takes the risk out. This is complicated, it's daunting, and the field is changing so fast.
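The autoencoder approach Lewington described for the manufacturing case can be sketched in miniature. This is a hedged illustration only: the data is synthetic, the dimensions and threshold are invented, and a linear autoencoder is used as a stand-in for the far more sophisticated visual models in the real project. The core idea carries over: a model trained only on "good" line data reconstructs good readings well, so inputs with high reconstruction error are flagged as drifting away from the sweet spot.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "sensor snapshots" from a healthy line: 8 readings per snapshot,
# driven by 2 underlying factors (e.g., temperature drift, tool wear) plus noise.
factors = rng.normal(size=(500, 2))
mixing = rng.normal(size=(2, 8))
good = factors @ mixing + 0.05 * rng.normal(size=(500, 8))

# Fit a linear autoencoder. A linear autoencoder with a 2-unit bottleneck
# learns the same subspace as PCA, so we can use SVD as a training shortcut.
mean = good.mean(axis=0)
_, _, vt = np.linalg.svd(good - mean, full_matrices=False)
encoder = vt[:2].T  # projects 8-dim readings down to 2 latent factors

def reconstruction_error(x: np.ndarray) -> float:
    z = (x - mean) @ encoder        # encode
    x_hat = z @ encoder.T + mean    # decode
    return float(np.mean((x - x_hat) ** 2))

# Threshold: worst error seen on known-good data, with margin. No labels needed.
threshold = 3 * max(reconstruction_error(row) for row in good)

normal_sample = factors[0] @ mixing  # lies in the learned "healthy" subspace
drifted_sample = normal_sample + np.array([0, 0, 4.0, 0, 0, 0, 0, 0.0])  # one sensor off

print(reconstruction_error(normal_sample) < threshold)   # healthy snapshot passes
print(reconstruction_error(drifted_sample) > threshold)  # drifted snapshot is flagged
```

Note the "fingerprinting" property: the model never sees a bad example during training, yet it flags the drifted snapshot because that reading falls outside the subspace the healthy data occupies.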
Gardner: So, in a nutshell, they need to look to the innovation that organizations like HPE are doing in order to then promulgate more innovation themselves within their own organizations. It's an interesting time.

Containers contend for the future

Lewington: Yes, that's very well put. Because it's changing so fast, they don't just want a partner who has the stuff they need today -- even if they don't necessarily know what they need today. They want to know that the partner they are working with is working on what they are going to need five to 10 years down the line -- and thinking even further out. I think that's one of the things that we bring to the table that others can't.

Gardner: Can you give us a hint as to what some of those innovations four or five years out might be? How should we not limit ourselves in our thinking when it comes to that circular relationship between AI, data, and innovation?

Lewington: It was worth coming to HPE Discover in June, because we talked about exciting new things across many different fronts. The discussion about increasing automation abstractions is just going to accelerate. For example, containers still have fairly small penetration across enterprises -- at about 10 percent adoption today -- because they are not the simplest thing in the world. But we are going to get to the point where using containers seems no more complicated than bare-metal does today, and that's really going to help simplify whole data pipelines.

Beyond that, the elephant in the room for AI is that model complexity is growing incredibly fast. The compute requirements are going up something like 10 times faster than Moore's Law, even as Moore's Law is slowing down. We are already seeing an AI compute gap between what we can achieve and what we need to achieve -- and it's not just compute, it's also energy.
The world's energy supply can only grow slowly, yet we are headed toward exponentially more data, exponentially more compute, and exponentially more energy demand, and that's just not going to be sustainable. So we are also working on something called Emergent Computing, a super-energy-efficient architecture that moves data around wherever it needs to be -- or, rather than moving the data around, brings the compute to the data. That will help us close that gap.

And that includes some very exciting new accelerator technologies: special-purpose compute engines designed specifically for certain AI algorithms. Not only are we using regular transistor logic, we are using analog computing, and even optical computing, to do some of these tasks,
yet hundreds of times more efficiently and using hundreds of times less energy. This is all very exciting stuff, for a little further out in the future.

Gardner: I'm afraid we'll have to leave it there. We have been exploring how the rising tidal wave of data must be better managed, and how new tools are emerging to bring AI to the rescue. And we've heard how new AI approaches and tools create a virtuous adoption pattern between better data and better analytics, and therefore better business outcomes.

So please join me in thanking our guest, Rebecca Lewington, Senior Manager of Innovation Marketing at HPE. Thank you so much, Rebecca.

Lewington: Thanks, Dana, this was fun.

Gardner: And thank you as well to our audience for joining this BriefingsDirect Voice of the Innovator interview. I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host for this ongoing series of Hewlett Packard Enterprise-sponsored discussions. Thanks again for listening, please pass this along to your IT community, and do come back next time.

Copyright Interarbor Solutions, LLC, 2005-2019. All rights reserved.