Social media, mobile devices, and new infrastructures mean that more data is being used to serve end users than ever before. Enterprise customers must act quickly on data stored across their enterprise. IBM Elastic Caching solutions provide the best opportunity for improving your end users' experience of consuming application data. Every business, of every size, in every industry needs an effective data caching solution. The industry has moved beyond the bottleneck of CPU processing and must now address the growing data bottleneck, which prevents the predictable, cost-effective scalability that directly affects the performance and throughput of every data-intensive application.
IBM's Elastic Caching solutions, WebSphere eXtreme Scale and the DataPower XC10 Appliance, solve these problems better than the competition. Learn how IBM Elastic Caching solutions have evolved to eliminate enterprise data bottlenecks by elastically distributing data across many resources and letting applications access the data they need quickly and efficiently; a sketch of a simple grid client follows below. We beat our competition not only by giving customers the flexibility to build mission-critical applications that achieve predictable, scalable performance and high availability, but also by extending and integrating IBM Elastic Caching into many IBM products covering retail and commerce solutions, mobile devices, content management, business rule management, ESBs, messaging, and more.
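To make the "distribute data, access it quickly" point concrete, here is a minimal sketch of what a WebSphere eXtreme Scale client interaction looks like. It assumes a catalog server reachable at cataloghost:2809 and a grid named Grid with a map named UserProfiles already defined on the server side; those names and the endpoint are illustrative assumptions, not product defaults.

    import com.ibm.websphere.objectgrid.ClientClusterContext;
    import com.ibm.websphere.objectgrid.ObjectGrid;
    import com.ibm.websphere.objectgrid.ObjectGridManager;
    import com.ibm.websphere.objectgrid.ObjectGridManagerFactory;
    import com.ibm.websphere.objectgrid.ObjectMap;
    import com.ibm.websphere.objectgrid.Session;

    public class GridClient {
        public static void main(String[] args) throws Exception {
            // Obtain the ObjectGridManager and connect to the catalog service.
            ObjectGridManager ogm = ObjectGridManagerFactory.getObjectGridManager();
            ClientClusterContext ccc = ogm.connect("cataloghost:2809", null, null);

            // Attach to a grid named "Grid" (name is an assumption for this sketch).
            ObjectGrid grid = ogm.getObjectGrid(ccc, "Grid");

            // A Session is the unit of work; maps hold the cached key/value data.
            Session session = grid.getSession();
            ObjectMap profiles = session.getMap("UserProfiles");

            // Write to the grid; the grid handles partitioning and replica placement.
            profiles.upsert("user:42", "profile-data-for-user-42");
            System.out.println(profiles.get("user:42"));
        }
    }

Note that the client never addresses an individual container server: the grid routes each key to the partition, and the replicas, that own it.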
With more and more people on the internet, we see customers challenged to deal effectively with user profiles… With increases in online activity like stock trades, we see customers struggling to cost-effectively increase the number of transactions they can support. With huge increases in the amount of new information generated each day, our customers struggle to quickly and cost-effectively get at that data to drive value into their business processes.
Strategic Imperative: Users and IT providers should look at in-memory computing as a long-term technology trend that may have a disruptive impact on the IT industry comparable to that of cloud computing.

Gartner View of In-Memory Computing: Gartner informally defines in-memory computing as a computing style in which the primary data store for applications (the "database of record") is the central memory of the computer running those applications. This implies that data access latency can be assumed to be negligible, even if the application needs to access large volumes of data, as in analytical or event-processing applications. In-memory computing therefore implies that terabyte-size datasets can be held in RAM and shared across multiple, distributed applications. Even in in-memory computing, electromagnetic "spinning" disks (or substitutes such as SSDs) are still used, but no longer as the primary locus for data: they persist in-memory data for recovery, manage overflow situations, archive historical data, and transport data to other locations. In reality, traditional data stores and data management technologies will continue to play a critical role in users' strategies, given that the overwhelming majority of established applications are not in-memory-enabled, and for most run-the-business applications traditional approaches remain the most convenient way to store and manipulate data. In-memory-computing-enabling software technologies have been available for a long time, but their mainstream adoption is now possible because of two fundamental factors: (1) dramatic evolution in processor technologies (multicore, 64-bit) and the never-ending decline in DRAM and flash memory costs, which make it possible to bring commodity-based hardware with terabytes of RAM to market; and (2) software vendors incorporating in-memory computing technologies into packaged applications and mainstream application infrastructure products (application servers, ESBs, BI tools, BPM tools, and others).

IBM View of In-Memory Databases' Relevance to Big Data: Traditionally, data is placed in storage and then, when needed, accessed and acted upon in memory. This creates a natural bottleneck that hurts performance. As volumes of data increase, the time to access, let alone analyze, the data grows to the point where it becomes too cumbersome for business needs; complex calculations or searches over the data cannot be accomplished in real time. At best, such an operation takes a few hours; at worst, a few days. Today, with in-memory computing, and specifically in-memory data grids like WXS & XC10, we can take advantage of a better understanding of how data is shaped and stored. Paired with falling memory prices and the greater affordability of fast solid-state memory, we can do away with the traditional concept of storage and its associated bottlenecks, gaining an orders-of-magnitude improvement in performance and enabling the development of a new class of applications. The use of in-memory technology marks an inflection point for enterprise applications, especially in dealing with big data: having real-time information available at the speed of thought gives decision makers insights that were not previously available.
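The "memory as primary store, disk for durability" idea reduces to the familiar cache-aside pattern. The sketch below is a generic plain-Java illustration of that pattern, not the WXS or XC10 API: the in-memory map is the primary access path, and the disk-backed store is consulted only on a miss.

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.function.Function;

    // Illustrative cache-aside pattern: memory is the primary access path,
    // the backing store is consulted only on a miss (and for durability).
    public class CacheAside<K, V> {
        private final Map<K, V> memory = new ConcurrentHashMap<>();
        private final Function<K, V> backingStore; // e.g. a database read

        public CacheAside(Function<K, V> backingStore) {
            this.backingStore = backingStore;
        }

        public V get(K key) {
            // computeIfAbsent touches the disk-backed store only when the
            // value is not already resident in memory.
            return memory.computeIfAbsent(key, backingStore);
        }
    }

A data grid generalizes this single-process picture by partitioning the in-memory map across many servers and replicating each partition for availability.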
Value: The most evident benefit of in-memory processing is speed. When large data sets need to be analyzed, they are already resident in memory and can be accessed near-instantaneously. Without the bottleneck of having to fetch data from storage, organizations can swiftly analyze information and use it to create the best possible strategies. Beyond speed, the underlying point of in-memory computing is the ability to process and analyze big data cost-effectively. Database management currently accounts for more than 25% of most companies' IT budgets. Since in-memory databases use hardware systems that require far less power than traditional database management systems, they can dramatically reduce hardware and maintenance costs.
Continue your cloud conversations with customers by helping them resolve the most basic concern: management of all that data. With WebSphere eXtreme Scale V8.6, we can ensure that the database is not the bottleneck that slows applications down. We do this by caching data in the application server tier, automatically placing data and replicas according to user-defined policy to increase overall availability. In other words, when my cloud application suddenly scales up from 10 instances to 100 instances, if I have stored my state information independently of those instances, I don't have to worry about it: I know it's there, I know it's accessible, and I'm free to scale to whatever dimensions the business requires. A sketch of this state-externalization pattern follows below.
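Here is a minimal sketch of that state-externalization idea, assuming the same WXS client API shown earlier and a server-side map named SessionState; the class and map names are illustrative assumptions. Because conversational state lives in the grid rather than in any one instance, new instances can join or leave without state migration.

    import com.ibm.websphere.objectgrid.ObjectGrid;
    import com.ibm.websphere.objectgrid.ObjectMap;
    import com.ibm.websphere.objectgrid.Session;

    // Each application instance stays stateless: conversational state lives
    // in the grid, so scaling from 10 to 100 instances "just works".
    public class StateStore {
        private final ObjectGrid grid;

        public StateStore(ObjectGrid grid) {
            this.grid = grid;
        }

        public void save(String conversationId, java.io.Serializable state) throws Exception {
            Session session = grid.getSession();
            ObjectMap map = session.getMap("SessionState"); // map name is an assumption
            map.upsert(conversationId, state);
        }

        public Object load(String conversationId) throws Exception {
            Session session = grid.getSession();
            ObjectMap map = session.getMap("SessionState");
            return map.get(conversationId);
        }
    }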
Smarter Planet workloads and this explosion of data drive the need for elastic caching and extreme transaction processing (XTP). The customer challenges are the same ones noted earlier: dealing effectively with user profiles as more people come online, cost-effectively increasing supported transaction volumes as activity like stock trading grows, and quickly and cost-effectively getting at the huge amounts of new information generated each day to drive value into business processes.