In this presentation we review the new features in SQL Server 2008 R2.
Regards,
Ing. Eduardo Castro Martinez, PhD
http://comunidadwindows.org
http://ecastrom.blogspot.com
9. Information Platform Vision: Empowered IT, Pervasive Insight, Dynamic Development, and a Mission-Critical Platform, spanning Cloud, Desktop & Mobile, and Server & Datacenter.
10. SQL Server 2008 R2 Wave: a scalable relational database platform with a consistent, familiar model & tools; self-managed, highly available cloud services; MPP support for 10s to 100s of TB data warehouses; highly scalable appliances; seamless integration with Microsoft BI; Managed Self-Service BI; multi-server management; and virtualization & Live Migration.
44. R2 at a glance. New editions: 2 new premium editions to meet the needs of large-scale datacenters and data warehouses. New R2 technologies: Application and Multi-Server Management, Managed Self-Service Business Intelligence, Master Data Services, and StreamInsight complex event processing technology.
45. R2 at a glance. New R2 solutions: PowerPivot for SharePoint, PowerPivot for Excel, Parallel Data Warehouse, and Fast Track Data Warehouse.
46. What is R2? Building on the momentum of SQL Server 2008, R2 improves IT efficiency by reducing the time and cost of developing and managing applications. It empowers end users to make better decisions through Self-Service Business Intelligence, and it enables organizations to scale with confidence by providing high levels of reliability, security, and scalability for business-critical applications.
47. What is R2? Report Builder 3.0 with support for geospatial visualization; Application and Multi-Server Management; SMP scale-up with support for up to 256 logical processors. This CTP provides the first opportunity to explore some of the features of SQL Server 2008 R2 and to see how they come together to enhance performance and scalability, enable self-service BI, and improve IT and developer efficiency.
48. Scalability & Performance Enhancements. SQL Server 2008 R2 introduces two new features for scalability and performance. Support for more than 64 CPUs: the number of logical processors that a server can use for database operations has been increased from 64 to 256. Unicode compression: Unicode data stored in nvarchar(n) and nchar(n) columns is now compressed using an implementation of the standard Compression Scheme for Unicode (SCSU) algorithm.
49. Premium Edition: Datacenter. Designed to deliver a high-performing data platform that provides the highest levels of scalability for large application workloads, virtualization, and consolidation. Key features new to Datacenter: Application and Multi-Server Management for enrolling, gaining insights into, and managing over 25 instances.
50. Premium Edition: Datacenter. High-scale complex event processing with SQL Server StreamInsight. Supports more than 8 processors and up to 256 logical processors for the highest levels of scale. Supports memory limits up to the OS maximum.
51. High-Scale Data Warehouse. Situation today: data volumes are exploding; a growing population of users is accessing information; increasingly complex data analyses are performed against the data. SQL Server 2008 R2 Parallel Data Warehouse: predictable scale-out through MPP on SQL Server and Windows; massive scale with low TCO, 10s of TB to 100 TB+ (total cost starts at $15k/TB!); an integrated BI platform for small and very large enterprises. "Parallel Data Warehouse is a natural complement to SQL Server, so we are excited about the possibilities the DatAllegro acquisition will bring." - Ron Van Zanten, Directing Officer of Business Intelligence, Premier Bankcard Inc
52. Premium Edition: Parallel Data Warehouse. SQL Server 2008 R2 Parallel Data Warehouse is a highly scalable, appliance-based data warehouse solution.
53. Premium Edition: Parallel Data Warehouse. Key features new to Parallel Data Warehouse: 10s to 100s of TBs enabled by the MPP architecture; advanced data warehousing capabilities such as star join queries and Change Data Capture; integration with SSIS, SSRS, and SSAS; support for the industry-standard hub-and-spoke data warehousing architecture and for parallel database copy.
70. How Unicode Compression Affects Data Storage. You can enable it when creating a table, via the table option WITH (DATA_COMPRESSION = ROW), or on an existing table: ALTER TABLE [Name] REBUILD WITH (DATA_COMPRESSION = ROW)
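The two options above can be sketched in T-SQL as follows (the table and column names are illustrative, not from the presentation):

```sql
-- Create a table with row compression enabled from the start; nvarchar
-- and nchar values in it benefit from SCSU-based Unicode compression.
CREATE TABLE dbo.Customers
(
    CustomerID int           NOT NULL PRIMARY KEY,
    FullName   nvarchar(200) NOT NULL
)
WITH (DATA_COMPRESSION = ROW);

-- Or enable row compression on an existing table by rebuilding it.
ALTER TABLE dbo.Customers REBUILD WITH (DATA_COMPRESSION = ROW);
```

Because the compression is applied transparently by the storage engine, no application changes are needed to benefit from it.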
71. Manageability Enhancements. SQL Server 2008 R2 introduces the SQL Server Utility for managing multiple instances of the SQL Server Database Engine. Through SQL Server Management Studio, organizations gain insight into their growing applications and SQL Server instances.
72. Resource Optimization. Consolidation management: use a new explorer in SQL Server Management Studio to access the central management point for multi-server management and at-a-glance dashboard views. Improve service levels: set policies that define desired utilization thresholds across target servers or applications within a new central management point, and identify issues with instances and applications.
75. Manageability Enhancements. R2 also introduces a unit of management called the data-tier application, which provides an application-based view for managing data-tier objects in the SQL Server Utility or in stand-alone instances of the Database Engine. R2 makes it easier for DBAs to move databases from server to server, in much the same way that virtualization admins move guest OSs between physical hosts.
76. Multi-Server Management: today and tomorrow. Control server sprawl with one-to-many management; setup is fast and easy. Manage capacity through policies to save time and optimize resources. The DAC provides a single unit of deployment that increases deployment and upgrade efficiency.
77. R2: Bringing the DAC. SQL Server 2008 R2 still has the same concept of databases, but it adds a new level above databases called the Data-Tier Application, abbreviated DAC. The DAC includes the database schema plus some server-level objects required to support the database, such as logins. The DAC does not include the data inside your database.
78. R2: Bringing the DAC. As a deployment best practice, you should have any necessary data (configuration tables, basic lookup tables) already scripted out as part of your deployment strategy. With the DAC approach, it makes sense to store these scripts inside the database as objects. For example, you might have a stored procedure called usp_deploy that populates all of the necessary configuration tables via INSERT statements.
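A minimal sketch of such a procedure (the name usp_deploy comes from the slide; the OrderStatus table and its columns are hypothetical):

```sql
-- Hypothetical seeding procedure stored inside the database, so the
-- configuration data travels with the schema when the DAC is deployed.
CREATE PROCEDURE dbo.usp_deploy
AS
BEGIN
    SET NOCOUNT ON;

    -- Idempotent inserts: only add configuration rows that are missing,
    -- so the procedure is safe to run after every deployment or upgrade.
    IF NOT EXISTS (SELECT 1 FROM dbo.OrderStatus WHERE StatusCode = 'NEW')
        INSERT INTO dbo.OrderStatus (StatusCode, StatusDescription)
        VALUES ('NEW', 'Newly created order');

    IF NOT EXISTS (SELECT 1 FROM dbo.OrderStatus WHERE StatusCode = 'SHIP')
        INSERT INTO dbo.OrderStatus (StatusCode, StatusDescription)
        VALUES ('SHIP', 'Order shipped');
END;
```

After the DAC pack is deployed, the DBA runs EXEC dbo.usp_deploy once to seed the lookup data.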
79. What DACs Mean for Database Administrators. Developers create and update their database schema, stored procedures, functions, and other objects inside Visual Studio, package them into DAC packs, and hand them to the database administrator.
81. SQL Server 2008 R2 Virtualization. Dramatically reduce the time required for SQL Server installs by creating sysprep-ed images of standalone SQL Server instances that can be copied and quickly installed on target systems. This enables rapid provisioning and configuration using prepared images stored in VHDMart for Hyper-V deployments.
82. SQL Server 2008 R2 Virtualization. SQL Server 2008 Sysprep accomplishes the install in two phases behind the scenes: prepare and configure. In SQL Server 2008 R2, two new SQL Server setup actions are exposed to users: PrepareImage (also referred to as "prepare") and CompleteImage (also referred to as "complete" or "configure").
83. SQL Server 2008 R2 Virtualization. SQL Server PrepareImage takes about 30 minutes for the Engine and RS components, whereas CompleteImage takes a few minutes. The Windows Sysprep process can be run between the two steps to create a Windows OS image and deploy it to target computers, but this is not required; the two steps can also be run back-to-back on the same computer. On average, Sysprep saves an estimated 30 minutes per install.
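A hedged sketch of how the two setup actions might be driven from the command line (the instance ID and account name are placeholders; verify the exact switches against the SQL Server 2008 R2 Setup documentation before use):

```
:: Step 1, on the reference machine: lay down unconfigured binaries.
setup.exe /q /ACTION=PrepareImage /FEATURES=SQLEngine /InstanceID=SQLIMG01

:: ...optionally run Windows Sysprep, capture the image, deploy to targets...

:: Step 2, on each target machine: complete and configure the instance.
setup.exe /q /ACTION=CompleteImage /INSTANCENAME=MSSQLSERVER /INSTANCEID=SQLIMG01 /SQLSYSADMINACCOUNTS="DOMAIN\dbadmins"
```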
85. Managed Self-Service BI. Users (model, analyze, personalize, share) are empowered to create analyses and gain actionable insights without IT dependence; IT administrators (provision, administer, secure, track) are empowered to manage compliance and resources without obstructing users. Managed Self-Service BI aligns IT and information workers, giving users the power to drill into any aspect of their business and compress decision cycles to gain deeper insight, while increasing the efficiency of the IT department.
86. Managed Self-Service BI: empower end users, share & collaborate, increase IT efficiency. The ability to respond quickly to business opportunities and customer needs can be a huge differentiator.
88. Microsoft BI Vision & Strategy: BI for everyone. Empower your people with business insights, improve organizational effectiveness, and enable IT efficiency.
89. Use What You Already Know. Producers of BI enjoy the power and familiarity of Microsoft Office Excel 2010; consumers of BI enjoy anywhere access to BI with zero-footprint deployment. The Microsoft BI solution stack improves organizations by providing business insights to all employees, leading to better, faster, more relevant decisions.
90. Empower End Users. Situation today: users need timely access to information; the ability to combine data across multiple sources provides a more accurate view; training users on multiple BI tools is inefficient and costly. PowerPivot for Excel & Report Builder 3.0: mash up data from internal and external sources directly in Excel; perform powerful analysis & reporting against vast amounts of data (100+ million rows) directly in Excel; create reports productively with the familiar Microsoft Office interface. "Our analysts can do anything with Excel, so integrating Excel as a front end for our BI infrastructure is extremely popular. It makes it easier to explore the huge wealth of data we have in our 17-terabyte data warehouse." - Dan Zerfas, Vice President of Software Development
91. PowerPivot for Excel: PowerPivoting massive data volumes. With a few mouse clicks, a user can create and publish intuitive and interactive self-service BI solutions.
95. Capitalizing on Existing Reports with the Report Part Gallery. Reusing common report components from a source report in a new report helps accelerate report creation, cut costs, and increase end-user adoption. SharePoint provides the central location for sharing and editing reports and automatically synchronizes published content objects.
96. Not Everything is Managed by IT. Less than 20% of users rely exclusively on IT-managed data and solutions; the rest roll their own "systems" out of sight of IT. Succeeding with BI and achieving true pervasive insight requires bridging the gap between end users and IT.
97. Managed Self-Service Business Intelligence. Empower your business units and users to create and share BI solutions through familiar and intuitive tools: the SQL Server PowerPivot add-in for Excel (formerly known as "Gemini"); a SharePoint 2010-based operations dashboard; SQL Server Reporting Services Report Builder 3.0; and rich visualization of geospatial data.
98. Overview of PowerPivot. So, what is PowerPivot? New client and server components for self-service, managed BI. The client component lights up Excel 2010 with huge data capacities and mash-ups on top of an in-memory SSAS data model. The server component lights up SharePoint 2010: an integrated SSAS service handles the data models, and Excel Services is used to render the spreadsheets.
99. PowerPivot Excel Add-in. The PowerPivot client tool is an add-in for Excel 2010 that provides the SSAS engine and the PowerPivot UI and creates an in-memory database model. Gather data for analysis with the Load & Prepare Data option: connect to data sources (lots of options), import data into the PowerPivot tab, and select columns and filter data as needed.
100. PowerPivot Excel Add-in. Describe how data is related: inform PowerPivot of data relationships (basic PK-FK-type relationships are supported) to create a basic data model. Use DAX expressions to add calculations: many Excel functions are included, and expressions are based on columns, not cells.
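For instance, a calculated column built with DAX might look like the following (the table and column names are illustrative, not from the presentation):

```
-- Column-based arithmetic: evaluated for every row of the Sales table.
= Sales[UnitPrice] * Sales[OrderQuantity]

-- Follow a defined PK-FK relationship into a related table.
= RELATED(Product[Category])
```

Unlike ordinary Excel formulas, these expressions reference whole columns rather than individual cells, and PowerPivot evaluates them row by row over the in-memory table.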
101. Analyzing Data with Excel 2010. Use PivotTables and PivotCharts: an updated task pane for selecting data, with data elements grouped by PowerPivot tab and used as dimensions or measures as appropriate. New and updated visualizations: slicers (excellent!), sparklines (excellent!), improved icon sets and data bar formatting, and improved chart options and performance.
102. SQL Server PowerPivot demo. Ing. Eduardo Castro, PhD, Grupo Asesor en Informática, ecastro@grupoasesor.net
103. Master Data Management. Role-based security, versioning, workflow approval, and hierarchy management across source systems such as CRM, purchasing, HR, document, ERP, and asset management systems.
104. SQL Server 2008 R2 Master Data Services (MDS) Master Data Services helps enterprises standardize the data people rely on to make critical business decisions. With Master Data Services, IT organizations can centrally manage critical data assets companywide and across diverse systems, enable more people to securely manage master data directly, and ensure the integrity of information over time.
106. SQL Server 2008 R2 Master Data Services (MDS). Master Data Services is a Master Data Management application that consists of the following components and tools: Master Data Services Configuration Manager, Master Data Manager, and the Master Data Services Web service.
107. SQL Server 2008 R2 Master Data Services (MDS). A master data hub that provides central management of master data entities, hierarchies, and collections. A thin-client stewardship portal that provides secure, role-based Web access to master data. Versioning of all data entities and hierarchies. Human workflow that notifies assigned owners by e-mail of business rule violations.
108. SQL Server 2008 R2 Master Data Services (MDS). Flexible and extensible business rules that safeguard the quality of data entered in the master data hub. Support for a broad range of hierarchy and attribute management strategies and requirements. A comprehensive role-based security model that enables fine-grained, secure access to master data.
110. SQL Server 2008 R2 Master Data Services. Models are the highest level of data organization in Master Data Services. A model contains entities, attributes and attribute groups, hierarchies (derived and explicit), and collections.
112. SQL Server 2008 R2 StreamInsight Technology. Data volumes are exploding, with event data streaming from sources such as RFID, sensors, and web logs. The size and frequency of the data make it challenging to store for data mining and analysis. StreamInsight provides the ability to monitor, analyze, and make business decisions in near real time.
113. SQL Server StreamInsight's ability to derive insights from data streams and act in near real time provides significant business benefits. Some of the possible scenarios include: algorithmic trading and fraud detection for financial services; industrial process control (chemicals, oil and gas) for manufacturing; electric grid monitoring and advanced metering for utilities; clickstream web analytics; and network and data center system monitoring.
116. Key Concepts. Events: represent the user payload along with temporal characteristics. Streams: sequences of events that flow into one or more standing queries in the StreamInsight engine. Queries: operate on event streams and apply the desired semantics to events. Adapters: convert custom data from event sources to and from StreamInsight events.
117. What is CEP? Complex Event Processing (CEP) is the continuous and incremental processing of event streams from multiple sources, based on declarative query and pattern specifications, with near-zero latency: an input stream of events enters the engine as requests arrive, and an output stream of results is produced in response.
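The core idea of a standing query, aggregating events over a time window as they arrive, can be illustrated outside StreamInsight. This Python sketch (purely illustrative; it is not the StreamInsight API) counts events per 5-second tumbling window:

```python
from collections import OrderedDict

def tumbling_window_counts(events, window_seconds=5):
    """Group (timestamp, payload) events into fixed, non-overlapping
    time windows and count the events in each -- the simplest standing
    query a CEP engine would run continuously over a live stream."""
    counts = OrderedDict()
    for ts, _payload in events:
        # Each event falls into the window that starts at the nearest
        # multiple of window_seconds at or below its timestamp.
        window_start = (ts // window_seconds) * window_seconds
        counts[window_start] = counts.get(window_start, 0) + 1
    return list(counts.items())

# A toy stream of (seconds-since-start, reading) events.
stream = [(0, 'a'), (1, 'b'), (4, 'c'), (5, 'd'), (9, 'e'), (12, 'f')]
print(tumbling_window_counts(stream))  # [(0, 3), (5, 2), (10, 1)]
```

In a real CEP engine the same aggregation runs incrementally as events arrive, rather than over a stored batch, which is what makes near-zero latency possible.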
118. Event Processing Scenarios. Plotting latency against aggregate data rate (events/sec) separates the workloads: relational database and data warehousing applications sit at lower event rates and higher latencies, while the CEP target scenarios, operational analytics (logistics, etc.), web analytics, manufacturing, financial trading, and monitoring applications, demand low latency at high event rates.
136. R2 at a glance. New editions: 2 new premium editions to meet the needs of large-scale datacenters and data warehouses. New R2 technologies: Application and Multi-Server Management, Managed Self-Service Business Intelligence, Master Data Services, and StreamInsight complex event processing technology.
137. R2 at a glance. New R2 solutions: PowerPivot for SharePoint, PowerPivot for Excel, Parallel Data Warehouse, and Fast Track Data Warehouse.
138. Introduction to SQL Server 2008 R2 Ing. Eduardo Castro ecastro@mswindowscr.org Comunidad Windows
Editor's notes
Datacenter: Built on SQL Server 2008 R2 Enterprise, SQL Server 2008 R2 Datacenter is designed to deliver a high-performing data platform that provides the highest levels of scalability for large application workloads, virtualization and consolidation, and management for an organization's database infrastructure. Datacenter helps enable organizations to cost-effectively scale their mission-critical environment. Key features new to Datacenter: Application and Multi-Server Management for enrolling, gaining insights and managing over 25 instances; highest virtualization support for maximum ROI on consolidation and virtualization; high-scale complex event processing with SQL Server StreamInsight; support for more than 8 processors and up to 256 logical processors for the highest levels of scale; support for memory limits up to the OS maximum.
SQL Server 2008 R2 Parallel Data Warehouse is a highly scalable data warehouse appliance-based solution. Parallel Data Warehouse delivers performance at low cost through a massively parallel processing (MPP) architecture and compatibility with hardware partners; scale your data warehouse to tens and hundreds of terabytes. Key features new to Parallel Data Warehouse: 10s to 100s of TBs enabled by the MPP architecture; advanced data warehousing capabilities like star join queries and Change Data Capture; integration with SSIS, SSRS, and SSAS; support for the industry-standard hub-and-spoke data warehousing architecture and parallel database copy.
Resource Optimization: Use new tools in SSMS to gain insights for improved consolidation management, maximize investments, and ultimately maintain healthier systems. Consolidation management: Use a new explorer in SQL Server Management Studio to access the central management for multi-server management and at-a-glance dashboard views. Dashboard viewpoints provide insights into utilization and policy violations to help identify consolidation opportunities or resources at risk. What's more, data and log file utilization are rolled up for visibility across databases and volumes, helping identify potential issues for you to pinpoint and act on. Improve service levels: Set policies to define desired utilization thresholds across target servers or applications within a new central management point. Identify issues with instances and applications, reducing the amount of time spent troubleshooting which apps are running on potential problem servers. Customize what resource properties are displayed based on your needs and view this information through dashboard views. Dashboard views help enable impact analysis: quickly drill in on issues before internal customers come knocking.
In R2's SQL Server Management Studio, right-click on a database and click Tasks, Extract Data-Tier Application. This starts a wizard that will reverse-engineer your database schema, figure out what makes it tick, and package it in a way that lets you redeploy it on another server. The information is saved in a file with a .dacpac extension, and if you try to open that file directly in SQL Server Management Studio, you'll hit a stumbling block.
What DACs Mean for Database Administrators: If you never had a change control process and your developers just implemented changes willy-nilly in production, then the DAC approach won't change anything; your developers will do what they've always done. If you've got change control processes in place, your developers probably hand you change scripts and tell you to implement them in production. If you're ambitious, you audit their work as a sanity check to make sure it will scale. In the future, your developers may be creating and updating their database schema, stored procedures, functions, etc. inside Visual Studio, packaging them into DAC packs, and handing them to you. In order for you to check their work, you'll need to switch over into Visual Studio, or perhaps log onto their development SQL Servers to see the schema changes there. This is another nail in the coffin of the power of the DBA. From the NoSQL movement to the DBA-less cloud, DBAs need to be acutely aware of how things are changing. This isn't necessarily a bad thing; it's worked great in the world of virtualization. As a VMware sysadmin, I didn't need to understand what each virtual server was doing, whether it conformed to best practices, or even what was running on it. I managed them in large quantities with low overhead simply by moving things around based on the resources they needed. If a server's needs grew, I could move it to a larger VMware host or a less-active host. I only purchased resources incrementally for the entire pool rather than micromanaging what each server needed. I didn't do as good a job as if I'd micromanaged each server's configuration, but I was able to manage more servers with less manpower. Everything's a tradeoff. What if you, as a production DBA, could manage more instances and more databases with less time? What if, instead of looking at lines of T-SQL code, you were able to step back and see the bigger picture?
What if you treated every application as a sealed, hands-off third-party app?
Dramatically reduce the time required for SQL Server installs by creating sysprep-ed images of SQL Server standalone instances that can be copied and quickly installed on target systems. Enables rapid provisioning and configuration using prepared images stored in VHDMart for Hyper-V deployments. SQL Server 2008 Sysprep accomplishes the install in two phases behind the scenes: prepare and configure. In SQL Server 2008 R2, two new SQL Server setup actions are exposed to the users: PrepareImage (also referred to as "prepare") and CompleteImage (also referred to as "complete" or "configure"). SQL Server PrepareImage takes about 30 minutes for the Engine and RS components, whereas CompleteImage takes a few minutes. The Windows Sysprep process can be run between the two steps to create a Windows OS image and deploy it to target computers, but this is not a required step; the two steps can be run back-to-back on the same computer. Sysprep will save an estimated 30 minutes per install on average. This feature is available in SQL Server 2008 R2 for Database Engine and Reporting Services deployments.
SQL Server PowerPivot Add-in for Excel (formerly known as "Gemini"): This innovative Excel add-in enables Excel power users to easily create powerful BI solutions by streamlining the integration of data from multiple sources, enabling interactive modeling and analysis of massive amounts of data, and supporting the seamless sharing of data models and reports through Microsoft Office SharePoint 2010. SharePoint 2010-based Operations Dashboard: This SharePoint managed service enables front-line operators and administrators to monitor access and utilization of analyses and reports, as well as track patterns of hardware usage, to help ensure the right security privileges are applied and user-generated solutions are available, up-to-date, and managed in a consistent way. SQL Server Reporting Services Report Builder 3.0: This updated ad-hoc reporting client accelerates report creation, collaboration, and consistency by allowing users to create and share report components that can be accessed via the shared component library, and by enabling the rapid assembly of comprehensive business reports using these shared components. Rich visualization of geospatial data: New support for geospatial visualization, including mapping, routing, and custom shapes, can help your end users create customized reports that leverage existing content objects, such as queries, data regions, and charts and graphs. You can also enhance location-based data reports with Bing Maps in Report Builder 3.0.
PowerPivot for Excel supports self-service business intelligence in the following ways. Current row-and-column limitations in Excel are removed so that you can import much more data; this goes far beyond 1,000,000 rows! A data relationship layer lets you integrate data from different sources and work with all of the data holistically: you can enter data, copy data from other worksheets, or import data from corporate databases, and you can build relationships among the data to analyze it as if it all originated from a single source. Create portable, reusable data: data stays inside the workbook, so you do not need to manage external data connections; if you publish, move, copy, or share a workbook, all the data goes with it. PowerPivot data is fully and immediately available to the rest of the workbook: you can switch between the Excel and PowerPivot windows to work on the data and its presentation in PivotTables or charts in an interactive fashion. Working on the data and working on its presentation are not separate tasks; you work on both together in the same Excel environment.
A Business Intelligence project can often run into the sand because of data quality issues, and tools like PowerPivot and Reporting Services will only highlight these problems back to the business. These quality issues aren't simply about keying errors; they relate to the reference data that is stored in multiple places in many systems. An obvious example is the many versions of a customer that exist across these systems, e.g. the marketing system has an address where they send out the catalogue, but this is different from the billing address in the finance system. While this may well need to be fixed, it isn't killing the business, in that bills are being paid by customers even if the odd catalogue is being mis-mailed. The point is that this reference data exists in several systems, and fixing it in the data warehouse is OK for reporting but doesn't resolve issues that can occur in production. Also, this kind of problem is a business process issue rather than a technical one. Having said that, technology can certainly help, and this is where Master Data Services in SQL Server 2008 R2 comes in. The new release provides a portal where end users can manage this reference data.
Master Data Services Configuration Manager, from which you can create and configure Master Data Services databases and Web applications.Master Data Manager, from which users can manage master data.Master Data Services Web service, from which a developer can extend or develop custom solutions for Master Data Services in his or her environment.
Models (Master Data Services): Models are the highest level of data organization in Master Data Services. A model contains the following objects: entities; attributes and attribute groups; hierarchies (derived and explicit); collections. These objects organize and define the master data, which are members and their attribute values. Model objects are maintained in the System Administration functional area of the Master Data Manager user interface, and the master data is maintained in the Explorer area. You can have one or many models. Each model should group similar kinds of data. The master data generally falls into one of four categories: people, places, things, or concepts. For example, you can create a Product model to contain product-related data or a Customer model to contain customer-related data. Initially, you create the structure of your model by creating entities to contain members and their attributes. Then you can produce hierarchies and collections to roll up members in different ways for analysis and publishing to subscribing systems. You can assign users and groups permission to view and update objects within the model. If you do not give permission to the model, it is not displayed. At any given time, you can create copies of the master data within a model; these copies are called versions. When you have defined a model in a test environment, you can deploy it, with or without the corresponding data, from the test environment to a production environment. This eliminates the need to recreate your models in your production environment.
Example: In the following example, the Product model defines the way to organize product-related data.
Product (model)
  Product (entity)
    Name (free-form attribute)
    Code (free-form attribute)
    Subcategory (domain-based attribute and entity)
      Name (free-form attribute)
      Code (free-form attribute)
    Category (domain-based attribute and entity)
      Name (free-form attribute)
      Code (free-form attribute)
    StandardCost (free-form attribute)
    ListPrice (free-form attribute)
    ThumbNailPhoto (file attribute)
Other common models are: Accounts, which could include entities such as balance sheet accounts, income statement accounts, statistics, and account type; Customer, which could include entities such as gender, education, occupation, and marital status; Geography, which could include entities such as postal codes, cities, counties, states, provinces, regions, territories, countries, and continents.
Data volumes are exploding, with event data streaming from sources such as RFID, sensors, and web logs across industries including manufacturing, financial services, and utilities. The size and frequency of the data make it challenging to store for data mining and analysis. The ability to monitor, analyze, and act on the data in motion provides a significant opportunity to make more informed business decisions in near real time.