Develop Composite Business Services To Enable Reuse In A Service Orien...
A Service Oriented Architecture For Order Processing In The I B M Supply Chain
1. Service Oriented Architecture for Order Processing in the IBM Supply Chain. Dr. Germán Goldszmidt ([email_address]), STSM, IBM Software Group; Carl Osipov ([email_address]), Software Engineer, IBM Software Group. Session 2166A, Architecting the On Demand Enterprise, October 16, 2006
5. Use Processes, Workflows, Services, and Components [Diagram: process models are transformed into workflows, which use services that expose distributed components]
30. On Demand Business Process Lifecycle developerWorks article series:
1. Create the foundation for your on demand business processes
2. Patterns for e-business recipe
3. Business process modeling using WebSphere Business Integration Modeler
4. Integrate artifacts from Rational XDE and WebSphere Business Integration Modeler
5. Workflow development, deployment, and testing
6. Apply customization policies and rules
7. Monitor business processes and emit events using CEI
8. Business process monitoring -- Create key performance indicators
9. Involve people
10. Develop message adapters for CICS transaction servers
11. Integrate business processes with CICS transaction servers
12. Implement a compensation service
13. Deploy in a clustered environment
14. Use a clustered WebSphere MQ deployment to balance messaging workload
15. Deploy a scalable, secure and stable foundation for a Service-Oriented Architecture
The goals of the on demand transformation are to achieve faster time to value, lower development costs, and increased capacity for BT projects. The high-level goals of the Oneida-2 PoC included demonstrating value to the business, preparing and planning for an on demand transformation, and documenting best practices for other engagements. A stretch goal was to actually move some of the pilot into the real production environment. We then want to scale and replicate this on multiple other internal IT projects for IBM.

COATS is a shared order entry service for more than 20 manufacturing plants worldwide. It fields hardware orders from IBM customers, IBM Business Partners, IBM sales professionals, and other internal organizations. The application sorts and prioritizes these orders, comparing them against manufacturing rules and the customer's installed hardware base. Then, several times each day, it routes material lists and instructions to the appropriate manufacturing facilities. COATS supports orders for "complex" configured hardware: pSeries, iSeries, zSeries, Storage, Printers, and Retail. Salespersons, business partners, or client buyers indirectly access COATS by introducing orders for new machines, upgrades, customizations, and other changes to existing customer orders. COATS "translates" the flow of customer orders into manufacturing material lists (Bills of Materials) and other instructions, which are then forwarded to IBM manufacturing plants worldwide. The plants then fulfill the orders and ship them to the customers' premises. An example order would be a mainframe with detailed specifications, pre-loaded software, a delivery date, etc. The corresponding output would include a configuration sent to the manufacturing floor of a specific plant, in the corresponding nomenclature, including a bill of materials, a configuration plan, etc. The original application is a complex batch system, used as a shared "insourced" service.
COATS supports many plants, and each plant has its own customization needs and access patterns, e.g., 'high volume, low price' vs. 'high price, low volume'. Its history goes back over 25 years, and it now comprises 1.4 million lines of code in PL/I, OS/390 Assembler, Java, and other languages. It runs as a shared "insourced" service with worldwide 365x24 coverage, and it is running close to capacity at peak times. There are many alterations to orders, including automatic alterations by the scheduler system to meet the customer's on-time date. Multiple databases need to be updated and queried depending on geography and other parameters; for example, the history of sold machines is kept for 5-10 years in multiple history databases. The application is continually being updated (quarterly versions, each taking 6 months to develop) to support new initiatives, product introductions, business opportunities, and outsourcing requirements. It translates sales orders into manufacturing material lists and instructions for new machines, upgrades, and engineering changes, and it delivers orders to manufacturing more than 20 times daily.
1:30 SOA is here, it works, and it is helping IBM save money. In addition to cutting application development costs per release by 25%, the transformed application is more resilient to errors and hence has better availability. The performance of the application improved following the transformation to SOA, and flexibility is designed into the architecture to add capacity to handle higher workloads in the future. Using Modeler not only improved the existing processes but also gave the project stakeholders the flexibility to simulate the system and identify new opportunities for process improvement. The ability to simulate the system is also helpful in analyzing scenarios for responding to changing business requirements and market conditions. To developers, the introduction of Modeler means automated translation of process changes and requirements into BPEL. As you have seen from the demo and the presentation, the application is easy for business analysts to modify via Modeler and the business rule editor. Last but certainly not least, the business services provided by the application became more visible to executives through sense-and-respond metrics collected from the business processes.
1:30 So how does the transformation actually happen? Who needs to be involved, and what products does a team need to transform a legacy application to SOA? Early in the project lifecycle, the business analyst engages with the subject matter experts, who are typically the business owners of the transformation effort, and models the business processes. Throughout this exercise, the business analyst can reuse any of the already modeled processes and business objects. Often some or all of the components needed to implement a business process are not yet in place. The architect is responsible for the design of the required components and for the initial code that should be used for the actual implementation. The artifacts produced by both the architect and the analyst are imported into a programming environment, where the developers continue the work to produce deployable code. Once the application is running in production, a reuse engineer who is familiar with a particular domain can review the assets and identify what can be reused.

First, the business analyst creates a model using WBI Modeler: exploring "to-be" business services, assigning resources, performing process simulations, and identifying patterns in business solutions and other reusable assets from repositories. IT architects use Rational XDE for use case modeling of the business scenario, to develop an object model for the workflow process and services, and to generate data models from UML. The resulting artifacts are used by the development team to produce an executable implementation: Java code is generated from the object model using the XDE code generation facilities, and the generated Java code is imported into WSAD-IE. The exported object model is then converted to XML schema elements using WSAD-IE. The development team uses WSAD-IE to integrate the Java code and the XML schema elements with additional components and services to complete an executable workflow implementation.
They choreograph the business process workflow, create adapters and services, and add technology details and service implementations:
- Generate Web service code for legacy façade components
- Integrate rules (BRBeans) using a rule service façade to integrate with the workflow and components
- Generate the necessary WSDL/XSD and deployment artifacts
- Enable legacy access
- Customize process templates (BPEL+), e.g., Java snippets and custom components
- Integrate legacy components' service interfaces
- Integrate staff activities for decision making, exception handling, etc.
- Generate (business, system) events for Key Performance Indicators
- Integrate a presentation layer through Portal
- Store reusable assets back into the repository
The resulting application (EAR file) is then deployed on WBI-SF (Deployment Manager). Reusable assets such as existing XSD schemas can be imported into WBI Modeler before the modeling process. SMEs identify and create reusable assets.
TOOLS USED: WBI Modeler 5.1, Rational XDE, WSAD-IE 5.1, WBI-SF 5.1, plus Common Event Infrastructure (Tech Preview) and WebSphere Portal Express (Toolkit 5.021); also (not shown) Rational RequisitePro, DB2, MQ Series, CICS 2.3, and the IBM Open Bazaar CVS repository.
Development Steps
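The first step above, generating Web service code for a legacy façade component, can be sketched as a plain Java interface with a stub implementation. Every name here is hypothetical: the real COATS façade and its WSDL/SOAP bindings were generated by WSAD-IE, not hand-written like this.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical service endpoint interface for a legacy façade component.
// Tooling such as WSAD-IE would generate WSDL and bindings from an
// interface like this; names and mappings are illustrative only.
interface LegacyOrderFacade {
    /** Translates a sales nomenclature code into a manufacturing part number. */
    String toPartNumber(String salesCode);
}

public class LegacyOrderFacadeStub implements LegacyOrderFacade {
    // In the real system this translation is performed by the COATS legacy
    // modules (reached via SOAP for CICS); here it is a stub lookup table.
    private final Map<String, String> nomenclature = new HashMap<>();

    public LegacyOrderFacadeStub() {
        nomenclature.put("MAINFRAME-Z", "PN-00471");   // invented example codes
        nomenclature.put("STORAGE-ARRAY", "PN-00932");
    }

    public String toPartNumber(String salesCode) {
        String part = nomenclature.get(salesCode);
        if (part == null) {
            throw new IllegalArgumentException("Unknown sales code: " + salesCode);
        }
        return part;
    }
}
```

The value of the façade is that the workflow sees only the narrow interface, so the legacy translation logic can be replaced or re-hosted without touching the BPEL flow.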
0:45 Let's take a look at the development lifecycle step by step, starting with process model development. The figure on the slide shows a modeled portion of an "as-is" business process. One of the advantages of having a process model is that once the model is created, the business analyst can make changes to it and immediately simulate the impact of these changes on the KPIs defined by the stakeholders. The goal of exploring various changes to the model is to arrive at the "to-be" version of the process that the business stakeholders want to implement.
Traditionally, business analysts have experience using Visio or a similar diagramming tool to model business processes. Since a significant aspect of the business analyst's role is communicating the process model to the various stakeholders, we usually find that analysts settle on a particular style for the appearance of the process models, which makes it easier to communicate the model to the stakeholders.
2:30 What are the caveats of the workflow development process? One of the common mistakes teams make is forgetting that the workflow implementation in WSAD-IE is EJB based, and consequently not hiring people with sufficient expertise in Java/EJB. In fact, because the workflow implementation code is auto-generated and is not well commented, it is imperative that the developers who are expected to maintain the workflow code be familiar with the underlying Java/EJB technologies. Another issue developers overlook when designing complex schemas from scratch, or when adopting schemas from standards organizations such as OASIS, is that WSAD-IE and WBI-SF do not support arbitrary schemas out of the box. Before signing off on a schema implementation, it is important to cross-reference the schema definition against the list of limitations documented in the WSAD-IE Information Center; the same applies to the WID/WPS products. If a project is considering adopting the WBI framework, one of the key questions to answer early in the project lifecycle relates to change management of the business process models and the BPEL workflows: the current tooling does not support round-tripping of changes from WSAD-IE/WID to Modeler or vice versa. Another design decision for a project is to identify a strategy for handling the rules, metrics, roles, and other requirements embedded in a process model. These definitions are lost when the model is exported to BPEL and will not be available in the final deployable artifact, the EAR file. I'll talk about an approach to handling one of these elements of the process model in the next few slides, which describe the business policy and rule implementation. Finally, another feature that was available in the past as part of WebSphere InterChange Server (WICS), adapters, is not yet fully supported in WSAD-IE, although there is some partial support for the WICS adapters in WPS and WID 6.0.1.
1:30 The previous slide mentioned business policies as one example of an aspect of a business process model that does not map directly to BPEL syntax. So what is a policy? It is typically a declarative statement that places a restriction on the operation of business processes, e.g., "only USA customers can order machine X". Each policy may require one or more enforcement points in the implementation. An enforcement point may be implemented as an explicit step in a process or as a specific location in the code. After the enforcement points are defined in the process and the process model is converted into BPEL, the workflow developer needs to implement the business rules that support the policies. One of the options for implementing the rules, which I'll describe in more detail on the next slide, is to make the rules services external to the process and to use the decisions made by the rules as variables in the BPEL workflow. This option is essentially an implementation of the façade pattern, where a single Web service interface façade provides access to the business rules needed by the workflow. The business rules are then implemented as invoke activities in the flow and are bound to the specific rule implementation through partner links. Finally, after the application is running, the business rules can be managed using clients shipped with the WBI-SF and WPS products.
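The rule-façade option above can be sketched as follows, using the example policy "only USA customers can order machine X". All names are hypothetical: in the actual project the rule body lived in the BRBeans engine behind a Web service façade, and the workflow stored the returned decision in a BPEL process variable.

```java
// Minimal sketch of a rule service façade, assuming an invented class and
// method name. The workflow invokes this service and sees only the boolean
// decision; the policy logic stays outside the BPEL flow.
public class RuleServiceFacade {

    /** Enforcement point for the policy "only USA customers can order machine X". */
    public boolean isOrderPermitted(String customerCountry, String machineModel) {
        // In BRBeans this rule would be externalized and editable at run time;
        // here it is inlined purely for illustration.
        if ("X".equals(machineModel)) {
            return "USA".equals(customerCountry);
        }
        return true; // no policy restricts other machine models in this sketch
    }
}
```

Because the flow only depends on the façade's interface, a business user can later change the rule (or add new ones behind the same façade) without redeploying the workflow itself.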
The Oneida Lifecycle goes beyond the development lifecycle to cover Business Process Management (BPM). BPM covers the collection of business process events and performance data during execution, their presentation and analysis, followed by adaptation and optimization. We used the Common Base Events (CBE) format to emit the relevant events from the execution. CBEs are retrieved and correlated to create and display Key Performance Indicators (KPIs), which are used to refine the business processes. CBEs are stored and distributed using the Common Event Infrastructure (CEI). The CEI accepts events from various sources and performs a set of tasks: it persists the events for later query and reporting, and it publishes the events to interested subscribers. It can filter out selected events based on XPath expressions, and it can support multiple qualities of service for event transport and delivery. CBE is its native format, but CEI can support multiple event message formats. Events are correlated and monitored by the business executives using a performance dashboard that highlights the Key Performance Indicators. We used portlets to render and aggregate information into composite pages to provide information to users in a compact form. The portal also allows operators to handle assigned work items, such as order exceptions.
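The persist/publish/filter pattern described above can be sketched as a toy in-memory model. This is not the real CEI or CBE API; the class and field names are invented, and it only illustrates how persisted events can be filtered and aggregated into a KPI such as average handling time.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

// Toy sketch of the CEI pattern: events are persisted, and consumers apply
// filters to select the events they care about (CEI uses XPath filters;
// here a plain predicate stands in for that).
public class EventStore {

    /** Simplified stand-in for a Common Base Event. */
    public static class Event {
        public final String situation;   // e.g. "OrderValidated" (invented)
        public final long elapsedMillis; // time the step took
        public Event(String situation, long elapsedMillis) {
            this.situation = situation;
            this.elapsedMillis = elapsedMillis;
        }
    }

    private final List<Event> store = new ArrayList<>(); // persisted for later query

    public void emit(Event e) { store.add(e); }

    /** Dashboard-style KPI query: average elapsed time over matching events. */
    public double averageMillis(Predicate<Event> filter) {
        long total = 0;
        int n = 0;
        for (Event e : store) {
            if (filter.test(e)) { total += e.elapsedMillis; n++; }
        }
        return n == 0 ? 0.0 : (double) total / n;
    }
}
```

A KPI portlet would run queries like this periodically and render the aggregates, while subscribers with different filters receive only the events relevant to them.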
Learn how to build reusable assets to transform an order processing system in the developerWorks series "On demand business process life cycle": http://www-128.ibm.com/developerworks/ibm/ws-odbp/
Find out more about Oneida and COATS at http://w3.ibm.com/articles/workingknowledge/2005/04/swg_odsd_oneida.html
See the latest OMCS/COATS demo at http://w3.webahead.ibm.com/w3ki/display/oneida/Demos
Business value: a) increased order volume capacity; b) faster turnaround time to implement manufacturing requirements (an online request from manufacturing now takes 10 seconds vs. 4 minutes in the previous environment!); c) the ability to make on demand changes to the runtime workflow through easily selectable business rules (e.g., an early added rule).
In COATS, there was a daily average of approximately 2,500 requests (about 300 at peak hours). At roughly 4 minutes saved per request, that is potentially more than 150 hours of daily speed-up in the handling of orders. (At this time, only a selected few are using OMCS on a trial basis; yesterday there were only 50 such requests.) Today, using COATS, a manufacturing plant customer who enters an "order transaction" (to speed up an order, moving it more quickly through processing to get it down to the manufacturing floor) must wait about 4 minutes for its completion. Using the OMCS R1 portal, a similar "order transaction" now takes approximately 10 seconds. We expect about 200 such transactions on an average day using OMCS R1; thus, for the aggregate customers of the supply chain, the wait time was reduced by about 13 hours per day. Additional performance benefits are expected in R2, when a significantly larger volume of orders will flow through OMCS; all Release requests will be processed in R2 as well, including online processing for MFC. The manufacturing requests are typically "manual fixes" of orders, in which staff from the plant update a sales order with the right data (from sales nomenclature to available part numbers) or attempt to expedite an order in the scheduling system because it has priority, etc.
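The savings figures quoted above follow directly from the per-request improvement, roughly 4 minutes under COATS down to about 10 seconds under OMCS R1. A quick check of the arithmetic:

```java
// Verifying the time-savings claims in the notes: each request saves
// about 4 minutes minus 10 seconds, i.e. 230 seconds.
public class SavingsEstimate {

    /** Hours saved per day for a given daily request count. */
    public static double hoursSavedPerDay(int requestsPerDay) {
        double secondsSavedPerRequest = 4 * 60 - 10; // 230 seconds per request
        return requestsPerDay * secondsSavedPerRequest / 3600.0;
    }

    public static void main(String[] args) {
        // ~2,500 requests/day -> just under 160 hours, consistent with the
        // "more than 150 hours daily" figure.
        System.out.println(hoursSavedPerDay(2500));
        // ~200 OMCS R1 transactions/day -> roughly 13 hours, as quoted.
        System.out.println(hoursSavedPerDay(200));
    }
}
```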
Since Feb 7, we have fed actual orders online through the new OMCS step for NEWC validation, continuing with the legacy path to manufacturing (CICS front end, WebSphere-to-DB2 connection, COBOL instead of PL/I). The OMCS (Order Management Component Services) project is transforming a subset of the overall COATS system into a real-time order submission system. This project makes extensive use of workflows, services, business rules, an enterprise service bus (ESB), and other integration technology provided by WBI-SF. Users are integrated with the applications through WPS. We connected systems such as legacy COATS and the NEWC Order Validation/Configuration services with the business process through CICS and MQ. The development process started with business process modeling using WBI Modeler. The workflows are enhanced using the WSAD-IE tool with business objects from Rational XDE and integration of legacy adapters. During this work we harvested reusable development assets and best practices:
§ connecting to legacy systems using SOAP for CICS modules
§ rule access services and an object framework to integrate the BRBeans rule engine
§ a shared business process (e.g., the Milestone Manager Process) across business units
§ using IBM Open Bazaar to share processes/code/workflows across projects
§ stored procedures to connect to legacy databases from the workflows
Value to OMCS: This project helps COATS implement a method for incremental transformation of its legacy business functions. We demonstrated incremental development methods and the proper use of the tools and middleware components. This results in improved performance, reduced development cost, and faster development turnaround time. The customizable business process improved runtime customization of the processes by business owners. A better error handling facility included with the processes enabled faster exception processing, which resulted in substantial revenue savings.
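The last practice above, reaching legacy databases from the workflows via stored procedures, can be sketched with standard JDBC. The procedure name and parameters here are hypothetical; only the JDBC escape syntax for stored-procedure calls is standard.

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.ResultSet;

// Sketch of calling a legacy DB2 stored procedure from a workflow component
// over JDBC. GET_ORDER_HISTORY is an invented procedure name.
public class LegacyDbCall {

    /** Builds the standard JDBC escape syntax for a stored-procedure call. */
    public static String callSyntax(String procName, int paramCount) {
        StringBuilder sb = new StringBuilder("{call ").append(procName).append("(");
        for (int i = 0; i < paramCount; i++) {
            sb.append(i == 0 ? "?" : ", ?");
        }
        return sb.append(")}").toString();
    }

    /** Example invocation; conn would come from the server's DataSource. */
    public static ResultSet fetchOrderHistory(Connection conn, String mfgNo)
            throws Exception {
        CallableStatement cs = conn.prepareCall(callSyntax("GET_ORDER_HISTORY", 1));
        cs.setString(1, mfgNo);
        return cs.executeQuery();
    }
}
```

Keeping the database access behind a component like this lets the workflow treat the legacy history databases as just another service partner.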
Value to the external market: This project demonstrated an on demand transformation of a legacy business through the proper use of IBM middleware components. The assets and best practices demonstrated in this project could be replicated across other business units. For example, knowledge could be shared about how to transform PL/I modules and integrate them with the business processes, how to connect legacy functions from a workflow using MQ, and how to implement customizable workflows. The SOA-based development methods and proper use of tools illustrate faster time to market. The process customization, incremental transformation, and legacy integration methods demonstrated in this project will help create reusable assets and best practices across the industry. From the Web, the customer can request to send from 1 to 30 MFGNOs. That in turn retrieves all the order numbers associated with those MFGNOs and sends them to NEWC for Release processing. The data returned from NEWC updates 5 COATS DB2 tables with topology and process information. We had 89 Release batch runs, with an average batch run (based on 1/09 to 2/09) of 4 minutes. They processed about 7,000 orders to Release, so the average would be approximately 80 per run.