Presentation at the May 2012 Intelligence Workshop held in Rome, Italy.
Interoperability is key to reducing cost in the development and maintenance of applications that span multiple providers or must be supported over long periods of time. This presentation describes the role of network middleware technologies in such systems and how the use of a data-centric middleware, such as OMG DDS, makes developing such systems easier and more cost-effective.
Interoperability for Intelligence Applications using Data-Centric Middleware
1. Your systems. Working as one.
Prerequisites of network-centric intelligence:
Data Distribution Bus
Intelligence Workshop, Rome, May 2012
Gerardo Pardo‐Castellote, Ph.D. [gerardo@rti.com]
CTO, Real‐Time Innovations, Inc. [www.rti.com]
Co‐author of DDS specification
Co‐chair of the OMG Data‐Distribution SIG
12. Levels of Conceptual Interoperability (LCIM)
Level 6 (Conceptual Interoperability): Full assumptions and constraints of a meaningful abstraction of reality. Fully specified, but independent, model.
Level 5 (Dynamic Interoperability): Maintains state changes between systems during run time. Includes assumptions and constraints that affect data interchange.
Level 4 (Pragmatic Interoperability): Systems are aware of the methods & procedures of other systems. Context is understood by all participating systems.
Level 3 (Semantic Interoperability): Meaning of data is exchanged through use of a common information model. The meaning of information is shared and unambiguously defined. [Reached by data-centric middleware]
Level 2 (Syntactic Interoperability): Common structure or common data format for exchanging information. The format of the information exchange is unambiguously defined. [Reached by traditional middleware]
Level 1 (Technical Interoperability): Communication protocol for exchanging data. Bits & bytes are exchanged in an unambiguous manner.
Level 0 (No Interoperability): Stand-alone systems that have no interoperability.
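Level 3 (semantic interoperability) is what the common information model buys you. In DDS, that model is typically captured as a shared type definition in OMG IDL. The sketch below is illustrative only; the type name, field names, and unit conventions are assumptions, not taken from the presentation (the @key annotation is IDL 4.x syntax; older DDS toolchains used a //@key comment instead):

```idl
// Illustrative OMG IDL data model for a shared "track" topic.
// Every participant that uses this type (together with an agreed
// topic name) shares both the format and the meaning of the data.
struct TrackUpdate {
    @key long track_id;    // identifies the instance (this track)
    double latitude;       // degrees (assumed WGS84 convention)
    double longitude;      // degrees (assumed WGS84 convention)
    double altitude;       // meters above mean sea level (assumed)
    long long timestamp;   // time of measurement, e.g. UTC nanoseconds
};
```

Because the type is compiled into every participant from the same IDL, the format (Level 2) and the meaning (Level 3) are defined once, in one place, rather than re-implemented per application.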
23. Quality of Service (QoS)
• Aside from the actual data to be delivered, users often need to specify HOW to send it …
  … reliably (or "send and forget")
  … how much data (all data, last 5 samples, every 2 secs)
  … how long before data is regarded as 'stale' and is discarded
  … how many publishers of the same data are allowed
  … how to 'failover' if an existing publisher stops sending data
  … how to detect "dead" applications
  …
• These options are controlled by formally defined Quality of Service (QoS) policies
24. Real-Time Quality of Service (QoS)

QoS Policy                QoS Policy
DURABILITY                USER DATA
HISTORY                   TOPIC DATA
READER DATA LIFECYCLE     GROUP DATA
WRITER DATA LIFECYCLE     PARTITION
LIFESPAN                  PRESENTATION
ENTITY FACTORY            DESTINATION ORDER
RESOURCE LIMITS           OWNERSHIP
RELIABILITY               OWNERSHIP STRENGTH
TIME BASED FILTER         LIVELINESS
DEADLINE                  LATENCY BUDGET
CONTENT FILTERS           TRANSPORT PRIORITY

On the slide, the policies are grouped by concern: User QoS, Volatility, Presentation, Infrastructure, Redundancy, Delivery, Transport.
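Several of the policies above can be combined declaratively in an XML QoS profile, a format supported by several DDS implementations. The fragment below is a hedged sketch mapping the earlier bullet points to concrete policies; the library and profile names are made up for illustration, and exact element names vary by vendor:

```xml
<dds>
  <qos_library name="IntelLibrary">        <!-- illustrative name -->
    <qos_profile name="ReliableTracks">    <!-- illustrative name -->
      <datawriter_qos>
        <reliability>
          <!-- RELIABLE vs. BEST_EFFORT ("send and forget") -->
          <kind>RELIABLE_RELIABILITY_QOS</kind>
        </reliability>
        <history>
          <!-- "how much data": keep the last 5 samples -->
          <kind>KEEP_LAST_HISTORY_QOS</kind>
          <depth>5</depth>
        </history>
        <lifespan>
          <!-- "how long before data is 'stale'": discard after 2 s -->
          <duration><sec>2</sec><nanosec>0</nanosec></duration>
        </lifespan>
        <ownership>
          <!-- exclusive ownership: one active publisher per instance,
               enabling failover to the next-strongest writer -->
          <kind>EXCLUSIVE_OWNERSHIP_QOS</kind>
        </ownership>
        <liveliness>
          <!-- "detect dead applications": writer must assert
               liveliness within the lease duration -->
          <lease_duration><sec>1</sec><nanosec>0</nanosec></lease_duration>
        </liveliness>
      </datawriter_qos>
    </qos_profile>
  </qos_library>
</dds>
```

The point of the XML profile is that these choices live outside the application code, so the same binaries can be re-tuned for a different deployment without recompilation.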