As Kafka deployments grow within your organization, so do the challenges around lifecycle management. For instance, do you really know which streams exist, or who is producing and consuming them? What is the effect of upstream changes? How is this information kept up to date, so it remains relevant and consistent for others looking to reuse these streams? Ever wish you had a way to graphically view and visualize the relationships between schemas, topics, and applications? In this talk we will show you how to do that and how to get more value from your Kafka streaming infrastructure using an event portal. It's like an API portal, but specialized for event streams and publish/subscribe patterns. Join us to see how you can automatically discover event streams from your Kafka clusters, import them into a catalog, and then leverage code generation capabilities to ease development of new applications.
12. Benefits of an Event Portal
Solace
Your business can…
• Understand who is consuming which events, how much, and for what
• Create value-added services and gain insights by combining events
• Make it easy for users across app teams, LoBs, and the partner ecosystem to find and reuse events
• Monetize popular or particularly high-value event streams
Reuse/Sharing
Architects & developers can:
• Collaborate on events and event-driven apps
• Visualize event-driven interactions to understand the impact of changes
• Low code: generate app code by exporting AsyncAPI definitions
• Benefit from best practices and consistently apply conventions
Productivity
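To make the low-code bullet above concrete: an event portal can export a stream's metadata as an AsyncAPI document, which code generators then turn into application scaffolding. A minimal sketch of such an exported definition follows; the application name, topic, broker address, and payload fields are illustrative assumptions, not output from any specific portal:

```yaml
# Hypothetical AsyncAPI 2.x document as an event portal might export it
asyncapi: '2.6.0'
info:
  title: Order Events            # illustrative application name
  version: '1.0.0'
servers:
  production:
    url: kafka.example.com:9092  # placeholder broker address
    protocol: kafka
channels:
  order.created:                 # Kafka topic, assumed for illustration
    subscribe:
      message:
        name: OrderCreated
        payload:
          type: object
          properties:
            orderId:
              type: string
            amount:
              type: number
```

Fed to a tool such as the AsyncAPI Generator, a definition like this can produce producer/consumer skeletons, so developers start from working code rather than hand-written boilerplate.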
CDO/data governance can:
• Understand where data is coming from (lineage) and where it is going
• Track changes and audit for deviations while promoting apps and dependencies through Dev, Test, QA, and Prod
• Use application/event relationships to create security policies and validate schema compliance
Control