The document discusses why companies should build Apache Kafka connectors. It begins with an agenda that covers what event streaming is, Kafka and connecting to Kafka, the value of building a connector, and a Q&A session. It then discusses how event streaming is enabling new business models and outcomes across many industries. The Confluent Verified Integrations Program helps partners build and verify connectors, with benefits like marketing support and a multi-vendor case management process. The presentation provides information on the verification levels (Gold for Kafka Connect API connectors, Standard for other integrations), requirements, submission process, and how Confluent can help support partners. It encourages attendees to sign up for the program or contact them with any questions.
Why Build an Apache Kafka® Connector
Series - Building Kafka Connectors - the Why and How
Part 1: Why Build a Kafka Connector
Part 2: How to Build a Kafka Connector
Speakers
● Sree Karuthody, Sr. Manager, Technology Partnerships
● Jeff Bean, Partner Solution Architect
● Sid Rabindran, Dir. Partner Programs and Tech Ecosystem Partners
● Lisa Sensmeier, Partner Marketing
Agenda
● What is Event Streaming
● Kafka and Connecting to Kafka
● Value of building a connector
● Verified Integrations Program
● Q and A

Covering the key business drivers behind connecting to Kafka, and the Verified Integrations Program: its benefits, how to participate, and the process.
The New Enterprise Reality

Innovate or be disrupted
● Deliver new, real-time customer experiences
● Create new business models
● Deliver massive internal efficiencies & risk reduction

Every company is a software company
● Capital One: 10,000 of 40,000 employees are software engineers
● Goldman Sachs: 1.5B lines of code across 7,000+ applications

Innovation is about events, not "data"
● Event = something happened
● Your business = a continually updating stream of events
● Your success = your ability to respond to these events
The Rise of Event Streaming
Data as a continuous stream of events
60% of Fortune 100 companies use Apache Kafka
Event Streaming Enables New Outcomes

Auto / Transport
Without event streaming: call for driver availability; no knowledge of driver arrival; no data on feature usage
With event streaming: real-time driver-rider match; real-time ETA; real-time sensor diagnostics

Banking
Without event streaming: nightly updated account balance; batch fraud checks; batch regulatory reporting
With event streaming: real-time account updates; real-time credit card fraud alerts; real-time regulatory reporting

Retail
Without event streaming: post-order "out of stock" emails; no upsell through personalization; batch point-of-sale reports
With event streaming: real-time inventory; real-time recommendations; real-time sales reporting
Confluent Enables Your Event Streaming Success
● Confluent's founders are the original creators of Kafka
● The Confluent team wrote 80% of Kafka commits
● Confluent Platform extends Apache Kafka into a secure, enterprise-ready platform
● Confluent helps enterprises successfully deploy event streaming at scale and accelerate time to market
Hall of Innovation CTO Innovation Award Winner, 2019
Confluent Platform

Operations and Security: Security plugins | Role-Based Access Control | Control Center | Replicator | Auto Data Balancer | Operator
Development & Stream Processing: Connectors | Clients | REST Proxy | MQTT Proxy | Schema Registry | KSQL
Apache Kafka: Connect | Continuous Commit Log | Streams
Support, services, training & partners

A complete event streaming platform with mission-critical reliability and freedom of choice: run it as self-managed software in your datacenter or public cloud, or use Confluent Cloud as a fully managed service.
Confluent Platform
Enterprise streaming platform built by the original creators of Apache Kafka
● End-to-end monitoring and analytics
● Enterprise-grade security
● Multi-datacenter replication
● SQL-based stream processing
● Cloud-native deployment with K8s operator
● Run as self-managed software or as a fully managed service with Confluent Cloud

Connects data across the edge, datacenter, and cloud: IoT, mobile, SaaS apps, databases, data lakes, microservices, and machine learning.
Deploy and Stream with Confluent on Any Cloud

Self-Managed Software
Confluent Platform: the enterprise distribution of Apache Kafka
Deploy on any platform, on-premises or in public clouds

Fully-Managed Service
Confluent Cloud: Apache Kafka re-engineered for the cloud
Available on the leading public clouds
Apache Kafka™ Connect API – Streaming Data Capture

Sources (e.g., JDBC, Mongo, MySQL) → connectors → Kafka pipeline → connectors → sinks (e.g., Elastic, Cassandra, HDFS)

● Fault tolerant
● Manages hundreds of data sources and sinks
● Preserves data schema
● Part of the Apache Kafka project
● Integrated within Confluent Platform's Control Center
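As a sketch of what using a Connect-based connector looks like in practice, the configuration below registers Confluent's JDBC source connector by POSTing it to a Connect worker's REST endpoint (`/connectors`). The connector class and property keys follow the published JDBC connector documentation; the connection URL, table name, and topic prefix are hypothetical placeholders.

```json
{
  "name": "jdbc-orders-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "tasks.max": "1",
    "connection.url": "jdbc:mysql://db.example.com:3306/shop",
    "mode": "incrementing",
    "incrementing.column.name": "order_id",
    "table.whitelist": "orders",
    "topic.prefix": "mysql-"
  }
}
```

With this configuration, Connect polls the `orders` table for rows with an `order_id` higher than the last one it saw and writes each row to the `mysql-orders` topic, preserving the table's schema through the worker's configured converters.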
Leverage the Ecosystem

The Kafka and Confluent ecosystem spans:
● Search/Log
● Alternative Processing
● Databases/Data Stores (e.g., JDBC)
● Data Lake/Data Warehouse
● File/Messaging/Custom (e.g., MQTT, FTP, JMS, SQS)
● Container/Deployment Platforms
New Opportunities
Kafka adoption continues to grow:
● Engage with Early Adopter and Early Majority Kafka users
● Help customers access new, event-driven applications
● Expand your reach into new markets, new use cases, and new opportunities
● Extend the use of your product into a more complete solution offering
Marketing Benefits: Go to Market with Confluent
● Visibility and access on Confluent Hub
● Joint go-to-market activities, including:
  ○ Logo exchange
  ○ Blog opportunity
  ○ Online talks, field events, and lead sharing
  ○ Marketing development fund availability
  ○ Joint collateral
  ○ Confluent sales enablement activities
  ○ Joint tutorials
What our Partners are saying
"Customers want to ingest data streams in real
time from Apache Kafka into Kinetica for
immediate action and analysis. Because the
Confluent Platform adds significant value for
enterprises, we built out the Kinetica connector
using Connect APIs, offering a deeper level of
integration with the Confluent Platform."
-- Irina Farooq, CPO, Kinetica
“Neo4j and Confluent share a customer base that
is determined to push the boundaries of detecting
data connections as interactions and events
occur. Driven by customer need to realize more
value from their streaming data, we have
integrated Neo4j and Kafka both as a sink or
source in a Confluent setup. As a result, Confluent
and Neo4j customers will be able to power their
financial fraud investigations, social media
analyses, network & IT management use cases,
and more with real-time graph analysis.”
-- Philip Rathle, VP of Products, Neo4j
"A Verified Gold Connector with Confluent
Platform is important to our customers who want
a validated and optimized way to enable
operational data flows to and from Couchbase, an
enterprise-class NoSQL database, and Kafka.
With the Kafka Connect API, we have a deeper
level of integration with the Confluent Platform so
together, our joint solution is truly enterprise ready
for any application modernization or cloud native
application initiatives."
-- Anthony Farinha, Senior Director, Business
Development, Couchbase
“In collaboration with Confluent we developed a
Verified Gold connector that enables our
customers to achieve the highest throughput
rates possible. It also enables highly secure,
resilient, and flexible connections between
DataStax database products built on Apache
Cassandra™ and Confluent’s event streaming
platform. We promised our joint enterprise
customers a fully supported microservices-based
application stack and this partnership delivers on
that promise."
—Kathryn Erickson, Senior Director of Strategic
Partnerships, DataStax
"Imply is a real-time analytics solution, built on
Apache Druid, to store, query, and visualize
event-driven data. Connecting Imply to Kafka
and Confluent Platform enables high-throughput
streaming and sub-second interactive queries at
scale. Together they enable enterprise data
applications to analyze clickstreams, user
behavior, network telemetry and more."
-- Gian Merlino, Chief Technology Officer, Imply
"Attunity, a division of Qlik, partners with
Confluent to provide technologies that solve the
very real problem of streaming data
continuously so you can run your business in
real time."
-- Itamar Ankorion, Managing Director, Data
Integration and SVP, Technology Alliances, Qlik
Program Benefits: TSANet
● Multi-vendor case management process and tool
● Solves multi-vendor problems faster and more easily
● Alleviates customer concerns about finger-pointing
Why Gold? The Best Integration for Confluent Platform

The Kafka Connect API is a free, easy-to-use, fault-tolerant framework that provides:
● Standardized sources and sinks
● Schema Registry integration
● Control Center integration
● Better user experience
● Better scalability

Confluent Platform: the commercial standard for Apache Kafka
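To illustrate how the framework, rather than each connector, supplies offset tracking and fault tolerance, here is a minimal standalone Connect worker configuration. This is a sketch assuming a local broker and JSON converters; the file paths are placeholders:

```properties
# connect-standalone.properties (sketch)
bootstrap.servers=localhost:9092
# Converters translate between Connect's internal data and bytes on the topic
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
# Where a standalone worker persists source offsets between restarts
offset.storage.file.filename=/tmp/connect.offsets
# Directory the worker scans for installed connector plugins
plugin.path=/usr/share/java
```

A worker started with this file (e.g., `connect-standalone connect-standalone.properties my-connector.properties`) loads any Connect-API connector from `plugin.path`, so a Gold connector inherits the framework's scalability and recovery behavior without custom plumbing.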
Process
1. Initiated: discussions to start the process
2. Guidance: building the integration or connector
3. Submitted: start the testing with Confluent
4. Verified: published on Confluent.io/Hub; start demand-gen activities
How to Submit: Checklist

Gold
● Contact info
● Connector
● Documentation
  ○ Product brief
  ○ End-to-end usage
  ○ Version info
  ○ Config info
  ○ Testing and results

Standard
● Contact info
● Software package
● Documentation
  ○ Product brief
  ○ Evaluation guide
  ○ Demonstration
Sign Up or Contact Us
confluent.io/verified-integrations-program

Have a connector?
● Review the Checklist
● Contact us to discuss
● Submit your connector

Building a new connector?
● Review the Connector Verification Guide
● Contact us to start the process
Q&A

Sign up with questions or to start the process:
Verified Integrations Program
confluent.io/verified-integrations-program/

Attend part 2 of this online talk series, How to Build a Kafka Connector:
confluent.io/online-talks/

Kafka Summit San Francisco: kafka-summit.org/
Use code KS19Online25 for 25% off