Himani Arora & Prabhat Kashyap
Software Consultants
@_himaniarora @pk_official
Who are we?
Himani Arora
@_himaniarora
Software Consultant @ Knoldus Software LLP
Contributed to Apache Kafka, Jupyter,
Apache CarbonData, Lightbend Lagom, etc.
Currently learning Apache Kafka
Prabhat Kashyap
@pk_official
Software Consultant @ Knoldus Software LLP
Contributed to Apache Kafka, Apache
CarbonData, and Lightbend Templates
Currently learning Apache Kafka
Agenda
● What is stream processing
● Paradigms of programming
● Stream processing with Kafka
● What is Kafka Streams
● Inside Kafka Streams
● Demonstration of stream processing using Kafka Streams
● Overview of Kafka Connect
● Demo with Kafka Connect
What is stream processing?
● Real-time processing of data
● Does not treat data as static tables or files
● Data has to be processed fast, so that a firm can react to changing business conditions in real time; this is required for trading, fraud detection, system monitoring, and many other use cases.
● A “too late architecture” cannot realize these use cases.
BIG DATA VERSUS FAST DATA
3 PARADIGMS OF PROGRAMMING
● REQUEST/RESPONSE
● BATCH SYSTEMS
● STREAM PROCESSING
REQUEST/RESPONSE
BATCH SYSTEM
STREAM PROCESSING
STREAM PROCESSING with KAFKA
2 APPROACHES:
● DO IT YOURSELF (DIY!) STREAM PROCESSING
● STREAM PROCESSING FRAMEWORK
DIY STREAM PROCESSING
Major Challenges:
● FAULT TOLERANCE
● PARTITIONING AND SCALABILITY
● TIME
● STATE
● REPROCESSING
STREAM PROCESSING FRAMEWORK
Some already available stream processing frameworks are:
SPARK
STORM
SAMZA
FLINK ETC...
KAFKA STREAMS : ANOTHER WAY OF STREAM PROCESSING
Let’s start with Kafka Streams, but wait… what is KAFKA?
Hello! Apache Kafka
● Apache Kafka is an Open Source project under the Apache License 2.0
● Apache Kafka was originally developed by LinkedIn.
● On 23 October 2012, Apache Kafka graduated from the incubator to a top-level Apache project.
● Components of Apache Kafka
○ Producer
○ Consumer
○ Broker
○ Topic
○ Data
○ Parallelism
Enterprises that use Kafka
What is Kafka Streams
● It is Streams API of Apache Kafka, available through a Java library.
● Kafka Streams is built on top of functionality provided by Kafka itself.
● It is, by deliberate design, tightly integrated with Apache Kafka.
● It can be used to build highly scalable, elastic, fault-tolerant, distributed
applications and microservices.
● Kafka Streams API allows you to create real-time applications.
● It is the easiest yet the most powerful technology to process data stored
in Kafka.
If we look closer
● A key motivation of the Kafka Streams API is to bring stream processing out of
the Big Data niche into the world of mainstream application development.
● Using the Kafka Streams API you can implement standard Java applications to
solve your stream processing needs.
● Your applications are fully elastic: you can run one or more instances of your
application.
● The Kafka Streams API follows a lightweight and integrative approach – “Build
applications, not infrastructure!”.
● Deployment-wise, you are free to choose any technology that can deploy Java
applications.
Capabilities of Kafka Streams
● Powerful
○ Makes your applications highly scalable, elastic, distributed, fault-tolerant
○ Stateful and stateless processing
○ Event-time processing with windowing, joins, aggregations
● Lightweight
○ Low barrier to entry
○ No processing cluster required
○ No external dependencies other than Apache Kafka
Capabilities of Kafka Streams
● Real-time
○ Millisecond processing latency
○ Record-at-a-time processing (no micro-batching)
○ Seamlessly handles late-arriving and out-of-order data
○ High throughput
● Fully integrated
○ 100% compatible with Apache Kafka 0.10.2 and 0.10.1
○ Easy to integrate into existing applications and microservices
○ Runs everywhere: on-premises, public clouds, private clouds, containers, etc.
○ Integrates with databases through continuous change data capture (CDC) performed by
Kafka Connect
Key concepts of Kafka Streams
● Stateful Stream Processing
● KStream
● KTable
● Time
● Aggregations
● Joins
● Windowing
Key concepts of Kafka Streams
● Stateful Stream Processing
– Some stream processing applications don’t require state – they
are stateless.
– In practice, however, most applications require state – they are
stateful.
– The state must be managed in a fault-tolerant manner.
– Application is stateful whenever, for example, it needs to join,
aggregate, or window its input data.
Key concepts of Kafka Streams
● KStream
– A KStream is an abstraction of a record stream.
– Each data record represents a self-contained datum in the
unbounded data set.
– Using the table analogy, data records in a record stream are
always interpreted as an “INSERT”.
– Let’s imagine the following two data records are being sent to
the stream:
("alice", 1) --> ("alice", 3)
Key concepts of Kafka Streams
● KTable
– A KTable is an abstraction of a changelog stream.
– Each data record represents an update.
– Using the table analogy, data records in a changelog stream are
always interpreted as an “UPDATE”.
– Let’s imagine the following two data records are being sent to
the stream:
("alice", 1) --> ("alice", 3)
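The difference between the two views can be modeled in a few lines of plain Java — a conceptual sketch, not the Kafka Streams API itself: aggregating the stream (INSERT) view of ("alice", 1) and ("alice", 3) sums both records, while the table (UPDATE) view keeps only the latest value per key.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class StreamVsTable {
    // KStream view: every record is an INSERT; aggregating "alice" sums both records.
    static int streamSum(List<Map.Entry<String, Integer>> records, String key) {
        int sum = 0;
        for (Map.Entry<String, Integer> r : records)
            if (r.getKey().equals(key)) sum += r.getValue();
        return sum;
    }

    // KTable view: every record is an UPDATE; only the latest value per key survives.
    static Integer tableValue(List<Map.Entry<String, Integer>> records, String key) {
        Map<String, Integer> table = new HashMap<>();
        for (Map.Entry<String, Integer> r : records)
            table.put(r.getKey(), r.getValue()); // upsert: overwrite previous value
        return table.get(key);
    }

    public static void main(String[] args) {
        List<Map.Entry<String, Integer>> records =
            List.of(Map.entry("alice", 1), Map.entry("alice", 3));
        System.out.println("stream sum  = " + streamSum(records, "alice"));  // 4
        System.out.println("table value = " + tableValue(records, "alice")); // 3
    }
}
```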
Key concepts of Kafka Streams
● Time
– A critical aspect in stream processing is the notion of time.
– Kafka Streams supports the following notions of time:
● Event Time
● Processing Time
● Ingestion Time
– Kafka Streams assigns a timestamp to every data record via
so-called timestamp extractors.
Key concepts of Kafka Streams
● Aggregations
– An aggregation operation takes one input stream or table, and
yields a new table.
– It is done by combining multiple input records into a single
output record.
– In the Kafka Streams DSL, an input stream of an aggregation
operation can be a KStream or a KTable, but the output
stream will always be a KTable.
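These semantics can be sketched in plain Java (a conceptual model, not the Streams DSL): counting records per key collapses many input records into one continuously updated row per key — the output "table".

```java
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

public class AggregationSketch {
    // Combine multiple input records per key into a single output record:
    // here, the running count. Each update overwrites the key's row,
    // which is exactly the KTable semantics of an aggregation result.
    static Map<String, Long> countByKey(List<String> keys) {
        Map<String, Long> counts = new TreeMap<>();
        for (String k : keys)
            counts.merge(k, 1L, Long::sum);
        return counts;
    }

    public static void main(String[] args) {
        System.out.println(countByKey(List.of("kafka", "streams", "kafka")));
        // {kafka=2, streams=1}
    }
}
```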
Key concepts of Kafka Streams
● Joins
– A join operation merges two input streams and/or tables based
on the keys of their data records, and yields a new
stream/table.
Key concepts of Kafka Streams
● Windowing
– Windowing lets you control how to group records that have the same
key for stateful operations such as aggregations or joins into
so-called windows.
– Windows are tracked per record key.
– When working with windows, you can specify a retention period for
the window.
– This retention period controls how long Kafka Streams will wait for
out-of-order or late-arriving data records for a given window.
– If a record arrives after the retention period of a window has passed,
the record is discarded and will not be processed in that window.
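The retention behavior can be illustrated with a plain-Java sketch (a conceptual model; the method names are illustrative, not Kafka Streams API). Stream time advances with the highest timestamp seen so far, and a record whose window closed longer ago than the retention period is dropped.

```java
import java.util.Map;
import java.util.TreeMap;

public class WindowingSketch {
    // Assign a record timestamp (ms) to its tumbling window of the given size.
    static long windowStart(long ts, long sizeMs) { return ts - (ts % sizeMs); }

    // Maintain per-window counts, discarding records that arrive after the
    // window's retention period has passed relative to stream time.
    static Map<Long, Long> windowedCounts(long[] timestamps, long sizeMs, long retentionMs) {
        Map<Long, Long> counts = new TreeMap<>();
        long streamTime = Long.MIN_VALUE;
        for (long ts : timestamps) {
            streamTime = Math.max(streamTime, ts);
            long start = windowStart(ts, sizeMs);
            if (start + sizeMs + retentionMs < streamTime) continue; // too late: dropped
            counts.merge(start, 1L, Long::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        // 10s windows, 15s retention; the record at t=2s arrives after stream
        // time has advanced to 40s, so its window [0s,10s) is already closed.
        long[] ts = {1_000, 12_000, 40_000, 2_000};
        System.out.println(windowedCounts(ts, 10_000, 15_000));
        // {0=1, 10000=1, 40000=1} -- the late record at t=2s was discarded
    }
}
```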
Inside Kafka Streams
Processor Topology
Stream Partitions and Tasks
● Each stream partition is a totally ordered sequence of data records and
maps to a Kafka topic partition.
● A data record in the stream maps to a Kafka message from that topic.
● The keys of data records determine the partitioning of data in both Kafka
and Kafka Streams, i.e., how data is routed to specific partitions within
topics.
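A minimal sketch of this key-based routing (Kafka's default producer partitioner actually uses murmur2 hashing; `String.hashCode` stands in here as an illustration):

```java
public class PartitioningSketch {
    // Route a record to a partition by hashing its key modulo the partition
    // count. The mask keeps the hash non-negative before the modulo.
    static int partitionFor(String key, int numPartitions) {
        return (key.hashCode() & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        int partitions = 4;
        // Records with the same key always land in the same partition,
        // which is what gives Kafka (and Kafka Streams) per-key ordering.
        System.out.println("alice -> " + partitionFor("alice", partitions));
        System.out.println("alice -> " + partitionFor("alice", partitions));
        System.out.println("bob   -> " + partitionFor("bob", partitions));
    }
}
```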
Threading Model
● Kafka Streams allows the user to configure the number of threads that
the library can use to parallelize processing within an application
instance.
● Each thread can execute one or more stream tasks with their processor
topologies independently.
State
● Kafka Streams provides so-called state stores.
● State can be used by stream processing applications to store and query
data, which is an important capability when implementing stateful
operations.
Backpressure
● Kafka Streams does not use a backpressure mechanism because it
does not need one.
● It uses a depth-first processing strategy.
● Each record consumed from Kafka will go through the whole processor
(sub-)topology for processing and for (possibly) being written back to
Kafka before the next record will be processed.
● No records are being buffered in-memory between two connected
stream processors.
● Kafka Streams leverages Kafka’s consumer client behind the scenes.
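The depth-first flow can be illustrated with a plain-Java sketch (conceptual, not the Streams runtime): each record traverses the whole processor chain before the next record is consumed, so nothing is buffered between processors.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.UnaryOperator;

public class DepthFirstSketch {
    // Push each record through the entire processor (sub-)topology before
    // consuming the next one -- no buffering between connected processors.
    static List<String> process(List<String> records, List<UnaryOperator<String>> processors) {
        List<String> out = new ArrayList<>();
        for (String record : records) {                    // one record at a time
            String value = record;
            for (UnaryOperator<String> p : processors)     // full chain, depth-first
                value = p.apply(value);
            out.add(value);                                // "written back to Kafka"
        }
        return out;
    }

    public static void main(String[] args) {
        List<UnaryOperator<String>> topology =
            List.of(String::toUpperCase, s -> s + "!");
        System.out.println(process(List.of("a", "b"), topology)); // [A!, B!]
    }
}
```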
DEMO
Kafka Streams
HOW TO GET DATA IN AND OUT OF KAFKA?
KAFKA CONNECT
Kafka Connect
● So-called Sources import data into Kafka, and Sinks export data from
Kafka.
● An implementation of a Source or Sink is a Connector, and users deploy
connectors to enable data flows on Kafka.
● All Kafka Connect sources and sinks map to partitioned streams of
records.
● This is a generalization of Kafka’s concept of topic partitions: a stream
refers to the complete set of records that are split into independent
infinite sequences of records.
CONFIGURING CONNECTORS
● Connector configurations are key-value mappings.
● For standalone mode these are defined in a properties file and
passed to the Connect process on the command line.
● In distributed mode, they will be included in the JSON payload
sent over the REST API for the request that creates the connector.
CONFIGURING CONNECTORS
A few settings that are common to all connectors:
● name - Unique name for the connector. Attempting to register again
with the same name will fail.
● connector.class - The Java class for the connector
● tasks.max - The maximum number of tasks that should be created for
this connector. The connector may create fewer tasks if it cannot
achieve this level of parallelism.
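For example, the stock FileStreamSource connector that ships with Kafka is configured in standalone mode with a properties file along these lines (the file and topic values are illustrative):

```properties
# Standalone-mode connector configuration: key-value pairs
name=local-file-source
connector.class=FileStreamSource
tasks.max=1
# connector-specific settings for the file source
file=test.txt
topic=connect-test
```

In distributed mode, the same settings go into the `config` object of the JSON payload POSTed to the Connect REST API when creating the connector.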
REFERENCES
● https://www.slideshare.net/ConfluentInc/demystifying-stream-processing-with-apache-kafka-69228952
● https://www.confluent.io/blog/introducing-kafka-streams-stream-processing-made-simple/
● http://docs.confluent.io/3.2.0/streams/index.html
● http://docs.confluent.io/3.2.0/connect/index.html
Thank You
 
Building Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
Building Real-Time Data Pipelines: Stream & Batch Processing workshop SlideBuilding Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
Building Real-Time Data Pipelines: Stream & Batch Processing workshop SlideChristina Lin
 
A Secure and Reliable Document Management System is Essential.docx
A Secure and Reliable Document Management System is Essential.docxA Secure and Reliable Document Management System is Essential.docx
A Secure and Reliable Document Management System is Essential.docxComplianceQuest1
 
Right Money Management App For Your Financial Goals
Right Money Management App For Your Financial GoalsRight Money Management App For Your Financial Goals
Right Money Management App For Your Financial GoalsJhone kinadey
 
Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...
Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...
Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...MyIntelliSource, Inc.
 
Optimizing AI for immediate response in Smart CCTV
Optimizing AI for immediate response in Smart CCTVOptimizing AI for immediate response in Smart CCTV
Optimizing AI for immediate response in Smart CCTVshikhaohhpro
 

Kürzlich hochgeladen (20)

why an Opensea Clone Script might be your perfect match.pdf
why an Opensea Clone Script might be your perfect match.pdfwhy an Opensea Clone Script might be your perfect match.pdf
why an Opensea Clone Script might be your perfect match.pdf
 
What is Binary Language? Computer Number Systems
What is Binary Language?  Computer Number SystemsWhat is Binary Language?  Computer Number Systems
What is Binary Language? Computer Number Systems
 
TECUNIQUE: Success Stories: IT Service provider
TECUNIQUE: Success Stories: IT Service providerTECUNIQUE: Success Stories: IT Service provider
TECUNIQUE: Success Stories: IT Service provider
 
Tech Tuesday-Harness the Power of Effective Resource Planning with OnePlan’s ...
Tech Tuesday-Harness the Power of Effective Resource Planning with OnePlan’s ...Tech Tuesday-Harness the Power of Effective Resource Planning with OnePlan’s ...
Tech Tuesday-Harness the Power of Effective Resource Planning with OnePlan’s ...
 
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
 
Der Spagat zwischen BIAS und FAIRNESS (2024)
Der Spagat zwischen BIAS und FAIRNESS (2024)Der Spagat zwischen BIAS und FAIRNESS (2024)
Der Spagat zwischen BIAS und FAIRNESS (2024)
 
Hand gesture recognition PROJECT PPT.pptx
Hand gesture recognition PROJECT PPT.pptxHand gesture recognition PROJECT PPT.pptx
Hand gesture recognition PROJECT PPT.pptx
 
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...
 
Learn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdf
Learn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdfLearn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdf
Learn the Fundamentals of XCUITest Framework_ A Beginner's Guide.pdf
 
Software Quality Assurance Interview Questions
Software Quality Assurance Interview QuestionsSoftware Quality Assurance Interview Questions
Software Quality Assurance Interview Questions
 
Adobe Marketo Engage Deep Dives: Using Webhooks to Transfer Data
Adobe Marketo Engage Deep Dives: Using Webhooks to Transfer DataAdobe Marketo Engage Deep Dives: Using Webhooks to Transfer Data
Adobe Marketo Engage Deep Dives: Using Webhooks to Transfer Data
 
Test Automation Strategy for Frontend and Backend
Test Automation Strategy for Frontend and BackendTest Automation Strategy for Frontend and Backend
Test Automation Strategy for Frontend and Backend
 
Diamond Application Development Crafting Solutions with Precision
Diamond Application Development Crafting Solutions with PrecisionDiamond Application Development Crafting Solutions with Precision
Diamond Application Development Crafting Solutions with Precision
 
The Ultimate Test Automation Guide_ Best Practices and Tips.pdf
The Ultimate Test Automation Guide_ Best Practices and Tips.pdfThe Ultimate Test Automation Guide_ Best Practices and Tips.pdf
The Ultimate Test Automation Guide_ Best Practices and Tips.pdf
 
Exploring iOS App Development: Simplifying the Process
Exploring iOS App Development: Simplifying the ProcessExploring iOS App Development: Simplifying the Process
Exploring iOS App Development: Simplifying the Process
 
Building Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
Building Real-Time Data Pipelines: Stream & Batch Processing workshop SlideBuilding Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
Building Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
 
A Secure and Reliable Document Management System is Essential.docx
A Secure and Reliable Document Management System is Essential.docxA Secure and Reliable Document Management System is Essential.docx
A Secure and Reliable Document Management System is Essential.docx
 
Right Money Management App For Your Financial Goals
Right Money Management App For Your Financial GoalsRight Money Management App For Your Financial Goals
Right Money Management App For Your Financial Goals
 
Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...
Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...
Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...
 
Optimizing AI for immediate response in Smart CCTV
Optimizing AI for immediate response in Smart CCTVOptimizing AI for immediate response in Smart CCTV
Optimizing AI for immediate response in Smart CCTV
 

Stream processing using Kafka

  • 1. Himani Arora & Prabhat Kashyap Software Consultant @_himaniarora @pk_official
  • 2. Who we are? Himani Arora @_himaniarora Software Consultant @ Knoldus Software LLP Contributed to Apache Kafka, Jupyter, Apache CarbonData, Lightbend Lagom, etc. Currently learning Apache Kafka Prabhat Kashyap @pk_official Software Consultant @ Knoldus Software LLP Contributed to Apache Kafka, Apache CarbonData and Lightbend Templates Currently learning Apache Kafka
  • 3. Agenda ● What is Stream processing ● Paradigms of programming ● Stream Processing with Kafka ● What are Kafka Streams ● Inside Kafka Streams ● Demonstration of stream processing using Kafka Streams ● Overview of Kafka Connect ● Demo with Kafka Connect
  • 4. What is stream processing? ● Real-time processing of data ● Does not treat data as static tables or files ● Data has to be processed fast, so that a firm can react to changing business conditions in real time. This is required for trading, fraud detection, system monitoring, and many other examples. ● A “too late architecture” cannot realize these use cases.
  • 5. BIG DATA VERSUS FAST DATA
  • 6. 3 PARADIGMS OF PROGRAMMING ● REQUEST/RESPONSE ● BATCH SYSTEMS ● STREAM PROCESSING
  • 10. STREAM PROCESSING with KAFKA 2 APPROACHES: ● DO IT YOURSELF (DIY!) STREAM PROCESSING ● STREAM PROCESSING FRAMEWORK
  • 11. DIY STREAM PROCESSING Major Challenges: ● FAULT TOLERANCE ● PARTITIONING AND SCALABILITY ● TIME ● STATE ● REPROCESSING
  • 12. STREAM PROCESSING FRAMEWORK Many stream processing frameworks are already available: SPARK STORM SAMZA FLINK ETC...
  • 13. KAFKA STREAMS : ANOTHER WAY OF STREAM PROCESSING
  • 14. Let’s start with Kafka Streams... but wait, what is KAFKA?
  • 15. Hello! Apache Kafka ● Apache Kafka is an Open Source project under the Apache License 2.0. ● Apache Kafka was originally developed by LinkedIn. ● On 23 October 2012 Apache Kafka graduated from the incubator to a top-level project. ● Components of Apache Kafka ○ Producer ○ Consumer ○ Broker ○ Topic ○ Data ○ Parallelism
  • 16.
  • 18. What is Kafka Streams ● It is the Streams API of Apache Kafka, available through a Java library. ● Kafka Streams is built on top of functionality provided by Kafka’s core primitives. ● It is, by deliberate design, tightly integrated with Apache Kafka. ● It can be used to build highly scalable, elastic, fault-tolerant, distributed applications and microservices. ● The Kafka Streams API allows you to create real-time applications. ● It is the easiest yet the most powerful technology to process data stored in Kafka.
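The bullets above can be made concrete with a minimal word-count application. This is a sketch rather than a tested example: it assumes a running Kafka broker at localhost:9092, input/output topics named `text-input` and `word-counts` (both made up here), the kafka-streams dependency on the classpath, and the `StreamsBuilder` API from Kafka 1.0+, slightly newer than the 0.10.x releases these slides mention.

```java
import java.util.Arrays;
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

public class WordCountApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-demo");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Read lines of text, split into words, count per word.
        KStream<String, String> lines = builder.stream("text-input");
        KTable<String, Long> counts = lines
            .flatMapValues(line -> Arrays.asList(line.toLowerCase().split("\\W+")))
            .groupBy((key, word) -> word)
            .count();
        counts.toStream().to("word-counts", Produced.with(Serdes.String(), Serdes.Long()));

        new KafkaStreams(builder.build(), props).start();
    }
}
```

Because state handling, partitioning, and fault tolerance all live inside the library, this is a standard Java application: start it with plain `java`, and scale it by simply running more instances.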
  • 19.
  • 20. If we look closer ● A key motivation of the Kafka Streams API is to bring stream processing out of the Big Data niche into the world of mainstream application development. ● Using the Kafka Streams API you can implement standard Java applications to solve your stream processing needs. ● Your applications are fully elastic: you can run one or more instances of your application. ● This lightweight and integrative approach of the Kafka Streams API can be summed up as “Build applications, not infrastructure!”. ● Deployment-wise you are free to choose from any technology that can deploy Java applications.
  • 21. Capabilities of Kafka Streams ● Powerful ○ Makes your applications highly scalable, elastic, distributed, fault-tolerant. ○ Stateful and stateless processing ○ Event-time processing with windowing, joins, aggregations ● Lightweight ○ Low barrier to entry ○ No processing cluster required ○ No external dependencies other than Apache Kafka
  • 22. Capabilities of Kafka Streams ● Real-time ○ Millisecond processing latency ○ Record-at-a-time processing (no micro-batching) ○ Seamlessly handles late-arriving and out-of-order data ○ High throughput ● Fully integrated ○ 100% compatible with Apache Kafka 0.10.2 and 0.10.1 ○ Easy to integrate into existing applications and microservices ○ Runs everywhere: on-premises, public clouds, private clouds, containers, etc. ○ Integrates with databases through continuous change data capture (CDC) performed by Kafka Connect
  • 23. Key concepts of Kafka Streams ● Stateful Stream Processing ● KStream ● KTable ● Time ● Aggregations ● Joins ● Windowing
  • 24. Key concepts of Kafka Streams ● Stateful Stream Processing – Some stream processing applications don’t require state – they are stateless. – In practice, however, most applications require state – they are stateful. – The state must be managed in a fault-tolerant manner. – An application is stateful whenever, for example, it needs to join, aggregate, or window its input data.
  • 25. Key concepts of Kafka Streams ● KStream – A KStream is an abstraction of a record stream. – Each data record represents a self-contained datum in the unbounded data set. – Using the table analogy, data records in a record stream are always interpreted as an “INSERT”. – Let’s imagine the following two data records are being sent to the stream: ("alice", 1) --> ("alice", 3)
  • 26. Key concepts of Kafka Streams ● KTable – A KTable is an abstraction of a changelog stream. – Each data record represents an update. – Using the table analogy, data records in a changelog stream are always interpreted as an “UPDATE”. – Let’s imagine the following two data records are being sent to the stream: ("alice", 1) --> ("alice", 3)
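The slide's example records can be replayed in plain Java (standard library only, not the Kafka Streams API) to show the INSERT-versus-UPDATE distinction: a KStream behaves like an append-only list, while a KTable behaves like a map that keeps only the latest value per key.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class StreamVsTable {
    public static void main(String[] args) {
        // The two incoming records from the slide: ("alice", 1) --> ("alice", 3)
        String[][] records = {{"alice", "1"}, {"alice", "3"}};

        // KStream view: every record is an INSERT (append-only).
        List<String[]> streamView = new ArrayList<>();
        // KTable view: every record is an UPDATE of the latest value per key.
        Map<String, String> tableView = new HashMap<>();

        for (String[] r : records) {
            streamView.add(r);          // keeps both records
            tableView.put(r[0], r[1]);  // second record overwrites the first
        }

        System.out.println(streamView.size());      // 2 records retained
        System.out.println(tableView.get("alice")); // latest value: 3
    }
}
```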
  • 27. Key concepts of Kafka Streams ● Time – A critical aspect in stream processing is the notion of time. – Kafka Streams supports the following notions of time: ● Event Time ● Processing Time ● Ingestion Time – Kafka Streams assigns a timestamp to every data record via so-called timestamp extractors.
  • 28. Key concepts of Kafka Streams ● Aggregations – An aggregation operation takes one input stream or table, and yields a new table. – It is done by combining multiple input records into a single output record. – In the Kafka Streams DSL, an input stream of an aggregation operation can be a KStream or a KTable, but the output stream will always be a KTable.
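A stdlib-only sketch of why an aggregation's output is table-like. This simulates the behavior of the DSL's `count()` (it is an illustration, not the Kafka Streams API): combining many input records means each record updates exactly one row of the result table.

```java
import java.util.HashMap;
import java.util.Map;

public class CountAggregation {
    public static void main(String[] args) {
        // An input stream of words; we aggregate it into per-word counts.
        String[] words = {"kafka", "streams", "kafka", "kafka"};

        // The aggregate is a continuously updated table: word -> count.
        Map<String, Long> counts = new HashMap<>();
        for (String w : words) {
            counts.merge(w, 1L, Long::sum); // each record UPDATEs one row
        }

        System.out.println(counts.get("kafka"));   // 3
        System.out.println(counts.get("streams")); // 1
    }
}
```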
  • 29. Key concepts of Kafka Streams ● Joins – A join operation merges two input streams and/or tables based on the keys of their data records, and yields a new stream/table.
  • 30. Key concepts of Kafka Streams ● Windowing – Windowing lets you control how to group records that have the same key for stateful operations such as aggregations or joins into so-called windows. – Windows are tracked per record key. – When working with windows, you can specify a retention period for the window. – This retention period controls how long Kafka Streams will wait for out-of-order or late-arriving data records for a given window. – If a record arrives after the retention period of a window has passed, the record is discarded and will not be processed in that window.
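The windowing and retention rules above can be simulated in plain Java (an illustration, not the Kafka Streams API): records are grouped per key into 10-second tumbling windows, and a record whose timestamp is older than the retention period relative to the highest timestamp seen so far is discarded.

```java
import java.util.HashMap;
import java.util.Map;

public class TumblingWindows {
    static final long WINDOW_MS = 10_000;    // tumbling window size
    static final long RETENTION_MS = 30_000; // how long out-of-order data is accepted

    public static void main(String[] args) {
        // (key, timestamp) records in arrival order; the last one is very late.
        Object[][] records = {
            {"alice", 1_000L}, {"alice", 9_000L}, {"alice", 12_000L},
            {"alice", 55_000L}, {"alice", 5_000L} // late by 50s > 30s retention
        };

        Map<String, Long> windowCounts = new HashMap<>(); // "key@windowStart" -> count
        long streamTime = 0; // highest timestamp observed so far

        for (Object[] r : records) {
            long ts = (Long) r[1];
            streamTime = Math.max(streamTime, ts);
            if (ts < streamTime - RETENTION_MS) {
                continue; // the window's retention period has passed: discard
            }
            long windowStart = (ts / WINDOW_MS) * WINDOW_MS;
            windowCounts.merge(r[0] + "@" + windowStart, 1L, Long::sum);
        }

        System.out.println(windowCounts.get("alice@0"));     // 2 records in [0, 10s)
        System.out.println(windowCounts.get("alice@50000")); // 1 record in [50s, 60s)
        System.out.println(windowCounts.size());             // 3 windows; late record dropped
    }
}
```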
  • 33. Stream Partitions and Tasks ● Each stream partition is a totally ordered sequence of data records and maps to a Kafka topic partition. ● A data record in the stream maps to a Kafka message from that topic. ● The keys of data records determine the partitioning of data in both Kafka and Kafka Streams, i.e., how data is routed to specific partitions within topics.
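A sketch of how record keys determine routing. Note that Kafka's real default partitioner hashes the serialized key with murmur2; this illustration uses `String.hashCode`, but the property that matters is the same: equal keys always map to the same partition, which is what makes per-key ordering and stateful operations possible.

```java
public class KeyPartitioning {
    // Illustrative only: real Kafka uses murmur2 on the serialized key.
    static int partitionFor(String key, int numPartitions) {
        return (key.hashCode() & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        int partitions = 4;
        int p1 = partitionFor("alice", partitions);
        int p2 = partitionFor("alice", partitions);
        int p3 = partitionFor("bob", partitions);

        System.out.println(p1 == p2);                   // true: same key, same partition
        System.out.println(p1 >= 0 && p1 < partitions); // true: within range
        System.out.println(p3 >= 0 && p3 < partitions); // true: within range
    }
}
```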
  • 34. Threading Model ● Kafka Streams allows the user to configure the number of threads that the library can use to parallelize processing within an application instance. ● Each thread can execute one or more stream tasks with their processor topologies independently.
  • 35. State ● Kafka Streams provides so-called state stores. ● State can be used by stream processing applications to store and query data, which is an important capability when implementing stateful operations.
  • 36. Backpressure ● Kafka Streams does not use a backpressure mechanism because it does not need one. ● It uses a depth-first processing strategy. ● Each record consumed from Kafka will go through the whole processor (sub-)topology for processing and for (possibly) being written back to Kafka before the next record will be processed. ● No records are buffered in memory between two connected stream processors. ● Kafka Streams leverages Kafka’s consumer client behind the scenes.
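The depth-first strategy can be illustrated with a toy two-step topology in plain Java (not the Kafka API): each record traverses the entire chain of processors before the next record is consumed, so no buffering between steps is ever needed.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

public class DepthFirstProcessing {
    public static void main(String[] args) {
        // A tiny two-step topology: uppercase, then append an exclamation mark.
        List<Function<String, String>> topology = List.of(
            s -> s.toUpperCase(),
            s -> s + "!"
        );

        String[] input = {"a", "b"};
        List<String> log = new ArrayList<>(); // order in which work actually happens

        // Depth-first: record "a" runs through the WHOLE topology before "b" starts.
        for (String record : input) {
            String value = record;
            for (Function<String, String> step : topology) {
                value = step.apply(value);
                log.add(value);
            }
        }

        System.out.println(log); // [A, A!, B, B!] -- never [A, B, A!, B!]
    }
}
```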
  • 38. HOW TO GET DATA IN AND OUT OF KAFKA?
  • 40. Kafka connect ● So-called Sources import data into Kafka, and Sinks export data from Kafka. ● An implementation of a Source or Sink is a Connector, and users deploy connectors to enable data flows on Kafka. ● All Kafka Connect sources and sinks map to partitioned streams of records. ● This is a generalization of Kafka’s concept of topic partitions: a stream refers to the complete set of records, which are split into independent infinite sequences of records.
  • 41. CONFIGURING CONNECTORS ● Connector configurations are key-value mappings. ● For standalone mode these are defined in a properties file and passed to the Connect process on the command line. ● In distributed mode, they will be included in the JSON payload sent over the REST API for the request that creates the connector.
  • 42. CONFIGURING CONNECTORS A few settings that are common to all connectors: ● name - Unique name for the connector. Attempting to register again with the same name will fail. ● connector.class - The Java class for the connector. ● tasks.max - The maximum number of tasks that should be created for this connector. The connector may create fewer tasks if it cannot achieve this level of parallelism.
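As an illustration, a hypothetical standalone-mode properties file for the FileStreamSource connector that ships with Kafka might look like this (the connector name, file path, and topic name are made up):

```properties
# connector1.properties -- illustrative standalone-mode configuration
name=local-file-source
connector.class=org.apache.kafka.connect.file.FileStreamSourceConnector
tasks.max=1
file=/tmp/input.txt
topic=connect-test
```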

Editor's Notes

  1. continuously, concurrently, and in a record-by-record fashion. But as a continuous infinite stream of data integrated from both live and historical sources.
  2. A big data architecture contains several parts. Often, masses of structured and semi-structured historical data are stored in Hadoop (Volume + Variety). On the other side, stream processing is used for fast data requirements (Velocity + Variety). Both complement each other very well. This meetup focuses on real-time and stream processing.
  3. IMAGE SOURCE https://image.slidesharecdn.com/demystifyingstreamprocessingwithapachekafka-161118053223/95/demystifying-stream-processing-with-apache-kafka-4-638.jpg?cb=1479447621 Synchronous and tightly coupled Scaling is possible by adding more instances to this service Latency sensitive and due to tight coupling its sensitive to failures.
  4. you send all your inputs in and wait for your system to crunch all that data before it send all the output back.
  5. in between request/response and batch systems. here you send some inputs in and you get some outputs back. this definition of SOME is left to the program. the o/p is available at variable times too. the BIG shift is that, stream processing knows that the data is unbounded and it shall never be complete. BENEFIT: It gives complete control to the program over the tradeoffs involved. (latency, correctness and cost )
  6. DIY → you take your Kafka libraries and decide to do everything yourself. If you have decided to do this then you should be aware of these hard problems.
  7. producers publish data to Kafka brokers, and consumers read published data from Kafka brokers. Producers and consumers are totally decoupled, and both run outside the Kafka brokers in the perimeter of a Kafka cluster. A Kafka cluster consists of one or more brokers.
  8. Kafka topics are divided into a number of partitions. Partitions allow you to parallelize a topic by splitting the data in a particular topic across multiple brokers — each partition can be placed on a separate machine to allow for multiple consumers to read from a topic in parallel. Consumers can also be parallelized so that multiple consumers can read from multiple partitions in a topic allowing for very high message processing throughput.
  9. Kafka Connect is a tool for scalably and reliably streaming data between Apache Kafka and other data systems. It makes it simple to quickly define connectors that move large data sets into and out of Kafka. Kafka Connect’s scope is narrow: it focuses only on copying streaming data to and from Kafka and does not handle other tasks, such as stream processing,
  10. Standalone: bin/connect-standalone worker.properties connector1.properties [connector2.properties connector3.properties ...] Standalone mode is the simplest mode, where a single process is responsible for executing all connectors and tasks. Since it is a single process, it requires minimal configuration. Distributed mode provides scalability and automatic fault tolerance for Kafka Connect. In distributed mode, you start many worker processes using the same group.id and they automatically coordinate to schedule execution of connectors and tasks across all available workers. curl -X POST -H "Content-Type: application/json" --data '{"name": "local-console-source", "config": {"connector.class":"org.apache.kafka.connect.file.FileStreamSourceConnector", "tasks.max":"1", "topic":"connect-test" }}' http://localhost:8083/connectors # Or, to use a file containing the JSON-formatted configuration # curl -X POST -H "Content-Type: application/json" --data @config.json http://localhost:8083/connectors
  11. Sink connectors also have one additional option to control their input, topics - A list of topics to use as input for this connector