
Apache Kafka Course Certification Training

AKCC-HV
6 days
740 000 Ft + VAT

About the course

Apache Kafka is an open-source messaging infrastructure originally developed at LinkedIn and used by many of the major SaaS (Software as a Service) applications we rely on every day. Kafka was designed for large-scale data movement while offering seamless performance and reliability. Today, when most IT professionals are dealing with a data deluge in the form of hundreds of billions of messages, Kafka is the big data solution you need.
The Apache Kafka training takes you through the architectural design of Kafka that enables it to process large streams of data in real time. Kafka stores, processes, and publishes streams of data records durably, as they occur. Kafka's speed and performance come from running as a cluster on multiple servers, which also allows it to span several data centers.
This Kafka certification lets IT professionals dive into the internal architecture of Apache Kafka, understand the Kafka Streams API, learn how Kafka applications are developed in Java, and eventually build cutting-edge big data solutions with Kafka.

Individual Benefits:

  • Apache Kafka helps you to develop your own applications with ease
  • Get equipped to process large-scale data and kick-start a career in real-time analytics
  • Kafka skills help you enter multiple industries, including business services, retail, finance, and manufacturing
  • Enables you to work in roles such as Kafka Developer, Kafka Testing Professional, Kafka Project Manager, and Big Data Architect

Organizational Benefits:

  • Helps organizations handle large volumes of data
  • Enables transparent and seamless message handling while avoiding downtime
  • Allows organizations to integrate with a variety of consumers
  • Implementing Kafka helps handle real-time data pipelines

What you will learn

  • Kafka Introduction
    Learn the basics of the Kafka messaging system in Big Data, Kafka architecture, and its configuration.
  • Kafka and Big Data
    Know about Kafka and its components, and how Kafka technology helps in processing real-time data.
  • Kafka APIs
    Learn how to construct and process messages using the Kafka APIs, such as the Producer and Consumer APIs.
  • Kafka Example
    Learn how to design and develop robust messaging and subscribe to topics on various platforms.
  • Cluster Architecture
    Learn about Kafka cluster and how it integrates with other Big Data Frameworks like Hadoop.
  • Kafka Integration
    Understand various methods of integrating Kafka with Storm and Spark, and why such integration matters.

Who should take the course

  • Data scientists
  • ETL developers
  • Data analysts
  • BI Analysts & Developers
  • SAS Developers
  • Big Data Professionals
  • Big Data Architects
  • Project Managers
  • Research professionals
  • Analytics professionals
  • Professionals aspiring for a career in Big Data
  • Messaging and Queuing System professionals

We provide the course in English.

 

Curriculum

1 Introduction to Big Data and Kafka
Learning Objectives:
Understand where Kafka fits in the Big Data space, and learn about Kafka Architecture. Also, learn about Kafka Cluster, its Components, and how to configure a Cluster.

Topics:

  • Introduction to Big Data          
  • Big Data Analytics                        
  • Need for Kafka            
  • What is Kafka?             
  • Kafka Features             
  • Kafka Concepts            
  • Kafka Architecture                     
  • Kafka Components                    
  • ZooKeeper                    
  • Where is Kafka Used?               
  • Kafka Installation                        
  • Kafka Cluster                
  • Types of Kafka Clusters   

Hands-on:

  • Kafka Installation
  • Implementing Single Node-Single Broker Cluster
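
To give a flavour of this hands-on work, here is a minimal Java sketch of creating a topic on a single node, single broker cluster with the Kafka AdminClient. It assumes the kafka-clients library is on the classpath and a broker listening on localhost:9092; the topic name demo-topic is purely illustrative.

    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.NewTopic;

    public class CreateTopicExample {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            // Assumption: a single broker is listening on localhost:9092
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

            try (AdminClient admin = AdminClient.create(props)) {
                // One partition and replication factor 1, the most a single-broker cluster allows
                NewTopic topic = new NewTopic("demo-topic", 1, (short) 1);
                admin.createTopics(Collections.singleton(topic)).all().get();
            }
        }
    }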

2 Kafka Producer
Learning Objectives:
Learn how to construct a Kafka Producer, send messages to Kafka, send messages Synchronously & Asynchronously, configure Producers, serialize using Apache Avro and create & handle Partitions.

Topics:

  • Configuring Single Node Single Broker Cluster
  • Configuring Single Node Multi Broker Cluster                 
  • Constructing a Kafka Producer              
  • Sending a Message to Kafka                  
  • Producing Keyed and Non-Keyed Messages                  
  • Sending a Message Synchronously & Asynchronously                
  • Configuring Producers              
  • Serializers                      
  • Serializing Using Apache Avro                
  • Partitions       

Hands-on:

  • Working with Single Node Multi Broker Cluster
  • Creating a Kafka Producer
  • Configuring a Kafka Producer
  • Sending a Message Synchronously & Asynchronously
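
As a hint of what the producer exercises look like, the sketch below sends one keyed message synchronously and one non-keyed message asynchronously with a callback. It again assumes a broker on localhost:9092 and the illustrative topic demo-topic.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class ProducerExample {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Keyed message sent synchronously: get() blocks until the broker acknowledges it
                producer.send(new ProducerRecord<>("demo-topic", "key-1", "hello, Kafka")).get();

                // Non-keyed message sent asynchronously: the callback runs when the send completes
                producer.send(new ProducerRecord<>("demo-topic", "sent asynchronously"),
                        (metadata, exception) -> {
                            if (exception != null) {
                                exception.printStackTrace();
                            } else {
                                System.out.printf("partition=%d offset=%d%n",
                                        metadata.partition(), metadata.offset());
                            }
                        });
            }
        }
    }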

3 Kafka Consumer
Learning Objectives:
Learn to construct a Kafka Consumer, process messages from Kafka with the Consumer, run the Kafka Consumer, and subscribe to Topics.

Topics:

  • Consumers and Consumer Groups
  • Standalone Consumer
  • Consumer Groups and Partition Rebalance                  
  • Creating a Kafka Consumer
  • Subscribing to Topics                 
  • The Poll Loop                
  • Configuring Consumers            
  • Commits and Offsets                
  • Rebalance Listeners                   
  • Consuming Records with Specific Offsets                         
  • Deserializers                 

Hands-on:

  • Creating a Kafka Consumer
  • Configuring a Kafka Consumer
  • Working with Offsets
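
A minimal consumer sketch along the lines of these exercises is shown below: it joins the hypothetical group demo-group, subscribes to demo-topic, runs the poll loop, and commits offsets manually instead of relying on auto-commit.

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class ConsumerExample {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "demo-group");
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            // Commit offsets explicitly instead of relying on auto-commit
            props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singleton("demo-topic"));
                while (true) {
                    // The poll loop: fetch a batch of records, process them, then commit offsets
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> record : records) {
                        System.out.printf("offset=%d key=%s value=%s%n",
                                record.offset(), record.key(), record.value());
                    }
                    consumer.commitSync();
                }
            }
        }
    }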

4 Kafka Operations and Performance Tuning
Learning Objectives:
Learn about tuning Kafka to meet your high-performance needs.

Topics:

  • Cluster Membership
  • The Controller           
  • Replication                     
  • Request Processing                   
  • Physical Storage                          
  • Reliability                        
  • Broker Configuration                 
  • Using Producers in a Reliable System                 
  • Using Consumers in a Reliable System               
  • Validating System Reliability                   
  • Performance Tuning in Kafka            

Hands-on:

  • Create a topic with 3 partitions and a replication factor of 3 and run it on a multi-broker cluster
  • Show fault tolerance by shutting down one broker and serving its partitions from another broker
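
As one example of the reliability knobs covered in this module, the sketch below assembles producer settings biased towards durability rather than raw throughput; the three broker host names are hypothetical and stand in for the multi-broker cluster used in the exercise.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class ReliableProducerConfig {
        // Returns producer properties tuned for durability rather than raw throughput
        public static Properties build() {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092,broker2:9092,broker3:9092");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            // Wait for all in-sync replicas to acknowledge each write
            props.put(ProducerConfig.ACKS_CONFIG, "all");
            // Retry transient failures, and enable idempotence to avoid duplicates on retry
            props.put(ProducerConfig.RETRIES_CONFIG, Integer.toString(Integer.MAX_VALUE));
            props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");
            return props;
        }
    }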

5 Kafka Cluster Architectures and Administering Kafka
Learning Objectives:
Learn about Kafka Multi-Cluster Architectures, Kafka Brokers, Topic, Partitions, Consumer Group, Mirroring, and ZooKeeper Coordination in this module.

Topics:

  • Multi-Cluster Architectures
  • Apache Kafka’s MirrorMaker                 
  • Other Cross-Cluster Mirroring Solutions                           
  • Topic Operations                         
  • Consumer Groups                      
  • Dynamic Configuration Changes                           
  • Partition Management             
  • Consuming and Producing                      
  • Unsafe Operations      

Hands-on:

  • Topic Operations
  • Consumer Group Operations
  • Partition Operations
  • Consumer and Producer Operations
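
For a taste of programmatic administration, the sketch below uses the AdminClient to list the committed offsets of the hypothetical consumer group demo-group; the broker address is again assumed to be localhost:9092.

    import java.util.Map;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.consumer.OffsetAndMetadata;
    import org.apache.kafka.common.TopicPartition;

    public class ConsumerGroupOffsets {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

            try (AdminClient admin = AdminClient.create(props)) {
                // Look up the committed offset for every partition the group has consumed
                Map<TopicPartition, OffsetAndMetadata> offsets =
                        admin.listConsumerGroupOffsets("demo-group")
                             .partitionsToOffsetAndMetadata()
                             .get();
                offsets.forEach((tp, offset) ->
                        System.out.printf("%s -> %d%n", tp, offset.offset()));
            }
        }
    }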

6 Kafka Stream Processing
Learning Objectives:
Learn about the Kafka Streams API in this module. Kafka Streams is a client library for building mission-critical real-time applications and microservices, where the input and/or output data is stored in Kafka Clusters.

Topics:

  • Stream Processing
  • Stream-Processing Concepts                 
  • Stream-Processing Design Patterns                    
  • Kafka Streams by Example                      
  • Kafka Streams: Architecture Overview    

Hands-on:

  • Kafka Streams
  • Word Count Stream Processing
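
The classic word-count example looks roughly like the sketch below: it reads lines from an input topic, splits them into words, keeps a running count per word, and writes the updated counts to an output topic. The topic names text-input and words-with-counts are illustrative, and the kafka-streams library is assumed to be on the classpath.

    import java.util.Arrays;
    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.KTable;
    import org.apache.kafka.streams.kstream.Produced;

    public class WordCountExample {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-demo");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            KStream<String, String> lines = builder.stream("text-input");

            // Split each line into words, group by word, and maintain a running count
            KTable<String, Long> counts = lines
                    .flatMapValues(line -> Arrays.asList(line.toLowerCase().split("\\W+")))
                    .groupBy((key, word) -> word)
                    .count();

            // Write the continuously updated counts to an output topic
            counts.toStream().to("words-with-counts",
                    Produced.with(Serdes.String(), Serdes.Long()));

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }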

7 Integration of Kafka with Hadoop, Storm and Spark
Learning Objectives:
Learn about Apache Hadoop, Hadoop Architecture, Apache Storm, Storm Configuration, and the Spark Ecosystem. In addition, configure a Spark Cluster and integrate Kafka with Hadoop, Storm, and Spark.

Topics:

  • Apache Hadoop Basics
  • Hadoop Configuration              
  • Kafka Integration with Hadoop             
  • Apache Storm Basics                 
  • Configuration of Storm             
  • Integration of Kafka with Storm                           
  • Apache Spark Basics                  
  • Spark Configuration                   
  • Kafka Integration with Spark   

Hands-on:

  • Kafka integration with Hadoop
  • Kafka integration with Storm
  • Kafka integration with Spark
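
As an illustration of the Kafka and Spark integration, the sketch below uses Spark Structured Streaming to read a Kafka topic as a streaming DataFrame and echo each micro-batch to the console. It assumes a Spark installation with the spark-sql-kafka-0-10 connector available, a broker on localhost:9092, and the illustrative topic demo-topic.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;
    import org.apache.spark.sql.streaming.StreamingQuery;

    public class KafkaSparkIntegration {
        public static void main(String[] args) throws Exception {
            SparkSession spark = SparkSession.builder()
                    .appName("kafka-spark-demo")
                    .getOrCreate();

            // Continuously read records from the Kafka topic as a streaming DataFrame
            Dataset<Row> messages = spark.readStream()
                    .format("kafka")
                    .option("kafka.bootstrap.servers", "localhost:9092")
                    .option("subscribe", "demo-topic")
                    .load()
                    .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)");

            // Print each micro-batch to the console for demonstration purposes
            StreamingQuery query = messages.writeStream()
                    .format("console")
                    .start();
            query.awaitTermination();
        }
    }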

Prerequisites

It is not mandatory to have prior knowledge of Kafka to take up Apache Kafka training. However, as a participant you are expected to know the core concepts of Java or Python to attend this course.
