$125 | Duration: 7h 51m | Video: h264, 1920×1080 | Audio: AAC, 48kHz, 2 Ch | 1.89 GB
Genre: eLearning | Language: English | November 30, 2018

Handle high volumes of data at high speed. Architect and implement an end-to-end data streaming pipeline.

Today, organizations struggle to work with huge volumes of data.
In addition, data needs to be processed and analyzed in real time to gain insights.
This is where data streaming comes in.
As big data is no longer a niche topic, having the skillset to architect and develop robust data streaming pipelines is a must for all developers.
Developers also need to think about the entire pipeline, including the trade-offs at every tier.
This course starts by explaining the blueprint architecture for developing a fully functional data streaming pipeline and installing the technologies used.
With the help of live coding sessions, you will get hands-on with architecting every tier of the pipeline.
You will also handle specific issues encountered when working with streaming data.
You will ingest a live data stream of Meetup RSVPs, which will be analyzed and displayed via Google Maps.
By the end of the course, you will have built an efficient data streaming pipeline and will be able to analyze its various tiers, ensuring a continuous flow of data.
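As a rough illustration of the collection and message queuing tiers described above, the following Scala sketch reads the public Meetup RSVP feed line by line and forwards each JSON record to a Kafka topic. The endpoint URL, broker address (localhost:9092), and topic name (meetup-rsvps) are assumptions made for illustration, not necessarily the course's exact configuration.

    import java.util.Properties
    import scala.io.Source
    import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
    import org.apache.kafka.common.serialization.StringSerializer

    object MeetupRsvpCollector {
      def main(args: Array[String]): Unit = {
        // Kafka producer configuration; broker address is an assumption for local testing
        val props = new Properties()
        props.put("bootstrap.servers", "localhost:9092")
        props.put("key.serializer", classOf[StringSerializer].getName)
        props.put("value.serializer", classOf[StringSerializer].getName)
        val producer = new KafkaProducer[String, String](props)

        // The long-lived Meetup RSVP endpoint emits one JSON document per line
        val stream = Source.fromURL("https://stream.meetup.com/2/rsvps", "UTF-8")
        try {
          for (rsvpJson <- stream.getLines()) {
            // Forward each raw RSVP to the message queuing tier
            producer.send(new ProducerRecord[String, String]("meetup-rsvps", rsvpJson))
          }
        } finally {
          stream.close()
          producer.close()
        }
      }
    }

In a real collection tier you would also add error handling and reconnection logic around the HTTP stream; this sketch only shows the basic flow from source to queue.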
All the code and supporting files for this course are available at

Style and Approach

This course is a combination of text, a lot of images (diagrams), and meaningful live coding sessions.
Each topic covered follows a three-step structure: first, we present some headlines (facts); second, we continue with images (diagrams) meant to provide more detail; and finally, we convert the text and images into code written in the appropriate technology.
Table of Contents

1. Introducing Data Streaming Architecture
2. Deployment of Collection and Message Queuing Tiers
3. Proceeding to the Data Access Tier
4. Implementing the Analysis Tier
5. Mitigate Data Loss Between Collection, Analysis and Message Queuing Tiers

What You Will Learn

- Attain a solid foundation in the most powerful and versatile technologies involved in data streaming: Apache Spark and Apache Kafka (see the sketch after this list)
- Form a robust and clean architecture for a data streaming pipeline
- Implement the correct tools to bring your data streaming architecture to life
- Isolate the most problematic trade-off for each tier involved in a data streaming pipeline
- Query, analyze, and apply machine learning algorithms to collected data
- Display analyzed pipeline data via Google Maps on your web browser
- Discover and resolve difficulties in scaling and securing data streaming applications
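As referenced in the first item above, here is a minimal Spark Structured Streaming sketch that subscribes to the Kafka topic fed by the collection tier and keeps a running count of positive RSVPs per country. The broker address, topic name, and the RSVP field names (response, group.group_country) are assumptions made for illustration; the course builds its own, fuller analysis tier.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._
    import org.apache.spark.sql.types._

    object RsvpAnalysis {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("meetup-rsvp-analysis")
          .master("local[*]") // local mode for experimentation
          .getOrCreate()
        import spark.implicits._

        // Only the fields needed for a per-country count; the real RSVP payload is richer
        val schema = new StructType()
          .add("response", StringType)
          .add("group", new StructType().add("group_country", StringType))

        // Subscribe to the topic written by the collection tier (topic/broker are assumptions)
        val rsvps = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "localhost:9092")
          .option("subscribe", "meetup-rsvps")
          .load()
          .select(from_json($"value".cast("string"), schema).as("rsvp"))

        // Count positive RSVPs per country as a simple streaming aggregation
        val counts = rsvps
          .filter($"rsvp.response" === "yes")
          .groupBy($"rsvp.group.group_country")
          .count()

        counts.writeStream
          .outputMode("complete")
          .format("console")
          .start()
          .awaitTermination()
      }
    }

The console sink is used here only to make the sketch self-contained; a pipeline like the one in the course would instead write results to a data access tier that the Google Maps front end can query.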