Rating: ****
Tags: Computers, Languages, Python, Data Science, Data Visualization, Distributed Systems, Cloud Computing, Lang:en
Publisher: Simon and Schuster
Added: May 26, 2021
Modified: November 5, 2021
Summary
Data Pipelines with Apache Airflow teaches you how
to build and maintain effective data pipelines.
Purchase of the print book includes a free eBook in PDF,
Kindle, and ePub formats from Manning Publications.
About the technology
A successful pipeline moves data efficiently,
minimizing pauses and blockages between tasks, keeping every
process along the way operational. Apache Airflow provides a
single customizable environment for building and managing
data pipelines, eliminating the need for a hodgepodge
collection of tools, snowflake code, and homegrown processes.
Using real-world scenarios and examples,
Data Pipelines with Apache Airflow teaches you how
to simplify and automate data pipelines, reduce operational
overhead, and smoothly integrate all the technologies in your
stack.
Data pipelines manage the flow of data from initial
collection through consolidation, cleaning, analysis,
visualization, and more. Apache Airflow provides a single
platform you can use to design, implement, monitor, and
maintain your pipelines. Its easy-to-use UI, plug-and-play
options, and flexible Python scripting make Airflow perfect
for any data management task.
About the book
Data Pipelines with Apache Airflow teaches you how
to build and maintain effective data pipelines. You’ll
explore the most common usage patterns, including aggregating
multiple data sources, connecting to and from data lakes, and
cloud deployment. Part reference and part tutorial, this
practical guide covers every aspect of the directed acyclic
graphs (DAGs) that power Airflow, and how to customize them
for your pipeline’s needs.
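The paragraph above refers to directed acyclic graphs (DAGs) as the book's core abstraction. As a minimal sketch of what a DAG definition looks like, assuming Airflow 2.x (the dag_id, task ids, and commands here are illustrative placeholders, not taken from the book):
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def _transform():
    # Placeholder transformation step; a real pipeline would clean or move data here.
    print("transforming data")


with DAG(
    dag_id="example_pipeline",        # hypothetical name, for illustration only
    start_date=datetime(2021, 1, 1),  # historical runs (backfills) start here
    schedule_interval="@daily",       # one run per day
    catchup=False,
) as dag:
    fetch = BashOperator(task_id="fetch", bash_command="echo 'fetching data'")
    transform = PythonOperator(task_id="transform", python_callable=_transform)

    # The >> operator declares a dependency: fetch runs before transform.
    # These edges are what make the workflow a directed acyclic graph.
    fetch >> transform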
What's inside
Build, test, and deploy Airflow pipelines as DAGs
Automate moving and transforming data
Analyze historical datasets using backfilling
Develop custom components
Set up Airflow in production environments
About the reader
For DevOps, data engineers, machine learning engineers,
and sysadmins with intermediate Python skills.
About the author
Bas Harenslak and
Julian de Ruiter are data engineers with
extensive experience using Airflow to develop pipelines for
major companies. Bas is also an Airflow committer.
Table of Contents
PART 1 - GETTING STARTED
1 Meet Apache Airflow
2 Anatomy of an Airflow DAG
3 Scheduling in Airflow
4 Templating tasks using the Airflow context
5 Defining dependencies between tasks
PART 2 - BEYOND THE BASICS
6 Triggering workflows
7 Communicating with external systems
8 Building custom components
9 Testing
10 Running tasks in containers
PART 3 - AIRFLOW IN PRACTICE
11 Best practices
12 Operating Airflow in production
13 Securing Airflow
14 Project: Finding the fastest way to get around NYC
PART 4 - IN THE CLOUDS
15 Airflow in the clouds
16 Airflow on AWS
17 Airflow on Azure
18 Airflow in GCP