Data Ingestion with Python Cookbook: A practical guide to ingesting, monitoring, and identifying errors in the data ingestion process - Gláucia Esppenchutz

Deploy, orchestrate, and monitor your data ingestion pipeline efficiently to prevent data loss and preserve data quality. Purchase of the print or Kindle book includes a free PDF eBook.
Key Features:
- Harness best practices to create a Python and PySpark data ingestion pipeline
- Seamlessly automate and orchestrate your data pipelines using Apache Airflow
- Build a monitoring framework by integrating the concept of data observability into your pipelines
Book Description: Data Ingestion with Python Cookbook offers a practical approach to designing and implementing data ingestion pipelines. It presents real-world examples with the most widely recognized open source tools on the market to answer commonly asked questions and overcome challenges. You'll be introduced to designing pipelines that work with or without data schemas, as well as creating monitored pipelines with Airflow and data observability principles, all while following industry best practices. The book also addresses challenges associated with reading different data sources and data formats. As you progress through the book, you'll gain a broader understanding of error logging best practices, troubleshooting techniques, data orchestration, monitoring, and storing logs for later consultation. By the end of the book, you'll have a fully automated setup that enables you to start ingesting and monitoring your data pipeline effortlessly, facilitating seamless integration with subsequent stages of the ETL process.
What You Will Learn:
- Implement data observability using monitoring tools
- Automate your data ingestion pipeline
- Read analytical and partitioned data, whether schema or non-schema based
- Debug and prevent data loss through efficient data monitoring and logging
- Establish data access policies using a data governance framework
- Construct a data orchestration framework to improve data quality
Who this book is for: This book is for data engineers and data enthusiasts seeking a comprehensive understanding of the data ingestion process using popular tools in the open source community. For more advanced learners, this book takes on the theoretical pillars of data governance while providing practical examples of real-world scenarios commonly encountered by data engineers.
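The description above centers on Airflow-orchestrated PySpark ingestion with basic observability. Purely as an illustration of that kind of pipeline (a minimal sketch assuming Airflow 2.4+ and PySpark are available, not an excerpt from the book), the DAG below ingests a CSV file, logs a row count as a simple observability signal, and writes the result to Parquet; the paths, DAG id, and task names are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_csv():
    # Imported inside the task so the DAG file still parses on hosts where
    # PySpark is only installed on the workers.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("csv_ingestion").getOrCreate()
    # Read a headered CSV; inferSchema is convenient for a sketch, though an
    # explicit schema is usually preferable in production pipelines.
    df = (
        spark.read.option("header", True)
        .option("inferSchema", True)
        .csv("/data/landing/events.csv")  # placeholder path
    )
    # A minimal observability signal: record the ingested volume on each run.
    print(f"Ingested {df.count()} rows")
    df.write.mode("overwrite").parquet("/data/raw/events")  # placeholder path
    spark.stop()


with DAG(
    dag_id="csv_ingestion_example",
    start_date=datetime(2023, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
):
    PythonOperator(task_id="ingest_csv", python_callable=ingest_csv)
```

A real pipeline would replace the print with structured logging and metrics, which is the kind of monitoring and observability framework the book sets out to build.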