A 12-week senior data engineering curriculum covering lakehouse architecture, streaming pipelines, data mesh, and data platform design.
Student builds a data ingestion pipeline using Python and Kafka, demonstrating the ability to handle real-time event streams and integrate with a messaging system.
Student builds a Spark application to process CSV files, demonstrating big data processing fundamentals and the ability to work with large datasets.
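To give a feel for the kind of transformation this project covers, here is the same group-and-sum aggregation in plain Python that a Spark job would parallelize as `df.groupBy("region").agg(sum("amount"))`. The `region`/`amount` column names are hypothetical.

```python
import csv
import io
from collections import defaultdict


def total_sales_by_region(csv_text: str) -> dict:
    """Group rows by 'region' and sum 'amount'.

    This mirrors the aggregation a PySpark job would express as:
        spark.read.csv(path, header=True) \
             .groupBy("region").agg(F.sum("amount"))
    but runs on a single machine, which is fine for understanding the logic.
    """
    totals = defaultdict(float)
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["region"]] += float(row["amount"])
    return dict(totals)
```

The point of moving this to Spark is not different logic but a different execution model: the same groupBy/sum is distributed across partitions of a dataset too large for one machine.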
Student designs a data warehouse using dbt and PostgreSQL, demonstrating data modeling and warehousing concepts and the ability to design scalable data systems.
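A minimal sketch of the dbt side of such a project: a staging model that cleans a raw source table before downstream marts build on it. The `shop`/`orders` source and column names are hypothetical.

```sql
-- models/staging/stg_orders.sql (hypothetical source and column names)
with source as (

    select * from {{ source('shop', 'orders') }}

)

select
    id                              as order_id,
    customer_id,
    cast(amount as numeric(12, 2))  as order_amount,
    created_at::timestamptz         as ordered_at
from source
```

Staging models like this are where dbt projects typically standardize names, types, and timezones, so every downstream model starts from consistent columns.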
Student builds an Airflow workflow to automate data processing tasks, demonstrating how to create scalable, maintainable pipelines with a workflow orchestrator.
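The core idea the project exercises is dependency-ordered execution. Airflow expresses it by wiring operators like `extract >> transform >> load` inside a DAG; the stand-in below shows the same ordering guarantee using only the standard library (task names and callables are illustrative).

```python
from graphlib import TopologicalSorter


def run_pipeline(tasks: dict, deps: dict):
    """Execute task callables in dependency order.

    `deps` maps a task name to the set of task names it depends on --
    the same graph Airflow builds when operators are chained with >>.
    Each task receives the results of everything that ran before it.
    """
    order = list(TopologicalSorter(deps).static_order())
    results = {}
    for name in order:
        results[name] = tasks[name](results)  # upstream results are available
    return order, results
```

In a real DAG, Airflow adds what this sketch leaves out: scheduling, retries, backfills, and per-task logging; the topological ordering itself is the same.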
Student builds a real-time streaming application using Kafka and Spark, demonstrating the ability to process high-volume data streams with a big data engine.
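A central concept in this project is windowed aggregation over a stream. Spark Structured Streaming writes it as `groupBy(window(col("ts"), "1 minute"), col("key")).count()`; the sketch below shows the same tumbling-window count on a finite batch of `(timestamp, key)` events, which is enough to see how events bucket into windows (function and parameter names are illustrative).

```python
from collections import defaultdict


def tumbling_window_counts(events, window_seconds=60):
    """Count events per (window_start, key) over non-overlapping windows.

    `events` is an iterable of (epoch_seconds, key) pairs. Each event falls
    into exactly one window: the one starting at the largest multiple of
    window_seconds not exceeding its timestamp.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, key)] += 1
    return dict(counts)
```

The streaming version differs mainly in that events arrive continuously and possibly out of order, which is why Spark pairs windows with watermarks to decide when a window's count is final.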
+15 more projects available after enrollment
Build a real project in 4 weeks