
From APIs to Warehouses: AI-Assisted Data Ingestion with dlt

Video

This hands-on workshop focuses on building reliable data ingestion pipelines to data warehouses (for example, Snowflake) using dlt (data load tool), enhanced with LLMs, the dlt dashboard, and dlt MCP.

What you'll learn

You'll work through the key building blocks of a production-ready ingestion setup, including:

  • Extracting data from APIs, files, and databases
  • Normalizing data into consistent schemas
  • Writing data to a data warehouse (e.g. Snowflake)
  • Using LLMs to accelerate dlt pipeline development
  • Validating data and schema changes using the dlt dashboard and dlt MCP
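The extract/normalize/load steps above are exactly what dlt automates. As a rough illustration of the work involved, here is a hand-rolled stdlib sketch: the nested sample rows are hypothetical, flattening stands in for dlt's schema normalization, and an in-memory sqlite3 database stands in for the warehouse:

```python
import sqlite3

# "Extracted" source data: nested, and with a field missing in one row,
# which a normalized schema must tolerate.
raw_rows = [
    {"id": 1, "user": {"name": "Ada", "city": "London"}},
    {"id": 2, "user": {"name": "Grace"}},
]

def normalize(row, prefix=""):
    """Flatten nested dicts into flat column names (user.name -> user__name)."""
    flat = {}
    for key, value in row.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(normalize(value, prefix=f"{name}__"))
        else:
            flat[name] = value
    return flat

normalized = [normalize(r) for r in raw_rows]
# Derive one consistent schema covering every row.
columns = sorted({c for row in normalized for c in row})

# "Load": write the normalized rows into the stand-in warehouse.
conn = sqlite3.connect(":memory:")
conn.execute(f"CREATE TABLE users ({', '.join(c + ' TEXT' for c in columns)})")
conn.executemany(
    f"INSERT INTO users ({', '.join(columns)}) "
    f"VALUES ({', '.join('?' for _ in columns)})",
    [tuple(row.get(c) for c in columns) for row in normalized],
)
print(conn.execute("SELECT COUNT(*) FROM users").fetchone()[0])  # prints 2
```

In practice dlt infers and evolves this schema for you, handles typing and incremental loads, and the dashboard/MCP tooling covered in the workshop is what you use to inspect the result.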

The session is fully practical and code-driven. By the end of the workshop, you'll understand how to design maintainable, scalable ingestion pipelines and use AI and validation tools to build them faster and with confidence.

Materials

About the Speaker

Aashish Nair is a Data Engineer at dltHub and the creator of the famous dlt deployment course, where he teaches best practices for running dlt pipelines in production.