# adls-gen2

Here are 26 public repositories matching this topic...

End-to-end Azure data engineering pipeline ingesting real-time earthquake data from the USGS API. Implements a Bronze–Silver–Gold lakehouse using Azure Data Factory, Databricks, ADLS Gen2, and Synapse Analytics, with both manual execution and fully automated daily-triggered workflows.

  • Updated Dec 22, 2025
  • Jupyter Notebook
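The daily-triggered ingestion described above can be sketched as a small URL builder against the public USGS FDSN event endpoint. This is an illustrative helper, not code from the repo; the function name `usgs_daily_query_url` is hypothetical, and a real pipeline would pass the trigger date in from ADF or Databricks:

```python
from datetime import date, timedelta
from urllib.parse import urlencode

# Public USGS FDSN event query endpoint (GeoJSON output supported).
USGS_ENDPOINT = "https://earthquake.usgs.gov/fdsnws/event/1/query"

def usgs_daily_query_url(run_date: date) -> str:
    """Build the USGS query URL covering one day of earthquake events.

    A daily-triggered pipeline would call this with its trigger date and
    land the GeoJSON response in the Bronze container.
    """
    params = {
        "format": "geojson",
        "starttime": run_date.isoformat(),
        "endtime": (run_date + timedelta(days=1)).isoformat(),
    }
    return f"{USGS_ENDPOINT}?{urlencode(params)}"

print(usgs_daily_query_url(date(2025, 12, 22)))
```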

Master Azure Data Engineering with this basics-to-advanced guide! Covers SQL, PySpark, Kafka, Databricks, Snowflake & Airflow. Build 15+ industrial projects using Azure (ADF, Synapse, Event Hubs), GCP & modern table formats (Delta Lake, Iceberg, Hudi). Learn real-time streaming, Medallion architecture, and cloud data warehousing with hands-on labs.

  • Updated Feb 23, 2026

Real-time streaming data pipeline using Apache Kafka, Spark Structured Streaming, and Delta Lake on Azure. Secure SSL Kafka integration, ADLS storage with OAuth2, and ML-driven anomaly detection with automated email alerts. Modular, scalable, and configurable for IoT and log analytics pipelines.

  • Updated Nov 15, 2025
  • Python
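The "ADLS storage with OAuth2" piece of a pipeline like this typically means configuring Hadoop's ABFS driver for a service-principal client-credentials flow. A minimal sketch, assuming a helper named `adls_oauth2_conf` (hypothetical) that returns the per-account configuration keys you would set on a Spark session:

```python
def adls_oauth2_conf(account: str, client_id: str,
                     client_secret: str, tenant_id: str) -> dict:
    """Hadoop/Spark configuration for ADLS Gen2 (abfss://) access via an
    Azure AD service principal (OAuth2 client-credentials flow).

    The returned keys would be applied with spark.conf.set(k, v); secrets
    should come from a secret scope or Key Vault, never literals.
    """
    suffix = f"{account}.dfs.core.windows.net"
    return {
        f"fs.azure.account.auth.type.{suffix}": "OAuth",
        f"fs.azure.account.oauth.provider.type.{suffix}":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        f"fs.azure.account.oauth2.client.id.{suffix}": client_id,
        f"fs.azure.account.oauth2.client.secret.{suffix}": client_secret,
        f"fs.azure.account.oauth2.client.endpoint.{suffix}":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }
```

Scoping the keys to the storage-account suffix lets one session talk to several accounts with different credentials.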

GitOps-and-ADLS-ingestion-into-Bronze-Pipelines

GitOps-driven Azure Data Factory pipeline that ingests multi-source data (GitHub + ADLS) into ADLS Bronze using dynamic, parameterized ETL workflows.

  • Updated Nov 19, 2025

End-to-end data engineering pipeline implementing Medallion Architecture (Bronze-Silver-Gold) for trip transaction analytics. Automated ETL using Azure Data Factory, Databricks, and Delta Lake with real-time monitoring and email notifications via Logic Apps.

  • Updated Nov 6, 2025
  • Jupyter Notebook
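The Bronze-to-Silver step in a Medallion pipeline like this one usually amounts to deduplication and type normalization. As an illustration only (the repo would do this as a Databricks/Delta job; here it is plain Python over dicts, with hypothetical field names):

```python
def bronze_to_silver(rows: list[dict]) -> list[dict]:
    """Refine raw Bronze trip records into a Silver set:
    deduplicate on trip_id (latest updated_at wins) and
    cast the fare amount from string to float.
    """
    latest: dict[str, dict] = {}
    for row in rows:
        key = row["trip_id"]
        # Keep only the most recent version of each trip record.
        if key not in latest or row["updated_at"] > latest[key]["updated_at"]:
            latest[key] = {**row, "fare": float(row["fare"])}
    return sorted(latest.values(), key=lambda r: r["trip_id"])
```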

Predictive maintenance alert pipeline (C-MAPSS): ingest → preprocess to Parquet/Delta → train & score failure risk in Databricks → write alerts.json → Logic App notifies Teams/Email → Power Automate creates Planner tasks.

  • Updated Feb 27, 2026
  • Python
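The "score failure risk → write alerts.json" step above can be sketched as a thresholding pass over scored engine units. The threshold value, field names, and severity tiers here are illustrative assumptions, not taken from the repo:

```python
import json

def build_alerts(scored_units: list[dict], rul_threshold: float = 30.0) -> list[dict]:
    """Turn scored C-MAPSS engine units into alert records.

    Units whose predicted remaining useful life (RUL) falls below the
    threshold raise an alert; below half the threshold, severity is 'high'.
    """
    return [
        {
            "unit_id": u["unit_id"],
            "predicted_rul": u["predicted_rul"],
            "severity": "high" if u["predicted_rul"] < rul_threshold / 2 else "medium",
        }
        for u in scored_units
        if u["predicted_rul"] < rul_threshold
    ]

demo = [
    {"unit_id": 1, "predicted_rul": 12.0},
    {"unit_id": 2, "predicted_rul": 85.0},
    {"unit_id": 3, "predicted_rul": 28.0},
]
# The JSON below is what a Logic App would pick up and route to Teams/Email.
print(json.dumps(build_alerts(demo), indent=2))
```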

End-to-end Azure Data Engineering project using ADF for incremental ingestion, Databricks (DLT) for Medallion Architecture, and Delta Lake for CDC (SCD Type 1). Managed via Databricks Asset Bundles (DABs) for professional CI/CD. Focuses on real-time streaming, scalability, and Star Schema modeling.

  • Updated Jan 29, 2026
  • Python
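SCD Type 1, as used for CDC above, keeps no history: matched rows are overwritten in place and unmatched rows inserted. In the repo this would be a Delta Lake MERGE; the sketch below shows the same semantics over plain dicts, with a hypothetical `customer_id` key:

```python
def scd1_upsert(target: list[dict], updates: list[dict],
                key: str = "customer_id") -> list[dict]:
    """SCD Type 1 merge: overwrite matched rows, insert new ones.

    Equivalent in spirit to:
      MERGE INTO target USING updates ON target.key = updates.key
      WHEN MATCHED THEN UPDATE SET *  WHEN NOT MATCHED THEN INSERT *
    """
    merged = {row[key]: row for row in target}
    for row in updates:
        merged[row[key]] = row  # overwrite: no history retained (Type 1)
    return sorted(merged.values(), key=lambda r: r[key])
```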

Designed a production-grade Azure Data Engineering project centered on Azure Data Factory. Built dynamic, metadata-driven pipelines to ingest data from on-prem systems, REST APIs, and Azure SQL into ADLS Gen2 using Medallion Architecture, incremental loading, and enterprise-scale orchestration patterns.

  • Updated Jan 13, 2026
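Incremental loading in a metadata-driven pipeline is typically watermark-based: each run selects only rows changed since the last stored high-water mark, then advances it. A minimal sketch, with hypothetical names (`incremental_window`, `modified_at`):

```python
def incremental_window(last_watermark: str, rows: list[dict],
                       ts_field: str = "modified_at") -> tuple[list[dict], str]:
    """Select rows changed since the stored watermark and return the new
    watermark alongside them (ISO-8601 strings compare chronologically).

    In ADF this filter would be pushed into the source query, with the
    watermark persisted in a control table per source entity.
    """
    fresh = [r for r in rows if r[ts_field] > last_watermark]
    new_watermark = max((r[ts_field] for r in fresh), default=last_watermark)
    return fresh, new_watermark
```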

End-to-end metadata-driven data engineering framework built on Azure. Features dynamic SQL/REST API ingestion with range pagination, automated schema mapping, and event-driven orchestration. Implements robust CI/CD via GitHub Actions/YAML and automated failure alerting with Logic Apps. Optimized for scalability and data engineering best practices.

  • Updated Jan 16, 2026
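Range pagination, mentioned above, walks a REST source in fixed-size offset windows until the row count is exhausted. A sketch assuming OData-style `$skip`/`$top` parameters (an assumption; the actual API's parameter names may differ):

```python
from collections.abc import Iterator

def paged_urls(base_url: str, total_rows: int, page_size: int) -> Iterator[str]:
    """Yield one request URL per offset window covering total_rows.

    The final page is trimmed so the last request never asks for rows
    beyond the end of the dataset.
    """
    for skip in range(0, total_rows, page_size):
        top = min(page_size, total_rows - skip)
        yield f"{base_url}?$skip={skip}&$top={top}"
```

An orchestrator (an ADF ForEach over a generated range, for instance) would issue one copy activity per yielded URL.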

🚀 Build an automated data pipeline with Azure Data Factory and Databricks to efficiently process and analyze trip transaction data using Medallion Architecture.

  • Updated Mar 2, 2026
  • Jupyter Notebook
