Course 1: Databricks Lakehouse Fundamentals
Subtitle
Master the Data Lakehouse Architecture with Databricks Free Edition
Description
Build a solid foundation in the Databricks Lakehouse Platform: understand the evolution from data warehouses and data lakes to the lakehouse paradigm, navigate the Databricks workspace and Unity Catalog, work with Spark DataFrames and SQL, and use Delta Lake for reliable, versioned data storage. This is a Databricks-only course with no Sovereign AI Stack component.
Certification Alignment
This course prepares you for the Databricks Accredited Lakehouse Platform Fundamentals accreditation:
- 25 multiple-choice questions
- Tests conceptual understanding of the platform
- Covers architecture, components, governance, and workloads
- Free for Databricks customers and partners
Learning Outcomes
- Explain the data lakehouse architecture and how it combines warehouse reliability with lake flexibility
- Navigate the Databricks workspace, Unity Catalog, and compute resources
- Use Databricks notebooks with magic commands, dbutils, and multiple languages
- Write Spark transformations (select, filter, groupBy, join) and actions
- Create and manage Delta Lake tables with ACID transactions, MERGE, and time travel
- Build parameterized ETL pipelines and schedule them as Databricks Jobs
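The Delta Lake outcomes above (ACID transactions, MERGE, time travel) can be sketched in notebook SQL. This is a minimal illustration, not course material: the catalog/schema path, the `customers` table, its columns, and the `staged_customers` source (assumed to already exist) are all hypothetical names.

```sql
-- Illustrative sketch only: table, column, and source names are assumptions.
CREATE TABLE IF NOT EXISTS main.default.customers (
  id INT, name STRING, email STRING
) USING DELTA;

-- Upsert new and changed rows in one ACID transaction
MERGE INTO main.default.customers AS t
USING staged_customers AS s
ON t.id = s.id
WHEN MATCHED THEN UPDATE SET *
WHEN NOT MATCHED THEN INSERT *;

-- Time travel: query an earlier version of the table
SELECT * FROM main.default.customers VERSION AS OF 1;
```

Each write to a Delta table produces a new table version, which is what makes the `VERSION AS OF` query possible.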
Duration
~15 hours | 18 videos | 6 labs | 3 quizzes
Weeks
| Week | Topic | Focus |
|---|---|---|
| 1 | Lakehouse Architecture & Platform | Architecture, workspace, catalog, compute |
| 2 | Spark Fundamentals | Notebooks, DataFrames, SQL, transformations |
| 3 | Delta Lake & Workflows | Delta tables, DML, time travel, jobs |
Databricks Free Edition Features Used
- Workspace and Notebooks
- Unity Catalog (basic)
- Apache Spark DataFrames and SQL
- Delta Lake tables
- DBFS (Databricks File System)
- Jobs and Workflows
- Sample datasets (/databricks-datasets/)
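The notebook features listed above can be combined in a short cell like the sketch below. Note that `dbutils` and `display` are injected by the Databricks notebook runtime and are not importable outside it, so this fragment only runs inside a notebook.

```python
# Databricks notebook cell (sketch): list the bundled sample datasets with dbutils
files = dbutils.fs.ls("/databricks-datasets/")
display(files)

# Equivalent magic commands, each in its own cell:
# %fs ls /databricks-datasets/
# %sql SELECT current_catalog(), current_schema()
```

Magic commands (`%sql`, `%fs`, `%md`, and so on) switch a single cell's language, which is how one notebook mixes Python, SQL, and filesystem commands.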
Prerequisites
- Basic SQL knowledge
- Familiarity with Python
- A Databricks Free Edition account (free to sign up)