This one-day course provides participants with the essential knowledge and hands-on skills required to implement a Lakehouse architecture using Microsoft Fabric. Participants learn how to leverage Microsoft Fabric for end-to-end analytics, work with Delta Lake tables, use Apache Spark for data processing and transformation, and create Data Factory pipelines. By the end of the course, attendees will be equipped to set up and operate a Lakehouse solution efficiently.
Purpose
| Implement a Lakehouse architecture using Microsoft Fabric |
Audience
| Participants must have:
- Familiarity with basic data concepts and SQL.
- Basic knowledge of cloud computing and Microsoft Azure services.
|
Role
| Data Scientist | Business Analyst |
Skill level
| Intermediate |
Style
| Lecture + Hands-on Activities |
Duration
| 1 day |
Related technologies
| Microsoft Azure | SQL |
Productivity objectives
- Understand the principles of Lakehouse architecture.
- Configure and deploy a Lakehouse solution using Microsoft Fabric.
- Utilize Apache Spark for data processing within the Lakehouse.
- Work with Delta Lake tables to manage and version data.
- Ingest data seamlessly using Dataflows Gen2.
- Create and manage Data Factory pipelines within Microsoft Fabric.