Microsoft Data Integration Pipelines
From the fundamentals to level 300.
Course Overview
In this course we will start with the very basics of data integration before building up the fundamental skills and artifacts needed to orchestrate your data platform end-to-end. We will do this using Azure Data Factory pipelines, not exclusively, but as the basis for our learning journey. The maturity of the Data Factory resource offers a solid foundation for understanding the technical capabilities needed for orchestration, before we apply that knowledge to other tools such as Microsoft Fabric and Azure Synapse Analytics.
Through a blend of theory, demonstration and practical labs we will explore the components needed to orchestrate cloud-based data processing workloads, from source system ingestion through to data model delivery. We will focus on the control flow plane, with supplementary options for building out low-code data flow packages as part of our integration pipelines.
Start the day knowing nothing about data integration pipelines, or as an experienced SQL Server SSIS developer, and leave with the knowledge and resources to apply these skills to your role as a data professional working with Microsoft cloud native tools.
Course Objectives
o How cloud native data integration resources have evolved over time to form the tools currently available in the Microsoft cloud, including a brief history lesson dating back to SQL Server DTS/SSIS packages.
o What the basic data pipeline artifacts are in the context of building out an effective workload and processing dependency chain with a variety of trigger options.
o What the common data movement deployment patterns are across a hybrid technology estate for efficient data ingestion from any data source.
o How to build complex, highly dynamic control flow and data flow components as processing pipelines for your wider data platform solution.
o How to massively scale out executions and handle parallel orchestration workloads through metadata.
o What the complete end-to-end orchestration story looks like for a cloud-based analytics platform, using data integration pipelines as the bootstrap to your solution.
o Best practices for the deployment of orchestration resources into production from source control, including monitoring, logging, security and networking.
Training Format
• 1 Day Course
• All Training Materials Provided
• Mixed Theory and Labs
Technology Covered
Notes
Great for getting started with ingesting and orchestrating data in the cloud.