Data Vault Integrator - Implementation Training
Mon, Mar 27, 2017, 8:30 PM – Tue, Mar 28, 2017, 5:00 PM AEDT
Significantly improve your delivery
Learn how model-driven design and data integration (ETL) generation principles can be applied to Data Warehouse (DWH) solutions. The techniques you will learn reduce time to value and let you rapidly deliver a tested business solution on a solid and flexible Data Vault foundation.
Leveraging ETL generation techniques greatly reduces time spent on development by codifying an understanding of patterns, concepts and architecture. Generation improves consistency and reduces the amount of customization, which enables you to spend time adding value elsewhere or improving other components of the solution design and data model.
Is this for me?
The 2-day training teaches how to generate ETL for a Data Vault implementation by discussing the Data Vault patterns and their various implementation nuances. These different scenarios are presented in the context of a sample case (model), where each scenario extends the metadata model with a new layer of functionality. At the end of the course you will understand how a metadata model can be defined and configured to suit specific needs, which exception cases need to be supported, and how to evaluate existing metadata-driven approaches. A combined option with the CDVDM Data Vault modelling course & certification hosted by Hans Hultgren / Genesee Academy is also available; in that case the 3-day CDVDM course runs prior to this implementation course.
This course is relevant for anyone seeking to leverage ‘model-driven design’ and pattern-based code-generation techniques to accelerate their Data Warehouse / Business Intelligence programs. As advanced modelling and implementation techniques for Data Vault are also covered, it applies to a wide range of data professionals, including BI and Data Warehouse practitioners, data modellers and architects, as well as DBAs and ETL specialists.
Varigence Biml and SQL are used as the technologies to generate ETL, as they provide a flexible framework to support the training concepts. Biml was developed with automation of development for the Microsoft BI suite in mind, but the concepts and approaches covered in the training are directly applicable to other platforms as well.
Biml is available as a free plug-in for Visual Studio, or as licensed software.
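To make the pattern-based generation idea concrete in a platform-neutral way (Biml itself is XML-based; this Python sketch uses invented table and column names purely for illustration): a single load pattern is combined with per-entity metadata, so one pattern yields one generated statement per Hub.

```python
# Minimal sketch of metadata-driven ETL generation (all names are hypothetical).
# One reusable load pattern + per-entity metadata -> one statement per Hub.

HUB_LOAD_PATTERN = """INSERT INTO {hub} ({key}, {bk}, LOAD_DTS, RECORD_SOURCE)
SELECT DISTINCT HASHBYTES('SHA1', stg.{bk}), stg.{bk}, stg.LOAD_DTS, stg.RECORD_SOURCE
FROM {staging} stg
WHERE NOT EXISTS (SELECT 1 FROM {hub} h WHERE h.{bk} = stg.{bk});"""

# Metadata: one entry per Hub (illustrative names only).
hub_metadata = [
    {"hub": "HUB_CUSTOMER", "key": "CUSTOMER_HSH", "bk": "CUSTOMER_ID", "staging": "STG_CUSTOMER"},
    {"hub": "HUB_PRODUCT", "key": "PRODUCT_HSH", "bk": "PRODUCT_CODE", "staging": "STG_PRODUCT"},
]

def generate_hub_loads(metadata):
    """Apply the single load pattern to every Hub described in the metadata."""
    return [HUB_LOAD_PATTERN.format(**m) for m in metadata]

for sql in generate_hub_loads(hub_metadata):
    print(sql)
    print()
```

Adding a Hub then means adding one metadata entry, not writing new ETL code; changing the pattern changes every generated statement consistently, which is the consistency benefit described above.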
Recommended prerequisites:
- Familiarity with data modelling concepts
- Awareness of the core Data Vault components (e.g. Hubs, Links, Satellites)
- Understanding of Data Warehouse and ETL development
- Some scripting / programming experience
Pre-reading materials will also be provided on these topics prior to the start of the course.
Implementation and automation using Data Vault modelling
Data Vault has emerged as the leader of contemporary data modelling techniques specialized for Data Warehouse design. Even though many data professionals are familiar with the basic concepts, the intricacies of implementing it in a maintainable, scalable and consistent manner are largely unknown.
The intent of the training is to move to implementation and advanced techniques as quickly as possible, rather than going through basic Data Vault modelling concepts. The training primarily focuses on implementation techniques, options and considerations.
Pre-read materials covering the fundamentals of Data Vault modelling and development will be provided prior to the start of the course, so that all participants commence with a solid understanding of the Data Vault foundational principles and the course itself can focus on development techniques.
These cover the main concepts around Data Vault components (Hubs, Links and Satellites), architecture considerations (source-to-staging, Business Data Vault versus Raw Data Vault) as well as tools and configurations you can adopt to get started automating your development (such as Varigence Biml).
- Overarching principles: what concepts should a solution support?
- Data Vault implementation patterns: what kinds of considerations are there?
- What prerequisites need to be in place? (ETL framework, conventions, patterns)
- How do database-level configurations support your Data Vault?
- Understand the difference between Raw and Business Data Vault
- ETL generation - how does this fit in and how do I get started?
- How does model-driven design work, and what does it do for testing?
- What metadata do you need and where do you store it?
- How can I leverage virtualisation?
- How to balance performance issues using helper constructs (e.g. PIT, Bridge)?
- How can information from a Data Vault be exposed through (virtual) Data Marts?
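To hint at what the helper-construct topic above covers, here is a hedged sketch (all data and names are invented; in practice this logic runs as SQL over Hub and Satellite tables) of what a Point-in-Time (PIT) table pre-computes: for each business key and snapshot date, the satellite load that was current at that moment, so downstream marts can use simple equi-joins instead of resolving date ranges themselves.

```python
# Illustrative sketch of what a PIT table pre-computes (hypothetical data).
from datetime import date

# Satellite rows for one hub key: (load date, attribute payload)
sat_customer = [
    (date(2017, 1, 1), {"name": "Acme"}),
    (date(2017, 2, 15), {"name": "Acme Pty Ltd"}),
]

def pit_entry(sat_rows, snapshot):
    """Return the satellite load date that was current at the snapshot date,
    i.e. the latest load date on or before the snapshot (None if none exist)."""
    candidates = [load_dts for load_dts, _ in sat_rows if load_dts <= snapshot]
    return max(candidates) if candidates else None

# The PIT table itself: one row per (key, snapshot) holding the effective
# load date, materialized once so every mart query avoids the range lookup.
snapshots = [date(2017, 1, 31), date(2017, 2, 28)]
pit = {snap: pit_entry(sat_customer, snap) for snap in snapshots}
```

The trade-off discussed in the course is exactly this: the PIT costs extra storage and load logic, but turns expensive temporal joins into cheap key/date lookups for delivery.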
- Day 1: Modelling refresher, base patterns, metadata requirements and development
- Day 2: Data Vault architecture decisions, advanced patterns and delivery (marts)
50% Classroom Lecture | 25% Group Workshops | 25% Discussion and Q&A
Please note: this training can be combined with classroom CDVDM modelling training & certification.
For modelling only, please consider the 3-day CDVDM: https://www.eventbrite.com/e/data-vault-modeling-certification-melbourne-tickets-31619263067 (USD $2500).
For a combination of modelling & implementation at a lower overall price, please consider the 5-day combined course. This offers the 3-day CDVDM modelling training and certification followed by this 2-day implementation training. Registration can be done here: https://www.eventbrite.com/e/special-cdvdm-dv-integrator-implementation-tickets-31901564438 (USD $2800).