Deep-dive course by Data Egret, a recognised PostgreSQL contributing sponsor, designed to help developers build fast and reliable applications with PostgreSQL at scale.
The course covers advanced indexing techniques, query optimisation, SQL best practices, interpreting EXPLAIN plans, and more.
17.5 hours of focused content
from recognised top PostgreSQL experts
5 days of intensive PostgreSQL learning
with live demos and field-tested techniques
Attend all seven sessions to earn your ‘Working with High Performance PostgreSQL’ attendance certificate
This chapter introduces how PostgreSQL decides the best way to execute SQL queries, covering the key concepts behind the planner’s choices.
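As a taste of what this chapter covers, here is a minimal sketch (the table and column names are illustrative, not from the course materials) showing that the planner, not the author of the query, chooses the access path:

```sql
-- Hypothetical schema, for illustration only.
CREATE TABLE orders (
    id          bigint GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
    customer_id bigint NOT NULL,
    created_at  timestamptz NOT NULL DEFAULT now()
);
CREATE INDEX orders_customer_id_idx ON orders (customer_id);

-- The planner may choose an Index Scan for a selective predicate...
EXPLAIN SELECT * FROM orders WHERE customer_id = 42;

-- ...but can prefer a Seq Scan when it estimates that most rows match anyway.
EXPLAIN SELECT * FROM orders WHERE customer_id > 0;
```

The same WHERE clause shape can yield different plans depending on the data, which is exactly the behaviour this chapter sets out to explain.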
This chapter explains index structures and how to use them effectively, including predicates that cannot use an index (such as leading-wildcard LIKE '%...' searches) and invalid indexes.

This chapter explores the role of statistics in query optimisation, centred on ANALYZE, which can be configured for frequency and detail.
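A hedged sketch of the "frequency and detail" knobs mentioned above (the orders table and its column are hypothetical):

```sql
-- Detail: raise the per-column statistics target, then resample that column.
ALTER TABLE orders ALTER COLUMN customer_id SET STATISTICS 500;
ANALYZE orders (customer_id);

-- Frequency: have autovacuum re-ANALYZE this table after ~2% of rows change,
-- instead of the default 10%.
ALTER TABLE orders SET (autovacuum_analyze_scale_factor = 0.02);
```

More detailed statistics give the planner better row estimates at the cost of slower ANALYZE runs and larger catalog entries.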
This chapter lists anti-patterns and common mistakes in SQL development: NULLs, implicit column orders, and excessive procedural logic in queries.
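Two of the pitfalls named above can be shown in a few lines (the schema is illustrative, not from the course):

```sql
-- NULLs and three-valued logic: NOT IN returns no rows if the subquery
-- yields even one NULL customer_id.
SELECT * FROM customers
WHERE id NOT IN (SELECT customer_id FROM orders);

-- A NULL-proof formulation of the same intent:
SELECT c.* FROM customers c
WHERE NOT EXISTS (SELECT 1 FROM orders o WHERE o.customer_id = c.id);

-- Implicit column order: this breaks silently if columns are added or reordered.
INSERT INTO customers VALUES (1, 'Ada');
-- Prefer an explicit column list:
INSERT INTO customers (id, name) VALUES (1, 'Ada');
```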
This chapter offers initial advice and best practices for writing performant SQL.
This chapter explains how to read and interpret PostgreSQL’s execution plans: EXPLAIN and EXPLAIN ANALYZE, with options like BUFFERS, WAL, and SETTINGS, and plan nodes such as Seq Scan, Index Scan, Hash Join, etc.

This extensive PostgreSQL course consists of seven 2.5-hour live sessions, delivered over five days.
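The EXPLAIN options mentioned in the plans chapter can be combined in one statement (the query itself is illustrative):

```sql
-- Plain EXPLAIN shows the chosen plan with cost estimates only;
-- adding ANALYZE actually runs the query and reports real timings and rows.
-- BUFFERS adds shared-buffer hit/read counts, WAL shows WAL generated,
-- and SETTINGS lists non-default planner settings in effect.
EXPLAIN (ANALYZE, BUFFERS, WAL, SETTINGS)
SELECT o.customer_id, count(*)
FROM orders o
GROUP BY o.customer_id;
```

The resulting tree is read from the innermost nodes outward, with node types such as Seq Scan, Index Scan, and Hash Join named exactly as in the chapter.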
To receive a certificate of attendance, participants must attend all seven sessions.
(all times CEST)
Session 1: 13:00–15:30
Session 2: 09:00–11:30
Session 3: 13:00–15:30
Session 4: 09:00–11:30
Session 5: 13:00–15:30
Session 6: 09:00–11:30
Session 7: 13:00–15:30
Course duration: 7 sessions; 2.5 hours each
Course cost: €850 before taxes
📝 Following the form submission: You’ll receive payment instructions via email.
💳 Payment Methods: Credit card or direct bank transfer.
⚠️ Important: Your spot is only confirmed once payment is received.
Daria Nikolaenko is a database specialist with a strong focus on monitoring, high availability, performance tuning, and scalable system architecture. She has hands-on experience with PostgreSQL exporters and metric analysis, and brings deep expertise in building robust data infrastructures.