
Senior Data Lakehouse Engineer
- Toronto, ON
- Permanent
- Full-time
- Lead the growth and evolution of a scalable, secure, and high-performing data lakehouse that meets diverse business needs
- Collaborate with business and technical stakeholders to translate strategic goals and operational processes into practical, well-modeled data products
- Design and implement flexible data models and transformation logic to support enterprise-wide 360-degree views across customer, revenue, and operations
- Build and optimize ETL/ELT pipelines to ensure timely, accurate delivery of curated datasets that are easy for business teams to access and use
- Enable real-time and batch processing of data to support decision-making, personalization, reporting, and automation use cases
- Partner closely with analytics, product, and operational teams to help them explore data, ask better questions, and trust the answers
- Champion data discoverability, quality, and documentation, ensuring business users understand what data is available, how it’s defined, and how to use it confidently
- Establish secure, automated data ingestion and transformation pipelines with clear lineage, versioning, and governance controls
- Implement privacy- and compliance-focused data handling practices, including masking, encryption, and auditability
- Maintain a centralized data catalog and documentation hub, connecting datasets to their business meaning and usage context
- Guide platform improvements that reduce friction, increase self-service access, and accelerate time to insight for teams across the organization
- Mentor data engineers and contribute to the maturity of our data engineering practices and platform strategy
- Actively participate in Porter’s Safety Management System (SMS), including reporting hazards and incidents; uphold and promote the Company Safety Policy
- 7+ years of experience in data engineering, with a strong focus on modern cloud-based data architectures and team leadership
- Expert-level SQL skills, including stored procedure development, optimization, and modeling for analytics and operational use cases
- Proven experience designing Customer 360 data platforms, ideally integrating CRM and CDP sources in MACH-aligned architectures
- Deep understanding of versioned data models, incremental processing, and schema evolution
- Significant experience with real-time streaming technologies (e.g., Kafka, Kinesis, or equivalent)
- Hands-on experience with ETL/ELT tools such as AWS Glue, Matillion, or MuleSoft, and with orchestration tools like Airflow or AWS Step Functions
- Familiarity with cloud-native governance and cataloging frameworks (e.g., Lake Formation, Alation, Collibra)
- Strong understanding of data privacy and security frameworks, including PIPEDA and GDPR
- Ability to mentor others, lead design sessions, and communicate technical trade-offs with cross-functional stakeholders
- Excellent problem-solving, documentation, and communication skills
- Airline and/or aviation experience strongly preferred