
Data Engineer
- Montreal, QC
- Permanent
- Full-time
Reporting to the Lead Data Engineer, the Data Engineer is responsible for designing, developing, and maintaining data integration and transformation processes in our cloud-based data platform. While experience in Google Cloud Platform (GCP) is a significant asset, candidates with proven expertise in other major cloud platforms (AWS, Azure) will also be considered. This role emphasizes data governance, classification, and compliance, leveraging tools such as Collibra to ensure high-quality, secure, and well-documented data assets.
Key Responsibilities
Data Integration & Architecture
- Develop and orchestrate data pipelines that ingest data from various sources (e.g., MySQL, Oracle, PostgreSQL, flat files) into a cloud-based environment and move data between multiple systems based on business needs and requirements.
- Collaborate with Data Analysts and Data Architects on defining data models, requirements, and architecture for optimal performance in databases (e.g., BigQuery or other cloud-based relational databases).
- Ensure robust ETL/ELT processes that support scalability, reliability, and efficient data access.
- Implement and maintain data governance frameworks and standards, focusing on data classification, lineage, and documentation.
- Utilize Collibra or similar platforms to manage data catalogs, business glossaries, and data policies.
- Work closely with stakeholders to uphold best practices for data security, compliance, and privacy.
- Identify, design, and implement process enhancements for data delivery, ensuring scalability and cost-effectiveness.
- Automate manual tasks using scripting languages (e.g., Bash, Python) and enterprise scheduling/orchestration tools like Airflow.
- Conduct root cause analysis to troubleshoot data issues and implement solutions that enhance data reliability.
- Partner with cross-functional teams (IT, Analytics, Data Science, etc.) to gather data requirements and improve data-driven decision-making.
- Provide subject matter expertise on cloud data services, data classification standards, and governance tools.
- Monitor and communicate platform performance, proactively recommending optimizations to align with organizational goals.
Technical Expertise
- Experience with at least one major cloud platform (AWS, Azure, GCP), with GCP exposure considered a significant asset.
- Strong understanding of RDBMS (PostgreSQL, MySQL, Oracle, SQL Server) with the ability to optimize SQL queries and maintain database performance.
- Familiarity with version control systems (Git) to manage codebase changes and maintain a clean development workflow.
- Familiarity with data governance and classification concepts, leveraging Collibra or similar platforms to manage data lineage, business glossaries, and metadata.
- Knowledge of Linux/UNIX environments and experience working with APIs (REST, SOAP) and data formats (XML, JSON).
Data Pipeline Development
- Demonstrated ability to build large-scale, complex data pipelines for ETL/ELT processes.
- Hands-on experience with scripting/programming languages (e.g., Python, Bash) to automate data workflows and error handling.
- Strong analytical and problem-solving skills with the ability to work with unstructured datasets.
- Security & Compliance
- Functional knowledge of encryption technologies (SSL, TLS, SSH) and data protection measures.
- Experience implementing governance best practices to ensure data security and regulatory compliance.
Soft Skills
- Fluency in English and French (spoken and written), required to collaborate with stakeholders in Quebec, Ontario, and across the United States.
- Excellent communication and collaboration skills to partner effectively with cross-functional teams.
- Curiosity and a growth mindset, with the initiative to explore emerging data technologies.
- Bachelor’s degree in Information Technology, Computer Science, or a related field; or an equivalent combination of education and experience.
- 5+ years of progressive experience in data engineering, data analytics, or a similar role.
- Proven track record in architecting, optimizing, and delivering enterprise-grade data solutions on a major cloud platform (AWS, Azure, or GCP).
- Demonstrated commitment to continuous learning and improvement in data engineering methodologies.