
Data Engineer
- Canada
- Permanent
- Full-time

Responsibilities
- Data Pipeline Development: Design, build, and maintain robust and scalable data pipelines that automate data extraction, transformation, and loading from various sources (databases, APIs, flat files, etc.).
- Data Integration: Integrate disparate data sources into a unified, accessible format for analytics and reporting purposes.
- Data Modeling: Develop data models and data warehousing solutions, and implement best practices for structuring and storing large volumes of data.
- ETL Process Management: Develop and optimize ETL processes to handle high-throughput, real-time data streams as well as batch processing (a minimal batch ETL sketch follows this list).
- Performance Optimization: Monitor, optimize, and troubleshoot data processing workflows for improved speed, efficiency, and scalability.
- Collaboration: Work with cross-functional teams, including Data Scientists, Analysts, and Business Intelligence teams, to deliver high-quality data solutions.
- Data Quality Assurance: Ensure data quality, integrity, and compliance with industry standards and best practices.
- Cloud and Big Data Technologies: Manage cloud-based data storage solutions and big data processing frameworks to enable real-time and batch analytics at scale.
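
To ground the pipeline and ETL responsibilities above, here is a minimal batch ETL sketch in Python: it pulls records from a JSON API, keeps only the fields a reporting table needs, and upserts them into PostgreSQL. The endpoint, table, and connection string are illustrative assumptions, not details taken from this posting.

```python
"""Minimal batch ETL sketch: extract from a JSON API, transform, load to PostgreSQL.
All names below (endpoint, DSN, table, columns) are hypothetical placeholders."""
import requests
import psycopg2

API_URL = "https://api.example.com/v1/orders"  # hypothetical source endpoint
DSN = "dbname=analytics user=etl host=localhost"  # placeholder connection string


def extract(url: str) -> list[dict]:
    """Pull raw records from the source API (a single page, for brevity)."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return response.json()


def transform(records: list[dict]) -> list[tuple]:
    """Keep only the fields the reporting table needs, coercing types."""
    return [
        (r["id"], r["customer_id"], float(r["amount"]), r["created_at"])
        for r in records
        if r.get("amount") is not None
    ]


def load(rows: list[tuple]) -> None:
    """Upsert the cleaned rows into the reporting table."""
    with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
        cur.executemany(
            """
            INSERT INTO orders (id, customer_id, amount, created_at)
            VALUES (%s, %s, %s, %s)
            ON CONFLICT (id) DO UPDATE SET amount = EXCLUDED.amount
            """,
            rows,
        )


if __name__ == "__main__":
    load(transform(extract(API_URL)))
```

A production pipeline would add pagination, retries, schema validation, and incremental loading, but this extract/transform/load split is the basic shape of the batch pipelines described above.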

Qualifications
- Bachelor's or Master's degree in Computer Science, or equivalent professional experience.
- 4+ years of professional experience in data engineering or a related field.
- Solid understanding of data modeling, ETL processes, and data warehousing.
- Experience working with large-scale data infrastructure, including batch and stream processing systems.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork abilities.
- Ability to manage multiple projects and priorities in a fast-paced environment.
- Passion for continuous learning and keeping up with the latest data engineering trends.

Technical Skills
- Programming Languages: Proficiency in Python, Java, or Scala for data processing and automation.
- Big Data Technologies: Familiarity with Hadoop, Spark, Kafka, or similar distributed data processing frameworks.
- ETL Tools: Experience with ETL tools like Apache NiFi, Talend, or Informatica.
- Databases: Knowledge of SQL and NoSQL databases such as PostgreSQL, MySQL, MongoDB, or Cassandra.
- Cloud Platforms: Proficiency with cloud services such as AWS (Redshift, S3, Lambda), Google Cloud (BigQuery, Dataflow), or Azure (Data Lake Storage, Synapse Analytics).
- Data Warehousing Solutions: Experience with data warehousing platforms like Snowflake, Amazon Redshift, or Google BigQuery.
- Data Orchestration Tools: Familiarity with Apache Airflow, Prefect, or similar tools for workflow automation and scheduling (a minimal Airflow DAG sketch follows this list).
- Version Control: Experience with Git for source code management and collaboration.
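
As a rough illustration of the orchestration tooling listed above, the sketch below defines a daily Airflow DAG (Airflow 2.4+ assumed) that chains extract, transform, and load steps. The DAG id, schedule, and task bodies are placeholders rather than anything specified in this posting.

```python
"""Sketch of a daily Airflow DAG chaining extract -> transform -> load.
The DAG id, schedule, and task callables are illustrative only."""
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**_):
    print("pull raw data from the source system")


def transform(**_):
    print("clean and reshape the extracted data")


def load(**_):
    print("write the transformed data to the warehouse")


with DAG(
    dag_id="daily_orders_etl",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Run the tasks strictly in sequence, once per day.
    t_extract >> t_transform >> t_load
```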

Certifications
- AWS Certified Data Analytics - Specialty
- Google Professional Data Engineer
- Microsoft Certified: Azure Data Engineer Associate
- Cloudera Certified Associate (CCA) Data Analyst
- Certified Data Management Professional (CDMP)