Specialist Data Developer/Software Development
Canadian National Railway
- Montreal, QC
- Permanent
- Full-time
- Write performant, quality code that fulfills the design and passes code review with a minimal number of defects
- Apply configuration to the development environment when required
- Participate in implementing and supporting the full product in production
- Analyze source system data to assess data quality, connect to data sources, import data and transform data for Business Intelligence
- Design ETL processes and develop source-to-target data mappings, integration workflows, and load processes
- Interact with the Data Designer to understand solution requirements; highlight the technical impacts of the functional design on existing solutions based on detailed analysis
- Deliver technical designs and database structures for products of medium to high complexity
- Create, review, and maintain technical documentation
- Analyze and troubleshoot production issues and provide remediation
- Contribute to developing the design and coding standards that apply to the whole practice
- Document the blueprint based on requirements and functional designs
- Document designs, architect data maps, develop data quality components, and establish and/or conduct unit tests
- Participate in gathering, understanding, and validating project specifications, and take part in ETL architecture design reviews
- Ensure quality KPIs are identified, measured, and produced, ensuring development standards are respected; ensure the right level of testing is applied consistently across all projects
- Identify problems, develop ideas, and propose solutions in varied situations requiring analytical, evaluative, or constructive thinking in daily work
- Perform reviews and quality checks after data has been loaded
- Minimum of 4 years of overall work experience as a developer
- Hands-on experience with Azure, Azure Data Factory (ADF), and Databricks is a MUST
- Proficiency in Java, Python, and shell scripting is a MUST
- Proficiency in Scala is a plus
- Ability to develop batch and streaming applications
- Exposure to NoSQL databases and thorough experience with SQL
- Experience working in Agile environments and with Azure DevOps
- Azure Certification, Databricks Certification, Snowflake expertise
- Knowledge of Hadoop ecosystem (Hive, Spark, HDFS, NiFi)
- Bachelor's degree in computer science, or an equivalent degree or work experience
- Strong communication skills, including the ability to speak clearly to technical and non-technical people
- Self-driven, highly motivated team player, able to learn quickly
- Proficiency with SQL and/or data modeling skills
- Proficiency with programming technologies in the area of expertise: Python, Java/Scala, PowerShell
- Experience in troubleshooting and resolving database integrity and performance issues
- Experience in data warehouse design, ELT/ETL, and BI reporting/analytics tools
- Experience with Big Data techniques and the cloud; knowledge of messaging queues (Kafka, Azure Event Hubs, RabbitMQ, etc.) and the ELK stack
- Experience developing CI/CD pipelines
- Awareness of Agile principles, automation, scripting skills, and DevOps
- Strong understanding of data warehousing and business intelligence architecture
- Experience with Azure (Data Lake, Data Factory, Databricks, Data Explorer, Data Warehouse)
- Experience with version control systems (Git) and Azure DevOps
- Knowledge of Big Data analytics technologies in a Cloud environment