Operational Excellence Manager
DSV
- Victoria, BC
- $93,909 per year
- Permanent
- Full-time
Job Posting Title: Operational Excellence Manager
Time Type: Full Time
Compensation: Salary starting at $93,909, plus eligible benefits, in accordance with experience and internal equity
This posting is for an existing vacancy and is not intended to create a talent pipeline.
Purpose of the Position
The Partner Integration Team under Cross-Product Operational Excellence works on behalf of Air & Sea to define, maintain, and, jointly with IT, implement the standards for exchanging data with business partners, with the objective of driving process automation and data quality in operations. Leveraging the global operational excellence organization in Air & Sea, the team will work with business partners to continuously improve data availability, timeliness, and accuracy through automated interfaces.
Key tasks & responsibilities
- In collaboration with IT and in close alignment with Air & Sea Products, define and maintain global standards for data integration with carriers and other external partners on behalf of Air & Sea. This includes semantic models and associated mappings as well as different integration patterns (e.g., asynchronous/message-based vs synchronous/API based, push vs pull).
- In alignment with and on behalf of the Air & Sea Products, provide the logical mappings required for specific carrier integrations.
- In alignment with and on behalf of the Air & Sea Products and in collaboration with IT and Master Data Management, define and manage the ongoing maintenance of internal mapping tables and related structures required to enrich incoming partner data for consumption by internal processes and systems.
- Work with carriers to resolve data quality and performance issues (completeness, timeliness, correctness) for data exchanged via global interfaces; work with the global Operational Excellence organization to continuously improve the quality and performance of that data at a local level.
- Monitor and drive utilization of electronic data exchange with Air and Sea carriers across the organization to enable productivity gains.
- Participate in and contribute to cross-functional collaboration teams as required.
As the role works closely with IT and enterprise data platforms, a strong technical foundation in modern data engineering and cloud-based integration environments is required:
- Understanding of data lake / lakehouse architectures and structured data processing (e.g., Delta Lake concepts)
- Experience with integration patterns in distributed systems (API-based integrations, event-driven architecture, message queues, batch vs. real-time processing)
- Knowledge of data modeling concepts (semantic models, canonical data models, mapping structures, enrichment logic)
- Practical experience with structured data formats such as JSON, XML, EDI, CSV and API payloads
- Understanding of data quality frameworks, monitoring approaches, and data validation techniques
- Familiarity with CI/CD principles, DevOps practices, and version-controlled deployment in data environments
- Ability to translate business integration requirements into scalable technical solutions in collaboration with IT teams
- Master's degree in a related technical or business area, or equivalent work experience
- 7 years of industry experience
- Good command of English, both verbal and written
- Good communication skills, as the role requires frequent communication in all directions and at all management levels
- Ability to build collaborative engagement and to listen actively
- Strong analytical and conceptual capabilities
- Both a team builder and a team player
- English is the principal language for this position; additional languages may be required in other locations.
- Ability to communicate effectively in a diverse multicultural environment.
- Highly proficient in MS Office
- Solid understanding of modern data architectures in cloud environments (preferably Microsoft Azure stack)
- Hands-on experience with Azure Data Factory (or comparable orchestration tools) for building, monitoring, and troubleshooting data pipelines
- Experience working with Databricks (Spark-based data processing), including transformation logic, data validation, and performance optimization