193 Cloud Data Engineer jobs in Canada
Senior Cloud Data Engineer | Ingénieure senior en données Cloud
Posted today
Job Description
Salary:
About the Role
We are hiring a Senior Cloud Data Engineer to lead the development of our cloud infrastructure that powers AI/ML product pipelines from data ingestion and orchestration to model deployment and secure application delivery.
This is a deeply hands-on engineering role, responsible for designing, coding, and maintaining scalable, secure, multi-tenant data environments across AWS and Azure. You will play a central role in building infrastructure to support AI/ML use cases such as demand forecasting, predictive analytics, and real-time decision-making.
This is a hands-on, engineering-heavy role ideal for someone who thrives in high-responsibility environments, demonstrates sharp problem-solving skills, and takes initiative to independently deliver robust solutions from end to end.
Key Responsibilities
Cloud Data Engineering & Architecture
- Architect and build scalable, secure, multi-tenant cloud data pipelines in AWS and Azure
- Implement robust ETL/ELT pipelines and APIs to move and access data across Oracle, AWS, and Snowflake, including ERP-to-cloud, cloud-to-ERP, and intra-cloud flows
- Leverage AWS services (Glue, Lambda, S3, RDS, EventBridge), AWS Batch, Azure components, and orchestration tools like Airflow and Kedro to build resilient and maintainable pipelines, enforcing modularity and reusability (a minimal orchestration sketch follows this list).
- Automate infrastructure provisioning using Terraform / OpenTofu; manage CI/CD pipelines with Jenkins, GitHub Actions, or ArgoCD.
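To make the orchestration work above concrete, here is a minimal, hypothetical Airflow DAG that stages an ERP extract to S3 and then loads it into Snowflake. This is a sketch under assumed names (the DAG id, task bodies, and connections are placeholders), not the team's actual pipeline.

```python
# Minimal sketch of a two-step Airflow DAG (Airflow 2.x style): stage an ERP extract
# to S3, then load it into Snowflake. All names below are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_s3(**context):
    """Pull a batch from the source system and upload it to the staging bucket."""
    # e.g. read an ERP export and upload with boto3 (omitted in this sketch)


def load_to_snowflake(**context):
    """COPY the staged files into the target Snowflake table."""
    # e.g. run COPY INTO via the Snowflake connector (omitted in this sketch)


with DAG(
    dag_id="erp_to_snowflake_daily",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)
    load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)

    extract >> load  # run the load only after staging succeeds
```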
AI/ML Infrastructure & MLOps
- Build infrastructure to support AI/ML workflows (e.g. training, validation, versioning)
- Integrate with experiment tracking tools (e.g. MLflow) and model lifecycle pipelines
- Enable scalable model deployment in secure environments (containerized or cloud-native)
- Support the full MLOps lifecycle (data prep, parameter tuning, model deployment, and monitoring) in close collaboration with AI/ML scientists; a brief experiment-tracking sketch follows this list.
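For the experiment-tracking integration mentioned above, the snippet below shows the basic MLflow logging pattern for a hypothetical demand-forecasting run; the tracking URI, experiment name, parameters, and metric values are placeholders.

```python
# Minimal sketch of logging a training run to MLflow. All names and values are
# illustrative; in practice the tracking URI would point at the team's MLflow server.
import mlflow

mlflow.set_tracking_uri("http://mlflow.internal:5000")  # placeholder tracking server
mlflow.set_experiment("demand-forecasting")             # placeholder experiment

with mlflow.start_run(run_name="baseline"):
    mlflow.log_param("model_type", "gradient_boosting")
    mlflow.log_param("horizon_days", 28)
    mlflow.log_metric("val_mape", 0.12)
    # mlflow.sklearn.log_model(model, "model")  # would also log the fitted model
```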
Secure Application Deployment
- Deploy and manage React or Python-based ML applications with secure user access
- Ensure private networking, MFA, RBAC, and encryption best practices in Azure and AWS
- Create CI/CD pipelines (e.g., Jenkins, GitHub Actions) integrated with Docker/Kubernetes
Automation & DevSecOps
- Design end-to-end automation for data movement, transformation, and model execution
- Integrate automated testing, scanning, and rollback strategies into CI/CD pipelines
- Maintain monitoring and logging with Prometheus, CloudWatch, or similar tools
Cross-Platform & Multi-Cloud Strategy
- Build portable components deployable across AWS and Azure
- Support integration of services like AWS Batch, AWS Glue, S3, Lambda, EventBridge, etc
ERP and Retail Context
- Work with ERP-based datasets (Oracle), including complex relational structures
- Understand the unique needs of AI applications in retail and supply chain analytics
Qualifications
Education & Experience
- Master's degree in Data Science, Computer Science, or Software Engineering
- 5+ years of real-world experience in cloud data engineering, infrastructure, and deployment roles
- Prior professional experience with AI/ML pipelines or applications is strongly preferred
Tech Stack & Tools (Must-Have Experience)
- AWS (S3, Lambda, Glue, RDS, IAM, EventBridge), AWS Batch, Azure
- Snowflake, Oracle, SQL
- Python, PySpark, Docker, Kubernetes, Terraform
- Airflow, MLflow (for tracking), Git, CI/CD pipelines
Traits We Value
- Self-driven, independent, and resourceful, able to own solutions from idea to production
- High attention to detail and a deep respect for secure, responsible data handling
- Collaborative mindset to work with data scientists, product managers, and architects
- Excellent written and verbal communication
Additional Information
- Remote Work Model: Hybrid; 2 days per week in the Montreal office
We thank all applicants for their interest; only those shortlisted will be contacted.
Join us to help build the cloud foundations of our AI-powered future!
Solution Architect, IT Infrastructure & Cloud Computing
Posted today
Job Description
We are looking for a solution architect (SA) who will report to the manager of solution architecture. The SA will primarily define the solution architecture for specific projects but will also work with business leaders to build a roadmap and alignment of solutions. As part of their role, the SA will review and understand the business requirements for a project in order to design a technology solution that meets those requirements while respecting the guiding principles of the Enterprise Architecture within the current BRP context.
YOU’LL HAVE THE OPPORTUNITY TO:
- Provide an architectural design of the solution covering the business, application, information and technology domains via the solution architecture design document.
- Develop and implement infrastructure solutions to meet client needs. This involves assessing client requirements, selecting appropriate infrastructures and network services and designing scalable and secure architecture.
- Provide expertise in migrating existing applications and infrastructure to the cloud. This includes performing cloud readiness assessments, planning migration strategies, and ensuring a seamless transition with minimal downtime.
- Ensure infrastructure solutions adhere to industry standards and compliance requirements. This involves implementing security protocols, managing data encryption, and configuring cloud security services.
- Partner with the organization to understand organizational and departmental strategy and agree on information systems solutions to meet the needs.
- Estimate costs and prepare business cases for IT solutions, considering infrastructure, licences, development and support.
YOU’LL THRIVE IN THIS ROLE IF YOU HAVE THE FOLLOWING SKILLS AND QUALITIES:
- Bachelor's degree in computer science, information systems or a related study, or equivalent project-related experience.
- Minimum of ten years of experience in IT, with at least five years in information system design.
- In-depth experience designing and implementing information solutions, with specialization in IT infrastructure (server architecture for Windows and Linux, VDI, virtualization, networking projects).
- Strong knowledge of enterprise networking, cybersecurity, identity management systems, backup and recovery, monitoring tools, and IaC.
- In-depth experience with Microsoft technologies (Azure, SharePoint, OneDrive, M365, etc.)
- Knowledge of public Cloud technologies (IaaS, CaaS, SaaS, PaaS) running on Azure, GCP and AWS.
- Knowledge of Agile methodology.
- Experience in a factory IT environment.
- Strategic business acumen and understanding of organization strategy and ability to design information systems to deliver that strategy.
- Excellent communication skills with the ability to explain technical concepts to lay audiences. Some experience of working with board-level stakeholders.
- Team player with experience leading and collaborating cross-team to ensure successful delivery of solutions.
- Strong conceptual and analytical skills - demonstrating outside-the-box problem-solving skills and ability to develop solution architecture designs.
- Knowledge of Enterprise Architecture methodologies such as TOGAF with ArchiMate or the equivalent.
- Bilingualism (French & English) - knowledge of English is required because you will be representing BRP in negotiations with vendors in the US and exchanging with stakeholders around the world.
ACKNOWLEDGING THE POWER OF DIVERSITY
BRP is dedicated to nurturing a culture that invites, connects, and propels the ambitions of people of all backgrounds, profiles, beliefs and experiences. Ultimately, the diversity and uniqueness of our people fuel our ingenuity and set the course for the path ahead!
For this reason, we value diversity and we strive to always push each other forward to build an inclusive workplace where every employee feels like they belong, where they can grow and find meaning.
AT BRP, WHEN WE TALK ABOUT BENEFITS, WE GO ALL IN.
Let’s start with a strong foundation - You want it, we have it:
- Annual bonus based on the company’s financial results
- Generous paid time away
- Pension plan
- Collective saving opportunities
- Industry leading healthcare fully paid by BRP
What about some feel good perks:
- Flexible work schedule
- A summer schedule that varies by department and location
- Holiday season shutdown
- Educational resources
- Discount on BRP products
WELCOME TO BRP
We’re a world leader in recreational vehicles and boats, creating innovative ways to move on snow, water, asphalt, dirt and even in the air. Headquartered in the Canadian town of Valcourt, Quebec, our company is rooted in a spirit of ingenuity and intense customer focus. Today, we operate manufacturing facilities in Canada, the United States, Mexico, Finland, Australia and Austria, with a workforce made up of close to 20,000 spirited people, all driven by the deeply held belief that at work, as with life itself, it’s not about the destination; it’s about the journey.
#LI-Hybrid
#LI-EF1
Data Engineer
Posted 5 days ago
Job Description
Job Summary
The **Data Engineer** will work closely with senior developers to build out new and maintain existing ETL pipelines in Airflow. They will be part of our Data Technology team, working with an all-new technology stack and collaborating closely with internal and external stakeholders.
Now, if you were to come on board as our **Data Engineer**, we’d ask you to do the following for us:
- Design, implement and maintain data pipelines for extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS technologies
- Build and maintain existing Airflow DAGs
- Maintain and add to our middle-tier API
- Prototype new technology that supports our vision of making our consumers' experiences better with our data
- Provide support and insights to the business analytics and data science teams
- Work with stakeholders to assist with data-related technical issues and support their data infrastructure needs
Think you have what it takes to be our **Data Engineer**? We’re committed to hiring the best talent for the role. Here’s how we’ll know you’ll be successful in the role:
- At least 2-3 years of relevant experience in a similar role or function
- Bachelor’s degree or equivalent in Computer Science, Computer Engineering, Information Systems or similar
- Programming skills with Python and Spark
- Exposure to at least one cloud provider, AWS preferred
- Knowledge of SQL, relational database concepts and SQL scripting
- Some experience with Airflow or similar systems highly preferred
- Web development skills with frameworks like Angular or Node.js would be a plus
- Experience with Docker, Kafka, or other data stream processing platforms is preferred
Compass Group Canada is committed to nurturing a diverse workforce representative of the communities within which we operate. We encourage and are pleased to consider all qualified candidates, without regard to race, colour, citizenship, religion, sex, marital / family status, sexual orientation, gender identity, aboriginal status, age, disability or persons who may require an accommodation, to apply.
For accommodation requests during the hiring process, please contact for further information.
Data Engineer
Posted today
Job Description
Reference No. R2816003
Position Title:
Data Engineer
Department:
Digital R&D, Data Products and Platforms
Location:
Downtown Toronto, ON (3 days onsite)
About the Job
Our Hubs are a crucial part of how we innovate, improving performance across every Sanofi department and providing a springboard for the amazing work we do. Build a career with us and you can be part of transforming our business while helping to change millions of lives.
At Sanofi, we’re committed to providing the next-gen healthcare that patients and customers need. It’s about harnessing data insights and leveraging AI responsibly to search deeper and solve sooner than ever before. Join our Data Products and Platforms Team as a Data Engineer and you can help make it happen.
Sanofi has recently embarked into a vast and ambitious digital transformation program. A cornerstone of this roadmap is the acceleration of its data transformation and of the adoption of artificial intelligence (AI) and machine learning (ML) solutions, to accelerate R&D, manufacturing and commercial performance and bring better drugs and vaccines to patients faster, to improve health and save lives.
The Data Products and Platforms Team is a key team within R&D Digital, focused on developing and delivering Data and AI products for R&D use cases. This team plays a critical role in pursuing broader democratization of data across R&D and providing the foundation to scale AI/ML, advanced analytics, and operational analytics capabilities.
As a Data Engineer, you will join this dynamic team committed to driving strategic and operational digital priorities and initiatives in R&D. You will work as part of a Data Product Delivery Pod, led by a Product Owner, in an agile environment to deliver Data & AI Products. As part of this team, you will be responsible for the design and development of data pipelines and workflows to ingest, curate, process, and store large volumes of complex structured and unstructured data. You will have the ability to work on multiple data products serving multiple areas of the business.
Our vision for digital, data analytics and AI:
Join us on our journey in enabling Sanofi’s Digital Transformation through becoming an AI-first organization. This means:
AI Factory - Versatile Teams Operating in Cross Functional Pods:
Utilizing digital and data resources to develop AI products, bringing data management, AI and product development skills to products, programs and projects to create an agile, fulfilling and meaningful work environment.
Leading Edge Tech Stack:
Experience building products that will be deployed globally on a leading-edge tech stack.
World Class Mentorship and Training:
Working with renowned leaders and academics in machine learning to further develop your skillsets.
We are an innovative global healthcare company with one purpose:
to chase the miracles of science to improve people’s lives. We’re also a company where you can flourish and grow your career, with countless opportunities to explore, make connections with people, and stretch the limits of what you thought was possible. Ready to get started?
Main Responsibilities:
Data Product Engineering:
Provide input into the engineering feasibility of developing specific R&D Data/AI Products
Provide input to Data/AI Product Owner and Scrum Master to support with planning, capacity, and resource estimates
Design, build, and maintain scalable and reusable ETL/ELT pipelines to ingest, transform, clean, and load data from sources into central platforms/repositories (a minimal load-step sketch follows this list)
Structure and provision data to support modeling and data discovery, including filtering, tagging, joining, parsing and normalizing data
Collaborate with Data/AI Product Owner and Scrum Master to share progress on engineering activities and inform of any delays, issues, bugs, or risks with proposed remediation plans
Design, develop, and deploy APIs, data feeds, or specific features required by product design and user stories
Optimize data workflows to drive high performance and reliability of implemented data products
Oversee and support junior engineers with Data/AI Product testing requirements and execution
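As an illustration of one such ELT step, the sketch below copies staged files into Snowflake from Python. It is hypothetical: the account, stage, and table names are placeholders rather than actual Sanofi objects, and real credentials would come from a secrets manager.

```python
# Hedged sketch of a single ELT load step with the Snowflake Python connector.
# Account, stage, and table names are placeholders for illustration only.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myaccount",   # placeholder
    user="etl_service",    # placeholder
    password="***",        # use a secrets manager in practice
    warehouse="ETL_WH",
    database="RND",
    schema="CURATED",
)
try:
    cur = conn.cursor()
    # Load newly staged files, then de-duplicate into the curated table.
    cur.execute("""
        COPY INTO raw_assay_results
        FROM @assay_stage
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
    cur.execute("INSERT INTO assay_results SELECT DISTINCT * FROM raw_assay_results")
finally:
    conn.close()
```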
Innovation & Team Collaboration:
Stay current on industry trends, emerging technologies, and best practices in data product engineering
Contribute to a team culture of innovation, collaboration, and continuous learning within the product team
About You
Key Functional Requirements & Qualifications:
Bachelor’s degree in software engineering or related field, or equivalent work experience
5 years of experience in data product engineering, software engineering, or other related field
Understanding of R&D business and data environment preferred
Excellent communication and collaboration skills
Working knowledge and comfort working with Agile methodologies
Key Technical Requirements & Qualifications:
Proficiency with data analytics and statistical software (incl. SQL, Python, Java, Excel, AWS, Snowflake, Informatica)
Deep understanding and proven track record of developing data pipelines and workflows
Why Choose Us?
Bring the miracles of science to life alongside a supportive, future-focused team
Discover endless opportunities to grow your talent and drive your career, whether it’s through a promotion or lateral move, at home or internationally
Enjoy a thoughtful, well-crafted rewards package that recognizes your contribution and amplifies your impact
Take good care of yourself and your family, with a wide range of health and wellbeing benefits including high-quality healthcare, prevention and wellness programs
Applications received after the official close date will be reviewed on an individual basis.
NOTE:
Internal applicants are required to notify their manager of their application.
This position is for a new vacant role that is now open for applications.
Sanofi is an equal opportunity employer committed to diversity and inclusion. Our goal is to attract, develop and retain highly talented employees from diverse backgrounds, allowing us to benefit from a wide variety of experiences and perspectives. We welcome and encourage applications from all qualified applicants. Accommodations for persons with disabilities required during the recruitment process are available upon request.
#GD-SP
#LI-SP
#LI-Onsite
All compensation will be determined commensurate with demonstrated experience. Employees may be eligible to participate in Company employee benefit programs, and additional benefits information can be found here.
Data Engineer
Posted today
Job Description
We are seeking a hybrid Hadoop Engineer and Hadoop Infrastructure Administrator to build and maintain a scalable and resilient Big Data framework that supports our data scientists. As an administrator, you will deploy and maintain Hadoop clusters, add and remove nodes using cluster management and monitoring tools like Cloudera Manager, and meet performance and scalability requirements. Some relational database administrator experience is also desirable to support the general administration of relational databases.
Design, build, and maintain Big Data workflows/pipelines that process continuous streams of data, with experience in the end-to-end design and build of near-real-time and batch data pipelines (a streaming-ingestion sketch follows this list).
Demonstrated work experience with Big Data and distributed programming models and technologies, including:
Knowledge of database structures, theories, principles, and practices (both SQL and NoSQL).
Active development of ETL processes using Spark or other highly parallel technologies, and implementing ETL/data pipelines
Experience with Data technologies and Big Data tools, like Spark, Kafka, Hive
Understanding of MapReduce and other data query, processing, and aggregation models
Understanding of the challenges of transforming data across a distributed, clustered environment
Experience with techniques for consuming, holding, and aging out continuous data streams
Ability to provide quick ingestion tools and corresponding access APIs for continuously changing data schema, working closely with Data Engineers around specific transformation and access needs
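As a sketch of the near-real-time side of this work, the following Spark Structured Streaming job reads a Kafka topic and lands the records on HDFS as Parquet. Broker, topic, and path names are illustrative, and the cluster is assumed to have the Spark-Kafka connector package available.

```python
# Hedged sketch of a near-real-time ingestion pipeline: Kafka -> Spark Structured
# Streaming -> Parquet on HDFS. Endpoint, topic, and path names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka_to_hdfs_ingest").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder broker
    .option("subscribe", "events")                      # placeholder topic
    .option("startingOffsets", "latest")
    .load()
    .select(col("key").cast("string"), col("value").cast("string"), col("timestamp"))
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "hdfs:///data/raw/events")               # placeholder landing path
    .option("checkpointLocation", "hdfs:///checkpoints/events")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```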
Preferred:
Experience as a database administrator (DBA) responsible for keeping critical tool databases up and running
Building and managing high-availability environments for databases and HDFS systems
Familiarity with transaction recovery techniques and DB Backup
Skills and Attributes:
Ability to have effective working relationships with all functional units of the organization
Excellent written, verbal, and presentation skills
Excellent interpersonal skills
Ability to work as part of a cross-cultural team
Self-starter and Self-motivated
Ability to work with minimal supervision
Works under pressure and manages competing priorities
Data Engineer
Posted today
Job Description
Salary:
Forum Asset Management Data Engineer
Location: 181 Bay Street, Toronto
In Office: 5 days per week
Overview of Forum:
Join us in delivering Extraordinary Outcomes through investment.
Forum is an investor, developer and asset manager operating across North America for over 28 years, focusing on real estate, private equity and infrastructure, with a strategic concentration in housing.
Our AUM exceeds $3 billion. We are committed to sustainability, responsible investing and creating value that benefits our stakeholders and the communities in which we invest, what we call our Extraordinary Outcomes.
In 2024, Forum completed the largest real estate transaction in Canada with the Alignvest acquisition, making us the largest owner of Purpose-Built Student Accommodation (PBSA) in Canada through our $.5B open-ended Real Estate Income and Impact Fund (REIIF). Our national development pipeline now exceeds 3.5 billion, positioning Forum as the largest developer of PBSA in Canada, operating from coast to coast.
The Forum team is adaptable, agile, and dynamic, committed to sustainable and responsible investing. Our people bring diverse cultural backgrounds and professional experiences, fostering innovation and thought leadership. We uphold integrity, trust, and transparency as core values, knowing that to achieve Extraordinary Outcomes, we need to support and develop an Extraordinary team.
Position Overview:
We're looking for a Data Engineer to own the architecture, build, and evolution of Forum's enterprise data platform using the Microsoft Fabric technology stack (or a suitable alternative). This is a rare greenfield opportunity to architect the full data lifecycle, from ingestion and transformation to analytics and AI enablement, while directly impacting Forum's investment, real estate, and operational strategies.
This is a hands-on, in-office role with high visibility. You'll act as a trusted advisor and technical authority, collaborating with investment professionals, business leaders, analysts, and software developers to design enterprise-grade systems and AI-powered workflows that drive measurable business value. Your work will lay the foundation for our long-term data strategy and the future of decision-making at Forum.
Key Duties and Responsibilities:
- Own the end-to-end design and evolution of Forum's Lakehouse architecture (Microsoft Fabric, Azure Data Factory, Synapse, ADLS, and related Azure services)
- Define and enforce data engineering standards, governance frameworks, and lifecycle management practices
- Lead large-scale data initiatives, migrations, and integrations across diverse internal and external systems
- Design and optimize enterprise-grade ETL/ELT pipelines for high-volume, high-complexity data
- Implement structured data workflows (e.g., Medallion Architecture) to deliver reliable, business-ready data (a minimal bronze-to-silver sketch follows this list)
- Develop AI/ML-powered workflows using tools like Azure OpenAI, Document Intelligence, and Azure ML
- Create internal APIs and tools to enable self-serve analytics and data access across departments
- Partner with business teams across real estate, private equity, and operations to identify data opportunities and implement tailored solutions
- Develop and evolve our Power BI dashboards and reporting layer, ensuring reliable data access and visualization
- Promote best practices in data governance, automation, and AI application across Forum's technology ecosystem
- Partner with internal teams and external technology vendors to drive the rollout of the platform
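To make the Medallion-style workflow above concrete, here is a minimal, hypothetical bronze-to-silver step written as a PySpark job (in Fabric this would typically run in a notebook attached to the Lakehouse). The table names and rent-roll schema are assumptions for illustration, not Forum's actual model.

```python
# Illustrative bronze-to-silver refinement step in a Medallion-style Lakehouse.
# Table names, keys, and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bronze_to_silver_rent_roll").getOrCreate()

bronze = spark.read.table("bronze.rent_roll_raw")  # raw ingested records

silver = (
    bronze
    .dropDuplicates(["property_id", "unit_id", "period"])         # assumed natural key
    .withColumn("rent_amount", F.col("rent_amount").cast("decimal(12,2)"))
    .withColumn("period", F.to_date("period", "yyyy-MM"))
    .filter(F.col("rent_amount").isNotNull())                     # basic quality gate
)

silver.write.mode("overwrite").saveAsTable("silver.rent_roll")    # business-ready layer
```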
Candidate Profile:
- 6-7 years of progressive experience in data engineering, with a proven track record of architecting and delivering enterprise-scale data solutions
- Expert-level experience with the Microsoft Azure ecosystem: Microsoft Fabric, Azure Data Factory (ADF), Synapse, ADLS Gen2, and Power BI
- Proficiency in Python for data engineering, automation, and API development
- Deep understanding of data modeling, ELT/ETL design, and data warehouse best practices
- Track record architecting enterprise-scale data platforms in high-growth or regulated industries
- Proven success deploying AI/ML solutions into production at scale
- Experience integrating Azure OpenAI, LLMs, and document intelligence into real business processes
- Ability to evaluate, pilot, and operationalize emerging AI technologies for measurable business impact
- Demonstrated ability to work directly with executives, investors, and analysts to shape data strategy
- Strong communication skills and an entrepreneurial mindset: capable of turning ambiguity into working solutions
- Experience working with financial data models, capital tables, investment reports, or property management systems
- Background in private equity, real estate, asset management, or related sectors preferred
At Forum, we encourage diversity. We are committed to an inclusive workplace that reflects our belief that diversity is central to building a high-performing team. Forum is an equal-opportunity employer. We are committed to providing accessible employment practices. Should you require an accommodation during any phase of the recruitment process, please let the recruitment team know at
Data Engineer
Posted today
Job Description
Salary: $90,000 - $110,000
We are seeking a motivated Data Engineer with 2-3 years of experience to join our innovative team. In this role, you will design, build, and optimize data pipelines and analytics workflows using modern cloud data platforms and tools. You will collaborate with cross-functional teams to transform raw data into actionable insights that power intelligent product features, ML models, and strategic decision-making.
Responsibilities
- Design and implement robust ETL pipelines using Apache Airflow
- Build and maintain data warehouses in Snowflake to support scalable data ingestion and transformation
- Develop ETL workflows for structured and semi-structured data from various sources using AWS services
- Collaborate with data scientists and ML engineers to prepare and transform data for AI/ML model training and inference
- Build interactive data applications using Streamlit (see the sketch after this list)
- Design and implement data models to support machine learning workflows and analytics
- Integrate data from APIs, databases, and cloud storage using Python
- Implement data quality checks and monitoring systems to ensure data integrity
- Document data models and pipeline architectures
- Stay updated on advancements in Airflow, AWS, Snowflake, Streamlit, and AI/ML technologies
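As a small example of the Streamlit-plus-Snowflake pattern referenced in the responsibilities above, the sketch below renders a metric picker and chart over a warehouse table. Connection parameters, table, and column names are placeholders; it assumes snowflake-connector-python with the pandas extra and a recent Streamlit release.

```python
# Hedged sketch of a Streamlit data app backed by Snowflake. All identifiers are
# placeholders; secrets should come from st.secrets or a vault in practice.
import pandas as pd
import snowflake.connector
import streamlit as st


@st.cache_data(ttl=600)
def load_daily_metrics() -> pd.DataFrame:
    conn = snowflake.connector.connect(
        account="myaccount", user="reporting", password="***",  # placeholders
        warehouse="BI_WH", database="ANALYTICS", schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        cur.execute("SELECT event_date, metric, value FROM daily_metrics ORDER BY event_date")
        return cur.fetch_pandas_all()  # Snowflake returns uppercase column names
    finally:
        conn.close()


st.title("Daily Metrics")
df = load_daily_metrics()
metric = st.selectbox("Metric", sorted(df["METRIC"].unique()))
st.line_chart(df[df["METRIC"] == metric], x="EVENT_DATE", y="VALUE")
```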
Requirements
- 2-3 years of professional experience in data engineering or a related field
- Hands-on experience with Apache Airflow for building and orchestrating ETL pipelines
- Practical experience with AWS services
- Experience working with Snowflake for data warehousing workloads
- Strong Python programming skills for data processing and automation
- Experience with Streamlit for building data applications
- Solid understanding of data modeling concepts
- Familiarity with AI/ML workflows, including data preparation for model training
- Bachelor's degree in Computer Science, Data Engineering, or a related technical field (or equivalent experience)
Preferred Qualifications
- Advanced experience with AWS data services
- Deep expertise in Snowflake optimization and performance tuning
- Experience building complex Streamlit applications
- Advanced Python skills for data wrangling and automation
- Experience with AI/ML model deployment and data modeling for machine learning
- Knowledge of Airflow best practices for ETL pipeline design