235 ETL Developer jobs in Canada
BI Developer & Data Specialist
Job Description
We are seeking a BI Developer and Data Specialist - curious and meticulous - who will play a key role in validating, interpreting, and directing strategic data from the web and the industry.
YOU’LL HAVE THE OPPORTUNITY TO:
Validate the accuracy and consistency of data gathered through web scraping and various external sources.
Collaborate closely with the data engineer to specify the data extraction requirements.
Verify and cross-reference data from the industry to support teams in strategy, marketing, and product development.
Identify trends, anomalies, or gaps through SQL queries (via Snowflake) and formulate recommendations based on insights.
Contribute to the development of visual dashboards (an asset if you're skilled in Power BI).
Identify connections between various data sources to generate impactful analyses to aid in decision-making.
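The validation duty described above can be illustrated with a minimal sketch in pure Python, checking hypothetical web-scraped records for missing fields, bad values, and duplicates. The field names (sku, price, source) and rules are illustrative assumptions, not the actual data model.

```python
# Minimal sketch of consistency checks on scraped data (pure Python).
# Field names and rules are hypothetical examples.

def validate_records(records):
    """Return a list of (index, issue) pairs for suspect records."""
    issues = []
    seen = set()
    for i, rec in enumerate(records):
        # Required fields must be present and non-empty.
        for field in ("sku", "price", "source"):
            if not rec.get(field):
                issues.append((i, f"missing {field}"))
        # Prices should be positive numbers.
        price = rec.get("price")
        if isinstance(price, (int, float)) and price <= 0:
            issues.append((i, "non-positive price"))
        # Flag duplicate (sku, source) pairs across scrapes.
        key = (rec.get("sku"), rec.get("source"))
        if key in seen:
            issues.append((i, "duplicate sku/source"))
        seen.add(key)
    return issues

records = [
    {"sku": "A1", "price": 19.99, "source": "web"},
    {"sku": "A1", "price": 19.99, "source": "web"},   # duplicate
    {"sku": "B2", "price": -5.0, "source": "web"},    # bad price
]
print(validate_records(records))
```

In practice such checks would run as SQL queries in Snowflake; the logic is the same.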
YOU'LL THRIVE IN THIS ROLE IF YOU HAVE THE FOLLOWING SKILLS AND QUALITIES:
A bachelor's degree in business intelligence, computer science, or a related field.
Excellent mastery of SQL and experience with modern data warehouses, preferably Snowflake.
A strong analytical mind, intellectual curiosity, and the ability to connect the dots.
Skilled at simplifying complex data into clear and actionable findings.
Knowledge of data visualization tools, specifically Power BI (an asset).
ACKNOWLEDGING THE POWER OF DIVERSITY
BRP is dedicated to nurturing a culture that invites, connects, and propels the ambitions of people of all backgrounds, profiles, beliefs and experiences. Ultimately, the diversity and uniqueness of our people fuel our ingenuity and set the course for the path ahead!
For this reason, we value diversity and we strive to always push each other forward to build an inclusive workplace where every employee feels like they belong, where they can grow and find meaning.
AT BRP, WHEN WE TALK ABOUT BENEFITS, WE GO ALL IN.
Let’s start with a strong foundation - You want it, we have it:
- Annual bonus based on the company’s financial results
- Generous paid time away
- Pension plan
- Collective saving opportunities
- Industry-leading healthcare fully paid by BRP
What about some feel good perks:
- Flexible work schedule
- A summer schedule that varies by department and location
- Holiday season shutdown
- Educational resources
- Discount on BRP products
WELCOME TO BRP
We’re a world leader in recreational vehicles and boats, creating innovative ways to move on snow, water, asphalt, dirt and even in the air. Headquartered in the Canadian town of Valcourt, Quebec, our company is rooted in a spirit of ingenuity and intense customer focus. Today, we operate manufacturing facilities in Canada, the United States, Mexico, Finland, Australia and Austria, with a workforce made up of close to 20,000 spirited people, all driven by the deeply held belief that at work, as with life itself, it’s not about the destination: It’s about the journey.
#LI-Hybrid
#LI-GB1
Job No Longer Available
This position is no longer listed on WhatJobs. The employer may be reviewing applications, filled the role, or has removed the listing.
However, we have similar jobs available for you below.
Python ETL Developer/Data Engineer - Remote
Posted today
Job Viewed
Job Description
Specific Duties
- Review, design, and develop ETL jobs to ingest data into the Data Lake and load data into data marts.
- Extract data to integrate with various business applications.
- Parse unstructured and semi-structured data such as XML.
- Design and develop efficient mappings and workflows to load data into data marts.
- Map XML DTD schemas in Python (customized table definitions).
- Write efficient queries and reports in Hive or Impala to extract data on an ad hoc basis for data analysis.
- Identify performance bottlenecks in ETL jobs and tune their performance by enhancing or redesigning them.
- Take responsibility for performance tuning of ETL mappings and queries.
- Import tables and all necessary lookup tables to facilitate the ETL process required to process daily XML files, in addition to processing very large (multi-terabyte) historical XML data files.
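The XML-parsing duties above can be sketched with the standard library alone: flattening semi-structured XML into flat rows ready for a data mart load. The order/item schema below is a hypothetical example, not the actual feed format.

```python
# Sketch: flatten semi-structured XML into rows for a data mart,
# using only the standard library. Schema is illustrative.
import xml.etree.ElementTree as ET

XML_DOC = """
<orders>
  <order id="1001" date="2025-01-15">
    <item sku="A1" qty="2" price="19.99"/>
    <item sku="B2" qty="1" price="5.50"/>
  </order>
</orders>
"""

def flatten_orders(xml_text):
    """Yield one flat dict per <item>, denormalized with order fields."""
    root = ET.fromstring(xml_text)
    for order in root.iter("order"):
        for item in order.iter("item"):
            yield {
                "order_id": order.get("id"),
                "order_date": order.get("date"),
                "sku": item.get("sku"),
                "qty": int(item.get("qty")),
                "price": float(item.get("price")),
            }

rows = list(flatten_orders(XML_DOC))
print(rows[0])
```

For multi-terabyte historical files, the same logic would use a streaming parser (e.g. `ET.iterparse`) rather than loading whole documents into memory.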
ETL Developer (Extraction, Transformation, and Loading)
Posted today
Job Description
Expected contract duration of 3 to 5 years.
Expected start date between Q1 and Q3 2025.
Hybrid role.
Working hours are normally between 8 a.m. and 5 p.m., Monday to Friday (35 hours per week), but may be extended during certain projects.
1. Optimize existing and future extraction processes.
2. Extract the data needed to meet new business requirements.
3. Migrate data across the various layers in accordance with current best practices.
4. Document the extractions performed.
Requirements
- A minimum of 5 years of professional experience in information technology, including 4 years in a role with similar responsibilities.
- A bachelor's degree (BAC) in information technology or business administration, or its equivalent.
- Knowledge of MS-SQL and Oracle databases, Power BI and Power Apps, Automate, and Apps Portail.
- Professional-level French (written and spoken).
- Ability to pass a criminal background check.
- Ability to travel within the Quebec City area.
ETL Developer (Microsoft Azure/Databricks/ Power BI/ Selenium/ JMeter)
Posted 28 days ago
Job Description
ETL Developer (Microsoft Azure/Databricks/ Power BI/ Selenium/ JMeter)
Location Hybrid - Candidate MUST work 3 days onsite and 2 days remote
5700 Yonge Street, 10th Floor Toronto, Ontario, Finch Subway Station
Health Services Cluster, Ministry of Public and Business Service Delivery and Procurement
Contract, 9 months to start
Submission Deadline - 2025-05-30, 9:00 a.m.
MUST HAVES:
5+ years of experience in data engineering:
· Must have demonstrated expertise in designing and implementing comprehensive data pipelines and transformations in a Microsoft Azure + Databricks and Power BI technology stack.
· Must have strong knowledge and hands-on experience with:
· Microsoft Azure Data Services (ADF, ADLS Gen 2, Synapse, Azure SQL)
· Azure Databricks
· PySpark and SQL
· Power BI (data preparation, modelling, and visualization)
· Must have demonstrated experience in developing and maintaining data pipelines and release management through Continuous Integration and Continuous Delivery (CI/CD) and DevOps practices using Azure DevOps, Git, Visual Studio Code, and related tools
· Must have experience conducting software quality functional assessments, systems testing, performance evaluation, and automated testing using a variety of tools such as:
- JIRA or Azure DevOps for defect tracking,
- Selenium for automated testing of web-based apps, including Power BI reports and dashboards,
- JMeter for performance evaluation.
· Conducting data quality assessments, validation, and profiling using tools like Great Expectations
· Developing and maintaining automated testing frameworks integrated into CI/CD pipelines and DevOps practices using Azure DevOps, Git, Visual Studio Code, and related tools.
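The data quality assessment requirement above can be illustrated with a plain-Python stand-in for the expectation-style checks that tools like Great Expectations formalize. Column names and thresholds here are illustrative assumptions.

```python
# Plain-Python sketch of expectation-style data quality checks.
# Tools like Great Expectations provide these as a framework;
# column names and bounds below are hypothetical.

def expect_no_nulls(rows, column):
    """Every row must have a non-null value in the column."""
    return all(r.get(column) is not None for r in rows)

def expect_values_between(rows, column, lo, hi):
    """All non-null values must fall within [lo, hi]."""
    return all(lo <= r[column] <= hi for r in rows if r.get(column) is not None)

def expect_unique(rows, column):
    """Column values must be unique across rows."""
    values = [r[column] for r in rows]
    return len(values) == len(set(values))

rows = [
    {"patient_id": 1, "age": 34},
    {"patient_id": 2, "age": 71},
]

# In a CI/CD pipeline, a failed expectation would fail the build.
assert expect_no_nulls(rows, "patient_id")
assert expect_values_between(rows, "age", 0, 130)
assert expect_unique(rows, "patient_id")
print("all data quality checks passed")
```

Integrated into Azure DevOps pipelines, checks like these gate releases on data quality rather than only on code quality.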
ETL Developer (Azure Data Factory, Databricks, Logic App, IBM Cognos, PowerBI)
Posted today
Job Description
ETL Developer (ETL Azure Data Factory, Databricks, Logic App and Function App, IBM Cognos and Microsoft PowerBI, SQL Server Stored Procedure, Oracle PL/SQL)
1 year contract (with possibility of renewal)
Toronto (hybrid)
Deliverables section:
Scope of Services and Deliverables - The Services and Deliverables to be provided by the Vendor will include the following:
· Participate in the review of project, work request and maintenance release artefacts - Use Case, System Requirement Specification, etc.
· Deliver detailed system design documents and any other supporting documentation conforming to Ministry standards.
· Design logical and physical data models using Sybase Power Designer.
· Produce transformation mapping document.
· Implement data pipelines using Azure Data Factory (ADF), stored procedure, and Informatica PowerCenter performing extraction, transformation, and loading activities.
· Implement solutions using Logic App and Function App.
· Generate structured JSON files using ADF/SQL.
· Develop Azure CI/CD pipeline to automate ADF release.
· Develop complex Oracle PL/SQL program to fulfill project requirements.
· Implement complex data conversion, e.g., binary to character, EBCDIC to UTF.
· Implement complex data transformation(s) for derived and calculated values.
· Integrate data sets from diverse source systems.
· Implement ADF for initial data load and incremental load.
· Promote ADF and Informatica ETL/ELT through all Ministry environments including Development, Integration Testing, QA, UAT, Production.
· Resolve and troubleshoot ADF pipeline, stored procedure, and Informatica workflow technical problems.
· Monitor and analyze incidents, providing timely resolution.
· Collaborate with IT Professionals throughout the Software Development Life Cycle.
· Document information from diverse business area stakeholders and subject matter experts.
· Optimize performance of ADF pipeline, Azure SQL and Synapse databases, and Informatica workflow.
· Monitor application functionality and performance on a daily basis.
· Provide effective knowledge transfer to Ministry staff at all stages of this assignment.
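One deliverable above mentions complex data conversions such as EBCDIC to character encodings. A minimal sketch using Python's built-in cp037 codec (cp037 is one common EBCDIC code page; mainframe feeds may use others):

```python
# Sketch: EBCDIC-to-UTF-8 conversion with the stdlib cp037 codec.
# cp037 is one EBCDIC code page; the actual source code page is an
# assumption and would need to be confirmed for a real feed.

ebcdic_bytes = "HELLO".encode("cp037")   # simulate an EBCDIC source field
print(ebcdic_bytes.hex())                # byte values differ from ASCII

# Decode EBCDIC and re-encode as UTF-8 for downstream systems.
utf8_bytes = ebcdic_bytes.decode("cp037").encode("utf-8")
print(utf8_bytes.decode("utf-8"))
```

In an ADF or Informatica pipeline the same conversion happens in the transformation layer; the sketch only shows the codec step.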
Skills
Experience and Skill Set Requirements
Evaluation Table
ETL Development-Data Integration 40%
· Extensive experience in gathering requirements and business process knowledge in order to design correct and high-quality data transformation.
· Extensive experience in design, development, and implementation with Azure Data Factory, Databricks, including:
o Self-Hosted Integration Runtime
o Data movement, ETL pipelines
o Performance tuning on large data volumes
o Complex data transformations
o ADF Debugger to validate data transformation.
o Azure Roles
· Experience with Informatica, including:
o Workflow Manager
o Repository Manager
o Designer
o Workflow Monitor
· Experience with developing solutions using Logic App and Function App.
· Experience in rapid application development (RAD) methodologies.
· Experience in maintaining and improving existing ETL processes.
· Experience in investigating data to identify potential issues within ETL pipelines.
· Experience with Business Intelligence tools IBM Cognos and Microsoft PowerBI.
Azure Technologies 40%
· Experience with CI/CD (DevOps) pipelines and concepts, including:
o Azure Resource Management
o GitHub Repo
o Source code version control
o Branching, pull requests, build, release, multiple parallel development repositories merge.
· Experience with Azure DevOps, including:
o Azure Boards
o Azure Test Plans
o Azure Pipelines
o Azure Repos
o Azure Artifacts
Oracle PL/SQL Development 10%
· Extensive experience in designing and developing SQL Server Stored Procedure, Oracle PL/SQL programs.
Database Technologies 10%
· Experience with Oracle, Microsoft SQL Server database and tools.
· Experience with relational and hierarchical database technologies.
· Extensive knowledge of Azure SQL DB and Synapse platforms, including:
o Performance tuning on large data volume
Senior Java/ ETL Software Developer (Java, ETL, SQL, PL/SQL, Power BI)
Posted today
Job Description
Senior Java/ETL Software Developer (Java, Informatica, ETL, SQL, PL/SQL, and Power BI)
Toronto, Ontario - Hybrid, 3 days onsite - 87 Sir William Hearst Ave Toronto, ON M3M 0B4
Contract, 9 months (with possibility of extension)
Employment Ontario is one of the major programs delivered by the Employment and Training Division of the Ontario Ministry of Training, Colleges and Universities. Employment Ontario helps the citizens of Ontario gain the training, skills and experience to achieve their goals.
The Employment Ontario Information Systems (EOIS) is the major IT system that supports the administration of these programs. Application subsystems under EOIS include CaMS (Case Management System), Service Provider lifeCycle (SPL), EO Self Service (EOSS), APPR (Apprenticeship), COJG (Canada Ontario Job Grant), and APPR AOL (Apply OnLine).
The operating Environment includes J2EE, CURAM, Crystal Reports, Oracle SQL/PL SQL, Power BI, Oracle Database, Business Objects Universe and Dashboarding, WebI, and Bursting technologies, and Informatica ETL.
This request is for strong developers to help in designing and developing data warehousing, BI, reporting and analytics.
Perform analysis, design, development, unit testing, defect fixing, and other necessary tasks in a data warehousing and BI environment in Oracle PL/SQL.
Design, develop, and test Informatica mappings:
- ETL scripts to transfer and transform data from the OLTP database to the Data Warehouse and data marts.
- Nice-to-have skills in Crystal Reports, Power BI, and BOE reporting.
Required to translate technical systems specifications into working, tested applications. This includes:
- Developing detailed programming specifications
- Writing and/or generating code
- Compiling data-driven programs, maintaining them, and conducting unit tests
- Resolving and troubleshooting technical problems which arise during the use and operation of software packages, including technical assistance in implementation, conversion, and migrations.
Must have:
- 5+ years of demonstrated experience in Java-based software development.
- 5+ years of ETL experience, including data warehouse design, reporting, and ETL concepts.
- 3+ years of experience with analysis, design, development, unit testing, defect fixing, and other necessary tasks in a data warehousing and BI environment, designing and developing data warehousing, BI, reporting, and analytics solutions.
- 3+ years of experience designing, developing, and testing SQL and PL/SQL.
- 3+ years of experience designing, developing, and testing in Informatica.
- Experience in RDBMS design concepts.
- Experience with data warehouse architecture, design, dimensional modeling, development, and deployment of business intelligence systems.
Nice to have:
- Experience with BOE dashboards, universes and reports, Power BI, and Crystal Reports.
- Public sector experience.
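The OLTP-to-data-mart transfer described in this posting can be sketched as a toy denormalization step: joining normalized OLTP rows into a fact-table row with dimension attributes attached. Table and column names are illustrative, not the EOIS schema.

```python
# Toy sketch of OLTP-to-mart denormalization. In practice this is
# an Informatica mapping or PL/SQL procedure; names are hypothetical.

customers = {1: {"name": "Acme", "region": "ON"}}   # OLTP dimension source
orders = [                                           # OLTP transactions
    {"order_id": 10, "customer_id": 1, "amount": 250.0},
    {"order_id": 11, "customer_id": 1, "amount": 100.0},
]

def build_fact_rows(orders, customers):
    """Denormalize OLTP orders into fact rows with customer attributes."""
    for o in orders:
        c = customers[o["customer_id"]]
        yield {
            "order_id": o["order_id"],
            "customer_name": c["name"],
            "region": c["region"],
            "amount": o["amount"],
        }

facts = list(build_fact_rows(orders, customers))
print(sum(f["amount"] for f in facts))  # simple aggregate over the mart
```

Reporting tools like Power BI or Crystal Reports would then query the denormalized fact rows directly instead of joining OLTP tables at report time.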