623 Database Developers jobs in Canada

Data Engineer

Toronto, Ontario Autodesk

Posted 1 day ago


Job Description

**Job Requisition ID #**
25WD89443
**Position Overview**
Autodesk is looking for diverse engineering candidates to join the Compliance team, a collection of systems and teams focused on detecting and exposing non-compliant users of Autodesk software. This is a key strategic initiative for the company. As a Data Engineer, you will contribute to improving critical data processing & analytics pipelines. You will work on challenging problems to enhance the platform's reliability, resiliency, and scalability.
We are looking for someone who is detail- and quality-oriented, and excited about the prospect of having a big impact with data at Autodesk. Our tech stack includes Hive, Spark, Presto, Jenkins, Snowflake, PowerBI, Looker, and various AWS services. You will report to the Senior Manager, Software Development. This is a hybrid position based out of Toronto, Canada.
**Responsibilities**
+ You will need a product-focused mindset. It is essential for you to understand business requirements and help build systems that can scale and extend to accommodate those needs
+ Break down moderately complex problems, contribute to documenting technical solutions, and assist in making fast, iterative improvements
+ Help build and maintain data infrastructure that powers batch and real-time data processing of billions of records
+ Assist in automating cloud infrastructure, services, and observability
+ Contribute to developing CI/CD pipelines and testing automation
+ Collaborate with data engineers, data scientists, product managers, and data stakeholders to understand their needs and promote best practices
+ You have a growth mindset. You will support identifying business challenges and opportunities for improvement and help solve them using data analysis and data mining
+ You will support analytics and provide insights around product usage, campaign performance, funnel metrics, segmentation, conversion, and revenue growth
+ You will contribute to ad-hoc analysis, long-term projects, reports, and dashboards to find new insights and to measure progress in key initiatives
+ You will work closely with business stakeholders to understand and maintain focus on their analytical needs, including identifying critical metrics and KPIs
+ You will partner with different teams within the organization to understand business needs and requirements
+ You will contribute to presentations that help distill complex problems into clear insights
**Minimum Qualifications**
+ 2-4 years of relevant industry experience in big data systems, data processing, and SQL databases
+ 2+ years of coding experience in Spark DataFrames, Spark SQL, and PySpark
+ 2+ years of hands-on programming skills, able to write modular, maintainable code, preferably in Python & SQL
+ Good understanding of SQL, dimensional modeling, and analytical big data warehouses like Hive and Snowflake
+ Familiar with ETL workflow management tools like Airflow
+ 1-2 years of experience in building reports and dashboards using BI tools; knowledge of Looker is a plus
+ Exposure to version control and CI/CD tools like Git and Jenkins CI
+ Experience working with data in notebook environments like Jupyter, EMR Notebooks, or Apache Zeppelin
+ Bachelor's degree in Computer Science, Engineering or a related field, or equivalent training, fellowship, or work experience
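The "modular, maintainable code, preferably in Python & SQL" qualification above can be sketched concretely. This is an illustrative example only: the table and column names are hypothetical, and sqlite3 stands in for an analytical warehouse such as Hive or Snowflake.

```python
import sqlite3

# Hypothetical sketch: keep SQL in one place and wrap it in small,
# testable functions, in the spirit of the qualification above.
DAILY_USAGE_SQL = """
SELECT user_id, COUNT(*) AS events
FROM usage_events
GROUP BY user_id
ORDER BY events DESC
"""

def load_events(conn, rows):
    # Create and populate the hypothetical usage_events table.
    conn.execute("CREATE TABLE usage_events (user_id TEXT, event TEXT)")
    conn.executemany("INSERT INTO usage_events VALUES (?, ?)", rows)

def usage_per_user(conn):
    # Return (user_id, event_count) pairs, busiest users first.
    return conn.execute(DAILY_USAGE_SQL).fetchall()

conn = sqlite3.connect(":memory:")
load_events(conn, [("u1", "open"), ("u1", "save"), ("u2", "open")])
print(usage_per_user(conn))
```

In a real Spark/Snowflake pipeline the same shape applies: named queries, small composable functions, and connections passed in rather than created globally.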
**Learn More**
**About Autodesk**
Welcome to Autodesk! Amazing things are created every day with our software - from the greenest buildings and cleanest cars to the smartest factories and biggest hit movies. We help innovators turn their ideas into reality, transforming not only how things are made, but what can be made.
We take great pride in our culture here at Autodesk - our Culture Code is at the core of everything we do. Our values and ways of working help our people thrive and realize their potential, which leads to even better outcomes for our customers.
When you're an Autodesker, you can be your whole, authentic self and do meaningful work that helps build a better future for all. Ready to shape the world and your future? Join us!
**Salary transparency**
Salary is one part of Autodesk's competitive compensation package. Offers are based on the candidate's experience and geographic location. In addition to base salaries, we also have a significant emphasis on discretionary annual cash bonuses, commissions for sales roles, stock or long-term incentive cash grants, and a comprehensive benefits package.
**Diversity & Belonging**
We take pride in cultivating a culture of belonging and an equitable workplace where everyone can thrive.
**Are you an existing contractor or consultant with Autodesk?**
Please search for open jobs and apply internally (not on this external site).

Data Engineer

Vancouver, British Columbia Insight Global

Posted 1 day ago


Job Description

Day to Day
Insight Global is seeking a Data Engineer to join a Retail Client in Vancouver, BC. This Data Engineer will sit within the People Enablement Technology team, which is committed to enabling seamless experiences for their global workforce. The Data & Analytics Pillar is a newly formed team focused on automating core health & wealth offerings, as well as working across the department to enable efforts driven by the People Systems, Digital Workplace & Learning, and Workforce Agility pillars. The project the successful candidate will join is bringing learning data from various platforms, such as LI Learning, into an existing data lake, allowing analytics insights to be generated from this people data for teams such as the ESG team. The Data Engineer will use Databricks for this ingestion, and Unity Catalog for its categorization and potential encryption if required.
A successful candidate will be a problem solver and an expert in ETL programming/scripting, data modelling, data integration, and SQL, and will have exemplary communication skills. The candidate will need to be comfortable with ambiguity in a fast-paced and ever-changing environment, and able to think big while paying careful attention to detail. The candidate will know and love working with new technologies, can model multidimensional datasets, and can partner with cross-functional business teams to answer key business questions. Apart from building data pipelines, you will be an advocate for automation, performance tuning, and cost optimization. Be ready to question the status quo and bring forth intelligent solutions and proofs of concept.
Principal Duties and Responsibilities:
Support the design and implementation of complex data integration, modeling, and orchestration workflows using tools such as Databricks, Azure Data Factory, and Snowflake
Collaborate with architects, engineers, analysts, and business stakeholders to deliver secure, reliable, and high-performing data solutions
Apply DevSecOps principles to ensure secure development, deployment, and monitoring of data pipelines and infrastructure
Implement and maintain data encryption, access controls, and governance using Databricks Unity Catalog to protect sensitive information
Participate in operational support, including troubleshooting, performance tuning, and continuous improvement of data workflows
Create and maintain technical documentation and contribute to knowledge-sharing across the team
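The Unity Catalog encryption and access-control duties above come down to controlling who sees sensitive people data. Unity Catalog itself does this declaratively with column masks and grants; the plain-Python function below is only a rough, hypothetical sketch of the idea, with made-up column names.

```python
# Conceptual sketch only: redact sensitive columns before exposing
# people data downstream, unless the caller is explicitly allowed.
# In Databricks this would be a Unity Catalog column mask, not app code.
SENSITIVE = {"bank_account", "health_plan_id"}

def mask_row(row, allowed=()):
    # Replace sensitive values with a redaction marker.
    return {
        col: ("***" if col in SENSITIVE and col not in allowed else val)
        for col, val in row.items()
    }

row = {"employee_id": "e42", "bank_account": "12345", "course": "LI-101"}
print(mask_row(row))
```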
We are a company committed to creating inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity employer that believes everyone matters. Qualified candidates will receive consideration for employment opportunities without regard to race, religion, sex, age, marital status, national origin, sexual orientation, citizenship status, disability, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to the Human Resources Request Form. The EEOC "Know Your Rights" Poster is available here.
To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy.
Skills and Requirements
Must Haves:
5+ years of experience as a Data Engineer working with Databricks, SQL & Python
3+ years with Cloud Data Integration & Modelling (Azure Data Factory)
Strong experience with Unity Catalog for categorizing and filing sensitive data
Experience with DevOps tools and building secure, scalable data workflows
Proficiency in Python and PySpark
Experience working with people-centered and sensitive data (such as health, payment, and banking information) and an understanding of the complexities of encryption and decryption.
Experience with data modeling, data warehousing, and building ETL pipelines.
Excellent problem-solving skills, combined with the ability to present your findings/insights clearly and compellingly in both verbal and written form.
Strong documentation and communication skills
Bachelor's degree in Computer Science, Mathematics, Statistics, or Operations Research.
Plusses:
- Snowflake experience
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal employment opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment without regard to race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or the recruiting process, please send a request to

Data Engineer

New
Toronto, Ontario I-cube Software Llc

Posted today


Job Description


We are seeking a hybrid Hadoop Engineer and Hadoop Infrastructure Administrator to build and maintain a scalable and resilient Big Data framework to support Data Scientists. As an administrator, you will deploy and maintain Hadoop clusters, add and remove nodes using cluster management and monitoring tools like Cloudera Manager, and support performance and scalability requirements in service of our Data Scientists' needs. Some relational database administration experience is also desirable to support the general administration of relational databases.

Design, build, and maintain Big Data workflows/pipelines to process a continuous stream of data with experience in end-to-end design and build process of Near-Real-Time and Batch Data Pipelines.


Demonstrated work experience with Big Data and distributed programming models and technologies, including:
Knowledge of database structures, theories, principles, and practices (both SQL and NoSQL).
Active development of ETL processes using Spark or other highly parallel technologies, and implementing ETL/data pipelines
Experience with Data technologies and Big Data tools, like Spark, Kafka, Hive
Understanding of MapReduce and other data query, processing, and aggregation models
Understanding of the challenges of transforming data across a distributed, clustered environment
Experience with techniques for consuming, holding, and aging out continuous data streams
Ability to provide quick ingestion tools and corresponding access APIs for continuously changing data schema, working closely with Data Engineers around specific transformation and access needs

Preferred:

Experience as a Database Administrator (DBA), responsible for keeping critical databases up and running
Building and managing high-availability environments for databases and HDFS systems
Familiarity with transaction recovery techniques and DB Backup

Skills and Attributes:

Ability to have effective working relationships with all functional units of the organization
Excellent written, verbal, and presentation skills
Excellent interpersonal skills
Ability to work as part of a cross-cultural team
Self-starter and Self-motivated
Ability to work with minimal supervision
Works under pressure and is able to manage competing priorities.


Data Engineer

New
Toronto, Ontario QA Consultants Inc.

Posted today


Job Description


Data Engineer

Job Type: Contract/Perm (Open)
Primary Location: Toronto or Montreal - 2 days onsite per week

We currently have an opportunity for a Data Engineer, supporting a partner in the Insurance sector. The client is expanding the team, and the successful candidate will help support the team and take on day-to-day responsibilities.

Job Requirements:

  • Design, develop, and maintain scalable, efficient data pipelines and ETL processes
  • Build and deploy data ingestion, transformation, and orchestration workflows
  • Implement data quality checks, observability, and error-handling mechanisms
  • Collaborate with data scientists, analysts, and architects to define technical solutions
  • Work with distributed data systems (Spark, Databricks) and cloud infrastructure (AWS)
  • Write clean, modular, production-ready code in Python
  • Ensure CI/CD practices, version control, and infrastructure-as-code where relevant
  • Optimize performance of jobs, queries, and storage for large data volumes
  • Monitor, troubleshoot, and continuously improve pipeline reliability
Candidate requirements: Must-haves (3-5 max):
  • 5+ years of experience in data engineering roles in complex environments
  • Proven skills in Python programming (clean code, packaging, testing)
  • Strong experience with Spark and Databricks
  • Experience building pipelines from scratch in production environments
 Nice to have:
  • 1-2 years of Snowflake and/or Databricks management experience
  • Experience with streaming pipelines (Kafka, Spark Streaming, etc.)
  • Familiarity with Terraform or other IaC tools
Tools / software / methodology / certification: Spark, Python, AWS, Snowflake, Databricks
Soft skills: Being customer driven, with great communication and collaboration skills

Not for you?
Check out our other opportunities or follow us on LinkedIn. We thank all candidates in advance. Only candidates selected for interviews will be contacted.

About Us
QA Consultants is North America’s largest software quality engineering services firm. An award-winning onshore provider of software testing and quality assurance solutions, we are the trusted engineering services company for business, industry and government supported by leading practitioners and solutions. For almost 30 years, we have successfully delivered 12,000+ mission-critical projects in the private, public, and not-for-profit sectors.  We reduce risk and improve time to market with quality engineering, keep applications secure through dedicated application security capabilities, and reduce cost of ownership while enabling applications to scale via performance engineering.  We are proud of our vision to help clients achieve flawless technology outcomes.  QA Consultants also operates a robust emerging technologies practice with a focus on quality engineering solutions for connected and autonomous vehicles, artificial intelligence (AI), Internet of Things (IoT) and blockchain.  

Our Purpose
Life continues to evolve and the technology we all rely on daily hinges on impeccable software. QAC understands that safe, effective technology is your right - it is our right. It is with this understanding that we deliver on our purpose. We support our clients to ensure technology enables flawless productivity and harmony for a Better, Brighter and Safer world for all of us.

What’s in it for you?
  • Make a difference every day as you help our clients deliver innovation and technology in a better, brighter, and safer way
  • Be part of a smart and dedicated team, disrupting quality assurance methodologies and creating something unique
  • Be involved in challenging and interesting work
  • Work from anywhere

We are growing faster than we expected and that’s humbling and exciting! So, for all those on board, we guarantee a rewarding journey – and we’re just getting started.
Diversity & Inclusion
QA Consultants is an equal opportunity employer, committed to meeting the needs of all individuals in accordance with the Accessibility of Ontarians with Disabilities Act (AODA) and the Ontario Human Rights Code (OHRC) where we evaluate applicants without regard to race, color, national origin, religion, sex, age, marital status, disability, veteran status, sexual orientation, gender identity, or other characteristics protected by law. We are committed to the creation of an exceptional work environment wherein we maintain values of mutual respect, integrity, dignity, and inclusivity; and encourage the open exchange of ideas and opinions.
 
If you require a specific accommodation because of a disability or a medical need, please inform the recruiter. This ensures that the appropriate accommodations are in place at time of your interview and before you begin your employment.
 
QAC’s main office is located in Toronto, Ontario. We acknowledge that the land on which we work is situated upon traditional territories. We wish to acknowledge the Ancestral Traditional Territories of the Ojibway, the Anishinaabe  and, the Mississauga  of the New Credit. We also recognize the enduring presence of Aboriginal peoples on this land.


Data Engineer

New
Toronto, Ontario Forum Asset Management

Posted today


Job Description


Forum Asset Management Data Engineer


Location: 181 Bay Street, Toronto


In Office: 5 days per week


Overview of Forum:

Join us in delivering Extraordinary Outcomes through investment.


Forum is an investor, developer and asset manager operating across North America for over 28 years, focusing on real estate, private equity and infrastructure, with a strategic concentration in housing.


Our AUM exceeds $3 billion. We are committed to sustainability, responsible investing and creating value that benefits our stakeholders and the communities in which we invest, what we call our Extraordinary Outcomes.


In 2024, Forum completed the largest real estate transaction in Canada with the Alignvest acquisition, making us the largest owner of Purpose-Built Student Accommodation (PBSA) in Canada through our $.5B open-ended Real Estate Income and Impact Fund (REIIF). Our national development pipeline now exceeds $3.5 billion, positioning Forum as the largest developer of PBSA in Canada, operating from coast to coast.


The Forum team is adaptable, agile, and dynamic, committed to sustainable and responsible investing. Our people bring diverse cultural backgrounds and professional experiences, fostering innovation and thought leadership. We uphold integrity, trust, and transparency as core values, knowing that to achieve Extraordinary Outcomes, we need to support and develop an Extraordinary team.

Position Overview:

We're looking for a Data Engineer to design and build the backbone of Forum's enterprise data infrastructure using the Microsoft Fabric technology stack (or a suitable alternative). This is a rare greenfield opportunity to architect the full data lifecycle, from ingestion and transformation to analytics and AI enablement, while directly impacting Forum's investment, real estate, and operational strategies.

This is a hands-on, in-office role with high visibility. You'll collaborate with investment professionals, business leaders, analysts, and software developers to build scalable data systems and AI-powered workflows that drive real value. Your work will lay the foundation for our long-term data strategy and the future of decision-making at Forum.


Key Duties and Responsibilities



  • Design and implement a robust data Lakehouse architecture using Microsoft Fabric, Azure Data Factory, Synapse, ADLS, and related Azure services
  • Develop scalable ETL/ELT pipelines to consolidate, standardize, and model data from diverse internal and external sources
  • Develop AI/ML-powered workflows using tools like Azure OpenAI, Document Intelligence, and Azure ML
  • Create internal APIs and tools to enable self-serve analytics and data access across departments
  • Partner with business teams across real estate, private equity, and operations to identify data opportunities and implement tailored solutions
  • Develop and evolve our Power BI dashboards and reporting layer, ensuring reliable data access and visualization
  • Promote best practices in data governance, automation, and AI application across Forum's technology ecosystem
  • Partner with internal teams and external technology vendors to drive the rollout of the platform


Candidate Profile



Technical Expertise

  • Deep experience with the Microsoft Azure ecosystem: Microsoft Fabric, Azure Data Factory (ADF), Synapse, ADLS Gen2, and Power BI
  • Proficiency in Python for data engineering, automation, and API development
  • Strong grasp of data modeling, ELT/ETL design, and data warehouse best practices
  • Familiarity with Medallion Architecture (Bronze/Silver/Gold layers) or similar structured data workflows
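For readers unfamiliar with the Medallion Architecture named above, here is a minimal sketch of the Bronze/Silver/Gold flow, using plain Python structures in place of Fabric/Delta tables and entirely hypothetical field names.

```python
# Bronze: raw ingested records, duplicates and nulls included.
bronze = [
    {"unit": "A-101", "rent": "2500"},
    {"unit": "A-101", "rent": "2500"},
    {"unit": "B-202", "rent": None},
]

def to_silver(rows):
    # Silver: deduplicate and enforce types, dropping unusable records.
    seen, out = set(), []
    for r in rows:
        if r["unit"] in seen or r["rent"] is None:
            continue
        seen.add(r["unit"])
        out.append({"unit": r["unit"], "rent": int(r["rent"])})
    return out

def to_gold(rows):
    # Gold: a business-level aggregate ready for a Power BI report.
    return {"units": len(rows), "total_rent": sum(r["rent"] for r in rows)}

print(to_gold(to_silver(bronze)))  # {'units': 1, 'total_rent': 2500}
```

In a real Fabric or Databricks lakehouse, each layer would be a governed Delta table rather than an in-memory list, but the cleansing-then-aggregating progression is the same.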

Applied AI Experience

  • Proven track record deploying AI tools (Azure OpenAI, Azure ML, etc.) into production to automate workflows, classify documents, or generate actionable insights
  • Experience integrating LLMs, document understanding, or AI agents into real business processes, preferably in investment, operations, or sales enablement contexts

Business Partnership & Leadership

  • Demonstrated ability to work directly with investment professionals, executives, and analysts to understand business needs and design data systems accordingly
  • A builder's mindset: capable of working autonomously, identifying opportunities, and turning ambiguity into working solutions
  • Strong communication skills and an entrepreneurial, problem-solving approach
  • Experience working with financial data models, capital tables, investment reports, or property management systems
  • Background in private equity, real estate, asset management, or related sectors
  • Familiarity with data governance, metadata management, and enterprise architecture principles



At Forum, we encourage diversity. We are committed to an inclusive workplace that reflects our belief that diversity is central to building a high-performing team. Forum is an equal-opportunity employer. We are committed to providing accessible employment practices. Should you require an accommodation during any phase of the recruitment process, please let the recruitment team know at



Data Engineer

Montréal, Quebec Small Door Veterinary

Posted 1 day ago


Job Description


Small Door is membership-based veterinary care designed with human standards that is better for pets, pet parents, and veterinarians alike. We designed and delivered a reimagined veterinary experience via a membership that includes exceptional care, 24/7 telemedicine, and transparent pricing - delivered with modern hospitality in spaces designed by animal experts to be stress-free. We opened our flagship location in Manhattan's West Village in 2020 and have quickly expanded across the East Coast. Small Door now operates in New York City, Boston, Washington DC, and Maryland with continued expansion plans in 2025.

We're looking for a motivated, curious, and collaborative Entry-Level Data Engineer to join our growing Data team. In this role, you'll help build and maintain the pipelines, tools, and systems that power analytics, product insights, and strategic decision-making across the company.

This is a great opportunity for someone early in their career who's excited to learn, ship fast, and make a real impact at a company where data is core to everything we do.

What You'll Do

  • Collaborate with the product, technology and data team to build scalable data pipelines (ETL/ELT)
  • Maintain and optimize data infrastructure using tools like dbt, Airflow, and Snowflake (or similar)
  • Support data ingestion from internal and external sources (APIs, application databases, third-party tools)
  • Ensure data integrity, availability, and performance through testing and monitoring
  • Partner with analytics and product teams to make trusted, high-quality data available for reporting and machine learning
  • Write clear, maintainable code and documentation
  • Contribute to the evolution of our data architecture and engineering best practices
  • Help manage the data warehouse and its data model

Who You Are

  • A degree in Computer Science, Engineering, Data Science, or a related field — or equivalent practical experience
  • Proficiency in SQL and Python
  • Familiarity with data modeling concepts and ETL workflows
  • Interest or experience in tools like dbt, Airflow, Snowflake, Redshift, BigQuery, or similar
  • A growth mindset and eagerness to learn from feedback and challenges
  • Excellent problem-solving skills and attention to detail
  • Strong communication and collaboration skills

What you'll get

  • Competitive salary
  • A great, top-of-the-line benefits plan provided by PEO Canada, including short-term disability insurance
  • An opportunity to make a real impact on the people around you
  • A collaborative group of people who live our core values and have your back
  • A clear career path with opportunities for development, both personally and professionally

Small Door is proudly a public benefit corporation and a certified B Corp. We are committed to creating a diverse, inclusive and equitable workplace, and we encourage qualified applicants of every background, ability, and life experience to apply to appropriate employment opportunities.


Data Engineer

New
Toronto, Ontario CI Financial Corp.

Posted today


Job Description


At CI, we see a great place to work as one that is a safe place for everyone to have a voice, where people are empowered to take ownership over meaningful work, where there is an opportunity to grow through stretching themselves, where they can work on innovative products and projects, and where employees are supported and engaged in doing so.

WHAT YOU WILL DO

  • Design, develop, and optimize highly scalable data pipelines to enable seamless extraction, transformation, and loading (ETL/ELT) processes to/from diverse systems
  • Collaborate with business and analytics team to gather and understand requirements, ensuring alignment with business objectives and data needs.
  • Build frameworks to accelerate and standardize data ingestion, transformations, validations, and monitoring.
  • Leverage Snowflake to create an accurate, single version of the truth.
  • Document and assist development teams with best-practice data delivery solutions using the core platform and data management tools.
  • Take ownership of key aspects of project delivery, including planning, development, testing, and deployment, while working closely with cross-functional teams to ensure successful and timely outcomes.
  • Troubleshoot and resolve production issues, including performance bottlenecks, data quality problems, and system failures; proactively optimize existing workflows for reliability and efficiency.

WHAT YOU WILL BRING

  • 5+ years of experience in data engineering, with a strong focus on designing and developing scalable data pipelines, ETL/ELT workflows, and data integration solutions.
  • Experience with data warehousing concepts and technologies.
  • Expertise in SQL, with experience using dbt and Snowflake preferred.
  • Strong problem-solving skills with the ability to tackle open-ended challenges and deliver high-quality solutions.
  • Solid understanding of data management principles, including data architecture, data quality, data modeling, and data governance.
  • Experience with DevOps practices, including CI/CD pipelines and automated deployment/testing frameworks.
  • Experience with Data Vault 2.0 is considered an asset.
  • Ability to work with business stakeholders to understand requirements and explain complex technical solutions in simple terms.
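Data Vault 2.0, listed as an asset above, keys its hub tables on a hash of the normalized business key so loads stay deterministic and parallelizable. The sketch below is hypothetical: the table shape and names are illustrative, not any particular implementation.

```python
import hashlib

def hub_hash_key(business_key: str) -> str:
    # Data Vault 2.0 convention: normalize the business key (trim,
    # uppercase) before hashing so the same entity always gets the
    # same hub key regardless of source-system formatting.
    return hashlib.md5(business_key.strip().upper().encode()).hexdigest()

# A hypothetical hub record for a client identified by its account number.
hub_client = {
    "hub_client_hk": hub_hash_key("ACCT-001"),
    "client_bk": "ACCT-001",
    "record_source": "crm",
}
print(hub_client["hub_client_hk"])
```

Because the key is computed rather than assigned by a sequence, hubs, links, and satellites can be loaded independently and in parallel, which is one reason the pattern appears alongside the CI/CD requirements above.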

CI Financial is an independent company offering global wealth management and asset management advisory services through diverse financial services firms. Since 1965, we have consistently anticipated and responded to the changing needs of investors. We are driven by a commitment to provide individuals and institutions with the highest-quality investments and advice.

Our commitment to the highest levels of performance means that whatever their position, CI employees must be comfortable in a fast-paced environment that will stretch them to tap into their highest potential. Employees with a healthy dose of ambition, a desire to commit to a curious mindset for continuous learning, and a willingness to go the extra mile thrive at CI.

WHAT WE OFFER

  • Modern HQ location within walking distance from Union Station
  • Training Reimbursement
  • Paid Professional Designations
  • Employee Savings Plan (ESP)
  • Corporate Discount Program
  • Enhanced group benefits
  • Parental Leave Top–up program
  • Fitness membership discounts
  • Volunteer Paid Day

We are focused on building a diverse and inclusive workforce. If you are excited about this role and are not confident you meet all the qualification requirements, we encourage you to apply to investigate the opportunity further.

Please submit your resume in confidence by clicking “Apply”. Only qualified candidates selected for an interview will be contacted. CI Financial Corp. and all of our affiliates (“CI”) are committed to fair and accessible employment practices and provide reasonable accommodations for persons with disabilities. If you require accommodations in order to apply for any job opportunities, require this posting in an additional format, or require accommodation at any stage of the recruitment process please contact us at , or call ext. 4747.


Data Engineer

New
Toronto, Ontario Annex It Solutions

Posted today


Job Description


We are seeking a skilled and detail-oriented Data Engineer to join our team. As a Data Engineer, you will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure to support analytics, reporting, and data science initiatives.

You will work closely with data analysts, data scientists, and software engineers to ensure data quality, optimize performance, and drive data-driven decision-making.

Responsibilities

Design, develop, and maintain reliable data pipelines using tools like Apache Spark, Kafka, Airflow, or DBT.

Build and optimize data models and architectures for data lakes and data warehouses (e.g., Snowflake, Redshift, BigQuery).

Develop ETL/ELT processes to ingest and transform data from multiple sources (e.g., APIs, databases, flat files).

Ensure data quality, integrity, security, and compliance with best practices and regulations (e.g., GDPR).

Collaborate with stakeholders to understand data requirements and deliver scalable solutions.

Monitor pipeline performance and troubleshoot data issues.

Maintain documentation for data systems, pipelines, and workflows.

Stay current with industry trends and emerging technologies in data engineering and analytics.
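As a rough illustration of the pipeline work described above (a sketch only; the `extract`/`transform`/`load` functions and the in-memory records are invented for the example, not taken from the posting), a minimal ETL flow in Python might look like:

```python
from typing import Iterable


def extract(rows: Iterable[dict]) -> list[dict]:
    # Stand-in for pulling records from an API, database, or flat file.
    return list(rows)


def transform(rows: list[dict]) -> list[dict]:
    # Normalize field names and drop records that fail a basic quality check.
    cleaned = []
    for row in rows:
        if row.get("amount") is None:
            continue  # data-quality rule: skip incomplete records
        cleaned.append({"customer": row["customer"].strip().lower(),
                        "amount": float(row["amount"])})
    return cleaned


def load(rows: list[dict], warehouse: list[dict]) -> None:
    # Stand-in for writing to a warehouse such as Snowflake, Redshift, or BigQuery.
    warehouse.extend(rows)


warehouse: list[dict] = []
raw = [{"customer": " Acme ", "amount": "19.99"},
       {"customer": "Beta", "amount": None}]
load(transform(extract(raw)), warehouse)
```

In practice each stage would be a task in an orchestrator such as Airflow or a DBT model rather than a plain function call, but the extract/transform/load separation is the same.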

Requirements

Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or a related field.

2+ years of experience in data engineering, data architecture, or related roles.

Strong experience with SQL and data modeling.

Proficiency in Python, Scala, or Java.

Hands-on experience with cloud platforms (AWS, GCP, Azure) and services like S3, Lambda, Redshift, or BigQuery.

Familiarity with data pipeline tools such as Apache Airflow, Kafka, or DBT.

Experience with CI/CD, Git, and containerization (Docker, Kubernetes) is a plus.

Excellent problem-solving and communication skills.

This advertiser has chosen not to accept applicants from your region.

Data Engineer

Toronto, Ontario CI Financial Corp.

Posted today

Job Description

At CI, we see a great place to work as one that is a safe place for everyone to have a voice, where people are empowered to take ownership over meaningful work, where there is an opportunity to grow through stretching themselves, where they can work on innovative products and projects, and where employees are supported and engaged in doing so.

OVERVIEW
We are currently seeking a Data Engineer to join our CI GAM IT department. The successful candidate will work closely with our IT Client Reporting team on the development, modernization, and maintenance of our statements and tax processes. In this role, you will help implement business requirements by extracting data and by enhancing and modifying the applications used in client reporting. You will also assist in transforming our legacy applications for cloud adoption.

WHAT YOU WILL DO

  • Design, build, and modernize our statements and tax reporting processes in the Snowflake Data Cloud environment.
  • Develop integration processes using SQL and Python to produce curated data models.
  • Optimize and maintain data architecture, including schema design, performance tuning, and query optimization.
  • Develop and maintain comprehensive documentation for data processes, architecture, and procedures.
  • Collaborate with business partners, business systems analysts, and other stakeholders to deliver data-driven solutions and insights.
  • Participate in code reviews, provide mentorship to junior data engineers, and contribute to team knowledge sharing.
  • Maintain and support existing legacy applications and processes for both CI GAM and CI Wealth.
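The bullets above mention producing curated data models with SQL and Python. As a toy sketch of that kind of integration step (using the standard-library sqlite3 module in place of Snowflake, with table and column names invented for illustration), it might look like:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # sqlite3 stands in for the Snowflake warehouse
conn.executescript("""
    CREATE TABLE transactions (account_id TEXT, amount REAL);
    INSERT INTO transactions VALUES ('A1', 100.0), ('A1', 50.0), ('B2', 25.0);
""")

# Curated model: one row per account with its total, ready for statement generation.
conn.execute("""
    CREATE TABLE curated_statements AS
    SELECT account_id, SUM(amount) AS total_amount
    FROM transactions
    GROUP BY account_id
    ORDER BY account_id
""")
rows = conn.execute("SELECT * FROM curated_statements").fetchall()
```

In a warehouse setting the same shape of query would typically live in a DBT model or a scheduled Airflow task rather than inline Python.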

 WHAT YOU WILL BRING

  • Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.

 Experience 

  • 3+ years of experience in data engineering, data architecture, or a similar role in a professional environment
  • Experience with the Snowflake Data Cloud, DBT, and Airflow, including data modeling and cloud data architecture.
  • Proficiency in SQL for data extraction, manipulation, and analysis.
  • Experience developing in Python for data processing and automation.
  • Solid experience with data storage solutions on AWS, such as Snowflake, S3, and Redshift.
  • Experience with AWS services such as Step Functions, Glue, and Secrets Manager.
  • Strong analytical and problem-solving skills, with the ability to work effectively in a fast-paced, collaborative environment.
  • Excellent communication skills, with the ability to translate complex technical concepts for non-technical stakeholders.
  • Experience in the Financial Services Industry is an asset.

 Preferred Skills 

  • Strong knowledge of agile methodologies and DevOps practices, including CI/CD pipelines, containerization, and infrastructure as code (IaC).
  • Experience with Atlassian collaboration tools.
  • Experience with shell-scripting languages such as KSH, and with SQR.
  • Familiarity with job-scheduling tools such as Autosys and cron.
  • Experience with RESTful APIs is a plus.
  • Experience with cybersecurity measures is a plus.
  • Good interpersonal and team skills.
  • Excellent oral and written communication skills.
  • Strong organizational and analytical skills.

CI Financial is an independent company offering global wealth management and asset management advisory services through diverse financial services firms. Since 1965, we have consistently anticipated and responded to the changing needs of investors. We are driven by a commitment to provide individuals and institutions with the highest-quality investments and advice.

Our commitment to the highest levels of performance means that whatever their position, CI employees must be comfortable in a fast-paced environment that will stretch them to tap into their highest potential. Employees with a healthy dose of ambition, a desire to commit to a curious mindset for continuous learning, and a willingness to go the extra mile thrive at CI.

WHAT WE OFFER

  • Modern HQ location within walking distance from Union Station
  • Training Reimbursement
  • Paid Professional Designations
  • Employee Savings Plan (ESP)
  • Corporate Discount Program
  • Enhanced group benefits
  • Parental Leave Top–up program
  • Fitness membership discounts
  • Volunteer Paid Day

We are focused on building a diverse and inclusive workforce. If you are excited about this role but are not confident you meet all the qualification requirements, we encourage you to apply and explore the opportunity further.

Please submit your resume in confidence by clicking “Apply”. Only qualified candidates selected for an interview will be contacted. CI Financial Corp. and all of our affiliates (“CI”) are committed to fair and accessible employment practices and provide reasonable accommodations for persons with disabilities. If you require accommodations in order to apply for any job opportunities, require this posting in an additional format, or require accommodation at any stage of the recruitment process please contact us at , or call ext. 4747.

This advertiser has chosen not to accept applicants from your region.

Data Engineer

Toronto, Ontario CI Financial Corp.

Posted today

Job Description

At CI, we see a great place to work as one that is a safe place for everyone to have a voice, where people are empowered to take ownership over meaningful work, where there is an opportunity to grow through stretching themselves, where they can work on innovative products and projects, and where employees are supported and engaged in doing so.

We are currently seeking a Data Engineer to join the Data Solutions team in the Wealth Technology group at CI Wealth. The Data Solutions team focuses on developing data-driven solutions, driving innovation, and uncovering key insights to support strategic initiatives. The successful candidate will work on the data pipeline and data transformation tasks that underlie these strategic pillars.

RESPONSIBILITIES

  • Collaborate with business teams and Data Analysts to gather and understand requirements, ensuring alignment with business objectives and data needs. 
  • Translate business needs into detailed technical requirements in collaboration with subject matter experts (SMEs) to ensure accuracy and feasibility. 
  • Recommend and design scalable, efficient, and effective data architectures and workflows to support current and future business requirements. 
  • Design, develop, and maintain data assets to enable seamless extraction, transformation, and loading (ETL/ELT) processes from diverse data sources, making data accessible to client-facing applications, data warehouses, and internal tools. 
  • Build, operate, and optimize highly scalable and reliable data pipelines and infrastructure to support analytics, reporting and operational use cases. 
  • Drive end-to-end ownership of projects, including planning, development, testing, and deployment, ensuring timely and successful delivery. 
  • Collaborate with Quality Assurance (QA) and Support teams to identify, troubleshoot, and resolve issues in production environments, ensuring the stability and reliability of data systems. 
  • Work with Release Management to plan, coordinate, and implement data pipeline updates, adhering to CI’s deployment standards and minimizing disruption to production systems. 
  • Implement and enforce best practices for observability, data lineage, and governance, ensuring transparency, reliability, and compliance across all data systems. 
  • Participate in data migration projects, transitioning legacy systems to modern platforms and architectures while minimizing disruption and data loss. 
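The responsibilities above call for observability and data lineage across pipelines. One lightweight way to sketch that idea (the `with_lineage` decorator, dataset names, and log structure here are invented for illustration, not CI's actual tooling) is to have each pipeline step record what it read, what it wrote, and how long it took:

```python
import time

lineage_log: list[dict] = []


def with_lineage(step_name: str, source: str, target: str):
    """Wrap a pipeline step so each run records a lineage/observability entry."""
    def decorator(fn):
        def wrapper(*args, **kwargs):
            start = time.time()
            result = fn(*args, **kwargs)
            lineage_log.append({
                "step": step_name,
                "source": source,          # upstream dataset this step reads
                "target": target,          # downstream dataset this step writes
                "duration_s": round(time.time() - start, 3),
            })
            return result
        return wrapper
    return decorator


@with_lineage("load_accounts", source="crm.accounts", target="warehouse.accounts")
def load_accounts(rows):
    return list(rows)


out = load_accounts([{"id": 1}])
```

Production systems would emit these entries to a metadata service or an OpenLineage-style backend instead of an in-memory list, but the capture point is the same.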

EXPERIENCE 

  • 3+ years of professional experience in data engineering, with a strong focus on designing, developing, and optimizing scalable data pipelines, ETL/ELT workflows, and data integration solutions using modern tools and technologies. 

EDUCATION/TRAINING 

  • Post-secondary degree in a quantitative discipline. 

KNOWLEDGE, SKILLS, AND ABILITIES 

  • Comprehensive understanding of data pipeline architecture, modern data stack architecture, and cloud-based platforms, including AWS, Snowflake, and other cloud-native solutions. 
  • In-depth knowledge of and experience with the following tools and concepts:
      • Data extraction – SQL, Python, API invocation, CDC
      • Database systems – PostgreSQL, Sybase, MySQL, DynamoDB
      • Data storage repositories – SFTP, AWS S3
      • Scheduling of data jobs – cron, Apache Airflow, AWS Step Functions
      • ETL/ELT tools and workflows – Snowflake, PySpark, AWS Glue, EMR, AWS Lambda, SCD
      • CI/CD pipelines – Bitbucket, Git, Jenkins, CloudFormation, Terraform, Flyway
  • Strong knowledge of observability and data lineage implementation to ensure pipeline transparency and monitoring. 
  • A strong analytical mindset and sophisticated written and verbal communication skills. 
  • Experience in the Financial Services Industry is an asset. 
  • Ability to work in an organization committed to continuous improvement.
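The tool list above includes SCD (slowly changing dimensions) among the ETL/ELT concepts. A minimal Type 2 merge in pure Python (the table layout and the `address` field are invented for illustration; warehouses like Snowflake would do this with a MERGE statement) can be sketched as:

```python
from datetime import date


def scd2_merge(dimension: list[dict], key: str, incoming: dict, today: date) -> None:
    """Type 2 slowly changing dimension: expire the current row, append a new version."""
    for row in dimension:
        if row[key] == incoming[key] and row["end_date"] is None:
            if row["address"] == incoming["address"]:
                return  # no change; keep the current version
            row["end_date"] = today  # close out the old version
            break
    # New version becomes current (open-ended end_date).
    dimension.append({**incoming, "start_date": today, "end_date": None})


dim: list[dict] = []
scd2_merge(dim, "customer_id",
           {"customer_id": 1, "address": "10 King St"}, date(2024, 1, 1))
scd2_merge(dim, "customer_id",
           {"customer_id": 1, "address": "99 Bay St"}, date(2024, 6, 1))
```

After the second call, `dim` holds both versions of the customer: the original row closed on 2024-06-01 and the new row still open, which preserves history for point-in-time reporting.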

WHAT WE OFFER  

  • Modern HQ location within walking distance from Union Station
  • Equipment Purchase Program
  • Training Reimbursement
  • Paid Professional Designations
  • Employee Share Purchase Program (ESPP)
  • Corporate Discount Program
  • Enhanced group benefits
  • Parental Leave Top–up program
  • Fitness membership discounts
  • Volunteer Paid Day

We are focused on building a diverse and inclusive workforce. If you are excited about this role but are not confident you meet all the qualification requirements, we encourage you to apply and explore the opportunity further.

Please submit your resume in confidence by clicking “Apply”. Only qualified candidates selected for an interview will be contacted. CI Financial Corp. and all of our affiliates (“CI”) are committed to fair and accessible employment practices and provide reasonable accommodations for persons with disabilities. If you require accommodations in order to apply for any job opportunities, require this posting in an additional format, or require accommodation at any stage of the recruitment process please contact us at  , or call ext. 4747.

This advertiser has chosen not to accept applicants from your region.
 
