206 Senior Data Engineer jobs in Canada

Data Engineer - Big Data, SQL, Python

Toronto, Ontario Astra North Infoteck Inc.

Posted 2 days ago


Job Description

Keywords:
  • Data Engineer
  • SQL
  • Python
  • HDFS

Competencies:
  • Digital: Big Data and Hadoop Ecosystems
  • Digital: Python
  • Digital: ReactJS
  • PL/SQL

Experience (Years): 8-10

Essential Skills for the Senior Data Engineer resource (Must-have):
  • Experience with SQL Server back-end development, including SQL scripts, stored procedures, and dynamic stored procedures
  • Experience developing Python scripts and libraries
  • Experience with Agile development, GitHub, SSMS, and Dataiku
  • Experience working with databases such as SQL Server and HDFS
  • Experience with DBMS data modelling, ERDs, data pipelines, and DQ rules implementation
  • Experience implementing complex logic from business requirements using SQL stored procedures and/or Python
  • Experience producing architecture design and technical documents for all layers, such as front-end UI, API endpoints, and back-end stored procedures

Nice-to-have:
  • Experience with ETL workflows and Python/SQL/Dataiku recipes on the Dataiku platform
  • Experience designing front-end solutions with React.js, OpenShift, and API endpoints (APIGEE)
  • Experience with HR systems and workflow management tools such as Workday, SAP, and ServiceNow
  • Experience developing Tableau dashboards and reusable workbooks
  • Good knowledge of API endpoint development using Python


Data Engineer

Mississauga, Ontario Compass Group

Posted 9 days ago


Job Description

We are CDAI—the data and artificial intelligence engine of Compass Group North America. We design and deliver custom, in-house solutions tailored to the unique complexities of food service and hospitality. Our work is grounded in strong data foundations, layered with AI to enhance forecasting, streamline operations, and enable better, faster decision-making across Compass Group. With deep integration into the business and a commitment to white-glove service, CDAI empowers associates, clients, and customers through innovative, future-forward technologies.

# **Job Summary**

The **Data Engineer** will work closely with senior developers to build out new and maintain existing ETL pipelines in Airflow. They will be part of our Data Technology team, working with an all-new technology stack and collaborating closely with internal and external stakeholders.

Now, if you were to come on board as our **Data Engineer**, we’d ask you to do the following for us:

- Design, implement and maintain data pipelines for extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS technologies
- Build new and maintain existing Airflow DAGs (see the illustrative sketch after this list)
- Maintain and add to our middle tier API
- Prototype new technology that supports our vision of making our consumers’ experiences better with our data
- Provide support and insights to the business analytics and data science teams
- Work with stakeholders to assist with data-related technical issues and support their data infrastructure needs
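To make the Airflow responsibility above concrete, here is a minimal sketch of what such a DAG could look like. It assumes Airflow 2.x; the dag_id, task names, and task logic are hypothetical placeholders, not Compass Group's actual pipeline.

```python
# Minimal Airflow 2.x DAG sketch: extract from a source, then load to a warehouse table.
# All names (dag_id, task ids, target table) are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # A real task might pull from an API or a source database and stage results in S3;
    # here we just return a small sample that Airflow passes along via XCom.
    return [{"order_id": 1, "total": 42.50}, {"order_id": 2, "total": 17.25}]


def load_orders(**context):
    rows = context["ti"].xcom_pull(task_ids="extract_orders")
    # A real task would write to the warehouse, e.g. through a database hook.
    print(f"Loading {len(rows)} rows into analytics.orders")


with DAG(
    dag_id="orders_daily",             # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_orders", python_callable=load_orders)
    extract >> load
```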

Think you have what it takes to be our **Data Engineer**? We’re committed to hiring the best talent for the role. Here’s how we’ll know you’ll be successful in the role:

- At least 2-3 years of relevant experience in a similar role or function
- Bachelor’s degree or equivalent in Computer Science, Computer Engineering, Information Systems or similar
- Programming skills with Python and Spark
- Exposure to at least one cloud provider, AWS preferred
- Knowledge of SQL, relational database concepts and SQL scripting
- Some experience with Airflow or similar systems is highly preferred
- Web development skills with languages and frameworks like Angular and Node.js would be a plus
- Experience with Docker, Kafka, or other data stream processing platforms is preferred

Compass Group Canada is committed to nurturing a diverse workforce representative of the communities within which we operate. We encourage and are pleased to consider all qualified candidates, without regard to race, colour, citizenship, religion, sex, marital / family status, sexual orientation, gender identity, aboriginal status, age, disability or persons who may require an accommodation, to apply.

For accommodation requests during the hiring process, please contact for further information.

Data Engineer

Toronto, Ontario Kingsdale Advisors

Posted today


Job Description

About Kingsdale Advisors

With offices in Toronto and New York, Kingsdale Advisors is the leading shareholder services and advisory firm, having acted on the largest and most high-profile proxy fights and transactions. Since 2003, public companies across North America have looked to the expertise of Kingsdale Advisors to help them reach out to shareholders and secure the success of transactions or resolutions driven by shareholder votes. Kingsdale Advisors’ multidisciplinary team offers an array of specialized services focused on strategic and defensive advisory, governance advisory and proxy analytics, strategic communications, and voting analytics.

Your Role and How You Will Contribute

Reporting to the VP of Operations, the Data Engineer is responsible for updating and maintaining the organization’s data repositories, using various ETL-related tools to cleanse raw data into client-ready formats for use both with clients and within the organization. They will be required to communicate with the various departments to gather user requirements and execute standard operating procedures. Attention to detail is a must, as data integrity and accuracy are extremely important.

Requirements

Responsibilities & Key Duties will include

  • Manipulate data sets using advanced ETL techniques.
  • Identify, create, maintain and report on data.
  • Provide support for data cleansing, mapping/transformation and quality control.
  • Data mining and research.
  • Extract, transform, and load both structured and unstructured data in multiple environments to enable self-serve analytics across the business, applying the technical expertise to turn our different data sets into meaningful results (see the illustrative sketch after this list).
  • Provide analytical support to the company through custom reports in various formats (Excel, Google Analytics, Power BI, etc.). Create value-added reports, metrics, and analysis across multiple dimensions.
  • Analyze user requirements for reports, forms, queries, and data extraction.
  • Develop and deploy end-user practices and tools for data extraction, queries, and data manipulation in accordance with business processes.
  • Import and/or export data between different kinds of software.
  • Tracking, managing, and validating data across systems.
  • Ensure the stability and reliability of data access and data quality across the organization via ongoing database support and maintenance.
  • Willing to be flexible with shift start and end time including evenings and weekends.
  • Perform other duties as required.
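As a rough illustration of the cleansing and transformation work described above, the sketch below uses pandas; the file names and columns are hypothetical, and the actual tooling in this role may instead be SSIS, Power Query, or Alteryx.

```python
# Sketch of a small extract-cleanse-load step. File and column names are hypothetical.
import pandas as pd

# Extract: raw records exported from an upstream system (hypothetical file).
raw = pd.read_csv("raw_shareholders.csv", dtype=str)

# Transform: trim whitespace, normalize postal codes, parse dates, drop exact duplicates.
raw["holder_name"] = raw["holder_name"].str.strip().str.title()
raw["postal_code"] = raw["postal_code"].str.replace(r"\s+", "", regex=True).str.upper()
raw["record_date"] = pd.to_datetime(raw["record_date"], errors="coerce")
clean = raw.dropna(subset=["holder_name"]).drop_duplicates()

# Load: write a client-ready extract to Excel for reporting (requires openpyxl).
clean.to_excel("shareholders_clean.xlsx", index=False)
```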

Qualifications & Attributes

  • Ability to translate functional needs to technical documents.
  • Knowledge of ETL and Business Intelligence environments.
  • High attention to detail.
  • Ability to quickly learn new technology.
  • Intermediate to advanced Microsoft Excel knowledge.
  • Knowledge of data management tools (Python, Power BI tools, SQL, SSMS, SSIS).
  • Experience with VBA, RegEx, Microsoft Power Platform, and SharePoint is an asset.
  • Experience with building tables and queries using MS SQL.
  • Familiarity with Alteryx or experience in process automation is advantageous.
  • Understanding of software development and methodologies.
  • Ability to communicate ideas effectively.
  • Experience working in a fast-paced, team-oriented, collaborative environment.
  • Good interpersonal, written, and oral communication skills.
  • Demonstrates a rapid learning capability.

Equity Statement

Kingsdale Advisors is a proud advocate for diversity and inclusion. We welcome applications from members of the BIPOC community, women, people with disabilities, the LGBTQ+ community, and individuals with diverse intersectional identities. Accommodation is available throughout the recruitment and employment processes, in accordance with the Accessibility for Ontarians with Disabilities Act (AODA).


Data Engineer

Vancouver, British Columbia ScalePad

Posted today


Job Description

Salary: $90,000-$110,000

We're Hiring!

We're looking for brilliant thinkers to join our #Rocketeers. If you've ever wondered what it's like to work in a place where people enjoy their work and where talent is more important than the title, then keep reading.


What is ScalePad?

ScalePad is a market-leading software-as-a-service (SaaS) company with headquarters in Vancouver, Toronto, Montreal and Phoenix, AZ. However, we are proud to say our employee reach is now global so we can best serve our partners all over the world.


Our success is no accident: ScalePad provides MSPs of every size with the knowledge, technology, and community they need to deliver increased client value while navigating the continuously changing terrain of the IT landscape. With a suite of integrated products that automate and standardize MSPs' operations, analyze and uncover new opportunities, and expand value to clients, ScalePad is equipping the MSP adventure.

ScalePad has received awards such as MSP Today's Product of the Year, G2's 2024 Fastest Growing Product, and 2024 Best IT Management Product. In 2023, it was named a Best Workplace in Canada by Great Place to Work. ScalePad is a privately held company serving over 12,000 MSPs across the globe.

You can contribute to our innovation and appreciate how your work is helping take this company to a higher level of operational maturity. More on that here.


Your mission should you choose to accept it.

As a Data Engineer supporting our Revenue Operations, your mission is to drive the evolution of our business intelligence, data management, and operational solutions. Your active involvement in developing innovative approaches will ensure our scalability aligns seamlessly with our vision for growth. Collaborating closely with Data Scientists, Business Analysts, and Software Engineers, you will contribute to the creation of exceptional solutions. Our collaborative work environment values your input, as it plays a pivotal role in our continuous innovation process. This unique opportunity empowers you to engage in every aspect of the data engineering cycle, with the guidance and resources needed to learn, optimize, advance, and grow every quarter.


Responsibilities.

  • Familiarize yourself with our current initiatives, teams, and operational workflows to seamlessly integrate into ongoing projects.
  • Master our technology architecture, gaining a deep understanding of our systems and infrastructure.
  • Gain insight into our business operations structure, needs, and cycles to align your work with organizational goals.
  • Join a new development team and kick-off major projects.
  • Collaborate closely with data scientists to enable their projects with robust data engineering solutions.
  • Work on data lake and data warehouse solutions, optimizing data storage and retrieval processes (see the illustrative sketch after this list).
  • Develop solutions that seamlessly connect different departments, fostering efficient data flow and information sharing.
  • You have autonomy in your ability to manage and prioritize your tasks with an innate ability to shift gears when required.
  • You are a sponge when it comes to learning and excelling even when outside your comfort zone.
  • Maintain an unwavering commitment to the quality of your work and that of your teammates, upholding high standards.
  • Embrace a learning mindset, excelling even outside your comfort zone and staying updated on industry trends.
  • Possessing experience in OLAP databases, artificial intelligence, machine learning, and event sourcing is considered an asset.
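As a loose illustration of the lake/warehouse work referenced above, here is a minimal PySpark raw-to-curated ("bronze to silver") sketch. The paths, schema, and column names are hypothetical, and on Databricks the writes would more typically target Delta tables than plain Parquet.

```python
# Illustrative PySpark sketch: clean raw events landed in a data lake and write an
# analytics-ready, partitioned table. Paths and columns are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("revops_bronze_to_silver").getOrCreate()

# Ingest raw subscription events from the lake (hypothetical path).
bronze = spark.read.json("s3://revops-lake/bronze/subscription_events/")

silver = (
    bronze
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .withColumn("event_date", F.to_date("event_ts"))
    .withColumn("mrr", F.col("mrr").cast("decimal(12,2)"))
    .dropDuplicates(["event_id"])
    .filter(F.col("account_id").isNotNull())
)

# Write a partitioned table for BI and data science consumers.
(silver.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://revops-lake/silver/subscription_events/"))
```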

Qualifications.

  • 2+ years of hands-on Data Engineer experience with a strong track record of delivering data solutions.
  • Proficiency in Python, R, C#, or equivalent programming languages for crafting data solutions.
  • Strong knowledge of SQL and NoSQL solutions.
  • Familiarity with Agile development methodologies.
  • Experience in conceptual, business, and analytical data modeling.
  • Proficiency in Databricks, PySpark, and DBT.
  • Experience with data ingestion tools like Fivetran, Airbyte, and Airflow.
  • Deep understanding of data pipelines, including ETL processes.
  • Commitment to best practices for efficient data pipelines, data quality, and integrity.
  • Passion for Big Data, data mining, artificial intelligence, and machine learning.
  • Curious mindset, love for learning, and strong problem-solving skills.
  • Ability to collaborate with business stakeholders to clarify data requirements and develop models.
  • Strong focus on practical and valuable solutions for end-users.
  • Agile and iterative approach to work, with the ability to recognize and rectify mistakes.
  • Nice qualifications to have: Experience with CRM, analytics, data visualization, and billing tools (e.g., Zendesk, HubSpot, Segment, GA, Tableau, Stripe, Chargebee).


What You'll Love Working As A Rocketeer:

  • Everyone's an Owner: Through our Employee Stock Option Plan (ESOP), each team member has a stake in our success. As we scale, your contributions directly shape our future and you share in the rewards.
  • Growth, Longevity and Stability: Benefit from insights and training from our leadership and founder, whose extensive experience in funding and scaling successful software companies creates a stable environment for your long-term career growth. Their proven track record fosters a culture of lasting success.
  • Annual Training & Development: Every employee receives an annual budget for professional development, empowering you to advance your skills and career on your terms.
  • Hybrid Flexibility: Enjoy a world-class office at our headquarters in downtown Vancouver, Toronto, and Montreal
  • Cutting-Edge Gear: Whether in the office or at home, you'll be set up for success with top-of-the-line hardware.
  • Wellness at Work: Our Vancouver office features a fitness facility, outdoor ping-pong tables
  • Comprehensive Benefits: We've got you covered with an extensive benefits package, including 100% employer-paid medical and dental coverage, RRSP matching after one year of employment, and even a monthly stipend to help offset the costs of the hybrid experience.
  • Flexible Time Off: Our unlimited flex-time policy, in addition to all accrued vacation, allows you to take the time you need to recharge and thrive.

Dream jobs don't knock on your door every day.

ScalePad is not your typical software company. When we hire you, we aren't just offering you a job, but rather committing to investing in both you and your long-term career. You'll help shape how this modern SaaS company operates and make a genuine impact on the future of our people, product, and partners.

At ScalePad, we believe in the power of Diversity, Equity, Inclusion, and Belonging (DEIB) to drive innovation, collaboration, and success. We are committed to fostering a workplace where every individual's unique experiences and perspectives are valued, and where employees from all backgrounds can thrive. Our dedication to DEIB is woven into the fabric of our culture, guiding our actions and decisions as we build a stronger and more inclusive future together.

Join us and be part of a team that celebrates differences, embraces fairness, and ensures that everyone has an equal opportunity to contribute and grow. Together, we're creating an environment where diverse voices are not only heard but also amplified, where everyone feels valued, and where we can all achieve our full potential.



Please no recruiters or phone calls.


Data Engineer

Moncton, New Brunswick Four Eyes Financial

Posted today


Job Description

AI Engineer

Four Eyes Financial is a Canadian Regulatory Technology (RegTech) company based in Saint John, New Brunswick. We are at the forefront of change in how wealth management firms approach their data and compliance functions. Our proprietary Intelligent Compliance Platform streamlines interactions between Advisors, Clients, and Head Office.

We are expanding our data team to support the development of an AI-Powered Compliance Suite designed to transform financial supervision. This role is integral to building the foundational data architecture for our next-generation Generative AI and Agentic AI systems.

Candidate must be located in New Brunswick.


Focus Areas

Scalable Data Solutions: Database infrastructure is structured and operating efficiently.

  • Architect, build, and maintain a scalable Data Lakehouse using AWS S3, Apache Iceberg, and AWS Glue (see the illustrative sketch after this list).
  • Develop and operate robust, continuous ETL pipelines to ingest high-volume data from various sources, including Aurora databases using AWS Database Migration Service (DMS).
  • Collaborate with the AI team to prepare, structure, and contextualize data for training and fine-tuning Large Language Models (LLMs) and specialized AI agents.
  • Contribute to the implementation and maintenance of the MLOps environment, ensuring proper version control for datasets and models.
  • Utilize Amazon Neptune to model and manage the complex permissions graph required for secure data access.
  • Experience with large data sets and Enterprise-grade databases (Relational, NoSQL and Node Graph Databases).
  • Experience with many data types: JSON, XML, CSV, etc.
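One possible shape of the lakehouse ingestion described in this list is appending records to an Iceberg table registered in the Glue catalog via the pyiceberg library; this is only an assumption about tooling, and the catalog, namespace, table, and fields are hypothetical placeholders.

```python
# Append a small batch of records to an Apache Iceberg table in an AWS Glue catalog
# using pyiceberg (>= 0.6). All identifiers below are hypothetical placeholders.
import pyarrow as pa
from pyiceberg.catalog import load_catalog

# Glue-backed Iceberg catalog; credentials and region come from the AWS environment.
catalog = load_catalog("default", **{"type": "glue"})

table = catalog.load_table("compliance.trade_events")

# A batch of records, e.g. produced upstream by a DMS/CDC consumer.
batch = pa.Table.from_pylist([
    {"trade_id": "T-1001", "advisor_id": "A-17", "amount": 2500.00},
    {"trade_id": "T-1002", "advisor_id": "A-42", "amount": 980.50},
])

table.append(batch)  # the batch schema must match the Iceberg table schema
```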

ETL: Build, operate, and support our ETL pipelines & data systems.

  • Deep understanding of how to build and maintain event-driven ETL processes with complex interdependencies (see the illustrative sketch after this list).
  • Understanding of how to implement ETL processes using a combination of serverless and container-based infrastructure.
  • Experience architecting and building data pipelines.
  • Experience extracting data from multiple sources (APIs, SFTP, Web Scraping, etc.).
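For the event-driven, serverless ETL pattern described above, a minimal sketch might be an AWS Lambda handler triggered by an S3 object-created event that cleans a dropped CSV and re-stages it. The bucket names, prefixes, and columns are hypothetical placeholders.

```python
# Event-driven ETL sketch: S3 put event -> Lambda -> cleaned CSV in a curated bucket.
import csv
import io

import boto3

s3 = boto3.client("s3")


def handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
        rows = list(csv.DictReader(io.StringIO(body)))

        # Minimal transformation: keep only complete rows and normalize one column.
        cleaned = [
            {**row, "account_id": row["account_id"].strip().upper()}
            for row in rows
            if row.get("account_id")
        ]
        if not cleaned:
            continue

        out = io.StringIO()
        writer = csv.DictWriter(out, fieldnames=cleaned[0].keys())
        writer.writeheader()
        writer.writerows(cleaned)

        s3.put_object(
            Bucket="curated-compliance-data",        # hypothetical target bucket
            Key=f"clean/{key.rsplit('/', 1)[-1]}",
            Body=out.getvalue().encode("utf-8"),
        )
```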

Leadership & Mentorship: Effective cross-functional collaboration.

  • Participate in sprint planning, peer code reviews, and architectural discussions.
  • Contribute to a fast-paced team solving complex business challenges with AI, ensuring alignment with product and project goals.
  • Document model development and decision-making processes clearly.
  • Support team members in learning best practices in AI development.
  • Contribute to a culture of learning and innovation within the team.

Qualifications

  • A graduate of Computer Science, Engineering, or equivalent knowledge/experience, with a minimum of 5 years of experience.
  • Strong programming skills in Python and advanced proficiency in SQL (Postgres experience is an asset).
  • Proven experience building and operating ETL pipelines using cloud-based tools such as AWS Glue, AWS DMS, Apache Airflow, or similar technologies.
  • Solid experience within the AWS data ecosystem (e.g., S3, Glue, Lambda).
  • An intuitive grasp of data modeling, data warehousing, and Data Lakehouse concepts.
  • Familiarity with Python libraries for data manipulation such as Pandas, NumPy, or Dask.
  • Experience with Amazon SageMaker, Amazon Bedrock, Amazon OpenSearch, or graph databases like Amazon Neptune.
  • Experience with Python coding frameworks such as Flask or FastAPI.
  • Bonus: Experience in fintech or regulatory/compliance-driven environments.

We have a diverse team of bright and high-performing people and we are in growth mode. You'd fit in well if you thrive in a fast-paced environment, have a thirst for learning, enjoy improving systems and processes, and focus on excellence in everything you do.

Benefits include:

  • Group health & dental benefits
  • RRSP matching program
  • Competitive salary & vacation days
  • Hybrid work options
  • And more!

Are you ready to accelerate your career? If you have a pioneering spirit, a passion for results, seek meaningful work and want to make an impact on a changing industry, come join us!


Data Engineer

Toronto, Ontario CI Financial Corp.

Posted today


Job Description

At CI, we see a great place to work as one that is a safe place for everyone to have a voice, where people are empowered to take ownership over meaningful work, where there is an opportunity to grow through stretching themselves, where they can work on innovative products and projects, and where employees are supported and engaged in doing so.

We are currently seeking a Data Engineer to join the Data Solutions team in the Wealth Technology group at CI Wealth. The Data Solutions team focuses on developing data-driven solutions, driving innovation, and uncovering key insights to support strategic initiatives. The successful candidate will work on the data pipeline and data transformation tasks that underlie these strategic pillars.

RESPONSIBILITIES

  • Collaborate with business teams and Data Analysts to gather and understand requirements, ensuring alignment with business objectives and data needs. 
  • Translate business needs into detailed technical requirements in collaboration with subject matter experts (SMEs) to ensure accuracy and feasibility. 
  • Recommend and design scalable, efficient, and effective data architectures and workflows to support current and future business requirements. 
  • Design, develop, and maintain data assets to enable seamless extraction, transformation, and loading (ETL/ELT) processes from diverse data sources, making data accessible to client-facing applications, data warehouses, and internal tools. 
  • Build, operate, and optimize highly scalable and reliable data pipelines and infrastructure to support analytics, reporting and operational use cases. 
  • Drive end-to-end ownership of projects, including planning, development, testing, and deployment, ensuring timely and successful delivery. 
  • Collaborate with Quality Assurance (QA) and Support teams to identify, troubleshoot, and resolve issues in production environments, ensuring the stability and reliability of data systems. 
  • Work with Release Management to plan, coordinate, and implement data pipeline updates, adhering to CI’s deployment standards and minimizing disruption to production systems. 
  • Implement and enforce best practices for observability, data lineage, and governance, ensuring transparency, reliability, and compliance across all data systems. 
  • Participate in data migration projects, transitioning legacy systems to modern platforms and architectures while minimizing disruption and data loss. 

EXPERIENCE 

  • 3+ years of professional experience in data engineering, with a strong focus on designing, developing, and optimizing scalable data pipelines, ETL/ELT workflows, and data integration solutions using modern tools and technologies. 

EDUCATION/TRAINING 

  • Post-secondary degree in a quantitative discipline. 

KNOWLEDGE, SKILLS, AND ABILITIES 

  • Comprehensive understanding of data pipeline architecture, modern data stack architecture, and cloud-based platforms, including AWS, Snowflake, and other cloud-native solutions. 
  • In-depth knowledge and experience with the following tools and concepts: 
  • Data extraction – SQL, Python, API invocation, CDC (see the illustrative sketch after this list) 
  • Database systems – PostgreSQL, Sybase, MySQL, DynamoDB 
  • Data storage repositories – SFTP, AWS S3 
  • Scheduling of data jobs – CRON, Apache Airflow, AWS Step Functions 
  • ETL/ ELT tools and workflow – Snowflake, PySpark, AWS Glue, EMR, AWS Lambda, SCD 
  • CI/CD pipelines – Bitbucket, Git, Jenkins, CloudFormation, Terraform, Flyway 
  • Strong knowledge of observability and data lineage implementation to ensure pipeline transparency and monitoring. 
  • A strong analytical mindset and sophisticated written and verbal communication skills. 
  • Experience in the Financial Services Industry is an asset. 
  • Ability to work within an organization based upon continuous improvement. 
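As a rough illustration of the extraction and staging concepts listed above (SQL/Python extraction with a CDC-style watermark, landing to S3), the sketch below uses psycopg2 and boto3; the connection details, table, and bucket are hypothetical placeholders, not CI's actual systems.

```python
# Watermark-based (CDC-style) incremental extract from PostgreSQL, staged to S3 as CSV.
import csv
import io
from datetime import datetime, timezone

import boto3
import psycopg2

LAST_WATERMARK = "2024-01-01T00:00:00+00:00"   # in practice, read from a state store

conn = psycopg2.connect("dbname=wealth host=localhost user=etl password=secret")
with conn, conn.cursor() as cur:
    cur.execute(
        """
        SELECT account_id, balance, updated_at
        FROM accounts
        WHERE updated_at > %s
        ORDER BY updated_at
        """,
        (LAST_WATERMARK,),
    )
    rows = cur.fetchall()

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["account_id", "balance", "updated_at"])
writer.writerows(rows)

run_ts = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
boto3.client("s3").put_object(
    Bucket="ci-data-staging",                         # hypothetical bucket
    Key=f"accounts/incremental/{run_ts}.csv",
    Body=buf.getvalue().encode("utf-8"),
)
```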

WHAT WE OFFER  

  • Modern HQ location within walking distance from Union Station
  • Equipment Purchase Program
  • Training Reimbursement
  • Paid Professional Designations
  • Employee Share Purchase Program (ESPP)
  • Corporate Discount Program
  • Enhanced group benefits
  • Parental Leave Top-up program
  • Fitness membership discounts
  • Paid volunteer days

We are focused on building a diverse and inclusive workforce. If you are excited about this role and are not confident you meet all the qualification requirements, we encourage you to apply to investigate the opportunity further.

Please submit your resume in confidence by clicking “Apply”. Only qualified candidates selected for an interview will be contacted. CI Financial Corp. and all of our affiliates (“CI”) are committed to fair and accessible employment practices and provide reasonable accommodations for persons with disabilities. If you require accommodations in order to apply for any job opportunities, require this posting in an additional format, or require accommodation at any stage of the recruitment process please contact us at  , or call ext. 4747.


Data Engineer

Montréal, Quebec Small Door Veterinary

Posted today


Job Description

Small Door is membership-based veterinary care designed with human standards that is better for pets, pet parents, and veterinarians alike. We designed and delivered a reimagined veterinary experience via a membership that includes exceptional care, 24/7 telemedicine, and transparent pricing - delivered with modern hospitality in spaces designed by animal experts to be stress-free. We opened our flagship location in Manhattan's West Village in 2019 and have quickly expanded across the East Coast. Small Door now operates in New York City, Boston, Washington DC, Maryland and Virginia with continued expansion plans in 2026.

We're looking for a motivated, curious, and collaborative Entry-Level Data Engineer to join our growing Data team. In this role, you'll help build and maintain the pipelines, tools, and systems that power analytics, product insights, and strategic decision-making across the company.

This is a great opportunity for someone early in their career who's excited to learn, ship fast, and make a real impact at a company where data is core to everything we do.

What You'll Do

  • Collaborate with the product, technology and data team to build scalable data pipelines (ETL/ELT)
  • Maintain and optimize data infrastructure using tools like dbt, Airflow, and Snowflake (or similar)
  • Support data ingestion from internal and external sources (APIs, application databases, third-party tools)
  • Ensure data integrity, availability, and performance through testing and monitoring (see the illustrative sketch after this list)
  • Partner with analytics and product teams to make trusted, high-quality data available for reporting and machine learning
  • Write clear, maintainable code and documentation
  • Contribute to the evolution of our data architecture and engineering best practices
  • Help manage the data warehouse and its data model
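A minimal sketch of the kind of data-quality testing referenced above, run with pandas against a warehouse extract; the table, columns, and allowed values are hypothetical placeholders rather than Small Door's actual model.

```python
# Simple data-quality checks over an extract; fail loudly if any check does not pass.
import pandas as pd

appointments = pd.read_parquet("appointments.parquet")   # hypothetical extract

checks = {
    "has_rows": len(appointments) > 0,
    "appointment_id_unique": appointments["appointment_id"].is_unique,
    "no_null_pet_id": appointments["pet_id"].notna().all(),
    "valid_status": appointments["status"].isin(["booked", "completed", "cancelled"]).all(),
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    # In production this might alert an on-call channel or fail the pipeline task.
    raise ValueError(f"Data-quality checks failed: {failed}")
print("All data-quality checks passed.")
```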

Who You Are

  • A degree in Computer Science, Engineering, Data Science, or a related field — or equivalent practical experience
  • Proficiency in SQL and Python
  • Familiarity with data modeling concepts and ETL workflows
  • Interest or experience in tools like dbt, Airflow, Snowflake, Redshift, BigQuery, or similar
  • A growth mindset and eagerness to learn from feedback and challenges
  • Excellent problem-solving skills and attention to detail
  • Strong communication and collaboration skills

What you'll get

  • Competitive salary
  • A great, top-of-the-line benefits plan provided by PEO Canada, including short-term disability insurance
  • An opportunity to make a real impact on the people around you
  • A collaborative group of people who live our core values and have your back
  • A clear career path with opportunities for development, both personally and professionally

Small Door is proudly committed to creating a diverse, inclusive and equitable workplace. We encourage qualified applicants of every background, ability, and life experience to apply to appropriate employment opportunities.


Data Engineer

Toronto, Ontario Forum Asset Management

Posted today


Job Description

Salary:

Forum Asset Management Data Engineer


Location: 181 Bay Street, Toronto


In Office: 5 days per week


Overview of Forum:

Join us in delivering Extraordinary Outcomes through investment.


Forum is an investor, developer and asset manager operating across North America for over 28 years, focusing on real estate, private equity and infrastructure, with a strategic concentration in housing.


Our AUM exceeds $3 billion. We are committed to sustainability, responsible investing and creating value that benefits our stakeholders and the communities in which we invest, what we call our Extraordinary Outcomes.


In 2024, Forum completed the largest real estate transaction in Canada with the Alignvest acquisition, making us the largest owner of Purpose-Built Student Accommodation (PBSA) in Canada through our $.5B open-ended Real Estate Income and Impact Fund (REIIF). Our national development pipeline now exceeds 3.5 billion, positioning Forum as the largest developer of PBSA in Canada, operating from coast to coast.


The Forum team is adaptable, agile, and dynamic, committed to sustainable and responsible investing. Our people bring diverse cultural backgrounds and professional experiences, fostering innovation and thought leadership. We uphold integrity, trust, and transparency as core values, knowing that to achieve Extraordinary Outcomes, we need to support and develop an Extraordinary team.

Position Overview:

We're looking for a Data Engineer to own the architecture, build, and evolution of Forum's enterprise data platform using the Microsoft Fabric technology stack (or a suitable alternative). This is a rare greenfield opportunity to architect the full data lifecycle, from ingestion and transformation to analytics and AI enablement, while directly impacting Forum's investment, real estate, and operational strategies.

This is a hands-on, in-office role with high visibility. You'll act as a trusted advisor and technical authority, collaborating with investment professionals, business leaders, analysts, and software developers to design enterprise-grade systems and AI-powered workflows that drive measurable business value. Your work will lay the foundation for our long-term data strategy and the future of decision-making at Forum.


Key Duties and Responsibilities:

  • Own the end-to-end design and evolution of Forum's Lakehouse architecture using Microsoft Fabric, Azure Data Factory, Synapse, ADLS, and related Azure services
  • Define and enforce data engineering standards, governance frameworks, and lifecycle management practices
  • Lead large-scale data initiatives, migrations, and integrations across diverse internal and external systems
  • Design and optimize enterprise-grade ETL/ELT pipelines for high-volume, high-complexity data
  • Implement structured data workflows (e.g., Medallion Architecture) to deliver reliable, business-ready data
  • Develop AI/ML-powered workflows using tools like Azure OpenAI, Document Intelligence, and Azure ML
  • Create internal APIs and tools to enable self-serve analytics and data access across departments (see the illustrative sketch after this list)
  • Partner with business teams across real estate, private equity, and operations to identify data opportunities and implement tailored solutions
  • Develop and evolve our Power BI dashboards and reporting layer, ensuring reliable data access and visualization
  • Promote best practices in data governance, automation, and AI application across Forum's technology ecosystem
  • Partner with internal teams and external technology vendors to drive the rollout of the platform


Candidate Profile:

  • 6-7 years of progressive experience in data engineering, with a proven track record of architecting and delivering enterprise-scale data solutions
  • Expert-level experience with the Microsoft Azure ecosystem: Microsoft Fabric, Azure Data Factory (ADF), Synapse, ADLS Gen2, and Power BI
  • Proficiency in Python for data engineering, automation, and API development
  • Deep understanding of data modeling, ELT/ETL design, and data warehouse best practices
  • Track record architecting enterprise-scale data platforms in high-growth or regulated industries
  • Proven success deploying AI/ML solutions into production at scale
  • Experience integrating Azure OpenAI, LLMs, and document intelligence into real business processes
  • Ability to evaluate, pilot, and operationalize emerging AI technologies for measurable business impact
  • Demonstrated ability to work directly with executives, investors, and analysts to shape data strategy
  • Strong communication skills and an entrepreneurial mindset: capable of turning ambiguity into working solutions
  • Experience working with financial data models, capital tables, investment reports, or property management systems
  • Background in private equity, real estate, asset management, or related sectors preferred



At Forum, we encourage diversity. We are committed to an inclusive workplace that reflects our belief that diversity is central to building a high-performing team. Forum is an equal-opportunity employer. We are committed to providing accessible employment practices. Should you require an accommodation during any phase of the recruitment process, please let the recruitment team know at
