
87 Analytics Engineer jobs in Canada

Analytics Engineer

Toronto, Ontario - Clutch Technologies Inc.

Posted today

Job Description

About Clutch:

Clutch is Canada's largest online used car retailer, delivering a seamless, hassle-free car-buying experience to drivers everywhere. Customers can browse hundreds of cars from the comfort of their home, get the right one delivered to their door, and enjoy peace of mind with our 10-Day Money-Back Guarantee… and that's just the beginning.

Named one of Canada's Top Growing Companies two years in a row and awarded a spot on LinkedIn's Top Canadian Startups list, we're looking to add curious, hard-working, and driven individuals to our growing team.

Headquartered in Toronto, Clutch was founded in 2017. Clutch is backed by a number of world-class investors, including Canaan, BrandProject, Real Ventures, D1 Capital, and Upper90. To learn more, visit clutch.ca.

What You'll Do

  • Build, maintain, and optimize our analytics data stack, including ELT pipelines, dimensional models, and semantic layers using tools like dbt, Snowflake, and Airflow.
  • Collaborate with teams across Marketing, Product, Operations, and Finance to define key metrics, improve reporting fidelity, and unlock self-serve analytics.
  • Own and evolve core data models and pipelines that drive reporting, dashboards, and downstream analysis.
  • Apply best practices around data testing, observability, and CI/CD to ensure reliable, version-controlled, production-ready analytics code.
  • Partner with stakeholders to understand business needs and translate them into performant and trustworthy data products.
  • Contribute to and advocate for Clutch's internal data documentation and governance practices.
  • Mentor others on data modeling, tooling, and process — bringing a scalable, engineering-first mindset to analytics.
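
The "data testing" practice mentioned above usually means assertions like dbt's built-in `not_null` and `unique` checks running against every model. As a rough illustration of the idea (not Clutch's actual stack), here is the same kind of check written in plain Python against an in-memory SQLite table; the table and column names are hypothetical:

```python
import sqlite3

# Hypothetical staging table standing in for a dbt model's output.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_orders (order_id INTEGER, customer_id INTEGER)")
conn.executemany(
    "INSERT INTO stg_orders VALUES (?, ?)",
    [(1, 10), (2, 11), (3, 10)],
)

def assert_not_null(conn, table, column):
    # Mirrors dbt's not_null test: fail if the key column has any NULLs.
    nulls = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()[0]
    assert nulls == 0, f"{table}.{column} has {nulls} NULL values"

def assert_unique(conn, table, column):
    # Mirrors dbt's unique test: fail if any key value appears more than once.
    dupes = conn.execute(
        f"SELECT COUNT(*) FROM "
        f"(SELECT {column} FROM {table} GROUP BY {column} HAVING COUNT(*) > 1)"
    ).fetchone()[0]
    assert dupes == 0, f"{table}.{column} has {dupes} duplicated values"

assert_not_null(conn, "stg_orders", "order_id")
assert_unique(conn, "stg_orders", "order_id")
```

In dbt these checks are declared in a model's YAML file and run in CI, which is what makes the pipeline "version-controlled, production-ready" in the sense the posting describes.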

What We're Looking For

  • 4–6 years of experience in analytics engineering, data analytics, or data engineering — ideally in fast-paced or high-growth environments.
  • Strong SQL expertise and comfort working with large-scale, normalized, or dimensional datasets.
  • Hands-on experience building data models with dbt in a production environment (CI, testing, documentation).
  • Experience working with cloud data warehouses like Snowflake, BigQuery, or Redshift.
  • Experience building semantic layers and enabling self-serve BI (Looker, Sigma, Tableau).
  • A proactive communicator who can collaborate with business stakeholders and technical teams alike.
  • A strong sense of ownership and accountability for data quality, performance, and maintainability.
  • Familiarity with Python and orchestration tools (Airflow, Prefect, Dagster) for data pipelines and transformations is a plus.

Why you'll love it at Clutch:

  • Autonomy & ownership -- create your own path, and own your work
  • Competitive compensation and equity incentives!
  • Generous time off program
  • Health & dental benefits

Clutch is committed to fostering an inclusive workplace where all individuals have an opportunity to succeed. If you require accommodation at any stage of the interview process, please email .

This advertiser has chosen not to accept applicants from your region.

Analytics Engineer

Toronto, Ontario - Alpaca

Posted today

Job Description

Who We Are:

Alpaca is a US-headquartered self-clearing broker-dealer and brokerage infrastructure for stocks, ETFs, options, crypto, fixed income, 24/5 trading, and more. Our recent Series C funding round brought our total investment to over $170 million, fueling our ambitious vision.

Through our subsidiaries, Alpaca operates as a licensed financial services company, serving hundreds of financial institutions across 40 countries with our institutional-grade APIs. These include broker-dealers, investment advisors, wealth managers, hedge funds, and crypto exchanges, totalling over 6 million brokerage accounts.

Our global team is a diverse group of experienced engineers, traders, and brokerage professionals who are working to achieve our mission of opening financial services to everyone on the planet. We're deeply committed to open-source contributions and fostering a vibrant community, continuously enhancing our award-winning, developer-friendly API and the robust infrastructure behind it.

Alpaca is proudly backed by top-tier global investors, including Portage Ventures, Spark Capital, Tribe Capital, Social Leverage, Horizons Ventures, Unbound, SBI Group, Derayah Financial, Elefund, and Y Combinator.

Our Team Members:

We're a dynamic team of 230+ globally distributed members who thrive working from our favorite places around the world, with teammates spanning the USA, Canada, Japan, Hungary, Nigeria, Brazil, the UK, and beyond!

We're searching for passionate individuals eager to contribute to Alpaca's rapid growth. If you align with our core values—Stay Curious, Have Empathy, and Be Accountable—and are ready to make a significant impact, we encourage you to apply.

Your Role:

We are seeking an Analytics Engineer to own and execute the vision for our data transformation layer. You will be at the heart of our data platform, which processes hundreds of millions of events daily from a wide array of sources, including transactional databases, API logs, CRMs, payment systems, and marketing platforms.

You will join our 100% remote team and work closely with Data Engineers (who manage data ingestion) and Data Scientists and Business Users (who consume your data models). Your primary responsibility will be to use dbt and Trino on our GCP-based, open-source data infrastructure to build robust, scalable data models. These models are critical for stakeholders across the company—from finance and operations to the executive team—and are delivered via BI tools, reports, and reverse ETL systems.

Our team is 100% distributed and remote.

Responsibilities:

  • Own the Transformation Layer: Design, build, and maintain scalable data models using dbt and SQL to support diverse business needs, from monthly financial reporting to near-real-time operational metrics.
  • Set Technical Standards: Establish and enforce best practices for data modelling, development, testing, and monitoring to ensure data quality, integrity (up to cent-level precision), and discoverability.
  • Enable Stakeholders: Collaborate directly with finance, operations, customer success, and marketing teams to understand their requirements and deliver reliable data products.
  • Integrate and Deliver: Create repeatable patterns for integrating our data models with BI tools and reverse ETL processes, enabling consistent metric reporting across the business.
  • Ensure Quality: Champion high standards for development, including robust change management, source control, code reviews, and data monitoring as our products and data evolve.
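
The "cent-level precision" requirement above is the classic reason financial transformation layers avoid binary floating point for money and carry amounts as exact decimals or integer cents instead. A minimal stdlib illustration of why (this is a general numerics point, not Alpaca's actual implementation):

```python
from decimal import Decimal

# Binary floats drift at cent-level precision:
float_total = 0.1 + 0.2                  # not exactly 0.3
assert float_total != 0.3

# Decimal arithmetic keeps exact cent-level sums:
exact_total = Decimal("0.10") + Decimal("0.20")
assert exact_total == Decimal("0.30")

# Integer cents never drift, even across many small amounts.
fees_cents = [33] * 1000                 # 1,000 fees of $0.33 each
assert sum(fees_cents) == 33000          # exactly $330.00
```

The same choice shows up in warehouse schemas as `DECIMAL`/`NUMERIC` column types rather than `FLOAT`/`DOUBLE`.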

Must-Haves:

  • Core Experience: 3+ years of experience in data analytics or data engineering with a strong focus on the "T" (transformation) in ELT.
  • Expert SQL Skills: High fluency in SQL for complex queries and data manipulation on large datasets.
  • Analytics Engineering Fundamentals: Deep understanding of data modeling, transformation principles, and data engineering best practices (e.g., source control, code reviews, testing).
  • dbt Experience: Proven experience building scalable transformation layers using formalized SQL modeling tools, preferably dbt.
  • Technical Versatility:
  • Work Ethic: Comfortable with ambiguity, able to take ownership with minimal oversight, and adaptable in a fast-paced environment.

Nice to Haves:

  • Experience with data ingestion tools (e.g., Airbyte) and orchestration tools (e.g., Airflow).
  • Experience with semantic layer modelling (e.g. Cube, dbt Semantic Layer).

How We Take Care of You:

  • Competitive Salary & Stock Options
  • Health Benefits
  • New Hire Home-Office Setup: One-time USD $00
  • Monthly Stipend: USD 150 per month via a Brex Card

Alpaca is proud to be an equal opportunity workplace dedicated to pursuing and hiring a diverse workforce.

Recruitment Privacy Policy


Senior Analytics Engineer

Coinbase

Posted 6 days ago

Job Description

Ready to be pushed beyond what you think you're capable of?
At Coinbase, our mission is to increase economic freedom in the world. It's a massive, ambitious opportunity that demands the best of us, every day, as we build the emerging onchain platform - and with it, the future global financial system.
To achieve our mission, we're seeking a very specific candidate. We want someone who is passionate about our mission and who believes in the power of crypto and blockchain technology to update the financial system. We want someone who is eager to leave their mark on the world, who relishes the pressure and privilege of working with high caliber colleagues, and who actively seeks feedback to keep leveling up. We want someone who will run towards, not away from, solving the company's hardest problems.
Our culture is intense and isn't for everyone. But if you want to build the future alongside others who excel in their disciplines and expect the same from you, there's no better place to be.
While many roles at Coinbase are remote-first, we are not remote-only. In-person participation is required throughout the year. Team and company-wide offsites are held multiple times annually to foster collaboration, connection, and alignment. Attendance is expected and fully supported.
The Analytics Engineering team bridges the gap between data engineering, data science, and business analytics by building scalable, impactful data solutions. We transform raw data into actionable insights through robust pipelines, well-designed data models, and tools that empower stakeholders across the organization to make data-driven decisions.
Our team combines technical expertise with a deep understanding of the business to unlock the full potential of our data. We prioritize data quality, reliability, and usability, ensuring stakeholders can rely on our data to drive meaningful outcomes.
What We Do:

  • Trusted Data Sources: Develop and maintain foundational data models that serve as the single source of truth for analytics across the organization.
  • Actionable Insights: Empower stakeholders by translating business requirements into scalable data models, dashboards, and tools.
  • Cross-Functional Collaboration: Partner with engineering, data science, product, and business teams to ensure alignment on priorities and data solutions.
  • Scalable Data Products: Build frameworks, tools, and workflows that maximize efficiency for data users, while maintaining high standards of data quality and performance.
  • Outcome-Focused Solutions: Use modern development and analytics tools to deliver value quickly, while ensuring long-term maintainability.
What you'll be doing:

An analytics engineer is a hybrid Data Engineer/Data Scientist/Business Analyst role with the expertise to understand data flows end to end, and the engineering toolkit to extract the most value from them indirectly (building tables) or directly (solving problems, delivering insights).

  • Be the expert: Quickly build subject matter expertise in a specific business area and data domain. Understand the data flows from creation and ingestion through transformation and delivery. Examples:
    • Step into a new line of business and work with Engineering and Product partners to deliver the first data pipelines and insights.
    • Communicate with engineering teams to fix data gaps for downstream data users.
    • Take initiative and accountability for fixing issues anywhere in the stack.
  • Generate business value: Interface with stakeholders on data and product teams to deliver the most commercial value from data (directly or indirectly). Examples:
    • Build out a new data model allowing multiple downstream DS teams to more easily unlock business value through experimentation and ad hoc analysis.
    • Combine engineering details of the algo engine with statistics and data expertise to come up with feasible solutions for Engineering to make the algo better.
    • Work with PMs to tie together new x-PG and x-Product data into one holistic framework to optimize key financing product business metrics.
  • Focus on outcomes, not tools: Use a variety of frameworks and paradigms to identify the best-fit tools to deliver value. Examples:
    • Develop new abstractions (e.g. UDFs, Python packages, dashboards) to support scalable data workflows and infrastructure.
    • Stand up a framework for building data apps internally, enabling other DS teams to quickly add value.
    • Use established tools with mastery (e.g. Google Sheets, SQL) to quickly deliver impact when speed is the top priority.
What We Look For in You:

In addition to out-of-the-box thinking, attention to detail, a sense of urgency, and a high degree of autonomy and accountability, we expect you to have the following skills:

  • Data Modeling Expertise: Strong understanding of best practices for designing modular and reusable data models (e.g., star schemas, snowflake schemas).
  • Prompt Design and Engineering: Expertise in prompt engineering and design for LLMs (e.g., GPT), including creating, refining, and optimizing prompts to improve response accuracy, relevance, and performance for internal tools and use cases.
  • Advanced SQL: Proficiency in advanced SQL techniques for data transformation, querying, and optimization.
  • Intermediate to Advanced Python: Expertise in scripting and automation, with experience in Object-Oriented Programming (OOP) and building scalable frameworks.
  • Collaboration and Communication: Strong ability to translate technical concepts into business value for cross-functional stakeholders. Proven ability to manage projects and communicate effectively across teams.
  • Data Pipeline Development: Experience building, maintaining, and optimizing ETL/ELT pipelines using modern tools like dbt, Airflow, or similar.
  • Data Visualization: Proficiency in building polished dashboards using tools like Looker, Tableau, Superset, or Python visualization libraries (Matplotlib, Plotly).
  • Development Tools: Familiarity with version control (GitHub), CI/CD, and modern development workflows.
  • Data Architecture: Knowledge of modern data lake/warehouse architectures (e.g., Snowflake, Databricks) and transformation frameworks.
  • Business Acumen: Ability to understand and address business challenges through analytics engineering.
  • Data Savvy: Familiarity with statistics and probability.
  • Bonus Skills: Experience with cloud platforms (e.g., AWS, GCP); familiarity with Docker or Kubernetes.
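
The "star schema" mentioned above is a fact table of events keyed into small, denormalized dimension tables, so analysts can aggregate the fact and group by dimension attributes. A toy sketch using SQLite (all table and column names are invented for illustration, not Coinbase's models):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Dimension: one row per asset, with descriptive attributes.
conn.execute(
    "CREATE TABLE dim_asset (asset_key INTEGER PRIMARY KEY, symbol TEXT, asset_type TEXT)"
)
# Fact: one row per trade, with a foreign key into the dimension.
conn.execute(
    "CREATE TABLE fct_trades (trade_id INTEGER, asset_key INTEGER, quantity REAL, usd_value REAL)"
)
conn.executemany("INSERT INTO dim_asset VALUES (?, ?, ?)",
                 [(1, "BTC", "crypto"), (2, "ETH", "crypto")])
conn.executemany("INSERT INTO fct_trades VALUES (?, ?, ?, ?)",
                 [(100, 1, 0.5, 15000.0), (101, 2, 2.0, 6000.0), (102, 1, 0.1, 3000.0)])

# Typical star-schema query: aggregate the fact, group by dimension attributes.
rows = conn.execute("""
    SELECT d.symbol, SUM(f.usd_value) AS total_usd
    FROM fct_trades f
    JOIN dim_asset d ON d.asset_key = f.asset_key
    GROUP BY d.symbol
    ORDER BY d.symbol
""").fetchall()
# rows -> [('BTC', 18000.0), ('ETH', 6000.0)]
```

The design choice is that facts stay narrow and append-only while dimensions carry the human-readable attributes, which keeps downstream dashboards and ad hoc queries simple.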
Disclaimer: Applying for a specific role does not guarantee consideration for that exact position. Leveling and team matching are assessed throughout the interview process.
ID: G2754
Pay Transparency Notice: The target annual salary for this position can range as detailed below. Full-time offers from Coinbase also include bonus eligibility + equity eligibility + benefits (including medical, dental, and vision).
Pay Range:
$191,000-$191,000 CAD
Please be advised that each candidate may submit a maximum of four applications within any 30-day period. We encourage you to carefully evaluate how your skills and interests align with Coinbase's roles before applying.
Commitment to Equal Opportunity
Coinbase is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, creed, gender, national origin, age, disability, veteran status, sex, gender expression or identity, sexual orientation or any other basis protected by applicable law. Coinbase will also consider for employment qualified applicants with criminal histories in a manner consistent with applicable federal, state and local law. For US applicants, you may view the in certain locations, as required by law.
Coinbase is also committed to providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation because of a disability for any part of the employment process, please contact us at accommodations(at)coinbase.com.

Data Privacy Notice for Job Candidates and Applicants
Depending on your location, the General Data Protection Regulation (GDPR) and California Consumer Privacy Act (CCPA) may regulate the way we manage the data of job applicants. Our full notice outlining how data will be processed as part of the application procedure for applicable locations is available.

Disclosure
For select roles, Coinbase is piloting an AI tool based on machine learning technologies to conduct initial screening interviews to qualified applicants. The tool simulates realistic interview scenarios and engages in dynamic conversation. A human recruiter will review your interview responses, provided in the form of a voice recording and/or transcript, to assess them against the qualifications and characteristics outlined in the job description.
For select roles, Coinbase is also piloting an AI interview intelligence platform to transcribe and summarize interview notes, allowing our interviewers to fully focus on you as the candidate.
The above pilots are for testing purposes and Coinbase will not use AI to make decisions impacting employment. To request a reasonable accommodation due to disability, please contact accommodations(at)coinbase.com

Financial Analytics Engineer

Toronto, Ontario - Compass Group

Posted 3 days ago

Job Description

We are CDAI—the data and artificial intelligence engine of Compass Group North America. We design and deliver custom, in-house solutions tailored to the unique complexities of food service and hospitality. Our work is grounded in strong data foundations, layered with AI to enhance forecasting, streamline operations, and enable better, faster decision-making across Compass Group. With deep integration into the business and a commitment to white-glove service, CDAI empowers associates, clients, and customers through innovative, future-forward technologies.

**Job Summary**

As an Analytics Engineer, you will be responsible for developing and maintaining our data transformation layer, ensuring data quality, and optimizing data pipelines for our analytics, reporting, and data science teams. You will work closely with our cross-functional teams to ensure that our data is accurate, reliable, and accessible. You will also be responsible for identifying opportunities to improve our data infrastructure and recommending new technologies and tools.

**Now, if you were to come on board as one of our Analytics Engineers, we’d ask you to do the following for us:**

- Develop and maintain data transformation pipelines using dbt to support our analytics, reporting and data science teams
- Ensure the accuracy, completeness, and consistency of our data by implementing data quality checks and processes
- Optimize data pipelines for performance and scalability, ensuring that our data is accessible and usable for our cross-functional teams
- Collaborate with cross-functional teams to identify and solve data-related problems and provide insights on data infrastructure improvements
- Monitor and troubleshoot data pipeline issues and work to resolve them in a timely manner
- Develop and maintain technical documentation for our data infrastructure, including data models, schema definitions, and transformation processes

**Think you have what it takes to be our Analytics Engineer? We’re committed to hiring the best talent for the role. Here’s how we’ll know you’ll be successful:**

- Bachelor's degree in Computer Science, Software Engineering, or a related field
- 3+ years of experience in analytics engineering
- Strong programming skills in SQL and Python, experience with dbt is a bonus
- Experience with data modeling, database design, and data warehousing concepts
- Strong problem-solving and analytical skills with a keen attention to detail
- Strong communication and collaboration skills with the ability to work effectively in a team environment

Compass Group Canada is committed to nurturing a diverse workforce representative of the communities within which we operate. We encourage and are pleased to consider all qualified candidates, without regard to race, colour, citizenship, religion, sex, marital / family status, sexual orientation, gender identity, aboriginal status, age, disability or persons who may require an accommodation, to apply.

For accommodation requests during the hiring process, please contact for further information.

Senior Analytics Engineer

Toronto, Ontario - Clutch Technologies Inc.

Posted today

Job Description

About Clutch:

Clutch is Canada's largest online used car retailer, delivering a seamless, hassle-free car-buying experience to drivers everywhere. Customers can browse hundreds of cars from the comfort of their home, get the right one delivered to their door, and enjoy peace of mind with our 10-Day Money-Back Guarantee… and that's just the beginning.

Named one of Canada's Top Growing Companies two years in a row and awarded a spot on LinkedIn's Top Canadian Startups list, we're looking to add curious, hard-working, and driven individuals to our growing team.

Headquartered in Toronto, Clutch was founded in 2017. Clutch is backed by a number of world-class investors, including Canaan, BrandProject, Real Ventures, D1 Capital, and Upper90. To learn more, visit clutch.ca.

What you'll do:

  • Write clean, readable, and testable code that adheres to best practices, ensuring a high degree of reliability.
  • Collaborate with the team to shape the future of our codebase by giving input into designing and implementing scalable and secure architectures that meet the needs of our growing business.
  • Work closely with stakeholders to understand their requirements and deliver improvements to the online customer experience.
  • Utilize your expertise and experience to engage in peer review sessions, provide constructive feedback, and participate in system design discussions.
  • Lead the complete development lifecycle of projects, starting from the initial planning phase, through development and testing, and into maintenance. This involves ensuring project milestones are met, coordinating with cross-functional teams, and driving successful project outcomes.

What we're looking for:

  • B.S. degree in Software Engineering or equivalent experience
  • 4+ years relevant industry experience developing software solutions
  • Proficiency in one or more modern programming languages (e.g. TypeScript, Python, Go, Ruby, C#, Rust)
  • Strong understanding of frontend and/or backend frameworks such as React/Vue, Express/Flask, or Ruby on Rails
  • Familiarity with cloud platforms such as AWS (Amazon Web Services), Azure, or Google Cloud Platform.
  • Experience working with relational databases like PostgreSQL, MySQL, or Oracle, and in writing efficient SQL queries, designing database schemas, and optimizing database performance.
  • Experience in optimizing application performance, identifying and resolving bottlenecks.
  • Knowledge of testing frameworks and methodologies for unit testing, integration testing, and end-to-end testing.
  • Experience working in an Agile development environment, following methodologies such as Scrum or Kanban, and using tools like Jira or Github Projects for project management and collaboration.

Why you'll love it at Clutch:

  • Autonomy & ownership -- create your own path, and own your work
  • Competitive compensation and equity incentives!
  • Generous time off program
  • Health & dental benefits

Clutch is committed to fostering an inclusive workplace where all individuals have an opportunity to succeed. If you require accommodation at any stage of the interview process, please email .


Senior Analytics Engineer

Oakville, Manitoba - Lotlinx

Posted today

Job Description

Since our founding in 2012, Lotlinx has consistently pioneered advancements in the automotive landscape. We specialize in empowering automobile dealers and manufacturers by providing cutting-edge data and technology, delivering a distinct market advantage for every single vehicle transaction. Today, we stand as the foremost automotive AI- and machine-learning-powered technology company, excelling in digital marketing, risk management, and strategic inventory management.

Lotlinx provides employees with a dynamic work environment that is challenging, team-oriented, and full of passionate people. We offer great incentives to our employees, such as competitive compensation and benefits, flex time off, and career development opportunities.

Position Summary

We are seeking an experienced Senior Analytics Engineer to join our growing Data team. You will play a pivotal role in architecting, building, and optimizing the data foundations that power analytics and data-driven decision-making across LotLinx. Reporting to the Director of Data Analytics, you will collaborate closely with Data Analysts, Data Engineers, and Product Managers to translate business needs into robust, scalable, and reliable data models and pipelines that are incorporated into our product portfolio. This is a key position where you'll have significant ownership and impact on our data infrastructure and strategy.

This is a hybrid role and requires 3-4 days a week in one of our Winnipeg, Hamilton, or Vancouver offices.

Responsibilities
  • Architect & Build Data Models: Design, develop, and maintain scalable and performant data models in our data warehouse (Google BigQuery, Apache Pinot) to serve as the single source of truth for analytics.
  • Data Analysis: Conduct data validation and exploratory analysis across massive datasets (billions of rows, terabytes) to ensure the integrity of data pipelines and the accuracy of downstream reporting.
  • Develop Large Data Pipelines: Develop, monitor, and troubleshoot ELT/ETL pipelines processing high-volume data streams, ensuring reliability and performance at the terabyte scale.
  • Optimize Pipeline Performance: Optimize complex SQL queries and data transformation logic for maximum performance and cost-efficiency using multi-terabyte datasets within Google BigQuery.
  • Enhance OLAP performance: Analyze Apache Pinot query performance logs and usage patterns across terabyte-scale datasets to identify optimization opportunities and troubleshoot complex data access issues.
  • Enable Data Consumers: Partner with data analysts, data scientists, and business stakeholders to understand their data requirements, providing clean, well-documented, and easy-to-use datasets.
  • Champion Data Quality & Governance: Implement data quality checks, testing frameworks, and documentation standards to ensure the trustworthiness and usability of our data assets.
  • Collaborate & Mentor: Work effectively within a collaborative team environment. Potentially mentor junior team members and share best practices in analytics engineering.
  • Stay Current: Keep abreast of new technologies, tools, and best practices in the analytics engineering space and advocate for their adoption where relevant.
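
A common concrete form of the "data validation" work above is reconciling row counts and control totals between a pipeline's source and target after each load. A minimal sketch of that pattern, scaled down from the terabyte case and using hypothetical table names (not Lotlinx's actual pipelines):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE source_events (event_id INTEGER, amount REAL)")
conn.execute("CREATE TABLE target_events (event_id INTEGER, amount REAL)")

data = [(i, float(i)) for i in range(1, 101)]
conn.executemany("INSERT INTO source_events VALUES (?, ?)", data)
# Simulate a lossy load: the target is missing the last row.
conn.executemany("INSERT INTO target_events VALUES (?, ?)", data[:-1])

def reconcile(conn, src, tgt):
    # Compare row counts and a control total between source and target.
    s_count, s_sum = conn.execute(f"SELECT COUNT(*), SUM(amount) FROM {src}").fetchone()
    t_count, t_sum = conn.execute(f"SELECT COUNT(*), SUM(amount) FROM {tgt}").fetchone()
    return {"rows_missing": s_count - t_count, "amount_gap": s_sum - t_sum}

report = reconcile(conn, "source_events", "target_events")
# report -> {'rows_missing': 1, 'amount_gap': 100.0}
```

In a warehouse like BigQuery the same check runs as scheduled SQL, and a non-zero gap pages the on-call engineer rather than raising an exception.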
Qualifications

  • Experience: 4+ years of relevant professional experience in Analytics Engineering, Data Engineering, or a highly related role, with a proven track record of building and managing complex data systems.

  • Expert SQL: Deep expertise in writing complex, highly performant SQL for data transformation, aggregation, and analysis, particularly within a cloud data warehouse environment like BigQuery.
  • Performance Optimization: Demonstrated experience writing, tuning, and debugging complex SQL queries specifically for large-scale data warehouses (multi-terabyte environments), preferably BigQuery or Apache Pinot.
  • Data Modeling Mastery: Strong understanding of data modeling concepts (e.g., Kimball, Inmon, Data Vault) and practical experience designing and implementing warehouse schemas.
  • ETL/ELT & Orchestration: Proven experience building and maintaining data pipelines using relevant tools and frameworks. Python proficiency for scripting and data manipulation is essential.
  • Cloud Data Warehousing: Significant experience working with cloud data warehouses, specifically Google BigQuery. Understanding of underlying architecture and optimization techniques.
  • Problem-Solving: Excellent analytical and problem-solving skills, with the ability to troubleshoot complex data issues independently.
  • Communication & Collaboration: Strong communication skills, capable of explaining complex technical concepts to both technical and non-technical audiences. Proven ability to collaborate effectively across teams.
Nice-to-Haves
  • Previous experience in the Automotive or AdTech industry.
  • Familiarity with workflow orchestration tools like Apache Airflow.
  • Experience with Apache Pinot.
  • Experience with data quality and testing frameworks.
  • Familiarity with Google Cloud Platform (GCP) services beyond BigQuery.
Our Tech Stack
  • Cloud/Warehouse: Google Cloud Platform (GCP), AWS, Google BigQuery, Apache Pinot
  • Transformation: SQL, Python
  • Orchestration: Airflow, Cloud Composer
  • Ingestion: Custom Scripts, Pub/Sub, Lambda

Other: GitLab, Kubernetes, Apache Pinot, OpenSearch

The salary range for this position is $103,000 - $157,000, with an annual target bonus.

Lotlinx is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.

Lotlinx is not currently able to offer sponsorship for employment visa status.

Lotlinx is headquartered in Peterborough, NH and has locations in Holmdel NJ, Manitoba, Ontario and British Columbia, Canada in addition to a large team spanning from the US to Canada.

Our success relies heavily on our customers but also our dedicated talent that continuously moves our platform forward. We value our employees, their abilities and seek to foster an open, cooperative, dynamic environment where the team and company alike can thrive.


Data Analytics Engineer

Kitchener, British Columbia - Trusscore

Posted today

Job Description

Salary:

Who We Are

Trusscore is a material science company for sustainable building materials. We are revolutionizing the way people build, with the ultimate vision of creating a market-leading, sustainable, branded, true painted-drywall alternative. Trusscore invests heavily in R&D to discover breakthroughs in materials and manufacturing processes, advancing the state of the art in science and technology along the way.

Become part of a passionate, innovative team focused on reshaping the future of building materials. We offer a dynamic, flexible work environment, opportunities for personal and professional growth, and the chance to make a tangible impact in an established, fast-growing company with a focus on sustainability and innovation.



Your Mission

Trusscore is redefining building-product manufacturing through digitization, advanced automation, and AI-enhanced process control and detection. Your mission as a data analytics leader is to convert data from ERP, proprietary applications, SCADA feeds, and real-time ML models into decisive insights that drive business impact.

You'll inject analytical firepower into an experienced team of industrial, mechatronics, and controls engineers, developing and deploying powerful analytics tools to pinpoint and unlock value-creation and value-capture opportunities that efficiently scale Trusscore's manufacturing and operations.

As a data technology leader, you will apply your expertise in data engineering, analytics, and visualization. You will maintain and build upon our custom data collection platform, build models and automated dashboards in cloud-based business intelligence tools, and analyze manufacturing data to identify and drive efficiency improvements.

You will collaborate closely and cross-functionally with Manufacturing, Quality, Marketing, IT, and external parties to enhance our data infrastructure. Your work will be critical to creating the digital backbone enabling Trusscore's manufacturing innovations.

This role demands both strategic thinking and hands-on execution. You'll leverage your knowledge of data modeling, systems integration, and process analytics while continuing to grow your skills in cloud architecture, automation, and operational intelligence. Your success will be measured by the timely delivery of data tools that are accurate, accessible, and impactful for decision-making across the business.

Join us and help shape the future of building materials - through data.

Responsibilities

  • Lead the development of Trusscore's data infrastructure to support real-time manufacturing insights and data-driven decision-making.
  • Design and maintain scalable data pipelines and integrations between MS SQL Server, Power Apps, ERP systems (e.g., NetSuite), SCADA platforms (e.g., Ignition), and manufacturing equipment.
  • Build interactive dashboards and automated reports using Power BI, Oracle Analytics, and NetSuite Analytics Warehouse to enable performance monitoring and process optimization.
  • Develop and maintain SQL and Python-based data transformation workflows to standardize, clean, and model raw data for downstream analysis.
  • Automate repetitive data tasks and manual processes using tools such as Power Automate, Celigo, or custom scripts in Python or JavaScript.
  • Identify opportunities to improve production efficiency, quality, and throughput by analyzing large, complex data sets from various manufacturing systems.
  • Work with peers to define the governance of data architectures for consistency and reliability; accelerate growth by co-creating a playbook to standardize the design, approval, and deployment of analytics tools and practices.
  • Apply basic machine learning frameworks (e.g., TensorFlow, Scikit-learn) for predictive modeling and advanced analytics use cases, where applicable.
  • Lead continuous improvement projects that evolve our data platform capabilities, support digital transformation, and unlock new levels of operational intelligence.
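As a rough sketch of the SQL- and Python-based transformation work described above, the example below standardizes a raw machine-data extract with pandas. The column names and the -999 sentinel are hypothetical illustrations, not Trusscore's actual schema:

```python
import pandas as pd

# Hypothetical raw SCADA export: string timestamps, duplicated
# transmissions, and a -999 sentinel marking a bad sensor read.
raw = pd.DataFrame({
    "ts": ["2024-01-01 00:00", "2024-01-01 00:00", "2024-01-01 00:05"],
    "line_id": ["L1", "L1", "L1"],
    "temp_c": [181.2, 181.2, -999.0],
})

def standardize(df: pd.DataFrame) -> pd.DataFrame:
    """Standardize raw machine data for downstream modeling."""
    out = df.copy()
    out["ts"] = pd.to_datetime(out["ts"])                 # enforce a real timestamp type
    out = out.drop_duplicates()                           # collapse repeated transmissions
    out.loc[out["temp_c"] < 0, "temp_c"] = float("nan")   # mark sentinel reads as missing
    return out.reset_index(drop=True)

clean = standardize(raw)
print(len(clean))                         # 2 rows after de-duplication
print(int(clean["temp_c"].isna().sum()))  # 1 sentinel flagged as missing
```

In practice the same cast/dedupe/flag steps would typically live in a scheduled pipeline rather than a one-off script.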

Qualifications & Skills

  • Bachelor's or Master's degree in Mechatronics, Systems Design, Software, Computer, or Management Engineering disciplines, or Mathematics (or equivalent experience)
  • Minimum 3 years of experience as a Data Analyst, Data Scientist, or Industrial Engineer
  • Familiarity working in an Industrial or Manufacturing environment
  • Self-starting, team player, organized and detail oriented
  • Data Management & Programming: Required: database infrastructure/hosting architecture, SQL, Python; Desired: Java, C++, React, Node.js, Celigo, JSON, REST API
  • Microsoft 365 Apps: Required: Teams, Excel; Desired: Power Apps, Power Automate
  • Business Intelligence: (Power BI, Oracle Analytics, NetSuite Analytics Warehouse (NSAW))
  • Knowledge of Machine Learning Frameworks (TensorFlow, PyTorch, Apache Spark, Scikit-learn, etc.)
  • Familiarity with ERP systems (NetSuite)

Other preferred qualifications:

  • Understanding of basic networking protocols and network infrastructure technologies (IP, LAN, Routing, VPN, DNS, etc.)
  • Understanding of SCADA system (Ignition)
  • Machine Data Integration & Manufacturing Execution System
  • Web App Developer (Frontend & Backend)


Location & Travel

  • Hybrid working from the office in Kitchener, ON (The Tannery Building) and from home
  • Occasional travel may be required to support offsite activities with third party vendors, contractors, other Trusscore facilities, or for other business reasons



Internal Trusscore Employee Applicants: please email your manager if you're interested in applying to this position.


Senior data and analytics engineer

H3B Quebec, Quebec LARGIER CONSEILS

Posted 26 days ago

Job Description

We are seeking a Senior Data and Analytics Engineer for our client for a 6-month renewable contract (until January 31, 2026 + renewal option).

Contract Details:

- 35 hours per week
- Hybrid work model: 50% on-site at any of the Montreal, Brossard, Valcourt, or Sherbrooke offices (to be confirmed with the hiring manager)
- Estimated hourly rate by our recruitment firm: $80-85/hour
- This is an incorporated consultant contract (an employment contract is also possible)

Application Requirements:

- Resume in English
- Language: English-speaking or bilingual

Profile:

- Expertise in companies with 10,000+ employees
- 5+ years of experience as a data & analytics engineer
- Solid, demonstrated experience with the following technologies: Snowflake, dbt
- Fluent in the following programming languages: SQL, Python
- Proficient in writing, executing, and optimizing complex SQL queries
- Experience with data profiling and other means of validating data quality
- Familiar with DevOps practices
- Experience working within an agile team to build big data/analytics solutions
- Experience with SAP and Power BI; AI/ML knowledge

Additional Assets:

- Expertise in the manufacturing/transportation/automotive/aerospace industries
- Experience with international companies operating across multiple continents
- Multi-site enterprise experience
- Knowledge of the following programming languages is a plus: JavaScript/Node.js, Java, DAX
- Knowledge of the following technologies is a plus: Azure, Microsoft SQL Server, Power BI

Lead Data and Analytics Engineer, BI

Stantec

Posted 21 days ago

Job Description

At Stantec, we have some of the world's leading professionals passionate about enabling our business to be its best. Our business teams include finance, procurement, human resources, information technology, marketing, corporate development, HSSE, real estate, legal, and practice services. We bring diverse backgrounds, skills, and expertise and create a caring culture where everyone can thrive. Through teamwork and collaboration, we're building a stronger, more resilient Stantec every day.   
Your Opportunity
We are looking for a Lead Data and Analytics Engineer, BI on our Marketing Business Intelligence team.
Your Key Responsibilities
Strategy
- Lead the design, development, and evolution of our enterprise data warehouse solutions, ensuring scalability, performance, security, and data quality.
- Define and implement data modeling strategies (dimensional, Kimball, Inmon) optimized for analytical workloads and business intelligence reporting.
- Develop comprehensive ETL/ELT strategies and pipelines to ingest, transform, and load data from diverse sources, including our custom .NET CRM, custom proposal resource center, Oracle ERP, and other operational systems.
- Consult on BI capabilities and recommend and execute solutions to address business needs and performance requirements
- Extend the Data Warehouse with data from new sources, applying data lineage analysis.
Technical
- Design, develop, and maintain robust and scalable data models (star/snowflake schemas) in Power BI using Power Query (M), DAX, and Dataflows.
- Lead the development of data pipelines using Azure Data Factory, Synapse Analytics, and SQL Server to source, transform, and load data into BI environments.
- Implement granular row-level security (RLS) to accurately reflect our matrix organizational leadership structure and robust Role-Based Access Control (RBAC).
- Leverage SQL Server (on-premise or Azure SQL Managed Instance) for foundational data storage and data warehousing components.
- Utilize Azure DevOps / Git for version control, CI/CD pipelines, and collaborative development practices.
- Conduct peer reviews of models, reports, and DAX logic for performance and accuracy.
- Ensure solutions align with enterprise architecture standards, including metadata management, source control, and documentation.
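The dimensional-modeling work above (star schemas with fact and dimension tables) can be sketched independently of any particular BI tool. A minimal pandas illustration, using made-up project data rather than Stantec's actual sources:

```python
import pandas as pd

# Hypothetical flat extract from a CRM/ERP feed.
flat = pd.DataFrame({
    "project": ["Bridge A", "Bridge A", "Tower B"],
    "region":  ["West", "West", "East"],
    "revenue": [100.0, 150.0, 200.0],
})

# Dimension table: one row per distinct project/region combination,
# with a surrogate key for joins.
dim_project = (flat[["project", "region"]]
               .drop_duplicates()
               .reset_index(drop=True))
dim_project["project_key"] = dim_project.index

# Fact table: measures plus foreign keys only.
fact_revenue = flat.merge(dim_project, on=["project", "region"])
fact_revenue = fact_revenue[["project_key", "revenue"]]

print(len(dim_project))               # 2 dimension rows
print(fact_revenue["revenue"].sum())  # 450.0
```

The same split (descriptive attributes in the dimension, measures keyed to it in the fact) is what a Power BI star-schema model expresses at larger scale.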
Communication
- Collaborate closely with marketing and sales teams to identify key data points that can inform winning strategies, and design data models that expose these insights.
- Effectively communicate highly technical information to functional and technical team members and to management
- Act as a subject matter expert and technical leader, providing guidance and mentorship to other team members on data architecture and BI best practices.
- Maintain a positive attitude when dealing with others in a pressured environment
Education and Experience
- Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, Engineering, or a related quantitative field.
- 8+ years of progressive experience in data warehousing, business intelligence, and data analytics roles, with at least 3+ years in a lead or architectural capacity.
- Demonstrated experience integrating complex data from Oracle ERP, CRM systems, and bespoke applications into the Microsoft BI stack is a strong advantage.
- Proven experience designing and implementing enterprise BI systems using the Microsoft Power Platform (Power BI, Power Query, DAX) and Azure Data Services (ADF, Synapse, SQL Server)
- Strong proficiency in data modeling techniques (dimensional modeling, star/snowflake schemas).
- Experience with enterprise BI governance, dataset certifications, naming conventions, and lifecycle management.
- Experience in data security, including row-level security, workspace roles, and deployment pipelines in Power BI.
- Familiarity with CI/CD pipelines and DevOps practices for Power BI and Azure data pipelines is an asset.
This description is not a comprehensive listing of activities, duties or responsibilities that may be required of the employee and other duties, responsibilities and activities may be assigned or may be changed at any time with or without notice.
**Pay Transparency:** In compliance with pay transparency laws, pay ranges are provided for positions in locations where required. Please note, the final agreed upon compensation is based on individual education, qualifications, experience, and work location. At Stantec certain roles are bonus eligible.
**Benefits Summary:** Regular full-time and part-time employees (working at least 20 hours per week) will have access to health, dental, and vision plans, a wellness program, health care spending account, wellness spending account, group registered retirement savings plan, employee stock purchase program, group tax-free savings account, life and accidental death & dismemberment (AD&D) insurance, short-term/long-term disability plans, emergency travel benefits, tuition reimbursement, professional membership fee coverage, and paid time off.
Temporary/casual employees will have access to group registered retirement savings plan, employee stock purchase program, and group tax-free savings account.
The benefits information listed above may not apply to union positions because benefits for such positions are governed by applicable collective bargaining agreements.
**Primary Location:** Canada |
**Organization:** 1196 Marketing & Communications-CA Corporate-Edmonton AB
**Employee Status:** Regular
**Travel:** No
**Schedule:** Full time
**Job Posting:** 18/07/ :07:22
**Req ID:**

Business Intelligence Engineer

Mississauga, Ontario Compass Group

Posted 3 days ago

Job Description

We are CDAI—the data and artificial intelligence engine of Compass Group North America. We design and deliver custom, in-house solutions tailored to the unique complexities of food service and hospitality. Our work is grounded in strong data foundations, layered with AI to enhance forecasting, streamline operations, and enable better, faster decision-making across Compass Group. With deep integration into the business and a commitment to white-glove service, CDAI empowers associates, clients, and customers through innovative, future-forward technologies.

# **Job Summary**

As a BI Engineer at Compass Digital, you will be responsible for designing, developing, and maintaining the company's business intelligence solutions and data infrastructure. You will work closely with business and technical stakeholders from various teams to lead requirements gathering, build data pipelines, and create reports and dashboards to support data-driven decision-making within the organization. The ideal candidate is able to independently take the lead on data projects while maintaining strong communication with all parties.

**Now, if you were to come on board as one of our BI Engineers, we’d ask you to do the following for us:**

- Design, develop, and maintain interactive and visually appealing reports and dashboards using BI tools like Power BI and Looker
- Perform SQL (dbt) transformation work to create data models
- Lead meetings and requirements gathering efforts with business stakeholders to understand reporting requirements.
- Collaborate with technical data team members from business intelligence, data engineering and data science teams
- Document report specifications and user guidelines.
- Provide training and support to end-users for self-service reporting.
- Take ownership of BI projects from inception to delivery, ensuring quality, accuracy, and timely completion
- Able to handle multiple projects and prioritize tasks
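As a rough illustration of the SQL (dbt) transformation work listed above: a dbt staging model is essentially a SELECT that casts, filters, and standardizes source columns. The same idea sketched with Python's built-in sqlite3 (table and column names are made up for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amt TEXT, status TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, "10.50", "COMPLETE"), (2, "bad", "COMPLETE"), (3, "5.00", "cancelled")],
)

# A dbt-style staging model: cast, filter, and standardize in one SELECT.
conn.execute("""
    CREATE VIEW stg_orders AS
    SELECT id,
           CAST(amt AS REAL) AS amount,        -- enforce numeric type
           LOWER(status)     AS order_status   -- standardize casing
    FROM raw_orders
    WHERE amt GLOB '[0-9]*'                    -- drop unparseable amounts
""")

rows = conn.execute("SELECT id, amount, order_status FROM stg_orders").fetchall()
print(rows)  # [(1, 10.5, 'complete'), (3, 5.0, 'cancelled')]
```

In dbt the view body would live in its own model file and be materialized by `dbt run`; the transformation logic is the same.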

**Think you have what it takes to be our BI Engineer? We’re committed to hiring the best talent for the role. Here’s how we’ll know you’ll be successful in the role.**

- Expertise in Power BI development, with administration experience as a bonus (or other comparable enterprise BI tools)
- Strong proficiency in SQL for data querying and manipulation, bonus for dbt experience
- Excellent grasp of data modeling best practices and optimization
- Knowledge of data warehousing concepts and technologies (e.g., Snowflake, Redshift, BigQuery).
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities

Compass Group Canada is committed to nurturing a diverse workforce representative of the communities within which we operate. We encourage and are pleased to consider all qualified candidates, without regard to race, colour, citizenship, religion, sex, marital / family status, sexual orientation, gender identity, aboriginal status, age, disability or persons who may require an accommodation, to apply.

For accommodation requests during the hiring process, please contact for further information.