34 Analytics Engineer jobs in Canada
Analytics Engineer
Posted 4 days ago
Job Description
As an Analytics Engineer, you will be responsible for developing and maintaining our data transformation layer, ensuring data quality, and optimizing data pipelines for our analytics, reporting, and data science teams. You will work closely with our cross-functional teams to ensure that our data is accurate, reliable, and accessible. You will also be responsible for identifying opportunities to improve our data infrastructure and recommending new technologies and tools.
**Now, if you were to come on board as one of our Analytics Engineers, we’d ask you to do the following for us:**
- Develop and maintain data transformation pipelines using dbt to support our analytics, reporting and data science teams
- Ensure the accuracy, completeness, and consistency of our data by implementing data quality checks and processes
- Optimize data pipelines for performance and scalability, ensuring that our data is accessible and usable for our cross-functional teams
- Collaborate with cross-functional teams to identify and solve data-related problems and provide insights on data infrastructure improvements
- Monitor and troubleshoot data pipeline issues and work to resolve them in a timely manner
- Develop and maintain technical documentation for our data infrastructure, including data models, schema definitions, and transformation processes
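The "data quality checks" bullet above can be illustrated with a minimal sketch. This is a generic example, not this employer's actual tooling; in practice these checks are usually declared as dbt tests, and the table and column names here are hypothetical.

```python
# Minimal sketch of two common data quality checks: not-null and
# uniqueness, applied to a batch of records represented as dicts.

def check_not_null(rows, column):
    """Return the rows where `column` is missing or None."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Return the values of `column` that appear more than once."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

# Hypothetical orders table with one null and one duplicate key.
orders = [
    {"order_id": 1, "customer_id": "a"},
    {"order_id": 2, "customer_id": None},
    {"order_id": 2, "customer_id": "b"},
]

null_failures = check_not_null(orders, "customer_id")
dupe_failures = check_unique(orders, "order_id")
print(len(null_failures), dupe_failures)  # 1 [2]
```

A pipeline would typically fail or alert when either list is non-empty, which is exactly what dbt's built-in `not_null` and `unique` tests do declaratively.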
**Think you have what it takes to be our Analytics Engineer? We’re committed to hiring the best talent for the role. Here’s how we’ll know you’ll be successful in the role.**
- Bachelor’s degree in Computer Science, Software Engineering, or a related field
- 3+ years of experience in analytics engineering
- Strong programming skills in SQL and Python; experience with dbt is a bonus
- Experience with data modeling, database design, and data warehousing concepts
- Strong problem-solving and analytical skills with a keen attention to detail
- Strong communication and collaboration skills with the ability to work effectively in a team environment
Analytics Engineer
Posted 1 day ago
Job Description
About Clutch:
Clutch is Canada's largest online used car retailer, delivering a seamless, hassle-free car-buying experience to drivers everywhere. Customers can browse hundreds of cars from the comfort of their home, get the right one delivered to their door, and enjoy peace of mind with our 10-Day Money-Back Guarantee… and that's just the beginning.
Named one of Canada's Top Growing Companies two years in a row and awarded a spot on LinkedIn's Top Canadian Startups list, we're looking to add curious, hard-working, and driven individuals to our growing team.
Headquartered in Toronto, Clutch was founded in 2017. Clutch is backed by a number of world-class investors, including Canaan, BrandProject, Real Ventures, D1 Capital, and Upper90. To learn more, visit clutch.ca.
What You'll Do
- Build, maintain, and optimize our analytics data stack, including ELT pipelines, dimensional models, and semantic layers using tools like dbt, Snowflake, and Airflow.
- Collaborate with teams across Marketing, Product, Operations, and Finance to define key metrics, improve reporting fidelity, and unlock self-serve analytics.
- Own and evolve core data models and pipelines that drive reporting, dashboards, and downstream analysis.
- Apply best practices around data testing, observability, and CI/CD to ensure reliable, version-controlled, production-ready analytics code.
- Partner with stakeholders to understand business needs and translate them into performant and trustworthy data products.
- Contribute to and advocate for Clutch's internal data documentation and governance practices.
- Mentor others on data modeling, tooling, and process — bringing a scalable, engineering-first mindset to analytics.
What We're Looking For
- 4–6 years of experience in analytics engineering, data analytics, or data engineering — ideally in fast-paced or high-growth environments.
- Strong SQL expertise and comfort working with large-scale, normalized, or dimensional datasets.
- Hands-on experience building data models with dbt in a production environment (CI, testing, documentation).
- Experience working with cloud data warehouses like Snowflake, BigQuery, or Redshift.
- Experience building semantic layers and enabling self-serve BI (Looker, Sigma, Tableau).
- A proactive communicator who can collaborate with business stakeholders and technical teams alike.
- A strong sense of ownership and accountability for data quality, performance, and maintainability.
- Familiarity with using Python and orchestration tools (Airflow, Prefect, Dagster) for data pipelines and transformation is a plus
Why you'll love it at Clutch:
- Autonomy & ownership -- create your own path, and own your work
- Competitive compensation and equity incentives!
- Generous time off program
- Health & dental benefits
Clutch is committed to fostering an inclusive workplace where all individuals have an opportunity to succeed. If you require accommodation at any stage of the interview process, please contact us by email.
Senior Analytics Engineer
Posted 1 day ago
Job Description
Since our founding in 2012, Lotlinx has consistently pioneered advancements in the automotive landscape. We specialize in empowering automobile dealers and manufacturers by providing cutting-edge data and technology, delivering a distinct market advantage for every single vehicle transaction. Today, we stand as the foremost automotive AI and machine learning powered technology, excelling in digital marketing, risk management, and strategic inventory management.
Lotlinx provides employees with a dynamic work environment that is challenging, team-oriented, and full of passionate people. We offer great incentives to our employees, such as competitive compensation and benefits, flex time off, and career development opportunities.
Position Summary
We are seeking an experienced Senior Analytics Engineer to join our growing Data team. You will play a pivotal role in architecting, building, and optimizing the data foundations that power analytics and data-driven decision-making across LotLinx. Reporting to the Director of Data Analytics, you will collaborate closely with Data Analysts, Data Engineers, and Product Managers to translate business needs into robust, scalable, and reliable data models and pipelines that are incorporated into our product portfolio. This is a key position where you'll have significant ownership and impact on our data infrastructure and strategy.
This is a hybrid role and requires 3-4 days a week in either our Winnipeg, Hamilton, or Vancouver office locations.
Responsibilities
- Architect & Build Data Models: Design, develop, and maintain scalable and performant data models in our data warehouse (Google BigQuery, Apache Pinot) to serve as the single source of truth for analytics.
- Data Analysis: Conduct data validation and exploratory analysis across massive datasets (billions of rows, terabytes) to ensure the integrity of data pipelines and the accuracy of downstream reporting.
- Develop Large Data Pipelines: Develop, monitor, and troubleshoot ELT/ETL pipelines processing high-volume data streams, ensuring reliability and performance at the terabyte scale.
- Optimize Pipeline Performance: Optimize complex SQL queries and data transformation logic for maximum performance and cost-efficiency using multi-terabyte datasets within Google BigQuery.
- Enhance OLAP performance: Analyze Apache Pinot query performance logs and usage patterns across terabyte-scale datasets to identify optimization opportunities and troubleshoot complex data access issues.
- Enable Data Consumers: Partner with data analysts, data scientists, and business stakeholders to understand their data requirements, providing clean, well-documented, and easy-to-use datasets.
- Champion Data Quality & Governance: Implement data quality checks, testing frameworks, and documentation standards to ensure the trustworthiness and usability of our data assets.
- Collaborate & Mentor: Work effectively within a collaborative team environment. Potentially mentor junior team members and share best practices in analytics engineering.
- Stay Current: Keep abreast of new technologies, tools, and best practices in the analytics engineering space and advocate for their adoption where relevant.
- Experience: 4+ years of relevant professional experience in Analytics Engineering, Data Engineering, or a highly related role, with a proven track record of building and managing complex data systems.
- Expert SQL: Deep expertise in writing complex, highly performant SQL for data transformation, aggregation, and analysis, particularly within a cloud data warehouse environment like BigQuery.
- Performance Optimization: Demonstrated experience writing, tuning, and debugging complex SQL queries specifically for large-scale data warehouses (multi-terabyte environments), preferably BigQuery or Apache Pinot.
- Data Modeling Mastery: Strong understanding of data modeling concepts (e.g., Kimball, Inmon, Data Vault) and practical experience designing and implementing warehouse schemas.
- ETL/ELT & Orchestration: Proven experience building and maintaining data pipelines using relevant tools and frameworks. Python proficiency for scripting and data manipulation is essential.
- Cloud Data Warehousing: Significant experience working with cloud data warehouses, specifically Google BigQuery. Understanding of underlying architecture and optimization techniques.
- Problem-Solving: Excellent analytical and problem-solving skills, with the ability to troubleshoot complex data issues independently.
- Communication & Collaboration: Strong communication skills, capable of explaining complex technical concepts to both technical and non-technical audiences. Proven ability to collaborate effectively across teams.
- Previous experience in the Automotive or AdTech industry.
- Familiarity with workflow orchestration tools like Apache Airflow.
- Experience with Apache Pinot.
- Experience with data quality and testing frameworks.
- Familiarity with Google Cloud Platform (GCP) services beyond BigQuery.
- Cloud/Warehouse: Google Cloud Platform (GCP), AWS, Google BigQuery, Apache Pinot
- Transformation: SQL, Python
- Orchestration: Airflow, Cloud Composer
- Ingestion: Custom Scripts, Pub/Sub, Lambda
- Other: GitLab, Kubernetes, Apache Pinot, OpenSearch
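The dimensional-modeling concepts this posting lists (Kimball et al.) include Type 2 slowly changing dimensions: instead of updating an attribute in place, the current row is expired and a new current row is inserted, preserving history. The sketch below is a generic, in-memory illustration of that pattern, not LotLinx's implementation; all names are hypothetical.

```python
# Type 2 slowly changing dimension sketch: each dimension member keeps a
# history of versions, with validity dates and an is_current flag.
from datetime import date

def scd2_upsert(dim_rows, key, attrs, as_of):
    """Apply one source record to a Type 2 dimension (list of dicts)."""
    current = next(
        (r for r in dim_rows if r["key"] == key and r["is_current"]), None
    )
    if current and current["attrs"] == attrs:
        return dim_rows  # nothing changed; keep the current version
    if current:  # attribute changed: expire the old version
        current["is_current"] = False
        current["valid_to"] = as_of
    dim_rows.append(
        {"key": key, "attrs": attrs, "valid_from": as_of,
         "valid_to": None, "is_current": True}
    )
    return dim_rows

dim = []
scd2_upsert(dim, "cust-1", {"city": "Toronto"}, date(2024, 1, 1))
scd2_upsert(dim, "cust-1", {"city": "Winnipeg"}, date(2024, 6, 1))
print(len(dim), [r["is_current"] for r in dim])  # 2 [False, True]
```

In a warehouse like BigQuery the same logic is typically expressed as a `MERGE` statement or a dbt snapshot rather than procedural code.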
The salary range for this position is $103,000 - $157,000, with an annual target bonus.
Lotlinx is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.
Lotlinx is not currently able to offer sponsorship for employment visa status.
Lotlinx is headquartered in Peterborough, NH and has locations in Holmdel, NJ, and in Manitoba, Ontario, and British Columbia, Canada, in addition to a large team spanning the US and Canada.
Our success relies not only on our customers but also on the dedicated talent that continuously moves our platform forward. We value our employees and their abilities, and we seek to foster an open, cooperative, dynamic environment where team and company alike can thrive.
Analytics Engineer (dbt)
Posted 1 day ago
Job Description
Tucows (NASDAQ:TCX, TSX:TC) is possibly the biggest Internet company you've never heard of. We started as a simple shareware site in 1993 and have since grown into a stable of businesses: Tucows Domains, Ting Internet and Wavelo.
What's next at Tucows
We embrace a people-first philosophy that is rooted in respect, trust, and flexibility. We believe that whatever works for our employees is what works best for us. It's also why the majority of our roles are remote-first, meaning you can work from anywhere you can connect to the Internet!
Today, over one thousand people work in over 20 countries to help us make the Internet better. If this sounds exciting to you, join the herd!
About the opportunity
We are seeking an experienced and versatile Analytics Engineer to join our dynamic team. In this role, you will apply your advanced analytics expertise to extract actionable insights from raw data. The ideal candidate will have a strong background in data engineering, analytics, and machine learning, with the ability to drive data-driven decision-making across the organization.
What You Will Be Doing
- Data Modeling & Pipelines: Design, develop, and maintain complex data models in our Snowflake data warehouse. Utilize dbt (Data Build Tool) to create efficient data pipelines and transformations for our data platform.
- Snowflake Intelligence Integration: Leverage Snowflake Intelligence features (e.g., Cortex Analyst, Cortex Agents, Cortex Search, AISQL) to implement conversational data queries and AI-driven insights directly within our data environment. Develop AI solutions that harness these capabilities to extract valuable business insights.
- Advanced SQL & Analysis: Design and build advanced SQL queries to retrieve and manipulate complex data sets. Dive deep into large datasets to uncover patterns, trends, and opportunities that inform strategic decision-making.
- Business Intelligence (BI): Develop, maintain, and optimize Looker dashboards and LookML to effectively communicate data insights. Leverage Looker's conversational analytics and data agent features to enable stakeholders to interact with data using natural language queries.
- Cross-Functional Collaboration: Communicate effectively with stakeholders to understand business requirements and deliver data-driven solutions. Identify opportunities for implementing AI/ML/NLP technologies in collaboration with product, engineering, and business teams.
- Programming & Automation: Write efficient Python code for data analysis, data processing, and automation of recurring tasks. Skilled in shell scripting and command-line tools to support data workflows and system tasks. Ensure code is well-tested and integrated into automated workflows (e.g., via Airflow job scheduling).
- Visualization & Presentation: Create compelling visualizations and presentations to deliver analytical insights and actionable recommendations to senior management and cross-functional teams. Tailor communication of complex analyses to diverse audiences.
- Innovation & Best Practices: Stay up-to-date with industry trends, emerging tools, and best practices in data engineering and analytics (with a focus on dbt features, Snowflake's latest offerings and BI innovations). Develop and implement innovative ideas to continuously improve our analytics stack and practices.
- Education: Bachelor's degree in Computer Science, Statistics, or a related field; Master's degree preferred.
- Experience: 2+ years of experience in data analytics or a related field, with significant exposure to AI and Machine Learning applications in analytics.
- SQL Expertise: Advanced SQL skills with experience in writing and optimizing complex queries on large-scale datasets.
- dbt Proficiency: Hands-on experience with dbt (Data Build Tool) and its features for building, testing, and documenting data models.
- Data Modeling: Expert-level knowledge of data modeling and data warehouse concepts (e.g., star schema, normalization, slowly changing dimensions).
- Snowflake & AI Capabilities: Experience with Snowflake's Data Cloud platform and familiarity with its advanced AI capabilities (Snowflake Intelligence – Cortex Analyst, Cortex Agents, Cortex Search, AISQL, etc.) is highly preferred.
- Business Intelligence Tools: Strong skills in Looker data visualization and LookML (including familiarity with Looker's conversational AI and data agent capabilities) or similar BI tools.
- AI Agents & Automation: Experience with AI agents or generative AI tools to optimize workflows and service delivery (such as creating chatbots or automated analytic assistants) is a plus.
- Real-Time & Streaming Data: Experience with real-time data processing and streaming technologies (e.g., Kafka, Kinesis, Spark Streaming) for handling continuous data flows.
- Programming: Proficient in Python for data analysis and manipulation (pandas, NumPy, etc.), with the ability to write clean, efficient code. Experienced with shell scripting and command-line tools for automating workflows and data processing tasks.
- ETL/Orchestration: Familiarity with ETL processes and workflow orchestration tools like Apache Airflow (or similar scheduling tools) for automating data pipelines alongside Docker for local development and testing.
- Cloud Platforms: Experience with cloud platforms and services (especially AWS or GCP) for data storage, compute, and deployment.
- Version Control & CI/CD: Solid understanding of code versioning (Git) and continuous integration/continuous deployment (CI/CD) processes in a data engineering context.
- Agile Methodology: Familiarity with agile development methodologies and ability to work in a fast-paced, iterative environment.
- Soft Skills: Excellent communication and presentation skills, with critical thinking and problem-solving abilities. Proven track record of working effectively on cross-functional teams and translating business needs into technical solutions.
- Data Governance & Ethics: Experience implementing data governance best practices, ensuring data quality and consistency. Knowledge of data ethics, bias mitigation strategies, and data privacy regulations (e.g., GDPR, CCPA) with a commitment to compliance.
- Community & Open Source: Contributions to open-source projects or active participation in data community initiatives.
- AI/ML Skills: Experience with applying Artificial Intelligence/Machine Learning techniques in analytics (e.g., building predictive models for forecasting, churn prediction, fraud detection, etc.). Practical experience deploying models and using MLOps/DataOps practices for lifecycle management.
- Statistical Background: Solid foundation in statistics and probability, with ability to apply various modeling techniques and design A/B tests or experiments.
- Additional Programming: Knowledge of additional programming or query languages (e.g., R, Scala, Julia, Spark SQL) that can be applied in analytics workflows.
- Certifications: Certifications in relevant data technologies or cloud platforms (such as Snowflake, AWS, GCP, or Looker) demonstrating your expertise.
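The star-schema concept named in the requirements above can be made concrete with a small sketch: fact rows carry surrogate keys into dimension tables, and reporting queries join and aggregate across them. This is a generic illustration with hypothetical tables, not Tucows' schema; in Snowflake the equivalent would be a SQL `JOIN ... GROUP BY`.

```python
# Star-schema aggregation sketch: a fact table joined to a product
# dimension, with revenue rolled up by a dimension attribute.
from collections import defaultdict

dim_product = {  # dimension table keyed by surrogate key
    1: {"name": "widget", "category": "hardware"},
    2: {"name": "gizmo", "category": "hardware"},
    3: {"name": "license", "category": "software"},
}
fact_sales = [  # fact rows referencing the dimension by key
    {"product_key": 1, "amount": 100.0},
    {"product_key": 2, "amount": 50.0},
    {"product_key": 3, "amount": 200.0},
]

revenue_by_category = defaultdict(float)
for row in fact_sales:
    # the "join": look up the dimension row for this fact's key
    category = dim_product[row["product_key"]]["category"]
    revenue_by_category[category] += row["amount"]

print(dict(revenue_by_category))  # {'hardware': 150.0, 'software': 200.0}
```

Keeping descriptive attributes in the dimension and only keys and measures in the fact table is what lets the same fact data be sliced by any dimension attribute without duplicating it.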
The ideal candidate will be a self-starter with a passion for data and analytics, comfortable working in a fast-paced environment, adapting to new technologies, and driving innovation in our data practices. They should be able to navigate complex data landscapes, uncover meaningful insights, and communicate their findings effectively to both technical and non-technical audiences. The ability to stay current with industry trends and continuously learn new technologies is essential in this role.
This role offers the opportunity to make a significant impact on our organization's data strategy and contribute to critical business decisions through advanced analytics. The successful candidate will play a key role in shaping our data culture and driving the adoption of cutting-edge data technologies and methodologies.
If you are a data enthusiast with a track record of delivering impactful analytics solutions and a desire to push the boundaries of what's possible with data, we want to hear from you!
The base salary range for this position is $73,440 to $6,400. Range shown in CAD for Canadian residents. Other countries will differ. Range may vary on a number of factors including, but not limited to: location, experience and qualifications. Tucows believes in a total rewards offering that includes fair compensation and generous benefits.
Want to know more about what we stand for? At Tucows we care about protecting the open Internet, narrowing the digital divide, and supporting fairness and equality.
We also know that diversity drives innovation. We are committed to inclusion across race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status or disability status. We celebrate multiple approaches and diverse points of view.
We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
Tucows and its subsidiaries participate in the E-verify program for all US employees.
Learn more about Tucows, our businesses, culture and employee benefits on our site here. #LI-Remote #LI-NA1
Data Analytics Engineer
Posted 1 day ago
Job Description
Who We Are
Trusscore is a material science company for sustainable building materials. We are revolutionizing the way people build, with the ultimate vision of creating a market-leading, sustainable, branded, true painted-drywall alternative. Trusscore invests heavily in R&D to discover breakthroughs in materials and manufacturing processes, advancing the state of the art in science and technology along the way.
Become part of a passionate, innovative team focused on reshaping the future of building materials. We offer a dynamic, flexible work environment, opportunities for personal and professional growth, and the chance to make a tangible impact in an established, fast-growing company with a focus on sustainability and innovation.
Your Mission
Trusscore is redefining building-product manufacturing through digitization, advanced automation, and AI-enhanced process control and detection. Your mission as a data analytics leader is to convert data from ERP systems, proprietary applications, SCADA feeds, and real-time ML models into decisive insights that drive business impact.
You'll bring analytical firepower to an experienced team of industrial, mechatronics, and controls engineers, developing and deploying powerful analytics tools to pinpoint and unlock value-creation and value-capture opportunities that efficiently scale Trusscore's manufacturing and operations.
As a data technology leader, you will apply your expertise in data engineering, analytics, and visualization. You will maintain and build upon our custom data collection platform, build models and automated dashboards in cloud-based business intelligence tools, and analyze manufacturing data to identify and drive efficiency improvements.
You will collaborate closely with Manufacturing, Quality, Marketing, IT, and external parties to enhance our data infrastructure. Your work will be critical to creating the digital backbone enabling Trusscore's manufacturing innovations.
This role demands both strategic thinking and hands-on execution. You'll leverage your knowledge in data modeling, systems integration, and process analytics while continuing to grow your skills in cloud architecture, automation, and operational intelligence. Your success will be measured by timely delivery of data tools that are accurate, accessible, and impactful for decision-making across the business.
Join us and help shape the future of building materials - through data.
Responsibilities
- Lead the development of Trusscore's data infrastructure to support real-time manufacturing insights and data-driven decision-making.
- Design and maintain scalable data pipelines and integrations between MS SQL Server, Power Apps, ERP systems (e.g., NetSuite), SCADA platforms (e.g., Ignition), and manufacturing equipment.
- Build interactive dashboards and automated reports using Power BI, Oracle Analytics, and NetSuite Analytics Warehouse to enable performance monitoring and process optimization.
- Develop and maintain SQL and Python-based data transformation workflows to standardize, clean, and model raw data for downstream analysis.
- Automate repetitive data tasks and manual processes using tools such as Power Automate, Celigo, or custom scripts in Python or JavaScript.
- Identify opportunities to improve production efficiency, quality, and throughput by analyzing large, complex data sets from various manufacturing systems.
- Work with peers to define the governance of data architectures for consistency and reliability; accelerate growth by co-creating a playbook to standardize the design, approval, and deployment of analytics tools and practices.
- Apply basic machine learning frameworks (e.g., TensorFlow, Scikit-learn) for predictive modeling and advanced analytics use cases, where applicable.
- Lead continuous improvement projects that evolve our data platform capabilities, support digital transformation, and unlock new levels of operational intelligence.
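To illustrate the kind of Python-based standardize-and-clean workflow described in the responsibilities above, here is a minimal pandas sketch. The column names, sample values, and cleaning rules are purely hypothetical examples, not Trusscore's actual data:

```python
import pandas as pd

# Hypothetical raw readings from a manufacturing line: a key column with
# stray whitespace and a missing value, plus a duplicated row.
raw = pd.DataFrame({
    "line_id": [" L1 ", "L2", "L2", None],
    "output_kg": ["100.5", "98.0", "98.0", "101.2"],
})

clean = (
    raw.dropna(subset=["line_id"])  # drop rows missing the key
       .assign(
           line_id=lambda d: d["line_id"].str.strip(),        # standardize text
           output_kg=lambda d: d["output_kg"].astype(float),  # enforce numeric type
       )
       .drop_duplicates()            # remove repeated readings
       .reset_index(drop=True)
)
print(clean)
```

In a real pipeline, steps like these would be parameterized per source system and paired with data quality checks before the modeled data reaches downstream dashboards.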
Qualifications & Skills
- Bachelor's or Master's degree in Mechatronics, Systems Design, Software, Computer, or Management Engineering, or in Mathematics (or equivalent experience)
- Minimum 3 years of experience as a Data Analyst, Data Scientist, or Industrial Engineer
- Familiarity working in an Industrial or Manufacturing environment
- Self-starting, team player, organized and detail oriented
- Data Management & Programming: Required: Database infrastructure/hosting architecture, SQL, Python; Desired: Java, C++, React, Node.js, Celigo, JSON, REST API
- Microsoft 365 Apps: Required: Teams, Excel; Desired: Power Apps, Power Automate
- Business Intelligence: Power BI, Oracle Analytics, NetSuite Analytics Warehouse (NSAW)
- Knowledge of Machine Learning Frameworks (TensorFlow, PyTorch, Apache Spark, Scikit-learn, etc.)
- Familiarity with ERP systems (NetSuite)
Other preferred qualifications:
- Understanding of basic networking protocols and network infrastructure technologies (IP, LAN, Routing, VPN, DNS, etc.)
- Understanding of SCADA systems (e.g., Ignition)
- Experience with machine data integration and Manufacturing Execution Systems (MES)
- Web application development experience (front end and back end)
Location & Travel
- Hybrid working from the office in Kitchener, ON (The Tannery Building) and from home
- Occasional travel may be required to support offsite activities with third party vendors, contractors, other Trusscore facilities, or for other business reasons
Data Analytics Engineer - Remote
Posted 1 day ago
Job Description
Data Analytics Engineer – Remote
Embark on the next step in your career journey with Direct Travel!
As one of the fastest growing Travel Management Companies (TMC) in the world, Direct Travel is committed to reimagining what is possible for the industry, including business travel, personalized experiences, and meetings & events. Under the forward-thinking direction of our experienced leadership team, we are rapidly expanding and leveraging next-generation technologies to deliver on our vision for The Perfect Trip. This is your opportunity to grow your career and be part of a dynamic team that is setting the new standard of travel and service excellence. If you’re passionate about innovation and ready for what’s next, we’d love for you to join us!
Direct Travel is on a journey of data transformation, and we are seeking a skilled and motivated Data Analytics Engineer to join our team and be a pivotal part of our history.
The ideal candidate will be crucial to the design, execution and delivery of a best-in-class global analytics platform, providing data insights and global reporting for both internal stakeholders and thousands of clients globally.
A successful candidate will have hands-on experience crafting embedded analytics applications, possess a background in software engineering, and understand the latest methods and techniques for provisioning key visualizations and self-service capabilities within our data products.
In the role of an Analytics Engineer, a qualified candidate will work across engineering and data related assignments, contributing to the development of analytics products and using a data-driven mindset to solve business problems across the organization.
Expedition Expectations
- Data Management: Data collection and processing across various raw and unimproved data marts.
- Reporting & Visualization: Develop and implement key visualizations, reports, modeling and performance improving optimizations to our core data products.
- Collaboration: Work closely with IT and business units to understand data requirements and provide actionable insights and solutions to data problems.
- Process Improvement: Identify, analyze, and interpret trends or patterns in complex IT data sets to drive process improvements and policy development.
- Analytics & Insights: Provide key and differentiating industry insights for our customers through clear and concise analytical representations of core travel data.
Passport to Success
- Educational Background: Bachelor’s degree in information technology, Computer Science, Data Science, or a related field or equivalent experience.
- Technical Proficiency:
- Advanced knowledge of and experience with databases of all varieties, including RDBMS/SQL-based platforms as well as document stores.
- Demonstrable experience with a programming language typically used in data and analytics engineering, e.g., Python, R, or Scala.
- Experience and understanding of web application development, front-end reactive applications such as VueJS or React as well as familiarity with middle and back-end software development methodologies.
- Understanding of the software development lifecycle and SCM through Git or an alternative.
- Cloud proficiency:
- Experience with a cloud provider such as Amazon AWS, Azure, or GCP and the associated data platforms and products they provide.
- Analytics:
- Full understanding of the data lifecycle, from pipeline development to warehouse architecture to reporting and analytics enablement.
- Senior-level experience with more than one cloud-based analytics platform such as Looker, Tableau, or Power BI.
- Experience developing semantic models across wide-reaching data in sustainable and performant ways.
- Exposure to and understanding of cloud-based warehouse platforms such as Snowflake, Databricks, or Azure Data Lake.
- Prior experience implementing governance and compliance frameworks for analytics data products.
- Inquisitive diagnostic skills related to query and system tuning and performance.
- Experience building bespoke data products from the ground up that integrate analytics and business intelligence platforms for visuals and core functionality.
- Communication Skills: Strong verbal and written communication skills to effectively share findings with stakeholders.
- Attention to Detail: Adept at queries, report writing, and presenting findings.
Going the Extra Mile
- Experience: Proven working experience as a data analytics engineer.
- Platforms: Snowflake, Looker and LookML modeling experience.
Benefits Onboard
In addition to Medical, Dental, and Vision benefits, Direct Travel offers an employee rewards and recognition program, the Total Rewards Package, which includes Wellness, Sustainability, DE&I initiatives, and Mental Health Support.
Our Brand Voyage: Direct Travel
Direct Travel is a leading provider of corporate travel management services. By leveraging both the expertise of its people and innovative solutions, Direct Travel enables clients to derive the greatest value from their travel program in terms of superior service, progressive technologies, and significant cost savings. The company is led by CEO Christal Bemont and Executive Chairman Steve Singh, noted business investor and founder of Concur. Direct Travel has offices in over 80 locations and is currently ranked among the top providers of travel on Travel Weekly’s Power List. For more information, visit
Direct Travel is an EOE/AA/Veteran/People with Disabilities employer
If you're ready to chart a new course and advance your career with the valuable moments and travel experiences that await, we welcome you to submit your resume for consideration at Direct Travel.
Business Intelligence Analyst
Posted 1 day ago
Job Description
Salary: $60,000.00 - $65,000.00 per year
SCOPE OF POSITION
ConeTec is an international full-service geo-environmental and geotechnical site characterization contractor. We offer clients superior project management and site investigation services across the globe, with a large presence in the Americas and Australia. ConeTec is known in the industry as a great place to work. We commit to all employees that we will provide a respectful, positive, and enriching work environment. We want you to look forward to going to work every day. We reward and recognize staff for exceptional contributions to the company. Our success is a direct result of the people who work here.
The Business Intelligence Analyst will work with data from multiple sources to develop analytics solutions and report back on their findings. Responsibilities for this role include consulting with management to define goals, developing and implementing data analytics tools, analyzing and synthesizing data, and collaborating with coworkers to implement improvements. Successful candidates need a background in database technology, with an emphasis on the use of analytical and reporting tools. *This is a 1 year fixed-term contract.
ROLES, RESPONSIBILITIES & EXPECTATIONS
Business Intelligence
- Produce data analysis and visual business intelligence reports through collecting, validating, and processing data, files, reports, databases, and other information.
- Help to develop visual dashboards using BI technologies (Power BI).
- Collaborate with IT and data engineering teams to access and integrate data from various sources.
- Contribute to data quality initiatives by identifying and addressing data quality issues.
- Support data governance efforts to ensure data is consistent and secure.
- Assist in maintaining and optimizing the BI infrastructure and data pipelines.
SKILLS, QUALIFICATIONS AND EXPERIENCE REQUIRED
Education Requirements
- Completion of a Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, or a related field.
Experience Requirements
- Experience with machine learning and analytics.
- Experience with RESTful or SOAP APIs for data integration tasks.
- Familiarity with Azure cloud services and other analytics tools like Tableau or Qlik.
- Knowledge of ETL processes, data pipelines, and data modeling.
Technical Skill Requirements
- Excellent analytical skills and attention to detail.
- Proficiency in DAX queries and functions in Power BI.
- Knowledge of Power BI desktop, Power BI service, and Power BI gateway.
- Knowledge of SQL, database concepts, and data warehousing.
- Knowledge of API integration.
- Knowledge of Python and other coding languages.
- Familiarity with BI technologies (e.g. Microsoft Power BI).
Soft skill requirements:
- Must possess strong work ethic and represent the company in a professional manner.
- Must be able to establish and maintain effective working relationships.
- Must be responsible and work well independently or in a team setting with minimal supervision.
- Must enjoy performing a wide variety of duties and be able to manage multiple tasks and priorities.
- Demonstrate the ability to prioritize tasks and identify problems and provide potential solutions.
- Demonstrate flexibility and adaptability to changes in business processes, goals and priorities.
- Ability to take direction and feedback from senior, peer-level, and junior staff.
ConeTec is an equal opportunity employer. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status, or any other status protected by applicable law.
Business Intelligence Analyst
Posted 1 day ago
Job Description
The Business Intelligence - Business Analyst is a key member of a centralized BI team supporting business operations. This role supports cross-functional business intelligence initiatives, including data integration, reporting, automation, and ad-hoc analytics. The analyst will play a critical role in designing and developing customized data models and interactive reports that drive insight and strategic decision-making across departments and management structures. In addition, this individual will support the budgeting process and will be responsible for key deliverables to management while liaising with regional operations staff to ensure best practices are maintained.
This position will be located in Winnipeg, MB or Campbell River, BC.
Your contributions to the team include:
- Prepare, clean, and aggregate large data sets from multiple systems to support reporting and analytics needs.
- Develop and maintain customized, interactive reports and dashboards in Power BI and Excel.
- Support the annual budget process through data consolidation, report automation, and performance analysis.
- Perform financial and operational variance analysis, identifying trends, anomalies, and improvement opportunities.
- Collaborate with cross-functional teams to define reporting requirements and key performance indicators (KPIs).
- Integrate data from various software solutions and APIs to enable centralized reporting and insights.
- Identify and resolve data quality issues, ensuring accuracy and consistency across systems and reports.
- Support the development of automated workflows and data pipelines to streamline recurring reporting tasks.
- Contribute to data governance practices and documentation of data models and reporting logic.
- Present findings and insights in a clear, visual format for both technical and non-technical audiences.
What you need to be successful:
- Bachelor's degree in Business Administration, Data Analytics, Computer Science, or a related field.
- Business analyst experience with enterprise-level software.
- Proficiency in Power BI and advanced Microsoft Excel (including Power Query and DAX).
- Working knowledge of SQL and ability to query relational databases.
- Understanding of data modeling and ETL processes; experience connecting APIs or integrating third-party software data is a strong asset.
- Strong analytical, problem-solving, and organizational skills.
- Clear and effective communicator, comfortable presenting complex data to non-technical audiences.
- Able to manage multiple tasks and projects independently and in collaboration.
- Familiarity with Microsoft Fabric (Dataflows Gen2, OneLake, Direct Lake, etc.)
- Exposure to Python or other scripting languages for data analysis or automation.
- Knowledge of real estate, construction, property management, or hospitality operations.
The perks:
- Employer paid extended health, vision, and dental coverage (including family)
- Employee and Family Assistance Program
- Yearly health and wellness benefit
- RPP eligibility after one year
- Employee recognition program
- Company-provided cellphone
- In-house professional development opportunities
Why Broadstreet?
Broadstreet Properties Ltd. is a family-owned and operated property management company, partnered with Seymour Pacific Developments, that manages multi-family residential communities. We are a growing organisation made up of diverse team members who are motivated to continuously innovate our approach to asset management. We consider employee well-being a priority and are dedicated to protecting the health and safety of our teams while ensuring a workplace that is respectful of everyone.
Broadstreet Properties Ltd. practices equal opportunity hiring and onboarding processes to ensure equal access and participation for everyone. We understand that we have a responsibility to ensure a safe, dignified, and welcoming environment, and we are committed to creating an inclusive environment for all employees irrespective of race, colour, religion, sexual orientation, gender identity, or any other status protected by law. We believe in integrating people with disabilities into our workforce by removing barriers and meeting accessibility needs.