243 Data Engineer jobs in Canada
Big Data Platform Engineer
Posted today
Job Description
Big Data Platform Engineer
Position Overview
We are looking for someone armed with a strong tool-kit to develop and maintain technical solutions that adhere to engineering and architectural design principles while meeting business requirements. You'll be a technology owner providing technical expertise with a focus on efficiency, reliability, scalability, and security, which includes planning, evaluating, designing, operationalizing, and supporting solutions in compliance with enterprise and industry standards. The ideal candidate is willing and able to research, maintain, configure, administer, and provision infrastructure, applications, and services across our platforms.
- Understand architecture diagrams and provide input into application design
- Perform systems administration: monitor, configure, back up, authenticate, tune, maintain, install, and script applications, services, and systems.
- Script installs and stand up infrastructure in both private and public clouds
- Identify issues, then develop and maintain processes that address and resolve them, communicating with and alerting stakeholders as needed.
- Design, implement and maintain an automated build and install/deploy process; develop and maintain build scripts of projects and/or products.
- Perform Release Engineering functions for either cloud or non-cloud services, products and platforms
- Ensure effective change management (using ServiceNow).
- Provide specialized support (research, installation, configuration, L3 support) that meets or exceeds established standards and service levels while minimizing operational risk.
- Design, review, integrate infrastructure and application requirements (non-functional, security, integration, performance, quality, operations etc.).
- Build/deploy base infrastructure components (e.g. Azure capabilities including Virtual Machines, ASE, AKS, Blob storage, geo-replication, etc.) and application services for all environments. Help evolve the base infrastructure and operational environment, and deploy new technologies in Azure and other cloud providers (see the provisioning sketch after this list).
- Maintain base infrastructure components, work with vendors (Azure) to report problems, and receive fixes.
- Create and document disaster and business recovery plans and procedures.
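To make the infrastructure responsibilities above concrete, here is a minimal sketch of scripted Azure provisioning using the Python management SDK. It is illustrative only; the subscription ID, resource group, storage account name, and region are placeholders, not details from this posting.

```python
# Hypothetical sketch: provisioning a base infrastructure component in Azure
# with the Python management SDK. All names below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient
from azure.mgmt.storage import StorageManagementClient

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

credential = DefaultAzureCredential()
resource_client = ResourceManagementClient(credential, SUBSCRIPTION_ID)
storage_client = StorageManagementClient(credential, SUBSCRIPTION_ID)

# Idempotently create (or update) a resource group for the platform.
resource_client.resource_groups.create_or_update(
    "bigdata-platform-rg", {"location": "canadacentral"}
)

# Provision a Blob storage account with geo-redundant replication,
# echoing the geo-replication item in the responsibilities above.
poller = storage_client.storage_accounts.begin_create(
    "bigdata-platform-rg",
    "bigdataplatformsa",  # storage account names must be globally unique
    {
        "location": "canadacentral",
        "kind": "StorageV2",
        "sku": {"name": "Standard_GRS"},  # geo-redundant storage
    },
)
account = poller.result()  # block until provisioning completes
print(f"Provisioned storage account: {account.name}")
```

The same pattern extends to AKS and other base components; in practice a platform team like this one would often express it declaratively instead, via Terraform, Salt, or another tool from the list below.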
Requirements
We are looking for an individual with a strong engineering mindset and sense of ownership, plus strong organizational, follow-up, and priority-setting skills to handle highly complex, multi-faceted assignments and to work independently.
- Undergraduate Degree or Technical Certificate
- 5-10 years relevant experience
- Appetite for contributing within a complex and critical environment
- Expert knowledge of specific domain or range of engineering frameworks, development, technology, tools, processes, standards and procedures, as well as organizational issues. Experience as a primary subject matter expert in multiple areas and a consultant on all aspects of technology and solutions
- Experience deploying, managing and operating complex applications in a Cloud environment e.g. Azure
- Understanding of shell scripting, PowerShell, and Python, and the ability to code for automation
- Understanding of critical concepts in DevOps (CI, CD, CM, IaC etc) and Agile principles
- Readiness and motivation (as senior or lead developer and valued subject matter expert) to address and resolve highly complex and multifaceted development-related issues, often independently.
- Excellent troubleshooting skills
- Experience in infrastructure, services and application monitoring and logging
- Experience configuring and managing big data technologies/databases, and an understanding of various approaches to data storage and indexing, is an asset
Must-have:
- Linux OS administration experience
- Hadoop / Cassandra administration
- Cloud experience, e.g. Azure services (including IaaS, AKS, ADLS, ADF), AWS, or GCP
- Configuration management tools, e.g. Salt, Terraform, Ansible, or Chef
- Development/Engineering experience e.g. Bash, Shell, Python
- Excellent problem-solving skills, engineering mindset (must be able to demonstrate this in interviews)
- Jenkins, GitHub, Bitbucket, Nexus, or similar toolsets
Nice to have:
- Windows Experience
- High-Performance Computing clusters (e.g. HPC Pack)
- Splunk, DataDog, ITRS, Sensu, Foglight, ELK
- AutoSys, ServiceNow, JIRA, Confluence
- Databricks, Blob, Event Hub and more
Data Engineer

Posted 1 day ago
Job Description
25WD89443
**Position Overview**
Autodesk is looking for diverse engineering candidates to join the Compliance team, a collection of systems and teams focused on detecting and exposing non-compliant users of Autodesk software. This is a key strategic initiative for the company. As a Data Engineer, you will contribute to improving critical data processing & analytics pipelines. You will work on challenging problems to enhance the platform's reliability, resiliency, and scalability.
We are looking for someone who is detail- and quality-oriented, and excited about the prospect of having a big impact with data at Autodesk. Our tech stack includes Hive, Spark, Presto, Jenkins, Snowflake, PowerBI, Looker, and various AWS services. You will report to the Senior Manager, Software Development; this is a hybrid position based out of Toronto, Canada.
**Responsibilities**
+ You will need a product-focused mindset. It is essential for you to understand business requirements and help build systems that can scale and extend to accommodate those needs
+ Break down moderately complex problems, contribute to documenting technical solutions, and assist in making fast, iterative improvements
+ Help build and maintain data infrastructure that powers batch and real-time data processing of billions of records
+ Assist in automating cloud infrastructure, services, and observability
+ Contribute to developing CI/CD pipelines and testing automation
+ Collaborate with data engineers, data scientists, product managers, and data stakeholders to understand their needs and promote best practices
+ You have a growth mindset. You will help identify business challenges and opportunities for improvement, and help solve them using data analysis and data mining
+ You will support analytics and provide insights around product usage, campaign performance, funnel metrics, segmentation, conversion, and revenue growth
+ You will contribute to ad-hoc analysis, long-term projects, reports, and dashboards to find new insights and to measure progress in key initiatives
+ You will work closely with business stakeholders to understand and maintain focus on their analytical needs, including identifying critical metrics and KPIs
+ You will partner with different teams within the organization to understand business needs and requirements
+ You will contribute to presentations that help distill complex problems into clear insights
**Minimum Qualifications**
+ 2-4 years of relevant industry experience in big data systems, data processing, and SQL databases
+ 2+ years of coding experience in Spark DataFrames, Spark SQL, and PySpark (see the sketch after this list)
+ 2+ years of hands-on programming skills, able to write modular, maintainable code, preferably in Python & SQL
+ Good understanding of SQL, dimensional modeling, and analytical big data warehouses like Hive and Snowflake
+ Familiar with ETL workflow management tools like Airflow
+ 1-2 years of experience building reports and dashboards using BI tools; knowledge of Looker is a plus
+ Exposure to version control and CI/CD tools like Git and Jenkins CI
+ Experience working with data in notebook environments like Jupyter, EMR Notebooks, or Apache Zeppelin
+ Bachelor's degree in Computer Science, Engineering or a related field, or equivalent training, fellowship, or work experience
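As a hedged illustration of the Spark skills asked for above, the sketch below computes a daily-active-users metric twice, once with the DataFrame API and once with Spark SQL. The bucket path, event schema, and output table are hypothetical, not Autodesk's.

```python
# Illustrative only: the kind of Spark DataFrame / Spark SQL work described
# in the qualifications above. Paths and names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("usage-events").getOrCreate()

# Read raw usage events (e.g. Parquet landed in object storage).
events = spark.read.parquet("s3://example-bucket/usage-events/")  # placeholder path

# DataFrame API: daily active users per product.
daily_active = (
    events
    .filter(F.col("event_type") == "launch")
    .groupBy("product_id", F.to_date("event_ts").alias("event_date"))
    .agg(F.countDistinct("user_id").alias("dau"))
)

# Equivalent Spark SQL, run against a temp view.
events.createOrReplaceTempView("events")
daily_active_sql = spark.sql("""
    SELECT product_id,
           to_date(event_ts) AS event_date,
           COUNT(DISTINCT user_id) AS dau
    FROM events
    WHERE event_type = 'launch'
    GROUP BY product_id, to_date(event_ts)
""")

# Persist for downstream BI tools (hypothetical table name).
daily_active.write.mode("overwrite").saveAsTable("analytics.daily_active")
```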
**Learn More**
**About Autodesk**
Welcome to Autodesk! Amazing things are created every day with our software - from the greenest buildings and cleanest cars to the smartest factories and biggest hit movies. We help innovators turn their ideas into reality, transforming not only how things are made, but what can be made.
We take great pride in our culture here at Autodesk - our Culture Code is at the core of everything we do. Our values and ways of working help our people thrive and realize their potential, which leads to even better outcomes for our customers.
When you're an Autodesker, you can be your whole, authentic self and do meaningful work that helps build a better future for all. Ready to shape the world and your future? Join us!
**Salary transparency**
Salary is one part of Autodesk's competitive compensation package. Offers are based on the candidate's experience and geographic location. In addition to base salaries, we also have a significant emphasis on discretionary annual cash bonuses, commissions for sales roles, stock or long-term incentive cash grants, and a comprehensive benefits package.
**Diversity & Belonging**
We take pride in cultivating a culture of belonging and an equitable workplace where everyone can thrive. Learn more here:
**Are you an existing contractor or consultant with Autodesk?**
Please search for open jobs and apply internally (not on this external site).
Data Engineer

Posted 1 day ago
Job Description
Day to Day
Insight Global is seeking a Data Engineer to join a retail client in Vancouver, BC. This Data Engineer will sit within the People Enablement Technology team, which is committed to enabling seamless experiences for the client's global workforce. The Data & Analytics pillar is a newly formed team focused on automating core health & wealth offerings, as well as working across the department to enable efforts driven by the People Systems, Digital Workplace & Learning, and Workforce Agility pillars. The project the successful candidate will join brings learning data from platforms such as LinkedIn Learning into an existing data lake, allowing analytics insights to be generated from this people data for teams such as the ESG team. The Data Engineer will support this ingestion using Databricks, and will use Unity Catalog for categorization and, if required, encryption.
A successful candidate will be a problem solver and an expert in ETL programming/scripting, data modelling, data integration, and SQL, and will have exemplary communication skills. The candidate will need to be comfortable with ambiguity in a fast-paced and ever-changing environment, and able to think big while paying careful attention to detail. The candidate will know and love working with new technologies, can model multidimensional datasets, and can partner with cross-functional business teams to answer key business questions. Apart from building data pipelines, you will be an advocate for automation, performance tuning, and cost optimization. Be ready to question the status quo and bring forward intelligent solutions and proofs of concept.
Principal Duties and Responsibilities:
Support the design and implementation of complex data integration, modeling, and orchestration workflows using tools such as Databricks, Azure Data Factory, and Snowflake
Collaborate with architects, engineers, analysts, and business stakeholders to deliver secure, reliable, and high-performing data solutions
Apply DevSecOps principles to ensure secure development, deployment, and monitoring of data pipelines and infrastructure
Implement and maintain data encryption, access controls, and governance using Databricks Unity Catalog to protect sensitive information (see the sketch after this list)
Participate in operational support, including troubleshooting, performance tuning, and continuous improvement of data workflows
Create and maintain technical documentation and contribute to knowledge-sharing across the team
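A minimal sketch of the Databricks ingestion described above, assuming a Databricks notebook or job context where a `spark` session is already provided; the storage path and the Unity Catalog catalog/schema/table names are invented for illustration.

```python
# A minimal sketch, assuming a Databricks notebook/job where `spark` exists.
# Catalog, schema, and path names below are hypothetical.
from pyspark.sql import functions as F

# Ingest learning-platform exports landed in cloud storage into the lakehouse.
raw = (
    spark.read
    .option("multiLine", True)
    .json("abfss://landing@examplelake.dfs.core.windows.net/learning/")  # placeholder
)

# Light standardization before persisting.
curated = raw.withColumn("ingested_at", F.current_timestamp())

# Write to a Unity Catalog three-level namespace (catalog.schema.table), so
# access controls, lineage, and any column-level protection are governed centrally.
curated.write.mode("append").saveAsTable("people_data.learning.li_learning_activity")
```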
We are a company committed to creating inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity employer that believes everyone matters. Qualified candidates will receive consideration for employment opportunities without regard to race, religion, sex, age, marital status, national origin, sexual orientation, citizenship status, disability, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request via the Human Resources Request Form. The EEOC "Know Your Rights" Poster is available here.
To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy.
Skills and Requirements
Must Haves:
5+ years of experience as a Data Engineer working with Databricks, SQL & Python
3+ years with Cloud Data Integration & Modelling (Azure Data Factory)
Strong experience with Unity Catalog for categorizing and handling sensitive data
Experience with DevOps tools and building secure, scalable data workflows
Proficiency in Python and PySpark
Experience working with people-centered and sensitive data (such as health, payment, and banking information), and an understanding of the complexities of encryption and decryption.
Experience with data modeling, data warehousing, and building ETL pipelines.
Excellent problem-solving skills, combined with the ability to present your findings/insights clearly and compellingly in both verbal and written form.
Strong documentation and communication skills
Bachelor's degree in Computer Science, Mathematics, Statistics, or Operations Research.
Pluses:
- Snowflake experience
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal employment opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment without regard to race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or the recruiting process, please send a request to
Data Engineer
Posted today
Job Description
Small Door is membership-based veterinary care designed with human standards that is better for pets, pet parents, and veterinarians alike. We designed and delivered a reimagined veterinary experience via a membership that includes exceptional care, 24/7 telemedicine, and transparent pricing - delivered with modern hospitality in spaces designed by animal experts to be stress-free. We opened our flagship location in Manhattan's West Village in 2020 and have quickly expanded across the East Coast. Small Door now operates in New York City, Boston, Washington DC, and Maryland with continued expansion plans in 2025.
We're looking for a motivated, curious, and collaborative Entry-Level Data Engineer to join our growing Data team. In this role, you'll help build and maintain the pipelines, tools, and systems that power analytics, product insights, and strategic decision-making across the company.
This is a great opportunity for someone early in their career who's excited to learn, ship fast, and make a real impact at a company where data is core to everything we do.
What You'll Do
- Collaborate with the product, technology, and data teams to build scalable data pipelines (ETL/ELT)
- Maintain and optimize data infrastructure using tools like dbt, Airflow, and Snowflake (or similar); a sketch of this orchestration pattern follows this list
- Support data ingestion from internal and external sources (APIs, application databases, third-party tools)
- Ensure data integrity, availability, and performance through testing and monitoring
- Partner with analytics and product teams to make trusted, high-quality data available for reporting and machine learning
- Write clear, maintainable code and documentation
- Contribute to the evolution of our data architecture and engineering best practices
- Help manage the data warehouse and its data model
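As a rough sketch of the dbt/Airflow orchestration pattern referenced in the list above (not Small Door's actual pipeline), an Airflow DAG might chain an ingestion script and a dbt build. The DAG id, paths, schedule, and script names are assumptions.

```python
# Hypothetical sketch of an ELT DAG: ingest, then transform with dbt.
# Requires Airflow 2.4+ for the `schedule` argument.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="elt_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Pull from an internal API into the warehouse (placeholder script).
    ingest = BashOperator(
        task_id="ingest_appointments",
        bash_command="python /opt/pipelines/ingest_appointments.py",
    )

    # Transform raw tables into analytics models with dbt.
    transform = BashOperator(
        task_id="dbt_build",
        bash_command="cd /opt/dbt && dbt build --target prod",
    )

    ingest >> transform  # dbt runs only after ingestion succeeds
```

Keeping ingestion and transformation as separate tasks means a failed dbt build can be retried without re-pulling source data.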
Who You Are
- A degree in Computer Science, Engineering, Data Science, or a related field — or equivalent practical experience
- Proficiency in SQL and Python
- Familiarity with data modeling concepts and ETL workflows
- Interest or experience in tools like dbt, Airflow, Snowflake, Redshift, BigQuery, or similar
- A growth mindset and eagerness to learn from feedback and challenges
- Excellent problem-solving skills and attention to detail
- Strong communication and collaboration skills
What you'll get
- Competitive salary
- A great, top-of-the-line benefits plan provided by PEO Canada, including short-term disability insurance
- An opportunity to make a real impact on the people around you
- A collaborative group of people who live our core values and have your back
- A clear career path with opportunities for development, both personally and professionally
Small Door is proudly a public benefit corporation and a certified B Corp. We are committed to creating a diverse, inclusive and equitable workplace, and we encourage qualified applicants of every background, ability, and life experience to apply to appropriate employment opportunities.
Data Engineer
Posted today
Job Description
Title: Data Engineer - P3
Location: Remote Canada
Reports To: Sr. Principal, Data Science & Engineering
The Role:
We are seeking an experienced Data Engineer to join our dynamic team. In this role, you will leverage your expertise in SQL, Python, Spark, ETL development, and data warehousing to build scalable data solutions, optimize data pipelines, and ensure efficient data flow across our organization. If you are passionate about transforming raw data into actionable insights, we want to hear from you!
The Impact You Will Have on This Role:
As a Data Engineer, you will play a pivotal role in empowering our business with reliable, high-quality data. Your contributions will enable our teams to make data-driven decisions, enhance operational efficiency, and deliver innovative solutions to our clients. By ensuring seamless data integration and accessibility, you will directly influence our growth and market leadership.
What You'll Be Doing in This Role:
- Design, develop, and maintain ETL pipelines to support data ingestion, transformation, and delivery.
- Implement and manage scalable data architectures to support analytics and reporting needs.
- Collaborate with data scientists, analysts, and stakeholders to understand data requirements and deliver optimal solutions.
- Monitor, troubleshoot, and optimize data workflows to ensure reliability and performance.
- Work with large-scale datasets using tools like Spark to ensure efficient processing.
- Ensure data integrity and security by implementing best practices for data warehousing and compliance.
- Document technical designs, processes, and operational workflows.
Qualifications of this Role:
- 5+ years of extensive experience as a Data Engineer, with an intensive focus on ETL development with SQL and Python.
- Experienced in developing maintainable and scalable SQL scripts and procedures for data transformations. Possesses knowledge of advanced SQL functions and complex logic, with proficiency handling complex and nested data types.
- Highly proficient in Python programming, with an understanding of various data structures, error handling, and modular or object-oriented programming techniques for building scalable, reliable, and maintainable ETL pipelines.
- Experienced in writing clean, modular, and reusable code, with adherence to documentation and version control standards. Experienced in integrating Python scripts with relational and cloud-based data stores such as BigQuery, Snowflake, SQL Server, and PostgreSQL.
- Strong debugging and performance optimization skills in large-scale data environments.
- Experienced in JSON manipulation and API consumption (see the sketch after this list).
- Hands-on experience with Spark and distributed data processing.
- Solid knowledge of data warehousing, dimensional modeling concepts and design principles.
- Familiarity with CI/CD pipelines for production environments is a plus.
- Familiar with Palantir platform is a plus.
- Excellent problem-solving skills and ability to work in a collaborative environment.
- Bachelor's degree or above in Computer Science, Engineering, or a related field
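A small, hypothetical sketch of the API-consumption and JSON-manipulation skills listed above: paginate through a REST endpoint, flatten the nested JSON, and stage it for loading. The endpoint URL and payload shape ("results" key, page-number pagination) are assumptions for illustration.

```python
# Hypothetical sketch: pull paginated JSON from an API and flatten nested
# records into a tabular staging file. Endpoint and fields are placeholders.
import requests
import pandas as pd

BASE_URL = "https://api.example.com/v1/orders"  # placeholder endpoint

def fetch_all(url: str) -> list[dict]:
    """Follow simple page-number pagination until an empty page is returned."""
    records, page = [], 1
    while True:
        resp = requests.get(url, params={"page": page}, timeout=30)
        resp.raise_for_status()
        batch = resp.json()["results"]  # assumed payload shape
        if not batch:
            return records
        records.extend(batch)
        page += 1

raw = fetch_all(BASE_URL)

# Flatten nested JSON (e.g. customer.address.city -> customer_address_city)
# into a frame ready for a warehouse staging load.
df = pd.json_normalize(raw, sep="_")
df.to_parquet("stage/orders.parquet", index=False)
```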
The Way We Work:
- Leader Led
- Remote First
- Foster Flexibility
- Reward Performance
- Time Off Matters
Company Mission
J.D. Power is clear about what we do to ensure our success into the future. We unite industry leading data and insights with world-class technology to solve our clients' toughest challenges.
Our Values
At J.D. Power, we strive to be Truth Finders, Change Makers and Team Driven - the distinct behaviors that, together, define our unique culture.
J.D. Power is committed to employing a diverse workforce. Qualified applicants will receive consideration without regard to race, color, religion, sex, national origin, age, sexual orientation, gender identity, gender expression, veteran status, or disability.
J.D. Power is an equal-opportunity employer and compliant with AODA/ADA legislation. Should you require accommodations during the recruitment and selection process, please reach out to
To all recruitment agencies: J.D. Power does not accept unsolicited agency resumes and we are not responsible for any fees related to unsolicited resumes.
Data Engineer
Posted today
Job Description
At CI, we see a great place to work as one that is a safe place for everyone to have a voice, where people are empowered to take ownership over meaningful work, where there is an opportunity to grow by stretching themselves, where they can work on innovative products and projects, and where employees are supported and engaged in doing so.
We are currently seeking a Data Engineer to join the Data Solutions team in the Wealth Technology group at CI Wealth. The Data Solutions team focuses on developing data-driven solutions, driving innovation, and uncovering key insights to support strategic initiatives. The successful candidate will work on the data pipeline and data transformation tasks that underlie these strategic pillars.
RESPONSIBILITIES
- Collaborate with business teams and Data Analysts to gather and understand requirements, ensuring alignment with business objectives and data needs.
- Translate business needs into detailed technical requirements in collaboration with subject matter experts (SMEs) to ensure accuracy and feasibility.
- Recommend and design scalable, efficient, and effective data architectures and workflows to support current and future business requirements.
- Design, develop, and maintain data assets to enable seamless extraction, transformation, and loading (ETL/ELT) processes from diverse data sources, making data accessible to client-facing applications, data warehouses, and internal tools.
- Build, operate, and optimize highly scalable and reliable data pipelines and infrastructure to support analytics, reporting and operational use cases.
- Drive end-to-end ownership of projects, including planning, development, testing, and deployment, ensuring timely and successful delivery.
- Collaborate with Quality Assurance (QA) and Support teams to identify, troubleshoot, and resolve issues in production environments, ensuring the stability and reliability of data systems.
- Work with Release Management to plan, coordinate, and implement data pipeline updates, adhering to CI’s deployment standards and minimizing disruption to production systems.
- Implement and enforce best practices for observability, data lineage, and governance, ensuring transparency, reliability, and compliance across all data systems.
- Participate in data migration projects, transitioning legacy systems to modern platforms and architectures while minimizing disruption and data loss.
EXPERIENCE
- 3+ years of professional experience in data engineering, with a strong focus on designing, developing, and optimizing scalable data pipelines, ETL/ELT workflows, and data integration solutions using modern tools and technologies.
EDUCATION/TRAINING
- Post-secondary degree in a quantitative discipline.
KNOWLEDGE, SKILLS, AND ABILITIES
- Comprehensive understanding of data pipeline architecture, modern data stack architecture, and cloud-based platforms, including AWS, Snowflake, and other cloud-native solutions.
- In-depth knowledge of and experience with the following tools and concepts:
- Data extraction – SQL, Python, API invocation, CDC
- Database systems – PostgreSQL, Sybase, MySQL, DynamoDB
- Data storage repositories – SFTP, AWS S3
- Scheduling of data jobs – cron, Apache Airflow, AWS Step Functions
- ETL/ELT tools and workflow – Snowflake, PySpark, AWS Glue, EMR, AWS Lambda, SCD (see the sketch after this list)
- CI/CD pipelines – Bitbucket, Git, Jenkins, CloudFormation, Terraform, Flyway
- Strong knowledge of observability and data lineage implementation to ensure pipeline transparency and monitoring.
- A strong analytical mindset and sophisticated written and verbal communication skills.
- Experience in the Financial Services Industry is an asset.
- Ability to work within an organization built on continuous improvement.
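To ground the tools list above, here is a minimal sketch using the Snowflake Python connector to run a bulk COPY INTO load from an S3 external stage. The account settings, warehouse, stage, and table names are placeholders, and credentials are assumed to come from the environment rather than being hard-coded.

```python
# A minimal sketch, assuming the Snowflake Python connector; all names are
# placeholders and credentials come from environment variables.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORM_WH",
    database="WEALTH",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # COPY INTO is Snowflake's standard bulk-ingestion path from an
    # external stage (here, a hypothetical S3 landing area).
    cur.execute("""
        COPY INTO staging.positions
        FROM @s3_landing/positions/
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
    print(cur.fetchall())  # per-file load results for observability/logging
finally:
    conn.close()
```

In a pipeline like the ones described here, this step would typically be wrapped in an Airflow task or AWS Step Functions state rather than run ad hoc.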
WHAT WE OFFER
- Modern HQ location within walking distance from Union Station
- Equipment Purchase Program
- Training Reimbursement
- Paid Professional Designations
- Employee Share Purchase Program (ESPP)
- Corporate Discount Program
- Enhanced group benefits
- Parental Leave Top-up program
- Fitness membership discounts
- Paid volunteer days
We are focused on building a diverse and inclusive workforce. If you are excited about this role but are not confident you meet all the qualification requirements, we encourage you to apply and explore the opportunity further.
Please submit your resume in confidence by clicking “Apply”. Only qualified candidates selected for an interview will be contacted. CI Financial Corp. and all of our affiliates (“CI”) are committed to fair and accessible employment practices and provide reasonable accommodations for persons with disabilities. If you require accommodations in order to apply for any job opportunities, require this posting in an additional format, or require accommodation at any stage of the recruitment process please contact us at , or call ext. 4747.
Data Engineer
Posted today
Job Description
Tiger Analytics is a fast-growing advanced analytics consulting firm. Our consultants bring deep expertise in Data Science, Machine Learning, and AI. We are the trusted analytics partner for several Fortune 100 companies, enabling them to generate business value from data. Our business value and leadership have been recognized by various market research firms, including Forrester and Gartner. We are looking for top-notch talent as we continue to build the best global analytics consulting team in the world.
We are seeking an experienced Data Engineer with expertise in Dataiku to join our data team. As a Data Engineer, you will be responsible for designing, building, and maintaining data pipelines, data integration processes, and data infrastructure. You will collaborate closely with data scientists, analysts, and other stakeholders to ensure efficient data flow and support data-driven decision making across the organization.
Requirements
- 8+ years of overall industry experience specifically in data engineering
- Strong knowledge of data engineering principles, data integration, and data warehousing concepts.
- Proficiency in building and maintaining data pipelines using Dataiku (see the sketch after this list).
- Solid understanding of ETL processes and tools.
- Strong programming skills in Python, SQL, or Scala.
- Good understanding of data modeling, data architecture, and database design.
- Familiarity with cloud platforms like AWS, Snowflake, dbt, Azure, or GCP.
- Excellent problem-solving and troubleshooting skills.
- Strong communication and collaboration abilities.
- Attention to detail and a focus on delivering high-quality work.
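As a hedged illustration of pipeline work in Dataiku: a Python recipe typically reads a managed input dataset, transforms it, and writes the result to the recipe's output dataset. The dataset and column names below are hypothetical, and the `dataiku` package is assumed to be the one available inside the Dataiku DSS platform.

```python
# Illustrative sketch of a Dataiku Python recipe; dataset and column names
# are placeholders. The `dataiku` package is provided by the DSS platform.
import dataiku

# Read an input dataset managed by Dataiku into a pandas DataFrame.
orders = dataiku.Dataset("orders_raw").get_dataframe()

# Simple transformation steps: de-duplicate and derive a revenue column.
orders = orders.drop_duplicates(subset=["order_id"])
orders["revenue"] = orders["quantity"] * orders["unit_price"]

# Write the result back to the recipe's output dataset, with schema.
out = dataiku.Dataset("orders_clean")
out.write_with_schema(orders)
```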
Benefits
Significant career development opportunities exist as the company grows. The position offers a unique opportunity to be part of a small, challenging, and entrepreneurial environment, with a high degree of individual responsibility.
Data Engineer
Posted today
Job Description
Salary:
Location - Toronto Hub
Summary
We are seeking a talented and experienced Data Engineer to join our growing team. The Data Engineer will play a key role in designing, building, and maintaining our data infrastructure, ensuring scalability, reliability, and performance.
Who You Are:
- An engineer at heart who takes pride in a clean and powerful code base.
- Deep knowledge of data architecture and data engineering best practices, and a passion for making data accessible.
- Enjoys sharing knowledge and working in a collaborative team environment.
- Ready to take ownership and make decisions.
- Data governance, security, and compliance are always top of mind.
- Previous experience in ETL/ELT Architecture: Expertise in designing and optimizing ETL/ELT pipelines to handle various data formats (CSV, JSON, Parquet, etc.) and integrating data from multiple sources (e.g., APIs, cloud storage, client Snowflake shares).
- Strong understanding of REST API principles, experience with high-volume API requests, and the ability to optimize API calls and data ingestion strategies (see the ingestion sketch after this list).
- Proficiency in using orchestration tools like Apache Airflow, or similar tools to automate and manage data workflows.
- Expertise in building efficient ETL/ELT workflows to enable scalable feature engineering.
- Previous experience in performance testing and optimization (data load testing/performance tuning/monitoring) for various databases, and ETL pipelines.
- Experience building and testing resilient infrastructure using IaC and cloud-specific features for disaster recovery.
- Experience working in an Agile environment.
- Experience building data products in large scale distributed systems.
- Knowledge of industry best practices and compliance standards, such as DAMA, CCPA, PIPEDA, etc.
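A minimal sketch of the high-volume REST ingestion pattern mentioned in the list above: a shared session with retry/backoff, cursor pagination to bound memory use, and batched downstream writes. The endpoint, parameter names, and payload fields are assumptions.

```python
# Hypothetical sketch: resilient, high-volume REST ingestion.
# Endpoint, parameters, and payload fields are placeholders.
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# One session with automatic retry/backoff on throttling and server errors.
session = requests.Session()
retries = Retry(total=5, backoff_factor=1.0, status_forcelist=[429, 500, 502, 503])
session.mount("https://", HTTPAdapter(max_retries=retries))

def ingest(endpoint: str, page_size: int = 1000):
    """Yield records using cursor-based pagination to bound memory use."""
    cursor = None
    while True:
        params = {"limit": page_size}
        if cursor:
            params["cursor"] = cursor
        resp = session.get(endpoint, params=params, timeout=30)
        resp.raise_for_status()
        payload = resp.json()
        yield from payload["data"]          # assumed payload shape
        cursor = payload.get("next_cursor")  # assumed pagination field
        if not cursor:
            break

for record in ingest("https://api.example.com/v1/events"):
    ...  # buffer records and flush to cloud storage in batches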
What You Will Do:
- Work with business partners and stakeholders to understand data requirements and support data-related projects.
- Work with engineering, product teams and 3rd parties to collect required data.
- Drive data modeling and warehousing initiatives to enable and empower data consumers.
- Develop ETL/ELT pipelines to ingest, prepare and store data for the product, analysis, and data science.
- Develop and implement data quality checks, conduct QA and implement monitoring routines.
- Improve the reliability and scalability of our ETL processes.
- Develop and manage backup and recovery strategies to ensure data integrity and availability.
- Ensure our system is architected to balance cost and latency.
- Collaborate with partners to execute compliance, security, and privacy reviews/audits.
- Deploy data infrastructure to support product, analysis, and data science.
Qualifications:
- Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience: Minimum of 4 years working with databases, preferably within a platform tool and automation environment.
- Technical Skills: Programming languages (PySpark, Scala, Python, SnowSQL, Snowpipe, SQL, Terraform); data orchestration and automation (Airflow, Kubernetes, or similar); cloud infrastructure and data management systems (MongoDB, Snowflake, Databricks, and Azure, or similar).
- Problem-Solving: Strong analytical and problem-solving skills.
- Communication: Excellent verbal and written communication skills.
- Team Player: Ability to work effectively in a collaborative team environment.
- Knowledge of DevOps and mobile software development practices and tools.
Data Engineer
Posted today
Job Description
At CI, we see a great place to work as one that is a safe place for everyone to have a voice, where people are empowered to take ownership over meaningful work, where there is an opportunity to grow by stretching themselves, where they can work on innovative products and projects, and where employees are supported and engaged in doing so.
OVERVIEW
We are currently seeking a Data Engineer to join our CI GAM IT department. The successful candidate will work closely with our IT Client Reporting team on the development, modernization, and maintenance of our statements and tax processes. In this role, you will help deliver on business requirements by extracting data and by enhancing and modifying the applications used in client reporting. You will also assist in transforming our legacy applications for cloud adoption.
WHAT YOU WILL DO
- Design, build, and modernize our statements and tax reporting processes in the Snowflake Data Cloud environment.
- Develop integration processes using SQL and Python to produce curated data models.
- Optimize and maintain data architecture, including schema design, performance tuning, and query optimization.
- Develop and maintain comprehensive documentation for data processes, architecture, and procedures.
- Collaborate with business partners, business systems analysts, and other stakeholders to deliver data-driven solutions and insights.
- Participate in code reviews, provide mentorship to junior data engineers, and contribute to team knowledge sharing.
- Maintain and support existing legacy applications and processes for both CI GAM and CI Wealth
WHAT YOU WILL BRING
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
Experience
- 3+ years of experience in data engineering, data architecture, or a similar role in a professional environment
- Experience with Snowflake Data Cloud, dbt, and Airflow, including data modeling and cloud data architecture.
- Proficiency in SQL for data extraction, manipulation, and analysis.
- Experience developing in Python for data processing and automation.
- Solid experience with data storage solutions in AWS, such as Snowflake, S3, and Redshift.
- Experience with AWS services such as Step Functions, Glue, and Secrets Manager (see the sketch after this list).
- Strong analytical and problem-solving skills, with the ability to work effectively in a fast-paced, collaborative environment.
- Excellent communication skills, with the ability to translate complex technical concepts for non-technical stakeholders.
- Experience in the Financial Services Industry is an asset
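As a sketch of how the AWS and Snowflake pieces above might fit together (an assumption for illustration, not CI's actual design): fetch database credentials from AWS Secrets Manager, then open a Snowflake session for a reporting extract. The secret id, warehouse, database, and table names are invented.

```python
# Hypothetical sketch: Secrets Manager for credentials, then a Snowflake
# session for a client-reporting extract. All names are placeholders.
import json

import boto3
import snowflake.connector

def get_secret(secret_id: str) -> dict:
    """Fetch and decode a JSON secret from AWS Secrets Manager."""
    client = boto3.client("secretsmanager", region_name="ca-central-1")
    resp = client.get_secret_value(SecretId=secret_id)
    return json.loads(resp["SecretString"])

creds = get_secret("prod/snowflake/reporting")  # hypothetical secret id

conn = snowflake.connector.connect(
    account=creds["account"],
    user=creds["user"],
    password=creds["password"],
    warehouse="REPORTING_WH",
    database="CLIENT_REPORTING",
)

# Extract curated statement data for a downstream reporting job.
cur = conn.cursor()
cur.execute(
    "SELECT account_id, period_end, closing_balance "
    "FROM statements.curated_balances"
)
rows = cur.fetchall()
conn.close()
```

Keeping credentials in Secrets Manager (rather than in code or config) is what makes this pattern compatible with the CI/CD and IaC practices listed under Preferred Skills below.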
Preferred Skills
- Strong knowledge of agile methodologies and DevOps practices, including CI/CD pipelines, containerization, and infrastructure as code (IaC).
- Experienced in the use of Atlassian collaboration tools.
- Experience with shell-scripting languages such as KSH, and with SQR.
- Familiarity with job scheduler tools like AutoSys, cron, etc.
- Experience with RESTful APIs is a plus
- Experience in cyber-security measures is a plus
- Good interpersonal and team skills.
- Excellent oral and written communication skills.
- Strong organizational and analytical skills.
CI Financial is an independent company offering global wealth management and asset management advisory services through diverse financial services firms. Since 1965, we have consistently anticipated and responded to the changing needs of investors. We are driven by a commitment to provide individuals and institutions with the highest-quality investments and advice.
Our commitment to the highest levels of performance means that whatever their position, CI employees must be comfortable in a fast-paced environment that will stretch them to tap into their highest potential. Employees with a healthy dose of ambition, a desire to commit to a curious mindset for continuous learning, and a willingness to go the extra mile thrive at CI.
WHAT WE OFFER
- Modern HQ location within walking distance from Union Station
- Training Reimbursement
- Paid Professional Designations
- Employee Savings Plan (ESP)
- Corporate Discount Program
- Enhanced group benefits
- Parental Leave Top-up program
- Fitness membership discounts
- Paid volunteer day
We are focused on building a diverse and inclusive workforce. If you are excited about this role but are not confident you meet all the qualification requirements, we encourage you to apply and explore the opportunity further.
Please submit your resume in confidence by clicking “Apply”. Only qualified candidates selected for an interview will be contacted. CI Financial Corp. and all of our affiliates (“CI”) are committed to fair and accessible employment practices and provide reasonable accommodations for persons with disabilities. If you require accommodations in order to apply for any job opportunities, require this posting in an additional format, or require accommodation at any stage of the recruitment process please contact us at , or call ext. 4747.