1,056 Python Engineer jobs in Canada
Senior Gen AI Python Engineer

Posted 3 days ago
Job Description
As a Senior GenAI Python Engineer, you will be responsible for building robust data pipelines, developing innovative GenAI solutions, and implementing custom algorithms to identify sophisticated market abuse patterns. You will contribute to architectural decisions, optimize engineering processes, and help maintain application health, infrastructure setup, and CI/CD pipelines.
This is an exciting opportunity to work on a high-impact project that will significantly influence our business and shape the future architecture of our transaction monitoring capabilities. You will be part of a dynamic team driving innovation in a critical area of the financial industry.
**Key Responsibilities:**
Design, develop, and maintain data-centric applications that host data pipelines and algorithms for detecting potential market abuse.
Develop and implement GenAI-based solutions for the financial industry, leveraging foundational models such as Gemini, Llama, GPT, and Claude.
Collaborate closely with clients and stakeholders to understand their requirements for platform development features and prioritize work accordingly.
Work effectively in a multidisciplinary team, building strong relationships with developers, Quants/Data Scientists, and production support teams.
Contribute to the team's strategy for development and deployment best practices, ensuring code quality, efficiency, and maintainability.
Participate in the full software development lifecycle, from design and implementation to testing and deployment.
Contribute to monitoring application health, infrastructure setup, and CI/CD processes.
Implement and maintain data pipelines for ingesting, processing, and analyzing large datasets.
**Skills & Qualifications:**
Extensive experience designing, developing, and deploying high-performance Python-based backend services.
Mandatory in-depth expertise with Pandas and NumPy for data manipulation and analysis.
Strong working knowledge of Kafka, Spark, Dask, and GenAI technologies.
Experience with NLP models, evaluation scenarios, prompt engineering, and RAG techniques is highly desirable.
Solid understanding of databases and experience with SQL and NoSQL technologies (e.g., SQL Server, Oracle, Couchbase, MongoDB).
Experience working in a DevOps culture, with strong advocacy for automation and continuous improvement.
Proficiency with CI/CD tools (e.g., IBM UrbanCode Deploy, TeamCity, Jenkins), monitoring tools, and log aggregation tools.
Demonstrated high development standards, with a strong focus on code quality, unit testing, continuous integration, and deployment.
Excellent communication and interpersonal skills, with a proven ability to interact with clients and deliver results.
Experience working in fast-paced development environments, with a track record of delivering high-quality solutions on time and within budget.
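The Pandas/NumPy requirement above sits at the heart of the market-abuse detection work this role describes. As an illustrative sketch only (the column names, data, and 2.5-sigma threshold are assumptions, not the employer's actual detection logic), a per-trader outlier check might look like:

```python
import pandas as pd

# Illustrative only: a toy per-trader volume series with one obvious spike.
trades = pd.DataFrame({
    "trader_id": ["A"] * 10,
    "volume": [100, 110, 95, 105, 100, 98, 102, 5000, 99, 101],
})

# Flag volumes far from each trader's own mean (per-group z-score).
grp = trades.groupby("trader_id")["volume"]
trades["zscore"] = (trades["volume"] - grp.transform("mean")) / grp.transform("std")
flagged = trades[trades["zscore"].abs() > 2.5]
```

Real surveillance pipelines layer many such signals together, but the groupby-transform pattern shown here is the idiomatic Pandas building block.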
**Additional Desirable Skills:**
Experience with containerization technologies such as Docker and orchestration platforms like Kubernetes.
Experience with cloud platforms (e.g., AWS, Azure, GCP).
Knowledge of financial markets and market abuse detection techniques.
Experience with machine learning model deployment and monitoring.
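The RAG techniques mentioned among the skills above follow a simple shape: retrieve relevant documents, then assemble a grounded prompt for a model. A minimal stdlib-only sketch (keyword-overlap scoring stands in for the embedding similarity a production system would use; the documents are invented):

```python
import re

def tokens(text: str) -> set[str]:
    return set(re.findall(r"[a-z]+", text.lower()))

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    # Rank documents by how many query terms they share; keep the top k.
    q = tokens(query)
    return sorted(documents, key=lambda d: len(q & tokens(d)), reverse=True)[:k]

def build_prompt(query: str, context: list[str]) -> str:
    # Ground the model's answer in the retrieved context only.
    bullets = "\n".join(f"- {c}" for c in context)
    return f"Answer using only this context:\n{bullets}\n\nQuestion: {query}"

docs = [
    "Wash trading is buying and selling the same instrument to fake volume.",
    "Spoofing places orders with no intent to execute them.",
    "The cafeteria menu changes weekly.",
]
prompt = build_prompt("What is wash trading?", retrieve("What is wash trading?", docs))
```

The resulting prompt string would then be sent to a foundational model such as those named above.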
**Qualifications:**
6-10 years of relevant experience in an Apps Development or systems analysis role
Extensive experience in systems analysis and programming of software applications
Experience in managing and implementing successful projects
Subject Matter Expert (SME) in at least one area of Applications Development
Ability to adjust priorities quickly as circumstances dictate
Demonstrated leadership and project management skills
Consistently demonstrates clear and concise written and verbal communication
**Education:**
Bachelor's degree/University degree or equivalent experience
Master's degree preferred
This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
---
**Job Family Group:**
Technology
---
**Job Family:**
Applications Development
---
**Time Type:**
Full time
---
**Most Relevant Skills**
Please see the requirements listed above.
---
**Other Relevant Skills**
For complementary skills, please see above and/or contact the recruiter.
---
_Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law._
_If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi._
_View Citi's EEO Policy Statement and the Know Your Rights poster._
Citi is an equal opportunity and affirmative action employer.
Minority/Female/Veteran/Individuals with Disabilities/Sexual Orientation/Gender Identity.
Senior Data Engineer (Python, Spark, Snowflake)
Posted today
Job Description
Senior Data Engineer (Python, Spark, Snowflake)
Hybrid (3 days remote), Toronto, Ontario, Canada
Experience: 5+ Years
Role Summary: NearSource is looking for a Senior Data Engineer with expertise in Python, Spark, and Snowflake to join our engineering team in Toronto. The selected candidate will architect and scale data pipelines, enable analytics, and support critical business insights for a Fortune 500 client.
Key Responsibilities:
Architect and scale data infrastructure to support batch and real-time processing of billions of records
Break down complex problems, document technical solutions, and deliver iterative improvements
Automate cloud infrastructure, services, and observability for resilient data platforms
Develop and maintain CI/CD pipelines with robust testing automation
Partner with data engineers, scientists, and product managers to understand requirements and promote best practices
Design and optimize SQL-based solutions for analytics and business intelligence
Build reports, dashboards, and analytics to deliver actionable insights to stakeholders
Support analytics initiatives, including funnel metrics, campaign performance, segmentation, and revenue growth
Deliver clear presentations, translating complex data problems into strategic insights
Must-Have Skills:
5+ years of experience in big data systems, data processing, and SQL databases
Strong proficiency with Spark data frames, Spark SQL, and PySpark
Hands-on programming expertise in Python and SQL, writing modular and maintainable code
Solid understanding of SQL, dimensional modeling, and analytical data warehouses (Hive, Snowflake)
Experience with ETL workflow management tools such as Airflow
Proficiency in BI and dashboarding tools including PowerBI, Tableau, or Qlik
Experience with version control and CI/CD pipelines (Git, Jenkins)
Proficiency with notebook solutions such as Jupyter, EMR Notebooks, or Apache Zeppelin
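The dimensional-modeling and SQL items above describe typical analytical-warehouse work. A toy star schema in stdlib sqlite3 (all table and column names are invented for illustration; a real warehouse would be Hive or Snowflake) shows the pattern:

```python
import sqlite3

# A minimal star schema: one fact table keyed to one dimension table.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (product_id INTEGER, revenue REAL);
    INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
    INSERT INTO fact_sales VALUES (1, 10.0), (1, 15.0), (2, 40.0);
""")

# The classic analytical query: aggregate facts, grouped by a dimension attribute.
rows = con.execute("""
    SELECT p.category, SUM(f.revenue)
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.category ORDER BY p.category
""").fetchall()
# rows == [('books', 25.0), ('games', 40.0)]
```

Separating facts from dimensions this way is what keeps BI-tool queries (Tableau, Power BI, Qlik) simple and fast at scale.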
Nice-to-Have Skills:
Familiarity with Presto and AWS cloud services
Knowledge of Looker for BI and dashboard development
Experience in distributed systems reliability and resiliency engineering
Exposure to observability and monitoring frameworks
Certifications / Education: Bachelor's degree in Computer Science, Engineering, or related field (or equivalent training and experience)
Apply now, or share your resume with salary expectations. Thank you for considering a career with us! Once you submit your application, our Talent Acquisition team will review your resume thoroughly. If there's a strong match, we'll reach out to discuss your experience, role details, benefits, compensation, and next steps. While we strive for transparency, we may not be able to respond to every applicant due to high volume, but we genuinely appreciate your time and interest.
About NearSource: NearSource Technologies is a trusted partner for future-ready software consulting, enabling Fortune 500 enterprises to accelerate digital transformation. Our global engineering teams build and deploy impactful technology for some of the world's most admired brands, working directly on long-term client initiatives.
Equal Opportunity Statement: NearSource is an equal opportunity employer committed to fostering an inclusive and respectful environment. We celebrate diversity and do not discriminate based on race, gender, religion, sexual orientation, age, disability, or background. Innovation thrives when everyone feels empowered to contribute.
Senior Python Software Engineer

Posted 3 days ago
Job Description
**Responsibilities**
+ Code, test, and implement data structures and algorithms within the relevant framework
+ Work within an Agile or Scaled Agile Framework (SAFe) workflow pattern
+ Collaborate across a cross-functional team that includes engineers, designers, and product managers
+ Deliver high-quality code that meets relevant KPIs
+ You will be given tasks with a very high-level definition and will be expected to gather low-level requirements, consider edge cases, produce a system design, and deliver the implementation
**Requirements**
+ 7+ years of hands-on experience with Python, React (TypeScript), or other programming languages
+ Experience with FastAPI (or another Python framework), AWS, Kubernetes, and Terraform
+ Experience with React APIs and other front-end technologies
+ Readiness to work in a start-up environment and to learn new languages as new technologies arrive and the team adapts throughout development
+ A high bar for quality and reliability, and self-direction in environments that promote autonomy and self-governance
**We offer**
+ Extended Healthcare with Prescription Drugs, Dental and Vision, and Healthcare Spending Account (Company Paid)
+ Life and AD&D Insurance (Company Paid)
+ Employee Assistance Program (Company Paid)
+ Telehealth (Company Paid)
+ Short-term Disability (Company Paid)
+ Long-Term Disability
+ Paid Time Off (including vacation and sick days)
+ Registered Retirement Savings Plan (RRSP) with Company match
+ Maternity/Parental/Adoption Leave Top-up
+ Employee Stock Purchase Program
+ Critical Illness Insurance
+ Employee Discounts
+ Unlimited access to LinkedIn learning solutions
EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our clients, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential. Engineer the Future with a Career at EPAM.
This posting includes a base salary range EPAM Canada would reasonably expect to pay the selected candidate. Individual compensation offers within the range are based on a variety of factors, including, but not limited to, experience, credentials, education, training, the demand for the role, skillset, and overall business and local labour market considerations. Most candidates are hired at a salary within the range disclosed. Salary range: CA$140,000-CA$160,000. In addition, the details highlighted in this job posting above are a general description of all other expected benefits and compensation for the position.
EPAM Canada welcomes and encourages applications from candidates with disabilities. Please contact WFA Human Resource CA if you have questions in this regard, if you require an accommodation to complete the application process, or to review EPAM's Accessibility for Ontarians with Disabilities Accessibility Policies and Multi-Year Access.
EPAM Systems, Inc. is an equal opportunity employer. We recognize the value of diversity and inclusion in creating success for our customers, business partners, shareholders, employees and communities. We are committed to recruiting, hiring, developing and promoting employees without discrimination. As a global employer, this commitment includes complying with all laws in the countries in which we operate. Nevertheless, we believe equal employment practices should not be limited to what the law requires. Equal opportunity and inclusion are essential to motivate, empower and recognize the best in everyone.
At EPAM, employment actions are based on individual qualifications, without regard to race, color, religion, creed, gender, pregnancy status, sexual orientation, gender identity, gender expression, marital or familial status, national origin, ancestry, genetics, age, disability status, veteran status, citizenship status when otherwise legally able to work, or any other characteristic protected by law.
Python ETL Developer/Data Engineer - Remote
Posted today
Job Description
Specific Duties
- Review, design, and develop ETL jobs to ingest data into the Data Lake and load data to data marts
- Extract data to integrate with various business applications
- Parse unstructured and semi-structured data such as XML
- Design and develop efficient mappings and workflows to load data to data marts
- Map XML DTD schemas in Python (customized table definitions)
- Write efficient queries and reports in Hive or Impala to extract data on an ad hoc basis for data analysis
- Identify performance bottlenecks in ETL jobs and tune their performance by enhancing or redesigning them
- Take responsibility for performance tuning of ETL mappings and queries
- Import tables and all necessary lookup tables to facilitate the ETL process required to process daily XML files, in addition to processing very large (multi-terabyte) historical XML data files
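The XML-ingestion duties above amount to flattening tree-structured records into tabular rows before loading them into data marts. A minimal stdlib sketch (the `<orders>` schema here is invented, not the project's actual DTD):

```python
import xml.etree.ElementTree as ET

# Illustrative payload; real feeds would follow the project's DTD.
xml_doc = """
<orders>
  <order id="1"><symbol>ABC</symbol><qty>100</qty></order>
  <order id="2"><symbol>XYZ</symbol><qty>250</qty></order>
</orders>
"""

# Flatten each <order> element into a row dict ready for a data-mart load.
root = ET.fromstring(xml_doc)
rows = [
    {
        "order_id": int(order.get("id")),
        "symbol": order.findtext("symbol"),
        "qty": int(order.findtext("qty")),
    }
    for order in root.iter("order")
]
```

For multi-terabyte historical files, the same logic would move to a streaming parser (`ET.iterparse`) so the whole document never sits in memory at once.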
Data Platform Engineer - Python & Spark
Posted today
Job Description
Data Platform Engineer - Python & Spark
Hybrid (3 days remote), Toronto, Ontario, Canada
Experience: 5+ Years
Role Summary: NearSource is looking for a Data Platform Engineer with expertise in Python, Spark, and Snowflake to join our engineering team in Toronto. The selected candidate will architect and scale data pipelines, enable analytics, and support critical business insights for a Fortune 500 client.
Key Responsibilities
- Design, build, and scale modern data platforms capable of handling both real-time and batch processing at enterprise scale.
- Translate complex business and technical challenges into actionable solutions through detailed documentation and iterative delivery.
- Automate deployment, monitoring, and management of cloud-based data infrastructure for high availability and reliability.
- Implement and maintain CI/CD pipelines with a strong focus on automated testing and continuous improvement.
- Collaborate closely with data engineers, data scientists, and product stakeholders to gather requirements and drive data best practices.
- Develop and optimize SQL queries and data models to enable efficient analytics and reporting.
- Create intuitive dashboards, reports, and analytical tools that provide actionable insights to business leaders.
- Contribute to analytics initiatives such as performance tracking, segmentation, and revenue optimization.
- Communicate findings effectively by presenting data-driven insights in a clear and strategic manner.
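The analytics responsibilities above (performance tracking, segmentation) often reduce to funnel arithmetic: how many users survive each step, and at what rate. A stdlib-only sketch with invented user events:

```python
# Count distinct users reaching each funnel step, then step-over-step conversion.
events = [
    ("u1", "visit"), ("u1", "signup"), ("u1", "purchase"),
    ("u2", "visit"), ("u2", "signup"),
    ("u3", "visit"),
]
steps = ["visit", "signup", "purchase"]

users_per_step = {s: {u for u, e in events if e == s} for s in steps}
counts = [len(users_per_step[s]) for s in steps]
rates = [counts[i] / counts[i - 1] for i in range(1, len(counts))]
# counts == [3, 2, 1]; rates are visit->signup and signup->purchase conversion
```

At the billions-of-records scale this posting describes, the same aggregation would run as a Spark or SQL job, but the metric definitions stay this simple.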
Required Skills & Experience
- 5+ years of hands-on experience working with large-scale data systems, processing pipelines, and relational databases.
- Strong expertise with Spark (DataFrames, Spark SQL, PySpark).
- Advanced programming skills in Python and SQL, with a focus on clean, modular code.
- Solid foundation in SQL concepts, dimensional data modeling, and data warehouses such as Hive or Snowflake.
- Experience with workflow orchestration tools like Airflow.
- Proficiency with BI tools such as Tableau, Power BI, or Qlik.
- Knowledge of CI/CD practices and version control systems (Git, Jenkins).
- Hands-on experience with interactive notebook environments (Jupyter, EMR, or Zeppelin).
Preferred Skills
- Familiarity with Presto and AWS services.
- Experience working with Looker for BI development.
- Understanding of distributed system design with a focus on reliability and resilience.
- Exposure to monitoring and observability tools.
Certifications / Education:
Bachelor's degree in Computer Science, Engineering, or related field (or equivalent training and experience)
Apply now, or share your resume with salary expectations. Thank you for considering a career with us! Once you submit your application, our Talent Acquisition team will review your resume thoroughly. If there's a strong match, we'll reach out to discuss your experience, role details, benefits, compensation, and next steps. While we strive for transparency, we may not be able to respond to every applicant due to high volume, but we genuinely appreciate your time and interest.
About NearSource: NearSource Technologies is a trusted partner for future-ready software consulting, enabling Fortune 500 enterprises to accelerate digital transformation. Our global engineering teams build and deploy impactful technology for some of the world's most admired brands, working directly on long-term client initiatives.
Equal Opportunity Statement: NearSource is an equal opportunity employer committed to fostering an inclusive and respectful environment. We celebrate diversity and do not discriminate based on race, gender, religion, sexual orientation, age, disability, or background. Innovation thrives when everyone feels empowered to contribute.
Senior Python Backend Engineer
Posted today
Job Description
Data Theorem is an exciting company focused on creating a more secure world for data. Rooted in a strong Engineer first culture, every employee has an impact on product and direction. We are searching for exceptional talent pursuing an opportunity to grow and take ownership of the projects that resonate most with them.
As a Senior Python Backend Engineer, you will be responsible for implementing web services, libraries and tools in Python, in order to automate the security analysis of mobile, cloud and web applications, at scale. We help thousands of Developers and Security Engineers discover, understand, and fix security and privacy issues affecting their applications.
You will:
Implement web services using Python, and deploy them to Google Cloud using modern technologies such as Cloud Functions and Cloud Run.
Collaborate with the Design team and the Front-end team to build new customer-facing UIs and flows for security analysis and automation.
Be an active member of Data Theorem's Engineering team, which is spread across the United States, England, France, and Canada.
Contribute to our scanning platform, which is able to scan millions of mobile, web, and cloud assets every day to validate their security.
We’re looking for someone who has:
4+ years of Software Engineering experience.
Significant experience implementing web services and APIs in Python.
Familiarity with modern practices and tools for developing in Python (testing frameworks, type annotations, etc.).
Bonus points: experience with Google Cloud, Cloud Run, PostgreSQL, or Firestore.
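The "modern practices" item above (testing frameworks, type annotations) is easy to picture with a small typed helper plus a pytest-style test. Everything here is illustrative, not Data Theorem's actual code; `Finding` and `triage` are invented names:

```python
from dataclasses import dataclass

@dataclass
class Finding:
    severity: str  # e.g. "high", "medium", "low"
    title: str

def triage(findings: list[Finding], min_severity: str = "high") -> list[Finding]:
    """Return findings at or above the given severity, worst first."""
    rank = {"low": 0, "medium": 1, "high": 2}
    kept = [f for f in findings if rank[f.severity] >= rank[min_severity]]
    return sorted(kept, key=lambda f: rank[f.severity], reverse=True)

# pytest-style test: a plain assert on a small fixture.
def test_triage_filters_and_sorts():
    fs = [Finding("low", "a"), Finding("high", "b"), Finding("medium", "c")]
    assert [f.title for f in triage(fs, "medium")] == ["b", "c"]

test_triage_filters_and_sorts()
```

Type annotations like these let tools such as mypy catch mistakes before runtime, and the test doubles as documentation of the intended behavior.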
Senior Data Engineer (Java + Python + AWS)
Posted today
Job Description
Manage timelines/deliverables within the team towards the successful delivery of projects.
Design software solutions by interacting with portfolio managers, traders, operations staff and peers to understand requirements.
Develop solutions that are in line with client's technology biases, deliver efficiency and scalability, and enable new trading activities.
Provide knowledge transfer to team members and support staff through application demos, walkthroughs, and documentation.
Required skills:
- Java and Python capabilities
- Data engineering skills, with experience designing databases, data modelling, and formulating data governance principles
- AWS