104 GCP Jobs in Canada
DevOps Engineer – GCP
Posted 1 day ago
Google Support Engineer - GCP
Posted 14 days ago
Job Description
**In this role, you will**
Manage non-standard/complex P1, P2 (major incidents), and P3 and P4 incidents and service requests.
Drive root cause analysis on repeatable incidents to help prevent issues in the future.
Ensure customer service satisfaction and enable continuous improvements.
Oversee vendor's service delivery and escalation.
Provide operational consultancy for future-state technologies.
Stay updated with emerging security threats and industry best practices related to container security and cloud-native technologies.
Participate in incident response activities, security incident investigations, and post-mortem analysis to improve incident handling processes.
**What you'll need to succeed (required skills)**
3+ years of experience with container technologies such as Kubernetes, Google Kubernetes Engine (GKE), AKS, Docker, and Podman.
Familiarity with Cloud PaaS services such as Google Cloud Run, GKE Autopilot, and Anthos Service Mesh.
Experience developing CI/CD pipelines using technologies such as GitHub Actions and Jenkins.
Strong understanding of network security principles, encryption protocols, and identity management concepts.
Strong understanding of Kubernetes resource types (e.g., cluster roles, services, deployments).
Experience developing Helm charts.
Experience implementing Kubernetes technologies such as network policies, service mesh, certificate manager, and ingress controllers.
Experience developing compliance policies/scripts using tools such as Google Org Policy, Aquasec, and Wiz.
Experience supporting GCP services such as GKE, BigQuery, Cloud SQL (SQL/PostgreSQL), Redis, Cassandra, Bigtable, Cloud Filestore, Persistent Storage, Apigee, Kafka, Dataflow, and GCS.
Knowledge of monitoring tools such as Dynatrace, Datadog, etc.
Experience supporting an Azure public cloud environment is not required but would be valuable.
Thorough problem determination skills to troubleshoot and resolve business application issues.
Knowledge of OS technologies (Red Hat Linux, Windows).
Understanding of DevOps practices and the Agile methodology.
Working knowledge of Local Area Networks (LAN) and Wide Area Networks (WAN).
Comfortable working in a rapidly changing, technically complex environment.
Knowledge of scripting languages and tools such as Python, JavaScript, PowerShell, and Bash.
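As a purely illustrative sketch of the scripting side of this role — grouping recurring incidents so that root cause analysis can focus on repeat offenders — the following uses only invented ticket fields; real incidents would come from an ITSM tool's API:

```python
from collections import Counter

def repeat_incident_candidates(incidents, min_count=2):
    """Rank (service, symptom) signatures that recur across incidents.

    Hypothetical sketch: the 'service' and 'symptom' keys are made up
    for illustration, not taken from any specific ticketing system.
    """
    counts = Counter((i["service"], i["symptom"]) for i in incidents)
    repeats = [(f"{svc}: {sym}", n)
               for (svc, sym), n in counts.items() if n >= min_count]
    # Most frequent signatures first: these are the RCA candidates.
    return sorted(repeats, key=lambda pair: pair[1], reverse=True)
```

A signature recurring three times would surface ahead of one recurring twice, giving the on-call engineer a triage order for post-mortem work.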
At Cognizant, we're eager to meet people who believe in our mission and can make an impact in various ways. We strongly encourage you to apply even if you only meet the required skills listed. Consider what transferable experience and skills make you a unique applicant and help us see how you'd be beneficial to this role.
**_Cognizant will only consider applicants for this position who are legally authorized to work in Canada without requiring employer sponsorship, now or at any time in the future._**
**Working arrangements**
At Cognizant, we strive to provide flexibility wherever possible, and we are here to support a healthy work-life balance through our various well-being programs. Based on this role's business requirements, this is a hybrid position in Toronto, Ontario.
_Note: The working arrangements for this role are accurate as of the date of posting. This may change based on the project you're engaged in, as well as business and client requirements. Rest assured, we will always be clear about role expectations._
Cognizant is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.
GCP Data Architect
Posted 7 days ago
Job Description
Insight Global is seeking a highly skilled Google Cloud Platform (GCP) Data Architect with SAP data integration expertise to design, implement, and oversee enterprise-grade data solutions. The ideal candidate will combine deep expertise in cloud data platforms, data governance, security, and data modeling with hands-on experience in ETL/ELT pipelines, SAP data extraction, system migrations, and analytics. This role will collaborate with business stakeholders and engineering teams to create a robust, scalable, and cost-effective data ecosystem that bridges SAP and GCP environments.
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy.
Skills and Requirements
Proven experience with GCP BigQuery, Cloud Storage, Pub/Sub, Dataflow.
Strong SQL and Python programming skills.
Hands-on experience with SAP data extraction, modeling, and integration from ERP, BW, and/or HANA systems.
Knowledge of data governance frameworks and data security best practices.
Experience with Boomi, Informatica, or MuleSoft for SAP and non-SAP integrations.
Experience in Google Cortex Framework for SAP-GCP integrations.
Nice to have: SAP S/4HANA migration; Looker, Tableau, or Power BI; consumer products, manufacturing, or retail industry experience; Boomi, Informatica, or MuleSoft.
GCP Data Architect
Posted today
Job Description
Software International (SI) supplies technical talent to a variety of clients ranging from Fortune 100/500/1000 companies to small and mid-sized organizations in Canada/US and Europe.
We currently have an indefinite contract role as a GCP Data Architect with our global consulting client, working remotely. This is a 6-month contract initially, but it could be extended.
Role: GCP Data Architect
Type: Contract
Duration: 6 months to start + potential extension
Location: Toronto, ON - remote with occasional office visits
Rate: $100-$120 CAD/hr (C2C), depending on overall experience
GCP Data Architect - Role Overview
We are seeking a highly skilled Google Cloud Platform (GCP) Data Architect with strong SAP data integration expertise to design, implement, and oversee enterprise-grade data solutions. The ideal candidate will combine deep expertise in cloud data platforms, data governance, security, and data modeling with hands-on experience in ETL/ELT pipelines, SAP data extraction, system migrations, and analytics. This role will collaborate with business stakeholders and engineering teams to create a robust, scalable, and cost-effective data ecosystem that bridges SAP and GCP environments.
Key Responsibilities
1. Data Strategy, Security & Governance
- Define and implement enterprise-wide data strategy aligned with business goals.
- Establish data governance frameworks, data classification, retention, and privacy policies.
- Ensure compliance with industry standards and regulations (e.g., GDPR, HIPAA, PIPEDA).
2. Data Architecture & Modeling
- Design conceptual, logical, and physical data models to support analytics and operational workloads.
- Implement star, snowflake, and data vault models for analytical systems.
- Implement S/4HANA CDS views in Google BigQuery.
3. Google Cloud Platform Expertise
- Architect data solutions on GCP using BigQuery, Cloud Storage, Dataflow, and Dataproc.
- Implement cost optimization strategies for GCP workloads.
4. Data Pipelines & Integration
- Design and orchestrate ETL/ELT pipelines using Apache Airflow (Cloud Composer) and Dataflow.
- Integrate data from multiple systems, including SAP BW, SAP HANA, and BusinessObjects, using tools such as SAP SLT or the Google Cortex Framework.
- Leverage integration tools such as Boomi for system interoperability.
5. Programming & Analytics
- Develop complex SQL queries for analytics, transformations, and performance tuning.
- Build automation scripts and utilities in Python.
- Demonstrate a good understanding of CDS views and the ABAP language.
6. System Migration
- Lead on-premises-to-cloud migrations for enterprise data platforms (SAP BW/BusinessObjects).
- Manage migration of SAP datasets to GCP, ensuring data integrity and minimal downtime.
7. DevOps for Data
- Implement CI/CD pipelines for data workflows using GitHub Actions, Cloud Build, and Terraform.
- Apply infrastructure-as-code principles for reproducible and scalable deployments.
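The SAP-to-BigQuery pipeline work described above can be sketched, in a deliberately simplified form, as a transform step that flattens a nested SAP-style document into BigQuery-ready rows. The field names (MBLNR, BUDAT, MATNR, MENGE) echo common SAP conventions, but the schema and helper here are hypothetical, not part of any posting:

```python
from datetime import datetime

def flatten_sap_material_doc(doc):
    """Flatten a nested SAP-style material document into flat row dicts.

    Illustrative only: a real pipeline would run this inside a Dataflow
    step or Composer task and load the rows into a BigQuery table.
    """
    header_id = doc["MBLNR"]  # material document number (header level)
    # SAP dates often arrive as YYYYMMDD strings; normalize to ISO.
    posted = datetime.strptime(doc["BUDAT"], "%Y%m%d").date()
    rows = []
    for item in doc["ITEMS"]:
        rows.append({
            "doc_id": header_id,
            "posting_date": posted.isoformat(),
            "material": item["MATNR"],
            "quantity": float(item["MENGE"]),
        })
    return rows
```

In practice the orchestration (Airflow scheduling, retries, BigQuery load jobs) would wrap many small, testable transforms like this one.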
Preferred Skills
- Proven experience with GCP BigQuery, Cloud Storage, Pub/Sub, and Dataflow.
- Strong SQL and Python programming skills.
- Hands-on experience with SAP data extraction, modeling, and integration from ERP, BW, and/or HANA systems.
- Knowledge of data governance frameworks and data security best practices.
- Experience with Boomi, Informatica, or MuleSoft for SAP and non-SAP integrations.
- Experience in Google Cortex Framework for SAP-GCP integrations.
Senior Data Engineer – GCP & Informatica
Posted 16 days ago
GCP Data Architect - Remote
Posted today
Job Description
Software International (SI) supplies technical talent to a variety of clients ranging from Fortune 100/500/1000 companies to small and mid-sized organizations in Canada/US and Europe.
We currently have an indefinite contract role as a GCP Data Architect with our global consulting client, working remotely. This is a 6-month contract initially, but it could be extended.
Role: GCP Data Architect
Type: Contract
Duration: 6 months to start + potential extension
Location: Toronto, ON - remote with occasional office visits
Rate: $110-$140 CAD/hr (C2C), depending on overall experience
GCP Data Architect - Role Overview
We are seeking a highly skilled Google Cloud Platform (GCP) Data Architect with strong SAP data integration expertise to design, implement, and oversee enterprise-grade data solutions. The ideal candidate will combine deep expertise in cloud data platforms, data governance, security, and data modeling with hands-on experience in ETL/ELT pipelines, SAP data extraction, system migrations, and analytics. This role will collaborate with business stakeholders and engineering teams to create a robust, scalable, and cost-effective data ecosystem that bridges SAP and GCP environments.
Key Responsibilities
1. Data Strategy, Security & Governance
- Define and implement enterprise-wide data strategy aligned with business goals.
- Establish data governance frameworks, data classification, retention, and privacy policies.
- Ensure compliance with industry standards and regulations (e.g., GDPR, HIPAA, PIPEDA).
2. Data Architecture & Modeling
- Design conceptual, logical, and physical data models to support analytics and operational workloads.
- Implement star, snowflake, and data vault models for analytical systems.
- Implement S/4HANA CDS views in Google BigQuery.
3. Google Cloud Platform Expertise
- Architect data solutions on GCP using BigQuery, Cloud Storage, Dataflow, and Dataproc.
- Implement cost optimization strategies for GCP workloads.
4. Data Pipelines & Integration
- Design and orchestrate ETL/ELT pipelines using Apache Airflow (Cloud Composer) and Dataflow.
- Integrate data from multiple systems, including SAP BW, SAP HANA, and BusinessObjects, using tools such as SAP SLT or the Google Cortex Framework.
- Leverage integration tools such as Boomi for system interoperability.
5. Programming & Analytics
- Develop complex SQL queries for analytics, transformations, and performance tuning.
- Build automation scripts and utilities in Python.
- Demonstrate a good understanding of CDS views and the ABAP language.
6. System Migration
- Lead on-premises-to-cloud migrations for enterprise data platforms (SAP BW/BusinessObjects).
- Manage migration of SAP datasets to GCP, ensuring data integrity and minimal downtime.
7. DevOps for Data
- Implement CI/CD pipelines for data workflows using GitHub Actions, Cloud Build, and Terraform.
- Apply infrastructure-as-code principles for reproducible and scalable deployments.
Preferred Skills
- Proven experience with GCP BigQuery, Cloud Storage, Pub/Sub, and Dataflow.
- Strong SQL and Python programming skills.
- Hands-on experience with SAP data extraction, modeling, and integration from ERP, BW, and/or HANA systems.
- Knowledge of data governance frameworks and data security best practices.
- Experience with Boomi, Informatica, or MuleSoft for SAP and non-SAP integrations.
- Experience in Google Cortex Framework for SAP-GCP integrations.
Product Architect – Data Engineering (GCP Focus)
Posted 15 days ago
Job Description
Python
Apache Spark
NoSQL
Google Cloud Platform (GCP)
Job Responsibilities
Coach and mentor squad members in:
Scrum practices
Software craftsmanship
Zoro norms
Design, develop, document, deploy, and maintain data pipelines
Collect, analyze, and profile various batch and streaming data sources
Lead Master Data Management (MDM) efforts within the squad
Actively participate and lead guild and chapter meetings
Collaborate with stakeholders to:
Groom ideas into small, independent, testable work items
Collaborate with Data Ops to:
Automate code analysis
Automate testing, building, and deploying
Collaborate with squads and tribes to:
Manage, groom, and prioritize technical debt
Expert Experience Required
Google Cloud Platform (GCP)
Relational Databases (RDBMS)
Python and SQL (with focus on data manipulation and analysis)
NoSQL Databases (e.g., MongoDB)
Data Modeling and ETL/ELT processes
Building, deploying, and maintaining data pipelines
Sourcing and profiling highly variable data
Software craftsmanship , including:
Behavior-Driven Development (BDD)
Unit testing
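The software-craftsmanship expectations above (BDD, unit testing) might look like the following in practice — a made-up pipeline helper paired with a given/when/then-style test; both the function and the scenario are invented for illustration:

```python
def dedupe_preserving_order(values):
    """Drop repeated keys from a stream, keeping first occurrences in order.

    Hypothetical helper, used only to demonstrate the testing style.
    """
    seen = set()
    # set.add returns None (falsy), so membership is checked before adding.
    return [v for v in values if not (v in seen or seen.add(v))]

def test_given_duplicates_when_deduped_then_first_occurrence_wins():
    # Given a stream with repeated keys
    stream = [3, 1, 3, 2, 1]
    # When we deduplicate it
    result = dedupe_preserving_order(stream)
    # Then the first occurrence of each key survives, in order
    assert result == [3, 1, 2]

test_given_duplicates_when_deduped_then_first_occurrence_wins()
```

The BDD flavour here is in the scenario-shaped test name and the given/when/then structure; a team using a BDD framework would express the same scenario as an executable specification.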
Tools & Technologies
MongoDB
BigQuery
Jira
Git
Kubernetes
Jenkins
Terraform
GCP Deployment Manager
Apache Airflow
Apache Beam
Apache Spark
Soft Skills
Strong people skills
Ability to build strong, meaningful, and lasting collaborative relationships
Education
Bachelor’s degree in:
Computer Science
Applied Mathematics
Engineering
Or other technology-related field
Master’s degree preferred
Equivalent working experience is acceptable in lieu of formal education
Cloud Automation Specialist - AWS, Azure, GCP
Posted 24 days ago
Explore Google Cloud Platform (GCP) opportunities across Canada, with roles spanning cloud engineering, architecture, and data analytics. Professionals with expertise in GCP services like Compute Engine, Kubernetes Engine, and Cloud Storage are in high demand.