Staff, Data Engineer (Global Security)

TORONTO, Ontario | April 18, 2026 | Full Time

Job Description

WHAT WILL YOU DO?

  • Build and Optimize Data Pipelines: Design, develop, and enhance scalable ETL/ELT pipelines to migrate, transform, and load large datasets from diverse sources (e.g., databases, APIs, flat files), ensuring seamless integration for analytics, reporting, and AI solutions.
  • Drive Technical Innovation: Leverage advanced tools and techniques to create reusable, secure, and efficient technical solutions that align with business needs and project lifecycle deliverables, including data sharing and governance.
  • Champion Snowflake Best Practices: Guide users on effective Snowflake utilization, establishing standards for data consumption, storage, and workflow integration while designing and implementing high-impact stored procedures.
  • Collaborate Across Teams: Partner with cross-functional stakeholders to translate data requirements into robust solutions that empower analytics, reporting, AI, and machine learning initiatives.
  • Strengthen Data Governance: Implement and maintain best practices for metadata management, access controls, and compliance to ensure data integrity and security.
  • Ensure Performance and Scalability: Monitor system performance, troubleshoot issues, and optimize queries/processes to maximize efficiency and scalability.
  • Automate and Streamline Workflows: Use Python scripting and orchestration tools (e.g., Airflow) to automate platform utilities and workflows, reducing manual effort and enhancing reliability.
  • Document and Share Knowledge: Create clear technical documentation for processes, architectures, and data models to foster team collaboration and institutional knowledge.
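At its core, the pipeline work described above is an extract/transform/load pass. As a minimal, hypothetical illustration only (the file layout, table schema, and filter rule are invented for this sketch, and sqlite3 stands in for a warehouse such as Snowflake):

```python
import csv
import io
import sqlite3

# Minimal ETL sketch: extract rows from a CSV source, transform them,
# and load them into a warehouse table (sqlite3 stands in for Snowflake).

RAW_CSV = """event_id,amount,currency
1,100.50,CAD
2,99.99,USD
3,12.00,CAD
"""

def extract(source: str):
    """Extract: parse raw CSV text into dict rows."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows):
    """Transform: cast types and keep only CAD events (illustrative rule)."""
    return [
        (int(r["event_id"]), float(r["amount"]))
        for r in rows
        if r["currency"] == "CAD"
    ]

def load(conn, rows):
    """Load: idempotent insert keyed on the primary key."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS events "
        "(event_id INTEGER PRIMARY KEY, amount REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO events VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(conn, transform(extract(RAW_CSV)))
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM events").fetchone()
print(total)  # (2, 112.5)
```

Because the load step is keyed on the primary key, re-running the pipeline does not duplicate rows, which is one common way pipelines of this kind are made safe to retry.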

WHAT DO YOU NEED TO SUCCEED?

  • Snowflake Expertise: Hands-on experience with Snowflake (e.g., designing stored procedures, optimizing queries, data storage/consumption best practices).
  • Data Pipeline Development: Proficiency in building, optimizing, and maintaining ETL/ELT pipelines for large-scale data migration and transformation.
  • Programming & Automation: Strong scripting skills in Python for automation and tool development; experience with workflow orchestration tools (e.g., Airflow).
  • Data Governance & Security: Knowledge of implementing data governance practices (metadata management, access controls, compliance).
  • Performance Optimization: Skills in monitoring, troubleshooting, and optimizing database/query performance for scalability.

Nice-to-have

  • Technical Collaboration & Communication: Ability to guide users/teams on platform best practices and present technical solutions in cross-functional meetings.
  • DevOps & CI/CD Practices: Knowledge of version control (Git), containerization (Docker), or CI/CD pipelines for data engineering workflows.
  • Documentation & Knowledge Sharing: Experience documenting technical processes, architectures, and data models for team use.
  • Advanced Data Modeling: Experience designing dimensional models (e.g., star/snowflake schemas) or optimizing data warehouses for analytics.
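The dimensional-modeling item above centres on star schemas: a fact table of measurements joined to descriptive dimension tables. A tiny, hypothetical sketch (table names and data are invented; sqlite3 again stands in for a real warehouse):

```python
import sqlite3

# Star-schema sketch: one fact table keyed to one dimension table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_region (region_id INTEGER PRIMARY KEY, region_name TEXT);
CREATE TABLE fact_sales (
    sale_id   INTEGER PRIMARY KEY,
    region_id INTEGER REFERENCES dim_region(region_id),
    amount    REAL
);
INSERT INTO dim_region VALUES (1, 'East'), (2, 'West');
INSERT INTO fact_sales VALUES (10, 1, 250.0), (11, 1, 50.0), (12, 2, 75.0);
""")

# A typical analytics query: aggregate facts grouped by a dimension attribute.
rows = conn.execute("""
    SELECT d.region_name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_region d USING (region_id)
    GROUP BY d.region_name
    ORDER BY d.region_name
""").fetchall()
print(rows)  # [('East', 300.0), ('West', 75.0)]
```

Keeping measures in the fact table and descriptive attributes in dimensions is what makes group-by queries like this simple and fast for reporting workloads.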

What’s in it for you?

We thrive on the challenge to be our best, progressive thinking to keep growing, and working together to deliver trusted advice to help our clients thrive and communities prosper. We care about each other, reaching our potential, making a difference to our communities, and achieving mutual success.

  • A comprehensive Total Rewards Program including bonuses and flexible benefits, competitive compensation, commissions, and stock where applicable.
  • Leaders who support your development through coaching and managing opportunities.
  • Ability to make a difference and lasting impact.
  • Work in a dynamic, collaborative, progressive, and high-performing team.
  • A world-class training program in financial services.
  • Opportunities to do challenging work.

Job Skills

Analytics, Big Data, Big Data Management, Cloud Computing, Collaborating, Critical Thinking, Database Development, Data Engineering, Data Mining, Data Modeling, Data Pipelines, Datasets, Data Warehousing (DW), ETL Processing, ETL Tools, Extract Transform Load (ETL), Git, Group Problem Solving, Quality Management, Requirements Analysis

Additional Job Details

Address:

16 York St, Toronto

City:

Toronto

Country:

Canada

Work hours/week:

37.5

Employment Type:

Full time

Platform:

TECHNOLOGY AND OPERATIONS

Job Type:

Regular

Pay Type:

Salaried

Posted Date:

2026-04-17

Application Deadline:

2026-05-01

Note: Applications will be accepted until 11:59 PM on the day prior to the application deadline date above

Our Employment Opportunities

At RBC, we are guided by living shared values of Client First, Integrity, Collaboration, Respect and Excellence and winning together as One RBC. We believe an inclusive workplace that has diverse perspectives is core to our continued growth as one of the largest and most successful banks in the world. Maintaining a workplace where our employees feel supported to perform at their best, effectively collaborate, drive innovation, and grow professionally helps to bring our Purpose to life and create value for our clients and communities. RBC strives to deliver this through policies and programs intended to foster a workplace based on respect, belonging and opportunity for all.

Join our Talent Community

Stay in the know about great career opportunities at RBC. Sign up and get customized info on our latest jobs, career tips and recruitment events that matter to you.

Expand your limits and create a new future together at RBC. Find out how we use our passion and drive to enhance the well-being of our clients and communities at jobs.rbc.com.

RBC is presently inviting candidates to apply for this existing vacancy. Applying to this posting allows you to express your interest in this current career opportunity at RBC. Qualified applicants may be contacted to review their resume in more detail.

