Your career is an investment that grows over time!
Wealthsimple is on a mission to help everyone achieve financial freedom by reimagining what it means to manage your money. Using smart technology, we take financial services that are often confusing, opaque and expensive and make them transparent and low-cost for everyone. We're the largest fintech company in Canada, with more than 3 million users who trust us with more than $100 billion in assets.
Our teams ship often and make an impact with groundbreaking ideas. We're looking for talented people who keep it simple and value collaboration and humility as we continue to create inclusive and high-performing teams where people can be inspired to do their best work.
About the team:
The Analytics Platform team is the foundation that empowers our data analysts and data scientists to do their best work. We build the tools, infrastructure, and frameworks that make data work at Wealthsimple fast, reliable, and delightful. Our mission is to turn data complexity into developer productivity.
We're tackling some of the most impactful challenges in data infrastructure: building robust data quality and observability systems, creating intuitive testing frameworks, and enhancing the developer experience for dbt model development. Our work directly enables better, faster decision-making across the entire organization.
This is a collaborative, cross-functional role. You'll partner closely with data analysts and data scientists to understand their workflows and pain points, work with our Data Integrations team on Airflow orchestration challenges, and drive platform improvements that have company-wide impact.
About the role:
As a Senior Data Engineer on the Analytics Platform team, you'll be a force multiplier for our entire data organization. You'll design and build tools that make our data practitioners more productive, more confident in their work, and more impactful in their contributions to the business.
You'll own significant platform initiatives from conception to delivery, balancing technical excellence with pragmatic solutions that meet real user needs. This role requires both deep technical expertise and strong collaboration skills—you'll need to understand not just how to build great tools, but what tools to build and why.
What you will do:
- Build developer tools and frameworks that enable data analysts and scientists to work more efficiently and confidently. This includes data quality systems, observability platforms, testing frameworks, and enhanced dbt development experiences.
- Drive cross-team platform initiatives by engaging directly with stakeholders to understand their day-to-day challenges, gathering requirements, and translating them into elegant technical solutions.
- Collaborate closely with the Data Integrations team to solve Airflow orchestration challenges and improve workflow reliability across the data pipeline.
- Design and implement data quality and observability solutions that catch issues early, provide clear visibility into data health, and make debugging faster and easier.
- Enhance the dbt developer experience through better tooling, improved testing capabilities, and streamlined workflows that reduce friction in the model development process.
- Apply software engineering best practices to analytics infrastructure: version control, CI/CD, automated testing, and clear documentation that makes systems maintainable and scalable.
- Champion platform-as-a-product thinking: treat internal data practitioners as your users, gather feedback, measure impact, and continuously iterate to deliver maximum value.
What you bring:
- 5+ years of experience in data engineering, analytics engineering, or related technical roles with a focus on building tools and platforms.
- Expert-level proficiency in Python: you write clean, maintainable code and understand how to build robust, production-grade systems.
- Proficiency with Kubernetes and container orchestration, including experience deploying and managing data services in production Kubernetes environments. Experience with Helm charts for packaging, versioning, and deploying applications on Kubernetes.
- Strong infrastructure skills, including knowledge of infrastructure-as-code tools (like Terraform or CloudFormation) and cloud platforms (AWS preferred).
- Strong knowledge of dbt and modern analytics engineering practices. You understand not just how to use dbt, but how to make it work better for teams at scale.
- Experience with workflow orchestration tools, particularly Airflow, and an understanding of how to build reliable, observable data pipelines.
- Hands-on experience with modern data warehouses like Redshift and Snowflake, including performance optimization and cost management.
- A product mindset for internal tools: you understand that great developer tools require understanding your users, not just writing code.
- Strong communication and collaboration skills. You can explain complex technical concepts clearly and work effectively with stakeholders across different backgrounds.
Nice to have:
- Experience building data quality, testing, or observability platforms.
- Familiarity with data lineage, cataloging, or metadata management systems.
- Background working in cross-functional or platform engineering environments.
- Experience with large-scale data warehouse migrations or transformations.
- Contributions to open-source data tools or active participation in the data engineering community.