Senior Data Engineering Consultant
Locations: Toronto, ON; Scarborough, ON (M1H 3G2); Calgary, AB; Vancouver, BC; Burnaby, BC; Edmonton, AB (Canada)
Our team and what we'll accomplish together
The Data Engineering and Metadata Management team is helping TELUS become a world-class leader in data solutions by delivering robust data pipelines, high-quality data, and comprehensive metadata solutions built on unified, scalable platforms and advanced AI tooling. We foster a culture oriented around data products and data platforms while always keeping an eye on the horizon, preparing for the next big thing.
We are searching for a talented Senior Data Engineering Consultant to join our team. Reporting to the Manager – Metadata Management and Data Quality, the successful candidate will design and implement enterprise-scale data solutions that drive digital transformation across the organization. As a key technical expert in our cloud-first organization, you will draw on deep analytical expertise to architect data pipelines, assess data lineage and impact across upstream and downstream systems, and modernize our enterprise data platform using Google Cloud Platform services. The role combines advanced proficiency in data engineering and cloud architecture with hands-on development experience in native GCP tools (BigQuery, Dataflow, Composer) and third-party data technologies, along with the ability to establish and evangelize cloud-focused data engineering best practices and standards across TELUS.
What you will do
- Lead the architecture and design of data pipelines, data products, and platforms with a focus on metadata management and data quality frameworks across on-prem and cloud environments
- Design and build robust data extraction, loading, and transformation pipelines that embed data lineage, cataloging, and quality controls throughout the data lifecycle (a minimal example of such a quality control appears after this list)
- Perform application impact assessments and requirements reviews, and develop comprehensive work estimates for complex data engineering initiatives
- Develop test strategies and site reliability engineering measures to ensure high availability and performance of data products and solutions
- Lead agile development scrums and facilitate solution reviews with cross-functional teams including business partners, domain architects, and product management
- Mentor junior Data Engineering Specialists, providing technical guidance and fostering their professional growth
- Conduct technical data stewardship tasks, including metadata management, data cataloging, data lineage tracking, and implementation of security- and privacy-by-design principles
- Lead resolution of critical operations issues and conduct post-implementation reviews to drive continuous improvement
- Identify opportunities for automation in development and production processes to reduce manual effort and improve efficiency
- Propose solutions that enable business objectives while advancing enterprise technology strategies and metadata management capabilities
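For a sense of what embedding quality controls throughout the data lifecycle can look like in practice, here is a minimal, illustrative sketch of an automated completeness check in BigQuery using the google-cloud-bigquery Python client. The `analytics.orders` table and `customer_id` column are hypothetical placeholders, not systems referenced by this role.

```python
# Minimal sketch of an embedded data-quality gate on BigQuery.
# Assumes GCP credentials are available in the environment; the
# `analytics.orders` table and `customer_id` column are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

# Completeness rule: every order must carry a customer key.
query = """
    SELECT COUNT(*) AS null_keys
    FROM `analytics.orders`
    WHERE customer_id IS NULL
"""

row = next(iter(client.query(query).result()))
if row.null_keys:
    # Failing loudly lets an orchestrator halt downstream loads.
    raise ValueError(f"Quality check failed: {row.null_keys} rows missing customer_id")
print("Completeness check passed")
```

A check like this would typically run as a gating task inside the pipeline orchestrator, so downstream transformations proceed only on clean data.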
What you bring
- 6+ years of data engineering experience with 4+ years focused on data platform solution architecture and design
- Bachelor's degree in Computer Science, Software Engineering, or an equivalent field
- Advanced proficiency in programming languages, including Python, as well as low-code tools such as Informatica, DataStage, or equivalent platforms
- Demonstrated experience leading and mentoring data engineering teams, driving technical excellence, and influencing architectural decisions across the organization
- Strong SQL and database proficiency, with a minimum of 2 years of hands-on experience across various database technologies
- Expertise in task automation and orchestration using tools like Control-M, Apache Airflow, Prefect, or similar platforms for setting up DAGs and managing workflows (a minimal DAG sketch follows this list)
- Advanced Unix scripting capabilities and deep knowledge of Quality Assurance and Site Reliability Engineering principles
- Intermediate-to-advanced proficiency in infrastructure-as-code and configuration management tools such as Terraform, Puppet, or Ansible, as well as in data modeling techniques
- Working knowledge of MLOps practices and their integration with data engineering workflows
- Systems design experience with the ability to architect and explain complex systems interactions, data flows, common interfaces, and APIs
- Architecture-level understanding of APIs, security architecture, and cloud-native development across major cloud providers (GCP, Azure, AWS) and data platforms such as Databricks and Snowflake
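As a concrete illustration of the DAG setup mentioned above, here is a minimal Airflow sketch (assuming Airflow 2.x); the DAG ID and task bodies are hypothetical placeholders, not part of the role's requirements.

```python
# Minimal, illustrative Airflow 2.x DAG: a daily extract-then-load flow.
# All names here (example_daily_load, extract, load) are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract() -> None:
    # Real pipelines would pull from a source system here.
    print("extracting source data")


def load() -> None:
    # Real pipelines would write to a warehouse table here.
    print("loading into the warehouse")


with DAG(
    dag_id="example_daily_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # enforce extract-before-load ordering
```

The `>>` operator declares the dependency between tasks, which is the core of workflow management in Airflow.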
Great-to-haves
- Google Cloud Professional Data Engineer certification
- Understanding of TM Forum (TMF) standards
- Experience with Databricks, Azure, and AWS