L4 Data Management (Data Engineer)
Burnaby, BC, CA; Vancouver, BC, CA; Edmonton, AB, CA; Montréal, QC, CA; Calgary, AB, CA; Toronto, ON, CA; Ottawa, ON, CA; Québec, QC, CA
Join our team and discover what we'll accomplish together
The Data Strategy & Enablement (DSE) team is on a mission to make TELUS a world-class leader in data solutions. We're the architects of the future, building the unified, scalable data platforms and pioneering the data-product culture that powers decisions and innovation across the entire organization. Our work is the bedrock upon which advanced analytics and game-changing AI are built.
As a Data Engineer on our team, you will be at the heart of this transformation. You will apply your expert knowledge of data systems, architecture, and software development to build and deploy the robust, high-quality data pipelines that fuel our business. You'll collaborate closely with data scientists, ML engineers, architects, and business stakeholders, turning complex data challenges into reliable, scalable, and automated data solutions.
What you will do
- Partner with business and technology stakeholders to understand data requirements and translate them into technical designs for resilient, high-performance data pipelines.
- Lead the design, development, and deployment of sophisticated data ingestion, transformation, and delivery solutions using modern, cloud-native technologies.
- Develop and maintain robust, scalable data pipelines for continuous and reliable data flow, ensuring the quality and availability of data for machine learning models, analytics, and critical business operations.
- Work with diverse and complex datasets, engineering elegant solutions to extract, model, and prepare data for a variety of downstream use cases.
- Architect and implement the transformation and modernization of our data solutions, leveraging Google Cloud Platform (GCP) native services and other leading-edge data technologies.
- Identify opportunities for and implement automation in the development and production lifecycle to improve efficiency, reliability, and speed.
- Champion and implement best practices in software engineering, data engineering, and data management within the team.
- Continuously assess the evolving data technology landscape and identify new opportunities to drive innovation and efficiency.
What you bring
- 5+ years of IT platform implementation experience, with a proven track record of designing, building, and deploying data pipelines and solutions that deliver tangible business value.
- Bachelor's degree in Computer Science, Engineering, or an equivalent field.
- Deep understanding and hands-on experience in data engineering principles and best practices, with advanced knowledge of Dataflow, Spark, or Kafka.
- Strong systems design experience with the ability to architect, document, and explain complex system interactions, data flows, and APIs.
- Excellent communication skills, with the ability to articulate complex data concepts and solutions clearly to diverse technical and non-technical audiences.
- Advanced experience with Python and mastery of data manipulation libraries (like Pandas), paired with strong SQL skills for complex querying and data analysis.
- Familiarity with at least one major cloud computing platform (GCP, AWS, Azure), with practical experience deploying data solutions.
- Strong analytical and problem-solving skills, with a talent for translating complex business needs into scalable and maintainable data solutions.
- A passion for innovation, a results-oriented mindset, an agile approach with a bias for action, and a change agent mentality.
Great-to-haves
- GCP Professional Data Engineer certification.
- Practical experience with Databricks, Azure, or AWS data services.
- An understanding of telecommunications data models or TMF standards.
- Experience in MLOps, supporting the data infrastructure required for machine learning workflows.
- Familiarity with Infrastructure as Code (e.g., Terraform) and CI/CD practices.