L4 Data Management (Data Engineer)
Burnaby, BC, CA Toronto, ON, CA Québec, QC, CA Vancouver, BC, CA Ottawa, ON, CA Montréal, QC, CA Calgary, AB, CA Edmonton, AB, CA
Join our team and what we'll accomplish together
The Data Strategy and Implementation team's mission is to make TELUS a global leader in data solutions. We are the architects of the future: we build unified, scalable data platforms and pave the way for a product- and data-driven culture that fuels decisions and innovation across the entire company. Our work is the foundation on which advanced analytics and breakthrough artificial intelligence (AI) are built.
As a data expert on our team, you will be at the heart of this transformation. You will apply your expertise in data systems, architecture, and software development to build and deploy the robust, high-quality data pipelines that power our business. You will work closely with data scientists, machine learning engineers, data architects, and business stakeholders across the company to turn complex data problems into reliable, scalable, and automated data solutions.
What you will do
- Partner with business and technology stakeholders to understand data requirements and translate them into technical designs for resilient, high-performance data pipelines.
- Lead the design, development, and deployment of sophisticated data ingestion, transformation, and delivery solutions using modern, cloud-native technologies.
- Develop and maintain robust, scalable data pipelines for continuous and reliable data flow, ensuring the quality and availability of data for machine learning models, analytics, and critical business operations.
- Work with diverse and complex datasets, engineering elegant solutions to extract, model, and prepare data for a variety of downstream use cases.
- Architect and implement the transformation and modernization of our data solutions, leveraging Google Cloud Platform (GCP) native services and other leading-edge data technologies.
- Identify opportunities for and implement automation in the development and production lifecycle to improve efficiency, reliability, and speed.
- Champion and implement best practices in software engineering, data engineering, and data management within the team.
- Continuously assess the evolving data technology landscape and identify new opportunities to drive innovation and efficiency.
What you bring
- 5+ years of IT platform implementation experience, with a proven track record of designing, building, and deploying data pipelines and solutions that deliver tangible business value.
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Deep understanding and hands-on experience in data engineering principles and best practices, with advanced knowledge of Dataflow, Spark, or Kafka.
- Strong systems design experience with the ability to architect, document, and explain complex system interactions, data flows, and APIs.
- Excellent communication skills, with the ability to articulate complex data concepts and solutions clearly to diverse technical and non-technical audiences.
- Advanced experience with Python and mastery of data manipulation libraries (like Pandas), paired with strong SQL skills for complex querying and data analysis.
- Familiarity with at least one major cloud computing platform (GCP, AWS, Azure), with practical experience deploying data solutions.
- Strong analytical and problem-solving skills, with a talent for translating complex business needs into scalable and maintainable data solutions.
- A passion for innovation, a results-oriented mindset, an agile approach with a bias for action, and a change agent mentality.
Great-to-haves
- GCP Professional Data Engineer certification.
- Practical experience with Databricks, Azure, or AWS data services.
- An understanding of telecommunications data models or TMF standards.
- Experience in MLOps, supporting the data infrastructure required for machine learning workflows.
- Familiarity with Infrastructure as Code (e.g., Terraform) and CI/CD practices.