L4 Data Management
Scarborough, ON, CA, M1H 3J3; Edmonton, AB, CA; Vancouver, BC, CA; Calgary, AB, CA; Toronto, ON, CA
Description
Our team and what we’ll accomplish together
Be a part of a transformational journey with us, where innovative talent meets cutting-edge technologies. At TELUS, the Data Center of Excellence team is dedicated to making TELUS the most insights-driven company globally. We provide business intelligence, data assets, data products, business metrics, and data engineering that drive and enable BI and analytics for more than 15,000 employees across diverse internal teams.
We are seeking a highly skilled and experienced Data Management Professional specializing in Google Cloud Platform (GCP) and BigQuery (BQ) to join our dynamic team. The ideal candidate will have a strong background in engineering complex data assets and BI/data solutions, along with experience designing and implementing automated CI/CD pipelines and test automation. This role will involve collaborating with cross-functional teams to architect, implement, and manage cloud-based data solutions while ensuring security, compliance, and optimal performance. As a senior team member, you will provide mentorship, support your colleagues, and help build their expertise and capabilities.
What you’ll do
- Design and implement cloud-based data solutions on Google Cloud Platform (GCP), BigQuery, Snowflake, and PostgreSQL to support business intelligence (BI) and data analytics requirements.
- Define and establish best practices for automated regression testing to ensure the reliability and accuracy of data solutions.
- Establish robust Change Management and release management strategies to ensure smooth deployments and minimize downtime.
- Establish the Software Development Lifecycle (SDLC) process and best practices for cloud-based projects, including requirements gathering, design, development, testing, deployment, maintenance and documentation.
- Define and communicate the cloud vision and strategy, aligning technical solutions with business objectives and future scalability.
- Develop and maintain automated CI/CD pipelines for deploying data pipelines, analytics models, and applications.
- Design and implement ETL (Extract, Transform, Load) processes and integration workflows to ingest, process, and transform data from various sources into usable formats for analysis and reporting (see the sketch after this list).
- Implement compliance best practices to protect data assets and ensure regulatory compliance (e.g., SOX).
- Oversee configuration management processes to maintain consistency and scalability across cloud environments.
- Implement API integration strategies to enable seamless communication and data exchange between systems and applications.
- Design, develop, and optimize data models and Explores within the Looker platform.
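As an illustration of the BigQuery ELT work described above, here is a minimal sketch using the google-cloud-bigquery Python client; the project, bucket, dataset, and table names are hypothetical placeholders, not references to actual TELUS systems:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

# Extract + load: ingest a CSV from Cloud Storage into a staging table.
load_job = client.load_table_from_uri(
    "gs://example-bucket/orders.csv",   # hypothetical source file
    "my-project.staging.orders_raw",    # hypothetical staging table
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
    ),
)
load_job.result()  # block until the load job completes

# Transform: build an analytics table with standard SQL.
client.query(
    """
    CREATE OR REPLACE TABLE `my-project.analytics.daily_orders` AS
    SELECT order_date, SUM(amount) AS total_amount
    FROM `my-project.staging.orders_raw`
    GROUP BY order_date
    """
).result()
```

In practice, a step like this would typically run under an orchestrator (e.g., Dataflow or Cloud Composer) and be shipped through the automated CI/CD pipelines described above.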
Qualifications
What you bring
- Bachelor's degree in Computer Engineering, Computer Science, or a related field.
- 10+ years of experience with SQL/PL-SQL, stored procedures, BigQuery, and Dataflow (data pipeline tools), building complex datasets and developing data engineering pipelines.
- 5-7 years of experience in cloud architecture and data engineering, with a focus on Google Cloud Platform (GCP).
- 2 to 5 years of experience with infrastructure as code (IaC) tools such as Pulumi and Terraform, including version control and CI/CD integration (see the sketch after this list).
- 2 to 5 years of experience with dashboard and visualization tools such as Tableau, Looker, and Looker Studio.
- Experience designing and implementing automated CI/CD pipelines using native GCP services and tools like Jenkins, GitLab CI/CD, or similar.
- Proven track record of delivering complex BI/Data solutions in a cloud environment.
- Strong understanding of regression testing methodologies and tools for data solutions.
- Solid understanding of Change Management and release management principles and best practices.
- Hands-on experience with DevOps, SRE, and automation practices.
- Experience with code repositories and version control systems (e.g., Git, Bitbucket).
- Excellent communication and interpersonal skills with the ability to collaborate effectively with cross-functional teams.
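For illustration, a minimal sketch of the Pulumi-based infrastructure-as-code experience listed above, assuming the pulumi_gcp Python provider; the dataset, table, schema, and region are hypothetical placeholders:

```python
import pulumi
import pulumi_gcp as gcp

# Declare a BigQuery dataset and table as code; names are hypothetical.
dataset = gcp.bigquery.Dataset(
    "analytics",
    dataset_id="analytics",
    location="northamerica-northeast2",  # GCP Toronto region
)

table = gcp.bigquery.Table(
    "daily_orders",
    dataset_id=dataset.dataset_id,
    table_id="daily_orders",
    deletion_protection=False,  # allow `pulumi destroy` in non-prod stacks
    schema="""[
      {"name": "order_date", "type": "DATE"},
      {"name": "total_amount", "type": "NUMERIC"}
    ]""",
)

pulumi.export("dataset_id", dataset.dataset_id)
```

Because the resources are plain Python, they can be reviewed in Git and promoted through CI/CD like any other code.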
Great-to-haves
- Cloud certifications such as GCP (Professional Cloud Architect, Professional Data Engineer) or AWS (Certified Solutions Architect, Certified Developer).
- Data certifications (e.g., data engineer, Oracle, or others).
- Experience working with data visualization tools (e.g., Looker, Looker Studio, Tableau).
- Familiarity with containerization and orchestration technologies (e.g., Docker, Kubernetes).
- Experience with Azure and AWS
- Knowledge of security and compliance standards (e.g., SOX, GDPR) and best practices for cloud environments.
- Experience designing data models for Online Transactional Processing (OLTP) and Online Analytical Processing (OLAP) warehousing, ensuring optimal performance and scalability.
Join our team and be part of a dynamic environment where your expertise will make a significant impact on our success.