L3 Data Management
Vancouver, BC, CA · Toronto, ON, CA · Calgary, AB, CA · Edmonton, AB, CA
Description
Ready to create innovative solutions and best practices?
Our team and what we’ll accomplish together
The predictive analytics and customer experience insights team consists of Analysts, Data Scientists, and Data Engineers dedicated to improving the customer experience through advanced AI and ML technologies across TELUS Global Operations. Our core mission is to build global call center demand forecasting models that deliver meaningful, actionable insights, ensuring optimal outcomes for our customers, capacity planning teams, and business partners.
We foster an innovative environment with a flexible work style, allowing team members to work in or out of the office. Our contributions directly influence the business's direction and performance.
As an experienced professional in the field of data science, you will be part of the journey toward a deeper understanding of our customers, so that we can predict their needs and interaction behaviour.
You will use your expertise in data engineering to design, build, and maintain data pipelines, working as part of a friendly, cross-discipline agile team whose members help each other solve problems across all functions.
A critical aspect of this responsibility is continuously optimizing the performance of these pipelines to ensure efficient data flow and processing, together with a strong focus on data quality: rigorous checks and validation processes that guarantee accuracy and reliability.
In addition to technical delivery, this role is crucial for building and maintaining customer trust. This is accomplished by consistently applying and upholding best practices across key areas:
- Data Security and Integrity: Rigorously adhering to secure development methodologies to ensure data confidentiality and integrity.
- Universal Accessibility: Integrating accessibility considerations throughout the entire design and development lifecycle, ensuring systems and data are usable by all intended stakeholders.
- User-Centric Design: Emphasizing thoughtful and intuitive design principles to create user-friendly and highly effective data solutions.
What you’ll do
As a Data Engineer within the team, you will work with business stakeholders and data scientists to build data assets that power machine learning/AI models, protecting the customer base and enabling a variety of new models for our customers.
Key Responsibilities:
- Design, build, and maintain scalable and resilient data pipelines and analytics solutions on Google Cloud Platform (GCP); a minimal sketch of one such pipeline step follows this list.
- Architect technical solutions for data acquisition, processing, and storage, ensuring data quality and availability for critical business operations, analytics, and machine learning models.
- Develop data assets on GCP specifically for machine learning purposes.
- Automate and optimize internal data processes.
- Provide technical data stewardship, including metadata management, and implement security and privacy by design principles.
- Lead the design, development, and review of data architecture, models, flows, and integration; be hands-on across the entire ETL pipeline.
- Stay current with evolving Cloud and Big Data technologies, sharing knowledge and enabling best practices, standards, and governed processes across the organization.
- Offer support for data analysis and standard reporting needs.
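To ground the pipeline work above, here is a minimal sketch of one such step on GCP: loading CSV exports from Cloud Storage into BigQuery with the google-cloud-bigquery client. The project, dataset, table, and bucket names are hypothetical placeholders, not actual TELUS systems.

```python
from google.cloud import bigquery

# Hypothetical project, dataset, table, and bucket names.
client = bigquery.Client(project="my-project")
table_id = "my-project.demo_dataset.raw_calls"
uri = "gs://demo-bucket/exports/calls_*.csv"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # skip the header row
    autodetect=True,      # infer the schema from the files
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

# Start the load job and block until it finishes (raises on failure).
load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
load_job.result()

table = client.get_table(table_id)
print(f"{table.num_rows} rows now in {table_id}")
```

In practice, a load job like this would be one node in a scheduled, orchestrated pipeline rather than a standalone script.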
Qualifications
What you bring
- Cloud Platforms: Experience with at least one major cloud computing platform (GCP, AWS, Azure).
- Databases and SQL: Proficiency with SQL and relational databases, including BigQuery, DDL, and procedural SQL (see the BigQuery sketch after this list).
- Software Development: Ability to design, implement, and deploy software features based on business requirements.
- Version Control: General knowledge and practical use of version control systems (e.g., Git and GitHub).
- Linux Environment: Comfort working from the command line in a Linux environment, including familiarity with common commands and file systems.
- Containerization: Experience working with virtual machines and deploying containerized operating systems and applications.
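To illustrate the BigQuery, DDL, and procedural SQL skills named above, a brief sketch using the same google-cloud-bigquery client: it creates a table with a DDL statement, then runs a multi-statement script that declares a variable and upserts recent aggregates. All dataset and table names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project

# DDL: create the target table if it does not exist yet.
client.query("""
    CREATE TABLE IF NOT EXISTS demo_dataset.daily_call_volume (
      call_date DATE,
      queue     STRING,
      calls     INT64
    )
""").result()

# Procedural SQL: a multi-statement script with a declared variable
# that upserts the last seven days of call-volume aggregates.
client.query("""
    DECLARE start_date DATE DEFAULT DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY);

    MERGE demo_dataset.daily_call_volume AS t
    USING (
      SELECT call_date, queue, COUNT(*) AS calls
      FROM demo_dataset.raw_calls
      WHERE call_date >= start_date
      GROUP BY call_date, queue
    ) AS s
    ON t.call_date = s.call_date AND t.queue = s.queue
    WHEN MATCHED THEN UPDATE SET calls = s.calls
    WHEN NOT MATCHED THEN INSERT (call_date, queue, calls)
      VALUES (s.call_date, s.queue, s.calls);
""").result()
```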
Great-to-haves
- Education: A degree in a software or computing field, such as Computer Science.
- Programming Languages: Development experience in Python or in compiled (non-interpreted) languages.
- Container Tools: Experience with Docker or Podman, including writing Dockerfiles and troubleshooting issues; a short sketch using the Docker SDK for Python follows this list.
- Package Managers: Experience with various package managers (e.g., apt, dnf, pip).
- Version Control Maintenance: Experience maintaining version control system repositories.
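For the container tooling listed above, a short sketch using the Docker SDK for Python (docker-py): it builds an image from a local Dockerfile and runs the resulting container. This assumes the docker package is installed and a Docker daemon is running; the image tag is a hypothetical placeholder.

```python
import docker

# Talk to the local Docker daemon (must be running).
client = docker.from_env()

# Build an image from the Dockerfile in the current directory;
# the tag is a hypothetical placeholder.
image, build_logs = client.images.build(path=".", tag="demo-pipeline:latest")

# Run the image, capture its stdout, and remove the container afterwards.
output = client.containers.run("demo-pipeline:latest", remove=True)
print(output.decode())
```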