Senior Data Engineer (GCP) - AI Platforms at TELUS Digital Brazil
Position: Senior Data Engineer (GCP)
Who We Are
Welcome to TELUS Digital — where innovation drives impact at a global scale.
As an award-winning digital product consultancy and the digital division of TELUS, one of Canada's largest telecommunications providers, we design and deliver transformative customer experiences through cutting-edge technology, agile thinking, and a people-first culture.
With a global team across North America, South America, Central America, Europe, and APAC, we offer end-to-end expertise across eight core service areas: Digital Product Consulting, Digital Marketing Services, Data & AI, Strategy Consulting, Business Operations Modernization, Enterprise Applications, Cloud Engineering, and QA & Test Engineering.
From mobile apps and websites to voice UI, chatbots, AI, customer service, and in-store solutions, TELUS Digital enables seamless, trusted, and digitally powered experiences that meet customers wherever they are — all backed by the secure infrastructure and scale of our multi-billion-dollar parent company.
Location and Flexibility
This role can be fully remote for candidates based in the states of São Paulo and Rio Grande do Sul, as well as in the cities of Rio de Janeiro, Belo Horizonte, Florianópolis, and Fortaleza, due to team distribution and occasional in-person opportunities.
If you are based in São Paulo or Porto Alegre, you are welcome to work from one of our offices on a flexible schedule.
The Opportunity
As a Data Engineer on our growing Fuel IX team, you will be responsible for designing, implementing, and maintaining robust, scalable data pipelines, enabling efficient data integration, storage, and processing across our various data sources.
You will collaborate with cross-functional teams, including Data Scientists, Software Engineers, and other technical stakeholders, to ensure data quality and support data-driven decision-making.
Responsibilities
Develop and optimize scalable, high-performing, secure, and reliable data pipelines that address diverse business needs and considerations
Identify opportunities to enhance internal processes, implement automation to streamline manual tasks, and contribute to infrastructure redesign
Help mentor and coach a product team towards shared goals and outcomes
Navigate difficult conversations by providing constructive feedback to teams
Identify obstacles to quality and improve both our user experience and how we build tests
Be aware of your own limitations, stay curious about new solutions, and remain receptive to constructive feedback from teammates
Engage in ongoing research and adoption of new technologies, libraries, frameworks, and best practices to enhance the capabilities of the data team
Qualifications
5+ years of relevant development experience writing high-quality code as a Data Engineer
Active participation in the design and development of data architectures
Hands-on experience in developing and optimizing data pipelines
Comprehensive understanding of data modeling, ETL processes, and both SQL and NoSQL databases
Experience with a general-purpose programming language such as Python or Scala
Experience with GCP platforms and services
Experience with containerization technologies such as Docker and Kubernetes
Proven track record in implementing and optimizing data warehousing solutions and data lakes
Proficiency in DevOps practices and automation tools for continuous integration and deployment of data solutions
Experience with machine learning workflows and supporting data scientists in model deployment
Solid understanding of data security and compliance requirements in large-scale data environments
Strong ability to communicate effectively with teams and stakeholders, providing and receiving feedback to improve product outcomes
Proficient in spoken and written English
Bonus Points
Experience with big data tools such as Hadoop, Spark, or Kafka
Experience with orchestration tools such as Airflow
Experience in an Agile development environment and familiarity with Agile methodologies
Seniority level: Mid-Senior level
Employment type: Full-time
Job function: Information Technology
Industries: Hospitals and Health Care, Non-profit Organizations, Government Administration
Senior Data Engineer • Porto Alegre, Rio Grande do Sul, Brazil