Senior Data Engineer, AI Platforms (Remote)

TELUS Digital Brazil


🌎 Remote

Posted on: 9 June, 2025


Senior Data Engineer, AI Platforms | TELUS Digital Brazil | Brazil

Who We Are

TELUS Digital is an award-winning digital product consultancy and the digital division of TELUS, one of Canada's largest telecommunications providers. We design and deliver transformative customer experiences through cutting-edge technology, agile thinking, and a people-first culture.

With a global team across North America, South America, Central America, Europe, and APAC, we offer end-to-end expertise across eight core service areas: Digital Product Consulting, Digital Marketing Services, Data & AI, Strategy Consulting, Business Operations Modernization, Enterprise Applications, Cloud Engineering, and QA & Test Engineering.

From mobile apps and websites to voice UI, chatbots, AI, customer service, and in-store solutions, TELUS Digital enables seamless, trusted, and digitally powered experiences that meet customers wherever they are, all backed by the secure infrastructure and scale of our multi-billion-dollar parent company.

Location and Flexibility

This role can be fully remote for candidates based in the states of São Paulo and Rio Grande do Sul, as well as in the cities of Rio de Janeiro, Belo Horizonte, Florianópolis, and Fortaleza, due to team distribution and occasional in-person opportunities. If you are based in São Paulo or Porto Alegre, you are welcome to work from one of our offices on a flexible schedule.

The Opportunity

As a Data Engineer on our growing Fuel IX team, you will design, implement, and maintain robust, scalable data pipelines that enable efficient data integration, storage, and processing across our various data sources. You will collaborate with cross-functional teams, including Data Scientists, Software Engineers, and other technical stakeholders, to ensure data quality and support data-driven decision-making.

Responsibilities

- Develop and optimize scalable, high-performing, secure, and reliable data pipelines that address diverse business needs and considerations
- Identify opportunities to enhance internal processes, implement automation to streamline manual tasks, and contribute to infrastructure redesign
- Help mentor and coach a product team toward shared goals and outcomes
- Navigate difficult conversations by providing constructive feedback to teams
- Identify obstacles to quality, and improve our user experience and how we build tests
- Be self-aware of your limitations, yet curious to learn new solutions and receptive to constructive feedback from teammates
- Engage in ongoing research and adoption of new technologies, libraries, frameworks, and best practices to enhance the capabilities of the data team

Qualifications

- 5+ years of relevant development experience writing high-quality code as a Data Engineer
- Active participation in the design and development of data architectures
- Hands-on experience developing and optimizing data pipelines
- Comprehensive understanding of data modeling, ETL processes, and both SQL and NoSQL databases
- Experience with a general-purpose programming language such as Python or Scala
- Experience with GCP platforms and services
- Experience with containerization technologies such as Docker and Kubernetes
- Proven track record of implementing and optimizing data warehousing solutions and data lakes
- Proficiency in DevOps practices and automation tools for continuous integration and deployment of data solutions
- Experience with machine learning workflows and supporting data scientists in model deployment
- Solid understanding of data security and compliance requirements in large-scale data environments
- Strong ability to communicate effectively with teams and stakeholders, providing and receiving feedback to improve product outcomes
- Proficiency in spoken and written English

Bonus Points

- Big data tools such as Hadoop, Spark, or Kafka
- Orchestration tools such as Airflow
- Experience in an Agile development environment and familiarity with Agile methodologies

Tags:
ai
ml
