Data Scientist/ Applied AI - Latin America - Remote

Azumo

🌎 Remote

Posted on: 11 September, 2025


Azumo is currently looking for a highly motivated Data Scientist / Machine Learning Engineer to develop and enhance our data and analytics infrastructure. The position is FULLY REMOTE and based in Latin America.

This position will provide you with the opportunity to collaborate with a dynamic team and talented data scientists in the field of big data analytics and applied AI. If you have a passion for designing and implementing advanced machine learning and deep learning models, particularly in the Generative AI space, this role is perfect for you. We are seeking a skilled professional with expertise in Python for production-level projects, proficiency in machine learning and deep learning techniques such as CNNs and Transformers, and hands-on experience working with PyTorch.

We’re looking for a versatile Machine Learning Engineer / Data Scientist to join our big-data analytics team. In this hybrid role you’ll not only design and prototype novel ML/DL models, but also productionize them end-to-end, integrating your solutions into our data pipelines and services. You’ll work closely with data engineers, software developers and product owners to ensure high-quality, scalable, maintainable systems.

Key Responsibilities

Model Development & Productionization

  • Design, train, and validate supervised and unsupervised models (e.g., anomaly detection, classification, forecasting).
  • Architect and implement deep learning solutions (CNNs, Transformers) with PyTorch.
  • Develop and fine-tune Large Language Models (LLMs) and build LLM-driven applications.
  • Implement Retrieval-Augmented Generation (RAG) pipelines and integrate with vector databases.
  • Build robust pipelines to deploy models at scale (Docker, Kubernetes, CI/CD).

Data Engineering & MLOps

  • Ingest, clean and transform large datasets using libraries like pandas, NumPy, and Spark.
  • Automate training and serving workflows with Airflow or similar orchestration tools.
  • Monitor model performance in production; iterate on drift detection and retraining strategies.
  • Implement LLMOps practices for automated testing, evaluation, and monitoring of LLMs.

Software Development Best Practices

  • Write production-grade Python code following SOLID principles, with unit tests and code reviews.
  • Collaborate in Agile (Scrum) ceremonies; track work in JIRA.
  • Document architecture and workflows using PlantUML or comparable tools.

Cross-Functional Collaboration

  • Communicate analysis, design and results clearly in English.
  • Partner with DevOps, data engineering and product teams to align on requirements and SLAs.

At Azumo we strive for excellence and strongly believe in professional and personal growth. We want each individual to be successful and pledge to help them achieve their goals while at Azumo and beyond. Challenging ourselves and learning new technologies is at the core of what we do. We believe in giving back to our community and volunteer our time for philanthropy, open-source initiatives, and knowledge sharing.

Based in San Francisco, California, Azumo is an innovative software development firm helping organizations make insightful decisions using the latest technologies in data, cloud and mobility. We combine expertise in strategy, data science, application development and design to drive digital transformation initiatives for companies of all sizes.

If you are qualified for this opportunity and looking for a challenge, please apply online at azumo.co/careers or connect with us at people@azumo.co.

Minimum Qualifications

  • Bachelor’s or Master’s in Computer Science, Data Science or related field.
  • 5+ years of professional experience with Python in production environments.
  • Solid background in machine learning & deep learning (CNNs, Transformers, LLMs).
  • Hands-on experience with PyTorch or similar frameworks (training, custom modules, optimization).
  • Proven track record deploying ML solutions.
  • Expert in pandas, NumPy and scikit-learn.
  • Familiarity with Agile/Scrum practices and tooling (JIRA, Confluence).
  • Strong foundation in statistics and experimental design.
  • Excellent written and spoken English.

Preferred Qualifications

  • Experience with cloud platforms (AWS, GCP, or Azure) and their AI-specific services like Amazon SageMaker, Google Vertex AI, or Azure Machine Learning.
  • Familiarity with big-data ecosystems (Spark, Hadoop).
  • Practice in CI/CD & container orchestration (Jenkins/GitLab CI, Docker, Kubernetes).
  • Exposure to MLOps/LLMOps tools (MLflow, Kubeflow, TFX).
  • Experience with Large Language Models, Generative AI, prompt engineering, and RAG pipelines.
  • Hands-on experience with vector databases (e.g., Pinecone, FAISS).
  • Experience building AI Agents and using frameworks like Hugging Face Transformers, LangChain or LangGraph.
  • Documentation skills using PlantUML or similar.

Benefits

  • Paid time off (PTO)
  • U.S. Holidays
  • Training
  • Free Udemy Premium access
  • Mentored career development
  • Profit Sharing
  • Remuneration in USD
