AI/ML Engineer

Globo Language Solutions
Washington, US
Remote

Job Description

Job Type

Full-time

Description

Reporting to the Director of Data & AI Engineering, the AI/ML Engineer is a mid-level, hands-on technical role responsible for building and maintaining the data pipelines, AI models, and intelligent features that power the GLOBO platform. This role spans the full lifecycle of AI development: from cleaning and preparing data, to building and evaluating models, to shipping production features that directly improve operational efficiency and customer experience.

The AI/ML Engineer works across GLOBO's modern data stack (Fivetran, dbt, Snowflake) and AI infrastructure (AWS Bedrock, LLMs, agentic frameworks) to deliver reliable, well-tested solutions. This person is equally comfortable wrangling messy data and prompt-engineering an LLM, and takes pride in writing clean, tested code that other engineers can build on.

Key Responsibilities:

  • Data Engineering & Pipeline Development: Build and maintain ETL/ELT pipelines that move data from source systems into Snowflake via Fivetran and dbt. Clean, validate, and transform raw data into well-modeled datasets ready for analytics and ML workloads. Monitor pipeline health and troubleshoot data quality issues as they arise.
  • Model Development, Testing & Evaluation: Build, fine-tune, and evaluate AI/ML models and LLM-based solutions for specific business problems (e.g., automated QA, intelligent routing, context-aware translation aids). Design and run evaluation frameworks to measure model performance, detect hallucinations and bias, and ensure outputs meet GLOBO's reliability standards for vulnerable populations.
  • Feature Development & Integration: Collaborate with Product and Engineering to ship AI-powered features into the GLOBO platform. Build and deploy LLM integrations (AWS Bedrock, Anthropic Claude) and agentic workflows (CrewAI, LangChain). Write production-quality code with proper tests, documentation, and error handling.
  • Reliability & Safety: Implement guardrails, monitoring, and alerting for AI services in production. Ensure AI outputs are consistent and trustworthy. Contribute to evaluation datasets, prompt versioning, and regression testing for deployed models.
  • Performance & Cost Optimization: Monitor inference latency, token usage, and compute costs across AI services (AWS Bedrock) and data infrastructure (Snowflake). Identify opportunities to improve efficiency without sacrificing quality.
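To illustrate the evaluation and regression-testing work described in the responsibilities above, a minimal prompt-regression harness might look like the following sketch. All names here (`EvalCase`, `evaluate`, `fake_model`) are hypothetical illustrations, not GLOBO's actual framework, and the stubbed model call stands in for a real LLM invocation (e.g. via AWS Bedrock):

```python
from dataclasses import dataclass, field

@dataclass
class EvalCase:
    """One regression case: a prompt plus phrases the output must (or must not) contain."""
    prompt: str
    required_phrases: list[str] = field(default_factory=list)
    forbidden_phrases: list[str] = field(default_factory=list)

def evaluate(model_fn, cases):
    """Run each case through model_fn and score pass/fail.

    model_fn: callable str -> str (e.g. a wrapper around an LLM API call).
    Returns (pass_rate, failures) where failures lists the failing prompts.
    """
    failures = []
    for case in cases:
        output = model_fn(case.prompt).lower()
        ok = all(p.lower() in output for p in case.required_phrases)
        ok = ok and not any(p.lower() in output for p in case.forbidden_phrases)
        if not ok:
            failures.append(case.prompt)
    pass_rate = 1.0 - len(failures) / len(cases) if cases else 1.0
    return pass_rate, failures

# Stub standing in for a real model call so the harness runs offline.
def fake_model(prompt):
    return "The appointment is confirmed for Tuesday."

cases = [
    EvalCase("Confirm the appointment day.", required_phrases=["tuesday"]),
    EvalCase("Confirm the appointment day.", forbidden_phrases=["cancelled"]),
]
rate, failed = evaluate(fake_model, cases)
```

Keyword checks are only the simplest rung of an evaluation ladder; in practice a team would layer on labeled evaluation datasets, bias and hallucination checks, and versioned prompts so that a model or prompt change can be gated on the regression suite before deployment.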

Requirements

Required Minimum Education and Experience:

  • Bachelor's Degree in Computer Science, Data Science, Information Systems, or related field.
  • 2+ years of experience in data engineering, software development, or ML engineering.
  • Experience with the following tech stack is required:
      • Python (advanced proficiency)
      • SQL (advanced proficiency)
      • LLM integration (AWS Bedrock, Anthropic Claude, or OpenAI API)
      • dbt (data transformation and testing)
      • Snowflake (or a similar cloud data warehouse)
      • AWS Lambda / serverless architecture
  • Experience with the following tech stack is preferred:
      • Fivetran (or similar ELT/ingestion tooling)
      • Agentic frameworks (CrewAI, LangChain, or similar)
      • Airflow (or similar workflow orchestration)
      • Vector databases (Pinecone, PGVector, or OpenSearch)
      • AWS ECS/EKS
      • CDK and CloudFormation for automated deployments
      • Ruby on Rails (ability to read/debug core platform code)
      • Redis
      • PostgreSQL
      • React
  • Familiarity with model evaluation techniques, prompt engineering, and AI safety best practices.
  • Ability to communicate clearly and effectively, both verbally and in writing.
  • Ability to work independently in a decentralized, hybrid environment.

Work Environment: This position operates predominantly in a hybrid or fully remote office environment. The role routinely uses standard office equipment such as computers, phones, and related technology. While performing the duties of this job, the employee may be required to sit or stand for eight hours or more at a time.

Skills & Requirements

Technical Skills

Python, SQL, LLM integration, dbt, Snowflake, AWS Lambda, Fivetran, agentic frameworks, Airflow, vector databases, AWS ECS/EKS, CDK and CloudFormation, Ruby on Rails, Redis, PostgreSQL, React


Level

Mid

Posted

5/1/2026
