About Brainpool AI

Brainpool AI is a UK-based artificial intelligence consultancy founded in 2017. We design, build and deploy agentic AI systems for medium-sized professional services companies in the United Kingdom, the United States and Canada, with a focus on the Testing, Inspection & Certification (TIC), energy, engineering and environmental consulting sectors.

The core of our offering is Cortex, our proprietary agentic platform, on which we build customized solutions: LLM agents in production, model fine-tuning (Gemini, Qwen, Gemma), MLOps pipelines on Vertex AI and AWS, and automated generation of quotes, technical analyses and reports.

About the role

You will work as a software engineer in a distributed, multidisciplinary team (engineering, data science, AI research), building and deploying Cortex platform features and solutions for end customers. The role combines backend development in Python, LLM integration and agent orchestration, and work on cloud infrastructure (AWS and GCP).

Responsibilities

  • Design, build and maintain backend services in Python (FastAPI, async).
  • Deploy LLM agents and orchestration pipelines with LangChain / LangGraph.
  • Integrate and evaluate models (OpenAI, Anthropic, Gemini, open source models via Hugging Face).
  • Work on cloud infrastructure (AWS, GCP / Vertex AI), including deployment, observability and CI/CD.
  • Collaborate in architecture, code reviews and definition of technical standards.
  • Communicate directly with end customers and stakeholders in English (written and spoken).

Must-have requirements

  • Advanced written and spoken English (non-negotiable). You will interact daily with the team and with clients in the United Kingdom, the United States and Canada. Meetings, code reviews, documentation and client communication are 100% in English.
  • 3+ years of professional experience in backend development with Python.
  • Experience building REST APIs and async services (FastAPI, asyncio).
  • Solid command of Git, Docker, testing and CI/CD.
  • Experience with at least one cloud (AWS or GCP) in production.

Desirable

  • Experience building applications with LLMs in production (RAG, agents, fine-tuning, evaluation).
  • Familiarity with LangChain, LangGraph, MCP, Vertex AI or Bedrock.
  • Experience with infrastructure as code (Terraform, Pulumi).
  • Background in MLOps and model deployment.
  • Contributions to open source projects.

What we offer

  • Engagement as an independent contractor, paid monthly in USD.
  • 100% remote, flexible schedule (with time zone overlap with the UK or US).
  • Challenging technical work at the forefront of LLMs and agent systems in production.
  • Small team, high impact, direct exposure to clients and architectural decisions.
  • Budget for training, conferences and tools.

How to apply

Send your CV (preferably in English) and a short paragraph telling us about a recent project involving LLMs or distributed systems to peter@brainpool.ai. If you have a public GitHub profile, include the link.

python aws devops docker english fastapi gcp github langchain llm
Job posting details
Status

Last modified
05/05/2026 16:41
Location
Argentina or Uruguay · Remote
Company
Brainpool AI
Allows remote work
No
Required experience
3+
Work mode
Remote
Contract type
Long-term contractor
Salary range
3,000-4,000 USD per month, depending on experience
  • peter@brainpool.ai
  • https://brainpool.ai