AI Software Engineer

Joseph Soper

AI & Agent Systems / Software Development / DevOps

I build AI-powered software — autonomous agent pipelines, RAG systems with graph and vector memory, LLM-integrated APIs, and the production infrastructure to run it all. I work with models at the API level, chain them into workflows with tools like LangGraph and MCP, and deploy everything self-hosted or in the cloud.

Remote only  ·  Petoskey, MI

Actively seeking remote opportunities

About Me

I'm a software engineer focused on building AI-powered systems — the kind that do real work autonomously. That means designing agent pipelines with multi-step reasoning, wiring tools and APIs into LLM workflows via MCP and LangGraph, and building the backend infrastructure that keeps it all running reliably in production.

My development stack is Python-first: FastAPI backends, async task queues, REST API design, PostgreSQL, and Redis. I've integrated AI at the API level across multiple providers — Anthropic, Google Vertex AI, and locally hosted models via Ollama and LM Studio — and I understand how to move beyond basic prompting into context engineering, retrieval-augmented generation, and persistent memory architectures with vector and graph databases.

I also bring an advantage uncommon among software engineers: I can take a project from code all the way through production deployment on infrastructure I built myself. My homelab runs a production private cloud — Proxmox hypervisor, GPU passthrough for local LLM inference, container orchestration, full observability stack — automated end-to-end with Terraform and Ansible. I learn by shipping things that actually run.

IBM Back-End Development Professional Certificate. PCEP Python certification. BS in Computer Science (in progress). Google Cloud Professional Architect certification in progress.

AI-First
Agent pipelines, RAG, MCP, LangGraph, multi-provider LLM integration
Full-Stack
Python · FastAPI · REST APIs · PostgreSQL · async workflows
Deployed
Self-hosted LLM inference, production cloud infra, GPU-accelerated AI stack
IaC
Terraform + Ansible automation from bare metal to running service

Technical Skills

AI Engineering & Agent Systems
Autonomous Agents · LangGraph · MCP Servers · Multi-Agent Orchestration · RAG Pipelines · Context Engineering · Prompt Engineering · Tool Use / Function Calling · Vertex AI · Anthropic API · Ollama · LM Studio · LiteLLM · Open WebUI · Qdrant · Neo4j · Mem0 · Vector Search · Knowledge Graphs · Antigravity (Google) · ROCm / GPU Inference
Software Development
Python · FastAPI · REST API Design · Async / Celery · PostgreSQL · Redis · SQL · Bash · Git
DevOps & Automation
Docker · Docker Compose · Terraform · Ansible · GitHub Actions · CI/CD · IaC
Cloud & Infrastructure
Google Cloud · Cloudflare · Proxmox VE · KVM / QEMU · ZFS · Linux · Traefik v3 · WireGuard · Prometheus · Grafana

Projects

forge-cortex AI System

Self-hosted AI assistant built around a persistent, graph-backed memory architecture. A FastAPI orchestration layer connects Open WebUI, LiteLLM for multi-model routing, locally hosted LLMs via Ollama (AMD GPU, ROCm), Qdrant for vector retrieval, Neo4j for knowledge graph storage, and Mem0 for long-term memory management. Designed as a foundation for agentic workflows: the system remembers context across sessions and retrieves relevant knowledge to ground its responses.

Agents · RAG · LiteLLM · Ollama · Qdrant · Neo4j · Mem0 · FastAPI · Python · Open WebUI
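The memory-grounding step — retrieve relevant stored context, prepend it to the prompt — can be sketched in miniature. This is an illustrative stand-in only: a tiny in-memory store with cosine similarity replaces Qdrant, and the embeddings, memory strings, and function names (`retrieve`, `ground_prompt`) are hypothetical, not the project's actual API.

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy memory store: text mapped to a pre-computed (fake) embedding.
# In the real system these would live in Qdrant, with Neo4j/Mem0 alongside.
MEMORY = {
    "user prefers ROCm over CUDA": [0.9, 0.1, 0.0],
    "homelab runs Proxmox VE":     [0.1, 0.9, 0.1],
    "Neo4j stores the entity graph": [0.0, 0.2, 0.9],
}

def retrieve(query_vec, k=2):
    """Return the top-k memories most similar to the query embedding."""
    ranked = sorted(MEMORY.items(),
                    key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [text for text, _ in ranked[:k]]

def ground_prompt(question, query_vec):
    """Prepend retrieved memories so the model answers with session context."""
    context = "\n".join(f"- {m}" for m in retrieve(query_vec))
    return f"Known context:\n{context}\n\nQuestion: {question}"
```

The point of the sketch is the shape of the pipeline, not the math: embed the query, rank stored memories by similarity, and inject the winners into the prompt before the LLM call.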
BezaChain AI System

Async AI orchestration platform with intelligent routing across local and cloud LLMs. A FastAPI backend exposes a clean REST API surface; Celery handles background task execution with Redis as the broker and PostgreSQL for persistent job tracking. Requests are routed by task complexity, cost, and latency requirements: local Ollama models for throughput, cloud providers for capability.

LLM Routing · Ollama · LangChain · FastAPI · Python · Celery · Redis · PostgreSQL
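The complexity/cost/latency routing described above reduces to a small policy function. A hedged sketch: the backend names, model identifiers, and thresholds below are illustrative assumptions, not the production configuration.

```python
from dataclasses import dataclass

@dataclass
class Route:
    backend: str  # "ollama" (local) or "cloud"
    model: str

def route_request(complexity: float, latency_budget_ms: int) -> Route:
    """Pick a backend from estimated task complexity (0..1) and latency budget.

    Thresholds and model names are hypothetical placeholders.
    """
    if complexity < 0.4:
        # Simple task: cheap, high-throughput local Ollama model.
        return Route("ollama", "llama3:8b")
    if latency_budget_ms < 500:
        # Needs capability but must return fast: smaller cloud model.
        return Route("cloud", "claude-haiku")
    # Hard task with a relaxed latency budget: flagship cloud model.
    return Route("cloud", "claude-sonnet")
```

In the actual platform the routing decision would sit in front of the Celery task that performs the LLM call, so the policy runs cheaply and synchronously while inference happens in the background.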
BezaForge Infrastructure ● Deployed

Production private cloud built from bare metal: Proxmox VE hypervisor, AMD GPU passthrough for local LLM inference (ROCm 7.2), a 5-VLAN Omada SDN with inter-VLAN firewall policy, 12+ Docker services behind Traefik v3 with wildcard TLS, and a full observability stack (Prometheus, Grafana, Loki, Uptime Kuma). The entire environment is codified: Terraform manages VM provisioning, Ansible handles configuration.

Terraform · Ansible · Proxmox · Docker · Prometheus · Grafana · Traefik · ZFS
github.com/thejollydev/bezaforge-infrastructure
arch-ansible ● Deployed

Ansible playbooks for fully automated Arch Linux developer environment provisioning from bare metal — packages, dotfiles, services, and developer toolchain in a single idempotent run.

Ansible · Arch Linux · Bash · systemd
github.com/thejollydev/arch-ansible

Let's Connect

Status

Open to Remote Opportunities

Targeting: AI Engineer · Software Developer · DevOps · Automation · Technical Support

Remote only  ·  Petoskey, MI

Available now