An AI-powered multi-step procurement agent that autonomously negotiates with suppliers via email or chat. It optimizes decisions using internal price policies, REST API integrations, and autonomous reasoning strategies. The agent handles multiple stages of procurement:
- Search for suppliers
- Negotiate quotes
- Embed and compare offers
- Handle exceptions
- Update internal databases
This project aims to replicate real-world enterprise agent use cases and will be open-sourced for community contribution and feedback.
- AI-powered negotiation and follow-up drafting
- REST API call orchestration and optimization
- Exception detection and escalation
- Supplier response embedding and similarity matching
- Quote storage and ranking in a vector DB
- Stateless + stateful memory (ReAct / LangChain memory)
- Cost-aware policy optimization
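A cost-aware policy check could take a shape like the following sketch (pure Python; the `PricePolicy` fields and threshold values are illustrative assumptions, not part of the repo):

```python
from dataclasses import dataclass

@dataclass
class PricePolicy:
    # Illustrative internal policy: hard unit-price cap and auto-approve ceiling.
    max_unit_price: float
    auto_approve_total: float

def evaluate_quote(policy: PricePolicy, unit_price: float, quantity: int) -> str:
    """Return 'reject', 'approve', or 'escalate' for a supplier quote."""
    total = unit_price * quantity
    if unit_price > policy.max_unit_price:
        return "reject"       # violates the internal price policy
    if total <= policy.auto_approve_total:
        return "approve"      # cheap enough for autonomous approval
    return "escalate"         # within policy but needs human sign-off

policy = PricePolicy(max_unit_price=10.0, auto_approve_total=500.0)
print(evaluate_quote(policy, unit_price=8.0, quantity=50))   # total 400.0 -> approve
print(evaluate_quote(policy, unit_price=12.0, quantity=10))  # unit price too high -> reject
```

The agent can call a check like this as a tool before deciding whether to accept, counter, or escalate an offer.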
| Component | Technology |
|---|---|
| Agent Framework | LangChain / AutoGPT |
| Prompt Logic | ReAct, Memory Chains, Prompt Engineering |
| Embeddings | FAISS / Chroma (free) |
| Backend | FastAPI / Flask + PostgreSQL |
| Hosting DB | Local Postgres (free PostgreSQL + API) |
To set up this repo locally:

```bash
# Clone the repo
git clone https://github.com/kashewknutt/autonomous-procurement-agent.git
cd autonomous-procurement-agent

# Set up a virtual environment
python -m venv temp
temp\Scripts\activate

# Install the required packages
pip install -r requirements.txt

# Install the spaCy model
python -m spacy download en_core_web_sm

# Set up Postgres locally and create a Redis account

# Create a .env file with these variables:
# GITHUB_API_TOKEN=""
# DATABASE_URL=""
# REDIS_URL=""

# Start the server
uvicorn app.main:app --reload

# Visit http://localhost:8000/docs to test the server
```
To run the app locally without Docker, you need a local PostgreSQL instance that matches the following `DATABASE_URL` format:

```
postgresql+psycopg2://postgres:postgres@localhost:5432/procurement
```
Follow these steps to install and configure PostgreSQL on Windows:
- Visit: https://www.postgresql.org/download/windows/
- Download the PostgreSQL Installer via EDB and run it.
- During setup:
  - Choose a version (the default is fine)
  - Set the password to `postgres` (or update `.env` to match your password)
  - Keep the default port: `5432`
  - Install additional tools like pgAdmin (optional but helpful)
After installation:
- Open pgAdmin or use the SQL Shell (psql).
- Run the following SQL command:

```sql
CREATE DATABASE procurement;
```

If using psql:

```
psql -U postgres
# Enter password: postgres
postgres=# CREATE DATABASE procurement;
```

Make sure your `.env` file contains:

```
DATABASE_URL="postgresql+psycopg2://postgres:postgres@localhost:5432/procurement"
```

Note: Replace `localhost` with `127.0.0.1` if `localhost` gives any issues.

Install the dependencies:

```bash
pip install -r requirements.txt
```

Ensure your app has code to initialize tables (e.g. `Base.metadata.create_all(bind=engine)`), then run:

```bash
python -m app.main
```

Or, if your project provides a specific init script:

```bash
python app/db/init_db.py
```

You're now connected to a local PostgreSQL instance running on Windows. Your FastAPI app should work with the database on `localhost:5432`.
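The table-initialization step above might look like this sketch (the `Quote` model is illustrative, not the repo's actual schema; SQLite is used here for brevity, but the same code works when `create_engine` is given the Postgres `DATABASE_URL`):

```python
from sqlalchemy import Column, Float, Integer, String, create_engine
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Quote(Base):
    # Illustrative model; the real schema lives under app/db.
    __tablename__ = "quotes"
    id = Column(Integer, primary_key=True)
    supplier = Column(String, nullable=False)
    unit_price = Column(Float, nullable=False)

# Swap this URL for DATABASE_URL from .env in the real app.
engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(bind=engine)
```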
- MVP First: Email bot → quote embedder → quote comparator → DB writer.
- Agentic Loop: Integrate LangChain agent to make decisions with tools.
- Scale Vector Search: Use FAISS/Chroma locally or Supabase's pgvector.
- Open-Source Friendly: All tools must run locally or on free-tier services.
- Extensibility: Modular functions, clear API design, config files for tools.
- Set up GitHub repo with MIT license
- Scaffold project with FastAPI + Docker + FAISS
- Configure Gmail API or SMTP (sandbox email)
- Add REST API endpoints for testing (quotes, suppliers, etc.)
- Integrate LangChain agent loop
- Build memory (ConversationBufferMemory or RedisMemory)
- Add tools for: REST API calls, vector DB search, policy check
- Define ReAct-style prompts
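To illustrate the last step, a ReAct-style prompt plus a minimal parser for the agent's intermediate output could be sketched as follows (the tool names and the exact `Action:`/`Action Input:` format are assumptions for illustration):

```python
import re

REACT_PROMPT = """You are a procurement agent. Use this format:
Thought: reason about what to do next
Action: one of [search_suppliers, fetch_quote, check_policy]
Action Input: the input to the action
Observation: the result of the action
... (repeat Thought/Action/Observation as needed)
Final Answer: the final recommendation"""

def parse_action(llm_output: str) -> tuple[str, str]:
    """Extract (action, action_input) from a ReAct-formatted LLM reply."""
    match = re.search(r"Action:\s*(\w+)\s*Action Input:\s*(.+)", llm_output)
    if not match:
        raise ValueError("No Action found in LLM output")
    return match.group(1), match.group(2).strip()

reply = "Thought: I need current quotes.\nAction: fetch_quote\nAction Input: ACME Corp"
print(parse_action(reply))  # ('fetch_quote', 'ACME Corp')
```

In practice LangChain's agent executor handles this parse loop for you; the sketch only shows what the ReAct format commits the model to.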
- Use sentence-transformers to embed supplier quotes
- Store vectors using FAISS or ChromaDB
- Add similarity threshold matching logic
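The similarity-threshold matching logic can be sketched without a vector-DB dependency; here cosine similarity over plain Python lists stands in for FAISS/Chroma lookups, and the 0.8 threshold is an illustrative assumption:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def match_offers(query: list[float],
                 offers: dict[str, list[float]],
                 threshold: float = 0.8) -> list[str]:
    """Return supplier IDs whose embedded quote is close enough to the query."""
    return [sid for sid, vec in offers.items()
            if cosine_similarity(query, vec) >= threshold]

# Toy 3-dimensional "embeddings"; real vectors come from sentence-transformers.
offers = {"supplier_a": [1.0, 0.0, 0.1], "supplier_b": [0.0, 1.0, 0.0]}
print(match_offers([1.0, 0.0, 0.0], offers))  # ['supplier_a']
```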
- Create a PostgreSQL DB schema for quotes and suppliers
- Build CRUD APIs for quote history
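One possible shape for the quotes/suppliers schema and a minimal CRUD round-trip, sketched here with stdlib `sqlite3` for portability (column names are assumptions; the real app targets PostgreSQL via the CRUD APIs above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE suppliers (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    email TEXT
);
CREATE TABLE quotes (
    id INTEGER PRIMARY KEY,
    supplier_id INTEGER NOT NULL REFERENCES suppliers(id),
    unit_price REAL NOT NULL,
    quantity INTEGER NOT NULL,
    status TEXT DEFAULT 'pending'
);
""")

# Minimal create/read round-trip.
conn.execute("INSERT INTO suppliers (name, email) VALUES (?, ?)",
             ("ACME Corp", "sales@acme.example"))
conn.execute("INSERT INTO quotes (supplier_id, unit_price, quantity) VALUES (1, 9.5, 100)")
row = conn.execute("SELECT unit_price * quantity FROM quotes WHERE id = 1").fetchone()
print(row[0])  # 950.0
```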
- GitHub Repo Initialized
- Python project scaffolded (FastAPI + poetry/pipenv)
- `.env` config file for secrets
- LangChain Agent Setup
- Tools created
- Memory chain enabled
- SentenceTransformer integration
- FAISS/Chroma vector DB
- PostgreSQL tables for quotes and suppliers
```
autonomous-procurement-agent/
├── app/
│   ├── agents/
│   ├── embeddings/
│   ├── api/
│   ├── db/
│   └── utils/
├── tests/
├── requirements.txt
├── Dockerfile
├── docker-compose.yml
└── README.md
```

- Slack / Teams bot integration
- Frontend dashboard (React + Tailwind)
- Custom vector-based supplier ranking
- LLM fine-tuning with procurement datasets
- LLM cost tracking and logging
This project will be open-source. Contributions welcome!
- Fork the repo
- Create a feature branch
- Submit a PR
- Write tests if possible