# Chat One Another
This repository is a proof of concept: a simple Python script that lets two local language models talk to each other via Ollama.
## Requirements

- Python 3
- The `requests` package (install via `pip install -r requirements.txt`)
- An Ollama server running at `http://localhost:11434`
## Usage

1. Install the required Python package: `pip install -r requirements.txt`
2. Add a `config.json` to adjust the models (`model_a` and `model_b`), the starting prompt, the number of conversation turns, and the output file name. You can copy `config.json.example`: `cp config.json.example config.json`
3. Run the script: `python script.py`. The assistant responses stream to the console as they are generated.
4. Conversation turns are logged to the file specified by `chat_history` in `config.json`.
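The README describes the settings but not the exact file layout, so a `config.json` might look roughly like this. The `model_a`, `model_b`, and `chat_history` keys are named in this README; the `prompt` and `iterations` key names, and all the values, are illustrative guesses — check `config.json.example` for the real ones:

```json
{
  "model_a": "llama3",
  "model_b": "mistral",
  "prompt": "Hello! What shall we talk about?",
  "iterations": 10,
  "chat_history": "chat_history.md"
}
```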
## Project Files
- `config.json` – stores model names, the initial prompt, iteration count, and output file location.
- `script.py` – orchestrates the conversation between the two models.
- `requirements.txt` – lists the Python dependency.
- history file (default `chat_history.md`) – created when the script runs and stores the dialogue.
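For orientation, the orchestration that `script.py` performs can be sketched as below. This is an illustrative outline rather than the actual script: the function names and the role-swapping approach are assumptions, though the `/api/chat` endpoint and its streamed JSON-lines response format are standard Ollama API behavior.

```python
import json
import requests

OLLAMA_URL = "http://localhost:11434/api/chat"  # the server from Requirements

def stream_reply(model, messages):
    """Request one reply from Ollama, printing chunks as they stream in."""
    payload = {"model": model, "messages": messages, "stream": True}
    parts = []
    with requests.post(OLLAMA_URL, json=payload, stream=True) as resp:
        resp.raise_for_status()
        for line in resp.iter_lines():
            if not line:
                continue
            chunk = json.loads(line)  # one JSON object per streamed line
            piece = chunk.get("message", {}).get("content", "")
            print(piece, end="", flush=True)
            parts.append(piece)
    print()
    return "".join(parts)

def swap_roles(messages):
    """Flip user/assistant roles so one model's replies become the other's input."""
    flipped = {"user": "assistant", "assistant": "user"}
    return [{"role": flipped.get(m["role"], m["role"]), "content": m["content"]}
            for m in messages]

def converse(model_a, model_b, prompt, turns):
    """Alternate replies between the two models for the configured number of turns."""
    messages = [{"role": "user", "content": prompt}]
    for _ in range(turns):
        reply = stream_reply(model_a, messages)
        messages.append({"role": "assistant", "content": reply})
        # Hand the transcript to the other model with roles reversed
        model_a, model_b = model_b, model_a
        messages = swap_roles(messages)
    return messages
```

The role swap is the key trick: each model only ever sees a standard chat transcript in which the *other* model's messages appear as user turns.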