
fintools-ai/TradeInsightsAssistant

TradeInsightsAssistant is an interactive AI-powered chat interface that helps you analyze daily options Open Interest (OI) data to uncover potential market insights.

📊 TradeInsightsAssistant

Conversational AI for Options Intelligence



🚀 What is TradeInsightsAssistant?

TradeInsightsAssistant is your options research analyst, powered by natural language. It turns open interest (OI), volatility positioning, and cross-asset flows into precise trading insights through a chat interface backed by LLMs.

Ask it questions like:

  • "Where are traders piling into SPY puts this week?"
  • "What's the risk/reward on a June QQQ put spread?"
  • "Are there unusual builds in NVDA for Friday's expiration?"

🧠 Key Capabilities

💬 Conversational Intelligence

  • Ask questions in plain English
  • Receive clean, structured responses
  • Get Markdown-formatted trade setups

📈 Options Market Analysis

  • Open Interest clustering (per strike and expiry)
  • Call/Put ratio skew detection
  • Max Pain and magnet zones
  • Position buildup and liquidation tracking
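For intuition, Max Pain is the expiry price that minimizes the total intrinsic value paid out to option holders across all strikes. A minimal sketch of the calculation, with a toy OI table (the data and function name are illustrative, not the tool's internal API):

```python
# Illustrative Max Pain calculation; the OI data and names are hypothetical.
def max_pain(oi):
    """oi maps strike -> (call OI, put OI); returns the settlement price
    (among the listed strikes) that minimizes total payout to holders."""
    def total_payout(settle):
        pay = 0.0
        for strike, (call_oi, put_oi) in oi.items():
            pay += max(settle - strike, 0) * call_oi  # calls in the money
            pay += max(strike - settle, 0) * put_oi   # puts in the money
        return pay
    return min(sorted(oi), key=total_payout)

# Toy OI table for a single expiry
oi = {280: (1200, 5400), 290: (2100, 3100), 300: (4800, 900)}
print(max_pain(oi))  # the "magnet" strike for this toy data
```

Heavy put OI below spot and heavy call OI above spot both pull the minimum toward the middle strike, which is why Max Pain often sits near a high-OI cluster.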

⚒️ Professional Trade Outputs

  • Spreads, hedges, directional trades
  • Entry, stop-loss, and profit-target rules
  • Kelly Criterion sizing and confidence score
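As background on the sizing output, the classic Kelly Criterion for a binary bet is f* = p − (1 − p) / b, where p is the win probability and b the reward-to-risk ratio. A hedged sketch (the repo's exact formula and parameters may differ):

```python
# Classic Kelly Criterion sizing; a sketch, not the repo's exact formula.
def kelly_fraction(win_prob, win_loss_ratio):
    """f* = p - (1 - p) / b, clamped to [0, 1] so that a
    negative-edge trade sizes to zero."""
    f = win_prob - (1.0 - win_prob) / win_loss_ratio
    return max(0.0, min(1.0, f))

# A 55% win rate at 1.5:1 reward-to-risk suggests risking 25% of bankroll;
# in practice traders scale this down (half- or quarter-Kelly).
size = kelly_fraction(0.55, 1.5)
```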

🛠️ How It Works

  1. Launch the CLI: python main.py
  2. Choose your LLM backend (Bedrock or Claude API)
  3. Start chatting:

     📊 You: What are key OI levels on TSLA this week?
     🤖 Agent: The 280 and 300 put strikes show growing OI with strong skew. Gamma risk is concentrated near 290.

  4. Want to save the response? Just say yes; it writes to Markdown with a full metadata trail.
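The save step might look roughly like this; the front-matter fields, file naming, and output directory are assumptions for illustration, not the repo's actual format:

```python
# Hypothetical sketch of saving a response to Markdown with a metadata trail.
from datetime import datetime, timezone
from pathlib import Path

def save_markdown(question, answer, provider, out_dir="analyses"):
    """Write the agent's answer to a Markdown file with metadata front matter."""
    ts = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    body = (
        "---\n"
        f"question: {question}\n"
        f"provider: {provider}\n"
        f"generated: {ts}\n"
        "---\n\n"
        f"{answer}\n"
    )
    path = Path(out_dir) / f"analysis-{ts}.md"
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(body)
    return path
```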

🔍 Use Cases

  • 0DTE and weekly trade planning
  • Institutional flow tracking
  • Quantified trade generation
  • Hedge scenario simulation
  • Risk-layered market analysis

⚡ Example Prompts

"Analyze SPY open interest for the next 5 days"
"What's the options flow telling us about AAPL?"
"Show me support and resistance levels for TSLA"
"Give me a put hedge strategy for HOOD next month"

Run the CLI:

python main.py

You'll be guided to choose your LLM provider (AWS Bedrock or Claude API). From there, start chatting.


📂 MCP Server Configuration

Servers are defined in mcp_servers.json:

{
  "servers": [
    {
      "name": "openinterest",
      "command": "mcp-openinterest-server",
      "args": []
    },
    {
      "name": "news",
      "command": "mcp-news-server",
      "args": ["--api-key", "your-key"]
    }
  ]
}

Add or remove servers as needed; the CLI loads them dynamically at startup.
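A minimal sketch of what that dynamic loading might look like. Only the "servers", "name", "command", and "args" keys come from the config above; the function names and the stdio wiring are assumptions, not the repo's implementation:

```python
# Hypothetical loader for mcp_servers.json; not the repo's actual code.
import json
import subprocess

def build_commands(config):
    """Turn a parsed mcp_servers.json into {server name: argv list}."""
    return {
        s["name"]: [s["command"], *s.get("args", [])]
        for s in config["servers"]
    }

def launch(config_path="mcp_servers.json"):
    """Spawn each configured server; MCP servers typically speak
    JSON-RPC over stdin/stdout, so we wire up pipes."""
    with open(config_path) as f:
        config = json.load(f)
    return {
        name: subprocess.Popen(argv, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
        for name, argv in build_commands(config).items()
    }
```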


🧩 Architecture

  • ✅ CLI-first UX built on rich
  • ✅ Orchestrator handles multi-turn LLM conversations
  • ✅ Dynamic tool invocation from the MCP registry
  • ✅ Markdown + JSON output for traceability
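The dynamic tool invocation idea can be sketched with a simple registry pattern; all names here are illustrative, not the repo's actual classes:

```python
# Minimal registry-based tool dispatch; a hypothetical sketch of the pattern.
TOOLS = {}

def tool(name):
    """Decorator that registers a callable under a tool name."""
    def register(fn):
        TOOLS[name] = fn
        return fn
    return register

@tool("open_interest")
def open_interest(symbol):
    # A real tool would query an MCP server; this stub just echoes its input.
    return {"symbol": symbol, "levels": []}

def invoke(name, **kwargs):
    """The orchestrator looks up the tool the LLM requested and calls it."""
    return TOOLS[name](**kwargs)
```

Registering tools by name keeps the orchestrator decoupled from individual servers: whatever the MCP registry advertises at startup becomes callable without code changes.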

📜 License

MIT License