# LangChain & LangGraph Swift Tutorial
A comprehensive tutorial demonstrating how to build an AI assistant using LangChain Swift and LangGraph. This example shows how to create a workflow-based AI application that combines custom tools with OpenAI's API.
## What You'll Learn
- How to use LangChain Swift for building AI applications
- How to create and manage workflows with LangGraph
- How to create custom tools and integrate them with LangChain
- How to build a command-line interface for AI applications
- How to handle state management in AI workflows
- Best practices for error handling and resource management
## Prerequisites
- macOS 13.0 or later
- Swift 5.9 or later
- OpenAI API key (get one at https://platform.openai.com)
## Project Structure

```
.
├── Sources/
│   └── main.swift    # Main implementation with LangChain and LangGraph integration
├── Package.swift     # Dependencies
└── README.md         # This file
```
## Architecture Overview
The project demonstrates a modern agentic AI architecture with these key components:
1. **State Management** (`AssistantState`)
   - Manages workflow state using LangGraph channels
   - Tracks messages, tool executions, and completion status
   - Provides type-safe access to state data

2. **Tool System**
   - Defines a `Tool` protocol for creating custom tools
   - Each tool has a name, description, and execution logic
   - Tools are automatically discovered by the LLM

3. **Query Processing**
   - Uses LangChain for LLM integration
   - Implements tool selection and execution
   - Handles resource management and cleanup

4. **Command Line Interface**
   - Built with ArgumentParser
   - Supports query execution and verbose mode
   - Provides clear error messages
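As a rough illustration of the state the workflow carries, a plain Swift value type with the same responsibilities might look like this. This is a sketch only: the field names are assumptions, and the real `AssistantState` in `Sources/main.swift` is backed by LangGraph channels rather than stored properties.

```swift
// Illustrative sketch only: the real AssistantState is backed by
// LangGraph channels. These fields mirror what the overview says it tracks.
struct AssistantState {
    var messages: [String] = []                 // conversation history
    var toolExecutions: [String: String] = [:]  // tool name -> last result
    var isComplete: Bool = false                // workflow completion flag

    // Type-safe, read-only access to the most recent message, if any.
    var lastMessage: String? { messages.last }
}

var state = AssistantState()
state.messages.append("What time is it?")
state.toolExecutions["time_check"] = "12:00"
state.isComplete = true
```

Keeping the state a value type with typed accessors like `lastMessage` is what gives downstream workflow nodes safe, predictable access to shared data.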
## Adding Your Own Tools

To add a new tool to the system:

1. Create a new class implementing the `Tool` protocol:
```swift
class MyNewTool: Tool {
    func name() -> String {
        return "my_tool_name" // The name the LLM will use to identify this tool
    }

    func description() -> String {
        return "A clear description of what this tool does and how to use it"
    }

    func run(_ input: String) async throws -> String {
        // Your tool's logic here
        // Process the input and return a result
        return "Result"
    }
}
```

2. Add your tool to the tools array in `AIAssistant.run()`:
```swift
let tools: [any Tool] = [
    TimeCheckTool(),
    CalculatorTool(),
    MyNewTool() // Add your new tool here
]
```

The LLM will automatically discover your tool and use it when appropriate based on its name and description.
## Tool Design Best Practices

- **Clear Names**: Use descriptive names that clearly indicate the tool's purpose
- **Detailed Descriptions**: Provide clear descriptions that help the LLM understand when to use the tool
- **Input Validation**: Implement robust input validation in the `run` method
- **Error Handling**: Use clear error messages that help diagnose issues
- **Resource Management**: Clean up any resources your tool uses
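These practices can be sketched in one small, self-contained example. The `Tool` protocol below reproduces the shape used in this tutorial; `SquareTool`, its error cases, and the `parse` helper are hypothetical names chosen for illustration:

```swift
import Foundation

// The protocol shape used throughout this tutorial.
protocol Tool {
    func name() -> String
    func description() -> String
    func run(_ input: String) async throws -> String
}

// Hypothetical errors with cases specific enough to diagnose the failure.
enum ToolError: Error {
    case emptyInput
    case invalidNumber(String)
}

// Hypothetical tool showing a clear name, a detailed description,
// and input validation kept in a synchronous helper that is easy to test.
class SquareTool: Tool {
    func name() -> String { "square" }

    func description() -> String {
        "Squares a single integer. Input \"7\" returns \"49\"."
    }

    func parse(_ input: String) throws -> Int {
        let trimmed = input.trimmingCharacters(in: .whitespacesAndNewlines)
        guard !trimmed.isEmpty else { throw ToolError.emptyInput }
        guard let value = Int(trimmed) else { throw ToolError.invalidNumber(trimmed) }
        return value
    }

    func run(_ input: String) async throws -> String {
        let value = try parse(input)
        return String(value * value)
    }
}
```

Separating validation (`parse`) from execution (`run`) keeps the error paths explicit and lets you unit-test the validation without touching the LLM.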
## Running the Application

1. Set up your environment:

```bash
export OPENAI_API_KEY='your-api-key-here'
```

2. Build the project:

```bash
swift build
```

3. Try different examples:

```bash
# Get the current time
.build/debug/langchainswiftclidemo --query "What time is it?"

# Do a calculation
.build/debug/langchainswiftclidemo --query "Calculate 5 * 3"

# Try both tools at once
.build/debug/langchainswiftclidemo --query "What time is it and calculate 15 * 24?"

# See available tools
.build/debug/langchainswiftclidemo --verbose
```

## Error Handling
The system implements several layers of error handling:
- **Tool-level errors**: Each tool implements its own error handling
- **Query processing errors**: The `QueryProcessor` handles LLM and execution errors
- **Resource management**: Automatic cleanup of HTTP clients and event loops
- **Input validation**: Command-line argument validation
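As an illustrative sketch of how the query-processing layer can surface failures from the tool layer with diagnosable messages, here is a hypothetical error type (`QueryError` and its cases are assumptions, not taken from the project's source):

```swift
import Foundation

// Hypothetical error type showing how layered failures can carry
// clear, human-readable messages up to the command-line interface.
enum QueryError: LocalizedError {
    case toolFailed(tool: String, reason: String)
    case noToolSelected

    var errorDescription: String? {
        switch self {
        case .toolFailed(let tool, let reason):
            return "Tool '\(tool)' failed: \(reason)"
        case .noToolSelected:
            return "The LLM did not select a tool for this query"
        }
    }
}
```

Conforming to `LocalizedError` means the CLI can simply print `error.localizedDescription` and still get a message that names the failing tool and the reason.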
## Resource Management
The system automatically manages:
- HTTP client lifecycle
- Event loop groups
- Tool resources
- Memory cleanup
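The cleanup pattern behind this list can be sketched with a plain Swift `defer` block. `ManagedResource` below is a stand-in of my own; the real project releases its HTTP client and event loop group at the equivalent points:

```swift
// Stand-in for a resource with an explicit shutdown step, mirroring
// how an HTTP client or event loop group must be closed when done.
final class ManagedResource {
    private(set) var isShutDown = false
    func shutdown() { isShutDown = true }
}

// defer runs on every exit path (normal return or thrown error),
// so the resource is released exactly once no matter how we leave.
func process(_ query: String, using resource: ManagedResource) -> String {
    defer { resource.shutdown() }
    return "processed: \(query)"
}
```

Putting the shutdown in `defer` right after acquisition keeps the cleanup next to the resource it releases, which is why errors thrown mid-query cannot leak clients or event loops.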
## Contributing
We welcome contributions! Please see our contributing guidelines for details.
## License
This project is available under the MIT license.