Serverless LiteLLM-Proxy
This demo shows how to deploy LiteLLM-Proxy on AWS Lambda using Lambda Web Adapter to provide an OpenAI-compatible API for Amazon Bedrock.
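Lambda Web Adapter runs as a Lambda extension alongside the function code, forwarding HTTP requests from the function URL to the LiteLLM proxy's local web server. As a mental model only, the server being wrapped is roughly what the litellm CLI starts locally; the model name and port below are illustrative, not values taken from this repo:

```bash
# Illustrative local equivalent of the server the adapter forwards to.
# Starts the LiteLLM proxy on a local port, routing requests to a Bedrock model.
litellm --model bedrock/anthropic.claude-v2 --port 8080
```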
Prerequisites
- AWS CLI
- AWS SAM CLI
- Docker
Build and deploy
Run the following commands to build and deploy this demo.
```bash
sam build
sam deploy --guided
```

During the guided deploy, enter a random string for the ApiMasterKey parameter. Take note of the litellmProxyFunctionUrl value in the stack outputs; you will use it when testing.
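If you prefer a non-interactive deploy (for example in CI), the same parameter can be passed on the command line. This is a sketch: the stack name is an illustrative placeholder, not one defined by this repo.

```bash
# Non-interactive deploy; the stack name and the key value are placeholders.
sam deploy \
  --stack-name litellm-proxy-demo \
  --parameter-overrides ApiMasterKey=<ApiMasterKey> \
  --capabilities CAPABILITY_IAM \
  --resolve-s3
```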
Test
Run the following command to test the deployment, replacing <litellmProxyFunctionUrl> and <ApiMasterKey> with the values from the deploy step. You should see the response stream back.
```bash
curl <litellmProxyFunctionUrl>chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <ApiMasterKey>" \
  -d '{
    "model": "bedrock/anthropic.claude-v2",
    "messages": [
      {
        "role": "system",
        "content": "You are a helpful assistant."
      },
      {
        "role": "user",
        "content": "tell me a bedtime story about lambda and sqs"
      }
    ],
    "stream": true
  }'
```
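If you did not note the URL during deployment, it can usually be recovered from the stack outputs. The sketch below assumes the illustrative stack name from earlier and that the output key is litellmProxyFunctionUrl:

```bash
# Look up the function URL from the CloudFormation stack outputs (stack name is a placeholder).
aws cloudformation describe-stacks \
  --stack-name litellm-proxy-demo \
  --query "Stacks[0].Outputs[?OutputKey=='litellmProxyFunctionUrl'].OutputValue" \
  --output text
```

Because the proxy exposes an OpenAI-compatible API, any OpenAI client SDK should also work against it: point the client's base URL at the function URL and use the ApiMasterKey value as the API key.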