The server listens on a specified interface and port, handles incoming HTTP requests, and forwards them to an OpenAI-compatible API or an Azure API based on the configuration.
The server also includes a request interceptor mechanism that lets you modify request data before it is sent to the OpenAI API. Currently, a Google Search interceptor is included as an example.
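To make the idea concrete, here is a minimal sketch of what an interceptor could look like; the `ChatRequest` types, the `Interceptor` function type, and `PrependSystemPrompt` are hypothetical names for illustration, not this project's actual API:

```go
package interceptor

// ChatMessage and ChatRequest model the minimal fields of an OpenAI-style
// chat request; the real proxy's request types may differ.
type ChatMessage struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type ChatRequest struct {
	Model    string        `json:"model"`
	Messages []ChatMessage `json:"messages"`
}

// Interceptor is a hypothetical hook that can inspect and rewrite a request
// before the proxy forwards it upstream.
type Interceptor func(req *ChatRequest) error

// PrependSystemPrompt returns an interceptor that injects a system message,
// similar in spirit to how a search interceptor could fold retrieved context
// into the request.
func PrependSystemPrompt(prompt string) Interceptor {
	return func(req *ChatRequest) error {
		req.Messages = append(
			[]ChatMessage{{Role: "system", Content: prompt}},
			req.Messages...,
		)
		return nil
	}
}
```

A Google Search interceptor would follow the same pattern: look up results for the user's query and merge them into the request before it leaves the proxy.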
- HTTP/HTTPS server using Go's standard `net/http` package
- Configurable listening interface, port, and upstreams via command-line flags or a YAML configuration file
- Conveniently log your requests to an OpenAI-compatible API using Uber's Zap logging library (a minimal Zap sketch follows this list)
- Request interceptors for modifying request data
- Support for multiple upstream types (Azure, OpenAI)
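As a standalone illustration of the logging style (not code from this repository), a Zap production logger emits JSON lines with structured fields, much like the `JSON Response Segment` entries in the example output further below:

```go
package main

import "go.uber.org/zap"

func main() {
	// zap.NewProduction() writes JSON-encoded log lines with ts/caller/msg
	// fields, similar to the sample output shown later in this README.
	logger, err := zap.NewProduction()
	if err != nil {
		panic(err)
	}
	defer logger.Sync()

	logger.Info("JSON Response Segment", zap.String("content", "Hello"))
}
```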
To build the project you will need Go 1.x:

```bash
git clone https://github.com/go-openai-proxy/go-openai-proxy.git
cd go-openai-proxy
go get -u
go build
```
Run the server with the default settings:

```bash
./go-openai-proxy
```
Or specify the configuration options:
```bash
./go-openai-proxy --config config.yaml --listeners 192.168.1.1:6001 --logLevel debug --certFile /path/to/cert.crt --keyFile /path/to/key.key --useTLS false
```
You can also use a YAML configuration file to set up the server. Here's an example:
```yaml
certFile: "/path/to/cert/file.crt"
keyFile: "/path/to/key/file.key"
logLevel: "debug"
listeners:
  - interface: "0.0.0.0"
    port: "6001"
upstreams:
  Primary:
    type: "azure"
    model: "default"
    url: "http://10.10.0.127:5001"
    priority: 2
  Secondary:
    type: "openai"
    model: "default"
    priority: 1
```
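The debug log line in the example output below shows the field names this configuration maps onto (Upstreams, Listeners, LogLevel, CertFile, KeyFile, UseTLS). A Go struct along those lines could look like the sketch below; the exact type names and yaml tags are assumptions, not the project's actual definitions:

```go
package config

// Illustrative sketch of how the YAML above might map into Go structs;
// the real project's types and yaml tags may differ.
type Upstream struct {
	Type     string `yaml:"type"`
	URL      string `yaml:"url"`
	Model    string `yaml:"model"`
	Priority int    `yaml:"priority"`
}

type Listener struct {
	Interface string `yaml:"interface"`
	Port      string `yaml:"port"`
}

type Config struct {
	Upstreams map[string]Upstream `yaml:"upstreams"`
	Listeners []Listener          `yaml:"listeners"`
	LogLevel  string              `yaml:"logLevel"`
	CertFile  string              `yaml:"certFile"`
	KeyFile   string              `yaml:"keyFile"`
	UseTLS    bool                `yaml:"useTLS"`
}
```

A file like the example above could then be loaded with a standard YAML library such as gopkg.in/yaml.v3's `yaml.Unmarshal`.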
Here's some example output from the logger:
```
$ go run cmd/main.go
2023-09-10T18:38:20.628-0600 DEBUG cmd/main.go:101 Added config to context: {"config": {"Upstreams":{"Primary":{"Type":"azure","URL":"http://10.10.0.127:5001","Model":"default","Priority":2},"Secondary":{"Type":"openai","URL":"","Model":"default","Priority":1}},"Listeners":[{"Interface":"0.0.0.0","Port":"6001"}],"LogLevel":"debug","CertFile":"/path/to/cert/file.crt","KeyFile":"/path/to/key/file.key","UseTLS":false}}
2023-09-10T18:38:20.628-0600 INFO cmd/main.go:113 Hostname {"hostname": "Administrators-MacBook-Pro.local"}
2023-09-10T18:38:20.628-0600 INFO cmd/main.go:123 Starting listener {"address": "0.0.0.0:6001"}
{"level":"info","ts":1694269949.982666,"caller":"internal/handlers.go:90","msg":"JSON Response Segment","content":""}
{"level":"info","ts":1694269949.983812,"caller":"internal/handlers.go:90","msg":"JSON Response Segment","content":"Hello"}
{"level":"info","ts":1694269950.029989,"caller":"internal/handlers.go:90","msg":"JSON Response Segment","content":"!"}
{"level":"info","ts":1694269950.205655,"caller":"internal/handlers.go:90","msg":"JSON Response Segment","content":" How"}
{"level":"info","ts":1694269950.206295,"caller":"internal/handlers.go:90","msg":"JSON Response Segment","content":" can"}
{"level":"info","ts":1694269950.2064052,"caller":"internal/handlers.go:90","msg":"JSON Response Segment","content":" I"}
{"level":"info","ts":1694269950.2065,"caller":"internal/handlers.go:90","msg":"JSON Response Segment","content":" assist"}
{"level":"info","ts":1694269950.209876,"caller":"internal/handlers.go:90","msg":"JSON Response Segment","content":" you"}
{"level":"info","ts":1694269950.250054,"caller":"internal/handlers.go:90","msg":"JSON Response Segment","content":" today"}
{"level":"info","ts":1694269950.284432,"caller":"internal/handlers.go:90","msg":"JSON Response Segment","content":"?"}
{"level":"info","ts":1694269950.3289962,"caller":"internal/handlers.go:90","msg":"JSON Response Segment","content":""}
{"level":"info","ts":1694269950.329086,"caller":"internal/handlers.go:107","msg":"JSON Completed Response","response":"{\"completedResponse\":\"Hello! How can I assist you today?\",\"requestMessages\":[{\"role\":\"user\",\"content\":\"Hello world\"}]}"}
```
This project is licensed under the MIT License - see the LICENSE.md file for details.