Supporting multiple LLMs in auto-config #2610

Open
ddobrin opened this issue Mar 31, 2025 · 1 comment

Comments

@ddobrin
Contributor

ddobrin commented Mar 31, 2025

Currently, proprietary or open models running in Google Vertex AI can be accessed either through the Vertex AI Gemini module or through the OpenAI API module:

#################################
# Google Vertex AI Gemini
#################################
spring.ai.vertex.ai.gemini.project-id=<project>
spring.ai.vertex.ai.gemini.location=us-central1
spring.ai.vertex.ai.gemini.chat.options.model=gemini-2.0-flash-001
spring.ai.vertex.ai.gemini.transport=grpc

#################################
# OpenAI API VertexAI
#################################
spring.ai.openai.api-key=abc123
spring.ai.openai.vertex.ai.gemini.project-id=<project>
spring.ai.openai.vertex.ai.gemini.location=us-central1
spring.ai.openai.vertex.ai.chat.options.model=meta/llama3-405b-instruct-maas
spring.ai.openai.vertex.ai.chat.base-url=<baseURL>
spring.ai.openai.vertex.ai.chat.completions-path=/chat/completions
spring.ai.openai.vertex.ai.chat.options.max-tokens=1024
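
For context, with the current single-model auto-config Spring AI exposes one chat model bean per provider, which is typically injected directly. A minimal consumer-side sketch (the service class and method names are illustrative, only ChatModel comes from Spring AI):

import org.springframework.ai.chat.model.ChatModel;
import org.springframework.stereotype.Service;

@Service
public class ChatService {

    private final ChatModel chatModel;

    // Only one ChatModel bean per provider is auto-configured today
    public ChatService(ChatModel chatModel) {
        this.chatModel = chatModel;
    }

    public String ask(String question) {
        // ChatModel.call(String) is the simple convenience entry point
        return this.chatModel.call(question);
    }
}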

The problem - there is currently no way to auto-configure more than one Gemini or open model in Vertex AI.

The proposal - introduce the ability to add a name (key) in the auto-config, allowing for multiple configurations, for example:

spring.ai.vertex.ai.gemini.<gemini-flash>.project-id=<project>
spring.ai.vertex.ai.gemini.<gemini-flash>.location=us-central1
spring.ai.vertex.ai.gemini.<gemini-flash>.chat.options.model=gemini-2.0-flash-001
spring.ai.vertex.ai.gemini.<gemini-flash>.transport=grpc

spring.ai.vertex.ai.gemini.<gemini-pro>.project-id=<project>
spring.ai.vertex.ai.gemini.<gemini-pro>.location=us-central1
spring.ai.vertex.ai.gemini.<gemini-pro>.chat.options.model=gemini-1.5-001
spring.ai.vertex.ai.gemini.<gemini-pro>.transport=grpc
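
If keyed configurations like these were supported, each entry could surface as its own named bean and callers could pick a model by qualifier. A hypothetical consumer-side sketch (the bean names gemini-flash and gemini-pro are assumptions, not an existing Spring AI convention):

import org.springframework.ai.chat.model.ChatModel;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.stereotype.Service;

@Service
public class RoutingChatService {

    private final ChatModel flash;
    private final ChatModel pro;

    public RoutingChatService(@Qualifier("gemini-flash") ChatModel flash,
                              @Qualifier("gemini-pro") ChatModel pro) {
        this.flash = flash;
        this.pro = pro;
    }

    public String ask(String question, boolean lowLatency) {
        // Route cheap/fast traffic to Flash, everything else to Pro
        return (lowLatency ? flash : pro).call(question);
    }
}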

Workaround - not using auto-config and manually configuring each model in use
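
A rough sketch of that manual workaround, assuming the Vertex AI Gemini module's VertexAiGeminiChatModel and VertexAiGeminiChatOptions types; exact constructor and builder signatures vary between Spring AI milestones, so treat this as illustrative rather than copy-paste ready:

import com.google.cloud.vertexai.VertexAI;
import org.springframework.ai.vertexai.gemini.VertexAiGeminiChatModel;
import org.springframework.ai.vertexai.gemini.VertexAiGeminiChatOptions;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class VertexModelsConfig {

    @Bean
    public VertexAI vertexAi() {
        // Shared Vertex AI client for the project and region
        return new VertexAI("<project>", "us-central1");
    }

    @Bean("gemini-flash")
    public VertexAiGeminiChatModel geminiFlash(VertexAI vertexAi) {
        return new VertexAiGeminiChatModel(vertexAi,
                VertexAiGeminiChatOptions.builder().model("gemini-2.0-flash-001").build());
    }

    @Bean("gemini-pro")
    public VertexAiGeminiChatModel geminiPro(VertexAI vertexAi) {
        return new VertexAiGeminiChatModel(vertexAi,
                VertexAiGeminiChatOptions.builder().model("gemini-1.5-001").build());
    }
}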

CC @tzolov

@ThomasVitale
Contributor

I would really like to have this possibility in Spring AI. Unfortunately, I'm afraid it's up to the Spring Boot project to support registering multiple beans of the same type via auto-configuration, a capability it has not offered so far.

I hope things will change in Spring Boot 4, based on the new support introduced in Spring Framework 7 for programmatically registering beans, which would enable such functionality (https://docs.spring.io/spring-framework/reference/7.0/core/beans/java/programmatic-bean-registration.html). See spring-projects/spring-boot#15732.
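
For illustration, a rough sketch of how the programmatic bean registration API described in that Spring Framework 7 documentation (BeanRegistrar / BeanRegistry) might be used to register one chat model bean per configured key; the API may still change before GA, and the ChatModel wiring here is purely hypothetical:

import org.springframework.ai.chat.model.ChatModel;
import org.springframework.beans.factory.BeanRegistrar;
import org.springframework.beans.factory.BeanRegistry;
import org.springframework.core.env.Environment;

public class MultiModelBeanRegistrar implements BeanRegistrar {

    @Override
    public void register(BeanRegistry registry, Environment env) {
        // One bean per configured key; bean names and suppliers are placeholders
        registry.registerBean("gemini-flash", ChatModel.class,
                spec -> spec.supplier(ctx -> createModel(env, "gemini-flash")));
        registry.registerBean("gemini-pro", ChatModel.class,
                spec -> spec.supplier(ctx -> createModel(env, "gemini-pro")));
    }

    private ChatModel createModel(Environment env, String key) {
        // Placeholder: build the model from the properties registered under 'key'
        throw new UnsupportedOperationException("illustrative only");
    }
}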
