Correct provider field in Continue config.json for VS Code #5630
Labels
- area:configuration (relates to configuration options)
- ide:vscode (relates specifically to the VS Code extension)
- kind:bug (indicates an unexpected problem or unintended behavior)
Relevant environment info
Description
The documentation at https://docs.continue.dev/customize/model-providers/inception says the correct value is "provider": "inception". However, when I try to use it in my VS Code config.json for the Continue plugin, it does not work: "inception" is not among the allowed values for the "provider" field in config.json. The list of allowed values (shown in VS Code as "Valid values") is:
Value is not accepted. Valid values: "openai", "free-trial", "anthropic", "anthropic-vertexai", "cohere", "bedrock", "bedrockimport", "sagemaker", "together", "novita", "ollama", "huggingface-tgi", "huggingface-inference-api", "llama.cpp", "replicate", "gemini", "gemini-vertexai", "lmstudio", "llamafile", "mistral", "mistral-vertexai", "deepinfra", "groq", "fireworks", "ncompass", "cloudflare", "deepseek", "azure", "msty", "watsonx", "openrouter", "sambanova", "nvidia", "vllm", "cerebras", "askSage", "nebius", "vertexai", "xAI", "kindo", "moonshot", "siliconflow", "function-network", "scaleway", "relace", "morph", "ovhcloud", "venice".
If I use "provider": "openai", the API request fails with error 401 "Incorrect API key provided".
How do I correctly set up the mercury-coder-small model from Inception Labs in the Continue plugin in VS Code?
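For reference, here is the kind of config sketch I would expect to work if Inception exposes an OpenAI-compatible endpoint; the apiBase URL below is my guess and not taken from the docs, and the apiKey value is a placeholder:

```json
{
  "models": [
    {
      "title": "Mercury Coder Small",
      "provider": "openai",
      "model": "mercury-coder-small",
      "apiBase": "https://api.inceptionlabs.ai/v1",
      "apiKey": "<INCEPTION_API_KEY>"
    }
  ]
}
```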
To reproduce
No response
Log output