Draft
- Add `'azure'` to `llmProviderSchema` enum and `ProviderConfigMap` type
- Add Azure entry to `PROVIDER_META` with API key + resource name auth
- No default models (deployment-specific, depends on user's base URL)
- Add Azure factory in `LLM_PROVIDERS` using `@ai-sdk/azure` `createAzure`
- Add Azure icon SVG and provider icon case
- Add 'Azure OpenAI' label to usage filters
- Add Azure env vars to `.env.example`
- Guard `getDefaultModelId` against empty models array

Co-authored-by: Christophe Blefari <christophe.blefari@gmail.com>
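The guard in the final bullet can be sketched as follows; the `Model` shape and the exact signature are assumptions for illustration, not the PR's actual code:

```typescript
// Sketch of a defensive getDefaultModelId: Azure ships with no default
// models, so an empty array must not crash the lookup.
// The Model interface here is an assumption, not the PR's real type.
interface Model {
  id: string;
  default?: boolean;
}

function getDefaultModelId(models: Model[]): string | undefined {
  if (models.length === 0) return undefined; // guard against empty models array
  const preferred = models.find((m) => m.default);
  return (preferred ?? models[0]).id;
}
```

With Azure's empty model list this returns `undefined`, which callers can treat as "the user must enter a deployment name".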
The resource name is embedded in the base URL the user provides, and the deployment name is the model ID entered in the UI. No need for a separate credential field.

Co-authored-by: Christophe Blefari <christophe.blefari@gmail.com>
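To illustrate the point, a hypothetical helper (not part of the PR) that recovers the resource name from a standard Azure OpenAI base URL:

```typescript
// The Azure resource name is the first label of the standard endpoint
// hostname, https://{name}.openai.azure.com, so a base URL already
// identifies the resource. Hypothetical helper, for illustration only.
function resourceNameFromBaseUrl(baseUrl: string): string | undefined {
  const host = new URL(baseUrl).hostname;
  const match = host.match(/^([^.]+)\.openai\.azure\.com$/);
  return match ? match[1] : undefined;
}

// resourceNameFromBaseUrl("https://my-resource.openai.azure.com/openai")
// → "my-resource"
```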
…rovider

Restore `resourceName` as an extra field (alternative to `baseURL` — not both). Add `apiVersion` and `useDeploymentBasedUrls` extra fields matching the `@ai-sdk/azure` `createAzure` options. All three fall back to env vars.

Co-authored-by: Christophe Blefari <christophe.blefari@gmail.com>
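The env-var fallback described here could look roughly like this; the field and helper names are illustrative, only the env var names come from the PR:

```typescript
// Per-provider extra fields with env-var fallback: an explicitly configured
// field wins, otherwise the matching AZURE_* env var is used.
interface AzureExtraFields {
  resourceName?: string;
  apiVersion?: string;
  useDeploymentBasedUrls?: boolean;
}

function resolveAzureExtras(
  fields: AzureExtraFields,
  env: Record<string, string | undefined>
): AzureExtraFields {
  return {
    resourceName: fields.resourceName ?? env.AZURE_RESOURCE_NAME,
    apiVersion: fields.apiVersion ?? env.AZURE_API_VERSION,
    useDeploymentBasedUrls:
      fields.useDeploymentBasedUrls ??
      (env.AZURE_USE_DEPLOYMENT_BASED_URLS === "true" ? true : undefined),
  };
}
```

Leaving unset values as `undefined` (rather than filling in defaults) lets the `@ai-sdk/azure` SDK apply its own defaults downstream.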
## Summary

Adds Azure OpenAI as a supported LLM provider using the `@ai-sdk/azure` package.

## Changes

### Backend (`apps/backend/`)

- Add `'azure'` to the `llmProviderSchema` enum in `types/llm.ts`
- Add `AzureOpenAIResponsesProviderOptions` to `ProviderConfigMap`
- Add Azure entry to `PROVIDER_META` in `provider-meta.ts` with the `AZURE_API_KEY` env var and the `AZURE_OPENAI_BASE_URL` base URL env var
- No default `extractorModelId`/`summaryModelId` (users must configure their own deployment names)
- Add Azure factory to `LLM_PROVIDERS` in `providers.ts` using `createAzure` from `@ai-sdk/azure`; `apiVersion` and `useDeploymentBasedUrls` are forwarded to `createAzure` only when set (otherwise the `@ai-sdk/azure` SDK default applies)
- Guard `getDefaultModelId` against empty models arrays to prevent crashes

### Frontend (`apps/frontend/`)

- Add `'azure'` case to the `LlmProviderIcon` component
- Add `'Azure OpenAI'` label to `providerLabels` in the usage filters

### Config

- Add Azure env vars to `.env.example`
- Add the `@ai-sdk/azure` dependency

## Environment Variables

- `AZURE_API_KEY`
- `AZURE_RESOURCE_NAME` (`https://{name}.openai.azure.com`) — alternative to the base URL
- `AZURE_OPENAI_BASE_URL`
- `AZURE_API_VERSION` (defaults to `v1`)
- `AZURE_USE_DEPLOYMENT_BASED_URLS` — set to `true` for the legacy deployment URL format

## Testing

- `npm run lint -w @nao/backend` — passes
- `npm run lint -w @nao/frontend` — passes
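Since `AZURE_RESOURCE_NAME` and `AZURE_OPENAI_BASE_URL` are alternatives, endpoint resolution might be sketched like this; the helper is hypothetical, only the variable names and the not-both rule come from the PR:

```typescript
// Hypothetical helper (not in the PR): resolve the Azure endpoint from the
// two alternative settings. An explicit base URL is used verbatim; a bare
// resource name expands to the standard Azure OpenAI hostname.
function resolveAzureEndpoint(resourceName?: string, baseURL?: string): string {
  if (resourceName && baseURL) {
    throw new Error("Set AZURE_RESOURCE_NAME or AZURE_OPENAI_BASE_URL, not both");
  }
  if (baseURL) return baseURL;
  if (resourceName) return `https://${resourceName}.openai.azure.com`;
  throw new Error("One of AZURE_RESOURCE_NAME or AZURE_OPENAI_BASE_URL is required");
}
```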