The fastest way to get started with Azure OpenAI.
Rapidly deploy an Azure OpenAI instance with a GPT-5-mini model using a single CLI command. Includes OpenAI SDK examples for Python, TypeScript, Go, .NET, and Java using the Responses API.
The Azure OpenAI Starter Kit provides Infrastructure as Code deployment with one-command setup and production-ready client examples for Python, TypeScript, Go, .NET, and Java, featuring secure EntraID authentication and the new Responses API optimized for GPT-5-mini.
✅ Azure Subscription
✅ Azure Developer CLI
✅ Azure CLI
# 1. Login to Azure - both Azure CLI and Azure Developer CLI
az login
azd auth login
# 2. Deploy GPT-5-mini to Azure OpenAI
azd up

That's it! 🎉 You now have Azure OpenAI with the GPT-5-mini model deployed and ready to use!
Use keyless authentication with Azure Identity - the secure, production-ready approach.
Click to expand Keyless setup and code examples
zsh/bash
# 1. Get your endpoint
endpoint=$(azd env get-value 'AZURE_OPENAI_ENDPOINT')
# 2. Set environment variable
export AZURE_OPENAI_ENDPOINT=$endpoint
# 3. Assign yourself the OpenAI User role
userId=$(az ad signed-in-user show --query id -o tsv)
resourceId="/subscriptions/$(az account show --query id -o tsv)/resourceGroups/rg-$(azd env get-value 'AZURE_ENV_NAME')/providers/Microsoft.CognitiveServices/accounts/$(azd env get-value 'AZURE_OPENAI_NAME')"
az role assignment create --role "Cognitive Services OpenAI User" --assignee $userId --scope $resourceId
# 4. Run EntraID examples
cd src/python && python responses_example_entra.py
# or
cd src/typescript && tsx responses_example_entra.ts
# or
cd src/go && go run .
# or
cd src/dotnet && dotnet run responses_example_entra.cs
# or
cd src/java && mvn clean compile exec:java -Dexec.mainClass="com.azure.openai.starter.ResponsesExampleEntra"

PowerShell
# 1. Get your endpoint
$endpoint = azd env get-value 'AZURE_OPENAI_ENDPOINT'
# 2. Set environment variable
$env:AZURE_OPENAI_ENDPOINT=$endpoint
# 3. Assign yourself the OpenAI User role
$userId = az ad signed-in-user show --query id -o tsv
$resourceId = "/subscriptions/$(az account show --query id -o tsv)/resourceGroups/rg-$(azd env get-value 'AZURE_ENV_NAME')/providers/Microsoft.CognitiveServices/accounts/$(azd env get-value 'AZURE_OPENAI_NAME')"
az role assignment create --role "Cognitive Services OpenAI User" --assignee $userId --scope $resourceId
# 4. Run EntraID examples
cd src/python && python responses_example_entra.py
# or
cd src/typescript && tsx responses_example_entra.ts
# or
cd src/go && go run .
# or
cd src/dotnet && dotnet run responses_example_entra.cs
# or
cd src/java && mvn clean compile exec:java -Dexec.mainClass="com.azure.openai.starter.ResponsesExampleEntra"

NOTE: If your Azure account is bound to more than one Azure tenant, specify the tenant ID before running the app; otherwise you'll get an authentication error.
# zsh/bash
export AZURE_TENANT_ID=$(az account show --query "tenantId" -o tsv)

# PowerShell
$env:AZURE_TENANT_ID = az account show --query "tenantId" -o tsv
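For reference, the `--scope` passed to `az role assignment create` in step 3 is the full ARM resource ID of the Azure OpenAI account. With placeholder values (the subscription ID, environment name, and account name below are hypothetical), it has this shape:

```shell
# Hypothetical placeholder values -- substitute your own
subscriptionId="00000000-0000-0000-0000-000000000000"
envName="demo"
accountName="oai-demo"

# Same shape as the resourceId assembled in step 3
scope="/subscriptions/$subscriptionId/resourceGroups/rg-$envName/providers/Microsoft.CognitiveServices/accounts/$accountName"
echo "$scope"
```

If the role assignment fails, comparing your scope against this shape is a quick way to spot a wrong resource group or account name.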
Python Code:
import os

from openai import OpenAI
from azure.identity import DefaultAzureCredential, get_bearer_token_provider

token_provider = get_bearer_token_provider(
    DefaultAzureCredential(),
    "https://cognitiveservices.azure.com/.default"
)

client = OpenAI(
    base_url=f"{os.getenv('AZURE_OPENAI_ENDPOINT')}openai/v1/",
    api_key=token_provider
)

response = client.responses.create(
    model="gpt-5-mini",
    input="Explain quantum computing in simple terms",
    max_output_tokens=1000
)

print(response.output_text)

TypeScript Code:
import OpenAI from "openai";
import { DefaultAzureCredential, getBearerTokenProvider } from "@azure/identity";
const tokenProvider = getBearerTokenProvider(
  new DefaultAzureCredential(),
  "https://cognitiveservices.azure.com/.default"
);

const client = new OpenAI({
  baseURL: `${process.env.AZURE_OPENAI_ENDPOINT}openai/v1/`,
  apiKey: tokenProvider as any
});

const response = await client.responses.create({
  model: "gpt-5-mini",
  input: "Explain quantum computing in simple terms",
  max_output_tokens: 1000
});

console.log(response.output_text);

Go Code:
import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
	"github.com/openai/openai-go/v3"
	"github.com/openai/openai-go/v3/azure"
	"github.com/openai/openai-go/v3/option"
	"github.com/openai/openai-go/v3/responses"
)

endpoint := os.Getenv("AZURE_OPENAI_ENDPOINT")

cred, err := azidentity.NewDefaultAzureCredential(nil)
if err != nil {
	log.Fatalf("Failed to create DefaultAzureCredential: %s", err)
}

const scope = "https://cognitiveservices.azure.com/.default"

// Initialize OpenAI client with Azure endpoint and the token credential
client := openai.NewClient(
	option.WithBaseURL(endpoint+"/openai/v1/"),
	azure.WithTokenCredential(cred, azure.WithTokenCredentialScopes([]string{scope})),
)

resp, err := client.Responses.New(context.TODO(), responses.ResponseNewParams{
	Model: "gpt-5-mini",
	Input: responses.ResponseNewParamsInputUnion{
		OfString: openai.String("Explain quantum computing in simple terms"),
	},
	MaxOutputTokens: openai.Int(1000),
})
if err != nil {
	log.Fatalf("Responses API call failed: %s", err)
}
fmt.Println(resp.OutputText())

.NET Code:
#!/usr/bin/dotnet run
#:package OpenAI@2.*
#:package Azure.Identity@1.*
using System.ClientModel.Primitives;
using Azure.Identity;
using OpenAI;
using OpenAI.Responses;
#pragma warning disable OPENAI001
var policy = new BearerTokenPolicy(new DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default");
var clientOptions = new OpenAIClientOptions
{
Endpoint = new Uri($"{Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT")!.TrimEnd('/')}/openai/v1/"),
};
var responseClient = new OpenAIResponseClient("gpt-5-mini", policy, clientOptions);
var responseCreationOptions = new ResponseCreationOptions
{
MaxOutputTokenCount = 1000
};
var response1 = await responseClient.CreateResponseAsync(
userInputText: "Explain quantum computing in simple terms",
options: responseCreationOptions);
Console.WriteLine(response1.Value.GetOutputText());

Java Code:
Add the following imports:
import com.azure.identity.AuthenticationUtil;
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.openai.client.OpenAIClient;
import com.openai.client.okhttp.OpenAIOkHttpClient;
import com.openai.credential.BearerTokenCredential;
import com.openai.models.responses.Response;
import com.openai.models.responses.ResponseCreateParams;

Code snippet:
Supplier<String> bearerTokenSupplier = AuthenticationUtil.getBearerTokenSupplier(
    new DefaultAzureCredentialBuilder().build(),
    "https://cognitiveservices.azure.com/.default"
);

OpenAIClient client = OpenAIOkHttpClient.builder()
    .baseUrl(System.getenv("AZURE_OPENAI_ENDPOINT"))
    .credential(BearerTokenCredential.create(bearerTokenSupplier))
    .build();

Response response = client.responses().create(
    ResponseCreateParams.builder()
        .model("gpt-5-mini")
        .input(ResponseCreateParams.Input.ofText("Explain quantum computing in simple terms"))
        .maxOutputTokens(1000)
        .build()
);

System.out.println(response.output());

Why Keyless?
✅ No API keys to manage or rotate
✅ Better security with Azure RBAC
✅ Works with your Azure login
✅ Production-ready and enterprise-grade
For quick testing and development:
Click to expand API key setup and code examples
zsh/bash
# 1. Get your endpoint
endpoint=$(azd env get-value 'AZURE_OPENAI_ENDPOINT')
# 2. Get your API key
apiKey=$(az cognitiveservices account keys list --name $(azd env get-value 'AZURE_OPENAI_NAME') --resource-group rg-$(azd env get-value 'AZURE_ENV_NAME') --query key1 -o tsv)
# 3. Set environment variables
export AZURE_OPENAI_ENDPOINT=$endpoint
export AZURE_OPENAI_API_KEY=$apiKey
# 4. Run API key examples
cd src/python && python responses_example.py
# or
cd src/typescript && npm start
# or
cd src/go && go run .
# or
cd src/dotnet && dotnet run responses_example.cs
# or
cd src/java && mvn clean compile exec:java -Dexec.mainClass="com.azure.openai.starter.ResponsesExample"

PowerShell
# 1. Get your endpoint
$endpoint = azd env get-value 'AZURE_OPENAI_ENDPOINT'
# 2. Get your API key
$apiKey = az cognitiveservices account keys list --name $(azd env get-value 'AZURE_OPENAI_NAME') --resource-group rg-$(azd env get-value 'AZURE_ENV_NAME') --query key1 -o tsv
# 3. Set environment variables
$env:AZURE_OPENAI_ENDPOINT=$endpoint
$env:AZURE_OPENAI_API_KEY=$apiKey
# 4. Run API key examples
cd src/python && python responses_example.py
# or
cd src/typescript && npm start
# or
cd src/go && go run .
# or
cd src/dotnet && dotnet run responses_example.cs
# or
cd src/java && mvn clean compile exec:java -Dexec.mainClass="com.azure.openai.starter.ResponsesExample"

Python Code:
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
    base_url=f"{os.getenv('AZURE_OPENAI_ENDPOINT')}openai/v1/"
)

response = client.responses.create(
    model="gpt-5-mini",
    input="Explain quantum computing in simple terms",
    max_output_tokens=1000
)

print(response.output_text)

TypeScript Code:
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.AZURE_OPENAI_API_KEY,
  baseURL: `${process.env.AZURE_OPENAI_ENDPOINT}openai/v1/`
});

const response = await client.responses.create({
  model: "gpt-5-mini",
  input: "Explain quantum computing in simple terms",
  max_output_tokens: 1000
});

console.log(response.output_text);

Go Code:
import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/openai/openai-go/v3"
	"github.com/openai/openai-go/v3/option"
	"github.com/openai/openai-go/v3/responses"
)

endpoint := os.Getenv("AZURE_OPENAI_ENDPOINT")
apiKey := os.Getenv("AZURE_OPENAI_API_KEY")

client := openai.NewClient(
	option.WithBaseURL(endpoint+"/openai/v1/"),
	option.WithAPIKey(apiKey),
)

resp, err := client.Responses.New(context.TODO(), responses.ResponseNewParams{
	Model: "gpt-5-mini",
	Input: responses.ResponseNewParamsInputUnion{
		OfString: openai.String("Explain quantum computing in simple terms"),
	},
	MaxOutputTokens: openai.Int(1000),
})
if err != nil {
	log.Fatalf("Responses API call failed: %s", err)
}
fmt.Println(resp.OutputText())

.NET Code:
#!/usr/bin/dotnet run
#:package OpenAI@2.*
using System.ClientModel;
using OpenAI;
using OpenAI.Responses;
#pragma warning disable OPENAI001
var credential = new ApiKeyCredential(Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY")!);
var clientOptions = new OpenAIClientOptions
{
    Endpoint = new Uri($"{Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT")!.TrimEnd('/')}/openai/v1/")
};
var responseClient = new OpenAIResponseClient("gpt-5-mini", credential, clientOptions);
var responseCreationOptions = new ResponseCreationOptions
{
MaxOutputTokenCount = 1000
};
var response1 = await responseClient.CreateResponseAsync(
userInputText: "Explain quantum computing in simple terms",
options: responseCreationOptions);
Console.WriteLine($"Response: {response1.Value.GetOutputText()}");

Java Code:
Add the following imports:
import com.openai.client.OpenAIClient;
import com.openai.client.okhttp.OpenAIOkHttpClient;
import com.openai.models.responses.Response;
import com.openai.models.responses.ResponseCreateParams;

Code snippet:
OpenAIClient client = OpenAIOkHttpClient.builder()
    .apiKey(System.getenv("AZURE_OPENAI_API_KEY"))
    .baseUrl(System.getenv("AZURE_OPENAI_ENDPOINT"))
    .build();

Response response = client.responses().create(
    ResponseCreateParams.builder()
        .model("gpt-5-mini")
        .input(ResponseCreateParams.Input.ofText("Explain quantum computing in simple terms"))
        .maxOutputTokens(1000)
        .build()
);

System.out.println(response.output());

📖 See CLIENT_README.md for a detailed setup guide with more examples
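A note on the base URL: the client snippets build it as `{endpoint}openai/v1/`, which assumes the `AZURE_OPENAI_ENDPOINT` value keeps its trailing slash. If you are unsure, you can normalize the variable in the shell before exporting it (a hypothetical helper, not part of the kit):

```shell
# Strip any trailing slash, then add exactly one back
normalize_endpoint() {
  printf '%s/\n' "${1%/}"
}

normalize_endpoint "https://my-aoai.openai.azure.com"    # -> https://my-aoai.openai.azure.com/
normalize_endpoint "https://my-aoai.openai.azure.com/"   # -> https://my-aoai.openai.azure.com/
```

For example: `export AZURE_OPENAI_ENDPOINT=$(normalize_endpoint "$endpoint")` makes the later string concatenation safe either way.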
- Core Infrastructure: Azure OpenAI resource with GPT-5-mini deployment
- Optimal Configuration: Flexible region selection, GlobalStandard SKU, v1 API
- Secure Authentication: EntraID (Azure Identity) recommended + API key option
- Client Examples: Python, TypeScript, Go, .NET, and Java using the new Responses API
- Validation Scripts: PowerShell and Bash scripts for testing
- Complete Documentation: Setup guides and troubleshooting tips
✅ GPT-5-mini (2025-08-07) - Latest reasoning model, no registration required
✅ Flexible region deployment - Choose your optimal region
✅ New v1 API support - Future-proof, no version management needed
✅ Automatic deployment - Model ready to use immediately
✅ Multi-language examples - Python, TypeScript/Node.js, Go, .NET, and Java clients
✅ Two authentication methods - API keys (quick start) + EntraID (production-ready)
✅ Unique resource naming - No conflicts with existing resources
├── azure.yaml                          # azd configuration
├── infra/
│   ├── main.bicep                      # Main deployment template
│   ├── main.parameters.json            # Deployment parameters
│   └── resources.bicep                 # Azure OpenAI resource definition
├── src/
│   ├── dotnet/
│   │   ├── responses_example.cs        # API key authentication
│   │   ├── responses_example_entra.cs  # EntraID authentication
│   │   ├── global.json                 # .NET SDK configuration
│   │   └── README.md                   # .NET prerequisites
│   ├── go/
│   │   ├── responses_example/
│   │   │   ├── main.go                 # API key authentication
│   │   │   ├── go.mod                  # Go module dependencies
│   │   │   └── go.sum                  # Go dependency checksums
│   │   └── responses_example_entra/
│   │       ├── main.go                 # EntraID authentication
│   │       ├── go.mod                  # Go module dependencies
│   │       └── go.sum                  # Go dependency checksums
│   ├── java/
│   │   ├── pom.xml                     # Maven dependencies
│   │   └── src/main/java/com/azure/openai/starter/
│   │       ├── ResponsesExample.java       # API key authentication
│   │       └── ResponsesExampleEntra.java  # EntraID authentication
│   ├── python/
│   │   ├── responses_example.py        # API key authentication
│   │   ├── responses_example_entra.py  # EntraID authentication
│   │   └── requirements.txt            # Python dependencies
│   └── typescript/
│       ├── responses_example.ts        # API key authentication
│       ├── responses_example_entra.ts  # EntraID authentication
│       ├── package.json                # Node.js dependencies
│       └── tsconfig.json               # TypeScript configuration
├── CLIENT_README.md                    # Detailed setup guide (Python & TypeScript)
├── validate.ps1                        # PowerShell validation script
└── validate.sh                         # Bash validation script
# Redeploy with changes
azd up
# See what would be deployed
azd provision --preview
# Debug deployment issues
azd up --debug
# Clean up everything
azd down

Want to deploy to East US 2 instead?
azd env set AZURE_LOCATION eastus2
azd up

Want a different model? Edit infra/resources.bicep:
// Current: GPT-5-mini (no registration required)
gptModelName: 'gpt-5-mini'
gptModelVersion: '2025-08-07'
// Alternatives (no registration required):
gptModelName: 'gpt-5-nano' // Fastest
gptModelName: 'gpt-5-chat' // Chat-optimized
// Full GPT-5 (requires registration):
gptModelName: 'gpt-5' // Needs approval

"GPT-5-mini not available" → Try azd env set AZURE_LOCATION eastus2
"Quota exceeded" → Check your subscription quota for Azure OpenAI
"Permission denied" → Ensure you have the Cognitive Services Contributor role
"Tenant provide token mismatch" → Set the AZURE_TENANT_ID environment variable to the proper tenant ID
Need debug info? → Run azd up --debug for detailed logs
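Several of the errors above come down to a missing or stale environment variable. A small helper (hypothetical, not part of the kit) reports which of the expected variables are set without printing their values:

```shell
# Report SET/UNSET for each variable name given, without echoing secret values
check_env() {
  for v in "$@"; do
    if printenv "$v" >/dev/null; then
      echo "$v: SET"
    else
      echo "$v: UNSET"
    fi
  done
}

check_env AZURE_OPENAI_ENDPOINT AZURE_OPENAI_API_KEY AZURE_TENANT_ID
```

If an authentication error mentions the wrong tenant, `AZURE_TENANT_ID: UNSET` in this output is the usual culprit.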
✅ Minimal setup - 2 commands instead of 20+
✅ Latest model - GPT-5-mini with reasoning capabilities
✅ Future-proof - Uses new v1 API, no version management
✅ Production-ready - GlobalStandard SKU, EntraID auth, proper naming
✅ Complete examples - Python, TypeScript, Go, .NET, and Java with error handling
✅ Secure by default - Supports keyless authentication with Azure Identity
✅ Easy cleanup - Remove everything with azd down
Happy AI building with GPT-5-mini! 🤖✨
Powered by Azure Developer CLI | Deploys GPT-5-mini (2025-08-07)
