The Azure OpenAI Starter Kit

The fastest way to get started with Azure OpenAI.

Rapidly deploy an Azure OpenAI instance with a GPT-5-mini model using a single CLI command. Includes OpenAI SDK examples in Python, TypeScript, Go, .NET, and Java using the Responses API.

Architecture Overview

Azure OpenAI Starter Kit Architecture

The Azure OpenAI Starter Kit provides Infrastructure as Code deployment with one-command setup and production-ready client examples for Python, TypeScript, Go, .NET, and Java, featuring secure EntraID authentication and the new Responses API optimized for GPT-5-mini.

Prerequisites

✅ Azure Subscription ✅ Azure Developer CLI ✅ Azure CLI

Quick Start

# 1. Login to Azure - both Azure CLI and Azure Developer CLI
az login
azd auth login

# 2. Deploy GPT-5-mini to Azure OpenAI 
azd up

That's it! 🚀 You now have Azure OpenAI with a GPT-5-mini model deployed and ready to use!

Next Steps

Option A: Keyless Authentication (Recommended) 🔐

Use keyless authentication with Azure Identity - the secure, production-ready approach.

Click to expand Keyless setup and code examples

Setup Steps

zsh/bash
# 1. Get your endpoint
endpoint=$(azd env get-value 'AZURE_OPENAI_ENDPOINT')

# 2. Set environment variable
export AZURE_OPENAI_ENDPOINT=$endpoint

# 3. Assign yourself the OpenAI User role
userId=$(az ad signed-in-user show --query id -o tsv)
resourceId="/subscriptions/$(az account show --query id -o tsv)/resourceGroups/rg-$(azd env get-value 'AZURE_ENV_NAME')/providers/Microsoft.CognitiveServices/accounts/$(azd env get-value 'AZURE_OPENAI_NAME')"
az role assignment create --role "Cognitive Services OpenAI User" --assignee $userId --scope $resourceId

# 4. Run EntraID examples
cd src/python && python responses_example_entra.py
# or
cd src/typescript && tsx responses_example_entra.ts
# or
cd src/go && go run .
# or
cd src/dotnet && dotnet run responses_example_entra.cs
# or
cd src/java && mvn clean compile exec:java -Dexec.mainClass="com.azure.openai.starter.ResponsesExampleEntra"
PowerShell
# 1. Get your endpoint
$endpoint = azd env get-value 'AZURE_OPENAI_ENDPOINT'

# 2. Set environment variable
$env:AZURE_OPENAI_ENDPOINT=$endpoint

# 3. Assign yourself the OpenAI User role
$userId = az ad signed-in-user show --query id -o tsv
$resourceId = "/subscriptions/$(az account show --query id -o tsv)/resourceGroups/rg-$(azd env get-value 'AZURE_ENV_NAME')/providers/Microsoft.CognitiveServices/accounts/$(azd env get-value 'AZURE_OPENAI_NAME')"
az role assignment create --role "Cognitive Services OpenAI User" --assignee $userId --scope $resourceId

# 4. Run EntraID examples
cd src/python && python responses_example_entra.py
# or
cd src/typescript && tsx responses_example_entra.ts
# or
cd src/go && go run .
# or
cd src/dotnet && dotnet run responses_example_entra.cs
# or
cd src/java && mvn clean compile exec:java -Dexec.mainClass="com.azure.openai.starter.ResponsesExampleEntra"

NOTE: If your Azure account is associated with more than one Azure tenant, specify the tenant ID before running the app; otherwise you'll get an authentication error.

# zsh/bash
export AZURE_TENANT_ID=$(az account show --query "tenantId" -o tsv)

# PowerShell
$env:AZURE_TENANT_ID = az account show --query "tenantId" -o tsv

Code Samples

Python Code:
import os

from openai import OpenAI
from azure.identity import DefaultAzureCredential, get_bearer_token_provider

token_provider = get_bearer_token_provider(
    DefaultAzureCredential(),
    "https://cognitiveservices.azure.com/.default"
)

client = OpenAI(
    base_url=f"{os.getenv('AZURE_OPENAI_ENDPOINT')}openai/v1/",
    api_key=token_provider
)

response = client.responses.create(
    model="gpt-5-mini",
    input="Explain quantum computing in simple terms",
    max_output_tokens=1000
)
print(response.output_text)
TypeScript Code:
import OpenAI from "openai";
import { DefaultAzureCredential, getBearerTokenProvider } from "@azure/identity";

const tokenProvider = getBearerTokenProvider(
    new DefaultAzureCredential(),
    "https://cognitiveservices.azure.com/.default"
);

const client = new OpenAI({
    baseURL: `${process.env.AZURE_OPENAI_ENDPOINT}openai/v1/`,
    apiKey: tokenProvider as any
});

const response = await client.responses.create({
    model: "gpt-5-mini",
    input: "Explain quantum computing in simple terms",
    max_output_tokens: 1000
});
console.log(response.output_text);
Go Code:
import (
    "context"
    "log"

    "github.com/Azure/azure-sdk-for-go/sdk/azidentity"
    "github.com/openai/openai-go/v3"
    "github.com/openai/openai-go/v3/azure"
    "github.com/openai/openai-go/v3/option"
    "github.com/openai/openai-go/v3/responses"
)

cred, err := azidentity.NewDefaultAzureCredential(nil)

if err != nil {
    log.Fatalf("Failed to create DefaultAzureCredential: %s", err)
}

const scope = "https://cognitiveservices.azure.com/.default"

// Initialize OpenAI client with Azure endpoint and the token
client := openai.NewClient(
    option.WithBaseURL(endpoint+"/openai/v1/"),
    azure.WithTokenCredential(cred, azure.WithTokenCredentialScopes([]string{scope})),
)

resp, err := client.Responses.New(context.TODO(), responses.ResponseNewParams{
    Model: "gpt-5-mini",
    Input: responses.ResponseNewParamsInputUnion{
        OfString: openai.String("Explain quantum computing in simple terms"),
    },
    MaxOutputTokens: openai.Int(1000),
})
.NET Code:
#!/usr/bin/dotnet run

#:package OpenAI@2.*
#:package Azure.Identity@1.*

using System.ClientModel.Primitives;
using Azure.Identity;
using OpenAI;
using OpenAI.Responses;

#pragma warning disable OPENAI001

var policy = new BearerTokenPolicy(new DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default");
var clientOptions = new OpenAIClientOptions
{
    Endpoint = new Uri($"{Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT")!.TrimEnd('/')}/openai/v1/")
};

var responseClient = new OpenAIResponseClient("gpt-5-mini", policy, clientOptions);
var responseCreationOptions = new ResponseCreationOptions
{
    MaxOutputTokenCount = 1000
};

var response1 = await responseClient.CreateResponseAsync(
    userInputText: "Explain quantum computing in simple terms",
    options: responseCreationOptions);

Console.WriteLine(response1.Value.GetOutputText());
Java Code:

Add the following imports:

import com.azure.identity.AuthenticationUtil;
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.openai.client.OpenAIClient;
import com.openai.client.okhttp.OpenAIOkHttpClient;
import com.openai.credential.BearerTokenCredential;
import com.openai.models.responses.Response;
import com.openai.models.responses.ResponseCreateParams;

Code snippet:

Supplier<String> bearerTokenSupplier = AuthenticationUtil.getBearerTokenSupplier(
    new DefaultAzureCredentialBuilder().build(), 
    "https://cognitiveservices.azure.com/.default"
);

OpenAIClient client = OpenAIOkHttpClient.builder()
    // The Responses API is served from the openai/v1/ path on the Azure OpenAI endpoint
    .baseUrl(System.getenv("AZURE_OPENAI_ENDPOINT") + "openai/v1/")
    .credential(BearerTokenCredential.create(bearerTokenSupplier))
    .build();

Response response = client.responses().create(
    ResponseCreateParams.builder()
        .model("gpt-5-mini")
        .input(ResponseCreateParams.Input.ofText("Explain quantum computing in simple terms"))
        .maxOutputTokens(1000)
        .build()
);
System.out.println(response.output());

Why Keyless?

✅ No API keys to manage or rotate
✅ Better security with Azure RBAC
✅ Works with your Azure login
✅ Production-ready and enterprise-grade
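
To confirm that the role assignment from step 3 took effect before running the examples, you can list assignments on the resource with the Azure CLI. A minimal sketch, assuming the userId and resourceId variables from the keyless setup steps above are still set in your shell:

# Check that "Cognitive Services OpenAI User" is assigned to your user on the resource
az role assignment list --assignee $userId --scope $resourceId -o table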


Option B: API Key Authentication (Quick Start)

For quick testing and development:

Click to expand API key setup and code examples

Setup Steps

zsh/bash
# 1. Get your endpoint
endpoint=$(azd env get-value 'AZURE_OPENAI_ENDPOINT')

# 2. Get your API key
apiKey=$(az cognitiveservices account keys list --name $(azd env get-value 'AZURE_OPENAI_NAME') --resource-group rg-$(azd env get-value 'AZURE_ENV_NAME') --query key1 -o tsv)

# 3. Set environment variables
export AZURE_OPENAI_ENDPOINT=$endpoint
export AZURE_OPENAI_API_KEY=$apiKey

# 4. Run API key examples
cd src/python && python responses_example.py
# or
cd src/typescript && npm start
# or
cd src/go && go run .
# or
cd src/dotnet && dotnet run responses_example.cs
# or
cd src/java && mvn clean compile exec:java -Dexec.mainClass="com.azure.openai.starter.ResponsesExample"
PowerShell
# 1. Get your endpoint
$endpoint = azd env get-value 'AZURE_OPENAI_ENDPOINT'

# 2. Get your API key
$apiKey = az cognitiveservices account keys list --name $(azd env get-value 'AZURE_OPENAI_NAME') --resource-group rg-$(azd env get-value 'AZURE_ENV_NAME') --query key1 -o tsv

# 3. Set environment variables
$env:AZURE_OPENAI_ENDPOINT=$endpoint
$env:AZURE_OPENAI_API_KEY=$apiKey

# 4. Run API key examples
cd src/python && python responses_example.py
# or
cd src/typescript && npm start
# or
cd src/go && go run .
# or
cd src/dotnet && dotnet run responses_example.cs
# or
cd src/java && mvn clean compile exec:java -Dexec.mainClass="com.azure.openai.starter.ResponsesExample"

Code Samples

Python Code:
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.getenv("AZURE_OPENAI_API_KEY"), 
    base_url=f"{os.getenv('AZURE_OPENAI_ENDPOINT')}openai/v1/"
)

response = client.responses.create(
    model="gpt-5-mini",
    input="Explain quantum computing in simple terms",
    max_output_tokens=1000
)
print(response.output_text)
TypeScript Code:
import OpenAI from 'openai';

const client = new OpenAI({
    apiKey: process.env.AZURE_OPENAI_API_KEY,
    baseURL: `${process.env.AZURE_OPENAI_ENDPOINT}openai/v1/`
});

const response = await client.responses.create({
    model: "gpt-5-mini",
    input: "Explain quantum computing in simple terms",
    max_output_tokens: 1000
});
console.log(response.output_text);
Go Code:
import (
    "context"

    "github.com/openai/openai-go/v3"
    "github.com/openai/openai-go/v3/option"
    "github.com/openai/openai-go/v3/responses"
)

client := openai.NewClient(
    option.WithBaseURL(endpoint+"/openai/v1/"),
    option.WithAPIKey(apiKey),
)

resp, err := client.Responses.New(context.TODO(), responses.ResponseNewParams{
    Model: "gpt-5-mini",
    Input: responses.ResponseNewParamsInputUnion{
        OfString: openai.String("Explain quantum computing in simple terms"),
    },
    MaxOutputTokens: openai.Int(1000),
})
.NET Code:
#!/usr/bin/dotnet run

#:package OpenAI@2.*

using System.ClientModel;
using OpenAI;
using OpenAI.Responses;

#pragma warning disable OPENAI001

var credential = new ApiKeyCredential(Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY"));
var clientOptions = new OpenAIClientOptions
{
    Endpoint = new Uri($"{Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT").TrimEnd('/')}/openai/v1/")
};
var responseClient = new OpenAIResponseClient("gpt-5-mini", credential, clientOptions);
var responseCreationOptions = new ResponseCreationOptions
{
    MaxOutputTokenCount = 1000
};

var response1 = await responseClient.CreateResponseAsync(
    userInputText: "Explain quantum computing in simple terms",
    options: responseCreationOptions);

Console.WriteLine($"Response: {response1.Value.GetOutputText()}");
Java Code:

Add the following imports:

import com.openai.client.OpenAIClient;
import com.openai.client.okhttp.OpenAIOkHttpClient;
import com.openai.models.responses.Response;
import com.openai.models.responses.ResponseCreateParams;

Code snippet:

OpenAIClient client = OpenAIOkHttpClient.builder()
    .apiKey(System.getenv("AZURE_OPENAI_API_KEY"))
    // The Responses API is served from the openai/v1/ path on the Azure OpenAI endpoint
    .baseUrl(System.getenv("AZURE_OPENAI_ENDPOINT") + "openai/v1/")
    .build();

Response response = client.responses().create(
    ResponseCreateParams.builder()
        .model("gpt-5-mini")
        .input(ResponseCreateParams.Input.ofText("Explain quantum computing in simple terms"))
        .maxOutputTokens(1000)
        .build()
);
System.out.println(response.output());

📖 See CLIENT_README.md for a detailed setup guide with more examples

What This Template Includes

  • Core Infrastructure: Azure OpenAI resource with GPT-5-mini deployment
  • Optimal Configuration: Flexible region selection, GlobalStandard SKU, v1 API
  • Secure Authentication: EntraID (Azure Identity) recommended + API key option
  • Client Examples: Python, TypeScript, Go and Java using the new Responses API
  • Validation Scripts: PowerShell and Bash scripts for testing
  • Complete Documentation: Setup guides and troubleshooting tips

What You Get

✅ GPT-5-mini (2025-08-07) - Latest reasoning model, no registration required
✅ Flexible region deployment - Choose your optimal region
✅ New v1 API support - Future-proof, no version management needed
✅ Automatic deployment - Model ready to use immediately
✅ Multi-language examples - Python, TypeScript/Node.js, Go, .NET, and Java clients
✅ Two authentication methods - API keys (quick start) + EntraID (production-ready)
✅ Unique resource naming - No conflicts with existing resources
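
The setup steps above read individual values with azd env get-value; to see everything the deployment recorded in the azd environment at once, you can dump all values. A minimal sketch; the exact names come from the template's Bicep outputs (e.g. AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_NAME):

# List all values stored in the current azd environment
azd env get-values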

Template Structure

├── azure.yaml                 # azd configuration
├── infra/
│   ├── main.bicep             # Main deployment template
│   ├── main.parameters.json   # Deployment parameters
│   └── resources.bicep        # Azure OpenAI resource definition
├── src/
│   ├── dotnet/
│   │   ├── responses_example.cs         # API key authentication
│   │   ├── responses_example_entra.cs   # EntraID authentication
│   │   ├── global.json                  # .NET SDK configuration
│   │   └── README.md                    # .NET prerequisites
│   ├── go/
│   │   ├── responses_example
│   │   │   ├── main.go                  # API key authentication
│   │   │   ├── go.mod                   # Go module dependencies
│   │   │   └── go.sum                   # Go dependency checksums
│   │   └── responses_example_entra
│   │       ├── main.go                  # EntraID authentication
│   │       ├── go.mod                   # Go module dependencies
│   │       └── go.sum                   # Go dependency checksums
│   ├── java/
│   │   ├── pom.xml                      # Maven dependencies
│   │   └── src/main/java/com/azure/openai/starter/
│   │       ├── ResponsesExample.java           # API key authentication
│   │       └── ResponsesExampleEntra.java      # EntraID authentication
│   ├── python/
│   │   ├── responses_example.py         # API key authentication
│   │   ├── responses_example_entra.py   # EntraID authentication
│   │   └── requirements.txt             # Python dependencies
│   └── typescript/
│       ├── responses_example.ts         # API key authentication
│       ├── responses_example_entra.ts   # EntraID authentication
│       ├── package.json                 # Node.js dependencies
│       └── tsconfig.json                # TypeScript configuration
├── CLIENT_README.md           # Detailed setup guide (Python & TypeScript)
├── validate.ps1               # PowerShell validation script
└── validate.sh                # Bash validation script

Common Commands

# Redeploy with changes
azd up

# See what would be deployed
azd provision --preview  

# Debug deployment issues
azd up --debug

# Clean up everything  
azd down

Alternative Regions

Want to deploy to East US 2 instead?

azd env set AZURE_LOCATION eastus2
azd up

Model Customization

Want a different model? Edit infra/resources.bicep:

// Current: GPT-5-mini (no registration required)
gptModelName: 'gpt-5-mini'
gptModelVersion: '2025-08-07'

// Alternatives (no registration required):
gptModelName: 'gpt-5-nano'      // Fastest
gptModelName: 'gpt-5-chat'      // Chat-optimized

// Full GPT-5 (requires registration):
gptModelName: 'gpt-5'           // Needs approval

Troubleshooting

"GPT-5-mini not available" β†’ Try azd env set AZURE_LOCATION eastus2

"Quota exceeded" β†’ Check your subscription quota for Azure OpenAI

"Permission denied" β†’ Ensure you have Cognitive Services Contributor role

"Tenant provide token mismatch" β†’ Set AZURE_TENANT_ID environment variable with the proper tenant ID.

Need debug info? β†’ Run azd up --debug for detailed logs
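
For the quota error above, one way to inspect Azure OpenAI usage against your limits is the Cognitive Services usage listing in the Azure CLI. A sketch, assuming the az cognitiveservices usage list command is available in your CLI version; pass the region you deployed to:

# Show quota/usage for Azure AI services in your deployment region
az cognitiveservices usage list --location eastus2 -o table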

Why This Template?

✅ Minimal setup - 2 commands instead of 20+
✅ Latest model - GPT-5-mini with reasoning capabilities
✅ Future-proof - Uses the new v1 API, no version management
✅ Production-ready - GlobalStandard SKU, EntraID auth, proper naming
✅ Complete examples - Python, TypeScript, Go, .NET, and Java with error handling
✅ Secure by default - Supports keyless authentication with Azure Identity
✅ Easy cleanup - Remove everything with azd down


Happy AI building with GPT-5-mini! 🤖✨

Powered by Azure Developer CLI | Deploys GPT-5-mini (2025-08-07)
