Step-by-step guide to integrating Bolt.new AI with Azure Machine Learning in 2025 for faster, scalable AI workflows.

To integrate Bolt.new with Azure Machine Learning, you don’t “connect” Bolt itself to Azure; instead, you write normal client code inside your Bolt workspace that calls Azure ML’s REST APIs to submit jobs, invoke deployed models, or manage pipelines. Bolt acts as the environment where you write and run this integration code, while Azure ML stays the external system. You authenticate with Azure AD (typically via a Service Principal) and call your Azure ML workspace endpoints from Bolt using standard HTTP requests.
You create Azure Machine Learning resources on Azure (a workspace plus a model deployed to an online endpoint), then in Bolt.new you:
1. store the Service Principal credentials and endpoint URL as environment variables,
2. fetch an Azure AD access token using those credentials, and
3. send HTTPS requests to your Azure ML endpoint with the token attached.
That’s the entire integration pattern: Bolt → REST → Azure Machine Learning.
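The code in this guide expects a handful of environment variables to be set in your Bolt workspace. A sketch of a `.env` file is shown below; the variable names are simply the ones the snippets read (nothing Bolt or Azure mandates), and the placeholder values must come from your own Azure AD app registration and Azure ML workspace.

```shell
# .env — values come from your Azure AD app registration and your Azure ML workspace.
# Variable names match this guide's snippets; rename them freely as long as the code agrees.
AZURE_TENANT_ID=<your-azure-ad-tenant-id>
AZURE_CLIENT_ID=<service-principal-application-id>
AZURE_CLIENT_SECRET=<service-principal-secret>
AZURE_ML_ENDPOINT_URL=<https://xxxx.region.inference.ml.azure.com/score>
```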
This is the simplest stable pattern used in real systems.
This version uses pure REST calls (no Azure SDK required) because it works in any environment including Bolt’s sandbox.
```javascript
// Gets an Azure AD token via the OAuth2 client-credentials flow.
// Scope note: tokens used to call an online endpoint's /score URI need the
// https://ml.azure.com/.default scope; control-plane calls (submitting jobs,
// managing pipelines via ARM) need https://management.azure.com/.default instead.
async function getAzureToken(scope = "https://ml.azure.com/.default") {
  const tenantId = process.env.AZURE_TENANT_ID;
  const clientId = process.env.AZURE_CLIENT_ID;
  const clientSecret = process.env.AZURE_CLIENT_SECRET;

  const url = `https://login.microsoftonline.com/${tenantId}/oauth2/v2.0/token`;
  const params = new URLSearchParams();
  params.append("client_id", clientId);
  params.append("client_secret", clientSecret);
  params.append("grant_type", "client_credentials");
  params.append("scope", scope);

  const response = await fetch(url, { method: "POST", body: params });
  if (!response.ok) {
    throw new Error(`Token request failed: ${response.status} ${await response.text()}`);
  }
  const data = await response.json();
  return data.access_token;
}
```
```javascript
// Calls a deployed Azure ML online endpoint with a JSON payload.
async function callAzureMLEndpoint(inputData) {
  // example: https://xxxx.eastus.inference.ml.azure.com/score
  const endpointUrl = process.env.AZURE_ML_ENDPOINT_URL;
  const token = await getAzureToken();

  const response = await fetch(endpointUrl, {
    method: "POST",
    headers: {
      "Authorization": `Bearer ${token}`,
      "Content-Type": "application/json"
    },
    body: JSON.stringify(inputData)
  });
  if (!response.ok) {
    throw new Error(`Scoring request failed: ${response.status} ${await response.text()}`);
  }
  return response.json();
}
```
```javascript
// Example usage inside Bolt — the payload shape must match what your
// deployed model's scoring script expects.
(async () => {
  const prediction = await callAzureMLEndpoint({
    input_data: [{ text: "hello world" }]
  });
  console.log("Azure ML Prediction:", prediction);
})();
```
Azure ML model deployments expose HTTPS endpoints. These are normal REST endpoints. Bolt.new can call any REST API if you supply credentials. The authentication layer is Azure AD, not Bolt. Bolt is simply running Node/JS code that fetches a token and calls your endpoint.
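Client-credentials tokens are typically valid for about an hour, so fetching a fresh one on every scoring call wastes a round trip to Azure AD. A minimal sketch of a cache you could wrap around the token fetcher (`makeTokenCache` and its parameters are illustrative, not part of any SDK):

```javascript
// Reuses an Azure AD token until shortly before it expires.
// `fetchToken` is any async function returning { access_token, expires_in }
// (the shape the OAuth2 token endpoint responds with); `skewSeconds` refreshes
// the token a little early to avoid using one that expires mid-request.
function makeTokenCache(fetchToken, skewSeconds = 300) {
  let cachedToken = null;
  let expiresAt = 0; // Unix seconds

  return async function getToken() {
    const now = Date.now() / 1000;
    if (cachedToken && now < expiresAt - skewSeconds) {
      return cachedToken; // still fresh — no network call
    }
    const { access_token, expires_in } = await fetchToken();
    cachedToken = access_token;
    expiresAt = now + expires_in;
    return cachedToken;
  };
}
```

You would wrap the guide's `getAzureToken` once (`const getToken = makeTokenCache(getAzureToken)`) and call `getToken()` everywhere a token is needed.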
After prototyping in Bolt.new, move your code into your real backend environment. Use a secrets manager (Azure Key Vault, AWS Secrets Manager, or environment variables) and ensure least-privilege Service Principal permissions. The REST call patterns remain identical.