Deploy ML models with Docker and Flask. Step-by-step guide for containerizing your ML app and launching scalable APIs.

The trained model (model.pkl) is used to make predictions. The Flask application (app.py) will load this model and define an endpoint for predictions, using pickle to deserialize the trained model.
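For illustration, model.pkl can be produced by pickling any object that exposes a predict method. The stand-in model below is a placeholder for whatever estimator you actually trained (e.g. a fitted scikit-learn classifier); only the pickle round trip is the point here:

```python
import pickle

# Stand-in for a real trained estimator; any object with a predict
# method (e.g. a fitted scikit-learn model) is serialized the same way.
class DummyModel:
    def predict(self, features):
        # Return one label per input row
        return [0 for _ in features]

model = DummyModel()

# Serialize the trained model to model.pkl
with open('model.pkl', 'wb') as f:
    pickle.dump(model, f)

# Later (e.g. inside app.py) the same file is deserialized
with open('model.pkl', 'rb') as f:
    restored = pickle.load(f)

print(restored.predict([[5.1, 3.5, 1.4, 0.2]]))  # [0]
```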
```python
# Import necessary modules
from flask import Flask, request, jsonify  # Flask framework for API creation
import pickle  # To load the ML model
import numpy as np  # For numerical operations

# Initialize the Flask application
app = Flask(__name__)

# Load the pre-trained ML model
with open('model.pkl', 'rb') as model_file:
    model = pickle.load(model_file)  # Model deserialization

# Define an API route for prediction
@app.route('/predict', methods=['POST'])
def predict():
    # Retrieve JSON data from the request
    input_data = request.get_json()  # Input should be a dictionary containing features
    # Convert the input data to a suitable format (e.g., a NumPy array)
    features = np.array(input_data['data']).reshape(1, -1)  # Adjust shape to the model's input requirements
    # Make a prediction using the loaded model
    prediction = model.predict(features)  # Use the model's predict method
    # Return the prediction as a JSON response
    return jsonify({'prediction': prediction.tolist()})

# Start the Flask application when the container is executed
if __name__ == '__main__':
    # Set host to '0.0.0.0' to make the app externally visible
    app.run(host='0.0.0.0', port=5000, debug=True)
```
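The reshape(1, -1) call turns the flat feature list from the JSON payload into the single-row 2-D array most estimators expect. A quick sketch (the sample values are illustrative):

```python
import numpy as np

# A flat list of features, as it would arrive in the JSON payload
payload = {"data": [5.1, 3.5, 1.4, 0.2]}

# reshape(1, -1) produces a 2-D array with one row and as many
# columns as there are features
features = np.array(payload["data"]).reshape(1, -1)

print(features.shape)  # (1, 4)
```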
```dockerfile
# Use an official Python runtime as a parent image
FROM python:3.9

# Prevent Python from writing .pyc files and from buffering stdout/stderr
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1

# Set a working directory inside the container
WORKDIR /app

# Copy the requirements file to the container
COPY requirements.txt /app/requirements.txt

# Install any needed packages
RUN pip install --upgrade pip && pip install -r requirements.txt

# Copy the rest of the application code into the container
COPY . /app

# Expose port 5000 so it is accessible outside the container
EXPOSE 5000

# Define the command to run the Flask app when the container starts
CMD ["python", "app.py"]
```
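Because COPY . /app copies everything in the build context, a .dockerignore file can optionally keep local artifacts out of the image; a minimal sketch (entries are illustrative):

```text
__pycache__/
*.pyc
.git/
venv/
```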
List the application's dependencies in a requirements.txt file so they can be installed inside the image.
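A minimal requirements.txt for this app might look like the following. Pin versions to whatever you developed against; the ML library depends on how model.pkl was trained (scikit-learn is assumed here):

```text
flask
numpy
scikit-learn
```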
```shell
# From the directory containing the Dockerfile, build the Docker image
# and tag it as ml-flask-app:
docker build -t ml-flask-app:latest .

# Run the Docker container and map port 5000 of the container to
# port 5000 on the host machine:
docker run -p 5000:5000 ml-flask-app:latest
```
The app is now reachable at http://localhost:5000, and the /predict endpoint can be called with appropriate JSON data to get model predictions.
```python
# Example Python script for testing the /predict endpoint
import requests

# Define the API URL
url = "http://localhost:5000/predict"

# Example input data matching the expected model features
data = {"data": [5.1, 3.5, 1.4, 0.2]}  # Adjust values based on your model's requirements

# Send a POST request with JSON data
response = requests.post(url, json=data)

# Print the JSON response from the API
print(response.json())
```
If the container misbehaves, inspect its output with docker logs <container_id>.