
How to deploy a Trained Model using Docker

Deploying machine learning models traditionally involves managing dependencies, environment configurations, and scalability challenges. Docker, a containerization platform, simplifies this process by packaging applications and their dependencies into standardized units called containers. This approach ensures that your model runs seamlessly across various computing environments, from local development to production servers.

Deploy trained model using Docker

In this tutorial, we will explore how you can streamline the deployment process, mitigate compatibility issues, and focus more on optimizing your models for real-world applications by leveraging Docker.



You can watch the video-based tutorial with a step-by-step explanation down below.


Please refer to this article to create a Flask web app before proceeding further.

Project Setup


First, we will look at the project folder structure.

Project folder structure
  • The project structure, as illustrated above, consists of several key components imported from the Iris Flask web app tutorial. These include the templates, the deploy.py file, the training notebook, the Iris dataset, and the trained model.

  • In the existing deploy.py file, update the host to 0.0.0.0 and the port to 5000 so the Flask server is reachable from outside the container.
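The host/port change described above can be sketched as follows. This is a minimal, hypothetical skeleton (the real deploy.py from the Flask tutorial also loads the trained Iris model and defines the prediction route); the key part is the arguments passed to app.run:

```python
# deploy.py — minimal sketch of the host/port change (hypothetical skeleton;
# the real file also loads the trained Iris model and serves the templates).
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    # Placeholder route; the tutorial's app renders the Iris prediction form here.
    return "Iris Flask app is running"

if __name__ == "__main__":
    # host="0.0.0.0" makes the server reachable from outside the container;
    # port 5000 matches the -p 5000:5000 mapping used when running the image.
    app.run(host="0.0.0.0", port=5000)
```

Binding to 0.0.0.0 matters because Flask's default of 127.0.0.1 only accepts connections from inside the container itself, so the published port would appear dead from the host.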



Docker setup


First, we will create a Dockerfile.

FROM python:3.8-slim
WORKDIR /app

COPY requirements.txt requirements.txt
RUN pip install -r requirements.txt

COPY . .
CMD ["python", "deploy.py"]
  • Use the official python:3.8-slim image as the base image.

  • Set the working directory inside the container to /app.

  • Copy the requirements.txt file to the container and install the dependencies listed in it.

  • Copy the entire current directory (.) to the container’s working directory.

  • Define the default command to run deploy.py using Python.
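The Dockerfile above expects a requirements.txt next to it. A minimal example for this project might look like the following; the exact packages and versions depend on what your training notebook and deploy.py actually import, so treat these entries as placeholders:

```text
flask
scikit-learn
pandas
numpy
```

Pinning versions (e.g. flask==3.0.3) is a good habit here, since it keeps the image reproducible across rebuilds.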



Deploy Docker Image


First, we will build the Docker image.

docker build -t iris-flask-app .
docker build logs

This command will:

  • Build a Docker image using the Dockerfile located in the current directory (.).

  • Tag the resulting image with the name iris-flask-app.



Next, we will run a container from the image.

docker run -d -p 5000:5000 iris-flask-app
Docker image
Logs generated

This command will:

  • Run the container in detached mode (-d), meaning the container will run in the background.

  • Map port 5000 of your host to port 5000 of the container (-p 5000:5000), allowing you to access the Flask app running inside the container at http://localhost:5000.

  • Run the container based on the iris-flask-app image, which was built previously.



Once the command is executed successfully, you should be able to visit http://localhost:5000 to interact with the Flask app.
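Beyond opening the page in a browser, you can verify that the container is actually serving requests with a small script. This is an illustrative check (the check_app helper is hypothetical), assuming the app serves its index page at the root URL on the mapped port:

```python
# Quick programmatic smoke test for the deployed container.
# Adjust the URL if you mapped a different host port with -p.
import urllib.request

def check_app(base_url="http://localhost:5000"):
    """Return the HTTP status code of the app's index page."""
    with urllib.request.urlopen(base_url, timeout=5) as resp:
        return resp.status

if __name__ == "__main__":
    print("App responded with status", check_app())
```

A 200 status confirms the request made it through the published port to the Flask server inside the container; a connection error usually means the container is not running or the port mapping is wrong.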

Flask App


Final Thoughts

  • Deploying a trained machine learning model using Docker provides a reliable, scalable, and consistent environment across different stages of development. By containerizing the model and its dependencies, you can ensure that the application will run seamlessly, regardless of the underlying infrastructure.

  • Docker allows you to package your model, along with the necessary runtime, libraries, and configurations, into a lightweight container that can be easily shared, tested, and deployed in any environment—be it on-premise, cloud, or even serverless platforms.

  • For developers and data scientists, Docker bridges the gap between model development and deployment by eliminating common "it works on my machine" issues. It promotes a more collaborative workflow, enabling teams to focus on improving model accuracy and performance without worrying about environment inconsistencies.

In conclusion, using Docker for model deployment streamlines the process, promotes portability, and supports scalability, making it an essential tool for modern machine learning operations. As you continue to develop and deploy models, consider leveraging Docker's vast ecosystem to enhance your deployment pipelines and simplify operational complexities.



Get the project code from here


Thanks for reading the article!!!


Check out more project videos from the YouTube channel Hackers Realm



