
How to add Docker to a flask project and deploy on a VPS [Docker Coqui TTS]


After the last three articles, this is the final part of our series on creating a text-to-speech synthesizer and deploying it online. In this episode, we will build the Docker image, launch and deploy the container, and link it to a subdomain.

The steps here are simple and straightforward. First, we create a requirements.txt file in our project and fill it with all our dependencies.
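The exact contents depend on your project, but for a Flask TTS app like this one, a requirements.txt sketch could look like the following (the package names here are assumptions; list whatever your own project imports, e.g. via `pip freeze`):

```
# requirements.txt -- a sketch; pin the versions your project actually uses
flask
TTS
python-dotenv
```

Running `pip freeze > requirements.txt` inside your virtual environment captures the exact installed versions.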


Next, we create a Dockerfile containing the instructions: which base image to use, what to install, and how to launch the app. It looks something like this:

FROM python:3.9
RUN apt-get update
# libsndfile1 is not always needed, but our TTS package requires it
RUN apt-get install -y libsndfile1
COPY ./requirements.txt /app/requirements.txt
COPY .env /app/.env
RUN pip install -r /app/requirements.txt
COPY . /app
WORKDIR /app
ENTRYPOINT ["python"]
# app.py is a placeholder -- use your own entry script's name
CMD ["app.py"]

Let's start by defining the base image, python:3.9, a small image with Python version 3.9.

The libsndfile1 install is not always needed, but our TTS package requires it, so we install it here.

With the Dockerfile ready, we can build and run the image:

$ docker build -t martha .
$ docker run -p 5000:5000 martha

And our app should be running locally.
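For the `-p 5000:5000` port mapping to work, the Flask app inside the container must listen on all interfaces, not just localhost. A minimal entry script could look like this (a sketch; `app.py` and the route are assumptions, your actual app will differ):

```python
# app.py -- minimal Flask entry point (sketch)
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    return "Martha TTS is running"

if __name__ == "__main__":
    # Bind to 0.0.0.0 so the app is reachable from outside the container;
    # the default 127.0.0.1 only accepts connections from inside it.
    app.run(host="0.0.0.0", port=5000)
```

If the app binds to 127.0.0.1, the container will start but requests from the host will hang or be refused.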

In our case, we can simply use Nginx Proxy Manager to create and manage domains and subdomains, but other methods work too.

We need to run our app on the server within the same Docker network as Nginx Proxy Manager. In my case, the network is called npm, so to run my app I type:

docker run --network npm --name martha -d martha
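If the shared network does not exist yet on your server, you have to create it and attach Nginx Proxy Manager to it first. A sketch of those commands (the network name npm and the NPM container name are assumptions matching this setup):

```
# Create the shared network once
docker network create npm

# Attach an already-running Nginx Proxy Manager container to it
docker network connect npm nginx-proxy-manager
```

Containers on the same user-defined network can reach each other by container name, which is what makes the proxy configuration below possible.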

I assign the container a name because, in NPM, the proxy host needs to point at that container name and the app's port (martha on port 5000, in our case).

And those are, broadly, the steps.

You can find the whole process on YouTube, the source code on GitHub, and the live app, Martha.

The next series on this blog is a complete course on how to go from data to big data analytics. Stay tuned for the upcoming posts.

Let's Innovate together for a better future.

We have the knowledge and the infrastructure to build, deploy, and monitor AI solutions for any of your needs.

Contact us