Open source VoIP billing & routing platform

How to deploy PyFreeBilling with Docker?

Prerequisites

Understanding the Docker Compose Setup

Before you begin, check out the production.yml file in the root of this project. Take note of how it provides configuration for the following services (a trimmed outline follows the list below):

  • django: your application running behind Gunicorn;
  • postgres: PostgreSQL database with the application’s relational data;
  • redis: Redis instance for caching;
  • caddy: Caddy web server with HTTPS on by default.
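
For orientation only, the service definitions in production.yml follow roughly this shape; the build contexts, images, commands and exact options are defined in the file that ships with the project, so treat this as an illustrative outline rather than something to copy:

services:
  django:
    # the application image, served by Gunicorn
    depends_on:
      - postgres
      - redis
  postgres:
    # PostgreSQL, with its data kept in a named volume
    volumes:
      - production_postgres_data:/var/lib/postgresql/data
  redis:
    # cache backend for Django
  caddy:
    # terminates TLS and proxies requests to the django service
    ports:
      - "0.0.0.0:80:80"
      - "0.0.0.0:443:443"

volumes:
  production_postgres_data: {}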

HTTPS is On by Default

SSL (Secure Sockets Layer) is the standard security technology for establishing an encrypted link between a server and a client, in this case between a web server (your website) and a browser. Without HTTPS, malicious network users can sniff authentication credentials traveling between your website and your end users' browsers.

It is always better to deploy a site behind HTTPS, and this will only become more important as web services extend to the IoT (Internet of Things). For this reason, we have set up a number of security defaults to help keep your website secure:

  • If you are not using a subdomain of the domain name set in the project, remember to put your staging/production IP address in the DJANGO_ALLOWED_HOSTS environment variable before you deploy your website (see the example after this list). Failure to do this means Django will reject the requests and you will not have access to your website.
  • Access to the Django admin is set up by default to require HTTPS in production or once live.
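
For instance, assuming your server's public address is 203.0.113.10 (a placeholder, replace it with your own) and that the production settings read DJANGO_ALLOWED_HOSTS as a comma-separated list, as django-environ's env.list() does, the entry in .envs/.production/.django would look like:

DJANGO_ALLOWED_HOSTS=.example.com,203.0.113.10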

The Caddy web server used in the default configuration will get you a valid certificate from Let's Encrypt and renew it automatically. All you need to do to enable this is make sure that your DNS records are pointing to the server Caddy runs on.

You can read more about this under Automatic HTTPS in the Caddy docs.
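
Before starting the stack, you can quickly verify that your DNS records already resolve to the server, for example (example.com is a placeholder for your own domain):

dig +short example.com
dig +short www.example.com

Both commands should print the public IP address of the machine that will run Caddy.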

Configuring the Environment

The most important thing for us right now is the env_file section listing ./.envs/.production/.postgres. Generally, the stack's behavior is governed by a number of environment variables (envs, for short) residing in .envs/. For instance, this is the layout generated for you:

.envs
├── .local
│   ├── .django
│   ├── .kamailio
│   └── .postgres
└── .production
    ├── .caddy
    ├── .django
    ├── .kamailio
    └── .postgres

Consider the aforementioned .envs/.production/.postgres:

# PostgreSQL
# ------------------------------------------------------------------------------
POSTGRES_HOST=postgres
POSTGRES_DB=pyfbv3
POSTGRES_USER=pyfreebilling
POSTGRES_PASSWORD=mypassword

The envs defined here are POSTGRES_HOST, POSTGRES_DB, POSTGRES_USER, and POSTGRES_PASSWORD (the values above are examples; at the very least, replace the password with a strong one of your own). You might have figured out already where these definitions will end up; it works the same way for the django and caddy service container envs.
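
Concretely, these files are handed to the containers through env_file entries in production.yml along these lines (trimmed for illustration):

  django:
    env_file:
      - ./.envs/.production/.django
      - ./.envs/.production/.postgres

  postgres:
    env_file:
      - ./.envs/.production/.postgres

  caddy:
    env_file:
      - ./.envs/.production/.caddy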

Now, create the file .envs/.production/.caddy:

# Caddy
# ------------------------------------------------------------------------------
DOMAIN_NAME=example.com
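
This DOMAIN_NAME value is what the bundled Caddyfile serves. As a minimal sketch only, assuming Caddy v2 syntax and that Gunicorn listens on port 5000 inside the django container (the Caddyfile shipped in the repository is the authoritative reference), the site definition amounts to:

{$DOMAIN_NAME} {
    reverse_proxy django:5000
}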

Then, create the file .envs/.production/.django:

# General
# ------------------------------------------------------------------------------
DJANGO_SETTINGS_MODULE=config.settings.production
DJANGO_SECRET_KEY=your_secret_key_CHANGE_IT
DJANGO_ADMIN_URL=iKOwUVcqSTnUGwFTwD9ZCRIDBtfkcf7c/
DJANGO_ALLOWED_HOSTS=.example.com

# Security
# ------------------------------------------------------------------------------
# TIP: Caddy already redirects HTTP to HTTPS, so the Django-level redirect can stay off
DJANGO_SECURE_SSL_REDIRECT=False

# Email
# ------------------------------------------------------------------------------
MAILGUN_API_KEY=
DJANGO_SERVER_EMAIL=
MAILGUN_DOMAIN=

# Gunicorn
# ------------------------------------------------------------------------------
WEB_CONCURRENCY=4


# Redis
# ------------------------------------------------------------------------------
REDIS_URL=redis://redis:6379/0
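
DJANGO_SECRET_KEY and DJANGO_ADMIN_URL above are only placeholders; generate your own random values, for example with openssl:

openssl rand -base64 48   # paste the output into DJANGO_SECRET_KEY
openssl rand -hex 16      # use the output (plus a trailing slash) as DJANGO_ADMIN_URL

Using a random, hard-to-guess admin URL avoids exposing the well-known /admin/ path.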

Finally, create an empty file .envs/.production/.kamailio.

(Optional) Postgres Data Volume Modifications

Postgres saves its database files to the production_postgres_data volume by default. Change that if you want something else, and make sure to take backups, since this is not done automatically.
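
If you would rather keep the data in a directory on the host, you can swap the named volume for a bind mount along these lines; the host path below is only an example, and the container path assumes PostgreSQL's default data directory, so check the postgres service in production.yml for the exact mount point:

  postgres:
    volumes:
      - /srv/pyfreebilling/postgres_data:/var/lib/postgresql/data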

Building & Running Production Stack

You will need to build the stack first. To do that, run:

docker-compose -f production.yml build

Once this is ready, you can run it with:

docker-compose -f production.yml up

To run the stack and detach the containers, run:

docker-compose -f production.yml up -d
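
You can also rebuild the images and (re)start the stack in detached mode in one go:

docker-compose -f production.yml up --build -d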

To run a migration, open up a second terminal and run:

docker-compose -f production.yml run --rm django python manage.py migrate

To create a superuser, run:

docker-compose -f production.yml run --rm django python manage.py createsuperuser

If you need a shell, run:

docker-compose -f production.yml run --rm django python manage.py shell

To check the logs out, run:

docker-compose -f production.yml logs
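
To follow the logs of a single service, for example the django container, run:

docker-compose -f production.yml logs -f django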

To see how your containers are doing, run:

docker-compose -f production.yml ps
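
To stop the stack, run:

docker-compose -f production.yml down

Add the -v flag only if you also want to remove the named volumes, including production_postgres_data, which deletes the database.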