Matheus Bratfisch Cogito ergo sum

Docker-compose with PHP-FPM, sendmail, nginx, mariadb serving jekyll and wordpress

As I explained recently, I had a blog running Wordpress and decided to move to Jekyll. There was a catch, though: I didn’t want to lose any link pointing at my Wordpress blog. To achieve this, I set up nginx to first try to find a static file from Jekyll and, if it is not found, fall back to Wordpress.

I was running my server on an EC2 instance with RDS and it was becoming a bit expensive, so I decided to move everything to one machine and dockerize my setup so I could switch servers easily.

To achieve this, I have created a docker-compose with:

  • PHP-FPM with sendmail, to process PHP and send mail
  • Nginx, to serve the Jekyll static files and, when they are not found, serve my old Wordpress blog
  • MariaDB, as the database for Wordpress
version: '3'
services:
  fpm:
    # image: php:7.0-fpm-alpine
    build: php7fpm
    restart: always
    volumes:
      - ./wordpress.matbra.com/:/var/www/wordpress.matbra.com
      - ./php7fpm/sendmail.mc:/usr/share/sendmail/cf/debian/sendmail.mc
      - ./php7fpm/gmail-auth.db:/etc/mail/authinfo/gmail-auth.db
    ports:
      - "9000:9000"
    links:
      - mariadb 
    hostname: boarders.com.br
  
  nginx:
    image: nginx:1.10.1-alpine
    restart: always
    volumes:
      - ./nginx/nginx.conf:/etc/nginx/nginx.conf
      - ./nginx/app.vhost:/etc/nginx/conf.d/default.conf
      - ./logs/nginx:/var/log/nginx
      - ./wordpress.matbra.com/:/var/www/wordpress.matbra.com
      - ./jekyll.matbra.com/:/var/www/jekyll.matbra.com
    ports:
      - "80:80"
      - "443:443"
    links:
      - fpm

  mariadb:
    image: mariadb
    restart: always
    environment:
      - MYSQL_ROOT_PASSWORD=yourpassword
      - MYSQL_DATABASE=
    volumes:
      - ./data/db:/var/lib/mysql

PHP-FPM container:

I’m using a custom Dockerfile based on php:7.0-fpm that adds sendmail support and the mysql extension. A custom starter script runs sendmail + php-fpm. (I know I should create a specific container for sendmail.)
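The Dockerfile itself isn’t reproduced here, but a minimal sketch of that idea could look like the following (the package list and the start.sh name are assumptions, not the actual file):

```dockerfile
# Based on the official php:7.0-fpm image; adds sendmail and the mysqli extension.
FROM php:7.0-fpm
RUN apt-get update \
    && apt-get install -y sendmail \
    && docker-php-ext-install mysqli
# Hypothetical starter script that launches sendmail and then php-fpm in the foreground.
COPY start.sh /usr/local/bin/start.sh
RUN chmod +x /usr/local/bin/start.sh
CMD ["start.sh"]
```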

On this container I’m basically mapping some php files and config files:

  • ./wordpress.matbra.com to /var/www/wordpress.matbra.com which are my wordpress files
  • ./php7fpm/sendmail.mc to /usr/share/sendmail/cf/debian/sendmail.mc which is my configuration file for sendmail
  • ./php7fpm/gmail-auth.db to /etc/mail/authinfo/gmail-auth.db which holds the credentials for my gmail account (see Configuring gmail as relay to sendmail)

I’m also mapping port 9000 to 9000, so I can communicate with PHP-FPM on this port, creating a link to mariadb and setting the hostname.

NGINX container:

I’m using the regular nginx alpine image with some volume mappings:

  • ./nginx/nginx.conf to /etc/nginx/nginx.conf which is my nginx configuration
  • ./nginx/app.vhost to /etc/nginx/conf.d/default.conf which is my website configuration with Jekyll falling back to wordpress
  • ./logs/nginx to /var/log/nginx which will be my log directory
  • ./wordpress.matbra.com/ to /var/www/wordpress.matbra.com which is the place where nginx can find wordpress website
  • ./jekyll.matbra.com/ to /var/www/jekyll.matbra.com which is the place where nginx can find jekyll website

I’m also mapping ports 80 to 80 and 443 to 443, and creating a link to PHP-FPM so nginx can communicate with the fpm container.
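On the nginx side, PHP requests are handed to the fpm container through that port; within the compose network nginx can resolve the service by its name. A hedged fragment of what the vhost’s PHP location can look like:

```nginx
# Sketch: pass PHP requests to the linked fpm service over port 9000.
location ~ \.php$ {
    root /var/www/wordpress.matbra.com;
    fastcgi_pass fpm:9000;
    fastcgi_index index.php;
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    include fastcgi_params;
}
```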

MARIADB container:

No mystery here: the regular mariadb image, with a mapping for the data directory and some environment variables.

Because I’m not adding my website files to the image, I have created an init.sh command to remove the website directory and clone the website from git. There is also an update-config.sh command to update the wp-config.php file with the correct environment variables.
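A hypothetical sketch of what an update-config.sh along those lines might do — replace placeholders in wp-config.php with values taken from the environment (the placeholder style and variable names are made up for illustration):

```shell
# Work in a throwaway directory with a fake wp-config.php.
cd "$(mktemp -d)"
printf "define('DB_PASSWORD', '{{DB_PASSWORD}}');\n" > wp-config.php
# Substitute the placeholder with the value from the environment.
DB_PASSWORD=my-secret
sed -i "s/{{DB_PASSWORD}}/$DB_PASSWORD/" wp-config.php
cat wp-config.php
```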

With this I can easily spin up a new machine with my website structure.

https://github.com/x-warrior/blog-docker

I hope this will be helpful for you. Matheus


Install ZNC IRC Bouncer on AWS Linux

If you want to install the ZNC IRC bouncer you will need CMake, but the CMake on AWS Linux is too old. [Update your cmake to 3.x](http://www.matbra.com/2017/12/07/install-cmake-on-aws-linux.html)

Now you will need git to clone the ZNC source code, and openssl-devel to have SSL support:

# yum install git openssl-devel

Clone ZNC source code

$ git clone https://github.com/znc/znc.git

Enter the source code folder:

$ cd znc

Initialize submodules

$ git submodule update --init --recursive

Install it with:

$ cmake .
$ make
# make install    (run this step as root)

Configure it with:

$ znc --makeconf

Best regards, Matheus


Install Cmake 3 on AWS Linux

If you are trying to build something using CMake and you get the error “CMake 3.1 or higher is required. You are running version 2.8.12.2”, you can manually install a recent CMake version. To do this, I first removed the previous CMake:

# yum remove cmake

Then I checked that it was really removed:

$ cmake 
-bash: /usr/bin/cmake: No such file or directory

Install G++

# yum install gcc-c++

Download the latest version from the CMake download page:

$ wget https://cmake.org/files/v3.10/cmake-3.10.0.tar.gz

Extract it:

$ tar -xvzf cmake-3.10.0.tar.gz

Enter the cmake folder:

$ cd cmake-3.10.0

Build and install it with (only the install step needs root):

$ ./bootstrap
$ make
# make install

Now you should have cmake under /usr/local/bin/cmake

Best regards, Matheus


Loopback model migration using postgresql database

I have been playing with Loopback. Initially I was just declaring models and using the in-memory datasource, but now I have reached a point where I need a persistent database.

I couldn’t find an easy way to keep my database schema in sync with my models. I’m not sure if I’m just not familiar enough with Loopback yet, or if the documentation is not clear enough.

To create a script to sync your models with your database you can create a file under bin/ called autoupdate.js and add the following:

var path = require('path');

var app = require(path.resolve(__dirname, '../server/server'));
var ds = app.datasources.db;
ds.autoupdate(function(err) {
  if (err) throw err;
  ds.disconnect();
});

The code is pretty simple: it fetches the app from server.js, grabs the datasource and runs the autoupdate command. You could use automigrate instead, but that one recreates the tables and wipes their data every time, so be careful with it.

I think this will work for most datasources, but if it doesn’t work for yours, drop me a line and I can try to help. :D

Matheus

PS: Loopback does not create migrations or do as thorough a job as Django, and you can sometimes end up in weird states; it seems Loopback works better with NoSQL databases.


Django Storages with Boto3 and additional Metadata only for Media

I have a personal project using Python with Django and django-storages to upload my static and media files to Amazon S3. Because my media files are named with UUIDs and are not editable in my system, I wanted a long expiration time on them so I could save some bandwidth. But I didn’t want this on the static files, which are updated more regularly whenever I update the system.

Most resources refer to AWS_HEADERS, but it didn’t work for me; it seems it applies only to boto (not boto3). After looking into the boto3 source code I discovered AWS_S3_OBJECT_PARAMETERS, which works for boto3, but it is a system-wide setting, so I had to extend S3Boto3Storage.

So the code that solved my problem was:

from storages.backends.s3boto3 import S3Boto3Storage

class MediaRootS3Boto3Storage(S3Boto3Storage):
    location = 'media'
    object_parameters = {
        'CacheControl': 'max-age=604800'
    }

If you’re using boto (not boto3) and you want specific parameters only for the media class, you could use:

from storages.backends.s3boto import S3BotoStorage

class MediaRootS3BotoStorage(S3BotoStorage):
    location = 'media'
    headers = {
        'CacheControl': 'max-age=604800'
    }

You also need to update your django-storages settings. Pay attention to the class name: for boto3 it is S3Boto3Storage, while for boto it doesn’t have the 3 after Boto.

DEFAULT_FILE_STORAGE = 'package.module.MediaRootS3Boto3Storage'

A very simple tip, but it took me a while to find out how it works.

Matheus


Nginx redirect on failure

As a few of you probably noticed, I recently decided to update my really old Wordpress blog from PHP 4~5 to a more recent setup, leaving a shared host for Heroku, which later became Amazon EC2.

I had to decide whether to keep Wordpress or change to a different technology such as Jekyll. I thought a lot about this and in the end I decided to use Jekyll. Why? Because using something new would motivate me to study, play with something new and work more.

Having decided to work with Jekyll, I had to think about my domain. I didn’t want to break my old Wordpress blog; I want to keep it alive as a record and for SEO points. But how to keep both living together in a nice way?

I thought the ideal would be to have something that tries to access the new website and, if the page is not found, redirects to the old Wordpress website, while complying with the HTTP status codes (i.e., redirecting with a 301).

After some reading of the nginx documentation I found you can try to proxy to a server and, if that fails, fall back to another one. That seemed the ideal solution for now.
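The fallback itself can be sketched like this; a minimal, untested nginx fragment where the Jekyll files are tried first and anything missing is handed to the Wordpress server (server names and the upstream address are assumptions):

```nginx
server {
    listen 80;
    server_name www.matbra.com;
    root /var/www/jekyll.matbra.com;

    location / {
        # Serve the Jekyll static file if it exists;
        # otherwise hand the request to the Wordpress fallback below.
        try_files $uri $uri/ @wordpress;
    }

    location @wordpress {
        proxy_pass http://wordpress.matbra.com;
    }
}
```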

I have an nginx configuration file with multiple servers. First comes the Wordpress server block, which basically just hands PHP files to PHP-FPM under its own domain.

server {
   listen 80;
   server_name wordpress.matbra.com;

    location / {
        root   /var/www/wordpress/live;
        index  index.php index.html index.htm;
        try_files $uri $uri/ /index.php?$uri$args;
    }

    location ~ \.php$ {
        root /var/www/wordpress/live;
        fastcgi_pass   unix:/var/run/php-fpm/php-fpm.sock;
        fastcgi_index  index.php;
        fastcgi_param  SCRIPT_FILENAME  $document_root$fastcgi_script_name;
        include        fastcgi_params;
    }
}


Build Jekyll as production after push

If you want to build your Jekyll blog on your own server after a git push you can use git hooks. To do it, you can extend the Deploy after git push script and add these three lines (after the rm -rf), to install dependencies and build for the production environment.

	cd $LIVE_PATH
	bundle install
	JEKYLL_ENV=production jekyll build

Matheus


Force www on Jekyll website using Javascript

I wanted to force the “www” prefix on my Jekyll website, and because Jekyll has no back-end I couldn’t do it on the server, so I needed to use Javascript or meta tags. A few people say Google’s search engine handles a meta refresh as a 301/302, so that approach would be better from an SEO perspective.

If you want to force www prefix on your website using javascript, you can use this snippet:

<script>
if (window.location.hostname.indexOf("www.") != 0) {
	window.location = window.location.protocol + "//www." + window.location.hostname + window.location.pathname + window.location.search;
}
</script>

I have created an _includes/force_www.html file and I’m using jekyll.environment to load it, so it is only loaded in production.

Matheus


Set env var to PHP-FPM

After installing nginx and PHP, I wanted to use environment variables inside PHP 7 so I don’t need to commit configuration to my repo.

Usually the ideal is to set environment variables without saving them in a file, but in this case it was easier to.

If you want to add environment variables to your PHP-FPM you can edit /etc/php-fpm.d/www.conf (I’m doing it on Amazon Linux and PHP 7.0)

There is a flag, clear_env = no, which controls whether php-fpm receives a clean environment or not. I decided to leave it at the default value and set my vars as:

env[WP_SECURE_AUTH_KEY] = "some-value"
env[WP_NONCE_KEY] = "nonce-key"

After this I restarted my nginx and php-fpm.

sudo service nginx restart
sudo service php-fpm restart

Matheus


Deploy after push to your own git

I have explained how to push your code to your own git server, and after that you may want to execute some specific actions. In my case I wanted my code to be built and released as a new version, so I used the post-receive hook of my repo.

Oh, and it also handles multiple versions, keeping the last three releases. To do this it uses your DEPLOY_PATH, creating a sources folder on it, which holds your versions, and a live folder which is a symlink to the version that is running.

Vars:

  • REPO_PATH = Path to your git folder
  • DEPLOY_PATH = Path to your destination folder
  • DEPLOY_BRANCH = Branch you want to deploy
#!/bin/bash
REPO_PATH=/home/someuser/test.git
DEPLOY_PATH=/var/www/
DEPLOY_BRANCH="master"

echo "REPO_PATH=$REPO_PATH"
echo "DEPLOY_PATH=$DEPLOY_PATH"

while read oldrev newrev refname
do
    branch=$(git rev-parse --symbolic --abbrev-ref $refname)
    if [ $DEPLOY_BRANCH == "$branch" ]; then
        TIMESTAMP=$(date +%Y%m%d%H%M%S)
        VERSION_PATH=$DEPLOY_PATH/sources/$TIMESTAMP
        LIVE_PATH=$DEPLOY_PATH/live
        echo "TIMESTAMP=$TIMESTAMP"
        echo "VERSION_PATH=$VERSION_PATH"
        echo "LIVE_PATH=$LIVE_PATH"

        mkdir -p $VERSION_PATH
        mkdir -p $VERSION_PATH/sources

        git --work-tree=$VERSION_PATH --git-dir=$REPO_PATH checkout -f $DEPLOY_BRANCH
        # Remove git files
        rm -rf $VERSION_PATH/.git
        rm -rf $LIVE_PATH
        ln -s $VERSION_PATH $LIVE_PATH


        # Delete old version folders, keeping the live one plus the most recent others;
        # / , the sources folder itself and the live target are excluded as a safety measure
        rm -rf $(ls -1dt $(find -L $DEPLOY_PATH/sources/ -maxdepth 1 -type d ! -samefile / ! -samefile $DEPLOY_PATH/sources/ ! -samefile $LIVE_PATH -print) | tail -n+3)
    fi
done
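The sources/ + live layout the hook maintains can be demoed in isolation (a toy sketch in a temp directory; the timestamps are made up):

```shell
# Two timestamped version folders under sources/, as the hook would create.
DEPLOY_PATH=$(mktemp -d)
mkdir -p "$DEPLOY_PATH/sources/20240101000000" "$DEPLOY_PATH/sources/20240102000000"
# Swap "live": drop the old link and point it at the newest version.
rm -rf "$DEPLOY_PATH/live"
ln -s "$DEPLOY_PATH/sources/20240102000000" "$DEPLOY_PATH/live"
readlink "$DEPLOY_PATH/live"
```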

If you have any question, let me know. Matheus
