Matheus Bratfisch Cogito ergo sum

Deploy after push to your own git

I have explained how to push your code to your own git server, and after that you may want to run some specific steps. In my case, I wanted my code to be built and a new version released, so I used a post-receive hook on my repo.

Oh, it also handles multiple versions, keeping the last 3 releases. To do this it uses your DEPLOY_PATH and creates a sources folder inside it, which holds your versions, plus a live folder which is a symlink to the version that is currently running.
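
For example, with the script below and a DEPLOY_PATH of /var/www/, the deploy folder ends up looking roughly like this (the timestamps are made-up examples):

/var/www/
├── live -> /var/www/sources/20170102120000
└── sources/
    ├── 20170101090000
    ├── 20170101180000
    └── 20170102120000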

Vars:

  • REPO_PATH = Path to your git folder
  • DEPLOY_PATH = Path to your destination folder
  • DEPLOY_BRANCH = Branch you want to deploy
#!/bin/bash
REPO_PATH=/home/someuser/test.git
DEPLOY_PATH=/var/www/
DEPLOY_BRANCH="master"

echo "REPO_PATH=$REPO_PATH"
echo "DEPLOY_PATH=$DEPLOY_PATH"

while read oldrev newrev refname
do
    branch=$(git rev-parse --symbolic --abbrev-ref $refname)
    if [ "$DEPLOY_BRANCH" == "$branch" ]; then
        TIMESTAMP=$(date +%Y%m%d%H%M%S)
        VERSION_PATH=$DEPLOY_PATH/sources/$TIMESTAMP
        LIVE_PATH=$DEPLOY_PATH/live
        echo "TIMESTAMP=$TIMESTAMP"
        echo "VERSION_PATH=$VERSION_PATH"
        echo "LIVE_PATH=$LIVE_PATH"

        mkdir -p $VERSION_PATH
        mkdir -p $VERSION_PATH/sources

        git --work-tree=$VERSION_PATH --git-dir=$REPO_PATH checkout -f $DEPLOY_BRANCH
        # Remove git files
        rm -rf $VERSION_PATH/.git
        rm -rf $LIVE_PATH
        ln -s $VERSION_PATH $LIVE_PATH


        # Remove old versions from sources/, keeping the two most recent ones plus
        # the live one (three versions in total). / (root), the sources/ folder
        # itself and the live target are excluded as a safety measure.
        rm -rf $(ls -1dt $(find -L $DEPLOY_PATH/sources/ -maxdepth 1 -type d ! -samefile / ! -samefile $DEPLOY_PATH/sources/ ! -samefile $LIVE_PATH -print) | tail -n+3)
    fi
done
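
In case you haven't used hooks before: git only runs this script if it is saved as hooks/post-receive inside the bare repository and is marked executable, so on the server (using the REPO_PATH from above):

$ vim /home/someuser/test.git/hooks/post-receive   # paste the script above (any editor works)
$ chmod +x /home/someuser/test.git/hooks/post-receive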

If you have any questions, let me know. Matheus


Pushing to your own remote git

As you may have noticed, I’m setting up a new server, and I wanted to push directly to my git repository (hosted on my own server), so I can release a new version with a simple git push myserver branch.

If you want to achieve this as well, connect to your remote server over SSH and execute:

  1. $ mkdir test.git
  2. $ cd test.git
  3. $ git --bare init

You will need the full path of this git folder to add it as a remote on your local machine; to find the full path, run pwd. Back on your local machine, add your remote server:

git remote add my_server ssh://user@ip/replace/with/pwd/test.git

After this you can use git push my_server branch to push to it.
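
A quick sanity check, assuming you are pushing the master branch:

$ git remote -v               # my_server should be listed with the ssh URL
$ git push my_server master   # pushes master; any server-side hooks will run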

Matheus


Install Nginx, PHP on Amazon Linux

I’m migrating my blog and a few other things I have running to the Amazon infrastructure. I needed an Amazon EC2 instance with PHP support that is able to connect to a MySQL database.

Steps:

  1. sudo yum update
  2. sudo yum install nginx
  3. sudo yum install php70 php70-fpm php70-mysqlnd
  4. Edit /etc/nginx/conf.d/virtual.conf
server {
    listen       3000;

    location / {
        root   /var/www/;
        index  index.php index.html index.htm;
    }

    location ~ \.php$ {
        root /var/www/;
        fastcgi_pass   unix:/var/run/php-fpm/php-fpm.sock;
        fastcgi_index  index.php;
        fastcgi_param  SCRIPT_FILENAME  $document_root$fastcgi_script_name;
        include        fastcgi_params;
    }
}
  5. Edit the following properties in /etc/php-fpm-7.0.d/www.conf
user = nginx
group = nginx

listen = /var/run/php-fpm/php-fpm.sock

listen.owner = nginx
listen.group = nginx
listen.mode = 0660
  6. Create a php file on /var/www/
<?php
phpinfo();
?>
  7. Access http://SERVER_IP:3000

The security group for your EC2 instance will need port 3000 open.
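
If you prefer the command line over the console, the AWS CLI can open the port on the security group (the group ID below is just a placeholder):

aws ec2 authorize-security-group-ingress \
    --group-id sg-0123456789abcdef0 \
    --protocol tcp --port 3000 --cidr 0.0.0.0/0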

If you want them to start automatically on boot:

sudo chkconfig nginx on
sudo chkconfig php-fpm on

If you want to restart these services:

sudo service nginx restart
sudo service php-fpm restart
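
To confirm nginx and php-fpm are talking to each other, you can hit the test file from the instance itself (assuming you named the phpinfo file info.php):

curl -s http://localhost:3000/info.php | grep "PHP Version"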

Matheus


Migrate old Wordpress to Heroku, Amazon RDS and S3.

After a few good years with my blog out of date, I decided to start writing again and to migrate it off its old server, which was running a really old stack. I decided to use Heroku for hosting, Amazon RDS as the database service and S3 as file storage (for uploaded files).

Steps:

  1. Disable all your Wordpress plugins
  2. Do a full backup (Wordpress, Database, Uploads, etc)
  3. Really, do a backup!
  4. Create a git repository
  5. Add Wordpress code to your git
    1. If you want to update your Wordpress, add the latest Wordpress version
      1. Don’t add your private configs (wp-config.php)!
      2. Don’t add uploads folder
      3. Add your plugins
      4. Add your theme
    2. If you want to keep your Wordpress version, add your current blog’s code
      1. Don’t add your private configs (wp-config.php)!
      2. Don’t add uploads folder
  6. Pay attention to your private files!!
  7. Update your wp-config
    1. All your private configs must use getenv; this function is responsible for fetching the values from environment variables.
    <?php
    define('AUTH_KEY',         getenv('WP_AUTH_KEY'));
    define('SECURE_AUTH_KEY',  getenv('WP_SECURE_AUTH_KEY'));
    define('LOGGED_IN_KEY',    getenv('WP_LOGGED_IN_KEY'));
    define('NONCE_KEY',        getenv('WP_NONCE_KEY'));
    
    define('AUTH_SALT',        getenv('WP_AUTH_SALT'));
    define('SECURE_AUTH_SALT', getenv('WP_SECURE_AUTH_SALT'));
    define('LOGGED_IN_SALT',   getenv('WP_LOGGED_IN_SALT'));
    define('NONCE_SALT',       getenv('WP_NONCE_SALT'));
    
    define('S3_UPLOADS_BUCKET', getenv('AWS_S3_BUCKET'));
    define('S3_UPLOADS_KEY', getenv('AWS_S3_KEY'));
    define('S3_UPLOADS_SECRET', getenv('AWS_S3_SECRET'));
    define('S3_UPLOADS_REGION', getenv('AWS_S3_REGION')); 
    
  8. Create a composer.json file to define requirements and package versions
    1. Example composer.json
    {
      "require" : {
          "php": ">=7.0.0"
      },
      "require-dev": {
      }
    }
    
  9. Execute composer update to generate the composer.lock file
  10. Update your .htaccess file to redirect your uploads to your S3 bucket
    1. Update the URL (on the 5th line) of the .htaccess to match your S3 region and bucket
    <IfModule mod_rewrite.c>
     RewriteEngine On
     RewriteBase /
     RewriteRule ^index\.php$ - [L]
     RewriteRule ^wp-content/uploads/(.*)$ https://s3-us-west-2.amazonaws.com/BUCKET/uploads/$1 [R=301,L]
     RewriteCond %{REQUEST_FILENAME} !-f
     RewriteCond %{REQUEST_FILENAME} !-d
     RewriteRule . /index.php [L]
     </IfModule>    
    
  11. Commit all these files (don’t add your secrets/keys to your git)
  12. Amazon setup (see the command sketch after this list)
    1. Create a RDS Database
    2. Import your backup into it
    3. Send your S3 files to S3
      1. Remember to change the permissions of your S3 files so anonymous visitors can access your uploaded files
  13. Heroku setup (also covered in the sketch below)
    1. Add your environment vars on Heroku with the same names you used in wp-config.php, and remember to use the RDS values for the database.
    2. Update your DNS for Heroku
    3. Send your code to Heroku using the repository you have created
  14. Access your website
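
As mentioned in steps 12 and 13, here is a rough sketch of the Amazon and Heroku parts as shell commands. The hostnames, bucket, user and database names are placeholders, and the Heroku variable names are the ones used by the wp-config.php snippet above; if your database credentials also come from getenv, set those the same way.

# 12.2 - import the backup into the new RDS database (you will be prompted for the password)
mysql -h YOUR-RDS-ENDPOINT.rds.amazonaws.com -u YOUR_DB_USER -p YOUR_DB_NAME < backup.sql

# 12.3 - upload the old wp-content/uploads folder to S3, readable by anonymous visitors
aws s3 sync wp-content/uploads s3://YOUR_BUCKET/uploads --acl public-read

# 13.1 - environment vars read by wp-config.php through getenv
heroku config:set WP_AUTH_KEY=... WP_SECURE_AUTH_KEY=... WP_LOGGED_IN_KEY=... WP_NONCE_KEY=...
heroku config:set WP_AUTH_SALT=... WP_SECURE_AUTH_SALT=... WP_LOGGED_IN_SALT=... WP_NONCE_SALT=...
heroku config:set AWS_S3_BUCKET=YOUR_BUCKET AWS_S3_KEY=... AWS_S3_SECRET=... AWS_S3_REGION=us-west-2

# 13.3 - deploy the repository you created (assumes a heroku git remote, e.g. from heroku git:remote -a your-app)
git push heroku master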

If you’re updating your Wordpress, there is a chance that something goes wrong or that some plugin stops working with the new Wordpress version, so don’t forget to check and update them.

Also, if you have any other questions or need more information on a specific step, let me know. I can try to help.

Matheus
