I finally got around to creating my first deployment pipeline

About an 8-minute read

#automation #deployment #github

Deployment pipelines have always been something that sounded cool to build, but that I never seemed to get around to for my personal projects. For the most part, it was easy enough to manually FTP the files.

As I was refactoring this very site, a thought occurred to me. It would be really cool to create an automated deployment script. This site is fairly simple, so it would be a good starting point for learning deployment automation.

Ideas

I started brainstorming ways to accomplish my goal. At first, I wanted to implement a Git-based deployment. The idea being that the master branch is “prod”, so whatever I merge into master gets deployed.

After doing some Googling, I figured an easy way would be to set up a remote repository on my web server. Whenever I have changes I want to deploy, I’d push to that remote repo. This would work, but wasn’t exactly what I was looking for.

I then stumbled upon Git hooks and thought about setting something up with the post-merge hook: after a merge, check whether it was into master, then run a script that does something.

That train of thought led me to what I ultimately implemented. I just needed to make sure the files made their way to the server after a merge into master. So, SFTP. My repo is hosted on GitHub... so why not check out GitHub Actions?

So that’s what I did.

The action

Like any person learning a new thing, I started Googling. First order of business: getting the trigger working. I want this action to trigger not on just any PR merge, but specifically on merges into the master branch.

The hook

After some digging around, this is what I came up with:

on:
  pull_request:
    branches:
      - master
    types:
      - closed

Pretty simple, right?

We are listening for the pull_request webhook event, and then specifying that we only care about PRs that target the master branch and that have been closed.

Thus, it will trigger when a PR targeting master is closed.

The conditional

Now comes the fun part, which is figuring out how the hell to make the action do what I want it to do. Looking through the docs, I found a conditional that I definitely needed.

jobs:
  if_merged:
    if: github.event.pull_request.merged == true
    runs-on: ubuntu-latest

The if_merged job’s if check on github.event.pull_request.merged makes sure that the closed pull request was, in fact, merged, and not just closed without merging. I definitely didn’t think about that when I first wrote this, so it was a good catch. Docs being helpful, who knew?

Checkout

From what I gathered next, the first step is checking out the code that I want to deploy. Sweet! Time for the first step!

jobs:
  if_merged:
    if: github.event.pull_request.merged == true
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v3

Straightforward.

A step named “Checkout” that uses the official checkout action. By default, it checks out the repository at the ref that triggered the workflow, which is terrific. I’m not going to lie, I was excited to even make it this far. Learning new things is so much fun.
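As an aside, the checkout action also accepts a ref input if you’d rather not rely on that default. A minimal sketch of what that could look like; pinning to master here is purely my own illustration, not something this workflow needed:

jobs:
  if_merged:
    if: github.event.pull_request.merged == true
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v3
        with:
          # Hypothetical: explicitly check out master instead of relying on
          # the default ref of the triggering event.
          ref: master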

SFTP

Now the big boy. The SFTP part really scared the shit out of me. I had an irrational fear I was going to somehow delete everything on my web server. Spoiler, I only accidentally deleted some directories.

jobs:
  if_merged:
    if: github.event.pull_request.merged == true
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v3
      - name: Deploy Files
        uses: wlixcc/SFTP-Deploy-Action@v1.2.4
        with:
          username: ${{ secrets.FTP_USERNAME }}
          password: ${{ secrets.FTP_PASSWORD }}
          server: ${{ secrets.FTP_SERVER }}
          port: 22
          local_path: './local/path/*'
          remote_path: '/remote/path'
          sftp_only: false
          delete_remote_files: true

Okay, a lot more than the checkout step, but still not too bad. There are quite a few SFTP actions in the marketplace. I chose this one since it seemed to be the most popular, which hopefully means most stable (fingers crossed).

The step is named “Deploy Files”, and it uses wlixcc/SFTP-Deploy-Action. After that, gotta tell it what to do and where to go. The properties under with are what you would expect: username, password, server, and port are what’s needed to log into the web server.

You probably noticed the username, password, and server referencing something called “secrets”. This is GitHub’s way of keeping sensitive data out of the repository itself, and you can read about encrypted secrets in the GitHub Actions documentation.

Then there is local_path, which uses the star glob syntax to grab the contents of a specific directory from the checked-out branch (you could also point it at a single file if you wanted). remote_path is where everything lands on the web server, and the part that made me the most nervous.

The other two options, sftp_only and delete_remote_files, aren’t exactly self-explanatory, and I had to mess around with both of them. For my purposes, I set sftp_only to false and delete_remote_files to true, because I want the remote directory to be wiped clean every time this runs.

That’s my pipeline!

All together now?

on:
  pull_request:
    branches:
      - master
    types:
      - closed

jobs:
  if_merged:
    if: github.event.pull_request.merged == true
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v3
      - name: Deploy Files
        uses: wlixcc/SFTP-Deploy-Action@v1.2.4
        with:
          username: ${{ secrets.FTP_USERNAME }}
          password: ${{ secrets.FTP_PASSWORD }}
          server: ${{ secrets.FTP_SERVER }}
          port: 22
          local_path: './local/path/*'
          remote_path: '/remote/path'
          sftp_only: false
          delete_remote_files: true

And that’s how my first pipeline came to be.

Testing it was a bit stressful. First, I tested that the steps were working properly: if you change the hook to workflow_dispatch, you can trigger the job manually from the Actions tab. After that was working, I said YOLO and tried a merge into master. Surprisingly, it worked!
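For reference, a minimal sketch of what that manual test trigger can look like (the job name is just illustrative). One thing worth flagging: github.event.pull_request doesn’t exist on a manual run, so the merged check has to be relaxed or removed while testing this way, otherwise the job is simply skipped.

on:
  workflow_dispatch: # lets you start the workflow by hand from the Actions tab

jobs:
  deploy_test:
    # The usual merged check is dropped here; github.event.pull_request is
    # empty for a workflow_dispatch event, so keeping the if would skip the
    # job on every manual run.
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v3

The Deploy Files step from the full workflow slots in underneath unchanged.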

Thanks for reading, fellow pipeline dudes.