How I Made This Site - Hosting At 53 Cents A Month - Part 2

For this site, I have a local setup that allows me to add new features and publish content like this very blog post you are reading. In part 1 I showed how the site is set up in the cloud with AWS services like S3, ACM, CloudFront and Route 53. In this post I will show how I add new features to the site, deploy changes and publish content.

At a high level my local setup includes: a WordPress install (via Bedrock) backed by MySQL, Redis as a hand-off layer, a Rails app that renders the post markdown with Redcarpet, Rouge and Haml, and a bin/deploy bash script that publishes the rendered HTML to S3 with the AWS CLI.

To keep this post lean and to the point, let's just say for now that WordPress, the MySQL database and the use of Redis are rather idiosyncratic to my setup. In my opinion, these technologies are overkill, so we will focus on what I think is necessary for a setup like mine to succeed.

WordPress, MySQL and Redis

Without going into too much detail, let me get this out of the way first. I set up WordPress using Bedrock, which also requires a database, and I went with MySQL. In addition to WordPress and MySQL I installed Redis so that whenever someone saves a post in WordPress, the PHP process also saves the post data into Redis. I write markdown into the post in WordPress and click save, which triggers the dump into Redis as well as the save into MySQL.

Whenever the Rails process handles an HTTP request in its controller, the controller loads the post data from Redis and then prepares the view with the post data in an instance variable. In short, Redis acts as a layer between Rails and WordPress, giving me the best of both worlds: the robust CMS features of WordPress and the rapid, solid application development of Rails.
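I won't show my exact helper here, but a minimal version of redis_get looks roughly like this, assuming the WordPress side dumps every post into a single JSON array under a "posts" key and the redis gem is available:

# app/controllers/application_controller.rb
# Rough sketch of a redis_get helper -- assumes the posts live as one JSON
# array under the given key; the implementation details are illustrative.
require 'redis'
require 'json'

class ApplicationController < ActionController::Base
  private

  def redis_get(key)
    redis = Redis.new(url: ENV.fetch('REDIS_URL', 'redis://localhost:6379'))
    raw = redis.get(key)
    raw ? JSON.parse(raw) : []
  end
end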

Admittedly, the combination of Rails and WordPress is overkill. One could just as easily have Rails alone and build out a very small admin with a form for creating a post. Assuming a database is in place for Rails, all one has to do is save the post with valid markdown, and that's it.
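For instance, the Rails-only version could be as small as a Post model with a markdown column, roughly something like this (a hypothetical sketch, not what this site actually runs):

# db/migrate/20190804000000_create_posts.rb (hypothetical)
class CreatePosts < ActiveRecord::Migration[5.2]
  def change
    create_table :posts do |t|
      t.string :title, null: false
      t.text :content, null: false # raw markdown, rendered later by Redcarpet
      t.timestamps
    end
  end
end

# app/models/post.rb (hypothetical)
class Post < ApplicationRecord
  validates :title, :content, presence: true
end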

Redcarpet and Rouge

Redcarpet is a Ruby gem that parses markdown and turns it into valid HTML. However, as this GitHub issue states:

Redcarpet doesn’t come with syntax highlighting out of the box…

Rouge to the rescue. Just as demonstrated in the above GitHub issue, I create a class at app/models/custom_render.rb with the following code:

require 'redcarpet'
require 'rouge'
require 'rouge/plugins/redcarpet'

class CustomRender < Redcarpet::Render::HTML
  include Rouge::Plugins::Redcarpet
end

Then in my controller app/controllers/posts_controller.rb I instantiate Redcarpet::Markdown, injecting CustomRender and passing an options hash:

# app/controllers/posts_controller.rb

class PostsController < ApplicationController
  def show
    @parser = Redcarpet::Markdown.new(CustomRender, {
      fenced_code_blocks: true,
      footnotes: true,
      highlight: true,
      lax_spacing: true,
      no_intra_emphasis: true,
      quote: true,
      strikethrough: true,
      superscript: true,
      tables: true,
      underline: true
    })

    @post = redis_get('posts').find { |post| post['title'].gsub(/-/, ' ').strip.downcase == params[:id].downcase.gsub(/-/, ' ').strip }

    @content = @parser.render(@post['content'])
  end
end

With the parser constructed, I find the post by title and invoke Redcarpet::Markdown#render with the actual markdown from my post, @post['content']. I assign the return value of Redcarpet::Markdown#render to the instance variable @content, and that's all my view needs. Look at this super simple view:

-# app/views/posts/show.haml

%main
  .post_container
    = @content.html_safe

By the way, as you can see, I am also using Haml to power my Rails views. As far as code syntax highlighting is concerned, I went with Rouge's Solarized theme:

// app/assets/stylesheets/rouge.scss.erb
<%= Rouge::Themes::Base16::Solarized.mode(:light).render(scope: '.highlight') %>

.highlight {
  border-radius:4px;
  background: #f2f2f2;
  margin-bottom:1rem;
  padding:1rem;
}

The index SCSS file, app/assets/stylesheets/application.scss, loads all of my SCSS files in app/assets/stylesheets automatically, so there is no need to list them individually in application.scss.
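I won't paste my manifest here, but with Sprockets that automatic loading typically comes down to a require_tree directive, roughly:

// app/assets/stylesheets/application.scss (sketch -- the exact manifest may differ)
/*
 *= require_self
 *= require_tree .
 */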

Regarding other combinations of Rouge and Redcarpet, this SO post and this other blog post look promising. However, I would have to try them out to know for sure. I do think the Rouge setup is a bit cryptic, so I just invoked the Rouge styling as indicated in their README.md. Digging through the source code of Rouge, I did find the Solarized theme, which I thought colored my code in a clear way.

At this point we have a Rails app that takes markdown with fenced code blocks and renders it, code and all, very nicely. Now we need to look into deployment.

bin/deploy

The high level idea is this: I run my Rails process in production mode. Once it is running, I fire off a bash script, bin/deploy. This script makes HTTP requests to the local production Rails process, saves the HTML responses into public/pages and public/pages/posts, and then uses the AWS CLI to upload the contents of public/pages and public/pages/posts into the specified S3 bucket. My hosted site instantly gets updated with the latest and greatest code running locally. At that point, your code and everything you want your site to be is deployed.

Now, let's take a closer look at this deploy process. Here's my recipe, steps 0 through 7:

0. Create posts and pages, make code changes, etc. by running the app with $ bin/rails s, the local Rails server for development.

As described above, in order to add either features or blog content I have to run my app with $ bin/rails s.

1. Make sure the public/pages and public/pages/posts directories exist.

We will see shortly why these directories must exist: the deploy script deposits the rendered HTML files into public/pages and public/pages/posts, and S3 ends up holding exactly those contents.
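From the project root that is a one-liner; mkdir -p creates public/pages along the way and does nothing if the directories already exist:

# creates public/pages and public/pages/posts in one go
mkdir -p public/pages/posts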

2. If deploying CSS and JavaScript changes, execute $ bin/bundle exec rake assets:precompile.

The assets need to be precompiled so that the hosted site, in both production and QA, serves only one JS file and one CSS file. The above command takes my N SCSS and JS files and consolidates them into a single fingerprinted CSS file and a single fingerprinted JS file, and stores the compiled assets in public/assets.

3. If there is a new post or page, add its path to the script in bin/deploy.

Currently, my deploy script does not programmatically list out all the pages and posts of my site. Eventually I will get to this, but for now I type in the post or page that I want to deploy to my bucket. In my opinion, it's really not a big deal: when it's time to deploy a new post or page, it's usually just one harmless change to bin/deploy.

4. Execute $ ENVIRONMENT=${environment} bin/rails s -e production. Enter qa for environment to load the environment variables for robpjewell.com, or production to load the environment variables for robskrob.com. If you refresh your browser at http://localhost:3000/, you will notice the site has no styling or JS. Do not be alarmed; for some reason Puma in production does not serve my assets from public/assets.

Currently I have two buckets: www.robskrob.com and www.robpjewell.com. I use www.robskrob.com as my production bucket and www.robpjewell.com as my QA / beta bucket. Depending on which environment variables I want read, I pass ENVIRONMENT=production for the production variables in service of www.robskrob.com and ENVIRONMENT=qa for the beta variables in service of www.robpjewell.com. In config/initializers/dotenv.rb I have Ruby code that reads the environment variable passed (ENVIRONMENT=production or ENVIRONMENT=qa) and, based on the value, loads either .env.production, .env.qa or just the .env:

require 'dotenv'

if (ENV['ENVIRONMENT'] == 'production')
  Dotenv.load('.env.production', '.env')
elsif (ENV['ENVIRONMENT'] == 'qa')
  Dotenv.load('.env.qa', '.env')
else
  Dotenv.load
end

I apologize for this conditional branching. As stated here, I know not to do that :facepalm: but for now this very small conditional provides a powerful feature. Let's say I want to install Google Analytics, but I want my production and beta sites to report to different analytics properties; in other words, I do not want my test data to interfere with the analytics of my production site. With the above local environment setup I can force my Rails process to read environment variable data based on whatever environment context it thinks it is in. Depending on where I am deploying my code, www.robskrob.com or www.robpjewell.com, I can make sure that the correct environment values are read for each bucket.
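For instance, the two env files for the analytics example might look roughly like this (the GOOGLE_ANALYTICS_ID name and its values are just placeholders for illustration):

# .env.production -- read when ENVIRONMENT=production
GOOGLE_ANALYTICS_ID=UA-PRODUCTION-PLACEHOLDER

# .env.qa -- read when ENVIRONMENT=qa
GOOGLE_ANALYTICS_ID=UA-QA-PLACEHOLDER

The layout can then read ENV['GOOGLE_ANALYTICS_ID'] and each site reports to its own property.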

5. Run bin/deploy.

As of writing this post on August 4th 2019, my deploy script looks like this:

#!/bin/bash

rm /Users/robskrob/code/projects/personal-site/public/assets/.sprockets-manifest-*.json
rm -rf /Users/robskrob/code/projects/personal-site/public/assets/express
rm /Users/robskrob/code/projects/personal-site/public/assets/*

bundle exec rake assets:precompile

domain='robskrob'

declare -a pages=(
  "index"
  "about"
)

declare -a posts=(
  "culling-conditionals-2"
  "how-i-made-this-site---hosting-at-53-cents-a-month---part-1"
  "how-i-made-this-site---hosting-at-53-cents-a-month---part-2"
)

for page in "${pages[@]}"
do
  curl http://localhost:3000/$page > /Users/robskrob/code/projects/personal-site/public/pages/$page.html

  css_link=$(cat public/pages/$page.html | grep '/public/assets/application-.*.css' | head -n 1)
  js_tag=$(cat public/pages/$page.html | grep '/public/assets/application-.*.js' | head -n 1)

  new_css=$(ls public/assets | grep -m1 css)
  new_js=$(ls public/assets | grep -m1 js)

  sed -i -e "s#${css_link}#<link rel='stylesheet' media='all' href='/public/assets/${new_css}' data-turbolinks-track='reload' />#g" /Users/robskrob/code/projects/personal-site/public/pages/$page.html
  sed -i -e "s#${js_tag}#<script src='/public/assets/${new_js}' data-turbolinks-track='reload'></script>#g" /Users/robskrob/code/projects/personal-site/public/pages/$page.html

  rm "/Users/robskrob/code/projects/personal-site/public/pages/$page.html-e"
done

for post in "${posts[@]}"
do
  curl http://localhost:3000/posts/$post > /Users/robskrob/code/projects/personal-site/public/pages/posts/$post.html

  css_link=$(cat public/pages/posts/$post.html | grep '/public/assets/application-.*.css' | head -n 1)
  js_tag=$(cat public/pages/posts/$post.html | grep '/public/assets/application-.*.js' | head -n 1)

  new_css=$(ls public/assets | grep -m1 css)
  new_js=$(ls public/assets | grep -m1 js)

  sed -i -e "s#${css_link}#<link rel='stylesheet' media='all' href='/public/assets/${new_css}' data-turbolinks-track='reload' />#g" /Users/robskrob/code/projects/personal-site/public/pages/posts/$post.html
  sed -i -e "s#${js_tag}#<script src='/public/assets/${new_js}' data-turbolinks-track='reload'></script>#g" /Users/robskrob/code/projects/personal-site/public/pages/posts/$post.html

  rm "/Users/robskrob/code/projects/personal-site/public/pages/posts/$post.html-e"
done

aws s3 sync /Users/robskrob/code/projects/personal-site/public/pages s3://www.${domain}.com
aws s3 sync /Users/robskrob/code/projects/personal-site/public/assets s3://www.${domain}.com/public/assets

First the script clears out the previously compiled assets under public/assets. Then it precompiles the assets, reads which domain the assets and HTML files go to, reads which pages and posts it needs for its HTTP requests to the local server, loops through the pages and the posts, makes a curl request to each one and saves the response into its appropriate location in public/. Then the script uses sed to replace the CSS and JS links with the correct references from public/assets. For some reason Puma references JS and CSS links that exist nowhere locally, so I have to replace the paths Puma emits with the actual JS and CSS assets that exist in public/assets. Once all the HTML and asset files are created in public/, the script uses the AWS CLI to upload the local HTML files and compiled assets into the bucket they are destined for.

6. Verify that the changes are live at the raw URL of the S3 bucket.

This step tells me to go to the S3 bucket URL, http://www.robskrob.com.s3-website-us-east-1.amazonaws.com, which is the source of truth for the site; the latest changes show up there first, before CloudFront picks them up.
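A quick sanity check from the terminal is to look at the Last-Modified header the bucket returns for a page you just deployed, for example:

# HEAD request against the bucket's website endpoint; check Last-Modified
curl -sI http://www.robskrob.com.s3-website-us-east-1.amazonaws.com/index.html | grep -i last-modified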

7. Invalidate the CloudFront cache to propagate the changes in S3 around the world:

This is a critical step in making sure that the latest and greatest from www.robskrob.com reaches the world. The CloudFront cache must be invalidated so the changes in the S3 bucket hosted at http://www.robskrob.com.s3-website-us-east-1.amazonaws.com can be propagated around the world and consumed at www.robskrob.com. To invalidate the cache I open the CloudFront service in the AWS console, click on the ID of the distribution I need to update, click on the Invalidations tab, click Create Invalidation and then paste the following into the text area:

/
.
/index.html
/about.html
/posts/*
/public/assets/*
index.html
about.html

Then I click Invalidate. In roughly 10-15 minutes www.robskrob.com will have the latest changes from the bucket, http://www.robskrob.com.s3-website-us-east-1.amazonaws.com.
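If you would rather skip the console clicking, the same invalidation can be done with the AWS CLI; roughly something like this, where the distribution ID is a placeholder and /* invalidates every path in one go:

# Find the distribution ID first if you do not know it:
#   aws cloudfront list-distributions
aws cloudfront create-invalidation \
  --distribution-id EXXXXXXXXXXXXX \
  --paths "/*"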

Thank you for reading this post. As always, if something is unclear please contact me via the about section of this site. Eventually I will have comments on every post, but until I roll out that feature feel free to just email me.