## Migrating to Hugo
I previously used Hexo, which works much like Hugo. You can copy the articles over directly; just check each article's front matter. Once that's done, run the `hugo` command to generate the static site files, which are written to the `public` folder by default.
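For reference, a typical Hugo front matter block looks like the following (the title, date, and tags here are placeholders, not from an actual post):

```yaml
---
title: "Migrating to Hugo"            # placeholder title
date: 2021-02-01T12:00:00+08:00       # placeholder date
tags: ["hugo", "aws"]
draft: false
---
```

Hexo front matter is close enough that most posts need little or no adjustment.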
## Configuring Amazon S3 Bucket
Open the AWS S3 console and create a new S3 Bucket.
Open the new bucket's details, go to "Properties", and enable static website hosting. Take note of the "Bucket website endpoint" for later.
Under "Permissions", turn off "Block all public access" and set the "Bucket policy" to the following:
```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::<bucket_name>/*"
        }
    ]
}
```
Upload all the files from the `public` folder obtained in the previous step to this bucket. Access the "Bucket website endpoint" and you should be able to see the webpage (the styling may not be correct yet).
## Configuring Amazon CloudFront CDN
Open the AWS CloudFront console and create a new Web distribution. First, find the "SSL certificate" section and click "Request or import a certificate with ACM". Apply for a certificate for your domain and then refresh the create distribution page.
Set the parameters according to the following instructions, leaving the rest as default:
- Origin Domain Name: Select the Bucket created earlier
- Viewer Protocol Policy: Redirect HTTP to HTTPS
- SSL Certificate: Select "Custom SSL Certificate" and choose the certificate you applied for
After creating the distribution, click its ID, go to "Origins and Origin Groups", and edit the origin: change the origin domain name to the "Bucket website endpoint" and save, so that S3's website features (index documents, redirects) work through CloudFront.
Now, after pointing your domain's DNS record at the distribution's domain name, you should be able to see the webpage correctly.
## Configuring GitHub Actions for Automated Deployment
A while ago I wrote a Deployer to automate deployment, and I didn't want to give up that convenience this time. But since I'm no longer deploying to my own server, the Deployer would be unnecessary, so I chose GitHub Actions instead.
Here is the Workflow file for GitHub Actions:
```yaml
name: Deploy to Amazon S3

on:
  push:
    branches:
      - main

jobs:
  deploy:
    runs-on: ubuntu-18.04
    steps:
      - uses: actions/checkout@v2
        with:
          submodules: true
          fetch-depth: 0

      - name: Setup Hugo
        uses: peaceiris/actions-hugo@v2
        with:
          hugo-version: '0.80.0'

      - name: Build
        run: hugo --minify

      - name: Deploy to S3
        run: hugo deploy --force --maxDeletes -1 --invalidateCDN
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
```
Note: Before using this, you need to add `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` to the project's Secrets.
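`hugo deploy` also needs a deployment target defined in the site configuration; a minimal sketch (the bucket name, region, and distribution ID below are placeholders for your own values):

```toml
[deployment]
[[deployment.targets]]
name = "aws"
# placeholders: replace with your bucket and region
URL = "s3://<bucket_name>?region=<aws_region>"
# used by the --invalidateCDN flag
cloudFrontDistributionID = "<distribution_id>"
```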
## Fixing Metadata for the index.xml File
By default, Hugo's deploy function sets the `Content-Type` of `xml` files to `application/rss+xml`, which can cause garbled characters when accessed. Changing it to `application/xml` or `text/xml` solves this issue.
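An alternative worth noting: Hugo's deployment matchers can set the `Content-Type` at upload time, which avoids patching the object afterwards. A sketch, assuming the `[deployment]` target is already configured:

```toml
[deployment]
[[deployment.matchers]]
# match the RSS feed and force a plain XML content type
pattern = "^index\\.xml$"
contentType = "text/xml"
gzip = true
```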
You can add the following content to the Workflow file of GitHub Actions so that you don't have to manually modify it every time:
```yaml
...
jobs:
  deploy:
    ...
    steps:
      ...
      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ${{ secrets.AWS_REGION }}

      - name: Fix Content-Type for index.xml
        run: |
          aws s3api copy-object --bucket "$AWS_BUCKET" --content-encoding "gzip" --content-type "text/xml" --copy-source "$AWS_BUCKET/index.xml" --key "index.xml" --metadata-directive "REPLACE"
        env:
          AWS_BUCKET: ${{ secrets.AWS_BUCKET }}
```

The credentials and region are exported by the `configure-aws-credentials` step, so the `aws` CLI step only needs `AWS_BUCKET` in its `env`.
Note: You also need to add `AWS_REGION` and `AWS_BUCKET` to the project's Secrets.
## Complete Workflow File
```yaml
name: Deploy to Amazon S3

on:
  push:
    branches:
      - main

jobs:
  deploy:
    runs-on: ubuntu-18.04
    steps:
      - uses: actions/checkout@v2
        with:
          submodules: true
          fetch-depth: 0

      - name: Setup Hugo
        uses: peaceiris/actions-hugo@v2
        with:
          hugo-version: '0.80.0'

      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ${{ secrets.AWS_REGION }}

      - name: Build
        run: hugo --minify

      - name: Deploy to S3
        run: hugo deploy --force --maxDeletes -1 --invalidateCDN
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}

      - name: Fix Content-Type for index.xml
        run: |
          aws s3api copy-object --bucket "$AWS_BUCKET" --content-encoding "gzip" --content-type "text/xml" --copy-source "$AWS_BUCKET/index.xml" --key "index.xml" --metadata-directive "REPLACE"
        env:
          AWS_BUCKET: ${{ secrets.AWS_BUCKET }}
```
Secrets that need to be configured:
- `AWS_ACCESS_KEY_ID`
- `AWS_SECRET_ACCESS_KEY`
- `AWS_REGION`
- `AWS_BUCKET`