# Self-Hosting a FastAPI Application on AWS with Terraform and GitHub Actions CD
This guide walks through setting up a self-hosted web application on AWS EC2 with automated deployments triggered by GitHub Actions.
## Architecture Overview
- Application: FastAPI app running in Docker
- Infrastructure: AWS EC2 with Elastic IP
- Reverse Proxy: Nginx (containerized)
- Container Registry: Docker Hub
- CI/CD: GitHub Actions → AWS SSM → EC2
- Infrastructure as Code: Terraform
```
┌─────────────────┐     ┌─────────────────┐     ┌─────────────────┐
│  Push to main   │────▶│ GitHub Actions  │────▶│   Docker Hub    │
│                 │     │  Build & Push   │     │ (image:latest)  │
└─────────────────┘     └────────┬────────┘     └─────────────────┘
                                 │
                                 ▼
                        ┌─────────────────┐
                        │ GitHub Actions  │
                        │ Deploy via SSM  │
                        └────────┬────────┘
                                 │
                                 ▼
                        ┌─────────────────┐
                        │     AWS SSM     │
                        └────────┬────────┘
                                 │
                                 ▼
┌──────────────────────────────────────────────────────┐
│                     EC2 Instance                     │
│   ┌──────────────────┐      ┌──────────────────┐     │
│   │ Nginx Container  │────▶ │  App Container   │     │
│   │ (ports 80/443)   │      │   (port 8000)    │     │
│   └──────────────────┘      └──────────────────┘     │
└──────────────────────────────────────────────────────┘
```
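The FastAPI app itself isn't shown in this guide; as a reference point, a minimal Dockerfile for it might look like the sketch below. The `app.main:app` module path and the use of `uvicorn` are assumptions, not taken from the original project.

```dockerfile
FROM python:3.12-slim

WORKDIR /code

# Install the project and its dependencies from pyproject.toml
COPY . .
RUN pip install --no-cache-dir .

# Matches the port the compose file exposes for the app container
EXPOSE 8000

# Assumed entrypoint: a FastAPI instance named "app" in app/main.py
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
```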
## Step 1: Terraform Infrastructure
The Terraform configuration creates:
### AWS Resources
- Security Group: SSH restricted to home IP, HTTP/HTTPS open to world
- EC2 Instance: Ubuntu 24.04 with Docker pre-installed via user data script
- Elastic IP: Static public IP for consistent access
- IAM Instance Profile: Allows EC2 to communicate with SSM
- IAM User: Scoped credentials for GitHub Actions to trigger deployments
### Key Terraform Resources
```hcl
resource "aws_instance" "app_server" {
  ami                    = data.aws_ami.ubuntu.id
  instance_type          = var.instance_type
  key_name               = aws_key_pair.deployer.key_name
  vpc_security_group_ids = [aws_security_group.web_sg.id]
  iam_instance_profile   = aws_iam_instance_profile.ec2_ssm_profile.name

  user_data = templatefile("${path.module}/scripts/setup.sh", {
    docker_image = var.docker_image
  })
}
```
The IAM policy for GitHub Actions is scoped to only allow:
- ssm:SendCommand on the specific EC2 instance
- ssm:GetCommandInvocation to check deployment status
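A sketch of how that scoped policy might look in Terraform (resource and user names here are illustrative, not from the original config). Note that `ssm:SendCommand` authorizes against both the target instance ARN and the SSM document ARN, while `ssm:GetCommandInvocation` does not support resource-level restrictions:

```hcl
# Illustrative sketch: deploy permissions for the GitHub Actions IAM user
resource "aws_iam_user_policy" "github_deploy" {
  name = "github-actions-deploy"
  user = aws_iam_user.github_actions.name

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        # SendCommand checks both the instance and the document being run
        Effect = "Allow"
        Action = "ssm:SendCommand"
        Resource = [
          aws_instance.app_server.arn,
          "arn:aws:ssm:*:*:document/AWS-RunShellScript",
        ]
      },
      {
        # GetCommandInvocation cannot be scoped to a resource
        Effect   = "Allow"
        Action   = "ssm:GetCommandInvocation"
        Resource = "*"
      }
    ]
  })
}
```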
## Step 2: EC2 Setup Script
The user data script bootstraps the instance with Docker and creates the compose stack:
```bash
#!/bin/bash
apt-get update && apt-get upgrade -y

# Install Docker
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | \
  gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg
echo "deb [arch=amd64 signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable" > /etc/apt/sources.list.d/docker.list
apt-get update && apt-get install -y docker-ce docker-ce-cli containerd.io docker-compose-plugin

# Create app directory and docker-compose.yaml
mkdir -p /opt/app
cat > /opt/app/docker-compose.yaml <<EOF
services:
  app:
    image: ${docker_image}
    container_name: app
    restart: unless-stopped
    expose:
      - "8000"

  nginx:
    image: nginx:alpine
    container_name: nginx
    restart: unless-stopped
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf:ro
    depends_on:
      - app
EOF

cd /opt/app && docker compose up -d
```
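The compose file mounts `./nginx.conf` into the Nginx container, so that file also needs to exist in `/opt/app` before the stack starts. The original config isn't shown in this guide; a minimal reverse-proxy sketch that forwards everything to the app container might look like:

```nginx
events {}

http {
  server {
    listen 80;

    location / {
      # "app" resolves via the compose network to the app container
      proxy_pass http://app:8000;
      proxy_set_header Host $host;
      proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
      proxy_set_header X-Forwarded-Proto $scheme;
    }
  }
}
```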
## Step 3: GitHub Actions Workflows
### Build and Push (CI)
Triggered on push to main when app code changes:
```yaml
name: Build and Push to Dockerhub

on:
  push:
    branches: [main]
    paths:
      - 'Dockerfile'
      - 'app/**'
      - 'content/**'
      - 'pyproject.toml'

jobs:
  build-and-push:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: docker/setup-buildx-action@v3
      - uses: docker/login-action@v3
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      - uses: docker/build-push-action@v6
        with:
          context: .
          push: true
          tags: |
            ${{ secrets.DOCKERHUB_USERNAME }}/smr-website:latest
            ${{ secrets.DOCKERHUB_USERNAME }}/smr-website:${{ github.sha }}
          cache-from: type=gha
          cache-to: type=gha,mode=max
```
### Deploy (CD)
Triggered after successful build, uses AWS SSM instead of SSH:
```yaml
name: Deploy to EC2

on:
  workflow_run:
    workflows: ["Build and Push to Dockerhub"]
    types: [completed]
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    if: ${{ github.event.workflow_run.conclusion == 'success' }}
    steps:
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ${{ secrets.AWS_REGION }}

      - name: Deploy via SSM
        run: |
          COMMAND_ID=$(aws ssm send-command \
            --instance-ids "${{ secrets.EC2_INSTANCE_ID }}" \
            --document-name "AWS-RunShellScript" \
            --parameters 'commands=["cd /opt/app && docker compose pull && docker compose up -d --remove-orphans && docker image prune -f"]' \
            --query "Command.CommandId" \
            --output text)

          aws ssm wait command-executed \
            --command-id "$COMMAND_ID" \
            --instance-id "${{ secrets.EC2_INSTANCE_ID }}"
```
## Why SSM Instead of SSH?
The traditional approach is to SSH from GitHub Actions into your server. That has two problems:

- Dynamic IPs: GitHub Actions runners have changing IPs, so you'd need to either:
  - Open SSH to the world (bad)
  - Dynamically update security groups (complex)
- Key management: you'd have to store an SSH private key in GitHub Secrets
SSM solves both issues:

- No inbound ports needed: the SSM agent connects outbound to AWS
- Uses IAM credentials scoped to specific actions
- Commands are logged in AWS CloudTrail
The only requirements are the SSM agent running on the EC2 instance (preinstalled on Ubuntu 24.04 AWS AMIs) and an IAM instance profile that grants SSM access.
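As a sketch, the instance profile only needs a role that EC2 can assume with AWS's managed `AmazonSSMManagedInstanceCore` policy attached. The role and profile names below are illustrative, though `aws_iam_instance_profile.ec2_ssm_profile` matches the name referenced by the instance resource earlier:

```hcl
# Illustrative sketch: role + profile that let the SSM agent register
resource "aws_iam_role" "ec2_ssm_role" {
  name = "ec2-ssm-role"

  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect    = "Allow"
      Action    = "sts:AssumeRole"
      Principal = { Service = "ec2.amazonaws.com" }
    }]
  })
}

# AWS-managed policy covering everything the SSM agent needs
resource "aws_iam_role_policy_attachment" "ssm_core" {
  role       = aws_iam_role.ec2_ssm_role.name
  policy_arn = "arn:aws:iam::aws:policy/AmazonSSMManagedInstanceCore"
}

resource "aws_iam_instance_profile" "ec2_ssm_profile" {
  name = "ec2-ssm-profile"
  role = aws_iam_role.ec2_ssm_role.name
}
```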
## GitHub Secrets Required
| Secret | Description |
|--------|-------------|
| DOCKERHUB_USERNAME | Docker Hub username |
| DOCKERHUB_TOKEN | Docker Hub access token |
| AWS_ACCESS_KEY_ID | From Terraform output |
| AWS_SECRET_ACCESS_KEY | From Terraform output |
| AWS_REGION | e.g., us-west-2 |
| EC2_INSTANCE_ID | e.g., i-0a3b7bb413291cbd5 |
## Deployment Flow
- Push code to the `main` branch
- GitHub Actions builds the Docker image
- The image is pushed to Docker Hub with `:latest` and `:<sha>` tags
- The deploy workflow triggers via `workflow_run`
- GitHub Actions sends the SSM command to EC2
- EC2 pulls the new image and restarts the containers

Note that `docker compose up -d` briefly recreates the app container, so deployments incur a few seconds of downtime rather than strictly zero; a rolling strategy would be needed for true zero-downtime.
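One reason to push a `:<sha>` tag alongside `:latest` is rollback: if a bad build ships, the compose file can be pinned to the SHA tag of a known-good build and the stack re-run. A self-contained sketch of the edit, shown against a throwaway copy of the compose file (on the server it lives at `/opt/app/docker-compose.yaml`; the image name and SHA are placeholders):

```shell
# Create a throwaway copy of the relevant compose fragment
cat > /tmp/docker-compose.yaml <<'EOF'
services:
  app:
    image: youruser/smr-website:latest
EOF

# Pin the app image to a previous build's Git SHA tag (placeholder value),
# then on the server you would re-run: docker compose up -d
PREV_SHA=abc1234
sed -i "s|smr-website:[^[:space:]]*|smr-website:$PREV_SHA|" /tmp/docker-compose.yaml

grep 'image:' /tmp/docker-compose.yaml  # now pins smr-website:abc1234
```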
## Cost Breakdown
- EC2 t3.small: ~$15/month
- Elastic IP: Free while attached to running instance
- Data transfer: Minimal for personal site
- Docker Hub: Free tier sufficient
Total: ~$15-20/month