The Cloud Resume Challenge: How to Stand Out in 2026

Complete guide to the Cloud Resume Challenge in 2026. Step-by-step implementation on AWS, Azure, or GCP with modern additions that make hiring managers take notice.

The Cloud Resume Challenge, created by Forrest Brazeal in 2020, has become the de facto first portfolio project for aspiring cloud engineers. The premise is elegant: build a personal resume website hosted entirely on cloud infrastructure, with a visitor counter backed by a serverless API and database, all deployed through CI/CD pipelines and infrastructure as code.

Thousands of engineers have completed the challenge since its inception. In 2026, completing the basic challenge is no longer enough to differentiate yourself. Hiring managers have seen hundreds of identical S3-hosted static sites with a DynamoDB visitor counter. To stand out, you need to complete the core challenge and then extend it with modern cloud practices that demonstrate production-level thinking.

This guide walks through the full challenge implementation on AWS (with Azure and GCP alternatives noted), then covers five extensions that transform a beginner project into a portfolio piece that gets interviews.

Why the Cloud Resume Challenge Works

The challenge works because it covers the breadth of skills that cloud engineering roles require, compressed into a single project:

  • Frontend: HTML/CSS (or a static site generator) demonstrating web fundamentals
  • DNS and CDN: Custom domain with SSL/TLS termination via CloudFront
  • Serverless compute: Lambda function (or equivalent) for the visitor counter API
  • Database: DynamoDB (or equivalent) for storing and retrieving the count
  • API Gateway: HTTP endpoint connecting frontend to backend
  • Infrastructure as code: Terraform or CloudFormation defining all resources
  • CI/CD: GitHub Actions (or equivalent) deploying on every push
  • Testing: Unit tests for the Lambda function, integration tests for the API

No other single project touches this many services while remaining achievable in 2-4 weeks of part-time effort.

Step-by-Step Implementation on AWS

Step 1: Build the Resume Site

Write your resume in HTML and CSS. This is not a design competition — clean, readable formatting matters more than visual flair. Use semantic HTML: <header>, <section>, <article>, <footer>. Include structured data (JSON-LD) for your resume to demonstrate SEO awareness.

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Your Name — Cloud Engineer Resume</title>
    <link rel="stylesheet" href="style.css">
    <script type="application/ld+json">
    {
        "@context": "https://schema.org",
        "@type": "Person",
        "name": "Your Name",
        "jobTitle": "Cloud Engineer",
        "url": "https://resume.yourdomain.com"
    }
    </script>
</head>
<body>
    <header>
        <h1>Your Name</h1>
        <p>Cloud Engineer | AWS | Terraform | Kubernetes</p>
    </header>
    <main>
        <section id="experience">
            <h2>Experience</h2>
            <!-- Your experience entries -->
        </section>
        <section id="certifications">
            <h2>Certifications</h2>
            <!-- Your certs -->
        </section>
    </main>
    <footer>
        <p>Visitors: <span id="visitor-count">loading...</span></p>
    </footer>
    <script src="counter.js"></script>
</body>
</html>

Step 2: Private S3 Bucket for the Site

Create a private S3 bucket to hold the site files. Do not make the bucket public, and do not enable S3 static website hosting: CloudFront will fetch objects through an Origin Access Control (OAC) against the bucket's REST endpoint, so the bucket is never exposed directly.

resource "aws_s3_bucket" "resume" {
  bucket = "your-resume-site-${random_id.suffix.hex}"
}

resource "aws_s3_bucket_public_access_block" "resume" {
  bucket = aws_s3_bucket.resume.id

  block_public_acls       = true
  block_public_policy     = true
  ignore_public_acls      = true
  restrict_public_buckets = true
}

resource "aws_s3_bucket_website_configuration" "resume" {
  bucket = aws_s3_bucket.resume.id

  index_document {
    suffix = "index.html"
  }

  error_document {
    key = "error.html"
  }
}
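
CloudFront still needs permission to read objects from the private bucket. A minimal bucket-policy sketch, scoped to the distribution defined in Step 3 so nothing else can read through the OAC:

resource "aws_s3_bucket_policy" "resume" {
  bucket = aws_s3_bucket.resume.id

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Sid       = "AllowCloudFrontRead"
      Effect    = "Allow"
      Principal = { Service = "cloudfront.amazonaws.com" }
      Action    = "s3:GetObject"
      Resource  = "${aws_s3_bucket.resume.arn}/*"
      Condition = {
        StringEquals = {
          "AWS:SourceArn" = aws_cloudfront_distribution.resume.arn
        }
      }
    }]
  })
}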

Step 3: CloudFront Distribution with Custom Domain

Set up a CloudFront distribution pointing to the S3 bucket with an ACM certificate for HTTPS.

resource "aws_cloudfront_distribution" "resume" {
  origin {
    domain_name              = aws_s3_bucket.resume.bucket_regional_domain_name
    origin_id                = "S3-resume"
    origin_access_control_id = aws_cloudfront_origin_access_control.resume.id
  }

  enabled             = true
  default_root_object = "index.html"
  aliases             = ["resume.yourdomain.com"]

  default_cache_behavior {
    allowed_methods  = ["GET", "HEAD"]
    cached_methods   = ["GET", "HEAD"]
    target_origin_id = "S3-resume"

    forwarded_values {
      query_string = false
      cookies {
        forward = "none"
      }
    }

    viewer_protocol_policy = "redirect-to-https"
    min_ttl                = 0
    default_ttl            = 3600
    max_ttl                = 86400
  }

  viewer_certificate {
    acm_certificate_arn      = aws_acm_certificate.resume.arn
    ssl_support_method       = "sni-only"
    minimum_protocol_version = "TLSv1.2_2021"
  }

  restrictions {
    geo_restriction {
      restriction_type = "none"
    }
  }
}
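
The distribution above references aws_acm_certificate.resume without defining it. A minimal sketch follows; note that CloudFront only accepts certificates issued in us-east-1, regardless of where the rest of your stack lives.

# CloudFront requires ACM certificates from us-east-1.
provider "aws" {
  alias  = "us_east_1"
  region = "us-east-1"
}

resource "aws_acm_certificate" "resume" {
  provider          = aws.us_east_1
  domain_name       = "resume.yourdomain.com"
  validation_method = "DNS"

  lifecycle {
    create_before_destroy = true
  }
}

You will also need the DNS validation records (aws_route53_record) and an aws_acm_certificate_validation resource before the certificate becomes usable.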

Step 4: Visitor Counter Backend

Create a DynamoDB table, a Lambda function, and an API Gateway endpoint.

DynamoDB Table:

resource "aws_dynamodb_table" "visitor_count" {
  name         = "resume-visitor-count"
  billing_mode = "PAY_PER_REQUEST"
  hash_key     = "id"

  attribute {
    name = "id"
    type = "S"
  }
}

Lambda Function (Python):

import json
import os

import boto3

dynamodb = boto3.resource("dynamodb")
# Allow the table name to come from the environment, falling back to the
# name used in the Terraform above.
table = dynamodb.Table(os.environ.get("TABLE_NAME", "resume-visitor-count"))

def handler(event, context):
    # Atomic counter update: if_not_exists seeds the value on the first
    # visit, and the increment runs server-side, so concurrent requests
    # never lose a count.
    response = table.update_item(
        Key={"id": "visitor-count"},
        UpdateExpression="SET visit_count = if_not_exists(visit_count, :start) + :inc",
        ExpressionAttributeValues={":inc": 1, ":start": 0},
        ReturnValues="UPDATED_NEW",
    )

    # DynamoDB returns numbers as Decimal; convert before JSON serialization.
    count = int(response["Attributes"]["visit_count"])

    return {
        "statusCode": 200,
        "headers": {
            "Content-Type": "application/json",
            # Lock CORS to the resume's origin instead of a wildcard.
            "Access-Control-Allow-Origin": "https://resume.yourdomain.com",
        },
        "body": json.dumps({"count": count}),
    }
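
The API Gateway piece is described but not shown. A minimal HTTP API sketch, assuming the function above is registered in Terraform as aws_lambda_function.counter (that resource name is an assumption; define it to match your code):

resource "aws_apigatewayv2_api" "counter" {
  name          = "resume-counter-api"
  protocol_type = "HTTP"

  # HTTP APIs can handle the CORS preflight for you.
  cors_configuration {
    allow_origins = ["https://resume.yourdomain.com"]
    allow_methods = ["POST"]
  }
}

resource "aws_apigatewayv2_integration" "counter" {
  api_id                 = aws_apigatewayv2_api.counter.id
  integration_type       = "AWS_PROXY"
  integration_uri        = aws_lambda_function.counter.invoke_arn
  payload_format_version = "2.0"
}

resource "aws_apigatewayv2_route" "count" {
  api_id    = aws_apigatewayv2_api.counter.id
  route_key = "POST /count"
  target    = "integrations/${aws_apigatewayv2_integration.counter.id}"
}

resource "aws_apigatewayv2_stage" "default" {
  api_id      = aws_apigatewayv2_api.counter.id
  name        = "$default"
  auto_deploy = true
}

# Allow API Gateway to invoke the function.
resource "aws_lambda_permission" "apigw" {
  statement_id  = "AllowAPIGatewayInvoke"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.counter.function_name
  principal    = "apigateway.amazonaws.com"
  source_arn   = "${aws_apigatewayv2_api.counter.execution_arn}/*/*"
}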

Step 5: Frontend JavaScript

async function updateVisitorCount() {
    try {
        const response = await fetch(
            "https://api.yourdomain.com/count",
            { method: "POST" }
        );
        const data = await response.json();
        document.getElementById("visitor-count").textContent =
            data.count.toLocaleString();
    } catch (error) {
        document.getElementById("visitor-count").textContent = "—";
    }
}

updateVisitorCount();

Step 6: CI/CD Pipeline

name: Deploy Resume

on:
  push:
    branches: [main]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install pytest boto3 moto
      - run: pytest tests/

  deploy-infra:
    needs: test
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: hashicorp/setup-terraform@v3
      - run: terraform init
        working-directory: ./infra
      - run: terraform apply -auto-approve
        working-directory: ./infra
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          AWS_DEFAULT_REGION: us-east-1  # match your provider's region

  deploy-site:
    needs: deploy-infra
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: |
          aws s3 sync ./site s3://${{ vars.BUCKET_NAME }} --delete
          aws cloudfront create-invalidation \
            --distribution-id ${{ vars.CF_DISTRIBUTION_ID }} \
            --paths "/*"
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          AWS_DEFAULT_REGION: us-east-1  # the aws CLI requires a region
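
The workflow above uses long-lived access keys for simplicity. A stronger pattern in 2026 is GitHub's OIDC federation, which issues short-lived credentials per workflow run. A hedged Terraform sketch of the trust setup (the repository path and role name are placeholders; attach a least-privilege policy covering S3, CloudFront, and your Terraform state):

# Federate GitHub Actions into AWS so the pipeline stores no keys at all.
resource "aws_iam_openid_connect_provider" "github" {
  url            = "https://token.actions.githubusercontent.com"
  client_id_list = ["sts.amazonaws.com"]
  # Verify GitHub's current certificate thumbprint before applying.
  thumbprint_list = ["6938fd4d98bab03faadb97b34396831e3780aea1"]
}

resource "aws_iam_role" "deploy" {
  name = "resume-deploy"

  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect    = "Allow"
      Principal = { Federated = aws_iam_openid_connect_provider.github.arn }
      Action    = "sts:AssumeRoleWithWebIdentity"
      Condition = {
        StringEquals = {
          "token.actions.githubusercontent.com:aud" = "sts.amazonaws.com"
        }
        StringLike = {
          # Restrict the trust to your repository.
          "token.actions.githubusercontent.com:sub" = "repo:your-user/your-repo:*"
        }
      }
    }]
  })
}

In the workflow, you would then swap the static keys for the aws-actions/configure-aws-credentials action with role-to-assume pointing at this role.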

Step 7: Tests

Write unit tests for your Lambda function using moto, the AWS mocking library. Import the handler inside the mocked test function so its boto3 resource is created against moto rather than real AWS:

import json
import os

import boto3
from moto import mock_aws

# Give boto3 a region and dummy credentials so no real account is touched.
os.environ.setdefault("AWS_DEFAULT_REGION", "us-east-1")
os.environ.setdefault("AWS_ACCESS_KEY_ID", "testing")
os.environ.setdefault("AWS_SECRET_ACCESS_KEY", "testing")

@mock_aws
def test_visitor_count_increments():
    dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
    dynamodb.create_table(
        TableName="resume-visitor-count",
        KeySchema=[{"AttributeName": "id", "KeyType": "HASH"}],
        AttributeDefinitions=[{"AttributeName": "id", "AttributeType": "S"}],
        BillingMode="PAY_PER_REQUEST",
    )

    # Import here so the handler's module-level DynamoDB resource is
    # created while the mock is active.
    from lambda_function import handler

    result1 = handler({}, None)
    body1 = json.loads(result1["body"])
    assert body1["count"] == 1

    result2 = handler({}, None)
    body2 = json.loads(result2["body"])
    assert body2["count"] == 2

Azure and GCP Alternatives

Azure path: Azure Blob Storage (static site) -> Azure CDN -> Azure Functions -> Cosmos DB -> Azure DevOps Pipelines -> Bicep or Terraform.

GCP path: Cloud Storage (static site) -> Cloud CDN -> Cloud Functions -> Firestore -> Cloud Build -> Terraform.

The concepts are identical. Completing the challenge on a different cloud than what you typically use demonstrates adaptability.

Five Extensions That Make You Stand Out in 2026

Extension 1: Observability Dashboard

Add CloudWatch metrics, alarms, and a simple dashboard that tracks API latency, error rates, and visitor count trends. This shows you think about operational health, not just deployment.

Create a CloudWatch dashboard with widgets showing Lambda invocation count, duration percentiles, and DynamoDB consumed read/write capacity. Set an alarm that fires if the Lambda error rate exceeds 5% over a 5-minute window.
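
As a concrete starting point, here is a hedged Terraform sketch of that error-rate alarm. It assumes the visitor-counter Lambda is defined as aws_lambda_function.counter, which is a placeholder name:

# Alarm when more than 5% of invocations error over a 5-minute window.
resource "aws_cloudwatch_metric_alarm" "lambda_error_rate" {
  alarm_name          = "resume-counter-error-rate"
  comparison_operator = "GreaterThanThreshold"
  evaluation_periods  = 1
  threshold           = 5
  treat_missing_data  = "notBreaching"

  metric_query {
    id          = "error_rate"
    expression  = "100 * errors / invocations"
    label       = "Error rate (%)"
    return_data = true
  }

  metric_query {
    id = "errors"
    metric {
      namespace   = "AWS/Lambda"
      metric_name = "Errors"
      period      = 300
      stat        = "Sum"
      dimensions  = { FunctionName = aws_lambda_function.counter.function_name }
    }
  }

  metric_query {
    id = "invocations"
    metric {
      namespace   = "AWS/Lambda"
      metric_name = "Invocations"
      period      = 300
      stat        = "Sum"
      dimensions  = { FunctionName = aws_lambda_function.counter.function_name }
    }
  }
}

Wire the alarm to an SNS topic via alarm_actions if you want notifications.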

Extension 2: Infrastructure Cost Monitoring

Add an Infracost check to your CI/CD pipeline that estimates the monthly cost of your infrastructure on every pull request, and use Cost Explorer to track actual spend. Include a cost badge in your README showing the current monthly cost (it should be well under $1/month for this project).

This demonstrates FinOps awareness — a skill that immediately sets you apart from candidates who deploy without considering cost.

Extension 3: Security Hardening

Go beyond the basics:

  • Enable AWS Config rules to monitor S3 bucket configuration drift
  • Add WAF rules on CloudFront to block common attack patterns
  • Implement Content Security Policy headers (see the sketch below)
  • Run Checkov or tfsec on your Terraform and fix all findings
  • Add DNSSEC to your Route 53 hosted zone
  • Enable CloudTrail logging for all API calls

Document each security control and why it matters. Security-conscious candidates are rare and valuable.
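
For the Content Security Policy item, CloudFront response headers policies let you set security headers at the edge without touching the origin. A minimal sketch; the CSP value here is an assumption, so tighten it to match your actual assets:

resource "aws_cloudfront_response_headers_policy" "security" {
  name = "resume-security-headers"

  security_headers_config {
    content_security_policy {
      content_security_policy = "default-src 'self'; connect-src 'self' https://api.yourdomain.com"
      override                = true
    }

    strict_transport_security {
      access_control_max_age_sec = 63072000
      include_subdomains         = true
      preload                    = true
      override                   = true
    }

    content_type_options {
      override = true
    }

    frame_options {
      frame_option = "DENY"
      override     = true
    }
  }
}

Attach it by setting response_headers_policy_id in the distribution's default_cache_behavior.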

Extension 4: Multi-Region Deployment

Deploy the backend to two regions with API Gateway regional endpoints and Route 53 latency-based routing. This demonstrates understanding of high availability patterns that are central to cloud architecture interviews.
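
A sketch of the routing half, assuming a Route 53 hosted zone and a regional API Gateway custom domain already exist (resource names here are illustrative; you would repeat the record with a second set_identifier and region):

resource "aws_route53_record" "api_us_east_1" {
  zone_id        = aws_route53_zone.main.zone_id
  name           = "api.yourdomain.com"
  type           = "A"
  set_identifier = "us-east-1"

  latency_routing_policy {
    region = "us-east-1"
  }

  alias {
    name                   = aws_apigatewayv2_domain_name.us_east_1.domain_name_configuration[0].target_domain_name
    zone_id                = aws_apigatewayv2_domain_name.us_east_1.domain_name_configuration[0].hosted_zone_id
    evaluate_target_health = true
  }
}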

Extension 5: Blog Integration

Add a blog section to your resume site powered by markdown files rendered at build time (use a static site generator like Hugo or Eleventy). Write 2-3 technical blog posts about what you learned during the challenge. This shows communication skills and creates additional surface area for hiring managers to evaluate your thinking.

Common Mistakes to Avoid

Copying tutorials verbatim. Hiring managers check GitHub repositories. If your code is identical to a popular tutorial, it signals you followed instructions without understanding. Customize the implementation — different DynamoDB schema, additional features, unique frontend design.

Skipping tests. The testing requirement is where most candidates give up. Writing tests demonstrates software engineering discipline, not just infrastructure skills.

Not writing about it. A completed challenge without a blog post or README explaining your decisions is a missed opportunity. Write about the trade-offs you considered, problems you encountered, and what you would do differently.

Leaving the infrastructure running. Tear down infrastructure you are not actively using (terraform destroy makes teardown trivial, and redeploying later is a single terraform apply). Leaving a CloudFront distribution and Lambda function running indefinitely costs money and signals poor resource-management habits.

From Challenge to Career

The Cloud Resume Challenge is a starting point, not a destination. Once completed, use the same infrastructure patterns to build projects that solve real problems:

  • A URL shortener with analytics (demonstrates DynamoDB design patterns)
  • A serverless image processing pipeline (demonstrates event-driven architecture)
  • A cost anomaly detector (demonstrates CloudWatch, SNS, and data analysis)

Each project adds depth to your portfolio and gives you concrete talking points for interviews.

Citadel Cloud Management's cloud courses provide structured learning paths that build the foundational knowledge the Cloud Resume Challenge assumes — networking, IAM, serverless computing, and infrastructure as code. The Career Resources collection includes resume templates, portfolio project guides, and interview preparation materials specifically for cloud engineering roles.

For hands-on infrastructure code you can study and extend, the Cloud Toolkits collection provides production-grade Terraform modules, CI/CD pipeline templates, and serverless application patterns.

Ready to build a cloud portfolio that gets you hired? Start with Citadel's free cloud courses to build the skills, then tackle the Cloud Resume Challenge with confidence. Explore all resources for toolkits, career guides, and certification preparation.

Kehinde Ogunlowo

Senior Multi-Cloud DevSecOps Architect & AI Engineer

AWS, Azure, GCP Certified | Secret Clearance | FedRAMP, CMMC, HIPAA

Enterprise experience at Cigna Healthcare, Lockheed Martin, NantHealth, BP Refinery, and Patterson UTI.
