<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[AlleDevOps]]></title><description><![CDATA[AlleDevOps]]></description><link>https://alledevops.com</link><generator>RSS for Node</generator><lastBuildDate>Fri, 17 Apr 2026 10:48:11 GMT</lastBuildDate><atom:link href="https://alledevops.com/rss.xml" rel="self" type="application/rss+xml"/><language><![CDATA[en]]></language><ttl>60</ttl><item><title><![CDATA[Building a Secure and Scalable Django Blog on AWS: The Ultimate Guide]]></title><description><![CDATA[TL;DR
This project deploys a Django-based blog application on AWS using various services such as EC2, RDS, S3, DynamoDB, CloudFront, and Route 53. The end result is a scalable and secure web application where users can upload pictures and videos on t...]]></description><link>https://alledevops.com/building-a-secure-and-scalable-django-blog-on-aws-the-ultimate-guide</link><guid isPermaLink="true">https://alledevops.com/building-a-secure-and-scalable-django-blog-on-aws-the-ultimate-guide</guid><category><![CDATA[Django]]></category><category><![CDATA[app-deployment-on-aws]]></category><category><![CDATA[scalability]]></category><category><![CDATA[cloudfront]]></category><category><![CDATA[AWS RDS]]></category><dc:creator><![CDATA[Mustafa Gönen]]></dc:creator><pubDate>Fri, 01 Dec 2023 17:31:59 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1699372777772/a7c8529a-9f3e-4e76-9d3e-e73a6329d904.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h2 id="heading-tldr">TL;DR</h2>
<p>This project deploys a Django-based blog application on AWS using various services such as EC2, RDS, S3, DynamoDB, CloudFront, and Route 53. The end result is a scalable and secure web application where users can upload pictures and videos on their blog pages, which are stored on an S3 bucket and recorded on a DynamoDB table.</p>
<ul>
<li><a target="_blank" href="https://github.com/alledevops/blog-page-app-django-on-aws">Project Github Repo</a></li>
</ul>
<h2 id="heading-introduction">Introduction</h2>
<p>In this blog post, we will walk through the steps to deploy a Django-based blog application on the AWS (Amazon Web Services) cloud infrastructure. This project encompasses a wide range of AWS services such as VPC (Virtual Private Cloud), EC2 (Elastic Compute Cloud), RDS (Relational Database Service), S3 (Simple Storage Service), DynamoDB, CloudFront, Certificate Manager, IAM (Identity and Access Management) and Route 53. The end result is a robust and scalable web application.</p>
<h2 id="heading-project-description">Project Description</h2>
<p>The Blog Page Application deploys a web app using the Django Framework on AWS Cloud Infrastructure. This infrastructure includes an Application Load Balancer with an Auto Scaling Group of EC2 Instances and RDS on a defined VPC. Additionally, CloudFront and Route 53 manage traffic securely via SSL/TLS. Users can upload pictures and videos to their blog page, which are stored on an S3 Bucket. The object list of the S3 Bucket, containing pictures and videos, is recorded on a DynamoDB table.</p>
<h2 id="heading-project-skeleton">Project Skeleton</h2>
<p>To successfully deploy the TechMust Blog Page Application on AWS infrastructure with the desired architecture, we will structure our project into several key components. This project skeleton will include:</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1699370900176/d0fd5367-1108-4c8f-841a-14b32f35d375.jpeg" alt class="image--center mx-auto" /></p>
<h3 id="heading-1-amazon-web-services-aws">1. Amazon Web Services (AWS)</h3>
<ul>
<li><p><strong>Amazon Virtual Private Cloud (VPC)</strong>: We will configure a VPC with specific characteristics to isolate our application.</p>
</li>
<li><p><strong>Amazon Elastic Compute Cloud (EC2) Instances</strong>: These instances will host our Django web application, and we'll utilize Launch Templates to streamline the setup process.</p>
</li>
<li><p><strong>Amazon Relational Database Service (RDS)</strong>: We'll set up an RDS instance to store user registration data using MySQL.</p>
</li>
<li><p><strong>Amazon Simple Storage Service (S3)</strong>: S3 will serve as our storage solution for user-uploaded pictures and videos, and we will define two S3 buckets for regular use and failover.</p>
</li>
<li><p><strong>AWS Certificate Manager</strong>: We will use this service to create SSL certificates for secure connections, both on Application Load Balancer (ALB) and Amazon CloudFront.</p>
</li>
<li><p><strong>Amazon CloudFront</strong>: Configured as a cache server, CloudFront will efficiently manage content delivery from ALB.</p>
</li>
<li><p><strong>Amazon Route 53</strong>: It will be responsible for secure and reliable routing of traffic, allowing us to publish the website and ensuring failover in case of issues.</p>
</li>
<li><p><strong>Amazon DynamoDB</strong>: This NoSQL database will store an object list of S3 Bucket content, ensuring efficient data retrieval.</p>
</li>
<li><p><strong>AWS Lambda</strong>: A Python 3.8 Lambda function will be implemented to write objects from S3 to the DynamoDB table.</p>
</li>
<li><p><strong>AWS Identity and Access Management (IAM)</strong>: We'll define IAM roles and policies to grant necessary permissions to EC2 instances, Lambda function, and other resources.</p>
</li>
</ul>
<h3 id="heading-2-configuration-components">2. Configuration Components</h3>
<ul>
<li><p><strong>Security Groups</strong>: We will create and configure security groups for our ALB, EC2 instances, and RDS, ensuring secure traffic flow.</p>
</li>
<li><p><strong>NAT Instance or Bastion Host</strong>: Depending on your choice, we will set up the necessary components for secure access to private resources.</p>
</li>
</ul>
<h3 id="heading-3-project-github-repository">3. Project GitHub Repository</h3>
<p>We will set up a project repository on GitHub to store the application code, infrastructure configurations, and other project-related files.</p>
<h3 id="heading-4-developer-notes">4. Developer Notes</h3>
<p>We will follow the notes provided by the developer team to prepare the Django environment on EC2 instances, deploy the application, and configure RDS settings.</p>
<h3 id="heading-5-requirementstxt">5. Requirements.txt</h3>
<p>This file will include the required Python packages and dependencies for the Django application.</p>
<h2 id="heading-setup">Setup</h2>
<h3 id="heading-step-1-create-dedicated-vpc-and-whole-components">Step 1: Create dedicated VPC and whole components</h3>
<ul>
<li><strong>VPC</strong></li>
</ul>
<pre><code class="lang-plaintext">- Create VPC 
  .create a VPC named "aws_capstone-VPC" 
  .CIDR block is "90.90.0.0/16" 
  .no IPv6 CIDR block 
  .tenancy: default 

- Select "aws_capstone-VPC", 
  click Actions and
  enable DNS hostnames for "aws_capstone-VPC"
</code></pre>
<ul>
<li><strong>Subnets</strong></li>
</ul>
<pre><code class="lang-plaintext">##Create Subnets 
- Create a public subnet 
  .named "aws_capstone-public-subnet-1A" 
  .under the vpc aws_capstone-VPC 
  .in AZ "us-east-1a" with 90.90.10.0/24 

- Create a private subnet 
  .named "aws_capstone-private-subnet-1A" 
  .under the vpc "aws_capstone-VPC" 
  .in AZ us-east-1a with 90.90.11.0/24 

- Create a public subnet 
  .named aws_capstone-public-subnet-1B 
  .under the vpc aws_capstone-VPC 
  .in AZ us-east-1b with 90.90.20.0/24 

- Create a private subnet 
  .named aws_capstone-private-subnet-1B 
  .under the vpc aws_capstone-VPC 
  .in AZ us-east-1b with 90.90.21.0/24

## Set auto-assign IP up for public subnets 
   - Select each public subnet, 
   - click "Modify auto-assign IP settings" and 
   - select "Enable auto-assign public IPv4 address"
</code></pre>
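<p>Before clicking through the console, it can help to sanity-check the CIDR plan above; a minimal sketch using Python's <code>ipaddress</code> module (subnet names as used in this guide):</p>
<pre><code class="lang-python">import ipaddress

# CIDR plan from the steps above
vpc = ipaddress.ip_network("90.90.0.0/16")
subnets = {
    "aws_capstone-public-subnet-1A":  ipaddress.ip_network("90.90.10.0/24"),
    "aws_capstone-private-subnet-1A": ipaddress.ip_network("90.90.11.0/24"),
    "aws_capstone-public-subnet-1B":  ipaddress.ip_network("90.90.20.0/24"),
    "aws_capstone-private-subnet-1B": ipaddress.ip_network("90.90.21.0/24"),
}

# Every subnet must sit inside the VPC CIDR block
for name, net in subnets.items():
    assert net.subnet_of(vpc), name

# No two subnets may overlap
nets = list(subnets.values())
for i, a in enumerate(nets):
    for b in nets[i + 1:]:
        assert not a.overlaps(b), (a, b)

print("CIDR plan is consistent")
</code></pre>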
<ul>
<li><strong>Internet Gateway</strong></li>
</ul>
<pre><code class="lang-plaintext">- Click Internet gateway section on left hand side 
  .create an internet gateway named "aws_capstone-IGW" and,
  .click create

- ATTACH the internet gateway "aws_capstone-IGW" to the newly created VPC "aws_capstone-VPC" 
  .Go to Internet Gateways tab and select newly created IGW and, 
  .click action ---&gt; Attach to VPC ---&gt; Select "aws_capstone-VPC" VPC
</code></pre>
<ul>
<li><strong>Route Table</strong></li>
</ul>
<pre><code class="lang-plaintext">- Go to Route Tables on the left-hand side 
  .We already have one route table, the main route table 
  .Rename it "aws_capstone-public-RT" 

- Create a route table and 
  .name it "aws_capstone-private-RT" 

- Add a rule to "aws_capstone-public-RT" 
  .with destination 0.0.0.0/0 (any network, any host) 
  .targeting the internet gateway "aws_capstone-IGW" 
  .so that it allows access to the internet 

- Select the private route table, 
  .go to the subnet associations subsection and 
  .add the private subnets to this route table. 
  .Do the same for the public route table and the public subnets
</code></pre>
<ul>
<li><strong>Endpoint</strong></li>
</ul>
<pre><code class="lang-plaintext">- Go to the endpoint section on the left hand menu 

- select endpoint and click create endpoint 

- service name : "com.amazonaws.us-east-1.s3" 

- VPC : "aws_capstone-VPC" 

- Route Table : private route tables 

- Policy : Full Access 

- click Create
</code></pre>
<h3 id="heading-step-2-create-security-groups-alb-ec2-rds-nat">Step 2: Create Security Groups (ALB, EC2 , RDS, NAT)</h3>
<pre><code class="lang-plaintext">1. ALB Security Group
Name            : aws_capstone_ALB_Sec_Group
Description     : Allows HTTP and HTTPS traffic from anywhere 
VPC             : aws_capstone-VPC
Inbound Rules
.HTTP (80)    ----&gt; anywhere
.HTTPS (443)  ----&gt; anywhere

2. EC2 Security Group
Name            : aws_capstone_EC2_Sec_Group
Description     : Allows HTTP and HTTPS traffic only from aws_capstone_ALB_Sec_Group; SSH is allowed from anywhere
VPC             : aws_capstone-VPC
Inbound Rules
.HTTP (80)    ----&gt; aws_capstone_ALB_Sec_Group
.HTTPS (443)  ----&gt; aws_capstone_ALB_Sec_Group
.SSH (22)     ----&gt; anywhere

3. RDS Security Group
Name            : aws_capstone_RDS_Sec_Group
Description     : Allows MySQL/Aurora traffic only from aws_capstone_EC2_Sec_Group 
VPC             : aws_capstone-VPC
Inbound Rules
.MYSQL/Aurora (3306)  ----&gt; aws_capstone_EC2_Sec_Group

4. NAT Instance Security Group
Name            : aws_capstone_NAT_Sec_Group
Description     : Allows HTTP, HTTPS and SSH traffic from anywhere 
VPC             : aws_capstone-VPC
Inbound Rules
.HTTP (80)    ----&gt; anywhere
.HTTPS (443)  ----&gt; anywhere
.SSH (22)     ----&gt; anywhere
</code></pre>
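<p>The intended traffic flow is a chain: anywhere reaches the ALB, only the ALB reaches the instances over HTTP/HTTPS, and only the instances reach RDS. A small Python sketch modeling these rules (names as above; purely illustrative, not an AWS API call):</p>
<pre><code class="lang-python"># Security-group chain from the table above, modeled as a rules dict:
# each target SG maps a port to the set of allowed sources
rules = {
    "aws_capstone_ALB_Sec_Group": {80: {"anywhere"}, 443: {"anywhere"}},
    "aws_capstone_EC2_Sec_Group": {80: {"aws_capstone_ALB_Sec_Group"},
                                   443: {"aws_capstone_ALB_Sec_Group"},
                                   22: {"anywhere"}},
    "aws_capstone_RDS_Sec_Group": {3306: {"aws_capstone_EC2_Sec_Group"}},
}

def allowed(source, target_sg, port):
    # True if the target SG admits traffic on this port from this source
    sources = rules.get(target_sg, {}).get(port, set())
    return "anywhere" in sources or source in sources

# The database is reachable from the app instances, but not from the internet
print(allowed("aws_capstone_EC2_Sec_Group", "aws_capstone_RDS_Sec_Group", 3306))  # True
print(allowed("anywhere", "aws_capstone_RDS_Sec_Group", 3306))                    # False
</code></pre>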
<h3 id="heading-step-3-create-rds">Step 3: Create RDS</h3>
<ol>
<li><p>First, we create a subnet group for our custom VPC. Click <code>subnet Groups</code> on the left-hand menu and click <code>create DB Subnet Group</code></p>
<pre><code class="lang-plaintext"> Name               : aws_capstone_RDS_Subnet_Group
 Description        : aws capstone RDS Subnet Group
 VPC                : aws_capstone_VPC
 Add Subnets
 Availability Zones : Select 2 AZ in aws_capstone_VPC
 Subnets            : Select 2 Private Subnets in these subnets
</code></pre>
</li>
<li><p>Go to the RDS console and click <code>create database</code> button</p>
<pre><code class="lang-plaintext"> Choose a database creation method : Standard Create
 Engine Options  : MySQL
 Version         : 8.0.20
 Templates       : Free Tier
 Settings        : 
     - DB instance identifier : aws-capstone-RDS
     - Master username        : admin
     - Password               : TechMust1234 
 DB Instance Class            : Burstable classes (includes t classes) ---&gt; db.t2.micro
 Storage                      : 20 GB, enable autoscaling (up to 40 GB)
 Connectivity:
     VPC                      : aws_capstone_VPC
     Subnet Group             : aws_capstone_RDS_Subnet_Group
     Public Access            : No 
     VPC Security Groups      : Choose existing ---&gt; aws_capstone_RDS_Sec_Group
     Availability Zone        : No preference
     Additional Configuration : Database port ---&gt; 3306
 Database authentication ---&gt; Password authentication
 Additional Configuration:
     - Initial Database Name  : database1
     - Backup ---&gt; Enable automatic backups
     - Backup retention period ---&gt; 7 days
     - Backup window ---&gt; Select 03:00 (AM), duration 1 hour
     - Maintenance window ---&gt; Select 04:00 (AM), duration 1 hour
 create instance
</code></pre>
</li>
</ol>
<h3 id="heading-step-4-create-two-s3-buckets-and-set-one-of-these-as-a-static-website">Step 4: Create two S3 Buckets and set one of these as a static website</h3>
<p>Go to the S3 console and create two buckets.</p>
<ol>
<li><p>Blog Website's S3 Bucket</p>
<pre><code class="lang-plaintext"> Bucket Name             : awscapstone&lt;YOUR NAME&gt;blog
 Region                  : N.Virginia
 Block all public access : Unchecked

 #Keep other settings as they are
 #create bucket
</code></pre>
</li>
<li><p>S3 Bucket for a failover scenario</p>
<pre><code class="lang-plaintext"> - Click Create Bucket

 Bucket Name : www.&lt;YOUR DNS NAME&gt;
 Region      : N.Virginia
 Block all public access : Unchecked
 #Keep other settings as they are
 #create bucket

 - Select the created www.&lt;YOUR DNS NAME&gt; bucket 
   ---&gt; Properties 
   ---&gt; Static website hosting

 Static website hosting : Enable
 Hosting Type           : Host a static website
 Index document         : index.html
 #save changes

 - Select the www.&lt;YOUR DNS NAME&gt; bucket 
   ---&gt; select Upload and upload the index.html and sorry.jpg files from the given folder
   ---&gt; Permissions ---&gt; Grant public-read access 
   ---&gt; Check the warning message
</code></pre>
</li>
</ol>
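<p>Once static website hosting is enabled, the failover bucket is served from the region's S3 website endpoint rather than the bucket's REST endpoint. A small helper (hypothetical, using the dash-style endpoint format that us-east-1 uses) shows the kind of address Route 53 will later point at:</p>
<pre><code class="lang-python">def s3_website_endpoint(bucket, region="us-east-1"):
    # S3 static-website endpoint; us-east-1 uses the dash-style
    # "s3-website-REGION" host format
    return "http://{0}.s3-website-{1}.amazonaws.com".format(bucket, region)

# The failover bucket is named after the site's DNS name
print(s3_website_endpoint("www.example.com"))
</code></pre>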
<h3 id="heading-step-5-copy-files-downloaded-or-cloned-from-alledevopsblog-page-app-django-on-aws-repo-on-github">Step 5: Copy files downloaded or cloned from <code>alledevops/blog-page-app-django-on-aws</code> repo on Github</h3>
<h3 id="heading-step-6-prepare-your-github-repository">Step 6: Prepare your Github repository</h3>
<p>Create a private project repository on your GitHub account and clone it to your local machine. Copy all files and folders downloaded from the <code>alledevops/blog-page-app-django-on-aws</code> repo into this folder, then commit and push them to <code>your_private_Repo</code> on GitHub.</p>
<h3 id="heading-step-7-prepare-the-userdata-to-be-utilized-in-the-launch-template">Step 7: Prepare the 'userdata' to be utilized in the Launch Template</h3>
<pre><code class="lang-bash"><span class="hljs-meta">#!/bin/bash</span>
<span class="hljs-comment"># Update the package lists for upgrades and new package installation</span>
apt-get update -y
<span class="hljs-comment"># Install git to clone the repository</span>
apt-get install git -y
<span class="hljs-comment"># Install Python 3.8</span>
apt-get install python3.8 -y
<span class="hljs-comment"># Change directory to the home directory</span>
<span class="hljs-built_in">cd</span> /home/ubuntu/
<span class="hljs-comment"># Set the access token for private repository</span>
TOKEN=<span class="hljs-string">"XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"</span>
<span class="hljs-comment"># Clone the private repository using the access token</span>
git <span class="hljs-built_in">clone</span> https://<span class="hljs-variable">$TOKEN</span>@&lt;YOUR PRIVATE REPO URL&gt;
<span class="hljs-comment"># Change directory to the cloned repository</span>
<span class="hljs-built_in">cd</span> /home/ubuntu/&lt;YOUR PRIVATE REPO NAME&gt;
<span class="hljs-comment"># Install pip for Python package installation</span>
apt-get install python3-pip -y
<span class="hljs-comment"># Install Python development files and MySQL client development files</span>
apt-get install python3.8-dev default-libmysqlclient-dev -y
<span class="hljs-comment"># Install libjpeg development files</span>
apt-get install libjpeg-dev -y
<span class="hljs-comment"># Install Python packages and dependencies from requirements.txt</span>
pip3 install -r requirements.txt
<span class="hljs-comment"># Change directory back to the repository</span>
<span class="hljs-built_in">cd</span> /home/ubuntu/&lt;YOUR PRIVATE REPO NAME&gt;
<span class="hljs-comment"># Collect static files for deployment</span>
python3 manage.py collectstatic --noinput
<span class="hljs-comment"># Create database migrations based on the models</span>
python3 manage.py makemigrations
<span class="hljs-comment"># Apply the database migrations</span>
python3 manage.py migrate
<span class="hljs-comment"># Run the Django application on port 80</span>
python3 manage.py runserver 0.0.0.0:80
</code></pre>
<h3 id="heading-step-8-configure-rds-and-s3-in-settings-file-and-push-to-github">Step 8: Configure RDS and S3 in Settings File and Push to GitHub</h3>
<p>Write the RDS database endpoint and the S3 bucket name into the settings file provided by the developer team, then push your application to your private repo on GitHub. Please follow and apply the instructions below:</p>
<ol>
<li><p>Update the AWS_STORAGE_BUCKET_NAME and AWS_S3_REGION_NAME variables</p>
<ul>
<li><p>Open the "/src/cblog/settings.py" file in your Django project.</p>
</li>
<li><p>Add the following lines to the file:</p>
<pre><code class="lang-plaintext">  AWS_STORAGE_BUCKET_NAME = 'awscapstone&lt;YOUR NAME&gt;blog'
  AWS_S3_REGION_NAME = 'your_region_name'
</code></pre>
</li>
</ul>
</li>
<li><p>Update the Database Connection Variables</p>
<ul>
<li><p>Open the "/src/cblog/settings.py" file in your Django project.</p>
</li>
<li><p>Add the following lines to the file, replacing the placeholders with your RDS database information:</p>
<pre><code class="lang-plaintext">  NAME = 'database1'
  HOST = 'your_database_endpoint'
  PORT = '3306'
</code></pre>
</li>
</ul>
</li>
<li><p>Configure the PASSWORD Variable</p>
<ul>
<li><p>Create a new file named ".env" in the "/src/" directory of your Django project.</p>
</li>
<li><p>Add the following line to the ".env" file, replacing 'your_database_password' with your actual RDS database password:</p>
<pre><code class="lang-plaintext">  PASSWORD=your_database_password
</code></pre>
</li>
</ul>
</li>
</ol>
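<p>Put together, the fragment in <code>settings.py</code> ends up looking roughly like the sketch below. This is only a sketch: the bucket name and RDS endpoint are placeholders, the surrounding settings come from the developer team's file, and the <code>.env</code> lookup is simulated here with an environment variable:</p>
<pre><code class="lang-python">import os

# Stand-in for the .env file (in the real project, PASSWORD is read
# from /src/.env)
os.environ.setdefault("PASSWORD", "your_database_password")

# S3 storage settings (placeholder bucket name)
AWS_STORAGE_BUCKET_NAME = "awscapstone-yourname-blog"
AWS_S3_REGION_NAME = "us-east-1"

# Database settings pointing at the RDS instance created in Step 3
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "database1",
        "HOST": "your_database_endpoint",  # the RDS endpoint
        "PORT": "3306",
        "USER": "admin",
        "PASSWORD": os.environ["PASSWORD"],
    }
}

print(DATABASES["default"]["NAME"], DATABASES["default"]["PORT"])
</code></pre>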
<p>After completing these steps, please <em><mark>check if this userdata is working or not </mark></em> by creating a new instance in a public subnet.</p>
<h3 id="heading-step-9-create-nat-instance-in-public-subnet">Step 9: Create NAT Instance in Public Subnet</h3>
<p>To launch the NAT instance, go to the EC2 console and click the Launch instance button.</p>
<ul>
<li><p>Follow the instructions in <a target="_blank" href="https://docs.aws.amazon.com/vpc/latest/userguide/VPC_NAT_Instance.html">AWS Documentation</a> and create <code>aws_capstone_nat_ami</code></p>
</li>
<li><p>Then, continue with the below configuration of NAT instance.</p>
</li>
</ul>
<pre><code class="lang-plaintext">
- Select the NAT instance AMI `aws_capstone_nat_ami` from the My AMIs section 
Instance Type : t2.micro
Configure Instance Details  
    - Network : aws_capstone_VPC
    - Subnet  : aws_capstone-public-subnet-1A (select one of your public subnets)
    - Keep other settings as they are
Storage ---&gt; Keep it as is
Tags: 
- Key    : Name     
- Value  : AWS Capstone NAT Instance

Configure Security Group
- Select an existing security group: aws_capstone_NAT_Sec_Group
- Review and select your own .pem key
- Click create
</code></pre>
<p><strong><mark>!!!IMPORTANT!!!</mark></strong></p>
<ul>
<li><p>select the newly created NAT instance and disable the <em>source/destination check</em> (Actions, Networking, Change source/destination check, Stop)</p>
</li>
<li><p>go to the private route table and add a rule</p>
<pre><code class="lang-plaintext">  Destination : 0.0.0.0/0
  Target      : instance ---&gt; Select the NAT Instance
  #Save
</code></pre>
</li>
</ul>
<h3 id="heading-step-10-create-a-launch-template-and-iam-role-for-it">Step 10: Create a Launch Template and IAM role for it</h3>
<ul>
<li>Go to the IAM console, click Roles on the left-hand menu, then click Create role.</li>
</ul>
<pre><code class="lang-plaintext">Trusted entity  : EC2 ---&gt; click Next: Permissions
Policy          : AmazonS3FullAccess policy
Tags            : No tags
Role Name       : aws_capstone_EC2_S3_Full_Access
Description     : S3 Full Access role for EC2
</code></pre>
<ul>
<li><p>To create a Launch Template, go to the EC2 console and select <code>Launch Templates</code> on the left-hand menu. Click the Create launch template button.</p>
<pre><code class="lang-plaintext">  Launch template name                : aws_capstone_launch_template
  Template version description        : Blog Web Page version 1
  Amazon machine image (AMI)          : Ubuntu 18.04
  Instance Type                       : t2.micro
  Key Pair                            : mykey.pem
  Network Platform                    : VPC
  Security Groups                     : aws_capstone_EC2_sec_group
  Storage (Volumes)                   : keep it as is
  Resource tags                       : Key: Name   Value: aws_capstone_web_server
  Advance Details:
      - IAM instance profile          : aws_capstone_EC2_S3_Full_Access
      - Termination protection        : Enable
      - User Data
  #!/bin/bash
  apt-get update -y
  apt-get install git -y
  apt-get install python3.8 -y
  cd /home/ubuntu/
  TOKEN="XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"
  git clone https://$TOKEN@&lt;YOUR PRIVATE REPO URL&gt;
  cd /home/ubuntu/&lt;YOUR PRIVATE REPO NAME&gt;
  apt-get install python3-pip -y
  apt-get install python3.8-dev default-libmysqlclient-dev -y
  apt-get install libjpeg-dev -y
  pip3 install -r requirements.txt
  cd /home/ubuntu/&lt;YOUR PRIVATE REPO NAME&gt;
  python3 manage.py collectstatic --noinput
  python3 manage.py makemigrations
  python3 manage.py migrate
  python3 manage.py runserver 0.0.0.0:80
</code></pre>
</li>
<li><p>Click Create</p>
</li>
</ul>
<h3 id="heading-step-11-create-ssltls-certification-for-secure-connection">Step 11: Create SSL/TLS certification for secure connection</h3>
<p>Go to the certification manager console and click <code>Request a certificate</code> button.</p>
<pre><code class="lang-plaintext">Select Request a public certificate, then request a certificate 
- Fully qualified domain name: *.&lt;YOUR DNS NAME&gt;  
- DNS validation 
- No tag 
- Review 
- click confirm and request button 
#Then it takes a while to be activated
</code></pre>
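<p>Note that a wildcard certificate such as <code>*.&lt;YOUR DNS NAME&gt;</code> covers exactly one DNS label: it matches <code>www.example.com</code> but neither the bare apex <code>example.com</code> nor deeper names like <code>a.b.example.com</code>. A hypothetical helper illustrating that matching rule:</p>
<pre><code class="lang-python">def wildcard_covers(cert_name, hostname):
    # A wildcard covers exactly one DNS label (RFC 6125 semantics),
    # so "*.example.com" matches "www.example.com" but not
    # "example.com" or "a.b.example.com".
    if not cert_name.startswith("*."):
        return cert_name == hostname
    base = cert_name[2:]
    if "." not in hostname:
        return False
    first_label, rest = hostname.split(".", 1)
    return bool(first_label) and rest == base

print(wildcard_covers("*.example.com", "www.example.com"))  # True
</code></pre>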
<h3 id="heading-step-12-create-alb-and-target-group">Step 12: Create ALB and Target Group</h3>
<p>Go to the Load Balancer section on the left-hand side menu of EC2 console. Click <code>create Load Balancer</code> button and select Application Load Balancer</p>
<pre><code class="lang-plaintext">Step 1 - Basic Configs
Name                    : awscapstoneALB
Scheme                  : internet-facing
Availability Zones      : 
    - VPC               : aws_capstone_VPC
    - Availability zones: 
        1. aws_capstone-public-subnet-1A
        2. aws_capstone-public-subnet-1B
Step 2 - Configure Security Settings
Certificate type ---&gt; Choose a certificate from ACM (recommended)
    - Certificate name    : "*.&lt;YOUR DNS NAME&gt;" certificate
    - Security policy     : keep it as is
Step 3 - Configure Security Groups : aws_capstone_ALB_Sec_group
Step 4 - Configure Listeners and Routing
- Create the target group first:
    - Target group        : New target group
    - Name                : awscapstoneTargetGroup
    - Target Type         : Instance
    - Protocol            : HTTP
    - Port                : 80
    - Protocol version    : HTTP1
    - Health Check        :
      - Protocol          : HTTP
      - Path              : /
      - Port              : traffic port
      - Healthy threshold : 5
      - Unhealthy threshold : 2
      - Timeout           : 5
      - Interval          : 30
      - Success Code      : 200
- Listeners:  
    ----&gt; HTTPS: Select "awscapstoneTargetGroup" for HTTPS
    ----&gt; HTTP : Redirect traffic from HTTP to HTTPS:
        - Redirect to HTTPS 443
        - Original host, path, query
        - 301 - permanently moved 
Step 5 - Register Targets
- Without registering any targets, click Next: Review and click create.
</code></pre>
<h3 id="heading-step-13-create-auto-scaling-group-with-launch-template">Step 13: Create Auto Scaling Group with Launch Template</h3>
<p>Go to the Auto Scaling Groups section on the left-hand side menu. Click Create Auto Scaling group.</p>
<ul>
<li>Choose a launch template or configuration</li>
</ul>
<pre><code class="lang-plaintext">Auto Scaling group name         : aws_capstone_ASG
Launch Template                 : aws_capstone_launch_template
</code></pre>
<ul>
<li>Configure settings</li>
</ul>
<pre><code class="lang-plaintext">Instance purchase options       : Adhere to launch template
Network                         :
    - VPC                       : aws-capstone-VPC
    - Subnets                   : Private 1A and Private 1B
</code></pre>
<ul>
<li>Configure advanced options</li>
</ul>
<pre><code class="lang-plaintext">- Load balancing                                : Attach to "awscapstoneALB" load balancer
- Choose from your load balancer target groups  : awscapstoneTargetGroup
- Health Checks
    - Health Check Type             : ELB
    - Health check grace period     : 300
</code></pre>
<ul>
<li>Configure group size and scaling policies</li>
</ul>
<pre><code class="lang-plaintext">Group size
    - Desired capacity  : 2
    - Minimum capacity  : 2
    - Maximum capacity  : 4
Scaling policies
    - Target tracking scaling policy
        - Scaling policy name       : Target Tracking Policy
        - Metric Type               : Average CPU utilization
        - Target value              : 70
</code></pre>
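<p>As a rough mental model, target tracking adjusts the desired capacity so that average CPU stays near the 70% target, clamped between the minimum and maximum capacity. The sketch below approximates that calculation; the real Auto Scaling algorithm also applies cooldowns and instance warm-up, so treat this as an illustration only:</p>
<pre><code class="lang-python">import math

def approx_desired_capacity(current_capacity, current_cpu,
                            target_cpu=70.0, min_cap=2, max_cap=4):
    # Scale capacity proportionally to how far CPU is from the target,
    # then clamp to the group's min/max size
    desired = math.ceil(current_capacity * current_cpu / target_cpu)
    return max(min_cap, min(max_cap, desired))

# Two instances averaging 90% CPU push the group toward three instances
print(approx_desired_capacity(2, 90.0))  # 3
</code></pre>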
<ul>
<li>Add notifications</li>
</ul>
<pre><code class="lang-plaintext">Create new Notification
    - Notification1
        - Send a notification to    : aws-capstone-SNS
        - with these recipients     : your_email.com
        - event type                : select all
</code></pre>
<h3 id="heading-step-14-create-cloudfront-in-front-of-alb">Step 14: Create Cloudfront in front of ALB</h3>
<p>Go to the Cloudfront menu and click Create Distribution.</p>
<ul>
<li>Origin Settings</li>
</ul>
<pre><code class="lang-plaintext">Origin Domain Name          : your ALB DNS name, e.g. awscapstoneALB-xxxxxxxxxx.us-east-1.elb.amazonaws.com
Origin Path                 : Leave empty (i.e., the root '/')
Protocol                    : Match Viewer
HTTP Port                   : 80
HTTPS                       : 443
Minimum Origin SSL Protocol : Keep it as is
Name                        : Keep it as is
Add custom header           : No header
Enable Origin Shield        : No
Additional settings         : Keep it as is
</code></pre>
<ul>
<li>Default Cache Behavior Settings</li>
</ul>
<pre><code class="lang-plaintext">Path pattern                                : Default (*)
Compress objects automatically              : Yes
Viewer Protocol Policy                      : Redirect HTTP to HTTPS
Allowed HTTP Methods                        : GET, HEAD, OPTIONS, PUT, POST, PATCH, DELETE
Cached HTTP Methods                         : Select OPTIONS
Cache key and origin requests
- Use legacy cache settings
  Headers     : Include the following headers
    Add Header
    - Accept
    - Accept-Charset
    - Accept-Datetime
    - Accept-Encoding
    - Accept-Language
    - Authorization
    - Cloudfront-Forwarded-Proto
    - Host
    - Origin
    - Referer
Forward Cookies                         : All
Query String Forwarding and Caching     : All
Other stuff                             : Keep them as are
</code></pre>
<ul>
<li>Distribution Settings</li>
</ul>
<pre><code class="lang-plaintext">Price Class                             : Use all edge locations (best performance)
Alternate Domain Names                  : www.&lt;Your_Domain_Name&gt;
SSL Certificate                         : Custom SSL Certificate (example.com) ---&gt; Select your certificate created before
Other stuff                             : Keep them as are
</code></pre>
<h3 id="heading-step-15-create-route-53-with-failover-settings">Step 15: Create Route 53 with Failover settings</h3>
<p>Go to the Route 53 console and select Health Checks on the left-hand menu.</p>
<ul>
<li><p>Click Create Health Check and start to configure</p>
<pre><code class="lang-plaintext">  Name                : aws capstone health check
  What to monitor     : Endpoint
  Specify endpoint by : Domain Name
  Protocol            : HTTP
  Domain Name         : Enter the CloudFront domain name
  Port                : 80
  Path                : leave it blank
  Other stuff         : Keep them as are
</code></pre>
</li>
<li><p>Click Hosted Zones on the left-hand menu</p>
</li>
<li><p>Create &lt;your_domain_name&gt; hosted zone and click it</p>
<pre><code class="lang-plaintext">  Name: &lt;your_domain_name&gt;
  Type: Public Hosted Zone
  Tag : No tag
</code></pre>
</li>
<li><p>Click Create Record to create a Failover scenario</p>
<pre><code class="lang-plaintext">  Configure records

  Record name             : www.&lt;YOUR DNS NAME&gt;
  Record Type             : A - Routes traffic to an IPv4 address and some AWS resources
  TTL                     : 300

  ---&gt; First we'll create a primary record for cloudfront

  Failover record to add to your DNS ---&gt; Define failover record

  Value/Route traffic to  : Alias to cloudfront distribution
                            - Select created cloudfront DNS
  Failover record type    : Primary
  Health check            : aws capstone health check
  Record ID               : Cloudfront as Primary Record
  ----------------------------------------------------------------

  ---&gt; Second we'll create secondary record for S3

  Failover another record to add to your DNS ---&gt; Define failover record

  Value/Route traffic to  : Alias to S3 website endpoint
                            - Select Region
                            - Your created bucket name emerges ---&gt; Select it
  Failover record type    : Secondary
  Health check            : No health check
  Record ID               : S3 Bucket for Secondary record type
</code></pre>
</li>
<li><p>click create records</p>
</li>
</ul>
<h3 id="heading-step-16-create-dynamodb-table">Step 16: Create DynamoDB Table</h3>
<p>Go to the DynamoDB console and click the Create Table button.</p>
<pre><code class="lang-plaintext">Name            : awscapstoneDynamo
Primary key     : id
Other Stuff     : Keep them as are
#click create
</code></pre>
<h3 id="heading-step-17-create-lambda-function">Step 17: Create Lambda Function</h3>
<ul>
<li>Before we create our Lambda function, we should create the IAM role that the Lambda function will use. Go to the IAM console, select Roles on the left-hand menu, then click the Create role button</li>
</ul>
<pre><code class="lang-plaintext">Select Lambda as trusted entity ---&gt; click Next: Permissions
Choose: - AmazonS3FullAccess 
        - NetworkAdministrator
        - AWSLambdaVPCAccessExecutionRole
        - AmazonDynamoDBFullAccess
No tags
Role Name           : aws_capstone_lambda_Role
Role description    : This role gives Lambda permission to reach S3 and DynamoDB in the custom VPC
</code></pre>
<ul>
<li>Then, go to the Lambda Console and click the Create function</li>
</ul>
<pre><code class="lang-plaintext">
Function Name           : awscapstonelambdafunction
Runtime                 : Python 3.8
Select existing IAM role: aws_capstone_lambda_Role

Advanced Settings:
- Enable VPC: 
    - VPC               : aws_capstone_VPC
    - Subnets           : Select all subnets
    - Security Group    : Create "aws_capstone_lambda_sg"
                        ---&gt; allow all traffic (0.0.0.0/0) for inbound
                             and outbound rules
</code></pre>
<ul>
<li><p>Select the <code>awscapstonelambdafunction</code> Lambda function and click Add trigger in the Function overview panel.</p>
</li>
<li><p>For defining a trigger for creating objects</p>
</li>
</ul>
<pre><code class="lang-plaintext">Trigger configuration   : S3
Bucket                  : awscapstonec3&lt;name&gt;blog
Event type              : All object create events
Check the warning message and click Add. If S3 reports an overlapping configuration error, refresh the page and create a new trigger, or remove the S3 event notification and create the trigger again
</code></pre>
<ul>
<li>For defining a trigger for deleting objects</li>
</ul>
<pre><code class="lang-plaintext">Trigger configuration   : S3
Bucket                  : awscapstonec3&lt;name&gt;blog
Event type              : All object delete events
Check the warning message and click Add. If S3 reports an overlapping configuration error, refresh the page and create a new trigger, or remove the S3 event notification and create the trigger again
</code></pre>
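<p>Behind the scenes, both triggers end up in a single S3 bucket notification configuration. The sketch below shows its approximate shape (the function ARN is a placeholder); a bucket holds only one such configuration, which is why separately created triggers can collide with an "overlapping" error:</p>
<pre><code class="lang-python"># Placeholder ARN -- substitute your function's real ARN.
LAMBDA_ARN = "arn:aws:lambda:us-east-1:123456789012:function:awscapstonelambdafunction"

# Approximate shape of the configuration that could be passed to S3's
# put_bucket_notification_configuration: one entry per event family.
notification_config = {
    "LambdaFunctionConfigurations": [
        {"LambdaFunctionArn": LAMBDA_ARN, "Events": ["s3:ObjectCreated:*"]},
        {"LambdaFunctionArn": LAMBDA_ARN, "Events": ["s3:ObjectRemoved:*"]},
    ]
}

for cfg in notification_config["LambdaFunctionConfigurations"]:
    print(cfg["Events"])
</code></pre>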
<ul>
<li>Go to the Code tab, select lambda_function.py ---&gt; remove the default code and paste the code below. If you gave your DynamoDB table a different name, make sure to change it in the code.</li>
</ul>
<pre><code class="lang-python">import json
import boto3

def lambda_handler(event, context):
    if event:
        print("Event: ", event)
        # Extract the object key, event time, and event type
        # (e.g. "ObjectCreated:Put" -&gt; "Created") from the S3 event.
        filename = str(event['Records'][0]['s3']['object']['key'])
        timestamp = str(event['Records'][0]['eventTime'])
        event_name = str(event['Records'][0]['eventName']).split(':')[0][6:]

        # Use only the file name, not the full key path, as the item id.
        filename2 = filename.split('/')[-1]

        # Record the event in the DynamoDB table created in Step 16.
        dynamo_db = boto3.resource('dynamodb')
        dynamoTable = dynamo_db.Table('awscapstoneDynamo')

        dynamoTable.put_item(Item = {
            'id': filename2,
            'timestamp': timestamp,
            'Event': event_name,
        })

    return "Lambda success"
</code></pre>
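<p>Before deploying, you can sanity-check the handler's parsing logic locally with a trimmed-down sample event (a real S3 event carries many more fields than this sketch):</p>
<pre><code class="lang-python"># Minimal stand-in for the S3 event the function receives.
sample_event = {
    "Records": [{
        "eventTime": "2023-11-07T12:00:00.000Z",
        "eventName": "ObjectCreated:Put",
        "s3": {"object": {"key": "media/photos/cat.jpg"}},
    }]
}

# Same parsing steps as the handler, run without any AWS calls.
filename = str(sample_event['Records'][0]['s3']['object']['key'])
event_name = str(sample_event['Records'][0]['eventName']).split(':')[0][6:]
item_id = filename.split('/')[-1]

print(item_id, event_name)  # cat.jpg Created
</code></pre>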
<ul>
<li><p>Click Deploy and you're all set. Go to the website, add a new post with a photo, then check whether its record was written to DynamoDB.</p>
</li>
<li><p>We're all set!</p>
</li>
<li><p>Congratulations!! You have finished your AWS Capstone Project!</p>
</li>
</ul>
<h2 id="heading-outcome">Outcome</h2>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1699540278480/ba059253-72ad-4b2f-bd33-18f2afdf40f4.jpeg" alt class="image--center mx-auto" /></p>
<h2 id="heading-clean-up">Clean Up</h2>
<p>It's essential to clean up the AWS resources you've created during your project to avoid incurring additional costs and maintain good resource management practices. Follow these steps to delete the resources you've created:</p>
<ol>
<li><p><strong>Delete Auto Scaling Group and EC2 Instances</strong></p>
<ul>
<li><p>Go to the Amazon EC2 console.</p>
</li>
<li><p>Navigate to "Auto Scaling Groups" and select the "aws_capstone_ASG" group.</p>
</li>
<li><p>Click on the "Actions" button and select "Delete."</p>
</li>
<li><p>Confirm the termination of all associated EC2 instances when prompted.</p>
</li>
</ul>
</li>
<li><p><strong>Delete Load Balancer</strong></p>
<ul>
<li><p>In the Amazon EC2 console, go to the "Load Balancers" section.</p>
</li>
<li><p>Select the "awscapstoneALB" load balancer.</p>
</li>
<li><p>Click on "Actions" and choose "Delete."</p>
</li>
</ul>
</li>
<li><p><strong>Delete DynamoDB Table</strong></p>
<ul>
<li><p>Access the Amazon DynamoDB console.</p>
</li>
<li><p>Click on "Tables" on the left-hand menu.</p>
</li>
<li><p>Select the "awscapstoneDynamo" table.</p>
</li>
<li><p>Choose the "Delete table" option.</p>
</li>
</ul>
</li>
<li><p><strong>Remove RDS Database</strong></p>
<ul>
<li><p>Navigate to the Amazon RDS console.</p>
</li>
<li><p>Click on the RDS instance named "aws-capstone-RDS."</p>
</li>
<li><p>In the "Instance actions" menu, select "Delete."</p>
</li>
<li><p>Confirm the deletion of the RDS instance and snapshots.</p>
</li>
</ul>
</li>
<li><p><strong>Delete S3 Buckets</strong></p>
<ul>
<li><p>Access the Amazon S3 console.</p>
</li>
<li><p>Choose the "awscapstones3&lt;your_name&gt;blog" and "www.&lt;your_domain&gt;" buckets.</p>
</li>
<li><p>Select "Empty" and delete all objects in both buckets.</p>
</li>
<li><p>Now, choose the "Delete" option for each bucket.</p>
</li>
</ul>
</li>
<li><p><strong>Delete CloudFront Distribution</strong></p>
<ul>
<li><p>Go to the Amazon CloudFront console.</p>
</li>
<li><p>Select the "awscapstoneALB" distribution.</p>
</li>
<li><p>Disable the distribution first if it is still enabled, wait for the change to deploy, then choose "Delete."</p>
</li>
</ul>
</li>
<li><p><strong>Delete Route 53 Records</strong></p>
<ul>
<li><p>Visit the Amazon Route 53 console.</p>
</li>
<li><p>In your hosted zone, delete the Route 53 records associated with your DNS name and ALB.</p>
</li>
</ul>
</li>
<li><p><strong>Remove Lambda Function</strong></p>
<ul>
<li><p>Open the AWS Lambda console.</p>
</li>
<li><p>Select the "awscapstonelambdafunction."</p>
</li>
<li><p>Click on "Delete" and confirm the deletion.</p>
</li>
</ul>
</li>
<li><p><strong>Detach and Delete Internet Gateway</strong></p>
<ul>
<li><p>In the Amazon VPC console, navigate to "Internet Gateways."</p>
</li>
<li><p>Select "aws_capstone-IGW" and choose "Actions" &gt; "Detach from VPC."</p>
</li>
<li><p>Once detached, select the "aws_capstone-IGW" again and click "Actions" &gt; "Delete."</p>
</li>
</ul>
</li>
<li><p><strong>Delete NAT Instance</strong></p>
<ul>
<li><p>Go to the Amazon EC2 console.</p>
</li>
<li><p>In the "Instances" section, select the AWS Capstone NAT instance.</p>
</li>
<li><p>Click on "Instance State" &gt; "Terminate Instance."</p>
</li>
</ul>
</li>
<li><p><strong>Delete Endpoints</strong></p>
<ul>
<li><p>Access the VPC console.</p>
</li>
<li><p>In the "Endpoints" section, select the created endpoints.</p>
</li>
<li><p>Choose "Actions" &gt; "Delete Endpoints."</p>
</li>
</ul>
</li>
<li><p><strong>Delete VPC</strong></p>
<ul>
<li><p>In the Amazon VPC console, choose "Your VPCs."</p>
</li>
<li><p>Select "aws_capstone-VPC" and click "Actions" &gt; "Delete VPC."</p>
</li>
<li><p>Confirm the deletion when prompted.</p>
</li>
</ul>
</li>
<li><p><strong>Delete SSL/TLS Certificate</strong></p>
<ul>
<li><p>In the AWS Certificate Manager console, select the certificate created for SSL/TLS.</p>
</li>
<li><p>Click on the certificate, then choose "Actions" &gt; "Delete certificate."</p>
</li>
<li><p>Confirm the deletion when prompted.</p>
</li>
</ul>
</li>
</ol>
<h2 id="heading-conclusion">Conclusion</h2>
<p>In conclusion, this capstone project has empowered you with a diverse skill set essential for success in cloud computing and web development. You've learned to construct VPC environments, manage databases, employ web programming skills, and apply serverless computing through AWS Lambda functions. The project has honed your infrastructure configuration abilities and proficiently introduced you to version control using Git and GitHub. With this comprehensive knowledge, you are well-prepared to tackle real-world challenges in cloud technology and web development, setting a strong foundation for your career in the ever-evolving tech industry.</p>
<h3 id="heading-references">References</h3>
<ul>
<li><a target="_blank" href="https://docs.aws.amazon.com/vpc/latest/userguide/VPC_NAT_Instance.html">Nat Instance - AWS Documentation</a></li>
</ul>
]]></content:encoded></item><item><title><![CDATA[AWS General Immersion Day: Deep Dive into Practical AWS Expertise - for FREE!]]></title><description><![CDATA[TL;DR
Ready to level up your AWS skills? Click and get started with AWS General Immersion Day – a comprehensive training program offering hands-on experience across various AWS modules. --NO SIGN UP!!-- --PRACTICE LABS FOR FREE!!!--

Explore compute,...]]></description><link>https://alledevops.com/aws-general-immersion-day-deep-dive-into-practical-aws-expertise-for-free</link><guid isPermaLink="true">https://alledevops.com/aws-general-immersion-day-deep-dive-into-practical-aws-expertise-for-free</guid><category><![CDATA[AWS training]]></category><category><![CDATA[Cloud Computing]]></category><category><![CDATA[awsimmersionday]]></category><category><![CDATA[hands-on labs]]></category><category><![CDATA[skill enhancement]]></category><dc:creator><![CDATA[Mustafa Gönen]]></dc:creator><pubDate>Tue, 29 Aug 2023 09:04:44 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1693295848901/14253b74-de26-41d3-b610-cf4a8dff38ed.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h2 id="heading-tldr">TL;DR</h2>
<p>Ready to level up your AWS skills? Click and get started with <a target="_blank" href="https://catalog.workshops.aws/general-immersionday/en-US"><strong>AWS General Immersion Day</strong></a> – a comprehensive training program offering hands-on experience across various AWS modules. <em>--NO SIGN UP!!-- --PRACTICE LABS FOR FREE!!!--</em></p>
<ul>
<li><p>Explore compute, network, security, monitoring, database, storage, and provisioning. Get practical with labs covering <strong>EC2, VPC, IAM, CloudWatch, RDS, S3, EFS, CloudFormation, web app optimization, and cost management</strong>.</p>
</li>
<li><p>Gain proficiency in AWS services, resource management, security, monitoring, database, storage, and more. No matter your skill level, dive into AWS expertise with AWS General Immersion Day.</p>
</li>
</ul>
<p>Don't miss out – start your AWS journey now!</p>
<h2 id="heading-introduction">Introduction</h2>
<p>Are you ready to take your AWS skills to the next level? Look no further than AWS General Immersion Day, a comprehensive training program designed to provide you with hands-on experience across various AWS modules. In this blog post, we will delve into the details of this immersive training program, covering the modules, hands-on labs, and the skills you can expect to gain. --NO REGISTRATION FEE!!--</p>
<h2 id="heading-basic-modules">Basic Modules</h2>
<h3 id="heading-1-compute-amazon-ec2">1. Compute - Amazon EC2</h3>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1693296133192/008a6f89-5c69-4e46-9a8a-89a76a8c4f70.png" alt class="image--center mx-auto" /></p>
<ul>
<li><p>Launch and manage EC2 instances with ease</p>
</li>
<li><p>Explore auto-scaling capabilities for efficient resource management</p>
</li>
<li><p>Connect to EC2 instances on Linux and Windows operating systems</p>
</li>
</ul>
<h3 id="heading-2-network-amazon-vpc">2. Network - Amazon VPC</h3>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1693296190252/21f39515-d323-4156-9a90-2df93e4d3432.png" alt class="image--center mx-auto" /></p>
<ul>
<li><p>Create and configure virtual private networks (VPCs)</p>
</li>
<li><p>Set up subnets, security groups, and routing tables</p>
</li>
<li><p>Utilize Amazon API Gateway for building and deploying APIs</p>
</li>
</ul>
<h3 id="heading-3-security-aws-iam">3. Security - AWS IAM</h3>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1693296233239/a19f49f1-d014-4c1b-9ac0-df43482bf9e1.png" alt class="image--center mx-auto" /></p>
<ul>
<li><p>Understand the fundamentals of AWS Identity and Access Management (IAM)</p>
</li>
<li><p>Create IAM identities and assign roles for secure access to resources</p>
</li>
<li><p>Test access permissions to ensure robust security measures</p>
</li>
</ul>
<h3 id="heading-4-monitoring-amazon-cloudwatch">4. Monitoring - Amazon CloudWatch</h3>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1693296303958/c26ad7ac-de97-40df-80cc-0d5b1ba43374.png" alt class="image--center mx-auto" /></p>
<ul>
<li><p>Set up monitoring and alarms using Amazon CloudWatch</p>
</li>
<li><p>Gain insights into resource utilization and performance metrics</p>
</li>
<li><p>Ensure optimal performance and availability of your AWS resources</p>
</li>
</ul>
<h3 id="heading-5-database-amazon-rds">5. Database - Amazon RDS</h3>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1693296964104/06c526a5-f7f4-4486-adab-1cdd55897661.jpeg" alt class="image--center mx-auto" /></p>
<ul>
<li><p>Launch and configure RDS instances for MySQL and SQL Server databases</p>
</li>
<li><p>Access RDS instances from EC2 instances for seamless data management</p>
</li>
<li><p>Learn advanced tasks like snapshot creation and instance modification</p>
</li>
</ul>
<h3 id="heading-6-storage-amazon-s3-amp-efs">6. Storage - Amazon S3 &amp; EFS</h3>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1693296999349/e09c4d9a-0db4-4bfd-991b-f573cb5bd886.png" alt class="image--center mx-auto" /></p>
<ul>
<li><p>Create buckets in Amazon S3 and manage objects</p>
</li>
<li><p>Configure features like versioning and lifecycle policies for efficient storage management</p>
</li>
<li><p>Create and mount Elastic File System (EFS) drives across EC2 instances for shared storage</p>
</li>
</ul>
<h3 id="heading-7-provisioning-aws-cloudformation">7. Provisioning - AWS CloudFormation</h3>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1693297245192/875aa238-a979-4ca0-8dc3-94197754de33.webp" alt class="image--center mx-auto" /></p>
<ul>
<li><p>Discover the power of infrastructure as code with AWS CloudFormation</p>
</li>
<li><p>Create VPCs, launch EC2 instances, and manage resources using templates</p>
</li>
<li><p>Gain hands-on experience in automating infrastructure provisioning</p>
</li>
</ul>
<h2 id="heading-advanced-modules">Advanced Modules</h2>
<h3 id="heading-1-web-application">1. Web Application</h3>
<ul>
<li><p>Explore advanced topics like network, compute, database, and storage in the context of web applications</p>
</li>
<li><p>Deploy auto-scaling web services and connect them with databases</p>
</li>
<li><p>Gain insights into optimizing resources for high-performance web applications</p>
</li>
</ul>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1693298475046/eedd2140-6385-4e08-93bf-0cbe2aac86a1.png" alt class="image--center mx-auto" /></p>
<h3 id="heading-2-cost-monitoring-and-observability"><strong>2. Cost Monitoring and Observability</strong></h3>
<ul>
<li><p>Learn how to effectively manage costs and optimize resources using AWS Cost Management Tools</p>
</li>
<li><p>Set up cost reports, budgets, and cost anomaly detection for better cost control</p>
</li>
<li><p>Gain visibility into resource usage and make informed decisions for cost optimization</p>
</li>
</ul>
<h2 id="heading-hands-on-labs-get-your-hands-dirty-with-aws">Hands-On Labs: Get Your Hands Dirty with AWS</h2>
<p>In addition to the comprehensive modules mentioned above, AWS General Immersion Day provides participants with the opportunity to gain practical experience through hands-on labs. These labs are designed to reinforce the concepts learned in the modules and allow participants to apply their knowledge in real-world scenarios. Let's take a closer look at some of the hands-on labs you can expect to encounter during the program:</p>
<p><strong><mark>!!!Click on the links to ACCESS LABS directly and easily!!!</mark></strong></p>
<h3 id="heading-lab-1-launching-and-managing-ec2-instances">Lab 1: Launching and Managing EC2 Instances</h3>
<p>In this lab, you will have the chance to launch and manage Amazon EC2 instances. You will learn how to choose the appropriate instance type, configure security groups, and connect to EC2 instances using both <a target="_blank" href="https://catalog.workshops.aws/general-immersionday/en-US/basic-modules/10-ec2/ec2-linux">Linux</a> and <a target="_blank" href="https://catalog.workshops.aws/general-immersionday/en-US/basic-modules/10-ec2/ec2-windows">Windows</a> operating systems. By the end of this lab, you will have a solid understanding of how to provision and manage EC2 instances efficiently.</p>
<h3 id="heading-lab-2-building-scalable-architectures-with-ec2-auto-scalinghttpscatalogworkshopsawsgeneral-immersiondayen-usbasic-modules10-ec2ec2-auto-scalingec2-auto-scaling"><a target="_blank" href="https://catalog.workshops.aws/general-immersionday/en-US/basic-modules/10-ec2/ec2-auto-scaling/ec2-auto-scaling">Lab 2: <strong>Building Scalable Architectures with EC2 Auto Scaling</strong></a></h3>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1693298053656/56e185ed-36f0-4d40-824c-47770d2fe398.png" alt class="image--center mx-auto" /></p>
<p>You'll explore the power of EC2 Auto Scaling on AWS. This hands-on lab guides you through creating an Amazon Machine Image (AMI) using CloudFormation, setting up a Launch Template, and configuring an Auto Scaling group with an Application Load Balancer (ALB). By the end of this lab, you'll have a clear understanding of how to design resilient architectures that can dynamically scale based on CPU utilization, ensuring your applications remain highly available and performant under varying workloads.</p>
<h3 id="heading-lab-3-creating-and-configuring-vpcshttpscatalogworkshopsawsgeneral-immersiondayen-usbasic-modules20-vpcvpc"><a target="_blank" href="https://catalog.workshops.aws/general-immersionday/en-US/basic-modules/20-vpc/vpc">Lab 3: Creating and Configuring VPCs</a></h3>
<p>This lab focuses on creating and configuring Amazon Virtual Private Clouds (VPCs). You will learn how to set up subnets, security groups, and routing tables to establish secure and isolated network environments. Additionally, you will explore the capabilities of Amazon API Gateway for building and deploying APIs, enabling controlled access to your applications.</p>
<h3 id="heading-lab-4-building-a-dynamic-house-price-calculator-apihttpscatalogworkshopsawsgeneral-immersiondayen-usbasic-modules20-vpcapi-gateway"><a target="_blank" href="https://catalog.workshops.aws/general-immersionday/en-US/basic-modules/20-vpc/api-gateway"><strong>Lab 4: Building a Dynamic House Price Calculator API</strong></a></h3>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1693297742198/f4ca58ab-b75f-4152-a73b-2a2fceed6a5a.jpeg" alt class="image--center mx-auto" /></p>
<p>In this hands-on lab, you'll walk through the process of building a dynamic API using AWS API Gateway and Lambda functions to calculate house prices based on square footage. You'll start by deploying a pre-defined lambda function and setting up the API infrastructure. Through core sections and optional segments, you'll learn about message transformation, request validation, authentication, and deployment, while also exploring advanced features such as message caching, usage plans, and throttling. By the end of the lab, you'll have a comprehensive understanding of building and optimizing APIs on AWS.</p>
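<p>As a rough illustration of the kind of backend this lab wires up, a hypothetical handler might price a house as square footage times a flat rate. The field names and rate below are assumptions for illustration, not the lab's actual code:</p>
<pre><code class="lang-python">import json

PRICE_PER_SQFT = 350  # hypothetical flat rate, purely illustrative

def lambda_handler(event, context):
    # API Gateway proxy integrations deliver the request body as a JSON string.
    body = json.loads(event.get("body") or "{}")
    sqft = float(body.get("squareFootage", 0))
    return {
        "statusCode": 200,
        "body": json.dumps({"estimatedPrice": sqft * PRICE_PER_SQFT}),
    }

# Simulate an API Gateway request locally.
result = lambda_handler({"body": json.dumps({"squareFootage": 120})}, None)
print(result["body"])  # {"estimatedPrice": 42000.0}
</code></pre>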
<h3 id="heading-lab-5-implementing-iam-for-secure-accesshttpscatalogworkshopsawsgeneral-immersiondayen-usbasic-modules30-iamiam"><a target="_blank" href="https://catalog.workshops.aws/general-immersionday/en-US/basic-modules/30-iam/iam">Lab 5: Implementing IAM for Secure Access</a></h3>
<p>In this lab, you will dive into AWS Identity and Access Management (IAM) and learn how to implement secure access controls. You will create IAM identities for different user groups and assign roles and permissions accordingly. Through practical exercises, you will gain hands-on experience in testing access permissions and ensuring robust security measures for your AWS resources.</p>
<h3 id="heading-lab-6-monitoring-and-alarming-with-cloudwatchhttpscatalogworkshopsawsgeneral-immersiondayen-usbasic-modules40-monitoringmonitoring"><a target="_blank" href="https://catalog.workshops.aws/general-immersionday/en-US/basic-modules/40-monitoring/monitoring">Lab 6: Monitoring and Alarming with CloudWatch</a></h3>
<p>CloudWatch is a powerful monitoring service offered by AWS, and this lab will guide you through its features and functionalities. You will set up monitoring and alarms using CloudWatch, gaining insights into resource utilization and performance metrics. Through hands-on exercises, you will learn how to proactively monitor your infrastructure and resolve issues before they impact your services.</p>
<h3 id="heading-lab-7-managing-databases-with-amazon-rds">Lab 7: Managing Databases with Amazon RDS</h3>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1693297930684/e0c31178-bc20-4dff-8ebf-cdd3bc672f23.jpeg" alt class="image--center mx-auto" /></p>
<p>This lab focuses on managing databases using Amazon RDS. You will launch and configure RDS instances for <a target="_blank" href="https://catalog.workshops.aws/general-immersionday/en-US/basic-modules/50-rds/rds-mysql"><em>MySQL</em></a> and <a target="_blank" href="https://catalog.workshops.aws/general-immersionday/en-US/basic-modules/50-rds/rds-sqlserver"><em>SQL Server</em></a> databases, and learn how to access and manage them from your EC2 instances. Advanced tasks such as snapshot creation and instance modification will also be covered, providing you with the skills needed to efficiently manage your databases.</p>
<h3 id="heading-lab-8-efficient-storage-management-with-s3-and-efs">Lab 8: Efficient Storage Management with S3 and EFS</h3>
<p>In this lab, you will explore <a target="_blank" href="https://catalog.workshops.aws/general-immersionday/en-US/basic-modules/60-s3/s3"><em>Amazon S3</em></a> and <a target="_blank" href="https://catalog.workshops.aws/general-immersionday/en-US/basic-modules/60-s3/efs"><em>Elastic File System (EFS)</em></a> for efficient storage management. You will create S3 buckets, manage objects, and configure features like versioning and lifecycle policies. Additionally, you will create and mount EFS drives across your EC2 instances, enabling seamless collaboration and data sharing among your development teams.</p>
<h3 id="heading-lab-9-infrastructure-provisioning-with-cloudformation">Lab 9: Infrastructure Provisioning with CloudFormation</h3>
<p>CloudFormation is a powerful tool for automating infrastructure provisioning, and this lab will guide you through its capabilities. You will learn how to create reusable templates to automate <a target="_blank" href="https://catalog.workshops.aws/general-immersionday/en-US/basic-modules/800-cfn/810-cfn-lab1-vpc"><em>the creation of VPCs,</em></a> <a target="_blank" href="https://catalog.workshops.aws/general-immersionday/en-US/basic-modules/800-cfn/820-cfn-lab2-ec2"><em>launch EC2 instances</em></a>, and manage other AWS resources. By the end of this lab, you will have hands-on experience in automating your infrastructure provisioning process.</p>
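<p>A CloudFormation template is just declarative JSON or YAML. As a hedged illustration, the sketch below assembles a minimal one-instance template as a Python dict (the AMI ID is a placeholder, not one used in the workshop):</p>
<pre><code class="lang-python">import json

# Minimal template: a single EC2 instance resource.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Description": "Minimal example: one EC2 instance.",
    "Resources": {
        "WebServer": {
            "Type": "AWS::EC2::Instance",
            "Properties": {
                "ImageId": "ami-0123456789abcdef0",  # placeholder AMI ID
                "InstanceType": "t2.micro",
            },
        }
    },
}

print(json.dumps(template, indent=2))
</code></pre>
<p>Saving this JSON to a file gives you something you could hand to CloudFormation as a stack template; the workshop labs build richer versions of the same structure.</p>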
<h3 id="heading-lab-10-optimizing-web-applications-with-advanced-moduleshttpscatalogworkshopsawsgeneral-immersiondayen-usadvanced-modules"><a target="_blank" href="https://catalog.workshops.aws/general-immersionday/en-US/advanced-modules">Lab 10: Optimizing Web Applications with Advanced Modules</a></h3>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1693297448452/87198d1f-c155-480a-8b45-ef4f19075caf.jpeg" alt class="image--center mx-auto" /></p>
<p>This lab focuses on optimizing web applications using advanced AWS modules. You will deploy auto-scaling web services that adjust resources based on demand, ensuring optimal performance and scalability. Additionally, you will learn how to seamlessly connect your web services with databases, enabling efficient data retrieval and storage for high-performance web applications.</p>
<h3 id="heading-lab-11-cost-monitoring-and-resource-optimization-with-advanced-moduleshttpscatalogworkshopsawsgeneral-immersiondayen-usadvanced-modules-cost-monitoring-observability"><a target="_blank" href="https://catalog.workshops.aws/general-immersionday/en-US/advanced-modules-cost-monitoring-observability">Lab 11: Cost Monitoring and Resource Optimization with Advanced Modules</a></h3>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1693297537099/e07d4fae-cf1f-4a09-991e-a631e22280c3.png" alt class="image--center mx-auto" /></p>
<p>Managing costs and optimizing resources are crucial aspects of AWS, and this lab will guide you through the process. You will learn how to effectively manage costs using AWS Cost Management Tools, including setting up cost reports, budgets, and cost anomaly detection. Through hands-on exercises, you will gain visibility into resource usage and make informed decisions for cost optimization.</p>
<h2 id="heading-promises-of-aws-general-immersion-day">Promises of AWS General Immersion Day</h2>
<ol>
<li><p><strong>Proficiency in AWS Services</strong>: Through the hands-on labs of AWS General Immersion Day, you will gain practical experience in working with a wide range of AWS services. This will enhance your understanding of these services and enable you to effectively utilize them in real-world scenarios.</p>
</li>
<li><p><strong>Resource Management</strong>: You will learn how to provision, configure, and manage AWS resources such as EC2 instances, VPCs, RDS databases, S3 buckets, and more. This will enable you to efficiently manage your infrastructure and optimize resource utilization.</p>
</li>
<li><p><strong>Security and Compliance</strong>: By participating in the security-focused labs, you will gain knowledge and hands-on experience in implementing robust security measures for your AWS resources. This includes setting up IAM roles and policies, configuring VPC security groups, and implementing encryption for data at rest and in transit.</p>
</li>
<li><p><strong>Monitoring and Troubleshooting</strong>: You will learn how to monitor your AWS resources using CloudWatch, set up alarms, and proactively identify and resolve issues before they impact your services. This will enable you to ensure the availability and performance of your applications.</p>
</li>
<li><p><strong>Database Management</strong>: Through the labs focused on Amazon RDS, you will gain the skills to launch, configure, and manage databases using RDS instances. This includes tasks such as backup and restore, snapshot creation, and instance modification.</p>
</li>
<li><p><strong>Storage Management</strong>: You will learn how to efficiently manage storage using Amazon S3 and Elastic File System (EFS). This includes creating S3 buckets, managing objects, configuring versioning and lifecycle policies, as well as creating and mounting EFS drives across EC2 instances.</p>
</li>
<li><p><strong>Infrastructure Automation</strong>: The CloudFormation lab will equip you with the skills to automate infrastructure provisioning using reusable templates. This will enable you to provision and manage AWS resources consistently and efficiently.</p>
</li>
<li><p><strong>Web Application Optimization</strong>: Through the labs focused on advanced modules, you will learn how to deploy auto-scaling web services, seamlessly connect them with databases, and optimize performance and scalability.</p>
</li>
</ol>
<h2 id="heading-conclusion">Conclusion</h2>
<p>By participating in the hands-on labs of AWS General Immersion Day, you can expect to gain practical experience and knowledge in working with various AWS services. These labs are designed to simulate real-world scenarios, allowing you to apply your skills and deepen your understanding of the AWS platform.</p>
<p><mark>No matter your level of experience, whether you are a beginner or an experienced AWS user, AWS General Immersion Day offers a valuable opportunity to build a solid foundation and enhance your expertise in compute, network, security, monitoring, database, storage, and provisioning.</mark></p>
<p>So, if you're ready to roll up your sleeves, get your hands dirty, and dive into the world of AWS, AWS General Immersion Day is the perfect program for you. Whether you want to enhance your existing skills or start your AWS journey from scratch, this program provides the tools and resources necessary to unlock the full potential of AWS.</p>
<p><em>Don't miss out on the chance to embark on your AWS journey today. Join AWS General Immersion Day and take the first step towards mastering AWS!</em></p>
]]></content:encoded></item><item><title><![CDATA[Scalable Serverless API: Amazon API Gateway & AWS Lambda]]></title><description><![CDATA[TL;DR
Learn how to create a powerful serverless API using Amazon API Gateway, AWS Lambda, and DynamoDB. Follow step-by-step instructions to set up the project, create Lambda functions, and deploy the API. Interact with DynamoDB for CRUD operations.

...]]></description><link>https://alledevops.com/scalable-serverless-api-amazon-api-gateway-aws-lambda</link><guid isPermaLink="true">https://alledevops.com/scalable-serverless-api-amazon-api-gateway-aws-lambda</guid><category><![CDATA[aws-apigateway]]></category><category><![CDATA[aws lambda]]></category><category><![CDATA[AWS DynamoDB]]></category><category><![CDATA[serverless]]></category><category><![CDATA[APIs]]></category><dc:creator><![CDATA[Mustafa Gönen]]></dc:creator><pubDate>Wed, 02 Aug 2023 09:09:51 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1690964144712/a31539a1-7ac7-44b1-b64c-ad3b08265948.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h2 id="heading-tldr">TL;DR</h2>
<p>Learn how to create a powerful serverless API using Amazon API Gateway, AWS Lambda, and DynamoDB. Follow step-by-step instructions to set up the project, create Lambda functions, and deploy the API. Interact with DynamoDB for CRUD operations.</p>
<ul>
<li><p><a target="_blank" href="https://us-east-1.console.aws.amazon.com/cloudformation/home#/stacks/quickcreate?templateUrl=https://serverless-api-gateway-lambda-dynamodb.s3.amazonaws.com/cloudformation-template.yaml&amp;stackName=serverless-api-gateway-lambda-dynamodb">One-Click Deployment</a></p>
</li>
<li><p><a target="_blank" href="https://github.com/alledevops/serverless-api-gateway-lambda-dynamodb">Project GitHub Repo</a></p>
</li>
</ul>
<h2 id="heading-introduction">Introduction</h2>
<p>In this blog article, we will walk you through the process of building a powerful serverless API using Amazon API Gateway, AWS Lambda, and DynamoDB. This step-by-step guide will help you understand the high-level design, set up the project, and deploy the API, making it clear and easy to follow.</p>
<h2 id="heading-project-description">Project Description</h2>
<p>Our goal is to create <strong>a serverless API that interacts with DynamoDB to perform CRUD operations (Create, Read, Update, Delete)</strong> and supports additional operations for testing. The API will be backed by an AWS Lambda function, which will handle incoming requests and interact with the DynamoDB table based on the payload.</p>
<h2 id="heading-project-skeleton">Project Skeleton</h2>
<p>To achieve our goal, we will have the following components:</p>
<ol>
<li><p><strong>Amazon API Gateway</strong>: This service will act as the <em>frontend</em> for our API, receiving HTTP requests and routing them to the appropriate Lambda function.</p>
</li>
<li><p><strong>AWS Lambda</strong>: We will create a Python 3.7 Lambda function that serves as the <em>backend</em> for our API. It will handle various operations and communicate with the DynamoDB table.</p>
</li>
<li><p><strong>DynamoDB</strong>: This NoSQL database service from AWS will store our <em>data</em>. We will create a <em>table</em> and define a <em>primary key</em> for efficient data retrieval.</p>
</li>
</ol>
<h2 id="heading-tools-used">Tools Used</h2>
<p>For this project, we will be using the following AWS services and other tools:</p>
<ol>
<li><p><strong>Amazon API Gateway</strong>: To create and manage the API endpoints.</p>
</li>
<li><p><strong>AWS Lambda</strong>: For serverless computing to process incoming API requests.</p>
</li>
<li><p><strong>AWS DynamoDB</strong>: To store and retrieve data in a scalable and reliable manner.</p>
</li>
<li><p><strong>AWS IAM:</strong> To allow the Lambda function to carry out the required task.</p>
</li>
<li><p><strong>Postman</strong>: To test POST requests on the API and to inspect API responses.</p>
</li>
</ol>
<h2 id="heading-high-level-design">High-Level Design</h2>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1690899236087/dd4f2db8-6930-4c67-819a-599a0db1e97e.png" alt class="image--center mx-auto" /></p>
<p>Our API will have a single resource named "DynamoDBManager," which will handle various methods like POST, enabling different DynamoDB operations. Each API call will be routed to the appropriate Lambda function, allowing us to interact with the DynamoDB table seamlessly.</p>
<h2 id="heading-setup">Setup</h2>
<h3 id="heading-generate-lambda-iam-role">Generate Lambda IAM Role</h3>
<ol>
<li><p>Open the roles page in the IAM console.</p>
</li>
<li><p>Choose Create role.</p>
</li>
<li><p>Create a role with the following properties.</p>
<ul>
<li><p>Trusted entity: Lambda.</p>
</li>
<li><p>Role name: <strong>lambda-apigateway-role</strong>.</p>
</li>
<li><p>Permissions: The custom policy below has the permissions that the function needs to write data to DynamoDB and upload logs.</p>
</li>
</ul>
</li>
</ol>
<pre><code class="lang-json">{
  <span class="hljs-attr">"Version"</span>: <span class="hljs-string">"2012-10-17"</span>,
  <span class="hljs-attr">"Statement"</span>: [
    {
      <span class="hljs-attr">"Sid"</span>: <span class="hljs-string">"Stmt1428341300017"</span>,
      <span class="hljs-attr">"Action"</span>: [
        <span class="hljs-string">"dynamodb:DeleteItem"</span>,
        <span class="hljs-string">"dynamodb:GetItem"</span>,
        <span class="hljs-string">"dynamodb:PutItem"</span>,
        <span class="hljs-string">"dynamodb:Query"</span>,
        <span class="hljs-string">"dynamodb:Scan"</span>,
        <span class="hljs-string">"dynamodb:UpdateItem"</span>
      ],
      <span class="hljs-attr">"Effect"</span>: <span class="hljs-string">"Allow"</span>,
      <span class="hljs-attr">"Resource"</span>: <span class="hljs-string">"*"</span>
    },
    {
      <span class="hljs-attr">"Sid"</span>: <span class="hljs-string">""</span>,
      <span class="hljs-attr">"Resource"</span>: <span class="hljs-string">"*"</span>,
      <span class="hljs-attr">"Action"</span>: [
        <span class="hljs-string">"logs:CreateLogGroup"</span>,
        <span class="hljs-string">"logs:CreateLogStream"</span>,
        <span class="hljs-string">"logs:PutLogEvents"</span>
      ],
      <span class="hljs-attr">"Effect"</span>: <span class="hljs-string">"Allow"</span>
    }
  ]
}
</code></pre>
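<p>If you prefer to script the console steps, the same role can be created with boto3. The sketch below mirrors the policy above; the role and policy names follow the tutorial, and because creating resources requires AWS credentials, the API calls are wrapped in a function you invoke deliberately:</p>

```python
import json

# Trust policy that lets the Lambda service assume the role
TRUST_POLICY = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

# The custom permissions policy from above: DynamoDB CRUD plus CloudWatch Logs
PERMISSIONS_POLICY = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "dynamodb:DeleteItem", "dynamodb:GetItem", "dynamodb:PutItem",
                "dynamodb:Query", "dynamodb:Scan", "dynamodb:UpdateItem",
            ],
            "Resource": "*",
        },
        {
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogGroup", "logs:CreateLogStream", "logs:PutLogEvents",
            ],
            "Resource": "*",
        },
    ],
}


def create_role():
    """Create lambda-apigateway-role with the inline policy (requires AWS credentials)."""
    import boto3  # imported here so the definitions above work without boto3 installed
    iam = boto3.client("iam")
    iam.create_role(
        RoleName="lambda-apigateway-role",
        AssumeRolePolicyDocument=json.dumps(TRUST_POLICY),
    )
    iam.put_role_policy(
        RoleName="lambda-apigateway-role",
        PolicyName="lambda-apigateway-policy",
        PolicyDocument=json.dumps(PERMISSIONS_POLICY),
    )
```

<p>Calling <code>create_role()</code> once is enough; re-running it against an existing role raises an <code>EntityAlreadyExists</code> error.</p>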
<h3 id="heading-create-lambda-function">Create Lambda Function</h3>
<ol>
<li>Click "Create function" in AWS Lambda Console</li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1690907777874/502e4d68-20bd-4ebd-86f2-ee60af90e199.jpeg" alt class="image--center mx-auto" /></p>
<ol start="2">
<li>Select "Author from scratch". Use the name "LambdaFunctionOverHttps" and select "Python 3.7" as the Runtime. Under Permissions, select "Use an existing role" and choose the "lambda-apigateway-role" that we created from the dropdown. Finally, click "Create function" at the bottom right.</li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1690908023174/9a3af45e-2c7f-448c-80a2-684c20ba5939.jpeg" alt class="image--center mx-auto" /></p>
<ol start="3">
<li>Replace the boilerplate code with the following snippet and click "Save".</li>
</ol>
<pre><code class="lang-python"><span class="hljs-comment"># Sample Python Code for the Lambda Function</span>

<span class="hljs-keyword">from</span> __future__ <span class="hljs-keyword">import</span> print_function

<span class="hljs-keyword">import</span> boto3
<span class="hljs-keyword">import</span> json

print(<span class="hljs-string">'Loading function'</span>)


<span class="hljs-function"><span class="hljs-keyword">def</span> <span class="hljs-title">lambda_handler</span>(<span class="hljs-params">event, context</span>):</span>
    <span class="hljs-string">'''Provide an event that contains the following keys:
      - operation: one of the operations in the operations dict below
      - tableName: required for operations that interact with DynamoDB
      - payload: a parameter to pass to the operation being performed
    '''</span>

    operation = event[<span class="hljs-string">'operation'</span>]

    <span class="hljs-keyword">if</span> <span class="hljs-string">'tableName'</span> <span class="hljs-keyword">in</span> event:
        dynamo = boto3.resource(<span class="hljs-string">'dynamodb'</span>).Table(event[<span class="hljs-string">'tableName'</span>])

    operations = {
        <span class="hljs-string">'create'</span>: <span class="hljs-keyword">lambda</span> x: dynamo.put_item(**x),
        <span class="hljs-string">'read'</span>: <span class="hljs-keyword">lambda</span> x: dynamo.get_item(**x),
        <span class="hljs-string">'update'</span>: <span class="hljs-keyword">lambda</span> x: dynamo.update_item(**x),
        <span class="hljs-string">'delete'</span>: <span class="hljs-keyword">lambda</span> x: dynamo.delete_item(**x),
        <span class="hljs-string">'list'</span>: <span class="hljs-keyword">lambda</span> x: dynamo.scan(**x),
        <span class="hljs-string">'echo'</span>: <span class="hljs-keyword">lambda</span> x: x,
        <span class="hljs-string">'ping'</span>: <span class="hljs-keyword">lambda</span> x: <span class="hljs-string">'pong'</span>
    }

    <span class="hljs-keyword">if</span> operation <span class="hljs-keyword">in</span> operations:
        <span class="hljs-keyword">return</span> operations[operation](event.get(<span class="hljs-string">'payload'</span>))
    <span class="hljs-keyword">else</span>:
        <span class="hljs-keyword">raise</span> ValueError(<span class="hljs-string">'Unrecognized operation "{}"'</span>.format(operation))
</code></pre>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1690909224986/562dda48-824e-4171-993f-1c8efad98093.jpeg" alt class="image--center mx-auto" /></p>
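<p>The handler's dispatch logic can also be exercised locally before touching AWS. The sketch below reproduces the operation table with a stub in place of the boto3 Table resource; the <code>FakeTable</code> class is purely illustrative and is not part of the deployed function:</p>

```python
# Local stand-in for the boto3 DynamoDB Table resource
class FakeTable:
    def __init__(self):
        self.items = {}

    def put_item(self, Item):
        self.items[Item["id"]] = Item
        return {"ResponseMetadata": {"HTTPStatusCode": 200}}

    def get_item(self, Key):
        return {"Item": self.items.get(Key["id"])}

    def scan(self):
        return {"Items": list(self.items.values())}


def dispatch(event, table):
    """Mirror of lambda_handler's operation dispatch, minus boto3."""
    operations = {
        "create": lambda x: table.put_item(**x),
        "read": lambda x: table.get_item(**x),
        "list": lambda x: table.scan(**x),
        "echo": lambda x: x,
        "ping": lambda x: "pong",
    }
    operation = event["operation"]
    if operation in operations:
        return operations[operation](event.get("payload"))
    raise ValueError('Unrecognized operation "{}"'.format(operation))


table = FakeTable()
dispatch({"operation": "create", "tableName": "lambda-apigateway",
          "payload": {"Item": {"id": "1234ABCD", "number": 5}}}, table)
echoed = dispatch({"operation": "echo",
                   "payload": {"somekey1": "somevalue1"}}, table)
```

<p>This confirms the routing behavior: <em>echo</em> returns its payload unchanged, <em>ping</em> returns "pong", and <em>create</em> forwards the payload to <code>put_item</code>.</p>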
<h3 id="heading-test-lambda-function">Test Lambda Function</h3>
<p>Before moving forward, let's test the newly created Lambda function. We'll do a sample <em>echo</em> operation, where the function should output whatever input we pass.</p>
<ol>
<li><p>Click the dropdown menu of the "Test" button in the AWS Lambda Console.</p>
</li>
<li><p>Configure a test event with the following JSON payload:</p>
</li>
</ol>
<pre><code class="lang-json">{
    <span class="hljs-attr">"operation"</span>: <span class="hljs-string">"echo"</span>,
    <span class="hljs-attr">"payload"</span>: {
        <span class="hljs-attr">"somekey1"</span>: <span class="hljs-string">"somevalue1"</span>,
        <span class="hljs-attr">"somekey2"</span>: <span class="hljs-string">"somevalue2"</span>
    }
}
</code></pre>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1690903579658/849f9943-85f4-4a43-8f26-9fceef709388.jpeg" alt="test-lambda-function" /></p>
<p>With the successful test, we can move on to the next steps!</p>
<h3 id="heading-create-dynamodb-table">Create DynamoDB Table</h3>
<p>Now, we need to create the DynamoDB table that our Lambda function will utilize.</p>
<ol>
<li><p>Open the DynamoDB console.</p>
</li>
<li><p>Click "Create table."</p>
</li>
<li><p>Configure the table with the following settings:</p>
<ul>
<li><p>Table name: lambda-apigateway</p>
</li>
<li><p>Primary key: id (string)</p>
</li>
</ul>
</li>
<li><p>Click "Create."</p>
</li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1690910082997/e9444058-a84a-44d8-bbba-c78efc644d70.jpeg" alt class="image--center mx-auto" /></p>
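<p>For repeatable setups, the table can be created with boto3 as well. A minimal sketch — the key schema matches the console settings above, while the on-demand billing mode is an assumption (the console also offers provisioned capacity), and the call requires AWS credentials:</p>

```python
# Table definition matching the console settings above
TABLE_SPEC = {
    "TableName": "lambda-apigateway",
    "KeySchema": [{"AttributeName": "id", "KeyType": "HASH"}],
    "AttributeDefinitions": [{"AttributeName": "id", "AttributeType": "S"}],
    "BillingMode": "PAY_PER_REQUEST",  # assumed; provisioned capacity also works
}


def create_table():
    """Create the table and wait until it is ACTIVE (requires AWS credentials)."""
    import boto3
    client = boto3.client("dynamodb")
    client.create_table(**TABLE_SPEC)
    client.get_waiter("table_exists").wait(TableName=TABLE_SPEC["TableName"])
```
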
<h3 id="heading-build-api">Build API</h3>
<p>Our next step is to create the API using Amazon API Gateway.</p>
<ol>
<li><p>Go to the API Gateway console.</p>
</li>
<li><p>Click "Create API."</p>
</li>
<li><p>Select "REST API" and give the API a name, such as "DynamoDBOperations."</p>
</li>
<li><p>Click "Create API."</p>
</li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1690911264117/a0a8111f-0470-4833-a309-b990f8f8a055.jpeg" alt class="image--center mx-auto" /></p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1690911577605/d8eed177-3e8c-43d7-a943-554ea12b4408.jpeg" alt /></p>
<ol start="5">
<li><p>Click "Actions" and then select "Create Resource."</p>
</li>
<li><p>Input "DynamoDBManager" as the Resource Name and click "Create Resource."</p>
</li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1690912242370/0d9f8299-6ea5-485a-8abb-d6cf0135d7e6.jpeg" alt class="image--center mx-auto" /></p>
<ol start="7">
<li>Now, let's create a POST method for our API. With the "/dynamodbmanager" resource selected, click "Actions" again and select "Create Method."</li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1690912454214/d1c91826-5ec2-470c-8906-bf7648f03175.jpeg" alt class="image--center mx-auto" /></p>
<ol start="8">
<li>Choose "POST" from the dropdown and click the checkmark.</li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1690912883713/624b1935-adb5-44a2-935f-4eee65550e45.jpeg" alt /></p>
<ol start="9">
<li>The integration type will default to "Lambda Function." Choose "LambdaFunctionOverHttps" (the Lambda function we created earlier) from the list, and click "Save." A popup window will prompt you to give API Gateway permission to invoke the Lambda function. Click "OK."</li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1690912901874/61a09cab-5468-4b1b-896a-69428e788245.jpeg" alt class="image--center mx-auto" /></p>
<h3 id="heading-deploy-api">Deploy API</h3>
<p>To make our API accessible, we need to deploy it to a stage.</p>
<ol>
<li>With the POST method selected, click "Actions," then select "Deploy API."</li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1690913143388/36236573-ac64-4a00-b1f4-dfc2f9d4be7a.jpeg" alt class="image--center mx-auto" /></p>
<ol start="2">
<li>For "Deployment stage," select "[New Stage]," and give the stage a name, such as "Prod." Finish it by clicking "Deploy."</li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1690913386365/fa3e6ba0-f289-400e-8a94-aff78b9767ef.jpeg" alt class="image--center mx-auto" /></p>
<ol start="3">
<li>Once the deployment is complete, select the "Prod" stage from the "Stages" tab in the sidebar to reveal the "Invoke URL." Copy this URL; it is the endpoint for our API.</li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1690913481936/4844e25e-6b56-4f40-b040-69e35a7233fa.jpeg" alt class="image--center mx-auto" /></p>
<h2 id="heading-running-our-solution">Running Our Solution</h2>
<p>Now that our API is deployed, let's test it by invoking the Lambda function and interacting with the DynamoDB table.</p>
<ol>
<li>To create an item in the DynamoDB table, use the following JSON payload in a POST request:</li>
</ol>
<pre><code class="lang-json">{
    <span class="hljs-attr">"operation"</span>: <span class="hljs-string">"create"</span>,
    <span class="hljs-attr">"tableName"</span>: <span class="hljs-string">"lambda-apigateway"</span>,
    <span class="hljs-attr">"payload"</span>: {
        <span class="hljs-attr">"Item"</span>: {
            <span class="hljs-attr">"id"</span>: <span class="hljs-string">"1234ABCD"</span>,
            <span class="hljs-attr">"number"</span>: <span class="hljs-number">5</span>
        }
    }
}
</code></pre>
<ol start="2">
<li><p>To execute the API from your local machine, you can use either Postman or cURL:</p>
<ul>
<li><strong>Postman</strong>: Select "POST," paste the API invoke URL, and provide the JSON payload in the "Body" section. Click "Send," and the API should return an "HTTPStatusCode" of 200.</li>
</ul>
</li>
</ol>
<p>    <img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1690916950096/f518543d-2282-4f35-8de3-80293c252b2c.jpeg" alt class="image--center mx-auto" /></p>
<ul>
<li><p><strong>cURL</strong>: Use the following command in your terminal:</p>
<pre><code class="lang-bash">  $ curl -X POST -d <span class="hljs-string">"{\"operation\":\"create\",\"tableName\":\"lambda-apigateway\",\"payload\":{\"Item\":{\"id\":\"1\",\"name\":\"Bob\"}}}"</span> https://<span class="hljs-variable">$API</span>.execute-api.<span class="hljs-variable">$REGION</span>.amazonaws.com/prod/DynamoDBManager
</code></pre>
</li>
</ul>
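<p>The same request can also be sent from Python's standard library. A sketch — the invoke URL is a placeholder you replace with your own, and no request is actually sent here:</p>

```python
import json
import urllib.request


def invoke_api(url, operation, table_name=None, payload=None):
    """POST an operation event to the API and return the decoded JSON response."""
    event = {"operation": operation}
    if table_name is not None:
        event["tableName"] = table_name
    if payload is not None:
        event["payload"] = payload
    req = urllib.request.Request(
        url,
        data=json.dumps(event).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


# Example event for the "create" operation (built here, not sent):
event = {"operation": "create", "tableName": "lambda-apigateway",
         "payload": {"Item": {"id": "1", "name": "Bob"}}}
# invoke_api("https://YOUR_API_ID.execute-api.YOUR_REGION.amazonaws.com/Prod/DynamoDBManager",
#            "create", "lambda-apigateway", event["payload"])
```
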
<ol start="3">
<li>To verify that the item is inserted into the DynamoDB table, navigate to the DynamoDB console, select the "lambda-apigateway" table, and click the "Explore table items" button.</li>
</ol>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1690917764107/4b91aff4-957c-4ae3-8775-ba1836f70746.jpeg" alt /></p>
<ol start="4">
<li>To retrieve all items from the table, you can use the "list" operation. Send the following JSON payload to the API via Postman:</li>
</ol>
<pre><code class="lang-json">{
    <span class="hljs-attr">"operation"</span>: <span class="hljs-string">"list"</span>,
    <span class="hljs-attr">"tableName"</span>: <span class="hljs-string">"lambda-apigateway"</span>,
    <span class="hljs-attr">"payload"</span>: {
    }
}
</code></pre>
<p>You will see the two items that were added to the <em>lambda-apigateway</em> table, as shown in the screenshot below.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1690917214010/1ccf6e23-fb8d-466a-9f78-ae67512f2409.jpeg" alt class="image--center mx-auto" /></p>
<p>And that's it! You have successfully created a serverless API using Amazon API Gateway, AWS Lambda, and DynamoDB!</p>
<h2 id="heading-cleanup">Cleanup</h2>
<p>Once you have completed the tutorial and want to clean up the resources:</p>
<ol>
<li><p>Delete the DynamoDB table "lambda-apigateway" from the DynamoDB console.</p>
</li>
<li><p>Delete the Lambda function "LambdaFunctionOverHttps" from the AWS Lambda console.</p>
</li>
<li><p>Delete the API "DynamoDBOperations" from the Amazon API Gateway console.</p>
</li>
<li><p>Delete the "lambda-apigateway-role" from AWS IAM console.</p>
</li>
</ol>
<h2 id="heading-conclusion">Conclusion</h2>
<p>Building a serverless API with Amazon API Gateway, AWS Lambda, and DynamoDB provides a scalable and cost-efficient solution for your applications. By following this step-by-step guide, you've learned how to create, deploy, and test your serverless API. Now you can leverage this knowledge to build various serverless applications with ease and confidence. <em>Happy coding!</em></p>
<h3 id="heading-references">References</h3>
<ul>
<li><p>GitHub Repository: https://github.com/saha-rajdeep/serverless-lab</p>
</li>
<li><p>AWS Documentation:</p>
<ul>
<li><p>https://repost.aws/knowledge-center/api-gateway-authentication-token-errors</p>
</li>
<li><p>https://repost.aws/knowledge-center/api-gateway-internal-server-error</p>
</li>
</ul>
</li>
</ul>
]]></content:encoded></item></channel></rss>