diff --git a/LICENSE.md b/LICENSE.md new file mode 100644 index 0000000..62f587a --- /dev/null +++ b/LICENSE.md @@ -0,0 +1,5 @@ +# Labzy Labs — Source-Available Dual License +**Code:** Business Source License 1.1 (BUSL-1.1) +**Documentation & Media:** Creative Commons Attribution–NonCommercial–NoDerivatives 4.0 International (CC BY-NC-ND 4.0) + +Copyright (c) 2025 Meyi Cloud Solutions Pvt. Ltd. diff --git a/docs/00_overview.md b/docs/00_overview.md index 4bba659..13eab99 100644 --- a/docs/00_overview.md +++ b/docs/00_overview.md @@ -1 +1,83 @@ -# Overview \ No newline at end of file +# Terraform Instruction Lab – From Zero to Team Collaboration (2 Hours) + +This hands-on lab takes you from first contact with Terraform to a safe, team-ready workflow on AWS. You will provision real infrastructure, structure your code with variables and modules, and protect your state with an S3 backend and DynamoDB state locking. The material is paced for beginners and practical for engineers who want a concise, end‑to‑end setup they can reuse at work. + +**Audience:** Beginners using Ubuntu 24 for the first time +**Goal:** Deploy AWS resources using Terraform, adopt variables/outputs and modules, and enable remote state + locking for collaboration. + +## Outcomes +- Understand what Terraform is and how it manages infrastructure as code (IaC) +- Install and verify Terraform + AWS CLI on Ubuntu 24 +- Authenticate to AWS and validate IAM access +- Write a minimal Terraform configuration and deploy an EC2 instance +- Introduce variables, tfvars, and outputs for reuse and clarity +- Configure remote state in S3 with DynamoDB locking to prevent race conditions +- Compose and reuse modules (including a simple nested VPC + EC2 example) +- Clean up all resources to avoid unnecessary AWS costs + +## What We’ll Build +- A small, cost-conscious AWS stack: + - Optional VPC with public subnet and Internet gateway + - A single EC2 instance (Free Tier eligible type where possible) + - Remote Terraform state stored in S3 with DynamoDB table for locking +- A reusable module layout you can extend for real projects + +## Prerequisites +- AWS account (Free Tier is fine) and basic familiarity with regions +- IAM user or role with permissions for EC2, S3, and DynamoDB +- Ubuntu 24.04 machine (VM, physical, or WSL) with Internet access +- Willingness to use the terminal (copy/paste is fine!) + +> Tip: New to AWS CLI? No problem—setup is guided and verified in this lab. 
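+
+If you have never seen Terraform code before, here is a small, illustrative taste of the kind of configuration you will write in this lab. It is only a sketch; the real files, AMI IDs, and names are introduced step by step in the later sections:
+
+```hcl
+# Illustrative only – the lab builds this up gradually.
+provider "aws" {
+  region = "ap-south-1" # the region used in the hands-on sections
+}
+
+resource "aws_instance" "example" {
+  ami           = "ami-xxxxxxxxxxxxxxxxx" # placeholder; real AMI IDs appear later
+  instance_type = "t2.micro"              # Free Tier eligible
+}
+```
+
+You describe the resources you want; Terraform works out how to create, update, or destroy them to match.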
+ +## Lab Roadmap +- `docs/01_install_setup.md`: Install Terraform + AWS CLI and verify environment +- `docs/02_first_ec2.md`: Author your first Terraform config and deploy EC2 +- `docs/03_remote_state_s3_dynamodb.md`: Configure S3 backend and DynamoDB locking +- `docs/04_variables_tfvars_outputs.md`: Introduce variables, tfvars, and outputs +- `docs/05_modules_reuse.md`: Create and consume modules; structure for reuse +- `docs/06_nested_modules_vpc_ec2.md`: Model a simple VPC + EC2 with nested modules +- `docs/07_cleanup.md`: Destroy resources and verify nothing is left behind +- `docs/08_test.md`: Optional checks and validation ideas + +## Estimated Time (2 Hours) +- Setup and verification: 15–20 min +- First EC2 with basics: 15–20 min +- Remote state + locking: 20–25 min +- Variables, tfvars, outputs: 15–20 min +- Modules + nested example: 25–30 min +- Cleanup and wrap‑up: 10 min + +## Key Concepts +- Declarative IaC: Describe desired state; Terraform plans and applies changes +- State: Terraform tracks real resources; protect it with remote storage + locks +- Idempotence: Re‑runs converge to the same outcome when code is unchanged +- Modules: Encapsulate patterns, promote reuse and reviewability +- Collaboration: S3 state + DynamoDB locks prevent conflicting applies + +## Safety, Cost, and Region +- Choose a region close to you and consistent across the lab (e.g., `us-east-1`) +- Prefer Free Tier eligible instance types (e.g., `t2.micro` or `t3.micro`) +- Always run the cleanup step in `docs/07_cleanup.md` after experimenting +- Remote state resources (S3 bucket, DynamoDB table) have minimal ongoing cost + +## Tools You’ll Use +- Terraform CLI (1.6+ recommended) +- AWS CLI v2 +- A text editor and terminal (bash/zsh) + +## Deliverables +- A working Terraform project that can: + - Deploy a basic EC2 instance (optionally inside a simple VPC) + - Output connection details + - Store state in S3 with DynamoDB locking for team safety +- A module structure you can clone for future services + +## Troubleshooting and Help +- Use `terraform init -upgrade` if providers appear outdated +- Validate AWS credentials with `aws sts get-caller-identity` +- Run `terraform plan` to preview changes before apply +- If a lock persists, check and clear it via the DynamoDB console (only if safe) +- See `docs/08_test.md` for additional verification ideas + +When you’re ready, start with installation in `docs/01_install_setup.md`. diff --git a/docs/01_install_setup.md b/docs/01_install_setup.md new file mode 100644 index 0000000..0b8bae8 --- /dev/null +++ b/docs/01_install_setup.md @@ -0,0 +1,81 @@ +# 1) Install Terraform & Configure AWS (15 min) + +We’ll install Terraform and set up your AWS credentials on **Ubuntu 24.04**. + +### A. Install Terraform + +```bash +sudo apt-get update +``` +Updates the local package index so your system knows about the latest software versions. + +```bash +sudo apt-get install -y gnupg software-properties-common wget +``` +Installs essential tools: +- **gnupg** → handles security keys +- **software-properties-common** → helps manage repositories +- **wget** → used to download files + +```bash +wget -O- https://apt.releases.hashicorp.com/gpg | gpg --dearmor | sudo tee /usr/share/keyrings/hashicorp.gpg +``` +Downloads and stores HashiCorp’s official GPG key for verifying Terraform packages. 
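+
+Optionally, you can inspect the key you just saved before trusting it. This is one way to do it; compare the fingerprint it prints with the one published on HashiCorp's security page (https://www.hashicorp.com/security):
+
+```bash
+# List the key(s) stored in the keyring created above and show their fingerprints
+gpg --no-default-keyring --keyring /usr/share/keyrings/hashicorp.gpg --fingerprint
+```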
+ +```bash +echo "deb [signed-by=/usr/share/keyrings/hashicorp.gpg] https://apt.releases.hashicorp.com $(lsb_release -cs) main" | sudo tee /etc/apt/sources.list.d/hashicorp.list +``` +Adds the HashiCorp package repository to your system so Terraform can be installed and updated. + +```bash +sudo apt update && sudo apt install -y terraform && clear +``` +Refreshes package list again and installs the latest Terraform release. + +```bash +terraform -v +``` +Verifies Terraform is installed by printing its version. + + +### B. Install AWS CLI v2 + +```bash +sudo apt-get install -y unzip +``` +Installs **unzip**, required to extract the AWS CLI package. + +```bash +curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip" +``` +Downloads the AWS CLI v2 installer. + +```bash +unzip awscliv2.zip +``` +Extracts the AWS CLI installer files. + +```bash +sudo ./aws/install +``` +Runs the AWS CLI installer to make it available system-wide. + +```bash +aws --version +``` +Checks that AWS CLI is installed successfully. + + +### C. Configure AWS credentials + +```bash +aws configure +``` +Starts interactive setup for AWS CLI. Enter: + +- **AWS Access Key ID**: (from your IAM user) +- **AWS Secret Access Key**: (from your IAM user) +- **Default region name**: `ap-south-1` (Mumbai, for this lab) +- **Default output format**: `json` + +This creates config files (~/.aws/config and ~/.aws/credentials) so both AWS CLI and Terraform can connect securely to AWS. \ No newline at end of file diff --git a/docs/01_task.md b/docs/01_task.md deleted file mode 100644 index 17014e0..0000000 --- a/docs/01_task.md +++ /dev/null @@ -1 +0,0 @@ -## Initial task \ No newline at end of file diff --git a/docs/02_first_ec2.md b/docs/02_first_ec2.md new file mode 100644 index 0000000..3f8a423 --- /dev/null +++ b/docs/02_first_ec2.md @@ -0,0 +1,94 @@ +# 2) Your First Terraform Apply: EC2 (20 min) + +In this section, you will create your **first Terraform project** and use it to launch an Amazon EC2 instance in your AWS account. This is the "Hello World" of Terraform on AWS. + + +## Step 1: Create a new project directory + +```bash +mkdir -p ~/terraform-ec2-lab && cd ~/terraform-ec2-lab +``` +- `mkdir -p ~/terraform-ec2-lab` → Creates a folder called `terraform-ec2-lab` in your home directory. The `-p` option ensures the folder is created even if parent directories don’t exist. +- `cd ~/terraform-ec2-lab` → Moves into this new folder so you can keep your Terraform files organized. + +This directory will hold all the configuration files for this project. + + +## Step 2: Create the main configuration file + +Create a new file named **main.tf** and paste the following code: + +```hcl +provider "aws" { + region = "ap-south-1" +} + +resource "aws_instance" "lab_instance" { + ami = "ami-0e6329e222e662a52" # Amazon Linux 2 (Mumbai) + instance_type = "t2.micro" + + tags = { + Name = "Terraform-Lab-Instance" + } +} +``` + +### Explanation of the code + +- **provider "aws"** + Tells Terraform to use the AWS provider. The region is set to `ap-south-1` (Mumbai). This determines where your resources will be created. + +- **resource "aws_instance" "lab_instance"** + Declares that we want to create an EC2 instance resource in AWS. + - `ami`: The Amazon Machine Image (AMI) ID that defines the OS. Here we use Amazon Linux 2 in the Mumbai region. + - `instance_type`: Specifies the hardware size of the instance. `t2.micro` is eligible for AWS Free Tier. 
+  - `tags`: Adds a tag so the instance will appear in the AWS Console with the name **Terraform-Lab-Instance**.
+
+
+## Step 3: Initialize Terraform
+
+```bash
+terraform init
+```
+- Downloads the AWS provider plugin.
+- Prepares your working directory for use with Terraform.
+
+
+## Step 4: Preview the changes
+
+```bash
+terraform plan
+```
+- Shows the actions Terraform **will take** without actually applying them.
+- Useful for double-checking that the configuration does what you expect.
+
+
+## Step 5: Apply the configuration
+
+```bash
+terraform apply -auto-approve
+```
+- Creates the EC2 instance in AWS according to your configuration.
+- The `-auto-approve` flag skips the interactive “yes/no” confirmation step. (Normally Terraform asks before making changes.)
+
+
+## Step 6: Verify in AWS Console
+
+1. Log in to the AWS Management Console.
+2. Navigate to **EC2 → Instances**.
+3. You should see a running instance named **Terraform-Lab-Instance**.
+4. Confirm its details:
+   - Instance type: `t2.micro`
+   - AMI: Amazon Linux 2
+   - Region: ap-south-1 (Mumbai)
+
+
+## Wrap-Up
+
+Congratulations! 🎉 You just:
+- Wrote your first Terraform configuration.
+- Initialized Terraform.
+- Planned and applied infrastructure changes.
+- Verified the deployed EC2 instance in AWS.
+
+This is the foundation of Infrastructure as Code: **describe what you want → apply → verify**.
diff --git a/docs/03_remote_state_s3_dynamodb.md b/docs/03_remote_state_s3_dynamodb.md
new file mode 100644
index 0000000..447c48b
--- /dev/null
+++ b/docs/03_remote_state_s3_dynamodb.md
@@ -0,0 +1,110 @@
+# 3) Remote State with S3 + DynamoDB Lock (25 min)
+
+By default, Terraform stores its state in a local file called **terraform.tfstate**.
+This works fine for single-user setups, but in a team environment it can cause conflicts (two people updating at once).
+
+To solve this, we move the state file to a **remote backend** (Amazon S3) and use **DynamoDB** for state locking.
+This ensures only one person can apply changes at a time.
+
+
+## Step A: Create an S3 bucket for remote state
+
+```bash
+aws s3api create-bucket --bucket my-terraform-state-lab --region ap-south-1 --create-bucket-configuration LocationConstraint=ap-south-1
+```
+- Creates a new S3 bucket called `my-terraform-state-lab`.
+- Outside `us-east-1`, S3 requires the `--create-bucket-configuration LocationConstraint=<region>` option; without it the command fails with `IllegalLocationConstraintException`.
+- The bucket name must be **globally unique** across AWS. Change it to something like `terraform-state-yourname123`.
+
+```bash
+aws s3api put-bucket-versioning --bucket my-terraform-state-lab --versioning-configuration Status=Enabled
+```
+- Enables **versioning** on the bucket.
+- This allows you to roll back to older state files if something goes wrong.
+
+
+## Step B: Create a DynamoDB table for state locking
+
+```bash
+aws dynamodb create-table --table-name terraform-locks --attribute-definitions AttributeName=LockID,AttributeType=S --key-schema AttributeName=LockID,KeyType=HASH --billing-mode PAY_PER_REQUEST
+```
+- Creates a DynamoDB table called `terraform-locks`.
+- The `LockID` column acts as a **lock key**.
+- When Terraform runs, it inserts a lock entry in this table. This prevents two users from applying changes at the same time.
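+
+The two CLI commands above are the quickest way to bootstrap the backend. If you prefer to keep even these resources in code, a common alternative is a small, separate Terraform configuration (applied once with ordinary local state) that creates them. A minimal sketch, using hypothetical file names, might look like this:
+
+```hcl
+# backend-bootstrap/main.tf — a separate mini-project, applied once with local state
+provider "aws" {
+  region = "ap-south-1"
+}
+
+# Bucket that will later hold terraform.tfstate (name must be globally unique)
+resource "aws_s3_bucket" "tf_state" {
+  bucket = "my-terraform-state-lab"
+}
+
+# Keep old state versions around so you can roll back
+resource "aws_s3_bucket_versioning" "tf_state" {
+  bucket = aws_s3_bucket.tf_state.id
+  versioning_configuration {
+    status = "Enabled"
+  }
+}
+
+# Lock table used by the S3 backend
+resource "aws_dynamodb_table" "tf_locks" {
+  name         = "terraform-locks"
+  billing_mode = "PAY_PER_REQUEST"
+  hash_key     = "LockID"
+
+  attribute {
+    name = "LockID"
+    type = "S"
+  }
+}
+```
+
+Either approach works; the important part is that the bucket and table exist before you run `terraform init` against the S3 backend in Step C.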
+ +## Step C: Configure backend in Terraform + +Create a new project folder and move into it: + +```bash +mkdir -p ~/terraform-remote-lab && cd ~/terraform-remote-lab +``` + +Now create a file called **main.tf**: + +```hcl +terraform { + backend "s3" { + bucket = "my-terraform-state-lab" # replace with your bucket name + key = "dev/terraform.tfstate" # file path inside the bucket + region = "ap-south-1" + dynamodb_table = "terraform-locks" + encrypt = true + } +} + +provider "aws" { + region = "ap-south-1" +} + +resource "aws_s3_bucket" "demo" { + bucket = "my-demo-bucket-student12345" # must be unique + acl = "private" +} +``` + +### Explanation of the configuration +- **backend "s3"** → Tells Terraform to store state in S3. + - `bucket`: Your S3 bucket name. + - `key`: Path/name of the state file inside the bucket. + - `region`: Region of your bucket. + - `dynamodb_table`: Table used for locks. + - `encrypt`: Ensures the state file is encrypted at rest. + +- **provider "aws"** → Tells Terraform to use AWS as the provider. + +- **resource "aws_s3_bucket" "demo"** → A sample resource (an S3 bucket) to test remote state functionality. + + +## Step D: Initialize the backend + +```bash +terraform init +``` +- Initializes the working directory. +- Terraform will detect the `backend "s3"` block and migrate your local state to S3. +- You may be asked: *Do you want to copy existing state to the new backend?* → type `yes`. + + +## Step E: Apply and verify + +```bash +terraform apply -auto-approve +``` +- Creates the demo bucket defined in `main.tf`. +- Stores the state file in **S3** instead of locally. +- Uses **DynamoDB** to prevent parallel execution. + +Now check in AWS Console: +- Go to **S3 → your bucket → dev/terraform.tfstate** → you should see the state file. +- Go to **DynamoDB → terraform-locks** → while Terraform runs, a lock record will appear. + + +## Wrap-Up + +You have now: +- Configured Terraform to use a **remote state** in S3. +- Added **versioning** for rollback. +- Used **DynamoDB locking** for safe team collaboration. + +This setup is considered a best practice for production environments. + diff --git a/docs/04_variables_tfvars_outputs.md b/docs/04_variables_tfvars_outputs.md new file mode 100644 index 0000000..d7a34a6 --- /dev/null +++ b/docs/04_variables_tfvars_outputs.md @@ -0,0 +1,142 @@ +# 4) Variables, tfvars & Outputs (20 min) + +In this section, you’ll **parameterize** your Terraform configuration so it’s flexible across environments (like *dev* and *prod*) and learn how to **print useful values** after deployment. + + +## Why variables and tfvars? +- **Variables** let you avoid hard‑coding values (e.g., instance type, names). +- **.tfvars files** store a specific set of variable values for each environment. You can switch environments quickly without editing code. +- **Outputs** print important information (IDs, IPs, AZs) for later use in scripts, CD pipelines, or as inputs to other modules. + + +## Step 1: Create a working directory + +```bash +mkdir -p ~/terraform-vars-lab && cd ~/terraform-vars-lab +``` +Creates a clean folder for this exercise and moves into it. + +## Step 2: Define variables + +Create a file **variables.tf**: + +```hcl +variable "instance_type" {} +variable "instance_name" {} +``` +- Declares two variables, `instance_type` and `instance_name`. +- No defaults are provided, so Terraform will expect values (either via `-var` flags, `.tfvars` files, or prompts). 
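+
+Because there are no defaults, you can already see how values flow in. For example, you could supply them ad hoc on the command line or through environment variables (shown here purely to illustrate; the lab itself uses `.tfvars` files in Step 5):
+
+```bash
+# One-off values on the command line
+terraform plan -var="instance_type=t2.micro" -var="instance_name=Cli-Test"
+
+# Or via environment variables that Terraform picks up automatically
+export TF_VAR_instance_type=t2.micro
+export TF_VAR_instance_name=Env-Test
+terraform plan
+```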
+
+> Tip: You can add descriptions and types to make your code self-documenting:
+> ```hcl
+> variable "instance_type" {
+>   type        = string
+>   description = "EC2 size, e.g. t2.micro"
+> }
+>
+> variable "instance_name" {
+>   type        = string
+>   description = "Tag: Name for the instance"
+> }
+> ```
+
+## Step 3: Write the main configuration
+
+Create **main.tf**:
+
+```hcl
+provider "aws" { region = "ap-south-1" }
+
+resource "aws_instance" "ec2_instance" {
+  ami           = "ami-0e6329e222e662a52"
+  instance_type = var.instance_type
+
+  tags = { Name = var.instance_name }
+}
+```
+- **provider "aws"**: points Terraform to AWS in the Mumbai region.
+- **resource "aws_instance" "ec2_instance"**: creates an EC2 instance.
+  - `ami`: the base OS image (Amazon Linux 2 for ap-south-1).
+  - `instance_type`: references your variable using `var.instance_type`.
+  - `tags`: uses `var.instance_name` so the tag can change per environment.
+
+## Step 4: Expose useful outputs
+
+Create **outputs.tf**:
+
+```hcl
+output "instance_id" { value = aws_instance.ec2_instance.id }
+output "public_ip" { value = aws_instance.ec2_instance.public_ip }
+output "availability_zone" { value = aws_instance.ec2_instance.availability_zone }
+```
+- After apply, Terraform will print these values.
+- Great for quickly finding the instance or for handing values to other tools.
+
+> Tip: Add `sensitive = true` to hide secrets in logs:
+> ```hcl
+> output "db_password" {
+>   value     = var.db_password
+>   sensitive = true
+> }
+> ```
+
+## Step 5: Create environment files (tfvars)
+
+Create **dev.tfvars**:
+
+```hcl
+instance_type = "t2.micro"
+instance_name = "Dev-Instance"
+```
+
+Create **prod.tfvars**:
+
+```hcl
+instance_type = "t3.micro"
+instance_name = "Prod-Instance"
+```
+- These files define different values for the *same* variables.
+- You can add more environments (e.g., `test.tfvars`, `staging.tfvars`) as needed.
+
+> Tip: Keep environment files in version control, but never commit secrets. For secrets, use a secure tool (e.g., AWS SSM Parameter Store, Vault) or CI/CD-controlled variables.
+
+## Step 6: Initialize, apply, output, and switch environments
+
+```bash
+terraform init
+```
+- Downloads provider plugins and prepares the working directory.
+
+```bash
+terraform apply -var-file="dev.tfvars" -auto-approve
+```
+- Applies with **dev** values.
+- Creates a `t2.micro` instance named **Dev-Instance**.
+
+```bash
+terraform output
+```
+- Prints the values from **outputs.tf** to the terminal (instance id, public IP, AZ).
+
+```bash
+terraform destroy -auto-approve
+```
+- Destroys the **dev** instance to avoid charges before testing **prod**.
+
+```bash
+terraform apply -var-file="prod.tfvars" -auto-approve
+```
+- Applies with **prod** values.
+- Creates a `t3.micro` instance named **Prod-Instance**.
+
+```bash
+terraform output -json > output.json
+cat output.json
+```
+- Exports outputs in **JSON** format, perfect for automation pipelines.
+- `output.json` can be parsed by scripts or downstream systems.
+
+## Common gotchas & best practices
+- **Keep AMIs per region**: AMI IDs are region-specific. If you change regions, update the AMI.
+- **Validate before apply**: Run `terraform validate` and `terraform plan` to catch mistakes early.
+- **Naming**: Tags are your friend. Use `Name` tags consistently so you can find resources fast.
+- **Separate state**: For real projects, use separate state/workspaces or remote backends (S3 + DynamoDB) per environment.
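+
+For the first gotcha (region-specific AMIs), you can avoid hard-coding image IDs entirely by letting Terraform look the AMI up at plan time. Here is a sketch using a data source; the filter pattern assumes you want the latest Amazon Linux 2 x86_64 image:
+
+```hcl
+# Looks up the newest Amazon Linux 2 AMI in the current region at plan time
+data "aws_ami" "amazon_linux_2" {
+  most_recent = true
+  owners      = ["amazon"]
+
+  filter {
+    name   = "name"
+    values = ["amzn2-ami-hvm-*-x86_64-gp2"]
+  }
+}
+
+# ...then reference it instead of a literal ID:
+#   ami = data.aws_ami.amazon_linux_2.id
+```
+
+If you adopt this, keep in mind the plan can change when Amazon publishes a newer image, so some teams still pin the AMI in a variable instead.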
+
+## Wrap-Up
+You now know how to:
+- Declare variables and pass values with `.tfvars`
+- Reuse the same code for **multiple environments**
+- Surface key information using **outputs**
+- Export outputs to **JSON** for automation
+
+This pattern is the backbone of scalable Terraform projects.
diff --git a/docs/05_modules_reuse.md b/docs/05_modules_reuse.md
new file mode 100644
index 0000000..8afabf7
--- /dev/null
+++ b/docs/05_modules_reuse.md
@@ -0,0 +1,135 @@
+# 5) Build & Reuse a Simple Module (25 min)
+
+In this section, you’ll turn repeated EC2 logic into a **reusable Terraform module**.
+Modules make your code **DRY** (Don’t Repeat Yourself), easier to test, and simpler to use across teams and environments.
+
+## What is a module?
+A **module** is just a folder that contains Terraform configuration files (`.tf`).
+Your **root module** (the folder where you run the CLI) can **call** other modules from:
+- A local path (`./modules/ec2-instance`)
+- A Git repo (`git::https://...`), or
+- A registry (`registry.terraform.io`)
+
+Here we’ll use a **local** module.
+
+## Step 1: Create the project & module folder
+
+```bash
+mkdir -p ~/terraform-modules-lab/modules/ec2-instance
+cd ~/terraform-modules-lab
+```
+- Creates a workspace with a nested folder `modules/ec2-instance` that will contain reusable EC2 code.
+
+Your tree will look like:
+```
+terraform-modules-lab/
+  main.tf              # root (calls the module)
+  modules/
+    ec2-instance/
+      variables.tf
+      main.tf          # module implementation
+      outputs.tf
+```
+
+## Step 2: Define module inputs (variables)
+
+Create **modules/ec2-instance/variables.tf**:
+```hcl
+variable "instance_type" {
+  description = "Type of EC2 instance"
+  default     = "t2.micro"
+}
+
+variable "instance_name" {
+  description = "Tag name for instance"
+}
+
+variable "instance_count" {
+  description = "Number of EC2 instances"
+  default     = 1
+}
+```
+- These are **inputs** the module expects.
+- `default` makes inputs optional (callers can override).
+- You can add stronger typing & validation for safety:
+  ```hcl
+  variable "instance_type" {
+    type        = string
+    description = "EC2 size"
+    default     = "t2.micro"
+    validation {
+      condition     = can(regex("^t[23]\\.", var.instance_type))
+      error_message = "Use a t2.* or t3.* instance for this lab."
+    }
+  }
+  ```
+
+## Step 3: Implement the module logic
+
+Create **modules/ec2-instance/main.tf**:
+```hcl
+resource "aws_instance" "this" {
+  count         = var.instance_count
+  ami           = "ami-0e6329e222e662a52"
+  instance_type = var.instance_type
+  tags = { Name = "${var.instance_name}-${count.index}" }
+}
+```
+- Uses `count` to create **N instances** with a single block.
+- `tags.Name` includes the `count.index` suffix so each instance has a unique name (e.g., `App-Server-0`, `App-Server-1`).
+
+> **Note on AMIs:** AMI IDs are **region-specific**. We use an Amazon Linux 2 AMI for **ap-south-1 (Mumbai)**. If you switch regions, update this AMI or fetch it dynamically (e.g., with a data source).
+
+## Step 4: Expose useful outputs
+
+Create **modules/ec2-instance/outputs.tf**:
+```hcl
+output "instance_ids" { value = [for i in aws_instance.this : i.id] }
+output "public_ips" { value = [for i in aws_instance.this : i.public_ip] }
+```
+- Makes it easy for callers to **consume** important info (IDs, IPs).
+- The `for` expression collects values from the resource instances created via `count`.
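+
+As a side note, Terraform's splat syntax is an equivalent, slightly shorter way to write these list outputs; either form should produce the same result:
+
+```hcl
+# Splat form: collect one attribute from every instance created by count
+output "instance_ids" { value = aws_instance.this[*].id }
+output "public_ips"   { value = aws_instance.this[*].public_ip }
+```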
+ +## Step 5: Call the module from the root + +Create the **root** `main.tf` at `~/terraform-modules-lab/main.tf`: +```hcl +provider "aws" { region = "ap-south-1" } + +module "ec2_instance" { + source = "./modules/ec2-instance" + instance_type = "t2.micro" + instance_name = "App-Server" + instance_count = 2 +} +``` +- `source` points to the **local path** of your module folder. +- Inputs (`instance_type`, `instance_name`, `instance_count`) are set by the **caller** (the root module). +- You can define **multiple** module blocks to create different groups of instances, or use `.tfvars` files for environments. + +## Step 6: Initialize and apply + +```bash +terraform init +terraform apply -auto-approve +``` +- `terraform init` downloads providers and **fetches module sources** (local/Git/registry). +- `terraform apply` creates **2 EC2 instances** named `App-Server-0` and `App-Server-1` in **ap-south-1**. + +After the apply, view outputs: +```bash +terraform output +``` +You should see lists for `instance_ids` and `public_ips`. + +## How to reuse this module elsewhere +- Copy the `modules/ec2-instance` folder into any Terraform project and call it with `source = "./modules/ec2-instance"`. +- Or publish the module to a **Git repo** and reference with `source = "git::https://github.com/yourorg/yourrepo//modules/ec2-instance?ref=v1.0.0"`. +- Standardize inputs/outputs so teams can use it without reading internals. + +## Best practices +- **Version control your modules** (Git tags) to avoid breaking changes. +- Add **README.md** inside the module with usage examples and input/output docs. +- Prefer **types & validation** for variables. +- Keep AMIs **parametrized** or discover them via a `data "aws_ami"` query to avoid hard-coding. +- If multiple teams use the same module, consider a **private Terraform registry** or a Git monorepo with clear versioning. + +## Cleanup +When done (to avoid charges): +```bash +terraform destroy -auto-approve +``` + +## Summary +You built a **reusable EC2 module**, exposed clean **inputs/outputs**, and consumed it from the **root**. +This pattern scales to VPCs, RDS, ALBs, and more—compose modules like building blocks to create reliable infrastructure at speed. diff --git a/docs/06_nested_modules_vpc_ec2.md b/docs/06_nested_modules_vpc_ec2.md new file mode 100644 index 0000000..334344f --- /dev/null +++ b/docs/06_nested_modules_vpc_ec2.md @@ -0,0 +1,44 @@ +# 6) Nested Modules: VPC + EC2 (10–15 min) + +Create a minimal VPC module and keep EC2 separate for clarity. 
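+
+One thing to keep in mind: as written, the `ec2-instance` module from the previous section does not accept a subnet, so its instance lands in your account's default VPC rather than the VPC you are about to create (which also has no Internet gateway, so it is deliberately minimal). That keeps this section simple. If you later want the instance inside the new VPC, one way to wire it up is sketched below; it assumes you add a hypothetical `subnet_id` input to the `ec2-instance` module:
+
+```hcl
+# In modules/ec2-instance/variables.tf (hypothetical extra input)
+variable "subnet_id" {
+  type        = string
+  description = "Subnet to launch into; null means the default VPC"
+  default     = null
+}
+
+# In modules/ec2-instance/main.tf, pass it through on the resource:
+#   subnet_id = var.subnet_id
+
+# In the root main.tf, wire the two modules together:
+module "ec2_instance" {
+  source         = "./modules/ec2-instance"
+  instance_type  = "t2.micro"
+  instance_name  = "Nested-App"
+  instance_count = 1
+  subnet_id      = module.vpc.subnet_id
+}
+```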
+ +```bash +mkdir -p ~/terraform-modules-lab/modules/vpc +cd ~/terraform-modules-lab +``` + +**modules/vpc/main.tf** +```hcl +resource "aws_vpc" "this" { cidr_block = "10.0.0.0/16" } + +resource "aws_subnet" "public" { + vpc_id = aws_vpc.this.id + cidr_block = "10.0.1.0/24" + availability_zone = "ap-south-1a" +} +``` + +**modules/vpc/outputs.tf** +```hcl +output "vpc_id" { value = aws_vpc.this.id } +output "subnet_id" { value = aws_subnet.public.id } +``` + +Update **root main.tf** to include both modules: +```hcl +provider "aws" { region = "ap-south-1" } + +module "vpc" { source = "./modules/vpc" } + +module "ec2_instance" { + source = "./modules/ec2-instance" + instance_type = "t2.micro" + instance_name = "Nested-App" + instance_count = 1 +} +``` + +Apply: +```bash +terraform apply -auto-approve +``` diff --git a/docs/07_cleanup.md b/docs/07_cleanup.md new file mode 100644 index 0000000..216a667 --- /dev/null +++ b/docs/07_cleanup.md @@ -0,0 +1,21 @@ +# 7) Cleanup (5 min) + +Always delete resources to avoid charges. + +From each working directory you used: +```bash +terraform destroy -auto-approve || true +``` + +If you created the S3 bucket and DynamoDB table for backend testing, remove them **after** you’ve destroyed all Terraform-managed resources and the state is no longer needed. + +Delete S3 (must be empty first): +```bash +aws s3 rm s3://my-terraform-state-lab --recursive +aws s3api delete-bucket --bucket my-terraform-state-lab --region ap-south-1 +``` + +Delete DynamoDB table: +```bash +aws dynamodb delete-table --table-name terraform-locks +``` \ No newline at end of file diff --git a/docs/08_test.md b/docs/08_test.md new file mode 100644 index 0000000..5cd872a --- /dev/null +++ b/docs/08_test.md @@ -0,0 +1,171 @@ +Markdown Quick Reference +======================== + +This guide is a very brief overview, with examples, of the syntax that [Markdown] supports. It is itself written in Markdown and you can copy the samples over to the left-hand pane for experimentation. It's shown as *text* and not *rendered HTML*. + +[Markdown]: http://daringfireball.net/projects/markdown/ + + +Simple Text Formatting +====================== + +First thing is first. You can use *stars* or _underscores_ for italics. **Double stars** and __double underscores__ for bold. ***Three together*** for ___both___. + +Paragraphs are pretty easy too. Just have a blank line between chunks of text. + +> This chunk of text is in a block quote. Its multiple lines will all be +> indented a bit from the rest of the text. +> +> > Multiple levels of block quotes also work. + +Sometimes you want to include code, such as when you are explaining how `
<h1>` tags work.

Lists
=====

- Lists can have multiple paragraphs

  You can keep adding more and more paragraphs to a single
  list item by adding the traditional blank line and then keep
  on indenting the paragraphs with two spaces.

  You really only need to indent the first line,
  but that looks ugly.
+
+- Lists support blockquotes
+
+ > Just like this example here. By the way, you can
+ > nest lists inside blockquotes!
+ > - Fantastic!
+
+- Lists support preformatted text
+
+ You just need to indent an additional four spaces.
+
+
+Even More
+=========
+
+Horizontal Rule
+---------------
+
If you need a horizontal rule, just put at least three hyphens, asterisks, or underscores on a line by themselves. You can even put spaces between the characters.
+
+---
+****************************
+_ _ _ _ _ _ _
+
Those three all produced horizontal lines. Keep in mind that three hyphens under any text turn that text into a heading, so add a blank line before the hyphens if you use them for a rule.
+
+Images
+------
+
+Images work exactly like links, but they have exclamation points in front. They work with references and titles too.
+
Here is a reference-style example: ![Happy].
+
+[Happy]: https://wpclipart.com/smiley/happy/simple_colors/smiley_face_simple_green_small.png ("Smiley face")
+
+
+Inline HTML
+-----------
+
If Markdown is too limiting, you can just insert your own crazy HTML. Span-level HTML can *still* use Markdown. Block-level elements must be separated from the surrounding text by a blank line and must not be indented (no spaces before the opening and closing tags).
+
+