Security Operations Engineer Projects

By Oliver Awa. Updated Oct 12, 2024. First published Oct 12, 2022.

Store, retrieve, and manage sensitive credentials in AWS Secrets Manager.

Security architects want to reduce their application teams' access to plaintext secrets. Developers want a mechanism to retrieve secrets securely without hard-coding credentials in their applications. They also want assurance that secret rotation will not affect application availability. Compliance teams want mechanisms to monitor the security of secrets and their alignment with best practices and policy. Finally, the SOC wants mechanisms to respond to unauthorized or erroneous actions on secrets.
In this workshop, you will use a sample serverless application in which AWS Lambda functions connect to an Amazon RDS database. You will test programmatic retrieval of database credentials from AWS Secrets Manager and implement attribute-based access control (ABAC) using tags. You will monitor the compliance status of secrets using AWS Config. Later, you will rotate secrets within AWS Secrets Manager and test application access. Finally, you will simulate attempts to delete the secret's resource policy and to retrieve secrets in plaintext from AWS Secrets Manager. Attendees will use Amazon EventBridge event-driven response to deploy incident response workflows that rotate the secret, restore the resource policy, alert the SOC, and deny access to the offender.
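As a sketch of the ABAC approach mentioned above, an IAM policy can allow GetSecretValue only when a tag on the secret matches the same tag on the calling principal. The statement below is illustrative; the tag key app-id is an assumption, not something defined by the workshop:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AbacGetSecret",
      "Effect": "Allow",
      "Action": "secretsmanager:GetSecretValue",
      "Resource": "*",
      "Condition": {
        "StringEquals": {
          "secretsmanager:ResourceTag/app-id": "${aws:PrincipalTag/app-id}"
        }
      }
    }
  ]
}
```

With a policy like this attached, a Lambda execution role tagged app-id=payments can retrieve only secrets that carry the matching app-id=payments tag, without listing individual secret ARNs in the policy.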

Scenario

You are working at a company that is moving toward storing its credentials in AWS Secrets Manager. Rather than using hard-coded credentials in applications, developers will use Secrets Manager to retrieve the database credentials for connecting to the database.

In this workshop, you will wear two hats. First, you will wear the administrator hat to deploy the configuration that manages the secrets stored in Secrets Manager. Second, you will wear the developer hat to configure your application to request database credentials from Secrets Manager instead of using hard-coded credentials. You will also test the incident response configuration that you put in place to remediate unauthorized secret update and retrieval actions.

This workshop is broken up into setup and then four modules.

Mitigating Common Web Application Attack Vectors Using AWS WAF

Welcome to Home Depot! You have just joined the team, and your first task is to enhance security for the company website. The site runs on Linux, PHP, and Apache, and uses EC2 instances in an Auto Scaling group behind an Application Load Balancer (ALB). After an initial architecture assessment, you found multiple vulnerabilities and configuration issues. The dev team is swamped and will not be able to remediate code-level issues for several weeks. Your mission in this workshop is to build an effective set of controls that mitigate common attack vectors against web applications and provide the monitoring capabilities needed to react to emerging threats as they occur.
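One representative control is attaching an AWS Managed Rule group to a WAFv2 web ACL in front of the ALB. The rule sketch below enables the SQL injection managed rule set; the rule name and metric name are placeholders:

```json
{
  "Name": "sqli-managed-rules",
  "Priority": 1,
  "Statement": {
    "ManagedRuleGroupStatement": {
      "VendorName": "AWS",
      "Name": "AWSManagedRulesSQLiRuleSet"
    }
  },
  "OverrideAction": { "None": {} },
  "VisibilityConfig": {
    "SampledRequestsEnabled": true,
    "CloudWatchMetricsEnabled": true,
    "MetricName": "sqli-managed-rules"
  }
}
```

Because the dev team cannot ship code fixes for weeks, managed rule groups like this act as a compensating control at the edge, and the VisibilityConfig settings feed CloudWatch metrics and sampled requests for the monitoring side of the mission.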

Working on the solution

Data Encryption

The purpose of this project is to educate users about AWS KMS encryption-at-rest capabilities within a few AWS services.

Problem Statement

As a new security architect for the Amazon Web Services environment within your firm, you receive a request from your governance and compliance department asking you to review and demonstrate privacy controls for data stored in AWS.
Reviewing the results of your Well-Architected review to protect data at rest, you discover that while your company enabled encryption in some areas, the configuration parameters are inconsistent. Additionally, your governance, compliance, and audit team asked you to provide a report on data consumers.
Working with your Amazon Web Services architects, you identify several focus areas.

  1. Logging and archival. Some CloudTrail logs can contain production data. You need to ensure the controls applied to CloudTrail logs can meet the privacy controls.
  2. Privacy and Security of data at rest for EC2 instances and data backups of those instances
  3. Privacy and Security of data for higher-level services like RDS
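For the first focus area, CloudTrail can encrypt its log files with a customer-managed KMS key. A key policy statement along the following lines (the account ID 111122223333 is a placeholder) grants the CloudTrail service permission to generate data keys against that key:

```json
{
  "Sid": "AllowCloudTrailEncrypt",
  "Effect": "Allow",
  "Principal": { "Service": "cloudtrail.amazonaws.com" },
  "Action": "kms:GenerateDataKey*",
  "Resource": "*",
  "Condition": {
    "StringLike": {
      "kms:EncryptionContext:aws:cloudtrail:arn": "arn:aws:cloudtrail:*:111122223333:trail/*"
    }
  }
}
```

For the second focus area, EBS encryption by default can be turned on per region with the `aws ec2 enable-ebs-encryption-by-default` CLI command, which addresses the inconsistent-configuration finding for new EC2 volumes and their snapshots.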

Introduction

Project 1, Part 2 - VirtualBox and Vagrant installation on Windows and other operating systems

The goal of this project is to walk you through setting up a local lab environment on Windows and other operating systems. After installation, we will create Linux virtual machines that run on your Windows or Mac system, and we will connect to those virtual machines using SSH (Secure Shell).

Project 2, Part 1 - Create your first Vagrant project

We will create a folder to hold all of the Vagrant-related files. Since we will be working on different projects, we will create a subfolder (subdirectory) to host each individual project's files.
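The folder layout can be created as below; the folder names are arbitrary choices, and the `vagrant init` step (shown commented out, since it requires Vagrant to be installed) is what writes the Vagrantfile into the project folder:

```shell
# Parent folder for all Vagrant work, with one subdirectory per project
mkdir -p ~/vagrant-projects/project-1
cd ~/vagrant-projects/project-1

# Generate a Vagrantfile for the box added earlier:
# vagrant init hashicorp/bionic64
```

Keeping one subdirectory per project matters because Vagrant looks for a Vagrantfile in the current directory, so each project gets its own independent machine definition.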

Project 2, Part 2 - Provision a virtual machine from the added Vagrant box with other software

Vagrant allows you to automatically provision environments, including web servers. In this project, we will use the Vagrant box "hashicorp/bionic64" that we added in Project 1, Part 1 to boot up a machine together with the Apache web server. We will then create a simple HTML page that will be displayed in the browser.
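A minimal Vagrantfile sketch for this step might look like the following; the forwarded port 8080 and the contents of the test page are assumptions, not requirements of the project:

```ruby
Vagrant.configure("2") do |config|
  config.vm.box = "hashicorp/bionic64"
  # Make the guest's Apache reachable at http://localhost:8080 on the host
  config.vm.network "forwarded_port", guest: 80, host: 8080
  # Shell provisioner runs once at first "vagrant up" to install Apache
  config.vm.provision "shell", inline: <<-SHELL
    apt-get update
    apt-get install -y apache2
    echo "<h1>Hello from Vagrant</h1>" > /var/www/html/index.html
  SHELL
end
```

After `vagrant up`, browsing to http://localhost:8080 on the host should show the page served by Apache inside the guest.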

Project 3 - Monitor Linux servers Uptime with Prometheus

Performance monitoring and alerting are crucial for measuring the performance of an application running in a production environment. In this project, you will create a metrics collection and graphing system that lets you visually see system utilization for a given host or across an entire environment. You will install two popular open-source tools, Prometheus and Grafana, alongside Node Exporter, and then use them to monitor servers running in your environment.
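A minimal prometheus.yml scrape configuration for Node Exporter might look like this (the target host is a placeholder; 9100 is Node Exporter's default port):

```yaml
scrape_configs:
  - job_name: "node"
    scrape_interval: 15s
    static_configs:
      - targets: ["localhost:9100"]
```

For the uptime goal specifically, Prometheus's built-in `up` metric reports 1 or 0 per scrape target, so a Grafana panel graphing `up{job="node"}` gives a simple per-server availability view.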

Project 4- Centralized Syslog Solution with the ELK Stack

The goal for this project is to create a centralized syslog server that will allow you to store, graph, and search through the syslog messages from multiple servers. To do this, you'll be deploying the ELK stack. The components of the ELK stack are Elasticsearch, Logstash, and Kibana. Finally, you'll configure servers to send their messages to this new system.
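One common wiring for this, shown as a sketch (the hostname and port are placeholders): each client forwards its syslog stream via rsyslog, and Logstash listens with its syslog input and writes to Elasticsearch:

```
# /etc/rsyslog.d/90-forward.conf on each client: forward all messages over TCP
*.* @@logserver.example.com:5514

# Logstash pipeline on the ELK host (e.g. /etc/logstash/conf.d/syslog.conf)
input  { syslog { port => 5514 } }
output { elasticsearch { hosts => ["localhost:9200"] } }
```

The double `@@` in rsyslog selects TCP rather than UDP; once events are flowing into Elasticsearch, Kibana provides the search and graphing layer described above.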

Project 5 - Hesk, Jira, or Kanboard

Project 6 - Icinga

Project 7 - Telegraf, InfluxDB, and Grafana