How to use modern logging and serverless technology to jump-start your security logging pipeline.

Forward Security Logs to S3 via Fluentd
To detect and prevent security breaches, security teams must understand everything that is happening in the environment. The primary way to accomplish this is by monitoring and analyzing log events, which provide information on activity within a system.
Traditionally, this was done with the built-in Unix logging facility syslog, where data was sent to a set of aggregation points for storage, searching, and analysis.
However, collecting high-value security logs from a large fleet of machines can be a challenge. Luckily, there are new tools to help. Over the years, new projects have emerged for performant and flexible log management, such as Fluentd and Logstash.
In this tutorial, we will walk through how to aggregate and store security logs the cloud-native way. We will use Fluentd to transport syslog data from AWS EC2 instances to Amazon S3 in a secure and performant manner. Syslog provides information on users connecting to systems, running sudo commands, installing applications, and more.
Getting Started with EC2 Monitoring
Make sure you have the following set up: an AWS account with the AWS CLI configured, an EC2 key pair, and a local clone of the Panther Labs tutorials repository:
$ git clone git@github.com:panther-labs/tutorials.git && cd tutorials
You will use the AWS CLI to run CloudFormation from the Panther Labs tutorials repository with predefined templates.
Step 1: Set Up the S3 Bucket, Instance Profile, and IAM Role
To centralize data from AWS EC2, we will use an S3 bucket and an IAM Instance Profile that permits EC2 to send data to the bucket. Instance profiles allow temporary credentials to be generated, which avoids the use of long-lived credentials.
Run the command below from the panther-labs/tutorials directory to set up all the required infrastructure above:
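A sketch of the deploy command, assuming the repository ships a CloudFormation template for this step; the template path and stack name below are illustrative, so check the repository for the actual filenames:

```shell
# Deploy the S3 bucket, IAM Role, and Instance Profile via CloudFormation.
# NOTE: the template file and stack name are assumptions for illustration.
aws cloudformation deploy \
  --template-file templates/ec2-logging-infra.yml \
  --stack-name ec2-security-logging \
  --capabilities CAPABILITY_IAM
```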
This will create the following:
S3 Data Bucket
Write-only IAM Role to send data to S3
An IAM Role to allow AWS EC2 to assume the write-only Role
IAM Instance Profile to attach to the instance
Step 2: Launch AWS EC2 Instance and Configure Fluentd
Next, launch an Ubuntu instance with the IAM Instance Profile created above:
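A sketch of the launch command; the AMI ID, key pair name, and Instance Profile name are placeholders you would substitute with your own values:

```shell
# Launch a single Ubuntu instance with the write-only Instance Profile attached.
# NOTE: all bracketed values are placeholders, not values from this tutorial.
aws ec2 run-instances \
  --image-id <ubuntu-ami-id> \
  --instance-type t2.micro \
  --key-name <your-keypair> \
  --iam-instance-profile Name=<instance-profile-name> \
  --count 1
```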
After a couple of moments, the instance will change to the running status in the AWS EC2 Console.

Connect to the instance with SSH:
$ ssh ubuntu@<public-dns-name> -i <path/to/keypair>
Follow the official Fluentd installation guide to install the td-agent package on the instance.
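On Ubuntu, installation typically uses Treasure Data's install script for td-agent. The exact script URL depends on your Ubuntu release and td-agent version, so treat the one below as an assumption and confirm it against the installation guide:

```shell
# Install td-agent (Fluentd) on Ubuntu 20.04 "focal".
# NOTE: verify this URL for your Ubuntu release and desired td-agent version.
curl -fsSL https://toolbelt.treasuredata.com/sh/install-ubuntu-focal-td-agent4.sh | sh
```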
Use the following Fluentd configuration (/etc/td-agent/td-agent.conf) to consume syslog messages from localhost and send them to S3:
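A minimal sketch of such a configuration, assuming you substitute the bucket name and region from Step 1; the listening port and buffer settings are illustrative, not values from this tutorial:

```
# Listen for syslog messages forwarded by the local rsyslog daemon
<source>
  @type syslog
  port 5140
  bind 127.0.0.1
  tag system
</source>

# Buffer events on disk, then ship them to S3 in hourly chunks
<match system.**>
  @type s3
  s3_bucket <your-data-bucket>
  s3_region <your-region>
  path syslog/
  <buffer time>
    @type file
    path /var/log/td-agent/buffer/s3
    timekey 3600
    timekey_wait 10m
  </buffer>
</match>
```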
Next, configure rsyslog to forward messages to the local Fluentd daemon by adding these two lines to the bottom of /etc/rsyslog.d/50-default.conf:
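A sketch of the forwarding rule, assuming Fluentd's syslog source is listening on UDP port 5140 (adjust the port to match your Fluentd configuration):

```
# Forward all facilities and priorities to the local Fluentd syslog source (UDP)
*.* @127.0.0.1:5140
```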
To enable this logging pipeline, start both services below:
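Assuming a systemd-based Ubuntu instance, that looks like restarting rsyslog to pick up the new forwarding rule and starting the td-agent service:

```shell
# Restart rsyslog so it loads the new forwarding rule, then start Fluentd
sudo systemctl restart rsyslog
sudo systemctl start td-agent
```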
To verify the Fluentd (td-agent) service is properly running:
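For example, by checking the service state and tailing the agent log (standard systemd and td-agent log locations):

```shell
# Confirm the service is active, then watch the agent log for errors
sudo systemctl status td-agent
sudo tail -f /var/log/td-agent/td-agent.log
```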
If the service is unable to load or is throwing errors, verify the following:
The /etc/td-agent/td-agent.conf has no syntax errors
The IAM Role is properly attached to the instance
If no errors are present, continue onward!
Step 3: View Logs in S3
After about an hour of data has been generated, you should see data landing in your S3 bucket:
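One way to check from the command line, assuming the bucket name and syslog/ prefix from the earlier configuration:

```shell
# List all objects Fluentd has written under the syslog/ prefix
aws s3 ls s3://<your-data-bucket>/syslog/ --recursive
```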

Each file will have the following format:
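With the out_s3 plugin's defaults, object keys follow the pattern %{path}%{time_slice}_%{index}.%{file_extension}, so a file under the syslog/ prefix would look something like this (illustrative):

```
syslog/2019071015_0.gz
```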
To search through files, you can use S3 Select with the following settings:

S3 Select File Settings
And then issue a SQL query to look for sshd events:
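A sketch of such a query; S3 Select addresses delimited columns positionally (s._1, s._2, ...) when the file has no header row, so the column index here is an assumption about the file layout:

```sql
SELECT * FROM S3Object s WHERE s._1 LIKE '%sshd%'
```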

S3 Select Query and Result
If you are reading this, everything is working and you made it to the end!
Continuously Monitor EC2 Instances with Panther
This tutorial taught you how to configure secure and performant security log collection with Fluentd, sending data directly to an S3 bucket. This is a jumping-off point for more sophisticated collection and analysis of your choosing.
Monitoring EC2 is critical for understanding the history of EC2 metrics changes and detecting suspicious activity. Panther’s built-in policies support the continuous monitoring of EC2 instances, or you can write your own detections in Python to fit your internal business use cases.
Want to learn more about Panther? Book a demo today and find out why Panther is loved by cloud-first security teams.