S3 source
This documentation includes enhancements introduced in version 1.17. If you are using an older version of Panther, please refer to the documentation for that version.

Set Up Permissions to Pull Data

The steps below enable secure access for Panther to pull security logs from your S3 bucket(s).
From Integrations, navigate to Log Sources > Add Source > Data Transport > AWS S3 Bucket.

Step 1: Enter the Bucket Details

Name (required): Friendly name of the S3 source.
Account ID (required): The 12-digit AWS account ID where the S3 bucket is located.
Bucket Name (required): The S3 bucket ID/name to onboard.
KMS Key (optional): If your data is encrypted using SSE-KMS, provide the ARN of the KMS key.
Stream Type (required): Events can be line-delimited, in JSON array format, or delivered to S3 from CloudWatch Logs. If an incorrect stream type is chosen, Panther will trigger an S3 Get.Object system error alert.
S3 Prefix Inclusion and Exclusion Filters & Log Types (required): The Log Types Panther should use to parse S3 objects matching the S3 Prefix Filter. Use the Exclusion field to indicate which prefixes to exclude. At least one Log Type must be selected from the dropdown menu. All S3 Prefix Filters may be left blank to allow ingestion of all files.
Click Next.

Step 2: Set up IAM role

Panther needs an AWS IAM role with permissions to read objects from your S3 bucket. You can use Panther's provided CloudFormation templates to create the IAM role, either by downloading the CloudFormation template (first option in the UI) or by launching a CloudFormation stack from the AWS console (second option in the UI). Alternatively, you can create the role yourself and fill in the role ARN in Panther (third option in the UI).

Creating an IAM role using AWS Console UI

This option will redirect you to the AWS console with the template URL pre-filled. The CloudFormation stack will create an AWS IAM role with the minimum permissions required to read objects from your S3 bucket.

Creating an IAM role using CloudFormation Template File

This option allows you to download the template and apply it through your own pipeline.
After the CloudFormation stack creation is complete, the role ARN will be visible in the Outputs of the stack.
Fill in the role ARN in Panther and click Continue Setup.
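If you apply the downloaded template through your own tooling, the sketch below shows one possible way to do it with boto3 (it is not an official Panther utility). The template file name, stack name, and output key are assumptions; check the template you downloaded, which may also require parameters.
import boto3

cfn = boto3.client("cloudformation")

# Read the CloudFormation template downloaded from the Panther UI.
# The file name here is only an example.
with open("panther-log-source-iam-role.yml") as f:
    template_body = f.read()

# Create the stack; the template creates an IAM role, so IAM capabilities are required.
cfn.create_stack(
    StackName="panther-log-source-iam-role",  # any name you choose
    TemplateBody=template_body,
    Capabilities=["CAPABILITY_NAMED_IAM"],
)
cfn.get_waiter("stack_create_complete").wait(StackName="panther-log-source-iam-role")

# The role ARN appears in the stack Outputs; the exact output key depends on the template.
outputs = cfn.describe_stacks(StackName="panther-log-source-iam-role")["Stacks"][0]["Outputs"]
role_arn = next(o["OutputValue"] for o in outputs if "Role" in o["OutputKey"])
print(role_arn)  # paste this ARN into Panther and click Continue Setup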

Creating an IAM role manually or with other automation

You may create the required IAM role manually or through your own automation, and then fill in the role ARN in Panther. Note that the IAM role policy must include at least the statements defined in the policy below:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": ["s3:GetBucketLocation", "s3:ListBucket"],
      "Resource": "arn:aws:s3:::<bucket-name>",
      "Effect": "Allow"
    },
    {
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::<bucket-name>/*",
      "Effect": "Allow"
    }
  ]
}
When the IAM role is ready, fill in the role ARN in Panther and click Continue Setup.
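If you create the role with your own automation, the following boto3 sketch shows one way to do it with the permission statements above. The role and policy names are arbitrary, and the trust policy shown is only a placeholder assumption; use the exact trust relationship (principal and any conditions) defined in Panther's provided templates.
import json
import boto3

iam = boto3.client("iam")

BUCKET_NAME = "<bucket-name>"
PANTHER_ACCOUNT_ID = "<panther-aws-account-id>"  # placeholder; confirm the correct principal in Panther's template

# Placeholder trust policy; replace with the trust relationship from Panther's template.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": f"arn:aws:iam::{PANTHER_ACCOUNT_ID}:root"},
        "Action": "sts:AssumeRole",
    }],
}

# Minimum permissions from the policy above.
permissions_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {"Action": ["s3:GetBucketLocation", "s3:ListBucket"],
         "Resource": f"arn:aws:s3:::{BUCKET_NAME}", "Effect": "Allow"},
        {"Action": "s3:GetObject",
         "Resource": f"arn:aws:s3:::{BUCKET_NAME}/*", "Effect": "Allow"},
    ],
}

role = iam.create_role(
    RoleName="panther-log-source-role",  # any name you choose
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)
iam.put_role_policy(
    RoleName="panther-log-source-role",
    PolicyName="ReadLogData",
    PolicyDocument=json.dumps(permissions_policy),
)
print(role["Role"]["Arn"])  # fill this ARN in Panther and click Continue Setup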

Step 3: Finish source setup

Once you've successfully set up the IAM role, you will see a success screen in the Panther UI. Any permission errors detected will be surfaced on this screen, and you will be asked to try configuring the IAM role again.
Once setup is complete, be sure to configure a log drop-off alarm that will alert you if data stops flowing from the log source, and set an appropriate time interval after which Panther should alert you that the log source is not sending data.
If you haven't opted in for Panther-managed notifications, follow the steps below to configure notifications for your S3 bucket.

Data Backup

Once the S3 bucket has been successfully onboarded to Panther and data is flowing, Panther will back up all raw logs for up to 30 days. After 30 days, the logs are deleted. The raw logs are used for various reasons:
  • To back up dropped logs that may not have been successfully normalized and classified in Panther's data processing pipeline

Set Up Notifications of New Data

Now that Panther has the ability to pull log data, you need to configure your S3 buckets to send notifications when new data arrives.
We will configure the bucket to send notifications for new files to an SNS topic, which in turn will notify Panther's SQS queue.

Create SNS Topic

If you have already configured the bucket to send All object create events to an SNS topic, proceed to modify an existing SNS topic and subscribe it to Panther's input data queue.
First, create an SNS Topic and SNS Subscription to notify Panther that new data is ready for processing.
Log into the AWS Console of the account that owns the S3 bucket. Select the AWS Region where your S3 bucket is located, navigate to the CloudFormation console, and click on Create Stack (with new resources).
Under the Specify template section, enter the following Amazon S3 URL:
https://panther-public-cloudformation-templates.s3-us-west-2.amazonaws.com/panther-log-processing-notifications/latest/template.yml
Specify the stack details below:
Stack name: A name of your choice, e.g. panther-log-processing-notifications-<bucket-label>
MasterAccountId: The 12-digit AWS account ID where Panther is deployed
PantherRegion: The region where Panther is deployed
SnsTopicName: The name of the SNS topic receiving the notification; by default this is panther-notifications-topic
Click on Next, Next, and then Create Stack.
This stack has one output named SnsTopicArn.
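If you would rather create this stack programmatically than through the console, a rough boto3 equivalent is sketched below; the stack name is arbitrary and the placeholder values must be replaced with your own.
import boto3

# Use the region where the S3 bucket (and therefore the SNS topic) lives.
cfn = boto3.client("cloudformation", region_name="<s3-bucket-region>")

STACK_NAME = "panther-log-processing-notifications"  # any name you choose

cfn.create_stack(
    StackName=STACK_NAME,
    TemplateURL=(
        "https://panther-public-cloudformation-templates.s3-us-west-2.amazonaws.com"
        "/panther-log-processing-notifications/latest/template.yml"
    ),
    Parameters=[
        {"ParameterKey": "MasterAccountId", "ParameterValue": "<panther-master-account-id>"},
        {"ParameterKey": "PantherRegion", "ParameterValue": "<panther-region>"},
        {"ParameterKey": "SnsTopicName", "ParameterValue": "panther-notifications-topic"},
    ],
)
cfn.get_waiter("stack_create_complete").wait(StackName=STACK_NAME)

# Read the SnsTopicArn output from the stack.
outputs = cfn.describe_stacks(StackName=STACK_NAME)["Stacks"][0]["Outputs"]
print(next(o["OutputValue"] for o in outputs if o["OutputKey"] == "SnsTopicArn"))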

Modify an existing SNS topic

If you opted to create a new SNS topic in the previous step, skip this step and proceed to configure bucket notifications below.
Follow the steps below if you wish to use an existing topic for sending bucket notifications. Note that the SNS topic must be in the same region as your S3 bucket.

Modify SNS Access Policy

Create a subscription between your SNS topic and Panther's log processing SQS queue.
  1. Log into the AWS Console for the account where your S3 bucket exists
  2. Navigate to the SNS Console and select the SNS topic currently receiving events
  3. Note the ARN of this SNS topic
  4. Select the Edit button and scroll down to the Access Policy card
  5. Add the statement shown below to the topic's Access Policy. Populate <MasterAccountId> with the 12-digit account ID where Panther is deployed (navigate to the Settings > General page in your Panther UI to obtain this ID), and populate <SNS-TOPIC-ARN> with the ARN you noted in step 3:
{
  "Sid": "CrossAccountSubscription",
  "Effect": "Allow",
  "Principal": {
    "AWS": "arn:aws:iam::<MasterAccountId>:root"
  },
  "Action": "sns:Subscribe",
  "Resource": "<SNS-TOPIC-ARN>"
}
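If you manage the topic programmatically, the boto3 sketch below makes the same change: it reads the topic's current access policy, appends the statement above, and writes the policy back. The placeholders match the ones used in the statement.
import json
import boto3

sns = boto3.client("sns", region_name="<s3-bucket-region>")

TOPIC_ARN = "<SNS-TOPIC-ARN>"             # the ARN noted in step 3
PANTHER_ACCOUNT_ID = "<MasterAccountId>"  # Panther's 12-digit account ID

# Fetch the topic's current access policy.
attrs = sns.get_topic_attributes(TopicArn=TOPIC_ARN)
policy = json.loads(attrs["Attributes"]["Policy"])

# Append the cross-account subscription statement shown above.
policy["Statement"].append({
    "Sid": "CrossAccountSubscription",
    "Effect": "Allow",
    "Principal": {"AWS": f"arn:aws:iam::{PANTHER_ACCOUNT_ID}:root"},
    "Action": "sns:Subscribe",
    "Resource": TOPIC_ARN,
})

# Write the updated policy back to the topic.
sns.set_topic_attributes(
    TopicArn=TOPIC_ARN,
    AttributeName="Policy",
    AttributeValue=json.dumps(policy),
)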

Create SNS Subscription

Finally, create the subscription to the Panther Master account's SQS queue.
From the SNS Console, select the Create subscription button:
  1. Protocol: Amazon SQS
  2. Endpoint: arn:aws:sqs:<PantherRegion>:<MasterAccountId>:panther-input-data-notifications-queue
  3. Select the Create subscription button
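The same subscription can be created programmatically; a minimal boto3 sketch, using the placeholders above, is:
import boto3

sns = boto3.client("sns", region_name="<s3-bucket-region>")

# Subscribe Panther's input data queue to the topic.
sns.subscribe(
    TopicArn="<SNS-TOPIC-ARN>",
    Protocol="sqs",
    Endpoint="arn:aws:sqs:<PantherRegion>:<MasterAccountId>:panther-input-data-notifications-queue",
)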

Configure Event Notifications on the bucket

With the SNS Topic created, the final step is to enable notifications from the S3 buckets.
Navigate to the AWS S3 Console, select the relevant bucket, and click the Properties tab.
From there, find the Event notifications card. Click + Create event notification and use the following settings:
Name: PantherEventNotifications
Events: All object create events
Send to: SNS Topic
SNS: panther-notifications-topic
Suffix: (optional) limits notifications to objects with keys that end in matching characters
Prefix: (optional) limits notifications to objects with keys that start with matching characters
Click Save.
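If you prefer to configure the bucket notification with scripts or infrastructure-as-code rather than the console, a rough boto3 equivalent of the settings above is sketched below. Note that this API call replaces the bucket's entire notification configuration, so merge it with any existing configuration first.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_notification_configuration(
    Bucket="<bucket-name>",
    NotificationConfiguration={
        "TopicConfigurations": [
            {
                "Id": "PantherEventNotifications",
                "TopicArn": "<SNS-TOPIC-ARN>",      # e.g. the panther-notifications-topic ARN
                "Events": ["s3:ObjectCreated:*"],   # "All object create events"
                # Optional prefix/suffix filter, for example:
                # "Filter": {"Key": {"FilterRules": [{"Name": "prefix", "Value": "logs/"}]}},
            }
        ]
    },
)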
That's it! Everything should now be set up correctly, and Panther can start processing new files arriving in your bucket.

Viewing Collected Logs

After log sources are configured, your data can be searched in Data Explorer! Learn more here.