MacOS System Logs to S3 via Fluentd

Overview

This guide provides a method to deliver MacOS System Logs to S3 using Fluentd. There are two pipeline flows: via an AWS Firehose delivery stream, or directly to an AWS S3 bucket.

Prerequisites

This guide assumes that an S3 bucket or Firehose has already been created. If you need to create either of these resources, please see the Getting Started with Fluentd guide. If you have already provisioned the resources, you can adapt the guide below to fit your needs.

Setup Fluentd

Step 1. Install Fluentd (td-agent)

Follow the Fluentd installation instructions for the machine from which you want to collect MacOS System Logs. This guide will specifically cover using td-agent as the service to collect logs.
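Once installed, you can confirm the agent is available before continuing. This is a minimal sketch assuming the standard td-agent package layout; the launchd plist path may differ on your system.

# Confirm td-agent is installed and print its version
td-agent --version

# The package also ships a launchd service definition; on typical installs it lives at
# /Library/LaunchDaemons/td-agent.plist and can be managed with launchctl
sudo launchctl load /Library/LaunchDaemons/td-agent.plist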

Step 2. Install the Fluent Plugin for MacOS Logs

Use the command below to install the Fluentd MacOS plugin.
sudo /opt/td-agent/bin/fluent-gem install fluent-plugin-macos-log

Further documentation about this plugin can be found on GitHub.
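To confirm the plugin was installed into td-agent's embedded Ruby, you can list the installed gems and look for the plugin name used above.

# List installed Fluentd gems and confirm the MacOS log plugin is present
sudo /opt/td-agent/bin/fluent-gem list | grep macos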

Step 3. Edit Fluentd Configuration

Edit the Fluentd configuration below, filling in aws_key_id, aws_sec_key, s3_bucket, and s3_region with your own values.
Fluentd and td-agent will attempt to run services on conflicting ports. If this is a new installation, you will need to change the ports in the configuration file or remove the default configuration from the file.
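If you are unsure whether another Fluentd or td-agent instance is already bound to the default forward port (commonly 24224), a quick check such as the one below can help. The port number is only the usual default and may differ in your configuration.

# Check whether any process is already listening on the default forward port
sudo lsof -nP -iTCP:24224 -sTCP:LISTEN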
/etc/td-agent/td-agent.conf

<source>
  @type macoslog
  style ndjson
  tag macos
  pos_file last-starttime.log
  run_interval 10s
  <parse>
    @type json
    time_type string
    time_key timestamp
    time_format %Y-%m-%d %H:%M:%S.%L%z
  </parse>
</source>

<match **>
  @type s3
  aws_key_id <Key ID>
  aws_sec_key <Key>
  s3_bucket <Bucket>
  s3_region <Region>
  path macoslog/%Y/%m/%d/
  store_as gzip
  <buffer tag,time>
    @type file
    path /var/log/fluent/s3
    timekey 300 # 5 min partition to post to S3
    timekey_wait 2m
    timekey_use_utc true # use utc
    chunk_limit_size 256m
  </buffer>
  <format>
    @type json
  </format>
</match>
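The match block above writes directly to S3. For the Firehose flow mentioned in the Overview, a similar match block can be used instead. The sketch below assumes the fluent-plugin-kinesis output plugin (installed the same way as in Step 2, e.g. sudo /opt/td-agent/bin/fluent-gem install fluent-plugin-kinesis) and uses placeholder values for the delivery stream, region, and credentials.

<match **>
  # Sketch only: requires fluent-plugin-kinesis and an existing delivery stream
  @type kinesis_firehose
  region <Region>
  delivery_stream_name <Delivery Stream>
  aws_key_id <Key ID>
  aws_sec_key <Key>
  <format>
    @type json
  </format>
</match>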

Step 4. Point Fluentd to Configuration File and Validate

# Point fluentd to configuration file
fluentd -c /etc/td-agent/td-agent.conf

# Validate configuration
/opt/td-agent/usr/sbin/td-agent --dry-run
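After the configuration validates, restart the td-agent service so it picks up the new file, then watch its log for errors. The plist and log paths below are the td-agent defaults and may differ on your installation.

# Restart td-agent through launchd so the new configuration takes effect
sudo launchctl unload /Library/LaunchDaemons/td-agent.plist
sudo launchctl load /Library/LaunchDaemons/td-agent.plist

# Watch the agent log for startup or S3 upload errors
tail -f /var/log/td-agent/td-agent.log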

Step 5. Verify Logging

After a few minutes have passed, verify that events are being delivered to the S3 bucket. Logs should appear under the macoslog/ prefix, matching the path set in the configuration above.
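If you have the AWS CLI configured, you can also check from the command line; the bucket name below is a placeholder for the bucket used in the Fluentd configuration.

# List delivered objects under the macoslog/ prefix
aws s3 ls s3://<Bucket>/macoslog/ --recursive | tail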

Panther UI

Step 1. Create a Custom Schema

Go to Log Analysis > Custom Schema > + New Schema and enter the values below in the schema fields:
Name: Custom.MacOSSystemLogs
Description: MacOS System Logs for Application, Security, System
version: 0
fields:
  - name: pid
    type: bigint
  - name: ppid
    type: bigint
  - name: message
    type: string
  - name: worker
    type: bigint
  - name: creatorActivityID
    type: float
  - name: messageType
    type: string
  - name: activityIdentifier
    type: bigint
  - name: backtrace
    type: object
    fields:
      - name: frames
        required: true
        type: array
        element:
          type: object
          fields:
            - name: imageOffset
              required: true
              type: bigint
            - name: imageUUID
              required: true
              type: string
  - name: bootUUID
    type: string
  - name: category
    type: string
  - name: eventMessage
    type: string
  - name: eventType
    type: string
  - name: formatString
    type: string
  - name: machTimestamp
    type: bigint
  - name: parentActivityIdentifier
    type: bigint
  - name: processID
    type: bigint
  - name: processImagePath
    type: string
  - name: processImageUUID
    type: string
  - name: senderImagePath
    type: string
  - name: senderImageUUID
    type: string
  - name: senderProgramCounter
    type: bigint
  - name: subsystem
    type: string
  - name: threadID
    type: bigint
  - name: timezoneName
    type: string
  - name: traceID
    type: float

Step 2. Onboard the S3 bucket

Follow the S3 source onboarding documentation and use the S3 bucket created in the earlier setup.
Select the log type Custom.MacOSSystemLogs and the prefix macoslog/ in the onboarding steps. After completing the bucket onboarding, data should be flowing into Panther!