Fluent Bit Configuration Examples

The examples on this page demonstrate common ways to receive data with Fluent Bit and send logs to Panther via an HTTP Source or an Amazon S3 Source.

In the examples below, log_level trace and the stdout output are used to test and debug the configurations. They should be removed once the Fluent Bit configuration is working as expected.
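
To try any of these configurations locally, save the configuration to a file and run Fluent Bit against it while watching the stdout output. A minimal sketch, assuming the configuration is saved as fluent-bit.conf and the fluent-bit binary is on your PATH:

% fluent-bit -c fluent-bit.conf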

Dummy to a Panther HTTP source

This example uses Fluent Bit's Dummy input to generate one event per second. This is useful for testing output configurations and getting started with Fluent Bit.

Input: Dummy

Output: HTTP

[SERVICE]
    log_level trace

[INPUT]
    Name dummy
    Dummy {"message": "sample json message", "type": "json"}

[OUTPUT]
    Name       http
    Match      *
    Host       logs.{REDACTED}.runpanther.net
    Port       443
    URI        /http/{REDACTED}
    Header     x-sender-header {REDACTED}
    Format     json_lines
    TLS        On
    TLS.Verify On
    Json_Date_Key false

[OUTPUT]
    Name   stdout
    Match  *

This configuration results in the following:

# Input configuration:
Dummy {"message": "sample json message", "type": "json"}
    
# Output of the raw event before Panther parsing:
{"message": "sample json message", "type": "json"}

Tail local file to Amazon S3

This example uses the Tail input to ingest a local file and send it to Amazon S3. Multiple files can be provided, as sketched at the end of this section; see the path setting in the Fluent Bit Tail documentation for more information.

Input: Tail

Output: S3

In the OUTPUT plugin configuration:

  • Use json_date_key false to disable the appended date key.

  • Use log_key log to instruct Fluent Bit to send only the raw log.

With these two settings, the raw input from the log file is sent without Fluent Bit's appended JSON fields.
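
For comparison, without these two settings each record would be written to S3 wrapped in Fluent Bit's JSON envelope, roughly like the following (illustrative):

{"date":1707178624.165,"log":"Mon Feb  5 16:17:04.165 Usb Host Notification ..."}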

[SERVICE]
    log_level trace

[INPUT]
    Name       tail
    Tag        wifi_log
    Path       /var/log/wifi.log

[OUTPUT]
    Name       s3
    Match      *
    Region     {REGION}
    Bucket     {BUCKET_NAME}
    Compression gzip
    json_date_key false
    upload_timeout 5m
    log_key log

    # Retrieving AWS Creds - https://github.com/fluent/fluent-bit-docs/blob/43c4fe134611da471e706b0edb2f9acd7cdfdbc3/administration/aws-credentials.md
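    # One illustrative option is to have the s3 output assume an IAM role
    # instead of relying on static keys (placeholder ARN):
    # role_arn arn:aws:iam::{ACCOUNT_ID}:role/{ROLE_NAME}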

[OUTPUT]
    Name   stdout
    Match  *

This configuration results in the following:

# Input result from tailing file:
Mon Feb  5 16:17:04.165 Usb Host Notification hostNotificationUSBDeviceInserted USB Billboard Device    isApple N seqNum 454 Total 4
Mon Feb  5 16:17:04.176 Usb Host Notification Apple80211Set: seqNum 454 Total 4 chg 1 en0
Mon Feb  5 16:17:28.841 Usb Host Notification hostNotificationUSBDeviceInserted USB MICROPHONE isApple N seqNum 455 Total 5
Mon Feb  5 16:17:28.846 Usb Host Notification Apple80211Set: seqNum 455 Total 5 chg 1 en0
    
# Output Result in AWS S3:
Mon Feb  5 16:17:04.165 Usb Host Notification hostNotificationUSBDeviceInserted USB Billboard Device    isApple N seqNum 454 Total 4
Mon Feb  5 16:17:04.176 Usb Host Notification Apple80211Set: seqNum 454 Total 4 chg 1 en0
Mon Feb  5 16:17:28.841 Usb Host Notification hostNotificationUSBDeviceInserted USB MICROPHONE isApple N seqNum 455 Total 5
Mon Feb  5 16:17:28.846 Usb Host Notification Apple80211Set: seqNum 455 Total 5 chg 1 en0
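
As noted above, the tail input's Path setting accepts multiple comma-separated files or wildcard patterns. A minimal sketch (the tag and paths are illustrative):

[INPUT]
    Name       tail
    Tag        local_logs
    Path       /var/log/wifi.log,/var/log/system.log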

TCP to Amazon S3

This example uses the TCP input plugin, which is useful if you need to ship syslog or JSON events to Fluent Bit over the network. The TCP plugin takes the raw payload it receives and forwards it to the configured outputs.

Input: TCP

Output: S3

[SERVICE]
    log_level trace

[INPUT]
    Name       tcp
    Tag        tcp_log
    Listen     0.0.0.0
    Port       5140
    Format     none

[OUTPUT]
    Name       s3
    Match      *
    Region     {REGION}
    Bucket     {BUCKET_NAME}
    Compression gzip
    json_date_key false
    upload_timeout 5m
    log_key log

    # Retrieving AWS Creds - https://github.com/fluent/fluent-bit-docs/blob/43c4fe134611da471e706b0edb2f9acd7cdfdbc3/administration/aws-credentials.md

[OUTPUT]
    Name   stdout
    Match  *

This configuration results in the following:

# Input command:
% echo "message from local echo" | nc 127.0.0.1 5140
% echo "message from local echo" | nc 127.0.0.1 5140

# Output in AWS S3 with prefix tcp_log/2024/02/06/02/55/generated_filename:
message from local echo
message from local echo
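
The S3 object key shown above (the tag followed by a timestamp path) can be customized with the s3 output's s3_key_format option. A sketch, added to the [OUTPUT] section above (the exact layout is illustrative):

    s3_key_format /$TAG/%Y/%m/%d/%H/%M/$UUID.gz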

TCP to HTTP (Panther)

This example configuration demonstrates receiving logs with the TCP input plugin and sending them directly to Panther's HTTP ingest endpoint with Fluent Bit's HTTP output plugin.

Input: TCP

Output: HTTP

The filters in the configuration below are required to keep the raw payload as-is when sending the log to the HTTP destination. See the Fluent Bit HTTP output documentation for more information.

[SERVICE]
    log_level trace

[INPUT]
    Name       tcp
    Tag        tcp_log
    Listen     0.0.0.0
    Port       5140
    Format     none

# https://stackoverflow.com/questions/75291515/how-to-disable-json-format-and-send-only-the-log-message-to-sumologic-with-fluen
[FILTER]
    Name record_modifier
    Match *
    Record headers.content-type text/plain

# https://stackoverflow.com/questions/75291515/how-to-disable-json-format-and-send-only-the-log-message-to-sumologic-with-fluen
[FILTER]
    Name nest
    Match *
    Operation nest
    Wildcard headers.*
    Nest_under headers
    Remove_prefix headers.
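
# After the two filters above, each record looks roughly like the following
# (illustrative):
#   {"log": "<raw tcp payload>", "headers": {"content-type": "text/plain"}}
# In the OUTPUT below, body_key $log sends only the raw payload as the HTTP
# body and headers_key $headers applies the content-type header.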


[OUTPUT]
    Name       http
    Match      *
    Host       logs.{REDACTED}.runpanther.net
    Port       443
    URI        /http/{REDACTED}
    Header     x-sender-header {REDACTED}
    Format     json_lines
    TLS        On
    TLS.Verify On
    Json_Date_Key false
    body_key   $log
    headers_key $headers

[OUTPUT]
    Name   stdout
    Match  *

This configuration results in the following:

# Input command: 
% echo "message from local echo `date`" | nc 127.0.0.1 5140
% echo "message from local echo `date`" | nc 127.0.0.1 5140
% echo "message from local echo `date`" | nc 127.0.0.1 5140

# Output of the raw events before Panther parsing:
message from local echo Mon Feb 5 19:27:40 PST 2024
message from local echo Mon Feb 5 19:27:52 PST 2024
message from local echo Mon Feb 5 19:27:53 PST 2024
