Log destinations


Rachael Shaw


Log destinations can be used in Fleet to log:

  • Osquery status logs.

  • Osquery scheduled query result logs.

  • Fleet audit logs.

To configure each log destination, you must set the correct logging configuration options in Fleet.

Check out the reference documentation for:

  • Osquery status logging configuration options.
  • Osquery result logging configuration options.
  • Activity audit logging configuration options.
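As a concrete sketch of those configuration options, the plugin for each log type is selected when starting the Fleet server. The example below uses environment-variable configuration; the exact option names should be confirmed against the Fleet configuration reference for your version:

```shell
# Sketch: select a log plugin for each log type before starting the
# Fleet server. Names follow Fleet's FLEET_<NAMESPACE>_<OPTION>
# convention; verify against the configuration reference.
export FLEET_OSQUERY_STATUS_LOG_PLUGIN=firehose   # osquery status logs
export FLEET_OSQUERY_RESULT_LOG_PLUGIN=firehose   # scheduled query result logs
export FLEET_ACTIVITY_AUDIT_LOG_PLUGIN=firehose   # Fleet audit logs

# fleet serve   # start Fleet with these settings (commented out here)
```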

This guide provides a list of the supported log destinations in Fleet.

In this guide:

  • Amazon Kinesis Data Firehose
  • Snowflake
  • Splunk
  • Amazon Kinesis Data Streams
  • AWS Lambda
  • Google Cloud Pub/Sub
  • Apache Kafka
  • Stdout
  • Filesystem
  • Sending logs outside of Fleet

Amazon Kinesis Data Firehose

Logs are written to Amazon Kinesis Data Firehose (Firehose).

  • Plugin name: firehose
  • Flag namespace: firehose

Firehose is a good fit for aggregating osquery logs into Amazon S3.

Note that Firehose has record-size limits, discussed in the AWS documentation. When Fleet encounters logs that exceed those limits, a notification is written to the Fleet server logs and the oversized logs are not sent to Firehose.
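A Firehose configuration might look like the following sketch. The stream names and region are placeholders, and the FLEET_FIREHOSE_* option names are drawn from the firehose flag namespace; confirm them against the Fleet configuration reference:

```shell
# Hedged sketch: send status and result logs to Firehose delivery
# streams. Stream names and region are placeholders.
export FLEET_OSQUERY_STATUS_LOG_PLUGIN=firehose
export FLEET_OSQUERY_RESULT_LOG_PLUGIN=firehose
export FLEET_FIREHOSE_REGION=us-east-1
export FLEET_FIREHOSE_STATUS_STREAM=osquery_status
export FLEET_FIREHOSE_RESULT_STREAM=osquery_results
# AWS credentials can come from the instance profile or an assumed
# role configured in the firehose flag namespace, rather than static keys.
```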

Snowflake

To send logs to Snowflake, you must first configure Fleet to send logs to Amazon Kinesis Data Firehose (Firehose). This is because you'll use the Snowflake Snowpipe integration to direct logs to Snowflake.

If you're using Fleet's best practice Terraform, Firehose is already configured as your log destination.

With Fleet configured to send logs to Firehose, you then load the data from Firehose into a Snowflake database. AWS provides instructions on directing logs to a Snowflake database in the AWS documentation.

Snowflake provides instructions on setting up the destination tables and the IAM roles required in AWS in the Snowflake docs.

Splunk

How to send logs to Splunk:

  1. Follow Splunk's instructions to prepare Splunk for Firehose data.

  2. Follow the AWS instructions on enabling Firehose to forward directly to Splunk.

  3. In your main.tf file, replace your S3 destination (aws_kinesis_firehose_delivery_stream) with a Splunk destination:

resource "aws_kinesis_firehose_delivery_stream" "test_stream" {
  name        = "terraform-kinesis-firehose-test-stream"
  destination = "splunk"

  splunk_configuration {
    hec_endpoint               = "https://http-inputs-mydomain.splunkcloud.com:443"
    hec_token                  = "51D4DA16-C61B-4F5F-8EC7-ED4301342A4A"
    hec_acknowledgment_timeout = 600
    hec_endpoint_type          = "Event"
    s3_backup_mode             = "FailedEventsOnly"

    s3_configuration {
      role_arn           = aws_iam_role.firehose.arn
      bucket_arn         = aws_s3_bucket.bucket.arn
      buffering_size     = 10
      buffering_interval = 400
      compression_format = "GZIP"
    }
  }
}

For the latest configuration options, see HashiCorp's Terraform documentation for the aws_kinesis_firehose_delivery_stream resource.

Amazon Kinesis Data Streams

Logs are written to Amazon Kinesis Data Streams (Kinesis).

  • Plugin name: kinesis
  • Flag namespace: kinesis

Note that Kinesis logging has limits discussed in the documentation. When Fleet encounters logs that are too big for Kinesis, notifications appear in the Fleet server logs. Those logs will not be sent to Kinesis.
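A Kinesis configuration follows the same shape as Firehose, using the kinesis flag namespace. Stream names and region below are placeholders; confirm the FLEET_KINESIS_* option names against the Fleet configuration reference:

```shell
# Hedged sketch: send status and result logs to Kinesis data streams.
# Stream names and region are placeholders.
export FLEET_OSQUERY_STATUS_LOG_PLUGIN=kinesis
export FLEET_OSQUERY_RESULT_LOG_PLUGIN=kinesis
export FLEET_KINESIS_REGION=us-east-1
export FLEET_KINESIS_STATUS_STREAM=osquery_status
export FLEET_KINESIS_RESULT_STREAM=osquery_results
```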

AWS Lambda

Logs are written to AWS Lambda (Lambda).

  • Plugin name: lambda
  • Flag namespace: lambda

Lambda processes logs from Fleet synchronously, so the Lambda function must complete quickly enough that the osquery client does not time out while writing logs. If heavy processing is required, use Lambda to store the logs in another datastore or queue, then perform the long-running processing separately.

Note that Lambda logging has limits discussed in the documentation. The maximum size of a log sent to Lambda is 6MB. When Fleet encounters logs that are too big for Lambda, notifications will be output in the Fleet logs and those logs will not be sent to Lambda.

Lambda is executed once per log line. As a result, queries with differential result logging might result in a higher number of Lambda invocations.

Queries are assigned differential result logging by default in Fleet. Differential logs have two format options, single (event) and batched. Check out the osquery documentation for more information on differential logs.

Keep this in mind when using Lambda: you're charged based on the number of requests to your functions and their duration (the time it takes for your code to execute).
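A Lambda configuration might look like this sketch. The function names and region are hypothetical; confirm the FLEET_LAMBDA_* option names against the Fleet configuration reference:

```shell
# Hedged sketch: invoke a Lambda function for each log line.
# Function names and region are placeholders.
export FLEET_OSQUERY_STATUS_LOG_PLUGIN=lambda
export FLEET_OSQUERY_RESULT_LOG_PLUGIN=lambda
export FLEET_LAMBDA_REGION=us-east-1
export FLEET_LAMBDA_STATUS_FUNCTION=osquery_status_to_queue
export FLEET_LAMBDA_RESULT_FUNCTION=osquery_results_to_queue
```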

Google Cloud Pub/Sub

Logs are written to Google Cloud Pub/Sub (Pub/Sub).

  • Plugin name: pubsub
  • Flag namespace: pubsub

Messages over 10MB will be dropped, with a notification sent to the Fleet logs, as these can never be processed by Pub/Sub.
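A Pub/Sub configuration follows the same pattern, using the pubsub flag namespace. The project and topic names below are placeholders; confirm the FLEET_PUBSUB_* option names against the Fleet configuration reference:

```shell
# Hedged sketch: publish status and result logs to Pub/Sub topics.
# Project and topic names are placeholders.
export FLEET_OSQUERY_STATUS_LOG_PLUGIN=pubsub
export FLEET_OSQUERY_RESULT_LOG_PLUGIN=pubsub
export FLEET_PUBSUB_PROJECT=my-gcp-project
export FLEET_PUBSUB_STATUS_TOPIC=osquery_status
export FLEET_PUBSUB_RESULT_TOPIC=osquery_results
```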

Apache Kafka

Logs are written to Apache Kafka (Kafka) using the Kafka REST proxy.

  • Plugin name: kafkarest
  • Flag namespace: kafka

Note that a Kafka REST proxy must be in place in order to send osquery logs to Kafka topics.
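A Kafka REST proxy configuration might look like the sketch below. The proxy host and topic names are placeholders, and the FLEET_KAFKAREST_* option names are assumptions; confirm the exact namespace and names against the Fleet configuration reference:

```shell
# Hedged sketch: send logs to Kafka topics via a REST proxy.
# Proxy host and topic names are placeholders; option names are
# assumptions to verify against the Fleet configuration reference.
export FLEET_OSQUERY_STATUS_LOG_PLUGIN=kafkarest
export FLEET_OSQUERY_RESULT_LOG_PLUGIN=kafkarest
export FLEET_KAFKAREST_PROXYHOST=https://kafka-rest.example.com:8082
export FLEET_KAFKAREST_STATUS_TOPIC=osquery_status
export FLEET_KAFKAREST_RESULT_TOPIC=osquery_results
```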

Stdout

Logs are written to stdout.

  • Plugin name: stdout
  • Flag namespace: stdout

With the stdout plugin, logs are written to stdout on the Fleet server. This is typically used for debugging or with a log forwarding setup that will capture and forward stdout logs into a logging pipeline.

Note that if multiple load-balanced Fleet servers are used, the logs will be load-balanced across those servers (not duplicated).
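A minimal stdout setup only needs the plugin selection; the forwarding itself happens outside Fleet. The forwarder below is hypothetical:

```shell
# Hedged sketch: write logs to stdout and let an external process
# capture them. The forwarder command is a placeholder.
export FLEET_OSQUERY_STATUS_LOG_PLUGIN=stdout
export FLEET_OSQUERY_RESULT_LOG_PLUGIN=stdout

# fleet serve 2>&1 | log-forwarder   # hypothetical forwarder capturing stdout
```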

Filesystem

Logs are written to the local Fleet server filesystem.

The default log destination.

  • Plugin name: filesystem
  • Flag namespace: filesystem

With the filesystem plugin, logs are written to the local filesystem on the Fleet server. This is typically used with a log forwarding agent on the Fleet server that will push the logs into a logging pipeline.

Note that if multiple load-balanced Fleet servers are used, the logs will be load-balanced across those servers (not duplicated).
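A filesystem configuration might look like this sketch. The file paths are placeholders; confirm the FLEET_FILESYSTEM_* option names and defaults against the Fleet configuration reference:

```shell
# Hedged sketch: write logs to local files on the Fleet server.
# Paths are placeholders; a forwarding agent would tail these files.
export FLEET_OSQUERY_STATUS_LOG_PLUGIN=filesystem
export FLEET_OSQUERY_RESULT_LOG_PLUGIN=filesystem
export FLEET_FILESYSTEM_STATUS_LOG_FILE=/var/log/osquery/osqueryd.status.log
export FLEET_FILESYSTEM_RESULT_LOG_FILE=/var/log/osquery/osqueryd.results.log
```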

Sending logs outside of Fleet

Osquery agents are typically configured to send logs to the Fleet server (--logger_plugin=tls). This is not a requirement, and any other logger plugin can be used even when osquery clients are connecting to the Fleet server to retrieve configuration or run live queries.

See the osquery logging documentation for more about configuring logging on the agent.
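For example, an agent can keep using Fleet over TLS for configuration and live queries while logging locally with osquery's filesystem logger plugin. The hostname and paths below are hypothetical:

```shell
# Hedged sketch: an osquery flagfile that uses Fleet (TLS) for config
# but logs to the local filesystem instead of the Fleet server.
# Hostname and paths are placeholders.
cat > osquery.flags <<'EOF'
--tls_hostname=fleet.example.com
--config_plugin=tls
--config_tls_endpoint=/api/v1/osquery/config
--logger_plugin=filesystem
--logger_path=/var/log/osquery
EOF
```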

If --logger_plugin=tls is used with osquery clients, the following configuration can be applied on the Fleet server for handling the incoming logs.
