File
Synopsis
Creates a file target that writes log messages to files in JSON, MultiJSON, Avro, or Parquet format, with support for multiple compression methods and schemas.
Schema
- name: <string>
  description: <string>
  type: file
  pipelines: <pipeline[]>
  status: <boolean>
  properties:
    location: <string>
    name: <string>
    format: <string>
    compression: <string>
    extension: <string>
    schema: <string>
    field_format: <string>
    no_buffer: <boolean>
    batch_size: <integer>
    max_size: <integer>
    locations: <location[]>
    interval: <string|numeric>
    cron: <string>
    debug:
      status: <boolean>
      dont_send_logs: <boolean>
Configuration
The following fields are used to define the target:
| Field | Required | Default | Description | 
|---|---|---|---|
| name | Y | - | Target name |
| description | N | - | Optional description |
| type | Y | - | Must be file |
| pipelines | N | - | Optional post-processor pipelines |
| status | N | true | Enable/disable the target |
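For illustration, a minimal target definition using only the top-level fields might look like this (the pipeline name is a hypothetical placeholder):
targets:
  - name: local_logs
    description: "Write processed logs to local disk"
    type: file
    pipelines:
      - checkpoint # hypothetical post-processor pipeline
    status: true
    properties:
      location: "/var/log/vmetric"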
Files
Files can have the following properties:
| Field | Required | Default | Description | 
|---|---|---|---|
| location | N | <service-root> | File output directory |
| name | N | "vmetric.{{.Timestamp}}.{{.Extension}}" | File name template |
| format | N | "json" | File format. See Formats below |
| compression | N | - | Compression algorithm. See Compression below |
| extension | N | Matches format | Custom file extension |
| schema | N | - | Data schema for Avro / Parquet formats. Can be a built-in schema name, a path to a schema file, or an inline schema definition |
| batch_size | N | 100000 | Maximum number of messages per file |
| max_size | N | 32MB | Maximum file size before rotating |
| no_buffer | N | false | Disable write buffering |
| field_format | N | - | Data normalization format. See applicable Normalization section |
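As an illustrative sketch, the rotation and buffering properties can be combined as follows (the values are arbitrary examples, and max_size is assumed to be a byte count since the schema lists it as an integer):
targets:
  - name: rotated_logs
    type: file
    properties:
      location: "/var/log/vmetric"
      format: "json"
      batch_size: 50000     # rotate after 50,000 messages
      max_size: 33554432    # assumed byte count (32 MB)
      no_buffer: true       # flush every write immediately to disk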
Multiple Locations
You can define multiple output locations with different settings:
targets:
  - name: multi_location_logs
    type: file
    properties:
      locations:
        - id: "security_logs"
          path: "/var/log/security"
          schema: "CommonSecurityLog"
          format: "parquet"
        - id: "system_logs"
          path: "/var/log/system"
          schema: "CommonSystemLog"
          format: "json"
Scheduler
| Field | Required | Default | Description | 
|---|---|---|---|
| interval | N | realtime | Execution frequency. See Interval for details |
| cron | N | - | Cron expression for scheduled execution. See Cron for details |
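As a sketch, a file target can also run on a schedule instead of in real time; the values below are assumed examples (see Interval and Cron for the accepted syntax):
targets:
  - name: scheduled_logs
    type: file
    properties:
      location: "/var/log/vmetric"
      format: "json"
      interval: 300         # assumed: execute every 300 seconds
      # cron: "0 * * * *"   # alternatively, an hourly cron schedule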
Debug Options
| Field | Required | Default | Description | 
|---|---|---|---|
| debug.status | N | false | Enable debug logging |
| debug.dont_send_logs | N | false | Process logs but don't send to target (testing) |
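For example, debug options can be enabled while testing a configuration (a sketch based on the schema above):
targets:
  - name: debug_logs
    type: file
    properties:
      location: "/var/log/vmetric"
      format: "json"
      debug:
        status: true          # enable debug logging
        dont_send_logs: true  # process messages without writing them to files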
Details
The file target supports writing to multiple file locations with different formats and schemas. When a log message contains the SystemS3 field, its value is used to route the message to the location with a matching ID.
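As a sketch of this routing, assuming the locations defined under Multiple Locations above, a message whose SystemS3 field is set to "security_logs" would be written to the location with that ID; the record below is purely illustrative:
  {
    "SystemS3": "security_logs",
    "message": "Failed login attempt for user admin"
  }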
If no schema is specified for Avro or Parquet formats, a default schema will be used that captures epoch timestamp and message content.
The target supports the following built-in schema templates:
- Syslog - Standard schema for Syslog messages
- CommonSecurityLog - Schema compatible with Common Security Log Format (CSL)
You can also reference custom schema files by name (without the .json extension). The system will search for schema files in:
- User schema directory: <user-path>/schemas/
- Package schema directory: <package-path>/schemas/
Schema files are searched recursively in these directories, and filename matching is case-insensitive.
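As an illustration of the lookup, a schema referenced as "MyCustomSchema" could be resolved from any of the following hypothetical paths (the placeholders match the directories listed above):
  <user-path>/schemas/MyCustomSchema.json
  <user-path>/schemas/network/mycustomschema.json   (found recursively, matched case-insensitively)
  <package-path>/schemas/MyCustomSchema.json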
Files with no messages (i.e. with counter=0) are automatically removed when the target is disposed.
When no_buffer is enabled, each write operation will be immediately flushed to disk. This provides durability but may impact performance.
Templates
The following template variables can be used in the file name:
| Variable | Description | Example | 
|---|---|---|
| {{.Year}} | Current year | 2024 |
| {{.Month}} | Current month | 01 |
| {{.Day}} | Current day | 15 |
| {{.Timestamp}} | Current timestamp in nanoseconds | 1703688533123456789 |
| {{.Format}} | File format | json |
| {{.Extension}} | File extension | json |
| {{.Compression}} | Compression type | zstd |
| {{.TargetName}} | Target name | my_logs |
| {{.TargetType}} | Target type | file |
| {{.Table}} | Location ID | security_logs |
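For instance, the default file name template renders roughly as follows (the timestamp value is illustrative):
  Template: "vmetric.{{.Timestamp}}.{{.Extension}}"
  Rendered: vmetric.1703688533123456789.json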
Formats
| Format | Description | 
|---|---|
| json | Each log entry is written as a separate JSON line (JSONL format) |
| multijson | All log entries are written as a single JSON array |
| avro | Apache Avro format with schema |
| parquet | Apache Parquet columnar format with schema |
Compression
Files can use the following compression algorithms:
| Format | Default | Compression Codecs | 
|---|---|---|
| JSON | - | Not supported | 
| MultiJSON | - | Not supported | 
| Avro | zstd | deflate, snappy, zstd | 
| Parquet | zstd | gzip, snappy, zstd, brotli, lz4 | 
Examples
JSON
Configuration for JSON output (since "json" is the default format, it does not need to be specified):
targets:
  - name: json_logs
    type: file
    properties:
      location: "/var/log/vmetric"
Multiple Locations
Configuration for multiple output locations with different formats:
targets:
  - name: multi_location_logs
    type: file
    properties:
      locations:
        - id: "security"
          path: "/var/log/vmetric/security"
          format: "parquet"
          schema: "CommonSecurityLog"
          compression: "zstd"
        - id: "system"
          path: "/var/log/vmetric/system"
          format: "json"
        - id: "application"
          path: "/var/log/vmetric/app"
          format: "multijson"
          name: "app_{{.Year}}_{{.Month}}_{{.Day}}.json"
Avro with Built-in Schema
Configuration for an Avro output with compression using a built-in schema:
targets:
  - name: syslog_avro
    type: file
    properties:
      location: "/var/log/vmetric"
      format: "avro"
      compression: "snappy"
      schema: "Syslog"
Avro with Custom Schema File
Configuration for an Avro output using a custom schema file:
targets:
  - name: custom_avro
    type: file
    properties:
      location: "/var/log/vmetric"
      format: "avro"
      compression: "zstd"
      schema: "MyCustomSchema"
This will look for MyCustomSchema.json in the schema directories.
Avro with Inline Schema
Configuration for an Avro output with an inline schema definition:
targets:
  - name: avro_logs
    type: file
    properties:
      location: "/var/log/vmetric"
      format: "avro"
      compression: "zstd"
      schema: |
        {
          "type": "record",
          "name": "Log",
          "fields": [
            {"name": "epoch", "type": "long"},
            {"name": "message", "type": "string"}
          ]
        }
Parquet with Built-in Schema
Configuration for a Parquet output with compression using a built-in schema:
targets:
  - name: csl_logs
    type: file
    properties:
      location: "/var/log/vmetric"
      format: "parquet"
      schema: "CommonSecurityLog"
      compression: "brotli"
Parquet with Custom Schema File
Configuration for a Parquet output using a custom schema file:
targets:
  - name: custom_parquet
    type: file
    properties:
      location: "/var/log/vmetric"
      format: "parquet"
      compression: "zstd"
      schema: "NetworkTrafficSchema"
Parquet with Inline Schema
Configuration for a Parquet output with an inline schema definition:
targets:
  - name: parquet_logs
    type: file
    properties:
      location: "/var/log/vmetric"
      format: "parquet"
      compression: "zstd"
      schema: |
        {
          "timestamp": {
            "type": "INT64",
            "logicalType": "TIMESTAMP_MILLIS"
          },
          "message": {
            "type": "STRING",
            "compression": "ZSTD"
          },
          "level": {
            "type": "STRING"
          }
        }
Kusto Schema Conversion
You can also use Kusto schema format, which will be automatically converted:
targets:
  - name: kusto_format
    type: file
    properties:
      location: "/var/log/vmetric"
      format: "parquet"
      schema: "timestamp:datetime,message:string,level:string,source:string"
Windows
Configuration for a Windows environment using Windows-style path separators:
targets:
  - name: windows_logs
    type: file
    properties:
      location: "C:\\ProgramData\\VMetric\\Logs"
      format: "json"
      name: "windows_{{.Year}}\\{{.Month}}\\system_logs.json"
Daily Rotation with Templates
Configuration with daily file rotation using template variables:
targets:
  - name: daily_logs
    type: file
    properties:
      location: "/var/log/vmetric"
      format: "avro"
      compression: "zstd"
      name: "logs_{{.Year}}_{{.Month}}_{{.Day}}.avro"
      schema: "Syslog"