
exec_filter does not work with inject&extract combination #5133

@steros

Description


Describe the bug

I'm testing a Go script to parse logs locally in Docker with amazonlinux:2.

I want to pass the tag the log was sent with through exec_filter instead of hardcoding a tag.
Even though the documentation for this section is quite poor, I found out I can combine inject with extract to achieve this (by injecting the incoming tag into the record under a temporary key and then extracting it again from the parsed record).
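
To illustrate the idea with a made-up record (placeholder values, not from my actual logs): an event arriving with tag test.app is formatted to the child process with the tag injected under the temporary key, and the parsed output is expected to carry the same key so extract can restore the original tag:

stdin to command:   {"message":"hello","execfiltertemp":"test.app"}
stdout of command:  {"message":"hello","execfiltertemp":"test.app"}   <- extract should emit this again with tag test.app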

If I hardcode a tag together with an inject section, I receive logs in stdout (i.e. logs are emitted by exec_filter). I can also see that the tag is properly injected (and my script is not tampering with it).
But as soon as I remove the hardcoded tag and instead add an extract section, no logs appear in stdout anymore.
The trace log shows that something is actually happening, except the log being emitted back (I think).
It also seems to be stuck in an endless loop: even after I stop sending test logs, the log keeps producing the same continuous output. It's different from the expected idle output of:

2025-10-31 19:53:22.615 | 2025-10-31 10:53:22 +0000 [debug]: #0 enqueue_thread actually running
2025-10-31 19:53:22.615 | 2025-10-31 10:53:22.615679138 +0000 fluent.debug: {"message":"enqueue_thread actually running"}
2025-10-31 19:53:22.615 | 2025-10-31 10:53:22 +0000 [trace]: #0 enqueueing all chunks in buffer instance=2000

Since it works without extract, the Go script can't be the issue. It is also reproducible using just cat as the command, as in the sketch below.
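
A minimal sketch of the cat repro (the hardcoded tag in the comment is just a placeholder):

<match test.*>
  @type exec_filter
  command cat              # plain cat is enough to reproduce
  <format>
    @type json
  </format>
  <parse>
    @type json
  </parse>
  <inject>
    tag_key execfiltertemp
  </inject>
  <extract>
    tag_key execfiltertemp # with this section present: no output; replace it with e.g. "tag parsed.test" and output appears again
  </extract>
</match>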

I tried using my own fluentd plugin, which is just a copy and paste of the exec_filter code.
It has the same behavior.
Yet when I add a log.info before the router.emit in the on_record function, logs are sent to stdout together with the log.info. (Is there some kind of timing or flushing issue in the underlying application layer?)
For this plugin I also set up a local test in my IDE, and when debugging the behavior is as expected:
the tag is injected, then extracted, and the log is emitted properly.

def on_record(time, record)
  tag = extract_tag_from_record(record)
  tag = @added_prefix_string + tag if tag && @add_prefix
  tag ||= @tag
  time ||= extract_time_from_record(record) || Fluent::EventTime.now
  log.info "start emitting log with tag: #{tag}, time: #{time}, record: #{record}", tag: tag, time: time, record: record # <-- this triggers the logs to be emitted
  router.emit(tag, time, record) # <-- has the correct tag injected into the record and extracted above
rescue => e

But that's not the behavior in td-agent.

I have tried td-agent v4.5.2 and v4.4.2, and additionally fluentd version 1.18.

To Reproduce

  1. set up td-agent in amazonlinux:2 with docker compose
  2. bring the system up
  3. send some JSON logs (see the example below)
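
For step 3, one way to send a test JSON log is the fluent-cat utility that ships with the fluentd gem / td-agent (tag and payload are just examples; adjust the path if needed; it defaults to localhost:24224, matching the forward source below):

echo '{"message":"hello"}' | /opt/td-agent/bin/fluent-cat test.sample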

Expected behavior

Logs are emitted by exec_filter and forwarded to stdout, as per the configuration.

Your Environment

- td-agent version: 4.5.2
- fluentd version: 1.15.3 or 1.18
- Operating system: FROM amazonlinux:2

Your Configuration

<system>
  log_level trace
  workers 2
  root_dir /var/log/td-agent
</system>

<source>
  @type forward
  port 24224
</source>

<match test.*>
  @type exec_filter
  command /usr/local/bin/my-go-script
  <format>
    @type json
  </format>
  <parse>
    @type json
  </parse>
  <inject>
    tag_key execfiltertemp
  </inject>
  <extract>
    tag_key execfiltertemp
  </extract>
</match>

<match **>
  @type stdout
</match>

Your Error Log

2025-10-31 19:53:21.594 | 2025-10-31 10:53:21 +0000 [info]: init supervisor logger path=nil rotate_age=nil rotate_size=nil
2025-10-31 19:53:21.594 | 2025-10-31 10:53:21 +0000 [info]: parsing config file is succeeded path="/etc/td-agent/td-agent.conf"
2025-10-31 19:53:21.609 | 2025-10-31 10:53:21 +0000 [info]: gem 'fluent-plugin-calyptia-monitoring' version '0.1.3'
2025-10-31 19:53:21.609 | 2025-10-31 10:53:21 +0000 [info]: gem 'fluent-plugin-elasticsearch' version '5.2.4'
2025-10-31 19:53:21.609 | 2025-10-31 10:53:21 +0000 [info]: gem 'fluent-plugin-flowcounter-simple' version '0.1.0'
2025-10-31 19:53:21.609 | 2025-10-31 10:53:21 +0000 [info]: gem 'fluent-plugin-kafka' version '0.18.1'
2025-10-31 19:53:21.609 | 2025-10-31 10:53:21 +0000 [info]: gem 'fluent-plugin-metrics-cmetrics' version '0.1.2'
2025-10-31 19:53:21.609 | 2025-10-31 10:53:21 +0000 [info]: gem 'fluent-plugin-opensearch' version '1.0.8'
2025-10-31 19:53:21.609 | 2025-10-31 10:53:21 +0000 [info]: gem 'fluent-plugin-prometheus' version '2.0.3'
2025-10-31 19:53:21.609 | 2025-10-31 10:53:21 +0000 [info]: gem 'fluent-plugin-prometheus_pushgateway' version '0.1.0'
2025-10-31 19:53:21.609 | 2025-10-31 10:53:21 +0000 [info]: gem 'fluent-plugin-record-modifier' version '2.1.1'
2025-10-31 19:53:21.609 | 2025-10-31 10:53:21 +0000 [info]: gem 'fluent-plugin-rewrite-tag-filter' version '2.4.0'
2025-10-31 19:53:21.609 | 2025-10-31 10:53:21 +0000 [info]: gem 'fluent-plugin-s3' version '1.7.2'
2025-10-31 19:53:21.609 | 2025-10-31 10:53:21 +0000 [info]: gem 'fluent-plugin-sd-dns' version '0.1.0'
2025-10-31 19:53:21.609 | 2025-10-31 10:53:21 +0000 [info]: gem 'fluent-plugin-systemd' version '1.0.5'
2025-10-31 19:53:21.609 | 2025-10-31 10:53:21 +0000 [info]: gem 'fluent-plugin-td' version '1.2.0'
2025-10-31 19:53:21.609 | 2025-10-31 10:53:21 +0000 [info]: gem 'fluent-plugin-utmpx' version '0.5.0'
2025-10-31 19:53:21.609 | 2025-10-31 10:53:21 +0000 [info]: gem 'fluent-plugin-webhdfs' version '1.5.0'
2025-10-31 19:53:21.609 | 2025-10-31 10:53:21 +0000 [info]: gem 'fluentd' version '1.15.3'
2025-10-31 19:53:21.615 | 2025-10-31 10:53:21 +0000 [trace]: registered output plugin 'exec_filter'
2025-10-31 19:53:21.618 | 2025-10-31 10:53:21 +0000 [trace]: registered metrics plugin 'local'
2025-10-31 19:53:21.624 | 2025-10-31 10:53:21 +0000 [trace]: registered buffer plugin 'memory'
2025-10-31 19:53:21.629 | 2025-10-31 10:53:21 +0000 [trace]: registered parser plugin 'json'
2025-10-31 19:53:21.632 | 2025-10-31 10:53:21 +0000 [trace]: registered formatter plugin 'json'
2025-10-31 19:53:21.635 | 2025-10-31 10:53:21 +0000 [trace]: registered output plugin 'stdout'
2025-10-31 19:53:21.642 | 2025-10-31 10:53:21 +0000 [trace]: registered formatter plugin 'stdout'
2025-10-31 19:53:21.647 | 2025-10-31 10:53:21 +0000 [trace]: registered input plugin 'forward'
2025-10-31 19:53:21.648 | 2025-10-31 10:53:21 +0000 [warn]: define <match fluent.**> to capture fluentd logs in top level is deprecated. Use <label @FLUENT_LOG> instead
2025-10-31 19:53:21.648 | 2025-10-31 10:53:21 +0000 [info]: using configuration file: <ROOT>
2025-10-31 19:53:21.648 |   <system>
2025-10-31 19:53:21.648 |     log_level trace
2025-10-31 19:53:21.648 |     workers 2
2025-10-31 19:53:21.648 |     root_dir "/var/log/td-agent"
2025-10-31 19:53:21.648 |   </system>
2025-10-31 19:53:21.648 |   <source>
2025-10-31 19:53:21.648 |     @type forward
2025-10-31 19:53:21.648 |     port 24224
2025-10-31 19:53:21.648 |   </source>
2025-10-31 19:53:21.648 |   <match test.*>
2025-10-31 19:53:21.648 |     @type exec_filter
2025-10-31 19:53:21.648 |     command "/usr/local/bin/my-go-script"
2025-10-31 19:53:21.648 |     <format>
2025-10-31 19:53:21.648 |       @type "json"
2025-10-31 19:53:21.648 |     </format>
2025-10-31 19:53:21.648 |     <parse>
2025-10-31 19:53:21.648 |       @type "json"
2025-10-31 19:53:21.648 |     </parse>
2025-10-31 19:53:21.648 |     <inject>
2025-10-31 19:53:21.648 |       tag_key "execfiltertemp"
2025-10-31 19:53:21.648 |     </inject>
2025-10-31 19:53:21.648 |     <extract>
2025-10-31 19:53:21.648 |       tag_key "execfiltertemp"
2025-10-31 19:53:21.648 |     </extract>
2025-10-31 19:53:21.648 |   </match>
2025-10-31 19:53:21.648 |   <match **>
2025-10-31 19:53:21.648 |     @type stdout
2025-10-31 19:53:21.648 |   </match>
2025-10-31 19:53:21.648 | </ROOT>
2025-10-31 19:53:21.648 | 2025-10-31 10:53:21 +0000 [info]: starting fluentd-1.15.3 pid=1 ruby="2.7.6"
2025-10-31 19:53:21.657 | 2025-10-31 10:53:21 +0000 [info]: spawn command to main:  cmdline=["/opt/td-agent/bin/ruby", "-Eascii-8bit:ascii-8bit", "/usr/sbin/td-agent", "-c", "/etc/td-agent/td-agent.conf", "--under-supervisor"]
2025-10-31 19:53:21.658 | 2025-10-31 10:53:21 +0000 [info]: fluent/log.rb:330:info: init supervisor logger path=nil rotate_age=nil rotate_size=nil
2025-10-31 19:53:22.068 | 2025-10-31 10:53:22 +0000 [info]: #0 init worker0 logger path=nil rotate_age=nil rotate_size=nil
2025-10-31 19:53:22.069 | 2025-10-31 10:53:22 +0000 [info]: adding match pattern="test.*" type="exec_filter"
2025-10-31 19:53:22.090 | 2025-10-31 10:53:22 +0000 [trace]: #0 registered output plugin 'exec_filter'
2025-10-31 19:53:22.092 | 2025-10-31 10:53:22 +0000 [trace]: #0 registered metrics plugin 'local'
2025-10-31 19:53:22.096 | 2025-10-31 10:53:22 +0000 [trace]: #0 registered buffer plugin 'memory'
2025-10-31 19:53:22.099 | 2025-10-31 10:53:22 +0000 [trace]: #0 registered parser plugin 'json'
2025-10-31 19:53:22.101 | 2025-10-31 10:53:22 +0000 [trace]: #0 registered formatter plugin 'json'
2025-10-31 19:53:22.102 | 2025-10-31 10:53:22 +0000 [info]: adding match pattern="**" type="stdout"
2025-10-31 19:53:22.104 | 2025-10-31 10:53:22 +0000 [trace]: #0 registered output plugin 'stdout'
2025-10-31 19:53:22.109 | 2025-10-31 10:53:22 +0000 [trace]: #0 registered formatter plugin 'stdout'
2025-10-31 19:53:22.109 | 2025-10-31 10:53:22 +0000 [info]: adding source type="forward"
2025-10-31 19:53:22.112 | 2025-10-31 10:53:22 +0000 [trace]: #0 registered input plugin 'forward'
2025-10-31 19:53:22.113 | 2025-10-31 10:53:22 +0000 [warn]: #0 define <match fluent.**> to capture fluentd logs in top level is deprecated. Use <label @FLUENT_LOG> instead
2025-10-31 19:53:22.113 | 2025-10-31 10:53:22 +0000 [info]: #0 starting fluentd worker pid=11 ppid=1 worker=0
2025-10-31 19:53:22.113 | 2025-10-31 10:53:22 +0000 [debug]: #0 buffer started instance=2000 stage_size=0 queue_size=0
2025-10-31 19:53:22.114 | 2025-10-31 10:53:22 +0000 [debug]: #0 Executing command title=:out_exec_filter_child0 spawn=[{}, "/usr/local/bin/my-go-script"] mode=[:read, :write] stderr=:discard
2025-10-31 19:53:22.116 | 2025-10-31 10:53:22 +0000 [info]: #0 listening port port=24224 bind="0.0.0.0"
2025-10-31 19:53:22.120 | 2025-10-31 10:53:22 +0000 [info]: #0 fluentd worker is now running worker=0
2025-10-31 19:53:22.120 | 2025-10-31 10:53:22.113476513 +0000 fluent.info: {"pid":11,"ppid":1,"worker":0,"message":"starting fluentd worker pid=11 ppid=1 worker=0"}
2025-10-31 19:53:22.121 | 2025-10-31 10:53:22.113633679 +0000 fluent.debug: {"instance":2000,"stage_size":0,"queue_size":0,"message":"buffer started instance=2000 stage_size=0 queue_size=0"}
2025-10-31 19:53:22.121 | 2025-10-31 10:53:22.113923846 +0000 fluent.debug: {"title":"out_exec_filter_child0","spawn":[{},"/usr/local/bin/my-go-script"],"mode":["read","write"],"stderr":"discard","message":"Executing command title=:out_exec_filter_child0 spawn=[{}, \"/usr/local/bin/my-go-script\"] mode=[:read, :write] stderr=:discard"}
2025-10-31 19:53:22.121 | 2025-10-31 10:53:22.116517304 +0000 fluent.info: {"port":24224,"bind":"0.0.0.0","message":"listening port port=24224 bind=\"0.0.0.0\""}
2025-10-31 19:53:22.121 | 2025-10-31 10:53:22.120358304 +0000 fluent.info: {"worker":0,"message":"fluentd worker is now running worker=0"}
2025-10-31 19:53:22.127 | 2025-10-31 10:53:22 +0000 [info]: #1 init workers logger path=nil rotate_age=nil rotate_size=nil
2025-10-31 19:53:22.146 | 2025-10-31 10:53:22 +0000 [trace]: #1 registered output plugin 'exec_filter'
2025-10-31 19:53:22.149 | 2025-10-31 10:53:22 +0000 [trace]: #1 registered metrics plugin 'local'
2025-10-31 19:53:22.153 | 2025-10-31 10:53:22 +0000 [trace]: #1 registered buffer plugin 'memory'
2025-10-31 19:53:22.156 | 2025-10-31 10:53:22 +0000 [trace]: #1 registered parser plugin 'json'
2025-10-31 19:53:22.158 | 2025-10-31 10:53:22 +0000 [trace]: #1 registered formatter plugin 'json'
2025-10-31 19:53:22.160 | 2025-10-31 10:53:22 +0000 [trace]: #1 registered output plugin 'stdout'
2025-10-31 19:53:22.164 | 2025-10-31 10:53:22 +0000 [trace]: #1 registered formatter plugin 'stdout'
2025-10-31 19:53:22.168 | 2025-10-31 10:53:22 +0000 [trace]: #1 registered input plugin 'forward'
2025-10-31 19:53:22.169 | 2025-10-31 10:53:22 +0000 [warn]: #1 define <match fluent.**> to capture fluentd logs in top level is deprecated. Use <label @FLUENT_LOG> instead
2025-10-31 19:53:22.169 | 2025-10-31 10:53:22 +0000 [info]: #1 starting fluentd worker pid=12 ppid=1 worker=1
2025-10-31 19:53:22.169 | 2025-10-31 10:53:22 +0000 [debug]: #1 buffer started instance=2000 stage_size=0 queue_size=0
2025-10-31 19:53:22.170 | 2025-10-31 10:53:22 +0000 [debug]: #1 Executing command title=:out_exec_filter_child0 spawn=[{}, "/usr/local/bin/my-go-script"] mode=[:read, :write] stderr=:discard
2025-10-31 19:53:22.172 | 2025-10-31 10:53:22 +0000 [info]: #1 listening port port=24224 bind="0.0.0.0"
2025-10-31 19:53:22.175 | 2025-10-31 10:53:22 +0000 [info]: #1 fluentd worker is now running worker=1
2025-10-31 19:53:22.175 | 2025-10-31 10:53:22.169490138 +0000 fluent.info: {"pid":12,"ppid":1,"worker":1,"message":"starting fluentd worker pid=12 ppid=1 worker=1"}
2025-10-31 19:53:22.175 | 2025-10-31 10:53:22.169646096 +0000 fluent.debug: {"instance":2000,"stage_size":0,"queue_size":0,"message":"buffer started instance=2000 stage_size=0 queue_size=0"}
2025-10-31 19:53:22.175 | 2025-10-31 10:53:22.169951096 +0000 fluent.debug: {"title":"out_exec_filter_child0","spawn":[{},"/usr/local/bin/my-go-script"],"mode":["read","write"],"stderr":"discard","message":"Executing command title=:out_exec_filter_child0 spawn=[{}, \"/usr/local/bin/my-go-script\"] mode=[:read, :write] stderr=:discard"}
2025-10-31 19:53:22.175 | 2025-10-31 10:53:22.172535971 +0000 fluent.info: {"port":24224,"bind":"0.0.0.0","message":"listening port port=24224 bind=\"0.0.0.0\""}
2025-10-31 19:53:22.175 | 2025-10-31 10:53:22.175197555 +0000 fluent.info: {"worker":1,"message":"fluentd worker is now running worker=1"}
2025-10-31 19:53:22.615 | 2025-10-31 10:53:22 +0000 [debug]: #0 enqueue_thread actually running
2025-10-31 19:53:22.615 | 2025-10-31 10:53:22.615679138 +0000 fluent.debug: {"message":"enqueue_thread actually running"}
2025-10-31 19:53:22.615 | 2025-10-31 10:53:22 +0000 [trace]: #0 enqueueing all chunks in buffer instance=2000
2025-10-31 19:53:22.616 | 2025-10-31 10:53:22.615874805 +0000 fluent.trace: {"instance":2000,"message":"enqueueing all chunks in buffer instance=2000"}
2025-10-31 19:53:22.616 | 2025-10-31 10:53:22 +0000 [debug]: #0 flush_thread actually running
2025-10-31 19:53:22.616 | 2025-10-31 10:53:22.616223055 +0000 fluent.debug: {"message":"flush_thread actually running"}
2025-10-31 19:53:22.670 | 2025-10-31 10:53:22 +0000 [debug]: #1 enqueue_thread actually running
2025-10-31 19:53:22.670 | 2025-10-31 10:53:22.670506680 +0000 fluent.debug: {"message":"enqueue_thread actually running"}
2025-10-31 19:53:22.670 | 2025-10-31 10:53:22 +0000 [trace]: #1 enqueueing all chunks in buffer instance=2000
2025-10-31 19:53:22.670 | 2025-10-31 10:53:22.670720513 +0000 fluent.trace: {"instance":2000,"message":"enqueueing all chunks in buffer instance=2000"}
2025-10-31 19:53:22.671 | 2025-10-31 10:53:22 +0000 [debug]: #1 flush_thread actually running
2025-10-31 19:53:22.671 | 2025-10-31 10:53:22.671252346 +0000 fluent.debug: {"message":"flush_thread actually running"}
2025-10-31 19:53:23.621 | 2025-10-31 10:53:23 +0000 [trace]: #0 enqueueing all chunks in buffer instance=2000
2025-10-31 19:53:23.621 | 2025-10-31 10:53:23.620269014 +0000 fluent.trace: {"instance":2000,"message":"enqueueing all chunks in buffer instance=2000"}
2025-10-31 19:53:23.675 | 2025-10-31 10:53:23 +0000 [trace]: #1 enqueueing all chunks in buffer instance=2000
2025-10-31 19:54:05.791 | 2025-10-31 10:54:05 +0000 [trace]: #1 connected fluent socket addr="172.217.174.113" port=21513
2025-10-31 19:54:05.791 | 2025-10-31 10:54:05.790795255 +0000 fluent.trace: {"addr":"172.217.174.113","port":21513,"message":"connected fluent socket addr=\"172.217.174.113\" port=21513"}
2025-10-31 19:54:05.791 | 2025-10-31 10:54:05 +0000 [trace]: #1 accepted fluent socket addr="172.217.174.113" port=21513
2025-10-31 19:54:05.791 | 2025-10-31 10:54:05.791076797 +0000 fluent.trace: {"addr":"172.217.174.113","port":21513,"message":"accepted fluent socket addr=\"172.217.174.113\" port=21513"}
2025-10-31 19:54:05.796 | 2025-10-31 10:54:05 +0000 [trace]: #0 enqueueing all chunks in buffer instance=2000
2025-10-31 19:54:05.796 | 2025-10-31 10:54:05.795764672 +0000 fluent.trace: {"instance":2000,"message":"enqueueing all chunks in buffer instance=2000"}
2025-10-31 19:54:05.796 | 2025-10-31 10:54:05 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:05.796 | 2025-10-31 10:54:05.795851088 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:05.867 | 2025-10-31 10:54:05 +0000 [trace]: #1 enqueueing all chunks in buffer instance=2000
2025-10-31 19:54:05.868 | 2025-10-31 10:54:05.867431588 +0000 fluent.trace: {"instance":2000,"message":"enqueueing all chunks in buffer instance=2000"}
2025-10-31 19:54:05.923 | 2025-10-31 10:54:05 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:05.923 | 2025-10-31 10:54:05.923029588 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:06.044 | 2025-10-31 10:54:06 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:06.046 | 2025-10-31 10:54:06.044259047 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:06.187 | 2025-10-31 10:54:06 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:06.188 | 2025-10-31 10:54:06.186222505 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:06.242 | 2025-10-31 10:54:06 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:06.242 | 2025-10-31 10:54:06.242099839 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:06.348 | 2025-10-31 10:54:06 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:06.348 | 2025-10-31 10:54:06.347547380 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:06.492 | 2025-10-31 10:54:06 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:06.492 | 2025-10-31 10:54:06.490772714 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:06.563 | 2025-10-31 10:54:06 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:06.563 | 2025-10-31 10:54:06.562541005 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:06.624 | 2025-10-31 10:54:06 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:06.624 | 2025-10-31 10:54:06.624089130 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:06.776 | 2025-10-31 10:54:06 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:06.776 | 2025-10-31 10:54:06.775944506 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:06.797 | 2025-10-31 10:54:06 +0000 [trace]: #0 enqueueing all chunks in buffer instance=2000
2025-10-31 19:54:06.797 | 2025-10-31 10:54:06.797365172 +0000 fluent.trace: {"instance":2000,"message":"enqueueing all chunks in buffer instance=2000"}
2025-10-31 19:54:06.863 | 2025-10-31 10:54:06 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:06.863 | 2025-10-31 10:54:06.862817589 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:06.869 | 2025-10-31 10:54:06 +0000 [trace]: #1 enqueueing all chunks in buffer instance=2000
2025-10-31 19:54:06.869 | 2025-10-31 10:54:06.869451589 +0000 fluent.trace: {"instance":2000,"message":"enqueueing all chunks in buffer instance=2000"}
2025-10-31 19:54:06.869 | 2025-10-31 10:54:06 +0000 [trace]: #1 enqueueing chunk instance=2000 metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>
2025-10-31 19:54:06.869 | 2025-10-31 10:54:06.869595797 +0000 fluent.trace: {"instance":2000,"metadata":"#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>","message":"enqueueing chunk instance=2000 metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>"}
2025-10-31 19:54:06.966 | 2025-10-31 10:54:06 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:06.966 | 2025-10-31 10:54:06.966439756 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:06.967 | 2025-10-31 10:54:06 +0000 [trace]: #1 dequeueing a chunk instance=2000
2025-10-31 19:54:06.967 | 2025-10-31 10:54:06.967044131 +0000 fluent.trace: {"instance":2000,"message":"dequeueing a chunk instance=2000"}
2025-10-31 19:54:06.967 | 2025-10-31 10:54:06 +0000 [trace]: #1 chunk dequeued instance=2000 metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>
2025-10-31 19:54:06.967 | 2025-10-31 10:54:06.967301506 +0000 fluent.trace: {"instance":2000,"metadata":"#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>","message":"chunk dequeued instance=2000 metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>"}
2025-10-31 19:54:06.967 | 2025-10-31 10:54:06 +0000 [trace]: #1 trying flush for a chunk chunk="642722f32556f99de25b3c922f6949b4"
2025-10-31 19:54:06.967 | 2025-10-31 10:54:06.967622964 +0000 fluent.trace: {"chunk":"642722f32556f99de25b3c922f6949b4","message":"trying flush for a chunk chunk=\"642722f32556f99de25b3c922f6949b4\""}
2025-10-31 19:54:06.967 | 2025-10-31 10:54:06 +0000 [trace]: #1 adding write count instance=1980
2025-10-31 19:54:06.968 | 2025-10-31 10:54:06 +0000 [trace]: #1 executing sync write chunk="642722f32556f99de25b3c922f6949b4"
2025-10-31 19:54:06.968 | 2025-10-31 10:54:06.967767297 +0000 fluent.trace: {"instance":1980,"message":"adding write count instance=1980"}
2025-10-31 19:54:06.985 | 2025-10-31 10:54:06.967874964 +0000 fluent.trace: {"chunk":"642722f32556f99de25b3c922f6949b4","message":"executing sync write chunk=\"642722f32556f99de25b3c922f6949b4\""}
2025-10-31 19:54:06.985 | 2025-10-31 10:54:06 +0000 [trace]: #1 write operation done, committing chunk="642722f32556f99de25b3c922f6949b4"
2025-10-31 19:54:06.985 | 2025-10-31 10:54:06.985166256 +0000 fluent.trace: {"chunk":"642722f32556f99de25b3c922f6949b4","message":"write operation done, committing chunk=\"642722f32556f99de25b3c922f6949b4\""}
2025-10-31 19:54:06.986 | 2025-10-31 10:54:06 +0000 [trace]: #1 committing write operation to a chunk chunk="642722f32556f99de25b3c922f6949b4" delayed=false
2025-10-31 19:54:06.986 | 2025-10-31 10:54:06.985837881 +0000 fluent.trace: {"chunk":"642722f32556f99de25b3c922f6949b4","delayed":false,"message":"committing write operation to a chunk chunk=\"642722f32556f99de25b3c922f6949b4\" delayed=false"}
2025-10-31 19:54:06.986 | 2025-10-31 10:54:06 +0000 [trace]: #1 purging a chunk instance=2000 chunk_id="642722f32556f99de25b3c922f6949b4" metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>
2025-10-31 19:54:06.987 | 2025-10-31 10:54:06.986083756 +0000 fluent.trace: {"instance":2000,"chunk_id":"642722f32556f99de25b3c922f6949b4","metadata":"#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>","message":"purging a chunk instance=2000 chunk_id=\"642722f32556f99de25b3c922f6949b4\" metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>"}
2025-10-31 19:54:06.987 | 2025-10-31 10:54:06 +0000 [trace]: #1 chunk purged instance=2000 chunk_id="642722f32556f99de25b3c922f6949b4" metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>
2025-10-31 19:54:06.987 | 2025-10-31 10:54:06 +0000 [trace]: #1 done to commit a chunk chunk="642722f32556f99de25b3c922f6949b4"
2025-10-31 19:54:06.988 | 2025-10-31 10:54:06 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:06.989 | 2025-10-31 10:54:06.986517089 +0000 fluent.trace: {"instance":2000,"chunk_id":"642722f32556f99de25b3c922f6949b4","metadata":"#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>","message":"chunk purged instance=2000 chunk_id=\"642722f32556f99de25b3c922f6949b4\" metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>"}
2025-10-31 19:54:06.989 | 2025-10-31 10:54:06.987556964 +0000 fluent.trace: {"chunk":"642722f32556f99de25b3c922f6949b4","message":"done to commit a chunk chunk=\"642722f32556f99de25b3c922f6949b4\""}
2025-10-31 19:54:06.990 | 2025-10-31 10:54:06 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:06.990 | 2025-10-31 10:54:06.988852631 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:06.991 | 2025-10-31 10:54:06 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:06.991 | 2025-10-31 10:54:06.990110214 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:06.991 | 2025-10-31 10:54:06.991490964 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:06.992 | 2025-10-31 10:54:06 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:06.993 | 2025-10-31 10:54:06 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:06.993 | 2025-10-31 10:54:06.992752672 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:06.994 | 2025-10-31 10:54:06.993563131 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:06.994 | 2025-10-31 10:54:06 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:06.995 | 2025-10-31 10:54:06.994760839 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:06.997 | 2025-10-31 10:54:06 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:06.997 | 2025-10-31 10:54:06.996895464 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:06.998 | 2025-10-31 10:54:06 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:06.999 | 2025-10-31 10:54:06.998461839 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.000 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.000 | 2025-10-31 10:54:07.000264839 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.002 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.002 | 2025-10-31 10:54:07.002356089 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.064 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.064 | 2025-10-31 10:54:07.063916547 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.191 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.191 | 2025-10-31 10:54:07.191684464 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.321 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.322 | 2025-10-31 10:54:07.321721464 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.418 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.418 | 2025-10-31 10:54:07.417909297 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.513 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.513 | 2025-10-31 10:54:07.512801589 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.653 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.653 | 2025-10-31 10:54:07.653505923 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.728 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.728 | 2025-10-31 10:54:07.727778923 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.802 | 2025-10-31 10:54:07 +0000 [trace]: #0 enqueueing all chunks in buffer instance=2000
2025-10-31 19:54:07.802 | 2025-10-31 10:54:07.801935714 +0000 fluent.trace: {"instance":2000,"message":"enqueueing all chunks in buffer instance=2000"}
2025-10-31 19:54:07.837 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.837 | 2025-10-31 10:54:07.837123423 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.873 | 2025-10-31 10:54:07 +0000 [trace]: #1 enqueueing all chunks in buffer instance=2000
2025-10-31 19:54:07.873 | 2025-10-31 10:54:07 +0000 [trace]: #1 enqueueing chunk instance=2000 metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>
2025-10-31 19:54:07.873 | 2025-10-31 10:54:07.873312173 +0000 fluent.trace: {"instance":2000,"message":"enqueueing all chunks in buffer instance=2000"}
2025-10-31 19:54:07.873 | 2025-10-31 10:54:07.873412839 +0000 fluent.trace: {"instance":2000,"metadata":"#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>","message":"enqueueing chunk instance=2000 metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>"}
2025-10-31 19:54:07.912 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.912 | 2025-10-31 10:54:07.912498631 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.913 | 2025-10-31 10:54:07 +0000 [trace]: #1 dequeueing a chunk instance=2000
2025-10-31 19:54:07.913 | 2025-10-31 10:54:07 +0000 [trace]: #1 chunk dequeued instance=2000 metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>
2025-10-31 19:54:07.913 | 2025-10-31 10:54:07 +0000 [trace]: #1 trying flush for a chunk chunk="642722f442f35ca5e181cff2833d37e4"
2025-10-31 19:54:07.913 | 2025-10-31 10:54:07 +0000 [trace]: #1 adding write count instance=1980
2025-10-31 19:54:07.913 | 2025-10-31 10:54:07 +0000 [trace]: #1 executing sync write chunk="642722f442f35ca5e181cff2833d37e4"
2025-10-31 19:54:07.914 | 2025-10-31 10:54:07.913346256 +0000 fluent.trace: {"instance":2000,"message":"dequeueing a chunk instance=2000"}
2025-10-31 19:54:07.914 | 2025-10-31 10:54:07.913406631 +0000 fluent.trace: {"instance":2000,"metadata":"#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>","message":"chunk dequeued instance=2000 metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>"}
2025-10-31 19:54:07.915 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.915 | 2025-10-31 10:54:07 +0000 [trace]: #1 write operation done, committing chunk="642722f442f35ca5e181cff2833d37e4"
2025-10-31 19:54:07.915 | 2025-10-31 10:54:07 +0000 [trace]: #1 committing write operation to a chunk chunk="642722f442f35ca5e181cff2833d37e4" delayed=false
2025-10-31 19:54:07.916 | 2025-10-31 10:54:07 +0000 [trace]: #1 purging a chunk instance=2000 chunk_id="642722f442f35ca5e181cff2833d37e4" metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>
2025-10-31 19:54:07.916 | 2025-10-31 10:54:07 +0000 [trace]: #1 chunk purged instance=2000 chunk_id="642722f442f35ca5e181cff2833d37e4" metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>
2025-10-31 19:54:07.916 | 2025-10-31 10:54:07 +0000 [trace]: #1 done to commit a chunk chunk="642722f442f35ca5e181cff2833d37e4"
2025-10-31 19:54:07.916 | 2025-10-31 10:54:07.913505339 +0000 fluent.trace: {"chunk":"642722f442f35ca5e181cff2833d37e4","message":"trying flush for a chunk chunk=\"642722f442f35ca5e181cff2833d37e4\""}
2025-10-31 19:54:07.916 | 2025-10-31 10:54:07.913518423 +0000 fluent.trace: {"instance":1980,"message":"adding write count instance=1980"}
2025-10-31 19:54:07.916 | 2025-10-31 10:54:07.913525631 +0000 fluent.trace: {"chunk":"642722f442f35ca5e181cff2833d37e4","message":"executing sync write chunk=\"642722f442f35ca5e181cff2833d37e4\""}
2025-10-31 19:54:07.916 | 2025-10-31 10:54:07.915338131 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.916 | 2025-10-31 10:54:07.915568839 +0000 fluent.trace: {"chunk":"642722f442f35ca5e181cff2833d37e4","message":"write operation done, committing chunk=\"642722f442f35ca5e181cff2833d37e4\""}
2025-10-31 19:54:07.917 | 2025-10-31 10:54:07.915697798 +0000 fluent.trace: {"chunk":"642722f442f35ca5e181cff2833d37e4","delayed":false,"message":"committing write operation to a chunk chunk=\"642722f442f35ca5e181cff2833d37e4\" delayed=false"}
2025-10-31 19:54:07.918 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.919 | 2025-10-31 10:54:07.915835464 +0000 fluent.trace: {"instance":2000,"chunk_id":"642722f442f35ca5e181cff2833d37e4","metadata":"#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>","message":"purging a chunk instance=2000 chunk_id=\"642722f442f35ca5e181cff2833d37e4\" metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>"}
2025-10-31 19:54:07.921 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.921 | 2025-10-31 10:54:07.915895506 +0000 fluent.trace: {"instance":2000,"chunk_id":"642722f442f35ca5e181cff2833d37e4","metadata":"#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>","message":"chunk purged instance=2000 chunk_id=\"642722f442f35ca5e181cff2833d37e4\" metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>"}
2025-10-31 19:54:07.921 | 2025-10-31 10:54:07.915917381 +0000 fluent.trace: {"chunk":"642722f442f35ca5e181cff2833d37e4","message":"done to commit a chunk chunk=\"642722f442f35ca5e181cff2833d37e4\""}
2025-10-31 19:54:07.923 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.923 | 2025-10-31 10:54:07.918295964 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.928 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.928 | 2025-10-31 10:54:07.921151756 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.928 | 2025-10-31 10:54:07.923208631 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.929 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.929 | 2025-10-31 10:54:07.928052256 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.930 | 2025-10-31 10:54:07.929652089 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.930 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.930 | 2025-10-31 10:54:07.930223631 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.930 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.930 | 2025-10-31 10:54:07.930745673 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.931 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.931 | 2025-10-31 10:54:07.931221798 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.932 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.932 | 2025-10-31 10:54:07.932172423 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.932 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.933 | 2025-10-31 10:54:07.932725464 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.933 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.934 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.934 | 2025-10-31 10:54:07.933176673 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.934 | 2025-10-31 10:54:07.934277798 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.934 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.934 | 2025-10-31 10:54:07.934876131 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.935 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.935 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.935 | 2025-10-31 10:54:07.935159839 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.935 | 2025-10-31 10:54:07.935409881 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.935 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.936 | 2025-10-31 10:54:07 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:07.936 | 2025-10-31 10:54:07.935781798 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:07.936 | 2025-10-31 10:54:07.936023881 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.049 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.049 | 2025-10-31 10:54:08.049230006 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.122 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.122 | 2025-10-31 10:54:08.121937464 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.220 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.220 | 2025-10-31 10:54:08.220486715 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.328 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.328 | 2025-10-31 10:54:08.328232465 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.400 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.400 | 2025-10-31 10:54:08.400243923 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.501 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.501 | 2025-10-31 10:54:08.501448673 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.651 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.651 | 2025-10-31 10:54:08.650889756 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.757 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.757 | 2025-10-31 10:54:08.757423465 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.803 | 2025-10-31 10:54:08 +0000 [trace]: #0 enqueueing all chunks in buffer instance=2000
2025-10-31 19:54:08.803 | 2025-10-31 10:54:08.803249798 +0000 fluent.trace: {"instance":2000,"message":"enqueueing all chunks in buffer instance=2000"}
2025-10-31 19:54:08.870 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.871 | 2025-10-31 10:54:08.870719215 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.874 | 2025-10-31 10:54:08 +0000 [trace]: #1 enqueueing all chunks in buffer instance=2000
2025-10-31 19:54:08.874 | 2025-10-31 10:54:08.874660965 +0000 fluent.trace: {"instance":2000,"message":"enqueueing all chunks in buffer instance=2000"}
2025-10-31 19:54:08.875 | 2025-10-31 10:54:08 +0000 [trace]: #1 enqueueing chunk instance=2000 metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>
2025-10-31 19:54:08.875 | 2025-10-31 10:54:08.874825340 +0000 fluent.trace: {"instance":2000,"metadata":"#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>","message":"enqueueing chunk instance=2000 metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>"}
2025-10-31 19:54:08.920 | 2025-10-31 10:54:08 +0000 [trace]: #1 dequeueing a chunk instance=2000
2025-10-31 19:54:08.920 | 2025-10-31 10:54:08.920712632 +0000 fluent.trace: {"instance":2000,"message":"dequeueing a chunk instance=2000"}
2025-10-31 19:54:08.921 | 2025-10-31 10:54:08 +0000 [trace]: #1 chunk dequeued instance=2000 metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>
2025-10-31 19:54:08.921 | 2025-10-31 10:54:08 +0000 [trace]: #1 trying flush for a chunk chunk="642722f52a022e8a3ba3c73a563b1295"
2025-10-31 19:54:08.921 | 2025-10-31 10:54:08 +0000 [trace]: #1 adding write count instance=1980
2025-10-31 19:54:08.921 | 2025-10-31 10:54:08.920872298 +0000 fluent.trace: {"instance":2000,"metadata":"#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>","message":"chunk dequeued instance=2000 metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>"}
2025-10-31 19:54:08.921 | 2025-10-31 10:54:08.920923173 +0000 fluent.trace: {"chunk":"642722f52a022e8a3ba3c73a563b1295","message":"trying flush for a chunk chunk=\"642722f52a022e8a3ba3c73a563b1295\""}
2025-10-31 19:54:08.921 | 2025-10-31 10:54:08.920941007 +0000 fluent.trace: {"instance":1980,"message":"adding write count instance=1980"}
2025-10-31 19:54:08.921 | 2025-10-31 10:54:08 +0000 [trace]: #1 executing sync write chunk="642722f52a022e8a3ba3c73a563b1295"
2025-10-31 19:54:08.921 | 2025-10-31 10:54:08.920953340 +0000 fluent.trace: {"chunk":"642722f52a022e8a3ba3c73a563b1295","message":"executing sync write chunk=\"642722f52a022e8a3ba3c73a563b1295\""}
2025-10-31 19:54:08.921 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.921 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.921 | 2025-10-31 10:54:08.921382007 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.921 | 2025-10-31 10:54:08.921716215 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.922 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.922 | 2025-10-31 10:54:08.922235548 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.922 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.922 | 2025-10-31 10:54:08.922599965 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.923 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.923 | 2025-10-31 10:54:08.922949923 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.923 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.923 | 2025-10-31 10:54:08.923455507 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.923 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.924 | 2025-10-31 10:54:08.923804382 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.924 | 2025-10-31 10:54:08 +0000 [trace]: #1 write operation done, committing chunk="642722f52a022e8a3ba3c73a563b1295"
2025-10-31 19:54:08.924 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.924 | 2025-10-31 10:54:08.924162715 +0000 fluent.trace: {"chunk":"642722f52a022e8a3ba3c73a563b1295","message":"write operation done, committing chunk=\"642722f52a022e8a3ba3c73a563b1295\""}
2025-10-31 19:54:08.924 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.924 | 2025-10-31 10:54:08 +0000 [trace]: #1 committing write operation to a chunk chunk="642722f52a022e8a3ba3c73a563b1295" delayed=false
2025-10-31 19:54:08.925 | 2025-10-31 10:54:08.924260840 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.925 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.925 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.925 | 2025-10-31 10:54:08 +0000 [trace]: #1 purging a chunk instance=2000 chunk_id="642722f52a022e8a3ba3c73a563b1295" metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>
2025-10-31 19:54:08.925 | 2025-10-31 10:54:08.924627965 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.926 | 2025-10-31 10:54:08.924689590 +0000 fluent.trace: {"chunk":"642722f52a022e8a3ba3c73a563b1295","delayed":false,"message":"committing write operation to a chunk chunk=\"642722f52a022e8a3ba3c73a563b1295\" delayed=false"}
2025-10-31 19:54:08.926 | 2025-10-31 10:54:08.925504965 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.926 | 2025-10-31 10:54:08 +0000 [trace]: #1 chunk purged instance=2000 chunk_id="642722f52a022e8a3ba3c73a563b1295" metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>
2025-10-31 19:54:08.926 | 2025-10-31 10:54:08.925742757 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.926 | 2025-10-31 10:54:08 +0000 [trace]: #1 done to commit a chunk chunk="642722f52a022e8a3ba3c73a563b1295"
2025-10-31 19:54:08.926 | 2025-10-31 10:54:08.925808007 +0000 fluent.trace: {"instance":2000,"chunk_id":"642722f52a022e8a3ba3c73a563b1295","metadata":"#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>","message":"purging a chunk instance=2000 chunk_id=\"642722f52a022e8a3ba3c73a563b1295\" metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>"}
2025-10-31 19:54:08.926 | 2025-10-31 10:54:08.925988548 +0000 fluent.trace: {"instance":2000,"chunk_id":"642722f52a022e8a3ba3c73a563b1295","metadata":"#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>","message":"chunk purged instance=2000 chunk_id=\"642722f52a022e8a3ba3c73a563b1295\" metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>"}
2025-10-31 19:54:08.926 | 2025-10-31 10:54:08.926091382 +0000 fluent.trace: {"chunk":"642722f52a022e8a3ba3c73a563b1295","message":"done to commit a chunk chunk=\"642722f52a022e8a3ba3c73a563b1295\""}
2025-10-31 19:54:08.926 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.926 | 2025-10-31 10:54:08.926540757 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.927 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.927 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.927 | 2025-10-31 10:54:08.927013632 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.927 | 2025-10-31 10:54:08.927214840 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.927 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.928 | 2025-10-31 10:54:08.927742340 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.928 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.928 | 2025-10-31 10:54:08.928124757 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.928 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.928 | 2025-10-31 10:54:08.928425340 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.928 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.929 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.929 | 2025-10-31 10:54:08.928790007 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.929 | 2025-10-31 10:54:08.929021632 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.929 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.929 | 2025-10-31 10:54:08.929380715 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.929 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.929 | 2025-10-31 10:54:08.929658965 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.930 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.930 | 2025-10-31 10:54:08.929967715 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.930 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.931 | 2025-10-31 10:54:08.930342507 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.931 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.932 | 2025-10-31 10:54:08.931671298 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.932 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.933 | 2025-10-31 10:54:08.932618132 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.933 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.933 | 2025-10-31 10:54:08.933764090 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.934 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.934 | 2025-10-31 10:54:08.934697548 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.935 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.935 | 2025-10-31 10:54:08.935334007 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:08.935 | 2025-10-31 10:54:08 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:08.935 | 2025-10-31 10:54:08.935885673 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:09.012 | 2025-10-31 10:54:09 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:09.012 | 2025-10-31 10:54:09.012141132 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:09.112 | 2025-10-31 10:54:09 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:09.112 | 2025-10-31 10:54:09.111800423 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:09.170 | 2025-10-31 10:54:09 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:09.170 | 2025-10-31 10:54:09.170699173 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:09.326 | 2025-10-31 10:54:09 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:09.326 | 2025-10-31 10:54:09.323092090 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:09.423 | 2025-10-31 10:54:09 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:09.423 | 2025-10-31 10:54:09.423105882 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:09.515 | 2025-10-31 10:54:09 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:09.515 | 2025-10-31 10:54:09.514786382 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:09.658 | 2025-10-31 10:54:09 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:09.659 | 2025-10-31 10:54:09.658644715 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:09.744 | 2025-10-31 10:54:09 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:09.744 | 2025-10-31 10:54:09.744352340 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:09.801 | 2025-10-31 10:54:09 +0000 [trace]: #1 writing events into buffer instance=2000 metadata_size=1
2025-10-31 19:54:09.801 | 2025-10-31 10:54:09.800654215 +0000 fluent.trace: {"instance":2000,"metadata_size":1,"message":"writing events into buffer instance=2000 metadata_size=1"}
2025-10-31 19:54:09.805 | 2025-10-31 10:54:09 +0000 [trace]: #0 enqueueing all chunks in buffer instance=2000
2025-10-31 19:54:09.805 | 2025-10-31 10:54:09.805152590 +0000 fluent.trace: {"instance":2000,"message":"enqueueing all chunks in buffer instance=2000"}
2025-10-31 19:54:09.877 | 2025-10-31 10:54:09 +0000 [trace]: #1 enqueueing all chunks in buffer instance=2000
2025-10-31 19:54:09.877 | 2025-10-31 10:54:09.876610299 +0000 fluent.trace: {"instance":2000,"message":"enqueueing all chunks in buffer instance=2000"}
2025-10-31 19:54:09.878 | 2025-10-31 10:54:09 +0000 [trace]: #1 enqueueing chunk instance=2000 metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>
2025-10-31 19:54:09.878 | 2025-10-31 10:54:09.877077799 +0000 fluent.trace: {"instance":2000,"metadata":"#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>","message":"enqueueing chunk instance=2000 metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>"}
2025-10-31 19:54:09.928 | 2025-10-31 10:54:09 +0000 [trace]: #1 dequeueing a chunk instance=2000
2025-10-31 19:54:09.928 | 2025-10-31 10:54:09.927666757 +0000 fluent.trace: {"instance":2000,"message":"dequeueing a chunk instance=2000"}
2025-10-31 19:54:09.928 | 2025-10-31 10:54:09 +0000 [trace]: #1 chunk dequeued instance=2000 metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>
2025-10-31 19:54:09.928 | 2025-10-31 10:54:09 +0000 [trace]: #1 trying flush for a chunk chunk="642722f62035a54f1f128e4b30a62e91"
2025-10-31 19:54:09.928 | 2025-10-31 10:54:09.927995090 +0000 fluent.trace: {"instance":2000,"metadata":"#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>","message":"chunk dequeued instance=2000 metadata=#<struct Fluent::Plugin::Buffer::Metadata timekey=nil, tag=nil, variables=nil, seq=0>"}
2025-10-31 19:54:09.928 | 2025-10-31 10:54:09 +0000 [trace]: #1 adding write count instance=1980

Additional context

For reference, this is the match section that does emit logs when the extract section is removed and a hardcoded tag is set instead:

<match test.*>
  @type exec_filter
  command /usr/local/bin/my-go-script
  tag exec.filter
  <format>
    @type json
  </format>
  <parse>
    @type json
  </parse>
  <inject>
    tag_key execfiltertemp
  </inject>
</match>
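
As a sanity check, below is a minimal pass-through filter that could stand in for /usr/local/bin/my-go-script (a hypothetical Python sketch, not the actual Go script from this report). exec_filter writes one JSON object per line to the child's stdin and reads JSON lines back from its stdout, so the only requirement here is that the injected execfiltertemp key is echoed back unchanged for the extract section to read:

#!/usr/bin/env python3
# Hypothetical stand-in for the Go command used with exec_filter.
import json
import sys

for line in sys.stdin:
    line = line.strip()
    if not line:
        continue
    record = json.loads(line)
    # ... a real filter would transform the record here;
    # the injected execfiltertemp key must survive unchanged ...
    sys.stdout.write(json.dumps(record) + "\n")
    sys.stdout.flush()  # flush per line so fluentd is not left waiting on a buffered child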

Dockerfile:

# Fluentd stage
FROM amazonlinux:2

COPY td-agent.repo /etc/yum.repos.d/td-agent.repo

RUN yum update -y; \
yum install -y td-agent-4.5.2; \
yum install -y curl gpg gcc gcc-c++ git make procps tar which;

COPY td-agent.conf /etc/td-agent/td-agent.conf
RUN chmod 644 /etc/td-agent/td-agent.conf

EXPOSE 24224
EXPOSE 24224/udp

CMD ["td-agent", "-c", "/etc/td-agent/td-agent.conf"]

td-agent.repo:

[treasuredata]
name=TreasureData
baseurl=http://packages.treasuredata.com/4/amazon/$releasever/$basearch
gpgcheck=1
gpgkey=https://packages.treasuredata.com/GPG-KEY-td-agent

sendtest.py:

import json
import time
import random
from fluent import sender

FLUENTD_HOST = '127.0.0.1'
FLUENTD_PORT = 24224

# Load the payload that is emitted on every iteration.
with open('sendtest.json', 'r') as file:
    data = json.load(file)

# The sender prefix 'test' plus the label passed to emit() below yields the
# tag 'test.test', which matches the <match test.*> section.
logger = sender.FluentSender('test', host=FLUENTD_HOST, port=FLUENTD_PORT, timeout=3.0)

print("Starting continuous test (Ctrl+C to stop)...")

try:
    while True:
        print("emit 1")
        logger.emit('test', data)
        time.sleep(random.uniform(0.05, 0.15))
except KeyboardInterrupt:
    logger.close()
    print("Stopped.")

if logger.last_error:
    print('Error:', logger.last_error)
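
sendtest.json itself is not included in this report; any small JSON object will do as the payload, for example this placeholder (not the reporter's actual data):

{"level": "info", "message": "hello from sendtest", "service": "demo"}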
