yaml: fixes to all relevant Processor docs sections across all linked doc pages (#1869)
* Adding warning to YAML configuration pipeline doc page for processors.
* Adding warning to processors main doc page for YAML-only configuration.
* Fixes to standardize the YAML configuration pipeline section doc page. Part of issue #1867.
* Fixes to standardize the YAML configuration and add YAML-only warning to conditional processing doc page. Part of issue #1867.
* Fixes to standardize the YAML configuration and add YAML-only warning to content modifier processor doc page. Part of issue #1867.
* Fixes to standardize the YAML configuration and add YAML-only warning to labels processor doc page. Part of issue #1867.
* Fixes to standardize the YAML configuration and add YAML-only warning to conditional processing processor doc page. Part of issue #1867.
* Fixes to standardize the YAML configuration and add YAML-only warning to filters processor doc page. Part of issue #1867.
* Fixes to standardize the YAML configuration and add YAML-only warning to SQL processor doc page. Part of issue #1867.
* Fixes to standardize the YAML configuration and add YAML-only warning to sampling processor doc page. Part of issue #1867.
* Fixes to standardize the YAML configuration and add YAML-only warning to OpenTelemetry envelope processor doc page. Part of issue #1867.

---------

Signed-off-by: Eric D. Schabell <[email protected]>
The `pipeline` section defines the flow of how data is collected, processed, and sent to its final destination.

## Example configuration

{% hint style="info" %}

**Note:** Processors can be enabled only by using the YAML configuration format. The classic mode configuration format doesn't support processors.

{% endhint %}

Here's an example of a pipeline configuration:
{% tabs %}
{% tab title="fluent-bit.yaml" %}

```yaml
pipeline:
  inputs:
    - name: tail
      path: /var/log/example.log
      parser: json

      processors:
        logs:
          - name: record_modifier

  filters:
    - name: grep
      match: '*'
      regex: key pattern

  outputs:
    - name: stdout
      match: '*'
```

{% endtab %}
{% endtabs %}

## Pipeline processors

Processors operate on specific signals such as logs, metrics, and traces. They're attached to an input plugin and must specify the signal type they will process.
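As an illustrative sketch (not taken from this page), the signal type is selected by the key nested under `processors`: `logs`, `metrics`, or `traces`. The input plugin, tag, and processor options below are assumptions for illustration only:

```yaml
# Hypothetical sketch: attaching a processor to the metrics signal
# instead of logs. Names and values here are illustrative assumptions.
pipeline:
  inputs:
    - name: fluentbit_metrics
      tag: internal_metrics
      processors:
        metrics:
          - name: labels
            insert: environment staging

  outputs:
    - name: stdout
      match: '*'
```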
In the following example, the `content_modifier` processor inserts or updates (upserts) the key `my_new_key` with the value `123` for all log records generated by the tail plugin. This processor is only applied to log signals:
{% tabs %}
{% tab title="fluent-bit.yaml" %}

```yaml
parsers:
  - name: json
    format: json

pipeline:
  inputs:
    - name: tail
      path: /var/log/example.log
      parser: json

      processors:
        logs:
          - name: content_modifier
            action: upsert
            key: my_new_key
            value: 123

  filters:
    - name: grep
      match: '*'
      regex: key pattern

  outputs:
    - name: stdout
      match: '*'
```

{% endtab %}
{% endtabs %}
Here is a more complete example with multiple processors:
{% tabs %}
{% tab title="fluent-bit.yaml" %}

```yaml
service:
  log_level: info
  http_server: on
  http_listen: 0.0.0.0
  http_port: 2021

pipeline:
  inputs:
    - name: random
      tag: test-tag
      interval_sec: 1

      processors:
        logs:
          - name: modify
            add: hostname monox

          - name: lua
            call: append_tag
            code: |
                function append_tag(tag, timestamp, record)
                  new_record = record
                  new_record["tag"] = tag
                  return 1, timestamp, new_record
                end

  outputs:
    - name: stdout
      match: '*'

      processors:
        logs:
          - name: lua
            call: add_field
            code: |
                function add_field(tag, timestamp, record)
                  new_record = record
                  new_record["output"] = "new data"
                  return 1, timestamp, new_record
                end
```

{% endtab %}
{% endtabs %}
Processors can be attached to inputs and outputs.
### How Processors are different from Filters
In the following example, the `grep` filter is used as a processor to filter log events based on a pattern:
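The example itself is truncated in this diff. As a minimal sketch of what attaching `grep` as a log processor could look like (the path and the key/pattern in `regex` are illustrative assumptions, not the original example):

```yaml
# Hypothetical sketch: the grep filter attached as a log processor,
# keeping only records whose `level` key matches "error".
pipeline:
  inputs:
    - name: tail
      path: /var/log/example.log
      processors:
        logs:
          - name: grep
            regex: level error

  outputs:
    - name: stdout
      match: '*'
```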
0 commit comments