How to modify result sets and normalize data using commands and functions.
01.-Appendpipe Command (09:50)
02.-Eventstats Command (08:54)
03.-Streamstats Command (06:09)
06.-Working with Xyseries and Untable (12:11)
08.-Modifying Field Values with Eval (11:57)
09.-Eval Conversion Functions (11:14)
10.-Eval text Functions (06:12)
11.-Eval Substr Functions (02:25)
12.-Eval Coalesce Function (02:07)
The appendpipe command takes the existing results and pushes them into a sub-pipeline: the results of the search specified in the appendpipe command are appended to the end/bottom of the outer result set. The sub-pipeline is executed when Splunk reaches the appendpipe command. The sub-pipeline:
- Contains one or more transforming commands.
- Does not overwrite original results; instead, appends output as new lines to the bottom of the original result set.
- Multiple appendpipe commands can exist in a search.
The appendpipe command is preceded by a search (...|) and followed by the sub-pipeline in square brackets. The results of the bracketed search are appended to the results of the outer (preceding) search.
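Schematically, the shape is as follows (the angle-bracketed parts are placeholders, not literal SPL):

```spl
<outer search> | appendpipe [ <one or more transforming commands> ]
```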
This command makes it easy to add subtotal rows to a report.
The examples below use data from the free Splunk Cloud trial. The filter index="_internal" component=Metrics name=parsing processes only the events with the value "Metrics" in the component field and the value "parsing" in the name field.
I want the average number of executions by processor and ingest pipe. I will add a sub-average row per processor and a grand average over the last 15 minutes.
The search index="_internal" component=Metrics name=parsing | stats avg(executes) as my_avg by processor, ingest_pipe produces a three-column table.
The appendpipe [stats avg(my_avg) as my_avg by processor] command appends to the end a two-column table with the average per processor. Notice that the appendpipe sub-search reuses my_avg, the name given to the aggregation column of the outer (preceding) search.
The sort processor command orders the rows so that the subtotal rows appear at the end of each processor value.
To fill in the third column of that two-column subtotal table, I create a new column with the eval command: eval ingest_pipe = "Average for " . processor.
Finally, to add a grand average, I calculate the average of the processor averages. I do that with a second appendpipe command that aggregates the rows whose ingest_pipe value starts with "Average for".
| appendpipe [ search ingest_pipe = "Average for*" | stats avg(my_avg) as my_avg | eval ingest_pipe = " ===== ALL PROCESSOR AVERAGE ======"]
Here is the full search command.
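Assembled from the steps above, it might look like this (a sketch; I have placed the eval inside the first appendpipe so that it labels only the subtotal rows, which matches the described output):

```spl
index="_internal" component=Metrics name=parsing
| stats avg(executes) as my_avg by processor, ingest_pipe
| appendpipe [ stats avg(my_avg) as my_avg by processor
               | eval ingest_pipe = "Average for " . processor ]
| sort processor
| appendpipe [ search ingest_pipe = "Average for*"
               | stats avg(my_avg) as my_avg
               | eval ingest_pipe = " ===== ALL PROCESSOR AVERAGE ======" ]
```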
The eventstats command generates statistics for the fields in your search events and saves them into new fields in the results, inline with the pertinent events.
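As a quick sketch against the same _internal metrics data (the field names overall_avg and pct_of_avg are mine, chosen for illustration):

```spl
index="_internal" component=Metrics name=parsing
| eventstats avg(executes) as overall_avg by processor
| eval pct_of_avg = round(100 * executes / overall_avg, 1)
```

Unlike stats, eventstats does not collapse the events: each original event keeps all its fields and simply gains the new overall_avg column, which is why the follow-up eval can compare each event against its group average.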