Using the eval command (2024)

Splunk’s Search Processing Language (SPL) empowers users to search, analyze, and visualize machine data effortlessly. Using the eval command allows you to apply a wide range of operations for data manipulation, and mastering it enables you to create more meaningful and insightful searches. In this article, we discuss the benefits of using the eval command in your Splunk searches and provide some real-world examples of how it can be used.

Understanding the eval Command

The eval command evaluates expressions and assigns the output to a field. It performs arithmetic operations, string manipulations, conditional logic, and more. With the eval command, you can create new fields or modify existing ones based on complex criteria. This enables you to customize your search results and extract valuable insights from your data.
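Before diving into real data, here is a minimal sketch of that behavior (not taken from the examples below); it uses the makeresults command to generate a single test event, and the field names and values are arbitrary placeholders:

| makeresults
| eval bytes=2048
| eval kilobytes=round(bytes/1024, 2)
| eval size_label=if(kilobytes>1, "large", "small")

Each eval statement either creates a new field (bytes, kilobytes, size_label) or could just as easily overwrite an existing one, which is the core behavior the rest of this article builds on.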

Benefits of Using eval

  • Performing calculations: With eval, you can perform mathematical operations on numeric fields, such as calculating averages, sums, or percentages, directly within your SPL queries.
  • String manipulation: Like basic calculations, string manipulation is a simple use of the eval command that does not require more advanced functions. With eval, any Splunker can create a field containing custom text, tailored using the content of other fields.
  • Handling multivalue data: Some eval functions are specifically designed to read, create, or modify fields that contain multiple values per event. These operations are crucial when working to meet precise data presentation requirements.
  • Interpreting time data: Aggregating logs from different technologies presents a challenge in both parsing and creating time data. Time functions used with the eval command provide the flexibility to interpret and present time data in Splunk events.
  • Applying conditional logic: The eval command supports conditional expressions, giving precise control over modifications to data based on content in the existing dataset.
  • Field value assignment: Every eval operation assigns values to a new field or overwrites the values of an existing field.

Proper Command Syntax

The basic syntax for the eval command is as follows:

index=<index> | eval <new_field> = <expression>

This command accepts new or existing field names and uses combinations of strings, calculations, and eval functions to build expressions that create or modify field values.
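For instance, against the Splunk Tutorial Data used throughout this article (assuming the access_combined_wcookie events carry a numeric bytes field, as shown in Example 1 below), the template maps directly to:

index=tutorial sourcetype=access_combined_wcookie
| eval kilobytes=bytes/1024

Here kilobytes is the <new_field> and bytes/1024 is the <expression>.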

Example 1: Mathematical Calculations

Use Case: Gain further insights on Website Transaction data using Splunk Tutorial Data.

index="tutorial" sourcetype=access_combined_wcookie| stats sum(bytes) as bytes by action| eventstats sum(bytes) as total_bytes| eval percentage=(bytes/total_bytes)*100

This search generates summary statistics on the sum of bytes for each action. Using the eventstats command, we create an additional field (total_bytes) containing the total sum of bytes in the data. A simple calculation with eval can then create a new field showing each action's percentage of the total traffic volume.
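As an optional variation (not part of the original search), the round function covered in the next example can tidy the output for reporting:

index="tutorial" sourcetype=access_combined_wcookie
| stats sum(bytes) as bytes by action
| eventstats sum(bytes) as total_bytes
| eval percentage=round((bytes/total_bytes)*100, 2)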

For more information on the eventstats command used in this example, see the previous Search Command of the Week article, Using the eventstats Command.

Example 2: String Manipulation

Use Case: Create a detailed, dynamic event message using field values from the original dataset.

index=_internal component=LicenseUsage
| stats sum(b) as bytes by idx, h, st
| eval gb=round(bytes/1024/1024/1024, 2)
| eval message="Splunk has ingested "+gb+" GB of data with a sourcetype of "+st+" into the index "+idx+". This data can be found by searching: index="+idx+" sourcetype="+st+" host="+h

Any Splunk instance can use this search with internal Splunk log data to show a breakdown of ingest-based license usage. The initial stats command produces a summarized table, where an eval command performs a calculation. This calculation also uses the round function for data readability. Then, another eval command combines a user-defined string with inserted data for each unique combination of index, sourcetype, and host to create the desired output: a custom event message.
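As a stylistic alternative (a sketch, not a change to the approach above), eval also supports the period character as an explicit concatenation operator, which some Splunkers prefer when mixing numeric and string fields. The first portion of the message could, for example, be written as:

| eval message="Splunk has ingested ".gb." GB of data with a sourcetype of ".st." into the index ".idx

The rest of the message follows the same pattern, and the result is the same as with the + operator.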

Example 3: Handle Multivalue Data

Use Case 1: Use a multivalue list of hosts to create a field for supplemental reporting context.

index=_internal log_level=ERROR
| stats values(host) as host_list by component
| eval host_count=mvcount(host_list)

The query above searches Splunk’s internal logs for ERROR messages and produces a table listing all hosts with errors for each Splunk logging component. The multivalue eval function mvcount is then used to create an additional field (host_count) indicating the number of hosts listed for each logging component. This provides administrators with additional context to prioritize potentially widespread issues in the environment.
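Building on this (a small follow-on sketch; the threshold of 3 is an arbitrary value chosen for illustration), the new host_count field can drive further SPL logic to surface only the more widespread problems:

index=_internal log_level=ERROR
| stats values(host) as host_list by component
| eval host_count=mvcount(host_list)
| where host_count > 3
| sort - host_count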

Use Case 2: Use a multivalue field to normalize hostnames.

index=tutorial
| eval host_group=(random() % 10) + 1
| eval domain=case(host_group<4, ".org.com", host_group>=4 AND host_group<8, ".local", host_group>=8, "")
| eval hostname=case(host_group<6, upper(host), host_group>=6, lower(host))
| eval host=hostname+"-"+host_group+domain
| eval split_host=split(host, ".")
| eval shortname=mvindex(split_host, 0)
| eval host=upper(shortname)
| stats count by host

This query introduces some additional concepts to simulate various hostname formats. Later in this article, we’ll revisit the type of conditional logic demonstrated with case to further illustrate this common use of the eval command. The scenario assigns varying domains (or none at all), as well as varying hostname casing, to the hosts in the Splunk Tutorial Data. This logic produces 20 unique hostname patterns for what could be uniquely identified as 5 hosts.

Using multivalue eval functions in this scenario starts by splitting the hostnames on the “.” character with split. The query then selects the first segment of the hostname using mvindex, referencing index 0 of the split_host field, and writes it to the field shortname. To make the data sortable and consistent, we normalize case with the upper function. Finally, we use this string to overwrite the host field, showing uppercase versions of the 5 original Tutorial Data hosts.
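To see the split and mvindex steps in isolation, here is a standalone sketch using makeresults; the hostname value is purely illustrative:

| makeresults
| eval host="WWW1-3.org.com"
| eval split_host=split(host, ".")
| eval shortname=mvindex(split_host, 0)
| eval host=upper(shortname)

Running this returns a single result where split_host contains three values (WWW1-3, org, com), shortname holds the first of them, and host ends up as the normalized value WWW1-3.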

Example 4: Interpret Time Data

Use Case 1: Modify an epoch timestamp to use a chosen time format.

| tstats latest(_time) as latest_event where index=_internal earliest=-7d latest=now() by host
| eval latest_event=strftime(latest_event, "%Y-%m-%dT%H:%M:%S.%Q")

The search identifies the latest event in the _internal index for each Splunk server and forwarder. It’s helpful for finding missing forwarders, but it returns an epoch timestamp in the latest_event field. Using the strftime function with eval formats the timestamp into a user-friendly, ISO 8601-style representation.
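A quick way to experiment with strftime format strings before applying them to real data is a throwaway search like the following (makeresults supplies the event, and now() returns the current epoch time):

| makeresults
| eval latest_event=now()
| eval iso_time=strftime(latest_event, "%Y-%m-%dT%H:%M:%S.%Q")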

Use Case 2: Parse an additional timestamp field for use in SPL logic.

index=tutorial sourcetype=vendor_sales
| rex field=_raw "\[(?<timestamp>\d{1,2}/\w+/\d{4}:\d{2}:\d{2}:\d{2})\]"
| eval timestamp=strptime(timestamp, "%d/%b/%Y:%H:%M:%S")
| eval ingestion_delay=_indextime-timestamp
| stats avg(ingestion_delay) as avg_delay

This example search sets aside Splunk’s built-in timestamp extraction for the Tutorial Data to focus on troubleshooting ingestion latency. Starting from the vendor_sales sourcetype, a timestamp is extracted as a string into a new field, timestamp. This is done with the rex command, which is explored further in the Search Command of the Week article, Using the rex Command.

The strptime function is used with the eval command to read this string as a valid timestamp. Splunk can then perform a simple calculation to show the ingestion latency between the origination of the event and the time it was indexed in Splunk. With this static dataset, a significant difference will appear, illustrating what a potential issue with an active Splunk Universal Forwarder data stream could look like.
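As an optional refinement (not part of the original search), the tostring function can render the numeric delay in seconds as a human-readable duration:

index=tutorial sourcetype=vendor_sales
| rex field=_raw "\[(?<timestamp>\d{1,2}/\w+/\d{4}:\d{2}:\d{2}:\d{2})\]"
| eval timestamp=strptime(timestamp, "%d/%b/%Y:%H:%M:%S")
| eval ingestion_delay=_indextime-timestamp
| stats avg(ingestion_delay) as avg_delay
| eval readable_delay=tostring(round(avg_delay, 0), "duration")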

Example 5: Applying Conditional Logic

Use Case: Create a field identifying priority events.

index=tutorial sourcetype=access_combined_wcookie
| eval priority_event=if(action=="purchase" AND status>=400, 1, 0)
| where priority_event=1

Like the case function seen in an earlier example, the if function evaluates a condition to apply values to new or existing fields. This scenario uses an action of “purchase” and a status code in the range of potential errors to identify events of interest.

If the only objective in this scenario were to filter data, this SPL would deviate from the best-practice approach of filtering as early in the search as possible. However, using this conditional logic with the eval command provides a discrete field that can assist with producing more detailed visualizations and statistics.

The search below shows a modification that produces a breakdown emphasizing that fewer than 2% of events meet the chosen criteria for concern.

index=tutorial sourcetype=access_combined_wcookie
| eval event_category=if(action=="purchase" AND status>=400, "concern", "deprioritized")
| stats count by event_category

Conclusion

Incorporating the eval command into your Splunk searches greatly expands your ability to extract meaningful information and make data-driven decisions. As you continue to explore its capabilities, you’ll find endless possibilities for transforming your data and gaining valuable insights.

In summary, the eval command in Splunk SPL is a powerful tool for manipulating and deriving fields, enabling you to unlock deeper insights from your data.

To recap:

  • The eval command allows you to perform calculations, manipulations, and conditional logic on fields.
  • It enables you to derive new fields based on existing ones, enhancing your data analysis capabilities.
  • By mastering the eval command, you can create more context for producing insightful reports and visualizations.
