Progress Chef and Splunk, two prominent DevOps solutions, can work together to bring more depth to real-time monitoring by automating data collection across systems.
This integration benefits organizations by allowing Splunk to analyze configuration data, compliance metrics, infrastructure changes and application logs provided by Chef.
This blog discusses how this integration works, what data is shared and how to set it up.
Progress Chef Automate provides an integrated platform for automating infrastructure, applications and compliance tasks across your organization's environments. It’s the hub for managing and orchestrating workflows, maintaining consistency and driving compliance within DevOps pipelines.
Splunk is a real-time platform designed to search, monitor, analyze and visualize machine-generated data. It's widely used for operational intelligence, application monitoring, security and troubleshooting across IT systems and infrastructures. Chef Automate includes a Data Feed service that can send data to Splunk.
The Data Feed service sends node data to a third-party service. This can be useful when updating configuration management databases, external security dashboards and IT service management platforms.
The service sends two types of information: client run data and compliance report data. The Data Feed service aggregates this data and sends it to the registered destinations every four hours (the interval is customizable). Note that aggregation does not happen if there are no registered destinations. Data is combined and transmitted in batches of 50 nodes at a time.
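The batching behavior can be sketched conceptually. This is illustrative Python, not Chef Automate's actual implementation:

```python
def batch_nodes(nodes, batch_size=50):
    """Yield successive fixed-size batches, mirroring how the Data Feed
    service combines node data into groups of 50 before transmission."""
    for i in range(0, len(nodes), batch_size):
        yield nodes[i:i + batch_size]

# 120 nodes are transmitted as three batches of 50, 50 and 20.
batches = list(batch_nodes([f"node-{n}" for n in range(120)]))
print([len(b) for b in batches])  # → [50, 50, 20]
```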
First, download Splunk or start a free trial from the Splunk website. Next, follow the instructions below:
- URL: Input the endpoint URL for your Splunk feed, including the port.
- Access Token: Alternatively, choose a token type and input the token value.
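For reference, a Splunk HTTP Event Collector (HEC) endpoint URL typically takes the form https://&lt;splunk-host&gt;:8088/services/collector/event, with the token passed in an Authorization header. The helper below (illustrative Python; the function name and placeholder values are our own, not part of either product) builds such a request without sending it:

```python
import json

def build_hec_request(token, event, index=None):
    """Build the headers and JSON body for a Splunk HEC POST.
    HEC expects 'Authorization: Splunk <token>' and the event payload
    nested under the 'event' key."""
    headers = {
        "Authorization": f"Splunk {token}",
        "Content-Type": "application/json",
    }
    payload = {"event": event}
    if index:
        payload["index"] = index  # route the event to a specific Splunk index
    return headers, json.dumps(payload)

headers, body = build_hec_request(
    "00000000-0000-0000-0000-000000000000",  # placeholder token
    {"message": "chef-automate test event"},
    index="chef_automate",
)
print(headers["Authorization"])  # → Splunk 00000000-0000-0000-0000-000000000000
```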
To modify Data Feed behavior, create a configuration patch file containing the settings you want to change. Save the file in the .toml format with any name you like, for example data-feed-patch.toml. Include one or more configuration settings and their updated values in this file to define the desired global Data Feed behavior.
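A patch file along these lines (setting names per the Chef Automate data-feed configuration reference; verify them against your Automate version) would change the feed interval to two minutes:

```toml
# data-feed-patch.toml
[data_feed_service.v1.sys.service]
  feed_interval = "2m"
```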
Above is an example of data-feed-patch.toml. Here, we send data to Splunk every two minutes.
Find more details in the Chef Automate Data Feed documentation.
To apply your configuration changes, run the Chef Automate command-line tool:

$ chef-automate config patch data-feed-patch.toml
On the Splunk side, go to Settings > Indexes > New Index and create an index; this is where the incoming events will be stored. Then go to Settings > Data Inputs and create a data input by choosing the HTTP Event Collector (HEC). Note: verify that your tokens are in the 'Enabled' state along with the other settings.
Once you fill in all the above settings for both Automate and Splunk, the data will be sent to Splunk according to the defined global settings.
Example:
In this scenario, we scan two Linux machines and an AWS environment, creating separate scan jobs in the Automate UI based on their respective profiles.
Per the global setting, this data will be sent to Splunk every two minutes. To confirm this, open the Search & Reporting app in Splunk and search for your index.
The picture above shows the raw data sent directly from Automate to Splunk. You can edit, modify or select entities based on your requirements, or even use this data to construct a custom dashboard.
Integrating Chef Automate with Splunk empowers organizations with robust visibility into their infrastructure and compliance data, streamlining operations and improving decision-making. By centralizing the rich insights of Chef Automate within Splunk's powerful analytics platform, teams can easily monitor trends, identify anomalies and maintain compliance.
If you need more information or help integrating the above tools, don't hesitate to contact your Account Manager.