Splunk Enterprise is a well-known standard for machine data analytics, used by application business owners as well as IT engineers. Its rich feature set goes far beyond log management requirements; we would rather describe Splunk as „BI Log Software” than as a log server/collector. Elasticsearch, on the other hand, is well suited for simple log processing and storage. Splunk beats Elastic on features, but ELK covers more data – that is the rule. Splunk's licence counts the gigabytes of data indexed every day, and that policy is often a blocker for larger Splunk investments. We see that customers use Splunk for data that is more important from a business perspective and requires Splunk's special analytical functions. All other data – OS logs, syslog data, AD and proxy history – is directed to Energy Logserver and Elastic, because a Splunk licence covering all of a company's requirements would simply cost too much. We can clearly say that Splunk is for business as Elastic is for IT.

Do we have to choose? Can we take the benefits of both?

Our answer to this is the „Splunk2Elastic connector” which we created. We all know that both applications rely on API calls and are thoroughly documented. Both Splunk and Elastic are made for integrations, and the variety of add-ons makes it possible to establish communication between the platforms. So let's give it a try!
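Both platforms indeed expose plain REST APIs. As a hedged illustration of what such an integration rests on, the sketch below assembles an Elasticsearch `_search` call with nothing but the standard library; the host, port, index pattern and credentials are placeholders, not values from the real connector.

```python
import base64
import json

def build_es_search(host, index_pattern, query_string="*",
                    username=None, password=None):
    """Return the URL, headers and JSON body of an Elasticsearch _search call."""
    url = "http://%s:9200/%s/_search" % (host, index_pattern)
    headers = {"Content-Type": "application/json"}
    if username and password:
        # Optional basic auth, e.g. when Energy Logserver or X-Pack security is enabled
        token = base64.b64encode(("%s:%s" % (username, password)).encode()).decode()
        headers["Authorization"] = "Basic " + token
    body = json.dumps({"query": {"query_string": {"query": query_string}}})
    return url, headers, body
```

Sending the request is then a single `urllib.request.urlopen` call; a connector wraps an equivalent exchange behind an SPL command.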

Splunk2Elastic Connector is software made for Splunk Enterprise, working as its internal application. After installation and basic configuration, such as providing the Elasticsearch IP (and optional authorization to Energy Logserver or X-Pack), we are given new SPL commands. Look at the architecture:

And see how it works!

A regular Splunk user keeps their GUI but can use the new commands that enable the integration. We start with a simple query in ELK Kibana, showing documents from the syslog stream of a CR25ia network device. We see all the results in ELK:

Our Splunk knows nothing about this:

Now let's integrate. Use the connector to search in ELK:

The result comes to the Splunk console. There are important features of the Splunk2Elastic Connector which we should mention:

  • The connector respects the time picker, shown in the example above as „Last 60 minutes” – so we search only as much as we have to
  • We can pass arguments to the ELK search, directing the query to a given index pattern – we can even write a full query, which makes the search run faster
  • All the documents keep their original parsing structure; syslog fields such as priority and level are shown correctly
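The first two points above boil down to translating Splunk's time bounds into an Elasticsearch range filter. The sketch below shows one plausible way to do that; the `@timestamp` field name and `epoch_millis` convention are assumptions about the target index, not the connector's documented behaviour.

```python
def add_time_range(es_query, earliest_epoch, latest_epoch, time_field="@timestamp"):
    """Wrap a query in a bool filter restricting it to [earliest, latest]."""
    return {
        "query": {
            "bool": {
                "must": [es_query],
                "filter": [{
                    "range": {
                        time_field: {
                            # Elasticsearch date fields accept epoch_millis values
                            "gte": earliest_epoch * 1000,
                            "lte": latest_epoch * 1000,
                        }
                    }
                }],
            }
        }
    }
```

With a „Last 60 minutes” picker, `earliest_epoch` would simply be `latest_epoch - 3600`, so only that window is searched on the ELK side.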

Splunk still does not store this data, but it stays open for processing in the SPL search bar:

All the ELK data can be handled with Splunk commands, as if it were processed locally. Once we use a „pipe”, e.g. | search CR25ia, we have all of Splunk's features at our disposal.
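For illustration only, the semantics of `| search CR25ia` on the returned events is roughly „keep documents where the term appears in any field value”. Splunk does this itself; the toy function and sample documents below just make the behaviour concrete.

```python
def spl_search(docs, term):
    """Keep documents whose any field value contains the term (case-insensitive)."""
    term = term.lower()
    return [d for d in docs
            if any(term in str(v).lower() for v in d.values())]

# Sample documents standing in for events fetched from ELK
docs = [
    {"host": "CR25ia", "message": "interface up"},
    {"host": "fw01",   "message": "login failed"},
]
```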

The data we see is still not stored in Splunk, which is why we do not need to worry about licence utilization. Can we work with this data? Of course YES! We can build reports and trigger alarms, as in the example below:

The integration with Energy Logserver/Elastic enables Splunk in a new area. Suddenly a competing platform becomes a source of benefit, as the integration delivers a new scope of data to business owners.

Every integration relies on system connectivity – network and bandwidth play an important role in connector performance, and remote searches will always be slower than handling tasks locally. That is possible too: once the result from ELK is delivered to Splunk, we can decide to store it forever in a local index – and that is the moment of licence utilization. We use the collect SPL command, which puts the visible output into a designated index.

Like that:

Starting from now, the data is in Splunk, so we can search again for events from the example device CR25ia:

Why did we make the Splunk2Elastic connector?

  • IT log centralization projects require enormous data space for logs that are of little importance to the organization but must be kept due to regulations
  • IT budgets do not allow implementing Splunk for the full set of IT logs
  • We want to use Splunk where it has never been – the area of 100% of IT logs
  • We reduce the risk of the investment – you do not have to be afraid of peaks in the data stream; your Splunk will not be blocked
  • We create an umbrella system out of Splunk. A good place for this solution is to play the master role in all integrations.

Sample use case: log analytics in terms of security – Splunk as a SIEM

In this example we must analyze a big amount of data, so using Splunk2Elastic is a must in many projects of this type. Let's create background reporting tasks that connect to ELK and search for specific security-relevant data:

These tasks are scheduled every 5 minutes and search all new data in ELK. If a search meets its conditions, the results are taken back to Splunk in raw format, with all necessary metadata added. The results get a proper classification with additional metadata and are stored in a local Splunk index from then on.
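The polling logic described above can be sketched as follows. A checkpoint marks the newest event already processed, so each run handles only new data, and every kept hit is tagged with classification metadata before being handed back to Splunk. The `epoch` field and the category label are illustrative assumptions, not the product's actual schema.

```python
def security_poll(hits, last_checkpoint, category="authentication_failure"):
    """Classify new Elasticsearch hits and advance the checkpoint."""
    results, newest = [], last_checkpoint
    for hit in hits:
        ts = hit["_source"]["epoch"]
        if ts <= last_checkpoint:       # already processed in a previous run
            continue
        event = dict(hit["_source"])
        event["category"] = category    # metadata added before storing in Splunk
        results.append(event)
        newest = max(newest, ts)
    return results, newest
```

A scheduler would call this every 5 minutes, persisting the returned checkpoint between runs so no event is indexed twice.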

Each search result gets its category and can be easily displayed on a SOC dashboard.

Each event is stored in Splunk as a result. If a log does not carry security-relevant data, it stays in ELK. All security data comes to Splunk and uses its licence – a cost that is in fact very reasonable. The connection of Splunk and Elastic delivers a highly scalable solution that can be easily adapted to new performance requirements.

Get it for free