Splunk to Sentinel Migration – Part I

I have been a busy bee the last few years. Solliance has kept me focused on a myriad of projects and clients spanning Machine Learning, AI, Security, IoT, and much more.

Recently, I have been focused on building internal training for a whole new breed of security professionals inside Microsoft called Security Cloud Solution Architects (CSAs). As part of their training, they must learn about Azure Defender, Microsoft Defender, Azure Sentinel, etc.

One of the topics I added to the training focuses on moving customers to Azure Sentinel, which means retiring the old SIEM/SOAR system in favor of the new one. Some folks would say that this is an easy transition; however, if you have ever been tasked with a migration project of this kind, you know better.

At a high level, most SIEMs share the same set of concepts: Alerts, Alert Actions, Search Queries (written in the tool's own query language), Dashboards, etc. Each of these presents a different set of challenges when migrating to another platform such as Azure Sentinel. In the following blog series, I will run through how I have solved some of these challenges and how I have automated the process.

First, let’s talk about the artifacts that exist in Splunk:

  1. Alerts
  2. Alert Actions
  3. Source Types
  4. Searches
  5. Dashboards
  6. Reports
  7. Namespaces
  8. Permissions

Splunk Query Language to Kusto Query Language

Splunk customers put a lot of work into adding log sources based on source types and then building queries against the data that is ingested and indexed from them. Once built, these queries tend to be used in multiple places such as Alerts, Dashboards, and Reports. Splunk queries are written in the Search Processing Language (SPL), which is vastly different from Azure Sentinel's Kusto Query Language (KQL). Mapping 1:1 from Splunk to Sentinel is not easy, because it requires you to understand both the SPL syntax and the corresponding KQL syntax. For example, take the following query in SPL:

| rest splunk_server=$splunk_server$ /services/server/info
| fields splunk_server version
| join type=outer splunk_server [rest splunk_server=$splunk_server$ /services/server/status/installed-file-integrity
    | fields splunk_server check_ready check_failures.fail]
| eval check_status = case(isnull('check_failures.fail') AND isnotnull(check_ready), "enabled", 'check_failures.fail' == "check_disabled", "disabled", isnull(check_ready), "feature unavailable")
| eval check_ready = if(check_status == "enabled", check_ready, "N/A")
| fields version check_status check_ready
| rename version AS "Splunk version" check_status AS "Check status" check_ready AS "Results ready?"

Would become something like this in KQL:

let table0 = externaldata (data:string) [@"/services/server/status/installed-file-integrity"]| project splunk_server,check_ready,check_failures.fail;
externaldata (data:string) [@"/services/server/info"]| project splunk_server,version
| join kind = outer (table0) on splunk_server
| extend check_status  = case (isnull(check_failures.fail) and isnotnull(check_ready),"enabled",'check_failures.fail' == "check_disabled","disabled",isnull(check_ready),"feature unavailable")
| extend check_ready  = iff (check_status == "enabled",check_ready,"N/A")
| project-rename version = "Splunk version",check_status = "Check status",check_ready = "Results ready?"

If you thought manually transforming queries was a challenge, consider the complexities of automating the process (which is how the conversion above was produced). It requires writing a tokenizer, then building a two-pass compiler to construct the language graph. From there, the SPL graph has to be transformed into a KQL graph, and finally the KQL query text is emitted from the converted graph. Can anyone say “Dragon Book”?!? I knew you could!
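
To make that a little more concrete, here is a minimal Python sketch of the first stage only, assuming a grossly simplified token set: it splits a piped SPL query into command nodes that a second pass could then classify and map onto KQL operators. The token rules and node shape are my own illustration, not the actual converter.

import re
from dataclasses import dataclass, field

# Token rules: a real SPL tokenizer needs many more (macros, subsearches,
# comparison operators, comments, ...); this is just enough for the demo below.
TOKEN_RE = re.compile(r"""
    (?P<STRING>"[^"]*"|'[^']*')    # quoted strings
  | (?P<WORD>[A-Za-z0-9_.$]+)      # bare words: commands, fields, values
  | (?P<OP>=|,|\(|\))              # simple operators and punctuation
""", re.VERBOSE)

@dataclass
class CommandNode:
    name: str                       # the SPL command, e.g. "eval" or "rename"
    tokens: list = field(default_factory=list)

def parse_spl(query: str) -> list:
    """First pass: split the query on pipes and tokenize each command segment."""
    nodes = []
    for segment in query.strip().lstrip("|").split("|"):
        tokens = [m.group(0) for m in TOKEN_RE.finditer(segment)]
        if tokens:
            nodes.append(CommandNode(name=tokens[0], tokens=tokens[1:]))
    return nodes

# A second pass would walk these nodes, classify each token (field, literal,
# table) against Splunk metadata, and emit the equivalent KQL operators
# (fields -> project, eval -> extend, rename -> project-rename, ...).
for node in parse_spl('| fields splunk_server version | rename version AS "Splunk version"'):
    print(node.name, node.tokens)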

There are some quirks to all of this, however. I have to know what is a string, a variable, a table name, or a column name…which requires interrogating the Splunk indexes for that information. Once I have all the metadata in hand (err, memory), I can more easily determine what each token in the query represents when building the graph and the subsequent KQL queries.
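
As a sketch of what that interrogation can look like, the snippet below uses Splunk's documented management endpoints (/services/data/indexes, plus a oneshot search job running fieldsummary) to build an index-to-fields lookup; the client code, URL, and credentials are hypothetical.

import requests

SPLUNK = "https://splunk.example.com:8089"   # hypothetical management port URL
AUTH = ("admin", "changeme")                  # hypothetical credentials

def list_indexes():
    """Enumerate indexes via the Splunk management API."""
    r = requests.get(f"{SPLUNK}/services/data/indexes",
                     params={"output_mode": "json", "count": 0},
                     auth=AUTH, verify=False)   # self-signed certs are common here
    r.raise_for_status()
    return [entry["name"] for entry in r.json()["entry"]]

def fields_for_index(index):
    """Run a oneshot fieldsummary search to learn which field names exist."""
    r = requests.post(f"{SPLUNK}/services/search/jobs",
                      params={"output_mode": "json"},
                      data={"search": f"search index={index} | fieldsummary | fields field",
                            "exec_mode": "oneshot"},
                      auth=AUTH, verify=False)
    r.raise_for_status()
    return {row["field"] for row in r.json().get("results", [])}

# Build the lookup the converter consults while classifying tokens.
metadata = {ix: fields_for_index(ix) for ix in list_indexes()}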

Some may find this task daunting; however, if you do a search you may find a tool called https://uncoder.io. Although it shows great promise when you load the site, it doesn’t really work (it is broken most of the time), and thus I had to build my own, as you can see above.

Apps and Source Types – Oh My.

But…indexes are based on the Source Types that have been added in Splunk. Splunk provides several source types out of the box, and new ones can be added through Apps; think F5, Check Point, etc. These source types provide log data that is then indexed and available for querying. The Splunk-based Apps have to be mapped to their Azure Sentinel counterparts (ouch), and the indexes created from Splunk Apps have to be transformed into the corresponding table and column structure on the Azure Sentinel side.
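
To give a feel for that mapping, here is a hypothetical, hand-maintained lookup a migration tool might carry. The source types on the left are common Splunk App-provided ones, and the right-hand side lists the Azure Sentinel (Log Analytics) tables that typically receive equivalent data; the exact pairs depend entirely on which data connectors the customer enables.

# Illustrative only: the real mapping depends on which connectors are
# enabled in the target Sentinel workspace.
SOURCETYPE_TO_SENTINEL = {
    # Splunk source type     ->  Sentinel / Log Analytics table
    "WinEventLog:Security":      "SecurityEvent",      # Security Events connector
    "syslog":                    "Syslog",             # Syslog via the agent
    "cisco:asa":                 "CommonSecurityLog",  # CEF-based firewall data
    "f5:bigip:syslog":           "CommonSecurityLog",  # F5 over CEF/Syslog
    "aws:cloudtrail":            "AWSCloudTrail",      # AWS connector
}

def target_table(sourcetype: str) -> str:
    """Resolve a Splunk source type to the Sentinel table a converted query should target."""
    # Fall back to a custom log table name (hypothetical) when no connector exists.
    return SOURCETYPE_TO_SENTINEL.get(sourcetype, "MyCustomData_CL")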

Once you solve these problems, you can easily automate the query transformations from SPL to KQL.

Exporting Artifacts

In order to automate the entire process, you have to export all of the objects in the Splunk instance. This can be done using the REST APIs provided by Splunk. Splunk stores these objects in a very object-oriented way, and they can be serialized to JSON in the REST responses, which provides a very convenient way of dumping the artifacts for use by a migration tool (such as the one I have written).
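
A minimal sketch of that export step might look like the following, assuming the standard saved-search and dashboard endpoints; the URL, credentials, and file layout are just my own convention.

import json, pathlib, requests

SPLUNK = "https://splunk.example.com:8089"    # hypothetical management URL
AUTH = ("admin", "changeme")                   # hypothetical credentials
OUT = pathlib.Path("splunk_export")
OUT.mkdir(exist_ok=True)

# Saved searches cover alerts and reports; data/ui/views covers dashboards.
# The -/- wildcards request objects across all apps and owners (namespaces).
ENDPOINTS = {
    "saved_searches": "/servicesNS/-/-/saved/searches",
    "dashboards":     "/servicesNS/-/-/data/ui/views",
}

for name, path in ENDPOINTS.items():
    r = requests.get(f"{SPLUNK}{path}",
                     params={"output_mode": "json", "count": 0},
                     auth=AUTH, verify=False)
    r.raise_for_status()
    # Each entry carries the object's name, app/owner namespace, permissions,
    # and content (e.g. the SPL of a saved search): everything the converter needs.
    (OUT / f"{name}.json").write_text(json.dumps(r.json()["entry"], indent=2))

From here, the exported JSON becomes the input to the conversion passes described above.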

Next Post – Alerts and Alert Actions

Join me later this month when I write about how to convert Splunk Alerts and Alert Actions to Azure Sentinel via automated means!

Splunk to Sentinel Blog Series

  1. Part I – SPL to KQL, Exporting Objects
  2. Part II – Alerts and Alert Actions
  3. Part III – Lookups, Source Types and Indexes
  4. Part IV – Searches
  5. Part V – Dashboards and Reports
  6. Part VI – Users and Permissions

References:

  1. Search Processing Language (SPL)
  2. Kusto Query Language (KQL)
  3. Splunk REST API
  4. Splunk Download
  5. Azure Sentinel
  6. My Twitter
  7. My LinkedIn
  8. My Email