
Connecting Splunk with LLM
Why Use LLMs for DFIR in Splunk?
As DFIR professionals, we deal with massive volumes of logs—security events, network traffic, endpoint telemetry, and more. While Splunk’s SPL (Search Processing Language) is powerful, what if we could ask natural language questions like:
“Show me suspicious login attempts in the last 24 hours.”
“Summarize the most critical security events from our firewall logs.”
“Explain this error log and suggest troubleshooting steps.”
The Large Language Model (LLM) for Splunk add-on makes this possible by integrating Retrieval-Augmented Generation (RAG) with Splunk. This means you can query logs conversationally while ensuring the LLM responds based only on your log data—not hallucinations from its training set.
In this guide, I’ll walk you through setting this up on a Windows-based Splunk deployment using Docker.
Prerequisites
Before we begin, ensure you have:
✅ Splunk Enterprise installed (Windows)
✅ Docker Desktop for Windows
✅ At least 16GB RAM (LLMs are resource-hungry)
✅ Administrative access to configure containers
Step 1: Setting Up the Required Containers
This solution relies on three key components:
1. Ollama (for running open-weight LLMs like llama3)
2. Milvus (vector database for RAG)
3. Splunk LLM Connector (container for interfacing with Splunk)
1. Install Milvus (Vector Database)
Milvus stores embeddings of your logs for fast retrieval. Run this in PowerShell (Admin):
# Download the Milvus standalone Docker Compose file
curl -o docker-compose.yml https://raw.githubusercontent.com/milvus-io/milvus/master/deployments/docker/standalone/docker-compose.yml

# Start Milvus
docker-compose up -d
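Before moving on, it is worth confirming Milvus actually came up. A quick sanity check, assuming the default standalone Compose file (which names the container `milvus-standalone` and maps the health endpoint to host port 9091):

```shell
# Confirm the Milvus container is running
docker ps --filter "name=milvus-standalone" --format "{{.Names}}: {{.Status}}"

# Hit the standalone health endpoint; expect an OK response when healthy
curl -s http://localhost:9091/healthz
```

If the container shows a restart loop here, fix it before wiring up the rest of the stack — the connector has nothing to talk to otherwise.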
2. Install Ollama (LLM Engine)
Ollama lets us run models like llama3 locally:
# Pull and run Ollama
docker pull ollama/ollama
docker run -d -p 11434:11434 --name ollama ollama/ollama

# Download required models
docker exec ollama ollama pull nomic-embed-text   # for embeddings
docker exec ollama ollama pull llama3             # for text generation
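Ollama exposes a REST API on port 11434, which you can use to verify both the service and the pulled models before involving Splunk at all:

```shell
# List the models Ollama has pulled (should include llama3 and nomic-embed-text)
curl -s http://localhost:11434/api/tags

# One-off, non-streaming generation test to confirm the model loads and responds
curl -s http://localhost:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Say hello in one word.", "stream": false}'
```

The first generation call can be slow because the model is loaded into memory on demand; subsequent calls are much faster.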
3. Install the Splunk LLM Connector
This container bridges Splunk to the LLM:
docker pull bendenzer98/splunk_llm:latest
docker run -d -p 5555:5555 -e llm_URL="http://ollama:11434" -e Milvus_URL="http://milvus:19530" -e Local_Or_NAI="Local" -e Embeddings_URL="http://ollama:11434" --name splunk_llm bendenzer98/splunk_llm:latest
4. Connect Containers in a Network
Ensure all containers can communicate:
docker network create llm_net
docker network connect llm_net ollama
docker network connect llm_net milvus-standalone
docker network connect llm_net splunk_llm
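Attaching all three containers to the same user-defined network is what makes hostnames like `http://ollama:11434` resolve inside the connector. You can confirm membership with `docker network inspect`:

```shell
# List which containers have joined llm_net;
# ollama, milvus-standalone, and splunk_llm should all appear
docker network inspect llm_net --format '{{range .Containers}}{{.Name}} {{end}}'
```

Note that the connector was configured with `Milvus_URL="http://milvus:19530"`, so if name resolution fails, check that the Milvus container's name (or a network alias) matches the hostname used in that environment variable.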
Step 2: Install the Splunk Add-on
In Splunk Web:
Go to Apps → Manage Apps → Browse More Apps.
Search for “Large Language Model for Splunk” and install it.
Step 3: Configure the Add-on
Navigate to Apps → Large Language Model for Splunk → Configuration.
Set the LLM Server URL to http://localhost:5555.
(Optional) Adjust query limits and logging as needed.
Step 4: Testing & Use Cases for DFIR
Now, try these in Splunk:
1. Natural Language Log Queries
“Show me failed SSH attempts in the last hour.”
“List all outbound connections to known malicious IPs.”
2. Log Summarization
“Summarize the top 10 critical events from our Windows Event Logs.”
3. Troubleshooting Assistance
“Explain this error: ‘Kerberos pre-authentication failed.’”
“What does this Suricata alert mean?”
Troubleshooting Tips
🔹 Containers not starting? Check logs with:
docker logs ollama
docker logs milvus-standalone
docker logs splunk_llm
🔹 Port conflicts? Ensure ports 11434, 19530, and 5555 are free.
🔹 Slow responses? Allocate more CPU/RAM to Docker in settings.
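When triaging the container logs above, a small filter can help surface the usual suspects quickly. The helper below is a hypothetical convenience function (not part of the add-on), shown here as a bash sketch:

```shell
# Hypothetical helper: surface common failure keywords in a log stream
scan_for_errors() {
  grep -iE 'error|fatal|refused|denied' || echo "no obvious errors found"
}

# Pipe any container's logs through it, e.g.:
#   docker logs splunk_llm 2>&1 | scan_for_errors
echo "dial tcp: connection refused by milvus" | scan_for_errors
```

“Connection refused” lines between the connector and Milvus or Ollama usually point back to the Step 1 networking setup rather than to Splunk itself.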
Final Thoughts
This integration unlocks conversational log analysis, making Splunk even more powerful for DFIR teams. While still in its early stages, the ability to ask questions in plain English—and get contextual, log-based answers—could revolutionize how we investigate incidents.
Future improvements?
Support for more models (Mistral, GPT-4, etc.)
Pre-built detection prompts for common DFIR tasks
Have you tried LLMs with Splunk? Let me know in the comments!
Happy hunting! 🚀
📌 Liked this guide? Share it with your DFIR team!
🔗 Follow me on [Twitter/LinkedIn] for more Splunk & DFIR tips.