Overview
Build innovative apps that extend Splunk’s capabilities. This theme focuses on creating impactful solutions that solve real-world challenges by integrating seamlessly with Splunk’s ecosystem.
Sample Problem Statements: Splunk App Development
Develop an app that enhances workflows, providing faster response and resolution using real-time Splunk data.
Create a customized dashboarding tool that delivers actionable insights for specific industries such as healthcare, retail, or finance.
Build an app that automates reporting and compliance tracking.
Design an app to encourage data quality improvements within enterprise environments.
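Each of the app ideas above consumes live Splunk data. A minimal, stdlib-only sketch of how an app might pull results from Splunk's REST search API (the `/services/search/jobs` endpoint with `exec_mode=oneshot`); the host, session key, and query here are placeholders, and a production app would typically use the Splunk SDK and verify TLS:

```python
import json
import urllib.parse
import urllib.request

SPLUNK_HOST = "https://localhost:8089"  # placeholder: Splunk management port

def build_search_body(query: str) -> bytes:
    """The search endpoint expects queries to start with 'search' unless
    they begin with a generating command such as '| tstats'."""
    if not query.strip().startswith(("search", "|")):
        query = "search " + query
    return urllib.parse.urlencode({
        "search": query,
        "output_mode": "json",
        "exec_mode": "oneshot",  # block until results are ready
    }).encode()

def run_oneshot_search(query: str, session_key: str) -> dict:
    """POST the query to the REST search endpoint and parse the JSON reply."""
    req = urllib.request.Request(
        SPLUNK_HOST + "/services/search/jobs",
        data=build_search_body(query),
        headers={"Authorization": f"Splunk {session_key}"},
    )
    with urllib.request.urlopen(req) as resp:  # needs a live Splunk instance
        return json.loads(resp.read())
```

From here, an app can feed the parsed results into its own dashboards, reports, or alerting logic.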
Supporting Links
Click here to learn about submission guidelines, tech stack requirements, and more details.
Overview
Create powerful Splunk Add-ons that integrate external data sources, services, and platforms with Splunk. This track focuses on enabling seamless data ingestion, enrichment, and interoperability between Splunk and third-party tools, making it easier for users to extract valuable insights from diverse data sources.
Sample Problem Statements: Splunk Add-on Development
Identity and Authentication Integration: As an integration engineer for a software vendor that has developed a SaaS product for secure identity and authentication, you are tasked with creating an integration that enables Splunk customers to collect authentication events from your product into Splunk. Your product exposes a REST API that can be used to retrieve these events, and you know that you can create "modular inputs" in Splunk to ingest data from external sources. Customers will require the ability to configure multiple tenants, as many customers use separate tenants for their different business units.
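The core of such a modular input is a polling loop with per-tenant checkpointing so that no event is ingested twice. A stdlib-only sketch of that logic; the `/api/v1/events` endpoint, `timestamp` field, and bearer-token auth are assumptions about the hypothetical product, and a real add-on would build this on `splunklib.modularinput` with Splunk-managed checkpoint storage:

```python
import json
import urllib.request

def filter_new_events(events, last_seen: str):
    """Keep only events newer than the per-tenant checkpoint and return
    the updated checkpoint (ISO-8601 timestamps compare lexically)."""
    fresh = [e for e in events if e["timestamp"] > last_seen]
    new_checkpoint = max((e["timestamp"] for e in fresh), default=last_seen)
    return fresh, new_checkpoint

def poll_tenant(base_url: str, token: str, last_seen: str):
    """One polling cycle against the (hypothetical) events endpoint
    of a single configured tenant."""
    req = urllib.request.Request(
        f"{base_url}/api/v1/events?since={last_seen}",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:  # needs the live tenant API
        return filter_new_events(json.load(resp), last_seen)
```

Running `poll_tenant` once per configured tenant per interval is what gives customers the required multi-tenant support.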
IT Service Management Ticketing Integration: As an integration engineer for a software vendor that has developed a SaaS product for IT service management, you are tasked with creating an integration that enables Splunk customers to send events from their Splunk searches to their ticketing system for investigation. Your product exposes a webhook-based API that can be used from other products to create tickets, and you know that you can create "modular alert actions" in Splunk to act on results from searches. You'll also need a configuration page where details of the customer's tenant, such as URL and API token, can be provided.
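A sketch of the heart of such a modular alert action: mapping one search result row onto a ticket payload and POSTing it to the customer-configured webhook. The ticket schema and bearer-token auth are hypothetical; in a real add-on, Splunk invokes the alert-action script with the search results, and the configuration page supplies the URL and token:

```python
import json
import urllib.request

def build_ticket_payload(result: dict, search_name: str) -> dict:
    """Map one Splunk search result onto the (hypothetical)
    ticket-creation schema of the ITSM product."""
    return {
        "title": f"Splunk alert: {search_name}",
        "description": json.dumps(result, sort_keys=True),
        "severity": result.get("severity", "medium"),
    }

def send_ticket(webhook_url: str, api_token: str, payload: dict) -> int:
    """POST the payload to the customer-configured webhook URL
    and return the HTTP status code."""
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_token}",
        },
    )
    with urllib.request.urlopen(req) as resp:  # needs the live webhook
        return resp.status
```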
Threat Intelligence Integration: As an integration engineer for a cybersecurity vendor that provides a real-time threat intelligence feed (including malicious IP addresses, domains, and vulnerabilities), you are tasked with building a Splunk Technical Add-on. This Add-on will enable customers to pull threat intelligence data from your REST-based or streaming API into Splunk for enhanced security analytics. The Add-on should include a modular input to handle recurring data retrieval (with scheduling and robust error handling), automatic data parsing and enrichment (e.g., mapping fields to Splunk’s Common Information Model), and a configuration page where customers can specify API credentials, data filtering options, and ingestion intervals. Additionally, the Add-on should incorporate best practices for data volume management, allowing customers to filter or throttle incoming threat data to match their licensing and operational requirements.
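A sketch of the parsing-and-enrichment step this Add-on calls for: renaming hypothetical feed fields to CIM-aligned names, plus a simple confidence-based throttle for volume management. The `threat_match_field`/`threat_match_value` names follow Splunk Enterprise Security's threat-intel conventions, but the exact field set should be verified against the CIM version in use:

```python
# Hypothetical feed field names mapped to CIM-style field names.
FEED_TO_CIM = {
    "ip": "threat_match_value",
    "domain": "threat_match_value",
    "last_seen": "threat_last_seen",
    "confidence": "threat_confidence",
}

def to_cim(indicator: dict) -> dict:
    """Rename feed fields to CIM-aligned names, drop unmapped keys,
    and tag which field the indicator value came from."""
    out = {cim: indicator[src] for src, cim in FEED_TO_CIM.items() if src in indicator}
    if "ip" in indicator:
        out["threat_match_field"] = "ip"
    elif "domain" in indicator:
        out["threat_match_field"] = "domain"
    return out

def throttle(indicators, min_confidence: int):
    """Simple volume control: drop indicators below a confidence floor,
    so ingestion can be matched to license and operational limits."""
    return [i for i in indicators if i.get("confidence", 0) >= min_confidence]
```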
Supporting Links
Click here to learn about submission guidelines, tech stack requirements, and more details.
Overview
Harness the power of SPL2 (the second generation of Splunk's Search Processing Language) to design efficient, scalable pipelines for data ingestion, transformation, and analysis. This theme encourages participants to simplify complex data workflows, optimize data processing, and extract meaningful insights.
Sample Problem Statements: SPL2 Data Processing
Reduce Log Volume: As a Network Operations Manager, you are tasked with managing and monitoring the massive volume of logs generated by your company's networking devices. This high volume of network log data leads to storage issues, longer processing times, and difficulty in finding relevant logs among the noise. It also increases the cost of log management and makes it challenging to comply with various data retention policies. For the provided data sources (X & Y), you are tasked with creating SPL2 pipelines that address these problems.
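Not SPL2 itself, but a Python sketch of the two standard volume-reduction moves an SPL2 pipeline would express (filtering with `where`, projection with `fields`): drop routine events outright, then keep only the fields downstream searches actually use. The field and action names are assumptions about firewall-style sources:

```python
KEEP_FIELDS = {"_time", "src_ip", "dest_ip", "action"}  # assumed schema
NOISE_ACTIONS = {"allowed"}  # e.g. drop routine allow events entirely

def reduce_log(event: dict):
    """Return None for noisy events (dropped from the pipeline);
    otherwise project the event down to the fields worth indexing."""
    if event.get("action") in NOISE_ACTIONS:
        return None
    return {k: v for k, v in event.items() if k in KEEP_FIELDS}
```

Tuning which actions are noise and which fields survive is where most of the license and storage savings come from.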
Convert Logs to Metrics: As a Data Administrator, you know that converting logs to metrics is a valuable technique for observability: it enhances performance monitoring, simplifies analysis, reduces storage costs, and enables real-time alerting at scale. The Ingest Processor supports a dedicated logs-to-metrics (L2M) function in SPL2 and allows routing data to both the Splunk Metrics Index and Observability Cloud. For the provided data sources (X & Y), you are tasked with creating SPL2 pipelines that convert logs into useful metrics that downstream engineers, site-reliability engineers, and admins can consume.
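A Python sketch of the aggregation a logs-to-metrics conversion performs: rolling raw events up into per-minute datapoints per host. The event field names are assumed, the `metric_name:` keys mirror Splunk's multi-metric JSON convention, and the Ingest Processor's L2M function expresses this same rollup declaratively:

```python
from collections import defaultdict

def logs_to_metrics(events):
    """Aggregate raw log events into per-minute metric datapoints:
    a request count and a response-time sum per (window, host)."""
    counts = defaultdict(int)
    totals = defaultdict(float)
    for e in events:
        key = (e["_time"] // 60, e["host"])  # 60-second windows
        counts[key] += 1
        totals[key] += e["response_ms"]
    return [
        {"time": w * 60, "host": h,
         "metric_name:request.count": counts[(w, h)],
         "metric_name:response.ms.sum": totals[(w, h)]}
        for (w, h) in counts
    ]
```

The resulting datapoints are what a metrics index stores: far smaller than the raw logs and directly chartable and alertable.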
Supporting Links
Configuring and Deploying Splunk Data Management Pipeline Builders
Reducing PAN and Cisco Security Firewall Logs with Splunk Edge Processor
Converting Logs into Metrics with Edge Processor for Beginners
Click here to learn about submission guidelines, tech stack requirements, and more details.
Overview
Develop ML-based threat detections inside Splunk using the Splunk Machine Learning Toolkit (MLTK). Take advantage of the Splunk ecosystem by bringing your data into Splunk and building real-time pipelines that catch threat actors with MLTK.
Sample Problem Statements: Machine Learning-Based Threat Detection
Unusual Volume of Bytes Written to USB per Device Model
Unusual Volume of Box Downloads per User Model
VPN Login from an Unusual Location per User Model
Unusual Time of Print Commands per User Model
Unusual Volume of Pages Printed per Device Model
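These detections share one statistical idea, which MLTK's DensityFunction algorithm automates: learn a per-entity baseline of normal behavior, then flag observations that fall far outside it. A stdlib-only Python sketch using a z-score as a stand-in (the entity/value shape and the z=3 threshold are illustrative, not MLTK's actual implementation):

```python
import math
from collections import defaultdict

def fit_baselines(rows):
    """Per-entity mean and standard deviation of the monitored value,
    e.g. bytes written to USB per device model."""
    groups = defaultdict(list)
    for entity, value in rows:
        groups[entity].append(value)
    baselines = {}
    for entity, vals in groups.items():
        mean = sum(vals) / len(vals)
        var = sum((v - mean) ** 2 for v in vals) / len(vals)
        baselines[entity] = (mean, math.sqrt(var))
    return baselines

def is_unusual(entity, value, baselines, z=3.0):
    """Flag values more than z standard deviations from the entity's baseline."""
    mean, std = baselines[entity]
    return std > 0 and abs(value - mean) / std > z
```

In MLTK, the equivalent workflow is a `fit` search to train the per-entity model and a scheduled `apply` search to score new data in near real time.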
Supporting Links
Get Data into Splunk User Behavior Analytics - CIM and Threat Detection
Welcome to the Splunk Machine Learning Toolkit - Splunk MLTK
Algorithms in the Splunk Machine Learning Toolkit - Splunk MLTK
Click here to learn about submission guidelines, tech stack requirements, and more details.