Splunk Build-a-thon!

211 Registered Allowed team size: 1 - 2
Idea Phase
Online
starts on:
Apr 28, 2025, 04:00 PM UTC
ends on:
May 26, 2025, 12:00 AM UTC
Prototype Phase
Online
starts on:
May 26, 2025, 04:00 PM UTC
ends on:
Jun 23, 2025, 12:00 AM UTC

Overview

Slack

Not a member? Request access here!

Themes

Track 1: Splunk App Development

Overview

Build innovative apps that extend Splunk’s capabilities. This theme focuses on creating impactful solutions that solve real-world challenges by integrating seamlessly with Splunk’s ecosystem.

Sample Problem Statements: Splunk App Development

  1. Develop an app that enhances workflows, providing faster response and resolution using real-time Splunk data.

  2. Create a customized dashboarding tool to deliver actionable insights for specific industries like healthcare, retail, finance, etc.

  3. Build an app that automates reporting and compliance tracking.

  4. Design an app to encourage data quality improvements within enterprise environments.
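As a flavor of what problem 3 (automated reporting and compliance tracking) might look like, here is a minimal, hypothetical Python sketch: it summarizes audit events by action and flags records missing required fields. All field names and the helper itself are illustrative assumptions; in a real Splunk app this logic would run over search results retrieved via the Splunk SDK and feed a dashboard or scheduled report.

```python
from collections import Counter
from datetime import datetime, timezone

def compliance_report(events, required_fields=("user", "action", "timestamp")):
    """Summarize audit events and flag records missing required fields.

    `events` is a list of dicts, standing in for results a Splunk app
    would fetch from a search. Field names here are illustrative.
    """
    summary = Counter()
    noncompliant = []
    for event in events:
        summary[event.get("action", "unknown")] += 1
        missing = [f for f in required_fields if f not in event]
        if missing:
            noncompliant.append({"event": event, "missing": missing})
    return {
        "generated_at": datetime.now(timezone.utc).isoformat(),
        "actions": dict(summary),
        "noncompliant_count": len(noncompliant),
        "noncompliant": noncompliant,
    }

# Example: one complete event, one missing its timestamp (flagged).
events = [
    {"user": "alice", "action": "login", "timestamp": "2025-05-01T10:00:00Z"},
    {"user": "bob", "action": "delete"},
]
report = compliance_report(events)
```

The same pattern extends naturally to per-policy rules (retention windows, approval fields, and so on) once the required-field check is swapped for richer validators.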

Supporting Links

Click here to learn about submission guidelines, tech stack requirements, and more details.

Track 2: Splunk Add-on Integration Development

Overview

Create powerful Splunk Add-ons that integrate external data sources, services, and platforms with Splunk. This track focuses on enabling seamless data ingestion, enrichment, and interoperability between Splunk and third-party tools, making it easier for users to extract valuable insights from diverse data sources.

Sample Problem Statements: Splunk Add-on Development

  1. Identity and Authentication Integration: As an integration engineer for a software vendor that has developed a SaaS product for secure identity and authentication, you are tasked with creating an integration that enables Splunk customers to collect authentication events from your product into Splunk. Your product exposes a REST API that can be used to retrieve these events, and you know that you can create "modular inputs" in Splunk to ingest data from external sources. Customers will require the ability to configure multiple tenants, as many customers use separate tenants for their different business units.

  2. IT Service Management Alert Actions: As an integration engineer for a software vendor that has developed a SaaS product for IT service management, you are tasked with creating an integration that enables Splunk customers to send events from their Splunk searches to their ticketing system for investigation. Your product exposes a webhook-based API that can be used from other products to create tickets, and you know that you can create "modular alert actions" in Splunk to act on results from searches. You'll also need a configuration page where details of the customer's tenant, such as URL and API token, can be provided by the customer.

  3. Threat Intelligence Integration: As an integration engineer for a cybersecurity vendor that provides a real-time threat intelligence feed (including malicious IP addresses, domains, and vulnerabilities), you are tasked with building a Splunk Technical Add-on. This Add-on will enable customers to pull threat intelligence data from your REST-based or streaming API into Splunk for enhanced security analytics. The Add-on should include a modular input to handle recurring data retrieval (with scheduling and robust error handling), automatic data parsing and enrichment (e.g., mapping fields to Splunk’s Common Information Model), and a configuration page where customers can specify API credentials, data filtering options, and ingestion intervals. Additionally, the Add-on should incorporate best practices for data volume management, allowing customers to filter or throttle incoming threat data to match their licensing and operational requirements.

Supporting Links

Click here to learn about submission guidelines, tech stack requirements, and more details.

Track 3: Data Management (SPL2 Pipelines)

Overview

Harness the power of SPL2 (version 2 of Splunk's Search Processing Language) to design efficient, scalable pipelines for data ingestion, transformation, and analysis. This theme encourages participants to simplify complex data workflows, optimize data processing, and extract meaningful insights.

Sample Problem Statements: SPL2 Data Processing

  1. Reduce Log Volume: As a Network Operations Manager for a company that utilizes networking devices, you are tasked with effectively managing and monitoring the massive volume of logs that these devices generate. This high volume of network log data is leading to storage issues, longer processing times, and difficulty in finding relevant logs among the noise. It also increases the cost of log management and makes it challenging to comply with various data retention policies. For the provided data sources (X & Y), you are tasked with creating SPL2 pipelines that address these problems.

  2. Convert Logs to Metrics: As a Data Administrator, converting logs to metrics is a valuable tool for observability because it enhances performance monitoring, simplifies analysis, reduces storage costs, and enables real-time alerting at scale. The Ingest Processor supports a dedicated logs-to-metrics (L2M) function in SPL2 and allows routing data to both Splunk Metrics Index and Observability Cloud. For the provided data sources (X & Y), you are tasked with creating SPL2 pipelines that convert logs into useful metrics that downstream engineers, site-reliability engineers, and admins can consume.
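In the competition itself, the logs-to-metrics conversion would be written in SPL2 for the Ingest Processor; the following stdlib-only Python sketch merely illustrates the shape of the transformation, under assumed field names (`service`, `ts`, `response_ms`): raw per-request log lines are bucketed into fixed time windows and reduced to metric points such as request count and average latency.

```python
from collections import defaultdict

def logs_to_metrics(logs, window=60):
    """Aggregate per-request logs into per-window metric points:
    request count and average response time per (service, window)."""
    buckets = defaultdict(list)
    for log in logs:
        # Align each log's timestamp to the start of its window.
        key = (log["service"], log["ts"] - log["ts"] % window)
        buckets[key].append(log["response_ms"])
    return [
        {"service": svc, "ts": ts,
         "metric:request.count": len(vals),
         "metric:response_ms.avg": sum(vals) / len(vals)}
        for (svc, ts), vals in sorted(buckets.items())
    ]

logs = [
    {"service": "api", "ts": 10, "response_ms": 120},
    {"service": "api", "ts": 50, "response_ms": 80},
    {"service": "api", "ts": 70, "response_ms": 200},
]
metrics = logs_to_metrics(logs)  # two windows: ts=0 (2 requests), ts=60 (1)
```

The payoff is the one described in the problem statement: downstream consumers store and alert on a handful of metric points per window instead of every raw log line.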

Supporting Links

Click here to learn about submission guidelines, tech stack requirements, and more details.

Track 4: AI/ML

Overview

Develop ML-based threat detections inside Splunk using the Machine Learning Toolkit (MLTK). Take advantage of the Splunk ecosystem by bringing your data into Splunk and building real-time pipelines to catch threat actors with MLTK.

Sample Problem Statements: Machine Learning-Based Threat Detection

  1. Unusual Volume of Bytes Written to USB per Device Model

  2. Unusual Volume of Box Downloads per User Model

  3. VPN Login from an Unusual Location per User Model

  4. Unusual Time of Print Commands per User Model

  5. Unusual Volume of Pages Printed per Device Model
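All five detections above share one pattern: model what is "normal" per entity (per device, per user), then flag observations far outside that entity's own baseline. In MLTK this is typically done with `fit`/`apply` and algorithms such as DensityFunction; the stdlib-only Python sketch below illustrates the idea with a per-entity z-score baseline. The entity names, sample volumes, and the 3-sigma threshold are all illustrative assumptions.

```python
import statistics

def fit_baselines(history):
    """Per-entity (mean, stdev) baselines from historical daily volumes,
    e.g. bytes written to USB per device, or pages printed per user."""
    return {
        entity: (statistics.mean(vals), statistics.pstdev(vals))
        for entity, vals in history.items()
    }

def is_anomalous(baselines, entity, value, threshold=3.0):
    """Flag values more than `threshold` standard deviations above the
    entity's own baseline (the per-device / per-user modeling above)."""
    mean, stdev = baselines[entity]
    if stdev == 0:
        return value > mean
    return (value - mean) / stdev > threshold

# Historical daily byte volumes for one (hypothetical) USB device.
history = {"usb-dev-01": [100, 120, 110, 95, 105]}
baselines = fit_baselines(history)
```

A scheduled Splunk search would recompute the baselines periodically and run the anomaly check against each new day's aggregated volumes, alerting on the flagged entities.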

Supporting Links

Click here to learn about submission guidelines, tech stack requirements, and more details.

Prizes

Main Prizes
Grand Prize (one per track, 4 total):
  • A $400 Splunk Store gift card per person
  • 150 EDU credits ($1,500 value per track; $6,000 across all 4 tracks)
  • A .conf25 pass ($1,795 early-bird value; travel & lodging not included)
  • Recognition across Splunk’s social media channels and community platforms, including a special Developer Spotlight blog
2nd Place Prize (one per track, 4 total):
  • A $350 Splunk Store gift card
  • 100 EDU credits ($1,000 value per track; $4,000 across all 4 tracks)
  • Recognition across Splunk’s social media channels and community platforms, including a special Developer Spotlight blog
3rd Place Prize (one per track, 4 total):
  • A $300 Splunk Store gift card
  • 50 EDU credits ($500 value per track; $2,000 across all 4 tracks)
  • Recognition across Splunk’s social media channels and community platforms, including a special Developer Spotlight blog


Help & Support

Please contact the event admin: HackerEarth Support at support@hackerearth.com
