
Cribl and Palo Alto Networks Launch Partnership with Cortex XSIAM Integration

Last edited: April 29, 2025

Cribl’s powerful data processing engine is designed specifically for IT and Security teams, enabling organizations to take control of their ever-growing data volumes by simplifying the management, processing, and analysis of telemetry data, such as logs, metrics, and traces, generated across complex digital environments. This empowers organizations with the choice, control, and flexibility to manage and analyze data, allowing them to adapt to evolving needs and strategies. Cribl’s product suite includes Cribl Stream, the industry’s leading observability pipeline, Cribl Edge, an intelligent vendor-neutral agent, Cribl Search, the industry’s first search-in-place solution, and Cribl Lake, a turnkey data lake.

Cribl’s Technology Alliance Partner (TAP) program is a global ecosystem of technology partners focused on enhancing data management for joint customers. This is done by delivering flexible, scalable data strategies that give customers more choice in managing their data effectively. Partners work with Cribl experts to deliver tight, native integrations that accelerate our joint customers’ data modernization strategy.

Cribl and Palo Alto Networks (PAN) recently launched a partnership with the shared goal of routing third-party data to Cortex XSIAM for optimized search, storage, and analysis in modern Security Operations Centers (SOCs). In our 4.11.0 release, we’re pleased to announce our co-developed Cortex XSIAM destination. Palo Alto Networks also released a new Cribl Stream integration in XSIAM as a source.

Introducing Cortex XSIAM

Palo Alto Networks' Cortex XSIAM (Extended Security Intelligence and Automation Management) is an AI-driven Security Operations Center (SOC) platform designed to unify and streamline threat detection, investigation, and response across an organization's digital environment. It integrates extended detection and response (XDR), security information and event management (SIEM), and security orchestration, automation, and response (SOAR) capabilities, leveraging advanced analytics and automation to accelerate mean-time-to-respond (MTTR) while reducing manual workload for security teams.

Cribl Stream is often used to modernize customer data architectures by centralizing the aggregation of third-party data sources and simplifying data onboarding into Cortex XSIAM through a single management console. If you're planning a migration from a legacy SIEM to Cortex XSIAM, Cribl Stream is the industry standard for quickly routing your security-relevant data to both platforms. This reduces migration costs, minimizes downtime, and ensures that historical data is preserved and accessible, supporting compliance and analysis needs during the transition.

[Image: Cribl Stream routing third-party data sources into Cortex XSIAM]

Getting Started

The integration between Cribl and Cortex XSIAM requires some configuration on both platforms. Before you begin, ensure that you have Admin access to both platforms. You will create an API token in the Cortex XSIAM platform and input both that token and an API endpoint in the new XSIAM destination in Cribl Stream. This allows security-relevant events from multiple third-party data sources to be sent from Cribl Stream into a single Cortex XSIAM API endpoint. Stream pipelines must be used to add data source-specific context (fields) to each event to ensure proper parsing and analysis by the Cortex XSIAM platform.
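
Under the hood, the destination posts batches of events over HTTPS to that endpoint, authenticating with the token. The sketch below only illustrates the idea; the endpoint URL, header name, and body framing here are assumptions, not the documented XSIAM API, and Cribl Stream handles all of this for you once the destination is configured:

```python
import json
import urllib.request

def build_xsiam_request(endpoint_url, token, events):
    """Build an HTTP POST carrying a batch of events to the XSIAM endpoint.

    The Authorization header name and newline-delimited JSON body are
    illustrative assumptions about the transport.
    """
    body = "\n".join(json.dumps(e) for e in events).encode("utf-8")
    return urllib.request.Request(
        endpoint_url,
        data=body,
        headers={
            "Authorization": token,  # token generated by the Cribl integration in XSIAM
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_xsiam_request(
    "https://api-example.example.com/logs/v1/event",  # hypothetical endpoint URL
    "example-token",
    [{"message": "test", "__sourceIdentifier": "okta_sso_raw"}],
)
```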

Configure a Cribl Stream Connector in Cortex XSIAM

Log in to your Cortex XSIAM Console, select “Settings” from the left menu, then select “Data Sources” to begin adding the third-party integration for Cribl. Click “Add Data Source” from the top left of your console.

[Screenshot: Cortex XSIAM Settings > Data Sources > Add Data Source]

Search for “Cribl” or click the “Analytics and SIEM” category to select the new “Cribl” integration as detailed below. After adding the Cribl integration to your Cortex XSIAM instance, you will be provided with an API URL and an authentication token, which you will use to configure the XSIAM destination in Cribl Stream. This is the only time you will be presented with the token, and you can only add a single instance of the Cribl integration.

[Screenshot: Adding the Cribl integration as an XSIAM data source]

Configure a Cortex XSIAM Destination in Cribl Stream

Make sure you have saved your endpoint URL and token from the previous section and have them available for reference in this section. The token is presented to the creator of the XSIAM Cribl integration once, upon creating the Cribl integration.

You should now see a new destination in your Cribl Stream Data > Destinations collection titled “XSIAM”. Configuring this new destination should be as easy as clicking on the XSIAM tile, then clicking the “Add Destination” button, and populating the fields as indicated below.

[Screenshot: Configuring the XSIAM destination in Cribl Stream]

Building a Cribl Stream Pipeline for each Data Source

You will need to create a Cribl Stream Pipeline for each of your data sources to add several fields that direct each event to the appropriate parsing functionality within XSIAM. Several pipelines and data source samples are included in the Palo Alto XSIAM pack, available in the Cribl Pack Dispensary, for your reference and as a starting point to build upon.

We will walk through a few examples in this blog that highlight a couple of important scenarios to understand as you begin your data ingestion with XSIAM.

It is very important that you do not modify or drop events being sent to Palo Alto XSIAM, as doing so may affect parsing, detections, and streaming, behavioral, or baselining analytics.

A combination of the fields below must be added to every event in a pipeline that sends data to the XSIAM destination (note the leading double underscore):

  • __sourceIdentifier — required for ALL events

  • __vendor and __product — this pair of fields is required for SOME events

The PAN documentation that maps each data source to its __sourceIdentifier, __vendor, and __product values is not yet available at the time of this blog; we will update this post when PAN releases it. Several examples of these field mappings that have been validated with the excellent PAN Engineering and Product teams are included in the Cortex XSIAM pack, which we will walk through shortly.

For common data sources related to well-known vendor/product combinations, you will be required to provide only the __sourceIdentifier. For syslog, CEF, LEEF, or custom data sources, you will need to send a generic (non-unique) __sourceIdentifier with the __vendor and __product fields populated. Depending on how your environment is configured, you can either statically assign these values or extract the __vendor and __product pair from each event.

The most common example of why you might need to dynamically assign or extract the values for __vendor and __product would be when you are receiving multiple data sources via a single syslog source. Cribl provides you with the control, choice, and flexibility to use a combination of filtering and routing to maintain multiple syslog pipelines (per data source) or conditions with a single syslog pipeline to assign values. In the examples below, I’m sticking with my personal preference for using separate pipelines for each data source, rather than assigning multiple __vendor and __product pairs in a single pipeline.
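
The per-event logic described above can be sketched as follows. This is purely illustrative; in Cribl Stream you would implement it with an Eval function inside a pipeline, not custom code, and the field values shown are examples, not validated mappings:

```python
def add_xsiam_fields(event, source_identifier, vendor=None, product=None):
    """Attach the internal routing fields XSIAM uses to parse an event.

    __sourceIdentifier is required for every event; __vendor and __product
    are only needed, together, for generic sources (syslog, CEF, LEEF, custom).
    """
    event["__sourceIdentifier"] = source_identifier
    if vendor is not None and product is not None:
        event["__vendor"] = vendor
        event["__product"] = product
    return event

# Well-known source: the identifier alone is enough (value is illustrative)
okta_event = add_xsiam_fields({"msg": "login"}, "okta_sso_raw")

# Generic syslog source: statically assign the vendor/product pair
fgt_event = add_xsiam_fields(
    {"msg": "traffic"}, "syslog", vendor="Fortinet", product="Fortigate"
)
```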

Install the Cortex XSIAM Pack

Within the Stream menu, select “Processing” from the toolbar, select “Packs”, click the “Add Pack” button, then select “Add from Dispensary”.

[Screenshot: Adding a pack from the Cribl Pack Dispensary]

Select the Cortex XSIAM pack from the list of available packs in the Pack Dispensary. After the pack is installed, click on the pack to examine its contents.

As illustrated in the screenshot below, this early rev of the pack includes sample data and routes pointing to data source-specific pipelines. Since this is a demo system, the filter column shows that I am filtering on __inputID, where you may need to filter based on the contents of the event or source.

[Screenshot: Routes and filters in the Cortex XSIAM pack]

Examine the Okta pipeline

From the initial Routes tab in the pack, click on the XSIAM-OKTA pipeline and select the OKTA_SSO.log data sample. As detailed below, expand the “eval” function, select the “out” button, and enable “Show Internal Fields”.

In the left pane, you will see where the __sourceIdentifier field is being added. I’m also adding a __CollectorHost field to help me quickly validate the received events in XSIAM. In the right pane, you will see the addition of the required __sourceIdentifier field. For this data source, Cortex XSIAM does not require the addition of the __vendor and __product fields.

[Screenshot: Okta pipeline eval function adding the __sourceIdentifier field]

Our final configuration step in Cribl Stream is to connect our Okta data source to our new XSIAM destination, ensuring that we route our event data through the Cortex XSIAM pack, as detailed below. I chose to route via Quick Connect to keep the screenshots clean, but you may have factors that require you to use routing (Data Lake FTW) or even attaching the pack directly to the destination as a post-processor.

[Screenshot: Connecting the Okta source to the XSIAM destination via Quick Connect]

Log in to your XSIAM console and select “Investigation” and “Query Builder” from the Cortex XSIAM menu on the left, then click the “XQL Builder” tile to start validating our ingest.

[Screenshot: Cortex XSIAM Query Builder]

To validate that the data was properly received in the XSIAM console, I ran the following XQL query to examine the parsed events. I will remove the collector hostname field from the pipelines when they are ready to be pushed into production.

Query:
dataset = okta_sso_raw
| filter _collector_hostname = "ApgerOkta"

Parsed results:

[Screenshot: Parsed Okta events returned by the XQL query]

Examine the CheckPoint VPN1 & Firewall 1 pipeline (CEF Formatted)

We are going to skip repeating those steps and focus on the use of the generic __sourceIdentifier value, which requires that we provide values for both the __vendor and __product fields. As mentioned earlier, the __vendor and __product values can be assigned statically or extracted from the event, depending on your environment. In this case, the XSIAM-Checkpoint-CEF pipeline from the XSIAM pack extracts the __vendor and __product fields using a regular expression.
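
A sketch of that extraction in Python: the CEF header carries the device vendor and product as its second and third pipe-delimited fields, so a regular expression can pull both values out of each raw event (the actual regex used in the pack's pipeline may differ):

```python
import re

# CEF header layout: CEF:Version|Device Vendor|Device Product|Device Version|...
CEF_HEADER = re.compile(r"CEF:\d+\|([^|]+)\|([^|]+)\|")

def extract_vendor_product(raw_event):
    """Return the (__vendor, __product) pair from a CEF header, or None."""
    match = CEF_HEADER.search(raw_event)
    return (match.group(1), match.group(2)) if match else None

sample = "CEF:0|Check Point|VPN-1 & FireWall-1|R81|accept|Accept|0|src=10.0.0.1"
extract_vendor_product(sample)  # → ("Check Point", "VPN-1 & FireWall-1")
```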

[Screenshot: XSIAM-Checkpoint-CEF pipeline extracting __vendor and __product]

You can validate your event data in XSIAM by issuing the following query:

dataset = check_point_vpn_1_firewall_1_raw
| filter _collector_hostname = "ApgerCheckpoint*"
| filter _collector_type = "cribl"

Examine the Fortinet Fortigate pipeline (syslog)

We are going to use the generic __sourceIdentifier again, but for this data source, the __vendor and __product field values do not exist in the event and need to be statically assigned. The pipeline below, named XSIAM-Fortinet-Fortigate-Syslog from the Cortex XSIAM pack, shows how to assign those values. You will need to ensure that your routing filters within the pack send events to the appropriate pipelines.
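
For reference, the static assignment in such a pipeline's exported JSON boils down to a single Eval function. The values below are illustrative placeholders, so check the pack's actual XSIAM-Fortinet-Fortigate-Syslog pipeline for the validated mappings:

```json
{
  "id": "eval",
  "filter": "true",
  "description": "Statically assign XSIAM routing fields (illustrative values)",
  "conf": {
    "add": [
      { "name": "__sourceIdentifier", "value": "'syslog'" },
      { "name": "__vendor", "value": "'Fortinet'" },
      { "name": "__product", "value": "'Fortigate'" }
    ]
  }
}
```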

[Screenshot: XSIAM-Fortinet-Fortigate-Syslog pipeline statically assigning fields]

You can validate your event data in XSIAM by issuing the following query:

dataset = fortinet_fortigate_raw
| filter _collector_type = "cribl"
| filter _collector_hostname = "ApgerFortinetSyslog"

Learn More

Cribl, the Data Engine for IT and Security, empowers organizations to transform their data strategy. Customers use Cribl’s suite of products to collect, process, route, and analyze all IT and security data, delivering the flexibility, choice, and control required to adapt to their ever-changing needs.

We offer free training, certifications, and a free tier across our products. Our community Slack features Cribl engineers, partners, and customers who can answer your questions as you get started and continue to build and evolve. We also offer a variety of hands-on Sandboxes for those interested in how companies globally leverage our products for their data challenges.
