Configuring Proofpoint on Demand Connectors

Use this Stellar Cyber connector to ingest Proofpoint on Demand email logs into the data lake. The connector uses Proofpoint's streaming API, keeping an open WebSocket connection to receive streaming data.
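As a point of reference, the following minimal Python sketch (using the websocket-client package) illustrates the kind of long-lived subscription the connector maintains. Only the domain (logstream.proofpoint.com:443), the Cluster ID and API Key credentials, and the maillog/message content types come from this page; the endpoint path and query parameter names shown are assumptions for illustration, so consult Proofpoint's Log API documentation for the exact values.

    # Illustrative sketch only: the /v1/stream path and the cid/type parameter
    # names are assumptions; the domain, credentials, and content types are
    # the values documented on this page.
    # Requires: pip install websocket-client
    import websocket

    CLUSTER_ID = "your_cluster_id"   # assigned by Proofpoint
    API_KEY = "your_api_key"         # generated under Settings > API Key Management
    CONTENT_TYPE = "maillog"         # or "message"

    url = (
        "wss://logstream.proofpoint.com:443/v1/stream"
        f"?cid={CLUSTER_ID}&type={CONTENT_TYPE}"
    )

    def on_message(ws, raw_event):
        # Each frame carries one syslog-formatted mail log or message event.
        print(raw_event)

    ws = websocket.WebSocketApp(
        url,
        header={"Authorization": f"Bearer {API_KEY}"},
        on_message=on_message,
    )
    ws.run_forever()  # keeps the WebSocket open and streams events as they arrive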

The following license is required: Proofpoint On Demand’s Remote Syslog.

If your Stellar Cyber deployment is configured to use an HTTP proxy, be advised that this connector relies on Proofpoint’s API, which uses the WebSocket protocol for log collection (WebSocket is not reliable over HTTP proxy connections).

Stellar Cyber connectors with the Collect function (collectors) may skip some data when the ingestion volume exceeds the collector's processing capacity, which can lead to data loss.

Connector Overview: Proofpoint on Demand

Capabilities

  • Collect: Yes

  • Respond: No

  • Native Alerts Mapped: No

  • Runs on: DP

  • Interval: Every hour

Collected Data

  • Content Type: Mail Log, Message

  • Index: Syslog

  • Locating Records:

    • msg_class: proofpoint_on_demand_maillog or proofpoint_on_demand_message

    • msg_origin.source: proofpoint_on_demand

    • msg_origin.vendor: proofpoint

    • msg_origin.category: email

Domain

logstream.proofpoint.com:443

Response Actions

N/A

Third Party Native Alert Integration Details

N/A

Required Credentials

  • Cluster ID and API Key


Adding a Proofpoint on Demand Connector

To add a Proofpoint on Demand connector:

  1. Obtain prerequisites
  2. Add the connector in Stellar Cyber
  3. Test the connector
  4. Verify ingestion

Obtaining Prerequisites

If your Stellar Cyber deployment is configured to use an HTTP proxy, be advised that this connector relies on Proofpoint’s API, which uses the WebSocket protocol for log collection (WebSocket is not reliable over HTTP proxy connections).

Before you configure the connector in Stellar Cyber, you must obtain and verify the following from your POD deployment.

  • Configuration of this connector requires your Proofpoint deployment be licensed for Proofpoint On Demand’s Remote Syslog.

  • API Keys as follows:

    • To configure collection of maillog and message together in one connector: One API key is needed.

    • To configure collection of maillog or message in two separate connectors: The same API key can be used for the two connectors.

    • To configure collection of maillog in more than one connector: You need a different API key for each connector.

    • To configure collection of message in more than one connector: You need a different API key for each connector.

    Generate API keys from the Admin portal at https://admin.proofpoint.com. Choose API Key Management under Settings.

  • Cluster ID: This is assigned by Proofpoint and is located at the top of the Proofpoint on Demand deployment management interface.
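Before adding the connector, you can optionally sanity-check the Cluster ID and API key pair with a one-shot handshake, as in the Python sketch below. As with the earlier sketch, the endpoint path and parameter names are illustrative assumptions; a rejected key or wrong Cluster ID typically surfaces as a failed handshake status.

    # One-shot handshake to confirm the Cluster ID / API key pair is accepted.
    # Uses the same illustrative endpoint assumptions as the sketch above.
    import websocket  # pip install websocket-client

    CLUSTER_ID = "your_cluster_id"
    API_KEY = "your_api_key"

    url = f"wss://logstream.proofpoint.com:443/v1/stream?cid={CLUSTER_ID}&type=maillog"
    try:
        conn = websocket.create_connection(
            url,
            header={"Authorization": f"Bearer {API_KEY}"},
            timeout=10,
        )
        print("Handshake succeeded; credentials look valid.")
        conn.close()
    except websocket.WebSocketBadStatusException as exc:
        # A 401/403 here usually indicates a bad API key or Cluster ID.
        print(f"Handshake rejected: {exc}")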

Adding the Connector in Stellar Cyber

To add a Proofpoint on Demand connector in Stellar Cyber:

  1. Log in to Stellar Cyber.

  2. Click System | Integration | Connectors. The Connector Overview appears.

  3. Click Create. The General tab of the Add Connector screen appears. The information on this tab cannot be changed after you add the connector.

    The asterisk (*) indicates a required field.

  4. Choose Email from the Category drop-down.

  5. Choose Proofpoint on Demand from the Type drop-down.

  6. For this connector, the supported Function is Collect, which is already enabled.

  7. Enter a Name.

    This field does not accept multibyte characters.

  8. Choose a Tenant Name. The Interflow records created by this connector include this tenant name.

  9. Choose the device on which to run the connector.

    • Certain connectors can be run on either a Sensor or a Data Processor. The available devices are displayed in the Run On menu. If you want to associate your collector with a sensor, you must have configured that sensor prior to configuring the connector or you will not be able to select it during initial configuration. If you select Data Processor, you will need to associate the connector with a Data Analyzer profile as a separate step. That step is not required for a sensor, which is configured with only one possible profile.

    • If the device you're connecting to is on premises, we recommend you run on the local sensor. If you're connecting to a cloud service, we recommend you run on the DP.

  10. (Optional) When the Function is Collect, you can create Log Filters. For information, see Managing Log Filters.

  11. Click Next. The Configuration tab appears.

    The asterisk (*) indicates a required field.

  12. Enter the Cluster ID you obtained earlier.

  13. Enter the API Key you obtained earlier.

  14. Choose the Content Type you would like to collect. Mail Log and Message logs are supported.

  15. Click Next. The final confirmation tab appears.

  16. Click Submit.

    To pull data, a connector must be added to a Data Analyzer profile if it is running on the Data Processor.

  17. If you are adding rather than editing a connector with the Collect function enabled and you specified for it to run on a Data Processor, a dialog box now prompts you to add the connector to the default Data Analyzer profile. Click Cancel to leave it out of the default profile or click OK to add it to the default profile.

    • This prompt only occurs during the initial create connector process when Collect is enabled.

    • Certain connectors can be run on either a Sensor or a Data Processor, and some are best run on one versus the other. In any case, when the connector is run on a Data Processor, that connector must be included in a Data Analyzer profile. If you leave it out of the default profile, you must add it to another profile. You need the Administrator Root scope to add the connector to the Data Analyzer profile. If you do not have privileges to configure Data Analyzer profiles, a dialog displays recommending you ask your administrator to add it for you.

    • The first time you add a Collect connector to a profile, it pulls data immediately and then not again until the scheduled interval has elapsed. If the connector configuration dialog did not offer an option to set a specific interval, it is run every five minutes. Exceptions to this default interval are the Proofpoint on Demand (pulls data every 1 hour) and Azure Event Hub (continuously pulls data) connectors. The intervals for each connector are listed in the Connector Types & Functions topic.

    The Connector Overview appears.

The new connector is immediately active.

A new Proofpoint on Demand connector automatically collects logs from the last 7 days. Because of this backfill, it can take a few days before collection catches up to current logs.

Testing the Connector

When you add (or edit) a connector, we recommend that you run a test to validate the connectivity parameters you entered. (The test validates only authentication and connectivity; it does not validate data flow.)

  1. Click System | Integration | Connectors. The Connector Overview appears.

  2. Locate the connector that you added or modified, or that you want to test.

  3. Click Test at the right side of that row. The test runs immediately.

    Note that you may run only one test at a time.

Stellar Cyber conducts a basic connectivity test for the connector and reports a success or failure result. A successful test indicates that you entered all of the connector information correctly.

To aid in troubleshooting your connector, the dialog remains open until you explicitly close it by using the X button. If the test fails, you can edit the connector from the same row to review and correct issues.

The connector status is updated every five (5) minutes. A successful test clears the connector status, but if issues persist, the status reverts to failed after a minute.

Repeat the test as needed.


Verifying Ingestion

To verify ingestion:

  1. Click Investigate | Threat Hunting. The Interflow Search tab appears.
  2. Change the Indices to Syslog. The table immediately updates to show ingested Interflow records.
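To narrow the results to this connector's records, filter on the fields listed under Collected Data above. For example, a search such as the following (adjust the query syntax to your Stellar Cyber version) isolates Proofpoint on Demand mail log records:

    msg_origin.source: "proofpoint_on_demand" AND msg_class: "proofpoint_on_demand_maillog"

Use msg_class: proofpoint_on_demand_message to locate Message records instead.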