
Write Text to a Kafka Topic

The Write Text to a Kafka Topic output connector can be used to write event data, adapted and formatted as delimited text, to an Apache Kafka Topic. For more information about getting started with Apache Kafka, see Apache Kafka Introduction.

Usage notes

Keep the following in mind when working with the Write Text to a Kafka Topic output connector:

  • Use this output connector to write data, adapted and formatted as delimited text, to a Kafka Topic. This output connector is a producer to Kafka.
  • This output connector pairs the Text outbound adapter with the Kafka outbound transport.
  • A message separator and an attribute separator are required to format event records as delimited text before writing to Kafka. The message separator indicates the character that identifies the end of a data record; the default is \n (new line). The attribute separator specifies the character used to separate one attribute from another in a single line of text; the default is , (comma). Any normal ASCII character, specified as the character or Unicode value, can be used as a message or attribute separator.
  • The Kafka outbound transport supports TLS 1.2 and SASL security protocols for authenticating with a Kafka cluster or broker.
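The separator behavior described above can be sketched in a few lines of Python. This is an illustration of the formatting convention, not GeoEvent Server code; the function name and events are hypothetical, and the defaults mirror the documented ones (comma for attributes, newline for messages):

```python
# Sketch of delimited-text formatting as described above (not GeoEvent Server code).
# Defaults mirror the documented separators: "," for attributes, "\n" for messages.

def format_events(events, attribute_separator=",", message_separator="\n"):
    """Join each event's attribute values, then join events into one payload."""
    lines = [attribute_separator.join(str(value) for value in event) for event in events]
    # The message separator marks the end of each record, so append a trailing one.
    return message_separator.join(lines) + message_separator

payload = format_events([
    ["vehicle-17", -117.19, 34.05, "2024-05-01T12:00:00.000Z"],
    ["vehicle-42", -116.98, 33.87, "2024-05-01T12:00:05.000Z"],
])
```

Any other ASCII character, or a Unicode value, could be passed in place of the defaults, matching the behavior described for the connector.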

Parameters

The following are the parameters for the Write Text to a Kafka Topic output connector:


Name

A descriptive name for the output connector used for reference in GeoEvent Manager.

Override with Custom Kafka Properties

Specify whether to override the default GeoEvent Server Kafka client properties. The default is No.

  • Yes—The default Kafka client properties exposed by the transport will be overridden. A folder registered with GeoEvent Server must be specified that contains a Kafka .properties file with the correct formatting for a valid Kafka configuration. See Apache Kafka Configuration for a list of supported configurations and expected formatting for the specified .properties file.
  • No—The default Kafka client properties exposed by the transport will not be overridden. Kafka Bootstrap Servers must be specified.
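For illustration, a custom .properties file for this scenario might look like the following. The property names are standard Apache Kafka producer configuration keys; the host names and values are placeholders:

```properties
# Hypothetical sample.properties for a Kafka producer
bootstrap.servers=broker0.example.com:9092,broker1.example.com:9092
acks=all
compression.type=gzip
linger.ms=5
```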

Kafka Bootstrap Servers

(Conditional)

A list of hostname:port pairs used to establish the initial connection to the Kafka cluster. Hostname:port pairs must be comma separated, for example, broker0.example.com:9092,broker1.example.com:9092,broker2.example.com:9092.

Registered Folder for the Kafka Properties File

(Conditional)

The folder registered with GeoEvent Server that contains the Kafka .properties file. The Kafka .properties file defines the custom Kafka properties when Override with Custom Kafka Properties is set to Yes. Ensure that the folder registered with GeoEvent Server is the full path to where the Kafka .properties file is located.

Kafka Properties File Name

(Conditional)

The name of the Kafka .properties file that contains the custom Kafka properties for client configuration. The name of the file should be specified without the .properties extension.

  • If the name of the custom Kafka properties file is sample.properties, specify this parameter as sample.

Topic Name

The name of the Kafka topic to publish data to.

  • topic1

Note:

The Kafka outbound transport does not support publishing data to multiple topics.

Enable Exactly Once Delivery

Specifies whether exactly once semantics should be honored when writing to the topic. See Exactly Once Semantics for more information. The default is Yes.

  • Yes—Kafka will honor exactly once semantics.
  • No—Kafka will not honor exactly once semantics.

Note:

Ensure the Kafka cluster supports exactly once semantics before continuing with the Enable Exactly Once Delivery parameter. Enabling exactly once semantics comes with a performance cost. For more information, see Producer Configs as related to acknowledgments.
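In standard Apache Kafka producer configuration, exactly once delivery corresponds to idempotence and transactions. The following fragment sketches the relevant producer keys; the transactional ID value is a placeholder, and the exact properties GeoEvent Server sets internally are not documented here:

```properties
# Producer settings commonly associated with exactly once semantics
enable.idempotence=true
acks=all
# Hypothetical ID; must be unique per producer instance
transactional.id=geoevent-producer-1
```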

Event Separator

A single literal character which indicates the end of an event data record. Unicode values may be used to specify a character delimiter. The character should not be enclosed in quotes. A newline (\n) is a common end-of-record delimiter.

Field Separator

A single literal character used to separate one attribute value from another in a message. Unicode values may be used to specify a character delimiter. The character should not be enclosed in quotes. A comma is a common attribute delimiter.

Output Date Format

Specifies the format of date/time values written to the file. The available output date formats are ISO 8601 Format or a Custom Format. The default is ISO 8601 Format.

  • ISO 8601 Format—Processed date/time values will be constructed in adherence to the ISO 8601 format (yyyy-MM-dd'T'HH:mm:ss.SSSXXX).
  • Custom Format—Processed date/time values will be constructed in adherence to a custom format using the Java SimpleDateFormat class convention.

Custom Date Format

The custom format for date/time values written to the file. The custom date format should be constructed using the Java SimpleDateFormat class convention. For more information, see the Java SimpleDateFormat documentation.
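To make the ISO 8601 pattern above concrete, the following Python sketch reproduces the same output using strftime as a stand-in for Java's SimpleDateFormat. Note the two places where the conventions differ: SimpleDateFormat's SSS is milliseconds while Python's %f is microseconds, and XXX renders UTC as Z while %z gives +0000:

```python
# Illustration of the documented ISO 8601 pattern (yyyy-MM-dd'T'HH:mm:ss.SSSXXX)
# using Python's strftime as a stand-in for Java's SimpleDateFormat.
from datetime import datetime, timezone

moment = datetime(2024, 5, 1, 12, 0, 0, 123000, tzinfo=timezone.utc)

# "SSS" is milliseconds; %f gives microseconds, so derive a 3-digit value.
iso_8601 = (
    moment.strftime("%Y-%m-%dT%H:%M:%S.")
    + f"{moment.microsecond // 1000:03d}"
    + moment.strftime("%z")
)

# "XXX" renders UTC as "Z"; %z gives "+0000", so normalize for UTC.
iso_8601 = iso_8601.replace("+0000", "Z")
```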

Language for Number Formatting

The locale identifier (ID) used for locale-sensitive behavior when formatting numbers from data values. The default is the locale of the machine GeoEvent Server is installed on. For more information, see Java Supported Locales.

Authentication Required

Specifies whether the connection to the Kafka cluster, or Kafka broker, requires authentication. The default is No.

  • Yes—Authentication is required to connect to the Kafka cluster or broker.
  • No—Authentication is not required to connect to the Kafka cluster or broker.

Authenticate Using

(Conditional)

Specifies the security protocol that is used to secure the Kafka cluster. Available security protocols include TLS 1.2 and SASL.

  • TLS 1.2—The security protocol used by the Kafka cluster is TLS 1.2. Ensure that the Kafka cluster's PKI file (x509 certificate) is imported into the trust store of the ArcGIS Server with which ArcGIS GeoEvent Server is configured. For instructions on importing certificates, see the Import the certificate into ArcGIS Server section in Configure ArcGIS Server with an existing CA-signed certificate.
  • SASL—The security protocol used by the Kafka cluster is SASL. Only SASL/PLAIN and SASL/GSSAPI (Kerberos) are supported.

Note:

When using Kerberos, ensure the operating system user account running ArcGIS GeoEvent Server has read access to the keytab file in the Kerberos setup and configuration.

The parameter is shown when Authentication Required is set to Yes.
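For reference, SASL/PLAIN over TLS is expressed in standard Apache Kafka client configuration with the properties below. The keys are standard Kafka configuration names; the username and password are placeholders, and whether GeoEvent Server assembles these exact properties internally is an assumption:

```properties
# Hypothetical client properties for SASL/PLAIN over TLS (values are placeholders)
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="alice" \
  password="alice-secret";
```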

Registered Folder for Credential File

(Conditional)

The folder registered with GeoEvent Server that contains the Kafka cluster's PKI file (x509 certificate). Ensure that the folder registered with GeoEvent Server is the full path to where the Kafka cluster's certificate is located.

Credential Configuration File

(Conditional)

The name of the Kafka cluster's PKI file (x509 certificate). The certificate and its associated private key must be stored in the PKCS#12 format, which is represented by a file with either the .p12 or .pfx extension. Provide the name of the file in addition to the extension.

  • my_kafka_certificate.pfx
  • my_other_kafka_certificate.p12

Note:

Only the certificate file name and extension are supported for this parameter. Relative paths to the certificate should not be specified in this parameter. Register the full path to the certificate file using the Registered Folder for Credential File parameter.

The parameter is shown when Authentication Required is set to Yes. It is applicable to TLS 1.2 only.
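If the cluster's certificate and private key are available as PEM files, a PKCS#12 bundle like the one described above can be produced with OpenSSL. The sketch below first generates a throwaway self-signed certificate so the commands are self-contained; in practice you would substitute the cluster's real certificate and key, and all file names and the password are placeholders:

```shell
# Generate a throwaway self-signed certificate for demonstration (placeholder names).
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=kafka.example.com" \
  -keyout key.pem -out cert.pem

# Bundle the certificate and private key into PKCS#12 (.pfx / .p12) format.
openssl pkcs12 -export -in cert.pem -inkey key.pem \
  -out my_kafka_certificate.pfx -passout pass:changeit
```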

Keystore Password

(Conditional)

The password for the Kafka cluster's PKI file (x509 certificate). This password protects the certificate's private key.

SASL Authentication Type

(Conditional)

Specifies the type of SASL authentication mechanism supported by the Kafka cluster. Available SASL authentication types include SASL/GSSAPI (Kerberos) and SASL/PLAIN.

  • SASL/GSSAPI (Kerberos)—The Kafka cluster uses SASL/GSSAPI Kerberos authentication.
  • SASL/PLAIN—The Kafka cluster uses SASL/PLAIN authentication.

Kerberos Principal

(Conditional)

The Kerberos principal for the specific user, for example, GeoEventKafkaClient1@example.com.

Use Key Tab

(Conditional)

Indicates whether to use the keytab in the Kerberos settings. The default is Yes.

  • Yes—The keytab will be used in the Kerberos settings.
  • No—The keytab will not be used in the Kerberos settings.

Store Key

(Conditional)

Indicates whether to store the key in the Kerberos settings. The default is Yes.

  • Yes—The key will be stored in the Kerberos settings.
  • No—The key will not be stored in the Kerberos settings.

Username

(Conditional)

Specifies the username used to authenticate with the Kafka cluster. This is also known as a connection string with certain cloud providers. Refer to the documentation of the chosen cloud provider for correct syntax.

Password

(Conditional)

Specifies the password used to authenticate with the Kafka cluster. Refer to the documentation of the chosen cloud provider for the correct syntax.

Considerations and limitations

The Write Text to a Kafka Topic output connector is a producer to Kafka. Apply the same considerations to this output connector as would be required for any other external producer to Kafka.