Merged
4 changes: 1 addition & 3 deletions antora.yml
Original file line number Diff line number Diff line change
@@ -9,8 +9,6 @@ nav:
asciidoc:
attributes:
company: 'DataStax'
product: 'Astra Streaming'
product-short: 'Astra'
astra-db: 'Astra DB'
astra-ui: 'Astra Portal'
astra-url: 'https://astra.datastax.com'
@@ -21,7 +19,7 @@ asciidoc:
pulsar-version: '3.1' #DO NOT INCLUDE PATCH VERSION <MAJOR>.<MINOR>.<PATCH>
debezium-version: '1.7'
astra-streaming-examples-repo: 'https://raw.githubusercontent.com/datastax/astra-streaming-examples/master'
kafka-for-astra: 'Starlight for Kafka'
starlight-kafka: 'Starlight for Kafka'
starlight-rabbitmq: 'Starlight for RabbitMQ'
cass: Apache Cassandra
cass-short: Cassandra
2 changes: 1 addition & 1 deletion example-guide.adoc
@@ -9,7 +9,7 @@
A guide is a document offering step-by-step instructions to reach some goal. Guides are technical in nature and tend to make assumptions about the consumer's environment. To help create guides that work in most environments, follow these guidelines.

* *Keep links to a minimum* - when someone is learning a new concept for the first time and xref:README.adoc[every] other xref:README.adoc[word] is linked it xref:README.adoc[makes] things xref:README.adoc[confusing] and hard to get a good flow going. Instead, annotate a word or phrase and provide a "Resources" area at the bottom of the guide.
* *Separate products and runtimes in tabs* - it is common to reach the same result in multiple ways. An example is creating a tenant/namespace/topic in {product} and Luna Streaming. Both have the same result but get there in very different ways. Offer each as a tab and let the consumer choose their path. The step after the tabbed step can assume the consumer has completed the previous step and is in a known state. Runtimes follow the same pattern. Whether one is using Java or C#, they are still creating a {pulsar-short} client to interact with the cluster. Create a single step in the guide with multiple tabs for each runtime.
* *Separate products and runtimes in tabs* - it is common to reach the same result in multiple ways. An example is creating a tenant/namespace/topic in Astra Streaming and Luna Streaming. Both have the same result but get there in very different ways. Offer each as a tab and let the consumer choose their path. The step after the tabbed step can assume the consumer has completed the previous step and is in a known state. Runtimes follow the same pattern. Whether one is using Java or C#, they are still creating a {pulsar-short} client to interact with the cluster. Create a single step in the guide with multiple tabs for each runtime.
* *Be thoughtful about the names you use* - if you are learning a new concept or feature with no background on the product, words matter. Labeling a tab as "Luna Helm" and then referring to it as "{pulsar-short} Helm Chart" presents two distinct things to that reader. The author of the document has such deep understanding that they consider those things the same - and technically, at {company}, they are. But the reader isn't from {company}, so be mindful of their context.
* *Talk in first person* - humans create the guides and humans consume the guides. Write as if you are paired with your consumer in doing whatever the guide does. Use "we", "us", "you".
====
13 changes: 6 additions & 7 deletions modules/ROOT/nav.adoc
@@ -1,12 +1,11 @@
.Processing data
.Process data
* xref:cdc-for-cassandra:ROOT:cdc-concepts.adoc[Change Data Capture (CDC)]
* xref:astra-streaming:getting-started:real-time-data-pipelines-tutorial.adoc[Build real-time data pipelines with {product}]
* xref:astra-streaming:getting-started:real-time-data-pipelines-tutorial.adoc[Build real-time data pipelines with Astra Streaming]

.Migrating to {pulsar}
* xref:use-cases-architectures:starlight/index.adoc[]
* xref:use-cases-architectures:starlight/kafka/index.adoc[]
* xref:use-cases-architectures:starlight/rabbitmq/index.adoc[]
* xref:use-cases-architectures:starlight/jms/index.adoc[]
.Migrate to {pulsar}
* xref:starlight-for-kafka:ROOT:index.adoc[{starlight-kafka}]
* xref:starlight-for-rabbitmq:ROOT:index.adoc[{starlight-rabbitmq}]
* xref:starlight-for-jms:ROOT:index.adoc[Starlight for JMS]

.APIs and References
* Connectors
6 changes: 3 additions & 3 deletions modules/ROOT/pages/index.adoc
@@ -19,7 +19,7 @@
We've included best practices for Apache Pulsar, a full connector reference,
and examples for getting the most out of CDC.

xref:astra-streaming:getting-started:real-time-data-pipelines-tutorial.adoc[Build real-time data pipelines with {product}]
xref:astra-streaming:getting-started:real-time-data-pipelines-tutorial.adoc[Build real-time data pipelines with Astra Streaming]

</div>

@@ -106,8 +106,8 @@

<ul class="!m-0 [&>li]:my-2">
<li>xref:starlight-for-jms::index.adoc[Starlight for JMS]</li>
<li>xref:starlight-for-kafka::index.adoc[]</li>
<li>xref:starlight-for-rabbitmq::index.adoc[]</li>
<li>xref:starlight-for-kafka::index.adoc[Starlight for Kafka]</li>
<li>xref:starlight-for-rabbitmq::index.adoc[Starlight for RabbitMQ]</li>
</ul>

</div>
36 changes: 18 additions & 18 deletions modules/functions/pages/astream-functions.adoc
@@ -4,21 +4,21 @@
Functions are lightweight compute processes that enable you to process each message received on a topic.
You can apply custom logic to that message, transforming or enriching it, and then output it to a different topic.

Functions run inside {product} and are therefore serverless.
Functions run inside Astra Streaming and are therefore serverless.
You write the code for your function in Java, Python, or Go, then upload the code.
It is automatically run for each message published to the specified input topic.

Functions are implemented using https://pulsar.apache.org/docs/en/functions-overview/[{pulsar-reg} functions].

[IMPORTANT]
====
Custom functions require a xref:astra-streaming:operations:astream-pricing.adoc[paid {product} plan].
Custom functions require a xref:astra-streaming:operations:astream-pricing.adoc[paid Astra Streaming plan].
====

== Deploy Python functions in a zip file

{product} supports Python-based {pulsar-short} functions.
These functions can be packaged in a zip file and deployed to {product} or {pulsar-short}.
Astra Streaming supports Python-based {pulsar-short} functions.
These functions can be packaged in a zip file and deployed to Astra Streaming or {pulsar-short}.
The same zip file can be deployed to either environment.

To demonstrate this, the following steps create a function configuration YAML file, package all necessary function files as a zip archive, and then use the `pulsar-admin` CLI to deploy the zip.
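Before walking through the steps, it can help to see the overall shape of the pieces involved. The sketch below uses illustrative names (`exclamation.py`, `ExclamationFunction`, `my-tenant`, and the topic names) that you would replace with your own; the zip and deploy commands are shown as comments because they require the `zip` tool and a configured `pulsar-admin` client:

```shell
mkdir -p exclamation

# A minimal Python function: appends "!" to each message (illustrative).
cat > exclamation/exclamation.py <<'EOF'
from pulsar import Function

class ExclamationFunction(Function):
    def process(self, input, context):
        return input + '!'
EOF

# Function configuration referenced by `pulsar-admin functions create`.
cat > exclamation/func-config.yaml <<'EOF'
tenant: my-tenant
namespace: default
name: exclamation
py: exclamation.zip
className: exclamation.ExclamationFunction
inputs:
  - persistent://my-tenant/default/in-topic
output: persistent://my-tenant/default/out-topic
parallelism: 1
EOF

# Package and deploy (requires `zip` and a configured client.conf):
# (cd exclamation && zip exclamation.zip exclamation.py)
# bin/pulsar-admin functions create --function-config-file "$PWD/exclamation/func-config.yaml"
```

The exact zip layout and module path in `className` depend on how you structure your project; the key point is that the YAML ties the archive, the entry class, and the topics together.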
@@ -152,8 +152,8 @@ Replace the following:
* `**INPUT_TOPIC_NAME**`: The input topic for the function
* `**OUTPUT_TOPIC_NAME**`: The output topic for the function

. Use `pulsar-admin` to deploy the Python zip to {product} or {pulsar-short}.
The command below assumes you've properly configured the `client.conf` file for `pulsar-admin` commands against your {pulsar-short} cluster. If you are using {product}, see xref:astra-streaming:developing:configure-pulsar-env.adoc[] for more information.
. Use `pulsar-admin` to deploy the Python zip to Astra Streaming or {pulsar-short}.
The command below assumes you've properly configured the `client.conf` file for `pulsar-admin` commands against your {pulsar-short} cluster. If you are using Astra Streaming, see xref:astra-streaming:developing:configure-pulsar-env.adoc[] for more information.
+
[source,console]
----
@@ -163,7 +163,7 @@ bin/pulsar-admin functions create --function-config-file /absolute/path/to/func-
. Verify that the function was deployed:
+
* Go to the {astra-ui} to see your newly deployed function listed under the **Functions** tab for your tenant.
See <<controlling-your-function,Controlling your function>> for more information on testing and monitoring your function in {product}.
See <<controlling-your-function,Controlling your function>> for more information on testing and monitoring your function in Astra Streaming.
* Use the `pulsar-admin` CLI to list functions for a specific tenant and namespace:
+
[source,bash,subs="+quotes"]
@@ -173,8 +173,8 @@ bin/pulsar-admin functions list --tenant **TENANT_NAME** --namespace **NAMESPACE

== Deploy Java functions in a JAR file

{product} supports Java-based {pulsar-short} functions which are packaged in a JAR file.
The JAR can be deployed to {product} or {pulsar-short}.
Astra Streaming supports Java-based {pulsar-short} functions which are packaged in a JAR file.
The JAR can be deployed to Astra Streaming or {pulsar-short}.
The same JAR file can be deployed to either environment.

In this example, you'll create a function JAR file using Maven, then use the `pulsar-admin` CLI to deploy the JAR.
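The function configuration for a JAR-based deployment looks much like the Python one, pointing at a `jar` and a Java `className`. Everything below (tenant, names, topics, and the `userConfig` key) is illustrative, and the build and deploy commands are comments because they require Maven and a configured client:

```shell
# Function config for a Java function (illustrative names and topics).
cat > java-func-config.yaml <<'EOF'
tenant: my-tenant
namespace: default
name: my-java-function
jar: target/my-function-1.0.jar
className: com.example.MyFunction
inputs:
  - persistent://my-tenant/default/in-topic
output: persistent://my-tenant/default/out-topic
userConfig:
  greeting: "hello"
EOF

# Build and deploy (requires Maven and a configured client.conf):
# mvn clean package
# bin/pulsar-admin functions create --function-config-file "$PWD/java-func-config.yaml"
```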
@@ -289,15 +289,15 @@ userConfig:
+
[IMPORTANT]
====
{product} requires the `inputs` topic to have a message schema defined before deploying the function.
Astra Streaming requires the `inputs` topic to have a message schema defined before deploying the function.
Otherwise, deployment errors may occur.
Use the {astra-ui} to define the message schema for a topic.
====
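One way to satisfy this schema requirement from the command line is to upload a schema definition before deploying the function. The sketch below uses a simple STRING schema and an illustrative topic name; the upload command is shown as a comment because it needs a configured `pulsar-admin` client:

```shell
# Schema definition file: a simple STRING schema (illustrative).
cat > input-topic-schema.json <<'EOF'
{
  "type": "STRING",
  "schema": "",
  "properties": {}
}
EOF

# Upload it to the function's input topic (requires a configured client.conf):
# bin/pulsar-admin schemas upload persistent://my-tenant/default/in-topic \
#   --filename input-topic-schema.json
```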

. Use the `pulsar-admin` CLI to deploy your function JAR to {product} or {pulsar-short}.
. Use the `pulsar-admin` CLI to deploy your function JAR to Astra Streaming or {pulsar-short}.
+
The following command assumes you've properly configured the `client.conf` file for `pulsar-admin` commands against your {pulsar-short} cluster.
If you are using {product}, see xref:astra-streaming:developing:configure-pulsar-env.adoc[] for more information.
If you are using Astra Streaming, see xref:astra-streaming:developing:configure-pulsar-env.adoc[] for more information.
+
[source,bash]
----
@@ -307,17 +307,17 @@ bin/pulsar-admin functions create --function-config-file /absolute/path/to/func
. Verify that the function was deployed:
+
* Go to the {astra-ui} to see your newly deployed function listed under the **Functions** tab for your tenant.
See <<controlling-your-function,Controlling your function>> for more information on testing and monitoring your function in {product}.
See <<controlling-your-function,Controlling your function>> for more information on testing and monitoring your function in Astra Streaming.
* Use the `pulsar-admin` CLI to list functions for a specific tenant and namespace:
+
[source,bash,subs="+quotes"]
----
bin/pulsar-admin functions list --tenant **TENANT_NAME** --namespace **NAMESPACE_NAME**
----

== Add functions in {product} dashboard
== Add functions in Astra Streaming dashboard

Add functions in the **Functions** tab of the {product} dashboard.
Add functions in the **Functions** tab of the Astra Streaming dashboard.

. Select *Create Function* to get started.

@@ -326,7 +326,7 @@ Add functions in the **Functions** tab of the {product} dashboard.
image::astream-name-function.png[Function and Namespace]

. Select the file you want to pull the function from and which function you want to use within that file.
{product} generates a list of acceptable classes.
Astra Streaming generates a list of acceptable classes.
+
image::astream-exclamation-function.png[Exclamation Function]
+
@@ -416,7 +416,7 @@ If you want to use different topics, change `in`, `out`, and `log` accordingly.
. Verify that the response is `Created Successfully!`.
This indicates that the function was deployed and is ready to run when triggered by incoming messages.
+
If the response is `402 Payment Required` with `Reason: only qualified organizations can create functions`, then you must upgrade to a xref:astra-streaming:operations:astream-pricing.adoc[paid {product} plan].
If the response is `402 Payment Required` with `Reason: only qualified organizations can create functions`, then you must upgrade to a xref:astra-streaming:operations:astream-pricing.adoc[paid Astra Streaming plan].
+
You can also verify that a function was created by checking the **Functions** tab or by running `./pulsar-admin functions list --tenant **TENANT_NAME**`.
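The common verification commands can be sketched together as follows. The tenant, namespace, and function name are placeholders, and each `pulsar-admin` invocation is shown as a comment because it requires a configured `client.conf`:

```shell
# Illustrative coordinates; replace with your own tenant/namespace/function.
TENANT=my-tenant
NAMESPACE=default
FUNCTION=exclamation

# Record the fully qualified function name used by the commands below.
echo "$TENANT/$NAMESPACE/$FUNCTION" > function-fqfn.txt

# List deployed functions in the namespace:
# bin/pulsar-admin functions list --tenant "$TENANT" --namespace "$NAMESPACE"

# Check instance health and processed-message counts:
# bin/pulsar-admin functions status --tenant "$TENANT" --namespace "$NAMESPACE" --name "$FUNCTION"

# Push a one-off test value through the function and print the result:
# bin/pulsar-admin functions trigger --tenant "$TENANT" --namespace "$NAMESPACE" --name "$FUNCTION" --trigger-value "hello"
```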

@@ -512,4 +512,4 @@ A *Function-name Deleted Successfully!* message confirms the function was perman

== Next steps

Learn more about developing functions for {product} and {pulsar-short} https://pulsar.apache.org/docs/en/functions-develop/[here].
Learn more about developing functions for Astra Streaming and {pulsar-short} https://pulsar.apache.org/docs/en/functions-develop/[here].
4 changes: 2 additions & 2 deletions modules/functions/pages/deploy-in-sink.adoc
Original file line number Diff line number Diff line change
@@ -5,11 +5,11 @@ Before this update, functions transformed data either after it was written to a
This required either an intermediate topic, with additional storage, IO, and latency, or a custom connector. +
Now, functions can be deployed at sink creation and apply preprocessing to sink topic writes. +

== Create sink function in {product}
== Create sink function in Astra Streaming

Creating a sink function is similar to creating a sink in the {astra-ui}, but with a few additional steps.

. xref:pulsar-io:connectors/index.adoc[Create a sink] as described in the {product} documentation.
. xref:pulsar-io:connectors/index.adoc[Create a sink] as described in the Astra Streaming documentation.

. During sink creation, select the transform function you want to run inside the sink.
+
2 changes: 1 addition & 1 deletion modules/functions/pages/index.adoc
@@ -158,7 +158,7 @@ transform-function-2
======

[#deploy-as]
== Deploy with {product}
== Deploy with Astra Streaming

Deploy transform functions in the *Functions* tab of the {astra-ui}.

12 changes: 6 additions & 6 deletions modules/pulsar-io/pages/connectors/index.adoc
@@ -1,16 +1,16 @@
= Connectors
:navtitle: Connector Overview

{product} offers fully-managed {pulsar-reg} connectors.
Astra Streaming offers fully-managed {pulsar-reg} connectors.

Create, monitor, and manage both source and sink connectors through our simple UI, the `pulsar-admin` CLI, or RESTful API.
Connect popular data sources to {pulsar} topics or sink data from {pulsar-short} topics to popular systems.
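As a concrete illustration of the CLI path, creating a built-in sink generally combines a connector type, one or more input topics, and a connector-specific config file. The names below are placeholders rather than any one connector's reference, and the `pulsar-admin` call is a comment because it needs a configured client:

```shell
# Connector settings go in a YAML file passed at creation time.
# The keys are specific to each connector; exampleKey is a placeholder.
cat > sink-config.yaml <<'EOF'
exampleKey: exampleValue
EOF

# Create a sink from a built-in connector (requires a configured client.conf):
# bin/pulsar-admin sinks create \
#   --sink-type elastic_search \
#   --name my-sink \
#   --inputs persistent://my-tenant/default/in-topic \
#   --sink-config-file sink-config.yaml
```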

Below is a list of {pulsar} source and sink connectors supported by {product}.
Below is a list of {pulsar} source and sink connectors supported by Astra Streaming.

[IMPORTANT]
====
{product} doesn't support custom sink or source connectors.
Astra Streaming doesn't support custom sink or source connectors.
====

[#sink-connectors]
@@ -154,7 +154,7 @@ xref:connectors/sources/kinesis.adoc[Kinesis source connector documentation]

== Experimental Connectors

{company} is always experimenting with connectors. Below are the connectors currently in development that have not yet been promoted to official support in *{product}*.
{company} is always experimenting with connectors. Below are the connectors currently in development that have not yet been promoted to official support in *Astra Streaming*.

To get access to these connectors, contact {support-url}[{company} Support].

@@ -238,7 +238,7 @@ Zeebe +

== Listing Sink Connectors

To list available sink connectors in your {product} tenant, use any of the following.
To list available sink connectors in your Astra Streaming tenant, use any of the following.

[tabs]
====
@@ -288,7 +288,7 @@ curl "$WEB_SERVICE_URL/admin/v3/sinks/builtinsinks" -H "Authorization: $ASTRA_ST

== Listing Source Connectors

To list available source connectors in your {product} tenant, use any of the following.
To list available source connectors in your Astra Streaming tenant, use any of the following.

[tabs]
====
2 changes: 1 addition & 1 deletion modules/pulsar-io/pages/connectors/sinks/astra-db.adoc
@@ -4,7 +4,7 @@

{company} {astra-db} Sink Connector is based on the open-source xref:pulsar-connector:ROOT:index.adoc[{cass-reg} sink connector for {pulsar-reg}]. Depending on how you deploy the connector, it can be used to sink topic messages to a table in {astra-db} or to a table in a {cass-short} cluster outside of {astra-db}.

The {product} portal provides a simple way to connect this sink to a table in {astra-db} with just a token. Using `pulsar-admin` or the REST API, you can manually configure the sink with a {cass-short} connection.
The Astra Streaming portal provides a simple way to connect this sink to a table in {astra-db} with just a token. Using `pulsar-admin` or the REST API, you can manually configure the sink with a {cass-short} connection.

This reference assumes you are manually connecting to a {cass-short} table.

4 changes: 2 additions & 2 deletions modules/pulsar-io/pages/connectors/sinks/cloud-storage.adoc
@@ -141,9 +141,9 @@ include::partial$connectors/sinks/monitoring.adoc[]

== Connector Reference

With the Cloud Storage Sink there are two sets of parameters: {product} parameters and cloud storage provider parameters.
With the Cloud Storage Sink there are two sets of parameters: Astra Streaming parameters and cloud storage provider parameters.

=== {product} parameters for Cloud Storage Sink
=== Astra Streaming parameters for Cloud Storage Sink

[%header,format=csv,cols="2,1,1,3"]
|===
6 changes: 3 additions & 3 deletions modules/pulsar-io/pages/connectors/sinks/elastic-search.adoc
@@ -8,7 +8,7 @@ Use Elasticsearch to store, search, and manage data for logs, metrics, search ba

[NOTE]
====
{product} currently supports {pulsar-reg} {pulsar-version}, which uses the https://opensearch.org/docs/1.2/clients/java-rest-high-level/[OpenSearch 1.2.4 library] to interact with
Astra Streaming currently supports {pulsar-reg} {pulsar-version}, which uses the https://opensearch.org/docs/1.2/clients/java-rest-high-level/[OpenSearch 1.2.4 library] to interact with
Elasticsearch.
====

@@ -33,7 +33,7 @@ include::partial$connectors/sinks/monitoring.adoc[]

There are two sets of parameters that support sink connectors.

=== {product}
=== Astra Streaming

[%header,format=csv,cols="2,1,1,3"]
|===
@@ -44,7 +44,7 @@ include::example$connectors/sinks/astra.csv[]

These values are provided in the "configs" area.

The {product} Elasticsearch sink connector supports all configuration properties provided by {pulsar}. Please refer to the https://pulsar.apache.org/docs/io-elasticsearch-sink/#property[connector properties] for a complete list.
The Astra Streaming Elasticsearch sink connector supports all configuration properties provided by {pulsar}. Please refer to the https://pulsar.apache.org/docs/io-elasticsearch-sink/#property[connector properties] for a complete list.
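A typical Elasticsearch sink config might look like the sketch below. The URL, index name, and credentials are placeholders, and only a few of the available properties are shown; the creation command is a comment because it needs a configured client:

```shell
# Illustrative Elasticsearch sink settings; see the Pulsar connector
# properties page for the full list.
cat > elasticsearch-sink.yaml <<'EOF'
elasticSearchUrl: "https://my-cluster.example.com:9200"
indexName: "my-index"
username: "elastic"
password: "changeme"
EOF

# bin/pulsar-admin sinks create --sink-type elastic_search \
#   --name es-sink --inputs persistent://my-tenant/default/in-topic \
#   --sink-config-file elasticsearch-sink.yaml
```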

== What's next?

4 changes: 2 additions & 2 deletions modules/pulsar-io/pages/connectors/sinks/google-bigquery.adoc
@@ -22,9 +22,9 @@ include::partial$connectors/sinks/monitoring.adoc[]

== Connector Reference

The BigQuery sink has multiple sets of parameters: the {product} parameters, the Kafka Connect Adapter parameters, and the Google BigQuery parameters. Each set of parameters provides a way to coordinate how data will be streamed from {pulsar-short} to BigQuery.
The BigQuery sink has multiple sets of parameters: the Astra Streaming parameters, the Kafka Connect Adapter parameters, and the Google BigQuery parameters. Each set of parameters provides a way to coordinate how data will be streamed from {pulsar-short} to BigQuery.

=== {product}
=== Astra Streaming

[%header,format=csv,cols="2,1,1,3"]
|===
6 changes: 3 additions & 3 deletions modules/pulsar-io/pages/connectors/sinks/jdbc-clickhouse.adoc
@@ -7,7 +7,7 @@ real-time.

[NOTE]
====
{product} currently supports {pulsar-reg} {pulsar-version}, which uses the https://github.com/ClickHouse/clickhouse-jdbc[Clickhouse 0.3.2 library] to interact with Clickhouse.
Astra Streaming currently supports {pulsar-reg} {pulsar-version}, which uses the https://github.com/ClickHouse/clickhouse-jdbc[Clickhouse 0.3.2 library] to interact with Clickhouse.
====

== Get Started
@@ -26,7 +26,7 @@ include::partial$connectors/sinks/monitoring.adoc[]

There are two sets of parameters that support sink connectors.

=== {product}
=== Astra Streaming

[%header,format=csv,cols="2,1,1,3"]
|===
@@ -37,7 +37,7 @@ include::example$connectors/sinks/astra.csv[]

These values are provided in the "configs" area.

The {product} JDBC Clickhouse sink connector supports all configuration properties provided by {pulsar}. Please refer to the https://pulsar.apache.org/docs/io-jdbc-sink#property[connector
The Astra Streaming JDBC Clickhouse sink connector supports all configuration properties provided by {pulsar}. Please refer to the https://pulsar.apache.org/docs/io-jdbc-sink#property[connector
properties]
for a complete list.
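A JDBC ClickHouse sink config might be sketched as follows. The connection URL, table, and credentials are placeholders, and the creation command is a comment because it needs a configured client:

```shell
# Illustrative JDBC sink settings for ClickHouse.
cat > clickhouse-sink.yaml <<'EOF'
jdbcUrl: "jdbc:clickhouse://localhost:8123/default"
tableName: "pulsar_clickhouse_jdbc_sink"
userName: "clickhouse"
password: "password"
EOF

# bin/pulsar-admin sinks create --sink-type jdbc-clickhouse \
#   --name clickhouse-sink --inputs persistent://my-tenant/default/in-topic \
#   --sink-config-file clickhouse-sink.yaml
```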
