
Commit c6e8c63

docs: Fixed formatting issues

1 parent 8d263dc
7 files changed (+165 -165 lines)

docs/guide/src/docs/asciidoc/index.adoc (+2 -1)

@@ -11,5 +11,6 @@ include::{includedir}/overview.adoc[]
 include::{includedir}/quickstart.adoc[]
 include::{includedir}/install.adoc[]
 include::{includedir}/sink.adoc[]
-include::{includedir}/source.adoc[]
+include::{includedir}/source-stream.adoc[]
+include::{includedir}/source-keys.adoc[]
 include::{includedir}/resources.adoc[]

docs/guide/src/docs/asciidoc/install.adoc (+2 -1)

@@ -14,4 +14,5 @@ Download the latest release archive: {link_releases}.
 
 == Manually
 
-Follow the instructions in {link_manual_install}
+Follow the instructions in {link_manual_install}.
+

docs/guide/src/docs/asciidoc/overview.adoc (+2 -1)

@@ -11,6 +11,7 @@ This guide provides documentation and usage information across the following topics:
 * <<_docker,Docker Example>>
 * <<_install,Install>>
 * <<_sink,Sink Connector>>
-* <<_source,Source Connector>>
+* <<_source_stream,Stream Source Connector>>
+* <<_source_keys,Keys Source Connector>>
 * <<_resources,Resources>>
 

docs/guide/src/docs/asciidoc/sink.adoc (+3 -3)

@@ -1,6 +1,6 @@
+:name: Sink Connector
 [[_sink]]
-= Sink Connector Guide
-:name: Redis Kafka Sink Connector
+= {name}
 
 The {name} consumes records from a Kafka topic and writes the data to Redis.
 
@@ -20,7 +20,7 @@ connector.class = com.redis.kafka.connect.RedisSinkConnector
 The {name} guarantees that records from the Kafka topic are delivered at least once.
 
 [[_sink_tasks]]
-== Multiple tasks
+== Tasks
 
 The {name} supports running one or more tasks.
 You can specify the number of tasks with the `tasks.max` configuration property.
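Editor's note: the sink hunks above show the connector class and the `tasks.max` property but no complete configuration. The following is an illustrative sketch, not part of this commit: the connector name, task count, and topic are assumed values, and `topics` is the standard Kafka Connect sink property for subscribing to topics.

[source,properties]
----
# Illustrative sink worker config; name, tasks.max value, and topic are assumptions.
name = redis-sink
connector.class = com.redis.kafka.connect.RedisSinkConnector
# The sink supports one or more tasks; three is an arbitrary example value.
tasks.max = 3
# Standard Kafka Connect sink property listing the topic(s) to consume from.
topics = orders
----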
docs/guide/src/docs/asciidoc/source-keys.adoc (new file, +60)

@@ -0,0 +1,60 @@
+:name: Keys Source Connector
+[[_source_keys]]
+= {name}
+
+The {name} captures changes happening to keys in a Redis database and publishes keys and values to a Kafka topic.
+The data structure key will be mapped to the record key, and the value will be mapped to the record value.
+
+**Make sure the Redis database has keyspace notifications enabled** using `notify-keyspace-events = KEA` in `redis.conf` or via `CONFIG SET`.
+For more details see {link_redis_notif}.
+
+[[_source_keys_class]]
+== Class Name
+
+The {name} class name is `com.redis.kafka.connect.RedisKeysSourceConnector`.
+
+The corresponding configuration property would be:
+
+[source,properties]
+----
+connector.class = com.redis.kafka.connect.RedisKeysSourceConnector
+----
+
+[[_source_keys_delivery]]
+== Delivery Guarantees
+
+The {name} does not guarantee data consistency because it relies on Redis keyspace notifications, which have no delivery guarantees.
+It is possible for some notifications to be missed, for example in case of network failures.
+
+Also, depending on the type, size, and rate of change of data structures on the source, it is possible the connector cannot keep up with the change stream.
+For example, if a big set is repeatedly updated, the connector will need to read the whole set on each update and transfer it to the destination topic.
+With a big-enough set the connector could fall behind and the internal queue could fill up, leading to updates being dropped.
+Some preliminary sizing using Redis statistics and `bigkeys`/`memkeys` is recommended.
+If you need assistance please contact your Redis account team.
+
+[[_source_keys_tasks]]
+== Tasks
+
+The {name} should only be configured with one task, as keyspace notifications are broadcast to all listeners and cannot be consumed in a round-robin fashion.
+
+[[_source_keys_redis_client]]
+include::{includedir}/_redis_client.adoc[leveloffset=+1]
+
+
+[[_source_keys_config]]
+== Configuration
+[source,properties]
+----
+connector.class = com.redis.kafka.connect.RedisKeysSourceConnector
+redis.keys.pattern = <glob> <1>
+redis.keys.timeout = <millis> <2>
+topic = <name> <3>
----
+<1> Key pattern to subscribe to.
+This is the key portion of the pattern that will be used to listen to keyspace events.
+For example `foo:*` translates to pubsub channel `$$__$$keyspace@0$$__$$:foo:*` and will capture changes to keys `foo:1`, `foo:2`, etc.
+See {link_redis_keys} for pattern details.
+<2> Idle timeout in milliseconds.
+Duration after which the connector will stop if no activity is encountered.
+<3> Name of the destination topic.
+
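Editor's note: for orientation, here is a worked version of the keys source configuration documented in the new file above. It is a sketch under assumptions: only the property names come from the commit, while the pattern, timeout, and topic values are illustrative.

[source,properties]
----
# Illustrative keys source config; pattern, timeout, and topic are assumed values.
name = redis-keys-source
connector.class = com.redis.kafka.connect.RedisKeysSourceConnector
# Keyspace notifications are broadcast to every listener, so run a single task.
tasks.max = 1
# Capture changes to keys matching person:* (channel __keyspace@0__:person:*).
redis.keys.pattern = person:*
# Stop after five minutes (300000 ms) without keyspace activity.
redis.keys.timeout = 300000
# Destination Kafka topic.
topic = redis-keys
----

As the file stresses, notifications must be enabled on the Redis side first, e.g. `CONFIG SET notify-keyspace-events KEA` from `redis-cli`.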
docs/guide/src/docs/asciidoc/source-stream.adoc (new file, +96)

@@ -0,0 +1,96 @@
+:name: Stream Source Connector
+[[_source_stream]]
+= {name}
+
+The {name} reads from a Redis stream and publishes messages to a Kafka topic.
+
+[[_source_stream_class]]
+== Class Name
+
+The {name} class name is `com.redis.kafka.connect.RedisStreamSourceConnector`.
+
+The corresponding configuration property would be:
+
+[source,properties]
+----
+connector.class = com.redis.kafka.connect.RedisStreamSourceConnector
+----
+
+[[_source_stream_delivery]]
+== Delivery Guarantees
+
+The {name} can be configured to ack stream messages either automatically (at-most-once delivery) or explicitly (at-least-once delivery).
+The default is at-least-once delivery.
+
+=== At-Least-Once
+
+In this mode, each stream message is acknowledged after it has been written to the corresponding topic.
+
+[source,properties]
+----
+redis.stream.delivery = at-least-once
+----
+
+=== At-Most-Once
+
+In this mode, stream messages are acknowledged as soon as they are read.
+
+[source,properties]
+----
+redis.stream.delivery = at-most-once
+----
+
+[[_source_stream_tasks]]
+== Tasks
+
+Reading from the stream is done through a consumer group, so that multiple instances of the connector configured via the `tasks.max` property can consume messages in a round-robin fashion.
+
+[[_source_stream_redis_client]]
+include::{includedir}/_redis_client.adoc[leveloffset=+1]
+
+[[_source_stream_schema]]
+== Message Schema
+
+=== Key Schema
+
+Keys are of type String and contain the stream message id.
+
+=== Value Schema
+
+The value schema defines the following fields:
+
+[options="header"]
+|====
+|Name|Schema|Description
+|id |STRING |Stream message ID
+|stream|STRING |Stream key
+|body |Map of STRING|Stream message body
+|====
+
+[[_source_stream_config]]
+=== Configuration
+
+[source,properties]
+----
+connector.class = com.redis.kafka.connect.RedisStreamSourceConnector
+redis.stream.name = <name> <1>
+redis.stream.offset = <offset> <2>
+redis.stream.block = <millis> <3>
+redis.stream.consumer.group = <group> <4>
+redis.stream.consumer.name = <name> <5>
+redis.stream.delivery = <mode> <6>
+topic = <name> <7>
+----
+
+<1> Name of the stream to read from.
+<2> {link_stream_msg_id} to start reading from (default: `0-0`).
+<3> Maximum {link_xread} wait duration in milliseconds (default: `100`).
+<4> Name of the stream consumer group (default: `kafka-consumer-group`).
+<5> Name of the stream consumer (default: `consumer-${task}`).
+May contain `${task}` as a placeholder for the task id.
+For example, `foo${task}` and task `123` => consumer `foo123`.
+<6> Delivery mode: `at-least-once`, `at-most-once` (default: `at-least-once`).
+<7> Destination topic (default: `${stream}`).
+May contain `${stream}` as a placeholder for the originating stream name.
+For example, `redis_${stream}` and stream `orders` => topic `redis_orders`.
+
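Editor's note: likewise, a worked version of the stream source configuration above. The stream name and task count are illustrative assumptions; the offset, consumer group, delivery mode, and topic pattern spell out the defaults documented in the file.

[source,properties]
----
# Illustrative stream source config; stream name and tasks.max are assumed values.
name = redis-stream-source
connector.class = com.redis.kafka.connect.RedisStreamSourceConnector
# Tasks join the same consumer group and read in round-robin fashion.
tasks.max = 2
# Read the "orders" stream from the beginning (0-0 is the documented default offset).
redis.stream.name = orders
redis.stream.offset = 0-0
# Documented defaults, spelled out for clarity.
redis.stream.consumer.group = kafka-consumer-group
redis.stream.consumer.name = consumer-${task}
redis.stream.delivery = at-least-once
# With the documented ${stream} placeholder, "orders" lands in topic "redis_orders".
topic = redis_${stream}
----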

docs/guide/src/docs/asciidoc/source.adoc (-159)

This file was deleted.
