Release/v0.4.0 (#44)
* Update Cargo, Readme, Changelog

* Update API Documentation

* Fix readme

* Update readme

* cargo fmt

* add github action check to main
toelo3 authored Apr 25, 2024
1 parent 3c5bef3 commit 76928f6
Showing 16 changed files with 68 additions and 100 deletions.
4 changes: 4 additions & 0 deletions .github/workflows/main.yaml
@@ -50,6 +50,10 @@ jobs:
run: cargo install cargo-audit
- name: audit check
run: cargo audit
- name: install cargo-hack
run: cargo install cargo-hack --locked
- name: cargo check all features
run: cargo hack check --feature-powerset --no-dev-deps
coverage:
runs-on: ubuntu-latest
name: ubuntu / stable / coverage
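The added `cargo hack check --feature-powerset` step compiles the crate once per combination of feature flags, which catches feature-gated items that only build when a particular gate is enabled (for example the `Properties` re-export added to `dsh_sdk/src/lib.rs` later in this commit). A minimal sketch of the kind of gating the check exercises, with hypothetical module and function names:

```rust
// Hypothetical lib.rs-style sketch: every public item that depends on an
// optional feature carries the same gate. `cargo hack check --feature-powerset`
// builds each combination of features, so a missing gate is caught even when
// the default feature set happens to compile.

#[cfg(feature = "bootstrap")]
pub mod bootstrap {
    /// Only compiled when the `bootstrap` feature is enabled.
    pub fn platform_properties() -> &'static str {
        "properties loaded"
    }
}

// The re-export must repeat the gate; without it, a `--no-default-features`
// build fails, which the powerset check surfaces in CI.
#[cfg(feature = "bootstrap")]
pub use bootstrap::platform_properties;
```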
2 changes: 1 addition & 1 deletion CHANGELOG.md
@@ -5,7 +5,7 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [Unreleased]
## [0.4.0] - 2024-04-25

### Fixed
- Fixed vulnerability RUSTSEC-2023-0071 by replacing Picky with RCGen
1 change: 0 additions & 1 deletion Cargo.toml
@@ -3,7 +3,6 @@ members = [
"dsh_sdk",
"example_dsh_service",
]
readme = "README.md"
resolver = "2"

[workspace.package]
20 changes: 13 additions & 7 deletions README.md
@@ -1,22 +1,28 @@
# Dsh-sdk-platform-rs
This repository contains the Rust SDK for the Data Sharing Hub (DSH) platform.


# DSH_SDK
## Dsh_sdk
The [dsh_sdk](dsh_sdk) is a Rust library that provides a simple interface to interact with the DSH platform. The SDK is used to create and manage data streams, and to send data to the DSH platform.
See [dsh_sdk/README.md](dsh_sdk/README.md) for more information.

# Example DSH Service
## Example DSH Service
The [example_dsh_service](example_dsh_service) is a simple example of a service that uses the DSH SDK. It demonstrates how to create an app, consume data from Kafka, and how to build and deploy the service to DSH.

## Changelog
## Docker
The [docker](docker) directory contains a docker-compose file that can be used for local development. The docker-compose file starts a Kafka cluster, a Zookeeper instance and a schema registry.

---

### Changelog
See [CHANGELOG.md](CHANGELOG.md) for all changes per version.

## Contributing
### Contributing
See [CONTRIBUTING.md](CONTRIBUTING.md) for more information on how to contribute to this project.

## License
### License
See [LICENSE](LICENSE) for more information on the license for this project.

## Security
### Security
See [SECURITY.md](SECURITY.md) for more information on the security policy for this project.

---
3 changes: 2 additions & 1 deletion SECURITY.md
@@ -15,7 +15,8 @@ The following versions of this project are currently being supported with securi

| Version | Supported |
| ------- | ------------------ |
| 0.3.x | :white_check_mark: |
| 0.4.x | :white_check_mark: |
| 0.3.x | :x: |
| 0.2.x | :x: |
| 0.1.x | :x: |

2 changes: 1 addition & 1 deletion dsh_sdk/Cargo.toml
@@ -9,7 +9,7 @@ license.workspace = true
name = "dsh_sdk"
readme = 'README.md'
repository.workspace = true
version = "0.3.1"
version = "0.4.0"

[package.metadata.docs.rs]
features = ["full"]
6 changes: 3 additions & 3 deletions dsh_sdk/README.md
@@ -22,21 +22,21 @@ To use this SDK with the default features in your project, add the following to

```toml
[dependencies]
dsh_sdk = "0.3"
dsh_sdk = "0.4"
```

However, if you would like to use only specific features, you can specify them in your Cargo.toml file. For example, if you would like to use only the bootstrap feature, add the following to your Cargo.toml file:

```toml
[dependencies]
dsh_sdk = { version = "0.3", default-features = false, features = ["bootstrap"] }
dsh_sdk = { version = "0.4", default-features = false, features = ["bootstrap"] }
```

See [feature flags](#feature-flags) for more information on the available features.

To use this SDK in your project
```rust
use dsh_sdk::dsh::Properties;
use dsh_sdk::Properties;
use dsh_sdk::rdkafka::consumer::{Consumer, StreamConsumer};

fn main() -> Result<(), Box<dyn std::error::Error>>{
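The usage example above is cut off by the diff view. For orientation, a slightly fuller sketch of the same pattern with the new `dsh_sdk::Properties` import path; the `consumer_rdkafka_config()` helper and the topic name are assumptions for illustration, not taken from this diff:

```rust
use dsh_sdk::Properties;
use dsh_sdk::rdkafka::consumer::{Consumer, StreamConsumer};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Bootstraps certificates, datastream metadata and Kafka configuration.
    let dsh_properties = Properties::get();

    // Assumed helper: returns a pre-filled rdkafka ClientConfig for consuming.
    let consumer: StreamConsumer = dsh_properties.consumer_rdkafka_config().create()?;

    // Hypothetical topic name; use one of your tenant's streams instead.
    consumer.subscribe(&["scratch.local.example-tenant"])?;

    Ok(())
}
```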
2 changes: 1 addition & 1 deletion dsh_sdk/examples/produce_consume.rs
@@ -1,8 +1,8 @@
use dsh_sdk::dsh::Properties;
use dsh_sdk::rdkafka::consumer::CommitMode;
use dsh_sdk::rdkafka::consumer::{Consumer, StreamConsumer};
use dsh_sdk::rdkafka::producer::{FutureProducer, FutureRecord};
use dsh_sdk::rdkafka::Message;
use dsh_sdk::Properties;

const TOTAL_MESSAGES: usize = 10;

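The produce/consume example now imports `Properties` from the crate root. As a companion to the consumer sketch above, a small producer sketch; `producer_rdkafka_config()` and the topic name are likewise assumptions for illustration:

```rust
use std::time::Duration;

use dsh_sdk::rdkafka::producer::{FutureProducer, FutureRecord};
use dsh_sdk::Properties;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let dsh_properties = Properties::get();

    // Assumed helper: a pre-filled rdkafka ClientConfig for producing.
    let producer: FutureProducer = dsh_properties.producer_rdkafka_config().create()?;

    // Hypothetical topic; key and payload are plain strings here.
    let record = FutureRecord::to("scratch.local.example-tenant")
        .key("key-0")
        .payload("hello DSH");

    // send() resolves once the broker acknowledges or the timeout expires.
    producer
        .send(record, Duration::from_secs(5))
        .await
        .map_err(|(err, _msg)| err)?;

    Ok(())
}
```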
2 changes: 1 addition & 1 deletion dsh_sdk/src/dlq.rs
@@ -34,8 +34,8 @@ use rdkafka::producer::{FutureProducer, FutureRecord};

use tokio::sync::mpsc;

use crate::dsh::Properties;
use crate::graceful_shutdown::Shutdown;
use crate::Properties;

/// Trait to convert an error to a dlq message
/// This trait is implemented for all errors that can and should be converted to a dlq message
2 changes: 1 addition & 1 deletion dsh_sdk/src/dsh/bootstrap.rs
@@ -14,7 +14,7 @@
//!
//! ## Example
//! ```
//! use dsh_sdk::dsh::Properties;
//! use dsh_sdk::Properties;
//!
//! let dsh_properties = Properties::get();
//! ```
4 changes: 2 additions & 2 deletions dsh_sdk/src/dsh/certificates.rs
@@ -11,7 +11,7 @@
//! To create the ca.crt, client.pem, and client.key files in a desired directory, use the
//! `to_files` method.
//! ```no_run
//! use dsh_sdk::dsh::Properties;
//! use dsh_sdk::Properties;
//! use std::path::PathBuf;
//!
//! # fn main() -> Result<(), Box<dyn std::error::Error>> {
@@ -138,7 +138,7 @@ impl Cert {
/// # Example
///
/// ```no_run
/// use dsh_sdk::dsh::Properties;
/// use dsh_sdk::Properties;
/// use std::path::PathBuf;
///
/// # fn main() -> Result<(), Box<dyn std::error::Error>> {
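The certificates module documentation references a `to_files` method that writes ca.crt, client.pem and client.key to a chosen directory. A brief sketch of that flow with the new import path; the exact `to_files` signature and the target directory are assumptions:

```rust
use std::path::PathBuf;

use dsh_sdk::Properties;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let dsh_properties = Properties::get();

    // Hypothetical output directory; the written files can then be used by
    // tools outside the SDK, for example a stand-alone Kafka client.
    let dir = PathBuf::from("/tmp/dsh-certificates");

    // certificates() returns an error when no certificates are available
    // (for example when running locally); to_files is assumed to take the
    // target directory.
    dsh_properties.certificates()?.to_files(&dir)?;

    Ok(())
}
```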
38 changes: 23 additions & 15 deletions dsh_sdk/src/dsh/mod.rs
@@ -1,16 +1,15 @@
//! # Kafka Properties
//! # DSH Properties
//!
//! This module contains logic to connect to Kafka on DSH and get properties of your tenant.
//! For example all available streams and topics.
//! This module contains logic to connect to Kafka on DSH and retrieve all properties of your tenant.
//!
//! The implementation contains some high level functions to get the correct config to connect to Kafka and schema store.
//! From `Properties` there are high-level functions to get the correct config to connect to Kafka and schema store.
//! For more low level functions, see
//! - [datastream](datastream/index.html) module.
//! - [certificates](certificates/index.html) module.
//!
//! # Example
//! ```
//! use dsh_sdk::dsh::Properties;
//! use dsh_sdk::Properties;
//! use dsh_sdk::rdkafka::consumer::{Consumer, StreamConsumer};
//!
//! # #[tokio::main]
@@ -34,14 +34,14 @@ pub mod datastream;

static PROPERTIES: OnceLock<Properties> = OnceLock::new();

/// Kafka properties struct. Create new to initialize all related components to connect to the DSH kafka clusters
/// DSH properties struct. Create new to initialize all related components to connect to the DSH kafka clusters
/// - Contains a struct similar to datastreams.json
/// - Metadata of running container/task
/// - Certificates for Kafka and DSH Schema Registry
///
/// # Example
/// ```
/// use dsh_sdk::dsh::Properties;
/// use dsh_sdk::Properties;
/// use dsh_sdk::rdkafka::consumer::{Consumer, StreamConsumer};
///
/// #[tokio::main]
@@ -76,7 +75,7 @@ impl Properties {
///
/// # Example
/// ```
/// use dsh_sdk::dsh::Properties;
/// use dsh_sdk::Properties;
/// use dsh_sdk::rdkafka::consumer::{Consumer, StreamConsumer};
///
/// # #[tokio::main]
@@ -138,9 +137,9 @@ impl Properties {
///
/// # Example
/// ```
/// use dsh_sdk::Properties;
/// use dsh_sdk::rdkafka::config::RDKafkaLogLevel;
/// use dsh_sdk::rdkafka::consumer::stream_consumer::StreamConsumer;
/// use dsh_sdk::dsh::Properties;
///
/// #[tokio::main]
/// async fn main() -> Result<(), Box<dyn std::error::Error>> {
@@ -235,7 +234,7 @@ impl Properties {
/// ```
/// use dsh_sdk::rdkafka::config::RDKafkaLogLevel;
/// use dsh_sdk::rdkafka::producer::FutureProducer;
/// use dsh_sdk::dsh::Properties;
/// use dsh_sdk::Properties;
///
/// #[tokio::main]
/// async fn main() -> Result<(), Box<dyn std::error::Error>>{
@@ -296,13 +295,12 @@ impl Properties {
///
/// # Example
/// ```
/// # use dsh_sdk::dsh::Properties;
/// # use dsh_sdk::Properties;
/// # use reqwest::Client;
/// # #[tokio::main]
/// # async fn main() -> Result<(), Box<dyn std::error::Error>> {
/// let dsh_properties = Properties::get();
/// let client = dsh_properties.reqwest_client_config()?.build()?;
///
/// let dsh_properties = Properties::get();
/// let client = dsh_properties.reqwest_client_config()?.build()?;
/// # Ok(())
/// # }
/// ```
@@ -314,7 +312,17 @@ impl Properties {
Ok(client_builder)
}

/// Get the certificates and private key. If running local it returns None
/// Get the certificates and private key. Returns an error when running on local machine.
///
/// # Example
/// ```no_run
/// # use dsh_sdk::Properties;
/// # use dsh_sdk::error::DshError;
/// # fn main() -> Result<(), DshError> {
/// let dsh_properties = Properties::get();
/// let dsh_kafka_certificate = dsh_properties.certificates()?.dsh_kafka_certificate_pem();
/// # Ok(())
/// # }
/// ```
pub fn certificates(&self) -> Result<&certificates::Cert, DshError> {
if let Some(cert) = &self.certificates {
Ok(cert)
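Taken together, the two doc examples updated in this hunk suggest the following flow: read the Kafka certificate in PEM form and build an HTTPS client from the tenant configuration. A sketch combining them; the request URL is a placeholder:

```rust
use dsh_sdk::Properties;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let dsh_properties = Properties::get();

    // From the certificates() doc example: the Kafka certificate in PEM form.
    // This errors when running locally without certificates.
    let _kafka_cert_pem = dsh_properties.certificates()?.dsh_kafka_certificate_pem();

    // From the reqwest_client_config() doc example: a client builder that is
    // pre-configured with the tenant's certificates.
    let client = dsh_properties.reqwest_client_config()?.build()?;

    // Placeholder URL; in practice this client would talk to DSH endpoints
    // such as the Schema Registry.
    let response = client.get("https://example.invalid/subjects").send().await?;
    println!("status: {}", response.status());

    Ok(())
}
```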
26 changes: 14 additions & 12 deletions dsh_sdk/src/lib.rs
@@ -12,7 +12,7 @@
//!
//! ### Example:
//! ```
//! use dsh_sdk::dsh::Properties;
//! use dsh_sdk::Properties;
//! use dsh_sdk::rdkafka::consumer::stream_consumer::StreamConsumer;
//!
//! # #[tokio::main]
@@ -23,14 +23,13 @@
//! # }
//! ```
//!
//! ### Low level API
//! ## Low level API
//! It is also possible to get available metadata or the certificates from the properties struct.
//!
//! ### Example:
//! ```no_run
//! # use dsh_sdk::dsh::Properties;
//! # use dsh_sdk::Properties;
//! # use dsh_sdk::rdkafka::consumer::stream_consumer::StreamConsumer;
//! # fn main() -> Result<(), Box<dyn std::error::Error>>{
//! # let dsh_properties = Properties::get();
//! // check for write access to topic
@@ -40,37 +39,37 @@
//! # Ok(())
//! # }
//! ```
//! //! ## Local
//! ## Local
//! It is possible to connect to local kafka cluster. By default it will connect to localhost:9092 when running on your local machine.
//! This can be changed by setting the environment variable `KAFKA_BOOTSTRAP_SERVERS` to the desired kafka brokers or by providing a [local_datastreams.json](https://github.com/kpn-dsh/dsh-sdk-platform-rs/blob/main/dsh_sdk/local_datastreams.json) in your root folder.
//!
//! It is possible to connect to local kafka cluster by enabling the `local` feature.
//! This enables to read in the local_datastreams.json file from root folder and parses it into the datastream struct inside the properties struct.
//! # Metrics
//! The metrics module provides a way to expose prometheus metrics. This module is a re-export of the `prometheus` crate. It also contains a function to start a http server to expose the metrics to DSH.
//!
//! See [local](local/index.html) for more information.
//! See [metrics](metrics/index.html) for more information.
//!
//! # Graceful shutdown
//!
//! To implement a graceful shutdown in your service, you can use the `Shutdown` struct. This struct has an implementation based on the best practice example of Tokio.
//! To implement a graceful shutdown in your service, you can use the `Shutdown` struct. This struct has an implementation based on the best practices example of Tokio.
//!
//! This gives you the option to properly handle shutdown in your components/tasks.
//! It listens for SIGTERM requests and sends out shutdown requests to all shutdown handles.
//!
//! See [graceful_shutdown](graceful_shutdown/index.html) for more information.
//!
//! # DLQ (Dead Letter Queue)
//!
//! `OPTIONAL feature: dlq`
//!
//! This is an experimental feature and is not yet finalized.
//!
//! This implementation only includes pushing messages towards a kafka topic. (Dead or Retry topic)
//!
//! ### NOTE:
//! This implementation does not (and will not) handle any other DLQ related tasks like:
//! - Retrying messages
//! - Handling messages in DLQ
//! - Monitor the DLQ
//! Above tasks should be handled by a separate component set up by the user, as these tasks are use case specific and can handle different strategies.
//!
//!
//! The DLQ is implemented by running the `Dlq` struct to push messages towards the DLQ topics.
//! The `ErrorToDlq` trait can be implemented on your defined errors, to be able to send messages towards the DLQ Struct.
@@ -85,3 +84,6 @@ pub mod graceful_shutdown;
pub mod metrics;
#[cfg(any(feature = "rdkafka-ssl", feature = "rdkafka-ssl-vendored"))]
pub use rdkafka;

#[cfg(feature = "bootstrap")]
pub use dsh::Properties;
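The crate documentation above describes a `Shutdown` struct modelled on Tokio's graceful-shutdown best practices: listen for SIGTERM, fan the shutdown request out to all tasks, and wait until they finish. A self-contained sketch of that underlying pattern using plain tokio primitives rather than the SDK's own API:

```rust
use tokio::signal;
use tokio::sync::{mpsc, watch};

#[tokio::main]
async fn main() {
    // A watch channel fans the shutdown request out to every task.
    let (shutdown_tx, shutdown_rx) = watch::channel(false);
    // An mpsc channel tracks completion: recv() returns None once every
    // sender clone held by a task has been dropped.
    let (done_tx, mut done_rx) = mpsc::channel::<()>(1);

    for i in 0..3 {
        let mut shutdown = shutdown_rx.clone();
        let done = done_tx.clone();
        tokio::spawn(async move {
            // Simulated worker: block until a shutdown request arrives.
            let _ = shutdown.changed().await;
            println!("task {i}: shutting down cleanly");
            drop(done); // signals completion to the main task
        });
    }
    drop(done_tx); // keep only the clones owned by the tasks

    // Wait for Ctrl-C (a service would also listen for SIGTERM), then fan out.
    signal::ctrl_c().await.expect("failed to install Ctrl-C handler");
    let _ = shutdown_tx.send(true);

    // Returns None once all tasks have dropped their sender.
    let _ = done_rx.recv().await;
    println!("all tasks finished, exiting");
}
```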
2 changes: 1 addition & 1 deletion example_dsh_service/Cargo.toml
@@ -5,7 +5,7 @@ description = "An example of DSH service using the dsh-sdk crate"
edition = "2021"

[dependencies]
dsh_sdk = { path = "../dsh_sdk", version = "0.3", features = ["rdkafka-ssl-vendored"] }
dsh_sdk = { path = "../dsh_sdk", version = "0.4", features = ["rdkafka-ssl-vendored"] }
log = "0.4"
env_logger = "0.11"
tokio = { version = "^1.35", features = ["full"] }