- Multiple backends
- Flexible API
- Code generation with macros
- High modularity
- Spark support
- Easy to customize
The library is split into several modules:
Module | Description |
---|---|
core | primitives shared by the other modules |
macros | compile-time generation of `InfluxReader`, `InfluxWriter`, and `InfluxFormatter` |
akka-http | async HTTP client based on the akka-http backend |
async-client | async HTTP client based on the async-client backend |
url-http | sync HTTP client based on the `URLConnection` backend |
udp | UDP client |
spark | Spark connector |
Per-module dependencies:

Core:
- enumeratum for better Scala enumerations
- jawn as the base for JSON operations

Macros:
- coreModel
- scala-reflect

Akka-HTTP:
- coreApi
- akka-http

Async-HTTP:
- coreApi
- sttp (async-client backend)

Url-HTTP:
- coreApi
- sttp (url-conn backend)

UDP:
- coreModel

Spark:
- url-http
Add the modules you need to the dependency list in your `build.sbt`:
// for the Akka based client
libraryDependencies += "com.github.fsanaulla" %% "chronicler-akka-http" % "<version>"
// for the Netty based client
libraryDependencies += "com.github.fsanaulla" %% "chronicler-async-http" % "<version>"
// for the UrlHttp based client
libraryDependencies += "com.github.fsanaulla" %% "chronicler-url-http" % "<version>"
// for the UDP protocol client
libraryDependencies += "com.github.fsanaulla" %% "chronicler-udp" % "<version>"
// macros extension
libraryDependencies += "com.github.fsanaulla" %% "chronicler-macros" % "<version>"
Let's take a look at a simple usage example. In this example we will use the async-http client together with the macros module.

The sbt file looks like this:
lazy val chronicler: String = "latest"
libraryDependencies ++= Seq(
"com.github.fsanaulla" %% "chronicler-async-http" % chronicler,
"com.github.fsanaulla" %% "chronicler-macros" % chronicler
)
Our code:
import com.github.fsanaulla.chronicler.async.{Influx, InfluxAsyncHttpClient}
import com.github.fsanaulla.chronicler.async.api.Measurement
// Macros and InfluxCredentials are assumed to live alongside the classes above; adjust the paths if they differ
import com.github.fsanaulla.macros.Macros
import com.github.fsanaulla.macros.annotations.{field, tag, timestamp}
import com.github.fsanaulla.core.model.{InfluxCredentials, InfluxFormatter}
import scala.util.{Success, Failure}
import scala.concurrent.ExecutionContext.Implicits.global
// let's define our model, and mark them with annotations for macro-code generation
case class Resume(
@tag id: String,
@tag candidateName: String,
@tag candidateSurname: String,
@field position: String,
@field age: Option[Int],
@field rate: Double,
@timestamp created: Long)
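// In InfluxDB terms: @tag columns are indexed string metadata, @field columns
// carry the measured values (an Option[_] field may be absent in a point),
// and @timestamp is the point's time in nanoseconds since the epoch.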
// let's create serializers/deserializers
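// Macros.format expands at compile time into the InfluxReader and InfluxWriter behind InfluxFormatter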
implicit val fmt: InfluxFormatter[Resume] = Macros.format[Resume]
// set up credentials, if authentication is enabled
private val credentials: InfluxCredentials = InfluxCredentials("username", "password")
// influx details
final val host = "influx_host"
final val port = 8086
// establish connection to InfluxDB
val influx =
  Influx.io(host, port, Some(credentials)) // an IO-only client, since we only perform read/write actions
val databaseName = "test_db"
val measurementName = "test_measurement"
// let's make it in type-safe approach
val measurement: Measurement[Resume] =
influx.measurement[Resume](databaseName, measurementName)
// let's write into measurement
val resume = Resume(
  "dasdasfsadf",
  "Jame",
  "Lanni",
  "Scala developer",
  Some(25),
  4.5,
  System.currentTimeMillis() * 1000000)
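// note: millis * 1000000 converts the epoch time to the nanosecond precision InfluxDB uses by default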
// insert entity
measurement.write(resume).onComplete {
case Success(r) if r.isSuccess => println("Great new developer is coming!!")
case _ => // handle failure
}
// retrieve entities
measurement.read(s"SELECT * FROM $measurementName").onComplete {
  case Success(qr) if qr.isSuccess => qr.queryResult.foreach(println)
  case _ => // handle failure
}
// close client
influx.close()
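If you need the query result synchronously, e.g. in a test, you can block on the Future before closing the client. This is a minimal sketch using only the standard library's Await; it assumes the same measurement value and the Array[Resume]-typed queryResult shown above:

import scala.concurrent.Await
import scala.concurrent.duration._

// blocks the current thread for at most 5 seconds; fine in tests, avoid in production code
val qr = Await.result(measurement.read(s"SELECT * FROM $measurementName"), 5.seconds)
val resumes: Array[Resume] = if (qr.isSuccess) qr.queryResult else Array.empty[Resume]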
For more details, see the next section. The same example can be applied to the other clients with only small differences.
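For instance, here is a rough sketch of the same write with the synchronous url-http client. Treat it as an assumption-laden illustration: the package path (com.github.fsanaulla.chronicler.urlhttp) and the synchronous result type are guesses modeled on the async example, not verified API:

// NOTE: hypothetical sketch; package layout and result types are assumptions
import com.github.fsanaulla.chronicler.urlhttp.Influx

val syncInflux = Influx.io(host, port, Some(credentials))
val syncMeasurement = syncInflux.measurement[Resume](databaseName, measurementName)

// with a synchronous backend, write returns its result directly rather than in a Future
val writeResult = syncMeasurement.write(resume)
if (writeResult.isSuccess) println("written synchronously")

syncInflux.close()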