For building and running the library, Java 17 (or above) is required.
Making the JDBC usage simpler and less verbose.
JDBC is the basic API for accessing relational databases. Being basic makes it quite tedious to use directly, which leads to higher-level abstractions like JPA. Using a full-grown Object Relational Mapper, on the other hand, might be too heavyweight for many use cases. FacileJDBC tries to fill the gap by making low-level JDBC access less verbose and tedious. SQL is still used as the query language.
The API of this library has been heavily influenced by the Scala Anorm library.
What FacileJDBC provides:
- A lightweight wrapper around the JDBC API.
- The possibility to fill query parameters by name instead of by position: available via the Param interface.
- Functions for creating (parsing) entity objects from query ResultSets: available via the RowParser interface.
- Functions for splitting (deconstructing) entity objects into DB columns: available via the Dctor interface.
- A Query object for putting all the pieces together.
- Lightweight transaction handling support.
What FacileJDBC does not provide:
- An OR-Mapper
- A type-safe query language
- An SQL query builder
- DB-vendor specific code; it uses 100% pure JDBC.
- Query generation capabilities; the user is responsible for creating the proper SQL string.
- Generated classes or dynamically generated proxies
- Connection pooling
Other Java DB libraries
- jOOQ: Excellent library for accessing databases in a type safe way. Because of the different scope, it is more than a thin wrapper around the JDBC API.
- Jdbi: Similar scope, but with a different approach.
The following examples show how to use FacileJDBC for different use cases. For a detailed description of the API, also have a look at the Javadoc.
SQL queries are defined via the Query class. Since the Query class is immutable, it is safe to use it in a multi-threaded environment or to define it as a static class member. The Query class is the main entry point of the FacileJDBC library. For executing a query, all it needs is a JDBC Connection.
final Query query = Query.of("SELECT 1");
final boolean result = query.execute(conn);
The execute method returns a boolean value, as specified by the PreparedStatement.execute() method. The Javadoc documents which methods of the PreparedStatement are used for every execute method.
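Statements that modify rows can go through the executeUpdate method, which returns the affected row count like PreparedStatement.executeUpdate(). The following is only a minimal sketch; the person table, its columns, and the concrete values are assumptions for illustration.
// Bind the parameters and execute the UPDATE; the returned value is
// the number of rows changed by the statement.
final Query update = Query.of(
    "UPDATE person SET email = :email WHERE name = :name"
);
final int updatedRows = update
    .on(
        value("email", "franz@acme.org"),
        value("name", "Franz"))
    .executeUpdate(conn);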
Usually, your selected rows will be stored in DTO objects. For the simple examples, the following Person DTO will be used.
public record Person(
String name,
String email,
String link
){}
The following query will select all persons that match a given name pattern.
static final Query SELECT = Query.of("""
SELECT name, email, link
FROM person
WHERE name like :name
"""
);
Executing the select query is shown in the following code snippet. It also shows how query parameters are set and how the result rows are parsed.
final List<Person> persons = SELECT
.on(value("name", "M%"))
.as(PARSER.list(), conn);
The Query.on method is used for filling the query parameters. The variables use the usual syntax for SQL bind variables. With the Param.value factory method it is possible to fill the defined variables. The Query.as method executes the query and returns the parsed result rows. For converting the query result to a DTO, a RowParser is needed. With the RowParser.list() method you tell the query that you expect 0 to n result rows. If only one result is expected, you need to use the RowParser.single() or RowParser.singleOp() method.
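As a minimal sketch, reusing the SELECT query and the PARSER shown in this section (the name pattern is an arbitrary example), selecting a single row could look like this.
// single() is used when exactly one result row is expected; the
// Optional-returning variant handles a possibly empty result.
final Person person = SELECT
    .on(value("name", "Franz M%"))
    .as(PARSER.single(), conn);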
The following code snippet shows the RowParser implementation for our Person DTO.
static final RowParser<Person> PARSER = (row, conn) -> new Person(
row.getString("name"),
row.getString("email"),
row.getString("link")
);
Since the RowParser is a functional interface, it can be written as shown. The first function parameter represents the actual selected row and the second parameter the JDBC Connection used for executing the query. In most cases the conn will not be used, but it is quite helpful for fetching dependent DTOs in a sub-query.
When you are using Java records as entity objects, you can create the RowParser with a simple factory method.
static final RowParser<Person> PARSER = RowParser.of(Person.class);
The FacileJDBC library allows you to lazily fetch results from the DB. This might be useful when the selected data could cause an OutOfMemoryError.
final var select = Query.of("SELECT * FROM person;");
// Make sure to close the returned stream.
try (var persons = select.as(PARSER.stream(), conn)) {
// Do some processing with the person stream.
persons.forEach(person -> ...);
}
It's important to close the returned Stream, which will close the underlying ResultSet and PreparedStatement. Every SQLException thrown while fetching the data from the DB will be wrapped in an UncheckedSQLException.
By setting the fetch size with the Query.withFetchSize(int) method, it is possible to control the amount of data fetched at once by the JDBC driver.
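A minimal sketch, combining the fetch size with the streaming example above; the size of 1,000 is an arbitrary choice.
// Hint the JDBC driver to fetch 1,000 rows at a time while the
// result is streamed lazily. Make sure to close the stream.
final var select = Query.of("SELECT * FROM person;")
    .withFetchSize(1_000);

try (var persons = select.as(PARSER.stream(), conn)) {
    persons.forEach(System.out::println);
}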
Sometimes it is convenient to export the whole selection result as a CSV string.
final var select = Query.of("SELECT * FROM book;");
final var csv = select.as(ResultSetParser.csvLine(), conn);
System.out.println(csv);
The printed CSV string will look like the following example.
"ID","PUBLISHED_AT","TITLE","ISBN","PAGES"
"0","1987-02-04","Auf der Suche nach der verlorenen Zeit","978-3518061756","5100"
"1","1945-01-04","Database Design for Mere Mortals","978-0321884497","654"
"2","1887-02-04","Der alte Mann und das Meer","B00JM4RD2S","142"
For a big result set it is possible to lazily stream the selected rows into a file.
final var select = Query.of("SELECT * FROM book ORDER BY id;");
try (Stream<String> lines = select.as(RowParser.csvLine().stream(), conn);
Writer out = Files.newBufferedWriter(Path.of("out.csv")))
{
lines.forEach(line -> {
try {
out.write(line);
out.write("\r\n");
} catch (IOException e) {
throw new UncheckedIOException(e);
}
});
}
The rows are written without a header into the CSV file.
"0","1987-02-04","Auf der Suche nach der verlorenen Zeit","978-3518061756","5100"
"1","1945-01-04","Database Design for Mere Mortals","978-0321884497","654"
"2","1887-02-04","Der alte Mann und das Meer","B00JM4RD2S","142"
For inserting a new Person into the DB, an insert query has to be defined.
static final Query INSERT = Query.of("""
INSERT INTO person(name, email, link)
VALUES(:name, :email, :link);
"""
);
When all bind variables have been set, the row can be inserted by calling the execute method.
final boolean inserted = INSERT
.on(
value("name", "foo"),
value("email", "foo@gmail.com"),
value("link", "http://google.com"))
.execute(conn);
Setting the bind variables this way is quite tedious if you already have a filled Person DTO. For inserting the Person DTO directly, you need to define the variable-field mapping. This is done via the Dctor (deconstructor) interface. The Dctor can be seen as the inverse function of the RowParser.
private static final Dctor<Person> DCTOR = Dctor.of(
field("name", Person::name),
field("email", Person::email),
field("link", Person::link)
);
The Dctor interface also comes with a simple factory method for records.
private static final Dctor<Person> DCTOR = Dctor.of(Person.class);
Once a deconstructor is defined for your DTO, you can easily insert single Person objects.
final Person person = ...;
final boolean inserted = INSERT
.on(person, DCTOR)
.execute(conn);
If you are interested in the automatically generated primary key of the insertion, you have to use the executeInsert method. This method returns an Optional, since it is possible that no primary key could be generated.
final Optional<Long> inserted = INSERT
.on(...)
.executeInsert(conn);
or
final Optional<Integer> inserted = INSERT
.on(...)
.executeInsert(RowParser.int32(1), conn);
if you need to control the parsing of the generated primary key.
If you have a collection of Persons, you can insert it in one batch.
final List<Person> persons = ...;
final Batch batch = Batch.of(persons, DCTOR);
final int[] counts = INSERT.executeUpdate(batch, conn);
For simple insertions, you can also do some kind of ad-hoc batch insertions.
Query.of("INSERT INTO person(id, name) VALUES(:id, :name)")
.execute(
Batch.of(
List.of(value("id", 1), value("name", "Peter")),
List.of(value("id", 2), value("name", "Jack")),
List.of(value("id", 3), value("name", "John"))
),
conn
);
A parameter can be multi-valued, like a sequence of IDs. In such a case, the values will be prepared so that they are passed appropriately to JDBC.
final List<Book> results = Query.of("SELECT * FROM book WHERE id IN(:ids);")
.on(Param.values("ids", 1, 2, 3, 4))
.as(PARSER.list(), conn);
The created JDBC query string will look like this
SELECT * FROM book WHERE id IN(?,?,?,?);
filled with the values 1, 2, 3 and 4.
Sometimes it is not possible to use the object conversions available in the library, e.g. if you want to insert some raw byte content via an InputStream.
final var query = Query.of("INSERT INTO book(name, pdf) VALUES(:name, :pdf)");
try (var in = Files.newInputStream(Path.of("./book.pdf"))) {
final long id = query
.on(
Param.value("name", "For Whom the Bell Tolls"),
// Call a "special" statement set-method, when setting the parameter.
Param.of("pdf", (index, stmt) -> stmt.setBinaryStream(index, in)))
.executeInsert(conn)
.orElseThrow();
System.out.println("Inserted book with ID: " + id);
}
It is possible to add automatic parameter value conversion via the SqlTypeMapper SPI class. Usually, it is not possible to insert a URI field directly into the DB; you have to convert it into a string first.
public record Person(
String name,
URI link
){}
static final Dctor<Person> DCTOR = Dctor.of(
Person.class,
Dctor.field("link", p -> p.link().toString())
);
If a mapper for the URI class is defined, it is possible to write the deconstructor more concisely.
static final Dctor<Person> DCTOR = Dctor.of(Person.class);
The implementation of such a mapping is quite simple and will look like the following code snippet.
public class URIMapper extends SqlTypeMapper {
public Object convert(final Object value) {
if (value instanceof URI) {
return value.toString();
} else {
return value;
}
}
}
Add the following line
org.acme.URIMapper
to the service definition file
META-INF/services/io.jenetics.facilejdbc.spi.SqlTypeMapper
and you are done.
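With the mapper registered, a minimal sketch of the effect: the record from above, holding a URI, can be inserted through the concise record-based deconstructor without any manual string conversion. The INSERT query and its column names are assumptions for illustration.
// URIMapper converts the URI link to a string before the value is
// bound to the prepared statement.
final var insert = Query.of("INSERT INTO person(name, link) VALUES(:name, :link)");
final var person = new Person("foo", URI.create("http://google.com"));
final boolean inserted = insert
    .on(person, DCTOR)
    .execute(conn);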
The previous examples show the basic usage of the library. It is possible to use this for all needed select and insert queries, just as you would with plain JDBC. But if you need to select or insert small object graphs, this quickly becomes tedious as well.
Let's extend our initial example and convert the link of the Person into an object
public record Person(
String name,
String email,
Link link
){}
The Link class will look like the following.
public record Link(
String name,
URI link
){}
It is now possible to create one RowParser<Person> and one Dctor<Person> which automatically take care of the linked Link object. The new parser will look like the following code snippet.
static final RowParser<Person> PERSON_PARSER = (row, conn) -> new Person(
row.getString("name"),
row.getString("email"),
selectLink(row.getLong("link_id"), conn)
);
The corresponding deconstructor looks like the following.
static final Dctor<Person> PERSON_DCTOR = Dctor.of(
Person.class,
field("link_id", (p, c) -> insertLink(p.link(), c))
);
The helper methods are responsible for selecting and inserting the Link objects.
static final RowParser<Link> LINK_PARSER = (row, conn) -> new Link(
row.getString("name"),
URI.create(row.getString("url"))
);
static final Dctor<Link> LINK_DCTOR = Dctor.of(
field("name", Link::name),
field("url", l -> l.link().toString())
);
static Link selectLink(final Long linkId, final Connection conn)
throws SQLException
{
return Query.of("SELECT * FROM link WHERE id = :id")
.on(value("id", linkId))
.as(LINK_PARSER.singleNull(), conn);
}
static Long insertLink(final Link link, final Connection conn)
throws SQLException
{
return Query.of("INSERT INTO link(name, url) VALUES(:name, :url)")
.on(link, LINK_DCTOR)
.executeInsert(conn)
.orElseThrow();
}
It is still necessary to implement the sub-inserts and sub-selects, but these can be re-used in other queries where inserting and selecting of Links is needed. Note that this is not an automatic OR-mapping mechanism. The user is still in charge of the concrete implementation.
Note
Although the described feature is quite expressive and may solve some selection/insertion tasks in an elegant way, that does not mean you have to use it. Just treat it as an additional possibility.
FacileJDBC also contains two interfaces for simple transaction handling. The Transaction interface defines methods for executing one or more queries in a transactional context.
final Transaction transaction = ...;
// Inserting a new link into the DB and returning
// the primary key of the newly inserted row.
final long id = transaction.apply(conn -> insertLink(link, conn));
If you are not interested in the return value of the SQL execution, you can use the accept method instead. In the case of an error, the connection is rolled back. If everything works fine, the connection is committed.
transaction.accept(conn -> insertLink(link, conn));
The second interface is the Transactional interface, which represents the transactional capability typically exposed by a database. In this sense, it can be seen as a minimal database interface, just exposing a Connection factory method, Transactional::connection. Since Transactional is a functional interface, it can easily be created by defining the Connection factory method.
final Transactional db = () -> DriverManager.getConnection(
"jdbc:hsqldb:mem:testdb",
"SA",
""
);
The example above shows how to create a Transactional instance for an HSQLDB in-memory database, perfectly usable for testing purposes. It can then be used for performing SQL inserts.
final long bookId = db.transaction().apply(conn ->
Book.insert(book, conn)
);
For production code you usually have a DataSource, which represents the connection to the DB. It's equally easy to create a Transactional object from a given DataSource instance.
final DataSource ds = ...;
final Transactional db = ds::getConnection;
The library is licensed under the Apache License, Version 2.0.
Copyright 2019-2023 Franz Wilhelmstötter
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
Release notes
- #23: Implementation of the Stored class.
// Reading 'Link' objects from db.
final List<Stored<Long, Link>> links = select
.as(LINK_PARSER.stored("id").list(), conn);
// Printing the result + its DB ids.
links.forEach(System.out::println);
// > Stored[id=1, value=Link[http://jenetics.io, text=null, type=null]]
// > Stored[id=2, value=Link[http://jenetics.io, text=Jenetics, type=web]]
// > Stored[id=3, value=Link[https://duckduckgo.com, text=DuckDuckGo, type=search]]
- #49: Implement the PreparedQuery class.
final var query = INSERT_LINK.prepareQuery(conn);
final var batch = Batch.of(links, LINK_DCTOR);
query.execute(batch);
- #23: Remove wrong null check in Param factory method.