2 changes: 2 additions & 0 deletions DESCRIPTION
@@ -22,7 +22,9 @@ BugReports: https://github.com/posit-dev/btw/issues
Imports:
    cli,
    clipr,
    DBI,
    dplyr,
    duckdb,
    ellmer (>= 0.1.1.9000),
    fs,
    jsonlite,
47 changes: 47 additions & 0 deletions R/tool-query.R
@@ -0,0 +1,47 @@
#' Perform a SQL query on a data frame and return the results as JSON.
#'
#' @param query A DuckDB SQL query; must be a SELECT statement.
#' @param data_frame The name of the data frame, as a string.
#' @return The results of the query as a JSON string.
btw_tool_env_query_data_frame <- function(query, data_frame) {
  # Look up the data frame by name, e.g. in the global environment
  d <- get(data_frame)
  conn <- btw_connection()

  # Register the data frame as a DuckDB view the first time it is queried
  if (!DBI::dbExistsTable(conn, data_frame)) {
    duckdb::duckdb_register(conn, data_frame, d, experimental = FALSE)
  }
Comment from a Collaborator on lines +6 to +12:

What do you think about adding a registration system for these tables? The usage would be something like

btw_register_table(mtcars)
btw_register_table(mtcars, "my_mtcars")

client <- btw_client()

If done before the client is created, the tool description could include the table names. Or we could also have a "list tables" tool to introduce the available tables to the LLM.

The other advantage is that the LLM wouldn't need to provide data_frame, it would only need to write the SQL code.
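Under that proposal, a query-only tool might look like the following. This is a hypothetical sketch, not code from this PR: the function name and the assumption that tables are already registered on the connection are both invented for illustration.

```r
# Hypothetical sketch: if tables are pre-registered on the btw
# connection, the tool only needs the SQL string.
btw_tool_env_query <- function(query) {
  conn <- btw_connection()
  # All registered tables are visible to DuckDB, so no data_frame
  # argument (or get()) is needed here.
  res <- DBI::dbGetQuery(conn, query)
  btw_tool_env_describe_data_frame(res, format = "json", dims = c(Inf, Inf))
}
```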

Reply from a Collaborator:

There'd be friction in needing to register tables for use with btw, but I also like the explicit-consent aspect.

Reply from a Collaborator:

Oh and the registration function would be a generic, so we could also take existing connections, remote tables, .csv files etc. (eventually, but I'd stick with starting with R data frames via duckdb).
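Such a generic might be sketched as follows. This is hypothetical and not part of this PR; the name, signature, and method set are assumptions, with only a data frame method filled in to match the duckdb-first plan.

```r
# Hypothetical sketch of the proposed registration generic (not in
# this PR). Other methods (DBI connections, remote tables, CSV files)
# could be added later.
btw_register_table <- function(x, name = deparse(substitute(x)), ...) {
  UseMethod("btw_register_table")
}

btw_register_table.data.frame <- function(x, name = deparse(substitute(x)), ...) {
  conn <- btw_connection()
  # Register the data frame as a DuckDB view under `name`, once
  if (!DBI::dbExistsTable(conn, name)) {
    duckdb::duckdb_register(conn, name, x, experimental = FALSE)
  }
  invisible(name)
}
```

With this in place, `btw_register_table(mtcars)` or `btw_register_table(mtcars, "my_mtcars")` would make the table queryable before the client is created.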

Reply from the PR author:

Do you imagine that the registration is optional or required? I do like the "if it's in my global environment, the model can find it" paradigm set by the existing environment and data frame description tools, which makes me lean towards making registration optional. There are also already some existing forms of explicit consent, depending on the UI; e.g. if btw is hooked up to Claude Desktop or Claude Code via acquaint, the first usages of the tools will ask for the user's permission anyway.

I do see the value in the registration system in that get() doesn't have a lot of uses beyond the global environment and models are otherwise unable to "fetch" data up to this point. The registration system sounds really helpful for those other types of data sources.


  res <- DBI::dbGetQuery(conn, query)

  # Reuse the existing description tool to render the result as JSON
  btw_tool_env_describe_data_frame(res, format = "json", dims = c(Inf, Inf))
}

.btw_add_to_tools(
  name = "btw_tool_env_query_data_frame",
  group = "env",
  tool = function() {
    ellmer::tool(
      btw_tool_env_query_data_frame,
      .name = "btw_tool_env_query_data_frame",
      .description =
        "Run a DuckDB SQL query against a data frame.
        Use this tool instead of btw_tool_env_describe_data_frame to run more
        targeted queries, e.g. calculating statistics on specific columns.",
      query = ellmer::type_string("A DuckDB SQL query, as a string."),
      data_frame = ellmer::type_string("The name of the data frame, as a string.")
    )
  }
)

btw_connect <- function() {
  # TODO: also check if the connection is active
  if (is.null(.globals$conn)) {
    .globals$conn <- DBI::dbConnect(duckdb::duckdb(), dbdir = ":memory:")
  }
}

btw_connection <- function() {
  btw_connect()

  .globals$conn
}
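The TODO in `btw_connect()` could be addressed by validating the cached connection before reusing it, e.g. with `DBI::dbIsValid()`. A sketch, not part of this PR:

```r
# Possible refinement (sketch): reconnect when the cached connection
# is missing or no longer valid, instead of only checking for NULL.
btw_connect <- function() {
  conn <- .globals$conn
  if (is.null(conn) || !DBI::dbIsValid(conn)) {
    .globals$conn <- DBI::dbConnect(duckdb::duckdb(), dbdir = ":memory:")
  }
  invisible(.globals$conn)
}
```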
2 changes: 2 additions & 0 deletions R/utils.R
@@ -1,3 +1,5 @@
.globals <- new_environment()

pandoc_convert <- function(path, ..., from = "html", to = "markdown") {
  tmp_file <- withr::local_tempfile()

17 changes: 12 additions & 5 deletions man/btw_register_tools.Rd

Some generated files are not rendered by default.

19 changes: 19 additions & 0 deletions man/btw_tool_env_query_data_frame.Rd


18 changes: 18 additions & 0 deletions tests/testthat/_snaps/tool-query.md
@@ -0,0 +1,18 @@
# btw_tool_env_query_data_frame() works

    Code
      btw_tool_env_query_data_frame("SELECT mpg FROM mtcars LIMIT 5;", "mtcars")
    Output
      [1] "```json"
      [2] "[\n {\"mpg\":21},\n {\"mpg\":21},\n {\"mpg\":22.8},\n {\"mpg\":21.4},\n {\"mpg\":18.7}\n]"
      [3] "```"

---

    Code
      btw_tool_env_query_data_frame("SELECT mpg FROM mtcars LIMIT 5;", "mtcars")
    Output
      [1] "```json"
      [2] "[\n {\"mpg\":21},\n {\"mpg\":21},\n {\"mpg\":22.8},\n {\"mpg\":21.4},\n {\"mpg\":18.7}\n]"
      [3] "```"

17 changes: 17 additions & 0 deletions tests/testthat/test-tool-query.R
@@ -0,0 +1,17 @@
test_that("btw_tool_env_query_data_frame() works", {
  # can run a simple query
  expect_snapshot(
    btw_tool_env_query_data_frame(
      "SELECT mpg FROM mtcars LIMIT 5;",
      "mtcars"
    )
  )

  # can run a query against the same table twice
  expect_snapshot(
    btw_tool_env_query_data_frame(
      "SELECT mpg FROM mtcars LIMIT 5;",
      "mtcars"
    )
  )
})