Reads just the schema from table on conn, synthesizes n fake rows, writes a schema JSON, fake dataset(s), and a README prompt, and optionally zips them into a single archive.
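A minimal call relying on the defaults looks like the sketch below; conn is any open DBI connection and "my_table" is a placeholder table name:

# Sketch only: with the defaults this writes the schema JSON, the fake
# data file(s), and README_FOR_LLM.txt to tempdir(), and invisibly
# returns the paths of what was written.
paths <- llm_bundle_from_db(conn, "my_table")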
Arguments
- conn
A DBI connection.
- table
Character scalar: table name to read.
- n
Number of rows in the fake dataset (default 30).
- level
Privacy level: "low", "medium", or "high". Higher levels apply stricter defaults.
- formats
Data file formats to write: any of "csv", "rds", "parquet".
- path
Folder to write outputs. Default: tempdir().
- filename
Base file name (no extension). Example: "demo_bundle". This becomes files like "demo_bundle.csv", "demo_bundle.rds", etc.
- seed
Optional RNG seed for reproducibility.
- write_prompt
Write a README_FOR_LLM.txt next to the data? Default TRUE.
- zip
Create a single zip archive containing data + schema + README? Default FALSE.
- zip_filename
Optional custom name for the ZIP file (no path). If NULL (default), it is derived as paste0(filename, ".zip"), e.g. "demo_bundle.zip".
- sensitive_strategy
"fake" (replace sensitive values with realistic fakes) or "drop". Default "fake".
Value
Invisibly, a list with useful paths:
schema_path – schema JSON
files – vector of written fake-data files
zip_path – zip archive path (if zip = TRUE)
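For instance, the returned list can be used to locate each artifact afterwards; a short sketch (not run), reusing the "demo_bundle" naming from the arguments above and the cars table from the Examples:

res <- llm_bundle_from_db(con, "cars", filename = "demo_bundle",
                          formats = c("csv", "rds"), zip = TRUE)
res$schema_path   # path to the schema JSON
res$files         # written fake-data files, e.g. demo_bundle.csv, demo_bundle.rds
res$zip_path      # demo_bundle.zip (present because zip = TRUE)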
Examples
# \donttest{
if (requireNamespace("DBI", quietly = TRUE) &&
    requireNamespace("RSQLite", quietly = TRUE)) {
  con <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")
  on.exit(DBI::dbDisconnect(con), add = TRUE)
  DBI::dbWriteTable(con, "cars", head(cars, 20), overwrite = TRUE)

  out <- llm_bundle_from_db(
    con, "cars",
    n = 100, level = "medium",
    formats = c("csv", "rds"),
    path = tempdir(), filename = "db_bundle",
    seed = 1, write_prompt = TRUE, zip = TRUE
  )
}
# }
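# Follow-up sketch (not run): the "db_bundle" data files and the README
# written by the call above can be listed from the output folder.
list.files(tempdir(), pattern = "db_bundle|README_FOR_LLM")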