An R interface to GDAL’s unified command-line interface (GDAL >=3.11). Provides a lazy evaluation framework for building and executing GDAL commands with composable, pipe-aware functions. Supports native GDAL pipelines, gdalcli pipeline format persistence, and pipeline composition.
gdalcli is released as version-specific builds tied to particular GDAL
releases. Using a package version matching your GDAL version is
recommended. Newer package versions introduce features that require
newer GDAL versions. Existing functionality should generally remain
compatible with older GDAL installations, though this cannot be
guaranteed until the GDAL CLI is stabilized.
See GitHub Releases for the latest version-specific builds.
Each release is tagged with both the package version and the GDAL version it targets:
- v0.2.1-3.11.0: compatible with GDAL 3.11.0
- v0.2.1-3.12.0: compatible with GDAL 3.12.0
- etc.
# Check your system GDAL installation
system2("gdalinfo", "--version")Install the version compatible with your GDAL installation:
# For GDAL 3.12.x
remotes::install_github("brownag/gdalcli", ref = "release/gdal-3.12")
# For GDAL 3.11.x
remotes::install_github("brownag/gdalcli", ref = "release/gdal-3.11")Docker: Pre-built images available at
ghcr.io/brownag/gdalcli:gdal-X.Y.Z-latest
- R >= 4.1
- GDAL >= 3.11 (the GDAL CLI must be on the system PATH for the processx backend; a quick availability check is sketched after this list)
- gdalraster (optional; enables gdalraster backend)
- reticulate (optional; enables reticulate backend)
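Whether the CLI and the optional backend packages are present can be checked from R before installing. A minimal sketch using only base R; the unified CLI binary name gdal and this particular check are assumptions, not something the package requires you to run:
# Check which execution backends are usable on this system (base R only)
has_cli        <- nzchar(Sys.which("gdal"))                       # unified GDAL CLI on PATH (processx backend)
has_gdalraster <- requireNamespace("gdalraster", quietly = TRUE)  # gdalraster backend
has_reticulate <- requireNamespace("reticulate", quietly = TRUE)  # reticulate backend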
library(gdalcli)
# Create a job (lazy evaluation - nothing executes yet)
job <- gdal_raster_convert(
input = system.file("extdata/sample_clay_content.tif", package = "gdalcli"),
output = tempfile(fileext = ".tif"),
output_format = "COG"
)
# Execute the job
gdal_job_run(job)
#> 0...10...20...30...40...50...60...70...80...90...100 - done.
Modifiers such as gdal_with_co() and gdal_with_config() compose onto a job with the pipe:
job <- gdal_raster_convert(
input = system.file("extdata/sample_clay_content.tif", package = "gdalcli"),
output = tempfile(fileext = ".tif"),
output_format = "COG"
) |>
gdal_with_co("COMPRESS=LZW", "PREDICTOR=2") |>
gdal_with_config("GDAL_CACHEMAX=512")
gdal_job_run(job)
Frontend calls can also be chained into a multi-step pipeline:
pipeline <- gdal_raster_reproject(
input = system.file("extdata/sample_clay_content.tif", package = "gdalcli"),
dst_crs = "EPSG:32632"
) |>
gdal_raster_scale(src_min = 0, src_max = 100, dst_min = 0, dst_max = 255) |>
gdal_raster_convert(output = tempfile(fileext = ".tif"), output_format = "COG")
gdal_job_run(pipeline)
#> 0...10...20...30...40...50...60...70...80...90...100 - done.
#> 0...10...20...30...40...50...60...70...80...90...100 - done.
# Save pipeline to gdalcli pipeline format
workflow_file <- tempfile(fileext = ".gdalcli.json")
gdal_save_pipeline(pipeline, workflow_file)
# Load pipeline for later use
loaded <- gdal_load_pipeline(workflow_file)
Jobs that read or write cloud storage can pick up credentials via authentication helpers:
# Set AWS credentials (from environment variables)
auth <- gdal_auth_s3()
job <- gdal_raster_convert(
input = "/vsis3/my-bucket/input.tif",
output = "/vsis3/my-bucket/output.tif",
output_format = "COG"
) |>
gdal_with_env(auth)
gdal_job_run(job)
# Get detailed information about a raster file
info_job <- gdal_raster_info(
input = system.file("extdata/sample_clay_content.tif", package = "gdalcli")
)
gdal_job_run(info_job)
# Convert vector format
vector_job <- gdal_vector_convert(
input = system.file("extdata/sample_mapunit_polygons.gpkg", package = "gdalcli"),
output = tempfile(fileext = ".geojson"),
output_format = "GeoJSON"
)
gdal_job_run(vector_job, backend = "processx")
# Simple processing pipeline: reproject and convert
processing_pipeline <- gdal_raster_reproject(
input = system.file("extdata/sample_clay_content.tif", package = "gdalcli"),
dst_crs = "EPSG:32632"
) |>
gdal_raster_convert(output = tempfile(fileext = ".tif"))
gdal_job_run(processing_pipeline, backend = "processx")
gdalcli supports multiple execution backends (per-call selection is sketched after the list):
- processx (default): Executes GDAL CLI commands as subprocesses
- gdalraster (optional): Uses C++ GDAL bindings via gdalraster package
- reticulate (optional): Uses Python GDAL bindings via reticulate
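The backend can also be selected per call through the backend argument of gdal_job_run(), as the examples above show. A hedged sketch, for a job built as in the Quick Start, that prefers gdalraster when it is installed and otherwise falls back to the default processx backend:
# Pick a backend at runtime: use gdalraster if installed, else processx
backend <- if (requireNamespace("gdalraster", quietly = TRUE)) "gdalraster" else "processx"
gdal_job_run(job, backend = backend)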
Set your preferred backend globally:
options(gdalcli.backend = "gdalraster") # or "processx", "reticulate"
Execute multi-step workflows as a single GDAL pipeline:
pipeline <- gdal_raster_reproject(
input = system.file("extdata/sample_clay_content.tif", package = "gdalcli"),
dst_crs = "EPSG:32632"
) |>
gdal_raster_convert(output = tempfile(fileext = ".tif"))
gdal_job_run(pipeline, backend = "processx")
Persist pipelines as JSON for sharing and version control:
# Save pipeline to gdalcli pipeline format
workflow_file <- tempfile(fileext = ".gdalcli.json")
gdal_save_pipeline(pipeline, workflow_file)
# Load pipeline for later use
loaded <- gdal_load_pipeline(workflow_file)
Generate executable shell scripts from pipelines:
# Generate bash script
script <- render_shell_script(pipeline, format = "native", shell = "bash")
cat(script)
#> #!/bin/bash
#>
#> set -e
#>
#> # Native GDAL pipeline execution
#> gdal raster pipeline ! read /home/andrew/R/x86_64-pc-linux-gnu-library/4.5/gdalcli/extdata/sample_clay_content.tif ! reproject --dst-crs EPSG:32632 --output /vsimem/gdalcli_219d8135044bf9.tif ! scale --src-min 0 --src-max 100 --dst-min 0 --dst-max 255 ! write /tmp/RtmpTRllud/file219d81197727bb.tif --input /vsimem/gdalcli_219d814c156e4b.tif
gdalcli uses a three-layer architecture:
- Frontend Layer: Auto-generated R functions with composable modifiers
- Pipeline Layer: Automatic pipeline building and gdalcli pipeline format serialization
- Engine Layer: Command execution with multiple backend options
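As a rough illustration, reusing only functions shown above (the comments are interpretive, not an exact description of the internals), a single workflow touches all three layers:
# Frontend layer: lazily builds a job description; nothing runs yet
job <- gdal_raster_convert(
  input  = system.file("extdata/sample_clay_content.tif", package = "gdalcli"),
  output = tempfile(fileext = ".tif"),
  output_format = "COG"
) |>
  gdal_with_co("COMPRESS=LZW")  # composable modifier
# Pipeline layer: chained steps are serialized with gdal_save_pipeline()/gdal_load_pipeline()
# Engine layer: execution happens only here, on the chosen backend
gdal_job_run(job, backend = "processx")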
We welcome contributions! Please see CONTRIBUTING.md for guidelines.
MIT License - see LICENSE file for details