# Datacube Access Deployment Guide
Note: This Building Block is under active development. Some features may still be evolving, so we recommend using it with care as updates are rolled out.
The Datacube Access building block allows users to access and explore multi-dimensional Earth Observation (EO) data using standard APIs. It is built on open standards from OGC (Open Geospatial Consortium).
## Introduction
Datacube Access gives users simple ways to discover, access, and process large Earth Observation datasets, known as “datacubes.” These datacubes are structured, multi-dimensional sets of data, useful for various analytics and visualisation tasks.
## Prerequisites
| Component | Requirement | Documentation Link |
|---|---|---|
| Kubernetes | Cluster (tested on v1.32) | Installation Guide |
| Helm | Version 3.5 or newer | Installation Guide |
| kubectl | Configured for cluster access | Installation Guide |
| Ingress | Properly installed | Installation Guide |
| Cert Manager | Properly installed | Installation Guide |
| STAC Catalog | Properly installed | Deployment Guide |
Clone the Deployment Guide Repository:
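The commands below follow the standard EOEPCA deployment-guide layout; the GitHub URL is an assumption, so adjust it if your copy of the repository lives elsewhere.

```bash
# Clone the deployment-guide repository and move to the Datacube Access scripts.
git clone https://github.com/EOEPCA/deployment-guide.git
cd deployment-guide/scripts/datacube-access
```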
Validate your environment:
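A minimal environment check; the `check-prerequisites.sh` helper is an assumption about the deployment-guide scripts, while the remaining commands are generic.

```bash
# Assumed helper shipped with the deployment-guide; skip if not present.
bash check-prerequisites.sh

# Generic checks: tool versions and cluster reachability.
kubectl version
helm version
kubectl get nodes
```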
## Deployment Steps
- Run the Configuration Script (see the command sketch after the parameter list below)

Configuration Parameters. During script execution, provide:

- `INGRESS_HOST`: Domain for ingress hosts. Example: `example.com`
- `CLUSTER_ISSUER`: Cert-manager issuer for TLS certificates. Example: `letsencrypt-http01`
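A sketch of the configuration step; the script name is an assumption in line with the deployment-guide's per-building-block configuration scripts.

```bash
# Assumed script name; prompts for the parameters above and writes
# generated-values.yaml for the Helm deployment.
bash configure-datacube-access.sh
```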
- Deploy Datacube Access Using Helm
```bash
helm repo add eoepca-dev https://eoepca.github.io/helm-charts-dev
helm repo update eoepca-dev

helm upgrade -i datacube-access eoepca-dev/datacube-access \
  --values generated-values.yaml \
  --version 2.0.0-rc2 \
  --namespace datacube-access \
  --create-namespace
```
## Validation and Operation
### 1. Automated Validation
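A quick post-deployment check; the `validation.sh` helper is an assumption about the deployment-guide scripts, while the `kubectl` and `helm` commands apply to any deployment.

```bash
# Assumed helper script; adjust or skip if your copy does not include it.
bash validation.sh

# Generic checks: pods are Running/Ready and the Helm release is deployed.
kubectl get pods -n datacube-access
helm status datacube-access -n datacube-access
```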
### 2. Manual Validation via Web Browser
Verify the following endpoints using a web browser (example requests are sketched after this list):
- Landing/Home Page
- OpenAPI Documentation
- Collections Access
- Conformance Check
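The same endpoints can also be checked from the command line. The hostname below is an assumption based on a typical ingress naming scheme; substitute the value produced by your configuration script.

```bash
# Assumed hostname; replace with your own ingress host.
HOST="https://datacube-access.example.com"

curl -s "$HOST/"             # landing/home page
curl -s "$HOST/api"          # OpenAPI documentation (exact path may differ)
curl -s "$HOST/collections"  # collections access
curl -s "$HOST/conformance"  # conformance check
```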
### Collection Access Test
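A sketch of a single-collection request against the same host, using a placeholder collection id.

```bash
# List available collection ids, then fetch one of them by id.
curl -s "$HOST/collections" | jq '.collections[].id'
curl -s "$HOST/collections/<collection-id>" | jq .
```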
## Usage and Testing
The Datacube Access BB filters your STAC catalog to expose only collections that include the STAC Datacube Extension, specifically those with `cube:dimensions` or `cube:variables` defined. This ensures that processing tools such as openEO only see properly structured, analysis-ready collections.
### Understanding Datacube-Ready Collections
Standard STAC collections describe what data exists and where. Datacube-ready collections add structural metadata: dimensions (x, y, time, bands), coordinate reference systems, and dimension relationships. This metadata tells processing tools how to interpret and load the data as a multidimensional datacube.
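To see this structural metadata directly, the Datacube Extension fields can be pulled out of a collection document; the host and collection id below are placeholders as above.

```bash
# A datacube-ready collection exposes cube:dimensions (and optionally
# cube:variables) describing axes such as x, y, time and bands.
curl -s "$HOST/collections/<collection-id>" \
  | jq '{id: .id, dimensions: .["cube:dimensions"], variables: .["cube:variables"]}'
```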
### Loading a Test Collection
Add a sample datacube-ready collection to your STAC catalog. A script is provided in the `deployment-guide/scripts/datacube-access/collections/datacube-ready-collection/` directory. It is set up to work automatically with the eoapi component of the Data Access BB, but it can be adapted to other STAC catalogs, e.g. by issuing POST requests with the provided `collections.json` and `items.json` (see the sketch below).
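A sketch of the manual route for other STAC catalogs, assuming a STAC API with the Transactions extension enabled; the endpoint URL and collection id are placeholders, and `items.json` may need to be posted item by item depending on its structure.

```bash
# Target STAC API with transactions enabled (placeholder URL).
STAC_API="https://my-stac-api.example.com"
cd deployment-guide/scripts/datacube-access/collections/datacube-ready-collection/

# Register the collection, then add the sample item(s).
curl -X POST "$STAC_API/collections" \
  -H "Content-Type: application/json" \
  -d @collections.json

curl -X POST "$STAC_API/collections/<collection-id>/items" \
  -H "Content-Type: application/json" \
  -d @items.json
```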
View the collection at the `/collections` endpoint of your Datacube Access deployment.
### Testing with Processing Tools
A test script is provided to demonstrate loading the datacube using Python libraries like pystac-client and odc-stac. This script connects to the Datacube Access STAC API, searches for the datacube-ready collection, and loads it into an xarray datacube.
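A sketch of running that test; the script filename below is an assumption about the deployment-guide layout, while the listed Python dependencies come from the description above.

```bash
# Install the libraries the test script relies on.
pip install pystac-client odc-stac

# Assumed script location/name; adjust to match the deployment-guide repository.
python deployment-guide/scripts/datacube-access/test_datacube_loading.py
```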