.. _integrator_objectstorage:

Working with Object storage (like S3)
=====================================

Prepare files
-------------

We can prepare a GeoTIFF for the Cloud; see the `COG file format `_
and the `GDAL output driver options `_.
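
As a sketch, a Cloud Optimized GeoTIFF can be produced with GDAL's ``COG``
output driver (available since GDAL 3.1); the file names and the compression
option below are placeholders, not project values:

.. prompt:: bash

   # Convert a plain GeoTIFF into a Cloud Optimized GeoTIFF;
   # input.tif and output.tif are placeholder names.
   gdal_translate input.tif output.tif -of COG -co COMPRESS=DEFLATE
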

Generalities
------------

In this section, we explain how to use the S3-like storage from Exoscale.

First of all, you should set the following variables
on the ``geoportal``, ``config`` and ``qgisserver`` services:

* ``AWS_ACCESS_KEY_ID``: The project access key.
* ``AWS_SECRET_ACCESS_KEY``: The project secret key.
* ``AWS_DEFAULT_REGION=ch-dk-2``: The region used by Exoscale.
* ``AWS_S3_ENDPOINT=sos-ch-dk-2.exo.io``: The endpoint used by Exoscale.

For better performance, you should also set the following variables
on the ``geoportal`` and ``qgisserver`` services:

* ``CPL_VSIL_CURL_USE_CACHE=TRUE``
* ``CPL_VSIL_CURL_CACHE_SIZE=128000000``
* ``CPL_VSIL_CURL_USE_HEAD=FALSE``
* ``GDAL_DISABLE_READDIR_ON_OPEN=TRUE``
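
Assuming a standard ``docker-compose`` based project, these variables can be
declared in the service definitions, for example (all values shown are
placeholders):

.. code:: yaml

   # docker-compose.yaml excerpt (sketch); repeat the environment block
   # for the config and qgisserver services, as described above.
   services:
     geoportal:
       environment:
         AWS_ACCESS_KEY_ID: '<the project access key>'
         AWS_SECRET_ACCESS_KEY: '<the project secret key>'
         AWS_DEFAULT_REGION: ch-dk-2
         AWS_S3_ENDPOINT: sos-ch-dk-2.exo.io
         CPL_VSIL_CURL_USE_CACHE: 'TRUE'
         CPL_VSIL_CURL_CACHE_SIZE: '128000000'
         CPL_VSIL_CURL_USE_HEAD: 'FALSE'
         GDAL_DISABLE_READDIR_ON_OPEN: 'TRUE'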

Use the AWS CLI to list the files:

.. prompt:: bash

   aws --endpoint-url https://sos-ch-dk-2.exo.io/ --region ch-dk-2 \
       s3 ls s3:///
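
The same client can be used to upload the prepared files; the bucket name
``my-bucket`` and the object names below are hypothetical:

.. prompt:: bash

   # Upload a prepared GeoTIFF (bucket and key are placeholders)
   aws --endpoint-url https://sos-ch-dk-2.exo.io/ --region ch-dk-2 \
       s3 cp output.tif s3://my-bucket/raster/output.tif
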

Create the VRT file for a raster layer:

.. prompt:: bash

   docker-compose exec geoportal bash -c \
       'gdalbuildvrt /vsis3///index.vrt \
       $(list4vrt / .tif)'
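
To check that the generated VRT is readable through GDAL's ``/vsis3/``
handler (and that the credentials are correctly set), ``gdalinfo`` can be run
in the same container; the bucket and folder in the path are hypothetical:

.. prompt:: bash

   # Read the VRT back through /vsis3/ to verify access and paths
   # (my-bucket/raster are placeholder names).
   docker-compose exec geoportal \
       gdalinfo /vsis3/my-bucket/raster/index.vrt
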

MapServer
---------

Create the shape index file for a raster layer:

.. prompt:: bash

   docker-compose exec geoportal bash -c \
       'gdaltindex index.shp $(\
       aws --endpoint-url http://${AWS_S3_ENDPOINT} \
           --region ${AWS_DEFAULT_REGION} \
           s3 ls s3://// | \
       grep tif$ | \
       awk '"'"'{print "/vsis3///"$4}'"'"' \
       )'
   docker cp _geoportal_1:/app/index.shp mapserver/
   docker cp _geoportal_1:/app/index.shx mapserver/
   docker cp _geoportal_1:/app/index.dbf mapserver/
   docker cp _geoportal_1:/app/index.prj mapserver/

Add the following configuration in the ``mapserver/mapserver.map.tmpl`` file:

.. code::

   CONFIG "CPL_VSIL_CURL_USE_CACHE" "TRUE"
   CONFIG "CPL_VSIL_CURL_CACHE_SIZE" "128000000"
   CONFIG "CPL_VSIL_CURL_USE_HEAD" "FALSE"
   CONFIG "GDAL_DISABLE_READDIR_ON_OPEN" "TRUE"
   CONFIG "AWS_ACCESS_KEY_ID" "${AWS_ACCESS_KEY_ID}"
   CONFIG "AWS_SECRET_ACCESS_KEY" "${AWS_SECRET_ACCESS_KEY}"
   CONFIG "AWS_DEFAULT_REGION" "${AWS_DEFAULT_REGION}"
   CONFIG "AWS_S3_ENDPOINT" "${AWS_S3_ENDPOINT}"

Use the shape index in the layer:

.. code::

   TYPE RASTER
   STATUS ON
   PROCESSING "RESAMPLE=AVERAGE"
   CONNECTIONTYPE OGR
   TILEINDEX "index.shp"
   TILEITEM "LOCATION"

Add a vector layer for the object storage:

.. code::

   CONNECTIONTYPE OGR
   CONNECTION "/vsis3///.shp"
   DATA ""

`Some more information `_

QGIS client
-----------

Open the Settings, then Options, and define the following environment variables:

* ``AWS_ACCESS_KEY_ID``: The project access key.
* ``AWS_SECRET_ACCESS_KEY``: The project secret key.
* ``AWS_DEFAULT_REGION=ch-dk-2``: The region used by Exoscale.
* ``AWS_S3_ENDPOINT=sos-ch-dk-2.exo.io``: The endpoint used by Exoscale.

On Windows, also add:

* ``GDAL_HTTP_UNSAFESSL=YES``

Then you can add a raster layer with:

* Open the Data Source Manager,
* Raster,
* Protocol: HTTP(S), cloud, etc.,
* Type: AWS S3,
* Bucket or container:
* Object key: /index.vrt

You can add a vector layer in an analogous manner.