
I have a raster file on a google cloud storage bucket and I want to open it with GDAL. I am trying this:

from osgeo import gdal
from google.cloud import storage
from google.cloud import client

# ensure file exists
name = '1.tif'
storage_client = storage.Client()
bucket_name = 'my-bucket'
bucket = storage_client.bucket(bucket_name)
stats = storage.Blob(bucket=bucket, name=name).exists(storage_client)
print(stats)

This returns True, and the full path is gs://my-bucket/1.tif.

But when I do this:

gdal.Open('gs://my-bucket/1.tif')

it returns None.

I can open the file just fine when I read it from my Google Drive, so this seems to be an issue only with Google Cloud Storage.

Stefano Potter
  • Have you already configured the AWS credentials in your environment (accesskeyid, secretaccesskey, region, etc.)? This thread may help you: https://gis.stackexchange.com/questions/201831/how-to-efficiently-access-files-with-gdal-from-an-s3-bucket-using-vsis3 – Kartograaf Apr 07 '22 at 18:50
  • 1
    It is google cloud storage, not AWS – Stefano Potter Apr 07 '22 at 18:52

1 Answer


Google Cloud Storage buckets are exposed through GDAL's /vsigs/ virtual filesystem, so you'll want to update the file path to:

gdal.Open('/vsigs/my-bucket/1.tif')

But as @Kartograaf mentions, you may also need to set some additional configuration parameters if your bucket requires authentication.
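For example, a minimal sketch of setting those options before opening the file (the service-account key path is hypothetical; GOOGLE_APPLICATION_CREDENTIALS and GS_NO_SIGN_REQUEST are configuration options documented for /vsigs/):

from osgeo import gdal

# Authenticate with a service-account key file (path is just an example)
gdal.SetConfigOption('GOOGLE_APPLICATION_CREDENTIALS', '/path/to/service-account.json')

# Or, for a public bucket, send unsigned requests instead
# gdal.SetConfigOption('GS_NO_SIGN_REQUEST', 'YES')

ds = gdal.Open('/vsigs/my-bucket/1.tif')
print(ds is not None)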

Alternatively, rasterio appears to handle the gs:// notation.
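A quick sketch, assuming your environment is already authenticated (e.g. GOOGLE_APPLICATION_CREDENTIALS is set):

import rasterio

# rasterio hands the gs:// URL to GDAL, which maps it to /vsigs/ internally
with rasterio.open('gs://my-bucket/1.tif') as src:
    print(src.profile)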

James
    Thanks, @James. In case it's useful to others, I found the configuration parameters for authentication here: https://gdal.org/user/virtual_file_systems.html#vsigs-google-cloud-storage-files – Leila Hadj-Chikh Sep 19 '22 at 15:19