
How To Upload a Folder to Google Cloud Storage Using the Python API

I have successfully uploaded a single text file to Google Cloud Storage. But when I try to upload a whole folder, it gives a permission denied error. filename = 'd:/foldername' #here t

Solution 1:

This works for me. It copies all content from a local directory to a specific bucket-name/full-path (recursively) in Google Cloud Storage:

import glob
import os

from google.cloud import storage

def upload_local_directory_to_gcs(local_path, bucket, gcs_path):
    """Recursively upload everything under local_path to gcs_path in the bucket."""
    assert os.path.isdir(local_path)
    for local_file in glob.glob(local_path + '/**'):
        if not os.path.isfile(local_file):
            # Recurse into subdirectories, mirroring them under gcs_path.
            upload_local_directory_to_gcs(local_file, bucket, gcs_path + "/" + os.path.basename(local_file))
        else:
            # Join with "/" explicitly: GCS object names always use forward
            # slashes, while os.path.join would produce "\" on Windows.
            remote_path = gcs_path + "/" + local_file[1 + len(local_path):]
            blob = bucket.blob(remote_path)
            blob.upload_from_filename(local_file)

# Example usage; the bucket name and paths are placeholders:
bucket = storage.Client().bucket('my-bucket')
upload_local_directory_to_gcs('d:/foldername', bucket, 'BUCKET_FOLDER_DIR')

Solution 2:

A folder is a cataloging structure containing references to files and directories. The library will not accept a folder as an argument.

As far as I understand, your use case is to upload to GCS while preserving the local folder structure. To accomplish that you can use the os Python module and write a recursive function (e.g. process_folder) that takes a path as an argument. The logic, sketched in code after the lists below, can be:

  1. Use the os.listdir() method to get a list of entries within the source path (it returns both files and folders).
  2. Iterate over the list from step 1 to separate files from folders via the os.path.isdir() method.
  3. Iterate over the files and upload each with an adjusted path (e.g. path + "/" + file_name).
  4. Iterate over the folders, making a recursive call (e.g. process_folder(path + "/" + folder_name)).

It’ll be necessary to work with two paths:

  1. The real system path (e.g. "/Users/User/…/upload_folder/folder_name"), used with the os module.
  2. The virtual path for GCS file uploads (e.g. "upload" + "/" + folder_name + "/" + file_name).
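
A minimal sketch of that recursion, assuming the google-cloud-storage client; the bucket name, local path, and "upload" prefix are placeholders:

import os

from google.cloud import storage

bucket = storage.Client().bucket('my-bucket')  # placeholder bucket name

def process_folder(real_path, gcs_path):
    """Mirror real_path (real system path) under gcs_path (virtual GCS path)."""
    for name in os.listdir(real_path):                 # step 1: list files and folders
        full_path = os.path.join(real_path, name)
        if os.path.isdir(full_path):                   # steps 2 and 4: recurse into folders
            process_folder(full_path, gcs_path + "/" + name)
        else:                                          # step 3: upload with the adjusted path
            bucket.blob(gcs_path + "/" + name).upload_from_filename(full_path)

process_folder("/Users/User/upload_folder", "upload")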

Don't forget to implement the exponential backoff described at [1] to deal with 500 errors; a sketch follows the references below. You can use the Drive SDK example at [2] as a reference.

[1] https://developers.google.com/storage/docs/json_api/v1/how-tos/upload#exp-backoff
[2] https://developers.google.com/drive/web/handle-errors
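
A minimal sketch of that backoff pattern; upload_with_backoff is a hypothetical helper, and production code should catch only transient 5xx errors rather than bare Exception:

import random
import time

def upload_with_backoff(blob, local_file, max_retries=5):
    """Retry an upload with exponential backoff plus random jitter."""
    for attempt in range(max_retries):
        try:
            blob.upload_from_filename(local_file)
            return
        except Exception:  # narrow this to transient server errors in real code
            if attempt == max_retries - 1:
                raise
            # Wait 2^attempt seconds plus jitter before the next attempt.
            time.sleep(2 ** attempt + random.random())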

Solution 3:

A version without a recursive function that also handles top-level files (unlike the top answer):

import glob
import os

from google.cloud import storage

GCS_CLIENT = storage.Client()

def upload_from_directory(directory_path: str, dest_bucket_name: str, dest_blob_name: str):
    # recursive=True makes '**' match files at every depth, including the top level.
    rel_paths = glob.glob(directory_path + '/**', recursive=True)
    bucket = GCS_CLIENT.get_bucket(dest_bucket_name)
    for local_file in rel_paths:
        # Drop the first path component and rejoin with '/' for the object name.
        remote_path = f'{dest_blob_name}/{"/".join(local_file.split(os.sep)[1:])}'
        if os.path.isfile(local_file):
            blob = bucket.blob(remote_path)
            blob.upload_from_filename(local_file)

Solution 4:

I assume the bare filename = "D:\foldername" is not enough information about the source code. Nor am I sure this is even possible; via the web interface you can likewise only upload files, or create folders and then upload files into them.

You could save the folder's name, then create it (I've never used Google App Engine, but I guess that should be possible) and then upload the contents to the new folder.
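
For what it's worth, GCS has no real folders; prefixes in object names play that role. A minimal sketch of the idea, with a placeholder bucket name and paths:

from google.cloud import storage

# "Folders" in GCS are just prefixes in object names; uploading to
# 'foldername/file.txt' makes the folder appear in the console UI.
bucket = storage.Client().bucket('my-bucket')
blob = bucket.blob('foldername/file.txt')
blob.upload_from_filename('D:/foldername/file.txt')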

Solution 5:

Refer to https://hackersandslackers.com/manage-files-in-google-cloud-storage-with-python/

from os import listdir
from os.path import isfile, join

...

def upload_files(bucketName):
    """Upload files to GCP bucket."""
    # localFolder, bucketFolder, and bucket are module-level values set up
    # earlier in the linked article.
    files = [f for f in listdir(localFolder) if isfile(join(localFolder, f))]
    for file in files:
        localFile = localFolder + file
        blob = bucket.blob(bucketFolder + file)
        blob.upload_from_filename(localFile)
    return f'Uploaded {files} to "{bucketName}" bucket.'
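
The snippet relies on globals defined earlier in the linked article; a minimal setup along these lines (all names and paths are placeholders) makes it runnable:

from google.cloud import storage

localFolder = '/tmp/uploads/'     # trailing slash matters: the function concatenates paths
bucketName = 'my-bucket'
bucketFolder = 'remote-folder/'
bucket = storage.Client().get_bucket(bucketName)

print(upload_files(bucketName))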
