Closed: ionuttiplea closed this 6 days ago
We will fix this.

Fixed by adding `from src.shared.common_fn import delete_uploaded_local_file` and using `delete_uploaded_local_file(merged_file_path, file_name)` instead of `delete_file_from_gcs(BUCKET_UPLOAD, file_name)`.
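A minimal sketch of what the described fix might look like. `delete_uploaded_local_file`, `merged_file_path`, and `file_name` are names taken from the thread; the body shown here is an assumption, since the real helper lives in `src.shared.common_fn` and may differ:

```python
import os
import tempfile

def delete_uploaded_local_file(merged_file_path, file_name):
    """Delete the locally stored upload instead of calling out to GCS.

    Sketch of the helper named in the thread; the real implementation
    in src.shared.common_fn may behave differently.
    """
    if os.path.exists(merged_file_path):
        os.remove(merged_file_path)
        print(f"Deleted local file {file_name}")
    else:
        print(f"Local file {file_name} not found, nothing to delete")

# Usage: create a throwaway upload and delete it locally.
tmp_dir = tempfile.mkdtemp()
merged_file_path = os.path.join(tmp_dir, "example.pdf")
open(merged_file_path, "w").close()
delete_uploaded_local_file(merged_file_path, "example.pdf")
```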
Hi @punksta, you can set the `GCS_FILE_CACHE = False` variable in the backend env to process the files locally. If it is true, the files will be stored in Google Cloud Storage.
It looks like deleting files doesn't check this flag.
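The missing flag check described above could look something like this. `GCS_FILE_CACHE`, `delete_file_from_gcs`, `delete_uploaded_local_file`, and `BUCKET_UPLOAD` are names from the thread; the `delete_file` wrapper and the stub bodies are assumptions for illustration, not the project's actual code:

```python
calls = []

# Stubs standing in for the project's real helpers (names from the thread),
# recording which deletion path was taken.
def delete_file_from_gcs(bucket, file_name):
    calls.append(("gcs", bucket, file_name))

def delete_uploaded_local_file(merged_file_path, file_name):
    calls.append(("local", merged_file_path, file_name))

def delete_file(gcs_file_cache, bucket, file_name, merged_file_path):
    """Route deletion by the same flag that routes storage:
    GCS when GCS_FILE_CACHE is true, local disk otherwise."""
    if str(gcs_file_cache).lower() == "true":
        delete_file_from_gcs(bucket, file_name)
    else:
        delete_uploaded_local_file(merged_file_path, file_name)

# With GCS_FILE_CACHE=False the delete must stay local
# instead of failing against an unconfigured bucket.
delete_file("False", "BUCKET_UPLOAD", "doc.pdf", "/tmp/doc.pdf")
```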
@punksta Please check the latest code.
> Hi @punksta you can use GCS_FILE_CACHE = False variable in the backend env to process the files locally. If it is true the files will be stored in the google cloud storage.
I did try that yesterday; it didn't help.
A temporary solution was commenting out the function call `delete_file_from_gcs(BUCKET_UPLOAD, file_name)` in `backend/src/graphDB_dataAccess.py`.
> @punksta Please check the latest code.
I will, since another condition was added in yesterday's update. It's cool how fast this was addressed!
Please pull the latest code from Git; it will look like the screenshot below.
It is fixed now, many thanks to everyone involved :100:
I don't know the specifics, as I most likely still lack an understanding of the project, but when I try to delete a file I uploaded, this is the error in Docker:
Maybe GCS configuration is mandatory? Is the repository documentation up to date?
Or is this an issue that needs to be addressed?