Hi, I have added what you need. You can take a look at the v0.0.6 version. I also enhanced the way of authentication; you can take a look at the README.md.
```ts
import * as fs from "fs";
import {google} from "googleapis";

// assumes tsGoogleDrive is an authenticated TsGoogleDrive instance (see README.md)
async function uploadAndDownload() {
    const folderId = "";
    const filename = "./icon.png";
    const newFile = await tsGoogleDrive.upload(filename, {parent: folderId});
    const downloadBuffer = await newFile.download();

    // or, if you want a stream, you have to use the "googleapis" package
    const drive = google.drive({version: "v3", auth: newFile.client});
    const file = await drive.files.get(
        {fileId: newFile.id, alt: "media"},
        {responseType: "stream"}
    );

    file.data.on("data", data => {
        // stream data
    });
    file.data.on("end", () => {
        // stream end
    });

    // or use pipe
    const writeStream = fs.createWriteStream("./output.png");
    file.data.pipe(writeStream);
}
```
Hi, mate! Thanks a lot for your time! By using both TsGoogleDrive (for instance, to check for duplicate files) and googleapis (to upload huge files in stream mode), my application is going to authenticate with the Google Drive account twice, right?
Once you have got the file, the "client" variable is an authenticated OAuth object, so it won't do any extra authentication.
Perfect! This should work fine for me... In my project, the files are uploaded in a separate process. But prior to the download, I perform a query, looking for the file I need to download. When I find it, I get its client to download it using the Google API, as you showed. I'll make this change in my project maybe tomorrow, but I'll come back to post a working snippet. Maybe it can be helpful to someone else... Thank you very much! 👊
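For anyone who finds this thread later, a minimal sketch of what that snippet could look like. The import path and the query-builder methods (`query()`, `setFolderId()`, `setNameEqual()`, `run()`) are assumptions, not confirmed API (check the README for the real lookup methods); only `file.id` and `file.client` come from this thread. The googleapis streaming part mirrors the earlier example:

```ts
import * as fs from "fs";
import {google} from "googleapis";
import {TsGoogleDrive} from "ts-google-drive"; // assumed import path

// Hypothetical sketch: query for a previously uploaded file, then reuse its
// already-authenticated client to stream the download with googleapis.
async function findAndStreamDownload(tsGoogleDrive: TsGoogleDrive, folderId: string, name: string) {
    const files = await tsGoogleDrive.query() // hypothetical query-builder calls
        .setFolderId(folderId)
        .setNameEqual(name)
        .run();
    if (!files.length) {
        throw new Error(`File not found: ${name}`);
    }

    const file = files[0];
    // file.client is already authenticated, so no second authentication happens
    const drive = google.drive({version: "v3", auth: file.client});
    const res = await drive.files.get(
        {fileId: file.id, alt: "media"},
        {responseType: "stream"}
    );

    // pipe the download to disk and resolve only when the write completes
    await new Promise<void>((resolve, reject) => {
        const out = fs.createWriteStream(`./${name}`);
        res.data.on("error", reject);
        out.on("error", reject);
        out.on("finish", resolve);
        res.data.pipe(out);
    });
}
```

The Promise wrapper matters for huge files: `pipe()` returns immediately, so without waiting for the write stream's `finish` event the process could move on (or exit) before the file is fully flushed to disk.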
Hi, mate! I need to download huge files from Google Drive; however, the download() method returns a Buffer instead of a stream that I could pipe chunk by chunk through my write stream. Is there a way to handle huge files without loading the whole Buffer into memory? BTW, nice job you have been doing on this project. If you need some help, I'll be glad to contribute.