vantezzen closed this issue 2 years ago
Ah, just now seeing that #47 is already fixing this issue
Awesome, I saw that too. That PR solved your issue, right? I'll have that fix merged soon.
Yes, that fixes the issue! Although it might still be good to wrap the file(); unlink(); calls in a try-finally
try {
    // Read the report; this may throw if the file is missing or unreadable
    $data = file($filePath);
} finally {
    // Runs even when file() throws, so the downloaded file is always removed
    unlink($filePath);
}
to make sure the file gets deleted even if something throws, so no stale files are left behind.
I disagree, I would personally want the package to throw an exception if something is wrong so it can be attended to immediately rather than an end-user having to report missing data days after Microsoft decides to change stuff again.
> I disagree, I would personally want the package to throw an exception
@SamuelCompary The try-finally would still throw the exception to your code like it currently does, as there is no catch in there. I fully agree that this should throw an exception.
Currently, if the file() call throws an exception, the unlink() is never reached, so the zip file remains on disk. As a result, we ended up with several hundred report zip files on our server, which filled up the disk space.
With a try-finally clause, if file() throws an exception, the file is cleanly removed before the exception is propagated to your code. (https://softwareengineering.stackexchange.com/a/131400)
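A quick standalone illustration of that behaviour (the thrown message and echo output are just for this example; none of it is package code):
try {
    try {
        // Simulate file() failing on a missing report
        throw new RuntimeException('simulating file() failing');
    } finally {
        // The cleanup (unlink() in the real code) still runs first
        echo "finally ran, cleanup done\n";
    }
} catch (RuntimeException $e) {
    // ...and the exception still reaches the caller afterwards
    echo 'caller still receives: ' . $e->getMessage() . "\n";
}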
Ah, my bad :) Added it to the PR now
I believe this has been fixed in the new v1.5 now. Thanks for the support, guys!
The Bing Report downloader (https://github.com/tmarois/laravel-ads-sdk/blob/master/src/Services/BingAds/ReportDownload.php) always assumes that the report will be available at "[Report ID].csv" after unzipping.
In the last few weeks, MS seems to have changed something in their system, though, which results in the Report ID being in a format like "15012345678_161234567890" while the actual report CSV is named "15012345678.csv" after unzipping (i.e. only the part before the underscore). Consequently, the Report Downloader fails with "File not found" when attempting to read the file.
Currently, I am solving this issue locally by using a patched downloader like this:
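Roughly, the patch resolves the CSV name instead of assuming "[Report ID].csv". A minimal sketch (the helper name, parameters, and fallback order are illustrative, not the package's actual method names): it tries the historical "[Report ID].csv" first, then the part of the Report ID before the underscore, and as a last resort the first CSV in the extraction directory.
// Sketch of a patched CSV lookup; only standard PHP functions are used.
function resolveReportCsv(string $extractDir, string $reportId): string
{
    // Historical naming scheme: "[Report ID].csv"
    $candidate = $extractDir . '/' . $reportId . '.csv';
    if (is_file($candidate)) {
        return $candidate;
    }

    // New scheme: the Report ID looks like "15012345678_161234567890",
    // but the extracted CSV is named after the part before the underscore.
    $prefix = explode('_', $reportId)[0];
    $candidate = $extractDir . '/' . $prefix . '.csv';
    if (is_file($candidate)) {
        return $candidate;
    }

    // Last resort: take the first CSV found in the extraction directory.
    $matches = glob($extractDir . '/*.csv');
    if (!empty($matches)) {
        return $matches[0];
    }

    throw new RuntimeException("Report CSV not found in {$extractDir}");
}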
Can others confirm this issue?