Open mustafakemalsigirci opened 9 months ago
@mustafakemalsigirci are you getting any error in the console when running either command? Could you set the log level to DEBUG and share the latest logs in the HOME/.azcopy directory?
Hi @gapra-msft, thank you for your response. There is no error in the console log. The log files are here:
238addd9-5d08-3346-78c5-4ee2044fb899.log
238addd9-5d08-3346-78c5-4ee2044fb899-chunks.log
238addd9-5d08-3346-78c5-4ee2044fb899-scanning.log
It seems like you are running an old version of AzCopy, is this reproducible in the latest as well?
Yes, you can reproduce it in version 10.20.1 as well.
I am unable to reproduce the exact issue you are seeing on the latest main branch of AzCopy; I will try the exact version you are using next. I have a 12MB file in a storage account, and when I try to download it with the two commands you shared, I see AzCopy fail with "000 : request size greater than pacer target. When Enqueuing chunk." X-Ms-Request-Id:
Update: I get the same error when I run it on 10.20.1
Is there any other process that could be interfering with AzCopy?
There is no other process; I just run azcopy. You are right. Update: in version 10.16.0, the copy operation gets stuck; in version 10.20.1, it fails with "000 : request size greater than pacer target. When Enqueuing chunk."
Hi @mustafakemalsigirci I believe the issue here is that --cap-mbps is in megaBITS per second and --block-size-mb is in megaBYTES. This is also documented in the AzCopy CLI help. If you adjust your --cap-mbps parameter and convert it to megabits per second, it should resolve your issue. Closing this for now, as this is the resolution. If you are still experiencing the issue after following this advice, please feel free to reopen.
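To make the unit mismatch concrete, here is a small shell sketch; the helper name is mine, not part of AzCopy, and the azcopy invocation at the end is only illustrative:

```shell
# --cap-mbps takes megaBITS per second, while --block-size-mb takes megaBYTES.
# 1 byte = 8 bits, so a throughput cap of 4 MB/s must be passed as 32 Mbps.
# Hypothetical helper, for illustration only:
mb_per_s_to_mbps() {
  echo $(( $1 * 8 ))
}

cap_mbps=$(mb_per_s_to_mbps 4)
echo "$cap_mbps"    # prints 32
# azcopy copy "<src>" "<dst>" --cap-mbps "$cap_mbps" --block-size-mb 4
```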
I don't want to use the block size parameter; that was a workaround. Why does the copy fail when --cap-mbps is set? Could you solve the problem?
@mustafakemalsigirci ah, I see. This will take a little more investigation, but I believe it could be because AzCopy computes a block size without considering the --cap-mbps parameter. Reopened to continue the investigation.
Thank you very much!
Hi @mustafakemalsigirci the fix above isn't the technically correct way to resolve this issue. The correct way is more complicated, and we will have to triage this for a future release.
Curiously, I'm getting "request size greater than pacer target. When Enqueuing chunk." when using --cap-mbps 32 --block-size-mb 4, as well as --cap-mbps 33 --block-size-mb 4. So the workaround of using a block size 1/8 of the cap doesn't always work.
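One way to read that failure, assuming the pacer's per-second budget is simply cap_mbps / 8 megabytes (an assumption about AzCopy's internals, not taken from its code):

```shell
# Assumed model: the pacer allows cap_mbps / 8 megabytes per second.
cap_mbps=32
block_size_mb=4
budget_mb=$(( cap_mbps / 8 ))
echo "budget=${budget_mb}MB block=${block_size_mb}MB"
# budget=4MB block=4MB: a 4 MB chunk is not strictly below the 4 MB budget,
# which would explain why the 1/8 rule fails right at the boundary and only
# helps when the block size is strictly below cap_mbps / 8.
```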
Still failing for me with the latest AzCopy and Storage Explorer. It would be nice if I could download files without slowing down my coworkers.
On Linux, I was able to avoid the download errors by imposing the rate limit on the OS side instead:
# modprobe ifb
# ip link set dev ifb0 up
# modprobe act_mirred
# tc qdisc add dev eth0 ingress handle ffff:
# tc filter add dev eth0 parent ffff: protocol ip u32 match u32 0 0 flowid 1:1 action mirred egress redirect dev ifb0
# tc qdisc add dev ifb0 root handle 1: htb default 10
# tc class add dev ifb0 parent 1:1 classid 1:10 htb rate 10mbit
(run the azcopy command here, then tear down the shaping rules:)
# tc qdisc del dev eth0 ingress handle ffff:
# tc qdisc del dev ifb0 root handle 1: htb
# rmmod ifb
Which version of AzCopy was used?
10.16.0 and 10.20.1
Which platform are you using? (ex: Windows, Mac, Linux)
Windows
What command did you run?
azcopy copy "https://***.blob.core.windows.net/****.pdf****" "C:\test1.pdf" --cap-mbps 10 --log-level INFO --output-type text
azcopy copy "https://***.blob.core.windows.net/****.pdf****" "C:\test1.pdf" --cap-mbps 10 --log-level INFO --output-type text --block-size-mb 3
What problem was encountered?
The file is 13MB. When I try to download it, the transfer gets stuck: there is no result such as failed or completed.
How can we reproduce the problem in the simplest way?
Try to download a file bigger than 10 MB with --cap-mbps smaller than 20.
Have you found a mitigation/solution?
If the --cap-mbps parameter is at least 4 times the --block-size-mb parameter, the download completes.
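The rule of thumb above can be sketched as a quick check; the cap_ok helper is hypothetical, purely for illustration, and is not part of AzCopy:

```shell
# Reporter's observed rule: the download completes when
# --cap-mbps >= 4 * --block-size-mb.
cap_ok() {
  local cap_mbps=$1 block_mb=$2
  [ "$cap_mbps" -ge $(( 4 * block_mb )) ] && echo yes || echo no
}

cap_ok 10 3   # 10 < 12 -> no  (matches the stuck download reported above)
cap_ok 12 3   # 12 >= 12 -> yes
```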