caiolopesdasilva opened 8 months ago
How many files are you trying to back up in this job? Could you give me the number of files so I can reproduce this bug?
It's a known issue that if a job contains too many files, the web GUI can get very slow to load. But I didn't know it could be this extreme.
It's a whole job from a company's storage server. In some cases a job can be as big as a whole 13 TB tape; this is one of the "small ones" for us:
1.48 TB, 252,101 files, 5,155 folders
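Anyone wanting to reproduce this with a similarly sized tree can get equivalent counts like this (the job path is just a placeholder):

```sh
# Placeholder path; substitute the job's actual directory.
JOB_DIR=/srv/backup/job1

find "$JOB_DIR" -type f | wc -l   # files   (~252,101 here)
find "$JOB_DIR" -type d | wc -l   # folders (~5,155 here)
du -sh "$JOB_DIR"                 # total size (~1.48 TB)
```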
When I created a mount point to the tape, formatted in LTFS, I used a simple rsync -auvl --info=progress from this job to the tape. It took its time, but it went well. So yeah, it's something about YATM that can't handle such a large number of files.
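Roughly the manual workaround, for anyone else hitting this (the device node, mount point, and paths are examples; yours will differ):

```sh
# Example device node and mount point; adjust for your system.
sudo mkdir -p /mnt/ltfs
sudo ltfs -o devname=/dev/sg3 /mnt/ltfs    # mount the LTFS-formatted tape

# -a preserves attributes, -u skips files already up to date on tape,
# -v is verbose, -l copies symlinks as symlinks
rsync -auvl --info=progress /srv/backup/job1/ /mnt/ltfs/job1/

sudo umount /mnt/ltfs                      # flushes the LTFS index and unmounts
```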
If there's anything I can do to help, let me know. I will be out for the next three weeks, but after that I'll be around. YATM is a very interesting tool for us.
Thanks for the information. I will improve this in the future.
Hi everyone, I'm trying to use LTFS and YATM with an ULTRIUM-HH8 tape drive for the first time, and in my tests everything has been going well.
However, I have an issue: when I add directories with multiple TBs of data to be loaded onto an LTO-8 tape, upon selecting a tape and adding the barcode, my browser starts to consume all of my RAM and CPU, the page crashes, and in the logs YATM gets stuck mounting the tape.
I have to keep restarting the service and trying to delete the job multiple times until I can access the web GUI again. In my smaller tests of 50 GB single files being sent to tape, it works OK.
This is running on an actual Dell PowerEdge 7415 server with CentOS 7.9:
1x AMD EPYC 7401P, 128 GB RAM, 256 GB OS SSD
So I find it hard to believe this is due to hardware limitations.
Does anyone have a clue what it could be? When I issue LTFS commands and send data to the tape via the CLI, everything works fine.
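For what it's worth, the CLI path that works fine looks roughly like this (device path and file names are examples):

```sh
# Example device path; check yours with lsscsi -g.
sudo mkltfs -d /dev/sg3 -s TAPE01          # format the cartridge as LTFS
sudo mkdir -p /mnt/ltfs
sudo ltfs -o devname=/dev/sg3 /mnt/ltfs    # mount it
cp /data/test-50g.bin /mnt/ltfs/           # a single large file writes without issues
sudo umount /mnt/ltfs
```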
Any help would be appreciated.
Here are some logs: