Closed: Beep6581 closed this issue 8 years ago
Which behavior do you suggest?
I think that we don't expect anybody other than you to reach the maximum number of tries
:), so it should be pretty safe to raise it up to 500. :)
Reported by natureh.510
on 2013-02-25 00:42:50
PatchSubmitted
Ermm... maybe we would like to avoid the crash? :)
Reported by natureh.510
on 2013-02-25 00:47:54
I can work around it even without a change, as I know what to do when it crashes.
Reported by heckflosse@i-weyrich.de
on 2013-02-25 10:33:41
My question was "what should be the expected behavior if the maximum number of tries
has been reached?". Cancelling the save process?
Reported by natureh.510
on 2013-02-25 14:08:43
Sounds good to me :-)
Reported by heckflosse@i-weyrich.de
on 2013-02-25 15:03:28
Why not just loop: while the filename exists, increment tries and try again? Why should
there be an upper limit?
Reported by entertheyoni
on 2013-02-25 15:10:13
I like this idea very much! The patch would be so simple then :D
Reported by natureh.510
on 2013-02-25 15:13:17
I'm thinking of unlikely but possible situations where someone processes thousands of
images through RT, such as video frames, and relies on RT's filename incrementation
instead of doing the incrementation himself. There is at least one program that processes
thousands of photos through RT - DeSERt (for making timelapse videos; it uses RT to
smooth out and control the exposure etc.). It doesn't rely on RT incrementing the filename,
it does that itself, but there might be other programs out there that we don't know about
that do. If this is a 1 minute fix, why not remove the upper limit altogether?
Reported by entertheyoni
on 2013-02-25 15:16:26
Re #7: In fact, I don't like infinite loops, but it might be safe in this case.
Here is the patch.
Reported by natureh.510
on 2013-03-07 22:21:52
:-) programmers don't like infinite loops. I'm with you, Hombre!
Reported by heckflosse@i-weyrich.de
on 2013-03-07 22:53:31
Electronics technicians know nothing continues perpetually ;]
Reported by entertheyoni
on 2013-03-07 23:02:05
;-) Nothing lasts longer than a good workaround!
Reported by heckflosse@i-weyrich.de
on 2013-03-07 23:09:57
I'm a bit doubtful that anyone would really have a directory with as many as 1000 images
in it. But maybe Linux can handle that without showing any slowdown (unlike other systems ;)).
This is a low priority issue, but first we have to make a choice between:
- raising the number of tries to e.g. 1000 and opening an alert window when the 1000th
try is reached, asking for a "Continue" or "Cancel" choice. On the command line, there
would be no choice; it would skip the file and print an alert.
- making an infinite loop (but I don't like this option, even if it is simpler to do).
Iterating over more than 1000 files on each file save can be time consuming,
but I can only guess, I have never faced this situation :)
- ... any other option?
Reported by natureh.510
on 2013-03-09 00:07:40
I would like to close this one for the following reasons:
1.) It doesn't crash anymore when reaching the 100-files limit.
2.) It doesn't write a file when reaching the 100-files limit (which is great for testing,
when no output is needed ;-)).
3.) No normal user will reach the 100-files limit, I think...
Ingo
Reported by heckflosse@i-weyrich.de
on 2013-10-23 22:11:25
FixedPendingConfirmation
Reported by heckflosse@i-weyrich.de
on 2015-05-27 22:21:24
Fixed
Originally reported on Google Code with ID 1683
Reported by
heckflosse@i-weyrich.de
on 2013-01-14 17:49:15