I had an idea of a mode where saving (Shift-F2) would not overwrite the file, but would create incremental files instead: filename001.gif, filename002.gif, etc. This idea would push it one step further, by making the save automatic instead of voluntary (manual).
Note that you had better have a quick computer and a fast HDD, because the save will happen in the middle of your drawing.
Original comment by yrizoud
on 3 Feb 2009 at 1:35
The report suggests doing it on mouse release.
Also, we can just save multiple undo pages at a time, no need to save them in real time. I wouldn't mind a save every 100 clicks and a flush when exiting the program :)
Original comment by pulkoma...@gmail.com
on 3 Feb 2009 at 2:47
Saving several pages at a time would make it less frequent, but it multiplies the lag as well.
By the way, I think 2draw.net (online drawing, artist-like tools) has recording/playback ability, detailed down to every brush movement.
Original comment by yrizoud
on 3 Feb 2009 at 3:59
Original comment by pulkoma...@gmail.com
on 26 May 2009 at 7:51
Hi,
Colors for the Nintendo DS has such a recording function: it stores the drawing data in an extra .drw file.
Here is a video that shows the playback of such a Colors painting:
http://vids.myspace.com/index.cfm?fuseaction=vids.individual&videoid=38049555
And here is the website with amazing examples:
http://colors.collectingsmiles.com/
An autosave that saves the picture every 5 minutes or so, if something was changed in that time, would be great. I know how it is when you draw and draw and draw: you forget the time, and then you have drawn for an hour without saving, and something unforeseen happens (power loss or system crash), and you have painted an hour just for your memories :P
... greetings HoraK-FDF
Original comment by HoraK-...@web.de
on 1 Sep 2009 at 8:50
Original comment by pulkoma...@gmail.com
on 2 Sep 2009 at 9:41
Just an idea... Perhaps we could have a way to save all the steps stored in the undo buffer. Optionally, you could choose to save only every third step, etc.
The end result would be pretty much the same, only the implementation is different.
Original comment by ilija.melentijevic
on 3 Sep 2009 at 7:15
The plan is to use the backup list, actually. Either when you ask for it with a keyboard shortcut, automatically, or on request via a special panel. We don't want to save something each time the user clicks, but saving on every full rotation of the backup list seems fine (so we save 4 pictures every 4 strokes with the default setting). It should be possible to ask the program to save either the full list or only the newest page. If you save only the newest page, adjusting the backup list length lets you set how often you want the autosave to save your work. In this case, we should always save to the same file (maybe filename.SAV.ext), whereas in step mode we should use filename.001.ext, 002, ...
And we should also allow disabling the whole thing entirely.
Original comment by pulkoma...@gmail.com
on 3 Sep 2009 at 7:33
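To make the plan above concrete, here is a minimal C sketch of a rotation-triggered autosave with the two naming schemes mentioned (filename.SAV.ext versus numbered step files). All names and settings here are hypothetical illustrations, not GrafX2's actual API, and saving the full list in animation mode is left out:

    #include <stdio.h>

    /* Hypothetical settings, for illustration only. */
    static int Backup_list_length = 4;  /* pages in the backup list (default 4)       */
    static int Autosave_enabled   = 1;  /* the whole thing can be disabled            */
    static int Step_mode          = 0;  /* 0: overwrite filename.SAV.ext, 1: .001.ext */

    static int Strokes_since_save = 0;
    static int Step_number        = 0;

    /* Stand-in for the real saving routine. */
    static void Save_newest_page(const char *filename)
    {
        printf("autosave -> %s\n", filename);
    }

    /* Would be called right after each Backup(), i.e. after each drawing stroke. */
    void Autosave_tick(const char *base, const char *ext)
    {
        char filename[256];

        if (!Autosave_enabled)
            return;

        /* Wait for a full rotation of the backup list before touching the disk. */
        if (++Strokes_since_save < Backup_list_length)
            return;
        Strokes_since_save = 0;

        if (Step_mode)
            snprintf(filename, sizeof filename, "%s.%03d.%s", base, ++Step_number, ext);
        else
            snprintf(filename, sizeof filename, "%s.SAV.%s", base, ext);

        Save_newest_page(filename);
    }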
I wouldn't recommend saving the image on every stroke; it's going to hiccup seriously when the program "flushes". It will become unbearable when we get to layered images, which weigh several hundred KB each...
To avoid many cases where the saving interrupts your drawing, I had an idea to use multiple (user-definable) settings: a minimum number of drawing strokes (e.g. 5), a maximum number of drawing strokes (e.g. 10), and an idle delay (e.g. 1 second).
No saving occurs if there haven't been at least 5 drawing strokes since the last save step. Then the saving will occur as soon as you trigger a non-drawing action (opening a menu, changing drawing tool, scrolling, zooming in, etc.), OR if you stay idle for at least 1 second. If you keep drawing without triggering any of these, then when the maximum is reached (10th drawing stroke), the saving will occur anyway at the end of the drawing action.
To this system we can add another setting: a minimum interval between saves (e.g. 300 seconds). No saving occurs unless this minimum time has ALSO elapsed since the last save. This is in case somebody isn't interested in an actual animation, but rather wants an incremental "safety" save.
Then we can add one more setting: a rolling buffer size, either unlimited or a maximum number of images to keep. The saving would overwrite old images so the disk usage stays stable. For example, for maximum safety, set a save every 1 minute (minimum 1 paint stroke) with a rolling buffer of 10 images.
Before I forget, we need to make this "working steps" system something that can be enabled separately for each image... In the program you keep swapping between the main and spare page, and you don't want a single movie to mix the two.
In any case, be very careful with disk activity. I'm not sure the saving functions are going to recover well if you run out of disk space.
Original comment by yrizoud
on 3 Sep 2009 at 10:05
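The throttling rules described in the previous comment translate fairly directly into a small decision function. The sketch below uses invented names and leaves out the rolling-buffer part (overwriting old numbered files); it is only meant to show the shape of the logic:

    #include <time.h>

    /* Hypothetical user-definable settings (example values from the comment). */
    static int    Min_strokes    = 5;      /* don't save before this many strokes */
    static int    Max_strokes    = 10;     /* force a save at this many strokes   */
    static double Idle_delay_s   = 1.0;    /* idle time that allows a save        */
    static double Min_interval_s = 300.0;  /* minimum time between two saves      */

    static int    Strokes_since_save = 0;  /* incremented after each drawing stroke */
    static time_t Last_save_time     = 0;
    static time_t Last_action_time   = 0;  /* updated on every user action          */

    /* Called from the main loop; 'non_drawing_action' is nonzero when the user
       just opened a menu, changed tool, scrolled, zoomed, etc. Returns nonzero
       when an incremental save should be triggered now. */
    int Should_autosave(int non_drawing_action)
    {
        time_t now     = time(NULL);
        double idle    = difftime(now, Last_action_time);
        double elapsed = difftime(now, Last_save_time);

        if (Strokes_since_save < Min_strokes)
            return 0;                          /* not enough new work yet           */
        if (elapsed < Min_interval_s)
            return 0;                          /* respect the minimum save interval */

        if (non_drawing_action                 /* a convenient moment to save       */
            || idle >= Idle_delay_s            /* the user is pausing anyway        */
            || Strokes_since_save >= Max_strokes)  /* can't postpone any longer     */
        {
            Strokes_since_save = 0;
            Last_save_time = now;
            return 1;
        }
        return 0;
    }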
The idea is obviously not to save on every stroke, but to save the whole backup list at once every time it does a full rotation (i.e. all the already saved images are out of it). This is if you want to use the feature as an animation system. If you want to get a safety backup/autosave, only save the head of the backup list (this is backup[0] with the current table system), each time you change it (so again on each full rotation of the buffer).
The animation mode is indeed heavy; we want to save everything and will have to do so in some way. The autosave mode can be more relaxed. I think saving on every buffer rotation is simple and efficient enough. If you don't want all this disk access to lock the program, then we could use a thread that would save things in the background while you're still drawing. Your proposed system adds more disk access, as it will save way more often. Just save every 10 strokes, keep the code simple, and it will be fine.
For the spare page, there is no problem as it already has its own filename.
Original comment by pulkoma...@gmail.com
on 3 Sep 2009 at 10:20
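The background-saving idea mentioned above could be prototyped along these lines; this sketch uses POSIX threads for brevity, although the real program would more likely use SDL's thread API, and every name in it is made up:

    #include <pthread.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Data handed to the worker: a private copy of what must be written, so the
       main thread can keep drawing while the file is saved. */
    struct Save_job {
        char   filename[256];
        void  *pixels;        /* copy of the page to save */
        size_t size;
    };

    static void *Save_worker(void *arg)
    {
        struct Save_job *job = arg;
        FILE *f = fopen(job->filename, "wb");
        if (f != NULL) {
            fwrite(job->pixels, 1, job->size, f);  /* real code: GIF encoder */
            fclose(f);
        }
        free(job->pixels);
        free(job);
        return NULL;
    }

    /* Fire-and-forget: copy the page, then write it from a detached thread. */
    int Save_in_background(const char *filename, const void *pixels, size_t size)
    {
        pthread_t tid;
        struct Save_job *job = malloc(sizeof *job);
        if (job == NULL)
            return -1;
        snprintf(job->filename, sizeof job->filename, "%s", filename);
        job->pixels = malloc(size);
        if (job->pixels == NULL) {
            free(job);
            return -1;
        }
        memcpy(job->pixels, pixels, size);
        job->size = size;

        if (pthread_create(&tid, NULL, Save_worker, job) != 0) {
            free(job->pixels);
            free(job);
            return -1;
        }
        return pthread_detach(tid);
    }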
The backup buffer shifts _on itself_ on each drawing action, Undo() or Redo(), so the system doesn't know when it has looped through the full buffer. We'll simply need to count how many times Backup() is called, adjusting for the rare calls to Undo() and Redo().
With layers, I hadn't planned on keeping the "flattened" version of every backup step. If I do keep them all (to be able to save without causing an expensive layer-flattening of all the backup steps), it will require a lot more memory: for example, 100 steps of a 4-layer image will require a minimum of 204 pages of memory instead of 106 in my current system.
Original comment by yrizoud
on 3 Sep 2009 at 12:24
The backups are stored in a table, so internally the index into this table has to be reset to 0 at some point, unless I missed something. Things will get funnier if you start killing backup pages: what to do if they are already written to disk?
You're right about layers; I don't want them flattened in memory. If we write only the current page there is no problem. If we dump the full undo buffer we'll probably get some trouble. We could flatten the picture just before saving it, but then a background thread will really be needed.
My advice is to first make layers work; that's why I put this feature in the 2.2 milestone. We'll need changes to the backup engine in the process of adding layers, so it's better to see how it goes before adding a complex saving system.
A simple autosave can still be done if you feel like coding it.
Original comment by pulkoma...@gmail.com
on 3 Sep 2009 at 1:29
There's no 'current index'; the entries in the array are physically shifted by one position, and the entry that leaves one end of the array is put back on the other side :) ! I was surprised too, and I was pretty sure I'd botch it if I tried to rewrite this into an actual "rotating" buffer, as there are many pointers that need updating if you try to point to [0], then to [1], then to [2]...
This is also why I gave up my other idea/suggestion of unrolling the Undo/Redo buffer loop.
The shifting could be a performance problem, but in fact a cell does not contain direct pixel data, "only" about 1.5 KB for a full palette (768 bytes), two filenames (256 bytes each) and a comment (32 bytes); all the rest is pointers.
Original comment by yrizoud
on 3 Sep 2009 at 2:27
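Based on the sizes given above (768-byte palette, two 256-byte filenames, a 32-byte comment, plus pointers), a cell of that array could look roughly like the struct below. The field names and exact layout are guesses for illustration, not the real GrafX2 structure:

    #include <stdint.h>

    #define MAX_PATH_CHARACTERS 256
    #define COMMENT_SIZE         32

    /* One entry of the backup array: about 1.5 KB of metadata; the pixel data
       itself is only referenced through pointers. */
    typedef struct {
        uint8_t  Palette[256 * 3];                    /* 768 bytes of RGB entries   */
        char     Filename[MAX_PATH_CHARACTERS];       /* 256 bytes                  */
        char     File_directory[MAX_PATH_CHARACTERS]; /* 256 bytes                  */
        char     Comment[COMMENT_SIZE];               /* 32 bytes                   */
        int      Width;
        int      Height;
        uint8_t *Pixels;                              /* pointer to the image data  */
    } Backup_page;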
It's still an ugly way of doing things :)
A real bidirectional linked list would work better and not necessarily be more complex. And it would be dynamically resizable too. I guess weird things happen with the current system if you change the number of pages in the settings...
Original comment by pulkoma...@gmail.com
on 3 Sep 2009 at 3:26
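A circular doubly linked list of pages, in the spirit of this suggestion (and of what a later comment says was adopted in r1163), could be sketched as follows; the names are hypothetical:

    #include <stdlib.h>

    /* A page of the undo history; the image data itself is omitted here. */
    typedef struct Page {
        struct Page *Next;      /* more recent page (wraps around) */
        struct Page *Previous;  /* older page (wraps around)       */
        /* ... palette, filename, pointers to pixel data ...       */
    } Page;

    /* Create a one-element circular list. */
    static Page *New_list(void)
    {
        Page *p = calloc(1, sizeof *p);
        if (p != NULL) {
            p->Next = p;
            p->Previous = p;
        }
        return p;
    }

    /* Insert 'new_page' right after 'current'. With such a list, Undo/Redo
       become pointer moves and the number of pages can change at runtime
       without shifting anything. */
    static void Insert_after(Page *current, Page *new_page)
    {
        new_page->Previous      = current;
        new_page->Next          = current->Next;
        current->Next->Previous = new_page;
        current->Next           = new_page;
    }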
Ugly, ugly... Liv Tyler is ugly too, if you look at the bones and tissues :) There are traces that it was initially a fixed 10-page system, so the shift is understandable.
No weird thing happens when you change the number of pages (adding or removing pages); it seems rock-solid. Any mistake here wouldn't have caused weird things, but rather a spectacular crash.
Original comment by yrizoud
on 3 Sep 2009 at 3:35
Original comment by pulkoma...@gmail.com
on 15 Sep 2009 at 7:13
For the record, since r1163 the Undo/Redo system uses a circular bidirectional linked list.
About this idea, I'm considering the use of a directory called "<image_name>.wip", containing files with incremental numbers. The directory would help separate the wip files, and the naming would keep the feature compatible with filenames on DOS (FAT, actually), which are limited to 8 characters + extension. This would prevent the use of the feature on several files which differ only by file extension, but this doesn't sound like a big limitation.
When working on a layered image, I don't know if I should save the full layered image using the current "smart GIF" format, or a flattened version. The first would make the files complete backups for safety; the second would perhaps be necessary for GIF animation tools to import them successfully.
Original comment by yrizoud
on 13 Jan 2010 at 12:29
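Building the names for such a ".wip" directory is straightforward; the sketch below assumes numbered GIF files inside it and uses POSIX mkdir(), both of which are assumptions rather than the actual design:

    #include <stdio.h>
    #include <sys/stat.h>
    #include <sys/types.h>

    /* Build "<image_name>.wip/NNN.gif" so every step of e.g. "picture.gif"
       lands in the directory "picture.wip", keeping the step files themselves
       short enough for 8.3 (FAT) filesystems. */
    int Build_wip_name(char *out, size_t out_size,
                       const char *image_name_without_ext, int step)
    {
        char dir[256];

        snprintf(dir, sizeof dir, "%s.wip", image_name_without_ext);
        mkdir(dir, 0755);  /* POSIX; ignore the error if the directory exists */

        return snprintf(out, out_size, "%s/%03d.gif", dir, step);
    }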
+1 to the directory holding all the files. When you have two pics with the same name and different extensions, it's usually the same image in 2 different formats (I'm thinking of a png and an scr when I work on Amstrad, for example), so it would make sense to have them share the directory.
As for the format, I think it's fine to use layered GIF; we could for example provide a lua script that batch-processes the images and flattens all of them, or offer a way to run a lua script on multiple pictures, or any other variant. Or even do it all in C if we want to.
Unflattening seems a little more annoying to do ;)
Original comment by pulkoma...@gmail.com
on 13 Jan 2010 at 12:44
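For what it's worth, the flattening step itself is simple if the layers are plain indexed-color buffers with one reserved transparent index; that assumption about the data layout is mine, not a description of GrafX2's layer code:

    #include <stdint.h>
    #include <stddef.h>

    /* Composite 'layer_count' indexed-color layers (layer 0 at the bottom) into
       'out'. A pixel equal to 'transparent_index' lets lower layers show through. */
    void Flatten_layers(uint8_t *out, const uint8_t *const *layers,
                        int layer_count, size_t pixel_count,
                        uint8_t transparent_index)
    {
        for (size_t i = 0; i < pixel_count; i++) {
            uint8_t color = transparent_index;
            /* Walk from the top layer down; keep the first opaque pixel found. */
            for (int l = layer_count - 1; l >= 0; l--) {
                if (layers[l][i] != transparent_index) {
                    color = layers[l][i];
                    break;
                }
            }
            out[i] = color;
        }
    }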
Original comment by pulkoma...@gmail.com
on 15 Feb 2011 at 8:12
Issue 464 has been merged into this issue.
Original comment by pulkoma...@gmail.com
on 28 Nov 2011 at 8:41
Arbitrating issues that make it to v2.4
Original comment by yrizoud
on 8 Mar 2012 at 7:18
Original issue reported on code.google.com by
pulkoma...@gmail.com
on 3 Feb 2009 at 1:02