razzeee / xbmc-test


Support for larger rar'ed .sub subtitles files #58

Closed razzeee closed 7 years ago

razzeee commented 7 years ago

Reported by Gamester17 on 4 Oct 2003 11:25 UTC

Support for larger rar'ed .sub subtitles files

Support for rar'ed .sub subtitles of effectively unlimited size

Not sure how/if it is possible to uncompress only a section of a RAR or ZIP file into memory on the fly, read it, then unload it to read in the next section. Maybe a dev or someone with more insight into how RAR decompression works can elaborate on that better than I can.

Otherwise, code the application to un-rar the whole .sub file to the hard drive first, then read only sections of the file into memory at a time, say 256 KB, 512 KB, 1 MB or 2 MB, and unload each section when possible (something similar is done with the TV-Guide listings.xml). Kawa-X uses hard drive paging; maybe that could be used if we get better un-rar code that extracts directly to the hard drive.

This is how Mr K, the author of Kawa-X, explained to me how Kawa-X uses hard drive <=> memory paging:

"The principle is simple. First the data is read from the rom files, and processed in a suitable form, and it's wrote back in 256kb chunks on the Z: drive. Why 256kb ? cause it's fast to read or write that amount of data, and yet it contains a sufficiently significant amount of data (2048 tiles in the case of NeoGeo)

Then, in the case of KawaX, 20 MB are allocated in RAM and divided into 256 KB chunks. Each of them will eventually receive the content of one of the files. Such a chunk is called a page. So, at a given time, we always have 80 pages loaded in RAM.

During emulation, the following occurs: instead of directly accessing a byte array that would contain the ROM, I call a function, giving it the offset I need, and it returns a pointer to the relevant data. The function checks whether the needed page is loaded. Yes? Good, we return a derived pointer. No? We load the needed page from Z: over the least recently used page in central RAM.

Least recently used is decided this way: when a page is accessed, its flag is set to 0, and at the end of each frame, all pages' flags are incremented. So at a given time, the page with the highest flag is the one that was used longest ago, and it will be ditched first when needed."
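The paging scheme Mr K describes can be sketched as follows (my own illustrative code, not Kawa-X or XBMC source; the `PageCache` class and `read_page` callback are assumptions). Pages carry an age counter that is reset on access and incremented once per frame; the page with the highest counter is evicted first:

```python
# Sketch of Kawa-X-style hard drive <=> memory paging with
# frame-counter LRU eviction, using the numbers from the quote above.
PAGE_SIZE = 256 * 1024  # 256 KB pages, as in Kawa-X
MAX_PAGES = 80          # 20 MB of RAM / 256 KB


class PageCache:
    def __init__(self, read_page):
        self.read_page = read_page  # callback: load one page from disk by index
        self.pages = {}             # page index -> [age, data]

    def access(self, offset):
        """Return the byte at `offset`, loading its page if needed."""
        index = offset // PAGE_SIZE
        if index not in self.pages:
            if len(self.pages) >= MAX_PAGES:
                # Evict the page with the highest age (least recently used).
                victim = max(self.pages, key=lambda i: self.pages[i][0])
                del self.pages[victim]
            self.pages[index] = [0, self.read_page(index)]
        entry = self.pages[index]
        entry[0] = 0                # accessed now: reset its age
        return entry[1][offset % PAGE_SIZE]

    def end_of_frame(self):
        """Increment every loaded page's age once per emulated frame."""
        for entry in self.pages.values():
            entry[0] += 1
```

Incrementing ages per frame rather than per access keeps the bookkeeping cheap: a page's counter only tells you how many frames ago it was last touched, which is all the eviction policy needs.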

Migrated-From: http://trac.kodi.tv/ticket/58

razzeee commented 7 years ago

Comment by gamester17 on 4 Oct 2003 13:39 UTC

Logged In: YES user_id=630186

According to eabair (MAMEoX), VMM could be implemented relatively easily as long as the dev knows where to hook it up, so I guess/hope it will not be too hard to hook up in MPlayer's code? http://sourceforge.net/tracker/?func=detail&atid=554245&aid=737393&group_id=78713

Maybe you can take a look at VMM and how MAMEoX does it?

Of course, the next big step would be to figure out the best way to decompress the .sub files from the RAR archive.

razzeee commented 7 years ago

Comment by gamester17 on 21 Apr 2005 07:07 UTC


7-Zip is open source under the GNU LGPL (available in both C and C++) and looks to be the perfect library(?) for this, as it handles most compression formats (such as 7z, ZIP, CAB, RAR, ARJ, GZIP, BZIP2, Z, TAR, CPIO, RPM and DEB).

Are any developers interested in porting this to XBMC?

http://www.7-zip.org

razzeee commented 7 years ago

Comment by spiff_ on 27 Jun 2008 22:23 UTC: we support rarred subtitles as large as memory allows, and we stream from disk to conserve memory.

razzeee commented 7 years ago

Modified by vdrfan on 6 Feb 2011 17:27 UTC