buliugu / airhead-research

Automatically exported from code.google.com/p/airhead-research

Loading very big sspace #107

Open GoogleCodeExporter opened 9 years ago

GoogleCodeExporter commented 9 years ago
Hello,
I have an 8 GB sspace that I would like to access. My question is: what is the most efficient way to do this? OnDiskSemanticSpace still "loads" a lot of its information into memory before you can actually start extracting similarity information, which in my case is not feasible, because simply loading the sspace takes a very long time and leaves no memory for the app to run.
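For context, this is roughly how I am opening and querying the space at the moment. This is only a minimal sketch: the file path and the query words are placeholders, and the Similarity.cosineSimilarity call reflects my understanding of the S-Space API, so the exact signatures may differ in other versions.

```java
import edu.ucla.sspace.common.OnDiskSemanticSpace;
import edu.ucla.sspace.common.SemanticSpace;
import edu.ucla.sspace.common.Similarity;
import edu.ucla.sspace.vector.Vector;

public class SspaceQuery {

    public static void main(String[] args) throws Exception {
        // Open the 8 GB space. Even with OnDiskSemanticSpace, this step
        // appears to read a large word-to-offset index into memory before
        // any similarity query can run.
        SemanticSpace sspace =
            new OnDiskSemanticSpace("/data/my-large.sspace"); // placeholder path

        // Pull two word vectors and compare them (placeholder words).
        Vector v1 = sspace.getVector("cat");
        Vector v2 = sspace.getVector("dog");

        // Cosine similarity via the Similarity utility class, as I
        // understand its API.
        double sim = Similarity.cosineSimilarity(v1, v2);
        System.out.println("cosine(cat, dog) = " + sim);
    }
}
```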

Can you please advise me, as soon as possible, whether an sspace can be accessed directly from disk with a minimal memory footprint?

Also, is there any way to distribute the sspace across a data cluster (e.g. a Hadoop cluster) in order to access and query it?

Your quick answer is really appreciated.

Original issue reported on code.google.com by Ahmed.S....@gmail.com on 1 Apr 2012 at 10:20