jadianes / spark-movie-lens

An on-line movie recommender using Spark, Python Flask, and the MovieLens dataset

engine.iteration #22

Open alperenbabagil opened 7 years ago

alperenbabagil commented 7 years ago

Hi jadianes, Thank you for all your work, it really helped me. But the iteration count in the engine causes an error on my system whenever I set it higher than 5, and I think 5 iterations are not enough for a good recommendation. Can you suggest any way to fix it? The error is this:

File "F:\bitirme\spark-2.0.1-bin-hadoop2.7\python\pyspark\mllib\common.py", line 123, in callJavaFunc 17/04/24 22:58:09 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[Executor task launch worker-7,5,main] java.lang.StackOverflowError at java.nio.HeapByteBuffer.get(HeapByteBuffer.java:147) at org.apache.spark.util.ByteBufferInputStream.read(ByteBufferInputStream.scala:52) at java.io.ObjectInputStream$PeekInputStream.read(ObjectInputStream.java:2310)


ERROR LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerJobEnd(5,1493063889491,JobFailed(org.apache.spark.SparkException: Job 5 cancelled because SparkContext was shut down)

My system specifications: i7 2nd gen, 6 GB RAM, 550/440 MB/s SSD, Asus N53SV laptop.
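In case it is useful to others hitting this: a StackOverflowError at high ALS iteration counts in Spark is typically caused by the very long RDD lineage the iterations build up, and the commonly suggested mitigation is to set a checkpoint directory on the SparkContext before training, which lets mllib's ALS truncate that lineage periodically. A minimal sketch, with made-up ratings and parameter values rather than the engine's actual ones:

```python
from pyspark import SparkConf, SparkContext
from pyspark.mllib.recommendation import ALS

conf = SparkConf().setAppName("movie_recommender")  # illustrative app name
sc = SparkContext(conf=conf)

# Checkpointing lets Spark truncate the RDD lineage that ALS builds up across
# iterations; a very long lineage is the usual cause of
# java.lang.StackOverflowError once the iteration count grows.
sc.setCheckpointDir("checkpoint")  # any local or HDFS path Spark can write to

# Toy ratings RDD of (user_id, movie_id, rating) tuples; in the project this
# would be the RDD loaded from the MovieLens files.
ratings_RDD = sc.parallelize([(0, 1, 4.0), (0, 2, 3.0), (1, 1, 5.0), (1, 2, 2.0)])

# With a checkpoint directory set, higher iteration counts become feasible.
model = ALS.train(ratings_RDD, rank=8, seed=5,
                  iterations=20, lambda_=0.1)
```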

begulsen commented 7 years ago

Hello jadianes, Same problem here. I have 16 GB of RAM, but it isn't enough for 10 or 20 iterations, so my recommendations are not very good. I have tried many things to solve this problem, but I could not find a solution. Can you help me? Thanks in advance.

jadianes commented 7 years ago

Hi guys. When I did this, some two years ago, I was using an 8-node Spark cluster with 8GB each, which added up to 64GB of RAM. Maybe 16GB isn't enough?
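In case it helps to compare setups, here is a minimal sketch of how executor memory and cores can be raised when the SparkContext is created. The master URL and figures are illustrative assumptions, not values from this repo, and driver memory for a local run still has to be passed to spark-submit itself, since the driver JVM is already running by the time SparkConf is read.

```python
from pyspark import SparkConf, SparkContext

# Illustrative cluster configuration -- adjust to your own machines.
conf = (SparkConf()
        .setAppName("movie_recommender")
        .setMaster("spark://master:7077")     # assumed standalone master URL
        .set("spark.executor.memory", "6g")   # memory per executor
        .set("spark.cores.max", "8"))         # cap on total cores used

sc = SparkContext(conf=conf)
```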

Sorry I can't help much more than that. I haven't been contributing to the repo for a while due to other commitments! Any contributions/discussions are very welcome.