Closed lazymesh closed 6 years ago
Spark 1.6
On Sun, Feb 25, 2018 at 11:25 AM, Ramesh Maharjan notifications@github.com wrote:
I don't think val sqlContext = SQLContext.getOrCreate(rdd.sparkContext) should work in mapr-streams-sparkstreaming-hbase/src/main/scala/solution/SensorStreamConsumer.scala, as I don't see any getOrCreate() API for SQLContext.
For a more recent Spark HBase example, check out
I never used the 1.6 version of Spark, so I didn't know about that. But is it really safe to create so many instances of SQLContext?
getOrCreate returns the existing SQLContext if one has already been created, so no new instances are constructed per batch.
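For context, the pattern under discussion can be sketched roughly as below. This is a minimal illustration, not the repository's actual code: the Sensor case class, the stream setup, and the table name are assumptions for the example. SQLContext.getOrCreate(sparkContext) has existed since Spark 1.5 and returns the singleton SQLContext for the given SparkContext, creating it only on first use.

```scala
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.SQLContext
import org.apache.spark.streaming.dstream.DStream

// Hypothetical record type standing in for the sensor data in the repo.
case class Sensor(id: String, value: Double)

def process(stream: DStream[Sensor]): Unit = {
  stream.foreachRDD { (rdd: RDD[Sensor]) =>
    // Returns the already-created singleton SQLContext rather than
    // constructing a fresh one for every micro-batch.
    val sqlContext = SQLContext.getOrCreate(rdd.sparkContext)
    import sqlContext.implicits._

    val df = rdd.toDF()
    df.registerTempTable("sensors") // Spark 1.6-era API
  }
}
```

Because foreachRDD runs once per micro-batch on the driver, calling getOrCreate there is cheap: only the first call pays the construction cost.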
OK, that explains it. The API documentation says so too. Thanks for the clarification. I learned something new. :)
I don't think val sqlContext = SQLContext.getOrCreate(rdd.sparkContext) should work in mapr-streams-sparkstreaming-hbase/src/main/scala/solution/SensorStreamConsumer.scala, as I don't see any getOrCreate() API for SQLContext.