raphaelbrugier / spark-mongo-example

Example on how to use the MongoDB to Spark connector

Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:run (default-cli) on project spark-mongo-example: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 240 #3

Closed mostafaghadimi closed 11 months ago

mostafaghadimi commented 5 years ago

Hi @raphaelbrugier

Whenever I run mvn scala:run -DmainClass=com.github.rbrugier.MongoSparkMain, I get the following error:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/08/21 13:41:02 INFO SparkContext: Running Spark version 1.6.3
19/08/21 13:41:02 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/08/21 13:41:02 WARN Utils: Your hostname, mostafa-UX303UB resolves to a loopback address: 127.0.1.1; using 192.168.199.64 instead (on interface wlp2s0)
19/08/21 13:41:02 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
19/08/21 13:41:02 INFO SecurityManager: Changing view acls to: mostafa
19/08/21 13:41:02 INFO SecurityManager: Changing modify acls to: mostafa
19/08/21 13:41:02 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(mostafa); users with modify permissions: Set(mostafa)
19/08/21 13:41:03 INFO Utils: Successfully started service 'sparkDriver' on port 43395.
19/08/21 13:41:03 INFO Slf4jLogger: Slf4jLogger started
19/08/21 13:41:03 INFO Remoting: Starting remoting
19/08/21 13:41:03 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@192.168.199.64:35013]
19/08/21 13:41:03 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 35013.
19/08/21 13:41:03 INFO SparkEnv: Registering MapOutputTracker
19/08/21 13:41:03 INFO SparkEnv: Registering BlockManagerMaster
19/08/21 13:41:03 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-f78fe81a-ed09-4946-89d6-4410c01a467a
19/08/21 13:41:03 INFO MemoryStore: MemoryStore started with capacity 1089.8 MB
19/08/21 13:41:03 INFO SparkEnv: Registering OutputCommitCoordinator
19/08/21 13:41:04 INFO Utils: Successfully started service 'SparkUI' on port 4040.
19/08/21 13:41:04 INFO SparkUI: Started SparkUI at http://192.168.199.64:4040
19/08/21 13:41:04 INFO Executor: Starting executor ID driver on host localhost
19/08/21 13:41:04 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 34675.
19/08/21 13:41:04 INFO NettyBlockTransferService: Server created on 34675
19/08/21 13:41:04 INFO BlockManagerMaster: Trying to register BlockManager
19/08/21 13:41:04 INFO BlockManagerMasterEndpoint: Registering block manager localhost:34675 with 1089.8 MB RAM, BlockManagerId(driver, localhost, 34675)
19/08/21 13:41:04 INFO BlockManagerMaster: Registered BlockManager
19/08/21 13:41:05 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 184.0 B, free 1089.7 MB)
19/08/21 13:41:05 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 365.0 B, free 1089.7 MB)
19/08/21 13:41:05 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:34675 (size: 365.0 B, free: 1089.7 MB)
19/08/21 13:41:05 INFO SparkContext: Created broadcast 0 from broadcast at MongoSpark.scala:476
19/08/21 13:41:06 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 184.0 B, free 1089.7 MB)
19/08/21 13:41:06 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 365.0 B, free 1089.7 MB)
19/08/21 13:41:06 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on localhost:34675 (size: 365.0 B, free: 1089.7 MB)
19/08/21 13:41:06 INFO SparkContext: Created broadcast 1 from broadcast at MongoSpark.scala:476
19/08/21 13:41:06 INFO cluster: Cluster created with settings {hosts=[127.0.0.1:27017], mode=SINGLE, requiredClusterType=UNKNOWN, serverSelectionTimeout='30000 ms', maxWaitQueueSize=500}
19/08/21 13:41:06 INFO cluster: Cluster description not yet available. Waiting for 30000 ms before timing out
19/08/21 13:41:06 INFO cluster: Exception in monitor thread while connecting to server 127.0.0.1:27017
com.mongodb.MongoSocketReadException: Exception receiving message
    at com.mongodb.connection.InternalStreamConnection.translateReadException(InternalStreamConnection.java:480)
    at com.mongodb.connection.InternalStreamConnection.receiveMessage(InternalStreamConnection.java:225)
    at com.mongodb.connection.CommandHelper.receiveReply(CommandHelper.java:134)
    at com.mongodb.connection.CommandHelper.receiveCommandResult(CommandHelper.java:121)
    at com.mongodb.connection.CommandHelper.executeCommand(CommandHelper.java:32)
    at com.mongodb.connection.InternalStreamConnectionInitializer.initializeConnectionDescription(InternalStreamConnectionInitializer.java:83)
    at com.mongodb.connection.InternalStreamConnectionInitializer.initialize(InternalStreamConnectionInitializer.java:43)
    at com.mongodb.connection.InternalStreamConnection.open(InternalStreamConnection.java:115)
    at com.mongodb.connection.DefaultServerMonitor$ServerMonitorRunnable.run(DefaultServerMonitor.java:128)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.SocketException: Connection reset
    at java.net.SocketInputStream.read(SocketInputStream.java:210)
    at java.net.SocketInputStream.read(SocketInputStream.java:141)
    at com.mongodb.connection.SocketStream.read(SocketStream.java:85)
    at com.mongodb.connection.InternalStreamConnection.receiveResponseBuffers(InternalStreamConnection.java:491)
    at com.mongodb.connection.InternalStreamConnection.receiveMessage(InternalStreamConnection.java:221)
    ... 8 more
java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at scala_maven_executions.MainHelper.runMain(MainHelper.java:164)
    at scala_maven_executions.MainWithArgsInFile.main(MainWithArgsInFile.java:26)
Caused by: com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting to connect. Client view of cluster state is {type=UNKNOWN, servers=[{address=127.0.0.1:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketReadException: Exception receiving message}, caused by {java.net.SocketException: Connection reset}}]
    at com.mongodb.connection.BaseCluster.getDescription(BaseCluster.java:160)
    at com.mongodb.Mongo.getClusterDescription(Mongo.java:378)
    at com.mongodb.Mongo.getServerAddressList(Mongo.java:371)
    at com.mongodb.spark.connection.MongoClientCache$$anonfun$logClient$1.apply(MongoClientCache.scala:161)
    at com.mongodb.spark.connection.MongoClientCache$$anonfun$logClient$1.apply(MongoClientCache.scala:161)
    at com.mongodb.spark.LoggingTrait$class.logInfo(LoggingTrait.scala:48)
    at com.mongodb.spark.Logging.logInfo(Logging.scala:24)
    at com.mongodb.spark.connection.MongoClientCache.logClient(MongoClientCache.scala:161)
    at com.mongodb.spark.connection.MongoClientCache.acquire(MongoClientCache.scala:56)
    at com.mongodb.spark.MongoConnector.acquireClient(MongoConnector.scala:239)
    at com.mongodb.spark.MongoConnector.withMongoClientDo(MongoConnector.scala:152)
    at com.mongodb.spark.MongoConnector.withDatabaseDo(MongoConnector.scala:171)
    at com.mongodb.spark.MongoConnector.hasSampleAggregateOperator(MongoConnector.scala:234)
    at com.mongodb.spark.rdd.MongoRDD.hasSampleAggregateOperator$lzycompute(MongoRDD.scala:180)
    at com.mongodb.spark.rdd.MongoRDD.hasSampleAggregateOperator(MongoRDD.scala:180)
    at com.mongodb.spark.sql.MongoInferSchema$.apply(MongoInferSchema.scala:61)
    at com.mongodb.spark.MongoSpark.toDF(MongoSpark.scala:521)
    at com.mongodb.spark.rdd.MongoRDD.toDF(MongoRDD.scala:73)
    at com.github.rbrugier.MongoSparkMain$.delayedEndpoint$com$github$rbrugier$MongoSparkMain$1(MongoSparkMain.scala:19)
    at com.github.rbrugier.MongoSparkMain$delayedInit$body.apply(MongoSparkMain.scala:11)
    at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
    at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
    at scala.App$$anonfun$main$1.apply(App.scala:76)
    at scala.App$$anonfun$main$1.apply(App.scala:76)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
    at scala.App$class.main(App.scala:76)
    at com.github.rbrugier.MongoSparkMain$.main(MongoSparkMain.scala:11)
    at com.github.rbrugier.MongoSparkMain.main(MongoSparkMain.scala)
    ... 6 more
19/08/21 13:41:36 INFO SparkContext: Invoking stop() from shutdown hook
19/08/21 13:41:36 INFO SparkUI: Stopped Spark web UI at http://192.168.199.64:4040
19/08/21 13:41:36 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/08/21 13:41:36 INFO MemoryStore: MemoryStore cleared
19/08/21 13:41:36 INFO BlockManager: BlockManager stopped
19/08/21 13:41:36 INFO BlockManagerMaster: BlockManagerMaster stopped
19/08/21 13:41:36 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/08/21 13:41:36 INFO SparkContext: Successfully stopped SparkContext
19/08/21 13:41:36 INFO ShutdownHookManager: Shutdown hook called
19/08/21 13:41:36 INFO ShutdownHookManager: Deleting directory /tmp/spark-270303b6-e839-44c8-90e1-6f21078c33f4
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  38.775 s
[INFO] Finished at: 2019-08-21T13:41:36+04:30
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:run (default-cli) on project spark-mongo-example: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 240 (Exit value: 240) -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
raphaelbrugier commented 5 years ago

Is Mongo running correctly on this host and port: 127.0.0.1:27017?
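
The MongoTimeoutException / Connection reset in the log means the driver could open a socket attempt to 127.0.0.1:27017 but never got a valid reply, which usually points at mongod not running (or something else bound to that port). As a minimal sketch, a quick TCP check like the one below can confirm whether anything is listening there before launching the Spark job (this is an illustrative helper, not part of the project):

```python
import socket

def is_port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers connection refused, reset, and timeout cases.
        return False

if __name__ == "__main__":
    # Check the default MongoDB port used by the connector's URI.
    if is_port_open("127.0.0.1", 27017):
        print("something is listening on 127.0.0.1:27017")
    else:
        print("nothing is listening on 127.0.0.1:27017 -- start mongod first")
```

Note this only proves a TCP listener exists; a reset after connect (as in the log above) can also mean the listener is not actually a mongod speaking the wire protocol.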


mostafaghadimi commented 5 years ago

Yeah