perwendel / spark

A simple, expressive web framework for Java. Spark has a Kotlin DSL: https://github.com/perwendel/spark-kotlin
Apache License 2.0

Deploy SparkJava app in AWS Beanstalk #1042

Closed. pellyadolfo closed this issue 5 years ago.

pellyadolfo commented 5 years ago

Hi, I have a SparkJava app with SSL enabled, connecting to MongoDB and currently deployed as a fat jar.

I would like to deploy it in AWS to query a MongoDB instance hosted in EC2. Initially I thought about AWS Lambda, but cold starts would be an issue since the server provides microservices. So my next idea is deploying the SparkJava app with AWS Beanstalk.

While I can find some information about deploying Spring Boot or Dropwizard in AWS Beanstalk, I cannot find any good documentation about deploying a SparkJava app. Any experience/recommendations?

I will share my findings in this thread anyway for other people.

pellyadolfo commented 5 years ago

AWS Beanstalk allows deploying three types of Java projects:

1. A Tomcat-based project, which does not apply to a SparkJava project.
2. A Java project, including Jetty projects, as described here: https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/create_deploy_Java.html
3. A Docker project.

Option 2 seems the way to go.
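For reference, the Java SE platform puts an nginx proxy in front of the app; if I read the docs correctly, it forwards to the port given by the PORT environment property (5000 by default). A minimal sketch of how a Spark app could pick that up (the 4567 fallback and the /health route are just placeholders):

```java
import static spark.Spark.get;
import static spark.Spark.port;

public class Main {
    public static void main(String[] args) {
        // Beanstalk's Java SE platform proxies requests to the port in the
        // PORT environment property (5000 by default); fall back to Spark's
        // default 4567 when running locally.
        String envPort = System.getenv("PORT");
        port(envPort != null ? Integer.parseInt(envPort) : 4567);

        get("/health", (req, res) -> "ok");
    }
}
```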

pellyadolfo commented 5 years ago

In the end there was no problem deploying the SparkJava app as a Java project in AWS Beanstalk. It just needs the right port to be opened in the associated Security Group.

Where I am finding a problem is configuring SSL in SparkJava, since

secure(keystoreFilePath, keystorePassword, truststoreFilePath, truststorePassword);
(http://sparkjava.com/documentation#embedded-web-server)

is not able to read a JKS keystore embedded in the fat jar; it needs a path on the file system. I do not think the current secure() mechanism in SparkJava is usable in AWS Beanstalk (https://stackoverflow.com/questions/48487246/spark-java-how-to-import-keystore-file-with-path-inside-jar).

This is currently a feature request: https://github.com/perwendel/spark/issues/782
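Until something like that is supported, a possible workaround is copying the keystore out of the jar into a temporary file and passing that path to secure(). Untested sketch, assuming the keystore is bundled at /keystore.jks on the classpath and the password comes from a KEYSTORE_PASSWORD environment variable (both names are my own placeholders):

```java
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

import static spark.Spark.secure;

public class SslBootstrap {
    public static void main(String[] args) throws Exception {
        // secure() only accepts a file-system path, so copy the keystore
        // bundled inside the fat jar to a temporary file first.
        Path tempKeystore = Files.createTempFile("keystore", ".jks");
        try (InputStream in = SslBootstrap.class.getResourceAsStream("/keystore.jks")) {
            Files.copy(in, tempKeystore, StandardCopyOption.REPLACE_EXISTING);
        }

        // Must be called before any route is mapped.
        secure(tempKeystore.toString(), System.getenv("KEYSTORE_PASSWORD"), null, null);

        // ... define routes here ...
    }
}
```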

One alternative is creating a Key Pair for the EC2 instance and uploading the certificate to the file system, so we can provide the path to the secure() method. But what if AWS scales your EC2 instances horizontally? https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ec2-key-pairs.html#having-ec2-create-your-key-pair

The second alternative is probably configuring SSL on the EC2 instance or on the load balancer.
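If TLS is terminated at the load balancer, the app itself would just serve plain HTTP. A rough sketch of what that could look like; the port 5000 and the X-Forwarded-Proto check are assumptions about the ELB/nginx setup in front, not anything SparkJava requires:

```java
import static spark.Spark.before;
import static spark.Spark.get;
import static spark.Spark.halt;
import static spark.Spark.port;

public class BehindLoadBalancer {
    public static void main(String[] args) {
        // TLS is terminated at the load balancer, so no secure() call here;
        // the instance serves plain HTTP on the proxied port.
        port(5000);

        // Optionally reject requests that did not reach the LB over HTTPS.
        before((req, res) -> {
            String proto = req.headers("X-Forwarded-Proto");
            if (proto != null && !proto.equalsIgnoreCase("https")) {
                halt(403, "HTTPS required");
            }
        });

        get("/hello", (req, res) -> "hello over HTTPS (terminated at the LB)");
    }
}
```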

pellyadolfo commented 5 years ago

This worked:

Spark.secure("src/main/resources/mycert.jks", KEYSTORE_PASSWORD, null, null);
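Note that "src/main/resources/mycert.jks" is presumably resolved against the working directory, so it only works when the app is started from the project root. A sketch of a more portable variant, assuming the keystore path and password are injected as environment properties (KEYSTORE_PATH and KEYSTORE_PASSWORD are hypothetical names, e.g. set in the Beanstalk configuration):

```java
import static spark.Spark.secure;

public class SecureFromEnv {
    public static void main(String[] args) {
        // Hypothetical environment properties supplied by the environment,
        // so the same jar works on any instance without a hard-coded path.
        String keystorePath = System.getenv("KEYSTORE_PATH");         // e.g. /var/app/mycert.jks
        String keystorePassword = System.getenv("KEYSTORE_PASSWORD");

        secure(keystorePath, keystorePassword, null, null);

        // ... define routes here ...
    }
}
```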