perwendel / spark

A simple, expressive web framework for Java. Spark has a Kotlin DSL: https://github.com/perwendel/spark-kotlin
Apache License 2.0

Desired features/changes for Spark 3.0 #1105

Open perwendel opened 5 years ago

perwendel commented 5 years ago

Hi! A 2.9.0 release will be done shortly, and after that my work will be fully focused on 3.0. Any input on what would be fitting for Spark 3.0 is much appreciated. Please post in this thread. Thanks!

fwgreen commented 5 years ago

Please consider reopening the issues labeled "Fix in version 3": they were closed two years ago without actually being resolved.

perwendel commented 5 years ago

@fwgreen I'll go through them and check if there are any that shouldn't have been closed.

RyanSusana commented 5 years ago

Native support for uploaded files instead of getting the raw request.

Multiple static file locations

mcgivrer commented 5 years ago

Would it be possible to add internal metrics (usage, performance, custom) to answer my personal need for control? :) More seriously, in a world of containers, metrics are mandatory for monitoring services. Maybe a look at microprofile-metrics and its annotations could inspire the developers? A /metrics output in the Prometheus format (or some other standard) would be a must ;) I clearly understand the need to KISS, and not using annotations makes sense, but an easy way to declare a metric would be a killer feature (a config file? a fluent API extension of get(), post(), etc.?).
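A /metrics endpoint in the Prometheus text format needs very little machinery. As a hedged sketch (the `MetricsRegistry` class below is invented for illustration, not part of Spark), a plain-Java counter registry could be exposed from an ordinary `get("/metrics", ...)` route:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.LongAdder;

// Hypothetical helper, not part of Spark: counts events and renders
// them in the Prometheus text exposition format ("name value" lines).
class MetricsRegistry {
    private final Map<String, LongAdder> counters = new ConcurrentHashMap<>();

    void increment(String name) {
        counters.computeIfAbsent(name, k -> new LongAdder()).increment();
    }

    // Render all counters, one metric per line.
    String scrape() {
        StringBuilder sb = new StringBuilder();
        counters.forEach((name, value) ->
            sb.append(name).append(' ').append(value.sum()).append('\n'));
        return sb.toString();
    }
}
```

A route such as `get("/metrics", (req, res) -> registry.scrape())` would then make the output scrapeable by a standard Prometheus server.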

johnnybigoode-zz commented 5 years ago

@RyanSusana I was wondering the same for static files (https://github.com/perwendel/spark/issues/568)

Could you explain more about your use case?

RyanSusana commented 5 years ago

@johnnybigoode Well, I would like one Spark instance to be able to hook on various static-file locations.

One for the JS/CSS and one for /uploads, or something like that.

This would allow me to split my application up better.

For my specific use case: I am developing a CMS framework, and the Admin UI has its own static resources. I would like my framework's users to be able to hook in their own static files.

Right now I solve it by traversing the classpath/jar and adding a route for every file I have.
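Until multiple static-file locations are supported natively, the route-per-file workaround can be collapsed into one resolver that maps URL prefixes to directories. This is a sketch under assumed names (`StaticFileResolver` and `mount` are invented, not Spark API); note the `normalize` check that rejects `..` traversal before a catch-all route would serve the file:

```java
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical resolver, not part of Spark: maps URL prefixes such as
// "/assets" and "/uploads" to separate directories on disk.
class StaticFileResolver {
    private final Map<String, Path> roots = new LinkedHashMap<>();

    void mount(String urlPrefix, String directory) {
        roots.put(urlPrefix, Paths.get(directory).toAbsolutePath().normalize());
    }

    // Returns the resolved file path, or null if the request matches no
    // mount or tries to escape its root with "..".
    Path resolve(String requestPath) {
        for (Map.Entry<String, Path> e : roots.entrySet()) {
            if (requestPath.startsWith(e.getKey() + "/")) {
                Path root = e.getValue();
                Path candidate = root.resolve(
                    requestPath.substring(e.getKey().length() + 1)).normalize();
                if (candidate.startsWith(root)) {
                    return candidate;
                }
            }
        }
        return null;
    }
}
```

A single `get("/*", ...)` route could then resolve the request path and stream the file, instead of registering one route per file.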

laliluna commented 5 years ago

I have two ideas and if there is interest, I could try to provide pull requests.

1) Enhance testability. In order to test routes and their output, it is currently required to change the way you declare routes. Actually, you cannot test routing in combination with testing the output. If we change the Service to implement an interface and allow swapping it in via Spark.enableMock(), this would allow writing tests as demoed here: https://github.com/perwendel/spark/issues/1085

2) Allow decorating the response. If I could decorate a response with a custom class extending Response, I could add behaviour and implement routes more elegantly.

Once somewhere

    Spark.decorateResponse(response -> new MySuperDuperResponse(response));

In your routes

    Spark.get("sample", (request, response) -> {
        return response.json(loadWhatever()).httpOk404IfNull();
    });

perwendel commented 5 years ago

@RyanSusana @mcgivrer @laliluna Good suggestions. We'll evaluate! Some of them will likely be part of 3.0.

mlitcher commented 5 years ago

Two big things on my wish list: break apart core and jetty, to allow for other embeddable servers (#137), and leverage servlet vs filter (#193).

OzzyTheGiant commented 5 years ago

CSRF tokens would be a nice, simple feature. I use them for single-page web apps, storing them in sessions. Normally, in other languages, there are standalone libraries or packages that provide this functionality for use with any framework. In the Java world, CSRF tokens are either already integrated into other frameworks (Spring Security, for example) or are part of old packages that are no longer maintained, or that have complex XML configurations which, frankly, I don't understand how to set up. Do you think this is something that could be added in? Or do you happen to know of a standalone library with little to no configuration that I could pick up? I tried searching Maven Central, but no luck.
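For reference, the token mechanics themselves need nothing beyond the JDK. The helper below is a hypothetical sketch (the `CsrfTokens` class is invented, not an existing library): generate a random per-session token and compare submissions in constant time. The wiring would be a Spark before-filter that stores the token as a session attribute and rejects mutating requests whose submitted token is missing or mismatched.

```java
import java.security.MessageDigest;
import java.security.SecureRandom;
import java.util.Base64;

// Hypothetical helper, not part of Spark: issues and checks CSRF tokens.
class CsrfTokens {
    private static final SecureRandom RANDOM = new SecureRandom();

    // 32 random bytes, URL-safe Base64 encoded without padding.
    static String newToken() {
        byte[] bytes = new byte[32];
        RANDOM.nextBytes(bytes);
        return Base64.getUrlEncoder().withoutPadding().encodeToString(bytes);
    }

    // Constant-time comparison to avoid timing side channels.
    static boolean matches(String expected, String submitted) {
        if (expected == null || submitted == null) {
            return false;
        }
        return MessageDigest.isEqual(expected.getBytes(), submitted.getBytes());
    }
}
```

In a before-filter one would store `newToken()` under a session attribute on first contact and call `matches(...)` on each POST/PUT/DELETE.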

Technerder commented 4 years ago

Request: a method to respond with a File.
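Today, responding with a file means streaming it through the raw response and setting the Content-Type yourself. The type-guessing half can be sketched as a tiny extension-to-MIME helper (the `ContentTypes` name is invented for illustration, not Spark API):

```java
// Hypothetical helper, not part of Spark: maps a file name to a
// Content-Type so a route serving files can set the header correctly.
class ContentTypes {
    static String forFile(String name) {
        String lower = name.toLowerCase();
        if (lower.endsWith(".html")) return "text/html";
        if (lower.endsWith(".css"))  return "text/css";
        if (lower.endsWith(".js"))   return "application/javascript";
        if (lower.endsWith(".png"))  return "image/png";
        if (lower.endsWith(".pdf"))  return "application/pdf";
        return "application/octet-stream";
    }
}
```

A route would then call `res.type(ContentTypes.forFile(name))` before returning the file's bytes; a built-in method could bundle both steps.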

robax commented 4 years ago

One thing that might be useful is the option to use JAX-RS style annotations on routes. This way, instead of reaching into the request object and grabbing seemingly random fields, you can define the expected inputs via annotations.

If there's any interest in this, we've already developed something we use internally. I could spin it out into a PR easily!
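As a rough illustration of the idea (all names below are invented for this sketch, not Spark or JAX-RS API), a runtime annotation plus a reflective scan is enough to declare a route's path on the handler method itself:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;

// Hypothetical annotation: declares the path a framework would register.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@interface Route {
    String path();
}

class GreetingController {
    @Route(path = "/hello/:name")
    public String hello(String name) {
        return "Hello, " + name + "!";
    }
}

class RouteScanner {
    // Read the declared path off an annotated handler method.
    static String pathOf(Class<?> type, String methodName) {
        try {
            Method m = type.getMethod(methodName, String.class);
            Route route = m.getAnnotation(Route.class);
            return route != null ? route.path() : null;
        } catch (NoSuchMethodException e) {
            return null;
        }
    }
}
```

A real implementation would scan a controller class once at startup and register each discovered path with the matching Spark verb method.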

ontehfritz commented 4 years ago

A big thing that would be nice to have is OpenAPI/Swagger support, or a plugin/Maven package to add it. Most frameworks out there have this to auto-generate OpenAPI specs and integrate Swagger UI; it makes testing and generating interfaces from the spec for your APIs really awesome!

rbygrave commented 4 years ago

> JAX-RS style annotations

Note that I have done an APT-based code generation project for Javalin and would look to do the same for Spark. The Javalin one is documented at https://dinject.io/docs/javalin/ ... I just need to adapt the code generation for Spark's request/response.

> OpenAPI/Swagger support

As part of the APT code generation for controllers, it also generates OpenAPI/Swagger docs. The nice thing here is that APT has access to Javadoc/KDoc, so we just javadoc our controller methods and that goes into the generated Swagger.

This approach is closer to the JAX-RS style, with dependency injection and controllers. Note that the DI also uses APT code generation, so it is fast and light (but people could swap it out for slower, heavier DI like Guice or Spring if they wanted to).

Technerder commented 4 years ago

A way for get, post, and the other verb methods to listen for requests with a specific host. Something like:

Spark.get("/", "test.example.com", (request, response) -> {
    return "Hello!";
});

perwendel commented 4 years ago

Thanks everyone for your suggestions. It's been a long summer vacation with a resulting dip in project activity. Ramping up will begin within a month!

Chlorek commented 4 years ago

I am just now working on my first project with Spark and I like its minimalism. As time goes on I will probably find more things, but these are some features I found missing early in development:

These are not deal-breakers, so I'm continuing development, and it's really good so far. However, I would like to add my two cents on the matter of supporting multiple HTTP server solutions: in my (maybe not so popular) opinion, Spark should handle just one HTTP server very well, because, well, it is literally just an HTTP server; let's not make it more complicated than it is.

brixzen commented 4 years ago

Please add an option to disable gzip in staticFiles responses.

sid-ihycq commented 4 years ago

Allowing other embeddable servers would be great!

jlorenzen commented 4 years ago

A little late to the game, but here are some improvements I'd like to suggest. I ran into these hurdles when I used sparkjava to implement a basic REST service that only had a few endpoints. The overall experience was great, and I loved the simplicity of sparkjava.

That's about it. Appreciate all the hard work and if these suggestions sound interesting I think I'd be able to submit some patches if given some direction.

skedastik commented 4 years ago

An option to disable automatic gzip compression based on the presence of a Content-Encoding: gzip response header would be extremely useful.

To wax philosophical for a second, I'm generally opposed to magic in frameworks. This is one of the reasons I gravitated toward Spark in the first place: it's thin, transparent, and almost entirely free of magic. Except for this feature, which has no opt-out or clean workaround of any kind.

Example use case: I have an endpoint that serves as an authenticated gateway to resources in S3. These resources are gzipped for good reason (they consume less storage and less data over the wire). If I want to stream these resources, I'm forced to wrap the InputStream in a GZIPInputStream; otherwise Spark will forcibly zip my resource twice when I include the relevant HTTP header.
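The workaround's effect is easy to reproduce with plain JDK streams, independent of Spark: wrapping the already-gzipped stream in a GZIPInputStream undoes one layer of compression, so when the framework re-compresses on the way out, the client still receives valid, singly-compressed gzip. A standalone round-trip sketch (the class name is invented):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

// Standalone illustration, not Spark code.
class GzipRoundTrip {
    // Compress bytes the way a storage layer (e.g. S3) might store them,
    // or the way a framework re-compresses a response body.
    static byte[] gzip(byte[] plain) {
        try {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            try (GZIPOutputStream gz = new GZIPOutputStream(out)) {
                gz.write(plain);
            }
            return out.toByteArray();
        } catch (IOException e) {
            throw new UncheckedIOException(e); // in-memory streams won't throw
        }
    }

    // Undo one layer of gzip, as the GZIPInputStream workaround does
    // before handing the stream back to the framework.
    static byte[] gunzip(byte[] compressed) {
        try (GZIPInputStream gz =
                 new GZIPInputStream(new ByteArrayInputStream(compressed))) {
            return gz.readAllBytes();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

Double compression (`gzip(gzip(data))`) is what clients receive without the workaround: they must decompress twice, which most HTTP clients will not do.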

RyanSusana commented 4 years ago

@skedastik

I ran into that same issue TODAY. How did you solve it?

skedastik commented 4 years ago

@RyanSusana I posted my (grotesque) workaround on Stack Overflow.

JusticeN commented 3 years ago

A plugin system like Javalin's would make Spark extensible. Then plugins could be created for common tasks like

realkarmakun commented 3 years ago

GraphQL support would be very nice.

grishka commented 3 years ago

A response type transformer, as I've written about in detail in #1181.

sid-ihycq commented 3 years ago

Will 3.0 be released?

Typografikon commented 3 years ago

What about HTTP/2 support, per PR #1183? Also, is there a release plan for 3.0?

lepe commented 1 year ago

> What about HTTP/2 support, per PR #1183? Also, is there a release plan for 3.0?

This is already implemented in the Unofficial Build, along with other features. As far as I know, @perwendel is planning to come back and keep going with this project, but meanwhile I'm merging and fixing what I can in that repository.