perwendel / spark

A simple, expressive web framework for Java. Spark has a Kotlin DSL: https://github.com/perwendel/spark-kotlin
Apache License 2.0

Exception mappers not working with 2.8.0 when running on Tomcat #1062

Open kliakos opened 5 years ago

kliakos commented 5 years ago

The issue has to do with http://sparkjava.com/documentation#exception-mapping

I have confirmed that the same code that works fine with 2.7.2 stops working with 2.8.0 when running on Tomcat. Embedded Jetty is fine, though.

Instead of invoking the custom ExceptionHandler, it invokes the default one (spark.http.matching.GeneralError).

UshakovVasilii commented 5 years ago

The Spark.exception method registers handlers on this instance: https://github.com/perwendel/spark/blob/master/src/main/java/spark/Service.java#L88

But SparkFilter, used when running in a servlet container, uses a different instance: https://github.com/perwendel/spark/blob/master/src/main/java/spark/ExceptionMapper.java#L39
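In other words, registration and lookup happen on two different ExceptionMapper instances. A minimal, self-contained sketch of that failure mode (hypothetical classes for illustration, not Spark's actual code):

```java
import java.util.HashMap;
import java.util.Map;

// Simplified model of the bug: handlers registered on one mapper
// instance are invisible to lookups performed on the other one.
class MapperSketch {
    // Stand-ins for the two instances: one reached by Spark.exception(),
    // the other consulted by the servlet (SparkFilter) code path.
    static final MapperSketch ROUTE_INSTANCE = new MapperSketch();
    static final MapperSketch SERVLET_INSTANCE = new MapperSketch();

    private final Map<Class<? extends Exception>, String> handlers = new HashMap<>();

    void map(Class<? extends Exception> type, String handlerName) {
        handlers.put(type, handlerName);
    }

    String lookup(Class<? extends Exception> type) {
        // Falls back to the default handler when nothing is registered here.
        return handlers.getOrDefault(type, "GeneralError (default)");
    }
}

public class TwoInstanceDemo {
    public static void main(String[] args) {
        // Registration goes to the instance behind Spark.exception()...
        MapperSketch.ROUTE_INSTANCE.map(RuntimeException.class, "CustomHandler");
        // ...but on Tomcat the servlet path consults the other instance and misses.
        System.out.println(MapperSketch.SERVLET_INSTANCE.lookup(RuntimeException.class)); // GeneralError (default)
        System.out.println(MapperSketch.ROUTE_INSTANCE.lookup(RuntimeException.class));   // CustomHandler
    }
}
```

Because the servlet path consults the second instance, a custom handler registered through the first is never found, and the default GeneralError path runs instead.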

The last project activity was in September; a pull request for this problem may already exist.

Workaround:

  // Requires spark.ExceptionHandler, spark.ExceptionHandlerImpl,
  // spark.ExceptionMapper, spark.Request and spark.Response imports.
  // Registers the handler on the servlet-side ExceptionMapper instance,
  // which Spark.exception() does not reach.
  public synchronized <T extends Exception> void servletException(Class<T> exceptionClass,
      ExceptionHandler<? super T> handler) {
    // Wrap the handler in an ExceptionHandlerImpl so the mapper can store it.
    ExceptionHandlerImpl<T> wrapper = new ExceptionHandlerImpl<T>(exceptionClass) {
      @Override
      public void handle(T exception, Request request, Response response) {
        handler.handle(exception, request, response);
      }
    };

    ExceptionMapper.getServletInstance().map(exceptionClass, wrapper);
  }

and register each handler through both exception and servletException:

    exception(Exception.class, new InternalServerErrorHandler());
    servletException(Exception.class, new InternalServerErrorHandler());
cynthux commented 5 years ago

Is there any update on this issue? I would like to use Spark 2.8.0 because of the new awaitStop() method, but I can't because of this issue. I wasn't able to get the workaround working, so I moved back to 2.7.2.

kliakos commented 4 years ago

Any chance this will be fixed?

lglref32team2 commented 4 years ago

The workaround no longer works for me in version 2.9.1. Any chance of fixing this, since it is such an advertised feature?

fireandfuel commented 4 years ago

The workaround still works on 2.9.1, but it would be nice if this were fixed soon. At least it still works after I ported the workaround to Kotlin.

MikeMitterer commented 4 years ago

The problem is based on these changes: (@mikosik) https://github.com/perwendel/spark/commit/376dcc80713827ab17c1f1dac4b6b7b9091ba57a

nanderson87 commented 4 years ago

Hello is there any progress regarding this problem?

msidelnik commented 3 years ago

Hello! Is there any update on this issue? Will it be fixed in version 2.9.2?

Lloyd-Pottiger commented 3 years ago

I would like to fix this issue @perwendel

IP696 commented 3 years ago

Hello, I am really looking forward to solving this bug

MartinAndu commented 2 years ago

Hi, this issue still shows up even with version 2.9.3. Is there any update?