evernat opened this issue 4 years ago
I recently tried to upgrade Spark in my project and encountered the same issue. The only solution I found was to change the path of the SparkFilter and add a common prefix for all Spark endpoints, and another for the others (which obviously generates some work and is not really viable in a bigger system).

Can you tell me how you managed to work around this issue with AfterAfterFilters? When I tried it, it failed: to set a new body in AfterAfterFilters, the body must be different from null (https://github.com/perwendel/spark/blob/10146d7420169860391c659d31c21b85f9b715b9/src/main/java/spark/http/matching/AfterAfterFilters.java#L59), while the chain in MatcherFilter is called only if the body is null. So the two conditions seem mutually exclusive.
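For reference, the two guards look roughly like this (a condensed paraphrase of the linked sources, with names shortened; not the exact code):

    // In AfterAfterFilters (paraphrased): a body produced by an after-after
    // filter is only picked up when it is non-null.
    String bodyAfterFilter = context.response().body();
    if (bodyAfterFilter != null) {
        body.set(bodyAfterFilter);
    }

    // In MatcherFilter (paraphrased): the servlet chain is only reached when
    // NO body was set, so setting a body in an after-after filter and reaching
    // chain.doFilter are mutually exclusive.
    if (body.isSet()) {
        body.serializeTo(httpResponse, serializerChain, httpRequest);
    } else if (chain != null) {
        chain.doFilter(httpRequest, httpResponse);
    }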
@Soletaken I thought it was possible using AfterAfterFilters. It seems I was wrong and it's not possible. Sorry. In my case I worked around this issue by writing a servlet filter:
// https://github.com/perwendel/spark/issues/1148
import java.io.IOException;

import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;

public class FixSparkIssue1148Filter implements Filter {
    @Override
    public void init(final FilterConfig filterConfig) throws ServletException {
        // nothing to initialize
    }

    @Override
    public void doFilter(final ServletRequest request, final ServletResponse response, final FilterChain chain)
            throws IOException, ServletException {
        final HttpServletRequest httpRequest = (HttpServletRequest) request;
        final String contentType = httpRequest.getHeader("Content-Type");
        if (contentType == null || !contentType.startsWith("application/json")) {
            // not a JSON request for Spark: bypass the rest of the filter chain
            // (including SparkFilter) and forward directly to the target servlet
            final String path = httpRequest.getRequestURI().substring(httpRequest.getContextPath().length());
            request.getRequestDispatcher(path).forward(request, response);
        } else {
            // JSON request: let it continue down the chain to SparkFilter
            chain.doFilter(request, response);
        }
    }

    @Override
    public void destroy() {
        // nothing to clean up
    }
}
and by adding a filter/filter-mapping in my web.xml for the FixSparkIssue1148Filter, before the filter/filter-mapping of the SparkFilter. But for this filter to work, all HTTP requests going to Spark must have a Content-Type header starting with application/json. I was lucky enough that this was already the case, so I did not have to add a prefix for all Spark endpoints.
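For reference, the web.xml declarations could look roughly like this (the package names and the applicationClass value are placeholders; adapt them to your application):

    <!-- declared before SparkFilter, so it runs first -->
    <filter>
        <filter-name>FixSparkIssue1148Filter</filter-name>
        <filter-class>com.example.FixSparkIssue1148Filter</filter-class>
    </filter>
    <filter>
        <filter-name>SparkFilter</filter-name>
        <filter-class>spark.servlet.SparkFilter</filter-class>
        <init-param>
            <param-name>applicationClass</param-name>
            <param-value>com.example.MySparkApplication</param-value>
        </init-param>
    </filter>

    <!-- mappings with the same url-pattern are applied in declaration order -->
    <filter-mapping>
        <filter-name>FixSparkIssue1148Filter</filter-name>
        <url-pattern>/*</url-pattern>
    </filter-mapping>
    <filter-mapping>
        <filter-name>SparkFilter</filter-name>
        <url-pattern>/*</url-pattern>
    </filter-mapping>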
#909 and 8dcf808 introduced another side effect in Spark 2.7.0 and later, besides #977.
In my case, SparkFilter is used in an external container (Tomcat). Some servlets are defined in web.xml without routes in Spark, for example "/login".
With Spark 2.6.0, when "/login" did not match any Spark route, chain.doFilter(httpRequest, httpResponse) was called because externalContainer==true: https://github.com/perwendel/spark/blob/2.6.0/src/main/java/spark/http/matching/MatcherFilter.java#L190. My servlet for "/login" was found at the end of the filter chain and used (good).

With Spark 2.7.0 and later, the 404 body is set without checking !externalContainer, so "/login" returns "404 not found" from Spark and chain.doFilter(httpRequest, httpResponse) is not called. So I have "404 not found" instead of the result of my servlet (bad). In fact, chain.doFilter(httpRequest, httpResponse) can never be called in that code ~except by reverting "404 not found" using AfterAfterFilters~: https://github.com/perwendel/spark/blob/2.9.1/src/main/java/spark/http/matching/MatcherFilter.java#L196
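To make the difference concrete, here is a condensed paraphrase of the end of MatcherFilter.doFilter in the two versions (simplified from the linked sources; not the exact code):

    // Spark 2.6.0 (paraphrased): the 404 body is only set for the embedded
    // server, so in an external container the request falls through to the
    // servlet chain.
    if (body.notSet() && !externalContainer) {
        httpResponse.setStatus(404);
        body.set(NOT_FOUND);
    }
    if (body.isSet()) {
        body.serializeTo(httpResponse, serializerChain, httpRequest);
    } else if (chain != null) {
        chain.doFilter(httpRequest, httpResponse); // reached when externalContainer == true
    }

    // Spark 2.7.0 and later (paraphrased): the 404 body is set unconditionally
    // when no route matched, so body.isSet() is always true afterwards and the
    // chain.doFilter branch is unreachable.
    if (body.notSet()) {
        httpResponse.setStatus(404);
        body.set(NOT_FOUND);
    }
    if (body.isSet()) {
        body.serializeTo(httpResponse, serializerChain, httpRequest);
    } else if (chain != null) {
        chain.doFilter(httpRequest, httpResponse); // never reached
    }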