I'm playing with Spark (http://sparkjava.com/ — the Java web framework, not Apache Spark).
I find defining routes and filters pleasant and easy, but now I'd like to apply a native servlet filter to my routes, and I can't find a way to do it.
More specifically, I want to use Jetty's DoSFilter (http://www.eclipse.org/jetty/documentation/current/dos-filter.html), which is a servlet filter (as opposed to a Spark Filter). Since Spark uses embedded Jetty, I have no web.xml in which to register the DoSFilter. Spark doesn't expose the server instance either, so I can't find an elegant way to register the filter programmatically.
Is there a way to apply a native servlet filter to my routes?
I thought about wrapping the DoSFilter in my own Spark Filter, but that seems like an odd approach.
You can do it like this:
public class App {
    private static final Logger LOG = LoggerFactory.getLogger(App.class);

    public static void main(String[] args) throws Exception {
        ServletContextHandler mainHandler = new ServletContextHandler();
        mainHandler.setContextPath("/base/path");

        // Register servlet filters; SparkFilter runs the Spark routes last in the chain
        Stream.of(
                new FilterHolder(new MyServletFilter()),
                new FilterHolder(new SparkFilter()) {{
                    this.setInitParameter("applicationClass", SparkApp.class.getName());
                }}
        ).forEach(h -> mainHandler.addFilter(h, "/*", null));

        // Optional: GZIP compression for GET responses larger than 512 bytes
        GzipHandler compression = new GzipHandler();
        compression.setIncludedMethods("GET");
        compression.setMinGzipSize(512);
        compression.setHandler(mainHandler);

        // Embedded Jetty server with a bounded thread pool
        Server server = new Server(new ExecutorThreadPool(new ThreadPoolExecutor(10, 200, 60000, TimeUnit.MILLISECONDS,
                new ArrayBlockingQueue<>(200),
                new CustomizableThreadFactory("jetty-pool-"))));
        final ServerConnector serverConnector = new ServerConnector(server);
        serverConnector.setPort(9290);
        server.setConnectors(new Connector[] { serverConnector });
        server.setHandler(compression);

        server.start();
        hookToShutdownEvents(server);
        server.join();
    }

    private static void hookToShutdownEvents(final Server server) {
        LOG.debug("Hooking to JVM shutdown events");
        server.addLifeCycleListener(new AbstractLifeCycle.AbstractLifeCycleListener() {
            @Override
            public void lifeCycleStopped(LifeCycle event) {
                LOG.info("Jetty Server has been stopped");
                super.lifeCycleStopped(event);
            }
        });
        Runtime.getRuntime().addShutdownHook(new Thread() {
            @Override
            public void run() {
                LOG.info("About to stop Jetty Server due to JVM shutdown");
                try {
                    server.stop();
                } catch (Exception e) {
                    LOG.error("Could not stop Jetty Server properly", e);
                }
            }
        });
    }

    /**
     * @implNote {@link SparkFilter} needs to access a public class
     */
    @SuppressWarnings("WeakerAccess")
    public static class SparkApp implements SparkApplication {
        @Override
        public void init() {
            System.setProperty("spring.profiles.active",
                    ApplicationProfile.readProfilesOrDefault("dev").stream().collect(Collectors.joining()));
            AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext(ModocContext.class);
            ctx.registerShutdownHook();
        }
    }
}
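Since the question specifically asks about Jetty's DoSFilter, the same pattern covers it: register a FilterHolder for DoSFilter ahead of the SparkFilter so throttling happens before Spark sees the request. A minimal sketch, assuming the jetty-servlets artifact is on the classpath (the class name `DoSFilteredApp`, the port, and the `maxRequestsPerSec`/`delayMs` values are illustrative, not taken from the original answer):

```java
import java.util.EnumSet;
import javax.servlet.DispatcherType;
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.servlet.FilterHolder;
import org.eclipse.jetty.servlet.ServletContextHandler;
import org.eclipse.jetty.servlets.DoSFilter;
import spark.servlet.SparkFilter;

public class DoSFilteredApp {
    public static void main(String[] args) throws Exception {
        ServletContextHandler handler = new ServletContextHandler();
        handler.setContextPath("/");

        // DoSFilter runs first, limiting request rate per connection
        FilterHolder dos = new FilterHolder(new DoSFilter());
        dos.setInitParameter("maxRequestsPerSec", "25"); // illustrative limit
        dos.setInitParameter("delayMs", "100");          // delay excess requests instead of rejecting
        handler.addFilter(dos, "/*", EnumSet.of(DispatcherType.REQUEST));

        // SparkFilter then dispatches to the Spark routes defined in the SparkApplication
        FilterHolder spark = new FilterHolder(new SparkFilter());
        spark.setInitParameter("applicationClass", App.SparkApp.class.getName());
        handler.addFilter(spark, "/*", EnumSet.of(DispatcherType.REQUEST));

        Server server = new Server(9290);
        server.setHandler(handler);
        server.start();
        server.join();
    }
}
```

DoSFilter takes its tuning through init parameters, so anything you would have put in web.xml maps directly onto `setInitParameter` calls on the holder.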