Spark web framework unit tests


I am working with the Spark web framework and creating a RESTful API. (That is http://sparkjava.com, since there are multiple things out there named "Spark".)

My employer's standards mandate that we write a series of unit tests that will be automatically run once a day to confirm that applications are still up.

Spark is easy to test manually using a tool like Postman, but I have not found any good examples of JUnit tests written against Spark, or even of HTTP requests being made programmatically against it.

Has anyone done this before? Is it possible?

There are 4 answers

Fernando Wasylyszyn (BEST ANSWER)

I had the same requirement as you, and I found a way to make it work. I searched through the Spark source code and found two classes that are useful:

  • SparkTestUtil: this class wraps Apache HttpClient and exposes methods for making different HTTP requests against a local web server (running on localhost), with a customizable port (in the constructor) and relative path (in the request methods); a short usage sketch follows this list
  • ServletTest: it starts a Jetty instance on a local port, with an application context and a relative directory path where a WEB-INF/web.xml descriptor can be found. This web.xml is used to simulate a web application. It then uses SparkTestUtil to make HTTP requests against this simulated application and asserts the results.
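
For orientation, this is roughly how SparkTestUtil is used. Note that it lives in Spark's test sources rather than in the published JAR, so you have to copy it into your own project; the doMethod/UrlResponse names below are taken from that test code, and the port is just an example:

SparkTestUtil http = new SparkTestUtil(4567); // port must match the server under test
SparkTestUtil.UrlResponse response = http.doMethod("GET", "/users", null); // method, path, request body
int status = response.status;  // HTTP status code
String body = response.body;   // response body as a String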

This is what I did: I created a JUnit test class that implements the SparkApplication interface. In its init() method, I create and initialize the "controller" (a class of my application) in charge of answering HTTP requests. In a method annotated with @BeforeClass, I initialize the Jetty instance, using a web.xml that refers to the JUnit test class as the SparkApplication, together with a SparkTestUtil:

JUnit test class

package com.test;

import org.eclipse.jetty.server.Connector;
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.server.ServerConnector;
import org.eclipse.jetty.webapp.WebAppContext;
import org.junit.AfterClass;
import org.junit.BeforeClass;

import spark.servlet.SparkApplication;
// SparkTestUtil is copied from Spark's test sources; adjust its import to wherever you place it

public class ControllerTest implements SparkApplication {

    private static SparkTestUtil sparkTestUtil;

    private static Server webServer;

    // the port is arbitrary; pick any free local port
    private static final int PORT = 4567;

    @Override
    public void init() {
         new Controller(...); // the application class that registers the HTTP routes
    }

    @BeforeClass
    public static void beforeClass() throws Exception {
       sparkTestUtil = new SparkTestUtil(PORT);
       webServer = new Server();
       ServerConnector connector = new ServerConnector(webServer);
       connector.setPort(PORT);
       webServer.setConnectors(new Connector[] {connector});
       WebAppContext bb = new WebAppContext();
       bb.setServer(webServer);
       bb.setContextPath("/");
       bb.setWar("src/test/webapp/");
       webServer.setHandler(bb);
       webServer.start();
       (...)
    }

    @AfterClass
    public static void afterClass() throws Exception {
       webServer.stop();
       (...)
    }    

}
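
The class above only wires up the server; an actual test method inside it could then look like this (a sketch, with org.junit.Test and org.junit.Assert imported: "/your/endpoint" stands for whatever route Controller registers, and doMethod/UrlResponse come from the test-source SparkTestUtil):

@Test
public void respondsToGet() throws Exception {
    // "/your/endpoint" is a placeholder for a route registered by Controller
    SparkTestUtil.UrlResponse response = sparkTestUtil.doMethod("GET", "/your/endpoint", null);
    Assert.assertEquals(200, response.status);
}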

src/test/webapp/WEB-INF/web.xml file

<!DOCTYPE web-app PUBLIC
 "-//Sun Microsystems, Inc.//DTD Web Application 2.3//EN"
 "http://java.sun.com/dtd/web-app_2_3.dtd" >

<web-app>
    <display-name>Archetype Created Web Application</display-name>
    <filter>
        <filter-name>SparkFilter</filter-name>
        <filter-class>spark.servlet.SparkFilter</filter-class>
        <init-param>
            <param-name>applicationClass</param-name>
            <param-value>com.test.ControllerTest</param-value>
        </init-param>
    </filter>

    <filter-mapping>
        <filter-name>SparkFilter</filter-name>
        <url-pattern>/*</url-pattern>
    </filter-mapping>
</web-app>

This can be improved, but I think it is a good starting point. Maybe a "spark-test" component could be created?

Hope this is useful for you!

Fernando Wasylyszyn

We have developed a small library that facilitates unit testing of Spark controllers/endpoints.

It is available on GitHub.

Also, version 1.1.3 is published in the Maven Central Repository:

<dependency>
    <groupId>com.despegar</groupId>
    <artifactId>spark-test</artifactId>
    <version>1.1.3</version>
    <scope>test</scope>
</dependency>
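
Based on the library's README, a test then looks roughly like this (a sketch; the SparkServer rule, the get/execute methods, and the package names are taken from the README, so verify them against the version you use):

import static org.junit.Assert.assertEquals;
import static spark.Spark.get;

import org.junit.ClassRule;
import org.junit.Test;

import com.despegar.http.client.GetMethod;
import com.despegar.http.client.HttpResponse;
import com.despegar.sparkjava.test.SparkServer;

import spark.servlet.SparkApplication;

public class SparkServerTest {

    // minimal Spark application under test
    public static class TestApplication implements SparkApplication {
        @Override
        public void init() {
            get("/hello", (request, response) -> "Hello World!");
        }
    }

    @ClassRule
    public static SparkServer<TestApplication> testServer =
            new SparkServer<>(TestApplication.class, 4567);

    @Test
    public void helloEndpointWorks() throws Exception {
        GetMethod get = testServer.get("/hello", false); // false = do not follow redirects
        HttpResponse response = testServer.execute(get);

        assertEquals(200, response.code());
        assertEquals("Hello World!", new String(response.body()));
    }
}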
Pau

Another approach is to create a class that implements Route for each path or route. For example, if you have a route like the following:

get("maintenance/task", (req, response) -> {....}); 

Then replace the (req, response) -> {....} lambda with a class implementing Route.

For example:

public class YourRoute implements Route {
   public Object handle(Request request, Response response) throws Exception {
     ....
   }
}

The route registration then becomes:

get("maintenance/task", new YourRoute()); 

Then you can unit test the YourRoute class using JUnit.
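
For instance, a plain JUnit test for such a class might look like this (a sketch assuming Mockito is on the classpath to stub spark.Request and spark.Response; the parameter name and expected value are hypothetical):

import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import org.junit.Test;

import spark.Request;
import spark.Response;

public class YourRouteTest {

    @Test
    public void handleReturnsExpectedResult() throws Exception {
        Request request = mock(Request.class);
        Response response = mock(Response.class);
        when(request.queryParams("taskId")).thenReturn("42"); // hypothetical query parameter

        Object result = new YourRoute().handle(request, response);

        assertEquals("expected body", result); // expected value depends on YourRoute's logic
    }
}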


x-x

Here is my solution. You just need to add the Apache HttpClient and JUnit dependencies:

<dependency>
    <groupId>org.apache.httpcomponents</groupId>
    <artifactId>httpclient</artifactId>
    <version>4.5.2</version>
</dependency>

import spark.Spark;

public class SparkServer {
    public static void main(String[] args) {
        Spark.port(8888);
        Spark.threadPool(1000, 1000, 60000);
        Spark.get("/ping", (req, res) -> "pong");
        Spark.awaitInitialization(); // block until the embedded server is ready to accept requests
    }
}

import static org.junit.Assert.assertEquals;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

import spark.Spark;

public class SparkTest {

    @Before
    public void setup() {
        SparkServer.main(null); // starts the embedded server on port 8888
    }

    @After
    public void tearDown() throws Exception {
        Thread.sleep(1000); // crude pause to let in-flight requests finish before stopping
        Spark.stop();
    }

    @Test
    public void test() throws IOException {
        CloseableHttpClient httpClient = HttpClients.custom().build();

        HttpGet httpGet = new HttpGet("http://localhost:8888/ping");
        CloseableHttpResponse response = httpClient.execute(httpGet);

        int statusCode = response.getStatusLine().getStatusCode();
        BufferedReader rd = new BufferedReader(
                new InputStreamReader(response.getEntity().getContent()));

        StringBuilder result = new StringBuilder();
        String line;
        while ((line = rd.readLine()) != null) {
            result.append(line);
        }

        assertEquals(200, statusCode);
        assertEquals("pong", result.toString());

        response.close();
        httpClient.close();
    }
}