This blog annotates the Jetty 7 example web application (also updated for jetty-9) that uses the Jetty asynchronous HTTP client and the proposed suspendable Servlet 3.0 API to call an eBay RESTful web service. The technique combines the Jetty asynchronous HTTP client with the Jetty server's ability to suspend servlet processing, so that threads are not held while waiting for REST responses. Threads can therefore handle many more requests, and web applications using this technique should see at least a ten-fold increase in capacity.

 [Screenshot: four iframes showing the synchronous and asynchronous demonstration servlets side by side]

The screenshot above shows four iframes calling either the synchronous or the asynchronous demonstration servlet, with the following results:

Synchronous Call, Single Keyword
A request to look up eBay auctions with the keyword “kayak” is handled by the synchronous implementation. The call takes 261ms and the servlet thread is blocked for the entire time. A server with 100 threads in a pool would be able to handle about 383 requests per second.
Asynchronous Call, Single Keyword
A request to look up eBay auctions with the keyword “kayak” is handled by the asynchronous implementation. The call takes 254ms, but the servlet request is suspended, so the request thread is held for only 5ms. A server with 100 threads in a pool would be able to handle about 20,000 requests per second (if not constrained by other limitations).
Synchronous Call, Three Keywords
A request to look up eBay auctions with the keywords “mouse”, “beer” and “gnome” is handled by the synchronous implementation. Three calls are made to eBay in series, each taking approximately 306ms, for a total time of 917ms, and the servlet thread is blocked for the entire time. A server with 100 threads in a pool would be able to handle only about 109 requests per second!
Asynchronous Call, Three Keywords
A request to look up eBay auctions with the keywords “mouse”, “beer” and “gnome” is handled by the asynchronous implementation. The three calls are made to eBay in parallel, each taking approximately 300ms, for a total time of 453ms, and the servlet request is suspended, so the request thread is held for only 7ms. A server with 100 threads in a pool would be able to handle about 14,000 requests per second (if not constrained by other limitations).

These results show that asynchronous handling of RESTful requests can dramatically improve both page load time and server capacity by avoiding thread starvation.
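
To see where the capacity figures come from: throughput is bounded by the number of pooled threads divided by how long each request holds a thread. Here is a small illustrative calculation using the single-keyword timings measured above (the numbers are the ones quoted in the results, plugged into this simple model):

    public class CapacityEstimate
    {
        public static void main(String[] args)
        {
            int threads = 100;
            double syncHoldSeconds = 0.261;   // synchronous: thread blocked for the whole REST call
            double asyncHoldSeconds = 0.005;  // asynchronous: thread held only to suspend/resume

            // capacity ~= threads / time each request holds a thread
            System.out.printf("sync:  %.0f requests/s%n", threads / syncHoldSeconds);   // ~383
            System.out.printf("async: %.0f requests/s%n", threads / asyncHoldSeconds);  // ~20,000
        }
    }
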
The code for the example asynchronous servlet is available here (updated for jetty-9) and works as follows:

  1. The servlet receives the request, which is detected as the first dispatch, so the request is suspended and a list to accumulate results is added as a request attribute:

    // First dispatch: suspend the request and set up somewhere to accumulate results
    if (request.isInitial() || request.getAttribute(CLIENT_ATTR) == null){
      String[] keywords = request.getParameter(ITEMS_PARAM).split(",");
      final List<Map<String, String>> results = Collections.synchronizedList(new ArrayList<Map<String, String>>());
      final AtomicInteger count = new AtomicInteger(keywords.length);
      request.suspend();
      request.setAttribute(CLIENT_ATTR, results);

    The request is suspended before the searches are started, to avoid a race in which a search completes (and tries to resume the request) before the request has actually been suspended.
    In the Jetty-9 version, we simply check for results==null to determine whether we are processing the initial dispatch, and the standard Servlet 3.0 startAsync() method is used to suspend the request:

    // If no results, this must be the first dispatch, so send the REST request(s)
    if (results==null) {
        final Queue<Map<String, String>> resultsQueue = new ConcurrentLinkedQueue<>();
        request.setAttribute(RESULTS_ATTR, results=resultsQueue);
        final AsyncContext async = request.startAsync();
        async.setTimeout(30000);
        ...
  2. After suspending, the servlet creates and sends an asynchronous HTTP exchange for each keyword:

      for (final String item:keywords){
        ContentExchange exchange = new ContentExchange()
        {
          protected void onResponseComplete() throws IOException
          {
            // see step 3 below
          }
        };
        exchange.setMethod("GET");
        exchange.setURL("http://open.api.ebay.com/shopping?MaxEntries=5&appid=" +
                        ...
        _client.send(exchange);
      }

    The API for the Jetty HTTP client exchanges was inspired by the callback style of JavaScript XHR.
    For Jetty-9 the new asynchronous HTTP client API is used (a sketch of the AsyncRestRequest listener it relies on is shown after this list):

    for (final String item:keywords) {
      _client.newRequest(restURL(item)).method(HttpMethod.GET).send(
        new AsyncRestRequest() {
          @Override
          void onAuctionFound(Map<String,String> auction) {
            resultsQueue.add(auction);
          }
          @Override
          void onComplete() {
            if (outstanding.decrementAndGet()<=0)
              async.dispatch();
          }
        });
    }
  3. All the REST requests are handled in parallel by the eBay servers and, as each completes, the callback on the exchange object is invoked. The code (omitted above, shown below) extracts the auction information from the JSON response and adds it to the results list. The count of expected responses is then decremented and, when it reaches 0, the suspended request is resumed:
    protected void onResponseComplete() throws IOException
    {
        Map query = (Map) JSON.parse(this.getResponseContent());
        Object[] auctions = (Object[]) query.get("Item");
        if (auctions != null)
        {
            for (Object o : auctions)
                results.add((Map) o);
        }
        if (count.decrementAndGet() <= 0)
            request.resume();
    }

    For Jetty-9 the standard Servlet API dispatch() is used instead of resume(), and the adding of results is moved out of this method (into onAuctionFound above), but it is essentially the same:

    @Override
    void onComplete()
    {
        if (outstanding.decrementAndGet() <= 0)
            async.dispatch();
    }
  4. After being resumed (dispatched), the request is handled by the servlet again. This time it is not the initial dispatch and results are available, so the results are retrieved from the request attribute and normal servlet-style code is used to generate the response:
    List<Map<String, String>> results = (List<Map<String, String>>) request.getAttribute(CLIENT_ATTR);
    response.setContentType("text/html");
    PrintWriter out = response.getWriter();
    out.println("<html><head><style type='text/css'>img:hover {height:75px}</style></head><body><small>");
    for (Map<String, String> m : results){
      out.print("<a href=""+m.get("ViewItemURLForNaturalSearch")+"">");
      ...
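
The AsyncRestRequest used in the Jetty-9 snippet above is a small listener class from the example whose implementation is not shown in this post. The following is a rough sketch of what it might look like, assuming it builds on Jetty-9's BufferingResponseListener and the Jetty JSON utility; the real class in the example may well differ in detail:

    import java.util.Map;

    import org.eclipse.jetty.client.api.Result;
    import org.eclipse.jetty.client.util.BufferingResponseListener;
    import org.eclipse.jetty.util.ajax.JSON;

    abstract class AsyncRestRequest extends BufferingResponseListener
    {
        // Called once for each auction found in the eBay JSON response
        abstract void onAuctionFound(Map<String, String> auction);

        // Called once the whole response has been processed
        abstract void onComplete();

        @SuppressWarnings("unchecked")
        @Override
        public void onComplete(Result result)
        {
            if (result.isSucceeded())
            {
                // Parse the buffered JSON body and pull out the "Item" array
                Map<String, Object> query = (Map<String, Object>) JSON.parse(getContentAsString());
                Object[] auctions = (Object[]) query.get("Item");
                if (auctions != null)
                {
                    for (Object o : auctions)
                        onAuctionFound((Map<String, String>) o);
                }
            }
            // Always signal completion so the servlet can be dispatched
            onComplete();
        }
    }
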

This example shows how the Jetty asynchronous client can easily be combined with the asynchronous servlets of Jetty-8/9 (or the Continuations of Jetty-7) to produce very scalable web applications.
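
One detail not shown in the snippets above is where the shared _client comes from and how asynchronous support is enabled for the servlet. The following is a minimal sketch, assuming the Jetty-9 HttpClient and Servlet 3.0 annotations; the URL pattern here is illustrative rather than taken from the example:

    import javax.servlet.ServletException;
    import javax.servlet.annotation.WebServlet;
    import javax.servlet.http.HttpServlet;

    import org.eclipse.jetty.client.HttpClient;

    // asyncSupported=true is required before request.startAsync() may be called
    @WebServlet(urlPatterns = "/testAsync", asyncSupported = true)
    public class AsyncRestServlet extends HttpServlet
    {
        // One shared, thread-safe client reused by every request
        private HttpClient _client;

        @Override
        public void init() throws ServletException
        {
            try
            {
                _client = new HttpClient();
                _client.start();   // the client runs its own selector and worker threads
            }
            catch (Exception e)
            {
                throw new ServletException(e);
            }
        }

        @Override
        public void destroy()
        {
            try
            {
                _client.stop();
            }
            catch (Exception e)
            {
                // ignore failures during shutdown
            }
        }

        // doGet() is implemented as described in steps 1-4 above
    }
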


7 Comments

Fredrik Tyboni · 30/09/2008 at 12:16

Hi Greg,
This sounds really interesting, especially for a small project I’ve been working on. I’ve basically hacked together an ESI-like servlet filter that, in conjunction with request suspension and the asynchronous HTTP client, could be used to create a stack of cache servers with just Jetty. I’ll give it a spin and see how it works out.

Valery Silaev · 29/10/2008 at 12:01

Greg, Probably my comment is not directly connected to your post, but it shows Jetty 7 scalability with some visual metrics and explanation of results — http://flex.sys-con.com/node/720304 Disclaimer: obviously, the BlazeDS streaming endpoint mentioned in the article was created by my company 🙂

Troy Kelley · 07/08/2009 at 17:35

Greg,
The link to your servlet source code returns a 404. Great write-up!

Greg Wilkins · 07/08/2009 at 23:46

The code has been refactored a little bit.
The latest is for jetty-7 and can be seen
at http://svn.codehaus.org/jetty/jetty/trunk/example-async-rest-webapp/

imh4rdc0r3 · 20/03/2010 at 16:19

you said, —snip– the servlet request is suspended so the request thread is held for only 2ms. A server with a 100 threads in a pool would be able to handle 5000 requests per second (if not constrained by other limitations) –snip–
if a request only takes .002 seconds, then 1 thread would be able to handle 1/.002 = 500 requests/second
so wouldn’t 100 be able to handle 50,000 requests/second and not 5,000?
thanks.

subhash · 08/05/2010 at 04:41

Recently I designed a chat application with BlazeDS & Jetty.
My application’s bandwidth use is very high; how can I reduce connections?
I wrote only plain classes, configured them in Remote-config.xml, and call remoteObjects in Flex.
While launching my application (for each client) it pulls data from the database; once it has finished pulling data from the database it gets the chat messages from Admin to client.
Can anybody please suggest a few techniques to reduce the bandwidth consumed?
Thanks,
subhash

Webtide Blogs · 18/04/2013 at 23:54

[…] blog is an update for jetty-9 of one published for Jetty 7 in 2008 as an example web application that uses Jetty asynchronous HTTP client and the asynchronous […]
