The 6.0.0alpha3 release of Jetty is now available
and provides a Servlet 2.4 server in a 400k jar, with only 140k of dependencies (2.6M more if you want JSP!).
But as well as being small, fast, clean and sexy, Jetty 6 supports a new feature
called Continuations that will allow scalable AJAX applications to be built, with
threadless waiting for asynchronous events.

Thread per connection

One of the main challenges in building a scalable servlet server is how to
handle threads and connections. The traditional IO model of Java associates a thread
with every TCP/IP connection. If you have a few very active connections, this model can
scale to a very high number of requests per second.
However, the traffic profile typical of many web applications is many persistent HTTP
connections that are mostly idle while users read pages or search for the next link
to click. With such profiles, the thread-per-connection model has problems scaling
to the thousands of threads required to support thousands of users on large-scale deployments.
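
As a rough illustration (my own sketch of the classic blocking java.io pattern, not Jetty code), a thread-per-connection server dedicates one thread to each accepted socket for its whole life, idle or not:

import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;

public class ThreadPerConnectionServer
{
    public static void main(String[] args) throws IOException
    {
        ServerSocket server = new ServerSocket(8080);
        while (true)
        {
            // accept() blocks until a client connects.
            final Socket connection = server.accept();

            // One thread is dedicated to this connection for its entire life,
            // even while it sits idle between requests.
            new Thread(new Runnable()
            {
                public void run()
                {
                    try
                    {
                        handleRequests(connection); // read/write until the connection closes
                    }
                    catch (IOException e)
                    {
                        // connection dropped
                    }
                }
            }).start();
        }
    }

    static void handleRequests(Socket connection) throws IOException
    {
        // Parse HTTP requests from connection.getInputStream() and
        // write responses to connection.getOutputStream().
    }
}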

Thread per request

The NIO libraries can help, as they allow asynchronous IO to be used so that threads are
allocated to connections only while requests are being processed. When the connection is
idle between requests, the thread can be returned to a thread pool and the
connection can be added to an NIO select set to detect new requests. This thread-per-request
model allows much greater scaling of connections (users) at the expense of a
lower maximum request rate for the server as a whole (in Jetty 6 this cost
has been significantly reduced).
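
In outline, that looks something like the sketch below (a simplified illustration, not Jetty's actual connector code): idle connections sit in an NIO Selector, and a pooled thread is borrowed only while a request is being handled.

import java.io.IOException;
import java.net.InetSocketAddress;
import java.nio.channels.SelectionKey;
import java.nio.channels.Selector;
import java.nio.channels.ServerSocketChannel;
import java.nio.channels.SocketChannel;
import java.util.Iterator;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class ThreadPerRequestServer
{
    public static void main(String[] args) throws IOException
    {
        ExecutorService threadPool = Executors.newFixedThreadPool(50);
        Selector selector = Selector.open();

        ServerSocketChannel server = ServerSocketChannel.open();
        server.socket().bind(new InetSocketAddress(8080));
        server.configureBlocking(false);
        server.register(selector, SelectionKey.OP_ACCEPT);

        while (true)
        {
            // Wait for activity on any of the (possibly thousands of) idle connections.
            selector.select();
            Iterator<SelectionKey> keys = selector.selectedKeys().iterator();
            while (keys.hasNext())
            {
                SelectionKey key = keys.next();
                keys.remove();

                if (key.isAcceptable())
                {
                    // New connection: register it with the selector; no thread is allocated yet.
                    SocketChannel channel = ((ServerSocketChannel)key.channel()).accept();
                    channel.configureBlocking(false);
                    channel.register(selector, SelectionKey.OP_READ);
                }
                else if (key.isReadable())
                {
                    // A request has arrived: borrow a pooled thread only for its duration.
                    key.interestOps(0); // stop selecting on this channel while it is handled
                    final SocketChannel channel = (SocketChannel)key.channel();
                    threadPool.execute(new Runnable()
                    {
                        public void run()
                        {
                            handleRequest(channel); // parse request, write response, return the thread
                        }
                    });
                }
            }
        }
    }

    static void handleRequest(SocketChannel channel)
    {
        // Read the HTTP request from the channel and write a response.
    }
}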

AJAX polling problem

But there is a new problem. The advent of AJAX as a web application model is
significantly changing the traffic profile seen on the server side. Because the
server cannot push asynchronous events to the client over plain HTTP, the AJAX client
must poll the server for events. To avoid a busy polling loop, AJAX servers will
often hold onto a poll request until either there is an event or a timeout occurs.
Thus an idle AJAX application will
have an outstanding request waiting on the server which can be used to send a response to the
client the instant an asynchronous event occurs.
This is a great technique, but it breaks the thread-per-request model, because
now every client will have a request outstanding in the server. Thus the server again
needs to have one or more threads for every client and again there are problems scaling
to thousands of simultaneous users.
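
To see why, consider how such a long poll must be written without continuations. The hypothetical servlet below (not the Jetty chat demo; the publish method is illustrative) parks the request's thread in wait(), so every idle client pins a server thread for the duration of its poll:

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class BlockingPollServlet extends HttpServlet
{
    private final Object lock = new Object();
    private String pendingEvent;

    protected void doGet(HttpServletRequest request, HttpServletResponse response)
        throws ServletException, IOException
    {
        String event;
        synchronized (lock)
        {
            if (pendingEvent == null)
            {
                try
                {
                    // The request's thread is held here, for up to 30 seconds,
                    // for every client that is merely waiting for something to happen.
                    lock.wait(30000);
                }
                catch (InterruptedException e)
                {
                    Thread.currentThread().interrupt();
                }
            }
            event = pendingEvent;
            pendingEvent = null;
        }
        response.setContentType("text/plain");
        response.getWriter().println(event == null ? "" : event);
    }

    // Called by whatever produces events; wakes any poll request waiting above.
    public void publish(String event)
    {
        synchronized (lock)
        {
            pendingEvent = event;
            lock.notifyAll();
        }
    }
}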

Jetty 6 Continuations

The solution is Continuations, a new feature introduced in Jetty 6. A Java Filter or
Servlet that is handling an AJAX request may now obtain a Continuation object
that can be used to effectively suspend the request and free the current
thread. The request is resumed after a timeout, or immediately if the resume method
is called on the Continuation object. In the Jetty 6 chat room demo, the following
code handles the AJAX poll for events:

private void doGetEvents(HttpServletRequest request, AjaxResponse response)
{
    Member member = (Member)chatroom.get(request.getSession(true).getId());

    // Get an existing Continuation, or create a new one only if there are no events.
    boolean create = !member.hasEvents();
    Continuation continuation = ContinuationSupport.getContinuation(request, create);
    if (continuation != null)
    {
        if (continuation.isNew())
            // Register it with the chat room to receive async events.
            member.setContinuation(continuation);

        // Wait for the next event. The request may be suspended here.
        Object event = continuation.getEvent(timeoutMS);
    }

    // Send any events that have arrived.
    member.sendEvents(response);

    // Signal for a new poll.
    response.objectResponse("poll", "");
}

When another user says something in the chat room, the event is delivered to
each member by another thread calling the method:

class Member
{
    public synchronized void addEvent(Event event)
    {
        _events.add(event);
        if (_continuation != null)
            // Resume any request suspended in getEvents.
            _continuation.resume(event);
    }
    ...
}
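
For completeness, the rest of the Member class might look roughly like this. This is only my reconstruction from the calls made in doGetEvents above; the _events list, the hasEvents/setContinuation/sendEvents bodies and the "chat" response key are assumptions, not the demo source:

class Member
{
    // Fields implied by the snippets above (assumed).
    private final List _events = new ArrayList();
    private Continuation _continuation;

    public synchronized boolean hasEvents()
    {
        return !_events.isEmpty();
    }

    public synchronized void setContinuation(Continuation continuation)
    {
        _continuation = continuation;
    }

    public synchronized void sendEvents(AjaxResponse response)
    {
        // Write any queued events to the AJAX response and clear the queue.
        for (int i = 0; i < _events.size(); i++)
            response.objectResponse("chat", _events.get(i).toString());
        _events.clear();
        _continuation = null;
    }

    // addEvent(Event) as shown above.
}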

How it works

Behind the scenes, Jetty has to be a bit sneaky to work around Java and the Servlet specification,
as there is no mechanism in Java to suspend handling of a request and resume it later
without holding onto a thread.
The first time the request handler calls continuation.getEvent(timeoutMS), a
RetryRequest runtime exception is thrown. This exception propagates out of all the request
handling code and is caught by Jetty and handled specially.
Instead of producing an error response, Jetty places the request on a timeout queue and returns the
thread to the thread pool.
When the timeout expires, or when another thread calls continuation.resume(event),
the request is retried. This time, when continuation.getEvent(timeoutMS)
is called, either the event is returned or null is returned to indicate a timeout.
The request handler then produces a response as it normally would.

Thus this mechanism uses the stateless nature of HTTP request handling to simulate a
suspend and resume. The runtime exception allows the thread to legally exit the
request handler, any upstream filters/servlets and any associated security context.
The retry of the request re-enters the filter/servlet chain and any security context
and continues normal handling at the point of continuation.
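
In heavily simplified form, the container side of this trick can be sketched as follows. The RetryRequest, Dispatcher and timeout-queue shapes below are my own illustration of the described behaviour, not Jetty's source:

import java.util.Timer;
import java.util.TimerTask;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Illustrative only: a made-up RetryRequest exception carrying the poll timeout.
class RetryRequest extends RuntimeException
{
    final long timeoutMs;
    RetryRequest(long timeoutMs) { this.timeoutMs = timeoutMs; }
}

// Illustrative only: whatever dispatches a request into the filter/servlet chain.
interface Dispatcher
{
    void dispatch(HttpServletRequest request, HttpServletResponse response);
}

class RetryingHandler
{
    private final Timer timeoutQueue = new Timer(true);
    private final Dispatcher servletChain;

    RetryingHandler(Dispatcher servletChain) { this.servletChain = servletChain; }

    void handle(final HttpServletRequest request, final HttpServletResponse response)
    {
        try
        {
            // May throw RetryRequest out of the application's getEvent() call on the first pass.
            servletChain.dispatch(request, response);
        }
        catch (RetryRequest retry)
        {
            // No error response is produced. The request is parked on a timeout
            // queue and this thread goes back to the pool; a resume() would cause
            // handle() to be called again immediately rather than at the timeout.
            timeoutQueue.schedule(new TimerTask()
            {
                public void run()
                {
                    handle(request, response); // retried: getEvent() now returns the event or null
                }
            }, retry.timeoutMs);
        }
    }
}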

Furthermore, the Continuation API is portable. If it is run on a non-Jetty-6 server,
it will simply use wait/notify to block the request in getEvent. If Continuations prove
to work as well as I hope, I plan to propose them as part of the Servlet 3.0 JSR.
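
To make that fallback concrete, a blocking Continuation for a plain servlet container can be little more than a wait/notify wrapper, along these lines (my own simplified sketch, assuming the isNew/getEvent/resume methods used above, not Jetty's actual class):

class WaitingContinuation
{
    private Object _event;
    private boolean _new = true;

    public synchronized boolean isNew()
    {
        boolean wasNew = _new;
        _new = false;
        return wasNew;
    }

    public synchronized Object getEvent(long timeoutMs)
    {
        if (_event == null)
        {
            try
            {
                // On a plain servlet container the request thread simply blocks here
                // until resume() is called or the timeout expires.
                wait(timeoutMs);
            }
            catch (InterruptedException e)
            {
                Thread.currentThread().interrupt();
            }
        }
        return _event; // null indicates a timeout
    }

    public synchronized void resume(Object event)
    {
        _event = event;
        notifyAll();
    }
}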


3 Comments

Anonymous · 03/10/2006 at 18:44

Expect the Servlet 3.0 JSR to come out early!

How does Jetty handle NIO? Does Jetty use Grizzly to do that?

Jan Bartel · 23/11/2006 at 00:55

Jetty has its own NIO layer. We’ve also been working with some of the
Glassfish team to make Grizzly work in Jetty. I believe Jean Francois Arcand
blogged about it recently
(http://weblogs.java.net/blog/jfarcand/archive/2006/11/jezzly_or_grett_1.html).

The upshot is, you’re spoilt for choice as far as the IO layer goes! 🙂

Jetty, SPDY and HAProxy | Webtide Blogs · 23/10/2012 at 16:00

[…] has a long story of staying cutting edge when it is about web features and network protocols. Jetty first implemented web continuations (2005) as a portable library, deployed them successfully for years to customers, […]
