Examples of this kind of application are financial applications that provide stock price updates, online games, or position tracking systems for fast-moving objects (think of a motorbike on a circuit).
What these applications have in common is that they generate a high rate of server-side events, say on the order of 10 events per second.
At such an event rate, you start to wonder whether it is really appropriate to send every event to clients (and therefore 10 events/s), or whether it is better to save bandwidth and computing resources by sending events to clients at a lower rate.
For example, even if a stock price changes 10 times a second, delivering changes once a second is probably enough for a web application meant to be used by humans: I would be surprised if a person could make any use of (or even see and remember) a stock price that was updated 2 tenths of a second ago and that has meanwhile already changed 2 or 3 times. (Disclaimer: I am not involved in financial applications; I am just making a hypothesis here for the sake of explaining the concept.)
The CometD project provides lazy channels to implement this kind of message flow control (it also provides other message flow control mechanisms, which I'll cover in a future entry).
A channel can be marked as lazy during its initialization on the server side:
BayeuxServer bayeux = ...;
bayeux.createIfAbsent("/stock/GOOG", new ConfigurableServerChannel.Initializer()
{
    public void configureChannel(ConfigurableServerChannel channel)
    {
        channel.setLazy(true);
    }
});
Any message sent to that channel will be marked as a lazy message and will be delivered lazily: either when a timeout (the max lazy timeout) expires, or when the long poll returns, whichever comes first.
It is possible to configure the duration of the max lazy timeout, for example to be 1 second, in the CometD servlet configuration.
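For example, assuming the standard CometD servlet is deployed in web.xml, the max lazy timeout can be set via the maxLazyTimeout init parameter (value in milliseconds):

```xml
<servlet>
    <servlet-name>cometd</servlet-name>
    <servlet-class>org.cometd.server.CometdServlet</servlet-class>
    <init-param>
        <param-name>maxLazyTimeout</param-name>
        <param-value>1000</param-value>
    </init-param>
</servlet>
```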
With this configuration, lazy channels will have a max lazy timeout of 1000 ms and messages published to a lazy channel will be delivered in a batch once a second.
Assuming, for example, that you have a steady rate of 8 messages per second arriving at the server to update the GOOG stock quote, you will deliver a batch of 8 messages to clients every second, instead of delivering 1 message every 125 ms.
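To build some intuition for this batching behavior, here is a minimal, CometD-independent sketch. The LazyBatcher class is hypothetical (it is not part of the CometD API): it queues incoming messages and flushes them at most once per max lazy timeout, mimicking the batching described above.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of lazy batching, NOT the CometD implementation:
// messages accumulate in a queue and are flushed in a single batch
// at most once per maxLazyTimeout.
class LazyBatcher
{
    private final long maxLazyTimeoutMs;
    private final List<String> queue = new ArrayList<>();
    private final List<List<String>> batches = new ArrayList<>();
    private long lastFlushMs;

    LazyBatcher(long maxLazyTimeoutMs)
    {
        this.maxLazyTimeoutMs = maxLazyTimeoutMs;
    }

    // Called for each incoming message; nowMs is the current time in ms.
    void publish(String message, long nowMs)
    {
        queue.add(message);
        if (nowMs - lastFlushMs >= maxLazyTimeoutMs)
        {
            // Timeout expired: deliver all queued messages as one batch.
            batches.add(new ArrayList<>(queue));
            queue.clear();
            lastFlushMs = nowMs;
        }
    }

    List<List<String>> batches()
    {
        return batches;
    }
}
```

Feeding it 8 messages per second (one every 125 ms) for 2 seconds, with a 1000 ms timeout, produces 2 batches of 8 messages each, matching the arithmetic above.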
Lazy channels do not by themselves reduce bandwidth consumption (no messages are discarded), but combined with a gzip filter that compresses the output they allow bandwidth savings, because more messages are compressed in each delivery (in general, compressing one larger text is more effective than compressing many small ones).