Clustering can be a bit of a mess. Each server (and, in Node's case, each process on the server) has its own pool of client connections. Keeping multiple processes in sync so that everything stays real-time can be a scaling headache. This is a small prototype that uses a MongoDB capped collection as an event bus across many processes. It seems like a reasonable approach because the bottleneck becomes Mongo, which is known to scale well.

A working example can be found at

The stack

  • Using mongoose as the Mongo driver
  • Using express for a simple server
  • Using socket.io for the streaming library
  • Using pm2 to create a simple load-balanced environment

In Mongoose, create a schema for a collection that defines itself as capped.

mongoose schema

var mongoose = require('mongoose'),
	Schema = mongoose.Schema;

// the capped options mark the backing collection as capped,
// which is required for tailable cursors
var MessageQueue = new Schema({
	channel : String,
	message : Schema.Types.Mixed
}, {
	capped: {
		size: 1024,       // maximum size of the collection in bytes
		max: 1000,        // maximum number of documents
		autoIndexId: true
	}
});

module.exports = MessageQueue;

server code

Create a socket connection that writes to the queue, and a tailable stream that listens to it…

// assumes `io` is a socket.io server and MessageQueue is the
// mongoose model compiled from the capped schema above
var socket;

// create a tailable query
var messageStream = MessageQueue.find().tailable(true).stream();

// console out any errors
messageStream.on('error', function (err) {
	console.error(err);
});

// listen to the mongo capped collection and
// emit everything to the chat namespace on socket.io
messageStream.on('data', function (doc) {
	if (socket) io.emit('chat', doc);
});

io.on('connection', function (sock) {
	socket = sock;
	socket.on('chat', function (data) {
		// writing to the capped collection broadcasts the message
		// to every process tailing it
		var msg = new MessageQueue({ channel: 'chat', message: data });
		msg.save();
	});
});

client code

var socket = io('http://localhost');

$('.message-form').on('submit', function writeToSocket (e) {
	e.preventDefault();
	// the input selector below is assumed; the original elides it
	socket.emit('chat', { message: $('.message-input').val() });
});

socket.on('chat', function (data) {
	var box = $('.socket-write');
	// each doc's `message` field holds the payload the client sent,
	// so the text lives at data.message.message
	box.val(box.val() + data.message.message + '\n');
});