Long Noodle: Heroku Long Polling with Node.js, jQuery, Rails, Cross Site

I like using servers that manage every layer except the one I care about: the application. I used Rails grid servers at Media Temple for a while, but they shut those down. More recently I’ve been using Heroku.

I was thinking about ways to get data into and out of Spongecell’s dynamic ads. A few dozen Rails unicorns would quickly be overloaded. A CDN is great for pushing cached content to millions of viewers, but what if I wanted live data or live interactions? One way to get instant data is long polling: an HTTP request that is held open until data arrives.

Heroku’s new Cedar stack is perfect for this! You can even use their old Rails stack for standard web requests and then use Cedar for lighter connections.

Try out Long Noodle for long polling and fork it on GitHub.

Long Noodle GET requests subscribe to a key, and POST requests broadcast to a key using token authentication. The key is simply the path of the URL, just as in S3.
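The subscribe/broadcast model can be sketched like this (names and the token check are illustrative assumptions, not the actual Long Noodle API): GETs park a callback under the URL path, and a POST carrying the right token flushes everyone waiting on that path.

```javascript
// Hypothetical sketch of keyed pub/sub with token-authenticated broadcast.
const subscribers = {}; // URL path -> callbacks parked by long polls

const BROADCAST_TOKEN = process.env.BROADCAST_TOKEN || 'changeme'; // assumed config

function subscribe(key, callback) {
  (subscribers[key] = subscribers[key] || []).push(callback);
}

function broadcast(key, token, message) {
  if (token !== BROADCAST_TOKEN) return false; // only token holders may POST
  (subscribers[key] || []).forEach((cb) => cb(message));
  // No persistence: waiters are dropped and must poll again for the next message.
  delete subscribers[key];
  return true;
}
```

Dropping subscribers after each broadcast is what gives the "most recent message only" guarantee described below: a client that is slow to reconnect simply misses whatever was sent in the gap.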

In the setup above, imagine the web server is hosting a panel conversation between ten famous astronomers. Anyone can listen in. The web server displays the conversation when you first visit the site. To get updates, a long poll is opened to the Long Noodle server. The astronomers are authenticated and post their wise words to the web server, which in turn passes them to Long Noodle to trigger client updates.

Long Noodle is very lightweight. Messages are not persisted and clients have no guarantee they receive every message, just the most recent message. This is similar to Apple Push Notifications.

As a bonus, Long Noodle works cross-site using JavaScript callbacks, so you can set it up anywhere.
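One way the cross-site trick can work is JSONP-style callback wrapping (a sketch of the general technique; the `callback` parameter name is an assumption, not confirmed Long Noodle behavior): when the request names a callback, the server returns script text instead of bare JSON, so a `<script>` tag on any origin can consume it.

```javascript
// JSONP-style response body: wrap the JSON payload in the caller's callback.
function jsonpBody(reqUrl, payload) {
  // The base URL only exists to make relative request paths parseable.
  const cb = new URL(reqUrl, 'http://example.com').searchParams.get('callback');
  const json = JSON.stringify(payload);
  return cb ? `${cb}(${json});` : json; // plain JSON when no callback is asked for
}
```

On the client side, jQuery handles the other half automatically: `$.getJSON('…?callback=?')` generates a callback name, substitutes it for the `?`, and invokes it when the script response arrives.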

UPDATE: I did some quick load testing of Long Noodle on a free Heroku server. When returning data immediately, Node was able to pump out 1,300 requests/second! For long polling, errors started to occur past about 500 concurrent connections.


2 Responses to Long Noodle: Heroku Long Polling with Node.js, jQuery, Rails, Cross Site

  1. Does your explanation hold true when using the Worker Dynos? I’m building an app that will have thousands and thousands of long-polling requests. Does each worker dyno handle about 500 concurrently? Thanks for your help!

  2. Chris Hobbs says:

    500 is a conservative estimate from 2011. Performance may be better now. Another note is that the test was done with all connections waiting for the same resource. Performance could be better if connections have multiple resources to wait for.
