June 13, 2012
Two weeks ago our marketing site experienced a distributed denial-of-service (DDoS) attack. It hit once during the day and then again that night. The IP addresses appeared to be eastern European. Thankfully our marketing site is on Heroku, so we simply upped the resources to handle all the requests. Why would someone want to attack or hack our marketing site? They may have been trying to take down our ad server, or perhaps send erroneous contact requests through our contact form. Our marketing site was much more vulnerable than our ad server.
Ad serving is a lot like handling DDoS attacks.
During the attack we received an email from Neustar, who had noticed we were down and offered their DDoS mitigation services. This seemed very suspicious at first. How did they know? Were they the attacker? I spoke with them later and learned they monitor bot traffic to help companies. The representative said they do not pay outside parties for attack data, which I was happy to hear, since paying for data could encourage more attacks. We decided not to sign up with Neustar and have instead installed more protections on our marketing site.
The same week the lock on the front door of my building broke and I couldn’t get in. There happened to be a business card for a locksmith right outside so I was all set.
November 30, 2011
Earlier this year our content delivery network, Limelight, sent us a huge bill, saying it was due to traffic spikes. In our business we don't have a lot of control over traffic. If a partner wants to run our ads, they do, even if it's gigabytes per second. I did some research, and the competing CDNs I checked did not have spike clauses. They charge by the byte and want us to use as many bytes as possible.
After a lot of back and forth and headaches Limelight finally let us out of that bill. They also took the spike clause out of our contract. Thanks, Limelight!
While we are on the topic of CDNs, Carpathia recently showed up at our New York office. They told our office manager they had a meeting scheduled with our VP of Engineering and yours truly. That was strange because the two of us work out of San Francisco! Apparently they had added meeting requests to our calendars via email.
They promptly left the office and then sent us new meeting requests. I guess we need to decline the requests if we don’t want them to show up. The offenders from Carpathia are Dave Stinson and Jon Greaves.
June 24, 2011
I like using servers that manage all the layers except the one I care about: the application. I was using Rails grid servers at Media Temple for a while, but they shut those down. More recently I've been using Heroku.
I was thinking about ways to get data into and out of Spongecell's dynamic ads. A few dozen Rails unicorns would quickly be overloaded. A CDN is great for pushing cached content to millions of viewers, but what if I wanted live data or live interactions? One way to get instant data is long polling: an HTTP request that is left open until data arrives.
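Stripped to its essence, one long-poll cycle is just an ordinary GET with a generous read timeout. A minimal sketch using only the Ruby standard library (the endpoint is a placeholder):

```ruby
require 'net/http'
require 'uri'

# One long-poll cycle: open a request and wait up to `timeout` seconds
# for the server to respond with data. Returns the body, or nil if the
# server never answered before the timeout.
def long_poll(uri, timeout: 60)
  http = Net::HTTP.new(uri.host, uri.port)
  http.read_timeout = timeout
  http.get(uri.request_uri).body
rescue Net::ReadTimeout
  nil
end

# The client just loops: each message (or timeout) leads straight into
# the next request, so a connection is almost always parked on the server.
# loop { msg = long_poll(URI('http://example.com/updates')); handle(msg) if msg }
```

The cost of this pattern is server-side: every idle client pins an open connection, which is why evented servers handle it so much better than a pool of Rails unicorns.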
Heroku’s new Cedar stack is perfect for this! You can even use their old Rails stack for standard web requests and then use Cedar for lighter connections.
Try out Long Noodle for long polling and fork it on GitHub.
Long Noodle GET requests subscribe to a key, and POST requests broadcast for a key using token authentication. The key is simply the path of the URL, just like S3.
In the setup above, imagine the web server is hosting a panel conversation among 10 famous astronomers. Anyone can listen in. The web server displays the conversation when you first visit the site. To get updates, a long poll is opened to the Long Noodle server. The astronomers are authenticated and post their wise words to the web server, which in turn passes them to Long Noodle to trigger client updates.
Long Noodle is very lightweight. Messages are not persisted and clients have no guarantee they receive every message, just the most recent message. This is similar to Apple Push Notifications.
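As a concrete sketch of that flow against a deployed instance: a GET long-polls a key, a POST broadcasts to it. The host and the parameter names ('token', 'message') here are my guesses, so check the Long Noodle README for the real ones:

```ruby
require 'net/http'
require 'uri'

# Subscribe: a GET on a key's path long-polls until someone broadcasts.
def noodle_subscribe(base, key)
  Net::HTTP.get(URI("#{base}/#{key}"))
end

# Broadcast: a POST to the same path pushes the payload to every
# subscriber currently parked on that key.
def noodle_broadcast(base, key, token, message)
  Net::HTTP.post_form(URI("#{base}/#{key}"),
                      'token' => token, 'message' => message)
end

# noodle_subscribe('http://your-noodle.herokuapp.com', 'panels/astronomers')
```

Because the key is just the URL path, one Long Noodle server can multiplex any number of independent channels.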
UPDATE: I did some quick load testing of Long Noodle on a free server at Heroku. For returning data immediately, Node was able to pump out 1,300 requests/second! For long polling, errors started to occur past about 500 concurrent requests.
June 9, 2011
We had an existing New Relic account and wanted to track some other apps on Heroku. I thought getting our New Relic key into the Heroku apps would be the easiest solution. Here's what I did.
Add the agent gem to the Gemfile:

gem 'newrelic_rpm'
Add a Heroku config:
heroku config:add NEWRELIC_LICENSE_KEY=your_license_key
Add this to environment.rb or a lib somewhere:
NewRelic::Control.instance['license_key'] = ENV['NEWRELIC_LICENSE_KEY']
I have a feeling I didn’t do this the easiest way but it works. Or that I’m breaking some terms of service.
June 7, 2011
This blog has been neglected. Let’s resurrect it!
Did you see this recent post about a hot display advertising company? There are some good quotes in there.
February 27, 2009
I recently revisited the problem of detecting browser time zone because I wanted to try the new time zone functionality in Rails 2.1. I found this post from Dave Johnson. To my disappointment, it was the same solution Spongecell used in the personal calendar three years ago.
I wanted a simpler solution: one that doesn't require a JS cookie library, around filters, or a UI combo box with hundreds of time zone choices.
The solution presented here sends browser info using jQuery and then stores the time zone in the session for use in all subsequent requests.
in the view (this is Haml; it can go in your layout on all pages). The /set_time_zone path should point at the controller action that stores the session value:

- unless @time_zone
  :javascript
    $.post('/set_time_zone',
      {'offset_minutes': (-1 * (new Date()).getTimezoneOffset())});
in the controller (the method names here are mine; run the first as a before filter on every request, and route the view's POST to the second):

# sets the time zone for this request if a session time zone exists
# if it doesn't, the default is UTC
def activate_time_zone
  @time_zone = ActiveSupport::TimeZone[session[:time_zone_name]] if session[:time_zone_name]
  Time.zone = @time_zone.name if @time_zone
end

# this receives browser info from a jQuery request and stores
# time zone info in the session
def set_time_zone
  offset_seconds = params[:offset_minutes].to_i * 60
  @time_zone = ActiveSupport::TimeZone[offset_seconds]
  @time_zone = ActiveSupport::TimeZone["UTC"] unless @time_zone
  session[:time_zone_name] = @time_zone.name if @time_zone
  render :text => "success"
end
in the formatter (again, the method name is mine):

def format_time(t)
  return "" unless t
  t.in_time_zone.strftime('%Y-%m-%d %H:%M:%S %Z')
end
Look how simple that is! I chose to default the time zone to UTC if one cannot be determined on the first attempt. Now the formatter will output all the UTC times you have in your db, or anywhere else, in the user's browser's time zone.
If there is a better solution in Rails, please let me know. We'll see if this solution holds up through daylight saving time.
January 30, 2009
Obama Day was caching day!
Obama Day was sort of intense! Not only did Aretha Franklin have a sweet hat, but all the internet traffic on CNN Live put the hurt on old-skool Spongecell Promote. Chris saved the day by quickly page caching the content that was in high demand. So basically some files were copied and everyone was happy (except maybe Texas?).
This trick is super obvi, but hella useful, yo. But, <tear>, it's not a permanent fix. Alas, not every day can be Obama Day! So I set out to fix this permanently. We need to page cache for one customer who always sends the same request, expire the page when the data changes, and action cache for everyone else who is not logged in.
So I wrote a tiny little conditional page cache plugin to let you choose your caching type at runtime. Say you want to page cache giraffe 1, action cache giraffe 2 when it is green, and serve everything else straight up.
def page_cache?; @giraffe == 1; end
def action_cache?; @giraffe == 2 && @giraffe_color == 'green'; end
@giraffe = blah_blah
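Stripped of the Rails plumbing, the runtime choice the plugin makes is just predicate dispatch. A minimal sketch (the class and strategy names are mine, not the plugin's API):

```ruby
# Decide the caching strategy for a request at runtime by consulting
# per-request predicates, the way the plugin does: page cache wins,
# then action cache, otherwise serve it straight up.
class CacheChooser
  def initialize(giraffe, giraffe_color)
    @giraffe = giraffe
    @giraffe_color = giraffe_color
  end

  def page_cache?;   @giraffe == 1; end
  def action_cache?; @giraffe == 2 && @giraffe_color == 'green'; end

  def strategy
    return :page   if page_cache?
    return :action if action_cache?
    :none
  end
end
```

The controller sets the instance variables as usual, and the plugin consults the predicates after the action runs, so the decision can depend on anything loaded during the request.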
To clarify the temporary caching trick:
Suppose you have: http://snowgiraffe.com/pictures/giraffe/1.html
Then you simply move the file to public/pictures/giraffe/1.html, and the web server serves it statically without ever touching Rails.