Bypass ActiveRecord When Using Big Sets For Fast Response Times in Rails 3 & Ruby 1.9.2
March 12, 2011
If you need to work with big sets (in my case 1,000 records), ActiveRecord is slow, and so is calling to_json on that collection.
This is my workaround to transform a ~3000ms request into ~300ms:
def index
  # before
  # @users = User.where('some condition').order('some order')
  # render :json => @users
  # after
  @users = ActiveRecord::Base.connection.select_all(User.where('some condition').order('some order').to_sql)
  render :text => @users
end
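For context: select_all returns an array of plain Hashes (string keys, NULLs as nil), and render :text emits their Ruby inspect form, with => and nil, which is exactly why the client has to massage the response. A minimal sketch outside Rails, with made-up rows standing in for the query result:

```ruby
# Simulated result of connection.select_all: an array of hashes
# with column names as string keys; SQL NULLs come back as nil.
rows = [
  { "id" => 1, "name" => "Alice", "email" => nil },
  { "id" => 2, "name" => "Bob",   "email" => "bob@example.com" }
]

# render :text => @users sends the Ruby inspect form, not JSON:
body = rows.to_s
puts body
# The string contains "=>" and "nil", which the jQuery code below
# rewrites into ":" and "null" to get valid JSON.
```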
Also, on the front-end you need to transform the response into valid JSON (I'm using jQuery):
$.get('/users', function(users) {
  users = jQuery.parseJSON(users.replace(/=>/g, ':').replace(/nil/g, 'null'));
  // Do the rest
});
NOTE: I understand that this is not suitable for most cases (a public API, for example), given that the output is not valid JSON. It also has a lot of limitations: it won't deserialize your serialized attributes, and you can't use the to_json options like :methods, :include, etc.
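One more concrete gotcha with the regex rewrite: any column value that happens to contain "=>" or the substring "nil" gets mangled along with the structural characters. A quick illustration with a hypothetical row:

```ruby
# A row whose value contains the substring "nil" (the name "Anil").
rows = [{ "id" => 1, "name" => "Anil" }]
body = rows.to_s   # the inspect form: [{"id"=>1, "name"=>"Anil"}]

# The same substitutions the client performs corrupt the value:
broken = body.gsub('=>', ':').gsub('nil', 'null')
puts broken        # the name is now "Anull"
```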