Optimizing Rails App for Speed (View time vs ActiveRecord)

I’ve built a contact manager/lightweight CRM application and am now starting to optimize it because speed is becoming an issue.

The table shown below is currently the slowest part of the application. It loads via jQuery DataTables (server-side processing). I am running rack-mini-profiler to show queries and load times. I think the fix is some sort of eager loading or pre-joining of tables. However, I’m confused about the duration vs. the query time.

The duration of executing the index action is 3800+ ms, but the query time is only 306 ms.
My server logs reflect something similar: “Completed 200 OK in 3832ms (Views: 2205.9ms | ActiveRecord: 1613.3ms)”

If I optimize the queries, am I working on the smaller problem? Why do the views take so long, or are the two closely related?

What am I not thinking of?

@austinwang Are these JSON endpoints? The difference between AR and your Views is likely the JSON generation from all the data you’re returning. There are a couple of things you can investigate in order to speed this process up.

First, I’d recommend checking out a gem for generating JSON from your models; there are many options (rabl: github.com/nesquena/rabl, jbuilder: github.com/rails/jbuilder, ActiveModel::Serializers: github.com/rails-api/active_model_serializers), and I’d recommend ActiveModel::Serializers for its ease of implementation and Ruby-ness (I personally found jbuilder horribly confusing and unintuitive).
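
For what it’s worth, a serializer for a table like this might look roughly like the sketch below. The model, attribute, and association names are placeholders, and the exact DSL varies a bit between ActiveModel::Serializers versions:

```ruby
# app/serializers/contact_serializer.rb
# Serialize only the columns the DataTable actually shows; the
# attribute and association names here are made-up placeholders.
class ContactSerializer < ActiveModel::Serializer
  attributes :id, :first_name, :last_name, :email

  has_one :company, serializer: CompanySerializer
end

class CompanySerializer < ActiveModel::Serializer
  attributes :id, :name
end

# app/controllers/contacts_controller.rb
# Rendering through the serializers instead of calling to_json on the
# raw ActiveRecord objects trims the payload considerably.
def index
  contacts = Contact.includes(:company).limit(50)
  render json: contacts, each_serializer: ContactSerializer
end
```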

Second, you may want to investigate your JSON encoder/decoder and see if you can improve that. yajl-ruby (github.com/brianmario/yajl-ruby, C bindings to YAJL) and Oj (github.com/ohler55/oj) are both improvements over the stock JSON encoder/decoder, with Oj being significantly faster in many cases.
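
A minimal sketch of wiring Oj in, assuming a fairly standard Rails setup; whether you go through multi_json or have Oj mimic the stdlib JSON module depends on your Rails and Oj versions:

```ruby
# Gemfile
gem "oj"

# config/initializers/oj.rb
require "oj"

# Option 1: if your Rails version encodes JSON through multi_json,
# point it at the Oj backend.
# MultiJson.use(:oj)

# Option 2: have Oj take over the stdlib JSON module, so code that
# calls JSON.generate / JSON.dump picks up Oj automatically.
Oj.mimic_JSON
```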

I’d follow along with the Improving Performance screencast for ideas about reducing DB queries, investigate reducing the amount of JSON you return with ActiveModel::Serializers, and try out the Oj gem for faster JSON encoding. With these tools, I bet you could get those responses under 500ms (obviously this depends on the dataset and queries, but that’s the sort of improvement I’ve seen personally with these sorts of performance tweaks).

Good luck, and if you get hung up on any of those tools, fire off another thread!

@austinwang you might also want to check out the source code for Discourse (github.com/discourse/discourse), as they use the techniques discussed in @joshclayton’s post.

Wow, thanks @jon and @joshclayton! I spent a day on reducing DB queries and cut the 517 queries you see pictured to fewer than 10 through a better understanding of eager loading and joining. Response times are well under 500ms in most cases, nearly 8x faster. I still have a lot to learn, but I’m excited to dive deeper into better JSON encoders and such too. Thanks for pointing me in the right direction!
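
In case it helps anyone else landing here, the shape of the change was roughly this (the model and association names below are simplified stand-ins for my actual models):

```ruby
# Before: one query for the contacts, plus one query per contact for
# its company and one per contact for its tags (the classic N+1).
contacts = Contact.limit(50)
contacts.each { |c| [c.company.name, c.tags.map(&:name)] }

# After: eager load the associations up front -- ActiveRecord issues a
# small, fixed number of queries no matter how many rows come back.
contacts = Contact.includes(:company, :tags).limit(50)
contacts.each { |c| [c.company.name, c.tags.map(&:name)] }

# When filtering or sorting on the joined table, combining joins with
# includes keeps it to a single query:
Contact.joins(:company).includes(:company).order("companies.name")
```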

Hi @joshclayton or @jon. I just pushed this updated code to production (Heroku) and have noticed that the page is still very slow (4-6 seconds per request!). Our production is usually about 20-30% slower than localhost, but not 5x slower. On localhost I was seeing ~300ms responses; on Heroku I’m seeing ~7000ms or even more! I’m researching this now, but thought I’d throw out a few theories and see if anyone here has thoughts on why this might be happening and how I can fix it.

- We added a new model and migrated a bunch of data into it, so maybe the indexes are still being created?
- The new ActiveRecord/SQL statements I’ve written involve a lot of joins, and maybe Heroku handles these worse than localhost does (see the EXPLAIN sketch below)?
- Heroku simply holds more data (our real users), so the queries take longer?
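
One way I can think of to check the join theory (a sketch; the relation below is a stand-in for whatever query actually feeds the table):

```ruby
# In a Rails console locally and via `heroku run rails console`,
# compare the query plans ActiveRecord reports for the same relation.
# `explain` is available on relations in Rails 3.2+.
relation = Contact.includes(:company).order(:last_name).limit(50)
puts relation.explain
```

Running the raw SQL under EXPLAIN ANALYZE in `heroku pg:psql` would show the Postgres-side timings as well.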

Let me know what information I can provide to help diagnose. Thanks in advance!

@austinwang I’d start by verifying that the indices are there and that you actually deployed to production (I’ve deployed just to staging, looked at production, and been sorely disappointed for fifteen minutes or so before I figured out I deployed to the wrong environment). If the deploy looks to have gone through, I’d dig into New Relic and see where it’s reporting slow actions. You’ll be able to immediately see if your eager loading impacted SQL performance by number of queries alone.
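
A quick way to sanity-check that (a sketch, assuming Postgres on Heroku; "contacts" is a placeholder table name):

```ruby
# In `heroku run rails console`: list the indexes ActiveRecord sees on
# the table backing the new model.
ActiveRecord::Base.connection.indexes("contacts").map(&:name)
```

`heroku run rake db:migrate:status` will also confirm whether the migrations actually ran against the production database.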

If you’re still seeing slow response times from ActionView or the controller itself, it’s likely JSON encoding/decoding. That’s where YAJL and Oj come in to speed that up. ActiveModel::Serializers will help too (if you haven’t already started using it) because it’ll render less content most likely.

Let me know what you see and feel free to attach screenshots from New Relic if you’re having trouble making sense of things. New Relic allows the pro account to be turned on/off and it’s prorated for Heroku, so you may benefit from turning it on for an hour or two, looking at the results, and turning it back off (so you don’t pay gobs of money for data that you don’t always need reporting on).

Thanks again @joshclayton! With New Relic and rack-mini-profiler we found the query that was causing the slowness: it was a count of all contacts. Silly me 🙂
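
For anyone who hits the same thing with server-side DataTables: one way to avoid counting the whole table on every request is to cache the total briefly. A rough sketch (the method name, cache key, and expiry here are arbitrary):

```ruby
# The "total records" number DataTables asks for on every request
# doesn't need to be exact to the second, so cache it instead of
# issuing COUNT(*) against the whole contacts table each time.
def total_contacts_count
  Rails.cache.fetch("contacts/total_count", expires_in: 10.minutes) do
    Contact.count
  end
end
```

A counter cache column is another option when the count is scoped to a parent record (for example, a user’s contacts).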