Is Your Angular App A House of Cards?

Written by HubbaDev | Published 2017/03/16

Ours was, our performance suffered, and we did something about it

Hubba helps make the world of commerce run faster. We connect brands, retailers, and influencers to help them discover new business relationships. And we do it with a single-page application built on Angular 1.5, one that displays tons of product-related information on every screen.

If you are an Angular 1.x developer, you are probably asking,

With so much on-screen data, what does that do to your performance?

Well, nothing good. And this post is about one powerful technique that we use to whip Angular 1.5 into a performance machine.

In the late spring of 2016, I started my career path at Hubba. I was an experienced dev, fresh off a team that had spent the better part of a year building an enterprise-level application in React. Prior to that, I had been part of a team building a media monitoring application with Backbone and other custom front-end tools. Needless to say, coming into an Angular shop was a fresh, if not eye-opening, experience.

A House of Cards

At Hubba, one of the tools in our Batman-like Utility Belts of Justice is cards. Cards are simple enough — they’re self-contained pieces of code that render data for certain features of our site. We have cards that display product information, user and brand profiles, product lists, blog posts, and so on.

Cards are neat. Each one handles its own little corner of the world: it loads its own data, renders its own HTML, handles its own user interactions, and lets other parts of the application hook into it to accomplish certain business goals.

I hadn’t been around all that long at Hubba, but it was easy to see that we were using cards in more and more places in the app. And why not — they are convenient and powerful. A single line of HTML-esque code that pointed at some IDs in our database would be enough to get a developer up and running with a screen full of cards in moments.
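To give a rough sense of what that looks like, a card might be wired up along the following lines. This is a hypothetical sketch, not our actual code; the productCard name, the hubba module, and the bindings are assumptions for illustration.

```javascript
// Hypothetical sketch of a card as an Angular 1.5 component.
// Dropping one into a page is a single line of HTML-esque code, e.g.:
//   <product-card product-id="'123'"></product-card>
angular.module('hubba').component('productCard', {
  bindings: {
    productId: '<'  // the database ID the card points at
  },
  templateUrl: 'cards/product-card.html',  // the card renders its own HTML
  controller: function ProductCardController() {
    var ctrl = this;
    ctrl.$onInit = function () {
      // The card loads its own data and handles its own
      // user interactions from here.
    };
  }
});
```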

As we started adding cards to pages that didn’t have them before, and building new features that relied on the convenience of cards, we began to notice a lot of performance degradation. At first, I assumed the issues were DOM-related: I speculated that accessing the DOM for each piece of data as cards loaded in was causing the slowdown. I made some well-intentioned adjustments to batch some of the DOM handling, expecting to solve most of our woes in one fell swoop.

That didn’t exactly happen.

Time to Dig In

I began to use Chrome dev tools to paint a better picture of what was going on, and quickly identified a huge number of XHR calls being used to fetch data. Each card was performing ~3–5 data fetches during its lifecycle.

When we had a page full of 10–20 cards doing ~30–50 data fetches, performance would take a serious beating. Chrome would eat another couple of sticks of delicious RAM and then eventually get back up off the floor. The power of the cards we developers knew and loved was leading us into a janky downfall.

How cards loaded data in 2016
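In code terms, the 2016-era pattern looked roughly like this inside each card (a simplified, hypothetical sketch; the endpoints and field names are made up):

```javascript
// Hypothetical sketch: each card fired its own XHRs as it initialized.
angular.module('hubba').controller('ProductCardController', ['$http', function ($http) {
  var ctrl = this;  // ctrl.productId is bound onto the controller by the card's bindings
  ctrl.$onInit = function () {
    // 1. The product itself
    $http.get('/api/products/' + ctrl.productId).then(function (res) {
      ctrl.product = res.data;
    });
    // 2. Its like count
    $http.get('/api/products/' + ctrl.productId + '/likes').then(function (res) {
      ctrl.likeCount = res.data.count;
    });
    // 3. Whether the current user has favourited it
    $http.get('/api/me/favourites/' + ctrl.productId).then(function (res) {
      ctrl.isFavourited = res.data.favourited;
    });
  };
}]);
// Multiply that by 10–20 cards on a page and the browser is suddenly juggling 30–50 requests.
```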

We bandaged the situation for a while by using the InView directive to prevent the cards from doing any work until they were actually on the user’s screen. It helped, but performance would still go into a tailspin when we had 5–10 cards in view at a time. Not a fun experience for the average user who just wanted to get the package dimensions for Turkey Bags or to find out the newest flavour of dog ice cream from The Bear & The Rat.
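The idea behind that bandage can be sketched as a small directive that delays a card’s work until it scrolls into view. This is a simplified, hypothetical stand-in for illustration, not the actual InView library we used:

```javascript
// Rough sketch: defer work until the element is inside the viewport.
// Hypothetical usage: a wrapper flips a flag, and ng-if keeps the card idle until then:
//   <div when-visible="vm.showCard = true">
//     <product-card ng-if="vm.showCard" product-id="'123'"></product-card>
//   </div>
angular.module('hubba').directive('whenVisible', ['$window', '$parse', function ($window, $parse) {
  return {
    restrict: 'A',
    link: function (scope, element, attrs) {
      var onVisible = $parse(attrs.whenVisible);
      var fired = false;

      function check() {
        if (fired) { return; }
        var rect = element[0].getBoundingClientRect();
        if (rect.top < $window.innerHeight && rect.bottom > 0) {
          fired = true;
          angular.element($window).off('scroll', check);
          scope.$applyAsync(function () { onVisible(scope); });  // e.g. flip the flag
        }
      }

      angular.element($window).on('scroll', check);
      scope.$on('$destroy', function () {
        angular.element($window).off('scroll', check);
      });
      check();  // handle elements that are already on screen
    }
  };
}]);
```

The real library was more robust than this sketch, but the effect was the same: cards stayed idle until the user could actually see them.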

Many cards meant many data fetches. When there were 10+ cards things could get ugly.

We needed a real solution: data on the fly, but without a major impact on the development cycle of existing cards. Since we were running on Angular 1.5, we didn’t really have a good data store option that suited our needs. So we had to go custom, and I had an idea.

Time to Dig Out

The first proof of concept was anchored in our product cards, which are the bread and butter of Hubba’s business. Product cards were a good test bed for my idea. Each product card requires three pieces of data:

  • The product data
  • How many likes the product has received
  • Whether you’ve favourited the product, if you’re a logged-in user

The first implementation was a bit rough, but it was a huge win for performance. Not only did it solve the data-fetching issues, but the page was noticeably more performant. There was no giant spike of lag when upwards of 50 product cards were painted to the page in my testing scenarios.

My idea was to have a layer that sits between the cards themselves and the server. Cards still ask for data individually; the layer waits a small amount of time, batches all of the requested IDs together, then fetches everything in one request while still returning the data to each card individually. This way, no card is directly polluting the browser with data requests, but every card still gets the things it needs.

How our relays changed things. Instead of the cards fetching their own data, relays now batch and handle it for them.
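To make the batching concrete, here is a minimal sketch of a relay as an Angular 1.5 service. It is not our production code: the productRelay name, the 10 ms batching window, and the /api/products endpoint are all assumptions for illustration.

```javascript
// Minimal sketch of a relay: cards still ask for data one ID at a time,
// but the relay waits briefly, batches the IDs, and makes a single request.
angular.module('hubba').factory('productRelay', ['$http', '$q', '$timeout',
  function ($http, $q, $timeout) {
    var pending = {};          // id -> deferreds waiting for that id
    var flushScheduled = null; // a flush is already queued

    function flush() {
      var ids = Object.keys(pending);
      var waiting = pending;
      pending = {};
      flushScheduled = null;

      // One request for the whole batch (hypothetical endpoint returning an array).
      $http.get('/api/products', { params: { ids: ids.join(',') } })
        .then(function (res) {
          // Hand each card back only the record it asked for.
          res.data.forEach(function (product) {
            (waiting[product.id] || []).forEach(function (deferred) {
              deferred.resolve(product);
            });
          });
        })
        .catch(function (err) {
          ids.forEach(function (id) {
            waiting[id].forEach(function (deferred) { deferred.reject(err); });
          });
        });
    }

    return {
      // Each card calls get(id) individually and receives its own promise.
      get: function (id) {
        var deferred = $q.defer();
        (pending[id] = pending[id] || []).push(deferred);
        if (!flushScheduled) {
          // Wait a small amount of time so other cards can pile on.
          flushScheduled = $timeout(flush, 10);
        }
        return deferred.promise;
      }
    };
  }
]);
```

A card’s controller then swaps its direct $http call for something like productRelay.get(ctrl.productId) and keeps everything else the same, which is what kept the change from disrupting the development cycle of existing cards.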

I remember the words my boss said as he reviewed the pull request: “holy mother of speed”. That was the eureka moment that proved I was on the right track. I decided to name the pattern “Relays”, and they’ve since become their own entity that we rely on every day. Other developers here have started using them, adding to them, and building features on the flexibility they provide.

Relays fit well with our card-based application. They let us skip the step of having to prepare our data in advance; we can efficiently load everything on the fly whenever we need it. It’s a tool that lets us develop many things much more quickly and has even relieved a few headaches.

Since then, we’ve implemented relays in all of our cards, and we have relays that cover basically all of our internal data sources. We’ve also added some quality-of-life features, like automatically converting fetched data into data models, caching data as it comes in, and batching and queuing requests into smaller sets for even better performance.
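The caching and model-conversion pieces, for example, can be sketched as a thin layer in front of a relay (again hypothetical; cachedProductRelay and ProductModel are made-up names):

```javascript
// Hypothetical sketch: cache relay results and wrap them in a data model as they arrive.
angular.module('hubba').factory('cachedProductRelay', ['$q', 'productRelay',
  function ($q, productRelay) {
    var cache = {};  // id -> model instance

    function ProductModel(raw) {
      angular.extend(this, raw);  // stand-in for a real data model
    }

    return {
      get: function (id) {
        if (cache[id]) {
          return $q.resolve(cache[id]);  // served from memory, no XHR at all
        }
        return productRelay.get(id).then(function (raw) {
          cache[id] = new ProductModel(raw);
          return cache[id];
        });
      }
    };
  }
]);
```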

Relays are just one of the cool things I’ve gotten the chance to work on at Hubba, and one that’s made some impact. We’re always developing cool things here and love to hear feedback on how we’re doing — feel free to get in touch!

Adam Hutchinson, Senior Front End Guy at Hubba (@_adamhutch)

