The New Chrome DevTools Feature You Want to Know About

Written by dormoshe | Published 2017/05/30
Tech Story Tags: web-development | javascript | google | programming | tech


Improve the quality of your web page by using Lighthouse

This article originally appeared on dormoshe.io

The Chrome development team works on features and improvements that make our browsing and developing experience better. The Google I/O 2017 conference took place in May, and it brought significant news. Some of it concerns DevTools, so it affects us as web developers who use Chrome. Chrome 60 is coming with many new features and changes in DevTools. The “WOW” feature is the new Audits panel.

The Audits panel is powered by Lighthouse. Lighthouse provides a comprehensive set of tests for measuring the quality of your web pages. The test categories are Performance, Accessibility, Best Practices and PWA (Progressive Web Apps).

In this article, we will explore the Audits feature, understand the categories, run it on some popular websites, cover the report results and get a taste of Lighthouse’s architecture.

Chrome version < 60

The Audits panel has existed in Chrome DevTools for a while. Before version 60 of Chrome, this panel contained only Network Utilization and Web Page Performance measurements. In Chrome 60, the Audits panel has been replaced with an integrated version of the Lighthouse tool.

The look and feel differ significantly between the versions. The differences are so substantial that Google presents this feature as a new one. Lighthouse could already be used with older versions of Chrome as a browser extension or as a Node command-line tool, but now it is a built-in feature of the browser.
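
For reference, here is a minimal sketch of running Lighthouse programmatically from Node, loosely following the project’s documented usage (the URL is a placeholder, and the exact API may differ between Lighthouse versions):

```js
// run-lighthouse.js — a minimal sketch; API details vary between versions
const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');

chromeLauncher.launch({ chromeFlags: ['--headless'] })
  .then(chrome =>
    // Point Lighthouse at the debugging port of the launched Chrome instance
    lighthouse('https://example.com', { port: chrome.port })
      .then(results => chrome.kill().then(() => results))
  )
  .then(results => {
    // Individual audit results are keyed by audit name
    console.log(JSON.stringify(results.audits, null, 2));
  });
```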

Audits feature before Chrome 60

Lighthouse

The Audits panel is now powered by Lighthouse, an open-source project developed by Google that provides a comprehensive set of tests for measuring the quality of your web pages.

“Do Better Web” is an initiative within Lighthouse to help web developers modernize their existing web applications. By running a set of tests, developers can discover new web platform APIs, become aware of performance pitfalls, and learn (newer) best practices. In other words, do better on the web! DBW is implemented as a set of standalone gatherers and audits that are run alongside the core Lighthouse tests.

To learn more about how it works and how to contribute to it, check out the Lighthouse talk from Google I/O 2017 below:

Google I/O 2017 talk about Lighthouse

This talk walks through what’s new in Lighthouse and how it’s evolved into a companion for modern web development. In addition, it covers the use of Lighthouse in different environments (Node CLI, Chrome DevTools, WebPageTest and headless Chrome), the architecture, GitHub/Travis CI integration, and the ways you can extend Lighthouse by authoring custom audits to run against your own site.

How to use it

The Audits tab is the last built-in tab in the browser’s DevTools. To use it, you need to install the latest Dev or Canary build of Chrome 60.

In order to audit a page, follow these steps:

  1. Press F12 to open DevTools.
  2. Click the Audits tab.
  3. Click Perform an audit.
  4. Click Run audit. Lighthouse sets up DevTools to emulate a mobile device, runs a bunch of tests against the page, and then displays the results in the Audits panel.

The Audits panel with the Lighthouse Logo before performing an audit.

The panels

Lighthouse analyzes the page according to four categories: Performance, Accessibility, Best Practices and Progressive Web Apps (PWA). Lighthouse runs the page through a series of tests, such as emulating different device sizes and network speeds. It also checks for conformance to accessibility guidelines, such as color contrast and ARIA best practices.

Audits report result per category

The scores at the top are your aggregate scores for each of those categories. The rest of the report is a breakdown of each of the tests that determined your scores. Each panel focuses on one of the categories and shows the category results in an appropriate structure.

Progressive web app

Progressive Web Apps (PWA) are reliable, fast, and engaging, but there are many things that can take a PWA from a baseline to an exemplary experience.

To help teams create the best possible experiences, the Lighthouse team has put together a checklist that breaks down all the things it takes to be a Baseline PWA, and how to take that a step further to an Exemplary PWA by providing a more meaningful offline experience, reaching interactivity even faster and taking care of many more important details.
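
Two of the baseline checks, for example, verify that the page links a web app manifest and registers a service worker. A minimal sketch (the file names are illustrative):

```html
<!-- A sketch of two baseline PWA requirements; file names are illustrative -->
<link rel="manifest" href="/manifest.json">
<script>
  // Register a service worker so the page can respond while offline
  if ('serviceWorker' in navigator) {
    navigator.serviceWorker.register('/sw.js')
      .then(reg => console.log('Service worker registered with scope:', reg.scope))
      .catch(err => console.error('Service worker registration failed:', err));
  }
</script>
```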

PWA results — failed tests part

When we click the PWA circle in the top bar, the first part we see is the list of failed tests. We can read about, explore and then fix the failing tests.

The next parts of the PWA report are the Passed items list and the Manual checks. Some checks must be run manually in order to be verified. Those checks are important, but they don’t affect the score.

PWA report — passed items and the manual checks parts

Performance

Web performance refers to the speed at which web pages are downloaded and displayed in the user’s web browser. Web performance optimization is the field of knowledge concerned with increasing web performance.

Faster website download speeds have been shown to increase visitor retention and loyalty and user satisfaction, especially for users with slow internet connections and those on mobile devices.

The first part of the performance category is the Metrics. These metrics encapsulate the app’s performance across a number of dimensions.

Performance's metrics

As you can see, there are three main loading milestones (a snippet after the list shows how to observe related timings in the browser):

  • First meaningful paint — measures when the primary content of a page is visible.
  • First interactive — the first point at which necessary scripts of the page have loaded and the CPU is idle enough to handle most user input.
  • Consistently interactive — the point at which most network resources have finished loading and the CPU is idle for a prolonged period.
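
Lighthouse computes these metrics from trace data, but the browser exposes related paint timings that you can observe yourself. A small sketch (first meaningful paint itself is not part of this API):

```js
// A sketch: log the browser's paint timings, which relate to (but are not
// identical to) Lighthouse's loading metrics
const observer = new PerformanceObserver(list => {
  for (const entry of list.getEntries()) {
    // Entries include 'first-paint' and 'first-contentful-paint'
    console.log(`${entry.name}: ${entry.startTime.toFixed(1)} ms`);
  }
});
observer.observe({ entryTypes: ['paint'] });
```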

The next part of the performance category is the Opportunities. These are opportunities to speed up your application by optimizing resources, for example through image optimization and text compression.
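
For instance, one common way to act on the text-compression opportunity on a Node server is gzip middleware. A sketch, assuming an Express stack (the directory and port are illustrative):

```js
// A sketch of enabling gzip compression in Express, one way to address
// the text-compression opportunity
const express = require('express');
const compression = require('compression');

const app = express();
app.use(compression());            // compress all compressible responses
app.use(express.static('public')); // serve static assets
app.listen(3000);
```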

The Opportunity and Diagnostics parts

The last part is the Diagnostics. These diagnostics provide more information about the page’s performance. One of them is Critical Request Chains, which shows which resources are required for the first render of the page. We can improve page load by reducing the length of the chains, reducing the download size of resources or deferring the download of unnecessary resources.
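
One way to shorten such a chain, for example, is to declare late-discovered critical resources up front with preload hints (a sketch; the file names are illustrative):

```html
<!-- A sketch: preload critical resources so the browser discovers them
     immediately instead of deep inside the request chain -->
<link rel="preload" href="/styles/main.css" as="style">
<link rel="preload" href="/fonts/heading.woff2" as="font" type="font/woff2" crossorigin>
```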

Accessibility

Accessibility refers to the experience of users who might be outside the narrow range of the “typical” user, who might access or interact with things differently than you expect. Specifically, it concerns users who are experiencing some type of impairment or disability — and bear in mind that that experience might be non-physical or temporary.

The accessibility category contains tests that analyze how well screen readers and other assistive technologies can work with the page. For example, correct usage of attributes on elements, ARIA best practices, discernible names for elements and more.
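
For example, icon-only controls and images need discernible names so assistive technologies can announce them. A small illustrative sketch:

```html
<!-- A sketch of the kind of markup these accessibility tests check for -->
<button aria-label="Close dialog">×</button>                       <!-- discernible name for an icon button -->
<img src="chart.png" alt="Monthly visits, rising from 2k to 10k">  <!-- meaningful alternative text -->
<label for="email">Email address</label>                           <!-- label associated with its input -->
<input id="email" type="email">
```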

Accessibility category report

Best practices

The best practices category checks some recommendations for modernizing the page and avoiding performance pitfalls, for example, application cache, HTTPS usage, deprecated APIs, permission requests from the user and more. This part contains lists of failed and passed tests.
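
One such best practice, for example, is tying permission requests to a user gesture instead of firing them on page load. A sketch (the button id is illustrative):

```js
// A sketch: request notification permission on an explicit user gesture,
// not automatically on page load
document.getElementById('enable-notifications').addEventListener('click', () => {
  Notification.requestPermission().then(permission => {
    console.log('Notification permission:', permission);
  });
});
```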

Best practices category report

Popular websites scores

In this section, we will look at the scores of three popular websites. The first is the landing page of Weather.com. The second is a Google results page. The last is a Facebook wall page.

Popular websites scores

We can see that PWA is the lowest-scoring category, perhaps because PWAs are still a new area of the web. We can also see that while Google’s performance is the best, the performance of weather.com is bad (consistently interactive only after more than 25 seconds). The accessibility of all the tested sites is good, with scores greater than 80. Accessibility is a field that receives a lot of focus nowadays and has recently been written into law in some countries.

How it works — The architecture

Lighthouse’s flow consists of a few main steps. Some of the steps occur in the browser, and the others are executed by the Lighthouse runner.

Lighthouse architecture

Here are Lighthouse’s components (a sketch of a custom audit follows the list):

  • Driver — interfaces with the Chrome Debugging Protocol (API viewer)
  • Gatherers — use the Driver to collect information about the page, with minimal post-processing. The outputs of a gatherer are called Artifacts.
  • Audits — using the Artifacts as input, evaluate a test and assign a pass/fail score.
  • Categories — group audit results into a user-facing section of the report (e.g., Best Practices) and apply weighting and overall scoring to the section.
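
To make the Audit piece concrete, here is a rough sketch of a custom audit, loosely following the project’s custom-audit recipe. The 'LoadTime' artifact, the names and the threshold are all hypothetical, and the exact API differs between Lighthouse versions:

```js
// load-audit.js — a rough sketch of a custom Lighthouse audit.
// The 'LoadTime' artifact and all names here are hypothetical.
const Audit = require('lighthouse').Audit;

const MAX_LOAD_TIME_MS = 3000;

class LoadAudit extends Audit {
  static get meta() {
    return {
      category: 'MyPerformance',
      name: 'load-audit',
      description: 'Page loads fast enough',
      requiredArtifacts: ['LoadTime'], // produced by a matching custom gatherer
    };
  }

  static audit(artifacts) {
    const loadTime = artifacts.LoadTime;
    return {
      rawValue: loadTime,
      score: loadTime < MAX_LOAD_TIME_MS, // simple pass/fail
    };
  }
}

module.exports = LoadAudit;
```

A custom config that extends the default config would then list the matching gatherer and this audit, and assign the audit to a category.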

Conclusion

Accessibility and PWA have become key measures in modern web development. Companies invest time and money to improve them in their web pages. The integration of Lighthouse into DevTools is valuable. It will help web developers be more professional and deliver higher-quality pages. I’m sure we will be spending a lot of time in the Audits tab, and judging by how some popular websites scored, we won’t be the only ones.

You can follow me on dormoshe.io or Twitter to read more about Angular, JavaScript and web development.

