How Google Rewards and Penalizes Healthcare Websites


A whopping 53% of mobile site visits are abandoned if a page takes longer than three seconds to load. To Google, that signals a poor user experience. With more people now searching on mobile devices than on desktop computers, and with Google’s goal of delivering the best possible search experience, mobile speed has become increasingly important to consider.

What Happened with the Google Speed Update?

In 2018, with the rise of mobile-first indexing, Google announced a new algorithm update for ranking mobile pages, which they dubbed “the Speed Update.”

As Google’s Z. Wang and D. Phan explained, the Speed Update would affect only the slowest pages, and pages that still offered great content wouldn’t see their rankings drop (too much).

The History of Page Speed as a Ranking Factor

Healthcare webmasters and SEOs had seen this update coming for years, ever since mobile searches surpassed desktop searches (in 2016, mobile accounted for more than 50% of global searches for the first time).

Google has been running tests on the importance of loading speed since 2009. Even then, they found that slower loading times led to decreased engagement.

Patients want answers, and they want them fast.

Following up on those results, Google announced in 2010 that it would use site speed as a ranking signal for desktop searches. Back then, speed wasn’t a major factor in calculating the SEO health of your website. Content was still king.

The Importance of Back-End Performance

In 2013, Moz ran an experiment. They wanted to see exactly how Google had made good on its promise to penalize slow-loading sites. They found that:

  • TTFB (Time-To-First-Byte) had a bigger impact than full/partial rendering. Sites that responded faster to Google had higher search rankings.
  • Pages and websites with more content earned higher rankings. 

Their data ultimately showed no correlation between total page-load time and Google rankings. But websites with better infrastructure (servers and hosting) that delivered the first bytes of content faster did rank higher.

The conclusions were quite natural.

TTFB is the most reliable metric for Google’s crawlers. It’s the first thing they can assess, and it’s the easiest way of understanding page speed.

For webmasters, that meant it wasn’t enough to strip unnecessary elements from their pages. They also had to focus on the back-end performance of their websites. And yes, that includes servers, content delivery networks, and hosting providers.
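If you want to see where your own pages stand, TTFB is easy to read in the browser. The sketch below is a minimal example using the standard Navigation Timing API, not a full monitoring setup; consistently high numbers here point at the server, CDN, or hosting layer rather than the page’s front-end.

```typescript
// Minimal sketch: reading Time-To-First-Byte (TTFB) in the browser with
// the standard Navigation Timing API. All values are in milliseconds,
// measured relative to the start of the navigation.
function reportTTFB(): void {
  const [nav] = performance.getEntriesByType(
    "navigation"
  ) as PerformanceNavigationTiming[];
  if (!nav) return; // older browsers may not expose navigation entries

  // responseStart marks when the first byte of the HTML response arrived.
  const ttfb = nav.responseStart;
  // requestStart-to-responseStart isolates back-end time from DNS lookups,
  // TCP/TLS handshakes, and redirects.
  const serverTime = nav.responseStart - nav.requestStart;

  console.log(`TTFB: ${ttfb.toFixed(0)} ms (server time: ${serverTime.toFixed(0)} ms)`);
}

reportTTFB();
```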

Slow Health Websites Drive Fewer Bookings

Over time, it became clear that more and more people were using their phones as their primary gateway to the World Wide Web.

So in 2015, Google announced a new algorithm change that prioritized mobile-friendliness.

They introduced mobile optimization as a ranking signal. And, as we know today, a major part of “mobile-friendliness” is page speed.

Back in 2015, people were quick to call this algorithm update “Mobilegeddon.”

Suddenly, webmasters found their landing pages ranking lower and losing traffic overnight because they didn’t load fast enough or display well on mobile devices.

However, others understood that there was a direct correlation between page speed and customer acquisition.

Google explained it succinctly: Speed = Revenue.

In 2018, Google analyzed more than 11 million mobile-ad landing pages.

They found that mobile drove more traffic, but desktop drove more conversions.

Further benchmark analysis showed that for mobile users:

  • The probability of a bounce increased 32% as page load time went from one second to three seconds; and
  • The probability of a bounce increased 90% as page load time went from one second to five seconds.

The probability kept climbing with every additional second of load time, making it clear that speed truly equals revenue. If you want your mobile visitors to convert, you have to make your pages fast.

All things considered, the 2018 Google Speed Update was a long time coming.

The Future of Mobile Speed: Core Web Vitals

Google continues to experiment with ways to improve the user experience by helping users quickly identify which sites will load fast and which will be frustratingly slow.

Core Web Vitals, Google’s search ranking signal, focuses on three specific areas of the user experience, each with a metric that can be measured and tracked: loading (Largest Contentful Paint), interactivity (originally First Input Delay, since succeeded by Interaction to Next Paint), and visual stability (Cumulative Layout Shift).

“Core Web Vitals are the subset of Web Vitals that apply to all web pages, should be measured by all site owners, and will be surfaced across all Google tools. Each of the Core Web Vitals represents a distinct facet of the user experience, is measurable in the field, and reflects the real-world experience of a critical user-centric outcome,” a Google developer site described.

To pass the Core Web Vitals assessment, a page has to meet the recommended threshold for each metric at the 75th percentile of visits over a 28-day period, meaning at least three out of four visitors experience the target level of performance or better. A page that meets the recommended targets for all three metrics passes the assessment. For new pages without 28 days of data, Google aggregates similar pages and computes scores based on those groupings.
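Site owners who want to see these field metrics for their own visitors can collect them with Google’s open-source web-vitals JavaScript library. The sketch below is illustrative: it assumes the web-vitals package is installed (current versions report Interaction to Next Paint for interactivity) and that a /analytics endpoint exists on your server to receive the data.

```typescript
// Minimal sketch: collecting Core Web Vitals from real visitors with
// Google's open-source `web-vitals` package (npm install web-vitals).
// The "/analytics" endpoint is a placeholder for your own collector.
import { onCLS, onINP, onLCP, type Metric } from "web-vitals";

function sendToAnalytics(metric: Metric): void {
  const body = JSON.stringify({
    name: metric.name,     // "LCP", "INP", or "CLS"
    value: metric.value,   // measured value for this page view
    rating: metric.rating, // "good" | "needs-improvement" | "poor"
  });

  // sendBeacon survives page unloads better than fetch for last-moment data.
  if (navigator.sendBeacon) {
    navigator.sendBeacon("/analytics", body);
  } else {
    fetch("/analytics", { method: "POST", body, keepalive: true });
  }
}

onLCP(sendToAnalytics); // loading
onINP(sendToAnalytics); // interactivity
onCLS(sendToAnalytics); // visual stability
```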

The way Google calculates the Core Web Vitals assessment, by looking at real user data, might hint that traffic volume is a scoring factor. However, Google’s Senior Webmaster Trends Analyst John Mueller laid those concerns to rest in a statement to Search Engine Journal, explaining, “So, for Core Web Vitals, the traffic to your site is not important as long as you … reach that threshold that we have data for your website.” He added, “ … kind of the pure number of visitors to your site is not a factor when it comes to Core Web Vitals and generally not a factor for ranking either.”

Real-world UX data

Perhaps the most important aspect of Core Web Vitals is that the data Google uses for scoring is real-world field data. It comes from the Chrome User Experience Report, a Google dataset built from actual user visits and webpage interactions, meaning it isn’t based on simulated or synthetic page loads.

In practice, this means lab-simulation tools such as Lighthouse can produce different results for the same metrics and benchmarks. Lighthouse creates a snapshot of what a hypothetical visitor’s experience might be and provides suggestions for areas to improve. The Chrome User Experience Report (CrUX) yields different results because it’s built from field data reflecting how actual users experience the site.
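That same CrUX field data is also available programmatically through the public Chrome UX Report API. The sketch below is a rough example under stated assumptions: it presumes you have a CrUX API key and that the page gets enough traffic to appear in the dataset, and it pulls the 75th-percentile Largest Contentful Paint for one URL, the same figure the assessment described above is judged against.

```typescript
// Minimal sketch: querying field data for a single URL from the Chrome UX
// Report (CrUX) API. Assumes CRUX_API_KEY holds a valid API key and that
// the page has enough traffic to be included in the CrUX dataset.
const CRUX_API_KEY = process.env.CRUX_API_KEY ?? "";
const ENDPOINT =
  `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${CRUX_API_KEY}`;

async function fetchP75Lcp(url: string): Promise<number | undefined> {
  const res = await fetch(ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      url,
      formFactor: "PHONE", // field data from mobile visitors
      metrics: ["largest_contentful_paint"],
    }),
  });
  if (!res.ok) return undefined; // e.g. 404 when CrUX has no data for the page

  const data = await res.json();
  // p75 is the 75th-percentile LCP, in milliseconds, across real visits.
  const p75 = data?.record?.metrics?.largest_contentful_paint?.percentiles?.p75;
  return p75 === undefined ? undefined : Number(p75);
}

fetchP75Lcp("https://www.example.com/").then((p75) =>
  console.log(p75 !== undefined ? `p75 LCP: ${p75} ms` : "No CrUX data for this page")
);
```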

Even though Core Web Vitals is a Google ranking signal, according to Mueller, the quality of the content remains vital. It’s now performance plus content.

“And the other thing is that relevance is still by far much more important,” he told Search Engine Journal. “We still require that relevance is something that should be kind of available on the site. It should make sense for us to show the site in the search results because, as you can imagine, a really fast website might be one that’s completely empty. But that’s not very useful for users.”