Google Releases May 2022 Core Algorithm Update

On 25th May 2022, Google announced the release of a new broad core algorithm update.

As with recent updates, the stated rollout period is around 1-2 weeks, meaning a likely completion date at some point during w/c 6th June.

As the name suggests, broad core algorithm updates are designed to be a general “refresh” of Google’s algorithmic ranking processes and are not intended to target any particular website niches or areas of organic search. Websites can see a change in ranking performance as a result of updates, both positive and negative, but it’s also possible to see a negligible impact.

Within the Search Central blog post on the most recent update, Google’s Danny Sullivan wrote:

Core updates are changes we make to improve Search overall and keep pace with the changing nature of the web. There’s nothing in a core update that targets specific pages or sites. Instead, the changes are about improving how our systems assess content overall. These changes may cause some pages that were previously under-rewarded to do better.

https://developers.google.com/search/blog/2022/05/may-2022-core-update

Websites may be positively or negatively impacted by an update, but regardless of this the official advice remains the same. In summary, there’s nothing specific that webmasters need to do in response, and the focus should remain on creating “quality content”. In Google’s own words:

Pages that drop after a core update don’t have anything wrong to fix. This said, we understand those who do less well after a core update change may still feel they need to do something. We suggest focusing on ensuring you’re offering the best content you can. That’s what our algorithms seek to reward.

https://developers.google.com/search/blog/2019/08/core-updates

With the last core algorithm refresh released in late November 2021, and updates appearing on a roughly six-monthly schedule, it’s highly likely that another similar update will take place in late 2022.

If you found this update useful, check out our latest blog posts for the latest news, and if you’re interested in finding out more about what we can do for your brand, get in touch with the team today.

Google Announces New Page Experience Signal

On Thursday Google announced the addition of a set of new user experience metrics to its growing list of ranking factors.

The additions – which Google is referring to as “Page Experience” metrics – are designed to evaluate how users perceive browsing, loading and interacting with specific webpages, and incorporate criteria measuring:

  • Page load times
  • Mobile friendliness
  • Incorporation of HTTPS
  • The presence of intrusive ads or interstitials
  • Intrusive shifting of page content or layout

Webmasters should already be familiar with many of these factors, with recent years seeing Google driving home the importance of mobile friendliness, page speed, HTTPS adherence and avoidance of intrusive interstitials.

However, the new Page Experience signal also includes areas from the new “Core Web Vitals” report, recently incorporated into Google’s PageSpeed Insights and Search Console tools.

What are Core Web Vitals?

Core Web Vitals are a trio of metrics designed to evaluate a user’s experience of loading, interactivity, and visual stability when visiting a web page (a short measurement sketch follows the list below):

  • Largest Contentful Paint (LCP): This measures the perceived loading performance of a page, or the time that passes before the main page content is visible to users. An LCP time of 2.5 seconds or less is viewed as good, with anything higher in need of improvement.
  • First Input Delay (FID): Measuring interactivity / load responsiveness, or the time it takes for a user to be able to usefully interact with content on the page. An FID of less than 100ms is optimal, with higher scores in need of improvement.
  • Cumulative Layout Shift (CLS): Measuring visual stability, or whether the layout of a page moves or changes while a user is trying to interact. Pages should aim for a CLS of less than 0.1 in order to provide a good user experience.
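
For those wanting to see these numbers first-hand, below is a minimal sketch of how all three metrics can be observed in the browser using the standard PerformanceObserver API (the entry types used are currently Chromium-only; Google’s open-source web-vitals JavaScript library wraps the same measurements with cross-browser handling). The thresholds in the comments are the “good” limits quoted above.

```typescript
// Minimal sketch: observing the three Core Web Vitals with PerformanceObserver.
// Entry types 'largest-contentful-paint', 'first-input' and 'layout-shift'
// are currently supported in Chromium-based browsers.

// Layout-shift entries carry extra fields not yet in the standard TS types.
interface LayoutShiftEntry extends PerformanceEntry {
  value: number;           // how severe this individual shift was
  hadRecentInput: boolean; // shifts caused by user input don't count
}

// Largest Contentful Paint: each entry is a new, larger "candidate";
// the last one reported before user input is the final LCP value.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    console.log(`LCP candidate: ${(entry.startTime / 1000).toFixed(2)}s (good: 2.5s or less)`);
  }
}).observe({ type: 'largest-contentful-paint', buffered: true });

// First Input Delay: the gap between the user's first interaction and the
// browser starting to process it.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as PerformanceEventTiming[]) {
    const fid = entry.processingStart - entry.startTime;
    console.log(`FID: ${fid.toFixed(0)}ms (good: under 100ms)`);
  }
}).observe({ type: 'first-input', buffered: true });

// Cumulative Layout Shift: the running sum of all layout-shift scores
// that weren't triggered by recent user input.
let cls = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as LayoutShiftEntry[]) {
    if (!entry.hadRecentInput) cls += entry.value;
  }
  console.log(`CLS so far: ${cls.toFixed(3)} (good: under 0.1)`);
}).observe({ type: 'layout-shift', buffered: true });
```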

Largest Contentful Paint and First Input Delay will be recognisable to most webmasters, with Google’s PageSpeed and Lighthouse tools already providing information on these metrics.

However, Cumulative Layout Shift appears to be new, with Google’s John Mueller stating that the CLS metric has been created to gauge levels of user “annoyance”. CLS looks at the familiar experience of content shifting around as a page loads, which Google illustrates in its announcement with an animated GIF.

What does this change?

Whilst most of the individual metrics within Page Experience are pre-existing ranking factors, the new announcement places them together as one part of an overarching signal.

Google state that they are aiming to provide a more “holistic picture of the quality of a user’s experience on a web page”, by grouping previously separate factors together.

Each factor will be weighted uniquely, although as Google have declined to comment on how this weighting will be distributed, it will likely be up to webmasters to judge the relative importance of each.

The new signal is also set to bring changes to how mobile top stories are determined, with the adoption of AMP (Accelerated Mobile Pages) no longer a prerequisite for inclusion within this section.

In future, top stories will be based on an evaluation of Page Experience factors, with non-AMP pages able to appear alongside AMP pages.

When will Page Experience roll out?

Google state that changes around Page Experience “will not happen before next year”, and promise to give at least 6 months’ notice before any roll out takes place.

This gives webmasters plenty of time to get ready for the changes, with preparation hopefully made easier through the early incorporation of Page Experience metrics into tools like Google Search Console, Lighthouse, and PageSpeed Insights.

Check out our recent blog posts for the latest news, and if you’re interested in finding out more about what we can do for you, get in touch with us today.

Fusion Win Retail Marketing Campaign of the Year

We are very proud to announce that Fusion Unlimited & Halfords have been awarded Retail Marketing Campaign of the Year at this September’s Online Retail Awards.

The special recognition award highlights our combined efforts with Halfords across PPC, SEO, affiliates and content marketing, with Fusion Unlimited coming out ahead of competitors from across the online retail sector.

The Online Retail Awards aim to showcase the achievements of online retailers and digital agencies regardless of size, with international and independent businesses nominated in the same space. The awards highlight websites that are “the embodiment of excellence for their customers”, seeking out “examples of retailers’ web, mobile and tablet strategies that offer great online shopping experiences for customers”.

We’ve helped deliver best-in-class digital strategies alongside Halfords for more than 10 years and it’s always rewarding to be recognised for our performance-orientated approach.

Following our accreditation as an ‘Elite’ agency in this year’s Drum Independent Agency Census, 2016 is proving to be a great year for the team and our clients.

Fusion Nominated for 3 UK Search Awards


We’re delighted to share that Fusion Unlimited has been shortlisted for three awards at this November’s UK Search Awards, for our work across PPC, content marketing, and proprietary software development.

Our creation of a bespoke, hyper-local PPC campaign for Your Move and Reeds Rains has been shortlisted in the Best Local Campaign category. In the Best Use of Content Marketing category, we’ve been nominated for our execution of “The Ultimate UK Camping Guide” campaign alongside Halfords. Last but by no means least, our fresh-from-the-lab Feed Catalyst tool is in contention for the title of Best Search Software Tool.

Now in its 6th year, the UK Search Awards is one of the most renowned celebrations of PPC, SEO, and Content Marketing work in the UK, spanning 28 categories and attracting hundreds of entries each year.

We’re proud to have been recognised for our hard work and innovation, and look forward to seeing if we can bring the awards home on the evening!

Google Updates Penguin Algorithm

Last Friday Google confirmed the fourth major update of its Penguin algorithm, “Penguin 4.0”. The news comes nearly two years after the previous update, Penguin 3.0, which on release in late October 2014 affected around 1% of UK/US search results.

Alongside the update Google has announced that Penguin is now part of its core algorithm, effectively meaning that Penguin 4.0 is the last update webmasters will see.

What is Penguin?

First launched in April 2012, Penguin is designed to stop websites seen to be using “spammy” techniques from appearing in Google’s search results. The algorithm looks to identify and penalise sites using “bad links”, which have been bought or acquired in an attempt to boost ranking positions.

Sites caught out by Penguin typically see a sharp drop in ranking positions, with recovery only a possibility after a number of steps have been taken to remove links seen as toxic.

Even after these steps have been taken, a site might not see recovery until the next refresh of the Penguin algorithm. As Penguin has traditionally been refreshed manually, many site owners have faced a long wait for improvements to be seen.

However, with Penguin 4.0 come two important changes.

Penguin 4.0 runs in real time

As part of the core algorithm, Google has said that Penguin will now run on a real time basis, in contrast to the manual refreshes typical of previous updates. This means that if a site is affected by the algorithm, and efforts are made to rectify any issues, then recovery of rankings should take place fairly quickly; basically, as soon as a site is re-crawled and re-indexed.

As Penguin is effectively now running constantly, Google’s Gary Illyes has stated that the company is “not going to comment on future refreshes”. Although not the end of Penguin, this marks the end of the algorithm as most webmasters have come to know it.

Penguin 4.0 is granular

Previously, the Penguin algorithm affected sites in a blanket way; even if only one page had one “bad link”, the whole site could be penalised.

Now, Google has said that Penguin “devalues spam by adjusting ranking based on spam signals, rather than affecting ranking of the whole site”. Rather than a whole site being negatively affected, Penguin will now look to penalise on a page-by-page or folder-by-folder basis. This means that whilst some pages, sections, or folders may receive penalties, others will not be affected.

Google has yet to confirm whether Penguin 4.0 has been fully rolled out, with many predicting that the full update is likely to take place over a few weeks. Although webmasters could pre-empt any negative effects by performing a link detox, it’s positive to know that penalised sites will no longer face a long and frustrating road to recovery.

Fusion SEO Market Updates: April 2016

Google issues new mobile friendly warnings

A month after boosting the mobile-friendly algorithm, Google have changed the way in which they inform site owners that their website is not optimised for mobile users.

When a site owner searches for their own website on their mobile phone, if it’s not optimised, the result for the site will include a small notice above the meta description saying, “Your page is not mobile-friendly”. The message is a hyperlink which, when clicked, takes users through to a Google help page with more information about mobile-friendliness. For all other users searching for the website, no such message is displayed.

Google’s John Mueller has confirmed that the feature is an experiment to see how mobile friendliness can be boosted across the internet.

Sites penalised for free product review links

In the first week of April, Google issued penalties to websites found to be hosting “unnatural outbound links”. Issued by the Google manual actions team, the penalties target websites linking out to other sites with the aim of manipulating Google rankings.

Several days after the penalties were issued, it emerged that the unnatural link building in question specifically related to free product reviews featured by bloggers in exchange for links.

Following Google’s guidelines issued several weeks earlier advising bloggers to disclose free products and to ‘nofollow’ their links (i.e. add rel="nofollow" to the anchor tag), Google has now acted on its warning, sending manual actions to sites that did not comply.

Google sent 4 million messages about search spam last year

Google has announced its latest development in its bid to clean up search results.

Google explained that over 2015 they noticed a 180% increase in websites being hacked compared with 2014, as well as an increase in the number of websites with sparse, low-quality content. In order to counter this, Google unveiled their hacked spam algorithm late last year, and by sending out 4.3 million manual notices to website owners and webmasters, were able to clean up “the vast majority” of the issues stated.

Google also saw a 33% increase in the total number of sites going through the reconsideration process, which underlines the importance of verifying your website in Google Search Console so that you receive alerts when Google finds issues with your site.

Additionally, Google received over 400,000 spam reports submitted by users and was able to act on a whopping 65% of them. To help webmasters resolve these kinds of issues, the company also aired over 200 Hangouts during the year.

Fusion SEO Market Updates: April 2015


Google finally rolls out mobile friendly update

On the 21st of April, Google finally began to roll out its much-anticipated mobile friendly update. Announced early in the year, the exact nature and effect of the update had been heavily speculated about within the SEO community, with a reported 4.7% of webmasters making changes to ensure that their sites fit within Google’s requested parameters.

However, at the time of writing the update has had a far smaller impact than previously anticipated. As of the 1st of May, Google have said that the algorithm has fully rolled out in all of its data centres. However, the majority of webmasters have reported no big changes in mobile search results rankings, and those who’ve been tracking the update have seen no significant impact, as seen in the below graph from Moz.

[Graph: Moz tracking of mobile search rankings, April 2015]

Google’s Gary Illyes stated that as many sites have not yet been re-indexed, they aren’t yet being affected by the new scores. This means it’s still possible for “unaffected” sites to be hit, and it’s still recommended that sites which are not yet mobile friendly be made so.

Google tests lightweight mobile results for slow connections

Google have continued their recent focus on mobile search optimisation by testing a “lightweight” display for mobiles on slow connections. Initially announced as simply affecting mobile SERPs, Google have now also given webmasters the option to show a “toned down” version of their site to users on a slow connection. Whilst the lightweight version of the search results page is automatic, the decision to strip out heavy images and files on a site will be down to webmasters.

However, in tests on users in Indonesia, Google reported that sites which had opted in to the lightweight display loaded four times faster, used 80% fewer bytes, and saw a 50% increase in mobile traffic – something surely likely to influence whether webmasters opt in.

Search Queries report being randomly replaced by Search Analytics in Webmaster Tools

At the beginning of the year, Google tested a new “Search Impact” report amongst a few select users, which has now been renamed “Search Analytics”. As well as the standard features of its predecessor, the new report displays clicks, impressions, CTR and average search results position. On top of this, Search Analytics allows these metrics to be compared and broken down by specific queries, pages, devices, and countries.

Google’s Webmaster Trends Analyst Zineb Ait Bahajji also commented that the report is “slow to catch up” at the moment, having only 90 days of data. However, this is expected to increase shortly. Whilst at the moment Search Analytics is only available to a random selection of users, it’s expected that at some point it will receive a full rollout and replace the Search Queries report.

Google begins replacing URL search result snippet with breadcrumb pathway

After a long period of testing, Google has finally started to replace the site URL in search result snippets with a site name and breadcrumb pathway. This update comes after years of beta testing and randomly selected rollouts, and is designed, Google has stated, to “better reflect the names of websites”.

With this update, webmasters will be given the opportunity to better reflect site structure and content to users, displaying a “real world” version of the site rather than a bare domain name. At the time of writing, the update has only affected mobile results in the US, but a worldwide rollout is expected in the near future.

In order for these changes to take effect, webmasters will have to implement specific site name and breadcrumb schema within a site’s source code, as sketched below.
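
As a rough illustration (the names and URLs here are invented for the example, not taken from any particular site), the markup involved is schema.org WebSite data carrying the preferred site name, plus a BreadcrumbList describing the pathway. The sketch below builds both as JSON-LD, one of the formats Google’s structured data tools accept:

```typescript
// Sketch: generating the site name (WebSite) and breadcrumb (BreadcrumbList)
// structured data as JSON-LD. All names and URLs are illustrative.

interface Crumb {
  name: string; // label shown in the breadcrumb trail
  url: string;  // canonical URL of that level of the site
}

// Build a schema.org BreadcrumbList from an ordered trail of pages.
function buildBreadcrumbJsonLd(trail: Crumb[]): object {
  return {
    '@context': 'https://schema.org',
    '@type': 'BreadcrumbList',
    itemListElement: trail.map((crumb, i) => ({
      '@type': 'ListItem',
      position: i + 1, // positions are 1-based, in trail order
      name: crumb.name,
      item: crumb.url,
    })),
  };
}

// schema.org WebSite markup carrying the preferred site name.
const siteNameJsonLd = {
  '@context': 'https://schema.org',
  '@type': 'WebSite',
  name: 'Example Store',
  url: 'https://www.example.com/',
};

const breadcrumbJsonLd = buildBreadcrumbJsonLd([
  { name: 'Home', url: 'https://www.example.com/' },
  { name: 'Books', url: 'https://www.example.com/books' },
  { name: 'Science Fiction', url: 'https://www.example.com/books/sci-fi' },
]);

// Each object would be embedded in the page as:
// <script type="application/ld+json"> ...JSON here... </script>
console.log(JSON.stringify(siteNameJsonLd));
console.log(JSON.stringify(breadcrumbJsonLd));
```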

Image Source: http://searchengineland.com/googles-mobile-friendly-algorithm-a-week-later-was-it-really-mobilegeddon-219893

Fact Based Search Ranking: Is Google Smarter than You?


The average person today will digest more information than at any other point in history. Through the internet, music, TV and plain old fashioned print media, they’ll encounter around 100,000 words. Or about 2.5 novels. In total, they’ll process the equivalent of 34 gigabytes of information every day; 5 times more than 30 years ago.

These figures could give the impression that society in 2015 is more educated. With Google, Siri, and blogs like this just a few clicks away, we can encounter a wealth of information, learning whatever we feel like, whenever we feel like it. Want to know tomorrow’s weather? Who was King of France in 1390? How tall Noel Edmonds is? There’s nothing stopping you.

However, have you ever thought that a lot of the information you encounter, process and learn might be wrong? Google has, and they want to rectify this.

For just under a year, Google has been developing their Knowledge Vault, a huge store of information drawn from across human history. Knowledge Vault is an ever-expanding database that autonomously collects data from around the internet. This information is then cross-referenced with similar or corresponding information in order to sift facts from falsities.
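
To make that cross-referencing idea concrete, here is a toy sketch (emphatically not Google’s actual method, and the “facts” in it are invented): extracted claims are modelled as subject-predicate-object triples, and each candidate value is scored by the share of sources that agree on it.

```typescript
// Toy illustration of agreement-based fact scoring, not Google's algorithm.
// Each extraction is a subject-predicate-object triple pulled from one source.

type Triple = { subject: string; predicate: string; object: string };

// Score every candidate object for each (subject, predicate) pair by the
// fraction of extractions that assert it.
function factConfidence(extractions: Triple[]): Map<string, number> {
  const bySlot = new Map<string, Map<string, number>>();
  for (const t of extractions) {
    const slot = `${t.subject} | ${t.predicate}`;
    const counts = bySlot.get(slot) ?? new Map<string, number>();
    counts.set(t.object, (counts.get(t.object) ?? 0) + 1);
    bySlot.set(slot, counts);
  }

  const confidence = new Map<string, number>();
  for (const [slot, counts] of bySlot) {
    let total = 0;
    counts.forEach((n) => (total += n));
    for (const [object, n] of counts) {
      confidence.set(`${slot} | ${object}`, n / total);
    }
  }
  return confidence;
}

// Three sources agree on one (made-up) height, one source disagrees:
// the majority claim scores 0.75, the outlier 0.25.
const extracted: Triple[] = [
  { subject: 'Noel Edmonds', predicate: 'height', object: '5ft 10in' },
  { subject: 'Noel Edmonds', predicate: 'height', object: '5ft 10in' },
  { subject: 'Noel Edmonds', predicate: 'height', object: '5ft 10in' },
  { subject: 'Noel Edmonds', predicate: 'height', object: '6ft 2in' },
];
console.log(factConfidence(extracted));
```

As the rest of this article goes on to argue, scoring truth by agreement alone is exactly what makes widely repeated falsehoods a problem for this kind of system.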

Google’s existing Knowledge Graph works in a similar way, albeit on a smaller scale. However, rather than compiling information from the whole of the internet, the Knowledge Graph uses “trusted” sources like Wikipedia and Freebase to offer a collection of related and relevant information on a given search term. For example, if I search “Noel Edmonds”, Knowledge Graph provides a collection of useful and unimaginably interesting facts on the man himself, as visible below.

[Image: Google’s Knowledge Graph panel for Noel Edmonds]

Very recently, a Google research team published a research paper announcing aspirations to implement Knowledge Vault as a search ranking factor. This means that rather than a collection of information simply being shown to users alongside search results – as with Knowledge Graph – the Vault would control all the information on the search results page. Sites that contain information Google considers true would be ranked highly, and sites that contain dubious information would be penalised.

Whilst this is a suggestion still only in its formative period, it’s one that would entirely alter the way Google search works.

At the moment, sites are ranked according to a number of factors, one of these being links. The more links a site has from trustworthy sources, the more trustworthy that site is considered. This is a largely human process; when you link to a site, you’re showing a vote of confidence.

However, a ranking based on the Knowledge Vault would take away this human influence. As the Vault is an autonomous system, it and it alone decides what separates fact from fiction, and what makes a site trustworthy.

Current ranking factors like links are far from perfect; something testified to by algorithms like Penguin, designed to halt manipulative link-building. However, possibilities for manipulating the Knowledge Vault in theory still exist. If the Vault is simply collecting together information it views as similar, and deciding truthfulness based on this, then what’s to stop webmasters from sprinkling their “facts” across the web in an attempt to manipulate higher rankings? Plus, what about dubious information that large numbers of people on the web consider to be true? Does this mean that moon landing conspiracy theories and folk health remedies should be considered facts, and afforded a high ranking? What about “facts” that are opinion based? Should the statement “Noel Edmonds’ best days are behind him” be deemed any more truthful than “Noel Edmonds has a long and fruitful future ahead in show business”?

Perhaps more importantly, the implementation of a Knowledge Vault based ranking system is a step towards Google controlling a large flow of information. Whereas with the current ranking system a piece of dubious information can be argued against and a healthy discussion formed, with the implementation of this algorithm there would be no need for discussion; just a nod of the head as Google pumps out a stream of complete, inarguable “facts”. With this move, Google could be taking the power to invest confidence in information and sites away from users; something surely more important than encountering the odd “spider eats dog” article.

With this being said, and as Google haven’t set out any firm plans for implementation, at this point we can only speculate how a knowledge-based search ranking system would work. It may be that Google simply decides to implement a fact ranking alongside existing systems – perhaps displayed within a search snippet – which at the time of writing seems a safer and more feasible option. In the unlikely eventuality that a full overhaul does take place, users may even become savvier, and more clued up about whether they’re being shown sketchy information. In any case, it’s not as if Google has never made big changes to the way search works before, and we’ll look forward to watching and adapting to whatever plays out.