Google Releases May 2022 Core Algorithm Update

On 25th May 2022, Google announced the release of a new broad core algorithm update.

As with recent updates, the stated rollout period is around 1-2 weeks, meaning the rollout is likely to complete at some point during the week commencing 6th June.

As the name suggests, broad core algorithm updates are designed to be a general “refresh” of Google’s algorithmic ranking processes and are not intended to target any particular website niches or areas of organic search. Websites can see a change in ranking performance as a result of updates, both positive and negative, but it’s also possible to see a negligible impact.

In the Search Central blog post announcing the update, Google’s Danny Sullivan wrote:

Core updates are changes we make to improve Search overall and keep pace with the changing nature of the web. There’s nothing in a core update that targets specific pages or sites. Instead, the changes are about improving how our systems assess content overall. These changes may cause some pages that were previously under-rewarded to do better.

https://developers.google.com/search/blog/2022/05/may-2022-core-update

Websites may be positively or negatively impacted by an update, but regardless of this the official advice remains the same. In summary, there’s nothing specific that webmasters need to do in response, and the focus should remain on creating “quality content”. In Google’s own words:

Pages that drop after a core update don’t have anything wrong to fix. This said, we understand those who do less well after a core update change may still feel they need to do something. We suggest focusing on ensuring you’re offering the best content you can. That’s what our algorithms seek to reward.

https://developers.google.com/search/blog/2019/08/core-updates

With the last core algorithm refresh released in late November 2021, and updates appearing to be pushed out on a roughly six-monthly schedule, it’s highly likely that another similar update will take place in late 2022.

If you found this update useful, check out our latest blog posts for the latest news, and if you’re interested in finding out more about what we can do for your brand, get in touch with the team today.

SEO Market Updates: December 2021

Join Fusion’s SEO team as we round up last month’s major industry updates

December 2021 Product Reviews Update

On 1st December Google announced the rollout of a new product reviews update, following on from a previous update released in April 2021.

The news was announced within a new Search Central blog post, with Google writing that the update was designed to reward “high-quality product reviews” and that webmasters may notice changes in how their reviews are ranked as a result.

According to Google, the decision to release a new update was largely based on new feedback from users on what is viewed as “trustworthy” or “useful” review content. This feedback has formed the basis of two new best practice recommendations, taken into account in the most recent update:

  • Provide evidence such as visuals, audio, or other links of your own experience with the product, to support your expertise and reinforce the authenticity of your review.
  • Include links to multiple sellers to give the reader the option to purchase from their merchant of choice.

As well as forming a part of the most recent update, the new recommendations have now also been added to Google’s official documentation around product reviews.

As with the previous update, only websites offering product reviews should be affected by the recent release, with no other content types impacted.

Google Search Console Experiences Widespread Bugs

December was a rocky month for Google Search Console, with the platform experiencing at least two widespread issues impacting the accuracy and accessibility of data.

In mid-December, many webmasters reported a large spike in redirect issues within the platform, often across multiple websites. Following coverage of this, on 13th December Google stated that the reported spikes were false and were due to an issue with the platform itself.

Just a few days later another issue was reported, with some users finding that they were essentially locked out of their accounts. Google again announced this was a bug.

Google has since confirmed that both issues are now resolved, although some within the industry are continuing to report sporadic issues with the platform. No further information has been provided as to the reasons behind the bugs, with Google simply stating that they were “internal issues”.

No Penalty For Failing to No-Follow Affiliate Links

In a recent Q&A session, Google’s John Mueller stated that failing to correctly no-follow affiliate links is unlikely to pose a real issue.

In answer to the question “Would I be penalized if I don’t set the rel sponsored or rel no follow for my affiliate links?”, Mueller stated:

Probably not. […] From our point of view, affiliate links fall into that category of something financial attached to the links, so we really strongly recommend to [add a rel sponsored or rel no-follow tag]. But for the most part if it doesn’t come across as you selling links, then it’s not likely to be the case that [Google] would manually penalize a website for having affiliate links and not marking them up.

As affiliate links indicate some kind of financial relationship between the linking and linked-to website, Google considers it best practice to ensure that they are tagged with a “rel=sponsored” attribute or “rel=nofollow” attribute. Both tags prevent equity from being transferred to the linked-to website, with the sponsored tag also clearly indicating to search engines that a financial agreement is in place between the two websites.

However, Mueller’s answer indicates that failing to use the tag for affiliate links is unlikely to cause any real issue, despite this being contrary to best-practice guidelines.
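For those wanting to audit this at scale, the sketch below is one way of flagging affiliate links that carry neither attribute. It’s a minimal, illustrative script only: the page URL and affiliate domains are hypothetical placeholders, and it assumes the requests and BeautifulSoup libraries are available.

```python
# Minimal sketch: flag outbound links on a page that carry no rel="sponsored"
# or rel="nofollow" attribute. The URL and affiliate domains are hypothetical.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

PAGE_URL = "https://www.example.com/best-bike-lights/"   # hypothetical page
AFFILIATE_DOMAINS = {"amazon.co.uk", "awin1.com"}         # hypothetical networks

html = requests.get(PAGE_URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for a in soup.find_all("a", href=True):
    host = urlparse(a["href"]).netloc.lower()
    if any(host.endswith(domain) for domain in AFFILIATE_DOMAINS):
        rel_values = set(a.get("rel", []))   # BeautifulSoup returns rel as a list
        if not rel_values & {"sponsored", "nofollow"}:
            print(f"Untagged affiliate link: {a['href']}")
```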

Watch the question at 31:58 here.

100k URLs Unlikely to Pose Crawl Budget Issue

Google’s John Mueller recently stated that websites sized around 100k URLs shouldn’t encounter issues with crawl budget. In response to a question on Twitter focused on whether to de-index lower-quality content, Mueller tweeted:

https://twitter.com/JohnMu/status/1473268153316691975

Although Google’s official documentation on crawl budget contains brief definitions of what it considers a “Large site” (1,000,000+ URLs) and a “Medium or larger site” (10,000+ URLs), there is no information given on how well Google is able to handle each size bracket.

As such, Mueller’s tweet provides a useful – albeit informal – guideline for webmasters to follow when considering the impact of website size.
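As a rough way of seeing where a site falls against those documented brackets, a sitemap URL count can serve as a proxy. The sketch below is illustrative only: the sitemap URL is a hypothetical placeholder, sitemap index files are ignored, and the true crawlable URL count may differ.

```python
# Rough sketch: count URLs in an XML sitemap and place the site in Google's
# documented crawl-budget brackets (10,000+ "medium or larger", 1,000,000+ "large").
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # hypothetical

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
url_count = len(root.findall("sm:url", ns))

if url_count >= 1_000_000:
    bracket = "large site"
elif url_count >= 10_000:
    bracket = "medium or larger site"
else:
    bracket = "below Google's documented crawl-budget brackets"

print(f"{url_count} URLs in sitemap -> {bracket}")
```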

If you found this update useful, check out our latest blog posts for the latest news, and if you’re interested in finding out more about what we can do for your brand, get in touch with the team today.

SEO Market Updates: August 2021

Join Fusion’s SEO team as we round up last month’s major industry updates.

Google Updates How Title Tags Are Generated

In late August Google announced that they had “introduced a new system of generating titles for web pages” in a new post within the Search Central blog. The statement came following widespread speculation across the industry, with webmasters and SEOs initially noticing that something had changed around the 17th of August.

According to Google, while the titles shown in search results could previously change to match a user’s query, with the new system this will no longer happen. Instead, webpages will now display the same title regardless of the query, in order to produce “titles that work better for documents overall”.

In addition, Google is now making greater use of on-page elements to generate title tags, in particular “text that humans can visually see when they arrive at a web page”. This includes key on-page elements like H1s and other header tags, alongside content that is “large and prominent through the use of style treatments” such as pull-quotes or text within links. The rationale is to avoid instances where manually input HTML title tags are too long, contain boilerplate text, or have obvious instances of keyword stuffing.
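One quick way to anticipate how a title might be rewritten is to compare the HTML title tag against the page’s visible H1. The sketch below is a minimal illustration assuming the requests and BeautifulSoup libraries; the URL is a hypothetical placeholder, and a mismatch doesn’t guarantee a rewrite.

```python
# Quick sketch: compare a page's HTML <title> with its first <h1>, since the
# new system leans more heavily on visible headings when rewriting titles.
import requests
from bs4 import BeautifulSoup

PAGE_URL = "https://www.example.com/some-article/"  # hypothetical

soup = BeautifulSoup(requests.get(PAGE_URL, timeout=10).text, "html.parser")
title = soup.title.get_text(strip=True) if soup.title else ""
h1 = soup.h1.get_text(strip=True) if soup.h1 else ""

print(f"<title>: {title}")
print(f"<h1>:    {h1}")
if title and h1 and title.lower() != h1.lower():
    print("Title and H1 differ - the SERP title may be rewritten from on-page elements.")
```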

Google has made use of on-page elements to produce title tags for years, but early analysis suggests that this is now happening with far greater frequency. One limited case study indicates a 77% drop in HTML title tag usage following the change, although the search engine maintains that the updates are limited and that HTML tags are still used “more than 80% of the time”.

Despite the update, Google says that webmasters should still focus on “creating great HTML title tags”. Google also acknowledge that the new system is still being refined, and are welcoming feedback from webmasters.

Safe Browsing Dropped As Page Experience Signal

Last month Google stated that the Safe Browsing requirement would no longer form part of its Page Experience signal. The announcement came as part of a post on wider changes made to Search Console’s Page Experience report, which also included news on the removal of the Ad Experience tool.

In the post, Google explain that as issues such as malware or third-party site hijacking are “not always in the control of site users […] we’re clarifying that Safe Browsing isn’t used as a ranking signal and won’t feature in the Page Experience report”. This means that the signal was not considered within the recent Page Experience update, which completed its rollout on 31st August.

To further clarify the factors which now form part of the Page Experience signal, Google also released an updated graphic within the post.

Safe Browsing issues will still be flagged within other areas of Google Search Console and should of course remain a wider consideration for SEOs and webmasters.

Google Search Console Experiences Data Loss

Google Search Console experienced a widespread data loss issue in late August, resulting in many site owners reporting dramatic drops in clicks and impressions to their websites.

The issue impacted the 23rd and 24th of August, and according to Google was due to an “internal problem”. In a post within the Search Console support hub, Google explain that “Users might see a significant data drop in their performance reports during this period. This does not reflect any drop in clicks or impressions for your site, only missing data in Search Console.”

Although many initially assumed that the issue was a glitch or reporting delay, this appears to be real data loss, with Google’s John Mueller stating the traffic information is unlikely to be backfilled.

Google Search Console is relied on widely to provide detailed insight into organic performance, and the data loss means that many SEOs may be reporting slightly lower than average levels of traffic for August. Although unfortunate, this is purely an artificial drop, and it’s safe to assume that clicks & impressions for the impacted dates will be slightly higher than GSC reports.

Google Made 4500 Changes To Search In 2020

According to a recent post by Danny Sullivan, around 4,500 changes to Google Search were made during 2020. These changes likely cover everything from smaller user interface tweaks to larger algorithm updates, with many unlikely to be noticed by everyday users.

The announcement came within a wider post announcing the launch of an updated How Search Works portal, which aims to give insight into how Google approaches “the big, philosophical questions, along with the nitty-gritty details”. Explaining, Sullivan writes:

“On the site, you can find details about how Google’s ranking systems sort through hundreds of billions of web pages and other content in our Search index — looking at factors like meaning, relevance, quality, usability and context — to present the most relevant, useful results in a fraction of a second. And you can learn about how we go about making improvements to Search. (There have been 4,500 such improvements in 2020 alone!)”

Although only a brief comment, with around 3,200 changes made in 2019 and 980 reportedly made in 2014, this indicates that Google is rapidly increasing the rate at which it updates Search.

If you found this update useful, check out our latest blog posts for the latest news, and if you’re interested in finding out more about what we can do for your brand, get in touch with the team today.

SEO Market Updates: June 2021

Join Fusion’s SEO team as we round up last month’s major industry updates.

PAGE EXPERIENCE ALGORITHM NOW ROLLING OUT

On 15th June Google announced the rollout of its much-anticipated Page Experience algorithm.

Incorporating the new Core Web Vitals metrics, the algorithm measures a range of factors broadly related to page usability and user experience, including:

  • Page speed
  • Interactivity
  • Visual stability
  • Mobile-friendliness
  • Safe browsing
  • HTTPS usage
  • Usage of intrusive interstitials

Sites that are marked as optimal across the above factors will be considered as offering good page experience and may be favoured in SERPs as a result. However, according to Google, sites should not expect to see drastic changes as an immediate result of the update.

Although the current update only applies to mobile devices, Google has confirmed that Page Experience will become a ranking factor for desktop in the near future. A timeline for this has not yet been set out, with an announcement expected closer to the time of release.

To find out more about what to expect from the new update and our approach to measuring Page Experience, read our dedicated blog post here.

BROAD CORE ALGORITHM UPDATE RELEASED ON 2nd JUNE

Prior to the release of the planned Page Experience update, earlier in June Google rolled out a previously unannounced broad core update.

Referred to as the June 2021 Core Update, the release began to roll out on the 2nd of June and finished around the 12th. Unusually, Google announced that this would be a two-part update, with the second round of changes taking place at some point in July. Google’s Danny Sullivan clarified:

  “Some of our planned improvements for the June 2021 update aren’t quite ready, so we’re moving ahead with the parts that are, then we will follow with the rest with the July 2021 update. Most sites won’t notice either of these updates, as is typical with any core updates.”

Google have not disclosed any exact details as to the changes made in the two updates, simply stating the update is fairly typical and that sites may see a negative, positive, or negligible impact. As has become usual with core updates, Google also maintained that there’s nothing in particular for webmasters to do in response.

… AND A NEW TWO-PART SEARCH SPAM UPDATE

If the Page Experience and June / July 2021 updates were not enough, in late June Google also released another update, this time targeting “search spam”.

Rolling out in two parts, the first release started on the 23rd and completed within a single day, with the second following on the 28th. Confirming the update on the 23rd, Google stated:

“As part of our regular work to improve results, we’ve released a spam update to our systems. This spam update will conclude today. A second one will follow next week.”

Clarification has not been provided on the exact types of spam targeted in the releases, with Google simply advising webmasters to follow their best practice guidelines for search.

GOOGLE LAUNCHES SEARCH CONSOLE INSIGHTS

Google Search Console received another round of features in June, with the addition of a new “Insights” report.


Insights joins together data from Google Search Console and Google Analytics in an effort to make it easier to analyze the performance of site content. In Google’s words, Insights aims to help site owners answer the below questions:

  • “What are your best performing pieces of content, and which ones are trending?”
  • “How do people discover your content across the web?”
  • “What do people search for on Google before they visit your content?”
  • “Which article refers users to your website and content?”

The new tool began rolling out in mid-June and should now be available to most Google Search Console users. Site owners can either access Insights directly through Search Console, or via a new portal on the Google site.

ROBOTS.TXT ON SHOPIFY SITES NOW EDITABLE

Owners of Shopify sites are now able to manually upload and edit robots.txt files. The new feature was announced on Twitter by Shopify CEO Tobi Lutke, and as of 21st June should be fully rolled out.

Shopify had previously applied a default robots.txt file to all websites, with no clear workaround should webmasters need to edit the file. However, the file can now be manually changed via the robots.txt.liquid theme template, with site owners able to:

  • Block certain crawlers
  • Disallow (or allow) certain URLs from being crawled
  • Manually add extra sitemap URLs
  • Add crawl-delay rules for specific crawlers.

While Shopify maintains that the default robots.txt “works for most stores”, the new functionality ultimately gives greater control to site owners and is likely to be welcomed by SEOs working with Shopify sites.
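Once the robots.txt.liquid template has been edited, it’s worth confirming that the live file behaves as intended. The sketch below is one way of doing this with Python’s built-in robots.txt parser; the store domain and paths are hypothetical placeholders.

```python
# Simple sketch: after editing robots.txt.liquid, confirm that the live
# robots.txt now blocks (or allows) the intended URLs for Googlebot.
from urllib.robotparser import RobotFileParser

STORE = "https://example-store.myshopify.com"  # hypothetical store domain

parser = RobotFileParser(f"{STORE}/robots.txt")
parser.read()

for path in ["/collections/sale", "/pages/about-us"]:  # hypothetical paths
    allowed = parser.can_fetch("Googlebot", f"{STORE}{path}")
    print(f"{path}: {'crawlable' if allowed else 'blocked'} for Googlebot")
```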

If you found this update useful, check out our latest blog posts for the latest news, and if you’re interested in finding out more about what we can do for your brand, get in touch with the team today.

SEO Market Updates: August 2020

Join Fusion’s technical SEO team as we round up last month’s major industry updates.

Search Console Insights Report Now In Beta

Following months of testing, Google has officially announced the release of its Google Search Console Insights report.

Google describes the Insights report as “a new experience tailored for content creators and publishers”, designed to help increase understanding of how users discover and engage with site content.

The report provides a combination of data from Google Analytics and Google Search Console from the previous 28 days, roughly showing:

  • An overview of all Page Views
  • A breakdown of Page Views by channel
  • Page Views & Avg. Page Duration for recently published & top performing content
  • Top clicked keywords
  • Top referral links
  • Top social channels

At the time of writing Insights is in closed Beta, and is only accessible to users by invitation. However, if you don’t think you’ve received an invitation you may still be able to access data for some sites – just head to this link whilst logged in to your GSC account to see if you have access.

As with all Beta releases, it’s unclear when or whether the feature will receive a full rollout, although it’s likely that an open Beta will be released in the coming months.

New Dev Tools & Lighthouse Features Coming to Chrome 86

Google has provided an outline of the new Dev Tools and Lighthouse report features that will be available in Chrome 86, which is currently expected to be released on October 6th 2020.

The updated Dev Tools will contain a number of new debugging and auditing features, including:

  • New Media panel: Updated to allow users to more efficiently view and debug video content
  • Capture Node Screenshots: Available via a dropdown within the Elements panel, allowing for individual nodes to be selected and captured
  • Emulate Missing Local Fonts: Makes the browser act as if fonts are missing, providing greater insight into how fonts are fetched

Chrome 86 is also set to be released with a new version of the Lighthouse report. Alongside a number of bug fixes, Lighthouse 6.2 is also set to contain the below new capabilities:

  • Avoid non-composited animations: Flags animations that are not composited and may therefore appear janky or contribute to layout shift (CLS)
  • Avoid long main thread tasks: Provides info on the longest main thread tasks
  • Unsized image elements: Reporting on whether image elements have a set height and width

Google “Glitch” Causes Ranking Anomalies

Around the 10th of August many within the SEO community reported widespread and sudden ranking fluctuations, leading most to assume an algorithm update was in process.

This would have been an unwelcome and unexpected surprise, as in recent years Google has largely warned in advance of updates. Google have also specifically stated that they are unlikely to make any significant algorithm changes within 2020, whilst the industry deals with the fallout from COVID-19.

However, fluctuations were soon followed by reports of stabilization, and it soon became apparent that the changes were the result of what Google has referred to as a “glitch”.

 

https://twitter.com/JohnMu/status/1293045032124059649

In typical fashion, the statement from Google on the exact causes of the glitch was fairly oblique, simply stating that the changes were a result of an “indexing issue”.

Server Side Tagging Now Available in GTM

Google Tag Manager and Tag Manager 360 have been updated with a new server side tagging feature.

 Server side tagging allows companies to host third party tags within a Google Cloud hosted server container, rather than directly on a website. This means that when a user visits a site with server side tagging in place, the tags will be loaded directly within the cloud rather than on a webpage.

Whilst primarily reported in PPC circles, the new feature should also open up benefits for those working in SEO. Third party tags are a common contributor to performance and site speed issues, and more often than not the solution to dealing with these issues isn’t entirely simple. However, if a client is using GTM to serve third party tags, the new feature offers SEOs a relatively simple way to improve performance.
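To illustrate the general pattern (rather than GTM’s own implementation, which is configured through the GTM interface), the sketch below shows a toy first-party endpoint that receives a single event from the browser and fans it out to vendors server side. All endpoints are hypothetical, and it assumes the Flask and requests libraries.

```python
# Illustrative sketch only - not Google Tag Manager itself. It mimics the
# server-side pattern: the browser sends one request to a first-party endpoint,
# and the server forwards the event to third-party vendors, so no vendor
# JavaScript needs to run on the page.
import requests
from flask import Flask, request

app = Flask(__name__)

VENDOR_ENDPOINTS = [
    "https://analytics.example-vendor.com/collect",  # hypothetical vendors
    "https://ads.another-vendor.com/event",
]

@app.route("/collect", methods=["POST"])
def collect():
    event = request.get_json(force=True)
    # Fan the event out to vendors from the server, not the browser.
    for endpoint in VENDOR_ENDPOINTS:
        requests.post(endpoint, json=event, timeout=5)
    return "", 204

if __name__ == "__main__":
    app.run(port=8080)
```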

 

If you found this update useful, check out our latest blog posts for the latest news, and if you’re interested in finding out more about what we can do for your brand, get in touch with the team today.

 

Google Announces New Page Experience Signal

On Thursday Google announced the addition of a set of new user experience metrics to its growing list of ranking factors.

The additions – which Google is referring to as “Page Experience” metrics – will be designed to evaluate how users perceive browsing, loading and interacting with specific webpages, and incorporate criteria measuring:

  • Page load times
  • Mobile friendliness
  • Incorporation of HTTPS
  • The presence of intrusive ads or interstitials
  • Intrusive moving of page content or page layout

Webmasters should already be familiar with many of these factors, with recent years seeing Google driving home the importance of mobile friendliness, page speed, HTTPS adherence and avoidance of intrusive interstitials.

However, the new Page Experience signal also includes areas from the new “Core Web Vitals” report, recently incorporated into Google’s PageSpeed Insights and Search Console tools.

What are Core Web Vitals?

Core Web Vitals are a trio of metrics designed to evaluate a user’s experience of loading, interaction, and page stability when visiting a web page:

  • Largest Contentful Paint (LCP): This measures the perceived loading performance of a page, or the time passed before the main page content is visible to users. An LCP time of 2.5 seconds or less is viewed as good, with anything higher in need of improvement.
  • First Input Delay (FID): Measuring interactivity / load responsiveness, or the time it takes for a user to be able to usefully interact with content on the page. An FID of less than 100ms is optimal, with higher scores in need of improvement.
  • Cumulative Layout Shift (CLS): Measuring visual stability, or whether the layout of a page moves or changes while a user is trying to interact. Pages should aim for a CLS of less than 0.1 in order to provide a good user experience.
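For reference, the “good” thresholds above can be expressed as a simple check. The sketch below is purely illustrative, using hypothetical field measurements.

```python
# Minimal sketch classifying measurements against the Core Web Vitals
# "good" thresholds quoted above. The sample values are hypothetical.
CWV_GOOD_THRESHOLDS = {
    "LCP": 2.5,   # seconds
    "FID": 100,   # milliseconds
    "CLS": 0.1,   # unitless layout-shift score
}

measurements = {"LCP": 2.1, "FID": 180, "CLS": 0.05}  # hypothetical page data

for metric, value in measurements.items():
    verdict = "good" if value <= CWV_GOOD_THRESHOLDS[metric] else "needs improvement"
    print(f"{metric}: {value} -> {verdict}")
```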

Largest Contentful Paint and First Input Delay will already be recognisable to most webmasters, with Google’s PageSpeed and Lighthouse tools already providing information on these metrics.

However, Cumulative Layout Shift appears to be new, with Google’s John Mueller stating that the CLS metric has been created to gauge levels of user “annoyance”. CLS looks at the familiar experience of content shifting as a page loads, which Google illustrate with an animated GIF.

What does this change?

Whilst most of the individual metrics within Page Experience are pre-existing ranking factors, the new announcement places them together as one part of an overarching signal.

Google state that they are aiming to provide a more “holistic picture of the quality of a user’s experience on a web page”, by grouping previously separate factors together.

Each factor will be weighted uniquely, although as Google have declined to comment on how this weight will be distributed, it will likely be up to webmasters to determine the importance of each.

The new signal is also set to bring changes to how mobile top stories are determined, with the adoption of AMP (Accelerated Mobile Pages) no longer a prerequisite for inclusion within this section.

In future, top stories will be based on an evaluation of Page Experience factors, with non-AMP pages able to appear alongside AMP pages.

When will Page Experience roll out?

Google state that changes around Page Experience “will not happen before next year”, and promise to give at least 6 months’ notice before any roll out takes place.

This gives webmasters plenty of time to get ready for the changes, with preparation hopefully made easier through the early incorporation of Page Experience metrics into tools like Google Search Console, Lighthouse, and PageSpeed Insights.

Check out our recent blog posts for the latest news, and if you’re interested in finding out more about what we can do for you, get in touch with us today.

Fusion Win Retail Marketing Campaign of the Year

We are very proud to announce that Fusion Unlimited & Halfords have been awarded Retail Marketing Campaign of the Year at this September’s Online Retail Awards.

The special recognition award highlights our combined efforts with Halfords across PPC, SEO, affiliates and content marketing, with Fusion Unlimited coming ahead of competitors across the online retail sectors.

The Online Retail Awards aim to showcase the achievements of online retailers and digital agencies regardless of size, with international and independent businesses nominated in the same space. The awards highlight websites that are “the embodiment of excellence for their customers”, seeking out “examples of retailers’ web, mobile and tablet strategies that offer great online shopping experiences for customers”.

We’ve helped deliver best-in-class digital strategies alongside Halfords for more than 10 years and it’s always rewarding to be recognised for our performance-orientated approach.

Following our accreditation as an ‘Elite’ agency in this year’s Drum Independent Agency Census, 2016 is proving to be a great year for the team and our clients.

Fusion Nominated for 3 UK Search Awards


We’re delighted to share that Fusion Unlimited has been shortlisted for three awards at this November’s UK Search Awards, for our work across PPC, content marketing, and proprietary software development.

Our creation of a bespoke, hyper-local PPC campaign for Your Move and Reeds Rains has been shortlisted in the Best Local Campaign category. In the Best Use of Content Marketing category, we’ve been nominated for our execution of “The Ultimate UK Camping Guide” campaign alongside Halfords. Last but by no means least, our fresh from the lab Feed Catalyst tool is in contention for the title of Best Search Software Tool.

Now in its 6th year, the UK Search Awards is one of the most renowned celebrations of PPC, SEO, and Content Marketing work in the UK, spanning 28 categories and attracting hundreds of entries each year.

We’re proud to have been recognised for our hard work and innovation, and look forward to seeing if we can bring the awards home on the evening!

 

 

SEO Market Updates: September 2016

Google’s Penguin Algorithm now runs in Real-time

On September 23, Google announced that its Penguin filter, designed to devalue sites using link spam as a way to skew results in their favour, would now be updating in real time. Previously this filter was only updated periodically, and sites penalised by it would remain penalised even if their status improved.

With real-time updates, however, Penguin is more granular, affecting only spammy areas of a given site, rather than the entire thing. It also releases pages upon the next crawl of the site if they have changed for the better.

Google Begins to show more AMP results

Google will now begin to show AMP-supported results within the standard organic results, as well as in the top stories carousel, where they have been displayed since February.

Large non-news companies including eBay and Shopify are now beginning to adopt the technology, which aims to provide content with 4x the speed and 10x less data usage.

While there has been no confirmation of a ranking boost for using AMP, Google is showing a label next to pages that support AMP technology.

Google adds new “science Datasets” Rich Data Schema

Last month Google introduced a new schema for marking up scientific data, to be used in rich snippets within search results. The schema can be used to display additional metadata about the scientific information, including the author, source and license.

The types of format relevant to this markup could be:

  • a table or a CSV file with some data;
  • a file in a proprietary format that contains data;
  • a collection of files that together constitute some meaningful dataset;
  • a structured object with data in some other format that you might want to load into a special tool for processing;
  • images capturing the data
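As a rough illustration of what such markup might look like, the sketch below builds a minimal schema.org Dataset object and serialises it as JSON-LD. All field values are hypothetical, and Google’s documentation should be consulted for the full set of supported properties.

```python
# A minimal sketch of how a dataset might be marked up with schema.org's
# Dataset type, serialised as JSON-LD. Field values are hypothetical.
import json

dataset_markup = {
    "@context": "https://schema.org",
    "@type": "Dataset",
    "name": "Monthly rainfall measurements",  # hypothetical dataset
    "description": "Rainfall readings collected at a single weather station.",
    "creator": {"@type": "Person", "name": "J. Researcher"},
    "license": "https://creativecommons.org/licenses/by/4.0/",
    "distribution": {
        "@type": "DataDownload",
        "encodingFormat": "text/csv",
        "contentUrl": "https://www.example.com/data/rainfall.csv",
    },
}

# The output would be embedded in the page inside a
# <script type="application/ld+json"> tag.
print(json.dumps(dataset_markup, indent=2))
```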

Google Updates Penguin Algorithm

Last Friday Google confirmed the fourth major update of its Penguin algorithm, “Penguin 4.0”. The news comes nearly two years after the previous update, Penguin 3.0, which on release in late October 2014 affected around 1% of UK/US search results.

Alongside the update Google has announced that Penguin is now part of its core algorithm, effectively meaning that Penguin 4.0 is the last update webmasters will see.

What is Penguin?

First launched in April 2012, Penguin is designed to stop websites seen to be using “spammy” techniques from appearing in Google’s search results. The algorithm looks to identify and penalise sites using “bad links”, which have been bought or acquired in an attempt to boost ranking positions.

Sites caught out by Penguin typically see a sharp drop in ranking positions, with recovery only a possibility after a number of steps have been taken to remove links seen as toxic.

Even after these steps have been taken, a site might not see recovery until the next refresh of the Penguin algorithm. As Penguin has traditionally been refreshed manually, many site owners have faced a long wait for improvements to be seen.

However, with Penguin 4.0 come two important changes.

Penguin 4.0 runs in real time

As part of the core algorithm, Google has said that Penguin will now run on a real time basis, in contrast to the manual refreshes typical of previous updates. This means that if a site is affected by the algorithm, and efforts are made to rectify any issues, then recovery of rankings should take place fairly quickly; basically, as soon as a site is re-crawled and re-indexed.

As Penguin is effectively now running constantly, Google’s Gary Illyes has stated that the company is “not going to comment on future refreshes”. Although not the end of Penguin, this marks the end of the algorithm as most webmasters have come to know it.

Penguin 4.0 is granular

Previously, the Penguin algorithm affected sites in a blanket way; even if only one page had one “bad link”, the whole site could be penalised.

Now, Google has said that Penguin “devalues spam by adjusting ranking based on spam signals, rather than affecting ranking of the whole site”. Rather than a whole site being negatively affected, Penguin will now look to penalise on a page by page, or folder by folder basis. This means that whilst some pages, sections, or folders may receive penalties, others will not be affected.

Google has yet to confirm whether Penguin 4.0 has been fully rolled out, with many predicting that the full update is likely to take place over a few weeks. Although webmasters could pre-empt any negative effects by performing a link detox, it’s positive for webmasters to know that any sites penalised will no longer face a long and frustrating road to recovery.

SEO Market Updates: August 2016

Google Updates Local Pack Algorithm

Google’s local pack results algorithm had a big update last week, and various SEO forums have lit up with webmasters detailing the changes to rankings they have noticed, with some saying this update is the biggest change to local rankings in a long time.

The update is thought to largely be a refresh of the spam algorithm, with new results appearing for certain terms for the first time in years, while other pages are suspended. These findings indicate that the update is either a core local ranking change or a clean-up of spam in the local index. Either way, spam is no longer ranking as well in local results.

Google hasn’t commented on the changes.

New Mobile Penalty Arriving Early 2017

Google have announced that they will be clamping down on “intrusive” interstitials on mobile devices which impede their users’ browsing experiences. The new algorithm will launch in January 2017 with the aim of making pages running certain interstitials “not rank as highly” as previously. Google advised that the following would be affected:

  • Sites using pop ups that cover the main content, whether it’s immediately after accessing the page, or while browsing it.
  • Sites displaying an interstitial which users have to dismiss before accessing the page’s content.
  • Sites that use layouts where the above-the-fold portion of the page appears similar to a standalone interstitial, but the original content has been inlined underneath the fold.

Google has also advised that the following would not be penalised, if used responsibly:

  • Interstitials that appear due to a legal obligation, for example for cookie usage or age verification.
  • Login boxes on websites where content is not publicly accessible. For example, this would include private content such as email or unindexable content hidden behind a paywall.
  • Banners that use a reasonable amount of screen space and are easily dismissible.

Google to Drop Mobile Friendly Label

Google announced on the 23rd of August that they will be axing the ‘mobile friendly’ label from mobile search results. It’s important not to confuse this with the idea that Google are scrapping the mobile friendly ranking factor/signal – that remains as important as ever. Instead, Google is simply removing the indicator on search results which tells readers whether a site will load well on mobile, and the change is thought to be mostly cosmetic.

Additionally, Google have commented that “85% of all pages in the mobile search results now meet this criteria and show the mobile-friendly label”, indicating there is no longer a great benefit to the label.

McCain appoint Fusion for social launch

McCain Foodservice has appointed Fusion Unlimited to handle the launch of its B2B social media strategy after a two-way pitch in April. With a cohesive organic and paid strategy already managed by Fusion, McCain understood the value to be gained from a holistic search and social marketing strategy aligned to deliver their goals for the next twelve months.

We are excited to be working with such a strong brand in the food industry as McCain, and look forward to developing their social platform for the Foodservice business over the coming months.

Additionally, we will work alongside our sister agency Principles Media (part of the Principles Communications Group), who successfully retained the McCain Foodservice integrated media buying account, having worked together for the past 11 years.

SEO Market Updates: July 2016

Google confirms no PageRank loss when using 301, 302, or 30x redirects

It has long been understood that 301 redirects or other forms of 30X redirects would cause PageRank dilution on Google.

Back in February, Google’s John Mueller announced that PageRank is no longer lost for 301 or 302 redirects from HTTP to HTTPS, which was largely assumed to be an effort on Google’s part to encourage the adoption of HTTPS by webmasters. In a tweet on the 26th of July, however, Google’s Gary Illyes made the simple announcement that “30x redirects don’t lose PageRank anymore.” John Mueller later added that this tweet wasn’t intended as new information, but instead as concrete confirmation. This means that Google has now stated absolutely that 30X redirects do not lose any PageRank weight.
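For webmasters reviewing their own setup, a quick way to inspect a redirect chain and its status codes is sketched below. It’s illustrative only: the starting URL is a hypothetical placeholder and the requests library is assumed.

```python
# Small sketch: follow a redirect chain and print each hop's status code,
# useful when checking how 30x redirects are configured.
import requests

START_URL = "http://www.example.com/old-page/"  # hypothetical

response = requests.get(START_URL, allow_redirects=True, timeout=10)
for hop in response.history:
    print(f"{hop.status_code}  {hop.url}")
print(f"{response.status_code}  {response.url}  (final destination)")
```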

Download all landing pages from Google Search Console via Analytics Edge

Google Search Console, or GSC, enables users to peruse a wealth of information straight from Google. The Search Analytics report allows webmasters to view which queries their website ranks for and their relevant landing pages. Search Analytics allows users to view queries, pages, devices and countries, all by date, but is limited by the fact you can only download 1,000 URLs. The solution? Analytics Edge.

Analytics Edge is an Excel add-in which can export landing pages from Google Search Console with ease – all of them. Additionally, users can import pages in bulk. The ability to mass import and export landing pages is especially useful if your site is undergoing a redesign or migration, and looks to be a huge help for webmasters.
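As an alternative to an Excel add-in, the 1,000-row limit can also be worked around by querying the Search Console API directly and paging through results. The sketch below is a hedged illustration assuming the webmasters v3 searchanalytics.query method and its rowLimit/startRow parameters; authentication is omitted and the property URL is a hypothetical placeholder.

```python
# Hedged sketch: page through the Search Analytics API to pull landing pages
# beyond the 1,000-row UI export limit. Assumes credentials are already
# configured (e.g. application default credentials).
from googleapiclient.discovery import build

service = build("webmasters", "v3")

SITE = "https://www.example.com/"  # hypothetical GSC property
rows, start_row = [], 0
while True:
    body = {
        "startDate": "2016-07-01",
        "endDate": "2016-07-31",
        "dimensions": ["page"],
        "rowLimit": 1000,
        "startRow": start_row,
    }
    response = service.searchanalytics().query(siteUrl=SITE, body=body).execute()
    batch = response.get("rows", [])
    rows.extend(batch)
    if len(batch) < 1000:
        break
    start_row += 1000

print(f"Fetched {len(rows)} landing-page rows in total")
```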

Possible for Google Featured Snippets To Be Localised

In recent weeks, several posts across social media sites have given examples of times when Google’s featured snippets can be altered depending on the user’s location. In one such example, a user searched ‘ER waiting times’, and was presented with information on their local hospital’s wait times, rather than general information on the subject.

According to Google, this technology isn’t new, however it seems it isn’t currently rolled out to 100% of searches, with test searches proving unsuccessful. It may be the case that the user’s proximity to a nearby service is factored in, or it may only be used for certain kinds of searches. Either way, it’s an impressive and interesting development for featured snippets.

SEO Market Updates: June 2016

Google asserts TLD keywords do not influence search results

Last month, an article circulated claiming that a lawyer had changed the top level domain name for his website from .com to .attorney, and had seen a substantial increase in organic traffic. The article suggested that tailored, targeted TLD keywords could help boost traffic. However, Google has since stamped out any such rumours.

During a Google Hangout on the 14th of June, John Mueller explained that TLDs crammed with keywords do not factor into Google rankings. Mueller went so far as to state that Google’s algorithm completely ignores top level domain names. Both John Mueller and Gary Illyes from Google urged webmasters not to listen to rumour, and to stick with their current top level domains, rather than change them following “vague promises” of increased traffic.

Google to remove custom date range search filter for mobile users

Google is set to remove a further search filter, and this time around it only affects mobile users. In a forum post, Google explained that they are removing the functionality that allows mobile users to filter search results by a start and end date. Mobile users wishing to search using this filter can still do so via the desktop version of Google.

Instead of the ‘custom range’ time option, mobile Googlers can still search for articles based on pre-specified periods of time since they were posted, such as past hour, past 24 hours and so on.

Google commented:

“After much thought and consideration, Google has decided to retire the Search Custom Date Range Tool on mobile. Today we are starting to gradually unlaunch this feature for all users, as we believe we can create a better experience by focusing on more highly-utilized search features that work seamlessly across both mobile and desktop. Please note that this will still be available on desktop, and all other date restriction tools [e.g., ‘Past hour,’ ‘Past 24 hours,’ ‘Past week,’ ‘Past month,’ ‘Past year’] will remain on mobile.”

Apple brings Siri to Mac, new exposure for non-Google search engines

Apple has announced that it will be bringing its Siri technology to the Mac. While seemingly innocuous, this development means that Mac users will have access to search from the macOS operating system, and the results presented to them will not be solely from Google.

Apple demonstrated the new technology during its Worldwide Developers Conference, showing eager viewers how Siri will work when it arrives on the latest macOS, “Sierra”, later this year. Users will be able to speak their requests to Siri and will receive a selection of results not only from Google, but from competitor search engines such as Bing and Yelp. This could lead to higher usage and increased exposure for these less commonly used search engines.

Google still stands as the default search engine when opening Apple’s Safari browser, however, as the result of a long-standing deal between the two companies. When the deal expired last year, it was anticipated that Apple would consider dropping Google; however, despite the fact that no formal announcement was ever made, it looks as though the two companies will continue working together.

Google now using ‘RankBrain’ for every search query

The mysterious Google algorithm, ‘RankBrain’, has apparently been a success. Since Google first made the announcement of the vaguely defined ranking method last year, its use in searches has gone from 15% to 100%, meaning that all two trillion searches per year are handled with input from RankBrain. In short, Google is clearly incredibly confident in the ill-defined system, and is now using it to filter every query it receives.

Google has stated that RankBrain is the third-most useful criterion when handling a query and selecting the results to present. There are hundreds of factors (or ‘signals’) Google considers when weighing up a query, such as geographical location and whether a website’s headline matches the wording of the query. The fact that RankBrain is the third most important says something about the nature of the technology being developed.

It is generally assumed that RankBrain works by internally rearranging the wording of a query to allow for the best quality results. A search along the lines of “best places to go for lunch in Headingley” might be rearranged to “best Leeds restaurants”. In this way, more popular search terms are used in order for Google to extract more in-depth and extensive data for its search results.

SEO Market Updates: May 2016

Voice search reporting may be coming to Search Analytics report

Hints from Google seem to indicate that voice query reporting will feature in the Google Search Console’s Search analytic report, although the company has remained vague on when that might be.

John Mueller, Google Webmaster Trends Analyst, announced that Google is looking for ways to display voice queries to webmasters via Google Search Console, by separating out whether people searched via voice or keyboard in the Search Analytics report. According to John, Google are seeking to “make it easier to pull out what people have used to search on voice and what people are using by typing. Similar to how we have desktop and mobile set up separately.”

John went on to explain that because voice searches are usually phrased as long sentences, Google Search Analytics may not detect the search volume for the topic and may group it together with less-common keywords. He explained they are still debating internally what the best way to circumvent this issue is.

Search Analytics report gets update

The way in which Google calculates impressions and clicks in the Search Analytics report within the Google Search Console has been updated. Google posted the following on their ‘data anomalies’ page:

We refined our standards for calculating clicks and impressions. As a result, you may see a change in the click, impression, and CTR values in the Search Analytics report. A significant part of this change will affect website properties with associated mobile app properties. Specifically, it involves accounting for clicks and impressions only to the associated application property rather than to the website.

In your Search Analytics report you will see a line saying ‘update’. This is in reference to the new metrics which will come into use as of the 26th of April. John Mueller, Google Webmaster Trends Analyst, explained: “Other changes include how we count links shown in the Knowledge Panel, in various Rich Snippets, and in the local results in Search (which are now all counted as URL impressions).”

While most users noticed no change in their Search Analytics report, Google suggested that mobile users might notice the largest difference.

Google’s mobile-friendly algorithm boost rolls out

Google has released their latest algorithm, which is designed to provide a ranking boost for mobile-optimised websites in the search results.

Google’s Webmaster Trends Analyst, John Mueller, took to Twitter to make the announcement of the second version of the mobile-favouring algorithm. Google had previously hinted they would boost the algorithm back in March.

Google’s stated intentions with the update are to “increase the effect of the [mobile-friendly] ranking signal.” Additionally, the company has said any sites which are already mobile-friendly needn’t worry, and won’t be affected by the update.

The mobile algorithm is a page-by-page signal, which is the reason the update has taken some time to roll out fully, as Google has to assess each page separately. This means the impact of the update can take time to materialise.

One concerned Tweeter asked John Mueller if this update meant “mobilegeddon”. “No, not really. :)” came the reply.

Google expands featured snippets

Google has begun to use extended feature snippets for certain queries. Featured snippets are the information displayed at the top of a search, before any site results. This information is displayed when Google is able to collect information that it is confident can answer your query immediately.

Now, Google has extended this feature, with ‘related topics’ appearing below lengthened snippets. The related topics contain a brief explanation and links to other Google queries.

Fusion SEO Market Updates: April 2016

Google issues new mobile friendly warnings

A month after Google boosted the mobile-friendly algorithm, Google have changed the way in which they inform site owners if their website is not optimised for mobile users.

When a site owner searches for their own website on their mobile phone, if it’s not optimised, the result for the site will include a small notice above the meta description saying, “Your page is not mobile-friendly”. The message is a hyperlink which, when clicked, takes users through to a Google help page with more information about mobile-friendliness. For all other users searching for the website, no such message will be displayed.

Google’s John Mueller has confirmed that the feature is an experiment to see how mobile friendliness can be boosted across the internet.

Sites penalised for free product review links

In the first week of April, Google issued penalties to websites found to be hosting “unnatural outbound links”. Issued by the Google manual actions team, the penalties are aimed at websites linking to other sites with the aim of manipulating Google rankings.

Several days after issuing the penalties, it emerged that the unnatural link building in question was specifically in relation to free product reviews featured by bloggers, in exchange for links.

Following Google’s guidelines issued several weeks earlier advising bloggers to disclose free products and ‘nofollow’ their links, Google has now acted on its warning, and sent out manual actions to those sites that did not comply.

Google sent 4 million messages about search spam last year

Google has announced its latest development in its bid to clean up search results.

During 2015, Google noticed a 180% increase in websites being hacked compared to 2014, as well as an increase in the number of websites with sparse, low quality content. In order to counter this, Google unveiled their hacked spam algorithm late last year. By sending out 4.3 million manual notices to website owners and webmasters, Google were able to clean up “the vast majority” of the issues stated.

Google saw a 33% increase in the total number of sites going through the reconsideration process, which shows the importance of verifying your website in Google Search Console so that you receive alerts when Google finds issues with your site.

Additionally, Google received over 400,000 spam reports submitted by users and was able to act on a whopping 65% of them. Google also aired over 200 Hangouts to help webmasters.

Fusion SEO Market Updates: March 2016

Google reveals top 3 ranking signals

Last year Google stated that it considers RankBrain – its machine learning technology – to be the third most important search ranking factor. Whilst this information led to much speculation about what exactly RankBrain is and does, many were more concerned with another question: if RankBrain is only the third most important signal, then what are the other two?

Until last month, this wasn’t information that Google seemed ready to divulge, even after repeated questioning. However, in a Q&A with Google’s Search Quality team, the top two signals were finally revealed: links and content.

This information doesn’t exactly come as a huge surprise; Google has driven home the importance of “quality content” and linkbuilding for years.

Plus, given that RankBrain isn’t so much a signal as a system that uses signals, and that links and content are influenced by a number of factors, this is just the latest in Google’s long list of vague announcements.

Google updates search quality guidelines

Google has released another version of its search quality rater guidelines, less than 6 months after the release of the previous document.

The documents don’t appear to be too different from those released back in November 2015, with many sections remaining unchanged, and others receiving only slight tweaks.

However, a number of areas appear to have been de-emphasised. Supplementary Content, the potential negative or positive effects of which have been explored in previous documents, now receives much less attention.

On the other hand, areas such as local search – now termed “Visit-in-person” in the updated guidelines – have been emphasised and redrafted. Mobile also receives more attention, with more illustrations of high and poor quality search activity using Mobile search as an example.

Other sections have been completely cut, leading some to believe that Google no longer requires human evaluation of these factors, relying solely on algorithmic evaluation. If anything, the revision of the guidelines so soon after the previous release also illustrates the constantly evolving nature of natural search.

My Business ranking factors documented

Google has updated its help section on improving local rankings, vastly expanding on the previous document with a number of more in-depth pieces of advice.

Whilst much of this appears to be common sense – ensure your business is verified, make sure to post accurate opening hours, respond to reviews, add photos of your business – it’s good to have what Google considers important for local business search in writing in one place.

The section frequently mentions and stresses the importance of three factors when creating My Business listings: relevance, distance, and prominence. A listing has relevance if it closely matches the terms users are searching for. Distance refers to how far a listing is from the location term used in a search; e.g. are users searching for a different location than that stated in the listing? Prominence relates to how well known a business is, and takes into account existing offsite information such as reviews and articles, and how these can positively affect local rankings.

Fusion SEO Market Updates: February 2016


Google updates AdWords, stalls Search Analytics


Last month, Google made major changes to the way ads are displayed on the results page, removing text ads from the right-hand side of SERPs. Instead, a block of four ads will be displayed at the top of the search results, followed by a further block at the bottom of the page. It’s currently stated that the four-ad block will only appear for “highly commercial queries” (e.g. “new car”, “home insurance”).

This means that in many instances organic listings will be pushed below the fold, with many speculating that this could lead to a lowered click through rate for natural search results.

Although there has been speculation as to how the Google ad changes will affect paid search, as of yet there has been fairly little comment on the changes from an SEO perspective. Alongside this, as Google’s Search Analytics report is currently stalled on February 23rd – coincidentally, around the time of the update – an independent analysis of the effect the changes could have on organic click through rate cannot yet take place.

Changes made to Knowledge Box


As of February 2016, the Knowledge Graph boxes that appear in Google’s SERPs can be manually updated by an owner of the account associated with the graph.

Previously, the information in knowledge box results was largely taken from structured data and schema mark-up on relevant websites. For example, if a business wanted its phone number, location, or social profiles to be displayed within its knowledge graph, this information would need to be included within the site’s schema mark-up.

With the recent update, “official representatives” of the company, person, or website associated with the knowledge graph can now “suggest a change” to the graph. However, this does not make schema redundant, and having valid and rich structured data is still important.
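For context, the kind of structured data involved might look something like the sketch below, which uses schema.org’s Organization type serialised as JSON-LD. All values are hypothetical and it isn’t a substitute for Google’s own structured data documentation.

```python
# Minimal sketch of the structured data a business might add so its phone
# number, address and social profiles can surface alongside its knowledge
# panel, using schema.org's Organization type. Values are hypothetical.
import json

organisation_markup = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Ltd",  # hypothetical business
    "url": "https://www.example.com/",
    "telephone": "+44 113 496 0000",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "Leeds",
        "postalCode": "LS1 1AA",
        "addressCountry": "GB",
    },
    "sameAs": [
        "https://twitter.com/example",
        "https://www.facebook.com/example",
    ],
}

# The output would be embedded on the homepage inside a
# <script type="application/ld+json"> tag.
print(json.dumps(organisation_markup, indent=2))
```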

Although it’s not guaranteed that suggested changes will actually take place, the update gives a further degree of power to webmasters or account owners in ensuring that their site is visible in the SERPs.

Accelerated Mobile Pages now live


After being trialled on a mobile demo site, Google’s Accelerated Mobile Pages are now showing as live in mobile results for all users.

Initially announced by Google in October 2015, the Accelerated Mobile Pages project is designed to help boost page speed, cut load times, and result in a faster mobile search experience.

The open source initiative is currently backed by around 6,000 developers, with thousands of sites already signed up to show AMPs.

At the time of writing, Google has declined to comment on whether accelerated mobile pages are likely to receive a ranking boost. However, when asked about AMP and natural search, Google’s David Besbris reiterated that page speed and load time are both relevant ranking factors. This hints that at some point in the future, Accelerated Mobile Pages could be treated preferentially in comparison to regular pages.

Study shows outbound links could affect rankings

In January, Google’s John Mueller stated that outbound links are not used as a ranking signal.

However, the results of a recent study suggest that this might not be the case. To test Google’s claims, US based SEO agency Reboot Online conducted an experiment to find whether outbound links really do have no effect on rankings.

Reboot created 10 new web pages, each containing similar but not identical copy. Each page contained a control nonsense keyword – Phylandocic – that before the experiment resulted in 0 search results. 5 of the 10 webpages linked out to high authority sites like the University of Oxford, and 5 contained no outbound links.

Once the webpages had been crawled, a search for Phylandocic resulted in the 10 webpages being displayed in the SERPs. It was found that the top 5 webpages were those that linked out externally, whereas the bottom 5 were those that had no links whatsoever.

Whilst the conditions of the experiment were by no means natural, the results seem to indicate that linking externally could in fact have some benefit to a site, going against Google’s recent comments.

Fusion SEO Market Updates: January 2016


Google core search algorithm updated

In mid-January Google confirmed that an update to its core ranking algorithm had taken place.


The news came after a number of webmasters reported significant ranking fluctuations, leading many to believe that the impending Penguin update was to blame.

On top of this, Google stated that its Panda algorithm, which is responsible for detecting poor quality or “spammy” content, is now incorporated within its core search algorithm. However, it’s been made clear that Panda wasn’t refreshed within the recent core update.

Whilst details of the exact nature of the update are still thin on the ground, many of the ranking drops reported by webmasters occurred on sites that had thin or poor quality content; publisher websites were particularly hard hit.

With this in mind, and with the Penguin update set to drop soon, webmasters should be checking rankings regularly.

Title tags not a “primary ranking factor”

In a Google Q&A session last month, John Mueller stated that having title tags is not a “primary ranking factor” for a page.

Mueller’s initial statement was that title tags are not a “critical ranking factor”, which led many to assume that tags are not a ranking factor at all.


However, in a clarification Mueller said that “titles are important! They are important for SEO. They are used as a ranking factor”, but that specifically targeting a large number of keywords in a title tag is likely to have negligible benefit for the page. “It’s not worthwhile filling [title tags ] with keywords” as this won’t help a page rank, and can be bad for user experience.

Instead, Mueller stated that the main ranking signal on a page is the content, saying that “the actual content on the page” is a critical ranking signal, and that if a page has good content it could in theory rank without a title tag. Whilst this doesn’t mean that title tags have no importance, it’s perhaps a sign for webmasters to rethink how they’re using them.

Google updates webmaster guidelines


Last month Google carried out quiet changes to its webmaster guidelines, the best practice document that acts as a “dos and don’ts” list for webmasters.

Whereas previous updates largely consisted of clarifications or minor edits to existing points, the new changes contain entirely new guidelines, alongside the removal of certain existing ones.

One of the most significant new additions is the recommendation that sites use HTTPS, with Google saying “If possible, secure your site’s connections with HTTPS”. Whilst this has been something Google has informally pushed for a while, the update makes having HTTPS a best practice requirement.

The updates now also include optimisation for mobile as an official guideline, stating “Design your site for all device types and sizes, including desktops, tablets, and smartphones.”

Other additions relate to accessibility, and the use of content containing tabs, with Google saying that any important content that might be hidden by a tab should be made “visible by default”.

Outbound links not a ranking factor


In a recent Google hangout, Google’s John Mueller cleared up the question of whether linking out externally to high quality websites provides any benefit.

When asked if Google considers external links to other sites as a ranking factor, Mueller stated “external links to other sites, so links from your site to other people’s sites isn’t specifically a ranking factor”.

Whilst not commonly accepted as a ranking factor, external linking has been viewed by some SEOs as something that could bring a small benefit to the linking site, in part because inbound links do have a ranking benefit.

However, Mueller says that the only potential benefit of external links is that they can “bring value to your content”.

Whilst this has been stated by Google before, this is perhaps the clearest answer we’ve had on the matter so far.

Fusion SEO Market Updates: December 2015


Google starts to index HTTPS pages before HTTP


Last month Google continued its increasing focus on secure search, with the announcement that HTTPS pages will be automatically indexed in preference to HTTP pages when possible.

In a post on the webmaster central blog, Zineb Ait Bahajji of Google’s indexing team stated that Google will now “start crawling HTTPS equivalents of HTTP pages, even when the former are not linked to from any page”.

Bahajji described this as a “natural continuation” of Google’s historical preference for HTTPS on Gmail, YouTube, and its own search platforms, but also of more recent moves like giving slight ranking boosts to HTTPS sites.

Although HTTPS URLs will now be indexed “by default” over HTTP URLs when possible, this does depend on a few criteria. For example, if an HTTPS site is blocked from being crawled by robots.txt, then it won’t be indexed, nor will it be indexed if it contains a “noindex” tag or a rel="canonical" pointing to an HTTP page.

Summing up the move, Bahajji stated that “by showing users HTTPS pages in our search results, we’re hoping to decrease the risk for users to browse a website over an insecure connection and making themselves vulnerable to content injection attacks”.
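As a rough, hypothetical sketch of the eligibility criteria described above, the following Python snippet checks whether an HTTPS URL is blocked by robots.txt, carries a noindex robots meta tag, or has a canonical pointing back to HTTP. It assumes the third-party requests library is available, uses deliberately simplified regular expressions (attribute order is assumed), and example.com is a placeholder; this illustrates the checks, not how Googlebot actually works.

```python
import re
import urllib.robotparser
from urllib.parse import urlsplit

import requests


def https_indexing_checks(https_url: str) -> dict:
    """Rough illustration of the HTTPS indexing eligibility checks described above."""
    results = {}

    # 1. Is the HTTPS URL blocked by robots.txt?
    robots = urllib.robotparser.RobotFileParser()
    robots.set_url(f"https://{urlsplit(https_url).netloc}/robots.txt")
    robots.read()
    results["allowed_by_robots"] = robots.can_fetch("Googlebot", https_url)

    html = requests.get(https_url, timeout=10).text

    # 2. Does the page carry a noindex robots meta tag?
    results["has_noindex"] = bool(
        re.search(r'<meta[^>]+name=["\']robots["\'][^>]*noindex', html, re.I)
    )

    # 3. Does rel="canonical" point back to an HTTP page?
    canonical = re.search(
        r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)', html, re.I
    )
    results["canonical_points_to_http"] = bool(
        canonical and canonical.group(1).startswith("http://")
    )
    return results


# Placeholder URL - substitute a real HTTPS page to test.
print(https_indexing_checks("https://www.example.com/"))
```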

Penguin 4.0 delayed until early 2016

Although initially scheduled for release in late 2015, in December Google announced that the real-time Penguin update would be put on hold until early 2016.


Speaking to Search Engine Land, a spokesperson for Google stated that “With the holidays upon us, it looks like the penguins won’t march until next year”. Google have previously had few qualms with releasing new algorithm updates during busy or inconvenient periods, which suggests that rather than being a “Christmas gift” to webmasters, the delay is likely due to Penguin 4.0 being incomplete and unfit to release.

While it can be assumed that the update will happen sometime during January, as usual Google have been reluctant to give any exact date or time-scales. When asked about the expected date of Penguin 4.0, Google’s Webmaster Trends Analyst John Mueller responded “I’m pretty confident it’s good for January, but I really don’t want to make any promises on that”.

As ever, all webmasters can do is sit, wait, and ensure that they’re ready for the update to happen, whenever that might be.

“Accelerated Mobile Pages” set to launch in Feb 2016

The “Accelerated Mobile Pages” project, which aims to be an easily accessible way to improve page load time, is set to launch in February 2016 according to Richard Gingras, Google’s Head of News.


Gingras says that the Google backed project – which sites and publishers must first opt into – has been tipped for a launch as soon as late February. In early tests, AMP pages have been shown to load four times faster than traditional mobile pages, using around eight times less data.

A number of high profile sites like blogging platform WordPress, social media site Pinterest, and messaging app Viber have so far agreed to link to AMP content, whilst Google says that its own Analytics tool will have AMP support by late January 2016.

Perhaps the most SEO relevant piece of information Gingras revealed is that AMP pages could receive a ranking boost, and sites that contain links to AMP may be prioritised in search results. Alongside this, it’s been hinted that AMP may receive some kind of “fast” label in the search results. Whilst the project is still in its early days, SEOs should keep an eye out and weigh up the potential benefits of incorporating AMPs.

Google reveals top search terms of 2015

As in previous years, December 2015 saw Google release information on its top trending topics for the year. The data gives a good insight into what the world was talking about last year, with search information taken from Google users across the globe.


American Basketball player Lamar Odom, who made news after being hospitalised in mid October, was the most searched person and topic of the whole year. Other top searches highlight an interest in films, with Jurassic World, American Sniper, and Straight Outta Compton all featuring in the top 10 overall most searched terms.

Google also released the top 10 searches for a number of categories, including male and female celebrities, music acts, news moments, politicians, and sports people. Perhaps unsurprisingly, Jeremy Clarkson and Cilla Black come in at number 1 in the male and female celebrities categories respectively, whilst it’s again no big shock that Adele was the most searched musician of the year.

Things are a little less predictable when we look at the “What/Where/Who is..?” data, with “Who is Lucy the Australopithecus?” and “Who is Ronnie Pickering?” showing an unlikely twin global interest in prehistory and You’ve Been Framed style candid camera footage. May that continue into 2016.

Fusion SEO Market Updates: November 2015

Google changes search results location filter


The location filter in Google’s search results page has been changed, with users now being unable to manually choose which area they see search results from.

Changes to the feature were recorded throughout November, with many assuming that these were likely to just be testing on Google’s part; features are often rolled out (or removed) on a sporadic basis for short periods of time, before reverting back to normal.

However, it appears that as of late November, the filter has been dropped altogether. Google confirmed this, stating the filter “was getting very little usage, so we’re focusing on other features”.
The feature allowed users to manually see search results for a specific location, which could either be a country or a smaller town or region. However, Google is now only showing results based on what they know of a user’s precise location, meaning that users can no longer see the search results for locations they’re not physically in.

Updated search quality guidelines released


November saw Google self-release the full version of its quality rater handbook for the first time, after it was previously leaked by an independent source. This handbook contains the guidelines used by search quality raters, who are responsible for reviewing sites Google flags up for manual action.

The guide still asks raters to specifically look out for what Google calls a site’s E-A-T: expertise, authoritativeness, and trustworthiness. This basically means checking for a number of factors considered positive – authoritative non-user sourced content, knowledge graphs – alongside negative indicators like distracting advertisements, hidden text, or a general lack of purpose.

Alongside this, the new guidelines place a big emphasis on mobile factors, which essentially encompasses what we already know through mobile friendly guidelines. In fact, the guidelines state that if a site is already flagged as not mobile friendly, it will automatically receive a low quality rating. As ever, this means that webmasters need to be on the ball when it comes to mobile, and treat it just as importantly as desktop; something that Google has emphasised throughout 2015.

Google begins indexing app only content


Content only available within apps is now indexed and being shown within search results, Google has announced.

Previously, content within apps would only have been indexed and displayed if Google found a web page that corresponds to or mirrors this content. However, Google is now displaying app content that has no corresponding web page. Users that have the app installed can click through to open it, whilst those that don’t can view a stream of the app that is stored within Google’s cloud.

At present, only content from certain “partners” is being displayed, including the US National Parks app Chimani, hotel booking app Hotel Tonight, and horoscope apps Daily Horoscope and My Horoscope. However, it’s likely that this feature will be rolled out on a wider scale in future. Speaking about the changes, Google’s Engineering Manager Jennifer Lin said “Because we recognize that there’s a lot of great content that lives only in apps, starting today, we’ll be able to show some “app-first” content in Search as well.”

The new feature represents a significant marketing step for businesses with apps, who may now have the opportunity to target potential customers with relevant information only contained within in-app content.

New Penguin set to be an update, not refresh


In October Google announced that the next update of Penguin is likely to happen before the end of 2015, and is set to roll out continuously in real time.

Alongside this, Google’s Gary Illyes has confirmed that the next Penguin is set to be a true update, rather than a refresh. This means that rather than simply being a re-run of the previous Penguin update (released in 2013), the upcoming Penguin will incorporate new ranking signals. As such, sites that were not affected by the last refresh could very well be at risk of being hit this time round.

Penguin was initially released to penalise sites viewed as engaging in “spammy” tactics, targeting sites that engage in keyword stuffing and manipulative link building. Sites hit by Penguin in the past would have had to wait long periods to recover, essentially until the algorithm either refreshes or updates again. However, with Penguin being rolled out in real time, the hope is that the recovery time will be much faster.

The exact date for the next Penguin update still isn’t known, although webmasters should be making efforts to ensure they’re well prepared for a surprise roll-out, and shouldn’t expect any warning from Google when this happens.

Fusion SEO Market Updates: October 2015

Google introduces new artificial intelligence


Google has added a new signal to its search algorithm in order to help process and understand search queries.

Officially named RankBrain, the signal is a machine learning artificial intelligence which Google says it has been using to process “a very large fraction” of search queries. As around 15% of daily searches made through Google have never been searched before, the RankBrain system is designed to help Google identify, understand, and categorise these alongside similar queries.

It’s thought that RankBrain is not a standalone algorithm, but rather the newest addition to Hummingbird, Google’s core search algorithm. Greg Corrado, a senior research scientist at Google, semi-confirmed this; “RankBrain is one of the “hundreds” of signals that go into an algorithm that determines what results appear on a Google search page and where they are ranked”.

Alongside this, Corrado also stated that in the few months RankBrain has been operating, it has become the third most important signal that contributes to search results. Whilst it isn’t currently known exactly how the system will affect SEO, you can read a more in depth post on RankBrain here.

Penguin 4.0 expected to roll out during 2015

After months of hinting, Google have confirmed that the new Penguin 4.0 algorithm update is expected to roll out before the end of 2015.


As with the Panda update of recent months, Penguin 4.0 – which targets links that Google identifies as “spammy” or unnatural – is set to be implemented in real time.

This means that rather than updating sporadically, Penguin will be constantly working to detect unnatural linking practices, with penalties given out to affected sites on a real time basis. Conversely, if a site is hit by Penguin and takes steps to rectify unnatural behaviour, then in theory penalty recovery should take place equally quickly.

As always, Google seem unwilling to let slip the exact date that the update is set to take place; either that, or they don’t actually know, which seems fairly unlikely. However, so long as sites are following Google’s best practice guidelines on link building, then the update shouldn’t be anything to be too concerned about.

New report reveals important ranking factors for mobile

New research carried out by Searchmetrics has provided an insight into the importance of a range of mobile ranking factors. The research looks at the top ranked pages for mobile search results in 2015 and 2014 and desktop results in 2015, identifying trends across these.

Content factors featured heavily within the research, largely comparing the requirements for mobile content when compared to desktop. For example, it was found that the top 10 ranked pages for mobile in 2015 had an average of 868 words compared to 1285 for desktop. Mobile pages also had a lower average keyword count at 5.48, compared to 10.22 for desktop.

User experience features were also found to play a big part in mobile rankings, with high ranking mobile pages having fewer internal links, a higher prevalence of unordered lists, and a lower number of images than high ranking desktop pages.

Other important factors included a fast load speed (top pages averaged a 1.10s load time), and an avoidance of flash, with only 5% of the top ranked pages incorporating this.

Overall, the report suggests that the factors that influence search rankings in mobile are slightly different , although not entirely separate, from those that influence desktop.

Google warns webmasters not to use “sneaky” mobile redirects


Last month Google reiterated its policy on mobile redirects, stating that sites that redirect in a “sneaky” way can expect to receive penalties.

The warning is directed at webmasters who implement redirects to unrelated content on mobile landing pages. This means that when a user clicks through to the site from the SERPs, they will be directed to an unrelated page, often without knowing.

This isn’t always something implemented by webmasters, and can also indicate that a site has been hacked. Whatever the reason, these kinds of redirects are against webmaster guidelines, and the new announcement suggests that Google will be placing a bigger emphasis on identifying and penalising sites acting in this way.

This isn’t to say that any mobile redirects are being targeted, as Google’s Search Quality team stated; “Redirecting mobile users to improve their mobile experience (like redirecting mobile users from example.com/url1 to m.example.com/url1) is often beneficial to them. But redirecting mobile users sneakily to a different content is bad for user experience and is against Google’s webmaster guidelines.”
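One rough way to spot user-agent based redirects on your own pages is to fetch the same URL with a desktop and a mobile user agent and compare where each request ends up. The sketch below assumes the requests library is available; the URL and user-agent strings are shortened placeholders, not a tool Google provides.

```python
import requests

# Hypothetical example URL; user-agent strings are abbreviated for readability.
URL = "https://www.example.com/some-article"
DESKTOP_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
MOBILE_UA = "Mozilla/5.0 (iPhone; CPU iPhone OS 9_0 like Mac OS X) Mobile/13A344"


def final_url(url: str, user_agent: str) -> str:
    # requests follows redirects by default; .url is where the request ended up.
    response = requests.get(url, headers={"User-Agent": user_agent}, timeout=10)
    return response.url


desktop_destination = final_url(URL, DESKTOP_UA)
mobile_destination = final_url(URL, MOBILE_UA)

# A mobile redirect to an equivalent page (e.g. m.example.com/some-article) is fine;
# landing somewhere unrelated is the "sneaky" behaviour Google warns about.
if desktop_destination != mobile_destination:
    print("Mobile users are redirected to:", mobile_destination)
    print("Desktop users end up at:", desktop_destination)
    print("Check that the mobile destination is an equivalent page.")
else:
    print("Desktop and mobile requests resolve to the same URL:", desktop_destination)
```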

Fusion SEO Market Updates: September 2015

Google warns of harsh penalties for repeated guideline violations


Google has warned that sites found repeatedly violating webmaster guidelines, or that have a “clear intention to spam”, could face harsher manual action penalties.

Usually, if a site receives a manual penalty for violating guidelines, they need to rectify the violation and send a reconsideration request to Google in order for this to be revoked.

However, if after a positive reconsideration request a site then proceeds to further violate guidelines, the new blog post states that “further action” will be undertaken.

This “further action” will make any future reconsideration requests more difficult to carry out, less likely to be accepted, and in general reduce the chance of any manual actions being removed.

Summing this up, Google state that “In order to avoid such situations, we recommend that webmasters avoid violating our Webmaster Guidelines [in the first place], let alone [repeat this]”.

HTTPS acts as a “tiebreaker” in search results

In a recent video hangout, Google’s Webmaster Trends Analyst Gary Illyes emphasised again the slight ranking boost given to HTTPS sites, clarifying it as a “tiebreaker”.

In situations where the quality signals for two separate sites are essentially equal, if one site is on HTTP and one is on HTTPS, the HTTPS site will be given a slight boost. This reflects Google’s recent attitude towards HTTPS; whilst Google doesn’t regard encryption as essential, it is heavily recommended.

This doesn’t mean that HTTP is viewed as a negative by Google, and Illyes clarified that it’s still “perfectly fine” for a website to not be HTTPS.

However, whilst having a site on HTTPS alone isn’t enough to result in a positive SERP ranking, “if you’re in a competitive niche, then it can give you an edge from Google’s point of view”.

Google hints that structured data could be used as a ranking factor


Although structured data relevant to a specific site or niche makes a site’s SERP snippets richer, and in turn could potentially improve CTR, it’s not something currently used by Google as a ranking factor.

However, new comments from Google’s John Mueller – alongside the fact that Google now issues penalties for improper schema implementation – suggest that this could change in future. Acknowledging the usefulness that structured markup can have to users, Mueller stated that “over time, I think it is something that might go into the rankings”.

Mueller gave a brief example of how this might work, saying that in a case where a person is searching for a car, “we can say oh well, we have these pages that are marked up with structured data for a car, so probably they are pretty useful in that regard”.

However, it was emphasised that this wouldn’t be used as a sole ranking signal, and that a site would need to have “good content” as well as being technically sound in order to benefit from any potential structured data ranking factors.

Study finds increased CTR on position 2 results with rich snippets


Market research company Blue Nile Research has suggested in a new study that rich snippets could shift CTR percentage from position 1 to position 2.

The study compared responses to three scenarios; a result in position 1 with no rich snippets, a result in position 2 with rich snippets (such as stars, images, videos etc), and a result in position 2 with no rich snippets.

A comparison of clicks for each scenario found a 61% click share for the position 2 result with rich media, whereas the position 1 result with no rich media had only a 48% click share. Meanwhile, the position 2 result with no rich snippets had the lowest click share at 35%.

The study looked at the search habits of 300 people in a lab environment, and as such doesn’t necessarily give the most accurate representation of natural user activity. However, it does suggest that structured markup and rich snippets have a valid part to play when considering how to boost click through rate.

Google says linking externally has no SEO benefit


Although it’s common knowledge that gaining links from good quality sites can have a positive SEO benefit, the effect of linking out externally hasn’t always been as clear cut.

It’s often been thought that whilst not comparable to earning links, linking to external sites could provide a marginal search benefit. Although never explicitly confirmed, this belief has been reinforced by Google; in 2009, Matt Cutts stated that “in the same way that Google trusts sites less when they link to spammy sites or bad neighbourhoods, parts of our system encourage links to good sites”.

However, new comments have suggested that this isn’t the case. When asked “is there a benefit of referencing external useful sites within your content?”, Google’s John Mueller clarified that “It is not something that we would say that there is any SEO advantage of”, but that “if you think this is a link that helps users understand your site better, then maybe that makes sense.”

So, although linking externally appears to have no direct SEO benefit, it should still be considered as a valuable part of creating a user friendly site architecture.

Fusion SEO Market Updates: August 2015


Panda 4.2 update still in progress

In July 2015, Google announced a “slow-rollout” of Panda, saying that the latest update would be continuous and occurring over a larger space of time than previously. For this reason, it was suggested that websites might not notice ranking increases or decreases immediately.


After a month of Panda 4.2, reports from webmasters have been mixed, with many sites as yet seeing little to no impact.

Other webmasters have noticed short term ranking increases or decreases occurring over 1-2 week periods, only for a site to return to its pre-Panda standing. This has led some to speculate as to whether the 4.2 update has been “reversed”.

However, it’s more likely that these ups and downs are simply due to the slower nature of the 4.2 refresh; in fact, Google warned that this rollout may result in ranking fluctuations. As such, webmasters shouldn’t accept any ranking changes attributed to Panda 4.2 as permanent, and should anticipate subsequent fluctuations as the refresh continues.

Going “mobile only” is fine

Websites operating with only a mobile version, and no desktop equivalent, will not see adverse ranking effects, says Google’s Webmaster Trends Analyst John Mueller.

This statement follows the mobile friendly update rolled out earlier in the year, after which not having a mobile friendly site could have a negative effect on a site’s search rankings.

Mueller states that “you definitely do not need a specific desktop website in addition to a mobile website”, so long as you “make sure that desktop users can still see some [of your site’s] content”.

This means that so long as sites optimised for mobiles and tablets are still usable on a desktop device, it isn’t a necessity for a separate desktop site to be created. However, a mobile site must still be properly optimised in line with Google’s guidelines in order to rank well within mobile results.

Google clarifies position on soft 404 response codes


It’s common knowledge that pages returning a 404 error code are not indexed by Google. In fact, Google even recommends returning a 404 for pages that have “bad links” pointing to them, if those links can’t be removed.

However, it’s not widely known how Google treats so called “soft 404s”; pages that should be returning a 404 code, but actually return a 200 “ok” status code.

Recently, Google’s Gary Illyes and John Mueller both gave similar responses when asked about soft 404s. Illyes said that soft 404 responses are treated like 404s, and thus pages where they occur are not indexed.

However, Mueller expanded on this a little, stating that whilst soft 404s aren’t indexed (and thus any links pointing to them have no influence on a site’s ranking), Google first needs to work out that a page is a soft 404; something that Mueller states can be “difficult”. As such, before a soft 404 page is identified, it will be indexed, potentially carrying equity – positive or negative – from links pointing to it. Once identified, as Mueller states that Google only indexes pages returning a 200 response, the soft 404 page will no longer be indexed.
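A quick, informal way to check whether a site serves soft 404s is to request a URL that cannot exist and look at the status code it returns. A minimal sketch, assuming the requests library and a placeholder domain:

```python
import secrets

import requests

# Hypothetical domain; the path is deliberately random so it should not exist.
site = "https://www.example.com"
missing_url = f"{site}/{secrets.token_hex(8)}-this-page-should-not-exist"

response = requests.get(missing_url, timeout=10, allow_redirects=True)

if response.status_code == 404:
    print("Missing pages return a proper 404 - nothing to do.")
elif response.status_code == 200:
    # A 200 "ok" response for a page that can't exist is the classic soft 404 pattern.
    print("Possible soft 404: the server returned 200 for a non-existent page.")
else:
    print(f"Missing pages return HTTP {response.status_code} - worth reviewing.")
```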

Moz releases 2015 search ranking factors study

SEO software company Moz has released its annual search study, based on a survey of “over 150 leading search marketers” giving their “expert opinions on over 90 ranking factors”. Factors were rated on a scale of 1-10, with 10 being most influential and 1 being least.


The study shows that links remain a strong perceived ranking factor, with link features rated both 1st and 2nd highest by those surveyed. Keyword related factors were also rated as strong, coming in at 3rd, 4th, 7th and 8th, whilst engagement was rated as the 5th most important ranking factor.

Although the results are taken from a relatively small pool, they do serve to reinforce the importance that basic SEO factors can have on a site’s ranking. Again, just because a factor is rated lower (e.g. social) doesn’t mean its influence is negligible; good rankings come from a range of these factors combined, rather than time invested solely in one area.

Google’s business model restructured


On August 10th 2015, Google CEO Larry Page announced the formation of “Alphabet”, a public holding company for Google and its subsidiary businesses. Whilst Google will still be the largest business under the Alphabet umbrella, the restructure will result in a “slimmed down” and more streamlined Google. Following the restructure, Page will become CEO of Alphabet, with Google’s product chief Sundar Pichai taking his place.

Alongside the creation of Alphabet, Google has received a brand update. The company has revealed a new logo and logo icons, and is slowly revealing updated search results pages. So far, there has been a large focus on mobile usability, mainly on Google’s own Android devices.

It’s unlikely that the restructure or re-branding process will have any direct SEO implications. However, the introduction of new usability features suggests that further prerequisites for mobile usability or schema mark-up could be implemented in future.

Image sources:

https://en.wikipedia.org/wiki/Alphabet_Inc.#/media/File:Alphabet_Chart-vector.svg

https://moz.com/search-ranking-factors

Fusion SEO Market Updates: July 2015


Google confirms Panda 4.2 now rolling out slowly

After months of speculation, Google have stated that as of 22nd July 2015, the Panda 4.2 update is now rolling out. Panda last updated around 10 months ago in September 2014, making this the longest gap between updates so far.


The rollout means that sites penalised by the last update – which affects sites with “poor quality” content – should in theory be able to recover, providing they’ve taken steps to fall in line with Google’s recommendations.
However, unlike previous updates, webmasters are unlikely to notice these changes immediately. That’s because Panda 4.2 is rolling out at a much slower rate than usual, meaning that any changes to rankings are likely to take place over a much longer period of time.

Speaking about the update, Google’s Webmaster Trends Analyst John Mueller stated that Panda 4.2 is updating at a slower rate than normal due to “technical reasons” and an “internal issue”. With this in mind, it could take months for webmasters to see any positive or negative influence.

Webmasters warned for blocking JavaScript & CSS


Towards the end of July mass notifications were issued to webmasters through Google Search Console (formerly Google Webmaster Tools), highlighting sites that appeared to be blocking CSS and JavaScript.

This follows Google’s October 2014 webmaster guidelines update, which warned that blocking CSS and JavaScript could result in “suboptimal rankings”. This is something the notification specifically states, before offering information on how to fix the issue.

Whilst recommendations not to block JS and CSS may have been in place for a while, this is the first time that webmasters have been nudged en masse towards rectifying this.

However, webmasters in receipt of this notification shouldn’t worry, as it appears to be a widespread and often general warning. If you did receive this notification, the best course of action is to simply follow the steps within.
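If you want to verify the fix yourself, Python’s standard urllib.robotparser module can check whether a given asset URL is allowed for Googlebot under your robots.txt. A small sketch with placeholder site and asset URLs:

```python
import urllib.robotparser

# Hypothetical site and asset URLs - substitute your own.
robots = urllib.robotparser.RobotFileParser()
robots.set_url("https://www.example.com/robots.txt")
robots.read()

assets = [
    "https://www.example.com/assets/css/main.css",
    "https://www.example.com/assets/js/app.js",
]

for asset in assets:
    if robots.can_fetch("Googlebot", asset):
        print(f"OK: Googlebot can fetch {asset}")
    else:
        print(f"Blocked: {asset} is disallowed for Googlebot in robots.txt")
```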

Google says all generic Top Level Domains treated the same


Back in 2014, the rules around generic Top Level Domains ( e.g. .com, .org) changed, essentially allowing for a whole new and unrestricted range to be created.

These changes brought on much speculation about how new gTLDs would be treated by Google, with many assuming that certain domains would receive preferential treatment. This was especially the case for geo-specific gTLDs (e.g. .london), which it was commonly assumed would rank higher in their respective locations.

However, in a recent Webmaster Central blog post Google’s John Mueller cleared up some misconceptions. As it turns out, the new gTLDs are handled in the same way as the old gTLDs, with Mueller stating that “our systems treat new gTLDs like other gTLDs…. keywords in a TLD do not give any advantage or disadvantage in search”. This is the same for geo-specific TLDs (although “there may be exceptions down the line”). However, Google does use country code top-level domains (like .uk) to geo-target websites, but this is something that was already known.

Google summed up the rules around gTLDs as “if you spot a domain name on a new TLD that you really like, you’re keen on using it for longer, and understand there’s no magical SEO bonus, then go for it”.

Google clarifies position on asking for links

During July, a small post on the Portuguese Google webmaster blog attracted the attention of many in the SEO community, after it said that asking for links could result in a penalty from Google.

The translated post , with original emphasis, reads “let some advice to [sic] ensure you that you are not violating Google’s guidelines: do not buy, sell, exchange or ask for links”.

Linkbuilding has long been a standard procedure for SEOs, and as such the implication that this practice inherently falls outside of Google guidelines was news to many.

However, Google later altered the post to read “do not buy, sell, exchange or ask for links that may violate our linking webmaster guidelines”. So, asking for links and linkbuilding is not a violation of guidelines, so long as this is done in a manner Google approves of.

Fusion SEO Market Updates: June 2015


Google says to expect Panda update soon


At the start of the month, Google’s Webmaster Trends Analyst Gary Illyes warned to expect a Panda update in the coming weeks. In typical Google fashion, Illyes was relatively vague in giving the exact timeframe of the expected update, being only as specific as “two to four weeks”.

Rather than an algorithm update, the most recent rollout is stated to just be a “data refresh”, meaning that sites hit by previous updates could potentially recover. However, this doesn’t mean that the refresh won’t have an impact on previously unaffected sites, and as always there is a risk of sites with poor quality content being hit.

At the time of writing – around 4 weeks since the announcement – signs of any update to the content quality algorithm have yet to be noticed, meaning that the update could come at any time.

Google updates core search algorithm

In mid-June, webmasters reported seeing significant ranking changes, something of course first blamed on the expected Panda update. However, clarification from Google revealed that any ranking changes were most likely to be due to an update to the Core Search Algorithm, as the Panda updates had yet to take place.

Whilst Google regularly makes updates to its core search algorithms, it’s rare for these changes to have such a large effect on rankings. With this in mind, many have searched for other explanations for the ranking changes. One frequently cited influencing factor could be that Wikipedia – a number 1 search result for many queries – decided to change all its URLs from HTTP to HTTPS. This could have affected the top 5 rankings for many searches, thus causing such a fluctuation in rankings across the board.

As always Google have been tight lipped, meaning that any explanations can only really be speculation.

Penalties now issued for improper schema implementation


Penalties issued as a result of May’s “Quality Update” have led to a reading between the lines of Google’s policies on structured data. In the aftermath of the update, some webmasters were given penalties relating to site schema; data used in order to show rich snippets, which can improve organic search visibility.

In March 2015, Google updated its policies on rich snippet markup, stating that this may only be placed on specific items and not whole pages or categories. However, either due to lack of awareness of the updated policy or misunderstanding of how to correctly implement markup, many sites were hit with warnings and penalties.

As such, to avoid penalties, it’s recommended that webmasters become au-fait with Google’s policies before implementing markup. Google also has a Structured Data Testing Tool, allowing developers to check whether markup is correctly implemented before making any real changes.

Google tests “slow to load” mobile results label

[Image: “slow to load” label shown in mobile search results]

This month, some mobile users have reported seeing “slow to load” labels in the mobile search results page. The labels – as seen in the right example – are designed to indicate to users that certain pages may take longer than average to load, or not load at all.

At the moment, the “slow to load” label is in testing, and as such not all users will be able to see them. A similar label was placed into testing back in February, indicating that some form of labelling for mobile devices is likely to be introduced fully at some stage.

However, there has been much speculation as to what exactly Google define as a “slow to load” page, and how this is determined. It isn’t known whether the labelling depends solely on the site or page itself, or whether the speed of an individual’s device or connection is taken into account. As such, some have expressed concerns that the labelling in its current form is arbitrary, giving little indication to webmasters on how to act to prevent a page or site being labelled.

Google reports spike in “near me” searches


In the past year, searches with localised qualifiers have rapidly increased, Google recently reported.

In a post on the Inside AdWords blog, Google stated that queries with “nearby” and “near me” qualifiers doubled, with around 80% of these searches coming from mobile. Google cited “heightened expectations for immediacy and relevance” for the increase, with a reported 4 out of 5 people stating they’d prefer search ads to be less generic, and specifically tailored to their city, post code, or immediate surroundings.

The information was released alongside details of a new ad format, specifically targeting “near me” searches. Google announced that from late May users searching in a “[business] near me” or “nearby [business]” format will be shown 3 or 4 different local business ads. Rather than simply containing copy, these ads will show buttons that allow users to find the location of, or directly call, the business, as seen below.

[Image: example of “near me” local business ads with location and call buttons]

Directions and a call button have previously only been available on organic local business listings, with ads having only a call button; users would have had to click through to find out location details.

This comes off the back of November 2014’s location extensions update, which meant that users could potentially be shown 3 or 4 ads for different locations of the same business. However, with these latest changes, Google appear to be levelling the playing field somewhat, allowing for more businesses to achieve top of the page and above the fold ad space.

To view the statistics in full, and read more about the update, head over to the Inside AdWords blog.

Fusion SEO Market Updates: May 2015


Google shakes up search rankings with “quality update”


At the start of May, webmasters reported seeing both positive and negative ranking changes across multiple sites, leading many to assume a possible Google Panda update had taken place. After initial denials of any changes, Google eventually confirmed that an update had taken place at the beginning of the month – but not to Panda.

Google’s John Mueller described the update as “essentially just a normal algorithm update” taking place to “increase the relevance and the quality of the search results”, and advised webmasters of affected sites to “work on your web site in general”.

The lack of specificity regarding the purpose or intent of the changes has led many to dub it the “quality update”, and at the time of writing the reasons for sites being affected aren’t yet known. However, Mueller recommends that webmasters keep “focusing on your site and make it the best it can possibly be” to avoid ranking drops from similar updates in future.

Google clarify how Panda and Penguin algorithms operate

Recently, the operational nature of Google’s Panda and Penguin updates has caused much confusion. Google’s contradictory statements have often been the driving force behind this uncertainty, with both algorithms variously described as operating manually and in real time.

However, some clarification was reached in May, with Google confirming that the seemingly oppositional statements they’ve previously made are both true; Panda and Penguin operate both manually and in real-time, simultaneously.

Google employee Mariya Moeva stated that “Panda and Penguin are built-in in the real-time infrastructure, but the data has to be updated separately”. That explains why ranking changes can appear to be sudden, as whilst the algorithm is constantly running, the data that affects search rankings needs to be manually updated or refreshed for a change to take place based on this.

Webmaster tools rebranded as “Google Search Console”, new features added

As part of a wider “inclusive” rebranding process, Google have renamed Google Webmaster Tools as “Google Search Console”. Citing that Webmaster Tools is “not just for webmasters”, the name Search Console appeals to the toolset’s apparent wider user base of “hobbyists, small business owners, SEO experts, marketers, programmers, designers, app developers”.

Alongside the rebranding, Google have added two new features into the tool, both based around app indexing. Search Analytics now allows webmasters to see top queries, pages and country specific data specifically for apps. Also added is an Alpha version of Fetch as Google for apps, which allows app developers to see the results of Googlebot attempting to fetch and index the apps.

Google search results page now shows real-time tweets

[Image: real-time tweets shown in a carousel at the top of mobile search results]

As a result of the partnership between Google and Twitter announced back in March, Google now displays real time twitter results in the mobile search results page. Relevant tweets relating to a search term are shown in a scrollable “carousel” format, appearing either at the top of the page – as seen in the example to the right – or sometimes lower down the page.

Google have stated that the changes represent “another way for organisations and people on twitter to reach a global audience at the most relevant moments”. At the time of writing real time tweets are only displayed in search results on mobile devices, although a desktop roll-out is expected to take place soon.

Google Maps fix causes local business ranking changes

A fix made by Google to prevent offensive search terms leading to locations on Google Maps appears to have boosted the search rankings of some local results. Google acknowledged and apologised for the offensive results after the problem – which caused racial slurs to lead to the White House – was brought to wider media attention, and stated they would make algorithm changes to fix the issue.

The exact cause of the problem is not yet known, although there have been a few suggestions. One of these is a Googlebomb, where users make deliberate steps to manipulate results by attempting to make webpages, or in this case locations, rank highly for irrelevant terms . Another is Google’s local search Pigeon algorithm, which looks for references across the web to influence how local results rank.

Whatever the cause, many webmasters reported significant changes in local results traffic after Google said they’d resolved the issue, leading some to presume that the algorithm changes were responsible. Google have neither confirmed nor denied these suspicions.

Image Source: http://searchengineland.com/google-twitter-deal-live-221148

Fusion SEO Market Updates: April 2015


Google finally rolls out mobile friendly update

On the 21st of April Google finally began to roll out its much anticipated mobile friendly update. Announced early on in the year, the exact nature and effect of the update has been heavily speculated about within the SEO community, with a reported 4.7% of webmasters making changes to ensure that sites fit within Google’s requested parameters.

However, at the time of writing the update has had a far smaller impact than previously anticipated. As of the 1st of May, Google have said that the algorithm has fully rolled out in all of its data centres. However, the majority of webmasters have reported no big changes in mobile search results rankings, and those who’ve been tracking the update have seen no significant impact, as seen in the below graph from Moz.

[Image: Moz graph of mobile ranking fluctuations, April 2015]

Google’s Gary Illyes stated that as many sites have not been re-indexed, they aren’t as of yet being affected by the new scores. This means it’s still possible for “unaffected” sites to be hit, and it’s still recommended that sites that are not yet mobile friendly be made so.

Google tests lightweight mobile results for slow connections

Google have continued their recent focus on mobile search results optimisation with the test of a “lightweight” display for mobiles with slow connections. Initially announced as simply affecting mobile SERPs, Google have now given webmasters the option to show a “toned down” version of their site to users on a slow connection. Whilst the lightweight version of the search results page is automatic, the option to strip out heavy images and files on a site will be down to webmasters to decide.

However, when tested on users in Indonesia, Google reported that sites that had opted in to lightweight display had a 4x faster load time, used 80% fewer bytes, and saw a 50% increase in mobile traffic – something surely likely to influence whether webmasters opt in.

Search Queries report being randomly replaced by Search Analytics in Webmaster Tools

At the beginning of the year, Google tested a new “Search Impact” report amongst a few select users, now renamed as “Search Analytics”. As well as the standard Search Impact features, the new report displays clicks, impressions, CTR and average search results position. On top of this, Search Analytics also allows for a comparison of these factors, broken down by specific queries, pages, devices, and country.

Google’s Webmaster Trends Analyst Zineb Ait Bahajji also commented that the report is “slow to catch up” at the moment, having only 90 days of data. However, this is expected to increase shortly. Whilst at the moment Search Analytics is only available to a random selection of users, it’s expected that at some point it will receive a full rollout and replace the Search Queries report.

Google begins replacing URL search result snippet with breadcrumb pathway

After a long period of testing, Google has finally started to replace site URL in the search results snippet with a site name and breadcrumb pathway. This update comes after years of beta testing and randomly selected rollouts, and is designed to “better reflect the names of websites”, Google has stated.

With this update, webmasters will be given the opportunity to better reflect site structure and content to users, and display a “real world” version of the site rather than a domain name. At the time of writing, this update has only affected mobile results in the U.S, but is expected to have a worldwide rollout in the near future.

In order to make sure these changes take place, webmasters will have to implement specific site name and breadcrumb schema within a site’s source code.
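As a hypothetical illustration of what that markup might look like, the sketch below builds schema.org WebSite and BreadcrumbList blocks in Python and prints them as JSON-LD; the site name and breadcrumb trail are invented placeholders.

```python
import json

# Hypothetical site name and breadcrumb trail for one page.
website = {
    "@context": "https://schema.org",
    "@type": "WebSite",
    "name": "Example Widgets",
    "url": "https://www.example.com/",
}

breadcrumbs = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {"@type": "ListItem", "position": 1, "name": "Home",
         "item": "https://www.example.com/"},
        {"@type": "ListItem", "position": 2, "name": "Widgets",
         "item": "https://www.example.com/widgets/"},
        {"@type": "ListItem", "position": 3, "name": "Blue Widget",
         "item": "https://www.example.com/widgets/blue-widget/"},
    ],
}

# Both blocks would be embedded as JSON-LD in the page's source code.
for block in (website, breadcrumbs):
    print(json.dumps(block, indent=2))
```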

Image Source: http://searchengineland.com/googles-mobile-friendly-algorithm-a-week-later-was-it-really-mobilegeddon-219893

Fusion SEO Market Updates: March 2015


Google limits crawling of sites with response-times over 2 seconds

Whilst it’s well known that having a site with a slow server response and load time can have an effect on your search results rankings, exactly what Google classes as a “slow site” has been up for debate. However, in a recent Webmaster Help thread, John Mueller stated that if Googlebot takes “over 2 seconds to fetch a single URL”, this will affect how your site is crawled. If Google views a site as slow, it will limit the number of URLs crawled on your site, affecting how well your site ranks.
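A rough way to sanity-check your own pages against that figure is simply to time a fetch. The sketch below assumes the requests library and a placeholder URL; it measures a single request, which is only a loose proxy for what Googlebot experiences.

```python
import time

import requests

# Hypothetical URL - substitute a page from your own site.
url = "https://www.example.com/some-page"

start = time.monotonic()
response = requests.get(url, timeout=10)
elapsed = time.monotonic() - start

print(f"Fetched {url} ({response.status_code}) in {elapsed:.2f}s")
if elapsed > 2:
    # Over the ~2 second mark Mueller mentioned - crawling may be limited.
    print("Response took longer than 2 seconds - worth investigating server speed.")
```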

Google give more details on upcoming mobile-friendly changes

Ahead of its release on the 21st of April, Google have clarified a number of points regarding the mobile-friendly algorithm. The roll-out is set to run over the course of a week on a real-time, page-by-page basis. Real time means that a site may benefit from any mobile-friendly changes made as soon as Google picks up on these, and “page by page” means that only pages on a site that are mobile friendly will benefit, rather than the whole site. Again, Google stated the algorithm will run on a binary “yes/no” basis, meaning there are no in-betweens; Google classifies a page either as mobile friendly, or not. Google have also released details as to the scale of the algorithm, which is set to have a wider effect than both Penguin and Panda. Although set to only impact search rankings on mobile devices, it’s becoming increasingly apparent that ignoring Google’s mobile recommendations could result in dire consequences in the coming weeks.

Google set to penalise doorway pages

Sites that attempt to maximise their search results appearance with “doorway pages” are set to be hit by a new ranking adjustment, Google have announced. Doorway pages are pages specifically created to rank highly for certain search results, often containing little in depth or useful information and simply acting as a “doorway” to a site. As such, Google views doorway pages as leading to a bad user experience, and with these ranking adjustment updates, no longer wants to rank them. If your site currently has pages that could be classified as doorway pages, it’s likely you may see a ranking drop in the near future.

More than 80% of HTTPS URLs are not displayed in Google SERP’s

A recent webmaster trends analysis discovered that over 80% of HTTPS URLs are not currently being displayed in Google’s search results, instead appearing as HTTP. This is something Google puts down to webmaster configuration, with many webmasters not using HTTPS versions in sitemaps, rel-canonical, and rel-alternate-hreflang elements. This means that although a site is still indexed, it appears as HTTP. Google have previously suggested that they’d prefer sites to use the more secure HTTPS, always displaying this variant if possible and even affording a small ranking boost to sites that use this. Although the benefits might not be immediately visible, it’s worthwhile for webmasters to use and make visible HTTPS on eligible sites.
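One simple self-check is to scan your XML sitemap for any location entries that still use http://. A minimal sketch, assuming the requests library and a placeholder sitemap URL:

```python
import re

import requests

# Hypothetical sitemap location - substitute your own.
sitemap_url = "https://www.example.com/sitemap.xml"
xml = requests.get(sitemap_url, timeout=10).text

# Pull every <loc> entry out of the sitemap and flag any that still use http://.
locations = re.findall(r"<loc>(.*?)</loc>", xml)
http_urls = [loc for loc in locations if loc.startswith("http://")]

print(f"{len(locations)} URLs in sitemap, {len(http_urls)} still on HTTP")
for loc in http_urls:
    print("Consider switching to HTTPS:", loc)
```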

Fact Based Search Ranking: Is Google Smarter than You?


The average person today will digest more information than at any other point in history. Through the internet, music, TV and plain old fashioned print media, they’ll encounter around 100,000 words. Or about 2.5 novels. In total, they’ll process the equivalent of 34 gigabytes of information every day; 5 times more than 30 years ago.

These figures could give the impression that society in 2015 is more educated. With Google, Siri, and blogs like this just a few clicks away, we can encounter a wealth of information, learning whatever we feel like, whenever we feel like. Want to know tomorrow’s weather? Who was King of France in 1390? How tall Noel Edmonds is? There’s nothing stopping you.

However, have you ever thought that a lot of the information you encounter, process and learn might be wrong? Google has, and they’re wanting to rectify this.

For just under a year, Google has been developing their Knowledge Vault, a huge store of information taken from all across human history. Knowledge Vault is an ever expanding database that autonomously collects data from around the internet. This information is then cross referenced with similar or corresponding information in order to sift facts from falsities.

Google’s existing Knowledge Graph works in a similar way, albeit on a smaller scale. However, rather than compiling information from the whole of the internet, the Knowledge Graph uses “trusted” sources like Wikipedia and Freebase to offer a collection of related and relevant information on a given search term. For example, if I search “Noel Edmonds”, Knowledge Graph provides a collection of useful and unimaginably interesting facts on the man himself, as visible below.

[Image: Knowledge Graph panel for Noel Edmonds]

Very recently, a Google research team published a research paper announcing aspirations to implement Knowledge Vault as a search ranking factor. This means that rather than a collection of information simply being shown to users alongside search results – as with Knowledge Graph – the Vault would control all the information on the search results page. Sites that contain information Google considers true would be ranked highly, and sites that contain dubious information would be penalised.

Whilst this is a suggestion still only in its formative period, it’s one that would entirely alter the way Google search works.

At the moment, sites are ranked according to a number of factors, one of these being links. The more links a site has from trustworthy sources, the more trustworthy that site is considered. This is a largely human process; when you link to a site, you’re showing a vote of confidence.

However, a ranking based on the Knowledge Vault would take away this human influence. As the Vault is an autonomous system, it and it alone decides what separates fact from fiction, and what makes a site trustworthy.

Current ranking factors like links are far from perfect; something testified by algorithms like Penguin designed to halt manipulative link-building. However, possibilities for manipulating the Knowledge Vault in theory still exist. If the Vault is simply collecting together information it views as similar, and deciding truthfulness based on this, then what’s to stop webmasters from sprinkling their “facts” across the web in an attempt to manipulate higher rankings? Plus, what about dubious information that large numbers of people on the web consider to be true? Does this mean that moon landing conspiracy theories and folk health remedies should be considered facts, and afforded a high ranking? What about “facts” that are opinion based? Should the statement “Noel Edmond’s best days are behind him” be deemed any more truthful than “Noel Edmonds has a long and fruitful future ahead in show business”?

Perhaps more importantly, the implementation of a Knowledge Vault based ranking system is a step towards Google controlling a large flow of information. Whereas with the current ranking system, if a piece of dubious information is encountered, this can be argued against; a healthy discussion can be formed. However, with the implementation of this algorithm, there will be no need for discussion; just a nod of the head as Google pumps out a stream of complete, inarguable “facts ”. With this move, Google could be taking the power to invest confidence in information and sites away from users; something surely more important than encountering the odd “spider eats dog” article.

With this being said, and as Google haven’t imposed any real plans for implementation, at this point we can only speculate how a knowledge based search rankings system would work. It may be that Google could simply decide to implement a fact ranking alongside existing systems – perhaps displayed within a search snippet – something which at the time of writing seems a safer and more feasible option. In the unlikely eventuality that a full overhaul does take place, users may even become savvier, and more clued up to whether they’re being shown sketchy information. In any case, it’s not as if Google has never made big changes to the way search works before, and we’ll look forward to watching and adapting to whatever plays out.

Fusion SEO Market Updates: February 2015


Google to start favouring mobile friendly sites in search results page

Google has revealed a significant expansion of the effect “mobile-friendliness” gives sites within the search engine results page. Announced in a Webmaster Central blog, the changes are set to take place from April 21st. Sites that are deemed mobile-friendly will automatically be ranked higher in the device search results page than sites with low mobile usability. The algorithm changes are likely to significantly affect the mobile search results rankings, meaning that sites currently viewed as “unfriendly” should attempt to make changes before the algorithm comes into action in April. Read our in depth analysis of the changes here.

Google labels slow loading pages in SERPs

Users have recently reported spotting red "slow" labels in the mobile search results snippets of slow-loading sites. These labels warn users, before they click through, that the page may load slowly. Although Google has taken page loading speed into account as a ranking factor since 2010, labelling pages for speed is something not previously seen. However, based on the testing and introduction of the mobile-friendly label last year, it's possible this feature may turn out to be more than just an experiment.

Google tests new look mobile search results interface

Google may have just rolled out a new-look mobile search results interface. Owners of iOS and Android devices have reported seeing a coloured line separator in the search results rather than the typical grey line, as seen in the two examples below. The exact reason for the alteration is not yet known, and for some this might not seem like a huge change. However, it's likely that this is a feature designed to improve mobile usability, reflecting the increasing importance Google has placed on this area in the past few months.

Google tests live chat functionality in knowledge graph results

Google have recently tested a "live chat" tool within the search results of local businesses. Displayed within the knowledge graph local box, the feature shows whether someone from the business is available to chat. When the feature is clicked, a Google Chat/Google Hangouts page opens, allowing users and potential customers to chat with an employee. Some have expressed concern that the feature may reduce click-through rates (CTR) and thus have a negative impact on SEO; a potential customer may not need to click through to the business's site at all if their query has been answered offsite through the live chat function. However, as this option is currently only in testing mode, these concerns are only speculative.

 

Fusion SEO Market Updates: January 2015


Google starts sending “mobile-usability” warnings to webmasters

Following on from last year's increased emphasis on mobile usability, Google has reportedly begun sending out warnings to webmasters of "mobile-unfriendly" websites. The warnings, sent out en masse via Webmaster Tools and email, urge webmasters to fix mobile usability issues on the affected sites. Specific problems and affected pages are not listed within the warning message itself; webmasters must download a detailed report to see these. This is yet another move from Google to increase the mobile usability of sites and, although not explicitly stated, a suggestion that an algorithm change may be in the pipeline.

Google can now crawl and index locale-adaptive webpages

Websites that automatically change their content depending on the location of visitors can now be crawled by Google, according to an announcement made last month. In a post on the Webmaster Central blog, it was stated that sites that change their language depending on visitor location or language settings will now be crawled and indexed, something Google has previously found difficult; in the past, Googlebot would only see the US English version of locale-adaptive webpages. However, Google still recommends that webmasters who want to signal that their site is locale-adaptive continue to use the suggested rel="alternate" hreflang annotations, to help Googlebot recognise and map the different versions.
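For reference, a minimal sketch of what these annotations look like, using hypothetical example.com URLs (the real values would be each site's own locale-specific pages), placed in the `<head>` of every version:

```html
<!-- Hypothetical English (UK) and French versions of the same page -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/en-gb/" />
<link rel="alternate" hreflang="fr-fr" href="https://www.example.com/fr-fr/" />
<!-- Fallback for visitors whose locale doesn't match any listed version -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```

Each locale version should carry the same set of annotations, including one referencing itself, so that Googlebot can map the alternative versions to each other.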

Mobile sites blocking Google now visible in search results

Google has announced that sites blocking Google's crawlers will be flagged within the mobile search results page. Users will be able to tell a site is attempting to block Google from the information in the search results snippet, which will specify why the text cannot be displayed (as seen in the example below). This has been a feature of desktop searches since 2012, but will now apply to all uncrawlable mobile sites, even if the desktop version is crawlable. Sites blocking Google's mobile crawler from accessing the JavaScript, CSS, or image files it needs to assess mobile usability will also be flagged, representing another push for webmasters to make their sites mobile friendly.
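In practice, this kind of blocking usually comes down to robots.txt rules that disallow crawling of asset directories. As a rough, hypothetical illustration (the directory names are made up), rules like the following would stop Googlebot fetching the resources it needs to render and assess a mobile page:

```
# Hypothetical robots.txt – blocking these paths hides CSS, JS and images from crawlers
User-agent: *
Disallow: /css/
Disallow: /js/
Disallow: /images/
```

Removing rules like these (or explicitly allowing those paths) lets Googlebot render pages as a mobile user would actually see them.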

Social profiles for brands now visible in Google’s knowledge graph results

Google has started to display social profiles other than Google+ within the knowledge graph results of certain brands. Although a link to Google+ has previously been displayed for brands, the knowledge graph now displays icons for Facebook, Twitter, LinkedIn, YouTube, Instagram and Myspace. This feature has previously been available, but only for "personalities" and celebrities. Brands and companies wishing to have their social profiles visible in the knowledge graph will need to apply new mark-up to their sites.
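Google's documentation points webmasters towards schema.org structured data for this. As a minimal sketch, assuming the JSON-LD flavour and entirely hypothetical profile URLs, the mark-up might look something like:

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Organization",
  "name": "Example Brand",
  "url": "https://www.example.com",
  "sameAs": [
    "https://www.facebook.com/examplebrand",
    "https://twitter.com/examplebrand",
    "https://www.youtube.com/examplebrand"
  ]
}
</script>
```

The sameAs array simply lists the official profile URLs Google should associate with the brand; the same approach applies to personal profiles using the Person type.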

Image Sources:
http://searchengineland.com/figz/wp-content/seloads/2015/01/google-blocked-snippet-mobile.png

Critical Mobile SEO Updates from Google You Need to Know

As of November 2014, Google will begin to account for how "mobile friendly" a page is as an organic mobile SEO ranking factor. Officially announced over at Google's Webmaster Central Blog, this latest change to the ranking algorithm aims to improve the online experience of mobile users. Whilst Google already penalises websites viewed as offering a bad mobile experience, this is the first time we've heard of mobile-friendly sites being directly rewarded.

As well as affecting page rankings, Google will also be introducing a new "mobile friendly" label. This will appear at the beginning of a page's search results snippet, as seen in the example below, directly informing users that the page offers good mobile usability.

Mobile-Friendly Snippet

In order to qualify as "mobile friendly", Googlebot takes a few criteria into account. Mobile-friendly pages are classed as those that:

  • Avoid technologies that are not universally compatible with all mobile devices e.g. Flash
  • Size content to fit the screen, so users don’t have to zoom to view images or text
  • Have appropriate link spacing, allowing for easy clicking

Interestingly, the “mobile friendly” tag is attributed at a page level and not domain level.
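In practice, the sizing and spacing criteria above usually come down to responsive design. As a minimal, illustrative sketch (not the only way to satisfy the criteria), a responsive page declares a viewport in its `<head>` and lets CSS adapt the layout to the device width:

```html
<!-- Tells mobile browsers to use the device width rather than a zoomed-out desktop layout -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```

Without a viewport declaration, mobile browsers typically render the page at a desktop width and force users to zoom, which is exactly what the criteria above penalise.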

To see if a page is considered mobile friendly, Google have provided a simple tool at the following link: https://www.google.com/webmasters/tools/mobile-friendly/. The tool is fairly self-explanatory and easy to use. If users enter a site that meets Google's criteria for mobile devices, they'll be told the site is mobile-friendly and greeted with a screenshot of how Googlebot sees the page, as seen below.

mobile friendly - pass

However, when users enter a site that doesn’t meet the criteria, they’ll be shown a list of reasons why the site in question isn’t mobile friendly:

Mobile Friendly - Fail

This comes shortly after the introduction of mobile usability reports to Google Webmaster Tools, a feature that informs webmasters of errors affecting many of the factors the new update will be taking into account. Alongside the announcement of a new “mobile-friendly test”, it’s clear that Google are pushing webmasters and developers to seriously consider the mobile browsing experience their sites offer to users.

Image Sources

http://searchengineland.com/google-officially-launches-mobile-friendly-labels-mobile-search-results-208949