Join Fusion’s SEO team as they round up last month’s major industry updates


Although the Robots Exclusion Protocol (REP) has been the de facto format for robots.txt for the past 25 years, Google published a webmaster blog post on 1st July confirming that they have submitted the REP to the Internet Engineering Task Force (IETF) in order to make it an official internet standard. Alongside submitting the REP to the IETF, Google also updated its robots.txt spec to match the REP.

As part of the effort to make the REP an internet standard, Google also announced that they have made their robots.txt parser open source; it is currently available to view on GitHub.
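Google's open-sourced parser is a C++ library, but the basic REP matching behaviour it implements can be sketched with Python's built-in `urllib.robotparser` module. This is an illustration only: the standard-library parser predates the IETF draft and differs from Google's parser in some edge cases (for example, Google applies longest-match precedence between Allow and Disallow, whereas `urllib.robotparser` evaluates rules in order).

```python
from urllib.robotparser import RobotFileParser

# A minimal robots.txt: Allow/Disallow rules under a wildcard user-agent.
rules = """\
User-agent: *
Allow: /private/public-page.html
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The explicit Allow rule permits the public page...
print(parser.can_fetch("*", "/private/public-page.html"))  # True
# ...while everything else under /private/ stays blocked.
print(parser.can_fetch("*", "/private/secret.html"))       # False
```

Note that, as the Disallow bullet below explains, blocking a URL from being crawled is not the same as keeping it out of the index.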


Google announced that they no longer support noindex directives within robots.txt files, reasoning that this has never been an official directive within the REP.

To help webmasters prepare their sites for this change, Google have provided the following alternatives:

  • Noindex in robots meta tags: Supported both in the HTTP response headers and in HTML, the noindex directive is the most effective way to remove URLs from the index when crawling is allowed.
  • 404 and 410 HTTP status codes: Both status codes mean that the page does not exist, which will drop such URLs from Google’s index once they’re crawled and processed.
  • Password protection: Unless markup is used to indicate subscription or paywalled content, hiding a page behind a login will generally remove it from Google’s index.
  • Disallow in robots.txt: Search engines can only index pages that they know about, so blocking the page from being crawled usually means its content won’t be indexed. While the search engine may also index a URL based on links from other pages, without seeing the content itself, we aim to make such pages less visible in the future.
  • Search Console Remove URL tool: The tool is a quick and easy method to remove a URL temporarily from Google’s search results.
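As a concrete sketch of the first alternative in the list above, the noindex directive can sit in the page’s HTML head, or be sent as an HTTP response header for non-HTML resources such as PDFs:

```html
<!-- In the <head> of a page that may be crawled but should not be indexed -->
<meta name="robots" content="noindex">
```

The HTTP header equivalent is `X-Robots-Tag: noindex` in the response. Either way, the page must remain crawlable (i.e. not blocked in robots.txt) for Google to see the directive.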

Google also sent warnings to Google Search Console users.

Robots.txt warning - GSC

Within the warning, Google notified users that on 1st September 2019, noindex directives within robots.txt files will no longer work.


With Google admitting that they made around 3,200 changes to Google Search within the past year, whether through new features or regular updates, it’s safe to assume that multiple updates to Google’s algorithm take place daily. A tweet from Google’s Gary Illyes confirmed this back in 2017, stating that Google release, on average, three updates per day.

Despite being armed with this knowledge, we still see articles and tweets every month asking whether any algorithm updates have been released recently, or claiming that an “unconfirmed” algorithm update took place on a specific day.

Unconfirmed Algorithm Updates

The only certainty behind any of these updates is that Google release multiple smaller updates on a daily basis; these smaller updates focus on specific aspects of site ranking, such as site speed. Large broad core updates are released less frequently and have a much wider focus, aiming to provide a more general improvement to Google’s search algorithm.

The best way to tell whether any of these daily updates affect you is to keep an eye on your site’s traffic and rankings and look out for any sudden changes, aided by online tools. Tools such as SEMrush’s Sensor can show when algorithm updates produced more movement within search rankings and which industry categories were affected the most; with an account, you can also view your sites’ rankings alongside Google algorithm updates to see how they were affected on a daily basis.


Rolling out in the US during July, Google’s new feature will allow customers to post their photos, along with a review, to be displayed within the product listing.

This feature can currently only be implemented via Yotpo, PowerReviews, Bazaarvoice, and Influenster. Google have also confirmed that they will be looking to expand this list in the future.


Google has been spotted testing a new feature within its search results that allows users to share a result, open a cached copy of the page, or open the page in a new tab.

Although this is currently a test and may not make it past the testing stage, it would be interesting to see how users interact with this feature compared to sharing the page only after they have clicked through.


The newest addition to data visualisations within Google Data Studio is the treemap.

Added on 18th July, this new chart allows users to view data organised into dimension hierarchies and create drill-down options, such as product brand > product category > product.