On Thursday Google announced the addition of a set of new user experience metrics to its growing list of ranking factors.
The additions – which Google is referring to as “Page Experience” metrics – are designed to evaluate how users perceive browsing, loading and interacting with specific webpages, and incorporate criteria measuring:
Page load times
Incorporation of HTTPS
The presence of intrusive ads or interstitials
Unexpected movement of page content or layout
Webmasters should already be familiar with many of these
factors, with recent years seeing Google driving home the importance of mobile
friendliness, page speed, HTTPS adherence and avoidance of intrusive interstitials.
However, the new Page Experience signal also includes areas
from the new “Core
Web Vitals” report, recently incorporated into Google’s PageSpeed
Insights and Search Console tools.
Largest Contentful Paint (LCP): This measures the perceived loading performance of a page, or the time that passes before the main page content is visible to users. An LCP time of 2.5 seconds or less is viewed as good, with higher times in need of improvement.
First Input Delay (FID): Measuring interactivity / load
responsiveness, or the time it takes for a user to be able to usefully interact
with content on the page. An FID of less than 100ms is optimal, with higher
scores in need of improvement.
Cumulative Layout Shift (CLS): Measuring visual stability, or
whether the layout of a page moves or changes while a user is trying to
interact. Pages should aim for a CLS of less than 0.1 in order to provide a
good user experience.
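As a rough illustration (this is not Google's own tooling), all three metrics can be observed in a supporting browser via the standard PerformanceObserver API; the entry types used below come from the Web Performance specifications, and browser support varies:

```javascript
// Sketch: log Core Web Vitals in the browser console.
// Browser-only — these entry types are not available in Node.js.

// Largest Contentful Paint: the last entry reported is the final LCP candidate.
new PerformanceObserver((list) => {
  const entries = list.getEntries();
  const lcp = entries[entries.length - 1];
  console.log('LCP:', lcp.startTime, 'ms'); // aim for 2500ms or less
}).observe({ type: 'largest-contentful-paint', buffered: true });

// First Input Delay: gap between the first interaction and its handler running.
new PerformanceObserver((list) => {
  const first = list.getEntries()[0];
  console.log('FID:', first.processingStart - first.startTime, 'ms'); // aim for < 100ms
}).observe({ type: 'first-input', buffered: true });

// Cumulative Layout Shift: running sum of unexpected layout-shift scores.
let cls = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    if (!entry.hadRecentInput) cls += entry.value; // ignore shifts caused by user input
  }
  console.log('CLS so far:', cls); // aim for < 0.1
}).observe({ type: 'layout-shift', buffered: true });
```

Pasting a sketch like this into the console of a loaded page gives a quick feel for the numbers, though Lighthouse and PageSpeed Insights report them more reliably.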
Largest Contentful Paint and First Input Delay will already be recognisable to most webmasters, with Google’s PageSpeed and Lighthouse tools already providing information on these metrics.
However, Cumulative Layout Shift appears to be new, with Google’s John Mueller stating that the CLS metric has been created to gauge levels of user “annoyance”. CLS looks at the familiar experience of content shifting as a page loads, which Google illustrates with the below GIF:
What does this mean for webmasters?
Whilst most of the individual metrics within Page Experience are pre-existing ranking factors, the new announcement places them together as one part of an overarching signal:
Google state that they are aiming to provide a more “holistic
picture of the quality of a user’s experience on a web page”, by grouping
previously separate factors together.
Each factor will be weighted uniquely, although as Google
have declined to comment on how this weight will be distributed, it will likely
be up to webmasters to determine the importance of each.
The new signal is also set to bring changes to how mobile
top stories are determined, with the adoption of AMP (Accelerated Mobile Pages)
no longer a prerequisite for inclusion within this section.
In future, top stories will be based on an evaluation of
Page Experience factors, with non-AMP pages able to appear alongside AMP pages.
When will Page Experience roll out?
Google state that changes around Page Experience “will not
happen before next year”, and promise to give at least 6 months’ notice before
any roll out takes place.
This gives webmasters plenty of time to get ready for the changes, with preparation hopefully made easier through the early incorporation of Page Experience metrics into tools like Google Search Console, Lighthouse, and PageSpeed Insights.
It will come as no surprise that when Google updated its 14-year-old nofollow link attribute on 10th September it caused quite the stir in the world of SEO.
This update sees an extension to the well-recognised nofollow tag, broken down by Google as follows:
rel="sponsored": Use the sponsored attribute to identify links on your site that were created as part of advertisements, sponsorships or other compensation agreements.
rel="ugc": UGC stands for User Generated Content, and the ugc attribute value is recommended for links within user generated content, such as comments and forum posts.
rel="nofollow": Use this attribute for cases where you want to link to a page but don’t want to imply any type of endorsement, including passing along ranking credit to another page.
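In practice, each value is applied in a link’s rel attribute, and values can be combined where more than one applies; a minimal illustration (the URLs are placeholders):

```html
<!-- A paid placement: -->
<a href="https://example.com/partner" rel="sponsored">Our partner</a>

<!-- A link left in a blog comment or forum post: -->
<a href="https://example.com/commenter" rel="ugc">Commenter's site</a>

<!-- A link you simply don't want to endorse: -->
<a href="https://example.com/other" rel="nofollow">Another page</a>

<!-- Multiple values can be combined: -->
<a href="https://example.com/ad" rel="sponsored nofollow">Sponsored link</a>
```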
The nofollow tag was originally introduced by Google to help prevent comment spam, and later came to be regarded as a way for website owners to tell Google to ignore a link. In other words, the link wouldn’t be crawled and wouldn’t be used as a signal to help improve rankings. It therefore became a common way for websites to acknowledge guest blog posts and partnered or sponsored content without passing on any of their site equity.
“All the link attributes — sponsored, UGC and nofollow — are treated as hints about which links to consider or exclude within Search. We’ll use these hints — along with other signals — as a way to better understand how to appropriately analyze and use links within our systems.”
The idea that it could be a ‘hint’ is great news for websites looking to earn links and grow their backlink profile: what was once a redundant link is now being used as a ‘hint’ for ranking.
Google has made it clear that websites don’t need to update old nofollow tags to the new structure; instead, the new attributes can be adopted by websites that want to be more granular in their link tags.
You took time to ask a client if they would adopt something no one said you needed to do? Again, this is a voluntary outside of sponsored links. If people *want* to use more granular identification, those who *do want* can do so. If you don’t, don’t. https://t.co/rMmSrUHaSQ pic.twitter.com/hbNMgPaxtd
Prolific North’s Northern Digital Awards 2019 will take place on the 31st January and here at Fusion HQ we’re delighted to have been nominated for ‘Search Agency of the Year’ for the second time.
We’ve also been recognised for our unique SEO software, Natural Edge, which has been nominated for ‘Best Digital Tool or Software’, and we couldn’t be more proud of the recognition that Natural Edge has received.
We thought it might be a good idea to explain a little more about our Natural Edge software and how it is helping give our clients a competitive edge in an increasingly competitive SEO marketplace.
As an SEO team, a key part of our day-to-day activity is keeping our clients ahead of the curve in organic search – and outranking their competitors. If a prospective customer searches for a cycling related keyword, for example, then we’d want our bike retailing client to be among the first to appear, with high visibility in all the right places.
Several years ago, we sought out SEO software that would be able to assist with doing this – for ourselves and our clients. It needed to be adaptable to algorithmic changes (like the increasing prominence of localised search), flexible from a pricing point-of-view, and offer clear reporting metrics that clients could use to inform the KPIs they set and the ROI of our services.
However, the tools we looked into didn’t meet our clients’ needs. Ranking software would only give you your keyword position without considering how much traffic you would gain, for example. It might only benchmark a small set of competitors, or keywords would be looked at in isolation rather than holistically, missing out on larger insights that can truly drive a strategy forward.
Instead of spending big on little return, we invested in proprietary technology of our own, building a highly adaptable and cost-effective suite that could tailor bespoke solutions for our clients’ needs – giving them the Natural Edge required in order to shine.
What can Natural Edge do?
Natural Edge was nominated for the award on the basis of its versatility and the range of benefits it offers to its users – and our clients. Here are just some of the highlights:
See the bigger picture
It’s easy to become obsessed with individual keywords.
Natural Edge identifies every site ranked on the first page for each relevant keyword in each location, and uses our proprietary algorithm to calculate how much traffic a site will earn from its positions. Natural Edge collates all of this data and presents a league table ordered by the highest traffic drivers, so that results are easy to understand and analyse.
Identify true competition
Competitors in search are very different to competitors in daily business life. In fact, most brands are competing in search with companies that would surprise them.
Natural Edge benchmarks clients against anyone who ranks on the first page for specified keywords in every location they have presence. Finding out who you’re up against has never been clearer.
This provides a range of opportunities for growth, from identifying high priority keywords to target, inspiring new content ideas and analysing competitor backlinks to spot potential partners.
Understand what drives competitor visibility
While some sites rank for dozens of long-tail keywords, others rank highly for a couple of high volume phrases.
Natural Edge tells you how competitors have built their market share, allowing you to flesh out your digital strategy with key industry insights.
Understand local performance
Natural Edge offers highly localised insights, highlighting the composition of organic search results by identifying the number of localised and map results generated at a keyword level. A client can enter their locations into Natural Edge to identify generic keywords that create local results, and to see share of voice and individual keyword rankings for each of those locations.
Why we’re so proud to be up for nomination
At Fusion, we’ve been working with award-winning retail clients for over twenty years, delivering exceptional service with demonstrated ROI whilst using best-in-class innovation to create unique solutions to today’s digital problems.
Natural Edge is just one example of our team’s outside-the-box thinking, and we’re beyond chuffed that our hard work and expertise is being acknowledged by one of the region’s leading authorities in the field.
Greatest of all, however, is the fact that it’s a testament to our team’s quality and ability, as a cutting-edge independent agency producing award-nominated software, and investing in genuinely pioneering solutions to achieve our clients’ goals.
Interested in what our services can do for you? Get in touch with the Fusion team today at firstname.lastname@example.org or learn more about Fusion Natural Edge here.
Join Fusion’s SEO team as they round up last month’s major industry updates.
GOOGLE SEARCH CONSOLE ADDS NEW INSIGHTS TO SERPS
Many SEOs, including ourselves, started noticing at the start of October that Google was now providing us with a glimpse into our Google Search Console data when searching for a keyword.
This window will only appear if you are logged into Google Search Console and have a property that ranks for the keyword. If multiple properties rank for the same keyword, a drop-down menu will become available to allow users to switch between properties. There is also an option to ‘see ways to improve’ a keyword’s ranking, although the same two suggestions will appear for every keyword:
Compare this query to your overall data
Find out how to use this information to make changes to your site so that you can increase your chances to show up for the queries that you care about.
GOOGLE’S OCTOBER ALGORITHM REVIEW
Although we saw fluctuations in average rankings throughout the month, one of Google’s daily algorithm updates produced a much larger spike than any other day. According to SEO tracker tools, the fluctuations on 17th October had an impact across all industries. The least affected were sports, news and arts & entertainment; law & government, jobs & education and finance were affected the most, with other categories not far behind.
Google have said that their algorithm is constantly refreshing and that, due to the large number of updates that take place, they are unable to provide information on what changes are made each day. As Google have not released a statement on what caused these fluctuations, the cause remains unclear. Daily fluctuations were seen across October and we may well see the same throughout December too.
GOOGLE SHUTS DOWN GOOGLE+
Google announced on 8th October that they will be closing Google+ over a 10-month period to give users full opportunity to transition, with an aim to close the site by the end of August 2019.
Released in 2011, Google+ was Google’s answer to social media, after the company had dominated the search engine market in the early 2000s, taken over the online video industry in 2006, and surpassed Internet Explorer as the top web browser in 2012.
However, poor uptake by users has led Google to pull the plug on Plus, and social media will no longer be an area in which Google competes.
GOOGLE MY BUSINESS ANNOUNCES NEW PRODUCTS FEATURE
Spotted by an SEO and posted on Twitter, Google has released a product collection and product menu on selected Google My Business accounts. This feature replaces the services menu and will allow users to add a product collection, along with products within the collection.
Been playing with new GMB beta feature, Products (Beta) Looks to be replacement for Services menu located in Info tab. Set up collections with products in each collection. I am putting service items in as well. Any additional info here @mblumenthal ? pic.twitter.com/1UKyVXKfkb
Google is yet to release any documentation providing more information on this feature or when it will be released from beta.
GOOGLE MY BUSINESS RELEASES NEW MIGRATION TOOL & BRANDED INSIGHTS
Announced on 22nd October via Twitter, Google’s new migration tool will allow users with organisation accounts to easily migrate locations from a personal account into the organisation account.
Users have the option to remove the transferred locations from the personal account or to keep the locations on both the personal and organisation accounts. For the best experience, Google recommends removing the locations from the personal account and adding the personal account as a user of the organisation account. Other information on the organisation account migration tool includes:
Locations from several personal accounts can be added to the organisation account.
For users who opt to keep locations available on both the personal and organisation accounts, a property added to the personal account will not also be added to the organisation account.
Location transfer requests will only work for transferring locations from a personal account to an organisation account; users will not be able to transfer locations between organisation accounts.
Google has also implemented the ability to see branded searches within Google My Business insights. Branded searches will be classed differently to direct searches (a customer searching directly for the business’ name) and will be displayed within the chart on the dashboard.
GOOGLE MAPS ALLOWS USERS TO CONNECT WITH BUSINESSES BY FOLLOWING THEM
Google announced on the 24th October via their blog, ‘The Keyword’, that users will be able to follow businesses within the Google Maps app in order to receive updates which will be displayed within a ‘For You’ tab in the app.
Businesses that are not yet open will also be able to connect with users before their open date. The business’ profile can be visible to users up to three months in advance of opening. This can be used to tease the opening date, make users aware of any opening events or keep people updated on the types of products or services soon to be provided.
NEW BOOK AND SCHEDULE BUTTONS INTRODUCED TO LOCAL GOOGLE RESULTS
Google has expanded on giving mobile users access to call a company from the local pack listings and has implemented ‘book’ and ‘schedule’ buttons which allow users to book or schedule a service within the ‘Reserve with Google’ platform.
Google has a full list of booking software companies that work with ‘Reserve with Google’ along with a list of companies they will soon be working with.
PUBLIC TESTING STARTS FOR GOOGLE ASSISTANT’S DUPLEX TECHNOLOGY
Google Duplex is a new artificial intelligence, to be implemented within Google Assistant, that can converse with businesses on behalf of the user in order to accomplish tasks such as booking a table at a restaurant or making an appointment at a hairdresser’s. Following its incredible unveiling, Google has released a blog post confirming that public testing will begin for Google Pixel users at restaurants in New York, Atlanta, Phoenix and the San Francisco Bay Area.
We’re more than intrigued to see how this develops in the future!
Google Search Console’s New URL Inspection Tool
Once provided with a URL that you own, the ‘URL Inspection Tool’ will provide crawl, index and serving information to allow users to check whether a URL is being displayed. If errors are found within the URL, the tool will provide an error report detailing what issues it has found.
Google My Business Agency Dashboard
Google has released a new dashboard which allows agencies to access and manage multiple Google My Business listings from a single page.
Manage all locations under one account: Manage thousands of locations within a single account rather than being limited to 100 locations per account.
Send and receive invitations to manage listings: Send, receive and manage invitations within a dedicated section of the agency dashboard.
Location groups: All locations within an organisation must be held within a location group to simplify location management. Agencies can send/receive invitations to co-manage customers’ location group listings.
User groups: Create and manage groups of users to easily manage access to certain location groups.
Search: Quickly search for locations within the account or specific location group.
Google Search Console’s API Now Has Access to 16 Months of Data
Although Google has provided users with sixteen months of historical data since the release of the new Google Search Console interface, a tweet from the Google Webmasters account confirmed in mid-June that users can now access this volume of data through the Search Console Analytics API. As a consequence, sixteen months of data can now be integrated within CMS and other tools:
If you're using the Search Console Search Analytics #API, you now have access to all 16 months of data provided in the UI! If you'd like to integrate the data with your CMS or make your own tools, check out our docs at https://t.co/cqVVyHIbUp
Bing Announce Support for JSON-LD within Webmaster Tools
A month after Bing started supporting JSON-LD markup, Bing’s principal program manager, Fabrice Canel, announced extended support for JSON-LD markup within Bing Webmaster Tools during his appearance at SMX Advanced.
This will now allow users to enter their JSON-LD code into Bing Webmaster’s markup validator and receive debugging information.
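For readers unfamiliar with the format, JSON-LD structured data is embedded in a page as a script block that can be pasted into a markup validator; a minimal sketch, with placeholder organisation details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Ltd",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/logo.png"
}
</script>
```

Unlike microdata, this keeps the structured data separate from the visible HTML, which is part of why search engines have been moving towards it.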
This follows on from another broad update that rolled out last month. Google followed up this tweet by saying that there is no way to fix pages that may have lost performance from this update, but to instead keep on building good content.
The fluctuations from the update in search results lasted more than 10 days, appearing to begin on 17 April. Because this was a core update, it was not given an identifiable name and does not appear to target anything in particular.
Google replaces pagination with a “More results” button on mobile
Google have launched a change to their pagination on iOS and Android devices. Next and previous buttons have been replaced with a single “More results” button.
Instead of taking the user to a new page, the new feature loads the next set of results directly below the current set. When ads are loaded, these get inserted where the top of the next page would previously have been.
There have been mixed reactions to this change. Some SEOs said that this new functionality gives a poor user experience, while others said that it could make it more likely that users will make it to the second set of results than previously.
Google My Business adds lists of services
Google My Business has added a new feature within the management interface that allows some listing owners to create a list of their services for each map listing.
This was announced in mid-April on the Google My Business Help forums. It is available in addition to the food menu editor that is available for restaurants.
The service menu can be created and edited from the Google My Business dashboard. The menu must be created in sections and items can contain a name, price and description.
Google rolling out mobile-first indexing to more sites
Google has confirmed in a blog post that they have begun migrating sites that follow mobile best practices over to mobile-first indexing.
Some sites are now beginning to receive notifications in their Google Search Console properties that the site has been migrated:
Google said that sites with content that doesn’t follow mobile best practice need not worry about this change, but also recommended that they begin to make their content mobile-friendly.
Google core update in early March
Our rank tracking tools detected very high volatility in the search engine results from 2 March to 10 March. Here’s what that looked like in SEMrush:
There was a lot of chatter on the SEO forums during this time from webmasters that saw drastic changes in their traffic. Google confirmed that this was a core algorithm update on 12 March:
Each day, Google usually releases one or more changes designed to improve our results. Some are focused around specific improvements. Some are broad changes. Last week, we released a broad core algorithm update. We do these routinely several times per year….
Google have not provided a name with which to reference this update.
Captions added to Google Images, taken from page titles
Google has added captions to their image results on mobile. The captions reflect the title of the page from which the image is sourced.
The caption also contains the site’s domain. The caption previously contained only the domain. Adding the title gives the image more context; it also means that page titles have become even more important than before.
Bing adds support for JSON-LD schema
Bing confirmed to John Henshaw on Twitter that they now support parsing of JSON-LD schema metadata. Search Engine Land also received confirmation of the support.
This is the most proof I'm offering. My high-level Bing contact said it's not in their webmaster tools yet, but it is supported. So you may have to wait awhile until you can get it in writing (public help doc and/or it appearing in their tools), but I'm moving forward with it. pic.twitter.com/YDiucVOsYc
Google Chrome to display “Not Secure” warning for non-HTTPS sites starting July 2018
Over the past few years, movements have been started to make the web more secure. Most recently, Google announced that they would begin to mark sites that are using insecure HTTP connections as “Not Secure” via a label in the address bar in the Chrome browser.
Pages with password fields served over HTTP are already marked as “Not Secure”, but starting July 2018, all HTTP pages will be labelled. Install your security certificates now!
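Preparing for this typically means installing a certificate and redirecting all plain-HTTP traffic to HTTPS. A minimal nginx sketch, assuming placeholder server names and certificate paths (your web server and file locations will differ):

```nginx
# Redirect all plain-HTTP requests to HTTPS with a permanent redirect.
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://$host$request_uri;
}

# Serve the site over HTTPS using the installed certificate.
server {
    listen 443 ssl;
    server_name example.com www.example.com;
    ssl_certificate     /etc/ssl/certs/example.com.crt;
    ssl_certificate_key /etc/ssl/private/example.com.key;
    # ... rest of the site configuration ...
}
```

The 301 redirect also tells search engines that the HTTPS version is the canonical one.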
Possible major Google algorithm update
SEMrush Sensor and other SERP tracking tools were showing high volatility in the search results around 20 February, leading many to believe that this was another major Google algorithm update. However, discussion in search forums leaves a general uncertainty over what this update might be targeting.
Google responded to comment with the usual statement that they make multiple minor updates every week.
“View image” and “Search by image” removed from Google Images
In the past month, Google has removed the “View image” and “Search by image” buttons from image search results. This is in response to a new partnership between stock image provider Getty Images and Google, after Getty Images complained that they may be losing revenue because searchers were able to access images directly from search results without ever visiting the source site.
It’s now intended that searchers should view the image in the context of the site from which it originates. However, the response to these changes from Google Images users was generally negative.
Today we're launching some changes on Google Images to help connect users and useful websites. This will include removing the View Image button. The Visit button remains, so users can see images in the context of the webpages they're on. pic.twitter.com/n76KUj4ioD
Google now displays multiple featured snippets for broad search queries
Featured snippets on Google usually appear as a single box of text, a list or a table at the top of some search results pages, before the organic results; this is known as position 0. A featured snippet may appear when a page directly answers the searcher’s question.
Google has updated the display of featured snippets, so now 2 or more may be shown when the question is broad or could be answered in multiple ways. Google clarified in an article on their blog that this is to provide more “comprehensive” results. For site owners, this means even more opportunities to appear in a featured snippet.
New Google Search Console rolls out for more sites
Google have now opened up access for the new beta version of Google Search Console to all users.
The new interface comes with a number of improvements to user experience, as well as some new reports such as the Index Coverage report.
Until now, access to the new Search Console has been limited to a select number of beta testers. However, it was reported by SearchEngineLand on 22 January that it had become accessible to everyone.
There are still a number of bugs and some of the reports are missing while they are reconstructed. The old version will remain accessible alongside the new one for the foreseeable future.
Page speed to become a mobile search ranking factor
Google announced on 17 January that they will be making updates to mobile search, meaning that mobile page speed will become a ranking factor. It is now more important than ever to audit your site and work on ensuring fast delivery of content.
The “Speed Update”, as Google called it, will come into effect in July 2018 and affect the ranking position of the slowest pages in search results. This update demonstrates Google’s ongoing push for developers to deliver a better user experience.
Google also provided some tools that can be used to measure page performance, which included their own Lighthouse report and the recently updated PageSpeed Insights.
Google updates causing fluctuation throughout January
December saw a number of algorithm updates to Google search results, some of which were even confirmed by Google themselves. Updates have continued into January as webmasters reported changes in site traffic on forums; some positive, some negative.
Reports came in especially around 15 January, with search monitoring tools all showing high volatility around this date. Nothing regarding this has been confirmed by Google, but we are monitoring closely to see if anything changes.
103% increase in smart speaker sales
Adobe Digital Insights reported after their study on voice search that smart speaker sales increased 103% in Q4 2017 vs 2016. Smart speakers were very popular gifts for the recent festive season.
Voice assistant technology is improving daily and more companies are releasing new products to compete in this market. Research data reported on by SearchEngineLand shows that Google Home has now sold 44 million units in total, accounting for 40% of smart speaker sales over the holiday period.
There’s no better time than now to aim for position 0 in search results, with the majority of voice search answers coming from sites in this position.
2017 has been a happening time for digital content: Wendy’s broke the retweets record; the world began reacting to the proliferation of suspect news; Twitter doubled its character count; and the world’s biggest brands continued capturing our imaginations through brilliant campaigns, as content proved that it firmly remains the king.
One of today’s greatest producers of digital content is leading music streaming service Spotify. From quirky Times Square billboards to esoteric partnerships with leading franchises like Stranger Things, its creative campaigns have continuously made headlines and captured the popular imagination.
Last month, Spotify rolled out 2017 Wrapped – its end of the year campaign – which may well be their best to date. The campaign collates each user’s top one hundred most played songs of 2017 into single playlists, and as users keenly published their enigmatic soundtracks to the year, these quickly filled the web.
Read on for our analysis of how 2017 Wrapped wrapped up Spotify’s 2017 so brilliantly.
We’re all familiar with the age-old aphorism of never judging a book by its cover. Nevertheless, at a time when the Internet is brimming with curated content, and users’ attention spans for engaging with it are becoming slimmer and slimmer, it’s crucial for content to lead immediately with points of interest that compel.
2017 Wrapped’s visual elements are perfect. The campaign is centred on a micro-site with a homepage that’s animated, interactive and full of colour and life, accompanied by compelling copy that brings a sense of immediacy and gives the tool purpose. With conviction, it states: ‘In a year that many wanted to tune out, music gave us a reason to keep listening’.
It’s great for your campaign activity to be telling a rich story or glowing with meaningful content. However, it’s key for there to be high visual quality to match, to ensure that users engage with your work in the first instance.
Find what your audience loves
Spotify’s USP is how it allows users to freely listen to the songs that they want to hear, in playlists they curate, in orders they arrange.
It’s quickly obvious that the personal preferences of its listeners lie at the heart of the service.
2017 Wrapped links into these very same sentiments, creating content that’s unique to every user, which means that they’re more inclined to engage with the content.
Think about what motivations inspire engagement with your business or service. How could your content provide for them?
Increase engagement opportunities by going one step further
As extra elements of the campaign, Spotify included self-curated playlists covering the year’s most popular hits, such as ‘UK Top Female Artists 2017’, ‘UK Top Male Artists 2017’ and ‘Top Groups of 2017’, and a quiz testing your knowledge of your listening habits.
By no means were these the main drivers of user interest. However, they proved to be simple ways of increasing the campaign’s breadth, which required minimal effort to make and expanded the opportunities for engagement.
Identify unique opportunities to create unique user experiences
2017 Wrapped creates Spotify playlists using data that none of Spotify’s competitors can access.
As such, the deliverables that 2017 Wrapped returns are genuinely original – they’ve never happened before and there’s nothing like them – which is an incredibly valuable asset at a time when every brand is competing for attention and clicks.
Netflix’s recent social activity has taken a similar direction, combining its data with inventive copy to hit enormous engagement figures on social:
To the 53 people who've watched A Christmas Prince every day for the past 18 days: Who hurt you?
Whilst GDPR needs to be closely adhered to, think about the data you have that could be made into meaningful content. Be sure to make the most of the opportunity!
And let users share their results
Once a user has generated their playlist, they’re able to share it to their social feeds.
Though each share has a relatively microscopic reach, on a macrocosmic level they fulfill an essential branding purpose, as each sharing user becomes a brand advocate promoting Spotify’s created content to new audiences, driving expansive visibility and facilitating new user engagement opportunities.
When you’ve made excellent content that tells a compelling story, be sure to make it easily shareable. Beyond anything else, your excellent work deserves all the reach it can get!
We’ll be back in 2018 to cover all the wonderful content the New Year has in store. In the meantime, we wish you all a very Merry Christmas and a wonderful New Year!
Post-Halloween greetings from Fusion Content. It’s been the season to be scary. The world’s biggest brands and content creators headed outside to trick and treat, as campaign activity took a frightening turn towards the paranormal.
Now that the pumpkin lanterns are well and truly out, read on for the best examples of Halloween marketing this year.
Stranger Things turns the internet Upside Down (spoiler free!)
The Internet loves talking about what it’s been watching. In recent months, few series have received more hype than the newest Stranger Things. Netflix released the show’s second season on October the 27th.
As a goldmine of pop culture throwbacks and retro references, the series presented super partnership opportunities to a range of brands:
Spotify allowed users to find out which Stranger Things characters share their musical tastes. Firstly, they tailored playlists to the season’s main characters’ preferences. Then, they compared them with users’ listening histories. Options ranged from the Demogorgon’s Upside Downers to Eleven’s Breakfast Jams!
Topshop converted their flagship Oxford Street branch into a Stranger Things shrine. They recreated a range of the show’s most iconic locations, such as Hawkins Lab, the games arcade and the Byers’ heavily graffitied living room:
Eggo waffles are the snack of choice of one of Stranger Things’ most iconic characters. Since series two arrived, the Eggo Twitter account has practically become a Stranger Things fan account (with a few outrageously bad puns thrown into the mix):
All of these are instances of brands imaginatively tapping into mainstream pop culture events to create relevant and timely marketing. Kellogg’s activity is an especially excellent example. By capitalising on Eggo’s sudden uplift of pop culture relevance, and crafting a social strategy around it, they’ve been able to grow their brand in a new direction for a widened and younger audience.
Svedka Vodka uses display ads to haunt the internet
Svedka Vodka took an unconventional but eerily brilliant approach to its Halloween-themed marketing, which combined creative activity and remarketing to possess users’ social feeds with spooky Svedka Vodka content.
The campaign began by serving clickbait Halloween-themed cocktail recipes on users’ feeds. However, all wasn’t as it seemed. If a user clicked the link, they’d instead be spirited away to a video proclaiming that the curse had been laid:
From then on, they would be shown a cocktail of creepy banner ads. Geotargeting and retargeting methods made the curses uncannily unique: users in New York would be served New York specific ads, for instance.
And the user could only lift the curse by sharing one of the clickbait articles from Svedka Vodka’s Halloween hub. The curse would pass on to their friends and the cycle would begin again!
The brand’s multi-channel strategy created a memorable, outside-of-the-box campaign. Whilst we wouldn’t normally advocate shaping a digital strategy around hexing your audience, it certainly paid dividends on this occasion!
Burger King clowns around with McDonald’s and IT
Like Kellogg’s Eggos, Burger King’s Halloween content tied into pop culture happenings. Yet, rather than using pop culture to promote their own product, Burger King used it instead to take a swipe at a rival. The target? Historic arch-nemesis, Ronald McDonald.
This Halloween, BK invited the world to dress up as scary clowns and, in many of its biggest locations (such as Leicester Square), offered free burgers as a reward. The campaign’s motto summarised the endeavour succinctly: ‘Never trust a clown’.
This isn’t the first time that Burger King has trolled its competitors in its Halloween content. Last year, one outlet dressed up as McDonald’s ghost:
Nevertheless, it’s an inventive approach that capitalised on a seasonal opportunity to create conversations and serve up buzz around the brand at a competitor’s expense (which is risky, but fits within Burger King’s wider brand identity).
Come back next month, where we’ll be chatting all things Christmas!
Knightsbridge are market leaders in producing high-quality, bespoke B2B furniture solutions. Their specialism is furnishing hospitality and healthcare locations. They’ve been based in nearby Bradford for over eighty years.
Fusion met with Knightsbridge to discuss the performance of their digital channels. With extensive experience of B2B SEO, we outlined a clear vision for improving the company’s holistic digital approach, and succinctly presented the expertise that we can bring to the table.
Our proposal met a glowing response. ‘We loved Fusion’s expertise and commitment to detailed planning’, said Knightsbridge Furniture CEO Alan Towns. ‘Their pitch showed us that they understood exactly what our business is hoping to achieve, alongside a proven track record of success in B2B markets.
Fusion has strong experience working with big names in our target sectors and we’re confident that their in-depth strategy will achieve real results’.
Craig Broadbent, Fusion Unlimited Technical Director, said: ‘We’re really looking forward to applying our specialist B2B SEO knowledge to Knightsbridge. There are clear search marketing opportunities within the contract furniture market and we’re delighted to be partnering with Knightsbridge in the next stage of their growth’.
Working with Knightsbridge is an exciting opportunity. We look forward to helping the company grow online and enjoy all of the benefits that a best-in-class SEO strategy can bring.
Whilst the intermittent wet weather of the last two weeks seems set to bring the British summertime to a close, we’ve recently been delighted to see the wide acclaim received by our client Halfords for their exceptional performance throughout the summer, making headlines in leading publications such as Internet Retailing and The Telegraph.
Halfords’ strategy focused on the on-trend phenomenon of staycations. Growing numbers of British families are swapping ten-hour flights for fish ‘n chips and pitching their tents a little closer to home. As one of the UK’s leading suppliers of holiday-making must-haves like sleeping bags, tents, bikes and roof-racks, it was essential for Halfords’ voice to be at the heart of the conversation.
In collaboration with Halfords’ internal teams, we implemented a cross-channel strategy to bring Halfords’ vision to life. With the objective of maintaining and increasing Halfords’ visibility for the camping category, we created compelling content to drive organic visibility and secure coverage in major publications and features on high-quality lifestyle blogs. Producing an interactive camping guide and working alongside influencers on unique stories and advice helped Halfords increase share of voice by 3.86%, with over 50 pieces of coverage. Additionally, we supported staycation-specific products with promotional PPC ad copy to harness the intent driven by the wider content strategy, whilst a granular Shopping campaign structure allowed us to dynamically support key products during peak periods.
Revenue-wise, our combined activity provided the brand with a summer to remember. In comparison to the first twenty weeks of the last financial year, total sales rose by 11.2%, revenue from retail services (such as bike repairs and car-part fitting) increased by 18.3%, and overall revenue went up by 4.8%.
Another significant move by the brand was the perfecting of its in-store collection services. 85% of all digital orders are now picked up in Halfords stores, which is important for a brand that specialises in items that are difficult to ship. This availability enables customers to enjoy the benefits of easy online purchasing whilst minimising the hassle of delivery.
It’s always great to see our clients gain the recognition their efforts deserve, and we’re excited to see how our brands’ successes will be received in the future!
Interested in how we can help your brand flourish online? Explore our range of digital services.
As the Internet catches its breath after last season’s Game of Thrones, and ‘winter is coming’ becomes more and more of a reality, the greatest minds in digital marketing continue to produce buzzing campaigns and pique the attention of the Internet.
A tea giant ran a giveaway in their cricket whites, the National Gallery introduced Van Gogh to Facebook, and an airline produced its own take on John Cage’s 4′33″. Read on for five digital media happenings that caught the attention of Fusion’s content team last month!
1) Yorkshire Tea hits the content for six
In the build-up to last month’s match between England and the West Indies at our very own Headingley Stadium, an inspired Yorkshire Tea competition asked entrants to film themselves bowling their teabag into their cuppa as spectacularly as possible.
The prizes didn’t stray far from the wickets. The lucky winners received VIP tickets to the game, a signed cricket bat, and the opportunity to chat with legendary English cricketer Michael Vaughan.
The competition benefited the brand in several ways: it enabled them to highlight their Yorkshire roots and tap into pop culture interests, whilst encouraging the creation of unique user generated content that created animated conversations on social media.
2) Vincent Van Gogh gets social with the National Gallery and Facebook
Recently, Vincent Van Gogh has enjoyed an unlikely pre-eminence in digital media. In February, Airbnb partnered with the Art Institute of Chicago to build a real life version of his iconic painting ‘The Bedroom’, making it available for art-minded guests to stay in. Now, he’s starred at the heart of a foray into VR by the UK’s National Gallery, who used Facebook Live to host a virtual exhibition that united his legendary ‘Sunflowers’ paintings – displayed in galleries all over the world – for the first time in their history.
The exhibition functioned as a relay between five galleries. Each had fifteen minutes to present their own portrait to the audience, before passing the impressionist baton on to the next.
‘We launched our first Facebook Live a year ago’, said Dr Gabriele Finaldi, Director of the Gallery. ‘They’ve been growing in popularity ever since, so we are delighted to be teaming up with galleries all over the world and Facebook for the first ever live relay focusing on Van Gogh’s ‘Sunflowers’. This collaboration is a key step in the National Gallery’s Digital Strategy, which will see us fully explore the potential of immersive media to create new ways of experiencing art’.
The joint effort provides a wonderful example of a brand capitalising on the new opportunities that new channels present to create unique, ground-breaking content. We’re excited to see how marketers will pick up on Facebook’s Live availability in the future!
3) easyJet gets ambient with charity album
Musically, August delivered a fairly happening month for the airwaves: Leeds and Reading Festival came and went, as thousands of starry-eyed festival-goers discovered the invention of the beer bong and depleted the country’s Frosty Jack’s, whilst Taylor Swift deleted her social media channels, before springing back with a new single which remained very Taylor Swift. However, what piqued our interest the most was the astonishing arrival of an ambient album from easyJet, with the release of their two-song EP titled ‘Jet Sounds’.
This wasn’t the sign of a change of heart for the airline brand, but a clever example of a zany, high quality campaign. It followed the UK Sleep Council’s recent findings that 22% of the population get a poor quality of sleep, and their recommendation that one way to remedy this is by listening to white noise before we go to bed – a monotonous and droning sound that soothes our minds and eases out distractions. As the low humming of an aircraft’s engines meets these criteria exactly, the airline decided to record two tracks of it on a plane flying from Gatwick to Nice!
All of the release’s proceeds go to The Children’s Sleep Charity. If you’re feeling a little tired yourself, or just fancying some Boeing 737 beats to liven up your weekend predrinks, check out the album on iTunes here.
4) Airbnb criticised over email marketing campaign
After the devastation of flooding in Houston by Hurricane Harvey, Airbnb waived rental fees across the city, enabling hosts to let out their properties for free and provide shelter to the tens of thousands of people left homeless. However, the company simultaneously received criticism for the poor timing of a concurrent email campaign which promoted an opportunity for holidaymakers to stay in a ‘floating world’, spending a trip in a home on the sea ‘without touching dry land’.
Striking too much of a chord with the events unfolding in Texas, the brand received an overwhelmingly negative backlash on social media:
Considering the major meteorological disaster atm, not sure @Airbnb's email promoting floating cottages was particularly well thought-out.
An Airbnb spokesperson said: ‘The timing of this email marketing campaign was insensitive and we apologise for that. We continue to keep everyone affected by Harvey and all the first responders and their families in our thoughts’.
Whilst Airbnb has thorough disaster response measures, it’s essential for all brands to ensure that their content schedule remains suitably responsive to current events.
5) The North Face challenges Trump’s wall in campaign championing social mobility
Since Trump’s election as President of the USA, talk of the border wall with Mexico continues to dominate the headlines. Leading outerwear brand The North Face alluded to it heavily in their latest campaign, cleverly titled ‘Walls Are Meant for Climbing’.
Integral to the brand’s campaign are themes of unity and community-building, opposing the barriers that divide us. ‘Some people build walls. Other people climb them’, says the print copy. Whilst there’s no direct mention of White House policy, the reference’s political elements are readily apparent.
On a less metaphorical and more practical level, the campaign centres on the brand’s objective of making climbing more universally available. They’ve invested fairly heavily to do this: they donated $1,000,000 to the USA’s Trust for Public Land, for them to build climbing walls and facilities in public spaces across the country, and partnered with gyms across the world to establish August the 19th as a global day of climbing, allowing people to climb, for a day, for free.
The North Face’s campaign placed the brand within an important, relevant and politicised conversation, which – as shown by Pepsi earlier this year – can be a risky line for brands to tread. When doing this, it’s essential for the brand to support marketing efforts with practical and impactful activity. Here, however, the North Face accomplishes this impeccably.
Google has sent notifications to Search Console property owners with sites using insecure HTTP regarding plans to display a “Not Secure” warning in the address bar on pages with input fields.
This is the beginning of a long term push to increase web security by persuading sites to use the secure HTTPS protocol.
Google My Business action URL update
Google has expanded support for “action URLs” within local listings set up using Google My Business.
More business types can now set up additional URLs for online orders, reservations and appointments, which are then displayed within the listing in Maps and search.
This will make it easier for potential customers to carry out common actions with a business.
“Product” label in Google Images
Following last month’s expansion of the display of structured data in Google Image results, thumbnails which contain product information now carry a label on mobile.
At first glance, these appear like ads; however, they are instead triggered by valid Product schema markup on the target page.
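As an illustration (not Google’s own example), valid Product markup in JSON-LD might look something like the sketch below; the product name, image URL and pricing are entirely hypothetical placeholders:

```html
<!-- Hypothetical example of Product schema markup in JSON-LD.
     Pages carrying valid markup along these lines are what trigger
     the "Product" label on image thumbnails. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Camping Tent",
  "image": "https://www.example.com/images/tent.jpg",
  "offers": {
    "@type": "Offer",
    "price": "79.99",
    "priceCurrency": "GBP",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

Markup like this can be sanity-checked with Google’s structured data testing tools before relying on it for search features.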
Google may end support for site name
There is evidence to suggest that Google may soon cease displaying custom site names within search results.
The site name could be set up for a site by using the WebSite schema markup. The documentation for the search feature has, however, been removed from Google’s developer docs site, suggesting a change in support.
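For reference, declaring a site name relied on WebSite markup along these lines; the names and URL below are placeholder values, not a real configuration:

```html
<!-- Hypothetical example of WebSite schema markup used to declare
     a custom site name (and a shorter alternate name) for search. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "name": "Example Site Name",
  "alternateName": "ESN",
  "url": "https://www.example.com/"
}
</script>
```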
There is not yet an official comment on whether this feature has or will be removed.
July proved to be a busy month on the football pitch: the England women’s team demonstrated superb quality throughout the European Championship, defeating France for the first time in forty-three years in the quarter-finals, before losing 3-0 to a clinical Netherlands side in the semis, whilst various high-profile moves consistently dominated the headlines.
Lest we get too carried away, last month was busy for content, too: some campaigns gloriously hit the back of the net, whilst others missed the mark. Join us for post-match highlights of last month’s campaign activity and trends!
Converse goes back to school
Amidst the new trailers for the second season of Stranger Things, Millie Bobby Brown, one of the show’s star actresses, has been making headlines all of her own, teaming up with Converse for a quirky back to school campaign titled ‘First Day Feels’:
Big Spaceship, the agency behind the campaign, converted the video into a range of GIFs, to be shared by leading relevant publications such as Buzzfeed and Teen Vogue.
It’s a neat example of a brand creating compelling content by partnering with a key influencer who bears enormous appeal to the target audience.
Simultaneously, with her performance as Eleven fresh in our memories, Bobby Brown creates a great link between the brand and the Stranger Things series – a show that perfectly hits the retro notes that Converse are looking to replicate.
Whilst Tourism Ireland heads to Westeros
For years, vast parts of HBO’s iconic series Game of Thrones have been shot in Ireland. To commemorate the launch of season seven, Tourism Ireland immortalised the exploits of the series’ characters in a gigantic artwork based on the Bayeux Tapestry. At a colossal seventy-seven metres long, it’s actually seven metres longer than the original!
The artwork is available online in all its glory. The physical original is on show in Belfast’s Ulster Museum.
Football gets creative with transfer announcements
As cheques matching the GDP of small countries continue changing hands, the world’s leading football teams have decided to replicate their investment on the pitch with their creative endeavours off of it, producing imaginative, weird and wonderful videos to announce their new signings’ arrivals.
Striking particularly dramatic notes, Sevilla announced the return of their former captain Jesus Navas:
Whilst last month, Chelsea humorously revealed the signing of AS Roma’s Antonio Rüdiger:
Of all the teams contesting the creative premiership, however, AS Roma look set to clinch the metaphorical title, passing deftly into absurdity with their surreal announcements of Başakşehir midfielder Cengiz Ünder and Manchester City full-back Aleksandar Kolarov.
For Ünder, Roma’s fancy announcement video parodies the concept of YouTube highlight reels:
Airbnb and Audi link up for Bayern Munich giveaway
As one of the world’s leading companies, it makes sense that Airbnb’s campaigns should rank amongst the best. Recently, we covered Airbnb’s partnership with a Chicago gallery that enabled guests to sleep in a real life envisioning of a Van Gogh painting.
Last month, Airbnb partnered with Audi to promote the Audi Cup, a football tournament hosted at Bayern Munich’s Allianz Arena, lacing up their imagination and offering one lucky family the chance to sleep for a night on the stadium’s pitch as the guest of FC Bayern defender Jérôme Boateng.
The prize was a holistic Audi and Bayern experience: a driverless Audi A7 collected the winners from the airport and drove them to the stadium, where they were introduced to the Bayern Munich players and given VIP seats for Bayern’s match against Liverpool. After the match, they were able to catch up with Boateng, one of the world’s best footballers, before getting a good night’s sleep on the green!
Like Converse and Bobby Brown, Audi and Airbnb’s partnership benefitted both brands considerably, especially with football thrown (or rather, kicked) into the mix: both Audi and Airbnb capitalised on promotional opportunities, where their combined contributions strengthened the overall campaign’s effect all the more.
With that, it’s auf Wiedersehn from us! Come back next month for all of August’s content news!
To a mix of excitement and surprise, Google have launched their new Google Posts feature, allowing all Google My Business customers to microblog directly onto the search results stream, enabling brands to reach their audience with unprecedented ease.
The Google Posts interface was first trialled in January 2016, in the build-up to the US election: Google gave electoral candidates the chance to summarise their responses to pressing political concerns in posts of up to 14,400 characters, and then made those responses visible on relevant search queries. Searching for issue X, for instance, would show you the stances of politicians Y and Z towards it.
A year and a half later, Google has completely reimagined the tool and expanded its availability, now enabling all businesses to post content directly to the search feed.
Brands’ posts will be visible for up to seven days before they disappear, exhibited in a scrollable carousel that rotates up to ten posts at a time, in a move that encourages businesses to keep their content fresh and vibrant.
As on Facebook and Twitter, posts can be brought to life with images and photography, although the interface doesn’t currently support GIFs or video. There’s a 300-word limit, and only the first 100 characters appear immediately in the Knowledge Panel, encouraging brands to balance creativity and concision when delivering their message.
There are various ways that posts can be made more actionable: they can be created as ‘events’, causing the content to display for the event’s duration as defined by the user, or they can be rounded off with a call to action, be it a link for users to follow for more information or an ‘add to cart’ function for quick and easy purchasing.
The whole of the interface is superbly tailored for mobile use; it’s clear that mobile search lies at the heart of Google’s bold philosophy and plans for the future.
It ties in beautifully to the company’s ever-expanding focus on local search, empowering small businesses by giving them an even greater opportunity to spread the word of their services through curated content.
Google Posts equally presents a brilliant opportunity to larger multi-location brands, allowing for the publication of bespoke content relating to each store locality.
If there’s an exciting event or a brilliant promotion running in your Leeds store, for example, you’ll be able to use Google Posts to advertise it specifically on the Leeds store’s GMB page.
It’s very new, and there’s certainly scope for several of the interface’s features to be improved, such as widening the list of available calls to action and broadening the reach of the Insights module to provide greater information for analysis.
Needless to say, Google Posts is an exciting direction for Google to be heading in, opening another channel for the creation and promotion of content, and one that brands would be wise to think about, too!
When we think of Cannes, we think of films. The stunning gowns and clothes of the awards ceremony, the gilded prizes, the sunshine rippling on red carpet and Hollywood’s brightest glimmering upon it.
Happening each year in May, Cannes Film Festival is one of the most acclaimed and prestigious events in the entertainment calendar. However, that’s not all the lights, cameras and action that the summer has in store for the glamorous Riviera city.
Every June, the Cannes Lions festival celebrates the greatest achievements in content creation across the globe: showbiz meets SEO and acting meets AdWords, as best actor morphs into best advert and Spielberg into Google.
Across the many categories, so much of the content that’s been nominated is of an exceptionally high standard. Read on for our five favourite pieces from the Cannes Lions prize winners and nominations!
Chicago Gallery Brings Van Gogh to Life With Airbnb
The bedroom of Vincent Van Gogh’s home in Arles is arguably one of the most famous rooms in the history of art: it’s the subject of three paintings by the Dutch master, the first damaged by river flooding and the second and third painted as ‘repetitions’.
Last year, the Art Institute of Chicago had the unprecedented opportunity of presenting all three versions of Van Gogh’s painting in the same exhibition. In the run up to the event, the Institute partnered with agency Leo Burnett, creating a striking campaign that enabled the world to experience Van Gogh’s masterpieces more vividly than ever before:
The gallery and Leo Burnett commissioned a team of artists and designers to recreate the iconic bedroom as a real room, which they then placed on Airbnb for guests to rent out at just $10 a night, including tickets to the exhibition!
It’s a brilliant instance of an impeccable use of technology, mixed with some phenomenal thinking outside of the box and artistry. Life as art turns to art as life. We love it!
Björk Buzzes As VR Music Video Picks Up Grand Prix for Digital Craft
VR took the plaudits this year in the Digital Craft category, and no-one exhibited a better understanding or application of the increasingly-deployed technology than Björk in the sublime music video for her song ‘NOTGET’.
The jury unanimously praised Björk’s masterful and bold deployment of virtual reality, perceiving the video’s VR elements as being essential to the content’s success, profoundly facilitating the telling of its story.
Previously, brands have been criticised for excessively incorporating VR into their content for limited, novelty purposes, adding an advanced UX to material that may otherwise be completely lacklustre. This year saw content creators really adapting to VR’s opportunities; Google won second place in the category for their VR tech, the Google Tilt Brush.
Bank of Aland’s Green Cards Bloom with the Grand Prix for Cyber Tech
As part of a wider UNESCO-supported education programme called ‘The Baltic Sea Project’, the Bank of Åland, who operate throughout Scandinavia, were applauded for their development of environmentally friendly payment cards and awarded the Grand Prix for Cyber in kind.
Made from biodegradable plastic, the cards provide customers with monthly insights into the impact of each transaction on their carbon footprint, advising how they can reduce it in the future.
Overall, it’s a really cool and smart campaign, executed with style and flair, and for a great and relevant cause, too.
Twitter’s Minimalist # Strategy Makes Major Impression
Known for being one of the most happening corners of the Internet, it’s no surprise to see Twitter in the Cannes Lions running. However, you may not be expecting the category in which they won their Grand Prix: Outdoor Advertising!
Using just the iconic Twitter #, the campaign shows a sophisticated, creative understanding of what it is we think of when we think of Twitter, masterfully and succinctly capturing and reflecting the brand’s essence.
Another entrant in the category that caught our eye, and made an enormous, continuing impact on the web, was a campaign led by BETC Paris for French alcohol awareness organisation Addict Aide, titled ‘Like My Addiction’ and based around an influencer: Louise Delage.
From her Insta content, Delage seems like your typical online socialite: a Paris-born bon vivant with over 100,000 followers, jet-setting all across the world to live her flashiest life, regularly uploading stylish content along the way.
Delage’s Instagram presents a person who lives, and loves, to party: there’s a drink in literally every photograph, no matter what she’s doing. Her fans followed her revelry with every like, watching her journey through day, night and the early hours.
Here lies the twist: Louise Delage doesn’t exist; she never has. She’s a character that BETC Paris and Addict Aide created, an online persona on a fake Insta account posting scheduled and studio-crafted content, her social media presence inflated by the use of bot followers and the participation of leading influencers for outreach.
Vividly, and with outstanding creative commitment, the campaign illustrated the difficulty of identifying addiction and reflected back to us, the viewers and users of the Internet, the casual ways in which we can enable such behaviour with every like and share.
June began with the Champions League final, seeing a scintillating Juventus side square up to a fiery Real Madrid.
The match’s twists and turns made a fitting end for May, which proved to be a high-octane month across the spheres of social media, digital campaigns and content.
We watched a burger brand bravely contest the monarchy of Belgium, a PlayStation ad play games with the laws of physics, a shark dramatically steal a 90s popstar’s thunder, and a crisp brand unknowingly create a content piece fusing Gary Lineker with Rebecca Black (a little less conventional than salt and vinegar!).
Read on for post-match highlights, analysis and more of five of the last month’s most noteworthy campaigns!
Walkers wave “the Walkers Wave” goodbye
Walkers’ “Walkers Wave” was an innovative approach to the competition format, but an unfortunate tactical oversight caused the campaign to be abandoned in less than twenty-four hours.
The competition, which presented entrants with the opportunity to win tickets to Cardiff’s Champions League final, revolved around user-generated content: Walkers asked people to send in a selfie, converted it into a video of Gary Lineker holding up the person’s picture, and then automatically tweeted the video back to the user from Walkers’ verified Twitter account.
The process of posting tweets to Walkers’ feed was automated. Herein lies the fatal flaw, for many of the selfies that were submitted were not actually selfies at all.
The campaign, quickly peaking into virality, witnessed Lineker welcoming into the Wave people that ranged from Joe Biden to Rebecca Black, alongside more unpleasant images such as mugshots of criminals.
A high-profile, high-cost campaign, the Walkers Wave was supported by offline media elements such as supersized displays of the social stream in Cardiff city centre, which unfortunately added all the more publicity to the brand’s own goal.
User-generated content catalyses a campaign’s momentum, increasing interest and visibility on social, but it’s vital to integrate only moderated content into your brand’s channels.
As epitomised by last year’s beloved Boaty McBoatface, the denizens of the Internet enjoy a near-endless supply of spanners for every possible works.
It’s best not to take risks that can have serious implications down the line.
Discovery makes a splash with Seal
A whole host of famous figures errantly ended up in Walkers’ waters. Across the pond, the Discovery Channel teamed up with a celebrity of their own to announce this year’s “Shark Week”.
We’re happy to give Discovery a bye over the pressing safety concerns of putting on a show by shark-infested shore, because the ad’s creative risk-taking boldly pays off.
Its bleak and ironic humour moves the nature brand into waters that are largely unexplored by its competitors, whose creative incentives are to either impress a sense of horror or awe: think of Planet Earth 2‘s photography and the commanding voice of Attenborough!
Like a strong keepie uppie, it’s good to keep the football metaphor going. Take creative inspiration from Xavi and Iniesta and look out for creative directions that your competitors aren’t occupying.
There could be a good reason why content creators haven’t followed a particular creative path – e.g. sharks. Equally, there might not be. Just because someone else isn’t doing it doesn’t mean that it’s bad; you could just be the first to spot the opportunity!
Sony enjoys eureka breakthrough with Gravity Rush 2
Strikingly executed and based on a simple yet stunning concept, a Sony advert for the new video game Gravity Rush 2 received an extraordinary response across the web.
Some things defy words, others defy gravity. This incredible creative piece by Tokyo agency Hakuhodo speaks boldly for itself:
Gravity Rush 2 revolves around the player manipulating the laws of gravity. As such, the ad’s a brilliant example of demonstrating a product: it shows, rather than tells, what you’re able to do.
The ad complements the creative flourish with gameplay footage towards the end, giving a full illustration of the product’s functionality and confidently overcoming a problem that the field of game advertising has historically struggled with, sometimes showing inaccurate video animation that doesn’t actually reflect the game at all.
This is an advert that might seem to totally flip the box upside down, rather than merely think outside of it. However, it’s based on a simple premise done exceptionally well: small tweaks to the rules of physics we take for granted, showing the enormous effect that subtly altering the everyday can have!
McDonald’s television ad receives backlash from charities
Social media and charities alike responded harshly to a recent McDonald’s ad, accusing the international fast food giants of manipulating childhood bereavement into a marketing strategy.
Dr Shelley Gilbert, president of Grief Encounter, said that ‘Parents [are] telling us their bereaved children have been upset by the advert, and alienated by McDonald’s as a brand that wants to emotionally manipulate its customers’.
Like last month’s ill-advised Pepsi campaign starring Kendall, it’s vital for brands to only incorporate social issues into their branding work if the product or service has direct relevance.
Without making too much in the way of comment, it seems that, in respect of the above, the Filet-o-Fish certainly doesn’t.
‘Il n’y a pas de place pour deux kings en Belgique’: Burger King lose bid for Belgian succession
We wrote about BK in last month’s round-up and they’ve featured in previous months too, because of their penchant for the absurd throughout their creative:
🍔: Don’t fall in love, kid.
Having just spoken about Maccies, you’d think that Burger King would regard them as their competitors. Not so, apparently, with Burger King last month launching an online campaign against, erm, the King of Belgium, instead.
The campaign, directed by French agency Buzzman, featured a poll on the website www.whoistheking.be that gave users the choice of picking between two kings: the King of Belgium or the king of Burgers.
‘Who deserves the country’s crown? Two Kings. One crown. Who will rule?’ – the Belgian monarchy was not impressed with Burger King’s promotion pic.twitter.com/mChO1a2qPJ
Any attempt to vote for the King of Belgium led to prompts in BK’s favour, such as ‘Are you sure? He won’t cook you fries’.
Perhaps they anticipated it, perhaps they didn’t, but Burger King’s campaign met a guarded response from representatives of the Belgian royal family.
Spokesman Pierre Emmanuel de Bauw said that ‘we would not have given our authorisation’ for the king’s likeness to be used in the material, landing the American brand in a sticky situation.
Moreover, Burger King actually lost the election – albeit narrowly – with 51% of its electorate preferring their current royalty to the House of Hamburger.
The combination of the above led to Burger King stopping the campaign. However, they handled it with flair and creative grace, editing the website, removing ‘King’ from their logo, and declaring in a caption: ‘There isn’t room for two kings in Belgium’.
The brand used controversy for promotional gain, which is an incredibly risky strategy that we wouldn’t necessarily advocate, but it paid off.
They may have lost the election, capturing the attention and ire of the Belgian royals in the process, but BK’s campaign was certainly a success, bringing more attention to their zany, off-kilter branding.
Thanks for dropping by and see you next month – we look forward to finding out what June has to offer!
A number of our reporting tools were showing dramatic changes for sites for a week in the middle of May.
Google spokespeople mention that they make regular updates which cause rankings to fluctuate. However, in cases like this, there is often a specific target for which they are attempting to improve the algorithm.
Google have upgraded the features available in the reviews for hotels in the local listings.
Users can now see specific rating averages from different types of guests. It is also possible to filter the reviews to show only those from third-party sites like Expedia.com and Hotels.com.
Hotels have had some special functionality for a while in the ability to book and see pricing directly from the listings on Google.
Bing Introduces Business Chatbots
Microsoft is beginning to roll out automatic chatbots to local businesses. Initially these are available only to restaurants; once activated, they can be accessed through Bing search results.
The chatbots use information available through Bing Places to answer the questions, otherwise referring to the business’ phone number. They will work with both Facebook Messenger and Microsoft’s Cortana virtual assistant.
Highlights in Local Google Listings
Some locations on Google are beginning to show highlights in their listings on mobile. These are specific perks for visiting consumers.
Examples of highlights shown include “quiet”, “good for kids”, “casual”, “good for groups”, “bar games”, “great cocktails” and “on critics’ lists”.
The data to display these highlights likely comes from responses to questions answered through the Local Guides programme.
Google is Now an Art Expert
Google has updated their Maps and Search functionality to provide more information about works of art and where they are housed.
Search results now display specific details about the artwork and the artist(s) and when the art was created.
In Maps, you can now take a virtual tour of art museums. Google have used visual recognition software to scan the walls, so each artwork is labelled with useful annotations.
Google’s fact check schema markup, introduced during the 2016 US elections, is now out in core search and News results.
Google said: “When you conduct a search on Google that returns an authoritative result containing fact checks for one or more public claims, you will see that information clearly on the search results page.”
Any publisher can include the relevant markup on the page but Google will only display it for what it deems to be “authoritative” sources.
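For illustration, the fact check markup is JSON-LD using the schema.org ClaimReview type. The sketch below builds a minimal, hypothetical example as a Python dict; the claim, URLs and organisation names are placeholders, not real fact checks, and the exact fields Google requires may differ:

```python
import json

# A minimal, illustrative ClaimReview object (schema.org); the claim,
# fact checker and URL below are placeholders.
claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "url": "https://example.com/fact-checks/example-claim",
    "claimReviewed": "Example claim being checked",
    "author": {"@type": "Organization", "name": "Example Fact Checker"},
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": "1",
        "bestRating": "5",
        "worstRating": "1",
        "alternateName": "False",
    },
}

# Serialise to the JSON-LD string that would sit inside a
# <script type="application/ld+json"> tag on the page.
json_ld = json.dumps(claim_review, indent=2)
print(json_ld)
```

Even with valid markup in place, as noted above, Google only displays the fact check label for sources it deems “authoritative”.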
Google Tests “Suggested Clip” for Video
A new Google feature was seen this month involving the video featured snippets.
For how-to queries, Google may suggest a portion of a video, recognising which section of the video contains the answer to the question. Clicking on the link takes you to the appropriate timestamp.
This feature cannot be replicated consistently, suggesting it is just a test.
Style Ideas and Similar Items in Image Search
Google image search results on mobile and in the Android app now display “similar items” for relevant searches.
Similar items will be displayed for a few types of products that contain Product schema markup on site. This currently only applies to handbags, sunglasses and shoes, but the list will be expanding soon.
There is also an upcoming Style Ideas panel which will show similar products for certain clothing searches.
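As a rough sketch of the kind of Product markup the similar items feature reads, the example below builds an illustrative schema.org Product object in Python; the product name, image URL, brand and price are all placeholders:

```python
import json

# Illustrative schema.org Product markup; every value here is a
# placeholder, not a real product.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Leather Handbag",
    "image": "https://example.com/images/handbag.jpg",
    "brand": {"@type": "Brand", "name": "ExampleBrand"},
    "offers": {
        "@type": "Offer",
        "priceCurrency": "GBP",
        "price": "89.00",
        "availability": "https://schema.org/InStock",
    },
}

# The JSON-LD string to embed on the product page.
print(json.dumps(product, indent=2))
```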
Google Owl Promotes Authoritative Content
A new Google update named Project Owl is designed to promote content with more authority.
This will be specifically beneficial around queries that could show offensive or misleading pages.
This also goes hand in hand with new feedback forms implemented for autocomplete and featured snippets.
“Best” Filter in Maps Pack
Google is beginning to filter the map results for local queries containing the words “best”, “outstanding”, “great”, etc.
When one of these searches is conducted, the 3 map locations are filtered to show only those with a 4-star rating.
This makes reviews an even more important part of a physical business’s local strategy.
15% of Google searches are unique
According to an announcement from Google, 15% of the searches it handles each day have never been searched before.
In the same announcement, Google said that it collectively handles over 2 trillion searches per year.
The reaffirmation follows Google’s commitment, via Project Owl, to surface more legitimate sources.
Google Maps reminds you where you parked
Google Maps on Android and iOS can now remind you where you park your car when you set it manually upon arrival at a destination.
On adding the location, it is also possible to include a note and an alert for when your meter is close to running out.
Maps on iOS is already capable of automatically setting a parking location when disconnecting from USB audio or Bluetooth in a new location.
Podcasts appear directly in search results
By including the appropriate markup on the podcast’s website, episode information and an embedded player can be shown directly in the SERPs.
This currently only applies to searches conducted on the Google Search app v6.5 and higher and Google Home, though support will be extending in the near future.
Google Posts program opens in US and Brazil
Google Posts is a new way for companies to pass information directly to Google. These posts then show up for relevant search queries.
The program was originally only available to a small handful of users, including the Clinton and Trump presidential campaigns. It is now being opened up to more organisations, including museums, cinemas, musicians and sports groups.
Spring’s here and busily getting underway: the buds are opening, the birds are singing. Undoubtedly, May will alight with thunder and June will snow us in, but for now everything seems perfectly peachy, particularly because March proved to be such a superb month in content, seeing a sequence of stellar campaigns spanning myriad channels and topics, online and off.
Join us in exploring five of last month’s most vivid and stimulating campaigns. As with February’s entry, we’ve contemplated the factors that made these ads so successful, identifying the key lessons that every brand can take from them to refine and enhance their own image in the future.
With Twin Peaks set to make its long-anticipated return to television in May, the show’s promotion has taken an idiosyncratically Twin Peaks turn.
Last month, billboards began cropping up across North America displaying nothing but a picture of a single cherry pie. With cherry pie a favourite food of the Twin Peaks population, there was little doubt among the show’s die-hard fans of what they were referring to!
That the campaign only makes sense to Twin Peaks fans seems like a risk, signifying a strategy that’s fated to lose rather than gain a potential audience. However, the confidence and directness with which the ad targets the show’s most committed fans is also its greatest strength.
People return to Twin Peaks because of its quirks and the way in which the show regularly and fearlessly gestures to narratorial obliqueness. It’s a show that’s situated as far away from ‘normal’ as television tends to get; marketing it as anything otherwise, just another show to perch at the end of a Netflix queue, would fail to connect with the fans who love Twin Peaks for its weirdness while missing an opportunity to create vivid content to get social channels abuzz!
The campaign summarily made sweeping impressions across social media, alerting viewers to the show’s return and reminding them of the quirkiness for which they loved Twin Peaks to begin with.
There’s so much value in being mindful of the ways in which you’re using your mediums. Could you be utilising your channels to get an even bigger slice of the cherry pie? Seek ways of expressing your brand in the recognisable terms and ideas that inspire your audience to keep returning to it.
As opposed to Twin Peaks‘ quirky promotional campaign for an equally quirky show, Heinz have incorporated AMC’s series Mad Men into their latest campaign to bring a new yet vintage lease of life to their iconic tomato ketchup.
Mad Men follows the story of Don Draper, a fictional advertising exec working in the New York of the 1960s. One plot-line sees Draper actually pitching to Heinz; he proposes a thoroughly minimalist campaign that, to the puzzlement of the Heinz execs, omits the ketchup bottle entirely. Instead, Draper’s concepts simply show photos of the foods that ketchup best accompanies, alongside the caption: ‘Pass the Heinz.’
Mad Men‘s fictitious Heinz responded negatively to Draper’s proposal, but the Heinz of the real world today have made a considerable, belated U-turn and are now running Draper’s ads across the billboards of NYC!
AdWeek spoke to Nicole Kulwicki, Heinz’s head of brand, who said: ‘What we loved about the campaign is that it doesn’t require paragraphs of copy to explain it. It features mouth-watering food images, and all that’s missing is the Heinz.’
The campaign’s a perfect example of a well-known brand maximising the potential of pop culture references to expand a brand’s image, using Mad Men’s artwork to transform Heinz from a household name into a brand with a desirable vintage, emblazoning it with the retro notes that Mad Men’s stylistic flair and lush cinematography emanate.
Think about the potential of pop culture affiliation to shape your brand’s image and consider the exciting and vivid openings to which life combining with art can lead.
Nike pumps the nostalgia to the Air Max
Sunday the 26th of March marked the thirtieth anniversary of the release of Nike’s iconic Air Max 1. Like Twin Peaks shaping its promotion to reflect the aesthetic it’s famous for, Nike celebrated the birthday in quintessentially Nike fashion, evoking the contemporary flair and edge by which the global brand continues to be epitomised, in a series of “fake ads” that they, ironically, commissioned.
Nike collaborated with artists Ava Nirui and Alex Lee to refashion and customise old-school Nike Air Max ads materials, mixing their artworks with Nike’s classic branding.
Originally appearing in Dazed and now making waves across social media, Nike’s innovative collaboration with Nirui and Lee makes a vibrant, stylish example of high-quality content generating correspondingly high-quality conversations.
BT Sport channels Neymar and enjoys sublime night in the Champions League
When Barcelona met Paris Saint-German in the Champions League round of 16 at Camp Nou, it wasn’t really a matter of ‘playing’ a match; it was a rout, a masterclass in getting revenge, as Barça’s battalion of superstars combined to constellate one of the Champions League’s greatest ever performances.
BT Sport broadcast the match on UK television and were able to get in on the action, running a dynamic Twitter campaign that saw the brand’s average interactions increase by a whopping 1,730%. Some golazo for the BT Sport social team!
Rather than merely posting a timeline of the match’s events, the relatively new but leading sport channel posted graphics like GIFs and delivered their content through engaging, creative copy.
We looked last month at the power of creative social posting, vis-à-vis the kooky Tweets that American restaurateurs Denny’s continue serving up, and the same applies here. It’s also a great example of keeping your content current and flowing and allowing your social channel the freedom to start conversations when opportunities for them arise!
Refuge goes viral with moving music video
Content that’s done well is content that creates conversations. After all, the web’s a pretty big animal and every piece of content’s just another drop in the ocean. But the best content gets the ocean going, and when they combined with BRIT-nominated singer Frances to produce a music video for her song ‘Grow’, Refuge, the charity, achieved exactly that.
The video shows an animated woman walking through her daily life, returning at night to a starkly-coloured home, into which the camera never ventures. Only half visible, drawn as if a ghost, the figure becomes a moving, hard-hitting metaphor for the struggle people face to have their voices heard.
Eventually, after encountering someone who offers her a helping hand, the woman becomes fully visible; listened to and supported, she’s able to come alive, as the video’s palette shifts to brighter tones, reifying a final sense of fulfilment and recovery.
Though Refuge only released the video on the 19th March, it’s already reached over 150,000 YouTube views. Seeking to go viral to spread awareness of the support that the charity provides, the campaign is a moving and important example of the impact that well-made content can have!
February may have been a short month, but it delivered content in a big way: we saw Moonlight light up the Oscars with a Best Picture win after a suitably tragicomic mix-up involving La La Land, Beyoncé grace the Grammys with a phenomenal rendition of songs from Lemonade and Leicester City bid adieu to a teary Claudio Ranieri, proclaiming that his dream had “died” after a fairly toothless season from his formerly bellicose foxes.
Mirroring the pace that the world picks up, plenty’s been going on in the campaigns domain, too, from Pancake Day tweets that flipped social media norms upside down to the New York Times incurring the digital wrath of America’s overly-digital Tweeter-in-Chief.
We’ve taken a retrospective look at five of the last month’s best campaigns that got the world abuzz, online and off!
1) Coming Up: One Branded Content Record Breaker
When building a brand persona on social media, there’s usually a degree of semblance between the channel’s tone and the company’s image. If the World Bank started tweeting in lolcats, for instance, we’d be right to be a little confused. However, February saw one company smash the performance record for branded content, with a modus operandi of doing exactly the opposite.
Denny’s describe themselves as “America’s diner […] where guests have come for over 60 years now to sit back, relax and enjoy delicious, hearty meals”. Denny’s Twitter radically leaves behind this retro feel, operating instead in a weird, off-kilter territory that most brands would fear to tread, deftly and rapidly switching between equally hearty doses of irony and earnestness.
One of Denny’s Pancake Day tweets stunned the Internet and marketers alike, picking up over 100,000 retweets and over 150,000 likes:
Denny’s meme-heavy content shows the importance of ensuring that you’re speaking with your audience as opposed to merely speaking to them, situating your brand within the same jokes and trends to which your customers respond.
we promise if we give you pancakes they are your pancakes and we will not take them away
Denny’s’ ongoing Twitter joy shows the importance of being current, staying relevant by ensuring that your brand remains in-time with the rhythms of contemporary life.
2) Cancer Research Installs Smart Benches
Though they’re in an entirely different field, Cancer Research have found similar success by taking a similarly progressive approach.
For this year’s World Cancer Day, the charity led a clever campaign installing smart benches in Lewisham and Islington; these were seating areas complemented by smartphone charging docks, free internet access and a contactless donation panel that allowed people to donate £2 at a time.
For any brand considering an installation, thinking of simple ways to integrate modern technology into the set-up is a wholly worthwhile angle to take.
3) Honda Scales New Heights with its New Civic
Honda’s advert for the newest generation of their classic Civic range arrived in the form of a breath-taking visual spectacle, presenting the remodelled vehicle through the rich metaphor of a person climbing up a mountainside. As the one minute thirty advert progresses, the camera reveals the mountain to be in the shape of a car.
Though it’s not an overly subtle image, it evokes a strong and affective sense of Honda’s timelessness as a brand, whose cars have been leaders on the road for well over fifty years. Simultaneously, the ad’s supremely high production values and sophisticated, polished camerawork balance the old with the new, evincing Honda’s long-term commitment to technological innovation.
It’s quite the far cry from Denny’s’ antics on Twitter; if their joy comes from a meticulously-crafted social channel that takes the brand out of orbit, here Honda have created an ad that matches their voice to perfection.
4) The New York Times Trumps Trump
In 2017, it’s clear that the Information Age has unfortunately spiralled into a Misinformation Age, with the biggest companies on the planet struggling to contain the rising tide of “fake news”. The New York Times have confirmed their commitment to the truth in a Droga5-led campaign titled “The Truth is Hard”, based in print media, online and in platforms such as a New York advertising board.
It’s rare for a campaign to pick up traction in the form of the attention of the President of the USA, but here it did, with President Trump remarking:
For first time the failing @nytimes will take an ad (a bad one) to help save its failing reputation. Try reporting accurately & fairly!
5) Domino’s Gets a Slice of Zelda
With the launch of the Nintendo Switch and the release of the newest addition to the Legend of Zelda saga of games, Domino’s have gone in for a slice of the action with a humorous Zelda/pizza hybrid campaign. The level of detail is relatively uncanny, right down to the pizza box shield and pizza cutter sword. It’s another great example of a brand wholly connecting with its customers’ humour and language, with the graphic picking up a highly positive response across social media channels.
We are very proud to announce that Fusion Unlimited & Halfords have been awarded Retail Marketing Campaign of the Year at this September’s Online Retail Awards.
The special recognition award highlights our combined efforts with Halfords across PPC, SEO, affiliates and content marketing, with Fusion Unlimited coming ahead of competitors across the online retail sectors.
The Online Retail Awards aim to showcase the achievements of online retailers and digital agencies regardless of size, with international and independent businesses nominated in the same space. The awards highlight websites that are “the embodiment of excellence for their customers”, seeking out “examples of retailers’ web, mobile and tablet strategies that offer great online shopping experiences for customers”.
We’ve helped deliver best-in-class digital strategies alongside Halfords for more than 10 years and it’s always rewarding to be recognised for our performance-orientated approach.
Following our accreditation as an ‘Elite’ agency in this year’s Drum Independent Agency Census, 2016 is proving to be a great year for the team and our clients.
On September 23, Google announced that its Penguin filter, designed to devalue sites using link spam to skew results in their favour, now updates in real time. Previously this filter was only updated periodically, and sites penalised by it would remain penalised even if their status improved.
With real-time updates, however, Penguin is more granular, affecting only spammy areas of a given site, rather than the entire thing. It also releases pages upon the next crawl of the site if they have changed for the better.
Google Begins to Show More AMP Results
Google will now begin to show AMP-supported results inside of the standard organic results, as well as in the top stories, which have been displayed since February.
Large non-news companies including eBay and Shopify are now beginning to adopt the technology, which aims to provide content with 4x the speed and 10x less data usage.
While there has been no confirmation of a ranking boost for using AMP, Google is showing a label next to pages that support AMP technology.
Google adds new “Science Datasets” rich data schema
Last month Google introduced a new schema for marking up scientific data, to be used in rich snippets within search results. The schema can be used to display additional metadata about the scientific information, including the author, source and licence.
The types of format relevant to this markup could be:
a table or a CSV file with some data;
a file in a proprietary format that contains data;
a collection of files that together constitute some meaningful dataset;
a structured object with data in some other format that you might want to load into a special tool for processing.
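To make the markup concrete, here’s an illustrative schema.org Dataset object built as a Python dict, covering the metadata mentioned above (author, source and licence). Every name, URL and value is a placeholder, and the field set is a sketch rather than Google’s definitive requirements:

```python
import json

# Illustrative schema.org Dataset markup; all values are placeholders.
dataset = {
    "@context": "https://schema.org",
    "@type": "Dataset",
    "name": "Example Ocean Temperature Readings",
    "description": "Monthly sea-surface temperature readings (illustrative).",
    "url": "https://example.com/datasets/ocean-temps",
    "license": "https://creativecommons.org/licenses/by/4.0/",
    "creator": {"@type": "Organization", "name": "Example Research Institute"},
    # A distribution entry describes one downloadable form of the data,
    # here a CSV file as in the first format listed above.
    "distribution": {
        "@type": "DataDownload",
        "encodingFormat": "text/csv",
        "contentUrl": "https://example.com/datasets/ocean-temps.csv",
    },
}

print(json.dumps(dataset, indent=2))
```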
Last Friday Google confirmed the fourth major update of its Penguin algorithm, “Penguin 4.0”. The news comes nearly two years after the previous update, Penguin 3.0, which on release in late October 2014 affected around 1% of UK/US search results.
Alongside the update Google has announced that Penguin is now part of its core algorithm, effectively meaning that Penguin 4.0 is the last update webmasters will see.
What is Penguin?
First launched in April 2012, Penguin is designed to stop websites seen to be using “spammy” techniques from appearing in Google’s search results. The algorithm looks to identify and penalise sites using “bad links”, which have been bought or acquired in an attempt to boost ranking positions.
Sites caught out by Penguin typically see a sharp drop in ranking positions, with recovery only a possibility after a number of steps have been taken to remove links seen as toxic.
Even after these steps have been taken, a site might not see recovery until the next refresh of the Penguin algorithm. As Penguin has traditionally been refreshed manually, many site owners have faced a long wait for improvements to be seen.
However, with Penguin 4.0 come two important changes.
Penguin 4.0 runs in real time
As part of the core algorithm, Google has said that Penguin will now run on a real time basis, in contrast to the manual refreshes typical of previous updates. This means that if a site is affected by the algorithm, and efforts are made to rectify any issues, then recovery of rankings should take place fairly quickly; basically, as soon as a site is re-crawled and re-indexed.
As Penguin is effectively now running constantly, Google’s Gary Illyes has stated that the company is “not going to comment on future refreshes”. Although not the end of Penguin, this marks the end of the algorithm as most webmasters have come to know it.
Penguin 4.0 is granular
Previously, the Penguin algorithm affected sites in a blanket way; even if only one page had one “bad link”, the whole site could be penalised.
Now, Google has said that Penguin “devalues spam by adjusting ranking based on spam signals, rather than affecting ranking of the whole site”. Rather than a whole site being negatively affected, Penguin will now look to penalise on a page by page, or folder by folder basis. This means that whilst some pages, sections, or folders may receive penalties, others will not be affected.
Google has yet to confirm whether Penguin 4.0 has been fully rolled out, with many predicting that the full update is likely to take place over a few weeks. Although webmasters could pre-empt any negative effects by performing a link detox, it’s positive to know that penalised sites will no longer face a long and frustrating road to recovery.
Google confirms no PageRank loss when using 301, 302, or 30x redirects
It has long been understood that 301 redirects or other forms of 30X redirects would cause PageRank dilution on Google.
Back in February, Google’s John Mueller announced that PageRank is no longer lost for 301 or 302 redirects from HTTP to HTTPS, which was largely assumed to be an effort on Google’s part to encourage the adoption of HTTPS by webmasters. In a tweet on the 26th of July, however, Google’s Gary Illyes made the simple announcement that “30x redirects don’t lose PageRank anymore.” John Mueller later added that this tweet wasn’t intended as new information, but as concrete confirmation. This means that Google has now stated absolutely that 30X redirects do not lose any PageRank weight.
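At the HTTP level, a 30X redirect is just a status code plus a Location header. The minimal sketch below, using only Python’s standard library, shows the shape of a permanent 301 response; the paths and target URL are hypothetical examples of the HTTP-to-HTTPS moves discussed above:

```python
from http import HTTPStatus

# Hypothetical map of moved paths to their new HTTPS homes.
REDIRECTS = {
    "/old-page": "https://www.example.com/new-page",
}

def redirect_response(path):
    """Return (status, headers) for a permanent redirect, or None if
    the path has not moved."""
    target = REDIRECTS.get(path)
    if target is None:
        return None
    # 301 Moved Permanently: crawlers follow the Location header, and
    # per Google's statement above, no PageRank is lost in doing so.
    return (HTTPStatus.MOVED_PERMANENTLY, [("Location", target)])

status, headers = redirect_response("/old-page")
print(status.value, dict(headers)["Location"])  # → 301 https://www.example.com/new-page
```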
Download all landing pages from Google Search Console via Analytics Edge
Google Search Console, or GSC, enables users to peruse a wealth of information straight from Google. The Search Analytics report allows webmasters to view which queries their website ranks for and their relevant landing pages. Search Analytics allows users to view queries, pages, devices and countries, all by date, but is limited by the fact you can only download 1,000 URLs. The solution? Analytics Edge.
Analytics Edge is an Excel add-in which can export landing pages from Google Search Console with ease – all of them. Additionally, users can import pages in bulk. The ability to mass import and export landing pages is especially useful if your site is undergoing a redesign or migration, and looks to be a huge help for webmasters.
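For webmasters comfortable with scripting, the same 1,000-row ceiling can also be worked around by paging through results in batches, much as the Search Analytics API does with its startRow/rowLimit parameters. The sketch below shows just the paging logic; `fake_fetch` is a stand-in for an authenticated API call, not a real client:

```python
def fetch_all_pages(fetch_page, page_size=1000):
    """Collect every row from a paged source.

    `fetch_page(start_row, row_limit)` returns a list of rows, mirroring
    startRow/rowLimit-style paging; an empty list means no data is left.
    """
    rows, start = [], 0
    while True:
        batch = fetch_page(start, page_size)
        if not batch:
            break
        rows.extend(batch)
        start += len(batch)
    return rows

# Stand-in for an authenticated API call: 2,500 fake landing pages.
fake_data = [f"/page-{i}" for i in range(2500)]

def fake_fetch(start_row, row_limit):
    return fake_data[start_row:start_row + row_limit]

all_rows = fetch_all_pages(fake_fetch)
print(len(all_rows))  # → 2500
```

Three batches (1,000 + 1,000 + 500) are fetched before an empty batch ends the loop, so every landing page is collected rather than just the first thousand.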
Possible for Google Featured Snippets To Be Localised
In recent weeks, several posts across social media sites have given examples of Google’s featured snippets being altered depending on the user’s location. In one such example, a user searched ‘ER waiting times’ and was presented with information on their local hospital’s wait times, rather than general information on the subject.
According to Google, this technology isn’t new; however, it seems it isn’t currently rolled out to 100% of searches, with test searches proving unsuccessful. It may be the case that the user’s proximity to the service is factored in, or it may only be used for certain kinds of searches. Either way, it’s an impressive and interesting development for featured snippets.
Google asserts TLD keywords do not influence search results
Last month, an article circulated claiming that a lawyer had changed the top level domain name for his website from .com to .attorney, and had seen a substantial increase in organic traffic. The article suggested that targeted TLD keywords could help boost traffic. However, Google has since stamped out any such rumours.
During a Google Hangout on the 14th of June, John Mueller explained that TLDs crammed with keywords do not factor into Google rankings. Mueller went so far as to state that Google’s algorithm completely ignores top level domain names. Both John Mueller and Gary Illyes from Google urged webmasters not to listen to rumour, and to stick with their current top level domains rather than change them following “vague promises” of increased traffic.
Google to remove custom date range search filter for mobile users
Google is set to remove a further search filter, and this time around it only affects mobile users. In a forum post, Google explained that they are removing the functionality that allows mobile browsers to filter search results by a start and end date. Mobile users wishing to use this filter can still do so via the desktop version of Google.
Instead of the ‘custom range’ time option, mobile Googlers can still search for articles based on pre-specified periods of time since they were posted, such as past hour, past 24 hours and so on.
“After much thought and consideration, Google has decided to retire the Search Custom Date Range Tool on mobile. Today we are starting to gradually unlaunch this feature for all users, as we believe we can create a better experience by focusing on more highly-utilized search features that work seamlessly across both mobile and desktop. Please note that this will still be available on desktop, and all other date restriction tools [e.g., ‘Past hour,’ ‘Past 24 hours,’ ‘Past week,’ ‘Past month,’ ‘Past year’] will remain on mobile.”
Apple brings Siri to Mac, new exposure for non-Google search engines
Apple has announced that it will be bringing its Siri technology to the Mac. While seemingly innocuous, this development means that Mac users will have access to search from the macOS operating system, and the results presented to them will not be solely from Google.
Apple demonstrated the new technology during its Worldwide Developers Conference, showing eager viewers how Siri will work when it arrives on the latest macOS, “Sierra”, later this year. Users will be able to speak their requests to Siri and will receive a selection of results not only from Google, but from competitor search engines such as Bing and Yelp. This could lead to higher usage and increased exposure for these less commonly used search engines.
Google still stands as the default search engine in Apple’s Safari browser, however, as the result of a long-standing deal between the two companies. When the deal expired last year, it was anticipated that Apple would consider dropping Google; however, despite the fact that no formal announcement was ever made, it looks as though the two companies will continue working together.
Google now using ‘Rankbrain’ for every search query
The mysterious Google algorithm, ‘Rankbrain’, has apparently been a success. Since Google first made the announcement of the vaguely defined ranking method last year, its use in searches has gone from 15% to 100%, meaning that all two trillion searches per year are handled with input from Rankbrain. In short, Google is clearly incredibly confident in the ill-defined system, and is now using it to filter every query it receives.
Google has stated that Rankbrain is the third most important criterion when handling a query and selecting the results to present. There are hundreds of factors (or ‘signals’) Google considers when weighing up a query, such as geographical location and whether a website’s headline matches the wording of the query. The fact that Rankbrain is the third most important says something about the nature of the technology being developed.
It is generally assumed that Rankbrain works by internally rearranging the wording of a query to allow for the best quality results. A search along the lines of “best places to go for lunch in Headingley” might be rearranged to “best Leeds restaurants”. In this way, more popular search terms are used in order for Google to extract more in depth and extensive data for its search results.
Voice search reporting may be coming to Search Analytics report
Hints from Google seem to indicate that voice query reporting will feature in the Google Search Console’s Search Analytics report, although the company has remained vague on when that might be.
John Mueller, Google Webmaster Trends Analyst, announced that Google is searching for ways to display voice queries to webmasters via the Google Search Console. Mueller explained that Google are looking at how to separate out whether people searched via voice or keyboard in the Search Analytics report. According to John, Google are seeking to “make it easier to pull out what people have used to search on voice and what people are using by typing. Similar to how we have desktop and mobile set up separately.”
John went on to explain that because voice searches are usually done with long sentences, Google Search Analytics may not detect the search volume for the topic and may group it with less-common keywords. He explained they are still debating internally the best way to circumvent this issue.
Search Analytics report gets update
The way in which Google calculates impressions and clicks in the Search Analytics report within the Google Search Console has been updated. Google posted the following on their ‘data anomalies’ page:
“We refined our standards for calculating clicks and impressions. As a result, you may see a change in the click, impression, and CTR values in the Search Analytics report. A significant part of this change will affect website properties with associated mobile app properties. Specifically, it involves accounting for clicks and impressions only to the associated application property rather than to the website.”
In your Search Analytics report you will see a line saying ‘update’. This is in reference to the new metrics which will come into use as of the 26th of April. John Mueller, Google Webmaster Trends Analyst, explained: “Other changes include how we count links shown in the Knowledge Panel, in various Rich Snippets, and in the local results in Search (which are now all counted as URL impressions).”
While most users noticed no change in their Search Analytics report, Google suggested that mobile users might notice the largest difference.
Google’s mobile-friendly algorithm boost rolls out
Google has rolled out the latest version of its mobile-friendly algorithm, which is designed to provide a ranking boost for mobile-optimised websites in the search results.
Google’s Webmaster Trends Analyst, John Mueller, took to Twitter to make the announcement of the second version of the mobile-favouring algorithm. Google had previously hinted they would boost the algorithm back in March.
Google’s stated intentions with the update are to “increase the effect of the [mobile-friendly] ranking signal.” Additionally, the company has said any sites which are already mobile-friendly needn’t worry, and won’t be affected by the update.
The mobile algorithm is a page-by-page signal, which is the reason the update has taken some time to roll out fully, as Google has to assess each page separately. This means the impact of the update can take time to materialise.
One concerned Tweeter asked John Mueller if this update meant “mobilegeddon”. “No, not really. :)” came the reply.
Google expands featured snippets
Google has begun to use extended feature snippets for certain queries. Featured snippets are the information displayed at the top of a search, before any site results. This information is displayed when Google is able to collect information that it is confident can answer your query immediately.
Now, Google has extended this feature, with ‘related topics’ appearing below lengthened snippets. The related topics contain a brief explanation and links to other Google queries.
Google tests mobile-friendly warnings in search results
A month after Google boosted the mobile-friendly algorithm, Google have changed the way in which they inform site owners that their website is not optimised for mobile users.
When a site owner searches for their own website on their mobile phone, if it’s not optimised, the result for the site will include a small notice above the meta description saying, “Your page is not mobile-friendly”. The message is a hyperlink which, when clicked, takes users to a Google help page with more information about mobile-friendliness. For all other users searching for the website, no such message will be displayed.
Google’s John Mueller has confirmed that the feature is an experiment to see how mobile friendliness can be boosted across the internet.
Sites penalised for free product review links
In the first week of April, Google issued penalties to websites found to be hosting “unnatural outbound links”. Issued by the Google manual actions team, the penalties are aimed at websites linking to other sites with the aim of manipulating Google rankings.
Several days after issuing the penalties, it emerged that the unnatural link building in question was specifically in relation to free product reviews featured by bloggers, in exchange for links.
Following Google’s guidelines issued several weeks earlier advising bloggers to disclose free product and ‘nofollow’ their links, Google has now acted on its warning, and sent out manual actions to those sites that did not comply.
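For reference, “nofollowing” a review link simply means adding the nofollow value to the link’s rel attribute, which tells Google not to pass ranking credit (the URL below is a placeholder):

```html
<!-- A followed link, which passes ranking credit (the practice Google penalised): -->
<a href="https://example.com/product">Great widget</a>

<!-- The compliant version for links given in exchange for free products: -->
<a href="https://example.com/product" rel="nofollow">Great widget</a>
```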
Google sent 4 million messages about search spam last year
Google has announced its latest development in its bid to clean up search results.
Google explained that over 2015 they noticed a 180% increase in websites being hacked compared with 2014, as well as an increase in the number of websites with sparse, low quality content. In order to counter this, Google unveiled their hacked spam algorithm late last year. By sending out 4.3 million manual notices to website owners and webmasters, Google were able to clean up “the vast majority” of the issues stated.
Google saw a 33% increase in the total number of sites going through the reconsideration process. This shows the importance of verifying your website in the Google Search Console, which allows you to receive alerts when Google finds issues with your website.
Additionally, Google received over 400,000 spam reports submitted by users, and was able to act on a whopping 65% of them. Google also aired over 200 Hangouts to help webmasters over the year.
Google reveals its top two ranking signals
Last year Google stated that it considers RankBrain – its machine learning technology – to be the third most important search ranking factor. Whilst this information led to much speculation about what exactly RankBrain is and does, many were more concerned with another question: if RankBrain is only the third most important signal, then what are the other two?
Until last month, this wasn’t information that Google seemed ready to divulge, even after repeated questioning. However, in a Q&A with Google’s Search Quality team, the top two signals were finally revealed: links and content.
This information doesn’t exactly come as a huge surprise; Google has driven home the importance of “quality content” and linkbuilding for years.
Plus, given that RankBrain isn’t so much a signal as a system that uses signals, and that links and content are influenced by a number of factors, this is just the latest in Google’s long list of vague announcements.
Google updates search quality guidelines
Google has released another version of its search quality rater guidelines, less than six months after the release of the previous document.
However, a number of areas appear to have been de-emphasised. Supplementary Content, the potential negative or positive effects of which have been explored in previous documents, now receives much less attention.
On the other hand, areas such as local search – now termed “Visit-in-person” in the updated guidelines – have been emphasised and redrafted. Mobile also receives more attention, with more illustrations of high and poor quality search activity using Mobile search as an example.
Other sections have been completely cut, leading some to believe that Google no longer requires human evaluation of these factors, relying solely on algorithmic evaluation. If anything, the revision of the guidelines so soon after the previous release also illustrates the constantly evolving nature of natural search.
My Business ranking factors documented
Google has updated its help section on improving local rankings, vastly expanding on the previous document with a number of more in depth pieces of advice.
Whilst much of this appears to be common sense – ensure your business is verified, make sure to post accurate opening hours, respond to reviews, add photos of your business – it’s good to have what Google considers important for local business search in writing in one place.
The section frequently mentions and stresses the importance of three factors when creating My Business listings: relevance, distance, and prominence. A listing has relevance if it closely matches the terms users are searching for. Distance refers to how far a listing is from the location term used in a search; e.g. are users searching for a different location than that stated in the listing? Prominence relates to how well known a business is, taking into account existing offsite information such as reviews and articles, and how these can positively affect local rankings.
Google removes right-hand side ads from SERPs
Last month, Google made major changes to the way ads are displayed, removing ads from the right-hand side of SERPs. Instead, a block of four ads will be displayed at the top of the search results, followed by a further block at the bottom of the page. It’s currently stated that the four-ad AdWords block will only appear for “highly commercial queries” (e.g. “new car”, “home insurance”).
This means that in many instances organic listings will be pushed below the fold, with many speculating that this could lead to a lowered click through rate for natural search results.
Although there has been speculation as to how the Google ad changes will affect paid search, as of yet there has been fairly little comment on the changes from an SEO perspective. Alongside this, as Google’s Search Analytics report is currently stalled on February 23rd – coincidentally, around the time of the update – an independent analysis of the effect the changes could have on organic click through rate cannot yet take place.
Changes made to Knowledge Box
As of February 2016, the Knowledge Graph boxes that appear in Google’s SERPs can be manually updated by official representatives of the entity associated with the graph.
Previously, the information in knowledge box results was largely taken from structured data and schema mark-up on relevant websites. For example, if a business wanted its phone number, location, or social profiles to be displayed within its knowledge graph, this information would need to be included within the site’s schema mark-up.
With the recent update, “official representatives” of the company, person, or website associated with the knowledge graph can now “suggest a change” to the graph. However, this does not make schema redundant, and having valid and rich structured data is still important.
Although it’s not guaranteed that suggested changes will actually take place, the update gives a further degree of power to webmasters or account owners in ensuring that their site is visible in the SERPs.
Accelerated Mobile Pages now live
After being trialled on a mobile demo site, Google’s Accelerated Mobile Pages are now showing as live in mobile results for all users.
Initially announced by Google in October 2015, the Accelerated Mobile Pages project is designed to help boost page speed, cut load times, and result in a faster mobile search experience.
The open source initiative is currently backed by around 6000 developers, with thousands of sites currently signed up to show AMPs.
At the time of writing, Google has declined to comment on whether accelerated mobile pages are likely to receive a ranking boost. However, when asked about AMP and natural search, Google’s David Besbris reiterated that page speed and load time are both relevant ranking factors. This hints that at some point in the future, Accelerated Mobile Pages could be treated preferentially in comparison to regular pages.
Study suggests outbound links do affect rankings
Despite Google’s recent statements that outbound links are not a ranking factor, the results of a study suggest that this might not be the case. To test Google’s claims, US based SEO agency Reboot Online conducted an experiment to find out whether outbound links really do have no effect on rankings.
Reboot created 10 new web pages, each containing similar but not identical copy. Each page contained a control nonsense keyword – Phylandocic – that before the experiment resulted in 0 search results. 5 of the 10 webpages linked out to high authority sites like the University of Oxford, and 5 contained no outbound links.
Once the webpages had been crawled, a search for Phylandocic resulted in the 10 webpages being displayed in the SERPs. It was found that the top 5 webpages were those that linked out externally, whereas the bottom 5 were those that had no links whatsoever.
Whilst the conditions of the experiment were by no means natural, the results seem to indicate that linking externally could in fact have some benefit to a site, going against Google’s recent comments.
Google confirms core ranking algorithm update
In mid January Google confirmed that an update to its core ranking algorithm had taken place.
The news came after a number of webmasters reported significant ranking fluctuations, leading many to believe that the impending Penguin update was to blame.
On top of this, Google stated that its Panda algorithm, which is responsible for detecting poor quality or “spammy” content, is now incorporated within its core search algorithm. However, it’s been made clear that Panda wasn’t refreshed within the recent core update.
Whilst details of the exact nature of the update are still thin on the ground, many of the ranking drops reported by webmasters occurred on sites that had thin or poor quality content; publisher websites were particularly hard hit.
With this in mind, and with the Penguin update set to drop soon, webmasters should be checking rankings regularly.
Title tags not a “primary ranking factor”
In a Google Q&A session last month, John Mueller stated that having title tags is not a “primary ranking factor” for a page.
Mueller’s initial statement was that title tags are not a “critical ranking factor”, which led many to assume that tags are not a ranking factor at all.
However, in a clarification Mueller said that “titles are important! They are important for SEO. They are used as a ranking factor”, but that specifically targeting a large number of keywords in a title tag is likely to have negligible benefit for the page. “It’s not worthwhile filling [title tags] with keywords” as this won’t help a page rank, and can be bad for user experience.
Instead, Mueller stated that the main ranking signal on a page is the content, saying that “the actual content on the page” is a critical ranking signal, and that if a page has good content it could in theory rank without a title tag. Whilst this doesn’t mean that title tags have no importance, it’s perhaps a sign for webmasters to rethink how they’re using them.
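As a simple illustration of Mueller’s point, a concise, descriptive title is preferable to a keyword-stuffed one (the business and titles below are hypothetical examples):

```html
<!-- Keyword stuffing: unlikely to help rankings, and bad for user experience -->
<title>Plumber Leeds | Plumbers Leeds | Emergency Plumber | Cheap Plumber Leeds</title>

<!-- A concise, descriptive title covering the same page -->
<title>Emergency Plumbers in Leeds | Example Plumbing</title>
```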
Google updates webmaster guidelines
Last month Google quietly made changes to its webmaster guidelines, the best practice document that acts as a list of dos and don’ts for webmasters.
Whereas previous updates consisted of clarifications or minor edits to existing points, the new changes contain entirely new guidelines, as well as the removal of certain existing ones.
One of the most significant new additions is the recommendation that sites use HTTPS, with Google saying “If possible, secure your site’s connections with HTTPS”. Whilst this has been something Google has informally pushed for a while, the update makes having HTTPS a best practice requirement.
The updates also include optimisation for mobile as an official guideline, stating “Design your site for all device types and sizes, including desktops, tablets, and smartphones.”
Other additions relate to accessibility, and the use of content containing tabs, with Google saying that any important content that might be hidden by a tab should be made “visible by default”.
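In practice, the “all device types and sizes” recommendation usually starts with a responsive viewport declaration and media queries; a minimal sketch (class names are placeholders):

```html
<head>
  <!-- Let mobile browsers render at the device's natural width -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    /* Two columns by default, single column on narrow screens */
    .content { display: flex; }
    @media (max-width: 600px) {
      .content { flex-direction: column; }
    }
  </style>
</head>
```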
Outbound links not a ranking factor
In a recent Google hangout, Google’s John Mueller cleared up the question whether linking out externally to high quality websites provides any benefit.
When asked if Google considers external links to other sites as a ranking factor, Mueller stated “external links to other sites, so links from your site to other people’s sites isn’t specifically a ranking factor”.
Whilst not commonly accepted as a ranking factor, external linking has been viewed by some SEOs as something that could bring a small benefit to the linking site, in part due to the fact that inbound links do have a ranking benefit.
However, Mueller says that the only potential benefit external links have is that they can “bring value to your content”.
HTTPS pages to be indexed by default
Last month Google continued its increasing focus on secure search, announcing that HTTPS pages will be automatically indexed in preference to HTTP pages when possible.
In a post on the webmaster central blog, Zineb Ait Bahajji of Google’s indexing team stated that Google will now “start crawling HTTPS equivalents of HTTP pages, even when the former are not linked to from any page”.
Bahajji described this as a “natural continuation” of Google’s historical preference for HTTPS on Gmail, YouTube, and its own search platforms, but also of more recent moves like giving slight ranking boosts to HTTPS sites.
Although HTTPS URLs will now be indexed “by default” over HTTP URLs when possible, this does depend on a few criteria. For example, if an HTTPS site is blocked from being crawled by robots.txt then it won’t be indexed, nor will it be indexed if it contains a “noindex” tag or a rel="canonical" pointing to an HTTP page.
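Two of those blockers live in the page’s own markup; as a rough illustration (example.com is a placeholder, and a robots.txt Disallow on the HTTPS host would be the third blocker):

```html
<!-- A "noindex" robots meta tag keeps the HTTPS page out of the index: -->
<meta name="robots" content="noindex">

<!-- A canonical pointing back at the HTTP version also prevents the
     HTTPS URL from being preferred: -->
<link rel="canonical" href="http://www.example.com/page">
```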
Summing up the move, Bahajji stated that “by showing users HTTPS pages in our search results, we’re hoping to decrease the risk for users to browse a website over an insecure connection and making themselves vulnerable to content injection attacks”.
Penguin 4.0 delayed until early 2016
Although initially scheduled for release in late 2015, in December Google announced that the real time Penguin update would be put on hold until early 2016.
Speaking to Search Engine Land, a spokesperson for Google stated that “With the holidays upon us, it looks like the penguins won’t march until next year”. Google have previously had few qualms with releasing new algorithm updates during busy or inconvenient periods, which suggests that rather than being a “Christmas gift” to webmasters, the delay is likely due to Penguin 4.0 being incomplete and unfit to release.
While it can be assumed that the update will happen sometime during January, as usual Google have been reluctant to give any exact date or time-scales. When asked about the expected date of Penguin 4.0, Google’s Webmaster Trends Analyst John Mueller responded “I’m pretty confident it’s good for January, but I really don’t want to make any promises on that”.
As ever, all webmasters can do is sit, wait, and ensure that they’re ready for the update to happen, whenever that might be.
“Accelerated Mobile Pages” set to launch in Feb 2016
The “Accelerated Mobile Pages” project, which aims to be an easily accessible way to improve page load time, is set to launch in February 2016 according to Richard Gingras, Google’s Head of News.
Gingras says that the Google backed project – which sites and publishers must first opt into – has been tipped for a launch as soon as late February. In tests, AMP pages have been shown to load four times faster than traditional mobile pages, using around eight times less data.
A number of high profile sites like blogging platform WordPress, Social Media site Pinterest, and messaging app Viber have so far agreed to link to AMP content, whilst Google says that it’s own Analytics tool will have AMP support by late January 2016.
Perhaps the most SEO relevant piece of information Gingras revealed is that AMP pages could receive a ranking boost, and sites that contain links to AMP may be prioritised in search results. Alongside this, it’s been hinted that AMP may receive some kind of “fast” label in the search results. Whilst the project is still in its early days, SEOs should keep an eye out and weigh up the potential benefits of incorporating AMPs.
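For webmasters weighing up the project, the open source AMP HTML format requires a small amount of fixed boilerplate; a simplified skeleton is sketched below (the URL in the canonical link is a placeholder, and the required amp-boilerplate style tags are omitted for brevity):

```html
<!doctype html>
<html ⚡>
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width">
  <!-- Point back at the regular (non-AMP) version of the page -->
  <link rel="canonical" href="https://www.example.com/article">
  <!-- The AMP runtime, loaded asynchronously -->
  <script async src="https://cdn.ampproject.org/v0.js"></script>
  <!-- Required amp-boilerplate <style> tags omitted for brevity -->
</head>
<body>
  <h1>Article headline</h1>
</body>
</html>
```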
Google reveals top search terms of 2015
As in previous years, December 2015 saw Google release information on its top trending topics for the year. The data gives a good insight into what the world was talking about last year, with search information taken from Google users across the globe.
American Basketball player Lamar Odom, who made news after being hospitalised in mid October, was the most searched person and topic of the whole year. Other top searches highlight an interest in films, with Jurassic World, American Sniper, and Straight Outta Compton all featuring in the top 10 overall most searched terms.
Google also released the top 10 searches for a number of categories, including male and female celebrities, music acts, news moments, politicians, and sportspeople. Perhaps unsurprisingly, Jeremy Clarkson and Cilla Black came in at number 1 in the male and female celebrity categories respectively, whilst it’s again no big shock that Adele was the most searched musician of the year.
Things are a little less predictable when we look at the “What/Where/Who is..?” data, with “Who is Lucy the Australopithecus?” and “Who is Ronnie Pickering?” showing an unlikely twin global interest in prehistory and You’ve Been Framed style candid camera footage. May that continue into 2016.
Google removes location filter from search results
The location filter in Google’s search results page has been changed, with users now unable to manually choose which area they see search results from.
Changes to the feature were recorded throughout November, with many assuming that these were likely to just be testing on Google’s part; features are often rolled out (or removed) on a sporadic basis for short periods of time, before reverting back to normal.
However, it appears that as of late November, the filter has been dropped altogether. Google confirmed this, stating the filter “was getting very little usage, so we’re focusing on other features”.
The feature allowed users to manually see search results for a specific location, which could either be a country or a smaller town or region. However, Google is now only showing results based on what it knows of a user’s precise location, meaning that users can no longer see the search results for locations they’re not physically in.
Updated search quality guidelines released
November saw Google self-release the full version of its quality rater handbook for the first time, after it was previously leaked by an independent source. This handbook contains the guidelines used by search quality raters, who are responsible for reviewing sites Google flags up for manual action.
The guide still asks raters to specifically look out for what Google calls a site’s E-A-T: expertise, authoritativeness, and trustworthiness. This basically means checking for a number of factors considered positive – authoritative non-user sourced content, knowledge graphs – alongside negative indicators like distracting advertisements, hidden text, or a general lack of purpose.
Alongside this, the new guidelines place a big emphasis on mobile factors, which essentially encompasses what we already know through mobile friendly guidelines. In fact, the guidelines state that if a site is already flagged as not mobile friendly, it will automatically receive a low quality rating. As ever, this means that webmasters need to be on the ball when it comes to mobile, and treat it as being just as important as desktop; something that Google has emphasised throughout 2015.
Google begins indexing app only content
Content only available within apps is now indexed and being shown within search results, Google has announced.
Previously, content within apps would only have been indexed and displayed if Google found a web page that corresponds to or mirrors this content. However, Google is now displaying app content that has no corresponding web page. Users that have the app installed can click through to open it, whilst those that don’t can view a stream of the app that is stored within Google’s cloud.
At present, only content from certain “partners” is being displayed, including the U.S National Parks app Chimani, hotel booking app Hotel Tonight, and horoscope apps Daily Horoscope and My Horoscope. However, it’s likely that this feature will be rolled out on a wider scale in future. Speaking about the changes, Google’s Engineering Manager Jennifer Lin said “Because we recognize that there’s a lot of great content that lives only in apps, starting today, we’ll be able to show some “app-first” content in Search as well.”
The new feature represents a significant marketing step for businesses with apps, who may now have the opportunity to target potential customers with relevant information contained only within in-app content.
New Penguin set to be an update, not refresh
In October Google announced that the next update of Penguin is likely to happen before the end of 2015, and is set to roll out continuously in real time.
Alongside this, Google’s Gary Illyes has confirmed that the next Penguin is set to be a true update, rather than a refresh. This means that rather than simply being a re-run of the previous Penguin update (released in 2013), the upcoming Penguin will incorporate new ranking signals. As such, sites that were not affected by the last refresh could very well be at risk of being hit this time round.
Penguin was initially released to penalise sites viewed as engaging in “spammy” tactics, targeting sites that engage in keyword stuffing and manipulative link building. Sites hit by Penguin in the past would have had to wait long periods to recover, essentially until the algorithm either refreshes or updates again. However, with Penguin being rolled out in real time, the hope is that the recovery time will be much faster.
The exact date for the next Penguin update still isn’t known, although webmasters should be making efforts to ensure they’re well prepared for a surprise roll-out, and shouldn’t expect any warning from Google when this happens.
Google reveals RankBrain machine learning signal
Google has added a new signal to its search algorithm in order to help process and understand search queries.
Officially named RankBrain, the signal is a machine learning artificial intelligence which Google says it has been using to process “a very large fraction” of search queries. As around 15% of daily searches made through Google have never been searched before, the RankBrain system is designed to help Google identify, understand, and categorise these alongside similar queries.
It’s thought that RankBrain is not a standalone algorithm, but rather the newest addition to Hummingbird, Google’s search algorithm. Greg Corrado, a senior research scientist at Google, semi-confirmed this: “RankBrain is one of the “hundreds” of signals that go into an algorithm that determines what results appear on a Google search page and where they are ranked”.
Alongside this, Corrado also stated that in the few months RankBrain has been operating, it has become the third most important signal that contributes to search results. Whilst it isn’t currently known exactly how the system will affect SEO, you can read a more in depth post on RankBrain here.
Penguin 4.0 expected to roll out during 2015
After months of hinting, Google have confirmed that the new Penguin 4.0 algorithm update is expected to roll out before the end of 2015.
As with the previous months’ Panda updates, Penguin 4.0 – which targets links that Google identifies as “spammy” or unnatural – is set to be implemented in real time.
This means that rather than updating sporadically, Penguin will be constantly working to detect unnatural linking practices, with penalties given out to affected sites on a real time basis. Conversely, if a site is hit by Penguin and takes steps to rectify unnatural behaviour, then in theory penalty recovery should take place equally quickly.
As always, Google seem unwilling to let slip the exact date that the update is set to take place; either that, or they don’t actually know, which seems fairly unlikely. However, so long as sites are following Google’s best practice guidelines on link building, then the update shouldn’t be anything to be too concerned about.
New report reveals important ranking factors for mobile
New research carried out by Searchmetrics has provided an insight into the importance of a range of mobile ranking factors. The research looks at the top ranked pages for mobile search results in 2015 and 2014 and desktop results in 2015, identifying trends across these.
Content factors featured heavily within the research, largely comparing the requirements for mobile content when compared to desktop. For example, it was found that the top 10 ranked pages for mobile in 2015 had an average of 868 words compared to 1285 for desktop. Mobile pages also had a lower average keyword count at 5.48, compared to 10.22 for desktop.
User experience features were also found to play a big part in mobile rankings, with high ranking mobile pages having fewer internal links, a higher prevalence of unordered lists, and a lower number of images than high ranking desktop pages.
Other important factors included a fast load speed (top pages averaged a 1.10s load time), and an avoidance of flash, with only 5% of the top ranked pages incorporating this.
Overall, the report suggests that the factors that influence search rankings in mobile are slightly different, although not entirely separate, from those that influence desktop.
Google warns webmasters not to use “sneaky” mobile redirects
Last month Google reiterated its policy on mobile redirects, stating that sites that redirect in a “sneaky” way can expect to receive penalties.
The warning is directed at webmasters who implement redirects to unrelated content on mobile landing pages. This means that when a user clicks through to the site from the SERPs, they will be directed to an unrelated page, often without knowing.
This isn’t always something implemented by webmasters, and can also indicate that a site has been hacked. Whatever the reason, these kinds of redirects are against webmaster guidelines, and the new announcement suggests that Google will be placing a bigger emphasis on identifying and penalising sites acting in this way.
This isn’t to say that all mobile redirects are being targeted, as Google’s Search Quality team stated: “Redirecting mobile users to improve their mobile experience (like redirecting mobile users from example.com/url1 to m.example.com/url1) is often beneficial to them. But redirecting mobile users sneakily to a different content is bad for user experience and is against Google’s webmaster guidelines.”
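The legitimate pattern in Google’s example – example.com/url1 sending mobile users to m.example.com/url1, never to unrelated content – can be sketched as below. This is an illustrative sketch only, not Google-endorsed code; the `m.` subdomain convention and the user-agent tokens are assumptions.

```python
from urllib.parse import urlsplit, urlunsplit

# Crude user-agent tokens used only for illustration.
MOBILE_TOKENS = ("mobile", "android", "iphone")

def mobile_equivalent(url: str) -> str:
    """Map a desktop URL to the equivalent URL on the m. subdomain,
    preserving the path so the user lands on the same content."""
    parts = urlsplit(url)
    host = parts.netloc
    if not host.startswith("m."):
        if host.startswith("www."):
            host = host[4:]  # strip "www." before prefixing "m."
        host = "m." + host
    return urlunsplit((parts.scheme, host, parts.path, parts.query, parts.fragment))

def redirect_target(url: str, user_agent: str) -> str:
    """Mobile users get the equivalent mobile page; everyone else
    stays where they are - never an unrelated destination."""
    if any(token in user_agent.lower() for token in MOBILE_TOKENS):
        return mobile_equivalent(url)
    return url
```

The key point is that the path is carried over unchanged; a “sneaky” redirect is one where the mobile destination bears no relation to the page the user clicked on.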
Google warns of harsh penalties for repeated guideline violations
Google has warned that sites found repeatedly violating webmaster guidelines, or that have a “clear intention to spam”, could face harsher manual action penalties.
Usually, if a site receives a manual penalty for violating guidelines, they need to rectify the violation and send a reconsideration request to Google in order for this to be revoked.
However, if after a positive reconsideration request a site then proceeds to further violate guidelines, the new blog post states that “further action” will be undertaken.
This “further action” will make any future reconsideration requests more difficult to carry out, less likely to be accepted, and in general reduce the chance of any manual actions being removed.
Summing this up, Google state that “In order to avoid such situations, we recommend that webmasters avoid violating our Webmaster Guidelines [in the first place], let alone [repeat this]”.
HTTPS acts as a “tiebreaker” in search results
In a recent video hangout, Google’s Webmaster Trends Analyst Gary Illyes emphasised again the slight ranking boost given to HTTPS sites, clarifying it as a “tiebreaker”.
In situations where the quality signals for two separate sites are essentially equal, if one site is on HTTP and one is on HTTPS, the HTTPS site will be given a slight boost. This reflects Google’s recent attitude towards HTTPS; whilst Google doesn’t regard encryption as essential, it is heavily recommended.
This doesn’t mean that HTTP is viewed as a negative by Google, and Illyes clarified that it’s still “perfectly fine” for a website to not be HTTPS.
However, whilst having a site on HTTPS alone isn’t enough to result in a positive SERP ranking, “if you’re in a competitive niche, then it can give you an edge from Google’s point of view”.
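Illyes’s description amounts to HTTPS acting as a secondary sort key: only when two results’ quality signals tie does the protocol decide the order. A toy illustration of that idea (the scores and the scoring model are invented for the example, not Google’s actual ranking code):

```python
def rank(results):
    """Sort results by quality score (descending); among equal scores,
    HTTPS pages sort ahead of HTTP ones - the 'tiebreaker'."""
    return sorted(
        results,
        key=lambda r: (-r["quality"], 0 if r["url"].startswith("https://") else 1),
    )

pages = [
    {"url": "http://a.example/", "quality": 0.9},
    {"url": "https://b.example/", "quality": 0.9},   # ties on quality, wins on HTTPS
    {"url": "http://c.example/", "quality": 0.95},   # higher quality still ranks first
]
```

Note that the higher-quality HTTP page still outranks both: the protocol only ever breaks ties, it never outweighs quality.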
Google hints that structured data could be used as a ranking factor
Although structured data relevant to a specific site or niche works to make a site’s SERP snippets richer, and in turn can potentially improve CTR, it’s not something currently used by Google as a ranking factor.
However, new comments from Google’s John Mueller – alongside the fact that Google now issues penalties for improper schema implementation – suggest that this could change in future. Acknowledging the usefulness that structured markup can have for users, Mueller stated that “over time, I think it is something that might go into the rankings”.
Mueller gave a brief example of how this might work, saying that in a case where a person is searching for a car, “we can say oh well, we have these pages that are marked up with structured data for a car, so probably they are pretty useful in that regard”.
However, it was emphasised that this wouldn’t be used as a sole ranking signal, and that a site would need to have “good content” as well being technically sound in order to benefit from any potential structured data ranking factors.
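Mueller’s car example corresponds to marking up a specific item with schema.org vocabulary. A minimal sketch, built here in Python for illustration – the property values are invented, and this is one plausible shape for the markup rather than a Google-mandated template:

```python
import json

# Minimal schema.org "Car" markup for a single specific item, as it
# might be embedded in a page via a <script type="application/ld+json"> tag.
car_markup = {
    "@context": "https://schema.org",
    "@type": "Car",
    "name": "Example Hatchback 1.2",  # invented example values
    "brand": {"@type": "Brand", "name": "ExampleMotors"},
    "vehicleTransmission": "Manual",
    "fuelType": "Petrol",
}

json_ld = json.dumps(car_markup, indent=2)
```

Marking up the specific item (rather than a whole page or category) is exactly the distinction Google’s updated rich snippet policy draws.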
Study finds increased CTR on position 2 results with rich snippets
Market research company Blue Nile Research has suggested in a new study that rich snippets could shift CTR percentage from position 1 to position 2.
The study compared responses to three scenarios; a result in position 1 with no rich snippets, a result in position 2 with rich snippets (such as stars, images, videos etc), and a result in position 2 with no rich snippets.
A comparison of clicks for each scenario found a 61% click share for the position 2 result with rich media, whereas the position 1 result with no rich media had only a 48% click share. Meanwhile, position 2 with no rich snippets had the lowest click share at 35%.
The study looked at the search habits of 300 people in a lab environment, and as such doesn’t necessarily give the most accurate representation of natural user activity. However, it does suggest that structured markup and rich snippets have a valid part to play when considering how to boost click through rate.
Google says linking externally has no SEO benefit
Although it’s common knowledge that gaining links from good quality sites can have a positive SEO benefit, the effect of linking out externally hasn’t always been as clear cut.
It’s often been thought that whilst not comparable to earning links, linking to external sites could provide a marginal search benefit. Although never explicitly confirmed, this belief has been reinforced by Google; in 2009, Matt Cutts stated that “in the same way that Google trusts sites less when they link to spammy sites or bad neighbourhoods, parts of our system encourage links to good sites”.
However, new comments have suggested that this isn’t the case. When asked “is there a benefit of referencing external useful sites within your content?”, Google’s John Mueller clarified that “It is not something that we would say that there is any SEO advantage of”, but that “if you think this is a link that helps users understand your site better, then maybe that makes sense.”
So, although linking externally appears to have no direct SEO benefit, it should still be considered a valuable part of creating a user-friendly site architecture.
In July 2015, Google announced a “slow-rollout” of Panda, saying that the latest update would be continuous and occurring over a larger space of time than previously. For this reason, it was suggested that websites might not notice ranking increases or decreases immediately.
After a month of Panda 4.2, reports from webmasters have been mixed, with many sites as of yet seeing little to no influence.
Other webmasters have noticed short term ranking increases or decreases occurring over 1-2 week periods, only for a site to return to its pre-Panda standing. This has led some to speculate as to whether the 4.2 update has been “reversed”.
However, it’s more likely that these ups and downs are simply due to the slower nature of the 4.2 refresh; in fact, Google warned that this rollout may result in ranking fluctuations. As such, webmasters shouldn’t accept any ranking changes attributed to Panda 4.2 as permanent, and should anticipate subsequent fluctuations as the refresh continues.
Going “mobile only” is fine
Websites operating with only a mobile version, and without a desktop version, will not see adverse ranking effects, so says Google’s Webmaster Trends Analyst John Mueller.
This statement follows the mobile friendly update rolled out earlier in the year, after which not having a mobile friendly site could have a negative effect on a site’s search rankings.
Mueller states that “you definitely do not need a specific desktop website in addition to a mobile website”, so long as you “make sure that desktop users can still see some [of your site’s] content”.
This means that so long as sites optimised for mobiles and tablets are still usable on a desktop device, it isn’t a necessity for a separate desktop site to be created. However, a mobile site must still be properly optimised along Google’s guidelines in order to rank well within mobile results.
Google clarifies position on soft 404 response codes
It’s common knowledge that pages returning a 404 error code are not indexed by Google. In fact, Google even recommends returning a 404 for pages that have “bad links” pointing to them if these links can’t be removed.
However, it’s not widely known how Google treats so called “soft 404s”; pages that should be returning a 404 code, but actually return a 200 “ok” status code.
Recently, Google’s Gary Illyes and John Mueller both gave similar responses when asked about soft 404s. Illyes said that soft 404 responses are treated like 404s, and thus pages where they occur are not indexed.
However, Mueller expanded on this a little, stating that whilst soft 404s aren’t indexed (and thus any links pointing to them have no influence on a site’s ranking), Google first needs to work out that a page is a soft 404; something that Mueller states can be “difficult”. As such, before a soft 404 page is identified, it will be indexed, potentially carrying equity – positive or negative – from links pointing to it. Once the page is identified as a soft 404, it is treated like a true 404 rather than a 200 response, and will no longer be indexed.
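The difficulty Mueller mentions arises because a soft 404 looks normal at the protocol level: the status code is 200, and only the page content reveals it’s really an error page. A crude heuristic sketch of that detection idea – the phrase list and the whole approach are assumptions for illustration, not Google’s actual classifier:

```python
# Phrases that commonly appear on error pages served with a 200 status.
# This list is an invented example, not an authoritative set.
NOT_FOUND_PHRASES = ("page not found", "no longer available", "does not exist")

def looks_like_soft_404(status_code: int, body: str) -> bool:
    """Flag responses that return 200 'ok' but whose content reads
    like an error page - the 'soft 404' pattern described above."""
    if status_code != 200:
        return False  # a real 404 is not a *soft* 404
    text = body.lower()
    return any(phrase in text for phrase in NOT_FOUND_PHRASES)
```

A content-based check like this is inherently fuzzy, which is presumably why pages can be indexed for a while before Google works out that they are soft 404s.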
Moz releases 2015 search ranking factors study
SEO software company Moz has released its annual search study, based on a survey of “over 150 leading search marketers” giving their “expert opinions on over 90 ranking factors”. Factors were rated on a scale of 1-10, with 10 being most influential and 1 being least.
The study shows that links remain a strong perceived ranking factor, with link features rated both 1st and 2nd highest by those surveyed. Keyword related factors were also rated as strong, coming in at 3rd, 4th, 7th and 8th, whilst engagement was rated as the 5th most important ranking factor.
Although the results are taken from a relatively small pool, they do serve to reinforce the importance that basic SEO factors can have on a site’s ranking. Again, just because a factor is rated lower (e.g. social) doesn’t mean its influence is negligible; good rankings come from a range of these factors combined, rather than time invested solely in one area.
Google’s business model restructured
On August 10th 2015, Google CEO Larry Page announced the formation of “Alphabet”, a public holding company for Google and its subsidiary businesses. Whilst Google will still be the largest business under the Alphabet umbrella, the restructure will result in a “slimmed down” and more streamlined Google. Following the restructure, Page will become CEO of Alphabet, with Google’s current product chief Sundar Pichai taking his place.
Alongside the creation of Alphabet, Google has received a brand update. The company has revealed a new logo and logo icons, and is slowly revealing updated search results pages. So far, there has been a large focus on mobile usability, mainly on Google’s own Android devices.
It’s unlikely that the restructure or re-branding process will have any direct SEO implications. However, the introduction of new usability features suggests that further prerequisites for mobile usability or schema mark-up could be implemented in future.
After months of speculation, Google have stated that as of 22nd July 2015, the Panda 4.2 update is now rolling out. Panda last updated around 10 months ago in September 2014, making this the longest gap between updates so far.
The rollout means that sites penalised by the last update – which affected sites with “poor quality” content – should in theory be able to recover, providing they’ve taken steps to fall in line with Google’s recommendations.
However, unlike previous updates, webmasters are unlikely to notice these changes immediately. That’s because Panda 4.2 is rolling out at a much slower rate than usual, meaning that any changes to rankings are likely to take place over a much longer period of time.
Speaking about the update, Google’s Webmaster Trends Analyst John Mueller stated that Panda 4.2 is updating at a slower rate than normal due to “technical reasons” and an “internal issue”. With this in mind, it could take months for webmasters to see any positive or negative influence.
Whilst recommendations not to block JS and CSS may have been in place for a while, this is the first time that webmasters have been nudged en masse towards rectifying this.
However, webmasters in receipt of this notification shouldn’t worry, as it appears to be a widespread and often general warning. If you did receive this notification, the best course of action is to simply follow the steps within.
Google says all generic Top Level Domains treated the same
Back in 2014, the rules around generic Top Level Domains (e.g. .com, .org) changed, essentially allowing for a whole new and unrestricted range to be created.
These changes brought on much speculation about how new gTLDs would be treated by Google, with many assuming that certain domains would receive preferential treatment. This was especially the case for geo-specific gTLDs (e.g. .london), which it was commonly assumed would rank higher in their respective locations.
However, in a recent Webmaster Central Blog post, Google’s John Mueller cleared up some misconceptions. As it turns out, the new gTLDs are handled in the same way as the old gTLDs, with Mueller stating that “our systems treat new gTLDs like other gTLDs… keywords in a TLD do not give any advantage or disadvantage in search”. The same goes for geo-specific TLDs (although “there may be exceptions down the line”). However, Google does use country code top-level domains (like .uk) to geo-target websites, but this is something that was already known.
Google summed up the rules around gTLDs as “if you spot a domain name on a new TLD that you really like, you’re keen on using it for longer, and understand there’s no magical SEO bonus, then go for it”.
Google clarifies position on asking for links
During July, a small post on the Portuguese Google webmaster blog attracted the attention of many in the SEO community, after it said that asking for links could result in a penalty from Google.
The translated post, with original emphasis, reads “let some advice to [sic] ensure you that you are not violating Google’s guidelines: do not buy, sell, exchange or ask for links”.
Linkbuilding has long been a standard procedure for SEOs, and as such the implication that this practice inherently falls outside of Google guidelines was news to many.
However, Google later altered the post to read “do not buy, sell, exchange or ask for links that may violate our linking webmaster guidelines”. So, asking for links and linkbuilding is not a violation of guidelines, so long as this is done in a manner Google approves of.
At the start of the month, Google’s Webmaster Trends Analyst Gary Illyes warned to expect a Panda update in the coming weeks. In typical Google fashion, Illyes was relatively vague in giving the exact timeframe of the expected update, being only as specific as “two to four weeks”.
Rather than an algorithm update, the most recent rollout is stated to just be a “data refresh”, meaning that sites hit by previous updates could potentially recover. However, this doesn’t mean that the refresh won’t have an impact on previously unaffected sites, and as always there is a risk of sites with poor quality content being hit.
At the time of writing – around 4 weeks since the announcement – signs of the expected content quality update have yet to be noticed, meaning that the update could come at any time.
Google updates core search algorithm
In mid-June, webmasters reported seeing significant ranking changes, something of course first blamed on the expected Panda update. However, clarification from Google revealed that any ranking changes were most likely to be due to an update to the Core Search Algorithm, as the Panda updates had yet to take place.
Whilst Google regularly makes updates to its core search algorithms, it’s rare for these changes to have such a large effect on rankings. With this in mind, many have searched for other explanations for the ranking changes. One frequently cited influencing factor is that Wikipedia – often a number 1 search result – decided to change all its URLs from HTTP to HTTPS. This could have affected the top 5 rankings for many searches, thus causing such a fluctuation in rankings across the board.
As always Google have been tight lipped, meaning that any explanations can only really be speculation.
Penalties now issued for improper schema implementation
Penalties issued as a result of May’s “Quality Update” have led to a reading between the lines of Google’s policies on structured data. In the aftermath of the update, some webmasters were given penalties relating to site schema; data used in order to show rich snippets, which can improve organic search visibility.
In March 2015, Google updated its policies on rich snippet markup, stating that this may only be placed on specific items and not whole pages or categories. However, either due to lack of awareness of the updated policy or misunderstanding of how to correctly implement markup, many sites were hit with warnings and penalties.
As such, to avoid penalties, it’s recommended that webmasters become au-fait with Google’s policies before implementing markup. Google also has a Structured Data Testing Tool, allowing developers to check whether markup is correctly implemented before making any real changes.
Google tests “slow to load” mobile results label
This month, some mobile users have reported seeing “slow to load” labels in the mobile search results page. The labels – as seen in the right example – are designed to indicate to users that certain pages may take longer than average to load, or not load at all.
At the moment, the “slow to load” label is in testing, and as such not all users will be able to see them. A similar label was placed into testing back in February, indicating that some form of labelling for mobile devices is likely to be introduced fully at some stage.
However, there has been much speculation as to what exactly Google define as a “slow to load” page, and how this is determined. It isn’t known whether the labelling depends solely on the site or page itself, or whether the speed of an individual’s device or connection is taken into account. As such, some have expressed concerns that the labelling in its current form is arbitrary, giving little indication to webmasters on how to act to prevent a page or site being labelled.
In the past year, searches with localised qualifiers have rapidly increased, Google recently reported.
In a post on the Inside AdWords blog, Google stated that queries with “nearby” and “near me” qualifiers doubled, with around 80% of these searches coming from mobile. Google cited “heightened expectations for immediacy and relevance” for the increase, with a reported 4 out of 5 people stating they’d prefer search ads to be less generic, and specifically tailored to their city, post code, or immediate surroundings.
The information was released alongside details of a new ad format, specifically targeting “near me” searches. Google announced that from late May users searching in a “[business] near me” and “nearby [business]” format will be shown 3 or 4 different local business ads. Rather than containing simply ad copy, these ads will show buttons that allow users to find the location of, or directly call, the business, as seen below.
Directions alongside a call button have previously only been available on organic local business listings, with ads showing only a call button; users had to click through to find location details.
This comes off the back of November 2014’s location extensions update, which meant that users could potentially be shown 3 or 4 ads for different locations of the same business. However, with these latest changes, Google appear to be levelling the playing field somewhat, allowing for more businesses to achieve top of the page and above the fold ad space.
To view the statistics in full, and read more about the update, head over to the Inside AdWords blog.
Google shakes up search rankings with “quality update”
At the start of May, webmasters reported seeing both positive and negative ranking changes across multiple sites, leading many to assume a possible Google Panda update had taken place. After initial denials of any changes, Google eventually confirmed that an update had taken place at the beginning of the month – but not to Panda.
Google’s John Mueller described the update as “essentially just a normal algorithm update” taking place to “increase the relevance and the quality of the search results”, and advised webmasters of affected sites to “work on your web site in general”.
The lack of specificity regarding the purpose or intent of the changes has led many to dub it the “quality update”, and at the time of writing the reasons for sites being affected aren’t yet known. However, Mueller recommends that webmasters keep “focusing on your site and make it the best it can possibly be” to prevent ranking drops in similar updates in future.
Google clarify how Panda and Penguin algorithms operate
Recently, the operational nature of Google’s Panda and Penguin updates has caused much confusion. Google’s contradictory statements have often been the driving force behind this uncertainty, with both algorithms described as operating manually and in real time.
However, some clarification was reached in May, with Google confirming that the seemingly oppositional statements they’ve previously made are both true; Panda and Penguin operate both manually and in real-time, simultaneously.
Google employee Mariya Moeva stated that “Panda and Penguin are built-in in the real-time infrastructure, but the data has to be updated separately”. That explains why ranking changes can appear to be sudden, as whilst the algorithm is constantly running, the data that affects search rankings needs to be manually updated or refreshed for a change to take place based on this.
Webmaster tools rebranded as “Google Search Console”, new features added
As part of a wider “inclusive” rebranding process, Google have renamed Google Webmaster Tools “Google Search Console”. Citing that Webmaster Tools is “not just for webmasters”, the name Search Console appeals to the toolset’s apparent wider user base of “hobbyists, small business owners, SEO experts, marketers, programmers, designers, app developers”.
Alongside the rebranding, Google have added two new features into the tool, both based around app indexing. Search Analytics now allows webmasters to see top queries, pages and country specific data specifically for apps. Also added is an Alpha version of Fetch as Google for apps, which allows app developers to see the results of Googlebot attempting to fetch and index the apps.
Google search results page now shows real-time tweets
As a result of the partnership between Google and Twitter announced back in March, Google now displays real-time Twitter results in the mobile search results page. Relevant tweets relating to a search term are shown in a scrollable “carousel” format, appearing either at the top of the page – as seen in the example to the right – or sometimes lower down the page.
Google have stated that the changes represent “another way for organisations and people on Twitter to reach a global audience at the most relevant moments”. At the time of writing, real-time tweets are only displayed in search results on mobile devices, although a desktop roll-out is expected to take place soon.
Google Maps fix causes local business ranking changes
A fix made by Google to prevent offensive search terms leading to locations on Google Maps appears to have boosted the search rankings of some local results. Google acknowledged and apologised for the offensive results after the problem – which caused racial slurs to lead to the White House – was brought to wider media attention, and stated they would make algorithm changes to fix the issue.
The exact cause of the problem is not yet known, although there have been a few suggestions. One of these is a Googlebomb, where users make deliberate steps to manipulate results by attempting to make webpages, or in this case locations, rank highly for irrelevant terms. Another is Google’s local search Pigeon algorithm, which looks for references across the web to influence how local results rank.
Whatever the cause, many webmasters reported significant changes in local results traffic (as seen in the left example) after Google said they’d resolved the issue, leading some to presume that the algorithm changes were responsible. Google have neither confirmed nor denied these suspicions.
On the 21st of April, Google finally began to roll out its much anticipated mobile friendly update. Announced early on in the year, the exact nature and effect of the update has been heavily speculated about within the SEO community, with a reported 4.7% of webmasters making changes to ensure that sites fit within Google’s requested parameters.
However, at the time of writing the update has had a far smaller impact than previously anticipated. As of the 1st of May, Google have said that the algorithm has fully rolled out in all of its data centres. However, the majority of webmasters have reported no big changes in mobile search results rankings, and those who’ve been tracking the update have seen no significant impact, as seen in the below graph from Moz.
Google’s Gary Illyes stated that as many sites have not been re-indexed, they aren’t as of yet being affected by the new scores. This means it’s still possible for “unaffected” sites to be hit, and it’s still recommended that sites that are not yet mobile friendly be made so.
Google tests lightweight mobile results for slow connections
Google have continued their recent focus on mobile search results optimisation with the test of a “lightweight” display for mobiles with slow connections. Initially announced to simply affect mobile SERPs, Google have now given webmasters the option to show a “toned down” version of their site to users on a slow connection. Whilst the lightweight version of the search results page is automatic, the option to strip out heavy images and files on a site will be down to webmasters to decide.
However, when tested on users in Indonesia, Google reported that sites that had opted in to lightweight display had a 4x faster load time, used 80% fewer bytes, and saw a 50% increase in mobile traffic – something surely likely to influence whether webmasters opt in.
Search Queries report being randomly replaced by Search Analytics in Webmaster Tools
At the beginning of the year, Google tested a new “Search Impact” report amongst a few select users, now renamed as “Search Analytics”. As well as the standard Search Impact features, the new report displays clicks, impressions, CTR and average search results position. On top of this, Search Analytics also allows for a comparison of these factors, broken down by specific queries, pages, devices, and country.
Google’s Webmaster Trends Analyst Zineb Ait Bahajji also commented that the report is “slow to catch up” at the moment, having only 90 days of data. However, this is expected to increase shortly. Whilst at the moment Search Analytics is only available to a random selection of users, it’s expected that at some point it will receive a full rollout and replace the Search Queries report.
Google begins replacing URL search result snippet with breadcrumb pathway
After a long period of testing, Google has finally started to replace the site URL in the search results snippet with a site name and breadcrumb pathway. This update comes after years of beta testing and randomly selected rollouts, and is designed to “better reflect the names of websites”, Google has stated.
With this update, webmasters will be given the opportunity to better reflect site structure and content to users, and display a “real world” version of the site rather than a domain name. At the time of writing, this update has only affected mobile results in the U.S, but is expected to have a worldwide rollout in the near future.
In order to make sure these changes take place, webmasters will have to implement specific site name and breadcrumb schema within a site’s source code.
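The breadcrumb markup in question follows the schema.org `BreadcrumbList` vocabulary. A minimal sketch generating it in Python for illustration – the site structure and URLs are placeholders, and JSON-LD is one of several supported formats:

```python
import json

def breadcrumb_json_ld(crumbs):
    """Build schema.org BreadcrumbList JSON-LD from (name, url) pairs,
    suitable for embedding in a <script type="application/ld+json"> tag."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(crumbs, start=1)
        ],
    }, indent=2)

# Placeholder site hierarchy for the example.
markup = breadcrumb_json_ld([
    ("Home", "https://www.example.com/"),
    ("Books", "https://www.example.com/books"),
    ("Sci-Fi", "https://www.example.com/books/sci-fi"),
])
```

Each `ListItem` carries an explicit `position`, which is what lets the snippet render as an ordered pathway rather than a bare URL.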
Google limits crawling of sites with response times over 2 seconds
Whilst it’s well known that having a site with a slow server response and load time can have an effect on your search results rankings, exactly what Google classes as a “slow site” has been up for debate. However, in a recent Webmaster Help thread, John Mueller stated that if Googlebot takes “over 2 seconds to fetch a single URL”, this will affect how your site is crawled. If Google views a site as slow, it will limit the number of URLs crawled on your site, affecting how well your site ranks.
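A crawler applying Mueller’s threshold might simply scale back its request rate once average fetch time crosses the two-second mark. The sketch below is purely illustrative: the budget numbers and the scaling rule are invented, as Google’s actual crawl scheduler is not public.

```python
def crawl_budget(avg_fetch_seconds: float, base_urls_per_hour: int = 600) -> int:
    """Reduce how many URLs get crawled per hour as a site slows down.
    Past the 2-second threshold Mueller mentions, the budget shrinks in
    proportion to how slow responses are (invented scaling, for illustration)."""
    SLOW_THRESHOLD = 2.0
    if avg_fetch_seconds <= SLOW_THRESHOLD:
        return base_urls_per_hour
    # e.g. at 4s per fetch, crawl at half the base rate
    return max(1, int(base_urls_per_hour * SLOW_THRESHOLD / avg_fetch_seconds))
```

Whatever the exact mechanism, the practical takeaway is the same: fewer URLs crawled means fewer pages refreshed in the index.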
Google give more details on upcoming mobile-friendly changes
Ahead of its release on the 21st of April, Google have clarified a number of points regarding the mobile-friendly algorithm. The roll-out is set to run over the course of a week on a real-time, page-by-page basis. Real time means that a site may benefit from any mobile-friendly changes made as soon as Google picks up on these, and “page by page” means that only pages on a site that are mobile friendly will benefit, rather than the whole site.
Again, Google stated the algorithm will run on a binary “yes/no” basis, meaning there are no in-betweens; Google classifies a page either as mobile friendly, or not. Google have also released details as to the scale of the algorithm, which is set to have a wider effect than both Penguin and Panda. Although set to only impact search rankings on mobile devices, it’s becoming increasingly apparent that ignoring Google’s mobile recommendations could result in dire consequences in the coming weeks.
Google set to penalise doorway pages
Sites that attempt to maximise their search results appearance with “doorway pages” are set to be hit by a new ranking adjustment, Google have announced. Doorway pages are pages specifically created to rank highly for certain search results, often containing little in-depth or useful information and simply acting as a “doorway” to a site. As such, Google views doorway pages as leading to a bad user experience, and with this ranking adjustment, no longer wants to rank them. If your site currently has pages that could be classified as doorway pages, it’s likely you may see a ranking drop in the near future.
More than 80% of HTTPS URLs are not displayed in Google SERPs
A recent webmaster trends analysis discovered that over 80% of HTTPS URLs are not currently being displayed in Google’s search results, instead appearing as HTTP. This is something Google puts down to webmaster configuration, with many webmasters not using HTTPS versions in sitemaps, rel-canonical, and rel-alternate-hreflang elements. This means that although a site is still indexed, it appears as HTTP. Google have previously suggested that they’d prefer sites to use the more secure HTTPS, always displaying this variant where possible and even affording a small ranking boost to sites that use it. Although the benefits might not be immediately visible, it’s worthwhile for webmasters to use and make visible HTTPS on eligible sites.
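One of the configuration slips Google points to – HTTP URLs left in a sitemap after an HTTPS migration – is straightforward to check for. A rough sketch (the sitemap content is a made-up example, and a real audit would use a proper XML parser):

```python
import re

def http_locs(sitemap_xml: str) -> list:
    """Return <loc> entries in a sitemap that still use plain HTTP -
    candidates to update after an HTTPS migration."""
    locs = re.findall(r"<loc>(.*?)</loc>", sitemap_xml)
    return [u for u in locs if u.startswith("http://")]

# Made-up sitemap fragment with one stale HTTP entry.
sitemap = """<urlset>
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>http://www.example.com/old-page</loc></url>
</urlset>"""
```

The same kind of check applies to rel-canonical and rel-alternate-hreflang values: until they all reference the HTTPS versions, Google may keep displaying the HTTP URL.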
The average person today will digest more information than at any other point in history. Through the internet, music, TV and plain old fashioned print media, they’ll encounter around 100,000 words. Or about 2.5 novels. In total, they’ll process the equivalent of 34 gigabytes of information every day; 5 times more than 30 years ago.
These figures could give the impression that society in 2015 is more educated. With Google, Siri, and blogs like this just a few clicks away, we can encounter a wealth of information, learning whatever we feel like, whenever we feel like. Want to know tomorrow’s weather? Who was King of France in 1390? How tall Noel Edmonds is? There’s nothing stopping you.
However, have you ever thought that a lot of the information you encounter, process and learn might be wrong? Google has, and they want to rectify this.
For just under a year, Google has been developing their Knowledge Vault, a huge store of information taken from all across human history. Knowledge Vault is an ever expanding database that autonomously collects data from around the internet. This information is then cross referenced with similar or corresponding information in order to sift facts from falsities.
Google’s existing Knowledge Graph works in a similar way, albeit on a smaller scale. However, rather than compiling information from the whole of the internet, the Knowledge Graph uses “trusted” sources like Wikipedia and Freebase to offer a collection of related and relevant information on a given search term. For example, if I search “Noel Edmonds”, Knowledge Graph provides a collection of useful and unimaginably interesting facts on the man himself, as visible below.
Very recently, a Google research team published a research paper announcing aspirations to implement Knowledge Vault as a search ranking factor. This means that rather than a collection of information simply being shown to users alongside search results – as with Knowledge Graph – the Vault would control all the information on the search results page. Sites that contain information Google considers true would be ranked highly, and sites that contain dubious information would be penalised.
Whilst this is a suggestion still only in its formative period, it’s one that would entirely alter the way Google search works.
At the moment, sites are ranked according to a number of factors, one of these being links. The more links a site has from trustworthy sources, the more trustworthy that site is considered. This is a largely human process; when you link to a site, you’re showing a vote of confidence.
However, a ranking based on the Knowledge Vault would take away this human influence. As the Vault is an autonomous system, it and it alone decides what separates fact from fiction, and what makes a site trustworthy.
Current ranking factors like links are far from perfect; something testified by algorithms like Penguin designed to halt manipulative link-building. However, possibilities for manipulating the Knowledge Vault in theory still exist. If the Vault is simply collecting together information it views as similar, and deciding truthfulness based on this, then what’s to stop webmasters from sprinkling their “facts” across the web in an attempt to manipulate higher rankings? Plus, what about dubious information that large numbers of people on the web consider to be true? Does this mean that moon landing conspiracy theories and folk health remedies should be considered facts, and afforded a high ranking? What about “facts” that are opinion based? Should the statement “Noel Edmond’s best days are behind him” be deemed any more truthful than “Noel Edmonds has a long and fruitful future ahead in show business”?
Perhaps more importantly, the implementation of a Knowledge Vault based ranking system is a step towards Google controlling a large flow of information. Whereas with the current ranking system, if a piece of dubious information is encountered, this can be argued against; a healthy discussion can be formed. However, with the implementation of this algorithm, there will be no need for discussion; just a nod of the head as Google pumps out a stream of complete, inarguable “facts”. With this move, Google could be taking the power to invest confidence in information and sites away from users; something surely more important than encountering the odd “spider eats dog” article.
With this being said, and as Google haven’t announced any concrete plans for implementation, at this point we can only speculate how a knowledge-based search rankings system would work. It may be that Google could simply decide to implement a fact ranking alongside existing systems – perhaps displayed within a search snippet – something which at the time of writing seems a safer and more feasible option. In the unlikely eventuality that a full overhaul does take place, users may even become savvier, and more clued up to whether they’re being shown sketchy information. In any case, it’s not as if Google has never made big changes to the way search works before, and we’ll look forward to watching and adapting to whatever plays out.
Google to start favouring mobile friendly sites in search
Google has revealed a significant expansion of the effect “mobile-friendliness” gives sites within the search engine results page. Announced in a Webmaster Central blog post, the changes are set to take place from April 21st. Sites that are deemed mobile-friendly will automatically be ranked higher in the mobile search results than sites with low mobile usability. The algorithm changes are likely to significantly affect the mobile search results rankings, meaning that sites currently viewed as “unfriendly” should attempt to make changes before the algorithm comes into action in April. Read our in-depth analysis of the changes here.
Google labels slow loading pages in SERPs
Users have recently reported spotting red “slow” labels in the mobile search results snippets of slow-loading sites. These labels warn users in advance, before they click on a site, that the page may load slowly. Although Google has factored page loading speed into rankings since 2010, labelling pages for speed is something not previously seen. However, based on the testing and introduction of the mobile-friendly label last year, it’s possible this feature may turn out to be more than just an experiment.
Google tests new look mobile search results interface
Google may have just rolled out a new look mobile search results interface. Owners of iOS and Android devices have reported seeing a coloured line separator in the search results, rather than the typical grey line, as seen in the two examples below. The exact reason for the alteration is not yet known, and for some this might not seem like a huge change. However, it’s likely that this is a feature designed to increase mobile-usability, reflecting the increasing importance Google has placed on this area in the past few months.
Google tests live chat functionality in knowledge graph results
Google have recently tested a “live chat” tool within the search results of local businesses. Displayed within the knowledge graph local box, the feature shows whether someone from the business is available to chat. When the feature is clicked a Google Chat/Google Hangouts page opens, allowing users and potential customers to chat with an employee. Some have expressed concern that the feature may have the capacity to affect CTR, and thus have a negative impact on SEO. For example, a potential customer may not need to click through onto the site of the business, as their query has been answered offsite through the live chat function. However, as this option is currently only in testing mode, these concerns are only speculative.
Google have announced that they will be “expanding the use of mobile-friendliness as a ranking signal” from April 21st, and that the change “will have a significant impact in search results”.
This is big news, as it’s rare that Google give so much advance notice about an algorithm change that they say will have a dramatic impact on search results.
I think you can assume that if a website doesn’t get the “mobile friendly” snippet in mobile results, that site will see their mobile visibility reduce drastically, and this will be the point in time where mobile results will change dramatically from desktop.
You can use the following link to get advice from Google as to why a site isn’t mobile friendly:
The same post also announced that Google may start to display relevant mobile app information in search results for users that are logged in and have the relevant app installed. This update is already in place, so organisations that have a mobile app as well as a website may want to consider Google’s guide to getting app content indexed.
Google starts sending “mobile-usability” warnings to webmasters
Following on from last year’s increased emphasis on mobile usability, Google has reportedly begun sending out warnings to webmasters of “mobile-unfriendly” websites. The warnings, sent out en masse via Webmaster Tools and email, warn webmasters to fix mobile usability issues on the affected sites in question. Specific problems or affected pages are not listed within the warning message, and webmasters must download a detailed report to see these. This is yet another move from Google to increase the mobile-usability of sites, and although not explicitly stated, a suggestion that an algorithm change may be in the pipeline.
Google can now crawl and index locale-adaptive webpages
Websites that automatically change their content depending on the location of visitors can now be crawled by Google, according to an announcement made last month. In a post on the Webmaster Central blog, it was stated that sites that have the capacity to change their language depending on visitor location/language settings will now be crawled and indexed, something that Google has previously found difficult; in the past, Googlebot would only see the U.S. English language version of locale-adaptive webpages. However, Google is still recommending that webmasters wanting to show their site is locale-adaptive continue to use the suggested rel=alternate hreflang annotations, to help Googlebot recognise that sites are locale-adaptive.
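As a hedged sketch (the domain and locales are hypothetical), the rel=alternate hreflang annotations Google recommends sit in the page’s `<head>` and enumerate each language/region variant, plus an optional default:

```html
<!-- Hypothetical annotations for a locale-adaptive page -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/" />
<link rel="alternate" hreflang="de-de" href="https://www.example.com/de-de/" />
<!-- x-default marks the page served when no variant matches the visitor -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```

Each variant should list all of its alternates reciprocally, so Googlebot can tie the set of locale versions together even though the content it sees varies by location.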
Mobile sites blocking Google now visible in search results
Social profiles for brands now visible in Google’s knowledge graph results
Google has started to display social profiles other than Google+ in the knowledge graph results of certain brands. Although a link to Google+ has previously been displayed for brands, the knowledge graph now displays icons for Facebook, Twitter, LinkedIn, YouTube, Instagram and Myspace. This feature has previously been available, but only for “personalities” and celebrities. Brands and companies wishing to have their social profiles visible in the knowledge graph will need to apply a new mark-up to their sites.
As of November 2014, Google will begin to account for how “mobile friendly” a page is as an organic mobile SEO ranking factor. Officially announced over at Google’s Webmaster Central Blog, with this latest change to the ranking algorithm, Google are aiming to improve the online experience of mobile users. Whilst Google already penalises websites viewed as offering a bad mobile experience, this is the first we’ve heard of directly rewarding mobile friendly sites.
As well as the new update affecting page rankings, Google will also be introducing a new “mobile friendly” label. This will appear at the beginning of a page’s search results snippet, as seen in the example below, directly informing users that the page offers good mobile usability.
In order to qualify as a “mobile friendly” page, Googlebot takes into account a few criteria. Mobile friendly sites are classed as sites that:
Avoid technologies that are not universally compatible with all mobile devices e.g. Flash
Size content to fit the screen, so users don’t have to zoom to view images or text
Have appropriate link spacing, allowing for easy clicking
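Sizing content to fit the screen typically starts with a viewport declaration; a minimal sketch:

```html
<!-- Tell mobile browsers to match the device width
     rather than rendering a zoomed-out desktop canvas -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```

CSS media queries can then adapt layout, text size and tap-target spacing to smaller screens, addressing the zoom and link-spacing criteria above.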
Interestingly, the “mobile friendly” tag is attributed at a page level and not domain level.
To see if a page is considered mobile friendly, Google have provided a simple tool at the following link: https://www.google.com/webmasters/tools/mobile-friendly/. This tool is fairly self-explanatory, and easy to use. If users enter a site that meets Google’s criteria for mobile devices, they’ll be told the site is mobile-friendly, and greeted with a screenshot of how Googlebot sees the page, as seen below.
However, when users enter a site that doesn’t meet the criteria, they’ll be shown a list of reasons why the site in question isn’t mobile friendly:
This comes shortly after the introduction of mobile usability reports to Google Webmaster Tools, a feature that informs webmasters of errors affecting many of the factors the new update will be taking into account. Alongside the announcement of a new “mobile-friendly test”, it’s clear that Google are pushing webmasters and developers to seriously consider the mobile browsing experience their sites offer to users.
One of the main talking points in the SEO industry right now is Google’s recent announcement that they will count HTTPS as a ranking factor.
Google say the update is a “light” ranking factor, impacting 1% of queries, but it is likely they will decide to turn up the dial in time. To quote the announcement:
“…while we give webmasters time to switch to HTTPS. But over time, we may decide to strengthen it, because we’d like to encourage all website owners to switch from HTTP to HTTPS to keep everyone safe on the web”
Having seen Google’s softly-softly approach before, reading between the lines they appear to be telling site owners, rather than suggesting to them, that they should be updating to HTTPS. I imagine there will be an amnesty period “to give webmasters time to switch to HTTPS”, but eventually HTTPS will become a full-blown ranking signal. This is what Google wants, and you either play by their rules or suffer the consequence of lowered rankings.
So what exactly does HTTPS mean?
HTTPS is the secure version of HTTP, which creates a secure connection for the user and stops sensitive information from being leaked – HTTPS has therefore been a standard implementation for securing ecommerce shopping carts. Google’s recommendation is to make all static content secure, and not just pages that transfer sensitive data. The main benefit for site owners is that it prevents “Man-in-the-Middle” attacks, a type of attack in which a third party secretly intercepts and relays the data passing between two parties.
As per Google’s announcement, instructions for implementing HTTPS in a satisfactory way are as follows:
Decide the kind of certificate you need: single, multi-domain, or wildcard certificate
Use 2048-bit key certificates
Use relative URLs for resources that reside on the same secure domain
Don’t block your HTTPS site from crawling using robots.txt
Allow indexing of your pages by search engines where possible. Avoid the noindex robots meta tag.
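To illustrate the relative-URL point (the paths are hypothetical), resources on the same secure domain can be referenced without a hard-coded scheme, so they resolve over HTTPS automatically when the page is served securely:

```html
<!-- Relative references: these resolve to https:// URLs when the
     page itself is served over HTTPS, avoiding mixed-content warnings -->
<link rel="stylesheet" href="/css/styles.css">
<img src="/images/logo.png" alt="Logo">
```

On the crawling point, it’s equally worth checking that robots.txt on the HTTPS host contains no Disallow rules blocking the pages you want indexed.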
Moving to a full HTTPS implementation gives site owners the benefit of a fully secure connection, but with some minor downsides. The first is the cost of implementation – it costs around £50 for a certificate (avoid free or cheap certificates) and then a couple of hours for the server configuration – so not exactly prohibitive costs. The biggest challenge is getting the implementation right, so it’s important that someone with experience sets the server up. One final downside is that HTTPS requests will slow page load down slightly, ironic given Google’s constant banging of the drum on improving site speed.
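As part of that server configuration, the usual approach is to 301-redirect every HTTP request to its HTTPS equivalent. A hedged sketch for Apache with mod_rewrite enabled (adapt to your own server and setup):

```apache
# 301-redirect all HTTP requests to the HTTPS version of the same URL
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

A permanent (301) redirect is the variant search engines treat as a signal that the HTTPS URL is the one to index.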
To conclude, unless you’re in the (unclear) 1% of queries impacted, it’s unlikely that switching to HTTPS immediately will provide a noticeable improvement to your rankings. But given Google’s ominous tone of saying they are “giving webmasters enough time” and they “may decide to strengthen”, it’s fairly clear which direction they would like to head in, and adding HTTPS implementation should be on your road-map for the coming months.
I’ve seen a couple of interesting pieces from Google recently around mobile SEO, and reading between the lines I think we could be seeing a bigger push from them in the next couple of months with regards to rewarding sites with a good mobile experience, or more ominously, demoting sites that don’t toe the line.
In an official Google Webmaster Central Blog post a couple of weeks ago, Google announced they might highlight underneath a search result that you may be redirected to the site’s homepage rather than the page you were hoping to land on, because a mobile equivalent doesn’t exist. It’s been an issue since the dawn of the smartphone, with many sites blanket-redirecting users on mobile devices to the homepage rather than to equivalent mobile URLs. It was common for developers to use this shortcut when rushing to develop a mobile site, but it has always resulted in a frustrating user experience, and it looks as though Google is highlighting that they share this opinion.
Google call these “faulty redirects” and they have helpfully provided a report in Webmaster Tools that shows if this is an issue for your site. The report can be accessed under Crawl > Crawl Errors and then the smartphone tab. From here you can get an idea of the scale of any issues as well as problem URLs.
Depending on your resources, our preferred order of fixes would be:
Develop a responsive or adaptive site which resolves all content regardless of device
Redirect users to an equivalent mobile URL
If the mobile URL doesn’t exist, present the desktop page instead. To quote Google, “Doing nothing is better than doing something wrong in this case”.
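For the second option, the key is to preserve the requested path rather than dumping users on the homepage. A rough Apache mod_rewrite sketch (the user-agent pattern and the m. subdomain are illustrative assumptions, not a definitive implementation):

```apache
# Send smartphone visitors to the equivalent mobile URL, keeping the path
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} "android.*mobile|iphone" [NC]
RewriteRule ^(.*)$ http://m.example.com/$1 [R=302,L]
```

Whether an equivalent mobile page actually exists is application logic the server config can’t know; where it doesn’t, serving the desktop page (option three above) is the fallback Google recommends.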
Another tidbit of information suggesting Google’s mobile direction came from Matt Cutts at the recent SMX Advanced conference. To quote Sugarrae in her Matt’s You&A talk wrap-up:
“He kept saying how important it was for us to be mobile ready. He asked the audience how many people had auto-fill markup on their mobile site forms. Hardly anyone raised their hands. Danny said “that’s not mobile” and Matt said “yes it is”.”
Emphasis is mine – once again Google are hinting at the importance of mobile user experience, and that they may be seeing UX and “traditional” optimisation as interchangeable.
It all suggests to me that Google will be placing a much higher emphasis on mobile in the coming months, and I think we could be 12-18 months away from seeing substantially different search results for mobile devices than those on desktop. And that’s before you even consider voice search or wearable media! The “faulty redirect” development would also suggest that mobile factors are going to play a bigger part in organic algorithms overall.
With our clients seeing anything up to 40% of traffic just going to handheld devices, but often without conversion rates to match, it’s essential that site owners start getting their houses in order and not just from an SEO perspective, but also UX, creative and cross device tracking.
I’ll leave you with another quote from Matt Cutts’ SMX Advanced speech: “The mobile dominant Internet is coming faster than most people in this room realise“.
Google recently announced around 40 algorithm changes that have taken place during February 2012, or are about to be rolled out. Whilst most SEOs’ attention was drawn to the “link evaluation” point, and the fact that Google may soon make big changes to how they evaluate the characteristics of links to judge the content of a destination URL, it’s the roll-out of an algorithm called “Google Venice” which has caught our attention today.
The “Google Venice” algorithm update focuses on local results. Historically, a generic keyword search e.g. for “fitted kitchens”, would most likely return a Google Places map result with some local listings, alongside some generic non-local standard organic results. However, we are now seeing many generic searches that generate a Places map result and generic results, as well as featuring some local results in the main organic listings.
Google uses a number of methods to detect where a user is based – most notably, the user can set their default location in their search preferences, and Google will also look at IP address and to some degree past search history.
This is big news on two fronts. First of all, there’s a clear advantage for businesses with a local physical presence to gain visibility for generic phrases amongst searchers in their area.
Secondly, bigger nationwide companies who have strong visibility for generic phrases despite not having a physical presence in the searcher’s area will most likely lose that visibility to local businesses.
Any business with a physical and online presence must consider this as part of their search strategy if they weren’t before, at a local SME level as well as for national multi-store retailers. Our recommendation would be to first identify searches relevant to your product and service which may trigger the Venice algorithm, and to ensure that on-page optimisation elements target those products/services combined with location. For businesses in one location this will most likely be your homepage, whilst multi-store businesses should scale this across individual location pages. The big challenge for multi-store businesses will then be tracking results for multiple phrases across multiple geographic areas, and it remains to be seen how effective standard off-site SEO practices will be in improving Venice results.