# Google Webmaster Hangouts Notes – 11 January, 2019

Hey! Welcome to my Google Webmaster Hangouts Notes!

Table of Contents

 Announcements

  • Google+ will be sunsetted soon.
  • There will be many changes in Google Search Console: some sections will be closed down and not transferred to the new GSC. A lot of them are not really necessary and show too much information without any immediate value, for example the crawl errors report, where every error ever discovered is listed.

Questions and Answers

If you deliver a server-rendered page to Google, make sure there’s no JavaScript that can remove or reload content if this JS breaks

When you serve a server-rendered page to Google and this page includes JavaScript, make sure that if this JavaScript breaks, it doesn’t remove or reload the content. Otherwise, it may interfere with Google indexing the page.

One more thing to note: if you’re doing server-side rendering, your static HTML page should offer all the functionality your users get. For example, all the links and schema markup should be in place, or Google won’t be able to crawl them.
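
To make this concrete, here’s a minimal sketch of a client-side script for a server-rendered page. `renderClientApp` and the `app` container id are assumptions standing in for whatever hydration your framework does:

```ts
// Hypothetical hydration script for a server-rendered page. If the script
// breaks, the server-rendered HTML stays in place instead of being removed
// or reloaded.
declare function renderClientApp(root: HTMLElement): void; // stand-in for your framework's hydrate call

function hydrateApp(): void {
  const root = document.getElementById("app"); // assumed container id
  if (!root) return; // nothing to hydrate; server HTML stays visible

  try {
    renderClientApp(root);
  } catch (err) {
    // Don't clear root.innerHTML or trigger a client-side reload here:
    // that's exactly the failure mode that removes indexable content.
    console.error("Hydration failed; keeping server-rendered HTML", err);
  }
}

hydrateApp();
```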

Using dynamic rendering or showing a mobile version of the website to Google and a desktop version to users is not seen as cloaking

Cloaking is when the content served to a user and the content served to Googlebot are significantly different, e.g. spammy vs. not spammy, different functionality, etc.

A mobile version and a server-side-rendered page contain equivalent content with the same functionality, just served in a different way, so Google doesn’t see this as cloaking.

Redirect old resource URLs to the new URLs when making significant changes to your website

When you make changes to the website, it’s advisable to redirect the old resource URLs to the new ones, for example old CSS and JS URLs with a version number in them. When you do a push and a resource URL changes, Google can’t access the old resource URL anymore, which makes rendering much more difficult.
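
As a sketch, assuming an Express server and invented versioned file names, such redirects could look like this:

```ts
import express from "express";

const app = express();

// Map old versioned asset URLs to the current ones so Google can still
// resolve references it discovered before the deploy. The file names and
// versions are made up for illustration.
const assetRedirects: Record<string, string> = {
  "/static/app.v41.css": "/static/app.v42.css",
  "/static/app.v41.js": "/static/app.v42.js",
};

app.get(Object.keys(assetRedirects), (req, res) => {
  res.redirect(301, assetRedirects[req.path]);
});

app.listen(3000);
```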

Mobile usability notifications do not always mean your pages are not mobile-friendly

Sometimes websites get notifications that some of their pages are not mobile-friendly, while in reality the mobile-friendliness test shows that they are.

This might not be a problem with your website, just Google Search Console being ‘too helpful’. As we know, there is a delay between indexing and rendering a page. So if some of the resources are not available during indexing, Google Search Console may flag a page as not mobile-friendly. But once these resources are fetched and rendered, everything will be OK.

So if your live test of mobile-friendliness is fine, don’t worry about the alerts.

Sites using a CDN are treated by Google as any other website

A CDN (which stands for Content Delivery Network) is just a way to deliver the content.

Additionally, a CDN doesn’t influence your geotargeting: it’s OK to have a CDN or hosting in country X while you are targeting country Y, as long as Google can recognize the targeted country from the top-level domain and/or your GSC settings.

If you change image URLs, set up redirects from the old URLs to the new ones

If you change image URLs (in the process of a migration, for example), it will definitely influence your image search traffic. So it’s advisable to set up redirects from the old image URLs to the new ones to help Google forward any signals it has from the old images to the new ones.

If you don’t implement the redirects, the new images will still be discovered by Google, but they will need time to climb back to where the old images used to rank. And this process might take quite a long time, as image search is refreshed much less frequently than web search.
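
A minimal sketch of such a redirect, assuming an Express server and a hypothetical move from /img-old to /media:

```ts
import express from "express";

const app = express();

// Permanently redirect old image URLs to the new location so image-search
// signals are forwarded. The path pattern is an assumption for this sketch.
app.get("/img-old/:file", (req, res) => {
  res.redirect(301, `/media/${req.params.file}`);
});

app.listen(3000);
```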

Switching design templates or themes can lead to significant changes in rankings

If you change a template or redesign your website, you may see significant changes in rankings even if your URL structure and content stay the same.

But this doesn’t mean the rankings will drop. In many cases you might see positive changes if you significantly improve your website (e.g. add relevant structured data, use a clear HTML structure, etc.).

It’s OK to have 2 sets of the same content in the source and show only one depending on the device. But…

From Google’s standpoint, it’s fine to have 2 sets of the same content in the source code of a page, mobile and desktop, with one version hidden with CSS depending on the device a user is on.

However, such a configuration seems over-complicated. It also slows down the pages since Google still loads both versions. So it’s better to find a more efficient way of having a mobile website version (for example, a responsive design).
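
For reference, this is the kind of pattern being described (the class names and breakpoint are assumptions). Note that both blocks ship in every response, which is where the extra page weight comes from:

```ts
// Sketch of the "two sets of content, one hidden with CSS" pattern.
// Both variants are always downloaded; CSS only decides which one is shown.
const page = `
  <style>
    .desktop-only { display: block; }
    .mobile-only  { display: none; }
    @media (max-width: 768px) {
      .desktop-only { display: none; }
      .mobile-only  { display: block; }
    }
  </style>
  <div class="desktop-only"><!-- desktop markup --></div>
  <div class="mobile-only"><!-- same content, mobile markup --></div>
`;
```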

Multiple websites targeting different locations vs a single website with location pages

John Mueller says that both options are possible, but he recommends having a single strong website with separate location pages. This eliminates potential problems with shared content (e.g. a general product or service description), as duplicate pages across multiple websites would compete with each other.

Having a noindex X-Robots-Tag in the HTTP header of your XML sitemap won’t influence how often the sitemap is crawled

An XML sitemap is a file which is primarily created for search engines to process; it’s not meant to be shown in search results. So adding a noindex X-Robots-Tag to the HTTP header of the XML sitemap won’t affect its crawling in any way. But it will prevent the XML file from accidentally showing up in Google search results.
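
A minimal sketch of serving a sitemap with that header, assuming an Express server and a made-up file location:

```ts
import express from "express";

const app = express();

// Serve the XML sitemap with a noindex X-Robots-Tag header: crawling and
// processing are unaffected, but the file won't appear in search results.
app.get("/sitemap.xml", (_req, res) => {
  res.set("X-Robots-Tag", "noindex");
  res.type("application/xml");
  res.sendFile("/var/www/site/sitemap.xml"); // assumed location on disk
});

app.listen(3000);
```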

Separate XML sitemaps for pages and images vs a single XML sitemap

John Mueller says that there is no difference between having separate XML sitemaps for images and pages and having a single sitemap listing them all together. Both approaches work, since Google combines all your sitemaps anyway before processing them.

If your website still gets traffic on HTTP long after moving to an HTTPS version, there might be problems with the migration

If you moved to HTTPS but still see a significant amount of traffic coming to your HTTP version long after the migration, there might be some issues with your website. You need to check many different aspects to understand why Google still shows your HTTP URLs or how people are getting to your HTTP pages.

I would say that the first two things to look at would be redirects and canonicals.
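
For the redirect side, here is a minimal sketch, assuming an Express app behind a reverse proxy that sets the x-forwarded-proto header; canonicals would additionally need to point at the HTTPS URLs in every page’s head:

```ts
import express from "express";

const app = express();

// Permanently redirect any remaining HTTP traffic to HTTPS. This assumes a
// reverse proxy that sets x-forwarded-proto; adapt the check to your stack.
app.use((req, res, next) => {
  if (req.headers["x-forwarded-proto"] === "http") {
    return res.redirect(301, `https://${req.headers.host}${req.originalUrl}`);
  }
  next();
});

app.listen(3000);
```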

You won’t be able to target country B with country C’s country code top-level domain

If you have a country code top-level domain but want to target individual countries outside your own, you might not be able to do that.

However, your site can still be relevant for a global audience even if you have a country code top-level domain. So if you cater to people worldwide, a country code top-level domain is not in any way different from a generic top-level domain (e.g. .com) from an SEO perspective.

If you show content to users from some states and block it for users from other states, Google might see it as cloaking

Google needs to see the same content as your site visitors would see. Googlebot usually comes to your site from an IP based in California. So, for example, if you want to hide your content from California users, you will have to block Googlebot as well. But this will result in your content not being indexed.

A good solution would be to find a way to have general content that is not restricted in any state and have it crawled by Google.

Note that Google doesn’t have the ability to control which states your content is shown in. So even if it’s not restricted in California but is prohibited in other states, there is no way for Google to prevent your pages from showing up in search results for users in those states. Blocking the Google cache page or using a nosnippet meta tag might be something to consider in this case. But unfortunately, there is no easy way to handle such situations.

This also applies to content which is restricted in some countries but not in others.
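
As a sketch of the ‘general content’ approach, assuming an Express route and invented copy: the page stays crawlable for everyone, while the cached copy and snippet are suppressed with the robots directives mentioned above:

```ts
import express from "express";

const app = express();

// Everyone, including Googlebot crawling from a California IP, gets the same
// general, state-safe page. noarchive/nosnippet suppress the cached copy and
// the snippet. The route and the content are assumptions for this sketch.
app.get("/regulated-product", (_req, res) => {
  res.send(`<!doctype html>
<html>
  <head>
    <meta name="robots" content="noarchive, nosnippet">
    <title>Regulated product</title>
  </head>
  <body>
    <p>General description that is safe to show in every state.</p>
  </body>
</html>`);
});

app.listen(3000);
```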

Don’t give Google conflicting signals

If you want Google to process your content or directives on your website, you should make them as clear as possible. For example, don’t use the same markup type with conflicting information, as this way you’re making Google guess what should be picked up and what should be ignored. And Google’s choice may not always be in your favour.
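
To illustrate with structured data (all values invented): two blocks of the same type disagreeing about a price force Google to guess, while a single consistent block doesn’t:

```ts
// Conflicting signals: two Product blocks on the same page, same product,
// different prices. Google has to guess which one to trust.
const conflicting = [
  {
    "@context": "https://schema.org",
    "@type": "Product",
    name: "Widget",
    offers: { "@type": "Offer", price: "19.99", priceCurrency: "USD" },
  },
  {
    "@context": "https://schema.org",
    "@type": "Product",
    name: "Widget",
    offers: { "@type": "Offer", price: "24.99", priceCurrency: "USD" },
  },
];

// Clear signal: one block, one price. Nothing to guess.
const consistent = {
  "@context": "https://schema.org",
  "@type": "Product",
  name: "Widget",
  offers: { "@type": "Offer", price: "19.99", priceCurrency: "USD" },
};
```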

Google can show page 2, page 3, etc. in search results even if you’re using rel=next/prev

John Mueller says that it’s totally fine for Google to show paginated URLs other than your page 1 in search results. Google uses rel=next/prev to understand that this is a connected set of pages, but that doesn’t mean it will always show the first page of the set.

It may be a sign, though, that people are searching for something that isn’t found on the first page of your paginated sequence, which is why another page is shown. So you might think about how to restructure your content so that the most important items appear on the first page of the set.
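
For reference, rel=next/prev is plain link markup in the page head; a small helper like this (the ?page= URL pattern is an assumption) generates it for any page in a series:

```ts
// Emit rel=prev/next tags for page `page` of a `lastPage`-page series.
function paginationLinks(baseUrl: string, page: number, lastPage: number): string {
  const tags: string[] = [];
  if (page > 1) tags.push(`<link rel="prev" href="${baseUrl}?page=${page - 1}">`);
  if (page < lastPage) tags.push(`<link rel="next" href="${baseUrl}?page=${page + 1}">`);
  return tags.join("\n");
}

// e.g. paginationLinks("https://example.com/widgets", 2, 10) links to pages 1
// and 3; Google may still choose to show page 2 itself in search results.
```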

If you want to see the canonical that Google chose for a page, it’s better to use the URL Inspection tool than to rely on the ‘info:’ operator

While the info: operator does show the canonical picked up by Google, it’s not a technical SEO tool. It was designed for a wider audience, so it can sometimes be misleading. Thus, it’s better to rely on the URL Inspection tool in Google Search Console, which is specifically designed to show the canonical Google has chosen.

That’s it for today! See ya next time 🙂