# Google Webmaster Hangouts Notes – 3 May 2019 – Part 1

Welcome to MarketingSyrup! This post is part of my Google Webmaster Hangouts Notes. I cover them regularly to save you time.

Here are the notes from May 3rd. Since I already had so much content and still hadn’t gotten through the whole video, I decided to break this down into 2 parts. This is the first part, enjoy! The timestamps of the answers are in brackets.

## There’s no minimum number of words you need in a blog post (1:56)

Google doesn’t count the words in a blog post at all. So there’s no optimal number of words you need to use; it’s more important to cover the topic and provide useful information to your readers.

Google looks at different factors to determine the quality of the content, and the number of words or lines is not one of them.

Kristina’s note: Sometimes it’s just easier to give copywriters a number to make sure you won’t end up with posts of 100 words. Though it’s sometimes possible to cover a topic in 100 words, that happens quite rarely.

The number of words in a blog post also depends on the type of post. If it’s a guide that’s supposed to cover a topic in detail, a few paragraphs won’t be enough. I remember when one blog manager didn’t like my guide of 2,000+ words… until it became the most popular post on that blog.

But if you just answer a question or provide some tips, people don’t want to read an excessive amount of text. The bottom line here: use common sense.

## Use an XML sitemap with `<lastmod>` dates to speed up re-indexing of your pages (4:56)

The best way to let Google know that a page has changed and needs to be re-indexed is to include the last modification date for that page in your XML sitemap. Additionally, resubmit the sitemap in Google Search Console.

Note that Google doesn’t use the change frequency and priority parameters in XML sitemaps anymore.

But don’t overuse this method. If you start adding `<lastmod>` to pages which haven’t been updated, Google might eventually start ignoring it.
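As a minimal sketch of what this looks like (URL and date are placeholders), a sitemap entry with `<lastmod>` follows the standard sitemaps.org format:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/updated-page/</loc>
    <!-- Only set lastmod when the page content actually changed -->
    <lastmod>2019-05-03</lastmod>
  </url>
</urlset>
```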

## Google can show multiple pages from one website if they are strong and relevant (9:09)

If there are a few pages from a single website which Google sees as very relevant for a particular term, they can all rank high in Google.

But if these ranking pages constantly replace one another in search, it means that none of them is strong enough, and Google is trying to figure out which of them should be shown.

Kristina’s note: This is known as self-cannibalization: pages on the same website compete against each other. Sometimes it’s hard to avoid (for example, I have a series of Google Webmaster Hangouts Notes which cannot be named differently). But in most cases, it’s easy to avoid.

## Using parameters in some URL types (internal search, pagination) helps Googlebot optimize crawl budget (10:52)

From Google’s point of view, both variants – website.com/search/searchterm or website.com/search?q=searchterm – will work.

But John Mueller still recommends using a query parameter (like q= in the example above) as it makes it easier for Google to understand that this part might vary.

Moreover, parameters help Google learn faster how to optimize crawl budget on the website. This is also applicable to pagination created with parameters (e.g. ?page=3) vs pagination being part of the URL (e.g. /page3).

Kristina’s note: This is really interesting and new to me. I don’t think you need to change your current pagination, especially if you don’t have crawl budget issues. But it’s still good to know if you’re developing a new website which is going to have lots of paginated URLs (e.g. an eCommerce store) since rel=prev/next won’t help anymore.

As for site search, you’ll definitely need to use parameters, since this way you’ll also be able to track what people are looking for within your website. This can be configured in Google Analytics. I’ll soon write a post about it, so subscribe and get it delivered to your inbox.
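To see why query parameters make the variable part of a URL explicit, here’s a small sketch (the URL is just an illustration) using Python’s standard library: any crawler or analytics tool can mechanically pull out the changing values, whereas with path-based URLs it would have to guess which path segment varies.

```python
from urllib.parse import urlparse, parse_qs

# Query-parameter style: the variable parts are machine-readable
url = "https://website.com/search?q=searchterm&page=3"
params = parse_qs(urlparse(url).query)

print(params["q"])     # the search term
print(params["page"])  # the pagination value
```

With the path-based variant (website.com/search/searchterm), nothing in the URL structure itself signals which segment is a search term, so Google has to learn that pattern over time.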

## If you want to rank a JS-based website or a single-page web app, make sure Google can render your page(s) (15:08)

When it comes to JavaScript-based websites (including single-page web apps), you should make sure Google is able to render the pages to pull out the content and use it for indexing.

You can use the Mobile-Friendly Test tool to see the HTML after rendering and check whether Google was able to pick up your content.

Kristina’s note: Good news here! Google updated the version of Googlebot a few days ago. The new version supports 1,000+ new features which will help Google render JS better. It’s based on Chrome 74, but don’t worry if you still see Chrome 41 in the user agent – this is intentional.

## Average position in GSC is not a real indication of where your website is ranking (17:18)

The ranking data (aka Average Position) is not a precise ranking position of your website in a particular country. It’s rather a snapshot of what Google saw when it showed search results to users.

This means that if at that particular point your website was ranking #3 (due to personalization, for example) and was then pushed off page 1, the GSC report will still show ‘3’ as your average position.

So take the average positions with a grain of salt.

## It’s fine to directly show a 404 page or redirect users to a custom 404 page (19:22)

That’s it, both options work well for Google.
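As an illustration of the “directly show a 404 page” option, here’s a hedged sketch for nginx (file names are placeholders; other servers have equivalent directives). The key point is that the custom page is served while the response still returns a 404 status code:

```nginx
# Serve a custom error page for missing URLs while keeping the 404 status
error_page 404 /custom-404.html;

location = /custom-404.html {
    # internal: the page can't be requested directly by visitors
    internal;
}
```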

## Moving content from a subdomain to the main domain can bring good results if Google sees the main domain as higher quality (20:11)

From a development point of view, subdomains (e.g. content.site.com) and subfolders (site.com/content) are treated in the same way.

But Google looks at a website overall to understand its quality, and subdomains are treated as separate websites. This means that, for example, a blog on a subdomain can be seen as lower quality than the main domain. After moving this blog to a subfolder (blog.domain.com -> domain.com/blog), you might see an increase in rankings and overall search visibility for the blog.

But if Google sees the domain and subdomain as having the same quality, the switch to a subfolder is unlikely to bring any positive results.

## Google won’t be able to pick up your images if they are added via CSS or as a div background (22:50)

Make sure to embed your images with `<img>` tags if you want Google to see them and use them in image search.
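To illustrate the difference (file paths and alt text are made up for the example), compare an indexable image with one hidden in CSS:

```html
<!-- Googlebot can pick this up for image search: -->
<img src="/images/blue-running-shoe.jpg" alt="Blue running shoe">

<!-- A CSS background image won't be picked up for image search: -->
<div style="background-image: url('/images/blue-running-shoe.jpg');"></div>
```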

Kristina’s note: Check out my post on Image SEO for more useful tips.

## Google is constantly testing new search features (24:42)

Google is working on new search features all the time, trying to improve search results. That’s why we sometimes see something new. This is a never-ending process, as what works now might not work later.

Google also tries to keep the balance between providing ready answers directly in the search (featured snippets, knowledge panel) and guiding users to other websites.

## Previous episodes