# Google Webmaster Hangouts Notes – 1 November 2019
Welcome to MarketingSyrup! And first of all, I want to announce something I’m really excited about! This is the SEO Challenge for those who want to learn SEO. Check it out!
Now let’s get to the Notes. By the way, Happy Halloween to you (even though it’s not October 31st today!). And look at John, he’s got a great Halloween costume for the video 😀
## Google can place a website in the omitted results if it has found the same content on a different website and chosen to rank that instead (3:15)
The ‘omitted results’ link is shown when Google has multiple results that would display the same snippet. Since it doesn’t make sense to show users identical snippets, Google picks only one page to display and adds the others to the omitted results.
So if a website is seen only when you click ‘Repeat the search with the omitted results included’, it suggests that Google found the same content somewhere else and is currently ranking only that version.
## Having 404 or 410 errors doesn’t decrease website rankings (6:27)
If many pages of a website start returning 404s or 410s, it doesn’t negatively influence how Google sees and ranks the website in general. It will still try to crawl the URLs that return errors but eventually it will mostly concentrate on the live pages.
## Google can see and index hidden content only if it’s loaded in the HTML by default (10:31)
In the mobile-first indexing era, Google treats content hidden in tabs or behind a ‘Read more’ link as normal content, as long as it’s loaded in the HTML by default.
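As a minimal sketch of what “loaded by default” means, both tab panels below are present in the initial HTML payload; only their CSS visibility differs (the class names and IDs are made up for illustration):

```html
<!-- Both panels exist in the HTML Google receives, so both are indexable. -->
<div class="tab-panel" id="description">
  Full product description, visible by default.
</div>
<div class="tab-panel" id="specs" style="display: none;">
  Technical specifications, hidden until the user clicks the “Specs” tab,
  but still loaded in the HTML and therefore indexable.
</div>
```

If, instead, the tab content is fetched with JavaScript only after the user clicks, it is not “loaded by default” and may not be indexed.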
## Google doesn’t use likes on social media as a ranking signal (12:54)
Likes on Facebook, Instagram, etc. are not used by Google as a ranking factor. So don’t worry if they are disappearing, it won’t impact your website rankings in Google.
## Google can still crawl the URLs with ‘Crawl: none’ set in the Parameters handling tool (25:07)
Setting the URL Parameters in Google Search Console to ‘Crawl: none’ doesn’t mean Google will never crawl it. If you want something never to be crawled, use robots.txt for that.
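As a sketch, blocking a parameter in robots.txt could look like this, assuming a hypothetical `sessionid` parameter you never want crawled:

```
User-agent: *
# Block crawling of any URL that carries the sessionid parameter
Disallow: /*sessionid=
```

Googlebot supports the `*` wildcard in robots.txt patterns, so this matches the parameter wherever it appears in the query string.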
## Google Search Console Parameter handling tool and canonicals (25:33)
When you set parameters in the Parameter handling tool, Google still needs to figure out what the canonical URL is for those URLs.
You can help it by setting up canonicals yourself. In this case, the URLs with parameters will be crawled much less frequently. For big websites, this is a good way to optimize the crawl budget.
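A minimal sketch of such a canonical, pointing a parameterized URL at its clean version (the URLs are hypothetical):

```html
<!-- On https://example.com/shoes?color=red&sort=price -->
<!-- Tell Google the clean URL is the one to index -->
<link rel="canonical" href="https://example.com/shoes">
```

The clean URL itself would carry the same tag as a self-referencing canonical, so every variant points to one consistent address.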
Kristina’s note: Just recently I was trying to come up with a simpler explanation of self-referencing canonicals and how they are connected with URL parameters 🙂
## Google will display review stars in search results only if certain conditions are met (32:41)
- The structured data implementation should be technically and logically correct
- The website should be of reasonable quality
- The review type should be supported
Some time ago Google started showing review rich results only in certain cases, so if your content type doesn’t match the supported ones, there’s nothing you can do about that.
There’s another aspect of that: if you try to apply the wrong schema type, it may cause a manual action, and the review stars will not be displayed in the search results either.
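As a sketch, eligible review markup is attached to a supported type such as Product; the name and rating values below are made up for illustration:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Running Shoe",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "182"
  }
}
</script>
```

Attaching the same `aggregateRating` to an unsupported type (for example, marking up your whole site as a reviewed entity) is exactly the kind of misuse that can trigger a manual action.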
## After a website migration, the old URLs might still be visible in Google in some situations, and that’s OK (46:00)
When you move a website from one domain to another, it’s OK to see the old domain in the search results. For example, if you explicitly search for the old website, it can show up. But if you look at the cache of the page, it will display the new website.
So the old URLs on Google are not an indication of any issue. And it’s not recommended to use the URL removal tool for that (it just hides the URLs, it doesn’t influence indexing). Also, don’t set error status codes on the old domain pages. Make sure to set up 301 redirects instead.
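A minimal sketch of a domain-wide 301 redirect, here as an nginx server block (the domain names are placeholders):

```nginx
server {
    listen 80;
    server_name old-domain.example;
    # Permanently redirect every URL on the old domain
    # to the same path on the new one
    return 301 https://new-domain.example$request_uri;
}
```

Because `$request_uri` carries the original path and query string, each old URL maps to its exact counterpart on the new domain, which is what Google needs to consolidate signals.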
That’s it for today! Subscribe to be always on top of SEO news and best practices!