# Google Webmaster Hangouts Notes – 11 June 2019
Welcome to MarketingSyrup! This post is part of my Google Webmaster Hangouts Notes. I cover them regularly to save you time.
Below are the notes from June 11th. The timestamps of the answers are in brackets. Let’s dive in!
Having the same block across all website pages is fine if… (2:00)
It’s fine to have the same block across all the pages of a website (e.g. Testimonials) if it helps users and generally makes sense.
But if someone specifically looks for the content in this block, it’s hard for Google to understand which page is most relevant for this search, and it will most likely show only one of the pages. Note that Google won’t penalize your website for this.
Having extended structured data (and structured data in general) helps to display more data in search results but doesn’t influence rankings (7:00)
You can extend structured data with different available features to have an opportunity for more data displayed in Google search. It doesn’t give any ranking advantage though.
So you can use any available schema markup, but don’t spend too much time on it: it’s a high-effort, low-impact task.
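As an illustration, extended structured data usually means adding optional properties that make a page eligible for richer search snippets. A minimal JSON-LD sketch (the product name and values are invented for the example):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "27"
  }
}
</script>
```

The `aggregateRating` block is the "extension" here: it can surface star ratings in the snippet, but per John's answer it won't move rankings.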
How and why quality raters review the websites (11:30)
Quality raters help improve Google search results. But they don’t review individual websites. What happens is when a team at Google makes improvements to the algorithms, they send search results pages with and without those changes to the quality raters. The latter go through the results and say which of them are better and why. The Guidelines just help them evaluate those results.
So the raters’ feedback doesn’t directly influence how specific websites are ranking. And there’s no need to optimize a website specifically for the quality raters.
Make sure that your www version of the website is redirected to the non-www version (or vice versa) (14:28)
The website should be reachable both with and without www, with one version redirecting to the other. Otherwise, if, say, the www version of the website doesn’t load, Google might think that the website went offline or is not available anymore.
This is especially critical for website migrations and other large changes.
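For reference, such a host-level redirect typically looks something like this. A minimal sketch assuming nginx; the domain and the choice of non-www as the preferred host are just examples:

```nginx
# Permanently (301) redirect all www traffic to the non-www host.
# Assumes example.com is the preferred version of the site.
server {
    listen 80;
    listen 443 ssl;
    server_name www.example.com;
    return 301 https://example.com$request_uri;
}
```

On Apache, the same effect is usually achieved with a `RewriteRule` in `.htaccess`.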
All the PageRank a page has collected is passed through the links found on that page. So if you have a huge number of links on a page, the share of PageRank each linked page receives will be very small.
In terms of crawling, Google does have a limit on the number of links it follows on a page. But this limit is really high: several thousand URLs, which most (if not all) websites won’t reach anyway.
A rule of thumb, then, is to keep the number of links per page reasonable.
Google opens a page, parses the content, pulls out all the information, and then follows all the links found there. So links at the beginning of an HTML document don’t get more value than links at the end of it.
AMP is not a ranking factor (27:40)
AMP (Accelerated Mobile Pages) helps a lot with improving mobile site speed, but it is not a ranking factor.
At the same time, it may have an indirect effect: AMP improves speed -> speed IS a ranking factor -> rankings improve.
Having a self-referencing canonical is not a strict requirement but is always helpful (28:53)
It’s not critical to have a self-referencing canonical on a page (a canonical tag pointing to the page itself). But having one does help Google pick the URLs that you want to be chosen as canonical.
Having a self-referencing canonical is especially helpful if you have pages with parameters (e.g. UTM parameters). But note that Google uses different methods to choose a canonical page, and there are situations when rel=canonical doesn’t work.
A self-referencing canonical also works well with hreflang tags.
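In practice, a self-referencing canonical is just a `<link>` element in the page’s `<head>` pointing at the page’s own clean URL (the URLs below are made-up examples):

```html
<!-- Served on https://example.com/blue-widgets?utm_source=newsletter -->
<head>
  <!-- Points at the clean URL, so parameterized copies consolidate to it -->
  <link rel="canonical" href="https://example.com/blue-widgets">
</head>
```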
Expired pages can be deleted and return 404s (30:22)
If you have pages that expire quickly, it’s OK to let them return 404.
(The question was initially about job postings that expire within a week or so).
Kristina’s note: But if those expired pages have external links, I’d recommend setting up 301 redirects to other relevant pages returning 200. Otherwise, the authority passed by these external links will be lost.
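Such a redirect can be set up at the server level. A hedged sketch for Apache’s `.htaccess` (the job posting and category URLs are hypothetical):

```apache
# 301-redirect an expired job posting to the most relevant live page
Redirect 301 /jobs/senior-seo-manager-2019 /jobs/marketing
```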
If Google sees spam in your structured data, it can turn off rich snippets and send a notification in GSC (32:08)
If Google sees spam in the structured data or misleading structured data, it can turn off the rich snippet completely for this website. You will get a notification in Google Search Console about this and will be able to submit a reconsideration request once the structured data is fixed.
Reconsideration requests usually take a couple of weeks to be reviewed (33:45)
If your website received a manual action from Google, you cleaned things up and submitted a reconsideration request, it should take Google just a couple of weeks to review it.
You can either have your structured data in one graph or in different blocks (34:32)
Google pulls out the information from a page and then matches the items with their values. In this regard, there’s no difference in having all structured data added in a single block of code or in multiple ones.
You can even have microdata and JSON-LD for different types of info on the same page, this will also work provided there are no errors.
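To illustrate the single-block option: JSON-LD supports an `@graph` array, which bundles several item types into one script tag. Google treats this the same as separate blocks (the organization and page below are invented examples):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@graph": [
    { "@type": "Organization", "name": "Example Co", "url": "https://example.com" },
    { "@type": "WebPage", "name": "Example Page", "url": "https://example.com/page" }
  ]
}
</script>
```

Splitting the two objects into two separate `<script type="application/ld+json">` blocks would work just as well.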
It’s fine to have hreflang set up via a sitemap for one domain and via HTML for another domain (38:25)
While crawling, Google extracts the information from pages and sitemap files and then processes this info independently of the format that was used to add this data to the website.
So it’s OK to have hreflang for one of the domains added in the header of every page and use XML sitemap hreflang for another website in your international setup.
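For context, the two formats carry the same information. In the HTML variant, every page lists all its language alternates in the `<head>` (the domains here are illustrative):

```html
<link rel="alternate" hreflang="en" href="https://example.com/page">
<link rel="alternate" hreflang="de" href="https://example.de/seite">
```

In the sitemap variant, the same annotations sit on each `<url>` entry instead:

```xml
<url>
  <loc>https://example.de/seite</loc>
  <xhtml:link rel="alternate" hreflang="en" href="https://example.com/page"/>
  <xhtml:link rel="alternate" hreflang="de" href="https://example.de/seite"/>
</url>
```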
Google picks up info to show in snippets in the search results algorithmically (39:03)
The initial question was about a website which gets forum snippets even though it’s not a forum. John recommended reviewing the content and trying to find out why a page might be treated as a forum discussion (e.g. many comments that look like a discussion thread), and then changing the page formatting or even hiding the comments from Google.
In any case, the way snippets are displayed doesn’t influence website ranking.
Don’t use the indexing API for pages which are not job postings or live streams (40:39)
The indexing API should be used only for job postings and live stream pages, so it’s not recommended to use it for normal HTML pages.
What you can do instead to get pages (not job postings or live streams) indexed quickly is to use a sitemap file (for many URLs) or the URL Inspection tool in Google Search Console (for individual pages).
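For the sitemap route, a minimal file is enough; including `lastmod` helps Google notice new or updated URLs (the URL and date below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/new-page</loc>
    <lastmod>2019-06-11</lastmod>
  </url>
</urlset>
```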
That’s it for now! Subscribe to the newsletter and I’ll keep you updated 🙂
I cook digital marketing dishes. Take 3 tablespoons of on-page SEO, add 2 pinches of backlinks and sprinkle it all with paid advertising. Season to taste with actionable data from Analytics and bake until golden brown. Serve hot.