# Google Webmaster Hangouts Notes – 9 July 2019
Welcome to MarketingSyrup! This post is part of my Google Webmaster Hangouts Notes series. I cover the hangouts regularly to save you time. You might want to subscribe so you don't miss anything!
## Google can still show and rank pages blocked in robots.txt (22:20)
Google sometimes shows pages blocked in robots.txt in the search results if it sees that they work really well (e.g. other people link to them).
This means that disallowing a page in the robots.txt file doesn’t really prevent it from showing up in the search results; it just prevents Google from accessing the content on that page.
Kristina’s note: Disallowing a page in robots.txt prevents only crawling of this page, not indexing. So if you absolutely don’t want your page to show up in search, use a meta robots noindex tag instead of robots.txt. Keep in mind that Google has to crawl the page to see the noindex tag, so the page must not be blocked in robots.txt at the same time.
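For illustration, here is what the two approaches look like (the paths and URLs are hypothetical). A robots.txt disallow rule only blocks crawling:

```
# robots.txt – blocks crawling, but the URL can still end up indexed
User-agent: *
Disallow: /private-page/
```

While a meta robots noindex tag in the page’s `<head>` keeps it out of the search results entirely, as long as Google can crawl the page and see it:

```html
<!-- keeps the page out of Google's search results -->
<meta name="robots" content="noindex">
```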
## When you change content on a page, Google needs to re-evaluate it (24:30)
When page content is changed, Google takes signals associated with this page and looks at the new content to find new signals it can pick up.
This means that changing content can influence page rankings.
If the changes are minor, the re-evaluation process shouldn’t have much influence on how Google sees and ranks the page. Bigger changes can lead to more significant shifts.
## While changing the content, keep the old URLs whenever possible (25:52)
If you’re changing your content, try to keep the same URLs as before. It makes it a lot easier for Google to associate the old signals with the new content.
If you can’t keep the old URLs, make sure to 301 redirect them to the new ones.
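On an Apache server, a 301 redirect can be set up with a one-line rule in the site’s .htaccess file (the paths and domain below are hypothetical):

```
# .htaccess – permanently redirect the old URL to the new one
Redirect 301 /old-page/ https://www.example.com/new-page/
```

Other servers have their own equivalents (e.g. a `return 301` directive in nginx); what matters is that the redirect returns a 301 status code so Google transfers the old signals to the new URL.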
## Nothing changes from Google’s point of view if the website owner changes and the website stays the same (26:28)
If the owner of a website changes but the website stays the same, it doesn’t influence how Google sees or ranks this website in any way.
But if a domain name expires and somebody buys it and puts a new website there, that’s essentially a new website. Google will need to re-evaluate all the signals for it.
## Don’t use automatic IP redirects based on users’ location if you want Google to index all language versions you have (28:22)
Automatic IP redirects based on the user’s location are not the best option from an SEO perspective. Googlebot always crawls your website from the USA. This means that Google only sees the English version of your website and never gets access to the other language versions, so it can’t index them.
It’s better to have individual landing pages for the individual languages and set up hreflang tags so that Google will be able to show the right language versions of the URLs in the search results.
From a UX perspective, it’s also valuable to give people an option to switch the versions manually if they need to.
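A typical hreflang setup looks like this: each language version links to all the others (and to itself) in its `<head>`, with an `x-default` fallback for users whose language doesn’t match any version. The URLs below are hypothetical:

```html
<!-- hreflang annotations, placed on every language version of the page -->
<link rel="alternate" hreflang="en" href="https://www.example.com/en/" />
<link rel="alternate" hreflang="de" href="https://www.example.com/de/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```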
## A few words on the alt attribute of an image link (47:08)
Google uses alt attributes found in image links in two ways:
- As a part of the page where this image is found
- As an anchor text for the page which is linked through this image
So the alt text should ideally describe the image and be relevant to both pages.
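In markup, an image link is simply an `<img>` wrapped in an `<a>`; the alt text then works roughly like anchor text for the linked page. The URLs and text below are made up for illustration:

```html
<!-- image link: the alt text describes the image
     and acts as anchor text for the linked page -->
<a href="https://www.example.com/blue-widgets/">
  <img src="/images/blue-widget.jpg" alt="Blue widget with a steel handle">
</a>
```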
That’s it for today. What was the most interesting part for you? Share in the comments!
I cook digital marketing dishes. Take 3 tablespoons of on-page SEO, add 2 pinches of backlinks and sprinkle it all with paid advertising. Season to taste with actionable data from Analytics and bake until golden brown. Serve hot.