TRAFFIC
http://www.example.com/dir redirects to http://www.example.com/dir/, which redirects to https://www.example.com/dir, which redirects to the final destination https://www.example.com/dir/.
More efficient is to redirect any of the following URLs:
http://www.example.com/dir
http://www.example.com/dir/
https://www.example.com/dir
directly to:
https://www.example.com/dir/
instead of first redirecting them to the other variation on the same protocol, which results in additional redirects.
When creating the new redirection rules, check if it is beneficial to make the trailing slash optional in the regular expression for the redirection rule. For example:
RewriteEngine On
RewriteRule ^(dir[\/]?)$ https://www.example.com/dir/ [R=301,L]
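The gain from redirecting every variation straight to the final destination can be sketched with a small script. The redirect mappings below are hypothetical stand-ins for the server rules, not real configuration; this is a minimal sketch of counting hops in a chained versus a direct setup.

```python
# Minimal sketch: counting redirect hops for chained vs. direct rules.
# The mappings are hypothetical illustrations, not real server config.

def follow_redirects(url: str, rules: dict[str, str]) -> list[str]:
    """Return the chain of URLs visited until no rule matches."""
    chain = [url]
    seen = {url}
    while url in rules:
        url = rules[url]
        if url in seen:          # guard against redirect loops
            break
        seen.add(url)
        chain.append(url)
    return chain

# Chained rules: trailing slash is fixed first, then the protocol,
# so some URLs take two hops to reach the final destination.
chained = {
    "http://www.example.com/dir": "http://www.example.com/dir/",
    "http://www.example.com/dir/": "https://www.example.com/dir/",
    "https://www.example.com/dir": "https://www.example.com/dir/",
}

# Direct rules: every variation goes straight to the final destination.
direct = {
    "http://www.example.com/dir": "https://www.example.com/dir/",
    "http://www.example.com/dir/": "https://www.example.com/dir/",
    "https://www.example.com/dir": "https://www.example.com/dir/",
}

print(len(follow_redirects("http://www.example.com/dir", chained)) - 1)  # 2 hops
print(len(follow_redirects("http://www.example.com/dir", direct)) - 1)   # 1 hop
```

Fewer hops means less latency for users and less crawl budget spent by Googlebot on intermediate redirects.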
Naked domain vs. WWW
While writing the new redirection rules, choose a primary hostname and set up redirection rules from the non-primary to the primary version on HTTPS. For example, when the WWW hostname is the primary HTTPS version, also redirect all naked domain URLs on HTTPS to the primary WWW version on HTTPS:
https://example.com
https://www.example.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example.com [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
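The effect of the rewrite rules above can be mirrored in a few lines of Python, which is handy for unit-testing the canonicalisation logic before touching the server. This is a minimal sketch; the `PRIMARY_HOST` value and the `canonical` helper are assumptions for illustration, not part of the server configuration.

```python
# Minimal sketch of the naked-domain rule above: any URL on the bare
# hostname is mapped to the same path on the primary WWW/HTTPS host.
from urllib.parse import urlsplit, urlunsplit

PRIMARY_HOST = "www.example.com"   # assumption: WWW is the primary hostname

def canonical(url: str) -> str:
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host == "example.com":      # naked domain -> primary WWW host
        host = PRIMARY_HOST
    return urlunsplit(("https", host, parts.path, parts.query, parts.fragment))

print(canonical("http://example.com/dir/?a=1"))
# https://www.example.com/dir/?a=1
```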
Once all new redirection rules to the HTTPS version are live, continue to the next step.
Configuring Google Search Console
Now that the content has been moved to the HTTPS version, the redirections on the HTTP version are in place, and the XML sitemaps and robots.txt have been updated, it is time to go to Google Search Console and let Google know about the update.
Adding site variations
A minimum of four variations of the domain name need to be present in Google Search Console. These are as follows:
http://example.com
http://www.example.com
https://example.com
https://www.example.com
Verify and add the ones that are not currently present in Google Search Console.
If the website has any subdomains in use, or any subdirectories separately added to Google Search Console, then these also need to be added for both the HTTP and HTTPS versions. For example:
http://m.example.com
https://m.example.com
Create set
Since May 2016, Google Search Console has supported grouping the data[14] of one or more properties as a set. This is extremely useful for the move to HTTPS, so add a set with every relevant HTTP and HTTPS property in Google Search Console. For example, add the following properties to one set:
http://www.example.com/
https://www.example.com/
When using subdomains and/or subdirectories for specific geographic targeting, add additional sets for each geographic target with every relevant HTTP and HTTPS version. For example, add the following two sets:
Set 1:
http://www.example.com/nl/
https://www.example.com/nl/
Set 2:
http://de.example.com/
https://de.example.com/
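Enumerating the required property variations is easy to automate when many domains are involved. The snippet below is a minimal sketch under the assumption that only the naked and WWW hostnames exist; subdomains and geographic subdirectories would need to be appended in the same way.

```python
# Minimal sketch: enumerate the four property variations to register
# in Google Search Console, given the naked domain.
from itertools import product

def variations(domain: str) -> list[str]:
    hosts = (domain, "www." + domain)
    return [f"{scheme}://{host}" for scheme, host in product(("http", "https"), hosts)]

print(variations("example.com"))
```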
Test Fetch and Render
To make sure everything works as intended[15] for Googlebot, use the Fetch and Render tool in Google Search Console to fetch and render the homepage (see Figure 4):
• Go to the homepage of the HTTP version and verify it redirects properly. If everything checks out, click the “Submit to Index” button;
• Once on the homepage of the HTTPS version, verify that it renders correctly. If everything checks out, click the “Submit to Index” button and select the “Crawl this URL and its direct links” option when prompted (see Figure 4).
Note: The submission to the index will also notify Googlebot of the HTTPS version and request that Googlebot start crawling it.
Crawl HTTP version (again)
This time, find the earlier extracted URLs
from the server log files, the XML sitemaps,
and the crawl of the HTTP version. The
names of the files may be:
• logs_extracted_urls.csv
• sitemap_extracted_urls.csv
• crawl_extracted_urls.csv
Use a crawler such as Screaming Frog
SEO Spider to crawl every URL and verify
that all the redirections work as intended,
and that every URL on the HTTP version
redirects to the correct HTTPS version.
When all is working as intended, continue
to the next step.
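Before (or alongside) a full crawl, the expected HTTPS target for each extracted HTTP URL can be derived programmatically. This is a minimal sketch: the file names follow the article, but the CSV layout (one URL per row, URL in the first column) is an assumption, and the live redirect check itself is left to a crawler such as Screaming Frog SEO Spider.

```python
# Minimal sketch: derive the HTTPS target each extracted HTTP URL
# should redirect to, before verifying the live redirects with a crawler.
import csv

def expected_https(url: str) -> str:
    """Swap the scheme of an HTTP URL to HTTPS, leaving the rest intact."""
    assert url.startswith("http://")
    return "https://" + url[len("http://"):]

def load_urls(path: str) -> list[str]:
    """Read one URL per row from files such as logs_extracted_urls.csv."""
    with open(path, newline="") as fh:
        return [row[0] for row in csv.reader(fh) if row]

print(expected_https("http://www.example.com/dir/"))
# https://www.example.com/dir/
```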
Replace robots.txt
At this stage, the robots.txt file on the HTTPS version needs to be updated. Copy the robots.txt file from the HTTP version to the HTTPS version and update the sitemap reference to the new sitemap file. For example:
User-Agent: *
Disallow:
Sitemap: https://www.example.com/sitemap.xml
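The copy-and-update step can be scripted. The helper below is a minimal sketch, assuming the robots.txt contents have already been read into a string; it rewrites only the Sitemap reference, leaving all other directives untouched.

```python
# Minimal sketch: take the HTTP robots.txt content and rewrite its
# Sitemap reference to point at the HTTPS sitemap, as described above.
def migrate_robots(robots: str) -> str:
    out = []
    for line in robots.splitlines():
        if line.lower().startswith("sitemap:"):
            url = line.split(":", 1)[1].strip()
            line = "Sitemap: " + url.replace("http://", "https://", 1)
        out.append(line)
    return "\n".join(out)

http_robots = "User-Agent: *\nDisallow:\nSitemap: http://www.example.com/sitemap.xml"
print(migrate_robots(http_robots))
```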
Figure 4: Example of the Fetch and Render tool in Google Search Console
[14] https://webmasters.googleblog.com/2016/05/tie-your-sites-together-with-property.html
[15] https://support.google.com/webmasters/answer/6066468
iGB Affiliate Issue 62 APR/MAY 2017