Feeds are still used a lot by feed readers and other programs that utilise syndication. Check if the website has any feeds and, when found, verify on the HTTPS version that the href annotations (the link to the article) and the in-content link references (a link within an article) are updated to HTTPS. If not, then depending on the platform that generates the feeds, it may be necessary to talk to the web developers or the IT team to update all links to absolute HTTPS URLs (don't use just "//", as it is unknown where the content may be syndicated to, and whether that destination runs on HTTP or HTTPS).
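As an illustration, a quick automated check can be sketched in a few lines of Python (standard library only); the feed URL is an assumption, and a real audit would cover every feed the site exposes:

import re
import urllib.request

FEED_URL = "https://www.example.com/feed.xml"  # hypothetical feed location

with urllib.request.urlopen(FEED_URL) as response:
    feed_xml = response.read().decode("utf-8", errors="replace")

# Catches both href annotations and in-content references left on HTTP.
for link in sorted(set(re.findall(r'http://[^\s"\'<>]+', feed_xml))):
    print("insecure link in feed:", link)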
Accelerated Mobile Pages If the website is AMP-enabled, the link references to AMP URLs in the source code need to be updated to the absolute HTTPS version. For example, <link rel="amphtml" href="http://www.example.com/amp/"> becomes <link rel="amphtml" href="https://www.example.com/amp/">.
In addition, any internal links, link references, canonicals, asset links, etc. in the source code of the AMP pages need to be updated to the relevant HTTPS version.
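As a rough way to spot stragglers, the sketch below (Python, standard library only; the page URL is an assumption) lists <link> tags whose href still uses HTTP:

import re
import urllib.request

PAGE_URL = "https://www.example.com/"  # hypothetical page to inspect

with urllib.request.urlopen(PAGE_URL) as response:
    html = response.read().decode("utf-8", errors="replace")

# Reports <link> tags (amphtml, canonical, etc.) still pointing at HTTP.
for tag in re.findall(r'<link[^>]*>', html, flags=re.IGNORECASE):
    if 'href="http://' in tag.lower():
        print("still references HTTP:", tag)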
For more information about AMP, visit the AMP Project [13].
Cookies It is also important that no cookies are sent insecurely. Allowing this could expose the data in a cookie, e.g. authentication data, in plain text to the rest of the world. Double-check the server settings so that cookies are secure. For example, with PHP check the php.ini file for the following: session.cookie_secure = True.
With ASP.NET set the following in the web.config file: <httpCookies requireSSL="true" />.
Verify the setting by accessing a page that sets a new cookie and check the HTTP headers for the following: Set-Cookie: mycookie=abc; path=/secure/; Expires=12/12/2018; secure; httpOnly;.
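This verification can also be automated; a minimal sketch (Python, standard library only), assuming a hypothetical login page that sets a cookie:

import urllib.request

with urllib.request.urlopen("https://www.example.com/login") as response:
    for name, value in response.getheaders():
        if name.lower() != "set-cookie":
            continue
        # Compare against the attribute list, not the raw header text, so a
        # path like /secure/ does not mask a missing secure flag.
        attrs = [part.strip().lower() for part in value.split(";")[1:]]
        if "secure" not in attrs:
            print("cookie set without the secure flag:", value)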
Moving to HTTPS Now that the content is prepared and updated for the HTTPS version, it is time to move the website.
Crawl HTTPS version Before completing the switch from HTTP to HTTPS and going live with the HTTPS version to the outside world, Googlebot included, a safety check needs to be performed (see Figure 3).
Crawl the entire HTTPS version, and while crawling, check for the following:
• Any CSS, images, JavaScript, fonts, Flash, video, audio, or iframes being loaded insecurely over HTTP instead of HTTPS;
• Any redirects to the HTTP version;
• Any internal links, canonicals, hreflang, and/or structured data, etc. pointing to the HTTP version;
• Any 40x or 50x errors in the server log files for the HTTPS version.
Figure 3: Example of a security audit in Google Chrome Developer Tools, highlighting mixed insecure content
When no errors occur, continue to the next step. It is important to limit the crawl to the HTTPS version; do not crawl the HTTP version.
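For illustration, the mixed-content part of this check could be sketched as follows (Python, standard library only); the URL is an assumption, and a real crawl repeats this for every page found:

import re
import urllib.request

PAGE_URL = "https://www.example.com/"  # hypothetical; run per crawled URL

with urllib.request.urlopen(PAGE_URL) as response:
    html = response.read().decode("utf-8", errors="replace")

# Flags insecurely loaded assets (src) as well as internal links (href)
# that still point at the HTTP version.
for url in re.findall(r'(?:src|href)="(http://[^"]*)"', html, flags=re.IGNORECASE):
    print("insecure reference:", url)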
Updating the XML sitemap (again) Extract all the URLs from the crawl of the HTTPS version and compare this list with the URLs mentioned in the XML sitemaps. Find out which URLs are live and indexable on the HTTPS version but do not have an entry in the XML sitemaps, and update the XML sitemaps with these missing indexable URLs.
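A minimal sketch of that comparison (Python, standard library only), assuming a hypothetical one-URL-per-line export of the HTTPS crawl and a local copy of the sitemap; checking indexability is left to the crawler:

import xml.etree.ElementTree as ET

with open("https_crawl_urls.txt") as f:  # hypothetical crawl export
    crawled = {line.strip() for line in f if line.strip()}

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
tree = ET.parse("sitemap.xml")  # hypothetical local copy of the sitemap
in_sitemap = {loc.text.strip() for loc in tree.findall(".//sm:loc", ns)}

# URLs that were crawled on HTTPS but have no sitemap entry yet.
for url in sorted(crawled - in_sitemap):
    print("missing from sitemap:", url)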
Redirects Now that all content has been moved and updated, new redirection rules need to be implemented to redirect all HTTP traffic to the relevant HTTPS versions. A simple catch-all Apache solution can be used in the .htaccess of the HTTP version:

RewriteEngine On
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

This redirects any deep pattern on the HTTP version to the HTTPS version. However, it also makes the robots.txt and XML sitemaps on the HTTP version inaccessible, as this catch-all rule redirects any request for these files to the HTTPS version. To prevent this from happening, an exemption rule needs to be added. For example, in the .htaccess for Apache on the HTTP version, this may look like:

RewriteEngine On
RewriteRule (robots.txt|sitemap.xml)$ - [L]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

This will redirect every request on the HTTP version to the HTTPS version, except requests for the robots.txt and sitemap.xml files.
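The behaviour of these rules can be verified from the outside; a minimal sketch (Python, standard library only), with hypothetical paths:

import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Returning None makes urllib surface the 30x instead of following it.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect)
for path in ("/", "/some-page/", "/robots.txt"):  # hypothetical paths
    try:
        response = opener.open("http://www.example.com" + path)
        print(path, response.status, "served directly")  # expected for robots.txt
    except urllib.error.HTTPError as err:
        print(path, err.code, "->", err.headers.get("Location"))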
Move through canonicals When moving content through canonicals, wait to implement the redirection rules until enough of the critical content is indexed and served from the HTTPS version. Once Googlebot has seen most or all of the content on HTTPS, the redirection rules can be pushed live to force Googlebot and the users to the HTTPS version.
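For illustration (the URL is hypothetical): during this phase, a page still served at http://www.example.com/page/ would carry <link rel="canonical" href="https://www.example.com/page/"> in its <head>, so Googlebot gradually indexes the HTTPS URL while visitors can still reach the HTTP page until the redirects go live.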
Reduce redirect chains When implementing the new redirection rules, double check the old redirection rules and update these to point directly to the new HTTPS end destination. Avoid a redirect chain like: HTTP A redirects to HTTP B, which in turn redirects to HTTPS C.
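A chain like that can be spotted by following a legacy URL hop by hop; a minimal sketch (Python, standard library only), assuming a hypothetical legacy URL and absolute Location headers:

import http.client
from urllib.parse import urlsplit

url = "http://www.example.com/old-page/"  # hypothetical legacy URL
hops = 0
while url and hops < 10:  # safety cap against redirect loops
    parts = urlsplit(url)
    Conn = http.client.HTTPSConnection if parts.scheme == "https" else http.client.HTTPConnection
    conn = Conn(parts.netloc)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    print(hops, resp.status, url)
    if resp.status in (301, 302, 307, 308):
        url = resp.getheader("Location")  # assumes an absolute Location header
        hops += 1
    else:
        break

More than one hop before the final 200 indicates a chain that should be collapsed into a single redirect.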
Also keep in mind that some systems may add or remove trailing slashes by default, which can introduce an extra hop into the redirect chain.
[13] https://www.ampproject.org/