As of January 2017, Google (the maker of Chrome) and Mozilla (the maker of Firefox) have implemented this policy to warn people to be careful when filling in forms containing sensitive data, such as passwords or credit card details, over an insecure connection.
In the coming years, the Chrome Security team plans to mark all HTTP (non-SSL) pages, not just login pages or credit card forms, with a red icon indicating that the connection is not secure.
You should strongly consider adding SSL to your entire site, and enforcing it site-wide so that any HTTP requests are redirected to HTTPS URLs.
Since 2014 Google has been giving a rankings boost to sites with SSL.
This means that aside from the obvious benefits of increasing security on your site, you can also improve your ranking in the SERPs (search engine results pages) on Google. Other key factors for ranking are mobile-friendliness and speed. Beyond that “content is king” – meaning your site should be an authority on its subject, be updated regularly, and ideally have some other sites with decent PageRank linking to it. You don’t need a zillion inbound links, especially if they’re low-quality / low PageRank. Focus on quality content, speed, security, and mobile-friendliness / responsiveness.
What it takes to add SSL on your site:
1. An SSL Certificate
You will need an SSL certificate, which you can buy from any number of vendors; we like RapidSSL. Our favorite option, letsencrypt.org, is entirely free, but it does require setting up a cron job to renew the certificate, which can prove difficult in some hosting environments (e.g. shared hosting). You can buy single, multi-domain, or wildcard certificates. A single certificate will generally cover both yourdomain.com and www.yourdomain.com, but not any other subdomains (such as xyz.yourdomain.com). This option is the most common, and the cheapest.
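If you go the Let's Encrypt route, the renewal cron job might look like the following (a sketch assuming the certbot client is installed; the schedule and paths are placeholders, and on shared hosting you may not be able to set this up at all):

```
# /etc/cron.d entry: attempt renewal twice a day. certbot only actually
# renews certificates that are close to expiry, so running it often is safe.
17 3,15 * * * root certbot renew --quiet
```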
2. Use 2048-bit key certificates
This refers to the length of the certificate's RSA key pair. A typical SSL certificate would use a 2048-bit RSA key, with 256-bit symmetric encryption for the actual session traffic. Double-check Google's current recommendations before purchasing a certificate, since these parameters change over time and get continuously more secure.
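Generating a 2048-bit key and the certificate signing request (CSR) you hand to your vendor might look like this (the filenames and domain are placeholders):

```shell
# Generate a new 2048-bit RSA private key and a CSR in one step.
# -nodes leaves the key unencrypted so the web server can read it at startup.
openssl req -new -newkey rsa:2048 -nodes \
  -keyout mysite.key -out mysite.csr \
  -subj "/C=US/ST=State/L=City/O=My Company/CN=www.mysite.com"
```

Keep the .key file private; only the .csr is sent to the certificate vendor.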
3. Use TLS 1.2+
Older SSL protocols are considered insecure and will trigger browser security warnings.
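In Apache with mod_ssl, for example, restricting the server to TLS 1.2 and newer can be done with a single directive (this assumes Apache 2.4; adjust to your setup):

```apache
# Disable all SSL/TLS protocol versions, then re-enable only TLS 1.2+.
SSLProtocol -all +TLSv1.2
```

The Nginx equivalent is `ssl_protocols TLSv1.2 TLSv1.3;` in the server block.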
4. Use relative URLs for resources that reside on the same secure domain
For example, if you have an image, you'd want to use a path of "/images/some-image.jpg" and not "http://www.mysite.com/images/some-image.jpg". This way you can switch the site to https:// without having to rewrite all of those absolute paths. If you're using a CMS such as Drupal, paths will often be root-relative by default, or can be configured to be.
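In your markup, the difference looks like this (the domain and filename are placeholders):

```html
<!-- Root-relative: works over both http:// and https:// -->
<img src="/images/some-image.jpg" alt="Some image">

<!-- Absolute with a hard-coded scheme: must be rewritten when you switch -->
<img src="http://www.mysite.com/images/some-image.jpg" alt="Some image">
```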
5. Use https:// URLs for all other domains
6. Check out Google's advice on moving your site, since Google treats switching from HTTP to HTTPS as a site move.
If you don't use Google Search Console, check it out. With Search Console you can get a sense of how Google sees your site. Google expects you to set up separate properties for each domain/protocol variation, even though it does consider them to be the same site. For example, if your domain is "mysite.com" and you initially built it without SSL, you'd have two properties: http://mysite.com and http://www.mysite.com. If you then add SSL, you will also need to add properties for https://www.mysite.com and https://mysite.com. There are settings you can use to instruct Google to prefer www over non-www URLs, and Google will automatically prefer https URLs once it finds them.
7. Make sure your SSL site's robots.txt does not block search engine crawlers
This advice only applies to sites that have different docroots (file paths) for HTTP vs HTTPS versions of the site. For example, you might have the HTTP site at /var/www/html, but the SSL/HTTPS version at /var/www/html_secure. In that case, you will need to check that /var/www/html_secure/robots.txt does not include “Disallow: /” or similar lines.
If you just have a single site (e.g. at /var/www/html), then you should already have a properly formatted robots.txt file in place which does not block search engine crawlers. It should not contain “Disallow: /” or similar lines.
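A minimal robots.txt that allows all crawlers to access the whole site looks like this:

```
# An empty Disallow value means nothing is blocked.
User-agent: *
Disallow:
```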
8. Avoid the noindex robots meta tag, and allow indexing of your pages by search engines whenever possible
If certain pages must be kept out of search engines, putting the rules in robots.txt is preferable to per-page noindex meta tags, though it is not always feasible.
9. Redirect any HTTP requests to HTTPS
You can and should enforce SSL site-wide so that any HTTP requests are redirected to HTTPS URLs. This can be done using rules in your Apache or Nginx virtual host, or (if using Apache) with .htaccess.
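In Apache, for example, a site-wide redirect can go in the port-80 virtual host (the domain here is a placeholder; a mod_rewrite rule in .htaccess achieves the same thing):

```apache
<VirtualHost *:80>
    ServerName www.mysite.com
    # Send every HTTP request to the same path on HTTPS, permanently (301).
    Redirect permanent / https://www.mysite.com/
</VirtualHost>
```

The Nginx equivalent is a port-80 server block containing `return 301 https://www.mysite.com$request_uri;`.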