A website I frequent has finally decided to enable TLS on their servers, only not to mandate it as a lot of websites out there do. The maintainer claims that TLS must remain optional. Why?
On my own website I have long had mandatory TLS and HSTS with a long max-age, and the weaker cipher suites are disabled. Plaintext access is walled off with an HTTP 301 redirect to the TLS-protected version. Does this affect my website negatively?
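For concreteness, here is a minimal sketch of how I check that behaviour (assuming Python with the third-party requests library; https://example.com stands in for my site):

```python
# Check that plain HTTP answers with a 301 pointing at HTTPS, and that
# the HTTPS response carries an HSTS policy. "example.com" is a placeholder.
import requests

plain = requests.get("http://example.com/", allow_redirects=False)
assert plain.status_code == 301, f"expected 301, got {plain.status_code}"
assert plain.headers["Location"].startswith("https://"), "redirect is not to HTTPS"

secure = requests.get("https://example.com/")
hsts = secure.headers.get("Strict-Transport-Security", "")
print("HSTS policy:", hsts or "missing!")  # e.g. max-age=31536000; includeSubDomains
```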
In this day and age, TLS + HSTS are markers that your site is managed by professionals who can be trusted to know what they're doing. That is an emerging minimum standard for trustworthiness, as evidenced by Google stating they'll give a positive ranking boost to sites that use them.
On the other end is maximum compatibility. There are still older clients out there, especially in parts of the world that aren't the United States, Europe, or China. Plain HTTP will always work (though not always work well; that's another story).
TLS + HSTS: Optimize for search-engine ranking
Plain HTTP: Optimize for compatibility
Depends on what matters more for you.
There is one good reason for simple read-only websites not to use HTTPS.
To truly know the answer to this question, you must ask them. We can, however, make some guesses.
In corporate environments, it's common for IT to install a firewall that inspects incoming and outgoing traffic for malware, suspicious command-and-control activity, content deemed inappropriate for work (e.g. pornography), etc. This becomes much harder when the traffic is encrypted. There are essentially three possible responses:

- block encrypted traffic entirely;
- install a corporate root CA on employees' machines and intercept (MITM) the encrypted traffic; or
- give up on inspection and let encrypted traffic pass through unexamined.
For a concerned sysadmin, none of these options is particularly appealing. There are a great many threats that attack a corporate network, and it is their job to protect the company against them. However, blocking a great many sites entirely raises the ire of users, and installing a root CA can feel a bit scummy, as it introduces privacy and security considerations for users. I remember seeing (sorry, can't find the thread) a sysadmin petitioning reddit when they first turned on HSTS, because he was in exactly this situation and didn't want to block all of reddit simply because he was compelled by the business to block the porn-focused subreddits.
The wheels of technology keep churning ahead, and you'll find many who argue that this sort of protection is old-fashioned and should be phased out. But there are still many who practice it, and perhaps they are the ones your mysterious maintainer is concerned about.
There are several good reasons to use TLS (and only a few marginal reasons not to do so).
Even on static, merely informational sites, using TLS ensures no one has tampered with the data.
Since Google I/O 2014, Google has taken several steps to encourage all sites to use HTTPS:

- HTTPS is used as a (lightweight) signal in Google's search ranking algorithms.
- Chrome has gradually moved towards marking plain-HTTP pages as "Not secure" in the address bar.
The Mozilla Security Blog has also announced Deprecating Non-Secure HTTP: making all new features available only to secure websites and gradually phasing out access to browser features for non-secure websites, especially features that pose risks to users' security and privacy.
There are also several good reasons to enforce TLS
If you already have a widely trusted certificate, why not always use it? Practically all current browsers support TLS and have root certificates installed. The only compatibility problem I've actually seen in years has been with Android devices and a missing intermediate certificate authority, since Android only trusts root CAs directly. This can easily be prevented by configuring the server to send the chain of certificates back to the root CA.
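As a quick way to spot that particular misconfiguration, here is a small sketch using only Python's standard ssl module (example.com is a placeholder). Like Android, Python does not fetch missing intermediates on its own, so a server that omits them will typically fail verification here:

```python
# Attempt a fully verified TLS handshake against the system root store.
# A server that omits its intermediate certificates will usually fail
# with CERTIFICATE_VERIFY_FAILED, mirroring the Android behaviour above.
import socket
import ssl

host = "example.com"  # placeholder for the site under test
context = ssl.create_default_context()

try:
    with socket.create_connection((host, 443), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            print("Handshake OK:", tls.version(), tls.cipher()[0])
except ssl.SSLCertVerificationError as err:
    print("Verification failed (missing intermediate?):", err.verify_message)
```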
If your maintainer would still like to allow HTTP connections without a direct 301 Moved Permanently, say to ensure access from some really old browsers or mobile devices, there is no way for the browser to know that you even have HTTPS configured. Furthermore, you shouldn't deploy HTTP Strict Transport Security (HSTS) without the 301 Moved Permanently redirect.
The problem of various sites configured for both protocols is recognized by The Tor Project and the Electronic Frontier Foundation, and addressed by their multibrowser HTTPS Everywhere extension, which automatically rewrites requests to use HTTPS on sites known to support it.
Mixed content was also a huge problem, due to possible XSS attacks on HTTPS sites through modifying JavaScript or CSS loaded over a non-secure HTTP connection. Therefore, all mainstream browsers nowadays warn users about pages with mixed content and refuse to load it automatically. This makes it hard to maintain a site without the 301 redirects on HTTP: you must ensure that every HTTP page only loads HTTP content (CSS, JS, images etc.) and every HTTPS page only loads HTTPS content. That's extremely hard to achieve with the same content on both.

It all comes down to your security requirements, user choice, and the risk of implicit downgrading. Disabling old ciphers server-side is largely necessary because browsers will happily fall through to absolutely horrible ciphers client-side in the name of user experience/convenience. Making sure nothing of yours that depends on a secure channel to the user can be reached by an insecure method is, of course, also very sound.
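For concreteness, a minimal sketch of that server-side hardening, using Python's standard ssl module (the certificate paths are placeholders, and the cipher string is one reasonable choice rather than the only one):

```python
# Build a server-side TLS context that refuses old protocol versions and
# weak cipher suites, so no client can quietly negotiate downwards.
import ssl

context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain("fullchain.pem", "privkey.pem")  # chain incl. intermediates
context.minimum_version = ssl.TLSVersion.TLSv1_2   # rules out SSLv3, TLS 1.0/1.1
context.set_ciphers("ECDHE+AESGCM:ECDHE+CHACHA20") # forward secrecy, AEAD only
```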
Not allowing me to explicitly downgrade to insecure HTTP, when I've deemed that your blog post about why you like Python more than Ruby (not saying you do, just a generic example) isn't something I mind the spooks or the public knowing I accessed, is just getting in my way for no good reason, on the assumption that HTTPS will be trivial for me.
There are, today, embedded systems which don't have the ability to use TLS out of the box, or ones which are stuck on old implementations (I think it's awfully bad that this is so, but as a power user of [insert embedded device here], I sometimes can't change this).
Here's a fun experiment: try downloading a recent version of LibreSSL from the upstream OpenBSD site over HTTPS with a sufficiently old TLS/SSL implementation. You won't be able to. I tried the other day on a device with an older OpenSSL build from 2012 or so, because I wanted to upgrade this embedded system to more secure, newer software from source - I don't have the luxury of a prebuilt package. The error messages when I tried weren't exactly intuitive, but I presume they appeared because my older OpenSSL didn't support any of the protocol versions or cipher suites the server now requires.
This is one example where the move to HTTPS-only can actually hurt people: if you don't have the luxury of recent pre-built packages and want to fix the problem yourself by building from source, you're locked out. Thankfully, in the LibreSSL case, you can fall back to explicitly requesting HTTP. Sure, this won't save you from an attacker who is already rewriting your traffic, capable of replacing source packages with compromised versions and rewriting all the checksums in the HTTP bodies describing the packages available for download, but it's still useful in the much more common case.
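In that common case, one mitigation is to pin a checksum obtained over a trusted channel beforehand and verify the plain-HTTP download against it. A sketch, with a placeholder URL and digest:

```python
# Verify an HTTP download against a SHA-256 digest obtained out-of-band
# (NOT from the same HTTP page, which the same attacker could rewrite).
import hashlib
import urllib.request

URL = "http://example.com/libressl-x.y.z.tar.gz"  # hypothetical mirror
EXPECTED_SHA256 = "0" * 64                         # placeholder digest

data = urllib.request.urlopen(URL).read()
digest = hashlib.sha256(data).hexdigest()
if digest != EXPECTED_SHA256:
    raise SystemExit(f"checksum mismatch: got {digest}")
print("Checksum OK; proceeding.")
```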
Most of us aren't one unsecured download away from being owned by an APT (Advanced Persistent Threat: security jargon for national intelligence agencies and other extremely well-resourced cyber threats). Sometimes I just want to wget some plain-text documentation or a small program whose source I can quickly audit (my own tiny utilities/scripts on GitHub, for example) onto a box that doesn't support the most recent cipher suites.

Personally, I'd ask this: is your content such that a person could legitimately decide "I'm okay with my accessing this being public knowledge"? Is there a plausible chance of real risk to non-technical people accidentally downgrading to HTTP for your content? Weigh your security requirements, enforced-privacy-for-your-users requirements, and risk of implicit downgrades against the ability of users who understand the risks to make an informed choice, on a case-by-case basis, to go unsecured. It's entirely legitimate to say that for your site there's no good reason not to enforce HTTPS - but I think it's fair to say that there are still good use-cases for plain HTTP out there.
There is a lot of discussion here as to why TLS is good - but that was never asked in the original post.
Maxthon asked 2 questions:
1) Why has a random, un-named site decided to maintain both http and https presences?
2) Is there a negative impact to Maxthon serving only 301 responses to http requests?
With regard to the first question, we don't know why the providers chose to retain both http and https sites. There may be lots of reasons. In addition to the points about compatibility, distributed caching, and some hints about geo-political accessibility, there is also a consideration about content integration and avoiding ugly browser messages about the content being insecure. As Alvaro pointed out, TLS is just the tip of the iceberg with regard to security.
The second question, however, is answerable. Exposing any part of your website via HTTP when it actually requires HTTPS for secure operation provides an exploitable vector for attacks. However, it does make some sense to keep HTTP around in order to identify where traffic is being incorrectly directed to port 80 on your site, and to fix the cause. I.e. there is both a negative impact and the opportunity for a positive impact; the net result depends on whether you are doing your job as an administrator.
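If you do keep port 80 open for that diagnostic purpose, something like the following sketch can show which clients are still arriving over plain HTTP. It assumes the HTTP-only virtual host writes a standard combined-format access log; the path is a placeholder:

```python
# Tally the User-Agents hitting the plain-HTTP virtual host, so the
# misdirected traffic can be tracked down and fixed at its source.
from collections import Counter

agents = Counter()
with open("/var/log/nginx/http-only.access.log") as log:  # placeholder path
    for line in log:
        parts = line.split('"')  # combined format: "request" ... "referer" "agent"
        if len(parts) >= 6:
            agents[parts[5]] += 1

for agent, count in agents.most_common(10):
    print(f"{count:6d}  {agent}")
```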
Sysadmin1138 says that HTTPS impacts SEO rankings. While Google has stated that it does impact rankings, the only reliable studies I have seen suggest the difference is small. This is not helped by people who should know better claiming that, since top-ranked sites are more likely to have an HTTPS presence, an HTTPS presence therefore improves rankings - a textbook confusion of correlation with causation.
This is not a good reason, as it means you have bad/broken/insecure clients, but if there are automated processes accessing resources via the existing http:// URLs, it's possible that some of them don't even support https (e.g. busybox wget, which doesn't have TLS support internally and only added it more recently via an openssl child process) and would break if they were given a redirect to an https:// URL that they can't follow.

I would be tempted to deal with this possibility by writing the redirect rule to exclude unknown (or known-legacy) User-Agent strings from being redirected, and let them access the content via http if they want, so that actual browsers can all benefit from forced https/HSTS. A sketch of that follows.
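A minimal sketch of that idea, as a hypothetical Flask application rather than whatever real web server or rewrite rules would be involved (the LEGACY_AGENTS list is purely illustrative):

```python
# Redirect everything to HTTPS with a 301, except requests whose
# User-Agent matches a known-legacy client that cannot speak TLS.
from flask import Flask, redirect, request

app = Flask(__name__)
LEGACY_AGENTS = ("Wget", "BusyBox")  # hypothetical clients to exempt

@app.before_request
def force_https():
    agent = request.headers.get("User-Agent", "")
    if not request.is_secure and not any(a in agent for a in LEGACY_AGENTS):
        return redirect(request.url.replace("http://", "https://", 1), code=301)

@app.route("/")
def index():
    return "hello over whichever scheme you chose"
```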
In the past, I've had to use HTTP rather than HTTPS because I've wanted to <embed> pages from elsewhere that were themselves served over HTTP, and they won't work otherwise - the mixed-content blocking described above.

There are very few good reasons for using HTTP instead of HTTPS on a website. If your website handles transactions of any kind, or stores any kind of sensitive or personal data, you absolutely must use HTTPS if you want that data to be secure. The only decent reason I can see for not enforcing HTTPS is if your website relies on caching by intermediary proxies, since those cannot cache HTTPS traffic. However, it is often worth sacrificing a bit of performance to ensure the security of your website. It is also possible that your clients may not support HTTPS, but really, in 2017, they should.