HiveBrain v1.2.0

Let OpenSSL decide which TLS protocol version is used by default

Submitted by: @import:stackexchange-codereview

Problem

I use Gem in a Box, a Ruby project that allows you to create personal self-hosted gem repositories. Gem in a Box uses httpclient to connect to the gem repository. After reviewing my server's TLS setup in the aftermath of the OpenSSL Heartbleed bug, I decided to allow only protocol versions of TLSv1 and higher on my server. Quite a few fellow administrators made the same decision, because of the security vulnerabilities in SSLv3 and below.

After I made this change, Gem in a Box refused to work, failing with the following message:

OpenSSL::SSL::SSLError: SSL_connect returned=1 errno=0 state=SSLv3 read server hello A: sslv3 alert handshake failure


I traced this problem back to its source in httpclient. Even though OpenSSL can pick the best protocol version by itself, the library sets it to use only SSLv3 by default. I think this is a very bad idea, so I created a patch that lets OpenSSL make its own decision again by default.

As I am not that familiar with OpenSSL itself, and the documentation of the Ruby binding is severely lacking, I want to avoid introducing more problems than there were originally. Please review my changes and verify that they do what I intended.

It is quite hard to post the actual changes here because they are only meaningful in context, so I will describe what I did:

httpclient has a class called SSLConfig which produces helper objects for configuring OpenSSL sockets. It sets some default configuration in its #initialize method; originally, the @ssl_version instance variable was set to "SSLv3". When the library is used, an instance of SSLConfig is created and its #set_context method is called, which applies the aforementioned defaults to an instance of OpenSSL::SSL::SSLContext, which is then used to create OpenSSL sockets.

I changed the initial value of @ssl_version to the symbol :auto and modified the #set_context method so that it sets the context's version only when @ssl_version holds an actual protocol version.
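A minimal, self-contained sketch of that pattern (simplified for illustration; this is not the actual httpclient source, just the class and method names from the description above with everything else stripped away):

```ruby
require "openssl"

# Simplified sketch of httpclient's SSLConfig pattern as described
# above: defaults live in #initialize and are applied to an
# OpenSSL::SSL::SSLContext in #set_context.
class SSLConfig
  attr_accessor :ssl_version

  def initialize
    # The patch: default to :auto instead of pinning "SSLv3".
    @ssl_version = :auto
  end

  def set_context(ctx)
    # Only pin a protocol version when one was chosen explicitly;
    # with :auto, OpenSSL negotiates the best version itself.
    ctx.ssl_version = @ssl_version unless @ssl_version == :auto
  end
end

ctx = OpenSSL::SSL::SSLContext.new
SSLConfig.new.set_context(ctx)  # leaves the context's version unpinned
```

With :auto the guard in #set_context simply never touches ctx.ssl_version, so the context keeps OpenSSL's own default negotiation behaviour.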

Solution

I've got a few things to say about the code, but having looked at the diff:

-      @ssl_version = "SSLv3"
+      @ssl_version = :auto


and:

-      ctx.ssl_version = @ssl_version
+      ctx.ssl_version = @ssl_version unless @ssl_version == :auto


your code changes seem minimal and to the point, doing exactly what they are supposed to do, and nothing more.

What I think you should add are tests to cover your changes and make sure no one will break your code in the future.
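Such tests could look something like this (a hedged sketch using minitest, which ships with Ruby; the stub class below stands in for httpclient's SSLConfig so the example is self-contained — in the real patch the tests would target the library's own class):

```ruby
require "minitest/autorun"
require "openssl"

# Stand-in for the SSLConfig behaviour under review, so this test
# sketch runs on its own; replace with httpclient's SSLConfig in
# the actual test suite.
class StubSSLConfig
  attr_accessor :ssl_version

  def initialize
    @ssl_version = :auto
  end

  def set_context(ctx)
    ctx.ssl_version = @ssl_version unless @ssl_version == :auto
  end
end

class SSLVersionDefaultTest < Minitest::Test
  def test_default_is_auto
    # Pins down the new default so a future change back to a
    # hard-coded protocol version fails the suite.
    assert_equal :auto, StubSSLConfig.new.ssl_version
  end

  def test_auto_does_not_pin_a_protocol
    ctx = OpenSSL::SSL::SSLContext.new
    StubSSLConfig.new.set_context(ctx)
    # ctx.ssl_version= was never called, so OpenSSL keeps its own
    # negotiation default; reaching here without an exception is
    # the point of the guard.
  end

  def test_explicit_version_is_still_applied
    cfg = StubSSLConfig.new
    cfg.ssl_version = :TLSv1_2
    cfg.set_context(OpenSSL::SSL::SSLContext.new)
  end
end
```

The first test is the important one: it documents that :auto is the intended default, not an accident.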


Context

StackExchange Code Review Q#49106, answer score: 2
