Hello everybody,
Not sure this forum is the right place for my problem...
I am developing an SSL TCP server in Ruby on FreeBSD 10.1 and testing it against a multi-threaded client. When the number of threads on the client side is below 190, there is no problem on the server: all messages are received correctly. But once I increase the number of client threads above 195 (194 is OK ?!), two problems pop up :
NOTE : I don't face either of these problems with the same code without SSL support, even with 1000 concurrent threads on the client side ?!
Problem 1 : Exception ECONNABORTED on the server side
Code:
/usr/local/rvm/rubies/ruby-2.1.5/lib/ruby/2.1.0/openssl/ssl.rb:232:in `accept': Software caused connection abort - accept(2) (Errno::ECONNABORTED)
from /usr/local/rvm/rubies/ruby-2.1.5/lib/ruby/2.1.0/openssl/ssl.rb:232:in `accept'
from server.rb:30:in `block (2 levels) in start_server'
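I am able to work around this by restarting the accept loop in the exception handler. A simplified sketch of what I mean (not the exact code, just the idea):
Code:
begin
  connection = sslServer.accept
rescue Errno::ECONNABORTED, OpenSSL::SSL::SSLError => ex
  # the client aborted during accept(2) or the TLS handshake: log it and keep accepting
  puts "accept failed, retrying : " + ex.message
  retry
end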
Problem 2 : Server stuck
When I increase the number of threads on the client side (e.g. 250), after several seconds the server freezes: no exception is raised and no new connections are accepted. This one is really annoying because there is no way on the server side to know it is frozen.
OS : FreeBSD 10.1
Ruby version : 2.2.1 (tried 2.1.5 as well)
Server code :
Code:
require 'socket'
require 'openssl'

server = TCPServer.new(ip_address, port)

sslContext = OpenSSL::SSL::SSLContext.new
sslContext.cert = OpenSSL::X509::Certificate.new(File.open("cert/cert.pem"))
sslContext.key = OpenSSL::PKey::RSA.new(File.open("cert/key.pem"), SSL_PASSWORD)
sslServer = OpenSSL::SSL::SSLServer.new(server, sslContext)

loop do
  # SSLServer#accept does the TCP accept and the TLS handshake, then returns the SSL socket
  Thread.new(sslServer.accept) do |connection|
    begin
      messageIn = connection.gets
      connection.close
    rescue Exception => ex
      puts "Exception in main loop : " + ex.message
      puts "Backtrace : " + ex.backtrace.join("\n")
    end
  end
end
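One thing I am wondering about for Problem 2 : as far as I understand, SSLServer#accept performs both the TCP accept and the TLS handshake in the accepting thread, so a client that stalls in the middle of the handshake could block the whole loop. A variant I am considering (just a sketch, I have not verified that it avoids the freeze) accepts the plain TCP socket first and does the handshake inside the connection thread:
Code:
loop do
  tcpConnection = server.accept                 # plain accept(2), returns quickly
  Thread.new(tcpConnection) do |tcp|
    begin
      connection = OpenSSL::SSL::SSLSocket.new(tcp, sslContext)
      connection.accept                         # TLS handshake runs in this thread only
      messageIn = connection.gets
      connection.close
    rescue Exception => ex
      puts "Exception in connection thread : " + ex.message
      tcp.close rescue nil
    end
  end
end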
Client code :
Code:
require 'socket'
require 'openssl'

def create_client(host, port)
  begin
    socket = TCPSocket.open(host, port)
    ssl_context = OpenSSL::SSL::SSLContext.new
    ssl_context.cert = OpenSSL::X509::Certificate.new(File.open("lib/cert/cert.pem"))
    ssl_context.key = OpenSSL::PKey::RSA.new(File.open("lib/cert/key.pem"), SSL_PASSWORD)
    ssl_context.ssl_version = :SSLv3
    ssl_context.ssl_timeout = 10
    ssl_socket = OpenSSL::SSL::SSLSocket.new(socket, ssl_context)
    ssl_socket.connect
  rescue Exception => ex
    puts "Exception in create_client"
    sleep 1
    return create_client(host, port)
  end
  return ssl_socket
end
Code:
# n client threads per round, 10 rounds
for j in 1..10 do
  threads = []
  for i in 1..n.to_i do
    threads << Thread.new do
      begin
        socket = create_client(ip, port)
        socket.puts("hello")
        socket.flush
        socket.close
      rescue Exception => ex
        puts "Exception"
      end
    end
  end
  threads.each(&:join)
end
I tried to tune a kernel parameter (sysctl kern.ipc.somaxconn=1024) but it did not change anything...
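I am also not sure whether the Ruby-side listen backlog matters here (I believe TCPServer uses a fairly small default); a minimal sketch of raising it explicitly, assuming TCPServer#listen is the right knob:
Code:
server = TCPServer.new(ip_address, port)
server.listen(1024)   # raise the listen backlog (capped by kern.ipc.somaxconn on FreeBSD)
sslServer = OpenSSL::SSL::SSLServer.new(server, sslContext)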
EDIT : I just tried with the same server running on Mac OS and I am able to have 1000 concurrent threads on the client side without any problem ?!!!