I'm currently trying to set up VSFTPD on an Ubuntu 16.04 server and I want to use FTPS (ideally I would use SFTP, but unfortunately I'm constrained by a legacy system).
I've managed to set it up using the default config and no TLS, and I can connect fine via FileZilla. However, for the past two days I've been trying to enable TLS, and no amount of questions on SE or elsewhere seems to lead to a positive result.
My certificate settings in vsftpd.conf are as follows:
rsa_cert_file=/path/to/fullchain.pem
rsa_private_key_file=/path/to/privkey.pem
allow_anon_ssl=NO
ssl_enable=YES
force_local_data_ssl=YES
force_local_logins_ssl=YES
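In case it's relevant, one way I can sanity-check that the cert and key actually pair up (same placeholder paths as above) is to compare their moduli; both commands should print the same hash:

openssl x509 -noout -modulus -in /path/to/fullchain.pem | openssl md5
openssl rsa -noout -modulus -in /path/to/privkey.pem | openssl md5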
However, I can no longer connect, and FileZilla's console shows the following:
Status: Verifying certificate...
Status: TLS connection established.
Status: Server does not support non-ASCII characters.
Status: Logged in
Status: Retrieving directory listing...
Status: Server sent passive reply with unroutable address. Using server address instead.
Command: LIST
Error: Connection timed out after 20 seconds of inactivity
Error: Failed to retrieve directory listing
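That "unroutable address" line seems to mean the PASV reply carried an internal IP. If it helps, the exact PASV response can be seen by running something like the following from an external machine (user, password and domain are placeholders):

curl -v -k --ssl-reqd --list-only ftp://USER:PASS@ftp.example.com/

The verbose output should include the 227 "Entering Passive Mode" reply with the IP and port the server is handing out.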
This is my first time configuring VSFTPD, so I've been following some tutorials online. They also involved UFW, and I've opened up the ports as shown below.
I've also tried adding the following lines to my vsftpd.conf file:
ssl_tlsv1=YES
ssl_sslv2=NO
ssl_sslv3=NO
require_ssl_reuse=NO
ssl_ciphers=HIGH
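To rule out the handshake itself, the control-channel TLS can be tested directly with something like (domain is a placeholder):

openssl s_client -connect ftp.example.com:21 -starttls ftp

FileZilla already reports "TLS connection established", so I suspect the control-channel handshake is fine and the problem is limited to the data connection.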
I've seen other posts mentioning the pasv_address option, so I've tried adding that to my config with the external IP of my server. Please note the server is hosted on Google Compute Engine, and I've also updated my firewall rules in Compute Engine to allow the same ports that were specified in the tutorial. This doesn't work either.
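For reference, the relevant part of my config now looks roughly like this (the literal IP is a placeholder here):

# placeholder for the external Compute Engine IP
pasv_address=203.0.113.45
# would be YES if pasv_address were a DNS name rather than an IP
pasv_addr_resolve=NO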
I can only assume it's something to do with ports/firewalls or other TLS options, but I'm completely stumped. I guess it doesn't help that I have the Google Cloud network firewall and then UFW in front of this (though disabling UFW has no effect).
My ufw rules look as follows:
20/tcp ALLOW Anywhere
21/tcp ALLOW Anywhere
990/tcp ALLOW Anywhere
40000:50000/tcp ALLOW Anywhere
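Those rules should correspond to commands along these lines:

sudo ufw allow 20/tcp
sudo ufw allow 21/tcp
sudo ufw allow 990/tcp
sudo ufw allow 40000:50000/tcp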
If anyone wants to know more, the tutorial I followed is here: configuring-ftp-access
There don't appear to be any entries in vsftpd.log that would indicate an issue, but turning on verbose logging in FileZilla reveals the following:
Binding data connection source IP to control connection source IP 192.168.1.100
which I presume might be an issue, as that looks like a local IP. I'm stumped as to how to fix this, though, especially as I also have the following in my vsftpd.conf file:
pasv_address=(EXTERNAL GOOGLE COMPUTE IP)
My Google Cloud firewall rules are:
pass-ports: IP ranges: 0.0.0.0/0, tcp:20-21, Allow, priority 1000, network: default
sftp: IP ranges: 0.0.0.0/0, tcp:40000-50000
(These will be locked down by IP eventually, but even testing with everything open I can't get this working.)
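If it's useful to reproduce, I believe those two rules are equivalent to something like the following from the gcloud CLI (default network and project assumed):

gcloud compute firewall-rules create pass-ports --allow tcp:20-21 --source-ranges 0.0.0.0/0
gcloud compute firewall-rules create sftp --allow tcp:40000-50000 --source-ranges 0.0.0.0/0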
Also, I believe I've told vsftpd to use those ports via the following in vsftpd.conf:
port_enable=YES
pasv_enable=YES
pasv_min_port=40000
pasv_max_port=41000
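For completeness, after each change I restart vsftpd; a quick check that it's up and listening on the control port would be something like:

sudo systemctl restart vsftpd
sudo ss -tlnp | grep vsftpd

As far as I understand, the passive ports only show up here once a data connection is actually negotiated, so only port 21 is listed.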
Update
I can now connect to this from the box itself using lftp with the following setting:
set ftp:ssl-force true
I connect via the domain name rather than the IP, as the cert is issued for the domain, so it won't work with the raw IP.
I can then create new directories etc. via the command line. However, if I try to do ls
I get ls at 0 [Making data connection...]
and it just hangs there. I also get an error via an external FTP client such as FileZilla; it just times out at the LIST command:
Command: LIST
Error: The data connection could not be established: ETIMEDOUT - Connection attempt timed out
Response: 425 Failed to establish connection.
Error: Failed to retrieve directory listing
Error: GnuTLS error -15: An unexpected TLS packet was received.
Status: Disconnected from server: ECONNABORTED - Connection aborted
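One more thing I can try from the box itself is forcing active mode in lftp, to see whether only the passive data connection is broken (user and domain are placeholders):

lftp -u USER -e "set ftp:ssl-force true; set ftp:passive-mode off; ls; bye" ftp.example.com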
The only other info I can think might be relevant is that the domain is reverse-proxied by NGINX to a Node app. However, I assume this only applies to ports 80 and 443, so it shouldn't be affecting port 21.
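To double-check that, the ports nginx and vsftpd are actually bound to can be listed with:

sudo ss -tlnp | grep -E 'vsftpd|nginx'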
Does anyone have any ideas?