Server Bug Fix: How to avoid lftp Certificate verification error?

I’m trying to get my Pelican blog working. It uses lftp to transfer the actual blog to one’s server, but I always get an error:

mirror: Fatal error: Certificate verification: subjectAltName does not match ‘blogname.com’

I think lftp is verifying the SSL certificate, and Pelican’s quick setup didn’t account for the fact that I don’t have SSL on my FTP server.


This is the code in Pelican’s Makefile:

ftp_upload: $(OUTPUTDIR)/index.html
    lftp ftp://$(FTP_USER)@$(FTP_HOST) -e "mirror -R $(OUTPUTDIR) $(FTP_TARGET_DIR) ; quit"

which renders in terminal as:

    lftp ftp://[email protected] -e "mirror -R /Volumes/HD/Users/me/Test/output /myblog_directory ; quit"

What I have managed so far is to suppress the SSL check by changing the Makefile to:

lftp ftp://$(FTP_USER)@$(FTP_HOST) -e "set ftp:ssl-allow no" "mirror -R $(OUTPUTDIR) $(FTP_TARGET_DIR) ; quit"

Due to my incorrect quoting I get logged in correctly (lftp [email protected]:~>), but the one-liner feature no longer works and I have to enter the mirror command by hand:

mirror -R /Volumes/HD/Users/me/Test/output/ /myblog_directory

This works without errors or timeouts. The question is how to do this as a one-liner.


In addition I tried:

  • set ssl:verify-certificate/ftp.myblog.com no
  • This trick to disable certificate verification in lftp:

    $ cat ~/.lftp/rc
    set ssl:verify-certificate no

However, it seems there is no rc file in my ~/.lftp directory, so this setting has no chance to take effect.

From the manpage:

-c commands
Execute the given commands and exit. Commands can be separated with a semicolon (;), AND (&&) or OR (||). Remember to quote the commands argument properly in the shell. This option must be used alone without other arguments.

So you want to specify the commands as a single argument, separated by semicolons:

lftp ftp://$(FTP_USER)@$(FTP_HOST) -e "set ftp:ssl-allow no; mirror -R $(OUTPUTDIR) $(FTP_TARGET_DIR) ; quit"

You can actually omit the quit command and use -c instead of -e.
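
For instance, a minimal sketch of the -c variant (the connection moves into an open command, since -c must be used without other arguments per the manpage, and the exit is implicit):

    lftp -c "set ftp:ssl-allow no; open ftp://$(FTP_USER)@$(FTP_HOST); mirror -R $(OUTPUTDIR) $(FTP_TARGET_DIR)"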

I had a similar issue, though my lftp does have ssl support compiled in (Fedora RPM). ssl:verify-certificate false did the trick for me.

no certificate check

echo "set ssl:verify-certificate no" >> ~/.lftp/rc

will solve the problem if you don’t want the certificate to be checked.
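
Note that appending fails if the directory itself is missing (the asker’s situation above), so create it first:

    mkdir -p ~/.lftp && echo "set ssl:verify-certificate no" >> ~/.lftp/rc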

The secure solution, which keeps the certificate check, follows.

What worked for me step by step with lftp:

  1. Get the certificate of the host with openssl s_client -connect <ftp_hostname>:21 -starttls ftp. Near the beginning of the output there is a block like:

     -----BEGIN CERTIFICATE-----
     MIIEQzCCAyu.....XjMO
     -----END CERTIFICATE-----

  2. Copy that whole block, including the BEGIN and END lines, into /etc/ssl/certs/ca-certificates.crt.
  3. In the lftp configuration, reference this certificate file by adding set ssl:ca-file "/etc/ssl/certs/ca-certificates.crt" to /etc/lftp.conf (system-wide).
  4. Then do your sync or whatever with lftp; in my case it is lftp -u "${FTP_USER},${FTP_PWD}" ${FTP_HOST} -e "set net:timeout 10; mirror ${EXCLUDES} -R ${LOCAL_SOURCE_PATH} ${REMOTE_DEST_PATH} ; quit". A scripted version of steps 1–3 is sketched below.
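
A sketch automating steps 1–3, assuming openssl is available; it writes the extracted certificate to a session-local file instead of the system-wide store (the hostname and file path are placeholders):

    # Grab the certificate the FTP server presents on the control channel
    openssl s_client -connect ftp.example.com:21 -starttls ftp </dev/null 2>/dev/null \
      | openssl x509 -outform PEM > /tmp/ftp-server.crt
    # Point lftp at that certificate for this invocation only
    lftp -e "set ssl:ca-file /tmp/ftp-server.crt; ls; quit" ftp://ftp.example.com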

ssl:verify-certificate false didn’t work for me; I was getting a timeout error when “making data connection”.

I followed these instructions by adding set ftp:ssl-allow false to my ~/.lftprc file.

Try using set ftp:ssl-allow no; it worked like a charm for me.

I was also facing a similar sort of SSL certificate verification error. Setting ssl:verify-certificate to no worked for me.

Example:

lftp -c 'set ftps:initial-prot ""; set ftp:ssl-force true; set ftp:ssl-protect-data true; set ssl:verify-certificate no; open -u Username,Password 208.82.204.46; put uploadfilename;'

I read the man pages and found a solution.
Create the file

~/.lftp/rc

and add the following line to it:

set ssl:check-hostname false;

You need the lftp command set ftp:ssl-allow no;

You can execute the command right when connecting:

lftp www.yourdomain.com -u username,password -e "set ftp:ssl-allow no;"

or save the command into ~/.lftprc.

lftp -u username,password host -e "set ftp:ssl-allow no" 

fixed the issue for me

Solved using this:

lftp ftp://$(FTP_USER)@$(FTP_HOST) -e "set ssl:verify-certificate no; mirror -R $(OUTPUTDIR) $(FTP_TARGET_DIR) ; quit"

example:

lftp ftp://[email protected] -e "set ssl:verify-certificate no; mirror -R /Volumes/HD/Users/me/Test/output /myblog_directory ; quit"

Server Bug Fix: FTP Error 530, User cannot log in, home directory inaccessible

I’ve been tasked with setting up an FTP directory for a client of ours. I’m working from a Windows 2008 Server with IIS 7 installed.

To create the FTP user directory I’ve followed this eHow tutorial.

The FTP site is already set up on IIS 7, so I skipped that bit and followed the rest exactly. However, when I try to connect via FileZilla, I get the following errors:

Status: Connecting to xxx.xx.xx.xx:21...
Status: Connection established, waiting for welcome message...
Response: 220 Microsoft FTP Service
Command: USER userFTP
Response: 331 Password required for userFTP.
Command: PASS ********
Response: 530 User cannot log in, home directory inaccessible.
Error: Critical error
Error: Could not connect to server

I’ve double checked the permissions of the user and everything appears to be as it should. If anyone has any advice, I’d be so grateful.

It’s not clear to me from reading your post and the link you provided whether you’re using user isolation. My suggestion would be to decide whether you want user isolation and then start from scratch.

Here’s a link that may help:

http://learn.iis.net/page.aspx/305/configuring-ftp-75-user-isolation/

It is the user isolation setting.

You will need to change it to “Do not isolate users. Start users in: User name directory”.

I just hit this issue and, for anyone googling the error, would like to add the solution that worked on Windows Server 2012 with IIS 8.0. In the end it was very simple: you have to create a LocalUser folder in the FTP root you specified when creating the FTP site, then create your username folders under this folder.

For example: D:\ftp-root\LocalUser\user1

Another cause of this error can be the use of FTP IPv4 Address and Domain Restrictions.

If your IIS FTP Site, or one of its parents including the Default site, is using IPv4 Address Restrictions then you’ll need to ensure that your IP address is allowed.

I had this same issue you’ve described, with the exact same Error returned to FileZilla. Here’s how I fixed it:

  1. Open the IIS Manager
  2. Click on the Sites > Default FTP Site settings
  3. Open FTP IPv4 Address and Domain Restrictions
  4. Ask Google “what is my IP”
  5. Add your public IP address to the allowed list under FTP IPv4 Address and Domain Restrictions
  6. Open Services from the Start Menu
  7. Find the Microsoft FTP Service in the Started Services list
  8. Restart the Microsoft FTP Service

[Screenshot: IIS Manager, FTP IPv4 Address and Domain Restrictions]

We had the same issue (530 User cannot log in, home directory inaccessible). The problem was that a new opening in our firewall (to allow more sessions) let another IP reach our FTP server (we have IP restrictions set up).
The solution was to add that IP to the IP restrictions allow list.

Check the FTP logs recorded by IIS (by default under %SystemDrive%\inetpub\logs\LogFiles\FTPSVC<site-ID>). The status and sub-status codes will give you more information about the issue. Here is a list of the status codes: The FTP status codes in IIS 7.0 and later versions

In my case, this issue occurred because my IIS wasn’t configured for passive mode. After entering a port range and external IP address in the FTP Firewall Support feature, the error message disappeared.

This blog post mentions a few more root causes: 530 User cannot log in, home directory inaccessible

Authorization rules. Make sure to have an Authorization rule that allows the user or anonymous access. Check “IIS > FTP site > FTP Authorization Rules” page to allow or deny access for certain or all users.

NTFS permissions. The FTP users (local or domain users) should have permissions on the physical folder. Right click the folder and go to Properties. In the Security tab, make sure the user has required permissions. You can ignore Shared tab. It is not used for FTP access.

Locked account. If your local or domain account is locked or expired, you may end up seeing the “User cannot log in” error. Check the local user properties or the Active Directory user settings to make sure the account is active.

Other permission issues. The user account may not have “Log on locally” rights, or the “Allow only anonymous connections” security setting may be in effect.

I know you said you double-checked the permissions, but I wanted to verify that you’d checked the file-level permissions as well as the share permissions?

You will need to verify the physical path of the FTP site. Here are the steps to check:

Go to IIS. Right-click on the Default FTP Site, then Manage FTP Site >> Advanced Settings >> Physical Path.

It must be correct or you will find the home directory inaccessible.

Linux HowTo: trust server certificate with lftp

When connecting to a server with lftp, I have the following issue:

Certificate verification: Not trusted: no issuer was found (AA:AA:AA:[...]:AA:AA)

Which indicates at least that the cert verification failed. I would like to whitelist that certificate.
Obviously, disabling certificate verification is not an option due to security concerns.

Here is what I already tried:

  • Following that guide to retrieve certs from the server and use them with set ssl:ca-file. That gave me three certs; I tried each one, then all of them concatenated together, which didn’t change a thing. I also tried ssl:cert-file.
  • using the same method as above with openssl s_client -connect my.server.tld:21 -starttls ftp, which yields only one certificate
  • setting ssl:ca-file to the system’s ca store
  • using gnutls-cli, which works fine with the -s option, as do the above openssl s_client commands.

The certificate seems to be signed by a valid chain of trust, as far as those commands report.

Filezilla works fine, but displays the following warning, which might be related:

Server sent unsorted certificate chain in violation of the TLS specifications

I have no control over the server as I do not host it myself, but the greeter identifies itself as Pure-FTPd.

Other clients that didn’t work (lack of support for ftps, or for the specific server): ftp, ncftp, dolphin (KIO), curlftpfs, tnftp, firefox

The only solution a year later is still to turn off ssl:verify-certificate for specific certificate fingerprints.

set ssl:verify-certificate/{fingerprint1} no
set ssl:verify-certificate/{fingerprint2} no

See lftp closed issue 214 — https://github.com/lavv17/lftp/issues/214#issuecomment-197237482
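
As a sketch, the fingerprint for that exception can be read from lftp’s own “Not trusted” error above, or computed with openssl (the hostname is a placeholder; the error message shows the colon-separated form):

    openssl s_client -connect my.server.tld:21 -starttls ftp </dev/null 2>/dev/null \
      | openssl x509 -noout -fingerprint -sha1
    # Prints e.g.: SHA1 Fingerprint=AA:AA:AA:...:AA:AA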

What worked for me step by step with lftp:

  1. Get the certificate of the host with openssl s_client -connect <ftp_hostname>:21 -starttls ftp. Near the beginning of the output there is a block like:

     -----BEGIN CERTIFICATE-----
     MIIEQzCCAyu.....XjMO
     -----END CERTIFICATE-----

  2. Copy that whole block, including the BEGIN and END lines, into /etc/ssl/certs/ca-certificates.crt.
  3. In the lftp configuration, reference this certificate file by adding set ssl:ca-file "/etc/ssl/certs/ca-certificates.crt" to /etc/lftp.conf (system-wide).
  4. Then do your sync or whatever with lftp; in my case it is lftp -u "${FTP_USER},${FTP_PWD}" ${FTP_HOST} -e "set net:timeout 10; mirror ${EXCLUDES} -R ${LOCAL_SOURCE_PATH} ${REMOTE_DEST_PATH} ; quit"

Linux HowTo: SSH/FTP connection denied after terminal automatically change ComputerName

I was using SSH and FTP connections via the terminal and Atom respectively on my Mac. My local machine (Mac) was able to connect to the Linux PC through SSH and FTP on my home network. When I moved both machines to the office network, my Mac seemed to have changed its ComputerName in the terminal. Previously in the terminal it was [email protected] ~ %; now it is [email protected] ~ %. However, my computer name in System Preferences > Sharing is still MacbookPro, but System Preferences > Network > Advanced > WINS has MOBILE44 as the NetBIOS name.

I remember connecting to the Linux machine through SSH on my office network at least once, using the command ssh -v [email protected], and for FTP I used a .ftpconfig file.

Now when I try to make the connection through the terminal, I get this and it times out after a while:

OpenSSH_8.1p1, LibreSSL 2.7.3
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: /etc/ssh/ssh_config line 47: Applying options for *
debug1: Connecting to 172.17.0.1 [172.17.0.1] port 22.

For FTP connection I get this error:

Error: getaddrinfo ENOTFOUND aceinna-nvidia-laptop.local aceinna-nvidia-laptop.local:22

I would appreciate some help to get the SSH connection working again. Thanks in advance.

Regards,
Rishit

Looks like a DNS issue. The computer names you use end in .local. Although that might have worked at home, the office likely has a different setup and those names are probably not resolvable there. Try to connect using the IP address instead of a computer name. You can also use ping first to make sure you can reach the other machine.
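
For example, using the address from the debug output above (user is a placeholder for the Linux account name):

    ping -c 3 172.17.0.1      # confirm the machine is reachable at all
    ssh -v user@172.17.0.1    # then SSH by IP, bypassing .local name resolution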


Linux HowTo: Securely remotely access files on a HDD attached to a router?

I’m looking for a simple but genuinely secure way to remotely access files on a hard drive attached (via USB 3) directly to my router. (Currently using ASUS RT-AX56U.)

Many routers with USB ports allow you to attach a drive, and often offer an FTP server which, combined with DDNS, means you can access it remotely. However, for some reason, I’ve been unable to find any router which implements either FTP over TLS (FTPS) or FTP through SSH (SFTP) – they all only seem to offer plain old unencrypted FTP, which therefore I don’t want to use.

How can I securely access these files remotely?

I’d like to avoid:

  • Using any proprietary built-in remote access functions, as these are notoriously deeply insecure (no matter what the marketing says)
  • Installing custom firmware unless you can reassure me this doesn’t, itself, introduce new security vulnerabilities?

And when I say “securely access files” what I mean is end-to-end encrypted, i.e. no file or password is ever transmitted over the line in clear text, and no one could intercept the keys either (i.e. encrypted via some standard public key cryptography scheme).

Your first bullet point makes no sense.
You don’t trust any built-in remote access solution (like a VPN), but you are willing to trust a file-transfer protocol hosted on the same router, which might be just as buggy/unpatched/outdated.

Just enable the OpenVPN service (all Asus routers have it) and use that in combination with the on-board FTP or SMB file-sharing.

Asus is fairly good with timely updates, and the OpenVPN implementation is very stable and taken directly from the open-source code. No weird changes that introduce more problems.
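
A sketch of that workflow from a Linux or macOS client, assuming a client profile exported from the router’s OpenVPN page (client.ovpn and the LAN address are placeholders):

    sudo openvpn --config client.ovpn   # bring up the tunnel to the router
    # With the tunnel up, the "plain" FTP session only ever crosses the
    # encrypted VPN link; address the router by its LAN IP:
    lftp ftp://192.168.1.1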

Making Game: Ubuntu SFTP no longer works after server reboot

I was setting up my entire LAMP server on Ubuntu Server 20.04 with guides from this site: https://devanswers.co/

Finally I managed to make everything work, including SFTP, based on https://devanswers.co/configure-sftp-web-server-document-root/

Unfortunately, while trying to set up multiple virtual hosts, I forgot to set ownership of the new directories. While trying to fix the permission errors I redid many steps from the guides. Finally I realized that I should fix the ownership, and for a while everything worked. But after I rebooted the server, I can no longer connect over SFTP. FileZilla returns FATAL ERROR: Network error: Software caused connection abort, and Notepad++ returns this:

Connecting
[SFTP] Host key accepted
[SFTP] Successfully authenticated
[SFTP] Error initialising channel: Socket error: Unknown error 
Unable to connect
Disconnected

What happened? I’m almost sure that I messed something up, since I’m new to Ubuntu. But I don’t know what I did, since I changed a lot of things in the meantime, trying to redo all the steps from the linked guides while unaware that I should change directory ownership.

If I should provide more logs, then please guide me: where can I find those? Since Notepad++ did get authenticated, the server definitely has some logs that could help find the reason for my problems, right?

The issue was with directory chmod and chown. It’s hard to tell right now whether the reboot caused this or whether it was my own intervention. Anyway, I found help here: https://devanswers.co/configure-sftp-web-server-document-root/
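
For reference, a sketch of the ownership/permission pattern an OpenSSH ChrootDirectory setup (as in that guide) requires; the path and the sftpuser name are placeholders:

    # The chroot directory itself must be owned by root and not group- or
    # world-writable, or sshd drops the session right after authentication:
    sudo chown root:root /var/www/example.com
    sudo chmod 755 /var/www/example.com
    # A subdirectory owned by the SFTP user, where uploads actually land:
    sudo chown sftpuser:sftpuser /var/www/example.com/public_html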

Linux HowTo: Vsftpd Implementation “Per-user only one active concurrent session is allowed”

Vsftpd Implementation “Per-user only one active concurrent session is allowed”

I found that Pure-FTPd has the compile option “--with-peruserlimits”. I wonder if vsftpd has this option?

Unfortunately, no (Source).

I couldn’t find anything mentioning a per-user limit on that man page, so I think we can’t do that.
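
For anyone who can live with an approximation, vsftpd does offer global and per-IP caps in vsftpd.conf (the values here are only examples):

    max_clients=50   # total concurrent sessions for the whole server
    max_per_ip=1     # concurrent sessions per client IP, not per user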

Linux HowTo: Ubuntu – FTP file edit permission denied

I have been trying to make an Unturned server for my friends, and I cannot edit ANY files in it.

What I’ve Tried:

  • I can’t delete or upload to the FTP server; when I try to change the file permissions, it denies it.

  • I have changed the write_enable=NO to write_enable=YES in /etc/vsftpd.conf but it didn’t help. I cannot edit the permissions from the FTP server.

  • I have researched the web for answers, but cannot find anything on this anywhere.

Other Details:

  • I am using FileZilla for FTP services.

Goal:

I want to make it so anyone that can see the folder can edit it, and not just the terminal user.

If anyone can help me that would be great.

-Solution-

Solving this took me some time, but I finally figured it out. First, log in to the account you want via FTP (on a local network you may leave the port blank). If you can access the files, you are fine. Next, open a terminal connection to the Ubuntu server itself and log in to an account with sudo permissions. You must then change the owner of the files. I suggest giving the user ownership of everything in order to ensure full access.
First, run cd ~ to change the selected folder, then sudo chown -R [User's Username]:root /home/[username]. That should change the owner of the folder, along with all the files inside it, to the specified user. If you want to change ownership of just a specific file, run sudo chown [User's Username]:root [Directory (example: /home/user/folder/document.txt)]

-Flat out Instructions-

Log in to the user (sudo permissions required), run cd ~, find the folder you want, and run sudo chown -R [User's Username]:root /home/[username]. Or, for a specific file, sudo chown [User's Username]:root [Directory (example: /home/user/folder/document.txt)]
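
To go further and meet the stated goal (“anyone that can see the folder can edit it”), a sketch using a shared group rather than single-user ownership; the group, user, and path names are placeholders:

    sudo groupadd unturned-admins                       # shared group for editors
    sudo usermod -aG unturned-admins alice              # add each editor to it
    sudo chgrp -R unturned-admins /home/steam/unturned  # group-own the tree
    sudo chmod -R g+w /home/steam/unturned              # grant the group write access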
