Server Bug Fix: curl: (25) Failed FTP upload: 553 to vsftpd docker


I’m running your container and trying to send files to it using curl, but it fails.

Running the container

export FTP_USER="test"
export FTP_PASSWORD="test"

docker run \
    --name mock_ftp_server \
    --publish 21:21 \
    --publish 4559-4564:4559-4564 \
    --env FTP_USER="$FTP_USER" \

Sending file

$ curl --upload-file /tmp/ ftp://localhost --user $FTP_USER:$FTP_PASSWORD
% Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                               Dload  Upload   Total   Spent    Left  Speed
0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
curl: (25) Failed FTP upload: 553


What’s the matter here? Do I need to add something?


Based on “VSFTPD 553 error: could not create file” on AskUbuntu, I fixed it by changing the owner of the FTP root directory (/srv/) to the FTP user ftp:

docker run …
docker exec mock_ftp_server chown ftp:ftp -R /srv/
curl …

I’m still waiting for input on the security implications of this solution.
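If you want to script the upload and surface this failure mode explicitly, here is a minimal Python sketch (host and credentials from the question; `is_create_denied` is a hypothetical helper name, not a library function):

```python
from ftplib import FTP, error_perm

def is_create_denied(err):
    # vsftpd answers "553 Could not create file." when the FTP user
    # cannot write to the target directory.
    return str(err).startswith("553")

def upload(host, user, password, local_path, remote_name):
    with FTP(host) as ftp:
        ftp.login(user, password)
        with open(local_path, "rb") as f:
            try:
                ftp.storbinary(f"STOR {remote_name}", f)
            except error_perm as e:
                if is_create_denied(e):
                    raise PermissionError(
                        "server refused to create the file; check directory "
                        "ownership (e.g. chown ftp:ftp on the FTP root)") from e
                raise
```

Against the container above, the upload should go through once the chown fix has been applied.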


Server Bug Fix: PSCP: Upload an entire folder, Windows to Linux


I am using PSCP to upload some files from Windows to Linux. I can do it fine just uploading one file at a time. But I have some very large directories and I want to upload an entire directory at once.

I have tried:

pscp -i C:sitedeployabt-keypair.ppk includes* [email protected]:/usr/local/tomcat/webapps/ROOT/includes/*

Throws error: “pscp: remote filespec /usr/local/tomcat/webapps/ROOT/includes/*: not a directory”


pscp -i C:sitedeployabt-keypair.ppk includes [email protected]:/usr/local/tomcat/webapps/ROOT/includes/

Throws error: “scp: includes: not a regular file”


pscp -i C:sitedeployabt-keypair.ppk includes [email protected]:/usr/local/tomcat/webapps/ROOT/includes

Throws error: “scp: includes: not a regular file”

Two problems: First, the * does not go on the destination side. Second, -r is for copying an entire directory and subdirectories.

pscp -i C:sitedeployabt-keypair.ppk includes* [email protected]:/usr/local/tomcat/webapps/ROOT/includes/

Will copy all of the files in the local includes directory to the .../includes/ directory on the server.

pscp -r -i C:sitedeployabt-keypair.ppk includes [email protected]:/usr/local/tomcat/webapps/ROOT/

Will copy the includes directory itself, including all files and subdirectories, to the .../ROOT/ directory on the server (where the contents of the local directory would merge with any existing .../ROOT/includes/ directory).


You don’t need to use -i for this. It’s for private key file authentication. Just use -r to copy the source files recursively.

You might want a drag-and-drop method since you’re using Windows. You can, for example, use the WinSCP client.

If you want to copy a directory and its contents, you don’t need to provide a file specification for the destination. Just use the directory name, for example:

pscp  -i C:sitedeployabt-keypair.ppk includes* [email protected]:/usr/local/tomcat/webapps/ROOT/includes/

If you want to copy the directory and everything below it then you can use -r:

pscp -r -i C:sitedeployabt-keypair.ppk includes [email protected]:/usr/local/tomcat/webapps/ROOT/includes/

If you want to copy the folder itself with everything under it, use a command like the one below:

pscp -r -i C:\PrivateKeys\MyPrivateKey.ppk C:\FOLDER1 <username>@<server_id>:/home/<username>/

But notice there is no backslash at the end of the folder path “C:\FOLDER1”;
if you use it with a trailing backslash, as in “C:\FOLDER1\”, it doesn’t copy the folder itself, but only copies everything under the folder.
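To keep the two invocations straight, here is a small Python sketch; the `pscp_command` helper is hypothetical (not part of PuTTY) and just builds the argument lists discussed above:

```python
def pscp_command(key_path, source, dest, recursive=False):
    """Build a pscp argument list; key_path is a .ppk private key."""
    cmd = ["pscp"]
    if recursive:
        cmd.append("-r")  # copy directories and everything below them
    cmd += ["-i", key_path, source, dest]
    return cmd

# Copy the *contents* of "includes" into an existing remote directory:
print(pscp_command(r"C:\keys\key.ppk", r"includes\*",
                   "user@host:/usr/local/tomcat/webapps/ROOT/includes/"))

# Copy the "includes" directory itself, recursively, into ROOT/:
print(pscp_command(r"C:\keys\key.ppk", "includes",
                   "user@host:/usr/local/tomcat/webapps/ROOT/", recursive=True))
```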


Server Bug Fix: Upload directory from my NGINX server


I have nginx with RTMP module on my server.
I want to know: is there some way to upload a directory to another server each time a new file is created in it?

My server creates HLS playlists in /directory.

Can I upload this /directory (each time it is updated with a new file) to ?
I can’t set up a cron job to do this, because it would be too slow.
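A cron job isn’t needed for this; a watcher can push each file as it appears. Below is a minimal stdlib Python sketch; the `handle` callback (e.g. an rsync or scp call) is left to you, and in production an inotify-based tool such as inotifywait or lsyncd would react faster than polling:

```python
import os
import time

def new_entries(previous, current):
    # Pure diff: names present now that were not there before.
    return sorted(set(current) - set(previous))

def watch(directory, handle, interval=1.0):
    # Poll `directory` and call handle(path) for each newly created file.
    seen = os.listdir(directory)
    while True:
        current = os.listdir(directory)
        for name in new_entries(seen, current):
            handle(os.path.join(directory, name))
        seen = current
        time.sleep(interval)
```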


Server Bug Fix: Unable to increase IIS 7.5 upload limit (still 404 error)


I need to be able to upload large files to an ASP.NET application. I know that IIS 7.5 enforces a 30 MB request limit by default, and I know that IIS throws a 404 error when you try to upload a file larger than that limit.

I have set a 500 MB upload limit in my application’s web.config and double-checked the 500 MB limit in the IIS console’s Request Filtering settings, successfully.

I still get a 404 error with a file that is only 23.2 MB.

I’m writing on SF because I believe it’s not an application problem but a server configuration problem. What more can I check?

Try the following in the web.config of your site. Look at the docs for what units the values are in.

in <system.web>

<httpRuntime maxRequestLength="2097151" executionTimeout="10000"/>

MSDN docs

in <system.webServer>

    <requestLimits maxAllowedContentLength="1073741824"/>

MSDN docs
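Note that the units differ between the two settings: `maxRequestLength` is in kilobytes, while `maxAllowedContentLength` is in bytes. A sketch of a web.config that sets both to exactly 500 MB:

```xml
<configuration>
  <system.web>
    <!-- kilobytes: 500 * 1024 = 512000 -->
    <httpRuntime maxRequestLength="512000" executionTimeout="10000" />
  </system.web>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- bytes: 500 * 1024 * 1024 = 524288000 -->
        <requestLimits maxAllowedContentLength="524288000" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
```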


Server Bug Fix: Allow anonymous upload for Vsftpd?


I need a basic FTP server on Linux (CentOS 5.5) without any security measures, since the server and the clients are located on a test LAN, not connected to the rest of the network, which itself uses non-routable IPs behind a NAT firewall with no incoming access to FTP.

Some people recommend Vsftpd over PureFTPd or ProFTPd. No matter what I try, I can’t get it to allow an anonymous user (i.e. logging in as “ftp” or “anonymous” and typing any string as the password) to upload a file:

# yum install vsftpd

# mkdir /var/ftp/pub/upload

# cat vsftpd.conf

#anonymous users are restricted (chrooted) to anon_root
#directory was created by root, hence owned by root.root


When I log on from a client, here’s what I get:

500 OOPS: cannot change directory:/var/ftp/pub/incoming

I also tried “# chmod 777 /var/ftp/incoming/”, but got the same error.

Does someone know how to configure Vsftpd with minimum security?

Thank you.

Edit: SELinux is disabled and here are the file permissions:

# cat /etc/sysconfig/selinux

# sestatus
SELinux status:                 disabled
# getenforce

# grep ftp /etc/passwd
ftp:x:14:50:FTP User:/var/ftp:/sbin/nologin

# ll /var/
drwxr-xr-x  4 root root 4096 Mar 14 10:53 ftp

# ll /var/ftp/
drwxrwxrwx 2 ftp ftp 4096 Mar 14 10:53 incoming
drwxr-xr-x 3 ftp ftp 4096 Mar 14 11:29 pub

Edit: latest vsftpd.conf:


#anonymous users are restricted (chrooted) to anon_root

#500 OOPS: bad bool value in config file for: chown_uploads

Edit: with the trailing space removed from “chown_uploads”, the 500 error is solved, but anonymous login still doesn’t work:

client> ./ftp server
Connected to server.
220 (vsFTPd 2.0.5)
Name (server:root): ftp
331 Please specify the password.
500 OOPS: cannot change directory:/var/ftp/pub/incoming
Login failed.
ftp> bye

With user “ftp” listed in /etc/passwd with its home directory set to “/var/ftp”, access rights on /var/ftp set to “drwxr-xr-x”, and /var/ftp/incoming set to “drwxrwxrwx”… could it be due to PAM, maybe? I can’t find any FTP log file in /var/log to investigate.

Edit: Here’s a working configuration to let ftp/anonymous connect and upload files to /var/ftp:


You have created a dir called pub/upload:

# mkdir /var/ftp/pub/upload

But then you configured uploads to go to pub/incoming:


So it’s a simple path mismatch; everything else looks OK.

  1. For anonymous logins, change the “ftp” user’s home directory in /etc/passwd.

    ftp:x:119:131:ftp daemon,,,:/var/ftp/pub/:/bin/false
  2. And add this to your /etc/vsftpd.conf file.

  3. And make sure that the ftp user has access (chmod 755) to enter every directory up to the location /var/ftp/pub/
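For reference, a minimal set of vsftpd.conf options that permits anonymous uploads typically looks like this (a sketch: the option names are standard vsftpd options; adjust the paths to your layout):

```
# Minimal anonymous-upload settings for vsftpd (illustrative sketch)
listen=YES
anonymous_enable=YES
local_enable=NO
write_enable=YES
anon_upload_enable=YES
anon_mkdir_write_enable=YES
anon_root=/var/ftp
```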

I was fighting this problem for hours. vsftpd doesn’t give clear help or suggestions for errors.

You probably have userlist_deny=NO in your conf file. Change it to YES and make sure that the username you’re using isn’t in /etc/vsftpd/user_list or in /etc/vsftpd/ftpusers.

You probably have SELinux enabled. Rather than disabling the whole thing, you can use

/usr/sbin/setsebool -P ftp_home_dir 1

to allow ftp to work correctly.

Looking over this again: if the commands above are what you really typed, then /var/ftp/pub/incoming doesn’t exist, so make sure it does and try again.

Here is what worked for me (he said, while mentally strangling a developer):



create/chown Directories

mkdir /var/ftp
chown nobody:nogroup /var/ftp
mkdir /var/ftp/uploads
chown ftp:ftp /var/ftp/uploads
# edit: for good measure (this gave me grief with pure-ftpd)
chmod 777 /var/ftp/uploads

Change the home dir in


    ftp:x:116:124:ftp daemon,,,:/var/ftp:/usr/sbin/nologin

(The default home dir on Debian is /srv/ftp, so you can make your life
easier by sticking to that dir.)

Uploads are now possible to /var/ftp/uploads.
SELinux/AppArmor is disabled on this system (Debian 10.3).

Note: vsftpd will throw an error message if the initial home directory
itself has rw rights for the ftp user (i.e. chown ftp:ftp on /var/ftp directly).

Edit note: written in the faint hope that this whole text is somewhat understandable.

Is SELinux enabled? Folks in that forum were able to resolve the issue by disabling SELinux.


Code Bug Fix: Grails. Upload files to temp folder and display them in gsp


My intention is to upload images and store them in a temp folder. Then I want to display these images in the .gsp views. The process I’ve been trying to make work is something like this:

First, upload the file from input:

<input id="inputImg" type="file" accept="image/*">

Create the file:

import org.apache.commons.io.FilenameUtils
import org.springframework.web.multipart.MultipartFile

import java.nio.file.Path
import java.nio.file.Paths

def saveFile(MultipartFile inputImg) {

    def contentType = inputImg.getContentType()
    def originalFilename = inputImg.getOriginalFilename()
    def extension = FilenameUtils.getExtension(originalFilename)

    String tempPath = System.getProperty("") + "/uploads"

    File file = new File("$tempPath/$originalFilename")

    if (contentType == 'application/octet-stream') {
        contentType = MimeTypeUtils.getContentTypeByFileName(originalFilename)
    }

    Path filePath = Paths.get(file.toString())
    Path path = Paths.get(tempPath)
    Path relativePath = path.relativize(filePath)

    Avatar avatar = new Avatar(
            path: relativePath.toString(),
            contentType: contentType,
            name: originalFilename,
            extension: extension
    )
}

Once the file is stored in the temp folder, I found this solution, but I’m not sure it’s the best way to do it. I’m trying to process the image with Base64 encoding before sending it to the view:

def filename = user?.avatar?.name
def file = new File("$tempPath/$filename")
def base64file = file?.readBytes()?.encodeBase64()

And finally show it in the gsp:

<img alt="img" src="data:image/*;base64,${base64file}"/>

I would like to know if there is a better way to do this. I don’t know if I’m missing something or if this isn’t a good way to manage files and images…

You are using inline images with Base64 encoding, which is good for displaying relatively small images (up to ~5 kB). The advantage of this approach is that you deliver the page WITH its images in a single HTTP connection.

If the images grow considerably larger (> 1 MB), then you cannot benefit from caching and other nice features, so you have to send the data over the line again and again, and that slows down the user experience.
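The size penalty of Base64 inlining is easy to quantify; a standalone Python sketch (independent of Grails):

```python
import base64

def encoded_length(n_bytes):
    # Base64 emits 4 output characters for every 3 input bytes,
    # so inlined images grow by roughly a third.
    return len(base64.b64encode(b"x" * n_bytes))

print(encoded_length(3))        # 4
print(encoded_length(300_000))  # 400000: a ~300 kB image becomes 400 kB of markup
```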

Another way would be to deliver each image in a separate request.

You could define a controller action like:

class ImageController {

  def image(String id) {
    def file = new File("$tempPath/$id")
    if (!file.exists()) {
      render status: 404
      return
    }
    response.contentType = 'image/jpeg'
    response.withOutputStream { it << file.readBytes() }
  }
}

then in your GSP you put:

<img alt="img" src="${g.createLink( controller:'image', action:'image', )}"/>


Linux HowTo: Why some uploads fail, but succeed from a VPN


I am not sure if this is the right site for this. I thought about Super User, but it may not be the best fit for a networking issue. If this is the wrong site, I apologize and would be happy to have the question moved somewhere more appropriate.

I have a very weird problem:

I am currently in Germany, with a 200/100 Mbps connection from a local company called Pyur.

The issue happens when uploading Docker images to . For each upload, there are 6-7 simultaneous connections opened, and the files are around 100 MB.

Ever since I’ve been with this provider, I get timeouts on these connections, maybe 1/3 of the time.

But when I do the same operation through a VPN, the uploads happen normally.

Everything else works perfectly, no problem at all with the provider.

How should I approach troubleshooting this?

While these uploads fail, the connection otherwise keeps working properly.

GitLab is large enough that if the problem were on their end, they’d be aware of it.
They sit behind Cloudflare, but, again, I’d expect Cloudflare to be aware of this if it were a general problem.

That leaves my provider, but the issue only happens in this one scenario, and the connection works perfectly otherwise.


Ubuntu HowTo: Error opening Serial Port – Arduino


I’m trying to upload code to an Arduino Mega. The serial port is detected the first time I plug in the board, but as soon as I upload, I get an error saying ‘Error opening Serial Port’, and then the Serial Port option is unavailable for some time. The error repeats the next time I try to upload.

Binary sketch size: 1,500 bytes (of a 258,048 byte maximum) Error opening serial port    '/dev/ttyACM0'.
Caused by: Invalid Parameter
  ... 9 more Error opening serial port '/dev/ttyACM0'.

To solve that, I added myself to the group that owns the port device (the dialout group, from memory). Either that, or make the port device world-writable.


Server Bug Fix: Apache 2.2 + SSL large file upload fail


I’m using Apache 2.2 + SSL + WordPress. I can easily upload files up to 1 GB without SSL, but when I enable SSL I can only upload files up to ~250 MB.
The error logs show nothing.

Am I missing some kind of SSL config?

Take a look here; one of these should solve your problem.

Most likely you need to increase the SSL renegotiation buffer size in your Apache config file:

SSLRenegBufferSize 10486000
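In context, the directive goes inside the HTTPS virtual host. A sketch (the vhost skeleton is illustrative; `SSLRenegBufferSize` is the actual mod_ssl directive, whose default is only 131072 bytes):

```apache
<VirtualHost *:443>
    SSLEngine on
    # Buffer up to ~10 MB of request body during an SSL renegotiation;
    # the 128 kB default can abort large POST uploads.
    SSLRenegBufferSize 10486000
</VirtualHost>
```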

The name and location of the conf file differ between distributions.

In Debian you find the conf file here,
