Gzip Stdout Input/output Error
From Ask Ubuntu: input/output error, drives randomly refusing to read/write [closed]

I have an issue with one of our servers running Ubuntu 10.04. It runs BackupPC and collects backups from various machines and servers around the building. On the 8th minute (12:08, 12:18, 12:28, etc.) the backups are transferred to an external hard drive; we have three and rotate one drive for another every day. The problem we are having is that we randomly experience input/output errors. When this happens you cannot read or write to the drive, but it hasn't unmounted, so I can still cd to the mount point /media/backup1. The drives are not faulty, as it happens on all of them, so I'm at a loss as to what the problem could be. Here is an example of the many errors we get:

gzip: stdout: Input/output error
/var/lib/backuppc/backuppc_offline: line 47: /media/backup1/Tue/offline.log: Input/output error
ls: cannot access /media/backup1/Tue/incr_1083_host1.something.co.uk.tar.gz: Input/output error
ls: cannot access /media/backup1/Tue/incr_1088_host1.something.co.uk.tar.gz: Input/output error
ls: cannot access /media/backup1/Tue/incr_1089_host1.something.co.uk.tar.gz: Input/output error
ls: cannot access /media/backup1/Tue/incr_1090_host1.something.co.uk.tar.gz: Input/output error
/var/lib/backuppc/backuppc_offline: line 39: /media/backup1/Tue/offline.log: Input/output error
/var/lib/backuppc/backuppc_offline: line 44: /media/backup1/Tue/offline.log: Input/output error
/var/lib/backuppc/backuppc_offline: line 45: /media/backup1/Tue/incr_1090_host1.something.co.uk.tar.gz: Input/output error
/var/lib/backuppc/backuppc_offline: line 47: /media/backup1/Tue/offline.log: Input/output error
ls: cannot access /media/backup1/Tue/incr_591_tech2.something.co.uk.tar.gz: Input/output error
/var/lib/backuppc/backuppc_offline: line 44: /media/backup1/Tue/offline.lo
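The symptom above — the mount point still exists and cd works, but every read or write returns an I/O error — can be caught early with a small write probe run before each backup window. This is only a sketch: MNT is a scratch directory standing in for /media/backup1, and the probe file name is invented.

```shell
# Sketch of a pre-backup health check. MNT is a scratch stand-in here;
# on the real server it would be /media/backup1.
MNT=$(mktemp -d)

# Write probe: on a mount that has gone into an error state this fails
# with "Input/output error" even though the mount point still exists.
if touch "$MNT/.write_probe" 2>/dev/null; then
    STATUS="writable"
    rm -f "$MNT/.write_probe"
else
    STATUS="write failed - check 'dmesg | tail' for USB or filesystem errors"
fi
echo "$MNT: $STATUS"
```

On a healthy mount the probe succeeds; on the failing drives described above it would report the write failure, and the kernel log usually says why (USB resets, filesystem errors, or a forced read-only remount).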
From a bug-gnu-utils mailing-list reply, Aug 2005 13:38:53 -0700, User-agent: Gnus/5.1007 (Gnus v5.10.7) Emacs/21.4 (gnu/linux). Eduardo Hernández
writes:

> I was thinking that it is because of the size of the file... it's 11GB compressed.

Since gzip created the file, it's unlikely that the file size is the issue. Most likely it's a hardware problem. You can try "dd if=/dev/whatever of=/dev/null ibs=512", or perhaps use a different input block size depending on your device. Of course you'll need a large-file dd. Most dd's are these days. If yours isn't, build and use the latest coreutils.

(Source: https://lists.gnu.org/archive/html/bug-gnu-utils/2005-08/msg00144.html. Related discussions: http://askubuntu.com/questions/15476/input-output-error-drives-randomly-refusing-to-read-write and https://www.quora.com/An-input-output-error-appears-when-using-gzip-with-compressing-an-image-virtual-file-20gb-for-backup-How-do-I-deal-with-it)

From an Ubuntu Forums thread (https://ubuntuforums.org/archive/index.php/t-1421748.html), posted at 09:42 PM:

I'm trying to make a backup to a remote ftp server. The ftp server is on the same network, and so far I have been able to create a file and add a lot of files. The ftp folder is mounted with curlftpfs. I need this because my server runs off a USB stick, so creating a local copy is impossible. These are the commands I used:

sudo mkdir /remoteftp
sudo curlftpfs 192.168.0.1:60021 -o user=admin:admin /remoteftp
sudo tar cvpzf /remoteftp/backup.tgz --exclude=/proc --exclude=/lost+found --exclude=/media --exclude=/mnt --exclude=/sys --exclude=/remoteftp --directory / /

This works fine up to the location /cdrom, at which point tar gives me this:

tar: Exiting with failure status due to previous errors

/cdrom is an empty folder, as there is no CD currently in there. So what exactly is causing tar to lock up? For now, I'm going to go ahead and exclude /cdrom, but I'd really like to know what is causing tar to break. It could happen again later, and that could cause me to miss a lot of backups.
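The dd surface-read test suggested in the bug-gnu-utils reply above can be sketched as follows. DEV here is a scratch file standing in for the real block device (the "/dev/whatever" placeholder from the original advice); on real hardware a failing sector makes dd print its own "Input/output error" and exit nonzero.

```shell
# Stand-in for the device: a scratch file. On the real machine, point DEV
# at the block device itself (e.g. /dev/sdb) and run as root.
DEV=$(mktemp)
dd if=/dev/urandom of="$DEV" bs=512 count=8 2>/dev/null

# The surface-read test from the list reply: read every block with a
# 512-byte input block size and discard the data. A bad sector surfaces
# as an I/O error and a nonzero exit status from dd.
if dd if="$DEV" of=/dev/null ibs=512 2>/dev/null; then
    RESULT="read test passed"
else
    RESULT="read test failed - suspect the hardware"
fi
echo "$RESULT"
rm -f "$DEV"
```

Reading the whole device this way exercises every sector without writing anything, which is why it is a reasonable first check for the "most likely it's a hardware problem" hypothesis.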
EDIT: After another try, I got slightly further, but now tar got stuck and exited when trying to put /vmlinuz in the backup. However, I cannot find this folder at all. From what I understand, vmlinuz is the Linux kernel, but I do not see how this would cause tar to see a new folder. Another exclude until I can find the cause.

EDIT 2: After the third try, I got a "gzip: stdout: Input/output error" after the archive reached 4 kB. Odd, as the ftp server has plenty of room left.

EDIT 3: Fourth try, again a "gzip: stdout: Input/output error", this time at 878 MB. I really don't see why this pops up, as the ftp connection is set to time out after 24 hours, with 500 simultaneous connections allowed. This is the only job using the ftp server.

EDIT 4: Fifth try, "tar: Exiting with failure status due to previous errors". This time, /SElinux is the problem.

Turiko, March 5th, 2010, 06:59 AM: bump, anyone have an idea?

lavinog, March 5th, 2010, 07:32 AM: This might help you understand what is going on: run "ls -l /" and look at cdrom and vmlinuz... you should see that they are symlinks to other files. Since it runs off of a USB stick, could you not just boot a live CD and back up the USB stick with dd to an image? This way recovery would just involve
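lavinog's point about symlinks can be demonstrated in a scratch directory (all names below are invented for the illustration). By default GNU tar stores a symlink as a link rather than following its target, which is why /vmlinuz normally appears in an archive as a link entry, not as the kernel file it points to.

```shell
# Reproduce the /vmlinuz situation in a scratch directory: on the
# poster's system, /cdrom and /vmlinuz are symlinks, not plain entries.
WORK=$(mktemp -d)
mkdir "$WORK/boot"
printf 'kernel image\n' > "$WORK/boot/vmlinuz-2.6"
ln -s boot/vmlinuz-2.6 "$WORK/vmlinuz"   # like /vmlinuz -> /boot/vmlinuz-...

# Default tar behaviour: archive the symlink itself, do not follow it
# (following targets would require -h / --dereference).
tar -C "$WORK" -czf "$WORK/backup.tgz" vmlinuz boot
LISTING=$(tar -tzf "$WORK/backup.tgz")
echo "$LISTING"
```

The listing shows "vmlinuz" as its own member alongside "boot/vmlinuz-2.6"; whether this interacts badly with the curlftpfs mount in the thread above is a separate question that the excerpt does not settle.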
From Quora: An input/output error appears when using gzip when compressing an image virtual file (20gb) for backup. How do I deal with it?

Answer by Matt Mahoney (developed PAQ and ZPAQ; maintains the Large Text Benchmark), written 3 weeks ago:

One possibility is that you ran out of disk space. gzip creates a temporary output file, which might be as large as the input file depending on how compressible it is. When it is finished, it puts a .gz extension on the output file and deletes the input file.

If you are backing up to an external disk, you can use the -c option to compress to standard output and keep the input unchanged, for example:

gzip -c input > output.gz

In this case, output.gz would be on the external disk. When it is finished, you can delete the input yourself.

If you have at least 20 GB of free disk space and you are still getting this error, you may have intermittent connectivity due to a hardware or network problem. In that case, try running it again.
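The -c behaviour described in the answer can be sketched with scratch files (SRC and OUT are stand-ins for the real 20 GB image and the external disk):

```shell
# Demonstrate gzip -c: the input file survives and the compressed stream
# goes to stdout, which we redirect onto the (external) disk.
SRC=$(mktemp)
OUT=$(mktemp -d)                     # stand-in for the external disk
printf 'some backup data\n' > "$SRC"

gzip -c "$SRC" > "$OUT/backup.gz"    # plain 'gzip "$SRC"' would delete SRC

# The original is still there; round-trip the archive to verify it.
[ -f "$SRC" ] && KEPT="input kept"
ROUNDTRIP=$(gunzip -c "$OUT/backup.gz")
echo "$KEPT / $ROUNDTRIP"
```

Because the compressed stream is written directly to the destination, the source disk only ever needs room for the original file, which is the point of the suggestion for a nearly full disk.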