How to delete a huge number of files at a time


 
# 1  
Old 12-29-2010

I ran into a problem on HP-UX with 64 GB of RAM and 20 CPUs.

There are 5 million files, named from file0000001.dat to file9999999.dat, in the same directory, along with some other files with random names.

I was trying to remove all the files from file0000001.dat to file9999999.dat at the same time.

If I use 'rm file???????.dat', trying to remove them all at once, I just get an error output.

If I instead use 'rm file1??????.dat', trying to remove 10% of them at a time, the shell did not respond for several hours and I had to kill the process.

Can any file system expert help? Is there any way to do this at all?

Thanks a lot!
# 2  
Old 12-29-2010
What version of HP-UX?
What type of filesystem on what physical disc arrangement?
Is the filesystem mirrored?
Is NFS or anything slow involved?
What is the approximate total size of the files to be deleted?
What is the size of the directory file and how many inodes?
Code:
ls -lad /directory_name
df -i /directory_name


What was the "error output" mentioned above?

Are there any subdirectories under the directory containing these files?
i.e. Does this "find" command find all the files we want, without unwanted hits and without pointless searching?
Code:
find /directory_tree -type f -name file\?\?\?\?\?\?\?\.dat -print
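
If that find turns out to match exactly the right files, a low-risk first step is to look at a small sample of its output before doing anything destructive. This is only a sketch; the path is the same placeholder used above.
Code:
# Spot-check the first few matches before acting on them
find /directory_tree -type f -name 'file???????.dat' | head -n 20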

# 3  
Old 12-29-2010
This might help:
https://sites.google.com/site/tfsidc...umber-of-files

I suspect you're running into a shell limitation.
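
The limit involved is not the glob expansion itself (the shell can build the word list) but the amount of argument space a single exec'd command such as rm may receive. As a rough check, and assuming getconf is available on the box (it normally is on HP-UX), you can print that limit:
Code:
# Maximum bytes of argument list plus environment for one exec'd command
getconf ARG_MAX

A name like file0000001.dat is 15 characters, so 5 million of them need on the order of 100 MB of argument space once per-argument overhead is added, far beyond the few megabytes that is typical. That is why the single rm fails outright.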
# 4  
Old 12-29-2010
Quote:
Originally Posted by lisp21
If I instead use 'rm file1??????.dat', trying to remove 10% of them at a time, the shell did not respond for several hours and I had to kill the process.
Let it run. Deleting that many files will take a very long time. It probably did delete some of them. Do "ls file1??????.dat | wc -l" to count how many are left.
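
One caution about the counting command: "ls file1??????.dat" expands the glob into a single huge argument list for ls, so on some systems it can trip over the same limit the rm did. A hedged alternative that never builds that list is to let find do the matching (pattern and path are illustrative):
Code:
# Count the remaining files without putting every name on one command line
find . -type f -name 'file1??????.dat' | wc -l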
# 5  
Old 12-29-2010
This will give you some output while deleting, so you can monitor the progress.

Code:
ls file???????.dat | while read file
do
  echo "Deleting file $file"
  rm "$file"
done

# 6  
Old 12-30-2010
Quote:
Originally Posted by rdcwayx
This will give you some output while deleting, so you can monitor the progress.

Code:
ls file???????.dat | while read file
do
  echo "Deleting file $file"
  rm "$file"
done

That is adding about five million fork() calls (one rm per file) to a procedure that is already painfully slow. I must advise against it. Once an hour or so he can count the remaining files using the command I gave. (Put the delete in the background or use a second window.)
# 7  
Old 12-30-2010
Quote:
Originally Posted by rdcwayx
This will give you some output while deleting, so you can monitor the progress.
That'd probably die with 'too many arguments' too, just like the OP's rm did. The answer to 'too many arguments' is not to cram the same oversized argument list into ls instead; the answer is to avoid passing that many arguments at all, because the number you can hand to a single command is limited. On some OSes the limit is surprisingly small.
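
For completeness, the usual way out is to let find match the names and hand them to rm in batches that stay under the limit. This is only a sketch, not something verified on the poster's HP-UX box, and the path is a placeholder:
Code:
# find batches the matched names so no single rm gets an oversized argument list
find /directory_tree -type f -name 'file???????.dat' -exec rm -f {} +

# If this find does not accept the '+' terminator, xargs does the same batching
# (safe here because these names contain no whitespace)
find /directory_tree -type f -name 'file???????.dat' -print | xargs rm -f

Either form can be started in the background and monitored with a periodic find | wc -l count.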