What I need is to avoid copying a file again if its filename already exists under /p2/bkp.
The reason is that the files under /p2/arch/log/ later shrink (their contents get truncated), so I need to keep the files as they were when first copied over.
That way I keep the original files w/o losing their contents.
Note, however, that if a file is being written into /p2/arch/log/file when the cron job starts running, you might copy the initial contents of file to /p2/bkp/file and never pick up data written to the end of the file after the copy completes.
Are files in /p2/arch/log being updated after all of the data initially written to them is in place? If not, and /p2/arch/log and /p2/bkp are on the same filesystem, did you consider just making a hard link from /p2/arch/log/file to /p2/bkp/file?
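A minimal sketch of that idea, using throwaway /tmp stand-ins for /p2/arch/log and /p2/bkp (the real paths would need no copying logic at all). Note the caveat: a hard link shares the inode, so it protects the data against the original being *removed or replaced*, not against it being truncated in place.

```shell
# Demo with /tmp stand-ins for /p2/arch/log and /p2/bkp.
mkdir -p /tmp/arch_log /tmp/bkp
rm -f /tmp/arch_log/file /tmp/bkp/file   # clean slate for the demo

echo 'first full contents' > /tmp/arch_log/file
ln /tmp/arch_log/file /tmp/bkp/file      # no data copied: one inode, two names
rm /tmp/arch_log/file                    # the log-side name goes away...
cat /tmp/bkp/file                        # ...but the data survives via the link
```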
Sorry... I should have been much more verbose about the environment needed to use a hard link as a backup method.
I once worked in a group where a logging daemon was started when the system booted and ran forever to log errors that would occur. It opened the log files with something like:
(note that it is opening a file for writing with no write permissions on the file it creates).
If the open() failed with an EPERM error, it looked to see if another daemon was already running (and quietly exited if one was). Otherwise, it incremented that day's log sequence number and tried again.
At midnight, it closed the current log file, opened a new log file for the new day, hard linked any log files created the previous day to a backup directory, and removed any log files in the log directory that were more than a week old (leaving the link in the backup directory untouched).
Another process ran every few days archiving the log files in the backup directory to tape and removing log files more than three months old from the backup directory.
This scheme worked well for that project, but it assumes a lot about how the logs are created and used that is atypical for most loggers today.
This User Gave Thanks to Don Cragun For This Post:
cp: invalid option -- n
Try `cp --help' for more information.
[root@]# cp --help
Usage: cp [OPTION]... [-T] SOURCE DEST
  or:  cp [OPTION]... SOURCE... DIRECTORY
  or:  cp [OPTION]... -t DIRECTORY SOURCE...
Copy SOURCE to DEST, or multiple SOURCE(s) to DIRECTORY.
Mandatory arguments to long options are mandatory for short options too.
  -a, --archive                same as -dR --preserve=all
      --backup[=CONTROL]       make a backup of each existing destination file
  -b                           like --backup but does not accept an argument
      --copy-contents          copy contents of special files when recursive
  -d                           same as --no-dereference --preserve=link
  -f, --force                  if an existing destination file cannot be
                                 opened, remove it and try again
  -i, --interactive            prompt before overwrite
  -H                           follow command-line symbolic links
  -l, --link                   link files instead of copying
  -L, --dereference            always follow symbolic links
  -P, --no-dereference         never follow symbolic links
  -p                           same as --preserve=mode,ownership,timestamps
      --preserve[=ATTR_LIST]   preserve the specified attributes (default:
                                 mode,ownership,timestamps), if possible
                                 additional attributes: context, links, xattr, all
  -c                           same as --preserve=context
      --no-preserve=ATTR_LIST  don't preserve the specified attributes
      --parents                use full source file name under DIRECTORY
  -R, -r, --recursive          copy directories recursively
      --remove-destination     remove each existing destination file before
                                 attempting to open it (contrast with --force)
      --sparse=WHEN            control creation of sparse files
      --strip-trailing-slashes remove any trailing slashes from each SOURCE argument
  -s, --symbolic-link          make symbolic links instead of copying
  -S, --suffix=SUFFIX          override the usual backup suffix
  -t, --target-directory=DIRECTORY  copy all SOURCE arguments into DIRECTORY
  -T, --no-target-directory    treat DEST as a normal file
  -u, --update                 copy only when the SOURCE file is newer than the
                                 destination file or when the destination file is missing
  -v, --verbose                explain what is being done
  -x, --one-file-system        stay on this file system
  -Z, --context=CONTEXT        set security context of copy to CONTEXT
      --help                   display this help and exit
      --version                output version information and exit
By default, sparse SOURCE files are detected by a crude heuristic and the corresponding DEST file is made sparse as well. That is the behavior selected by --sparse=auto. Specify --sparse=always to create a sparse DEST file whenever the SOURCE file contains a long enough sequence of zero bytes. Use --sparse=never to inhibit creation of sparse files.
The backup suffix is `~', unless set with --suffix or SIMPLE_BACKUP_SUFFIX. The version control method may be selected via the --backup option or through the VERSION_CONTROL environment variable. Here are the values:
  none, off       never make backups (even if --backup is given)
  numbered, t     make numbered backups
  existing, nil   numbered if numbered backups exist, simple otherwise
  simple, never   always make simple backups
As a special case, cp makes a backup of SOURCE when the force and backup options are given and SOURCE and DEST are the same name for an existing, regular file.
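As an illustration of the numbered backup behaviour described above (throwaway /tmp paths, GNU cp assumed):

```shell
# GNU cp: keep a numbered backup of an existing destination instead of
# silently overwriting it.
rm -f /tmp/dest /tmp/dest.~*

echo v1 > /tmp/src
cp --backup=numbered /tmp/src /tmp/dest   # dest is new: no backup made
echo v2 > /tmp/src
cp --backup=numbered /tmp/src /tmp/dest   # dest exists: old copy kept
# /tmp/dest now holds v2; /tmp/dest.~1~ holds the previous v1
```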
Assuming that you just want to copy files from the directory /p2/arch/log (and not from subdirectories of that directory), you could try something like the following. It is untested, but should come close to what you were trying to do:
This should work with any shell based on Bourne shell syntax (e.g., ash, bash, dash, ksh, sh, zsh, etc.), but will not work with csh and its derivatives.
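The script itself didn't survive the quote; a sketch of what such a loop could look like (paths from the thread, and as untested against the real environment as the original was):

```shell
# copy_new SRC_DIR DST_DIR: copy each regular file from SRC_DIR into
# DST_DIR unless a file of that name is already there, so the first
# (full-size) copy is never overwritten by a later truncated version.
copy_new() {
    src=$1
    dst=$2
    mkdir -p "$dst"
    for file in "$src"/*; do
        [ -f "$file" ] || continue          # skip subdirectories etc.
        dest="$dst/${file##*/}"
        [ -e "$dest" ] || cp -p "$file" "$dest"
    done
}

# e.g. from the cron job:  copy_new /p2/arch/log /p2/bkp
```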