
uniq(1) [minix man page]

UNIQ(1) 						      General Commands Manual							   UNIQ(1)

NAME
uniq - delete consecutive identical lines in a file

SYNOPSIS
     uniq [-cdu] [-n] [+n] [input [output]]

OPTIONS
     -c   Give count of identical lines in the input
     -d   Only duplicate lines are written to output
     -u   Only unique lines are written to output
     -n   Skip the first n columns when matching
     +n   Skip the first n fields when matching

EXAMPLES
     uniq +2 file       # Ignore first 2 fields when comparing
     uniq -d inf outf   # Write duplicate lines to outf

DESCRIPTION
     Uniq examines a file for consecutive lines that are identical. All but one copy of each run of identical lines is deleted, and the result is written to output. The +n option skips the first n fields, where a field is defined as a run of characters separated by white space. The -n option skips the first n columns. Fields are skipped first.

SEE ALSO
     sort(1)

UNIQ(1)
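
As a small, hedged illustration of the -c and -u flags described above (the file name words is made up for the example; the input is sorted first so that identical lines become adjacent):

     sort words | uniq -c    # prefix each surviving line with the number of times it appeared
     sort words | uniq -u    # keep only the lines that were never repeated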

Check Out this Related Man Page

UNIQ(1) 						    BSD General Commands Manual 						   UNIQ(1)

NAME
uniq -- report or filter out repeated lines in a file

SYNOPSIS
     uniq [-cdu] [-f fields] [-s chars] [input_file [output_file]]

DESCRIPTION
     The uniq utility reads the standard input comparing adjacent lines, and writes a copy of each unique input line to the standard output. The second and succeeding copies of identical adjacent input lines are not written. Repeated lines in the input will not be detected if they are not adjacent, so it may be necessary to sort the files first.

     The following options are available:

     -c         Precede each output line with the count of the number of times the line occurred in the input, followed by a single space.

     -d         Don't output lines that are not repeated in the input.

     -f fields  Ignore the first fields in each input line when doing comparisons. A field is a string of non-blank characters separated from adjacent fields by blanks. Field numbers are one based, i.e. the first field is field one.

     -s chars   Ignore the first chars characters in each input line when doing comparisons. If specified in conjunction with the -f option, the first chars characters after the first fields fields will be ignored. Character numbers are one based, i.e. the first character is character one.

     -u         Don't output lines that are repeated in the input.

     If additional arguments are specified on the command line, the first such argument is used as the name of an input file, the second is used as the name of an output file.

     The uniq utility exits 0 on success, and >0 if an error occurs.

COMPATIBILITY
     The historic +number and -number options have been deprecated but are still supported in this implementation.

SEE ALSO
     sort(1)

STANDARDS
     The uniq utility is expected to be IEEE Std 1003.2 (``POSIX.2'') compatible.

BSD                                January 6, 2007                                BSD
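
A brief, hypothetical usage sketch of the BSD options documented above; the file names and the fixed-width timestamp are illustrative assumptions, not part of the man page:

     sort access.log | uniq -c | sort -rn    # count identical lines, most frequent first
     sort -k 2 events.txt | uniq -f 1        # compare lines while ignoring the first field
     uniq -s 10 trace.txt                    # ignore the first 10 characters, e.g. a fixed-width timestamp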

14 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Sort and Unique in Perl

Hi, may I ask: if a pipe-separated file is large, what is the best method to count the unique values in the 3rd column and to get a list of those unique values? Thanks in advance! (20 Replies)
Discussion started by: deepakwins
20 Replies
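
The thread asks about Perl, but the shape of the answer is easy to sketch in plain shell; the | delimiter comes from the post, while the file name data.txt is an assumption:

     cut -d'|' -f3 data.txt | sort -u           # list the unique values of the 3rd column
     cut -d'|' -f3 data.txt | sort -u | wc -l   # count how many unique values there are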

2. Shell Programming and Scripting

Column sum group by uniq records

Dear All, I would like help with the case below. I have a file like this: saman 1 gihan 2 saman 4 ravi 1 ravi 2 and I want to get this result: saman 5 gihan 2 ravi 3. Please help me. (17 Replies)
Discussion started by: Nayanajith
17 Replies
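
A minimal awk sketch of the per-key sum being asked for (names.txt is a placeholder file name; the output order of the keys is not guaranteed):

     awk '{sum[$1] += $2} END {for (k in sum) print k, sum[k]}' names.txt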

3. Shell Programming and Scripting

extract unique pattern from large text file

Hi All, I am trying to extract data from a large text file. I want to extract lines that contain a five-digit number followed by a hyphen, like 12345-. I tried egrep, e.g. egrep "+" text.txt, but that returns all the lines containing any number of digits followed by a hyphen, ... (19 Replies)
Discussion started by: shijujoe
19 Replies
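
A hedged version of the pattern being described: {5} bounds the digit run, and ruling out a preceding digit keeps longer numbers from matching (text.txt is the file name used in the post):

     grep -E '(^|[^0-9])[0-9]{5}-' text.txt    # exactly five digits, then a hyphen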

4. Shell Programming and Scripting

Concatenating column values with unique id into single row

Hi, I have a table in Db2 with data, say, id_1 phase1 id_1 phase2 id_1 phase3 id_2 phase1 id_2 phase2. I need to concatenate the values like id_1 phase1,phase2,phase3 id_2 phase1,phase2. I tried a recursive query but in vain, as the string to be concatenated is quite long. ... (17 Replies)
Discussion started by: jsaravana
17 Replies
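
If the data can be exported to a flat two-column file, the grouping itself is a short awk sketch (the file name input.txt and the space-separated layout are assumptions; the Db2-side recursive query problem is a separate matter):

     awk '{ if ($1 in a) a[$1] = a[$1] "," $2; else a[$1] = $2 }
          END { for (k in a) print k, a[k] }' input.txt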

5. Shell Programming and Scripting

How to identify the occurrence of a pattern between a unique character?

Hi, is it possible to find the number of occurrences of a pattern between two parentheses? For example, I have a file as below. >>{ >>hi >>GoodMorning >>how are you? >>} >>is it good, >>tell me yes, if it is good In the above file, it is clear that the occurrence of the word "Good"... (17 Replies)
Discussion started by: divak
17 Replies
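
Assuming, as the posted example suggests, that the block is delimited by lines consisting of just { and } (the >> markers being the forum's quoting), that the word being counted is Good, and that the file is simply called file, one awk sketch is:

     awk '$0 == "{" {inside = 1; next}
          $0 == "}" {inside = 0}
          inside    {n += gsub(/Good/, "&")}   # gsub returns the number of matches on the line
          END       {print n + 0}' file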

6. UNIX for Dummies Questions & Answers

Getting non unique lines from concatenated files

Hi All, Is there a way to get NON-unique lines from 2 or more concatenated files? Basically I have several files which are very similar with the exception of a few lines, and I want to find out which lines are different in each file. A very simple example: file1 contains: 1 2 3 4 5; file2... (122 Replies)
Discussion started by: pawannoel
122 Replies
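
For duplicates across files the classic uniq pair covers both directions, once the lines have been sorted so matches become adjacent (file1 and file2 are the names from the post; each file is assumed to have no internal duplicates):

     sort file1 file2 | uniq -d    # lines present in more than one file
     sort file1 file2 | uniq -u    # lines present only once, i.e. the differences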

7. Shell Programming and Scripting

[uniq + awk?] How to remove duplicate blocks of lines in files?

Hello again, I want to remove all duplicate blocks of XML code in a file. This is an example: input: <string-array name="threeItems"> <item>item1</item> <item>item2</item> <item>item3</item> </string-array> <string-array name="twoItems"> <item>item1</item> <item>item2</item>... (19 Replies)
Discussion started by: raidzero
19 Replies
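
If each <string-array> block is separated from the next by a blank line, awk's paragraph mode gives a compact sketch (the blank-line assumption and the file name strings.xml are not from the post):

     awk 'BEGIN {RS = ""; ORS = "\n\n"} !seen[$0]++' strings.xml   # keep the first copy of each block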

8. Shell Programming and Scripting

Compare multiple files and print unique lines

Hi friends, I have multiple files. For now, let's say I have two in the following style: cat 1.txt cat 2.txt output.txt Please note that my files are not sorted, and in the output file I need an extra column that says which file each line came from. I have more than 100... (19 Replies)
Discussion started by: jacobs.smith
19 Replies
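
One awk sketch that keeps only lines seen in a single file and tags them with that file's name (1.txt and 2.txt come from the post; this assumes a line does not repeat inside any one file):

     awk '{count[$0]++; from[$0] = FILENAME}
          END {for (line in count) if (count[line] == 1) print line, from[line]}' 1.txt 2.txt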

9. Shell Programming and Scripting

Removing Dupes from huge file- awk/perl/uniq

Hi, I have the following command in place: nawk -F, '!a++' file > file.uniq It has been working perfectly as per requirements, removing duplicates by taking into consideration only the first 3 fields. Recently it has started giving the error below: bash-3.2$ nawk -F, '!a++'... (17 Replies)
Discussion started by: makn
17 Replies
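
The array subscript appears to have been lost from the posted one-liner; given the stated intent (dedupe on the first three comma-separated fields), the idiom is normally written along these lines:

     nawk -F, '!seen[$1,$2,$3]++' file > file.uniq   # keep the first line for each (field1, field2, field3) combination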

10. Shell Programming and Scripting

Fetching record based on Uniq Key from huge file.

Hi, I want to fetch 100k records from a file which looks like the sample below. XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX ... (17 Replies)
Discussion started by: lathigara
17 Replies
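
With the sample obscured, only a generic sketch is possible; assuming the wanted keys sit one per line in keys.txt (a made-up name) and appear literally inside each record, grep's fixed-string file mode is a common starting point:

     grep -F -f keys.txt hugefile > fetched_records   # keep every record containing any listed key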

11. Shell Programming and Scripting

Using grep and a parameter file to return unique values

Hello Everyone! I have updated the first post so that my intentions are easier to understand, and also attached sample files (post #18). I have over 500 text files in a directory. Over 1 GB of data. The data in those files is organised in lines: My intention is to return one line per... (23 Replies)
Discussion started by: clippertm
23 Replies
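
The post is truncated, so this is only a guess at the shape of the task: take each pattern from a parameter file and keep a single matching line for it across the directory (params.txt and the *.txt glob are placeholder names):

     while IFS= read -r pat; do
         grep -h -m 1 -- "$pat" *.txt    # first matching line for this pattern, file names suppressed
     done < params.txt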

12. Shell Programming and Scripting

Filter uniq field values (non-substring)

Hello, I want to filter a column based on string values: all substring matches are filtered out and only the unique master strings are kept. infile: 1 abcd 2 abc 3 abcd 4 cdef 5 efgh 6 efgh 7 efx 8 fgh Outfile: 1 abcd 4 cdef 5 efgh 7 efx. I have tried awk '!a++; match(a, $2)>0'... (32 Replies)
Discussion started by: yifangt
32 Replies
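
A two-pass awk sketch that reproduces the sample output: the first pass collects every value in column 2, the second prints only the first occurrence of values that are not a substring of some other, longer value (infile is the name used in the post):

     awk 'NR == FNR {vals[$2]; next}
          seen[$2]++ {next}
          {for (v in vals) if (v != $2 && index(v, $2)) next; print}' infile infile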

13. Shell Programming and Scripting

Unique entries in multiple files

Hello, I have a directory with log files (many of them). Lines look like this: Sep 1 00:05:05 server9 pop3d-ssl: LOGIN, user=abc@example.com, ip=, port= Sep 1 00:05:05 server9 pop3d-ssl: LOGOUT, user=abc@example.com, ip=, port=, top=0, retr=0, rcvd=12, sent=46, time=0 Sep 1 00:05:05... (19 Replies)
Discussion started by: ramirez987
19 Replies
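
If the goal is one line per distinct user across all the logs, the user= field shown in the sample lines can be pulled out directly (the *.log glob is an assumption):

     grep -ho 'user=[^,]*' *.log | sort -u    # -o prints just the user= field, -h drops the file names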

14. Shell Programming and Scripting

Add unique identifier from file to filetype in directory

I am trying to add a unique identifier to files with the extensions .bam and .vcf in a directory located at /home/cmccabe/Desktop/index/R_2016_09_21_14_01_15_user_S5-00580-9-Medexome. The identifier is in $2 of the input file. What the code below is attempting to do is strip off the last portion... (21 Replies)
Discussion started by: cmccabe
21 Replies
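
The post is cut off before the code, so this is only a generic rename sketch under stated assumptions: the identifier is the second whitespace-separated field on the first line of input.txt (a placeholder name), and it gets inserted before the extension of every .bam and .vcf file in the directory mentioned in the post:

     dir=/home/cmccabe/Desktop/index/R_2016_09_21_14_01_15_user_S5-00580-9-Medexome
     id=$(awk '{print $2; exit}' input.txt)    # second field of the first line
     for f in "$dir"/*.bam "$dir"/*.vcf; do
         [ -e "$f" ] || continue               # skip if the glob matched nothing
         mv "$f" "${f%.*}_${id}.${f##*.}"      # sample.bam -> sample_<id>.bam
     done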