What is going to be the best way to search for and remove duplicates from over 50 GB worth of text files and merge them into one? I figure a GUI app will just hang, so I need a CLI-style tool with threading support, or a way to do it on Linux.
Auslogics Duplicate File Finder is free and should have no problem with 50 GB of text files. Make sure you download it directly from the site to avoid the CNET downloader, and don't accept the Auslogics toolbar. It's fairly intuitive and makes it easy to get rid of duplicates.
To concatenate your text files on Windows, you could try this in a cmd window run as admin:
copy *.txt bigfile.txt
This copies all of the matching text files into one big file. To be safe, give the output file a name that doesn't match the wildcard (or write it to a different folder) so copy doesn't try to include its own output. No idea if this will work with the volume of files you have.
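Since you mentioned Linux: if the duplicates you care about are duplicate lines across the files, GNU sort can merge and deduplicate in a single pass, and it spills to temporary files so it handles data larger than RAM. A minimal sketch, assuming GNU coreutils and that the files sit in the current directory (the buffer size, thread count, temp path, and merged.txt name are placeholders to tune for your machine):

sort -u -S 4G --parallel=4 -T /path/to/tmp *.txt > merged.txt

Here -u drops duplicate lines, -S sets the in-memory buffer, --parallel sets the number of sort threads, and -T points the spill files at a disk with enough free space (roughly the size of the input). If by duplicates you instead mean whole duplicate files, a tool like fdupes -r can find those first, and then the concatenation step merges whatever remains.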
Do you want each collection of duplicates merged together, or did you mean you want to merge all the remaining text files into one file at the end of this procedure? – iglvzx – 2012-05-05T20:55:00.627
I would like to merge every text file into one without duplicates – Rachel Nark – 2012-05-06T02:05:46.933