I am using our university network for my research work and was given a new PC with Ubuntu 18.04 and an Intel i7-7700. I work a lot with machine learning applications, so I am constantly moving huge datasets (~30-50 GB) made up of relatively small files. On Linux, I run into severe performance problems when doing so, especially when using Nautilus.
For example, when working with the AVA dataset (255,530 .jpg images), Nautilus takes FOREVER (~3 minutes??) simply to open the folder and display its contents! Note that I am using the list layout, without any thumbnails that have to be rendered, or anything like that - just the filenames, and that's it. On my Windows machine, opening similar image folders only takes about 2-3 seconds. Am I doing something wrong, or is this a general Linux-specific problem?
Help and suggestions greatly appreciated! :)
Windows is probably benefiting from file indexing here, perhaps? Not sure if there's a similar thing in Ubuntu? – Smock – 2019-09-16T12:24:37.840
Use shell commands; once you have trained yourself for some time you will be whizzing along. (Tip: www.tldp.org contains at least two "guides" on bash.) In the process of learning to use bash you will be automating all your tasks to a reasonable level. – Hannu – 2019-09-16T16:43:36.153
I have tried the same thing within the command line - the standard ls command also takes forever, as it has to stat and sort each file (multiple times). Removing the stat and sort steps as described in https://unix.stackexchange.com/questions/120077/the-ls-command-is-not-working-for-a-directory-with-a-huge-number-of-files helped a bit. – masterBroesel – 2019-09-17T06:21:25.113
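The stat-and-sort-avoiding approach mentioned in the comment above can be sketched as follows. This is a minimal illustration, not the thread's exact commands; the directory name `demo_many_files` and the file count are made up for the demo:

```shell
# Create a demo directory full of small files (stand-in for a large dataset)
mkdir -p demo_many_files
for i in $(seq 1 1000); do touch "demo_many_files/file_$i.jpg"; done

# ls -f disables sorting (and implies -a), so entries stream out in
# raw directory order instead of being collected, stat'ed, and sorted first
ls -f demo_many_files | head -n 5

# find also avoids sorting and can filter by name cheaply without
# stat-ing every entry for metadata
find demo_many_files -maxdepth 1 -name '*.jpg' | wc -l
```

On directories with hundreds of thousands of entries, skipping the sort and the per-file metadata lookups is usually what turns a multi-minute listing into a near-instant one.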