Why not use root folder (C:\) as a documents directory?

0

What performance or security reasons, if any, are there to avoid placing a large number of documents (say, 15,000 PDFs) in the root of the C:\ drive of a server or PC?

Scenario: I have a client who likes to place documents directly in the root of C:\ (via FTP), because selecting a subfolder "costs an extra click". Unfortunately, my role is only to provide feedback on the files they place in the root and how they impact their server, NOT on the FTP access. The client's internal IT department ultimately makes the decisions on FTP usage.

TaterJuice

Posted 2017-02-06T18:27:26.713

Reputation: 111

So point the FTP server at the desired subfolder as its root. There's nothing saying an FTP server has to expose the entire system drive. Unix FTP programs do not do this, on purpose, because it's a massive security hole. Indeed, typical Unix FTP configurations — when enabled at all these days — use chroot, jails, and such in order to keep the FTP user from escaping the box they've been put in. – Warren Young – 2017-02-06T18:38:31.160

The path you mention requires elevation by default in order to create or modify a file. This means an application creating a file in that location has permissions it doesn't actually need. – Ramhound – 2017-02-06T19:13:37.733

You do realize that answers like this are why IT security is so bad, right? Try this, sometime: say "no". – Warren Young – 2017-02-06T19:36:26.923

+1 @WarrenYoung again, I agree; however, I'm not the decision-maker on the underlying issue. – TaterJuice – 2017-02-06T19:39:47.560

Maybe not, but do they not employ you for your technical expertise? This is very much a technical matter. "But I waaaant it!" should not be a sufficient rebuttal to the hard technical fact that you're dropping your pants with this move. – Warren Young – 2017-02-06T19:42:20.770

Configure their FTP client to go to a different directory. In other words, configure the FTP server to point them to a different directory, and make it impossible to navigate outside of that directory. – Ramhound – 2017-02-06T19:44:02.460

Were it so easy. I have been asked to provide a security and performance white paper which can be passed to their internal IT department before any changes can be approved or put into production. All final decisions are made by the client's internal IT, and my role is to support those decisions, though I disagree with them. This white paper IS my opportunity to make a change. – TaterJuice – 2017-02-06T19:53:50.433

Answers

3

File hierarchies provide the ability to limit who can see which files exist; this is done by denying permission to scan a directory. Files in the root directory are visible to everyone who can see the drive, and if the files inherit permissions from the root, they are also accessible to everyone.
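On a Unix host that hiding is literal: removing group/other access bits from a folder keeps other accounts from even listing the file names inside. A minimal Python sketch (the folder and file names are illustrative; on Windows the analogue is an NTFS ACL set on the folder, e.g. via `icacls` or the Security tab):

```python
import os
import stat
import tempfile

# Create a dedicated documents subfolder rather than using the drive root.
docs = os.path.join(tempfile.mkdtemp(), "documents")
os.mkdir(docs)
open(os.path.join(docs, "invoice.pdf"), "w").close()

# Owner-only access: other accounts cannot list or open anything inside,
# so even the file names stay private.
os.chmod(docs, 0o700)

mode = stat.S_IMODE(os.stat(docs).st_mode)
print(oct(mode))  # no group/other bits remain set
```

Files dropped in a drive's root get no such protective parent; every account that can see the drive can enumerate them.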

Normally, users have a default directory set on the drive, usually their own home directory.

Many file systems have a limit on the number of files a directory can contain. I've had to remove huge numbers of files from Windows directories when they prevented patching. That is a severe security concern.

Depending on how the directory is indexed, performance can degrade severely as the number of files increases.
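The usual remedy for both the per-directory limits and the lookup slowdown is to shard the documents across subfolders so no single directory grows huge. A sketch in Python (the file names and the two-hex-digit bucket scheme are illustrative assumptions, not something from the question):

```python
import hashlib
import os
import tempfile

def shard_path(root, filename, width=2):
    """Return a path that buckets the file into a subfolder named by the
    first hex digits of a hash of its name, keeping each directory small."""
    digest = hashlib.sha256(filename.encode("utf-8")).hexdigest()
    sub = os.path.join(root, digest[:width])
    os.makedirs(sub, exist_ok=True)
    return os.path.join(sub, filename)

root = tempfile.mkdtemp()
for i in range(1000):
    open(shard_path(root, f"report-{i:05d}.pdf"), "w").close()

# With width=2 there are at most 256 buckets, so ~4 files per directory
# here instead of 1000 entries in one flat folder.
print(len(os.listdir(root)))
```

With 15,000 PDFs, the same scheme keeps every directory around 60 entries, so directory scans and index updates stay cheap; a date- or customer-based folder scheme works just as well if the names are meaningful.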

FTP itself is considered a security risk and should have limited access to the file system. Configure your FTP server to start in a dedicated subdirectory. If possible, configure it so that it can only work within that subdirectory.
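As a concrete illustration of "start in a dedicated subdirectory": on a Unix host running vsftpd, the relevant settings might look like the sketch below (the path is illustrative; on a Windows server, IIS FTP achieves the same effect by pointing the FTP site's physical path at the subfolder and enabling its user-isolation options):

```ini
# /etc/vsftpd.conf (sketch)
# Land local users in a dedicated documents directory, not the drive root.
local_root=/srv/ftp/documents
# Jail users inside that directory so they cannot navigate out of it.
chroot_local_user=YES
# Still allow uploads within the jail.
write_enable=YES
```

Either way, the uploader's "extra click" disappears: the FTP session simply opens inside the documents folder, and the rest of the drive is unreachable.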

BillThor

Posted 2017-02-06T18:27:26.713

Reputation: 9 384

FTR, I believe the file count limitation per folder on Windows machines is 65,555 – TaterJuice – 2017-02-06T20:36:39.193