A company with 50 graphic designers is working on the same file server, mainly in InDesign. A typical project is a 60-page InDesign document with 1.5GB of linked files (roughly 100 PSD, JPG, and AI files). They continuously edit the linked files, which puts a lot of stress on the server. All workstations have onboard SSDs (~700 MB/s) that are used only for the system and apps.
I wonder if there is any way to use the local drive as a cache for a remote folder. Let's say we assign 200GB for caching files. Each time a file is accessed, the client would check the last-change date on the server and retrieve the file only if the cached copy is obsolete. If a file changes on the server, it notifies the workstations. Think of it as Dropbox or Google Drive, but backed by a local server.
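To make the idea concrete, here is a minimal sketch in Python of the validation logic I have in mind (the paths and the 200GB cache location are hypothetical): compare the server file's modification time against the cached copy and only re-copy when the cache is stale.

```python
import os
import shutil

CACHE_ROOT = "/Volumes/LocalCache"       # hypothetical: cache area on the local SSD
SERVER_ROOT = "/Volumes/ProjectsServer"  # hypothetical: the mounted server share

def cached_path(rel_path):
    """Fetch a file through the cache, re-copying only if the server copy is newer."""
    src = os.path.join(SERVER_ROOT, rel_path)
    dst = os.path.join(CACHE_ROOT, rel_path)
    src_mtime = os.path.getmtime(src)
    # Copy when the file is missing locally or the cached copy is stale.
    if not os.path.exists(dst) or os.path.getmtime(dst) < src_mtime:
        os.makedirs(os.path.dirname(dst), exist_ok=True)
        shutil.copy2(src, dst)  # copy2 preserves the mtime for the next check
    return dst

# e.g. open the cached copy instead of the server copy:
# psd = cached_path("Book2024/links/cover.psd")
```

A real solution would of course need change notification from the server and write-back for edited files; this only shows the read path I'm describing.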
If anyone knows of a solution of this kind, please point me to it.
If there isn't one, I really wonder why; it's a need I see in every company of this kind.
Adobe recommends copying files locally, working on them, then re-uploading. This is really painful, and impossible when three designers are working on different chapters of the same book that share a lot of assets.
(Note: this issue is solved, but I'm leaving it here for consistency with the comments :) Over SMB, CPU usage on the server is huge; I think InDesign watches the linked files for changes and constantly polls the server. If a linked file is updated, InDesign immediately shows a sign next to it. (Solution: when using AFP to mount the remote folders, CPU usage on the server is normal, so there is definitely an issue with the SMB implementation on OS X.)
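For anyone who wants to stay on SMB: a commonly suggested tweak on OS X clients is to disable SMB packet signing in `/etc/nsmb.conf`, which reduces per-request protocol overhead. I haven't verified that it fixes this particular InDesign polling load, so treat it as an assumption worth testing:

```
# /etc/nsmb.conf on each OS X client (create the file if it doesn't exist)
[default]
signing_required=no
```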
Anyway, I'm still looking for a way to mount remote folders... with a local cache.