Alternate question title: How can I translate “This software gives me the creeps” into a business case for upper management not to buy it?
I am the IT Department for a small company that has experienced several years of sustained growth. We started out on QuickBooks, shifted to a different accounting system, and are now in the market for a mid-market ERP system that is a little more comprehensive and customizable. We are currently evaluating an ERP system written in Visual FoxPro 9, and I have a bad feeling about it, but I can't enumerate exactly why.
It consists of several modules; the back-office module and the web module are the two we are interested in. The back-office module contains the usual ERP order/fulfillment/shipping/accounting functions. The web module is driven by the same FoxPro database: IIS is pointed at a .NET component that opens the database files from another machine via a UNC path. I'm not sure about that either, but it's a separate issue for now.
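If I understand the vendor correctly, the .NET component must be opening the shared tables with something like the following OLE DB connection string. `VFPOLEDB.1` is the standard Visual FoxPro OLE DB provider name; the share path is my guess at their layout, not something I've confirmed:

```
Provider=VFPOLEDB.1;Data Source=\\server\erp\data;
```

So even the "web" tier is ultimately doing raw file access against the same shared folder, just from IIS instead of a desktop.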
My concern is that the system is ‘installed’ by doing the following:
1. Create a top-level folder on a server.
2. Share that folder with the appropriate users and groups as \\server\erp.
3. Unzip the .exe, the DLLs, and the \data folder into the shared folder.
4. Map \\server\erp to a drive letter on the client computers.
5. Create a shortcut to \\server\erp\erp.exe on each client desktop.
6. Double-click the shortcut, and (after some other minimal setup) you're ERPing!
The .exe reads and writes files in the \\server\erp\data subdirectory to populate forms and so on, as usual.
I am concerned that 25 or more concurrent users will be running one .exe that performs its database functions by accessing files directly through the network file system (CIFS/SMB). That seems...suspect. Every other system I've seen uses a separate database engine, either a home-grown one (which is bad enough) or something like SQL Server, Oracle, PostgreSQL, or even MySQL, to handle data access. This one is just a ~10 MB .exe in a shared folder, run directly from that share on each client's desktop, opening .dbf files in the adjacent \data directory. That seems inefficient, or at least inelegant, and likely to generate a lot of excess network traffic. The vendor asked whether we had a gigabit network (we do), and it seemed very important to him; now I can see why he was asking.
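To make my worry concrete, consider what a .dbf actually is: a flat file with a fixed-size header followed by fixed-width records. The header constants below are the standard xBase/VFP layout; the sample file is fabricated for illustration, nothing from the actual product. The point is that fetching record N is a seek-and-read against the raw file, so over a mapped drive every single lookup becomes SMB round trips from every client:

```python
import struct, tempfile, os

def make_sample_dbf(path, records):
    """Write a minimal fake .dbf: 32-byte header, field terminator, rows."""
    rec_len = 16                      # fixed-width records in this sample
    header_len = 32 + 1               # 32-byte header + 0x0D terminator
    with open(path, "wb") as f:
        # version byte, YY/MM/DD, record count, header length, record length
        f.write(struct.pack("<B3BIHH", 0x30, 24, 1, 1,
                            len(records), header_len, rec_len))
        f.write(b"\x00" * (32 - 12))  # remaining reserved header bytes
        f.write(b"\x0d")              # end-of-field-descriptors marker
        for r in records:
            f.write(b" " + r.ljust(rec_len - 1))  # ' ' = not deleted

def read_record(path, n):
    """Fetch record n: a seek+read against the raw file. Over \\server\erp
    this means the network file system carries every byte; there is no
    server-side engine to filter, cache, or arbitrate."""
    with open(path, "rb") as f:
        header = f.read(32)
        count, header_len, rec_len = struct.unpack_from("<IHH", header, 4)
        assert n < count
        f.seek(header_len + n * rec_len)
        return f.read(rec_len)[1:].rstrip()  # skip the deletion flag

path = os.path.join(tempfile.mkdtemp(), "customer.dbf")
make_sample_dbf(path, [b"ACME", b"GLOBEX", b"INITECH"])
print(read_record(path, 1))  # -> b'GLOBEX'
```

Multiply that by indexes, memo files, and 25 users and the gigabit question answers itself.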
I don't have a deep development background, but it seems to me that you should have a separate database engine that communicates with clients over a named pipe or TCP/IP socket, or at least some sort of binary network protocol. Using SMB/NetBIOS file sharing (you enter UNC paths into the application as configuration properties) just seems wrong: wouldn't you run into file-locking issues if, for example, two users open the same customer in A/R?

Am I just being overly cautious? Is this really standard practice, as the vendor says? I don't have much experience with larger accounting systems like this. Our current package uses a client-server model: a database engine owns the files, and the software on the users' machines talks to it over the network. Am I wrong to expect that something purportedly more advanced would have a similar architecture?
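Here is the locking scenario I keep picturing, as a toy sketch in plain Python (a fabricated "balance" file, nothing from the actual product). Two clients both read the same value, compute, and write back; without a record lock the second write silently clobbers the first. I understand shared-file systems like FoxPro mitigate this with byte-range locks requested through the network file system, but that puts correctness at the mercy of SMB lock semantics rather than a single database engine:

```python
import tempfile, os

path = os.path.join(tempfile.mkdtemp(), "balance.txt")
with open(path, "w") as f:
    f.write("100")                # opening balance

def read_balance():
    with open(path) as f:
        return int(f.read())

def write_balance(v):
    with open(path, "w") as f:
        f.write(str(v))

# The bad interleaving: both clients read before either writes.
a = read_balance()                # client A sees 100
b = read_balance()                # client B also sees 100
write_balance(a + 50)             # A posts a $50 payment -> 150
write_balance(b - 30)             # B posts a $30 charge -> 70; A's update is lost
print(read_balance())             # -> 70, not the correct 120
```

A central engine serializes those two updates in one process; a shared folder leaves it to every copy of the .exe to lock correctly, every time, over the wire.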