4

Alternate question title: How can I translate “This software gives me the creeps” into a business case I can present to upper management for not buying it?

I am the IT Department for a small company which has experienced several years of sustained growth. We started out on QuickBooks, shifted to a different accounting system, and are now in the market for mid-market ERP systems that are a little more comprehensive and customizable. We are presently evaluating an ERP system that is written in Visual FoxPro 9, and I have a bad feeling about it, but I am not able to enumerate exactly why this is so.

It consists of several modules; the back-office module and a web module are the two we are interested in. The back-office module contains the usual ERP order/fulfillment/shipping/accounting functions. The web module is driven by the same FoxPro database, by pointing IIS at a .NET component that opens the database files from another machine via a UNC path. I don't know about that either, but that's a separate issue, for now.

My concern is that the system is ‘installed’ by doing the following:

1.  Create a top-level folder on a server.
2.  Share that folder with the appropriate users and groups as \\server\erp.
3.  Unzip the .exe, the DLLs, and the \data folder into the shared folder.
4.  Map \\server\erp to a drive on the client computers.
5.  Create a shortcut to \\server\erp\erp.exe on client desktops.
6.  Double-click the shortcut. You're ERPing! (after some other minimal setup)

The .exe accesses files in the \\server\erp\data subdirectory to populate forms and so on, as usual.
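
To illustrate what I believe each client is effectively doing, here is a minimal sketch; the path, table name, and field name are hypothetical, and it uses the third-party dbfread package rather than anything the vendor ships:

    # Each client workstation opens the shared table files directly over SMB;
    # there is no server-side process answering queries on its behalf.
    # Hypothetical path and schema; dbfread is a third-party package (pip install dbfread).
    from dbfread import DBF

    CUSTOMERS = r"\\server\erp\data\customer.dbf"   # hypothetical table name

    # Every lookup drags table (and index) pages across the network to the client.
    for record in DBF(CUSTOMERS, encoding="cp1252"):
        if record["CUSTNO"] == "000123":             # hypothetical field name
            print(record)
            break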

I am concerned that having 25 or more concurrent users run one .exe, which accesses files through the network file system (CIFS/SMB) in order to perform database functions, seems... suspect. Every other system I've seen uses a separate database engine, either a home-grown one (which is bad enough) or something like SQL Server, Oracle, even PostgreSQL or, heck, even MySQL, to handle data access, but this one is just one .exe file on a shared folder, run directly from that shared folder on each client's desktop. It seems like this is inefficient, or at least inelegant, and that it would cause a lot of excess network traffic. The .exe is about 10 MB in size, resides on the server, and opens .dbf files in the adjacent \data directory. The vendor asked whether we had a gigabit network (we do), and it seemed very important to him; now I can see why he was asking.
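
A rough back-of-envelope calculation of why the gigabit question matters; the 1 GB table size and the assumption of an unindexed scan are purely illustrative, not the vendor's numbers:

    # Back-of-envelope: cost of one unindexed search against a shared .dbf file.
    # All numbers are illustrative assumptions, not measurements of this product.
    TABLE_SIZE_MB = 1024                                 # assume a 1 GB table
    LINKS = {"100 Mbit": 100 / 8, "1 Gbit": 1000 / 8}    # rough usable MB/s

    for name, mb_per_s in LINKS.items():
        seconds = TABLE_SIZE_MB / mb_per_s
        print(f"Full scan of {TABLE_SIZE_MB} MB over {name}: ~{seconds:.0f} s per client")

    # Roughly 82 s on 100 Mbit versus 8 s on gigabit, for one client running one
    # badly indexed query; a server-side engine would scan locally and return rows.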

I don't have a deep development background, but it seems to me that you should have a separate database engine which communicates with clients via a named pipe or a TCP/IP socket, or at least some sort of binary network protocol if nothing else. Using SMB/NetBIOS file sharing (you enter UNC paths into the application as properties) just seems wrong, because wouldn't you run into file-locking issues if, e.g., two users want to open the same customer in A/R? Am I just being overly cautious? Is this really standard practice, as the vendor says? I do not have a great deal of experience with larger accounting systems like this. Our current package uses a client/server model, with a DB engine handling the files and the users' copies of the software talking to it over the network. Am I wrong to think that something purportedly more advanced would have a similar architecture?
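
For contrast, this is roughly what I mean by a client/server model, as a minimal sketch; it uses PostgreSQL with the psycopg2 driver purely as a stand-in, and the host, database, and table names are made up:

    # Client/server model: the client sends a query over TCP and gets rows back;
    # only the database engine on the server ever touches the data files.
    # Hypothetical host/database/table names; assumes the psycopg2 package.
    import psycopg2

    conn = psycopg2.connect(host="erp-db.example.local", dbname="erp",
                            user="erp_app", password="...")
    with conn, conn.cursor() as cur:
        cur.execute("SELECT custno, name, balance FROM customers WHERE custno = %s",
                    ("000123",))
        print(cur.fetchone())
    conn.close()

    # Locking, caching and concurrency are the engine's problem, not the file
    # system's, and the client never needs direct access to the data files.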

atroon
  • Reminds me of when people used to do that sort of thing with Access DBs. Horrible. That caused problems with 5 users let alone 25. – JamesRyan Sep 19 '11 at 11:57

3 Answers

6

What you've described would certainly give me cause for concern for a couple of reasons:

  • The fact that the rep was insistent on gigabit networking for such a small app. It may turn out to have no practical impact on your network, but it would make me seriously concerned about the application's design. A 50-user ERP system shouldn't trouble a 10 Mbit line, let alone a gigabit one.

  • The security implications of the install process would be a deal-breaker for me. This is a system that's holding customer (and likely customer payment) information. Users will launch an exe from a shortcut, meaning there'll be 25-50 instances of this exe running under each user's credentials on their workstations. These processes directly read and write the shared database files on the same share. That means every user, by design, must have direct read/write access to your entire database. That's horrible from both a technical and a security-compliance standpoint (see the sketch below).
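
To make the second point concrete for management: anything a logged-in user can run can open and rewrite the raw tables. A minimal sketch, assuming the share layout from the question; the writability test itself is generic:

    # Demonstration: every ordinary user of the mapped share can read and write
    # the entire database with any tool, because the application requires it.
    import os

    DATA_DIR = r"\\server\erp\data"   # share layout from the question

    writable = []
    for name in os.listdir(DATA_DIR):
        if not name.lower().endswith((".dbf", ".cdx", ".fpt")):
            continue
        try:
            # Opening for update proves the user has real write access via the ACLs.
            with open(os.path.join(DATA_DIR, name), "r+b"):
                writable.append(name)
        except OSError:
            pass

    print(f"{len(writable)} table/index/memo files are writable by this user")
    # If the answer is "all of them", any compromised or curious workstation is
    # one copy command away from the full customer and payment history.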

Personally, I'd run a mile from this app on the latter point alone. I'm sure people with more familiarity with compliance (or with FoxPro) can comment further.

SmallClanger
  • +1 on the basis of the point re: security. I'd also add to this re: you'll never be able to run this application over a WAN / VPN without using a remote desktop technology (Terminal Services, etc). This thing is decidedly not client/server when it comes to its database. Icky. I'd *run* from this thing. (Software companies "leveraging" their "legacy" "investment" in crap database technologies sicken me and we should all be voting with our wallets to make them stop or, alternatively, just go out of business.) – Evan Anderson Sep 19 '11 at 13:20
  • Thank you for your insights. I've written them up and taken them to management, and I'm reasonably sure we won't go this route, unless something drastic changes. They were quite receptive to the security arguments. – atroon Sep 23 '11 at 13:00
  • Visual FoxPro and the like may not be trendy and may become increasingly unreliable with future versions of Windows. It is not, however, a 'crap database technology'; if it were, it wouldn't still be around and wouldn't still be helping to run thousands of businesses with little or no maintenance. Yes, in an ideal world everything would be on SQL Server or whatever, and nobody is going to start new applications using old technology, but the fact remains that the things written in COBOL and VFP and all the rest do the job day in, day out, so why should customers change for the sake of it? – Alan B Oct 06 '11 at 08:55
3

Been there, done that.

I admit VFP9 appears a bit dated, but it is proven technology.

Believe it or not: VFP9 is very robust in a setup like this. It is not like MS Access at all. I had my doubts about it at first as well, but in practice there is no problem at all.

Added bonus: you can even do plain file-based backups of the database files as long as no user has the application open, which is nice if you are not running a 24/7 shop.
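
A minimal sketch of what such a backup could look like; this is my own illustration, not something the product provides, it assumes the share layout from the question plus the pywin32 package, and it simply refuses to copy if any file is still open by a client:

    # File-based backup of the shared \data folder, aborting if anything is in use.
    # Illustration only: paths are assumptions and pywin32 (win32file) is required.
    import os
    import shutil
    import datetime
    import pywintypes
    import win32file

    DATA_DIR = r"\\server\erp\data"
    BACKUP_DIR = r"D:\backups\erp-" + datetime.date.today().isoformat()

    def in_use(path):
        """True if another process still has the file open (exclusive open fails)."""
        try:
            handle = win32file.CreateFile(path, win32file.GENERIC_READ,
                                          0,      # share mode 0 = exclusive
                                          None, win32file.OPEN_EXISTING, 0, None)
            handle.Close()
            return False
        except pywintypes.error:
            return True

    busy = [name for name in os.listdir(DATA_DIR)
            if in_use(os.path.join(DATA_DIR, name))]
    if busy:
        raise SystemExit(f"Backup skipped, files still open: {busy}")

    shutil.copytree(DATA_DIR, BACKUP_DIR)
    print("Backed up to", BACKUP_DIR)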

But you do need a fast (at least 100 Mbit) LAN between clients and server. (And for God's sake put a gigabit NIC in the server.) Whether the EXE is local or on the server doesn't really matter; either will work just fine.

As others have stated: for non-local users you need a WTS (Terminal Services) setup. My recommendation is to do that on a separate machine and not on the VFP9 server itself. (Of course, with Hyper-V or VMware both machines could be VMs on the same physical box.)

Please note that WTS users may need a fair amount of RAM each. It depends a bit on the efficiency of the DB queries and the needs of the application.

From what I have seen, 100 to 200 MB working sets are fairly normal for VFP9 apps, of which about half would be pinned in physical RAM. In my experience, 20 to 30 users on a 4 GB RAM WTS server is doable if they don't need to run much else in the WTS session. Depending on your application's needs, your mileage may vary.
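
Putting those numbers together as a quick sizing check (only the figures above are used; the OS overhead is my own rough assumption):

    # Rough WTS sizing check using the working-set figures above.
    # Assumptions: half of each working set resident, ~1 GB reserved for the OS.
    USERS = 25
    WORKING_SET_MB = (100, 200)    # typical per-user range for a VFP9 app
    RESIDENT_FRACTION = 0.5        # roughly half pinned in physical RAM
    OS_OVERHEAD_MB = 1024          # assumption for Windows plus services

    for ws in WORKING_SET_MB:
        needed_mb = USERS * ws * RESIDENT_FRACTION + OS_OVERHEAD_MB
        print(f"{USERS} users at {ws} MB each: ~{needed_mb / 1024:.1f} GB RAM")

    # Roughly 2.2 GB (light) to 3.4 GB (heavy), which is why 20 to 30 users on a
    # 4 GB terminal server is doable if they run little else in the session.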

Tonny
1

I have a lot of experience with a Visual FoxPro 9 ERP package that works more or less the same way - file share on a server, multiple clients with local EXEs and support files accessing the SMB share. The dataset for each company in the ERP is hundreds of DBF files.

In terms of performance we don't see any issues: we have many customers running 30 or more clients off a fairly standard Windows server, accessing DBF files that can have millions of rows and be over 1 GB in size. If the indexing and locking are implemented correctly by the ERP then those numbers are all fine.
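
To illustrate why correct indexing is what makes those numbers workable over a file share, here is some purely illustrative arithmetic; the row count, page size and keys-per-page figures are assumptions, not measurements of any particular ERP:

    # Why an indexed lookup over a file share is cheap and an unindexed one is not.
    # Illustrative assumptions: 1 GB table, a B-tree-style index, 8 KB pages.
    import math

    ROWS = 5_000_000
    TABLE_MB = 1024
    PAGE_KB = 8
    KEYS_PER_PAGE = 200            # assumed fan-out of the index

    # Indexed lookup: walk a handful of index pages, then fetch one data page.
    index_pages = math.ceil(math.log(ROWS, KEYS_PER_PAGE))
    indexed_kb = (index_pages + 1) * PAGE_KB

    # Unindexed lookup: the client drags the whole table across the wire.
    scan_kb = TABLE_MB * 1024

    print(f"Indexed lookup:   ~{indexed_kb} KB over the network")
    print(f"Unindexed lookup: ~{scan_kb} KB over the network")
    # Tens of KB versus a gigabyte per query is the difference between "no issues
    # with 30 clients" and a saturated LAN.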

Yes, there are security implications with a file share full of DBF files. I suppose it depends on the type of customer our ERP sells to, but in over 15 years and with thousands of sites, the number I personally know we've lost because of this setup might total 20 or so. It is also possible to get around this problem using Terminal Services.

You're absolutely correct also that you would have to employ Terminal Services, possibly through a VPN, to run it remotely.

Also, if you end up using the ERP in question and you have Windows 7 machines and Server 2008, make sure those are on SP1, as there was a bug in SMB that caused index files to become corrupted.
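
If it helps, a quick way to confirm the service-pack level on a client is sketched below; it only reads what Windows reports about itself and has nothing to do with the ERP:

    # Quick check of the Windows release and service-pack level on a workstation.
    import platform

    release, version, csd, ptype = platform.win32_ver()
    print(f"Windows {release}, build {version}, service pack: {csd or 'none'}")

    # On an affected Windows 7 client you would want to see 'SP1' reported before
    # pointing it at the shared DBF/CDX files.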

Alan B