How to Screw up Choosing a Vendor: First Form a Committee...
Although many of my disaster stories involved places I've worked, this one takes place in the city I live in, Arlington, Texas.
Arlington is best known as the largest city in the US with no public transportation, and as home to one of the densest retail areas in the country. It has 360,000 residents and is around the 50th largest city in the US. Its police write around 200,000 tickets a year, all handled in a municipal court system.
In 2000, the city set out to replace an aging mainframe-based court management system that could no longer handle the volume and lacked many needed functions.
So the city formed a committee drawn from various departments to help select a software vendor for a new municipal court system. It then hired a consulting firm to assist with the selection and collected bids ranging from $1.3 million to $2.6 million, from which the firm recommended two.
That's the last thing that went right.
The committee decided that the bids were too high, changed the requirements, and collected three new bids; one came in far below the others at only $300,000 (the next lowest bid was $761,000). They jumped at the "deal" and negotiated with the company. That the company's software had never handled a city as large as Arlington, that it had poor reviews from its other customers, and that the current version was not actually in use anywhere else didn't seem to matter. The city council was told none of the negatives (and didn't ask for any) and so voted for the contract. One last thing: the city had to pay the costs up front. Nice deal if you can get it.
First, the go-live date slipped a year behind schedule. Once the software was running, users started reporting that it couldn't manage simple tasks it was supposed to handle, like court scheduling and processing bonds. It crashed frequently and ran sluggishly. In its first year of use the city lost money despite record ticket volume, because the software made collections difficult. Soon the court system was more than 100,000 cases behind. The company claimed the city wasn't installing its frequent patches, but the city found the patches were making the software even less stable.
Of course, the city never hired anyone to be a project manager either. Amazing how people think IT projects can run themselves. During the time the software was in use it totaled nearly $1.7 million in actual costs, not counting city employees' time or the lost revenue. A wee bit over the original bid.
Last year the city finally gave up and hired another consulting firm to analyze the software and determine whether it was fixable. It wasn't. The city then hired the consulting firm to oversee choosing a new vendor, a process that is still ongoing.
There wasn't any mention of a new committee.
I have never understood why people think taking the lowest bid is a valid way to pick a vendor. If you have at least three vendors, tell everyone up front that you will pick the second-lowest bid. That gives everyone an incentive to bid realistically, since lowballing will likely cost a vendor the contract. Sometimes this isn't possible for legal reasons.
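The second-lowest-bid rule described above can be sketched in a few lines. This is just an illustration; the vendor names and dollar figures are made up (though the two higher ones echo the bids in the Arlington story):

```python
def pick_second_lowest(bids):
    """Award to the vendor with the second-lowest price.

    Taking index 1 of the sorted bids means a lowball bid at
    index 0 forfeits the contract, which discourages lowballing.
    """
    if len(bids) < 3:
        raise ValueError("need at least three bids for this rule to work")
    ranked = sorted(bids.items(), key=lambda item: item[1])
    return ranked[1]  # index 0 is the lowest; index 1 is the winner

# Hypothetical bids for illustration
bids = {"Vendor A": 300_000, "Vendor B": 761_000, "Vendor C": 1_100_000}
winner, price = pick_second_lowest(bids)
print(winner, price)  # → Vendor B 761000
```

With only two bids the rule degenerates (the "second lowest" is just the highest), which is why it needs at least three vendors to create any real pressure toward honest estimates.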
Another technique is to withhold the bid prices from the decision makers until the bidders have been examined on everything else: references from other customers, trial use, documentation, and any other important information. The whole point is to avoid prematurely choosing a vendor strictly on price. Governments are suckers for low bids, since it looks good politically to "save voters' money". Rarely does the final price ever get publicized. That's why I would prefer to see the second-lowest-bid process.
The entity hiring the software vendor should have a project manager (either external or internal) involved from the earliest possible moment until the software is fully in production and enough time has elapsed to confirm that it is functional and usable. Ideally this individual (or group, if necessary) should have no connection to the vendor, but should have relevant experience and sufficient authority to ensure the entity is getting what it paid for.
All software has bugs (unless you are NASA, in which case your software has no bugs but your hardware blows up). The software should deliver all of the required and promised functionality, do it reliably, be reasonably usable by the actual users, and require only a minimum of patches. The vendor should have a sufficiently large QA department and be able to demonstrate some kind of repeatable test plan for each patch. It's amazing how many companies I have seen develop software (whether for internal use or for sale) with no QA team at all. My two software companies both had around one QA person for every four programmers. Netscape in the '90s hired 100 programmers before it hired a single QA engineer. Never buy from a vendor whose QA department is its customers.
I read an article this weekend in which a local columnist wondered whether it was possible to go back to a paper-based system like "the old days". It's a sad commentary on software projects that they can make people nostalgic for the days when armies of clerks pushed paper around. With some common sense, good engineering, good management, and smart people, it is possible to improve business or government with software.
Sadly in most places "common sense, good engineering, good management and smart people" are rarer than bugs in NASA's mission critical software. Witness the continual disasters at the FBI.
Maybe if we all formed a committee...