
ITP Techblog

Brought to you by IT Professionals NZ

Fixing government IT procurement

Dave Lane, Guest Post. 24 September 2018, 11:40 am

Part 1: The problem

Today in New Zealand, local and central government procurement of IT services and software uses a model which, it is widely held in the domestic IT industry, produces unsatisfactory results. Despite many years of tweaks and refinements, the current RFP model and the various vendor panels have not produced measurable improvement. Many of us in the domestic industry believe a more drastic change is required.

The purpose of this post is to sketch out a model I believe would be a substantial improvement over what we currently have.

Some issues with the status quo

The current system has some obvious issues, for example the tendency to award procurement contracts, particularly for large bespoke software development projects, to the lowest bidder. This has not, in general, ended well.

Other issues with the current system seem a bit more subtle. The one which strikes me as most problematic is this:

Most government tenders specify more than just the set of business requirements to be met by a software implementation. By exceeding this brief, they presuppose the solution: they almost always stipulate functional requirements, including compatibility with existing systems for integration purposes.

What seems to go largely unnoticed is that these functional requirements revolve around proprietary protocols, formats, or schemas dictated or unilaterally controlled by existing suppliers. They amount to local monopolies granted to those suppliers, which substantially constrain government agencies' ability to seek competitive solutions from the broadest range of suppliers, and which are particularly prone to excluding smaller domestic service providers while favouring incumbents.

False attribution

These limited monopolies, which cover the substantial majority of software solutions and services procured, increase the perceived cost of alternative solutions that can compete but are not fully compatible with the proprietary offerings. That perceived cost, in turn, acts as a barrier to adoption.

Government procurement officers mistakenly attribute the cost penalty of being "incompatible" to the incoming supplier, rather than attributing that migration cost to the business practices of the incumbent, which was previously granted a monopoly on that particular service. This is a form of "lock-in" which goes almost entirely unacknowledged.

It is also extremely prone to abuse by powerful suppliers (the largest of whom individually have market capitalisations in excess of New Zealand's GDP, effectively rendering them "above the law" here), who exploit their advantage by extracting monopoly rents and who rely on the perceived high cost of switching suppliers to maintain their lucrative positions.

Software development is an immature industry. Many types of software that most of us use every day didn't exist a decade ago, some not even a couple of years ago. As a result, many software suppliers assert that they need to create proprietary (supplier-defined and unilaterally changeable) formats, protocols, and schemas to allow themselves to pioneer these new and (sometimes) useful software applications. Examples would include the emerging "Internet of Things", some areas of biotechnology and gene manipulation, Augmented Reality, and a bunch of other specialist niches to which mainstream software users are yet to be exposed.

The freedom to add to, or change, formats, schema, and protocols is a reasonable need when pioneering brand new software domains, where developers are making up whole new classes of software as they go. But where a software domain has matured to the point where there are multiple competing solutions, and the problems of that domain are well understood, those proprietary formats cease to be justifiable, and instead become a barrier to entry for would-be competition.

In order to compete against the market leader, competitors typically have to offer "compatibility" with the market leader's solution. This is either done via

  1. licensing agreements with the leader (which are expensive and risky - the market leader has a monopoly on their proprietary standards, and can therefore charge what they like, and they also have an incentive to be less than cooperative), or by

  2. reverse engineering - a somewhat less expensive route, but very risky, because software can be extremely complex, and the market leader is often known to change things capriciously if a fast-follower catches up too quickly.

These proprietary de facto standards stand in stark contrast to open standards, which should be adopted in any established software domain. To be useful, an open standard must be developed by a multi-supplier group with a well-defined change process, be unencumbered by patents, be royalty-free for anyone wanting to implement it, and ideally have objective compliance vetting and multiple independent software implementations.
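To make the interoperability point concrete, here is a minimal sketch (illustrative only; the record fields are hypothetical, and Python is used purely as an example) of what an open standard buys: data serialised to JSON, an open, royalty-free format (RFC 8259), can be read back by any conforming implementation in any language, with no licensing agreement and no reverse engineering required.

```python
import json

# A consent record as a council system might hold it
# (field names are hypothetical, for illustration only).
record = {
    "application_id": "BC-2018-00142",
    "property_address": "1 Example Street",
    "status": "granted",
}

# Serialise to JSON, an open, royalty-free standard (RFC 8259).
# Any conforming JSON parser, in any language, from any supplier,
# can consume this output - no licence fee, no reverse engineering.
payload = json.dumps(record)

# Round-trip it to show the data survives intact; a competing
# supplier's independent implementation could do the same.
restored = json.loads(payload)
assert restored == record
```

The same round trip against a proprietary binary format would require either a licensing agreement or reverse engineering, which is precisely the barrier described above.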

Proprietary standards create huge risks for competition. The market leader retains dominance by ensuring that would-be competitors cannot offer real interoperability with their solution. They do this by constantly shifting the goal posts: they have full control over their proprietary standards and can arbitrarily alter them at whim, use threats of software patent litigation, or, in jurisdictions with rampant corporate influence (like the US), push through government policy that makes reverse engineering legally nebulous. These risks result in lower investment in competing technologies and long periods of extreme dominance. This pattern has allowed a handful of dominant software companies to attain a scale seen previously in human history only during times of colonial excess, which ruined entire cultures.

Mutually incompatible repetitive inefficiency at scale

New Zealand has 67 territorial authorities. Each of these, though geographically dispersed, has roughly the same remit and the same set of problems for which it requires automated systems, usually implemented in software. There are likely to be in excess of a hundred different areas in which the typical territorial authority deploys a software solution.

Given the similarities between authorities' obligations, one might assume that the councils would work collectively, collaboratively, and methodically to catalogue their software requirements, perhaps with the aid of private sector software requirements analysts. They could publish a set of well-defined standards for private sector suppliers to meet, using their substantial combined buying power to influence the market's offerings, and achieve a consistent set of applications across all of those authorities, with many flow-on benefits like lower costs of training, support, and procurement thanks to consistency and economies of scale.

Sadly, if one assumed this, one would be painfully disappointed when shown what actually happens. There is almost no collaboration. Procurement is ad hoc, uncoordinated, and authorities actually pay IT suppliers to lock them into proprietary software solutions that are often incompatible with those used by most other neighbouring councils.

This is a classic "divide and conquer" model where each territorial authority pays licence fees for each application, when these amounts, aggregated across all 67 authorities, could probably fund the development of those applications from scratch every year. Multiply this incredibly wasteful process by the hundred or more software implementations at each authority to get an idea of the scale of the problem.

To provide a concrete example: the amalgamation of IT systems from seven former councils - with scores of mutually incompatible systems - into a single coherent set of systems for the new Auckland "super council" has already cost the taxpayer well in excess of $1 billion, with precious little to show for it. That's a cost of about $250 per person in New Zealand so far, and there's no end in sight. That, my friends, is the measure of bureaucratic inefficiency we are currently suffering all around the country (and we're by no means alone here in New Zealand).
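As a rough sanity check on that per-person figure, here is a back-of-the-envelope calculation. The inputs are assumptions consistent with the numbers above, not official figures: a cost of roughly NZ$1.2 billion ("well in excess of $1 billion") and a 2018 New Zealand population of about 4.9 million.

```python
# Back-of-the-envelope check on the per-person cost quoted above.
# Both inputs are assumptions, not official figures.
amalgamation_cost_nzd = 1.2e9   # "well in excess of $1 billion"
nz_population = 4.9e6           # approximate 2018 population

cost_per_person = amalgamation_cost_nzd / nz_population
print(f"about NZ${cost_per_person:.0f} per person")  # about NZ$245 per person
```

At those inputs the result lands near the $250 per person cited above; even at exactly $1 billion it works out to over $200 per person.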

Similar inefficiencies pervade the procurement of software by our 20 District Health Boards and many state-owned entities like our universities and Crown Research Institutes.

Unfortunately, central government, with significant scale and overlap between agencies' and departments' software requirements, is little better. So let's just say there is plenty of room for improvement.

Part two of Dave's piece will appear later this week.

Dave Lane is a dedicated technologist, a fan of strengthening our Commons, and an active participant in our democracy, because although things are generally grand here in New Zealand, he thinks things could be even better for all of us. Reposted from Dave's blog with kind permission.


Comments


Mike Dennehy 25 September 2018, 4:32 pm

Dave, thanks for your well-thought-out analysis and conclusions. It's hard to disagree with any of your points, and I'd like to add some of my own.

I presented on this topic at ITx 2016 and, in my experience, the problems with central and local government IT procurement - including local authorities and DHBs (and you're right, we have far too many) - are more fundamental. The RFx model is broken and almost never works as intended, and the people who use it do a very poor job.

In my former role as CEO of a vertical-market software house with a decent number of government clients in both NZ and Australia, I reluctantly gave up on responding to any opportunities advertised through GETS or the Australian equivalents. I advised my bid team that if they ever felt the urge to resist my warnings and submit a proposal, they should instead stand in the corner and bash their heads against the wall repeatedly, while feeding $50 notes into a shredder. It would be cheaper and involve less pain than responding.

Some problems with the RFx model:

- It is too prescriptive, as you mention. Instead of stating the current and desired states, many bid documents include a "design" of how they see the new system working (and often how they see it looking).

- Paper-based procurement models are inefficient, ineffective and cannot accurately represent the desired state, or provide the end users with any understanding of how they might operate it.

- Business units (in both the private and public sectors, but more often in the latter) are inexperienced in developing business and system requirements. So they call in a consultant to run the procurement process.

- Consultants understand software development, but almost never understand the business requirements for specialist software. So you have the visually impaired leading the partially-sighted.

- As a result, the procurement process maintains a huge distance between the central parties - vendor and users - with no communication possible. Any information to be exchanged by the central parties must go through several filters, and it's no surprise that the end result is misunderstanding.

- Big companies stand the greatest chance of winning, if a decision is ever made, because they can throw more resource at it. Most of the systems integrators have permanent staff dedicated to winning new business, wherever it can be found. Many times I've seen proposals to bring in a system designed overseas for overseas requirements that is hopelessly unfit for purpose. Not only that, but it has no reference sites in New Zealand and no resources able to manage an implementation and provide support. The lease management system purchased by the Government Property Group (known then as the Property Management Centre of Expertise) is a classic case in point.

- Following on from the above, government IT procurement processes almost never follow their own rules. Novopay is an example of this point and the one above - though it also failed most spectacularly in the go-live process due to ignoring the most basic of testing, pilot and staged roll-out processes, which were agreed in writing.

Unproven, untested and unfit solutions have been adopted many times, in the face of government IT procurement's own rules and guidelines. A 77-page novella, the "Guide to Mastering Procurement", is available, and while it is well laid out and contains a small amount of relevant information, it forces procurement down a path that we know from experience is broken.

- In my experience, the number 1 all-time grand champion winning tenderer is "No Decision". I have witnessed a prominent government department go to the market 5 times in 9 years for a fairly standard solution, and on the first 4 occasions at least, they made no decision and went back to the drawing board. With a straight face they asked the small pool of suppliers to go to the well once more, and in my case they begged us to at least attend a vendor briefing, which we reluctantly did (at some expense). I shouldn't have been surprised that the person begging us to attend wasn't even there for the session, which started late, finished after 20 minutes and contributed precisely nothing to the sum of information already available. I have no idea whether they eventually purchased and implemented a solution because I just don't care, but I'd be very surprised if they did.

Treating local suppliers with such contempt is sadly par for the course, and my assertion that the people in government running these projects do a very poor job of it is the kindest way I can think to put it.

There are dozens of opportunities to use synergy, economies of scale, collaborative design and collective bargaining in the search for modern, effective and cost-effective solutions to government requirements. Local authorities, government departments with common requirements and DHBs are obvious ones, and there are likewise dozens of New Zealand companies ready and able to deliver, if only they had the chance.

Sadly, these opportunities are lost to bungling and - often - incompetence of the most basic nature. Even if your own guidelines are imperfect, failing to follow them is a sure recipe for failure. So, while we should rightly focus on understanding and determining the right kind of solution, and I agree with you in general about open technologies vs proprietary, for me the two big elephants in the room are the way government runs IT procurement and the way it manages implementation.

Mike Dennehy

Cactus Consulting

