Every time a new policy proposal, strategy or idea comes down the pike, no matter how appealing, we have to remind ourselves that few ideas are THE idea, that as smart as the new idea might be, it almost certainly won’t apply in all cases.
Indeed, one constant and core tenet of the acquisition reform movement and efforts to enhance federal technology procurement and performance has been to incentivize more creativity and flexibility.
Military specifications? Sensible in some cases, but not in all.
So in one of the first acquisition reforms of the 1990s, Secretary of Defense William Perry directed a move away from “milspecs” in order to increase access to proven commercial solutions and speed up the development of new systems.
Software development certifications? Great idea, but there’s more than one, so “or equivalent” language emerged in place of mandates tied solely to any individual certification. The list goes on.
Unfortunately, specific, mandatory certifications and specifications, many of them government-unique, have actually been making a worrisome comeback in recent years, even as the government ostensibly seeks to broaden its competitive aperture. From cost accounting to capability maturity models, from contract type to acquisition strategies, we see too many procurements that send a clear message: Those who haven’t done it our way, or are not doing it a certain way, need not apply. And that isn’t good for the government customer or the taxpayer.
An interesting take on this dichotomy was raised recently in a piece titled “The Tyranny of Agile.” In it, Jennifer Pahlka, the head of Code for America, a leading apostle of agile software development and one of the architects of the Digital Services Playbook, warns against assuming that one size fits all in the software development space.
Lest there be any misconception, Pahlka is most certainly not abandoning her belief in agile, nor her critique of the failures of waterfall methodologies. What she and others are concerned about, however, is what she calls “the tyranny of misapplied doctrine.” As she says, “agile is one useful doctrine, not THE doctrine.”
Along the same lines, I was privileged recently to participate in the American Public Health Systems Association’s IT Solutions Management Conference. The theme of our panel was “The Role of the Systems Integrator in a Post ‘Big Bang’ Era.” Make no mistake. That role is changing. The advent of more modular, agile strategies has a number of significant impacts on how and where an integrator engages.
But there also appeared to be a consensus that even in an agile environment, the “integrator,” whoever that might be, remains critical.
While agile is clearly the methodology of choice, experience with it on complex government systems and business processes remains limited. There are intricate, interwoven networks and co-dependencies throughout such systems, and knitting them together and integrating them requires an internal or external entity with the domain and technical expertise, the authority and the accountability needed to drive scaled impact.
Agile techniques certainly bring great value to large complex programs; the challenge is how to apply them properly within an overall modernization strategy that has many, sometimes disparate components. As my co-panelist, a former state CIO, put it, we are in a transition phase during which the application of agile on a broad systemic scale remains a work in progress.
This too resonated with the audience, most of whom were state, local and federal health IT and systems professionals on the front lines of this transition. Nonetheless, that view does not appear to be universally held.
From public statements, actual procurements, and even some policy proposals, there are tendencies in some quarters to issue narrow methodological mandates, rather than allowing the user’s needs and outcome objectives to guide the overall strategy. And those tendencies are often accompanied by a false presumption that agile itself is the sole province of one community of companies: that a systems integrator, for example, is de facto incapable of implementing an agile methodology or managing in an agile environment.
And so we return to the point we started with: the tendency, on multiple fronts, to conflate issues and to assume an all-or-nothing posture, when, in fact, what we need is an effective blend.
Better and smarter design and implementation of federal IT programs is a goal everyone should share; as is the recognition that traditional modes, including almost exclusive reliance on waterfall development, need to change.
But it’s too easy, as Pahlka warns, to apply methods in the wrong context or to misapply them, as with efforts to go “all agile.” After all, an agile methodology doesn’t obviate the need for configuration management, systems engineering or integration; it just changes their character and context.
Milspecs, certifications on contract, misapplied tech mandates: in these and other areas, we continue to fall prey to the one-size-fits-all syndrome. But one size rarely fits all, and the assumption that it does is an often destructive oversimplification in a world where little is that simple.