Strong Signals: The Recombinant Corporation


Among the next big challenges in business computing is what we call “users as programmers.” It’s the elimination of almost all application programming as we think about it today. You switch out of the idea of hard-coded applications and into the mindset of giving users a portfolio of somewhat self-assembling tools they can put together according to the description of a task for which they require support.

When it’s time to pay people, for instance, the code assembles itself into a payroll system. When it needs to process an order and a payment, it just assembles itself into the software that does that, makes a change to some persistent data store and then breaks up into components that go on and do something else.
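To make the idea a little more concrete, here is a minimal sketch in Python of how a registry of small capabilities might assemble itself into a one-off application from a task description. The registry, the capability names and the payroll steps are hypothetical illustrations of the pattern, not a description of any existing product.

```python
# Hypothetical sketch: small capabilities that assemble themselves into an
# application based on a task description, then dissolve back into parts.

from typing import Any, Callable, Dict, List

# Each capability is a small function that takes and returns a context dict.
Capability = Callable[[Dict[str, Any]], Dict[str, Any]]

REGISTRY: Dict[str, Capability] = {}

def capability(name: str) -> Callable[[Capability], Capability]:
    """Register a function in the portfolio under a capability name."""
    def register(fn: Capability) -> Capability:
        REGISTRY[name] = fn
        return fn
    return register

@capability("calculate_pay")
def calculate_pay(ctx):
    ctx["gross"] = ctx["hours"] * ctx["rate"]
    return ctx

@capability("withhold_tax")
def withhold_tax(ctx):
    ctx["net"] = ctx["gross"] * (1 - ctx.get("tax_rate", 0.25))
    return ctx

@capability("record_payment")
def record_payment(ctx):
    # Stand-in for a change to some persistent data store.
    ctx.setdefault("ledger", []).append(("paid", ctx["net"]))
    return ctx

def assemble(task: List[str]) -> Capability:
    """Compose the named capabilities, in order, into a temporary application."""
    steps = [REGISTRY[name] for name in task]
    def app(ctx: Dict[str, Any]) -> Dict[str, Any]:
        for step in steps:
            ctx = step(ctx)
        return ctx
    return app

# "When it's time to pay people, the code assembles itself into a payroll system."
payroll = assemble(["calculate_pay", "withhold_tax", "record_payment"])
print(payroll({"hours": 160, "rate": 40.0}))
```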

All this depends on the emergence of what we call a portfolio technology strategy. In a portfolio strategy, as much technology as possible becomes a “utility” that’s provided the same way everywhere. For example, most people don’t worry much about where their electricity comes from. There’s an assumption the utility infrastructure can provide as much as you need, even if you have unanticipated demand. And to a very large degree that’s true, although, as the recent experience in California showed, you can’t exceed the total capacity of the system.

Then you focus on what we call “common” components: doing things the same way in as many places as possible. Only after you’ve exhausted the opportunity for common components do you create “custom” components that are unique to you. And you do it all within a standard framework that is rich enough to cope with the emergence and integration of new things.

The trick strategically is to start thinking now about what this portfolio is going to have to look like. Even if you can’t get to a utility model now, you ought to be designing the common and custom things you need so they can become utility-provided later. We pose two interesting questions in this context: What would a business have to look like to take advantage of the portfolio approach? And if we could do this, what new things would it make possible for such a business?

The hard work is in the composition of the portfolio. How much can be utility because it’s best provided that way? How much can exist as common across many enterprises, but not all? And how much must be customized? So you rearchitect your entire business-computing environment around those broad principles, and you do it under the assumption that, over time, custom migrates to common and common migrates to utility. We call the result “Adaptive IT.”
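One way to picture the portfolio, and the migration from custom to common to utility, is as a simple classification that gets revisited over time. The tiers and the example components below are illustrative assumptions only, a sketch of the idea rather than a prescribed taxonomy.

```python
# Illustrative sketch: a portfolio classified into custom, common and utility
# tiers, with components migrating toward utility over time.

from dataclasses import dataclass

TIERS = ["custom", "common", "utility"]  # assumed direction of migration

@dataclass
class Component:
    name: str
    tier: str

    def promote(self) -> None:
        """Move one tier toward utility, e.g. when a shared provider emerges."""
        i = TIERS.index(self.tier)
        if i < len(TIERS) - 1:
            self.tier = TIERS[i + 1]

# Hypothetical example portfolio.
portfolio = [
    Component("email", "utility"),
    Component("order_processing", "common"),
    Component("pricing_engine", "custom"),
]

# Over time, "custom migrates to common and common migrates to utility."
portfolio[2].promote()  # pricing_engine: custom -> common
for c in portfolio:
    print(f"{c.name:>18}: {c.tier}")
```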

So the person whose job it will be to orchestrate all this had better figure out how to explain it to the business side, so that you don’t get trapped in an asset-ownership strategy that will disadvantage you later. And you had better start thinking about how your specific business computing needs map to this portfolio architecture. And you’d better start thinking about how you’re going to stay current—at least as current as your risk tolerance allows—with the evolving shifts in user expectation.

Sometime in the next 25 years, maybe quite soon, we’re going to have to look for a new representational metaphor for the technology infrastructure and environment we use in business. Industrial and architectural metaphors are starting to fail us as technology gets more complex. One obvious place to look is biology, which operates perfectly happily without the constraints of architecture.

So you end up with what we call directed evolution, in which you play with the macro system variables to tweak the odds for things emerging that seem close to what you want. Ultimately, you could run experiments in the recombination of things—information, information context, functional capabilities, human interaction models—to see what happens. And if you were good at that, you would be effectively supermanaging the evolution of the system.
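As a toy illustration of what “tweaking the macro system variables” might mean in practice, here is a small Python sketch of an evolutionary loop: candidate combinations of building blocks are recombined and mutated, scored against an outcome you want, and the knobs (population size, mutation rate) are what the supermanager would adjust. The building blocks, the target and the fitness function are all invented for the example.

```python
# Toy sketch of "directed evolution": recombine candidate configurations,
# score them against what you want, and tune macro knobs (population size,
# mutation rate) to tweak the odds of useful things emerging.

import random

BLOCKS = ["orders", "payments", "inventory", "forecasting", "support"]
TARGET = {"orders", "payments", "forecasting"}  # the outcome we happen to want

def fitness(candidate: frozenset) -> int:
    # Reward overlap with the target, penalize extra baggage.
    return len(candidate & TARGET) - len(candidate - TARGET)

def evolve(pop_size: int = 20, generations: int = 30, mutation_rate: float = 0.2):
    population = [frozenset(random.sample(BLOCKS, random.randint(1, 4)))
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            pool = sorted(a | b)
            child = set(random.sample(pool, max(1, len(pool) // 2)))
            if random.random() < mutation_rate:  # the macro knob in action
                child.symmetric_difference_update({random.choice(BLOCKS)})
            children.append(frozenset(child))
        population = survivors + children
    return max(population, key=fitness)

print(evolve())
```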

When you combine the effects of the “users as programmers” approach and the metaphors from biology, you realize that we are really only beginning to push the edges of what we could automate today. I think that over the next couple of decades we’re going to see a move toward much more automated decision making in core business operations.

Already, we almost have enough computing power to simulate business ahead of real time. And I think that within a couple of decades what will happen is that business strategy and operations will basically consist of spending a couple of hours a night running tomorrow, or next week or next month, in simulation, seeing what you think is going to happen, and then tracking tomorrow against the simulation to catch unanticipated variations. The longer we have a history of looking at what our decisions make happen, the better our simulations will get. Never perfect, however. This is not a matter of completely automating life.
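In the spirit of that idea, here is a minimal Python sketch of “running tomorrow in simulation” and then tracking the actual day against it. The demand model, the numbers and the tolerance threshold are made-up assumptions; a real simulation would be calibrated from the history of what past decisions made happen.

```python
# Minimal sketch: simulate tomorrow many times, then check whether the actual
# result falls inside the range the simulation anticipated.

import random
import statistics

def simulate_tomorrow(runs: int = 1000, base_demand: float = 1000.0) -> list:
    """Monte Carlo: sample plausible versions of tomorrow's demand."""
    return [random.gauss(base_demand, 0.1 * base_demand) for _ in range(runs)]

def check_against_plan(actual: float, simulated: list, tolerance: float = 2.0) -> str:
    mean = statistics.mean(simulated)
    stdev = statistics.stdev(simulated)
    if abs(actual - mean) > tolerance * stdev:
        return f"unanticipated variation: actual {actual:.0f} vs expected {mean:.0f}"
    return "within the range the simulation anticipated"

tomorrow = simulate_tomorrow()
print(check_against_plan(actual=1325.0, simulated=tomorrow))
```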

John Parkinson is chief technologist for Cap Gemini Ernst & Young LLC in the U.S. and a member of the firm’s strategy and technology leadership team. Comments on this story can be sent to editors@cioinsight.com.
