There’s some interesting discussion in our industry on the merits of open, disaggregated/decomposed systems versus vertically integrated solutions. Some argue for far-reaching disaggregation as a means to create full flexibility, while others point to the benefits of pre-integrated sub-systems. I’d like to look at this question from a very different perspective. Evolutionary processes have proven to be very powerful at optimizing complex systems. Let’s have a look at what they can teach us.
Only a few years ago, a lot of players in our industry promoted ideas which were opposed to disaggregation. Many suppliers proudly presented their “god boxes”, highlighting the benefits of vertically integrated technologies. The story was pretty simple: A vendor integrates technologies once and many service providers benefit by avoiding the troublesome effort of system integration. It’s a great story with an obvious business case, which is further strengthened as such integration removes interfaces between technology domains.
I also happened to work for a company which followed this technology trend. When I promoted vertically integrated solutions, I frequently pointed out that this technical evolution was closely aligned with the evolutionary process in nature. The emergence of higher forms of life comes not only from natural selection but also from functional integration: life started with simple bacteria, which became building blocks for more complex cells, and cells in turn combined into multicellular organisms and living creatures. As evolution in nature is an extremely powerful, efficient and successful process, the same principles should apply to technical evolution. Hence, the “god box” seemed to be an evolutionary necessity.
But in recent years, our industry has changed direction. We’ve moved away from an approach that seemed closely aligned with evolution in nature, a method successfully applied for billions of years. What has caused this 180-degree shift? Does the interest in disaggregation of hardware and decomposition of software mean the previous direction was wrong?
Major companies promote disaggregation as a means to make networks simpler and hence to better manage complexity. For good reasons, they suggest separating the data plane from the control plane (SDN) and hardware from software (NFV). The sub-systems become simpler, but the integration effort and the complexity of the operational systems needed to control the separated sub-systems increase. Have a look at the discussion around NFV, which significantly extends a service provider’s ability to rapidly introduce new services but, at the same time, also increases management complexity, as additional entities such as virtual infrastructure managers are required.
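To make the SDN trade-off concrete, here is a minimal, purely illustrative sketch in Python; all class and method names are mine, not taken from any real controller or switch API. In a vertically integrated router, route computation and packet forwarding live in one box; with SDN, a central controller computes forwarding rules and pushes them to simple switches over an open interface.

```python
# Illustrative sketch only: hypothetical names, not a real SDN controller API.

class DataPlaneSwitch:
    """A 'dumb' forwarding element: it only matches packets against
    flow rules that an external controller has installed."""
    def __init__(self, name):
        self.name = name
        self.flow_table = {}          # destination prefix -> output port

    def install_rule(self, dst_prefix, out_port):
        # Open interface exposed towards the control plane (OpenFlow-like idea).
        self.flow_table[dst_prefix] = out_port

    def forward(self, dst_ip):
        for prefix, port in self.flow_table.items():
            if dst_ip.startswith(prefix):
                return f"{self.name}: forward to port {port}"
        return f"{self.name}: no rule -> drop (or punt to controller)"


class SdnController:
    """Centralized control plane: computes policy and programs many switches."""
    def __init__(self):
        self.switches = []

    def register(self, switch):
        self.switches.append(switch)

    def push_policy(self, dst_prefix, out_port):
        # One decision, propagated to every managed data-plane element.
        for sw in self.switches:
            sw.install_rule(dst_prefix, out_port)


controller = SdnController()
sw1, sw2 = DataPlaneSwitch("sw1"), DataPlaneSwitch("sw2")
controller.register(sw1)
controller.register(sw2)
controller.push_policy("10.0.", out_port=3)
print(sw1.forward("10.0.0.42"))    # sw1: forward to port 3
print(sw2.forward("192.168.1.1"))  # sw2: no rule -> drop (or punt to controller)
```

Each switch becomes trivial, but a new operational entity, the controller, now has to be built, integrated and operated – exactly the complexity shift described above.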
Let me share my view on the key difference between technical and natural systems, a difference that has a significant impact on functional partitioning and results in a need for integration. The production of complex technical systems and the way living organisms reproduce differ in one important respect: while nature has implemented a DNA-based self-reproduction capability, technical systems are built from hundreds or thousands of pieces in sophisticated factories, based on detailed specifications and assembly plans. Nature has inherently solved the problem of producing and integrating complex functions; in technical systems this remains an extensive effort. Consequently, the rules of natural evolution cannot be mapped directly onto technical systems.
There are, however, technical domains which do allow for comparably simple reproduction: software and integrated circuits. These two technologies enable the production of highly complex, readily integrated functionality in a comparably simple production process: copying software or processing a wafer.
For these two domains, the evolutionary rule of continuously increasing complexity within monolithic systems might well apply. This would suggest that in each of them – software and high-scale integration – monolithic, highly complex solutions will evolve. Open interfaces will be key for such software solutions to work with complex hardware building blocks, which consist of a single or a few high-scale integrated circuits.
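As a purely hypothetical illustration of what such an open interface could look like (the names below are my own, not taken from SAI, P4 Runtime or any other real project): a comprehensive network operating system programs against a small abstraction interface, while each silicon vendor ships its own implementation behind it.

```python
# Hypothetical hardware-abstraction sketch; not a real vendor SDK or standard API.
from abc import ABC, abstractmethod

class SwitchAsic(ABC):
    """Open interface between a monolithic network OS and a highly
    integrated switching chip. Each vendor implements it for its silicon."""

    @abstractmethod
    def create_port(self, port_id: int, speed_gbps: int) -> None: ...

    @abstractmethod
    def add_route(self, prefix: str, next_hop_port: int) -> None: ...


class VendorXAsic(SwitchAsic):
    """One vendor's implementation; the details stay inside the chip and its SDK."""
    def create_port(self, port_id, speed_gbps):
        print(f"[vendor X] port {port_id} up at {speed_gbps}G")

    def add_route(self, prefix, next_hop_port):
        print(f"[vendor X] {prefix} -> port {next_hop_port} programmed in hardware")


def boot_network_os(asic: SwitchAsic):
    """The complex software side only ever talks to the open interface."""
    asic.create_port(1, 100)
    asic.add_route("10.0.0.0/8", next_hop_port=1)


boot_network_os(VendorXAsic())
```

The software side can keep growing in complexity, and the silicon can keep integrating more functions, as long as the interface between the two stays open and stable.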
I guess we can observe such a trend when looking at roadmaps of key components for switches and servers as well as the efforts of software suppliers and open projects to create pre-integrated, comprehensive software solutions. You might want to have a look at the work of OPNFV, an open project driving NFV towards commercial deployment.
Such a view seems to contradict the idea of disaggregation. However, I see the present discussion on disaggregation as an essential intermediate step towards a repartitioned, re-architected and optimized ICT infrastructure. The CORD (central office re-architected as a data center) project is an example of such a transformation process. As an outcome, hardware-implemented functions might move into software or vice versa.
In short, it’s unlikely that disaggregated/decomposed functions will be used as isolated building blocks of future ICT infrastructures. The system integration effort would simply be too painful and expensive for any communication or cloud service provider. Instead, I’d expect to see atomic functions either becoming part of comprehensive software solutions or being integrated into silicon for switches and servers. Open interfaces will enable complex software solutions to be operated on highly integrated hardware devices, with both domains continuously extending their complexity and functionality.