One of the many things that excite me about working in the tech sector is collaboration. Working collectively on new ideas and new technologies with people spread across the globe continues to inspire me on a daily basis. While on vacation a few weeks ago, I was able to watch such collaboration in action at a conference held by Om Malik on the topic of big data.
With a global delegate list and many people interacting remotely through social media channels, the Structure Big Data conference explored all aspects of managing today’s enormous data demands. However, one discussion in particular caught my attention. Focused on the challenges of moving big data around the world, this dialogue dived deep into the hurdles both business and domestic users face when trying to access data when they need it, where they need it.
To anyone working in the networking industry, the challenge of universal data access is one that drives a great deal of discussion. The rapid growth of cloud computing applications and the demand to access rich media across fixed and mobile infrastructure have resulted in networks that are oversubscribed and at times unable to respond. The question of how we move big data across our global networks has never been so critical.
What I find interesting is that it’s the very notion of collaboration that is partly responsible for today’s data explosion. If we consider the rapid growth in cloud computing and the sharing of data, or the demand for tools that enable online communication, such as Skype or FaceTime, we can see that the world is moving ever faster towards online collaboration. Whether at work or at home, the shift towards online collaboration and communication has never been so strong. To facilitate this move, the network needs to change.
In an earlier post, I noted how the buzz is back in infrastructure. A much broader spectrum of venture capitalists are now starting to invest in optical networking, realizing that a new wave of technological innovation is about to hit. This innovation is targeted at rebuilding the core of our networks; moving away from current legacy infrastructures to new technologies that can answer the growing bandwidth demand. I believe that it’s this innovation that will answer the big data challenge.
It will be interesting to watch how a new core network impacts businesses’ adoption of cloud computing solutions. Will a new core bring a wave of encryption and low-latency solutions that make the cloud more attractive for the transport of mission-critical information? Will businesses move away from custom-built networks to ready-made solutions? It’s going to be an exciting few years ahead.
Do you agree? What challenges do you find with big data? Do you believe that a new core will be able to respond to future bandwidth demand? As always, I’d be interested to hear from you on this.
Read more on this topic here: Agile Core Transport.