Optimizing data center placement and network design to strengthen cloud computing – Phys.Org

February 14, 2017

Telecommunication experts estimate that the amount of data stored "in the cloud," or in remote data centers around the world, will quintuple in the next five years. Whether it's streaming video or a business's database content drawn from distant servers, all of this data is, and for the foreseeable future will continue to be, accessed and transmitted by lasers sending pulses of light along long bundles of flexible optical fibers.

Traditionally, the rate at which information is transmitted does not take into account the distance that data must travel, even though shorter distances can support higher rates. Yet as traffic grows in volume and consumes more of the available bandwidth, or capacity to transfer bits of data, researchers have become increasingly aware of the limitations of this mode of transmission.

New research from Nokia Bell Labs in Murray Hill, New Jersey, may offer a way to capitalize on this idea and improve data transfer rates for cloud-computing traffic. The results of this work will be presented at the Optical Fiber Communications Conference and Exhibition (OFC), held 19-23 March in Los Angeles, California, USA.

"The challenge for legacy systems that rely on fixed-rate transmission is that they lack flexibility," said Dr. Kyle Guan, a research scientist at Nokia Bell Labs. "At shorter distances, it is possible to transmit data at much higher rates, but fixed-rate systems lack the capability to take advantage of that opportunity."

Guan worked with a newly emerged transmission technology called "distance-adaptive transmission," in which the equipment that receives and transmits these light signals can change the rate of transmission depending on how far the data must travel. With this, he set about building a mathematical model to determine the optimal layout of network infrastructure for data transfer.
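The principle can be sketched in a few lines of code. The reach/rate pairs below are hypothetical round numbers chosen only to illustrate the idea, not figures from Guan's work:

```python
# Sketch of distance-adaptive transmission: a transceiver picks the
# highest line rate whose optical reach covers the path length, while a
# fixed-rate system sends at one rate regardless of distance.
# The reach/rate table is illustrative, not real equipment data.

RATE_BY_REACH = [
    (600,  400),   # paths up to 600 km  -> 400 Gb/s per wavelength
    (1200, 300),   # paths up to 1200 km -> 300 Gb/s
    (2500, 200),   # paths up to 2500 km -> 200 Gb/s
    (5000, 100),   # paths up to 5000 km -> 100 Gb/s
]

def adaptive_rate(distance_km):
    """Highest rate (Gb/s) supported at this distance; 0 if unreachable."""
    for reach_km, rate_gbps in RATE_BY_REACH:
        if distance_km <= reach_km:
            return rate_gbps
    return 0

def fixed_rate(distance_km, rate_gbps=100, max_reach_km=5000):
    """A fixed-rate system uses the same rate at every distance."""
    return rate_gbps if distance_km <= max_reach_km else 0

# In this toy model, a short 500 km link carries 4x the traffic per
# wavelength under distance-adaptive transmission.
print(adaptive_rate(500), fixed_rate(500))  # prints: 400 100
```

The gain comes entirely from short paths: at the maximum reach both schemes fall back to the same rate, which is why data center placement, the subject of Guan's model, matters so much.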

"The question that I wanted to answer was how to design a network that would allow for the most efficient flow of data traffic," said Guan. "Specifically, in a continent-wide system, what would be the most effective [set of] locations for data centers and how should bandwidth be apportioned? It quickly became apparent that my model would have to reflect not just the flow of traffic between data centers and end users, but also the flow of traffic between data centers."

External industry research suggests that this second type of traffic, between the data centers, represents about one-third of total cloud traffic. It includes activities such as data backup and load balancing, whereby tasks are completed by multiple servers to maximize application performance.

After accounting for these factors, Guan ran simulations with his model of how data traffic would flow most effectively in a network.
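A toy version of this kind of simulation, with a made-up four-city topology, invented demands, and a hypothetical reach/rate table, might look like the following. It exhaustively tries every pair of candidate data center sites and counts how many wavelengths each placement needs; Guan's actual model is far richer, including inter-data-center traffic:

```python
# Toy placement study: choose 2 data center sites out of 4 cities to
# minimize the number of wavelengths needed to serve user demand.
# Coordinates (km) and demands (Gb/s) are invented for illustration.
import math
from itertools import combinations

CITIES = {"A": (0, 0), "B": (1000, 0), "C": (2000, 500), "D": (3000, 0)}
DEMAND = {"A": 800, "B": 400, "C": 600, "D": 500}

def dist(u, v):
    (x1, y1), (x2, y2) = CITIES[u], CITIES[v]
    return math.hypot(x2 - x1, y2 - y1)

def rate(d_km, adaptive):
    """Per-wavelength rate in Gb/s at this distance."""
    if not adaptive:
        return 100  # fixed-rate: 100 Gb/s everywhere
    for reach, r in [(600, 400), (1200, 300), (2500, 200), (5000, 100)]:
        if d_km <= reach:
            return r
    return 100

def wavelengths_needed(sites, adaptive):
    """Serve each city from its nearest chosen site."""
    total = 0
    for city, gbps in DEMAND.items():
        d = min(dist(city, s) for s in sites)
        total += math.ceil(gbps / rate(d, adaptive))
    return total

def best_placement(k, adaptive):
    return min(combinations(CITIES, k),
               key=lambda sites: wavelengths_needed(sites, adaptive))

for adaptive in (False, True):
    sites = best_placement(2, adaptive)
    print(adaptive, sites, wavelengths_needed(sites, adaptive))
```

Even in this tiny example the adaptive scheme needs far fewer wavelengths than the fixed-rate one, because demand served over short distances rides on higher-rate wavelengths; the exhaustive search over site pairs stands in for the optimization Guan performs at continental scale.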

"My preliminary results showed that in a continental-scale network with optimized data center placement and bandwidth allocation, distance-adaptive transmission can use 50 percent less wavelength resources or light transmission, and reception equipment, compared to fixed-rate rate transmission," said Guan. "On a functional level, this could allow cloud service providers to significantly increase the volume of traffic supported on the existing fiber-optic network with the same wavelength resources."

Guan recognizes other important issues related to data center placement. "Other important factors that have to be considered include the proximity of data centers to renewable sources of energy that can power them, and latency, the interval of time that passes between when an end user or data center initiates an action and when it receives a response," he said.

Guan's future research will involve integrating these types of factors into his model so that he can run simulations that even more closely mirror the complexity of real-world conditions.
