Challenges of high-density fiber connections in hyperscale data centers
The demand for fast, flawless network connectivity continues to grow as the world becomes increasingly digitally interconnected. High-speed 4G and 5G networks provide instant access to data-intensive apps, games and videos on billions of devices worldwide, while the Internet of Things (IoT) allows those devices to interact seamlessly with one another. Augmented reality (AR), autonomous vehicles and smart home systems, such as virtual assistants and smart appliances, will soon be standard in our homes. New analytical tools, including real-time aircraft tracking and maintenance, credit card fraud detection and high-frequency financial trading, are made possible by the powerful predictive abilities of big data.
In short, we are addicted to our data. Cisco has predicted that internet traffic will increase 127-fold between 2005 and 2021. To put things into perspective, there are about 4 billion smartphones in use today. Since the world entered the current pandemic, internet traffic has increased dramatically. With much of Europe in lockdown for weeks, Netflix, YouTube and Amazon Prime have reduced the quality of their streams in Europe to cope with the influx of traffic. Microsoft has also reported a 775% month-over-month increase in Teams calling and meeting users in Italy and other countries under lockdown or practising social distancing. Every one of these devices connects to a giant data center somewhere on earth, which is why data-driven companies are constantly looking for ways to manage user data demand and the high traffic that comes with it. Hence, data centers have become an astounding, fast-growing service industry.
However, data center growth creates a different set of problems. Connectivity is only possible with an effective and reliable network to manage the large data volumes these services require. One way data-driven companies meet demand is with “hyperscale” data centers – facilities of more than 200,000 square feet. For instance, “The Citadel” in Reno, Nevada, billed as the largest and most efficient data center in the United States, occupies an enormous 7.2 million square feet.
We are reaching the point where hyperscale data centers serving digital companies like Facebook, Google, Amazon and Microsoft are becoming the norm. Millions of servers inside these facilities operate together over fiber optic networks to handle the massive data traffic generated by users. These networks typically consist of hundreds of thousands of metres of fiber optic cable and hundreds of thousands of optical connections, all of which enable fast and efficient data handling. This massive concentration of servers runs very hot and requires extensive cooling. Companies therefore rely on climate-control systems to lower the indoor temperature, which consume enormous amounts of electricity and, in turn, produce large quantities of greenhouse gases.
Another problem is their massive footprint. Because of their size, hyperscale data centers are often built in remote areas where open land is plentiful and cheap. However, a far-away data center introduces performance problems such as latency. These time-degraded responses are unacceptable for many critical applications, such as GPS responses for military systems, remote medical monitoring and diagnostics, or vital financial transactions.
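To see why distance matters, light in silica fiber travels at roughly two-thirds of its speed in vacuum, so propagation delay grows with every kilometre between user and data center. A minimal back-of-the-envelope sketch (the refractive index of ~1.47 is a typical assumed value, and the calculation ignores switching and queuing delays, which add more):

```python
# Rough fiber latency estimate: light in glass travels at ~2/3 c,
# i.e. about 204,000 km/s for a refractive index of ~1.47.
C_VACUUM_KM_S = 299_792   # speed of light in vacuum, km/s
FIBER_INDEX = 1.47        # typical refractive index of silica fiber (assumption)

def round_trip_latency_ms(distance_km: float) -> float:
    """Propagation-only round-trip delay; ignores switching and queuing."""
    speed_km_s = C_VACUUM_KM_S / FIBER_INDEX
    return 2 * distance_km / speed_km_s * 1000

# A data center 2,000 km away adds roughly 20 ms of round-trip delay
print(round(round_trip_latency_ms(2000), 1))  # → 19.6
```

Even before any processing, a remote site imposes a physical floor on response time – which is why latency-sensitive applications favour data centers closer to their users.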
One way to reduce the land footprint of hyperscale data centers is to squeeze more fiber into a smaller cable. Many fiber-optic cable makers have changed their cable construction to pack thousands of optical fibers into a single jacket. Recently, an ultra-high count fiber (UHCF) cable was introduced that packs 6,912 fibers into one cable. Such a cable carries two to three times the data in even less space, shrinking the land required for data centers. This in turn allows for more accessible data center locations and better energy efficiency.
With higher fiber count cables, more connector end faces become vulnerable to contamination. All connectors are essentially dirty: moving parts such as springs and latches generate wear debris. This can lead to fiber network problems such as insertion loss (a weakened signal), back-reflection (signal diverted back towards its source), or even a complete system shutdown. Moreover, multi-fiber connectors are more prone to micro-scratches, and their moulded plastic attracts dust that can be difficult to clean.
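Insertion loss is usually quoted in decibels, and the losses of every mated connector pair along a link add up. A rough sketch of the arithmetic (the ~0.3 dB per mated pair figure is an illustrative assumption, not a specification):

```python
import math

def insertion_loss_db(p_in_mw: float, p_out_mw: float) -> float:
    """Insertion loss in dB: 10 * log10(P_in / P_out)."""
    return 10 * math.log10(p_in_mw / p_out_mw)

def remaining_power_mw(p_in_mw: float, total_loss_db: float) -> float:
    """Optical power left after a given cumulative loss in dB."""
    return p_in_mw * 10 ** (-total_loss_db / 10)

# Eight mated connectors at an assumed ~0.3 dB each: 2.4 dB total,
# leaving only ~57% of the launched power.
total_db = 8 * 0.3
print(round(remaining_power_mw(1.0, total_db), 3))  # → 0.575
```

A single dirty end face can cost more than every clean connection combined, which is why contamination at scale is treated as a link-budget problem rather than a cosmetic one.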
Another challenge faced by hyperscale data centers is the recent unprecedented surge in internet traffic caused by the shift to working from home. With most people spending nearly all their time indoors, they are spending more of it on their devices, whether phone, laptop or TV. Many businesses require employees to hold video conferences, schools run online classes on similar platforms, and others turn to streaming services, such as YouTube and Netflix, to fulfil their entertainment needs. Voice communications, particularly Wi-Fi calling, have seen activity increase by up to 100%. Wireless network operators are working tirelessly to handle these growing levels of activity, but the surge inevitably causes network congestion and affects internet speeds.
Despite these hiccups, the pandemic is actually driving the biggest internet expansion in years. People are using the internet more widely, and usage peaks are now spread across more times of day. Video-streaming companies, including Netflix, YouTube and the newly launched Disney+, have agreed to cut the picture quality of streamed video to avoid adding to the strain. Internet health checks from Ookla, creator of the popular speed test Speedtest.net, indicate that broadband is in overall good health despite some minor slowdowns. Last but not least, the current situation may push newer technology to be deployed earlier than expected. Increased demand has heightened the need for more robust, faster, next-generation wireless internet service. With download speeds up to 100 times faster than 4G, 5G adoption may actually be fast-forwarded in countries such as the United States and China.
Hence, to obtain the best fiber optic performance, cleaning fiber end faces prior to installation should be a top priority for any data center. Companies looking for help choosing the best fiber optic cleaning tools and methods should partner with a fiber-cleaning expert. If you’re looking for fiber inspection and cleaning, be sure to connect with us today.