To mark 100 years since the development of quantum mechanics, the United Nations has proclaimed 2025 the International Year of Quantum Science and Technology. While quantum computing creates enormous economic and scientific opportunities, it also poses serious cybersecurity risks, such as rendering current encryption tools obsolete. Experts estimate there is a more than 50% chance that quantum computers will be powerful enough to break standard public-key encryption within 15 years.

NIST, the National Institute of Standards and Technology, is developing new standards for the next generation of encryption algorithms – called Post-Quantum Cryptography (PQC) – that can withstand the power of quantum computers.

The Urgency of Migrating to PQC

While cryptographically relevant quantum computers are still thought to be 5-10 years away, bad actors can illegally capture data now and decrypt it later, once quantum computers become operational. This threat is called Store Now, Decrypt Later (SNDL), and it is a major concern for governments, militaries, and financial institutions. One way to address it is to adopt post-quantum cryptography (PQC), protecting sensitive data from the potential impact of quantum computing and the risk of stolen information.

NIST has released three PQC standards – ML-KEM (FIPS 203) for key establishment, and ML-DSA (FIPS 204) and SLH-DSA (FIPS 205) for digital signatures – deemed strong enough to resist quantum cyberattacks, allowing institutions to begin transitioning from classical encryption to PQC. The process is frequently compared to Y2K preparations, involving system audits and upgrades designed to minimize disruption to employees.

Ensuring a Smooth Transition to PQC

Migration to PQC presents numerous challenges, ranging from technical to operational:

  • Algorithm migration: transitioning from classical cryptography to PQC requires identifying and integrating the most suitable algorithms and deciding whether to move directly to PQC or to employ a hybrid approach (combining classical algorithms with PQC algorithms)
  • Key sizes and performance impacts: a major difference between classical and PQC algorithms is the size of the key material. PQC public keys, ciphertexts, and signatures are much larger – an ML-KEM-768 public key is 1,184 bytes, versus 32 bytes for X25519 – which may strain bandwidth, memory, and processing power
  • Software and hardware upgrades: many legacy systems were not designed to support PQC. Updating software libraries, firmware, and hardware to handle the computational and storage demands of PQC is resource intensive.
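To illustrate the hybrid approach mentioned above, the sketch below combines a classical key-exchange secret with a PQC one so that an attacker must break both primitives to recover the session key. This is a minimal, illustrative sketch: the two shared secrets are stand-in random bytes rather than outputs of real X25519 or ML-KEM handshakes, and the HKDF implementation is a bare-bones RFC 5869 construction from the Python standard library.

```python
import hashlib
import hmac
import os

def hkdf_sha256(ikm: bytes, salt: bytes, info: bytes, length: int = 32) -> bytes:
    """Minimal HKDF (RFC 5869) built from the standard library."""
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()  # extract step
    okm, block, counter = b"", b"", 1
    while len(okm) < length:                            # expand step
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

def hybrid_session_key(ecdh_secret: bytes, mlkem_secret: bytes) -> bytes:
    """Concatenate the classical and PQC shared secrets, derive one key.

    Breaking the session key requires breaking BOTH underlying primitives.
    """
    return hkdf_sha256(ecdh_secret + mlkem_secret,
                       salt=b"pqc-hybrid-demo", info=b"session-key")

# Placeholders for real handshake outputs (NOT actual key exchanges):
ecdh_secret = os.urandom(32)   # e.g. an X25519 shared secret (32 bytes)
mlkem_secret = os.urandom(32)  # an ML-KEM shared secret (32 bytes per FIPS 203)
session_key = hybrid_session_key(ecdh_secret, mlkem_secret)
print(len(session_key))  # 32-byte symmetric key
```

This mirrors the pattern used by hybrid TLS key-exchange proposals, where classical and PQC secrets are concatenated before key derivation; production systems should rely on a vetted TLS library rather than hand-rolled key derivation.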

The only way to guarantee safe migration to PQC with protected end-user performance is to test, and test early!

It is crucial to verify key performance indicators such as:

  • Number of concurrent HTTPS proxy users
  • Concurrent connections
  • New connections per second
  • Business throughput
  • End-to-end latency
  • Long-term stability
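As a small illustration of verifying the latency KPI above, the sketch below summarizes a batch of end-to-end latency samples into the percentile figures a load-test report would track. The sample data is simulated (the specific millisecond values are assumptions for illustration only); in practice the samples would come from a test tool driving real handshakes.

```python
import random
import statistics

def latency_percentiles(samples_ms):
    """Summarize end-to-end latency samples into common load-test KPIs."""
    ordered = sorted(samples_ms)
    # statistics.quantiles with n=100 yields 99 cut points: index 49 is p50.
    cuts = statistics.quantiles(ordered, n=100)
    return {"p50": cuts[49], "p95": cuts[94], "p99": cuts[98],
            "max": ordered[-1], "count": len(ordered)}

# Simulated handshake latencies: PQC handshakes trend somewhat higher than
# classical ones due to larger key material (illustrative numbers only).
random.seed(7)
classical = [random.gauss(40, 8) for _ in range(10_000)]
pqc = [random.gauss(55, 10) for _ in range(10_000)]

for label, data in (("classical", classical), ("pqc", pqc)):
    kpis = latency_percentiles(data)
    print(f"{label}: p50={kpis['p50']:.1f}ms p95={kpis['p95']:.1f}ms")
```

Comparing percentile distributions before and after enabling PQC, rather than averages alone, makes tail-latency regressions visible early.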

While VPN infrastructure manufacturers and VPN service providers endeavor to perform such testing, the responsibility for maintaining end-user performance after the migration to PQC ultimately falls on the enterprise IT department.

The Importance of Scale Test

Many enterprise IT departments test a new feature before rolling it out company-wide. This could involve a small group of test personnel who log in from around the world to test the new feature. While this approach works for most upgrades and bug fixes, it falls short for changes involving additional compute overhead. In the U.S. alone, there are over 10,000 enterprises with more than 1,000 employees, making it essential to conduct a scale test to ensure a smooth transition to PQC. While scheduling such a test with real employees is impossible, there is an alternative.

The VIAVI NITRO Wireless TeraVM can emulate tens of thousands of employees and their locations (remote, VPN, onsite, managed device, and more), then emulate their office traffic, such as collaboration tools, video conferences, and private application access. TeraVM can run traffic with increasing numbers of emulated employees while simultaneously monitoring KPIs such as latency, throughput, and MOS scores – ensuring confidence to launch live.

About The Author

TeraVM Marketing Manager, VIAVI Solutions
