Transitioning to post-quantum cryptography (PQC) is one of the largest and most impactful changes industrial organizations can make to improve their security posture. By mapping cryptographic dependencies and developing crypto-agile architectures, organizations can get ahead of the threat curve rather than respond to it. Leaders stand to gain more than compliance: the shift builds operational confidence, supply chain trust, and an enduring cyber resilience that can become a true competitive advantage. With the tools, standards, and regulatory frameworks now in place, industrial leaders willing to act will come to see the post-quantum era not as a threat to manage, but as an architecture to build.

    Even though no quantum computer yet exists that is powerful enough to break today’s cryptography, the danger is already present: encrypted information harvested now may be decrypted with future technology. Organizations must also recognize that many industrial devices in use today will remain operational for years to come. In addition, the growing connectivity of Industry 4.0 and the IIoT (Industrial Internet of Things) is eroding the isolation of devices, creating more opportunities for exploitation.

    Industrial systems are not prepared for the quantum era, and the gap is structural. Most OT (operational technology) environments still rely on encryption schemes that could be broken by powerful quantum computers, and the long lifecycle of industrial assets means today’s deployments will likely still be running when quantum threats become practical. As IoT, OT, quantum computing, and AI converge, the issue is no longer theoretical.

    The threat is already active. The ‘harvest now, decrypt later’ model makes quantum risk a present-day concern, especially for critical infrastructure where data must remain secure for decades.

    Crypto-agility emerges as a viable response, though enabling it across OT is not straightforward. Legacy PLCs, embedded devices, and safety systems were never designed for frequent cryptographic updates. NIST’s CSWP 39 reframes agility as a design requirement, pushing organizations to build systems where algorithms can be swapped without disrupting operations. That begins with visibility, something many asset owners and operators lack: because managing risk at scale requires a clear inventory of where cryptography is used, a cryptographic bill of materials becomes essential.

    However, recognizing the need for crypto-agility is only the first step. Turning it into operational reality requires new architectures, deeper visibility into cryptographic dependencies, and a shift in how security is engineered across industrial environments. Otherwise, no matter how strong algorithms used for encryption might be, achieving sufficient security is almost impossible.

    The answer to these needs is a strategic implementation of PQC, a set of quantum-resistant cryptographic algorithms standardized by NIST. PQC is currently considered one of the best available technologies for securing industrial control systems against quantum-enabled attacks and for achieving security resilience over the years. Its implementation is now encouraged by regulatory authorities across sectors.

    When it comes to quantum-resistant encryption standards, NIST finalized the ML-KEM, ML-DSA, and SLH-DSA PQC algorithms in August 2024, with quantum-vulnerable algorithms targeted for complete transition by 2035, though high-risk systems are expected to transition much sooner. The standards give organizations a clear, proven foundation to build on, removing much of the uncertainty that has historically slowed industrial security transitions.

    Yet compliance cannot come at the cost of continuity. A realistic post-quantum transition will be phased, risk-based, and measured in years, beginning with cryptographic discovery and prioritization, not wholesale replacement. Most critical OT systems will likely be the last to reach quantum safety. That makes early planning the only credible strategy.

    Are industrial systems ready for the quantum shift?

    Industrial Cyber reached out to experts to gauge the urgency of the post-quantum threat in OT environments, and to examine what industrial security leaders should understand about the likely timeline before today’s cryptographic protections become insufficient.

    Dr. Dustin Moody, a mathematician in the Computer Security Division at the U.S. NIST and head of its PQC standardization project

    “The quantum threat is significant because of the ‘harvest now, decrypt later’ risk,” Dr. Dustin Moody, a mathematician in the Computer Security Division at the U.S. NIST (National Institute of Standards and Technology) and head of its PQC standardization project, told Industrial Cyber. “While a cryptographically relevant quantum computer (CRQC) doesn’t exist yet, data with long-term sensitivity intercepted today could be decrypted in the future. For any environment with long-lived assets, the time to begin planning is now, as the transition to quantum-resistant algorithms will likely take years to fully implement across an enterprise.”

    Jen Sovada, general manager for public sector at Claroty

    “For OT, the post‑quantum problem is less about ‘Q‑day’ and more about asset useful life versus how long it will take to migrate to quantum-resistant cryptography,” Jen Sovada, general manager for public sector at Claroty, told Industrial Cyber. “Many control systems deployed today will run for 10–30 years, well into when cryptanalytically relevant quantum computers (CRQC) are expected to be operational. Even conservative estimates place them in the 2030-2035 range.”

    Sovada said that industrial leaders should assume any OT system with a useful life beyond 10 years must be quantum‑resistant, and that planning, inventory, and pilots must start now. “CISA already warned that OT will likely be the last domain to reach post‑quantum compliance because of long patch cycles and safety constraints, which argues for early, phased preparation.”
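    Sovada’s framing of asset useful life versus migration time is often formalized as Mosca’s inequality: if the years an asset’s data must stay protected plus the years a migration takes exceed the years until a CRQC arrives, the organization is already behind. A minimal Python sketch, with purely illustrative numbers rather than predictions:

```python
def pqc_urgency(shelf_life_years: float,
                migration_years: float,
                years_to_crqc: float) -> bool:
    """Mosca's inequality: True means migration must start now, because
    protection is needed beyond the assumed CRQC arrival date."""
    return shelf_life_years + migration_years > years_to_crqc

# Illustrative numbers only: a controller protecting 15-year data,
# with a 7-year migration, against a CRQC assumed ~10 years out.
print(pqc_urgency(15, 7, 10))   # True: already behind schedule
```

    The value of the inequality is less the arithmetic than the forcing function: it turns a vague future threat into a per-asset planning deadline.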

    Mike Hoffman, technical leader at Dragos

    Mike Hoffman, technical leader at Dragos, told Industrial Cyber that the post-quantum threat in OT is real but not immediate, and its impact is uneven across OT verticals and across their network architectures. “Most ICS/OT protocols already lack native cryptography, or if they do include it, it is rarely enabled, so the risk is concentrated in higher-level systems, such as OT-to-IT communications, remote access, and identity services.”

    He called upon owners and operators to focus on where cryptography underpins trust, including authentication and data integrity across trust zones. “Given long OT lifecycles, organizations should begin now by inventorying crypto dependencies, engaging vendors on post-quantum readiness, and planning phased upgrades based on business risk and system criticality. In actuality, this is more likely a low percentage of communications that need to be addressed in the near term.”

    Anton Shipulin, industrial cybersecurity evangelist at Nozomi Networks

    “The post-quantum threat is not immediate. It will take years before quantum computing can break widely used cryptographic algorithms at scale,” according to Anton Shipulin, an industrial cybersecurity evangelist at Nozomi Networks. “However, OT organisations should not delay action. Industrial systems have long lifecycles, often 10–15 years, and are difficult to modify due to limited maintenance windows and operational constraints.”

    Identifying that this makes early planning essential, Shipulin added that the current timeline provides an opportunity to prepare gradually, without urgency. “At the same time, many OT environments still have limited use of cryptography, so organisations must both improve adoption today and plan for future transitions.”

    Enabling crypto-agility across OT environments

    The executives address what crypto-agility practically means in OT environments. They also focus on how organizations can build in the ability to replace or upgrade cryptographic algorithms despite long asset lifecycles, legacy systems, and tight maintenance windows.

    Moody said that cryptographic (crypto) agility refers to the capabilities needed to replace and adapt cryptographic algorithms in protocols, applications, software, hardware, firmware, and infrastructures while preserving security and ongoing operations. “Owners and operators need to push their technology providers to consider which aspects of crypto agility they can achieve.”

    In OT, Sovada said that crypto‑agility means treating cryptography as a changeable subsystem, not a baked‑in product feature. Practically, that means algorithm‑independent interfaces, externalized key management, and updatable trust anchors for devices and gateways.

    “OT systems must include modular cryptographic libraries instead of hard‑coded primitives in PLC firmware, firmware updates that can accept new signature schemes, and gateways that can terminate hybrid VPNs and legacy protocols,” Sovada added. “Because OT lifecycles span decades, organizations should build crypto‑agility into new designs now, while using bump‑in‑the‑wire encryptors, protocol translators, and segmented architectures to wrap legacy equipment. Governance matters as much as architecture: crypto‑agility should be treated as a standard engineering requirement, not a future ‘nice to have.’”
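    Sovada’s point about firmware updates that can accept new signature schemes comes down to dispatching on an algorithm identifier carried in the update itself. The Python sketch below is purely illustrative: it uses HMAC tags as a stand-in for real signatures (the standard library has no asymmetric primitives), and the package layout is invented. In a real device the identifiers would map to today’s ECDSA and, later, a PQC scheme such as ML-DSA.

```python
import hmac
import hashlib

# Stand-in "schemes": HMAC tags in place of real signatures, because the
# Python stdlib has no asymmetric primitives. The point is the dispatch:
# the verifier never hard-codes a single algorithm.
SCHEMES = {
    0x01: lambda key, blob: hmac.new(key, blob, hashlib.sha256).digest(),
    # 0x02 would map to a PQC scheme (e.g. ML-DSA) once deployed.
    0x02: lambda key, blob: hmac.new(key, blob, hashlib.sha3_256).digest(),
}

def sign_update(scheme_id: int, key: bytes, firmware: bytes) -> bytes:
    """Package = 1-byte scheme id || 32-byte tag || firmware (illustrative)."""
    tag = SCHEMES[scheme_id](key, firmware)
    return bytes([scheme_id]) + tag + firmware

def verify_update(key: bytes, package: bytes):
    scheme_id, tag, firmware = package[0], package[1:33], package[33:]
    fn = SCHEMES.get(scheme_id)
    if fn is None:
        return None  # unknown scheme: reject, never fall back silently
    return firmware if hmac.compare_digest(fn(key, firmware), tag) else None
```

    Rejecting unknown identifiers outright, instead of falling back to a default, is the design choice that keeps the dispatch from becoming a downgrade path.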

    Hoffman said that crypto-agility in OT means the ability to identify, replace, and upgrade cryptographic mechanisms, especially those tied to identity and trust, without disrupting operations. “While many OT environments have limited need for data confidentiality, they rely more on authentication and integrity across trust boundaries. This is where post-quantum risk is most significant, as it primarily affects asymmetric public key cryptography rather than symmetric encryption, such as AES-256.”

    In practice, he added that organizations should focus on mapping communication paths that use cryptography for identity and trust, and prioritize and select systems that support modular or upgradable crypto. Designing for crypto-agility in the future requires vendor engagement, standardized protocols, and maintenance planning aligned with operational constraints and long asset lifecycles.

    “In OT environments, crypto-agility means the ability to update or replace cryptographic algorithms without major redesign, hardware replacement, or operational disruption,” Shipulin told Industrial Cyber. “Today, the priority is not immediate migration, but ensuring systems can support future changes. This mainly requires vendors to design software, firmware, and hardware that can accommodate cryptographic updates with minimal impact, along with other security capabilities, which are more needed now against existing threats.”

    He added that this is critical in OT, where maintenance windows are limited, and changes are costly. At the same time, many vendors are cautious about modifying stable systems, making crypto-agility as much a vendor and lifecycle management issue as a technical one.

    Tracking encryption dependencies in OT environments

    The executives examine where the most significant cryptographic exposures exist across industrial infrastructure today. They also look into how organizations should tackle the challenge of identifying and inventorying encryption dependencies across complex, heterogeneous OT environments.

    “One of the first steps for any organization to do is a comprehensive cryptographic discovery process,” Moody observed. “This inventory should focus on identifying where public-key encryption is used for identity, authentication, and secure communication. Without a clear inventory of what algorithms are currently in use, it is impossible to prioritize what needs to be replaced. Your first question included the threat of ‘Harvest Now, Decrypt later.’” 

    He added that OT organizations can build on this awareness, correlating their cryptographic inventories with their other dashboards to identify the high-value assets that should migrate to PQC first.
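    A cryptographic bill of materials does not need to be exotic to be useful. The sketch below is a hypothetical minimal schema in Python, paired with the kind of filter Moody describes: flag quantum-vulnerable public-key algorithms on long-lived, high-criticality assets as the first migration wave. The algorithm names and thresholds are illustrative assumptions, not a standard.

```python
from dataclasses import dataclass

# Algorithms whose security rests on problems a CRQC could solve
# (factoring / discrete log). Symmetric ciphers like AES-256 are not listed.
QUANTUM_VULNERABLE = {"RSA-2048", "ECDSA-P256", "ECDH-P256", "DH-2048"}

@dataclass
class CbomEntry:
    asset: str           # device or application name
    algorithm: str       # algorithm observed in use
    purpose: str         # "auth", "key-exchange", "firmware-signing", ...
    remaining_life: int  # expected years the asset stays deployed
    criticality: int     # 1 (low) .. 5 (safety/process critical)

def migration_candidates(cbom: list[CbomEntry],
                         min_life: int = 10) -> list[CbomEntry]:
    """Flag quantum-vulnerable public-key use on long-lived assets,
    most critical first, as the first wave of PQC migration."""
    hits = [e for e in cbom
            if e.algorithm in QUANTUM_VULNERABLE and e.remaining_life >= min_life]
    return sorted(hits, key=lambda e: -e.criticality)
```

    Even a spreadsheet with these five columns, kept current, answers the prioritization question most OT operators cannot answer today.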

    Sovada mentioned that the most consequential cryptographic exposures are remote access and VPNs terminating into OT networks; PKI used for firmware signing, safety systems, and field device authentication; and proprietary management protocols with weak or non‑existent encryption on critical links.

    She added that organizations should inventory current cryptographic protocols and cipher suites, and engage OEMs for bill‑of‑materials and roadmaps to transition to quantum-resistant algorithms.

    “The most significant cryptographic exposures in OT are found at trust boundaries and in lower-trust zones, particularly in edge communications such as SCADA telemetry, AMI metering, and IIoT deployments,” Hoffman said. “These environments rely on present implementations of public-key cryptography and key-exchange mechanisms, which are particularly vulnerable to post-quantum threats.” 

    The latest Dragos Year in Review report highlights that adversaries are actively targeting edge devices as a primary entry method. In a post-quantum world, these edge devices may be even easier to compromise.

    Hoffman said that organizations should start by inventorying edge devices and communication paths where identity and trust are enforced. “The focus should be on identifying dependencies on public/private key algorithms, mapping trust relationships, and prioritizing high-impact systems. Given the scale and diversity of OT environments, this requires a phased, risk-based approach aligned to operational criticality.”

    “The most significant cryptographic exposure in OT today is still the limited use of encryption across industrial protocols and communications,” Shipulin assessed. “While this is improving, it remains a more immediate risk than post-quantum threats.”

    He noted that another critical gap is the lack of firmware integrity protection. Many industrial devices still do not enforce digital signatures, a gap that has been exploited in real incidents, for example, in a recent attack on Polish critical infrastructure, where malicious firmware was deployed to RTUs without verification, causing disruption.

    “From our practical experience, wireless networks are also a weak point. As their use grows in OT, many deployments still rely on weak or outdated security, or no protection at all,” Shipulin said. “To address this, organisations need to move beyond basic asset inventory. They must identify where and how cryptography is used across communications and software. This requires deeper visibility into software components (e.g., SBOM), supply chain dependencies, and trust mechanisms. In practice, this places new requirements on both tools and processes for asset inventory, making it a foundational capability for managing current and future cryptographic risks.”
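    Moving from asset inventory to ‘where and how cryptography is used’ can start with something as simple as flagging algorithm identifiers in device configurations, SBOM documents, or exported settings. The Python sketch below is a hypothetical starting point, not a product feature; the pattern lists are illustrative and would need tuning per environment.

```python
import re

# Illustrative pattern lists: algorithm identifiers worth flagging in
# device configs, SBOM documents, or protocol captures.
PATTERNS = {
    "quantum-vulnerable": re.compile(r"\b(RSA|ECDSA|ECDH|DH)[-_ ]?\d*\b"),
    "legacy-weak":        re.compile(r"\b(MD5|SHA1|DES|RC4|WEP)\b"),
}

def scan_text(source: str, text: str) -> list[tuple[str, str, str]]:
    """Return (source, category, match) triples for each algorithm hit,
    feeding a cryptographic inventory one file at a time."""
    hits = []
    for category, pattern in PATTERNS.items():
        for match in pattern.finditer(text):
            hits.append((source, category, match.group(0)))
    return hits
```

    Text scanning will never find cryptography compiled into firmware, which is why this complements, rather than replaces, vendor-supplied SBOM and CBOM data.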

    Balancing standards and strategy in post-quantum shift

    The executives explore how industrial operators should prioritize post-quantum readiness. They also concentrate on how much weight to assign to emerging government standards, regulatory guidance, and vendor roadmaps when shaping long-term strategy.

    Prioritization should follow a risk-based approach, Moody identified, adding that organizations should focus first on their most critical data and long-life assets. “It is essential to follow and use PQC standards and monitor vendor roadmaps. Rather than ‘reinventing the wheel,’ readiness should align with industry-wide best practices and regulatory guidance as they mature.” 

    He mentioned that specific OT sectors should develop consensus on questions to ask their suppliers about CRQC readiness plans, and individual OT organizations should tailor such surveys for their providers of services and products offered that leverage public-key cryptography.

    “Industrial operators should prioritize post‑quantum readiness based on mission impact and time‑to‑change,” Sovada detailed. “Start with OT functions that require confidentiality or integrity beyond 10 years and have long change cycles; those become early PQC candidates.”

    She noted that government standards and guidance should anchor the baseline, since they increasingly drive regulation and vendor behavior. “Standards define the ‘what,’ regulators define the ‘when,’ and vendors shape the ‘how fast’ you can move within specific product lines. Board‑level risk discussions should frame PQC as a long‑term resilience and compliance program, not a one‑time cryptography upgrade.”

    “Industrial operators should prioritize post-quantum readiness based on risk, focusing first on systems where cryptography underpins identity, trust, and external connectivity,” Hoffman said. “New procurements should require vendor alignment with current and emerging cryptographic standards, including post-quantum readiness and upgradeability.”

    He added that government regulations and industry standards should provide important direction, but vendor roadmaps and practical deployability will ultimately drive timelines. As with past transitions (e.g., DES, RC4, MD5, weak Wi-Fi protocols like WEP, and legacy schemes such as LANMAN), insecure cryptography tends to persist in OT far beyond its intended lifespan. Organizations should plan for phased replacement, implement compensating controls when upgrades are not immediately feasible, and align investments with business risk and operational impact.

    Shipulin said that post-quantum readiness should start with integrating crypto-agility into overall risk management and governance. At this stage, the priority is planning and inventorying critical assets and their cryptographic dependencies.

    “Standards and regulatory guidance play a critical role. They define recommended and, in some cases, mandatory algorithms and provide a clear direction for organizations,” according to Shipulin. “Frameworks such as NIST PQC standards, NIST CSWP 39, NSA CNSA 2.0, CISA procurement guidance, and the NCSC’s 2035 guidance are particularly important as they shape the industry baseline for post-quantum transition.”

    He added that vendor roadmaps are also a key input. They provide visibility into how technologies will evolve and how post-quantum capabilities will be implemented in real products. This helps organizations set realistic expectations and define requirements for cryptographic flexibility and long-term support. In practice, organizations should balance all three – use standards to guide strategy, vendor roadmaps to validate feasibility, and internal risk assessment to prioritize actions.

    Planning realistic post-quantum transition for critical infrastructure

    For critical infrastructure operators, the executives outline what a realistic post-quantum transition plan looks like. They also zero in on how to balance the push to modernize cryptography with the operational realities of systems that cannot be easily patched, updated, or taken offline.

    Moody said that a realistic plan is phased and starts with education and inventory. This begins by assessing the ‘quantum risk’ of current systems, followed by a gap analysis against new standards. 

    “Because many systems cannot be taken offline or easily patched, the transition will likely involve a hybrid approach—layering quantum-resistant protections alongside existing classical encryption to ensure security remains intact during the multi-year migration,” according to Moody. “This start should also include identifying the OT organization’s internal team of risk experts, long-lived data, authentication systems, and communications/networking.” 

    He added that Section 5 of NIST CSWP 39, ‘Considerations for Achieving Crypto Agility: Strategies and Practices,’ offers guidance on managing organizations’ crypto risks.

    “A realistic post‑quantum transition for critical infrastructure is phased, hybrid, and tightly aligned to maintenance realities,” Sovada outlined. “Near‑term, operators should harden segmentation, reduce remote‑access exposure, and deploy hybrid cryptography (classical plus PQC).”

    In parallel, she said that they should build a cryptographic asset inventory, classify systems by criticality and updatability, and integrate PQC requirements into technology refresh and procurement cycles. “For truly unpatchable or safety‑frozen systems, the plan will rely on compensating controls: one‑way gateways, data diodes, mediating proxies, and strict operational procedures. The goal is to synchronize cryptographic changes with existing outage windows and regulatory reviews, accepting that some OT segments will lag but will be ring‑fenced with strong architectural defenses.”
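    The ‘classical plus PQC’ layering that both Moody and Sovada describe typically means deriving one session key from two independent shared secrets, so an attacker must break both exchanges. The Python sketch below is illustrative: it implements a minimal single-block HKDF (after RFC 5869) over the standard library, and the two input secrets are placeholders for, say, an ECDH output and an ML-KEM output.

```python
import hmac
import hashlib

def hkdf_sha256(ikm: bytes, salt: bytes, info: bytes, length: int = 32) -> bytes:
    """Minimal HKDF (RFC 5869) with SHA-256: extract, then a single
    expand block, so length must be <= 32 in this sketch."""
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()            # extract
    okm = hmac.new(prk, info + b"\x01", hashlib.sha256).digest()  # expand T(1)
    return okm[:length]

def hybrid_session_key(classical_secret: bytes, pqc_secret: bytes) -> bytes:
    """Concatenate both shared secrets before the KDF: an attacker must
    recover BOTH (e.g. the ECDH and the ML-KEM secret) to get the key."""
    return hkdf_sha256(classical_secret + pqc_secret,
                       salt=b"hybrid-demo-salt", info=b"ot-session")
```

    The design property worth noting is that the hybrid key is no weaker than the stronger of its two inputs, which is why hybrid deployment is the common bridge while PQC implementations mature.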

    Hoffman identified that a realistic post-quantum transition plan in OT starts with fundamentals. “Organizations should first align with recommended practices, such as the Five SANS ICS Cybersecurity Critical Controls, particularly around asset visibility and network segmentation, before prioritizing cryptographic modernization. If you don’t understand your asset communication and network architectures, you can’t effectively address crypto risk.”

    “The urgency is not immediate; many operators are still addressing baseline security gaps,” he pointed out. “Focus first on high-exposure areas such as edge devices and external connectivity, then work down the stack. Where upgrades are not feasible, implement compensating controls. A phased, risk-based approach is essential, balancing modernization with operational constraints, while avoiding unnecessary complexity or blanket cryptography mandates.”

    Shipulin said that the key is to treat PQC as a lifecycle program, not a one-time upgrade. “For critical infrastructure, the best approach is ‘inventory now, design for agility now, migrate where possible now, and replace the hardest legacy assets on a defined schedule.’”
