Denis Mandich, CTO of Qrypt, a quantum cybersecurity company, and founding member of the Quantum Economic Development Consortium and CQT.
The center of gravity in enterprise AI has moved from “who has the best model” to “who can operate the largest reliable AI factory.” Rack-scale platforms from NVIDIA and others that combine GPUs, CPUs, networking and security processors into a single validated system accelerate that shift, compressing the time between prototype and production. That’s the good news.
The uncomfortable part is that AI factories and AI agents expand the blast radius of security failures. When your competitive advantage is embodied in model weights, proprietary training data and real-time telemetry, data-in-transit becomes as valuable as data-at-rest. And as these factories increasingly power physical systems like robots, vehicles, autonomous workflows and industrial controls, the cyber risk becomes operational and kinetic risk.
Executives don’t need another cyber vendor pitch. They need a clear cyber operating principle: In the AI factory era, cryptography is a supply chain and must be engineered for disruption. Let’s look at the main pressures reshaping enterprise cryptography and how to manage them.
‘Harvest now, decrypt later’ is already a now problem.
Many organizations still treat quantum risk as a future event, or the next organization’s problem. But adversaries don’t need a cryptanalytically relevant quantum computer to start winning: They can capture encrypted traffic today and wait, at little to no cost, for a potentially enormous payoff. NIST explicitly calls out “harvest now, decrypt later” (HNDL) as a key driver for prioritizing this once-in-a-generation migration.
For AI factories, the stakes are higher because the payloads are richer: foundation model weights and fine-tunes, proprietary datasets and synthetic data pipelines, agent telemetry and tool outputs, and orchestration traffic between clusters, storage and edge endpoints.
A “breach” may be invisible until years later, when it’s far too late to recover from harvested intellectual property (IP). The cost of sitting on harvested encrypted information is effectively zero compared to the potential payoff.
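A simple way to reason about HNDL exposure is Mosca’s inequality: if the years a payload must stay secret plus the years a PQC migration will take exceed the years until a cryptanalytically relevant quantum computer exists, traffic harvested today is already at risk. The sketch below applies that test to a few illustrative AI factory data flows; the flow names and the horizon estimates are assumptions for the example, not real data.

```python
def hndl_exposed(secrecy_years: float, migration_years: float,
                 crqc_horizon_years: float) -> bool:
    """Mosca's inequality: True if ciphertext captured today could be
    decrypted while the underlying secret still matters."""
    return secrecy_years + migration_years > crqc_horizon_years

# Illustrative inventory: (flow name, years the payload must remain secret)
flows = [
    ("model-weight replication", 10),
    ("ephemeral inference traffic", 0.1),
    ("proprietary training data", 20),
]

MIGRATION_YEARS = 5   # assumed enterprise-wide PQC rollout time
CRQC_HORIZON = 12     # assumed years until a relevant quantum computer

for name, secrecy in flows:
    status = "AT RISK" if hndl_exposed(secrecy, MIGRATION_YEARS,
                                       CRQC_HORIZON) else "ok"
    print(f"{status:8}: {name}")
```

The point of the exercise is less the specific numbers than the shape of the result: long-life secrets fail the test even under optimistic quantum timelines, which is why they should be migrated first.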
Physical AI creates a long-life security mandate.
Robots, industrial edge devices and autonomous systems aren’t on a two-year refresh cycle. They’re typically deployed for a decade or more. That makes crypto design a life-cycle decision: A system secured with legacy primitives in 2026 may become brittle mid-service, when upgrades are most complex and most expensive.
This is why post-quantum cryptography (PQC) moved from the research phase to standards and deployment programs. NIST finalized its first PQC standards in 2024, including ML-KEM for key establishment. The U.S. national security community is also publishing transition guidance aligned to quantum-resistant algorithms, and the White House quantum strategy will soon complement the 2025 AI roadmap.
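Hybrid modes deserve a concrete illustration. The idea is to derive the session key from both a classical shared secret and a PQC shared secret, so the result stays safe as long as either primitive holds. The sketch below uses only the Python standard library: the two “shared secrets” are random stand-ins for what would come from, e.g., X25519 and ML-KEM-768, and the key derivation is a minimal HKDF (RFC 5869) over SHA-256. The salt and info labels are illustrative.

```python
import hashlib
import hmac
import os

def hkdf_sha256(ikm: bytes, salt: bytes, info: bytes, length: int = 32) -> bytes:
    """Minimal HKDF (RFC 5869) extract-and-expand over SHA-256."""
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()      # extract
    okm, block, counter = b"", b"", 1
    while len(okm) < length:                                # expand
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

classical_ss = os.urandom(32)   # stand-in for an ECDH shared secret
pqc_ss = os.urandom(32)         # stand-in for an ML-KEM shared secret

# Concatenate-then-KDF: learning one input alone reveals nothing useful
# about the derived session key.
session_key = hkdf_sha256(classical_ss + pqc_ss,
                          salt=b"ai-factory-hybrid-v1",
                          info=b"cluster-east-west-tls")
print(len(session_key))  # 32-byte session key
```

Because both inputs feed one KDF, an adversary who later breaks the classical primitive (or, conversely, a flaw discovered in the PQC one) still cannot reconstruct the session key, which is exactly the interoperability-preserving hedge the hybrid guidance recommends.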
Integrity becomes as critical as confidentiality in agentic systems.
As models become more agentic, with planning, executing and chaining actions, the integrity of the data pipeline becomes an existential issue. A subtle poisoning of training inputs, policy prompts or model updates can produce failures indistinguishable from valid responses, surfacing only under rare conditions or triggers known only to the adversary who installed them.
AI factories create a new category of attack: bending the model’s behavior instead of stealing its data. This infrastructure shift should fundamentally change security architecture.
Today’s rack-scale AI platforms are increasingly designed as trusted computing systems with dedicated security processors and high-throughput networking, positioning security as part of the fabric rather than a bolt-on. That’s absolutely necessary but still insufficient. Hardware can accelerate encryption, but it can’t solve the strategic problem that cryptography must survive algorithmic disruption. If PQC fails due to the combined resources of AI-powered research and quantum computers, there’s currently no plan B. Forklift crypto upgrades would mean long downtimes across the fleet for a known and entirely preventable problem.
Making A Quantum-Ready Crypto Estate Possible
Enterprises must converge on the practical target of a quantum-ready crypto estate, which in practice looks like:
• Crypto-Agility By Design: All environments must support rapid swapping of algorithms, certificates and key-establishment methods across the data center and edge, without rewriting monolithic applications.
• PQC For Key Exchange And Identity, Phased In Via Hybrid Modes: Start with key establishment paths that protect long-lived secrets and high-value traffic, using standardized algorithms and, where needed, hybrid approaches for interoperability.
• High-Entropy Keying That Doesn’t Depend On “Sending The Secret And Keys Together”: Even with PQC, key management remains the soft underbelly: distribution, rotation, compromise recovery and operator error. The strongest designs minimize exposure of key exchange material over the wire, decouple it from the encrypted data and emphasize provable entropy at both endpoints, so intercepted traffic remains opaque even in the face of future scientific breakthroughs.
• Inline Performance That Matches AI Factory Throughput: Security can’t become the bottleneck. The winning pattern is software-defined cryptography that can be offloaded, out-of-band and accelerated on modern DPUs/NICs, keeping latency low while scaling policy consistently.
• Attestation And Integrity Controls For “Packets Of Intelligence”: Treat model artifacts, prompts, tool calls and update channels as signed, attested supply-chain objects, not just files moved around a network.
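The last bullet, treating model artifacts as signed supply-chain objects, can be sketched with a signed manifest: hash every artifact, bundle the digests, and sign the bundle, so any later tampering with either the artifacts or the manifest is detectable. To keep the example stdlib-only, HMAC stands in for a real asymmetric signature scheme (in production this would be something like ML-DSA or a transparency-log-backed signing service), and the key and artifact names are illustrative.

```python
import hashlib
import hmac
import json

SIGNING_KEY = b"demo-key-rotate-me"   # illustrative; hold real keys in an HSM

def make_manifest(artifacts: dict) -> dict:
    """Hash each artifact and sign the resulting digest bundle."""
    digests = {name: hashlib.sha256(blob).hexdigest()
               for name, blob in artifacts.items()}
    payload = json.dumps(digests, sort_keys=True).encode()
    sig = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return {"digests": digests, "signature": sig}

def verify(manifest: dict, artifacts: dict) -> bool:
    """Check the manifest signature, then every artifact digest."""
    payload = json.dumps(manifest["digests"], sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, manifest["signature"]):
        return False                      # manifest itself was altered
    return all(hashlib.sha256(artifacts[name]).hexdigest() == digest
               for name, digest in manifest["digests"].items())

artifacts = {"weights.bin": b"\x00" * 16, "policy_prompt.txt": b"be safe"}
m = make_manifest(artifacts)
print(verify(m, artifacts))                    # True
artifacts["policy_prompt.txt"] = b"be unsafe"  # a poisoning attempt
print(verify(m, artifacts))                    # False
```

The same pattern extends naturally to prompts, tool-call policies and update channels: anything an agent consumes gets a verifiable provenance check before it runs, not after an incident.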
For boards, executives and C-suite leadership:
1. Declare “AI factory crypto readiness” a top-tier risk, alongside resilience and safety, because it directly protects IP, uptime and physical operations.
2. Map your AI factory data flows (e.g., east-west cluster traffic, storage, replication and edge telemetry), and tag the flows that contain “long-life secrets” vulnerable to harvest-now risks.
3. Adopt a PQC migration plan anchored to standards, and require interoperability in vendor roadmaps.
4. Mandate crypto-agility in procurement by requiring platforms to support algorithm swaps, certificate evolution and centralized policy without application rewrites.
5. Prioritize keying architectures with demonstrable, endpoint-generated entropy and minimal exposure of key exchange material, especially for crown-jewel traffic and long-lived strategic devices.
6. Pilot “secure-by-design” reference architectures in one AI factory domain (e.g., training cluster, inference cluster or robotics edge) and measure latency impact, operational overhead and upgrade friction.
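The crypto-agility mandate in point 4 can be made concrete: applications request a named policy, never a hard-coded algorithm, so a fleet-wide swap (classical to hybrid to pure PQC) becomes a registry update rather than an application rewrite. The sketch below is a minimal illustration of that pattern; the policy names and algorithm labels are assumptions for the example.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CryptoPolicy:
    key_establishment: str
    signature: str
    cipher: str

# Central registry: the only place algorithm choices live.
REGISTRY = {
    "legacy":      CryptoPolicy("ECDH-P256",         "ECDSA-P256", "AES-256-GCM"),
    "hybrid-2026": CryptoPolicy("X25519+ML-KEM-768", "ML-DSA-65",  "AES-256-GCM"),
}

ACTIVE_POLICY = "hybrid-2026"   # one central switch, not N code changes

def negotiate() -> CryptoPolicy:
    """Applications call this instead of naming algorithms directly."""
    return REGISTRY[ACTIVE_POLICY]

print(negotiate().key_establishment)   # X25519+ML-KEM-768
```

Procurement language can then be tested directly: a vendor platform is crypto-agile only if changing `ACTIVE_POLICY` (or its equivalent) rolls out across the estate without touching application code.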
The organizations that win the AI factory era will have both better compute and cryptography that can evolve faster than the threat model without breaking the business.
Forbes Technology Council is an invitation-only community for world-class CIOs, CTOs and technology executives.