Because among all the other things, this too needs to move forward.
Cryptographic entropy, i.e. a ready source of large unpredictable numbers, is an absolute necessity for achieving communication security over insecure channels, such as computer networks of a scale larger than your bedroom closet.
Thus, in order to continue building an honest business on selling secure computing devices of any sort, we need trustworthy, i.e. verifiable, entropy source and collection components (TRNGs, "True Random Number Generators"), and not just in the head or in the hand but in active and reliable production. Such a font does not exist in the present world and therefore we need to build it. Moreover, if done carefully, developing this capability can pave the way to improving our competitive advantage in other parts of the stack too, through taking greater ownership as the mainstream options continue to deteriorate.
As a starting point for the entropy source, we have the known and relatively simple - if not well publicly explained or scrutinized - analog circuit design published by No Such lAbs (S.NSA), based on Johnson noise: the background thermal noise present in all resistive components. Compared to the perhaps more titillating idea of using Geiger tube timings to collect "true quantum" randomness, it uses commonplace parts without requiring procurement and long-term maintenance of radioactive material. Compared to the avalanche breakdown effect in a semiconductor junction, it doesn't require a fancy power supply to step up the voltage with due precaution against introducing signal into the noise.
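To give a feel for the magnitudes involved: the RMS Johnson noise voltage across a resistor follows V_rms = sqrt(4·k_B·T·R·B). The sketch below computes it for some illustrative values (the resistance and bandwidth here are my own round numbers, not taken from the S.NSA schematic):

```python
from math import sqrt

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact by 2019 SI definition)

def johnson_noise_vrms(resistance_ohm, bandwidth_hz, temperature_k=300.0):
    """RMS thermal (Johnson-Nyquist) noise voltage across a resistor:
    V_rms = sqrt(4 * k_B * T * R * B)."""
    return sqrt(4 * K_B * temperature_k * resistance_ohm * bandwidth_hz)

# Illustrative example: a 1 MOhm resistor over a 100 kHz bandwidth at
# room temperature gives on the order of 40 microvolts RMS.
print(johnson_noise_vrms(1e6, 1e5))  # ~4.07e-05 V
```

Note how small the signal is even for a large resistance: substantial low-noise amplification is needed before the signal can usefully be sampled, which is exactly where the care in the analog design comes in.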
The entropy collector part requires some amount of digital logic, at bare minimum to sample the analog signal emitted by the source and interface it to some input mechanism supported by existing digital computers. Debiasing (pointedly distinct from "whitening", which merely disguises bias behind a deterministic transformation) may also be required, though it could in principle be implemented in software on the computer side.
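The classic debiasing method, and one simple enough to live comfortably on either side of the hardware/software boundary, is von Neumann's: read the raw bits in non-overlapping pairs, emit one output bit per unequal pair, and discard equal pairs. A minimal sketch:

```python
def von_neumann_debias(bits):
    """Von Neumann debiaser: consume input bits in non-overlapping
    pairs; a (0,1) pair emits 0, a (1,0) pair emits 1, and equal
    pairs are discarded. Output is unbiased provided the input bits
    are independent with a fixed (possibly unknown) bias."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)
    return out

print(von_neumann_debias([0, 1, 1, 0, 1, 1, 0, 0]))  # [0, 1]
```

The price is throughput: at best one output bit per four input bits, worse as the source bias grows; and the independence assumption is doing real work here, since correlated input bits defeat the guarantee.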
There are many options for implementing digital logic nowadays, offering different trade-offs between quantitative factors including costs of initial design, costs of producing in volume, costs of making changes, size, speed, and power consumption of the product; and qualitative ones such as complexity of components, complexity of assembly, supply chain dependence, and verifiability. To oversimplify a bit by reducing it all to one dimension, the spectrum goes something like this:
- Discrete transistor or relay switching (practically never used anymore except for high voltage or high power applications)
- Discrete small-scale integrated circuits (ICs) from a catalog such as the famous TI 7400 series, working at the abstraction level of logic gates and flip-flops ("lego blocks")
- Small-scale integrated Programmable Logic Devices (PLD)
- Larger scale integrated Field-Programmable Gate Arrays (FPGA, CPLD)
- Standard microprocessors (implementing the desired logic in software)
- Standard microcontrollers (processors integrated with supporting peripherals such as ROM, RAM and I/O ports)
- Semi-custom ICs (ASICs, designed at the gate or block level)
- Full custom ICs (designed at the transistor level)
Once a designer selects the general technology and then the specific set of chips, he can proceed to use their physical dimensions, pin positions, and interconnections as inputs to the process of laying out a Printed Circuit Board (PCB) and routing its traces (wires).(i)
The S.NSA entropy collector design employed a Xilinx CPLD (in that company's naming, essentially a small FPGA with built-in configuration storage, saving the need for a separate ROM chip). It thereby demanded a proprietary and downright disgusting multi-gigabyte software stack in order to verify or modify the logic design. We have very little appetite for pouring our energies into supporting the continued relevance of this stack, porcine as it is in hygiene as in weight, so in principle this puts us back at the top of the decision tree.
The CPLD/FPGA path still looks like the best move to me, but drawing out the reasoning is rapidly expanding into a whole article in its own right, so I will leave it for another day.
(i) In the good old days, you could prototype and tinker much more easily by plugging chips, wires and test probes into a standard "breadboard". But in the relentless drive for smaller, lighter, cheaper, higher performing and lower power consuming devices, the finger-friendly "through-hole" component formats have all but disappeared in favor of Surface-Mount Technology (SMT). Even if you could get special components for the purpose, electrical differences that become relevant at higher signal frequencies would threaten havoc when switching a design from prototype to production. Thus, I figure it's important that your printing and assembly shops (for whatever production steps can't be done in-house) can support fast turnaround and acceptable unit cost even for low-volume runs, because especially as a beginner you're likely to need several iterations of testing and revision.