Exact probability distributions for noise in the TFHE External Product operation, computed using the Alea.jl probabilistic programming system (BDD-based exact discrete inference).
The standard noise analysis (in ../Parameter-Selection/) tracks only
variances under a Gaussian approximation. This tool computes the
full probability mass function of each noise term, providing exact
tail probabilities that are critical for setting security parameters.
```
ExactNoiseEstimation/
  README.md
  extprod_alea.jl       # exact External Product noise via Alea.jl
  cmux_alea.jl          # exact CMUX noise via Alea.jl
  cmux_pmbx_alea.jl     # exact CMUXPMBX noise via Alea.jl
  blind_rotate_alea.jl  # Blind Rotate as repeated CMUXPMBX loops
  alea/                 # Alea.jl (git submodule)
```
- Julia 1.8.5+
- Python 3 (for Alea's SymPy dependency)
Install SymPy (required by Alea):
```
pip3 install sympy
```

Set up the Alea.jl submodule and instantiate the Julia environment:

```
git submodule update --init --recursive
julia --project=alea -e "import Pkg; Pkg.instantiate()"
```

Run the scripts:

```
julia --project=alea extprod_alea.jl
julia --project=alea cmux_alea.jl
julia --project=alea cmux_pmbx_alea.jl
julia --project=alea blind_rotate_alea.jl
```

To profile smaller loop counts (recommended before large runs):
```
julia --project=alea blind_rotate_alea.jl --loops=10
julia --project=alea blind_rotate_alea.jl --loops=100
```

`blind_rotate_alea.jl` accepts:

- `--loops=<n>` (or `BR_LOOPS=<n>`)
- `--loop-model=stateful|iid` (or `BR_LOOP_MODEL=stateful|iid`)
`stateful` is the default and composes loops as `E_{t+1} = E_t (+) ext_base` (one carried noise term plus fresh ext-base noise per loop). `iid` keeps the legacy approximation that convolves the single-loop CMUXPMBX PMF with itself n times.
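For illustration, a minimal Julia sketch of the two composition modes, assuming single-loop PMFs are represented as `Dict{Int,Float64}`; the helper names (`pmf_conv`, `compose_stateful`, `compose_iid`) are hypothetical and not the script's API:

```julia
# Exact discrete convolution of two PMFs (hypothetical helper, not the script's API).
function pmf_conv(pa::Dict{Int,Float64}, pb::Dict{Int,Float64})
    r = Dict{Int,Float64}()
    for (x, px) in pa, (y, py) in pb
        r[x + y] = get(r, x + y, 0.0) + px * py
    end
    return r
end

# stateful (default): E_{t+1} = E_t (+) ext_base, one fresh ext-base term per loop.
function compose_stateful(e0, ext_base, loops)
    e = e0
    for _ in 1:loops
        e = pmf_conv(e, ext_base)
    end
    return e
end

# iid (legacy): convolve the single-loop CMUXPMBX PMF with itself n times.
compose_iid(cmux_pmbx, loops) = reduce(pmf_conv, fill(cmux_pmbx, loops))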
An Apptainer definition is provided at:
apptainer/ExactNoiseEstimation.def
Build image:
```
apptainer build ExactNoiseEstimation.sif apptainer/ExactNoiseEstimation.def
```

Run from this repository (bind the current directory):

```
apptainer exec --bind $PWD:/work ExactNoiseEstimation.sif \
  bash -lc 'cd /work && julia --project=alea blind_rotate_alea.jl --loops=100'
```

Convolution backend options:
- FFT (default, fast but approximate): `julia --project=alea extprod_alea.jl --conv=fft`
- Exact (slow, O(n^2) per convolution): `julia --project=alea extprod_alea.jl --conv=exact`
You can also set `EXTPROD_CONV=fft` or `EXTPROD_CONV=exact` in the environment.
From Julia, you can use `Alea.set_conv_mode!(:fft)` or `Alea.set_conv_mode!(:exact)` and query the current setting via `Alea.conv_mode()`.
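For example, from a Julia session in the `alea` project environment (a usage sketch built on the accessors named above):

```julia
import Alea

Alea.set_conv_mode!(:exact)   # switch to exact convolutions
Alea.conv_mode()              # query the current backend; returns :exact
Alea.set_conv_mode!(:fft)     # restore the default FFT backend
```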
TRGSW plaintext mode options:
- Fixed plaintext `p=1` (default): `--trgsw-pt=fixed`
- Binary plaintext `p ~ Bernoulli(0.5)`: `--trgsw-pt=binary`

You can also set `TRGSW_PT=fixed|binary` in the environment.
The script:
- Uses Alea to compute exact single-term distributions for:
  - a decomposition digit
  - key × rounding noise
  - direct rounding noise
- Computes digit×CBD analytically and convolves multiple terms to build full component distributions.
- Reports exact means, variances, and tail probabilities.
- Compares exact variances to the Gaussian formula from `../Parameter-Selection/python/noiseestimation/keyvariation.py`.
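A minimal sketch of the reported statistics, assuming PMFs are stored as `Dict{Int,Float64}`; the helper names are hypothetical, not the script's API:

```julia
# Exact mean of a PMF.
pmf_mean(p) = sum(x * px for (x, px) in p)

# Exact variance of a PMF.
function pmf_var(p)
    m = pmf_mean(p)
    return sum((x - m)^2 * px for (x, px) in p)
end

# Two-sided tail probability P(|X| >= t).
function pmf_tail(p, t)
    s = 0.0
    for (x, px) in p
        abs(x) >= t && (s += px)
    end
    return s
end
```

With `--conv=exact` these quantities are exact; with the default FFT backend small numerical error can leak into extreme tails.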
We model the External Product TRGSW(p=1) * TRLWE for a single output
coefficient (index j=0) under the following toy parameters:
| Parameter | Value | Description |
|---|---|---|
| N | 512 | Polynomial degree |
| k | 2 | TRLWE dimension |
| l = la | 2 | Decomposition levels |
| Bgbit | 9 | Base bit width (Bg = 512) |
| eta | 4 | CBD parameter |
| qbit | 25 | Torus precision (q = 33554432) |
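For reference, a small sketch of the derived quantities implied by these bit widths (the `gap` line assumes the decomposition keeps the top `l*Bgbit` bits, as discussed below):

```julia
qbit, Bgbit, l = 25, 9, 2
q   = 1 << qbit               # 33554432
Bg  = 1 << Bgbit              # 512
gap = 1 << (qbit - l*Bgbit)   # 128: granularity of the rounded decomposition (assumption)
```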
The output noise decomposes into three independent components:
**Component 1 (a-polynomial rows).** Each term samples a ciphertext coefficient `c ~ Uniform[0, q)`, deterministically decomposes it into digits `d_0, d_1`, and multiplies each digit by an independent `CBD(eta)` sample. The signs from negacyclic convolution do not change this single-term marginal PMF (digit*CBD products are symmetric).
**Component 2 (b-polynomial row).** Same structure as Component 1, but for the b polynomial row of the TRLWE.
**Component 3 (key, rounding, and input noise).**

- Key * rounding: `s[m] * eps_a[m]`, where `s ~ Bernoulli(1/2)` and `eps` is the deterministic rounding error from decomposition. For the j=0 negacyclic convolution there is 1 positive term and N-1 negative terms. Signs matter here because `eps` is not symmetric.
- Direct rounding: `-eps_b[0]` from the b-polynomial decomposition.
- Input noise: `e[0] ~ CBD(eta)` from the fresh ciphertext.
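As an illustration of the Component 1/2 single-term distribution, here is a sketch that builds the digit×CBD PMF directly. It assumes each signed digit is uniform on [-Bg/2, Bg/2) and independent of the CBD(eta) noise (consistent with the power-of-2 decomposition discussed below); `cbd_pmf` and `term_pmf` are illustrative names only:

```julia
Bg, eta = 512, 4

# CBD(eta): difference of two Binomial(eta, 1/2) samples, so P(k) = C(2*eta, eta+k) / 2^(2*eta).
cbd_pmf = Dict(k => binomial(2eta, eta + k) / 2.0^(2eta) for k in -eta:eta)

# Single-term PMF of digit * CBD, with the digit assumed uniform on -Bg/2 : Bg/2 - 1.
term_pmf = Dict{Int,Float64}()
for d in -Bg÷2:(Bg÷2 - 1), (e, pe) in cbd_pmf
    term_pmf[d*e] = get(term_pmf, d*e, 0.0) + pe / Bg
end
```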
Rather than modeling the rounding error as an independent uniform random
variable, we sample c ~ Uniform[0, q) and decompose it
deterministically inside the Alea program using shifts and arithmetic.
The digits and rounding error are derived quantities, not independently
sampled. For a power-of-2 torus, the decomposition is bijective, so
digits and eps happen to be independent anyway — but the deterministic
approach is more principled and generalizes to non-power-of-2 moduli.
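A hypothetical Julia rendering of that deterministic decomposition (the actual model is written inside the Alea program; this is only a sketch of the shift-and-arithmetic idea):

```julia
# Digits and eps are derived from c by shifts and arithmetic; only c is sampled.
qbit, Bgbit, l = 25, 9, 2
q, Bg = 1 << qbit, 1 << Bgbit

function decompose(c::Integer)
    shift = qbit - l * Bgbit                    # bits dropped by the rounding step (7 here)
    chat  = (c + (1 << (shift - 1))) >> shift   # round c to the precision the digits keep
    digits, carry = Int[], 0
    for i in l:-1:1                             # least-significant digit first
        d = ((chat >> ((l - i) * Bgbit)) & (Bg - 1)) + carry
        carry = d >= Bg ÷ 2 ? 1 : 0             # re-center digits into [-Bg/2, Bg/2)
        d -= carry * Bg
        pushfirst!(digits, d)
    end
    # eps: value represented by the digits minus c, as a centered residue mod q
    approx = sum(digits[i] * (1 << (qbit - i * Bgbit)) for i in 1:l)
    eps = mod(approx - c + q ÷ 2, q) - q ÷ 2
    return digits, eps
end
```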
Even with BDD-based exact inference, a direct model of all N terms can
blow up in size. Instead, we compute one term exactly in Alea and
convolve N independent copies in Julia. This keeps the Alea programs
small while still producing exact PMFs.
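One way to realize that N-fold aggregation (a sketch, not the script's exact code) is an exact convolution combined with exponentiation by squaring, so only O(log N) convolutions are needed; PMFs are stored here as plain probability vectors with the support offset tracked separately:

```julia
# Exact linear convolution of two probability vectors (support offsets omitted).
function conv_exact(a::Vector{Float64}, b::Vector{Float64})
    c = zeros(length(a) + length(b) - 1)
    for i in eachindex(a), j in eachindex(b)
        c[i + j - 1] += a[i] * b[j]
    end
    return c
end

# PMF of the sum of n i.i.d. copies, via exponentiation by squaring.
function conv_power(p::Vector{Float64}, n::Int)
    result, base = [1.0], p          # [1.0] is the convolution identity (a point mass)
    while n > 0
        isodd(n) && (result = conv_exact(result, base))
        n >>= 1
        n > 0 && (base = conv_exact(base, base))
    end
    return result
end
```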
For Blind Rotate, `blind_rotate_alea.jl` first computes one-loop CMUXPMBX terms, then composes n loops by convolution. In the default `stateful` mode this uses one carried input-noise term and n fresh external-product base terms. So this repository does not avoid
convolution; it relies on convolution at both component aggregation and
Blind Rotate loop aggregation.
- `../Parameter-Selection/python/noiseestimation/keyvariation.py`: `extpnoisecalc` (Gaussian variance formula)
- `../TFHEpp/include/trgsw.hpp`: Decomposition and External Product implementations