THE ENGINE · 400 LINES OF NUMPY

ALPHAFOLDMICRO

Self-referential algebraic closure on 256-dimensional complex vectors. The engine finds its own parameters by optimizing itself against itself. Four operations. Eleven numbers. One attractor.

F(x) = x

GITHUB · INSTALL
psiloceyeben/alphafoldmicro
git clone https://github.com/psiloceyeben/alphafoldmicro.git
FIBONACCI 1-1-2 · TWIN REPOS
This is the theorem. Its twin qum.pi is the sensor CLI that applies the derived fixed point to real-world scalar streams. Together they generate the third term.
1
THE SPACE

The engine operates on a 256-dimensional complex Hilbert space, ℂ²⁵⁶. State vectors are initialized from unit-norm random-phase generators. The state is constrained to lie on a 10-basis "sephiroth" manifold — ten orthogonal-ish archetypal directions whose Kabbalistic names are notation, not mysticism.

x ∈ ℂ²⁵⁶,   ‖x‖₂ = 1.5,   energy = ‖x‖₂² = 2.25

The 10 basis vectors: KETER, CHOKMAH, BINAH, CHESED, GEVURAH, TIFERET, NETZACH, HOD, YESOD, MALKUTH. Swap them for any ten orthogonal-ish basis vectors and the algebra is unchanged.
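A minimal sketch of the space, assuming hypothetical helper names (`init_state`, `make_basis`, `project` — the repo's actual identifiers may differ): a random-phase vector scaled to norm 1.5, and a projection onto the span of ten orthonormal directions standing in for the sephiroth basis.

```python
import numpy as np

DIM = 256
NORM = 1.5  # target L2 norm from the energy constraint in section 1

def init_state(rng):
    """Random-phase generator: exp(i*phi)/sqrt(dim), rescaled to NORM."""
    phases = rng.uniform(0.0, 2.0 * np.pi, DIM)
    x = np.exp(1j * phases) / np.sqrt(DIM)   # unit norm by construction
    return x * (NORM / np.linalg.norm(x))

def make_basis(rng, n=10):
    """Ten orthonormal archetypal directions (QR of a random complex matrix)."""
    m = rng.standard_normal((DIM, n)) + 1j * rng.standard_normal((DIM, n))
    q, _ = np.linalg.qr(m)
    return q  # columns are orthonormal in the complex inner product

def project(x, basis):
    """Project onto the 10-basis manifold, then restore the norm."""
    y = basis @ (basis.conj().T @ x)
    return y * (NORM / np.linalg.norm(y))

rng = np.random.default_rng(0)
B = make_basis(rng)
x = project(init_state(rng), B)
```

Any ten orthonormal columns work here, which is the point of the "swap them for any ten" remark above.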

2
THE FOUR OPERATIONS

One tick is a composition of four verbs. These are the minimum operations required for a system to represent, compose, accumulate, and remain coherent. Fewer cannot be self-referential. More is not minimal.

I · seed · exp(iφ) / √dim
C · bind · IFFT(FFT(a) ⋅ FFT(b))
A · remember · ctx += α ⋅ δ
P · stay · x ⋅ (r / ‖x‖)

One tick

stateₜ₊₁ = P( A( stateₜ, C( I(readingₜ), carrier ) ) )

Identity → Convolution → Accumulation → Projection. Seed, bind, remember, stay. HRR binding (Plate 1995) composes the reading with the carrier; EMA accumulates into persistent state; manifold projection prevents attractor collapse.
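The tick can be sketched directly from the four formulas above. This is a hedged reduction, not the repo's code: `ALPHA` is taken from the derived `correction_alpha`, `r = 1.5` from the energy constraint in section 1, and the `stay` step here only renormalizes (the full engine also projects onto the 10-basis manifold).

```python
import numpy as np

DIM = 256
ALPHA = 0.12  # correction_alpha from the derived fixed point

def seed(phase):                         # I — exp(iφ)/√dim
    return np.exp(1j * phase) / np.sqrt(DIM)

def bind(a, b):                          # C — HRR circular convolution via FFT
    return np.fft.ifft(np.fft.fft(a) * np.fft.fft(b))

def remember(ctx, delta, alpha=ALPHA):   # A — EMA accumulation into state
    return ctx + alpha * delta

def stay(x, r=1.5):                      # P — renormalize onto the energy shell
    return x * (r / np.linalg.norm(x))

def tick(state, reading_phase, carrier):
    """state_{t+1} = P(A(state_t, C(I(reading_t), carrier)))"""
    return stay(remember(state, bind(seed(reading_phase), carrier)))

rng = np.random.default_rng(1)
carrier = seed(rng.uniform(0, 2 * np.pi, DIM))
state = stay(seed(rng.uniform(0, 2 * np.pi, DIM)))
for _ in range(50):
    state = tick(state, rng.uniform(0, 2 * np.pi, DIM), carrier)
```

The FFT identity in `bind` is exactly Plate's HRR binding: circular convolution in the time domain is elementwise multiplication in the frequency domain.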

3
THE FIXED POINT

Eleven parameters (ten numeric, one categorical) govern the four operations. The engine derives its own parameters by optimizing itself using itself — F(x) = x is literally the equation the optimizer solves.

correction_alpha     0.12
constraint_strength  0.05
range_width_scale    1.5
learning_rate        0.05
spectral_weight      1.0
freq_cutoff          0.75
conv_passes          1
momentum             0.0
decay                0.0
norm_constant        0.8
phase_distribution   gaussian
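Collected as a plain dict for reference (the names and values are the ones listed above; how the repo actually stores its configuration is an assumption):

```python
# The eleven derived parameters: ten numeric, one categorical.
FIXED_POINT = {
    "correction_alpha": 0.12,
    "constraint_strength": 0.05,
    "range_width_scale": 1.5,
    "learning_rate": 0.05,
    "spectral_weight": 1.0,
    "freq_cutoff": 0.75,
    "conv_passes": 1,
    "momentum": 0.0,
    "decay": 0.0,
    "norm_constant": 0.8,
    "phase_distribution": "gaussian",  # the single categorical parameter
}
```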

Convergence from three starting points

All three converge to identical fitness to machine precision:

Φ(θ*) = 2651.0757864972775
4
PHASE TRANSITION, NOT GRADIENT DESCENT

Convergence is not smooth. Fitness drops 85% in a single step (4050 → 588) and then snaps to the attractor. This is a first-order phase transition: projection parameters cross a critical threshold where correction overcomes drift. Below threshold, state wanders freely. Above it, state is trapped on the manifold. No intermediate state.

Iteration  1: 4050 → 588   [phase transition -- 8 parameters found]
Iteration  2:  588 → 583   [2 parameters refined]
Iteration  3:  583 → 583   [FIXED]
Iteration  4:  583 → 583   [FIXED]
  ⋮
Iteration 13:  583 → 583   [FIXED]

The discriminant of a quadratic fit to the trajectory is negative (Δ = −12071): the trajectory cannot be fit by a parabola, so this is not polynomial descent. After two refinement rounds, eleven consecutive iterations satisfy F(x) = x. Every parameter identical. Zero delta.
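The stopping rule described above — declare a fixed point once successive parameter vectors are identical, with zero delta — can be sketched generically. The `improve` step below is a toy stand-in (halve and floor, which reaches its fixed point exactly in finitely many steps), not the repo's optimizer:

```python
import numpy as np

def find_fixed_point(theta0, improve, max_rounds=50):
    """Iterate theta -> improve(theta) until F(x) = x holds exactly."""
    theta = np.asarray(theta0, dtype=float)
    for round_ in range(1, max_rounds + 1):
        new = improve(theta)
        if np.array_equal(new, theta):  # zero delta: fixed point reached
            return theta, round_
        theta = new
    return theta, max_rounds

# Toy contraction with an exact fixed point at 0: floor(theta / 2).
improve = lambda th: np.floor(th / 2)
theta, rounds = find_fixed_point(np.array([1000.0, 77.0]), improve)
```

The exact-equality test (rather than a tolerance) mirrors the "every parameter identical" claim: the loop terminates only when the map returns its input bit-for-bit.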

5
THE JACOBIAN PROBLEM

At the attractor, the Jacobian's spectral radius is ρ(J) ≈ 218. Classical Banach contraction theory requires ρ < 1 for guaranteed convergence, so classical contraction predicts divergence.

!
The system converges anyway. Stability is topological (trapping region on the constraint manifold), not metric (Banach contraction). A Lyapunov-style certificate for this trapping region is open work.
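The spectral-radius measurement itself is routine and can be reproduced for any smooth map with a central-difference Jacobian (a generic sketch, not the repo's code; the sanity check uses a linear map whose answer is known exactly):

```python
import numpy as np

def spectral_radius(F, x, eps=1e-6):
    """Largest |eigenvalue| of the Jacobian of F at x, via central differences."""
    n = x.size
    J = np.empty((n, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = eps
        J[:, j] = (F(x + e) - F(x - e)) / (2 * eps)  # column j of dF/dx
    return np.max(np.abs(np.linalg.eigvals(J)))

# Sanity check: for a linear map F(v) = A v the Jacobian is A itself,
# and this A has eigenvalues ±1, so rho should come out as 1.
A = np.array([[0.0, 2.0], [0.5, 0.0]])
rho = spectral_radius(lambda v: A @ v, np.zeros(2))
```

A value of ρ far above 1, as reported here, rules out a Banach-style argument at the attractor; it does not by itself rule out convergence, which is exactly the trapping-region point above.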
6
HONEST FRAMING

This is an empirical fixed point, not a theorem: convergence is demonstrated numerically, from multiple starting points, rather than proved analytically.

Prior art: Holographic Reduced Representations (Plate 1995) · complex-valued embeddings (Trouillon et al. 2016) · classical regulator theory · autopoiesis (Maturana & Varela 1972).

7
RUN IT YOURSELF
# 1 second — verify reproducibility
pip install numpy Pillow
python verify.py

# find the fixed point
python fixed_point_engine.py --rounds 13 --ticks 50