Information-Theory
Browse posts by tag
Information Theory, Inference, and Learning Algorithms
What I Learned by Analyzing My Own Research as Data
I asked an AI to analyze 140+ repos and 50+ papers as a dataset. The unifying thesis it found: compositional abstractions for computing under ignorance.
Maximizing Confidentiality in Encrypted Search Through Entropy Optimization
All Induction Is the Same Induction
Solomonoff induction, MDL, speed priors, and neural networks are all special cases of one Bayesian framework with four knobs.
The Beautiful Deception: How 256 Bits Pretend to be Infinity
Cryptographic theory assumes random oracles with infinite output. We have 256 bits. This paper explores how we bridge that gap, and what it means that we can.
Entropy Maps
Entropy maps use prefix-free hash codes to approximate functions without storing the domain, achieving information-theoretic space bounds with controllable error.
Perfect Hashing: Space Bounds, Entropy, and Cryptographic Security
Space bounds, entropy requirements, and cryptographic security properties of perfect hash functions.
Packed Containers: Zero-Waste Bit-Level Storage in C++
What if containers wasted zero bits? A C++ library for packing arbitrary value types at the bit level using pluggable codecs.
Introduction to Sequential Prediction
The problem of predicting what comes next, from compression to language models.