The next web, explained in plain English
Label smoothing: Regularization that replaces hard one‑hot labels with softened targets to reduce overconfidence.
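A minimal sketch of one common smoothing convention (the true class gets 1 − ε and the remaining classes share ε uniformly); ε = 0.1 is a typical default, not a prescribed value:

```python
import numpy as np

def smooth_labels(num_classes, target, eps=0.1):
    # Soften a one-hot target: 1 - eps on the true class, eps spread over the rest.
    t = np.full(num_classes, eps / (num_classes - 1))
    t[target] = 1.0 - eps
    return t

print(smooth_labels(5, 2))  # [0.025 0.025 0.9 0.025 0.025]
```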
Lagrange interpolation: Reconstruction of a polynomial from sample points; used in Shamir secret sharing and zk proofs.
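A small sketch of the idea: given enough points, the unique low-degree polynomial through them can be evaluated anywhere, which is how Shamir shares recover a secret at x = 0. Real secret sharing works over a finite field; exact rationals stand in for that here:

```python
from fractions import Fraction

def lagrange_eval(points, x):
    # Evaluate the unique degree < n polynomial through `points` at `x`.
    total = Fraction(0)
    for i, (xi, yi) in enumerate(points):
        term = Fraction(yi)
        for j, (xj, _) in enumerate(points):
            if i != j:
                term *= Fraction(x - xj, xi - xj)
        total += term
    return total

# Three shares of f(x) = 3x^2 + 2x + 7; the secret is f(0) = 7.
print(lagrange_eval([(1, 12), (2, 23), (3, 40)], 0))  # 7
```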
Lamport signature: One‑time signature scheme built from hash functions; basis for post‑quantum XMSS and SPHINCS.
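A toy sketch of the scheme with SHA-256: the key is two random preimages per message bit, and signing reveals one preimage per bit. Each key must sign only once, since a second signature leaks further preimages:

```python
import hashlib, secrets

def H(b): return hashlib.sha256(b).digest()

def keygen(bits=256):
    # Secret key: two random 32-byte preimages per message bit; public key: their hashes.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(bits)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def msg_bits(msg):
    d = H(msg)
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(msg, sk):
    return [sk[i][bit] for i, bit in enumerate(msg_bits(msg))]  # reveal one preimage per bit

def verify(msg, sig, pk):
    return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(msg_bits(msg)))

sk, pk = keygen()
print(verify(b"hello", sign(b"hello", sk), pk))  # True
```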
Latency: Time between a request and its response; critical for UX in wallets, L2s, and inference.
Latent space: Compressed representation in which generative models operate, encoding semantic structure.
Lattice‑based cryptography: Post‑quantum cryptography based on lattice problems such as LWE and SIS.
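A toy illustration of an LWE instance (parameters far too small for real security): the public pair (A, b) hides the secret s behind small noise e, and recovering s from (A, b) is the assumed-hard problem:

```python
import numpy as np

rng = np.random.default_rng(0)
q, n, m = 3329, 8, 16              # toy parameters, nowhere near real security levels
A = rng.integers(0, q, (m, n))     # public random matrix
s = rng.integers(0, q, n)          # secret vector
e = rng.integers(-2, 3, m)         # small noise
b = (A @ s + e) % q                # LWE samples: finding s from (A, b) is assumed hard
```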
Layer 1 (L1): Base blockchain that provides security, consensus, and data availability directly on its main chain; examples include Bitcoin, Ethereum, and Acki Nacki.
Layer 2 (L2): Scaling network built on top of a Layer 1 that executes transactions off‑chain or in rollups, batching or compressing them and posting proofs back to the L1 while inheriting its security.
Layer 3 (L3): App‑specific or ecosystem rollups built atop L2s for specialized performance or privacy.
Layer normalization: Normalization applied across the feature dimension for each token; stabilizes training in transformers.
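A minimal NumPy sketch: each token's features are standardized independently, then rescaled by a learned gain and bias (gamma, beta):

```python
import numpy as np

def layer_norm(x, gamma, beta, eps=1e-5):
    # Normalize over the last (feature) axis, independently per token.
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

x = np.random.randn(4, 8)                    # 4 tokens, 8 features
y = layer_norm(x, np.ones(8), np.zeros(8))
print(y.mean(axis=-1))                       # ~0 per token
```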
Lazy minting: Minting deferred until purchase; the creator signs a voucher off‑chain and the buyer pays gas at claim time.
Learning rate: Scalar step size for gradient updates; too high and training diverges, too low and it crawls.
Learning rate schedule: Policy that varies the learning rate over time, e.g., cosine decay with warmup.
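A sketch of the cosine-with-warmup schedule mentioned above; the warmup length and peak value are hyperparameters to tune, not fixed constants:

```python
import math

def cosine_with_warmup(step, warmup_steps, total_steps, peak_lr, min_lr=0.0):
    # Linear warmup to peak_lr, then cosine decay down to min_lr.
    if step < warmup_steps:
        return peak_lr * step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return min_lr + 0.5 * (peak_lr - min_lr) * (1.0 + math.cos(math.pi * progress))

print(cosine_with_warmup(500, 1_000, 10_000, 3e-4))     # mid-warmup: 1.5e-4
print(cosine_with_warmup(10_000, 1_000, 10_000, 3e-4))  # end of decay: ~0
```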
Lending protocol: Smart‑contract system for collateralized borrowing and interest‑bearing deposits.
Light client: Client that verifies block headers and proofs without storing full chain state; ideal for wallets and bridges.
Lightning Network: Layer 2 for Bitcoin enabling fast payments via payment channels and HTLCs.
Limit order: Order to buy or sell at a specified price or better; implemented on DEXes via RFQs or AMM overlays.
Linear probe: Test that fits a linear classifier on frozen embeddings to measure the quality of learned representations.
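A minimal probe sketch with scikit-learn; the random matrix below stands in for embeddings that would really come from a frozen pretrained model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
emb = rng.standard_normal((500, 64))        # stand-in for frozen model embeddings
labels = (emb[:, 0] > 0).astype(int)        # synthetic labels, for illustration only

probe = LogisticRegression(max_iter=1000)   # only this linear head gets trained
probe.fit(emb[:400], labels[:400])
print("probe accuracy:", probe.score(emb[400:], labels[400:]))
```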
Liquidation: Forced sale of collateral when a loan’s health factor falls below a threshold.
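A worked example using one common health-factor convention (collateral value × liquidation threshold ÷ debt, as in Aave-style protocols); the numbers are illustrative:

```python
collateral_usd = 10_000   # value of deposited collateral
liq_threshold = 0.80      # per-asset protocol parameter
debt_usd = 7_000          # value of borrowed assets

health = collateral_usd * liq_threshold / debt_usd
print(health)  # ~1.14: safe while above 1.0

# If collateral falls to $8,500: 8_500 * 0.80 / 7_000 ≈ 0.97 < 1.0,
# so the position becomes eligible for liquidation.
```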
Liquidity bootstrapping pool (LBP): Pool that gradually shifts token weights to discover price while limiting whale dominance.
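A sketch of the Balancer-style weighted spot-price formula that drives an LBP: as weights shift from token-heavy toward balanced, the quoted price falls unless buyers step in. The balances and weights below are made up:

```python
def spot_price(token_bal, token_w, usdc_bal, usdc_w):
    # Weighted-pool spot price of TOKEN in USDC: (B_usdc / W_usdc) / (B_token / W_token).
    return (usdc_bal / usdc_w) / (token_bal / token_w)

print(spot_price(1_000_000, 0.96, 500_000, 0.04))  # 12.0: high starting price
print(spot_price(1_000_000, 0.50, 500_000, 0.50))  # 0.5: price if weights end 50/50 with no buying
```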
Liquidity mining: Distribution of tokens to LPs to incentivize providing liquidity.
Liquidity pool: Smart‑contract pool of tokens enabling automated market making and swaps.
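A constant-product (x · y = k) swap sketch in the Uniswap-v2 style with a 0.3% fee, to show how output is quoted from reserves:

```python
def swap_out(amount_in, reserve_in, reserve_out, fee=0.003):
    # Solve (x + dx_after_fee) * (y - dy) = x * y for dy.
    dx = amount_in * (1 - fee)
    return reserve_out * dx / (reserve_in + dx)

print(swap_out(1_000, 100_000, 200_000))  # ≈ 1974.3 tokens out
```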
Liveness: Property that a system continues to make progress, e.g., blocks keep finalizing.
Large language model (LLM): Transformer‑based model trained on large corpora to predict tokens; used for chat, code, and agents.
Logs bloom: Bloom filter in block headers summarizing emitted events and addresses for fast log queries.
Logit bias: Adjustment added to token logits at inference time to steer generation or block outputs.
Logprobs: Per‑token log‑probabilities emitted by models; useful for calibration and safety filters.
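A toy sketch tying the two entries above together: adding a large negative bias to a token's logit drives its probability to ~0, and the log-softmax of the (biased) logits is what APIs expose as logprobs:

```python
import numpy as np

logits = np.array([2.0, 1.0, 0.5, -1.0])   # raw scores over a toy 4-token vocabulary
bias = np.array([0.0, 0.0, -100.0, 0.0])   # large negative bias effectively bans token 2

biased = logits + bias
logprobs = biased - np.log(np.sum(np.exp(biased)))  # log-softmax
print(np.exp(logprobs))                    # token 2's probability is ~0
```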
Long/short: Directional positions in perpetuals or spot; a long benefits from price increases, a short from decreases.
LoRA (low‑rank adaptation): Parameter‑efficient fine‑tuning that injects trainable low‑rank matrices into pretrained weights.
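A minimal sketch of the LoRA update: the frozen weight W is augmented by a trainable low-rank product scaled by alpha/r, with the up-projection zero-initialized so training starts from the pretrained behavior:

```python
import numpy as np

d, r, alpha = 768, 8, 16
rng = np.random.default_rng(0)
W = rng.standard_normal((d, d))          # frozen pretrained weight
A = rng.standard_normal((d, r)) * 0.01   # trainable down-projection
B = np.zeros((r, d))                     # trainable up-projection, zero-init

def lora_forward(x):
    # Effective weight is W + (alpha / r) * (A @ B); only A and B receive gradients.
    return x @ W + (alpha / r) * (x @ A @ B)

print(np.allclose(lora_forward(np.ones(d)), np.ones(d) @ W))  # True at init
```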
Loss function: Objective minimized during training, e.g., cross‑entropy for classification or language modeling.
LP token: Token representing a share of a liquidity pool; accrues fees and can be staked.
LSM tree (log‑structured merge tree): Write‑optimized storage structure used by RocksDB and LevelDB; common in node databases.
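A toy sketch of the core mechanic: writes land in an in-memory memtable that is flushed to immutable sorted runs, and reads check the newest data first. Real engines add write-ahead logs, bloom filters, and compaction:

```python
class ToyLSM:
    def __init__(self, memtable_limit=4):
        self.memtable, self.sstables, self.limit = {}, [], memtable_limit

    def put(self, key, value):
        self.memtable[key] = value
        if len(self.memtable) >= self.limit:      # flush when full
            self.sstables.append(sorted(self.memtable.items()))
            self.memtable = {}

    def get(self, key):
        if key in self.memtable:                  # freshest data first
            return self.memtable[key]
        for table in reversed(self.sstables):     # then newest flush to oldest
            for k, v in table:
                if k == key:
                    return v
        return None
```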
LSTM (long short‑term memory): Recurrent neural network architecture with gates for learning long‑range dependencies.
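A single LSTM cell step in NumPy, showing how the input, forget, and output gates update the cell state; the stacked weight layout is the only assumption here:

```python
import numpy as np

def sigmoid(z): return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, b):
    # W maps [x; h] to four stacked gate pre-activations (input, forget, cell, output).
    z = W @ np.concatenate([x, h]) + b
    i, f, g, o = np.split(z, 4)
    i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
    c_new = f * c + i * g          # forget some old state, write new candidate
    h_new = o * np.tanh(c_new)     # gated view of the cell state
    return h_new, c_new

nx, nh = 3, 4
rng = np.random.default_rng(0)
W, b = 0.1 * rng.standard_normal((4 * nh, nx + nh)), np.zeros(4 * nh)
h, c = lstm_step(rng.standard_normal(nx), np.zeros(nh), np.zeros(nh), W, b)
```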