Word2Vec
Neural embedding technique that learns word vectors from co-occurrence statistics via the CBOW or Skip-gram objective; foundational for many pre-transformer NLP systems.
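A minimal sketch of how Skip-gram derives its training signal: each word is paired with the neighbors inside a symmetric context window, and those (center, context) pairs drive the embedding updates. This is a hypothetical toy example, not the reference word2vec implementation; the function name and corpus are invented for illustration.

```python
def skipgram_pairs(tokens, window=2):
    """Yield (center, context) pairs within a symmetric window."""
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:  # skip the center word itself
                pairs.append((center, tokens[j]))
    return pairs

corpus = "the cat sat on the mat".split()
pairs = skipgram_pairs(corpus, window=1)
# includes e.g. ('cat', 'the') and ('cat', 'sat')
```

In the full algorithm these pairs feed a shallow network trained with negative sampling or hierarchical softmax; CBOW instead predicts the center word from the averaged context.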
Wasserstein GAN (WGAN)
GAN objective that minimizes the Wasserstein distance using a critic network constrained to be Lipschitz, improving training stability and mode coverage.
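A sketch of the critic side of that objective, under simplifying assumptions: the critic is a toy linear function, and the Lipschitz condition is enforced by weight clipping (the original WGAN recipe; gradient penalty came later). Everything here is illustrative, not a training loop you would use as-is.

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=2)            # toy linear critic f(x) = x @ w

def critic(x, w):
    return x @ w

real = rng.normal(loc=1.0, size=(64, 2))
fake = rng.normal(loc=-1.0, size=(64, 2))

# Critic loss (to minimize): -(E[f(real)] - E[f(fake)]),
# i.e. the critic maximizes the Wasserstein estimate.
loss = -(critic(real, w).mean() - critic(fake, w).mean())

# One ascent step; for the linear critic the gradient of the
# estimate w.r.t. w is just the difference of batch means.
grad = real.mean(axis=0) - fake.mean(axis=0)
w = np.clip(w + 0.01 * grad, -0.05, 0.05)   # crude Lipschitz bound
```

The generator is then trained to make `E[f(fake)]` large, shrinking the estimated distance between the two distributions.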
Wasserstein Distance
Optimal-transport distance between probability distributions; it stabilizes GAN training and measures distributional shift more robustly than JS divergence.
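In one dimension the W1 ("earth mover's") distance has a closed form: for equal-size empirical samples it is the mean absolute difference of the sorted samples. A small sketch under that equal-size assumption:

```python
def wasserstein_1d(a, b):
    """W1 distance between two equal-size 1-D samples."""
    a, b = sorted(a), sorted(b)
    assert len(a) == len(b), "equal sample sizes assumed for simplicity"
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

wasserstein_1d([0, 1, 2], [1, 2, 3])  # 1.0: each point moves by 1
```

For unequal sizes or weighted samples, `scipy.stats.wasserstein_distance` handles the general 1-D case.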
Whitelist
List of approved addresses permitted to mint, claim, or access beta features; modern usage prefers the neutral term allowlist.
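A minimal allowlist gate, with hypothetical addresses; the one real subtlety shown is normalizing case so that mixed-case and lowercase forms of the same address compare equal. Production mints often commit a Merkle root of the allowlist on-chain instead of storing the full set.

```python
# Hypothetical allowlist for a mint endpoint (addresses are invented).
ALLOWLIST = {addr.lower() for addr in [
    "0xAbC1230000000000000000000000000000000001",
    "0xDeF4560000000000000000000000000000000002",
]}

def can_mint(address: str) -> bool:
    """Case-insensitive membership check against the allowlist."""
    return address.lower() in ALLOWLIST

can_mint("0xabc1230000000000000000000000000000000001")  # True
can_mint("0x9999990000000000000000000000000000000009")  # False
```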
Wash Trading
Manipulative practice in which the same party buys and sells an asset to inflate volume or price; a common red flag in thinly traded NFT collections and tokens.
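A toy detection heuristic, purely illustrative and not a standard algorithm: flag a trade when buyer and seller are the same wallet, or when an earlier trade of the same token is exactly reversed, suggesting the asset is ping-ponging between two wallets.

```python
def flag_wash_trades(trades):
    """Flag self-trades and immediate round trips (toy heuristic)."""
    flagged, seen = [], set()
    for buyer, seller, token in trades:
        if buyer == seller or (seller, buyer, token) in seen:
            flagged.append((buyer, seller, token))
        seen.add((buyer, seller, token))
    return flagged

trades = [("A", "B", "nft1"), ("B", "A", "nft1"), ("C", "C", "nft2")]
flag_wash_trades(trades)  # [('B', 'A', 'nft1'), ('C', 'C', 'nft2')]
```

Real detectors also weigh funding links between wallets, timing, and price deviation from comparable sales.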
Whale
Market participant with holdings large enough that its trades can move price or supply liquidity; often tracked by on-chain analytics and alert bots.
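A minimal alert-bot filter of the kind such trackers use, with an invented threshold: surface transfers whose value exceeds a chosen fraction of circulating supply.

```python
def whale_transfers(transfers, supply, threshold=0.001):
    """Keep transfers worth at least `threshold` of total supply."""
    return [t for t in transfers if t["amount"] / supply >= threshold]

transfers = [{"from": "0xaa", "amount": 5_000_000},
             {"from": "0xbb", "amount": 10}]
whale_transfers(transfers, supply=1_000_000_000)
# only the 5M transfer (0.5% of supply) qualifies
```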
Wormhole
Cross-chain messaging system that locks or burns assets on one chain and mints representations on another via guardian validators and relayers.
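The lock-and-mint flow can be sketched as simple two-sided accounting; this is a hypothetical single-asset model, omitting the guardian attestation and relayer steps entirely. The invariant to preserve is that wrapped supply on the destination chain never exceeds collateral locked on the source chain.

```python
class Bridge:
    """Toy lock-and-mint ledger for one bridged asset."""

    def __init__(self):
        self.locked = 0    # source-chain vault balance
        self.wrapped = 0   # destination-chain minted supply

    def lock_and_mint(self, amount):
        self.locked += amount    # lock collateral on the source chain
        self.wrapped += amount   # mint the wrapped representation

    def burn_and_unlock(self, amount):
        assert amount <= self.wrapped, "cannot burn more than minted"
        self.wrapped -= amount   # burn wrapped tokens first
        self.locked -= amount    # then release collateral

b = Bridge()
b.lock_and_mint(100)
b.burn_and_unlock(40)
# invariant holds: b.locked == b.wrapped == 60
```

Breaking that invariant (minting without a verified lock) is exactly the failure mode behind several major bridge exploits.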
Withdrawal Queue
Mechanism that processes validator exits and partial withdrawals over time to protect liveness; it limits daily churn and can delay large batch exits.
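The churn limit's effect on a batch exit can be sketched as a rate-limited queue; the per-day limit here is an invented number for illustration, not the protocol's actual churn parameter.

```python
from collections import deque

def drain_days(exit_queue, churn_per_day):
    """Days needed to drain a queue at a fixed per-day churn limit."""
    q, days = deque(exit_queue), 0
    while q:
        for _ in range(min(churn_per_day, len(q))):
            q.popleft()          # these validators exit today
        days += 1
    return days

drain_days(range(10), churn_per_day=4)  # 3 days: 4 + 4 + 2
```

So a batch exit ten times the churn limit takes ten rounds, which is why large unstaking events resolve gradually rather than at once.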
Withdrawal Credentials
Hash committed in a validator's deposit that specifies where withdrawn funds may be sent; it can point to a BLS key or, after an update, to an execution address.
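A sketch of decoding the 32-byte credentials field as laid out in the Ethereum consensus specs: a `0x00` prefix commits to a hash of a BLS key, while `0x01` embeds an execution address in the last 20 bytes. The helper name and sample bytes are invented for illustration.

```python
def decode_credentials(cred: bytes):
    """Classify a 32-byte withdrawal-credentials field by prefix."""
    assert len(cred) == 32
    if cred[0] == 0x00:
        return ("bls", cred[1:].hex())            # sha256(bls_pubkey)[1:]
    if cred[0] == 0x01:
        return ("execution", "0x" + cred[12:].hex())  # 11 zero bytes pad
    return ("unknown", cred.hex())

# 0x01 prefix + 11 zero bytes + a 20-byte execution address
cred = bytes([0x01]) + bytes(11) + bytes.fromhex("ab" * 20)
decode_credentials(cred)[0]  # 'execution'
```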
Wumbo Channel
Lightning channel with capacity above the protocol's earlier default limit, used by routing nodes and services to support larger payments.
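The earlier protocol default capped channel capacity at 2^24 - 1 satoshis (about 0.168 BTC); "wumbo" channels are simply those negotiated above that cap. A one-function sketch:

```python
LEGACY_MAX_SATS = 2**24 - 1   # 16_777_215 sats, the pre-wumbo cap

def is_wumbo(capacity_sats: int) -> bool:
    """True if the channel exceeds the legacy capacity limit."""
    return capacity_sats > LEGACY_MAX_SATS

is_wumbo(50_000_000)  # True: a 0.5 BTC channel exceeds the old cap
is_wumbo(16_777_215)  # False: exactly at the legacy limit
```

Both peers must signal large-channel support during channel negotiation for a wumbo open to succeed.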
Witness Data
Signature and script data that SegWit moved outside the legacy transaction structure, enabling malleability fixes and block-weight accounting.
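The weight accounting from BIP 141 discounts witness bytes: non-witness ("base") bytes count four weight units each, witness bytes only one, and virtual size is weight divided by four, rounded up. A small sketch of that arithmetic:

```python
import math

def tx_weight(base_size: int, witness_size: int) -> int:
    """BIP 141 weight: 4 units per base byte, 1 per witness byte."""
    total_size = base_size + witness_size
    return base_size * 3 + total_size      # = 4*base + 1*witness

def vsize(base_size: int, witness_size: int) -> int:
    """Virtual size in vbytes: weight / 4, rounded up."""
    return math.ceil(tx_weight(base_size, witness_size) / 4)

tx_weight(200, 100)  # 900 weight units
vsize(200, 100)      # 225 vbytes
```

This discount is why SegWit transactions pay lower fees per signature byte and why the block limit is expressed as 4,000,000 weight units rather than raw bytes.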