GRU (Gated Recurrent Unit)
A recurrent neural network cell with update and reset gates; simpler than the LSTM, with no separate cell state or output gate.
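A minimal sketch of a single GRU step in numpy, assuming the standard gate equations (update gate z, reset gate r, candidate state h̃); the weight names and dimensions here are illustrative, not from the source.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h, params):
    """One GRU step: the gates decide how much old state to keep."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz @ x + Uz @ h + bz)               # update gate
    r = sigmoid(Wr @ x + Ur @ h + br)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h) + bh)   # candidate state
    return (1.0 - z) * h + z * h_tilde              # interpolate old/new

# Hypothetical sizes for illustration: 4-dim input, 3-dim hidden state.
rng = np.random.default_rng(0)
d_in, d_h = 4, 3
params = [rng.standard_normal(s) * 0.1
          for s in [(d_h, d_in), (d_h, d_h), (d_h,)] * 3]
h = np.zeros(d_h)
for x in rng.standard_normal((5, d_in)):  # run a length-5 sequence
    h = gru_cell(x, h, params)
print(h.shape)  # → (3,)
```

Because the new state is a convex combination of the old state and a tanh candidate, the hidden state stays bounded in (-1, 1).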
GPT (Generative Pre-trained Transformer)
Transformer-based language model trained to predict next tokens; adaptable via prompting or fine-tuning.
GAN (Generative Adversarial Network)
Architecture with a generator and a discriminator trained adversarially to synthesize realistic data.
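A toy sketch of the adversarial objective in numpy, assuming the standard minimax losses; the 1-D generator, discriminator, and weights below are hypothetical placeholders, and no actual training is performed.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical 1-D toy models: generator maps noise to samples,
# discriminator scores how "real" a sample looks.
def generator(z, a, b):
    return a * z + b

def discriminator(x, w, c):
    return sigmoid(w * x + c)

rng = np.random.default_rng(0)
real = rng.normal(4.0, 1.0, 256)       # "real" data
z = rng.standard_normal(256)
fake = generator(z, a=1.0, b=0.0)      # untrained generator output

w, c = 1.0, -2.0                        # hypothetical discriminator weights
# Discriminator maximizes log D(real) + log(1 - D(fake));
# generator minimizes log(1 - D(fake)), i.e. tries to fool D.
d_loss = -(np.log(discriminator(real, w, c)).mean()
           + np.log(1.0 - discriminator(fake, w, c)).mean())
g_loss = -np.log(discriminator(fake, w, c)).mean()
```

In practice both networks are deep models and the two losses are minimized alternately by gradient descent; this sketch only shows how the opposing objectives are computed.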
Groth16
Efficient zkSNARK proving system requiring a trusted setup; used in privacy and rollup proofs.
Graph, The (The Graph)
Decentralized indexing protocol for blockchain data, using subgraphs and query marketplaces.
Griefing Attack
Attack in which an adversary incurs a small cost to impose larger costs on others, degrading liveness or user experience.