
Model Index Page: GELU Model (4 Layers, 2048 Neurons per Layer)

Dataset: 80% C4 (Web Text) and 20% Python Code

HookedTransformer Loading Name: gelu-4l
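
A minimal loading sketch, assuming the TransformerLens library is installed (the package name `transformer_lens` and the printed config fields are standard TransformerLens attributes; the model name "gelu-4l" is the loading name given above):

```python
from transformer_lens import HookedTransformer

# Load the 4-layer GELU model by its TransformerLens loading name.
model = HookedTransformer.from_pretrained("gelu-4l")

# Quick sanity checks against the architecture described on this page.
print(model.cfg.n_layers)  # 4 layers
print(model.cfg.d_mlp)     # 2048 neurons per MLP layer

# Run a short forward pass to confirm the model loads and runs.
logits = model("def hello_world():")
print(logits.shape)
```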

Layers:

Layer #0: First neuron | Random neuron | Last neuron
Layer #1: First neuron | Random neuron | Last neuron
Layer #2: First neuron | Random neuron | Last neuron
Layer #3: First neuron | Random neuron | Last neuron