OpenAI has launched 'Parameter Golf,' a public challenge to train the best language model that fits in a 16MB artifact and trains in under 10 minutes on 8xH100s. Scoring is based on compression of the FineWeb validation set, measured in bits per byte, a tokenizer-agnostic metric.
Like golf, lowest wins. The smaller and more capable your model, the better your score.
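The bits-per-byte score can be sketched as follows. This is a minimal illustration, not the official scoring harness: it assumes a model that exposes natural-log probabilities for each token of the evaluation text (the function name and interface are hypothetical).

```python
import math

def bits_per_byte(token_logprobs, text):
    """Compression score: total model surprisal in bits, divided by
    the UTF-8 byte length of the text. Lower is better.

    token_logprobs: natural-log probabilities the model assigns to
    each token of `text` (hypothetical interface; the real harness
    may differ).
    """
    total_nats = -sum(token_logprobs)          # negative log-likelihood in nats
    total_bits = total_nats / math.log(2)      # convert nats -> bits
    n_bytes = len(text.encode("utf-8"))        # byte count, not token count
    return total_bits / n_bytes
```

Dividing by bytes rather than tokens is what makes the metric tokenizer-agnostic: a model cannot improve its score by splitting the text into fewer, larger tokens, since the denominator is fixed by the raw text. As a sanity check, a model that assigns every byte a uniform probability of 1/256 scores exactly 8.0 bits per byte.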
The challenge runs from March 18 to April 30, and OpenAI is sponsoring $1,000,000 in compute credits to help participants train their models. Anyone 18 or older in supported countries can participate.
Standout participants may be invited to interview for job opportunities at OpenAI, where AI researcher roles pay up to $500,000 per year. No resume required. No recruiter screening. Just demonstrate you can build something impressive under extreme constraints.
In June, OpenAI plans to hire a small cohort of early-career researchers, targeting current undergraduate students, recent graduates, Olympiad medalists, and elite competitors. Winning approaches may be featured publicly.
The challenge tests exactly what OpenAI values: deep understanding of model architecture, training efficiency, data curation, and extracting maximum capability from minimal resources. The 16MB constraint is brutally tight, forcing participants to make every parameter count.
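To see how tight 16MB is, a rough parameter budget can be worked out from the bytes per weight, assuming the checkpoint is dominated by raw weights and not otherwise compressed (an assumption; entries below use MiB for the artifact size):

```python
# Approximate parameter counts that fit in a 16 MiB artifact,
# assuming weights dominate the file and no checkpoint compression.
ARTIFACT_BYTES = 16 * 1024 * 1024

budgets = {
    name: ARTIFACT_BYTES // bytes_per_param
    for name, bytes_per_param in [("fp32", 4), ("fp16/bf16", 2), ("int8", 1)]
}

for name, n_params in budgets.items():
    print(f"{name}: ~{n_params / 1e6:.1f}M parameters")
```

Even at int8, the budget is on the order of 16M parameters, orders of magnitude below typical modern language models, which is why architecture and data choices dominate the leaderboard.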
The approach reflects a growing trend in AI hiring — companies increasingly value demonstrated ability over credentials. Parameter Golf is one of the most accessible paths into a top AI lab for talented researchers who might not have traditional pedigrees.