Do AI models in biology have to get bigger to get better? And what can you do with more training, more computing power, and more outputs?
Joining me this week to talk about MaxToki, a new AI-powered model that can predict how cells age, is Christina Theodoris, a physician-scientist at the Gladstone Institutes who uses her models to study cardiovascular disease. She’s a pioneer in this field, having developed Geneformer: a program akin to ChatGPT or Claude, but one that spits out lists of interesting genes or predictions of gene expression patterns rather than custom workout plans or summaries of your meeting notes.
But size isn’t everything. “The biggest impact we see though is with diversity of the data,” she said. “As we increase the diversity, that actually has even more impact than just the pure numbers, let’s say if you were to use cells that are similar to the ones seen before.” This has implications for training new, even more powerful models.
Theodoris is also an accomplished visual artist whose paintings draw on surrealism, cultural heritage, and memory. With that in mind, we talked about the trend of AI slop, how she would paint MaxToki, and whether or not she likes the Ion Genomics logos I came up with.
To learn more about her art, please check out Christinatheodoris.com.
You can also listen on Apple Podcasts and Spotify.
Links to Theodoris’ work discussed in this episode:
MaxToki Preprint: https://www.biorxiv.org/content/10.64898/2026.03.30.715396v1
Scaling Geneformer: “Scaling and quantization of large-scale foundation model enables resource-efficient predictions in network biology,” Nature Computational Science
https://www.nature.com/articles/s43588-026-00972-4
“Discovery of candidate therapeutic targets with Geneformer,” Nature Protocols