I’ve really been enjoying Sarah Constantin’s “Rough Diamonds”, a Substack devoted to high-impact opportunities in science and engineering. The most recent essay is on industrializing and scaling mammalian cell cultures.
Cell cultures are big business. Devising a more efficient manufacturing process could be worth trillions, potentially revolutionizing gene therapy, cell therapies, the production of antibodies, and even kickstarting the lab-grown meat industry.
What’s the problem? It turns out scaling the growth of mammalian cell cultures is hard. Why? They’re really finicky.
"Mammalian cells, by contrast, are meant to grow in, y’know, mammals. They’re expecting to find themselves integrated into complex tissues, full of regulatory chemicals that tell them how to differentiate, when to grow, and when to die. A bioreactor is not their natural home.
Moreover, mammalian cells are not meant to proliferate forever. They get a certain number of cell divisions and then they die."
And what’s more, at every change in scale you have to basically start your experiments over. A technique that produces usable cells on a workbench may not be remotely effective when you scale up even a little bit.
"How bad is it? Well, for instance, once culture growing conditions have been optimized at one scale (say, a benchtop reactor), they have to start almost from scratch with figuring out how to scale up to a larger size (say, a pilot plant). The effect of something as simple as reactor volume on mammalian cell culture growth is unknown and unpredictable.Mammalian cell culture is labor-intensive, expensive, and scales very poorly."
A promising advance would be building reactors (machines which maintain the growth environment and produce healthy cell lines) that dynamically adjust the parameters of the medium rather than targeting a static value (i.e., a reactor could vary pH over the course of a run instead of just setting it to 7.0, or whatever).
"The target levels for things like temperature or pH or glucose concentration are fixed, hard-coded. Usually the target values programmed into the bioreactor are determined experimentally, through laborious trial and error. (And remember, the settings that worked in the lab won’t necessarily work in the factory — all the experimentation has to be redone for each successive scale-up of production!)Research has shown you get much higher yields under adaptive control — if you allow the target levels for bioreactor settings to adjust as the culture grows or in response to measured conditions in the cell culture. The ideal growing conditions for a low-density culture are different from the ideal conditions for a high-density culture after cells have been growing for a while."
This could be done with machine learning, and probably wouldn’t even be that hard compared to some of the language model stuff I’ve done.
"This isn’t actually that big a model to train. There are probably well under 50 types of sensor measurement and reactor-controlled parameters worth including, all of which are producing sensor data or logging actuator behavior at perhaps a few hertz. Language or video-based machine learning models are much higher-dimensional and need correspondingly vast quantities of training data."
Are there startups working on this? Of course! One of the funnier ones is Trisk Bio, which is solving the scaling problem by building a billion really small growth environments.

"Neuroscientist-turned-entrepreneur Gaurav Venkataraman has a very early-stage startup, Trisk Bio, whose initial angle of attack is brilliantly simple. Scaling up cell culture from benchtop to factory is unreliable? So just make tons of benchtop-sized bioreactors for your factory."
(Apparently, sometimes the solution is “just do what you were doing before, but more.”)
Where does all this lead? It’s still an open problem, but one that might be amenable to known techniques (like ML), and one with potentially enormous payoffs.
I, for one, am excited to see what comes out of this.