In this episode, Patrick McKenzie (patio11) is joined by Ben Reinhardt, founder of Speculative Technologies, to examine how science gets funded in the United States and why the current system leaves m...
Welcome to Complex Systems, where we discuss the technical, organizational, and human factors underpinning why the world works the way it does. Hi there, everybody. My name is Patrick McKenzie, better known as patio11 on the Internet. And I'm here with my buddy Ben Reinhardt, who's the founder of Speculative Technologies, a focused research organization. FROs, if you're not familiar, are a sort of innovation that is returning to tradition in terms of how we do science funding.

But we've had some episodes recently about charitable giving, and some episodes recently about for-profit investing. And science funding sits uncomfortably in the spiritual intersection of those two, where the gains from core science funding are often not directly captured the way the gains from funding a company would be captured. But they're not exactly charitable either. For one thing, those of us who have worked in tech have had long and prosperous careers because someone in the past made a decision to fund some amount of research that we eventually built on top of. And with that super broad prompt, I just want to talk about that: what we currently do for the funding ecosystem in the United States, and how we could improve it.
I'm excited to be here. One quick correction: Speculative Technologies is not itself a focused research organization. We help people start them, and aspire to run them internally. And I can get into the nitty-gritty of the classic definition of a focused research organization, as classic as a five-year-old definition can be. But that's the one little asterisk on that.

Yep. My bad. So from a super high level, I think the common narrative in tech spaces about the funding for basic research in the United States and elsewhere was that it was, at one point in the past, largely an industrial game, with places like Bell Labs that had large semi-monopoly powers investing a large amount of economic rent into their internal laboratories, which deployed huge amounts of capital by the standards of the time against basic research. And then that fell out of favor, for a variety of reasons, in favor of the federal government funding almost all basic research through a couple of different funding authorities and organizations.
And there has been something of a shift in that over the last few years, with a lot of research in particular areas of interest increasingly funded through industry again, but the federal government remains the largest funder by quite a bit. Does that broadly capture the shape of the curve?

That broadly captures the shape of the curve. Though obviously, it depends on when you start history, on what your baseline is. If we look at the post-World War II system, which is essentially what we live in today, the government has sort of always been funding a lot of basic research. I'll put a star on this, because I think the categorization of research into basic and applied and development is a bad bucketing system. But companies now do much less basic research than they used to, and if you look at the breakdown, the vast majority of it is funded by the federal government.
And I would love your take on why these three buckets are not a great taxonomy for organizing the world of research. But just for the benefit of people who have never considered that question before: what is the, quote unquote, typical taxonomy? And then we can go into reasons why it might not cleave reality at the joints.

So the typical taxonomy of research, and this is encoded into law, is that there is basic research, in the classic sense of the scientist who wonders, say, how snails mate. It is research done into the nature of the universe with no mind toward how it might be useful for anybody. It's just pure curiosity. And then there is applied research, where the questions are still very open-ended, but you are now trying to do something useful.

So now it's: how can we make the structure of this material have this specific set of properties, in a way that we don't yet know how to do? And then there's development, which is, for lack of a better term, the last piece before making a product. To a large extent, the work SpaceX is doing to make Starship is development, in the sense that they are trying to do something nobody has done before, and it is very hard, but there is a clear product at the end of it.
Mhmm. And this is the quote unquote, just an engineering problem. No new science required. Yeah. I hate it.
Something that engineers, yeah, never love hearing, particularly from the CS field. Like, given the existence of Turing-complete languages, everything is an engineering problem. Yes. I think it is sometimes useful, in terms of my mental taxonomy, to think: okay, there is foundational research done into electromagnetics.

And then in the middle, we discovered that LEDs are a concept, and then actually getting a blue LED to exist in the physical universe at a price people can afford is more applied research. But the United States, through the genius of its elected representatives, has encoded this tripartite definition into law. Why might that not have been a great idea?
Well, I think it's tough. Semiconductors are a great example of why. Point at the history of the transistor: it starts off with this realization that there are these things called semiconductors, which are not quite conductors and not quite insulators, and they have weird properties about how electrons move through them. And that is certainly basic research. And then people say, oh, we might be able to use this to make better amplifiers.

We might be able to replace these vacuum tubes. And so the way you actually do the work there is a mixture of applied research, where you're trying to get these semiconductors to be useful, at the same time as trying to figure out the actual shape of the product you're going to make out of them. But then you run into this situation where you realize that our understanding of the laws of physics is actually insufficient to explain how electrons move through these semiconductors. You basically need to do some updates to our understanding of quantum mechanics in order to properly model how electrons move through the semiconductor, and you're doing this at the same time as thinking about the product. And so it becomes this wild, tangled mess with a whole bunch of loops.

So if you have a team where one guy is trying to figure out new laws of quantum mechanics while talking to a guy who's trying to bond hunks of semiconductor together to make an amplifier that is actually useful, what kind of research are you doing? Right? It goes in no bucket. And the reality is that if you zoom out, the vast majority of useful research projects look like this.
Mhmm. Yeah. I think it often gets short shrift in discussions, even though it's been highlighted for decades in a number of places. In Japan, the magic word is kaizen, which got appropriated slash homaged by a US-based consulting community. And with regard to semiconductors, the jargon thrown around is "process knowledge": there is something that hasn't been reduced to a paper yet, but you can't build a chip fab without it. Just the designs and the descriptions of all the specs for all the machines, wired all up together, will not get you a usable wafer at the end of it, because you lack the process knowledge.

And the dirty secret of science is that even when there is a paper, there's still process knowledge. There are many, many situations where the only lab that can actually do the thing reported in the paper is the one lab that has the process knowledge, and even if they are trying to communicate it, it's very hard, because you have to fiddle with this thing in just the right way. So I think process knowledge is shot through everything, even when there are papers. This sidebar is why the idea that we are going to teach AI to do all of the science just by training it on a bunch of papers is one that I personally think is not going to happen.
I think there are a number of reasons one could doubt that broad hypothesis, even though, goodness, we hope that AI results in an acceleration of science, or else what the heck are we doing with it?

Yeah. No. It certainly could.
We will lack digital twins for a while, to use another magic jargon word that people really like. And so some amount of the science is going to be rate-limited by people, you know, reading the AI's output from a screen, using that to do something in the physical world, and then iterating that loop a lot, at least until the point where we have a high-fidelity simulation of the world that can be run in a computer. Currently, the highest-fidelity simulation of the world uses human language as a substrate, or some derivative of human language in the model weights, which is a wild world to have ended up in. I don't think we appreciate how science fiction reality is at the moment. Anyhow, a worthwhile tangent to explore some other time, but we were talking about scientific funding.

We can just talk about how science funding works in the United States right now. Let's do a broad overview of the system. And I will flag that how science is funded is deeply coupled to where the science is done, so there are these two parallel tracks that are deeply interrelated. Right now, using the buckets as they exist, the basic, applied, and development buckets:
Right now, I think it is roughly, order of magnitude, $900 billion that goes into science research funding in the United States. And we can slice that up in a couple of different ways. The vast majority of it actually goes toward development. Don't quote me on these numbers, but it's on the order of 700 of that 900 going toward development. Then roughly two thirds of the remainder goes toward applied research, and the rest goes toward basic.

So basic research is the least expensive. And this makes sense, right? You don't need to be building huge machines. You don't need to be blowing up spaceships when you're doing basic research.

Another way of chunking up that money is to put it into three buckets: money that is from the government, money that is from private corporations, and money that is from private foundations and other nonprofits. And you'll see stats that, again, a lot of the money actually does come from businesses. I think roughly 600 of that 900 is coming from businesses, another 200 is from the government, and less than 100 is from private organizations.
The thing to keep in mind, looking at this, is that many people say: oh, business funding for R&D is very high, so what's the problem? And the thing to note is that the vast majority of the R&D money that businesses put in is the D, the development. And I'm going to go slightly deep here: a thing to keep in mind whenever you hear this number is what can get coded as development spending by these businesses. Some of your audience may be aware of this, but, for example, building a new feature in a piece of software can get coded as a development expense.

And so when you hear that there's all this money going into R&D from businesses, some of it is very legitimate research. Right? Microsoft and Google are building quantum computers and discovering LLMs and all this stuff. So businesses are certainly doing good research, but the numbers are a little bit skewed.
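The rough splits Ben quotes can be sanity-checked with quick arithmetic. A minimal sketch, using only the order-of-magnitude figures from the conversation (in billions of dollars; these are conversational estimates, not authoritative statistics):

```python
# Order-of-magnitude US research funding figures from the conversation,
# in billions of dollars. Not authoritative statistics.
total = 900
development = 700
remainder = total - development   # what's left for applied + basic
applied = remainder * 2 // 3      # "roughly two thirds of the remainder"
basic = remainder - applied       # "the rest goes towards basic"

# By source: businesses, government, foundations/nonprofits.
by_source = {"business": 600, "government": 200, "private": 100}

print(f"development: ~{development}B  applied: ~{applied}B  basic: ~{basic}B")
print(f"development share of total: {development / total:.0%}")
```

The point of the exercise: development dwarfs everything else, and since business money is overwhelmingly development money, "businesses fund most R&D" and "the federal government funds most basic research" are both true at once.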
So I've felt something like an impedance mismatch, where tax policy around software specifically has huge distortionary effects on, one, our understanding of the fundamental domain of science, and two, the allocation of resources in the economy. You mentioned that businesses can code this, which basically means that if the business identifies a particular line item, whether that's an engineer's time or similar, as laddering up to research, then they get the R&D tax credit. And companies are incentivized in many cases to maximally claim all the R&D they can, which pushes up the claimed amount of research and development work done every year. However, as everyone who has ever worked in a software company knows: one has a software business, the project exists, and a certain amount of the intellectual effort by engineers and product managers and similar on that software every year is effectively opex. But it doesn't look like opex in a certain view of your balance sheet slash profit and loss statement, and it isn't coded as 100% opex when the accountants or consultants file the tax return, because if it's cherry flavored, you get a substantial credit from the United States for it.

Whereas if it's simple, normal, garden-variety opex, you don't. And as a result, we have incentivized some of the largest firms in capitalism to say: well, we have an awful lot of cherry-flavored topics every year. But that doesn't change the facts of the physical universe. It doesn't change the rate at which we are learning about reality all that much. As you mentioned, large firms in capitalism also do a whole lot of legitimately cutting-edge research, on everything from quantum computing systems to the "Attention Is All You Need" paper. That's classic, quote unquote, basic research: there was no application that could actually be made of it at the point it was written.

And then a number of firms, interestingly not the ones that wrote the original paper, chased after it with vim and vigor, and now we have magic answer boxes on our phones.
Yes. A couple of other things to flag. I think we've talked about where the money is coming from; then there's the question of where the money is going to. The vast, vast majority of money goes toward what I call, and this is my own term, precommercial research. This is work that, to your point, may be targeted at an application, but is not yet a thing that makes sense as an investment, because the level of uncertainty, the time scales, and the public-goodsiness of the research make it so that a rational investor will look at it and say: I do not want to put my money into that if I want my money to make more money.

The vast majority of that right now happens in universities. So when you hear people complaining about the academic system, that is these universities. And then there is a good chunk of work that happens in national labs, and a long tail of other organizations. And so, one of my hobby horses, I have several, you could say I have a hobby chariot pulled by my hobby horses:

One of them is that, to a large extent, we have developed a system in the United States where, when you ask someone how technology happens, the response will be: well, someone does precommercial work in a university until it makes sense as a company, they spin it off into a company, that company raises a bunch of money, productizes, and lo, there will be technology. Similar to how the basic-applied-development model of research is, I think, a bit flawed, so too is this story of how technology magically happens.
So I think there are all sorts of institutional incentives that throw wrenches in the works here. But one in particular: at the point where you are spinning something out from a university into its own private company, or selling the technology to someone, or similar, our friends at the technology transfer office at the university get involved. And you've had some choice words for them in the past. Do you want to spin out that thought on air for people?

Sure. The explicit choice words are: I'm generally a big believer in Chesterton's fence, and most institutions that exist exist for a reason; they may need to be reformed, but, by and large, they should exist. That is not the case for tech transfer offices. I think the world would just be better if we burned them to the ground. And this is not against any individual in a tech transfer office.

There are many, many wonderful, lovely people who work there. But to a large extent, tech transfer offices are serving none of their purported purposes. Right? If you think about what, ideally, the purpose of a tech transfer office should be, it's twofold. One is to make sure that technology invented in the university gets out into the world, so the world can benefit from it.

And two, to make it so the university can capture some of the value it has created, which will then hopefully be plowed back into more research, to create more technology and more science.
I think the acknowledgment of an ad read sounds cooler in Japanese. Cool.
Right?
You might have heard on this podcast that cuts to PEPFAR and USAID were extremely disruptive to health care in some of the world's worst off communities. Private funders ended up picking up part of the slack. But if you're not a billionaire, you probably can't commission quality research on whether your charitable giving is effective. GiveWell can. GiveWell is a nonprofit which has a team of researchers working in real time to track the impact of foreign aid cuts, and they contribute their research to the commons for free.
For example, they found one of the most effective interventions is paying caregivers and poor nations directly in cash to take their children in for routine childhood vaccinations. This decreases the disease burden on the kids and their families and reduces childhood mortality. GiveWell has spent eighteen years researching global health and poverty alleviation. This work is funded by donors who think it's useful for directing their charitable giving. GiveWell also lets you donate to the causes they think are most effective and will pass 100% of your donation along to the recommended funds.
In addition to helping donors stretch their charitable dollar, GiveWell has a matching program where they'll amplify your donation and route it to where they think it would be best employed. If this is your first gift through GiveWell, you can have your donation matched up to $100 before the end of the year, or as long as matching funds last. To claim your match, go to givewell.org and pick "podcasts." Enter COMPLEXSYSTEMS, all one word, all capital letters, at checkout. Make sure they know you heard about GiveWell from the Complex Systems podcast to get your donation matched.

Again, that's givewell.org, code COMPLEXSYSTEMS, all one word, all capital letters, to donate or find out more. I have an engineering degree and code my own websites. It's probably the most irrational choice I make in business. Low leverage, a spiky maintenance burden that always comes at the worst times, and they don't even look good. Don't get advice about design from me.
Instead, take it from Framer, a sponsor of today's episode. Framer already built the fastest way to publish beautiful production ready websites, and it's now redefining how we design for the web. With the recent launch of DesignPages, a free Canvas based design tool, Framer is more than a site builder. It's a true all in one design platform. From social assets to campaign visuals, to vectors and icons, all the way to a live site, Framer is where ideas go live, start to finish.
Ready to design, iterate, and publish all in one tool? Start creating for free at framer.com/design, and use code COMPLEXSYSTEMS, all one word, all capital letters, for a free month of Framer Pro. Framer.com/design. Promo code COMPLEXSYSTEMS. Rules and restrictions may apply.
Both of these things seem like good things, but the reality on the ground is that tech transfer offices serve neither of these purposes. I looked it up beforehand: the fraction of tech transfer offices that are profitable, if you look across all tech transfer offices, is 16%. So the vast majority of tech transfer offices actively lose money. And of the ones that do make money, the total amount they make across the US is, like, single-digit billions of dollars a year. And that's every single spin-out from every single university.

That is the mRNA vaccines. That is Google. That is Gatorade, which makes a shockingly large amount of money. So yes, the amount of money that tech transfer offices are generating is tiny compared to the amount of money that research actually needs.
Most of them are not profitable. And I think the biggest thing is that the amount of pain people have to go through to get technology out of the university is kind of mind-boggling. For your listeners: when someone signs an employment contract with a university, part of that contract is that they do not own anything they invent using university resources or on the clock for the university. The university owns it. So if you are a scientist and you invent something at the university and you think, this would make a great product, you must go to the tech transfer office and say: hey, may I license my own invention from the university in order to spin out a startup?

And then the tech transfer office will hem and haw. First, they will feel no urgency about this, and as your listeners know, time is of the essence when you're starting a startup. And then they very often will want fairly onerous deals, where they demand monthly payments starting at day zero, or an unreasonably large amount of equity. There are questions about how much technology, counterfactually, has not spun out of universities because it's such a pain to go through the tech transfer office.
How many companies have died because people tried, and either were discouraged or got deals such that it was then impossible to raise more money? They function as just one more imposition on the PI's time: doing paperwork instead of doing research.

A long time ago, in a galaxy not so far away, I was an undergraduate research assistant. And my understanding, as someone who had been a student until, like, a hot minute ago, was that, you know, I am working for the princely sum of $12 an hour doing this undergrad research. Obviously, I'm not going to get any sort of upside in this research. That's not the point of the exercise.

I'm learning things, etcetera, etcetera. This will be a good way to use the summer. I spent more than ninety minutes that summer myself doing paperwork for the university tech transfer office, and the PI spent tens of hours over the course of his year. Wild. I don't think there's any PI in a university who actually works only two thousand hours.
But if it were, you know, a 1% tax on all research produced by the university, you'd want to know: what are we getting for paying that 1% tax? And if the answer is literally negative, then just cancel it without replacement. And maybe, if the university wants to do something instead, they'll just say: yeah, we're going to put a $100k check into anyone who spins out something. That seems like it keeps us in on the things that are on the far right tail of outcomes.

And for the other ones, it's cheaper than having an office full of full-time employees.
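Patrick's back-of-envelope can be made concrete. A minimal sketch, where every number is an illustrative assumption (headcounts, salaries, and spin-out rates are hypothetical, not data about any real university):

```python
# All figures below are hypothetical assumptions for illustration,
# not data about any real university or tech transfer office.
spinouts_per_year = 10        # assumed spin-outs at a large research university
flat_check = 100_000          # the no-strings $100k check Patrick proposes
tto_headcount = 15            # assumed tech transfer office staff
fully_loaded_cost = 150_000   # assumed annual cost per full-time employee

checks_cost = spinouts_per_year * flat_check   # cost of just writing checks
office_cost = tto_headcount * fully_loaded_cost  # cost of staffing the office

print(f"flat checks: ${checks_cost:,}/yr  vs  office: ${office_cost:,}/yr")
```

Under these assumptions the flat-check scheme costs less than staffing the office at all, before even counting the deals the office's friction kills; the argument only needs the ordering to hold, not the exact figures.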
Yes. That would be very wise. And we've run this experiment; there are natural experiments in this. Look at the University of Waterloo: the university lets inventors own their own inventions. Basically, they say: hey, if you go off and make a ton of money, you know, give some of it back.

And I think people do. I may be an optimist about human nature, but I think people do, when it really is a case of: oh, I was unable to do this without the university. The University of Waterloo is doing quite well off some of the things invented there. And the riff on this is, your listeners know the dynamics of startups very well: it is an extreme tail-dominated thing. The vast majority of things that spin out of a university are not going to be incredibly lucrative.

And so trying to squeeze everyone the exact right amount does nothing besides give you fewer shots on goal.
Yeah. And I think what we saw over and over again at Stripe Atlas, and I'll say these are not necessarily representations endorsed by Stripe, is that startups in their early days are just so fragile, particularly in the proto-startup form, where the PI says: I'm not exactly sure what I'm going to do with my life next year. Am I going to, you know, continue climbing the ladder? Am I going to go after that ambitious research project that I just got a grant for, the whole ball of wax?

Or should I try commercializing the last research project? And in those early days, just the notion of, well, it's 600 pages of forms to do option B versus option A: it's still work. I mean, you know, I work for a living, but if one path feels less draining, then I won't do the one that has 600 pages of forms in front of it. And that's barely an exaggeration, by the way. And it's not like 600 is the magic number, where people have an internal capacity of 450 forms and then the startup dies.
We were tracking it down to, like, literally the form-field level. And if you play this out over a few years, it's probably visible in macroeconomic indicators. We have, intentionally and often for good reasons, strangled innovation at the earliest stages through the bureaucratic and administrative overhead on it. There's a wag's quote that the bureaucracy is expanding to fill the needs of the expanding bureaucracy, and if you have interfaced with some of these processes directly, you definitely feel the directional truth of that.
Oh, yes. I have both gone through them myself and am very close to many professors, so I hear on a daily basis what it looks like. And the data also bears this out: there have been surveys, and I think bureaucracy now occupies roughly 40% of professors' logged hours. So it is a very real thing. If I can quickly double-click on a term you've been using that I just want to make sure the listeners all know: you've been using the term PI a lot, and I don't know that we've defined it.

That is "principal investigator," which is a term of art in our modern system of research funding: the person who applies for and gets the money. And I'm noting this because, in a university, usually the PI is a professor. Sometimes it is a postdoc or a research assistant, but the vast majority of the time it's a professor. The entire grant-making system is organized around the assumption that there will be a PI, and this person will both write the proposals for research, be responsible for executing on the research, and be leading the research, at a very granular level: they are applying for money to do a very specific project with very specific scopes and aims. And I flag this only because, while it is certainly a fine system...

Right? It has gotten us here; it has produced many amazing things. But as a way of doing everything, it's kind of shocking, because if you think about it, in startup terms, that would be like: you're not allowed to have a CTO. You are only allowed to have one executive, who must do all the fundraising and run the team, and the scale on which they can raise money is for very specific projects, like, we need to develop this new feature or roll out this specific app. And heaven forbid you find a different opportunity and want to use that money for something else, because you certainly can't.

So that is just a thing to flag about how this all works.
And this goes back decades, but I sometimes wonder if we're not still reacting to the fact of science as conducted centuries ago, where, when you look in the history books, a number of the foundational mathematics and foundational science were done by essentially bored noblemen or, infamously, patent clerks: a single practitioner with a relatively tiny amount of money advancing the state of the art, with time to think about the problem, draw things on the blackboard, and have correspondence with peers. Increasingly, that's not how science is conducted. These days, you know, we have plucked much of our low-hanging fruit, and now you need a large team in a lab kind of environment to do much of the research we're discussing. But the funding mechanism assumes: yes, we understand you will need a large lab.

However, for whatever reason, we can't really interface with large labs. We have to pretend it is one scientist who has all the ideas in their own head, and we essentially ban via statute any notion of specialization. In private industry, granted, the CEO almost always owns the fundraising ball as one of their things, but you're not also forced to be the person doing all the hiring, filing all the time cards, and then doing server administration. Whereas under the grant contracts, no, it is literally on you. You have to be the chief hirer, the chief correspondent with your funding agency, and also most of the brain trust. That doesn't do wonderful things for those scientists who are good at science but not great at, you know, admin or managing people. They get essentially frozen out of the funding pipeline, unless they have one of our, quote unquote, supremely talented polymath PIs who can step in and write the proposal for them, while hemming and hawing over the required line in the proposal stating that it's your work that you're applying for and not somebody else's.
Yeah. That is spot on. Another one of my hobby horses is trying to create more systems that are outside of this, where you can have funding at some level above the level of a lab and a PI, in the same way that you have a company. And within that company there are several departments doing all these different things, without needing to go to someone outside and justify, on a project by project basis, why they need that funding.
For the benefit of people who have only worked in private industry: obviously, we spend a lot of intellectual calories on fundraising in private industry, whether it's through investment or through sales. But the calories largely get spent at the company level. And then, while budgeting is a process, teams have some flex in how they spend their department's budget, and often it only takes an email to get more allocated to you. If a PI at a university lab has an ambitious research project and needs an extra $200,000, how much additional difficulty does that add to their life?
I pause only because it's almost unquantifiable. Right? In the sense that it's one thing if that money is for an entirely new project. There are processes for that, where it's like, oh, I have this idea, this project, it needs $200,000.
Professors, when they start their jobs, get what is called a startup budget. So they have this slush fund with which the university seeds the lab. And if they still have some of that, they could deploy it. But if they want to save it for a rainy day, which is very reasonable, and they want extra money for an existing project, there are no standard channels for that. Right?
So the way that you would get that money is, like, if you happen to know an incredibly wealthy person, or you would secretly, basically, apply for money for a different project and then do some very sketchy book cooking to be like, oh, well, this grad student who's on this other project is just going to be spending some of their time on this other project. And banish the thought if you want to spend that money on equipment, because something that we didn't mention is that most precommercial research funding, which is to say grants, specifies how that money must be spent, what you are allowed to spend it on. And you need to say, ah, yes, this amount of money will be spent on consumables, like, you know, chemicals and reactants. This amount of money will be spent on personnel.
And sidebar, that personnel money is usually earmarked specifically for grad students or postdocs. So you're not really allowed to hire specialized professionals to come in. And a very small amount will be earmarked for equipment, which, just in terms of going deep on things, creates these very interesting incentives not to automate anything in science. Right? So everybody's wringing their hands about how science productivity has not increased as much as in many areas of the economy.
And I, to some extent, buy the low hanging fruit argument, but I do not think that it is a complete explanation. When it is relatively easy to get money for grad students and relatively hard to get money for equipment, you are very disincentivized from substituting CapEx for OpEx, even if it would make you much more productive. You know, I've heard of very explicit conversations with PIs being like, so why don't you install some robots to do this incredibly repetitive task that right now you have five different grad students doing? Basic stuff, like pipetting, dropping one drop of a chemical into another. And they're like, well, it's easy for me to get money for grad students, so why would I buy robots?
This also has impacts on the quality of the science done, because we reach for the grad student as the one option in the toolbox. You mentioned it is difficult to hire external professionals, both because the social system and the history of the university make that more difficult than just bringing on another grad student, and also because the numbers thrown around might not match industry wages for very long. But I remember there was a funded project conducted at my alma mater, many years after I left, that was doing urgent research into the effectiveness of a particular drug for treating COVID. And they had a 200 question intake questionnaire for patients that they were attempting to get into this drug trial, and were realizing, go figure, that not many patients were getting through an online 200 question questionnaire to get approved into the trial.
And through a random pathway, I ended up discussing it with the PI. And I said, so help me understand, as the person who has not done medical research for his entire career: why the 200 questions? My hypothesis was that it must be that the institutional review board or something had asked for these questions, and they weren't accountable for the actual success of the trial. And he said, oh, that's how many the grad student coded.
And I said, oh, so how many are required? Four. Well, how about we delete the other 196? And he said, no one on the team knows how to do it, because this is written in PHP. Only the grad student does PHP.
And I said, it's a big Internet out there. You know? You can find a lot of people who can press the delete key on a PHP app. And it's like, no, we have to use the grad student.
And why isn't she in this meeting already? And, for the usual organizational reasons, she wasn't in the meeting and wouldn't be available for days, in this period of intense criticality for the project and the broader society that research is supposed to support. And so their, quote unquote, funding mechanism was having me ask the best man at my wedding, who does write PHP: hey, can you look at this thing and press the delete key for us? It'll be very effective.
And indeed, it was. Oh, that's wild. And, you know, grad students do a lot of very excellent work. I've known some extremely talented people who were taking an incredible pay cut relative to working in industry because they love the science and wanted to spend their life doing it. But I think we are making poor use of their time and brain sweat as a society if we're having them pipette chemicals into containers just because that is the easy thing to put on
a grant application. Yes. Two quick riffs on that. One is that that requirement, to some extent, comes down all the way from legislation, because the United States government sees grad students doing research as fulfilling two different things. One, we must train the scientific workforce, and two, we must create research.
And, you know, this sounds great at the level of the Senate, but on the ground it has increasingly led to tensions. One, we are kind of overproducing grad students, because if the vast majority of scientific research needs to be done by grad students, then every time we want more science, we need more grad students. So we get this overproduction of grad students. And two, I would argue that we get lower quality research, because, to your point, many grad students are really amazing.
But at the end of the day, they are trainees. And I liken this to if you were a software company and every time you wanted a new feature, you said, okay, cool, let's have the interns do it. Then you not only have the issues with people learning and doing at the same time, which again can work, but usually is helped by having many more senior people around.
But there are also continuity issues, because the interns leave after a while, and then people don't know how things work. It's one thing if the grad student isn't in the room, but often the grad student who coded the button has graduated and is off doing their thing. And then you have to do a lot of code archaeology to figure it out. And so that is another way in which the system is a little bit rickety.
Well, at least you can look in the source control for what they did, which is a dark joke. How many grad programs that I've interacted with actually teach people to use source control?
I tried to introduce GitHub to my lab during grad school. Like, this was literally everything just being files on computers. And I was like, hey, there's this thing called Git. We could use it as source control.
And everybody was like, that sounds like work.
Do you mind if I ask what year that conversation happened in?
That I believe that happened in 2013.
Okay. So the Joel Test, which came out in 2000, was a 12 question test by Joel Spolsky, which was essentially helping people who had not yet joined a company evaluate whether the company had taste. And so already back then, in the private tech industry, "do you use source control?" was a pass/fail question. Yeah. And that's roughly contemporaneous with my own research experience, but even ten years later, it was still taking its time to percolate into the research community.
You know, we have an abbreviated time together today, and I would hate to just dwell on the problems. And, obviously, one can't thank the research system in The United States enough for all the wonderful things in the built environment around us. But what are the reasons for optimism about this? What has been working well recently?
Yeah. There's kind of this emerging new world of people who have seen these problems, realized that the system can be changed and that we need to build new institutions, and are working on that. There's this, I'd call it, small group of misfits who are trying all sorts of institutional experiments. And I think the realization that we need to change this has started to percolate through more and more of the culture. That's one reason for optimism.
And then the other is, as much as I think it could be better, the reality is that The US research system is still the thing that everybody looks up to and emulates. So that is another thing to keep in mind: despite all of this, there are still amazing people doing amazing work, and we should be very supportive of them. Going a little deeper on the things that are going on, we mentioned, right at the beginning, focused research organizations. That is this idea of, what if we actually organized research more like a startup, where you have a core group of people working on a very specific problem. So that is one thing that's going on.
There are a number of attempts in various governments to reorganize research funding. You have ARIA in Britain. You have the ARPA model, modeled after DARPA, in The United States. You have organizations, both on the for profit and nonprofit side, like Speculative Technologies, that are trying to say, well, what if we took all these things that I've been critiquing and inverted them? What if we do have teams of professionals, not burdened by bureaucracy, working on problems that are not bucketed by whether they're basic or applied, trying to do very useful things?
And there are many ways of implementing that, and I'm very excited because there are many people trying many different experiments with it. And so I think that is a good reason to be optimistic.
Mhmm. Well, more experiments on how to do experiments better never hurt anybody. I regret that that's all the time we have to chat today. But, Ben, where can people follow your work on the Internet?
You can find me on X at Ben underscore Reinhardt. I write at both blog.spec.tech and my website, benjaminreinhardt.com.
Thanks very much, Ben, for your time today. And for the rest of you, thanks very much, and we'll see you next week on Complex Systems. Thank you, Patrick. Thanks for tuning in to this week's episode of Complex Systems. If you have comments, drop me an email or hit me up at patio11 on Twitter.
Ratings and reviews are the lifeblood of new podcasts for SEO reasons, and also because they let me know what you like.
The economics of discovery, with Ben Reinhardt