Gaps & Opportunities in the Current Research System

Released Tuesday, 27th February 2024
Episode Transcript

Stephanie Reid 0:06 Welcome to the Beyond Research Podcast. This episode is a conversation between Dr. Melissa Flagg and Stefan Leslie that explores the challenges of tackling wicked and super wicked problems at both global and local scales, and the importance of understanding problems and the environments in which they exist. They talk about gaps in the current research system, and how to experiment with different ways to support research. Melissa will talk about how important it is to understand the constraints that condition what research should be done, and how that makes the research better. Stefan will explore the value of supporting research that solves problems but also creates knowledge that is valuable in other circumstances or situations. I'll introduce Melissa and Stefan and then we will get right into the conversation. Melissa Flagg 0:56 I am actually a pharmacist by my undergrad degree and a PhD pharmaceutical chemist. I thought I was going to bounce around the jungles, collecting plants and curing tuberculosis. That clearly is not what happened. Stephanie Reid 1:13 Dr. Melissa Flagg is the Founder and President of Flagg Consulting, a Fellow at the Acquisition Innovation Research Center, a Visiting Fellow at Perry World House, and a Senior Adviser of the Center for Security and Emerging Technology at Georgetown University. Prior to this, she served as the US Deputy Assistant Secretary of Defense for Research, responsible for policy and oversight of the Defense Department's science and technology platforms. Dr. Flagg has served on numerous boards, including the National Academy of Sciences Air Force Studies Board and the Department of Commerce Emerging Technology Research Advisory Committee. She is currently on the advisory board for the Andrew W. Marshall Foundation. She holds a PhD in pharmaceutical chemistry.
Melissa Flagg 2:03 I moved to DC to do a fellowship in September of 2001 at the State Department thinking I would be focused on sustainable development, and 9/11 happened and the world in Washington DC sort of imploded and my career path changed dramatically. And fast forward, I wind up getting offered a job with the US Navy, doing international research collaboration overseas in London. So I wind up for most of my career working in research and development with the US military, which was certainly not something I had planned; it wasn't on my bingo card for 2001, that was for sure. I did at one point do a little diversion into philanthropy. I worked for the MacArthur Foundation on what is colloquially known as the Genius Grants Program from 2013 to 2015, thinking I was going to be in philanthropy for the rest of my life. I was very interested in and focused on really thinking about how we bring science to domestic communities. I was offered an opportunity to go back to the Department of Defense as a political appointee under the Obama administration. And I just couldn't say no to that. I will say, sitting at that place where I was doing oversight of all the basic science for the US military, I spent a tremendous amount of time at universities, talking to some of the best researchers in the United States. And I was offered the opportunity to find alternative employment when the election happened and wound up taking a year off and driving the country on a road trip, rethinking what had I learned from sitting in a position of real leadership for maybe the first time in my life. What I realized was, I wasn't super proud of all the things I'd been doing. I felt like I had been prioritizing this false narrative of quality, which is so hyper focused on novelty, and on elite institutions really. And I really cared about solving problems. And the military gives you a lot of opportunity to do that, but really for an internal customer.
So I started thinking a lot more about how do we actually prioritize and make solutions valuable again? If you go back to Vannevar Bush and his Science, the Endless Frontier in 1945, he was never suggesting that we divorce basic research from the end user. In fact, he was very clear that it needed to have a focus on the public good or curing disease or the national defense, national security. So I really went back and I reread that and I've kind of reshaped my career from that point forward. I've spent some time in a think tank, I spent a little time opening a lab for the Army Research Lab up in Boston trying a new model, and now I work for myself. And I do a lot of horizon scanning and work on predictions, like forecasting emerging technology. But I also spend a lot of time thinking and writing on these types of topics. And really considering what the rest of my career looks like, if this is more the lens that I choose to focus it through. Stefan Leslie 5:26 Really what I do is I read interesting things that other people write and then I get in touch with them and say, hey, that's interesting, do you want to talk some more about that? And Melissa had the very good kindness to respond. Stephanie Reid 5:38 Stefan Leslie is the CEO of Research Nova Scotia. Prior to this, he served as Executive Director of the Marine Environmental Observation, Prediction and Response Network, or MEOPAR, a national oceans research network. He has also worked for Fisheries and Oceans Canada, and the New Zealand Ministry of Fisheries. Stefan Leslie 6:00 So, I'm the CEO of Research Nova Scotia, that's what I'm currently doing. I've been here for the past four or five years since we started. But most of my career has actually been in fisheries, fisheries management. So, I started in the private sector. And I was there for a couple years before I moved to the Canadian Federal Government. And I worked there in both Ottawa and here in Halifax, in the regional office.
So, both in the headquarters and the regional office. And I worked on the management side. And what I found fascinating about that was there you have a science-based department, which absolutely depends on top quality research work and the application of that research to immediate management problems. Problems that are defined by the regulator, the government, but also, given the nature of how the fishery operates, are articulated sometimes quite loudly by the users and other interest organizations and groups. And so, I spent a lot of time really thinking about how do you mobilize research in service of those needs? And not just in the immediate sense, because the act of management when it comes to fisheries is you have to say, we're gonna fish this much, down to the kilogram, in this area and this season. But that is based on a system which is largely unknowable. And so, you have to construct a really interesting risk framework to be able to understand that uncertainty and be able to bring meaning to it that will actually allow you to make decisions. And so that initiates a whole series of investigations that are required that have broad application, maybe not immediate use, but you have to kind of tailor it to what you see coming down the pipe. So, as I said, I spent quite a few years working for the Canadian fishery system and then took a four year leave to go work in New Zealand, which, well, first of all, it's a beautiful country, and you should go visit because it's just an astonishing place to be. But they had the great kindness of allowing people with no other association with the country to go work for their government. And so I worked for them, and what was interesting there, and I was managing fisheries there, is they have the same challenges we do. They have communities, they have uncertainty, they've got a certain set of species about which they know some things and not others. But they go about things in a slightly different way.
So sure, it's a Westminster-style liberal democracy and all the rest of it. But nevertheless, the relationship between society and industry and government is a little different. And so that's a great, you know, you spent a couple of months driving around, I spent a few years working in another system, but nevertheless, it has that same effect of opening your eyes to the possibility of tackling these sorts of issues in a different way. So anyway, to make a long introduction short, this opportunity came to get Research Nova Scotia off the ground, with an open invitation at that time to say, let's try and do something here. Let's try and find a way to directly attach the enormous research capacity we have here in service of what we need in this province. Melissa Flagg 9:13 Yeah, I think one of the things I was so excited about when Stefan reached out to me was seeing that Canada had committed real centralized top down resources to empower a bottom up approach. This, to me, is something that is missing by and large in the United States. We have this tremendous decentralized ecosystem and we have a lot of things going on at different levels. But the federal government has a real challenge sometimes, I think, with the trust that it takes to really invest in bottom up approaches, where you're allowing different voices to help you actually ask the question, to prioritize the question.
I'm so excited by this. I feel like, historically, at least in the US, if you go back, the agricultural extension service is one of the models that we have that I feel like I'm so proud of, but somewhere along the way, in the last 80 years, we got more excited by models like DARPA, that were highly focused on revolutionary ideas that were big and flashy and novel, and very top down, rather than these ideas that were really focused on empowering and enabling individual farmers across the country to leverage breakthroughs in chemistry or to leverage new knowledge that was coming out on drought-sustainable crops as climate shifts, or other things like this. And I feel like there's a happy medium there. And I was so intrigued by what I feel like you're doing at Research Nova Scotia, that's almost marrying these two models, and that's really, I think, an opportunity space that's being leveraged by very few people that I'm really excited about. Stefan Leslie 11:15 Let me just ask the question. So, you wrote, I think it was in the article that I responded to, that a lot of science that we support has not really been tuned to the needs of American communities, because that was the landscape you were dealing with, and kind of the immediate issues that they identified. So why is that? You mentioned over the 80 years, we've deviated away into the big and the flashy, as you mentioned. What caused that? And why did that happen? Melissa Flagg 11:45 When you look at what happened after World War II, there was this huge shift in the United States where we were focused on this big global strategic threat. It was all consuming: global nuclear war was this driver that almost was the lens that all of our research went through. If you go back to like the 60s, the US Department of Defense alone as an organization was 30% of global R&D funding. Like, that's just hard to wrap your head around, right.
And the United States in total was 69 to 70% of global R&D funding. And the kind of culture that derived from that was one of: we have strategic problems, and everybody is going to focus on that. Over time, I think the second thing that's driven us in that direction is this obsession with efficiency that you see starting in the 80s, and it's like a fetish. It's like a cult that is very off-putting, because it isn't real efficiency of the mission. It's like efficiency of the accounting. Right? So it isn't, did we get to solutions that allow our communities to thrive, and therefore we spent the money in a good way, even if it was a lot of money. It was, did each program manager that we gave money to try to save as much as they could? So, we started to cut down, I think, on a lot of things that were considered wasteful: travel, conferences, one-on-one engagements. We also have doubled in population since this time. The United States is what? Like top five populations in the world. I mean, we're a big country. And so, trying to figure out how a couple of program managers at the National Science Foundation, or the Office of Naval Research, are going to meet with all of the 1000s and 1000s and 1000s of people that might want to talk to them, without showing favoritism, became very complicated, I think. And so, what you started to see was a narrower way of developing the questions that we ask. In the US, we're sort of like teenagers. We like things to be a little dramatic, right? I fit well here. I'm very American. I have very big teeth. I see the dentist every six months, right? But we do tend, I think, to dramatize these choices a bit. Polarize them. And I think it's driven us to a place where we have incentives for profit; industry can make lots of money. And we have incentives for prestige, which are basically novelty driven in academia. And we have no incentives for helping Americans thrive.
Stefan Leslie 14:46 Because I read that piece you wrote, where you talk about profit and prestige. So let's do a little on-the-fly kind of framework conversation here, because I think part of what drew me to looking through that is I thought, well, that's really interesting, but I think you got it about half right. And the reason why I think it's half right is that I think that the concept of profit is great when you're talking about things that are being driven by industry who, quite appropriately, are needing to look after the bottom line. But the concept of what happens at that far end, I get prestige, let's just leave that one aside. Prestige is, in some ways, very useful; it has its limitations, certainly, when you're consumed by an interest in what society needs. But nevertheless, prestige has a role. But on the profit side, if just limited to profit, it does exclude other work that's being done, not for profit, but maybe for a benefit, but it's highly identifiable. So you're fixing an immediate need, you're applying the tools that science can bring to address a very specific problem. And so that's more a production model. So you can define those outcomes. But you don't necessarily incentivize a broader structure to engage in the research questions that will produce a set of outcomes that may apply in a range of other circumstances. And so it's almost too fit-for-purpose for broader application. Now, again, just like the prestige side, that's a really important component of society. You know, if someone's designing a better search and rescue system or better care for the elderly population, you want them to apply those tools. But I think that in that middle space, the other element between prestige and profit, there is time, and there is a need to inspire people to work on those broader societal outcomes. That still is the act of discovery in research.
Melissa Flagg 16:49 I 100% agree with you. I do want to just step back and say, when I talk about profit and prestige, because I am a little dramatic, it sounds like I'm super upset about it. I am upset about it only in that I believe there are some missing elements that, specifically in my country, are not well nurtured in the federal government. And I do think production and public good, these spaces should just be part of this Venn diagram, right? We have these two circles, and we nurture them really well with incentives. And then we have this third circle of sort of the public good, where philanthropy, I think, in the US tries to step in, and they do some great work producing specific outcomes for specific populations. I think one of the concerns that I have is that, first, there aren't a lot of incentives. So I spoke to a guy, for instance, from up in Washington state who had done some excellent research, and it had been published in a very high impact journal. But it was almost immediately applicable to a local problem. And the state wanted to pay him to try to apply it to their problem. The university literally didn't know how to take money from the state, because they were so highly tuned to the federal process, that they basically were like, it's not enough money to go to the effort to try to figure it out. Now, they eventually worked it out. But it took over a year. And it wasn't for any reason other than that our systems are so highly attuned and efficient for the primary goals and incentives that we've put in place, that it's very hard for us to nurture these other spaces. I think the second aspect, and I don't believe that this is necessarily exactly the same in every country, and I try to be really clear that I speak about this through a lens of the United States, because I think we have our own unique cultural opportunities and bounds and barriers, right? Everyone has their pros and cons.
A second aspect of this where I think the federal government really could have a role is that when you do have great organizations actually producing solutions in these spaces, right, that knowledge is often, the nuance around that knowledge is often lost. So, what were the characteristics of the problem that allowed the solution you applied to be effective? What did you try first that didn't work? Was the solution sustainable over time? It seems like it fixed it. Five years later, is it still fixed? Right, there's this nuance around solutions that is often not captured, and I feel like we don't value the capturing of lessons learned. And because we value novelty so heavily in the scientific system, even where lessons learned are captured, there's often not an interest in really diving into that and learning from it and applying it, because it automatically frames your work as incremental, as opposed to novel. And so we often, in fact, try to distance ourselves from those lessons learned, right, rather than build on them. And I guess what I would love to see is more prestige and more excitement attached to actually learning how to help communities thrive, rather than being attached so specifically to novelty. I just think it's a bad definition of prestige. I don't have a problem with prestige. I mean, certainly, I was a political appointee. It may be the biggest vanity title of all time, right? I mean, we all have an ego. But I do think it's a very narrow definition. Stefan Leslie 21:01 So, this is an interesting element, because prestige, or maybe excellence or quality, research quality, could be another, maybe a synonym for what you're talking about. We maybe have underestimated the role of certain components of what should go into that.
So I had been thinking about this as: the role of research, first and foremost, when we're talking about the space that we're in here, is to solve problems, pursue opportunities, manage risk, fix things for people. Like, if it's not doing that, if we're in this territory, then it's really not that cheap. So that's our objective; the condition has to be that the work is of a certain quality. The other way, which you opened my eyes to, is you can actually think of, well, maybe we need to actually redefine what quality research work actually means. So it is not just novelty, which has sort of got the dimension of, well, it's going to be published, that's the coin of the realm in the research world. But we are certainly heavily weighted towards the kind of rigor and discipline in research methods. And probably not so bad, or maybe getting a little better, on ensuring that there's legitimacy in the process, that we're talking to the right people, that we're doing things in a respectful way; anyone who's gone through the research ethics board will see the immediate visitation of the concept of legitimacy into the research process. But we're probably underweighting the concept of salience, that is, whether we're asking the right questions in the right way at the right time. And that's partly, I think, actually doing a better job at the outset of pursuing the outcome we actually need. And so, your comment about looking at something five years later, asking what were the ingredients that led it to be successful, is it still the case? Maybe we didn't actually define the outcomes we were looking for with enough precision to be able to manage towards it, monitor our progress towards it, and ensure that over time we will retain that kind of advantage that we were hoping to get out of the research endeavor to begin with. Melissa Flagg 23:21 I couldn't agree more. To me, this is such an opportunity, right?
I'll give an example, again, from the US, because it's what I know, not because I believe it's more important, right? I can only speak from my own experience. Many scientific funders in the United States federal government will follow up, like, if they give you a grant, they want you at the end of the year to send in all the papers you produced from it. But they don't say, hey, when we asked you why this was important when you applied for the grant, you said you were going to use it in your curriculum and teach students, and you were going to talk to this community about it, or whatever. They don't actually follow up on those things. They don't track those online for the community to find or engage or comment on. Right? And so, I would argue, we don't value salience enough to force it to be defined effectively, which means we also have to define it at the funder level, right, which is very uncomfortable, because it's not what we're trained to do. We have to define what we expect from people so that, when they submit a proposal, they know how to do it, right? They know what we're asking of them, and we have to track it. We have to ask them, hey, you're expected at the end of this grant to report out on: did this actually do what you said it was going to do? Right? Did you actually go out, while you were doing diabetes research, and speak to the community who has diabetes to find out if these paths are even relevant, right? Or if they're interested, or what is the real problem that needs to be solved? Are these even aligned in any way with the fundamental biggest problems that keep people from getting something treatable treated? I feel like the first part of this is getting our own house in order and knowing what it is that we want to achieve, even if not specifically, but like the characteristics of value in this domain and quality in this domain.
And then the second part of that is to track it so that people know we're serious, and that we care. And that, to me, is the ultimate accountability to the people who pay for this. We are taking money from people who are struggling to eat, we are taxing people who are working as servers in restaurants and as longshoremen and, you know, emptying lobster traps in Maine or whatever. And we're taking this from them, and we're putting it into these organizations, and we're promising them that this will give their kids a better life. We owe some accountability to that promise. And I feel like that should be more exciting to people in science than it is. Stefan Leslie 26:27 So there's two really important issues, which I think we need to spend some time on. The first is how you define those outcomes in a way that's meaningful. And what I mean by meaningful is not just that they are important, but that they are sufficiently precise that you can then mobilize, in this particular case, a research system around their resolution. So, you know, we have the Sustainable Development Goals, which is a set of idealized statements about what the world would look like if it were better. Those are in some respects outcome statements. But they don't give enough precision, nearly enough precision, for us to know how do we actually create a series of interlocking projects that are coordinated, that are working on the right things, that are absorbing the lessons from other projects that are nearby, that are learning from their colleagues and the communities and the industry around them? And so on. So I think there's a significant challenge around how do you take what amounts to a wicked or a super wicked problem at a global or local scale and actually fracture it into a set of research problems? And then the second thing which you identified is, how do you know you're actually making progress towards it?
So how do you actually hold yourself to account to realizing that outcome at the end, when what you control as a researcher is the progress of your research activities? You're bringing in people, you're doing projects, you're acquiring equipment, you're running experiments, whatever that may be. How do you then ensure that what you're doing is directly attached to that outcome in the end? Melissa Flagg 28:13 We need to ask people, like, okay, what are you specifically going to produce from your research? Like, are you going to go give a talk? Are you going to incorporate it into your curriculum? Are you going to invite the community into sort of educational conversations, so at least they know what's happening? You know, I don't know. We could talk all day about what those very discrete kinds of activities are. But the bigger question is, over five or 10 years, what would an organization like Research Nova Scotia want to do? If you went back and did an analysis of a single program, and you looked at all of the individual things that they did, and then tried to see, did it get you closer to a solution, right? I do feel like, also, it's very painful. And one of the things that I love about what I see in your work is that I think this work is high touch. I do not believe that you can fully automate and make efficient, from an accounting perspective, right, trust building and actually solving people's problems, because doing that means you have to listen to them, not talk at them and explain what you're doing for them. But like, close your mouth, attach your ear to your brain. Hear what people who use different words than you, and maybe aren't grammatically correct, didn't go to fancy schools or whatever, are saying, and like, respect it and value it, and then integrate that back into: okay, we had 10 stakeholder meetings, we heard these things, they sort of bundle into these types of problems.
Now, if we decompose those through just good, rigorous critical thinking and the scientific method: what parts of this are technical, are like natural sciences approaches, right? What parts of this really require some qualitative social science, engagement, education, research, etc., that we may need to partner with other funders on? And what aspects of this are literally just not science problems? But we have to keep them in mind, because in fact they may be the reason the scientific approach's outcomes are never adopted. And so if we don't work through community based engagement to solve those problems over time, right, then it's not gonna matter how good the research is, and we might as well not do it. I will just pause also; I want to give you a little anecdote, if you don't mind, of the thing that I think we need to avoid. I was working for an organization and a researcher came to me and said, hey, you've helped some researchers link up with some companies that want to use their research and commercialize. I said, yeah. They said, well, can you do it for me? I was like, well, send me some information and I'll chat with some folks. So it was chemistry, which is my jam, I understand it. I had some friends in some companies that we were working with, and I went and chatted with them. And they said very clearly, they're using a compound, a catalyst, that is illegal in Europe. God bless my country. And they were like, at some point, this will eventually not be legal to use in the US either, but you certainly can't export anything that you develop from this to Europe, so you're shutting down markets. So, a company like ours, which is global, we're never going to leverage this research. They need to go back and find a different catalyst. If they do that, we're very interested, great. I give this feedback and the man looks at me dead in the eye and says, okay, well, we'll just keep doing the research.
He's absolutely uninterested in changing the catalyst, and he will do this until he dies, and we will somehow fund him until he dies. Because he's still publishing, it's still novel, it's still technically quote, unquote, good science. But it will literally never be useful. Stefan Leslie 32:51 Is that because of an assumption that the challenge, and the most important part, is actually technical in nature, rather than reflecting what, in this particular case, the market needs or will soon need? Because that's what is actually going to put it in place. If you have a catalyst that is currently banned in Europe and eventually will be banned in North America, then that, I assume, is by and large a social structure, right? Because there's a decision-making body; it is not the outcome of a normative process where you can just determine that something is going to happen. Someone is making an assessment that it is banned for a particular reason. And so what's actually needed there is answering a different set of questions than this particular person was prepared to ask, let alone answer. Melissa Flagg 33:41 If you can bring in the end-point constraints early, and actually give those to the scientists as part of the hard problem. Like, we need a solution to this. You give them those constraints up front, and it makes the science harder. And so help them see that it's actually a more exciting scientific endeavor, that the novelty is actually higher, the prestige should be higher, if you're able to solve this problem within the constraints that you've been given, rather than just blue sky that never helps the end customer, never actually makes a difference. Stefan Leslie 34:27 So to me that says that part of the definition around the outcomes is not just what you want, but the space in which it has to exist. Some of which, you can't violate the laws of physics.
So there are certain things that science is already going to get right. You know, chemical reactions follow a certain set of rules, which don't tend to change that much. But then, inevitably, if it's going to intersect with people, or maybe even the environment, then you have to make choices, and that space is defined both by what people value, but also what people anticipate valuing in the future. And so, I know you've written about this, and something I've thought a lot about as well, is that the challenge was, in the immediate sense, developing a vaccine for SARS-CoV-2, but the actual outcome we were looking for was to keep the population safe, right, to reduce infection, keep people from dying. So, developing a vaccine, what a biochemist might do and an immunologist might do, and that set of disciplines, is an essential component; we were never going to get as far as we could have without that. But in and of itself, alone, it's insufficient, because at its core that problem involves more than just a technical challenge around how a particular pathogen interacts with the human body and how it transmits from mammal to mammal or mammal to human, or whatever it might be. And so, I think that when we're talking about community, that community among the researchers themselves is actually important, because you don't necessarily want to have, or could have, the chemist who is also equally capable of engaging on these sorts of social matters. But you do need to pair them with people who are, and you need to build a system that actually respects the interplay between those different disciplines. And I think you probably get that not just by saying, well, we need a social scientist on this because it involves people, and so once we're done, we'll hand it over to them and tell them, here, make this work for people.
I think what it does is it requires you to define the set of problems according to the full set of areas that you need to pursue, in order to have a realistic chance of an enduring solution at the end. Melissa Flagg 36:52 I couldn't agree more. There's this hierarchy of, like, intellectual elitism or something, where we place a certain group of people above everyone else, and then we believe that the narrative and the information flow should come from the quote, unquote, smartest people to everybody else. And everyone else should just innately trust that this is true. The problem is that these people live in bubbles, and they're very, very narrow in their specialties, right? And we live in a very complex world. And so you're almost certainly going to be wrong at some point. And once that happens, if you don't have a relationship with the rest of these people, they will very quickly start to mistrust you as a source. And now, instead of having a relationship where you sit down and have a conversation about what happened, everything just breaks down into people fighting and pointing fingers, right? I think that's super problematic. So I do feel like this idea that you actually need these relationships, and this trust building, and these equal conversations from day one, also just makes the science better. It makes the science better. It doesn't dilute it, it doesn't just slow you down, it isn't just an inefficiency, it isn't a burden on the scientists to have to listen. It is bringing in the constraints that will ultimately allow progression of the science to the application, to the solution of the problem, much more rapidly and efficiently downstream, and it will bring on board the kinds of champions that you simply cannot buy with money. It will bring the kind of trust to that definition period, and to the science itself, that you cannot buy with money.
It only comes with time and respect, and the nurturing of that relationship, and changing what you do sometimes because they need something else. And that kind of respect is so high touch. And governments don't like it. They want everything to be fast and automated and streamlined, and they want it to be the same for everybody so it's super fair, and all of these things, and I get it. I was that person. I oversaw that system. But I walked away from having overseen a system like that feeling very strongly that it is not going to help people solve their problems. Stefan Leslie 39:41 So, I actually do want to talk about what possible system we could envision that would address some of these issues, because in the description of your purpose, or perhaps your interest, maybe this is the company you were discussing, you talked about creating new institutional and organizational experiments to modernize how we fund and execute basic and applied science. I mean, this is great, because you're interested in experiments to modernize. So, you know, if science and research is, at its core, an experimental activity, well, we ought to equally experiment with how we're selecting, choosing, monitoring, valuing, publishing. And so I think this is a great space to be in, where we can talk about where these gaps are and where the opportunities lie, because you and I have both been in the science world in some way, shape, or form for quite a long time, and I hear the passion, I hear the love of the way it works, and the potential that it's got.
And so I think maybe we could talk a little bit about what kind of adjustments, or what kind of new pursuits, we could envision that could leave aside some of these chronic limitations we've identified and pursue some of those upsides or benefits we could see, by rethinking the problem, or just rethinking the structure in a way that values different things. Melissa Flagg 41:23 I do think it's crazy that we love to run experiments on other things, but we don't like to run experiments on ourselves as scientists, which I will never understand. Like, let's experiment with the system itself. How exciting! That's great science, right? Um, I think the first thing, when I think about these experiments, is to bucket the types of outcomes that we want, right? So some are discrete solutions in local places, right, where a thousand people are never going to have lead in their water again, or whatever, I don't know, you know, very discrete solutions. And then I think there's a second level to this in my mind, which is beginning to understand the characteristics of those various problems and the communities within which they sit, or exist, right? Which means the problem itself has a set of characteristics, but the environment within which it has to be solved also has a set of characteristics, right? If you're in a rural environment and you're dealing with water contamination, it's going to be very different than if you're in an urban environment: the pathogens will be different, the infrastructure will be different, the money that people have to tackle the problem will be different, the types of regulations in existence will be different, et cetera. A completely different environment, right? So then you have the ability to actually understand that, and then reach out and network and begin to populate: what are those problems?
What are the buckets of characteristics of environments and of the problems themselves? And then how do you start to engage philanthropy, industry, states, provinces, you know, the federal government, to see how they are interacting in these various types of environments? How do you bring people together to start to not only solve things, but also develop lessons learned from things they've done historically? Right? So some of the experimentation is almost the humanities. It's historical, right? It's figuring out what we've already accomplished, learning from that, and beginning to apply it where it makes sense in these different characteristic buckets. And then, when we're actually funding people, or when organizations are trying to solve a problem, or we're bringing groups together to solve a problem, how do we make sure there's some entity, or some way, that they're first given access to this context, right? Everyone that wants to apply for the grant, or everyone that wants to participate, is first given access to this full context, and they're expected to articulate their solutions and their proposals within that context. It could be that you believe this is the moment you can break the laws of physics. As a person who worked in DOD, I'll tell you, I'd rather not see a proposal that says they're breaking the laws of physics. I'm not going to shut it down without reading it, but the bar is pretty high. The bar is pretty high. You've got to convince me that there's a reason for me to think this is the day, today's the day we're going to revisit this, right? I'm open to it, but the bar is high. And that means you're going to have to show that you know all of the stuff that came before you, and why you believe you're proposing something new and different in this moment.
And so for me, this is simply giving communities this structure and access to this historical context, so that they can place themselves in it, and so that they're not trying to reinvent the wheel every single time. And so that we're not actually pitting organizations against each other based on how much time they had to get somebody to do the research to make the case. We're giving that to them. We're saying: we want you to start on the shoulders of giants, and we want you to go further. So we're giving you a step stool up to the shoulders, right? We're giving you a path. We're not trying to get everyone to reinvent that path, because that always prioritizes large institutions, wealthy institutions, and the rich get richer. And, like, I've got no problem with wealth, but I have a problem with assuming that wealth and intelligence somehow overlap in the Venn diagram. That is just not always the case. And so what I want is to see that everyone has access to this. And I think that is a fundamental experiment that we've just never run: that the role of the government isn't to just pit everyone against each other, it is actually to give them the resources they need to start from a level playing field. And then I want to see the novelty, then I want to see the quality. So I feel like there are maybe these three different buckets and roles of experimentation. And I'm super excited about all three of them. But I do think they're kind of different, and should be separated and experimented with, with the understanding that they sit in a broader context. Stephanie Reid 47:14 Our discussion today looked at the importance of redefining what research means, emphasizing salience and the needs of the consumer or user. If you enjoyed this episode, we invite you to listen to part two of the discussion, coming soon.
In the next episode, we will explore a new proactive and accountable investment in research funding created to better serve our communities and address the challenges faced by society. Thank you for listening, and we will see you next time.