High-Risk/High-Payoff Part 2
IARPA: Disbelief To Doubt Podcast
Episode 3: High-Risk/High-Payoff Part 2
Guest: Rob Rahmer
Dimitrios Donavos: Welcome back to IARPA: Disbelief to Doubt. I’m your host, Dimitrios Donavos. In Part 2 of this two-part episode, we continue our conversation with former Office of Analysis Director Rob Rahmer and discuss how Rob defines success in the Office of Analysis, what makes a problem "IARPA hard," and much more. Take a listen.
Dimitrios Donavos: What really separates IARPA from nearly all other funding agencies is this permission to fail. As you know, in academia, when you're applying for grants, you effectively have to show that you're already able to do the work. You have to send data in advance showing that the work you're proposing is even feasible. So when we're looking at the scale of an IARPA program and its goals, which oftentimes are to push the boundaries of science, we have to be willing to accept some failure. Do you feel that you need some degree of failure in order to balance the portfolio?
Rob Rahmer: For meeting the entirety of a program's end goals, I would say yes, you do. If everything is successful, maybe we're not trying hard enough. Maybe we're looking too incrementally. Again, that's why we go through the Heilmeier process. And failure is in the eye of the beholder. I see it as not meeting the final phase-three metrics, but many, many technologies get transitioned even if they don't meet those metrics, because we get feedback from the IC saying this may be better than the technology we're using today, and it's still maybe at least one order of magnitude improved. So yeah, we're going to take it. But programmatically, if we don't meet those metrics, we move on.
Dimitrios Donavos: At IARPA, we refer to program managers, or PMs, as force multipliers. For any future program manager who might be listening, can you unpack what that phrase means to you, especially considering you've been in that role?
Rob Rahmer: So there are several ways I view that term. One is either expanding a current community or building one. A topic may be so niche that you need to build the community yourself and build champions around the intelligence community. There's also all of the work that program managers do to spread the word of this new research, whether that's building a new community or expanding a current one. Our program managers engage academics. They're at conferences speaking and providing keynotes. And they're not just speaking about their research programs; they are ambassadors of IARPA, the Office of the Director of National Intelligence, and the intelligence community at large. So they take on many different roles. And we're one of the unique places in the intelligence community that actually has public and open engagements, because the majority of our work is unclassified and we engage researchers from around the world. So that's how I view force multipliers.
In addition to the external communities, there are communities inside the intelligence community, and program managers build those around a common problem. It may not apply to every agency, but one of the philosophies of several directors was that you can't just have one agency come to the table. Agencies have their own research organizations, so for IARPA to invest, we need a problem that's common across many agencies of the intelligence community.
Getting those champions to support research that may potentially solve that problem is another way program managers become force multipliers.
Dimitrios Donavos: You talked earlier in our discussion about the office, program development, and transition partners. Our listeners may not know what that last term means. Can you unpack it for us and talk about the critical role our transition partners play in transitioning elements, if not the entire outcome, of a program, not only for pushing the boundaries of science, but for ensuring the intelligence community stays ahead of the curve when it comes to avoiding technological surprise?
Rob Rahmer: Great question. So transition partners are members of the intelligence community. They may be researchers, they may have operational backgrounds, or they may have both. Let's start with the program development phase. We need to understand the perspectives of both sides: what research is a particular agency currently conducting that's in line with the goals of the program a program manager is developing, and what are the operational needs? That's important because we don't want to develop something that no one is going to use. We don't want it shelved after investing millions of dollars in a given program.
These transition partners understand their operational mission and the threats to that mission. So they provide that input to the program manager during the development phase so that we can build something that can counter those threats. The challenge is that many of them, at least on the operational side, are looking, as they should, at the problems of today. What the program managers have to do is work with them to find the people who are also looking strategically at threats over the horizon.
And so we need their help, because we're not going to develop a program in a silo that gets shelved. We also need to understand the state of the art, or what a particular agency is currently looking at and investing in, and maybe build upon that. And maybe an agency can provide an operational baseline, whatever the case, so that we can measure against it and understand: is this really going well beyond what the IC currently has in use?
Dimitrios Donavos: Since you've occupied so many varied roles within the organization, I'm curious: you talked earlier about lessons learned. Can you share with us what a lesson learned is for you over your tenure at IARPA, especially from your perspective in a leadership role?
Rob Rahmer: There are so many to choose from. I'm always going back and looking at what I could have done differently. From a program manager perspective: data, data, data. Data is a killer. I had plan A, plan B, and plan C, and luckily I had a plan C, because plan B went away quickly in my program. Then finally plan C went away when a champion at an organization that was going to share data with us left, and that effectively killed the program. So having enough data to support the program, and having backup plans, is critical. And I think you need commitments to the data, whether we have to collect it ourselves, use licenses to leverage it, or have someone collect it for us using our test and evaluation partners or even the performers.
Building much of that into the program would have been beneficial, especially for CAUSE. For example, one thing I didn't leverage and should have is having the performers come to the table with not only their solution, but also the data they could share with all the other teams. That wasn't baked into the program initially.
And I think we've instilled that in some of our other programs. We've actually made it part of the proposal process: you need to bring data, and it has to be shareable with the other teams. That way the risk isn't entirely on the government for providing the data across teams. So that was huge. During my time as office director, we've instilled that in many programs, not all of them, but where it makes sense.
Dimitrios Donavos: So you've had a prolific run of starting programs in the Office of Analysis. We've talked about the importance of the Heilmeier Catechism, but clearly you have an eye for what it takes to start a successful program. Can you tell our listeners, from your perspective, according to Rob Rahmer, what does a successful program in the Office of Analysis look like?
Rob Rahmer: Well, from its inception: if you already know how to build something and have a plan, not to be blunt, but that's just who I am, don't waste our time. If you already have a patent and you know how to build it, get some investment money and go build it. That doesn't mean the problem isn't there. It doesn't mean your idea is bad. It's just not our mission. So for me, a successful program starts with data. One, we've developed data that can last many years for future programs. Even if a program fails, we've developed robust, annotated data sets that can be used. If you look at computer vision, cybersecurity, and so forth, those will last many times longer than the programs, and the same is true for human language technology, because we've ground-truthed much of the data, at least in the view of the program. And it will be beneficial not just to the intelligence community; much of the data is released publicly through various sources and other government agencies. So I'd say that's one. Two, a successful program is well documented and well managed, in the sense that you're not married to the performer teams. What I mean is you can be objective and make the tough decisions. If you have to make cuts or reductions, you really need to look at the teams and make those reductions, because we want to be good stewards of the taxpayers' money
and give that money back, maybe to another IARPA program or another part of your own program, maybe to more enhanced test and evaluation. So there's that. And then the end result is that we have some component or entire system that is transitionable, that is going to be utilized by the intelligence community. Ultimately, that's why we're here. We look three to five years out once a program is approved to move forward. And if we have something we can transition, whether it's the entire system or technology or some component, and it can be leveraged by multiple agencies or even commercialized, that's a win.
Dimitrios Donavos: So what is interesting is that we talk about problems being IARPA hard when program managers are thinking of pitching a program. And you've touched a little bit on this, but can you describe in your own words what it means for a problem to be IARPA hard?
Rob Rahmer: So I always go back to the Heilmeier Catechism when I think through that. Are we doing something incremental, or are we really changing the game, moving the needle? Are we doing something revolutionary? I think going through the Heilmeier Catechism and answering those questions sort of proves it out.
I mentioned those naysayers, as Dr. Highnam put it to me. You have skepticism, you have skeptics, and then you have those that think either you can't do it or we already do it. Proving that through the New Start pitch is critical to proving that it's IARPA hard. Maybe we have to develop new metrics. Maybe it's so far over the horizon that a particular agency really isn't looking at it, and that's what we're here for. We are here to help them look well beyond the horizon at the threats of tomorrow, or even threats of today that may evolve into different threats, and to build programs and technologies that can be there when ready. So when we look at IARPA hard, we build in milestones and measure progress throughout the program to make sure we're on track.
And when we reach the end of a phase, we have those measures to compare against. When those milestones are built in with those metrics, with input from the government transition partners, we know we're there. Oftentimes we get metrics that are looking at today, maybe one year down the road, and then we put stretch goals into the program that are almost, I want to say, unbelievable, but certainly very, very difficult. That's where you get the naysayers, the skeptics who say you're not going to get there. But maybe we will. So I think the Heilmeier Catechism definitely fleshes that out for us.
Dimitrios Donavos: I don't want you to pick favorites, but I want you to think about the portfolio of programs that have come from the Office of Analysis. You've touched on a few of them, our HLT programs, for example, but we have programs across the spectrum: in HLT, in cyber, in biometrics, in video processing. Talk to our listeners about programs they may not be familiar with and how those programs have made an impact, not just in the intelligence community but in the open scientific community.
Rob Rahmer: Well, at least you didn't really ask me to pick my favorite, my favorite child, I guess, for those that are going to be listening, because I'm not going to give an answer to that one. But we can look across the board. There are many things we can't talk about in this forum in terms of transitions to partners and how they're being used, but there are many. Human language technology programs and their data sets are extremely important, I think, for the Defense Language Institute and other organizations. You're looking at machine translation, and annotated data sets providing ground truth for future research. HIATUS, one of Tim McKinnon's programs, is looking at detecting machine-generated text, things like ChatGPT, and has early wins beyond the current state of the art. Those are programs early in their life cycle, but the wins are significant early, and they can be easily measured. Related to that, we have the MATERIAL program, which came before the BETTER program, with lots of data transitions, and transitions I really can't speak about here. But going back to data, that's always significant and an easy impact outside of the technologies being used, because oftentimes we see technologies early in programs that may be performing better than ones commonly used, I'll just say, not to compare against commercially available technologies.
And then as the program evolves, those technologies get better. So it's always a race to get there. Many times we outpace those technologies and then things catch up, and at some point we're at the end of the program where we have to say, yeah, industry has advanced beyond at least some component of the program, and we move on. But the human language technology programs have had a lot of transitions, and they're one of the few types of programs I can talk about here.
Dimitrios Donavos: So specifically referencing human language technology programs, our listeners might be surprised to know that a lot of the data sets we've collected have actually been made available to the public and can be found in locations like the Linguistic Data Consortium. And on our podcast website, you will see a resources tab listing materials available to the public; I encourage our listeners to explore it. Rob, you talked about thinking over the horizon, and your term at IARPA is going to be ending in the coming months. Given the speed of innovation within the intelligence community, we are likely going to face a landscape of technological challenges that look very different from the challenges of just a few years ago, when you took the position of office director. Looking over the horizon, what advice would you offer your successor for leading the Office of Analysis through its next chapter?
Rob Rahmer: My first advice would be: don't listen to me, because I'm moving on, and I have my own ideas about where new programs and the office need to be. But I think collaboration with partners in the intelligence community, understanding their problems and needs, finding the people who are looking well ahead, and leveraging their knowledge of their missions to help determine where IARPA should be investing is going to be critical. Additionally, they need to work with the program manager experts we have here and understand what trends they're seeing in their specific domains. That's one thing I did when I came on board at IARPA. Take biology, for example: I'm nowhere near an expert there, so I had to leverage the experts in that field and other domains and ask, okay, what are the trends now? You have your programs, here's how they're progressing; what's happening in the next three to five years that we need to get ahead of, and where should we develop programs? So I would offer that advice to my successor: work down, work across, and work up with the new director, Dr. Rick Muller.
Dimitrios Donavos: And I would also like to just ask where you see some of the biggest challenges moving forward. We've seen rapid and sometimes startling development in the AI domain. And I think that that is going to be something our listeners are very interested in exploring. Can you talk to us about where you see some of those challenges in the next five years?
Rob Rahmer: One of the questions I have is: what does next-generation biometrics look like? I'll just leave that out there. We have facial recognition. You have programs like BRIAR that leverage face, gait, and other modalities. What is next-generation biometrics, both the sensing and the analysis? That's something to think about, and it also ties into identity intelligence.
AI security, or trust and security, depending on your background: looking at ways to build security into models, especially if there are multiple organizations using their own training data and developing their own portions of models. Maybe it's an ensemble. How does that look? How do you secure these models? How do you make them robust? So those are two areas I can talk about here that I think should be considered.
Dimitrios Donavos: One of the questions I'm really curious to ask: can you describe one of your most challenging moments at IARPA, and also the most rewarding, and what you learned about yourself experiencing the highs and the lows?
Rob Rahmer: I would say one of the most challenging things is working within various budget profiles. You face cuts, you have surpluses, deficits; there are various things thrown at you, and you try to adjust. So that's something that's always evolving. The type of funding we get isn't well understood, two-year funds versus typical one-year funds, so we always have to help people understand that, and we have to navigate within the structure we have. I would say that's one of the areas that's always been a challenge for us, and for me, navigating through that. We also have resource constraints at various times. We've started a lot of programs, and we need support from acquisitions and contracts, and we need to make sure they have enough resources to support the scale of what we've done here. So that's another area where we need help to support these programs as we move forward.
The other question was related to the most satisfying aspect of my role as office director. There are several. When we develop a new program with a program manager, they come to us with an idea. Maybe they have a clearance, maybe they don't; maybe they don't understand the full scope of various IC missions, and they may be looking at only one part of the problem or one aspect of the research. Part of the job is working with them to see the bigger picture, to understand that maybe there's a bigger problem, or maybe there's another part of this research that has to be done because we're getting the data anyway. And one of my philosophies has been looking at both the red and the blue side of it, to take the cyber perspective.
Take HIATUS, for example. We're looking at privacy, but we're also looking at attribution of text, authorship of text. In the original idea that came through, Tim wanted to focus just on the privacy aspect. I said, yes, but we also need to understand attribution, especially with these large language models and machine-generated text. How can we distinguish human from machine, and then the various machines being used to generate text, and also the different humans generating text? Developing a program with that mindset and then seeing it come to light with early successes is very rewarding. Seeing program managers build their communities and ultimately start new programs, and of course the transition piece, is always gratifying.
One of the things I enjoy is engaging the community, and not just the research community, but also organizations, academics who really don't understand that they can work with us. They see us as part of the intelligence community and believe: we don't have facility clearances, we don't have personnel clearances, we didn't know we could work with you. And it's just a matter of going out and talking to them. We were at North Carolina A&T last fall, and they have great research programs there in areas where we have programs or are developing future programs, and it was great to make those connections. So engaging with those organizations, speaking at conferences, giving keynotes, or just sitting on panels to spread the word or talk about lessons learned. There we get to meet researchers and students, talk to people from other government organizations, and make those connections. That's very gratifying, because those are data points I can bring back to build relationships for maybe new program managers. We open up the pool of research performers to get a broader, more diverse set of proposals to help solve our problems. I think that's probably one of the most rewarding aspects of my role here, outside of hiring some of the best and brightest program managers in the world.
Dimitrios Donavos: Rob, can you highlight for our listeners where they would go and what the mechanisms are for them to engage with IARPA?
Rob Rahmer: Well, the first stop is IARPA.gov, where you will find a list of open solicitations, whether requests for information or broad agency announcements, BAAs. You can also see some of the more recently closed ones. Looking at the program life cycle: oftentimes, as a program manager is developing a program and trying to answer the Heilmeier Catechism, they may not have all of the knowledge, and the folks they reach out to may not have answers for them either. So we put out, and this is a clue for everyone: oftentimes, when we put out a request for information, that means we're interested in a particular area. It may not be what you think it is, but we are trying to develop a program, and there probably will be a program in that space sometime in the next six months to a year.
So that's the first way to engage with IARPA. In requests for information, RFIs, we ask specific questions. Oftentimes we see what are essentially proposals submitted, and that's great, that's part of the process, but we really want specific answers. Speaking for the Office of Analysis, a lot of times those questions are around data or metrics. We want to understand. By this point the program manager has done some diligence, looked through the literature, looked at what data sets are available, but it comes to me and I say: this really isn't sufficient right now; let's put out the RFI to get more answers and see what academia and industry are doing. What we really want are the answers to those questions in the RFI. If you have some ideas or you've done work in the space, you can add that, but page counts are limited. So I think that's the first way to engage us.
We post RFIs not just on our website but also on SAM.gov. You should register there and follow IARPA to get notifications when we release such documents and requests. The next step is proposers' days. Once a New Start pitch is approved, the program manager will announce and hold a proposers' day. This is the first engagement that academics, folks from industry, and researchers in general will have with the program manager and IARPA about a specific program. Here the program manager will release as much information as possible about the program: what it's trying to accomplish, its structure, metrics, and timelines. The only thing we don't release is the overall funding profile for the program. This is also a forum for researchers to team up, meaning to see how they want to structure their teams for writing proposals, and we provide that forum; the government typically leaves in the afternoon after the Q&A session. And speaking of Q&As, we allow researchers to ask questions. We hold hybrid events now; we used to do all of these in person, and now we hold them in person and through WebEx, I think.
And so now we've gone from maybe 75 potential researchers attending to, I think, the biggest one at almost 500. So we've scaled quite a bit. For the in-person folks, you get an opportunity to meet, shake hands, and exchange business cards. We also offer lightning talks at the end of the session, where you get five minutes to speak about your capabilities and how they relate to the program. So that's the second opportunity. And then finally, based on the questions and feedback the program manager receives, we'll take that back.
I'll work with the program manager to see whether we need to restructure the program based on some of those questions, because as we develop these programs, we're not just looking at it from an IARPA perspective; we're pulling in feedback from across the IC on what to incorporate into the final broad agency announcement.
And then we may change course a little bit. We may update metrics, and we look at our data profile for starting the program. The next step is to release the broad agency announcement, the BAA, which will also go to SAM.gov if you're registered for notifications. This is the official release of the solicitation for you to write proposals, and oftentimes you have somewhere between 45 and 60 days to write your proposal and submit it to IARPA.
So those are the main ways to engage with IARPA. We do not have open BAAs for seedling ideas, but you can always contact a program manager to talk about capabilities you may have. Or, if you're a program manager candidate, you could talk about new ideas with me or the program manager
in that specific domain to see if it's something that would be of interest. So there are various avenues. And I think most of our program managers' email addresses and phone numbers are posted on our public website at IARPA.gov.
Dimitrios Donavos: Can you share with our audience one thing they might be surprised to learn about you?
Rob Rahmer: Sure, I can give a couple of things. For those that know me, especially after the most recent holiday party, and if you look at my door: I often ask senseless movie trivia questions, oftentimes from comedies. I post those and email them to my program managers. They're quotes from various movies many of us have seen, just a way to joke around and build camaraderie in the office. Sometimes there's a science twist to it. I've had some from Back to the Future, looking at quotes from that movie or what things have changed. I even post images of things from movies. For example, many people don't know that the Michael Myers mask used in the Halloween series is actually a cast of William Shatner from Star Trek. Not many people know that.
So I just post senseless things like that, and that trickles into the holiday parties with some fun competition and prizes. So that's one thing. Playing video games, doing math, and watching movies in the seventies and eighties, it was a fun ride for those types of movies. Also, like many of us with a math and statistics background, I've been playing fantasy sports since early in high school, and in the last decade that's evolved into what they call daily fantasy sports. Back in the old days, before the season started, we'd have a fun draft, have a draft party, and you'd pick your players by looking at their old statistics, trying not to judge based only on what they did in the past, though oftentimes that was done. As the world evolved and more data became available, things got more competitive, and daily fantasy sports came around, with many different platforms to choose from. I was very competitive in college fantasy football specifically. I had a great approach I used to rank players, and it proved very fruitful from a prize perspective. But when I became a program manager at IARPA and then office director, that pretty much ended, because I have little time to have fun. That was my hobby every week, especially during the college football season. What was unique about it: in the National Football League there are so many resources, and there are still only 32 teams, and it's the same for baseball, a smaller pool of teams and players. You need to figure out the variance, where these high-risk players are, so you look at high-risk, high-reward. For college, there are a lot more teams out there than NFL teams, so there wasn't a lot of information being aggregated, and I had a way to do that. I spent a lot of evenings on it just because it was fun; I love to watch college football. It's a hobby that maybe I'll go back to eventually, but the world's changed a little bit, so it's something I still look at closely. Coming at it from my math and statistics background and taking risks, it's always a fun venture to think about what has happened in the past and then try to predict the future. My system actually worked for a long time, at least until I stopped.
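[Editor's note: a minimal sketch of the variance-based, high-risk/high-reward ranking idea Rob describes. The player names, point totals, and risk_appetite weighting are invented for illustration; his actual system has not been published.]

```python
# Hypothetical illustration: rank fantasy players by projected points,
# with a tunable bonus for volatility ("high-risk, high-reward").
from statistics import mean, stdev

# Invented weekly fantasy-point histories for three quarterbacks.
history = {
    "QB Steady": [18, 22, 20, 19, 21],   # consistent producer
    "QB Boom":   [5, 35, 8, 31, 9],      # boom-or-bust upside
    "QB Floor":  [14, 15, 16, 13, 15],   # low ceiling, low floor
}

def score(points, risk_appetite=0.5):
    """Projected value: historical mean plus a weighted bonus for variance."""
    return mean(points) + risk_appetite * stdev(points)

# Rank players; a higher risk_appetite surfaces the boom-or-bust picks.
for name, pts in sorted(history.items(), key=lambda kv: score(kv[1]), reverse=True):
    print(f"{name}: mean={mean(pts):.1f} stdev={stdev(pts):.1f} score={score(pts):.1f}")
```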
Something else I look at as a hobby is math in different areas of life. Take songs, for example: why people write songs and how they compose them, lyrics and music. One of the things, I think it was on a Joe Rogan Experience podcast...
They had Maynard James Keenan from Tool; I don't know if you're familiar with that band, but they're still going. It was an interesting interview because they talked about phi, the golden spiral, and Fibonacci sequences, and how the song was developed. Many of these patterns occur naturally in plants and other areas of life, and I think even in faces; there's some sort of phi ratio with faces. Things like that draw my attention, and how they created that song, Lateralus, is just interesting. If somebody wants to look that up, it's kind of neat. Those are the kinds of things I get drawn into; applying math to art, perhaps, is something that interests me.
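[Editor's note: a short sketch of the math behind that discussion. Ratios of consecutive Fibonacci numbers converge to phi, the golden ratio (about 1.618), the same constant behind the golden spiral mentioned above; "Lateralus" is widely noted for verse syllable counts that follow the Fibonacci sequence.]

```python
# Show consecutive Fibonacci ratios converging to phi, the golden ratio.
def fib(n):
    """Return the n-th Fibonacci number (1-indexed: 1, 1, 2, 3, 5, ...)."""
    a, b = 1, 1
    for _ in range(n - 1):
        a, b = b, a + b
    return a

phi = (1 + 5 ** 0.5) / 2  # ~1.6180339887

for n in range(2, 16):
    ratio = fib(n + 1) / fib(n)
    print(f"F({n+1})/F({n}) = {ratio:.10f}   error = {abs(phi - ratio):.2e}")
```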
Dimitrios Donavos: Rob, this has been a pleasure. I really enjoyed it, and I hope you did as well. And thank you again for your time.
Rob Rahmer: Dimitrios, thank you.
[Outro Music]
Dimitrios Donavos: Thank you for joining us. For more information about IARPA and this podcast series, visit us at I-A-R-P-A.gov. You can also join the conversation by following us on LinkedIn and on X, formerly Twitter, @IARPAnews.
