Dear reader,
Welcome to episode one of my new podcast: discourse brunch. You can watch it by clicking above or in the Substack app, or add it to Spotify, Apple Podcasts, or Pocket Casts.1
I spent a long time debating whether to do a podcast. Editing seemed like a lot of work, and posting video of myself makes me feel intense shame.2 But I wanted an excuse to talk with friends who know more than me and embody alternate perspectives on the topics I write about.
Once I finished recording today’s episode with my friend Arjun Ramani, I was certain this was the right choice. He’s a fellow niche-tech-ideology-watcher, so we had a bunch of fun—I always learn a ton from him. Consider this episode an accompaniment to my “tech right (disambiguation)” post from last week. Among other things, we discuss:
Mark Zuckerberg’s new look
PG’s new essay on The Origins of Wokeness
Why tech leaders admire Malcolm X and Robert Moses
Arjun’s conversations with Altman, Collison, and Lonsdale
How 21st century billionaires differ from 20th century ones
Non-Silicon-Valley reasons for R&D mania
Growth rates in the US, India, and UK
Buying the dip on effective altruism
Cars as an analogy for AI impacts
Gains from trade (with AGI)
A full edited transcript, and list of links/books, is below. And please send me feedback on the podcast if you listen—structure, audio quality, whatever!
Episode transcript
This transcript has been lightly edited for length & clarity.
Jasmine Sun (00:00)
Hello! For the inaugural episode of the podcast, I'm inviting my good friend Arjun Ramani. We've known each other since freshman year of Stanford when we got into a late night dorm argument about critical theory in high school debate (which says a lot about our friendship).
Arjun is currently a PhD candidate in economics at MIT. And for the last few years, he's been a correspondent for The Economist covering topics like finance, AI and India. Welcome!
Arjun (00:25)
Thanks, Jasmine. I remember that conversation because everyone else in that room was talking about like CS 106 or something, and we sort of just checked out. That was great. And you've changed a lot since then. You're still very academic, intellectual, but also much more pragmatic.
Jasmine Sun (00:49)
I mean, I literally just quit my product job to be a poster, so I'm not sure if that's true.
Zuck’s new style
Jasmine Sun (00:49)
But speaking of people who have changed a lot, what do you think of Mark Zuckerberg's new look?
Arjun (01:04)
I'm trying to figure out my own style, so I respect him for the mid-career shift.
It fits into the broader question with Zuck of whether it's performative for recruiting. He had a bad image problem after all the 2016 stuff. And is he currying favor with Trump, or is his true self coming out? Like, was he always like this and just felt stifled by being a super young CEO and the woke mob coming after him? I don't know. What do you think?
Jasmine Sun (01:37)
As much as I don't want to say so, I think it does look good. It is a response to like shifts in wokeness, but it also fits him better than the gray t-shirt robot thing that was going on before. So I'm happy for him.
PG on wokeness
Jasmine Sun (01:37)
Anyway, let's get into it. A couple days ago, you texted me this new essay that PG just dropped: The Origins of Wokeness. He starts by defining this old word, prig: “a self-righteously moralistic person who behaves as if superior to others.” And he's basically trying to say that wokeness is not a new phenomenon. There have always been prigs in society who want to impose rules upon others. I'm curious what you thought of the essay.
Arjun (02:32)
So I really like Paul. Big fan of his writing in general. I think this essay was more mixed. I agree with the historical claims: wokeness is not new. And with the idea that young people are impressionable when they develop their ideologies, and when they eventually come into positions of power, that's what drives shifts in elite cultural opinion. But it's possible to overstate the wokelash. There are some shifts that probably will persist, like the shift in society's tolerance for sexist behavior in firms. Whereas in other ways, things are shifting backwards, like on police or immigration.
He makes this point that we can treat wokeness like a religion. And that didn't land with me, because it's just a debate over what is within the morally acceptable boundaries of discourse, and that's different from religion.
Jasmine Sun (03:22)
Could you say more about that distinction? Is it more that religions have specific texts that they hew to, whereas wokeness is definitionally about constantly shifting where the bounds are?
Arjun (03:55)
Yeah, you could argue that there are ideas from religions that constantly become mainstream—like a lot of modern American culture is downstream of Christianity, and Indian culture is super influenced by Hinduism.
But wokeness is about the boundaries of discourse, what's considered correct. There's always going to be this negotiation. With religion we agreed in liberal society that you're not supposed to try to convince other people of your religion. Whereas it is fine to try to make progress philosophically or morally through discourse, through movements, through political arguments.
Jasmine Sun (04:55)
The other thing that was interesting to me about the essay is the meta: Why did PG choose to write and publish this in 2025?
Peak woke is clearly over. Trump is in office, you guys won. We've been having these conversations for many years now. And I've been constantly surprised by how many tech luminaries are really, really, really focused on wokeness. And it's not just the “tech right.” PG is not a right-winger. He was, very loudly, a Kamala voter, and broke with a lot of peers over Israel-Palestine. So why do you think that he and others are still so concerned with this?
Arjun (05:43)
Funny story: In July 2022, The Economist ran a cover story titled “Peak progressive,” trying to make this argument that we'd passed peak woke. And I interviewed PG for a story, and we talked about this for a really long time. He was super fixated on it. So this is not a new idea for him.
But I think you can view the recent discourse as him and other tech elites wanting to process and rationalize the new political order publicly, so they can defend their own place in it. Someone like him is sort of threading the needle—I'm antiwoke, but I still voted for Kamala—and explaining why that's justified given the shift in norms. What do you think?
Jasmine Sun (06:37)
There is potentially a thing going on where after being flamed for being a Kamala supporter, he wants to show peers that he's still on the right side, i.e. the anti-woke side, and it's a signaling thing of being an independent. But he's also an essayist, so sometimes you just need a good topic.
Causes of the “tech right”
Jasmine Sun (06:37)
I've been thinking a lot about: What is the tech right? Where did it come from? Why is it happening? And I feel like there's two big buckets of theories for the rise of the tech right.
One is the material bucket: The end of zero interest rates meant that VCs got interested in defense and government contracts. Biden and Lina Khan were really hostile to Big Tech. Or the one that's popular among the left: billionaires want lower taxes, so of course they're Republicans.
But there's also this cultural explanation: there's something about wokeness that was uniquely radicalizing for tech folks. Workers at companies, because of wokeness, were asking for things that impeded productivity and were annoying to the CEOs. Or journalists were really mean to tech people and constantly told negative stories about tech billionaires even when they were doing reasonable things, like Zuckerberg funding a hospital. Or content moderation went way overboard. And this is more than a purely economic rational self-interest thing.
Do you think that material or cultural concerns were a bigger factor in the rise of the tech right?
Arjun (08:33)
I'm going to give the classic economist non-answer. I think they're both true and it's hard to separate because they're perfectly correlated.
The Lina Khan kind of neo-Brandeisian argument is that big market concentration is bad partly because of its social and cultural effects. And on the tech side, being pro-progress is about letting companies grow, which makes money, but it's also about this aesthetic of pursuing growth for its own sake, which is a cultural and values argument. That's Marc Andreessen's recent super long essay on why AI safety is bad.
That said, I do think a big chunk of the recent post-election stuff is very materialistic. Consider the counterfactual if Kamala had won: Would Zuck be doing what he's doing right now? Obviously not. It really reminds me of when I was in India and looking at how CEOs basically treated the prime minister Narendra Modi. When you have strongman rulers, supposedly independent CEOs line up like courtiers to suck up so they can curry their favor. And now we're going to see a bunch of protectionism with tariffs, and you're going to see everyone asking for exemptions, arguing why their industry is strategic, why they are a good guy.
I'm very curious to see what happens with this whole TikTok thing. I heard part of the reason Zuck is doing so much right now is because he wants to buy TikTok. And he's also looking for air cover. Anything that Elon does—he's the first buddy—is all good to go. So you can't tell me what I'm doing is wrong if I'm just doing the same thing Elon is. Put in Community Notes and go to the inauguration and all that stuff.
The last thing that comes to mind on this is a point you've made to me many times, which is you have to distinguish the capitalist and labor class in tech. Donations from executives massively shifted this last political cycle, from the vast majority being blue to the majority being red. If you look at a sample of workers in the tech industry, I'd guess this shift would be much less. SF County wasn't that much more in favor of Trump than before. That would point to it being more of a material thing. That said, if the intensity of preference among the working class went down—where they're less likely to boycott or change decisions about which firms they go to, which I think is true—then the power in this market shifts more towards the investors and founders.
Jasmine Sun (11:03)
That's a good point. I am curious to dig more into the worker-level shifts.
Obviously this is a biased and imperfect microcosm, but we were both at Stanford during peak techlash. There was a lot of No Tech for ICE stuff going on—which, disclosure, I was somewhat involved with. And these days you have a lot of energy around Stanford defense clubs. Working for Anduril and Palantir is awesome again, and we should all be hacking for defense. I'm thinking of doing a reporting project where I go down to campus and talk to a bunch of younger folks about their own political shifts, and what kinds of companies they are interested in.
Maybe this is more unique to Stanford than tech workers writ large, but because tech leaders are quite public about their beliefs, it does have influence on like what younger and aspiring tech workers also believe and choose to do. And you don't want to come off as contrarian to the tech leaders, because are you going to be able to raise capital if you do that?
Arjun (12:44)
Actually the usage of blog posts and Twitter to create a worldview and convince people you're cool is maybe one of the biggest ways in which modern tech is different than past industries. Before you also wanted to recruit and convince regulators of your worldview, but you didn't have direct access to people like you do now.
Jasmine Sun (13:14)
Like Lulu Meservey, who currently runs her own comms firm—her whole thing is “going direct.” Founders should tweet and do Substack and not try to mediate their visions through the mainstream media, which is often quite negative on tech, probably to an extreme extent. So I get that.
Arjun (13:33)
Do you know what happened to the American Dynamism Club? Wasn't there some club that was trying to have some party with A16Z and then they didn't get approved, so they did it off campus?
Jasmine Sun (13:48)
I have no idea, but that is such a good lead. I'll go find out.
Arjun (13:59)
It's funny because Steve Blank, he's one of the startup guys at Stanford who teaches the Lean Launchpad class. He's super in that world.
Jasmine Sun (14:08)
We've had “Hacking for Defense” with Condi Rice like forever. It was just kind of uncool when we were in school.
Arjun (14:16)
Yeah, that was a very unusual period. It was all tech ethics. I wonder how much of that sticks.
Jasmine Sun (14:21)
I don't know, it's interesting.
Builders vs. prigs
Jasmine Sun (14:21)
My overall thought on the cultural versus material debate is that the left-liberal camp still underrates the cultural stuff. I think you're right that, considering the counterfactual, would Zuckerberg be saying “Meta is too feminine now” if Kamala had won—obviously no. At the same time, when I read PG's essay, I was like, oh, this is a complaint about bureaucracy as much as it's a complaint about politics.
It's this cultural thing of builders in, prigs out. There are all these memes like “high agency” or “founder mode” or “live players” or “you can just do things.” My friend Clara, who edits Asterisk Magazine, was telling me about some conference where someone called it “Robert Moses libertarianism.” What does that even mean? And I think this is incoherent, except that Moses was really going founder mode. He was a high-agency guy. He was very effective, right? And there is some disposition where tech, because it is the industry of innovators and disruptors, is just like: Anybody who's super effective and hacks the system, whether it's from the inside or the outside, whether it's in tech or in politics, they get that respect.
That relates to why I see progress as the primary coalition rather than left or right. Or as Lonsdale calls it, the “builder class.” There is economic self-interest, obviously, but it's also an aesthetic preference.
Arjun (15:51)
You could reductively frame it as good is good, right? But, you know, I do think what you said about bureaucracy and procedure is the difference. Should you just follow what you think is good and ignore the rules of the game? Ignore the prigs or not?
Jasmine Sun (16:02)
One funny thing here that I was thinking about is how during the pandemic, I was doing all of these virtual book clubs with my friends. The first couple of books I chose were racial justice books because it was post-2020 protests. So we did Just Mercy by Bryan Stevenson. And then the second one was The Autobiography of Malcolm X.
The majority of my friends who wanted to read it were interested in the racial justice stuff. And then our friend Felix also joined the book club. We were going around and all saying like, I'm interested because I want to learn about black power or movement building and whatever. And Felix was like, I wanted to read it because Malcolm X is a great man.
I thought it was so funny, because it is true that there are so many tech folks who really like The Autobiography of Malcolm X. I don't think it's about his politics. It's not like they're black-power-heads. They're just like, yeah, that guy, great man!
Arjun (17:12)
That's the same thing with Robert Caro and Robert Moses! You got to read him because he's the great man of government. The book is not super flattering about him, but I think you can read it and still get a flattering impression of him because he got so much stuff done.
Jasmine Sun (17:33)
That was how I felt about the Malcolm X autobiography too. Those of us who came to it because we were interested in racial justice came away with a worse impression of Malcolm X because he was, for example, a real womanizer. My impression is still positive, but it made him look like a more complex character.
Whereas if you come into that book with a great man reading, then he's a badass. Hugh was talking about how it's one of his favorite books because there's this bit where he loads up a gun with blanks, doesn't tell anybody, goes around to his friends and allies to play Russian roulette, then points it at his own head and fires it too. And it's just like, is that not the most badass shit in the world? I'm like, yeah, I guess it is.
Arjun (18:30)
I think you made this point in one of your recent Substack posts: that kind of shift to focus on progress is not really split along traditional party lines.
You could even say that the Biden administration agrees with builders in, prigs out, right? Biden didn't want advice from academics the way Obama did; the people at the top of the administration were operators and lawyers as opposed to academics. And the whole reindustrialization of America, passing these massive spending packages, the whole moniker of supply-side progressivism, NEPA reform. Lots of people at the top of the administration would agree that regulation is preventing them from achieving their goals.
So yeah, I guess how much of it is the tech ideology driving things versus sort of a natural reaction to, you know, the fact that we are facing inflation and we were in this new cold war with China. It's difficult to disentangle.
Jasmine Sun (19:43)
You mentioned that there are historical analogues in recent US history that you feel are relevant, right?
Arjun (19:49)
Yeah, I just think the 70s and 80s are the classic analogue, with the Great Inflation: you had a big supply shock to energy markets—OPEC in 1973—and then inflation. In response to that, you got a bunch of deregulation from both Carter and Reagan, plus tax cuts, because the body politic was annoyed that their living standards fell. So you get a natural reaction to that.
So if we didn't have a kind of tech-driven progress studies coalition today, would we still be having the same response? I think probably yes. It would probably be flavored differently. But there would probably be something roughly in that direction.
« ADDENDUM - not recorded, but something Arjun wanted to add »
It’s easy to overemphasize the association between Silicon Valley and movements like EA and progress studies (I’m often guilty of this). EA sort of has three semi-independent origin stories: (i) Oxford philosophy, e.g. Will MacAskill and organisations like 80,000 Hours and the Centre for Effective Altruism; (ii) GiveWell, started by Elie Hassenfeld and Holden Karnofsky, then at Bridgewater, and originally a spreadsheet that ranked charities, which maybe captures a strand of the American finance optimization mindset; (iii) an outgrowth of Bay Area rationalists, e.g. communities around people like Scott Alexander and Eliezer Yudkowsky who have been thinking about AI risk and related concepts for a while. A key early event was when Dustin Moskovitz (Facebook co-founder, Asana CEO) came across GiveWell, was enamored, and invited Holden to run Open Philanthropy, which became the most important EA philanthropy.
Progress studies is associated with tech because of Patrick Collison and the YIMBY movement in SF. But London (e.g. Works in Progress) and DC (Institute for Progress) are just as important. Economists have been talking about YIMBY-esque ideas for decades (most notably Ed Glaeser) and there was a growing community of academics working on open science and the reproducibility crisis well before the term metascience was in vogue online (the main new thing is building new scientific institutions).
Tech has certainly been important, primarily by providing capital and conferring status to these ideas. This has attracted more talented young people to the cause. But I can see why people get annoyed when people associate these ideas primarily with tech.
Jasmine Sun (20:35)
You're probably right that there are these macro factors that would have shifted policy and discourse in this direction either way.
Captains of industry and politics
Jasmine Sun (20:35)
One of the reasons that I wanted to bring you to talk about this stuff is that you did get to spend a bunch of time with Paul Graham, Sam Altman, Patrick Collison and Joe Lonsdale for an Economist piece that you did about a year ago on “The new tech worldview.” It was a fun piece. We learned, for example, that Sam is a washed jeans guy and Joe is a linen guy. Good intel.
I wanted to hear more about that piece. Why did you pick those four individuals to talk to? Do you think that they are meant to represent four distinct worldviews or one unified tech worldview, as the headline implies?
Arjun (21:26)
It was actually two years ago. I started reporting in the summer of 2022. I thought these people and these different intellectual strands in Silicon Valley were super interesting and poorly understood. My editors didn't really know what effective altruism was, and no one knew what progress studies was. It wasn't even clear there was a broader “tech right” beyond Peter Thiel and the PayPal mafia. I wanted to explain to the world what these movements were.
Those specific people were not the original plan. Originally, we were like, We have three intellectual strands: effective altruism, progress studies, the tech right. Let's find one person for each.
SBF was super big at the time, but we had profiled him pretty recently. That's off the table. I was thinking about Dustin Moskovitz because of Open Philanthropy. But he actually declined to be interviewed. And he's not really that public. He still doesn't do any interviews. So that didn't work. Vitalik, we had also profiled recently, and he's also not quite Silicon Valley.
So it ended up just being a hodgepodge of three of the more intellectually interesting people under the age of 40 in Silicon Valley: Sam, Patrick, and Joe. Paul was not really one of the three. He was supposed to sort of be the elder statesman, the intellectual forefather. And we also kind of cast Peter Thiel in that role, though I didn't speak with him.
I think I slept on Elon in that piece. I knew he was super important, of course, but he didn't write much about his worldview. And he hadn't bought Twitter yet, so he was just less present in the discourse. His net worth before COVID was also way smaller than it was after Tesla ripped, which gave him way more leeway to do things. He always had, you know, his hands in a lot of buckets, but this force-multiplied that. And I think we only really saw the effects of that in the past year and a half.
I will say I didn't see Elon coming, and in general underestimated the shift to the tech right. I do think there were people in the capitalist class of Silicon Valley who were keeping quiet. Like I think Marc Andreessen probably was already quite red-pilled, maybe even before 2020, he was just very quiet. And then there was this sort of preference cascade, Cass-Sunstein-style, that happened over the last year and a half.
Jasmine Sun (24:20)
One interesting choice there is you've told me that Joe Lonsdale's influence is quite underrated. For our listenership, he might be the name that not everyone is super familiar with. I was curious if you could say more.
Arjun (24:43)
I think he puts his money where his mouth is in terms of institution-building, which a lot of people don't. The University of Austin is the best example. It seems to be off to a decent start. They have a really strong set of people. It's a cool thing that they're doing this, trying to create a new university. He's also very connected to young up and coming people in Silicon Valley, because he maintains his connections with the Stanford Review, and 8VC recruits a lot of top people out of Stanford. He has the Cicero Institute, which does some interesting work on housing and homelessness and so forth.
I think it would be wrong to cast him as like the protege of Peter, which is a thing I thought at one point. Now I think he's less a philosopher and more of a pragmatist, but still an important person.
Jasmine Sun (25:33)
What else did you change your mind on while you were reporting this piece?
Arjun (25:38)
One of the key tensions is: Is this something new? Or is it just a rendition of the old thing of rich people having political and social influence? People lobbying, making grand statements—not new.
In the early 20th century, Andrew Carnegie wrote this essay, The Gospel of Wealth, promoting his worldview that rich people have a moral obligation to society. Rockefeller has this great quote that “competition is a sin”—sounds a lot like Peter Thiel's “competition is for losers.” George Soros had a PhD in philosophy. You could argue he and Bill Gates are the leading people behind the neoliberal philanthropy that was dominant in the 90s and 2000s. At one point I thought what distinguishes the current tech leaders is they try to build new institutions rather than working within existing ones. But the robber barons of the early 20th century were doing that too. They were building new universities, libraries, etc.
So what is new? You could make the case for a few things.
One is just technology. The Internet means that all these people’s reach is far greater than anyone before. You can't do what Elon's doing today without the internet or software. You just can't manage that many separate things at once. Rockefeller, who was the richest person before Elon in terms of wealth, was mostly just Standard Oil.
The second is public writing. We talked about this a bit before: using social media, blog posts, and podcasts to sort of influence the direction of technology. Sam is the best example of this. He tried to reorient YC in a more hard tech direction. Questionable success, but certainly tried. You could write op-eds and papers before, but the internet lets you publish much more frequently and directly.
The third thing is VC fundraising dynamics and startup risk culture, where you're using finance to pull the future forward. We never had as much risk capital as we do today. That incentivizes worldview-building, having a lofty vision, and building hype; because you need to do that to recruit and to get investors to invest in you. It’s a rational delusion thing, right?
Progress and growth
Jasmine Sun (28:20)
I wanted to talk a little bit more about progress studies because this is the idea machine that I am currently most interested in and intrigued by. I'm going to paraphrase the loose definition that I wrote in my tech right disambiguation post:
The progress coalition is this rising alliance of thinkers, policymakers, and industrialists who are united by their focus on accelerating US innovation, through science and technology, to drive US economic growth and global primacy. It's interesting because it feels both nonpartisan and bipartisan; individual members have a range of political beliefs. They might call themselves supply-side progressives, or e/accs, or American dynamists, or abundance liberals. Some folks just seem like neolibs and neocons to me. But they're rallying around the same set of issues, like R&D funding, China competition, and deregulating supply in domains like housing, energy, and pharmaceuticals. And these folks seem to believe both in markets and in market failures—both in the motivating magic of capitalism and in the need for competent governance to speed up growth when it slows.
I also suspect that a lot of the folks in this coalition, especially the newer people coming from tech, would prefer not to think about politics. It was more that they ran up against the constraints that politics imposed and were like, well, I guess this is the bottleneck that we've got to fix.
The question I have for you as a fellow Progress Studies watcher is where these folks came from. For example, in 2019, there was this big Atlantic op-ed co-authored by Patrick Collison and Tyler Cowen called We Need a New Science of Progress. That was obviously very influential, but you sent me this New Yorker profile of Sam Altman from 2016, when he was still president of YC. At that point, Altman is already thinking very similarly about the role of science, hard tech, and defense in jumpstarting growth. And the fact that growth is so important to the society and political institutions that we hold dear.
I'm curious where you think Silicon Valley's interest in this science-driven growth came from.
Arjun (30:28)
The broader progress coalition is kind of predictable given the macroeconomic context of inflation and China. Mark Zachary Taylor, a political scientist at Georgia Tech, has a good book called The Politics of Innovation, which shows that when countries face security threats, they invest a lot more in R&D. And that involves deregulating to make sure that R&D goes a long way. There's the 70s example. There's the World War II example, investing in creating the entire national science funding apparatus and so forth. So what I'm very curious about, and don't have an answer to, is: In what ways do the particular ideologies of today's tech elite change how progress instantiates, compared to what would have happened if we had a different set of characters in charge?
On science-driven growth in particular, the other backdrop to this is the recognition within the tech industry that the internet and software—while they made people a lot of money—didn't have as big an impact on living standards and growth. Peter Thiel's line: we've seen change in bits, but not atoms. Tyler Cowen’s “great stagnation.” And it's also a realization that there is an end to every tech cycle. So what's the next thing? You make a lot of money from identifying the next thing.
And the third thing is tech really is ideological. People want to do the coolest thing that really does change how the world works. Sam is certainly oriented in that direction. So there has been a cultural shift where deeper tech has become cooler. If you want to be a great man, working on science is cooler, right? You have deep intelligence as opposed to just business optimization.
Jasmine Sun (32:34)
There's this Sam quote in that New Yorker profile where he says: “Democracy only works in a growing economy. Without a return to economic growth, the democratic experiment will fail.” Do you agree with this?
Arjun (32:53)
In a way, yes, growth masks all problems.
But the economist Dietrich Vollrath has a book called Fully Grown, which says that as a country gets richer, it starts to prioritize things other than consumption of tangible goods. Look at Western Europe. I know it's fashionable right now to shit on European growth, but if you do the proper PPP adjustments and look at overall welfare, including things like life expectancy and public amenities, I think the quality of life is higher there than in the US. I preferred London over any American city. Western Europe is less polarized than America, arguably, and at less risk of democratic backsliding. So lower growth, more politically stable. That's a pretty straightforward test of that argument.
That said, it does help to grow. It does reduce tension. Maybe it's also a cultural thing where in America, we come to expect rising living standards in terms of consumption, but in Western Europe, people know the trade-off they're making.
Jasmine Sun (34:11)
I was curious if you could elaborate on the steel man of the argument that democracy only works when you have high growth. Why is that true?
Arjun (34:21)
It's the debate that's happening in India right now. People are like, Hey, we should be more like China and grow faster.
Jasmine Sun (34:22)
Notably, China is not growing very fast anymore, and there's democratic backsliding.
Arjun (34:36)
No longer, no longer.
But if there is this trade-off, then when you're poor, you might deprioritize democracy and procedure and fairness because people, when they're poor, want to get rich.
I don't think we have enough of a sense of the preferences of already developed countries and political stability within them to know. Is there a cross-country relationship where you have a slowdown in growth and you become less democratic? The recent experience suggests that's the case across the world, but there are lots of other changes too, like media environment. And the vibe has just changed, as Tyler Cowen would say. There's not necessarily a causal explanation for it.
There is literature on deindustrialization. One of my professors at MIT, David Autor, has studied areas of America affected by Chinese import competition after they joined the WTO in 2001. He finds that manufacturing job loss has big local multiplier effects, and predicts rightward shifts and other negative impacts in communities. So that's certainly true.
Jasmine Sun (35:49)
There's like part of me that wants to say the assumption that growth will gloss over political questions is naive. Because let's say you want to deregulate, where will you deregulate first? You want to build more housing, where will housing get built and not get built? Sometimes there are only so many resources, so those prioritization questions reflect political trade-offs.
At the same time, the best theoretical case for abundance, to me, is that growth unlocks choice. A diverse society has lots of inherently irreconcilable and irresolvable values: some people want to live here and some people want to live there. You can get into endless debates and community input meetings about where to build the buildings. But in the end, the only way to resolve these differences in a diverse society is to have enough total choices that everybody can have what they want. And all we can do is create a more positive-sum environment.
I'm thinking about San Francisco again, because I'm trying to learn more about politics here. It does feel like so much of the antagonism against tech is because there isn't enough housing, so when tech comes in there is this brutal trade-off where creatives and service workers and everybody else have to move out. Whereas in a place like New York, which is denser and has more space, there's a complementary and synergistic relationship between being a financial capital, being a cultural capital, and being a place where working-class folks can make a living. SF, because of its constraint on space, is so zero-sum. If these people move in, we gotta move out. That escalates the tension so much.
Arjun (37:49)
I totally agree with that. But I would just say that feels a little bit less about growth and more about how much of the scarce resource you have relative to how unequal purchasing power is. “Growth caused this problem” is the wrong way to frame it. You had massive growth of the tech industry clustered in this region without corresponding growth in the housing supply. You had a huge increase in inequality without an expansion of the scarce resource. That created the problem. Now obviously we should build housing. But if you were to look at other cities at a lower level of per capita GDP, they might not have the same problem if there's less inequality relative to the scarce resource.
So growth alone doesn't solve the problem. You need the scarce resource to expand enough to cover up how much inequality you have in consumption.
In the 2010s, it was fashionable to talk about inequality. Now it's fashionable to talk about growth. We're going to probably be talking about inequality again in a decade or two.
Jasmine Sun (38:54)
That's a really useful clarification. I realize I had conflated GDP growth with simultaneous increase in the amount of resources and infrastructure.
Tech diffusion vs. adoption
Jasmine Sun (38:54)
I do associate you as broadly within the “progress coalition,” which I'm also sympathetic to, but I don't know if I'd consider myself a part of it. I'm curious, as both an observer and a member, is there anything that you think the progress folks miss?
Arjun (39:29)
Generally, I’m a big fan. I also think there is quite a big diversity of viewpoints, as you mentioned earlier.
Intellectually, the main thing I would say is there's a big emphasis on innovation and less of an emphasis on diffusion of technology. Jeffrey Ding, who's at George Washington, has some cool work comparing America and China in terms of technological capabilities. His basic point is that on measures of innovation at the frontier, the difference between the US and China is actually not that big. In sectors like 5G and electric vehicles and batteries, there are ways in which China is actually ahead of America now, manufacturing being the number one thing. But the US is much better at diffusion of technology—that's why its national power is much greater and why living standards are higher. If you were to go to a second- or third-tier city, America does much better in terms of adoption of computers and allocation of resources. The economic way of putting it is that there's just much less misallocation. The intensity of use of your technology is much greater. The UK is suffering from this right now, where they're at the frontier of biotech and AI, but outside London they have pretty poor tech diffusion.
If you want to achieve diffusion, you need standard things that are less sexy. You need good human capital for people to adopt the technology. You need competitive markets so you churn out the bad firms and the good ones scale. And this means you should care about education and antitrust, which are not the top issues in the progress studies world.
The response I'd expect if I told someone this is: other people are already focusing on that; we're focusing on the things that are less emphasized. But I've not done the analysis of where the marginal impact is greatest.
Jasmine Sun (41:22)
When you talk about diffusion, are you saying how well the median city is doing, or how well the 25th or 75th percentile is doing? Because I do think that tech, for example, likes to focus on the 99th percentile.
Arjun (41:31)
Yeah, that's exactly it. This is one of the failure modes of India. America was actually great at expanding public education early in its history, which Europe did less of. That matters a lot, because in your second- and third-tier cities, your 25th or 50th percentile person can adopt technologies. That's what is going to lift your overall living standard and growth rate.
Jasmine Sun (42:01)
How do we develop ways to talk about that? All higher education discussions are about Harvard and Stanford. All technology discussions are about what happens in San Francisco. There is an ease and a bias to only focus on the leading edge of things, and it can mask a lot of problems. The US can talk about how the Fortune 500 is doing great, and then everyone is like, the economy feels like shit to me personally, because the stats that you're looking at are too aggregate or only reflect the leading edge.
Arjun (42:38)
Yeah, but the mainstream does think a lot about the left-behind regions and geographic inequality. So maybe there's like a good division of labor here, right? Silicon Valley is the innovation engine.
And then it depends on how powerful the progress coalition gets. If they become the government, they have to care about the other stuff too. But if they're still thinking on the margin, then it's fine.
Jasmine Sun (42:58)
On the antiwoke cultural side of things, it does seem like a lot of folks in tech believe we have become too focused on bringing up the tail and not sufficiently focused on ensuring that we're continuing to make progress at the leading edge. And how can we prevent tall poppy syndrome, where we no longer want people to make lots of money, build giant companies, and experiment with things.
The state of EA
Jasmine Sun (42:58)
I did want to briefly touch upon our good friends, Effective Altruism. From when we were undergrads until probably the fall of SBF, I feel like this was clearly the biggest and most influential idea machine on the scene. They obviously have had a lot of impact. They were very, very early to shaping concerns around AI and AI safety. Even policy impact-wise, the Biden executive order on AI was heavily shaped by folks from EA.
But because of SBF, it feels like EA’s lost a lot of credibility and influence in the public imagination. With AI in particular, there’s a more competitive environment, and I'm not really sure personally what the future of safety is. It feels like everyone right now, whether because of China or because of intra-firm competition, is only interested in accelerating. And the appetite for a pause or a slowdown seems much weaker now.
I actually feel quite bad for EA because I've been a critic in the past, but I am like, wow, a lot of these people are actually doing really rigorous research and making sacrifices to do good things for the world. And now I feel like selfishness is in vogue, and they keep getting clowned on for like trying to do good things.
But I'm curious, what do you think happens to EA from here? Do you think they're up or down? Do you think they're actually still as influential as they were?
Arjun (44:47)
You're absolutely right that in terms of status, they are down. I mean, SBF, a lot of the safety stuff getting memed on now. But in terms of influence, you could argue that they're more powerful than ever. They've been right about a lot of things that other people were sleeping on. When you become mainstream and other people start thinking through the frameworks that you offered, your individual people might be less influential, but your ideas have a big impact. That's sort of where EA is at.
I think some things have been institutionalized like the UK AI Safety Institute. The way they're doing things actually makes a lot of sense, right? Constant vigilance on the frontier, so they monitor frontier models for dangerous capabilities. They have a lot of really talented people, who 5 or 10 years ago would not have otherwise worked in government, particularly from the tech industry. Like you got Paul Christiano, who could be like founding an AI company and raising a bunch of money right now.
Yes, we are in a huge acceleration mode, but everything is at the margin, right? That's because we've learned that the fast takeoff scenarios are not likely to happen, whatever people say about drop-in remote workers. And there's huge uncertainty around what the first 18 months of Trump will look like, and there's some uncertainty over where AI progress will go. Maybe if there's some warning shot with AI progress and some bad stuff happens, then you can see a return.
Sam Altman talks about this co-evolution idea, right? You want iterative deployment—and in a way that's a justification for them to do whatever they want in terms of rolling out the product—but I think it's relatively correct. That's what's happening. People are adjusting and adapting, and that's good.
Jasmine Sun (46:30)
I feel like I would probably buy the status dip on EA right now.
Arjun (46:35)
Oh yeah? When do you think they come back?
Jasmine Sun (46:41)
In a few years, maybe three years out. People gotta get over the SBF thing and another big prediction has to come true. Something they've been on that the rest of the world hasn't.
Arjun (46:42)
Maybe agents doing something bad.
AI impacts
Jasmine Sun (46:58)
I was doing this project on Public AI with my friend Nik Marda. This was with Mozilla, and we were interested in how we could get more public federal investment in shared AI resources: compute, data sets, things like that. And we were thinking about how to message the importance of AI to a more policy-oriented, less technically tuned-in audience.
We were looking for historical analogies, and one that we turned to was transportation infrastructure. When cars and trains and buses were invented, there were all these safety debates happening. Tons of people died in crashes, and there was tons of pollution. Eventually you get seat belts, you get speed limits, you get clean air stuff to help mitigate the side effects. And it was this decades-long process of figuring out how society could adapt to the influx of cars on the road.
Arjun (47:55)
Didn't OpenAI have a paper recently? They refer to the history of cars. I don't think they get into the social response, but they use it as an example of, hey, the horses got out of the way, the people got out of the way, and we rebuilt our infrastructure to make way for cars.
There's an assumption that that was a good thing, but there's also this whole strain of thinking that we've overbuilt American cities for cars. That's one of the reasons I love London. So it's actually a really bad example of going full steam ahead and re-engineering society for a new technology, even though I generally think we should go full steam ahead with AI.
Jasmine Sun (48:42)
That's funny. It was one of those things where Nik and I put in one paragraph about AI and cars and transportation, and the number of comments that we got from reviewers arguing about it, relative to how much space it took up... Like, man, people get into that stuff.
Our argument was actually like, just as we've perhaps overcorrected on private cars, it's really important that we have public alternatives in AI. You have these private consumer apps, maybe those are the cars. But what is the public bus or high speed rail of AI?
The thing I was sort of getting at there was that when you look at safety questions around the adoption of cars, a lot of people basically had to die first. There were so many crashes, and pollution got so bad, that big advocacy organizations of concerned parents and environmentalists were like, our kids are dying on the road, so let's go to Congress and lobby for safety laws.
And part of me basically thinks that AI safety will make a comeback when there are sufficiently high-profile, extremely obvious harms. Not to say that folks haven't died as a result of AI systems before.
Arjun (49:55)
Really?
Jasmine Sun
There are autonomous weapons that sometimes kill civilians, right? There are AI systems that will make mistakes that will lead to people dying.
Arjun
Are they autonomous though? Like the drones in Ukraine?
Jasmine Sun (50:13)
A lot of Israel's targeting is AI. There's a +972 article that I can link. But these are automated decision-making systems, and they get integrated into lots of things. If that system has the power to let someone live or die, there are going to be mistakes made. You can argue about whether it's more or fewer mistakes than a human would have made, right? That's a very valid argument to have. But certainly, depending on your definition, people have died because of AI.
There's an AI incidents database with a list of examples that some EA folks made. But clearly none of these have risen to the level of really getting the AI safety movement back in. I'm a little bit nervous about what the incident will be that gets people to go: shit, regulations, slow down.
Arjun (51:08)
Then there's a whole other strand of EA that's not AI and that's humming along. There's a lot of development people who are very EA-influenced, and there's a whole debate around whether we went too far with randomized controlled trials. Now Open Philanthropy is planning to start a team to invest in growth in developing countries, which I think is really good. They've made a massive mark on that world, funding huge amounts of research and philanthropy. It's just less in the news.
Jasmine Sun (51:27)
I know less about that world, but, yeah, it seems less controversial overall. I'm probably a fan. You should probably test the efficacy of the things you do. I donate to the charities once a year when everyone does their big GiveDirectly promotions.
Gains from trade (with AGI)
Jasmine Sun (51:27)
My final question for you is: What is a research question that you are trying to untangle right now?
Arjun (52:14)
One that's sort of on theme for this podcast is thinking about AI capabilities.
The term “artificial intelligence” is sort of a strange term, because are we really building “intelligence” as we understand it in humans? I don't think so. We're building something that's much more uneven. It's already superhuman at many things that humans are bad at, and it's much worse than us at other things. Some people use the term “jagged frontier.” A friend of mine, Basil, likes the term “multi-dimensional intelligence.” And so we're trying to think about how to model this explicitly and create a framework for thinking about the different dimensions: Is progress going faster in some than others? What are the implications for the economy, in terms of impacts on labor markets, employment, and growth?
And I think that's better if it's multi-dimensional, if the frontier is jagged, versus if it perfectly matches the human frontier. Because then it preserves human advantage in some domains and you get nice gains from trade. If you had something that was like strictly the same as a human, but zero cost, you'd get more negative labor market impacts. And you can still get a lot of growth—in fact, you get the most growth from AIs that are good at things that humans are not. Creating new tasks, new industries, and so forth.
Jasmine Sun (53:39)
Whoa. I had never thought about it as gains from trade, but that totally does make sense.
Arjun (53:59)
Yeah, there are a lot of econ people who think the pursuit of humanlike AI is the wrong goal. And there's something valid to that point. The biggest gains come from when we don't just automate existing tasks, but we do entirely new things. That might happen with this kind of jagged intelligence concept.
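An aside: Arjun's gains-from-trade point is basically the comparative-advantage logic from trade theory, so here's a toy sketch of it in Python. The numbers and task names are made up purely for illustration (they're not from Arjun's research); the point is just that when the AI's skill profile is jagged, specializing and trading produces more of everything than either the human or the AI splitting time across both tasks alone.

```python
# Toy comparative-advantage sketch with made-up numbers (purely illustrative).
# Two producers (a human and an AI) each have 8 hours to split across tasks A and B.
# The AI's profile is "jagged": far better than the human at A, worse at B.

def output(hours_a, hours_b, productivity):
    """Units of A and B produced, given hours spent on each task."""
    return {"A": hours_a * productivity["A"], "B": hours_b * productivity["B"]}

def combine(*bundles):
    """Total output across producers."""
    return {task: sum(b[task] for b in bundles) for task in ("A", "B")}

human = {"A": 1.0, "B": 2.0}    # human: weak at A, decent at B
ai    = {"A": 10.0, "B": 0.5}   # AI: superhuman at A, subhuman at B

# No specialization: each splits time evenly across both tasks.
no_trade = combine(output(4, 4, human), output(4, 4, ai))

# Specialization: the AI does only task A, the human does only task B.
with_trade = combine(output(0, 8, human), output(8, 0, ai))

print("no specialization:  ", no_trade)     # {'A': 44.0, 'B': 10.0}
print("with specialization:", with_trade)   # {'A': 80.0, 'B': 16.0}
```

With specialization you end up with more of both A and B. If the AI instead matched the human's profile exactly at near-zero cost, there would be no task left where the human holds a comparative advantage, which is roughly the scenario Arjun describes as more negative for labor markets.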
Jasmine Sun (54:26)
Okay, we should talk more about what AGI means and the jagged frontier thing at some point, which will not be the next hour.
Thanks so much for doing this with me. This was fun!
Arjun (54:35)
Sounds good. It was a lot of fun! I really enjoyed it.
Jasmine Sun (54:40)
This is Jasmine again, several hours later after I'm done editing the whole thing. Thank you so much for listening to the first episode of my podcast! You can find the essays and the links for this at jasmine.substack.com.
And because this is the first episode, I am very keen to hear your feedback: Was the format good? Was the audio quality all right? Did you watch the video?
I'm hoping to bring in a new smart friend to talk about whatever it is I've been writing and thinking about that week, to provide extra context and knowledge that I don't have. So if you like it, please let me know. Please send it to people. I really appreciate you listening.
Links and books
Here was the rest of our reading list for today:
The Origins of Wokeness (Paul Graham, Jan. 2025)
The new tech worldview (Arjun Ramani for The Economist, Dec. 2022)
Idea machines (Nadia Asparouhova, May 2022)
Sam Altman’s Manifest Destiny (Tad Friend for The New Yorker, Oct. 2016)
A time for truth and reconciliation (Peter Thiel, Jan. 2025)
The Rise of the Right-Wing Progressives (N.S. Lyons, Jan. 2024)
The changes in vibes - why did they happen? (Tyler Cowen, Jul. 2024)
And the recommended / referenced books:
The Power Broker, Robert Caro
Technology and the Rise of Great Powers, Jeffrey Ding
The Politics of Innovation, Mark Zachary Taylor
Fully Grown, Dietrich Vollrath
Thanks for listening!
Jasmine Sun
The origin of the name is an article club that my friends and I sometimes host, where we pick a longform essay to all read and discuss together over brunch. I wanted my podcast to feel like a virtual version of the same—fun and informal, but genuinely substantive.
The rough plan is that my cohost and I will send each other a set of articles/essays to read and discuss; often related to whatever I’ve most recently written.
Also PTSD from this: