How the Attention Economy Is Devouring Gen Z — and the Rest of Us

The Ezra Klein Show

transcript

The last few episodes of the show have been about attention, but about attention in terms of something else. Attention in terms of Zohran Mamdani and Donald Trump. Attention in terms of the "big beautiful bill." But I wanted to do an episode that was about attention in terms of itself. If we're going to say attention is currency, if we're going to say it's power, well, what kind of currency is it? What kind of power is it? I'm a pretty loyal reader of "Kyla's Newsletter" by Kyla Scanlon, which often feels to me like it's being sent back in time from some future economy. And in a way, it is. Kyla is very much a member of Gen Z.

[CLIP] Fartcoin has been one of the most stable assets of this entire time, which I think says a lot about this entire time. And the economy.

She is reporting on the economy. She is theorizing an economy where attention drives capital, as opposed to capital driving attention. Kyla is also the author of the book "In This Economy?" and you might have heard some of her different coinages, like "vibecession." She's a fascinating person to talk about this with. As always, my email: ezrakleinshow@nytimes.com.

Kyla Scanlon, welcome to the show.

Thanks for having me.

What is different about the economy that you live in, that you see, that you feel Gen Z is experiencing, from the way US 40- and 50-year-olds have described or understood the economy?

So one of my pieces that went far was about Gen Z and the end of predictable progress, and it was based on my research. Over the past year, I've been traveling on a quasi book tour, going to a lot of college campuses and a lot of conferences, and I've been talking to young people about how they experience the economy and how they think about their future. And for them, it doesn't feel like there's that quote-unquote path of predictable progress that maybe their parents or their grandparents had. And of course, every generation has had its own challenges. But for Gen Z, you don't have that predictable return on a college education anymore, because of things like AI and because of how expensive college has gotten. You don't necessarily have a path toward buying a house that feels even remotely approachable. And if you think about retirement, or just moving through a career path, that also feels really far away. So that whole piece was asking: What does it look like if this path that everybody has followed has more or less disappeared?

Let me stay on the topic of the feel of it. What does it feel like if you can't see your way to a house, if you're not sure what the career paths are, if you're not sure what AI will do? If you had to describe the emotional structure of the conversations you have with people on the tour, what are the dominant emotions?

I think there's a lot of worry. I mean, I think everybody picks up on this; there are so many think pieces about how the kids are not all right. There's a lot of nihilism. There's a lot of concern. There's a lot of fear. There's a lot of anxiety. And you sense that when you talk to people, because they don't really know what to do. This path that has been instilled in you from the time you were little (go to college, graduate, buy a house) is just out of reach. And when you can't get that, it feels far away. David Brooks was writing about the rejection generation, and I think that encapsulated it really well, because he was saying that Gen Z-ers are facing rejection after rejection: it's hard to get into college.
And then when you graduate college, it's hard to get a job. And so I think that element of always being rejected from everything, or at least feeling like you're being rejected from everything, creates those elements of nihilism that show up in how Gen Z-ers might spend or save or invest.

Tell me about your barbell theory of Gen Z.

Essentially, there are two ways that people seem to be responding to the uncertainty in the economy and the lack of a path of predictable progress. One path is tool belt pragmatism: people going back to the trades, becoming a plumber or an electrician, taking a path that isn't as speculative and uncertain as taking on a bunch of debt and going to college. Other people are going the meme-coin, gambling, sports-betting type of route. And so you have these two ends of the extreme of risk. But both of those are responses to that path (get a college education, get a white-collar job, go off into the sunset) not really working anymore.

How much is this a narrative and how much is it a reality? If you look at unemployment for Gen Z or new college graduates recently, the unemployment rate for new college graduates is up a bit, but it's not 30 percent or 40 percent. I've seen people debating the housing question: Are millennials, or now Gen Z-ers, really so far behind other generations on housing? And won't all those houses get passed on anyway? So is this really a problem? Is there a divergence, in your view, between the data on the economy that people in Gen Z are in and how they feel about it, or does it match?

I still think there are some elements of a disconnect. People might be feeling a certain way because of social media, and that doesn't always match the data. But look at some data points. The college wage premium, which is basically how much more you make with a college degree than without one, has eroded over the past couple of years. So you're graduating with a college degree, which is very expensive, and you're not making as much as you might expect. Housing is still quite expensive. Eventually those homes will pass on, but I think the median age to buy a house now is like 54 years old, and in the 1980s I believe it was 34. And so there are some data points that support the fact that there are these elements of nihilism, and perhaps reasons for feeling nihilistic, that weave into how Gen Z experiences the economy. But it's all exacerbated by things like social media, for sure. There are elements of narrative that might sweep beyond the data, but the data tells the story, too.

You also talk about how there's not just one Gen Z. There are multiple. What are they?

Yes. So I'm an ancient Gen Z-er, basically the elder millennial of Gen Z. I graduated right into the pandemic, so I'm part of the Gen Z 1.0s: those who remember a time in college when you weren't on Zoom, who were in college pre-pandemic and in the workforce during the pandemic. Then Gen Z 1.5 would be where my little brother is. He was in college during the pandemic, and that shaped both his relationship with institutions and his relationship with the digital world. He got a lot of his education via Zoom, and he relied on digital tools, as we all did during that time, to forge friendships and connections. And then Gen Z 2.0 is the people who are in college and high school now, and they're the first part of that generation that is entirely digital. For them, the digital seems to be an extension of reality.
Rachel Janfaza was the first person who came up with bucketing the Gen Z's. She had Gen Z 1.0 and Gen Z 2.0, and I stuck a Gen Z 1.5 in the middle because I think it deserves a bit more splicing. But that's how you can think of it: the relationship with technology and the relationship to institutions.

And the pandemic is such a big player in that, right? To graduate before the pandemic versus to be in school during the pandemic. Is the implicit argument here that Gen Z is just going to be a very different generation, because it had this shock that other generations didn't, that the pandemic hit at a much more formative time for it than for essentially all the Gen X-ers and what came before them, and that what comes after them is going to be less volatile than what they went through?

Yeah, I think so. I think Gen Z is the beta generation, the beta-tester generation, rather. They're the ones where it's: Is TikTok good for the brain? We'll see. Can students do Zoom classes and still get an education? We'll see. Is unfettered access to the internet O.K.? We'll see. We've tested a lot of things, like smartphones in schools, and we're learning; a lot of schools are starting to ban smartphones. But yes, I think Gen Z is a generation that hopefully will take a lot of lessons for Gen Alpha, and perhaps not put them through what the Gen Z-ers have gone through from a digital and mental perspective.

How does AI fit into the job market that young people are both seeing and sensing?

So AI is interesting because there are all these stories about how AI is going to replace entry-level jobs, and how these companies are going to automate all this work with AI. Salesforce, I think, has automated, or so they say, something like 30 percent of their work. And if you're a young person looking at that, you're like: Oh no, these entry-level jobs that I rely on to get on the bottom rung of my career ladder are going to be taken away from me. And so I think that also creates an element of fear. Where do you even enter the workforce if the entry-level jobs are supposedly all going to be done by AI?

This is where I think the feeling of it is important. I feel this a bit myself, but when I talk to people who are starting out more in their careers, my sense of the moment is that, for a lot of people, it's a little bit like looking at a wave in the distance, and you turn on the news and they keep telling you there's a tsunami warning. And you're like: Well, is that wave going to turn into a tsunami? And if so, how quickly? And do I have time to get out, or am I in a high enough area? And you don't really know, but it has become this fog between you and a stable vision of your own future, right? When I started out in journalism, it wasn't clear that I could succeed in journalism, but it was never because there might be a massive technological shock to journalism in which a computer would do my job instead of me. And obviously, things like that have happened in manufacturing, with both globalization and automation. But I think it's some of that. There is just this ambient uncertainty. That feeling of fog (I think fog is probably the better metaphor) is very present for people.

Yeah, and if you look at the money the companies are spending, that's concerning too. When you see Mark Zuckerberg spending billions of dollars to poach people from OpenAI, it's like: Well, clearly something is happening here. Clearly he has some plan.
Like, will that plan work? Who knows, because the metaverse didn't work out. And so I think the overwhelming narrative is: Oh gosh, if you majored in computer science, you're out of luck, man. Good luck out there. Or if you majored in the arts, or you're trying to be an artist, too bad, AI is going to do your job. And so I think right now we're at this phase of technology where we're trying to establish the human part of it, and we're not doing a good job of figuring out where humans need to be in the technology equation. Instead, we're like: No, we're just going to automate everything, and sorry, we don't have a plan for people.

This really frightens me, actually. If AI were going to hit like Covid hit and just put a significant percentage of the population out of work all at once, we would do something about it. But if it moves slowly and just eats a category and a tranche of jobs at a time, we're going to blame the people who don't have jobs. We're going to say: Well, most people who graduate with a marketing degree got a job, so you're just not working hard enough. You're not smart enough. You're not one of the good ones. We're so used to doing that in this economy: blaming every individual for what happens, when it's often very sectoral, very technological. And I talked to a lot of politicians about this, and they're kind of abstractly worried about it, and none of them have an inkling of a policy answer for it. Like, what do we do if AI, which seems totally plausible, just doubles 18-to-24-year-old unemployment in the next six years, or triples it? What policy would we deploy for that? And nothing.

Yeah, I'm not surprised to hear that. I don't know if anybody has a good answer. The common answer you'll hear is UBI, universal basic income: we'll just set aside $1,000 a month for everybody, and that'll be that. And I don't think that's the right answer either, because to be human is to work and to have meaning and to have purpose. And so if we all of a sudden just say, no, you don't have meaning and you don't have purpose, that feels really bad. So I worry that even if we do have a policy response, the societal upheaval that could happen if all of a sudden we're like, no jobs for everybody, could be really bad. And then there's the other side of the coin. There's a good paper published by the National Bureau of Economic Research that talks about nails, like hammer and nails, where nails were once, I think, 0.5 percent of GDP in the 1800s, just the nail part. So it's a lot of GDP, actually, right? But that just shows technology is always shifting and changing, and nails are no longer that big a part of our economy. We develop new parts of our economy, just like we did with the internet. And so that's kind of what we have to hope for: that new jobs will be established with AI rather than no jobs at all, which maybe they will. But it's that disruption period that is very scary, I think.

You mentioned UBI. I've been around a lot of UBI discourse. My wife wrote a book on giving people money a couple of years back. My old colleague Dylan Matthews, I thought, had a great line on this: he said that UBI is simultaneously too much and too little of a solution for the AI problem. Because imagine you're a unionized truck driver, right, an interstate truck driver. You're making $88,000 a year, and your job is automated by a driverless truck, which we are currently pretty close to being able to do. In fact, maybe we already can.
And then what a UBI, in a far-fetched scenario where we actually pass one, is going to give you is $22,000 a year, and it's also going to give me $22,000 a year. So it's not enough of a solution for you: it didn't replace your income or give you your dignity back. And it's too much of a solution for me: I didn't lose my job.

Yeah, and the UBI thing always just struck me as a very strange and fanciful response to AI. It might be good for other reasons, but AI is not going to put everybody out of work all at once. AI is going to make workers more productive, which will slow down the hiring of new workers. And that's actually a much harder problem to solve, or to address.

How do you think we should address it?

I have no idea.

So even in talking to the policymakers, there hasn't been anything floated? Is this just something where we're all going to stare at each other until it's here?

I think, going to some of your work on attention, you're going to need a focusing event, and we don't know what it will be yet: where something happens that clicks into place that this is a problem we need to deal with now, that people are really losing their jobs now. And it's going to depend on how big that event and that trend are.

So a lot of people lost their jobs to the movement of factories to China, and we didn't really do that much for them.

No, we just failed to respond to that, and it was very destabilizing over time in our politics. I was talking to somebody who's in this world, and he was a big skeptic that AI was going to take away jobs, even though he was a big believer in AI. And his argument to me was: Look, you're going to have so much capital investment in data centers and all this, we're going to need so many electricians. And it struck me as very fanciful, this idea that we're going to turn all of these comms majors into electricians really quickly. So to me, I just haven't heard a solution that really makes sense. And my worry: it's weird to be hoping for much more disruption rather than less, but I think we would have a much better chance of responding to it well if the disruption is significant enough to be undeniable than if it is slow, a lot of little pieces happening kind of all at once, where nobody can quite prove what it's coming from. My confidence is higher in an almost emergency scenario of this than it is in the accretion that I think we're likely to get.

So you'd rather have a flood than a slow drip?

I think so, yeah. What do you think would work?

I don't know either. That's, like, the hard part about it. It's so easy to diagnose problems, and it's much harder to come up with solutions, it turns out. But I don't know. I mean, I think the physical-world element of AI is also interesting. It does require a significant amount of resources; the data centers are massive, and they're quite loud if you're near them. I think we are speedrunning it. I wouldn't be surprised if we see a flood rather than a slow drip, just based on the amount of money that's going into it.

My version of what I think is going to happen, or one plausible scenario, let's call it, is that it's going to be during the next economic downturn, when companies need to squeeze their labor forces, that they're going to make a big transition to AI.
And so if you imagine the mixture of a recession, which focuses a lot of coverage and interest on the economy, with companies doing what they've done in past recessions, which is using the recession as a moment to make a technological jump into some higher-productivity technology like AI, that, I think, is the kind of scenario where this becomes a really dominant conversation. And by the way, you could imagine another category of answer to it, right? Regulations about how you can and cannot use AI to replace people, and various kinds of protectionism. There was a somewhat famous interview from a couple of years back between Tucker Carlson and Ben Shapiro, where Carlson basically says: Trucking is one of the most common jobs for men in most states. I would absolutely outlaw driverless trucks. And Shapiro is like: You would? And Carlson is like: Yeah, what's wrong with you that you wouldn't? I think debates like that, over whether we should actually welcome this productivity increase or stop it, will become much more salient, in a way that people are not yet ready for, because they're so used to technology just being adopted as opposed to debated.

Yeah, not all progress is progress, right?

Some is.

Yeah. I mean, I think that's the rub of it. Because what's also interesting about AI is that, yes, it's being used to replace some jobs, but it's also being used in ways that are spreading misinformation and kind of capturing people. Like, people have sent 3 million messages to the AI chicken on Instagram. And so you also have to question, like...

That's not just a hell of a sentence, but I don't really understand it, to be honest.

Really?

I mean, I can intuit it, I guess. There's a chicken that is AI-generated, on Instagram?

You message it and it says, like, bok bok.

Like, what?

I don't actually know quite how it messages back, but there have been messages exchanged with the chicken. And there's all sorts of AI slop videos on TikTok, and that's a big part of it too. Being a person, you find meaning within work, but you also find meaning within how you spend your leisure time. And increasingly, people spend it scrolling, understandably. I was on a plane ride yesterday for 5.5 hours, and the woman next to me was very nice, but she was scrolling on TikTok for the entire 5.5 hours. And you can only imagine how AI will accelerate the addictiveness of stuff like that, as well as the impact it'll have on the labor force.

You talk about how AI is going to create this abundance of intelligence that will create a scarcity of truth. Tell me about that.

So yeah, I think truth is really valuable. It's the most important commodity of the present moment, and it's something that is increasingly scarce. And once you lose it, it's very difficult to regain. I think AI is going to create a lot of information and a lot of noise, and it will be increasingly important for people to be able to sort the truth out from that, because AI does hallucinate quite a bit. If you've ever talked to ChatGPT, it does make stuff up, and you can be like, hey, you made that up, and it'll correct itself. But you still have to be able to source what the truth is and what that means. And I think that's the problem with social media too: those algorithms are designed so that the incentives are perhaps not aligned to the user. They're aligned toward the corporation.
And so anything that people can get addicted to, where there's a monetary incentive for them to get addicted to it, it's going to happen. And so I think there is a world where AI can be a source of truth, but right now, I don't think it is. Yet people take everything it says at face value. The number of "@grok, is this true" replies on Twitter shows people asking the AI to validate a tweet rather than going and doing the research themselves and working that muscle in their brain. It's concerning, because you do have to have a radar for truth; it's so easy to get taken advantage of right now. There's just so much information, there's so much noise, and it's just nonstop. And it's very easy to make mistakes, and a lot of people do. You have to be able to know what's true and what isn't, and have your own moral and value compass.

So maybe here would be an optimistic version of this. A very standard story about social media is that it shattered the thing we now call consensus reality. How much we ever had a consensus reality, I think you can debate, but probably more at some other points in history than at this point in history. And what is AI but an articulator of consensus reality? When you say, hey, is this true, it gives you this middle, common-denominator vision of truth, which is going to miss perspectives that are maybe valuable at the margins. I think there's a worry that AI will make us all more mediocre, that AI is a technology of intellectual mediocrity, that it coheres everybody around the same set of consensus ideas. But we've been sitting here for so long lamenting the destruction of that consensus reality. Maybe this world where everybody's asking ChatGPT or Grok "is this true?" is exactly the thing we've been yearning for.

Maybe. I think it has to get there.

Can't blame me for trying.

Yeah. I mean, I think there's a lot of value in the optimism, but I think what you were maybe talking about is sort of dead internet theory, almost, where there is this intellectual flattening of the public. And then maybe people don't come up with new ideas and they don't challenge themselves. There's a bunch of articles talking about how college students are having a tough time because AI is making things a bit too easy, a little too frictionless, and I worry about that too, the cognitive effect it could have. I notice it in myself. If I overuse AI in terms of research (and I do use it), I notice the lack of sharpening of my own toolkit.

You've written a lot about this world in which it seems to you the social incentive for thinking is being diminished.

Well, I mean, I have to be cautious of broad generalizations. So, like, most of my work: I make videos about the economy, and newsletters, and I wrote a book.

But you're multi-platform.

But, like, I spend a lot of time on social media, and the reason I make videos on social media is so I can understand the mechanisms. Every day I post a video on Instagram or TikTok, and that way I can see how people respond, and I can get a radar for how social media is interpreting one side of a conversation. So in terms of the social incentive for thinking: I do worry that social media has created elements of polarization, which has been very widely discussed by a lot of people, and I think AI could exacerbate that and make it more challenging for people to go out and seek information. Like, there's a lot of value in memorizing things.
That way you can pull that fact forward rather than going and Googling it.

But these arguments are also old, right? People said the same stuff about Googling, that it's making people dumb. So perhaps it's just that same argument rehashed. I think AI is also a bridge to this other argument you're making, which is that attention is infrastructure.

Yeah.

And one of the things you write in that is that the traditional economic substrates are land, labor and capital: bedrock inputs to make stuff. But now the foundational input is attention. Walk me through that.

Yes. So the argument is that you used to need those things to, like, raise money or move through the world, and now, increasingly, you can just have attention. The idea is that attention is becoming an infrastructure of sorts that people have to build upon. So the economic foundation is no longer something like land, which is very physical, or labor, a person, or capital, actual money; it's attention. And then narrative, the story that you tell to gain that attention, is the capital that inflates the attention itself. And then, increasingly, we have speculation on top of both of those. Speculation is kind of like the operating layer: it operationalizes the attention and makes it move throughout the world, because now you can attach actual dollar signs to how much attention you're gathering. Prediction markets would be the best example. Polymarket has a Substack, and they wrote about a man who bet on Mamdani in the mayoral race in New York and made like $300,000 off of it. What he was doing was essentially seeing where attention was going, trying to make inverse bets off that, and seeing where the story, the narrative, was wrong. He was able to operationalize that attention going in the wrong direction, the stories being incorrect, through speculation via prediction markets. Does that make sense?

This sounds to me like a weird way in which we think of the attention-economy revolution as being primarily about digital media, but the way you're describing it, and I think this is true, it's like the weird bastard child of digital media and the financialization of everything: the ability to endlessly bet. The venture capitalists are betting. The day traders are betting. The crypto people are betting. In a way, when we just give things our time, we are sort of making a bet about what's important.

So influencers, right? You get money through views; that's the monetization of the attention economy, more or less. But now we're able to bet on where attention goes. With an influencer, it's a very boxed situation: you have a video that does a million views, you make your money, blah, blah, blah. But now people can bet on how many views your video might get. And so that creates this multidimensional aspect to the attention economy.

Well, and it creates a feedback loop too. I mean, polling has always had a bit of this quality, but I really watched, with Mamdani and Trump, the way the betting markets now drive feedback loops: people see something happening, and then they begin posting more on it, and then the thing begins happening more, and you can see it in the betting market, and so they begin posting more on it, or giving more money to it, or whatever it might be. And one, it turns attention into capital, because money follows it. And two, it just makes it realer.
It is a way that something that starts small can become exponential, just through these feedback loops, again, if it's sufficiently viral and people keep picking it up.

Yeah, and they do, because where money goes, attention follows, and where attention goes, money follows. And so it does create this really quite nice feedback loop, nice in terms of structure, not in terms of effects, that's really interesting to watch. And it was very interesting to watch what happened with Mamdani and the prediction markets and how that moved him. And increasingly, prediction markets (I think it's always kind of been this way) are not about what people think, right? They're not betting on what they think; they're betting on what they think other people think. And so they're essentially betting on the attention economy itself, using the stories that people are telling about the attention economy to determine where their money should go, and then more money follows. It's crypto, in a sense.

So you said attention is a foundational input. That's clearly true for a certain set of products. Does your theory of attention as a major infrastructural input say anything about the big parts of the economy that we just don't pay that much attention to, that don't have big narratives around them? Or is that just not part of this theory?

So there's the physical world and there's the digital world, and the attention economy really applies to the digital world. There are elements of attention that can serve the physical world: if care workers need a new policy passed around them, the more attention they can get on things like that, the better, because that's how you get things done, a bunch of people talking about it. But yes, the attention economy primarily sits within the digital world, and the physical world, for now, is free of it.

There's a real thing happening here, but there's also a very speculative, attentional thing happening here. One thing about AI as a technology is that its leading figures are very big influencers on social media. Sam Altman is probably the most masterful of the CEOs, alongside Elon Musk, at driving attention to whatever he wants it to be on. But the world of AI: its people are just extremely dominant on social media, on YouTube, right? AI is a technology, but it's also a very compelling storyline, in a way that very few other technologies have been. How do you think that plays into this?

Yeah. I mean, the entire S&P 500 is a bet on whether AI will make it, right? So there is a lot of money making really big bets, and there is a lot of incentive to keep attention on it and to hype it up as much as possible. I remember this one interview Sam Altman did where he talks about how AI is going to require a reordering of the social contract. And so, to go back to what we were talking about at the beginning of the conversation, when you hear something like that, you're like: Oh, what does that mean for me, and the 40 years I have left on Earth, or 60 years, or however long? So I think the AI universe does a great job of keeping attention on itself. And that's actually one very useful part about the image generation and the video generation: you're able to direct a lot of eyeballs toward your AI model, and then also use that AI model in a workplace. Hype is very valuable. So there's a company called Cluely that raised... have you heard?

Yeah, they raised $15 million from a16z. They're interesting.
They have a lot of........

© The New York Times