EP. 149: HUMAN EXPERIENCE IN A DIGITAL WORLD
WITH CHRISTINE ROSEN
A cultural critic explores how the rise of digital life is eroding our capacity for real connection, embodied experience, and authentic healing—and why preserving our humanity in an age of simulation may be the most urgent challenge of all.
Listen Now
Episode Summary
If you could be plugged into a machine that simulated the perfect experience — limitless joy, deep connection, a sense of purpose — yet you knew it wasn't real, would you choose to stay plugged in?
This isn't just a philosophical exercise. As our lives become increasingly digitized, our relationships filtered through screens, our emotions managed by algorithms, our attention parceled out to feeds and notifications, we are confronted with a deeper question: what does it mean to have an authentic experience anymore?
Our guest on this episode is Christine Rosen, a writer and cultural critic whose book The Extinction of Experience (2024) explores how the virtualization of our world is transforming not just our habits, but our inner lives. Drawing from philosophy, neuroscience, and her own reflections, Rosen examines what we lose when direct embodied experience gives way to digital mediation, whether that's our connection to the natural world, our relationships, or even our own sense of self.
The repercussions for medicine are profound. In an era where care is often delivered through screens, where patients track their bodies through apps and data, and where wellness is increasingly conflated with optimization, how do we preserve what is human in the doctor-patient relationship, and how do patients navigate their own sense of health and wholeness in a world that so often substitutes simulation for substance?
This is a conversation that cuts deep into one of the most pressing cultural currents of our time and its implications for how we connect, how we heal, and how we find meaning in being alive.
-
Christine Rosen is a senior fellow at the American Enterprise Institute, where she focuses on American history, society and culture, technology and culture, and feminism. Concurrently, she is a columnist for Commentary magazine and one of the cohosts of The Commentary Magazine Podcast. She is also a fellow at the University of Virginia's Institute for Advanced Studies in Culture and a senior editor in an advisory position at The New Atlantis.
Dr. Rosen is the author or coauthor of many books and book chapters. Her books include The Extinction of Experience; Acculturated: 23 Savvy Writers Find Hidden Virtue in Reality TV, Chick Lit, Video Games, and Other Pillars of Pop Culture with Naomi Schaefer Riley; My Fundamentalist Education: A Memoir of a Divine Girlhood, which was named one of the best nonfiction books of the year by the Washington Post; Preaching Eugenics: Religious Leaders and the American Eugenics Movement; The Feminist Dilemma: When Success Is Not Enough; and Women’s Figures: An Illustrated Guide to the Economic Progress of Women in America.
Dr. Rosen’s broadcast appearances include ABC News, BBC News, CBS News, CNN, C-SPAN, Fox News Channel, NBC News, MSNBC, PBS News, and National Public Radio. She has testified before Congress and the US Secretary of Education’s Commission on Opportunity in Athletics.
Dr. Rosen has a PhD in history, with a major in American intellectual history, from Emory University, and a BA in history from the University of South Florida.
-
In this episode, you will hear about:
• 3:00 - How Rosen came to focus her career on the history of technology
• 5:51 - Why we should think proactively about the effects of technological advances on our behavior and society
• 11:40 - How modern technology has encouraged impatience and disconnect with other humans
• 27:06 - Why we should stop seeing technology as a means to “solve” or “overcome” human behavior
• 37:23 - The epidemic of loneliness that exists despite unprecedented levels of technological interconnectivity
• 45:37 - The moral challenges in our society’s attempt to end boredom, discomfort, and suffering
• 54:28 - How to think and act critically about the relentless march of technology
• 57:17 - What we can do to make our lives flourish
-
Henry Bair: [00:00:01] Hi, I'm Henry Bair.
Tyler Johnson: [00:00:03] And I'm Tyler Johnson.
Henry Bair: [00:00:04] And you're listening to The Doctor's Art, a podcast that explores meaning in medicine. Throughout our medical training and careers, we have pondered what makes medicine meaningful. Can a stronger understanding of this meaning create better doctors? How can we build healthcare institutions that nurture the doctor-patient connection? What can we learn about the human condition from accompanying our patients in times of suffering?
Tyler Johnson: [00:00:27] In seeking answers to these questions, we meet with deep thinkers working across healthcare, from doctors and nurses to patients and healthcare executives, those who have collected a career's worth of hard-earned wisdom probing the moral heart that beats at the core of medicine. We will hear stories that are by turns heartbreaking, amusing, inspiring, challenging, and enlightening. We welcome anyone curious about why doctors do what they do. Join us as we think out loud about what illness and healing can teach us about some of life's biggest questions.
Henry Bair: [00:01:02] If you could be plugged into a machine that simulated the perfect experience, limitless joy, deep connection, a sense of purpose, yet you knew it wasn't real, would you choose to stay plugged in? This isn't just a philosophical exercise or the plot of a science fiction movie. As our lives become increasingly digitized, our relationships filtered through screens, our emotions managed by algorithms, our attention parceled out to feeds and notifications, we are confronted with a deeper question: what does it mean to have an authentic experience anymore? Our guest on this episode is Christine Rosen, a writer and cultural critic whose 2024 book The Extinction of Experience explores how the virtualization of our world is transforming not just our habits but our inner lives. Drawing from philosophy, neuroscience, and her own rich reflections, Rosen examines what we lose when direct embodied experience gives way to digital mediation, whether that's our connection to the natural world, our relationships, or even our own sense of self. As physicians, we approached this conversation not only with interest in the cultural moment we find ourselves in, but with an eye toward what it means for the practice of medicine. In an era where care is often delivered through screens, where patients track their bodies through apps and data, and where wellness is increasingly conflated with optimization, how do we preserve what is deeply human in the doctor-patient relationship, and how do patients navigate their own sense of health and wholeness in a world that so often substitutes simulation for substance? This is a conversation that cuts deep into one of the most pressing cultural currents of our time and its implications for how we connect, how we heal, and how we find meaning in being alive.
Tyler Johnson: [00:02:52] We are always excited for every episode, but I have to say, I'm especially excited today to welcome Christine Rosen. We're so glad to have you here and welcome to the show.
Christine Rosen: [00:03:02] Thanks so much for having me.
Tyler Johnson: [00:03:03] I would love for you to just tell us, you know, we present ourselves as a medical podcast, although we really talk about issues and themes that are much broader, I think, than that. Nonetheless, most of our guests are affiliated with the medical enterprise in some way, shape, or form. But you are the relatively rare guest who, although you have some things to say about it, is not directly affiliated with it. So first, can you just tell us: what do you do in your professional life?
Christine Rosen: [00:03:28] I come to you from the humanities, where I have a PhD in history, actually, and I studied the history of science and technology. I currently am a senior fellow at a research institute here in Washington, DC called the American Enterprise Institute. It's a think tank, so I do some policy, but mainly I still just do research and writing, and a little bit of teaching. I'm also a fellow at the University of Virginia's Institute for Advanced Studies in Culture, where I get to continue to dabble in academic work in history. So yes, I'm from the humanities, from the squishy-science, non-science world. But because I did study the history of the eugenics movement and early genetic science, I've had a longtime interest in the sciences and in medicine. So yes, my perspective is a little different, probably, from your own and from many of your listeners'. But I hope we can find a lot of common ground as well.
Tyler Johnson: [00:04:24] Yes. You'll be glad to know that both Henry and I are proud card-carrying fans of the humanities. Both of us have our bachelor's degrees in the humanities, and I think we are as immersed in those worlds in a lot of ways as we are in medicine. So just to set the stage for a minute, we have talked a lot on the program about the ways in which technology has transformed the landscape of medical practice, especially over the last 10 to 15 years. And I think for the most part, when we think about quote unquote technology and medicine, it's easy to think about, you know, 20 or 30 years ago, CT scans and then MRIs, and then more recently to think about the electronic medical record and the sort of dawn of the age of AI in medicine. But I think it's important to remember, too, that technology does not have to be super fancy, ultramodern, whiz-bang, right? Even the stethoscope is a form of technology, or a reflex hammer is a technological tool, if you will, in a sense. And that's just to say that in some fashion, technology has been part of medicine since there was medicine, right? It's a question of how humans interact with each other and with the world around them. And so I think that having the opportunity to talk to a person whose lifelong professional expertise is in the history of technology is a really fascinating and important opportunity for us.
Tyler Johnson: [00:05:51] So before we get to your book, let me ask you this question, to allow you to kind of prove to our listeners why this conversation might be important for them, as doctors who maybe have never thought about this, you know, in these kinds of terms. Why do you think we should care enough to think proactively about technology? Like, isn't it just a thing that sort of happens, and it's kind of inescapable and un-turn-back-able? Like, technology is just going to happen and we might as well just live in the world as it's presented to us. Why is this a thing worth thinking and talking about?
Christine Rosen: [00:06:28] It's a great question. And I would begin by answering it by saying, we all like to assume that technology is neutral, that these are just tools we devise, extensions of what we're already doing that help us. You know, eyeglasses help us see better, stethoscopes help us listen, scans are just a more sophisticated way of doing that. What that misses, and I think our general techno-optimism in the United States in particular, historically and up to the present, has allowed us to elide some distinctions. One is the assumption that tools only go one way, that our use of them extends our powers, rather than realizing that it also shapes what we perceive, what we see, how we do things; it changes us. And so this is particularly concerning in professions where you start to put more tools between yourself and the people you're tending to, if you're in healthcare or in mental health care, or if you're a teacher with students. So I think for me, my sort of lifelong effort is to remind people that technology is never neutral. It is designed with certain ethical and moral sensibilities embedded in the tool itself. Some of those are quite routine and acceptable to us, but others, I think, are left unquestioned at a time when, because our technology use is so ubiquitous, because most of us spend most of our time with a small computer attached to our physical bodies throughout the day, it seems normal. So I'm constantly prodding people to think, okay, well, what's not on the menu of options here if you use this tool versus that one? Can you do this without a mediating technology between you and another human being? Might that be a better way to do it? The answer isn't always yes or no. It really does depend on one's context and situation. But asking those questions is one thing we tend not to do. And so I like to urge people to start with that question, particularly when they're about to bring a new technology into an environment where humans used to do things with each other without mediation.
Henry Bair: [00:08:27] So I think I may have been part of the last generation to have grown up without the ubiquity of social media and digital technologies. I didn't have an iPhone until sophomore year of college, and I think I was pretty late on that; many of my friends had iPhones by the end of high school or early college. I got mine fairly late, and that was because I had moved abroad. I'd moved from Taiwan to America for college, and my whole family was in Taiwan, and it just made things a lot easier to have that. So when you talk about the ways that these new technologies have inserted themselves between us and other human beings, I intuitively sense that there is something worth exploring there, that there is something being lost when you have that interface sort of interrupting you and other people. But again, I think you can find lots of people who might question, like, so what? Why does that matter? Especially since obviously there has been, and there will continue to be, people who have only known this all their lives, people who grew up steeped in this world. So what is your response to that? Why does it matter at all?
Christine Rosen: [00:09:38] So when you have no other alternative to speak to someone, or now, thanks to FaceTime and smartphone use, to see the people you're speaking to through the phone, if that's your only alternative, absolutely, you should do that. Grandparents get to see their grandchildren who live on the other side of the world. You can talk to family members who live elsewhere. This is all a good thing. I have no problem with that. I am not a Luddite. But what my study of human nature teaches me is that if you make something easier and more convenient, we're going to do more of that. If you make it efficient, we're going to do more of that. So one of the ways in which we've allowed mediated experiences to creep up on our everyday interactions is that maybe my neighbor down the street has a question for me, and they might just text me instead of walking over and knocking on my door. Maybe instead of going down to your colleague's office and having a face-to-face, far more complicated and nuanced conversation about an issue, you send them an email, or you put it on the Slack channel, or you just, you know, have a quick FaceTime. It's not the same.
Christine Rosen: [00:10:40] The quality of that engagement is going to be different. And so my concern isn't that we have the option; I think the option is a good thing. It's that we are more and more, and we see this with the data, choosing the mediated option when the face-to-face is right there nearby, just perhaps slightly more inconvenient. Like, I have to put on pants and walk outside the house to talk to the neighbor versus, you know, texting them or posting something on Nextdoor. So when the choice is made that convenient and easy for us, we will tend to take it. And that has a cumulative effect in terms of how we interact with each other, and also for our expectations of how we should be treated and how we treat others. I can always mute someone on my phone or just close the app. I can't close the app when my neighbor's screaming at me about something they're upset about. So again, face to face we have less control, but also more nuance. So there are tradeoffs there, and I worry that we are more and more choosing the mediated experience because it's easier, more convenient, and gives us this illusion of control.
Tyler Johnson: [00:11:40] I think I have probably mentioned this on the podcast before in other contexts, but in my estimation, one of the most important articles about medicine, and as it happens about medical technology, that I've ever read was one published in the New England Journal of Medicine by Abraham Verghese in 2009. That article is entitled "Culture Shock: Patient as Icon, Icon as Patient." He never quite says this explicitly, but the context that you intuit in the article is that he had relatively recently come from a hospital in Texas that was, and this is the part that is sort of implied, presumably less quote unquote connected than Stanford Hospital was. He was brought here and given an endowed chair and the whole thing. And so then he goes to attend on the wards. And, you know, especially if you speak to older doctors, that phrase, attending on the wards, has this romantic, almost mythic quality, right? Like, that is the thing that attending physicians in an academic hospital, you can tell by the name, are there to do, whatever else they may do in terms of research or anything else. Really their raison d'etre is to attend on the wards. And what that classically meant was a person with another group of people going to visit patients in person. And I can say, having worked very closely with Doctor Verghese for a number of years when I was a resident and chief resident here, that for him, that's what it means: going to the bedside. And as he likes to say, the body of the patient becomes your textbook, and he is a master teacher of this.
Tyler Johnson: [00:13:20] But when he gets here in 2009 and goes to attend on the wards, he finds, and let's bear in mind this is 16 years ago, right, so before 16 further years of technological change, he finds that the residents, for the most part, do not want to be in the room with the patients. They want to be in a separate room, a work room, in front of a bank of computers. And in large part, the person in the bed has been, if not replaced, very nearly so, by a series of graphs that you can see on the computer of their vital signs, charts of their laboratory values, renderings of their CT scan, which, again, is a pixelated sort of representation of what is happening inside of the body, et cetera, et cetera, et cetera. To the point that many teams now never see the patients together. They do what are called card-flipping rounds, which is where you sit in the team room and you talk about the patients without the patients present, looking at digital representations of them in a bunch of different forms, and make decisions about their care. And then each member of the team may spend a few minutes in front of the actual patient later. But largely what you're doing is being in a room in front of computers. And they have actually done studies now where they outfit residents with tracking devices and see that they spend significantly more time in the computer room in front of the computer than they do in the patient's room in front of the patient. Right?
Henry Bair: [00:14:53] They need they need trackers to prove that.
Tyler Johnson: [00:14:56] Yeah. I mean, this is not a thing that anybody questions who's actually been there, but it has been proven too. I'm sure Doctor Rosen is laughing, because now we're, you know, using more mediated technology to prove what we already knew anyway in our bones. But the point is just to say that it is profoundly different, right? Like, to your point, it is almost true that the technology is not so much mediating as it is, not entirely, but in large part, replacing. I was just attending on the wards last week, and I can still tell, when I say to the team, let's go and see the patients at the bedside, there's this almost magnetic force that has to be overcome to sort of, you know, pry them away from the team room a little bit, because you can just do everything there, right? You can put in orders there. You can talk to people on the phone. And again, I'm tempted to say you can do all of the things a doctor does in the team room, except, of course, that the one thing that's not in the team room is the actual people that you're taking care of, right? It's like through the looking glass or something.
Christine Rosen: [00:16:14] Well, it's a very good example of something we're doing throughout our lives, which is turning human experience into data and then trusting that the information we glean from the data is going to be more insightful and valuable to us than the experience of being at the bedside and looking at the patient and touching the patient and talking to the patient and asking them questions and looking them in the eye. When you say, what is your pain level, maybe they cringe, or maybe they're confused, or they don't want to be honest. And you can see that if you're another human being looking into the eyes of your patient, because you have the experience and wisdom, ideally. But we tend more and more to outsource that kind of instinctual reaction, which we mistrust because, you know, my gut feeling could be wrong, and the data tells me this. They're supposed to work in concert, right? We're supposed to get the information that a blood panel would give you and combine it with what the patient says he or she feels like at the moment, and from those two interactions you can come up with better treatment. But it does concern me, the hesitation to be in the presence of another human being, especially one who's suffering and in pain and needs help. We're shrinking from those experiences all the time, and not just in medicine. We shrink from them in daily life.
Christine Rosen: [00:17:38] If you look at, you know, these terrible videos when fights break out, now everybody picks up their phone and films it. They don't intervene. They don't even call 911 with the phone. They just watch. It's like this thing is a performance, even though it's right in front of them. And that sort of dissociative way of interacting with people, whether it's at the bedside or in our own family lives, I think unfortunately this happens. You see everybody sitting around doing their own thing on a screen, even though they're all in the same room together, or even in public life. So with that way of outsourcing experience to data, I think the promise is that we'll be more insightful, we'll be more honest, we'll get these honest signals from the badges that track us, and all these promises. Some of those were true, but they cannot replace the qualitative experience, particularly, I think, when it comes to any caregiving profession. When you look at someone, you look at their full body, their full reaction, you have a conversation with them. What you glean from that experience isn't made obsolete by all the cool new tools and computer things we can do. On the contrary, that is still the bedrock, I think, of any caregiving profession, whether it's medicine, teaching, therapy. All of these require us to try to understand other human beings, meet them where they are, and then help them.
Henry Bair: [00:18:52] And to your point, you mentioned therapists there. You know, there are so many AI chatbots now that serve as pseudo-therapists. I mean, there are products being billed as actual therapist replacements.
Christine Rosen: [00:19:04] Like Woebot. W-o-e-b-o-t. That's one of them.
Henry Bair: [00:19:08] Right. And I think, against what many of us would have expected, there have been studies and surveys showing that in many cases, patients actually prefer these kinds of chatbots. And there are also head-to-head trials having patients rate empathetic responses from humans and robots, and almost invariably the robots win out. How would you respond to that?
Christine Rosen: [00:19:32] So those studies are fascinating to me, because I think they show us two things. One, we've already become quite habituated to the idea that anthropomorphizing a bot is totally fine, and I don't have as much concern about that. And I know, particularly with people who have phobias or people who are reluctant to share personal information with another human being, there's this idea that it's safer to do it with a bot. It's a false idea. It's not actually safer in terms of your medical information, your data, the things you're confessing. I mean, they're supposed to apply the same HIPAA protections and whatnot, but we don't have studies to show whether it is actually safer for one's informational confidentiality. So I think it shows this weird habituation, this anthropomorphization that we do, where we want to think of a bot as having human reactions. On the other hand, we're also very impatient with our fellow human beings now, in part because we've become habituated to having things on demand, quickly, efficiently, conveniently. And if your therapist is ten minutes late, well, that's really annoying, because my bot therapist will just respond immediately to my text message, because it's always there. It's always on. And yeah, humans are annoying. I mean, part of what makes a lot of these kinds of products appealing to people is the promise that they don't have any of the deficiencies that we do as human beings. However, if you look at how they're often deployed, if you have really good health insurance and you can afford it, most people will still want a human therapist, someone they can see on a regular basis, who sees them as a whole person, and they can have these conversations.
Christine Rosen: [00:21:02] In other countries, the UK in particular, they're farming people out to bots when you don't have that premium insurance. And this is another example where I fear that in a lot of our caregiving industries, human contact, human interaction, face-to-face responsive care, is going to become a luxury good. And that really concerns me, because if you look at the history of technology, who did we tend to experiment on with great new nifty ideas? The elderly, the poor, and the young. These are the people who tend to get tested on, you know, because they often have less autonomy in making decisions about their care. So I worry about the ease with which we assume human characteristics of what is in fact not human, and our willingness to trust non-human actors before we understand how they were designed and architected, and what the purpose is beyond making money for the company that creates them. And I worry about our collective impatience with each other, because that's actually something that, again, if you're in a caregiving profession, you know firsthand: people are difficult. Part of our job, if we offer any sort of care to others, is to practice patience, and to understand that seamless, efficient, and on demand is not something that applies when you're in a caregiving profession, even if it's the way I want to order a pizza from DoorDash.
Tyler Johnson: [00:22:21] Yeah. You know, it's very interesting to me, and this is going to sound a little bit like follow the bouncing ball, but I think there's an important point to be made here, which is that we have seen over the past, oh, 15 to 20 years, a very significant and, in the United States, unprecedented decline in participation rates in organized religion. And in many cases, if you ask people why they have removed themselves from organized religion, the answers are complicated, and a lot of very morally weighty, devastating, deeply problematic scandals had to do with that, right? I want to recognize that. Also, though, I think there is a cultural zeitgeist misfit, in the sense that we are in the era of you do you, right? There's a great cultural currency to ideas like being unique or being authentic or finding yourself, those kinds of things. And so the idea of going to a place where someone talks to you about anything normative, right, anytime anybody says, well, you should do this or you shouldn't do that, or this principle is better than that principle, that just feels kind of out of joint with the idea of you do you. Because in the world of you do you, the only thing that matters is what you want. But what is so deeply ironic to me is that at the same time, we have also seen something else, and Henry mentioned this earlier.
Tyler Johnson: [00:23:45] I would make the argument that we have, in effect, on a mass scale, willingly volunteered to become cyborgs because of our smartphones. And now we live in the age of ubiquitous social media, in a world where most people get their information from social media, where social media is increasingly dominated by algorithms, and where algorithms are dominated by a very small number of people who have an enormous amount of money and who want to accumulate more power and influence and money. There is just something, forgive me, but I feel like both myopic and ironic about saying, well, I don't want to go sit in an uncomfortable pew for an hour on Sunday and have someone tell me what to do, but, oh, by the way, I will effectively give myself an informational IV, one that I never unplug from the veins to my brain, that will drip in whatever stuff the algorithms on TikTok and Instagram and Facebook and whatever, which are virtually entirely monetarily driven, want to feed me for the rest of my life, 24/7. It's sort of like, I'm going to stand on principle and not do this thing for an hour a week, but at the same time, I'm just going to open myself up to this flood of all of this other stuff.
Christine Rosen: [00:25:01] Well, it is a fascinating thing, and a very strange irony of social media platforms in particular, that they were sold to the public, which enthusiastically embraced them, as being the place where you do you, right? You can perform your authentic self and you'll get all this praise, and maybe a little hate, but mainly praise for it. And you can curate your entire life and put it out there for people to see. But because of the way these platforms are structured, and the incentives in particular that are embedded in them, it had a strangely homogenizing effect, and people became quite conformist in their behavior. They conformed to the demands of the machine. And again, that's one of these things where, in the history of technology, we create these amazing tools thinking they will extend our powers, and they often do, but forgetting that we then conform our behavior to the demands that the machine makes. So this is one of those examples. It's like the distinction between information and knowledge. All of your doctors sitting in the room getting all that computer information, that's valuable information. But the hands-on knowledge is what will often matter in the life-or-death decision-making moment.
Christine Rosen: [00:26:06] Right? And that's wisdom. That's experience. Those are things for which there aren't easily verifiable algorithms. So we need both. And as to people not wanting to conform to the demands of institutions, particularly religious ones, ironically, the young people in this country who are suffering the least from some of the more terrible mental health crises of anxiety and depression and whatnot tend to be kids who are raised in faith communities of all sorts, because they do actually grow up in an environment that puts some boundaries on things like social media use and technology use. So it's fascinating to see that research and to realize that although we do want to have this radically individualistic way of living our lives, we also seek purpose and meaning and boundaries. And it's very difficult to find technologies that encourage the purpose seeking, the meaning seeking, and the boundaries, because everything about technology encourages us to be radically deracinated and very isolated individuals.
Henry Bair: [00:27:06] How much of how we interact with technology is up to us? What I mean by that is, I think one of the scariest things about the way technology works on us is that many times it seems to know us better than we know ourselves, with all these algorithms. And in fact, you have some of the smartest people in the world working in these large media companies, and what they've dedicated their whole careers to is figuring out how to hack the human brain, to put in front of you the products that you didn't know you wanted but actually wanted, or to, you know, maximize the amount of time you spend on some content. That does make me wonder, well, how much of my brain is really under my own control? I don't know if you've thought about that, and what the implications are for human freedom.
Christine Rosen: [00:27:50] I've thought a lot about this, actually. There's a whole bunch of technologists, one in particular, who wrote a book called Honest Signals, and his idea was, you know, the thing about being human is that we're constantly fooling ourselves. We're like, I look great in this outfit; I would never act this way. We tell ourselves these stories all the time just to get through the day, and I think that's healthy. And actually, I think that the way in which we choose to do that, and when we're called out for it or realize the error of our ways, is all part of learning how to be a human being, how to be a person. Our technologists think that we're just lying to ourselves, that lies are bad, that we need transparency, that we should have information about what we really want. Our secret urges will be revealed when you can track someone's eyes while they're reading something on a screen. Whereas, you know, I read a lot of books that friends of mine write, and I sometimes have to smile and go, oh, it's amazing, best thing you've ever written. It's the same book; this person has written, like, four versions of it. But still, I'm supportive because I care about my friend, and I'm not going to be brutally honest, because that would be hurtful.
Christine Rosen: [00:28:50] So this idea of utter transparency and honest signals and monitoring human behavior leads to a place where, imagine sitting across the table from your significant other at the end of a long day, and he or she is, you know, complaining about their day. You're tired. You just want to relax. And yet you know what you do if you love that person: you listen. You force yourself to overcome your instinct to be selfish or to tune the person out, because that is an expression of love and caring and thoughtfulness. And I think the technologists who would like us each to be wearing badges that monitor our heart rate and then send you a text going, you're really anxious right now, or, she's boring you, that's not good for humanity. Now, it's very good for people who want to design products to sell us, to make us less anxious or, you know, less angry or less annoyed. But it is not good for human relationships. And at root, we're still these evolutionarily developed creatures who have not yet caught up to some of our own tools in terms of creating new norms and healthy ways of negotiating each other's behavior. Technology is not going to solve that problem for us, however much it promises us it will.
Tyler Johnson: [00:29:59] Yeah, I have to say that one of the parts of your book that sort of terrified me was the whole section about the technologists you were just talking about, who somehow thought it was a great idea to have some sort of body-signal-reading bot that would be attached to me, and then when I went to work or to a social scene or a party or whatever, would track my heart rate and pupil dilation and whatever other body signals whenever I engaged with people, and then would in effect tell me, yeah, maybe not so much that person, or, yeah, that's a really great one, or whatever. So that the bot would be determining who I should become friends with, because in theory it's reading my signals before I can read them, and is supposed to steer me away from, you know, some future problematic relationship. Those kinds of things in your book reminded me so much of a book, I don't know if you've read it, that was published probably 15 years ago by Michael Pollan, called In Defense of Food. That book is sort of a critique of the nutrition industrial complex, right? The nutrition industrial complex is this idea that you can break all the things that we eat down into their constituent parts, and then you can say, well, you need more of this vitamin and less of this trans fat, and whatever. And there's this idea that if you just sort of optimize all of the dials, then you'll somehow have a perfect diet, right? And then you get into these unbelievably heated battles about who's right about this kind of fat or that kind of fat, you know.
Christine Rosen: [00:31:34] Good fat and bad fat. Right.
Tyler Johnson: [00:31:36] And his point is not that any of those people necessarily is right or wrong on their specific micro points, but that at some level, if you do that, the thing that you have lost in all of that discussion and all of that breakdown is food, right? Because you're just talking about the constituent parts. But the whole point of food is to take really fantastic fresh French bread and dip it in olive oil and love how it tastes, right? Or ice cream on a hot day, or a fresh-baked cookie, whatever. But if the only thing you know how to talk about anymore is the constituent parts, then you don't even experience food, right? And when I was reading those parts of your book, a lot of what I was getting is that there is what feels like, and forgive me to the technologists who do this, this almost mindless, monotonous quest to break what it is to be human down into these impoverished constituent parts. As if friendship could be captured that way, as if you could somehow add up heart rate and pupillary dilation and how much you're sweating, and by adding those things together, get what it means to be someone's friend. You can sort of get there by degrees, sort of like, you know, turning up the water so that the pot eventually boils. But if you actually look at the whole thing, it just is nonsense.
Christine Rosen: [00:33:00] Well, this is where I think the engineer's mindset clashes and suffers from assuming that it can fully understand human nature and human experience. A lot of what even our early digital technologies tried to do was open pathways, largely, to information, right, for information-seeking people. I was one of them. As a grad student, I had to gin up grants so that I could afford to travel to different archives all over the country when I was doing research. Now half of that stuff is online; I would never have to leave. But then, having the sensibility that I have, I wouldn't have made the friendships with these amazing archivists in all these places, and I wouldn't have had the opportunity to explore those cities. Even as a broke grad student, those experiences were really wonderful. So that kind of access is a general good. But if you look at what our technologists want to do now, they do see human nature as a problem to be overcome. And it's not in the sense of, let's each be our quirky, radically individualistic selves and we want to help you on that mission. It's: people tend to lie, so let's figure out how to track liars better. It's very awkward to try to find someone you like dating, so let's set up these dating sites that will actually force you, before you've even met a person in person, to have all these details about yourself and themselves, and to present yourself in a kind of performative way.
Christine Rosen: [00:34:15] I cannot tell you how many people I've interviewed over the years, because I started studying dating sites from the very early days, pre-social media dating sites, and boy, that was quite a scene. But I would interview these people, and they would say, well, we had these wonderful email exchanges, and we were really connecting over email. Then they'd meet each other in person, sit down at a table to have coffee together, and they were both like, ick. You know why? Because there was something physical that just didn't click, and that's difficult; you can't explain that to a dating site. Now, they shared a lot of interests, they liked the same things, and look, lots of people fall in love having met on dating sites. I'm not completely trashing them. But they can't capture some of these ineffable things about what it means to be human, about the weird people we find ourselves attracted to, who on paper don't look good, but you meet them in person and you're like, I like you, for some reason I don't understand. That's the stuff that I think we lose when we all embrace the engineer's mindset in all of these aspects of our lives, mainly in our private lives, too. I don't want to know what my significant other really thinks of me when I'm complaining about someone at work. I just want him to smile and nod and go, it's going to be okay. And again, that's important.
Henry Bair: [00:35:24] Yeah. Again, I have lots of friends who are engineers at Meta and Google, and hearing them talk about trying to social engineer their way into people's brains is fascinating, but also a little bit scary. And to your point that we shouldn't fall into that: you know, I don't want to romanticize this, but I do think that if most of us reflect on what makes us who we are, it has to come down to our unique vulnerabilities and imperfections and our struggles, and how we grow through them. And part of the richness of our relationships with one another is that we open up to each other about that. So it just makes me think about a future world in which machines smooth everything out and sort of cover all the imperfections and make all the inefficiencies disappear. What am I going to be like? What are my interactions going to be like?
Christine Rosen: [00:36:16] There's another part of that, too, that I think you're absolutely right to point out. Our relationships might be quicker, seamless, more efficient, flattened out. But deep relationships take time, and we're all becoming very impatient with those slow revelations over time. Whether you're a caregiver who needs to spend time with someone to really understand where they're coming from, or they present with a mysterious ailment and you've got to figure out all the different things it might not be before you can pinpoint what it is. Or if you're just meeting someone for the first time and you want them to open up a little bit about themselves: they're not going to tell you everything all at once, and if they do, that's usually a red flag that they might not have good boundaries, as we say these days. Time is important. Patience is important. We do not live in a world that values or cultivates or understands patience any longer. I have a whole chapter in the book called How We Wait, because the way we now understand waiting, and how we wait for anything, has really been transformed by the power of these tools. And technologists want to shrink time and flatten time because that suits their ends. It doesn't necessarily suit human ends.
Tyler Johnson: [00:37:23] Yeah. You know, every parent has their things they say that their kids just absolutely hate, right? And one of the ones that my kids hate the most is that anytime they complain about being bored, I say, oh, boredom is the sound of your brain waking up, right? And that drives them bananas. But it is so deeply countercultural, because for people who were born in, whatever, the twenty-teens or even the aughts, it is as if boredom categorically is not supposed to exist, right? Like, it is a failure of technology to entertain you. And that, I think, is actually deeply worrisome. And sort of along those same lines, I want to ask this question. It is sort of weird to think now, but believe it or not, for people who were watching when social media was starting to become a thing, the initial pronouncements about social media, especially about Facebook from Mark Zuckerberg, were almost bordering on utopian. And he still talks this way about the metaverse, which, don't get me started. But there was this idea, and I think this is close to the sort of early catchphrase, and maybe they still use it, I don't know, that Facebook was going to connect the world, right? And I mean, he never said this outright, but there was this sort of idea that it was going to, I don't know, it was sort of like the 21st-century version of bringing democracy to the world. There was this very utopian, almost triumphalistic notion that social media would weave together what was broken and would fix the, you know, fraying social fabric or something. So now we are far enough into the era of social media that it, especially Facebook, is virtually ubiquitous. I know it's not cool among the younger set, but even those who don't think it's cool mostly are subscribed to it. And you know, like...
Christine Rosen: [00:39:16] They own Instagram; they're all on Instagram, which is owned by Meta, yes.
Tyler Johnson: [00:39:19] Is, yeah, which is just right. So anyway. And then TikTok. But the point is everybody, almost everybody, you know, you have to be like Henry David Thoreau or something to not be on social media. And the thing that is interesting to me about it is that we're so far into this, it is ubiquitous. It is everywhere. And arguably the public health scourge of the 21st century, the 21st century equivalent of tobacco. So much so that the Surgeon General feels necessary to issue a public health warning about it is loneliness that just feels like the mic dropping, right? Like how? How do you read that?
Christine Rosen: [00:40:01] So the epidemic of loneliness is a fascinating thing. I would say, though, that we've got it slightly wrong, because I think loneliness was starting to become a problem in, let's say, the first ten years of social media use. You were very generous in calling it utopian; I found a lot of early Facebook and other social media advertising and marketing very cult-like in its sensibility. Now, however, I fear an epidemic of self-isolation, which is a little different from loneliness, because we can be alone and feel connected: we can text our friends or FaceTime our friends and be on social media with our friends and constantly entertain ourselves. As for having to sit alone by ourselves in a room and just be ourselves and sit with our feelings and understand and identify and process them: you don't have to do that anymore. You just don't have to do that anymore. My sons are Gen Z, they're 18, and it's very easy to see their distress at learning how to process regular human emotions and kind of roll our eyes and go, oh, he should know how to do this. But in fact, they don't get to practice that enough. And, you know, kids who were raised constantly being able to avoid emotion will choose self-isolation, because it makes sense to them.
Christine Rosen: [00:41:19] In a weird way, they've been habituated to that. And that's really bad, not only for their own development of a healthy sense of self, but for civic life and the social fabric, because it becomes difficult when everyone feels kind of fearful of dealing with emotion, or dealing with other people, or dealing with the difficulties of another human being. It's much easier to just retreat into that screen, or to self-isolate. So that worries me. I think it's why we do see a lot of new forms of mental health crises, not just among the young but in general: general anxiety, general impatience, a quickness to anger. One of the things I worried I was doing was only looking at the negative stuff. I'm like, well, are people really more impatient? But I started looking at things like road rage rates, which have skyrocketed, and other measures of civic calm and how they have disappeared over time. And it is concerning, because we're not giving each other the benefit of the doubt, we're not figuring out how to deal with our problems in a healthy way, and we're not figuring out what it means to be human, which is sometimes being uncomfortable, sometimes having nothing to do and being bored. Figuring out how to cope with that is part of the process of being a fully formed human being.
Tyler Johnson: [00:42:30] There's this famous passage at the beginning of Neil Postman's book, Amusing Ourselves to Death.
Christine Rosen: [00:42:36] Fantastic book. Yes.
Tyler Johnson: [00:42:37] Where he talks about the difference between 1984 and Brave New World. And, you know, for those who have read both books, 1984 is this idea that some totalitarian state is going to come in with a cudgel and a telescreen and beat us all into submission, right? The sort of immortal scene is where the protagonist in the story, who is afraid of rats, has this cage attached to the front of his face with rats in it, and in effect, the rats are going to eat his face if he doesn't succumb and start worshiping Big Brother. And of course, that is terrifying. But Neil Postman's point is that the terrifying nature of that spectacle, and the sort of in-your-face, confrontational nature of Big Brother, is precisely the reason it will never happen. Because as soon as your sort of antennae go up, as soon as your attention gets called to the fact that a totalitarian is trying to dominate you personally, or society, or whatever, then you have geared up for the fight, and then the battle is basically over before it's begun. But the contrasting case is Brave New World, and in Brave New World there is no cudgel. There is no in-your-face totalitarian state. There is instead just honey.
Christine Rosen: [00:44:03] Soma.
Tyler Johnson: [00:44:04] It's like opiates for the soul, right? And in effect, you can go to the feelies, which are these, you know, I mean, that sort of sounds like an anachronistic term, but the idea...
Christine Rosen: [00:44:17] A terrifying children's show or something, right?
Tyler Johnson: [00:44:19] Yeah, but the idea is that you just go to this thing and it just kind of lulls you, right? It's very sensory and enveloping, and it just kind of, you know, it's comforting. And then if you're ever having a problem, as you mentioned, there are literally soma dispensers, right? You just push a button on a wall, and here comes your fix of comfort and pleasure. It really does feel... I don't know if you're familiar with the work of Anna Lembke, but she's an addiction researcher here on campus, and we had her on the show, you know, two and a half years ago or something. One of the things that she talks about in her book, Dopamine Nation, is the idea that we are all addicted to dopamine, because society, in effect, has made pleasure too easily accessible, and so all of us have become sensitized to it. Our physiology has come to expect dopamine, so much of it, so easily, so often, that we don't know how to do without it anymore. And anytime we're not feeling a dopamine high, we feel as if we have been sort of abused or neglected or left behind, or are missing out, right? We feel FOMO, that everybody else is getting dopamine all the time, and so if we're ever not getting it, there's a problem.
Tyler Johnson: [00:45:37] So, and this is something you address frequently in the book, how do you read the sort of technological quest to eradicate discomfort and difficulty and suffering?
Christine Rosen: [00:45:50] I see it as a huge moral choice that we should resist. And I do use the word moral. I sound like a scolding Victorian, but these are moral choices, to live this way. And if character formation is what we do every day, that becomes habit, that becomes, you know, ways of viewing the world, states of mind. That's who we are. We are what we repeatedly do, right? So the concern I have is that we are changing in significant ways. Consider the philosophical thought experiment that Robert Nozick did years ago, called the experience machine, where he said: you could have this amazing experience, but you know you're having it because you're plugged into a machine. It's not real. You know it's not real. Would you choose to have that, or would you choose to actually have the experience in real life, rather than have the machine give you the perfect version of it? Most people said, if I knew I was plugged into a machine, it wouldn't feel real. It wouldn't be a real experience. They could make that distinction. They've tried various updated versions of the experience machine experiment with people, now giving them a pill, for example, very soma-like, and saying, well, what if you could just take a pill, and you knew that it was all generated by the pill, but you didn't have to be hooked up to a machine, and the experience would seem real, and you'd be very happy as a result?
Christine Rosen: [00:47:06] More people said yes. Yeah, a pill, that's not that bad. It didn't have to disrupt their lives like being hooked up to a machine, and they still would choose it, knowing it was a simulation, knowing it wasn't real, because it could feel real. And that seems like it's not a big shift, but it is, because humans are wired to learn from our experiences, and we have real emotional reactions when we have online digital experiences. I could make you really angry right now if I tried, and you would feel real anger. It's not that it's not real because we're having this conversation over a digital medium. But it would be very different if we were face to face, where perhaps you wouldn't be as quick to anger, because you would see that you might be mistaken in thinking I was trying to get you angry, or whatnot. And that experience would actually leave different kinds of memories than the ones we have online. So there are all these ways in which we're wired as human beings to experience the world, and we are changing, both in quality and quantity, the kinds of experiences we're having. People spend, on average, seven hours a day staring at a screen.
Christine Rosen: [00:48:11] How could that not transform the way we understand the world? It does. It has. We're seeing some negative and some positive effects. But I think what I'm asking people to think about is that it doesn't always feel like a choice, right? I tried to park the other day downtown, and I couldn't do it without an app. Like, I didn't have a choice: I either was going to get a ticket, or I was going to download the app. And I have an ancient iPhone that's like, please, no more apps, I can't do it. So again, we're building our world in a way that assumes the technological, and more and more we have to actively choose the human, the analog, the thing that we used to do. And it's becoming harder in many of these settings to do that in a way that's valued by the rest of society. And that worries me, because we are losing deeply important human values when we cast things aside that quickly and don't question what we're losing.
Tyler Johnson: [00:49:00] So I noticed something a long time ago. I will admit that when I was in medical school, when I had negative money and very little disposable income, I would see the really nice Mac laptops with the really nice screens, and man, I wished I had enough money to buy one. And for a while I did this total Mac fanboy thing where, when they would come out with a new one, I would read about it, and it would have all of these impressive-sounding numbers about pixels and brightness and all the rest, and I would think, wow, that's really cool. But the thing I noticed after a time is that almost every time you see an advertising picture of a Mac laptop, it shows the laptop open. Sometimes, if there's a creative person at the laptop, it shows them doing some creative thing. But if it's just a laptop sitting there with no person at it, it almost always has a picture of the outdoors, often specifically of Yosemite National Park, often of Half Dome. And the one that is now the default wallpaper on my laptop looks like it comes from Sequoia National Park.
Tyler Johnson: [00:50:19] Yes. And what I realized after a while was that no matter how many megapixels, no matter how many nits of brightness, what they are trying to do, without saying they're trying to do it, is sell you on the idea that buying this laptop is somehow roughly equivalent to going to Yosemite National Park, or that it at least carries a suggestion of having been there. And after a while I thought, wait a minute, shouldn't I just go to Yosemite National Park? Now, in fairness, I live in California, so I can do that. But if you don't live near Yosemite, you live near something, right? You live near trees, you live near whatever it is. So how do you think we can both recognize and push back against the often subtle but, I would argue, ubiquitous idea that technology will give us a thing that is pretty much as good as the real thing, even though it's not the real thing?
Christine Rosen: [00:51:20] Well, it is notable that these companies, Apple being, I think, the savviest at this from a marketing standpoint, co-opt the natural environment to market their products. Yosemite was the name of a previous version of the Mac operating system, and now it is Sequoia. They link their products to these ideas and images of what we know to be great examples of natural beauty, in the same way that their icon is an apple with a bite out of it and their early marketing slogan was "Think Different." The apple with the bite out of it is a reference to the Garden of Eden, to Adam and Eve. You don't get any more elemental than that, but it's also, in a weird way, a reference to good and evil. And Google's early slogan was "Don't be evil." So in a lot of ways, these companies have gone back in time to resurrect ideas that are deeply human, and then said, now let's play with it. And the way they play with it always begins with "we will give you." I see this with the Google Art Project, for example: look, you don't have to go to museums anymore.
Christine Rosen: [00:52:21] You can see all the world's art on your screen, and you can zoom in at a level of detail the human eye couldn't make out even standing in front of the work. So it begins as access, access to things you might not be able to afford to see or might not otherwise have access to. But it very quickly becomes: this is even better. This is an even better experience than the old way of doing things, which was schlepping to a museum, maybe paying for an expensive ticket, and standing with a crowd staring at a work of art. I would argue, though, that that experience, and the memory you will have of standing there and staring, really submitting yourself with humility to the vision of an artist and what they were trying to communicate to you, is what experiencing art is about, whether it's music or painting or sculpture or a comedian in live performance. When you sit at your computer in the comfort of your home and scroll around and look only at what interests you, that's a different thing. It is a qualitatively different experience. You might still learn a lot about art, but you're in control.
Christine Rosen: [00:53:20] You're not really submitting yourself to the artist's vision in quite the same way. And it is designed to give you a feeling of yourself at the center of the universe, rather than humility. If there is a through line in a lot of our amazing technological developments, it's that too many of them lack humility, and they encourage in us a sense of control that often proves to be illusory. We only realize it's illusory when we confront, as you know because you're in this field, the frailties and limitations of our physical bodies. All of us come up against that as we age and as we get sick, and that can't be controlled. So I worry about a society where our sense and understanding of what it's possible to control has grown out of proportion to actual human needs and experiences. That leads to impatience and to denial about one's own limitations, and all kinds of things flow downstream of that. So I do think it's deliberate on the part of technology companies, because they understand that humans are still human and want those human things. They also understand that if they give us a cheap imitation, or even a beautifully sophisticated imitation, of those things, we'll probably choose it.
Tyler Johnson: [00:54:28] So then, to wrap up, that leads perfectly to the last question. You make this very provocative argument at the end of the book. All of us intuitively tend toward easing friction, even if we don't know that's what we want; we want to make our lives less friction-filled. But you argue at the end of the book that we should actually resist the urge to always seek the least friction-filled solution to a problem. So explain to us, from a moral or ethical or human perspective: why?
Christine Rosen: [00:55:07] I say that not just to be a scold, or to say that we should all be uncomfortable on purpose for no good reason, because I think there actually is good reason to introduce some friction into our lives. First and most important, it requires us to slow down, and the speed and pace of life right now seems overwhelming for most people because it is. The amount of information we take in on a daily basis is more than our minds were really designed to filter. I think it's why people tend toward more tribal allegiances, because that's a kind of shortcut to understanding, and toward conspiracy theories, because really, who has the time to actually figure out what's true and what's false? Given that overwhelm, reintroducing some friction in your life makes you stop and think, whether that's something as simple as reading a book in its actual analog form rather than scrolling through summaries of it online. So friction forces us to slow down. The other thing it asks us to do, and this is important again for the caregiving professions, is to ask: is this really the best way to do this thing? I have part of a chapter in the book about handwriting, because we all assumed handwriting was obsolete: we do everything on a keyboard or by voice memo now, so who needs handwriting? It turns out handwriting is implicated in a lot more than whether you can write a grocery list, or sign your name, or read our founding documents, which children these days can't, because those were written in cursive.
Christine Rosen: [00:56:27] It is also implicated in literacy and the formation of memory. So it has all these implications for other aspects of what it means to be human, and we shouldn't just toss it aside without thinking through or understanding what we're giving up. Introducing friction means asking questions like: if we're going to make this human skill or this thing obsolete, what are the trade-offs? Because there is always a trade-off. It's not always going to be a benefit; there will be some drawbacks. We should think about what those are and then choose with the full range of options in view. Finally, friction will, I hope, cultivate virtues like patience, humility, and responsibility, things that technology doesn't valorize but that human beings do. And the values of a technology that are embedded in its architecture are often quite at odds with what we as human beings say we want for ourselves, our families, and our communities.
Tyler Johnson: [00:57:17] Much of your book is, in effect, a very detailed discussion of trade-offs, right? Mostly, until the very end, you hold off from making value judgments about which thing is better to trade off for which other thing. You just say we need to recognize that in getting this, we are giving up that. Wendell Berry likes to say we have forgotten how to subtract, and a lot of your book is trying to help us recognize the subtraction that's going on. But at the end you tip your hand a little bit and at least suggest that the trade-offs we're making are not all good, and that there's probably a lot more subtraction going on than we often recognize. But to be able to say that, at some point you have to make a normative judgment about what would be good or wouldn't. And so this is the ultimate question that I think really lies at the heart of your book, and I am genuinely curious about your answer: what are people for?
Christine Rosen: [00:58:13] People are for other people, and for developing ourselves and our skills, certainly. But we can't exist in total isolation from each other. We are here to do lots of different things, but together, not alone. And I think one of the conceits of our technology is that it turns each of us into an absolute emperor of our own lives. There are so many things we can do with the power of our devices that we forget we actually need others, and others need us, and our responsibilities and obligations to each other are a huge part of why we're on this planet. Then there are a lot of things that come after that: caring for our natural environment, building communities where more people have more opportunities to flourish, with freedom and autonomy in their choices and access to education. Those all come from understanding that we are all in it together, and a lot of the autonomy promised to us by our technologies takes us in another direction. So people are not for any particular thing, because we're not instruments, or should not be instruments, of anyone else's purpose. We are free creatures. But choosing to be human means choosing others with whom we want to live, communities that share our values. And that doesn't mean we need to be performing ourselves and in touch with everything going on around the globe all the time; there are fascinating studies showing that we're just not wired that way. We're wired to be in communities and to thrive in that way. So I think "what are people for?" is actually an engineer's question. I would ask it differently: what can we do, as people, to make life a flourishing human life for as many of us as possible?
Tyler Johnson: [01:00:05] Yeah. And I think that idea, maybe not what we are for, but that we exist to share vulnerability in a way that cultivates genuine community, lies very much at the ineffable heart of the practice of medicine. It speaks to that same impulse, and it is the reason why the act of one human touching another human, trying to be present with them and help them heal, can never be fully replaced by any number of streams of data, or bits of physiological readings, or bots, or AI, or anything else. That connection will always be at the center. Well, Christine Rosen, we really can't thank you enough. This has been a fantastic conversation, and it's been such a pleasure to have you on the program. We thank you for your life of scholarship and for this book, and we thank you so much for joining us and for your time.
Christine Rosen: [01:01:06] Thank you. It was my pleasure. Really enjoyed it.
Henry Bair: [01:01:11] Thank you for joining our conversation on this week's episode of The Doctor's Art. You can find program notes and transcripts of all episodes at thedoctorsart.com. If you enjoyed the episode, please subscribe, rate, and review our show, available for free on Spotify, Apple Podcasts, or wherever you get your podcasts.
Tyler Johnson: [01:01:30] We also encourage you to share the podcast with any friends or colleagues who you think might enjoy the program. And if you know of a doctor, patient, or anyone working in healthcare who would love to explore meaning in medicine with us on the show, feel free to leave a suggestion in the comments.
Henry Bair: [01:01:44] I'm Henry Bair.
Tyler Johnson: [01:01:45] And I'm Tyler Johnson. We hope you can join us next time. Until then, be well.