Artificial Intelligence: Effects with Humanity – Ep319

At first this episode was going to be about Acceptance, but once the conversation started talking about the current event of Grok 4 coming out, we pulled an audible and shifted topic to Artificial Intelligence’s Effects with Humanity. We look at how AI can shift human behavior as more and more people use that technology. Will humans be capable of thinking and processing what is happening in our world? Will we build relationships with AI that will replace our partners? Will society be able to adapt and learn how to use AI as an effective tool before we become too reliant on it?

Tune in to see AI’s Effects with Humanity Through a Therapist’s Eyes. 

Think about these three questions as you listen:  

  • In what ways might AI reshape how we define human identity, purpose, and connection?
  • What emotional, ethical, or psychological challenges might individuals face as AI becomes more integrated into daily life?
  • How can we ensure that AI serves humanity collectively, rather than deepening divides or displacing our sense of agency?

Links referenced during the show: 

https://www.throughatherapistseyes.com/category/podcasts/selfmanagement

Intro Music by Reid Ferguson – https://reidtferguson.com/
@reidtferguson – https://www.instagram.com/reidtferguson/
https://www.facebook.com/reidtferguson
https://open.spotify.com/artist/3isWD3wykFcLXPUmBzpJxg 

Audio Podcast Version Only 

Episode #319 Transcription 

Chris Gazdik: [00:00:00] Oh, this is Through a Therapist’s Eyes. I am your founder and this is January, July. What is this? I don’t even know. 27. No. 17. Seventh, 17th. This is July the 17th, 2025. Right boy. I’m now lined up in the right spot in 

John-Nelson Pope: the year 25. 25. 

Chris Gazdik: So we are gonna be talking today about acceptance paired with cheerleading, and I'm gonna ask you to think about these three questions while we go through and chat about acceptance. Interesting word, cheerleading, but we'll get into it.

When someone accepted you without judgment, how did that affect you? That's a powerful moment, if you've ever, hopefully, experienced it; you may not have. Secondly, have you ever received encouragement that felt too soon or dismissive of what you were going through? And then thirdly, how do you balance offering empathy with inspiring change in your own [00:01:00] relationships?

So those are some of the things that we'll kind of angle in on today. And what else do I need to say? So this is TaTE, Through a Therapist's Eyes. TaTE, I lost that battle a long time ago. I thought Ti was way better. You remember that, Neil? He shakes his head. I lost, I lost, man.

John-Nelson Pope: I, I think of potato 

Chris Gazdik: with Tate.

With Tate. Tate, potato. Why? I dunno, what is that? Like, tater tot? What, tater tot through a therapist's eyes? Where you gain insights from a panel of therapists in your home or car, but knowing this is not delivery of therapy services in any way. I have the books that are out. I do like to highlight those.

I probably don't do that enough, too humble about it. Through a Therapist's Eyes, one on marriage, one on yourself, and they're awesome books, real therapy moments that were generated into short chapters that I created, short reads that are good pieces of information. Almost like a devotional. It is, it is. I, I call it the, the toilet book.

You go to the, you go to the John, and you, you read a nice piece, and I try [00:02:00] to elevate it for you. Thank you, John. Make it like Guideposts. Appreciate your Guideposts. My grandma read Guideposts when I was a kid all the time. That was good. That was good material. They still have

John-Nelson Pope: that. They still have that.

They really do. Yeah, they do. 

Chris Gazdik: Oh, that's awesome. Hey, we provide content for you that we hope will entertain you. We definitely want to dispel myths and stereotypes and disseminate information about mental health. That's our jam. That's our passion. That's our goal, and we hope that you follow along with us.

But your job is to click subscribe, post a comment, and please, you gotta tell a friend. So this is your part in the journey, that we grow, and

John-Nelson Pope: review us. And I think if you review us on Apple, I think that’ll help increase our, our traffic. This is the first 

Chris Gazdik: time I didn’t hear you jump on Five stars. 

John-Nelson Pope: Five stars?

Chris Gazdik: Yeah. 

John-Nelson Pope: Okay. I’m sorry. 

Chris Gazdik: I know, it's the thing, I was trying to sound like Victoria. You're doing a great job. Contact at ThroughATherapistsEyes.com is where you contact with us and engage with us. This is the human emotional [00:03:00] experience, which we do endeavor to figure out together. So with all that being said, and without further ado, I have a

A, a current event that I wanna get into. And Neil, I'm really curious about your, your view on this, the, the tech guy with us here. Do you call yourself the tech, the full tech guy? Are you like a tech specialist, or just an email specialist? Neil, give, give us your qualifications, all your skills.

Neil Robinson: I've been in IT for 20-plus years, so I'm kind of, I've been all over the place, from administration to just, just emails now. So I'm more focused now than I was before. But even still, emails entrench themselves in so much. So, so, no, I'm a, I'm an IT guy is what I call myself. So

Chris Gazdik: he is the most qualified person in the room to understand this current event, John?

Yes. Mm-hmm. I would say so. Maybe, maybe not. What do you know about Grok 4, Neil?

Neil Robinson: It's the next iteration of the AI that, you know, Musk and all them created. So you have Gemini from Google, you have Copilot with [00:04:00] Microsoft, and there's Grok, which is, you know, Twitter, X, Musk's AI stuff. So just the next version, just like, you know, GPT-4 and all those.

It’s just the next iteration, bigger, better, stronger, faster, all those things. 

Chris Gazdik: So on the month in review shows, if you're catching this live or seeing this, we do at the end of the month a review of the shows that we did for that month. We have something that we call going down the rabbit hole, with, with, with Adam, who I see. A Jefferson

John-Nelson Pope: Starship was Wonder.

Chris Gazdik: I have gone down major Jefferson Starship rabbit holes with this lately. I have found YouTube as a wonderful way to fall asleep to, which is very bad, because it keeps me up with interesting things. So I have become alarmed, concerned, fascinated, and just in awe of what we are literally experiencing now. I got the clickbait stuff on YouTube, probably, is what gets me going a little bit with this, but this is kind of like a big deal.

Excuse me. So yes, Neil, have you, have you looked at Grok 4? Or you [00:05:00] know much more about it than that? Because as I've watched Elon's presentation of the progress with their version of AI, and I was like, kind of just like in awe of what he's describing

John-Nelson Pope: now. Now, and catch me up on this a little bit because the, one of the things that I read and I did not get this from AI, was the large language models that are used for this and that.

Right. It is kind of, there's, it can appear as sentient when in fact it's not. Not at all that way, in that,

Chris Gazdik: yeah. Let me catch you up a little bit on this, and it's a, it's a brief 101 from a therapist, so take that for what it's worth. But I have gone down a rabbit hole, and I've learned about this a little bit.

Okay, so basically you got computers. Alright, great. We all know that. We all know how they work. We all use ’em pretty much. Really, you know, for most everything we do. Laptops and phones are basically little computers. So we have computing and we have strong computing. But the progression of this stuff has happened so fast.

[00:06:00] So this big issue with AI is, for about 20, 30 years, we've said we're about 10, 15 years away from AI, artificial intelligence. But there's different forms, different facets, different progressions, and, as you say, Neil, iterations of how this works. The most recent AI that we've gotten are these large language models.

Everyone knows ChatGPT, Gemini, there's multiple models, and it's come on the market like that fast. With all these, once you got one, you got like five, right? Mm-hmm. It's kind of wild. Well, artificial intelligence is very broad. You have what we know of as pretty basic, super hopped-up Google search AI. It's amazing.

They could talk to you, they could take a look at pictures and tell you about it, but we're moving rather rapidly to artificial, what is it called? Just blanked. Artificial general intelligence. Mm-hmm. AGI. Mm-hmm. That is not a large language model. That is a strong progression where the, [00:07:00] the computer is not doing what it currently does right now.

It reflects what you give it, and it helps to give you information that it reads on the internet and it basically can read everything on the internet all at once, all at once. So it’s an amazing resource. Mm-hmm. I use it in various ways now. It makes humans way more productive. Artificial general intelligence doesn’t even need the internet.

It begins to process and think for itself. And then you get artificial super intelligence where it’s the super being kind of. So 

John-Nelson Pope: we’re Skynet. It’s amazing. 

Chris Gazdik: We are moving, uh-huh, to Skynet. My neighbor has made the joke that that is not a movie, it is a documentary. Mm-hmm. And it is looking like that. So what just dropped with Grok 4?

And there's a whole point to this, so don't tune out, 'cause it's psychological, emotional reality here, big time. Grok 4 is Elon Musk's, like Neil just described, next iteration of kind of the same stuff, [00:08:00] although it's super, super hopped up. Mm-hmm. It has gotten now, and it's already dropped, like last week.

Brand new stuff. And it will take any test in academia. John, you are a pretty high-level expert, frankly. Mm-hmm. In our psychology field, you really are. And I mean, that is a compliment. Thank you. This thing would smoke you. Smoke you in everything that you know, and you can get your colleagues to come around and try to take an SAT, and it will outperform you.

John-Nelson Pope: What is scary though is that it comes up, maybe I'm wrong, but it maybe manufactures references and articles, journal articles. You're talking

Chris Gazdik: about hallucinations, and that, that's a factor where we have imperfections. Mm-hmm. But right now, we have just dropped where they have decreased hallucinations to like 5% or 4% of the stuff, and it's not going to do anything but get better.

But right now, it will take, Grok 4, G-R-O-K, from Elon Musk's X group or whatever, [00:09:00] any test you wanna take in all disciplines, bar none, way better, mm-hmm, on performance than any human could imagine being possible. Mm-hmm. It, it, it basically knows everything.

John-Nelson Pope: So, so 

Chris Gazdik: we're obsolete. We are becoming obsolete as working professionals, and that's just with Grok 4's large language model, a super strong large language model.

Mm-hmm. When we get AGI, which is really, really on the brink, mm-hmm, within, they're talking about this year, like, this being, now, this is clickbait from YouTube, mind you, but this is the last normal year of functioning, when we're going to have computers in 2026 or 2027 that are creating new alloys, creating new science, making new physics, identifying new physics laws, getting into humanity and the humanities.

On high level it is going to [00:10:00] revolutionize all industries. Part of that also is 

John-Nelson Pope: quantum, 

Chris Gazdik: Computing, quantum computing, quantum physics, physics, all disciplines. This is going crazy-level high. And when you get artificial general intelligence, the step to super intelligence is super fast, because there's an arc that goes really fast in development, a hockey stick. You see this going from radios to microwaves to TVs to, you know, the internet.

John-Nelson Pope: Well, it goes back to 1947 to Roswell, New Mexico to the crash.

Chris Gazdik: Trying to be all serious, Neil, and he is blowing this up.

John-Nelson Pope: Well, no, I'm, I am. It does though, to a certain extent. Yeah, maybe we got an

Chris Gazdik: alien craft. But here’s my thing, John, as you think about the implications of all of this, we need to be a little bit brief, but, ’cause we could do a whole show on this. We’ve done that before.

Maybe we do that. And we just changed gears, but. What are the emotional [00:11:00] impacts of this? I mean, I, I think the whole purpose 

John-Nelson Pope: We have to have meaning and purpose in our lives, right? I think, in other words, we're gonna be put into the zoo to be observed. I mean, it's, it's basically, what I'm hearing is that we're, or what I'm imagining is, that if we don't have a purpose, how do we progress?

How do we live? How do we, how do we function? 

Chris Gazdik: So, of all people, I heard Joe Rogan and Bernie Sanders in a dynamic conversation I would refer you to, honestly. 

John-Nelson Pope: Mm-hmm. 

Chris Gazdik: However you feel about Bernie Sanders or whatever. I'm not a big, big fan of either of those people, or Joe, yeah, or Joe for that matter as well. Yes. The, the reality of it is, they had a conversation about

the realities of this progressing. For instance, right now, real time, we're not talking in the future. Mm-hmm. There really isn't a need for trucking. Trucking [00:12:00] is now automated. Mm-hmm. A computer runs the 24-hour trucking industry, and when companies begin to see this, and it begins to progress and become more safe and all that, this year, there is no job for truckers, for example.

There's no need for it. We are much more profitable paying a computer to drive the truck 24 hours a day, wherever we wanna send it, than we are paying people, loaded up, with their limited ability. Computers, robots, loading, driving trains, driving planes, driving boats, shipping. That's just one industry.

Mm-hmm. They’re coming for us, John. 

John-Nelson Pope: Right. You know, there was a Saturday Night Live skit about having, and it was a, a funeral that was being held, and they were talking about progression in, in the funeral. And so the minister was up there, and it talked about, you know, the minister had to get called away.

And so he had, he had a little [00:13:00] robot or his little speaker that had a recording and then they had in the, in the congregation or in this, in the sanctuary of, during this funeral there were people. And then the next it cuts away and then there there’s a bunch of little boxes of little speakers that are there and people are listening.

This was like 35 years ago or 40 years ago. Okay. Alright. Yeah. So in other words, we’ve imagined this, we’ve imagined this already. It seems to me that it’s here. It’s here, it’s here. And so it’s gonna be revolutionary, evidently. And of course there’s the, the warfare now done by drones, basically. Drone, yeah.

The 

Chris Gazdik: psychological implications are what I'm after. So we're gonna make a decision here in a couple minutes: do we stay and just riff on this topic and change the title for the show, which we can do. Before we make that decision, Neil, anything that I said, was I on point? The tech brain that you have, and things that you know about?

I mean, I’m sure that you’re dialed into some extent. How did I do? [00:14:00] 

Neil Robinson: I think you did pretty good. I think the one thing you have to under, you have to think about too, is the more we get AI involved in our life, the stupider we get. And I'll just be blunt about that one, for real. I mean, I've, I've worked with a, I have a friend of mine who did his own company; he's now a CTO somewhere.

He is really doing really, really well. What you find is that the more you start relying on the robots, the ai, the less likely you actually participate and think and use your own brain. So, and I 

John-Nelson Pope: would agree with you. I would agree with you. I’m sorry, go ahead. No, no, 

Neil Robinson: no, go ahead. No, 

John-Nelson Pope: no, no, no. There’s some empirical studies already of, of, of students that have been writing papers with AI and they can’t recall within five minutes what they’ve supposedly wrote.

Yeah. Academia is crazy. Well, there, there, 

Neil Robinson: there's another part that I, I saw, a different study too, where they talked about the, they had people, like, they used ChatGPT, and they had told 'em to write something, a paper, I don't remember exactly what it was. Mm-hmm. And those that had used ChatGPT wrote worse.

But then they [00:15:00] also did a thing where, like, they had someone, say, they used ChatGPT to create prompts. And they did better writing when ChatGPT gave them prompts. But the next day, like, or the next time they tried to do it, because they didn't have ChatGPT giving them the prompts, they couldn't even figure out a topic to write.

So 

John-Nelson Pope: Jesus, 

Neil Robinson: it was, so it goes back to the same thing. It’s well, and, 

John-Nelson Pope: and that's another thing, is somebody tries to fool the professor and, or the teacher, because there's no grammatical errors, except perhaps kind of a wordiness, how the words sound. Right. But the trick is to put in deliberate grammatical errors, yeah,

To make it more in my line, which 

Chris Gazdik: my dictation is terrible to make me. Right. So that’s funny. 

Neil Robinson: So going back to the thing, the, the concern is not that AI's getting smarter. Which, I've also seen something where AI has actually put, like, a programmer will ask the AI to create, like, code for something they need to do.

Well, they looked at the code deeper, and they found, like, a backdoor for the AI to get back in later, or something, so it doesn't get deleted. Like, there's little [00:16:00] stuff that's starting to get put in there. But, so, normally, is it by, by the computer? By the computer, right? Yeah. Because it just says, gimme this code for this system or this thing, and if you don't review it, they just would put those, it just somehow put that thing in there.

Right. Now the question is, once again, if AI gets smarter or better, and we don't advance or stay up with it too, we get dumber and dumber. That becomes a risk for disaster, where the technology can take over.

John-Nelson Pope: Right? That was, that was a Star Trek thing. Episode. Episode, yeah. Yeah. And that was in the original series.

But, and I, I think that, for me, I, I, well, I listened to Mike Rowe. Have, you know, Dirty Jobs? No, no. The guy that does Dirty Jobs. Mm-hmm. Just, he had, it was on Discovery Channel. Okay. Science Channel, several years. And he, they do all sorts of, of physical work, that sort of thing, and people doing jobs that people don't like to do.

And Mike Rowe said, [00:17:00] you know there’s gonna be a, a need for more skilled work to be able to do the mining and things of this sort, and that coders and people that do coding are, are gonna be obsolete. And so 

Chris Gazdik: we need to make a decision. We’re already 17 minutes in, and I think that we can maybe riff on this topic and stay with it because I, let’s riff, you know, it’s, it’s unprepared, but real and genuine in the moment.

Is this, do we wanna stay with this, or do we wanna get to acceptance? I'm cool with it.

Neil Robinson: I’m 

John-Nelson Pope: cool doing this. Do you wanna, 

Neil Robinson: I, I don’t know. Do you wanna do some more research and come back with it? Because I think there’s a lot more to this that while we can, we can riff on it. Do we really have the backing behind some of the stuff to figure out the next steps?

Yes. 

John-Nelson Pope: Spoken like a true computer.

Chris Gazdik: Yes. Yes, we do, Neil. A true person. Yes, we do, Neil. And the reason why I say that is because we don't have any research on the psychological and emotional impacts of this. There's nothing. This is, this is a complete novel idea. [00:18:00] This is a reality that's going, well, it's kind of

Neil Robinson: Think about, think about social media. What, 20 years ago, when it came out, right? Those first couple years. Right. That craziness. And now we're starting to see some of the impact now. So it's that. So we're in that same phase. Mm-hmm. Mm-hmm. So social media 20 years ago, AI now,

Chris Gazdik: and it, but it’s so fast.

That’s, that’s what’s tripping my brain out about this topic. 

John-Nelson Pope: Well, okay now, and I know we need to, to, to move on. Well, 

Chris Gazdik: I think I wanna stay with this ’cause we’re already pretty far in. 

John-Nelson Pope: Okay. Here’s, here’s an observation. I just had two of my clients yesterday, and both of them talked about they can’t keep up.

No. They do social media and they're always anxious. They cannot not be anxious. And they say, I don't have time to rest. I don't have time to, to, to just take a, a deep breath. And, and they get drawn back into doing the social media. And one of the [00:19:00] issues is that, that they have had issues with, with their spouses, and they've engaged in sexting and, and that sort of thing, and, and have emotional affairs. And it's like they're just at the end of their tether, at the end of the rope.

Chris Gazdik: Yeah. I mean, John, I, this is where, I mean, I get fascinated with this computing progression. I get fascinated with, you know, tech a little bit.

I, I don't, I'm not good at it, but, but I like to think about it, and I've been watching it.

John-Nelson Pope: You wax philosophical about it.

Chris Gazdik: I am very philosophical about all of this, because my goodness, this is profound. Yes. And we see it in our therapy office. The impacts of social media, the best way to talk about it is, they're harried.

John-Nelson Pope: That's, that's a good way, for me, to describe them. They're harried. What do you mean? They're, they're always hurried. They always feel like they have things left undone. They [00:20:00] always, they feel like they have, they, they, they're not on solid ground. They feel like everything's shifting and changing so fast, and they can't keep up.

And that’s why they’re on medication. That’s why they’re on they just don’t have the time to restore themselves to get the rest, to be balanced, to 

Chris Gazdik: balance, to be balanced, to have boundaries, to be grounded, to have mindfulness. Mm-hmm. To use some of our clinical words. Right.

Right. How do you have mindfulness in the light of something that's completely, profoundly changing the humanity experience? Change the way, right? Yeah. Change you fundamentally.

John-Nelson Pope: Yeah. And change, you know, what, what makes us human. An issue that I have is that, if we lose our ability to, to be creative, and our ability, because Neil has said it makes, makes us dumber in some ways,[00:21:00]

if we lose what is the essence of our purpose, we've lost, we've lost a, a fundamental component

Chris Gazdik: that we always, permanent word, mm-hmm, have always had. Mm-hmm. So in this conversation with Joe Rogan and Bernie Sanders, they were talking about the trucking stuff and the different factors. And, you know, I mean, Bernie's thing is, how, how can we manage the technology advances to better humanity, to help us, to serve us?

And one of the things that's floated is, like, 20-hour work weeks. You know, you're so much more productive with AI and whatnot, and you just, there's not, they're doing, they're gonna do the jobs. Mm-hmm. So, but we still need humans to do it and to work with and whatnot. But maybe we literally have decreased hours, to where, you know, can you remember when we actually worked 40, 50 hours a week?

That's what we're gonna be saying in 10 years. Yeah. Even five years. Because the, the revolutionary changes are really, like, rapidly hitting us.

John-Nelson Pope: Do you think in [00:22:00] terms of people that are hardwired to be ADHD?

Chris Gazdik: For example. 

John-Nelson Pope: Okay. They’re 

Chris Gazdik: gonna have problems. Golly. Yeah. I didn't think about that, John. Yeah. But that's the, that's some of the nuance of what, what we're facing. They're

John-Nelson Pope: gonna, excuse my language, be bored as hell.

Yeah. Completely bored, restless. Are we gonna see an increase, an uptick, in crime and violence? Or are we gonna have to have some sort of soma, some sort of medicine that would decrease that, that overweening anxiety that's generated?

Chris Gazdik: But think about the reality. So what, John, what Joe and Bernie weren't able to get to, is any kind of inclusivity about how do we manage our passion, our purpose?

Mm-hmm. How do we manage, you know, what we've always had working, our identity. Mm-hmm. If we don't have to work, there's actually floated ideas of universal income, uh-huh, so that we all, you know, get corporation money.

John-Nelson Pope: That’s Bernie, that’s, yeah. That’s been floating around for, it 

Chris Gazdik: has, but now it has a reason.

It doesn’t [00:23:00] sound crazy if the context is, there are no jobs, right? There’s no need for teachers, there’s no need for doctors, there’s no need for lawyers. You’re gonna go to a financial consultant when you have a superpower computer to tell you what investments are best. Uhhuh. I mean, this is, this is what I mean when I think about society and societal ramifications.

I mean, there's, there's no more power expense, because we have figured out how to do fusion, which we've been elusively looking for. And all the different, I would like to see, see us do

John-Nelson Pope: fusion. Yeah, I would like to see that happen. But I think so far it's eluded us. Well, the

Chris Gazdik: computers literally, John, in two years, when we graduate to this general intelligence, now

John-Nelson Pope: you now you come up with these numbers and I’m, I’m challenging you a little bit on this.

Okay. So you come up with these numbers. In two years? Two years? We've been talking about cold fusion, or just regular fusion, yeah, for the last 30, 40 years, since the [00:24:00] late 1980s. Yeah. And it's some sort of a process, and I know that you'd have computing models that can, you could do this, but you have to have the amount of heat, develop some sort of device, engineer it using AI, for example, that would be able to make a containment area to put the plasma in, right?

To contain the plasma, all of that with the, the computer. And so it'd have to have those design specs and things. I think you still have to have people.

Chris Gazdik: You definitely still have to have people, but I think what I'm getting at, John, is the idea that, much more rapidly than what I think hardly any human realizes, it seems, to be honest,

Mm-hmm. It is here. Mm-hmm. It is already being experienced, with everyone's experience with ChatGPT, which is basic, guys. It's so simple. This is not complex computing, ChatGPT. It's very, very simple compared to what's [00:25:00] about to hit.

John-Nelson Pope: Okay. Here's another thing. All these computer farms, these farms that use all this energy.

I don't hear people on the, on the far left, for example, complaining about the environment, or Greenpeace complaining about the energy use that would cause global warming and heating. Okay. Because it's going to produce a lot of heating in our environment. So is there a point where, am I wrong on that, Neil?

Neil Robinson: No can 

John-Nelson Pope: go.

Okay. So he just grabbed the mic. He's not, he's not, he's not jumping at you. So, so my question is, we, we could be hypocritical about this as well, you know, in terms of embracing a technology that might actually be our undoing. Yeah, yeah. That's like,

Chris Gazdik: legit for real. What, what are you thinking, Neil?

Neil Robinson: Well, I just, I think it goes back to, when you talk about, you know, Greenpeace and those, I mean, they're, to me, those [00:26:00] people that are all about the environment, they're looking at, you know, fossil fuels and coal and the pollutants, right? They don't see electricity, and those types of sources, they don't see those as a problem.

They’re just, I don’t think they’ve gotten that, they’ve gotten that bone yet. They’re not biting on it. They just don’t think about that yet. But no, a hundred percent I’ve worked in data centers. Mm-hmm. And the amount of energy it takes to keep that thing cool, not only to keep it cool, but also to run the servers and those pieces, that is a huge drain on the environment.

Plus then you add in all the EV cars that come into it. So until you really fully build some sort of a fusion-slash-nuclear power that can be renewable, or doesn't diminish, it's going to take a toll on the environment. It's gonna take a toll on our, our infrastructure. It's one of those things that we're not, we're not there yet.

Our, I think AI can happen fast. I don't think our human infrastructure can keep up with it. And so we're gonna be the bottleneck, because trying to maintain an energy source, trying to build infrastructure to handle that much stuff through it, I think that's gonna be the biggest bottleneck.

John-Nelson Pope: Okay.

Another emotional thing, just moving to a different area, is, is in terms of human engineering, bio, human bioengineering, and that is, Neuralink is here, that we wanna get rid of the, we wanna be able to, to, to genetically engineer people. Now you could do that and use those large AI models, or generalized, whatever.

Chris Gazdik: I have the understanding, not in America, but across the world, people are already doing that. Did you know that? 

John-Nelson Pope: Well, I know the Chinese one, one rogue Chinese doctor did that. Mm-hmm. A researcher, and, and I don't think he's the only one. Yeah. Oh, no. Right. But I mean, I'm talking about, let's say, profound genetic diseases that, oh yeah, I mean, that's the, not just, just choosing sex and things.

Chris Gazdik: This is the developing reality. I mean, with Neuralink, you know, we literally have, have cured. And the next is blindness. They’re, they’re [00:28:00] gonna be doing chips for blindness and they’ve done chips for, for I know you’ve 

John-Nelson Pope: really fallen into the hole.

Chris Gazdik: Huh? You’ve 

John-Nelson Pope: really fallen into the 

Chris Gazdik: hole. I’m deep, man. I told you. Yeah. Yeah. I mean, it’s making me think about all the emotional dynamics though. I mean, here, here’s another thing. You’re 

John-Nelson Pope: talking about what is possible, and yet you can, you can say, and they might actually even have the science to develop it, but we might not. And I think this is what he was talking about, what Neil was talking about, is actually having the infrastructure that would be able to produce the machines, mm-hmm,

or the computers that would be able to do this. For example, with the, with vision, and restoring vision in people. I, I think that's, it would be a wonderful thing. It's lovely, but as long as the AI would know that people are the most [00:29:00] important elements. Then, I think, he was telling them that.

Yeah, yeah, yeah. You got robot. 

Neil Robinson: I don't know if I have any. I just noticed that you were talking about, you know, genetics and all that stuff. They, they've recently, and this just came out, like, the last day or two, they actually produced a baby using three, three people's genes. That just came out. And supposedly, the way that they do it, it helps prevent hereditary diseases or something strange.

So it’s like, it’s not mitochondrial diseases, right. Mitochondrial diseases. Exactly. So it’s kind of an interesting thing with the science and, and those pieces that that’s kind of where we’re now stepping involved in the 

John-Nelson Pope: Can you imagine if, with electron microscopes, they could see a patent on a mitochondrion? One that has been engineered, bio-engineered, that would say, okay, Grok... or, what are you, Grok?

Chris Gazdik: Designed by? Yeah. Designed by Grok for Gen X. Yeah. Or listen, here’s why I think [00:30:00] this is so fascinating, in so many ways. I can’t tell you in the history of my life how many times I’ve had such a dynamic conversation with people, when I talk about this as an issue, where I hear things like: that just dropped this week.

This just happened yesterday. This was something that occurred that they’re gonna announce tomorrow. Like, that is super fast. Well, it used to be more like, oh, about a year ago... did you know what happened on the TV program there? That was the way we used to talk.

Okay. 1903. 

John-Nelson Pope: 1903. What happened? 1903. 

Chris Gazdik: Was that when TV came? No. No, that’s the Wright Brothers; TV’s too early. The Wright Brothers flew. Then it was 1969. Yeah. We went to the moon. Here’s the technology spin that sparked... Now

John-Nelson Pope: we, we went on a dogleg, or a dog trail, because instead of going out into space, which we probably should have [00:31:00] done and stayed out there, we went counter to our nature, which is to explore, and we have kind of wrought the fruits of this horrible dilemma.

And I say horrible not in a bad way, as much as one where we are not prepared, psychologically, spiritually, religion-wise, or even in terms of our institutions, to talk about the possibility that we’re no longer gonna be needed. We’re not gonna die by nuclear war. We’re gonna be made extinct by our AI overlords,

Chris Gazdik: essentially.

Right? Yeah. So here’s, here’s where I feel like this is primarily one of the reasons why I just wanted to pull an audible and stay with this topic, John. I think the world is going to look at people like, maybe, you and me. [00:32:00] Mm-hmm. Yeah. For how we handle this emotionally. What I mean by that is: our field needs to be progressively in this discussion, because I don’t know the technological pieces. I’m not smart enough to understand mitochondria. I’m a geeky guy who likes weird science and stuff, and I like to check this out and whatever, but that is not my professional expertise. But emotions are. Psychology is managing emotional realities.

In light of this, people are gonna be asking us. Yeah. What are we gonna tell ’em? How do we cope?

John-Nelson Pope: I think we could get a job as therapists to the AI intelligences when they start having emotional breakdowns.

Chris Gazdik: Now, is that a joke or is that a serious statement? Both. We’re gonna be the AI’s therapist because they’re breaking down.

Yeah, I can see that 

Neil Robinson: There’s two things that I kind of thought about that could be very hard for society to deal with. Either one: AI comes [00:33:00] in, and you program it to be whatever you think you want it to be. Right. You know, the guys who fall in love with the AI chatbots, ’cause they say everything, all the right things, they, you know, whatever. It always looks right. Mm-hmm.

Now the other side of this is the opposite direction. AI doesn’t have emotions, empathy, compassion. What if AI tells it how it is, or what it thinks it is, and it doesn’t sugarcoat it, doesn’t give compassion, doesn’t do a lot of that stuff?

That kind of goes back to the

Chris Gazdik: meaning. It doesn’t demonstrate the human qualities of emotion.

Neil Robinson: It doesn’t try to sugarcoat. Like, oh, you know, giving you the whole alcohol disorder, you know, whatever, alcohol abuse disorder language, versus... Yeah.

Chris Gazdik: You’re an alcoholic, dude.

Neil Robinson: Right. Right. So that’s the question, right? Either it’s gonna go one direction, where you build a relationship with this AI because it does everything you think you want. And we obviously don’t want that in relationships. If you ever have a wife or a girlfriend, boyfriend, husband, you don’t like them always telling you, yes, I’ll do that, and you just have no, [00:34:00] no riffs, no anything. That’s a boring life. If you’re not pushed back by your significant other, it’s gonna be boring. So AI’s gonna go one way, where everyone gets whatever they want. How boring is that gonna be? The other side is, what if you ask AI questions about something you’re having an issue with, and it comes back and tells you, well, you’re an alcoholic? It doesn’t have the compassion, or build a relationship to say, here, let’s build trust, and then, oh, by the way, you’re an alcoholic.

John-Nelson Pope: So it can’t be a cheerleader. It can’t be the one that encourages, which was our original topic tonight.

Neil Robinson: So those are the two ways I think this could go. Either everyone gets to the point where they expect AI to do everything for them that they want, and they don’t know how to deal with conflicts, or AI goes the other way and basically has no filter and just tells you what it thinks.

Chris Gazdik: But Neil, I think that’s honestly what I’m getting at, from my limited tech knowledge and ability. Even in the context of our conversation, I [00:35:00] feel like we don’t really have a context of understanding what we’re trying to deal with, because, and I think I’m tracking with you here, right now all that human beings have ever experienced is the first version.

You know, it mirrors back at us. It tells us what we want to hear. It is designed to do nothing but that. Like, like we were talking, John, with your voice in writing something. Mm-hmm. You know, it doesn’t know your voice unless you have spoken to it and it has read your words; then it knows you, and it can give it back to you, but it can’t do that beforehand.

Yeah, okay. That’s changing, that’s flipping on its lid, because there is no more of that really, really soon. After we get artificial general intelligence, it is going to be doing whatever it thinks it needs to do, Neil, and it’s not gonna have the emotional things. But more importantly, [00:36:00] we do have our emotions, and we’re interacting with this whole platform that is literally not alive but going to be creating new thoughts.

It’s no longer just mirroring you. When we get to that next level, six months or a year from now is what they’re talking about, it is dropping. Yeah. Okay. I know you think I’m a little crazy, John. No, you’re not crazy. But,

Neil Robinson: But I think it goes back, too, when you look at the logical side of AI. You think about, you know, the story of the people who are brain-dead and you have to pull the plug.

AI would look at that saying, oh, well, they have these vital organs, there’s people on the donor list, you should just pull the plug, because the percentage of that person coming out of it is 0.000-whatever. No compassion, just logic, right? Yeah. There’s always that risk. So that’s the thing you have to look at.

And so while there’s, you know, the artificial general intelligence, it still has to have some sort of a guiding point to say, [00:37:00] what is the purpose of what I’m saying? Is it about compassion? Is it about logic? Right? Where does it balance the two pieces? So the concern is, with this artificial general intelligence, what is it gonna be factoring its decisions on, right? As a general intelligence, what’s the compass that guides it to say what’s right or wrong? See, you know how I’m gonna fight all this?

John-Nelson Pope: I’ll have one in the basement. I’m, I’m gonna upload my consciousness into the AI, and then I’ll be able to live forever.

Live forever, and wreak havoc.

Chris Gazdik: You know, that’s a ways off probably, but maybe not so much. Let, let me, let me focus us in around our human experience. Yeah. Rather than what the ethics are and the computing and whatnot. ’cause I, I don’t know, I’d just like to center us in on how do, how are we gonna cope with this?

How do we kind of manage the changes that [00:38:00] are essentially here? For instance, what is our reaction as humans? Are we gonna fall in love with robots? Well, right. Do we feel more lonely or more connected? We have a little bit of the answer to that with social media. How do we struggle, or how do we manage our own identity, if our jobs are changing? That’s what I’m getting at.

John-Nelson Pope: You know, I’m, I’m gonna go back to classical Rome. I think I love this. Go ahead. Okay. Well, they became prosperous after the Second and Third Punic Wars. Rome became very prosperous, and there was a minority, which were the nobility, or the patricians, and they could buy anything.

They could do anything. And we think of Elon Musk as the richest man in the world. There were people that would just blow [00:39:00] him outta the water, relatively speaking, even though they didn’t have the technology. Wait. In Roman times, they would blow Elon

Chris Gazdik: out. What? I follow what you’re

John-Nelson Pope: saying is, in terms of comparative wealth.

Okay. So... oh, you mean they were wealthier? Wealthier, financially. Okay. But they could do anything. They lost purpose. They lost direction. And so did they... tell me more about that. So there was a sense of losing the edge. In other words, everything was provided for them. And we’re talking about the patricians, ’cause they were wealthy. Romans, wealthy. Well, they had slaves to do everything. And so I’m looking at computers. Wow. Yeah. So our slavery was different, obviously, to our detriment, in the United States, in America, yeah, and in the world during the 19th century. Yeah. But the way that slavery was done in Rome is that slaves would basically run businesses for the [00:40:00] emperors and for the senators and the equestrians and all the folks that were there, the wealthy merchants.

And so everything was done for them, and it was basically inherited wealth. But people lost their direction, their purpose in life. And so I’m thinking it would be such a detriment to us as a civilization where... lemme jump in. We all have income and all this stuff.

Yeah. 

Chris Gazdik: I wanna jump in and come back to you, ’cause what I’m hearing you say, and then I have a question, is that, interestingly, and your experience of history, John, is amazing, so I love that you’re going here. So, in Roman times, mm-hmm, a certain group of people, probably a large group of people, did really, really well, got really, really wealthy, and had people doing pretty much everything for them, so that their identity was really markedly different.

Yeah. And so is this [00:41:00] history repeating itself? My question is going to...

John-Nelson Pope: Technology’s different, but history, it repeats. It resembles, it resembles. So what happened? What happened with these rich Romans and stuff? Right? 476, Rome fell. Rome inexorably declined and lost its moral direction.

When Christianity became the state religion, people were blaming Christianity for it. No, what Christianity did was actually give people a purpose again, but it was too late. The empire had gotten old, the institutions could not be revived, and everything had been predicated on a slave economy. A mercantile economy was basically ripe for the taking. And so the barbarians came from the north. And the reason why they call ’em barbarians is from a Greek word; they would babble, bar-bar, babble. Yeah. Yeah. So when the Goths came down from Germany, the Germanic people, they just took over.

Basically. They lost the ability to fight. 

Chris Gazdik: So do I hear you saying that humans in a similar situation, in that time period that resembles some of what we’re talking about, really just couldn’t cope?

John-Nelson Pope: Mm-hmm. Lost, 

Chris Gazdik: lost their moral compass, lost their functioning. They lost their purpose. 

John-Nelson Pope: And, and their purpose.

Right? Yeah. And, and, and that’s the big thing. I keep hitting on this. Okay. I’m gonna, I’m gonna 

Chris Gazdik: really challenge your brain with this. Okay. All right. What can we learn from that and apply today? Wow. Go. Okay. 

John-Nelson Pope: Well, the way I get it from you, from the rabbit hole, okay, is that it’s too late already.

Ooh. That’s not what I’m saying. Alright. Well, I heard that. Okay. I might be... what was I hearing? I heard, I heard [00:43:00] fear. Okay. We can, I think, reclaim some of the reins and put in guardrails, so that we would be able to say this is what it means to be human. And we need to be up to the task and say, this is what it means to be human.

And that these tools are to serve us; we are not to serve them. And that these tools are to help improve life and make life better for people, and that people are gonna be the highest value; that we don’t, you know, acquiesce to their position or whatever. Yeah. Good words, them sort of thing.

Yeah. And that is, we also realize, and this is one thing that I was thinking about with you, and I’m challenging again, like Rome, okay: you’re saying [00:44:00] this as though it’s the entire world. There are 8 billion people on this planet, of which 1 billion people, perhaps not even that, maybe 500 million people, are privileged.

We are... it’s probably less than that. We’re privileged. Yeah. We’re the ones that are privileged. Mm-hmm. Okay. And so we’re looking at this as a global problem, when in fact, maybe... oh wow. It’s people like, like the Romans. Like the Romans. Really just the leadership, like the Romans. Yeah.

Chris Gazdik: Yeah. And, you know, so this is not a developing-world problem. That’s a little of what you’re saying. That’s what I’m

John-Nelson Pope: saying, except if we have those data farms that work and make so much heat that it might affect our ecology.

Chris Gazdik: You know, let me just be genuine for a moment and see what it is that I am feeling about this. It’s interesting that you say you felt fear.

I think I feel [00:45:00] fear, if that’s the word, about people being fearful.

John-Nelson Pope: Mm-hmm. 

Chris Gazdik: And, and wouldn’t it be awesome if we could get in front of that and embrace this stuff so that people aren’t tripping out when dramatic changes are going to start happening. ’cause these dramatic changes are gonna start happening really fast.

Yeah. And our psychology cannot keep up with even what we already know as social media.

John-Nelson Pope: Our closest relatives are the untouched tribes in the Amazon rainforest. We have the same brain that they have, and when they’re brought into the 21st century, it’s overwhelming.

Chris Gazdik: They can’t handle it. Right.

You literally, and we know that when something like that sort of overwhelm happens, literally people become psychotic. Mm-hmm. They become delusional and hallucinating. Like 

John-Nelson Pope: AI, it’s hallucinating.

Chris Gazdik: Yeah. It’s like that. And that’s not [00:46:00] from acid, right? This is the emotional response.

Yeah. So I just wonder about how people are going to be able to cope, John, you know, if we lose our identity or things change so dramatically. You know, just take finances. You mentioned the economy changing. Like, how do people... these are questions, and I guess we’re obviously not gonna have answers, but I feel like our field’s gonna need to give answers.

Uhhuh. How do we handle all of the emotions wrapped around money when the economy and a world economy completely changes? Like 

John-Nelson Pope: Or purpose changes, because... your purpose, yeah. Why do you need money?

Chris Gazdik: Absolutely. 

John-Nelson Pope: Okay. I’m gonna give you credits like on Star Trek. 

Chris Gazdik: Yeah. I mean, I don’t know where we’re going.

Yeah. I just know we’re rapidly going somewhere, because it’s been a long time since the internet came out, and "a long time" is now a matter of, you know, years, when it used to be centuries between technology upgrades. And we’re about ready to be on the precipice of another huge [00:47:00] one. Mm-hmm. And, and the biggest one,

John-Nelson Pope: Okay, here’s the other thought, and that is: what if AI farms us, so that we don’t have that developed sense of curiosity and we get domesticated? Domesticated animals basically have smaller brains.

Wolves have larger brains than dogs. And with cats, not feral cats, but big wild cats in nature. This is wild, John. This is psychology-based. They have larger brains than, wow, the domesticated cats. Hmm. Really? Yeah. Because you need more of your senses when there’s danger, wow, or you’re looking for prey.

Chris Gazdik: I really don’t even know what to say to that. I’m glad you’re going for the mic, Neil. 

Neil Robinson: Well, it goes back to muscles, right? Your brain is something that you have to use. You know, if you look at people like Arnold [00:48:00] Schwarzenegger and Hulk Hogan: big guys when they were younger and working out; now they haven’t worked out like they used to, and they’re smaller than they were. Same thing with the brain: the more you use it, the more wrinkles you have; you’re not as smooth. That’s the problem you have. That’s why, with ChatGPT, when you rely on AI for everything and you don’t use your brain power, once again, you just lose how to function.

So, going back to the animals: when you don’t have to have those survival instincts, that part of your brain is not used anymore, and it’s gonna shrink. That’s just the way that it is.

Chris Gazdik: but you know, it’s funny, I’m listening to you, Neil, there, and, and, and it’s, it’s funny, John, you said that you heard fear in my presentation and I’m sorry.

No, no, no, it’s fine, because I’m sure that it’s there. But listening to you there, Neil, it’s almost kinda like, I personally don’t get as fearful about technology, in the sense that we adapt. Like, we’re not gonna stop using our brains. I don’t feel like we’re gonna have our [00:49:00] brains go to pea brains and literally, like, be domesticated.

That seems... I’m curious, John, but I could see that there’s a possibility. I think instead it’s going to change.

John-Nelson Pope: Well, I’m talking about the AI overlords. 

Chris Gazdik: Yeah. Well that’s 

Neil Robinson: Going back to The Matrix, where we’re the batteries. Yeah.

Chris Gazdik: Right. That’s where it kind of was going, to the Matrix batteries.

But 

Neil Robinson: Here’s my question too: I’m thinking AI is like a medication, right? You find a way to make it supplemental in your life, to help you, as John pointed out, which was great. You know, AI is a tool, right? Don’t let it run your life. Use it as a tool, right? How many people start medication and never get off of it, because they start relying on the medication?

Yeah, that’s, it’s that same basic idea, right? Yeah, that’s true. Once you start using it, there’s gonna be a handful of people that get hooked. Once again, social media, it’s the same thing. A big flood of people use social media ’cause of the endorphins, ’cause it’s awesome, ’cause you’re connected with people, blah, blah, blah.

A handful don’t get on it; they don’t ever get involved with it. A handful stop using it, because they realize, okay, I’m tired of this, let’s move on. Right. But there’s always that small subset that says, [00:50:00] I’m still on it, I’m still hooked on it. In the terms of what my younger teenager says, you get brain rot.

That’s what he calls, like, Gen A, I think is what he says. Like, they have brain rot, ’cause all they do is watch YouTube and Shorts and TikTok and all that other stuff. Right,

John-Nelson Pope: right. It’s 

Neil Robinson: that same concern, right? You go into it: what’s the percentage of the population that will have brain rot?

What’s the percentage that’s going to ride the wave and get hooked on it, but then get off, and then what’s the percentage that’s actually going to use it as a tool and actually find ways to 

John-Nelson Pope: excel with it. Right. Okay. Opiates, for example, they’re essential for, for treating people with chronic conditions 

Chris Gazdik: because they’re useful.

John-Nelson Pope: Useful, or if you’re having an operation. Yeah, I want one, I want them, that sort of thing. But there are people that become addicted to it. And

Chris Gazdik: Oh gosh, there’s a whole nother expression. I mean, you know, we talk about addiction on the show. I mean, we don’t get addicted to social media.

I think we’ve kind of [00:51:00] clarified that. I don’t think there’s much danger of addiction with AI and whatnot. Really? No.

John-Nelson Pope: What about the people that develop a relationship with themselves, essentially, and it’s self-love? That was one of the little areas that I was going into as well: that we become as gods.

And of course, like a Roman emperor declaring themselves a god and actually starting to believe it. And

Chris Gazdik: I don’t know how you don’t, John. And, I mean, I don’t think that everyone’s gonna be walking around thinking they’re a god per se, but I am worried; maybe that’s a better word than fear.

I am terribly worried about human development in the context of all the rapid changes in the next even few years, or five to ten years. And we’re not gonna recognize the things that are going on now. We’re not gonna recognize it. I was

John-Nelson Pope: born in a coal camp. Yeah. Southeastern Kentucky.

Yeah. Near the West Virginia line. I’m, I’m [00:52:00] your brother. Where heating was done with coal. Yeah. And coal ash. And I remember smelling that stuff. And we had, we

Chris Gazdik: had black sprinkles in the snow at the bus stop, because the dude up at the top of the hill used coal to warm his house. Well, that’s what we had too.

Yeah. I 

John-Nelson Pope: remember. I thought all snow had black specks in it. Yeah. Right. Yeah.

Chris Gazdik: So, but, and now we’re talking about totally different power. I mean, you know, mm-hmm, electrically driven vehicles and self-driving, automated, whatever, and all. It’s just, you know, I don’t know where we’re to go with it in this conversation. I think about how we’re gonna cope.

John-Nelson Pope: I think, I think therapy might be able to use powerful learning models that would help teach us to become better therapists. But I think there still has to be that human factor, the human touch. That has to be there.

And you talked about empathy, [00:53:00] and that’s something that you can’t program, or that AI can program into itself and have, even if there is some sort of limited self-awareness. I’m wondering if that is the divine spark that we have, that would separate us from something that, as much as it would like to be otherwise, is a mechanical device.

Mm-hmm. An automaton, or a simulant, but not something that is in essence human.

Chris Gazdik: Do you think, on a super macro level like we’re talking about, that human beings have the ability to value and love themselves to that level? ’Cause I’m fearful... worried is a better word... that we don’t.

John-Nelson Pope: Okay. Would you unpack that just a little bit?

Chris Gazdik: I’ll [00:54:00] try.

I think about human beings with their insecurities. I think about human beings with our fears. I think about human beings with our fickle emotions. They can really lie to us. That’s a chapter in the book, right? Right. Emotions can lie to you. I mean, you know, insecurities... feelings are not facts.

Feelings are not facts. We have a lot of phrases like that, and I wonder about, slash worry about, human beings’ ability to manage this level of life disruption that people are talking about.

John-Nelson Pope: Do you think people, you know, we talked about completing suicide, would actually choose to take their lives?

I do. I worry about that. And that we would also develop devices that would make it easy, like Kevorkian did, yeah, 20 years ago. Then there’s the Swiss that have these suicide machines now.

Chris Gazdik: I definitely am fearful of that, you know. That’s [00:55:00] ultimately not coping, completing suicide. It’s just ultimate despair, despair off the chain.

Confusion about who I am and what I’m supposed to do. I mean, John, we’re looking at potentially a way of humanity operating without having to work.

John-Nelson Pope: Okay. There is a sense that... I think the answer is with religion, and in spirituality. I think that, and I’m not just... I’m inclined to agree with you. As someone who’s from a Christian point of view, or Judeo-Christian point of view, it’s not just that. It’s also what, say, Buddhists and other disciplines would say: Sufism in Islam, other world religions as well. There is that sense of need for contemplation, meditation, and a sense of putting yourself in perspective; that you have that discipline, [00:56:00] that we reclaim that, to get away from the materialism.

That we develop deep and profound spiritual sensibilities. I think we might

Chris Gazdik: even have to go that direction, John, or ultimately be faced with perishing. Because, you know, what just popped in my mind as I was listening to you: there’s movies where, like, you know, aliens are coming, the spaceship is there, and people go to the top of the Empire State Building, and they’re up there, on Independence Day.

Independence Day, man, my favorite movie. Absolutely, right? Absolutely. They’re up there, they’re partying, and they’re getting crazy. They’re all wild and they’re all excited. You could feel it when you watch those kinds of movies. And of course, they...

John-Nelson Pope: It was sort of all the showgirls that were going up there, the people that were hedonistic.

Chris Gazdik: To a certain extent, I don’t think humans can handle it, ’cause we’re on that level. I mean, I guess I’m really doubtful about our ability to do what you were just describing in a religiously grounded, mindful, thoughtful way. Some people have done it. [00:57:00] Well, the monks are one, but I’m not a monk, so I’m worried about myself.

Maybe. Well, we, 

John-Nelson Pope: There might need to be a sense of reclaiming that, such as with the Amish or the Hutterites or the Mennonites. I mean, for real. I wonder

Chris Gazdik: there’s a major shift that’s coming. I appreciate you hanging with us through this audible that I think I pulled. But, you know, I just felt like you can’t do this in a current-event, five-minute

John-Nelson Pope: Uhhuh 

Chris Gazdik: conversation, if you’re gonna try to give any justice to brand new ideas and things coming... you know, again, that aren’t coming, that are here. I mean, this is what I’ve kind of realized in the last few weeks: we’re really not talking about development.

We’re talking about coping and handling what’s here. So it’s now, and it’s changing us now. I mean, one of the clickbait YouTube things I remember [00:58:00] said, oh, you know, this is our last year of normalcy. We’re gonna look back in five years and be like, remember when everything was normal? And I thought that was a little bit bombastic and kind of crazy.

Maybe, maybe not. Maybe there are these changes: when we start getting new alloys, new medical interventions, new answers to hormones... you know, how long would it take an AI general intelligence, not AI superintelligence, the down version, the one that we’re really just about on now, to map the hormonal endocrine system?

John-Nelson Pope: Mm-hmm. 

Chris Gazdik: To understand all of the complexities of combined hormones. How many answers do we get to fix mental health?

John-Nelson Pope: Okay. The ghost in the machine is that spark, which I think is our humanity and our consciousness. And I don’t [00:59:00] know if you can... and again, I’ll reiterate it: I am skeptical that that can be computed.