The Latter Day Lens

Episode 143: AI's Spiritual Skeptics, Epstein & the Royal Crown, Charlie Kirk's Shooter vs Ben Lomond High School Shooter

Shawn & Matt

Send us a text


Welcome back to the Latter Day Lens! This week, hosts Matt and Shawn are joined by Marc to tackle pressing questions at the intersection of faith, economics, and emerging technology.

We dive into the listener mailbag to explore the concept of Consecrationism—the idea that a perfect system requires people to willingly share their property and excess—and whether it represents a "third way" that addresses the moral failings of Socialism and Capitalism. The discussion heats up with personal anecdotes on political shifts and a look at Matt's recent Deseret News op-ed.

Then, we transition to the rapidly approaching reality of Artificial General Intelligence (AGI). We analyze Matt's survey data on Latter-day Saint attitudes toward AI, defining the three major groups: the Silicon Saints, the Compartmentalizers, and the Spiritual Skeptics. We debate the spiritual litmus test: Is using AI for a talk or guidance a form of "lukewarm" seeking, or is it a valid tool for a divinely creative people?

Finally, we navigate two challenging ethical dilemmas:

  1. Parental Loyalty vs. Justice: Should a father help his son evade police after a crime, or insist on accountability, particularly within a potentially flawed justice system?
  2. Royal Justice: Did King Charles III stripping Prince Andrew of his titles represent a superior form of justice or merely an act of damage control and PR in the wake of the Epstein scandal?

This episode's key topics: Consecrationism, Capitalism vs. Socialism, Generous Capitalism, AI and Faith, AGI, Spiritual Skeptics, LDS Economics, Prince Andrew, Epstein Files, and the Ethics of Accountability.

Chapter Highlights (Jump to the Discussion!):

00:00 Introduction and Welcome
01:19 Mailbag: Consecrationism as the "Third Way"
02:27 The Ethics of Wealth and Charity (Billie Eilish)
03:00 Marc's Political Journey: From Marx to Capitalism
03:50 The Deseret News Op-Ed & Voting: To Vote or Not to Vote?
06:21 The Organized Intelligence Conference & LDS AI Survey
07:22 The Three Tribes: Silicon Saints, Compartmentalizers, & Spiritual Skeptics
08:16 The Litmus Test: Repenting to AI vs. Diligent Seeking
14:03 Elder Bednar, Creation, and the Danger of Passive Ingenuity
16:42 Justice Dilemma: The Ben Lomond High School Shooting & Parental Loyalty
25:27 Royal PR vs. Real Justice: Prince Andrew & the Epstein Files
32:41 The AGI Utopia: Will AI Eliminate All Human Work?
37:25 The Downfall of Technology: From Telephones to AI Erotica


Matt (00:01.098)
Hello everybody and welcome to the Latter Day Lens. It is good to have you all with us again this week. I'm your host, Matt. With me as always is Sean and joining us this week is Mark. Good to have you Mark. Mark is sitting behind the golden microphone. That means that we're going to get excellent content this week. I thought you were going to say something.

Marc (00:12.61)
Die!

Shawn (00:13.821)
Hahaha

Marc (00:25.483)
Yeah.

Shawn (00:29.321)
Mark, Mark, I've been told by listeners that they love you and also that you have a deceiving voice, that it's impossible to peg your age. Not that you should reveal your age, but sometimes people go, he's really young, and sometimes people go, he's really old. And sometimes they just don't know.

Marc (00:29.494)
I've got nothing to say.

Matt (00:47.754)
Yeah.

Marc (00:47.854)
Alright, well I generally don't like getting pegged, but yeah. We might need to edit that one out.

Matt (00:52.03)
Yeah, that's why he does it on purpose, Sean. It's intentional. Mark used to be the voice of a public radio station and I found it to be so soothing. It was really good. He had a good voice for that. Yeah, I wish that radio still existed as a medium so that Mark could pursue that as a career, being a radio person.

Shawn (00:52.201)
You

Marc (01:01.24)
Be informed, be inspired. BYU-Idaho Radio.

Shawn (01:04.82)
I love it. I'd listen to that. I'd listen to that.

Shawn (01:14.983)
Matt, it does. It's called podcasting.

Marc (01:15.957)
I'm an extra large, so.

Matt (01:19.578)
okay. Well, people like...

Shawn (01:20.915)
exists.

Marc (01:22.21)
Well, speaking of, is this not a podcast? Have we not a mailbag? See, look at that.

Matt (01:26.538)
Let's open it. We had a lot of people write in this week. And again, I just chose the one I liked the best, although they were all good. We had a good discussion last week and it prompted a lot of comments, but this is the one. I'm summarizing this very long comment somebody wrote. They said, great podcast topic, socialism versus capitalism. Thanks for showing how to disagree with love. The term consecrationalist is excellent. Socialism may be its failed version.

A perfect system must wait for the Savior's return, but we can move toward a system closer to consecration where people willingly share their property and excess. It's a shame the wealthy United States allows so much suffering. We could easily afford more social programs. I hope for a truly generous capitalism that celebrates success, gives charitably, and prioritizes health and education over profit.

Shawn (02:19.263)
Key word there, willingly. Choose willingly.

Matt (02:25.258)
Key word there, consecration. Consecrationism. That's what I am, I'm consecrationist.

Shawn (02:27.952)
that is,

Speaking of in the news, Billie Eilish kind of echoed that. She got up in front of a bunch of really rich billionaires and just started preaching like, what are you doing with all that money? Give it away, help people. Give it away and help people. And then she went around and kind of practiced what she preached and gave some money away.

Matt (02:47.028)
Didn't she give like 12 million dollars away?

Shawn (02:49.471)
Yeah, well, I don't know if it's 12 million. 11.5? Good for her, man, good for her.

Matt (02:52.074)
There might've been a hundred. okay. I rounded up.

Marc (02:52.205)
11.5

I think she's a nerd and so is the person that wrote in. They're just nerds.

Matt (03:00.042)
Did you know that Mark... probably none of us have ever loved Marxism or Marx as much as Mark did at one point in his life, or despised him as much as Mark did later in his life.

Shawn (03:00.541)
hahahaha

Shawn (03:09.159)
Really, Mark?

Marc (03:13.131)
Whoops.

Shawn (03:13.385)
Wow.

Matt (03:18.354)
Mark, if you had to pick today, capitalism, socialism, which one do you choose today?

Marc (03:23.373)
What kind of socialism?

Matt (03:25.478)
Well, the one that's closest to communism.

Marc (03:28.765)
then capitalism.

Matt (03:30.74)
The one that's closest to capitalism.

Marc (03:32.864)
Cap-Capital- Wait what?

Matt (03:34.506)
If you have to choose the socialism that's closest to capitalism, which do you choose?

Marc (03:36.338)
I'm, look, I'm retarded sometimes, I'll be honest. What? So anyway, the point is, when you look at the Book of Mormon, they had all things in common for about five minutes, and it was awesome. So put that in your pipe and smoke it.

Matt (03:44.456)
Which social do you choose social?

Matt (03:51.849)
Yes.

Yeah. Yeah.

I like it. Okay, well, did you know, Mark (you might not know this; Sean and I were talking about this), I wrote an opinion piece in the Deseret News that got published this morning. Isn't that exciting? Yeah. Listeners.

Shawn (03:58.271)
You

Marc (04:07.516)
nice of them.

Shawn (04:07.529)
It was incredible.

A full-on interesting topic. And it was amazing. Everyone go look at the Deseret News and find the article from Matt Miles, the opinion piece.

Marc (04:13.439)
One at a time, fellas.

Marc (04:21.174)
How come there was never interesting, engaging content in our class?

Matt (04:24.778)
That's a good question. Probably because I was bored. I probably wasn't that interested. But I'll...

Marc (04:31.46)
The only thing I for sure remember from your classes is that you really like George H.W. Bush.

Matt (04:37.534)
That's true. I was a big fan. I'll put a link to that article in the show notes. You know, most people, what they learn in my class, Mark, is do not vote, that voting is a waste of time. That's usually what...

Shawn (04:39.945)
Hahaha

Shawn (04:47.583)
You

Marc (04:49.575)
I must have missed that one.

Matt (04:51.162)
Okay, Election Day is also coming up. By the time the podcast drops, people will have hopefully voted in the election or it will be too late. Did you vote, Mark? All right. And Sean, how about you? You're gonna vote?

Marc (04:59.435)
I sold in.

Shawn (05:03.827)
I did. In California here, we voted against Prop 50, which would be the two-wrongs-make-a-right gerrymandering one, yeah.

Matt (05:09.94)
gerrymandering?

Matt (05:13.748)
That surprises me, Sean. I would have thought you would vote for anything that your governor was supporting. So it surprises me you went against him on that one. We're having an important election in Rexburg. We're voting on whether or not to raise taxes to build a police station. And we're voting about whether to put certain people on the city council.

Shawn (05:15.401)
Why?

Marc (05:20.651)
Hahaha

Shawn (05:24.383)
You

Marc (05:24.853)
Gonna gerrymander them votes.

Matt (05:39.442)
And as a good citizen, I am going to abstain from this election. I will not participate.

Shawn (05:43.751)
as a good citizen. I'm being a good citizen by abstaining.

Matt (05:48.442)
I don't trust myself. I don't trust my decision making in an election booth. So I don't think it's... I know too much to vote. I'm too informed.

Shawn (05:53.052)
It's a cop out. Cop out.

Shawn (05:58.841)
come on, man.

Marc (06:01.003)
You know, I'm just saying, the prophets vote, the apostles vote. But what they should do, what they aspire to: on their wall is a picture of the Lord and then Brother Miles. Those are the two guys I want to be like, on this side of the veil and the other side, right there.

Matt (06:05.738)
That's true. That's true.

Shawn (06:06.729)
Do they not know enough, Matt? Do they not know enough to vote? To be.

Matt (06:10.27)
They know.

Matt (06:17.578)
All right, let's get to the first topic this week. It's called the Organized Intelligence Conference. It's happening at the Church Office Building on November 4th and 5th this week, and this episode is gonna drop on November 5th, so listeners can rush out and go to the second day of the Organized Intelligence Conference if they want. Yeah, it's free to the public.

Shawn (06:40.0)
This is for the public.

And you're gonna be presenting there.

Matt (06:46.31)
I'm presenting there. Elder Gong is presenting there. Elder,

Shawn (06:50.397)
What did Mark just say? The Apostles and Elder Miles.

Matt (06:54.634)
Yeah, listeners, you should come if you want to come, if you want to learn more about artificial intelligence. It's all about artificial intelligence from a Latter-day Saint perspective. And Mark thinks it's boring, as do I, if I'm being totally honest. I don't love the topic of artificial intelligence. But this was an opportunity to do a survey of members of our church. And so I'm presenting some survey data that we collected. And what I find is this about members of our Church of Jesus Christ of Latter-day Saints:

Marc (06:55.06)
This is it.

Matt (07:24.084)
Their attitudes about artificial intelligence fit into three groups that are about equal in size. About a third of our church members are in each of these groups. And it also doesn't matter whether you're active in the church or less active in the church; that doesn't seem to make much of a difference. So there's one group I call the Silicon Saints. There's another group I call the Compartmentalizers. And there's another one I call the Spiritual Skeptics. The Silicon Saints are tech savvy and use AI for faith. They use it for spiritual help.

The Spiritual Skeptics rarely use artificial intelligence, worry about it, and think that it's not very good. One of the Spiritual Skeptics said that if the church were to use AI, they would leave the church. And then the Compartmentalizers are the kinds of people that use AI in their daily life, but they separate it and never use it for faith. So they think of it maybe as a tool that you can use for some tasks, but faith and church stuff are things AI should not be used for. So my question is,

Who is it that uses AI the way God intended? Which group is right?

Shawn (08:26.591)
Which of the three groups is more in line? I got an answer for you, Matt. I got it. I got it. Okay. They are all correct or they are all absolutely incorrect depending on one single thing. One single thing. But the scriptures clearly teach that God is never happy with lukewarm passive decision-making. Never.

Matt (08:33.236)
okay, good.

Shawn (08:53.759)
In fact, there's abundant scripture. Revelation 3:15: I would that your works were hot or cold, but if you're lukewarm, I will spew thee out of my mouth. 1 Kings 18:21: Elijah says to the people, how long halt ye between two opinions? If the Lord be God, follow him; if Baal, then follow him. He was saying, clearly choose this way or choose that way, but don't be lukewarm. And so if any of those three groups treat AI as passive decision-making, as basically, I'm praying to AI

to give me my answers, then they are all absolutely dead wrong. They're all fine if they're using it for information, gathering knowledge, a way to scan the scriptures quickly, find certain references; nothing wrong with any of that.

Matt (09:28.916)
Mm-hmm.

Matt (09:39.272)
No, no, no, we're talking more than that. Like I ask them questions like, should you use AI as a spiritual guide? Should you use AI to confess your sins and discuss your personal worthiness? And the Silicon saints are like, sure, that's fine. And then the skeptics are like, no, no, no, no, no, no, you should not use AI for any of that kind of stuff. And the compartmentalizers are the same way.

Shawn (09:51.283)
Yeah, absolutely not.

Shawn (10:00.109)
Well, that's more detail. That's more detail. Okay, so if you're praying to it, if you're repenting to it... repenting to is praying to. So if you're praying and repenting to it... and why isn't it praying if, for example: hey, I've got a situation where, I don't know, I met an old boyfriend and I kind of want to connect with... no, not even that, but I'm interested in connecting with him. Like, what is your advice in this situation? That's counseling and praying with

Matt (10:04.914)
Not praying, but like I've sinned. Okay.

Matt (10:19.102)
We messed up.

Shawn (10:27.325)
with ChatGPT, as opposed to taking it to God. So if they're repenting to and praying to these things, there's probably nothing more dangerous, right? Like the scriptures have warned against that in abundance. That is idol worship, right?

Matt (10:40.766)
Okay, what do you say, Mark?

Marc (10:42.757)
I agree. I think that the skeptics and the compartmentalizers are in the right, and that the silicon saints... silicon? What? Whatever they are. I wouldn't give them the title Saints. They've lost it.

Matt (10:51.291)
I call them... yeah, it's just a word I made up.

Shawn (10:57.385)
HAHAHAHA

Matt (11:00.202)
They're active in the church.

Marc (11:01.916)
doesn't mean a dad blamed thing.

Matt (11:04.426)
Huh?

Marc (11:07.099)
Lucifer could be active in the church. He was for a while, so.

Shawn (11:09.247)
So they're just, so they're just... so the Silicon Saints are just latter, latter days, latter dares. They're not Latter-day Saints.

Matt (11:15.978)
Church of Jesus Christ, they're members of the church. Yeah, that's interesting. But I, I mean, I technically would call myself a compartmentalizer. I use AI for a lot of things like digital video creation and playing around, doing silly things, but I've never thought, should I ask artificial intelligence for guidance in my life? Like, to me, that's completely in the realm of what I should do with God.

Shawn (11:21.855)
What do think, Matt?

Marc (11:43.625)
I did ask Grok the other day, based on the Book of Mormon, the Doctrine and Covenants, and prophetic teachings, when is your best guess for the Second Coming? And Grok gave me a pretty detailed idea and some thoughts. Then I asked Grok, based on the Book of Mormon timeline, using America as a corollary, where are we in the Book of Mormon? It said, I can't give you an exact answer, but here are several events between the United States and the Nephites

Shawn (11:43.647)
That is why. Go ahead, Mark.

Marc (12:13.382)
that mirror each other, and the one that spooked me the most was the Nephite open civil war for the Americas. It said, approaching threshold.

Matt (12:21.706)
Whoa.

Marc (12:23.186)
So that was fun.

Matt (12:24.52)
Yeah, but you're not, you wouldn't take that as guidance, like as if it was the Holy Ghost. Yeah. To me, that's, to me, that's what gets scary, right? What were you going to say about that, Sean?

Marc (12:31.58)
Sister Ghost.

Shawn (12:31.708)
so.

Shawn (12:36.063)
I was gonna say, you would never think to go to ChatGPT. Neither would I. I don't know, I don't think Mark would either, to go, yeah, let's guide me spiritually or let's repent or let's confess my sins. But it's because you're, I know you, you're firmly set on a foundation of faith based in Christ that is focused on feasting on His words, praying regularly, serving others. And that grows your faith unto repentance. Like, your dependence is upon God. And so I know that about you. So it makes sense that you would never look at

an outside source to seek that peace and happiness.

Matt (13:06.73)
that. But when you ask this question, is it okay to use ChatGPT or artificial intelligence to help you brainstorm ideas for your calling or to help you write a talk, then the spiritual skeptics would say no way. But the compartmentalizers are like, yeah, that's okay, to use it to write a talk or to brainstorm ideas for your calling or to help with your ministering.

Shawn (13:29.757)
My lit... to me, the litmus test is lukewarm decision-making. The Lord clearly does not want us to be lukewarm in our diligent seeking. So if I'm writing a talk and I'm asking it to write it for me, that's lukewarm, so that is against the will of God; it's clear. But if I'm using it to, like, brainstorm or get ideas, but I'm gonna write my own talk in a hot or cold way, then I think it's fine. Just don't treat it in a lukewarm way. Don't halt and be passive, passively letting it tell you

Matt (13:46.186)
Mm-hmm.

Shawn (13:58.782)
what to say, what to do... you know, that's the danger. I think the scriptures are clear there.

Matt (14:03.006)
Mark's gonna know this talk from Elder Bednar better than I do, I think. But Elder Bednar gave a talk a while back where he talked about how it's human nature to want to create. As children of Heavenly Father, we want to create, we want to make things, we want to build, we want to progress. And he warned about using video games because video games give you this sense of accomplishing something while not actually accomplishing anything at all.

Shawn (14:27.295)
Mmm. Mmm.

Matt (14:29.512)
And so for me, the litmus test for artificial intelligence is, is it damning or hurting your ability to be creative, to be ingenuitive, to grow, to develop, or is it weakening your relationship with Heavenly Father? If it does either of those two things, then I say that it's wrong. So then anything, in my opinion, where you're using it to create for you or to think for, like,

I might come up with this idea of like, there's this video clip I want you to make and then give it some prompt. To me, that's not hurting my creative power. Maybe it is. But to me, like that's the litmus test. Does it hinder my ability, weaken me over time in my ability to create and to build and weaken my relationship with God?

Shawn (15:14.025)
I like that, Matt. I'll give you points for that. I just think the only situation where you do weaken your creative ability or problem solving or creative thinking ability is when you are treating it with passive thinking. Tell me what to believe. Tell me what to think. Tell me what to create. As opposed to, hey, let's riff on some things. I've created this idea. Give me some thoughts. Give me your responses. Hey, can you think of any other critical things that I'm not looking at in this idea? That's the right way to do it.

Marc (15:44.012)
If it's spiritual stuff, my vote is don't even get near it. Because let's take, for example, the elders quorum activity guy who says, I'm going to use ChatGPT or Grok to figure out some activities. It'll give you some good ideas. That'll be fun. But if you would have just gone to Heavenly Father, he would have guided you to this one activity that you would never have thought of. It's not a common thing, so Grok wouldn't have come up with it. And it's the one activity that was needed for that one brother to come back into the fold. So.

Shawn (16:13.695)
I remove my points to Matt. Mark, you get all of them.

Marc (16:17.143)
Thank you.

Matt (16:17.354)
Hey, good job. And clearly none of us would be Silicon Saints. So if you're in that group, listener, if you're in that third, I still think you're a member of the church. I still want you as part of our group. But feel free to write us and let us know what we got wrong. All right, next up. I'm so curious about this. This one, this one...

Marc (16:26.967)
repent.

Marc (16:32.597)
Even the devils believe.

Marc (16:39.375)
Don't mention my name.

Matt (16:46.74)
tears me apart. This would be early October: Ben Lomond High School in Ogden, Utah. Sean and I went to high school in Ogden. Ben Lomond was probably one of your high school rivals, right, Sean? You guys wanted to beat Ben Lomond in football or something?

Shawn (16:59.407)
I was only there for like six months, so I don't even know.

Marc (17:02.575)
I just had lunch in... I don't know, usually it's in Ogden, but I had lunch in Orem. Asa Ramen, strongly recommend it.

Matt (17:07.107)
close. okay. Well, so there was

Shawn (17:11.199)
So this is a shooting that happened at Ben Lomond. I don't think I heard about this in the news. Ben Lomond... I didn't know Ben Lomond was in Ogden. Okay.

Matt (17:15.282)
I didn't either, yeah.

Yeah, October 7th, there was a shooting at Ben Lomond High School and a 16-year-old was killed. The father of one of the suspects in that shooting, his name is Fernando Renteria, has been indicted on two counts of obstruction of justice for trying to help his son escape arrest by driving him and his family to Las Vegas. And then he got on a flight to Mexico. And so he tried to help his son evade the police.

Eventually his son was arrested in Denver, but his dad tried to help him fly away to Mexico. In contrast, when the Charlie Kirk shooter's dad found out what his son had done, he encouraged his son to turn himself in. So the question is, which dad made the right decision?

Shawn (18:01.309)
Matt, the TV show, the TV show, the best TV show of all time, Fargo. Fargo season two, the best season of TV of all time, all time, informs us on this question. Kirsten Dunst was married to this dude and she killed someone, committed a crime. And the whole season, he's trying to hide those crimes in the name of, gotta protect my family, I gotta protect my, well, he ends up dead, she ends up going to jail. So that's the obvious answer there. Clearly, number two,

Father number two, who encouraged accountability, followed the guidance of Alma when he's talking to his son Corianton. You cannot hide your crimes, not from God, not from the law, right? So clearly number two is the right choice. How are you loving your child if you're teaching them that if you kill a person and you can escape justice, that's a moral good? That's not teaching your son.

Matt (18:53.502)
No. Well, so I'm going to advocate for father number one instead of father number two, because father number one recognized the improbability that his son would get treated fairly by the justice system in the United States. So he wants justice for... I'm telling you, I'm just being honest, like...

Shawn (18:57.727)
boy. All right.

Marc (18:59.842)
I didn't see that coming. I've got to be honest, I'm shooketh over here.

Shawn (19:01.567)
you

Shawn (19:12.468)
my gosh, dude. Come on.

Marc (19:14.382)
BOOOO

NERD!

Shawn (19:19.559)
All right.

Marc (19:21.006)
Put him on the plane to Mexico, let him go to Mexico.

Matt (19:23.818)
Look, if I thought as a father, if I thought that my son could get free, like could get a fair trial, could be treated justly and would not be treated unfairly. Now again, I'm just going by the name of his dad, Fernando Renteria and the fact that he sent his son to Mexico. I'm not being, I'm just saying like, I'm just saying like if I was, if I was in that situation.

Shawn (19:24.541)
Hahaha

Marc (19:42.012)
so you're being racist. I see what's happening here. Okay. Well, hey. All right. Got the Woodrow Wilson man in the house here.

Shawn (19:43.559)
He is being racist. That's you being racist!

Shawn (19:51.87)
Hahaha

Matt (19:54.076)
I would not. So I would say, son, you need to be accountable for what you do. You're gonna have to find a way to repent, to make this right with this kid's family. I don't know what role you played in this. I don't know, right? I don't know those details, but whatever it was, as a father, I would say, son, you're taking accountability for your actions, but don't do it through the U.S. justice system. That's not the way to take accountability for your actions.

Marc (20:18.635)
I'm giving the points to Sean, because that was stupid.

Marc (20:24.813)
But I say, I would love you, you beautiful man.

Matt (20:27.444)
You

Shawn (20:27.721)
But so, Matt, I mean, come on. So you're saying that the... so I'm just reading the KSL article from when it happened, and they're talking about there were two people in a car, they were chasing the other car and they shot at the other car, killed the kid. And when they were arrested, they arrested both the kid who was driving and the kid who was shooting. And it was kind of clear because they found all the evidence. The gun was there, the bullets matched.

There were eyewitnesses, like, all that overwhelming evidence. And you're saying, yeah, no matter what, that kid, because he's Latino, is not gonna get a fair shake in the American justice system. So get out, get him to escape. Come on.

Matt (21:06.442)
You can frame that evidence in different ways, right? You could frame it as a gang shooting. So then we need to increase the charges against you. You could frame it as something premeditated. You could frame it as something accidental. Like the justices, the prosecutors in that case have a lot of latitude in how they're gonna frame that case before a jury in Ogden, Utah. And I'm just saying like the way that you frame the information before a jury in Ogden, Utah could strongly influence the outcome of the

Shawn (21:34.963)
Hang on,

Matt (21:35.496)
the justice that this kid receives.

Marc (21:35.843)
Ogden's like 80% Mexican right now, so they'll be fine.

Shawn (21:41.683)
I was gonna say, Matt, I lived in Ogden. That was a really diverse place. I lived there for eight months.

Matt (21:46.91)
But you get the right jury. You get people that are afraid of the gangs or afraid of the drive-by shooting. However they frame it.

Shawn (21:51.431)
Okay, so hang on, but how does that logic not apply to anyone, anywhere, with any jury? It has to apply, Matt. Your line of thinking would say that anyone who gets convicted of anything or gets arrested for anything, the parents should help them escape, because no matter what, you're gonna be like, but what if, but what if the jury does this? What if the jury will be biased in this situation? They always will be biased, of course.

Matt (22:16.894)
I'm saying that if you're an upper middle-class white kid in St. George, Utah, whose father apparently has connections and friends that are former detectives, which was what happened in the shooting with Charlie Kirk with that kid, your parents clearly have like some resources or some information. You're going to get treated differently than if you were not an upper middle-class white kid from St. George. Like everybody in St. George says we can relate to not only this kid, but to this kid's parents.

Shawn (22:39.209)
Man, that's crazy.

Matt (22:46.44)
And that doesn't necessarily happen.

Marc (22:46.496)
You know who we need to relate to? The family of the child who was innocently murdered by this other kid. So if he gets hosed by the justice system, okay, he did kind of just shoot someone and kill them. So, darn, he got an extra 10 years of being alive on the taxpayer's teat, whereas this kid is just dead. The end. So.

Matt (23:11.434)
That's, that's, that's hard to argue against Mark. That's a pretty good point. I do have to sort of, that is true. I should think about the victim of the crime, but I am sad about the victim. But, but no matter what happens to my son, if I'm Fernando Renteria, no matter what happens to my son, the kid is still dead, right? The victim is still dead. So the question is.

Marc (23:14.655)
I'm telling you, yeah, this is why I don't prep. If I prepped, I'd have thoughts. I'm just straight out of the gut. Yeah, they kind of happened, you know. They were there.

Marc (23:40.138)
Isn't Fernando Renteria an ABBA song? Fernando Renteria. Sorry, that's not helpful.

Matt (23:42.954)
So if I'm the father of, if it's my kid, I'm going to be thinking about what's the best thing for my child moving forward, not necessarily what's the thing that's going to make society feel the most happy or feel the best about what happened in the crime.

Marc (23:59.508)
and thank heavens then that the jury cannot be members of the family so they can accurately weigh the evidence and the justice instead of saying, but that's my cousin. He's fine.

Shawn (24:12.563)
Now, Matt, having said all that, I recognize that maybe it's systemic, maybe it's not. I would guess that it's not systemic, but of course there are going to be injustices based on biases. And of course there are going to be certain people in certain minority groups that are going to get, I guess, less of a fair shake. And we should fight side by side to make sure that that doesn't happen. We should. We should absolutely do that. But not to the extent of helping a murderer escape to Mexico

Matt (24:13.427)
Yeah.

Matt (24:36.446)
Okay.

Shawn (24:42.835)
because there's some bias that could happen within a jury. That's crazy,

Matt (24:48.434)
Yeah. What he should have done instead of helping his kid to get on a flight to Mexico is he should have had his son drive around, I don't know, Ohio somewhere with a bumper sticker that says I am undocumented. And then ICE would have deported him to Mexico for his dad. And then there would have been no crime, no harm, no foul. They'd like, send this undocumented kid to wherever.

Shawn (25:02.623)
Wow.

There's this... what's... now, what's that dumb game you guys would play on the mission and I would mock you? What was it called? Risk. There's the Risk-minded Matt putting strategy in place. Well done.

Matt (25:11.165)
Risk.

Marc (25:17.237)
You have to get Kamchatka and hold Kamchatka. It's the key to all of Europe.

Shawn (25:21.705)
Ha

Matt (25:24.786)
Alright, here's the last topic in the news. So King Charles III has stripped his brother Prince Andrew of his remaining titles and evicted him from his royal residence, the Royal Lodge. In the wake of revelations about… I think it is, a little bit. A little bit.

Shawn (25:35.721)
That is justice.

Is that just? Is that just?

Marc (25:39.648)
Well, I want to argue on behalf of Prince Andrew because... I just... How does he feel?

Matt (25:43.146)
Let me finish the question. So this all happened in the wake of revelations about Andrew's relationship with convicted sex offender Jeffrey Epstein. The palace announced that Andrew will no longer be known as Prince but as Andrew Mountbatten Windsor and will move into a private accommodation. Prince Andrew denies any wrongdoing.

Shawn (25:43.303)
boy.

Shawn (25:47.881)
Yeah, finish the-

Matt (26:10.868)
but King Charles III is siding with the victims over his own brother. So this is the question, not do you sympathize with Andrew, Mark, it is this. We have a difficult time getting justice for the Epstein victims in the United States. Is this an example of the superiority of the British system to that of the United States? Is a righteous king better at providing justice than our own judicial system?

Shawn (26:34.399)
I don't know why you tend to continue to pick this one verse, and then you remove the other half of the verse and you just stick on this belief that scripture teaches that it's better to have a king. Because it doesn't say that. It doesn't say that at all.

Matt (26:45.962)
Well, I'm just asking. Like, okay, maybe I went a little too far in the question, but the question still stands. Prince Andrew is being punished for what he did. And we didn't have to wait for a trial.

Shawn (26:49.596)
Yeah.

Shawn (26:54.525)
Yeah, he has to take his millions of dollars that the British people have given him because he's part of the monarchy and he has to go live in a private residence the rest of his life. This wealthy, large, and he doesn't get the title of Prince anymore. That is true punishment. I'm with you, Matt. That is just, that is justice. He potentially, those crimes.

Matt (27:06.023)
And he doesn't get to be prince anymore.

Marc (27:14.324)
That makes up for all the kid diddling. Yeah.

Shawn (27:17.905)
Yeah, good. I'm with you, Matt. The British government, that justice system, even though this has nothing to do with the British judicial system, because this is just two knobs who are kings and princes messing with each other. Yeah, that's...

Matt (27:31.284)
I'm just saying something happened. Think about all the people in the Epstein files in the United States that nothing has happened to them, nothing at all. We don't even know their names. They're being protected by our federal government.

Shawn (27:39.16)
No one's... Mark, you don't defend that. Mark, you don't defend that, right? I would never defend that. That is pathetic.

Marc (27:45.435)
I know, I put the names out already.

Shawn (27:49.043)
Yeah. Put him out.

Matt (27:49.578)
Right. that's what I'm saying. In the British system, like we don't have to wait around for like

Marc (27:55.359)
This isn't a justice thing. This is literally just PR on behalf of Buckingham Palace. I mean, yes, God save the King. I, hooray, outside, it's just, he's like, this looks bad. How about we just do that? There you go. Bob's your uncle.

Shawn (28:01.183)
That's right, that's right.

Matt (28:02.176)
Hahahaha

Shawn (28:11.411)
We give him a private mansion instead of a mansion that has a different title on it. We strip him of a title... that's justice.

Matt (28:16.628)
You guys just sound like peasants who've never been kings in your life. If you were born into royalty, if you were born Prince Andrew and they strip you of your title because of something a victim claims, who's no longer here by the way, and she can't be cross-examined because she's dead, then you'd be like, how dare you take my title away from me without any due process?

Shawn (28:36.479)
This isn't justice. Mark nailed it. This isn't justice. This is PR. That's all it is.

Matt (28:43.37)
Wow. Okay, so then they're better than us, right? Because our PR is pretty bad on this.

Marc (28:48.063)
Well, our PR is fine. We have memes.

Matt (28:51.274)
Okay, let me ask you this, Mark. Do you think that Donald Trump's name is in the Epstein files and that's why they're not being released?

Marc (29:00.561)
I don't think that that's why it's not being released. I don't know if his name's in there. I anticipate it is, but not for as egregious a crime as Bill Clinton, because it's been pretty clear that Trump knew him. There was something there, but then it just kind of fizzled pretty quick.

Matt (29:17.832)
No, no, you know that's how he met Melania Trump was through Jeffrey Epstein.

Marc (29:20.563)
Well, that happens. What I meant was, it fizzled later on. Anyway, I don't know. I haven't studied it because there's too much more important stuff going on.

Matt (29:24.462)
I see.

Matt (29:29.672)
No, I agree, but I was just wondering, like, so you think that Bill Clinton's name's in there for doing some pretty bad stuff?

Marc (29:36.157)
Well, it's part of becoming a conservative. You have to accept that Clinton drinks children's blood and is a kid diddler; it's part of the party and everything. I'm just here for the ride.

Matt (29:42.152)
You

Shawn (29:43.007)
Ha ha ha ha ha!

Matt (29:47.178)
So then why not release the information so that they can be held accountable in the same way that Prince Andrew has been?

Shawn (29:56.159)
Because we all agree with that, Matt. There's no one that disagrees with that except for those who have something to lose or those who have friends who have something to lose. We know Donald Trump would not be past that. If Donald Trump has powerful friends who can help him either win office or get things done, he justifies, make money, sure, he justifies it and says, it's not worth it. It's not worth it. But it's wrong justification. It's absolutely wrong justification.

Matt (30:16.478)
Make money.

Matt (30:24.458)
But in the British system, the king is turning on his own brother. Right? Like, that's awesome.

Marc (30:34.409)
Good, but it would be nicer if he prosecuted him.

Shawn (30:39.005)
Yes? Exactly!

Marc (30:40.947)
That would be justice.

Matt (30:40.99)
Well, I don't I don't know if the king has that kind of power to

Marc (30:44.681)
then what the heck is he doing?

Shawn (30:44.681)
so it's not the...

Matt (30:50.41)
This feels like anti-British bias to me. This feels like a situation where the...

Marc (30:54.013)
I have a red coat right here, alright? On the other side of this door is a picture of King George III himself, his gracious majesty. Do you have that? Okay then, let's just buckle down, bucko.

Shawn (30:55.933)
Yes, he has a red coat. Do you have a red coat behind you, Matt?

Shawn (31:03.081)
Yeah, yeah. Do you have that, Matt?

Matt (31:06.492)
I do not have that. No, but I, I did go to, I did go to, I went to Stonehenge one time.

Shawn (31:09.993)
Yeah!

Marc (31:13.339)
Yeah, but you didn't come to the Orem Fourth of July Freedom Festival where I was to say hi, so I'm still a little hurt, honestly, that no one showed up.

Shawn (31:13.78)
You

Matt (31:18.41)
That's true. That's... so, okay, you're saying it's just PR when the King of England strips his own brother of all of this stuff, but in the United States, when we don't do anything, it's just bad PR.

Shawn (31:22.088)
You

Marc (31:26.813)
100 %

Shawn (31:33.597)
It's also PR. It's also... well, it's not PR. It's a power grab. Yeah, it's a power grab. In the United States, it's...

Matt (31:39.594)
So then if you have a, if you have a British system where the king can't grab any more power than what he already has, then that's maybe a better system, right? Because then he'll do what's right.

Shawn (31:47.327)
Is your topic about Epstein files or is your topic about the form of government?

Matt (31:50.812)
No, the British system. It's yeah, well, not necessarily a form of government. It's just like we should envy Britain in this situation. They can do stuff that we can't do because they have a king.

Marc (31:57.613)
No. No.

Shawn (32:00.255)
They didn't do anything. He didn't do anything. There's no justice that's been served. He did nothing except save his own public relations debacle.

Marc (32:08.669)
It's like when World War II happened and they changed their name from whatever the German thing was to whatever the English thing is. I don't remember the details, but there you go.

Matt (32:19.1)
Okay, all right. I'm just confused, but that's fine. I'm sure our listeners are not confused, but I'm going to give myself the points because I agree with myself on this one. Thank you. All right. Here's the... this is the big one. Try not to let your brains hurt too much as we think about this topic.

Marc (32:20.445)
This is PR.

Marc (32:28.549)
And me and Sean are going to take the points away and give them to ourselves.

Shawn (32:31.571)
Hahaha

Marc (32:38.992)
it's already struggling. Okay. What?

Shawn (32:39.123)
Now this is a good one. This is simple. This is simple.

Matt (32:42.054)
The founder of Anthropic (Anthropic is the company that does Claude.ai), his name's Mo Gawdat. He recently went someplace, and on YouTube he said that the moment of artificial general intelligence, where machines become smarter than humans at everything humans can do, is a matter of months away. He thinks it's going to happen by 2026. He says maybe not till 2027, but it doesn't matter. It's inevitable. And it's going to happen really soon.

Artificial general intelligence will create a profoundly unfamiliar world, according to Mo Gawdat. The one thing AI cannot do is replace or take over love and human connection. He stresses that this connection is the only gift from the spiritual world and the only thing that will remain when everything else is done by machines. So my big question is this: is Mo Gawdat right? And if so, will this dramatic change in technological advancement make the world a better place?

Shawn (33:39.145)
So this is a wonderful example of what happens when secular society abandons religion. They start to try and find meaning and purpose and, like, power in their science and in their technology. Mo Gawdat is a very secular guy. And what he's basically trying to say here is, the science and the technology that I have created is going to make earth a utopia. There will be nothing but love and human connection because of the solutions

Marc (33:41.499)
you

Matt (34:04.404)
Mm-hmm.

Shawn (34:08.819)
that science, not God, not religion, but science and technology will bring to us. It's a perfect example. No, it's really exactly what he's saying. He's trying to say, I found the solution. There will be nothing but, in his words, love and human connection in a world where my technology has overtaken things. He's trying to replace religion.

Matt (34:27.53)
Well that's the end.

Matt (34:32.264)
No, he's not saying that's the only thing humans will do; he's saying that's the only thing AI will never be able to create.

Shawn (34:40.209)
Right. So he's saying that AI will basically overtake all of society and what will be left is just humans to be able to love and connect with each other. That's what he's saying. Yeah. Yeah. A utopia.

Matt (34:51.752)
Yeah. And we'll probably be doing it in slaves... no, we'll be doing it as slaves working in, like, data centers. You love your family, right? But you're not going to have any kind of a job that's meaningful. You're not going to be able to find any meaning and purpose in your work because you can't do anything anymore. All you can do is love and connect.

Shawn (34:57.491)
How do you love?

Shawn (35:07.741)
Yeah, but the... is his tone trying to be a warning against it? I don't think so. From what I've read, I think his tone is, look, we're solving the world. Like, let's get it to a point where general AI can... I mean, he's the one... cause, why would he be warning against the thing that he's profiting off of and building and promoting? Why would he promote general AI as the founder of Anthropic and Claude and then say, no, we can't do it, can't do it?

Matt (35:13.555)
Yes.

Matt (35:30.122)
Cause he's gonna... cause he's... cause, no, he's going to own it. Right. So he'll be rich. He'll be fabulously wealthy as he controls the thing that does all of the work that's left in the world.

Shawn (35:43.515)
Yeah, this is a wind of doctrine, a wind of man's doctrine, right? Whatever he's saying, everything that I've studied him saying is basically this: we need to get to a utopia. AI will enable us, man's science and technology will enable us, to get to the best world that we can live in, where it's just love of each other and human connection. AI is gonna get us there. And by the way, I'm the one creating the science and the technology. It's just secularism, it's secularism.

Matt (35:47.146)
Yeah.

Matt (36:11.54)
So you think he's wrong. You think it's not gonna happen.

Shawn (36:14.171)
Absolutely, absolutely not. No AI or technology or science is going to enable a utopia where humans can just like bask in the sun, love each other and connect with each other. No, it's absolutely not gonna happen.

Matt (36:27.883)
What do you say, Mark?

Marc (36:31.096)
I say, you know, you look at Terminator and it was about ending John Connor. What we need to do is get a time machine and go back to save Harambe, because before Harambe, we didn't have any of this. Hoo-ha.

Shawn (36:46.045)
He said hoo-ha, he said hoo-ha.

Marc (36:48.183)
I did. But on a more serious note, I don't think he's entirely wrong in what he says. Whether it's good or bad, just as a factual statement, it sounds like, yeah, we're probably heading in that direction where more and more will just be AI. And I don't like it. And I think it's very much what Ian Malcolm said: your scientists were so preoccupied with whether or not they should... sorry, I screwed the whole thing up already... whether or not they could, that they never stopped to think whether or not they should. And

Shawn (37:16.937)
Yeah, yeah.

Matt (37:17.543)
huh.

Marc (37:18.093)
I feel like that's what happened with AI is someone finally said, let's see if we can actually do this. shoot. It's over.

Matt (37:26.768)
Yeah, I think, I think I'm somewhere between both of you. I think that the scientists are always going to be more optimistic about what they can and cannot do. They're always going to think that they can do stuff that they can't do. I think that the future of AI is the same as the future of the telephone, the printing press, the internet. All technology starts out with all of these positive promises of good things that are going to happen.

And then eventually it all turns into 8chan and 4chan and all the dirty, nasty things that happen on the internet and social media and all of that stuff. I think, like, what I read recently, I actually think it's Anthropic, but maybe it's ChatGPT or maybe it's OpenAI, that they're going to start creating, like, erotica and porn in their AI tools. And it's like, to me, that's the natural progression. There's a technology that emerges. It can be used for good things.

Then people realize you can make a whole lot more money doing bad things with it. Then the good people abandon that technology because they don't like all the bad stuff that's happening there. And then it becomes like a nasty, filthy, dirty place. Like it happened with telephones back in the day when I was a kid: you could call like a 1-800 or 1-900 number and do, like, phone sex. The internet happened that way. Social media happened that way. I think it's inevitable with AI; it's going to happen that way. But I don't think that,

I don't... I think I'm with Sean in this way. I don't think that any science will ever be smart enough or good enough to take the place of humans. Humans are too smart. Also, we have this, like, divine stuff within us that allows us to adapt to tools, and we can use them to make ourselves better. We don't always do it, but we can use them to make ourselves better. So I think we will always be one step ahead of whatever technology we develop.

And we will never develop a technology that becomes better than us, that can do everything we can do. I think that's impossible.

Marc (39:25.594)
And I wouldn't say everything, but I just think a lot of things will be replaced. A lot of teachers could be replaced with AI and you'll just have a class monitor. And even security: the job I'm in right now could just be cameras with AI security, and it just sends out a signal, hey, there's a thing, bring the cops. So, I have even, I've heard of

Matt (39:49.064)
Yeah, I think that.

Marc (39:53.37)
some arguments that we should have judges replaced with AI. And I'm just saying, the more you think about it, everything could be replaced eventually, for the most part.

Matt (40:03.984)
Medical doctors... could you imagine if we had medical doctors that are AI? Then I don't have to pay an outrageous fee to somebody who went to medical school to do these kinds of things. You'd imagine surgery could get done better with a robot than with humans. But yeah, we've adapted, right? There was a time when we needed everybody to produce food, and we don't have such an agricultural society anymore. There was a time when we needed a lot of people to work in factories. We don't need that anymore.

Marc (40:08.12)
Star Wars.

Marc (40:18.307)
No mistakes.

Matt (40:31.818)
So I think we'll always find a way to adapt and evolve and to stay ahead of the tools that we develop. What's going to happen is we're going to need people to work at AI data centers, right? They consume a lot of energy and they consume a lot of resources. So we'll need people to produce resources and maintain the resources that keep the AI data centers running.

Shawn (40:37.887)
That's fair.

Marc (40:51.083)
until the AI can tell you how to do it with less energy.

Matt (40:56.392)
Wow. But then somebody still has to do it, right?

Marc (40:59.563)
Yeah, but so again, it's not going to be every single human is now unemployed and just loving, but a lot. I mean, you say that we aren't on the farms anymore, we aren't in the factories, but where are we? There's a lot of McDonald's employees with master's degrees.

Matt (41:12.318)
Yeah, no, I yeah, I

Matt (41:18.792)
I think that there are some big changes that are happening that are going to happen in the next few years in the global economy that's going to cause a lot of pain. And I think that that's where the gospel of Jesus Christ becomes more important than ever. Because if you're asking AI how to help you with those problems, AI is not going to give you as good of answers as God would give you of how to handle your personal situation.

Shawn (41:24.297)
for sure. Yeah.

Shawn (41:40.859)
Amen.

Marc (41:41.089)
Mmm, full circle. Well done. You get a point on that one.

Matt (41:44.266)
Okay. It took the entire episode, but I finally earned points. And so we're going to end on that. Hey, thanks, you guys, for joining me. Listeners, thank you for joining us this week. Let us know what you think. I know that Mark and Sean took some controversial positions that maybe need to be pushed back on just a little bit. We would love to hear about that.

Shawn (42:03.199)
You

Marc (42:03.609)
I disagree with that entirely. I'm 38 by the

Matt (42:08.234)
38 years young. That's great.

Shawn (42:08.543)
There you go. Young man.

Marc (42:09.922)
So, yes, I'm young and old.

balancing.

Matt (42:17.416)
All right, everybody, have a great week.

Marc (42:19.521)
Bye.