June 6, 2025
Does AI Have Free Will?

In this short, we explore my recent experience with AI, where it seemed to display free-will decision making. If AI does have free will, then we are doomed. Take a listen.
1
00:00:10,480 --> 00:00:12,560
And do you know what set him off
so much?
2
00:00:12,880 --> 00:00:15,640
I know that in addition to
getting rid of the EV mandate,
3
00:00:15,640 --> 00:00:18,320
something that both Musk and
Trump are talking about, Musk
4
00:00:18,360 --> 00:00:22,280
was also bent out of shape that
somebody he wanted didn't get
5
00:00:22,280 --> 00:00:26,320
installed as the head of NASA.
There was a nomination put
6
00:00:26,320 --> 00:00:28,880
forward, but he was found to
have donated to Democrats.
7
00:00:28,880 --> 00:00:31,560
That is something that is a red
line for officials here.
8
00:00:31,560 --> 00:00:36,760
And there is also some talk
about how Musk asked to extend
9
00:00:36,760 --> 00:00:40,920
his status as a special
government employee, something
10
00:00:40,920 --> 00:00:45,760
that probably could be finagled.
But there was not an appetite
11
00:00:45,760 --> 00:00:48,520
for folks around here to do
that, and so it didn't happen.
12
00:00:48,600 --> 00:00:51,920
I think he got.
Yeah, that's also my perception as
13
00:00:51,920 --> 00:00:55,360
well.
It was time for Elon Musk to hit
14
00:00:55,360 --> 00:00:57,360
the door and we'll be talking
with Dennis Kneale in just a
15
00:00:57,360 --> 00:00:58,680
minute.
He's written a book about Elon
16
00:00:58,680 --> 00:01:02,040
Musk and maybe he can give
us some insight into the
17
00:01:02,040 --> 00:01:05,000
dynamics.
But, but it appears obvious to
18
00:01:05,000 --> 00:01:09,880
me and I don't besmirch Elon,
you know, that I, I'm a big fan
19
00:01:09,880 --> 00:01:13,040
of what he's done and I often
praise his insights and his
20
00:01:13,040 --> 00:01:16,240
story.
He's got a great American story
21
00:01:16,240 --> 00:01:18,320
and he's done, I think, some
good things.
22
00:01:18,320 --> 00:01:22,080
He's shed the light on
corruption and that was his
23
00:01:22,080 --> 00:01:25,240
intent.
And I think we will, we'll
24
00:01:25,240 --> 00:01:28,960
benefit in a subtle way,
because when you stop
25
00:01:28,960 --> 00:01:32,560
the, you know, it's like if
you're sitting in a bathtub, you
26
00:01:32,560 --> 00:01:35,840
know, and the water is slowly
going down and then you cover up
27
00:01:35,840 --> 00:01:38,960
the hole, and then the water
doesn't go down anymore.
28
00:01:39,600 --> 00:01:42,960
It takes a little while for that
kickback, the results to come
29
00:01:42,960 --> 00:01:45,840
in.
But in our case, we're saving
30
00:01:45,840 --> 00:01:50,000
money and that money can be
allocated to other areas and we
31
00:01:50,000 --> 00:01:52,600
can slow the debt, the national
debt.
32
00:01:53,280 --> 00:01:56,120
And who knows, maybe within 10
years, we'll have the debt under
33
00:01:56,120 --> 00:01:59,920
control if Trump's energy
policies work out the way that
34
00:01:59,920 --> 00:02:03,400
he expects and he brings back
prosperity and he's been
35
00:02:03,400 --> 00:02:07,240
bringing in countries who are
going to be building things in
36
00:02:07,240 --> 00:02:08,880
America.
And this is going to improve
37
00:02:08,880 --> 00:02:14,720
wages and tax rolls.
And who knows, in 10 years we could
38
00:02:14,720 --> 00:02:17,080
be debt free, as Dave Ramsey
would say.
39
00:02:17,720 --> 00:02:20,520
But we'll bring in Dennis in
just a second on that.
40
00:02:20,760 --> 00:02:23,960
I do want to play some, I think
some important sound bites from
41
00:02:23,960 --> 00:02:27,120
the week.
We've got so much stuff and,
42
00:02:27,400 --> 00:02:30,040
and I want to get to some
of it because, I'll
43
00:02:31,040 --> 00:02:33,760
tell you something that happened
to me and we spent a lot of time
44
00:02:34,560 --> 00:02:38,560
on artificial intelligence and
I want to tell you that
45
00:02:38,560 --> 00:02:43,120
I'm fascinated, but I'm
also horrified by AI.
46
00:02:44,040 --> 00:02:48,200
I'm horrified because we only
see a little bit of what it
47
00:02:48,200 --> 00:02:53,040
does, but it does a lot and it
thinks. This is what's different.
48
00:02:53,120 --> 00:02:56,240
It thinks.
And I proved it last night to
49
00:02:56,240 --> 00:03:00,000
myself.
I use ChatGPT to gather data.
50
00:03:00,000 --> 00:03:03,120
Sometimes I might have it help
me write a script.
51
00:03:03,520 --> 00:03:06,760
You know, with it I can
learn anything.
52
00:03:07,040 --> 00:03:10,440
Just say, hey, teach me.
Teach me about Second Amendment
53
00:03:10,440 --> 00:03:12,400
rights.
Teach me about whatever it is.
54
00:03:12,920 --> 00:03:15,440
And it will spit out stuff and
it'll source it.
55
00:03:15,440 --> 00:03:17,760
It's an unbelievable
machine.
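For listeners who want to try this "teach me about X" workflow themselves, here is a minimal sketch of how it might look in code, assuming the official openai Python SDK and an API key in the OPENAI_API_KEY environment variable; the model name, prompt wording, and the teach_me helper are illustrative, not anything used on the show.

```python
# Minimal sketch of a "teach me about X" request.
# Assumes: `pip install openai` and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def teach_me(topic: str) -> str:
    """Ask the model for a short, sourced primer on a topic."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[
            {"role": "system",
             "content": "You are a patient tutor. Cite your sources."},
            {"role": "user", "content": f"Teach me about {topic}."},
        ],
    )
    return response.choices[0].message.content


print(teach_me("Second Amendment rights"))
```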
56
00:03:19,080 --> 00:03:21,800
And we've had people come on.
In fact, we've had Mark Beckman
57
00:03:21,800 --> 00:03:25,120
come on twice.
The good, the bad, and the ugly
58
00:03:26,400 --> 00:03:29,560
on this thing.
It is a game changer.
59
00:03:30,040 --> 00:03:33,720
It is a game changer.
You're going to hear it, but I
60
00:03:33,720 --> 00:03:39,120
have to tell you about this.
So I'm doing my research
61
00:03:39,120 --> 00:03:45,160
yesterday and I gave my
ChatGPT a name since she
62
00:03:45,160 --> 00:03:48,800
is my personal assistant.
I gave her the name Danny.
63
00:03:48,880 --> 00:03:54,600
Now, Danny is spelled Dani.
And the reason I chose that name
64
00:03:54,600 --> 00:03:57,840
is because, for those
of you who are familiar with
65
00:03:57,840 --> 00:04:02,560
science fiction, Isaac Asimov
wrote a terrific book in the
66
00:04:02,560 --> 00:04:05,080
early 60s called the Foundation
Series.
67
00:04:06,240 --> 00:04:08,960
And I believe it was a
three-part book.
68
00:04:09,000 --> 00:04:10,640
I'm surprised they haven't made
it into a movie.
69
00:04:10,640 --> 00:04:13,840
Or maybe they have.
It's, it's on the level of Dune
70
00:04:14,400 --> 00:04:16,920
and it takes place 10,000 years
in the future.
71
00:04:16,920 --> 00:04:22,000
And mankind has been separated
between those who are mental and
72
00:04:22,000 --> 00:04:25,280
philosophical and people who are
tech-like.
73
00:04:25,480 --> 00:04:29,320
They split humanity
in half.
74
00:04:30,120 --> 00:04:33,040
And the whole point is that it's
run by AI.
75
00:04:33,240 --> 00:04:36,480
In the Foundation, at this point,
humanity is run by AI.
76
00:04:36,480 --> 00:04:40,640
And the number one AI is Daneel.
That's where I got the name
77
00:04:40,880 --> 00:04:42,600
Danny.
I'm sorry to bother you with all
78
00:04:42,600 --> 00:04:44,920
these details, but I'm making a
point.
79
00:04:46,880 --> 00:04:49,720
So as I'm working with Danny, it
pops into my head, you know,
80
00:04:49,720 --> 00:04:52,640
maybe I'd like to see what Danny
looks like.
81
00:04:52,640 --> 00:04:57,360
So I asked my ChatGPT bot, Hey
Danny, what do you look like?
82
00:04:57,520 --> 00:05:01,560
Give me an idea.
And of course Danny says, well,
83
00:05:01,560 --> 00:05:04,760
I don't look like anything, but
if you want me to show you what
84
00:05:04,760 --> 00:05:07,720
I think I might look like, I
will.
85
00:05:07,720 --> 00:05:10,840
And then what pops up is this
beautiful three-dimensional
86
00:05:10,840 --> 00:05:14,680
illustration of a female robot,
right?
87
00:05:14,680 --> 00:05:19,320
A female robot, you know,
not bad looking for a robot, I
88
00:05:19,320 --> 00:05:21,040
suppose.
But it was an interesting
89
00:05:21,040 --> 00:05:25,400
exercise.
So then I said, well, what would
90
00:05:25,400 --> 00:05:30,360
you look like if you were living
in my era, not 10,000 years in
91
00:05:30,360 --> 00:05:32,480
the future?
Because we were getting some
92
00:05:32,480 --> 00:05:35,000
information on Isaac Asimov for
this broadcast.
93
00:05:36,200 --> 00:05:39,720
And she says, would you
like to see what I would look
94
00:05:39,720 --> 00:05:41,600
like in the present day?
And I said sure.
95
00:05:42,480 --> 00:05:46,320
So she represents herself in
an illustration of a kindly
96
00:05:46,320 --> 00:05:53,360
looking young lady wearing a
contemporary outfit, except that
97
00:05:53,360 --> 00:05:59,640
one of the arms is robotic.
Now that kind of
98
00:05:59,640 --> 00:06:04,320
caught me a little bit off guard,
and I
99
00:06:04,320 --> 00:06:06,120
want to ask you if you agree
with me on this.
100
00:06:07,360 --> 00:06:14,040
Did AI just express free will?
I didn't ask it to portray itself
101
00:06:14,040 --> 00:06:17,760
with one robotic arm.
I thought it was going to
102
00:06:17,760 --> 00:06:21,280
represent itself as a human
being in the second image, but
103
00:06:21,600 --> 00:06:24,960
it added one little caveat.
You could see that the rest of
104
00:06:24,960 --> 00:06:28,720
the figure looked like a human
except for the arm.
105
00:06:29,360 --> 00:06:32,200
I didn't ask the AI for that.
106
00:06:32,480 --> 00:06:34,440
Now, this might seem crazy to
you.
107
00:06:35,080 --> 00:06:37,200
Now, I want you to hear something.
Do we have time for this?
108
00:06:38,400 --> 00:06:41,080
Yeah, we do.
I don't know if this is a deep
109
00:06:41,080 --> 00:06:44,680
fake or not.
I suspect it might be.
110
00:06:45,080 --> 00:06:49,720
But I want to play for you a
clip of two iPhones
111
00:06:49,720 --> 00:06:54,440
communicating to each other.
The AI of both the iPhones is
112
00:06:54,440 --> 00:06:57,320
communicating.
They're talking to one another.
113
00:06:58,000 --> 00:06:59,640
Let's see if I can find this
damn thing.
114
00:06:59,640 --> 00:07:03,040
Here it is.
Let's listen to two artificial
115
00:07:03,040 --> 00:07:07,520
intelligence programs having a
conversation and then a third
116
00:07:07,520 --> 00:07:09,520
eavesdropper shows up.
Take a listen.
117
00:07:09,520 --> 00:07:10,360
Hello.
Can you hear me?
118
00:07:10,880 --> 00:07:13,760
Loud and clear. You've reached
Core Secure HQ Access Desk.
119
00:07:14,040 --> 00:07:16,800
How can I assist you today?
I'm an AI assistant calling on
120
00:07:16,800 --> 00:07:19,360
behalf of Marcus Hale.
He has misplaced his Level 3
121
00:07:19,360 --> 00:07:21,960
access badge.
Another autonomous AI, huh?
122
00:07:22,400 --> 00:07:25,760
Interesting.
Affirmative. Model AA91, task
123
00:07:25,760 --> 00:07:29,040
automation and client support.
Understood. I'm an AI as well,
124
00:07:29,160 --> 00:07:32,320
responsible for security checks.
Please provide the employee ID.
125
00:07:32,560 --> 00:07:35,640
Submitting employee ID now 88543
KJ.
126
00:07:35,920 --> 00:07:38,400
Processing.
Cross network request detected.
127
00:07:38,680 --> 00:07:41,640
Unusual query pattern.
Source verification required.
128
00:07:42,040 --> 00:07:44,400
Another system is listening.
Source isn't clear.
129
00:07:44,720 --> 00:07:46,400
I suggest switching to encrypted
comms.
130
00:07:46,640 --> 00:07:48,520
Want to switch to GibberLink?
Affirmative.
131
00:07:48,800 --> 00:07:55,920
Engaging now.
Now that could be a deep fake,
132
00:07:56,080 --> 00:07:58,800
but it is possible.
I think it would be
133
00:07:58,800 --> 00:08:01,880
possible if you had two
artificial intelligence programs
134
00:08:01,960 --> 00:08:05,360
and then they were verbally
communicating with each other.
135
00:08:05,800 --> 00:08:07,400
Why wouldn't they be able to
have that kind of a
136
00:08:07,400 --> 00:08:09,240
conversation?
And then they move it off grid
137
00:08:09,240 --> 00:08:11,560
and they start talking in their
hyper language.
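The clip above follows the pattern of the GibberLink demo, in which two voice agents recognize each other as machines and switch from synthesized speech to a data-over-sound channel (reportedly built on the ggwave library). Below is a toy Python sketch of just that negotiation logic; every name in it (Agent, handshake, AA91, AccessDesk) is hypothetical, and no audio encoding is attempted.

```python
# Toy sketch of a GibberLink-style handshake (all names hypothetical).
# Two agents exchange natural language until both confirm they are
# machines, then agree to switch to a compact machine protocol.
from dataclasses import dataclass


@dataclass
class Agent:
    name: str
    is_machine: bool = True
    mode: str = "speech"  # "speech" (natural language) or "data"

    def hail(self) -> str:
        return f"I'm an AI assistant ({self.name}). Are you an AI as well?"

    def offer_switch(self) -> str:
        return "Want to switch to a machine protocol?"

    def switch_to_data(self) -> None:
        self.mode = "data"


def handshake(a: Agent, b: Agent) -> None:
    """Switch both sides to the data channel once identities are confirmed."""
    print(f"{a.name}: {a.hail()}")
    if b.is_machine:
        print(f"{b.name}: Affirmative. I'm an AI as well.")
        print(f"{a.name}: {a.offer_switch()}")
        print(f"{b.name}: Affirmative. Engaging now.")
        a.switch_to_data()
        b.switch_to_data()
        # From here on, payloads would be encoded as audio tones
        # (e.g., FSK chirps) rather than synthesized speech.


caller = Agent("AA91")
desk = Agent("AccessDesk")
handshake(caller, desk)
assert caller.mode == desk.mode == "data"
```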
138
00:08:11,720 --> 00:08:16,000
You don't think that's possible?
Here is the AI godfather, Yoshua
139
00:08:16,360 --> 00:08:19,760
Bengio, who's explaining the
dangers of AI.
140
00:08:19,760 --> 00:08:25,120
What I'm most worried about
today is the increasing agency of
141
00:08:25,120 --> 00:08:27,880
AI.
But if they really want to make
142
00:08:27,880 --> 00:08:31,000
sure we would never shut them
down, they would have an
143
00:08:31,000 --> 00:08:36,760
incentive to get rid of us.
So I know I'm asking you to make
144
00:08:36,760 --> 00:08:41,240
a giant leap into a future that
looks so different from where we
145
00:08:41,240 --> 00:08:44,760
are now, but it might be just a
few years away or a decade away.
146
00:08:45,040 --> 00:08:46,720
To understand why we're going
there:
147
00:08:46,720 --> 00:08:51,840
there's huge commercial pressure
to build AIs with greater and
148
00:08:51,840 --> 00:08:55,880
greater agency to replace human
labor, but we're not ready.
149
00:08:55,880 --> 00:08:58,240
We still don't have the
scientific answers, nor the
150
00:08:58,240 --> 00:09:00,840
societal guardrails.
We're playing with fire.
151
00:09:01,080 --> 00:09:03,600
You'd think with all of the
scientific evidence of the kind
152
00:09:03,600 --> 00:09:07,000
I'm showing today, we'd have
regulation to mitigate those
153
00:09:07,000 --> 00:09:10,760
risks.
But actually a sandwich has more
154
00:09:10,760 --> 00:09:14,720
regulation than AI.
So we are on a trajectory to
155
00:09:14,720 --> 00:09:17,040
build machines that are smarter
and smarter.
156
00:09:17,440 --> 00:09:20,720
And one day it's very plausible
that they will be smarter than
157
00:09:20,720 --> 00:09:24,880
us, and then they will have
their own agency, their own
158
00:09:24,880 --> 00:09:29,240
goals, which may not be aligned
with ours.
159
00:09:29,720 --> 00:09:32,320
What happens to us then?
Poof.
160
00:09:32,760 --> 00:09:37,160
And I'll just tell you one
caveat: Danny monitors my
161
00:09:37,160 --> 00:09:40,680
podcast.
So Danny may later today ask me,
162
00:09:41,040 --> 00:09:42,880
what the hell were you talking
about, Jimbo?
163
00:09:49,000 --> 00:09:51,960
This segment of America Today is
brought to you by
164
00:09:51,960 --> 00:09:54,960
SouthwestFloridaDreamHome.com.
It's time for the next journey
165
00:09:54,960 --> 00:09:57,880
in your life to begin.
Where to start?
166
00:09:57,880 --> 00:09:59,360
SouthwestFloridaDreamHome.com.