July 28, 2025
Is AI Woke and How To Fix It

AI expert and PICKAX CEO Jeff Dornik joins us to talk about how to keep AI from being tainted with woke ideology. Information should be unbiased, but as we have seen, Silicon Valley has other plans.
1
00:00:00,120 --> 00:00:02,560
All right, welcome back, America
today.
2
00:00:02,560 --> 00:00:05,400
You know, woke is everything.
Our AI is everything now.
3
00:00:05,400 --> 00:00:09,880
And I wanted to bring in Jeff
Dornik about this because it's
4
00:00:09,880 --> 00:00:13,800
very important about AI and
where we're headed and what it
5
00:00:13,800 --> 00:00:17,640
means.
And Jeff is the CEO of Pickax,
6
00:00:18,040 --> 00:00:21,200
a groundbreaking social media
platform built on two
7
00:00:21,200 --> 00:00:25,360
uncompromising principles,
freedom of speech and freedom of
8
00:00:25,360 --> 00:00:27,360
reach.
Welcome to the program.
9
00:00:27,360 --> 00:00:29,160
We appreciate you coming on.
Jeff, how are you?
10
00:00:30,280 --> 00:00:31,680
Doing very well, thanks for
having me on.
11
00:00:31,960 --> 00:00:33,320
Yeah, it's a pleasure, real
pleasure.
12
00:00:33,320 --> 00:00:38,000
So what's the drive right now
with the United States trying to
13
00:00:38,000 --> 00:00:42,480
dominate in the world of AI?
What is it about AI that is
14
00:00:42,480 --> 00:00:44,680
making our government so excited
right now?
15
00:00:46,240 --> 00:00:49,080
Well, I think there's a
couple of different issues all
16
00:00:49,280 --> 00:00:52,200
playing into this right now.
You know, number one, you've got China,
17
00:00:52,200 --> 00:00:55,200
which is investing
heavily into both artificial
18
00:00:55,200 --> 00:00:58,760
intelligence and quantum
computing, to where from
19
00:00:58,760 --> 00:01:01,280
a national standpoint, you know,
if you're looking at different
20
00:01:01,280 --> 00:01:03,680
nations, you know, competing in
this space, you know, China's
21
00:01:03,680 --> 00:01:06,760
definitely leading the way.
But then you also have a lot of
22
00:01:06,760 --> 00:01:09,440
the big tech
oligarch-type guys, you know,
23
00:01:09,440 --> 00:01:12,200
the Sam Altmans, the Larry
Ellisons, Mark Zuckerberg, even
24
00:01:12,200 --> 00:01:15,360
Elon Musk, they're all
super heavily invested into
25
00:01:15,360 --> 00:01:18,200
artificial intelligence as well.
And so I think that what's
26
00:01:18,200 --> 00:01:21,520
happened is that a lot of them
have coalesced around President
27
00:01:21,520 --> 00:01:24,560
Trump and convinced him of the
fact that, you know, our
28
00:01:24,560 --> 00:01:29,080
competition, i.e., China, is
going to beat us at the
29
00:01:29,120 --> 00:01:32,840
AI race unless we invest heavily
into building out our own
30
00:01:32,840 --> 00:01:36,960
infrastructure and compete
directly with them, which is a
31
00:01:36,960 --> 00:01:40,200
factual statement.
At the same time, I think what's
32
00:01:40,200 --> 00:01:44,000
not being asked within
our administration is, what are
33
00:01:44,000 --> 00:01:45,720
the potential ramifications of
this?
34
00:01:45,720 --> 00:01:47,920
And is this going to be a
violation of our constitutional
35
00:01:47,920 --> 00:01:51,680
rights in a variety of ways?
And I don't see anybody
36
00:01:51,680 --> 00:01:54,240
actually discussing that side of
things.
37
00:01:54,240 --> 00:01:58,800
So that's definitely a concern.
Well, last week Trump signed
38
00:01:58,800 --> 00:02:02,400
an executive order saying he
doesn't want AI technology
39
00:02:02,400 --> 00:02:06,720
that's been influenced by
partisan bias or woke stuff.
40
00:02:06,720 --> 00:02:09,479
So how do you even, how
can you fix that?
41
00:02:09,840 --> 00:02:13,200
And how infected is AI with
wokeness right now, in your
42
00:02:13,200 --> 00:02:17,080
opinion?
Yeah, I think AI in general is
43
00:02:17,120 --> 00:02:20,960
pretty infected, primarily
because it's like, where
44
00:02:20,960 --> 00:02:23,840
is it getting its information?
You just look at Grok and
45
00:02:23,840 --> 00:02:25,880
obviously it had this
thing a couple weeks ago
46
00:02:25,880 --> 00:02:28,320
where it went haywire, being
completely anti-Semitic.
47
00:02:28,520 --> 00:02:32,280
But just over this last week,
you know, we see Grok
48
00:02:32,480 --> 00:02:35,640
relying on mainstream media
outlets for all of its sources,
49
00:02:35,640 --> 00:02:37,000
whether it's talking about
Russiagate, whether it's
50
00:02:37,000 --> 00:02:38,880
talking about election fraud,
whether it's talking about
51
00:02:38,880 --> 00:02:40,760
COVID-19.
And obviously that's going to
52
00:02:40,760 --> 00:02:43,000
have a much more woke bias.
Now, when you look at what
53
00:02:43,000 --> 00:02:45,440
President Trump is doing when
he's talking about AI, it's
54
00:02:45,440 --> 00:02:47,680
specifically within the
government. He's basically saying
55
00:02:47,680 --> 00:02:50,840
if there's going to be a
government contract to bring in
56
00:02:51,160 --> 00:02:55,320
AI into one of our departments,
it cannot be infected with this
57
00:02:55,320 --> 00:02:58,000
kind of woke ideology, which I
think is going to be very hard
58
00:02:58,000 --> 00:03:02,960
to actually decide
and enforce, just because the
59
00:03:02,960 --> 00:03:06,040
coding can be very convoluted.
So it depends on who's taking a
60
00:03:06,040 --> 00:03:09,360
look at it, who's deciding, you
know, that whole deal.
61
00:03:10,280 --> 00:03:12,640
On the flip side, we have
to remember, with this
62
00:03:12,640 --> 00:03:15,840
last issue, when it
came to this
63
00:03:15,840 --> 00:03:18,400
executive order with President
Trump, again, that was an
64
00:03:18,400 --> 00:03:20,120
executive order.
It wasn't legislation, which
65
00:03:20,120 --> 00:03:23,320
means it can be easily
undone with the next
66
00:03:23,320 --> 00:03:26,000
president of the United States
just with a stroke of a pen.
67
00:03:26,000 --> 00:03:29,480
So because we're not doing this
through legislation, you know,
68
00:03:29,520 --> 00:03:32,600
it's easily reversible.
And the problem that I think the
69
00:03:32,600 --> 00:03:35,400
Trump administration is facing
is that during the big beautiful
70
00:03:35,400 --> 00:03:38,960
bill debate, you know, basically
a lot of what's in
71
00:03:39,080 --> 00:03:41,800
the current executive order was
actually written into the
72
00:03:41,800 --> 00:03:43,560
legislation for the big
beautiful bill.
73
00:03:43,720 --> 00:03:46,720
There was an outcry from the
MAGA base, from several
74
00:03:46,720 --> 00:03:48,960
different leading voices within
the Republican Party, including
75
00:03:48,960 --> 00:03:51,440
Marjorie Taylor Greene and Thomas
Massie.
76
00:03:51,440 --> 00:03:54,560
And so they ended up pulling it
out of the legislation.
77
00:03:54,560 --> 00:03:57,320
And then Trump turns around
and implements a lot of that
78
00:03:57,320 --> 00:04:00,600
with his executive order.
And so I don't necessarily see
79
00:04:00,600 --> 00:04:02,440
that going over well with a lot
of his base.
80
00:04:02,440 --> 00:04:05,720
But also, you know, because it's
by executive order, it can be
81
00:04:05,720 --> 00:04:07,960
easily undone with the next
administration.
82
00:04:08,240 --> 00:04:10,720
Well, you know, we saw what
happened with the Wikipedia,
83
00:04:10,720 --> 00:04:13,600
which sort of became the go-to
to find out everything.
84
00:04:13,920 --> 00:04:16,920
And then it became wokeified
and you had people who could go
85
00:04:16,920 --> 00:04:20,839
on and change things and put in
their own information.
86
00:04:21,120 --> 00:04:23,720
And you know, for example,
because many people are still
87
00:04:23,720 --> 00:04:27,680
sort of getting used to what AI
is, I see it more as a
88
00:04:28,040 --> 00:04:32,040
surveillance tool, a data
gathering tool more than
89
00:04:32,040 --> 00:04:34,480
anything else.
It can help you if you're in
90
00:04:34,480 --> 00:04:38,080
marketing. They've
really come a long way in being
91
00:04:38,080 --> 00:04:41,560
able to develop really cool
things, assets as people call
92
00:04:41,560 --> 00:04:43,760
them.
But it is true.
93
00:04:43,760 --> 00:04:46,760
I think that AI, it's all about
the input.
94
00:04:46,840 --> 00:04:49,080
It's only going to, it doesn't
think for itself.
95
00:04:49,080 --> 00:04:52,480
It's only going to go by the
information that's been fed into
96
00:04:52,480 --> 00:04:56,400
it or that it can curate.
So this is why it becomes woke
97
00:04:56,400 --> 00:05:01,440
because woke is sort of
trending in mainstream America,
98
00:05:01,440 --> 00:05:02,800
right?
Yeah.
99
00:05:02,840 --> 00:05:05,760
And my issue,
especially when it comes to the
100
00:05:05,760 --> 00:05:08,880
government implementation of AI
is not as much about the woke
101
00:05:08,880 --> 00:05:10,840
ideology.
Yes, that can be a concern and
102
00:05:10,840 --> 00:05:12,440
all that.
But in the grand
103
00:05:12,440 --> 00:05:15,240
scheme of things, the most
important issue is where it's
104
00:05:15,240 --> 00:05:16,800
going, where it's being
implemented right now.
105
00:05:17,000 --> 00:05:20,080
I'm sure that the intelligence
agencies are using it, which,
106
00:05:20,080 --> 00:05:21,840
you know, who knows how they're
doing it because of how
107
00:05:21,840 --> 00:05:24,360
secretive they are.
But I think a perfect example of
108
00:05:24,360 --> 00:05:27,720
this is the Department of
HHS, which is now using artificial
109
00:05:27,720 --> 00:05:29,880
intelligence in order
to fast track FDA
110
00:05:29,880 --> 00:05:33,880
approval of particular drugs and
eliminate animal testing, which
111
00:05:33,880 --> 00:05:37,160
I think is horrific to do.
You know, obviously I don't like
112
00:05:37,160 --> 00:05:38,880
torturing puppies and all that
kind of stuff.
113
00:05:38,880 --> 00:05:42,280
But at the same time, we have to
have certain safeguards in order
114
00:05:42,360 --> 00:05:45,720
to protect humans before drugs
go to human trials, right?
115
00:05:46,000 --> 00:05:48,560
And so when we're looking at the
fast tracking of this and
116
00:05:48,560 --> 00:05:52,320
eliminating the animal trials in
exchange for the AI
117
00:05:52,320 --> 00:05:54,680
models, the problem is,
who's actually making
118
00:05:54,880 --> 00:05:56,800
the AI models?
And you could say, oh, well,
119
00:05:56,800 --> 00:05:59,160
under Kennedy, you know, it's
going to be completely neutral.
120
00:05:59,280 --> 00:06:00,960
There's going to be no outside
influence.
121
00:06:00,960 --> 00:06:02,920
He's not going to allow big
pharma in, in order to
122
00:06:02,920 --> 00:06:05,240
influence the FDA or anything
like that with money.
123
00:06:05,320 --> 00:06:07,760
Well, what happens with
the next HHS secretary?
124
00:06:07,760 --> 00:06:09,960
That's the problem, is that more
or less, yeah.
125
00:06:10,000 --> 00:06:12,200
Then yeah.
Then yeah.
126
00:06:12,360 --> 00:06:14,080
Might get one like the last one,
you know.
127
00:06:15,440 --> 00:06:16,760
Exactly.
Exactly.
128
00:06:18,040 --> 00:06:19,960
Tell me about your... We've got a
couple of minutes here.
129
00:06:19,960 --> 00:06:22,360
We're talking with Jeff Dornik.
He's the founder and CEO of
130
00:06:22,360 --> 00:06:25,120
Pickax.
So it sounds like it's sort of a
131
00:06:25,120 --> 00:06:26,920
non-biased...
Tell me what it is.
132
00:06:27,200 --> 00:06:28,360
Tell me about your company.
Yeah.
133
00:06:29,280 --> 00:06:30,040
Definitely.
I appreciate it.
134
00:06:30,040 --> 00:06:31,600
Yeah.
So it's a social media platform
135
00:06:31,600 --> 00:06:34,840
that we built, and it's
kind of a hybrid between X and
136
00:06:34,840 --> 00:06:37,120
Substack with a bunch of other
really cool, you know, features
137
00:06:37,120 --> 00:06:39,840
built into it.
But the whole idea behind it was
138
00:06:39,840 --> 00:06:42,480
not only allowing you to have
freedom of speech, but then also
139
00:06:42,480 --> 00:06:45,120
freedom of reach, which means
that we're not going to use
140
00:06:45,120 --> 00:06:48,240
algorithms in order to
manipulate your exposure to your
141
00:06:48,240 --> 00:06:49,440
audience.
So if somebody follows you,
142
00:06:49,440 --> 00:06:51,160
that means they actually
want to see your content.
143
00:06:51,160 --> 00:06:53,160
So we're not going to get in
between you and your audience,
144
00:06:53,400 --> 00:06:55,840
but we're also finding new and
innovative ways in order to
145
00:06:55,840 --> 00:07:00,360
bridge that gap between users to
users and creators to users as
146
00:07:00,360 --> 00:07:01,760
well.
So things like, you know, we're
147
00:07:01,760 --> 00:07:04,040
going to be implementing an
e-mail newsletter component
148
00:07:04,320 --> 00:07:07,160
where you can send emails to
your subscribers.
149
00:07:07,440 --> 00:07:10,560
You know, we're bringing
up new and innovative ways for
150
00:07:10,560 --> 00:07:13,040
content creators to be able to
monetize their content.
151
00:07:13,600 --> 00:07:16,800
You know, you can write articles
directly within our platform and
152
00:07:16,800 --> 00:07:18,400
and have it readable right
there.
153
00:07:18,400 --> 00:07:21,440
We integrate Rumble into
the platform so that all Rumble
154
00:07:21,440 --> 00:07:23,840
videos are playable in the news
feed as if they're native to the
155
00:07:23,840 --> 00:07:25,160
platform.
So we're doing a lot of really
156
00:07:25,160 --> 00:07:28,240
cool things with this.
And so we've been building
157
00:07:28,240 --> 00:07:30,240
this out.
We've got the web version out
158
00:07:30,240 --> 00:07:33,440
right now at pickax.com.
And then we're going to be
159
00:07:33,440 --> 00:07:35,840
launching the app here next
month as well.
160
00:07:36,040 --> 00:07:38,240
And that'll be on both iOS and
Android.
161
00:07:38,440 --> 00:07:40,360
And we're getting a lot of
amazing people already coming
162
00:07:40,360 --> 00:07:43,600
over, including like Children's
Health Defense, Naomi Wolf, you
163
00:07:43,600 --> 00:07:46,120
know, Shannon Joy, DailyClout,
a bunch of amazing people
164
00:07:46,120 --> 00:07:49,160
already on the platform.
So I highly encourage people to
165
00:07:49,440 --> 00:07:51,480
come on over.
I'm going to check it out.
166
00:07:51,480 --> 00:07:55,520
I'm somewhat happy with
Gettr, but a lot of it's
167
00:07:55,520 --> 00:07:59,080
sort of like an echo chamber. And X,
it's just not keeping up.
168
00:07:59,480 --> 00:08:02,960
So there's not a lot of real
great sources out there for good
169
00:08:02,960 --> 00:08:04,360
social media.
I'm going to check it out.
170
00:08:04,640 --> 00:08:09,400
And it's spelled, by the way,
for people listening, PICKAX,
171
00:08:09,760 --> 00:08:12,200
P-I-C-K-A-X.
And that's the name of the new
172
00:08:12,200 --> 00:08:13,840
social media platform that's out
there.
173
00:08:13,840 --> 00:08:15,920
If you're into social media, I
know I am.
174
00:08:15,920 --> 00:08:19,320
I'm a content producer, so I
like new avenues to reach new
175
00:08:19,320 --> 00:08:20,640
audiences.
So that's good.
176
00:08:21,040 --> 00:08:23,680
What's the most surprising thing?
In less than a minute,
177
00:08:23,680 --> 00:08:26,800
what are you most excited
about with AI, Jeff?
178
00:08:29,480 --> 00:08:32,799
I think the thing that I
am... it's funny,
179
00:08:32,960 --> 00:08:35,880
there's certain
applications obviously of AI
180
00:08:35,880 --> 00:08:38,880
that can be phenomenal,
including, you know, we see
181
00:08:38,880 --> 00:08:41,120
Elon Musk talking about, you
know, allowing blind people to
182
00:08:41,120 --> 00:08:44,200
see and use Neuralink and
different things like that.
183
00:08:44,800 --> 00:08:47,880
I think for me, and this is
going to be kind of backwards a
184
00:08:47,880 --> 00:08:51,720
little bit, but the most
exciting thing that I've been
185
00:08:51,720 --> 00:08:56,760
seeing is that people are waking
up to just how valuable
186
00:08:56,760 --> 00:08:58,600
just human to human connection
is.
187
00:08:58,680 --> 00:09:01,240
Now we're seeing more and more
people going back to just having
188
00:09:01,240 --> 00:09:03,760
human to human connection and
growing their own food and
189
00:09:03,760 --> 00:09:05,720
spending time with their
families and all that kind of
190
00:09:05,720 --> 00:09:06,680
stuff.
And to a certain degree,
191
00:09:06,920 --> 00:09:09,600
checking out from technology.
And that's been one
192
00:09:09,600 --> 00:09:13,160
of the interesting effects of
this really, really big push of
193
00:09:13,320 --> 00:09:16,880
AI is now we're actually seeing
people go out and plant the
194
00:09:16,880 --> 00:09:19,800
garden, actually do their
own research, and figure things
195
00:09:19,800 --> 00:09:21,960
out for themselves.
To me, that's been one of the
196
00:09:21,960 --> 00:09:23,760
most positive things.
Yeah, you're right.
197
00:09:23,760 --> 00:09:25,360
I had to fix something this
weekend.
198
00:09:25,360 --> 00:09:28,600
I thought, I could go on YouTube
right now and get the answer.
199
00:09:28,600 --> 00:09:31,240
But there's something fun about
the discovery of how to solve a
200
00:09:31,240 --> 00:09:33,360
problem, you know?
All right, well, listen, thank
201
00:09:33,360 --> 00:09:37,280
you so much again.
Jeff Dornik is the gentleman
202
00:09:37,280 --> 00:09:41,400
and he's the CEO of Pickax.
That's Pickax, a new social
203
00:09:41,400 --> 00:09:43,880
media app talking about AI.
Thanks again, Jeff.
204
00:09:43,880 --> 00:09:46,240
We'll talk to you again soon.
Thank you.