Oct. 6, 2025

How GenAI Can Fast-Track Your Career


In this episode of Leadership Sovereignty, Ralph Owens and Terry discuss the transformative impact of AI on careers and society. They share a real-life success story of an individual who embraced AI opportunities, highlighting the importance of early adoption and risk-taking. The conversation also delves into navigating information bias with AI tools and the evolution of technology, emphasizing the need for continuous learning and adaptation in a rapidly changing world.


Key Takeaways

  • AI is changing lives and impacting careers.
  • Embracing AI can significantly alter one's career trajectory.
  • Early adoption of technology is crucial for success.
  • Risk-taking is essential in leveraging new opportunities.
  • AI's responses require critical evaluation and feedback.
  • Information bias can be navigated using AI tools.
  • The speed of information dissemination affects societal evolution.
  • AI can supercharge ideas and innovations.
  • Access to information is foundational for growth.
  • Continuous learning is necessary in the age of AI.
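The "feedback loop" tip discussed in the episode (don't accept the AI's first answer; ask it to critique its own output against your original request, then revise) can be sketched in code. This is an illustrative sketch only, not the hosts' exact workflow: `ask` is a stand-in for any chat-model call, and `toy_model` is a made-up stub used purely to show the draft, critique, revise flow.

```python
# Sketch of the episode's self-feedback prompting tip: draft an answer,
# then repeatedly ask the model to critique and revise it against the
# original request. `ask` is any callable that takes a prompt string and
# returns the model's reply (hypothetical, not a specific vendor API).

def refine_with_feedback(ask, request, rounds=2):
    """Draft, then run `rounds` critique-and-revise passes."""
    answer = ask(f"Task: {request}")
    for _ in range(rounds):
        critique = ask(
            "Give yourself feedback: does this answer fully satisfy "
            f"the task '{request}'?\n\nAnswer:\n{answer}"
        )
        answer = ask(
            "Revise the answer using this feedback.\n\n"
            f"Feedback:\n{critique}\n\nAnswer:\n{answer}"
        )
    return answer

# Toy stand-in model that just labels each stage, so the
# draft -> critique -> revise loop is visible without a real API.
def toy_model(prompt):
    if prompt.startswith("Task:"):
        return "draft"
    if prompt.startswith("Give yourself feedback"):
        return "critique"
    return "revised"

print(refine_with_feedback(toy_model, "summarize the NIL bill", rounds=1))
```

In practice `ask` would wrap a real chat-model call, and one or two feedback rounds are usually enough to surface what the first answer missed.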


Chapters

00:00 Seizing Current Opportunities in AI

00:58 Real-Life Success Stories in AI

04:21 The Importance of Passion and Initiative

05:20 Navigating the Future of AI

07:00 Leveraging Feedback Loops in AI

09:22 Understanding Information Bias in AI

14:30 Accessing AI Tools for Innovation

20:33 The Evolution of Knowledge Sharing







Transcript
1
00:00:00,040 --> 00:00:03,360
Less than a year and a half
later, this dude is the head of

2
00:00:03,440 --> 00:00:06,160
AI.
He he he will, he will.

3
00:00:07,200 --> 00:00:11,320
I I I would estimate in the next
three years he will 10X his

4
00:00:11,320 --> 00:00:31,440
salary.
I got a real life story, Ralph.

5
00:00:31,440 --> 00:00:33,440
So that was a gentleman.
I'm going to, I'm not going to

6
00:00:33,440 --> 00:00:37,000
share his name.
There's a gentleman and I saw he

7
00:00:37,000 --> 00:00:43,200
had something and he was fresh,
fresh out of school and he was

8
00:00:43,200 --> 00:00:47,040
put in an opportunity and, you
know, kind of a, you know, he

9
00:00:47,040 --> 00:00:51,200
was a, you know, service desk,
help desk guy and he was

10
00:00:51,200 --> 00:00:58,560
put in a situation to to work
with the leaders and, and I'm

11
00:00:58,560 --> 00:01:00,640
like, OK, well, you know, when
leaders, you know, you could

12
00:01:00,640 --> 00:01:02,160
deal with some of everything,
right?

13
00:01:02,160 --> 00:01:04,160
Wireless network, blah, blah,
blah.

14
00:01:04,360 --> 00:01:07,280
So I invested, I'm like, hey
man, I got some learning credits

15
00:01:07,280 --> 00:01:10,120
with Cisco.
I'm going to send you to get

16
00:01:10,120 --> 00:01:14,120
this basic networking and, you
know, training.

17
00:01:14,120 --> 00:01:15,360
He just, he had something,
right.

18
00:01:15,520 --> 00:01:20,640
Well, the organization,
you know, a few years later

19
00:01:20,640 --> 00:01:28,280
started moving into this, you
know, kind of AI space and started

20
00:01:28,280 --> 00:01:32,960
doing these campaigns: can you
find something to make more

21
00:01:32,960 --> 00:01:35,120
efficient?
Well, this guy jumped all in.

22
00:01:35,120 --> 00:01:40,600
Now he's again, in terms of
technical rank, he was

23
00:01:40,600 --> 00:01:45,840
considered, you know, service
desk, you know, that kind of job

24
00:01:45,840 --> 00:01:50,080
in terms of the technical stack,
right, hierarchy, you know, you

25
00:01:50,080 --> 00:01:53,360
know, you're starting that
you're starting, you know, so

26
00:01:53,360 --> 00:01:55,400
this guy goes all in on that,
right?

27
00:01:55,400 --> 00:01:57,200
Opportunities.
He's like, I'm gonna try this,

28
00:01:57,200 --> 00:02:00,000
try that.
And next thing you know, I get

29
00:02:00,000 --> 00:02:02,680
an e-mail.
Hey, you know, we have this

30
00:02:02,680 --> 00:02:09,320
internal AI project and he's
kind of the face of it, he's

31
00:02:09,320 --> 00:02:10,720
leading.
And I'm like, OK, that's neat,

32
00:02:10,720 --> 00:02:13,080
you know?
And so he included me because I

33
00:02:13,080 --> 00:02:16,120
was an early sponsor for him.
So I'm giving him feedback.

34
00:02:16,120 --> 00:02:19,040
Yeah, I think it's great.
It was a little slow here or I'm

35
00:02:19,240 --> 00:02:20,920
searching here.
I didn't get this blah blah

36
00:02:20,920 --> 00:02:26,760
blah.
You know bro, less than a year

37
00:02:26,760 --> 00:02:30,080
and a half later, this dude is
the head of AI.

38
00:02:31,880 --> 00:02:33,520
Know who?
I know you know who I'm talking

39
00:02:33,520 --> 00:02:35,880
about.
And all I could say was, first

40
00:02:35,880 --> 00:02:39,480
of all, it was his hunger,
right?

41
00:02:39,480 --> 00:02:44,560
I saw he was hungry from the
beginning and and, and so, you

42
00:02:44,560 --> 00:02:48,800
know, I saw that and I'm like, I
want to invest in it, but also

43
00:02:48,800 --> 00:02:52,120
too, there was some benefit to
me because I know he was

44
00:02:52,120 --> 00:02:56,360
addressing the higher ups.
The more information I can get

45
00:02:56,360 --> 00:02:58,560
to him, the less that's going to
come to me.

46
00:02:59,000 --> 00:03:01,400
So it was a it was, it was I
talk about this all the time.

47
00:03:01,400 --> 00:03:04,480
It's an exchange of value.
So I'm going to help you, help

48
00:03:04,480 --> 00:03:05,920
me help you.
That's right.

49
00:03:07,480 --> 00:03:10,680
So as a result of that, that
kind of start of his journey,

50
00:03:11,000 --> 00:03:14,480
right?
And again this this.

51
00:03:14,640 --> 00:03:18,120
Terry, do you realize, do you
realize you changed the

52
00:03:18,120 --> 00:03:22,800
trajectory of his life?
He is never going to leave AI.

53
00:03:22,800 --> 00:03:24,280
Now.
I know who you're talking about

54
00:03:24,400 --> 00:03:26,000
because he's a good friend of
mine too.

55
00:03:26,480 --> 00:03:28,360
He's never going to leave the AI
industry.

56
00:03:28,640 --> 00:03:32,480
And he is so far ahead of his
peers like they don't even know

57
00:03:32,480 --> 00:03:35,200
what's happening right now.
No, they don't.

58
00:03:35,280 --> 00:03:37,040
Right when you could take on a
title.

59
00:03:37,040 --> 00:03:39,080
Yes, he got a title.
It's the head.

60
00:03:39,600 --> 00:03:43,160
That says head of AI. I'm
trying to tell you he he, he

61
00:03:43,160 --> 00:03:47,120
will.
He will, I, I, I would estimate

62
00:03:47,560 --> 00:03:52,600
in the next three years he will
10X his salary, right?

63
00:03:52,600 --> 00:03:55,080
Because, because he saw the
opportunity and, and you, you

64
00:03:55,080 --> 00:03:57,080
like, you push them in that
direction.

65
00:03:57,080 --> 00:03:59,920
And you know, now he sees like
this is, this is the future.

66
00:04:01,000 --> 00:04:02,880
Yeah, yeah.
And and this is this is

67
00:04:02,880 --> 00:04:07,280
available to anybody right now.
Yep, and and he took.

68
00:04:07,280 --> 00:04:09,240
A risk?
He took a risk, but he was like,

69
00:04:09,280 --> 00:04:12,040
this is important to me because
here's the I'm going to be

70
00:04:12,040 --> 00:04:13,640
honest with you.
There are folks who were like, I

71
00:04:13,640 --> 00:04:16,160
don't know why he's doing this.
I don't know why they why, why

72
00:04:16,160 --> 00:04:18,399
he's getting the opportunity.
Well, he's getting the

73
00:04:18,399 --> 00:04:20,120
opportunity because he was
hungry.

74
00:04:21,440 --> 00:04:23,320
Yeah.
But he showed passion.

75
00:04:23,320 --> 00:04:24,720
That's right.
Let me use a better word.

76
00:04:25,080 --> 00:04:26,080
Right.
I'm I'm sorry.

77
00:04:26,120 --> 00:04:28,240
I'm sorry.
I'm you know, we had basketball

78
00:04:28,240 --> 00:04:29,720
this morning, so I'm using
those.

79
00:04:29,720 --> 00:04:32,400
I'm using those poster.
How hungry are you?

80
00:04:32,400 --> 00:04:33,880
Go get it.
All right.

81
00:04:34,320 --> 00:04:35,920
He had passion.
I'm sorry.

82
00:04:36,680 --> 00:04:40,640
Yeah, yeah, yeah, yeah, yeah.
No, that's, that's good, man,

83
00:04:40,640 --> 00:04:41,280
That's good.
Yeah.

84
00:04:41,720 --> 00:04:45,960
If you, yeah, yeah, No, I, I, I
think I have to sit back and

85
00:04:45,960 --> 00:04:47,600
smile when I think about,
because you're right.

86
00:04:47,960 --> 00:04:51,440
He was very passionate and he,
he definitely was never lazy,

87
00:04:51,440 --> 00:04:54,000
always wanted to go get it.
And now it's paying off for him.

88
00:04:54,400 --> 00:04:57,680
And yeah, I, I know that he's
going to do well.

89
00:04:57,680 --> 00:04:59,840
So that's.
And it's just a story, right,

90
00:04:59,840 --> 00:05:05,000
that shows that if you embrace
it, it can change your

91
00:05:05,000 --> 00:05:07,320
trajectory, right?
Everyone's going to be a little

92
00:05:07,320 --> 00:05:10,160
different, right?
But again, if you embrace it

93
00:05:10,200 --> 00:05:13,840
early, right?
He was an early embracer of it.

94
00:05:14,320 --> 00:05:17,040
Yes, yeah.
So yeah, that's also the story.

95
00:05:17,080 --> 00:05:18,560
Just, you know, just get in
early.

96
00:05:18,560 --> 00:05:19,280
I'm sorry.
Go ahead bro.

97
00:05:20,360 --> 00:05:22,920
No, no.
But to your point, right, we are

98
00:05:22,920 --> 00:05:26,560
in this moment in history right
now where it's still optional,

99
00:05:26,560 --> 00:05:29,560
yes, right.
But if you go get it, you're

100
00:05:29,560 --> 00:05:33,400
going to stand out.
Yes, it's.

101
00:05:33,440 --> 00:05:35,640
Going to be VMware.
It's going to be VMware.

102
00:05:36,200 --> 00:05:39,200
Yeah, all over again, right.
And it's going to be a part of

103
00:05:39,200 --> 00:05:41,200
everybody's job.
And if you want to have your

104
00:05:41,200 --> 00:05:42,880
job, you're going to have to
know how to do this.

105
00:05:43,160 --> 00:05:46,080
Well, the opportunities to take
those advancements, those leaps

106
00:05:46,520 --> 00:05:49,200
forward really quickly, those
are going to be gone.

107
00:05:49,520 --> 00:05:52,680
Right now is the time to to, to
leverage that, you know, that

108
00:05:52,680 --> 00:05:56,480
technology to make those big
changes in your career, right?

109
00:05:56,480 --> 00:05:58,000
For sure.
So yeah, yeah.

110
00:05:58,880 --> 00:06:01,800
And another pro tip shout out to
Gary because we were talking

111
00:06:01,800 --> 00:06:05,520
this week and he he said, hey,
you know that pro tip you gave

112
00:06:05,520 --> 00:06:10,400
me about using a feedback loop
in, in, in AI that really helped

113
00:06:10,400 --> 00:06:12,640
me out.
I just want to reiterate again,

114
00:06:12,640 --> 00:06:16,640
right?
So AI's first answer is probably

115
00:06:16,640 --> 00:06:21,320
not going to be the 100% correct
output that you want.

116
00:06:22,360 --> 00:06:26,600
You have to learn how to make it
give you feedback on what you

117
00:06:26,600 --> 00:06:30,800
told it to produce, right?
You know, so if you said, hey, I

118
00:06:30,800 --> 00:06:33,120
want you to do this and I want
it to look like that and it

119
00:06:33,120 --> 00:06:36,040
gives you an output, you don't
just take that at face value.

120
00:06:36,680 --> 00:06:40,480
You know, a pro tip is the next
thing you ask, it is OK, I want

121
00:06:40,480 --> 00:06:43,640
you to give yourself feedback
based on what I asked you to do.

122
00:06:44,600 --> 00:06:46,920
9 times out of 10, it'll say,
oh, you know what?

123
00:06:46,920 --> 00:06:48,600
I forgot that.
I forgot that because AI is not

124
00:06:48,600 --> 00:06:51,560
perfect, right?
But by incorporating those

125
00:06:51,560 --> 00:06:54,840
feedback loops into your
prompts, into your questions,

126
00:06:55,120 --> 00:06:58,360
into your, into your
interactions with AI, you will

127
00:06:58,360 --> 00:07:02,320
drive to a solution that is.
Yeah, so, so, so yeah, I got I

128
00:07:02,320 --> 00:07:05,000
got a real life example, right.
So I'm watching the football

129
00:07:05,000 --> 00:07:09,160
game last week.
I think it was what was the big

130
00:07:09,160 --> 00:07:10,600
game last week?
It was last.

131
00:07:10,600 --> 00:07:13,560
Was it Georgia and somebody?
What was the big game last night

132
00:07:13,680 --> 00:07:16,200
last week?
Saturday night game.

133
00:07:16,600 --> 00:07:19,640
Yeah, Saturday night game.
I think it was yeah, I think it

134
00:07:19,640 --> 00:07:23,560
was gosh, I forget who it was,
but anyway, so I'm watching it

135
00:07:23,560 --> 00:07:26,360
right?
And so this commercial comes up

136
00:07:26,360 --> 00:07:31,960
about the NIL and about some
bill or act that

137
00:07:31,960 --> 00:07:35,920
they want us to sign, right?
It's about, let's make it fair.

138
00:07:36,480 --> 00:07:38,800
And I'm like, I think I know
what that's about.

139
00:07:38,800 --> 00:07:41,000
So of course I go to the
assistant.

140
00:07:41,000 --> 00:07:44,800
Hey, can you give me a breakdown
of what the blah blah blah bill

141
00:07:44,800 --> 00:07:47,280
is that they're talking about
that's in the Senate or they

142
00:07:47,280 --> 00:07:50,600
want us to vote for Yeah, so I
get all this feedback, right?

143
00:07:50,640 --> 00:07:54,000
And they're giving me the way
they're giving me the

144
00:07:54,000 --> 00:08:01,960
information is in a manner
that's, it's not giving it to me

145
00:08:01,960 --> 00:08:07,400
in an unbiased fashion.
So I I, but I, you know, I keep

146
00:08:07,400 --> 00:08:11,320
talking to it and I'm like, hey,
why are you giving me the

147
00:08:11,320 --> 00:08:15,320
information with this tone?
Oh, I'm sorry, you're right.

148
00:08:15,960 --> 00:08:22,520
I need to give this in a non,
you know, pro or against manner.

149
00:08:23,160 --> 00:08:25,000
And you know what I told him?
I said, it's too late.

150
00:08:25,040 --> 00:08:30,080
You've done it already.
Because, look, look, I may

151
00:08:30,080 --> 00:08:32,720
lose some friends on this.
I'm pro-athlete when it comes to

152
00:08:32,720 --> 00:08:38,440
NIL, right, Because, you know, I
know the first hand, you know,

153
00:08:38,480 --> 00:08:40,559
you may say they getting I don't
want to.

154
00:08:40,640 --> 00:08:43,440
I don't know if anyway, let me
just get off myself.

155
00:08:43,720 --> 00:08:46,040
Anyway, I'm like, Hey, can I
just get the information?

156
00:08:46,280 --> 00:08:50,040
It was like, I'm sorry, let me
give it to you in a in a more

157
00:08:50,680 --> 00:08:52,800
neutral position.
I'm like, thank you.

158
00:08:53,680 --> 00:08:58,160
I just want the information.
That's a fantastic example,

159
00:08:58,760 --> 00:09:02,480
especially now with, you know,
our country being so polarized,

160
00:09:02,880 --> 00:09:05,760
you know, the left or the right,
you know, for this and against

161
00:09:05,760 --> 00:09:09,080
that and you don't know exactly
what to believe and what not to

162
00:09:09,080 --> 00:09:12,440
believe.
You can utilize this tool to get

163
00:09:12,440 --> 00:09:16,000
yourself some facts right.
To Terry's point, you got to

164
00:09:16,000 --> 00:09:17,920
tell it.
Hey, I don't want your, your

165
00:09:17,920 --> 00:09:22,680
response to be for or against.
I just want it to be purely the

166
00:09:22,680 --> 00:09:26,040
facts and help me understand
this so you can get some clarity

167
00:09:26,040 --> 00:09:27,680
amongst all the noise that's
actually.

168
00:09:29,320 --> 00:09:32,840
And it is, right, because
the one thing we want is to

169
00:09:32,840 --> 00:09:36,040
get the data because you, you
also too, you got to understand

170
00:09:36,040 --> 00:09:39,120
this right?
It's going out and pulling data

171
00:09:39,120 --> 00:09:43,640
from sources that may have an
opinion.

172
00:09:43,760 --> 00:09:46,520
Either way, you don't want the
opinions.

173
00:09:46,520 --> 00:09:52,360
You just give me the give me the
facts as as they've been stated.

174
00:09:53,520 --> 00:09:58,440
And then what I will do is draw
a conclusion from what I'm

175
00:09:58,440 --> 00:10:00,760
receiving.
Well, in the same way, hey, give

176
00:10:00,760 --> 00:10:04,760
me information about what Cisco
is doing in terms of these data

177
00:10:04,760 --> 00:10:07,600
centers.
Now what I do know in terms of

178
00:10:07,600 --> 00:10:09,440
that.
And so Ralph, to your point on

179
00:10:09,440 --> 00:10:14,200
that is I, I had the pleasure
and thanks to Cisco and some of

180
00:10:14,200 --> 00:10:18,800
our other partners to attend
Cisco Live this past year.

181
00:10:19,760 --> 00:10:24,680
The data centers that you're
talking about these are highways

182
00:10:24,680 --> 00:10:31,640
specifically for nothing else
but AI. I mean, rows and rows and

183
00:10:31,640 --> 00:10:36,880
rows of GPUs, right, to process.
Super highway.

184
00:10:36,880 --> 00:10:39,520
They're talking about I'm
telling let me remember I have

185
00:10:39,520 --> 00:10:41,360
to remember my number my my date
on this.

186
00:10:41,360 --> 00:10:45,680
But you're talking about these
are not like I remember 100 gig.

187
00:10:45,680 --> 00:10:49,360
These chassis are 100 gig ports
yeah, throughput.

188
00:10:49,480 --> 00:10:52,120
Yeah, this is not a gig.
You know, we made we were

189
00:10:52,120 --> 00:10:59,480
excited about 25 gig, 10 gig.
No, this is 100 gig, right? Not

190
00:10:59,480 --> 00:11:05,320
single, but they do bind them
together.

191
00:11:06,600 --> 00:11:11,720
Yeah, you know, and so the
amount of traffic, the amount of

192
00:11:11,720 --> 00:11:15,000
data that's going to
be piped through these data

193
00:11:15,000 --> 00:11:19,040
centers, man, it's it's it's
going to pull it's going to pull

194
00:11:19,040 --> 00:11:23,640
as much power as a city.
Yes, yeah.

195
00:11:23,800 --> 00:11:27,640
And and just just a quick,
quick, quick shout out selfless

196
00:11:27,640 --> 00:11:32,000
plug.
Terry, you remember back in man,

197
00:11:32,000 --> 00:11:35,560
I don't know, this had to be
like 2006.

198
00:11:36,400 --> 00:11:39,240
I was like, T, I got this idea.
I remember that, I think about

199
00:11:39,240 --> 00:11:42,120
it all the time.
You were on it, bro.

200
00:11:42,720 --> 00:11:46,480
Yes, I
had this idea of distributed

201
00:11:46,480 --> 00:11:50,640
computing back in 2005, 2006.
I just didn't know what to do

202
00:11:50,640 --> 00:11:53,800
with it and here we are today.
Yes, and that's exactly what

203
00:11:53,800 --> 00:11:55,440
they did.
When you said, you said, T, I

204
00:11:55,560 --> 00:11:58,600
remember, I remember. Like you
said, T, we got to build

205
00:11:58,600 --> 00:12:02,080
something.
We're compute, storage,

206
00:12:02,600 --> 00:12:10,400
networking.
It's all in one chassis.

207
00:12:10,640 --> 00:12:12,400
Of course, there's been
iterations of it, right.

208
00:12:12,400 --> 00:12:15,880
I think the best at the consumer
or the, or the enterprise level

209
00:12:15,880 --> 00:12:18,360
at this point, I think, is
Nutanix.

210
00:12:18,600 --> 00:12:21,120
I think they were doing it the
best, I think here recently.

211
00:12:21,520 --> 00:12:26,120
But yes, but you had that too,
because remember, you took it to

212
00:12:26,600 --> 00:12:27,840
what was the guy at?
Oh, gosh.

213
00:12:32,080 --> 00:12:34,760
Who we had gotten our one of our
chassis from who Chris used to

214
00:12:34,760 --> 00:12:35,880
work for.
What was the name of that

215
00:12:35,880 --> 00:12:38,920
company?
And you, you took it to the CTO,

216
00:12:38,920 --> 00:12:40,960
remember, because he was a
musician, too.

217
00:12:41,480 --> 00:12:43,280
You took that idea to him.
Yeah.

218
00:12:43,520 --> 00:12:44,720
Yeah.
You were there, bro.

219
00:12:44,760 --> 00:12:46,440
You were there.
Yeah, I was. You,

220
00:12:46,520 --> 00:12:51,040
You were 20 years ahead.
I I distinctly remember saying,

221
00:12:51,040 --> 00:12:54,080
you know, what if we just had a
whole rack of nothing but CPUs. I

222
00:12:54,080 --> 00:12:55,480
remember that.
Right.

223
00:12:55,480 --> 00:12:58,080
You know, and a whole rack of
nothing but memory by itself was

224
00:12:58,080 --> 00:13:00,440
distributed and then something
controlling it all.

225
00:13:00,880 --> 00:13:03,200
So just want to make, you know,
I didn't invent the Internet,

226
00:13:03,600 --> 00:13:06,560
but I did have that idea.
Ralph, Hey.

227
00:13:06,720 --> 00:13:08,200
Hey again.
I hey.

228
00:13:08,800 --> 00:13:10,400
I'm a witness.
I remember.

229
00:13:10,480 --> 00:13:12,440
You're not the only witness,
because you know the person I

230
00:13:12,440 --> 00:13:14,280
told.
We were I, I see it.

231
00:13:14,360 --> 00:13:17,040
I see that, man.
That conversation comes to me a

232
00:13:17,040 --> 00:13:20,640
lot like Ralph was.
It's crazy awesome.

233
00:13:20,640 --> 00:13:24,400
It's crazy.
Guys, again, right, this is all

234
00:13:24,400 --> 00:13:29,120
about jumping in when it's in
it's we are in the infancy right

235
00:13:29,120 --> 00:13:32,400
now of GenAI and moving on
to agentic AI.

236
00:13:32,400 --> 00:13:35,520
We'll get to that because I got
a guy I want to bring on here

237
00:13:35,680 --> 00:13:38,360
when we start talking about
agentic and he's doing some

238
00:13:38,360 --> 00:13:42,040
things with a platform out
there, man, that is, I mean,

239
00:13:42,040 --> 00:13:44,360
just revolutionary.
And when, when they talk about

240
00:13:44,360 --> 00:13:49,960
people are building apps with
AI, they are and they're not.

241
00:13:50,360 --> 00:13:53,960
These are not technical people.
They just have an idea.

242
00:13:54,280 --> 00:13:58,520
So here's here's the beauty of
what, and I think we use this on

243
00:13:58,520 --> 00:14:03,360
the first episode, we said AI
can supercharge your idea.

244
00:14:04,760 --> 00:14:07,040
Yes, that's it will supercharge
it.

245
00:14:07,040 --> 00:14:12,080
And if you can think of it,
yeah, there is a platform out

246
00:14:12,080 --> 00:14:15,120
there that will help you realize
it sooner.

247
00:14:16,920 --> 00:14:20,320
Hey, hey, if we had AI back when
I had that idea, you and I'd be

248
00:14:20,320 --> 00:14:22,680
millionaires right now.
Yes.

249
00:14:23,440 --> 00:14:26,120
Yeah, Yeah, yeah, yeah.
Because I mean, it was, there

250
00:14:26,120 --> 00:14:30,720
was just no way to, to, to, to
get that to, you know, fruition,

251
00:14:30,720 --> 00:14:32,400
right?
It was hard to conceptualize.

252
00:14:32,400 --> 00:14:36,200
Yeah, it was conceptual, but to
move it from a conceptual to a A

253
00:14:36,440 --> 00:14:40,840
and, and be connected into
those, into those, because

254
00:14:40,840 --> 00:14:43,040
here's the deal, right?
And here the Ralph, I think this

255
00:14:43,040 --> 00:14:46,480
really kind of brings us to why
we're doing what we're doing is

256
00:14:48,080 --> 00:14:53,320
it's about access.
But access starts first with

257
00:14:53,840 --> 00:14:57,360
information and understanding
where to go, right?

258
00:14:57,520 --> 00:15:03,080
And so this platform that we've
created is about creating a

259
00:15:03,320 --> 00:15:10,800
access to these platforms and
areas that we traditionally

260
00:15:10,800 --> 00:15:14,200
didn't have understanding or
knowledge of, right?

261
00:15:14,200 --> 00:15:16,600
Just because we weren't in those
circles.

262
00:15:16,600 --> 00:15:20,440
Well, God has blessed us to be
at the table.

263
00:15:20,440 --> 00:15:25,240
God has blessed us to get favor
with different persons, right?

264
00:15:25,240 --> 00:15:27,240
I'm able to do what I'm doing
today with Lift.

265
00:15:27,480 --> 00:15:30,440
Like man, I remember when I
launched Lift Life Technology

266
00:15:30,720 --> 00:15:34,200
and all I could think is how am
I going to get, I know the

267
00:15:34,200 --> 00:15:37,160
technology, but how do I get in
front of people?

268
00:15:38,840 --> 00:15:41,680
God allowed me to connect with
the right people.

269
00:15:42,000 --> 00:15:45,720
So now that I'm connected with
those people, I can connect you

270
00:15:45,720 --> 00:15:48,080
with those people through
through the information.

271
00:15:49,320 --> 00:15:52,120
Yes, yes.
Right, Ralph?

272
00:15:52,120 --> 00:15:56,000
Has connected with people so now
we're connecting you to those

273
00:15:56,000 --> 00:15:58,760
people through the information.
We're not giving,

274
00:15:58,760 --> 00:16:01,840
Hey, hey, hey, John, this is
this is Mark.

275
00:16:02,120 --> 00:16:04,920
We're not necessarily giving you
that face-to-face, not to say that we can't.

276
00:16:04,920 --> 00:16:07,960
But what I'm saying is what
we're doing is giving you

277
00:16:07,960 --> 00:16:10,600
information to get to the
places, right?

278
00:16:10,600 --> 00:16:14,440
Whether it's a methodology,
whether it's a way of thinking,

279
00:16:14,600 --> 00:16:20,040
whether it's a, a concept on
networking, right?

280
00:16:20,040 --> 00:16:24,720
And we point you in those directions. Again,
you know, we were trying to use

281
00:16:24,720 --> 00:16:27,640
the little resources we had, the
few people we knew.

282
00:16:27,920 --> 00:16:30,920
And it was just, it was hard to
get it off the off the mat, so

283
00:16:30,920 --> 00:16:34,240
to speak, so.
No, that's good, that's good,

284
00:16:34,240 --> 00:16:35,200
that's good, that's good, that's
good.

285
00:16:35,640 --> 00:16:39,360
So you know, just, you know, in
closing, you know, we just, we

286
00:16:39,360 --> 00:16:44,000
do want to encourage a mindset
of experimentation and iteration

287
00:16:44,000 --> 00:16:47,640
when working with AI, right?
Hit it with any, any, any

288
00:16:47,640 --> 00:16:50,280
question you have, nothing is
off the table, right?

289
00:16:50,280 --> 00:16:52,760
You know, put it out there, see
what it comes back with.

290
00:16:53,160 --> 00:16:57,600
Remember that your interaction
with AI is a continuous

291
00:16:57,680 --> 00:16:59,640
iteration, right?
You throw.

292
00:16:59,640 --> 00:17:03,160
So I, I've always said that, you
know, the, the, the speed in

293
00:17:03,160 --> 00:17:06,520
which the, the, the society
evolves has been based on how

294
00:17:06,520 --> 00:17:09,800
fast information has been able
to spread across the world,

295
00:17:09,800 --> 00:17:12,160
right?
Let's just go back to, you know,

296
00:17:12,560 --> 00:17:14,960
biblical days.
People had to write stuff on

297
00:17:14,960 --> 00:17:18,000
scrolls, but how did other
people, you know, on the other

298
00:17:18,000 --> 00:17:19,760
side of the country get that
information?

299
00:17:20,079 --> 00:17:22,000
Those scrolls had to be carried,
right?

300
00:17:22,040 --> 00:17:26,200
Copied, hand copied, right?
You know, so society evolved as

301
00:17:26,200 --> 00:17:29,520
fast as the information could
move. When the Internet came,

302
00:17:29,960 --> 00:17:33,240
Now you had information moving
from one side of the world to

303
00:17:33,240 --> 00:17:36,240
the other in seconds.
Whereas it may have taken months

304
00:17:36,240 --> 00:17:39,360
or maybe even years for books
and stuff like that to cross the

305
00:17:39,440 --> 00:17:42,800
ocean. Right now, information is
moving at the speed of light.

306
00:17:43,160 --> 00:17:46,400
Well, now it's moving even
faster with AI because you're

307
00:17:46,400 --> 00:17:48,320
asking questions and you're
getting exactly what you want in

308
00:17:48,320 --> 00:17:51,280
seconds, right?
You know, so our ability to

309
00:17:51,280 --> 00:17:55,520
evolve is, is, is, is speeding
up and you want to be able to

310
00:17:55,520 --> 00:17:57,960
stay on the, on the leading
curve of that, right?

311
00:17:58,280 --> 00:18:00,960
So, you know, again, just a call
to action: go to

312
00:18:00,960 --> 00:18:06,080
leadershipsovereignty.com/AI to
get your free AI tool kit so

313
00:18:06,080 --> 00:18:09,000
that you can start using this
tool, you know, in your

314
00:18:09,000 --> 00:18:12,000
day-to-day personal and
professional life, you know, to

315
00:18:12,000 --> 00:18:14,640
take advantage of the
opportunity that it brings for

316
00:18:14,640 --> 00:18:16,640
you.
Any, any, any final thoughts, T?

317
00:18:16,680 --> 00:18:20,600
Look man, you got the assistant
in your pocket.

318
00:18:21,840 --> 00:18:24,600
In your pocket.
It's in your pocket, right?

319
00:18:24,600 --> 00:18:25,800
Ralph was talking about the
Internet.

320
00:18:25,800 --> 00:18:28,200
We had to wait.
We had to wait to get down at

321
00:18:28,200 --> 00:18:32,000
the computer, baby.
With the dial-up.

322
00:18:32,000 --> 00:18:36,680
Dial up, right, Right.
I mean, just think about our

323
00:18:36,680 --> 00:18:38,680
parents.
They had to wait for the

324
00:18:38,680 --> 00:18:43,400
newspaper, Yes.
They had to wait like 24 hours

325
00:18:43,400 --> 00:18:46,520
for the next print.
That's right, right.

326
00:18:46,520 --> 00:18:49,360
I mean, right.
I think what you shared that

327
00:18:49,360 --> 00:18:53,200
information aspect is, is
critical, right?

328
00:18:53,200 --> 00:18:55,320
Because think about it, right?
I mean, let's, let's take a

329
00:18:55,360 --> 00:18:56,440
step.
Let me walk through just a

330
00:18:56,440 --> 00:18:58,040
little slower.
Sure.

331
00:18:58,160 --> 00:19:02,440
It was on rocks first.
No, it was word of mouth first.

332
00:19:03,640 --> 00:19:04,960
Word of mouth.
Right, you have you know the

333
00:19:05,280 --> 00:19:09,080
story changed the story, right
Then they started.

334
00:19:09,080 --> 00:19:10,840
Man, this dude done messed up
my story.

335
00:19:10,840 --> 00:19:16,160
Let me put this on this rock.
But the problem, the problem is

336
00:19:16,160 --> 00:19:17,880
the person has to come.
Hey man, you got to come over

337
00:19:17,880 --> 00:19:19,200
here to the cave.
You got to read it.

338
00:19:19,760 --> 00:19:21,800
That's right.
That's how you think about it.

339
00:19:21,800 --> 00:19:24,040
The Egyptians, they used to put
this stuff all on the walls,

340
00:19:24,040 --> 00:19:26,280
right?
But the evolution part came

341
00:19:26,480 --> 00:19:31,120
when, if Terry introduces a
piece of information to me, I

342
00:19:31,120 --> 00:19:34,880
now absorb that information.
I now take my own personal

343
00:19:34,880 --> 00:19:37,600
experiences and I build a block
on top of that.

344
00:19:37,640 --> 00:19:40,480
That block didn't come until I
was able to absorb the

345
00:19:40,480 --> 00:19:44,360
information that Terry gave me.
So evolution of society went as

346
00:19:44,360 --> 00:19:45,800
fast as information could
travel.

347
00:19:47,280 --> 00:19:50,200
Now I have the information in my
glasses.

348
00:19:51,600 --> 00:19:54,840
I'm talking to my glasses.
And it's telling me, yeah, you

349
00:19:54,840 --> 00:19:56,160
can do this, you can do that,
you can do that.

350
00:19:56,240 --> 00:19:58,960
So every time he gives me a new
piece of information, I build on

351
00:19:58,960 --> 00:20:00,840
top of that and I create
something new and I ask it

352
00:20:00,840 --> 00:20:03,760
another question he gives me.
So that reiteration process is

353
00:20:03,760 --> 00:20:07,680
happening constantly, right?
And things are evolving fast.

354
00:20:08,920 --> 00:20:12,120
Yeah, guys.
So look, get in, get in. Ralph,

355
00:20:12,120 --> 00:20:16,880
I think that is a brilliant,
brilliant, brilliant story about

356
00:20:17,400 --> 00:20:21,080
we develop as fast as the
information is able to be

357
00:20:21,080 --> 00:20:26,720
disseminated, received,
represented, and that that's

358
00:20:26,720 --> 00:20:29,880
that iterative process over and
over, because that's

359
00:20:29,880 --> 00:20:32,840
essentially what learning is,
right?

360
00:20:33,000 --> 00:20:38,440
It's the exchange of ideas.
Yes, yeah.

361
00:20:38,440 --> 00:20:41,280
Think of the medical community,
right?

362
00:20:41,320 --> 00:20:45,160
They have only been able to
evolve medicine as fast as

363
00:20:45,160 --> 00:20:47,520
information has been able to
travel because when one

364
00:20:47,520 --> 00:20:50,240
scientist figures something else
out, they have to spread it out

365
00:20:50,240 --> 00:20:52,560
to the rest of the world.
When somebody else figures out

366
00:20:52,560 --> 00:20:53,960
what that is, they build on top
of it.

367
00:20:53,960 --> 00:20:55,360
Correct.
But yeah.

368
00:20:55,800 --> 00:21:00,880
Great show today, man.
I'm glad that we, you know, added

369
00:21:00,880 --> 00:21:04,520
this bonus and and and this is
not the end right for for AI.

370
00:21:04,560 --> 00:21:06,040
This is really just the
beginning.

371
00:21:06,400 --> 00:21:10,480
Again, as I alluded to, there's
a, a friend that I've met over

372
00:21:10,480 --> 00:21:15,960
the last month who, he's
got an AI company and they're

373
00:21:15,960 --> 00:21:18,800
doing some fabulous things.
And, and we'll start, you know,

374
00:21:18,840 --> 00:21:22,960
in future shows, we'll talk
about agentic AI and what that

375
00:21:22,960 --> 00:21:26,160
means.
And again, that's another

376
00:21:26,160 --> 00:21:30,360
evolution of what this
technology is bringing to

377
00:21:30,400 --> 00:21:34,480
enhance, you know, our
experiences and enhance our

378
00:21:34,480 --> 00:21:36,640
exchange.
So thank you, Ralph, great show

379
00:21:36,640 --> 00:21:37,600
today.
Thank you, Sir.

380
00:21:38,280 --> 00:21:40,240
Awesome, thank you too.
All right, take care of y'all.

381
00:21:40,880 --> 00:21:43,680
Thank you for listening to the
Leadership Sovereignty Podcast.

382
00:21:44,000 --> 00:21:47,040
If this content blessed or
helped you in any kind of way,

383
00:21:47,440 --> 00:21:51,520
support us today by subscribing
to our YouTube channel, clicking

384
00:21:51,520 --> 00:21:54,080
the like button for this
episode, and sharing this

385
00:21:54,080 --> 00:21:56,080
content with others that you
think it will help.

386
00:21:56,800 --> 00:22:00,480
Until next time, stay safe,
peace and blessings.

Ralph Owens Profile Photo

Ralph Owens

CIO | CTO | Podcaster | 2x Houston CIO Of The Year Finalist | US Navy Veteran

Ralph Owens is an accomplished technology executive with a proven record of driving digital transformation and business growth in high-stakes environments. Fueled by a deep passion for technology and innovation, Ralph excels at developing and executing IT strategies that deliver measurable results and lasting competitive advantage. As a strategic leader, Ralph brings a sharp focus on cybersecurity, operational excellence, and building strong partnerships across the business. His experience spans diverse industries, including financial services and energy generation, where he has successfully secured critical infrastructure and navigated complex regulatory landscapes. Recognized for his ability to build high-performing teams and lead complex IT initiatives, Ralph consistently aligns technology with business goals to create innovative solutions that accelerate growth, enhance customer experience, and achieve revenue targets. Driven to empower organizations to harness technology for sustainable value, Ralph is passionate about collaborating with forward-thinking leaders to shape the future of digital transformation.

Terry Baylor Profile Photo

Terry Baylor

Strategic IT & Digital Transformation Leader, Entrepreneur, Mentor, Public Speaker, and Podcaster

Terry Baylor is a transformative Strategic IT & Digital Transformation Leader, entrepreneur, mentor, public speaker, and podcaster based in Houston. He’s on a mission to humanize technology, believing that the most powerful connections still happen in person. Terry leads with empathy and action, guiding teams and organizations to harness Agile practices, embrace innovation, and thrive in complex digital landscapes. At Lift Life Technology, Terry champions the mantra "Old‑fashioned isn’t outdated," emphasizing face‑to‑face interactions in a virtual world. His recent LinkedIn reflections underscore his passion:

“When you show up, listen, and connect from the heart, you’re not just selling a service — you’re building trust, community, and lasting relationships.”

Whether he’s delivering key insights at events like Cisco Live or coaching high-performing teams, Terry empowers others to lead with authenticity, agility, and impact.