Sept. 8, 2025

AI at Work: The Risk No One Warned You About. Part 4


Welcome back to The Leadership Sovereignty Podcast. In today’s episode, Ralph and Terry break down the biggest hidden risk of using generative AI at work — data leakage. If you’ve ever wondered whether dropping company files into ChatGPT or Gemini is safe, this conversation is for you. You’ll learn how to protect your career, the right way to use AI tools without risking sensitive information, and why developing prompt engineering skills could set you apart in the workplace. Stay tuned — this episode could save your job.

 

Takeaways

    • The conversation highlights the importance of effective communication in leadership.
    • AI hallucinations can lead to misunderstandings; always verify information.
    • Data leakage is a significant risk when using generative AI tools.
    • Sensitive information should never be input into free AI tools.
    • Prompt engineering is crucial for maximizing the value of AI outputs.
    • Understanding company policies on AI usage is essential for data protection.
    • AI tools can enhance productivity and creativity in various fields.
    • The future job market will increasingly require AI proficiency.
    • Investing in AI skills can lead to significant career advancements.
    • Embracing AI technology is vital for staying relevant in the workforce.

 

Chapters:

00:00 – Welcome & AI video creation demo

01:09 – Trust but verify: AI hallucinations explained

02:27 – Pro tip: Feedback loops with ChatGPT

03:29 – The #1 risk: Data leakage in AI tools

06:22 – Protecting personal and company data

07:38 – Free vs. paid AI tools and policies

10:46 – Normalizing AI safety, like learning to drive

14:16 – Story: A student turns AI into career advantage

18:22 – Why AI creates opportunity, not just risk

21:26 – Key takeaways: prompt engineering, data safety, and career growth

23:15 – Closing and listener survey

 

Transcript

1
00:00:00,120 --> 00:00:02,360
And I think Terry, you, you were
just mentioning right before the

2
00:00:02,360 --> 00:00:06,960
show that Gemini now has the
ability to take text and create

3
00:00:06,960 --> 00:00:07,920
a video.
Is that?

4
00:00:07,920 --> 00:00:11,000
Right, yes, yes, yes.
So I was playing around. I'm

5
00:00:11,000 --> 00:00:13,680
like, hey, what are all the
tools I get with my, you know,

6
00:00:13,680 --> 00:00:17,080
Google Workspace subscription?
And I'm looking through I'm like

7
00:00:17,080 --> 00:00:19,360
most of these I use. I'm like,
Vids, what is that?

8
00:00:19,960 --> 00:00:23,360
And so I start playing around
with it last week and creating

9
00:00:23,360 --> 00:00:27,720
some, you know, content and
yeah, I said, hey, create me a

10
00:00:28,240 --> 00:00:32,360
one minute video on HIPAA and
and and privacy.

11
00:00:32,920 --> 00:00:36,480
And I was astonished with what
it delivered.

12
00:00:36,920 --> 00:00:40,320
Like wow, that this.
That this is the future.

13
00:00:40,320 --> 00:00:42,720
Y'all, y'all have to get on
board with this.

14
00:00:43,000 --> 00:00:45,720
You can't just say, oh, you
know, that's for the kids or

15
00:00:45,720 --> 00:00:48,920
whatnot because you look up in 5
years and this is part of your

16
00:00:48,920 --> 00:00:51,840
job and you have no idea how to
use it right.

17
00:01:09,920 --> 00:01:13,440
So that's what he meant by I
don't believe what anyone says

18
00:01:13,440 --> 00:01:16,600
because I may ask you a question
and you're going to give me a

19
00:01:16,600 --> 00:01:20,960
very confident answer based on
what you thought I said.

20
00:01:21,360 --> 00:01:25,440
But let me probe that a little
further to say, OK, so this is

21
00:01:25,440 --> 00:01:28,560
what I heard you say.
Is this what you meant?

22
00:01:29,400 --> 00:01:31,320
No, that's not what I meant at
all.

23
00:01:31,520 --> 00:01:33,000
Exactly.
Exactly.

24
00:01:33,040 --> 00:01:36,680
Exactly.
In in human terms that in in in

25
00:01:36,680 --> 00:01:40,840
ChatGPT terms, that is an
hallucination.

26
00:01:41,280 --> 00:01:44,600
That person gave you a confident
answer back.

27
00:01:45,000 --> 00:01:46,840
Good.
That's a great example, yes.

28
00:01:47,400 --> 00:01:51,680
Right.
So, but we have to drive in why,

29
00:01:51,760 --> 00:01:54,720
why did we most likely ask that
follow up question?

30
00:01:55,000 --> 00:01:58,160
Because of our experience and
our knowledge and our

31
00:01:58,160 --> 00:02:01,080
understanding on what it is
we're talking about.

32
00:02:01,280 --> 00:02:04,800
And we like, oh, I heard that,
but I don't know if that was

33
00:02:04,800 --> 00:02:10,759
quite what I was looking for or
they didn't seem as confident in

34
00:02:10,759 --> 00:02:12,920
that answer.
And they were kind of all

35
00:02:12,920 --> 00:02:15,520
around, you know, kind of all
around the question without

36
00:02:15,520 --> 00:02:20,000
really answering it directly.
So it's no different right than

37
00:02:20,000 --> 00:02:22,880
when you talk with a human being
the first time someone gives you

38
00:02:22,880 --> 00:02:25,680
a quick answer back, you're
like, OK, you know, let me dig

39
00:02:25,680 --> 00:02:28,160
into that a little more.
It's the same thing guys.

40
00:02:28,160 --> 00:02:33,440
So that principle of trust
but verify still exists.

41
00:02:33,800 --> 00:02:34,960
That's perfect.
That's perfect.

42
00:02:34,960 --> 00:02:35,760
It's perfect.
It's perfect.

43
00:02:35,960 --> 00:02:40,200
And here's a here's a pro tip,
pro tip for using these these

44
00:02:40,200 --> 00:02:41,360
Gen.
AI tools.

45
00:02:42,040 --> 00:02:44,680
And I don't know how many people
really have caught on to this.

46
00:02:44,880 --> 00:02:47,960
I've caught on to it because of
the workflow that I do and it's

47
00:02:47,960 --> 00:02:52,640
been extremely helpful: feedback.
OK, what, what, what do I mean

48
00:02:52,640 --> 00:02:55,040
by that?
When you ask it to give you

49
00:02:55,040 --> 00:02:57,960
something specific, right?
You take the time to create a, a

50
00:02:57,960 --> 00:03:01,560
really well thought out prompt
and you ask it to give you

51
00:03:01,560 --> 00:03:05,520
something very, very specific.
The next question I ask it is

52
00:03:05,520 --> 00:03:08,760
give me feedback on what you
just gave me based on what I

53
00:03:08,760 --> 00:03:11,600
asked you.
It will review it and say, you

54
00:03:11,600 --> 00:03:13,360
know what, you're right, I
missed this.

55
00:03:13,360 --> 00:03:15,600
Let me correct that for you,
right?

56
00:03:15,760 --> 00:03:17,720
And I'll go through that
feedback loop until it's

57
00:03:17,720 --> 00:03:20,840
perfect, right?
So, but most people don't really

58
00:03:20,840 --> 00:03:25,080
realize that ChatGPT can do
feedback on itself, right?

59
00:03:25,080 --> 00:03:29,440
You know so pro tip, put that
into your prompts, right?

60
00:03:29,440 --> 00:03:31,880
So like the prompts that I have,
well, I may have like 4

61
00:03:31,880 --> 00:03:35,240
different sections and I'll say
pause when you finish Section 1

62
00:03:35,400 --> 00:03:39,360
and don't proceed until I
acknowledge, right?

63
00:03:39,360 --> 00:03:42,120
And it'll give me what I know or
want and it'll say, OK, would

64
00:03:42,120 --> 00:03:43,840
you like to move forward?
I'll say no, I want you to give

65
00:03:43,840 --> 00:03:45,760
me feedback on what you just
what you just produced.

66
00:03:46,000 --> 00:03:47,120
And it'll say, oh, you know
what?

67
00:03:47,120 --> 00:03:48,760
I missed this and let me correct
that, right?

68
00:03:49,120 --> 00:03:52,200
Don't take it at face value the
first time, right?

69
00:03:52,200 --> 00:03:54,520
Take the time to review it
yourself, right?

70
00:03:54,520 --> 00:03:57,080
And you can even use the tool
itself to give its own feedback

71
00:03:57,080 --> 00:04:00,400
to improve the output, right?
So feedback loops are really

72
00:04:00,400 --> 00:04:04,080
important.
But yeah, no great, great trust

73
00:04:04,080 --> 00:04:06,560
but verify, that, that really
sums it up, Terry.

74
00:04:06,720 --> 00:04:08,800
I think that's, that's, that's,
that's that's killer.

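The sectioned-prompt and self-feedback pattern described in this segment (work one section at a time, pause for acknowledgment, then ask the tool to critique its own output) can be sketched as plain text. This is only an illustration: the function name and the `FEEDBACK_PROMPT` wording are our own, not part of any AI vendor's API, and nothing here calls a real service.

```python
# Sketch of the sectioned-prompt and self-feedback "pro tip" from the episode.
# It only builds the text you would paste into ChatGPT, Gemini, or Copilot.

# Follow-up message asking the tool to critique its own answer
# (the "feedback loop" step).
FEEDBACK_PROMPT = (
    "Give me feedback on what you just produced, judged strictly "
    "against my original request. List anything you missed, then correct it."
)

def build_sectioned_prompt(task: str, sections: list[str]) -> str:
    """Build a prompt that asks for one section at a time and pauses
    for acknowledgment before moving on."""
    lines = [task, ""]
    for i, section in enumerate(sections, start=1):
        lines.append(f"Section {i}: {section}")
    lines.append("")
    lines.append(
        "Work one section at a time. Pause when you finish a section "
        "and do not proceed until I acknowledge."
    )
    return "\n".join(lines)

prompt = build_sectioned_prompt(
    "Draft a one-page AI usage policy.",
    ["Scope", "Approved tools", "Data handling", "Violations"],
)
print(prompt)
```

After the tool answers a section, sending `FEEDBACK_PROMPT` as the next message is the loop described here: repeat it until the critique comes back clean, then review the result yourself.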
75
00:04:09,720 --> 00:04:13,280
And let's let's talk about the
second biggest, probably bigger

76
00:04:13,280 --> 00:04:16,320
than hallucination risk of using
Gen.

77
00:04:16,360 --> 00:04:19,680
AI, which is data leak, data
leakage.

78
00:04:20,519 --> 00:04:24,000
So data leakage is the number
one risk in using Gen AI,

79
00:04:24,000 --> 00:04:26,400
because if you input

80
00:04:26,400 --> 00:04:30,640
sensitive information, sensitive
data into Gen AI:

81
00:04:31,120 --> 00:04:34,880
client names, financial
strategy, that information could

82
00:04:34,880 --> 00:04:38,960
be stored, logged or used in
ways that compromise privacy,

83
00:04:39,320 --> 00:04:43,280
especially on free and unsecured
platforms.

84
00:04:43,280 --> 00:04:46,200
So what?
So what does that mean?

85
00:04:47,120 --> 00:04:51,320
If you go and sign up for free
version of ChatGPT or free

86
00:04:51,320 --> 00:04:54,720
version of Gemini, it can do all
these wonderful things that we

87
00:04:54,720 --> 00:04:58,720
just talked about. But the
Gen.

88
00:04:58,720 --> 00:05:02,480
AI model itself is what's called
a large language model.

89
00:05:02,920 --> 00:05:07,000
It basically think of it as a
big database of information.

90
00:05:08,240 --> 00:05:11,560
It's constantly training itself
to get better based on the

91
00:05:11,560 --> 00:05:13,360
information that's in that
database.

92
00:05:14,120 --> 00:05:18,680
When you use free tools,
anything you put into that tool

93
00:05:18,680 --> 00:05:21,480
becomes part of that database
that is now retraining and

94
00:05:21,480 --> 00:05:23,800
relearning on.
So what does that mean?

95
00:05:23,800 --> 00:05:26,200
That means that if you put
sensitive information in there,

96
00:05:26,200 --> 00:05:27,520
right?
Because a lot of people do this,

97
00:05:27,520 --> 00:05:29,960
they're careless.
They'll just put in, you know,

98
00:05:29,960 --> 00:05:32,840
documents for their company that
have sensitive information in it

99
00:05:32,840 --> 00:05:37,600
to get a response.
And they don't understand that.

100
00:05:37,600 --> 00:05:41,240
Now that data is part of the
large language model.

101
00:05:41,600 --> 00:05:44,720
And if other people who are
using the tool ask the same

102
00:05:44,720 --> 00:05:46,840
or similar questions, your
data will now come up.

103
00:05:47,040 --> 00:05:51,040
So now you have a data breach.
Please hear us when we say this.

104
00:05:51,040 --> 00:05:55,280
Do not do this, OK?
And it's very clear to

105
00:05:55,280 --> 00:05:59,600
understand when you use a free
tool that data is not private.

106
00:06:00,280 --> 00:06:03,120
So what you have to do is you
have to use it with sanitized

107
00:06:03,120 --> 00:06:05,360
data, right?
You have to use it with data

108
00:06:05,360 --> 00:06:07,560
that doesn't have any
particulars or specifics in it,

109
00:06:08,040 --> 00:06:09,520
right?
Once it gives you the framework

110
00:06:09,520 --> 00:06:12,080
of the output that you want, you
can copy that in and change it

111
00:06:12,080 --> 00:06:15,080
when you put it into, you know,
your, your company systems,

112
00:06:15,800 --> 00:06:17,720
right?
And I think there Terry, you

113
00:06:17,920 --> 00:06:21,280
well before we move forward, any
thoughts on that point there?

114
00:06:22,200 --> 00:06:25,520
Yeah, basically, man, it's
personal identifiable

115
00:06:25,520 --> 00:06:29,440
information, right?
Really, we just, we just have to

116
00:06:29,440 --> 00:06:33,840
protect it.
And you know, it's the, it's

117
00:06:33,840 --> 00:06:36,560
it's that sensitive information,
right?

118
00:06:36,640 --> 00:06:40,280
And yeah, just just be diligent,
right?

119
00:06:40,280 --> 00:06:43,560
Due diligence, Due diligence.
Yeah, yeah.

120
00:06:43,560 --> 00:06:48,320
So, so let's talk a little bit
more about how you can use Gen.

121
00:06:48,360 --> 00:06:51,240
AI tools with, you know, that
sensitive information.

122
00:06:51,240 --> 00:06:57,800
So each tool has its own level
or each platform has its own

123
00:06:57,800 --> 00:06:59,680
level.
You have the free version where

124
00:06:59,680 --> 00:07:01,280
everybody can get in and use it,
right?

125
00:07:01,400 --> 00:07:04,080
Again, none of that data is
protected, right?

126
00:07:04,080 --> 00:07:05,960
So never put sensitive
information into that.

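The "use it with sanitized data" advice in this passage can be sketched as a small script that scrubs text before it ever reaches a free AI tool. This is a minimal illustration only: the regex patterns and the `CLIENT_NAMES` list are hypothetical stand-ins, and a few regexes are nowhere near a real data-loss-prevention control.

```python
import re

# Sketch of "sanitize before you paste": strip obvious identifiers (emails,
# SSN-like and phone-like numbers) and swap client names for placeholders.
# CLIENT_NAMES and the patterns below are illustrative examples only.

CLIENT_NAMES = ["Acme Corp", "Globex"]  # hypothetical client list

PATTERNS = [
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[EMAIL]"),
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),
    (re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"), "[PHONE]"),
]

def sanitize(text: str) -> str:
    """Return a copy of text with obvious PII and client names redacted."""
    for name in CLIENT_NAMES:
        text = text.replace(name, "[CLIENT]")
    for pattern, placeholder in PATTERNS:
        text = pattern.sub(placeholder, text)
    return text

raw = "Acme Corp contact: jane.doe@acme.com, 555-867-5309, SSN 123-45-6789."
print(sanitize(raw))
# -> [CLIENT] contact: [EMAIL], [PHONE], SSN [SSN].
```

Once the tool returns the framework you want, you can re-insert the real particulars inside your company's own systems, as the hosts suggest.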
127
00:07:06,400 --> 00:07:09,440
But then you'll have the paid
version, right, which they'll

128
00:07:09,440 --> 00:07:12,480
give you tools to protect your
data, right?

129
00:07:12,480 --> 00:07:15,080
You know, so Copilot, which is
one of the leaders in the

130
00:07:15,080 --> 00:07:18,240
industry, especially in
business, if you have a

131
00:07:18,240 --> 00:07:22,400
Microsoft subscription with your
company, that Copilot data may

132
00:07:22,400 --> 00:07:25,680
be secured, right?
You know, always check with your

133
00:07:25,680 --> 00:07:27,520
company to find out what its
Gen.

134
00:07:27,600 --> 00:07:32,400
AI policy is. That is critical.
You need to understand, OK, what

135
00:07:32,400 --> 00:07:34,880
is my company saying is the
approved Gen.

136
00:07:34,960 --> 00:07:37,760
AI policy for me to use with my
company's data?

137
00:07:38,320 --> 00:07:40,680
That could be the difference
between a data leak and

138
00:07:40,680 --> 00:07:43,120
termination, right?
This is very, very serious.

139
00:07:43,840 --> 00:07:47,040
You do not want to put your
company's data in a free tool

140
00:07:47,600 --> 00:07:50,600
because now you're accountable
for a possible data leakage if

141
00:07:50,600 --> 00:07:54,000
you did put like Terry said, PII
in there, right?

142
00:07:54,720 --> 00:07:59,160
But check with your company
again on their Gen AI policy.

143
00:07:59,480 --> 00:08:01,560
Hey, can I use Gen AI
here?

144
00:08:01,560 --> 00:08:05,160
Can I not use it here? Again,
What you do at home, obviously

145
00:08:05,160 --> 00:08:08,760
is, is is your own personal
space, but you just want to be

146
00:08:08,760 --> 00:08:13,440
very mindful that you do not
save sensitive information in a

147
00:08:13,600 --> 00:08:14,960
a public Gen.
AI tool.

148
00:08:16,400 --> 00:08:21,400
Now, here's the thing.
There are some tools again, like

149
00:08:21,560 --> 00:08:25,000
the paid version of ChatGPT is
something that I that I pay for.

150
00:08:25,440 --> 00:08:29,200
It gives me the ability to tell
it do not use any of my data to

151
00:08:29,200 --> 00:08:32,919
retrain your model.
OK, so I pay to make sure that

152
00:08:32,919 --> 00:08:35,679
that doesn't happen, but the
free version doesn't have that

153
00:08:35,679 --> 00:08:39,520
capability, right.
You know, so definitely use at

154
00:08:39,520 --> 00:08:42,600
your own risk, but be smart
about it right?

155
00:08:42,600 --> 00:08:45,600
Do not put sensitive information
into to to Gen.

156
00:08:45,640 --> 00:08:49,000
AI tools unless your job is
giving you a sanctioned tool to

157
00:08:49,000 --> 00:08:52,400
be able to do that with. So,
yeah.

158
00:08:52,400 --> 00:08:55,000
And I think, Ralph, the only thing I'd
add to that too, right.

159
00:08:55,000 --> 00:09:02,560
And so some, some companies have
the ability to have data loss

160
00:09:02,560 --> 00:09:06,120
prevention policies that won't
allow you to do that.

161
00:09:06,120 --> 00:09:08,280
But here's what will happen,
right?

162
00:09:08,360 --> 00:09:13,040
If you hit that particular
control, then yeah, you may get

163
00:09:13,040 --> 00:09:16,880
an e-mail.
Hey, we recognize you, you know,

164
00:09:16,880 --> 00:09:24,480
were trying to place, you know,
some IP or PII into a, you know,

165
00:09:24,680 --> 00:09:28,200
some kind of AI tool that is non
sanctioned.

166
00:09:28,200 --> 00:09:32,280
Now, you know, more than likely
in that scenario, you you know,

167
00:09:32,280 --> 00:09:33,600
you're probably going to get a
warning.

168
00:09:33,840 --> 00:09:37,160
But what we want to do is
equip you on the front end to

169
00:09:37,200 --> 00:09:41,040
understand how to use the tool.
It's no different than when

170
00:09:41,040 --> 00:09:44,440
you get behind the wheel of a
car, right when you first

171
00:09:44,440 --> 00:09:47,560
started, you know, being trained
to drive, you know, you stop at

172
00:09:47,560 --> 00:09:50,880
a four way intersection, you let
the person to the right go,

173
00:09:51,040 --> 00:09:55,400
right when you come to a yield
sign, that's not slow, that's

174
00:09:55,400 --> 00:09:59,560
not a that's not a rolling stop.
It's really like, you know, stop

175
00:09:59,560 --> 00:10:02,280
and look, then go.
So that's, that's what we're

176
00:10:02,280 --> 00:10:06,240
doing here, right?
These are just normal basic

177
00:10:07,520 --> 00:10:12,680
exercises that you do when you
get on the road of AI, right?

178
00:10:12,680 --> 00:10:17,040
So what what I want to do is
kind of normalize it in terms of

179
00:10:17,240 --> 00:10:21,160
this is what you do when using
these tools, right?

180
00:10:21,160 --> 00:10:25,080
Now that we're introducing you,
we're also introducing you into

181
00:10:25,280 --> 00:10:30,440
the framework of how to be safe.
No different than when you

182
00:10:30,640 --> 00:10:34,720
started getting online, right?
Don't click on, you know, don't

183
00:10:34,720 --> 00:10:38,360
go to websites that you don't
know right there, there.

184
00:10:38,360 --> 00:10:41,000
So this is no different than
that, right?

185
00:10:41,040 --> 00:10:46,200
So there, there, there are modes
of operation, methods on how

186
00:10:46,200 --> 00:10:49,840
you carry out even on being on
Facebook, right?

187
00:10:49,840 --> 00:10:53,720
There are things that we taught
you or folks taught you how to

188
00:10:54,120 --> 00:10:57,440
effectively be on Facebook
without being scammed, right?

189
00:10:57,440 --> 00:11:01,200
All these kind of things.
So this is this is not for you

190
00:11:01,200 --> 00:11:06,680
to be fearful or afraid.
Feel like, Oh no, I'm being

191
00:11:06,680 --> 00:11:09,320
equipped.
So that's that's that's the main

192
00:11:09,320 --> 00:11:11,640
point I want to bring out here.
That's great, that's great, that's

193
00:11:11,640 --> 00:11:13,800
great. So think about it like
this: data leaks don't happen

194
00:11:13,800 --> 00:11:17,480
because AI is evil. They happen
because people forget that AI is

195
00:11:17,480 --> 00:11:21,760
not private by default, right? So,
you know, think before you type,

196
00:11:21,800 --> 00:11:25,600
you know, protect your data, use
secure settings, right? Because

197
00:11:25,600 --> 00:11:29,680
Gen AI is powerful, but you are
responsible for what you share,

198
00:11:30,400 --> 00:11:31,760
right? So don't, don't forget that.
That's

199
00:11:31,760 --> 00:11:36,440
a very important point. And then
so, you know, lastly, what what

200
00:11:36,440 --> 00:11:40,480
we wanted to cover was again,
a tool that we use, you know,

201
00:11:40,480 --> 00:11:44,280
almost daily, which is ChatGPT.
You know, again, to Terry's

202
00:11:44,280 --> 00:11:47,800
point, there's ChatGPT,
there's Copilot, there's Google

203
00:11:47,800 --> 00:11:50,920
Gemini, there's Claude, there's
a few others out there, right?

204
00:11:50,920 --> 00:11:52,560
You have to find the tool that's
best for you.

205
00:11:53,280 --> 00:11:55,960
ChatGPT was the first to the
game, right?

206
00:11:55,960 --> 00:11:57,600
They were the ones who created
Gen.

207
00:11:57,600 --> 00:12:01,040
AI first and beat everyone
else to the market.

208
00:12:01,200 --> 00:12:03,240
So they have a more mature
product.

209
00:12:04,680 --> 00:12:08,000
You know they have three
different versions, a free

210
00:12:08,000 --> 00:12:11,440
version, right, which you can
just go to chatgpt.com and sign

211
00:12:11,440 --> 00:12:13,120
up.
It gives you basic features,

212
00:12:13,720 --> 00:12:17,360
text generation, no web, no file
tools, no memory.

213
00:12:17,760 --> 00:12:21,000
If you sign up for the plus
version at the at the time of

214
00:12:21,000 --> 00:12:24,960
this recording, which was about
$20.00 a month, you get web

215
00:12:24,960 --> 00:12:27,880
browsing, file uploads, image
analysis, memory,

216
00:12:27,880 --> 00:12:32,440
personalization, access to
ChatGPT plugins; best used for

217
00:12:32,440 --> 00:12:34,440
daily pro use, content creation
and learning.

218
00:12:35,240 --> 00:12:39,040
That's the one I do personally.
I, I actually create images with

219
00:12:39,040 --> 00:12:42,520
it, right?
And it's pretty good when you

220
00:12:42,520 --> 00:12:45,000
get good with your prompt
engineering and you can tell it,

221
00:12:45,000 --> 00:12:47,120
Hey, I want this person to be on
the left side.

222
00:12:47,320 --> 00:12:49,560
I want the words on the right
side to be in bold.

223
00:12:49,560 --> 00:12:51,960
I want them to be centered.
I want them to be around this

224
00:12:51,960 --> 00:12:53,960
size font.
I want them to be this color,

225
00:12:54,160 --> 00:12:57,640
right, you know, or a gradient
or in and I want this person's

226
00:12:57,640 --> 00:13:00,200
image to to have this type of
expression.

227
00:13:00,560 --> 00:13:03,200
You know, when you can get
very granular down into your

228
00:13:03,200 --> 00:13:07,480
prompts, you create images like
that, that are so good

229
00:13:07,480 --> 00:13:11,000
that it would take me 10 times
the time to try to create that

230
00:13:11,000 --> 00:13:13,040
on my own.
And when I say that I'm talking

231
00:13:13,040 --> 00:13:16,000
about the nuance, I'm talking
about the shadow on the on

232
00:13:16,000 --> 00:13:19,360
certain side of the face, right?
That I wouldn't be able to just

233
00:13:19,360 --> 00:13:23,080
naturally generate myself right
so it it can definitely, you

234
00:13:23,080 --> 00:13:25,600
know, supercharge some things
there and I think Terry, you you

235
00:13:25,600 --> 00:13:28,240
were just mentioning right
before the show that Gemini

236
00:13:28,240 --> 00:13:32,600
now has the ability to take text
and create a video. Is that?

237
00:13:32,600 --> 00:13:35,600
Right, yes, yes, yes.
So I was playing around.

238
00:13:35,600 --> 00:13:38,360
I'm like, hey, what are all the
tools I get with my, you know,

239
00:13:38,360 --> 00:13:41,640
Google Workspace subscription
and I'm looking through, I'm

240
00:13:41,640 --> 00:13:44,040
like most of these I use I'm
like, Vids, what is that?

241
00:13:44,680 --> 00:13:48,040
And so I start playing around
with it last week and creating

242
00:13:48,040 --> 00:13:52,320
some, you know, content And
yeah, I said, hey, create me a

243
00:13:52,880 --> 00:13:57,080
one minute video on HIPAA and
and and privacy.

244
00:13:57,600 --> 00:14:01,200
And I was astonished with what
it delivered.

245
00:14:01,600 --> 00:14:05,040
Like, wow, that this, that this
is the future.

246
00:14:05,040 --> 00:14:07,400
Y'all, y'all have to get on
board with this.

247
00:14:07,680 --> 00:14:10,560
You can't just say, oh, you
know, that's for the kids or

248
00:14:10,560 --> 00:14:13,600
whatnot because you look up in 5
years and this is part of your

249
00:14:13,600 --> 00:14:16,480
job and you have no idea how to
use it right.

250
00:14:16,560 --> 00:14:18,120
You got it.
You got to embrace this stuff.

251
00:14:18,120 --> 00:14:20,480
You got to embrace this stuff.
So, so great, great point there

252
00:14:20,480 --> 00:14:21,800
too, Ralph.
Let me share a quick story.

253
00:14:21,800 --> 00:14:25,440
So Noah and I as a, as a part of
my, you know, I want to be a

254
00:14:25,440 --> 00:14:27,200
great dad.
This one, you know, I said, man,

255
00:14:27,200 --> 00:14:31,120
hey, Monday mornings are yours.
And so, you know, I, you know,

256
00:14:31,120 --> 00:14:33,920
we go out and we play 9 or he
plays 9.

257
00:14:33,920 --> 00:14:38,440
I just kind of walk and and
watch anyway, so we meet some

258
00:14:38,680 --> 00:14:41,040
just fantastic people every time
we go out and play.

259
00:14:41,040 --> 00:14:48,440
So I met a couple new grads this
past week and a gentleman who

260
00:14:48,920 --> 00:14:52,600
graduated with a degree in
accounting and he was like, you

261
00:14:52,600 --> 00:14:54,760
know, man, I was kind of
concerned because I just

262
00:14:54,760 --> 00:14:59,040
graduated with an accounting degree
and, and AI is out there and am

263
00:14:59,040 --> 00:15:02,160
I going to be obsolete?
And I'm like, again, same thing.

264
00:15:02,160 --> 00:15:05,840
I share the same thing.
Your expertise will be needed.

265
00:15:07,080 --> 00:15:09,720
You know, it'll be a while
anyway because of a lot of these

266
00:15:09,720 --> 00:15:11,680
companies, the processes are
manual.

267
00:15:12,760 --> 00:15:17,400
And so your ability to beef up
and, and, and, and be prepared.

268
00:15:17,600 --> 00:15:20,440
And he was like, I'm glad you
said that because he's got, you

269
00:15:20,440 --> 00:15:23,160
know, he's got some friends who,
who, who know who owns companies

270
00:15:23,480 --> 00:15:25,520
and here's and I, I love what he
shared.

271
00:15:25,520 --> 00:15:30,440
He says, yes, I've been going in
and volunteering or, you know,

272
00:15:30,440 --> 00:15:34,920
kind of interning with them and
looking at their processes and

273
00:15:34,920 --> 00:15:38,480
automating workflows.
He's like, I'm teaching myself

274
00:15:38,480 --> 00:15:42,200
as I go.
I'm like I said, son, you, you,

275
00:15:42,360 --> 00:15:46,880
you're doing it the right way.
You I could not, I could have

276
00:15:46,880 --> 00:15:48,640
gave the guy a hug, to be honest
with you.

277
00:15:48,760 --> 00:15:51,960
I said, because that is that is
it right?

278
00:15:53,160 --> 00:15:59,080
Your your, your inquisitiveness
about it didn't stop with just

279
00:15:59,280 --> 00:16:03,840
let me, you know, see what I can
do or let me see you know, if I

280
00:16:03,880 --> 00:16:08,880
no, he put himself in a position
in a real, real world scenario

281
00:16:09,120 --> 00:16:15,120
where he could actually take his
skill set and went to a place.

282
00:16:15,320 --> 00:16:17,400
Again, this was a, you know, not
everyone might have this

283
00:16:17,400 --> 00:16:20,520
opportunity, but he took
advantage of that opportunity to

284
00:16:20,520 --> 00:16:24,200
actually go in, look at
workflows, leverage his skill

285
00:16:24,200 --> 00:16:26,600
set and what he knows about
numbers and accounting and

286
00:16:26,600 --> 00:16:30,520
finance and start to automate.
I said, here's the benefit for

287
00:16:30,520 --> 00:16:32,240
you.
It's because he doesn't start

288
00:16:32,240 --> 00:16:36,720
his job until until October.
You get to go in there with a

289
00:16:36,720 --> 00:16:44,120
portfolio of AI prompts that you
can say, here's my AI portfolio.

290
00:16:44,120 --> 00:16:51,520
And it's a matter of just taking
the step, you know, getting in

291
00:16:51,520 --> 00:16:56,960
there, jumping in because again,
those who take the initiative,

292
00:16:57,240 --> 00:17:01,440
right, those who have the I said
for nothing else, when the

293
00:17:01,440 --> 00:17:07,240
conversation comes up, you will
be able to articulate in an

294
00:17:07,240 --> 00:17:11,040
educated manner and have
conversation about the

295
00:17:11,040 --> 00:17:15,200
creativity and how you've
leveraged it.

296
00:17:15,200 --> 00:17:18,319
how you've seen other businesses
leverage it, right?

297
00:17:18,400 --> 00:17:20,400
And that's going to open up
opportunity for you.

298
00:17:20,760 --> 00:17:24,240
So I mean, that's right.
I mean, it was a great example

299
00:17:24,240 --> 00:17:26,560
of it, right?
Fresh out of, fresh out of

300
00:17:26,560 --> 00:17:29,520
college.
You know, I'm glad you brought

301
00:17:29,520 --> 00:17:32,200
that up, Terry.
People don't understand that

302
00:17:32,200 --> 00:17:34,560
those who want to go in the
track, technical track, right?

303
00:17:34,560 --> 00:17:39,400
This is for you. 2 things.
Your ability to upskill yourself

304
00:17:39,400 --> 00:17:44,000
in AI is going to set you apart.
So Meta, formerly known as

305
00:17:44,000 --> 00:17:49,800
Facebook, has been hiring a
superintelligence AI team,

306
00:17:50,560 --> 00:17:52,240
right?
So yeah, it may not have come

307
00:17:52,240 --> 00:17:55,480
across most of your radar, but
they have Mark Zuckerberg, the

308
00:17:55,480 --> 00:18:00,120
owner.
He wants to create the A-Team of

309
00:18:00,120 --> 00:18:03,240
the most elite AI producers in
the world.

310
00:18:04,600 --> 00:18:07,880
He contracted to pay somebody.
I believe it was 100.

311
00:18:07,880 --> 00:18:11,680
It was either 100 or $300
million over 10 years.

312
00:18:12,000 --> 00:18:13,080
This.
What about that?

313
00:18:13,840 --> 00:18:16,400
Right.
The the, the, the industries

314
00:18:16,400 --> 00:18:18,160
already understand that this is
where it's going.

315
00:18:18,360 --> 00:18:21,880
OK, to Terry's point in his
example with that young man,

316
00:18:22,400 --> 00:18:27,520
every company in the world has a
ton of inefficient processes.

317
00:18:29,120 --> 00:18:32,920
There's a, there's a sea of them
because we've all been doing

318
00:18:32,920 --> 00:18:37,440
things manually for so long.
If you could tap into AI and

319
00:18:37,440 --> 00:18:43,200
automate and, and, and be able
to produce efficiency, OK,

320
00:18:43,480 --> 00:18:46,440
efficiency, right?
Efficiency is a metric that's

321
00:18:46,440 --> 00:18:50,880
used to, to, to determine how
profitable a company is.

322
00:18:51,040 --> 00:18:56,160
If you can automate, you know,
processes that are inefficient

323
00:18:56,160 --> 00:18:58,920
and make them efficient, save
time, right?

324
00:18:58,920 --> 00:19:00,160
You'd be able to do these types
of things.

325
00:19:00,160 --> 00:19:02,760
That is value.
That is value that people are

326
00:19:02,760 --> 00:19:05,280
willing to pay you for today,
right now.

327
00:19:05,480 --> 00:19:09,560
And AI is so new that I mean,
you can go get an AI certificate

328
00:19:09,560 --> 00:19:12,880
right now and it'll put you
light years ahead of anybody

329
00:19:12,880 --> 00:19:15,160
else that's trying to do this on
their own, right?

330
00:19:15,200 --> 00:19:19,120
The opportunity is there, right?
When I came into IT, I came

331
00:19:19,120 --> 00:19:23,880
into IT in the late 90s when
Windows NT was just going into

332
00:19:24,160 --> 00:19:27,120
businesses, right?
They, they were still using

333
00:19:27,120 --> 00:19:29,120
green screens, you know, at that
time.

334
00:19:29,120 --> 00:19:30,960
And so when they were switching
over from green screens to

335
00:19:30,960 --> 00:19:33,920
Windows, that's when I got into
technology.

336
00:19:34,080 --> 00:19:37,000
And I was able to ride that wave
in my career, right, because I

337
00:19:37,000 --> 00:19:40,240
stayed on top of the technology.
The businesses were moving that

338
00:19:40,240 --> 00:19:42,200
fast.
The education system couldn't

339
00:19:42,200 --> 00:19:45,280
catch up because they hadn't
learned how to educate people on

340
00:19:45,280 --> 00:19:46,800
this stuff.
I was getting technical

341
00:19:46,800 --> 00:19:49,280
certifications in this stuff.
The business is like, hey, I

342
00:19:49,280 --> 00:19:51,840
need that right now.
We'll pay you this, we'll pay

343
00:19:51,840 --> 00:19:53,040
you that.
There.

344
00:19:53,080 --> 00:19:55,560
There is another wave that's
happening right now and it's

345
00:19:55,560 --> 00:19:58,440
with AI, right?
You know, so I, I love to hear

346
00:19:58,440 --> 00:19:59,600
that.
I love to hear that example.

347
00:20:00,160 --> 00:20:04,440
Yeah, and, and Ralph, to your
point, right, that Windows, the

348
00:20:04,440 --> 00:20:08,200
beauty of what you did is as
Windows transitioned over the

349
00:20:08,200 --> 00:20:12,600
years from PCs, then it went to
virtualization.

350
00:20:12,840 --> 00:20:16,920
The foundation of it was still a
Windows OS, right.

351
00:20:17,040 --> 00:20:21,800
Then it went into Citrix, right.
Well, with Citrix, then

352
00:20:21,800 --> 00:20:27,160
virtualization, that foundation
that you learned in the early

353
00:20:27,160 --> 00:20:30,840
90s that carried you 20 years
bro.

354
00:20:31,080 --> 00:20:34,240
It did, it did and I, and I
recognized that at the time

355
00:20:34,520 --> 00:20:37,000
and I like to say I was smart
enough to see that, oh man,

356
00:20:37,000 --> 00:20:39,120
there's this certain gap in time
and I'm going to jump into it.

357
00:20:39,280 --> 00:20:41,920
It's not how it happened for me.
I was just chasing my passion.

358
00:20:42,520 --> 00:20:43,920
It just happened to be at that
time.

359
00:20:43,920 --> 00:20:48,000
Now, now, you know, 25
years later, I look back, I see,

360
00:20:48,000 --> 00:20:51,520
man, I got in, in a wave.
I got in a wave that actually

361
00:20:51,520 --> 00:20:53,000
opened up door after door after
door.

362
00:20:53,000 --> 00:20:57,320
So I could see that pattern
repeating again with AI, right?

363
00:20:57,320 --> 00:20:59,840
So if you if, if for those young
folks out there who are

364
00:20:59,840 --> 00:21:02,560
interested in getting into
something that's going to be

365
00:21:02,560 --> 00:21:06,120
cutting edge, that can make
you a lot of money.

366
00:21:06,120 --> 00:21:07,520
And I'm talking about a lot of
money.

367
00:21:07,520 --> 00:21:10,960
I just told you the top people
are getting paid up to like $300

368
00:21:10,960 --> 00:21:14,200
million over 10 years.
No, but no, I've never heard

369
00:21:14,200 --> 00:21:16,000
anybody making that kind of
money in a job.

370
00:21:16,280 --> 00:21:18,600
Maybe a business owner, but not
a job, right?

371
00:21:19,000 --> 00:21:23,000
If you want to get into some
significant upside on

372
00:21:23,000 --> 00:21:26,280
compensation, AI is your way to
go.

373
00:21:26,280 --> 00:21:28,440
You just got to find your entry
point, right?

374
00:21:28,920 --> 00:21:32,440
But so so let's just talk about
the takeaways for today.

375
00:21:32,880 --> 00:21:33,840
Gen AI, Gen

376
00:21:33,880 --> 00:21:37,240
AI does not think. You do.
So you got to use it wisely,

377
00:21:37,720 --> 00:21:40,720
right?
Learn prompt engineering.

378
00:21:41,000 --> 00:21:46,080
Prompt engineering is needed to
unlock the real value in Gen.

379
00:21:46,160 --> 00:21:49,080
AI.
You really need that. Protect

380
00:21:49,080 --> 00:21:52,040
your company and your career
from data leaks, right?

381
00:21:52,040 --> 00:21:54,560
You got to be smart with your
settings and your tools and what

382
00:21:54,560 --> 00:21:59,000
you input into them, right?
And then choose a tool that

383
00:21:59,000 --> 00:22:01,720
meets your requirements, right?
Take a look at all of them.

384
00:22:02,000 --> 00:22:04,320
Try them out. Again, don't put
any personal, sensitive

385
00:22:04,320 --> 00:22:06,760
information in them, but try
them out, you know. Start

386
00:22:06,760 --> 00:22:09,720
learning the prompt engineering.
You can do that, you know, for

387
00:22:09,720 --> 00:22:10,840
free.
The courses out there.

388
00:22:11,080 --> 00:22:13,560
We'll, we'll, we'll
put some links in the show notes

389
00:22:13,560 --> 00:22:15,880
for you to check out.
And there's just so, I mean, you

390
00:22:15,880 --> 00:22:18,280
can go to YouTube and look up
prompt engineering and start

391
00:22:18,280 --> 00:22:20,800
learning right now, right?
And there's nothing standing

392
00:22:20,800 --> 00:22:22,560
between you.
There's no barrier to entry, I

393
00:22:22,560 --> 00:22:25,640
guess is what I'm trying to say.
But find the tool that works

394
00:22:26,160 --> 00:22:28,760
with and matches your workflow
and your privacy needs.

395
00:22:28,760 --> 00:22:30,960
But any any closing thoughts on
that too?

396
00:22:31,960 --> 00:22:35,280
Yeah, no Gen.
AI, man, that is a great place

397
00:22:35,280 --> 00:22:39,360
to start.
There are some other models

398
00:22:39,360 --> 00:22:43,480
coming out, but what we wanted
to focus on today was Gen.

399
00:22:43,560 --> 00:22:47,080
AI.
And you know, as time moves

400
00:22:47,080 --> 00:22:52,000
forward, we'll introduce the
other generative, I guess

401
00:22:52,000 --> 00:22:56,720
there's agentic AI and that
does have an ability to be

402
00:22:56,720 --> 00:22:59,800
creative.
But we just wanted to give you an

403
00:22:59,800 --> 00:23:05,520
intro and a great place to start
because the majority of the

404
00:23:05,560 --> 00:23:10,480
opportunity right now for what I
would say just the entry level

405
00:23:10,480 --> 00:23:14,680
point is this generative AI.
So it's a great place to start.

406
00:23:15,200 --> 00:23:17,040
Absolutely.
Well, thank you as always.

407
00:23:17,040 --> 00:23:19,800
We appreciate you supporting the
show and listening.

408
00:23:20,080 --> 00:23:23,200
I always want to ask, you know,
we want to hear from you, right?

409
00:23:23,440 --> 00:23:27,720
Go to our website,
leadershipsovereignty.com/survey

410
00:23:27,800 --> 00:23:31,240
and take our listener survey and
help us to improve the show for

411
00:23:31,240 --> 00:23:34,280
you so we can make it better.
To support the show,

412
00:23:34,280 --> 00:23:36,080
you can always give a donation,
right?

413
00:23:36,080 --> 00:23:39,040
leadershipsovereignty.com/donate,
right?

414
00:23:39,040 --> 00:23:41,320
And until the next time, stay
safe.

415
00:23:41,360 --> 00:23:43,320
We'll talk to you.
We'll see you on the next show.

416
00:23:44,200 --> 00:23:47,000
Thank you for listening to the
Leadership Sovereignty Podcast.

417
00:23:47,280 --> 00:23:50,400
If this content blessed or
helped you in any kind of way,

418
00:23:50,760 --> 00:23:54,840
support us today by subscribing
to our YouTube channel, clicking

419
00:23:54,840 --> 00:23:57,400
the like button for this
episode, and sharing this

420
00:23:57,400 --> 00:23:59,400
content with others that you
think it will help.

421
00:24:00,120 --> 00:24:03,920
Until next time, stay safe,
peace and blessings.


Ralph Owens

CIO | CTO | Podcaster | 2x Houston CIO Of The Year Finalist | US Navy Veteran

Ralph Owens is an accomplished technology executive with a proven record of driving digital transformation and business growth in high-stakes environments. Fueled by a deep passion for technology and innovation, Ralph excels at developing and executing IT strategies that deliver measurable results and lasting competitive advantage. As a strategic leader, Ralph brings a sharp focus on cybersecurity, operational excellence, and building strong partnerships across the business. His experience spans diverse industries, including financial services and energy generation, where he has successfully secured critical infrastructure and navigated complex regulatory landscapes. Recognized for his ability to build high-performing teams and lead complex IT initiatives, Ralph consistently aligns technology with business goals to create innovative solutions that accelerate growth, enhance customer experience, and achieve revenue targets. Driven to empower organizations to harness technology for sustainable value, Ralph is passionate about collaborating with forward-thinking leaders to shape the future of digital transformation.


Terry Baylor

Strategic IT & Digital Transformation Leader, Entrepreneur, Mentor, Public Speaker, and Podcaster

Terry Baylor is a transformative Strategic IT & Digital Transformation Leader, entrepreneur, mentor, public speaker, and podcaster based in Houston. He’s on a mission to humanize technology, believing that the most powerful connections still happen in person. Terry leads with empathy and action, guiding teams and organizations to harness Agile practices, embrace innovation, and thrive in complex digital landscapes. At Lift Life Technology, Terry champions the mantra "Old‑fashioned isn’t outdated," emphasizing face‑to‑face interactions in a virtual world. His recent LinkedIn reflections underscore his passion:

“When you show up, listen, and connect from the heart, you’re not just selling a service — you’re building trust, community, and lasting relationships.”

Whether he’s delivering key insights at events like Cisco Live or coaching high-performing teams, Terry empowers others to lead with authenticity, agility, and impact.