Craig Tucker: Why Your AI Feels Cold and How Emotional Intelligence Changes Everything
EPISODE OVERVIEW
Duration: Approximately 25 minutes
Best For: Trapped entrepreneurs who know they need to automate customer interactions, but fear losing the human touch that built their reputation
Key Outcome: Understand how emotionally intelligent AI can handle the draining customer interactions that burn out your team, freeing you to focus on what actually matters
He wanted to build something to keep his grandmother company after his grandfather died. What Craig Tucker discovered changed how machines understand us.
THE BOTTOM LINE
You built your business on relationships. On understanding people. On being there when clients needed you. The thing is, that same commitment has you answering emails at 5am, managing customer complaints that drain your energy, and watching your team burn out from the constant emotional labour of frontline work. Craig Tucker spent years developing AI that actually understands human emotion, not the robotic chatbots that make your customers angrier. His technology detects frustration, adjusts in real time, and handles the interactions that nobody wants to have, so your people can focus on genuine connection. For trapped entrepreneurs, this represents something profound. The customer service, the support tickets, the repetitive explanations, these are the things keeping you chained to operations. What happens when AI handles the emotional heavy lifting? Your team stops burning out. Your customers feel heard. And you? You might actually take that holiday.
WHY THIS EPISODE MATTERS TO YOU
You will discover how AI can genuinely understand customer emotions, meaning fewer escalations landing on your desk at midnight
You will learn why the biggest companies laying off thousands for AI are failing, and how to avoid their costly mistake
You will understand how emotional AI creates trust with customers, so you can step back without your service quality collapsing
You will see the real cost of ignoring this: watching competitors build relationships at scale while you remain trapped answering every complaint personally
KEY INSIGHTS YOU CAN IMPLEMENT TODAY
The difference between AI that frustrates and AI that connects comes down to emotional detection. Craig's system measures emotions consistently rather than guessing, which means your customers feel understood on first contact. Because of this, the angry phone calls that eat your mornings could be resolved before they ever reach a human.
Customer service burns people out because every interaction is someone else's problem dumped on them. When AI handles that first-level contact with genuine emotional awareness, your team stops being an emotional dumping ground. Instead, they become problem solvers for the interesting cases. The result is lower turnover and people who actually want to come to work.
The fear of losing control when you automate is real. Craig addresses this directly. The AI acts like an insider who guides customers to solutions, building trust rather than creating barriers. Your customers get help faster. You get freedom. Both happen because the technology honours human connection rather than replacing it.
Mental health applications prove this technology works in the most sensitive contexts. If AI can appropriately respond to someone in crisis, it can certainly handle your customer asking about delivery times. The guardrails exist. The measurement tools exist. The question is whether you will use them or keep doing everything yourself.
Start by identifying the three customer interactions that drain you most. These repetitive, emotionally taxing conversations are perfect candidates for emotionally intelligent automation. You do not need to transform everything overnight. You need to remove the chains one link at a time.
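To make the triage idea concrete, here is a minimal sketch of how consistent emotion scores could route a conversation between AI and a live agent. This is not VERN's actual API: the function names, emotion labels, and thresholds below are all illustrative assumptions, and the keyword-based detector is a stand-in for a real measurement service.

```python
# Illustrative sketch only: every name, label, and threshold here is a
# hypothetical stand-in, not VERN's real interface.
from dataclasses import dataclass

ANGER_ESCALATION_THRESHOLD = 0.8   # assumed score above which anger counts as "high"
REPEAT_LIMIT = 2                   # assumed number of high-anger turns before escalation

@dataclass
class Conversation:
    high_anger_turns: int = 0      # consecutive high-anger turns seen so far

def detect_emotions(message: str) -> dict[str, float]:
    """Stand-in for a consistent emotion-measurement service.

    The key property of a measurement tool (as opposed to an LLM guessing)
    is that the same input always yields the same scores, which is what
    makes threshold-based routing reliable.
    """
    lowered = message.lower()
    anger = 0.9 if any(w in lowered for w in ("furious", "hell", "ridiculous")) else 0.1
    return {"anger": anger, "sadness": 0.0, "joy": 0.0}

def route(convo: Conversation, message: str) -> str:
    """Route one turn: AI handles first-level contact, humans get escalations."""
    scores = detect_emotions(message)
    if scores["anger"] >= ANGER_ESCALATION_THRESHOLD:
        convo.high_anger_turns += 1
        if convo.high_anger_turns > REPEAT_LIMIT:
            return "transfer_to_live_agent"   # repeated anger triggers triage
        return "ai_deescalate"                # acknowledge and calm down first
    convo.high_anger_turns = 0                # calm turn resets the counter
    return "ai_handle"
```

The design choice mirrors the episode's triage story: the AI absorbs the first bursts of frustration and tries to de-escalate, and only repeated high anger hands the customer to a person, so the team sees the cases that genuinely need them.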
GOLDEN QUOTES WORTH REMEMBERING
"We want to be like Bob Barr in Incredibles, the guy on the inside that tells you how to cut all the red tape and achieve your goals. That's what we want the AI to be for the user." - Craig Tucker
"You don't call customer service or support because you're having a good time. Usually it's a problem and you're not happy about it. So you're getting a mental dump of all these people's problems." - Craig Tucker
"If we don't teach AI how to understand feelings and what they mean to human beings, how they're communicated, then we're doomed to the same fate." - Craig Tucker
"AI really enables everybody to have a little bit of a superhuman power." - Craig Tucker
"Somewhere Socrates, Plato, Aristotle are sitting around laughing because that's what they saw as the world being in the future, when man can step aside from the toils of everyday labour and really look at existence, science, philosophy and why we're here." - Craig Tucker
QUICK NAVIGATION FOR BUSY LEADERS
00:00 - Introduction: Roy introduces Craig Tucker and his groundbreaking work with emotional AI
02:15 - The Grandmother Story: How personal loss led to pioneering technology that could transform your customer service
05:30 - Why Chatbots Fail: Understanding the uncanny valley and why your current automation frustrates customers
08:45 - Live Demo: Watch an emotionally intelligent shopping assistant handle a transaction, then an angry customer
14:20 - The Big Picture: How AI has changed in the last six months and what trapped entrepreneurs must understand
17:40 - Mental Health Applications: Why emotional AI works in the most sensitive contexts and what that means for your business
20:30 - The Freedom Vision: Craig and Roy discuss how this technology can finally give you your time back
24:00 - Conclusion: Where emotional AI is heading and your next steps toward freedom
GUEST SPOTLIGHT
Name: Craig Tucker
Bio: Craig Tucker is the Founder and CEO of VERN AI, building the control layer for human-facing AI. With over a decade at the intersection of product design and human behaviour, Craig pioneered emotional intelligence measurement for AI systems. His patented technology helps businesses create AI interactions that build trust, drive conversions, and actually understand how customers feel.
Connect with Craig:
Website: www.vernai.com
LinkedIn: https://www.linkedin.com/in/craigmtucker/
YouTube: https://www.youtube.com/@VERN_AI
Facebook: https://www.facebook.com/VirtualEmotionResourceNetwork
YOUR NEXT ACTIONS
This Week: List the three customer interactions that drain your energy most. These emotionally taxing, repetitive conversations are your first candidates for intelligent automation. Just identifying them shifts your thinking from trapped to strategic.
This Month: Explore one emotionally intelligent AI solution for your highest volume customer interaction. Book a demo with Craig's team or research alternatives. The goal is understanding what is possible, not committing to everything at once.
This Quarter: Implement AI for your most draining customer touchpoint. Measure the time it gives back to you and your team. Use that time for the high value human connections that actually grow your business.
EPISODE RESOURCES
VERN AI Platform: www.vernai.com - Craig's emotional intelligence layer for AI systems
Hello My Friend App: Mentioned as an example of AI helping children with autism understand emotions
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
READY TO ESCAPE THE TRAP?
Take the Freedom Score Quiz: https://scoreapp.atpbos.com/
Discover how trapped you are in your business and get your personalised roadmap to freedom in under 5 minutes.
Book a Free Strategy Session: https://www.atpbos.com/contact
Let's discuss how to build a business that works WITHOUT you.
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
CONNECT WITH YOUR HOST, ROY CASTLEMAN
Roy is the founder of All The Power Limited and creator of Elevate360, a business coaching system for entrepreneurs ready to scale without burnout. As a certified Wim Hof Method Instructor and the UK's first certified BOS UP coach, Roy combines AI automation, wellness practices, and business operating systems to help trapped entrepreneurs reclaim their freedom.
Website: www.atpbos.com
LinkedIn: https://www.linkedin.com/in/roycastleman/
YouTube: https://www.youtube.com/@allthepowerltd
Transcript
1
::Good morning, good afternoon, good evening wherever you are in
2
::the world. I'm here with Craig, my good friend. He
3
::is on the bleeding edge of AI. He's doing things
4
::that are going to change the world. Has inspired me
5
::every time we've spoken. Craig is working with AI avatars,
6
::but he's working with emotion. He's working with human language.
7
::He's working with a much deeper understanding. Welcome, Craig. It's
8
::really good to have you on the podcast. Thank you,
9
::Roy. I appreciate it. Always nice to see you. And
10
::I think it's probably worthwhile starting with how this started
11
::for you. Not going that far back, but yeah, this
12
::current journey, it all started for me because my grandpa
13
::died, Grandma was lonely and I thought I would be
14
::the good Grandson. Build an AI companion. Was in a
15
::PhD program. Michigan State University. Thought I'm the right one
16
::for this. Right. And along the way found the real
17
::key was emotions and how we communicate. Did patented on
18
::that were a couple years of validating and here we
19
::are. That story with your grandma resonates with so many
20
::people. People are going to be sitting there saying but
21
::really, AI, it's not good enough. So tell us why
22
::it's different. Sure. Most AI that we've all experienced has
23
::been some kind of a chatbot. Right? The chatbots that
24
::we get are fine and good, but they just only
25
::do a certain thing or they act a certain way.
26
::That's always the same. They. You get the same robotic
27
::type of response back. It doesn't seem personal, it doesn't
28
::connect with you. And you're on other sides of what
29
::we call the uncanny valley. Right. Area in which you
30
::know that you don't really accept a technology because it's
31
::not quite human enough or it's too human and you've
32
::got to find a bridge or a way across that.
33
::We detect the emotions that are communicated between people. It's
34
::part of our language, the structure of communication, both on
35
::the subtextual level and also on the contextual level. It
36
::becomes important to use these clues and signals to inform
37
::any type of AI that has to interact with a
38
::human being because we have feelings and what we interact
39
::with should understand that. Otherwise it just becomes those chatbots
40
::with five options that you yell at. I know I
41
::do when I'm trying to book my flight or get
42
::some help from the utilities. Really that's the why and
43
::what we do and what makes it different. So you
44
::can use that in enterprise for customer service, wayfinding or
45
::in mental health, help people connect or help people try
46
::to get better. Even since I've known you probably six
47
::months, the technology has just moved on so much and
48
::I think it's a good time for us to give
49
::a demonstration of one of your avatars. And Greg's got
50
::a whole bunch of different avatars. He has a set
51
::of humans that you can go and play with and
52
::we'll share some of that. And literally the interactions you
53
::have with them is so different. Let's have a chat.
54
::Let me get everything going here and I'll bring everybody
55
::in. So what we're going to see is a personal
56
::shopper. Her name is Carrie and we created her to
57
::help out with a shopping transaction and we used a
58
::lot of real world experience because we're all online shoppers
59
::as well. Let's go ahead and talk to Carrie and
60
::see what we can do. Hi, I'm Carrie.
61
::Tell me what you're shopping for today. Yeah, I'm looking
62
::for some pants. What do you have in pants today?
63
::For pants we currently have a few options. The Tiller
64
::Jean Essential Ashland Pant Vintage fit Double front canvas pant
65
::and the Ashland Bib. Let me know if any of
66
::these catch your eye and we can dive into the
67
::details. Yeah, Can I see the Essential Ashland, please? Here
68
::are. One second, I'll check. Would you like some insight
69
::into how others have felt about wearing the Essential Ashland
70
::pants? I can show you some customer reviews that highlight
71
::the fit and durability. If you're interested from the reviews.
72
::People have been praising the Essential Ashland pants for their
73
::comfort and versatility. Many have noted they hold up well
74
::in both casual and and more rugged environments. Let me
75
::know if you're leaning towards a particular color or size.
76
::Yeah, I like the black one. The
77
::black Essential Ashland Pant is a solid choice. Just let
78
::me know your size preference. Yeah. What sizes do they
79
::come in? The Essential Ashland Pant
80
::is available in sizes from extra small to double extra
81
::large. Just let me know which size fits you best.
82
::Excellent. I'll do a large. Please add that to cart.
83
::Great. I've added the black Essential Ashland Pant. If there's
84
::anything else you need or if you're ready to check
85
::out, just let me know. Yeah, Carrie, let me see
86
::some long sleeve shirts that you'd recommend that would go
87
::with those pants. I
88
::can see we have the Dearborn Loose LS logo T
89
::shirt and Durham knit overshirt among others that could match
90
::those black pants beautifully. Would you like more details on
91
::either of those? Yeah, the Dearborn Loose. Let me see
92
::that please. Hold
93
::on a second. Here's
94
::the Dearborn Loose long sleeve logo T shirt. It has
95
::a relaxed fit and is available in a few colors
96
::that could pair well with your black pants. Like the
97
::smoke teal green or the classic black. Yes. Let's do
98
::a large teal and add that to cart, please. I've
99
::added the Dearborn Loose long sleeve logo T shirt in
100
::smoked, teal, green and large size to your cart. Is
101
::there anything else you'd like to explore or another item
102
::you're considering? We'll mute her right now. Toss it back
103
::over to you, Roy. She's a big shopping assistant. Match
104
::and pair things together, build carts and then be able
105
::to answer customer questions. That usually would prevent somebody from
106
::adding stuff to cart or completing a transaction. So helps
107
::to speed that up. The friction's gone and does it
108
::in a very personal way. A character that people can
109
::relate to. Obviously she's emotionally intelligent too. So I was
110
::to have a meltdown about one of the sizes or
111
::anything else. She could respond to me in real time
112
::and get me over. Let's try that. Let's be a
113
::little bit irritated about something and see what. Sure. Excellent.
114
::Let me go ahead and see that now. Carrie, I'm
115
::really frustrated and frankly, I don't know what the hell
116
::you're doing. I did not want to see any of
117
::those T shirts. I really wanted to see some pants.
118
::Sorry for any misunderstanding. Let's get back to the pants.
119
::Is there a specific type of pants or style you're
120
::interested in seeing again or something I can clarify for
121
::you? Yeah, I just want to see the pants, please.
122
::Let me see. I've got the pants collection up for
123
::you. Take a look and see which styles might fit
124
::your needs better this time. Thank you for your patience.
125
::As you can see, she handles the frustration and the
126
::anger that I'm showing her and being able to not
127
::comply. So does her best. She apologizes. Basically mirror the
128
::user so that they can understand that they understand what's
129
::going on and that they're trying to please them. I
130
::think this is something that's missing currently from technology today
131
::and is essential for a really good full range shopping
132
::experience. You don't want your chatbot to be an anger
133
::machine. So somebody's emotions actually help guide the conversation so
134
::the customer doesn't spend too much time being angry or
135
::fearful. Let's go back to chatting. Sure. So
136
::sitting where you're sitting at the moment, Craig, you're seeing
137
::so much opportunity in so many different places. You're early
138
::to this game. Have you seen the change in how
139
::people are viewing AI in the last six months? Yeah,
140
::absolutely, Roy. I think it changes week to week. It's
141
::comes something being coined as the speed of AI and
142
::we're obviously early adopters. We started doing this stuff 2015,
143
::2016, right. So we've been around a while doing this
144
::stuff and seeing the hockey stick of progression is just
145
::through the chart. It's exponential now because AI really enables
146
::that. Right? Enables productivity to compound exponentially. Knowledge
147
::and information to compound exponentially. And I think that it
148
::really enables everybody to have a little bit of a
149
::superhuman power. And from what we're seeing in our perspective
150
::is it is disruptive. It can be disruptive, but in
151
::a good way. It solves some of the biggest problems
152
::that humanity has right now. Jobs we don't want to
153
::do that, tax us that are mentally draining, that leave
154
::us with feeling terrible and as a psychological burden on
155
::us. There are a lot of jobs that are like
156
::that. Customer service support gives a really good example. You
157
::don't call a customer service or support because you're having
158
::a good time, right. Usually it's a problem and you're
159
::not happy about it, right. So you're getting a mental
160
::dump of all these people's problems on any csr. So
161
::burnout becomes a problem. Churning of employees becomes a problem
162
::that Verm can help solve by putting AI there to
163
::do that first level stuff to connect with a human
164
::being to make them feel like they're hurt. And we
165
::always tell everybody, Roy, that we want to be like
166
::Bill Barr, right? Or Bob Barr, rather. In Incredibles, in
167
::the first scene when the old lady trying to get
168
::help and the insurance company can't. And Bob basically tells
169
::her. Exactly. The guy on the inside tells her how
170
::to cut all the red tape and achieve her goals.
171
::And she's extraordinarily happy and satisfied. That's what we want
172
::Vern to be for the user is to act like
173
::they're the person on the inside that they can trust,
174
::get them the information they need to accomplish their goals,
175
::and then good things come from that. And it's such
176
::a risky journey for companies, isn't it? And this is
177
::what you're struggling with because you've seen bigger companies laying
178
::off 5,000 people because AI is going to do it,
179
::and then realizing that, not bringing the emotion to it,
180
::and the clients are going away. Exactly. And another thing
181
::too is there's a term called AI washing, which I
182
::think is really true. A lot of these companies that
183
::are shedding people added them gratuitously during the pandemic. If
184
::you look at all the hiring that meta Microsoft, you
185
::name any one of these companies that were trying to
186
::expand and grow, seeing work from home being an opportunity.
187
::So they added a lot of headcount towards risks and
188
::bets that they were making. Some of them didn't come
189
::true. There's a gravitational pullback into community and offices and
190
::in person. So you're seeing some of that is a
191
::direct result of they just had so much overhead. Some
192
::of that is you're also seeing too from insiders is
193
::we're getting rid of these people because we think AI
194
::can fill the gap. We don't know and it's your
195
::job to figure it out. That's a recipe for disaster.
196
::It absolutely is. And it will be doomed to fail.
197
::The incentives aren't aligned. And what we see where AI
198
::can do the most good is some of the most
199
::sensitive places, like within mental health. Let's talk about that
200
::a little bit, because that's something. People are going to
201
::be sitting here, not having had the experience with your
202
::avatars. There's got to be so many guardrails. How do
203
::you take somebody that's nearly suicidal and put an AI
204
::model in front of them? What's happening at the moment,
205
::which we start seeing in the news and it starts
206
::getting blown up, is people are using ChatGPT as a
207
::service because that's all they've got, right? Yeah, it's not
208
::responsible either. And the big problems is with large language
209
::models, there's no consensus on what emotions are in psychology,
210
::so they don't know which one to use at any
211
::given moment and usually mix them up and give you
212
::a mixture of theories, frameworks, you're basically getting slop.
213
::And people are taking this as good psychotherapy because it
214
::sounds good. They don't know any better and they shouldn't
215
::have to know better. But the difference is in what
216
::Vern allows is we don't tell or we don't have
217
::the AI. The large language models guess what the emotions
218
::are. They know, we annotate them. We're a measurement tool.
219
::You'll get the same answer, same input every single time.
220
::If not, it's not a measurement tool, you're guessing. So
221
::being able to measure everything gives the AI a very
222
::good idea of the motionality within the scenario. And so
223
::what we do is we use the emotion to guide
224
::the conversation. Right. High anger, repetitively, certain
225
::types of emotional expression leads to a triage logic
226
::which would then help to put everything else aside and
227
::address the person's issues. This is something we pioneered back
228
::in 23. We won the bot a thon at core
229
::AI with this module that could detect when somebody was
230
::angry and wanted to speak to a manager. It would
231
::calm them down, figure out what the problem is, buy
232
::some time, and then put in a transfer to a
233
::live agent. That type of concept of triage is something
234
::you can do with Vern and it's why we are
235
::trusted in deployments that are very sensitive. You've had some
236
::really interesting deployments and was there an autistic deployment that
237
::you told me about? Yes. Yeah. Yeah. Hello, My Friend
238
::is an app for kids with autism or neurodiversity. Have
239
::issues with understanding their own emotions. With Vern, being able
240
::to identify, label emotions for them allows the avatars to
241
::practice with the kids, to review things in their life
242
::and their world that they're confused about because they may
243
::not understand the emotions. The AI avatars can be age
244
::appropriate, gender appropriate. So anything from young kids to teens,
245
::young adults can connect with them and look like them
246
::and be able to discuss whatever that they want within
247
::the respective frameworks using the appropriate psychological therapy.
248
::I think it's fascinating. It's one of the biggest things
249
::that's missing in the game at the moment. Without a
250
::shadow of a doubt. I talk so much about keeping
251
::the human and raising the human to the top. If
252
::you've got a staff of 10 people and those 10
253
::people are doing 70% of their work and as repetitive,
254
::boring tasks and 30% of their work that they like
255
::doing, can you ship that 70% off to the AI
256
::and give them the chance to spend more time connecting,
257
::communicating, engaging with clients? And that I think is going
258
::to be the telling company in 2026 and beyond. It's
259
::not the company that just automates everything and lets it
260
::go. It's a company that lets human come to the
261
::surface and understands how important that connection is with people.
262
::And I think this goes a long way to addressing
263
::and honoring that humanity. We couldn't be here without the
264
::humanity. And AI wouldn't be worthy to us if they
265
::weren't serving human ends. I think that's critically important that
266
::we keep the person. And I hear all the time,
267
::what about Terminator? But what people tend to Forget is
268
::in T2 he was reprogrammed with empathy, set back in
269
::time to save humanity. Right. So if we don't teach
270
::AI how to understand feelings and what they mean to
271
::human beings, how they're communicated, then we're doomed to the
272
::same fate. Right? Hopefully they'll see the beauty in humanity
273
::and how we communicate and express ourselves as worthy as
274
::a 0 and a 1. And so having that translation
275
::layer, what VERN allows, allows for the AI to understand
276
::humans, and we've found for humans to understand themselves better.
277
::I think one of the biggest lessons I've had in
278
::this whole journey is with the current large language models,
279
::it allowed me to communicate with my people better because
280
::as I'm putting the prompts in and I'm getting rubbish
281
::out, I'm thinking what's not clear? And then you analyze
282
::yourself and you say, actually let me read what I
283
::just typed in there or talked in there. Ah, this
284
::is how it could have been translated. I need to
285
::be much more clear. And yeah, in my business relationships,
286
::in my personal relationships, in all of them, I feel
287
::that's given me an uplift in being more clear. 101
288
::of the things that we've really found is obviously working
289
::with these emotional tools is that we detect emotions a
290
::lot better. So our personal relationships have gotten better. But
291
::I will say there is such thing as an emotional
292
::container contagion, long theorized, never proven. We've got some paperwork
293
::on it, so we'll show it. But yeah, you have
294
::to also be careful about surrounding yourself with one emotion
295
::or another. Because when we were doing study and building
296
::out these models, the team members that were working on
297
::say anger started to be more angry and sad. Similar
298
::experience, Love and Joy. I don't know how to and
299
::if they were a little bit more spunky or not,
300
::but it definitely seemed when anybody working on Love and
301
::Joy was a lot happier and we had less issues.
302
::The emotional signals you surround yourself with do have a
303
::profound impact on you. We're going to prove that out
304
::in the next few years. So where do you see
305
::this all going? What do you see? The journey over
306
::26. Yeah. For us, this year really is about enabling
307
::folks to create interface with AI that's responsible, that's
308
::governed so that they can trust. Right. It's all about
309
::building trust with AI. It's not going away. It's definitely
310
::a tool. Using it and deploying it is very important,
311
::obviously all the way up in the board level. But
312
::having the ability to put a layer in between the
313
::human and the AI to ensure that both sides play
314
::nice. Right. I think what we're doing and for us
315
::this year now down the road, I see as what
316
::everybody wants to have is the bat computer or Jarvis,
317
::is the AI that is remarkably human, but still computerized
318
::and digital and obviously has Immense amount of resources.
319
::That's basically the whole world at your fingertips concept and
320
::that assistant that's there to help you achieve your goals.
321
::So I think that's the path everybody's on and we're
322
::really happy to be able to enable a big part
323
::of that. Yeah, and it's really this piece for me,
324
::this doom and gloom beforehand, you know, now we're moving
325
::into a change environment, I think, where people are accepting
326
::and they're saying, okay, how can I use this? Hopefully
327
::we will get to a stage where everything is just
328
::done so much more efficiently, that freedom actually becomes a
329
::reality for so many people. Right. Food's going to be
330
::created, the farming is going to be better, medical is
331
::going to be better. The access of information to all
332
::the people in the world that can't get it is
333
::going to get be better. We understand and be able
334
::to communicate much more. And there's going to be a
335
::change with that time in the near future where more
336
::people are going on holidays, more people are spending time
337
::with the families, more people are exploring the world and
338
::seeing the beauty out there. And that'll because of AI.
339
::Right. And it's the same glide path we've been on
340
::since the Enlightenment. Right. That steady increase of productivity that
341
::leads to more leisure time, more pursuits of things that
342
::interest us and that have uncalculable value. Priceless.
343
::I saw a couple of things and then we'll end
344
::after that. I saw a couple of things on. Yeah,
345
::I think it was aluminum like you guys. Aluminium, as
346
::we said, where the MIT scientists put the AI on
347
::the task of finding an alloy that was stronger. And
348
::in one afternoon they were able to test 300,000 different
349
::possible molecules on it. And they came out with something
350
::that was way better. And if you just go and
351
::apply that to all the other problems humanity has, we
352
::can start solving them and start fixing them. We're never
353
::going to run out of problems. There's always going to
354
::be use for people. Yeah, for sure. It's just going
355
::to change. And every major shift in the world, every
356
::printing press and every bit of electricity and all these
357
::things have all taken us further. There's been a change.
358
::We've adapted. We found out what to do with it
359
::and then we've moved on. And I just. Yeah, I'm
360
::waiting for the day where your day becomes going out
361
::and connecting with people and having deep and meaningful conversations
362
::and understand emotions better. And I just think that's where
363
::we're going to go. Somewhere, Socrates, Plato, Aristotle are sitting
364
::around laughing because that's what they saw as the world
365
::being in the future is one day when man can
366
::step aside from the toils of everyday labor and really
367
::look at existence, science, philosophy and why we're here and
368
::where we're going. I love it man. Awesome. Thank you
369
::for joining me. It's been inspiring talking to you. I'll
370
::put your contact details so people can get hold of
371
::you and look forward to chatting in a few months
372
::time and see where you've got to. Sounds good Roy.
373
::Thanks for having me.