In this episode, Aaron Francis (founder of Try Hard Studios, devoted husband and father, Laracon speaker, coolest person in Dallas) walks me through how he built four giant handwriting robots. It involved almost burning down an office building.
This episode was almost called "I Don't Know How Men are Still Alive."
(Just as a disclaimer, intellectual property from Caedmon Records has been used in this episode under the fair use doctrine. Fair use is a doctrine in United States copyright law that allows limited use of copyrighted material without requiring permission from the rights holders, such as for commentary, criticism, news reporting, research, teaching, or scholarship.)
Aaron
00:00:00 – 00:00:09
They are doing great.
3 years old and maybe 8 or 9 months old, and the little guy's starting to crawl a little bit.
Paige
00:00:09 – 00:00:14
Oh my god.
It's like, is he, like, mobile, like getting in closets and in cabinets and shit?
Aaron
00:00:15 – 00:00:30
He's just like somehow appearing, you know, 3 feet away from where you put him down.
That's about as far as we're getting at this point.
But it's just, you know, it's right around the corner before he's just gone.
So gotta keep an eye on him now.
Paige
00:00:30 – 00:00:41
That's so cute.
I feel like, so I've heard kids tend to adopt the personalities of their parents, but it's not like a mix.
It's either one or the other.
Are you seeing that with your babies?
Like, is one of them you're like, oh, that's me.
Aaron
00:00:43 – 00:00:58
One of them is very headstrong and resolute, and one is kind and sweet and shy.
And I feel like both of those things are a mix of each of us.
Does seem like there's a pretty good mix there.
Paige
00:00:58 – 00:01:11
Oh my God.
Your kids are the cutest.
Yeah.
Well, okay.
So kids, we can't always predict how they're going to turn out, but I'm bringing Aaron on my show today to talk about something you can predict very well: robots.
Paige
00:01:12 – 00:01:49
So Welcome to my show, Aaron.
For my listeners, Aaron Francis probably doesn't need an introduction, but just in case you're not familiar, he's the founder of Try Hard Studios.
I actually copied your Wikipedia bio.
Aaron
00:01:49 – 00:01:50
Wikipedia?
Paige
00:01:50 – 00:02:01
Yeah.
Or, like, Google. You Google him and there's all this stuff about Aaron.
Aaron is a software developer, content creator, and co-founder of Try Hard Studios.
He used to be a tax accountant in the Big Four.
Oh my God.
Paige
00:02:01 – 00:02:09
Okay.
Aaron, you graduated from A&M.
You used to be a tax accountant and you live in Dallas, and yet you are still cool.
How?
Aaron
00:02:09 – 00:02:18
Yeah.
By God's grace alone, I guess.
Yeah.
I mean, I left accounting a long time ago, so hopefully that's just worn off.
But I do live in Dallas, and I did go to Texas A&M.
Paige
00:02:18 – 00:02:30
Yeah.
Okay.
So I'm bringing Aaron on my show to talk about his giant handwriting robots.
Okay.
Before we begin your story, Aaron, would you consider yourself a mad scientist?
Aaron
00:02:30 – 00:02:51
That is a good question.
I would consider myself a tinkerer for sure.
And I think tinkerer brushes up on the edges of mad scientism.
So, yeah, maybe.
I really, really, really like building things, trying to figure things out, and that has led me to where I am in my career right now.
Aaron
00:02:51 – 00:03:12
That's the only thing that has gotten me this far is a willingness to try to do things.
And so, yeah, maybe in some regards, I'm a mad scientist, and in others, I could be called, like, a builder, maker, hacker in, like, the traditional sense, not in, like, the breaking-into-people's-computers sense.
So, yeah, I I'll take that moniker.
I will allow that.
Sure.
Paige
00:03:12 – 00:03:15
Did you watch the Animaniacs growing up?
Do you remember that?
Aaron
00:03:15 – 00:03:29
Oh, gosh.
I, for some reason, feel like I could sing the Animaniacs song, but don't remember a single frame of the cartoon.
So I must have watched it, but I don't remember it at all, but the song is embedded deep in my psyche somewhere.
Paige
00:03:30 – 00:03:35
There's that little mouse character, and he's like, time to take over the world, Pinky.
I don't know.
I was
Paige
00:03:35 – 00:03:35
just thinking
Aaron
00:03:35 – 00:03:36
Pinky and the Brain.
Paige
00:03:36 – 00:03:36
Yeah.
Yeah.
Aaron
00:03:36 – 00:03:54
We watched a lot of Pinky and the Brain.
The other one that was adjacent would have been Dexter's Lab.
That was a similar time and era of Animaniacs, and it was about a little kid that had, like, a secret, you know, science lab under his house and would do all kinds of fun stuff.
And so I feel like that was another one that was kind of the same vibe.
Paige
00:03:54 – 00:03:59
It's kind of like you now, only you're much cooler and, you know, you've got a little lab downstairs.
Aaron
00:03:59 – 00:04:05
I do.
Yeah.
I do love a little workshop or workspace.
I do find happiness there.
Paige
00:04:05 – 00:04:13
It's really cute.
Okay.
So just tell me your story.
How did you build handwriting robots?
How did that come about?
Paige
00:04:13 – 00:04:14
Just walk me through it.
Aaron
00:04:14 – 00:04:35
Yeah.
I mean, like everything else that I have basically ever done, it was just kind of following curiosity and, you know, taking a step at a time.
And, man, I built these things in 2019 probably.
And I was working at a local, just small company.
I was working at, like, a services firm here in Dallas.
Aaron
00:04:35 – 00:05:02
Not a tech company, not anything interesting whatsoever, frankly.
A big part of our business is sending out physical mail.
And so we were a property tax company that worked on behalf of homeowners.
So as you know, living in Texas, we don't have state income tax, but you get totally hosed on property taxes.
So your property taxes can be like 2.5% or 3% of your value, which, you know, when you have a house, that's a lot of money that you have to pay every year.
Aaron
00:05:02 – 00:05:21
But in the law, you are allowed to go to the county and say, listen, you said that my house is worth $500,000.
My house is not worth $500,000.
Look, here's all the evidence, and you argue about it.
You go back and forth.
And then if they agree, you get a big tax break because it's based on whatever they say the value of your house is.
Aaron
00:05:21 – 00:05:42
So that's kind of the setup.
The firm that I worked at would sign up individuals like me, like you, individuals that owned a home and say, listen.
We are experts of this.
We will go into the soulless central appraisal district, and we will argue on your behalf because we're very, very good at it, and you're not very good at it.
And so that's what we did.
Aaron
00:05:42 – 00:05:52
And if we were successful, if we did a bunch of work and we saved you a bunch of money, we get a little bit.
Right?
So I felt like, hey.
That's a good place to be, you know, man of the people.
Let's fight the government.
Aaron
00:05:52 – 00:06:09
Right?
So a big part of our marketing efforts were, well, we gotta send people letters and say, like, hey.
You own this house, and the central appraisal district has set your value at x y z.
We think that's too high.
Why don't you sign up with us?
Aaron
00:06:10 – 00:06:30
Very effective marketing.
I mean, it's really nice that your entire client base is available as public data.
Right?
So you can go to every county in the state of Texas and say, I wanna see your public data, and they have to, by law, give it to you.
A little while back, I published my office address, and everybody was like, you included, by the way.
Aaron
00:06:30 – 00:06:36
You were like, no.
Don't share your address with weirdos.
And I... wow.
You know, deep in my heart, I'm like, yeah.
That's right.
Aaron
00:06:36 – 00:06:36
But
Paige
00:06:36 – 00:06:39
Weird.
I see the weirdos on your page.
I wouldn't share my address.
Paige
00:06:39 – 00:06:40
Yes or no.
Aaron
00:06:40 – 00:06:51
There are weirdos everywhere.
But here's the secret.
You could just go to the central appraisal district and type in my name, and my physical address comes up.
And it's like, that's not great.
But that
Paige
00:06:51 – 00:06:53
is an LLC.
You can pre check
Aaron
00:06:53 – 00:07:08
Yeah, an LLC.
Not if you're a judge, not if you're a police officer.
There are certain ways to hide that.
All of this data is publicly available.
So that's really helpful when you're a company that operates based on physical properties.
Aaron
00:07:08 – 00:07:33
And so we would just go, and this was my job.
I was the only developer in this company, so my job was everything.
But part of my job was to go get the public data, clean it up, find people with really expensive houses because, you know, the more expensive the house, the better the odds that we can get a big cut.
Find those people and then send them a letter in the mail.
That was the marketing that's been done for 1,000 years or, you know, maybe a little bit shorter, but that's how it worked.
Aaron
00:07:33 – 00:07:47
And so I came along and I was like, hey, what can we do that's, one, way more awesome than this, because this sucks.
This is very lame.
And what could we do that would be way more successful?
So there were 2 things that I came up with.
One is interesting and one is not.
Aaron
00:07:47 – 00:08:07
The not interesting one is find in this public data, find people who own multiple houses.
That's not that interesting.
Right?
So you just look it up and you're like, alright.
This person, Climate Paige, owns 4 houses, and we can link them up because Climate Paige's mailing address is on these 4 properties.
Aaron
00:08:07 – 00:08:18
And so she must own those 4 properties.
Bravo, by the way.
And so then I would send you a letter that's like, hey, Climate Paige.
You own these 4 properties.
Why don't you sign all 4 of them up with us?
Aaron
00:08:18 – 00:08:31
You're an investor.
We get it.
Sign up with us.
That works super well.
What was more interesting was finding those people and frankly, finding people that owned 3 to 500 properties across all of Texas.
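The owner-linking Aaron describes, grouping parcels by their shared mailing address, can be sketched in a few lines. The records and field names below are made up for illustration; real county appraisal data would have many more fields.

```python
from collections import defaultdict

# Toy appraisal records (hypothetical fields): each has a situs (property)
# address and the owner's mailing address, as in the public county data.
records = [
    {"situs": "12 Oak St",   "mailing": "PO Box 9, Dallas TX"},
    {"situs": "34 Elm Ave",  "mailing": "PO Box 9, Dallas TX"},
    {"situs": "56 Pine Rd",  "mailing": "7 Main St, Waco TX"},
    {"situs": "78 Cedar Ln", "mailing": "PO Box 9, Dallas TX"},
]

# Parcels that share a normalized mailing address are presumed to share
# an owner: the same mailbox receives the tax notices for all of them.
by_mailing = defaultdict(list)
for rec in records:
    by_mailing[rec["mailing"].strip().lower()].append(rec["situs"])

# Multi-property owners are the interesting prospects.
multi_owners = {addr: props for addr, props in by_mailing.items() if len(props) > 1}
```

With the toy data above, `multi_owners` maps the one shared Dallas mailing address to its three parcels, which is exactly the "this person owns 4 houses" lookup, minus the scale.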
Aaron
00:08:31 – 00:08:52
After we found these people or entities that owned 50 to 1,000 properties across all counties in Texas, then it was like, alright, these are the whales.
These are, like, these are the big dogs.
How do we get their attention?
And my slightly selfish, slightly genius idea was, what if I built handwriting robots?
That's a great idea.
Aaron
00:08:52 – 00:09:24
That's something that a property tax company should do is build robots.
Why not?
Because my thought was, if we send a piece of mail that is physically bigger than a regular piece of mail, and on that envelope, we wrote with a pen their name and address and everything, and that would probably convince people to open it.
And then when they opened it, what if on the first sheet of paper that they pulled out, it was like this printed beautiful thing, but scrawled messily across it was like, hey, Climate Paige.
We would love to save you some money.
Aaron
00:09:24 – 00:09:41
Why don't you give me a call, Blake?
And it was like handwritten on the piece of paper.
My hypothesis was that's gonna convert a lot better.
And frankly, if you go from a 1% conversion to a 2% conversion, that's a 100% improvement.
And so that's what I did.
Aaron
00:09:41 – 00:10:07
So, I'll stop there, but through a series of events that lasted, you know, maybe a year and a half or 2, I ended up building 4 handwriting robots that were the size of a sheet of plywood, which as I'm sure you know, Paige, is 4 American feet by 8 American feet.
So that would be 32 square feet of robot times 4, which is a lot.
What is that?
128 square feet of robot?
Aaron
00:10:07 – 00:10:17
It was just an absolute freaking blast.
I had the most fun ever doing that.
So I'll stop there and then we can get into the step by step of how the robots actually work.
Paige
00:10:17 – 00:10:23
Yeah.
I have a few just quick little questions.
Did you do this on company time or in your free time?
Aaron
00:10:23 – 00:10:41
Definitely on company time.
It was pre kids, first of all, and I was incentivized to make the business more profitable.
We'll say that much.
And so like I did a lot of work during the day.
I did stay pretty late and work on some of this stuff, but it was partially a passion project and partially pure greedy capitalism.
Paige
00:10:42 – 00:10:49
No, I love it.
I love money.
I feel like that's a really cool project to get to work on just for a company.
Okay.
Other question.
Paige
00:10:49 – 00:10:51
Did you name the robots?
Like, when you were working on the robots, did you give them names?
Aaron
00:10:54 – 00:11:09
Yeah.
I did.
And they were I don't remember.
Each one had an individually addressable name because, in the code, I had to, like, target, you know, this robot's doing that.
It was like Renaissance-era painters is basically what their names were.
Aaron
00:11:09 – 00:11:15
I don't remember exactly, but it was like famous painters from whenever the Renaissance was.
Who can say for sure?
Not me.
Paige
00:11:15 – 00:11:18
Are you sure you didn't name them after Ninja Turtles, the Mutant Ninja Turtles?
Aaron
00:11:18 – 00:11:32
No.
I think some people were vying for that, but I'm a very particular, or rather a very specific, type of nerd, and that is not the type of nerd that I am.
No Star Wars, no Ninja Turtles, no Star Trek, none of that.
So that was all lost on me.
I didn't watch that growing up.
Paige
00:11:34 – 00:11:35
You're a learning nerd.
Aaron
00:11:35 – 00:11:37
Yes.
I think that is correct.
Paige
00:11:37 – 00:11:43
Yeah.
That makes sense.
Because, and this is just a side note, you really do a lot of things.
You have 4 kids.
Paige
00:11:43 – 00:11:49
You have, like, 8 different projects.
You refused coffee at 4 PM.
So you're just going on pure energy.
Aaron
00:11:51 – 00:11:51
Yeah.
Paige
00:11:52 – 00:11:52
Aaron.
Aaron
00:11:52 – 00:11:55
It's just the spice of life over here.
Yeah.
Paige
00:11:55 – 00:12:13
That's wild.
I have some questions about a potential robot Aaron leader, but okay.
So just to, like, kinda backtrack, what kind of automation was involved with these robots?
Like, what what would you call that automation?
Also, how would you define automation for my listeners?
Aaron
00:12:13 – 00:12:47
I call it process automation.
And by the time I left that company, I was the COO of the whole company.
And so at that point, I'm managing, oh, man, I'm managing 15 to 20 people stateside and maybe 10 to 15 more abroad who are, like, helping with customer support and that sort of thing.
And the thing that I cared about the most in terms of operating the business while I was there, because at at the end, I was in charge of everything.
And then there were 3 board members above me, and that was kinda it.
Aaron
00:12:47 – 00:13:09
And so it was like, this is my fiefdom.
I can make it into whatever I want.
And the thing that I cared most about was and I continued to tell all the employees this.
We need to let the computers do what computers are good at and the humans do what humans are good at.
And I think that often gets lost in the discussion of automation or AI, whatever that may mean.
Aaron
00:13:09 – 00:14:07
I think that gets lost.
And my stance, my belief, is that there are lots of things that computers are really, really good at, and we should not hold on to doing those things.
In this property tax company, an example of that is when a citizen calls in and says, hey, my property is at this address.
A computer is really good at finding that address, taking all of the data from the public records, submitting it to a third party address verification to make sure everything is spick and span, all cleaned up, you know, mail is going to arrive, everything like that, taking all that data, putting it on a contract, making a PDF out of that contract, emailing the contract to the citizen, and then waiting for them to sign it, and then emailing the signed contract to the central appraisal district and telling them we are now representing Climate Paige.
And so all of that, a computer is great at doing.
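As a rough illustration of that hands-off pipeline, here is a minimal Python sketch. Every function is a hypothetical stand-in, not the firm's actual code; a real system would call a county data source, an address-verification service, a PDF library, and an email provider.

```python
# A hedged sketch of the sign-up pipeline described above. All names and
# data here are invented placeholders for illustration only.

def lookup_public_record(address: str) -> dict:
    """Pretend county lookup: return parcel data for an address."""
    return {"address": address, "owner": "Jane Doe", "assessed_value": 500_000}

def verify_address(record: dict) -> dict:
    """Stand-in for a third-party address-verification call."""
    record["address"] = record["address"].title().strip()
    return record

def build_contract(record: dict) -> str:
    """Merge the cleaned data into a contract body (a PDF in real life)."""
    return (f"Representation agreement for {record['owner']}, "
            f"{record['address']}, assessed at ${record['assessed_value']:,}")

def process_signup(address: str) -> str:
    # The chain a human used to click through by hand, screen to screen.
    record = verify_address(lookup_public_record(address))
    contract = build_contract(record)
    # ...email the contract, await e-signature, notify the appraisal district...
    return contract
```

The point is the chaining: each step a person once did screen-to-screen becomes one function call, so `process_signup("  123 oak street  ")` runs end to end with no human in the middle.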
Aaron
00:14:07 – 00:14:22
There is no reason that a human should be staying late at work to take items from one screen and put it onto another screen.
That makes no sense.
That is not fulfilling.
That is not good for humans.
That is not good for the business.
Aaron
00:14:23 – 00:14:36
Makes no sense whatsoever.
On the other hand, I think humans should be talking to humans.
And so like when a citizen calls in, I feel like a human should answer the phone.
If they wanna go online and sign up, that's great.
I love that for them.
Aaron
00:14:36 – 00:14:52
That is available to them.
If they call in, you know who I don't really want them talking to?
An AI.
I kinda want them talking to a human because at that point, they probably have some questions.
And, you know, maybe one day we'll be at the point where an AI can answer the phone and talk like that.
Aaron
00:14:53 – 00:15:17
I'm skeptical.
And so my big push at the firm was let us leverage computers for literally everything that they can do and free up the humans to do the better and higher work that we need humans to do.
And, like, that includes making a judgment call.
A computer can't make a judgment call.
It can follow a set of rules, but it can't make a judgment.
Aaron
00:15:17 – 00:16:11
And so when it comes to automation, that's really the only thing I care about: let's look at the tasks that are available to be done and partition them into best for a computer, best for a human.
And then the ones that are best for a human, well, let's leverage a computer such that it presents the information to the human in a helpful, digestible, and actionable way, and then you can provide it input, and the computer will go back to work.
And so the platonic ideal of automation is the human gets to sit at their desk and be presented with things that are beyond the capacity of a computer to figure out, and the human gets to use their powerful God-given brain to make a great judgment, and then they don't have to do any busy work, and everything goes back to the computer.
So that's kind of my my take on, like, are the robots gonna take our jobs?
I have no idea.
Aaron
00:16:11 – 00:16:14
Hopefully, they just help us do things that only humans can do.
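The computer-versus-human split Aaron describes boils down to a triage step: rule-based cases are handled automatically, and anything needing judgment lands in a human queue. A toy sketch, with an invented rule:

```python
# Hypothetical triage: the rule below is made up for illustration; a real
# system would encode whatever its domain can decide mechanically.

def needs_judgment(case: dict) -> bool:
    # Computers follow rules; anything outside the rules goes to a human.
    return case.get("evidence_conflicts", False) or case.get("value") is None

def triage(cases: list) -> tuple:
    auto, human_queue = [], []
    for case in cases:
        (human_queue if needs_judgment(case) else auto).append(case)
    return auto, human_queue

auto, queue = triage([
    {"id": 1, "value": 500_000},
    {"id": 2, "value": None},                                # missing data: judgment call
    {"id": 3, "value": 750_000, "evidence_conflicts": True}, # conflicting evidence
])
```

Case 1 flows straight through; cases 2 and 3 wait in the queue for a person, which is the "present the judgment calls, automate the rest" loop in miniature.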
Paige
00:16:14 – 00:16:28
Very interesting.
So I do have to ask when you, you know, implemented these robots and automation at your company, did they lay anybody off?
Were there any jobs that got cut because of the robots, or did it just kind of speed up processes?
Aaron
00:16:29 – 00:16:58
Yeah.
I will answer what you actually asked, and then I'll answer what I think is the truth.
Nobody got laid off.
So when I came, people were literally taking information from one screen and then merging it to PDF on another screen and then literally printing it out and stuffing it in an envelope and literally mailing it by hand.
And so they would have these, they called them, before I got there, contract parties, where they would answer the phones all day, type down information, and then at 5, they would turn off the phones and spend from 5 till 11 PM.
Aaron
00:16:58 – 00:17:14
They would order dinner and drink a beer and print contracts and stuff them.
And it's like, boy, that sucks.
So I came along and just forget robots.
I just did programming.
I'm a developer, and so I just made that process automated.
Aaron
00:17:14 – 00:17:34
And nobody lost their jobs.
Everybody's life just got better.
Right?
So when I started, we'll say there were x properties under representation, and when I left, there were six and a half x properties under representation.
So we grew the company by 6 and a half times, but here's the real answer to your question.
Aaron
00:17:34 – 00:17:50
We did not have to scale hiring at the same rate that we scaled the company, and that is because of the automation and the software that I built.
And so the real answer is, no.
Nobody got laid off.
Oh, robots are great.
Computers are great.
Aaron
00:17:50 – 00:18:13
There's no problem ever.
The underlying answer is, had we not had those processes, we probably would have hired a lot more people.
We were running an incredibly lean operation as compared to our competitors because everything was automated.
We could literally have somebody sign up 100 properties and never speak to a human, and they would get their contracts.
The county would get their contracts.
Aaron
00:18:13 – 00:18:23
Everything would happen, and we would never talk to a human.
And so, yeah, there were definitely future potential jobs that were lost, but nobody actually got laid off or anything.
Paige
00:18:23 – 00:18:33
Very interesting.
Can you tell me about the actual process of building the robots, like from start to finish?
Yeah.
Yeah.
It took you over a year.
Paige
00:18:33 – 00:18:33
I feel like it was really fun.
You were probably out there, like, wearing maybe the lab coat.
Yeah.
You should get a lab coat.
Aaron
00:18:41 – 00:19:02
I know.
If it were still 2012, I'd get, like, a hipster leather, no, jean, like, frock with a leather, you know, strap around it, and I could, like, hang a hatchet in it.
But 2012 is over, and, like, the whole Mumford and Sons thing is kinda out, but those were cool.
Like, the jean aprons, I really dug those for a while.
Aaron
00:19:02 – 00:19:11
No.
I didn't have any costumes like an idiot.
I was just there in my t shirt.
So how did I build the robots?
This is just a delight.
Aaron
00:19:12 – 00:19:36
It's a process of fumbling around in the dark.
So the way that it started is not the way that it ended, but the way that it started is I found some off-the-shelf x-y plotters.
And so an x-y plotter is basically a robot that has 2 arms.
One is stable on the desktop, and the other one just kinda moves around, and it has a pen at the end.
And so very, very simple machines.
Aaron
00:19:36 – 00:19:50
So I found that those existed, and I was like, hey.
This is a great start.
And I bought one of them, or the company bought one of them.
We figured out in the beginning how to write a single message thousands of times.
So that was the beginning.
Aaron
00:19:51 – 00:20:18
We take off the shelf robots.
We ended up buying 10 of these off the shelf robots and set up a little, you know, in the back of the office, set up a little robot room.
And we had an individual write the message one time onto, like, a tablet.
And I captured that writing, and then I fed that writing to the 10 different robots.
And so the 10 robots would all be writing at once, and they, you know, obviously can't be addresses because that's variable data.
Aaron
00:20:18 – 00:20:29
So we're writing this message on the front of this insert that's like, hey, no name, because we couldn't write a name.
Hey.
We know we've got a lot of properties.
Sign up with us, and we're awesome.
That kind of thing.
Aaron
00:20:29 – 00:20:52
So that's how it started.
And then the question turned to, this is great, but if nobody ever opens the thing, they're not gonna see the handwritten message.
And so, you know, we're putting all this effort into writing this handwritten message and then shoving it in an envelope and slapping a sticker on it.
It's like, I don't know if this gets us anywhere.
And so the logical next step was how do we get variable data because we need to write people's addresses.
Aaron
00:20:53 – 00:21:20
And so the answer there: after a whole lot of effort, I tried a bunch of different ways, and I think the tools have come a long way since, but I tried a bunch of different ways to generate handwriting.
Turns out, pretty hard problem to solve.
And so I pivoted, and what I did instead was I captured one of our employees' handwriting.
So I sat her down and was like, alright.
Good news, bad news.
Aaron
00:21:20 – 00:21:34
Good news is you don't have to answer the phones today.
Bad news is I need you to write each letter of the alphabet, upper case and lower case, 25 times, and I need you to make it as neat as possible.
So good luck.
Have fun.
And so she she did that.
Aaron
00:21:34 – 00:22:10
I mean, she wrote each letter 25 times on this little tablet thing, and then I captured it and digitized it and turned it into a library of characters.
And then I wrote a program that would, like, pick a random assortment of characters to spell the correct word and then apply certain variations to those characters such that they looked human, including, like, when you're writing, your baseline naturally wanders.
Right?
Because you don't normally draw a line with a ruler and then make sure that you're writing on that line.
So if you're just writing out on an open piece of paper, your baseline wanders, your left margin wanders a little bit.
Aaron
00:22:10 – 00:22:44
You know, some things are too close together.
You know, when you're writing an address, you're like, I'm gonna write this in the middle, and you start and you're like, oh, I ran out of room.
So I modeled those things such that the computer would make mistakes where it started off too far to the left and then it would work its way back.
So I turned it into a program that could generate handwriting.
And so step 2 was get a list of these high value prospects and then programmatically generate a thousand, 5,000, whatever it was, sets, basically files: generate 5,000 files with their addresses as if our employee had handwritten them.
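A toy version of that handwriting generator might look like the following. The variation ranges and glyph representation are invented for illustration; the real system drew from the employee's captured stroke data.

```python
import random

# Each letter was captured 25 times; here a "variant" is just an index into
# that (hypothetical) library of stroke samples.
GLYPH_VARIANTS = 25

def generate_line(text, seed=0):
    rng = random.Random(seed)          # seeded so a given run is reproducible
    baseline = 0.0
    x = rng.uniform(-2.0, 0.0)         # left margin starts slightly off, like a human
    placed = []
    for ch in text:
        variant = rng.randrange(GLYPH_VARIANTS)  # pick a random captured sample
        baseline += rng.uniform(-0.4, 0.4)       # baseline wanders as you write
        jitter = rng.uniform(-0.2, 0.2)          # per-letter vertical wobble
        placed.append({"char": ch, "variant": variant,
                       "x": round(x, 2), "y": round(baseline + jitter, 2)})
        x += 6.0 + rng.uniform(-0.5, 0.5)        # uneven spacing between letters
    return placed

line = generate_line("Paige")
```

The design choice is the same one Aaron describes: don't synthesize handwriting from scratch, just remix real captured glyphs and add the small human imperfections, wandering baseline, drifting margin, uneven spacing, that make the result look unplanned.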
Aaron
00:22:44 – 00:23:04
And then I fed those into the 8 or 10 separate machines, and we wrote them on label sheets.
And so one label sheet had 10 labels on it.
So a robot could write 10 labels before we had to have a human being change out the sheet.
Right?
So we had one human that was back there monitoring these 10 machines.
Aaron
00:23:04 – 00:23:23
And every time a machine finished its 10th label, it would call out to the human and say, I'm done.
I need you to change the label sheet and hit this button so that I know that the label sheet has been changed, and I can start again on my next 10.
That was phase 1.
There are more phases.
Questions about phase 1.
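The phase-1 loop, write ten labels, call out to the human, wait for the button press, can be simulated in a few lines. The robot name and callback here are illustrative, not from the real system.

```python
# Each label sheet holds 10 labels; after finishing a sheet, the robot calls
# out to the human and blocks until the sheet change is confirmed.
LABELS_PER_SHEET = 10

def run_robot(name, labels, confirm_sheet_change):
    sheets_used = 1                      # the sheet loaded at the start
    for i, label in enumerate(labels):
        if i > 0 and i % LABELS_PER_SHEET == 0:
            # "I'm done. Change my sheet and hit the button."
            confirm_sheet_change(name)
            sheets_used += 1
        # ...the plotter would write `label` here...
    return sheets_used

changes = []
total = run_robot("Botticelli", [f"label {n}" for n in range(25)],
                  confirm_sheet_change=changes.append)
```

With 25 labels, the hypothetical "Botticelli" robot pauses twice for a sheet change and uses three sheets total; scale that across ten machines and you get the one-human-watching-ten-robots setup Aaron describes.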
Paige
00:23:23 – 00:23:28
Yes.
A few.
Oh, many questions.
So do you wanna be mindful of our stop time?
Oh my god.
Paige
00:23:28 – 00:23:30
Okay.
Just a little quick little note.
Aaron
00:23:30 – 00:23:32
We can just keep going.
We can go as long as you want.
Paige
00:23:32 – 00:23:33
Be careful about what you're doing.
Paige
00:23:34 – 00:23:48
Thomas is a very curious person, Aaron.
I know you have children and things to do that are not talk to me.
Okay.
So just quick little note.
Do you know, when intelligence agencies are trying to determine where someone's handwriting is from?
Paige
00:23:48 – 00:23:57
The more mistakes like that, the more American you are.
It's different across countries. Like Germany: very straight, everything is perfect.
Yeah.
Aaron
00:23:57 – 00:24:01
I'm shocked to hear that the Germans have very straight handwriting.
What a surprise.
Paige
00:24:01 – 00:24:06
Earlier when you were like, oh, we wanted to make it, like, super straight.
You're just training your robots to be German.
Aaron
00:24:06 – 00:24:07
Mhmm.
Paige
00:24:07 – 00:24:20
It sounds sort of like an assembly line of robots.
Right?
They were producing and humans were getting involved just to kind of switch it over.
What was, like, the overall sentiment or reaction from the people you worked with?
Were they like, oh, this is so cool.
Paige
00:24:20 – 00:24:26
No more 5 PM to 11 PM work shifts, or were some people a little concerned?
What's the just the feeling?
Aaron
00:24:26 – 00:24:38
100% awesomeness.
Late night shifts were eliminated by pure software, and they were just thrilled to death about that.
They loved that to no end.
And then the robots came along, and robots are awesome.
Everybody loves robots.
Aaron
00:24:38 – 00:24:59
I mean, this is an interesting case because this was a new function that had not been executed by anyone before, and so it was purely additive.
And so in this case, I wasn't freeing anyone up from, you know, handwriting all these letters.
It opened a new avenue for us of marketing.
And so it wasn't like, hey.
I used to be the handwriting guy.
Aaron
00:24:59 – 00:25:08
Now I'm the robot, you know, maintainer.
It was like, oh, we never did this before, but now I get to watch all these robots.
Right?
And isn't that super cool?
So everyone was thrilled about it.
Aaron
00:25:08 – 00:25:16
Everyone was super distracted by wanting to give me ideas.
I'm like, hey.
What if the robot did this?
And I'm like, y'all, the phones are ringing.
Go answer the phones.
Aaron
00:25:16 – 00:25:19
I'll handle the robots.
So, no, they all loved it.
Paige
00:25:20 – 00:25:30
I had a robot I programmed in grad school called Misty, Misty AI.
Very cool.
We renamed it the Daisy bot because we thought Misty was just not a very cute name for kids.
Paige
00:25:30 – 00:25:30
Mhmm.
Paige
00:25:30 – 00:25:39
And you could program it to bring you coffee and say hello.
Was that what people are wanting or were they wanting things that would make their jobs easier?
Like what were some of the suggestions you got?
Aaron
00:25:39 – 00:25:46
The tips they were giving me were more like, hey.
Why don't you try this wacky thing?
And I'm like, I don't even know what that means.
Paige
00:25:47 – 00:25:48
Like, what?
What do you mean?
Aaron
00:25:48 – 00:25:56
Well, let's go to phase 2 then.
Phase 1 is we had these 10 robots and a human was watching them.
Phase 2 was, like, alright.
This kinda worked.
We gotta scale up.
Aaron
00:25:56 – 00:26:13
And if we're gonna scale up, we need to decrease the amount of time that a human has to be in the loop.
Because at that point, a human had to be on 24/7.
That's not true.
Had to be on as long as business hours.
As long as we were running the robots, a human had to be there.
Aaron
00:26:13 – 00:26:31
And so phase 2 was like, alright.
Well, what is the human actually doing?
The human is moving paper.
Surely, a robot can move paper.
Like, surely, we can say, grab this piece of paper, move it out of the way, and put a new piece of paper in such that the robot can start on a new sheet of labels.
Aaron
00:26:31 – 00:27:00
So phase 2 was how do we move paper?
You know, turns out pretty freaking hard to move a single sheet of paper without moving other sheets of paper.
We tried just using a printer because printers have it figured out.
Yes.
Printers suck generally, but one thing they're good at is pretty much just grabbing a single sheet of paper; connecting, or, like, running out of ink, that's other stuff they're bad at, but they don't normally get paper jams anymore.
Aaron
00:27:00 – 00:27:21
Right?
And so we're at this spot where I'm trying to figure out how do I set up a robot and a second robot such that the paper moving robot can put it in place of the handwriting robot.
I tried all kinds of stuff.
The main route that I went down was suction.
And so for a period of time, I was pursuing a path, and this is where all the suggestions came in.
Aaron
00:27:21 – 00:27:38
That was like, guys, I can't, like, abide any more suggestions.
I just have to try it and see if it works, and then we'll move on to see if something else works.
But it was the same kind of thing, like, you know, when a dad is standing around looking at the engine, like, 15 other dads appear out of thin air, and they're like, oh, what do you got there?
An engine?
Yeah.
Aaron
00:27:38 – 00:27:49
I've got an engine back home.
And so it was that same kind of thing.
Like, I would be doing work, and then suddenly, employees are everywhere.
And I'm like, y'all have stuff you gotta do.
The path that I went down for a while was, alright.
Let's see if I can create a vacuum that can lower down and pick up a single piece of paper off the stack and move it over.
Aaron
00:27:49 – 00:28:04
So kinda like a crane arm.
It came down and it turned on the vacuum and it sucked up the piece of paper and it moved it over and it put it back down.
Theoretically, it worked great.
In practice, it was a disaster.
Aaron
00:28:07 – 00:28:31
Paper is somewhat porous, and so we would constantly be picking up 3, 4, 5 pieces of paper, and then you raise it up into the air to move it, and the weight of 5 pieces of paper causes the suction to break.
And should you get a single piece of paper, when it puts it down, does it put it down with the corners all squared up?
And so then you start handwriting on it, and you're in the wrong spot.
And so phase 2 was just a disaster.
It was just a nightmare.
Aaron
00:28:31 – 00:28:53
It was like, I'm running headlong into a brick wall of things that are beyond my skill level, and we couldn't crack it.
And so I changed tack a little bit for phase 3, which was successful.
And phase 3 was, alright.
I am going to optimize for because the question you always have to ask is, like, what are you optimizing for?
Not just in robots, not just in programming.
Aaron
00:28:53 – 00:29:21
In life, what are you optimizing for, and are you working towards that end?
And so I decided, here's what we're gonna optimize for in phase 3.
We're gonna optimize for unattended run time.
So unattended run time is how long can the robots run while a human is not there.
Because if the robots can run 12 hours unattended, a human can start a process at 5 PM and walk out the door.
Aaron
00:29:21 – 00:29:45
And then when we get to work at 8 AM the next day, the robot did 12 hours of work without us having to futz around with it.
Right?
We never got that far, but unattended runtime became my new North Star.
So how can I decrease the amount of human interaction while still increasing or maintaining the amount of robot output?
And one way to solve that constraint is paper moving.
Aaron
00:29:45 – 00:29:53
You know, have a robot pick up the paper and put it in place over and over and over.
You could run for infinity time until, you know, something breaks.
Right?
Couldn't crack it.
Couldn't get there.
Aaron
00:29:53 – 00:30:10
So what I did instead was I made the robot so much bigger.
So what that allowed me to do was, the robots off the shelf are maybe, you know, 16 inches long.
Right?
I looked at that and I thought, okay.
A 16 inch long robot can handle 1 sheet of labels.
Aaron
00:30:10 – 00:30:28
One sheet of labels has 10 labels on it.
Each label takes 45 seconds to 60 seconds to write, so we get 6 minutes of unattended run time.
That's not great.
That's not very long, you know, and then you have 10 robots running and you're just constantly attending robot.
So what I did was I said that 18 inches is stupid.
Aaron
00:30:28 – 00:30:31
What if we made it 8 feet instead?
Paige
00:30:31 – 00:30:33
Made giant robots?
Aaron
00:30:33 – 00:31:01
What if what if instead of doing something halfway, what if we just did it too much, which is kinda like the theme of my life?
Like, what's a reasonable thing to do?
Let's go beyond that by 6 or 7 steps.
And so that's what we did was I disassembled these off the shelf robots that we had, and I replaced the 18 inch bar with I think it was, like, 2,000 millimeters, which is impossible to know what it is in human units because I don't know what a millimeter is, but it was almost 8 feet long.
Right?
Aaron
00:31:01 – 00:31:18
So then I have this bar that's 8 feet long, and that allows me to get, you know, however many, let's say, 6 to 8 sheets of labels in.
So suddenly, we've gone from 1 sheet of labels to 8 sheets of labels.
We've gone from 10 labels to 80.
Unattended run time, way, way up.
That's awesome.
Aaron
00:31:18 – 00:31:38
And then I looked at it and I thought, hang on.
There are only 2 axes on this thing.
There's an x and there's a y.
We're only writing on one side of this 2,000 millimeter bar.
So, like, if you looked at this robot, you would see 8 sheets of paper all lined up on one side, and then on the other side, you would see nothing.
Aaron
00:31:38 – 00:31:40
There's nothing on the backside of the robot.
Aaron
00:31:40 – 00:32:28
And I looked at it and I thought, this seems very stupid.
What if I were able to attach a pen to both sides of this cross member such that once it's finished writing these 80 labels, it could just turn around and write another 80 on the backside of the robot.
So now instead of being a one-sided handwriting robot that is 18 inches long, it is a 2 sided handwriting robot that is almost 8 feet long.
And so then I've taken one sheet and turned it into, you know, 16 sheets, and the unattended run time goes up to a few hours.
And so that means when somebody leaves at night, they can start a new run, and it'll go until 8 or 9 o'clock at night, and we're getting all of that time back without having to have a human babysit it.
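The unattended-runtime math Aaron walks through above can be sketched as a quick calculation. The labels-per-sheet and seconds-per-label figures are the ones quoted in the conversation (at 45 seconds per label, one sheet comes out around 7 to 8 minutes, the same ballpark as the "6 minutes" he estimates); the function name is just illustrative:

```python
# Back-of-envelope unattended-runtime arithmetic from the conversation.
# Numbers (labels per sheet, seconds per label) are the ones Aaron quotes;
# the function itself is only an illustration.

def unattended_runtime_minutes(sheets: int, labels_per_sheet: int = 10,
                               seconds_per_label: int = 45) -> float:
    """How long a robot can run before a human must swap paper."""
    return sheets * labels_per_sheet * seconds_per_label / 60

# Stock ~16-inch robot: one sheet of labels at a time.
print(unattended_runtime_minutes(1))    # 7.5 minutes per load

# 8-foot bar, ~8 sheets per side, writing both sides: 16 sheets.
print(unattended_runtime_minutes(16))   # 120.0 minutes, a couple of hours
```

This is why the "make it bigger and two-sided" move beats paper handling: the runtime scales linearly with sheets loaded, with no new failure modes from moving paper.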
Aaron
00:32:28 – 00:32:54
And so the process to get there was quite difficult because these robots were not intended to be 8 feet long.
The software did not expect them to be 8 feet long.
The software did not expect them to write on the opposite side from which the pen was originally mounted.
The software did not expect there to be another pen that had to be controlled by a servo.
And so I had to write all this custom code, which was a blast, to figure out, like, where am I in 2 dimensional space?
Aaron
00:32:55 – 00:33:03
Where's the paper?
Where's the pen?
What am I supposed to be writing?
Has the pen run out of ink?
Has the servo died?
Aaron
00:33:03 – 00:33:28
All kinds of these questions of how do you know where you are when you don't know where you are?
That's like the ultimate question is, okay.
I'm gonna tell this pen to go to the second sheet of labels and start writing on the 3rd label.
That has no meaning in the world whatsoever, and you have to define where does the second sheet of labels begin?
Where does the third label begin?
Aaron
00:33:29 – 00:33:58
What are these measurements?
And everything is measured in steps of the motor, and so you're doing all this math.
And, like, I had to look up, like, high school precalc trigonometry to be like, how do I rotate this so that I can trick the robot into thinking it's writing on the front side, but it's actually writing on the backside, and the motions are still all correct?
And so that was phase 3, and that's where we finished was we had 4 of these 8 foot robots stacked on top of each other.
And so they're all in one room, 4 sheets of plywood stacked on top of each other.
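A minimal sketch of the coordinate trick described above: the firmware believes the pen is writing on the front side, so writing on the back side means reflecting the target coordinates before converting them to motor steps, and label positions like "second sheet, third label" have to be defined by hand as measured offsets. All names and numbers here are illustrative assumptions, not Aaron's actual code:

```python
# Sketch of mapping "where on the paper" to motor steps, with a reflection
# for the pen mounted on the opposite side of the cross member.
# STEPS_PER_MM and the offsets are hypothetical values for illustration.

STEPS_PER_MM = 80          # hypothetical motor resolution
BAR_LENGTH_MM = 2000       # the roughly 8-foot cross member

def to_steps(x_mm: float, y_mm: float, back_side: bool = False):
    """Map a point on the paper (mm) to motor steps, mirroring for the back."""
    if back_side:
        # Reflect across the bar so the motion comes out correct when drawn
        # by the second pen on the opposite side.
        x_mm = BAR_LENGTH_MM - x_mm
    return round(x_mm * STEPS_PER_MM), round(y_mm * STEPS_PER_MM)

# "Second sheet, third label" has no meaning until you define it:
# sheet and label offsets are physical measurements, not robot knowledge.
SHEET_WIDTH_MM, LABEL_HEIGHT_MM = 215.9, 25.4   # US Letter width, 1-inch label
x0 = 1 * SHEET_WIDTH_MM     # second sheet starts one sheet-width in
y0 = 2 * LABEL_HEIGHT_MM    # third label starts two label-heights down

print(to_steps(x0, y0))                   # front side
print(to_steps(x0, y0, back_side=True))   # same label position, back side
```

The reflection is the whole trick: one transform at the boundary lets all the downstream motion code stay unaware that a second side exists.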
Aaron
00:33:58 – 00:34:03
So I got to build this massive structure to, like, hold these 100 pound robots.
Paige
00:34:03 – 00:34:04
In the house?
Aaron
00:34:04 – 00:34:21
It was an office.
So we moved offices, and the robots got their own office, which was awesome.
It had this gentle hum of robots going all the time that was super fun.
And so I built this giant structure.
I wired all 4 robots to a single, like, Raspberry Pi computer, and had a little monitor in there.
Aaron
00:34:21 – 00:34:41
And everything was run off of just a keypad, and so you would come in and be like, alright.
Robot 1, I need to replace the pen, and it would, like, go to the replace pen position.
You'd replace the pen, and then you'd do a self check.
I could control them either from the robot room or just from my computer in the other room because I could just, like, remote into the robot controller and be like, alright.
Everybody stop.
Aaron
00:34:41 – 00:34:55
Stop working.
Something went wrong.
I am going to go in and fix the robots.
And so that is where we ended.
We ended up writing many, many, many thousands and thousands of labels, including the custom signatures on top of the inserts.
Aaron
00:34:55 – 00:35:13
And it was just, we couldn't A/B test anything because it's physical mail.
It's kinda hard to get statistically significant stuff, but anecdotally, massive success.
Just huge fantastic win.
Tons of clients.
We would get people that would call in and be like, man, I can't believe you took time to hand write that to me.
Aaron
00:35:13 – 00:35:24
That really meant a lot.
And we were like, I mean, frankly, it took more time than you could ever imagine.
It maybe would have been faster just to hand write it.
So still feel good about that.
But, yeah, that was that was where we ended.
Aaron
00:35:24 – 00:35:29
That was phase 3, and that was just an absolute delight for me to be able to pull that off.
Paige
00:35:29 – 00:35:36
Did you ever tell the people who called in that it was a robot?
Is it, like, secret?
Was it a secret at the time?
You're like, we're not gonna tell you it's a robot.
Yes.
Paige
00:35:36 – 00:35:37
We did handwrite it.
Aaron
00:35:37 – 00:36:02
I would not allow anyone to say that we handwrote it because that was not truthful.
We did not offer that a robot wrote it.
And when I say, like, it probably would have been faster to handwrite it, that is true.
And so when somebody would call in and say, like, ah, I'm so impressed by the amount of effort you put into handwriting this, the real answer is, thank you.
It was a lot of work because it was a stupid amount of work.
Aaron
00:36:02 – 00:36:08
That's the real answer.
And so we wouldn't say like, oh, thanks.
I did it myself.
We would say, thank you.
That was a lot of work.
Aaron
00:36:08 – 00:36:12
I'm glad that you got it.
Can we sign up your properties?
I mean, that's kinda how it went.
Paige
00:36:12 – 00:36:25
Well, yeah, that's honest.
Very honest.
I have to ask, were there any safety issues when you were building these robots?
I mean, a tiny robot feels pretty safe. It's not gonna fall on you.
If it catches on fire, it's whatever.
Paige
00:36:25 – 00:36:34
You can put it out.
What were some safety issues with these big robots?
How did you handle them?
I know guys are a lot less cautious than women.
I know this for a fact.
Paige
00:36:34 – 00:36:35
I see this in my life
Paige
00:36:35 – 00:36:36
Yes.
Everywhere.
Aaron
00:36:36 – 00:36:54
Yeah.
Never heard of OSHA in my life is the real answer there.
So the robots themselves are all low voltage.
So, like, we weren't really working with dangerous robotics stuff.
I think the stupidest thing that we did was we ran a full on laser cutter in an office park.
Aaron
00:36:54 – 00:37:22
So, like, yes.
Where you go into some nondescript, depressing office park, and there's, like, old sandwiches for sale.
We were running a laser cutter in one of those, and turns out we were on the 7th floor, so the windows didn't open.
So we jerry-rigged, like, the most janky filter system you've ever seen in your life, and then we vented it into the hallway.
So we, like, ran it through all our filters and then opened the door and blew a fan into the hallway.
Aaron
00:37:23 – 00:37:34
And there were times where security would come up and be like, hey.
Is something burning up here?
And we'd be like, woah.
I don't know, man.
Guess you should go check down the hall.
Aaron
00:37:34 – 00:37:47
That's crazy.
That was the stupidest thing we did.
Nothing ever happened from it.
And eventually, I had a huge set of things I needed to cut, like, many hundreds of things I needed to cut.
And I was like, guys, we gotta get this thing out of here.
Aaron
00:37:47 – 00:38:00
And so we took it to somebody's house and put it in their garage and vented it to the outdoors.
But that was maybe the dumbest thing we ever did.
The most unsafe? Yeah, probably that also, the laser cutter.
That's probably the most dangerous thing we did, but it was awesome.
Paige
00:38:00 – 00:38:06
You can't tell from my face, but I just had a mini heart attack.
You were on a very high office floor.
Yeah.
No ventilation.
Paige
00:38:07 – 00:38:13
No.
I'm gonna have visions all night of, like, fires burning down the block.
Aaron
00:38:13 – 00:38:31
We never ran the laser unattended, because you can't.
So that was eliminated as a concern.
Running the robots unattended, I don't know.
It never bothered me even a little bit.
Because, again, like, the power to the robots was, first of all, manufacturer supplied.
Aaron
00:38:31 – 00:38:54
So I'm not, like, cutting and splicing power to the wall or anything.
I'm taking what the manufacturer supplies and plugging it into the wall.
I don't mess with 120 volts because I'm not skilled with electricity.
So after it's stepped down into, you know, whatever, 10 volts, 12 volts, whatever, I'm just not really concerned about it.
And then the laser was always risky and never unattended.
Aaron
00:38:54 – 00:38:58
But, yeah, it was definitely stupid, but it was, oh, it was very awesome.
Paige
00:38:58 – 00:39:01
It's not stupid.
It was just high risk.
Aaron
00:39:01 – 00:39:02
I'll take that.
That's great.
Paige
00:39:02 – 00:39:06
Very high risk.
Doesn't surprise me.
Never heard of OSHA.
Oh my gosh.
Aaron
00:39:06 – 00:39:08
What even is that?
Yeah.
Who knows?
Paige
00:39:08 – 00:39:16
Yeah.
Yeah.
I mean, like, I have 2 little brothers, so what you're describing is pretty masculine-coded behavior.
Yeah.
Aaron
00:39:16 – 00:39:17
It sure is.
Paige
00:39:17 – 00:39:24
Yeah.
You know, they say when boys grow up, they don't stop playing with toys.
The toys just get more expensive.
Aaron
00:39:24 – 00:39:43
Not at all.
This is very much in dudes-rock territory, which is, like, on the bubble between dudes are doing something awesome and dudes are doing something stupid.
That's when you say, like, dudes rock.
And this is very dudes rock.
Like, oh, what if we, what if we got a laser cutter and piped it through this carbon charcoal filter and then vented it into the hallway?
Aaron
00:39:43 – 00:39:49
Like, yeah, hell yeah.
Let's do it, man.
Yeah.
It's kinda dumb, but kinda awesome.
Dudes rock.
Aaron
00:39:49 – 00:39:49
Did you
Paige
00:39:49 – 00:39:53
play a lot of eighties music when you were building this?
Was it like you and Dutch,
Aaron
00:39:53 – 00:40:30
what did I play?
As with every project I do, it was either Sufjan Stevens or Blink 182, which is like it makes up 90% of my music listening.
So I'm pretty sure I've got this weird tic where, when I'm in the zone or need to be, I'll listen to the same song on repeat for many days, if not weeks.
And I think that was a period where I was listening to Sufjan Stevens, a song called Saturn, which I think is, like, 3 minutes and 23 seconds long, and I listened to it for, like, maybe 2 and a half weeks straight.
Maybe eighties rock would have been better, but it felt like I was I was in the zone with, you know, Saturn there.
Aaron
00:40:30 – 00:40:31
So that was kinda cool.
Paige
00:40:31 – 00:40:42
Okay.
So I don't know if you know this, but for high performers, people who do a lot of things really well, that's actually a really common trait.
They will pick one song and they'll just listen to it on repeat.
Like for, like, I think
Aaron
00:40:42 – 00:40:49
Yeah.
I learned so much from you on Twitter because I think I tweeted recently that I did that and you were like, ah, this is what smart people do.
Paige
00:40:49 – 00:40:50
And I was like, yes,
Aaron
00:40:50 – 00:40:51
tell me more.
That's awesome.
Paige
00:40:52 – 00:40:56
Yeah.
No.
It's true.
It's, oh my god.
I hope I hope I'm not obnoxious on Twitter.
Paige
00:40:56 – 00:40:57
I know I am.
Paige
00:40:57 – 00:40:58
Oh, no.
No.
No.
Not at all.
Paige
00:40:58 – 00:41:07
Oh my god.
Alright.
So you were telling me about laser cutters and that.
What were some of those other, like, dudes rock, we're-gonna-do-this-awesome-build moments?
Aaron
00:41:08 – 00:41:10
As it relates to the robots?
Paige
00:41:10 – 00:41:11
Yeah.
Like, the laser cutters.
Aaron
00:41:11 – 00:41:31
When it came time to move the robots, it was like because we moved out of the 7th floor office and into a second floor office in another building because the company was growing.
Right?
And it was like, well, we really painted ourselves into a corner, haven't we?
We've got 4 robots in this little room, and they all weigh, you know, a 100 pounds each.
Like, how the hell are we gonna get these things out of there?
Aaron
00:41:32 – 00:41:53
And that was kinda like a dude's rock moment.
We built a little frame out of 2 by 4s such that we could hold the robots, the sheets of plywood, vertical, and we, like, shoved this frame into the back of a truck and cinched it down.
And then we had these 4 robots vertical in the back of the truck, and I'm like, he's driving the truck.
I'm behind with my flashers on.
Stuff's falling out.
Aaron
00:41:53 – 00:42:05
You know, pieces, servos are falling out.
I'm jumping out in the road and grabbing them.
I got the servo. And it was potentially illegal, unsecured cargo.
Again, not a police officer, not a member of OSHA.
Don't super care.
Aaron
00:42:05 – 00:42:09
But it was awesome, and we made it.
And it was like, hey.
That was cool.
Wasn't it?
Paige
00:42:09 – 00:42:16
You're doing this in Dallas.
Dallas is a city where people do not stop for pedestrians.
No.
No.
No.
Paige
00:42:16 – 00:42:21
No.
No.
It's speed demon city.
Okay.
I don't actually wanna know any close calls.
Paige
00:42:21 – 00:42:23
That was stage 3.
Aaron
00:42:23 – 00:42:35
Yeah.
We reached the finality of the robots.
In stage 3, they were just totally rocking.
They were producing great output.
And then COVID happened and we all went home, and I haven't seen them since.
Paige
00:42:35 – 00:42:36
What is COVID?
Aaron
00:42:37 – 00:42:38
COVID, the thing I'm
Paige
00:42:38 – 00:42:39
sorry.
Paige
00:42:39 – 00:42:40
I'm just teasing you.
I know it's
Aaron
00:42:40 – 00:42:54
a glitch.
Yeah.
The thing that definitely happened.
So, you know, when that happened, we all went home, and it was like, alright, go home.
And then after that, we were a remote company, like, a 100%, and we never went back.
Aaron
00:42:54 – 00:43:08
And so the robots rested there for a while, and then, you know, I have since left the company.
I think another one of the employees took them home to their house, and I think we're gonna try to get them up and running for the next marketing season.
Paige
00:43:08 – 00:43:26
Yeah.
That happened to me with my Misty Bot.
When COVID hit, they shut down the learning lab because apparently schools in COVID had a big shutdown.
So Misty Bot, the last day I saw her, she was locked in a cabinet with my cherry mash bars.
And I'm no longer in North Carolina and I'm never going to go back to North Carolina.
Paige
00:43:26 – 00:43:32
So, yep.
Goodbye, Misty.
I know.
Yeah.
So have they been trying to get you back to their company?
Aaron
00:43:32 – 00:43:50
I do a little bit of work with them and advise them on architecture, because I built, you know, the whole software system, not the robots.
Well, I did build the robots, but that's, you know, not what they're asking about.
So I still I still advise them a little bit.
But, yeah, there's some desire to fire up the robots again, because one, they're awesome, and 2, like, they work.
They they work really well.
Aaron
00:43:50 – 00:43:55
And so, yeah, there's potentially a future there where I do some more robot work.
Paige
00:43:55 – 00:43:59
Very interesting.
I have some questions about the robot and then just general questions about automation.
Aaron
00:44:00 – 00:44:00
Please.
Paige
00:44:00 – 00:44:07
Before our interview, I read this article in Fast Company about the competitive industry of robots writing letters in human handwriting
Paige
00:44:08 – 00:44:08
Mhmm.
Paige
00:44:08 – 00:44:09
There are quite a lot of them.
Aaron
00:44:09 – 00:44:09
There are.
Paige
00:44:09 – 00:44:14
Why didn't you just use one of them?
And how do you feel like your robots competed with those robots?
Aaron
00:44:15 – 00:44:56
The reason we didn't use one of the services, of which there are many (some have since gone bankrupt since we were looking), is that there was a period of time where the owners of the company that I was working at wanted to turn this discrete thing into a business.
They wanted to be the handwriting company.
The economics are just super tough for a handwriting robotics company because let's look at the use cases.
If you wanna do wedding invitations, the handwriting has to be absolutely perfect because you don't wanna pay $5 for a wedding invite and have the handwriting be all wonky.
And so that already raises the production bar to a level beyond which I was comfortable because our handwriting was not perfect.
Aaron
00:44:57 – 00:45:28
And then you have to go out and find these people that are willing to pay $5 a card, which is a lot of money just for the handwriting part, to say nothing of, like, the paper and the postage or anything.
And so the economics of running it as a business never made sense to me.
And then from the consumer side, the economics of paying someone else to do it didn't make sense to me because it was $5, and what you would get out of it was one single piece of stationery limited to, like, 80 or a 100 words.
And in my opinion, it looked fake.
And so I would study.
Aaron
00:45:28 – 00:45:39
I requested samples from all of these providers, and I would get them.
And then I would look at them, and I'd pass them around the office, and I'd have everybody look at them and be like, what do you think of this?
And they're like, it looks it looks fake.
And I'm like, yes.
I know why.
Aaron
00:45:39 – 00:45:57
And so everything that looked fake from all the other providers, I rolled back into our software to make it look more and more real.
And so I didn't wanna use somebody else's crappy expensive product.
Plus, I wanted to do it myself because that's so much more fun.
So, is that, you know, post hoc justification?
Maybe.
Aaron
00:45:57 – 00:46:00
But I like to think it was a good economic decision as well.
Paige
00:46:00 – 00:46:06
Very interesting.
So uncanny valley, not a place Aaron Francis wants to be in.
Aaron
00:46:06 – 00:46:08
Nope.
Not a good place to hang out in.
Paige
00:46:08 – 00:46:24
What is the value of authenticity to you specifically when working with robots, like with automation, do you feel like everyone cares about it as much as you?
I mean, at the end of the day, these people are getting letters about saving money.
Like that's, that's the goal for them.
I feel like
Paige
00:46:24 – 00:46:24
I've
Paige
00:46:24 – 00:46:40
gotten a few letters from property tax companies and I was more concerned with who is gonna save me the most money.
Some of the handwritten letters were clearly done by a robot.
I was like, this is not a person's handwriting.
Why was authenticity so important to you here?
Aaron
00:46:41 – 00:47:16
So, constrained specifically to the robotics discussion: because it makes more money that way.
That is not my answer in the rest of life, but constrained specifically to this question, the better, and by better I mean the more authentic it looks, the more people are gonna open the freaking letter.
And honestly, I just gotta get them to open it because once they open it, our odds go way, way, way, way, way up.
And so I just needed people to open the freaking thing.
So in terms of, like, authenticity, I guess I thought of that as a second order effect.
Aaron
00:47:16 – 00:47:33
The first thing I thought about is what is most effective.
What is the most effective thing that I can do to get people to open this and pay attention to us in their busy, busy life?
And so maybe 0% is how much I thought about authenticity for the robot.
I thought mostly about effectiveness.
Paige
00:47:34 – 00:47:45
Very interesting.
How long did it take for you to get the handwriting to a point where you were like, this looks real, this is gonna be really successful.
Like how tough is it to actually generate handwriting?
Good handwriting.
Aaron
00:47:45 – 00:48:11
I did it in the most simpleton way possible, and it took me a long time.
It was very difficult and I did it in the way that a dummy would do it, which is not to generate handwriting, but rather choose characters from a library and try to put them together such that they look like they go together.
So I did a bunch of, you know, futzing around the edges to make what are called ligatures, where, like, 2 characters get really close together and then they connect.
Right?
So I did a bunch of that and, you know, a bunch of the other wonkiness.
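The glyph-library-plus-ligatures approach described here can be sketched in a few lines: store several hand-captured variants per character, pick one with a bit of randomness so repeated letters don't match exactly, and substitute a joined form when a known pair appears. The glyph names and data below are made up purely for illustration; this is not Aaron's implementation:

```python
# Toy sketch of "choose characters from a library" handwriting layout,
# with a ligature table for pairs that should connect. All glyph data
# is hypothetical placeholder content.
import random

GLYPHS = {                      # each key maps to stored stroke variants
    "t": ["t_v1", "t_v2", "t_v3"],
    "h": ["h_v1", "h_v2"],
    "e": ["e_v1", "e_v2", "e_v3"],
    "th": ["th_ligature"],      # joined two-character form
}

def layout(text: str, rng: random.Random) -> list[str]:
    """Pick a glyph variant per character, preferring ligatures for pairs."""
    out, i = [], 0
    while i < len(text):
        pair = text[i:i + 2]
        if pair in GLYPHS:              # use the connected form when we have one
            out.append(rng.choice(GLYPHS[pair]))
            i += 2
        else:
            out.append(rng.choice(GLYPHS[text[i]]))
            i += 1
    return out

print(layout("the", random.Random(0)))  # ['th_ligature', <one of the e variants>]
```

Varying which stored variant gets picked is one cheap way to chip away at the "it looks fake" problem: identical repeated letters are one of the tells the samples from other providers gave off.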
Aaron
00:48:12 – 00:48:35
But in terms of, like, generating handwriting from whole cloth, it's just really hard.
And I think with some of these AI models or whatever, people have gotten a lot further.
I didn't get very far, and it was really hard.
It took me a long time, like many, many weeks, sometimes split across many months to be like, well, I have realized now that this sucks, and I have to completely reinvent it from scratch.
So it's hard.
Aaron
00:48:35 – 00:48:45
It's very hard.
And I didn't even I didn't even get very far.
I got to a very good passable state, but not something that a computer scientist would look at and say, what a novel implementation.
I just never got there.
Paige
00:48:45 – 00:49:05
So you say that, but I feel like most of the use cases I see for AI and for automated and robot generated things, it's kind of like that.
It's like, if I were a teacher, I would give it 70%.
Like it's good enough.
If I'm not looking closely, it looks real.
But once you kinda look deeper, you're like, oh, this is fake or, oh, this is generated by a robot.
Paige
00:49:05 – 00:49:20
Maybe sometimes it really does look genuine.
For Climate Detectives, we have about 13 cards that we made using AI.
Mhmm.
I think there are maybe a little bit more.
And there are some that students genuinely cannot tell apart, but those are few and far between.
Paige
00:49:20 – 00:49:23
Most of them, you can look and go, oh, this looks AI generated.
Paige
00:49:23 – 00:49:23
Mhmm.
Paige
00:49:23 – 00:49:26
How long do you think we have until that's no longer the case?
Aaron
00:49:26 – 00:49:40
Gosh.
I have no clue, but it can't be that long.
I feel like this generative AI stuff is I don't know if it's a bubble.
I think it's over invested, and I think it's overhyped, which maybe is the definition of a bubble.
What do I know?
Aaron
00:49:40 – 00:50:01
It is continuing to get better, I will say, but I am slightly bearish, so slightly negative, on the idea that the last 5% is linear.
And so what I mean by that is, like, any project you have ever done so you made a game called Climate Detectives.
Climate I
Paige
00:50:01 – 00:50:03
just opened for business, www.medicariousgames.com.
Aaron
00:50:04 – 00:50:15
Yes.
Exactly.
That's what I meant to say.
So the first, like, when you're building when you're creating that game, the first 80% of that game took you a year to do.
The last 20% of that game took you a year to do.
Aaron
00:50:15 – 00:50:31
That's how every project goes.
Right?
The best part about being 90% done with the project is you're halfway finished.
And so my postulation is that we're 90% of the way there on generative AI.
And I think the last 10% is gonna be really, really, really hard.
Aaron
00:50:31 – 00:50:48
And I think we're gonna see diminishing returns as we approach a 100% fidelity.
So you look at these video generation models, and it's like, well, that woman just turned into a dog.
I don't think the AI really understands what's happening here.
It's like, it knows what a woman is.
It knows what a dog is.
Aaron
00:50:48 – 00:51:03
It knows what a lamppost is, but somehow it doesn't know that those are 3 discrete things.
And each frame of the video is like, and now that person has 3 heads.
And you're like, there's something fundamentally wrong here.
So I don't know if we ever, I think we'll get there, but I don't know how long it'll be.
Paige
00:51:03 – 00:51:06
How do you feel about that, that we might eventually get there?
Aaron
00:51:06 – 00:51:26
Oh, that's outside of my circle of control.
I feel null about it.
Neither bad nor good.
The thing that I care the most about as it relates to the changing industry is how do I make enough money to provide for my family?
So can I change the world?
Aaron
00:51:26 – 00:51:41
Maybe, but that's not my goal.
My goal is not to change the world.
My goal is to make sure that I am fulfilling my duty as a husband and father and make enough money such that my kids can go to school and eat.
So I don't know.
I'm just trying to stay ahead of the wave.
Paige
00:51:41 – 00:52:00
Well, if I can say, aside from the conversation we're having about AI, I feel like by being a good husband and father of 4 kids and raising them all, you're doing more to change the world than a lot of these people running AI startups on, oh my God, I don't even know what people are doing now. But just for what it's worth, I feel like that is changing the world more positively: creating more people.
Aaron
00:52:00 – 00:52:01
Well, thank you.
Paige
00:52:01 – 00:52:13
Kinder people.
Better people.
All right.
So this is gonna come on later, but I might as well ask it now.
How do you feel about the idea that your 4 adorable children could grow up to have a robot Aaron to talk to when you're busy?
Paige
00:52:13 – 00:52:17
Would you even let them?
And if you did, what are the implications of that?
Aaron
00:52:17 – 00:52:30
Absolutely not.
Never ever, ever in my life.
Not even a single chance.
And to the extent that that becomes common, I will encourage them not to do that after I'm gone, which will happen someday.
I mean, how deep do you wanna go?
Paige
00:52:30 – 00:52:47
I felt like, okay.
So you say it's in the future.
When I interviewed, I don't remember if I included this or if I cut it, but Jason Krom was telling me people have made robot, smart bot versions of him.
I interviewed this woman that makes AI clones of people.
I did not pull the trigger on making mine.
Paige
00:52:47 – 00:52:57
I'm the same as you.
It kind of freaks me out.
I'm like, I don't wanna put my whole self into this machine, but I could have, and it would have been really accurate.
It's here.
It's not a 100% here.
Paige
00:52:57 – 00:53:01
It's not 95% here, but it's, like, 75% here.
Paige
00:53:01 – 00:53:01
Mhmm.
Paige
00:53:01 – 00:53:06
You can make a version of yourself that people can talk to for 2 hours before they realize it's fake.
Aaron
00:53:07 – 00:53:26
So I understand that people are doing that.
I think that that is an abomination, frankly.
I don't think there I don't think there are words strong enough to describe how bad I think that is.
So let's take the example of my kids talking to an AI, Aaron, because I'm either busy or dead.
Right?
Aaron
00:53:26 – 00:54:12
And both of those things happen or will happen.
If my kids need to resort to talking to an AI because I'm too busy, I am failing, fundamentally failing, as a father, and I'm using some AI to assuage my guilt, and that's the disaster.
That is a terrible, terrible idea because a shadow of the thing is not the thing.
No matter how much you train an AI on my output, which is the only thing you can train it on, you can't train it on my inner thoughts because my inner thoughts, by definition, are inner.
There's enough content of me publicly that somebody could just go and train an AI on everything that I have ever said or ever written through all of time, and it still wouldn't be me.
Aaron
00:54:12 – 00:54:34
It would be a hollow impression.
It would be a shadow.
It would be a mere imitation.
The thing that my kids cannot get from an AI is the love of a father or, like, adapting to changing times and changing personalities of my children.
Like, what is an AI gonna know about the gentleness of my son or the robustness of my daughter?
Aaron
00:54:34 – 00:55:04
Like, they're not gonna know anything about any of that.
I think there are 2 sides of it.
One, for me as a dad, to say there is a stopgap, which is the AI, so I can just keep working in utter, complete abdication of responsibility, which I think is a moral failure.
Two, for the poor children to talk to this AI as if it's me, but it's not, and that is heartbreaking to me.
And to take it further, I'm going to die.
Aaron
00:55:04 – 00:55:44
Let's not even say that they're doing it.
If someone decides to keep me around in an AI form, that prevents them from grieving properly, which is hard but correct.
Like, it is a good thing to grieve the things that we have lost. And if you keep an AI of your dad around because you miss him, well, there's nothing more human than missing someone that you love.
But to keep them around after they're dead in this shadow form is just an abomination, and it prevents the actual process of being a human, which includes grief.
And you have to get through that, and that changes you fundamentally as a person.
Aaron
00:55:45 – 00:56:03
Grief does.
And I don't think it's healthy to face something that's very difficult, the loss of a parent, and say, I'm not in fact going to face it.
I am going to construct an artifice to which I can pray.
It's like, no.
That is a bad thing for your soul and you have to approach it head on.
Aaron
00:56:03 – 00:56:12
And so this whole thing. I've seen adults say this:
I really miss my dad.
And so I trained this AI.
I'm like, man, you've got work.
Like you've got work to do that you're avoiding.
Aaron
00:56:12 – 00:56:14
And I think it's better that you do the work.
Paige
00:56:14 – 00:56:18
Very interesting.
There won't ever be a Stepford Wife, Aaron.
No, go ahead.
Aaron
00:56:18 – 00:56:22
Not if I have control over it.
No.
No.
Unfortunately not.
Paige
00:56:23 – 00:56:36
Yeah.
I would say I am in agreement with you on this.
I have the same gut reaction to the idea of creating a fake version of myself.
The AI woman I was talking about, I'm allowed to delete the content anytime.
I fully trust her with it.
Paige
00:56:36 – 00:56:50
I think uploading, I'm sure I could get rid of it, but there is just something that stops me from doing it.
I have the same hesitations.
The more I talk to people in tech and in the world, I feel like people are in one of 2 categories.
Either they are so gung ho.
They're like, yeah.
Paige
00:56:50 – 00:57:06
I would use ChatGPT to do my homework.
I would love to have a scheduling assistant, you know, outsourcing and automating as much as they can.
And then there are other people who are like, I'm happy to automate assembly line type work.
Right?
Like if I could get something to do the dishes for me, that's fine.
Paige
00:57:06 – 00:57:19
I'm not spiritually connecting to the dishes.
Mhmm.
The idea of taking any part of my creativity of my humanity and putting it to a machine, I don't wanna do that.
Why do you think that is?
What do you think is separating these two groups of people?
Aaron
00:57:19 – 00:57:27
Oh, goodness.
It is hard for me to say without sounding like I think I'm better than the other group.
Paige
00:57:27 – 00:57:35
Say it.
I can always censor it later.
The only rule I really have on the show is just say everything you think.
You get full approval of everything released.
Aaron
00:57:35 – 00:57:53
No.
I wouldn't want you to hold anything back.
I'm just trying to figure out how to put it. So the question is, why do these people fall into these different camps?
I think it's probably more on a gradient than, like, a strict dichotomy.
Certain people are okay with certain shades of automation, and certain people are not okay with others.
Aaron
00:57:54 – 00:58:36
I think there's a pretty common thing amongst tech people that technology is the solution to all of our problems.
And to a degree, I believe that that can be partially true.
I think technology can solve all kinds of problems.
I think if we, as humanity, had energy that was free, not like too cheap to meter, just free, like somehow we got free energy forever, I think that would solve a lot of the world's problems.
I think, however, once we have free energy and we can desalinate water and we can hydrate deserts and we can change the face of the planet, at the end of the day, you're left with humans.
Aaron
00:58:36 – 00:59:09
The problems are not purely technological.
The problems are, how do I, as a human, relate to other humans?
And there's always gonna be strife and conflict and things to work through and things to work around and things to grieve and things to rejoice.
And I think it's myopic to suggest that the only problems we have are problems of technological nature.
I think the real problems that we have are problems of human interaction and internal emotional problems.
Aaron
00:59:09 – 00:59:32
Like, how do I, as a dad, relate to my 4 little kids?
There's no technology that can help me or tell me what to do.
I have to put in hard labor to figure out what does my 3 year old son need versus what does my 3 year old daughter need?
What does my 8 month old son need versus what does my 8 month old daughter need?
Those are actual genders and ages.
Aaron
00:59:32 – 00:59:41
I do have 2 sets of twins.
And so there's, like, a lot of work that needs to be done on a human level.
And so am I a techno optimist?
100%.
Yes.
Aaron
00:59:41 – 00:59:55
Do I think technology is gonna, like, rid us of problems and allow us to live in a state of peace and tranquility on the earth?
No. I just don't think so.
It can make our lives better, but it's not gonna solve our problems.
Paige
00:59:56 – 01:00:03
Very interesting.
Oh my God.
There's, there's so much I wanna know now about what you think about everything.
Okay.
Right.
Paige
01:00:04 – 01:00:16
So, you know, we're talking about abominations, and I agree with you.
There's just like this, like, level.
It's weird.
I don't even know if I can put it into words.
There's like a, I'm okay with this, and then, I'm not okay with this.
Paige
01:00:16 – 01:00:30
And it's just a feeling.
So there was a restaurant in Austin called Lucky Robot that when it first opened up, everything was automated.
So the idea was that there would be no waiters.
There would be no waiters at all and you could just order.
I did not like that.
Paige
01:00:30 – 01:00:40
I went one time and I was like, I'm disgusted by this.
I don't know why.
And it was so poorly received that they had to add wait staff.
That's an example of a gut reaction.
I don't like this.
Paige
01:00:40 – 01:00:44
My plants outside, I have an automatic watering system.
Paige
01:00:44 – 01:00:44
Cool.
Paige
01:00:44 – 01:00:44
That does
Paige
01:00:44 – 01:00:52
not fill me with disgust.
In fact, I would like all my plants to be on an automatic watering system.
It is Texas.
It's very hot.
Some of those babies are gonna die.
Paige
01:00:52 – 01:00:59
Sorry.
Do you have a take on, like, why that is?
Why do some things feel okay and some things don't?
And it's not always human related.
Okay.
Paige
01:00:59 – 01:01:08
Like ChatGPT writing.
I'm learning something new for work.
And one of my friends was like, oh, you could just use ChatGPT to check out.
I was like, I don't wanna do that.
Paige
01:01:08 – 01:01:21
There's no consequence that will come from that.
I'm not technically cheating.
But if I were in school now, I would not wanna use ChatGPT for anything. It's like SparkNotes.
I never used SparkNotes in high school or college.
There was just something about it that was so gross to me.
Paige
01:01:21 – 01:01:31
And I'm like, I cannot even put my finger on it.
Please describe this feeling to my listeners better than I just have because you probably have a better handle on it than me.
Aaron
01:01:32 – 01:01:41
I don't know if I have a better handle.
I think I maybe have, a different perspective.
I agree.
I agree with you.
Just quickly.
Aaron
01:01:41 – 01:01:52
One interesting thing about Lucky Robot is the market also agreed with you.
Everyone agreed with you.
And they were like, this is weird.
Give us some waitstaff.
And so the broad market does have power.
Aaron
01:01:52 – 01:01:59
And that's like the collective voice of the people saying, hey.
We're not gonna come to your restaurant because it's freaking weird.
And then everybody's like,
Paige
01:01:59 – 01:02:01
okay.
We gotta fix the restaurant.
Aaron
01:02:01 – 01:02:15
So I love that.
The question about "I don't wanna use ChatGPT for that."
I think, personally, that is a question of focal point.
Like, what are you focusing on?
So in my line of work, I do a lot of reading.
Aaron
01:02:15 – 01:02:26
I guess you could say I'm a developer educator.
I spend a lot of my time teaching other people how to be software developers.
Love it.
Find it so fun.
Absolutely enjoy it.
Aaron
01:02:26 – 01:02:47
As a part of my work, I am focused on being able to teach people how to do the thing.
Right?
So that is the thing on the horizon upon which I am focused.
How do I teach someone the best way to do this?
Along the line for me to that focal point on the horizon is a set of things that I have to do to get there.
Aaron
01:02:47 – 01:03:21
And in my case, it is most effective and best, I think, in my opinion, for me to read books.
The best thing for me to do to get to where I want to go is intensely read books, intensely scour the Internet, looking for what smart people have written on their little blogs that are, like, lost in the corner.
Go find all of that.
Go to forums and read the freaking comments from 12 years ago to see what these people who are way smarter than me know.
And in the course of that, you know, everything I do is marketing.
Aaron
01:03:21 – 01:03:36
And so I'm like, well, you know, I've got my stack of books, and I tweet out a picture of the stack of books.
Right?
Every time I tweet out, hey.
I'm reading the books, or I printed out the documentation for this particular tool, and I'm reading it.
Look how interesting this is.
Aaron
01:03:36 – 01:03:47
Without fail, someone comes back and says, what an idiot.
Look at this boomer.
We have ChatGPT now.
And I am sensitive to that as a rebuttal.
I think it's silly.
Aaron
01:03:47 – 01:04:08
I think it's a little bit foolish.
I've written on my personal site about why I think reading the docs is better than talking to ChatGPT.
But, again, it goes back to, like, what am I focused on?
I need to be able to teach this to another human, and so I need to understand it extremely deeply.
I think this is the key.
Aaron
01:04:08 – 01:04:30
I take a lot of pride in the process of learning and teaching.
That is my craft.
Now there are other cases where the thing that I am focused on is not really teaching people.
It's getting a certain thing done.
So there have been times where I'm, like, trying to write scripts to convert video from one format to another.
Aaron
01:04:30 – 01:05:02
And in that process, the only thing I care about is getting the job done.
On that vector that I'm headed towards getting the job done, ChatGPT is awesome, because it's like, hey, I have a discrete question.
You can give me a discrete answer, and I can move on with the thing that I am trying to do.
And so I think for me, it's a question of what part of the process do you take pride in?
And in the case of, like, learning and teaching, I take pride in understanding deeply such that I can offer nuance to the people that I'm educating.
Aaron
01:05:02 – 01:05:16
Because if I don't offer nuance, then they just get a set of these hard line rules that I've made up in my head.
Right?
In the process of trying to get something freaking shipped out the door of this converted video, I don't really care.
I just need the answer.
And so I go ask ChatGPT.
Aaron
01:05:16 – 01:05:53
So I think it depends on what you're focused on, like, what outcome you want and where you define your craft.
And so, like, if I was in college and learning accounting, which is what I studied in college, I wouldn't ask ChatGPT for fundamentals.
I might ask ChatGPT for clarifications of fundamentals.
I have to have my own web of knowledge inside my head such that when future things fly past my brain, there's a place for that to get caught and connect and solidify in my brain.
If I'm constantly just getting answers to discrete questions, I'm not forming a lattice of information in my head.
Aaron
01:05:53 – 01:06:07
I'm getting the answer.
I'm writing it down.
I'm moving on.
And the next time something related to that comes into my brain, there is no framework that has been constructed for that information to be caught, and it just flies by me.
And I don't get smarter, and I don't like that.
Paige
01:06:07 – 01:06:22
Yeah.
Very, very interesting.
So I do have a question for you about that, that you are a very caring father.
It is a huge part of your life to make sure that your kids grow up happy, healthy, and successful.
I remember when I was growing up, my mom had this huge emphasis on not cheating.
Paige
01:06:22 – 01:06:33
Right?
Like you go to school to learn, you're not supposed to cheat.
So I didn't cheat.
And I remember being in high school, basically a college prep school, a lot of people around me were cheating.
I felt like I was at a disadvantage.
Paige
01:06:33 – 01:07:01
And honestly, I was, I was making lower test scores because I was making them honestly.
This problem is already majorly exacerbated by ChatGPT and AI.
There are tons of students who are cheating very easily at home and, you know, teachers, I feel like were initially able to catch onto it and now it's getting harder and harder.
What will you teach your kids about education if they're seeing all of their friends relying on AI, right?
Like, what are you gonna do?
Paige
01:07:01 – 01:07:05
Has that entered your mind?
How are you gonna raise them to use AI?
Aaron
01:07:05 – 01:07:25
The argument that all of their friends are doing it is lost on me because, frankly, you don't wanna be like the average person.
Right?
Especially with what our family holds as religious beliefs, which is, you know, Judeo-Christianity, you're gonna be weird.
That's promised in the Bible.
You're gonna be an outcast.
Aaron
01:07:26 – 01:07:48
You're not going to fit in.
And so if an argument is ever brought to me that I need to do this because everyone else is doing it, just like every other parent in the world, I am unmoved by that argument.
And I think teaching our kids that, one, let's hold religion aside:
You don't actually wanna be like everybody else.
That's not how you get ahead.
Aaron
01:07:48 – 01:08:15
If you're just purely self interested in getting ahead, doing what everyone else is doing, it's not really the best way to get there.
So that's one thing that I believe.
The second thing that I believe about using AI to cheat is cheating is always wrong.
And I often have a hard time with this: regardless of whether the rules are smart or stupid, the rules are supposed to be followed.
Aaron
01:08:15 – 01:08:42
And I would say that that is for, like, rules from authority, not like rules from society.
Right?
So rules from society are you should get a job that is safe and follow that path until you retire.
That's a rule from society, which I think is stupid, and you should break it.
A rule from the authority is the teacher has instructed you not to use ChatGPT on this, and therefore, you will not use ChatGPT on this.
Aaron
01:08:42 – 01:08:58
Whether you think that's stupid or not, that is the rule.
And as a student, you are under authority.
As a citizen, I am under authority.
As believers, we are under authority, and it sucks.
Like, I would love to be in charge of everything all the time.
Aaron
01:08:58 – 01:09:20
I would love to be God.
I'm not.
So you have to follow the authority.
I think the third thing is frankly, like, back to just cheating: there's a certain part of cheating that betrays yourself more than it gets you in trouble.
It's like you could have done the hard and noble thing, and you took the easy way out.
Aaron
01:09:20 – 01:09:50
And if you do that enough, you're never gonna do the hard and noble thing.
And, forget all sorts of religion, forget anything, it is better to be noble.
It is better to bear up under your duty than it is to shirk your responsibility.
In any movie, in every piece of literature, everything that has ever been written from the beginning of time, the hero is the one that faces the difficult odds and bears up under it and tries.
They don't always succeed, sometimes they fail.
Aaron
01:09:50 – 01:10:13
That's fine.
I don't really care about success or failure.
What I care about is did you have the strength of character to say this is difficult, and yet I persist.
That's all I care about.
If you don't get into any college because you refuse to cheat, and no one ever listens to that on your admissions applications, I am proud of you.
Aaron
01:10:13 – 01:10:33
I am so proud of you.
And I think you, as the child, are going to be better off if you have learned there is a responsibility to be noble.
There's a responsibility to be pure and to bear up under things that are hard.
That's a good lesson.
And then I think the next thing is, alright.
Aaron
01:10:33 – 01:10:38
What are the rules, and then what is reality?
So the teacher says you can't use ChatGPT.
Paige
01:10:38 – 01:10:38
Oh,
Aaron
01:10:39 – 01:10:53
dang.
That sucks.
The reality is the entire world is learning how to use AI, and you are following the rules and not using AI.
Okay.
How do I, as a parent, ensure that you are prepared for the real world?
Aaron
01:10:53 – 01:11:03
Because the real world, when you get to a job, what does the job care about?
Outcomes.
And so they don't care.
They don't care if you used AI, in most cases.
And so if you get to a job and you're like, hi.
Aaron
01:11:03 – 01:11:10
I've been pure and noble, and I've never used AI, and they're gonna be like, okay.
Well, we use AI all the time.
You can't work here.
Oh, shoot.
I failed.
Aaron
01:11:10 – 01:11:29
So then I think the final thing is how do I, as a parent, ensure that I'm raising my kids to be prepared for the real world while also making sure that they are taking the noble and sometimes difficult path?
But it's still incumbent upon me to make sure that when they get out into the real world, they're not Luddites or rubes.
Aaron
01:11:29 – 01:11:46
They're ready.
Like, they're ready to beat everyone and have success for themselves.
So I don't know how we get there, but I think it is important for me slash parents to think about how do you prepare your children for a world in which we did not grow up.
That is gonna be the hardest thing.
Aaron
01:11:46 – 01:12:05
I did not grow up in a world that had AI readily available.
I grew up in a world that had the Internet readily available, and my parents had to adapt.
And they did a great job, and I don't know how I'm gonna adapt to a changing world.
What advice is an old man gonna have for a young kid in a world that has left him behind?
I don't really know, but hopefully, I'll figure it out.
Paige
01:12:05 – 01:12:07
Being a parent sounds like a ton of pressure.
Aaron
01:12:07 – 01:12:18
It is a massive amount of pressure, and the responsibility of a parent is to bear it for the sake of their children.
And so that is what I'm trying to do.
Paige
01:12:18 – 01:12:44
Yeah.
There are many questions and things I wanna say, but first, actually, I have a cheating story that might make you feel better.
It's kind of relevant.
When I was in 6th grade, I let this girl copy off my science homework and I got caught by the teacher.
The girl who copied off my homework received no punishment and I was given detention for a month and I had to take a pink slip home for my mom to sign, who asked me if I was doing drugs.
Paige
01:12:44 – 01:13:07
That was the only time I cheated because I got such a, it was just like such a heavy, embarrassing punishment from an early age.
Like, honestly, a lot of what stopped me from like academic dishonesty when it was really, really, really, really tempting was just that feeling of, I never wanna be embarrassed again in that way.
I never wanna have to have my mom sign something and have her ask me if I'm doing drugs.
Like just not
Aaron
01:13:07 – 01:13:17
Because you knew, in your core, that that was wrong, and you should be embarrassed about it.
That's what I'm saying.
Like, you knew that. Not everyone does.
Paige
01:13:17 – 01:13:27
Yeah.
What was wild is it was, like, a completion assignment.
It was like, look at the textbook, write answers down.
It was like a homework assignment that took me, like, 2 minutes.
Paige
01:13:27 – 01:13:34
And I let this girl copy off of it just because she was like, hey, Paige.
I didn't do my homework.
Can I look at yours?
I didn't even think anything about it.
I was like, sure.
Paige
01:13:34 – 01:13:39
I'm 11.
I wanna have friends.
Okay.
Where am I even going with that?
Okay.
Paige
01:13:39 – 01:14:01
So your kids, you're trying to prepare them for a world you did not grow up in with AI.
You have more knowledge about artificial intelligence and how to use it properly and honestly than a lot of people.
What would you suggest parents teach their kids about how to use AI in ways that are healthy and sustainable and good, and that will increase learning?
Aaron
01:14:01 – 01:14:05
I am wholly unqualified to give other parents advice for small...
Paige
01:14:05 – 01:14:07
That's what all good parents say.
Aaron
01:14:07 – 01:14:27
I think parenting is theoretical until you have kids, and then it becomes relational.
Like, I have 2 sets of twins.
No 2 twins are the same.
Certain people, like, certain kids need different things.
Something that we have done is our 3 year olds have never had a screen, have never watched television, have never looked at an iPad.
Aaron
01:14:27 – 01:14:52
They look at pictures on mom and dad's phones of, you know, cousins and aunts and uncles and stuff, but only when we decide, and that's, like, you know, once or twice a week or something.
It's just not very often.
So at this point, our kids are just total Luddites, and I don't begrudge someone for making a different decision.
Like, we have decided a lot of things.
I think most of the things that we have decided are within the bounds of reasonable.
Aaron
01:14:52 – 01:15:05
I think it's one standard deviation, at least, away from average.
I don't think we're, you know, 4 standard deviations out into crazy territory.
But, you know, other people might disagree and say, oh, yeah.
I cannot believe you're not letting your kids learn how to use an iPad.
It's like,
Paige
01:15:05 – 01:15:06
I know.
Aaron
01:15:06 – 01:15:29
Isn't that crazy?
We're trying to do the best we can.
So I don't know because between now and when that becomes a reality in our kids' lives, the world will have fundamentally turned over once again.
I mean, in the next 2 or 3 years, we could literally have humanoid robots in our house doing laundry.
I actually believe that, and I don't know what to do about that.
Aaron
01:15:29 – 01:15:41
I really don't know what to do about that.
And so what should other parents do?
Love your children dearly and try as hard as you possibly can to raise them well.
And beyond that, go ye with God.
I have no answers.
Paige
01:15:42 – 01:16:06
Very interesting.
Well, if it makes you feel better about the whole iPad thing: the way brains work, up until about age 5, your dopamine receptors are still developing.
So your kids aren't going to have screen addiction problems. Like, you could start around 6 or 7 and they would not be addicted to it.
They would not be addicted to notifications.
Your kids are gonna get older and they're gonna have attention spans.
Paige
01:16:06 – 01:16:08
That their peers will not have.
Aaron
01:16:08 – 01:16:08
It's my hope.
Paige
01:16:08 – 01:16:09
The way
Paige
01:16:09 – 01:16:22
I would describe it is like, okay.
When iPods first came out, when you put, like, the headphones in your ears, it really messed up hearing.
There are a lot of people my age now, we watch movies together.
They need to turn the volume up.
Paige
01:16:22 – 01:16:22
Mhmm.
Paige
01:16:22 – 01:16:35
I think that's how attention spans are gonna be in the future.
Maybe like 20% of the population are gonna have really good attention spans.
And, I don't know.
It's kind of sad because you can't really control it.
And I understand why parents use screens. Like, it is hard to be a parent.
Paige
01:16:35 – 01:16:37
I don't know what I would do if I.
Aaron
01:16:37 – 01:16:38
Yeah, it is.
Paige
01:16:38 – 01:16:50
What do you think about the whole having human robots in your house doing chores for you?
Are you ever gonna let that happen?
Like, are you ever gonna welcome robots into your house?
And also things like self-driving cars, Teslas.
Mhmm.
Paige
01:16:50 – 01:16:52
Are you gonna let those be a part of your life?
Aaron
01:16:52 – 01:17:07
100% yes, without a doubt.
Provided some, you know, assurance of safety.
You know, they're not gonna, like, either blow up or, like, decide to, you know, break one of my bones in half or something.
Provided that they're safe.
Yeah.
Aaron
01:17:07 – 01:17:39
100%.
I will have robots in the house doing stuff I don't wanna do.
I don't think that there is much nobility in doing menial chores.
I think to the extent that you have duties, you should fulfill them.
So to the extent that, like, I can serve my wife by cleaning the house because she's been at home with the kids all day, and I've been at work, and I can come home and help her, it is incumbent upon me to fulfill my role as a husband, which is to serve my wife.
Aaron
01:17:39 – 01:17:56
That is my role as a husband.
If a robot came in and cleaned the house, that's great.
I would 100% welcome that.
There are other ways that I can serve my wife, and there are other things that need to be done that a robot can't do, which would be, like, parent my children.
You know?
Aaron
01:17:56 – 01:18:14
So what I do now is I go home.
I help with dinner and bedtime.
My day is gone.
I get home at 6 o'clock and by the time everything is done, it's 8:30 and embarrassingly, I'm exhausted.
I'm like, this has been a very long day.
Paige
01:18:14 – 01:18:16
And I'm talking.
Aaron
01:18:16 – 01:18:19
So early.
I'm a thousand years old, and it's so early.
Paige
01:18:19 – 01:18:20
No.
You're
Aaron
01:18:20 – 01:18:31
fortunate.
Imagine I can come home, and there are no chores, and the only tasks before me are relational.
That's great.
I love that.
I would 100% love to do that.
Aaron
01:18:31 – 01:18:45
Do I take pride in driving my 4Runner around the streets of Dallas?
No.
I would love to get in my car and have it drop me off wherever I want to be.
That's, I mean, driving as entertainment? Sure.
Driving as leisure, whatever.
Aaron
01:18:45 – 01:18:52
But, like, that doesn't move me.
I just don't wanna drive.
I don't wanna drive.
I don't wanna mow the lawn.
I don't wanna do dishes.
Aaron
01:18:52 – 01:19:01
I don't wanna do laundry.
Let a robot do that, and let me do the things that a robot cannot do, which is love and care for my family.
Paige
01:19:01 – 01:19:17
I really like that answer.
I think it is really beautiful.
And, I mean, I feel like that is how you will use AI.
I could see a future in which kids maybe get used to relying on robots or talking to them like personal servants.
You're gonna raise your kids to be nice to the robots.
Paige
01:19:17 – 01:19:24
Are they gonna have to say please and thank you?
Because, like, you don't say please and thank you to your dishwasher, but if you've got something that's humanoid...
Aaron
01:19:24 – 01:19:26
That's a machine.
Yeah.
There's a difference.
Yeah.
Paige
01:19:26 – 01:19:30
But a humanoid robot is a machine.
It doesn't have a soul.
Aaron
01:19:30 – 01:19:47
Yeah.
It doesn't have it doesn't have a soul.
I agree.
But there is something fundamental when it's, like, anthropomorphized into the shape of a human.
Even ChatGPT has the tone and tenor of a human, and so it's a little bit anthropomorphic where it's like, that is a person.
Aaron
01:19:47 – 01:20:24
I was raised yes, ma'am, no, ma'am, yes, sir, no, sir, everywhere all the time no matter who it was.
The question to will they treat the robots well?
I hope so.
And I think I would raise them to treat the robots well, because I think it is hard in the heart and soul of a little kid to understand that I can be a horrible, terrible person to this robot, but not to other, like, actual humans.
And so to have that darkness stoked, where you can kick and hit and scream at, like, this robot, because it's not a human.
Aaron
01:20:24 – 01:20:41
Allowing that to flourish just because it's not a human feels wrong. Like, I don't care about the robot, because it doesn't have feelings.
It doesn't have a soul.
It doesn't have a heart.
But I care about my kid's heart being like, I can be dark towards this thing.
It's like, I don't know if I love that idea very much.
Paige
01:20:41 – 01:20:46
The only bad part about cell phones is when you have the big handheld phones, you can like slam them down.
Paige
01:20:46 – 01:20:47
You can't really do
Paige
01:20:47 – 01:21:00
that with cell phones.
If you throw a cell phone, that's, like, $800 to $8,000 down the drain.
That's really dumb.
So your kid kicks a robot.
It would be the same as them maybe kicking an animal, in terms of how you would respond, and then maybe just listen.
Aaron
01:21:01 – 01:21:02
Yeah.
Yeah, exactly.
Paige
01:21:03 – 01:21:09
Very interesting.
So when we talk about souls, have you read the book, Do Androids Dream of Electric Sheep?
Paige
01:21:09 – 01:21:10
Are you familiar with that book?
Paige
01:21:10 – 01:21:18
No.
You should read it.
I feel like you would really enjoy it.
Everything is automated.
There are so few real things left.
Paige
01:21:18 – 01:21:28
People start to value real animals.
It's a whole status symbol.
They're functionally the exact same.
Spoiler alert.
If you haven't read this book, I don't care.
Paige
01:21:28 – 01:21:38
I spoiled it.
Whatever.
Spoiling Soylent Green is people.
So, end of the book: there's a frog this woman has, like, prized as a real animal.
I think it's a frog.
Paige
01:21:38 – 01:21:45
It is revealed to be a robot.
Totally.
Yeah.
It's how they getcha.
So, functional differences between robots and humans.
Paige
01:21:45 – 01:21:49
You don't think robots could ever have a soul?
They could never be real.
Aaron
01:21:49 – 01:21:53
In my opinion, no.
That is incompatible with my religious beliefs.
Paige
01:21:53 – 01:21:54
Right.
Why?
Aaron
01:21:55 – 01:22:14
Because humans were made in God's image, and God breathed into their bodies the breath of life, as the Bible might say, does say, which we understand to mean having a soul.
And so in my opinion, a robot can never have a soul because that is not a human made in God's image.
Paige
01:22:14 – 01:22:38
So when I was a kid, I had this little stuffed bunny I named Carrot, and I would carry her around, and she was like my friend.
Like I would talk to it.
I'd have little tea parties with her.
It's very common for kids to connect with an object in that way up until about 8 or 9, even 11 sometimes. I stopped when I was 8.
What would you do if your child connected with a robot in that way?
Paige
01:22:38 – 01:22:47
Like, if they had a little transitional object, would you get rid of it and replace it with a stuffed animal?
Does that bother you?
Because kids will latch onto the weirdest things.
Aaron
01:22:48 – 01:23:16
Yeah.
I mean, I had a Furby when I was a kid, and I was like, this thing's alive.
The only thing that would cause me concern with a kid getting attached to a robot is who or what is behind the robot.
So, like, if the robot is a sneaky ploy to sell my kids more advertising, I wouldn't love that they got attached to that.
If it's online and constantly sending home conversations it's having with my children to some Facebook server.
Aaron
01:23:16 – 01:23:37
I don't super love that either.
And then the extent to which they become emotionally invested in it would be a case-by-case concern.
I think there's some amount of, like, imaginary friend that is fun and some amount of imaginary friend that is concerning, and that would be a case-by-case basis.
Paige
01:23:37 – 01:23:45
Yeah.
I mean, I had Neopets growing up, and I think it did give me a little training in how to take care of my dog.
Right?
Because it's like you forget to feed your Neopets.
You come back.
Paige
01:23:45 – 01:24:03
It's like starving.
It gets mad at you.
Luckily, my dog will let me know if he's hungry or not.
What do you think about robots teaching kids responsibility for real world things?
I know when you talk about like servers and sending data, I feel like that's more the case now than it used to be.
Paige
01:24:03 – 01:24:08
Personally, I would not really feel comfortable with my kid using social media or TikTok or anything like that.
Aaron
01:24:08 – 01:24:09
No freaking way.
Yeah.
Paige
01:24:10 – 01:24:13
Yeah.
It's spyware.
It's, like, weird that people have it on their devices.
Aaron
01:24:13 – 01:24:23
Yeah.
Robots teaching kids responsibility, neutral.
I don't know enough to say.
If it's viable, great.
Anything that can teach more responsibility, great.
Aaron
01:24:23 – 01:24:39
To the extent that it stands in for a parent's duty, bad.
Like, if it's doing stuff that I should be doing in terms of, like, teaching my children how to live in this world, I think that's bad.
If it's reinforcing stuff that we're doing, that's great.
But, yeah, basically neutral, I think.
Paige
01:24:39 – 01:24:50
So you've mentioned your duties as a husband and father a few times now.
What is your duty as a father and what lines do you draw?
So your kid is struggling with math homework.
Mhmm.
Maybe math is like, okay.
Paige
01:24:51 – 01:25:05
Kid is taking Chinese.
You don't have any background in Chinese.
Would that be something you'd be comfortable with a robot teaching?
Like, is stuff you don't know how to do something you're comfortable outsourcing?
Or are there things where you're like, I just don't ever want a robot to help my kid with their homework?
Paige
01:25:05 – 01:25:06
What's the line you would draw?
Aaron
01:25:06 – 01:25:42
I think a line I could potentially draw is it's my responsibility as a parent to teach my children how to emotionally handle difficult things, and it's my responsibility to develop character in them.
It is not necessarily my responsibility to help them learn the correct Chinese character stroke.
I don't know that.
Like, I can't help them with that.
I could hire a human tutor and feel 0% bad about it, so I can imagine having our family humanoid robot help them with the Chinese characters.
Aaron
01:25:42 – 01:26:14
However, I will say that when they sit down and they're crying because the math is too hard, which is a place that I have been as a 10th grader or maybe 11th grader.
Being, like, in pre-cal and not knowing what any of this stuff meant, I was just so frustrated I started crying.
Like, when my children get there, my responsibility as a parent is to take that and turn that into character and teach them, yes, things are very hard.
You are very capable.
All I'm asking is that you try hard.
Aaron
01:26:14 – 01:26:34
I do not care what the outcome is.
My goal for you is to learn how to try your hardest and do your best regardless of the circumstances.
I am proud of you no matter what.
I am proud of you if you try really hard.
If you try really hard and you bomb and you embarrass our family, I am so proud of you.
Aaron
01:26:34 – 01:26:48
I am so proud of you because you faced a hard thing and you tried, and that is wonderful.
That is my responsibility.
A robot can show them how to make the correct Chinese character.
I'm unmoved by that.
Paige
01:26:48 – 01:26:53
Yeah.
Father of the year award to Aaron.
Right.
We'll see.
Okay.
Paige
01:26:53 – 01:26:57
So I know we're coming up on time.
I just have 2 final questions.
Aaron
01:26:57 – 01:26:57
Sure.
Paige
01:26:57 – 01:27:07
So we've touched on your faith during this episode, and, you know, it is something that's so important and integral in your life.
When you think about your creator, your God, what do you think he thinks about you becoming a creator?
Aaron
01:27:10 – 01:27:26
I think that there are certain attributes of God that are communicable.
There are certain attributes of God that we as humans can be endowed with.
Creativity is one of them.
Omnipresence, being everywhere all the time all at once, is not one of them.
That's incommunicable.
Aaron
01:27:27 – 01:27:47
And so I think that creativity is an attribute of God and is communicable to humans.
I think work is pure and noble and ordained by God.
So if you read Genesis, which is the first book of the Bible, you get the creation story; some will call it the creation myth.
I'm unmoved by either term.
I don't care.
Aaron
01:27:47 – 01:28:05
But what you see is Adam and Eve in the garden.
And before the fall, which is when sin was originally introduced, there was work.
Adam was working the garden.
It is good.
It is pure.
Aaron
01:28:05 – 01:28:24
It is right.
It is noble to work.
After the fall, the work was accompanied by toil, and it was hard, and the ground did not yield fruit, and there were thorns.
And so even in a pure paradise, work is good.
Like work is a good thing.
Aaron
01:28:24 – 01:28:50
It is good to cultivate.
It is good to create.
It is good to subdue and to bring order to chaos.
Those are good things.
And so my sincere belief and my deep hope is that by exercising my creativity and my ability to create and to make, I am mirroring God in his creation of the universe.
Aaron
01:28:51 – 01:29:17
As an overflow of his own creativity, he created the universe from nothing.
And in my own way, I feel endowed with that creativity to make things out of nothing and look at it and say: that is the fruit of my labor, that is a gift to God.
That is a representation of the goodness that God has bestowed on me.
I have turned around, and I have made a thing.
And in that way, I honor him.
Aaron
01:29:17 – 01:29:34
And so I hope, and I believe the scripture would back this up, that creativity and the creation of things is in line with God's desire for a perfect earth and, in the future, a perfect heaven.
There will be work to do.
It is good to do work.
Paige
01:29:34 – 01:29:50
A totally off-the-wall question just popped up when you were talking about that.
I didn't send it to you, so no problem if you don't have an answer, but it is my last question.
So, you know, I'm Jewish.
I would say that I've over the past few years gotten a lot stronger in my Jewish faith.
Paige
01:29:50 – 01:30:18
There are some things I look at now and I don't know how I feel about them.
When you look at people paying for things with their palms, when you look at what is happening in different countries, and I'm gonna keep it as apolitical as possible for many reasons.
Sometimes I think about things I read in the Bible when I was growing up and I wonder, are we in end times?
Do you feel like we're living in end times, Aaron?
Like, is AI contributing to that?
Aaron
01:30:18 – 01:30:39
This is gonna be hot take territory.
This is not biblically sanctioned.
Here's my hot take.
I think the antichrist will be an embodiment of artificial intelligence.
That is my, like, off-the-wall, totally bizarre take, with very little support for it.
Aaron
01:30:39 – 01:31:08
But I personally think, yeah, this feels like we're getting close to the end times. But you know who else has thought that? Literally everyone in all of history.
And so I don't hold it very tightly.
It's more of, like, an idle musing of, man, woah, wouldn't that be crazy?
But yeah, I think we're living in an accelerating age.
And if what I believe about the Bible is true, which obviously I believe it is, that's, you know, kinda indicative there.
Aaron
01:31:08 – 01:31:30
I think there is an ever-increasing chance that Christ returns sooner than later.
That's logically obvious because the more time that passes, the closer we're getting.
So it's kinda like, duh.
But, yeah, it's an idle musing, something that I don't really think about or put a lot of stock in.
Yeah.
Aaron
01:31:30 – 01:31:35
I think we're probably kind of getting close to the end, but who can say for sure?
Paige
01:31:35 – 01:31:56
I mean, I know every generation has thought that, but this is, I feel like one of the first times in history, we're having a revolution that is creating potentially a new life form.
I know.
It's crazy.
I feel like the closest you can get to this was when people were worried about how they were going to feed the exploding population.
I feel like it is at that level.
Paige
01:31:56 – 01:32:05
Right.
And there was, you know, the Industrial Revolution making lots of food for everybody.
Oh my God.
History class was so long ago for me; I'm probably embarrassing myself.
Just one last question.
Paige
01:32:05 – 01:32:20
And then we can close up shop.
In your gut, what do you think this artificial intelligence antichrist embodiment might look like?
This is hot take territory, short term.
I don't think anyone is gonna, like, chat you and be like, Aaron,
Paige
01:32:20 – 01:32:20
this is
Aaron
01:32:20 – 01:32:39
It's just fascinating to think about.
So I think the Bible tells us 3 and a half years of peace followed by 3 and a half years of war.
At some point, honestly, it makes logical sense to cede a lot of power and control to a computer that can do it better than humans.
Humans are a problem, man.
We're stupid.
Aaron
01:32:39 – 01:32:49
We're selfish.
We're arrogant.
We make mistakes all the time.
It makes perfect sense to me that a robot should be in charge of everything.
Robots cannot make mistakes.
Aaron
01:32:49 – 01:33:09
They can be programmed poorly.
Yes.
But they cannot make mistakes.
And so I am just following what I think is the reasonable logic to the point where it's like, honestly, I would kinda love for a computer to be in charge of a global government.
That that kinda sounds great to me.
Aaron
01:33:09 – 01:33:25
And then you think about it, and you're like, oh, hang on.
That feels like end times for sure.
Maybe we shouldn't do that, but that's kinda how I think, and that's why I think that is an outside possibility.
And for the record, this is crazy.
Like, I don't know that anyone else thinks this.
These are the idle musings of a guy who listens to too much sci-fi and reads the Bible and puts the 2 together.
So I don't know, but boy, wouldn't that be crazy?
Yeah.
Paige
01:33:37 – 01:33:45
And it's not crazy.
People base their entire lives around astrology charts that are made up on the internet.
Like you,
Aaron
01:33:45 – 01:33:46
which is also crazy.
Paige
01:33:47 – 01:33:50
Yeah.
But like, that's actually bananas.
Sorry.
Aaron
01:33:51 – 01:33:52
Yeah.
Y'all, don't do that.
Paige
01:33:53 – 01:34:05
Astrology.
Oh my God.
I'm pretty agnostic.
I mean, I mentioned I've gotten deeper into my Jewish faith, but I just don't think about it to the level that you do.
Generally speaking, I believe in a God.
Paige
01:34:05 – 01:34:22
Yeah.
I don't know.
I think it's gonna be really interesting when we start to rely more on AI and we automate things and we're not necessarily paying attention to what we're automating.
Just as a final note for my listeners, are there any thoughts or maybe bible verses you'd like to leave them with?
Aaron
01:34:22 – 01:34:41
Sure.
I would love that.
There are 2 that come to mind that have informed the way that I want to live and also line up with the way that I want to live.
I don't know if it's chicken or egg, but these 2, like, I feel drawn to.
One is, I'll have to look up the reference here.
Aaron
01:34:41 – 01:35:08
It's 1st James, or, I'm sorry, James 1:27, and it talks about, like, pure and undefiled religion before God.
And I think religion has gotten perverted in America.
Religion has become conservatism, which I think is neither here nor there.
Here's what pure and undefiled religion is according to the Bible: to visit the fatherless and the widows in their affliction.
Aaron
01:35:09 – 01:36:00
That is pure and undefiled religion before God to care for the orphans and to care for the widows.
And that is, like, our responsibility: to care for people who are weaker than us and to serve them, not to lord our power over them.
So that's one that, like, affects my outward view of the world. And one that affects my inward view of the world is, I think, in 1st Thessalonians, and it talks about how you should aspire to live quietly, to mind your own business, and to work with your hands so that you may walk properly before the outsiders.
And I just, like, what a counter-cultural mandate.
You should aspire to live quietly, to mind your own business, and to work with your hands.
Aaron
01:36:01 – 01:36:41
That is just like, you have so much at home that you need to do.
You need to focus on living a quiet life and working with your hands.
And I just love that so much because it aligns with how I feel, but also, there's a quiet nobility in: I am responsible for very few things, and I'm going to be a good steward of those very few things.
I'm not, like I said, interested in changing the world.
And to borrow something you said, I'm interested in changing the world of a few people, of my family, my wife and my children and my neighborhood.
Aaron
01:36:42 – 01:36:56
Maybe my city; my state is too far.
And so my aspiration is to lead a quiet life, work with my hands, care for the widows and the orphans, and be pure and undefiled before God in that way.
Paige
01:36:57 – 01:37:07
You are such a great dad.
Your kids are gonna grow up, and every Father's Day they're just gonna make a card that says Aaron Francis, best dad in the universe, and it's gonna be true.
Yeah.
I'm serious.
Paige
01:37:07 – 01:37:21
Well, I guess that's all the time we have.
Thank you so much for coming on my show.
It was very interesting.
I loved hearing your thoughts on artificial intelligence and fatherhood and automation and God.
Thank you so much, Aaron.
Aaron
01:37:21 – 01:37:27
Of course.
Thanks for having me on.
I enjoy listening to the show, and it was a delight to be here.
Thank you.