Shipping SQLite to Lambda (with Laravel Vapor)

December 19, 2023

The first question you may ask is, "Why would I ever want to use SQLite on a serverless platform? That makes no sense." Good question!

If you know anything about a serverless platform, you know that the file system is ephemeral. You can mount a persistent file system with Lambda, but by default, it's ephemeral. And since SQLite is a database-in-a-file, you're shipping your database to a filesystem designed to be blown away.

Lambda can spin up as many copies of your application as it needs to satisfy requests, which leads to a bunch of different copies of your database that aren't in sync. Each time a new Lambda container boots up, there's a new copy of your database that doesn't speak to any of the other copies.

This means that if you write to the database in Lambda, those writes basically just disappear: new containers boot with a fresh copy of the database, and while containers can be reused, eventually they'll be destroyed and the writes go with them.

There are some solutions for that. You could use something like LiteFS or Turso. But if I ever need something like that, I'm just gonna use a regular MySQL database (through PlanetScale, of course).

SQLite as a read-only database

The reason that I use SQLite on Vapor is that it's an excellent read-only distribution mechanism. For this website, I write all of the content to the database either locally or in CI, and then I commit and ship the database out to production to be read from.

I don't ever write to the database in production, which is weird. I use it as a content repository, similar to a pre-warmed cache. I put all my articles, videos, and podcasts into the database. In production, I still get to use Eloquent to pull items out of the database, but my site is static content. Every time I deploy a new version of my website, I deploy a new version of my database.
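
A minimal sketch of what that reading looks like, assuming a hypothetical Article model with a published_at column (not my actual code):

use App\Models\Article;

// Production only ever reads from the SQLite file that shipped with the deploy.
$articles = Article::query()
    ->orderByDesc('published_at')
    ->get();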

SQLite on Vapor

For your database to be deployed to Laravel Vapor, you can't put your database.sqlite file in the database directory, because the vapor-cli BuildProcess ignores sqlite files in that directory:

$files = (new Finder())
    ->in(Path::app().'/database')
    ->depth('== 0')
    ->name('*.sqlite');

Instead, I put mine in the project root. You'll need to update your config/database.php as well:

'sqlite' => [
    'driver' => 'sqlite',
    'url' => env('DATABASE_URL'),
    'database' => base_path('database.sqlite'),
    'prefix' => '',
    'foreign_key_constraints' => env('DB_FOREIGN_KEYS', true),
],

Building the database

I populate the database by running several commands that gather items from across the internet. I have a command that looks at YouTube and pulls down all of the videos, and another that looks at podcast feeds and pulls down all of the episodes. For my articles, a command looks at the markdown files on disk and writes them into the database.
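
As a rough sketch, one of those commands could look something like this; the sync:articles name, the Article model, and the columns are placeholders rather than the real implementation:

use App\Models\Article;
use Illuminate\Console\Command;
use Illuminate\Support\Facades\File;

class SyncArticles extends Command
{
    protected $signature = 'sync:articles';

    public function handle()
    {
        // Read every markdown file on disk and upsert it into SQLite.
        foreach (File::files(resource_path('articles')) as $file) {
            Article::updateOrCreate(
                ['slug' => $file->getBasename('.md')],
                ['markdown' => $file->getContents()]
            );
        }
    }
}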

I can build the database locally, or I can build it in CI. I can also delete it altogether and rebuild it from scratch!
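
Roughly, a from-scratch rebuild boils down to something like this (a sketch of a hypothetical rebuild command, not my exact code):

public function handle()
{
    // Start with an empty file so the sqlite connection can open it.
    File::put(base_path('database.sqlite'), '');

    // Recreate the schema, then pull all of the content back in.
    Artisan::call('migrate:fresh', ['--force' => true]);
    Artisan::call('sync:all', ['--no-interaction' => true]);
}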

I don't ever have to back up the database or worry about git conflicts or messing anything up because all of the data is available externally and I'm just pulling it in and putting it into a database for easier usage.

All of those commands are wrapped up into a single Artisan command called sync:all, which looks for all sync:* commands and runs them:

public function handle()
{
    collect(Artisan::all())
        ->keys()
        ->filter(function ($command) {
            return Str::startsWith($command, 'sync:') && $command !== 'sync:all';
        })
        ->each(function ($command) {
            Artisan::call($command, [
                '--no-interaction' => $this->option('no-interaction'),
            ]);
        });
}

On GitHub, I have an Actions workflow that runs that command and commits the content to git.

jobs:
  deploy:
    name: Sync content
    runs-on: ubuntu-latest
    steps:
      - name: Sync Content
        id: sync_content
        env:
          YOUTUBE_API_KEY: ${{ secrets.YOUTUBE_API_KEY }}
        run: php artisan sync:all --no-interaction

      - name: Commit content
        uses: stefanzweifel/git-auto-commit-action@v4
        with:
          commit_message: "Sync content"

This ensures that I always have the most up-to-date content right before I deploy. I also run this workflow on a schedule so that every night at midnight the site updates itself.
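
In the workflow file, that trigger looks roughly like this (treat the exact cron expression as a sketch; it fires at midnight UTC):

on:
  schedule:
    - cron: "0 0 * * *" # every night at midnight (UTC)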

Conclusion

This is a bit of a strange use case for SQLite, but I find it really easy to reason about. It makes my site fully self-contained and reduces the number of services I have to rely on.

As a YouTube video

If you want to see more about my site, I've recorded a YouTube video on it!

Me

Thanks for reading! My name is Aaron and I write, make videos, and generally try really hard.

If you ever have any questions or want to chat, I'm always on Twitter.

You can find me on YouTube on my personal channel or the Try Hard Studios channel.

If you love podcasts, I got you covered. You can listen to me on Mostly Technical.