Idle Thoughts on Old Perl Versions for New Distributions


My upgrade of my home server from Debian 11 ("bullseye") to Debian 12 ("bookworm") went almost without a hitch. Yesterday I realized that the Postgres data hadn't been migrated from the old database to the Postgres 15 packaged with Debian 12. But luckily, the good Pg people provide a Debian package of Postgres 9.6 (the version that held my data) for Debian 12. I could install that one, fire it up, dump all data to SQL, then fire up Pg 15 from Debian and import the dump there. Now I run such an SQL dump daily, just to have the data available as SQL files.

I wonder if it would be worthwhile for Perl to provide prebuilt binaries/packages of old Perl versions for current OSes, but then, there are so many build options that it's not worth the effort in general.

The only use case I see would be to provide an emergency Perl for when your dist-upgrade nuked the system Perl[^1], but some custom XS modules, or XS modules installed via cpan instead of the package manager, relied on that version. This would reduce the number of build options, but I'm still not sure whether that would actually help anybody.

Maybe simply taking the (Debian) build files for old packages/distributions and running them for new distributions, with a prefix of /opt/perl5-xx, could already help. People would still need to edit the interpreter paths of their scripts to bring things back up.
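Repointing a script at such an emergency Perl mostly means editing its shebang line. A minimal sketch, assuming a hypothetical prefix of /opt/perl5-36 and a script called myscript:

```shell
# Create a tiny demo script, then repoint its shebang at the
# (hypothetical) emergency Perl under /opt/perl5-36:
echo '#!/usr/bin/perl' > myscript
echo 'print "hello\n";' >> myscript
sed -i '1s|/usr/bin/perl|/opt/perl5-36/bin/perl|' myscript
head -1 myscript
```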

This only makes sense when also rebuilding all the old CPAN modules for the new OS version, except under /opt. That's a lot of effort for little to no gain, except when people really need it.

[^1]: Well, not nuked, but replaced with a newer major version that is not binary compatible.

Git-Frankenstein - Transplant Patches Between Repos


How it started

I keep all files related to my Perl distributions in a separate git repository. The files needed to build a distribution, the author tests and some other scaffolding get copied from that repository when I start a fresh repository. I then customize the copied files. But of course, my author tests, or the Makefile.PL evolve, and most changes never propagate to old distributions.

Of course, from time to time, I copy the changed files from Dist::Template into whatever distribution I'm working on.

How it's going

Git-Frankenstein automates this process. It transplants patches made to Dist::Template onto other repos, when they fit. It basically is

for $patch in (`git log $source`) { if( `git apply --check $patch` ) { `git apply $patch` } }

... except that it can also list the applicable patches before applying them:

corion#speech-recognition-vosk: list
626d986cdb25f2c71e7eef277c7c998f5cc87e28 update_file() handles (only) UTF-8 files
033e53e0cd9866ed281be0924bed2a265e3e5295 Remove -T from tests
d5de9059421538580edbc604e64432b3211f16c7 Don't warn for every thing in our own distribution and test prerequisites
07ec7b5592e6cd96e8d5da9f4643022a52a940b1 Don't use ->import() for version checking

Also, to satisfy a real need, there is the auto command, which automatically applies all patches that fit.

corion#speech-recognition-vosk: auto
626d986cdb25f2c71e7eef277c7c998f5cc87e28 update_file() handles (only) UTF-8 files
033e53e0cd9866ed281be0924bed2a265e3e5295 Remove -T from tests
d5de9059421538580edbc604e64432b3211f16c7 Don't warn for every thing in our own distribution and test prerequisites
07ec7b5592e6cd96e8d5da9f4643022a52a940b1 Don't use ->import() for version checking
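The check-then-apply loop sketched above can be written out as a self-contained shell demo. Everything here (repository names, file contents) is made up for illustration, and the real tool does more bookkeeping:

```shell
#!/bin/sh
# Demo of the check-then-apply loop: build a throwaway "template" repo and
# a "dist" repo, then transplant every template commit whose patch still
# applies cleanly onto the dist working tree.
set -e
work=$(mktemp -d)
commit() { git -c user.email=demo@example.com -c user.name=demo commit -q -m "$1"; }

git init -q "$work/template"
( cd "$work/template"
  echo "template readme" > README;  git add README;      commit "add README"
  echo "use strict;" > Makefile.PL; git add Makefile.PL; commit "add Makefile.PL"
)

git init -q "$work/dist"
( cd "$work/dist"
  echo "dist readme" > README; git add README; commit "dist README"
)

cd "$work/dist"
# Walk the template history oldest-first; test-apply each patch, then apply it
git -C "$work/template" rev-list --reverse HEAD | while read -r c; do
    if git -C "$work/template" show --format= --patch "$c" | git apply --check 2>/dev/null
    then
        git -C "$work/template" show --format= --patch "$c" | git apply
        echo "applied $c"
    else
        echo "skipped $c"
    fi
done
```

The first patch gets skipped because README already exists in the target with different content; the second one applies cleanly and creates Makefile.PL.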

The code lives on Github.

Disappointed by mobile Browser APIs


Once again the browser APIs disappoint me, especially their (non-)implementation on mobile phones.

To display images from my mobile, or to share my phone screen on a random TV (or other computer), I'd like a simple photo projection website. Very much like ShareDrop, except that the images are not saved on the other machine(s) but simply displayed there.

Making a directory full of images accessible to the browser does not work, as the File System Access API is not implemented on mobile browsers.

Screen sharing could work via WebRTC and images stay locally as far as possible. But the WebRTC screen sharing API is not implemented on mobile browsers.

This is disappointing.

Simple Deployments using Git as Transport


This is largely an elaboration on my other post on using Git for deployment.

I like to write small toy programs as web apps, like the Curl-to-Perl converter or my weather forecast app. My current tool of choice for writing such web apps is Mojolicious, and the local development is quite nice as it comes with a local web server built in.

[Screenshot of the weather forecast app]

The internet

But obviously a web application is no fun if you can't put it online and use it from wherever, or show it around. I do have a server on the internet, and it runs a webserver, but updating my software while it is in development is inconvenient. When something is inconvenient, I don't do it often, so I want to remove as much of that inconvenience as possible.


Copying files to the target machine

It's easy to copy files using scp or rsync. My uplink is pretty fast nowadays so I can conveniently copy 10MB within a second.

I program in Perl, and I would need to copy the needed Perl modules as well. This fails when I need Perl modules with a binary component, like a C library, since those have to be compiled for the target machine. So copying files alone will not work.

Running programs on the target machine

If I'm only copying the data that can easily be created locally, I need a way to run programs on the remote machine. This is possible, for example, by passing the command to ssh, as in ssh remote-machine some-command.

But as most of my webapps are written with Perl as the backend, I need to run at least cpanm --installdeps to install the needed modules. Often I also want to regenerate other files like manifest.json and/or compress assets. Usually, these other jobs are done through a Makefile.

Using Git as transport and runner

I use Git as my version control system. In Git, I usually check in almost everything of interest to a project. Instead of writing a shell script that I run locally which uploads the files and then kicks off a remote build, I (ab)use the Git post-receive hook to work as my program runner and the Git transport mechanism for transferring the data.

Git as transport

Git can download changes from and upload changes to other Git repositories. It can use a variety of transport mechanisms:

  • file copy from/to a local directory
  • file copy via the git protocol
  • file copy via the ssh / scp protocol

The last one is the most convenient for me, as it means I can simply have a Git repository on the webserver machine, and git push will upload my local changes to the remote webserver.

Anatomy of the Git post-receive hook

Whenever Git has received a complete set of changes into a repository, it kicks off the post-receive hook. The post-receive hook is a program intended to be customized by the user to perform tasks whenever that event arrives.

In my case, I use that post-receive hook to perform all tasks that I want to be done on the webserver:

  • check out the latest state of my webapp into a directory
  • install modules needed by my webapp
  • perform other tasks as specified by a Makefile


Setting up the post-receive hook is fairly simple:

  1. Create a remote directory for the repository

mkdir my-webapp.git

  2. Initialize the directory as a bare Git repository

cd my-webapp.git && git init --bare

  3. Add the post-receive hook

Don't forget to make the file executable.

  4. Add the machine as a remote in your local repository

git remote add demo corion@that.machine.example:my-webapp.git
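For the "Add the post-receive hook" step, a minimal placeholder can be created like this, run from inside the bare repository (the actual hook contents are shown further down):

```shell
# Inside the bare repository: create a stub post-receive hook
# and make it executable (the real hook body comes later).
mkdir -p hooks
cat > hooks/post-receive <<'EOF'
#!/bin/sh
echo "received a push"
EOF
chmod +x hooks/post-receive
```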


Deployment now looks like

git push demo

The steps of the post-receive hook

The steps performed by the hook in detail are:

Check out the latest state of my webapp into a directory

git "--work-tree=${CHECKOUT_DIR}" "--git-dir=${REPO}" checkout -f

From the Git repository, we check out the current state into a target directory.

Install modules needed by my webapp

I like to install all modules needed by a webapp into a directory local to that webapp. This means more maintenance, but it also means that changes to one webapp don't break other webapps. For additional safety, I also reset the PERL5LIB environment variable so even if the hook is run manually it won't install or use modules outside of the app-specific directory.

PERL5LIB=${BASE}/${DIST}/lib /home/corion/perl/bin/cpanm --installdeps "${CHECKOUT_DIR}" -l "${CHECKOUT_DIR}/extlib" --notest

Run post-install steps

Some assets of the webapp might need to be (re)compressed. make is a convenient tool to update files based on the timestamps of other files:

cd "${CHECKOUT_DIR}/public" && make deploy
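Such a Makefile could look like the following sketch; the asset name app.js and the gzip rule are made up for illustration:

```shell
# Sketch of a Makefile whose "deploy" target recompresses an asset
# only when the source file changed (asset name is hypothetical):
mkdir -p public && cd public
echo 'console.log("hi")' > app.js
printf 'deploy: app.js.gz\n\napp.js.gz: app.js\n\tgzip -9 -c app.js > app.js.gz\n' > Makefile
make deploy
```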

The post-receive hook in its full glory


#!/bin/sh
# Hypothetical deployment directory; adjust to wherever
# the webapp should be checked out:
CHECKOUT_DIR=$HOME/deploy/my-webapp

REPO=$( cd "$GIT_DIR" || exit; pwd)
BASE=$(cd "${REPO}/.." || exit; pwd)
DIST=$(basename "${CHECKOUT_DIR}")

git "--work-tree=${CHECKOUT_DIR}" "--git-dir=${REPO}" checkout -f
PERL5LIB=${BASE}/${DIST}/lib /home/corion/perl/bin/cpanm --installdeps "${CHECKOUT_DIR}" -l "${CHECKOUT_DIR}/extlib" --notest

cd "${CHECKOUT_DIR}/public" && make deploy

See Also

Git::Hooks - a Perl program for Git hooks

git-init - create or edit your default Git hooks

Other approaches


A bundler for Perl. This requires you to have the same Perl compiler and compiler flags locally as on the remote machine, as it compiles all artifacts locally.