Just before six p.m. I decided to pop open my late 2014 Mac mini to
see if I could clean the inside with compressed air. Its fan had been
spinning non-stop for quite some time. And even though I had used
compressed air on the outside, I had the feeling there was still a lot
of dust inside, and that this was the reason the fan kept spinning
noisily.
I used a bol.com plastic gift card to pry open the bottom, a task
that was much easier than I had thought; the plastic lid popped off
very easily. And then I noticed how many dust bunnies had gathered
inside the Mac mini... I could pick up most of it by hand and used
compressed air to blow away the rest.
When I reconnected the Mac mini and turned it on... silence. The dust
really had been causing the Mac to overheat, and maybe making it
unstable as well, and it was keeping the fan spinning.
Many machine learning (ML) models are Python pickle
files under the
hood, and it makes sense. The use of pickling conserves memory,
enables start-and-stop model training, and makes trained models
portable (and, thereby, shareable). Pickling is easy to implement,
is built into Python without requiring additional dependencies, and
supports serialization of custom objects. There's little doubt why
pickling is such a popular choice for persistence among Python
programmers and ML practitioners.
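As a minimal sketch of what that looks like in practice (the
TinyModel class and the model.pkl file name are placeholders for
illustration, not anything from a real framework), pickling a trained
object and loading it back takes only a few lines:

```python
import pickle

# Stand-in for a trained ML model: a plain custom object holding learned state.
class TinyModel:
    def __init__(self, weights):
        self.weights = weights

    def predict(self, x):
        # Trivial "inference" so the round trip is observable.
        return sum(w * xi for w, xi in zip(self.weights, x))

# Serialize the trained object to disk...
model = TinyModel(weights=[0.2, 0.5, 0.3])
with open("model.pkl", "wb") as f:
    pickle.dump(model, f)

# ...and later load it back, exactly as it was when training stopped.
with open("model.pkl", "rb") as f:
    restored = pickle.load(f)

print(restored.predict([1.0, 2.0, 3.0]))
```

Note that pickle stores custom classes by reference (module and class
name only), so the loading side needs the same TinyModel class
importable to reconstruct the object.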
This started with a consulting snafu: Government organisation A got
government organisation B to develop a web application. Government
organisation B subcontracted part of the work to somebody. Hosting
and maintenance of the project were later contracted out to a
private-sector company C. Company C discovered that the
subcontracted somebody (who was long gone) had built a custom Docker
image and made it a dependency of the build system, but without
committing the original Dockerfile. That left company C with a
contractual obligation to manage a Docker image they had no source
code for. Company C calls me in once in a while to do various
things, so doing something about this mystery meat Docker image
became my job.