Plurrrr

week 46, 2020

Python Internals Series: Subprocess.Popen

The purpose of this series is to review some parts of CPython’s code.

Why?

Well, there are multiple reasons:

  • Because we can. The project is open source and the source code is freely available here: https://github.com/python/cpython
  • I strongly believe that we, as developers, can learn a lot by studying good and clean code. And I think we can safely assume that CPython’s code, which is used practically everywhere, meets these criteria.
  • I also think that studying Python internals can make us better Python programmers. For example, have you ever wondered why changing sys.stdout seems to have no effect on subprocesses? By the end of this article you will know why.

Source: Python Internals Series: Subprocess.Popen, an article by Yassir Karroum.
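
The sys.stdout point from the last bullet is easy to demonstrate. A minimal sketch, assuming a Unix-like system where the echo command exists: by default the child inherits the parent's OS-level file descriptors, so rebinding the Python-level sys.stdout object changes nothing for the child unless you redirect its output explicitly.

    import io
    import subprocess
    import sys

    # Rebind the Python-level sys.stdout to an in-memory buffer.
    buffer = io.StringIO()
    sys.stdout = buffer

    # The child still writes to the real stdout: subprocess passes the
    # parent's OS-level file descriptors, not the sys.stdout object.
    subprocess.run(["echo", "hello from the child"])

    sys.stdout = sys.__stdout__
    print("captured by the buffer:", repr(buffer.getvalue()))  # empty string

    # To actually capture the child's output, redirect it explicitly:
    result = subprocess.run(["echo", "hello"], capture_output=True, text=True)
    print("captured via capture_output:", result.stdout.strip())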

Chappie (2015)

In the near future, crime is patrolled by a mechanized police force. When one police droid, Chappie, is stolen and given new programming, he becomes the first robot with the ability to think and feel for himself.

In the evening we watched Chappie. I liked the movie and gave it a 7.5 out of 10.

macOS 11.0 Big Sur: The Ars Technica review

For the first time in almost two decades, Apple has decided to bump up the version number of the Mac’s operating system. The change is meant to call attention to both the pending Apple Silicon transition—Big Sur will be the first macOS version to run on Apple’s own chips, even if it’s not the first to require those chips—and to an iPad-flavored redesign that significantly overhauls the look, feel, and sound of the operating system for the first time in a long while. Even the post-iOS-7 Yosemite update took pains to keep most things in the same place as it changed their look.

Source: macOS 11.0 Big Sur: The Ars Technica review, an article by Andrew Cunningham.

Anchor Points for Webbing

In the early evening I commented on a terrarium video posted on Facebook that if I were the owner of the terrarium I would provide more, and higher, anchor points to the Chromatopelma cyaneopubescens shown webbing around. I keep this species in a plastic container with quite some height, as I understand that this species can be found several feet above the ground:

Chromatopelma cyaneopubescens live in extreme xeric conditions in sandy thorn tree/cactus forests on the Paraguana Peninsula of Venezuela. This species is an "opportunistic burrower", whereby, they will make their silken retreats in the dried fissures of the ground, in old dried and piled up cacti, at the base of large thorny trees or up in the natural cavities of those thorny acacia tree ... basically, wherever the prey availability forces them to make their retreat. The trees are rarely higher than 12 feet and either cracks in the tree or natural tree cavities are never above 6 feet.

There are MANY theraphosid taxa that live high in trees that are not true arboreals AND there are true arboreals that have been found living in fossorial ground burrows or under fallen logs lying on the ground.

Source: Chromatopelma cyaneopubescens-- Arboreal vs Terrestrial.

Chromatopelma cyaneopubescens webbing using various anchor points.

So I keep this species without a water dish, on dry coco peat with only a small part kept moist. I also place a drop of water on the web near the spider now and then. On the 8th of April, 2020 I was able to take photos of this specimen taking moisture from the substrate.

M1 Memory and Performance

The M1 Macs are out now, and not only does Apple claim they're absolutely smokin', early benchmarks seem to confirm those claims. I don't find this surprising; Apple has been highly focused on performance ever since Tiger, and as far as I can tell hasn't let up since.

One maybe somewhat surprising aspect of the M1s is the limitation to "only" 16 Gigabytes of memory. As someone who bought a 16 Kilobyte language card to run the Merlin 6502 assembler on his Apple ][+ and expanded his NeXT cube, which isn't that different from a modern Mac, to a whopping 16 Megabytes, this doesn't actually seem that much of a limitation, but it did cause a bit of consternation.

Source: M1 Memory and Performance, an article by Marcel Weiher.

Introduction to Go Modules

I’ve seen many people online talk about liking Go and using it, but being confused by its dependency system, called Go modules. This blog post aims to provide a simple introduction with examples. It focuses mostly on Unix-based systems like Linux and macOS rather than Windows.

This post does not cover all possible ways of using Go modules. It’s just a simple introduction with the most common use cases.

Source: Introduction to Go Modules.

Charles proxy for web scraping

Charles proxy is an HTTP debugging proxy that can inspect network calls and debug SSL traffic. With Charles, you are able to inspect requests/responses, headers and cookies. Today we will see how to set up Charles, and how we can use Charles proxy for web scraping. We will focus on extracting data from JavaScript-heavy web pages and mobile applications.

Source: Charles proxy for web scraping, an article by Kevin Sahin.
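
To see your scraping traffic in such a proxy from Python, it is enough to point the HTTP client at it. A minimal sketch, assuming Charles is listening on its default 127.0.0.1:8888 and that you accept its re-signed SSL certificates; the target URL is just a placeholder:

    import requests

    # Route both plain and TLS traffic through the local debugging proxy so
    # each request/response shows up in its session window.
    proxies = {
        "http": "http://127.0.0.1:8888",
        "https": "http://127.0.0.1:8888",
    }

    # verify=False (or verify pointed at the Charles root certificate) is
    # needed because the proxy re-signs SSL traffic in order to inspect it.
    response = requests.get("https://example.com", proxies=proxies, verify=False)
    print(response.status_code, len(response.text))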

Building a Homelab VM Server (2020 Edition)

For the past five years, I’ve done all of my software development in virtual machines (VMs). Each of my projects gets a dedicated VM, sparing me the headache of dependency conflicts and TCP port collisions.

Three years ago, I took things to the next level by building my own homelab server to host all of my VMs. It’s been a fantastic investment, as it sped up numerous dev tasks and improved reliability.

In the past few months, I began hitting the limits of my VM server. My projects have become more resource-hungry, and mistakes I’d made in my first build were coming back to bite me. I decided to build a brand new homelab VM server for 2020.

Source: Building a Homelab VM Server (2020 Edition), an article by Michael Lynch.

Alice handling an Aphonopelma seemanni

In the evening Alice held the female adult Aphonopelma seemanni I keep. Currently, this tarantula is held in a smaller enclosure so her main enclosure can dry out a bit.

Alice holding a female Aphonopelma seemanni.

Systematically removing code

It's easy to miss things when removing code, leaving behind unused methods, templates, CSS classes or translation keys. (Especially in a dynamic language like Ruby, without a compiler to help you spot dead code.)

I avoid this by removing code systematically, line by line, depth-first.

This is one of those things that seems obvious when you do it, but in my experience, many people do it haphazardly.

Source: Systematically removing code, an article by Henrik Nyh.
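
The article's examples are Ruby; as a rough Python illustration of the depth-first check (not the author's tooling), here is a helper that counts the remaining references to an identifier with git grep before you decide its definition can go too:

    import subprocess
    import sys

    def remaining_references(identifier: str) -> int:
        """Count occurrences of an identifier in the repository using git grep."""
        result = subprocess.run(
            ["git", "grep", "-c", "-e", identifier],
            capture_output=True, text=True,
        )
        # git grep -c prints one "path:count" line per matching file;
        # an empty output (no matches at all) sums to zero.
        return sum(int(line.rsplit(":", 1)[1]) for line in result.stdout.splitlines())

    if __name__ == "__main__":
        name = sys.argv[1]
        # Note: the count includes the definition itself, so a count that only
        # covers the definition means there are no callers left.
        print(f"{name}: {remaining_references(name)} occurrence(s) in the repository")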

How To Write Unit Tests For Logging

Once in a while I get asked the question whether one should write solitary tests for logging functionality. My answer to this question is the typical consultant answer: “It depends”. In essence, logging is an infrastructure concern. The end result is log data that is being written to a resource which is external to an application. Usually the generated data ends up in a file, a database or it might even end up in a cloud service.

Source: How To Write Unit Tests For Logging, an article by Jan Van Ryswyck.
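
In Python (the article itself is not Python-specific), the standard library already makes this kind of test cheap: unittest's assertLogs captures log records in memory, so a test can assert on what was logged without touching the file, database, or cloud service the records normally end up in. A minimal sketch with a made-up charge() function:

    import logging
    import unittest

    logger = logging.getLogger("billing")

    def charge(amount):
        """Hypothetical function whose logging behaviour we want to test."""
        if amount <= 0:
            logger.warning("refusing to charge non-positive amount: %s", amount)
            return False
        logger.info("charged %s", amount)
        return True

    class ChargeLoggingTest(unittest.TestCase):
        def test_warns_on_non_positive_amount(self):
            # assertLogs collects records emitted on the "billing" logger.
            with self.assertLogs("billing", level="WARNING") as captured:
                charge(0)
            self.assertIn("non-positive", captured.output[0])

    if __name__ == "__main__":
        unittest.main()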

Eleven Years of Go

Today we celebrate the eleventh birthday of the Go open source release. The parties we had for Go turning 10 seem like a distant memory. It’s been a tough year, but we’ve kept Go development moving forward and accumulated quite a few highlights.

Source: Eleven Years of Go, an article by Russ Cox.

This is how I git

Every now and then I get questions on how to work with git in a smooth way when developing, bug-fixing or extending curl – or how I do it. After all, I work on open source full time which means I have very frequent interactions with git (and GitHub). Simply put, I work with git all day long. Ordinary days, I issue git commands several hundred times.

I have a very simple approach and way of working with git in curl. This is how it works.

Source: This is how I git, an article by Daniel Stenberg.

Optimizing your code is not the same as parallelizing your code

You’re processing a large amount of data with Python, the processing seems easily parallelizable—and it’s sloooooooow.

The obvious next step is to switch to some sort of multiprocessing, or even to start processing data on a cluster so you can use multiple machines. Obvious, but often wrong: switching straight to multiprocessing, and even more so to a cluster, can be a very expensive choice in the long run.

In this article you’ll learn why, as we:

  1. Consider two different goals for performance: faster results and reduced hardware costs.
  2. See how different approaches achieve those goals.
  3. Suggest a better order for many situations: performance optimization first, only then trying parallelization.

Source: Optimizing your code is not the same as parallelizing your code, an article by Itamar Turner-Trauring.
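
A toy illustration of the ordering argument (not from the article, and the numbers are only indicative): parallelizing a slow membership test across four processes buys at most about 4x, while the single-process algorithmic fix of using a set instead of a list is far bigger.

    import time
    from multiprocessing import Pool

    # Sizes kept modest so the slow version still finishes in seconds.
    HAYSTACK_LIST = list(range(10_000))
    HAYSTACK_SET = set(HAYSTACK_LIST)
    NEEDLES = list(range(0, 20_000, 2))

    def hits_list(needles):
        # O(len(HAYSTACK_LIST)) per lookup: the slow version.
        return sum(1 for n in needles if n in HAYSTACK_LIST)

    def hits_set(needles):
        # O(1) per lookup: the optimized version.
        return sum(1 for n in needles if n in HAYSTACK_SET)

    def timed(label, fn):
        start = time.perf_counter()
        result = fn()
        print(f"{label}: {result} hits in {time.perf_counter() - start:.2f}s")

    if __name__ == "__main__":
        timed("slow, one process     ", lambda: hits_list(NEEDLES))
        with Pool(4) as pool:  # parallelizing the slow version: ~4x at best
            chunks = [NEEDLES[i::4] for i in range(4)]
            timed("slow, four processes  ", lambda: sum(pool.map(hits_list, chunks)))
        timed("optimized, one process", lambda: hits_set(NEEDLES))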

Use Python for shell scripts

I never got used to bash scripting syntax. Whenever I have to write a more-than-trivial bash script, the strange syntax annoys me, and I have to Google every little thing I need to do, starting from how to do comparisons in if statements, how to use sed, etc.

For me, using Python as a shell scripting language seems like a better choice.

Python is a more expressive language. It is relatively concise. It has a massive built-in library that lets you perform many tasks without even using shell commands, it is cross-platform, and it is preinstalled or easily installed on many operating systems.

I am aware that some other dynamic languages (e.g. Perl, Lua) might also be very suitable for shell programming, but I (and my team) work with Python daily and are familiar with it, and it gets the job done.

Source: Avoiding Bash frustration — Use Python for shell scripts, an article by David Ohana.
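
As a small sketch of what that looks like in practice (the directory and size threshold below are made up), here is a task that would otherwise be a few lines of bash with find, gzip and rm, written with only the standard library:

    import gzip
    import shutil
    from pathlib import Path

    LOG_DIR = Path("/var/log/myapp")   # hypothetical directory
    THRESHOLD = 10 * 1024 * 1024       # compress logs larger than 10 MiB

    for log_file in LOG_DIR.glob("*.log"):
        if log_file.stat().st_size > THRESHOLD:
            compressed = log_file.with_name(log_file.name + ".gz")
            with log_file.open("rb") as src, gzip.open(compressed, "wb") as dst:
                shutil.copyfileobj(src, dst)
            log_file.unlink()
            print(f"compressed {log_file} -> {compressed}")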

How Python bytecode is executed

We started this series with an overview of the CPython VM. We learned that to run a Python program, CPython first compiles it to bytecode, and we studied how the compiler works in part two. Last time we stepped through the CPython source code starting with the main() function until we reached the evaluation loop, a place where Python bytecode gets executed. The main reason why we spent time studying these things was to prepare for the discussion that we start today. The goal of this discussion is to understand how CPython does what we tell it to do, that is, how it executes the bytecode to which the code we write compiles.

Source: Python behind the scenes #4: how Python bytecode is executed, an article by Victor Skvortsov.
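
To poke at the bytecode the series talks about, the dis module in the standard library disassembles the code objects the compiler produces. The exact instructions vary between CPython versions; the output below is roughly what CPython 3.8/3.9 prints.

    import dis

    def add(a, b):
        return a + b

    dis.dis(add)
    # Roughly, on CPython 3.8/3.9:
    #   2           0 LOAD_FAST            0 (a)
    #               2 LOAD_FAST            1 (b)
    #               4 BINARY_ADD
    #               6 RETURN_VALUE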