Plurrrr

Sun 09 Aug 2020

Psalmopoeus irminia exuviae

In the afternoon I noticed that the Psalmopoeus irminia sling I keep had molted; the exoskeleton lay on top of a piece of moss sticking out of the cork tube it lives in.

Psalmopoeus irminia molt: the exoskeleton of a Psalmopoeus irminia.

I hadn't seen the spider for weeks, and at times I was worried it had passed away. However, four days ago I removed some moss from its cork tube and saw legs that moved. And today it had kicked out its exuviae.

The previous molt was discovered on the 26th of May, 2020.

A Gentle Introduction to the Rectified Linear Unit (ReLU)

In a neural network, the activation function is responsible for transforming the summed weighted input from the node into the activation of the node or output for that input.
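As a rough sketch of that idea (not code from the article; the function and parameter names are illustrative), a node's output is the activation function applied to the weighted sum of its inputs plus a bias:

```python
def node_output(inputs, weights, bias, activation):
    # Sum the weighted inputs, add the bias, then pass the result
    # through the activation function to get the node's output.
    weighted_sum = sum(w * x for w, x in zip(weights, inputs)) + bias
    return activation(weighted_sum)
```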

The rectified linear activation function is a piecewise linear function that outputs the input directly if it is positive; otherwise, it outputs zero. It has become the default activation function for many types of neural networks because a model that uses it is easier to train and often achieves better performance.
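Mathematically this is just f(x) = max(0, x). A minimal sketch in plain Python (my example, not the article's):

```python
def relu(x):
    # Return the input directly if it is positive, otherwise return zero.
    return max(0.0, x)

# Sample inputs on both sides of zero.
for x in [-3.0, -0.5, 0.0, 0.5, 3.0]:
    print(f"relu({x}) = {relu(x)}")
```

Plugged into the node sketch above, e.g. node_output([1.0, -2.0], [0.5, 0.25], 0.1, relu), it zeroes out any negative weighted sum.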

In this tutorial, you will discover the rectified linear activation function for deep learning neural networks.

Source: A Gentle Introduction to the Rectified Linear Unit (ReLU), an article by Jason Brownlee.