a tumblelog
Sat 05 Oct 2019

Black Jade

In the darkest of ages, the Lord of the Lightstone is a lowly man, and lost…

In the afternoon my mother and I visited a local bookstore that was having a closing-down sale; all books were priced at 1 euro each. I couldn't find anything, but my mother found Black Jade by David Zindell.

Cover of Black Jade by David Zindell.

I had never heard of this author, but decided to buy the book anyway.

Version 3.0.0 of tumblelog has been released

In the evening I finished what I started yesterday: version 3.0.0 of tumblelog. This version adds the ability to create non-blog pages, for example an about page or a subscribe page. It should even be possible to build a complete (micro) site without a blog this way.

Get version 3.0.0.

Principal Component Analysis

In dimensionality reduction we seek a function f: ℝⁿ ↦ ℝᵐ where n is the dimension of the original data X and m is less than or equal to n. That is, we want to map some high dimensional space into some lower dimensional space. (Contrast this with the map into a finite set sought by cluster analysis.)

We will focus on one technique in particular: Principal Component Analysis, usually abbreviated PCA. We’ll derive PCA from first principles, implement a working version (writing all the linear algebra code from scratch), show an example of how PCA helps us visualize and gain insight into a high dimensional data set, and end with a discussion of a few more-or-less principled ways to choose how many dimensions to keep.
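The linked article writes its linear algebra from scratch; as a rough sketch of the same idea, here is a minimal PCA using NumPy (NumPy and the function name `pca` are my assumptions, not from the article): center the data, take the covariance matrix, and project onto the eigenvectors with the largest eigenvalues.

```python
import numpy as np

def pca(X, m):
    """Project the n-dimensional rows of X onto their top m principal components."""
    # Center the data; PCA is defined on mean-zero data.
    Xc = X - X.mean(axis=0)
    # Sample covariance matrix (n x n).
    cov = Xc.T @ Xc / (len(X) - 1)
    # eigh handles symmetric matrices; eigenvalues come back in ascending order.
    eigvals, eigvecs = np.linalg.eigh(cov)
    # Keep the m eigenvectors belonging to the largest eigenvalues.
    components = eigvecs[:, ::-1][:, :m]
    # Map each row from R^n to R^m.
    return Xc @ components

# Usage: reduce 5-dimensional data to 2 dimensions.
X = np.random.default_rng(0).normal(size=(100, 5))
Z = pca(X, 2)
print(Z.shape)  # (100, 2)
```

This is the classical eigendecomposition route; the article may instead use the SVD or a from-scratch solver, which give the same subspace.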

Source: Principal Component Analysis, ML From Scratch, Part 6 by Oran Looney.