Syllabification with neural networks

As part of another project, I came across the problem of correctly counting the number of syllables in English words. After searching around and seeing mostly rule- and dictionary-based methods, I ended up building a syllable counter from scratch, which ultimately led to the construction of a neural network... [Read More]
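
The rule- and dictionary-based methods mentioned above usually amount to counting vowel groups plus a pile of exception rules. As a toy illustration of why that approach frays (this is my own sketch, not the post's eventual neural model):

```python
import re

def count_syllables(word: str) -> int:
    """Toy rule-based syllable counter: count groups of consecutive
    vowels, with a crude correction for a silent trailing 'e'."""
    word = word.lower()
    vowel_groups = re.findall(r"[aeiouy]+", word)
    count = len(vowel_groups)
    # Crude rule: a trailing 'e' is usually silent ("cake", "note"),
    # unless the word ends in "le" ("table", "syllable").
    if word.endswith("e") and count > 1 and not word.endswith("le"):
        count -= 1
    return max(count, 1)

print(count_syllables("syllable"))  # 3
print(count_syllables("cake"))      # 1
print(count_syllables("naive"))     # 1 -- wrong, should be 2
```

Words like "naive" or "created" already break these rules, which is exactly the kind of thing that pushes you towards a learned model.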

Squared error and cross entropy

When you're first introduced to machine learning, practically oriented textbooks and online courses focus on two major loss functions: the squared error for regression tasks and cross entropy for classification tasks, usually with no justification for why these two in particular are important. I'll here show that they're both instances of the same concept:... [Read More]
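
The teaser cuts off before naming the concept, but the standard way to unify the two losses (which I'm inferring here; the post's actual derivation may differ) is maximum likelihood. Assuming a Gaussian noise model for regression, the negative log-likelihood of a single observation is

$$ -\log p(y \mid x) \;=\; \frac{\big(y - f_\theta(x)\big)^2}{2\sigma^2} + \tfrac{1}{2}\log(2\pi\sigma^2), $$

which, up to constants, is the squared error. Assuming a categorical model for classification, with predicted class probabilities $\hat{y}_1, \dots, \hat{y}_K$, the same quantity is

$$ -\log p(y \mid x) \;=\; -\sum_{k=1}^{K} \mathbb{1}[y = k] \,\log \hat{y}_k, $$

which is exactly the cross entropy. Minimising either loss is then minimising a negative log-likelihood under the corresponding noise model.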

NaturalSelection

A Python package to easily evolve neural networks

In a deep learning project I am currently working on, I faced the inevitable problem of having to tune my hyperparameters. After trying a few dozen combinations, it felt more like guesswork than anything else, and I decided to be more systematic, which eventually led to the development of my... [Read More]
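
The idea behind evolving hyperparameters is a plain genetic algorithm: keep a population of candidate settings, score each by validation performance, and breed the best. Here is a minimal sketch of that loop in plain Python; the names, search space, and toy fitness function are all my own illustration, not the NaturalSelection API:

```python
import random

# Hypothetical search space over a few hyperparameters.
SEARCH_SPACE = {
    "lr": [1e-4, 1e-3, 1e-2, 1e-1],
    "hidden_units": [16, 32, 64, 128, 256],
    "dropout": [0.0, 0.1, 0.3, 0.5],
}

def random_genome():
    """Sample one hyperparameter setting uniformly from the search space."""
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def mutate(genome, rate=0.3):
    """Resample each gene with probability `rate`."""
    return {
        k: (random.choice(SEARCH_SPACE[k]) if random.random() < rate else v)
        for k, v in genome.items()
    }

def fitness(genome):
    """Stand-in for 'train the network, return validation accuracy'."""
    return -abs(genome["lr"] - 1e-2) - abs(genome["dropout"] - 0.1)

def evolve(generations=20, population_size=10, elite=3):
    population = [random_genome() for _ in range(population_size)]
    for _ in range(generations):
        # Keep the fittest settings, refill the rest with mutated copies.
        population.sort(key=fitness, reverse=True)
        parents = population[:elite]
        population = parents + [
            mutate(random.choice(parents))
            for _ in range(population_size - elite)
        ]
    return max(population, key=fitness)

print(evolve())
```

In practice `fitness` would train the network and return its validation score, which is where virtually all the runtime goes; everything else is bookkeeping.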

Singular value decomposition

Whenever we are dealing with a complicated problem, it usually helps to break it into smaller pieces that are easier to handle. This is as true in mathematics and machine learning as it is when we're cooking a meal or cleaning our home. This idea is the guiding principle... [Read More]
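
To make the "smaller pieces" concrete: NumPy computes the decomposition directly, and truncating it gives the best low-rank approximation of the original matrix. A quick illustration (the example matrix is arbitrary):

```python
import numpy as np

# SVD factors any matrix A into U @ diag(S) @ Vt:
# a rotation, a per-axis scaling, and another rotation.
A = np.array([[3.0, 1.0], [1.0, 3.0], [0.0, 2.0]])
U, S, Vt = np.linalg.svd(A, full_matrices=False)

# The factors reproduce A up to floating-point error.
assert np.allclose(A, U @ np.diag(S) @ Vt)

# Keeping only the largest singular value gives the best rank-1
# approximation of A in the least-squares sense (Eckart-Young).
A_rank1 = S[0] * np.outer(U[:, 0], Vt[0])
print(np.linalg.norm(A - A_rank1))  # equals the discarded value S[1]
```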

Normal

Why standardise data?

The normal distribution. Gaussian distribution. Bell curve. This ideal has many names. But what is so special about this distribution? Answering this question turns out to also justify Scikit-Learn's StandardScaler! Let's get crackin'. [Read More]
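
For a concrete look at what StandardScaler actually does: it shifts every feature to mean zero and rescales it to unit variance, so an approximately normal feature becomes approximately standard normal. A short example (the toy height/weight data below is made up):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

# Two made-up features: height (cm) and weight (kg).
X = np.array([[170.0, 60.0],
              [180.0, 80.0],
              [160.0, 55.0],
              [175.0, 70.0]])

scaler = StandardScaler()
X_scaled = scaler.fit_transform(X)

# Each column now has mean ~0 and standard deviation ~1.
print(X_scaled.mean(axis=0))  # ~[0, 0]
print(X_scaled.std(axis=0))   # ~[1, 1]
```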