Welcome!

So far, this blog is mostly dedicated to providing an intuitive introduction to machine intelligence, featuring a constantly growing series of articles about topics in machine learning, data mining, statistical learning, big data, artificial intelligence, robotics, etc. But sometimes I just write about whatever comes to my mind.

Jump directly to the intuitive introduction to machine intelligence or browse the list of recent posts below.

Recent Posts

  • Yet Another Machine Learning 101

    This post is a somewhat short recap of machine learning in general. I wrote it as lecture notes for a tutorial that I gave in one of the robotics courses at TU Berlin.

  • The Curse of Dimensionality

    In the last post we looked at one of the big problems of machine learning: when we want to learn functions from data, we have to fight overfitting. In this post we will look at another archenemy of learning: dimensionality.

  • Learning Functions from Data: A Primer

    In the introductory articles we learned that data is a bunch of numbers encoding some information, and that data can be multi-dimensional, which makes it live in vector spaces. We also looked at the core competence of machine intelligence: applying functions to data. In this and the following posts we will look at the most powerful tool of machine intelligence: learning functions from data.

  • Functions as Data Translators

    If you have read the previous posts carefully, you should now be familiar with high-dimensional data. In this last introductory article we are going to look at what machine intelligence people mean when they think about manipulating data: they apply functions to data.

  • Stop Teaching Math in School?

    Today, I watched the TEDx talk by John Bennett, who advocates that we should stop teaching math to middle and high school students. I think it is great when people make controversial claims and defend them with arguments. And indeed, there seems to be a problem with how math is taught in school, given the infamous “math anxiety” that Bennett alludes to. Yet, I strongly disagree with Bennett’s conclusion.

  • The Curse of Overfitting

    In the last post we obtained an understanding of how to learn functions from data, and we developed our first learning method. In this post, we will start building an understanding of why learning from data is actually pretty hard. The first problem we are facing is the curse of overfitting. Let’s see what that is.

  • The Power of Machine Intelligence

    Thinking machines have revolutionized our everyday life, maybe even more than industrialization or the invention of the car. But although everyone of us has an intuition about how a car works, thinking machines remain obscure and magical to most of us.

  • High-dimensional Spaces

    In the last post we saw examples of how numbers can encode information, becoming data. In this post we will talk about a very important way to look at data. This view will allow us to play around with data in a powerful way, and it lies at the core of machine intelligence and the science of data.

  • Data, Numbers and Representations

    I would like to start the intuitive introduction to machine intelligence by looking at the term “data”. Let us try to gain an informal but sufficient understanding of how we could define data.

  • Understanding Machine Intelligence without Formal Math

    In this post I want to defend the basic idea behind this series of articles, namely my attempt to explain data science without resorting to mathematical formalism.

subscribe via RSS