A case against AI-specific regulation

AI systems are increasingly being deployed in all spheres of our lives. At the same time, we keep hearing how they can discriminate, make biased decisions, drive people towards depression and anxiety, and sometimes even outright kill us. Not surprisingly, there is growing pressure to establish rules to prevent some of the damage that the widespread introduction of AI systems is bringing. However, regulation focusing only on AI systems may distract us from the foundational problems and cause us to miss opportunities to address wider societal issues.

Book notes: Gödel, Escher, Bach. Part 2

This is the second part of my notes on Gödel, Escher, Bach: An Eternal Golden Braid. This post will cover the second part of the book, namely the EGB part.

Book notes: Gödel, Escher, Bach. Part 1

Occasionally I read a book so dense in useful ideas I want to internalize that I cannot help myself but start rereading it as soon as I finish it. To make sure that I remember the key points that made me love the book, I also write down notes in a notebook I keep for this purpose. Then, whenever I realize I could benefit from reading the book again but don’t have the time at hand, I just quickly review these notes. Since I started this blog, I realized that some (though not all) of these notes...

Concentration bounds. Part 4: Hoeffding inequality

Hoeffding’s inequality is one of the most frequently used concentration bounds. It applies to sums of bounded independent random variables. Here “bounded” is the key word: it means that each variable $X_i$ can only take values in an interval $[a_i, b_i]$. Many real-world quantities satisfy this condition.
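As a quick illustration of the bound for identically bounded variables, here is a minimal simulation sketch (the variable names, the Bernoulli setup, and the parameters are my own, not taken from the post). For $n$ independent $X_i \in [0, 1]$, Hoeffding gives $P(|\bar{X} - \mathbb{E}\bar{X}| \ge t) \le 2e^{-2nt^2}$:

```python
import math
import random

random.seed(2)

n = 1_000       # samples per experiment
trials = 2_000  # number of repeated experiments
t = 0.05        # deviation threshold

# X_i ~ Bernoulli(0.5), so each X_i is bounded in [0, 1] and E[X-bar] = 0.5.
exceed = 0
for _ in range(trials):
    sample_mean = sum(random.random() < 0.5 for _ in range(n)) / n
    if abs(sample_mean - 0.5) >= t:
        exceed += 1

empirical = exceed / trials
# Hoeffding bound for identical intervals [0, 1]: 2 * exp(-2 * n * t^2)
hoeffding_bound = 2 * math.exp(-2 * n * t**2)
```

With these numbers the bound evaluates to roughly $2e^{-5} \approx 0.013$, and the observed frequency of large deviations stays below it, as the inequality guarantees.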

Concentration bounds. Part 3: Chebyshev and Chernoff inequalities

In this post, we will take a look at two special cases of Markov’s inequality. First, we will take $X=(Y-\mathbb{E}Y)^2$ which leads to the Chebyshev bound. It allows us to bound deviations from the mean by using the variance of the random variable. Then, we will take $\phi(x)=e^{sx}$ in the extended version of Markov’s inequality, resulting in the Chernoff bound. It will be the basis for Hoeffding’s inequality that we will look at in the next post.
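The Chebyshev special case can be checked numerically. The sketch below is my own illustration (names and parameters are not from the post): it verifies that the fraction of samples deviating from the mean by at least $k$ standard deviations stays below $1/k^2$:

```python
import random
import statistics

random.seed(1)

n = 100_000
# Uniform(0, 1) samples; any distribution with finite variance would do.
samples = [random.uniform(0.0, 1.0) for _ in range(n)]

mean = statistics.fmean(samples)
sigma = statistics.pstdev(samples)

k = 1.5
# Fraction of samples at least k standard deviations away from the mean.
empirical = sum(abs(x - mean) >= k * sigma for x in samples) / n
chebyshev_bound = 1.0 / k**2  # Chebyshev: P(|Y - EY| >= k*sigma) <= 1/k^2
```

For the uniform distribution the bound is quite loose (the true tail probability is well below $1/k^2$), which is exactly why the sharper Chernoff and Hoeffding bounds of the later posts are worth the extra assumptions.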

Concentration bounds. Part 2: Markov inequality

Markov’s inequality provides an upper bound on the probability that a random variable takes a value several times larger than its mean. As the only constraints it puts on the random variable $X$ are non-negativity and integrability (i.e. $\mathbb{E}X$ exists), it is one of the most widely applicable concentration bounds. It is also the starting point for many other concentration bounds, including Chebyshev, Chernoff, and Hoeffding, all of which we will see later.
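The statement $P(X \ge a) \le \mathbb{E}X / a$ is easy to sanity-check by simulation. This is a minimal sketch of my own (the exponential distribution and the threshold are illustrative choices, not from the post):

```python
import random

random.seed(0)

n = 100_000
# Exponential samples with rate 1: non-negative, E[X] = 1.
samples = [random.expovariate(1.0) for _ in range(n)]

a = 5.0
# Empirical tail probability P(X >= a).
empirical = sum(x >= a for x in samples) / n
# Markov bound E[X] / a, with E[X] estimated from the same sample.
markov_bound = (sum(samples) / n) / a
```

Here the true tail probability is $e^{-5} \approx 0.0067$ while the Markov bound is about $0.2$ — valid, but very loose, which motivates the tighter bounds in the follow-up posts.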

Concentration bounds. Part 1: Introduction

In science, engineering, and generally in life we often need to draw conclusions about the world based on a few discrete experiences we have. While we are usually aware that extrapolating from a handful of samples to the whole population must come with some uncertainty, we are often happy with just saying that “our estimate approximates the true value”. However, there are some scenarios where a lot is at stake: the safety of medicines, the reliability of aircraft engines, climate change predictions, and many more. To convince others (and ourselves) that our empirical study of such systems is representative of what would happen if we release them to the public, we need something more than “our analysis approximates reality”. We need to say how well it approximates it. This is the realm of concentration bounds: quantitative bounds on the quality of our estimates.

Why a category is the best representation for compositional structure

Before someone answers a question of the form “Why is X the best Y?”, they should be pretty sure that the answer to the question “Is X a Y?” is a solid YES! It took me some time to figure out why everybody in category theory is so stoked about compositionality. Sure, problems that I’d consider to have compositional structure do indeed happen to be nicely representable as categories. But that doesn’t mean that they all can, right? Or that a category is the only way to model them? Well, it turns out that there are some good reasons why compositionality...

Heteromorphic twisted categories

My master’s thesis involved a fair deal of Category Theory, something that, as a student of robotics (and previously of all things flying), I had never encountered before. And even though I was only dipping my toe in the very basics, I stumbled upon a simple construction that seemed not to have been studied by anyone before.

While trying to soak up terms and constructions as quickly as possible, I was amazed at how every small variation or flavor of a category-theoretical concept that I could think of had already been diligently studied and curiously named. Until I found one that...