
What is Dimensionality Reduction?

Dimensionality reduction is often used in pattern recognition, data mining and machine learning. The goal of dimensionality reduction is to reduce the number of dimensions of feature vectors.

In this post, I will try to provide a brief introduction with examples.

To get into the details, we first need to understand the concepts of a feature, a feature vector, and the dimension of a feature vector.

What is a feature?

A feature is a distinguishing characteristic of an object or entity. For example:

  1. Identifying a single object: A red-colored car is passing on a street. When you see the red car, you distinguish it by its color. Here, red is a distinguishing characteristic of that car.
  2. Comparison of two objects: A is 6 feet tall and B is 5 feet tall. From this statement, we can extract height (in feet) as a feature, and A and B can be distinguished using it.

Looking at the above two examples, we see that features are distinguishing attributes used to identify one object from another.

In the first case, although there is only one car (which is red), we are implicitly applying our knowledge that cars of other colors also exist. In other words, we know that cars come in different colors, and we (as humans) process that information so that we can distinguish the car we are currently seeing from others (based on previously seen car colors).

In the second case, it is explicitly stated that A is 6 feet tall and B is 5 feet tall. Here the statement itself makes a clear comparison of the two objects; we do not need to draw on stored past information the way we did for the red car. The distinguishing attribute is the height (in feet).

What is a feature vector?

A feature vector is a vector (a 1-dimensional array) whose entries are the values of an object's features.

Let us take a simple example of cars. Say we have three cars (Car 1, Car 2 and Car 3) of different colors (red, green and blue) and of lengths 1.8, 1.9 and 2.0 meters, respectively.

Then, each car can be identified by its feature vector as:

Car 1 = [red, 1.8]

Car 2 = [green, 1.9]

Car 3 = [blue, 2.0]

The fundamental idea underpinning a feature vector is to uniquely identify an entity. In the above car example, Car 1, Car 2 and Car 3 are the entities (or objects) that we want to uniquely identify or represent.

What are attributes of a feature vector?

red and 1.8 are the attributes of Car 1.

green and 1.9 are the attributes of Car 2.

blue and 2.0 are the attributes of Car 3.

Length of a feature vector: Car 1 has two attributes. Therefore, the feature vector of Car 1 has length 2. In the above example, the feature vectors of Car 2 and Car 3 also have length 2.
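To make this concrete, here is a minimal Python sketch (an illustration only, not part of the original example) that represents the three cars as feature vectors and prints their lengths:

```python
# A minimal sketch (illustration only): each car is represented as a
# feature vector, i.e. a 1-dimensional array of attribute values.
car_1 = ["red", 1.8]    # [color, length in meters]
car_2 = ["green", 1.9]
car_3 = ["blue", 2.0]

# The length of a feature vector is its number of attributes.
for name, car in [("Car 1", car_1), ("Car 2", car_2), ("Car 3", car_3)]:
    print(name, car, "-> feature vector length:", len(car))
```

Each vector has length 2: one color attribute and one length attribute.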

What is the dimension of a feature vector?

The dimension of a feature vector is the number of independent attributes it contains. In the above car example, take the features of Car 1: red and 1.8. Since the two attributes are unrelated (independent), the dimension is 2. Because one attribute is text (red) and the other is a number (1.8), the vector may be awkward to work with in this form.

To handle this better, we could convert colors to numbers. Let's say we assign the numbers 2, 3 and 5 to red, green and blue, respectively. Then our feature vectors can be written as

Car 1 = [2, 1.8]

Car 2 = [3, 1.9]

Car 3 = [5, 2.0]

Now the dimension of each feature vector (Car 1, Car 2 and Car 3) is 2 (because each has two attributes).
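As a rough sketch of this encoding step (again just an illustration, using the same ad-hoc color-to-number assignment as above), in Python it could look like:

```python
# A minimal sketch of the ad-hoc encoding above: map each color to a number
# so that every attribute in the feature vector is numeric.
color_to_number = {"red": 2, "green": 3, "blue": 5}

cars = {
    "Car 1": ["red", 1.8],
    "Car 2": ["green", 1.9],
    "Car 3": ["blue", 2.0],
}

numeric_vectors = {
    name: [color_to_number[color], length]
    for name, (color, length) in cars.items()
}
print(numeric_vectors)
# {'Car 1': [2, 1.8], 'Car 2': [3, 1.9], 'Car 3': [5, 2.0]}
```

In practice, categorical attributes are usually encoded more carefully (for example, with one-hot encoding), but this simple assignment mirrors the example above.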

What is dimensionality reduction?

Reducing the number of dimensions of a vector (or a feature vector) is referred to as dimensionality reduction.

In the above car example, suppose we find a mapping function $ f: x \mapsto y $, where $x$ is an input feature vector and $y$ is an output feature vector of lower dimension.

For example, we could find a function $ f_1 : [2, 1.8] \mapsto 0.75 $. Here, the 2-dimensional feature vector of Car 1 is mapped to a single (scalar, real) value of 0.75. The output value of 0.75 is just an example; it could be any real value.

Because the 2 dimensions of Car 1 were converted (mapped) to a single value, the output has a reduced dimension. For this reason, we call this dimensionality reduction.
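As an illustrative sketch (not something the example above prescribes), one well-known way to obtain such a mapping $f$ is Principal Component Analysis (PCA). Assuming NumPy and scikit-learn are available, the 2-dimensional car vectors can be mapped to 1 dimension like this:

```python
# A minimal sketch: use PCA (one standard dimensionality reduction technique)
# as the mapping f that takes 2-dimensional feature vectors to 1 dimension.
import numpy as np
from sklearn.decomposition import PCA

# Numeric feature vectors from the example above: [color code, length in meters].
X = np.array([
    [2, 1.8],   # Car 1
    [3, 1.9],   # Car 2
    [5, 2.0],   # Car 3
])

pca = PCA(n_components=1)   # reduce from 2 dimensions to 1
y = pca.fit_transform(X)    # one scalar value per car

print(y.ravel())
```

The exact output values depend on the data and the method; the point is that each car is now described by a single number instead of two.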

 

Further reading for interested readers: