Like many people, I am interested in machine learning. I have taken a class on the topic and have been reading some papers. I would like to find out what makes a problem difficult to solve with machine learning. Ideally, I want to learn how the complexity of a problem, with respect to machine learning, can be quantified or expressed.

Obviously, if a pattern is very noisy, one can look at the update rules of different algorithms and observe that a particular machine learning algorithm updates itself in the wrong direction because of a noisy label, but that is qualitative arguing rather than analytical or quantifiable reasoning.

So, how can the complexity of a problem or pattern be quantified to reflect the difficulty a machine learning algorithm faces? Perhaps something from information theory applies; I really do not have an idea.

Answer (by amit, accepted):

In the theory of machine learning, the VC dimension of the domain (together with the model class) is usually used to quantify how hard it is to learn.

A domain is said to have VC dimension k if there is a set of k samples such that, regardless of their labels, the suggested model can shatter them (split them perfectly using some configuration of the model).

The Wikipedia page offers an example with the 2D plane as the domain and a linear separator as the model:
[figures: three example labelings of 3 points, each split by a line]

The above demonstrates that there is a configuration of 3 points in 2D such that one can fit a linear separator to split them, whatever the labels are. However, for every 4 points in 2D, there is some assignment of labels such that a linear separator cannot split them:
[figure: a labeling of 4 points that no line can split]

Thus, the VC dimension of 2D space with a linear separator is 3.
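
To make the shattering argument concrete, here is a minimal sketch (my own illustration, not part of the original answer) that brute-forces every labeling of a small point set and checks linear separability with a linear program; it assumes NumPy and SciPy (scipy.optimize.linprog) are available:

    # Brute-force check of whether a 2D point set can be shattered by a linear
    # separator: for every +/-1 labeling, test feasibility of y_i*(w.x_i + b) >= 1.
    import itertools
    import numpy as np
    from scipy.optimize import linprog

    def linearly_separable(points, labels):
        """Feasibility of y_i * (w . x_i + b) >= 1 for all i, via a linear program."""
        X = np.asarray(points, dtype=float)
        y = np.asarray(labels, dtype=float)
        n, d = X.shape
        # Variables: w (d entries) and b. Constraint rows: -y_i*(w.x_i + b) <= -1.
        A_ub = -y[:, None] * np.hstack([X, np.ones((n, 1))])
        b_ub = -np.ones(n)
        res = linprog(c=np.zeros(d + 1), A_ub=A_ub, b_ub=b_ub,
                      bounds=[(None, None)] * (d + 1))
        return res.status == 0  # status 0 means a feasible (optimal) point was found

    def shattered(points):
        """True if every +/-1 labeling of the points is linearly separable."""
        n = len(points)
        return all(linearly_separable(points, labels)
                   for labels in itertools.product([-1.0, 1.0], repeat=n))

    # Three non-collinear points can be shattered ...
    print(shattered([(0, 0), (1, 0), (0, 1)]))          # True
    # ... but four points cannot (the XOR-style labeling fails).
    print(shattered([(0, 0), (1, 1), (1, 0), (0, 1)]))  # False

Running it prints True for the three non-collinear points and False for the four corner points, matching the VC dimension of 3 stated above.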

Also, if the VC dimension of a domain and model is infinite, the problem is said to be not learnable.
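
To connect the VC dimension to a quantitative measure of difficulty, here is a hedged sketch (a standard bound from PAC learning theory, not something stated in the answer itself) of how sample complexity grows with the VC dimension d:

    % A classic (not the tightest) upper bound on the number of samples m needed
    % to PAC-learn a class of VC dimension d in the realizable case, with error
    % at most \epsilon and failure probability at most \delta:
    m \;=\; O\!\left(\frac{d \log(1/\epsilon) + \log(1/\delta)}{\epsilon}\right)
    % If d is infinite, no finite m works for all distributions, which is the
    % sense in which the problem is "not learnable".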

If you have a strong enough mathematical background and are interested in the theory of machine learning, you can try following Amnon Shashua's lecture about PAC learning.