The end of my second year as a Ph.D. student is getting close, and I must start thinking about a dissertation topic. At least I already have some ideas about the areas of my dissertation: databases, data mining, and modern statistics and learning theory. The problem of learning has drawn my interest ever since I discovered there was a problem of learning, that is, in philosophy class in high school. Being the grandchild of a philosopher, I just could not resist, and I started wondering about knowledge and how we acquire it. Still, I have to admit I have not read much of what the great thinkers of the past and the present had to say on the matter.
Anyway, the following is a passage from the Preface to the First Edition of Vladimir N. Vapnik’s The Nature of Statistical Learning Theory, 2nd Ed.
[…] during the last few years at different computer science conferences, I heard reiteration of the following claim:
Complex theories do not work; simple algorithms do.
[…] I would like to demonstrate that in this area of science a good old principle is valid:
Nothing is more practical than a good theory.
Vapnik does not say (at least in this Preface) whether a “good” theory could also be simple or actually has to be complex, nor whether the complexity of a theory has any bearing on its being “good”.
Still, I like this passage. It is bold, effective, and probably true, especially the part about computer scientists’ point of view.
Posted from Providence, Rhode Island, United States.