Even though the web is supposedly already at version 2.0, most sites still behave more or less the same for everyone. A famous exception is Facebook, which uses its vast store of user data to, for instance, suggest whom you might like to befriend. Wikipedia, on the other hand, behaves the same for everyone, regardless of who is watching. Wikipedia and Facebook are both considered prime examples of Web 2.0. So what’s the difference?
Facebook is what’s called an ‘Adaptive System’: a system that automatically changes its behavior depending on environmental factors and history. Adaptive Systems started out in the educational sector, where they would adapt their content to best suit the goals of the learner. By focusing on things the learner had not yet mastered, the learning process could be made more efficient. Wouldn’t it be great if Wikipedia could gather and summarize articles tailored to your question? Some websites, such as WolframAlpha, have tried to realize this goal and can already answer simple questions like, “How old is Barack Obama?”
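The idea behind those educational Adaptive Systems can be sketched in a few lines. This is a toy illustration, not how any particular system works: it assumes we keep a mastery score per topic, always present the least-mastered topic, and nudge the score after each answer (the topics, scores, and step size are all invented for the example).

```python
def next_topic(mastery):
    """Pick the topic the learner has mastered least."""
    return min(mastery, key=mastery.get)

def update_mastery(mastery, topic, correct, step=0.2):
    """Nudge the score up after a correct answer, down otherwise,
    keeping it within [0, 1]."""
    delta = step if correct else -step
    mastery[topic] = min(1.0, max(0.0, mastery[topic] + delta))

# Hypothetical learner profile: scores in [0, 1] per topic.
mastery = {"fractions": 0.9, "algebra": 0.3, "geometry": 0.6}

topic = next_topic(mastery)        # "algebra" -- the weakest topic
update_mastery(mastery, topic, correct=True)
```

The efficiency gain the paragraph describes comes entirely from `next_topic`: time is spent where the model says it helps most, rather than spread evenly over all material.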
Today’s Adaptive Systems use rather simple rules to create a model of the user and adapt the content presented to them. Our interaction with machines and with the web could become much more personal. One reason this hasn’t happened yet is that it is hard to automatically create a good model of an individual. The second is that if you get the model wrong, it quickly worsens the user’s experience. I find it more frustrating when WolframAlpha gets a simple question wrong than having to search a bit longer on Wikipedia.
My expectation is that as our understanding of the human mind progresses, so will the quality of adaptive systems. It has been estimated that somewhere between 2030 and 2040, computers will match the computing power of a human brain. At that point they could well be as good at adapting as we humans are.