The future of computers
October 1, 2010

Earlier this week, Research in Motion, the maker of BlackBerry, announced that it would soon be making a tablet computer to go head-to-head with Apple's iPad. Given the recent surge in smaller devices like netbooks, tablet computers and e-book readers, Deutsche Welle spoke with Stefan Jaehnichen, the head of the Fraunhofer Institute for Computer Architecture and Software Technology (FIRST) in Berlin. Since 1998, Jaehnichen has also been president of the German Informatics Society (GI).
Deutsche Welle: Mr. Jaehnichen, according to your provocative theory, the computer as we know it today won't exist tomorrow. Why do you say that?
Stefan Jaehnichen: I think that the computer won't disappear from our everyday lives, but it will get smaller and less visible. We will no longer see the computer as a device because it's incorporated everywhere – in all of the things that we use. The iPad is a good example: You really don't see the iPad as a computer; it's more of an everyday object.
You use it like a washing machine. You get on the Internet with it and order five sets of shirts without thinking of it as a computer – also because you don't have to program it – you use it for small things. You don't care where the data comes from – whether it reaches you via the Internet or it's right in front of you. It's no longer the computer sitting in the corner that takes extra effort to use.
Does that mean the objects around us will be "intelligent" in the future?
I'd be careful with the term "intelligence." But there will certainly be more computers integrated into objects – perhaps special chips that, for instance, could tell you what the refrigerator is doing. I think we'll be linking things together more and more, and that means we'll have a lot of convenient functions that we might even enjoy.
What kind of functions do you have in mind?
For example, if I'm at home and my favorite music cues up, and I don't have to worry about picking a radio station. It's just a bit of a relief. Or I pick up a certain object and use it to turn up the heat, just by pointing it that way. I don't have to run over and fiddle with the valve; instead, I have something in my hand that I turn virtually, and the heat responds.
So what might change in the future is the way we tell our computers what to do?
Absolutely right. Again, that relates to the interfaces I have for all these everyday items, which are actually computers. There are plenty of ideas still to come there – and there's a lot yet to happen. I can also imagine that we'll need a way for the end-user to do very simple programming – for instance, to combine individual service functions. You can do that today, of course, with things like mobile phones, but it could get a little more complex in the future. So you have to get that to a point that allows regular users to configure these service functions a bit. That's one of the things that needs to happen in the coming years.
And that programming can happen through things like spoken commands?
I still can't say how exactly that will happen. (The programming) will probably use words; if our voice processing systems improve a little, then you won't need keyboards anymore. But many people are already quick enough with the keypad.
What other possibilities are there for giving a computer commands?
That's another very visionary question. I'll give you another example of applications that no one has thought of yet: There was a pretty spectacular development at the Fraunhofer Institute many years ago. We were among the first to present a so-called "brain computer interface." We tried to measure brain signals and interpret them. It's not reading thoughts – I have to reject that notion – rather, it's simply the interpretation of certain patterns.
If you can tell the difference between these patterns – for instance, between imagining the movement of the left arm and of the right arm – then I have one "bit" of information: "this one" or "that one." And as a result, if someone is thinking about his right hand, the mouse arrow moves right. And when he thinks about his left hand, it goes left. That's also the basis for an interface between user and computer that no one has thought of yet. In that respect, I don't want to reveal too much of my ideas for the future.
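The one-bit interface Jaehnichen describes can be illustrated with a minimal sketch. Everything here is hypothetical: the `classify` function stands in for the real EEG signal processing a brain-computer interface would perform, and the "signal pattern" is just a pair of made-up activity values for two channel groups.

```python
# Illustrative sketch only: decoding a single "bit" from a brain-signal
# pattern and using it to move a cursor left or right, as in the
# brain-computer interface described in the interview.

def classify(signal_pattern):
    # Stand-in for a real pattern classifier: compares the mean
    # activity of two hypothetical channel groups.
    left_activity, right_activity = signal_pattern
    return "right" if right_activity > left_activity else "left"

def step_cursor(x, signal_pattern, step=1):
    """Move the cursor position one step based on the decoded bit."""
    if classify(signal_pattern) == "right":
        return x + step
    return x - step

x = 0
x = step_cursor(x, (0.2, 0.9))  # "right-hand" pattern: cursor moves right
x = step_cursor(x, (0.8, 0.1))  # "left-hand" pattern: cursor moves left
print(x)  # back at 0
```

The point of the sketch is that the interface needs only a reliable binary distinction between two patterns, not any reading of thoughts.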
But if every movement elicits a response from the computer, wouldn't errors become more frequent?
That depends on the precision, of course. It also depends on how precisely I can control it as the user. I'll give another example of a vision: better support for surgeons in the medical sector by trying to understand their facial expressions, their gestures (and) their movements. As a result, very specific things would load onto a screen – things the surgeon wants to see at that moment – because he changes the angle of his gaze, meaning he wants to see something in more detail. But I think that people have to be in control. If I allow the computer to make decisions on its own, there's more uncertainty and it gets very risky.
Do you feel that computers shouldn't be incorporated into certain aspects of life, even in 2020 or 2025?
Everyone wants to live his private life the way he likes. I'm still not sure whether I'll prefer to keep reading a normal book rather than a book on an iPad or something like that. I just don't know yet. Certainly people's needs are different. And we shouldn't let ourselves get too caught up in technology – we should utilize it in places where it's really useful – and maybe have some fun with it, too.
Interview: Richard A. Fuchs (arp)
Editor: Cyrus Farivar