'Visionary' is one of the least favourite words of Linus Torvalds, creator of Linux.
His license plate may read 'King of the Geeks', but ahead of finding out that he had jointly won Finland's Millennium Technology Prize (and the accompanying €600,000 cheque), described as the Nobel Prize of tech, he is careful to put his own achievements in context.
"I don't see myself as a visionary at all," he told the Huffington Post. "I see myself as a technical person who chose a great project and a great way of doing that project."
In 1991 Torvalds started work on the kernel - the technical core - of what became the open-source Linux operating system, on his own, as a sort of hobby.
Twenty years later and he's still working on it - leading the technical development and managing its upgrade path - and Linux has never been more successful.
If you've never used one of the many versions of Linux at home, you might think it's a niche concern - a sort of nerdier Mac OS X. But you could not be more wrong. Linux runs on 91% of the world's top 500 fastest supercomputers and a huge number of web servers, and forms the base of the Android mobile OS, which by some counts is the most popular smartphone OS in the world.
In a rare interview with the Huffington Post UK, Torvalds told us what keeps him working, what the future holds for computing and how he sees his own role in its history.
But first we made the mistake of using the 'v' word...
Q. The award does mean that journalists - like me - will start applying the word visionary to you more and more…
Oh please don't.
Q. I was going to say, that's something you've said before doesn't apply to you.
Yes I don't see myself as a visionary at all, I see myself as a technical person who chose a great project and a great way of doing that project. But yes - you clearly know that I despise, or maybe not despise, but I don't think…
Vision is sometimes over-rated. And I am strongly in the camp that it is 99% perspiration and 1% inspiration.
Q. Do you think the media's - and people's - search for a visionary comes from a preference to believe invention is the work of one person, in a lab somewhere, when actually the history of invention is a hugely collaborative process?
Right. Especially in the media you do want to make a story about where you have a single focal point, and where you had somebody who came up with a revolutionary idea, and it turns out that is very seldom the case, even for the big historical people, they did not live in a vacuum and they did not do it on their own. It's just that for the story you want to condense it into a simpler model where the lone genius came along and fixed the world, and that's not how it works in reality.
Q. Do you worry about getting to the point in your life where you start being given 'lifetime achievement awards'?
I don't think that's a big worry for me - and I actually think the Technology Prize here is supposed to be not just for a lifetime's achievement. It's supposed to be a continuing achievement award - and I feel I am still making a difference, and I also think that Linux development seems to be, if anything, picking up speed as more people and companies get involved. So I don't have that feeling at all where, 'ok, I did it', and now I'm resting on my laurels.
Q. You're still very much involved in the Linux project, but how would I explain to people what you do, day-to-day?
It really doesn't look like much to the outside world because I spend all my time in front of my computer in my office reading email. And basically I'm a technical lead: I apply patches others have written, and even more so I merge the work of other maintainers, so I'm kind of a central point for the development process, but I don't write much code myself. I'm a technical manager but I don't have to take care of people. I only have to worry about technology itself.
Q. If you suddenly quit, or weren't around, how would that impact the Linux project?
It would be a gap… But we already have a very strong development community and it's fairly robust. It's not like I'm at the top of the pyramid, it's more like there's a network and I happen to be connected more centrally than most. We have a number of fairly central developers. I can take a week off - or even two though I don't remember when that last happened - and I just tell people, 'OK, I'll be gone, I hope nothing bad happens but if it does you're in charge'. There are two or three people who could easily take over. Or maybe I shouldn't use the word easily because that could give them ideas.
Q. Do you find it easy to switch off or go on holiday?
No. I can't.
Q. When you do, can you switch off your email?
It happens but I get bored if I don't do my work. The only time it happens is if - I try to take off for at least a week a year and just do scuba diving. Then I'll do that.
Q. Where do you go?
I tend to prefer warm water. I do cold water diving in the dry suit too, but usually it ends up being Hawaii as it's easy to get to with the family… But I have to say I think some of the best scuba diving I've ever done is Okinawa, which is not famous as a scuba diving destination.
The thing I love about diving is the flowing feeling. I like a sport where the whole point is to move as little as humanly possible so your air supply will last longer. That's my kind of sport. Where the amount of effort spent is absolutely minimal. It might not be the most healthy sport, but it's what I do.
Q. Going back to Linux, you started the kernel almost as a hobby. Is it more difficult to get into computing that way - particularly as a young person - now?
I think it is. But I don't think it's because computing is more complex, it's that the bar is set so much higher. When I started computing it was like, you could write your own game. And maybe the games you wrote weren't all that great, but the commercial games weren't all that great either.
What happens now if you get into computing and you start doing your own thing, there is so much pre-existing commercial stuff which is so clearly a whole different level, that I suspect it's harder to get involved. Just because you feel that sense of 'I can't ever get there'.
When I got started you could get a book, type in a program that was in the book and make something that wasn't all that different from the commercial programs you could buy.
On the other hand there are better tools around too.
Q. But is it also easier now to code an app for iOS or Android which is simpler and not as complex?
Well you're probably right in that cell phone programming is much closer to what I grew up with, because there it is slightly more limited and the tools are better, so it's somewhat easier to get started. Maybe you lose some, you win some.
Q. Is there more that can be done to bridge this gap in education?
So, I may be biased, because my own experience has been that it was something I was just deeply interested in, despite my grade school not having computers. I got into it because of my own interests. But I think the best thing you can do from an education standpoint is not try to make everybody learn how to program but make it available as an option for people who seem to have an interest.
The Raspberry Pi and other projects are great, not because I think everybody should learn to program but because they might make it available to people who might not otherwise have had the opportunity to tinker with computers and get into it that way, by accident.
Q. There is an ongoing meme if you like of 'everyone should learn to code'...
I personally disagree with that. I think it's good to know how a computer works, but at the same time expecting everybody to program is like expecting everybody to sew their own clothes. Not everybody wants to do that. And I don't think it's something everybody should do.
It's not like reading or writing where you have to be able to do it because you won't be a productive member of society if you can't.
Q. Do you have any opinion on what the next major stage in computing may be - or what might create the need to tear up Linux and start from scratch?
I don't foresee the need to start from scratch. If anything the next stage will not be about the kernel, it will be about building on top of it. We'll certainly make changes but I think we usually have good reason for doing things the way we are doing them. So I don't see a complete re-architecting.
That said, the thing I'm personally interested in is how I believe that within 10 to 20 years we'll hit a completely new point in computers, which is when Moore's Law really stops working. What happens when we hit the wall of physics? When suddenly we can't rely on computers two years from now being twice as powerful as computers today?
I think that will change the business of technology in a big way, and I think that will directly or indirectly impact what we have to do as well. But it's still several years away; I just think people are seeing it looming.
Q. So is it possible that in a couple of hundred years your work will still be in a kernel somewhere powering something - even if it's just a fossilised layer?
I don't know. I don't even know if I want to say 'I hope so' because that sounds a bit… scary. Sad, really. But I think computing will change over that timeframe - people will do things differently. And at some point we'll probably have to come up with new ways of doing things. Artificial intelligence might actually take off.
But at the same time Unix has been around for over 30 years and Linux has been around for over 20, and we do the things we do for fairly good reasons.
So it's not entirely impossible that in 200 years we'll still have operating systems, it's just by then hopefully no one will care because they'll be working on something new and exciting.