Computers: It's Time To Start Over


Computer scientist Robert Watson, putting security first, wants to design with a clean slate
Steven Cherry: Hi, this is Steven Cherry for IEEE Spectrum's Techwise Conversations. If you think about it, it's weird. Everything about computer security has changed in the past 20 years, but computers themselves haven't. It's the world around them that has. An article to be published in the February 2013 issue of Communications of the ACM sums up the situation pretty succinctly: "The role of operating system security has shifted from protecting multiple users from each other toward protecting a single user from untrustworthy applications. Embedded devices, mobile phones, and tablets are a point of confluence: The interests of many different parties must be mediated with the help of operating systems that were designed for another place and time." The author of that article is Robert Watson. He advocates taking a fresh start to computing, what he calls a clean slate. He's a senior research associate in the Security Research Group at the University of Cambridge, and a research fellow at St John's College, also at Cambridge. He's also a member of the board of directors of the FreeBSD Foundation, and he's my guest today by phone. Robert, welcome to the podcast.

Robert Watson: Hi, Steven. It's great to be with you.

Steven Cherry: Robert, computer security meant something very different before the Internet, and in your view, we aren't winning the war. What's changed?

Robert Watson: Right, I think that's an excellent question, and I think we have to see this in a historic context. In the 1970s and 1980s, the Internet was this brave new world largely populated by academic researchers. It was used by the U.S. Department of Defense, it was used by U.S. corporations, but it was a very small world, and today we put everyone and their grandmother on the Internet. Certainly the systems that we designed for those research environments, to try and solve really fundamental problems in communications, weren't designed to resist adversaries. And when we talk about adversaries we have to be careful, but I think it's fair to say that there were very poor incentives from the perspective of the end user. As we moved to banking and purchasing online, we produced a target, and that target didn't exist in the 1990s. It does exist today.

Steven Cherry: Your research is focused on the operating system. But how much of computer security is built into the operating system currently?

Robert Watson: We've always taken the view that operating system security was really central to how applications themselves experience security. In historic systems, large multiuser computer systems, we had these central servers or central mainframes, with lots of end users on individual terminals. The role of the OS was to help separate these users from each other, to prevent accidents, perhaps to control the flow of information: you didn't want trade secrets leaking from one account on a system to another one. And when we had large time-sharing systems, we were forced to share computers among many different users. Operating systems have historically provided something called access control, which allows users to say, "This file can't be accessed by that user." This is a very powerful primitive. It allows us to structure the work we do into groups that interact with each other, and users are at their own discretion to decide what they're going to share and what they won't.

So the observation we make on these new end-user systems like phones is that what we're trying to control is very different. The phone is a place where lots of different applications meet. I'm downloading software off the Internet, and this is something we've always encouraged users to be very cautious about. We said, "Don't just download random programs off the Internet. You never know where they came from; you have no information on the provenance of the software." And on phones today, we encourage users to download things all the time. So what has changed? Well, we've deployed something called sandboxing inside these phones, so that every application you download runs inside its own sandbox. That is a very different use of security, and it is provided by the operating system, so it's still a function of the operating system. The phone is trying to mediate between these applications, to prevent them from doing what people rather vividly describe as "bricking" the phone. So you have integrity guarantees that you want: you don't want to damage the operation of the phone. But you also don't want information to spread between applications in ways that you don't want.

Steven Cherry: Now, let's talk about Clean Slate. This is research you're conducting for the Department of Defense in the U.S., along with noted computer scientist Peter Neumann. Neumann was recently profiled in The New York Times, and he was quoted as saying that the only workable and complete solution to the computer security crisis is to study the past half-century's research, cherry-pick the best ideas, and then build something new from the bottom up. What does that mean?

Robert Watson: That's a great question, and it is an interesting problem. The market is controlled by what people are willing to pay for a product, and one of the things we know about the computer industry is that it's very driven by this concept of time to market. You want to get things to the consumer as soon as possible. You don't do everything 100 percent right; you do it 90 percent right or 70 percent right, because you can always issue updates later or, once you're doing a bit better in the marketplace, replace the parts, and your second-generation users will expect something a little bit better than what we call early adopters, who are willing to take risks as they adopt technology. So there's a cycle there that means we're willing to put things out that aren't quite ready.
So when we look at algorithms that search for desired values in some large space, we have a term called hill climbing. The idea of hill climbing is that wherever you are, you look around your set of strategic choices: do you adjust this parameter, do you adjust that parameter? You pick the one that seems to take you closest to the goal you're trying to reach, you repeat this process over time, and eventually you get to the top of the hill. So there's a risk in this strategy. It's not a bad strategy, and it does get you to the top of a hill, but it might get you to the top of the wrong hill.
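A minimal sketch of the hill-climbing idea Watson describes (illustrative only; the toy objective function and step size are arbitrary choices, not anything from the Clean Slate work):

```c
/*
 * Minimal hill climbing: from the current point, try a small step in each
 * direction and move to whichever neighbour improves the objective; stop
 * when neither does.  With a multi-peaked objective, the peak you reach
 * depends on where you start -- Watson's "top of the wrong hill".
 */
#include <stdio.h>

/* Toy objective with two local maxima of different heights. */
static double f(double x) {
    return -(x * x - 1.0) * (x * x - 1.0) + 0.3 * x;
}

static double hill_climb(double x, double step) {
    for (;;) {
        double left = f(x - step), here = f(x), right = f(x + step);
        if (left <= here && right <= here)
            return x;                      /* no neighbour is better: a local peak */
        x = (left > right) ? x - step : x + step;
    }
}

int main(void) {
    /* Starting on different sides of the valley lands on different hills. */
    printf("start at -2: peak near x = %.2f\n", hill_climb(-2.0, 0.01));
    printf("start at +2: peak near x = %.2f\n", hill_climb(+2.0, 0.01));
    return 0;
}
```

Run from the left, the search settles on the lower peak; run from the right, on the higher one. Neither run knows the other peak exists, which is the risk Watson is pointing at.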

So what the Clean Slate approach advocates is not throwing the whole world away, but instead taking a step back and asking, have we been chasing the wrong goals all along? Or have we made the right choice at every given moment, given where we were, but ended up at the top of the wrong hill? That's really what it's all about. Peter talks about a crisis, and I think it is a crisis. We can see what is effectively an arms race between the people building systems and the people who are attacking systems on a daily basis. Every time you get a critical security update from your vendor or a new antivirus update (these things happen daily or weekly), it reflects the discovery and exploitation of vulnerabilities in the software that we rely on to do our jobs. So we're clearly, as the defenders, at something of a disadvantage. There's an asymmetric relationship, as we like to say: the attacker has to find just one flaw in order to gain control of our systems, and we, as defenders, have to close all flaws. We must make no mistakes, and we cannot build systems that way; it's just not a reliable way of doing it. It doesn't solve the problem. Antivirus is fundamentally responsive. It's about detecting that somebody's broken into your machine and trying to clean up the mess that's been left behind by poorly crafted malware that can't defend itself against a knowledgeable adversary. It presupposes that they've gotten in, that they've gotten access to your data, that they could have done anything they want with your computer, and it's the wrong way to think about it. That's not to say we shouldn't use antivirus in the meantime, but it can't be the long-term answer, right? It means that somebody else has already succeeded in their goal.

Steven Cherry: Yeah, I guess what you want to do is compartmentalize our software, and I guess the New York Times article talked about software that shape-shifts to elude would-be attackers. How would that work?

Robert Watson: We could try to interfere with the mechanisms used to exploit vulnerabilities. A common past exploit mechanism is something called a buffer overflow attack. The vulnerability is that the bounds are calculated incorrectly on a buffer inside the software, and you overflow the buffer by sending more data than the original software author expected. As you overflow the buffer, you manage to inject some code, or insert a new program, that will get executed when the function that you're attacking returns. This allows the adversary to take control of your machine. So we could eliminate the bug that left the buffer overflow, but imagine for a moment that we're unable to do that. Well, we could interfere with the way the buffer overflow exploit works: we could prevent it from successfully getting code into execution. This is something we try to do. Many contemporary systems deploy mitigation techniques; it's hard to find an operating system that doesn't. If you use Windows, or you use iOS, or you use Mac OS X, they all deploy lots of mitigation techniques that attack exploit techniques. The one that we're particularly interested in is one called compartmentalization. The principle is fairly straightforward: we take a large piece of software, like a Web browser, and we begin to break it into pieces, and we run every one of those pieces in something called a sandbox.
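Watson defines the sandbox next; first, a deliberately vulnerable sketch of the kind of stack buffer overflow he has just described, alongside a bounds-checked version (the function name and the 64-byte buffer are hypothetical, for illustration only):

```c
/*
 * Classic stack buffer overflow: the input is copied into a fixed-size
 * stack buffer without checking its length.  Input longer than 64 bytes
 * overwrites adjacent stack memory, including the saved return address,
 * which is how injected bytes end up running when the function returns.
 * Do not use this pattern in real code.
 */
#include <string.h>

void handle_request(const char *input) {
    char buf[64];
    strcpy(buf, input);            /* BUG: copies until the NUL, however long */
    /* ... parse buf ... */
}

/* The fix is to honour the buffer's bounds. */
void handle_request_fixed(const char *input) {
    char buf[64];
    strncpy(buf, input, sizeof(buf) - 1);
    buf[sizeof(buf) - 1] = '\0';   /* always NUL-terminate */
    /* ... parse buf ... */
}

int main(void) {
    handle_request_fixed("GET /index.html");   /* safe for any input length */
    return 0;
}
```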
A sandbox is a container, if you will, and the software in the sandbox is only allowed to do certain things with respect to the system that runs outside the sandbox. A nice example of this is in the Chrome Web browser. In Chrome, every tab is rendered inside a separate sandbox, and the principle is that if a vulnerability is exploited by a particular Web page, it's not able to interfere with the contents of other Web pages in the same Web browser.
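To make "only allowed to do certain things" concrete, here is a minimal sandboxing sketch using FreeBSD's Capsicum capability mode, one OS facility of this kind. The interview does not name a specific mechanism, so treat this purely as an illustrative assumption:

```c
/*
 * Capsicum sketch: acquire the one resource this compartment needs, then
 * enter capability mode.  Afterwards the process keeps the descriptors it
 * already holds but can no longer name new global resources (open files by
 * path, create sockets, and so on).
 */
#include <sys/capsicum.h>
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

int main(void) {
    int fd = open("/etc/passwd", O_RDONLY);   /* acquired before sandboxing */
    if (fd == -1) { perror("open"); return 1; }

    if (cap_enter() != 0) {                   /* enter capability mode */
        perror("cap_enter");
        return 1;
    }

    /* Allowed: using the descriptor we already hold. */
    char buf[256];
    ssize_t n = read(fd, buf, sizeof(buf));
    if (n > 0)
        write(STDOUT_FILENO, buf, (size_t)n);

    /* Denied: naming a new global resource now fails with ECAPMODE. */
    if (open("/etc/group", O_RDONLY) == -1)
        perror("open after cap_enter");

    close(fd);
    return 0;
}
```

Other mechanisms (Linux seccomp, per-process sandboxes like Chrome's) follow the same shape: hand a compartment exactly the resources its task needs, then take away its ability to acquire more.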

So originally this functionality was about robustness. What you don't want is a bug in the rendering of any one page to close all your other tabs or crash the Web browser, requiring you, in some sense, to almost reboot your computer as you start your Web sessions up again. But Google noticed that they could align these sandboxes with the robustness units they were already processing each tab in, and try to prevent undesired interference between them. So that's a rudimentary example of compartmentalization, and it does work, but there were some problems with it. What we'd really like to do, though, is align these sandboxes, or compartments, with every individual task that we're trying to accomplish and the specific rights that are needed. There's an interesting principle called the principle of least privilege, an idea first really talked about in the mid-1970s, proposed at MIT. What the principle says is that every individual piece of software should run with only the rights that it requires to execute. If we run software that way, then we can actually be successful at mitigating attacks, because when you exploit a vulnerability in a piece of software, whether it's a buffer overflow or maybe something more subtle, maybe something in the logic of the program itself where we just got the rules wrong, you now gain some rights, but you gain only the rights of that particular compartment. For example, we'd really like it not to be able to see what is going on in your online banking. It would seem natural to us as users that that should be the case, but it requires very granular sandboxing. This is part of where our Clean Slate research comes in: current computer systems were not designed to provide that granularity of sandboxing.

Steven Cherry: You've used the word fundamental a couple of times, and I think what you're advocating is really fundamental. It's in some ways changing the entire 60-year paradigm of computing, abandoning what's sometimes called the von Neumann architecture. This is a different Neumann, John von Neumann, who co-invented game theory as well as the modern computer. In that architecture, basically, we don't even put code and data in separate sandboxes. Am I right in thinking it's that fundamental, and do you think the discipline of computer science is really ready for such a fundamental change?

Robert Watson: Well, it's an interesting question. The von Neumann architecture, as you suggest, was originally described in a paper in the mid-1940s, on the heels of the success of systems like ENIAC. And what John von Neumann says is that if we store the program in the same memory that we store data in (there are a number of aspects to the architecture, but that's a central one), we gain enormous flexibility. It provides access to ideas like software compilers, which allow us to describe software at a high level and have the computer itself write the code that it's later going to run. It's a pretty fundamental change in the nature of computing. I don't want to roll back that aspect of computing, but we have to understand that many of the vulnerabilities that we suffer today are a direct consequence of that design for computers. I talked a moment ago about this idea of code injection attacks, the buffer overflow, where I, as the attacker, can send you something that exploits a bug and injects code. This is a very powerful model for an attacker, because suppose for a moment we couldn't do that.
I'd be looking for vulnerabilities that directly correspond to my goals as the attacker. So I'd have to find a logical bug that allows the leaking of information. I could probably find one, perhaps. But it's much more powerful for me to be able to send you new code that you're going to run on the target machine directly, giving me complete flexibility.
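One widely deployed mitigation aimed at exactly this injection channel is to keep data memory non-executable (often called W^X or DEP): bytes an attacker writes in as data cannot simply be run as code. A minimal sketch, assuming a POSIX system with mmap; it illustrates the general technique, not the specific Clean Slate work:

```c
/*
 * W^X sketch: map a page that is readable and writable but not executable.
 * "Injected" machine code can be written into it as data, but jumping to it
 * faults because the page lacks PROT_EXEC.
 */
#include <stdio.h>
#include <sys/mman.h>

int main(void) {
    unsigned char *page = mmap(NULL, 4096, PROT_READ | PROT_WRITE,
                               MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (page == MAP_FAILED) { perror("mmap"); return 1; }

    /* Injected code copied in as data: a single x86 "ret" instruction. */
    page[0] = 0xC3;

    /* Jumping to it would crash (SIGSEGV) because the page is not executable:
     *   void (*fn)(void) = (void (*)(void))page;
     *   fn();
     */
    printf("data page mapped without PROT_EXEC; injected bytes cannot run\n");

    munmap(page, 4096);
    return 0;
}
```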

So, yes, we want to revisit some of these ideas. I'd make the observation that the things that are really important to us, that we want to perform really well on computers, that have to scale extremely well so there can be lots and lots of them, are the things that we put in low-level hardware. The reason we do that is that they often have aspects of their execution that perform best when they're directly catered to by our processor design. A nice example of this is graphics processing. Today, every computer and every mobile device ships with something that just didn't exist in computers 10 or 15 years ago: a graphics processing unit, a GPU. You don't buy systems without them. They're the thing that makes it possible to blend different images, render animations at high speed, and have the kind of snazzy three-dimensional graphics we see on current systems; it's hard to imagine life without it. The reason that was sucked into our architecture design is that we could make it dramatically faster by supporting it directly in hardware. If we now think security is important to us, extremely important because of the costs and the consequences of getting it wrong, there's a strong argument for pulling that into hardware if it provides us with a dramatic improvement in scalability.

Steven Cherry: Well, Robert, it sounds like we're still in the early days of computing. I guess in car terms we're still in maybe the 1950s. The MacBook Pro is maybe a Studebaker Starliner, and the Air is a 1953 Corvette. And it's up to folks like you to lay the groundwork for the safe Volvos and Subarus of tomorrow. In fact, also for making our cars safe from hackers, I guess, but that's a whole other show. Thanks, and thanks for joining us today.

Robert Watson: Absolutely. I think your comparison is a good one. The computer world is still very much a fast-moving industry, and we don't know what systems will look like when we're done. I think the only mistake we could make is to think that we are done, that we have to live with the status quo that we have. There is still the opportunity to revise fundamental thinking here while maintaining some of the compatibility we want. We can still drive on the same roads, but we can change the vehicles that we drive on them. Thanks very much.

Steven Cherry: Very good. Thanks again. We've been speaking with Robert Watson about finally making computers more secure, instead of less. For IEEE Spectrum's Techwise Conversations, I'm Steven Cherry.

This interview was recorded 5 December 2012. Segment producer: Barbara Finkelstein; audio engineer: Francesco Ferorelli.

Read more Techwise Conversations or follow us on Twitter.
