Midterm Quiz

Originally we intended this quiz to be an hour long, but the TAs think that it may take you up to 3 hours. You have 4 hours. Good luck!

Please copy this quiz into a word processor and enter your response below each question. When you are finished, please upload the entire file using the form at the bottom of this page and click "submit".

If you lose your Internet service during the quiz, please email your file as an attachment to csci_e-170@ex.com.

This quiz has 7 questions, 1 bonus question, and a total of 100 points (plus 5 possible bonus points).


What is your name:

Question #1 (15 points).

This question concerns Chapter 20, "A User-Centric Privacy Space Framework" by Benjamin Brunk, in Cranor and Garfinkel.

1-a. What is Brunk's "Privacy Space?" Define the term. The privacy space is a list of programs that can be used to provide end-user privacy, or a list of features that are in those programs. The privacy space framework is a structure of the privacy space that is useful for understanding how these programs relate to each other.

If you said that the privacy space is a framework for developing software features, you lost 2 points. The space is not a framework; the framework is a structuring of the space.

1-b. Is a web browser such as Internet Explorer or Firefox an example of Privacy Software, as defined by Brunk? Internet Explorer and Firefox are not privacy software per se, but they have many privacy features. It was acceptable to say that these programs are privacy software because they have privacy features.

If you made a convincing argument as to why these programs are not privacy software, you were given full credit. Some students argued that these programs are not privacy software because they do not implement all of the privacy features that Brunk defined. But, in fact, many programs do not.

1-c. No matter how you answered the previous question, give five examples of privacy features in a web browser. For each feature, explain whether the feature contributes to awareness, prevention, detection, response, or recovery.

First Feature:

Second Feature:

Third Feature:

Fourth Feature:

Fifth Feature:

Features that we allowed included:

Features that were not allowed include:

1-d. Explain how web browser cookies violate the Code of Fair Information Practice.

Cookies do not explain to the user what information they contain and how they are being used. Cookies collect information without the consent of the user. Cookies may allow the collection of information that is not accurate. Cookies can have long expiration times, violating data retention policies. Cookies allow information to be used for purposes other than those for which it was collected.

1-e. Propose a solution for web browser cookies that would bring them into compliance with Fair Information Practice. A law could be passed requiring websites to explain what information is in the cookies and how they are used. Alternatively, browsers could do a better job of allowing users to display and visualize the information currently in cookies. Instead of controlling the receipt of cookies, browsers could be designed so that they control the release of cookies.
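To illustrate the last idea, here is a minimal sketch of a policy that controls the release of stored cookies rather than their receipt. The function, its name, and the first-party-or-allowlist rule are all hypothetical, not any real browser's implementation:

```python
from urllib.parse import urlparse

def should_send_cookie(cookie_domain, page_url, user_allowlist):
    """Decide whether to release a stored cookie with a request.

    Hypothetical policy: a cookie is sent back only when the page
    being visited belongs to the cookie's own domain (first-party),
    or the user has explicitly allowed the third-party domain.
    """
    page_host = urlparse(page_url).hostname or ""
    first_party = (page_host == cookie_domain
                   or page_host.endswith("." + cookie_domain))
    return first_party or cookie_domain in user_allowlist
```

Under this policy a tracking cookie set by a third-party ad server would be withheld on other sites unless the user opted in, which addresses the consent and secondary-use complaints above.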


Question #2. (15 points)

This question concerns Kuhn, Markus G., Anderson, Ross, "Soft Tempest: Hidden Data Transmissions Using Electromagnetic Emanations", David Aucsmith (Ed.): Information Hiding 1998, LNCS 1525, pp. 124-142, 1998.

2-1. Why were the researchers at Cambridge studying Tempest technology? (3 points)
The simple answer was "curiosity." The deeper answer was that they wanted to see if Tempest could be applied to software copyright protection, and if there was a simple software way to defeat Tempest. There was also interest due to the lack of published data on the information-carrying emanations of modern hardware. Common mistake: failing to mention one of the three main reasons.

2-2. On a typical desktop computer, what is the component most responsible for RF emanations? (2 points)
The monitor (or the cable attached to the monitor). Almost everyone got this right. One person said the keyboard; this is wrong, as keyboards give off very little RF unless energized by an external source.

2-3. Imagine that a piece of hostile software is able to take over a computer and control the Num Lock, Caps Lock, and Scroll Lock lights on a keyboard. Assume that the maximum rate at which each of these lights can be flashed is 100 times a second. Does this represent a realistic Tempest-style threat? Why or why not? (3 points)
Depending upon how you examined this, it might have been a realistic threat. Light is a type of radiation that could be picked up by a receiver. In the strictest sense, it was not a Tempest attack. However, the threat is very similar: information being leaked out of a computer by a covert channel. A simple attack would be to have someone watching from across the room or outside the building. A more Tempest-style attack would be to have a receiver monitor the keycodes going across the keyboard cable while it is energized by an external transmitter, at which point it truly is a Tempest attack. Many people said that it was not a Tempest-style attack because the lights did not generate RF. This is not true, as light is in fact an electromagnetic wave that can be detected at a distance.

2-4. Does the ability to blink keyboard lights represent any kind of threat at all? Why or why not? If it represents a threat, explain how to exploit it and calculate the average data rate. How would you protect against such an exploit? (7 points)
Many answers were accepted here depending upon the previous question.
The ability to blink lights does represent a threat, as it means that information can be leaked out. (2 points)
Data rate (2 points): three lights blinked 100 times per second is 300 bits per second, not 800 bits per second. "800 states per second" would also have been accepted.
Exploitation just requires having some way to observe the lights: either a receiver able to hear the keycodes, or a telescope to see the lights clearly and feed them to another computer. Some examples of protection (3 points):
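To make the data-rate arithmetic concrete, here is a small illustrative sketch (hypothetical encoding, not from the paper) that packs a message into successive states of the three keyboard lights and computes the resulting channel capacity:

```python
def led_states(data):
    """Yield successive (Num Lock, Caps Lock, Scroll Lock) on/off
    states, packing three bits of the message into each state change."""
    bits = []
    for byte in data:
        bits.extend((byte >> i) & 1 for i in range(7, -1, -1))  # MSB first
    while len(bits) % 3:            # pad to a multiple of 3 bits
        bits.append(0)
    for i in range(0, len(bits), 3):
        yield tuple(bits[i:i + 3])

LIGHTS = 3
STATE_CHANGES_PER_SECOND = 100
BITS_PER_SECOND = LIGHTS * STATE_CHANGES_PER_SECOND   # 3 bits/state * 100 states/s = 300 b/s
```

At 300 bits per second, the channel leaks roughly 37 eight-bit characters per second, which is more than fast enough to exfiltrate keystrokes or small keys.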


Question #3 (15 points)

This question concerns Chapter 26, "Anonymity Loves Company: Usability and the Network Effect" by Roger Dingledine and Nick Mathewson, in Cranor and Garfinkel, and Roger Dingledine's class presentation.

3-1. Briefly explain how a TOR circuit is created.

3-2. How many servers are used to create a TOR circuit? Why?

3-3. Would the user's privacy be increased if he/she could increase the number of servers? Is this setting under the user's control? Why or why not?


Question #4 (15 points)

This question concerns Chapter 23, "Privacy Analysis for the Casual User with Bugnosis" by David Martin, in Cranor and Garfinkel.

4-1. What is Bugnosis? Bugnosis is a program that monitors downloaded web pages for web bugs. When a web bug is found, Bugnosis alerts the user.

4-2. How does Bugnosis work? Bugnosis is a browser helper object that accesses the contents of the web page using Internet Explorer's Document Object Model API. A web bug is defined by Bugnosis with a series of heuristics.

4-3. Would you run Bugnosis? Sure. It's a lot of fun. (Any reasonable answer got credit here.)

A few students thought that this question was asking whether Bugnosis had to be explicitly "run," as opposed to having the program start up automatically. The answer is that the program starts up automatically.

4-4. What is the difference between Bugnosis and Privacy Bird? Whereas Bugnosis examines a web page for the existence of a web bug, Privacy Bird checks the P3P policy of a web site to determine if it is in compliance with the user's own preferences as programmed into the computer. So Privacy Bird looks at policy, while Bugnosis looks at practice.

4-5. What is a web bug anyway? A web bug is typically a 1x1 transparent gif that is downloaded from a third party web server.

4-6. Is there a web bug on this web page? Some students thought that the intent of this question was to get you to download and install Bugnosis. It was not. If you examined the HTML of this page, you would have seen this: <img src='http://www.simson.net/blank.gif' width=1 height=1>

Bugnosis didn't think that this was a web bug because www.simson.net and e170.ex.com have the same IP address. Whether having the same IP address really means that they are the same site is a question of interpretation, of course.

Students who answered "yes" and showed the HTML received full credit. Students who answered "no" and explained in detail why they thought the answer was "no" also received full credit. Students who just wrote "yes" or "no" might have lost a point, depending on the level of understanding made evident by answers to other questions.
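A Bugnosis-style check can be sketched in a few lines. This is not Bugnosis's actual rule set, just a simplified stand-in that flags 1x1 images served from a host other than the embedding page's; a fuller version would, like Bugnosis, compare resolved IP addresses rather than hostname strings:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class WebBugFinder(HTMLParser):
    """Flag <img> tags that look like web bugs: 1x1-pixel images
    fetched from a host different from the page that embeds them."""

    def __init__(self, page_host):
        super().__init__()
        self.page_host = page_host
        self.suspects = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        a = dict(attrs)
        tiny = a.get("width") == "1" and a.get("height") == "1"
        img_host = urlparse(a.get("src", "")).hostname
        third_party = img_host is not None and img_host != self.page_host
        if tiny and third_party:
            self.suspects.append(a.get("src"))

finder = WebBugFinder("e170.ex.com")
finder.feed("<img src='http://www.simson.net/blank.gif' width=1 height=1>")
# finder.suspects now contains the simson.net image; a same-IP check
# would be needed to reproduce Bugnosis's more lenient verdict here.
```

This hostname-only heuristic would answer "yes" for the image on this page, which is exactly why the same-IP interpretation question above matters.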


Question #5 (10 points)

This question concerns Chapter 21, "Five Pitfalls in the Design for Privacy" by Scott Lederer, Jason I. Hong, Anind K. Dey, and James A. Landay, in Cranor and Garfinkel, and the class experience of obtaining and using Thawte certificates.

For each of these questions, 1 point was given for explaining the pitfall, preferably in your own words. 1 point was given for giving a reasonable example of how it related to your experience with the Thawte certificates.

What are the five pitfalls? Explain each pitfall in a sentence or two, and explain how you encountered or avoided each pitfall in your attempts to acquire and use a Thawte personal certificate.

5-1. Pitfall 1: Obscuring potential information flow.
"Destroy document before reading".
Systems need to make clear the scope of the privacy implications relating to the potential for disclosure. Aspects include:

In the case of Thawte, this is for the most part clearly explained as information is gathered, and in their information and privacy practices documentation.

5-2. Pitfall 2: Obscuring actual information flow.
"Your email address, SSN, and credit card number have been given to SPAMRUS.COM. Have a nice day!"
Given the previous pitfall, what information is actually being transmitted to whom?
Thawte explicitly states most of this information in their Privacy statement. Thawte does use cookies and popups, which might be construed as falling in the pitfall in other ways, but really they are trying to follow their statement.

5-3. Pitfall 3: Emphasizing configuration over action.
"OK, I need to download the certificate, then goto a hidden menu that is 6 levels deep, then give it my Grandmother's maiden name and beat the Black Dragon with my bare hands. How is this easy to install again?"
Avoiding unnecessary hassle to create and maintain privacy. It should be reasonable for an ordinary person with a modest skill set to install and configure such a system. Users are lazy enough to use default settings, so it is important that the configuration be easy to use and/or have intelligent initial settings.
Many people had a great deal of trouble installing the certificate in their mail readers. This was a perfect opportunity to explain why it was either extremely easy or not for you to use it.

5-4. Pitfall 4: Lacking coarse-grained control.
"How do you turn the blasted thing off?"
The key is being able to turn disclosure on and off. Even a slightly more fine-grained ordinal control is considered good practice, as is having both. Extremely fine-grained control is beyond what the average user traditionally wants. Users want simple, intuitive interfaces; needless to say, this is extremely challenging.
In the case of the email readers that were used, most made it very simple to sign or encrypt mail through a button or menu. This assumes that the mail reader had such a capability at all.

5-5. Pitfall 5: Inhibiting established practice.
"This software is so frustrating. It always gets in my way."
Systems need to stay out of the way of existing social practice. People already manage their privacy through a range of established practices, some of which make sense and some of which do not. Good systems make use of the basic assumptions that an average person in a given societal context will have, to ease the learning curve and minimize inconvenience.
Tools that are useful for this are plausible deniability (such that an observer cannot determine that a lack of disclosure was intentional) and disclosing ambiguous information (fuzziness of identity or location). In terms of Thawte certificates, no actual verification occurs other than checking that the email address is valid. This means that if you were to create a false persona, it could be used to enhance privacy by being unassociated with a given identity. Encrypted email also enhances this by making the information visible only to the intended recipient. Any analysis of the email reader's ease of integration of certificates was also accepted.


Question #6 (15 points)

This is another question about certificates.

6-1. What does a personal certificate from Thawte certify?

6-2. When you use a Thawte personal certificate, who is the relying party, and what are they relying on?

6-3. What is a Certificate Practices Statement?

Imagine that a hacker is able to steal the private key that was used to sign the Amazon SSL certificate for the server https://www.amazon.com/.

6-4. Describe how the hacker could use this private key to compromise the credit-card of an Amazon customer.

6-5. Describe a second attack that the hacker could perform with this key.


Question #7 (15 points)

In the online discussion on LiveJournal, Professor Garfinkel posted an article about some software that Sony has put on its new music CDs. The software is designed to prevent users from making more than a few copies of their audio discs. The software uses rootkit technology to remain hidden.

If you missed the article, you can view it here.

7-1. Write a short (3 paragraph) essay discussing whether or not you believe that Sony's use of this software violates the principles of Fair Information Practice. If you think that the program does violate FIP, were these violations addressed by Sony's publication of a program to "uncloak" its software?

Getting the length right was worth 1 point. Explaining how FIP principles specifically applied to the case was worth 5 points. Detailed analysis of the Sony aspect was worth 5 points. Explaining the uncloaking factor was worth 3 points. Grammar, readability, and organization were worth 1 point.

Sony's software violates, and follows, a number of FIP principles, depending upon whether you went with the FTC version or the US Department of Health, Education and Welfare version from "Security and Usability." They consisted of:

In terms of the uncloaking software, this did in fact address some FIP points. The fact that it made the software visible means that Sony came closer to following the openness principle. Unfortunately, access to the software was made difficult and it did not give the user a way to remove the rootkit, indicating that for the most part FIP was still not being followed.


Bonus Question #8 (5 bonus points)

This question is worth 5 points of extra credit (even though the headline originally said 15 points; that was a typo).

It was recently reported that Windows Vista, due to be released next year, will have three significant improvements with respect to Internet Explorer's SSL implementation.

  1. SSL 2.0 will be disabled by default; SSL 3.0 and TLS 1.0 will be required.
  2. IE7 will block access to SSL-enabled websites that have certificates that are expired or that are signed by unknown CAs.
  3. The 256-bit version of AES will be supplied.
The first point is beyond the scope of what was discussed in class. However, given what you know, discuss the impact that improvements #2 and #3 are likely to have on end-user security.


Your answer:

#2 - Having IE block access to websites with expired certificates won't significantly increase security, but it will make IE less usable, since many sites allow their certificates to expire. (Certificates should be replaced in advance of their expiration, rather than when they expire.) But an expired certificate is not in itself a threat to the end user's security. Security is really only affected by a compromised private key. It may be that the expired certificate was on a hard drive that was discarded, and in this way the private key was compromised. In most cases, however, an expired certificate merely indicates that the site's management is somewhat lax in its SSL administration. (element 1)

Having the browser complain when a website is signed by an unknown CA is a different matter. The underlying principle of PKI is third-party attestation of an organization's or user's identity. If the third party is not explicitly trusted, that attestation is worthless. (element 2)
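The expiry half of IE7's new check is easy to reproduce. A minimal sketch using Python's standard library (the date string uses the format that `ssl.getpeercert()` reports; validating the signing CA is a separate, harder step handled by the TLS stack itself):

```python
import ssl
import time

def cert_expired(not_after, now=None):
    """Return True if a certificate's notAfter field is in the past.

    `not_after` uses the format that ssl.getpeercert() reports,
    e.g. "Jan 1 00:00:00 2030 GMT".
    """
    if now is None:
        now = time.time()
    return ssl.cert_time_to_seconds(not_after) < now
```

A browser applying this test at connection time would refuse the handshake (or warn) exactly as described above, regardless of whether the certificate chain itself still verifies.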

#3 - Many students didn't realize that Internet Explorer currently supports the 128-bit mode of AES. So the question was asking what security improvements users are likely to see with the additional support of the 256-bit mode of AES.

Since AES-128 offers sufficient protection against any conceivable attack for at least the next decade, increasing the strength of IE7 to AES-256 is likely to produce no practical increase in security. However, supporting this new mode probably won't hurt.

Several students wrote that IE7 was replacing the MD5 encryption algorithm with the AES-256 cipher. This is of course wrong, since MD5 is a hash function, not an encryption algorithm.