Security/Contextual Identity Project/Related Work

= Deletion =


= Psychology, Sociology =
*Carl Jung (1953). [http://en.wikipedia.org/wiki/Persona_%28psychology%29 Persona theory (need a real reference for this)]
 
Persona as mask: "One could say, with little exaggeration, that the persona is that which in reality one is not, but which oneself as well as others think one is." -- Carl Jung
 
*Erving Goffman (1959). [http://www.amazon.com/The-Presentation-Self-Everyday-Life/dp/0385094027 The Presentation of Self in Everyday Life]
Everyone's got role(s) to play: "All the world is not, of course, a stage, but the crucial ways in which it isn't are not easy to specify." -- Goffman


= Policy and privacy =
*Helen Nissenbaum (2009). [http://www.amazon.com/Privacy-Context-Technology-Integrity-Stanford/dp/0804752370 Privacy in Context: Technology, Policy, and the Integrity of Social Life.]


<blockquote>This book claims that what people really care about when they complain and protest that privacy has been violated is not the act of sharing information itself—most people understand that this is crucial to social life—but the inappropriate, improper sharing of information.
Sharing information is not a privacy violation per se, and is often desirable. It is when information is shared out of context, without regard for social norms and values, that privacy is violated.
 
Arguing that privacy concerns should not be limited solely to concern about control over personal information, Helen Nissenbaum counters that information ought to be distributed and protected according to norms governing distinct social contexts—whether it be workplace, health care, schools, or among family and friends. She warns that basic distinctions between public and private, informing many current privacy policies, in fact obscure more than they clarify. In truth, contemporary information systems should alarm us only when they function without regard for social norms and values, and thereby weaken the fabric of social life.</blockquote>


= Regret =


= Mental models of privacy =
*L. Jean Camp (2006). [http://papers.ssrn.com/sol3/papers.cfm?abstract_id=922735 Mental Models of Privacy and Security]
Roundup of different mental models (criminal, warfare, physical, medical infection, economic) that inform whether or not users think they are at risk, and what motivates their attackers.
*Cormac Herley (2009). [http://research.microsoft.com/apps/pubs/?id=80436 So Long, And No Thanks for the Externalities].
Users are not irrational for ignoring security advice; they are making a rational decision that the expected value of following that advice is negative, given the false-positive rate of warnings and the low probability that ignoring the advice will actually lead to compromise (a back-of-the-envelope sketch follows this list).
*Rich Wash, Emilee Rader (2011). [http://www.rickwash.com/papers/conference/influencing-models-nspw.html Influencing Mental Models of Security: A Research Agenda].
Mental models are simple by necessity. Having to reason about all the factors that go into a decision is not worth it, so most people satisfice. However, just because a mental model is technically incomplete or incorrect does not mean that it can't lead to a good decision.
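Herley's cost-benefit point can be made concrete with a back-of-the-envelope expected-value calculation. The sketch below uses purely hypothetical numbers (not figures from the paper) to show how a low per-year probability of compromise and a non-trivial time cost push the expected value of heeding a piece of advice below zero.

<pre>
# Back-of-the-envelope sketch of Herley's cost-benefit argument.
# All numbers below are hypothetical placeholders, not values from the paper.

def expected_net_benefit(advice_cost_hours_per_year: float,
                         p_compromise_per_year: float,
                         advice_effectiveness: float,
                         loss_hours_if_compromised: float) -> float:
    """Expected hours saved per year by following a piece of security advice,
    minus the hours spent following it; negative means following the advice
    costs more than it is expected to save."""
    expected_loss_avoided = (p_compromise_per_year
                             * advice_effectiveness
                             * loss_hours_if_compromised)
    return expected_loss_avoided - advice_cost_hours_per_year

# Hypothetical example: heeding a class of warnings costs ~5 hours/year,
# the underlying attack hits ~1% of users per year, heeding stops ~80% of
# those cases, and cleaning up after a compromise costs ~10 hours.
net = expected_net_benefit(advice_cost_hours_per_year=5.0,
                           p_compromise_per_year=0.01,
                           advice_effectiveness=0.8,
                           loss_hours_if_compromised=10.0)
print(f"Expected net benefit: {net:+.2f} hours/year")  # prints -4.92 hours/year
</pre>

Under these made-up numbers the rational choice is to ignore the advice, which is Herley's point: summed over all users, the time spent heeding advice can dwarf the harm it prevents.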


= Merging social graphs =
*Arvind Narayanan and Vitaly Shmatikov (2008), [http://arxiv.org/pdf/cs/0610105v2 Robust De-anonymization of Large Datasets (How to Break Anonymity of the Netflix Prize Dataset)]
Shows that an anonymized Netflix rating record can be linked to external, public data outside the dataset, such as public IMDb ratings that are tied to a person's real identity (a toy sketch of the linkage idea follows this list).
 
*Arvind Narayanan and Vitaly Shmatikov (2009). [http://randomwalker.info/social-networks/ De-anonymizing Social Networks]
More generalized re-identification attacks.
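The linkage idea behind these attacks can be illustrated with a toy matcher: given a few (movie, rating) observations about a target from a public source such as IMDb, score every anonymized record by how well it agrees with those observations and take the best match. The sketch below is a deliberately simplified stand-in for the paper's weighted scoring and eccentricity test; the data and names are invented for the example.

<pre>
# Toy illustration of linking an "anonymized" rating record to auxiliary data.
# This is a simplified stand-in for Narayanan & Shmatikov's algorithm; the
# data and field names are invented for the example.

from typing import Dict

# Anonymized dataset: record id -> {movie: rating}
anon_records: Dict[str, Dict[str, int]] = {
    "rec_001": {"Heat": 5, "Brazil": 4, "Memento": 5, "Shrek": 2},
    "rec_002": {"Heat": 2, "Titanic": 5, "Shrek": 4},
    "rec_003": {"Brazil": 5, "Memento": 4, "Solaris": 3},
}

# Auxiliary (public) observations about a known person, e.g. their IMDb ratings.
aux_observations: Dict[str, int] = {"Heat": 5, "Memento": 5, "Brazil": 4}

def similarity(record: Dict[str, int], aux: Dict[str, int]) -> float:
    """Score a candidate record against the auxiliary observations:
    up to +1 per overlapping movie, reduced by rating disagreement."""
    score = 0.0
    for movie, aux_rating in aux.items():
        if movie in record:
            score += 1.0 - abs(record[movie] - aux_rating) / 4.0
    return score

# Rank all anonymized records; the top match is the re-identification candidate.
ranked = sorted(anon_records.items(),
                key=lambda item: similarity(item[1], aux_observations),
                reverse=True)
best_id, best_record = ranked[0]
print(best_id, similarity(best_record, aux_observations))  # rec_001 scores 3.0
</pre>

The actual algorithm described in the paper weights rare movies more heavily than popular ones and only declares a match when the top score stands out clearly from the rest, which is what makes the attack robust to noise and partial auxiliary information.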


= Social media and privacy =
*Mary Madden, Aaron Smith (2010). [http://www.pewinternet.org/Reports/2010/Reputation-Management.aspx Reputation Management and Social Media.]
Young people self-report changing privacy settings and taking remedial actions (deleting posts, etc.) to preserve their privacy more often than older people do. Also includes some interesting statistics about pseudonym use and monitoring of digital footprints.


= Tracking =
Cookies, third-party cookies, web bugs, Flash cookies, network monitoring, fingerprinting, geolocation, and history attacks; there is probably an existing roundup of these.


= Usability and privacy =


*Pedro G. Leon, Blase Ur, Rebecca Balebako, Lorrie Faith Cranor, Richard Shay, and Yang Wang (2012). [http://www.cylab.cmu.edu/research/techreports/2011/tr_cylab11017.html Why Johnny Can’t Opt Out: A Usability Evaluation of Tools to Limit Online Behavioral Advertising]