Is preventing browser fingerprinting a lost cause?




Obligations of W3C specification authors and reviewers to preserve passive privacy properties of the user agent.

What are we talking about? You call it fingerprinting; I call it feature detection. Passive privacy: different from active/stateful tracking (cookies, HTML5 storage, Flash shared objects (FSOs), even HSTS supercookies).

Panopticlick (http://panopticlick.eff.org/): User Agent, HTTP_ACCEPT headers, plugin details, time zone, screen size, color depth, system fonts, cookies / local storage. ~60 bits for my Win 7 + IE 9 test; unique in 2.5 million.
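The slide's numbers can be sanity-checked with a little information theory: uniqueness in a population of N requires only log2(N) bits, and (assuming independent attributes) per-attribute entropies add. A quick sketch in Python; the per-attribute bit counts below are illustrative, not Panopticlick's measured values:

```python
import math

def surprisal_bits(frequency):
    """Self-information of a fingerprint observed with the given frequency."""
    return -math.log2(frequency)

# Uniqueness among Panopticlick's ~2.5 million samples needs only
# log2(2.5e6) bits; a ~60-bit fingerprint is far past that threshold.
bits_needed = math.log2(2_500_000)
print(round(bits_needed, 2))  # 21.25

# Entropy of independent attributes adds. Illustrative (made-up) values:
attributes = {"user_agent": 10.0, "plugins": 15.4, "fonts": 13.9,
              "screen": 4.8, "timezone": 3.0}
total_bits = sum(attributes.values())
print(round(total_bits, 1))  # 47.1
```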

BrowserSpy.dk probes: Accepted Filetypes, ActiveX, Adobe Reader, Ajax Support, Bandwidth, Browser Capabilities, Colors, Components, Connections, Cookies, CPU, CSS, CSS Exploit, Cursors, Date and Time, DirectX, Document, Do Not Track, .NET Framework, Email Verification, Flash, Fonts via Flash, Fonts via Java, Gears, Gecko, Geolocation, Google Chrome, Google Apps, GZip Support, HTTP Headers, HTTP Images, IP Address, Java, JavaScript, Languages, Mathematical, MathML Support, MIME Types, Mobile, Network, Objects, Object Browser, Online / Offline, OpenDNS, OpenOffice.org, Opera Browser, Operating System, Google PageRank, HTTP Password, Ping, Plugins, Plugs, Prefetch, Proxy, Personal Security Manager, QuickTime Player, RealPlayer, Resolution, Screen, Security, Shockwave, Silverlight, Sound Card, SVG, Text Formatting, File Upload, UserAgent, VBScript, WAP Device, WebKit, Web Server, Window, Windows Media Player
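What turns these individual probes into a tracking identifier is combining their results into one stable value. A minimal sketch, assuming a dictionary of probed attributes (the names and values here are invented for illustration, not BrowserSpy.dk's actual output):

```python
import hashlib
import json

def fingerprint(attrs):
    """Stable digest over probed attributes (keys sorted for determinism)."""
    canonical = json.dumps(attrs, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

probed = {
    "user_agent": "Mozilla/5.0 (Windows NT 6.1; Trident/5.0)",
    "screen": "1920x1080x24",
    "timezone_offset": -480,
    "fonts": ["Arial", "Calibri", "Consolas"],
    "plugins": ["Flash 10.3", "Silverlight 4"],
}
print(fingerprint(probed))  # same inputs always yield the same identifier
```

Because the digest is deterministic, the same browser configuration re-identifies itself on every visit with no stored state at all.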

Even deeper: http://cseweb.ucsd.edu/~hovav/papers/mbys11.html — Fingerprinting Information in JavaScript Implementations: micro-performance benchmarks.

DIFFICULT ASYMMETRIES

User Constituency vs. Spec Author Constituency: lots of them, few of us; we should take on the necessary extra work in their interest.

Ease of Circumvention vs. Ease of Securing: very hard to prevent fingerprinting; easy to find holes, widely disseminate the knowledge, or even productize such evasions, undoing the work of conscientious authors.

Ian Goldberg's Privacy Ratchet: easy to take away privacy, hard to add it back in.

Paradox of User-Controlled Privacy Technology Fine-grained settings or incomplete tools used by a limited population can make users of these settings and tools easier to track.

Fingerprint bandwidth / anonymity set size. OS: version, sub-version, patch level, personalization (e.g. fonts). Browser: version, personalization, installed plugins, installed plugin versions, plugin personalization (e.g. NoScript whitelist).
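How each disclosed attribute narrows the anonymity set can be seen in a toy simulation (the user population and attribute values are invented for illustration):

```python
# Each revealed attribute partitions the population; the target's
# anonymity set is whoever shares every revealed value.
users = [
    {"os": "Win7", "browser": "IE9",  "plugins": "flash"},
    {"os": "Win7", "browser": "IE9",  "plugins": "none"},
    {"os": "Win7", "browser": "FF36", "plugins": "flash"},
    {"os": "OSX",  "browser": "FF36", "plugins": "flash"},
]

target = users[0]
candidates = users
for attr in ("os", "browser", "plugins"):
    candidates = [u for u in candidates if u[attr] == target[attr]]
    print(attr, len(candidates))
# Anonymity set shrinks 4 -> 3 -> 2 -> 1: three coarse attributes
# already single out the target in this tiny population.
```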

Where to draw the line? There may be things whose fingerprinting we cannot prevent: browser make and version, OS make and version. We advertise them deliberately, in the UA string and in DOM APIs. Millions of lines of code and thousands of API points, and you get to run arbitrary code inside them and communicate to the world. There will ALWAYS be detectable differences, even if users change their UA string.

But even these public characteristics can be differentiators that lead to unique identification when combined with other information, like IP address.

Where is the line today? Technology that has a privacy impact due to state maintenance typically addresses this in the spec, and UAs typically address it in implementations (though often not the case for private plugins). That features can be used as part of fingerprinting a browser version is not typically addressed as a privacy issue.

Threat Models. Who are we trying to defend against?
Casual commercial parties: individual sites writing their own code; limited incentive to invest in unique research; satisfied with an opt-out regime that lets them track many/most users.
Dedicated commercial parties: selling a tracking system; commercial motivation to invest in an arms race.
State-level actors: may make much larger research investments than would be commercially justified, and keep such research private.
What are we willing to promise? Consequences of failure are VERY different in these scenarios.

Realistic Threat Profiles Needed For Browser as a Whole. We saw this in privacy objections to CSP: a bad-faith actor could abuse CSP's reporting to phone home, so opt-in was requested. But, if acting in bad faith, they could do the same with <img>, <form>, <a href>, script, etc. No feeling in the WG that closing one of a hundred holes would actually move the privacy needle. A bad guy who will abuse only one obscure feature and ignore dozens of equivalents isn't a credible threat.

Norms matter. If we expect people to do this work, we need to convince them that everyone else is doing it too. And we need to take on the back catalog to make it all meaningful. Can we do this? Is it a productive way to spend our efforts? To what degree? (Back to Lampson...)

Related work in IETF: http://wiki.tools.ietf.org/html/draft-iab-privacy-considerations-04. Good layout of model, terms, threats, etc. "Fingerprinting. In many cases the specific ordering and/or occurrences of information elements in a protocol allow users, devices, or software using the protocol to be fingerprinted. Is this protocol vulnerable to fingerprinting? If so, how? Can it be designed to reduce or eliminate the vulnerability? If not, why not?"

Context, Context, Context. Eliminating fingerprinting and other privacy leaks may be an achievable goal when designing a new protocol with confined boundaries. But is anything we do at W3C ever really new? We are almost always operating in the context of "the browser": a high-bandwidth, high-functionality application execution environment.

Browser Fingerprinting ~= Covert Channel. We have known since Lampson's "A Note on the Confinement Problem" (1973) that such channels cannot be eliminated; the best we can do is limit their bandwidth. We know it is very difficult and expensive to do even this, even when it is part of the initial design criteria.
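The "limit their bandwidth" framing can be made concrete: a channel that leaks b bits per observation at r observations per second leaks b*r bits per second. A back-of-envelope sketch with purely illustrative numbers:

```python
# Even a heavily rate-limited covert channel leaks identifying bits
# quickly. All three numbers below are illustrative assumptions.
bits_per_probe = 2        # e.g. a coarse timing measurement
probes_per_second = 10    # script-controlled polling rate
fingerprint_bits = 60     # entropy needed to single out a browser

bandwidth = bits_per_probe * probes_per_second      # 20 bits/s
seconds_to_identify = fingerprint_bits / bandwidth  # 3.0 s
print(bandwidth, seconds_to_identify)
```

Even cutting per-probe leakage by an order of magnitude only multiplies the identification time by ten, which is why bandwidth limiting mitigates but never eliminates the channel.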

We are talking about retroactively removing covert channels from a 20+-year-old system with more than 10^9 nodes and lines of code.

Private User Agent community group (http://www.w3.org/community/pua/wiki/draft): JavaScript has access to a wide range of information about the UA, and to communication channels that can leak this information. Limiting access to back channels and to UA state helps limit the covert sharing of that state. One option is to continue to support JS but to add extra restricted JS contexts; two such restricted contexts are considered below: 1. Shared contexts, which restrict access to defined private UA state but allow access to back channels; 2. Private contexts, which restrict access to back channels but allow access to the defined private UA state.

Alternate Approaches. DNT: do not attempt to make tracking impossible; instead convey a user preference and manage/enforce at layer 9. Incognito / InPrivate browsing modes: the adversary is other users of the same browser? Anonymity as a specific use case for specially-designed user agents: Tor Button. Create a standard fingerprint. Comment on Twitter: "saw a visitor from NSA, using hilariously out-of-date Firefox and XP." Seems unlikely that's true.

http://www.techpolicy.com/blog/featured-blog-post/users-cannot-winin-a-tech-race-with-advertisers-a.aspx TAP: You mentioned that some companies have adjusted their tracking mechanisms to make it more difficult for users to avoid tracking. Is there a way for consumers to block these additional mechanisms themselves? Chris Hoofnagle: Users cannot win in a technology race with advertisers. The incentives are simply too powerful for companies to track users. The market for privacy-preserving tools is still in its infancy. Still, there are things consumers can do; the problem is that every privacy intervention can interfere with how websites and services function. On the most basic level, one can block third party cookies. This typically does not affect web browsing adversely, however, some authentication mechanisms fail when third party cookies are blocked. There are a number of privacy-preserving plugins that can help. Some of the best are Abine's DoNotTrackPlus, Ghostery and NoScript.

"I think that, all other things being equal, having W3C WGs consider the fingerprinting impacts of their work is worthwhile. Having W3C declare it irrelevant may impose a substantial, and hard to reverse, cost to privacy." - Adam Shostack, solicited email correspondence

"I think the objective of resistance to passive fingerprinting is both important and potentially achievable. Of the passively-measurable variables we know about, quite a few (excessive User Agent entropy, Accept headers, header ordering, header variation by request type) should be addressable through standards work. Passively measurable variables that are harder include clock skew, clock error, and TCP stack variation. But my guess is that those are probably 5-10 bits of identity, and they could be reduced with some effort. On the other hand, active fingerprinting resistance (fonts, plugins, JS quirks, etc.) seems much harder, but less important, because people can crawl for and audit it." - Peter Eckersley

???