Private browsing is not that private, but it can be
Private or “incognito” browsing sessions are not as foolproof as most users believe them to be.
“After a private session terminates, the browser is supposed to remove client-side evidence that the session occurred. Unfortunately, implementations of private browsing mode still allow sensitive information to leak into persistent storage,” a group of MIT and Harvard University researchers pointed out.
“Browsers use the file system or an SQLite database to temporarily store information associated with private sessions; this data is often incompletely deleted and zeroed-out when a private session terminates, allowing attackers to extract images and URLs from the session. During a private session, web page state can also be reflected from RAM into swap files and hibernation files; this state is in cleartext, and therefore easily analyzed by curious individuals who control a user’s machine after her private browsing session has ended. Simple greps for keywords are often sufficient to reveal sensitive data.”
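To make the “simple greps” point concrete, here is a minimal sketch (in TypeScript, not from the paper) of how someone with access to a machine could count cleartext hits for a sensitive keyword in a swap or hibernation file after a private session has ended. The file path and keyword are hypothetical examples.

import { createReadStream } from "node:fs";

// Count cleartext occurrences of a keyword in a raw file such as a swap or
// hibernation image. Reading with the "latin1" encoding maps each byte to one
// character, so ASCII keywords can be matched directly against the raw bytes.
async function grepFile(path: string, keyword: string): Promise<number> {
  let hits = 0;
  let carry = "";
  for await (const chunk of createReadStream(path, { encoding: "latin1" })) {
    const text = carry + chunk;
    let idx = text.indexOf(keyword);
    while (idx !== -1) {
      hits++;
      idx = text.indexOf(keyword, idx + keyword.length);
    }
    // Keep a short tail so matches straddling chunk boundaries are not missed.
    carry = text.slice(Math.max(0, text.length - keyword.length + 1));
  }
  return hits;
}

// Hypothetical usage on a Windows hibernation file:
grepFile("C:\\hiberfil.sys", "patient-records.example.com")
  .then((n) => console.log(`${n} cleartext hits`));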
While this is not exactly news for some information security professionals, workable solutions for this problem are scarce.
But Frank Wang, an MIT graduate student in electrical engineering and computer science; Nickolai Zeldovich, an associate professor of electrical engineering and computer science at MIT; and James Mickens, an associate professor of computer science at Harvard, have come up with one, and it doesn’t rely on browsers adequately scrubbing the collected information from the system.
They called their solution Veil.
About Veil
Veil is a web framework that lets web developers implement private browsing semantics for their own pages, putting the onus of protecting client-side user privacy on the site rather than the browser.
Developers must recompile their web content with the Veil compiler. The compiler transforms cleartext URLs into blinded references and injects into each page a runtime library that forces dynamic content fetches to use blinded references as well.
The compiler then uploads the objects in a web page to Veil’s blinding servers, from which the user’s browser downloads the content.
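The details of Veil’s naming scheme are in the paper; as a rough illustration of what a blinded reference is getting at, the sketch below derives an opaque object name from a cleartext URL with a keyed hash, so the name that ends up in client-side, name-based interfaces says nothing about the original resource. The blinding-server URL, the key handling, and the path layout are illustrative assumptions, not Veil’s actual protocol.

import { createHmac } from "node:crypto";

const BLINDING_SERVER = "https://blinding-server.example";

// Derive an opaque reference for a cleartext object URL using a keyed hash,
// so that caches, history entries and other name-based client-side state only
// ever see the blinded name.
function blindReference(cleartextUrl: string, perUserKey: Buffer): string {
  const digest = createHmac("sha256", perUserKey)
    .update(cleartextUrl)
    .digest("hex");
  // The page would fetch the object from the blinding server under this
  // opaque name instead of the original URL.
  return `${BLINDING_SERVER}/object/${digest}`;
}

// Hypothetical usage:
const key = Buffer.from("per-user-secret-for-this-session", "utf8");
console.log(blindReference("https://sensitive-site.example/images/photo.png", key));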
“The blinding servers provide name indirection, preventing sensitive information from leaking to client-side, name-based system interfaces. The blinding servers mutate content, making object fingerprinting more difficult; rewritten pages also automatically encrypt client-side persistent storage, and actively walk the heap to reduce the likelihood that in-memory RAM artifacts will swap to disk in cleartext form. In the extreme, Veil transforms a page into a thin client which does not include any page-specific, greppable RAM artifacts,” the researchers explained.
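As a rough illustration of the “encrypt client-side persistent storage” idea (not Veil’s actual runtime library), the sketch below keeps an AES-GCM key only in memory and encrypts values before they reach localStorage, so anything the browser persists for the page is ciphertext that is useless without the in-memory key.

const enc = new TextEncoder();
const dec = new TextDecoder();

// A non-extractable key held only in memory for the private session.
async function makeSessionKey(): Promise<CryptoKey> {
  return crypto.subtle.generateKey({ name: "AES-GCM", length: 256 }, false, [
    "encrypt",
    "decrypt",
  ]);
}

// Encrypt a value before it ever reaches persistent storage.
async function putEncrypted(key: CryptoKey, name: string, value: string): Promise<void> {
  const iv = crypto.getRandomValues(new Uint8Array(12));
  const ciphertext = await crypto.subtle.encrypt(
    { name: "AES-GCM", iv },
    key,
    enc.encode(value),
  );
  // Store the IV alongside the ciphertext; both are useless without the key.
  localStorage.setItem(
    name,
    JSON.stringify({
      iv: Array.from(iv),
      data: Array.from(new Uint8Array(ciphertext)),
    }),
  );
}

// Decrypt a previously stored value, returning null if it does not exist.
async function getEncrypted(key: CryptoKey, name: string): Promise<string | null> {
  const stored = localStorage.getItem(name);
  if (stored === null) return null;
  const { iv, data } = JSON.parse(stored) as { iv: number[]; data: number[] };
  const plaintext = await crypto.subtle.decrypt(
    { name: "AES-GCM", iv: new Uint8Array(iv) },
    key,
    new Uint8Array(data),
  );
  return dec.decode(plaintext);
}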
The good news for web developers is that Veil automates much of this effort. The framework is meant to help developers who want to protect user privacy but lack the technical skills to do it themselves, as well as those who are actively invested in using technology to hide sensitive user data.
The blinding servers required for this to work can be run by volunteers, hosted by companies, or operated by the site administrators themselves.
“An increasing number of web services define their value in terms of privacy protections, and recent events have increased popular awareness of privacy issues. Thus, we believe that frameworks like Veil will become more prevalent as users demand more privacy, and site operators demand more tools to build privacy-respecting systems,” the researchers concluded.