Last Thursday we held the first-ever Workshop on Web Applications and Secure Hardware. The idea of the workshop was to bring together researchers and engineers interested in making use of hardware security devices from the web, and to discuss the challenges and opportunities that this presents.

The programme was packed with talks from a wide variety of speakers: six full papers, submitted and reviewed over the last few months, plus two keynotes and two short talks from related projects. We also had a poster by Sandoche Balakrichenan on the DANE protocol.

Our first presentation was a keynote by Andrew Martin from the University of Oxford, entitled “Who is the user?” Andrew’s main argument was that our current “user account” paradigm, born in a time when computers were time-shared by many individuals, is no longer a sensible abstraction for how people interact with their devices. After describing how our relationship with user accounts had changed over the last few decades, he suggested that we need to define more specific roles than just “user”: “owner”, “viewer”, “controller” and so on. He also proposed that a “co-account” would be a useful abstraction for systems where two or more people commonly share the same account, particularly for a parent/child relationship where the child is gradually given more control over how they can use the device. A comment from the audience suggested that sandboxes could achieve this to some extent. Finally, Dr Martin proposed that this problem should be treated as a systems problem as much as an HCI or security issue. The subsequent discussion also highlighted concerns that the wrong user account paradigm might become too well “baked in” through extra hardware security mechanisms, and one member of the audience asked whether vendors and developers have a critical incentive to consider this issue further.

Next up was Laurent Castillo from Gemalto, who described the Secure Add-On Management (SEAM) framework for connecting web browsers with secure hardware elements. He outlined the design principles that SEAM has been based on – security, usability and extensibility – and the challenges faced by the developers. In particular, he identified that the number and diversity of browsers and operating systems made deploying a general-purpose solution extremely difficult. He finished the talk with a few lessons learned, and with pointers to other techniques such as the Gibraltar system.

Nick Hofstede from Inventive Designers then spoke about the W3C WebCrypto and Key Discovery API proposals, and how they had been motivated by the problem of electronic signatures for web transactions: specifically, how to enable online submission of tax returns in Belgium without a native (Java / Flash) browser plugin. The talk generated a lot of discussion, including questions about how to avoid privacy issues when discovering and using signing keys and certificates across more than one web domain. The fear was that breaking the Same Origin Policy for these keys could allow them to act like “super cookies”, identifying users without their permission; a user consent step is currently being investigated as a solution. Finally, one member of the audience suggested that there could be an attack based on a malicious web application misusing a signing key to sign a document without consent (or misusing consent granted for a different document).
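To make the signing flow concrete, here is a minimal sketch of the sign-then-verify round trip at the heart of such electronic signatures. This is an assumption-laden illustration, not the talk’s actual code: it uses Node.js and Ed25519 rather than the browser WebCrypto / Key Discovery APIs, and the payload is invented.

```typescript
// Sketch of a sign/verify round trip, standing in for the kind of
// electronic-signature operation discussed in the talk. It uses Node's
// built-in crypto module; in a browser this would instead go through
// crypto.subtle, with the Key Discovery API locating a pre-provisioned
// key (e.g. on an eID card) rather than generating one on the fly.
import { generateKeyPairSync, sign, verify } from "crypto";

const { publicKey, privateKey } = generateKeyPairSync("ed25519");

// Hypothetical document to be signed (e.g. a tax return submission).
const payload = Buffer.from("tax return, fiscal year 2013");
const signature = sign(null, payload, privateKey); // Ed25519 takes no digest name

// The relying party verifies the signature against the public key.
console.log(verify(null, payload, publicKey, signature)); // prints: true
```

The Same Origin Policy concern above arises because, unlike the throwaway key generated here, a discovered signing certificate is identical on every site that can see it, which is what would let it identify users across domains.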

The next speaker was Justin King-Lacroix from the University of Oxford, who gave an intentionally controversial talk questioning whether the web could make much use of secure hardware at all. One of the problems he focussed on was that, in order for hardware to make any kind of useful guarantee, one of two problematic things was required: either all of the software on the system must be made compatible with the process and able to pass web requests directly to the hardware, or each piece of software must be trusted by the person seeking assurance. There was an interesting discussion about the feasibility of hardware-based DRM (a subject also discussed by Martin Pirker) and how this was much more achievable on mobile than on PC operating systems. One member of the audience suggested that re-designing PC operating systems would be necessary to solve the problem. Overall, the talk concluded that secure hardware could have a role to play, particularly in authentication and on mobile, but significant limitations remained.

Cornelius Namiluko, also from the University of Oxford, then spoke about joint work with Andrew Paverd and Túlio de Souza on the webinos web application system and how it could be enhanced by trusted execution technology. The webinos infrastructure allows web applications to access other devices belonging to the current user, through mutually-authenticated TLS sessions established by a local piece of software. Cornelius described how the keys used in these TLS sessions could be protected by keeping them within a secure execution environment such as Intel TXT or ARM TrustZone, and explained how the team had modified webinos and OpenSSL to do exactly that. They were only able to experiment in an emulator, but their key operations were surprisingly fast (under 10ms) compared with previous work using Intel TXT. Questions from the audience suggested that OpenSSL might be replaced with a different SSL library to ease the process. One audience member also asserted, with support from the authors, that more access to experimental hardware with open documentation would be extremely valuable.

After lunch, Krishna Ksheerabdhi from Gemalto gave a short invited talk on the history of security hardware and how it was used on mobile devices and web platforms. This touched on Gemalto’s experiences, and briefly revisited the design choices and approach described previously by Laurent Castillo.

Rolf Lindemann from Nok Nok Labs, representing the FIDO Alliance, then gave a short invited talk on the FIDO Alliance’s approach to improving web authentication. He asserted that passwords were inadequate, causing serious security incidents at alarmingly frequent intervals, and that alternatives were necessary. Many other options were already available, including biometrics, security tokens, face recognition and more. However, because no single authentication approach would ever be appropriate in all situations, the FIDO Alliance was instead proposing an extensible local middleware solution. This would allow multiple authenticators to be used by the end user, without every web application needing additional server-side software to support them. The FIDO approach used a trustworthy client-side component (the “FIDO Authenticator”) containing an attestation key, used to demonstrate assurance in the authentication mechanism. One member of the audience remarked that federated login could have provided some of the answers, but it was agreed that this had never lived up to its potential.

The next talk was the second keynote, this time by Patrik Ekdahl from Ericsson. Patrik spoke about mobile security technology, first giving an overview of mobile phone security from the early 80s to the present day. He then went into the software and hardware architectures required for security and the new standards being proposed by the GlobalPlatform initiative. These included Trusted Execution Environments (TEEs), Trusted User Interfaces, Secure Elements and more. He described how these features might be used from the web for secure PIN and password entry, and for transaction confirmation. The audience raised several questions about the Trusted User Interface, and the challenges in making the UI attractive enough without compromising security through additional complexity. There was also discussion on how Trusted Applications could make use of TEEs and how they were deployed.

Martin Pirker from TU Graz then gave a presentation about how a secure media path (SMP) might be created to protect valuable content on mobile devices. He and his colleagues (Ronald Toegl and Johannes Winter) described how ARM TrustZone could be used to protect media processing software, isolating it from the rest of the operating system, in order to prevent unauthorised copying. This was motivated by the idea that people will increasingly access high-quality multimedia from many devices, and may not have dedicated hardware to process it. One comment from the audience indicated that, for performance reasons, dedicated hardware might be needed anyway. However, the approach would be useful in many other scenarios.

The final talk was by Jiun Yi Yap from the Information Security Group at Royal Holloway. The presentation described work he had carried out as part of his doctoral studies analysing the threats facing the Trusted Platform Module. He presented his analysis of two scenarios — encrypting and decrypting data — using the Microsoft STRIDE model and SDL threat modelling tool. This was well received by the audience, with one person suggesting that this work could be combined with the Common Attack Pattern Enumeration and Classification database at MITRE. There was also discussion about the value of context: Jiun pointed out that his threat analysis was intentionally focused on a scenario, and could not extend to other aspects of the TPM without changing the scenario and violating earlier assumptions.

Before wrapping up the workshop, a panel discussion was held with invited panellists Rolf Lindemann, Shamal Faily (University of Oxford), Martin Pirker and Krishna Ksheerabdhi. The panel were initially asked whether they thought that secure hardware could be the solution to user authentication on the web. Most of the panellists agreed that this could be true in part, but that hardware security devices would not be appropriate in all situations. The rest of the panel time was spent responding to questions from the audience, including discussions about how users’ relationship with technology might change as they grow older, and some perspectives on where the area will go in the next five years.

We would like to thank Gemalto for their sponsorship of the workshop, as well as Imperial College London for hosting the event. More details can be found on the workshop website, including details on each talk and reviewed paper.