ESRC Data PSST! Seminar Series

A couple of weeks ago I ventured over to North Wales to Bangor University for the final seminar in the ESRC Data PSST! Seminar series. As a bit of background, the project has been running for the past couple of years, and has seen many different speakers and attendees meeting up for critique and discussion around the themes of surveillance, transparency, non-state actors and political communication.

Being rather late to the series, I was pleasantly surprised to be invited to come along as a speaker and participant for the final event by PI Vian Bakir. I had attended the previous session at Cardiff University after colleague and friend Gilad Rosner suggested I come along. The Cardiff event focused on the role of non-state actors in surveillance and the challenges posed for traditional notions of transparency. (I’ve put my position statements from both events at the bottom).

For this one, it was a few hours’ drive over to Bangor. It was a dreich Thursday evening from Ashbourne, but there were bonny sea views on the A55 and Gojira’s album L’Enfant Sauvage for company :) Bangor is quite picturesque as it gazes out over nearby Welsh forests, estuaries, cliffs and mountains. The pre-workshop dinner over in Menai Bridge on Anglesey was at a rather charming fish restaurant too.

 


A very small group (9 or 10 of us) spent the day discussing how best to engage different stakeholders with concerns over transparency, state surveillance and data governance. I particularly enjoyed learning about the concept of translucency from Vian Bakir and Andrew McStay’s work on a typology of transparency. Interesting communication and engagement tools, from provocative short films to art projects, were discussed. An important point raised was that engaging the public and engaging policymakers require different approaches. The former may be more interested in educational or viral material (like the recent Cassette Boy/Privacy International mashup on the IP Bill), whereas the latter may be more responsive to reports, white papers and policy recommendations.

My presentation for this session considered practical approaches to engaging internet of things designers with privacy regulation. The privacy by design cards are a good example, but importantly I also looked at the broader shift towards bringing designers into regulation. Finding the best forums to support designers in their new role is important. Professional bodies like the ACM or IEEE clearly have strong links with their members and can guide on ethics and, to an extent, regulation. Equally, state regulators like the Information Commissioner’s Office have a role in communicating with and supporting designers on their compliance obligations. A particular challenge here is the differing level of resources organisations have to deal with compliance, from startups and SMEs (with little) to multinationals (with more). The nature of support they require will differ, and we need to better understand how compliance plays out in these different organisations.

It was an enjoyable workshop, and thanks again to the organisers for having me along :)

I’ve put my position statements from Data PSST! Cardiff (March 2016) and Bangor (May 2016) below.

Seminar 5:

Transparency of Non-State Actors? The Case of Technology Designers and Privacy by Design

Lachlan Urquhart

Mixed Reality Laboratory & Horizon Digital Economy CDT, University of Nottingham

Cardiff (March 2016)

 

My position on transparency and non-state actors is framed in the context of European Data Protection (DP) Law. A key component of the upcoming EU DP reform package is the concept of data protection by design and default (DPbD). Designing privacy protections into a technology has long been considered best practice, and soon it will be mandated by law. It requires privacy concerns to be considered as early as possible in the design of a new technology, and appropriate measures to be taken to address them. Such an approach recognises the regulatory power of technology, which mediates the behaviour of users and can instantiate regulatory norms.

Concurrently, regulation as a concept has been broadening, moving beyond notions of state centricity to increasingly incorporate the actions of non-state actors. I’d argue privacy by design is a context where technology designers, as non-state actors, are now regulators. How they build systems needs to reflect their responsibility to protect users’ rights and personal data through technical and social safeguards.

However, the nature of their new role is not well defined, leaving open questions about their legitimacy as regulators. They are not normally subject to traditional metrics of good governance like public accountability, responsibility or transparency. Furthermore, the transnational nature of data flows, as we see with cloud computing for example, adds an extra layer of complication. The new DP law will apply to actors outside the EU, e.g. in the US, if they are profiling or targeting products and services at EU citizens, meaning there are national, regional and international dimensions to consider. Overall, the fast pace of technological change, contrasted with the slowness of the law, has pushed designers into being involved in regulation, but without appropriate guidance on how to do so.

This is a practical problem that needs to be addressed. An important component is the role of nation states. State and non-state actors need to complement each other, with the state often ‘steering, not rowing’. A model of less centralised regulation cannot mean dispensing with traditional values of good governance. Instead, state regulators need to support and guide non-state actors on how to act in a regulatory capacity. How can transparency, legitimacy and accountability be reformulated for this new class of ‘regulator’, the technology designer? Much work needs to be done to understand what support designers need as regulators, and how the state can respond to this.

Seminar 6:

Making Privacy by Design a Reality?

Lachlan Urquhart

Mixed Reality Laboratory & Horizon Digital Economy CDT, University of Nottingham

Bangor (May 2016)

We have developed a tool that aims to take the principle of data protection by design from theory into practice. Article 23 of the EU General Data Protection Regulation (GDPR) reform package mandates data protection by design and default (DPbD). This requires system designers to be more involved in data protection regulation, early on in the innovation process. Whilst this idea makes sense, we need better tools to help designers actually meet their new regulatory obligations. [1]

Guidance on what DPbD actually requires in practice is sparse, although work from usable privacy and security or privacy engineering does provide some guidance [5, 6]. These may favour technical measures like anonymisation or tools to increase user control over their personal data [7]; or organisational approaches like privacy impact assessments. [2]
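To make the distinction concrete, below is a minimal sketch in Python of the kind of technical measures referred to above: data minimisation strips fields the stated purpose does not need, and pseudonymisation replaces the direct identifier with a salted hash. The record, field names and salt handling are invented for illustration; this is not an implementation drawn from the works cited.

```python
# Illustrative sketch only: a hypothetical newsletter sign-up record, showing two
# technical DPbD measures. Field names, the purpose and the salt handling are
# assumptions for this example, not a reference implementation.
import hashlib
import secrets

RAW_RECORD = {
    "email": "jane@example.com",
    "full_name": "Jane Doe",
    "date_of_birth": "1988-02-14",
    "postcode": "NG7 2RD",
    "newsletter_opt_in": True,
}

# Data minimisation: keep only the fields the stated purpose
# (sending the newsletter) actually requires.
PURPOSE_FIELDS = {"email", "newsletter_opt_in"}

def minimise(record: dict) -> dict:
    return {k: v for k, v in record.items() if k in PURPOSE_FIELDS}

# Pseudonymisation: replace the direct identifier with a salted hash so the
# stored record no longer identifies the person on its own.
def pseudonymise(record: dict, salt: bytes) -> dict:
    token = hashlib.sha256(salt + record["email"].encode()).hexdigest()
    return {"subscriber_token": token,
            "newsletter_opt_in": record["newsletter_opt_in"]}

if __name__ == "__main__":
    salt = secrets.token_bytes(16)  # in practice, kept separately from the data
    print(pseudonymise(minimise(RAW_RECORD), salt))
```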

By calling on design to be part of regulation, the law is calling upon the system design community, one that is not ordinarily trained or equipped to deal with regulatory issues. Law is not intuitive or accessible to non-lawyers, yet by calling for privacy by design, the law mandates that non-lawyers be involved in regulatory practices. We argue that there is a need to engage, sensitise and guide designers on data protection issues on their own terms.

Presenting a non-legal community with legislation, case law or principles framed in complex, inaccessible legalese is not a good starting point. Instead, a truly multidisciplinary approach is required to translate legal principles from law to design. In our case, we bring together information technology law and human computer interaction. [4]

Our data protection by design cards are an ideation technique that helps designers explore the unfamiliar or challenging issues of EU DP law. [8] The cards focus on the newly passed GDPR, which comes into effect in 2018. They are designed to be sufficiently lightweight for deployment in a range of design contexts, e.g. connected home ecosystems or smart cars. We have been testing them through workshops with teams of designers in industry and education contexts, to understand the utility of the cards as a privacy by design tool. [9]
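For readers who think in code, the shape of a card-based ideation exercise can be sketched very simply: group prompt cards by theme and draw one per theme to seed discussion. The themes and card texts below are placeholders invented for illustration, not the published deck at designingforprivacy.co.uk.

```python
# Illustrative sketch of a card-based ideation exercise: draw one prompt card per
# theme to seed a short design discussion. Themes and card texts are placeholders
# invented for this example, not the actual privacy by design deck.
import random

CARD_DECK = {
    "data subject rights": [
        "How would a user ask to see the data your device holds about them?",
        "What happens when a user asks you to delete their data?",
    ],
    "data minimisation": [
        "Which sensor readings could be discarded immediately after use?",
    ],
    "security": [
        "How is data protected as it moves between the device and the cloud?",
    ],
}

def draw_prompts(deck: dict, rng: random.Random) -> dict:
    """Pick one card per theme for the design team to respond to."""
    return {theme: rng.choice(cards) for theme, cards in deck.items()}

if __name__ == "__main__":
    for theme, prompt in draw_prompts(CARD_DECK, random.Random()).items():
        print(f"[{theme}] {prompt}")
```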

A further challenge for privacy by design goes beyond how to communicate regulatory requirements to communities unfamiliar with the law and policy landscape. Whilst finding mechanisms for delivering complex content in more accessible ways, like our cards, is one issue, finding the best forums for engagement with these concepts is another. Two examples could be the role of state regulators and of industry/professional associations. State regulatory bodies, like the UK ICO or the EU Article 29 Working Party, have a role to play in broadcasting compliance material and supporting technology designers’ understanding of law and regulation. The needs of each business will vary, and support has to adapt accordingly. One example is the size and resources a business has at its disposal: these will likely dictate how much support it needs to understand regulatory requirements, e.g. an under-resourced small or medium-sized enterprise vs. a multinational with in-house legal services.

Industry and professional associations, like the British Computer Society, the Association for Computing Machinery or the Institute of Electrical and Electronics Engineers, may be suitable forums for raising awareness with members about the importance of regulation too. Sharing best practice is a key element of this, and these organisations are in a good position to feed their experience into codes of practice, like those suggested by Art 40 GDPR.

[1] – L Urquhart and E Luger “Smart Cities: Creative Compliance and the Rise of ‘Designers as Regulators’” (2015) Computers and Law 26(2)

[2] – D Wright and P De Hert Privacy Impact Assessment (2012 Springer)

[3] – A29 WP “Opinion 8/2014 on the Recent Developments on the Internet of Things” (2014) WP 223

[4] – We are conducting a project in the EU and US involving researchers from: University of Nottingham (Tom Rodden, Neha Gupta, Lachlan Urquhart), Microsoft Research Cambridge (Ewa Luger, Mike Golembewski), Intel (Jonathan Fox), Microsoft (Janice Tsai), University of California Irvine (Hadar Ziv) and New York University (Lesley Fosh and Sameer Patil).

EU project page and cards are available at designingforprivacy.co.uk

[5] – J Hong “Usable Privacy and Security: A Grand Challenge for HCI” (2009) Human Computer Interaction Institute

[6] – Danezis et al “Privacy and Data Protection by Design – from policy to engineering” (2014) ENISA; M Dennedy, J Fox and T Finneran “Privacy Engineer’s Manifesto” (2014) Apress; S Spiekermann and LF Cranor “Engineering Privacy” (2009) IEEE Transactions on Software Engineering 35(1)

[7] – H Haddadi et al “Personal Data: Thinking Inside the Box” (2015) 5th Decennial Aarhus Conferences; R Mortier et al “Human-Data Interaction: The Human Face of the Data Driven Society” (2014) http://hdiresearch.org/

[8] – IDEO https://www.ideo.com/work/method-cards; M Golembewski and M Selby “Ideation Decks: A Card Based Ideation Tool” (2010) Proceedings of ACM DIS ’10, Aarhus, Denmark. https://dl.acm.org/citation.cfm?id=1858189

[9] – E Luger, L Urquhart, T Rodden, M Golembewski “Playing the Legal Card” (2015) Proceedings of ACM CHI ’15, Seoul, South Korea

 

Originally posted at https://lachlansresearch.wordpress.com/2016/05/31/esrc-data-psst-seminar-series/
