Research Update: The Internet of Food, Defence Against Dark Artefacts & Adaptive Architecture

After a few weeks catching up on work after being away at conferences and on holiday over Easter (S.W France and Portugal – v nice, as some of the snaps to jazz up this blog post show 🙂 ), I’ve now found a little time to write a summary of recent happenings, as there has been some exciting news over the past months.


Prize Winning Paper at BILETA, Aberdeen: The annual IT law conference, BILETA, was held in the Granite City this year (a slight change of scene from Braga last year) and was a great reunion, with the chance to catch up with colleagues from the field. I wrote a full paper with Nils Jäger and Holger Schnädelbach on adaptive architecture and regulating human–building interaction for the BILETA Taylor and Francis prize. To our delight, we found out we won, and received many kind comments from the judges and other conference delegates who had read it. My personal favourite was “so bonkers, it falls into excellent” (or something to that effect!). The paper is now up on SSRN and the PPT is also here if anyone wants a read – as ever, any feedback is most welcome! It is now being considered for a journal special edition of the conference proceedings.

Keynote at Internet of Food Conference, Dublin: I was invited to give my first keynote at a conference on IoT & Food at University College Dublin back in March. The conference was chaired by Prof Nick Holden and brought together food scientists and engineers, biosystems specialists, the agriculture industry, and many more. It was pioneering in its scope, with talks on a huge variety of topics, from personalised nutrition and smart agriculture to 3D printing of cheese and robots in food supply chains. I was invited to talk on the privacy, security and ethics of the Internet of Food and was warmly received. In my presentation, I considered privacy challenges around tracking domestic food consumption and security issues of smart industrial systems in the food supply chain. The PPT is available here for a look.

Speaker at GDPR Meet the Expert event, Loughborough: GDPR mania is now in full force, but back in Feb when I was invited down to Loughborough to speak on GDPR, things weren’t quite as intense. I was participating in a Knowledge Exchange event for SMEs and small businesses run by the Loughborough Advanced Technologies Institute. My job was to provide an overview of what GDPR entails, drawing on my research experience. I was joined by Susan Hallam, a digital marketing consultant, and Stuart Weir, a data protection officer who consults with businesses on becoming compliant. All three talks complemented each other well, moving from the strategic to the operational requirements businesses need to address. Despite doing my best to keep this ‘GDPR light’ (not bringing Lachlan’s ‘book of doom’, as one colleague calls it), there was quite a volume and diversity of audience questions. Laying the legislation out in full really got the audience thinking, and I think greater support is needed for SMEs to give them a better sense of what they are expected to do in practice. In general, this was a theme that emerged in my own PhD research on privacy by design, where I found large organisations have the resources and stability to create compliance strategies, but SMEs and start-ups lack such support (I was exploring the role for tools like ideation cards). As SMEs do not have in-house legal or information security teams, senior management or directors often end up going to training sessions and reading the legislation themselves to navigate the implications for their businesses. Accordingly, any support they can get, from organisations like the Federation of Small Businesses or the Chamber of Commerce, is most welcome, supplementing the Information Commissioner’s Office’s rather high-level guidance (which they often viewed as being of limited utility). Again, the PPT is available here for those interested.

Scaring the audience (not deliberately!)

Projects: Alongside these activities, a number of projects have kicked off or been funded. This makes for a busy spring/summer of planning and running workshops to develop and test new ideas and tools, such as the new Moral-IT and Legal-IT cards.

Defence Against Dark Artefacts (DADA) – I’m very excited to be part of the team from Nottingham, Cambridge and Imperial College that won this £1m TIPS 2.0 bid from EPSRC. The catchily named project kicks off in July, exploring technical, social and legal aspects of cybersecurity in smart homes. There is a new project page at Horizon and PI Derek McAuley has already given an interview about it on the radio.

The Memory Machine – This project has now started too, and we had the first workshop a couple of weeks ago (summarised here), exploring how to co-design technologies to support the preservation of memories and identity for those living with the impacts of dementia. I’m looking forward to seeing how this project develops, as it brings together a wide variety of experts from mental health, psychology, computer science, art and even some law ( 😉 ).

Towards Moral-IT & Legal-IT by Design – I’m leading this one, and a recent blog post outlined what we are planning to do in a tad more detail than I’ll provide here. The new cards we’ve developed are heading to the printers this week, and will be on this website soon too (I’ve had a lot of requests about where to get hold of them, so it’ll be nice to actually have something to send people). It is also really satisfying to see this work coming together after so much background work (incl. countless hours designing them… Illustrator pro level now…). We’ll be running workshops over the next few months to test them with a variety of projects in different sectors, to understand how ethical and legal issues are negotiated in different design settings, from cultural heritage and intelligent mobility to hybrid gifting and health/wellbeing.

Internet of Things & Research Ethics Cards – Another project I’m currently doing with Martin Flintham and Stuart Moran is looking at creating a deck of ideation and evaluation cards to support researchers in exploring the ethical implications of using Internet of Things technologies in their research work. We will be running workshops with those more and less experienced in the use of IoT in their research, plus operational services, to unpack the opportunities and challenges, both practical and ethical, that greater use of IoT in research will raise.

Trust by Design – This is a one-off workshop I’m running with Ansgar Koene and Virginia Portillo (UnBias) at Data Justice 2018. Here we will explore the challenges of doing trust by design for the domestic Internet of Things, with a focus on how best to bring user concerns into design discussions, particularly around trust.

Papers: Alongside papers which have recently been finished, there are others that are now accepted and being written for the end of May/start of June too. So watch this space, and get in touch if the topics are in your research area!

Abstract: The move towards robotics in the home can be driven by users seeking convenience, comfort, companionship, and greater security, to name a few. However, if robots are not developed in a responsible manner, then they risk causing harm to users, being rejected by society at large, or being regulated in overly prescriptive ways. There is a need to create responsible robotics and, in this paper, we explore some of the challenges and requirements along conceptual and empirical lines of inquiry. To do this, firstly, we explore the emergence of robots in the home, examining definitions of robotics and the current commercial state of the art. In particular, we consider emerging technological trends, such as smart homes, that are already embedding computational agents in the fabric of everyday life. By turning to human–computer interaction, particularly notions of values in design, we unpack the importance of the home as a deployment setting for domestic robotics. Subsequently, we consider the nature of responsibility in robotics, examining what it means and looking at lessons from past home technologies. In this paper, we focus on a specific element, arguing that a key responsibility of roboticists is to ensure they engage with users’ concerns and needs, and respond to them appropriately in design. This often does not occur, leading to technologies that are not fit for purpose and that disrupt the social order of the home.

Working from this basis, we then present findings from an exploratory, qualitative survey we conducted to highlight concerns users have about domestic robots. The survey established key themes, primarily around trust, transparency and error. To explore these in more depth, we then analyse relevant literature from across technology law, computer ethics and computer science to reflect on how to understand and deal with these concerns. We also reflect on wider risks (and opportunities) posed by robots in the home. To conclude, we provide a set of high-level design requirements for responsible robotics, drawing together both our empirical observations and conceptual analysis.

Abstract: Emotions define human experience and behaviour. Once off-limits, the boundaries of personal space and the borders of bodily integrity are being tested by affective computing applications in worn, domestic and public capacities. This raises questions about contextual integrity and the witnessing of psycho-physiological life by an ecosystem of actors. Relying on visual analysis of expressions, gaze and gestures, and physiological sensing of heart rate, body temperature and respiration, technologies that feel raise urgent questions about surveillance, privacy and intimacy.

In this paper we acknowledge their use for personal sousveillance, but focus on situations where consent is much less clear, such as emergent use of facial coding in retail and civic spaces. We report citizen perspectives, detail ethical harms, map legal risks and outline socio-technical safeguards necessary for the emergence of trustworthy AC applications.

Abstract: Recent high-profile news stories of the police seeking access to domestic Internet of Things (IoT) data surface concerns about the role of ambient interactive systems in the administration of justice. If there is a demand for access to domestic IoT data by police forces, how can this be done in an ethical manner and how might it be used in practice? Police use of traditional IT devices in criminal investigations demonstrates the procedural challenges of computer forensics processes, the legal admissibility of evidence and the risk of self-incrimination (e.g. users sharing device passwords). The growth of consumer IoT involves arrays of devices and services embedded in daily life, making intimate details of everyday living visible. The relationships between users, devices, service providers and law enforcement are spatially, temporally and socially complex, shaped by ambient data collection, temporally fragmented interactions and an ecosystem of concealed actors. We will explore how IoT intersects with policing practices, particularly the social, legal and ethical issues.


Originally posted at