This is the sixth theme of Network Plus. In this theme, led by Ewa Luger (U. Edinburgh), we ask: how do we support and empower all parts of society to function in a world where almost every aspect of our lives leaves a data trail? This might include questions around literacy, power, access to resources, and responses to surveillance capitalism.
It has been said that privacy is dead, and that citizens should simply accept that sharing their data is the latest phase of capitalism: ‘if we have nothing to hide then we have nothing to fear’. We are tracked every day – by store cards, travelcards, applications on our mobile phones, and cookies on the internet. But shouldn’t we have a choice? And what about those in society who are already marginalised – what kinds of skills might they need to make informed decisions about sharing their data?
How can the concept of Human Data Interaction (HDI) help us improve or adapt the ways that people learn about data-driven systems’ functions, benefits and risks? This theme looks at the skills, education and training required to deal with a data-driven world and, especially, at how these might interface with HDI’s tenets of Legibility, Agency and Negotiability.
There were five projects funded under this theme.
We aim to address a perceived deficit in regional plans (in terms of the HDI tenets of legibility, agency and negotiability) for developing the digital skills needed to reduce disadvantage in the Black Country. We will therefore develop cross-sector dialogue to illuminate issues concerning both Human Data Interaction (HDI) and attention to the personal, diverse ‘postdigital’ contexts of individual learners, neither of which is yet addressed in the Black Country Digital Skills Plan.
Regional plans for ‘digital upskilling’ are aimed at local people who are already disadvantaged. It is therefore a problem if the digital training and support intended to improve their employment prospects does not also develop their agency to make choices concerning their data, and if legibility and negotiability are not addressed, so that these participants can recognise and respond to data-related inequities. Our project aims to empower participants in these ways, as part of the ‘digital upskilling’ agenda.
The project is exploratory: it sets up spaces for debating ways to address HDI concerns – linked to legibility, agency and negotiability – that are not yet included in the education and training plans for ‘digital upskilling’ in the Black Country.
As Zuboff (2015) has forcefully argued, surveillance capitalism, and all the underlying processes it implies, is an unprecedented phenomenon and therefore unrecognisable to many. In turn, little has been researched, let alone designed, to help educators explore with students what it means and what its implications are for our social life and, more poignantly, for the future of citizens’ choice and democracy. In this context, the project aims to design, develop, pilot and refine an open course (short module) for prospective and in-service teachers and for undergraduate and postgraduate students. The course will equip learners with the skills and knowledge to work critically with data, enabling them to engage with data collection and processing and to explore how to use open infrastructures to store data. At the same time, it will expose learners to case studies alongside critical theory, so that they can examine issues related to data and the algorithms at play, and uncover hidden social structures that affect different parts of society – in short, enabling participants to ‘read and negotiate’ data in everyday situations. The online/face-to-face course will be an open educational resource and will itself be a user-led creative design of novel data-driven learning experiences.
Conceptual and philosophical challenges to current approaches and ideologies will be addressed through different interactive and audio-visual learning activities. The aim is to explore with learners, using critical theory as a lens, different approaches to uncovering issues of surveillance capitalism (Zuboff, 2015) in which they take part, often without being aware of it. As part of the learning activities, learners will use popular apps such as Fitbit, Strava or Zwift to understand how individuals become the product of their data.
This approach is inspired by the work of critical pedagogues such as Freire, Shor and hooks. Critical pedagogy is an approach whereby learners analyse their social reality through the lens of critical theory; through dialogue and group discussion, oppressive structures are uncovered, and new ways to change the social reality of which the learners are part are explored and pursued. Critical digital pedagogy is a well-researched approach (Stommel, 2016) that follows the tenets described above and has proven successful and transformative in hybrid approaches to learning.
Yoshik has been under development since June 2017, as part of a dissertation project supervised by Ailsa Henderson and James Stewart at the School of Social and Political Science. Development of the tool uses game simulation design, survey design, and interviews and focus groups to validate design choices.
The current version of the tool includes a pre-survey, a post-survey, and gameplay in between. It was originally developed in Mexican Spanish for Android users, and was not initially intended to be distributed in English or as a Qualtrics text-based version. After careful consideration and academic discussion, however, it is essential to reduce the manipulation of perceived risks (Rauhofer, 2008) for millennial users more widely, and therefore to produce an English-language version in two forms: an updated English version of the tool, and a Qualtrics text-based version for non-Android users and for individuals with Specific Learning Differences and/or visual impairments.
To accomplish this, we will update the scenarios for an English version, which will additionally allow us to integrate a legal perspective into the tool. This strengthens the work by adding a multidisciplinary element, engaging Human-Data Interaction research from multiple backgrounds.
Gaming Student Analytics: A Student-led Critical Games Design Project Focusing on Learner Surveillance
John Rooksby, Northumbria University.
In this project I will work with a team of students to produce a critical game that addresses student analytics. The game will be a playable prototype that is produced in Unity and made available as a web browser-based game using WebGL.
The game will be a creative and critical response to policy documents and privacy policies published by Northumbria University relating to student analytics. This university follows best practices and makes detailed information openly available. For example, the document “Using Student Data for Educational Analytics” sets out in detail what data is collected and why. To quote one relevant portion of the document:
“The following data, which is currently captured by the University, is initially in scope for Educational Analytics: personal information provided by the student at registration, but excluding special Category data; student level study records held by the University including assessment marks; details of a student’s assigned Personal Tutor; system-generated data from Blackboard, such as the date and frequency of accessing pages; student attendance data; library borrowing logs; smart card activity log on Campus; Northumbria gym membership.”
Documents such as these can form the basis for critical thinking about the present and future of student analytics and about how social values are embedded within them. The students will be invited to read such documents and produce a creative, critical response. They will be guided in what information they read and in the creative process of producing a response. The final design of the game will be decided by the students as an outcome of the process. This proposal does not attempt to specify the game but to describe the process.
The main output of the project will be a playable prototype game (or failing that a detailed design synopsis). This will be complemented by a reflective commentary produced via a focus group discussion with the students and with reference to the relevant academic literature on critical games.
Creating Learner Awareness of the Right to Opt Out
Dr Caroline Stockman, University of Winchester.
A first stage of this project will be to capture key legal guidance and frameworks regarding the right to opt out in this context – notably the EU General Data Protection Regulation, the Data Protection Act 2018, and its Toolkit for Schools. This stage will also review existing literature exploring children’s rights in the digital age. For example, the child rights academic Professor Sonia Livingstone reviewed the UNCRC’s articles and adjusted them for the digital age, published in the Children’s Commissioner’s Growing up Digital report; and public campaigns such as Your Data Matters, developed by the Information Commissioner’s Office, aim to increase the public awareness that is at the heart of this proposal.
This stage focuses on the question of what learners today should know about their right to opt out in the schooling context, why they might choose to exercise it, and the possible consequences to consider.
We will then add the educational dimension to this legal discussion, considering the UK’s computing curriculum, revised in 2013, and other guidance such as Education for a Connected World (2018) and the 5Rights framework. More recently, the DfE has released its new educational technology strategy for England (2019d), in which learner data is an integral component. Reactions to the strategy from the educational sector have been mixed, as discussed at the Westminster Education Forum held on 25th April 2019. This stage focuses on what learners today are taught about their right to opt out.
This will lead to an initial power critique of learner data in schooling contexts, building on previous observations that
- guiding legal frameworks are not sufficiently specific to accommodate for learner data in formal schooling, and
- education often does not viably present opting out as an option at all.
We seek to highlight an important tension: if a child does exercise their right to opt out, they may compromise the quality of education they receive or risk removing themselves from the community of knowledge. Children might thus be inadvertently coerced into not exercising their right to opt out because doing so might negatively affect their learning experience. With an open access publication, we hope to raise longer-term awareness of these issues.
The final stage of the project will be to distil these findings into a targeted YouTube video, informing learners of their right to opt out without their ability to receive a high-quality education being affected. It is remarkable that such a video does not yet exist, indicating an implicit acceptance of automatic consent. For this, we intend to work with professional expertise in the design of the video and its subsequent social media campaign.
The workshop for this theme took place on Friday 7th February and was very successful. A call was launched, and the successful projects were then reviewed and chosen.