Synopsis of 1st HEAS transatlantic meeting

In March 2019, we hosted a transatlantic conversation about trust and surveillance in higher education, attended virtually by 13 academics from North America and the UK. The conversation was intended to be a launching point for collaborations in research, publishing, and teaching across the participants and, we hope at some point, additional academics interested in these topics. As Jen mentioned in our first Higher Education After Surveillance post, our goal is to dig into issues of trust and surveillance in higher education to help generate new insights and opportunities for action.

We kept the initial conversation small so that we could manage scheduling challenges and so that we could have a productive conversation. Though we had a small group, we tried to ensure representation across a range of professional roles, locations in North America and the UK, and teaching/research interests (though all related to surveillance). Going forward, we hope to provide opportunities for more people to be involved in this project—more on that later in the post.

The attendees of this first meeting were:

We framed the conversation in three parts:

  1. Discussing questions about surveillance in higher education
  2. Exploring speculative trusting futures
  3. Scoping a set of possible next steps

PART 1. For the first part of the meeting, we posed a handful of questions that had been contributed by participants in an introductions document. The questions posed were:

  • We’ve seen recently a backlash against technology companies like Facebook for their problematic business practices. What might an ed-techlash look like?
  • What are the core needs (not features, but underlying reasons) that have been answered by surveillance-orientated models? As part of this question, we also asked about historical (pre-digital) surveillance in education and whether or not surveillance-free education is even possible.
  • What are the alternatives to our current paradigm of platform and surveillance capitalism?

Key insights:

> The group does not expect to see a backlash against educational technology, though members pointed to reactions against MOOCs and to recent student protests about Facebook in schools as examples of what a backlash could look like. In discussing those responses, the group pointed to the importance of student involvement in critical responses to educational technology as part of a multi-pronged approach to resistance. The harm to students caused by these surveillance technologies should be better understood and used to create ethical frameworks for educational technology.

> While surveillance and data collection/analysis about students have long been a part of educational systems, we are worried about increases in surveillance brought on by technology, and the use of automated systems (e.g., algorithms) without dealing with underlying issues of bias and power in those systems. We need to improve our ability to call out the non-neutrality of data and data collection/analysis processes.

PART 2. In the second part of the conversation, we responded to the question about alternatives to the current paradigms of platform and surveillance capitalism that are at work in education. We did this by dividing into breakout groups (using Zoom’s breakout feature) and exploring speculative futures for trust in higher education. Speculative futures conversations provide an opportunity to critique current contexts, suggest future possibilities, and challenge our conceptions of what’s needed, what’s important, and who benefits from current and future contexts.

Scenario 1: the post-edtechlash university
In this future, regulation around individual privacy, data, and rights is extensive and highly proactive. Legislation dictates a high level of visibility and readability of algorithms, and there are clear processes for challenging decision-making. Individual rights (such as the right to be forgotten) are well established, evolving, and informed by research coming from universities. Higher education institutions have clear policies and procedures to account for this regulation, and many top-down decisions are made with privacy and data-visibility in mind. What are some key features of higher education in this climate? What are the tensions in this post-surveillance setting? What are some challenges faced by institutions, teachers, students, and ed-tech makers?

Scenario 2: DIY privacy in higher education
In this future, regulation is not very robust, and it can’t keep up with technological change. However, the education sector as a whole has been very active in educating for data citizenship and data literacy. Higher education institutions are generally more proactive than other parts of society in taking a trust-conscious, privacy-respecting stance, though this varies between institutions. In part this is because prospective students are increasingly ‘voting with their feet’, and institutions are punished for perceived breaches of trust. Activist groups of staff and students respond quickly and effectively to changes in the data or surveillance landscape, and there have been high-profile cases of institutions backing down on proposed changes in the face of student and staff pressure. What are some key features of higher education in this climate? What are the tensions in this post-surveillance setting? What are some challenges faced by institutions, teachers, students, and ed-tech makers?

Key insights:

> Both scenarios were hard to imagine as possible futures; however, we are seeing educators, students, and institutions beginning to have conversations about elements of each scenario. The question becomes: which elements should we prioritize, especially given the immediate dangers/harms to our students (vs. the longer-term institutional changes needed)? The tensions of priorities and strategies are hard to reconcile.

> Scenario 1 presented a future where policy addressed privacy, data, and rights but did not account for the wider political, labour, organizational, and social sphere. Scenario 1 also highlighted how many of our university/college systems (like the student information system) are difficult to unravel from processes, and create a barrier to a policy-based future scenario as presented.

> Scenario 2 was seen as potentially furthering inequities faced by marginalized groups. For example, well-funded institutions may provide “oases of privilege” (protections for data and privacy) while less-well-funded institutions do not. The notion of “voting with your feet” privileges those who have options and punishes those who cannot access or afford other options.

> It’s clear how deeply ingrained these technologies are in our universities, and old practices have become hard-coded, with layers and layers of technologies built on top of them. Universities are not historically places of justice; unjust practices are deeply ingrained in higher education. To make a difference here, we will have to deconstruct and challenge education’s relationships to governments, economic structures, political movements, etc.

PART 3. We ended the conversation by raising possible courses of action we (plus others) could take in the future. This discussion showed a wide range of possibilities for collective action and individual/institutional action.

Possible actions:

  • Put together a group/materials/tactics for how to combat/resist these initiatives at your institution. Could include practical alternatives and examples of good practice from other institutions; group discussion guides to foster conversation in different contexts
  • Offer a workshop or conference in one or multiple locations, including voices from the Global South and other places far more affected by these issues than North America and the UK
  • Write a book of essays on these topics; write a book (same book?) with stories of the harms being done or that might be done (e.g., speculative stories by sava and Tim)
  • Write a research networking bid
  • Work with unions across the US and UK to create guidelines and encourage a unified approach to resisting or refusing specific tech

  • Create a Data Harm Record (like Data Justice Lab) for ed tech
  • Create a money/power map associated with educational technologies; connect the map with “Bad Tech” papers (in the vein of ELI’s 7 Things You Should Know papers) that administrators can read to understand implications of ed tech purchases at their institutions
  • Connect with other groups, including outside of education, to do this work (e.g., Stop LAPD Spying; Allied Media; Tactical Technology Collective)
  • Present on these topics, perhaps offer educational resistance guide, at MozFest in September

The Higher Education After Surveillance conversations will continue in asynchronous and synchronous formats as we explore what we might do next. If you are interested in connecting with these efforts, please email acollier@middlebury.edu and jen.ross@ed.ac.uk.

