June 1, 2018

“Listen to the Hummingbird
Whose wings you cannot see
Listen to the Hummingbird
Don’t listen to me”
~ @riseupnet

In late 2016 a Canary died. The US-based Riseup collective – provider of secure email hosting, mailing lists, virtual private networks, online anonymity services and group collaboration tools – triggered a public alert by deliberately failing to update an online statement within a previously agreed timeframe. ‘The Canary’ statement was designed as a warning to flag any legal process imposed upon Riseup, such as the receipt of a ‘gag order’ preventing the disclosure of information relating to any state issuance of warrants, court orders, etc. As such a provision would legally prohibit the collective from talking about a legal order, the Canary lets its authors sidestep any violation of the order precisely by not communicating within the previously established timeframe. In so doing they sound the alarm to anyone expecting to hear an update – anyone seeking assurances as to whether the security of Riseup’s services may have been compromised.
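The logic of a canary is that of a dead man's switch: silence, rather than speech, carries the signal. A minimal sketch of that check might look like the following (the quarterly window and dates here are illustrative only, not Riseup's actual schedule):

```python
from datetime import date, timedelta

def canary_is_alive(last_published: date, today: date,
                    max_age_days: int = 90) -> bool:
    """True while the signed statement is within its agreed window.

    A provider republishes a dated statement on a fixed schedule; if the
    statement goes stale, readers infer that a gag order may have been
    served, without the provider ever having to say so.
    """
    return (today - last_published) <= timedelta(days=max_age_days)

# A statement refreshed 30 days ago is still within a quarterly window:
assert canary_is_alive(date(2016, 8, 1), date(2016, 8, 31))

# One left untouched past the deadline signals, by omission, that
# something may have happened which cannot legally be discussed:
assert not canary_is_alive(date(2016, 8, 1), date(2016, 12, 1))
```

The provider never violates a gag order, because the alarm is raised precisely by *not* acting.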

Fast forward a few weeks past the Canary deadline, into late November, and the collective were able to issue some updates, reassuring their user base that there was no need to panic and that further information would be forthcoming. In time, Riseup disclosed the circumstances under which the Canary died: the receipt of sealed FBI warrants targeting two accounts hosted on their servers. The collective had been left with a choice: comply, adhering to the gag provisions attached to the warrants that prohibited speaking out; disobey, risking jail time and/or the termination of the organisation and its services; or shut down, pulling their services offline for their entire international community, including many in need of secure channels of communication.

Determining that the targeted accounts were involved in “non-political” acts of “selfish opportunism” that violated the social contract of the host (the accounts were serving ransomware and a DDoS extortion ring), Riseup complied with the orders for user information relating to the two accounts – a decision motivated by an interest in maintaining and protecting services for their many thousands of other users.

Sidestepping a limited moral exposition of the decisions Riseup made prior to, during and in the aftermath of the death of the Canary, we would instead like to reach further into these digital currents, to draw out a sense of the social, and of our movement, within.

 

 

We recognise the familiar jaded response that state surveillance and digital tech can give rise to. We may think: we’re fucked anyway; this is beyond my understanding or doesn’t affect me; or total privacy is impossible, so we’ll have to live and organise in spite of its absence. While this last response may appear more than reasonable in a great many circumstances, it’s useful to recognise the active nature of surveillance, which seeks an ever-diminishing gap between the subject and their digital reflection within forms of governance so heavily articulated by data and constructed in digital architecture. The social relation is technical – at least insofar as relations between individuals are mediated and understood as such. As this governance and influence stretches into organising spaces, right into our lives, it is perhaps more useful to look at cryptoculture and digital security practices not as belonging to a separate realm that we enter into already defeated, but as an extension of social relations of trust and affinity. And here, as even Riseup professed: fundamentally, their users must to some extent place their trust in the collective.

 

 

In any form of organising that presents agitation to everyday state violence, we find the more-than-reasonable assumption that targeted surveillance is a possibility. In the UK, the ongoing inquiry into undercover policing by the Special Demonstration Squad (SDS) continues to reveal something of the extent of the targeting by ‘spycops’ of political groups, organisers and activists between the late 1960s and the present.

To date, the inquiry has revealed that over the past 40 years, 200 officers infiltrated more than 1,000 political groups: from anti-fascists, anarchists, environmental activists and hunt saboteurs to a broad spectrum of campaigners, families and friends, such as individuals involved in the Stephen Lawrence justice campaign. The revelations around procedures followed by the SDS bring to light the malicious nature and tactics of these operations and state actors, such as the frequent use by officers of the identities of deceased children in the creation of their fake names, and the sexual and psychological abuse of campaigners by undercover police.

In parallel with these revelations, there has been no shortage of threats to limit the increasingly widespread adoption of end-to-end encrypted privacy services in recent years – embodied in everyday tools such as Signal Messenger – via the rhetoric of state ‘regulation’ of communications technology (here, as with spycops, often conjuring ‘extremism’ as a catch-all justification). Against these tendencies, the state spares no expense in attempting to preserve the anonymity of its own, even when under scrutiny in the courts.

Beyond mere passive surveillance, the history of spycops also reveals the active influence of state actors in social movements. Mark Kennedy, an undercover officer who shot into the headlines when unmasked in 2010, had not only observed but influenced the movements he had been part of. Alongside a legacy of numerous intimate and inherently abusive relationships with activists in the UK and across Europe, Kennedy was influential in the formation of a network under the initiative “Never Trust a Cop” in Copenhagen in 2009, as well as facilitating a number of actions in the UK, Iceland and elsewhere. Other police officers whose identities have been revealed used divide-and-rule tactics of gossiping and shit-stirring among friends and comrades to break the relations of trust and friendship that underpin any effective prefigurative community. (See, for example, Marco Jacobs, who while undercover in Cardiff worked hard to sow distrust, dislike and suspicion. Criminal prosecutions connected with an action against a pipeline terminal ultimately collapsed, but only after police had raided houses and seized computer equipment in what seems to have been a massive fishing expedition.) Despite persistent efforts and an ongoing legal case, the British state is resisting, in every way possible, revelations relating to the extent of its spying.

In software and in everyday life, we find ourselves swimming in cop-infested waters that not only trace and observe dissent but actively influence, intimidate and coerce. As our movement ebbs and flows within these currents, the more fundamental questions, then, are ones of trust and composition: how can we trust each other, and how can we have each other’s backs?

 

 

The courts frequently throw up transcripts of many months of text messages sent between individuals. Sometimes, as in the Welling anti-fascist trial, the only purpose for this seems to be to turn friends against each other where the relevance of messages to the case at hand is decidedly tenuous. Messages can only serve this purpose when they are made easily available: when easy-to-use encrypted communication channels are spurned. At the time of the Welling case, options were somewhat limited, but today we are fortunate to be able to take advantage of platforms and protocols that are convenient and simple enough for everyday use.

As well as the swift adoption of the tools we have to hand, there is also a simple but effective kind of savviness around communication that we should like to encourage. Have you ever overheard someone loudly recounting their exploits, without restraint or consideration of how even words and posturing can entangle ourselves and others? The concrete effects of this were seen after the 2011 “riots”, when many ended up doing time, and were marked forever as troublemakers, for information they themselves or their friends had fed into Facebook and elsewhere online. Even worse than speaking of one’s own involvement, social practices of sharing information about others – a performance of being ‘in-the-know’ about who knows who, who is a participant in what activities, or what an old friend’s multiple pseudonyms have been – must be recognised and challenged.

There is a special sense to ‘affinity’ among those organising together that we don’t often acknowledge, perhaps because we’re so caught up in our own internal battles. But it’s at this point, in refining this sense and being prepared to hold ourselves and each other to account, that we’re most likely to develop the trust-through-accountability that prefigures the social composition we desire. It’s likely that we’re targeted, that we’ve been in proximity to the extensions of state surveillance and influence – but what’s even more likely is that those we’re in affinity with are the majority; and here, in learning how to be with each other, is where our endeavour resides.

 

 

Extending this sense of affinity, accountability and prefigurative relations into our digital lives, we can recognise that one of the key ways in which digital technologies may debilitate us as a community is in making us dependent on convenience at the expense of independence and autonomy. If sovereignty can be understood as a sense of supreme decision-making power, technological sovereignty relates to the flow of that power through the everyday deployment and composition of software ecologies and digital tools: how they are used, and what acts upon the user when they are used; what the social contract is between the user and the provider of a tool; and how the tool is developed and deployed.

At present, many of us swiftly hand over large amounts of personal and social information in exchange for convenience and efficiency of communication. We give away so much, leaving ourselves incredibly vulnerable to surveillance and repression. Few of us relate in a deliberate or conscious way to the power dynamics inherent in this when we refuse to take a little time to reflect on the technological choices we want to make (that jaded sense previously alluded to). It’s worth noting that our actions here expose not only our individual selves but also our broader networks – and this should make us think twice about the impact our activity has in this extended, social sense.

What Riseup have done in creating a platform is only one particular technical manoeuvre. But beneath this composition there’s a kind of creative thinking in regard to the technical that we should like to encourage, to undercut a reading of the digital as what’s given. Riseup – like the few similar services out there – exists to provide an alternative to state surveillance and to commercially-inclined digital communications services that restrict freedom and are lacking, or actively intrusive, when it comes to user privacy. The ‘full take’ of the Internet and the ‘association mapping’ of users’ social graphs by surveillance apparatus in the UK, the US and elsewhere give states the ability to build a detailed map of organisations, social movements, activist and grassroots groups. Countering this, Riseup attempts to situate communication tools within the control of movement organisations, whilst providing technical design and trust-based assurances towards user privacy and anonymity. Speaking on the Canary incident, “Crossbill” from the Riseup collective calls for more creative diversity in this space:

“We need similar projects. We need to decentralise and spread out so we can create a more healthy eco-system instead of [Riseup] becoming a gmail monoculture.”

Running platforms for web services, email hosting or messaging is an awesome example of the counterpower of contemporary indymedia – decentralising networks and making surveillance harder, sharing the responsibility and bringing relations of trust closer to home.

Whether or not we as individuals have developed technical skills and interests, we can recognise, adopt and support initiatives towards regaining autonomy and independence in our relationship to technology through our everyday behaviours. Just as there are those who commit many hours to creating and maintaining social centres, squats, street resistance to fascism or solidarity with migrant communities, there are also those whose particular skills and interests reside more in tools and technology. Benefiting from our ability to extend trust through affinity, we need not each individually burden ourselves with learning all the intricate details. We would, however, do well to take the advice shared by those in affinity who have, and who have invested many hours in creating platforms and protocols – digital infrastructures – that facilitate our clawing back of technological autonomy from those who benefit from our being cast adrift in these waters.

As users who benefit from such autonomous infrastructures, we can also support and encourage the proliferation and decentralisation of secure digital communication platforms by not offloading all of our decisions and responsibility onto those who collectively work across borders in supplying the means to digitally mask-up. We can instead aim to extend our own localised affinities into the digital in such ways that alter our relationship from that of consumer (or product, in the eyes of the companies who run many of the tools that we use) to one of reciprocity, or mutual aid.

Just as we clean up after ourselves when we use a social centre, or share what we’ve learned from facilitating meetings in order to support those who put in the day-to-day effort to keep vital activities moving, so too, if we were to decentralise, re-design and adopt end-to-end encryption as standard, the legal and coercive pressures facing the handful of tech collectives who have our backs would be significantly diminished: they would no longer be gatekeepers of surveilled information.

 

 

It’s a curious point to note that the design and composition of the computer-user find their origins in productivity: first and foremost, a method for conceiving of and accounting for a person’s time in relation to work. The historic emergence of time-sharing in computer processing bestowed names upon individual users, positioning them as individual units of productive, economic value. The digital subject – the individual, the user – is above all this construct of productivity, extended now across device and platform (the personal computer, mobile phone, email account, social network).

One thing we take from this observation is that conceiving of contemporary software ecology as the inevitable or ideal result of modelling technologies for human-computer interaction is far from accurate. Instead, these technologies can be seen to trace a historic emergence in labour that links the end-user to productive and regulatory interests rather than to any alternative, utopian or emancipatory design. Reflecting on this origin story, the poet and media scholar Tung-Hui Hu describes the Cloud as:

“… a subtle weapon that translates the body into usable information. Despite this violence, it functions primarily as a banal ideology that convinces us […] that identifying ourselves is the ‘normal way of registering into the mechanism and transmission of the state.’”

This design – this what’s given – is neither inevitable nor fixed. There are many potential configurations here. The tendency within digitality towards compositions of the ‘discrete’ complements the clean distinctions of individuals, the boundaries of the self, and facilitates the extension of the ideologies that underpin our social configuration: productivity and separation. Beneath all of this, technical infrastructure is one with material and social structures such as spaces, social centres and relations of affinity – both prefigurative and at the same time essential. As our social composition – our ability to trust and challenge, and to be held to account – both underpins and extends beneath the technical extensions of ourselves, so here do we seek, in design, a kind of queerness that comes from our own reclamation, attendance to, or disruption of these technologies.

by aether & aphid
