On Why Some Measure of Privacy is Still Salvageable

I received a huge surprise some months ago when I was invited to represent the IEEE Society on the Social Implications of Technology (IEEE SSIT) in Geneva at WSIS 2016.[1] This addendum is not a review of the panel session or of my general impressions of the overall meeting, both of which were excellent.[2] I only wish to elaborate on two points which I had left unfinished, given the time restrictions on our brief individual presentations. Afterwards, in a more intimate gathering, it was good to tease out some of the narrower implications of my summary during the course of that brisk afternoon.

I suggested that privacy is not altogether dead, and that some measure of it is still salvageable.[3] That we are for the greater part already known and quantified should be taken for granted, especially as regards informational privacy.[4] That much is absolutely true. However, to completely surrender the privacy borders which are still in place is to give in to ‘Big Brother’ unconditionally and to allow depth-charged uberveillance to be introduced into our flesh for the purposes of constant monitoring, locating, and tracking.[5] Resistance is not futile when it comes to protecting whatever little of the privacy borders remains.[6] Even in the present environment we can still limit and protect our internet data flow. We can limit our use of social media, limit our use of mobile telephony, and make concerted efforts to protect our privacy by not giving in to pressures to release sensitive data or information about ourselves for the sake of rewards or convenience. Crucially, too, software design initiatives such as Privacy by Design (PbD), which builds privacy into the design specifications and architecture of systems and processes, should be strongly encouraged if not altogether mandated.[7]

WikiLeaks et al. and Snowden’s revelations (XKeyscore, PRISM) notwithstanding, what is still left to fight for is the sacredness and inviolability of our inner space.[8] It is to stop any outside entity from introducing surveillance laboratories into our bodies.[9] Any unnecessary or unwarranted surveillance (“above and beyond”) will quickly erode human dignity, diminish our freedom, and curtail spontaneity, which is the underlying force of imagination. My greatest fear is the universal numbering of human beings via implantables from cradle to grave, and the use of such automated identification data warehouses in company-centric deposits and, more so, by totalitarian and ostensibly democratic regimes.[10]

During question time I was asked by a remote participant whether I believed uberveillance will happen, and what we could do to stop it.[11] To begin with, RFID implants are not new; they are decades old. We have been implanting cats and dogs and cattle for years. In recent years it has become commonplace to find ICT devices in people for a variety of applications.[12] The discernible trajectory is the widespread adoption of embedded surveillance for value-added services and ‘perceived’ total transparency. There is little doubt that uberveillance in one form or another will be realized, whether initially on an opt-in basis and then ultimately so enmeshed in our day-to-day lives as to become compulsory by necessity or enforced by political systems. When will it happen, or how? I cannot give you the answer. I am not the prophet here. Others might well want to speculate with timelines and introduce apocalyptic rhetoric into the discussion. It is not necessary, for the tell-tale narrative increasingly speaks for itself. Can we stop it? I do not know.[13] But what we can and must do is form cross-national alliances at every level of our civic lives to make it as difficult as possible for governments or corporate conglomerates to force us (or to make us feel it necessary) to go down this shadowy path. It is, for example, a major obstacle when the UN and the EU have different understandings and policies on the protection of privacy rights. Even individual states within sovereign nations have different privacy principles. We need a universal code of ICT ethics that is adhered to: that is, accepted standards, along the lines of the Universal Declaration of Human Rights, which will help determine our judgements when it comes to implantables.[14] I add here, as I stressed in Geneva, that I believe in people power and have only little faith in institutions. Committed individuals can make a difference. Grassroots activism and protest have proven to be big game changers.

Implantables, of course, are not in themselves the problem here; their beneficial use in medical science has been well documented. The problem rests with their blanket and undiscerning use in surveillance. We all need to be aware of function creep and to identify the wrongful uses and abuses of the various veillances in our daily lives. For instance, few would argue that such innovations as BrainGate [15] should be halted, but for the greater part we should ponder a world where such neural interface technologies are repurposed beyond applications for the disabled toward everyday human augmentation. This is indeed to trespass upon the last bastion of privacy, our deepest of thoughts, that which ensures we remain free. For now we can and must safeguard what some scholars are referring to, quite realistically too, as “meaningful privacy”.[16] If we should ever totally lose our privacy, on which our rights and identity are so vitally dependent, it would be a singular catastrophe. Given such a scenario, there would be no comeback and no hope of a rebuild, even as there is after war.

I also spoke of these “exciting” times in which we live. My audience would certainly have understood the nuances and the synonyms.

[1] http://internetinitiative.ieee.org/newsroom/ieee-to-join-stakeholders-at-the-world-summit-on-information-society-wsis-forum-2016-in-geneva-switzerland

[2] http://www.iloveengineering.org/latest/a1cae09f-68ce-4ca3-b552-6767dd78b146

[3] https://www.itu.int/net4/wsis/forum/2016/Agenda/Session/150

[4] See Roger Clarke, 1999, Introduction to Dataveillance and Information Privacy, and Definitions of Terms, http://www.rogerclarke.com/DV/Intro.html

[5] http://www.igi-global.com/book/uberveillance-social-implications-microchip-implants/76728 See also Christine Perakslis et al., “Evaluating border crossings in an interconnected world,” IEEE Potentials, September/October 2016, in press.

[6] http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=7270446

[7] Privacy by Design: https://www.ipc.on.ca/english/privacy/introduction-to-pbd/

[8] Katina Michael and MG Michael, 2013, "No Limits to Watching?" Communications of the ACM, Vol. 56, Iss. 11, pp. 26-28. 

[9] http://ro.uow.edu.au/cgi/viewcontent.cgi?article=1716&context=infopapers

[10] http://ro.uow.edu.au/cgi/viewcontent.cgi?article=1594&context=infopapers

[11] https://www.amazon.com/Innovative-Automatic-Identification-Location-Based-Services/dp/1599047950

[12] Katina Michael and MG Michael, 2012, “Implementing Namebers Using Microchip Implants: The Black Box Beneath the Skin”, in Jeremy Pitt (Ed.), This Pervasive Day: The Potential and Perils of Pervasive Computing, Imperial College Press, pp. 163-206: http://www.slideshare.net/focas-project/implementing-namebers-using-microchip-implants-the-black-box-beneath-the-skin

[13] See Roger Clarke’s keynote at the 2nd RNSA Workshop, “What 'Überveillance' Is, and What To Do About It”: http://www.rogerclarke.com/DV/RNSA07.html

[14] See Stefano Rodotà and Rafael Capurro, Ethical Aspects of ICT Implants in the Human Body (Opinion 20), 2005: http://bookshop.europa.eu/pl/opinion-on-the-ethical-aspects-of-ict-implants-in-the-human-body-pbKAAJ05020/downloads/KA-AJ-05-020-3A-C/KAAJ050203AC_002.pdf;pgid=y8dIS7GUWMdSR0EAlMEUUsWb0000bHgL75Og;sid=fOh6iXL9ReR6niGOclfkLhDYezt8WtA-ALg=?FileName=KAAJ050203AC_002.pdf&SKU=KAAJ050203AC_PDF&CatalogueNumber=KA-AJ-05-020-3A-C

[15] BrainGate: Wired for Thought: http://www.braingate.com/

[16] See Christine Runnegar’s presentation: https://www.itu.int/net4/wsis/forum/2016/Agenda/Session/150