Transparency, contact tracing and the language of surveillance
As Richard Pope attests in his recent blog, there are some pretty boring-to-the-ordinary-person aspects to the whole transparency, tracking and contact tracing narrative. To an extent we appear to be becoming desensitised: to the who and the how, the centralised/decentralised, the open/proprietary, the technology choices and how inclusive or otherwise they might be, and to the notion of privacy. On this last point, language plays a key role.
You'd hope this was clear language on the back of a truck in Sri Lanka
The Coronavirus (Safeguards) Bill 2020 proposes protections for 'digital interventions'. Though the ambition is understandable, and the intent and brevity admirable, its reliance on the same language in relation to privacy, GDPR, anonymisation, digital exclusion, sharing, containment, research and time merits some exploration:
Digital exclusion is highest in the communities most at risk, notably the poorest and the elderly, sub-sections of whom are at the forefront of the services keeping the rest of us alive.
Sharing, containment and research present competing tensions for interested parties and, in effect, an open charter in the public interest.
In the absence of specific "sunset" clauses, "as soon as possible" is of a piece with this same public interest argument.
And then there is the thorny but often glibly treated matter of anonymisation as the panacea for the further widening of our surveilled world. As per an earlier post of mine, Kieron O'Hara is one who has written on privacy (including the end of it), on guidance on anonymisation (for data sharers), on online obfuscation, on the semantic web and, per this post, on (de)anonymisation. Among his nuggets are 'vaguing-up' and individual consent-based treatment (in the debate over how precisely to define a crime event geographically). Remembered here is the line from the 2011 Transparent government, not transparent citizens: a report on privacy and transparency for the Cabinet Office: "There are no complete legal or technical fixes to the deanonymisation problem".
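O'Hara's 'vaguing-up' idea, reporting a geographic event at a coarser granularity than the raw data records, can be sketched in a few lines. This is purely illustrative: the function name, the rounding approach and the grid resolution are my own assumptions, not anything drawn from his guidance, and real anonymisation needs far more care (as the Cabinet Office line above warns).

```python
# Illustrative sketch: "vaguing-up" a precise location by reducing
# coordinate precision, so an event is reported at roughly
# neighbourhood rather than address level. The choice of two
# decimal places (about 1.1 km of latitude) is an assumption
# made here for illustration only.

def vague_up(lat: float, lon: float, decimals: int = 2) -> tuple:
    """Round coordinates to `decimals` places, coarsening the location."""
    return (round(lat, decimals), round(lon, decimals))

# A precise point in central London...
precise = (51.50136, -0.14189)
# ...becomes a coarser cell shared by many nearby addresses.
vague = vague_up(*precise)
print(vague)  # (51.5, -0.14)
```

Even a sketch like this shows why the report's caveat matters: coarsened points can still be re-identified when combined with other datasets, so reducing precision is a mitigation, not a fix.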
Exceptional times demand exceptional responses and behaviours. Populations around the world are delivering their side, and so above all are key workers. It is incumbent on our politicians and lawmakers to do likewise. There are too many instances of the intended and unintended consequences of mass surveillance, tracking, tracing and labelling of communities for anything other than transparent, unambiguous, consent-seeking and widely supported controls or legislation to be introduced. Data Protection Impact Assessments (DPIAs) provide one, possibly slightly impenetrable, basis for wider transparency and engagement, but whatever route or routes are adopted it is essential that we are not surprised by the outcome or come to regret it.
It seems I am not alone; others with more insight than I have obviously been thinking and acting on this. Even as I was writing, Twitter alerted me!