Big Data Feedback Sessions

Ian Stanier & Keiichi Matsuda: Security and Citizenship

Keiichi Matsuda

Keiichi Matsuda is a designer and filmmaker who started working with video during his Masters of Architecture at the Bartlett School (UCL) as a critical tool to understand, construct and represent space. Matsuda uses a mixture of video, motion graphics, interaction design and architecture in his work, examining the implications of emerging technologies for human perception and the built environment, with a focus on the integration of media into everyday life. Matsuda's current project, Hyper-Reality: A New Vision of the Future, is a new film series set in Medellín, Colombia, exploring a future city saturated with technology and media. Read more about Keiichi Matsuda here - and his project Hyper-Reality: A New Vision of the Future here https://www.kickstarter.com/projects/ /hyper-reality-a-new-vision-ofthe-future

Rob: Do you think big data will affect how we will define citizenship / communities in the future? If so, how?

The main issues raised surrounded classification, inevitability, virtual definitions and isolation.

The overall consensus of the group was negative. The creative sector representation in the group was particularly pessimistic and quite suspicious of the intelligence/police representation in the room. The academics and data technologists were more open to thinking about the opportunities that Big Data may bring about but were, in conclusion, fearful of the future implications that Big Data will have.

It was suggested that citizenship and communities will begin to have virtual definitions, rather than spatial or geographical ones. Issues were discussed surrounding how we classify citizenship and communities, and the potential for unfair racial or demographic profiling led by data classification. There was also a sense of distrust in the group over whether the public sector and government use open data and Big Data techniques ethically.
The question of what the negative implications of Big Data might be was raised, but no specific suggestions were made. The question was also raised of whether the impact of Big Data on citizenship and communities is inevitable, and whether a cosmopolitan identity might lead to isolation.
Sofia: Do you think technology plays too big a part in our lives? What do you think we can do to make sure that our use of technology in everyday life won't become dangerous?

The main issues raised surrounded benefits, the relationship between social and virtual connections, defining "dangerous", the human element in decision making and the lack of choice created by our dependence on Big Data.

The overriding sentiment was no, and that it can be a benefit. The relationship between social and virtual connections was discussed, and whether one comes at the expense of the other. One risk identified was the ability of Big Data to connect us digitally but disconnect us socially.

How do we define "dangerous"? Technology is crucial to infrastructure, but people rely on it. Does this mean that there is a lack of political decision making? Do we have any other choice but to use technology now, as we are so dependent on it?

Anno: (Catholicism) Religious belief systems, ethics, religion and big data: is there a place for religion in the use of big data? What role does religion have in policing the web?

The main issues raised surrounded surveillance and belief systems online.

The group discussed belief systems online. Is religion a type of surveillance over internet content?

Ian Stanier

Dr Ian Stanier has extensive experience in policing with intelligence in an operational context at force, national and international levels. He is the UK National Intelligence Model working group coordinator and a member of the Intelligence Futures Group (IFG). He has a particular interest in information-sharing pathologies and the origins of intelligence failures. Stanier will address questions associated with intelligence management, security, ethics and other issues linked to Big Data by using examples of action-based policing.

Sofia: In your experience, has the development of Big Data had predominantly positive or negative consequences for intelligence and security?
The main issues raised surrounded cost efficiency, individual choice, the quality of interpretation, context and the global standardisation of laws.

Issues surrounding the cost efficiency of using Big Data were raised. It was pointed out that it is up to us, as individuals and as a society, to make sure our experience of Big Data is positive. Whether it has positive or negative consequences depends on the quality of the interpretation of Big Data. We used to learn from mistakes, but now that we have technology making decisions for us, do we still need to?

It all depends on the political context. Would it be described as, for example, Germany or Denmark? Will countries swap into each other's approaches and create different thresholds? Will there eventually be globally standardised laws governing Big Data, rather than different laws for different countries? We probably won't have this for generations.

Anno: Could Big Data lead to discrimination and profiling of communities, or does it actually solve these biases?

The main issues raised surrounded technology divides causing discrimination against classes or socio-economic groups, and whether Big Data is apolitical.

The group discussed whether Big Data interpretation could create a technology divide, so that there may be discrimination against classes or socio-economic groups who do not possess this technology, or the skills to employ it, and can therefore not access or interpret the data in the same way.

Does Big Data undermine the right to be forgotten? Does it help to regulate and prevent oversight and control? Is Big Data pure and apolitical? What is the role of those labelling and categorising the data? If policy is made based on Big Data, how do we ensure everyone is included in the data? Could being anonymous also have a negative effect, as you are not being measured?

Rob: If someone has nothing to hide, should he or she still be afraid to give up their privacy?

The main issues raised surrounded public consciousness.
One member of the group pointed out that you still have to be conscious of everything you do and how it might be interpreted.
The group discussed the fact that people give their data away freely, even when it is not in their long-term interest to do so. One fundamental problem considered was getting citizens to actually care right now. There is no real public consciousness on the issue of Big Data; people are happy to freely trade their data and privacy for certain incentives (financial, social media, gaming etc.). Is the ability to make a decision removed?

Paul J. Ennis & Conal Devitt: The Ethics of Big Data

Dr Paul J. Ennis

Dr Paul J. Ennis earned his PhD in Philosophy at University College Dublin and is currently researching trends in contemporary online culture, including the dark web, cryptocurrency, whistleblowing and hacktivism. A core feature of his approach is the manner in which intelligence agencies, specifically the National Security Agency (NSA), are engaging with these trends. Ennis's position is broadly neutral: he aims both to dispel certain myths about mass surveillance as they have appeared recently in the media and nonetheless to stress how specific laws relating to mass surveillance require more stringent application and oversight. Ennis's presentation is supported by the British Society for Aesthetics (BSA).

Rob: What do you think is the biggest ethical problem caused by Big Data today? What ethical difficulties / challenges / problems do you think the collection of Big Data will give rise to in the future? Do you think we are able to prevent / tackle these difficulties? If so, how?

The main issues raised surrounded differentiation between Big Data applications, power and the frameworks behind data mining.

The group became more optimistic after Paul J. Ennis and Conal Devitt's presentations, and less fearful and suspicious. The group discussed systems of classifying Big Data and who has control over the information produced.
One member of the group, who used Big Data in their research in the environmental sector, pointed out that access to more data is a good thing in terms of making improvements. Should we differentiate between personal data and data used for science/environmental issues, in terms of the ethical questions they raise? What are the frameworks or premises for finding data? Who has the power to mine data?
Do you need to have a pre-conceived notion of what you are searching for in order to mine data effectively? How is it managed?

Sofia: Considering the huge amount of data that is generated constantly, who do you think will be able to access the data in the future?

The main issues raised surrounded inevitability, encryption, power imbalance, distrust and self-censorship.

The group agreed that it is inevitable that data will have to be shared. Issues surrounding encryption and decryption were discussed, and how this might allow only specific people to access specific data. This could lead to a power imbalance, and as a result there will be distrust of those in power, particularly in relation to how they communicate their knowledge. There is a potential for self-censorship by those who can access this data, or by those who are suspicious of their data being analysed. Issues surrounding distrust were also raised: who distrusts, and who is distrusted?

Anno: To what degree are we responsible for what happens to the data we share?

The main issues raised surrounded digital literacy, privacy and informed consent.

The group agreed that responsibility depends on digital literacy. More investment should be made to educate citizens in a digital age. A radical new business model for Facebook and other social media was considered, based on users paying 0-10 GBP a month to ensure their privacy and that their data was not sold to external parties.

Are we aware of what happens to data we freely give away? Are we truly consensual if we are not informed, and do we therefore need to rethink informed consent as a legal concept? Should services built on users' data, such as Facebook, create clear opt-out/opt-in settings to ensure better privacy conditions for users?

Conal Devitt

Conal Devitt has held numerous prestigious positions and worked on large-scale projects in the public sector. He served as Head of Community Safety in St Helens, where he helped to establish the Safer
Merseyside Partnership, and as Group Manager for Community Safety for Liverpool City Council. Devitt was also Head Deputy Director of Criminal Justice in the Northern Ireland Office and chaired the Northern Ireland Community Safety Forum. Until recently, he was the business development manager for The Life Channel, a health and wellbeing TV company working closely with the Department of Health and schools. Devitt is now director at TGL (Teaching Giving Learning), a groundbreaking virtual marketplace for local people, businesses and communities. Read more about TGL here -

Rob: Can we opt out anymore and be digital-free? If so, do you think it would be possible to live as part of modern society?

The main issues raised surrounded whether we would be allowed to become digital-free, loopholes, data farms and sustainability.

The group believed that people will always find loopholes, and that it is therefore impossible. They also discussed the sustainability of data farms, which are very energy intensive and not environmentally friendly. Even if you wanted to, would the world let you become digital-free?

Sofia: Do individuals have the right to be forgotten?

The main issues raised surrounded context and where we draw the line of who has this right.

The consensus was yes: individuals do have the right to be forgotten. However, it was also discussed that this right depends on the context in which something is being forgotten and on who or what is being forgotten. Where do we draw the line, and how do we decide who does or doesn't have the right to be forgotten?

Anno: Can Big Data be used to integrate those on the periphery of society back into society?

The main issues raised surrounded digital literacy.

Big Data is mainly a tool; it can therefore be used for good or bad depending on the intentions behind it. More is needed to educate those on the periphery of society in how they can use digital developments responsibly and to their advantage.
Richard Thieme: Science, Religion and the Future

Richard Thieme (Skype)

Richard Thieme is a widely published author and internationally acclaimed professional speaker focused on the deeper implications of technology, religion and science for twenty-first-century life. He speaks about the challenges posed by new technologies and the future, how to redesign ourselves to meet these challenges, and creativity in response to radical change. A former Anglican clergyman, Thieme is also passionate about the integration of religion and technology. Thieme has been called a father figure for online culture and has spoken for nineteen years at the Black Hat Briefings and the annual computer hackers' convention Def Con. Read more about Richard Thieme here -

Anno: How would you explain the relationship between ethics and technology?

The main issues raised surrounded design, development and subjectivity.

The design and development stage is where ethics come in, not just the usage of Big Data technologies. The subjectivity of ethics was also discussed. Is technology neutral? Are unintended consequences of Big Data difficult to foresee? Markets largely shape technology, so are the markets' ethics embodied in the technology?

Sofia/Grace/Rob: Do you think the development of Big Data brings about a predominantly bright or dark future?

The main issues raised surrounded storage, power, loopholes, connectivity, context and misrepresentation, loss of the human element, human error, trust and accuracy.

A show of hands was initially asked of the group. Three could conceive of a brighter future, two saw a darker one, and one was unsure. The language used around the future of Big Data was interesting; words such as "ominous" and "bleak", and other language of uncertainty, were used frequently. There was a sense in the group that the amount of data stored was ominous for the future.
There was also a bleak view of those that manage data. Examples of powerful agencies buying other platforms were discussed, such as Facebook buying WhatsApp.
The idea was raised that incidental connections can be created from gathered data, which can create loopholes. Positive discussions surrounding Big Data centred on the connectivity between individuals and groups, and the way it can narrow degrees of separation.

Big Data analysis often does not consider the context of the data, and therefore the data does not provide the full picture or consider the human perspective. For example, much of what exists on social media, if taken out of context, can misrepresent a group or individual. Big Data also has the potential to replace human-led decision making, which may not take common sense into account. Equally, if Big Data is being mined and analysed by humans, then it is open to human error.

Anxieties were voiced in the group about how much we can trust data and what the consequences of being too trusting might be. Will concerns over how much we can trust Big Data, especially in relation to the way it can be taken out of context, affect legal practice? Can more data create more accuracy, for example through data cleansing?

Rob: Is there something beyond Big Data? What will the next big development in this area be?

The main issues raised surrounded counter-culture, opting out and ownership.

The possibility of a counter-culture against Big Data was discussed, which might emerge in reaction to it and represent an ideological departure from it. It was pointed out that the next step beyond Big Data will simply be a more accessible and acceptable option to opt out. Will ownership be the prevailing question surrounding Big Data in the future?
Overall comments from Robert Barrow from the IFG: When I put it to the group that Big Data use could positively influence the ideas that bind the citizen to the state, through greater efficiency and cost reduction in health services and other public sector services, especially in a democratic country such as the UK compared with a more corrupt country, the artistic representative in my group snubbed the idea and declared that our government is the most corrupt of all. The overall feeling of the group actually turned from negative to positive when that individual left.
Much of the blame for this negative feeling was placed on the monopolisation of data and the breaches of privacy wrought by large corporations. When these ideas were revisited later in the day, it was acknowledged that the people working for these businesses are just that, people, and that the moral inclination of these people is probably good. Dr Paul J. Ennis's presentation, which followed the first round of questions, shed light on the intentions and actual practices of intelligence agencies as being more ethical and law-abiding than delegates had previously imagined.