What’s so important about Human-Data Interaction?

What do we mean by Human-Data Interaction anyway?

Our data is everywhere. Some of it you’ll know about – your credit history, your passport details, the records the NHS keeps about your health. There are usually strict regulations about how such data can be used.

For example, if an insurance company got hold of your medical details without asking you, you’d be upset. And there are laws in place to stop that. But there are other cases where it’s less clear what is at stake.

HDI hopes to help make things not only transparent, but legible. 

So there is other data about us out there – a lot of it. It is generated by what you buy, by cookies tracking where you go online, by search engines logging your searches, and by data that ‘leaks’ as you use the apps on your phone. Even the phone itself might track you as you go about your daily business. It might surprise you just how much data, and what data, is being collected.

Maybe you have an idea of how much data is being collected, but you feel powerless to do anything about it. You aren’t given agency or even the ability to negotiate. 

It’s also about the data we work with. Did you know that search results can be manipulated? That companies have been fined for pushing up the prices of essentials during natural disasters? That others have been fined for making their own, dearer, products appear higher in search results?

There can only be agency or negotiability if there is legibility… and there can be neither unless there are meaningful choices available. 

Just now, there’s no such thing as a meaningful choice. Take any piece of software. You download it, you scroll through a novel’s worth of legalese to reach the ‘accept’ button. And this is meant to be informed choice. Or take your mobile phone. A simple travel app wants to know your phone number, see the photos you have taken, access your media files…

…and you feel overwhelmed…

Is it a real choice when all you can do is agree, or not use the software at all? Or wade through a mountain of poorly signposted settings, nudged constantly towards choices that suit someone else better?

We can only be literate in our relations to technology if the technologies we use are legible to us. 

Some apps are deliberately ‘leaky’. As well as your location, they may be recording your identity and phone number. Why? To build up a very detailed picture of you. And all you wanted to do was find out which trains are on time. Phone apps are also ‘leakier’ than browsers, where the conventions are stricter – which is sometimes why you’re encouraged onto an app rather than staying with the same information in a browser.
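
To make ‘leaky’ concrete, here is a purely hypothetical sketch, in Python, of the kind of payload a free travel app might quietly send to a third-party analytics service alongside the one thing it actually needs – your location. Every field, value and name here is invented for illustration; none of it comes from a real app or SDK.

```python
import json

def build_analytics_payload(latitude, longitude, device):
    """Hypothetical 'leaky' payload: the feature only needs your location,
    but the app bundles identifiers that let a third party link this
    request to you across apps and over time. All fields are invented."""
    return json.dumps({
        "lat": latitude,                          # what the timetable lookup needs
        "lon": longitude,
        "device_id": device["advertising_id"],    # stable identifier, ideal for profiling
        "phone_number": device["phone_number"],   # ties the profile to a real person
        "installed_apps": device["app_list"],     # hints at habits, health, finances...
    })

# Example: the user asked 'which trains are on time?', yet far more than a
# location leaves the phone.
payload = build_analytics_payload(52.95, -1.15, {
    "advertising_id": "a1b2-c3d4",
    "phone_number": "+44 7700 900000",
    "app_list": ["trains", "banking", "fertility-tracker"],
})
```

The point is not the code itself but what it makes visible: nothing on screen tells you the extra fields are there, which is exactly the gap between using an app and being able to read what it does.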

HDI means the ability to interact with the data you give out in a meaningful way, going beyond legibility to agency and negotiability. 

Sometimes you don’t mind sharing information about yourself – maybe you’re on Strava and quite like showing others the hills you have climbed on your bike, or the routes you have run, and it’s nice to see where others have run or cycled. It can be a useful way to find good places to do these things. Maybe on Zwift it’s nice to cycle ‘virtually’ alongside others, and feel a sense of community even as you sit at home on a turbo trainer. And with things like Fitbit, monitoring yourself can motivate you to do more.

But do you know how your data is used in applications like these? What if it were used to monitor you in unexpected ways? How would you change that?

And there’s another, more worrying, side to this. Some social networking sites hold detailed profiles on you even if you have never joined them as a member. Data is being collected about you that you might not know about.

HDI means the ability to have more knowledge and control with regard to the data that you give out. 

And what about the software you use to find things out? You trust those searches to give you the most relevant information, or the right product, or the cheapest one. If a dating site tells you someone is a ‘match’, you trust that the site is not manipulating those rankings. You trust that the algorithm is not biased. But algorithms are often opaque, hidden behind commercial secrecy.

HDI means openness about how your data is processed, how the algorithms work, and who is running them. 

Those algorithms are not some small, rarefied thing. They might determine how your local town is policed, who is regarded as a ‘risk’, what behaviour counts as ‘suspicious’. And sometimes algorithms are simply biases dressed up in code: they can be as biased as people, if people’s biases are encoded in them.
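
To see how a bias can be ‘dressed up in code’, here is a deliberately simplified, invented example in Python. Nothing in it comes from a real policing or scoring system; the fields and weights are made up purely to show where a prejudice can hide.

```python
def risk_score(person: dict) -> float:
    """Hypothetical 'risk' score of the kind that might steer patrols.
    Every rule and weight below is invented for illustration."""
    score = 0.0
    if person["prior_stops"] > 2:
        # Prior stops reflect where police already patrol, so this term
        # quietly feeds past policing decisions back into future ones.
        score += 2.0
    if person["postcode"] in {"AB1", "CD2"}:
        # A postcode is often a proxy for income or ethnicity; weighting it
        # encodes a social bias while looking like a neutral, technical rule.
        score += 3.0
    return score

print(risk_score({"prior_stops": 3, "postcode": "AB1"}))  # 5.0
```

Nothing in the output says ‘bias’; it just looks like arithmetic, which is exactly why being able to read how such a score is constructed matters.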

HDI means openness about how algorithms are constructed. 

There is much talk of ‘data science’. But science is conducted in the open, with arguments and assumptions laid out in a way that can be questioned.

Many of the algorithms we use today are not openly constructed. We have no idea how they work. We have no idea whether we can trust them as ‘scientific’, or whether they are simply biases dressed up in code. To trust an algorithm, we must be able to read what it does.

With transparency comes the possibility of legibility. If things are clear, and legible, there is the possibility that we become more literate about our data and can make informed choices about how others use it.