
The ethics of big data analytics and governmental responsibility

Big data is quickly infiltrating every part of our lives, whether we have ways to protect our information or not


Written by Umer Altaf

If you thought our troubles with Cambridge Analytica ended with the Facebook scandal, you are regrettably wrong. It seems the company is impossible to keep out of the news. Not only has it recently declared bankruptcy, but the data it collected may yet be sold off to the highest bidder.

In the United States, a company that has declared bankruptcy may sell off any assets it has listed in order to settle its debts. Personally identifiable information, as it is known in legal literature, counts as an asset, and the data Cambridge Analytica collected is exactly that kind of asset.

Whilst the company claims that the relevant data has been destroyed, we do not yet know this for certain. If it turns out that Cambridge Analytica does in fact have this information stored somewhere, then the game has just started.

What is perhaps most troubling about all of this is the revelation that the executives at Cambridge Analytica have set up a new company called Emerdata, to which they may be able to transfer the data. The court decision that will determine the legitimacy of such transactions will undoubtedly dominate the news in the days to come.

What might not dominate the news, however, is the question of whether this technology is ethical in the first place. Forget for a moment the legal debate, and imagine a context in which Cambridge Analytica's information gathering was confirmed to be 100% legally sound. Does the ability of a private, for-profit company to potentially sway global political events not raise ethical questions about the limits of power any single entity should hold? I would argue that it does.

If we have learned one thing in the past few years, it is that the advancements in big data analytics are coming at us whether we are ready for them or not. Companies — technology-based or otherwise — simply have too much on the line, from a standpoint of competition and continued relevancy, for them not to want to participate. These advancements in big data have allowed us to construct our current AI renaissance, complete with self-driving cars and artificial medical diagnosticians. According to Hackernoon, one way that big data and AI can be used is to improve learning in rural and small town schools. No one wants to be left behind in this game, and so advancements in the field are all but impossible to curb.

The effectiveness of this technology will only grow with time, as more and more data becomes readily available to mine, and our analytical tools become more refined. Most crucially, many of our daily resources — such as social networks or media service providers — are things that we can only use if we surrender some degree of our personal data.

As far as I can tell, there are no perfect solutions to this ethical conundrum. Having said that, there are semi-ideal ones. I would submit that this is a potential case for government intervention in the form of support. Just as there are social programs that help people afford things such as public transportation and low-rent housing, perhaps we should have programs that allow people to keep their information private while still using the services that have become so essential to our everyday lives.

There already exist enough means for groups to manipulate people for their own gain. Somehow, we have come to accept that being talked into things without properly consenting or being informed is a part of life. We are no longer being sold to along the broad lines of demographics or tastes, but according to our every defining characteristic.

Whether Cambridge Analytica is doing this legally or not is a relatively minor question that shouldn't overshadow the truly important one: is the surrender of our private data a necessity of the modern world? No, it isn't. Not if those with means don't want it to be, and not if our state shields those without them.

