By Angela Armstrong

Fairness isn’t easy. It seems to demand, in some form, a “referee”. So I started wondering about fairness, and referees, in the digital age.

Humans are fallible. But data (at least until it gets interpreted by someone) is arguably neutral.

You’ve probably heard of Cambridge Analytica. That’s the consulting firm held responsible for the data mining work that allegedly influenced the 2016 U.S. presidential election. Its detractors accuse it of “unfairly” mining reams of personal data in order to target customized messaging and sway voter behaviours.

But is that really “unfair”? Every voter may conduct independent research and should know that Facebook political posts, like the posts of friends who are always happy and on vacation, might be slanted toward hyperbole. Is it unfair of Cambridge Analytica to assess human nature and use it against us? Do we actually have our own free will?

Facebook, which gathered all that freely given personal data in return for free use of its service, might have unfairly peddled that data for profit. (You can always opt out of Facebook.)

How independent we are as human actors is an inevitable component of the conversation around fairness.

Technology has a bright, shiny side and a dark, murky side, like the kissing cousins (or conjoined twins) that are the Internet and the Dark Web. And fairness seems to favour the light.

Fairness implies trust
For a few weeks after Cambridge Analytica hit the newswires and the evening shows, there was a flurry of Facebook account closures. I think it would be fair, in every sense, to say the boycotters felt their trust had been breached. And I think there was a lukewarm attempt by government to act as referee.

Fair treatment of our personal data is a fine balancing act between protecting private information and delivering the wonderful, seamless customer experiences we all crave. Fairness in this world, then, might boil down to transparency of motives, intentions and actions.

From a lender’s standpoint, fairness might mean that a borrower does what they say they are going to do. Fairness to that borrower might mean they are not unduly taken advantage of as a vulnerable or unsophisticated party to the contract.

“AirBnB” the finance industry?
Shared-use platforms rely on a mutual, transparent rating system. Through a public “review” system, finance could keep risks down and predictability high, thereby giving borrowers low costs and lenders high reliability. Perfect!

Sesame Credit, an Ant Financial entity collaborating with the Chinese government, is doing just this. With the competitive advantage of a near data monopoly, it is creating a “citizen score” that conveys an individual’s “trustworthiness” and determines access to products and services. (And I suspect there are no referees.)

It’s the Orwellian Big Brother future we used to fear, albeit served up on a smartphone. Just who will get to dictate what constitutes “good citizenship”, and who gets access to goods, services and capital? Even with AirBnB, you occasionally need to contact customer service: the referee for disputes.

Referees are often considered intrusive by the private sector and necessary by the public: regulations, policies and oversight are put in place to protect those who are vulnerable or lacking power. Like players and coaches who complain about unfair calls and slowed play, and fans who want the game played right, we love to yell at referees, but in our hearts we know the game would be messy without them.

In a world where data is the new currency, we aren’t even close to sorting out the intricacies and risks of trading our data for frictionless experiences. But thieves lurk in shadows like the Dark Web, where fairness is nothing more than a commodity for sale. Transparency may enable fairness better than opacity ever could.

One of the biggest drivers of cost in the finance world is fraud. A transparent, mutual data-sharing platform might help both actors in a transaction sift the duds from the indubitables.

Maybe, to effect a real evolution of lending, opacity (aka privacy) between lender and borrower HAS to go away. As borrowers, we’d have to be OK with a lender in that world seeing where we go on holiday and what we spend our money on instead of making our loan payments. Transparency could protect lenders and borrowers from unethical actors, reducing the rising costs of fraud. The referee (say, a consumer privacy legislator) might be working for more opacity on the consumer side of the transaction, not less, which ironically works at cross-purposes with what consumers say they want: faster, frictionless services.

It’s all kind of messy. And even with tons of neutral data, I don’t think we will get rid of the need for referees.

Without a referee, in the world of data, the new frontier of trust is in the hands of the developers, the algorithm builders and the filter creators. How will we know if they got it right, and whether the result is fair? In a data-driven FinTech future, who is going to be the referee, and if we do get one, what should they be watching? It’s a conversation we’d better start having more often, because data ain’t going anywhere.

Angela Armstrong is president, Prime Capital Group (www.pcclease.com).
