Just because we can do something is not a great argument for doing that particular thing. I think the monopoly that Google and Facebook in particular now hold over the online universe has distorted it well beyond anything we altruists ever imagined.

In a world where Google was guided by "don't be evil" that might have been OK, but those days are long gone. Both Google and Facebook are maximised and optimised to deliver a manipulated monopoly of a kind we haven't seen before.

If you think that people are basically good, you may be shocked to find that bad actors will always find a way to tilt the proverbial playing field to their advantage.

In the advertising universe, which is the metaphorical heart of this online world, persuasion is never obvious, transparent or even particularly benevolent. "Follow the money" is always a useful way to analyse the various dimensions, although I am reminded of the story of the blind men describing an elephant.

In the online world the main commodity is attention. Building engagement via a mobile app is often something we are tasked to do. When you have the resources that Facebook and Google have, why settle for engagement when you can profit from addictive behaviour?

Given a choice I would rather pay an artisan for pretty much anything than just go for the lowest cost every time. However, like everyone, I probably own a t-shirt that was made by extremely low-paid "slaves" in Bangladesh whose only choice was to make that t-shirt or go hungry.

The ethics of paying extra dollars to some Brooklyn hipster artisan designer for a similar item make for a difficult choice. On the one hand, if I buy at the lowest cost someone still gets fed; on the other, I am perpetuating an onerous system in some far-off country.

I like to support the creative community, but we don't all have to live in Williamsburg, and while artisan goods and services are a step in the right direction, they may not be the best choice either.

I have come to the view that Google and Facebook especially need to be regulated, so that the advertising-honed monopoly businesses they have become do not do much more harm to the social contract.

By social contract I mean deciding to do business with providers who are not facilitating social harm by operating a brutal, unregulated system that allows the majority of their users to be exploited.

This week I have been reading Roger McNamee's essay How to Fix Facebook—Before It Fixes Us, in which an early investor explains why the social media platform's business model is such a threat, and what to do about it.

You probably won't read the piece because it is 6,000 words long, and who has the time? When I flagged the post to a few people, they were distracted by the US debate over media manipulation by the relevant "bad guy."

The reason I like Roger's op-ed is that he was one of the original Facebook investors, having plunked down $150m at the personal invitation of Zuckerberg, whom he mentored for a few years from around 2007. McNamee made serious returns on that investment, so he was understandably reluctant to "bite the hand that feeds". You can and should read the full story.

Here is a very short snippet. Please read the other 5,500 words before rushing to comment.
 

“Smartphones changed the advertising game completely. It took only a few years for billions of people to have an all-purpose content delivery system easily accessible sixteen hours or more a day. This turned media into a battle to hold users’ attention as long as possible. And it left Facebook and Google with a prohibitive advantage over traditional media: with their vast reservoirs of real-time data on two billion individuals, they could personalize the content seen by every user. That made it much easier to monopolize user attention on smartphones and made the platforms uniquely attractive to advertisers. Why pay a newspaper in the hopes of catching the attention of a certain portion of its audience, when you can pay Facebook to reach exactly those people and no one else?
 
Whenever you log into Facebook, there are millions of posts the platform could show you. The key to its business model is the use of algorithms, driven by individual user data, to show you stuff you’re more likely to react to. Wikipedia defines an algorithm as “a set of rules that precisely defines a sequence of operations.”
 
Algorithms appear value neutral, but the platforms’ algorithms are actually designed with a specific value in mind: maximum share of attention, which optimizes profits. They do this by sucking up and analyzing your data, using it to predict what will cause you to react most strongly, and then giving you more of that.

Algorithms that maximize attention give an advantage to negative messages. People tend to react more to inputs that land low on the brainstem. Fear and anger produce a lot more engagement and sharing than joy. The result is that the algorithms favor sensational content over substance. Of course, this has always been true for media; hence the old news adage “If it bleeds, it leads.” But for mass media, this was constrained by one-size-fits-all content and by the limitations of delivery platforms. Not so for internet platforms on smartphones. They have created billions of individual channels, each of which can be pushed further into negativity and extremism without the risk of alienating other audience members. To the contrary: the platforms help people self-segregate into like-minded filter bubbles, reducing the risk of exposure to challenging ideas.”
[…]
“We agreed to work together to try to trigger a national conversation about the role of internet platform monopolies in our society, economy, and politics.”

 
The article continues; my underlines in the image below.

[image: excerpt from the article with my underlines]
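To make the mechanism McNamee describes a little more concrete, here is a minimal, hypothetical sketch in Python of an attention-maximising feed ranker. The post fields, weights and scoring function are my own illustration of the general idea, not Facebook's actual code:

```python
# A minimal, hypothetical sketch of the attention-maximising ranking
# McNamee describes. Fields, weights and scoring are illustrative
# assumptions, not the real algorithm.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_reaction: float  # model's estimate of how strongly THIS user reacts (0..1)
    outrage_score: float       # estimated fear/anger trigger strength (0..1)

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order posts by expected engagement, not by accuracy or wellbeing."""
    def expected_attention(p: Post) -> float:
        # Negative, emotive content earns a bonus because it reliably
        # produces more clicks, comments and shares ("if it bleeds, it leads").
        return p.predicted_reaction + 0.5 * p.outrage_score
    return sorted(posts, key=expected_attention, reverse=True)

feed = rank_feed([
    Post("Local bake sale raises funds", predicted_reaction=0.4, outrage_score=0.1),
    Post("THEY are coming for your rights!", predicted_reaction=0.5, outrage_score=0.9),
])
print([p.text for p in feed])  # the outrage post ranks first
```

Run as written, the outraged post outranks the benign one, which is the whole point: nothing in the objective rewards accuracy or wellbeing, only predicted reaction.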
 
Roger goes on to make 10 suggestions for how some of these downsides can be managed, including regulation. All are useful suggestions to consider, but I fear the sheer length of the article means very few readers will get that far down the page.
 

“Increasing awareness of the threat posed by platform monopolies creates an opportunity to reframe the discussion about concentration of market power. Limiting the power of Facebook and Google not only won’t harm America, it will almost certainly unleash levels of creativity and innovation that have not been seen in the technology industry since the early days of, well, Facebook and Google.”

 
There are also other safeguards we can take into account, such as changing user experience design.

How To Design Non-Addictive UX (It's Really Not Hard): To wean users off their devices, UX designers can deploy the very tricks that made their products so addictive in the first place, writes Bruce Nussbaum.

Any parents who have read this far will remember how important it is to lead by example and not just set up rules for others. If you were to ask various Facebook staff whether they let their own children have unrestricted access to Facebook, the answer might surprise you.

'Never get high on your own supply': why social media bosses don't use social media. Developers of platforms such as Facebook have admitted that they were designed to be addictive. Should we be following the executives' example and going cold turkey – and is it even possible for mere mortals?

I wonder whether ethical considerations are even useful here. How do we choose the best option when Google services are so ubiquitous? Can we even opt out meaningfully?

It is a bit like playing a card game at a casino where the deck is marked and the house never loses. We can quote all the platitudes we like about how you can just "step away from the table", but addictions are difficult to manage, more so when those addictions are undisclosed and not well understood at all. If you don't even know the deck is stacked against you, how do you step away? A digital detox or other self-management is sound advice, but not so easy to do in real life.

One last quote, from Sean Parker: "It's a social-validation feedback loop … exactly the kind of thing that a hacker like myself would come up with, because you're exploiting a vulnerability in human psychology."

PS. By social contract I mean that Facebook should be regulated and reconfigured to promote social health rather than harm.

PPS. Facebook should be 'regulated like cigarette industry', says tech CEO: Salesforce chief Marc Benioff is the latest tech insider to raise the alarm over social media's effect on society, with comments at Davos.

Update: Another day, another headline: "Social scientists have warned Zuck all along that the Facebook theory of interaction would make people angry and miserable?" "The realisation that Facebook's context collapse was intentional not only changed the whole direction of my research but provides the key to understanding why Facebook may not be so great for your mental health."
