What happens when society prizes shareholder value too much

This is what happens when our technology begins to blow back on us. In a "be careful what you wish for" story, it seems clear that the largest technology companies in the world are gutting us of our essential humanity. And we don't seem to know what to do about it.

Our corporate pursuit of shareholder value has turned the largest tech companies in the world into a kind of ridiculous monster. Shareholder wealth is being created at huge cost to our communities and our personal well-being.

The big question here is what we do about this, as individuals or as governments, and whether this is simply the long-run consequence of commercial consumerism.

Personally, I'd like to think that in New Zealand we might be able to choose a different path: a social compact of some kind, combined with more transparent taxation of these robber barons.

“In a spectacular rant, Scott Galloway shares insights and eye-opening stats about their dominance and motivation — and what happens when a society prizes shareholder value over everything else.”

“Walmart, since the Great Recession, has paid 64 billion dollars in corporate income tax; Amazon has paid 1.4. How do we pay our firefighters, our soldiers and our social workers if the most successful companies in the world don’t pay their fair share? Pretty easy. That means the less successful companies have to pay more than their fair share.”

More from the transcript (CA is Chris Anderson, SG is Scott Galloway):

“Chris Anderson: There’s another narrative that is arguably equally consistent with the facts, which is that there actually is good intent in much of the leadership — I won’t say everyone, necessarily — many of the employees. We all know people who work in those companies, and they still are pretty convincing that their mission is to — so, the alternative narrative is that there have been unintended consequences here, that the technologies that we’re unleashing, the algorithms, that we’re attempting to personalize the internet, for example, have A, resulted in weird effects like filter bubbles that we weren’t expecting; and B, made themselves vulnerable to weird things like — oh, I don’t know, Russian hackers creating accounts and doing things that we didn’t expect. Isn’t the unintended consequence a possibility here?”


“Scott Galloway: I don’t think — I’m pretty sure, statistically, they’re no less or better people than any other organization that has 100,000 or more people. I don’t think they’re bad people. As a matter of fact, I would argue that there’s a lot of very civic-minded, decent leadership. But this is the issue: when you control 90 percent points of share in a market, search, that is now bigger than the entire advertising market of any nation, and you’re primarily compensated and trying to develop economic security for you and the families of your employees, to increase that market share, you can’t help but leverage all the power at your disposal. And that is the basis for regulation, and it’s the basis for the truism throughout history that power corrupts. They’re not bad people; we’ve just let them get out of control.”

I've always wondered how a society could get to the place described in the novel 1984, where one minute "we have always been at war with Eurasia" and the next minute we never were.

As citizens we make our own reality, but it is a reality constrained by our own blind spots, such as unrecognised privilege. We are living in the age of doublethink, as foretold by George Orwell.