Brislen on Tech
The YouTube algorithm blues
Well, that went well.
First of all, a user of YouTube (owned by Google) in Brazil discovered that her video of her daughter playing in the backyard was being viewed hundreds of thousands of times. The video, of the ten-year-old child playing by the pool with a friend, was being suggested as content viewers might like, but (as you can guess) the YouTube algorithm had decided to recommend it to those who had watched sexually explicit content.
YouTube was pushing her video to paedophiles.
It wasn't just one video either. The algorithm had curated a number of videos of prepubescent children in various states of dress and was suggesting them to anyone who had watched a more adult set of content.
YouTube immediately banned children from live-streaming without clear adult supervision on camera, which raises a couple of questions. Firstly, would that have stopped the algorithm from recommending the content (presumably not)? And secondly, if YouTube can move this quickly when it's videos of kids, why not when it's a mass killing by a terrorist?
But things went from really awful to, well, even worse, if that was possible. YouTube has now begun removing videos that deny things that are demonstrably true. Videos from Holocaust deniers, from those who believe various US mass shootings were staged, and presumably also from the moon landing hoaxers, although that last group is probably still being treated as entertainment (I haven't checked).
So far so… well, very late to the party. But as part of it all, YouTube began banning those extreme right-wing, neo-Nazi and white-supremacist videographers who were deemed to be breaching YouTube's community standards.
Not banning them from the site, however: just banning them from the revenue-sharing programme, cutting off any money they might make through it.
The videos are still there, but they can't make money from the sharing arrangement YouTube has in place.
For some, making money clearly isn't the driving force. For others, they make money from other sources, such as Patreon accounts rather than specifically from YouTube. Others still sell merchandise, and it's one of these who has become a poster child for YouTube's unwillingness to do the right thing.
US-based hate blogger Steven Crowder was told he could no longer make money from his YouTube stream because of "egregious actions".
"We have suspended this channel's monetization. We came to this decision because a pattern of egregious actions has harmed the broader community and is against our YouTube Partner Program policies," says YouTube.
But if he agrees to remove a particularly awful t-shirt that he sells (no, I'm not telling you what it says, but there's a BoingBoing story detailing his attack content linked below), then he's welcome back into the fold to carry on peddling his hate speech.
Initially the company had said it wouldn't ban him but after a backlash the decision was overturned. Now it seems it was all about the t-shirt all along.
YouTube clearly didn't read the room terribly well here. On the one hand it allowed these extremists to build a following and monetise it on the YouTube platform, yet it wouldn't act to block content that clearly breaches its own terms and conditions. And when it finally did act, it didn't remove the content; it just stopped sharing the cash, and so stands to make even more money off it than it did before.
New York Times - On YouTube's Digital Playground, an Open Gate for Pedophiles
YouTube Blog - Our ongoing work to tackle hate
Facebook lives up to expectation
But at least YouTube is trying to do the right thing, albeit very slowly and clumsily. The company has major competition from Facebook in terms of video content and over the past four years, Facebook has made huge inroads into YouTube's dominance of the video space. The overlap continues, and YouTube really wants to make sure its algorithm serves up content that will keep users clicking instead of sailing off to Facebook's walled garden where the revenue passes on to someone else.
So what has Facebook done since the Christchurch atrocities, and what changes has the company made to its algorithms to make sure we don't see obscene live video footage shared again?
Well, that's easy. Nothing. Not a thing.
Facebook hasn't bothered doing the bare minimum and shows no signs of changing its business model, its terms and conditions for operating or its algorithm at all.
The problem lies with Mark Zuckerberg, the founder of Facebook. Not only does he hold the position of CEO but he's also chair of the board, and while he owns a far smaller slice of the actual shares, Facebook's dual-class stock structure gives him around 60% of the voting power. He is a very wealthy, and very unaccountable, man.
The problem is, he doesn't have to listen to anyone on his board, or on his management team or indeed anyone at all. If he wants to stay up late, he can, and nobody can send him to bed.
A group of Facebook shareholders has begun something of a revolt, calling for him to step down as chair. Of course, when you control 60% of the votes, any motion you disagree with is going to be voted down, but this group managed to get 68% of the votes cast by ordinary investors in favour of ousting Zuckerberg as chair and bringing in an independent to lead the board.
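That mismatch between economic ownership and voting control is just arithmetic. As a rough sketch (the share counts below are illustrative ballpark assumptions, not Facebook's actual figures), a dual-class structure where Class B shares carry ten votes each lets a founder with a minority economic stake control a majority of the vote:

```python
# Hedged sketch with illustrative numbers: in a dual-class structure,
# Class A shares carry 1 vote each and Class B shares carry 10.
CLASS_A_VOTES = 1
CLASS_B_VOTES = 10

def voting_power(a_held, b_held, a_total, b_total):
    """Fraction of total votes controlled by a holder of the given shares."""
    held = a_held * CLASS_A_VOTES + b_held * CLASS_B_VOTES
    total = a_total * CLASS_A_VOTES + b_total * CLASS_B_VOTES
    return held / total

# Assumed figures for illustration only: ~2,400m Class A shares trade
# publicly, ~440m Class B shares exist, and the founder holds 400m of them.
a_total, b_total = 2_400_000_000, 440_000_000
founder_b = 400_000_000

economic_stake = founder_b / (a_total + b_total)
votes = voting_power(0, founder_b, a_total, b_total)

print(f"Founder economic stake: {economic_stake:.0%}")  # a minority of shares
print(f"Founder voting power:   {votes:.0%}")           # a majority of votes
```

Under those assumed numbers the founder owns roughly 14% of the shares but controls nearly 60% of the votes, which is why 68% of the *ordinary* investors can vote for change and still lose.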
The shareholders are angry at the way Zuckerberg has refused to act over Facebook's role in election interference, the way it handles privacy matters and the way it has done nothing to stop a repeat of the live screening of murders and suicides on Facebook Live.
Yet they're powerless to make a change because Zuckerberg owns so much of the company. They're along purely for the ride and can have no role in the direction of the company.
For some, that will be enough - reaping the rewards of shareholder dividends and increasing stock value is all they care about. But for a growing crowd that's not good enough, and concerns over the future viability of a company that is a social media giant are growing on a daily basis.
AdAge - Facebook's strategy to take on YouTube comes into view (from 2014)
Digital Services Tax
So what's a poor government at the far end of the world to do with these social media giants and their ilk who come in here, gut the advertising market for mainstream media, take all the money out of the business and then somehow manage to claim they're making a loss in market and can't possibly pay any tax?
Well, it's baby steps for New Zealand but along with the Christchurch Call meeting of minds in France we are steaming ahead with plans to tax digital companies for earnings that are clearly made in New Zealand.
There's a discussion paper and plenty to discuss: should we charge such a tax at all? Shouldn't we wait for the rest of the world to have a look first? Didn't the Australians decide not to do this? And is 3% really the right number to land on?
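The key point about that 3% is that a digital services tax is levied on gross in-market revenue, not on profit, so the "we made a loss in New Zealand" defence stops working. A minimal sketch, using hypothetical numbers and an assumed flat NZ corporate rate of 28%, shows the difference:

```python
# Illustrative sketch: a DST applies to revenue, so it bites even when
# a company declares no local profit. All figures below are hypothetical.

def corporate_tax(profit, rate=0.28):
    """Income tax owed on declared local profit; nothing if profit <= 0."""
    return max(profit, 0) * rate

def digital_services_tax(revenue, rate=0.03):
    """A flat cut of in-market revenue, regardless of declared profit."""
    return revenue * rate

# A hypothetical platform books NZ$100m of local revenue but, after
# intra-group charges, declares zero local profit.
revenue, declared_profit = 100_000_000, 0

print(f"Income tax owed: ${corporate_tax(declared_profit):,.0f}")  # nothing
print(f"DST owed:        ${digital_services_tax(revenue):,.0f}")   # about NZ$3m
```

The design trade-off, and one reason the discussion paper matters, is that taxing revenue also hits genuinely low-margin businesses, which is part of why the rate proposed is a small single-digit percentage.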
For local providers, it's long past time that these companies were held to the same standards as local companies, which presumably do pay tax on their earnings here in New Zealand. We don't seem to have a plethora of Kiwi companies opting to base themselves in Bermuda (cough cough, Southern Cross, as a counter-example), nor are we seeing our companies become "resident" in Ireland or Luxembourg or (heaven forbid) the UK, with its lack of transparency around business ownership (I'm sure that's got nothing to do with the urge to exit the EU before EU laws are enforced, no no, not at all). But for the Apples, Googles, Facebooks and all the rest who do operate here and do make money in New Zealand, yet somehow pay less tax than is seemly, it's an issue they'll no doubt fight tooth and nail.
But at least we're having this discussion and at least we're starting to see online, digital companies treated as the money-making ventures they actually are, with all that entails. Tax, local legislation, requirements to comply with our laws when operating in our country.
Long may it last.
Techblog - Digital giants to face DST
The Spin Off - The Bulletin: Digital services tax takes shape