
“Feel smaller”: brutal TikTok goes viral

A viral TikTok suggesting young men should date women small enough that they could pick them up and threaten them with violence has been the focus of a parliamentary hearing into social media and its impact on Australians.

The video, which comes from the popular podcast whensexhappens, states that men shouldn't date women who weigh more than “two-thirds” of their body weight.

“The woman should feel smaller,” says the male podcaster.

“He must feel like he could kill her, but he won't.

“If I'm holding the girl and she subconsciously knows that if I wanted to, I could pick her up and throw her against the wall, but I don't, she says, 'Okay, I trust him, he can control himself'.”

Katherine Berney, executive director of the National Women's Safety Alliance, said the government should consider censorship or penalties for the video as part of a larger reform to curb what she called a “tsunami of misogynistic content” on social media platforms.

Katherine Berney, executive director of the National Women's Safety Alliance, says there is a “tsunami of misogynistic content” on social media platforms. Image: Supplied

“The idea of freedom of expression … in Australia, that's obviously a concept that we have,” she told the Joint Select Committee hearing on social media and Australian society in Canberra on Monday morning.

“The idea that you can say whatever you want without consequences is ridiculous.

“What are the current consequences of this (Whensexhappens video)? This has 30 million views.

“You don’t have the right to have that opinion without consequences.”

The committee, chaired by Newcastle Labor MP Sharon Claydon, is examining the use of social media age verification for Australian children; tech giant Meta's decision to withdraw from the news media bargaining code; the role of journalism, news and public interest media in tackling misinformation and disinformation on digital platforms; how algorithms and recommendation systems influence what Australians see; and other issues relating to harmful or illegal content distributed across platforms, including child abuse material.

Ms Berney said a “multifaceted and collaborative approach” was needed to address the challenges of social media.


The committee is investigating the enforcement of age verification on social media platforms. Image: NewsWire / Nicholas Eagar

“We can’t shame people for their need for community and their need to feel connected,” she said.

“(But we) can provide a better framework, a safer framework.”

She said the social media giants should be required to prevent harmful content from entering their platforms and remove content when necessary.

She also recommended the government offer Australians “social media self-defense” courses to help them navigate the chaos of the online world.

The darker side of the social media age has been laid bare in several submissions to the committee.

David Braga, executive director of International Justice Mission Australia, said in his opening remarks that platforms such as Microsoft Teams, WhatsApp and Facebook Messenger were being used to broadcast and livestream sickening child sexual abuse material, and that Australians were “consistently ranked as frequent consumers” of this abuse.

“Online child sexual exploitation often takes the form of live-streamed child sexual abuse, whereby perpetrators pay traffickers to sexually abuse victims, often young children, while perpetrators watch and direct this abuse live for a fee,” he said.

“The abuse routinely includes forcible sexual penetration. Children are forced into sexual acts with other children, sexually abused by an adult, and sometimes harmed in other degrading ways, such as bestiality.

“Simply put, it is child abuse, live, on demand.”

The connection to social media arises because the arrangements for these sessions are often made between the perpetrator and the facilitator via everyday social media platforms.

“Livestreamed abuse sessions are then often carried out on everyday platforms such as Microsoft Skype, Facebook Messenger and WhatsApp,” Mr Braga said.

“Australia has a moral obligation to address this harm because we are consistently ranked as a country that consumes this abuse the most.”

A final report from the committee is expected in November.