The Download: policing the metaverse, and the dangers of extreme climate solutions

When Ravi Yekkanti puts on his headset to go to work, he never knows what the day spent in virtual reality will bring. Who might he meet? Will a child’s voice accost him with a racist remark? Will a cartoon try to grab his genitals? 

Yekkanti’s job, as he sees it, is to make sure everyone in the metaverse is safe and having a good time, and he takes pride in it. He’s at the forefront of a new field, VR and metaverse content moderation. 

Digital safety in the metaverse has gotten off to a somewhat rocky start, with reports of sexual assaults, bullying, and child grooming—an issue that’s only becoming more urgent with Meta’s recent announcement that it is lowering the age minimum for its Horizon Worlds platform from 18 to 13.

Because traditional moderation tools, such as AI-enabled filters on certain words, don’t translate well to real-time immersive environments, human moderators like Yekkanti are the primary way to ensure safety in these digital worlds. And that work is getting more important every day. Read the full story.

—Tate Ryan-Mosley

The flawed logic of rushing out extreme climate solutions

Early last year, entrepreneur Luke Iseman says, he released a pair of sulfur dioxide–filled weather balloons from Mexico’s Baja California peninsula, in the hope that they’d burst miles above Earth.

It was a trivial gesture in itself, but it amounted to a tiny, DIY act of solar geoengineering, the controversial proposal that the world could counteract climate change by releasing particles that reflect more sunlight back into space.
