TikTok is hiring policy managers for ‘shocking and graphic content’ as it tries to combat the awful side of social media
Hi there, it’s tech reporter Alexandra Sternlicht.
TikTok is staffing up in its fight against awful content.
You could become TikTok’s “North America Product Policy Manager for Shocking and Graphic content,” which involves watching “deeply disturbing content on a daily basis,” according to a recent TikTok job posting. This includes viewing and creating policy for images and text related to “death, injury, torture, mutilation, and animal abuse.”
You’d work in New York City and earn at least $93,000 annually.
Hate NYC? TikTok is also hiring for regional roles of a similar nature in Austin and San Jose (the lucky person who gets the California job will make at least $113,777 annually). The new hires will “address some of the most objectionable and disturbing content” with the goal of promoting a “positive and safe environment for all of [TikTok’s] users.”
These job openings appear to be part of a larger effort by TikTok to police the most pernicious content on its service. Globally, TikTok has 265 open job postings that include the word “torture,” 263 mentioning “sexual abuse,” and 258 citing “bestiality” and “murder,” giving masochists plenty of opportunities to pursue their passions.
Every major social media company has teams that handle trust and safety. Generally, however, the job postings use language that somewhat masks the ugly realities of the work.
That said, it’s been well-documented that content moderators, often low-paid contractors based in developing countries, have faced long-term psychological damage from viewing the most awful content on the internet. In its job postings, TikTok acknowledges the psychological impact of the work, spelling out that one qualification for candidates is “resilience and commitment to self-care in order to manage the emotional demands of the role.”
These workers will be part of the company’s trust and safety organization, tasked with staying up-to-date on emerging social media trends to “predict and prevent violations” of TikTok’s community guidelines. They are also responsible for drafting, analyzing, and implementing content policies for shocking and graphic content in the U.S. and Canadian markets.
Beyond resilience and five years of relevant experience, TikTok says candidates should have a “passion for limiting user exposure to some of the most harmful content” and be “optimistic, principled, solutions-oriented, and self-starting,” among other things.
It’s quite a moment for TikTok to have posted these job openings. During a Congressional hearing last year, a lawmaker confronted TikTok CEO Shou Zi Chew with disturbing content that a New York boy consumed on the platform before committing suicide. In response, Chew said TikTok takes mental health issues “very seriously” and provides resources “for anyone who types in” suicide-related searches. That month, the boy’s parents filed a wrongful death suit in Suffolk County Supreme Court against TikTok and its China-based owner, ByteDance. In October, a New York judge moved the case to a different court, citing “lack of subject matter jurisdiction.”
All this comes as TikTok fights for its future in the U.S. after President Joe Biden signed an unprecedented bill into law that forces ByteDance to find a buyer for its U.S. TikTok business or exit the country.
A spokesperson for TikTok did not respond to Fortune’s request for comment about the job postings.