Facing up to duty of care for content

Recent disclosures about the way Facebook handles its content moderation duties make a strong case for better oversight of platforms that host public discourse. Broadly, the claim is that the social media giant has placed its own growth objectives ahead of its responsibility to protect users from misinformation, hate speech and other harmful content, because such posts grab attention and keep viewers engaged, translating into higher profits. Tens of thousands of leaked company documents, along with testimony from a whistle-blower, have raised concerns that these practices are undermining the well-being of individuals and polarising societies.

Facebook does not appear to treat all of its nearly three billion active users equally. One leaked document showed that last year, the company allocated 87 per cent of its budget for developing misinformation detection algorithms to its home market, the United States, and just 13 per cent to the rest of the world. While multinationals routinely tailor their operations to different markets, this decision has had consequences. In Myanmar, where Facebook is the primary source of news for millions, the lack of active moderation compromised the company's ability to take down false posts. It is feared that this may have helped fan the flames of the Feb 1 coup.
