Bluesky saw massive growth last year, forcing the platform to ramp up its moderation efforts. In its recently released moderation report for 2024, Bluesky said it grew by about 23 million users, jumping from 2.9 million to nearly 26 million. Its moderators received 17 times the number of user reports they did in 2023 – 6.48 million in 2024 compared to 358,000 the previous year.
The bulk of these reports concerned “harassment, trolling, or bigotry,” spam, and misleading content (including impersonation and misinformation). Accounts impersonating other people proliferated in the wake of Bluesky’s rise in popularity, and the platform says it is taking a “more aggressive” approach to eliminating them. At the time, the company said it had quadrupled its moderation team. The new report says Bluesky’s moderation team has grown to about 100 people, and hiring is continuing. “Some moderators specialize in specific policy areas, such as agents dedicated to child safety,” the report notes.
Other categories Bluesky says it received a large number of reports about include “unlawful and urgent issues” and unwanted sexual content. There were also 726,000 reports categorized as “other.” Bluesky says it complied with 146 requests from “law enforcement, governments and legal firms” out of a total of 238 requests last year.
The platform plans to make some changes this year to the way it handles reports and appeals, which it says will “simplify user communications,” such as providing users with updates on the actions it has taken on content they have reported and, in the future, allowing users to appeal takedown decisions directly in the app. Its moderators removed 66,308 accounts in 2024, while its automated systems removed 35,842 spam profiles and bots. “Looking forward to 2025, we are investing in stronger proactive detection systems to complement user reports, as the growing network needs multiple detection methods to quickly identify and remediate malicious content,” Bluesky says.