WhatsApp users are to be given the option to have their messages disappear after 24 hours, a change that drew immediate criticism from children’s charities.
In a blog post announcing the change, WhatsApp, which has 2 billion users, said its mission was to “connect the world privately.”
WhatsApp introduced disappearing messages last year, with the ability to delete chats automatically after seven days, but as of Monday two new durations are offered: 24 hours or 90 days. Users will also have the option to turn on disappearing messages by default for all new chats.
Mark Zuckerberg, chief executive of WhatsApp’s parent company, Meta, said on his Facebook page: “Not all messages need to stick around forever.”
The WhatsApp blog added: “There is a certain magic in sitting down with someone in person, sharing your thoughts in confidence, knowing you are both connecting in private and in that moment.
“The freedom to be honest and vulnerable, knowing that the conversation isn’t being recorded and stored somewhere forever. Deciding how long a message lasts should be in your hands.”
Disappearing messages can be turned on by default for all new individual chats by going to settings, tapping “account”, then “privacy”, and selecting the default message timer. The setting will not affect existing chats.
The UK children’s charity the National Society for the Prevention of Cruelty to Children (NSPCC) said the move was “ill thought out” and would create a “toxic cocktail of risks” when combined with Meta’s plans to encrypt messages on all of its services, including Facebook and Instagram.
“Offenders groom children on open platforms such as Instagram before moving them to WhatsApp for further abuse, where there is less chance of detection,” said Andy Burrows, head of child safety online policy at the NSPCC.
“This ill-considered design move will allow offenders to quickly dispose of evidence of child abuse, making it even more difficult for law enforcement to prosecute offenders and protect children.”
Burrows added that the combination of disappearing messages and end-to-end encryption – which prevents law enforcement and technology platforms from seeing messages by ensuring that only the sender and recipient can view their content – would not pass the risk assessment process in the UK’s online safety bill, which requires platforms to give details of risks to users to the communications regulator, Ofcom.
In November, Meta announced that end-to-end encryption would not arrive until 2023 at the earliest, a year later than expected. Announcing the move, Meta’s head of safety, Antigone Davis, said the company would still be able to detect abuse under its encryption plans by using unencrypted data, account information and user reports. A similar approach has already enabled WhatsApp to make reports to child safety authorities.
“Our recent review of some [historical] cases showed that we would still have been able to provide critical information to the authorities, even if those services had been end-to-end encrypted,” she said.
The home secretary, Priti Patel, has opposed Zuckerberg’s encryption plans, saying she “cannot allow” a situation that hampers law enforcement’s ability to tackle “heinous criminal acts”.
Last month, the Ofcom chief executive, Melanie Dawes, said social media companies should ban adults from directly messaging children or face criminal penalties.