UK regulator demands video platforms do more to protect users
FILE PHOTO: A man holding a phone walks past a sign of Chinese company ByteDance's app TikTok, in Hangzhou, Zhejiang province, China October 18, 2019. REUTERS/Stringer
LONDON (Reuters) - Online video sharing platforms (VSPs), such as TikTok, Snapchat and OnlyFans, need to provide clear rules on content, allow users to flag harmful videos, and restrict access to pornographic material, Britain's media regulator Ofcom said.
Under laws that came into effect in Britain last year, VSPs must take appropriate steps to protect all of their users from illegal material, with a particular focus on under 18s.
Ofcom, which is responsible for enforcing the rules, published guidance for VSPs on Wednesday, saying it wanted to see noticeable improvements over time in safety processes and complaint procedures.
Chief Executive Melanie Dawes said online video played a huge role in people's lives, particularly for children, but many users saw hateful, violent or inappropriate material while using these platforms.
"The platforms where these videos are shared now have a legal duty to take steps to protect their users," she said.
"So we're stepping up our oversight of these tech companies, while also gearing up for the task of tackling a much wider range of online harms in the future."
Ofcom said its research showed that a third of users said they had witnessed or experienced hateful content, a quarter said they had been exposed to violent or disturbing content, and one in five had seen content that encouraged racism.
The regulator's remit covers platforms established in Britain, numbering 18 initially. Platforms established in other countries, such as YouTube and Facebook, are excluded.
If it finds a VSP has failed to take appropriate measures to protect users, it will be able to investigate and take action, including fines or - in the most serious cases - suspending or restricting the service.
(Reporting by Paul Sandle, Editing by Nick Zieminski)