NVIDIA Invests $2 Billion into AI Cloud Company Nebius
NVIDIA has signed a strategic partnership with AI infrastructure provider Nebius, investing $2 billion in the company.

UK regulator Ofcom has called on the most popular social media platforms to enforce minimum age rules for users of their services. The regulator argues that the companies should be doing more to keep children safe online.
Ofcom said it and the data watchdog, the Information Commissioner’s Office, had written to Facebook, Instagram, Snapchat, Roblox, TikTok, and YouTube, requiring them to prove to parents that they have made a genuine commitment to protecting children online. Since the UK’s online safety laws came into effect in 2025, Ofcom said it had been investigating nearly 100 services. This had resulted in progress in tackling the sharing of child sexual abuse material, age checks now being required to visit pornography sites, and restrictions on platforms such as Telegram and Reddit to prevent children being exposed to harmful content.
However, it added that the industry had not done enough, and it was setting out clear demands for further action to ensure technology companies are held publicly accountable for delivering the safest possible online environment for UK children. These demands include effective minimum age policies to prevent children under 13 from accessing sites and apps, fail-safe grooming protections, safer feeds for children, and an end to product testing on children.
Ofcom CEO Melanie Dawes said the online services in question are household names, but that they are failing to put children’s safety at the heart of their products. “There is a gap between what tech companies promise in private, and what they’re doing publicly to keep children safe on their platforms.” The platforms have until 30 April to inform Ofcom of what actions they will take.