Meta’s Social VR Platform Horizon Hits 300,000 Users

Since being rolled out to users in the U.S. and Canada, Meta’s social VR platform for the Quest headset, Horizon Worlds, has grown its monthly user base tenfold to 300,000 people. “Meta spokesperson Joe Osborne confirmed the stat and said it included users of Horizon Worlds and Horizon Venues, a separate app for attending live events in VR that uses the same avatars and basic mechanics,” reports The Verge. “The number doesn’t include Horizon Workrooms, a VR conferencing experience that relies on an invite system.” From the report: Before its December rollout, Horizon Worlds was in a private beta for creators to test its world-building tools. Much like the gaming platforms Roblox and Microsoft’s Minecraft, Horizon Worlds lets people build custom environments to hang out and play games in as legless avatars. Meta announced this week that 10,000 separate worlds have been built in Horizon Worlds to date, and its private Facebook group for creators now numbers over 20,000 members.

Meta still hasn’t disclosed how many Quest headsets it has sold to date, which makes it hard to gauge Horizon’s success relative to the underlying hardware platform it runs on. But several third-party estimates peg sales at over 10 million for the Quest. Zuckerberg recently said that Meta would release a version of Horizon for mobile phones later this year to “bring early metaverse experiences to more surfaces beyond VR.”

“So while the deepest and most immersive experiences are going to be in virtual reality, you’re also going to be able to access the worlds from your Facebook or Instagram apps as well, and probably more over time,” the CEO said on Meta’s last earnings call. Bringing Horizon to mobile would position it as even more of a competitor to Rec Room, a well-funded social gaming app with 37 million monthly users across gaming consoles, mobile phones, and VR.


Meta Introduces ‘Personal Boundary’ Feature To VR Worlds

Meta has introduced a new “personal boundary” feature within its VR social spaces, starting with Horizon Worlds and Horizon Venues. Hypebeast reports: With a personal boundary in place, a user’s avatar will by default keep a nearly 4-foot (1.2 m) distance from other avatars. Via an invisible barrier, the system halts the forward movement of other avatars as they reach the boundary. Meta says that the feature will make it easier for users to avoid unwanted interactions such as harassment.

Users can still walk past other avatars with personal boundaries enabled and can even give them a high-five or fist bump. The feature will be rolled out as always-on by default, which Meta says will “help to set behavioral norms” in the VR space. In the future, the company will consider adding new controls, such as allowing users to customize the size of their personal boundaries. In a statement to Ars Technica, a Meta spokesperson said: “Personal Boundary builds upon our existing harassment measures that were already in place – for example, where an avatar’s hands would disappear if they encroached upon someone’s personal space. When we launched Horizon Worlds as an invite-only beta in 2020 we knew this was just the beginning and over time we would be iterating and improving based on community feedback. We’re constantly shipping new features based on people’s feedback, including this one.”
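Meta hasn’t described how the barrier is actually implemented, but the reported behavior amounts to clamping another avatar’s locomotion at a fixed radius around a user while leaving hand gestures alone. The Python sketch below is only a rough illustration of that idea under those assumptions; the Avatar structure, the clamp_approach function, and the flat 2D ground-plane model are made up for the example and are not Meta’s API.

import math
from dataclasses import dataclass

# Default boundary radius reported for Horizon Worlds and Venues: ~4 ft (1.2 m).
DEFAULT_BOUNDARY_M = 1.2

@dataclass
class Avatar:
    # 2D ground-plane position in meters; the structure is illustrative only.
    x: float
    y: float
    boundary_m: float = DEFAULT_BOUNDARY_M

def clamp_approach(mover, target, step_x, step_y):
    # Return a movement step for `mover`, truncated so it never enters
    # `target`'s personal boundary. Only locomotion toward the other avatar
    # is halted at the boundary radius; nothing else (e.g. hand poses) is touched.
    new_x, new_y = mover.x + step_x, mover.y + step_y
    dist = math.hypot(new_x - target.x, new_y - target.y)
    if dist >= target.boundary_m:
        return step_x, step_y          # proposed step stays outside the boundary
    if dist == 0.0:
        return 0.0, 0.0                # degenerate case: refuse to move onto the target
    # Pull the destination back out onto the boundary circle.
    scale = target.boundary_m / dist
    clamped_x = target.x + (new_x - target.x) * scale
    clamped_y = target.y + (new_y - target.y) * scale
    return clamped_x - mover.x, clamped_y - mover.y

me = Avatar(0.0, 0.0)
other = Avatar(2.0, 0.0)
print(clamp_approach(other, me, -1.5, 0.0))   # (-0.8, 0.0): halted 1.2 m away

In this toy example, an avatar 2 m away tries to step 1.5 m straight at the user and is stopped exactly 1.2 m out; the customizable boundary size Meta says it may add later would simply change boundary_m per user.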


Zuckerberg Tells Staff to Focus on Video Products as Meta’s Stock Plunges

Meta Chief Executive Officer Mark Zuckerberg rallied his employees to focus on video products after the company’s stock lost a quarter of its value. Bloomberg reports: At a company-wide virtual meeting Thursday, Zuckerberg explained that the historic stock drop was a result of Meta’s weak forecast for revenue in the current quarter, according to a person who attended and was not authorized to speak about it. Zuckerberg echoed his remarks of a day earlier to investors, telling employees that the social networking giant faced an “unprecedented level of competition,” with the rise of TikTok, the short-video platform Facebook doesn’t own. Zuckerberg appeared red-eyed and wore glasses, the person said. He said he might tear up because he’d scratched his eye — not because of the topics up for discussion.

Meta is already talking about ways to retain staff amid the stock rout. The social media giant is thinking of offering long weekends, Zuckerberg said, responding to a question on burnout. He also encouraged exhausted employees to use their vacation days. He added that based on his life experience, transitioning to a four-day work week would not be productive. Employee shares vest on Feb. 15, and manager conversations about bonuses and promotions happen in March — both of which could be factors in workers’ potential decisions to leave, according to another person familiar with the company’s plans.


Two US Senators Urge Federal Investigations Into Facebook About Safety – and Ad Reach

Two leading U.S. Senators “are urging federal regulators to investigate Facebook over allegations the company misled advertisers, investors and the public about public safety and ad reach on its platform,” reports CNBC:

On Thursday, Senator Warren urged the heads of the Department of Justice and Securities and Exchange Commission to open criminal and civil investigations into Facebook or its executives to determine if they violated U.S. wire fraud and securities laws. A day earlier, Senator Cantwell, chair of the Senate Commerce Committee, encouraged the Federal Trade Commission to investigate whether Facebook, now called Meta, violated the agency’s law against unfair or deceptive business practices. Cantwell’s letter was made public on Thursday…

In her letter to the FTC, Cantwell focused on Facebook’s claims about the safety of its products, in addition to the allegedly inflated ad projections… She suggested the agency investigate Facebook and, depending on what the evidence shows, pursue monetary relief for advertisers and disgorgement of allegedly ill-gotten gains.
According to the article, Senator Warren points to a whistleblower’s recent allegations that Facebook misled both investors and advertising customers about its ad reach. Warren’s letter also argued that Facebook may have violated securities law, describing “breathtakingly illegal conduct by one of the world’s largest social media companies.”

In addition, Warren “wrote that evidence increasingly suggests executives were aware the metric ‘was meaningfully and consistently inflated.’”

Bloomberg adds this quote from Senator Cantwell’s letter:
“A thorough investigation by the Commission and other enforcement agencies is paramount, not only because Facebook and its executives may have violated federal law, but because members of the public and businesses are entitled to know the facts regarding Facebook’s conduct as they make their decisions about using the platform.”


Meta Has a ‘Moral Obligation’ To Make Its Mental Health Research Transparent, Scientists Say

In an open letter to Mark Zuckerberg published Monday, a group of academics called for Meta to be more transparent about its research into how Facebook, Instagram, and WhatsApp affect the mental health of children and adolescents. The Verge reports: The letter calls for the company to allow independent reviews of its internal work, contribute data to external research projects, and set up an independent scientific oversight group. “You and your organizations have an ethical and moral obligation to align your internal research on children and adolescents with established standards for evidence in mental health science,” the letter, signed by researchers from universities around the world, reads.

The open letter comes after leaks from Facebook revealed some data from the company’s internal research, which found that Instagram was linked with anxiety and body image issues for some teenage girls. The research released so far, though, is limited and relies on subjective information collected through interviews. While this strategy can produce useful insights, it can’t prove that social media caused any of the mental health outcomes. The information available so far appears to show that the studies Facebook researchers conducted don’t meet the standards academic researchers use to conduct trials, the new open letter said. The information available also isn’t complete, the authors noted — Meta hasn’t made its research methods or data public, so it can’t be scrutinized by independent experts. The authors called for the company to allow independent review of past and future research, which would include releasing research materials and data.

The letter also asked Meta to contribute its data to ongoing independent research efforts on the mental health of adolescents. It’s a longstanding frustration that big tech companies don’t release data, which makes it challenging for external researchers to scrutinize and understand their products. “It will be impossible to identify and promote mental health in the 21st century if we cannot study how young people are interacting online,” the authors said. […] The open letter also called on Meta to establish an independent scientific trust to evaluate any risks to mental health from the use of platforms like Facebook and Instagram and to help implement “truly evidence-based solutions for online risks on a world-wide scale.” The trust could be similar to the existing Facebook Oversight Board, which helps the company with content moderation decisions.
