IDC: ‘All Eyes Will Be On Apple’ As Meta’s VR Strategy ‘Isn’t Sustainable’

An anonymous reader quotes a report from Ars Technica: A recent media release from market research firm IDC predicts that Meta (the parent company of Facebook) may not be able to compete in the mixed-reality business in the long run if its strategy remains unchanged. The media release offers a bird’s-eye view of the virtual reality hardware marketplace. In the release, IDC research manager Jitesh Ubrani said that, while “Meta continues to pour dollars into developing the metaverse, [the company’s] strategy of promoting low-cost hardware at the expense of profitability isn’t sustainable in the long run.”

A similar concern was raised by tech industry analyst Ming-Chi Kuo late last month. Kuo predicted that Meta would move to scale down its investment in virtual reality, creating an opening for Apple and other competitors. He also wrote that Meta’s practice of selling VR headsets at a loss is unsustainable. Currently, Meta owns 90 percent of the VR headset market, according to the IDC release; a distant second is ByteDance’s Pico, at just 4.5 percent. Overall, VR headset shipments jumped 241.6 percent year over year in the first quarter of 2022. But the industry faced significant supply issues in Q1 2021, contributing to “a favorable comparison” for this year’s Q1.

Like Kuo a couple of weeks ago, IDC research director Ramon Llamas said that “all eyes will be on Apple as it launches its first headset next year.” Apple’s headset is expected to be much more expensive than Meta’s offerings, driving up the average unit price for the product category across the board, and Llamas believes Apple’s offering “will appeal primarily to a small audience of early adopters and Apple fans.” In other words, don’t expect the first Apple headset to ship vastly more units than Meta’s Oculus Quest 2 right out of the gate. It’s just a first step in a long-term plan to own the mixed-reality market.

Read more of this story at Slashdot.

‘The Phone is Terrible For Cloud Gaming’

An anonymous reader shares a column: The promise of cloud gaming is that you can do it from anywhere using any device with internet access and a good enough browser (each cloud gaming service seems to have its own requirements on the browser front). You should be able to play super demanding games whether you’re on a work trip with nothing but a work laptop, or at home while the main TV is being hogged — or even if you just don’t feel like sitting on the couch. But the biggest promise of cloud gaming is that, no matter where you are, if you’ve got a phone then you’ve got all your games.

In practice, this is a bad idea. After spending the last few weeks rapturously using my Steam Deck near daily to play games in the cloud, I am never going to willingly attempt cloud gaming on my phone again. Valve’s enormous do-anything handheld PC has made me realize that, actually, sometimes dedicated gaming hardware is good! The Swiss Army knife approach to mobile gaming promised by cloud gaming on your phone is about as useful as the saw on a real Swiss Army knife. I appreciate the effort, but I don’t actually want to use it.

I’ve tried to make cloud gaming work on my phone a lot. I’ve attempted Red Dead Redemption 2 and Star Wars Jedi: Fallen Order and Halo and Gears of War and plenty of other games. Each time, I’m hit with wonder because, holy shit, these are demanding AAA games that usually require tons of expensive (and noisy) hardware playing on my phone. That feels like the delivery on a promise tech companies made me decades ago. But the wonder wears off when you cloud game on your phone for an extended period of time. Cloud gaming drains the phone’s battery quickly, which means you can and will feel battery anxiety.

Read more of this story at Slashdot.

Ubisoft To Shut Down Multiplayer For Older Games

A collection of over a dozen games from Ubisoft will see their online elements shut down on PC, PS3, and Xbox 360 in September, “which means players won’t be able to play their multiplayer components, access their online features, link Ubisoft accounts in-game, or install and access downloadable content,” reports The Verge. From the report: “Closing the online services for some older games allows us to focus our resources on delivering great experiences for players who are playing newer or more popular titles,” Ubisoft’s help page reads. With Assassin’s Creed Brotherhood having originally released in November 2010, it’s had almost 12 years of online support. But it’s always sad to see a piece of gaming history become inaccessible, especially given that the game’s multiplayer element was missing from its remaster on the PS4, Xbox One, and Nintendo Switch.

Alongside Brotherhood, the online services associated with 2011’s Assassin’s Creed Revelations on PS3 and Xbox 360 are also being shut down, as well as 2012’s Assassin’s Creed 3 on PC, PS3, Xbox 360, and Wii U. […] Other games set to have their online services decommissioned across various platforms this September include Driver San Francisco, Far Cry 3’s 2012 release, Ghost Recon Future Soldier, Prince of Persia the Forgotten Sands, Rayman Legends, and Splinter Cell: Blacklist. You can view the full list of games here.

Read more of this story at Slashdot.

Vim 9.0 Released

After many years of gradual improvement Vim now takes a big step with a major release. Besides many small additions the spotlight is on a new incarnation of the Vim script language: Vim9 script. Why Vim9 script:
A new script language — what is that needed for? Vim script has been growing over time while preserving backwards compatibility. That means bad choices from the past often can’t be changed, and compatibility with Vi restricts possible solutions. Execution is also quite slow: each line is parsed every time it is executed.

The main goal of Vim9 script is to drastically improve performance. This is accomplished by compiling commands into instructions that can be efficiently executed. An increase in execution speed of 10 to 100 times can be expected. A secondary goal is to avoid Vim-specific constructs and get closer to commonly used programming languages, such as JavaScript, TypeScript and Java.

The performance improvements can only be achieved by not being 100% backwards compatible. For example, making function arguments available by creating an “a:” dictionary involves quite a lot of overhead. In a Vim9 function this dictionary is not available. Other differences are more subtle, such as how errors are handled. For those with a large collection of legacy scripts: Not to worry! They will keep working as before. There are no plans to drop support for legacy script. No drama like with the deprecation of Python 2.

Read more of this story at Slashdot.

SQLite or PostgreSQL? It’s Complicated!

Miguel Grinberg, a Principal Software Engineer for Technical Content at Twilio, writes in a blog post: We take blogging very seriously at Twilio. To help us understand what content works well and what doesn’t on our blog, we have a dashboard that combines the metadata that we maintain for each article — such as author, team, product, and publication date — with traffic information from Google Analytics. Users can interactively request charts and tables while filtering and grouping the data in many different ways. I chose SQLite for the database that supports this dashboard, which, in early 2021 when I built this system, seemed like a perfect choice for what I thought would be a small, niche application that my teammates and I could use to improve our blogging. But almost a year and a half later, this application tracks daily traffic for close to 8000 articles across the Twilio and SendGrid blogs, with about 6.5 million individual daily traffic records, and with a user base that grew to over 200 employees.

At some point I realized that some queries were taking a few seconds to produce results, so I started to wonder if a more robust database such as PostgreSQL would provide better performance. Having publicly professed my dislike of performance benchmarks, I resisted the urge to look up any comparisons online, and instead embarked on a series of experiments to accurately measure the performance of these two databases for the specific use cases of this application. What follows is a detailed account of my effort, the results of my testing (including a surprising twist!), and my analysis and final decision, which ended up being more involved than I expected. […] If you are going to take one thing away from this article, I hope it is that the only benchmarks that are valuable are those that run on your own platform, with your own stack, with your own data, and with your own software. And even then, you may need to add custom optimizations to get the best performance.
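The post’s takeaway — benchmark on your own platform, stack, and data — can be sketched with a tiny timing harness. This is a minimal illustration, not Grinberg’s actual methodology: the `daily_traffic` table and its contents are made up, and the demo runs against an in-memory SQLite database (for PostgreSQL you would pass a psycopg2 connection instead, adjusting the query’s placeholder style as needed).

```python
import sqlite3
import time

def time_query(conn, query, runs=5):
    """Run `query` several times on a DB-API connection and return the
    fastest wall-clock time in seconds (min-of-N reduces timer noise)."""
    best = float("inf")
    for _ in range(runs):
        cur = conn.cursor()
        start = time.perf_counter()
        cur.execute(query)
        cur.fetchall()  # force the full result set to be produced
        best = min(best, time.perf_counter() - start)
    return best

# Demo: a hypothetical traffic table filled with synthetic rows.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE daily_traffic (article_id INTEGER, day TEXT, views INTEGER)"
)
conn.executemany(
    "INSERT INTO daily_traffic VALUES (?, ?, ?)",
    [(i % 100, f"2022-06-{i % 30 + 1:02d}", i) for i in range(10_000)],
)
elapsed = time_query(
    conn, "SELECT article_id, SUM(views) FROM daily_traffic GROUP BY article_id"
)
print(f"best of 5 runs: {elapsed * 1000:.2f} ms")
```

Running the same harness with the same data loaded into both databases gives an apples-to-apples number for the queries that actually matter to your application, which is precisely the kind of comparison the article argues for.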

Read more of this story at Slashdot.

Berlin Builds a Giant Thermos to Help Heat Homes This Winter

The Associated Press reports on a massive new 150-foot (45-meter) tower going up in Berlin — just to hold 56 million liters (14.8 million gallons) of hot water that “will help heat Berlin homes this winter even if Russian gas supplies dry up…”

“[T]he new facility unveiled Thursday at Vattenfall’s Reuter power station will hold water brought to almost boiling temperature using electricity from solar and wind power plants across Germany. During periods when renewable energy exceeds demand the facility effectively acts as a giant battery, though instead of storing electricity it stores heat…”

“It’s a huge thermos that helps us to store the heat when we don’t need it,” said Tanja Wielgoss, who heads the Sweden-based company’s heat unit in Germany. “And then we can release it when we need to use it…. Sometimes you have an abundance of electricity in the grids that you cannot use anymore, and then you need to turn off the wind turbines,” said Wielgoss. “Where we are standing we can take in this electricity.”

The 50-million-euro ($52 million) facility will have a thermal capacity of 200 megawatts — enough to meet much of Berlin’s hot water needs during the summer and about 10% of what it requires in the winter. The vast, insulated tank can keep water hot for up to 13 hours, helping bridge short periods when there’s little wind or sun….
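The quoted figures are roughly self-consistent, as a back-of-envelope check shows. This assumes water’s specific heat of about 4.19 kJ/(kg·K) and 1 kg per liter; the implied ~40 °C temperature swing is an inference from these numbers, not a figure from the AP report.

```python
# Back-of-envelope check of the Vattenfall tank figures.
mass_kg = 56e6     # 56 million liters of water ≈ 56 million kg
power_w = 200e6    # 200 MW thermal output
hours = 13         # quoted discharge duration
c_water = 4186     # specific heat of water, J/(kg·K)

energy_j = power_w * hours * 3600          # energy delivered: 200 MW for 13 h
delta_t = energy_j / (mass_kg * c_water)   # temperature swing needed to store it

print(f"stored energy: {energy_j / 3.6e9:.0f} MWh")        # → 2600 MWh
print(f"implied temperature swing: {delta_t:.0f} °C")       # → 40 °C
```

A roughly 40 °C swing is plausible for a district-heating tank holding near-boiling water and returning it at moderate temperatures, which suggests the reported volume, power, and duration hang together.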

Berlin’s top climate official, Bettina Jarasch, said the faster such heat storage systems are built, the better. “Due to its geographic location the Berlin region is even more dependent on Russian fossil fuels than other parts of Germany,” she told The Associated Press. “That’s why we’re really in a hurry here.”
“While it will be Europe’s biggest heat storage facility when it’s completed at the end of this year, an even bigger one is already being planned in the Netherlands.”

Read more of this story at Slashdot.