New Intel Linux Graphics Driver Patches Released, Up To 10-15% Better Performance

A new set of patches has been released for the Intel Linux graphics driver that “can provide 10-15% better performance when operating in the tuned mode,” reports Phoronix. From the report: The set of Intel i915 Linux kernel graphics driver patches is about exposing the Intel RPS (Requested Power State) up/down thresholds. Right now the Intel Linux kernel driver has static values set for the up/down thresholds between power states, while these patches would make them dynamically configurable by user-space. Google engineer Syed Faaiz Hussain raised the issue, reporting that they experimented with Intel RPS tuning and managed up to 15% better performance: Counter-Strike: Global Offensive with OpenGL saw a 14.5% boost, CS:GO with Vulkan was 12.9% faster, and Civilization VI with OpenGL was 11% faster, while Strange Brigade was unchanged. No other game numbers were provided.

But as this is about changing the threshold for how aggressively the Intel graphics hardware switches power states, the proposed patches leave it up to user-space to adjust the thresholds as they wish. Google engineers are interested in hooking this into Feral’s GameMode so that the values could be automatically tuned when launching games and then returned to their former state when done gaming, in order to maximize battery life / power efficiency. The only downside with these current patches is that they work only for non-GuC-based platforms… So the latest Alder/Raptor Lake notebooks as well as Intel DG2/Alchemist discrete graphics currently aren’t able to make use of this tuning option.
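To make the idea concrete, here is a minimal C sketch of the kind of user-space tuner GameMode could invoke: it saves the current thresholds, makes up-clocking more aggressive for a gaming session, and restores the defaults afterwards. The sysfs paths and percentage values are assumptions for illustration only; the real interface is whatever the patches ultimately expose.

```c
/*
 * Minimal sketch (not from the actual patches) of the tuning flow
 * described above: save the i915 RPS up/down thresholds, make
 * up-clocking more aggressive while a game runs, then restore the
 * saved defaults.  The sysfs paths and values are hypothetical.
 */
#include <stdio.h>

#define UP_PATH   "/sys/class/drm/card0/gt/gt0/rps_up_threshold_pct"   /* hypothetical */
#define DOWN_PATH "/sys/class/drm/card0/gt/gt0/rps_down_threshold_pct" /* hypothetical */

static int read_pct(const char *path)
{
    FILE *f = fopen(path, "r");
    int v = -1;

    if (f) {
        if (fscanf(f, "%d", &v) != 1)
            v = -1;
        fclose(f);
    }
    return v;
}

static void write_pct(const char *path, int v)
{
    FILE *f = fopen(path, "w");

    if (f) {
        fprintf(f, "%d\n", v);
        fclose(f);
    }
}

int main(void)
{
    /* Save the current (default) thresholds so they can be restored. */
    int up = read_pct(UP_PATH);
    int down = read_pct(DOWN_PATH);

    if (up < 0 || down < 0) {
        fprintf(stderr, "RPS threshold attributes not exposed on this kernel\n");
        return 1;
    }

    /* Gaming mode: ramp clocks up sooner and back down later. */
    write_pct(UP_PATH, 50);
    write_pct(DOWN_PATH, 90);

    /* ... the game would run here, e.g. launched by GameMode ... */

    /* Restore the saved defaults for battery life / power efficiency. */
    write_pct(UP_PATH, up);
    write_pct(DOWN_PATH, down);
    return 0;
}
```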

Read more of this story at Slashdot.

How ‘Homestar Runner’ Re-Emerged After the End of Flash

Wikipedia describes Homestar Runner as “a blend of surreal humour, self-parody, and references to popular culture, in particular video games, classic television, and popular music.” After launching in 2000, the web-based cartoon became a cultural phenomenon, as co-creator Mike Chapman remembered in 2017:

On the same day we received a demo of a song that John Linnell from They Might Be Giants recorded for a Strong Bad Email and a full-size working Tom Servo puppet from Jim Mallon from Mystery Science Theater 3000…. The Homestar references in the Buffy and Angel finales forever ago were huge. And there was this picture of Joss Whedon in a Strong Bad shirt from around that time that someone sent us that we couldn’t believe. Years later, a photo of Geddy Lee from Rush wearing a Strong Bad hat on stage circulated which similarly freaked us out. We have no idea if he knew what Strong Bad was, but our dumb animal character was on his head while he probably shredded ‘Working Man’ so I’ll take it!

After a multi-year hiatus starting around 2009, the site has been updating only sporadically — and some worried that the end of Flash also meant the end of the Flash-based cartoon and its website altogether. But on the day Flash Player was officially discontinued — December 31st, 2020 — a “post-Flash update” appeared at HomestarRunner.com:
What happened to our website? Flash is finally dead-dead-dead so something drastic had to be done so people could still watch their favorite cartoons and sbemails with super-compressed mp3 audio and hidden clicky-clicky easter eggs…!

[O]nce you click “come on in,” you’ll find yourself in familiar territory thanks to the Ruffle Project. It emulates Flash in such a way that all browsers and devices can finally play our cartoons and even some games…. Your favorite easter eggs are still hidden and now you can even choose to watch a YouTube version if there is one.

Keep in mind, Ruffle is still in development so not everything works perfectly. Games made after, say, 2007 will probably be pretty janky but Ruffle plans on ultimately supporting those too one day. And any cartoons with video elements in them (Puppet Jams, death metal) will just show you an empty box where the video should be. But hang in there and one day everything will be just like it was that summer when we got free cable somehow and Grandma still lived in the spare bedroom.

And since then, new content has quietly been appearing at HomestarRunner.com. (Most recently, on Thursday the site added a teaser for an upcoming Halloween video.)

The Homestar Runner wiki is tracking this year’s new content, which includes:

- Strong Bad livestreaming his play of a text adventure (titled “Disk 4 of 12 — Vampire’s Castle”) on September 19th
- Strong Sad streaming a demo of the expando deck for Trogdor!! The Board Game on July 2nd
- A Twitch parody in which Strong Bad livestreams a speedrun of a horrific beef-themed game (titled “Strong-Play: Marzipan Beef Reverser”) on April 25th
- A new Strong Bad Email on April 1st

And past videos are now also being uploaded to the site’s official YouTube channel.

Read more of this story at Slashdot.

Coding Mistake Made Intel GPUs 100X Slower in Ray Tracing

Intel Linux GPU driver developers have released an update that results in a massive 100X boost in ray tracing performance. This is something to be celebrated, of course. However, on the flip side, the driver was 100X slower than it should have been because of a memory allocation oversight. Tom’s Hardware reports: Linux-centric news site Phoronix reports that a fix merged into the open-source Intel Mesa Vulkan driver was implemented by Intel Linux graphics driver engineering stalwart Lionel Landwerlin on Thursday. The developer wryly commented that the merge request, which already landed in Mesa 22.2, would deliver “Like a 100x (not joking) improvement.” Intel has been working on Vulkan raytracing support since late 2020, but this fix is better late than never.

Normally, the Vulkan driver ensures that temporary memory used for Vulkan ray tracing work lives in local memory, i.e., the very fast graphics memory onboard the discrete GPU. But a line of code was missing, so this allocation flag was never set. As a result, the Vulkan driver shuttled ray tracing data back and forth between the GPU and slower offboard system memory, dragging down ray tracing performance significantly. It turns out, as per our headline, that setting the “ANV_BO_ALLOC_LOCAL_MEM” flag ensured that VRAM would be used instead, and a 100X performance boost was the result. “Mesa 22.2, which includes the new code, is due to be branched in the coming days and will be included in a bundle of other driver refinements, which should reach end-users by the end of August,” adds the report.
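The failure mode is easy to model. Below is a tiny self-contained C sketch — a toy, not the Mesa source; everything except the flag name is invented — showing how a placement decision driven by a flags bitmask silently degrades when a call site forgets the flag:

```c
/*
 * Toy model of the bug, not the actual Mesa code: buffer placement is
 * decided purely by a flags bitmask, so a call site that forgets
 * ANV_BO_ALLOC_LOCAL_MEM (the real flag name, per the report) silently
 * gets slow system memory instead of device-local VRAM.
 */
#include <stdio.h>

#define ANV_BO_ALLOC_LOCAL_MEM (1u << 0)

enum placement { MEM_SYSTEM, MEM_VRAM };

/* Stand-in for the driver's buffer-object allocator. */
static enum placement alloc_bo(unsigned int flags)
{
    return (flags & ANV_BO_ALLOC_LOCAL_MEM) ? MEM_VRAM : MEM_SYSTEM;
}

static const char *placement_name(enum placement p)
{
    return p == MEM_VRAM ? "device-local VRAM" : "offboard system memory";
}

int main(void)
{
    /* The buggy call site: no flag, so ray tracing scratch data ended
     * up in system memory and was shuttled across the bus. */
    enum placement before = alloc_bo(0);

    /* The one-line fix: request device-local memory explicitly. */
    enum placement after = alloc_bo(ANV_BO_ALLOC_LOCAL_MEM);

    printf("before the fix: %s\n", placement_name(before));
    printf("after the fix:  %s\n", placement_name(after));
    return 0;
}
```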

Read more of this story at Slashdot.

More Apple M1 Ultra Benchmarks Show It Doesn’t Beat the Best GPUs from Nvidia and AMD

Tom’s Guide ran the Geekbench 5.4 CPU benchmarks on a Mac Studio workstation equipped with an M1 Ultra “to get a sense of how effectively it handles single-core and multi-core workflows.”

“Since our M1 Ultra is the best you can buy (at a rough price of $6,199), it sports a 20-core CPU and a 64-core GPU, as well as 128GB of unified memory (RAM) and a 2TB SSD.”

Slashdot reader exomondo shares their results:
We ran the M1 Ultra through the Geekbench 5.4 CPU benchmarking test multiple times and, after averaging the results, we found that the M1 Ultra does indeed outperform top-of-the-line Windows gaming PCs when it comes to multi-core CPU performance. Specifically, the M1 Ultra outperformed a recent Alienware Aurora R13 desktop we tested (w/ Intel Core i7-12700KF, GeForce RTX 3080, 32GB RAM), an Origin Millennium (2022) we just reviewed (Core i9-12900K CPU, RTX 3080 Ti GPU, 32GB RAM), and an even more powerful, RTX 3090-equipped HP Omen 45L we tested recently (Core i9-12900K, GeForce RTX 3090, 64GB RAM) in the Geekbench 5.4 multi-core CPU benchmark.

However, as you can see from the chart of results below, the M1 Ultra couldn’t match its Intel-powered competition in terms of CPU single-core performance. The Ultra-powered Studio also proved slower to transcode video than the aforementioned gaming PCs, taking nearly 4 minutes to transcode a 4K video down to 1080p using Handbrake. All of the gaming PCs I just mentioned completed the same task faster, over 30 seconds faster in the case of the Origin Millennium. Before we even get into the GPU performance tests, it’s clear that while the M1 Ultra excels at multi-core workflows, it doesn’t trounce the competition across the board. When we ran our Mac Studio review unit through the Geekbench 5.4 OpenCL test (which benchmarks GPU performance by simulating common tasks like image processing), the Ultra earned an average score of 83,868. That’s quite good, but again it fails to outperform Nvidia GPUs in similarly-priced systems.

They also share some results from the OpenCL Benchmarks browser, which publicly displays scores from different GPUs that users have uploaded:
Apple’s various M1 chips are on the list as well, and while the M1 Ultra leads that pack it’s still quite a ways down the list, with an average score of 83,940. Incidentally, that means it ranks below much older GPUs like Nvidia’s GeForce RTX 2070 (85,639) and AMD’s Radeon VII (86,509). So here again we see that while the Ultra is fast, it can’t match the graphical performance of GPUs that are 2-3 years old at this point — at least, not in these synthetic benchmarks. These tests don’t always accurately reflect real-world CPU and GPU performance, which can be dramatically influenced by what programs you’re running and how they’re optimized to make use of your PC’s components.
Their conclusion?
When it comes to tasks like photo editing or video and music production, the M1 Ultra w/ 128GB of RAM blazes through workloads, and it does so while remaining whisper-quiet. It also makes the Mac Studio a decent gaming machine, as I was able to play less demanding games like Crusader Kings III, Pathfinder: Wrath of the Righteous and Total War: Warhammer II at reasonable (30+ fps) framerates. But that’s just not on par with the performance we expect from high-end GPUs like the Nvidia GeForce RTX 3090….
Of course, if you don’t care about games and are in the market for a new Mac with more power than just about anything Apple’s ever made, you want the Studio with M1 Ultra.

Read more of this story at Slashdot.

Vice Mocks GIFs as ‘For Boomers Now, Sorry’. (And For Low-Effort Millennials)

“GIF folders were used by ancient civilisations as a way to store and catalogue animated pictures that were once employed to convey emotion,” Vice writes:

Okay, you probably know what a GIF folder is — but the concept of a special folder needed to store and save GIFs is increasingly alien in an era where every messaging app has its own in-built GIF library you can access with a single tap. And to many youngsters, GIFs themselves are increasingly alien too — or at least, okay, increasingly uncool. “Who uses gifs in 2020 grandma,” one Twitter user speedily responded to Taylor Swift in August that year when the singer-songwriter opted for an image of Dwayne “The Rock” Johnson mouthing the words “oh my god” to convey her excitement at reaching yet another career milestone.

You don’t have to look far to find other tweets or TikToks mocking GIFs as the preserve of old people — which, yes, now means millennials. How exactly did GIFs become so embarrassing? Will they soon disappear forever, like Homer Simpson backing up into a hedge…?

Gen Z might think GIFs are beloved by millennials, but at the same time, many millennials are starting to see GIFs as a boomer plaything. And this is the first and easiest explanation as to why GIFs are losing their cultural cachet. Whitney Phillips, an assistant professor of communication at Syracuse University and author of multiple books on internet culture, says that early adopters have always grumbled when new (read: old) people start to encroach on their digital space. Memes, for example, were once subcultural and niche. When Facebook came along and made them more widespread, Redditors and 4Chan users were genuinely annoyed that people capitalised on the fruits of their posting without putting in the cultural work. “That democratisation creates a sense of disgust with people who consider themselves insiders,” Phillips explains. “That’s been central to the process of cultural production online for decades at this point….”

In 2016, Twitter launched its GIF search function, as did WhatsApp and iMessage. A year later, Facebook introduced its own GIF button in the comment section on the site. GIFs became not only centralised but highly commercialised, culminating in Facebook buying GIPHY for $400 million in 2020. “The more GIFs there are, maybe the less they’re regarded as being special treasures or gifts that you’re giving people,” Phillips says. “Rather than looking far and wide to find a GIF to send you, it’s clicking the search button and typing a word. The gift economy around GIFs has shifted….”

Linda Kaye, a cyberpsychology professor at Edge Hill University, hasn’t done direct research in this area but theorises that the ever-growing popularity of video-sharing on TikTok means younger generations are more used to “personalised content creation”, and GIFs can seem comparatively lazy.

The GIF was invented in 1987 “and it’s important to note the format has already fallen out of favour and had a comeback multiple times before,” the article points out. It cites Jason Eppink, an independent artist and curator who curated an exhibition on GIFs for the Museum of the Moving Image in New York in 2014, who highlighted how GIFs were popular with GeoCities users in the 90s, “so when Facebook launched, they didn’t support GIFs…. They were like, ‘We don’t want this ugly symbol of amateur web to clutter our neat and uniform cool new website.’” But then GIFs had a resurgence on Tumblr.

Vice concludes that while even Eppink no longer uses GIFs, “Perhaps the waxing and waning popularity of the GIF is an ironic mirror of the format itself — destined to repeat endlessly, looping over and over again.”

Read more of this story at Slashdot.