Linux 6.1 Released With Initial Support for Rust-Based Kernel Development

“Linus has released the 6.1 kernel,” reports LWN.net — and it’s the one with initial support for kernel development in Rust.
Elsewhere LWN explains the specifics of this milestone:
No system with a production 6.1 kernel will be running any Rust code, but this change does give kernel developers a chance to play with the language in the kernel context and get a sense for how Rust development feels….

There are other initiatives underway, including the writing of an Apple graphics driver in the Rust language. For the initial merge into the mainline kernel, though, Linus Torvalds made it clear that as little functionality as possible should be included. So those drivers and their support code were trimmed out and must wait for a future kernel release. What is there is the support needed to build a module that can be loaded into the kernel, along with a small sample module…. Torvalds asked for something that could do “hello world” and that is what we got. It is something that can be played with, but it cannot be used for any sort of real kernel programming at this point.
That situation will, hopefully, change in the near future.
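
For the curious, the merged support really is minimal. Here is a sketch in the spirit of the in-tree rust_minimal sample (the module! macro’s exact fields and the kernel crate’s APIs have shifted between patch revisions, and this builds only inside a kernel tree configured with CONFIG_RUST=y):

```rust
// SPDX-License-Identifier: GPL-2.0
//! A minimal "hello world" kernel module in Rust, modeled on the
//! samples/rust/rust_minimal.rs sample from the 6.1 merge.

use kernel::prelude::*;

module! {
    type: HelloRust,
    name: "hello_rust",
    author: "Rust for Linux Contributors",
    description: "Minimal Rust kernel module sample",
    license: "GPL",
}

struct HelloRust;

impl kernel::Module for HelloRust {
    fn init(_module: &'static ThisModule) -> Result<Self> {
        // Printed to the kernel log when the module is loaded.
        pr_info!("Hello, world (from Rust)\n");
        Ok(HelloRust)
    }
}

impl Drop for HelloRust {
    fn drop(&mut self) {
        // Printed when the module is unloaded.
        pr_info!("Goodbye, world (from Rust)\n");
    }
}
```

Loading the module with insmod prints the greeting to the kernel log; removing it prints the farewell. As LWN notes, that is roughly the full extent of what Rust code can do in 6.1.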

Meanwhile, Linux 6.1 also includes “support for destructive BPF programs, some significant io_uring performance improvements, better user-space control over transparent huge-page creation, improved memory-tiering support.”

The Register adds:
Other interesting additions include more support for the made-in-China LoongArch CPU architecture, introductory work to support Wi-Fi 7, and security fixes for some flaky Wi-Fi routines in previous versions of the kernel. There’s also plenty of effort to improve the performance of Linux on laptops, and enhanced power efficiency for AMD’s PC-centric Ryzen silicon.

Read more of this story at Slashdot.

C++ Zooms Past Java in Programming Language Popularity Contest

“Java is no longer among the top three most popular programming languages in the TIOBE Index,” reports the Register, “one of several not particularly definitive yardsticks by which such things are measured.”

According to Paul Jansen, CEO of Netherlands-based TIOBE Software, the rising popularity of C++ has pushed Java down a notch. The index’s rankings are now:
– Python in first place
– C second
– C++ third, and
– Java fourth.

“C++ surpassed Java for the first time in the history of the TIOBE Index, which means that Java is at position 4 now,” said Jansen in the index’s December update. “This is the first time that Java is not part of the top 3 since the beginning of the TIOBE Index in 2001.”

The surge in C++, perhaps helped in part by the stable release of C++20 in December 2020, is particularly ironic in light of the language’s recent dismissal by Microsoft Azure CTO Mark Russinovich, whose dismissal coincided with industry evangelism for Rust and its memory-safety guarantees.
The article points out that other rankings still show a slightly higher popularity for Java.

And ZDNet notes the other languages rising quickly in popularity over the last 12 months:
In a year-on-year comparison in Tiobe’s index, the languages now in the top 20 that made significant gains over the period are: Rust (up from 27 to 20), Objective-C (up from 29 to 19), science-specialized MATLAB (20 to 14), and Google’s Go language (up from 19 to 12).
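
That Rust memory-safety pitch, for what it’s worth, is a compile-time property: the borrow checker rejects aliasing bugs that C++ only surfaces at runtime, if at all. A minimal standalone illustration (not drawn from any of the articles above):

```rust
fn main() {
    let mut scores = vec![10, 20, 30];
    let first = &scores[0]; // shared, read-only borrow of `scores`

    // Uncommenting the next line is a compile-time error, not a runtime crash:
    // pushing may reallocate the vector, which would leave `first` dangling.
    // scores.push(40);

    println!("first score: {first}");
}
```

In C++, the equivalent push_back on a std::vector silently invalidates outstanding references; Rust refuses to compile the program instead.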

Read more of this story at Slashdot.

China Bans Deepfakes Created Without Permission Or For Evil

China’s Cyberspace Administration has issued guidelines on how to do deepfakes the right way. The Register reports: [T]he Cyberspace Administration (CAC) has issued regulations that prohibit creating deepfakes without the subject’s permission, or using them to depict or utter anything that could be considered counter to the national interest. Anything counter to socialist values falls under that description, as does any form of “Illegal and harmful information” or using AI-generated humans in an attempt to deceive or slander. But the rules also suggest China expects synthetic humans will be widely used. For instance, they allow use of deepfakes in applications such as chatbots. In such scenarios, deepfakes must be flagged as digital creations.

The document also envisages that deepfakes will be used by online publishers, which must take into account China’s myriad other rules about acceptable online content. Including the one that censored images of Winnie the Pooh online, as the beloved bear – as depicted by illustrator E. H. Shepard – was felt to resemble, and mock, China’s president-for-probably-life Xi Jinping. The Register therefore suggests it will be a very, very brave Chinese developer that creates a photorealistic ursine chatbot or avatar.

The regulations also spell out how the creators of deepfakes — who are termed “deep synthesis service providers” — must take care that their AI/ML models and algorithms are accurate and regularly revised, and ensure the security of data they collect. The rules also include a requirement for registration of users — including their real names. Because allowing an unknown person to mess with deepfakes would not do. The rules are pitched as ensuring that synthesis tech avoids the downsides and delivers benefits to China. Or, as Beijing puts it (albeit in translation), deepfakes must “Promote the healthy development of internet information services and maintain a good ecology of cyberspace.” The regulations come into force on January 10, 2023.

Read more of this story at Slashdot.

‘Diablo IV’ Developers Work Long Hours, Bracing For Impending Release

Activision Blizzard employees developing the upcoming dark fantasy action role-playing game “Diablo IV” say it will be hard to meet a June 6, 2023, release date without working significant overtime, in a process they say has been plagued by mismanagement. The release date, which has not been announced publicly, comes in the same month that Microsoft’s proposed $68.7 billion acquisition of the company is set to close. The Washington Post reports: Fifteen current and former Blizzard employees spoke to The Washington Post on the condition of anonymity because they were not authorized to speak publicly about company operations. They described a mounting sense of dissatisfaction and malaise among employees as they endured leadership changes at Activision Blizzard and on the “Diablo IV” team. The Diablo team has been losing talent for over a year, as employees look for more competitive wages and better working conditions elsewhere, according to employees. One group of about 20 developers working on one portion of the game saw about half of its members leave within a year, according to two former employees. Blizzard did not comment on attrition on the “Diablo IV” team. Last January, in a VentureBeat interview, Activision Blizzard CEO Bobby Kotick attributed the company’s stock-price drop to the delay of “Diablo” rather than to an ongoing sexual harassment lawsuit filed against the company in July 2021. “I think what affected the stock price more than [the sexual harassment investigation] is pushing out [the release dates of] ‘Overwatch’ and ‘Diablo,'” he said, explaining that this was one of the reasons he was selling the company to Microsoft. His comments frustrated some of the company’s developers, who felt he was blaming them unfairly.

“Crunching” in the video game industry is a common practice, but it has become controversial in recent years, even as game developers continue working late into the evenings and on weekends, sometimes secretly. Despite wishing to avoid crunch, some Blizzard employees have in recent months found themselves facing long hours again, unwilling to publish an unfinished product. They described consequences of crunch that included chronic back injuries, insomnia and anxiety, as well as less time to spend with family or to maintain romantic relationships. […] “We were never going to hit our date without crunch,” said a former Blizzard employee of a previously intended “Diablo IV” internal release date. “And even with crunch, I don’t even know if we would have hit our date.” Activision Blizzard is offering “Diablo IV” developers a deal in which they will gain twice as many company stock shares when the game releases. Employees said they were offered more stock to stay on based on their position and seniority, from around $5,000 in value for entry-level workers to upward of $50,000 for more senior employees. […]

“Diablo IV” had multiple internal, unannounced release dates. At one point, 2021 was floated as an internal goal. A more specific date emerged — December 2022 — after the title was publicly announced in 2019 at the company’s annual gaming convention BlizzCon. Developers appealed for more time to avoid massive cuts to the game. After moving the date to April 2023, the team felt it still needed more time and was able to get the June date approved. The June date feels harder to move, several employees say. “We’re at the point where they’re not willing to delay the game anymore,” said a current Blizzard Albany employee. “So we all just have to go along and figure out how much we’re willing to hurt ourselves to make sure the game gets released in a good enough state.”

Read more of this story at Slashdot.

Why the Laws of Physics Don’t Actually Exist

Theoretical physicist Sankar Das Sarma wrote a thought-provoking essay for New Scientist magazine’s Lost in Space-Time newsletter:
I was recently reading an old article by string theorist Robbert Dijkgraaf in Quanta Magazine entitled “There are no laws of physics”. You might think it a bit odd for a physicist to argue that there are no laws of physics but I agree with him. In fact, not only do I agree with him, I think that my field is all the better for it. And I hope to convince you of this too.

First things first. What we often call laws of physics are really just consistent mathematical theories that seem to match some parts of nature. This is as true for Newton’s laws of motion as it is for Einstein’s theories of relativity, Schrödinger’s and Dirac’s equations in quantum physics or even string theory. So these aren’t really laws as such, but instead precise and consistent ways of describing the reality we see. This should be obvious from the fact that these laws are not static; they evolve as our empirical knowledge of the universe improves.

Here’s the thing. Despite many scientists viewing their role as uncovering these ultimate laws, I just don’t believe they exist…. I know from my 40 years of experience in working on real-life physical phenomena that the whole idea of an ultimate law based on an equation using just the building blocks and fundamental forces is unworkable and essentially a fantasy. We never know precisely which equation describes a particular laboratory situation. Instead, we always have to build models and approximations to describe each phenomenon even when we know that the equation controlling it is ultimately some form of the Schrödinger equation!
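
For reference, the equation Das Sarma invokes is simple to state. In its general time-dependent form,

$$ i\hbar \, \frac{\partial}{\partial t} \Psi(t) = \hat{H} \, \Psi(t), $$

where $\hat{H}$ is the Hamiltonian encoding the system’s energies and interactions. His point is that stating it is the easy part: for any real laboratory system, the Hamiltonian is far too complicated to solve directly, so physicists work with models and approximations instead.
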
Even with quantum mechanics, space and time are variables that have to be “put in by hand,” the article argues, “when space and time should come out naturally from any ultimate law of physics. This has remained perhaps the greatest mystery in fundamental physics with no solution in sight….”

“It is difficult to imagine that a thousand years from now physicists will still use quantum mechanics as the fundamental description of nature…. I see no particular reason that our description of how the physical universe seems to work should reach the pinnacle suddenly in the beginning of the 21st century and become stuck forever at quantum mechanics. That would be a truly depressing thought…!”

“Our understanding of the physical world must continue indefinitely, unimpeded by the search for ultimate laws. Laws of physics continuously evolve — they will never be ultimate.”
Thanks to long-time Slashdot reader InfiniteZero for sharing the article!

Read more of this story at Slashdot.

Teenager’s Incurable Cancer Cleared With Revolutionary DNA-Editing Technique

“A teenage girl’s incurable cancer has been cleared from her body,” reports the BBC, “in the first use of a revolutionary new type of medicine….”

Doctors at Great Ormond Street Hospital used “base editing” to perform a feat of biological engineering to build her a new living drug. Six months later the cancer is undetectable, but Alyssa is still being monitored in case it comes back.

Alyssa, who is 13 and from Leicester, was diagnosed with T-cell acute lymphoblastic leukaemia in May last year…. Her cancer was aggressive. Chemotherapy, and then a bone-marrow transplant, were unable to rid it from her body…. The team at Great Ormond Street used a technology called base editing, invented only six years ago, which allows scientists to zoom in on a precise part of the genetic code and then alter the molecular structure of just one base, converting it into another and changing the genetic instructions. The large team of doctors and scientists used this tool to engineer a new type of T-cell that was capable of hunting down and killing Alyssa’s cancerous T-cells….

After a month, Alyssa was in remission and was given a second bone-marrow transplant to regrow her immune system…. Alyssa is just the first of 10 people to be given the drug as part of a clinical trial.
Her mother said that a year ago she’d been dreading Christmas, “thinking this is our last with her”. But it wasn’t.

And the BBC adds that applying the technology to cancer “only scratches the surface of what base editing could achieve…. There are already trials of base editing under way in sickle-cell disease, as well as high cholesterol that runs in families and the blood disorder beta-thalassemia.”
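
As a loose programming analogy only (this has nothing to do with the actual chemistry, and the function below is purely hypothetical), a base edit is a targeted, in-place substitution of a single character in a long sequence, rather than the cut-and-repair of earlier CRISPR techniques:

```rust
// Toy model of a single-base edit. Real base editors chemically convert one
// DNA base in place (e.g. C -> T) without cutting the strand; this string
// edit only illustrates "verify the target, then change exactly one letter".
fn base_edit(seq: &str, pos: usize, from: char, to: char) -> Result<String, String> {
    let mut bases: Vec<char> = seq.chars().collect();
    match bases.get(pos) {
        Some(&b) if b == from => {
            bases[pos] = to;
            Ok(bases.into_iter().collect())
        }
        Some(&b) => Err(format!("expected {from} at position {pos}, found {b}")),
        None => Err(format!("position {pos} is out of range")),
    }
}

fn main() {
    let edited = base_edit("ATCGGA", 2, 'C', 'T').unwrap();
    assert_eq!(edited, "ATTGGA");
    println!("{edited}");
}
```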

Read more of this story at Slashdot.

OSnews Decries ‘The Mass Extinction of Unix Workstations’

Anyone remember the high-end commercial UNIX workstations from a few decades ago, from companies like IBM, DEC, SGI, and Sun Microsystems?
Today OSnews looked back — but also explored what happens when you try to buy one today:
As x86 became ever more powerful and versatile, and with the rise of Linux as a capable UNIX replacement and the adoption of the NT-based versions of Windows, the days of the UNIX workstations were numbered. A few years into the new millennium, virtually all traditional UNIX vendors had ended production of their workstations and in some cases even their associated architectures, with a lacklustre collective effort to move over to Intel’s Itanium — which didn’t exactly go anywhere and is now nothing more than a sour footnote in computing history.

Approaching roughly 2010, all the UNIX workstations had disappeared…. and by now, they’re all pretty much dead (save for Solaris). Users and industries moved on to x86 on the hardware side, and Linux, Windows, and in some cases, Mac OS X on the software side…. Over the past few years, I have come to learn that if you want to get into buying, using, and learning from UNIX workstations today, you’ll run into various problems which can roughly be filed into three main categories: hardware availability, operating system availability, and third-party software availability.

Their article details their own attempts to buy one over the years, ultimately concluding the experience “left me bitter and frustrated that so much knowledge — in the form of documentation, software, tutorials, drivers, and so on — is disappearing before our very eyes.”

Shortsightedness and disinterest in their own heritage by corporations, big and small, is destroying entire swaths of software, and as more years pass by, it will get ever harder to get any of these things back up and running…. As for all the third-party software — well, I’m afraid it’s too late for that already. Chasing down the rightsholders is already an incredibly difficult task, and even if you do find them, they are probably not interested in helping you, and even if by some miracle they are, they most likely no longer even have the ability to generate the required licenses or release versions with the licensing ripped out. Software like Pro/ENGINEER and SoftWindows for UNIX is most likely gone forever….

Software is dying off at an alarming rate, and I fear there’s no turning the tide of this mass extinction.
The article also wonders why companies like HPE don’t just “dump some ISO files” onto an FTP server, along with patch depots and documentation. “This stuff has no commercial value, they’re not losing any sales, and it will barely affect their bottom line.”

Read more of this story at Slashdot.