The First Crew Launch of Boeing’s Starliner Capsule Is On Hold Indefinitely

Longtime Slashdot reader schwit1 shares a report from Ars Technica: The first crewed test flight of Boeing’s long-delayed Starliner spacecraft won’t take off as planned Saturday and could face a longer postponement as engineers evaluate a stubborn leak of helium from the capsule’s propulsion system. NASA announced the latest delay of the Starliner test flight late Tuesday. Officials will take more time to consider their options for how to proceed with the mission after discovering the small helium leak on the spacecraft’s service module.

The space agency did not describe what options are on the table, but sources said they range from flying the spacecraft “as is” with a thorough understanding of the leak and confidence it won’t become more significant in flight, to removing the capsule from its Atlas V rocket and taking it back to a hangar for repairs. Theoretically, the former option could permit a launch attempt as soon as next week. The latter alternative could delay the launch until at least late summer.

“The team has been in meetings for two consecutive days, assessing flight rationale, system performance, and redundancy,” NASA said in a statement Tuesday night. “There is still forward work in these areas, and the next possible launch opportunity is still being discussed. NASA will share more details once we have a clearer path forward.”

Read more of this story at Slashdot.

DOJ Makes Its First Known Arrest For AI-Generated CSAM

In what’s believed to be the first case of its kind, the U.S. Department of Justice arrested a Wisconsin man last week for generating and distributing AI-generated child sexual abuse material (CSAM). Even if no children were used to create the material, the DOJ “looks to establish a judicial precedent that exploitative materials are still illegal,” reports Engadget. From the report: The DOJ says 42-year-old software engineer Steven Anderegg of Holmen, WI, used a fork of the open-source AI image generator Stable Diffusion to make the images, which he then used to try to lure an underage boy into sexual situations. That alleged attempt to lure a minor will likely play a central role in the eventual trial for the four counts of “producing, distributing, and possessing obscene visual depictions of minors engaged in sexually explicit conduct and transferring obscene material to a minor under the age of 16.” The government says Anderegg’s images showed “nude or partially clothed minors lasciviously displaying or touching their genitals or engaging in sexual intercourse with men.” The DOJ claims he used specific prompts, including negative prompts (extra guidance for the AI model, telling it what not to produce), to spur the generator into making the CSAM.
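For readers unfamiliar with the jargon, a negative prompt is simply a second text input that steers a diffusion model away from certain content. Below is a minimal, entirely benign sketch of the concept using the Hugging Face diffusers library; the model ID, prompts, and output file name are illustrative choices, not details from the case.

```python
# Minimal illustration of a "negative prompt": a second text input telling the
# model what NOT to produce. Model ID, prompts, and output name are illustrative.
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")

image = pipe(
    prompt="a watercolor painting of a mountain lake at sunrise",
    negative_prompt="blurry, low quality, text, watermark",  # steer away from these
    num_inference_steps=30,
).images[0]
image.save("lake.png")
```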

Cloud-based image generators like Midjourney and DALL-E 3 have safeguards against this type of activity, but Ars Technica reports that Anderegg allegedly used Stable Diffusion 1.5, a variant with fewer boundaries. Stability AI told the publication that fork was produced by Runway ML. According to the DOJ, Anderegg communicated online with the 15-year-old boy, describing how he used the AI model to create the images. The agency says the accused sent the teen direct messages on Instagram, including several AI images of “minors lasciviously displaying their genitals.” To its credit, Instagram reported the images to the National Center for Missing and Exploited Children (NCMEC), which alerted law enforcement. Anderegg could face five to 70 years in prison if convicted on all four counts. He’s currently in federal custody before a hearing scheduled for May 22.

Read more of this story at Slashdot.

EU Sets Benchmark For Rest of the World With Landmark AI Laws

An anonymous reader quotes a report from Reuters: Europe’s landmark rules on artificial intelligence will enter into force next month after EU countries on Tuesday endorsed a political deal reached in December, setting a potential global benchmark for a technology used in business and everyday life. The European Union’s AI Act is more comprehensive than the United States’ light-touch, voluntary-compliance approach, while China’s approach aims to maintain social stability and state control. The vote by EU countries came two months after EU lawmakers backed the AI legislation, drafted by the European Commission in 2021, after making a number of key changes. […]

The AI Act imposes strict transparency obligations on high-risk AI systems while such requirements for general-purpose AI models will be lighter.
It restricts governments’ use of real-time biometric surveillance in public spaces to cases involving certain crimes, the prevention of terrorist attacks, and searches for people suspected of the most serious crimes. The new legislation will have an impact beyond the 27-country bloc, said Patrick van Eecke at law firm Cooley. “The Act will have global reach. Companies outside the EU who use EU customer data in their AI platforms will need to comply. Other countries and regions are likely to use the AI Act as a blueprint, just as they did with the GDPR,” he said, referring to the EU’s privacy rules.

While the new legislation will apply in 2026, bans on the use of artificial intelligence in social scoring, predictive policing, and untargeted scraping of facial images from the internet or CCTV footage will kick in six months after the regulation enters into force. Obligations for general-purpose AI models will apply after 12 months, and rules for AI systems embedded into regulated products after 36 months. Fines for violations range from 7.5 million euros ($8.2 million) or 1.5% of turnover to 35 million euros or 7% of global turnover, depending on the type of violation.
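To make the turnover-based caps concrete, here is a small arithmetic sketch. It assumes the commonly reported rule that, for companies, the higher of the fixed amount and the percentage of turnover applies; the 2 billion euro turnover figure is hypothetical.

```python
# Hedged sketch of a turnover-based fine ceiling. Assumes the higher of the
# fixed cap and the turnover percentage applies; the turnover is hypothetical.
def fine_ceiling(fixed_cap_eur: float, pct_of_turnover: float, turnover_eur: float) -> float:
    """Maximum possible fine for a company with the given global turnover."""
    return max(fixed_cap_eur, pct_of_turnover * turnover_eur)

turnover = 2_000_000_000  # hypothetical 2 billion euro global annual turnover

# Most serious violations: up to 35 million euros or 7% of turnover
print(fine_ceiling(35_000_000, 0.07, turnover))   # -> 140000000.0 (140 million euros)

# Lowest tier cited above: up to 7.5 million euros or 1.5% of turnover
print(fine_ceiling(7_500_000, 0.015, turnover))   # -> 30000000.0 (30 million euros)
```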

Read more of this story at Slashdot.

HP Resurrects ’90s OmniBook Branding, Kills Spectre and Dragonfly

HP announced today that it will resurrect the “Omni” branding it first coined for its business-oriented laptops introduced in 1993. The vintage branding will now be used for the company’s new consumer-facing laptops, with HP retiring the Spectre and Dragonfly brands in the process. Furthermore, computers under consumer PC series names like Pavilion will also no longer be released. “Instead, every consumer computer from HP will be called either an OmniBook for laptops, an OmniDesk for desktops, or an OmniStudio for AIOs,” reports Ars Technica. From the report: The computers will also carry a modifier, running from 3 through 5, 7, and X up to Ultra, to denote machines from entry-level all the way up to advanced. For instance, an HP OmniBook Ultra would represent HP’s highest-grade consumer laptop. “For example, an HP OmniBook 3 will appeal to customers who prioritize entertainment and personal use, while the OmniBook X will be designed for those with higher creative and technical demands,” Stacy Wolff, SVP of design and sustainability at HP, said via a press announcement today. […] So far, HP has announced one new Omni computer, the OmniBook X. It has a 12-core Snapdragon X Elite X1E-78-100, 16GB or 32GB of LPDDR5x-8448 memory, up to 2TB of storage, and a 14-inch, 2240×1400 IPS display. HP is pointing to the Latin meaning of omni, “all” (or everything), as the rationale behind the naming update. The new name should give shoppers confidence that the computers will provide all the things they need.

HP is also getting rid of some of its commercial series names, like Pro. From now on, new, lower-end commercial laptops will be ProBooks. There will also be ProDesktop desktops and ProStudio AIOs. These computers will have either a 2 modifier for entry-level designs or a 4 modifier for ones with a little more power. For example, an HP ProDesk 2 is less powerful than an HP ProDesk 4. Anything more powerful will be considered either an EliteBook (laptops), EliteDesk (desktops), or EliteStudio (AIOs). For the Elite computers, the modifiers go from 6 to 8, X, and then Ultra. A Dragonfly laptop today would fall into the Ultra category. HP did less overhauling of its commercial lineup because it “recognized a need to preserve the brand equity and familiarity with our current sub-brands,” Wolff said, adding that HP “acknowledged the creation of additional product names like Dragonfly made those products stand out, rather than be seen as part of a holistic portfolio.” […]

As you might expect of any tech rebranding, marketing push, or product release these days, HP is also announcing a new emblem that will appear on its computers, as well as other products or services, that substantially incorporate AI. The two laptops announced today carry the logo. According to Wolff, on computers, the logo means that the systems have an integrated NPU “at 40+ trillions of operations per second.” They also come with a chatbot based on GPT-4, an HP spokesperson told me.

Read more of this story at Slashdot.

Vitalik Buterin Addresses Threats To Ethereum’s Decentralization In New Blog Post

In a new blog post, Ethereum co-founder Vitalik Buterin has shared his thoughts on three issues core to Ethereum’s decentralization: MEV, liquid staking, and the hardware requirements of nodes. The Block reports: In his post, published on May 17, Buterin first addresses the issue of MEV, or the financial gain that sophisticated node operators can capture by reordering the transactions within a block. Buterin characterizes the two approaches to MEV as “minimization” (reducing MEV through smart protocol design, as projects such as CowSwap do) and “quarantining” (accepting that some MEV exists but containing it within a specialized part of the stack, such as block builders, through in-protocol techniques). While MEV quarantining seems like an alluring option, Buterin notes that it comes with some centralization risks. “If builders have the power to exclude transactions from a block entirely, there are attacks that can quite easily arise,” Buterin noted. However, Buterin championed work on MEV quarantining through concepts like transaction inclusion lists, which “take away the builder’s ability to push transactions out of the block entirely.” “I think ideas in this direction – really pushing the quarantine box to be as small as possible – are really interesting, and I’m in favor of going in that direction,” Buterin concluded.
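The inclusion-list idea can be pictured with a toy model: the proposer publishes a list of transactions the builder must include, the builder keeps control of ordering and of any additional transactions, and a block that omits a mandated transaction is simply invalid. The sketch below is a simplified illustration under those assumptions, not the actual Ethereum specification; the transactions and the builder strategy are invented.

```python
# Toy model of a transaction inclusion list. Simplified for illustration;
# not the real Ethereum protocol. Transactions and strategy are invented.
from dataclasses import dataclass

@dataclass(frozen=True)
class Tx:
    tx_hash: str
    fee: int

def build_block(mempool: list[Tx], inclusion_list: list[Tx], max_txs: int) -> list[Tx]:
    """Builder strategy: satisfy the inclusion list first, then fill by fee."""
    block = list(inclusion_list)
    rest = sorted((tx for tx in mempool if tx not in block), key=lambda t: -t.fee)
    return (block + rest)[: max(max_txs, len(block))]

def validate_block(block: list[Tx], inclusion_list: list[Tx]) -> bool:
    """A block that drops any mandated transaction is rejected."""
    return all(tx in block for tx in inclusion_list)

mempool = [Tx("0xaaa", 5), Tx("0xbbb", 50), Tx("0xccc", 1)]
must_include = [Tx("0xccc", 1)]             # low-fee tx a builder might censor
block = build_block(mempool, must_include, max_txs=2)
assert validate_block(block, must_include)  # censoring it would invalidate the block
```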

Buterin also addressed the relatively low number of solo Ethereum stakers: most stakers choose a staking provider, whether a centralized offering like Coinbase or a decentralized one like Lido or RocketPool, given the complexity, hardware requirements, and 32 ETH minimum involved in running a solo validator. While Buterin acknowledges the progress being made to reduce the cost and complexity of running a solo node, he also noted “once again there is more that we could do,” perhaps by reducing the time to withdraw staked ether or lowering the 32 ETH minimum required to become a solo staker. “Incorrect answers could lead Ethereum down a path of centralization and ‘re-creating the traditional financial system with extra steps’; correct answers could create a shining example of a successful ecosystem with a wide and diverse set of solo stakers and highly decentralized staking pools,” Buterin wrote. […]
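To put the 32 ETH barrier in concrete terms, here is a trivial back-of-the-envelope sketch; the 40 ETH balance and the 8 ETH alternative minimum are purely hypothetical numbers used for illustration.

```python
# Back-of-the-envelope: how many solo validators a given balance supports.
# 32 ETH is the current minimum; 8 ETH and the 40 ETH balance are hypothetical.
def max_validators(balance_eth: float, minimum_eth: float) -> int:
    return int(balance_eth // minimum_eth)

for minimum in (32, 8):
    print(f"minimum {minimum} ETH -> {max_validators(40, minimum)} validator(s) from 40 ETH")
# minimum 32 ETH -> 1 validator(s) from 40 ETH
# minimum 8 ETH -> 5 validator(s) from 40 ETH
```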

Buterin finished his post by imploring the Ethereum ecosystem to tackle the hard questions rather than shy away from them. “…We should have deep respect for the properties that make Ethereum unique, and continue to work to maintain and improve on those properties as Ethereum scales,” Buterin wrote. Buterin added today, in a post on X, that he was pleased to see civil debate among community members. “I’m really proud that ethereum does not have any culture of trying to prevent people from speaking their minds, even when they have very negative feelings toward major things in the protocol or ecosystem. Some wave the ideal of ‘open discourse’ as a flag, some take it seriously,” Buterin wrote.

Read more of this story at Slashdot.

FORTRAN and COBOL Re-enter TIOBE’s Ranking of Programming Language Popularity

“The TIOBE Index sets out to reflect the relative popularity of computer languages,” writes i-Programmer, “so it comes as something of a surprise to see two languages dating from the 1950s in this month’s Top 20.

Having broken into the Top 20 in April 2021, Fortran has continued to rise and has now reached its highest-ever position at #10… The headline for this month’s report by Paul Jansen on the TIOBE index is:
Fortran in the top 10, what is going on?
Jansen’s explanation points to the fact that there are more than 1,000 hits on Amazon for “Fortran Programming,” while languages such as Kotlin and Rust barely hit 300 books for the same search query. He also explains that Fortran is still evolving, with the new ISO Fortran 2023 definition published less than half a year ago….

The other legacy language that is on the rise in the TIOBE index is COBOL. We noticed it re-enter the Top 20 in January 2024 and, having dropped out in the interim, it is there again this month.”

More details from TechRepublic:

Along with Fortran holding on to its spot in the rankings, there were a few small changes in the top 10. Go gained 0.61 percentage points year over year, rising from tenth place in May 2023 to eighth this year. C++ rose slightly in popularity year over year, from fourth place to third, while Java (-3.53%) and Visual Basic (-1.8%) fell.

Here’s how TIOBE ranked the 10 most popular programming languages in May:

1. Python
2. C
3. C++
4. Java
5. C#
6. JavaScript
7. Visual Basic
8. Go
9. SQL
10. Fortran

On the rival PYPL ranking of programming language popularity, Fortran does not appear anywhere in the top 29.
A note on its page explains that “Worldwide, Python is the most popular language, Rust grew the most in the last 5 years (2.1%) and Java lost the most (-4.0%).” Here’s how it ranks the most popular programming languages for May:

Python (28.98% share)
Java (15.97% share)
JavaScript (8.79% share)
C# (6.78% share)
R (4.76% share)
PHP (4.55% share)
TypeScript (3.03% share)
Swift (2.76% share)
Rust (2.6% share)
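Putting the two rankings side by side makes the divergence easy to see. The sketch below uses only the lists quoted above (the PYPL entries as listed, which may not be its complete top tier):

```python
# Compare the two rankings quoted above: TIOBE's top 10 vs the PYPL entries listed.
tiobe_top10 = ["Python", "C", "C++", "Java", "C#", "JavaScript",
               "Visual Basic", "Go", "SQL", "Fortran"]
pypl_listed = ["Python", "Java", "JavaScript", "C#", "R",
               "PHP", "TypeScript", "Swift", "Rust"]

only_tiobe = [lang for lang in tiobe_top10 if lang not in pypl_listed]
only_pypl = [lang for lang in pypl_listed if lang not in tiobe_top10]

print("In TIOBE's top 10 but not in the PYPL list above:", only_tiobe)
# ['C', 'C++', 'Visual Basic', 'Go', 'SQL', 'Fortran']
print("In the PYPL list above but outside TIOBE's top 10:", only_pypl)
# ['R', 'PHP', 'TypeScript', 'Swift', 'Rust']
```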

Read more of this story at Slashdot.

Linux Foundation Announces Launch of ‘High Performance Software Foundation’

This week the nonprofit Linux Foundation announced the launch of the High Performance Software Foundation, which “aims to build, promote, and advance a portable core software stack for high performance computing” (or HPC) by “increasing adoption, lowering barriers to contribution, and supporting development efforts.”

It promises initiatives focused on “continuously built, turnkey software stacks,” as well as others covering architecture support and performance regression testing. Its first open source technical projects are:

– Spack: the HPC package manager (a minimal package-recipe sketch follows this list).
– Kokkos: a performance-portable programming model for writing modern C++ applications in a hardware-agnostic way.
– Viskores (formerly VTK-m): a toolkit of scientific visualization algorithms for accelerator architectures.
– HPCToolkit: performance measurement and analysis tools for computers ranging from desktop systems to GPU-accelerated supercomputers.
– Apptainer: formerly known as Singularity, a Linux Foundation project providing a high-performance, full-featured container subsystem optimized for HPC and general computing.
– E4S: a curated, hardened distribution of scientific software packages.
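To give a flavor of the first project on that list, here is a minimal sketch of a Spack package recipe (a package.py file). The package name, homepage, URL, checksum, and variant are hypothetical placeholders for illustration, not a real recipe from the Spack repository.

```python
# Minimal sketch of a Spack package recipe. All names, URLs, and the checksum
# are hypothetical placeholders, not a real package in the Spack repository.
from spack.package import *


class Examplelib(CMakePackage):
    """A hypothetical CMake-based scientific library packaged for Spack."""

    homepage = "https://example.org/examplelib"
    url = "https://example.org/examplelib-1.0.0.tar.gz"

    version("1.0.0", sha256="0" * 64)  # placeholder checksum

    variant("cuda", default=False, description="Build with GPU support")

    depends_on("mpi")
    depends_on("cuda", when="+cuda")

    def cmake_args(self):
        # Translate the Spack variant into a CMake option.
        return [self.define_from_variant("ENABLE_CUDA", "cuda")]
```

Dropped into a Spack package repository, a recipe like this would hypothetically be built and installed with a command such as “spack install examplelib +cuda”.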

As use of HPC becomes ubiquitous in scientific computing and digital engineering, and AI use cases multiply, more and more data centers deploy GPUs and other compute accelerators. The High Performance Software Foundation will provide a neutral space for pivotal projects in the high performance computing ecosystem, enabling industry, academia, and government entities to collaborate on the scientific software.

The High Performance Software Foundation benefits from strong support across the HPC landscape, including Premier Members Amazon Web Services (AWS), Hewlett Packard Enterprise, Lawrence Livermore National Laboratory, and Sandia National Laboratories; General Members AMD, Argonne National Laboratory, Intel, Kitware, Los Alamos National Laboratory, NVIDIA, and Oak Ridge National Laboratory; and Associate Members University of Maryland, University of Oregon, and Centre for Development of Advanced Computing.
In a statement, an AMD vice president said that by joining “we are using our collective hardware and software expertise to help develop a portable, open-source software stack for high-performance computing across industry, academia, and government.” And an AWS executive said the high-performance computing community “has a long history of innovation being driven by open source projects. AWS is thrilled to join the High Performance Software Foundation to build on this work. In particular, AWS has been deeply involved in contributing upstream to Spack, and we’re looking forward to working with the HPSF to sustain and accelerate the growth of key HPC projects so everyone can benefit.”

The new foundation will “set up a technical advisory committee to manage working groups tackling a variety of HPC topics,” according to the announcement, following a governance model based on the Cloud Native Computing Foundation.

Read more of this story at Slashdot.