Artists Claim ‘Big’ Win In Copyright Suit Fighting AI Image Generators

Ars Technica’s Ashley Belanger reports: Artists pursuing a class-action lawsuit are claiming a major win this week in their fight to stop the most sophisticated AI image generators from copying billions of artworks to train AI models and replicate their styles without compensating artists. In an order on Monday, US district judge William Orrick denied key parts of motions to dismiss from Stability AI, Midjourney, Runway AI, and DeviantArt. The court will now allow the artists to proceed with discovery on claims that AI image generators relying on Stable Diffusion violate both the Copyright Act and the Lanham Act, which protects artists from commercial misuse of their names and unique styles.

“We won BIG,” artist and plaintiff Karla Ortiz wrote on X (formerly Twitter), celebrating the order. “Not only do we proceed on our copyright claims,” but “this order also means companies who utilize” Stable Diffusion models and LAION-like datasets that scrape artists’ works for AI training without permission “could now be liable for copyright infringement violations, amongst other violations.” Lawyers for the artists, Joseph Saveri and Matthew Butterick, told Ars that the artists suing “consider the Court’s order a significant step forward for the case,” as “the Court allowed Plaintiffs’ core copyright-infringement claims against all four defendants to proceed.”

Read more of this story at Slashdot.

NIST Finalizes Trio of Post-Quantum Encryption Standards

“NIST has formally accepted three algorithms for post-quantum cryptography,” writes ancient Slashdot reader jd. “Two more backup algorithms are being worked on. The idea is to have backup algorithms using very different maths, just in case a flaw in the original approach is discovered later.” The Register reports: The National Institute of Standards and Technology (NIST) today released the long-awaited post-quantum encryption standards, designed to protect electronic information long into the future — when quantum computers are expected to break existing cryptographic algorithms. One — ML-KEM (based on CRYSTALS-Kyber) — is intended for general encryption, which protects data as it moves across public networks. The other two — ML-DSA (originally known as CRYSTALS-Dilithium) and SLH-DSA (initially submitted as Sphincs+) — secure digital signatures, which are used to authenticate online identity. A fourth algorithm — FN-DSA (originally called FALCON) — is slated for finalization later this year and is also designed for digital signatures.

NIST continued to evaluate two other sets of algorithms that could potentially serve as backup standards in the future. One of the sets includes three algorithms designed for general encryption — but the technology is based on a different type of math problem than the ML-KEM general-purpose algorithm in today’s finalized standards. NIST plans to select one or two of these algorithms by the end of 2024. Even with those backups on the horizon, NIST mathematician Dustin Moody encouraged system administrators to start transitioning to the new standards as soon as possible, because full integration takes time. “There is no need to wait for future standards,” Moody advised in a statement. “Go ahead and start using these three. We need to be prepared in case of an attack that defeats the algorithms in these three standards, and we will continue working on backup plans to keep our data safe. But for most applications, these new standards are the main event.”
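For a concrete sense of what Moody’s “start using these three” advice looks like in code, here is a minimal key-encapsulation sketch for ML-KEM. It assumes the open-source liboqs-python bindings (the oqs module), which are not part of the NIST announcement itself, and it assumes the installed liboqs build exposes the finalized “ML-KEM-768” mechanism name; older builds list only the pre-standard “Kyber768” name.

# A minimal ML-KEM (FIPS 203) key-encapsulation sketch, assuming the
# liboqs-python bindings; "ML-KEM-768" is the finalized mechanism name,
# which older liboqs builds expose as the pre-standard "Kyber768".
import oqs

ALG = "ML-KEM-768"

# The receiver generates a keypair and publishes the public key.
with oqs.KeyEncapsulation(ALG) as receiver:
    public_key = receiver.generate_keypair()

    # The sender derives a shared secret plus a ciphertext that only
    # the holder of the matching secret key can decapsulate.
    with oqs.KeyEncapsulation(ALG) as sender:
        ciphertext, secret_at_sender = sender.encap_secret(public_key)

    # The receiver recovers the same shared secret from the ciphertext.
    secret_at_receiver = receiver.decap_secret(ciphertext)

assert secret_at_sender == secret_at_receiver  # both sides now share a key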
From NIST: This notice announces the Secretary of Commerce’s approval of three Federal Information Processing Standards (FIPS):

– FIPS 203, Module-Lattice-Based Key-Encapsulation Mechanism Standard
– FIPS 204, Module-Lattice-Based Digital Signature Standard
– FIPS 205, Stateless Hash-Based Digital Signature Standard

These standards specify key establishment and digital signature schemes that are designed to resist future attacks by quantum computers, which threaten the security of current standards. The three algorithms specified in these standards are each derived from different submissions in the NIST Post-Quantum Cryptography Standardization Project.
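A matching sketch for the signature side (FIPS 204, ML-DSA), under the same liboqs-python assumption; the “ML-DSA-65” mechanism name is likewise an assumption about the installed build, which may expose it as the pre-standard “Dilithium3”.

# A minimal ML-DSA (FIPS 204) sign/verify sketch, assuming liboqs-python;
# "ML-DSA-65" may appear as "Dilithium3" in older liboqs builds.
import oqs

ALG = "ML-DSA-65"
message = b"authenticate this online identity"

# The signer generates a keypair and signs with the secret key.
with oqs.Signature(ALG) as signer:
    public_key = signer.generate_keypair()
    signature = signer.sign(message)

# Verification needs only the message, signature, and public key.
with oqs.Signature(ALG) as verifier:
    assert verifier.verify(message, signature, public_key)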

Read more of this story at Slashdot.

Texas Sues General Motors, Alleging Illegal Selling of Driver Data

In a press release today, Texas Attorney General Ken Paxton said he has filed a lawsuit against General Motors, alleging the carmaker illegally collected and sold drivers’ data to insurance companies without their consent or knowledge. CNN reports: In car models from 2015 and later, the Detroit-based manufacturer allegedly used technology to “collect, record, analyze, and transmit highly detailed driving data about each time a driver used their vehicle,” according to the AG’s statement. General Motors sold this information to several other companies, including at least two that used it to generate “Driving Scores” about GM’s customers, the AG alleged. The suit said those two companies then sold these scores to insurance companies.

Insurance companies can use such data to see how often drivers exceeded the speed limit or violated other traffic laws. Some insurance firms ask customers whether they want to opt in to such programs voluntarily, promising lower rates for safer drivers. The attorney general’s office, however, claimed GM “deceived” its Texan customers by encouraging them to enroll in programs such as OnStar Smart Driver, because by agreeing to join these programs, customers also unknowingly agreed to the collection and sale of their data. “Despite lengthy and convoluted disclosures, General Motors never informed its customers of its actual conduct — the systematic collection and sale of their highly detailed driving data,” the AG’s office said in a statement. The filing can be read here.

Read more of this story at Slashdot.

Study Finds 94% of Business Spreadsheets Have Critical Errors

A recent study reveals that 94% of spreadsheets used in business decision-making contain errors, highlighting significant risks of financial and operational mistakes. Phys.org reports: Errors in spreadsheets can lead to poor decisions, resulting in financial losses, pricing mistakes, and operational problems in fields like health care and nuclear operations. “These mistakes can cause major issues in various sectors,” says Prof. Pak-Lok Poon, the lead author of the study. Spreadsheets are crucial tools in many fields, such as linear programming and neuroscience. However, with more people creating their own spreadsheets without formal training, the number of faulty spreadsheets has increased. “Many end-users lack proper software development training, leading to more errors,” explains Prof. Poon.

The research team reviewed journal articles from the past 35.5 years and conference papers from the past 10.5 years, focusing on spreadsheet quality and related techniques across different fields. The study found that most research focuses on testing and fixing spreadsheets after they are created, rather than on early development stages like planning and design; catching errors only at that late stage is costlier and riskier than preventing them. Prof. Poon emphasizes the need for more focus on the early stages of spreadsheet development to prevent errors. The study suggests that adopting a life cycle approach to spreadsheet quality, addressing it from the beginning rather than only auditing at the end, can help businesses lower risks and improve the reliability of their decision-making tools. The study has been published in the journal Frontiers of Computer Science.
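As one illustration of the after-the-fact checking that dominates the surveyed research, here is a minimal spreadsheet-audit sketch. It assumes the openpyxl library and a hypothetical workbook path, and the heuristic it applies (a column that mixes formulas with hard-coded numbers is suspect) is a common audit rule of thumb, not one taken from the paper.

# A minimal spreadsheet-audit sketch, assuming openpyxl is installed;
# "budget.xlsx" is a hypothetical workbook path used for illustration.
from openpyxl import load_workbook

def find_mixed_columns(path):
    """Flag columns that mix formulas with hard-coded numbers, a common
    source of the silent errors the study describes."""
    wb = load_workbook(path)  # default load keeps formulas intact
    findings = []
    for ws in wb.worksheets:
        for col in ws.iter_cols():
            # data_type is "f" for formula cells, "n" for numeric cells
            types = {c.data_type for c in col if c.value is not None}
            if "f" in types and "n" in types:
                findings.append((ws.title, col[0].column_letter))
    return findings

if __name__ == "__main__":
    for sheet, column in find_mixed_columns("budget.xlsx"):
        print(f"Check {sheet}!{column}: formulas mixed with constants")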

Read more of this story at Slashdot.