Microsoft Blamed For Million-Plus Patient Record Theft At US Hospital Giant

Brandon Vigliarolo reports via The Register: American healthcare provider Geisinger fears highly personal data on more than a million of its patients has been stolen — and claims a former employee at a Microsoft subsidiary is the likely culprit. Geisinger on Monday announced the results of a probe into a November computer security breach, placing the blame on Microsoft-owned Nuance Communications for not cutting off one of its employees’ access to corporate files after that person was fired. The Pennsylvania-based healthcare giant uses Nuance as an IT provider. We’re told that two days after the Microsoft-owned entity terminated one of its workers, that staffer may have accessed and taken copies of sensitive records on a huge number of Geisinger patients — for reasons as yet unknown.

Geisinger — which says it operates 13 hospitals and has more than 600,000 members — said it discovered the improper access on November 29 and informed Nuance, which immediately cut off the former employee from the healthcare group’s data before involving police. “Because it could have impeded their investigation, law enforcement investigators asked Nuance to delay notifying patients of this incident until now,” Geisinger claimed, explaining why the breach is only now coming to light. “The former Nuance employee has been arrested and is facing federal charges.” It’s not immediately clear what charges, if any, have been laid — we’ve asked Geisinger for details.

Speech recognition firm Nuance performed its own probe, according to Geisinger, and determined that the former employee may have stolen information on a million-plus people. That info would include birth dates, addresses, hospital admission and discharge records, demographic information, and other medical data. The ex-employee didn’t swipe insurance or other financial information, the multi-billion-dollar healthcare group stated. “We continue to work closely with the authorities on this investigation, and while I am grateful that the perpetrator was caught and is now facing federal charges, I am sorry that this happened,” said Geisinger chief privacy officer Jonathan Friesen.

Apple Pauses Work On Planned North Carolina Campus

In 2021, Apple announced plans for a new $1 billion campus in North Carolina, set to include a new engineering and research center and support up to 3,000 employees. According to Lauren Ohnesorge of Triangle Business Journal (paywalled), Apple remains committed to the project, but the timeline has been delayed by four years. MacRumors reports: A limited amount of progress on the campus has been made since the announcement, and Apple has not provided updates on construction until now. Apple told Triangle Business Journal that it has paused work on the campus, and it is working with North Carolina Governor Roy Cooper and the North Carolina Department of Commerce to extend the project’s timeline by four years.

Apple last year filed development plans for the first phase of construction, but the specific timeline for the project has never been clear. Apple’s plans for Research Triangle Park include six buildings and a parking garage totaling 700,000 square feet of office space, 190,000 square feet of accessory space, and close to 3,000 parking spaces spanning 41 acres. Apple owns 281 acres of land in the area where it plans to build its campus, so there could ultimately be several phases of construction. As it prepares to build the NC research center, Apple is leasing more than 200,000 square feet of office space in Cary, North Carolina. In a statement, Apple said it is still committed to the project: “Apple has been operating in North Carolina for over two decades. And we’re deeply committed to growing our teams here. In the last three years, we’ve added more than 600 people to our team in Raleigh, and we’re looking forward to developing our new campus in the coming years.”

Crypto Industry Super PAC Is 33-2 In Primaries, With $100 Million For House and Senate Races

A super PAC called Fairshake, funded primarily by top cryptocurrency companies, achieved several wins in congressional primaries and plans to spend over $100 million to support pro-crypto candidates in the general elections. CNBC reports: Fairshake and its two affiliated political action committees, one for Republicans, one for Democrats, quietly racked up half a dozen other wins Tuesday as the candidates they backed glided to victory, although none of the races were competitive. They included Rep. John Curtis, who won the Republican nomination for Utah’s open Senate seat. Created last year as part of a joint effort between more than a dozen crypto firms, Fairshake PAC has emerged as one of the top-spending PACs in the 2024 election cycle. Fairshake and its two affiliated PACs have put more than $37 million so far into advertisements in primary races, according to AdImpact. Despite a broad mission to defend the entire $2.2 trillion crypto market, Fairshake is funded by a very small set of donors.

Of the $160 million in total contributions Fairshake has raised since it was founded, around $155 million — or 94% — can be traced back to just four companies: Ripple, Andreessen Horowitz, Coinbase and Jump Crypto. But it’s not just money that the crypto industry plans to deploy this fall. The nonprofit Stand With Crypto says it has collected more than 1.1 million email addresses of crypto “advocates” it hopes to engage all the way to the ballot box. The strength of the crypto groups is getting noticed on Capitol Hill, especially among lawmakers who are facing tough elections in 2024, where a few thousand voters, or a hefty donation, could make a difference not only in a race but in which party controls each chamber. […]

In the coming months, the group doesn’t plan to spend on the presidential race, but rather on House and Senate races, according to a Fairshake spokesperson. Both of those chambers are in play in 2024. Fairshake has yet to start spending in the general election cycle, but several officials in the industry said they are keeping an eye on states such as Ohio and Montana, where Democratic incumbents who are bearish on crypto face challengers who have embraced the technology. […] Ads funded by Fairshake deliver messages that are typically less about a candidate’s support for or opposition to crypto, and more about broader issues that resonate with voters, such as fairness and integrity.

Researchers Upend AI Status Quo By Eliminating Matrix Multiplication In LLMs

Researchers from UC Santa Cruz, UC Davis, LuxiTech, and Soochow University have developed a new method to run AI language models more efficiently by eliminating matrix multiplication, potentially reducing the environmental impact and operational costs of AI systems. Ars Technica’s Benj Edwards reports: Matrix multiplication (often abbreviated to “MatMul”) is at the center of most neural network computational tasks today, and GPUs are particularly good at executing the math quickly because they can perform large numbers of multiplication operations in parallel. […] In the new paper, titled “Scalable MatMul-free Language Modeling,” the researchers describe creating a custom 2.7 billion parameter model without using MatMul that features similar performance to conventional large language models (LLMs). They also demonstrate running a 1.3 billion parameter model at 23.8 tokens per second on a GPU that was accelerated by a custom-programmed FPGA chip that uses about 13 watts of power (not counting the GPU’s power draw). The implication is that a more efficient FPGA “paves the way for the development of more efficient and hardware-friendly architectures,” they write.
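
As background on how a model can avoid matrix multiplication entirely: the researchers' approach reportedly relies on ternary-weight layers in the spirit of BitNet, where every weight is -1, 0, or +1, so a matrix-vector product collapses into additions and subtractions. The snippet below is a minimal, hypothetical NumPy sketch of that idea only; it is not the paper's implementation (which also replaces self-attention with a MatMul-free recurrent unit), and the names are invented for illustration.

```python
import numpy as np

def ternary_matvec(W_ternary: np.ndarray, x: np.ndarray) -> np.ndarray:
    """Compute the equivalent of W @ x when every entry of W_ternary is
    -1, 0, or +1, using only additions and subtractions of elements of x."""
    out = np.zeros(W_ternary.shape[0], dtype=x.dtype)
    for i, row in enumerate(W_ternary):
        # Add the inputs where the weight is +1, subtract where it is -1,
        # and skip zeros entirely; no multiplications needed.
        out[i] = x[row == 1].sum() - x[row == -1].sum()
    return out

rng = np.random.default_rng(0)
W = rng.integers(-1, 2, size=(4, 8)).astype(np.float32)  # ternary weights
x = rng.standard_normal(8).astype(np.float32)

# Matches an ordinary matrix multiply, up to floating-point rounding.
assert np.allclose(ternary_matvec(W, x), W @ x)
```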

The paper doesn’t provide power estimates for conventional LLMs, but this post from UC Santa Cruz estimates about 700 watts for a conventional model. However, in our experience, you can run a 2.7B parameter version of Llama 2 competently on a home PC with an RTX 3060 (that uses about 200 watts peak) powered by a 500-watt power supply. So, if you could theoretically run an LLM entirely in only 13 watts on an FPGA (with no GPU at all), that would be a 38-fold decrease in power usage. The technique has not yet been peer-reviewed, but the researchers — Rui-Jie Zhu, Yu Zhang, Ethan Sifferman, Tyler Sheaves, Yiqiao Wang, Dustin Richmond, Peng Zhou, and Jason Eshraghian — claim that their work challenges the prevailing paradigm that matrix multiplication operations are indispensable for building high-performing language models. They argue that their approach could make large language models more accessible, efficient, and sustainable, particularly for deployment on resource-constrained hardware like smartphones. […]
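
For the arithmetic behind that comparison, using only the wattages quoted above: the 38-fold figure comes from dividing the 500-watt power supply in the home-PC example by the FPGA's 13 watts; measured against the 700-watt estimate for a conventional model, the ratio would be larger still.

```python
# Back-of-the-envelope check of the power figures quoted above.
conventional_estimate_w = 700  # UC Santa Cruz post's estimate for a conventional model
home_pc_psu_w = 500            # power supply in the RTX 3060 home-PC example
fpga_w = 13                    # reported FPGA power draw, excluding the GPU

print(home_pc_psu_w / fpga_w)            # ~38.5, the "38-fold" figure
print(conventional_estimate_w / fpga_w)  # ~53.8 against the 700-watt estimate
```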

The researchers say that scaling laws observed in their experiments suggest that the MatMul-free LM may also outperform traditional LLMs at very large scales. The researchers project that their approach could theoretically intersect with and surpass the performance of standard LLMs at scales around 10^23 FLOPS, which is roughly equivalent to the training compute required for models like Meta’s Llama-3 8B or Llama-2 70B. However, the authors note that their work has limitations. The MatMul-free LM has not been tested on extremely large-scale models (e.g., 100 billion-plus parameters) due to computational constraints. They call for institutions with larger resources to invest in scaling up and further developing this lightweight approach to language modeling.
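
To put the 10^23 figure in perspective, the widely used rule of thumb that transformer training compute is roughly 6 x parameters x training tokens (an approximation, not a figure from the paper) lands in that range for the models mentioned, assuming Meta's reported token counts of about 2 trillion for Llama-2 70B and over 15 trillion for Llama-3 8B:

```python
# Rough sanity check using the common "C ~ 6 * N * D" approximation for
# transformer training compute; not a figure taken from the paper itself.
def training_flops(params: float, tokens: float) -> float:
    return 6 * params * tokens

print(f"Llama-2 70B: {training_flops(70e9, 2e12):.1e} FLOPs")  # ~8.4e+23
print(f"Llama-3 8B:  {training_flops(8e9, 15e12):.1e} FLOPs")  # ~7.2e+23
```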

MTV News Website Goes Dark, Archives Pulled Offline

MTVNews.com has been shut down, with more than two decades’ worth of content no longer available. “Content on its sister site, CMT.com, seems to have met a similar fate,” adds Variety. From the report: In 2023, MTV News was shuttered amid the financial woes of parent company Paramount Global. As of Monday, trying to access MTV News articles on mtvnews.com or mtv.com/news resulted in visitors being redirected to the main MTV website.

The now-unavailable content includes decades of music journalism comprising thousands of articles and interviews with countless major artists, dating back to the site’s launch in 1996. Perhaps the most significant loss is MTV News’ vast hip-hop-related archives, particularly its weekly “Mixtape Monday” column, which ran for nearly a decade in the 2000s and 2010s and featured interviews, reviews and more with many artists, producers and others early in their careers. “So, mtvnews.com no longer exists. Eight years of my life are gone without a trace,” Patrick Hosken, former music editor for MTV News, wrote on X. “All because it didn’t fit some executives’ bottom lines. Infuriating is too small a word.”

“sickening (derogatory) to see the entire @mtvnews archive wiped from the internet,” Crystal Bell, culture editor at Mashable and one-time entertainment director of MTV News, posted on X. “decades of music history gone… including some very early k-pop stories.”

“This is disgraceful. They’ve completely wiped the MTV News archive,” longtime Rolling Stone senior writer Brian Hiatt commented. “Decades of pop culture history research material gone, and why?”

The report notes that some MTV News articles may be available via internet archiving services like the Wayback Machine, though coverage there is incomplete and many older articles aren't available.
