Boston Dynamics’ Latest Atlas Video Demos a Robot That Can Run, Jump and Now Grab and Throw

Boston Dynamics released a demo of its humanoid robot Atlas, showing it picking up and delivering a bag of tools to a construction worker. While Atlas could already run and jump over complex terrain, the new hands, or rudimentary grippers, “give the robot new life,” reports TechCrunch. From the report: The claw-like gripper consists of one fixed finger and one moving finger. Boston Dynamics says the grippers were designed for heavy lifting tasks and were first demonstrated in a Super Bowl commercial where Atlas held a keg over its head. The videos released today show the grippers picking up construction lumber and a nylon tool bag. Next, Atlas picks up a 2×8 and places it between two boxes to form a bridge. Atlas then picks up the bag of tools and dashes over the bridge and through construction scaffolding. But the tool bag needs to go to the second level of the structure — something Atlas apparently realizes, so it throws the bag a considerable distance up to the second level. Boston Dynamics describes the final maneuver: “Atlas’ concluding move, an inverted 540-degree, multi-axis flip, adds asymmetry to the robot’s movement, making it a much more difficult skill than previously performed parkour.” A behind-the-scenes video describing how Atlas is able to recognize and interact with objects is also available on YouTube.

Read more of this story at Slashdot.

Intel, AMD Just Created a Headache for Datacenters

An anonymous reader shares a report: In pursuit of ever-higher compute density, chipmakers are juicing their chips with more and more power, and according to the Uptime Institute, this could spell trouble for many legacy datacenters ill equipped to handle new, higher-wattage systems. AMD’s Epyc 4 Genoa server processors, announced late last year, and Intel’s long-awaited fourth-gen Xeon Scalable silicon, released earlier this month, are the duo’s most powerful and power-hungry chips to date, sucking down 400W and 350W respectively, at least at the upper end of the product stack. The higher TDPs arrive in lockstep with higher core counts and clock speeds than either vendor’s previous generations.

It’s now possible to cram more than 192 x64 cores into your typical 2U dual socket system, something that just five years ago would have required at least three nodes. However, as Uptime noted, many legacy datacenters were not designed to accommodate systems this power dense. A single dual-socket system from either vendor can easily exceed a kilowatt, and depending on the kinds of accelerators being deployed in these systems, boxen can consume well in excess of that figure. The rapid trend towards hotter, more power dense systems upends decades-old assumptions about datacenter capacity planning, according to Uptime, which added: “This trend will soon reach a point when it starts to destabilize existing facility design assumptions.”
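
To make the “easily exceed a kilowatt” claim concrete, here is a rough back-of-envelope sketch in Python. The CPU TDPs are the 400W and 350W figures quoted above; the memory, NIC, drive, and overhead numbers are illustrative assumptions rather than vendor specifications.

```python
# Back-of-envelope power estimate for one 2U dual-socket server.
# The CPU TDPs (400W Genoa, 350W Sapphire Rapids) come from the article above;
# every other figure is an illustrative assumption, not vendor data.

def node_power_watts(cpu_tdp_w, sockets=2, dimms=24, w_per_dimm=10,
                     nics_w=25, drives_w=40, overhead=0.15):
    """Rough worst-case draw for a dual-socket node (fans/VRM losses as overhead)."""
    base = cpu_tdp_w * sockets + dimms * w_per_dimm + nics_w + drives_w
    return base * (1 + overhead)

for name, tdp in [("AMD Epyc 'Genoa'", 400), ("Intel 4th-gen Xeon", 350)]:
    print(f"{name}: ~{node_power_watts(tdp):.0f} W")
# Both configurations land above 1 kW before adding any accelerators,
# which is the point Uptime is making.
```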

A typical rack remains under 10kW of design capacity, the analysts note. But with modern systems trending toward higher compute density and, by extension, power density, that’s no longer adequate. While Uptime notes that datacenter operators can optimize new builds for higher rack power densities, they still need to account for 10 to 15 years of headroom. As a result, operators must speculate as to long-term power and cooling demands, which invites the risk of under- or over-building. With that said, Uptime estimates that within a few years a quarter of a rack will reach 10kW of consumption. That works out to approximately 1kW per rack unit for a standard 42U rack.
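
Uptime’s per-rack-unit figure is simple division; the short sketch below just reruns that arithmetic with the 10kW and 42U numbers from the paragraph above.

```python
# Re-running Uptime's rack-density arithmetic with the figures quoted above.
RACK_UNITS = 42                      # standard full-height rack
QUARTER_RACK_U = RACK_UNITS / 4      # 10.5U
QUARTER_RACK_KW = 10                 # projected draw for a quarter of a rack

kw_per_u = QUARTER_RACK_KW / QUARTER_RACK_U
print(f"~{kw_per_u:.2f} kW per rack unit")                 # ~0.95, i.e. roughly 1 kW/U
print(f"~{kw_per_u * RACK_UNITS:.0f} kW for a full rack")  # ~40 kW at that density,
# versus the sub-10 kW design capacity of a typical legacy rack.
```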

Read more of this story at Slashdot.

First Small Modular Nuclear Reactor Certified For Use In US

The U.S. Nuclear Regulatory Commission has certified the design for what will be the United States’ first small modular nuclear reactor. The Associated Press reports: The rule that certifies the design was published Thursday in the Federal Register. It means that companies seeking to build and operate a nuclear power plant can pick the design for a 50-megawatt, advanced light-water small modular nuclear reactor by Oregon-based NuScale Power and apply to the NRC for a license. It’s the final determination that the design is acceptable for use, so it can’t be legally challenged during the licensing process when someone applies to build and operate a nuclear power plant, NRC spokesperson Scott Burnell said Friday. The rule becomes effective in late February.

The U.S. Energy Department said the newly approved design “equips the nation with a new clean power source to help drive down” planet-warming greenhouse gas emissions. It’s the seventh nuclear reactor design cleared for use in the United States. The rest are for traditional, large, light-water reactors. Diane Hughes, NuScale’s vice president of marketing and communications, said the design certification is a historic step toward a clean energy future and makes the company’s VOYGR power plant a near-term deployable solution for customers. The first small modular reactor design application package included over 2 million pages of supporting materials, Hughes added. “NuScale has also applied to the NRC for approval of a larger design, at 77 megawatts per module, and the agency is checking the application for completeness before starting a full review,” adds the report.

Read more of this story at Slashdot.

IBM Top Brass Accused Again of Using Mainframes To Prop Up Watson, Cloud Sales

IBM, along with 13 of its current and former executives, has been sued by investors who claim the IT giant used mainframe sales to fraudulently prop up newer, more trendy parts of its business. The Register reports: In effect, IBM deceived the market about its progress in developing Watson, cloud technologies, and other new sources of revenue, by deliberately misclassifying the money it was making from mainframe deals, assigning that money instead to other products, it is alleged. The accusations emerged in a lawsuit [PDF] filed late last week against IBM in New York on behalf of the June E Adams Irrevocable Trust. It alleged Big Blue shifted sales from its “near-monopoly” mainframe business to its newer and less popular cloud, analytics, mobile, social, and security products (CAMSS), which bosses promoted as growth opportunities and designated “Strategic Imperatives.”

IBM is said to have created the appearance of demand for these Strategic Imperative products by bundling them into three- to five-year mainframe Enterprise License Agreements (ELA) with large banking, healthcare, and insurance company customers. In other words, it is claimed, mainframe sales agreements had Strategic Imperative products tacked on to help boost the sales performance of those newer offerings and give investors the impression customers were clamoring for those technologies from IBM. “Defendants used steep discounting on the mainframe part of the ELA in return for the customer purchasing catalog software (i.e. Strategic Imperative Revenue), unneeded and unused by the customer,” the lawsuit stated.

IBM is also alleged to have shifted revenue from its non-strategic Global Business Services (GBS) segment to Watson, a Strategic Imperative in the CAMSS product set, to convince investors that the company was successfully expanding beyond its legacy business. Last April the plaintiff Trust filed a similar case, which was joined by at least five other law firms representing other IBM shareholders. A month prior, the IBM board had been presented with a demand letter from shareholders to investigate the above allegations. Asked whether any action has been taken as a result of that letter, IBM has yet to respond.

Read more of this story at Slashdot.

CNET Pauses Publishing AI-Written Stories After Disclosure Controversy

CNET will pause publication of stories generated using artificial intelligence “for now,” the site’s leadership told employees on a staff call Friday. The Verge reports: The call, which lasted under an hour, was held a week after CNET came under fire for its use of AI tools on stories and one day after The Verge reported that AI tools had been in use for months, with little transparency to readers or staff. CNET hadn’t formally announced the use of AI until readers noticed a small disclosure. “We didn’t do it in secret,” CNET editor-in-chief Connie Guglielmo told the group. “We did it quietly.” CNET, owned by private equity firm Red Ventures, is among several websites that have been publishing articles written using AI. Other sites like Bankrate and CreditCards.com would also pause AI stories, executives on the call said.

The call was hosted by Guglielmo, Lindsey Turrentine, CNET’s EVP of content and audience, and Lance Davis, Red Ventures’ vice president of content. They answered a handful of questions submitted by staff ahead of time in the AMA-style call. Davis, who was listed as the point of contact for CNET’s AI stories until recently, also gave staff a more detailed rundown of the tool that has been utilized for the robot-written articles. Until now, most staff had very little insight into the machine that was generating dozens of stories appearing on CNET.

The AI, which is as yet unnamed, is a proprietary tool built by Red Ventures, according to Davis. AI editors are able to choose domains and domain-level sections from which to pull data and generate stories; editors can also use a combination of AI-generated text and their own writing or reporting. In the meeting, Turrentine declined to answer staff questions about the dataset used to train the AI, as well as questions about plagiarism concerns, but said more information would be available next week and that some staff would get a preview of the tool.

Read more of this story at Slashdot.

D&D Will Move To a Creative Commons License, Requests Feedback On a New OGL

A new draft of the Dungeons & Dragons Open Gaming License, dubbed OGL 1.2 by publisher Wizards of the Coast, is now available for download. Polygon reports: The announcement was made Thursday by Kyle Brink, executive producer of D&D, on the D&D Beyond website. According to Wizards, this draft could place the OGL outside of the publisher’s control — which should sound good to fans enraged by recent events. Time will tell, but public comment will be accepted beginning Jan. 20 and will continue through Feb. 3. […] Creative Commons is a nonprofit organization that, by its own description, “helps overcome legal obstacles to the sharing of knowledge and creativity to address the world’s most pressing challenges.” As such, a Creative Commons license once enacted could ultimately put the OGL 1.2 outside of Wizards’ control in perpetuity.

“We’re giving the core D&D mechanics to the community through a Creative Commons license, which means that they are fully in your hands,” Brink said in the blog post. “If you want to use quintessentially D&D content from the SRD such as owlbears and magic missile, OGL 1.2 will provide you a perpetual, irrevocable license to do so.” So much trust has been lost over the last several weeks that it will no doubt take a while for legal experts — armchair and otherwise — to pore over the details of the new OGL. These are the bullet points that Wizards is promoting in this official statement:

– Protecting D&D’s inclusive play experience. As I said above, content more clearly associated with D&D (like the classes, spells, and monsters) is what falls under the OGL. You’ll see that OGL 1.2 lets us act when offensive or hurtful content is published using the covered D&D stuff. We want an inclusive, safe play experience for everyone. This is deeply important to us, and OGL 1.0a didn’t give us any ability to ensure it.

– TTRPGs and VTTs. OGL 1.2 will only apply to TTRPG content, whether published as books, as electronic publications, or on virtual tabletops (VTTs). Nobody needs to wonder or worry if it applies to anything else. It doesn’t.

– Deauthorizing OGL 1.0a. We know this is a big concern. The Creative Commons license and the open terms of 1.2 are intended to help with that. One key reason why we have to deauthorize: We can’t use the protective options in 1.2 if someone can just choose to publish harmful, discriminatory, or illegal content under 1.0a. And again, any content you have already published under OGL 1.0a will still always be licensed under OGL 1.0a.

– Very limited license changes allowed. Only two sections can be changed once OGL 1.2 is live: how you cite Wizards in your work and how we can contact each other. We don’t know what the future holds or what technologies we will use to communicate with each other, so we thought these two sections needed to be future-proofed.

A revised version of this draft will be presented to the community again “on or before February 17.”
“The process will extend as long as it needs to,” Brink said. “We’ll keep iterating and getting your feedback until we get it right.”

Read more of this story at Slashdot.

The Lights Have Been On At a Massachusetts School For Over a Year Because No One Can Turn Them Off

An anonymous reader quotes a report from NBC News: For nearly a year and a half, a Massachusetts high school has been lit up around the clock because the district can’t turn off the roughly 7,000 lights in the sprawling building. The lighting system was installed at Minnechaug Regional High School when it was built over a decade ago and was intended to save money and energy. But ever since the software that runs it failed on Aug. 24, 2021, the lights in the school, located in the Springfield suburbs, have been on continuously, costing taxpayers a small fortune.

“We are very much aware this is costing taxpayers a significant amount of money,” Aaron Osborne, the assistant superintendent of finance at the Hampden-Wilbraham Regional School District, told NBC News. “And we have been doing everything we can to get this problem solved.” Osborne said it’s difficult to say how much money it’s costing because during the pandemic and in its aftermath, energy costs have fluctuated wildly. “I would say the net impact is in the thousands of dollars per month on average, but not in the tens of thousands,” Osborne said. That, in part, is because the high school uses highly efficient fluorescent and LED bulbs, he said. And, when possible, teachers have manually removed bulbs from fixtures in classrooms while staffers have shut off breakers not connected to the main system to douse some of the exterior lights.

But there’s hope on the horizon that the lights at Minnechaug will soon be dimmed. Paul Mustone, president of the Reflex Lighting Group, said the parts they need to replace the system at the school have finally arrived from the factory in China and they expect to do the installation over the February break. “And yes, there will be a remote override switch so this won’t happen again,” said Mustone, whose company has been in business for more than 40 years.

Read more of this story at Slashdot.

Android 13 Is Running On 5.2% of All Devices Five Months After Launch

According to the latest official Android distribution numbers from Google, Android 13 is running on 5.2% of all devices less than six months after launch. 9to5Google reports: According to Android Studio, devices running Android 13 now account for 5.2% of all devices. Meanwhile, Android 12 and 12L now account for 18.9% of the total, a significant increase from August’s 13.5% figure. Notably, while Google’s chart does include details about Android 13, it doesn’t make a distinction between Android 12 and 12L. Looking at the older versions, we see that usage of Android Oreo has finally dropped below 10%, with similar drops in percentage down the line. Android Jelly Bean, which previously weighed in at 0.3%, is no longer listed, while KitKat has dropped from 0.9% to 0.7%.

Android 13’s 5.2% distribution number “is better than it sounds,” writes Ryan Whitwam via ExtremeTech: These numbers show an accelerating pickup for Google’s new platform versions. If you look back at stats from the era of Android KitKat and Lollipop, the latest version would only have a fraction of this usage share after half a year. That’s because the only phones running the new software would be Google’s Nexus phones, plus maybe one or two new devices from OEMs that worked with Google to deploy the latest software as a marketing gimmick.

The improvements are thanks largely to structural changes in how Android is developed and deployed. For example, Project Treble was launched in 2017 to re-architect the platform, separating the OS framework from the low-level vendor code. This made it easier to update devices without waiting on vendors to provide updated drivers. We saw evidence of improvement that very year, and it’s gotten better ever since.
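
As a loose illustration of that split (in Python rather than Android’s actual HIDL/AIDL machinery, and with hypothetical names), the idea is that framework code only ever calls a frozen vendor-facing interface, so the OS side can move to a new release without rebuilding the vendor side:

```python
# Loose conceptual analogy for Project Treble's framework/vendor split.
# This is NOT Android code and the names are hypothetical: in Android the stable
# boundary is a versioned HIDL/AIDL vendor interface, not a Python ABC.
from abc import ABC, abstractmethod

class CameraHal(ABC):
    """Stands in for the frozen, versioned vendor interface."""
    @abstractmethod
    def open_camera(self, camera_id: int) -> str: ...

class VendorCameraHal(CameraHal):
    """Ships with the device, built against low-level drivers, rarely touched."""
    def open_camera(self, camera_id: int) -> str:
        return f"vendor driver opened camera {camera_id}"

class CameraFramework:
    """OS-side code: can be upgraded to a new release as long as it only calls
    the stable interface, with no rebuild of the vendor layer required."""
    def __init__(self, hal: CameraHal):
        self.hal = hal

    def take_photo(self) -> str:
        return self.hal.open_camera(0) + " -> photo captured"

print(CameraFramework(VendorCameraHal()).take_photo())
```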

Read more of this story at Slashdot.

iOS 16.3 Expands Advanced Data Protection Option For iCloud Encryption Globally

Apple today announced that Advanced Data Protection is expanding beyond the United States. MacRumors reports: Starting with iOS 16.3, the security feature will be available globally, giving users the option to enable end-to-end encryption for many additional iCloud data categories, including Photos, Notes, Voice Memos, Messages backups, device backups, and more. iOS 16.3 is currently in beta and expected to be released to the public next week.

By default, Apple stores encryption keys for some iCloud data types on its servers to ensure that users can recover their data if they lose access to their Apple ID account. If a user enables Advanced Data Protection, the encryption keys are deleted from Apple’s servers and stored on a user’s devices only, preventing Apple, law enforcement, or anyone else from accessing the data, even if iCloud servers were to be breached.
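
For readers unfamiliar with the distinction, the sketch below illustrates the general end-to-end idea using Python’s third-party cryptography package. It is a minimal conceptual example, not Apple’s implementation: the key is generated and kept on the device, and only ciphertext would ever reach the server.

```python
# Minimal conceptual sketch of end-to-end encryption, NOT Apple's implementation.
# Uses the third-party 'cryptography' package (pip install cryptography).
from cryptography.fernet import Fernet

device_key = Fernet.generate_key()   # created on the device and never uploaded
cipher = Fernet(device_key)

note = b"Meeting notes: launch on Tuesday"
ciphertext = cipher.encrypt(note)    # only this would be stored server-side

# A breached server yields only ciphertext...
print(ciphertext[:40])
# ...while the device, which holds the key, can still recover the data.
print(cipher.decrypt(ciphertext))
```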

iCloud already provides end-to-end encryption for 14 data categories without Advanced Data Protection turned on, including Messages (excluding backups), passwords stored in iCloud Keychain, Health data, Apple Maps search history, Apple Card transactions, and more. Advanced Data Protection expands this protection to the vast majority of iCloud categories, with major exceptions including the Mail, Contacts, and Calendar apps. For more information, you can read Apple’s Advanced Data Protection support document.

Read more of this story at Slashdot.