Hobbyist’s Experiment Creates a Self-Soldering Circuit Board

Long-time Slashdot reader wonkavader found a video on YouTube where, at the 2:50 mark, there’s time-lapse footage of solder paste magically melting into place. The secret?
Many circuit boards include a ground plane as a layer. This doesn’t have to be a big unbroken expanse of copper — it can be a long snake to reduce the copper used. Well, if you run 9 volts through that long snake, it acts as a resistor and heats up the board enough to melt solder paste. Electronics engineer Carl Bugeja has made a board which controls the 9-volt input to keep the temperature on the desired curve for the solder.

This is an interesting home-brew project which seems like it could someday become a welcome standard feature in kits.
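
Conceptually, the control side of the trick is just a closed loop: read the board temperature, compare it to where the reflow profile says you should be, and switch the 9-volt supply accordingly. The sketch below is a minimal illustration of that loop in Python, not Bugeja’s actual firmware; the profile values and the read_temperature()/set_heater() callables are assumptions for the example.

```python
import time

# Illustrative lead-free reflow profile as (seconds into cycle, target °C).
# These are typical textbook setpoints, not Bugeja's actual curve.
REFLOW_PROFILE = [
    (0, 25),     # start near room temperature
    (90, 150),   # preheat ramp
    (180, 180),  # soak
    (220, 245),  # reflow peak
    (260, 180),  # begin cooldown
    (320, 50),   # safe to handle
]

def target_temperature(t):
    """Linearly interpolate the setpoint for elapsed time t along the profile."""
    for (t0, c0), (t1, c1) in zip(REFLOW_PROFILE, REFLOW_PROFILE[1:]):
        if t0 <= t <= t1:
            return c0 + (c1 - c0) * (t - t0) / (t1 - t0)
    return REFLOW_PROFILE[-1][1]

def run_reflow(read_temperature, set_heater, period=0.5):
    """Simple on/off control: enable the 9 V supply while the board trace
    reads below the current setpoint, disable it when it reads above."""
    start = time.monotonic()
    end_time = REFLOW_PROFILE[-1][0]
    while (t := time.monotonic() - start) < end_time:
        setpoint = target_temperature(t)
        set_heater(read_temperature() < setpoint)  # True = 9 V applied to the coil
        time.sleep(period)
    set_heater(False)  # always finish with the supply off
```

A real controller would more likely use PID control and a calibrated thermistor or thermocouple on the trace, but the structure is the same: follow the profile and close the loop on measured temperature.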

Hackaday is impressed by the possibilities too:
Surface-mount components have been a game changer for the electronics hobbyist, but doing reflow soldering right requires some way to evenly heat the board. You don’t necessarily need to buy a commercial reflow oven — you can cobble one together from an old toaster oven, after all — but you still need something, because it’s not like a PCB is going to solder itself. Right?

Wrong. At least if you’re Carl Bugeja, who came up with a clever way to make his PCBs self-soldering…. The quality of the soldering seems very similar to what you’d see from a reflow oven…. After soldering, the now-useless heating element is converted into a ground plane for the circuit by breaking off the terminals and soldering on a couple of zero ohm resistors to short the coil to ground.

It’s an open-source project, with all files available on GitHub. “This is really clever,” tweeted Adrian Bowyer, inventor of the open-source RepRap 3D printer project.

In the video Bugeja compares reflow soldering to pizza-making. (If the circuit board is the underlying dough, then the electronics on top are the toppings, with the solder paste representing the sauce that keeps them in place. “The oven’s heat is what bonds these individual items together.”)

But by that logic making a self-soldering circuit is “like putting the oven in the dough and making it edible.”

Read more of this story at Slashdot.

Automation Caused More than Half America’s Income Inequality Since 1980, Study Claims

A newly published study co-authored by MIT economist Daron Acemoglu “quantifies the extent to which automation has contributed to income inequality in the U.S.,” reports SciTechDaily, “simply by replacing workers with technology — whether self-checkout machines, call-center systems, assembly-line technology, or other devices.”

Over the last four decades, the income gap between more- and less-educated workers has grown significantly; the study finds that automation accounts for more than half of that increase. “This single one variable … explains 50 to 70 percent of the changes or variation between group inequality from 1980 to about 2016,” Acemoglu says….

Acemoglu and Pascual Restrepo, an assistant professor of economics at Boston University, used U.S. Bureau of Economic Analysis statistics on the extent to which human labor was used in 49 industries from 1987 to 2016, as well as data on machinery and software adopted in that time. The scholars also used data they had previously compiled about the adoption of robots in the U.S. from 1993 to 2014. In previous studies, Acemoglu and Restrepo have found that robots have by themselves replaced a substantial number of workers in the U.S., helped some firms dominate their industries, and contributed to inequality.

At the same time, the scholars used U.S. Census Bureau metrics, including its American Community Survey data, to track worker outcomes during this time for roughly 500 demographic subgroups… By examining the links between changes in business practices alongside changes in labor market outcomes, the study can estimate what impact automation has had on workers.
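
To make the flavor of that exercise concrete, and to be clear that this is a made-up toy rather than Acemoglu and Restrepo’s actual specification or data, the group-level estimation boils down to relating each demographic group’s wage change to a measure of how exposed that group was to automation, along these lines:

```python
import numpy as np

# Hypothetical, made-up data: one row per demographic group.
# task_displacement = share of the group's 1980 tasks later automated.
# wage_change       = real wage change 1980-2016, in percent.
task_displacement = np.array([0.05, 0.10, 0.18, 0.25, 0.32, 0.40])
wage_change       = np.array([4.0, 1.5, -1.0, -4.5, -7.0, -9.0])

# Ordinary least squares: wage_change ≈ a + b * task_displacement
X = np.column_stack([np.ones_like(task_displacement), task_displacement])
(a, b), *_ = np.linalg.lstsq(X, wage_change, rcond=None)

predicted = X @ np.array([a, b])
explained = 1 - np.var(wage_change - predicted) / np.var(wage_change)
print(f"slope: {b:.1f} points of wage change per unit of task displacement")
print(f"share of between-group variation explained: {explained:.0%}")
```

The real study controls for many other factors, but a “share of variation explained” figure of this kind is what sits behind the 50-to-70-percent claim quoted above.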

Ultimately, Acemoglu and Restrepo conclude that the effects have been profound. Since 1980, for instance, they estimate that automation has reduced the wages of men without a high school degree by 8.8 percent and women without a high school degree by 2.3 percent, adjusted for inflation.

Thanks to long-time Slashdot reader schwit1 for sharing the article.

Read more of this story at Slashdot.

First Small Modular Nuclear Reactor Certified For Use In US

The U.S. Nuclear Regulatory Commission has certified the design for what will be the United States’ first small modular nuclear reactor. The Associated Press reports: The rule that certifies the design was published Thursday in the Federal Register. It means that companies seeking to build and operate a nuclear power plant can pick the design for a 50-megawatt, advanced light-water small modular nuclear reactor by Oregon-based NuScale Power and apply to the NRC for a license. It’s the final determination that the design is acceptable for use, so it can’t be legally challenged during the licensing process when someone applies to build and operate a nuclear power plant, NRC spokesperson Scott Burnell said Friday. The rule becomes effective in late February.

The U.S. Energy Department said the newly approved design “equips the nation with a new clean power source to help drive down” planet-warming greenhouse gas emissions. It’s the seventh nuclear reactor design cleared for use in the United States. The rest are for traditional, large, light-water reactors. Diane Hughes, NuScale’s vice president of marketing and communications, said the design certification is a historic step toward a clean energy future and makes the company’s VOYGR power plant a near-term deployable solution for customers. The first small modular reactor design application package included over 2 million pages of supporting materials, Hughes added. “NuScale has also applied to the NRC for approval of a larger design, at 77 megawatts per module, and the agency is checking the application for completeness before starting a full review,” adds the report.

Read more of this story at Slashdot.

Intel, AMD Just Created a Headache for Datacenters

An anonymous reader shares a report: In pursuit of ever-higher compute density, chipmakers are juicing their chips with more and more power, and according to the Uptime Institute, this could spell trouble for many legacy datacenters ill-equipped to handle new, higher-wattage systems. AMD’s Epyc 4 Genoa server processors, announced late last year, and Intel’s long-awaited fourth-gen Xeon Scalable silicon, released earlier this month, are the duo’s most powerful and power-hungry chips to date, sucking down 400W and 350W respectively, at least at the upper end of the product stack. The higher TDPs arrive in lockstep with higher core counts and clock speeds than either vendor’s previous CPU generations.

It’s now possible to cram more than 192 x64 cores into your typical 2U dual socket system, something that just five years ago would have required at least three nodes. However, as Uptime noted, many legacy datacenters were not designed to accommodate systems this power dense. A single dual-socket system from either vendor can easily exceed a kilowatt, and depending on the kinds of accelerators being deployed in these systems, boxen can consume well in excess of that figure. The rapid trend towards hotter, more power dense systems upends decades-old assumptions about datacenter capacity planning, according to Uptime, which added: “This trend will soon reach a point when it starts to destabilize existing facility design assumptions.”

A typical rack remains under 10kW of design capacity, the analysts note. But with modern systems trending toward higher compute density and, by extension, power density, that’s no longer adequate. While Uptime notes that for new builds, datacenter operators can optimize for higher rack power densities, they still need to account for 10 to 15 years of headroom. As a result, datacenter operators must speculate as to long-term power and cooling demands, which invites the risk of under- or over-building. With that said, Uptime estimates that within a few years a quarter rack will reach 10kW of consumption. That works out to approximately 1kW per rack unit for a standard 42U rack.
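
The arithmetic behind that concern is easy to sketch. The per-node wattages below are assumptions for illustration, not Uptime’s figures, but they show how a legacy 10kW rack budget runs out long before a 42U rack is physically full:

```python
# Rough rack power budgeting. The per-node wattages are assumptions for
# illustration, not figures from the Uptime Institute report.
RACK_BUDGET_W = 10_000   # typical legacy design capacity per rack
RACK_UNITS = 42          # standard full-height rack
NODE_HEIGHT_U = 2        # assume 2U dual-socket nodes

def nodes_per_rack(node_power_w):
    """Nodes that fit, limited by either the power budget or physical space."""
    by_power = RACK_BUDGET_W // node_power_w
    by_space = RACK_UNITS // NODE_HEIGHT_U
    return min(by_power, by_space)

for label, watts in [
    ("dual-socket CPU-only node", 1_100),
    ("dual-socket node + accelerators", 2_000),
    ("dense GPU node", 4_000),
]:
    n = nodes_per_rack(watts)
    print(f"{label:32s} {n:2d} nodes, {n * NODE_HEIGHT_U:2d}U of {RACK_UNITS}U filled, "
          f"{n * watts / 1000:4.1f} kW drawn")
```

Even the most modest configuration here hits the power ceiling with more than half the rack units empty, which is exactly the capacity-planning mismatch Uptime is describing.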

Read more of this story at Slashdot.

Boston Dynamics’ Latest Atlas Video Demos a Robot That Can Run, Jump and Now Grab and Throw

Boston Dynamics released a demo of its humanoid robot Atlas, showing it pick up and deliver a bag of tools to a construction worker. While Atlas could already run and jump over complex terrain, the new hands, or rudimentary grippers, “give the robot new life,” reports TechCrunch. From the report: The claw-like gripper consists of one fixed finger and one moving finger. Boston Dynamics says the grippers were designed for heavy lifting tasks and were first demonstrated in a Super Bowl commercial where Atlas held a keg over its head. The videos released today show the grippers picking up construction lumber and a nylon tool bag. Next, Atlas picks up a 2×8 and places it between two boxes to form a bridge. Atlas then picks up the bag of tools and dashes over the bridge and through construction scaffolding. But the tool bag needs to go to the second level of the structure — something Atlas apparently realizes, quickly throwing the bag a considerable distance. Boston Dynamics describes this final maneuver: “Atlas’ concluding move, an inverted 540-degree, multi-axis flip, adds asymmetry to the robot’s movement, making it a much more difficult skill than previously performed parkour.” A behind-the-scenes video describing how Atlas is able to recognize and interact with objects is also available on YouTube.

Read more of this story at Slashdot.

CNET Pauses Publishing AI-Written Stories After Disclosure Controversy

CNET will pause publication of stories generated using artificial intelligence “for now,” the site’s leadership told employees on a staff call Friday. The Verge reports: The call, which lasted under an hour, was held a week after CNET came under fire for its use of AI tools on stories and one day after The Verge reported that AI tools had been in use for months, with little transparency to readers or staff. CNET hadn’t formally announced the use of AI until readers noticed a small disclosure. “We didn’t do it in secret,” CNET editor-in-chief Connie Guglielmo told the group. “We did it quietly.” CNET, owned by private equity firm Red Ventures, is among several websites that have been publishing articles written using AI. Other sites like Bankrate and CreditCards.com would also pause AI stories, executives on the call said.

The call was hosted by Guglielmo, Lindsey Turrentine, CNET’s EVP of content and audience, and Lance Davis, Red Ventures’ vice president of content. They answered a handful of questions submitted by staff ahead of time in the AMA-style call. Davis, who was listed as the point of contact for CNET’s AI stories until recently, also gave staff a more detailed rundown of the tool that has been utilized for the robot-written articles. Until now, most staff had very little insight into the machine that was generating dozens of stories appearing on CNET.

The AI, which is as yet unnamed, is a proprietary tool built by Red Ventures, according to Davis. AI editors are able to choose domains and domain-level sections from which to pull data and generate stories; editors can also use a combination of AI-generated text and their own writing or reporting. In the meeting, Turrentine declined to answer staff questions about the dataset used to train the AI, as well as questions about plagiarism concerns, but said more information would be available next week and that some staff would get a preview of the tool.

Read more of this story at Slashdot.

IBM Top Brass Accused Again of Using Mainframes To Prop Up Watson, Cloud Sales

IBM, along with 13 of its current and former executives, has been sued by investors who claim the IT giant used mainframe sales to fraudulently prop up newer, more trendy parts of its business. The Register reports: In effect, IBM deceived the market about its progress in developing Watson, cloud technologies, and other new sources of revenue, by deliberately misclassifying the money it was making from mainframe deals, assigning that money instead to other products, it is alleged. The accusations emerged in a lawsuit [PDF] filed late last week against IBM in New York on behalf of the June E Adams Irrevocable Trust. It alleged Big Blue shifted sales by its “near-monopoly” mainframe business to its newer and less popular cloud, analytics, mobile, social, and security products (CAMSS), which bosses promoted as growth opportunities and designated “Strategic Imperatives.”

IBM is said to have created the appearance of demand for these Strategic Imperative products by bundling them into three- to five-year mainframe Enterprise License Agreements (ELA) with large banking, healthcare, and insurance company customers. In other words, it is claimed, mainframe sales agreements had Strategic Imperative products tacked on to help boost the sales performance of those newer offerings and give investors the impression customers were clamoring for those technologies from IBM. “Defendants used steep discounting on the mainframe part of the ELA in return for the customer purchasing catalog software (i.e. Strategic Imperative Revenue), unneeded and unused by the customer,” the lawsuit stated.

IBM is also alleged to have shifted revenue from its non-strategic Global Business Services (GBS) segment to Watson, a Strategic Imperative in the CAMSS product set, to convince investors that the company was successfully expanding beyond its legacy business. Last April the plaintiff Trust filed a similar case, which was joined by at least five other law firms representing other IBM shareholders. A month prior, the IBM board had been presented with a demand letter from shareholders to investigate the above allegations. Asked whether any action has been taken as a result of that letter, IBM has yet to respond.

Read more of this story at Slashdot.

D&D Will Move To a Creative Commons License, Requests Feedback On a New OGL

A new draft of the Dungeons & Dragons Open Gaming License, dubbed OGL 1.2 by publisher Wizards of the Coast, is now available for download. Polygon reports: The announcement was made Thursday by Kyle Brink, executive producer of D&D, on the D&D Beyond website. According to Wizards, this draft could place the OGL outside of the publisher’s control — which should sound good to fans enraged by recent events. Time will tell, but public comment will be accepted beginning Jan. 20 and will continue through Feb. 3. […] Creative Commons is a nonprofit organization that, by its own description, “helps overcome legal obstacles to the sharing of knowledge and creativity to address the world’s most pressing challenges.” As such, a Creative Commons license once enacted could ultimately put the OGL 1.2 outside of Wizards’ control in perpetuity.

“We’re giving the core D&D mechanics to the community through a Creative Commons license, which means that they are fully in your hands,” Brink said in the blog post. “If you want to use quintessentially D&D content from the SRD such as owlbears and magic missile, OGL 1.2 will provide you a perpetual, irrevocable license to do so.” So much trust has been lost over the last several weeks that it will no doubt take a while for legal experts — armchair and otherwise — to pore over the details of the new OGL. These are the bullet points that Wizards is promoting in this official statement:

– Protecting D&D’s inclusive play experience. As I said above, content more clearly associated with D&D (like the classes, spells, and monsters) is what falls under the OGL. You’ll see that OGL 1.2 lets us act when offensive or hurtful content is published using the covered D&D stuff. We want an inclusive, safe play experience for everyone. This is deeply important to us, and OGL 1.0a didn’t give us any ability to ensure it.

– TTRPGs and VTTs. OGL 1.2 will only apply to TTRPG content, whether published as books, as electronic publications, or on virtual tabletops (VTTs). Nobody needs to wonder or worry if it applies to anything else. It doesn’t.

– Deauthorizing OGL 1.0a. We know this is a big concern. The Creative Commons license and the open terms of 1.2 are intended to help with that. One key reason why we have to deauthorize: We can’t use the protective options in 1.2 if someone can just choose to publish harmful, discriminatory, or illegal content under 1.0a. And again, any content you have already published under OGL 1.0a will still always be licensed under OGL 1.0a.

– Very limited license changes allowed. Only two sections can be changed once OGL 1.2 is live: how you cite Wizards in your work and how we can contact each other. We don’t know what the future holds or what technologies we will use to communicate with each other, so we thought these two sections needed to be future-proofed.

A revised version of this draft will be presented to the community again “on or before February 17.”
“The process will extend as long as it needs to,” Brink said. “We’ll keep iterating and getting your feedback until we get it right.”

Read more of this story at Slashdot.

Android 13 Is Running On 5.2% of All Devices Five Months After Launch

According to the latest official Android distribution numbers from Google, Android 13 is running on 5.2% of all devices less than six months after launch. 9to5Google reports: According to Android Studio, devices running Android 13 now account for 5.2% of all devices. Meanwhile Android 12 and 12L now account for 18.9% of the total, a significant increase from August’s 13.5% figure. Notably, while Google’s chart does include details about Android 13, it doesn’t make a distinction between Android 12 and 12L. Looking at the older versions, we see that usage of Android Oreo has finally dropped below 10%, with similar drops in percentage down the line. Android Jelly Bean, which previously weighed in at 0.3%, is no longer listed, while KitKat has dropped from 0.9% to 0.7%.

Android 13’s 5.2% distribution number “is better than it sounds,” writes Ryan Whitwam via ExtremeTech: These numbers show an accelerating pickup for Google’s new platform versions. If you look back at stats from the era of Android KitKat and Lollipop, the latest version would only have a fraction of this usage share after half a year. That’s because the only phones running the new software would be Google’s Nexus phones, plus maybe one or two new devices from OEMs that worked with Google to deploy the latest software as a marketing gimmick.

The improvements are thanks largely to structural changes in how Android is developed and deployed. For example, Project Treble was launched in 2017 to re-architect the platform, separating the OS framework from the low-level vendor code. This made it easier to update devices without waiting on vendors to provide updated drivers. We saw evidence of improvement that very year, and it’s gotten better ever since.
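
As a rough illustration of that separation (the class and method names below are made up, not Android’s actual HAL or HIDL/AIDL definitions), the idea is that the framework only ever talks to a frozen interface, so either side can be updated without touching the other:

```python
from abc import ABC, abstractmethod

# Hypothetical stand-in for a stable, versioned vendor interface. In Android
# this role is played by HIDL/AIDL HAL definitions; these classes are
# illustrative only.
class CameraHalV1(ABC):
    @abstractmethod
    def capture_frame(self) -> bytes: ...

# Vendor side: ships with the device and rarely changes.
class VendorCameraDriver(CameraHalV1):
    def capture_frame(self) -> bytes:
        return b"\x00" * (640 * 480)  # pretend sensor data

# Framework side: can be replaced by an OS update, as long as it only
# depends on the frozen CameraHalV1 interface.
class CameraService:
    def __init__(self, hal: CameraHalV1):
        self.hal = hal

    def take_photo(self) -> int:
        return len(self.hal.capture_frame())

# An OS update swaps in a newer CameraService; the vendor driver underneath
# does not need to change.
service = CameraService(VendorCameraDriver())
print(service.take_photo())  # 307200
```

The real mechanism involves versioned HAL packages and a separate vendor partition, but the dependency direction is the key point: the framework depends on the interface, never on a particular vendor build.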

Read more of this story at Slashdot.