Intel’s Expensive New Plan to Upgrade Its Chip Technology – and US Manufacturing

America’s push to manufacture more products domestically gets an in-depth look from CNET — including a new Intel chip factory outside of Phoenix.

CNET calls it a fork in the road for Intel “after squandering its lead because of a half decade of problems modernizing its manufacturing…”

With “a decade of bad decisions, this doesn’t get fixed overnight,” says Pat Gelsinger, Intel’s new chief executive, in an interview. “But the bottom is behind us and the slope is starting to feel increasingly strong….” More fabs are on the way, too. In an enormous empty patch of dirt at its existing Arizona site, Intel has just begun building fabs 52 and 62 at a total cost of $20 billion, set to make Intel’s most advanced chips, starting in 2024. Later this year, it hopes to announce the U.S. location for its third major manufacturing complex, a 1,000-acre site costing about $100 billion. The spending commitment makes this year’s $3.5 billion upgrade to its New Mexico fab look cheap. The goal is to restore the U.S. share of chip manufacturing, which has slid from 37% in 1990 to 12% today. “Over the decade in front of us, we should be striving to bring the U.S. to 30% of worldwide semiconductor manufacturing,” Gelsinger says…

But returning Intel to its glory days — and anchoring a resurgent U.S. electronics business in the process — is much easier said than done. Making chips profitably means running fabs at maximum capacity to pay off the gargantuan investments required to stay at the leading edge. A company that can’t keep pace gets squeezed out, like IBM in 2014 or GlobalFoundries in 2018. To catch up after its delays, Intel now plans to upgrade its manufacturing five times in the next four years, a breakneck pace by industry standards. “This new roadmap that they announced is really aggressive,” says Linley Group analyst Linley Gwennap. “I don’t have any idea how they are going to accomplish all of that….”

Gelsinger has a tech-first recovery plan. He’s pledged to accelerate manufacturing upgrades to match the technology of TSMC and Samsung by 2024 and surpass them in 2025. He’s opening Intel’s fabs to other companies that need chips built through its new Intel Foundry Services (IFS). And he’s relying on other foundries, including TSMC, for about a quarter of Intel’s near-term chipmaking needs to keep its chips more competitive during the upgrades. This three-pronged strategy is called IDM (integrated device manufacturing) 2.0. That’s a new take on Intel’s philosophy of both designing and making chips. It’s more ambitious than the future some had expected, in which Intel would sell its factories and join the ranks of “fabless” chip designers like Nvidia, AMD and Qualcomm that rely on others for manufacturing…

Shareholders may not like Gelsinger’s spending-heavy strategy, but one community really does: Intel’s engineers… Gelsinger told the board that Intel is done with stock buybacks, a financial move in which a company uses its cash to repurchase its own shares and thereby boost their price. “We’re investing in factories,” he told me. “That’s going to be the use of our cash….”

“We cannot recall the last time Intel put so many stakes in the ground,” said BMO Capital Markets analyst Ambrish Srivastava in a July research report after Intel announced its schedule.

Intel will even outpace Moore’s Law, Gelsinger tells CNET — more than doubling the transistor count on processors every two years. “I believe that you’re going to see from 2025 to 2035 a very healthy period for Moore’s Law-like behavior.”
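As rough arithmetic (ours, not CNET’s), “doubling every two years” compounds quickly across the window Gelsinger names; a quick sketch:

```python
# Back-of-the-envelope check (not from the article): what doubling
# transistor counts every two years implies for 2025-2035.
start_year, end_year = 2025, 2035
doubling_period_years = 2  # the classic Moore's Law cadence

doublings = (end_year - start_year) / doubling_period_years  # 5 doublings
growth = 2 ** doublings                                      # 2^5 = 32x

print(f"{doublings:.0f} doublings -> {growth:.0f}x the transistors")
# A hypothetical 100-billion-transistor chip in 2025 would land
# around 3.2 trillion transistors by 2035 at that pace.
print(f"{100e9 * growth:.2e}")
```

Outpacing Moore’s Law, as Gelsinger promises, would mean beating even that 32x factor.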

That pace still carries some risk for Intel’s investments if it has to pass the costs on to customers, a Linley Group analyst points out to CNET. “Moore’s Law is not going to end when we can’t build smaller transistors. It’s going to end when somebody says I don’t want to pay for smaller transistors.”

Read more of this story at Slashdot.

Cryptographers Aren’t Happy With How You’re Using the Word ‘Crypto’

Cryptographers are upset that “crypto” sometimes now refers to cryptocurrency, reports the Guardian:

This lexical shift has weighed heavily on cryptographers, who, over the past few years, have repeated the rallying cry “Crypto means cryptography” on social media. T-shirts and hoodies trumpet the phrase and variations on it; there’s a website dedicated solely to clarifying the issue. “‘Crypto’ for decades has been used as shorthand and as a prefix for things related to cryptography,” said Amie Stepanovich, executive director of Silicon Flatirons Center at the University of Colorado Law School and creator of the pro-cryptography T-shirts, which have become a hit at conferences. “In fact, in the term cryptocurrency, the prefix crypto refers back to cryptography….”

[T]here remains an internecine feud among the tech savvy about the word. As Parker Higgins of the Freedom of the Press Foundation, who has spent years involved in cryptography activism, pointed out, the cryptography crowd is by nature deeply invested in precision — after all, designing and cracking codes is an endeavor in which, if you get things “a little wrong, it can blow the whole thing up….”

“Strong cryptography is a cornerstone of the way that people talk about privacy and security, and it has been under attack for decades” by governments, law enforcement, and “all sorts of bad actors”, Higgins said. For its defenders, confusion over terminology creates yet another challenge.

Stepanovich acknowledged the challenge of opposing the trend, but said the weight of history is on her side. “The study of crypto has been around for ever,” she said. “The most famous code is known as the Caesar cipher, referring to Julius Caesar. This is not new.” Cryptocurrency, on the other hand, is a relatively recent development, and she is not ready to concede to “a concept that may or may not survive government regulation”.
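For the record, the Caesar cipher Stepanovich cites really is as old-school as cryptography gets: every letter shifts a fixed distance down the alphabet. A minimal Python sketch, using the shift of 3 traditionally attributed to Caesar:

```python
# Minimal Caesar cipher: shift each letter a fixed distance down the
# alphabet, wrapping around at 'z' (Caesar reportedly used a shift of 3).
def caesar(text: str, shift: int) -> str:
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)  # leave spaces and punctuation untouched
    return "".join(out)

print(caesar("crypto means cryptography", 3))   # fubswr phdqv fubswrjudskb
print(caesar("fubswr phdqv fubswrjudskb", -3))  # shifts back to plaintext
```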

Read more of this story at Slashdot.

Researcher Argues Data Paints ‘Big Red Flashing Arrow’ Toward Wuhan Market as Covid-19 Origin

CNN reports on researcher Michael Worobey, “who specializes in tracing the genetic evolution of viruses,” who has now found “considerable evidence that the virus arose in an animal, and did not start circulating until the end of 2019.”

One case especially stood out — that of a 41-year-old accountant who allegedly got sick on December 8, 2019, and who had no connection to the market. The case has been cited as evidence that the pandemic did not start at the market.

Worobey found records that showed the man didn’t become ill with Covid-19 until later in December and that his December 8 problem was related to his teeth.

“This is corroborated by hospital records and a scientific paper that reports his COVID-19 onset date as 16 December and date of hospitalization as 22 December,” Worobey wrote in a commentary in the journal Science. That would make a seafood vendor who worked at the market and who got sick December 11, 2019, the earliest documented case, Worobey said.

Other research helped Worobey come up with a map of the earliest cases that clusters them all around the market. “That so many of the more than 100 COVID-19 cases from December with no identified epidemiologic link to Huanan Market nonetheless lived in its direct vicinity is notable and provides compelling evidence that community transmission started at the market,” he wrote. “It tells us that there’s a big red flashing arrow pointing at Huanan Market as the most likely place that the pandemic started,” Worobey told CNN. “The virus didn’t come from some other part of Wuhan and then get to Huanan market. The evidence speaks really quite strongly to the virus starting at the market and then leaking into the neighborhoods around the market….”

The journal Science subjected Worobey’s research to outside scrutiny before publishing it.

Interestingly, Science also published a letter in May in which Worobey had joined 17 other scientists to urge the investigation of both the “natural origin” and “lab leak” theories. While he still believes the Chinese government should have investigated the lab-leak theory, “holy smokes — is there a lot of evidence against it, and in favor of natural origin,” Worobey now tells CNN. And he tells the Los Angeles Times that his new research “takes the lab-leak idea almost completely off the table…. So many of the early cases were tied to this one Home Depot-sized building in a city of 11 million people, when there are thousands of other places where it would be more likely for early cases to be linked to if the virus had not started there.”

Or, as he explained his research to the Washington Post, “It becomes almost impossible to explain that pattern if that epidemic didn’t start there.”

A virologist at Texas A&M University who was one of the coronavirus experts giving SARS-CoV-2 its name called Worobey’s research “detailed and compelling,” while a virologist at Tulane University also tells the Post the new research “shows beyond a shadow of a doubt that in fact the Huanan market was the epicenter of the outbreak.”

Read more of this story at Slashdot.

Is ‘The NFT Bay’ Just a Giant Hoax?

Recently, Australian developer Geoffrey Huntley announced they’d created a 20-terabyte archive of all NFTs on the Ethereum and Solana blockchains.
But one NFT startup now says it tried downloading the archive — and discovered most of it was zeroes.

Many of the articles are careful to point out “we have not verified the contents of the torrent,” because of course they couldn’t. A 20TB torrent would take several days to download, necessitating a pretty beefy internet connection and more disk space than most people have at their disposal. We at ClubNFT fired up a massive AWS instance with 40TB of EBS disk space to attempt to download this, with a cost estimate of $10k-20k over the next month, as we saw this torrent as potentially an easy way to pre-seed our NFT storage efforts — not many people have these resources to devote to a single news story.

Fortunately, we can save you the trouble of downloading the entire torrent — all you need is about 10GB. Download the first 10GB of the torrent, plus the last block, and you can fill in all the rest with zeroes. In other words, it’s empty; and no, Geoff did not actually download all the NFTs. Ironically, Geoff has archived all of the media articles about this and linked them on TheNFTBay’s site, presumably to preserve an immutable record of the spread and success of his campaign — kinda like an NFT…
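ClubNFT doesn’t spell out how it checked, but verifying that the bulk of a huge file is zero-filled takes only a few lines; a sketch along these lines (the file path and 10GB offset are illustrative) would do it:

```python
# Sketch of one way to verify the "it's all zeroes" claim (our
# illustration; ClubNFT hasn't published its method). Scans a file
# chunk by chunk so memory use stays flat even on a 20TB payload.
CHUNK = 1 << 20  # 1 MiB

def first_nonzero_offset(path: str, start: int = 0):
    """Offset of the first nonzero byte at or after `start`, else None."""
    with open(path, "rb") as f:
        f.seek(start)
        offset = start
        while block := f.read(CHUNK):
            if block.count(0) != len(block):  # any nonzero byte in here?
                return offset + next(i for i, b in enumerate(block) if b)
            offset += len(block)
    return None

# e.g. skip the ~10GB that reportedly contain real data:
print(first_nonzero_offset("nft-archive.bin", start=10 * 1024**3))
```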

We were hoping this was real… [I]t is actually rather complicated to correctly download and secure the media for even a single NFT, nevermind trying to do it for every NFT ever made. This is why we were initially skeptical of Geoff’s statements. But even if he had actually downloaded all the NFT media and made it available as a torrent, this would not have solved the problem… a torrent containing all the NFTs does nothing to actually make those NFTs available via IPFS, which is the network they must be present on in order for the NFTs to be visible on marketplaces and galleries….
[A]nd this is a bit in the weeds: in order to reupload an NFT’s media to IPFS, you need more than just the media itself. In order to restore a file to IPFS so it can continue to be located by the original link embedded in the NFT, you must know exactly the settings used when that file was originally uploaded, and potentially even the exact version of the IPFS software used for the upload.
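To make the “exact settings” point concrete: assuming the kubo (go-ipfs) command-line client, the same file produces different CIDs depending on the options passed at upload time, so a naive re-upload may not reproduce the link embedded in the NFT. A sketch:

```python
# Illustration of why re-uploading needs the original settings, assuming
# the kubo (go-ipfs) CLI is installed: different add options yield
# different CIDs for the same bytes.
import subprocess

def cid_for(path: str, *opts: str) -> str:
    # --only-hash computes the CID without actually adding the file;
    # --quieter prints just the final hash.
    result = subprocess.run(
        ["ipfs", "add", "--only-hash", "--quieter", *opts, path],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

print(cid_for("art.png"))                            # kubo defaults
print(cid_for("art.png", "--cid-version=1"))         # different CID
print(cid_for("art.png", "--chunker=size-1048576"))  # different again
# Only the combination matching the original upload reproduces the
# CID that the NFT's embedded link points at.
```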

For these reasons and more, ClubNFT is working hard on an actual solution to ensure that everybody’s NFTs can be safely secured by the collectors themselves. We look forward to providing more educational resources on these and other topics, and welcome the attention that others, like Geoff, bring to these important issues.
Their article was shared by long-time Slashdot reader GradiusCVK (who is one of ClubNFT’s three founders). I’d wondered suspiciously if ClubNFT was a hoax, but if this PR Newswire press release is legit, they’ve raised $3 million in seed funding. (And that does include an investment from Draper Dragon, co-founded by Tim Draper, which shows up on Crunchbase). The International Business Times has also covered ClubNFT, identifying it as a startup whose mission statement is “to build the next generation of NFT solutions to help collectors discover, protect, and share digital assets.”

Co-founder and CEO Jason Bailey said these next-generation tools are in their “discovery” phase, and the first set of tools, designed to provide a backup solution for NFTs, will roll out early next year. Speaking to International Business Times, Bailey said, “We are looking at early 2022 to roll out the backup solution. But between now and then we should be feeding (1,500 beta testers) valuable information about their wallets.” Bailey says that while doing the beta testing he realized there are loopholes in NFT storage systems: only 40% of the NFTs were actually pointing to IPFS, while another 40% were at risk, pointing to private servers.

Here is the problem explained: NFTs are basically a collection of metadata that define the underlying property that is owned. As with documents on the web, links point to the art and to any details about it that are stored. But links can break, or die. Many NFTs use a system called the InterPlanetary File System, or IPFS, which lets you find a piece of content as long as it is hosted somewhere on the IPFS network. Unlike in the world of internet domains, you don’t need to own a domain to make sure the data stays safe. Explaining the problem which the backup tool will address, Bailey said, “When you upload an image to IPFS, it creates a cryptographic hash. And if someone ever stops paying to store that image on IPFS, as long as you have the original image, you can always restore it. That’s why we’re giving people the right to download the image…. [W]e’re going to start with this protection tool solution that will allow people to click a button and download all the assets associated with their NFT collection and their wallet in the exact format that they would need it in to restore it back up to IPFS, should it ever disappear. And we’re not going to charge any money for that.”
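A stripped-down illustration of the guarantee Bailey describes, using a plain SHA-256 digest; real IPFS CIDs layer multihash encoding and chunking on top, so treat this as the principle rather than the actual CID computation:

```python
# Content addressing in miniature: if you kept the original bytes, you
# can always prove a backup is intact before restoring it. (Plain
# SHA-256 here; real IPFS CIDs also involve chunking and multihash.)
import hashlib

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(1 << 20), b""):
            h.update(block)
    return h.hexdigest()

recorded_digest = "..."  # hypothetical digest noted at backup time
if sha256_of("backup/art.png") == recorded_digest:
    print("backup intact: safe to restore to IPFS")
else:
    print("backup corrupted or altered")
```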

The idea, he said, is that collectors should not have to trust any company; rather they can use ClubNFT’s tool, whenever it becomes available, to download the files locally… “One of the things that we’re doing early around that discovery process, we’re building out a tool that looks in your wallet and can see who you collect, and then go a level deeper and see who they collect,” Bailey said. Bailey said the rest of the tools will follow once the company has gathered lessons from user feedback on the first set of solutions. He seemed positive, however, that discussion of the next set of tools will begin in the spring of next year, as the company has laid out a “general roadmap.”

Read more of this story at Slashdot.

‘Gas Station in Space’ – A New Proposal to Convert Space Junk Into Rocket Fuel

“An Australian company is part of an international effort to recycle dangerous space junk into rocket fuel — in space,” reports the Guardian.

Slashdot reader votsalo shared their report (which also looks at some of the other companies working on the problem of space debris).
South Australian company Neumann Space has developed an “in-space electric propulsion system” that can be used in low Earth orbit to extend the missions of spacecraft, move satellites, or de-orbit them. Now Neumann is working on a plan with three other companies to turn space junk into fuel for that propulsion system… Another U.S. company, Cislunar, is developing a space foundry to melt debris into metal rods. And Neumann Space’s propulsion system can use those metal rods as fuel — their system ionises the metal, which then creates thrust to move objects around in orbit.

Chief executive officer Herve Astier said when Neumann was approached to be part of a supply chain to melt metal in space, he thought it was a futuristic plan, and would not be “as easy as it looks”.

“But they got a grant from NASA so we built a prototype and it works,” he said…

Astier says it is still futuristic, but now he can see that it’s possible. “A lot of people are putting money into debris. Often it’s to take it down into the atmosphere and burn it up. But if it’s there and you can capture it and reuse it, it makes sense from a business perspective, because you’re not shipping it up there,” he said.

“It’s like developing a gas station in space.”

Read more of this story at Slashdot.

First Electric Autonomous Cargo Ship Launched In Norway

Zero emissions and, soon, zero crew: the world’s first fully electric autonomous cargo vessel was unveiled in Norway, a small but promising step toward reducing the maritime industry’s climate footprint. TechXplore reports: By shipping up to 120 containers of fertilizer from a plant in the southeastern town of Porsgrunn to the Brevik port a dozen kilometres (about eight miles) away, the much-delayed Yara Birkeland, shown off to the media on Friday, will eliminate the need for around 40,000 truck journeys a year that are now fueled by polluting diesel. The 80-meter, 3,200-deadweight-tonne ship will soon begin two years of working trials during which it will be fine-tuned to learn to maneuver on its own.

The wheelhouse could disappear altogether in “three, four or five years”, said Yara chief executive Svein Tore Holsether, once the vessel makes its 7.5-nautical-mile trips on its own with the aid of sensors. “Quite a lot of the incidents happening on vessels are due to human error, because of fatigue for instance,” project manager Jostein Braaten said from the possibly doomed bridge. “Autonomous operating can enable a safe journey,” he said.

On board the Yara Birkeland, the traditional machine room has been replaced by eight battery compartments, giving the vessel a capacity of 6.8 MWh — sourced from renewable hydroelectricity. “That’s the equivalent of 100 Teslas,” says Braaten. The maritime sector, which is responsible for almost three percent of all man-made emissions, aims to reduce its emissions by 40 percent by 2030 and 50 percent by 2050. Despite those targets, the sector’s emissions have risen in recent years.
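Braaten’s comparison is easy to sanity-check, assuming a pack of roughly 68 kWh per car (about a Model 3; the article names no model):

```python
# Sanity check on the "100 Teslas" line; the 68 kWh per-car pack size
# is our assumption, as the article doesn't name a model.
ship_capacity_kwh = 6.8 * 1000  # 6.8 MWh
pack_kwh = 68                   # assumed per-car battery capacity

print(ship_capacity_kwh / pack_kwh)  # -> 100.0
```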

Read more of this story at Slashdot.

Alphabet Puts Prototype Robots To Work Cleaning Up Google’s Offices

The company announced today that its Everyday Robots Project — a team within its experimental X labs dedicated to creating “a general-purpose learning robot” — has moved some of its prototype machines out of the lab and into Google’s Bay Area campuses to carry out some light custodial tasks. The Verge reports: “We are now operating a fleet of more than 100 robot prototypes that are autonomously performing a range of useful tasks around our offices,” said Everyday Robots’ chief robot officer Hans Peter Brondmo in a blog post. “The same robot that sorts trash can now be equipped with a squeegee to wipe tables and use the same gripper that grasps cups can learn to open doors.”

The robots in question are essentially arms on wheels, with a multipurpose gripper on the end of a flexible arm attached to a central tower. There’s a “head” on top of the tower with cameras and sensors for machine vision, and what looks like a spinning lidar unit on the side, presumably for navigation. As Brondmo indicates, these bots were first seen sorting out recycling when Alphabet debuted the Everyday Robot team in 2019. The big promise being made by the company (as well as by many other startups and rivals) is that machine learning will finally enable robots to operate in “unstructured” environments like homes and offices.

Read more of this story at Slashdot.

Thousands of Firefox Users Accidentally Commit Login Cookies On GitHub

Thousands of Firefox cookie databases containing sensitive data are available on request from GitHub repositories, data potentially usable for hijacking authenticated sessions. The Register reports: These cookies.sqlite databases normally reside in the Firefox profiles folder. They’re used to store cookies between browsing sessions. And they’re findable by searching GitHub with specific query parameters, what’s known as a search “dork.” Aidan Marlin, a security engineer at London-based rail travel service Trainline, alerted The Register to the public availability of these files after reporting his findings through HackerOne and being told by a GitHub representative that “credentials exposed by our users are not in scope for our Bug Bounty program.”
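To see why the files matter: Firefox keeps cookies in a moz_cookies table inside cookies.sqlite, so anyone who grabs the file can read session tokens with a few lines of Python (a sketch; the path assumes a local copy of the database):

```python
# What a leaked cookies.sqlite gives away: Firefox stores cookies in a
# `moz_cookies` table readable with Python's standard sqlite3 module.
# The filename assumes the database has been copied locally.
import sqlite3

db = sqlite3.connect("cookies.sqlite")
rows = db.execute("SELECT host, name, value, expiry FROM moz_cookies")
for host, name, value, expiry in rows:
    # any unexpired session token here can be replayed to hijack
    # the owner's logged-in sessions
    print(host, name, value[:16] + "...", expiry)
db.close()
```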

Marlin then asked whether he could make his findings public and was told he’s free to do so. “I’m frustrated that GitHub isn’t taking its users’ security and privacy seriously,” Marlin told The Register in an email. “The least it could do is prevent results coming up for this GitHub dork. If the individuals who uploaded these cookie databases were made aware of what they’d done, they’d s*** their pants.”

Marlin acknowledges that affected GitHub users deserve some blame for failing to prevent their cookies.sqlite databases from being included when they committed code and pushed it to their public repositories. “But there are nearly 4.5k hits for this dork, so I think GitHub has a duty of care as well,” he said, adding that he’s alerted the UK Information Commissioner’s Office because personal information is at stake. Marlin speculates that the oversight is a consequence of committing code from one’s Linux home directory. “I imagine in most of the cases, the individuals aren’t aware that they’ve uploaded their cookie databases,” he explained. “A common reason users do this is for a common environment across multiple machines.”
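One cheap safeguard for anyone syncing a home directory this way is a pre-commit hook along these lines (our sketch, not a built-in git or GitHub feature):

```python
#!/usr/bin/env python3
# Minimal pre-commit guard (illustrative): refuse to commit if a
# Firefox cookie database is among the staged files.
import subprocess
import sys

staged = subprocess.run(
    ["git", "diff", "--cached", "--name-only"],
    capture_output=True, text=True, check=True,
).stdout.splitlines()

leaks = [path for path in staged if path.endswith("cookies.sqlite")]
if leaks:
    print("Refusing to commit Firefox cookie databases:")
    for path in leaks:
        print("  " + path)
    sys.exit(1)  # non-zero exit makes git abort the commit
```

Saved as .git/hooks/pre-commit and made executable, it blocks the commit before the database ever reaches a public repository.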

Read more of this story at Slashdot.