Stephen Colbert To Produce TV Series Based On Roger Zelazny’s Sci-Fi Novels ‘The Chronicles of Amber’

Stephen Colbert is joining the team that is adapting Roger Zelazny’s “The Chronicles of Amber” for television. Variety reports: Colbert will now executive produce the potential series under his Spartina production banner. Spartina joins Skybound Entertainment and Vincent Newman Entertainment (VNE) on the series version of the beloved fantasy novels, with Skybound first announcing their intention to develop the series back in 2016. The books have been cited as an influence on “Game of Thrones,” with author George R.R. Martin recently stating he wanted to see the books brought to the screen.

“The Chronicles of Amber” follows the story of Corwin, who is said to “awaken on Earth with no memory, but soon finds he is a prince of a royal family that has the ability to travel through different dimensions of reality (called ‘shadows’) and rules over the one true world, Amber.” The story is told over ten books with two story arcs: “The Corwin Cycle” and “The Merlin Cycle.” The series has sold more than fifteen million copies globally. The search is currently on for a writer to tackle the adaptation. No network or streamer is currently attached. Colbert and Spartina are currently under a first-look deal at CBS Studios, but they are not currently the studio behind the series. “George R.R. Martin and I have similar dreams,” Colbert said. “I’ve carried the story of Corwin in my head for over 40 years, and I’m thrilled to partner with Skybound and Vincent Newman to bring these worlds to life. All roads lead to Amber, and I’m happy to be walking them.”

Read more of this story at Slashdot.

Adobe Says It Isn’t Using Your Photos To Train AI Image Generators

In early January, Adobe came under fire for language used in its terms and conditions that seemed to indicate that it could use photographers’ photos to train generative artificial intelligence systems. The company has reiterated that this is not the case. PetaPixel reports: The language of its “Content analysis” section in its Privacy and Personal Data settings says that by default, users give Adobe permission to “analyze content using techniques such as machine learning (e.g., for pattern recognition) to develop and improve our products and services.” That sounded a lot like artificial intelligence-based (AI) image generators. One of the sticking points of this particular section is that Adobe makes it an opt-out, not an opt-in, so many photographers likely had no idea they were already agreeing to it. “Machine learning-enabled features can help you become more efficient and creative,” Adobe explains. “For example, we may use machine learning-enabled features to help you organize and edit your images more quickly and accurately. With object recognition in Lightroom, we can auto-tag photos of your dog or cat.”

When pressed for comment in PetaPixel’s original coverage on January 5, Adobe didn’t immediately respond, leaving many to assume the worst. However, a day later, the company did provide some clarity on the issue to PetaPixel that some photographers may have missed. “We give customers full control of their privacy preferences and settings. The policy in discussion is not new and has been in place for a decade to help us enhance our products for customers. For anyone who prefers their content be excluded from the analysis, we offer that option here,” a spokesperson from Adobe’s public affairs office told PetaPixel. “When it comes to Generative AI, Adobe does not use any data stored on customers’ Creative Cloud accounts to train its experimental Generative AI features. We are currently reviewing our policy to better define Generative AI use cases.” In an interview with Bloomberg, Adobe Chief Product Officer Scott Belsky said: “We are rolling out a new evolution of this policy that is more specific. If we ever allow people to opt-in for generative AI specifically, we need to call it out and explain how we’re using it.”

“We have to be very explicit about these things.”

Read more of this story at Slashdot.

Three Arrows Capital Co-Founders Pitch To Raise $25 Million For New ‘GTX’ Exchange

Su Zhu and Kyle Davies, the founders of collapsed crypto hedge fund Three Arrows Capital (3AC), are hoping to raise $25 million to start a new crypto exchange called GTX, according to two separate pitch decks obtained by The Block. Three Arrows Capital was one of the largest hedge funds in crypto until last year’s collapse of the Terra ecosystem left it facing significant losses. The financial advisory firm Teneo has been handling the liquidation of 3AC’s assets and the hedge fund has filed for Chapter 15 bankruptcy in New York. From the report: News of the fundraise comes two months after exchange giant FTX imploded, leaving more than a million creditors out of pocket. The new exchange takes advantage of the situation, offering depositors the ability to transfer their FTX claims to GTX and receive immediate credit in a token called USDG, the pitch deck said. The exchange’s name is even a spin on “FTX,” with one of the GTX pitch decks opening with the line “because G comes after F.”

The Three Arrows pair are partnering with Mark Lamb and Sudhu Arumugam, who founded CoinFlex, a crypto exchange that is in the process of restructuring. The exchange’s executive team is also made up of several CoinFlex executives, including the firm’s general counsel and chief technology officer, per one of the decks. GTX will leverage CoinFlex’s technology to build the exchange, and a legal team will be responsible for overseeing the onboarding of claims for all the recent crypto bankruptcies such as Celsius and Voyager, according to the decks. The exchange is looking to launch as soon as possible — potentially as early as February — and is estimating that the claims market is worth around $20 billion, according to the decks.

Read more of this story at Slashdot.

High-Powered Lasers Can Be Used To Steer Lightning Strikes

fahrbot-bot shares a report from Engadget: Lightning rods have been used to safely guide strikes into the ground since Benjamin Franklin’s day, but their short range (roughly the same radius as the height) and fixed-in-place design make them ineffective for protecting large areas. The technology may finally be here to replace them in some situations. European researchers have successfully tested a system that uses terawatt-level laser pulses to steer lightning toward a 26-foot rod. It’s not limited by its physical height, and can cover much wider areas — in this case, 590 feet — while penetrating clouds and fog.
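
For a rough sense of the scale difference those figures imply, here is a purely illustrative back-of-the-envelope comparison, assuming both the 26-foot rod height and the 590-foot laser figure describe comparable protection spans (the report does not spell out exactly how the 590-foot coverage is measured):

```python
import math

def circle_area_m2(span_feet: float) -> float:
    """Area of a circular zone whose radius equals the given span (feet in, m^2 out)."""
    radius_m = span_feet * 0.3048
    return math.pi * radius_m ** 2

rod_span_ft = 26     # conventional rod: protection radius roughly equals its height
laser_span_ft = 590  # coverage figure quoted for the laser-guided system

rod_area = circle_area_m2(rod_span_ft)
laser_area = circle_area_m2(laser_span_ft)

print(f"Rod:   ~{rod_area:,.0f} m^2")
print(f"Laser: ~{laser_area:,.0f} m^2 (~{laser_area / rod_area:.0f}x the area)")
# The ratio works out to about (590/26)^2, roughly 500x, whatever the units.
```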

[“The experiment was performed on Santis Mountain, in northeast Switzerland,” adds The Washington Post. “A 407-foot (124-meter) communications tower there, equipped with a lightning rod, is struck roughly a hundred times a year.”] The design ionizes nitrogen and oxygen molecules, releasing electrons and creating a plasma that conducts electricity. As the laser fires at a very quick 1,000 pulses per second, it’s considerably more likely to intercept lightning as it forms. In the test, conducted between June and September 2021, lightning followed the beam for nearly 197 feet before hitting the rod. The findings have been published in the journal Nature Photonics. A video of the work has also been published on YouTube.

Read more of this story at Slashdot.

IBM Shifts Remaining US-Based AIX Dev Jobs To India

According to The Register, IBM has shifted the roles of US IBM Systems employees developing AIX over to its Indian office. From the report: Prior to this transition, said to have taken place in the third quarter of 2022, AIX development was split more or less evenly between the US and India, an IBM source told The Register. With the arrival of 2023, the entire group had been moved to India. Roughly 80 US-based AIX developers were affected, our source estimates. We’re told they were “redeployed,” and given an indeterminate amount of time to find a new position internally, in keeping with practices we reported last week based on claims by other IBM employees.

Evidently, the majority of those redeployed found jobs elsewhere at IBM. A smaller number are stuck in “redeployment limbo,” with no IBM job identified and no apparent prospects at the company. “It also appears that these people in ‘redeployment’ limbo within IBM are all older, retirement eligible employees,” our source said. “The general sense among my peers is that redeployment is being used to nudge older employees out of the company and to do so in a manner that avoids the type of scrutiny that comes with layoffs.”

Layoffs generally come with a severance payment and may have reporting requirements. Redeployments — directing workers to find another internal position, which may require relocating — can avoid that cost and bureaucracy. They also have the potential to encourage workers to depart on their own. We’re told that IBM does not disclose redeployment numbers to its employees and does not report how internal jobs were obtained (through internal search or with the assistance of management) or what became of those who did not obtain one (employees left in limbo or who chose to leave rather than wait).

Read more of this story at Slashdot.

CircleCI Says Hackers Stole Encryption Keys and Customers’ Secrets

Last month, CircleCI urged users to rotate their secrets following a breach of the company’s systems. The company confirmed in a blog post on Friday that some customers’ data was stolen in the breach. While the customer data was encrypted, cybercriminals obtained the encryption keys able to decrypt the data. TechCrunch reports: The company said in a detailed blog post on Friday that it identified the intruder’s initial point of access as an employee’s laptop that was compromised with malware, allowing the theft of session tokens used to keep the employee logged in to certain applications, even though their access was protected with two-factor authentication. The company took the blame for the compromise, calling it a “systems failure,” adding that its antivirus software failed to detect the token-stealing malware on the employee’s laptop. Session tokens allow a user to stay logged in without having to keep re-entering their password or re-authorizing using two-factor authentication each time. But a stolen session token allows an intruder to gain the same access as the account holder without needing their password or two-factor code. As such, it can be difficult to differentiate between a session token used by the account owner and one used by a hacker who stole it.
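
To make the session-token point concrete, here is a minimal illustrative sketch of server-side session validation; the store, names, and lifetime below are hypothetical and are not CircleCI’s actual implementation. The point is that validation only checks the token itself, so a copied token presented by an attacker is indistinguishable from one presented by the legitimate employee.

```python
import secrets
import time

# Hypothetical in-memory session store; names and TTL are illustrative only.
SESSIONS: dict[str, dict] = {}
SESSION_TTL = 14 * 24 * 3600  # e.g. a two-week session lifetime

def create_session(user_id: str) -> str:
    """Issue a session token after the user has passed password and 2FA checks."""
    token = secrets.token_urlsafe(32)
    SESSIONS[token] = {"user_id": user_id, "issued_at": time.time()}
    return token

def validate_session(token: str) -> str | None:
    """Return the user ID for a valid token, or None if it is unknown or expired.

    Nothing here can tell whether the token is being presented by the original
    user or by someone who copied it from a compromised laptop.
    """
    session = SESSIONS.get(token)
    if session is None or time.time() - session["issued_at"] > SESSION_TTL:
        SESSIONS.pop(token, None)
        return None
    return session["user_id"]
```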

CircleCI said the theft of the session token allowed the cybercriminals to impersonate the employee and gain access to some of the company’s production systems, which store customer data. “Because the targeted employee had privileges to generate production access tokens as part of the employee’s regular duties, the unauthorized third party was able to access and exfiltrate data from a subset of databases and stores, including customer environment variables, tokens, and keys,” said Rob Zuber, the company’s chief technology officer. Zuber said the intruders had access from December 16 through January 4.

Zuber said that while customer data was encrypted, the cybercriminals also obtained the encryption keys able to decrypt customer data. “We encourage customers who have yet to take action to do so in order to prevent unauthorized access to third-party systems and stores,” Zuber added. Several customers have already informed CircleCI of unauthorized access to their systems, Zuber said. He added that CircleCI employees who retain access to production systems “have added additional step-up authentication steps and controls,” which should prevent a repeat incident, likely by way of using hardware security keys.

Read more of this story at Slashdot.

Report: ‘Matter’ Standard Has ‘Undeniable Momentum’

The Verge reports “undeniable momentum” for Matter, the royalty-free interoperability standard that “allows smart home devices from any manufacturer to talk to other devices directly and locally with no need to use the cloud.”

“Matter was the buzzword throughout CES 2023 this year, with most companies even remotely connected to the smart home loudly discussing their Matter plans.”

The new smart home standard was featured in several keynotes and displayed prominently in smart home device makers’ booths as well as in Google, Amazon, and Samsung’s big, showy displays. More importantly, dozens of companies and manufacturers announced specific plans. Several companies said they would update entire product lines, while others announced new ones, sometimes with actual dates and prices. And Matter controllers have become a major thing, with at least four brand-new ones debuting at CES. Interestingly, nearly all of them have a dual or triple function, helping banish the specter of seemingly pointless white hubs stuck in your router closet….

Matter works over the protocols Thread, Wi-Fi, and ethernet and has been jointly developed by Apple, Google, Samsung, Amazon, and pretty much every other smart home brand you can name, big or small. If a device supports Matter, it will work locally with Amazon Alexa, Samsung SmartThings, Apple Home, Google Home, and any other smart home platform that supports Matter. It will also be controllable by any of the four voice assistants….
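
As a small illustration of that local, cloud-free model, the sketch below browses the local network for Matter devices advertised over mDNS/DNS-SD. It assumes the third-party zeroconf package is installed and uses the _matter._tcp service name that the Matter spec assigns to operational nodes; it is a discovery sketch only, not a full Matter controller.

```python
# Requires the third-party "zeroconf" package: pip install zeroconf
import time

from zeroconf import ServiceBrowser, ServiceListener, Zeroconf

MATTER_SERVICE = "_matter._tcp.local."  # operational Matter nodes per the spec


class MatterListener(ServiceListener):
    """Print Matter nodes as they appear on (or leave) the local network."""

    def add_service(self, zc: Zeroconf, type_: str, name: str) -> None:
        info = zc.get_service_info(type_, name)
        if info:
            print(f"Found Matter node: {name} at {info.server}:{info.port}")

    def remove_service(self, zc: Zeroconf, type_: str, name: str) -> None:
        print(f"Matter node gone: {name}")

    def update_service(self, zc: Zeroconf, type_: str, name: str) -> None:
        pass  # updates not needed for this sketch


if __name__ == "__main__":
    zc = Zeroconf()
    ServiceBrowser(zc, MATTER_SERVICE, MatterListener())
    try:
        time.sleep(30)  # browse for 30 seconds, then clean up
    finally:
        zc.close()
```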

The big four have turned on Matter support on their platforms, but Amazon’s approach has been piecemeal, and aside from Apple, nobody supports onboarding devices to Matter on iOS yet.

However, that is shifting: at CES, Amazon announced a full rollout by spring, and Samsung’s Jaeyeon Jung told The Verge that Matter support is coming to its iOS app this month. There’s still no news on Matter support in Google Home’s iOS app. Then there’s the whole competing Thread network issue, although that sounds like it will be resolved sooner rather than later….

The Matter device drought should be over soon — although, judging by most of these ship dates, not until at least the second half of 2023.

“It’s also likely we’ll see dedicated bridges coming out that can bring Z-Wave and other products with proprietary protocols into Matter….”

Read more of this story at Slashdot.

Will This Next-Generation Display Technology Change the World?

“I saw the future at CES 2023,” writes Geoffrey Morrison, describing “a new, top-secret prototype display technology” that could one day replace LCD and OLED for phones and TVs. “It was impossibly flat, like a vibrantly glowing piece of paper.”

Meet electroluminescent quantum dots:

Until now, quantum dots were always a supporting player in another technology’s game. A futuristic booster for older tech, elevating that tech’s performance. QDs weren’t a character on their own. That is no longer the case. The prototype I saw was completely different. No traditional LEDs and no OLED. Instead of using light to excite quantum dots into emitting light, it uses electricity. Nothing but quantum dots. Electroluminescent, aka direct-view, quantum dots. This is huge.

Or at least, has the potential to be huge. Theoretically, this will mean thinner, more energy-efficient displays. It means displays that can be easier, as in cheaper, to manufacture. That could mean even less expensive, more efficient, bigger-screen TVs. The potential in picture quality is at least as good as QD-OLED, if not better. The tech is scalable from tiny, lightweight, high-brightness displays for next-generation VR headsets, to highly efficient phone screens, to high-performance flat-screen TVs.

The article predicts the simpler structure means “Essentially, you can print an entire QD display onto a surface without the heat required by other ‘printable’ tech…. Just about any flat or curved surface could be a screen.” This leads to QD screens not just on TVs and phones, but on car windshields, eyeglass lenses, and even bus or subway windows. (“These will initially be pitched by cities as a way to show people important info, but inevitably they’ll be used for advertising. That’s certainly not a knock against the tech, just how things work in the world….”)

Nanosys is calling this direct-view, electroluminescent quantum dot tech “nanoLED,” and told CNET that “their as-yet-unnamed manufacturing partner is going to be talking more about the technology in a few months…

“Even Nanosys admits direct-view quantum dot displays are still several years away from mass production…. But 5-10 years from now we’ll almost certainly have options for QD [quantum dot] displays in our phones, probably in our living rooms, and possibly on our windshields and windows.”

Read more of this story at Slashdot.