Arlo’s Security Cameras Will Keep Free Cloud Storage For Existing Customers After All

Security camera company Arlo is reversing course on its controversial decision to apply a retroactive end-of-life policy to many of its popular home security cameras. The Verge reports: On Friday, Arlo CEO Matthew McRae posted a thread on Twitter, announcing that the company will not remove free storage of videos for existing customers and that it is extending the EOL dates for older cameras a further year to 2025. He also committed to sending security updates to these cameras until 2026. The end-of-life policy was due to go into effect January 1st, 2023, and would have removed a big selling point — seven-day free cloud storage — for many Arlo cams. McRae now says all users with the seven-day storage service will “continue to receive that service uninterrupted.” But he did note that “any future migrations will be handled in a seamless manner,” indicating that changes are still coming.

The thread did not provide details on specific models other than using the Arlo Pro 2 as an example of a camera that will now reach end of life in 2025 instead of 2024, as previously announced, with security updates continuing until 2026. There was also no update on the plans to remove other features, such as email notifications and E911 emergency calling, or whether “legacy video storage” will remain. The EOL policy applied to the following devices: Arlo Gen 3, Arlo Pro, Arlo Baby, Arlo Pro 2, Arlo Q, Arlo Q Plus, Arlo Lights, and Arlo Audio Doorbell.

Read more of this story at Slashdot.

LastPass: Hackers Stole Customer Vault Data In Cloud Storage Breach

LastPass revealed today that attackers stole customer vault data after breaching its cloud storage earlier this year using information stolen during an August 2022 incident. BleepingComputer reports: This follows a previous update issued last month, when the company’s CEO, Karim Toubba, said only that the threat actor had gained access to “certain elements” of customer information. Today, Toubba added that the cloud storage service is used by LastPass to store archived backups of production data. The attacker gained access to LastPass’ cloud storage using a “cloud storage access key and dual storage container decryption keys” stolen from its developer environment.

“The threat actor copied information from backup that contained basic customer account information and related metadata including company names, end-user names, billing addresses, email addresses, telephone numbers, and the IP addresses from which customers were accessing the LastPass service,” Toubba said today. “The threat actor was also able to copy a backup of customer vault data from the encrypted storage container which is stored in a proprietary binary format that contains both unencrypted data, such as website URLs, as well as fully-encrypted sensitive fields such as website usernames and passwords, secure notes, and form-filled data.”

Fortunately, the encrypted data is secured with 256-bit AES encryption and can only be decrypted with a unique encryption key derived from each user’s master password. According to Toubba, the master password is never known to LastPass; it is not stored or maintained on LastPass’ systems. Customers were also warned that the attackers might try to brute-force their master passwords to gain access to the stolen encrypted vault data. However, this would be very difficult and time-consuming if you’ve been following the password best practices recommended by LastPass. If you have, “it would take millions of years to guess your master password using generally-available password-cracking technology,” Toubba added. “Your sensitive vault data, such as usernames and passwords, secure notes, attachments, and form-fill fields, remain safely encrypted based on LastPass’ Zero Knowledge architecture.”
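The story doesn’t spell out the derivation scheme, but LastPass has publicly documented deriving the vault key from the master password with PBKDF2-SHA256 (100,100 iterations by default). As a rough, non-authoritative sketch of why offline guessing is so expensive, here is a minimal Python illustration; the email-as-salt choice and the iteration count are assumptions for the example, not details confirmed in the article.

import hashlib

def derive_vault_key(master_password: str, email: str, iterations: int = 100_100) -> bytes:
    # PBKDF2-SHA256: every password guess has to repeat this many HMAC rounds,
    # which is what makes brute-forcing a strong master password impractical.
    return hashlib.pbkdf2_hmac(
        "sha256",
        master_password.encode("utf-8"),
        email.strip().lower().encode("utf-8"),  # assumed salt: the account email
        iterations,
        dklen=32,  # 256-bit key, matching the AES-256 encryption described above
    )

key = derive_vault_key("correct horse battery staple", "user@example.com")
print(key.hex())

A weak or reused master password shortcuts all of this, which is why the advice about following LastPass’ password guidance matters more than the cipher itself.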

Read more of this story at Slashdot.

Is Amazon’s AWS Quietly Getting Better at Contributing to Open Source?

“If I want AWS to ignore me completely all I have to do is open a pull request against one of their repositories,” quipped cloud economist Corey Quinn in April, while also complaining that the real problem is “how they consistently and in my opinion incorrectly try to shape a narrative where they’re contributing to the open source ecosystem at a level that’s on par with their big tech company peers.”

But on Friday tech columnist Matt Asay argued that AWS is quietly getting better at open source. “Agreed,” tweeted tech journalist Steven J. Vaughan-Nichols in response, commending “Good open source people, good open-source work.” (And Vaughan-Nichols later retweeted an AWS principal software engineer’s announcement that “Over at Amazon Linux we are hiring, and also trying to lead and better serve customers by being more involved in upstream communities.”) Mark Atwood, principal engineer for open source at Amazon, also joined Asay’s thread, tweeting “I’m glad that people are noticing. Me and my team have been doing heavy work for years to get to this point. Generally we don’t want to sit at the head of the table, but we are seeing the value of sitting at the table.”

Asay himself was AWS’s head of developer marketing/Open Source strategy for two years, leaving in August of 2021. But on Friday, Asay’s article noted a recent tweet where AWS engineer Divij Vaidya announced he’d suddenly become one of the top 10 contributors to Apache Kafka after three months as the founding engineer for AWS’s Apache Kafka open source team. (Vaidya added “We are hiring for a globally distributed fully remote team to work on open source Apache Kafka! Join us.”)
Asay writes:
Apache Kafka is just the latest example of this…. This is exactly what critics have been saying AWS doesn’t do. And, for years, they were mostly correct.

AWS was, and is, far more concerned with taking care of customers than being popular with open-source audiences. So, the company has focused on being “the best place for customers to build and run open-source software in the cloud.” Historically, that tended to not involve or require contributing to the open-source projects it kept building managed services around. Many felt that was a mistake — that a company so dependent on open source for its business was putting its supply chain at risk by not sustaining the projects upon which it depended…

PostgreSQL contributor (and sometime AWS open-source critic) Paul Ramsey has noticed. As he told me recently, it “[f]eels like a switch flipped at AWS a year or two ago. The strategic value of being a real stakeholder in the software they spin is now recognized as being worth the dollars spent to make it happen….” What seems to be happening at AWS, if quietly and usually behind the scenes, is a shift toward AWS service teams taking greater ownership in the open-source projects they operationalize for customers. This allows them to more effectively deliver results because they can help shape the roadmap for customers, and it ensures AWS customers get the full open-source experience, rather than a forked repo with patches that pile up as technical debt.

Vaidya and the Managed Service for Kafka team are one example, along with Madelyn Olson, an engineer with AWS’s ElastiCache team and one of five core maintainers for Redis. And then there are the AWS employees contributing to Kubernetes, etcd and more. No, AWS is still not the primary contributor to most of these. Not yet. Google, Microsoft and Red Hat tend to top many of the charts, to Quinn’s point above. This also isn’t somehow morally wrong, as Quinn also argued: “Amazon (and any company) is there to make money, not be your friend.”
But slowly and surely, AWS product teams are discovering that a key element of obsessing over customers is taking care of the open-source projects upon which those customers depend. In other words, part of the “undifferentiated heavy lifting” that AWS takes on for customers needs to be stewardship for the open-source projects those same customers demand.

UPDATE: Reached for a comment today, Asay clarified his position on Quinn’s original complaints about AWS’s low level of open source contributions. “What I was trying to say was that while Corey’s point had been more-or-less true, it wasn’t really true anymore.”

Read more of this story at Slashdot.

‘The Phone is Terrible For Cloud Gaming’

An anonymous reader shares a column: The promise of cloud gaming is that you can do it from anywhere using any device with internet access and a good enough browser (each cloud gaming service seems to have its own requirements on the browser front). You should be able to play super demanding games whether you’re on a work trip with nothing but a work laptop, or at home while the main TV is being hogged — or even if you just don’t feel like sitting on the couch. But the biggest promise of cloud gaming is that, no matter where you are, if you’ve got a phone then you’ve got all your games.

In practice, this is a bad idea. After spending the last few weeks rapturously using my Steam Deck near daily to play games in the cloud, I am never going to willingly attempt cloud gaming on my phone again. Valve’s enormous do-anything handheld PC has made me realize that, actually, sometimes dedicated gaming hardware is good! The Swiss Army knife approach to mobile gaming promised by cloud gaming on your phone is about as useful as the saw on a real Swiss Army knife. I appreciate the effort, but I don’t actually want to use it.

I’ve tried to make cloud gaming work on my phone a lot. I’ve attempted Red Dead Redemption 2 and Star Wars Jedi: Fallen Order and Halo and Gears of War and plenty of other games. Each time, I’m hit with wonder because, holy shit, these are demanding AAA games that usually require tons of expensive (and noisy) hardware, playing on my phone. That feels like the delivery on a promise tech companies made me decades ago. But the wonder wears off when you cloud game on your phone for an extended period of time. Cloud gaming drains the phone’s battery quickly, which means you can and will feel battery anxiety.

Read more of this story at Slashdot.

Apple Will Now Allow Developers To Transfer Ownership of Apps That Use iCloud

“The most impactful change to come out of WWDC had nothing to do with APIs, a new framework or any hardware announcement,” writes Jordan Morgan via Daring Fireball. “Instead, it was a change I’ve been clamoring for the last several years — and it’s one that’s incredibly indie friendly. As you’ve no doubt heard by now, I’m of course talking about iCloud enabled apps now allowing app transfers.” 9to5Mac explains how it works: According to Apple, you could already transfer an app when you sold it to another developer or wanted to move it to another App Store Connect account or organization. You can also transfer the ownership of an app to another developer without removing it from the App Store. The company said: “The app retains its reviews and ratings during and after the transfer, and users continue to have access to future updates. Additionally, when an app is transferred, it maintains its Bundle ID — it’s not possible to update the Bundle ID after a build has been uploaded for the app.”

The news here is that it’s now easier for developers to transfer the ownership of apps that use iCloud. Apple said that if your app uses any of the following, they will be transferred to the recipient after they accept the app transfer: iCloud to store user data; iCloud containers; and KVS identifiers associated with the app.

The company said: “If multiple apps on your account share a CloudKit container, the transfer of one app will disable the other apps’ ability to read or store data using the transferred CloudKit container. Additionally, the transferor will no longer have access to user data for the transferred app via the iCloud dashboard. Any app updates will disable the app’s ability to read or store data using the transferred CloudKit container. If your app uses iCloud Key-Value Storage (KVS), the full KVS value will be embedded in any new provisioning profiles you create for the transferred app. Update your entitlements plist with the full KVS value in your provisioning profile.” You can learn more about the news via this Apple Developer page.
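For context on that last instruction, the key-value store identifier lives in the app’s entitlements under the com.apple.developer.ubiquity-kvstore-identifier key. Below is a minimal, hypothetical sketch of what the fully qualified entry might look like after a transfer; the team ID “ABCDE12345” and bundle ID “com.example.notes” are placeholders, not values from the article.

<!-- Hypothetical entitlements fragment; team ID and bundle ID are placeholders. -->
<key>com.apple.developer.ubiquity-kvstore-identifier</key>
<string>ABCDE12345.com.example.notes</string>

In Xcode this value is normally expressed as $(TeamIdentifierPrefix)$(CFBundleIdentifier), so embedding the expanded value in the new provisioning profile is presumably what keeps the transferred app pointed at the same KVS data.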

Read more of this story at Slashdot.

Once Frenemies, Elastic and AWS Are Now Besties

Paul Sawers writes via VentureBeat: It has been a frosty few years for Elastic and Amazon’s AWS cloud computing arm, with the duo frequently locking horns over various issues relating to Elastic’s ex-open-source database search engine — Elasticsearch. To cut a War and Peace-esque story short, Amazon had introduced its own managed Elasticsearch service called Amazon Elasticsearch Service way back in 2015, and in the intervening years the “confusion” this (among other shenanigans) caused in the cloud sphere ultimately led Elastic to transition Elasticsearch from open source to “free and open” (i.e., a less permissive license), exerting more control over how the cloud giants of the world could use the product and the Elasticsearch name. In response, Amazon launched an Elasticsearch “fork” called OpenSearch, and the two companies finally settled a long-standing trademark dispute, which effectively meant that Amazon would stop associating the Elasticsearch brand with its own products. This was an important final piece of the kiss-and-make-up puzzle, as it meant that customers searching for Elastic’s fully managed Elasticsearch service (Elastic Cloud) in the AWS Marketplace wouldn’t also stumble upon Amazon’s incarnation and wonder which one they were actually looking for.

Fast-forward to today, and you would hardly know that the two companies were once at loggerheads. Over the past year, Elastic and Amazon have partnered to bring all manner of technologies and integrations to market, and they’ve worked to ensure that their shared customers can more easily onboard to Elastic Cloud within Amazon’s infrastructure. Building on a commitment last month to make AWS and Elastic work even better together, Elastic and AWS today announced an even deeper collaboration, to “build, market and deliver” frictionless access to Elastic Cloud on AWS. In essence, this means that the two companies will go full-throttle on their “go-to-market” sales and marketing strategies — this includes a new free 7-day trial for customers wanting to test-drive Elastic Cloud directly from the AWS Marketplace.

On top of that, AWS has committed to working with Elastic to generate new business across Amazon’s various cloud-focused sales organizations — this is a direct result of Elastic joining the AWS ISV Accelerate program. All of this has been made possible because of the clear and distinct products that now exist — Amazon has OpenSearch, and Elastic has Elasticsearch, which makes collaboration that much easier. What does Amazon get for all of this? “Put simply, companies accessing Elastic’s services on AWS infrastructure drive a lot of cloud consumption — which translates into ka-ching for Amazon,” adds Sawers.

Read more of this story at Slashdot.

Google Unveils Its B2B Cloud Gaming Platform Built With Stadia Tech

An anonymous reader quotes a report from Forbes: Google had plenty of news about Stadia, the consumer-facing aspect of its cloud gaming products, at its Google for Games Developer Summit. On the flip side of that is the white-label platform Google’s been working on: a way for other companies to license the game streaming tech that powers Stadia. Previously, that B2B offering was believed to be known as Google Stream. Google has now confirmed more details about the offering, including its name.

It’s now called Immersive Stream for Games (which doesn’t exactly roll off the tongue as smoothly as Google Stream). The Stadia team built it with the help of the folks at Google Cloud. The company says the service will allow companies to run their own game trials, let users play full games, offer subscription bundles or have full storefronts. In other words, publishers might be able to run their own versions of Stadia with their own libraries of games, branding and custom user interface.

We’ve seen a version of Immersive Stream for Games in action. Last year, Google teamed up with AT&T to offer people a way to play Batman: Arkham Knight for free via the cloud. Thousands of folks took advantage of the offer. AT&T plans to offer its customers access to another game soon with the help of Immersive Stream for Games. While that version of Batman: Arkham Knight was only available on desktop and laptop web browsers, the next game will run on mobile devices too. If all goes well, it could be a decent way for AT&T to show off what its 5G network can do. Immersive Stream for Games will include other features Google revealed for Stadia today, including a way to offer free trials of full games and a project aimed at making it easier to port games so they run on Stadia tech, as well as analytics. Developers and publishers can send Google an inquiry for more details.

Read more of this story at Slashdot.