Is Amazon’s AWS Quietly Getting Better at Contributing to Open Source?

“If I want AWS to ignore me completely all I have to do is open a pull request against one of their repositories,” quipped cloud economist Corey Quinn in April, while also complaining that the real problem is “how they consistently and in my opinion incorrectly try to shape a narrative where they’re contributing to the open source ecosystem at a level that’s on par with their big tech company peers.”

But on Friday tech columnist Matt Asay argued that AWS is quietly getting better at open source. “Agreed,” tweeted tech journalist Steven J. Vaughan-Nichols in response, commending “Good open source people, good open-source work.” (And Vaughan-Nichols later retweeted an AWS principal software engineer’s announcement that “Over at Amazon Linux we are hiring, and also trying to lead and better serve customers by being more involved in upstream communities.”) Mark Atwood, principal engineer for open source at Amazon, also joined Asay’s thread, tweeting “I’m glad that people are noticing. Me and my team have been doing heavy work for years to get to this point. Generally we don’t want to sit at the head of the table, but we are seeing the value of sitting at the table.”

Asay himself was AWS’s head of developer marketing/Open Source strategy for two years, leaving in August of 2021. But Asay’s article Friday noted a recent tweet in which AWS engineer Divij Vaidya announced he’d become one of the top 10 contributors to Apache Kafka after just three months as the founding engineer for AWS’s Apache Kafka open source team. (Vaidya added “We are hiring for a globally distributed fully remote team to work on open source Apache Kafka! Join us.”)
Asay writes:
Apache Kafka is just the latest example of this…. This is exactly what critics have been saying AWS doesn’t do. And, for years, they were mostly correct.

AWS was, and is, far more concerned with taking care of customers than being popular with open-source audiences. So, the company has focused on being “the best place for customers to build and run open-source software in the cloud.” Historically, that tended to not involve or require contributing to the open-source projects it kept building managed services around. Many felt that was a mistake — that a company so dependent on open source for its business was putting its supply chain at risk by not sustaining the projects upon which it depended…

PostgreSQL contributor (and sometime AWS open-source critic) Paul Ramsey has noticed. As he told me recently, it “[f]eels like a switch flipped at AWS a year or two ago. The strategic value of being a real stakeholder in the software they spin is now recognized as being worth the dollars spent to make it happen….” What seems to be happening at AWS, if quietly and usually behind the scenes, is a shift toward AWS service teams taking greater ownership in the open-source projects they operationalize for customers. This allows them to more effectively deliver results because they can help shape the roadmap for customers, and it ensures AWS customers get the full open-source experience, rather than a forked repo with patches that pile up as technical debt.

Vaidya and the Managed Service for Kafka team are one example, along with Madelyn Olson, an engineer on AWS’s ElastiCache team and one of five core maintainers for Redis. And then there are the AWS employees contributing to Kubernetes, etcd and more. No, AWS is still not the primary contributor to most of these. Not yet. Google, Microsoft and Red Hat tend to top many of the charts, to Quinn’s point above. Nor is this somehow morally wrong, as Quinn also argued: “Amazon (and any company) is there to make money, not be your friend.”
But slowly and surely, AWS product teams are discovering that a key element of obsessing over customers is taking care of the open-source projects upon which those customers depend. In other words, part of the “undifferentiated heavy lifting” that AWS takes on for customers needs to be stewardship for the open-source projects those same customers demand.

UPDATE: Reached for a comment today, Asay clarified his position on Quinn’s original complaints about AWS’s low level of open source contributions. “What I was trying to say was that while Corey’s point had been more-or-less true, it wasn’t really true anymore.”

Read more of this story at Slashdot.

‘The Phone is Terrible For Cloud Gaming’

An anonymous reader shares a column: The promise of cloud gaming is that you can do it from anywhere using any device with internet access and a good enough browser (each cloud gaming service seems to have its own requirements on the browser front). You should be able to play super demanding games whether you’re on a work trip with nothing but a work laptop or at home and the main TV is being hogged — or even if you just don’t feel like sitting on the couch. But the biggest promise of cloud gaming is that, no matter where you are, if you’ve got a phone then you’ve got all your games.

In practice, this is a bad idea. After spending the last few weeks rapturously using my Steam Deck near daily to play games in the cloud, I am never going to willingly attempt cloud gaming on my phone again. Valve’s enormous do-anything handheld PC has made me realize that, actually, sometimes dedicated gaming hardware is good! The Swiss Army knife approach to mobile gaming promised by cloud gaming on your phone is about as useful as the saw on a real Swiss Army knife. I appreciate the effort, but I don’t actually want to use it.

I’ve tried to make cloud gaming work on my phone a lot. I’ve attempted Red Dead Redemption 2 and Star Wars Jedi: Fallen Order and Halo and Gears of War and plenty of other games. Each time, I’m hit with wonder because, holy shit, these are demanding AAA games that usually require tons of expensive (and noisy) hardware, playing on my phone. That feels like delivery on a promise tech companies made me decades ago. But the wonder wears off when you cloud game on your phone for an extended period of time. Cloud gaming drains the phone’s battery quickly, which means you can and will feel battery anxiety.


Apple Will Now Allow Developers To Transfer Ownership of Apps That Use iCloud

“The most impactful change to come out of WWDC had nothing to do with APIs, a new framework or any hardware announcement,” writes Jordan Morgan via Daring Fireball. “Instead, it was a change I’ve been clamoring for the last several years — and it’s one that’s incredibly indie friendly. As you’ve no doubt heard by now, I’m of course talking about iCloud enabled apps now allowing app transfers.” 9to5Mac explains how it works: According to Apple, you already could transfer an app when you’ve sold it to another developer or you would want to move it to another App Store Connect account or organization. You can also transfer the ownership of an app to another developer without removing it from the App Store. The company said: “The app retains its reviews and ratings during and after the transfer, and users continue to have access to future updates. Additionally, when an app is transferred, it maintains its Bundle ID — it’s not possible to update the Bundle ID after a build has been uploaded for the app.”

The news here is that it’s easier for developers to transfer the ownership of apps that use iCloud. Apple said that if your app uses any of the following, they will be transferred to the recipient after the recipient accepts the app transfer: iCloud storage of user data, iCloud containers, and the KVS identifiers associated with the app.

The company said: “If multiple apps on your account share a CloudKit container, the transfer of one app will disable the other apps’ ability to read or store data using the transferred CloudKit container. Additionally, the transferor will no longer have access to user data for the transferred app via the iCloud dashboard. Any app updates will disable the app’s ability to read or store data using the transferred CloudKit container. If your app uses iCloud Key-Value Storage (KVS), the full KVS value will be embedded in any new provisioning profiles you create for the transferred app. Update your entitlements plist with the full KVS value in your provisioning profile.” You can learn more about the news via this Apple Developer page.
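For context, the KVS identifier Apple refers to lives in the app’s entitlements file under the `com.apple.developer.ubiquity-kvstore-identifier` key. A minimal sketch of what the updated entry might look like after a transfer, using a hypothetical team ID (`ABCDE12345`) and bundle ID (`com.example.myapp`):

```xml
<!-- Excerpt from the app's .entitlements plist (identifiers are hypothetical) -->
<key>com.apple.developer.ubiquity-kvstore-identifier</key>
<string>ABCDE12345.com.example.myapp</string>
```

Apple’s guidance above implies the transferred app must carry the full, literal KVS value here (including the original team-ID prefix) rather than the usual `$(TeamIdentifierPrefix)$(CFBundleIdentifier)` build-setting shorthand, so the app keeps reading the same key-value store after the transfer.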


Once Frenemies, Elastic and AWS Are Now Besties

Paul Sawers writes via VentureBeat: It has been a frosty few years for Elastic and Amazon’s AWS cloud computing arm, with the duo frequently locking horns over various issues relating to Elastic’s ex-open-source database search engine — Elasticsearch. To cut a War and Peace-esque story short, Amazon had introduced its own managed Elasticsearch service called Amazon Elasticsearch Service way back in 2015, and in the intervening years the “confusion” this (among other shenanigans) caused in the cloud sphere ultimately led Elastic to transition Elasticsearch from open source to “free and open” (i.e., a less permissive license), exerting more control over how the cloud giants of the world could use the product and Elasticsearch name. In response, Amazon launched an Elasticsearch “fork” called OpenSearch, and the two companies finally settled a long-standing trademark dispute, which effectively meant that Amazon would stop associating the Elasticsearch brand with Amazon’s own products. This was an important final piece of the kiss-and-make-up puzzle, as it meant that customers searching for Elastic’s fully managed Elasticsearch service (Elastic Cloud) in the AWS Marketplace wouldn’t also stumble upon Amazon’s incarnation and wonder which one they were actually looking for.

Fast-forward to today, and you would hardly know that the two companies were once at loggerheads. Over the past year, Elastic and Amazon have partnered to bring all manner of technologies and integrations to market, and they’ve worked to ensure that their shared customers can more easily onboard to Elastic Cloud within Amazon’s infrastructure. Building on a commitment last month to make AWS and Elastic work even better together, Elastic and AWS today announced an even deeper collaboration, to “build, market and deliver” frictionless access to Elastic Cloud on AWS. In essence, this means that the two companies will go full-throttle on their “go-to-market” sales and marketing strategies — this includes a new free 7-day trial for customers wanting to test-drive Elastic Cloud directly from the AWS Marketplace.

On top of that, AWS has committed to working with Elastic to generate new business across Amazon’s various cloud-focused sales organizations — this is a direct result of Elastic joining the AWS ISV Accelerate program. All of this has been made possible because of the clear and distinct products that now exist — Amazon has OpenSearch, and Elastic has Elasticsearch, which makes collaboration that much easier. What does Amazon get for all of this? “Put simply, companies accessing Elastic’s services on AWS infrastructure drive a lot of cloud consumption — which translates into ka-ching for Amazon,” adds Sawers.


Google Unveils Its B2B Cloud Gaming Platform Built With Stadia Tech

An anonymous reader quotes a report from Forbes: Google had plenty of news about Stadia, the consumer-facing aspect of its cloud gaming products, at its Google for Games Developer Summit. On the flip side of that is the white-label platform Google’s been working on: a way for other companies to license the game streaming tech that powers Stadia. Previously, that B2B offering was believed to be known as Google Stream. Google has now confirmed more details about the offering, including its name.

It’s now called Immersive Stream for Games (which doesn’t exactly roll off the tongue as smoothly as Google Stream). The Stadia team built it with the help of the folks at Google Cloud. The company says the service will allow companies to run their own game trials, let users play full games, offer subscription bundles or have full storefronts. In other words, publishers might be able to run their own versions of Stadia with their own libraries of games, branding and custom user interface.

We’ve seen a version of Immersive Stream for Games in action. Last year, Google teamed up with AT&T to offer people a way to play Batman: Arkham Knight for free via the cloud. Thousands of folks took advantage of the offer. AT&T plans to offer its customers access to another game soon with the help of Immersive Stream for Games. While that version of Batman: Arkham Knight was only available on desktop and laptop web browsers, the next game will run on mobile devices too. If all goes well, it could be a decent way for AT&T to show off what its 5G network can do. Immersive Stream for Games will include other features Google revealed for Stadia today, including a way to offer free trials of full games and a project aimed at making it easier to port games so they run on Stadia tech, as well as analytics. Developers and publishers can send Google an inquiry for more details.


Inside Google’s Plan To Salvage Its Stadia Gaming Service

Google is trying to salvage its failing Stadia game service with a new focus on striking deals with Peloton, Bungie, and others under the brand “Google Stream.” Business Insider reports: When Google announced last year that it was shutting down its internal gaming studios, it was seen as a blow to the company’s big bet on video games. Google, whose Stadia cloud service was barely more than a year old, said it would instead focus on publishing games from existing developers on the platform and explore other ways to bring Stadia’s technology to partners. Since then, the company has shifted the focus of its Stadia division largely to securing white-label deals with partners that include Peloton, Capcom, and Bungie, according to people familiar with the plans.

Google is trying to salvage the underlying technology, which is capable of broadcasting high-definition games over the cloud with low latency, shopping the technology to partners under a new name: Google Stream. (Stadia was known in development as “Project Stream.”) The Stadia consumer platform, meanwhile, has been deprioritized within Google, insiders said, with a reduced interest in negotiating blockbuster third-party titles. The focus of leadership is now on securing business deals for Stream, people involved in those conversations said. The changes demonstrate a strategic shift in how Google, which has invested heavily in cloud services, sees its gaming ambitions.

Google has continued to prop up the Stadia consumer platform with a steady stream of titles. After Google closed Stadia’s internal game studios, known as Stadia Games & Entertainment, insiders said the directive was to build out what was internally dubbed a “content flywheel” — a steady flow of independent titles and content from existing publishing deals that would be much more affordable than securing AAA blockbusters, two former employees familiar with the conversations said. “The key thing was that they would not be spending the millions on the big titles,” one said. “And exclusives would be out of the question.” Executives and employees for the Stadia product have also shifted roles. Phil Harrison, the former PlayStation executive Google tapped to run its gaming operations, now reports to the company’s head of subscriptions.

Patrick Seybold, a Google spokesperson, told Insider in a statement: “We announced our intentions of helping publishers and partners deliver games directly to gamers last year, and have been working toward that. The first manifestation has been our partnership with AT&T who is offering Batman: Arkham Knight available to their customers for free. While we won’t be commenting on any rumors or speculation regarding other industry partners, we are still focused on bringing great games to Stadia in 2022. With 200+ titles currently available, we expect to have another 100+ games added to the platform this year, and currently have 50 games available to claim in Stadia Pro.”
