Google Introduces Cloud-Based Blockchain Node Service For Ethereum
Read more of this story at Slashdot.
But on Friday tech columnist Matt Asay argued that AWS is quietly getting better at open source. “Agreed,” tweeted tech journalist Steven J. Vaughan-Nichols in response, commending “Good open source people, good open-source work.” (And Vaughan-Nichols later retweeted an AWS principal software engineer’s announcement that “Over at Amazon Linux we are hiring, and also trying to lead and better serve customers by being more involved in upstream communities.”) Mark Atwood, principal engineer for open source at Amazon, also joined Asay’s thread, tweeting “I’m glad that people are noticing. Me and my team have been doing heavy work for years to get to this point. Generally we don’t want to sit at the head of the table, but we are seeing the value of sitting at the table.”
Asay himself was AWS’s head of developer marketing/Open Source strategy for two years, leaving in August of 2021. But on Friday, Asay’s article noted a recent tweet where AWS engineer Divij Vaidya announced he’d suddenly become one of the top 10 contributors to Apache Kafka after three months as the founding engineer for AWS’s Apache Kafka open source team. (Vaidya added “We are hiring for a globally distributed fully remote team to work on open source Apache Kafka! Join us.”)
Asay writes:
Apache Kafka is just the latest example of this…. This is exactly what critics have been saying AWS doesn’t do. And, for years, they were mostly correct.
AWS was, and is, far more concerned with taking care of customers than being popular with open-source audiences. So, the company has focused on being “the best place for customers to build and run open-source software in the cloud.” Historically, that tended to not involve or require contributing to the open-source projects it kept building managed services around. Many felt that was a mistake — that a company so dependent on open source for its business was putting its supply chain at risk by not sustaining the projects upon which it depended…
PostgreSQL contributor (and sometime AWS open-source critic) Paul Ramsey has noticed. As he told me recently, it “[f]eels like a switch flipped at AWS a year or two ago. The strategic value of being a real stakeholder in the software they spin is now recognized as being worth the dollars spent to make it happen….” What seems to be happening at AWS, if quietly and usually behind the scenes, is a shift toward AWS service teams taking greater ownership in the open-source projects they operationalize for customers. This allows them to more effectively deliver results because they can help shape the roadmap for customers, and it ensures AWS customers get the full open-source experience, rather than a forked repo with patches that pile up as technical debt.
Vaidya and the Managed Service for Kafka team are one example, along with Madelyn Olson, an engineer with AWS’s ElastiCache team and one of five core maintainers for Redis. And then there are the AWS employees contributing to Kubernetes, etcd and more. No, AWS is still not the primary contributor to most of these. Not yet. Google, Microsoft and Red Hat tend to top many of the charts, to Quinn’s point above. This also isn’t somehow morally wrong, as Quinn also argued: “Amazon (and any company) is there to make money, not be your friend.”
But slowly and surely, AWS product teams are discovering that a key element of obsessing over customers is taking care of the open-source projects upon which those customers depend. In other words, part of the “undifferentiated heavy lifting” that AWS takes on for customers needs to be stewardship for the open-source projects those same customers demand.
UPDATE: Reached for a comment today, Asay clarified his position on Quinn’s original complaints about AWS’s low level of open source contributions. “What I was trying to say was that while Corey’s point had been more-or-less true, it wasn’t really true anymore.”
In practice, this is a bad idea. After spending the last few weeks rapturously using my Steam Deck near daily to play games in the cloud, I am never going to willingly attempt cloud gaming on my phone again. Valve’s enormous do-anything handheld PC has made me realize that, actually, sometimes dedicated gaming hardware is good! The Swiss Army knife approach to mobile gaming promised by cloud gaming on your phone is about as useful as the saw on a real Swiss Army knife. I appreciate the effort, but I don’t actually want to use it.
I’ve tried to make cloud gaming work on my phone a lot. I’ve attempted Red Dead Redemption 2 and Star Wars Jedi: Fallen Order and Halo and Gears of War and plenty of other games. Each time, I’m hit with wonder because, holy shit, these are demanding AAA games that usually require tons of expensive (and noisy) hardware playing on my phone. That feels like the delivery on a promise tech companies made me decades ago. But the wonder wears off when you cloud game on your phone for an extended period of time. Cloud gaming drains the phone’s battery quickly, which means you can and will feel the battery anxiety.
The news here is that it’s easier for developers to transfer the ownership of apps that use iCloud. Apple said that if your app uses any of the following, it will be transferred to the recipient after they accept the app transfer: iCloud to store user data, iCloud containers, or KVS identifiers associated with the app.
The company said: “If multiple apps on your account share a CloudKit container, the transfer of one app will disable the other apps’ ability to read or store data using the transferred CloudKit container. Additionally, the transferor will no longer have access to user data for the transferred app via the iCloud dashboard. Any app updates will disable the app’s ability to read or store data using the transferred CloudKit container. If your app uses iCloud Key-Value Storage (KVS), the full KVS value will be embedded in any new provisioning profiles you create for the transferred app. Update your entitlements plist with the full KVS value in your provisioning profile.” You can learn more about the news via this Apple Developer page.
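The entitlements change Apple describes can be sketched as a config fragment. The key below is the standard iCloud key-value store entitlement; the team identifier prefix `ABCDE12345` and bundle identifier are hypothetical placeholders, not values from the source. Before a transfer, developers typically use build-setting variables; after the transfer, per Apple’s note, the entitlement would carry the full literal KVS value embedded in the new provisioning profile:

```xml
<!-- Before transfer: variables resolved at build time -->
<key>com.apple.developer.ubiquity-kvstore-identifier</key>
<string>$(TeamIdentifierPrefix)$(CFBundleIdentifier)</string>

<!-- After transfer (hypothetical values): the full KVS value,
     matching the transferred app's new provisioning profile -->
<key>com.apple.developer.ubiquity-kvstore-identifier</key>
<string>ABCDE12345.com.example.transferredapp</string>
```

The point of the literal value is that the KVS identifier keeps the original prefix baked into the provisioning profile, so existing users’ key-value data remains reachable after the ownership change.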
Fast-forward to today, and you would hardly know that the two companies were once at loggerheads. Over the past year, Elastic and Amazon have partnered to bring all manner of technologies and integrations to market, and they’ve worked to ensure that their shared customers can more easily onboard to Elastic Cloud within Amazon’s infrastructure. Building on a commitment last month to make AWS and Elastic work even better together, Elastic and AWS today announced an even deeper collaboration, to “build, market and deliver” frictionless access to Elastic Cloud on AWS. In essence, this means that the two companies will go full-throttle on their “go-to-market” sales and marketing strategies — this includes a new free 7-day trial for customers wanting to test-drive Elastic Cloud directly from the AWS Marketplace.
On top of that, AWS has committed to working with Elastic to generate new business across Amazon’s various cloud-focused sales organizations — this is a direct result of Elastic joining the AWS ISV Accelerate program. All of this has been made possible because of the clear and distinct products that now exist — Amazon has OpenSearch, and Elastic has Elasticsearch, which makes collaboration that much easier. What does Amazon get for all of this? “Put simply, companies accessing Elastic’s services on AWS infrastructure drive a lot of cloud consumption — which translates into ka-ching for Amazon,” adds Sawers.
It’s now called Immersive Stream for Games (which doesn’t exactly roll off the tongue as smoothly as Google Stream). The Stadia team built it with the help of the folks at Google Cloud. The company says the service will allow companies to run their own game trials, let users play full games, offer subscription bundles or have full storefronts. In other words, publishers might be able to run their own versions of Stadia with their own libraries of games, branding and custom user interface.
We’ve seen a version of Immersive Stream for Games in action. Last year, Google teamed up with AT&T to offer people a way to play Batman: Arkham Knight for free via the cloud. Thousands of folks took advantage of the offer. AT&T plans to offer its customers access to another game soon with the help of Immersive Stream for Games. While that version of Batman: Arkham Knight was only available on desktop and laptop web browsers, the next game will run on mobile devices too. If all goes well, it could be a decent way for AT&T to show off what its 5G network can do. Immersive Stream for Games will include other features Google revealed for Stadia today, including a way to offer free trials of full games and a project aimed at making it easier to port games so they run on Stadia tech, as well as analytics. Developers and publishers can send Google an inquiry for more details.
Google is trying to salvage the underlying technology, which is capable of broadcasting high-definition games over the cloud with low latency, shopping the technology to partners under a new name: Google Stream. (Stadia was known in development as “Project Stream.”) The Stadia consumer platform, meanwhile, has been deprioritized within Google, insiders said, with a reduced interest in negotiating blockbuster third-party titles. The focus of leadership is now on securing business deals for Stream, people involved in those conversations said. The changes demonstrate a strategic shift in how Google, which has invested heavily in cloud services, sees its gaming ambitions.
Google has continued to prop up the Stadia consumer platform with a steady stream of titles. After Google closed Stadia’s internal game studios, known as Stadia Games & Entertainment, insiders said the directive was to build out what was internally dubbed a “content flywheel” — a steady flow of independent titles and content from existing publishing deals that would be much more affordable than securing AAA blockbusters, two former employees familiar with the conversations said. “The key thing was that they would not be spending the millions on the big titles,” one said. “And exclusives would be out of the question.” Executives and employees for the Stadia product have also shifted roles. Phil Harrison, the former PlayStation executive Google tapped to run its gaming operations, now reports to the company’s head of subscriptions.
Patrick Seybold, a Google spokesperson, told Insider in a statement: “We announced our intentions of helping publishers and partners deliver games directly to gamers last year, and have been working toward that. The first manifestation has been our partnership with AT&T who is offering Batman: Arkham Knight available to their customers for free. While we won’t be commenting on any rumors or speculation regarding other industry partners, we are still focused on bringing great games to Stadia in 2022. With 200+ titles currently available, we expect to have another 100+ games added to the platform this year, and currently have 50 games available to claim in Stadia Pro.”
Cryptocurrency miners are using compromised Google Cloud accounts for computationally-intensive mining purposes, Google has warned. The search giant’s cybersecurity team provided details in a report published Wednesday. The so-called “Threat Horizons” report aims to provide intelligence that allows organizations to keep their cloud environments secure. “Malicious actors were observed performing cryptocurrency mining within compromised Cloud instances,” Google wrote in an executive summary of the report…
Google said 86% of 50 recently compromised Google Cloud accounts were used to perform cryptocurrency mining. In the majority of cases, cryptocurrency mining software was downloaded within 22 seconds of the account being compromised, Google said.