Facing Hostile Chinese Authorities, Apple CEO Signed $275 Billion Deal With Them
Other Bluetooth-based trackers have been available for some time now, but the ubiquity of Apple devices (which communicate with AirTags via Apple’s Find My app) means it’s generally faster and more accurate to track something remotely via an AirTag than a rival device like a Tile. And while they undoubtedly make it easier for users to recover lost stuff, the tags are being exploited by criminals. Apple did build some anti-stalking functions into AirTags — if your Apple device detects that you’re being followed by an unfamiliar device, it will alert you, as long as you’re running iOS 14.5 or newer.
In understanding knots, mathematicians rely on something called invariants: algebraic, geometric or numerical quantities that are the same for any two equivalent knots. Equivalence can be defined in several ways, but knots can be considered equivalent if you can distort one into another without breaking the knot. Geometric invariants are essentially measurements of a knot’s overall shape, whereas algebraic invariants describe how the knots twist in and around each other. “Up until now, there was no proven connection between those two things,” [said Alex Davies, a machine-learning specialist at DeepMind and one of the authors of the new paper], referring to geometric and algebraic invariants. But mathematicians thought there might be some kind of relationship between the two, so the researchers decided to use DeepMind to find it. With the help of the AI program, they were able to identify a new geometric measurement, which they dubbed the “natural slope” of a knot. This measurement was mathematically related to a known algebraic invariant called the signature, which describes certain surfaces on knots.
In the second case, DeepMind took a conjecture generated by mathematicians in the late 1970s and helped reveal why that conjecture works. For 40 years, mathematicians have conjectured that it’s possible to look at a specific kind of very complex, multidimensional graph and figure out a particular kind of equation to represent it. But they haven’t quite worked out how to do it. Now, DeepMind has come closer by linking specific features of the graphs to predictions about these equations, which are called Kazhdan-Lusztig (KL) polynomials, named after the mathematicians who first proposed them. “What we were able to do is train some machine-learning models that were able to predict what the polynomial was, very accurately, from the graph,” Davies said. The team also analyzed what features of the graph DeepMind was using to make those predictions, which got them closer to a general rule about how the two map to each other. This means DeepMind has made significant progress on solving this conjecture, known as the combinatorial invariance conjecture.
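Neither excerpt spells out the actual DeepMind pipeline, but the pattern Davies describes in both cases (train a model to predict an algebraic quantity from candidate features, then inspect which features the model actually relies on) can be illustrated with a small, self-contained sketch. Everything below is a hypothetical stand-in: the feature names, the synthetic data, and the choice of model are not DeepMind's.

```python
# Toy illustration of "predict an invariant, then ask what the model used."
# The data is synthetic and the feature names are placeholders; this is not
# DeepMind's model, dataset, or feature set.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
feature_names = ["volume", "inj_radius", "natural_slope"]  # hypothetical geometric features

# Fake dataset: 5,000 "knots" with three geometric features each, and a target
# invariant that (by construction, purely for the demo) depends on the slope.
X = rng.normal(size=(5000, 3))
signature = 2.0 * X[:, 2] + rng.normal(scale=0.1, size=5000)

X_train, X_test, y_train, y_test = train_test_split(X, signature, random_state=0)
model = GradientBoostingRegressor().fit(X_train, y_train)
print("held-out R^2:", round(model.score(X_test, y_test), 3))

# Attribution step: which geometric feature is the prediction really using?
result = permutation_importance(model, X_test, y_test, random_state=0)
for name, score in zip(feature_names, result.importances_mean):
    print(f"{name:>14s}: {score:.3f}")
```

In this toy version the attribution step simply points back at the feature the synthetic data was built from; in the real work, the analogous analysis is what helped surface the link between natural slope and signature, and between graph features and KL polynomials.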
The centenary celebration faced what the paper reportedly termed unprecedented challenges, including an unexpected increase in air pollutants and an overcast sky during one of the wettest summers on record. Factories and other polluting activities had been halted in the days ahead of the event, but low airflow meant the pollution hadn’t dissipated, it said. The paper, published in the peer-reviewed journal Environmental Science and led by environmental science professor Wang Can, said a two-hour cloud-seeding operation was launched on the eve of the ceremony, and residents in nearby mountain regions reported seeing rockets shot into the sky on 30 June. The paper said the rockets were carrying silver iodide into the sky to stimulate rainfall.
The researchers said the resulting artificial rain reduced the level of PM2.5 air pollutants by more than two-thirds, and shifted the air quality index reading, based on World Health Organization standards, from “moderate” to “good.” The team said the artificial rain “was the only disruptive event in this period,” so it was unlikely the drop in pollution had a natural cause.
The open letter comes after leaks from Facebook revealed some data from the company’s internal research, which found that Instagram was linked with anxiety and body image issues for some teenage girls. The research released, though, is limited and relies on subjective information collected through interviews. While this strategy can produce useful insights, it can’t prove that social media caused any of the mental health outcomes. The information available so far appears to show that the studies Facebook researchers conducted don’t meet the standards academic researchers use to conduct trials, the new open letter said. The information available also isn’t complete, the authors noted — Meta hasn’t made its research methods or data public, so it can’t be scrutinized by independent experts. The authors called for the company to allow independent review of past and future research, which would include releasing research materials and data.
The letter also asked Meta to contribute its data to ongoing independent research efforts on the mental health of adolescents. It’s a longstanding frustration that big tech companies don’t release data, which makes it challenging for external researchers to scrutinize and understand their products. “It will be impossible to identify and promote mental health in the 21st century if we cannot study how young people are interacting online,” the authors said. […] The open letter also called on Meta to establish an independent scientific trust to evaluate any risks to mental health from the use of platforms like Facebook and Instagram and to help implement “truly evidence-based solutions for online risks on a world-wide scale.” The trust could be similar to the existing Facebook Oversight Board, which helps the company with content moderation decisions.
When it launches, expected in the spring, the service will merge Sony’s two existing subscription plans, PlayStation Plus and PlayStation Now. Currently, PlayStation Plus is required for most online multiplayer games and offers free monthly titles, while PlayStation Now allows users to stream or download older games. Documents reviewed by Bloomberg suggest that Sony plans to retain the PlayStation Plus branding but phase out PlayStation Now.
Comcast fully deployed bufferbloat fixes across their entire network over the past year, demonstrating 90% improvements in working latency and jitter — as described in this article by Comcast's Vice President of Technology Policy & Standards. (The article’s Cumulative Distribution Function chart is to die for…) But: did anybody notice? Did any other ISPs adopt AQM tech? How many of y'all out there are running smart queue management (sch_cake in Linux) nowadays?
But wait — it gets even more interesting…
The Comcast official anticipates even less latency with the newest Wi-Fi 6E standard. (And for home users, the article links to a page recommending “a router whose manufacturer understands the principles of bufferbloat, and has updated the firmware to use one of the Smart Queue Management algorithms such as cake, fq_codel, PIE.”)
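For readers who want to try that suggestion on a Linux box acting as a home router, a minimal sketch of enabling the cake qdisc with the standard tc tool might look like the following; the interface name and shaped bandwidth are assumptions to replace with your own uplink's values.

```python
# Minimal sketch: turn on CAKE smart queue management on a Linux router's
# WAN interface using the standard `tc` command (run as root).
# WAN_IFACE and UPLINK_RATE are assumptions -- set them for your own link.
import subprocess

WAN_IFACE = "eth0"        # hypothetical WAN-facing interface name
UPLINK_RATE = "20mbit"    # set a little below your real upstream rate

subprocess.run(
    ["tc", "qdisc", "replace", "dev", WAN_IFACE, "root",
     "cake", "bandwidth", UPLINK_RATE],
    check=True,
)

# Sanity check: should now list a cake qdisc and, over time, report
# drops/marks instead of a growing backlog.
subprocess.run(["tc", "-s", "qdisc", "show", "dev", WAN_IFACE], check=True)
```

Setting the shaped rate slightly below the true uplink speed is the usual trick: it keeps the queue inside the router, where cake can manage it, rather than in the modem's buffer.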
But then the Comcast VP looks to the future, and where all of this is leading:
Currently under discussion at the IETF in the Transport Area Working Group is a proposal for Low Latency, Low Loss, Scalable Throughput (L4S). This potential approach to achieve very low latency may result in working latencies of roughly one millisecond (though perhaps 1-5 milliseconds initially). As the IETF sorts out the best technical path forward through experimentation and consensus-building (including debate of alternatives), in a few years we may see the beginning of a shift to sub-5 millisecond working latency. This seems likely to not only improve the quality of experience of existing applications but also create a network foundation on which entirely new classes of applications will be built.
While we can certainly think of usable augmented and virtual reality (AR and VR), these are applications we know about today. But what happens when the time to access resources on the Internet is the same as, or close to, the time to access local compute or storage resources? What if the core assumption that developers make about networks — that there is an unpredictable and variable delay — goes away? This is a central assumption embedded into the design of more or less all existing applications. So, if that assumption changes, then we can potentially rethink the design of many applications, and all sorts of new applications will become possible. That is a big deal, and it is exciting to think about the possibilities!
In a few years, when most people have 1 Gbps, 10 Gbps, or eventually 100 Gbps connections in their home, it is perhaps easy to imagine that connection speed is not the only key factor in your performance. We’re perhaps entering an era where consistently low working latency will become the next big thing that differentiates various Internet access services and application services/platforms. Beyond that, factors like exceptionally high uptime, proactive/adaptive security, dynamic privacy protection, and other new things will likely also play a role. But keep an eye on working latency — there are a lot of exciting things happening!
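One rough, do-it-yourself way to see where your own connection stands: compare ping times on an idle link against ping times while a bulk download is saturating it. The sketch below assumes a Unix-like host with the system ping command available, and the download URL is a placeholder for any large file you trust.

```python
# Rough working-latency (latency under load) check -- a sketch, not a benchmark.
# Assumes a Unix-like system `ping`; BULK_URL is a placeholder to replace.
import re
import subprocess
import threading
import urllib.request

PING_TARGET = "1.1.1.1"
BULK_URL = "https://example.com/large-file.bin"  # placeholder: any big download

def mean_rtt_ms(count: int = 10) -> float:
    """Average RTT in milliseconds over `count` pings to PING_TARGET."""
    out = subprocess.run(
        ["ping", "-c", str(count), PING_TARGET],
        capture_output=True, text=True, check=True,
    ).stdout
    samples = [float(m) for m in re.findall(r"time=([\d.]+)", out)]
    return sum(samples) / len(samples)

idle_rtt = mean_rtt_ms()

# Saturate the downlink in the background, then measure again while it runs.
loader = threading.Thread(
    target=lambda: urllib.request.urlretrieve(BULK_URL, "/dev/null"),
    daemon=True,
)
loader.start()
loaded_rtt = mean_rtt_ms()

print(f"idle: {idle_rtt:.1f} ms   under load: {loaded_rtt:.1f} ms   "
      f"bloat: {loaded_rtt - idle_rtt:.1f} ms")
```

A large gap between the two numbers is the bufferbloat the article is talking about; on a link with working smart queue management, the loaded figure should stay close to the idle one.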