Redis To Adopt ‘Source-Available Licensing’ Starting With Next Version

Longtime Slashdot reader jgulla shares an announcement from Redis: Beginning today, all future versions of Redis will be released with source-available licenses. Starting with Redis 7.4, Redis will be dual-licensed under the Redis Source Available License (RSALv2) and Server Side Public License (SSPLv1). Consequently, Redis will no longer be distributed under the three-clause Berkeley Software Distribution (BSD) license. The new source-available licenses allow us to sustainably provide permissive use of our source code.

We’re leading Redis into its next phase of development as a real-time data platform with a unified set of clients, tools, and core Redis product offerings. The Redis source code will continue to be freely available to developers, customers, and partners through Redis Community Edition. Future Redis source-available releases will unify core Redis with Redis Stack, including search, JSON, vector, probabilistic, and time-series data models in one free, easy-to-use package as downloadable software. This will allow anyone to easily use Redis across a variety of contexts, including as a high-performance key/value and document store, a powerful query engine, and a low-latency vector database powering generative AI applications. […]

Under the new license, cloud service providers hosting Redis offerings will no longer be permitted to use the source code of Redis free of charge. For example, cloud service providers will be able to deliver Redis 7.4 only after agreeing to licensing terms with Redis, the maintainers of the Redis code. These agreements will underpin support for existing integrated solutions and provide full access to forthcoming Redis innovations. In practice, nothing changes for the Redis developer community, which will continue to enjoy permissive licensing under the dual license. At the same time, all the Redis client libraries under the responsibility of Redis will remain open-source licensed. Redis will continue to support its vast partner ecosystem — including managed service providers and system integrators — with exclusive access to all future releases, updates, and features developed and delivered by Redis through its Partner Program. There is no change for existing Redis Enterprise customers.

Read more of this story at Slashdot.

Woman With $2.5 Billion In Bitcoin Convicted of Money Laundering

mrspoonsi shares a report from the BBC: A former takeaway worker found with Bitcoin worth more than $2.5 billion has been convicted at Southwark Crown Court of a crime linked to money laundering. Jian Wen, 42, from Hendon in north London, was involved in converting the currency into assets including multi-million-pound houses and jewelry. On Monday she was convicted of entering into or becoming concerned in a money laundering arrangement. The Met said the seizure is the largest of its kind in the UK.

Although Wen was living in a flat above a Chinese restaurant in Leeds when she became involved in the criminal activity, her new lifestyle saw her move into a six-bedroom house in north London in 2017, which was rented for more than $21,000 per month. She posed as an employee of an international jewelry business and moved her son to the UK to attend private school, the Crown Prosecution Service (CPS) said. That same year, Wen tried to buy a string of expensive houses in London, but struggled to pass money-laundering checks, and her claims that she had earned millions legitimately by mining Bitcoin were not believed. She later travelled abroad, buying jewelry worth tens of thousands of pounds in Zurich and purchasing properties in Dubai in 2019.

Another suspect is thought to be behind the fraud, but they remain at large. The Met said it carried out a large-scale investigation as part of the case, searching several addresses, reviewing 48 electronic devices, and examining thousands of digital files, many of which were translated from Mandarin. The CPS has obtained a freezing order from the High Court while it carries out a civil recovery investigation that could lead to the forfeiture of the Bitcoin. The Bitcoin was worth around $2.5 billion at the time of the initial estimates, but due to fluctuations in the currency’s value, that figure has since risen to around $4.3 billion.

Read more of this story at Slashdot.

Nvidia’s Jensen Huang Says AGI Is 5 Years Away

Haje Jan Kamps writes via TechCrunch: Artificial General Intelligence (AGI) — often referred to as “strong AI,” “full AI,” “human-level AI” or “general intelligent action” — represents a significant future leap in the field of artificial intelligence. Unlike narrow AI, which is tailored for specific tasks (such as detecting product flaws, summarizing the news, or building you a website), AGI will be able to perform a broad spectrum of cognitive tasks at or above human levels. Addressing the press this week at Nvidia’s annual GTC developer conference, CEO Jensen Huang appeared to be getting really bored of discussing the subject — not least because he finds himself misquoted a lot, he says. The frequency of the question makes sense: The concept raises existential questions about humanity’s role in and control of a future where machines can outthink, outlearn and outperform humans in virtually every domain. The core of this concern lies in the unpredictability of AGI’s decision-making processes and objectives, which might not align with human values or priorities (a concept explored in depth in science fiction since at least the 1940s). There’s concern that once AGI reaches a certain level of autonomy and capability, it might become impossible to contain or control, leading to scenarios where its actions cannot be predicted or reversed.

When the sensationalist press asks for a timeframe, it is often baiting AI professionals into putting a timeline on the end of humanity — or at least the current status quo. Needless to say, AI CEOs aren’t always eager to tackle the subject. Predicting when we will see a passable AGI depends on how you define AGI, Huang argues, and he draws a couple of parallels: Even with the complications of time zones, you know when the new year happens and 2025 rolls around. If you’re driving to the San Jose Convention Center (where this year’s GTC conference is being held), you generally know you’ve arrived when you can see the enormous GTC banners. The crucial point is that we can agree on how to measure whether you’ve arrived, temporally or geospatially, at where you were hoping to go. “If we specified AGI to be something very specific, a set of tests where a software program can do very well — or maybe 8% better than most people — I believe we will get there within 5 years,” Huang explains. He suggests that the tests could be a legal bar exam, logic tests, economic tests or perhaps the ability to pass a pre-med exam. Unless the questioner is able to be very specific about what AGI means in the context of the question, he’s not willing to make a prediction. Fair enough.

Read more of this story at Slashdot.

Modern Web Bloat Means Some Pages Load 21MB of Data

Christopher Harper reports via Tom’s Hardware: Earlier this month, Danluu.com released an exhaustive 23-page analysis/op-ed/manifesto on the current state of unoptimized web page and web app performance, finding that just loading a web page can bog down even an entry-level device that can run the popular game PUBG at 40 fps. In fact, the Wix webpage requires loading 21MB of data for one page, while the more famous websites Patreon and Threads load 13MB of data for one page. This can result in load times of up to 33 seconds or, in some cases, in the page failing to load at all.

As the testing above shows, some of the most brutally intensive websites include the likes of… Quora, and basically every major social media platform. Newer content production platforms like Squarespace and newer forum platforms like Discourse also have significantly worse performance than their older counterparts, often to the point of unusability on some devices. The Tecno S8C, one of the prominent entry-level phones common in emerging markets, is one particularly compelling test device that stood out. The device is actually quite impressive in some ways, including its ability to run PlayerUnknown’s Battlegrounds Mobile at 40 FPS — but the same device can’t even run Quora and experiences nigh-unusable lag when scrolling on social media sites.

That example is most likely the best summation of the overall point, which is that modern web and app design is increasingly trending toward an unrealistic assumption of ever-increasing bandwidth and processing. Quora is a website where people answer questions — there is absolutely no reason any of these websites should be harder to run than a Battle Royale game.
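
The page weights quoted above translate directly into wait time on slower connections. As a rough back-of-the-envelope illustration (the page weights come from the article; the link speeds below are assumptions chosen for the example, and real load times also depend on latency, parsing, and script execution), transfer time scales linearly with page size:

```python
# Rough transfer-time estimate for the page weights cited above.
# Page weights (Wix ~21 MB, Patreon/Threads ~13 MB) are from the article;
# the link speeds are illustrative assumptions, not figures from Dan Luu's tests.

PAGE_WEIGHTS_MB = {"Wix": 21, "Patreon": 13, "Threads": 13}
LINK_SPEEDS_MBPS = {"~3G (1 Mbps)": 1, "budget 4G (5 Mbps)": 5, "broadband (50 Mbps)": 50}

for site, megabytes in PAGE_WEIGHTS_MB.items():
    for label, mbps in LINK_SPEEDS_MBPS.items():
        seconds = megabytes * 8 / mbps  # megabytes -> megabits, divided by link rate
        print(f"{site:8s} over {label:18s}: ~{seconds:6.1f} s just to move the bytes")
```

On a 1 Mbps link, moving 21MB takes nearly three minutes before a single script runs, which is consistent with the multi-second and failed loads described above.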

Read more of this story at Slashdot.

Job Boards Are Rife With ‘Ghost Jobs’

“Job openings across the country are seemingly endless,” writes longtime Slashdot reader smooth wombat. “Millions of jobs are listed, but are they real? Companies may post job openings with no intent to ever fill them. These are known as ghost jobs, and there are more of them than most people realize.” The BBC reports: Clarify Capital, a New York-based business loan provider, surveyed 1,000 hiring managers and found nearly seven in 10 jobs stay open for more than 30 days, with 10% unfilled for more than half a year. Half the respondents reported they keep job listings open indefinitely because they are “always open to new people.” More than one in three respondents said they kept the listings active to build a pool of applicants in case of turnover — not because a role needs to be filled in a timely manner.

The posted roles are more than just a talent vacuum sucking up resumes from applicants. They are also a tool for shaping perception inside and outside of the company. More than 40% of hiring managers said they list jobs they aren’t actively trying to fill to give the impression that the company is growing. A similar share said the job listings are made to motivate employees, while 34% said the jobs are posted to placate overworked staff who may be hoping for additional help to be brought on.

“Ghost jobs are everywhere,” says Geoffrey Scott, senior content manager and hiring manager at Resume Genius, a US company that helps workers design their resumes. “We discovered a massive 1.7 million potential ghost job openings on LinkedIn just in the US,” says Scott. In the UK, StandOut CV, a London-based career resources company, found more than a third of job listings in 2023 were ghost jobs, defined as listings posted for more than 30 days. “Experts caution not every posting that seems like a ghost job is one,” notes the report. “Still, whether these postings are ghost jobs — or simply look and feel like them — the result is similar. Jobseekers end up discouraged and burnt out.”

Read more of this story at Slashdot.

Nvidia Reveals Blackwell B200 GPU, the ‘World’s Most Powerful Chip’ For AI

Sean Hollister reports via The Verge: Nvidia’s must-have H100 AI chip made it a multitrillion-dollar company, one that may be worth more than Alphabet and Amazon, and competitors have been fighting to catch up. But perhaps Nvidia is about to extend its lead — with the new Blackwell B200 GPU and GB200 “superchip.” Nvidia says the new B200 GPU offers up to 20 petaflops of FP4 horsepower from its 208 billion transistors and that a GB200 that combines two of those GPUs with a single Grace CPU can offer 30 times the performance for LLM inference workloads while also potentially being substantially more efficient. It “reduces cost and energy consumption by up to 25x” over an H100, says Nvidia.

Training a 1.8 trillion parameter model would have previously taken 8,000 Hopper GPUs and 15 megawatts of power, Nvidia claims. Today, Nvidia’s CEO says 2,000 Blackwell GPUs can do it while consuming just four megawatts. On a GPT-3 LLM benchmark with 175 billion parameters, Nvidia says the GB200 has a somewhat more modest seven times the performance of an H100 and offers 4x the training speed. Nvidia told journalists one of the key improvements is a second-gen transformer engine that doubles the compute, bandwidth, and model size by using four bits for each neuron instead of eight (thus, the 20 petaflops of FP4 I mentioned earlier). A second key difference only comes when you link up huge numbers of these GPUs: a next-gen NVLink switch that lets 576 GPUs talk to each other, with 1.8 terabytes per second of bidirectional bandwidth. That required Nvidia to build an entirely new network switch chip, one with 50 billion transistors and some of its own onboard compute: 3.6 teraflops of FP8, says Nvidia. Further reading: Nvidia in Talks To Acquire AI Infrastructure Platform Run:ai
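
Those training figures can be sanity-checked with simple arithmetic. The sketch below uses only the numbers quoted above; the per-GPU wattage and the ratios it derives are a back-of-the-envelope reading, not figures stated by Nvidia:

```python
# Nvidia's quoted claim: training a 1.8T-parameter model took 8,000 Hopper GPUs
# and 15 MW, versus 2,000 Blackwell GPUs and 4 MW. The derived figures below
# are back-of-the-envelope estimates, not Nvidia's own numbers.

hopper_gpus, hopper_mw = 8_000, 15.0
blackwell_gpus, blackwell_mw = 2_000, 4.0

gpu_ratio = hopper_gpus / blackwell_gpus                    # 4x fewer GPUs for the same job
power_ratio = hopper_mw / blackwell_mw                      # ~3.75x less total power
watts_per_hopper = hopper_mw * 1e6 / hopper_gpus            # ~1,875 W per Hopper GPU
watts_per_blackwell = blackwell_mw * 1e6 / blackwell_gpus   # ~2,000 W per Blackwell GPU

print(f"GPU count: {gpu_ratio:.2f}x fewer; total power: {power_ratio:.2f}x lower")
print(f"Implied per-GPU draw: Hopper ~{watts_per_hopper:.0f} W, Blackwell ~{watts_per_blackwell:.0f} W")

# The FP4 transformer-engine claim is similarly mechanical: 4 bits per value
# instead of 8 means roughly twice the parameters fit in the same memory and
# twice the values move per unit of bandwidth.
fp8_bits, fp4_bits = 8, 4
print(f"FP8 -> FP4 capacity/bandwidth factor: {fp8_bits // fp4_bits}x")
```

Read that way, the claimed efficiency gain comes mostly from needing far fewer GPUs per training run, with each Blackwell GPU drawing roughly as much power as a Hopper GPU.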

Read more of this story at Slashdot.