New Nonprofit ‘Flickr Foundation’ Hopes to Preserve Its Billions of Photos For 100 Years

“Content of every type disappears from the internet all the time…” writes Popular Photography’s long-time gear editor.

But someone’s doing something about it: the newly-founded Flickr Foundation, which has announced plans “to make sure Flickr will be preserved for future generations.” Or, as Popular Photography puts it, to stop photos “from suffering the same ill fate as our MySpace photos” — citing important historical photos as an example.

One particular collection their article notes is The Flickr Commons, “started back in 2008 as a collaborative effort with the Library of Congress to make publicly held photography collections readily available online for people seeking them out.”

It’s a massive, eclectic, fascinating archive that pulls images and content from around the world. This new organization hopes to integrate more partners and ensure that everything remains available and easily accessible…. If you’re not already familiar with The Commons, it’s a remarkable online resource, granting access to everything from historical portraits to scientific images and everything in between. It’s easy to get lost in the sheer volume of images available on the site, but Flickr relies on curators to bring notable images to the forefront and keep things organized and available.

With the establishment of the new foundation, Flickr hopes it can keep this archive running to 2122 and beyond, doubtless adding countless more images along the way.

Flickr is currently hiring a new archivist, according to their announcement (which also points out that the Flickr API was one of the first public APIs ever).

Among other things, it says that the foundation hopes to “investigate preservation strategies that could last for the next century.”

Read more of this story at Slashdot.

US State of Virginia Has More Datacenter Capacity Than Europe or China

The state of Virginia has over a third of America’s hyperscale datacenter capacity, and this amounts to more than the entire capacity of China or the whole of Europe, highlighting just how much infrastructure is concentrated along the so-called Datacenter Alley. The Register reports: These figures come from Synergy Research Group, which said that the US accounts for 53 percent of global hyperscale datacenter capacity, as measured by critical IT load, at the end of the second quarter of 2022. The remainder is relatively evenly split between China, Europe, and the rest of the world. While few would be surprised at the US accounting for the lion’s share of datacenter capacity, the fact that so much is concentrated in one state could raise a few eyebrows, especially when it is centered on a small number of counties in Northern Virginia — typically Loudoun, Prince William, and Fairfax — which make up Datacenter Alley.

“Hyperscale operators take a lot of factors into account when deciding where to locate their datacenter infrastructure,” said Synergy chief analyst John Dinsdale. “This includes availability of suitable real estate, cost and availability of power supply options, proximity to customers, the risk of natural disasters, local incentives and approvals processes, the ease of doing business and internal business dynamics, and this has inevitably led to some hyperscale hot spots.” Amazon in particular locates a large amount of its datacenter infrastructure in Northern Virginia, with Microsoft, Facebook, Google, ByteDance, and others also having a major presence, according to Synergy. The big three cloud providers — Amazon, Microsoft and Google — have the broadest hyperscale bit barn footprint, with each of these having over 130 datacenters of the 800 or so around the globe. When measured in datacenter capacity, the leading companies are Amazon, Google, Microsoft, Facebook, Alibaba and Tencent, according to Synergy.

Read more of this story at Slashdot.

Vietnam Demands Big Tech Localize Data Storage and Offices

Vietnam’s Ministry of Information and Communications updated cybersecurity laws this week to mandate Big Tech and telecoms companies store user data locally, and control that data with local entities. The Register reports: The data affected goes beyond the basics of name, email, credit card information, phone number and IP address, and extends into social elements — including groups of which users are members, or the friends with whom they digitally interact. “Data of all internet users ranging from financial records and biometric data to information on people’s ethnicity and political views, or any data created by users while surfing the internet must be stored domestically,” read the decree (PDF) issued Wednesday, as translated by Reuters. The decree applies to a wide swath of businesses including those providing telecom services, storing and sharing data in cyberspace, providing national or international domain names for users in Vietnam, e-commerce, online payments, payment intermediaries, transport connection services operating in cyberspace, social media, online video games, messaging services, and voice or video calls.

According to Article 26 of the government’s Decree 53, the new rules go into effect October 1, 2022 — around seven weeks from the date of its announcement. However, foreign companies have an entire 12 months in which to comply — beginning when they receive instructions from the Minister of Public Security. The companies are then required to store the data in Vietnam for a minimum of 24 months. System logs will need to be stored for 12 months. After this grace period, authorities reserve the right to make sure affected companies are following the law through investigations and data collection requests, as well as content removal orders. Further reading: Vietnam To Make Apple Watch, MacBook For First Time Ever

Read more of this story at Slashdot.

Microsoft Trying To Kill HDD Boot Drives By 2023, Report Says

A recent executive brief from data storage industry analyst firm Trendfocus reports that OEMs have disclosed that Microsoft is pushing them to drop HDDs as the primary storage device in pre-built Windows 11 PCs and use SSDs instead, with the current deadlines for the switchover set for 2023. Tom’s Hardware reports: Interestingly, these actions from Microsoft come without any firm SSD requirement listed for Windows 11 PCs, and OEMs have pushed back on the deadlines. […] Microsoft’s most current list of hardware requirements calls for a ’64 GB or larger storage device’ for Windows 11, so an SSD isn’t a minimum requirement for a standard install. However, Microsoft stipulates that two features, DirectStorage and the Windows Subsystem for Android, require an SSD, but you don’t have to use those features. It is unclear whether or not Microsoft plans to change the minimum specifications for Windows 11 PCs after the 2023 switchover to SSDs for pre-built systems.

As always, the issue with switching all systems to SSDs boils down to cost: Trendfocus Vice President John Chen tells us that replacing a 1TB HDD requires stepping down to a low-cost 256 GB SSD, which OEMs don’t consider to be enough capacity for most users. Conversely, stepping up to a 512 GB SSD would ‘break the budget’ for lower-end machines with a strict price limit. “The original cut-in date based on our discussions with OEMs was to be this year, but it has been pushed out to sometime next year (the second half, I believe, but not clear on the firm date),” Chen told Tom’s Hardware. “OEMs are trying to negotiate some level of push out (emerging market transition in 2024, or desktop transition in 2024), but things are still in flux.”

The majority of PCs in developed markets have already transitioned to SSDs for boot drives, but there are exceptions. Chen notes that it is possible that Microsoft could make some exceptions, but the firm predicts that dual-drive desktop PCs and gaming laptops with both an SSD for the boot drive and an HDD for bulk storage will be the only mass-market PCs with an HDD. […] It’s unclear what measures, if any, Microsoft would take with OEMs if they don’t comply with its wishes, and the company has decided not to comment on the matter. Trendfocus says the switchover will have implications for HDD demand next year.

Read more of this story at Slashdot.

DirectStorage Shows Just Minor Load-Speed Improvements In Real-World PC Demo

Andrew Cunningham writes via Ars Technica: Microsoft’s DirectStorage API promises to speed up game-load times, both on the Xbox Series X/S and on Windows PCs (where the API recently exited its developer-preview phase). One of the first games to demonstrate the benefits of DirectStorage on the PC is Square Enix’s Forspoken, which was shown off by Luminous Productions technical director Teppei Ono at GDC this week. As reported by The Verge, Ono said that, with a fast NVMe SSD and DirectStorage support, some scenes in Forspoken could load in as little as one second. That is certainly a monstrous jump from the days of waiting for a PlayStation 2 to load giant open-world maps from a DVD.

As a demonstration of DirectStorage, though, Forspoken’s numbers are a mixed bag. On one hand, the examples Ono showcased clearly demonstrate DirectStorage loading scenes more quickly on the same hardware, compared to the legacy Win32 API — from 2.6 seconds to 2.2 seconds in one scene, and from 2.4 seconds to 1.9 seconds in another. Forspoken demonstrated performance improvements on older SATA-based SSDs as well, despite DirectStorage being marketed as a feature that will primarily benefit NVMe drives — dropping from 5.0 to 4.6 seconds in one scene, and from 4.1 to 3.4 seconds in another. Speed improvements for SATA SSDs have been limited for the better part of a decade now because the SATA interface itself (rather than the SSD controller or NAND flash chips) has been holding them back. So eking out any kind of measurable improvement for those drives is noteworthy.

On the other hand, Ono’s demo showed that game load time wasn’t improving as dramatically as the raw I/O speeds would suggest. On an NVMe SSD, I/O speeds increased from 2,862MB/s using Win32 to 4,829MB/s using DirectStorage — nearly a 70 percent increase. But the load time for the scene decreased from 2.1 to 1.9 seconds. That’s a decrease that wouldn’t be noticeable even if you were trying to notice it. The Forspoken demo ultimately showed that the speed of the storage you’re using still has a lot more to do with how quickly your games load than DirectStorage does. One scene that took 24.6 seconds to load using DirectStorage on an HDD took just 4.6 seconds to load on a SATA SSD and 2.2 seconds to load on an NVMe SSD. That’s a much larger gap than the one between Win32 and DirectStorage running on the same hardware.
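The gap Ono described is easy to quantify from the figures in the demo. A quick back-of-the-envelope check (a sketch only; the numbers are the ones quoted above, not new measurements):

```python
# Compare the relative improvement in raw I/O throughput against the
# relative improvement in scene load time, using the Forspoken demo figures.

win32_mbps, ds_mbps = 2862, 4829   # NVMe throughput: Win32 API vs DirectStorage
win32_load, ds_load = 2.1, 1.9     # scene load time in seconds

throughput_gain = (ds_mbps - win32_mbps) / win32_mbps
load_time_gain = (win32_load - ds_load) / win32_load

print(f"Throughput improved {throughput_gain:.0%}")     # ~69%
print(f"Load time improved only {load_time_gain:.0%}")  # ~10%
```

A ~69 percent jump in throughput yielding only a ~10 percent drop in load time suggests the scene load is bottlenecked elsewhere (decompression, CPU-side setup), which is consistent with the article's conclusion that the storage device itself still matters more.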

Read more of this story at Slashdot.

Apple’s New Studio Display Has 64GB of Onboard Storage

New submitter Dru Nemeton shares a report from 9to5Mac: Apple’s new Studio Display officially hit the market on Friday, and we continue to learn new tidbits about what exactly is inside the machine. While Apple touted that the Studio Display is powered by an A13 Bionic inside, we’ve since learned that the Studio Display also features 64GB of onboard storage, because who knows why… […] as first spotted by Khaos Tian on Twitter, the Studio Display also apparently features 64GB of onboard storage. Yes, 64GB: double the storage in the entry-level Apple TV 4K and the same amount of storage in the entry-level iPad Air 5. Also worth noting: the Apple TV 4K is powered by the A12 Bionic chip, so the Studio Display has it beat on that front as well. Apple hasn’t offered any explanation for why the Studio Display features 64GB of onboard storage. It appears that less than 2GB of that storage is actually being used as of right now.

One unexciting possibility is that the A13 Bionic chip used inside the Studio Display is literally the exact same A13 Bionic chip that was first shipped in the iPhone 11. As you might remember, the iPhone 11 came with 64GB of storage in its entry-level configuration, meaning Apple likely produced millions of A13 Bionic chips with 64GB of onboard storage. What do you think? Will Apple ever tap into the A13 Bionic chip and 64GB storage inside the Studio Display for something more interesting?

Read more of this story at Slashdot.

Ask Slashdot: Is It Time To Replace File Systems?

DidgetMaster writes: Hard drive costs now hover around $20 per terabyte (TB). Drives bigger than 20TB are now available. Fast SSDs are more expensive, but the average user can now afford these in TB capacities as well. Yet, we are still using antiquated file systems that were designed decades ago when the biggest drives were much less than a single gigabyte (GB). Their oversized file records and slow directory traversal search algorithms make finding files on volumes that can hold more than 100 million files a nightmare. Rather than flexible tagging systems that could make searches quick and easy, they have things like “extended attributes” that are painfully slow to search on. Indexing services can be built on top of them, but these are not an integral part of the file system so they can be bypassed and become out of sync with the file system itself.

It is time to replace file systems with something better. A local object store that can effectively manage hundreds of millions of files and find things in seconds based on file type and/or tags attached is possible. File systems are usually free and come with your operating system, so there seems to be little incentive for someone to build a new system from scratch, but just as we needed the internet to come along and change everything, we need a better data storage manager.
See Didgets for an example of what is possible.
In a Substack article, Didgets developer Andy Lawrence argues his system solves many of the problems associated with the antiquated file systems still in use today. “With Didgets, each record is only 64 bytes which means a table with 200 million records is less than 13GB total, which is much more manageable,” writes Lawrence. Didgets also has “a small field in its metadata record that tells whether the file is a photo or a document or a video or some other type,” helping to dramatically speed up searches.
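Lawrence's sizing claim checks out arithmetically. Below is a quick sketch verifying it, plus a toy illustration of why a compact per-record type field speeds up searches; the record layout and tag values here are invented for illustration and are not Didgets' actual format:

```python
# Verify the metadata-table sizing claimed in the article:
# 200 million records at 64 bytes each.
RECORD_SIZE = 64                 # bytes per metadata record, per the article
records = 200_000_000

table_bytes = RECORD_SIZE * records
print(f"{table_bytes / 1e9:.1f} GB")   # 12.8 GB -- "less than 13GB total"

# Toy in-memory analogue: a small type tag stored in each fixed-size record
# lets a type search scan a compact array instead of parsing pathnames or
# extended attributes for every file.
PHOTO, DOCUMENT, VIDEO = 1, 2, 3               # hypothetical tag values
type_tags = [PHOTO, DOCUMENT, VIDEO, PHOTO, DOCUMENT]
photos = [i for i, t in enumerate(type_tags) if t == PHOTO]
print(photos)   # indices of records tagged as photos -> [0, 3]
```

The design point is that a fixed-size, densely packed metadata table fits comfortably in RAM even at 200 million entries, so a type or tag search becomes a linear scan of a few gigabytes rather than a directory-tree traversal.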

Do you think it’s time to replace file systems with an alternative system, such as Didgets? Why or why not?

Read more of this story at Slashdot.

Ask Slashdot: How Many Files Are on Your Computer?

With some time on their hands, long-time Slashdot reader shanen began exploring the question: How many files does my Windows 10 computer have?

But then they realized “It would also be interesting to compare the weirdness on other OSes…”

Here are the two data points in front of me:

(1) Using right-click &gt; Properties on all of the top-level folders on the drive (including the so-called hidden folders), Windows quickly determined that there are a few hundred thousand files in those folders (and a few hundred thousand subfolders). That’s already ridiculous, but the expected par these days. The largest project I have on the machine only has about 3,000 files, and that one goes back many years… (My largest database only has about 5,000 records, but it’s just a few files.)

(2) However, I also decided to take a look with Microsoft’s malicious software removal tool and got a completely different answer. For twisted grins, I had invoked the full scan. It’s still running a day later and has already passed 10 million files. Really? The progress bar indicates about 80% finished? WTF?

Obviously there is some kind of disagreement about the nature of “file” here. I could only think of one crazy explanation, but my router answered “No, the computer is not checking all of the files on the Internet.” So I’ve already asked the specific question in three-letter form, but the broader question is about the explosive, perhaps even cancerous, “population growth” of files these days.
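One way to get a consistent baseline count is to walk the filesystem yourself, so that "file" means exactly "directory entry that is a regular file." A minimal sketch in Python (the Properties dialog and a malware scanner may each count differently: hard links, NTFS alternate data streams, files inside archives, and reparse points can all inflate or deflate totals):

```python
# Count files and subdirectories under a root directory, one definition
# of "file": an entry reported as a filename by os.walk.
import os

def count_files(root):
    files = dirs = 0
    for _, dirnames, filenames in os.walk(root, followlinks=False):
        dirs += len(dirnames)
        files += len(filenames)
    return files, dirs

# Example usage: count_files("C:\\") on Windows, or count_files("/") elsewhere.
# Expect this to take a while on a full drive, and note it will silently
# skip directories it lacks permission to read.
```

This won't match the scanner's 10-million figure if the scanner descends into archives or counts alternate data streams, but it gives a reproducible number to compare across machines and OSes.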

Maybe we can all solve this mystery together. So use the comments to share your own answers and insights.

How many files are on your computer?

Read more of this story at Slashdot.