Is GitHub Suspending the Accounts of Russian Developers at Sanctioned Companies?

“Russian software developers are reporting that their GitHub accounts are being suspended without warning if they work for or previously worked for companies under U.S. sanctions,” writes Bleeping Computer:

According to Russian media outlets, the ban wave began on April 13 and didn’t discriminate between companies and individuals. For example, the GitHub accounts of Sberbank Technology, Sberbank AI Lab, and the Alfa Bank Laboratory had their code repositories initially disabled and are now removed from the platform…. Personal accounts suspended on GitHub have their content wiped while all repositories become immediately out of reach, and the same applies to issues and pull requests.

Habr.com [a Russian collaborative blog about IT] reports that some Russian developers contacted GitHub about the suspension and received an email titled ‘GitHub and Trade Controls’ explaining that their account was disabled due to US sanctions. The email links to a GitHub page on the company’s policies regarding sanctions and trade controls, which also describes how a user can appeal their suspension. The appeal form requires the individual to certify that they do not use their GitHub account on behalf of a sanctioned entity. A developer posted to Twitter saying that he was able to get the suspension lifted after filling out the form, and that it had been triggered by his previous employer being sanctioned.
A GitHub blog post in March had promised to ensure the availability of open source services “to all, including developers in Russia.” So Bleeping Computer contacted a GitHub spokesperson, who explained this weekend that while GitHub may be required to restrict some users to comply with U.S. laws, “We examine government sanctions thoroughly to be certain that users and customers are not impacted beyond what is required by law.”
According to that policy, the suspended personal accounts are either affiliated with, collaborating with, or working for sanctioned entities. However, even developers who only previously worked for a sanctioned company appear to have been suspended by mistake.

This means that Russian users, in general, can suddenly find their projects wiped and accounts suspended, even if those projects have nothing to do with the sanctioned entities.

Read more of this story at Slashdot.

‘Biggest Change Ever’ to Go Brings Generics, Native Fuzzing, and a Performance Boost

“Supporting generics has been Go’s most often requested feature, and we’re proud to deliver the generic support that the majority of users need today,” the Go blog announced this week. *

It’s part of what Go’s development team is calling the “biggest change ever to the language”.

SiliconANGLE writes that “Right out of the gate, Go 1.18 is getting a CPU speed performance boost of up to 20% for Apple M1, ARM64 and PowerPC64 chips. This is all from an expansion of Go 1.17’s calling conventions for the application binary interface on these processor architectures.”

And Go 1.18 also introduces native support for fuzz testing — making Go the first major programming language to do so, writes ZDNet:

As Google explains, fuzz testing or ‘fuzzing’ is a means of testing the vulnerability of a piece of software by throwing arbitrary or invalid data at it to expose bugs and unknown errors. This adds an additional layer of security to Go’s code that will keep it protected as its functionality evolves — crucial as attacks on software continue to escalate both in frequency and complexity. “At Google we are committed to securing the online infrastructure and applications the world depends upon,” said Eric Brewer, VP of infrastructure at Google….
While other languages support fuzzing, Go is the first major programming language to incorporate it into its core toolchain, meaning — unlike other languages — third-party support integrations aren’t required.

Google is emphasizing Go’s security features — and its widespread adoption. ZDNet writes:

Google created Go in 2007, and the language was designed specifically to help software engineers build secure, open-source enterprise applications for modern, multi-core computing systems. More than three-quarters of Cloud Native Computing Foundation projects, including Kubernetes and Istio, are written in Go, says Google. [Also Docker and etcd.] According to data from Stack Overflow, some 10% of developers are writing in Go worldwide, and there are signs that more recruiters are seeking out Go coders in their search for tech talent…. “Although we have a dedicated Go team at Google, we welcome a significant amount of contributions from our community. It’s a shared effort, and with their updates we’re helping our community achieve Go’s long-term vision.”
Or, as the Go blog says:

We want to thank every Go user who filed a bug, sent in a change, wrote a tutorial, or helped in any way to make Go 1.18 a reality. We couldn’t do it without you. Thank you.

Enjoy Go 1.18!

* Supporting generics “includes major — but fully backward-compatible — changes to the language,” the release notes explain, though they also add a few cautionary notes:

These new language changes required a large amount of new code that has not had significant testing in production settings. That will only happen as more people write and use generic code. We believe that this feature is well implemented and high quality. However, unlike most aspects of Go, we can’t back up that belief with real world experience. Therefore, while we encourage the use of generics where it makes sense, please use appropriate caution when deploying generic code in production.

While we believe that the new language features are well designed and clearly specified, it is possible that we have made mistakes…. it is possible that there will be code using generics that will work with the 1.18 release but break in later releases. We do not plan or expect to make any such change. However, breaking 1.18 programs in future releases may become necessary for reasons that we cannot today foresee. We will minimize any such breakage as much as possible, but we can’t guarantee that the breakage will be zero.
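For readers who haven’t followed the proposal, here is a minimal sketch of what the 1.18 generics syntax looks like. (The `Number` constraint and `Sum` function are illustrative examples, not taken from the release notes.)

```go
package main

import "fmt"

// Number is a type constraint: any type whose underlying type
// is int, int64, or float64 satisfies it.
type Number interface {
	~int | ~int64 | ~float64
}

// Sum works for any slice whose element type satisfies Number.
func Sum[T Number](xs []T) T {
	var total T
	for _, x := range xs {
		total += x
	}
	return total
}

func main() {
	fmt.Println(Sum([]int{1, 2, 3}))       // 6
	fmt.Println(Sum([]float64{1.5, 2.25})) // 3.75
}
```

The compiler infers the type argument from the slice passed in, so call sites look just like ordinary function calls.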

Read more of this story at Slashdot.

Researchers Release ‘PolyCoder’, the First Open-Source Code-Generating AI Model

“Code generation AI — AI systems that can write in different programming languages given a prompt — promise to cut development costs while allowing coders to focus on creative, less repetitive tasks,” writes VentureBeat.

“But while research labs like OpenAI and Alphabet-backed DeepMind have developed powerful code-generating AI, many of the most capable systems aren’t available in open source.”

For example, the training data for OpenAI’s Codex, which powers GitHub’s Copilot feature, hasn’t been made publicly available, preventing researchers from fine-tuning the AI model or studying aspects of it such as interpretability.

To remedy this, researchers at Carnegie Mellon University — Frank Xu, Uri Alon, Graham Neubig, and Vincent Hellendoorn — developed PolyCoder, a model based on OpenAI’s GPT-2 language model that was trained on a dataset of 249 gigabytes of code across 12 programming languages. While PolyCoder doesn’t match the performance of top code generators in every task, the researchers claim that PolyCoder is able to write in C with greater accuracy than all known models, including Codex….

“Large tech companies aren’t publicly releasing their models, which is really holding back scientific research and democratization of such large language models of code,” the researchers said. “To some extent, we hope that our open-sourcing efforts will convince others to do the same. But the bigger picture is that the community should be able to train these models themselves. Our model pushed the limit of what you can train on a single server — anything bigger requires a cluster of servers, which dramatically increases the cost.”

Read more of this story at Slashdot.

Why Swift Creator Chris Lattner Stepped Down From Its Core Team This Week

The creator of Apple’s Swift programming language stayed involved in the Swift core team and Evolution community… until this week. Though he’d left Apple more than five years ago, “Swift is important to me, so I’ve been happy to spend a significant amount of time to help improve and steer it,” Lattner wrote in an explanatory comment on the Swift community forum. “This included the ~weekly core team meetings (initially in person, then over WebEx)…”

The tech news site DevClass notes Lattner is also “the mind behind compiler infrastructure project LLVM,” but reports that “Apparently, Lattner hasn’t been part of the [Swift] core team since autumn 2021, when he tried discussing what he perceived as a toxic meeting environment with project leadership after an especially noteworthy call made him take a break in summer.”

“[…] after avoiding dealing with it, they made excuses, and made it clear they weren’t planning to do anything about it. As such, I decided not to return,” Lattner wrote in his explanation post. Back then, he planned to keep participating via the Swift Evolution community “but after several discussions generating more heat than light, when my formal proposal review comments and concerns were ignored by the unilateral accepts, and the general challenges with transparency working with core team, I decided that my effort was triggering the same friction with the same people, and thus I was just wasting my time.”

Lattner had been the steering force behind Swift since the language’s inception in 2010. However, after he left Apple in 2017 and handed over his project lead role, design premises like “single things that compose” seem to have fallen by the wayside, making it easier for language creator Lattner to move on completely.

The article points out Lattner’s latest endeavour is AI infrastructure company Modular.AI.

And Lattner wrote in his comment that Swift’s leadership “reassures me they ‘want to make sure things are better for others in the future based on what we talked about’ though….”
Swift has a ton of well meaning and super talented people involved in and driving it. They are trying to do the best they can with a complicated situation and many pressures (including lofty goals, fixed schedules, deep bug queues to clear, internal folks that want to review/design things before the public has access to them, and pressures outside their team) that induce odd interactions with the community. By the time things get out to us, the plans are already very far along and sometimes the individuals are attached to the designs they’ve put a lot of energy into. This leads to a challenging dynamic for everyone involved.

I think that Swift is a phenomenal language and has a long and successful future ahead, but it certainly isn’t a community designed language, and this isn’t ambiguous. The new ideas on how to improve things sound promising — I hope they address the fundamental incentive system challenges that the engineers/leaders face that cause the symptoms we see. I think that a healthy and inclusive community will continue to benefit the design and evolution of Swift.

DevClass also reported on the aftermath:
Probably as a consequence of the move, the Swift core team is currently looking to restructure project leadership. According to Swift project lead Ted Kremenek… “The intent is to free the core team to invest more in overall project stewardship and create a larger language workgroup that can incorporate more community members in language decisions.”

Kremenek also used the announcement to thank Lattner for his leadership throughout the formative years of the project, writing “it has been one of the greatest privileges of my life to work with Chris on Swift.”

In 2017 Chris Lattner answered questions from Slashdot’s readers.

Read more of this story at Slashdot.

Programming in Rust is Fun – But Challenging, Finds Annual Community Survey

Respondents to the annual survey of the Rust community reported an uptick in weekly usage, along with some persistent challenges, writes InfoWorld:

Among those surveyed who are using Rust, 81% were using the language on at least a weekly basis, compared to 72% in last year’s survey. Of all Rust users, 75% said they are able to write production-ready code but 27% said it was at times a struggle to write useful, production-ready code…. While the survey pointed toward a growing, healthy community of “Rustaceans,” it also found challenges. In particular, Rust users would like to see improvements in compile times, disk usage, debugging, and GUI development…

– For those who adopted Rust at work, 83% found it “challenging.” But it was unclear how much of this was a Rust-specific issue, as opposed to the general challenge of adopting any new language. During adoption, only 13% of respondents believed the language was slowing their team down while 82% believed Rust helped their teams achieve their goals.
– Of the respondents using Rust, 59% use it at least occasionally at work and 23% use it for the majority of their coding. Last year, only 42% used Rust at work.
From the survey’s results:
After adoption, the costs seem to be justified: only 1% of respondents did not find the challenge worth it while 79% said it definitely was. When asked if their teams were likely to use Rust again in the future, 90% agreed. Finally, of respondents using Rust at work, 89% of respondents said their teams found it fun and enjoyable to program.
As for why respondents are using Rust at work, the top answer was that it allowed users “to build relatively correct and bug free software” with 96% of respondents agreeing with that statement. After correctness, performance (92%) was the next most popular choice. 89% of respondents agreed that they picked Rust at work because of Rust’s much-discussed security properties.
Overall, Rust seems to be a language ready for the challenges of production, with only 3% of respondents saying that Rust was a “risky” choice for production use.

Thanks to Slashdot reader joshuark for submitting the story…

Read more of this story at Slashdot.

Library Intentionally Corrupted by Developer Relaunches as a Community-Driven Project

Last weekend a developer intentionally corrupted two of his libraries which collectively had more than 20 million weekly downloads and thousands of dependent projects.

Eight days later, one of those libraries has become a community controlled project.

Some highlights from the announcement at fakerjs.dev:

We’re a group of engineers who were using Faker in prod when the main package was deleted. We have eight maintainers currently….

What has the team done so far?

1. Created a GitHub org [repository] for the new Faker package under @faker-js/faker.
2. Put together a team of eight maintainers.
3. Released all previous versions of Faker at @faker-js/faker on npm.
4. Released the Version 6 Alpha.
5. Almost completed migrating to TypeScript so that DefinitelyTyped no longer needs to maintain its external @types/faker package.
6. Created a public Twitter account for communicating with the community.
7. Released the first official Faker documentation website….

Faker has never had an official docs website and the awesome Jeff Beltran has been maintaining a project called “Un-Official faker.js Documentation” for the last 3 years.

He gave us permission to re-use his work to create fakerjs.dev

8. Cleaned up tooling like Prettier, CI, Netlify Deploy Previews, and GitHub Actions.

9. Done a TON of issue triage and many, many PR reviews.
10. We’ve gotten in contact with the Open Collective and discussed a transition plan for the project.

We fully intend to extend Faker, continuously develop it, and make it even better.

As such, we will work on a roadmap after we release 6.x and merge all of the TypeScript Pull Requests in the next week….

We’re now turning Faker into a community-controlled project currently maintained by eight engineers from various backgrounds and companies….

We’re excited to give new life to this idea and project.

This project can have a fresh start and it will become even cooler.

We felt we needed to do a public announcement because of all of the attention the project received in the media and from the community.

We believe that we have acted in the way that is best for the community.

According to the announcement, they’ve now also forked the funding so the project’s original sponsors can continue to support the community-driven development in the future, while the original developers Marak and Brian “were able to retain the $11,652.69 USD previously donated to the project.”

Friday the official Twitter account for the new community project announced “It’s been a week. We’ve merged all of the active forks. Currently at 1532 stars. Looks like everything is settling.” [It’s now up to over 1,800 stars.]

One of the new maintainers has posted on Twitter, “I’m just grateful to the faker community that willed itself into existence and stepped up.”

Read more of this story at Slashdot.

GitHub Restores Account of Developer Who Intentionally Corrupted His Libraries

What happened after a developer intentionally corrupted two of their libraries which collectively had more than 20 million weekly downloads and thousands of dependent projects?

Mike Melanson’s “This Week in Programming” column reports:

In response to the corrupted libraries, Microsoft quickly suspended his GitHub access and reverted the projects on npm…. While this might seem like an open and shut case to some — the developer committed malicious code and GitHub and npm did what it had to do to protect its users — a debate broke out around a developer’s rights to do what they wish with their code, no matter how many projects and dependencies it may have.

“GitHub suspending someone’s account for modifying their own code in a project they own however they want spooks me a lot more than NPM reverting a package,” [tweeted one company’s Director of Engineering & Technology]. “I kind of love what Marak did to make a point and protest to be honest.”

An article on iProgrammer further outlines the dilemma present in what might otherwise seem like a clear-cut case…. “Yes, it is open source in that you can fork it and can contribute to it but does this mean that GitHub is justified in denying you the right to change or even destroy your own code?”

As of last night, however, it would appear that the entire affair is merely one for intellectual debate, as GitHub has indeed lived up to what some might view as its end of the bargain: the developer’s account is active, he has been allowed to remove his faker.js library on GitHub (depended upon as it might be), and has since offered an update that he does “not have Donkey Brains”.

Read more of this story at Slashdot.

Open Source Developer Intentionally Corrupts His Own Widely-Used Libraries

“Users of popular open-source libraries ‘colors’ and ‘faker’ were left stunned after they saw their applications, using these libraries, printing gibberish data and breaking,” reports BleepingComputer. “The developer of these libraries intentionally introduced an infinite loop that bricked thousands of projects that depend on ‘colors’ and ‘faker’.”

The colors library receives over 20 million weekly downloads on npm alone, and has almost 19,000 projects depending on it. Whereas, faker receives over 2.8 million weekly downloads on npm, and has over 2,500 dependents….

Yesterday, users of popular open-source projects, such as Amazon’s Cloud Development Kit were left stunned on seeing their applications print gibberish messages on their console. These messages included the text ‘LIBERTY LIBERTY LIBERTY’ followed by a sequence of non-ASCII characters… The developer, named Marak Squires added a “new American flag module” to colors.js library yesterday in version v1.4.44-liberty-2 that he then pushed to GitHub and npm. The infinite loop introduced in the code will keep running indefinitely; printing the gibberish non-ASCII character sequence endlessly on the console for any applications that use ‘colors.’ Likewise, a sabotaged version ‘6.6.6’ of faker was published to GitHub and npm….

The reason behind this mischief on the developer’s part appears to be retaliation — against mega-corporations and commercial consumers of open-source projects who extensively rely on cost-free and community-powered software but do not, according to the developer, give back to the community. In November 2020, Marak had warned that he will no longer be supporting the big corporations with his “free work” and that commercial entities should consider either forking the projects or compensating the dev with a yearly “six figure” salary….

Some dubbed this an instance of “yet another OSS developer going rogue,” whereas InfoSec expert VessOnSecurity called the action “irresponsible,” stating: “If you have problems with business using your free code for free, don’t publish free code. By sabotaging your own widely used stuff, you hurt not only big business but anyone using it. This trains people not to update, ‘coz stuff might break.”

GitHub has reportedly suspended the developer’s account. And, that too, has caused mixed reactions… “Removing your own code from [GitHub] is a violation of their Terms of Service? WTF? This is a kidnapping. We need to start decentralizing the hosting of free software source code,” responded software engineer Sergio Gómez.

“While it looks like color.js has been updated to a working version, faker.js still appears to be affected, but the issue can be worked around by downgrading to a previous version (5.5.3),” reports the Verge:

Even more curiously, the faker.js Readme file has also been changed to “What really happened with Aaron Swartz…?”

Squires’ bold move draws attention to the moral — and financial — dilemma of open-source development, which was likely the goal of his actions.

Read more of this story at Slashdot.

‘A Quadrillion Mainframes On Your Lap’

“Your laptop is way more powerful than you might realize,” writes long-time Slashdot reader fahrbot-bot.
“People often rhapsodize about how much more computer power we have now compared with what was available in the 1960s during the Apollo era. Those comparisons usually grossly underestimate the difference.”

Rodney Brooks, emeritus professor of robotics at MIT (and former director of their AI Lab and CSAIL) explains in IEEE Spectrum:

By 1961, a few universities around the world had bought IBM 7090 mainframes. The 7090 was the first line of all-transistor computers, and it cost US $20 million in today’s money, or about 6,000 times as much as a top-of-the-line laptop today. Its early buyers typically deployed the computers as a shared resource for an entire campus. Very few users were fortunate enough to get as much as an hour of computer time per week.

The 7090 had a clock cycle of 2.18 microseconds, so the operating frequency was just under 500 kilohertz. But in those days, instructions were not pipelined, so most took more than one cycle to execute. Some integer arithmetic took up to 14 cycles, and a floating-point operation could hog up to 15. So the 7090 is generally estimated to have executed about 100,000 instructions per second. Most modern computer cores can operate at a sustained rate of 3 billion instructions per second, with much faster peak speeds. That is 30,000 times as fast, so a modern chip with four or eight cores is easily 100,000 times as fast.

Unlike the lucky person in 1961 who got an hour of computer time, you can run your laptop all the time, racking up more than 1,900 years of 7090 computer time every week….

But, really, this comparison is unfair to today’s computers. Your laptop probably has 16 gigabytes of main memory. The 7090 maxed out at 144 kilobytes. To run the same program would require an awful lot of shuffling of data into and out of the 7090 — and it would have to be done using magnetic tapes. The best tape drives in those days had maximum data-transfer rates of 60 KB per second. Although 12 tape units could be attached to a single 7090 computer, that rate needed to be shared among them. But such sharing would require that a group of human operators swap tapes on the drives; to read (or write) 16 GB of data this way would take three days. So data transfer, too, was slower by a factor of about 100,000 compared with today’s rate.

So now the 7090 looks to have run at about a quadrillionth (10⁻¹⁵) the speed of your 2021 laptop. A week of computing time on a modern laptop would take longer than the age of the universe on the 7090.
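The article’s back-of-the-envelope figures are easy to reproduce. A quick sketch (the constants are the article’s approximations, not precise hardware specs):

```go
package main

import "fmt"

func main() {
	// Figures quoted in the article (rough approximations).
	const ips7090 = 100_000.0       // 7090: ~100,000 instructions/sec
	const ipsCore = 3_000_000_000.0 // modern core: ~3e9 sustained
	const chipSpeedup = 100_000.0   // "easily 100,000 times" for a 4-8 core chip

	// Per-core speedup over the 7090:
	fmt.Printf("per-core speedup: %.0fx\n", ipsCore/ips7090) // 30000x

	// A laptop running all week (168 hours) vs. 7090 time:
	years := 168 * chipSpeedup / (24 * 365.25)
	fmt.Printf("7090-years per laptop-week: %.0f\n", years) // ~1916

	// Transferring 16 GB at the tape drives' 60 KB/s:
	days := 16e9 / 60e3 / 86400
	fmt.Printf("16 GB over tape: %.1f days\n", days) // ~3.1
}
```

Those three results match the article’s “30,000 times as fast” per core, “more than 1,900 years” per week, and “three days” of tape shuffling.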

Read more of this story at Slashdot.