

It’s a real parameter.
The Polynesian people had many ways of detecting land far beyond the horizon: ocean currents, water temperatures, weather patterns, animal movements, and more. They used these techniques to island-hop all the way across the Pacific.
I have little doubt it was a well-informed theory before they got into their vessels.
Modern transistors aren’t just silicon, though. The silicon is doped with other elements, typically boron, phosphorus, arsenic, or gallium, among others.
Trump is notoriously a zero-sum true believer, despite that worldview being routinely mocked and disproven, and sociopathic besides. He fully believes that help given to people who are not him is wasteful and harmful to him, at least indirectly.
Or just use Linkwarden or Karakeep (previously Hoarder)
No, Richard, it’s ‘Linux’, not ‘GNU/Linux’. The most important contributions that the FSF made to Linux were the creation of the GPL and the GCC compiler. Those are fine and inspired products. GCC is a monumental achievement and has earned you, RMS, and the Free Software Foundation countless kudos and much appreciation.
Following are some reasons for you to mull over, including some already answered in your FAQ.
One guy, Linus Torvalds, used GCC to make his operating system (yes, Linux is an OS – more on this later). He named it ‘Linux’ with a little help from his friends. Why doesn’t he call it GNU/Linux? Because he wrote it, with more help from his friends, not you. You named your stuff, I named my stuff – including the software I wrote using GCC – and Linus named his stuff. The proper name is Linux because Linus Torvalds says so. Linus has spoken. Accept his authority. To do otherwise is to become a nag. You don’t want to be known as a nag, do you?
(An operating system) != (a distribution). Linux is an operating system. By my definition, an operating system is that software which provides and limits access to hardware resources on a computer. That definition applies wherever you see Linux in use. However, Linux is usually distributed with a collection of utilities and applications to make it easily configurable as a desktop system, a server, a development box, a graphics workstation, or whatever the user needs. In such a configuration, we have a Linux (based) distribution. Therein lies your strongest argument for the unwieldy title ‘GNU/Linux’ (when said bundled software is largely from the FSF). Go bug the distribution makers on that one. Take your beef to Red Hat, Mandrake, and Slackware. At least there you have an argument. Linux alone is an operating system that can be used in various applications without any GNU software whatsoever. Embedded applications come to mind as an obvious example.
Next, even if we limit the GNU/Linux title to the GNU-based Linux distributions, we run into another obvious problem. XFree86 may well be more important to a particular Linux installation than the sum of all the GNU contributions. More properly, shouldn’t the distribution be called XFree86/Linux? Or, at a minimum, XFree86/GNU/Linux? Of course, it would be rather arbitrary to draw the line there when many other fine contributions go unlisted. Yes, I know you’ve heard this one before. Get used to it. You’ll keep hearing it until you can cleanly counter it.
You seem to like the lines-of-code metric. There are many lines of GNU code in a typical Linux distribution. You seem to suggest that (more LOC) == (more important). However, I submit to you that raw LOC numbers do not directly correlate with importance. I would suggest that clock cycles spent on code is a better metric. For example, if my system spends 90% of its time executing XFree86 code, XFree86 is probably the single most important collection of code on my system. Even if I loaded ten times as many lines of useless bloatware on my system and I never executed that bloatware, it certainly isn’t more important code than XFree86. Obviously, this metric isn’t perfect either, but LOC really, really sucks. Please refrain from using it ever again in supporting any argument.
Last, I’d like to point out that we Linux and GNU users shouldn’t be fighting among ourselves over naming other people’s software. But what the heck, I’m in a bad mood now. I think I’m feeling sufficiently obnoxious to make the point that GCC is so very famous and, yes, so very useful only because Linux was developed. In a show of proper respect and gratitude, shouldn’t you and everyone refer to GCC as ‘the Linux compiler’? Or at least, ‘Linux GCC’? Seriously, where would your masterpiece be without Linux? Languishing with the HURD?
If there is a moral buried in this rant, maybe it is this:
Be grateful for your abilities and your incredible success and your considerable fame. Continue to use that success and fame for good, not evil. Also, be especially grateful for Linux’s huge contribution to that success. You, RMS, the Free Software Foundation, and GNU software have reached your current high profiles largely on the back of Linux. You have changed the world. Now, go forth and don’t be a nag.
Thanks for listening.
Bethesda was notorious back in the day for using uncompressed textures. Not lossless textures, just fully uncompressed bitmaps. One of the first mods after every game release just compressed and dynamically decompressed these to get massive improvements in load times and memory management.
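The trade-off those mods exploited can be sketched in a few lines. This is a hypothetical illustration using zlib on a stand-in for a flat uncompressed bitmap, not the actual format or tooling any Bethesda game or mod used; real texture mods used GPU-friendly formats like DXT/BCn, but the principle is the same: smaller files on disk, decompressed transparently at load time.

```python
import zlib

# Stand-in for an uncompressed 256x256 RGBA texture (mostly flat color),
# roughly what shipping raw bitmaps looks like.
raw = bytes([200, 180, 160, 255]) * (256 * 256)  # 256 KiB uncompressed

# "Mod" step one: compress the asset once on disk...
packed = zlib.compress(raw, 6)

# ...step two: transparently decompress it at load time.
restored = zlib.decompress(packed)

assert restored == raw  # lossless round trip
print(f"{len(raw)} -> {len(packed)} bytes "
      f"({len(packed) / len(raw):.2%} of original)")
```

Because disk I/O is usually the bottleneck, reading the small compressed file plus a cheap decompress is faster than reading the huge raw one, which is where the load-time wins came from.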
2 GHz does not measure its computing power, though, only the clock speed. Two very different things.
An objective measure is a simple benchmark:
Here’s a quad-core 1.5 GHz RISC-V SoC (the VisionFive 2) vs a quad-core 1.8 GHz ARM chip (the Raspberry Pi 400).
It’s not even remotely usable for anything but the most basic tasks: https://www.phoronix.com/review/visionfive2-riscv-benchmarks/6
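The "same clock, very different throughput" point is easy to demonstrate yourself. Here's a minimal, hypothetical micro-benchmark harness (not what Phoronix runs; their suites use real workloads like compression and compilation) showing the shape of an objective measurement: time a fixed workload, take the best of several runs, and compare that number across machines rather than comparing GHz.

```python
import time

def benchmark(fn, *args, repeats=5):
    """Run fn(*args) several times and return the fastest wall-clock time,
    which is the least noisy estimate on a mostly idle machine."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        fn(*args)
        best = min(best, time.perf_counter() - start)
    return best

def workload(n):
    # Toy integer workload; real benchmarks substitute compression,
    # rendering, compilation, etc.
    total = 0
    for i in range(n):
        total += i * i
    return total

elapsed = benchmark(workload, 1_000_000)
print(f"best of 5 runs: {elapsed:.4f}s")
```

Two chips at identical clock speeds can differ severalfold on this number, because instructions-per-cycle, cache, and memory bandwidth all matter.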
Not trying to be rude, but that’s a question of how the engine uses the CPU vs GPU implementation, not a measure of apples to apples.
Comparing modern games with CPU particle physics to the heyday of GPU PhysX, there is no comparison: CPU physics (including CPU PhysX) is more accurate, less buggy, and generally has little performance impact.
“Should I, ahh, write the check, sir?”
“No.”
I mean, does it work worse? UE4/Havok and Unigine all use CPU PhysX, and every other engine I know of uses a custom particle-physics implementation and seems far better at it than GPU PhysX ever was.
On the GPU, I remember PhysX being super buggy since the GPU calculations were very low precision, and that was only if you had an Nvidia card. It made AMD cards borderline unplayable in many games that were doing extensive particle physics for no other reason than to punish AMD in benchmarks.
Meh. PhysX emulation on the CPU has been outstripping the hardware implementation for a while, as far as I know.
Nvidia dropping a portfolio item to open source appears to only happen once they’ve milked it to death first.
Donate when you can tho
Yeah, but 1 Gb/s near me is $80/mo minimum…
1 Ethernet port does not a router make.
Wake me up when they finally do transparent aluminum.
They are. They’re shipping several RISC-V SoCs.
Yeah, they disappeared him and his family when he was just 6 years old, then announced a loyalist as the new Panchen Lama. That was 30 years ago.
This is correct for a given transaction, but there’s no consensus needed to open a Bitcoin wallet. That is usually just a private key in an encrypted envelope.
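That "private key in an encrypted envelope" idea can be sketched in a few lines. This is purely illustrative and not any real wallet file format (real wallets use standards like BIP-39 seed phrases and AES-encrypted containers), but it shows the core point: creating a wallet is local key generation, needing no network or consensus. The passphrase and the XOR envelope here are hypothetical stand-ins.

```python
import hashlib
import secrets

# A Bitcoin private key is, at its core, just 256 random bits.
# Generating one requires no network access and no consensus.
private_key = secrets.token_bytes(32)

# "Encrypted envelope": stretch a passphrase into a 32-byte pad with
# scrypt (a KDF real wallet formats also use) and XOR it over the key.
# Illustrative only -- real wallets use an authenticated cipher, not XOR.
passphrase = b"correct horse battery staple"  # hypothetical passphrase
salt = secrets.token_bytes(16)
pad = hashlib.scrypt(passphrase, salt=salt, n=2**14, r=8, p=1, dklen=32)
envelope = bytes(a ^ b for a, b in zip(private_key, pad))

# Decrypting is the same XOR with the same passphrase-derived pad.
recovered = bytes(a ^ b for a, b in zip(envelope, pad))
assert recovered == private_key
```

Consensus only enters the picture later, when a transaction signed by that key is broadcast and the network decides whether to include it in a block.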