
The Latest GitHub-Free Research: An Introduction or Update

Article by figosdev

[Image: The gearscape -- Microsoft interjects itself as a dependency to control the competition]

Summary: An examination of how dependent GNU/Linux distros have become on Microsoft's proprietary jail (GitHub)

When I started to find old favourites were hosted on GitHub, I became increasingly interested in what was there. The more I looked, the more I found. I've tried to get a better idea of the scope of the problem ever since.



Automation (scripting, mostly) can and does assist me in this, but there is no automated way to determine whether a project is GitHub-based or not. I'll check DistroWatch, Wikipedia, and frequently the project's own website, but some of the discoveries require going into the file itself -- particularly when that file is an ISO image.

"The goal is not to get perfect data (simply impossible with the number of people involved) but to get finer resolution and hopefully gain accuracy with each pass and research stage."I started with a list that reached about 100 to 120 popular applications and distros that are based on GitHub. It was a first effort, also the most casual, and when I "audited" the list later on, I found it "relying on GitHub" if someone happens to do development and issues there, and relying on it for hosting are all a problem in my opinion. That's what I'm looking for -- projects that can actually be hurt if Microsoft either pulls the plug or reduces access.

I'm certainly not impressed by their recent invitation to use formerly premium features. GitHub is a trap, and now they're opening the trap a little wider.

This is an imperfect science, but the more information we have, the better. Some of the research I'm doing builds on what I know, but some of the research is redundant by design, and acts as a check on previous research. The goal is not to get perfect data (simply impossible with the number of people involved) but to get finer resolution and hopefully gain accuracy with each pass and research stage.

"When I was ranking levels of distro dependency on GitHub, naturally I put actually being hosted and developed on GitHub as a key problem, if not the worst."The big picture matters as much as the details. Syslinux is more or less necessary in my opinion, and is not GitHub-based. But I found out while writing this that syslinux uses mkdiskimage -- a Perl script. Am I going to count this as needing GitHub? Probably not, but I'm going to make note of it here.

It is necessary, during each stage of this research, to choose a methodology. When I was ranking levels of distro dependency on GitHub, naturally I put actually being hosted and developed on GitHub as a key problem, if not the worst.

I'm trying to build a practical, useful idea of where the hope is, and where it isn't -- and each stage is practical in the sense that previous research sheds light on what I'm doing now, and vice-versa. Looking for applications helped me rank dependent distros. Ranking distro dependency helped me find new (and more vital) applications and frameworks. This is the best kind of research, where nearly every bit of it helps in some way.

"Tiny Core was one of the 33 most GitHub-free distros, of the 275 examined in my previous research."Ideally, we want all of the bad news we can find, and all of the good news we can find. As for how it's done, every methodology has pros and cons. Right now I'm trying to find the best way to examine Tiny Core Linux, and I've spent days looking at CorePlus in particular -- because I'm also trying to migrate to a less GitHub-dependent distro.

Tiny Core was one of the 33 most GitHub-free distros, of the 275 examined in my previous research. It's also one I'm familiar with; I was one of the early Tiny Core users after having purchased the book Shingledecker and Andrews did. Their disagreement over how to do things led to Tiny Core, and I think it's one of the better (beneficial) separations and forks that have happened in Free software. It's also fun to take apart and explore.

I discovered (or rediscovered, after years of being away) that TC uses .info files to map package dependencies. If you've ever written a recursive function, even if you don't do it all the time, it's very helpful for this sort of thing (I sometimes teach beginner-level coding, and recursion comes up but not a lot.)

This hastily-constructed Python function let me recursively create lists of packages that needed packages that needed packages that needed, for example, libffi.tcz:

#### license: creative commons cc0 1.0 (public domain)
#### http://creativecommons.org/publicdomain/zero/1.0/
import sys, os

# minimal stand-in for fig's figarrshell(): run a shell command, return its output as a list of lines
def figarrshell(cmd): return os.popen(cmd).read().split(chr(10))

which = sys.argv[1:]
if not which: which = ["libffi.tcz"]

def wget(ls):  # find every package whose .dep file lists a package named in ls
    copy = []
    for which in ls:
        now = "ls *.dep"
        for each in figarrshell(now):
            if each:
                pf = open(each)
                p = pf.read().replace(chr(13) + chr(10), chr(10)).replace(chr(13), chr(10)).split(chr(10)) ; pf.close()
                for ln in p:
                    if ln == which and len(ln) > 0:
                        pr = each.replace(".dep", "")  # x.tcz.dep -> x.tcz
                        if pr not in copy and pr not in ls: copy += [pr] ; print pr
    if copy: wget(copy)  # recurse on the packages found at this level

wget(which)



I also found that open().read() works slightly differently in CPython (GitHub) and PyPy (foss.heptapod.net), although figarropen() does work with PyPy as a drop-in replacement for Python 2. What's different is how promptly the file gets closed: I always assumed (without evidence to the contrary) that open().read() closes the file (as "with" does in Python), and CPython's reference counting does close it almost immediately, but PyPy's garbage collector may not, so you can run into the limit on how many files can be open at once.

In PyPy, it got to about 1,200 open files (I was opening 2,321 files but assumed they would be closed when read) before complaining that too many were open. I simply opened the file separately from the read() method, so I could run close() after read(). This would be a good change to add to fig as well, so it works better with PyPy. Even fig (most recent update, 2017) stands to benefit from this research.
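The fix amounts to a pattern like this -- a minimal sketch of the workaround, not the exact change:

# explicit open/close instead of open(path).read(): CPython's reference
# counting closes the anonymous file object almost immediately, but PyPy's
# garbage collector may not, so handles can pile up until the open-file limit
def read_text(path):
    f = open(path)
    data = f.read()
    f.close()
    return data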

But the methodology I'm using is to look at the 2,321 packages in TC 11.x, of which I've already dealt with 1,200 -- and find out in what ways Tiny Core "needs" GitHub, or what could be removed to make it less dependent.

I think we could strip out most of the dependencies -- not likely all -- but which ones? And where are they? That's the goal of this stage of research, to find out what is what and which is which.

"The fact that the most-needed "optional" package is based on GitHub (since May 2016) is quite relevant -- 739 (39%) of the 2,321 TC packages need it!"I've already got a list of how many packages need each package, ranked from most to least. Tiny Core is 16mb, CorePlus is 206mb. I know the biggest difference; CorePlus includes a lot more packages. But I've also used Tiny Core more than CorePlus, I know a lot of its limitations, and I'm largely focused on the packages themselves.

Most or all of these packages are technically "optional" (many of them need several others, but that's more or less so with packages in practically every distro) and of course, if you have a gtk3 app, it needs gtk3 for reasons that should be obvious. Be assured that I am no fan of dependencies taken to excess, but some dependencies have to be considered reasonable.

I don't know enough about libffi to judge its technical merits or lack thereof. I discovered it through this research (I write scripts mostly, not C or C++ and I don't use ctypes, though I know what it is) because it came up as the "most-needed optional package" in all of this. The fact that the most-needed "optional" package is based on GitHub (since May 2016) is quite relevant -- 739 (39%) of the 2,321 TC packages need it!

Of those 739 libffi-needing packages, 712 of them need glib2 as well (that's the GNOME lib, not GNU libc) and ALL of the glib2-based packages need libffi, so perhaps we have GNOME to thank once again. As much as I love blaming them for things, I am unconvinced they're really to blame this time, though the fact remains that if you were to get rid of glib2 then you would also get rid of all but 27 of the libffi-based packages. You would also get rid of GTK, I think, but I'm not suggesting we do that. We build the practical decisions on top of the hard data -- I'm more immediately interested in the data, but the time for decision-making will come.
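That overlap is easy to check from the lists the script above produces; here is a minimal sketch, assuming libffi.tcz.dp and glib2.tcz.dp were generated that way and sit in the current directory (the filenames and counts are from this research, not from any standard tool):

# compare the set of packages needing libffi with the set needing glib2,
# using the .dp lists created by the earlier script
def dp_set(name):
    f = open(name)
    lines = f.read().replace(chr(13), "").split(chr(10))
    f.close()
    return set([ln for ln in lines if ln])

libffi = dp_set("libffi.tcz.dp")
glib2 = dp_set("glib2.tcz.dp")
print(len(libffi & glib2))  # packages that need both (712 here)
print(len(glib2 - libffi))  # glib2 packages that do not need libffi (none here)
print(len(libffi - glib2))  # libffi packages that do not need glib2 (27 here)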

"Libffi is GitHub, glib2 is Gitlab-based (at any rate, not GitHub) though glib2 does pull in libffi."These are far from isolated examples, indeed the currently methodology is being built on examples like this, which makes them an ideal illustration. Libffi is GitHub, glib2 is Gitlab-based (at any rate, not GitHub) though glib2 does pull in libffi.

We want to show:

* As many of these relationships as possible
* With some method of ranking priority, so we know what to check
* In some way that shows possible outcomes based on various possible decisions

So the imperfect methodology used here tries to do all of that.

First, I created a folder called libffi. Libffi is GitHub-based, so if we want to get rid of GitHub entirely, we would have to get rid of every package that needs libffi. I don't expect that to happen, but I do want to provide as much information on that as necessary, so people can at least determine / rate / track / plan / troubleshoot how practical it is for Free software to be independent of a monopolistic company that has always wanted to enslave, tax, control and own it.

I moved all 739 (39%) of the TC packages which need libffi.tcz to the libffi folder. This seems almost too simple to be useful, but we keep going on our mission of discovery.
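For what it's worth, that sorting step can be scripted too; here is a rough sketch (the file layout and the libffi.tcz.dp filename are assumptions for illustration, not a record of the exact commands used):

# move every package listed in libffi.tcz.dp -- plus its .dep and .info files,
# when present -- from the current directory into a libffi/ folder
import os, shutil

f = open("libffi.tcz.dp")
needs_libffi = [ln for ln in f.read().replace(chr(13), "").split(chr(10)) if ln]
f.close()

if not os.path.isdir("libffi"):
    os.mkdir("libffi")

for pkg in needs_libffi:  # e.g. "firefox.tcz"
    for name in [pkg, pkg + ".dep", pkg + ".info"]:
        if os.path.exists(name):
            shutil.move(name, os.path.join("libffi", name))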

The next step was to create a folder for glib2. It's not because glib2 depends on libffi -- I didn't know that yet. I learned it because glib2.tcz is next on the list (712 packages need it) and none of those packages were left in the pool -- they were all in the libffi folder. Now we're learning something.

"The numbers alone do not excite me, though the picture I'm trying to uncover, quantify and find the "shape" of, is of interest."So instead of creating a glib2 folder, I created a libffi/glib2 folder. Most (in fact, all) of the glib2-based packages need libffi, so I just put them under libffi -- but now we know which libffi packages do need glib2 and which don't. Our little hierarchy shows all of that data for us to base decisions on.

And I'm well aware that this system isn't going to stay perfect or neat as it moves forward; it isn't designed to. My only goal here is to outdo (build on) the previous GitHub-free research, so that new strategies can be gleaned from this. I learned plenty from examining 275 distros; now I want to examine this one in detail. It just happens to be very good for the purpose -- this will tell us about a lot more than just Tiny Core. It already tells us something about libffi and (most likely) about gtk applications. Some of these things will turn out to be specific to Tiny Core. Many will apply broadly. That could be another research project.

Moving forward, I'll share the current hierarchy as it is created. It's not perfectly self-documenting, so I'll document what's there so far (this part is the "introduction" referred to in the title.)

core/libffi: the packages that need libffi.tcz

core/libffi/gitlab: major (next/high on the list) gitlab-based rather than GitHub-based packages that need libffi.tcz

core/libffi/gitlab/glib2: glib2 is gitlab-based; the packages in this folder need both glib2.tcz and libffi.tcz

---

core/selfhost: (next/high on the list) self-hosted (not GitHub or gitlab) packages

core/selfhost/liblzma: liblzma.tcz is next on the list, self-hosted (part of xz utils) and here are the packages that need it.

"People who do databases might correctly assume that an RDB would be better way to organise this data."The list of course, is incomplete -- that's acceptable for this methodology. We already have a full list of required packages for each package; that's how we know there are 689 packages that need liblzma (making it #3 on the list, after glib2) -- we counted them!

For the purpose of this research, it's (only sometimes arbitrarily) more relevant that a package needs a more-needed package than a less-needed package. So instead of the 689 packages we already have a list of, the liblzma sub-hierarchy simply holds the ones "remaining" after "larger" hierarchies are counted. Including subfolders, core/selfhost/liblzma contains 97 packages; the others are needed also by "bigger" "more important" (debatable, hence "sometimes arbitrary") packages that we already sorted.
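The per-folder counts are nothing fancier than walking the hierarchy and counting entries; a minimal sketch, assuming one .tcz file per package was moved into each folder:

# count package entries under a folder in the sorting hierarchy, subfolders included
import os

def count_packages(top):
    total = 0
    for root, dirs, files in os.walk(top):
        total += len([f for f in files if f.endswith(".tcz")])
    return total

print(count_packages("core/selfhost/liblzma"))  # 97 at this stage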

We could simply get all the data on every package. By "we", I mean people I don't know who haven't signed up to do anything about this. Since I'm doing this, I'm using this imperfect system to discover new areas of focus -- I probably can't make myself do detailed research on EVERY one of the 2,321 packages, so I'm using this system to discover the "most important" problems, based on a methodology that has both pros and cons.

I wouldn't bother with this if I didn't think this approach would shed additional light on the bigger picture. The numbers alone do not excite me, though the picture I'm trying to uncover, quantify and find the "shape" of, is of interest.

"As of this writing, GitHub is only being used for squashfs-tools that create the file systems; the development of squashfs support for the kernel is still on kernel.org."Since this methodology "reveals" that liblzma is important -- and I've learned other things too, this method helps me decide where to pay more attention: I learned that liblzma is part of xz utils, which I didn't know; and I didn't know that xz utils was started by Slackware enthusiasts -- which is both cool, and maybe says something else (something fundamental) about liblzma. You decide what it means to you. This hierarchical system serves as a score -- and for me, a curriculum.

Which leads to the next concept -- noting projects that use GitHub within the hierarchy (where I can spot them or find new ones.)

core/selfhost/liblzma/github: projects based on GitHub that need liblzma.tcz. It's liblzma that is self-hosted, not the projects in this subfolder; otherwise this path would contradict itself.

People who do databases might correctly assume that an RDB would be a better way to organise this data. They would probably be technically correct (I don't normally use databases) but they would be missing the point that we don't actually have the data yet -- the only reason core/selfhost/liblzma/github exists is because core/selfhost/liblzma "told me" to take some time to look for GitHub-based things that needed liblzma.

Will I find them all now? Probably not, but this system informed me to take time on this. We keep trying to shape the data based on relevance, not unlike the earliest versions of the (once fairly straightforward, but also easily-gamed if you want website prominence) PageRank algorithm.

"I figured that if I wanted to focus on fixing this GitHub dependency, I could simply make a live distro that uses a file img formatted with ext3 instead."Incidentally, core/selfhost/liblzma/github includes squashfs-tools (a feature very important to Tiny Core and also to most Live distros, as noted in my previous article) but because it came up again here, I had another look. As of this writing, GitHub is only being used for squashfs-tools that create the file systems; the development of squashfs support for the kernel is still on kernel.org.

In practical terms, this suggests a hypothetical, completely GitHub-free distro could be used to create a tool that reads and converts .sfs files to some other compressed filesystem, though a GitHub-free tool could not (at this time) be made to produce new .sfs images -- only convert them to something else.

I figured that if I wanted to focus on fixing this GitHub dependency, I could simply make a live distro that uses a file img formatted with ext3 instead. How to do compression on the fly could be a separate issue, but I know that alternatives exist.

core/selfhost/ncursesw: means that ncursesw is self-hosted, and this is where the packages that need ncurses.tcz go. Originally there were 665 of them, though now we have 156 remaining due to "more important" packages grabbing those in the hierarchy.

core/selfhost/ncursesw/github: packages moved from ../ to ./ which are based on GitHub, or 25 of 156 packages including: python.tcz (CPython, GitHub), urwid.tcz, tmux (sorry, Roy), inxi.tcz, htop.tcz, freebasic.tcz and vim.tcz.

I think Vim is one of those few things where I'm never sure whether to say it's really relying on GitHub or not. Since I'd hate for Microsoft to end the editor wars with a cure far worse than the disease, I hope someone can give me some truly authoritative evidence that Vim is, in fact, GitHub-free. Another thing I found out while doing this is that ncurses is maintained by the same person who maintains the Lynx browser -- and Vile. Vile appears to be GitHub-free, but this research will help determine the validity of that statement. (Vile is not packaged in TC 11.x.)

core/github: was created at this point, for the 43 packages that are actually listed in the .info file as being GitHub-based, minus at least one (pax-utils.tcz, as Gentoo's GitHub is a mirror.)

"With over 1,200 files in these folders, more than 50% of the packages in TC are now sorted into the hierarchy."core/selfhost/libXau-and-libXdmcp: is related to X and these two packages had identical lists, except for libXau-dev.tcz and libXdmcp-dev.tcz, respectively.

core/selfhost/libXau-and-libXdmcp/github contains 12 packages, including wbar.tcz, i3.tcz, aterm.tcz (AfterStep is GitHub-based) and fltk-1.3.tcz.

core/libxcb: was created, and should probably be moved to selfhost, though it has no packages anyway because libxcb-dev.tcz is already in core/selfhost/libXau-and-libXdmcp.

core/libX11: was created and should probably be moved to selfhost, though there's nothing in it.

core/bzip2-lib: has 48 files, including a bunch of Perl-related packages in core/bzip2-lib/github -- Perl is GitHub-based.

With over 1,200 files in these folders, more than 50% of the packages in TC are now sorted into the hierarchy. Some undiscovered ties to GitHub remain, but this process has helped find and rank new ones that are obviously important in some way.

I'm still interested in moving further down the list; the next is libXext.tcz and there are 585 packages that need it. If we try to discover how many of those 585 packages remain...

for p in $(cat ../libXext.tcz.dp) ; do ls ../$p 2> /dev/null ; done | cat -n

...Nope. Nothing there that isn't already in the hierarchy. libXext.tcz.dep is the file that TC provides that shows a single level of dependency, libXext.tcz.dp is the file that the Python code in this article created for libXext.tcz, which shows all the packages that need it.

"Days into this, we've confirmed that TinyCore is indeed one of the least-GitHub-dependent distros, but we've also identified some the more important ways in which it is still dependent indirectly on GitHub."We can use this to create a graph of diminishing returns on this research. Days into this, we've confirmed that TinyCore is indeed one of the least-GitHub-dependent distros, but we've also identified some the more important ways in which it is still dependent indirectly on GitHub.

I thought about making that graph, but since it's likely to be typical and not reveal anything that isn't obvious, I'm just going to watch a movie, eat some eggs and maybe think about getting back to this research. I'm sure it sounds terribly boring, but I continue to learn more about this subject as I explore it.

When this started, I hadn't even thought to start with the most-needed packages -- the first thing I wanted to know was how many packages pulled in mono.tcz, or Perl, or Python. Mono is not only GitHub-based, it's one of the worst dependencies you can have. Fortunately, the only packages that pull in mono.tcz are gtk-sharp-dev.tcz, gtk-sharp.tcz, mono-dev.tcz and mono-locale.tcz. I'm only guessing that wine-mono.tcz assumes mono.tcz is installed.

If you're trying to figure out how we can be GitHub-free in the future, I can probably save you some work -- and if you have information that could be useful, by all means, let us know. With luck, this is going to help round out the wiki pages a bit as well.

Long live Stallman, and happy hacking.

Licence: Creative Commons CC0 1.0 (public domain)
