01.22.22

From Software Eating the World to the Pentagon Eating All the Software

Posted in Free/Libre Software, GNU/Linux, Microsoft at 11:03 am by Dr. Roy Schestowitz

Video download link | md5sum 20c69b9ab91e5bd87233a45e3cbd94a7
Plunder by Platform Domination (Centralisation)
Creative Commons Attribution-No Derivative Works 4.0

Summary: “Software is eating the world,” according to Marc Andreessen (co-founder of Netscape), but the Empire Strikes Back (not the movie, the actual empire) by hijacking all code by proxy, via Microsoft, just as it grabbed a lot of the world’s communications via Skype, bypassing the world’s many national telecoms; coders need to fight back rather than participate in racist (imperial) shams such as GitHub

IN this latest series from Dr. Andy Farnell (see this morning’s installment, Peak Code — Part II) we see an interesting new framing of the exploitation of Free software by the very actors this software was meant to replace. They’re looking for workarounds and legal hacks by which to rob hackers. To some degree, as we cautioned some months ago, they’re succeeding, and Microsoft/GitHub is by far the number one threat to software freedom. It has gotten so bad that Microsoft took over the OSI, and today’s Linux Foundation is outsourcing almost all of its supposedly “Open Source” projects (openwashing) to Microsoft’s proprietary software vault/prison.

As we noted in our ongoing series (we will be publishing Microsoft GitHub Exposé — Part XVI a week late due to Monday's epic hardware crash), GPL violation en masse was part of the plan. GitHub is a massive attack (not the music group) on Free software, originally envisioned as a means of detecting code defects (not licence violations) but eventually twisted into a mass plagiarism tool perfumed as “Hey Hi”. As noted earlier this month, Microsoft now leverages GitHub to confiscate code, taking it away from the original coders. Who would wish to consciously participate in such a heist? Cui bono?

Peak Code — Part II: Lost Source

Posted in Free/Libre Software, GNU/Linux at 12:01 am by Guest Editorial Team

Article/series by Dr. Andy Farnell

This work is licensed under version 4.0 of the Creative Commons CC-BY-SA license

Series parts:

  1. Peak Code — Part I: Before the Wars
  2. YOU ARE HERE ☞ Lost Source

A light sword

Summary: “Debian and Mozilla played along. They were made “Yeoman Freeholders” in return for rewriting their charters to “work closely with the new Ministry in the interests of all stakeholders” – or some-such vacuous spout… because no one remembers… after that it started.”

I was a Free Software “hacker”. The nights were late, the pay was… nothing. We were all volunteers. There was no recognition, just a sense of being part of something. But oh boy, were we part of something! We felt like we were building history. I made companies. I wrote applications. I taught new hackers.

All things pass. Much changed between the great pandemics and the mid-century storms when skyscrapers fell like dominoes. But I remember the software crisis starting. No great conspiracy. No revolution. No foreign hackers. No mythical “software wars”. How suddenly it all blew up before that week when the food deliveries stopped and the lights went out. How many had already been on the edge, not knowing about one another or what was happening? With “disinformation” outlawed, we were swaddled, blind, clothed by the machine. Then, so suddenly, here, naked and together.

That old Malthusian worrier, your uncle Archie said it, “One day the code will run out. Everything runs on code, but it’s not sustainable”. We all laughed at him. Everyone knew software had zero cost and was inexhaustible. There would always be kids who wanted to write it, to prove something, to scratch an itch. Besides, machines would soon write all the code we’d ever need.

That must have been “peak code”. You don’t notice peak anything while you’re living through it. By definition, it’s the golden moment. Those days there were hundreds of languages, millions of coders and billions of devices. Software pulsed and flowed, in hourly updates, through the Internet into the gadgets that ran our lives. Secure Software, nourishing the always-on, always pumping machine. Then like all hearts, it just grew old, tired and sick, and one day it gave up. Some spirit within it died and the software went away.

Hired coders never cared. In their short, exhausting careers they plastered libraries on top of libraries, dependencies all the way down. To where? Nobody remembered. Maybe those few strange people who hacked not for money, but because it made them feel good?

Old words from before The Face Chain and The Age of Legibility, “vocations”, “callings”, “civic duty”, seem senseless now. By the thirties, only performative activity validated by public perception telemetry and backed by a smart contract could earn credit.

Graeber described “moral resentment”. Hate of care. Within a decade it wasn’t just overt, it was policy. Helping a neighbour or family member might be overlooked. The Humans First Bill sealed it. Nurses and teachers, medics, firefighters, police, child-carers, all gone. “If a bot could, a bot should”. Interpersonal Disorder, from a mid-century copy of the DSM, describes a “pathological desire to interact with or serve other humans rather than accept convenient rational transaction with the machine”.

Momentum, aspiration and the inability of the masses to comprehend the decline kept things buoyant throughout the late twenties and thirties. Who knew the giant corporations could no longer sustain their own code? Things advanced too fast. Complexity and dependency went too deep. Education faltered. The “third industrial revolution” quietly ran out of steam.

“Free” coders did still exist. They still believed that “Software Freedom” as prescribed by the great Stallman could open a doorway out of enslavement. In practice authorities turned a blind eye. These farm animals were obliviously in service of the BigTeks, who harvested their code to fuel the machine.

Negative wages? That had an effect. Suddenly we were all supposed to pay for the privilege of keeping BigTek afloat?! Students, the only group who pay to work, rushed to fill the jobs without complaint. It was cheaper being a code worker than staying in education. Average age of the tech workers fell from 41 to 22 in a decade, expunging the entire body of active wisdom – those who knew how stuff worked.

Some techies whispered of the great “Techxit”, when all the creators and developers were supposed to stop coding in protest at the Face Chain. It never happened. Fear kept them in line. Not fear of losing income; such crude social control policies were so 20th century. To take away a person’s purpose was the new cruelty of power. Losing your access to code or gaming often led to suicide.

Something was slowly shifting. Years before, in China, it had been “Tang Ping”, which ended in the “code for food” camps. In the USA a “Great Resignation” was successfully dismissed by social control media as disinformation. Some withdrew or poisoned their own libraries in protest, but their works were seized, reverted and stripped of their names by the Ministry of Code.

When the first SMMCs (“security mandated maintenance changes”) were issued, paying coders dutifully went along, virtuously signalling that it was the “responsible” thing to do. I would say it happened right there. Those first seeds were sown into the depleted soil of free software, captured by its new master of “public necessity”. From there the weeds would slowly spread.

BigTek wanted to be the new banks, too big to fail. To show the vestiges of government who was boss, the “three-day weeks” came. Staged “security crises” lasted months, as the infamous Goldberg, alleged leader of Eponymous, “attacked our precious infrastructure”. Some people learned how to store electricity, offline data and food, but those who died could not hack the DRM of their solar batteries and home appliances, or get past the “Life Rights Management” for online access.

BigTek’s right to extract from Free coders’ “hobby projects”, now declared “critical infrastructure”, was official at last. GitHub underwent some re-branding. Accounts flipped to read-only, then locked, and then one day it became “The Ministry of Code”. In the blink of an eye Microsoft appropriated nearly ninety percent of all ‘Free Open Source’ software, to “ensure stability”. They kept the “messaging” light and positive – thanking all past contributors for their hard work over the years. It was, in all but name, the largest land-grab since William’s rule in 1066.

The Free Software Foundation remained dutifully quiet, helping deliver the peasants to their feudal lords. Debian and Mozilla played along. They were made “Yeoman Freeholders” in return for rewriting their charters to “work closely with the new Ministry in the interests of all stakeholders” – or some-such vacuous spout… because no one remembers… after that it started.

01.21.22

Computer Users Should be Operators, But Instead They’re Being Operated by Vendors and Governments

Posted in Deception, Free/Libre Software, GNU/Linux, Google, Security at 2:07 pm by Dr. Roy Schestowitz

Video download link | md5sum eea32ef00e491a975a1c16d6e11cf169
Treating Computer Users as Enemies
Creative Commons Attribution-No Derivative Works 4.0

Summary: Computers have been turned into hostile black boxes (unlike Blackbox) that distrust the person who purchased them; moreover, from a legislative point of view, encryption (i.e. computer security) is perceived and treated by governments like a threat instead of something imperative — a necessity for society’s empowerment (privacy is about control and people in positions of unjust power want total and complete control)

THE first part of Dr. Andy Farnell’s series, dubbed Peak Code, was published a few hours ago. Minutes after it had been published I wanted to interject personal thoughts and opinions. I decided to do this in the form of a video. During the video I was hoping the article would become available over gemini://, but this did not happen due to server issues that have since been resolved. We've had a tough week when it comes to uptime, primarily due to a hardware catastrophe that resulted in a much-needed and long-overdue upgrade.

The gist of my take is that we’re dealing with a bizarre world of a fake security paradigm [1, 2], wherein the owners and users of computers are presumed to be enemies and aren’t trusted by the machines they paid for. Instead, those computers trust vendors and oligarchs, who basically exercise remote control and treat the user with great suspicion. How did we get to such an awful status quo (still getting worse) and, more importantly, how do we get out of it? This is a subject we recently discussed in the context of automobiles.

Peak Code — Part I: Before the Wars

Posted in Free/Libre Software, GNU/Linux at 8:54 am by Guest Editorial Team

Article/series by Dr. Andy Farnell

This work is licensed under version 4.0 of the Creative Commons CC-BY-SA license

Series parts:

  1. YOU ARE HERE ☞ Before the Wars

Wars

Summary: “In the period between 1960 and 2060 people had mistaken what they called “The Internet” for a communications system, when it had in fact been an Ideal and a Battleground all along – the site of the 100 years info-war.”

They ask me, “Grandad, what did you do in the software wars?” I went mad. No, that I cannot say. I want to tell them, “I coded for the resistance”. But even today it is hard to talk about. Truth is, I was a coward. The “software wars” never really happened… not like people say. They’re a story we tell today about how things fell apart. Sure, I was a great hacker, but like all the others I just gave up, and that’s how we beat ‘em… if this is winning.

I think people knew it was coming. Digital technology had always created conflict. The humanists stood up to “bullshit jobs” and AI dehumanisation, and were dubbed Neo-Luddites. In the second and third crypto wars people fought for the right to private communication, and then for the right to use open plaintext protocols again, without mandated recipient codes. In the fog of creepy government lies, contradiction and hypocrisy multiplied. People got confused. It took another 20 years to realise that encryption, for or against, had nothing to do with it. It was always about control.

Digital communications technology organised on large scales seemed inherently troublesome for meaningful human choice. But sometime in the last century it stopped serving people altogether. I remember your friend dying because her mother couldn’t call an ambulance. Maybe, if it hadn’t been for that Twitter thing with her work, if she hadn’t been disconnected, things could have been different. The neighbours she blamed, they weren’t bad people. Just afraid, like everyone. Unauthorised assistance meant certain disconnection. Besides, going outside for help, too risky for someone facebanned like her mum… one step too close to the Ring, a passing vehicle or stranger wearing Glasses would be enough.

The Semantic Wars (what they once called Culture Wars) had already made it impossible for anyone to talk to anyone else. Our fight to be heard amidst the trollbot armies, deepfake speech and meme mafia, was lost. The quest to have our words “mean what we mean” had silenced our voices and stripped away meaning. The Great Communication Breakdown was not a lack of will to talk, or “polarisation”, but a failure of the medium of communication itself.

In the mid-twenties the Security Wars raged, to decide whose security was most important – the vendors or the ‘users’. Until those days most of us had regarded security as a shared value, a tide to raise all ships. A more painful truth is that under surveillance capitalism security is a zero-sum game. With that kind of “security”, your security is my insecurity, and vice versa.

I had never wondered, until that time, how Orwellian contronyms arose, but soon we were divided into two camps. So-called “Secure Software”, rubber-stamped by quasi-governmental corporations, was laughably insecure. Everyone knew it. But once governments had invested their pride, maintaining the pretence became a political priority. It gave the corporations a monopoly on commerce, medical and even educational computing. For a while the economy boomed for the certifiers, auditors, inquisitors, insurers and adjusters.

For everyone else, there was “Insecure Software”, also called “Free Software” by older people. That was the stuff you needed if you really wanted safe and stable systems. Once Google and Microsoft controlled access to half the world’s “secure” computers, society largely operated despite, not by, its consumer-communist institutions. Software became a Soviet-era black-market. For everything you needed there was an official solution, and an illegal “Insecure” one that actually worked.

So, please understand that by the time of the so-called Software Wars, everything was already “at war”. Indeed, as historians now note, in the period between 1960 and 2060 people had mistaken what they called “The Internet” for a communications system, when it had in fact been an Ideal and a Battleground all along – the site of the 100 years info-war.

01.17.22

The GUI Challenge

Posted in Free/Libre Software, GNU/Linux at 5:13 pm by Guest Editorial Team

Authored by Andy Farnell

Free red light

Summary: The latest article from Andy concerns the Command Line Challenge

The wonderful Cheapskate’s Guide is currently running a “One Week Command Line Challenge”. Some of the students I teach now are so young (to an old beard like me) that they think this is some “crazy new thing”. Is there new hope and a new perspective to be explored here? Something other than retro and cool. Perhaps the historical baggage, the narrative of how “superior” graphical interfaces replaced “old” consoles, is an obstacle to new visions for the next generation?

As a lifelong textual user interface (TUI) user this got me thinking. If you were to give me “The GUI Challenge” I’d be sunk! My world (dwm, emacs, w3m etc) feels so familiar, it’s in my bones. After thirty or forty years on the command line if I were forced to use “normal computers” it would cripple my ability to do anything.

The command-line is super empowering, but particular. Put me on a Mac or Windows machine and I revert to a child-like flap, randomly clicking around on icons that look promising. I’d be twenty times less productive than my peers, yet, modesty be damned, I’m ten times more effective/productive at average computing tasks than other professionals when in my comfort zone – at the command-line. Isn’t this true for us all, that we have our comfy shoes?

Of course this isn’t about some innate inability to use graphical tools. I’ve mastered some jolly complex ones like Blender and Unreal editors (virtual world building), and ProTools or Ardour (for sound and music). One of the most complex I recall was a VLSI/CAD creator that used two four-button mice (or mouse and ball).

So, is the command line challenge unfair? I am no more capable of quickly learning a new graphical paradigm than an entrenched GUI user is of adopting the keyboard and console. This probably applies at any age or ability level where you are comparing like-for-like paradigm switching.

No, the issue here is deeper and is about utility paradigms. How do people relate to computers as tools at the highest level – at the operating system level and above?

If you dig back in the Usenet and mailing-list archives, you’ll find fascinating, passionate and intelligent debates on the merits of different interfaces going right back to Xerox-PARC. They are really separate computing cultures. There’s a fair historical summary here.

The above history ends in 2001. GUIs did not end there, the debate has moved further, and many new things have not been well analysed. Mobile, which essentially emulates button-based handheld appliances, cannot really be compared to GUI (in its traditional sense), even though it’s technically a computer running a graphical interface.

It’s only since about 2010 that the GUI function of abstracting (hiding away complexity) was subverted by wicked corporations to hide away deception and to effect control. This shift from the abstract to the abstruse and obstructive is what we sometimes call “Dark Computing Patterns”, but really it goes deeper than that – visual computing is its own realm of psychology, politics, semiotics, iconography and subterfuge that in many cases thoroughly bastardises the function of computers qua “tools”.

The GUI/TUI debate can be framed in many ways; preference, freedom, extensibility, cognitive overhead, portability, control (tweakability), depth of understanding (legibility), and more.

For me, tool longevity and stability are important. I still use the same applications and skills I learned in 1980. Some people, foolishly I think, imagine that to be a bad/anti-progressive stance. One of the most underrated abilities in computer programming is knowing when something is finished. As is the ability to just use something instead of worshipping it as a digital artefact (cue NFT “first editions” of brand apps).

By contrast many of my colleagues must re-learn their entire productivity stack every few months at the whim of corporate developers or seemingly random events in “the market”. I literally hear them anthropomorphising:

“Oh, Slack won’t let me do that now”

“Oh, Google ate my email”

“Sorry, something broke, can you resend it please?”

Their “computers” are chaotic mystery machines, magic fun fairs where superstitious ritual ministrations must be performed. This sort of Scooby-Doo “clown computing” has no place in serious business, in my opinion. So, another hugely underrated quality that TUIs favour is stability.

Where did this mess come from? In the 1980s “home computers” created a culture of their own, and from there Apple and Microsoft needed to counter a socially constructed but actually mythical “fear” of computers as nerdy and silly, but also “dangerous”. Remember granny worrying that it would “blow up” if you typed the wrong thing?

Continuing a culture of sysadmins from the time-sharing Unix days, we created the “user” as a particular stereotype. To put it quite bluntly, we manufactured “users” to be idiots. Indeed, use of the word “users” instead of a more neutral term like “operators” is significant. The developer-user relationship today is a power relationship, and often an abusive one (in both directions).

In fact, denigrating attitudes have their roots in the fragility of early software development. The “user” was an enemy who would always find ways to break our software and exhibit extraordinary “stupidity” by failing to understand our non-obvious interface puzzles. We used tropes like P.E.B.K.A.C. and “lusers”, and treated others with disrespectful and superior smugness.

Computing had its hashtag moment, and markets demanded that perceptions change. Microsoft solved the problem by erecting some soothing blue fire-hazard cladding around a crumbling DOS. Underneath, exposure to “The Registry” was like staring directly into the open core of Chernobyl.

At that point, enter Apple, who could play Good Cop, adding value by simply subtracting (or consolidating) features. For many, Steve Jobs was elevated to the man who “invented computers”. For a certain generation, he did. The ancient science of HCI (human computer interaction) was beaten and disfigured into the designer denomination of UX/UI that emphasised intuition, feel, and experience, which in turn ushered in the age of performative productivity. This trajectory of form over function culminated in neurotic obsessions with $2000 disposable thin laptops and the Onion’s infamous Apple Wheel parody that confused many as to whether it was a genuinely good idea.

Meanwhile the command line simply kept calm and carried on. Nothing changed in 30 years. Those who ran the servers, databases, scientific and technical applications never strayed far from the console, except where “presentation” demanded. However, through the mass media and advertising, digital technology became synonymous with these corporate veneers over actual computers, while Hollywood made the command-line a glowing green preserve of malcontents bent on destroying civilisation.

So, although the Command Line Challenge is fun – and I hope it inspires some people to go beyond their comfort zone – let’s be aware that human factors, history and politics play a greater role behind the scenes. Yes, it’s about mental models, rote motor skills and habits, rather than any intrinsic good or bad. But it’s also about culture and popular ideas of what a computer “is”.

The emphasis of Cheapskate’s article is on TUI allowing the use of older computers. That’s a very topical and important concern in the age of climate emergency. If readers don’t know already about books like Gerry McGovern’s World Wide Waste, I urge you to read more about e-waste. Making the connections between textual interfacing, more modest tech-minimalist use, and a better society and healthier planet, isn’t obvious to everyone.

There are many reasons people may prefer to return to the command line. I vastly prefer TUIs for another reason. As a teacher I deal in ideas, not applications, so it’s a way of imparting lasting concepts instead of ephemeral glitter. Commands are connections of action concepts to words, essential for foundational digital literacy. Almost everything I can teach (train) students to use by GUI will have changed by the time they graduate.

For younger people the difference is foundational. My daughter and I sit down together and do basic shell skills. She can log in, launch an editor, play music and her favourite cartoon videos. We use Unix talk to chat. It’s slow, but great fun, because character-based comms are very expressive, as you can see the other person typing. She’s already internalising the Holy Trinity – storage, processing and movement.

To make this work I obviously customised bash, creating a kind of safe sandbox for her with highly simplified syntax. This week we are learning about modifier keys – shift is for SHOUTING and control is to CANCEL (you can’t get around needing to teach CTRL-C). What we are really working on is her typing skills, which are the foundation of digital literacy in my opinion. I think at the age of 5 she is already a long way ahead of her school friends who paw at tablets.
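For readers curious what such a setup might look like, here is a minimal sketch of a child-friendly bash profile. It is an illustration only, not the author’s actual configuration: the alias names, file paths, the choice of nano as the editor, mpg123/mpv as players and the “mum” account used with talk are all assumptions.

  # ~/.bashrc fragment for a child's restricted account -- an illustrative
  # sketch only; names, paths and programs are assumptions, not the author's setup.
  export PS1='> '                 # a short, friendly prompt
  export EDITOR=nano              # a forgiving editor for first lessons

  # Word-like commands: each one ties an action to a single concept.
  alias music='mpg123 ~/music/*.mp3'       # "music" plays the music folder
  alias cartoons='mpv ~/videos/cartoons/'  # "cartoons" plays saved videos
  alias chat='talk mum'                    # classic Unix talk to a local user
  alias write='nano'                       # open the editor with a plain word

  # Soften the sharp edges: ask before anything destructive.
  alias rm='rm -i'
  alias mv='mv -i'
  alias cp='cp -i'

  cd ~    # always start in her own home directory

A restricted shell (bash -r) or a whitelisted PATH could tighten this further, but the point of the exercise is the simplified vocabulary and the typing practice rather than hard confinement.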

In conclusion, then, the TUI/GUI saga is about much more than interchangeable and superficial ways of interacting with computers. In its essence it is about literacy, the ability to read and write (type). Behind, and ahead of it, are matters of cultural importance relevant to education, autonomy, democracy, self-expression, and the economy. So if you’re a mouser or screen smudger, why not give Cheapskate’s challenge a try?

01.16.22

[Meme] Gemini Space (or Geminispace): From 441 Working Capsules to 1,600 Working Capsules in Just 12 Months

Posted in Free/Libre Software at 6:58 pm by Dr. Roy Schestowitz

Gemini capsules

1600 working Gemini capsules

Summary: Gemini space now boasts 1,600 working capsules, a massive growth compared to last January, as we noted the other day (1,600 is now official)

Gemini Rings (Like Webrings) and Shared Spaces in Geminispace

Posted in Free/Libre Software at 5:51 pm by Dr. Roy Schestowitz

Video download link | md5sum 00694971abd0b010920fe27e4fa4f1d5
Connected Communities in Geminispace
Creative Commons Attribution-No Derivative Works 4.0

Summary: Much like the Web of 20+ years ago, Gemini lets online communities — real communities (not abused tenants, groomed to be ‘monetised’ like in Facebook or Flickr) — form networks, guilds, and rings

THE ‘old’ Web was, in a lot of ways, far more charming than today’s bloated and hostile ‘Web’, which became increasingly proprietary (Web browsers are rapidly becoming little more than a canvas for proprietary software that runs remotely).

The original Web emancipated people, whereas today’s Web mostly oppresses them. “Each service we use that’s operated by someone other than ourselves is another point of failure in our lives,” one Gemini blogger recently noted. “Each of these corporations is handling staggeringly large amounts of personal data.”

In the old Web we had things like Geocities (where I had a site when I was 15 or 16) and shared spaces like chiark, which still remains unchanged (not trying to become more “modern”). There’s something very similar to it in tilde.team, which even includes LEO, “a webring but for Gemini instead of the web” (yes, remember webrings?), and it certainly seems to have grown quite a bit.

Gemini space is a lot bigger than people care to realise, and it is growing rapidly. Towards the end, the video above shows that there are now close to 2,000 known capsules, and we’re very, very close to 1,600 active capsules known to Lupa (maybe it will exceed that number by the end of the day). There are a lot more users than capsules, probably tens of thousands of regular users. Suffice to say, those are people who install a 'proper' client instead of using some Web gateway.

The Corporate Cabal (and Spy Agencies-Enabled Monopolies) Engages in Raiding of the Free Software Community and Hacker Culture

Posted in Deception, Free/Libre Software, GNU/Linux, Microsoft at 8:46 am by Dr. Roy Schestowitz

Video download link | md5sum 6fda57fbbfbb0443719a2afd74df26d5
Raiding the Community
Creative Commons Attribution-No Derivative Works 4.0

Summary: The overt attack on the people who actually did all the work — the geeks who built excellent software, only for it to be gradually privatised through the Linux Foundation (a sort of price-fixing and openwashing cartel for the shared interests of proprietary software firms) — is receiving more widespread condemnation; even the OSI has been bribed to become a part-time Microsoft outsourcer, as organisations are easier to corrupt than communities

FOUR days ago we mentioned what Microsoft had done to Marak, in effect confiscating his work on Free software even though he wasn’t working for Microsoft or being paid for his work. This caused an uproar; does Microsoft covertly own and control everything in GitHub? If so, it’s not free hosting; it’s the passage of one’s work to Microsoft. What about projects that used GitHub for 14 years? Did they ever consent to such a transaction? And what did Microsoft actually pay for when it took over GitHub? Was this a non-consensual sale of code other than the proprietary software of GitHub itself?

The video above talks about an upcoming series regarding the raiding of the Commons, or the privatisation of volunteers’ hard work; Microsoft isn’t the only one doing it, and I mention AWS as another example — raking in all the profits (financial gains) while denying a living wage to those who actually did all the work.

This crisis isn’t new and discussion about it is well overdue. Yes, Free software powers this planet (also our presence in space and Mars to some degree), but who controls this software? Recently, as we noted here several times, Microsoft tried to claim credit for the mission to Mars by merely asserting that everything in GitHub (as in every project with presence there) is property of Microsoft.

“Nobody accidentally makes a billion dollars while working on protecting human rights and democracy. The only way you make a billion dollars is by working on making a billion dollars.”Aral Balkan
