THE technology tabloid of CBS says that "Free software advocate Richard Stallman spoke at Microsoft Research this week". The article was composed by one of its full-time Microsoft boosters (some of them were literally Microsoft staff on Microsoft's payroll!).
"Is Microsoft trying to pay off RMS? It doesn't look like it, but his visit there can be distorted to say all sorts of bizarre things."The Microsoft booster who controls Linux.com soon rewrote the headline and added his own interpretation as follows: "Mary Jo Foley of ZDNet reports that Microsoft invited free software legend Richard Stallman to speak at its Microsoft Research headquarters this week. Stallman, known for launching the Free Software Movement to develop the GNU operating system, was and still is a staunch Microsoft critic. Stallman delivered his standard talk around four freedoms."
"[I]t doesn't excuse developing proprietary software. A desire for profit is not wrong in itself, but it isn't the sort of urgent overriding cause that could excuse mistreating others. Proprietary software divides the users and keeps them helpless, and that is wrong. Nobody should do that."
"Sadly, we're seeing some certain betrayal of principles from all sorts of directions."We've also just noticed more new evidence (post by Juozas Auskalnis) of Microsoft exercising control over Red Hat (using money). It's all about money. At what cost? As one blogger put it this afternoon, "the worst case scenario is when one’s life depends on (especially when closed) source code that one has not written oneself."
Sadly, we're seeing a certain betrayal of principles from all sorts of directions. Google is no exception. Early this morning we wrote about Google openwashing and privacywashing by outsourcing code to a proprietary software platform, GitHub. This takes hilarity to new heights: Google outsourced all the 'open' code (for 'privacy') to NSA PRISM and Microsoft [1]. Google pretends to care about privacy while fighting a war against it and letting the NSA's back-doors partner maintain the code. Corbet has just written about this [2].
Related/contextual items from the news:
Differentially-private data analysis is a principled approach that enables organizations to learn from the majority of their data while simultaneously ensuring that those results do not allow any individual's data to be distinguished or re-identified. This type of analysis can be implemented in a wide variety of ways and for many different purposes. For example, if you are a health researcher, you may want to compare the average amount of time patients remain admitted across various hospitals in order to determine if there are differences in care. Differential privacy is a high-assurance, analytic means of ensuring that use cases like this are addressed in a privacy-preserving manner.
Google has announced the release of a new library for applications using differential privacy techniques.
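Neither quoted item shows what such analysis looks like in practice, so here is a minimal, illustrative sketch of the Laplace mechanism, one common building block of differential privacy, applied to the hospital-stay example above. This is not Google's library or its API; the function names, the clamping bounds, the example data, and the privacy budget are assumptions made purely for illustration.

import math
import random

def laplace_noise(scale):
    # Inverse-CDF sampling from a zero-mean Laplace distribution.
    u = random.random() - 0.5
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def dp_mean(values, lower, upper, epsilon):
    # Clamp each record so no single patient can shift the sum by more
    # than (upper - lower); that bound is the sensitivity of the sum.
    clamped = [min(max(v, lower), upper) for v in values]
    # Split the privacy budget between the noisy sum and the noisy count.
    half_eps = epsilon / 2.0
    noisy_sum = sum(clamped) + laplace_noise((upper - lower) / half_eps)
    noisy_count = len(clamped) + laplace_noise(1.0 / half_eps)
    return noisy_sum / max(noisy_count, 1.0)

# Hypothetical data: lengths of hospital stays in days, clamped to [0, 30].
stays = [2.5, 4.0, 7.5, 1.0, 12.0, 3.5]
print(dp_mean(stays, lower=0.0, upper=30.0, epsilon=1.0))

The point of the sketch is the trade-off it makes visible: a smaller epsilon means more noise in the published average and therefore a stronger guarantee that no individual patient's record can be singled out from the result.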