Joseph, Chris and I are visiting Microsoft this week to learn more about Silverlight 3.0
The sorting software, called Ballot Browser (image above right shows the software’s user interface), is an open source program written in Python that runs on Windows or Linux. The Humboldt version runs on Debian Linux Etch and uses a Fujitsu high-speed scanner, also driven by Debian Linux.
[Roy: Brazil has already moved all its voting machines to GNU/Linux (hundreds of thousands of boxes).]
#2: Open source
#3: Command line
#4: Hardware requirements
#8: More available software
#9: Not so dumbed-down
#10: Keyboard efficiency
At a time when most companies are happy if the balance sheet does not show any red ink, Red Hat has bucked the trend. Its stock price leapt 32 percent last week compared with a year ago, during a week when technology stocks overall fell by 2.6 percent.
According to information available at Channel Insider, the so-called mixed source company, Novell, saw its stock price fall by 11 percent, the biggest loser of the week.
Whenever someone tells me that something is easier in Windows, I am immediately suspicious. I wonder what compromises they have made in their own minds. This is telling: it says that Windows users are willing to put up with a great deal in order to use Windows before they can even begin to work. Either they really don’t mind having to reboot when they update, and waiting on endless updates after they boot up, or they simply see this as the cost of using Windows and it does not register.
Modern PCs can execute billions of instructions per second, but today’s web applications can access only a small fraction of this computational power. If web developers could use all of this power, just imagine the rich, dynamic experiences they could create. At Google we’re always trying to make the web a better platform. That’s why we’re working on Native Client, a technology that aims to give web developers access to the full power of the client’s CPU while maintaining the browser neutrality, OS portability and safety that people expect from web applications. Today, we’re sharing our technology with the research and security communities in the hopes that they will help us make this technology more useful and more secure.
The smart box is based on an open Linux-based platform and includes a raft of wireless technologies which allow users to connect remotely via a PC or smartphone.
Despite snapping up Symbian only a matter of days ago, Nokia has revealed that in the future it plans to use a Linux-based operating system in its more expensive models.
Nineteen-year-old Ciara Sauro has pancreatitis and because she needs an islet cell transplant, she’s hospitalized every week, a situation resulting in a huge accumulation of medical bills.
Now, “Because she didn’t defend herself against a copyright lawsuit, a federal judge in Pittsburgh ruled she’s a music pirate, and that could cost the Sauros almost $8,000 in fines,” says Pittsburgh news channel WTAE.com.
“I already have severe depression,” the story has her saying. “I mean, it’s so hard to sit there and think that I have to get in trouble for something that I didn’t do. It’s not fair.”
As I type this, members of the European Parliament are preparing to repeat one of the worst mistakes in copyright history — enacting a European version of America’s reviled Copyright Term Extension Act of 1998.
The EU version will tack 45 years onto the duration of copyright for existing and future sound recordings, making for a grand total of 95 years’ worth of monopoly control for companies that produce recordings.
Five years after the US passage of the Copyright Term Extension Act, the US Supreme Court heard Eldred v Ashcroft, a case that challenged the constitutionality of extending the copyright of works that have already been created.
I will let you decide which applies to the author of a “research study” of Google’s bandwidth use being pushed by the anti–net neutrality site NetCompetition.org. Using some rather dubious proxy measures—which would be worth further scrutiny as well, if the fundamental premise weren’t so manifestly bogus as to render such quibbling moot—telecom shill Scott Cleland estimates that Google and its subsidiaries “used” 16.5% of consumer broadband traffic in 2008, but only paid 0.8% of consumer broadband costs. This, the author brazenly claims, amounts to an implicit subsidy of some $6.9 billion to Google, and proves that Google “uses” 21 times as much bandwidth as it pays for.
This is stupid on so many levels I’m almost too stunned to know where to begin. Why would you ever imagine that the per-byte cost of getting upstream traffic out on a few enormous pipes would be the same as the per-byte cost on the downstream side, where the same traffic is dispersed to a bazillion consumers, each with their own broadband connection? (Nestle pays a lot less per pound than you do for sugar; I await a “research study.”) What would possess anyone to posit that there’s some inherently “fair” division of the cost of connecting end users to popular (mostly free) services anyway? Google adds value to the product ISPs sell, presumably helping them to attract customers; should Eric Schmidt be demanding compensation for the “implicit subsidy”?
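To see where the headline numbers come from, here is a quick back-of-the-envelope recomputation using only the figures quoted above (16.5% of traffic, 0.8% of costs, a claimed $6.9 billion subsidy). The implied total cost base is backed out from those claims and is an illustration, not a number from the study itself:

```python
# Back-of-the-envelope check of the figures quoted above.
# Only the 16.5% / 0.8% shares and the $6.9bn claim come from the
# "study"; the derived totals are illustrative, not real market data.

traffic_share = 0.165   # share of consumer broadband traffic attributed to Google
cost_share = 0.008      # share of consumer broadband costs Google allegedly pays

# The "21 times" headline is nothing more than the ratio of the two shares.
ratio = traffic_share / cost_share
print(f"claimed usage-to-payment ratio: {ratio:.1f}x")  # 20.6x, rounded up to 21

# If the claimed "implicit subsidy" is $6.9bn, the implied total consumer
# broadband cost base is subsidy / (traffic_share - cost_share).
subsidy_bn = 6.9
implied_total_bn = subsidy_bn / (traffic_share - cost_share)
print(f"implied total consumer broadband cost base: ${implied_total_bn:.1f}bn")
```

The point of the exercise is that the entire "subsidy" rests on pretending a percentage of traffic and a percentage of cost are interchangeable quantities.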
Dolby Linux wizard John Gilbert gives us a look inside the movie industry 02 (2004)
Digital Tipping Point is a Free software-like project where the raw videos are code. You can assist by participating.
Microsoft delivers consistent behaviour
We’re very slow at posting today (might soon do an interview with the CEO of OIN), but here are some links that readers may find valuable. They are about Microsoft’s latest failures.
A few weeks ago I received an email from Robert, a frustrated Xbox 360 owner. He was frustrated because he had paid $59.90 for an extended warranty but Microsoft would not honor it to fix his Xbox 360. He didn’t have a RROD, he had some video issues.
At a 67% failure rate, this takes some nerve.
Speaking at Telstra’s annual investment day in Australia, the Microsoft chief executive said that because it was Google’s first foray into the phone operating system space, it was way behind in its efforts.
Bashing of his competition means that he doesn’t have anything of his own to market.
Microsoft’s second-rate search engine shut down for several hours on Black Friday, demonstrating once again Redmond’s inability to keep the motor running online.
Carrying on from our Lord of the Shills, I decided to investigate Andre’s claims that people are happy with Vista.
For those who don’t know, Andre is a very active pro-MS poster on the Microsoft Watch site. Unfortunately he wasn’t good enough this week to win the Lord of the Shills, but I’m sure he will get at least one award before the end of the year.
This post from one of our regular readers shows what a dedicated shills-fighter he is. █
Flaws in DNS enable man-in-the-middle attacks, and an alliance has just been formed to protect against them. But DNS is not the biggest issue if merely visiting a Web site becomes a great threat, e.g. due to drive-by downloads or rogue ActiveX controls.
Some days ago we wrote about botmasters that had infected and even taken control of US military operations that ran Microsoft Windows. The Economist, which is still respected by some people, has published an article which sheds light on how botnets have become weapons of mass digital destruction. These can be trivially utilised at times of war.
AS RUSSIAN tanks rolled into Georgia in August, another force was also mobilising—not in the physical world, but online. Russian nationalists (or indeed anyone else) who wished to take part in the attack on Georgia could do so from anywhere with an internet connection, simply by visiting one of several pro-Russia websites and downloading the software and instructions needed to perform a “distributed denial of service” (DDoS) attack.
The mainstream media rarely discloses numbers that reveal the scale of this problem, as it may incite panic. When about 4 out of 10 Windows PCs are part of a botnet (a conservative assessment), the complexity of defending oneself from DDoS attacks becomes apparent. Everyone is a suspect, so there are no simple solutions other than a quarantine of half of the Web (or more).
By any stretch of the imagination, it remains hard to believe that 98% of Windows PCs are constantly vulnerable and ready to become zombies. This may seem an interesting, if puzzling, recent discovery. In addition, IDG is now reporting that Windows malware has reached an all-time high.
The year 2008 has seen another record of explosive growth in the amount of malicious software (malware) on the Internet, according to F-Secure.
Didn’t Microsoft promise to curb security breaches? In one of the most shocking stories from the past few months, the following has just been reported by WirtschaftsWoche:
Report: 21 Million German Bank Accounts for Sale
Black market criminals are offering to sell details on 21 million German bank accounts for €12 million (US$15.3 million), according to an investigative report published Saturday.
Reporters for WirtschaftsWoche (Economic Week) managed to obtain a CD containing 1.2 million accounts after a November face-to-face meeting with criminals in a Hamburg hotel, according to the magazine.
It’s bad enough that the world is tortured by an economic crisis. The last thing it needs right now is fraud of such massive scale. It leads to a sort of anarchy which transcends the Web. █
IN the first part of this article we looked at the growth of GNU/Linux in areas which include high-performance computing (supercomputers), mobile phones, desktops, miniature laptops, consoles, and set-top boxes. There is a great deal of overlap between some of these areas, but they are certainly separable.
Herein we look at the growth of GNU/Linux in areas that were not covered before. This ought to demonstrate the tremendous presence which has been quietly gained throughout the year 2007.
The term “server” is very generic in the sense that it covers a broad range of equipment and applications. For e-mail and Web services, for instance, there is a diverse set of systems operated so as to connect desktops and devices behind the scenes, so to speak. Application servers exist which blur the gap between the host (server) and the client. Even desktops and laptops can be compared to servers in terms of their function, but let’s try to sub-divide the domains at hand in a sensible fashion and begin with Web servers in particular.
Google is often cited as a major success story and a poster child for GNU/Linux. It is a pioneer capitalizing on disruptive trends as it concentrates on software as a service. In 2007, Google was believed to be using approximately one million GNU/Linux servers around the world, but nobody knows the real number for sure, except Google of course, which consistently keeps its cards close to its chest. Google uses GNU/Linux almost exclusively, despite a brief experimentation with OpenSolaris some time ago (circa 2006).
Other large companies have already chosen GNU/Linux to run various types of servers. Examples include eBay and Amazon for some of their Web services (in-house) and Oracle for its products (clients). They became more vocal about their use of GNU/Linux in the past couple of years. Many others began taking pride in GNU/Linux rather than hide it from the public eye. This trend can be generalized to account for other areas such as devices, which we will touch on in a moment.
The rise of the so-called ‘Web 2.0′ generation rationalized the need for high-capacity servers that are highly reliable and accessible (in terms of availability). Downtime is rarely acceptable when it comes to user-facing services which deliver and receive data almost in real time. Downtime is hardly affordable because it can drive customers away.
With growth in the server market in general, and especially with the gradual decline of aging Unixes, GNU/Linux deployments kept rising in number. Because it is free software, however, it is impossible to keep track of the number of installations. Moreover, the number of servers does not say very much, because actual server capacity depends a great deal on the available hardware and the software which runs on it.
Modern hardware and resource-efficient software require fewer units to handle the same load. Additionally, there is the emergence of virtualisation to consider here. VMware is the leading virtualisation company (it IPOed in 2007) and it actually started out with GNU/Linux for quicker market penetration. Server virtualisation remains a GNU/Linux advantage where this platform is comfortably ahead of most counterparts. Ironically enough, when it comes to statistics this also means better distribution and pooling of resources, which results in improved consolidation and therefore a decrease in the number of servers that are needed.
“Another important mistake is to assume that all GNU/Linux servers are sold, as opposed to deployed.”

In servers, a great deal of disinformation is being spread to paint a deceptive picture. Despite the fact that not all server units are sold and shipped, GNU/Linux gets counted in this old-fashioned way. Another common mistake involves counting only the revenue made through sales of servers, regardless of the number of servers sold. By such measures, more expensive servers will be viewed as more popular among users, who are always assumed to be buyers, i.e. paying customers. As a measure of popularity or ubiquity, this is incompatible with free software like GNU/Linux.
Another important mistake is to assume that all GNU/Linux servers are sold, as opposed to deployed. As stated earlier, Google is estimated to have approximately one million servers, but the number remains unknown due to corporate secrecy. Google is able to build and even distribute its own servers, so such server usage can easily go below the radar of industry analysts, whose definitions are strictly controlled by those who commission studies for vanity and marketing purposes. As pointed out earlier, there is also the issue of server capacity. If a Linux server can handle greater loads, then fewer such servers are required to handle the same amount of work.
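The capacity caveat can be made concrete with a toy calculation; every number below is invented for illustration and nothing comes from real market data:

```python
# Illustrative arithmetic only: why counting server units misleads
# as a popularity metric. All figures are hypothetical assumptions.

workload = 90_000  # requests/sec a hypothetical site must serve
capacity = {"older_boxes": 1_500, "newer_boxes": 3_000}  # req/sec per unit

# Ceiling division: units needed to cover the workload on each platform.
units_needed = {name: -(-workload // cap) for name, cap in capacity.items()}
print(units_needed)  # the higher-capacity platform needs half as many units
```

A survey that counted boxes would conclude the higher-capacity platform is half as popular, even though both serve exactly the same workload.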
Let’s quickly look at some numbers from 2007. Market estimates have claimed a growth of 34% annually for GNU/Linux shipments, with strong evidence of growth in Red Hat’s latest financial figures from mid-December. That is commercialized Linux alone; additional figures remain unknown and uncounted. Red Hat is still the leader in the Linux servers market. Looking at Red Hat’s year-to-year growth, the company boasts a rise of 28% in sales, a 24% rise in cash flow, and an improvement measured at 39% for net income. Red Hat’s shares rose 12% after these results were published. These figures, in general, put GNU/Linux ahead of everyone else when it comes to pace of growth.
Towards the end of the year, even the New York Stock Exchange adopted GNU/Linux as a server platform. It also talked about its decision openly in the press and this story served as an excellent sign of validation.
There are many other success stories which could be covered. Consider rendering farms and studios in Hollywood where Linux enjoys a de facto monopoly with virtually all desktops and servers running Linux underneath a proprietary software stack. The application layer often hides an underlying embodiment of openness and freedom, which sits just ‘under the hood’. This is one of the least-covered success stories of GNU/Linux and it truly deserves greater attention.
Another class of servers which can be considered separately is the mainframe. IBM leads the way in the area of mainframes, where the use of Linux has become the natural path for most mainframes to follow and evolve along.
Progress is encouraging because IBM recently upgraded the z/VSE mainframe OS to accommodate Linux use in large- and medium-sized businesses. IBM also reported a 390% surge in the number of sites running Linux on the mainframe. In fact, Linux is said to be driving a revival of mainframes, some of which had been prematurely buried.
In 2007, mainframes were seeing somewhat of a comeback which was driven by ISV support from many in the Linux arena. System integrators are involved as well and the number of supported applications doubled. Earlier last year, an agreement between Oracle and IBM actually helped strengthen mainframe computing. Both companies are known for their love for — and arguably a dependence on — GNU/Linux.
One of the more fascinating trends, whose potential was only realised in the past few years, is cloud computing. Large enterprises, not just technology companies but everything from banks to healthcare, wish to deploy clouds. Such phenomenal deployments can soon reach as far as governments, according to sources.
“Free software appears to be at the heart of cloud computing with companies like Google already taking a lead too.”

Due to some of Red Hat’s new products, which were introduced only a couple of months ago and are geared towards clouds, questions began to arise about the company’s future collaborators. Will it be Amazon or will it be IBM? Red Hat has already set itself the goal of maintaining a presence in over half of the world’s servers by 2015. Free software appears to be at the heart of cloud computing, with companies like Google already taking a lead too. There are other lesser-known contenders to consider, such as Xcerion, whose Internet cloud might quietly mature and help the company grow as rapidly as VMware.
IBM’s Blue Cloud, which is bound to arrive within a few months, will use BladeCenter servers and run GNU/Linux. It will rely on free software and utilization enhancers such as Xen-based virtualization. On top of it, IBM’s Tivoli is expected to run and manage the cloud, so this might not be a free software cloud from top to bottom.
IBM’s datacenters are slowly evolving into ‘computing clouds’ and the significance of this, which is often underestimated, can be compared to the importance of the company’s embrace of GNU/Linux many years back. This was seen as a big endorsement (never mind the generous investment) at the time. It also helped Linux rid itself of damaging stereotypes.
In this context, devices form a large family of mostly embedded software. These tend to be miniature, but they needn’t be. Any taxonomy of the different devices is probably a subjective matter.
According to a 2007 survey from VDC, Linux is set to grow 278% in the domain which includes embedded, mobile and real-time applications. Linux is used very quietly in this area. People often use it without being aware of this. The closed nature of many Linux devices contributes to apathy and several companies are too shy to admit their use of Linux due to potential (sometimes known) GPL violations.
According to another survey from 2007, 87 percent of those who built their devices using Linux plan to use Linux in their next project as well. In other words, only a few of those with Linux experience are actually looking elsewhere and assessing other options. This indicates great satisfaction from a developer’s point of view.
“In the year 2007 we saw many media players that run Linux.”

Moreover, and further to the study above, free distributions were favoured considerably over paid distributions. Trends indicate that more and more developers are escaping their dependency on commercialized distributions. This makes everything more affordable and hence attractive to both developers and prospective users.
In the year 2007 we saw many media players that run Linux. This includes Wizpy, new models of the portable media player from Archos, an iPod competitor from AOL (manufactured in Germany by Haier) and many lesser-known gadgets. There is a vast array of other devices, including network-attached storage units, home servers, children’s toys and innovative gadgets, with well-known examples like the Chumby, which makes a wonderful gift even to grown-up kids like ourselves. An extensive list of such devices is constantly being compiled at LinuxDevices, as well as in a few smaller Web sites. Many of the devices are designed and/or manufactured in the Far East, which secures low (and thus highly competitive) costs that lure in less receptive markets.
Linux also gained a high status and earned a place in a large number of industrial components including controllers, automation solutions, meters and monitors. Switches and routers, which arguably fall under the domain of servers as well, have played a role in the growth of GNU/Linux. For example, in 2007 3Com announced that it is betting on Linux and an open strategy. We recently saw a router and switch from Korenix and Vyatta delivers a truly free open source server based on GNU/Linux. It runs free software and adheres to the Red Hat-type business model, which is seen as quite faithful to the ideals of free software.
On the same note, while also considering hybrid devices, it’s worth stressing the importance and the different roles of Linux in telephony or, more generally, communication. This includes Asterisk and other software that handles VoIP. Towards the end of 2007, Asterisk boasted the millionth download of its software. John ‘maddog’ Hall, a Linux luminary who is also the Executive Director of Linux International, once said that open-source VoIP “will be bigger than Linux.”
In a realm where customization is king, it is natural to expect advantages to be found in open systems. The robotics market in 2007 is said to have produced roughly 10 general-purpose software development frameworks; 9 of these support Linux.
In 2007, Hanson Robotics found that in maintaining a mix of free software and proprietary software in robotics, the ideal ratio is 70% free open source software and only 30% proprietary. In this context, the Linux kernel is expected to play a major part. Linux is dominant in robotics in general. It is not just free open source software that gets chosen for its own separate merits.
The ‘hidden agenda’ in this two-part article — as if there ever was an agenda — was to show that the ways in which Linux success is typically measured are deeply flawed.
“Trends are sometimes more meaningful than absolute numbers when it comes to predicting trends.”

Computing has a visible and a less visible presence in our lives. People perceive the desktop as very important because it is highly visible to the general population. This can be deceiving. It is important to remember that there is no “year of Linux on the desktop”. If there ever was one, it’s already behind us and it’s called “the tipping point”.
Any type of real-world usage grows gradually; it doesn’t balloon overnight and clearly not over the course of a single year. Trends are sometimes more meaningful than absolute numbers when it comes to predicting trends. Bearing that in mind, there is no going back as Linux will mature and its usage further expand in many areas.
Let us not obsess too much over the desktop. In fact, the desktop might cease to be a primary target by the time that mythical, so-called ‘Linux domination’ is finally reached. Many call this “inevitable”, and such sooner-or-later destiny is at times acknowledged by those who have the most to lose. That inevitability may or may not include the desktop, whose future role is yet unknown. Mobile devices seem to be gradually replacing the desktop, at least in Japan.
Last but not least, it is important to remind ourselves not to be distracted by any single area of computing, which is just one among many. What sustains growth and fuels development is a market which is broader than local computer stores. As Linus Torvalds said recently, “Linux is much bigger than me.” Linux is also bigger than the desktop. █
Originally published in Datamation in 2007
Probably on behalf of their paymasters at Microsoft, they are still trying to make European law hostile towards Free(dom) software, as the following new article shows.
The lack of a Europe-wide Community patent poses “incredible challenges” for small businesses and undermines the EU’s goal of becoming the most competitive knowledge-based economy in the world, according to a new study.
To bypass the EU regulatory framework, many innovative companies and especially SMEs end up “skipping” the European market by applying for a patent in the US, Jonathan Zuck of the Association for Competitive Technology (ACT) told EurActiv.
Unless Europe locks these people out of Europe, it will continue to be abused by foreign companies that fight against autonomy. █