Ubuntu Live CD or Knoppix Live CD: Both are Linux distributions, but we're just using them because they run on most kinds of hardware without installation and can transfer the files you need to your backup media. Ubuntu should work; if it doesn't, give Knoppix a go. You can use the free tool UNetbootin to transfer the ISO you downloaded to a thumb drive, which is necessary if you're backing up to DVDs, and recommended in any case to speed things up.
As for the first question: Chrome OS is for someone like me — someone who spends 98% of their day in a browser. Or it could be for everyone else, provided they use it in the manner intended. Is it meant to replace a daily operating system for most people? No more so than a netbook would replace a high-powered workstation. It’s simply not that kind of tool. Chrome OS is intended for quick access to the web on a portable notebook-like companion device. Think of it as the environment and device you’d go to when you don’t want to boot up a full OS but you want a larger screen and keyboard than your smartphone has.
Google's Chrome OS is stirring up a lot of discussion, and last time we talked I looked at it from the public cloud end. This time, what are the possibilities with home clouds?
There’s also one killer feature of Chrome’s add-ons: you don’t need to close Chrome to get the benefits of an extension. Yep, you can use a new extension as soon as it’s installed. The same goes for enabling or disabling an add-on. I seem to remember hearing that other browsers can do this (I know Epiphany can), but Chrome/Chromium kicks Firefox’s tail on this count.
The forthcoming kernel version will support Intel's Moorestown platform, SFI (the Simple Firmware Interface, an alternative to ACPI), and the Trusted Execution Technology formerly known as "LaGrande Technology". Where required, the new KSM (Kernel Samepage Merging) feature can reduce memory consumption by merging identical memory content, for example across virtual machines. The new kernel also includes Timechart, a new tool for visualising what's going on in the system and kernel.
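To make the KSM part concrete, here is a minimal sketch of my own (not from the kernel announcement) showing how an application opts a memory range into page merging via madvise(); the 16 MiB buffer and its contents are purely illustrative.

    // Hedged sketch: an application marks anonymous memory as mergeable with
    // madvise(MADV_MERGEABLE); the ksmd thread then scans those ranges and
    // combines pages with identical content. KVM/QEMU does this for guest RAM,
    // which is how duplicates across virtual machines get merged.
    #include <sys/mman.h>
    #include <cstddef>
    #include <cstdio>
    #include <cstring>

    #ifndef MADV_MERGEABLE
    #define MADV_MERGEABLE 12   // value used by Linux since 2.6.32
    #endif

    int main() {
        const size_t len = 16 * 1024 * 1024;   // 16 MiB of anonymous memory
        void *buf = mmap(nullptr, len, PROT_READ | PROT_WRITE,
                         MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
        if (buf == MAP_FAILED) { perror("mmap"); return 1; }

        std::memset(buf, 0x42, len);            // fill with identical content

        // Mark the range as mergeable; merging only happens if ksmd is running
        // (echo 1 > /sys/kernel/mm/ksm/run) on a kernel built with CONFIG_KSM.
        if (madvise(buf, len, MADV_MERGEABLE) != 0)
            perror("madvise(MADV_MERGEABLE)");
        return 0;
    }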
This brings me back to something I wrote earlier this year: Linux is not an OS. Besides the typical point that Linux is just the kernel, my basic point was that what we typically call "Linux" is not really a single coherent operating system, but rather a framework for developing them, or an ecosystem which spawns them. I instead opt to call specific distributions operating systems, rather than Linux as a whole, whatever that may include.
If you are using Linux, there are plenty of optical disc-authoring programs to choose from. Here are some of those that I like...
Getting Things GNOME!, also known as GTG, aims to be a simple, powerful and flexible organization tool for the GNOME desktop environment. It is a productivity tool that helps you organize your workflow into tasks and sub-tasks, tagging them so that you can get things done more efficiently.
The other key difference is that, while the main code has stayed pretty much the same, it has been rethought in C++ to use polymorphism and other cool object-oriented concepts. This means that instead of having six huge 1000-line plugins, we now have one 1500-line plugin, and each sub-plugin comes in at just under 100 lines.
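A hypothetical sketch of that kind of refactoring (the class and function names below are mine, not the project's): one host plugin owns the shared machinery, and each sub-plugin is a small class behind a common virtual interface, so polymorphic dispatch replaces six copies of the same boilerplate.

    // Illustrative only; names are invented, not taken from the project.
    #include <cctype>
    #include <iostream>
    #include <memory>
    #include <string>
    #include <vector>

    // The common interface every sub-plugin implements (each ~100 lines).
    class SubPlugin {
    public:
        virtual ~SubPlugin() = default;
        virtual std::string name() const = 0;
        virtual void process(const std::string &input) = 0;
    };

    // One small sub-plugin; adding behaviour means adding another class
    // like this rather than another 1000-line monolith.
    class UppercaseSubPlugin : public SubPlugin {
    public:
        std::string name() const override { return "uppercase"; }
        void process(const std::string &input) override {
            std::string out = input;
            for (char &c : out) c = std::toupper(static_cast<unsigned char>(c));
            std::cout << name() << ": " << out << '\n';
        }
    };

    // The single shared plugin host (the ~1500-line part, in the article's terms).
    class PluginHost {
    public:
        void add(std::unique_ptr<SubPlugin> p) { subs_.push_back(std::move(p)); }
        void run(const std::string &input) {
            for (auto &p : subs_) p->process(input);   // polymorphic dispatch
        }
    private:
        std::vector<std::unique_ptr<SubPlugin>> subs_;
    };

    int main() {
        PluginHost host;
        host.add(std::make_unique<UppercaseSubPlugin>());
        host.run("hello plugins");
        return 0;
    }

Adding behaviour then means writing another small SubPlugin subclass rather than duplicating the host code.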
Yesterday evening, just before the hard freeze, a nice feature was added to KWin: window decoration painted behind translucent windows.
The author of BOH, a retro-inspired action game, let us know about a new update with the following changes:
* adds one-way passages
* adds flying enemies
[...]
EbonHack is a graphical (Qt-based) NetHack client with networked play; features include high scores, drop-down menus, and spectating.
I can see how it is very frustrating for developers out there. The public clamors for innovation, but when you give it to them, they balk at the differences from what they’re used to. I think this is why the word innovation is beginning to lose its meaning from overuse in marketing materials. We claim to want one thing, but want another. It’d be easier if we just said what we wanted, but I don’t think most people realize they don’t want innovation until they are faced with it and want to crawl back to the familiar. I’m hoping the GNOME developers have the resolve to see their innovation through. They should do their best, and people should give it a shot. If there truly aren’t any benefits and it truly sucks, we can go back to the old style. Otherwise maybe we’ll be the next thing Microsoft and Apple copy.
Damn Small Linux can be an excellent tool for learning Linux commands and running the Linux operating system. But what if you are not interested in becoming a computer nerd: can this software still be useful to regular people? The answer is a resounding yes; you can make use of this tiny operating system whether or not you want to learn the sometimes gruesome details of operating systems. This article introduces the text editors that come with your free copy of Damn Small Linux, which runs even on obsolete Windows-era computers. You can use these applications to compose simple text or programs of any level of complexity.
Instead of moving to Gentoo, I will move to Debian, which I’m already familiar with, but with an interesting idea. My idea is to compile my own kernel (the latest stable release from Kernel.org) and then compile my desktop environment, which will be XFCE. If I do all of this, I should then have an efficient system. What happens with the applications? Easy: if I see that I need especially good performance from a program, I shall compile it from source. If not? Then I use “apt-get”, as always.
Autumn in the Northern Hemisphere is a happy time: lots of fresh Linux distribution releases coming out, all ready for plucking and testing. Mandriva 2010 is one of those. Debuting two weeks ago, it has drawn many, mostly positive, reviews, sparking intrigue and a desire to take it for a spin. The previous version, Mandriva 2009, was a decent distro, with some small issues here and there; overall it behaved nicely and gave the average desktop user a solid, unique package. So the big question for me is: what does Mandriva 2010 bring to the table?
[...]
It does not have openSUSE's corporate-leaning class or Ubuntu's userbase, but for a desktop user like you and me, it's everything you could ask for. I'm genuinely pleased and surprised. Mandriva 2010 is a keeper.
I find Fedora 12 fast and responsive. It loads quickly and after it is set up it seems stable. I can find most of the applications that I use and there are many online resources to assist me with setting up my system. So far, I have installed KDE 4 and GNOME and like the look and feel of both. The community has been helpful but quite a bit smaller than what I am used to. I expect to write more about Fedora in the future.
The story in the first article is “I bought an SSD and Ubuntu is faster on it”. Good for you. Now, on Linux you really can tune the system to take advantage of a particular device, so in a one-line comment I suggested three things to try: disabling the readahead service, dropping any re-ordering I/O scheduler, and trying a filesystem that has optimizations for flash memory. The day after, the guy has a whole new post about optimizations for an SSD. Hilarious. Also, since the filesystem suggestion required too much effort, he puts in the evergreen noatime mount option instead. That’s less than 24 hours of condensed experience for the world! Clearly, the suggested tweaks are applied in the worst possible way, and upgrades will undo them.
Surely there is a lot of this kind of blogging, and the magic word seems to be “Ubuntu”, possibly the latest release.
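As a hedged aside, and purely my own sketch rather than anything from the posts in question, two of those suggestions boil down to ordinary sysfs and fstab settings; the device name sda below is an assumption.

    // Illustrative only: switch the I/O scheduler for one device via sysfs.
    // Run as root; "sda" is an assumed device name.
    #include <fstream>
    #include <iostream>
    #include <string>

    int main() {
        // Shows the available schedulers, e.g. "noop deadline [cfq]".
        std::ifstream cur("/sys/block/sda/queue/scheduler");
        std::string line;
        std::getline(cur, line);
        std::cout << "before: " << line << '\n';

        // Writing a scheduler name selects it; "noop" skips request re-ordering,
        // which spinning disks need but SSDs generally do not.
        std::ofstream set("/sys/block/sda/queue/scheduler");
        set << "noop" << std::endl;

        // Per-device readahead lives in /sys/block/sda/queue/read_ahead_kb;
        // the boot-time readahead service itself is disabled via the init system.
        // The noatime tweak is just a mount option in /etc/fstab, e.g.:
        //   UUID=...  /  ext4  noatime,errors=remount-ro  0  1
        return 0;
    }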
A few days ago Subversion was submitted to the Apache Incubator, a move praised by many as a natural fit for both projects, both for technical reasons (Apache projects use Subversion, and Subversion relies on many Apache projects) and for a shared vision about IP (same license) and community governance (same voting process).
A word like “freedom” has a fairly short dictionary definition, but you can see that much has been written on the different meanings of freedom. That is, as a word it has wide coverage, which then takes a great deal of discussion to pin down again. Consider Wikipedia’s freedom (philosophy) and freedom (political). Those articles are actually fairly short. I wonder why? And of course we know that “the Four Freedoms” can mean only one thing. Oh, wait ... it doesn’t. I never knew there was a disambiguation page even for that.
Nexon, having developed in part “Counter-Strike: Online”, has knowledge of the GoldSrc engine, which is based on id Software’s Quake engine. With this recent history of gaming engines, the developers must be aware of id Software’s decision to license the Quake engine under the GNU GPL. I am proposing that the developers and those in charge of licensing consider making the same decision, as Combat Arms licensed under the GNU GPL would benefit Nexon greatly in the long term.
Jetpacks are basically add-ons for Mozilla Firefox that are written in HTML, JavaScript, and CSS. They're meant to be easier for Web designers to write and deploy than standard Firefox extensions, which can also require knowing Mozilla's XUL. While Mozilla has tons of people writing add-ons for Firefox, the group of people who know standard Web development is far larger than the group that knows (or wants to learn) XUL.
Open source software is an example of another, often less thought of, opportunity for open and transparent government: the tools we choose to use. Software underpins almost everything we do, whether it be for work, play or creative endeavour. To be able to scrutinise software – to see its human-readable instructions and, if you will, the trust that brings – becomes almost a democratic issue for many in the technology community.
[...]
So we consider that the time is now right to build on our record of fairness and achievement and to take further positive action to ensure that Open Source products are fully and fairly considered throughout government IT; to ensure that we specify our requirements and publish our data in terms of Open Standards; and that we seek the same degree of flexibility in our commercial relationships with proprietary software suppliers as are inherent in the open source world.
The Swedish National Police Board (SNPB) estimates that it will save about 20 million euro over the next five years by switching to open source application servers, open source database servers and standard computer server hardware, according to a case study published by the Open Source Observatory and Repository.
An interesting thread keeps popping up in The Globe's reporting on H1N1. As you examine the efforts of the federal and provincial governments to co-ordinate their response to the crisis, only one thing appears to be rarer than the vaccine itself: information.
For example, on Nov. 11, Patrick Brethour reported that “The premiers resolved to press the federal government to give them more timely information on vaccine supplies during their own conference call last Friday. Health officials across Canada have expressed frustration that Ottawa has been slow to inform them about how much vaccine provinces and territories will get each week.”
The components of a standard reflecting telescope haven't changed much since Isaac Newton built the first one more than 300 years ago -- it's still essentially mirrors in a tube. As the technology behind telescope development and construction advances, however, so does the expense of building them. Cfree is a new project aimed at using open source principles to make reflector technology more accessible to -- and less expensive for -- the scientific community.
Putting aside all criticism, I do have this bit of advice. The Wikimedia Foundation ought to post a few snapshot copies of Wikipedia from the last few years, warts and all. If Wikipedia’s quality declines, at least the world will still have some “not too bad” Wikipedia articles to view. I have always maintained that Wikipedia is tremendously useful, and it would be a shame if there were not some “canonical” versions of the resource that we could consult.
Wikipedia has disputed claims that it has lost a huge number of editors that help maintain the online encyclopaedia.
We had a mapping party at NIT Calicut recently. After the first day of the event, I shared some ideas with the GeoHackers team on how to make such mapping parties better.
[...]
Maps are created at this stage. The data we have mined is ordered, analyzed, and tagged. We need to make sure that the whole team follows the same naming and commenting convention. The coordinator should watch for over-marking or mis-marking of the same location. Once the data is properly tagged, it is time to upload it to the OSM server.
Notwithstanding all of this, the future of the Cell processor is uncertain. It hasn't been the hit in consumer electronics devices that Toshiba and Sony had promised, and the encroachment of GPGPU processing definitely throws a spanner in the works.
With the health care debate preoccupying the mainstream media, it has gone virtually unreported that the Barack Obama administration is quietly supporting renewal of provisions of the George W. Bush-era USA PATRIOT Act that civil libertarians say infringe on basic freedoms.
Denis O'Connor, the chief inspector of constabulary, used a landmark report into public order policing to criticise heavy-handed tactics, which he said threatened to alienate the public and infringe the right to protest.
COMPUTER HACKER Gary McKinnon, from north London, could be sent to the US within weeks.
The Home Secretary, Alan Johnson, has today written a short defence of the practice of retaining the DNA of innocent people on the national database for six years. You can read the article in full on the Guardian's Comment is Free, but we thought we'd pick out a few choice cuts and show why his reasoning is faulty, unreferenced and internally inconsistent.

"The most recent research supports the case for the retention of DNA profiles of those arrested but not convicted. It also shows that, after six years, the probability of re-arrest is no higher than for the rest of the population."

[...]
The Minister quotes several cases in which the DNA evidence was critical in securing convictions, but we all know that the police frequently solve crimes committed by people who have never given a sample. Yet again this is a policy driven by political expediency, research we can't read and the desire to be 'seen to be doing something' with little consideration of the wider consequences.
Prolific and talented street artist Nathan Bowen was formally cautioned by the City of London police on 17th November for causing £100 worth of "damage" to building boards in the City of London. He spent two hours at Snowhill police station being cautioned and having his DNA taken.
It's the ultimate argument-killer when people raise the big issues like liberty to defend themselves from ever-more intrusive "security" legislation - which strangely always turns out to be "surveillance" of the little people like you and me. Yes, it seems to say, you're right, this *is* a tricky one, but we must find a compromise "to balance all these factors", as Alan Johnson puts it. And the way we do that is by making a *proportionate* response.
IAB and Struan Robertson from Pinsent Masons (among others with a vested interest) are all over the press today claiming that the amendments to Article 5(3) of Directive 2002/58/EC allow companies to continue to use Opt-Out. But today they have stooped to new levels of delivering misinformation.
They claim that the new rules state that cookie management can be done through the browser (such as Firefox or Internet Explorer) and hail this as a triumph both for industry and consumers. Unfortunately for them, their claims are utter rubbish, and Commissioner Reding has been quick to issue a clarifying statement to the press this morning.
Also, just like in today's GSM (A5/1) crypto attacks, even back then the importance of known plaintext could not be overestimated. The verbosity of Japanese soldiers addressing a superior officer and the stereotypical nature of reports on weather or troop movements gave the cryptographers plenty of known plaintext for many of their intercepted messages.
What was also new to me is the fact that even back then the British demanded that Cable+Wireless provide copies of all telegrams sent through its network. And that's some 70-80 years before data retention on communications networks became a big topic ;)
World oil production peaked in July 2008 at 74.74 million barrels/day (mbd) and now has fallen to about 72 mbd. It is expected that oil production will decline at about 2.2 mbd per year as shown below in the chart. The forecasts from the IEA WEO 2008 and 2009 are shown for comparison. The IEA 2009 forecast has dropped significantly lower than the 2008 forecast. The IEA 2009 forecast also shows a slight decline from 2009 to 2012 implying that the IEA possibly agrees that world oil peaked in July 2008.
Now. In what can hardly be a coincidence, just a few weeks before the Copenhagen summit the Climatic Research Unit at the University of East Anglia got hacked. The sixty-odd megabytes of confidential e-mails that ended up littering the whole damn internet either a) blew the lid off a global conspiracy to fake the global warming crisis, or b) lay there in a big sludgy pile of boring communications about birthdays, conference meet-ups, and whether or not Poindexter over at Cal State was going to be allowed into the tree fort this year. Judging by the criteria I described at the top of the post, I should just stick my fingers in my ears and hum loudly until the current shitstorm abates.
The publication of a selection of the emails and data stolen from the Climatic Research Unit (CRU) has led to some questioning of the climate science research published by CRU and others. There is nothing in the stolen material which indicates that peer-reviewed publications by CRU, and others, on the nature of global warming and related climate change are not of the highest quality of scientific investigation and interpretation. CRU’s peer-reviewed publications are consistent with, and have contributed to, the overwhelming scientific consensus that the climate is being strongly influenced by human activity.
There was, however, another factor that played an important role: the enormous incentive packages that many traders and senior executives on Wall Street received. Once the credit bubble got started, the men who ran the biggest financial institutions in America were determined to surf it, regardless of the risks involved. Because from where they sat, and given the financial incentives they faced, pursuing any other strategy would have been irrational.
First, because of the American debt to Beijing, they have the power to force the issue. Up to this point, American presidents had artfully dodged the issue. In 1986, President Reagan signed a piece of minor trade legislation he might not have read that included the acknowledgement of Beijing's rights to Tibet. But no American president, until now, had been forced to walk the plank in public.