Recently, while troubleshooting an issue on a Windows 7 PC, I noticed a number of events in the Application Log labelled "Defrag". My curiosity sparked, I looked further and discovered that there was approximately one entry per day in the log. I looked around some more at other Windows 7 PCs and found that they too have "Defrag" entries scattered about. It turns out that Windows 7 now automatically defragments its NTFS filesystem, unlike Windows XP, which never did. This is a great idea on Microsoft's part, rather than letting fragmentation pile up and forcing the user to run a defrag that churns away for minutes or even hours.
This got me thinking back to what I had read about other filesystems, most notably ext3 and ext4 on GNU/Linux (now the standard choices), which never need defragmenting. Yes, that's correct: they do not need to be defragmented.
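If you're curious, you can see this for yourself: on ext3/ext4, fragmentation shows up as the number of extents a file occupies. Here's a minimal sketch of mine (not from any official tool, just a wrapper around filefrag from e2fsprogs) that reports extent counts; it assumes filefrag is installed and may need root to inspect some files.

    #!/usr/bin/env python
    # Report how many extents each file occupies, using filefrag(8).
    # On a healthy ext3/ext4 volume, most files sit in a single extent.
    import subprocess
    import sys

    def extent_count(path):
        # filefrag prints e.g. "/var/log/syslog: 3 extents found"
        out = subprocess.check_output(["filefrag", path]).decode()
        return int(out.rsplit(":", 1)[1].split()[0])

    for path in sys.argv[1:]:
        print("%s: %d extent(s)" % (path, extent_count(path)))

Point it at a directory's worth of files and you'll mostly see counts of 1, which is exactly why a scheduled defrag pass isn't part of life on these filesystems.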
As we’ve hopefully shown, building a $200 PC is a fun experiment—and, provided you really modulate your expectations, you can get a solidly usable computer out of the deal. But when it comes right down to it, we admit that the $200 figure is a bit arbitrary. Is anyone going to complain if the final total is $225.87, or even $250.94? Of course not. The goal is to find components that have the best balance between low prices and high (or, perhaps more appropriately, decent) performance. And if you’re willing and/or able to spend a bit more money, you can get a better system still. Here are a few of our recommended upgrades if you want to take your $200 to slightly more expensive—but still solidly affordable—places.
Investigations are no doubt continuing on numerous fronts, and Kernel.org is working to make sure that each of its 448 users changes their passwords and SSH keys. In the meantime, however, the good news is that there appears to be no need to worry about the Linux code we all know and love.
Like many of you, I occasionally come into possession of an older laptop. Usually, it’s something that used to run Windows XP, sometimes even older. You always hear that Linux is supposed to be so great for resurrecting old hardware, but many modern desktop distributions with all their bells and whistles end up chugging along just as slowly as Windows did. In those circumstances, you can either throw the machine away, or build your own custom install tailored toward the needs of the machine. Today we’re going to put together a Debian installation tailored specifically toward the needs of an older laptop.
Unigine Corp announced yesterday that it has a new licensee for its Unigine Engine. The latest game studio to license this visually impressive multi-platform game engine -- with first-rate Linux support -- is BlueGiant Interactive. BlueGiant is developing a real-time strategy game using the Unigine Engine.
Blocks That Matter, the innovative Linux game we have written about plenty of times, has won Microsoft's Dream.Build.Play 2011 competition and its $40,000 grand prize.
Kitty Lambda will soon release its action RPG The Real Texas, which will be available for GNU/Linux.
Development of kdelibs 5.0 has begun in the framework branch of its git repository. The main goal for kdelibs 5.0 is that there will be no more kdelibs as it is now. kdelibs (and kde-runtime) will be split up into smaller pieces to lower the barrier for non-KDE developers to use part of the power we as a KDE development community provide. The rough idea is that there will be three groups of libraries/frameworks:
1. Tier 1: components which only depend on Qt and no other lib/component from KDE.
2. Tier 2: components which depend on Qt and other libraries from Tier 1.
3. Tier 3: components which depend on anything.
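To make those rules concrete, here is a toy sketch (my illustration, not KDE code; the component names and tier table are hypothetical) that assigns a component to a tier based on its dependency list:

    # Classify a component according to the three tier rules above.
    # deps: the component's dependencies; kde_tiers: tiers of known KDE parts.
    def tier(deps, kde_tiers):
        kde_deps = set(d for d in deps if d in kde_tiers)
        if not kde_deps:
            return 1  # Tier 1: only Qt, nothing else from KDE
        if all(kde_tiers[d] == 1 for d in kde_deps):
            return 2  # Tier 2: Qt plus Tier 1 libraries
        return 3      # Tier 3: may depend on anything

    known = {"solid": 1, "kio": 2}            # hypothetical tier table
    print(tier(["QtCore"], known))            # -> 1
    print(tier(["QtCore", "solid"], known))   # -> 2
    print(tier(["QtGui", "kio"], known))      # -> 3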
For lack of a better place, I plopped it into the kdeexamples repository so that others (and the future me ;) can easily include it into their project (DataEngine, Plasmoid, application, ..) and see what a given DataEngine is doing. It's BSD licensed, so it can be used pretty much anywhere.
The GNOME Project has released the first of two beta versions of GNOME 3.2. The pre-release version, designated GNOME 3.1.90, comes just a few days after the user interface freeze, which, along with the beta, was recently put back a week to allow time to incorporate further modifications.
The first beta for the upcoming GNOME 3.2 is here for eager users to enjoy. This is still unstable code, so most users will want to hold off until the final, stable version lands, but if you want an early peek, this is your chance.
Several projects exist that purport to be small, run-in-memory distributions. The most popular is probably Puppy Linux. Puppy has spawned several variations, and I have used it several times myself on older machines. But I have discovered one that bowled me over completely—Tiny Core Linux. This distribution is a totally different beast and fills what I think is an as-yet-unfilled category.
To start, Tiny Core is tiny—really tiny. The full desktop version weighs in at approximately 10MB—this is for a full graphical desktop. Not many other options can deliver something like this. People of a certain age may remember projects like Tom's root/boot, or muLinux. Tiny Core fits somewhere in between those older floppy-based projects and “heavier” small distributions like Puppy.
This posting is not meant to start another flame war between these two great Linux distributions. It's just my personal opinion after trying ArchLinux for several days and comparing it with the distribution I have been using for the last six years. I know it's not completely fair to weigh a few days' experience against six years, but I will try to be as fair as possible.
Considering that this was a major and highly anticipated release of a major Linux distribution, did anybody in management bother to take it for a spin to see if basic features work? I have visions of Steve Jobs getting involved in every phase of his company’s product development. There does not seem to be a Steve Jobs on Mandriva’s management team.
None of the shortcomings of Mandriva 2011 will stop me from upgrading one of my permanent test systems running Mandriva 2010.2, but my laptop, which I use for serious stuff and on which physical security is just as important as any other feature, will continue running the old system until I figure out how to configure disk encryption when installing my favorite Linux distribution.
Now that Red Hat has made public what had become the worst-kept secret in Triangle real estate circles, it's worth delving into what the company's move to downtown Raleigh will mean for interested parties.
In the near term, the decision eliminates uncertainty about whether downtown would be left with an empty building once Progress Energy and Duke Energy complete their merger and consolidate operations.
It’s Thursday, and you know what that means? Even if we can’t get Christine to wake up long enough to write one of her articles, you can always depend on us to be here like clockwork for the Top 10 List.
A while back, Red Hat announced they might be leaving the big city of Raleigh to find a new location to continue tweaking their code. A little later, they announced they’d decided to remain in the North Carolina capital city after all – but they’d be looking for new digs since they were getting somewhat crowded at their old location. This week they announced they’d found their new home, a big ol’ office tower in Raleigh’s downtown.
If Canonical has its way, Ubuntu may soon be powering computers like the one in your car. At least, that’s what the release of Ubuntu Core, a new image of Ubuntu aimed at embedded devices, suggests. Here’s the scoop.
Ubuntu Core, as its name implies, constitutes a very bare-bones Ubuntu system. It’s a minimalist version of Ubuntu that provides the most basic software around which developers can build a larger system to suit their needs.
A month ago I wrote, "A few hours ago, Kate Stewart marked LP bug #760131 as being a milestone candidate for Ubuntu 11.10 Beta 1. This bug is for the main power regression introduced in the Linux 2.6.38 kernel as caused by PCI Express Active-State Power Management changes. There hasn't been a "solution" upstream in either the Linux 3.0 or 3.1 kernels yet since this is a tough problem. I'm not sure what Canonical is planning to do to "fix" the situation (considering their overall lack of low-level technical contributions particularly in the kernel area) besides possibly forcing PCI-E ASPM or just postponing the fix to a later milestone."
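For context, forcing PCI-E ASPM amounts to a boot-time kernel parameter. A sketch of what that looks like on an Ubuntu install (standard paths; this is a workaround, not a Canonical-endorsed fix):

    # /etc/default/grub -- append pcie_aspm=force, then run: sudo update-grub
    # This tells the kernel to enable ASPM even where the firmware claims it
    # is unsupported, which can recover the power savings but is not risk-free.
    GRUB_CMDLINE_LINUX_DEFAULT="quiet splash pcie_aspm=force"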
The first beta release of Ubuntu 11.10 has been made available for download – but what can you expect to find?
The RaspberryPi Foundation, which aims to put computers in front of children for £15, has taken delivery of 50 engineering prototypes, and intends to get the final version to customers by the end of the year, writes Steve Bush.
Based in Cambridge and founded by six high-tech high-flyers, the foundation aims to cure the programmer shortage by inspiring people to take up computing in childhood - as Sinclair Spectrums and BBC Micros once did.
Now, about the camera. The idea here is that the 8MP camera can take photos in 3D (SE used "panorama" quite a lot in the description of the photos). But since the Arc S's screen isn't a 3D display, the images are shown in 2D. When the device is plugged into a 3D-capable TV via the MicroHDMI port, they're shown as 3D photos. So you can (sort of) take 3D photos, but you won't be able to view them without a 3D TV. The concept of a 3D camera on a non-3D device baffles us, but we'll leave such judgments to you.
Reuters is reporting that webOS may not be dead, yet. In an interview, the head of HP’s PSG group, Todd Bradley, hinted HP may not be done with webOS or tablets. So what does HP have up its sleeve? Bradley was evasive, but here are my top three ideas for what the PSG spinoff could do, along with some channel implications …
Samsung understands this, and has thus tried to build a tablet for just about any size pocket or backpack you may own. We all know about the GalTab 10.1 and 8.9, but today even smaller models join the pack. At the IFA conference in Berlin, Samsung today announced the Galaxy Tab 7.7 and the 5.3-inch Galaxy Note.
Just like the rumor stated, Toshiba used IFA 2011 to announce its latest Android tablet and it’s just as tiny as it looked. The AT200 packs a 1.2GHz TI OMAP 4430 CPU, up to 64GB of memory, and most of the ports that made the Toshiba Thrive popular: micro-USB, microSD, and micro-HDMI. Toshiba claims that the battery is good for “eight hours of video consumption.”
HTC announced its first 10.1-inch Honeycomb tablet, which is also AT&T's first tablet to run on the carrier's new LTE/HSPA+ 4G network. The HTC Jetstream runs Android 3.1 and HTC Sense on a 1.5GHz, dual-core Qualcomm Snapdragon processor, features eight-megapixel and 1.3-megapixel cameras, offers an optional HTC Stylus, and starts selling Sept. 4 for a pricey $700 with 32GB of memory.
Amazon's 7-inch tablet PC, which is supplied by Quanta Computer, is expected to start shipping in October, the sources added.
If you thought you couldn't get a real Android tablet from a brand you've heard of for less than $200, think again. Lenovo's just announced the IdeaPad Tablet A1, a 7-inch Android unit that we got a sneaky first glimpse of back in July. Now it's real, and it's cheap, it's running Gingerbread, and while it doesn't hold a candle to the Galaxy Tab 7.7, it honestly feels like something far above its price point. Read on for our impressions.
Aside from the default webOS software, there's a good chance I'll be able to install Android onto the TouchPad at some point in the foreseeable future. Teams of Android enthusiasts like the gang from RootzWiki are already hard at work creating Android ports for the product. For the moment, the phone-focused Gingerbread will be as good as it gets -- the tablet-optimized Honeycomb release, remember, was never made open source -- but with the all-purpose Ice Cream Sandwich release on the horizon, the future holds no shortage of interesting Googley possibilities.
A 9.7-inch dual-core Ice Cream Sandwich tablet for $99? Yeah...exactly.
At the Community Leadership Summit in Portland back in July, I moderated a session called “The Death Star User Group”, aimed at community managers working for large corporations in the various stages of their journey towards software freedom. Community managers in that situation often have to deal with negative perceptions of their employer. I think having a model for the journey that a company is taking towards eventual embrace of software freedom is valuable.
Until fairly recently, cloud computing has been considered "bleeding-edge" technology, reserved for technology-focused companies and developers. We’re getting to the point now, though, where it’s being recognised by IT departments across all industries as a way to increase service flexibility and reduce in-house infrastructure and service costs.
As the cloud goes mainstream at last, all the major technology vendors are scrapping for market share. They have wasted no time building marketing strategies around their cloud products – explaining why their offerings are more robust, scalable and secure than the competition.
The reality, however, is that there are really only two choices when it comes to building your cloud. You can go with a proprietary solution, such as Microsoft Azure; or you can choose an open-source alternative, such as Ubuntu Cloud.
There are two interesting takeaways from this week’s flurry of PostgreSQL news. Most obviously, it’s a win for the Postgres community. Whether you believe that Salesforce or VMware are in this for the long haul as strategic database suppliers, each is a large, publicly traded enterprise software provider visibly committing to Postgres. Which is big.
It is September. Time for cooler weather and time to go back to school. The Apache OpenOffice.org podling is ready for the season with events to help developers learn more about OpenOffice.org.
Do you want to learn how to build Apache OpenOffice.org on Linux? Do you want to take the first steps towards becoming an OpenOffice hacker? Do you want to help test, review and improve our build instructions, on any one of a variety of Linux distros? If so, you will not want to miss this event.
The joint electronic health record for the Veterans Affairs and Defense Departments will in effect be open source when it is complete, according to a senior VA official, who provided more details about how that will occur.
According to a Form D filed with the SEC this afternoon, open source cloud-sync company Funambol has raised $3 million in funding from previous investors HIG Ventures, Pacven Walden Ventures and Nexit Infocom.
This was an alternative perspective to most of the events we go to in and around Leeds, which often focus on the benefits of exploiting technology. Dr Stallman argued that our use of technology means we give up our right ‘to be left alone’. Ask yourself: if, 20 years ago, you had been asked to carry a device that tells people your whereabouts at all times, would you have said yes? And yet so many of us carry mobile phones. Personally, I do this for all of the positive reasons that come with mobile phones, but it was thought-provoking to consider the alternative perspective.
A lot of great work has been done in promoting and branding GNU GPLv3. However, I think GPLv3 cannot promise ordinary users freedom in their digital communications, nor adequately protect their constitutional communication rights when they use telematic communications.
Even a very wide deployment of GPLv3 software, and its adoption by many end users through lots of very easy-to-use online services and apps, would still not give those end users effective means to verify the levels of security, privacy and authentication of those services, because they would have no way to verify that:
* the code they are using on some website is effectively the same code that, thanks to the GPLv3 license, they could download from that same website
* there is no other malicious software running on the same server
* in general, the hardware on which that software runs has not been compromised
* all that GPLv3 code is regularly tested, to maintain consistent levels of security, privacy and authentication
Pearson has embraced an open-source approach to digital content, making its proprietary content available to third-party digital developers.
As a project admin, maintaining the integrity of the brand around your software can seem like a daunting task. But it’s also one of the most important tasks you face. It’s *your* project, made with *your* blood, sweat and tears. Remember that “open source” does not have to mean “open season.”
We have taken a list of 24 software forges and classified them according to what features and artifacts are present on that forge (as of early June 2011). The word cloud below represents the relative frequency of the forge tags. The links lead to tables that show what characteristics each code forge has.
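As a rough illustration of the tallying behind such a word cloud, here is a minimal sketch (with made-up classification data, not the study's own) that computes the relative frequency of forge tags:

    # Count how often each feature tag appears across classified forges.
    from collections import Counter

    forges = {  # hypothetical forge -> feature tags observed on it
        "SourceForge": ["vcs", "tracker", "mailing-lists", "downloads"],
        "GitHub":      ["vcs", "tracker", "wiki", "downloads"],
        "Launchpad":   ["vcs", "tracker", "mailing-lists", "translations"],
    }

    counts = Counter(tag for tags in forges.values() for tag in tags)
    total = sum(counts.values())
    for tag, n in counts.most_common():
        print("%-14s %d (%.0f%% of all tags)" % (tag, n, 100.0 * n / total))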
There have been a lot of technology deaths predicted lately. The death of the desktop. The death of the PC (or, in more market-friendly terms, the "post-PC Era"). The death of Windows. The death of the mouse... you name it, if it's desktop-connected, its demise has been predicted in the last couple of months.
So much anger has been leveled at desktop operating systems and the PC that it really makes me wonder what the PC did to tick so many people off. Seriously, it's not like it ripped you off and then asked the government for a bailout, right?
In his quest to win the Republican presidential nomination, Texas Gov. Rick Perry is perpetuating a convincing hoax: that implementing Texas-style tort reform would go a long way toward curing what ails the U.S. health care system.
Like his fellow GOP contenders, Perry consistently denounces "Obamacare" as "a budget-busting, government takeover of healthcare" and "the greatest intrusion on individual freedom in a generation." He promises to repeal the law if elected.
As much as $60 billion intended for financing U.S. wars in Iraq and Afghanistan has been lost to waste and fraud over the past decade through lax oversight of contractors, poor planning and payoffs to warlords and insurgents, an independent panel investigating U.S. wartime spending estimates.
The UK's Guardian newspaper's Investigative Editor, David Leigh, author of the "Get this Wikileaks book out the door quickly before other Wikileaks books are published" Wikileaks book, has messed up.
It seems like just days ago that Luddie asked me to begin looking into the curious case of Jeanne Whalen’s WSJ story, which claimed that five human rights organizations had written a letter complaining to WikiLeaks that it was not taking proper care to protect civilian informants. As we soon discovered, the article was riddled with errors. To wit: not all the signatories were with human rights organizations, most of the signatories were not speaking for their organizations, and the letter was a call to meet with Assange, not an upbraiding. That the letter (which Whalen won’t release) quickly made it into her hands made the whole thing smell of Newscorp astroturfing.
Former State Department spokesperson PJ Crowley has written an op-ed on the recent release of more than 130,000 US State Embassy cables. Likening the cable publication to “pestilence,” Crowley provides his perspective on what he thinks will happen now that the cables have been published. Crowley was forced to resign in March after he made comments that called attention to how the accused WikiLeaks whistleblower, Pfc. Bradley Manning, was being treated at the Quantico brig. When WikiLeaks published the war logs in July, he knew he had to do an assessment and figure out what might be put at risk if US State Embassy cables were released. What he says about WikiLeaks carries a lot of credibility. In fact, he has spoken on multiple panels about his work during the WikiLeaks release and why he made the comments he made about Manning.
A Guardian journalist has negligently disclosed top secret WikiLeaks’ decryption passwords to hundreds of thousands of unredacted unpublished US diplomatic cables.
Knowledge of the Guardian disclosure has spread privately over several months but reached critical mass last week. The unpublished WikiLeaks’ material includes over 100,000 classified unredacted cables that were being analyzed, in parts, by over 50 media and human rights organizations from around the world.
For the past month WikiLeaks has been in the unenviable position of not being able to comment on what has happened, since to do so would be to draw attention to the decryption passwords in the Guardian book. Now that the connection has been made public by others we can explain what happened and what we intend to do.
The release of diplomatic documents by WikiLeaks last year has given people more insight into how the US government works, according to Suelette Dreyfus.
Dreyfus is the author of Underground: Tales of Hacking, Madness and Obsession on the Electronic Frontier, the 1997 book that featured the exploits of Mendax — the hacker handle of WikiLeaks' founder Julian Assange.
Dreyfus told this week's Q&A show that people in the US now understand how their government worked behind "closed mahogany doors." She said that WikiLeaks has also shown that governments don't always act in the interests of their own people. "In that sense, it's a true whistleblower," Dreyfus said.
Last night Koch Industries issued a statement on Kochfacts.com that effectively agrees with the main tenet of our new report: Toxic Koch: Keeping Americans at Risk of a Poison Gas Disaster.
U.S. Department of Transportation officials are disputing Texas Gov. Rick Perry’s statement at the Iowa State Fair today that federal administrators plan to require a farmer driving a tractor across a public road to obtain a commercial driver’s license.
“We are absolutely not requiring farmers” to obtain commercial licenses, such as those required of semi-trailer operators, said U.S. DOT spokeswoman Candice Tolliver in Washington, D.C.
She said U.S. DOT Secretary Ray LaHood had put out a statement last week making the DOT’s position clear.
“We have no intention of instituting onerous regulations on the hardworking farmers who feed our country and fuel our economy,” LaHood’s statement said.
Back in April we asked ORG supporters to write to their MEPs to help campaign against a Directive that would extend the term of copyright protection in sound recordings (for the reasons why, see our previous posts and the campaign site 'Sound Copyright'). We had a fantastic response, with thousands of letters sent to MEPs across Europe.
A key moment was the BT/NewzBin2 case. A clutch of Hollywood studios took BT to court in order to force them to restrict access to the website "NewzBin2". The site in question provides only links to film downloads – it does not even host copyrighted content. The studios were extremely pleased to have the court find in their favour, seeing it as a crucial precedent. They were beginning to lose patience with how slowly the Government was implementing the Digital Economy Act, and saw this as a convenient shortcut. Culture and communications minister Ed Vaizey enthusiastically welcomed the judgment – ironically enough, online, by tweeting: “Interesting judgment in Newzbin case, should make it easier for rights holders to prevent piracy”. He went on to defend the result, and his statement, against a barrage of replies.