The weather office in the UK has quit providing widgets for GNU/Linux desktops thanks to Adobe dropping support for AIR on GNU/Linux. There is a workaround, though. On my PC, running Debian GNU/Linux, there is an app called “metar”...
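For the curious, here is a minimal Python sketch of what that kind of workaround boils down to: pulling the latest raw METAR report for an airport from NOAA's public text feed. The URL layout and the station code are assumptions for illustration, not details taken from the article or from the metar package itself.

```python
# Hypothetical sketch (not the metar tool itself): fetch the latest raw METAR
# report for one airport from NOAA's public text archive. The URL layout and
# the station code are assumptions for illustration.
from urllib.request import urlopen

STATION = "EGLL"  # ICAO code for London Heathrow; substitute any station
URL = f"https://tgftp.nws.noaa.gov/data/observations/metar/stations/{STATION}.TXT"

def fetch_metar(url: str) -> str:
    """Return the report file: a timestamp line followed by the raw METAR line."""
    with urlopen(url, timeout=10) as response:
        return response.read().decode("ascii", errors="replace").strip()

if __name__ == "__main__":
    print(fetch_metar(URL))
```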
My mother, who successfully migrated from Windows to Pardus GNU/Linux, is always on the lookout for news about FLOSS in our little country. Two weeks ago, she called me with information that seemed like a dream: a reputable university that promotes online learning was offering a course named "Linux OS".
To be honest, although I really wanted to register, I hesitated. After all, online learning is not fully developed here and the platforms are Windows-based. Paying only to discover that you are locked out because the software the institution uses does not support Linux is, obviously, no fun at all. So, before registering, I decided to find out as much as possible about the course program and the platform. My inquiries gave positive results; everything seemed suspiciously fine.
I was never a great, or even good, guitar player, but it’s something I really enjoyed doing for a decent chunk of my life. But as life and work grew more complex, it kind of fell by the wayside, a casualty of the demands of adulthood.
But recently, I’ve been actively trying to carve out time to mess around with my guitar. Because I live in an apartment, I became intrigued by the idea of amp and pedal modeling, where instead of playing through a physical amp or guitar pedal, one plays into a computer, with the amp and pedal sound created by software.
Wow! A Tile processor uses a bunch of RISC CPUs on a chip in a mesh. They have 64-bit processing and 40-bit addressing. The idea is to get close to one processor per thread, so that fewer context switches and massive parallelism deliver a lot of throughput at lower cost than x86 with SMP. For servers this makes a lot of sense, and because they are optimized for Linux and have tools, porting is trivial. Lots of software that runs on GNU/Linux will be able to move quickly to servers running these things. Sampling is happening and production will start in March. 2012 will be even more interesting than the Android/Linux-versus-the-world story.
Anyone familiar with GNU/Linux will not be surprised by the fact that this operating system runs on almost all known processors. However, very few people are aware that mere support just might not be enough. You’ll also need to keep an up-to-date repository of code. This is especially true when it comes to serious hardware such as POWER.
There are few things in life more exciting than a new system update for your favorite Linux distribution. Often, system updates can bring performance enhancements or simply address problematic security issues. These updates are generally considered a good thing. But when it comes to installing kernel updates, there are some critical factors that must be considered.
By now, you've likely heard all about the new 3.2 Linux kernel. While the new 3.2 kernel does offer some worthwhile benefits, this doesn't always mean that everything is going to work as expected for every person upgrading.
If you haven't tried out the Wayland Display Server lately, following the recent stream of announcements, you probably should, or at least check out the videos in this posting. The Wayland Display Server is becoming more lively and is slowly reaching a point where it may be possible for some to use it on a day-to-day basis.
This piece did not set out to re-define the term, "computer bulletin board," but that is the easiest way to describe Wboard, a window that provides a background for text notes and allows you to manage and save groups of notes easily.
TeamViewer, one of the world's most popular providers of remote control and online presentation software, today announced TeamViewer 7 for Linux. The new version incorporates a host of new features for remote computer support and introduces for the first time the ability to participate in online meetings of up to 25 people conducted with TeamViewer 7.
Google Earth is an application that provides a virtual globe, maps and geographical information, which you can use to travel the world virtually, see images, 3D buildings, maps and more.
Do you think gaming on Linux has improved over the past few years?
Although the variety and graphics might lag a little behind what's available for OS X and Windows, there's no denying that with every passing month Linux as a gaming platform grows ever more viable.
Razor-qt is a new desktop environment based on the Qt toolkit. I installed it from the PPA and gave it a quick go. It's early days for the project, but it might eventually become a refuge for lovers of KDE 3, in the same way that Xfce has become popular with people who want to recreate the Gnome 2.x experience.
Now that KDE 4.8 has been released, it’s time to recap all changes you will find in Gwenview.
The main change is the addition of animations when viewing images: crossfading between images and nicer-to-use comparisons. You can learn more from this previous blog article.
If you haven’t been living under a rock this past week, you probably know that KDE Software Compilation 4.8.0 has been released. This version brings a lot of great improvements and, in my opinion, is the best KDE release to date. Among the rather large list of new features, this release includes several KWin optimizations, a fixed blur effect, a new Dolphin view engine, improvements to Gwenview, a new QML splash screen (as well as most of the Plasma widgets being ported to QML), the new Secret Service framework and much more. Rather than bore you with all the details, which can be obtained from the announcement page, I’m going to visually showcase why KDE had it right all along.
KDE looks and works better than ever!
There’s absolutely no denying that there has been a lot of bickering about which desktop environment is the best. However, in more recent times the discussion has been expanded and refocused: not just Gnome vs. KDE, but now Gnome Shell vs. Unity, two desktop environments that are both dependent on the Gnome framework.
The difference between the two is simply the desktop shell, which is much more a difference in looks and functionality than a technical one. However, Gnome Shell has finally started to build itself a place in my heart, while Unity has not.
Before beginning my Arch story, let me tell you a bit about myself, or rather about my experience with the Linux OS. I am a software engineer by profession (used to be… but that is another story) and worked on enterprise Java and client solutions. My first experience with Linux was in 2003 or 2004, when I learned about an operating system called Red Hat and was given a 3-CD install set for the OS. I installed the OS on my computer and did not like it at all. It looked very bland, a cheap imitation of Windows; I immediately concluded that being free means being cheap.
So you’re thinking about switching to Arch. Here are some things you should probably know first.
(I’m assuming you already know all the great things about Arch — otherwise, you wouldn’t be thinking about switching — so I’ll skip that part).
You all know that I don't like the Xfce desktop. For some reason, nearly every single implementation thereof lacks something so important, so basic. Recently, it's been hailed as the replacement for Gnome 2, the new hope for Linux users disillusioned by the cartoon fever of new touch-like interfaces so wrongly mated to the traditional desktop. But I'm skeptical.
Once upon a time operating systems shipped on a stack of 1.44MB floppy disks. These days most come on DVDs because the installer files can’t fit on 700MB CDs. And then there’s Tiny Core Linux.
According to this laconic post by Jean-Manuel Croset, there is not yet a solution for the Mandriva dilemma. He claims that the financial situation is "better than expected", which allows the company to keep trying to find a new solution, and the new deadline is "mid February".
Mandriva users have been anxiously awaiting word from corporate whether the first user-friendly distribution would be forced to cease operations. The decision, which has been postponed twice in the last week, has finally come down. Too bad it's really a "good news, bad news" situation.
The Linux landscape has become pretty interesting as of late, with all the new desktop environments and changing popularity between distributions. It seems that now is the best time for all the distributions to make their mark and differentiate from each other wherever possible, especially when it comes to major players.
Word is that Red Hat refused to sign on to OpenStack when it was announced, because it didn't like the governance model. Red Hat also has its own cloud management software projects. But the company that once dismissed OpenStack seems to be coming around. Look closely at the OpenStack community and you'll find quite a few Red Hat engineers, including some that have become core contributors to OpenStack projects.
Color management has historically been a weak area for the Linux desktop, but the situation is rapidly improving. Support for desktop-wide color management is being facilitated by projects like KDE's Oyranos and the GNOME Color Manager.
Red Hat developer Richard Hughes, who started implementing the GNOME Color Manager in 2009, launched a small company last year to sell an open source colorimeter--a hardware device that is used to perform color calibration. The Linux-compatible device, which is called the ColorHug, will retail for £60 (early adopters can currently order it at a sale price of £48). He has already received a few hundred orders and is building more units to meet the unexpected demand.
The last few Fedora Engineering Steering Committee (FESCo) meetings have seen a large number of features approved for the next Fedora Linux release, due out in May. This Monday's meeting wasn't any different, with many more features being officially approved for this Red-Hat-sponsored distribution. Below is a listing of the items that were approved this week.
As you can see, the performance results between Mac OS X 10.7.2 and Ubuntu 11.10 are definitely mixed, at least when using the latest-generation Intel Sandy Bridge hardware. One trend though is that using LLVM/Clang 3.0 within Apple's Xcode4 package these days is a much better option than using the GCC 4.2.1 release they have shipped for a while. Depending upon the particular workload you're interested in, you can run the given tests relevant to you under both operating systems using the Phoronix Test Suite with OpenBenchmarking.org to determine what platform is able to meet your performance needs, aside from any other software platform features to consider.
Ubuntu's Unity interface is a step away from traditional graphical user interfaces. The intention is to make it the basis of a standard interface for everything from PCs to tablets to phones, and its implementation has been somewhat controversial. It's predicated on two main ideas: that most users only ever use a handful of applications, and that people prefer to search for things by typing -- as they do on the web -- rather than going through arcane menus and clicking on drop-downs. I take issue with the second of those, but before abandoning the interface entirely -- this is Linux, after all! -- it's worth exploring Unity to see what it has to offer.
When it comes to branding, the open source world is rarely at the front of the pack. Free software hackers tend to be much better at writing code than they are at designing logos, inventing names and developing elegant color schemes. But Canonical has long stood out as an exception, and its latest stride — a new website devoted to helping the community adhere to Ubuntu branding conventions — continues that tradition. Here’s a look.
Ubuntu seems to have shifted lately "from trying to make a rock-solid desktop distribution to playing around with cool ideas for next-generation interfaces," observed Slashdot blogger Chris Travers. "A lot of these ideas are very untested in terms of overall usability, and they represent a sort of 'back to the future' approach, thinking of the old X applications before menus became prevalent ... ."
I have been a long-time Ubuntu user, using it since 2006. I loved it and kept installing it on users' PCs until version 11.04 came out with Unity. Before you get the wrong impression, let me make it clear that I love to try new things as long as they don't come between me and my work. [Also read: You Don't Have To Quit Ubuntu]
I have been using Unity since its alpha days and am currently running Ubuntu 12.04 with HUD and KDE 4.8. The reason is simple -- I am curious and love trying new things. I am also running openSUSE with Gnome 3 to stay up to date with the latest developments.
With the announcement of the Unity HUD, Mark Shuttleworth tried hard not to use technical language. While I certainly applaud the effort, it seems that it may have been just a little too non-technical, judging by the number of people who misunderstood his points.
He was really announcing two different things: the HUD itself, and the underlying technology that enables it, libdbusmenu. The confusion is understandable, because so far that technology has only been used to hide menus when they're not in use, and that's not particularly innovative.
If the jump from the GNOME 2 desktop to the new GNOME Shell or Unity desktop in Ubuntu has left you feeling dissatisfied, one increasingly popular distribution just might offer something that turns out to be the best of both worlds - Linux Mint.
Originally created as a spinoff of Ubuntu, Mint has long since come into its own and offers a number of advantages over other distros, including a desktop that dares to stay firmly in the Middle Earth of the ongoing desktop holy wars.
Good news for Ubuntu fans: the second alpha of 12.04 is expected to be available tomorrow for testing. If you are planning to upgrade to Ubuntu 12.04, it's time to help the team with testing and ensure there will be fewer or no bugs in the final release.
The first thing I wish to point out about Ubuntu 12.04 is that the new release will no longer target the much-loved ~700MB CD-sized ISO. At first, this came as a shock to the Ubuntu community. But any long-term users and community members of Ubuntu will know that this is a debate which has been raging among the developers and users for some time. It was always inevitable that Ubuntu would grow beyond a mere 700MB ISO; it was a classic example of not “if” but “when” it would happen. Fortunately, it has only grown by an extra 50MB, which pushes the final ISO up to ~750MB. So when Ubuntu 12.04 goes gold, it will require either DVD media or a USB stick for installation.
The HUD is based on a concept that I really believe in and supported (through my own usage and a newbie attempt at a script) when Mozilla tried the same idea a few years ago with Ubiquity. Mozilla, however, has this obnoxious habit of killing projects that I like (or, in their parlance, putting them on the back burner -- Ubiquity, Prism and Skywriter, just to name a few). Ubiquity was supposed to become something called Taskfox in Firefox 3.6, but that never happened.
I am sure you have heard of Ubuntu Studio, an Ubuntu derivative targeted at multimedia, especially film and audio editing. Ubuntu Studio uses Xfce instead of Unity as its DE. The team is also known for shipping some of the best wallpapers. Here is the latest Ubuntu Studio wallpaper.
Almost there. The default theme for Lubuntu, Ozone, is nearing its final version. Lubuntu 12.04 Precise Pangolin is getting more and more polished. But if you can't wait, or you are running another version (or a distro with the LXDE environment), feel free to test it. Download here.
While checking whether DirectFB 1.6, with its many new features, had been released yet -- something slated to happen in January -- I discovered some interesting activity within the main Git repository for this lightweight graphics acceleration (and input, among other features) library. Landing in the DirectFB code-base in the two weeks since I last wrote about the project has been early-stage Android support.
It's quite a coincidence: the Android team says goodbye to physical menu buttons in order to focus on the ActionBar (something similar to the MenuBar) the same week Ubuntu announces its desire to do away with the MenuBar.
Samsung announced the "Galaxy S Advance," a phone running Android 2.3 on a dual-core 1GHz processor and featuring a four-inch Super-AMOLED screen -- slated for Russia in February. Meanwhile, Motorola announced a Europe-targeted, unlockable "Razr Developer Edition" and is preparing a similar Android device for the U.S. "in the coming months."
NOOK Tablet hackers have added a few new tools to their arsenal this weekend. Developer Cobroto has put together one of the first custom ROMs for the tablet, while developer AdamOutler has put together a rather impressive tool based on Ubuntu Linux which you can use to reformat the NOOK Tablet and roll back from OS 1.4.1 or later to NOOK Tablet OS 1.4.0.
If you remember, Asus introduced the PadFone, a tablet/smartphone hybrid, last year. Although we haven't heard much about the PadFone in quite some time, MoDaCo has some juicy news that caught our attention. They're reporting that Asus has confirmed that they will launch the extraordinary device at Mobile World Congress in Barcelona on February 27th.
JBoss has released Byteman 2.0.0, an open source Java bytecode manipulation tool licensed under the GNU LGPL 2.1. Byteman is a Java agent that helps with testing, tracing, and monitoring code. It allows developers to change the behavior of Java applications, either as they are loaded or at runtime, without the need to rewrite or recompile the application, and it can even modify Java platform classes like String, Thread, etc.
Pentaho is moving its open-source business intelligence capabilities to the Apache license to make them more compatible with big data technologies. Pentaho’s Kettle extract, transform, load (ETL) technology was previously available under the LGPL, or GNU Lesser General Public License.
IDG News Service - Business intelligence vendor Pentaho is releasing as open source a number of tools related to "big data" in the 4.3 release of its Kettle data-integration platform and has moved the project overall to the Apache 2.0 license, the company announced Monday.
There is a plethora of free/open source databases around, from the good old Berkeley DB, SQLite, MySQL and PostgreSQL to newer NoSQL DBs like MongoDB, to mention a few. Most of these have easy-to-use GUI interfaces too. As a result, the threshold for becoming a “database administrator” has become very low, and the quality of the average database is abysmal. People who do not know the ABC of database design are happily doing mission-critical stuff. Referential integrity is unheard of, and in the interest of temporary speed gains, the concept of normal forms is discarded.
As for security, don’t make me laugh. SQL injection was discovered in the last century, and its prevention is simple and well known — but guess what is still one of the most popular ways of cracking websites?
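Since the author calls the prevention "simple and well known", a short illustrative sketch may help; it is not from the article, and the table and input are invented, but it shows the difference between pasting user input into SQL text and using a parameterized query, via Python's built-in sqlite3 module.

```python
# Illustrative sketch: the classic injection mistake and the standard fix,
# using Python's built-in sqlite3 module. Table, rows and input are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.executemany("INSERT INTO users VALUES (?, ?)", [("alice", 1), ("bob", 0)])

user_input = "bob' OR '1'='1"  # hostile input a web form might receive

# Vulnerable: the input becomes part of the SQL text, so the attacker's
# quoting rewrites the WHERE clause and every row matches.
vulnerable = conn.execute(
    f"SELECT name FROM users WHERE name = '{user_input}'"
).fetchall()

# Safe: a parameterized query passes the input purely as data, never as SQL.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()

print("string-built query returned:", vulnerable)  # leaks both rows
print("parameterized query returned:", safe)       # no rows match
```

The same placeholder style (or its equivalent in whatever database driver or ORM is in use) is all it takes; the hard part, as the author suggests, is getting people to bother.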
The next Mozilla browser update, Firefox 10, is on track for release tomorrow, as confirmed by a Mozilla meeting report from last week.
The update pertains to the desktop formats—Windows, Mac, and Linux—and the mobile edition, for Android. For those who never upgraded to Firefox 4, version 3.6 will also be updated to version 3.6.9, which only adds security and stability fixes (though 3.6.9 is expected to be retired this April).
"In many ways Version 3 is a leap forward for the ownCloud project — not just in the technology, but also as a measure of the contributions from our expanding community," said Frank Karlitschek, founder of ownCloud. "Aside from the new functionality, the calendar and contacts have been given major improvements both in capabilities and interfaces."
- Built-in cloud text editor that supports syntax highlighting for 35 programming languages, keyboard shortcuts, automatic indent and outdent, unstructured/user code folding and a live syntax checker (for JavaScript, Coffee and CSS). Editing more advanced file types like .doc and .odt is planned for a future release.
The GNU Project today announced the relaunch of its worldwide volunteer-led effort to bring free software to educational institutions of all levels.
Jonathan Thomas, the developer behind the famous Linux video editor, announced earlier today the immediate availability for download of the OpenShot 1.4.1 application.
Version 2.0 of the open source VLC media player is one step closer to a final release, as release candidate source code and binaries for Mac OS X and Windows are made available. The source code was released last week when the developers "tagged" the first release candidate for testing. VLC 2.0, code-named "twoflower", is what was VLC 1.2 until early January, when it was decided to bump the version number to reflect the changes.
Open source software is at the heart of a European Commission initiative to give Europeans a voice through Europe-wide petitions. At a conference in Brussels on 26 January, the European Commission officially launched its European Citizens' Initiative, due to come into effect on 1 April 2012. Under the terms of that initiative, if a petition gathers more than a million signatures across a minimum of seven member states, the European Commission will have to consider enacting relevant legislation. The petitioning system will use open source software for collecting and storing signatures. The OnLine Collection Software (OCS) has been developed by the Commission's Interoperability Solutions for European Public Administrations.
"The appropriate use of standards will help alleviate lock-in", says a draft guideline prepared for the European Commission, on the link between ICT standardisation and public procurement. The draft text was published on 21 December 2011.
Last year, during my Open Government Data Camp keynote speech on The State of Open Data 2011, I mentioned how I thought the central challenge for open data was shifting from getting data open (still a big issue, but a battle that is starting to be won) to getting all that open data into common standards and schemas, so that use (be it apps, analysis or other purposes) can be scaled across jurisdictions.
At the Boston Open Source Science Lab, or BOSSLab — an example of the burgeoning movement of biology projects shifting out of laboratories — people are investigating their own DNA, or growing art materials in petri dishes.
Utah classrooms may soon be making the switch to open-source online textbooks that can be cheaper and easier to update.
Google's 2011 Code-In, which is a winter program similar to their Summer of Code, ended earlier this month with many contributions to some leading open-source projects.
While not as popular as Google Summer of Code, Google Code-In is an eight-week program that takes place each winter where Google organizes pre-university students to help out on various open-source projects. This year there were over 500 students working on 18 open-source projects for a period of up to eight weeks.
IBM's Lotus Symphony office suite has offered users a free Microsoft Office alternative since 2007, but last week saw the release of what's very likely the last version of the software.
This one didn't go quite the way I thought it might: it turns out that, as I speculated back in October, IBM is indeed dropping production of its Lotus Symphony office suite, ending a five-year run for the Microsoft Office alternative.
According to a brief blog post last week from IBM's Ed Brill, the latest release of Symphony, 3.0.1, is also likely to be the last, ending IBM's fork of the OpenOffice code.
"Our energy from here is going into the Apache OpenOffice project, and we expect to distribute an 'IBM edition' of Apache OpenOffice in the future," Brill wrote.
Going forward, IBM will be putting its efforts behind the Apache Foundation’s OpenOffice instead of its own OpenOffice fork.
What shall we make of this surprise pronouncement in President Obama’s State of the Union address? A belated investigation has been launched into the role of fraud in the financial crisis.
There is a lot to digest in a recent series of events on the Prosecuting Wall Street front – the two biggest being Barack Obama’s decision to make New York Attorney General Eric Schneiderman the co-chair of a committee to investigate mortgage and securitization fraud, and the numerous rumors and leaks about an impending close to the foreclosure settlement saga.
I can see a lot of lawsuits in the future and liability for taxpayers who may have to pay the bills. I can see people all over the world refusing to store any data on any server in US jurisdiction. This is yet another sign that the USA is going down the technological drain. The world does not need the bureaucracy of the US messing up IT.
Right now, we can expect Twitter to comply with court orders from countries where they have offices and employees, a list that includes the United Kingdom, Ireland, Japan, and soon Germany.
We've talked in the past about the ridiculousness of the US government pretending that the State Department cables leaked via Wikileaks are still confidential. The reasoning, obviously, is that they're afraid that declaring anything that's become public to be no longer confidential would create an incentive to leak more documents. But the actual situation is simply absurd. Documents that everyone can see, easily and publicly, live in a world where anyone in government has to pretend that they're still secret and confidential. There have even been cases where officials have gotten into trouble for using information from a "public" document, because they're supposed to maintain the fiction that it's not public.
Still, there is one way in which this has actually turned out to be enlightening. A few months ago, the ACLU filed some Freedom of Information Act (FOIA) requests with the State Department on some issues, getting some of the very same documents that were leaked via Wikileaks. Except... the ones that came via FOIA had redactions. The Wikileaks documents, for the most part, do not. That created an interesting opportunity for Ben Wizner at the ACLU. He could now compare and contrast the two versions of the documents, to see just what the government is redacting, and figure out if they're redacting it for legitimate reasons... or just to do things like avoid embarrassment.
At a behind-closed-doors meeting facilitated by the UK Department for Culture, Media and Sport, copyright holders have handed out a list of demands to Google, Bing and Yahoo. To curb the growing piracy problem, Hollywood and the major music labels want the search engines to de-list popular filesharing sites such as The Pirate Bay, and give higher ranking to authorized sites.
Throughout the fall, I ran a daily digital lock dissenter series, pointing to a wide range of organizations representing creators, consumers, businesses, educators, historians, archivists, and librarians who have issued policy statements that are at odds with the government's approach to digital locks in Bill C-11. While the series took a break over the Parliamentary holiday, it resumes this week with more groups and individuals that have spoken out against restrictive digital lock legislation that fails to strike a fair balance.
Lovers and users of free and open source software are a hardy bunch. They've seen it all: Microsoft EULAs, DRM, UEFI, proprietary software and constant attempts to prevent end users jailbreaking and rooting the devices they paid for with hard-earned cash. If you think you've seen and heard it all, well, you haven't. Apple may have trumped them all with a possibly unique EULA.
Today, in Cannes, at the Midem conference, I did a presentation that was something of a follow up to the presentation I did here three years ago, about how Trent Reznor's experiments represented the future of music business models. This time, the presentation coincided with the release of a new research paper that we've spent the past few months working on, sponsored by CCIA and Engine Advocacy, in which we did a thorough look at the true state of the entertainment industry. For years, we've been hearing doom and gloom reports about how the industry is dying, how customers just want stuff for free, about analog dollars turning into digital dimes... and (all too frequently) about how new laws are needed to save these industries.
There is a problem with the world of illegal piracy that we have online today, but it's not what the RIAA and MPAA want you to think it is. It's that we've become accustomed to participating in illegal copying, and yet it is still illegal. This means that we have the illusion of a body of work that can be built upon, remixed, and combined with new work, but any artist who practices this commercially is exposed to legal attack. Being a remixer is revered by culture, but being a commercially successful remixer is punishable by massive lawsuits, and if SOPA ever passes, maybe even prison time.
Here we go again. Four years ago, during the presidential campaign, we had CBS News threaten the McCain campaign for using some news footage clips in a campaign ad. And here we are, four years later, with NBC Universal demanding that the Romney campaign remove an ad it's using against Newt Gingrich, making use of old TV news footage. This strikes us as just as bizarre (and ridiculous) as it did four years ago. In many cases, these ads are likely to be considered fair use. But, beyond that, is it really any harm to NBC News if Romney uses classic footage? I mean, the news reports are what NBC News had reported in the past. Essentially acting like it hadn't -- by trying to block the use of the footage -- just seems silly.
MegaUpload has received a letter from the US Attorney informing the company that data uploaded by its users may be destroyed before the end of the week. The looming wipe-out is the result of MegaUpload’s lack of funds to pay for the servers. Behind the scenes, MegaUpload is hoping to convince the US Government that it’s in the best interest of everyone involved to allow users to access their data, at least temporarily.
And it is that final point that many in Hollywood still fail to understand. They positioned this whole battle as if it was about the right to enforce laws on a lawless internet vs. those who wanted to pirate. But pretty much everyone can see through that facade. And, as we've said before (and will say again), this was never about just this bill. You can see that in the continued focus of people on other efforts by these industries to push through bad policies -- such as ACTA and TPP. No, this was a rejection of crony capitalism -- an attempt by one industry to push through laws that solely benefit some of its biggest players, at the expense of everyone else.
However, are there more creative legislative solutions that come from thinking outside the box? Ian Rogers, the CEO of TopSpin, who has been a vocal opponent of SOPA/PIPA (despite his close relationship with many in the recording industry), has put forth an interesting proposal that's worth thinking about. It starts from a different perspective. Rather than using the opportunity to directly tackle this undefined "problem," he looks at solving a different problem: the fact that it's difficult (to impossible) and expensive to license music for an online service. So his suggestion is based on dealing with that issue by creating a giant registry through which copyright holders could indicate what they're willing to license and at what price. He notes that this is an idea that doesn't strictly need a legislative solution -- and, in fact, that he's tried to build something like it in the past. However, multiple attempts to build this haven't gone very far. He suggests a more official version might be able to really go somewhere.
The EU Commission is engaging in an all-out offensive to portray ACTA as a normal trade agreement, harmless to fundamental rights and access to knowledge. In several published documents, the Commission attempts to impose ACTA on the EU Parliament while silencing legitimate criticism. But these misrepresentations don't withstand scrutiny.