Status of this Document

This document is a Draft written by the Moderator of WG 3 of the Expert Group on Strategies for Software and Services. Neither the outline nor the content represents any consensus so far. The document is merely the Moderator's compilation of the feedback received to date. It contains interpretations, conclusions and recommendations that are not agreed between the parties. For the unfiltered positions, readers are referred to the linked statements.

The European Commission called a Workshop on 20 January 2009 to consider the future strategy of Europe in the field of Software and Services. The overall area was split into several Working Groups, each tasked to provide input to the European Commission. The Commission intends to produce a Whitepaper under its sole responsibility. Input and opinions given here are therefore only informative and have no binding character whatsoever.

Table of contents

  1. Introduction
  2. What works
  3. Issues in standardization
    1. Recognition of Fora and Consortia
    2. Cooperation between organizations
    3. Relevance
  4. Issues in Interoperability
    1. Interoperability outside standards
    2. Is Software special?
    3. Interoperability is not the only goal
    4. Levels of interoperability
  5. Issues around IPR and Standardization
    1. Things in common
    2. Locating the trenches
      1. The Royalty Free Camp
      2. FRAND with ex ante
      3. License of Right endorsed patent
      4. FRAND or keep it as is
    3. Patents and strategic planning in competition
  6. Issues around Open Source and Standardization
    1. OSS and Standards
    2. Open Source and Venture Capital
  7. Issues around Standardization and Procurement
  8. Issues around Research and Standardization
  9. Trends
    1. Integration of fora and consortia
  10. Barriers
  11. Benefits
  12. Actions


The goal of this document is to identify the issues related to Software and Services and to suggest possible actions for the Commission in this sector.

Standardization, rights on intangible goods and interoperability have been intensely discussed for several years. The battle over software-related patents (or computer-implemented inventions) around the EU Parliament's attempt to reform the European patent system did not resolve the issue. Our field of enquiry is therefore marked by camps and hardened, fine-tuned positions that render open discussion and brainstorming difficult. Also, most issues are known and were already subject to fierce political fights and lengthy reports.

It cannot be the goal of this document to repeat what has already been said and done. Covering all disputes, differences and moves in the IPR debate is unachievable in the given time frame, and touching on every discussion of the recent past would render this document unusable. Consequently, known broad disputes are only mentioned and linked to existing overview pages on the web (where those exist).

Some issues surfaced again and again, from different angles and with different conclusions attached to them. It is therefore difficult to represent all the fine-grained relations between certain positions. To keep this document readable, issues have been condensed, which means the full statements from participants are not always fully represented in the text flow. To remedy this, all statements are hyperlinked in the menu and should be read in case the text of this document raises questions or contains ambiguities or wilful reductions.

What works

Europe has a well organized standardization landscape on the basis of Directive 98/34/EC. It takes lessons from the past and gives European standardisation a clear priority over national standardisation. With ETSI, Europe has created a global player and the home of the successful GSM standard. Consumer protection standards and the New Approach are seen to work well. All stakeholders emphasized that standardisation efforts should, under normal circumstances, be voluntary, market led and industry driven. Additionally, FSFE and others recalled the WTO criteria on openness, accountability and due process. As described below, standardization has a connotation of public interest and thus triggers higher requirements.

There were many consultations and workshops around the time the DG Enterprise study was made. It quickly became clear that the European Standardization System works rather well for classic manufacturing industries. But the system as set out in Directive 98/34 has some flaws with respect to Information and Communication Technologies. And "some flaws" means that, so far, nobody has called for radical change, for a revolution.

There is well established cooperation between the ESOs through common committees. Sometimes they have trouble agreeing, but this can be considered inevitable. The cooperation between the ESOs and the Commission works rather well, and the mandate system allows the Commission to benefit from the ESOs' insight.

Nevertheless, FSFE raised an issue around the recent turbulent discussions in ISO. Alleged irregularities were also reported from European national standardization bodies. This is a challenge and an opportunity at the same time to improve the quality of European national bodies.

Issues in standardization

Recognition of Fora and Consortia

As seen, the established system of cooperation between the ESOs, and also with national standardization organizations, works rather well. But ICT has brought important new standards bodies into the landscape, and the current system has trouble integrating them. In fact, Directive 98/34 created a monopoly for the ESOs in many areas. This monopoly was directed against national egoism in the EU. But as a collateral effect, it also excludes all the new actors and standardization organizations that have emerged in the Information and Communication Technology area.

There is a consensus that there are important specifications in fora and consortia and that there should be a possibility to recognize those specifications in the European standardization system. The question is how to integrate specifications like XML, HTTP, email/RFC 822, IEEE 802.11, SAML etc. into the European standardisation landscape and allow regulators to take them into account for classical regulation or New Approach rulings.

One statement recalled the finding of the DG Enterprise study group that the mandate system for standardization should be reformed. The EC can issue mandates to the three ESOs to ask them for action, ranging from simple reports to a request for the elaboration of a standard. The mandate is seen as a useful tool for the EC to influence and connect to the standards world. But the IT world has grown beyond the ESOs, and mandates sometimes surprise the community outside the EU standardisation system. The statements suggest that the mandating system should be extended to allow for actions by SDOs other than the ESOs, and that the EC should engage in a consultation before issuing a mandate.

Cooperation between organizations

As industry chooses certain organizations in which to develop certain standards, overlaps are created. There is also competition happening, and standardization is seen as leverage. So if one competitor has too good a position in certain areas of standardization, it is not excluded that its competitors choose a different venue to elaborate a specification on the same technology. Sometimes, especially when confronted with a dominant market player, competitors coordinate their efforts by creating consortia and elaborating a common technology. This is often masked as standardization. If there are groups of companies behind each point of such competition, more than one specification developing organization is involved.

All this competition and the competing specifications confuse many people. Those who play know perfectly well what value is attached to the respective initiatives, but players from outside, and the market, can be confused. This is part of global competition and the globalized organization of coalitions of all sorts. The growth of the IT industry overall, in combination with the extraordinarily strong (and growing) network effects, has dramatically increased the value of standards, and thus the incentive to abuse the system.

But at some point in time, even competitors see the benefit of coming closer together to create more network effects. In this case, SDOs come together and try to coordinate their efforts. But the barriers to cooperation between SDOs are often hidden in the details of their IPR regimes, version control and copyright protection. Also, with the proliferation of consortia, the standards world itself has become a market where SDOs compete against each other and forum shopping happens. The issue here is the risk of a race to the bottom in terms of the quality of specifications and the consensus that backs them. This risk is mitigated by the fact that the quality of a specification is revealed during the implementation phase: low-quality standards see less deployment and carry less weight in the market place. But governments do not have that culture yet. They might choose a specification based on some other set of criteria, and they are not really concerned whether a specification is already widely deployed, as mandating it will automatically create widespread deployment. So when mandating certain specifications, the filter of wide deployment that industry uses to find its way through the jungle of specifications may be challenging for governments to apply.


Relevance

If standardization happens top-down, a standard may remain without relevance in the market. Relevance is understood to be measured by uptake of the technology and use of the specification in question. As seen, the standardization world in the Software & Services arena is largely based on competition. So developing a European standard does not automatically mean that the standard will become relevant. There are great stories about nice standards that have never been used, and pouring resources into those losing standards will not make them win. This is related to the question about fora and consortia. In the current discussion there is often a mantra about the speed of standardization, but experts will invoke timeliness and relevance as of at least equal value.

Achieving timeliness and relevance in a top-down model is very difficult. It may work in conjunction with industry policy, as in the GSM case. Top-down standardization is often at odds with timing: either too late, trying to influence an already mature grass-roots effort, or too early, mandating a promising but immature technology.

Most modern SDOs test relevance by requiring two full implementations of a specification. If nobody invests the sweat to make those implementations, the market is not ready and the effort will die before reaching the level of a standard. It was suggested to enquire further into the criteria and conditions that lead to take-up and relevance.


Most people who talk about "open" in the context of standardization mean additional requirements on SDO processes to balance the interests and requirements of all kinds of participants. "Open" itself is abused as a term and re-used in many contexts: open standardisation, open source, open sky, open doors.

Open standards is a wrapper for various discussions about requirements for standardisation processes, mostly the WTO criteria as set out in Annex 3 of the TBT Agreement. The hottest debate is whether open means unencumbered by patents. This debate is explained further down in Issues around IPR and Standardization. It was triggered by the first version of the EIF, which required specifications to be implementable on a royalty-free basis. One of the main reasons to require RF was that eGovernment solutions often make use of OSS licensing schemes. This is the link to the OSS and standards discussion, where OSS representatives state that OSS cannot implement FRAND specifications.

But the term open standard is also used to draw a line between vendor coalitions and SDOs, and between de-jure SDOs, with their dogmatic democratic foundations, and the consortia that have no national representation. These distinctions and limits are used to defend positions by claiming openness and accusing all others of lacking it. This discussion rejoins the issues around fora and consortia.

Issues in Interoperability

Interoperability is nicely defined in the ECIS statement, drawing on IEEE: interoperability is the ability of an IT system to communicate and exchange information with other IT systems in a way that all IT systems can use that information. On the question whether interoperability is special to software, see below.

Interoperability is an emerging principle of greater collaboration and interconnection between IT systems, at a time when the European economy and market could use a dynamo of growth from within. Unleashing the power of truly open, specification-based, interoperable IT systems within the European Internal Market could re-vitalize economies, spreading efficiency and effectiveness in all vertical industries that have become IT intensive: think of Banking, Communications, Financial Services, Health Sciences, High Technology, Insurance, Public Sector, Retail, or Utilities.

While the role of interoperability extends far beyond the reach of the software domain, its innovative capacity is not well known outside the IT industry. There is great need to raise awareness of core interoperability principles such as open standards and specifications and open architectures. There is also, potentially, a role Europe can play in framing the debate on, and implementation of, interoperability across domains throughout the EU.

So everybody joins the chorus that calls for more interoperability, but the way to achieve it is highly disputed between participants.

Interoperability outside standards

Are standards the only way to achieve interoperability? Not always. Companies or Open Source projects may simply cooperate; for Open Source projects this is rather obvious, as code can be re-used. The complexity of such interoperability without a specification is rather high, and the challenge often lies in the information disclosure policy as well as in the licensing details. Once partners have gone to the pain of making their products interoperable, the result is mostly brought to some SDO to create neutral ground. Conversely, the mere implementation of a certain standard does not automatically mean that implementations interoperate; often, further collaboration around testing and interop events is needed.

Is Software special?

There are a lot of other activities around the evaluation of the European standardization landscape. The question came up whether software is specific, has specific needs and would merit special treatment. Jean-François Abramatic (former chairman of W3C, now IBM) said that software today is modularized and layered. No single person knows all the stacks and layers anymore. Thus people are condemned to extended cooperation and interoperability to transport information and data between the layers of implementations, from the driver/CPU level over network stacks up to the application layer, and even between different applications on the same layer. This view is also reflected in the ECIS statement.

From SAP's point of view there might be a stronger requirement for interoperability in the software area, and standardization in software is mostly done in industry consortia rather than in de jure bodies. However, SAP does not believe that there are any requirements specific to software with respect to the actual standardization process. In addition, the trends around virtualization and embedded software as well as software business models that heavily depend on hardware sales make it increasingly difficult to discriminate between software and hardware in the context of standardization.

When the question is raised whether software is special, opinions diverge quite dramatically. Why? Because answering this question in one or the other direction is implicitly assumed by the debating parties to determine special consequences (e.g. mandatory RF). This assumption leads to the effect that everybody immediately jumps into the trenches described later in this document. The discussion of RF or not RF is then masked by the question whether software as such exists, or whether (e.g. by including embedded systems) the border between software and hardware is completely blurred.

As the aforementioned assumption of a fixed outcome has hindered any useful discussion of the general question of specific standardization requirements for software by diving directly into the IPR question, even a suggestion to record an action in this report proposing a study on whether software is special was controversial. While one side clearly identified a core software originality, with edges blurring into the field of hardware, and wanted a study on which special requirements would follow, the other side denied any possible definition of pure software.

Intel took a different stance and led the discussion back to a more classic IPR debate. Standards can often be implemented in either hardware or software, and a patent could cover both implementations. The choice of RF or FRAND regimes would rather be determined on a case-by-case basis and according to certain criteria. No matter whether software or hardware is targeted, the choice between RF and FRAND by an SDO participant is mostly determined by the search for an ideal position in competition; in some cases this can favor an RF approach. It remains to be studied whether SDOs that mostly target pure software specifications and file formats, like OASIS or W3C, do so because of a possible generalization of the criteria mentioned or just for historical/community reasons, while the telecommunications sector in ETSI leans strongly towards FRAND, possibly because of the technology specified and its peculiar ecosystem (e.g. fewer, bigger stakeholders).

The criteria given by Intel mention a strategic use of patents in the market that goes well beyond the question of royalties. This will be discussed in another point.

Interoperability is not the only goal

Standardization is mostly associated with the goal of interoperability. But as was said in the DG Enterprise Steering Committee, standardization in Europe can have a variety of goals. The EU standardization system is designed to also draw up standards for the New Approach legislation. The regulator sets a framework that is filled out and maintained by standardization organizations.

Levels of interoperability

Furthermore, different levels of interoperability can be drawn on the blackboard, and currently we are moving to ever higher levels. In the classical phone system, we talked about interoperation because a new device respecting some specifications could connect to the network and to other devices. Syntactical specifications (e.g. ASCII or XML) brought syntactical interoperability, allowing the interchange of data. But even though syntactical interoperability allows the exchange of data, it does not convey the meaning of that data. Having experienced the breathtaking advantages of syntactical interoperability, best exemplified by the Web, people ask for more. Even with XML, the semantics of data are not obvious just from looking at instance data. As those semantics often represent know-how, there is some tendency to hide them.

To allow seamless integration of third-party data into the processing of one's own data, the syntax must be augmented by semantics; the semantics of the data must be accessible. But doing so also makes the know-how behind the semantics accessible. If it is standardized, the know-how moves from the application into specifications.
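The gap between syntactic and semantic interoperability can be sketched with a small, hypothetical example (the element names, values and unit convention are invented for illustration, not taken from any standard): two XML documents that any parser can read, but whose values cannot be safely combined until a specification makes the meaning explicit.

```python
import xml.etree.ElementTree as ET

# Two syntactically valid documents: any XML parser can read both,
# but the meaning of <temp> is a hidden local convention.
doc_a = "<reading><temp>25</temp></reading>"   # Celsius, by convention of sender A
doc_b = "<reading><temp>77</temp></reading>"   # Fahrenheit, by convention of sender B

def parse_temp(xml_text):
    # Syntactic interoperability: extracting the value always works...
    return float(ET.fromstring(xml_text).find("temp").text)

# ...yet the two numbers describe the same temperature and cannot be
# compared or merged without out-of-band knowledge.
print(parse_temp(doc_a), parse_temp(doc_b))    # 25.0 77.0

# Semantic interoperability: an agreed specification makes the
# convention travel with the data, e.g. via a mandatory unit attribute.
doc_c = '<reading><temp unit="celsius">25</temp></reading>'
elem = ET.fromstring(doc_c).find("temp")
print(elem.get("unit"), elem.text)             # celsius 25
```

The design point is exactly the one made above: once the unit attribute is standardized, the know-how ("our temperatures are Celsius") has moved from the application into the specification.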

Now with eGovernment we even talk about the next step: organizational interoperability. Not only is data supposed to carry its meaning with it, it is also supposed to integrate seamlessly into organizational processing rules and operations.

To achieve interoperability in the information society, a certain agreement or specification has to find widespread acceptance, which may even need further collaboration. With very large acceptance, economic network effects start: every new adopter of a technology adds value for the existing group of users. Thus widespread adoption of a new technology creates new markets based on that technology, following the economic principles of network effects. Standards are a way to organize that effect, and network effects can generate huge benefits for early adopters.
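The arithmetic behind this can be illustrated with a common stylization of network effects (a Metcalfe-style count of possible pairwise connections, used here purely as an illustration rather than a claim from the statements):

```python
# Stylized network-effect arithmetic: with n adopters, the number of
# possible pairwise connections is n * (n - 1) / 2, so the potential
# value of the network grows roughly quadratically while the cost of
# joining grows only linearly.
def pairwise_connections(n: int) -> int:
    return n * (n - 1) // 2

for adopters in (10, 100, 1000):
    print(adopters, pairwise_connections(adopters))
# 10 45
# 100 4950
# 1000 499500
```

Each tenfold increase in adopters multiplies the number of possible links roughly a hundredfold, which is one way to see why a widely adopted specification is so hard for a latecomer to dislodge.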

Mostly, single companies, even multinationals, are not big enough to drive a technology up to the level where network effects start to generate benefits. Standardization and specification elaboration therefore also play a role in coordinating several players to generate sufficient momentum. The paradigm above is found especially in the Software and Services area and less so where product and hardware specifications prevail, although with convergence most ICT standards can be implemented in hardware, software (of different kinds) or combinations of any or all of these.

This also explains how important the choice of platform is for an innovator or researcher. The concept of innovation in our heads is still tied to the lonely inventor in some garage having a really bright idea that merits protection. Some say the reality of interoperability is different in software and services. Innovation in ICT is incremental (although the incremental steps can be huge, see e.g. the Web), as the task is so huge that nobody would have the power to re-invent a new type of computer with all applications from scratch. Consequently, innovators place themselves on a certain platform, e.g. Internet, Web, GSM, 3G or 4G, take those network effects and build on them to produce new innovative services powered by even more innovative software. Or strong stakeholders come together to create 5G or the Future Internet; but even that has to be based on some pre-existing technology. E.g. the whole hype about Web 2.0 was only possible because of the solid foundations of the Web, with its solid addressing scheme, its anchors for scripting languages and its XML data exchange format. At the same time, innovators evolve the platform they are building on. Creating a new platform from scratch, or making disruptive changes to an existing platform, is therefore unlikely to be accepted in the market place.

The higher we get in interoperability, the less room there is for competition. So SDOs have to be careful to leave room for competition. This sometimes leads to tensions, as those not well positioned in the competition call for standardisation to overcome their weak market position.

Issues around IPR and Standardization

The issue of IPR and standardization is far too controversial to allow any kind of conclusion or any hope of consensus. Even this single phrase is disputed, as one camp claims a majority. The document therefore just reports from the trenches.

Things in common

Before trying to locate the trenches, some mostly unspoken commonalities can be observed. All classical de-jure standardization organizations have a (F)RAND patent regime, based either on some degree of disclosure or on participation as such. Nobody claims that the FRAND scheme is too constraining and that one should have the right to be unreasonable. There are consortia and specifications floating on the web that have no patent commitments attached to them at all; nobody suggests copying that model for the European standardisation system. The battle around standardization also reveals that specification developing organizations (SDOs) are the place of coordination for the software industry.

The classification of players in this game is also mostly agreed. It is best explained in the SAP statement and ranges from companies with no production of their own, 100% dependent on royalties for their revenue, through mixed models, to pure implementers with no IPR. These roles make the interests in intangibles of those actors very tangible. In marketing and UI design, one could compare it to the concept of personae. The roles appear in real life in every possible mixture. OSS players are a bit at odds with this classification, as they are a mixture between consumers and pure implementers.

There is widespread concern about so-called patent trolls. The boundaries of the definition of a patent troll are not unanimous. There is some correlation with stakeholders that do not participate in standardization or do not produce products; thus they do not implement a standard and have different incentives. One opinion goes as far as describing patent trolls as an unavoidable consequence of a system that grants exclusive rights on ideas as items of trade for which profit should be maximised.

One characteristic, nevertheless, is that patent trolls typically wait for several years until a standard is widely implemented and actors in the market are dependent on it, with high costs of change. In this case the royalties that can be obtained are much higher than they would have been had the patent been disclosed early on. One can also derive a repeating pattern that helps in recognizing patent trolls: the reason not to disclose, and to wait for large adoption before enforcing a patent, is that otherwise implementers would simply have changed the technology to avoid the patent. So one other aspect of patent-trolling is the notion of trapping a whole community.

Additionally, patent-troll action creates a lot of uncertainty in the market and can trigger high costs not only for those affected directly, but for all participants in the market. E.g. the Eolas patent on plugin technology would have required a change to the widely used plugin technology not only by the browser makers but by all plugin providers and users; the social cost in this case is enormous. Often trolls also litigate against a big player in the market to obtain additional damages. In all fast-moving ICT sectors, a full-blown patent search per standard would raise costs to a prohibitive level and would not produce definitive results. Thus the actors prefer to take the risk, but they are not really satisfied with this unpredictable situation. Patent trolls exemplify most of the shortcomings of the patent system today: the difficulty of finding the appropriate patent information, the lack of predictability and the debatable balance of burdens in the system. Most of the past examples were also based on patents of debatable quality.

There are diverging opinions on how to deal with patent trolls; so far, no real solutions have been put forward. One very controversial suggestion from outside the participants of this group was to shift the burden of patent search from the technology implementer following a standard to the patent holder. A patent holder would then not be able to collect damages for past use of his technology if he failed to notify the technology users or standards makers of his patent; he could still collect royalties for the future. This idea did not find many friends on either side of the patent debate. It serves first of all the SDOs, which would have a much easier life, and there is a lot of unease about it: stakeholders may feel obliged to do searches because of the inversion of the burden of notification. As said, patent searches, even in one's own patent portfolio, are expensive, time consuming and burdensome, and represent a non-negligible burden on trolls and patent-holding implementers alike. Another suggestion was to include trademark-like "use it or lose it" provisions. More research on how to deal with patent trolls seems to be in order.

Locating the trenches

One would expect a dichotomy between the proponents of royalty-free approaches for standards and the proponents of (F)RAND. But the feedback received so far shows a more complex picture. Three camps can be approximated: the OSS/RF camp, the FRAND camp, and FRAND with different approaches to ex ante disclosures.

The Royalty Free Camp

The RF camp mainly opposes patents on computer-implemented inventions in general, and such patents in standards on FRAND terms especially. The community is broader than just Open Source. As stated, it is not the goal of this document to restate all the arguments of the debate on computer-implemented inventions.

But OSS has some specific arguments. The main argument is that OSS projects cannot deal with patents in standards; to make OSS implementations of standards possible, standards should be unencumbered and bear no royalties. According to this camp, a starting point for a solution would be:

  1. Minimum definition of RAND to include all of industry, as the Common Patent Policy already states it should be.
  2. Recognition of standards bodies such as W3C and OASIS, with blanket approval only for bodies that have binding policies which follow the above principle.
  3. Procurement of solutions only if the standards meet #1 and #2
  4. A general exception that renders patents unenforceable against interoperability

This argument is amplified by the fact that governmental entities, especially at the town/village level, use more and more open source to share the development costs of tailored software. But this model also seems to find a lot of support higher up in eGovernment. For more information, see the statements from OFE and FSFE and the section on procurement below.

The link to the fora/consortia discussion arises because SDOs like W3C and OASIS offer standards on RF terms that can be rather securely implemented by open source. As the Internet and the Web are the basis for most eGovernment applications, a broad variety of standards is already available. But it is not available to eGovernment, as the consortia specifications are not recognized. It has to be noted that the IETF lacks a clear patent policy; currently there are intense discussions about an IPR regime in the IETF. The IETF situation has its basis in history and is very peculiar. As the IETF requires two interoperable implementations before issuing a standard RFC, any real encumbrances would hinder implementation. So far this has worked more or less, but there were incidents that sparked the discussions mentioned above.

Some of the difficulties are of a very practical nature. Most FRAND licensing schemes do not foresee sublicensing. In an RF world this is a non-issue: every OSS developer/user can get an RF license at any time, and in practice this means a subsequent developer does not have to know the patent holder. Additionally, since the author of the software grants his neighbours the right to make copies for other neighbours, there is no possible way for the original author to count the number of copies distributed. Another difficulty is that OSS is by definition more vulnerable to patent litigation, as it is much easier to determine a patent violation in open source code than it would be in object code alone.

If royalties are due, every subsequent user of the OSS code base would have to re-enter negotiations with the patent holder. The OSS licenses live from the ability to sublicense work further and further along the development; currently used patent licenses seem to fall short of this ability. The shortcomings of current licensing models continue when it comes to the calculation of royalties. FSFE stated that an intermediate solution could see licenses based on revenue and not on a per-copy basis: the per-copy basis assumes revenue per copy, while the OSS model is based on services and the number of distributed copies bears no relation to the actual services/warranty that generate revenue. But here caution is in order, as OSS is itself fractured into more branches than a tree has. Parts of the OSS world categorically refuse the application of the patent system to software and would see the acceptance of licensing revenue in OSS as an offense against the OSS system itself. Also, OSS licenses have changed over time, and newer licenses contain provisions about patent waivers. But those waivers are only applicable in the relation between OSS implementers and those who copy: only an entity that holds a patent on a standard and itself distributes the OSS implementation would thereby waive its rights over copying/using/altering that software.

So despite the suggestion about a changed FRAND regime, many questions remain open and the acceptance of intermediate solutions in the OSS community remains unclear.

This camp is in turn challenged by the other camp on the ground that OSS is itself not uniform. There are many open source licenses that allow for all kinds of different business models; the Apache license, for example, would even allow for subsequent inclusion of code in commercial object code implementations. So, allegedly, only a tiny fraction of the open source movement would be in the trenches. But this argument ignores the people behind OSS and reduces the issue to pure licensing problems. It remains that small or individual contributors to OSS are very vulnerable and cannot cope with FRAND regimes, which in turn is the central argument of this camp for RF standardisation.

FRAND with ex ante

In the IPR Workshop of DG Enterprise, a large portion of the time was spent discussing the various patent litigations going on in the mobile sector involving ETSI technology (but not ETSI as an organization). Some actors are not at all satisfied with the current patent policy in most de jure SDOs, which is just FRAND plus early disclosure. The argument is mainly that once the standard is finished, people have made a choice of technology and have already invested considerable amounts of capital into that technology.

Some people have argued there is a “patent hold-up” problem that can occur because an implementer has a reduced bargaining position with the holder of essential patent claims vis-à-vis licensing terms once the standard is finalized. After reviewing the issue in some detail, a number of SDOs have decided to permit the voluntary disclosure of those licensing terms “ex ante”. There have been additional proposals to the effect that SDOs should mandate the disclosure by the patent holder of the licensing terms for its essential claims to the SSO before the standard is approved.

There has always been a tension between competition law and patent law. Recent cases and litigation involving DG Competition seem to have focused its attention more on standardization. A representative attended the November IPR Workshop and subsequently published an article where he clearly argues for ex ante procedures in standard setting.

This also has connections to the procurement question below: in the New Approach, for example, a standard can be mandated by legislation, making it impossible to avoid taking a license to a certain patent without losing the market. This strengthens the position of the patent holder into a government-mandated monopoly in its favor. So far, no such government mandating has happened in Software and Services, where alternatives are nearly always available.

Ex ante would require a change in the patent policy of most of the de jure standards bodies. It would not be an issue for those SDOs having RF policies, as RF is already a price that has to be declared up front and which happens to be zero.

SDOs routinely review their procedures and policies and make improvements as needed. This regular self-assessment is healthy.

Some question whether the “hold-up” problem is widespread. They believe that this situation has only occurred in a very small number of cases compared to the thousands of ICT standards that exist and are being widely implemented. This may be because many patent holders are also implementers, and there are incentives for stakeholders to make the system work. There are also concerns that mandatory approaches are overly burdensome, bureaucratic and inefficient, and that the related “pains” to the system outweigh any perceived “gains”.

It is expected that ex ante regimes of all flavours will occupy the news in the next few years.

License of Right endorsed patent

There is a suggestion from ECIS to promote the License of Right endorsed patent. For the moment, voluntary License of Right regimes already exist in several member states.

The Licence of Right regime could help address the problems faced by "innocent infringers" (an "innocent infringer" being an individual or business that did not know, or could not reasonably be expected to have known, of the patent), since they would no longer be vulnerable to injunctions. This is particularly important for individuals or businesses for which the use of patented inventions is essential in order to achieve software interoperability. In addition, Licences of Right would be a useful tool for SMEs, as under a Licence of Right there is certainty that licences will be available, and "innocent infringers" will have less fear when marketing their products.

A License of Right to use a patented invention guarantees that any interested party will have legitimate access to the patent to develop interoperable software without fear of patent holders trying to assert their exclusive patent rights to block the development of new products. As a result, the License of Right ensures that patent protection will not be used strategically to prevent legitimate follow-on innovation in the software industry.

Especially regarding follow-on innovation, concern was expressed with respect to the current litigation system. It should not allow patent holders to exercise their rights abusively and distort competition. The patent litigation system should provide safeguards ensuring that granted patent rights are not used abusively against other companies in order to prohibit them from accessing essential information to develop new interoperable products and to reduce innovation in the ICT industry. For example, judges should take into account the potential (or actual) distortion of competition when measuring the potential harm for either of the parties in deciding to grant or refuse an injunction (interim/provisional or permanent).

As a voluntary scheme, a potential Licence of Right regime should provide businesses with adequate financial incentive. Thus, for example, a patent holder filing a written statement with the EPO that Licences of Right are available should receive a significant reduction of the renewal fees for the patent that fall due after the receipt of the statement.

Another possible approach is wider use, particularly in standard setting, of the License of Right (LoR) endorsed patent. A patent so endorsed is available for all to license on transparent terms, which can be tested in a local court if royalties are deemed too high. LoR would also mean that a license agreement in one case becomes a precedent, so bilateral negotiations for specific conditions would be less onerous. LoR has been described as “benevolent FRAND” because there is a firm commitment to negotiate without the threat of injunctions. (See the ECIS paper on this.)

FRAND or keep it as is

A rather large proportion of the received feedback argues for FRAND/RAND and for not changing anything. If changes happen, they should be carefully considered and tested against unwanted consequences. The current perceived issues would rather lie in the practical application of existing systems than in the system itself. One of the arguments against the Open Source RF camp is well known and was repeated many times: return on investment in research and development. But it is clear that the convergence between all kinds of information and communication technologies forces some actors into areas where they do not feel comfortable. Also, OSS/RF on computer-implementable innovations is seen as a threat to things that are unrelated to software and services, e.g. embedded systems such as elevator control mechanisms.

There is also an interesting argument against ex ante regimes: some fear triggering the attention of anti-trust authorities when entering into ex ante negotiations. But DG Competition rejected this: opposition to such (ex ante) schemes has been mooted in some quarters on the grounds of supposed antitrust concerns (e.g., because "discussing" price in such a collective standards forum should be taboo). We believe that such criticisms should not be used as a smokescreen to hinder the uptake of ex ante type schemes. Boundaries seem to be unclear on where price discussions end and where concerns about negotiations building a buyer or vendor cartel start. Meanwhile, a further argument puts forward that while developing new technology (creating markets) in an SDO, the extent to which a patent is essential to the implementation only becomes clear once the specification is mature enough.

Many participants arguing rather to keep the FRAND system also engaged in the discussion on practical issues reflected below. While the OSS/RF camp feels excluded and thus has a simpler message and aim, people closer to the FRAND camp have a much more complex view of the world. None of them excluded participation in RF standardisation, and nobody is very keen on being classified in the FRAND-only camp: being classified there is seen as not being modern and not taking up the challenges of today. Most participants arguing for FRAND schemes and against privileges for the OSS world are also contributing new ideas to solve the practical issues of the current scheme.

For further debate on the Standards and IPR question, reference is made to the linked statements and to the official documents and positions linked from there, as they contain many more arguments and fine-tuned positions.

Patents and strategic planning in competition

As seen, the choice to create a level playing field via standardization can be determined by the availability of patents under certain conditions. A glimpse of the criteria that determine whether stakeholders participate in creating such a level playing field can be seen in Intel's contribution. Many factors are taken into account; generic schemes or rules are not yet determined and may be nonexistent, given the diversity of interests of the variety of players involved.

It is important to note that royalties are not the most important criterion in the decision making. Potential market growth with accompanying economic network effects may determine a certain choice. This decision making is also framed by the patent policies of the SDOs chosen to create the level playing field, which explains in part the forum shopping in standardization. Individual decisions by stakeholders and market leaders will force others to follow, and thus to integrate into an SDO and follow the IPR regime put forward there.

During the further discussion, Intel and other companies confirmed the initial position of SAP that the issue of patents in standards has some practical issues and that those practical issues should be explored in a pragmatic way. All of them argue more or less against a one-size-fits-all approach. All of them participate in organizations with RF policy and in SDOs with FRAND approach.

On a meta-level, putting the understanding of this strategic planning at the centre of further considerations escapes the trenches a little. But it does not solve the OSS issue that created the trenches in the first place. There may be some common ground where strategic planning and OSS meet, and some may see it as important to find the matching criteria to advance the understanding of the problem space.

On this practical level, issues arise from stacks of patents on one technology and from creating agreement on IPR before or while the decision on the creation of a level playing field is taken. It was suggested that a platform allowing for negotiations would improve the situation. This has been tried in NGMN with mixed results and has itself a number of issues:

  1. How to avoid gaming
  2. How to avoid antitrust violations (process & limits)

Multilateral engagements on concrete licensing terms are seen as mainly unexplored territory. Going into this space would need a multi-discipline approach, as it will touch on aspects of law, technology and economy. There was a suggestion to have the industry issue some best practices once more experience in this area has been collected. An example given was NGMN, where a cooperation platform was created to facilitate cooperation between partners. This is seen as a testbed that will bring further experience in the field of mediation and multilateral negotiations, helping the building of larger platforms by building trust and confidence.

Issues around Open Source and Standardization

OSS and Standards

Despite being strongly connected to one of the trenches mentioned above, open source in standardization merits an extra point. Some of the statements are equally true for commercial software, and there is even some cross-pollination. The terms free software, open source software, commercial software and proprietary software are not universally accepted terms with fixed boundaries. They are often used to attach some good/bad connotation to the adversary in discussions.

Eric Raymond's essay The Cathedral and the Bazaar explains why people choose open source development models and what the advantages are. Open source as we know it today would not be possible without the Internet and the Web, which allow a smooth organization of the Bazaar. But as the Bazaar is sometimes wild and disorganized (branches, forks, flamewars), there needs to be some bottom line ensuring a stable backbone that allows the diversity to flourish. FSFE reminded that the discussion is about closed source vs. open source models and that there can be open source cathedrals and closed source bazaars.

This strong backbone is often, but not exclusively, found in standardization. Sometimes, rules of successful commercial software are copied. Open source projects are often very good citizens when it comes to implementing standards in a conforming way; naturally, they are not the only good citizens. So OSS has a generally standardization-friendly culture (yes, there are exceptions) and some dependency on standardization.

The Web and the Internet are not only facilitators for OSS development, but also targets: many OSS projects target things that are done over the Internet using this platform. But the Web and the Internet converge in some areas into established grounds with a completely different social structure. The Web converges with audiovisual, TV and video, and thus touches on an established structure of large technology providers and a rich media industry that organized its monetary balance also with the help of patents in the standardized technology. The convergence brings the Internet culture into that market: TV over the Internet, videos online, etc. OSS implementers are keen to play with it, and OSS companies see chances in this market. In order to participate, they need to implement standards that do not come from the usual Internet standardizers with their RF assumption. Another area where the convergence will hit hard is the mobile sector. The smarter the phones, the more they feel like mini-computers, and the more OSS people are keen to play with them and go into this market. Again, the mobile market is determined by huge players, device manufacturers and telecommunication companies. Again, this is a social clash of two different cultures.

For the moment, the clash has turned into trench warfare. The atmosphere is so heated that a discussion about reasonable solutions is nearly impossible. But the issue is not necessarily without solutions. One could imagine that instead of a license fee on a per-piece/download basis (which would kill the OSS project), the fee could be based on the OSS revenue model. It could be an option to have the developers themselves not subject to patent rights, but only the users: private use would be excluded and commercial use would then trigger royalties. Another possible solution is to make royalties payable as a percentage of what the end user pays; in an open source application distributed gratis, that percentage would apply to zero and would therefore not interfere with GPL-like OSS licensing commitments. There are also many other options and ways that can be explored to develop creative solutions balancing all the needs and IPR rights of the various stakeholders. But such solutions would also trigger highly controversial discussions within the OSS community about whether something is open or proprietary.
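The arithmetic behind these two royalty schemes can be sketched in a few lines. The figures and function names below are purely hypothetical; the sketch only illustrates why a royalty tied to the end-user price is compatible with gratis distribution while a per-copy fee is not:

```python
def per_copy_royalty(copies_distributed: int, fee_per_copy: float) -> float:
    """Classic per-copy scheme: royalties accrue per distributed copy,
    assuming each copy generates revenue (which gratis OSS does not)."""
    return copies_distributed * fee_per_copy

def end_user_price_royalty(end_user_price: float, rate: float) -> float:
    """Royalty as a percentage of what the end user actually pays."""
    return end_user_price * rate

# Hypothetical gratis OSS download under each scheme:
copies = 100_000                           # copies in the wild, uncountable in practice
print(per_copy_royalty(copies, 0.50))      # 50000.0 owed even though revenue is zero
print(end_user_price_royalty(0.00, 0.05))  # 0.0 -- a percentage of zero is zero
```

The percentage-of-price variant also sidesteps the counting problem above: the royalty base is the implementer's actual revenue, not the untraceable number of copies.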

This clash of cultures also continues at SDO level. As W3C and OASIS show a successful RF model, there are repeated questions why other SDOs would not also allow for an RF option. This raises again the question of the official European standardization system, as Fora/Consortia are excluded and the recognized players refuse to offer an RF option. There is fear in the standardization community, and on the opposite side of the convergence, that RF models would proliferate in de jure SDOs once allowed and destroy revenue. As the ESOs do not do RF, and those doing RF are not ESOs, the European standardization system has trouble with OSS.

Again, for the exact location of the current trenches, one has to look into the statements given. Maybe a constructive dialogue in the broader standards community on different ways to address these issues can be a way forward.

Open Source and Venture Capital

One of the most original debates in the group happened when FSFE started to debate with the representative of the European Private Equity and Venture Capital Association (VCA in the following). VCA contributed some statistics on venture capital coverage of OSS companies, taken from the VentureSource database:

  1. US: 3.4% (232 OSS out of 6801 total; trend from 1.2% to 5.5% in 08)
  2. EU: 1.1% (39 OSS out of 3435 total; trend from 0.7% to 1.0% in 08)
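As a quick plausibility check (not part of the circulated data), the quoted shares follow directly from the raw counts:

```python
# VentureSource counts as circulated in the group
us_oss, us_total = 232, 6801
eu_oss, eu_total = 39, 3435

us_share = round(100 * us_oss / us_total, 1)  # percentage of OSS-backed deals, US
eu_share = round(100 * eu_oss / eu_total, 1)  # percentage of OSS-backed deals, EU
print(us_share, eu_share)  # 3.4 1.1
```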

As OSS can go by a wide variety of synonyms, further searches were conducted without raising the counts significantly. Georg Greve argued against the methodology, stating that the database field searched was not meant to be an authoritative source for the criteria researched. But the search was conducted in the (human readable) field describing the core mission of the companies financed by VCs.

There is also a low risk of hidden OSS components remaining undiscovered in those counts. If a VC tries to sell an investment, the software is systematically checked to ensure it does not contain OSS parts affected, e.g., by some viral GPL licensing.

The initial VCA statement argued in favor of a globally harmonized system of IPR in order to make investments predictable for a larger number of venture capitalists. It also stated that VCs are rather agnostic on the patent question and look for unique investments, where uniqueness means uniqueness to the market and not just in the perception of the company. VCs seem to believe more in proprietary models to find a unique investment. FSFE called for more arguments why VCs believe more in proprietary approaches and suggested a study in this direction.

In another attempt to make the statistics more accurate, Georg Greve mentioned the example of Google. Google is enabled by free software, as it uses Linux clusters for its search engines. But Google also has a patent portfolio, with filings as early as 1997. So the question remains whether venture capital was attracted by the patents held or by the business model based on open source. Opinions diverge on the reasons, and answers will not be easy to obtain.

Bernd Geiger also put forward that, in his experience, many software startups use OSS when it comes to commodity modules, either in the development process or as add-on tools for the product. They all take care that certain handling rules are not violated and stay within the blurred borders of classical business models. The counter-argument was that business models (differentiator: revenue source), development models (differentiator: methodology) and software models (differentiator: control) are largely orthogonal issues. They are not strongly correlated: most business models work on either software model, while some realistically only work on one or the other. This was not seen as opposing the findings so far, as many startups use some sort of OSS software to complement their proprietary approach. But the use of OSS is just a commodity to VCs and does not affect the decision about the uniqueness of the business; only the proprietary part matters.

Further asked, FSFE stated that OSS companies are not against VC, are subject to market mechanics just like any other company, and may need VC if it makes sense in their context. FSFE stated their belief that patents may be overvalued in the VC context, which may be one of the reasons for the inflationary filing going on, but that the actual value of patents is hard to assess and may even create further liabilities. They further criticized an automatism between patents and VC that creates mere comfort through numbers, while the uniqueness mentioned above is a much more complex assessment. Issues of patents, VC funding and software model can thus be seen as connected, but not in a deterministic way.

Issues around Standardization and Procurement

The issue around standardization and procurement is rather new. It is not unusual or absolutely exceptional that governments procure products that contain patented technology. So what is new? First of all, with the advent of the Internet and the Web, technology procured for eGovernment services faces the user directly. In the past, initial eGovernment test applications often required the citizen to install a piece of software. Other services required a certain input format/API that was not standardized in an SDO. Governments quickly understood that mandating citizen-facing technology may need some standardization to organize the coordination of client/server communication, etc. So in procurement tenders, government agencies require compliance with certain standards. It is understood that mandating some standard is not always a sufficient solution for interoperability.

Now if such a required standard carries an essential patent, the government de facto requires the citizen to pay certain royalties to obtain a technology needed to exercise citizen rights and duties. This also has some difficult connotations in tax law, as it uses governmental power to alter the market in favour of a specific stakeholder, thus obliging the citizen to pay a governmentally mandated fee to a private party. But one could always argue that the citizen is still free to use the governmental offices, which would not require such a fee.

Yet another aspect of open source in procurement is that a government has easier ways to scrutinize security aspects, the evolution of the software and the like. So control over the evolution of the software and the ICT infrastructure may be an issue as well. But there are also vendors of closed source where such testing is provided under non-disclosure agreements and where license agreements guarantee some strategic control. Open source just seems much easier to handle in this respect, as it does not come with an additional administrative burden to accomplish the scrutiny.

Back to the trenches: there is a fear in the object code community that OSS is seen as a panacea for issues around interoperability and procurement fees. OSS itself is not necessarily free as in free beer. It is challenging for those making rules for procurement to remain open to both sides. As public procurement is a big market, discussions in this area are rather heated, with everyone vying for the best position in the competitive field.

Issues around Research and Standardization

This is a rather non-contentious issue that was explored by the Copras project and further investigated by the ICT Standardization Steering Group following the study on the European standardization landscape. There is unanimity that research and standardization are interlinked and that those links have to be strengthened. Standardization gives research the opportunity to get scrutiny from a wider audience in industry, it facilitates technology transfer, and it allows industry to refocus research towards the common technology bases used by industry and to avoid re-inventing expensive wheels.

So there is no political issue. But so far, all the Sunday talk has remained lip service. IST projects were made aware of the need to connect to standardization through clauses in the grant agreement, but quite often projects do not have a concrete plan for standardization and do not understand how to use standardization as a tool to help accomplish their goals. Standardization organizations participate in certain projects to bring in their knowledge, but those remain isolated cases of support. There is no support for SDOs reaching out to research projects and no reward for research projects doing standardization; projects are measured on research results. ICTSB had urged several DGs of the European Commission to help, but they were just pointing at each other, and the whole undertaking of getting more standardization service to research projects was drowned in administrative hurdles.

This is also a very serious issue for Software and Services, as new innovations risk failing to connect to the right technology. The chance for disruptive innovations based on a completely new networking stack is rather low, as the investments into networks have been huge and the installed base is very large and would take decades to change. This is emphasized in the NESSI paper that calls for an industrial policy by the EU Commission.

Integration of fora and consortia

The integration of fora and consortia was already at the centre of the study undertaken by DG Enterprise. There are three models on the table for consideration:

The most conservative approach calls for better coordination and communication between the ESOs and the Consortia. The ESOs would be the entry point for the Consortia into the European Normalisation System. Established specifications from other bodies would be copied into the European system to acquire EN status. This option raised many questions concerning copyright on specifications, maintenance and version control. It also raised questions about the resources that Consortia would have to spend additionally to bring their already established work into the formal system.

A second approach is to change Council Decision 87/95/EEC on ICT standardisation to allow the Commission to also address mandates to, and endorse standards from, organisations other than the ESOs. The change would include criteria that those Consortia and their specifications would have to satisfy in order to qualify for integration. Those criteria would follow the WTO criteria for standardisation as implemented in the work going on in DG Enterprise.

The third approach would change Directive 98/34/EC itself: either the criteria already mentioned above would be included in the Directive, or, by Council decision, Annex 1 of the Directive could be extended to also include the most important Fora and Consortia. Determining the concrete candidates would create further discussion on merits. Integration of a variety of Consortia would also be a challenge, as the European system is built to avoid duplication of work and competing standards, thus challenging the European principle of the uniqueness of a standard for a certain area. It was also put forward that such a change would be a very complex political move and would take a long time to succeed. To avoid this complexity, criteria per specification are favoured over the integration of a single standards body. But this raises the issue of competition gaming, as some Consortium may be created by all the others just to challenge the market leader.

Open source models

As an argument of the FRAND/object code camp against the OSS/RF camp, the diversity and mixture of models is put forward. This means that an overall product has both object code components and open source components. The overall product can thus be assimilated to a normal commercial product and would not represent an exception with respect to standards and FRAND regimes. The argument is given to dilute the message from the OSS side that OSS is excluded from implementing standards because of patents. As a counter-argument, even assuming a mixture of object code and open source, only those mixing would be able to cope with the FRAND regimes in standardization; a large part of the open source community would still remain excluded. So diluting even small parts does not make the overall argument less strong; it just cuts out the object code part in a mixed scheme. Claims about how much of the market a certain system/camp represents in numbers have not yet been empirically tested.

Leaving the above dispute aside and not assuming consensus, one could say that the market moves towards more open source. This happens both in classical viral schemes and in other models more friendly to the traditional object-code-oriented industry. The OSS communities react with new innovative schemes to accommodate proprietary parts on the large OSS platforms. At the same time, new schemes of protection against abuse of the open model appear, bringing trademark law into the picture. The legal situation is rather complex, and one could imagine that it is difficult for SMEs to understand the full complexity of mixing models.


Fora and Consortia

Currently, specifications of fora and consortia are not recognized in the European Union. The ESOs benefit from a monopoly created by Directive 98/34/EC. For voluntary industry specifications, this is not a big issue, as those are independent of regulation. But when it comes to New Approach regulation or references, those specifications are excluded. This has a variety of consequences. Some Member States circumvent the monopoly of 98/34 by integrating fora specifications into national legislation, slightly changing them in the political process. This creates a fragmentation of the market that leads to higher prices due to localization efforts.

Open source and FRAND licensing

FSFE: The barriers to entry are particularly harmful in the area of interoperability, where inability to implement standards leads to increased cost and reduces the reuse and recombination factor, which will be essential for the future IT industry.

One particular barrier identified for OSS is the absence of sublicensing in most FRAND licensing schemes. Every new user would have to deal individually with the patent holder, which breaks the main characteristic of open source licensing.

Ex-ante and limits by competition law

Some participants expressed concern about ex ante if it goes beyond mere disclosure of patents and the conditions of access to them. There is some fear that going beyond disclosure would have fuzzy boundaries with anti-competitive behaviour, let alone multi-party negotiations on patent pricing in FRAND regimes. Confronted with the very effective sanction system, there is real reluctance to explore the possibilities of ex ante regimes without further clearing the field with respect to competition law. Along the same lines, patent clearinghouses and multi-stakeholder negotiations ex ante are facing reluctance.

Lack of participation in standardisation

NESSI raised concern about the level and effectiveness of participation in standardisation. ICT is very complex, and new innovations are often based on existing widely deployed platforms like GSM or the Internet. In order to create a level playing field and generate a critical mass for take-up, stakeholders need to come together and agree. This is done in pre-standardisation and standardisation. ETPs already do some of the coordination work needed to achieve a certain impact, but there is still reluctance to participate in standardisation. This means innovative solutions risk being based on the wrong platform, being poorly aligned with the platform, or being isolated.

Venture Capital and Open source

The interaction between venture capitalists and open source projects was rather orthogonal to the goals of this group and would rather have its home in WG4. But the discussion happened and revealed that venture capitalists seem not to believe much in open source: some open source companies have venture capital, but not many of them, according to the statistics circulated in the group.


Fora and Consortia

The recognition of fora and consortia allows companies to target a much larger market, as it addresses the global market in a sectoral way. This leads to benefits and network effects beyond the single market and lowers trade barriers. Costly cooperation efforts whose only goal is to introduce some well-known technology into the EU standardisation system will no longer be needed. European industry will likely be less reluctant to participate in global fora and will thus typically have more influence on the overall direction of such consortia. While in some fields the recognition of fora will not be a concrete solution to an issue, it is of symbolic importance, especially in the area of the future internet, software and services.

Open source and FRAND licensing

Taking OSS and making it compatible with a FRAND licensing scheme will be seen as a double-edged sword. On the one hand, it will allow OSS solutions to enter the market as a normal player. On the other hand, it is already foreseeable that some in the OSS community will object. In their opinion, mixing schemes dilutes the OSS model and brings additional confusion.

Ex-ante, competition law and IPR policies

A dialogue between industry and public administration might allay some fears about patent negotiations in SDOs. Best practices may keep belligerent stakeholders from entering costly legal procedures that would ultimately destroy the level playing field. A fixed, compulsory ex-ante approach lacks the flexibility needed to cope with the wide variety of real-life situations that arise in standards negotiations.

Standards strategy is about competition

Intel provided some substantial thoughts in the direction of more multi-party coordination to avoid litigation and the consequent problems for the technology and the level playing field concerned. It is clear that stakeholders have not explored this route yet and that first experiences are needed. NGMN is moving in this direction and may yield results that could be turned into best practices. But there are still many open questions, one of them being that licenses are mostly negotiated for a product that may implement several patents, so negotiations are not focused on the standard that is only one part of the product.

Venture Capital and Open source

The discussion revealed that open source companies would like to benefit more from venture capital. This would allow them to overcome some bottlenecks in their development. It is clear that Europe, with its high number of open source companies, would clearly benefit from a better understanding between venture capital and open source business models. Success in this area has the potential to even bridge some of the trenches described in this document.


Except for the recognition of fora and consortia, participants in WG3 held a wide variety of viewpoints. Thus none of the actions, apart from the recognition of fora and consortia, is backed by consensus.

For the recognition of fora and consortia, it is suggested that the Commission take the necessary steps to resolve the issue in a timely way. This is seen as especially important in the software sector and in the area of the Future Internet. The main development work in that area happens in global consortia, and the resulting specifications should be recognized to allow companies to address the European market based on those specifications. Further cooperation within the Commission and a push for a near-term solution are put forward.

A study on specificity of software standardization

This is a controversial action. Some participants do not think that this action makes sense; others think it is worthwhile pursuing. (See: Is Software special?)

While software (computer programs, procedures and documentation that perform tasks on a computer system) is functionally distinct from hardware (the physical artifacts of a technology), telecommunications (the transmission of signals) and other domains, the trend towards convergence should not be allowed to blur these distinctions, particularly since their standards landscapes are quite different. A study on the specific standards needs of the software area would help clear up current confusion among stakeholders regarding the importance of standards-based interoperability and open specifications in the software area. Rather than merely listing standards, the study would attempt to ascertain which clusters of standards are most crucial for public stakeholders to watch and get involved in, and would also spell out the requirements in detail. The aim would be to contribute to an innovative European software industry through active participation in software standards development and exploitation. Hence, the focus would be on:

  1. Software standards for interoperability
  2. Specific needs of the software industry
  3. Specific needs of key European stakeholders (industry, consumers, SMEs, governments)
  4. The relationship between software standards and innovation.
  5. The role of software in building a platform for innovation (i.e. the internet)
  6. The removal of barriers to cross-border, cross-system interoperability in software across the EU ecosystem.

A further option would be to issue a software standards mandate to the ESOs to scope the state of the art and suggest a way to address the challenges in the domain. (See E-Health mandate 403.)

The European Union therefore needs to determine to what extent it can bring European standardisation bodies into line with the stated goals of the Common Patent Policy of ITU-T, ITU-R, ISO and IEC.

Some suggest it would be helpful if this group came up with proposed language for the group working on procurement that would tie the acceptance of standards in procurement to meeting the non-discrimination principle in the licensing of the standard as well.

Even though this point was debated, some participants deem it important to ensure that all software paradigms, and the software models fostered within them, can compete effectively when using standards, and that no distortion or legal uncertainty exists in such situations.

Ex-ante, competition law and IPR policies

A workshop on a common understanding of best practices in using ex-ante techniques in FRAND IPR policies may foster a better understanding among all actors involved of where communication and agreements are limited by the protection of the market, and of when such understandings may benefit the market. NGMN has created a positive approach by allowing for coordination outside the court system. It may be worthwhile to look at such platforms for coordination, which would be far more flexible than a pure ex-ante or pure ex-post approach. Further communication within the industry is needed, and a light series of conferences on this subject would keep it on the agenda. This would allow for a better understanding, thus forming the nucleus of the best practices mentioned by Intel.

Open Source and Venture capital

The dialogue between open source and venture capital in the group was very interesting and inspiring. It should not end with the end of this exercise. Both sides could learn from each other and increase their understanding of why things are the way they are. This could take the shape of a dialogue officially supported by the Commission, inviting stakeholders to common workshops to gain a better understanding of each other and to show a way to overcome barriers.

License of Rights

ECIS favours introducing a voluntary License of Right system at the European level that would ensure wider access to technology essential to achieving software interoperability and that would sufficiently protect access to open standards. Again, a workshop could give this idea more shape and would also allow stakeholders to express their opinion on such a system. It would also require the cooperation of the EPO, as the labelling of such patents would allow potential users of the technology to better assess the obstacles, chances and challenges of implementing it. LoR would thus in the end become a kind of quality mark or brand, as is already widely used in the copyright sector with e.g. Creative Commons licenses. Again, further discussion is needed.

OFE suggests these further actions:

  1. A study on legal interoperability, in which lawyers would discuss RF/RAND/LoR regimes and their relation to competition law
  2. A study on contractual market-defence tools against patent trolls, leading to a Commission Internal Market proposal for a new legal instrument indemnifying EU standards against submarine patents and patent trolls