Last week I encouraged Google to rethink its VP8 open-sourcing patent strategy and

“do the right open standards thing — join and contribute to responsible standards groups that are working to solve the royalty-free open standards need.”

The post was picked up by Simon Phipps’ ComputerWorld blog, ZDNet, The Register, LWN, and elsewhere.

At one level, this is a classic debate about what is “open” and what should be its hierarchy of values, priorities, and even basic definitions.

But is a “de facto” standard the same as an “open” standard? No, at least not under the definition of open standards from OpenForum Europe, of which Google is a leading member.

But there is more to consider. Google is including WebM in the next version of Android and rules Android device makers with a strong hand, necessarily playing favorites to steer the Android ship. So the message to Android device makers, suppliers, and wannabes must be clear: get on the WebM bandwagon. And though Google is known for being tough with patent trolls, Android device makers appear to have been left to fend off patent attacks themselves, cut deals, or perhaps receive quiet aid in patent litigation defense.

All commercially rational choices in the crazy, hard-nosed, twisted global mobile patent wars. After all, look at where the patents came from in MPEG LA subsidiary MobileMedia’s lawsuits against Apple, HTC, and RIM (Nokia and Sony) and in HTC’s countersuit against Apple (AMD, through Saxon).

So is the net-net simply “until you take open source and put it in a product you can’t get sued,” so we should just watch the big boys force each other to take, or cave in to, patent risks in order to get to the head of the line for a promising platform? And just hope in the meantime that royalty-free open standards for the Open Web escape cannon-fodder, collateral-damage, or sell-out status in the smartphone patent wars?

Unfortunately, patent hold-up gambits thrive on the adopt-first-ask-questions-later scenarios of the sort Google seems to be arm-twisting for here. Standards groups, regulators, and industry continue to grapple with this challenge. See yesterday’s FTC/DOJ/PTO workshop and the EU’s draft guidelines for horizontal cooperation agreements, which state that “[t]here should be no bias in favour or against royalty free standards, depending on the relative benefits of the latter compared to other alternatives”.

But if vendors ignore open standards altogether, we all lose.

UPDATE

According to CNET, the W3C is taking the position that WebM/VP8 needs to go through a royalty-free standards process:

“WebM/VP8 has the potential of providing a solution for the baseline video format of HTML5. To be seriously considered by the W3C HTML Working Group, the specification would need to go through a standards group and be developed under RF [royalty-free] licensing participation terms,” said Philippe Le Hegaret, leader of Web video work at the W3C, in a statement. “W3C remains interested in having a video format for HTML5 that is compatible with the W3C Royalty-Free Patent Policy.”

It is gratifying to see the FCC Broadband Plan include an open set top recommendation (4.12), firmly grounded in the FCC’s continuing responsibility to implement section 629 of the 1996 Telco Act to “assure the commercial availability” of TV devices from retail and unaffiliated sources.

And there are welcome words in the frank acknowledgment that, over 14 years, “the FCC’s attempts to meet Congress’s objectives have been unsuccessful”.

So a new proceeding will no doubt move forward, given the much-documented crying need, multi-stakeholder support, and explicit congressional directive.

But how to tell if this time the effort is on track, and not just another capture-ready MacGuffin?

Ask four questions:

– Is it specified in an uncaptured venue?
– Does it use unencumbered technologies?
– Is “it” a network interface?
– Does it work for the Web?

Overview presentation downloadable here, slideshow below.

[album: https://robglidden.com/mpegrf/wp-content/uploads/sites/2/2010/03/FCCOpenSetTop/]

Standards “would thwart, not advance, innovation” and “entail crippling delays”  because they are “extremely time consuming, often divisive, and sometimes used by one faction to block the progress of another or to promote its own intellectual property portfolio”.

It would be easy to dismiss comments like these in the cable industry’s latest response to the FCC set-top box inquiry (#27), which question the wisdom and feasibility of a standardized multi-network gateway, as just so much diversionary polemic masked as the caution of experience. But a closer look is merited, in part because, as discussed before, royalty-free standards can be America’s broadband advantage.

On the first question — the wisdom of a multi-network gateway — the NCTA has a point. Adding another intervening box between your TV and your TV content may have a certain quick-fix political logic in the tortured history of the 1996 Telco Act’s goal of competitive services and devices.

But the important question isn’t how many boxes it should take to hook your TV to the Internet (or any network), but how few — and that will take an Internet-acceptable open video standard.

And on this score, the existing gateway initiatives have little to offer, and are even dismissive of the core need in the first place. The Digital Living Network Alliance, seen by some as the leading gateway standards group, said flippantly in its filing to the FCC:

“there are few (if any) standards for Internet video. Another way of looking at it is there are too many standards for Internet video. DLNA Guidelines by themselves do not solve this problem.”

Instead, DLNA promotes a philosophy and architecture of “indirection”:

“network-specific hindrances can be addressed by ‘adding one layer of indirection’”

The software maxim on which DLNA bases its strategy — “any software problem can be solved by adding one layer of indirection” — is well known in software development, but it is a stretch to assume it is therefore an appropriate or adequate policy architecture for the many networking, standards, industry-structure, and business problems of set-top boxes meeting the Internet.
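
To see what the maxim means in software terms, here is a minimal, purely illustrative sketch (the interface and class names are hypothetical, not drawn from the DLNA Guidelines): callers depend on one generic interface, and the network-specific details hide behind adapters, which is the “layer of indirection.”

```typescript
// Purely illustrative sketch of "one layer of indirection" (hypothetical names,
// not the DLNA Guidelines): callers see a generic interface, and each network
// type hides its details behind an adapter.

interface VideoSource {
  fetchStream(programId: string): Promise<ReadableStream<Uint8Array>>;
}

// One adapter per network; the gateway layer is the indirection.
class CableQamSource implements VideoSource {
  async fetchStream(programId: string): Promise<ReadableStream<Uint8Array>> {
    // ...tune, demodulate, decrypt: details specific to the cable plant.
    return new ReadableStream<Uint8Array>();
  }
}

class InternetHttpSource implements VideoSource {
  async fetchStream(programId: string): Promise<ReadableStream<Uint8Array>> {
    const response = await fetch(`https://example.invalid/video/${programId}`);
    return response.body!;
  }
}

// The home-network client only ever sees the indirection layer. Note what the
// layer does NOT do: it says nothing about what video format flows through it,
// which is exactly the standardization question raised here.
async function play(source: VideoSource, programId: string): Promise<void> {
  const stream = await source.fetchStream(programId);
  // ...hand the stream to whatever decoder the device happens to have.
  void stream;
}
```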

DLNA and NCTA are far from alone in proposing variants on the theme of the devolving cycle of standards pragmatism — ambivalence — doubt — bashing (check the UK version here).

But there is a better way. Instead of bashing standards, standards groups, industry groups, participants, and regulators should turn their focus and energies to how to make standards work.

References

REPLY COMMENTS OF THE NATIONAL CABLE & TELECOMMUNICATIONS ASSOCIATION ON NBP PUBLIC NOTICE #27

January 27, 2010

http://fjallfoss.fcc.gov/ecfs/document/view?id=7020384091

– Proposals to require an ANSI standardized gateway solution would entail crippling delays. Standards activities are extremely time consuming, often divisive, and sometimes used by one faction to block the progress of another or to promote its own intellectual property portfolio. It would require years just to get the standards developed, at which point products would still have to be designed, manufactured, and brought to market.

– Subjecting this dynamic marketplace to an ANSI standards process in which each industry participant can delay or veto the innovations of the other would thwart, not advance, innovation.

– These demands call for massive standards activities required in multiple standards bodies for multiple services, interfaces, and technologies. Standardization and related intellectual property clearances are extremely time consuming.

– Zenith, the intellectual property holder for the rejected VSB system, sought to use the process of amending SCTE 40 to put VSB transport into SCTE 40. It slowed the standards process by submitting the majority of objections to SCTE 40 and an unsuccessful appeal to ANSI, in an effort to impose VSB transport onto the cable architecture. This process took years to resolve.

– Under the CEA standards process, IS-6 became IS-132, which became EIA-542, which became CEA-542B. It took more than 13 years to produce the very simple Cable Channel Plan standard. This slow process was one of the reasons that led to the development of CableLabs, so that the cable industry could innovate more rapidly.

– DBS could not have offered MPEG-4 if it had to await elaborate industry consensus or rule change.

– AT&T still would not have deployed U-verse if it were required to wait until IPTV issues were set through industry consensus or by an ANSI-accredited body.

– Had Verizon deferred its hybrid IP/QAM offering until such processes were completed, it too would still be waiting to enter the marketplace.

COMMENTS OF THE DIGITAL LIVING NETWORK ALLIANCE

http://fjallfoss.fcc.gov/ecfs/document/view?id=7020354067

These network-specific hindrances can be addressed by “adding one layer of indirection”—a gateway device.

As some have already commented, there are few (if any) standards for Internet video. Another way of looking at it is there are too many standards for Internet video. DLNA Guidelines by themselves do not solve this problem; however, a DLNA gateway device which is able to receive Internet video is able to bridge that content onto a DLNA home network.

The FCC Video Device Innovation Notice [1] asks a question central not only to the prospect of a viable Broadband Plan for America, but also to the very future of the Open Internet that has revolutionized communications for all of humanity:

“How could the Commission develop a standard that would achieve a retail market for devices that can attach to all MVPD [Multichannel Video Programming Distributor] networks and access Internet-based video sources?”

There can be little doubt that video is a central broadband driver, but simply put, today there is no standard for “Internet-based video”. Just ask the World Wide Web Consortium, which has struggled for years, without success, to find an acceptable video standard to incorporate into HTML5, the first major update to the core Web standard in a decade [2].

Of course, there is a lot of video on the Internet, but it is controlled by a hodge-podge of proprietary and so-called “semi-open” plug-ins that do not meet even the loosest definition of “standard”, and certainly come nowhere near meeting the requirements, processes, and practices of the standardizing bodies of the Web and Internet.
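
The fragmentation is easy to see from a browser’s point of view. Here is a minimal, illustrative sketch (it assumes a browser environment and uses the standard HTML5 canPlayType API, which reports “probably”, “maybe”, or an empty string) that probes which video formats a given browser can decode:

```typescript
// Illustrative only: ask this browser which video formats it can decode.
// canPlayType returns "probably", "maybe", or "" per the HTML5 spec.
const video = document.createElement("video");

const candidates: Record<string, string> = {
  "WebM (VP8/Vorbis)": 'video/webm; codecs="vp8, vorbis"',
  "MP4 (H.264/AAC)": 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"',
  "Ogg (Theora/Vorbis)": 'video/ogg; codecs="theora, vorbis"',
};

for (const [label, mimeType] of Object.entries(candidates)) {
  const support = video.canPlayType(mimeType) || "no";
  console.log(`${label}: ${support}`);
}

// With no royalty-free baseline codec in the HTML5 specification, the answers
// differ from browser to browser, and pages fall back to proprietary plug-ins.
```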

So developing a standard that could attach to both MVPD and Internet video would first require developing a standard for Internet video, on terms acceptable to the Open Internet. The FCC has much to contribute on this score, as do broadcasters in the burgeoning EBU-led “hybrid broadcast-broadband” initiative, which has already developed clear and measurable requirements specifically for hybrid, multi-network video standards [3].

But the non-solution to this problem is as clear as it is unacceptable. Forcing the Open Internet to adopt the closed, “walled-garden” model and the captured, controlled specifications of status-quo gridlock that are dogging digital TV, digital cable, and IPTV, or to concede to proprietary control, would certainly damage the Open Internet and perpetuate the dysfunctional tendencies of “standards as trade association lobbying” that underpin this FCC notice.

The right way forward is equally clear.  Standardize, in appropriate organizations and with appropriate oversight, in the Open Internet model of uncaptured, royalty-free process, the needed elements: codecs, transport stream, conditional access, and UI middleware.

The good news is that media standards gridlock is a global challenge that regulators have addressed in retail-scale national standards, and initiatives already underway on all of these elements point to best practices, ways forward to success, and pitfalls to avoid [4]. Gridlock, capture, patent overcharging, and proprietary control — key underlying contributors to the absence of the robust retail, unaffiliated device markets contemplated by Section 629 and this notice — are not inevitable; they can be addressed [5].

The Commission should evaluate these activities and incorporate the appropriate lessons into a proactive video standardization element of the broadband plan for America, one that envisions video standards that embrace, empower, and leverage the best of the Open Internet, not one that protects walled gardens through silos of pseudo-standards and control points.

References

Notes are in the FCC comment filing, available here.


I’ve pointed out how the EBU, the world’s largest organization of national broadcasters, is beating the drum to avoid patent lock-ins in new standards for hybrid broadcast-broadband TV services.

EBU’s own write-up of last week’s EBU/ETSI workshop is even more direct:

“Broadcasters are haunted by the ghosts of the submarine patents which emerged with MHP … This time this has to be avoided.”

The EBU should take a closer look at Ginga and Java DTV, which have taken the MHP patent issue head-on …

References

Licensing will be key for Hybrid Broadcast Broadband

10 September 2009

“… Finally there was an interactive discussion with the packed audience. Two important areas emerged from the discussions. The first was the need for attention to licence fees in these new systems. Broadcasters are haunted by the ghosts of the submarine patents which emerged with MHP five years after services had begun, and which was responsible for missed opportunities for its use. This time this has to be avoided. Particularly for the hybrid broadcasting area where the world is used to licence free Internet systems.”

“More Democratic” … “It is a matter of social justice”

So US ambassadors have lobbied South American governments since 2007 that “[t]he issue is whether the government will choose the [ATSC] digital television standard that is already providing the highest quality, lowest cost, and most democratic opportunities …”

In recent months Peru, Argentina, and now Chile have turned down ATSC in favor of the Japanese-Brazilian ISDB digital TV system, so it is worth asking the somewhat inconvenient question: how did a controversial, pricey, and generally questionable digital television patent pool drift into becoming a US diplomatic cause for democracy-slash-trade policy?

And whether this advocacy should be carried forward under the leadership of newly appointed Ambassador Philip Verveer of the US State Department’s International Communication and Information Policy (CIP) group, as it was by his predecessor David A. Gross, who, as late as February 2008, advocated in an op-ed column published in Chile:

“During my recent visit to Chile, I met with decision-makers from the government, the National Congress, industry, non-governmental organizations (NGOs), and the media. I explained the clear advantages of the Advanced Television Systems Committee (ATSC) standard: a significant better quality and coverage and a lower cost”

The CIP is one of seven issue-oriented organizations within the Bureau of Economic, Energy, and Business Affairs at the U.S. Department of State. In addition to Verveer, tech industry veteran Lorraine Hariton has just been appointed Special Representative for Commercial and Business Affairs.

There is a lot of food for thought for the new team at CIP on this topic:

– trade policy and standards capture
– broadband-broadcast convergence bridging a global digital divide
– global level playing field network policy in the broadband age

To name a few.  Here’s to hoping the new CIP team will dig in, update outdated thinking, and lean forward!

References

The Opportunity for Chile in the Digital Television Age
Ambassador David A. Gross, U.S. Coordinator for International Communications & Information Policy, February 14, 2008:

“More Democratic: For technical reasons, the ATSC standard allows for a much greater number of broadcast stations to operate in a given area, thereby allowing for new broadcast stations and more types of additional broadcasts such as educational programming.”

“American ATSC Digital TV Standard Offers Chile Advantages of Accessibility, Lower Costs, and Higher Quality”, March 16, 2007:

“Ambassador Kelly pointed out that ATSC offers Chile the unique opportunity of approaching the information society for all its citizens. He noted that the American standard “is much more flexible and open to future changes at reasonable and accessible costs to all.” “It is a matter of social justice that all citizens are able to participate and are guaranteed access,” he emphasized.”

Remarks to the Commercial Association of Sao Paulo (ACSP)
E. Anthony Wayne,  U.S. Assistant Secretary for Bureau of Economic and Business Affairs, Sao Paulo, Brazil, April 6, 2006:

I’d like to briefly discuss Brazil’s vibrant telecommunications industry. I understand that President Lula is going to announce the selection of the Brazilian digital TV standard soon.

Of the several options on the table, we believe the ATSC standard [Advanced Television Systems Committee (the North American standard for digital TV, including high-definition)] offers the best combination of economic, social, and technical advantages. It has been adopted by the U.S., Canada, Mexico and South Korea. These countries that have adopted the ATSC standards are seeing a rapid increase in the sales of high definition television products. Brazil’s adoption of the ATSC standard will ensure a hemispheric standard, creating a market of 800 million people for DTV products and services.

The U.S. no longer manufactures television sets. This poises Brazil to supply high definition television sets, converter boxes, and transmission equipment throughout the Hemisphere. Brazil’s potential role as a leading supplier will help create high-paying, highly-skilled jobs and significant economic development.

The U.S. Overseas Private Investment Corporation has set aside $150 million for U.S. companies to invest in information technology development projects in Brazil. And U.S. companies have already expressed their intention of making significant investments in ATSC-related manufacturing in Brazil.

ATSC’s open development process ensures Brazil a significant role in the evolution of the standard. Evolving ATSC standards present great opportunities for Brazilian-U.S. and Brazilian-South Korean collaboration and partnership.

A “Julius Stonian” observation: standards groups aren’t “consensus organizations”; they are political organizations. Winners declare their way the “consensus”, and changes in political context shift the “consensus”.

That observation came to mind with the calls, in several slides at yesterday’s Hybrid Broadcast-Broadband (HBB) workshop, to look deeper into Intellectual Property Rights and other control points in the new “broadcast+broadband” (aka OTT TV) standards initiatives.

Cases in point: UK Project Canvas (see filing here) and HBBTV, a “consortium” claiming pan-European fait accompli authority that is questioned by the European Broadcasting Union’s workshop slides.

So Stonian kudos to MHEG vendor S&T and the EBU for frank, to-the-political-point, what’s-in-it-for-me observations:

  • “Do you know what IPR issues exist in new initiatives like HBBTV???” (S&T slide 45)
  • “IPR and patent issues shall be resolved prior to rolling out the HBB services” (EBU slide 9)
  • “Unresolved IPR issues (particularly “submarine” patents)” (EBU slide 28)

The EBU’s observations are perhaps the most interesting, because they draw from a February 2009 EBU recommendation and initiative, cited below, that cuts to the chase of many of the fundamental political and policy interests at stake when “broadcast meets broadband”. Many of those interests don’t fit neatly into the current standards landscape status quo; they require both regulatory oversight and clear, direct articulation of broadcasters’ interests (as well as other interests, public and private).

The EBU’s observations are a frank step ahead of the BBC’s have-it-both-ways “standards-based open environment” double talk challenged here (although neatly respected by Andrew Burke here), and miles ahead of the US ATSC Forum’s recent tepid “almost-as-good-as-Europe” defense.

So who is Julius Stone?

Since his death in 1985, the influence of Julius Stone, one of the 20th century’s great legal scholars, has enjoyed a strange yet welcome renaissance.

Born in Yorkshire in 1907 to Lithuanian Jewish refugees, Stone published the first of his 27 books, “International Guarantees of Minority Rights”, when he was only 25; it is still considered “the most authoritative and objective work in its field”.

An Australian law professor from 1942 on, Stone produced a half-century of jurisprudence that once seemed destined to drift into dated obscurity. A review of a 1992 biography questioned the wisdom of bothering with a biography at all, since “as the spheres are aligned in the academic firmament today, Julius Stone’s star is not burning particularly brightly”.

But the generations he marked knew better, including this writer, who was blessed in law school when Stone (he taught part-time in the US after his retirement) assigned one of his last books, “Conflict Through Consensus”, a thin dissection entirely unlike other law school casebooks, whose title alone knifed a foolishness of his century: the 51-year quest to legalistically define the term “aggression” so as to whitewash some of the century’s greatest criminal acts.

To this day the title “Conflict Through Consensus” rings like an alarm in my head whenever I see cover-up terms like “consensus organization” bandied about in standards groups’ process documents. Of course there is no consensus when there is conflict, and only a nitwit or a self-deceiving “expert” could fail to see, as Stone once far more artfully put it, “the realities disclosed by an examination of the definition.”

In 1999 an Institute of Jurisprudence was founded in Stone’s name, and nowadays the accolades flow.

And there is a heavyweight annual lecture series featuring dense yet topical speeches on international jurisprudence, such as one by a Harvard law professor warning of the dangers when “[w]e underestimate the power of expert consensus”, or one by a St John’s law professor on how overstated notions of “legal pluralism” mislead in a “new lex mercatoria” of pseudo-governmental venues.

The common Stonian thread?  A warning, really, of the dangers of overbelief, particularly in an international organizational context, as conditions evolve.  Overbelief in a status quo of expert consensus. Overbelief in the ultimately deluding dead-end thought process that others might be fooled by a “consensus covering up conflict”.  Overbelief in misleading notions of “legal pluralism” that see in the status quo of international organizations some sort of law, rather than just behavior.

So cut the “pan-European consortium” PR happy talk, folks, and put the interests on the table (here’s how).  Who owns what, and who will get what?  The EBU has had the courage and insight to do as much; so should everyone else.

References

“I begin with a simple “Julius Stonian” observation: the international world is governed.   The domain outside and between nation states is neither an anarchic political space beyond the reach of law, nor a domain of market freedom immune from regulation.   Our international world is the product and preoccupation of an intense and ongoing project of regulation and management.”

David Kennedy (Harvard Law School professor), “Challenging Expert Rule: The Politics of Global Governance”, 2004 Julius Stone Memorial Address, Sydney, Australia.

European Broadcasting Union Recommendation, “Television in a Hybrid Broadcast/Broadband Environment”, February 2009, http://tech.ebu.ch/docs/r/r127.pdf:

“The EBU recommends that EBU Members must foster, in cooperation with the industry and standardization bodies, the development of hybrid broadcast/broadband technical platforms with the necessary technical commonality to ensure the development of a European-wide consumer market …

It is fair and reasonable that consumers should enjoy PSB [Public Service Broadcasting] ‘rich-media’ delivered over hybrid broadcast-broadband networks in the same way they consume broadcast-only content. They should be able to do so without organizations that have not contributed to the production process capitalizing on the process.

The EBU and its Members need to analyze the European and national regulatory frameworks, taking action where appropriate, to ensure that third parties associate their broadband services with EBU Member’s programmes only when authorized. For example, PSBs should retain editorial control of all content associated with their programmes (e.g. EPGs, surrounding text and rich multimedia, advertising and banners, picture-in-picture, interactive applications).”

I have filed comments in the UK Project Canvas public consultation. To catch up on the UK context with global implications, watch James Murdoch’s mesmerizing anti-BBC screed, and say…

“This is the BBC.”

Perhaps no other single phrase has broadcast more meaning to more people in the great call to communicate that has gripped our species and planet in the last two centuries and fed waves of techno-political-industrial revolutions from telegraphs to telephones, radio to TV.

And now, working title “Project Canvas”, the BBC’s proposal for a broadcaster-led, free-to-view IPTV service.  This is not  Telco TV, that cable-imitating subscription TV.

Project Canvas is the Internet TV every consumer wants (just hook the Net to my TV and let me watch for free) and (nearly) every incumbent dreads.

But the BBC-led Freeview is coming off a back-from-the-dead UK success, putting Free-To-View broadcasting back on the business-model map.  If anyone has earned the right to think different about IPTV, it is the BBC.

So little wonder trust is the watchword of the moment.

  • “Absence of Trust”, James Murdoch is shouting.
  • “Potential of Trust”, UK trust-busting regulators are whispering after killing the precursor Project Kangaroo.
  • And “BBC Trust”, the BBC’s watchdog-cum-champion that is running the Project Canvas public consultation.

In the 1981 MacTaggart Lecture, long before the World Wide Web as we know it today, and 28 years before James Murdoch reprised his father’s 1989 role on the Edinburgh International Television Festival stage, Peter Jay painted the high-stakes vision of today’s Project Canvas:

“Quite simply we are within less than two decades technologically of a world in which there will be no technically based grounds for government interference in electronic publishing. To put it technically, ‘spectrum scarcity’ is going to disappear. In simple terms this means that there will be as many channels as there are viewers. At that moment all the acrimonious and difficult debate about how many channels there should be, who should control them, have access to them and what should be shown on them can disappear. But it will only disappear if we all work, indeed fight, extremely hard.”

So why shouldn’t Project Canvas also be built on royalty-free standards, advancing rather than opposing the thrust of the Open Internet and World Wide Web that has enabled the Project Canvas opportunity in the first place?

Is the BBC slipping unthinkingly into a common parlance of the day – seduced by the cynical allure of a semi-open “standards-based open environment” — open enough to help me, closed enough to hurt my competitors, with vendor complicity bought by the potential competitive advantage of conveniently under-disclosed patent royalties or other control points?

This is an under-addressed question that the BBC Executive, the BBC Trust, and the proposed joint venture have so far skirted in this consultation; it should be fully addressed before proceeding. A Free-To-View Internet TV is both a TV stewardship and a network stewardship.

CONTENTS

EXECUTIVE SUMMARY
DISCUSSION
I. TO DATE THE PROJECT CANVAS CONSULTATION HAS NOT ADEQUATELY CONSIDERED KEY IPR BEST PRACTICES
A. The Core Principle of “Standards-Based Open Environment” is Ill-Defined and Problematic
B. The Needs of and Responsibilities to the Future of the Open Internet Are Not Sufficiently Considered
II. THE LATEST BBC RESPONSE RAISES IPR PROCESS CONCERNS
A. Proposed Framework Compounds Core Problems
B. Preferred Partner DTG Does Not Adequately Address IPR Process
III. PROJECT CANVAS SHOULD ADOPT AN IPR PROCESS BASED ON FACILITATION, EX ANTE, & PREFERENCE FOR ROYALTY FREE
A. Facilitation
B. Ex Ante
C. Preference for Royalty-Free
CONCLUSION

It is very exciting to see the “Open Video” movement taking off and finding voice with the upcoming Open Video Conference.

This well-earned “open breakthrough” has been a long time coming.  After all, open standards, and particularly royalty-free standards, are the very foundation of the Open Internet as we know it, and Internet leaders are vocal that open and royalty free standards are essential to its future.

But where are the open standards for open video?  Why don’t we already have them?

Hint:  business guru W. Edwards Deming once said: “If you control an industry’s standards, you control that industry lock, stock, and ledger”.

This bitter pill of insight points to the first thing you should know about open video and open standards:

1) Open Video is Collateral Damage
of the Digital TV Standards Wars


It’s not hard to figure out that if you could quietly bake your patents into a standard and then name your price after the standard becomes widely deployed, you could make a lot of money and wield a lot of control.

Great work if you can get it, and that’s pretty much the story of a set of international video and digital TV standards that got going in the 1990s, with MPEG the poster child of modern patent-pooled standards.

Of course this is a tale of big bucks.  Think $26 to $40 per TV, billions of dollars in royalties on billions of devices, vendor shoot-outs, litigation, dueling industry groups, back-room deals, claims of abuse, and consumer groups pushing for public disclosure of confidential patent licensing practices hidden behind claims they are “reasonable and nondiscriminatory”  — “RAND” in standards-speak.

So it is hardly surprising that the RAND licensing practices developed through the DTV experience have done little or nothing to contribute the royalty-free video technologies or standards now needed for broadband deployments, which today are essentially captured by proprietary solutions.

2) Standards Aren’t Just a “Techy Topic”
— They’re a Policy Problem


In fact, scratch almost any network policy issue and you’re likely to find a standards issue lurking inside.  Indeed, America’s broadband plan needs a standards policy.

Turns out country after country has a national “standards strategy”.

UK, France, Germany, Canada, and Korea to name a few.  Some closely tie international standards advantage to IPR & patents, as in Japan (“Intellectual Property Strategy Headquarters decided the International Standardization Comprehensive Strategy, with the aim of enhancing the international competitiveness of Japanese industries and contributing to setting global rules”) and China (“[the] Trade Barrier Treaty [TBT] can be used under the mask of standardization, patents and intellectual-property rights to obtain most world trade advantages.”).

And those that don’t, like Taiwan, have vendors crying foul.

Even in the U.S., a prescient 1992 Congressional report warned:

“The United States has been fortunate to have a pluralistic, industry-led standards setting process that has served us well in the past. Whether it will continue to do so in the future in the face of bruising international economic competition is uncertain.”

So if you think standards are for geeks and not wonks, think again.  As a Toyo University professor recently put the blunt zen to it:

“Standardization activities are political negotiations and not a forum for assessing which technologies excel over others.”

3) Open Source Doesn’t Solve
the Open Standards Problem


I don’t actually know anyone who is really confused or bent out of shape about the difference between “open source” and “open standard”, or who believes that one is a good substitute for the other. They are of course different things (one’s a license, one’s a specification, and so on).

But if you are inclined to dig into this, check here or search the Web for “open source vs. open standards” and you’ll find numerous nice explanations.

4) Don’t Confuse Patent Reform with
Patent Licensing (They’re Different)


Another potential source of confusion is the distinction between patent reform — various proposals to make it more difficult to get a patent, to assure that patents are of appropriate quality, to tighten definitions of obviousness and so forth — and patent licensing — the rules and practices of patent pool licensing, disclosure, and IPR (Intellectual Property Rights) policies of standards groups.

Patents have been around for centuries, and so have patent pools, but the regulatory and policy linkages between the two are weaker than one might think. In fact, for a long time patent pools were rare and highly frowned upon by regulators (they weren’t even mentioned in the 1992 Congressional report on standards). Many would trace the beginning of the “modern” patent pool era to the U.S. Department of Justice’s authorization of the MPEG patent pool in the late 1990s.

Pools and patents serve very different policy needs, raise different policy concerns, and by and large are even regulated by different entities.

So unless you are counting on a major scaling back of the patent system that somehow just makes patent issues go away (and few people are), it makes more sense to find a way, as many have, to achieve the desired business-model results within the patent system we have.

5) “RAND” Isn’t


So what does the term “reasonable and nondiscriminatory” actually mean?

In theory it’s the commitment to fair licensing required of patent holders in standards groups that — unlike the W3C which defines HTML — are open to patents.

But in reality, since price isn’t set until after the standard comes out (sometimes years later), RAND ends up meaning whatever the patent holders want it to mean.

Studies of RAND licensing typically conclude:

“few SSOs [standard-setting organizations] define the term ‘reasonable and nondiscriminatory’ or have mechanisms to resolve disputes about its interpretation”

So Richard Stallman said it well:

“half of “RAND” is deceptive and the other half is prejudiced”

Still, sincere efforts have been made to give the term “reasonable and nondiscriminatory” a meaning in standards IPR policies. For example, the American Bar Association’s Standards Development Patent Policy Manual is a good source. But good luck if you hope to wade through lawyerly weighing of “multiple factors” to get any particular practice declared unreasonable or discriminatory.

6) Don’t Fall For FUD — There Is a Solution


Finally, it seems there is a never-ending version of Fear, Uncertainty and Doubt that goes something like “you can never really be sure that someone might have a patent so there is no way to ever be sure a standard is truly royalty free”.

To be blunt — this is nonsense, and don’t believe it. There are thousands of royalty-free standards in the world, and although the number of patent disclosures started to accelerate in the 1990s, the vast majority of standards have no particular IPR or patent issues to speak of.

And even in areas of particular patent thickets and patent controversies, standards organizations with a determined and specific royalty-free policy and process (Khronos and Web3D are a couple of examples) have successfully established their royalty-free credentials.  Sure it takes diligence, a “Freedom-to-Operate” analytical approach, proactive patent reading, time and determination.  Dirac is already making good progress down this path.

So get going Open Video-ers — let’s get some truly open, truly royalty-free standards initiatives going!


References

“The Internet is fundamentally based on the existence of open, non-proprietary standards” Vint Cerf, “the father of the Internet” cited in The Importance of Open Standards in Interoperability, OFE Onepage Brief No.1 (31.10.08.) Available at http://www.openforumeurope.org/library/onepage-briefs/ofe-open-standards-onepage-2008.pdf.

“It was the standardisation around HTML that allowed the web to take off. It was not only the fact that it is standard but the fact that it is open and royalty-free. If HTML had not been free, if it had been proprietary technology, then there would have been the business of actually selling HTML and the competing JTML, LTML, MTML products.”

Tim Berners-Lee, quoted in Standards and the Future of the Internet, Declaration 25th February 2008, at http://www.openforumeurope.org/press-room/press-releases/standards-and-the-future-of-the-internet/

Yesterday’s kickoff of the FCC’s Broadband Plan proceedings was broadcast over the Internet in a proprietary video format.

Worse, it was likely converted from a standards-based format to a proprietary format before it was put on the Internet! (The tip-off is that the closed-captioning overlay was already composited in).

Clearly, a proprietary broadband internet would not be, borrowing one Commissioner’s phrase, an “enlightened public policy” for America’s Broadband Plan.  The FCC’s notice of inquiry states (emphasis added):

“We also note that the development of equipment and protocol standards is a key element in broadband deployment and seek comment on the appropriate role of the Commission in facilitating the development of such standards.”

So here is a clear, actionable role for the Commission — use standards.   Just say no to proprietary formats.

Statements by Commissioners echoed the historic policy importance and high stakes of this proceeding (emphasis added):

“Broadband can be the great enabler that restores America’s economic well-being”…

…. “the most important public policy initiative affecting broadband since the landmark Telecommunications Act of 1996” …

…. “it is critical that our plan be competitively and technologically neutral … our plan must not favor one particular technology or type of provider over another, even inadvertently”

Please do not inadvertently favor turning the open Internet into a proprietary one in the name of broadband policy.