Last week I encouraged Google to rethink its VP8 open-sourcing patent strategy and

“do the right open standards thing — join and contribute to responsible standards groups that are working to solve the royalty-free open standards need.”

The blog was picked up in Simon Phipps’ ComputerWorld blog, ZDNet, The Register, LWN and elsewhere.

At one level, this is a classic debate about what is “open” and what should be its hierarchy of values, priorities, and even basic definitions.

But is a “de facto” standard the same as an “open” standard? No, at least not under the definition of open standards used by OpenForum Europe, of which Google is a leading member.

But there is more to consider.  Google is including WebM in the next version of Android and rules Android device makers with a strong hand, necessarily playing favorites to steer the Android ship. So the message to Android device makers, suppliers, and wannabes must be clear: get on the WebM bandwagon.  And though Google is known as tough with patent trolls, Android device makers appear to have been left either to fend off patent attacks themselves, to cut deals, or perhaps to accept quiet aid in patent litigation defense.

All commercially rational choices in the crazy, hard-nosed, twisted global mobile patent wars.  After all, look at where the patents came from in MPEG LA subsidiary MobileMedia’s lawsuits against Apple, HTC, and RIM (Nokia and Sony) and HTC’s countersuit against Apple (AMD through Saxon).

So is the net-net simply “until you take open source and put it in a product you can’t get sued,” so just watch the big boys force each other to take, or cave in to, patent risks in order to get to the head of the line for a promising platform?  And just hope in the meantime that royalty-free open standards for the Open Web escape cannon-fodder, collateral-damage, or sell-out status in the smartphone patent wars?

Unfortunately, patent hold-up gambits thrive on adopt-first-ask-questions-later scenarios of the sort Google seems to be arm-twisting for here.  Standards groups, regulators, and industry continue to grapple with this challenge.   See yesterday’s FTC/DOJ/PTO workshop and the EU’s draft guidelines for horizontal cooperation agreements that mention that “[t]here should be no bias in favour or against royalty free standards, depending on the relative benefits of the latter compared to other alternatives”.

But if vendors ignore open standards altogether, we all lose.

UPDATE

According to CNET, the W3C is taking the position that WebM/VP8 needs to go through a royalty free standards process:

“WebM/VP8 has the potential of providing a solution for the baseline video format of HTML5. To be seriously considered by the W3C HTML Working Group, the specification would need to go through a standards group and be developed under RF [royalty-free] licensing participation terms,” said Philippe Le Hegaret, leader of Web video work at the W3C, in a statement. “W3C remains interested in having a video format for HTML5 that is compatible with the W3C Royalty-Free Patent Policy.”

Much of the initial commentary on Google’s open sourcing of the VP8 codec it acquired in purchasing On2 has breathlessly, and uncritically, centered on the purported game-changing impact of the move.

But unfortunately, these commentaries miss an essential point that Google has studiously avoided mentioning: the need to standardize royalty-free codecs (not just release an open source snapshot).

But since forward motion is good simply because it is forward motion, shouldn’t one hesitate to look this gift horse in the mouth?

Unfortunately, in the case of multimedia codecs and technologies, ignoring open standards and instead presenting open sourcing as a fait accompli solution just works to the detriment of the entire open community.

The open Web needs royalty free standards (true, multi-stakeholder run standards, not unilateral actions) — that is its essential genius.  And without them, proprietary, vendor-controlled projects, even those that self-label as “open”, do little good and more likely do more harm than good.  We all have the right to expect, and demand, that the Web’s current beneficiaries and leaders stay true to this fundamental open standards proposition, and not just forget it when convenient.  And this includes Google.

It is well known that many experts consider it now feasible to standardize serviceable royalty-free codecs.  MPEG (the standards group, not the unaffiliated license administrator MPEG LA) has even put out a resolution to that effect, and the IETF has recently launched a royalty-free codec activity in a similar spirit.  Google should get on board with this important trend, not undermine it with studied avoidance.  So far it has not.

It is important to understand that patent claims are typically handled under confidential non-disclosure agreements.  So unless there is a forcing function (litigation or standardization-required disclosure and review), there is no effective way to know who is actually claiming, and who is paying, what.  And there are documented cases of this going on for literally years.  So leaving VP8 code out in the open with nothing but a mutual non-assert license leaves the patent issue not only unaddressed, but up for capture by those with uncharitable agendas, and on their turf and time frame (let’s at least hope that’s sooner rather than later — but remember, forming patent pools rarely disclose all their patents up front).

Not a smart move, and hopefully an error Google will recognize and correct quickly (here’s a useful cover story: we intended to smoke out patent holders all along, and we were going to get around to working with standards groups when we had the chance).  Contributing VP8 to a standards group with a strong patent disclosure policy would be a good corrective move; it would force lurking patent holders to come fully into the open. Not perfect, but a step forward.

Google’s open sourcing of VP8 is very different from Sun’s Open Media Stack codec work, and for that matter from other responsible open video initiatives, which have based their work on identifiable IPR foundations, documented their patent strategy, and been willing to work with bona fide standards groups to address and resolve IPR issues.  When companies like Google ignore standards and go it alone in areas as important as video codec standards, they undermine the very standards groups the open Web needs to thrive and grow.

We’d never accept a brand name company unilaterally declaring control of the next version of TCP/IP, HTML, or any other of a host of foundational Internet and Web standards simply by open sourcing something they’d bought.  Codecs will also be such a foundational component, a critically important one.  Just because the technology of codecs might be less familiar than some other technologies is no reason to abandon the royalty-free standardization philosophy that has built the Web.

Certainly not based on the complete feel-good-marketing non-explanation for this radical abandonment that Google has offered so far.   Because patent pool licensing is out of control?  No argument about that from me (or antitrust complainants Nero, VIZIO, and others).  Because Google “must have done its patent homework”?  OK, if so why not hand that homework in as a contribution to a standards group where it could get some expert scrutiny?

So I would encourage Google to do the right open standards thing — join and contribute to responsible standards groups that are working to solve the royalty-free open standards need.  Be a part of the royalty-free, open-standards solution, not part of the problem.

UPDATE

Tip of the hat to Xiph’s leader, Chris Montgomery, for good tongue-in-cheek humor:

I, FOR ONE, WELCOME OUR NEW WEBM OVERLORDS

To avoid confusion: Xiph is wholeheartedly supporting WebM. But another remark by Montgomery, as reported, is also interesting:

But Monty isn’t worried about the MPEG-LA suing him or anyone at the WebM Project:

“The recent saber-rattling by Jobs felt more like a message to his own troops than a warning shot to ours,” he says. “MPEG itself has always had an internal contingent that has pushed hard for royalty-free baselines from MPEG, and the missives about video codecs and patents were probably meant for them, not us.”

After a lively debate, the IETF appears to be moving forward with a royalty-free audio codec standardization activity.  Here’s to its successful launch and positive outcome.

I’ve put a brief summary at the mpegrf.com site, and there is a good summary here.

The group’s email discussion alias is here — and my view, expressed there (echoing this), is pretty straightforward:

“[codec] Royalty Free codec standards — don’t settle for less”

Here is my view, perhaps you share it, perhaps you don’t.

What the world needs now is royalty-free, standardized codecs. This is critical to the future of the Web, and the progress the Internet has brought to the world, and will bring to the world.

Video, audio, transport, the whole thing. Evaluated, vetted for patents. Under an appropriate, responsible and complete royalty free process. No less.

IETF, ITU, and ISO/MPEG should all get going on this important activity — after all, why shouldn’t all of these organizations include this as core to their missions?

I have, and no doubt you have too, seen countless explanations why this should not, could not, will not, rather not, might not, or can not happen. Some well meaning and sincere, some from vested interests.

There are too many “powerful” interests against it. “Important” commercial interests are ambivalent. It is too hard “legally” or “politically” or “technically”. It is just too confusing to think through. There is no longer a critical mass that cares enough about keeping the future of the Open Internet open and royalty free. The well meaning are ignorant, or naive. Etc.

Don’t settle. Take the issue of royalty free, standardized codecs all the way to the top of these organizations. Do what it takes. If it requires new organizations, start them. If it requires revised processes, revise them. This is the spirit that built the Web and the Internet, this is the spirit that is its lifeblood, and this is the spirit that needs to be at the heart of its future.

Don’t settle. Don’t let those who have tried hard already, or have only half-heartedly tried, justify the status quo or their half-heartedness. Encourage them to focus on how to take the next steps.

Don’t let convenient “interpretations” of standards processes be an excuse for never starting, never finishing, or never setting up processes that will work. Need more legal background? Find it. More technical information? Get it.

Don’t settle. The world has plenty of patent-encumbered media standards, plenty of proprietary solutions, and plenty of standards in other domains that have figured out how to deliver royalty free.

But the world does not have enough royalty-free codec standards, so this is the task that needs to be addressed.

Rob


In late 2001, to much industry enthusiasm, H.264/MPEG-4 AVC was launched as the world’s unifying codec in a joint project between the ITU and ISO/MPEG, with the undertaking that the “JVT [Joint Video Team] will define a ‘baseline’ profile. That profile should be royalty-free for all implementations.”

The failure to deliver on this royalty-free baseline is more than a lively standards history tale.

Years of exhausting disputes and doubts have recently resolved with court rulings soundly vindicating the original royalty-free process and vision.

And now, more than ever, the Web and broadband revolution need these groups to deliver on this 2001 royalty-free undertaking.  And in the coming months, ITU and ISO are poised to begin work on a next generation of codec and transport stream standards.

I have summarized a pro royalty-free viewpoint on how ITU and ISO/MPEG can and should go forward and complete this royalty free undertaking here.

A “Julius Stonian” observation:  standards groups aren’t “consensus organizations”, they are political organizations. Winners declare their way the “consensus”, and changes in political context shift the “consensus”.

That observation is reflected in calls, in several slides at yesterday’s Hybrid Broadcast-Broadband (HBB) workshop, to look deeper into Intellectual Property Rights and other control points in the new “broadcast+broadband” (aka OTT TV) standards initiatives.

Cases in point:  UK Project Canvas, see filing here, and HBBTV, a “consortium” claiming pan-European fait accompli authority that is questioned by the European Broadcasting Union’s workshop slides.
So Stonian kudos to MHEG vendor S&T and the EBU for frank, to-the-political-point, what’s-in-it-for-me observations:

  • “Do you know what IPR issues exist in new initiatives like HBBTV???” (S&T slide 45)
  • “IPR and patent issues shall be resolved prior to rolling out the HBB services” (EBU slide 9)
  • “Unresolved IPR issues (particularly “submarine” patents)” (EBU slide 28)

The EBU’s observations are perhaps the most interesting, because they draw from a February 2009 EBU recommendation and initiative, cited below, that cuts to the chase of many of the fundamental political and policy interests at stake when “broadcast meets broadband”, many of which don’t fit neatly into the current standards landscape status quo but require both regulatory oversight and clear, direct articulation of broadcasters’ interests (as well as other interests, public and private).


EBU’s observations are a frank step ahead of the BBC’s have-it-both-ways “standards-based open environment” double talk challenged here (although neatly respected by Andrew Burke here), and miles ahead of the US ATSC Forum’s recent tepid “almost-as-good-as Europe” defense.

So who was Julius Stone?

Since his death in 1985, the influence of Julius Stone, one of the 20th century’s great legal scholars, has enjoyed a strange yet welcome renaissance.

Born in Yorkshire in 1907 to Lithuanian Jewish refugees, Stone published the first of his 27 books, “International Guarantees of Minority Rights”, at only 25; it is still considered “the most authoritative and objective work in its field”.

An Australian law professor from 1942 on, Stone produced a half-century of jurisprudence that once seemed destined to drift into dated obscurity. A review of a 1992 biography questioned the wisdom of bothering with a biography at all, since “as the spheres are aligned in the academic firmament today, Julius Stone’s star is not burning particularly brightly”.

But the generations he marked knew better, including this writer, who was blessed in law school to have Stone (he taught part-time in the US after his retirement) assign one of his last books, “Conflict Through Consensus”, a thin dissection entirely unlike other law school casebooks, whose title alone knifed a foolishness of his century: the 51-year quest to legalistically define the term “aggression” to whitewash some of the century’s greatest criminal acts.

To this day the title “Conflict Through Consensus” rings like an alarm in my head whenever I see cover-up terms like “consensus organization” bandied about in standards groups’ process documents.  Of course there is no consensus when there is conflict, and only a nitwit self-deceiving “expert” couldn’t see, as Stone once far more artfully put it, “the realities disclosed by an examination of the definition.”

In 1999 an Institute of Jurisprudence was founded in Stone’s name, and nowadays the accolades flow.

And a heavyweight annual lecture series features dense yet topical speeches on international jurisprudence, such as one by a Harvard law professor warning of the dangers when “[w]e underestimate the power of expert consensus”, or one by a St John’s law professor on how overstated notions of “legal pluralism” mislead in a “new lex mercatoria” of pseudo-governmental venues.

The common Stonian thread?  A warning, really, of the dangers of overbelief, particularly in an international organizational context, as conditions evolve.  Overbelief in a status quo of expert consensus. Overbelief in the ultimately deluding dead-end thought process that others might be fooled by a “consensus covering up conflict”.  Overbelief in misleading notions of “legal pluralism” that see in the status quo of international organizations some sort of law, rather than just behavior.

So cut the “pan-European consortium” PR happy talk, folks, and put the interests on the table (here’s how).  Who owns what, and who will get what?  The EBU has had the courage and insight to do as much; so should everyone else.

References

“I begin with a simple “Julius Stonian” observation: the international world is governed.   The domain outside and between nation states is neither an anarchic political space beyond the reach of law, nor a domain of market freedom immune from regulation.   Our international world is the product and preoccupation of an intense and ongoing project of regulation and management.”

David Kennedy (Harvard Law School professor), “Challenging Expert Rule: The Politics of Global Governance”, in 2004 Julius Stone Memorial Address, Sydney Australia.

European Broadcasting Union Recommendation, “Television in a Hybrid Broadcast/Broadband Environment”, February 2009, http://tech.ebu.ch/docs/r/r127.pdf:

“The EBU recommends that EBU Members must foster, in cooperation with the industry and standardization bodies, the development of hybrid broadcast/broadband technical platforms with the necessary technical commonality to ensure the development of a European-wide consumer market …

It is fair and reasonable that consumers should enjoy PSB [Public Service Broadcasting] ‘rich-media’ delivered over hybrid broadcast-broadband networks in the same way they consume broadcast-only content. They should be able to do so without organizations that have not contributed to the production process capitalizing on the process.

The EBU and its Members need to analyze the European and national regulatory frameworks, taking action where appropriate, to ensure that third parties associate their broadband services with EBU Member’s programmes only when authorized. For example, PSBs should retain editorial control of all content associated with their programmes (e.g. EPGs, surrounding text and rich multimedia, advertising and banners, picture-in-picture, interactive applications).”

I have filed comments in the UK Project Canvas public consultation.  To catch up on the UK context with global implications, watch James Murdoch’s mesmerizing anti-BBC screed, and say…

“This is the BBC.”

Perhaps no other single phrase has broadcast more meaning to more people in the great call to communicate that has gripped our species and planet in the last two centuries and fed waves of techno-political-industrial revolutions from telegraphs to telephones, radio to TV.

And now comes “Project Canvas” (a working title), the BBC’s proposal for a broadcaster-led, free-to-view IPTV service.  This is not Telco TV, that cable-imitating subscription TV.

Project Canvas is the Internet TV every consumer wants (just hook the Net to my TV and let me watch for free) and (nearly) every incumbent dreads.

But the BBC-led Freeview is coming off a back-from-the-dead UK success, putting Free-To-View broadcasting back on the business-model map.  If anyone has earned the right to think different about IPTV, it is the BBC.

So little wonder trust is the watchword of the moment.

  • “Absence of Trust”, James Murdoch is shouting.
  • “Potential of Trust”, UK trust-busting regulators are whispering after killing the precursor Project Kangaroo.
  • And “BBC Trust”, the BBC’s watchdog-cum-champion, which is running the Project Canvas public consultation.

In the 1981 MacTaggart Lecture, long before the World Wide Web as we know it today, and 28 years before James Murdoch reprised his father’s 1989 role on the Edinburgh International Television Festival stage, Peter Jay painted the high-stakes vision of today’s Project Canvas:

“Quite simply we are within less than two decades technologically of a world in which there will be no technically based grounds for government interference in electronic publishing. To put it technically, ‘spectrum scarcity’ is going to disappear. In simple terms this means that there will be as many channels as there are viewers. At that moment all the acrimonious and difficult debate about how many channels there should be, who should control them, have access to them and what should be shown on them can disappear. But it will only disappear if we all work, indeed fight, extremely hard.”

So why shouldn’t Project Canvas also be built on royalty-free standards, advancing rather than opposing the thrust of the Open Internet and World Wide Web that has enabled the Project Canvas opportunity in the first place?

Is the BBC slipping unthinkingly into a common parlance of the day – seduced by the cynical allure of a semi-open “standards-based open environment” — open enough to help me, closed enough to hurt my competitors, with vendor complicity bought by the potential competitive advantage of conveniently under-disclosed patent royalties or other control points?

This is an under-addressed question that the BBC Executive, BBC Trust and proposed joint venture have skirted so far in this consultation, and should be fully addressed before proceeding. A Free-To-View TV Internet is both a TV and a network stewardship.

CONTENTS

EXECUTIVE SUMMARY
DISCUSSION
I. TO DATE THE PROJECT CANVAS CONSULTATION HAS NOT ADEQUATELY CONSIDERED KEY IPR BEST PRACTICES
A. The Core Principle of “Standards-Based Open Environment” is Ill-Defined and Problematic
B. The Needs of and Responsibilities to the Future of the Open Internet Are Not Sufficiently Considered
II. THE LATEST BBC RESPONSE RAISES IPR PROCESS CONCERNS
A. Proposed Framework Compounds Core Problems
B. Preferred Partner DTG Does Not Adequately Address IPR Process
III. PROJECT CANVAS SHOULD ADOPT AN IPR PROCESS BASED ON FACILITATION, EX ANTE, & PREFERENCE FOR ROYALTY FREE
A. Facilitation
B. Ex Ante
C. Preference for Royalty-Free
CONCLUSION

It is very exciting to see the “Open Video” movement taking off and finding voice with the upcoming Open Video Conference.

This well-earned “open breakthrough” has been a long time coming.  After all, open standards, and particularly royalty-free standards, are the very foundation of the Open Internet as we know it, and Internet leaders are vocal that open and royalty free standards are essential to its future.

But where are the open standards for open video?  Why don’t we already have them?

Hint:  business guru W. Edwards Deming once said: “If you control an industry’s standards, you control that industry lock, stock, and ledger”.

This bitter pill of insight points to the first thing you should know about open video and open standards:

1) Open Video Is Collateral Damage of the Digital TV Standards Wars

It’s not hard to figure out that if you could quietly bake your patents into a standard and then name your price after the standard becomes widely deployed, you could make a lot of money and wield a lot of control.

Great work if you can get it, and that’s pretty much the story of a set of international video and digital TV standards that got going in the 1990s, with MPEG the poster child of modern patent-pooled standards.

Of course this is a tale of big bucks.  Think $26 to $40 per TV, billions of dollars in royalties on billions of devices, vendor shoot-outs, litigation, dueling industry groups, back-room deals, claims of abuse, and consumer groups pushing for public disclosure of confidential patent licensing practices hidden behind claims they are “reasonable and nondiscriminatory”  — “RAND” in standards-speak.
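
To see how the per-device figure compounds, here is a back-of-the-envelope sketch in Python; the $26-to-$40 range is quoted above, while the one-billion-device count is purely an illustrative assumption of mine, not a figure from this post:

```python
# Back-of-the-envelope scale check of the quoted per-TV royalties.
# The device count is an illustrative assumption, not a source figure.
royalty_low, royalty_high = 26, 40   # USD per TV, as quoted above
devices = 1e9                        # assumed cumulative devices shipped

print(f"Implied royalties: ${royalty_low * devices / 1e9:.0f}B "
      f"to ${royalty_high * devices / 1e9:.0f}B")
# -> $26B to $40B across one billion devices
```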

So it is hardly surprising that the RAND licensing practices developed through the DTV experience have contributed little to nothing toward the royalty-free video technologies and standards now needed for broadband deployments, which today are essentially captured by proprietary solutions.

2) Standards Aren’t Just a “Techy Topic” — They’re a Policy Problem

In fact, scratch almost any network policy issue and you’re likely to find a standards issue lurking inside.  Indeed, America’s broadband plan needs a standards policy.

It turns out that country after country has a national “standards strategy”: the UK, France, Germany, Canada, and Korea, to name a few.  Some closely tie international standards advantage to IPR and patents, as in Japan (“Intellectual Property Strategy Headquarters decided the International Standardization Comprehensive Strategy, with the aim of enhancing the international competitiveness of Japanese industries and contributing to setting global rules”) and China (“[the] Trade Barrier Treaty [TBT] can be used under the mask of standardization, patents and intellectual-property rights to obtain most world trade advantages.”).

And those that don’t, like Taiwan, have vendors crying foul.

Even in the U.S., a prescient 1992 Congressional report warned:

“The United States has been fortunate to have a pluralistic, industry-led standards setting process that has served us well in the past. Whether it will continue to do so in the future in the face of bruising international economic competition is uncertain.”

So if you think standards are for geeks and not wonks, think again.  As a Toyo University professor recently put the blunt zen to it:

“Standardization activities are political negotiations and not a forum for assessing which technologies excel over others.”

3) Open Source Doesn’t Solve the Open Standards Problem

I don’t actually know anyone who is really confused or bent out of shape about the difference between “open source” and “open standard”, or who believes that one is a good substitute for the other.  They are of course different things (one’s a license, one’s a specification, and so on).

But if you are inclined to dig into this, check here or search the Web for “open source v. open standards” and you’ll find numerous nice explanations.

4) Don’t Confuse Patent Reform with Patent Licensing (They’re Different)

Another potential source of confusion is the distinction between patent reform — various proposals to make it more difficult to get a patent, to assure that patents are of appropriate quality, to tighten definitions of obviousness and so forth — and patent licensing — the rules and practices of patent pool licensing, disclosure, and IPR (Intellectual Property Rights) policies of standards groups.

Patents have been around for centuries, and so have patent pools, but the regulatory and policy linkages between the two are looser than one might think.  In fact, for a long time patent pools were rare and highly frowned upon by regulators (they weren’t even mentioned in the 1992 Congressional report on standards).  Many would trace the beginning of the “modern” patent pool era to the late 1990s, when the U.S. Department of Justice authorized the MPEG patent pool.

Pools and patents serve very different policy needs, raise different policy concerns, and by and large are even regulated by different entities.

So unless you are counting on a major scaling back of the patent system that somehow just makes patent issues go away (and few people are), it makes more sense to find a way, as many have, to achieve the desired business-model results within the patent system we have.

5) “RAND” Isn’t


So what does the term “reasonable and nondiscriminatory” actually mean?

In theory it’s the commitment to fair licensing required of patent holders in standards groups that — unlike the W3C, which defines HTML — are open to patents.

But in reality, since price isn’t set until after the standard comes out (sometimes years later), RAND ends up meaning whatever the patent holders want it to mean.

Studies of RAND licensing typically conclude:

“few SSOs [standard-setting organizations] define the term ‘reasonable and nondiscriminatory’ or have mechanisms to resolve disputes about its interpretation”

So Richard Stallman said it well:

“half of “RAND” is deceptive and the other half is prejudiced”

Still, sincere efforts have been made to give the term “reasonable and nondiscriminatory” a meaning in standards IPR policies; the American Bar Association’s Standards Development Patent Policy Manual, for example, is a good source.  But good luck if you hope to wade through lawyerly weighing of “multiple factors” to get any particular practice declared unreasonable or discriminatory.

6) Don’t Fall For FUD — There Is a Solution


Finally, it seems there is a never-ending version of Fear, Uncertainty and Doubt that goes something like “you can never really be sure that someone might have a patent so there is no way to ever be sure a standard is truly royalty free”.

To be blunt — this is nonsense; don’t believe it.  There are thousands of royalty-free standards in the world, and although the number of patent disclosures started to accelerate in the 1990s, the vast majority of standards have no particular IPR or patent issues to speak of.

And even in areas of particular patent thickets and patent controversies, standards organizations with a determined and specific royalty-free policy and process (Khronos and Web3D are a couple of examples) have successfully established their royalty-free credentials.  Sure it takes diligence, a “Freedom-to-Operate” analytical approach, proactive patent reading, time and determination.  Dirac is already making good progress down this path.

So get going Open Video-ers — let’s get some truly open, truly royalty-free standards initiatives going!


References

“The Internet is fundamentally based on the existence of open, non-proprietary standards.” Vint Cerf, “the father of the Internet”, cited in The Importance of Open Standards in Interoperability, OFE Onepage Brief No. 1 (31.10.08). Available at http://www.openforumeurope.org/library/onepage-briefs/ofe-open-standards-onepage-2008.pdf.

“It was the standardisation around HTML that allowed the web to take off. It was not only the fact that it is standard but the fact that it is open and royalty-free. If HTML had not been free, if it had been proprietary technology, then there would have been the business of actually selling HTML and the competing JTML, LTML, MTML products.”

Tim Berners-Lee, quoted in Standards and the Future of the Internet, Declaration 25th February 2008, at http://www.openforumeurope.org/press-room/press-releases/standards-and-the-future-of-the-internet/

Updating market information in this post on the release of the royalty-free OMS Video draft specification, here are data points about MPEG released at the MPEG 20th Year Anniversary Commemoration in Tokyo in November 2008.

Importantly, Lawrence A. Horn, CEO of license administrator MPEG LA, affirmed the:

“Freedom of Licensors and Licensees to develop competing products and standards”

(Note: the US Department of Justice required as much in its 1997 antitrust review of proposed MPEG patent licensing:

“We understand this to mean that licensees are free also to develop technological alternatives to the MPEG-2 compression standard.”)

Specific market info:

  • “~ 3.5 Billion MPEG-2 Devices
  • More than 1 million people working 40 hrs/week, 52 wks/year for 15 yrs (1994-2008)
  • ~ 40 Billion MPEG-2 Video (DVD) Discs
  • $2.5 Trillion in MPEG-2 Product Sales
  • In 2008 each of the world’s ~ 6.7 billion people will spend an average of $66.46 on MPEG-2 product”
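
As a quick arithmetic pass over these quoted figures, here is a small sketch; the derived ratios are my own back-of-the-envelope calculations, not numbers from the presentation:

```python
# Rough sanity checks of the quoted MPEG-2 figures; the derived
# ratios below are my own arithmetic, not numbers from the talk.
devices = 3.5e9          # "~ 3.5 Billion MPEG-2 Devices"
discs = 40e9             # "~ 40 Billion MPEG-2 Video (DVD) Discs"
sales = 2.5e12           # "$2.5 Trillion in MPEG-2 Product Sales"
people = 6.7e9           # "~ 6.7 billion people" (2008 world population)
per_capita_2008 = 66.46  # quoted average 2008 spend per person

print(f"Cumulative sales per device: ${sales / devices:,.0f}")
print(f"Discs per device: {discs / devices:.1f}")
print(f"Implied 2008-only sales: ${per_capita_2008 * people / 1e9:,.0f}B")
# -> roughly $714 per device, 11.4 discs per device, ~$445B in 2008
```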

Some interesting observations were made by Leonardo Chiariglione, the convenor of the MPEG committee:

  • “the MPEG-4 Visual licensing killed half of the standard
  • The “use fee” licensing model facilitated the widespread use of proprietary codecs
  • In the second half of the 1990s MPEG repeatedly invited ITU-T to collaborate on MPEG-4 Visual. The lack of collaboration produced the alternative H.263 Recommendation, similar – but not quite – to MPEG-4
  • 20 years after MPEG was born There are just too many video codecs…
    Compression technology has advanced
    The entry level to make video codecs is getting lower
    Many devices have to support many different codecs”

All the presentations of the commemoration are located here.

“Patent and legal issues” topped, at least numerically, the community goals developed at the recently-held Foundations of Open Media 2009 workshop, a write-up of which was just posted here.

Also noted in “Patents and the bright future of open media codecs”, the FOMS group has set aside 15% of its budget to support patent analysis.

For those who haven’t seen it, check the interesting article “Patent Status of MPEG-1, H.261 and MPEG-2” and the associated wiki here.  Comments on calculating expiration dates can be found in the responses to the article here.
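
For readers who want the gist of that expiration arithmetic, here is a minimal sketch of the basic US patent term rules (heavily simplified: it ignores term adjustments, extensions, terminal disclaimers, and lapsed maintenance fees, and the example dates are hypothetical):

```python
# A minimal sketch of US patent term arithmetic, heavily simplified:
# ignores term adjustments/extensions, terminal disclaimers, and
# unpaid maintenance fees.
from datetime import date

def us_expiration(filing: date, grant: date) -> date:
    """Estimate a US patent's expiration date.

    Applications filed on or after June 8, 1995 run 20 years from
    filing; earlier patents get the later of 17 years from grant or
    20 years from filing.
    """
    def add_years(d: date, n: int) -> date:
        try:
            return d.replace(year=d.year + n)
        except ValueError:          # Feb 29 landing on a non-leap year
            return d.replace(year=d.year + n, day=28)

    if filing >= date(1995, 6, 8):
        return add_years(filing, 20)
    return max(add_years(grant, 17), add_years(filing, 20))

# Example with hypothetical dates: filed May 1991, granted November 1993.
print(us_expiration(date(1991, 5, 1), date(1993, 11, 2)))  # 2011-05-01
```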

To be blunt:  America has the world’s most overpriced, antiquated, under-performing, anti-convergence digital TV system.  Yet another delay in the transition will create yet another round of inevitably necessary but paper-over-the-problems government subsidies, flowing to highly questionable interests of highly doubtful economic value, to enfranchise millions of consumers into the digital TV transition who should never have been disenfranchised in the first place.

All for technology that could be much less expensive, indeed free!

So how could this possibly be a good thing?

For one, it might spark a much-needed rethink of how we got here in the first place and a consideration of a better path forward.

Rethinking digital TV is no doubt a mind-bendingly complex topic fraught with peril, and one that will require hunting into the very nature of good government oversight.

A hunt for good government oversight of commerce — a very timely topic on many fronts, isn’t it?

So here are three start-the-hunt topics, hints to further analysis, really, I’d suggest to anyone looking into digital TV:

  • “vendor capture”
  • “misplaced Americanism”
  • “out-sourced justice”

“Vendor Capture”

One hint at the need for a broader rethink came in pledges last week by the new acting FCC chairman, Michael Copps, for more openness at the FCC, to

“make the FCC more transparent, open and useful to the stakeholders that we serve. And when I say stakeholders, I include not just the industries that we regulate but, more importantly, all citizens—and here let me once again underline the word ‘all.'”  (emphasis added)

Such a pledge to openness taps into a vein of “Reforming the FCC” projects sprouting up.

Of course, vendor capture by interested interests is nothing new for regulatory agencies or even standards groups that purport to represent broader interests.

Indeed, vendor capture is a timeless occupational hazard, even when challenged with the best of intentions.  But recognizing and controlling vendor capture seems to have been an unexercised muscle in the DTV regulatory community, particularly under the deregulation orthodoxies of recent decades.

One saga of vendor capture worth reexamining in light of downturn economics is the FCC’s decade-long (5,744 filings to date!), epically byzantine “separable security” proceeding for cable TV.  Like the DTV transition, this proceeding lives on in an odd “convergence-what-convergence?” bubble of regulating each industry sub-segment in isolation, and it has only recently begun to ask such basic questions as “whether there are technological solutions that are network agnostic and deployable across all MVPD [Multichannel Video Programming Distributor] platforms”.

“Misplaced Americanism”

But before a rush to blame “someone” devolves to easily-blamed “usual suspects” — like non-voting foreigners — take a look closer to home.

The CUT FATT group has sounded a very important alarm about the patent royalties fiasco that is dogging the US digital TV transition, and has connected the dots to the delay in the DTV transition:

“Delaying the DTV transition date is the first step to protecting consumers, but is only part of the remedy needed,” …“Large foreign corporations that bought U.S. patents are exploiting the transition to make outrageous profits off digital television sales to consumers.”

Now such a statement read superficially might be misconstrued as a call to “misplaced Americanism”  (big foreign corporations bad etc. etc.) — which would miss the point that it is American consumers who ultimately pay.

Rather, look back to a well-known anecdote from the “Grand Alliance” that was tasked in the mid-1990s with recommending the digital TV system to the FCC in the first place, whose decision making process was later neatly summarized in a news article after ensuing litigation:

“half of the voting members, MIT and Zenith, of the Grand Alliance were receiving monetary compensation from Dolby as a partial result of their vote for Dolby … Dolby’s selection came after it offered another of the four voting members of the Alliance’s Technical Oversight Group, Zenith Electronics Corp., a 25 percent discount on patent royalties in exchange for Zenith’s vote”

The scandal of relevance isn’t whether the decision making was rigged or not (though the jury in the subsequent contract dispute concluded Dolby indeed owed MIT for a secret agreement in which MIT fell on its sword in voting against its own technology in favor of Dolby) — it is the more subtle “vote American” context that some used to rationalize the situation overall and was captured in the same news article:

“Jae [the MIT representative on the Grand Alliance] was very pro-American,” … “He would naturally favor an American system over a foreign system.” …“Jae knew he supported American solutions, so that deal was consistent with that,” … “If it hadn’t been consistent, I don’t think Jae would have made the deal.”

An interesting justification indeed for one American institution (MIT) to vote for an American institution (Dolby) in a backroom deal that was to benefit both.

“Out-sourced Justice”

Another aspect of the curious MIT-Dolby dispute was the question of whether there was a conflict of interest at all, since the Grand Alliance was only recommending a joint decision to the FCC, not actually making the final decision.

“I can see how it would be perceived as a conflict of interest,” Gast said. But the Grand Alliance “wasn’t a decision-making body,” it was a group of companies joining together, at the request of the FCC, to make a unified proposal, she said.

Sounds reasonable enough, until one fast-forwards a few years and considers such statements in the CUT FATT filing to the FCC as:

“The FCC does not know what license terms ATSC patent holders demand or how much consumers ultimately pay for the DTV standard the FCC chose.”

If that sounds like a case of “out-sourced regulation” — consider it might be even more — a case of “out-sourced justice”.

Patent pool licensing practices, which the US Department of Justice also began to authorize in the same time frame after a long period of at best skeptical legitimacy, have begun to beget “To Join or Not to Join” trolling (“as many as one half to two-thirds of the eligible firms choose not to join a patent pool”), engendering a “myth of essentiality” practice of outsourcing patent pool evaluations to so-called “independent contractors” paid by the very same patent holders.

But more on that topic another day — in the meantime: happy hunting!