Commentary: Standards Are Meaningless (Until They Mean Something)

Two early experiences shaped my view on standards and industry coalitions. Since I’ve been so publicly down on WebM, HTML5, and DASH, I thought I would share them.

The first experience was in the mid-1990s. I was running a small company that sold PC fax boards, which let computers send faxes to fax machines, a much-desired capability in those fax-crazy days. My company sold several cards under the JT Fax brand, and we were the market leader (in a very small market).

Soon, Intel released a competitor, but it didn't just launch a product. Intel announced a new specification, called the Communicating Application Specification (CAS), that allowed word processors and other applications to communicate seamlessly with its product. If memory serves, 17 companies (including WordPerfect, the dominant word processor at the time) joined Intel on the podium that day and in the associated press releases. It was kind of reminiscent of … say, WebM, where Google lined up multiple companies promising products, many (if not most) of which were never delivered.

Meanwhile, Intel overpriced its product, and it sold poorly. The highly touted, game-changing spec never went anywhere and died a quiet death several years later. The lesson? Industry coalitions such as WebM (or HTML5 or DASH, for that matter) are meaningless until they actually mean something. This usually occurs several years after the announcement, if it happens at all.

The second experience occurred years later. I was VP of marketing with Iterated Systems, Inc., which sold fractal-based still image and video compression. Our peak came when (as the story goes) Mr. Bill Gates himself walked into our booth at COMDEX, early one morning, when only a very green marketing assistant was there setting up. He spoke with her briefly, and discussions ensued. Several months later, Microsoft Encarta shipped with fractal still image compression decoding all those lovely images.

I left Iterated about a year later and set up Doceo. My first product was a CD-ROM that benchmarked the CD-ROM video codecs of the day: Cinepak, Indeo, and a funky, poorly performing codec called MPEG-1. On the horribly underpowered computers of the time, Indeo and Cinepak delivered 30 fps 320×240 playback, while MPEG-1 managed around 8–10 fps.

Three companies ended up offering MPEG-1 software decoders, and all got bought for between $40 million and $60 million, primarily so their MPEG-1 decode technology could get integrated into CPUs and graphics chips. As computers got faster, JPEG became fast enough, and since you couldn’t beat the price (free), fractal still image compression went away. Iterated’s video technology flared briefly as a plug-in for RealVideo, but it ultimately died as well. The lesson is that in the video compression space, standards usually win. This is why I’m absolutely certain HEVC, and not WebM or VP9, will supplant H.264. I just don’t think it will happen anytime soon.

What about HTML5 and DASH? Ignore the press releases and ballyhoo. Having been on the product side, I know how hard it is to actually finish a product, and that’s incrementally more challenging when you have multiple independent companies and ecosystems involved, as you do with HTML5 and DASH. I’m mindful of the lesson from Geoffrey A. Moore’s foundational book Crossing the Chasm: Marketing and Selling Disruptive Products to Mainstream Customers. The most apt summary I’ve seen is from Norway’s Alper Celik, who said, “While the early adopter would purchase a product that could deliver an 80% solution (seeing it as only 20% more to go), the pragmatist takes the position of buying when it is 100% complete (a ‘whole product’ as Moore puts it) and can be referenced as working within their industry.”

I’m a pragmatist and prefer to wait for the whole product. In the case of DASH and HTML5, they may come; in fact, they probably will come. But they’re certainly not here yet — or anywhere close.

About Jan Ozer

I help companies train new technical hires in streaming media-related positions; I also help companies optimize their codec selections and encoding stacks and evaluate new encoders and codecs. I am a contributing editor to Streaming Media Magazine, writing about codecs and encoding tools. I have written multiple authoritative books on video encoding, including Video Encoding by the Numbers: Eliminate the Guesswork from your Streaming Video (https://amzn.to/3kV6R1j) and Learn to Produce Video with FFmpeg: In Thirty Minutes or Less (https://amzn.to/3ZJih7e). I have multiple courses relating to streaming media production, all available at https://bit.ly/slc_courses. I currently work at www.netint.com as a Senior Director in Marketing.
