central planning: better for technical standards than for economies

Ryan responds to my last post. I appreciate the thoughtful attention, but I can’t say I agree with much of it.

> I understand what Tom’s saying, but I think he’s missing some key points. He wants to judge a technology in a “pure” world, outside the presence of the market conditions in which it will be sold and used, but you can’t do that. The utility of a technology is inextricably connected to the market conditions in which it will be sold and used. A Beta videotape might clearly be superior on most quality variables, but if a VHS tape is long enough to hold a full-length movie and Beta isn’t, well that’s important. Saying that length shouldn’t be as important as other variables is pointless; the market didn’t just want “Quality,” it wanted a certain quality.

I was careful not to mention Beta/VHS before because Ryan’s exactly right: length vs. quality is a reasonable decision to have to make, and one that the market is better positioned to make than I am.

But his larger point is wrong. Yes, the utility of a technology can only be judged in relation to the world at large — more precisely, the world as it currently exists. But the technology itself absolutely can be judged apart from that, in its own timeless, rarefied world. Every engineering discipline develops principles and design patterns by which work can be judged. Technical elegance is a real thing, and it really matters. Something designed well will be reusable, will be extensible, will be, as we sometimes say, “futureproof”. One small example of what that looks like is sketched below.
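
To make “futureproof” slightly less hand-wavy, here is a toy sketch of the kind of small design decision I mean. It is entirely invented for illustration (not from any real spec), though the type-length-value pattern it uses is a common one in file formats and network protocols: because every record carries its own length, a reader written today can skip record types invented tomorrow.

```c
#include <stdio.h>

/* Toy record format: each record is [type byte][length byte][payload].
   The length field is what makes the format extensible: unknown record
   types can be skipped instead of breaking the reader. */
static void read_stream(const unsigned char *buf, int total) {
    int pos = 0;
    while (pos + 2 <= total) {
        unsigned char type = buf[pos];
        unsigned char len  = buf[pos + 1];
        if (pos + 2 + len > total) break;  /* truncated record; stop */
        if (type == 1)
            printf("text record: %.*s\n", (int)len,
                   (const char *)(buf + pos + 2));
        else
            printf("unknown type %d: skipping %d bytes\n", type, len);
        pos += 2 + len;
    }
}

int main(void) {
    /* type 1 = text (known today); type 9 = added by a "future" writer */
    unsigned char stream[] = { 1, 5, 'h', 'e', 'l', 'l', 'o',
                               9, 2, 0, 0 };
    read_stream(stream, (int)sizeof stream);
    return 0;
}
```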

And the difference between a good and a bad design does not always come with tradeoffs. The components in a DAT cassette deck and a high-end analog cassette deck are pretty similar. DAT has been much less commercially successful. But it is the superior technology. There’s just no getting around that. In that case the cost of finding the better solution was time; in other cases it’s as simple as giving a damn.

> The fact is that in many cases it would be *better* if those factors were weighted differently.
>
> Emphasis mine. To this, economists will say, “Says who?” But the broader point is this. Tom believes that he can look at a technology and say it’s better or worse than another technology. Economists say he can’t, because Tom doesn’t actually know what the great mass of consumers wants.

I’ll bite: why do I feel so confident saying it would be better if the weighting were different? Well, consider the reason why MS-DOS was successful: it was selected for inclusion on IBM’s soon-to-be-blockbuster line of microcomputers. There were a number of similar technologies at the time, and it was up to IBM to choose one. Why MS-DOS? Well, depending on which version of the story you subscribe to, it was because Bill Gates’ main competitor was late to a meeting, or his wife wouldn’t sign an NDA, or, ironically, because the competing system was too successful in the marketplace and its owner — not realizing how valuable or market-changing the IBM deal would be — didn’t want to sign over his business for what IBM was offering.

This all made perfect sense at the time. But now, decades later, what has the result been? It would be wrong to assign all of Microsoft’s sins to MS-DOS, but the fact remains that the system was unequivocally technically inferior to other operating systems of the day, as judged by those aforementioned engineering principles. And those principles won out, as they almost always do: limitations of MS-DOS that may not have been immediately apparent became evident as technology advanced, and vast amounts of money and effort had to be expended on workarounds, fixes and kludges. In a word: externalities!

How much did that all cost? I have no idea, but it clearly dwarfs the amounts that were being weighed and judged against one another during the IBM-CP/M-MS-DOS deal. There’s every reason to believe that, had a superficially similar but fundamentally superior technology to MS-DOS been selected, we would all be better off.

Now of course I can’t say that definitively. Maybe the productivity gains of an all-Unix world would have been so great that we’d have accidentally opened an interdimensional portal to Dinosaur World by now and all been devoured. Or something. But I can say that the selection of an engineered product carries costs that may not be apparent for years — costs that non-experts are in no position to estimate until they occur. And even experts can generally only say “this was built well” or “this was built poorly”. But in many cases that’s enough, and it would save us all a lot of money if we listened to those pronouncements more carefully.

Oh, and one more thing: it’s probably worth noting that one of the greatest technical (and economic) triumphs in recent memory — TCP/IP and the suite of other protocols and standards that powers the internet — was designed by having a bunch of really smart engineers get together, execute an RFC process and then issue an ISO standard more or less by fiat. This is not to say that markets can’t help us arrive at good solutions — cable vs. DSL vs. FiOS is a good example of such a market working (or would be if the regulatory picture weren’t so complicated). But it ought to be acknowledged that markets are not always an optimal tool for making technical decisions. In fact there are now pseudo-centralized organizations that take responsibility for many of the technical standards that power our world, and nearly all engineers agree that we’re vastly better off for it.

UPDATE: Tim, who knows considerably more about this than I do, corrects my history and explains that TCP/IP did triumph through competition with other protocols. Fair enough! But I think the point stands: even when there is a “competition” stage in drafting a new technology spec — and this is an important function of the RFC process, so there ought to be — it’s still true that the selection of the winning ideas/specs is largely isolated from the consumer economy, and with good reason. In cases where the consumer economy inserts itself in the process — e.g. when Microsoft uses its market share to undercut the W3C — most people agree that the end result is detrimental. Openness and the winnowing of ideas are important, but when the decisions involve infrastructure the process needs to be restricted to those with some expertise.

8 Responses to “central planning: better for technical standards than for economies”

  1. Robert says:

    Ryan is wrong about what economists say. Economists don’t say that the standard selected by the market, whatever that means, will be better than available alternatives. In particular, Brian Arthur and Paul David don’t say that.
    This does not mean that technical standards should be selected by central planning. Sometimes a policy goal might be to defer locking into a standard.
    Is “non-ergodicity” in your vocabulary?

  2. Tom says:

    Nope! But, having skimmed this, I think I know what you’re getting at. Unfortunately “defer locking into a standard” is rarely an option in modern technical systems, which are built to be modular, interdependent and reliant on network effects. Even if modularity can be well-satisfied, the interface still has to be standardized (which, in fact, is much of what ISO bodies do — the actual implementation is rarely defined). A minimal sketch of that separation is below.
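
    To make that sketch concrete (everything below is invented for illustration; it is not taken from any actual ISO spec): the struct of function pointers plays the role of a published interface, and the two drivers play the role of competing vendor implementations.

    ```c
    #include <stdio.h>

    /* A sketch of "standardize the interface, not the implementation."
       All names are hypothetical. Callers depend only on the struct
       layout, never on a particular driver. */
    typedef struct {
        const char *vendor;
        int (*read_block)(int block_no, char *buf, int len);
    } storage_driver;

    /* Vendor A's implementation. */
    static int ramdisk_read(int block_no, char *buf, int len) {
        (void)block_no;
        for (int i = 0; i < len; i++) buf[i] = 'A';  /* stub data */
        return len;
    }

    /* Vendor B's implementation: different internals, same interface. */
    static int netdisk_read(int block_no, char *buf, int len) {
        (void)block_no;
        for (int i = 0; i < len; i++) buf[i] = 'B';  /* stub data */
        return len;
    }

    int main(void) {
        storage_driver drivers[] = {
            { "vendor A", ramdisk_read },
            { "vendor B", netdisk_read },
        };
        char buf[4];
        /* Implementations can be swapped freely; the interface is fixed. */
        for (int i = 0; i < 2; i++) {
            int n = drivers[i].read_block(0, buf, (int)sizeof buf);
            printf("%s returned %d bytes: %.4s\n", drivers[i].vendor, n, buf);
        }
        return 0;
    }
    ```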

  3. Mr. Noah says:

    Here’s a thought: Is a better technology always better?
    Supposing you’re right, and that UNIX is better than MS-DOS in all respects. If we had used UNIX instead of DOS from the start, would that have slowed the development of GUIs?
    Real-life examples of this do exist. For instance, Japan switched from CDs to mini-discs in the late 90s. MDs are functionally superior in every way to CDs. Result: America switched to MP3s first, which led to American dominance of the MP3 player industry.
    The decision to adopt a new technology must be forward-looking. Technologies have fixed costs of adoption that must be weighed against their current usefulness, and often we just don’t know when and with what probability a technology will become obsolete (or how complementary it’ll be with future technologies). So it’s easy to say in retrospect that adopting an “inferior” technology was a mistake, but that’s just hindsight: we only call it a mistake because we now know that the technology happened not to go obsolete in the meantime. (The back-of-the-envelope version is sketched below.)
    So, given equal adoption costs, there’s such a thing as a “better” technology, but in practice adoption costs are rarely equal.
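
    Roughly, the adoption math looks like this. Every number here is invented purely for illustration, and this is only one way to formalize the point:

    ```c
    #include <stdio.h>

    /* Back-of-the-envelope adoption math: adopt a technology with a
       one-time switching cost only if its expected discounted benefit,
       weighted by the chance it hasn't been obsoleted yet, exceeds
       that cost. All figures are hypothetical. */
    int main(void) {
        double fixed_cost     = 100.0;  /* one-time cost of adoption           */
        double annual_benefit = 30.0;   /* value per year while still relevant */
        double p_obsolete     = 0.25;   /* chance it is obsoleted in any year  */
        double discount       = 0.95;   /* per-year time preference            */

        double expected = 0.0, p_alive = 1.0, d = 1.0;
        for (int year = 1; year <= 10; year++) {
            p_alive *= 1.0 - p_obsolete;  /* still the standard this year? */
            d *= discount;
            expected += annual_benefit * p_alive * d;
        }
        printf("expected benefit over 10 years: %.1f vs. cost %.1f -> %s\n",
               expected, fixed_cost, expected > fixed_cost ? "adopt" : "wait");
        return 0;
    }
    ```

    With these made-up numbers the “better” technology fails the test (roughly 72 in expected benefit against a cost of 100), even though it would win easily if the obsolescence risk were zero.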

  4. Bob Munck says:

    Current PCs are four to seven orders of magnitude bigger and faster than they were when that decision was made. There’s no way a decision made then could turn out to be “right” in terms of today’s technology other than by pure chance. Moreover, the technical differences between CP/M and MS-DOS in 1981 are invisible in the current world. I can’t think of any way that Vista would be different if CP/M were down there at the bottom of its layers of abstraction.
    However, the cultural differences between the Unix and Microsoft worlds today are immense. If, somehow, the first IBM PCs had adopted the nascent Unix culture (which was very different then than it is now), that might have made a difference. On the other hand, Unix didn’t have most of the desirable cultural characteristics of the Linux world back then; there’s no reason to believe that it would have continued to develop them if it had been adopted by Microsoft and IBM.

  5. Mr. Chris says:

    - Unix at the time (circa 1980) probably would not have actually worked on an early IBM PC. It required more memory and storage than could have been cost-effectively provided.
    - Can you imagine making the IBM PC even more difficult for the average user to use? Unix on the IBM PC would have done that. The Unix command line is very powerful, but not nearly as easy to use or as consistent as MS-DOS. Windows was created to make the PC more Macintosh-like, so that it would be easier to use (than MS-DOS). Unix probably would have quickened the inevitable move to Windows.
    - MD is/was inferior in at least three ways: (1) music is stored on MiniDisc using ATRAC compression, a lossy compression technology similar to MP3, and CD audio can certainly sound better than MD if the CD is mastered properly; (2) the MiniDisc media has moving parts, and the outer shell is similar in construction and appearance to a 3.5″ floppy disc; (3) prerecorded music was not widely available for MiniDisc in the USA — only Sony Records supported the format, if I recall correctly.

  6. tomtom says:

    Bob Munck’s comment is far and away the most persuasive – culture is key.
    The Apple corporate culture was far more capable of producing good software than Microsoft’s. From 1985 to 1995 the Mac / PC OS quality gulf was shockingly wide.
    However, Apple charged 2X – 3X for their computers, and since the hardware architecture was closed they could get away with it. In the late ’80s a good Mac was $3000! That was a lot of money. At the top management level they failed to grasp the benefits of being the main player in an OS, and they nearly went under as a result. By the time they brought out some competitive hardware they were at 10% market share and dropping. Too late. When I went to work at Boeing in the late ’80s half their PCs were Macs. By 1995 they were gone.
    Apple also instructs us that the architecture of the OS is less critical than Tom suggests. In the ’90s Apple switched from 68XXX to RISC processors without a hitch. More recently they rebuilt their entire system so it sits on Unix. Finally they made the whole thing work on Intel CPUs.
    DOS was crappy, but IBM’s size gave their PC a huge boost, and the fact that IBM did not own DOS gave MS the ability to support other hardware. The hardware was ‘open-source’ although the software was closed, and that was enough to take over the market and relegate Apple to niche status.

  7. Kaleberg says:

    One day in the late 1970s I was walking down 5th Avenue in NYC with a friend. One of us happened to mention the IBM antitrust case. This set off a wild-eyed, middle-aged guy in a fancy suit, who began ranting about having been a lawyer on the IBM case, one of dozens of lawyers who each had dozens of lawyers working for them, and about the case being the biggest legal thing he had ever experienced. (Walking down the street in NYC is like being in a chat group.)
    IBM decided to chuck the whole problem of owning the operating system, and Bill Gates was the trained seal who realized how important it was to jump the highest. The competition was not all that much better on technical grounds. CP/M was good, but basically used the same technology. Other disk-based systems were not much better.
    UNIX had been around for years, but the newer versions relied on some level of memory protection and memory-mapping hardware. (Yes, the original UNIX did not require this sort of stuff, but most of its reliability, flexibility and so on flowed from it.) The Intel 8088 processor was a breakthrough and just fast enough. No one was going to add a memory management chip that would drive up the price of their basic box. The fact that such a chip would cost perhaps $100 and save 40 work hours a year was irrelevant; most companies keep separate capital and labor budgets to discourage this kind of efficiency. (The payback arithmetic is spelled out below.)
    Basically, moving from multichip minicomputers to single-chip microcomputers meant taking a step backwards in operating system technology. It was a step that would take nearly 20 years to reverse. In fact, a lot of good technology from the 60s and 70s was lost in the microcomputer revolution. A lot of these ideas are even patentable, having become novel and non-obvious over the years.
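
    To spell out the payback arithmetic from above: the $100 chip cost and 40 saved hours per year are the figures already given; the $20/hour loaded labor rate is an assumption added for illustration.

    ```c
    #include <stdio.h>

    /* The $100-chip arithmetic, spelled out. The $20/hour labor rate is
       an assumption; the other figures come from the text above. */
    int main(void) {
        double chip_cost   = 100.0;  /* added hardware cost per machine        */
        double hours_saved = 40.0;   /* work hours saved per machine, per year */
        double labor_rate  = 20.0;   /* assumed cost of an hour of labor, $    */

        double annual_savings = hours_saved * labor_rate;           /* $800 */
        double payback_months = chip_cost / annual_savings * 12.0;  /* ~1.5 */
        printf("annual savings: $%.0f, payback: %.1f months\n",
               annual_savings, payback_months);
        return 0;
    }
    ```

    At those numbers the chip pays for itself in about six weeks, which is exactly the point about separate capital and labor budgets: nobody was structurally positioned to run this calculation.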

  8. Tom says:

    Good points all around. You’re right that I probably went too far in implying that DOS doomed us all, when in fact it’s not clear that CP/M would have served us all that much better. My main point was just that the criteria by which the decision was reached had little to do with the technology’s quality.
    And although full-blown Unix may have been inappropriate, it still seems clear to me that DOS made some big mistakes, from its pidgin-Unix command set to its memory management to aspects of its filesystem.
    Mr. Noah: I do disagree with your portrayal of the situation. Part of my point is that there *isn’t* always a tradeoff between a technology’s quality and its adoption costs — particularly in the case of software. It’s not difficult to come up with post-hoc justifications for why the way things turned out is the optimal solution; “case-sensitive command lines would have imposed huge costs on naive users!”, that sort of thing. But in this case I don’t find many of those justifications very convincing.
