Intuit and Stanford recently asked me to give talks on computer platforms and what makes them successful. (By platforms I mean software with APIs that third-party developers can write apps on top of; Windows and Macintosh are both platforms, as is Java.) Platforms are a hot topic in Silicon Valley these days. The success of the iPhone app store in mobile, and of Facebook on the web, has forcefully reminded people that you can grow a tech business more quickly if you get third-party developers to help you. Almost every tech company I work with is trying to expose some sort of API or platform offering in its products.
To explain how software platforms work today, I thought it'd be good to start with their history. But I wasn't sure about many of the details myself, so I ended up doing some research. The information was surprisingly hard to find, and also pretty controversial -- for every person who claims to be the first to have done something in computing, there's someone else who begs to differ. I did my best to sort through all the claims. The picture that developed makes an interesting story, but also has some very important lessons about where the industry might go next.
Fair warning: this is a long post. But I hope you'll feel that the destination is worth the trip.
Here's what I found:
Hardware memory, software amnesia
The computer industry is often criticized for its failure to remember its own history. Supposedly we're so focused on the new thing that we forget what's come before.
In reality, though, we're actually fairly good at remembering a lot of our hardware history (for example, Apple fans are celebrating the 25th anniversary of the Macintosh this year). There's passionate controversy over what was the first computer -- was it Konrad Zuse's Z1 (link), Tommy Flowers' Colossus (link), etc. The answer depends in part on your definition of the word "computer." But it's a well-documented disagreement, and you can find a lot of information about it online, including a cool timeline at the Computer History Museum (link).
The machine most commonly cited as the first fully programmable general-purpose electronic computer was ENIAC, the Electronic Numerical Integrator And Computer. It was completed in 1946 (link).
Here's ENIAC (well, part of it, anyway)
You can find lots of histories of ENIAC online (link). There are multiple simulators of it on the web (link), and the engineering school at the University of Pennsylvania even has an ENIAC museum online (link).
But when it comes to software, our memories are much hazier. For example, I doubt there will be a 25th anniversary celebration in 2010 for Aldus PageMaker, the program that did more than any other to make Macintosh successful. And May 12, 2009 -- about a day after I post this article -- will be the 30th anniversary of the introduction of VisiCalc, the first spreadsheet program. Anyone planning a parade?
Today we take it for granted that you can use a computer for a variety of business or personal tasks, but it didn't always work that way. ENIAC and Colossus were government-funded tools for solving military and scientific problems. The US Army funded ENIAC, and in addition to calculating artillery tables, it was also used for tasks like weather prediction, wind tunnel design, and atomic energy calculations.
These nice ladies are programming ENIAC, by moving cables around.
How did we end up using computers for other purposes? The UPenn site says only, "it is recalled that no electronic computers were being applied to commercial problems until about 1951."
Yeah, "it is recalled." This is where I had to start digging. Once again there are disputes (link), but you can make a very good case that business computing started in the UK, and it involved something called a Swiss roll.
The first business computer
I had never heard of Joseph Lyons & Company, but in the 1950s they ran a chain of tea shops in the UK. I have to pause here for a second and explain what the term "tea shop" means. It's not a shop where you can buy bags of tea (which is what I assumed). Instead, it is what Americans call a coffee shop -- a fixed-menu restaurant that people would come to when they wanted to have a quick meal, snack, or meeting. The closest equivalent in the US these days is probably Denny's.
In the 1950s, Lyons had the biggest network of tea shops in the UK. It employed 30,000 people and served 150 million meals a year. The company sold 36 miles of Swiss roll a day (link).
(In case you're wondering, Swiss roll is a flat sponge cake rolled around a filling. Americans call it jelly roll. In India, it's called jam roll. In Sweden, rulltårta. In Japan, "roll cake." But in Spain, for some reason it's called brazo de gitano (gypsy's arm). Don't ask me why. [link] )
A Swiss roll made and photographed by Musical Linguist on 25 June 2006
Like every other company of its day, Lyons ran everything on paper -- tallying 150 million receipts, calculating payroll, managing taxes, and even figuring out how many miles of Swiss roll to make for tomorrow's customers. All of it was done by hand with adding machines. It was an incredibly expensive and error-prone way of running a business, but it was the best anyone could do at the time.
When the people at Lyons first heard about these new computer thingies, they wanted one immediately to help run the business. But there wasn't any way to buy one. So they donated $5,000 (about $50k today) to Cambridge University to create a modified version of a computer that Cambridge had been working on.
The result was called LEO (Lyons Electronic Office), and when it started regular operations on November 17, 1951, it was the world's first business computer. It occupied 5,000 square feet of floor space (about 500 square meters), and its 4k memory unit weighed half a ton because it was full of mercury. LEO's lead programmer was David Caminer, who is generally credited as either the world's first business software programmer or the first systems analyst. LEO's software let it handle -- guess what -- the same sorts of tasks we handle on business computers today: payroll, inventory, financials, and so on. It cut the time to calculate one employee's wages from eight minutes to 1.5 seconds (link).
David Caminer
Pause for a moment and think about the courage and vision it took for Lyons -- a catering company -- to build its own computer. There was no guarantee of success, and indeed the project took two years, with plenty of setbacks along the way.
But LEO was eventually a big success, and Lyons eventually spun it out as a separate computing subsidiary. Caminer went on to have a distinguished career in computing. He died in 2008, unfortunately, so we just missed our opportunity to say thanks to him. If you want to read more about LEO, Caminer co-wrote a book about it (link). Naturally, it's out of print, and the cheapest used copy when I looked it up was $75.
What is software, anyway?
One interesting aspect of LEO is that although Caminer and his team wrote software for it, that software was not available separately from the computer. That's the way the computing industry worked throughout the 1950s. For example, if you bought an IBM computer there was a set of standard IBM programs that ran on it.
In fact, the term "software" wasn't even in general use until John Tukey popularized it in 1958, more than ten years after ENIAC began operation (link). He wrote:
Today the "software" comprising the carefully planned interpretive routines, compilers, and other aspects of automative programming are at least as important to the modern electronic calculator as its "hardware" of tubes, transistors, wires, tapes and the like.
So the whole idea of software as a separate entity, a concept that we take for granted today, did not exist at the beginning of computing. The concept of making computers reprogrammable came along quite early, but it took a couple of decades for software to fully separate itself from hardware as its own distinct discipline.
John Tukey
(Naturally, there's some dispute about whether Tukey was the first to use the term "software." You can read about it here.)
Tukey was an interesting guy. He also created the term "bit," helped design the U-2 spy plane, and did a lot of other fascinating things (link).
If you want to read more about the history of software technologies, there's an essay here. And the best (and just about only) book on the history of the software industry is here.
Software as a business
Once we got the idea of software into our heads as a separate discipline, the next milestone in platform history was the creation of the first independent computer program, the first one you could buy separately from the hardware. As far as I can tell, that idea didn't just spring into being all at once; it emerged as a slow-motion avalanche over a period of 15 years.
Computer Usage Corporation, founded in 1955, is often cited as the first computer software company. It focused on custom programming services (link). Another very early custom programming company was CEIR, founded in 1954 (link). After them, a number of other custom programming firms sprang up. Sometime between 1962 and 1965, California Analysis Center, Inc. started selling a proprietary version of the Simscript programming language as a standalone product (the Computer History Museum says it was 1962 here, but CACI's own website says 1965 here). The 1962 date is the earliest I can find for any sort of independent software product. To my amazement, CACI is still selling Simscript today (link).
Several other programming languages and compilers came to market in the early 1960s, but there's disagreement over how much they actually sold, or whether they were really managed as independent products (link). A file management program called Mark IV, by Informatics, is credited as the first independent software product to generate more than a million dollars in revenue. It was published in 1967 (link). That year also saw the first publication of the International Computer Programs Quarterly, the first commercial software catalog, which helped small software companies get to market at low cost (link). Think of it as a paper version of the iPhone App Store.
But if you want to find the first snowball that started the commercial software avalanche, I think it was tossed in 1964, when a contract programming company called Applied Data Research was jerked around on a business deal by RCA.
The first commercial software product
In the mid-1960s, a cottage industry of contract programming firms did custom software development. When a new mainframe was in the works, its manufacturer would sometimes hire these firms to create software to offer with it. Computer owners could also hire those development houses to write custom software for them. The idea of off-the-shelf software didn't exist; you got software for free with your computer, had it written for you, or developed it yourself.
RCA, which at the time was a promising mainframe company, approached ADR asking them to create a program to draw flow charts of computer programs (the flow charts were used for documentation and debugging). That may not sound like a big deal now, but in the early days of computing the industry didn't have the sort of automated debugging tools it has today. A flowchart was very useful for maintaining and documenting a custom software program after the project was finished.
So ADR created a proposal and submitted it to RCA. Fortunately for the computer industry, RCA turned it down, as did every other mainframe company. But ADR believed in its concept, so it decided on its own to develop the product anyway. It spent over $5,000 (about $35k in today's money) and half a man-year on the project.
But RCA was not impressed. Once again they said no.
Now ADR had a sunk cost. In business school they teach you to walk away from those, but in real life companies hate to admit they made a mistake. So ADR decided to try marketing the software on its own. They named it Autoflow, and wrote a letter to all 100 RCA mainframe owners offering them the program for $2,400 on a three-year lease. It was three milestones in one: the first commercial software program, the first subscription software, and the first junk mail urging you to buy a software program.
ADR sold two licenses.
That may not sound like much, but somebody at ADR did the math: if we sold two copies to 100 RCA customers, what would happen if we offered our software to IBM's much larger installed base? So ADR ported Autoflow to IBM mainframes. In the second half of the 1960s it sold more than a thousand licenses of Autoflow, and created a portfolio of other independent software programs for IBM systems.
IBM was not pleased. Nobody was supposed to mess with the IBM customer base; that might weaken IBM's control over its customers. The company created its own flow charting software, which it gave away for free to its customers, and started to copy ADR's other programs as well. This became a huge competitive problem for ADR -- even if its software worked better than IBM's, it was hard to compete with free. IBM was also able to freeze the market for ADR by promising that it would in the future offer a free version of something ADR was currently selling. Customers would delay ADR purchases until they could evaluate the IBM product.
ADR and other fledgling software companies complained to the US government. In 1969, the Justice Department, ADR, and several others filed antitrust suits against IBM. ADR collected $2 million in penalties, and IBM agreed to stop bundling free software with its computers.
And thus the independent software industry was born.
Martin Goetz (above) was the product manager of Autoflow. I wrote to him and asked for his take on what was the first software product. Here's his reply:
Autoflow was recognized as the first software product to be commercially marketed. Starting in 1964, ADR licensed its products nationally and through ads in all the major computer publications, started investing in the development of other products and became known as a software products company.
I think that's the right way to look at it: Autoflow was the first software product to be commercially marketed, which is why I call it the snowball that started the avalanche. Informatics' Mark IV also played an important role because its financial success validated the market -- reportedly it was the top-selling software product for the next 15 years (link).
Goetz says Mike Guzik was the lead programmer on Autoflow (link), and he cites ADR President Dick Jones as a strong supporter of the idea (link). I think we should credit Goetz and Guzik as the creators of the first commercial software application, although neither of them has an entry in Wikipedia.
Incidentally, Goetz also holds the first software patent:
Computerworld, June 1968
That has to be one of the most visionary headlines in the history of the computer press: "Full Implications Are Not Yet Known." Here we are 41 years later, and it's still accurate.
Goetz was named the "Father of Third-Party Software" by mainframezone.com (link) and there's a very interesting interview with him here. You can find a much longer interview here and his memoirs are here.
Advocates of open source software will probably view Goetz as a bad guy, since he helped make software a for-profit industry. But he has some pretty strong opinions about the poor quality and slow innovation that characterized software when it was only free. In particular, he says that a completely free software industry was not responsive to the needs of users (link).
An amusing anecdote complaining about Goetz, apparently written by a former ADR employee, is here. I can't verify the anecdote, but if nothing else it shows that ADR was also a pioneer in the practice of engineers making catty comments about product managers.
(I should add that there are some different interpretations of the effect of IBM's unbundling decision. One is in a very interesting interview with the creator of the ICP catalog here.)
The rise of the third party application platform
The next evolutionary step was for computer companies to see their products as development platforms -- for them to actively encourage software developers rather than viewing them as a nuisance. I haven't been able to figure out when in the 1970s this change in perspective happened (please post a comment if you know the history). It may have happened in the era of minicomputers, or it may have been a PC thing. Dan Bricklin and Bob Frankston's VisiCalc, the world's first spreadsheet program, definitely played a role when it came to market for the Apple II in 1979. It was so revolutionary that reviewers at the time didn't know how to describe it; they just said it was a way to make the computer do things you want it to do, without writing your own program. VisiCalc established the idea of the "killer app," a software program so popular that it drove demand for the underlying hardware.
"Visicalc could some day become the software tail that wags (and sells) the personal computer dog."
--Ben Rosen, co-founder of Compaq, reviewing VisiCalc when he was still an analyst at Morgan Stanley. Nice call, Ben. (Link)
By the early 1980s, software developers were being actively courted by computer manufacturers. Apple had a developer recruitment team for the Macintosh, and apparently coined the term "software evangelism." That's where Guy Kawasaki cut his eyeteeth, although he wasn't the first evangelist. As he puts it:
"Mike Boich started evangelism and hired me, and Alain Rossman worked with me as a software evangelist. Essentially, Mike started evangelism, Alain did the work, and I took the credit." (Link)
I happen to know that Guy did a bit of the work too.
The other critical change in the 1980s was the separation of the OS from the underlying hardware. Most of the new PC software platforms had been tied to hardware, just like traditional computers. For example, you had to buy a Macintosh in order to run Mac software, or an Amiga in order to use Amiga apps. But then IBM created the PC, and through a series of business blunders allowed Microsoft to separately sell the DOS operating system used on its hardware. IBM's brand and marketing power established the PC as a standard, but the company enabled Microsoft and Intel to create a "clone" hardware market, which eventually drove IBM itself out of the PC business.
So now there were three layers in the industry -- the application was independent of the OS, and the leading OS was independent of the hardware.
The network strikes back
That's where the situation sat until the late 1990s, when Java and web browsers threatened to create another layer in the architecture by separating software applications from the OS. The theory was that instead of writing programs that depended on Windows, programmers could create code that worked on Java, or on the Netscape browser.
Microsoft fought back very aggressively, killing Netscape by giving away Internet Explorer, and crippling Java on the PC. Looking back, it was an impressive use of business muscle, worthy of Microsoft's tutor IBM.
But it was also a Pyrrhic victory. Microsoft's actions in the 1990s forced software innovation completely off the PC platform, because investors were afraid that new software apps would just get cannibalized by Microsoft. Instead, software innovation moved onto the web, where Microsoft had virtually no control. That's one of several reasons why the next generation of software is being written as web apps.
And that's where we are today.
Where we go next
As I said at the start of the post, I think all of this history is fun in its own right. I also wanted to take this opportunity to thank some of the people who built the tech industry into the fun place it is today.
But understanding computing history is also very important because, if you look across the sweep of it from the 1940s to today, it's much easier to see where we might go next.
Here's what I think that long perspective shows us: The history of software is a history of disaggregation. First the application software gets separated from the hardware, then the OS gets separated from the hardware, and so on.
I think disaggregation is a natural outcome of the maturation of the industry, because multiple companies can move faster than a single one. At the start you need everything coordinated together to make sure the whole thing will work. But over time, no single company can pursue all of the innovation possibilities, so you get a backlog of potential creativity that can happen only if control over the architecture is broken into pieces.
For example, most of the interesting innovation in applications happened only after they were separated from the hardware.
But as the industry continues to grow, each of the pieces becomes its own stodgy monolith, and eventually another subdivision happens.
The fastest growth and the easiest innovation have generally happened at the leading edge of disaggregation, because each change creates new business opportunities.
That doesn't mean that old school companies are dead. IBM still sells mainframes, and Apple still makes PCs bundled with an OS. But to succeed in an old paradigm you have to execute extremely well, and it's much harder to grow explosively. The easiest progress is made at the leading edge.
A common thread among the people working at the leading edge of disaggregation is their excitement as they recognize the opportunities created by the change:
"There was a tremendous euphoria of success. You couldn't lose. All you needed was a group of highly technical people who could create a software product and that was it. And to some degree there was some truth to that. Because you didn't have to be good sales people. You didn't have to worry about the competition. For years I used the aphorism that we were like little boys on the beach each with our sand piles. There was plenty of sand to put in our buckets. We didn't have to edge out the other little boy to get all the sand we needed. We were limited by the size of our pail and our little shovels but not by the amount of the beach that was there or the fact that there was another little boy there with his pail."
That's Walter Bauer, cofounder of Informatics, talking about the birth of the independent software industry in the 1960s (link). But you could find similar sentiments from the people who built the first computers, or the first Mac programmers, or the first web app developers. The leading edge of disaggregation is where the action is; it's where the fun happens.
So, if you're looking to succeed in the software industry, it's extremely important to figure out what's going to get disaggregated next. Which brings us to the point of this article.
Say hello to the metaplatform
Sun's rallying cry in the 1990s was "the network is the computer" (link). It was an excellent insight that pointed to the emerging importance of the Internet, but most of the industry misread what it meant. We looked at the architecture of the thing we knew best, the PC, and tried to map it directly to the network. So servers would replace the PC hardware, and software on those servers would replace Windows. The PC itself would be reduced to a thin client: a screen connected to a wire.
What we expected
But instead of a new OS on the network replacing the OS on the PC, what we're seeing is the breakdown of the OS into component parts that live everywhere, on both the client and the server.
In other words, the OS is the next thing that gets disaggregated.
What's actually happening
People have been talking about elements of this change for years, but like the proverbial blind men feeling bits of the elephant, we've talked about individual pieces of it, with each of us assuming that the piece in front of us was the most important. So people producing software layers like Java and Flash say that they are separating the APIs on the device from the underlying OS. And the advocates of cloud computing say they're creating a software services architecture that runs on servers. But in reality we're doing both of those things, and a lot more. The OS is dissolving into a soup of resources distributed across both the network and the local device, with the application in the middle calling on both as appropriate. We need to get off the idea that the network or the client will be dominant; they're both supporting elements in something larger.
You can see this process operating in the evolution of web applications. The first web app companies tried to make applications that ran entirely as thin clients, but they didn't work particularly well -- they were slow, and their user interfaces were too limited. Web apps took off only when they adopted an approach in which the platform was split between the PC and the network: the user interface ran locally in the browser, while back-end calculation and data storage were done on the network.
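To make that split concrete, here's a minimal TypeScript sketch of the hybrid pattern. The endpoint name (`/api/recalculate`) and the data shapes are hypothetical, invented for illustration; the point is simply that the interface work happens locally while the heavy calculation and storage happen on a server.

```typescript
// Hypothetical hybrid web app: the UI runs in the browser,
// while aggregation and storage live on the server.

interface Expense {
  description: string;
  amount: number;
}

interface ReportTotals {
  total: number;
  byCategory: Record<string, number>;
}

// Runs locally: instant feedback, no network round trip needed.
function renderRows(expenses: Expense[]): void {
  const list = document.getElementById("expense-list");
  if (!list) return;
  list.innerHTML = expenses
    .map((e) => `<li>${e.description}: $${e.amount.toFixed(2)}</li>`)
    .join("");
}

// Runs on the network: the server does the heavy lifting and keeps the data.
async function recalculate(expenses: Expense[]): Promise<ReportTotals> {
  const response = await fetch("/api/recalculate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ expenses }),
  });
  if (!response.ok) {
    throw new Error(`Server error: ${response.status}`);
  }
  return response.json() as Promise<ReportTotals>;
}
```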
Mobile computing reinforces the need for this sort of hybrid architecture. Wireless broadband has important limitations that make pure thin-client computing extremely problematic: wireless networks are relatively slow compared to wired networks, latency is high, coverage is inconsistent, heavy communication drains device batteries rapidly, bandwidth is expensive, and, most importantly, total wireless bandwidth is limited. The most effective mobile applications are, and will continue to be, hybrids of local and network resources, like RIM's e-mail solution.
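Here's a small sketch, again in TypeScript, of how that hybrid behaves on a flaky wireless connection: serve from a local cache immediately, refresh from the network when coverage allows, and queue outgoing changes rather than blocking on a slow or absent link. The `/api/messages` endpoint is a made-up example, not any particular product's API.

```typescript
// Hypothetical mobile-style hybrid: local cache first, network when available.

const pendingWrites: string[] = [];

async function loadMessages(): Promise<string[]> {
  const cached = localStorage.getItem("messages");
  const local: string[] = cached ? JSON.parse(cached) : [];

  try {
    const response = await fetch("/api/messages"); // hypothetical endpoint
    if (response.ok) {
      const fresh: string[] = await response.json();
      localStorage.setItem("messages", JSON.stringify(fresh));
      return fresh;
    }
  } catch {
    // No coverage or high latency: fall back to what we have locally.
  }
  return local;
}

function sendMessage(text: string): void {
  pendingWrites.push(text); // Queue instead of blocking the UI.
  void flushPending();
}

async function flushPending(): Promise<void> {
  while (pendingWrites.length > 0 && navigator.onLine) {
    try {
      await fetch("/api/messages", { method: "POST", body: pendingWrites[0] });
      pendingWrites.shift();
    } catch {
      break; // Try again later; don't drain the battery retrying in a loop.
    }
  }
}
```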
Companies entering the mobile market often ask me which mobile operating systems are going to win long term. I think that's the wrong question. What we're seeing is the gradual evolution of a super-OS that includes both the network and the device.
Like software developers before the word "software" was invented, we don't have a name for this new thing, and so we have trouble talking about it. It's not just the Network or the Cloud, because those terms are usually understood not to include the software on the client computer. And it's certainly not just the local APIs on the client device.
I'm calling it the "metaplatform" because it subsumes all other platforms. No single company controls the metaplatform. Google obviously contributes a lot to it, as does Amazon Web Services, as does Microsoft. But they're only fragments of the picture. There are thousands of other contributors to the metaplatform, in areas ranging from mapping to graphics to identity.
There's still a lot of work that needs to be done on the metaplatform, especially in the mobile space. But already it's evolving faster than any single company could move it, because the work is divided across so many companies, and because there's competition driving innovation at almost every point in the architecture. Although the metaplatform isn't necessarily elegant (because it's poorly coordinated), what it lacks in beauty it more than makes up for in rate of change and versatility.
New opportunities
The metaplatform helps to solve some computing problems, but creates others. For example, a recurring problem for software in the OS era has been compatibility. Old data files, even when perfectly preserved, can become unreadable if the hardware and software that created them is no longer available. A lot of software is very dependent not just on the hardware, but on the particular version of the OS it's running on. (If you want to see that effect in action, try running a ten-year-old Windows game on a new PC. It may work, it may refuse to run at all -- or it may freeze right when you're about to defeat the boss bad guy.)
The metaplatform is helping to resolve some compatibility problems, through emulators available online. But more importantly, web apps on a PC are less vulnerable to PC-style compatibility breakdowns because PC browsers are relatively standardized, and much of the OS code the web app relies on lives on the same server as the app itself, so they are less likely to get out of sync.
But metaplatform-based software is uniquely vulnerable to a new set of problems. When a user's data is stored on a web app company's server 3,000 miles away, what happens if that company goes out of business or just decides to stop maintaining the product?
Another problem experienced by any website using plug-ins is component breakage. If you've incorporated external web services into your site, the site will break if any of those services stops working. This can happen without warning. On my own weblog, the load time for the site suddenly became ridiculously long. It took me weeks to realize that a user-tracking service I'd once signed up for had gone out of business without telling anyone. My site stopped loading while it tried helplessly to connect to a tracking site that no longer existed.
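For what it's worth, here's a defensive pattern (a TypeScript sketch, with a placeholder URL rather than any real tracking service) that would have saved my weblog: load the third-party script asynchronously and give up after a timeout, so a vanished provider can't hang the page.

```typescript
// Load an external service without letting it block or hang the page.
function loadExternalScript(url: string, timeoutMs = 5000): Promise<void> {
  return new Promise((resolve, reject) => {
    const script = document.createElement("script");
    script.src = url;
    script.async = true; // Don't block page rendering while it loads.

    const timer = window.setTimeout(() => {
      script.remove();
      reject(new Error(`Gave up waiting for ${url}`));
    }, timeoutMs);

    script.onload = () => {
      window.clearTimeout(timer);
      resolve();
    };
    script.onerror = () => {
      window.clearTimeout(timer);
      reject(new Error(`Failed to load ${url}`));
    };

    document.head.appendChild(script);
  });
}

// The page keeps working even if the tracker has gone out of business.
loadExternalScript("https://tracker.example.com/stats.js").catch(() => {
  console.warn("Tracking unavailable; continuing without it.");
});
```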
An old software application from the OS era has some hope of revival if you have a copy of the CD, because all the code that made up the app is together in one place. But an old, broken web app will be almost irretrievably dead, because huge chunks of its code will be missing.
Problems like these are just starting to emerge, but as the metaplatform grows and ages they'll become much more prominent. We don't have any systematic ways to deal with problems like these today -- which means they're a business opportunity for the next crop of software entrepreneurs.
What the metaplatform means to you
Much of the discussion in this post is pretty theoretical. But I think it has important practical implications. Here are a few specifics to think about:
If you're a computer user (and if you're reading this, you must be), keep in mind that the most interesting new software innovations are likely to come from companies that consciously work the metaplatform. If you want to be at the leading edge of software innovation, you should keep yourself open to experimenting with new web applications and plug-ins, and make sure your browser doesn't artificially cut you off from some technologies. This is especially true for mobile devices. The iPhone today gives (in my opinion) the best overall mobile browsing and app discovery experience, but you pay a serious price for it -- you're cut off from some web technologies (Flash, Java), and your choice of applications is limited by the Apple app police. That price is worth paying today, but in the future I hope there will be mobile devices that are as satisfying as the iPhone but less controlled. Actually, I'm sure that will happen over time. But "over time" can sometimes mean a long time in the future. You can help the process along with what you buy and the feedback you give to device manufacturers.
Are you working at an OS company? If so, you probably measure success by the number of devices your software controls. You need to rethink that viewpoint. The OS is going to be less and less of a technology control point in the future. It will become commodity plumbing underneath the metaplatform, limiting your ability to charge a lot of money for it. So at a minimum, you need to plan for cost control.
But you should also be asking if plumbing is the right place for your company's creativity in the long term. There will be much more profit opportunity in contributing to the metaplatform by creating APIs and developer functionality that can be used across different operating systems. OS companies have many of the assets needed to build those components of the metaplatform. A successful OS can be a great launching point for technologies that run across platforms, because you already have a big installed base that you can use to jump-start the technology's adoption.
Are you at an application company? Many successful app vendors are trying to create APIs that will enable other developers to extend their products. This is the right idea, but the implementation is often off-target. Many of the app companies I talk to are trying to make their APIs into the business equivalent of an operating system, with developers coming to them and living entirely within their private ecosystem. A warning sign is when a company uses a phrase like, "(insert company name) developer network" to describe its offering.
The wave of the future is not turning an application inward into its own little walled garden; it's opening the application outward so it can be mixed and matched with other functionality in the metaplatform. If you have the best drawing program in the industry, you should be asking how you can also become the best drawing module in the metaplatform. Get used to being a component in addition to a standalone product. You lose some identity in the process, but gain greater opportunities to grow.
And besides, if you don't do it, you'll be vulnerable to someone else doing it and taking your place.
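To make the "component as well as product" idea a bit more concrete, here's a hypothetical sketch in TypeScript (Node.js). The `renderChart` routine and the `/render` endpoint are invented for illustration; the point is that the same capability behind the standalone app is also exposed so other metaplatform applications can mix it in.

```typescript
// Hypothetical sketch: the routine that powers the standalone drawing app
// is also exposed as an HTTP endpoint for other developers to call.
import { createServer } from "http";

// The core capability the standalone product is built around.
function renderChart(points: number[]): string {
  const bars = points.map((p) => "#".repeat(Math.max(0, Math.round(p))));
  return bars.join("\n"); // A toy "drawing": an ASCII bar chart.
}

// The same capability, opened outward as a component.
createServer((req, res) => {
  if (req.method === "POST" && req.url === "/render") {
    let body = "";
    req.on("data", (chunk) => (body += chunk));
    req.on("end", () => {
      try {
        const points: number[] = JSON.parse(body);
        res.writeHead(200, { "Content-Type": "text/plain" });
        res.end(renderChart(points));
      } catch {
        res.writeHead(400);
        res.end("Expected a JSON array of numbers");
      }
    });
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);
```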
If you're a computing student, or a computing veteran looking to create a new product, think about what role you can play in the metaplatform, and what customer problems you can solve with this new tool. There will be big market openings both in products for users and companies, and in infrastructure for other developers in the ecosystem (billing, rights management, security, etc.).
As in previous generations of software, the answers are not immediately obvious, and the people who figure them out first will have huge opportunities to do something impactful. Like Caminer, Goetz, Bauer, Bricklin, and Frankston, you're on an enormous beach with a trowel and bucket, and you have a chance to shape the next generation of computing.
Have fun.
====
I'd like to thank Eugene Miya of NASA Ames and Martin Goetz for helping with the research that contributed to this article. They're not responsible for any errors I made, but they definitely corrected some.
I'm sure there are folks out there who have additional information on the history I wrote about here. If you have anything to add (or correct) please post a comment.
15 comments:
Interesting slice of history.
> The other critical change in the 1980s was the separation of the OS from the underlying hardware.
This happened earlier in the 70s and 80s with CP/M. On the hardware side, the S-100 bus was a widely used standard.
On Motorola processors, another system offered some independence even though it was not as popular: Flex and later Flex 9.
br -d
If you like computing history, by far the best review of it is this series "The Machine That Changed the World" http://waxy.org/2008/06/the_machine_that_changed_the_world/
Also, while more or less true, your comment about Lyons doing everything on paper till the LEO came online is typically misleading and avoids a really interesting side-story.
The tabulation machine was not a computer, but it was indeed a way to process data. I haven't written a comprehensive history myself, nor did I live through all of this, but I suspect much of the "the world only needs a dozen computers" sentiment, and similar misunderstanding and resistance, came from the fact that there appeared to be a perfectly sound, reliable, and relatively efficient-looking machine already on the market.
Using this as an analogy to something today is left as an exercise for you.
wow! what a blog my friend! excellent stuff. I also follow history and computing history and this is excellent stuff. will blog and cross link with thoughts. hope you are well kind rgds Ajit
Irretrievably dead web apps are always a worry. A big reason why I spend my money on desktop apps... it feels like it's mine, and my data is my responsibility. Being able to export data isn't enough... what springs to mind is open source.
Source hosting site Devjavu recently announced that they are shutting down. Sad news, but since they are hosting with a tweaked version of Trac, it is perfectly possible to install Trac locally and continue on.
If web applications were like Linux distributions, one could imagine transferring data from one to another.
Linux as a mashup would be equivalent to a single entity hosting various components, at given version numbers, that are tested to interoperate. Quite different from the real world, where APIs exposed for social networks, photo sharing sites, etc. could change or “fail whale” at any time.
Lots to think about, which may warrant a lengthy blog post of my own.
Thanks for the comments, folks.
David Mery wrote:

>>This happened earlier in the 70s and 80s with CP/M. On the hardware side, the S-100 bus was a widely used standard.

Thanks. Researching this stuff is like peeling an infinite onion -- there's always more info to find.
shoobe01 wrote:

>>while more or less true, your comment about Lyons doing everything on paper till the LEO came online is typically misleading and avoids a really interesting side-story.

Ouch, "misleading" feels a little bit harsh. I did say in that same paragraph, "by hand with adding machines," which was a reasonable description. One of the biggest challenges in writing a post like this is figuring out what to leave out. And you're right, there is indeed an interesting backstory about the need for a business computer in the 1940s. It focuses on the structure of work and the changing economy in the UK after World War II.
The story's explained in detail in the book on Leo, but to summarize it briefly:
--Lyons' business required an extraordinary number of small calculations that could not be completely routinized. Here's how Lyons Chief Comptroller described it: "We still employed large numbers of clerks doing a dreary kind of job. You see the difference between a routine office job and a routine factory job is that the routine factory job is doing the same thing, literally the same thing, time and time again. In the office you are doing the same kind of thing, but you're feeding in different figures to the machine all the time, you have got to concentrate, you can't even daydream. We had plenty of work and that was really the reason why it was obvious to us that an automatic computer was what we wanted."
--Because there was more opportunity for workers post-war, Lyons was afraid that no one would be willing to do this sort of job in the future.
--Even if people were willing to do the job, there was a high likelihood of mistakes.
In other words, the business simply couldn't scale much further. They had to find something like the computer.
I think Lyons saw this before other companies in the UK because they were a huge company processing a massive number of tiny transactions. My guess is that they were probably one of the first companies to hit the limit of what an organization could do with humans and tabulating machines.
Thanks for the nice comments, Ajit. I appreciate them.
Nathany wrote:

>>Lots to think about, which may warrant a lengthy blog post of my own.

Cool. Please post another comment here if you write the post; I'd like to read it.
I read "From Airline Reservations to Sonic the Hedgehog" by Campbell-Kelly some years ago, but I don't remember anything about the LEO story in there.
Thanks for the excellent article.
--Tobias
Great post and fun reading!
While coming from a different angle, you reached almost the same conclusions that Clayton Christensen presented in his "Skate Where The Money Will Be" from 2001. He says that technological advances always lead to value chain fragmentation and offered a framework to predict where most profits will be made in the future.
There is one company consciously playing exactly in the direction you are pointing at. This is obviously Google, with everything they do with Android and Chrome ("commodity plumbing underneath the metaplatform"), as well as HTML5 and Google App Engine (the metaplatform itself).
There are many parallels between what happened to the computer industry 15 years ago and the changes that are brewing in the mobile industry. The value chain has begun to restructure from vertical to horizontal integration (exactly as Christensen predicted). The first victims are handset manufacturers -- four out of the five leading vendors have been caught without an independent smartphone software platform and face a commoditization onslaught. Next will be vertically integrated cellcos (some are already starting to outsource their networks).
Your historical perspective and the Christensen framework can shed some light on the chances of success for companies like Nokia (still running in molasses?), Microsoft (what could have been?) and Google.
Sorry for long comment, couldn't help it...
What a great article! Thanks for a brilliant and well-researched read. That little bit of history was very insightful and thoroughly enjoyable.
Thanks for the nice comments, Jonathan and Tobias.
Michael, you need never apologize for a long comment. I love it when people add to what I've written. Interesting thoughts.
By the way, it turns out the 30th anniversary of the first showing of VisiCalc was May 11, not May 12. But as I expected, there was no parade.
But Dan Bricklin did blog about it (link), and I'm pleased to hear that a celebration of the 30th anniversary of the shipment of the product is planned for this June.
Michael,
Another nice article! I always enjoy reading your writing. You make very complex subjects easier to understand and your writing style is very engaging. When are you going to write a book? :)
Richard Law
Great article, thanks. One minor confusion:
"Instead, it is what Americans call a coffee shop ... The closest equivalent in the US these days is probably Denny's."
I thought Denny's was for breakfast. The closest equivalent to a tea shop is starbucks. A tea shop sells cups of tea in different varieties (earl grey, english breakfast, etc), and usually cakes and scones. Just substitute coffee for tea, and you have starbucks. Lyons are now principally a cake manufacturer, I believe?
Anonymous wrote:

>>The closest equivalent to a tea shop is starbucks. A tea shop sells cups of tea in different varieties (earl grey, english breakfast, etc), and usually cakes and scones.

Thanks for the comment. I thought the same thing. But then I found a website (here) that had images of old Lyons menus. It was a full-service restaurant, with delicacies like Saute Kidney and Boiled Syrup. Very similar in spirit to Denny's.
Sadly, Lyons was broken up in the late 1980s, according to Wikipedia.
Richard wrote:

>>When are you going to write a book? :)

Thanks very much for the nice comments, Richard. To write a book I first have to come up with something worth saying that's book-length.
Meanwhile, Mobile Opportunity now amounts to about 255,000 words, which is about three books. So I guess you're reading it. ;-)
I am currently learning about operating systems for the first time. Though some terms remain vague to me, I understood the main ideas you were conveying about the importance and implications of software today. So, thank you. I found the history particularly helpful and interesting too.