Predating the first Internet bubble, there was the applet craze: it was thought that software would be delivered through the Web, on demand. Java was riding high on this fad, and it lost a lot of credibility when several forces combined to make applets less of a big deal than the pundits made them out to be: the immaturity of Java technology (e.g. the unsuitability of AWT for building big, rich UIs on PC clients), low Internet connection speeds, high bandwidth costs, and Microsoft’s adverse reaction to hosting a potentially undermining technology in the bosom of its desktop OS.
Microsoft was able to contribute to blocking this disruption, but it was not able to directly counter it or provide an alternative. .NET, like Java, became one of those dull gray tools of the IT crowd – a means of building multi-tier software systems without the hassle of understanding the complex threading needed to make a middle tier that can serve thousands of users. This despite the fact that .NET implemented code-behind-HTML and other features that made it quite a good tool for delivering user interfaces in Web pages.
Web hackers – the sort who don’t program in Real Languages like C# and Java, and who never got the memo about server software – kept plugging away at making the Web user experience better.
“Better” means escaping the UI-as-form-filling paradigm that gave the Web a plodding, oppressive feel whenever user input was required. The Web was born a hypertext system; it should be no surprise that accommodating interaction beyond what hypertext requires is difficult.
For an idea of the size of this problem and the value of simplifying the solution, compare, on the one hand, Enterprise JavaBeans, even in its much cleaner 3.0 incarnation, with, on the other hand, Ruby on Rails, which epitomizes the Web 2.0 ethos of productivity over universality. By focusing on exactly this problem – narrow in scope, but vexing to a large number of application developers – Ruby on Rails struck a very resonant tone, and it has made an impression well beyond the number of projects that actually use it.
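A toy sketch can illustrate the convention-over-configuration idea behind that productivity: in Rails, the URL alone determines which controller class and action handle a request, with no wiring files in between. The following is not real Rails code – `PostsController` and `dispatch` are hypothetical names invented here – but it shows the naming convention at work in plain Ruby.

```ruby
# A hypothetical controller; by convention, the class name is derived
# from the "posts" segment of the URL.
class PostsController
  def show(id)
    "Showing post ##{id}"
  end
end

# Map a path like "/posts/show/1" to PostsController#show("1")
# purely by naming convention -- no configuration needed.
def dispatch(path)
  _, resource, action, id = path.split("/")
  controller = Object.const_get("#{resource.capitalize}Controller").new
  controller.public_send(action, id)
end

puts dispatch("/posts/show/1")  # prints: Showing post #1
```

Adding a new screen under this scheme means adding one method to one class; the EJB equivalent of that step, even in EJB 3.0, involves considerably more ceremony.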
Ruby on Rails belongs in a spectrum of technologies we can call “Web OS.” Web OS is a system for running and building applications that run on Web servers and are used through the medium of a Web browser. Ruby on Rails is at the “building” end of the spectrum, while application suites like Writely, Gmail and Google Calendar, that have Web services interfaces but no SDK, are at the “running-only” end of the spectrum (although you can be sure Google has an impressive set of tools for their own coders).
We will soon see a lot more Web OS attempts, and, hopefully, some successes. Success, in this context, means making it possible for a large number of coders – not just those with Google's resources for toolsmithing – to create and run rich Web-based applications that truly break free from the “screen A -> screen B -> screen C” model of UI misdesign.
Whatever you may think about having your documents live on a server and editing them with tools that also live on servers, the Web OS future is likely to produce some interesting results. One easy prediction is that the mash-up culture will spread to all manner of applications – something the component-document technologies that were supposed to put a spreadsheet inside your word-processing document never really delivered. Maps, communication, auctions, translation, and so on will drop into Web OS documents naturally, because the data and communication that animate these document mash-ups will be available and compatible.
This is what makes Web OS interesting. If it stopped at implementing MVC applications in a particularly challenging environment, it would be only a technological curiosity, and probably not enough to motivate investment. But Web OS applications really can be better.
It's a good thing developers of these applications were never told how hard that is to do.