This interview with Paul Prescod is a first here at kintespace.com. Explaining the importance of Paul Prescod is like trying to explain how and why you are reading these words on the Internet. No, Paul Prescod did not invent the Internet. Paul Prescod is a leading visionary in forming the consciousness required to build on the Internet, using its HTTP protocol efficiently. Paul is all about the consciousness of REST. One might say that Roy Fielding built the electric guitar but it’s Paul Prescod as Jimi Hendrix who really gets you interested in playing it. So let’s sit down and listen to the Internet with the electric sounds of Paul Prescod.
Bryan Wilhite: In my youthful ignorance, I’m going to admit that, since I can’t find your entry at Wikipedia.org, I must ask you to give us a bio’. Would you? Is a bio omitted at prescod.net on purpose? Does the acronym SGML play a large part here?
Paul Prescod: Here’s my formal bio:
Paul Prescod is a leading researcher and implementor of markup technologies. His formal education was in mathematics and computer science at the University of Waterloo. His research interests include formalisms for document modeling, queries and schemata. Paul has been very involved in the development and promotion of new standards. He worked within the XML Working Group of the World Wide Web consortium to develop the XML family of standards and co-wrote a popular book on that family of standards with Charles Goldfarb, inventor of SGML. More recently he contributed to the DITA standard for componentized documentation.
SGML is not a part of my official biography, but is the source of many of my ideas and experience. Much of what Don Box talks about in the interview you cite below is sort of common sense for SGML-heads. For example, people like Tim Bray and Sean McGrath have been saying forever that what matters is the message on the wire and not the shared semantics represented in metadata. Don Box cites how this was a revelation to him in 2003. The whole web services movement was basically about people without appropriate backgrounds building stuff they didn’t understand in a rush. I include myself in that category. I had no idea about protocols back then, just as Don Box had no idea about markup (and seemingly not a lot about protocols either).
rasx(): Are your Canadian technologists having the same issues (debates) about SOAP vs. REST that we are having here in the United States?
P 2 : Canadian technologists are an integral part of most technology discussions and are in no way segmented from or different than American or European participants. We are all just Internet users.
rasx(): You are truly a responsible and influential Internet citizen. What accomplishments among the Internet “technorati” stand out for you?
P 2 : I am happy with my popularization of the REST model, which has since been adopted enthusiastically by Yahoo, Flickr (before they were part of Yahoo), the Atom protocol group, the Rails development team and too many other organizations to count.
rasx(): REST is distinguished by its fixed interface. Its members are HTTP verbs, featuring GET and POST. The parameters or arguments sent to these members are URIs. The assumption here is that this constraint makes OOP-centric people shun REST. Do you agree with this reasoning?
P 2 : Your description is only partially correct. REST is distinguished by its focus on addressable objects (URI-addressable, on the Web) with fixed method interfaces. Most models that compete with the Web do not feature addressable objects but rather addressable “services.” Therefore many OOP programmers are attracted to REST because it puts objects (rather than services) back at the heart of networked software development. The REST emphasis on a fixed interface is different from what many OOP programmers are used to, but that does not typically cause them to shun it. For example, the Ruby on Rails community, organized around the highly object-oriented Ruby language, is a fan of REST. The same is true for object-oriented programmers using Java, Python and every other object-oriented language.
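To make that contrast concrete, here is a minimal Python sketch of the two styles. Every name in it is an illustrative assumption, not taken from any real framework: an RPC-style service exposes an open-ended set of method names at one endpoint, while a REST-style system exposes many addressable resources through one small, fixed set of verbs.

```python
# Minimal sketch contrasting service-style and resource-style dispatch.
# All names here are illustrative assumptions, not a real framework.

# Service/RPC style: one endpoint, an open-ended set of method names.
rpc_methods = {
    "getUserName": lambda user_id: f"user-{user_id}",
    "deleteUser": lambda user_id: f"deleted user-{user_id}",
}

def call_service(method, *args):
    # Every new capability invents a new method name on the service.
    return rpc_methods[method](*args)

# REST style: many addressable resources, one small fixed interface.
FIXED_VERBS = {"GET", "PUT", "POST", "DELETE"}
resources = {"/users/42": {"name": "user-42"}}

def call_resource(verb, uri, body=None):
    if verb not in FIXED_VERBS:
        raise ValueError("REST constrains the interface to a fixed verb set")
    if verb == "GET":
        return resources[uri]
    if verb == "DELETE":
        return resources.pop(uri)
    resources[uri] = body  # PUT/POST both store a representation here
    return body

print(call_service("getUserName", 42))    # the service is addressed by method name
print(call_resource("GET", "/users/42"))  # the object is addressed by its URI
```

The point of the sketch is that in the second style any new kind of object gets a new URI, not a new method name, so every generic tool that understands the fixed verbs can work with it.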
rasx(): What’s your summary of the explosive expansion of the Blogosphere and the de facto rule of XML-RPC in Web applications like WordPress?
P 2 : XML-RPC is not dominant in Web applications. A tiny fraction of Web applications use XML-RPC. Actually, the REST model is derived from the way that Web applications are built.
With respect to the Blogosphere: Blogs use REST much more than they do XML-RPC. Some Blog software does use XML-RPC to upload the post from the author to the website. But how does the post get distributed to hundreds, thousands or tens of thousands of readers? If “wonkette” does one XML-RPC call to upload her thoughts and then her readers do ten thousand downloads and views (including many through aggregators) using REST, then which technique is really doing the heavy lifting?
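The scaling half of that argument can be sketched in a few lines of Python. Because REST reads are plain GETs on stable URIs, any cache along the way can validate its copy with a cheap 304 response instead of re-downloading the body. This is an illustrative simulation of HTTP conditional GET, not a real HTTP stack.

```python
# Illustrative simulation of HTTP conditional GET (ETag / If-None-Match).
# Not a real HTTP stack; it only shows why GET-based reads scale.

POST_BODY = "<entry>wonkette's latest post</entry>"
POST_ETAG = "v1"  # the origin server's validator for the current version

def origin_get(if_none_match=None):
    """Origin answers 304 when the client's cached copy is still fresh."""
    if if_none_match == POST_ETAG:
        return 304, None  # nothing re-sent: the cache did the work
    return 200, POST_BODY

# First read: one full transfer. The next ten thousand reads: cheap validations.
status, body = origin_get()
assert (status, body) == (200, POST_BODY)

full_transfers = 1
for _ in range(10_000):
    status, body = origin_get(if_none_match=POST_ETAG)
    if status == 200:
        full_transfers += 1

print(full_transfers)  # prints 1: only the first request moved the whole body
```

An XML-RPC call, being a POST with an opaque payload, gives intermediaries no standard way to do this kind of caching.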
Even for posting, the REST-based Atom API will be much more significant over the long term than the XML-RPC API, which has as its only virtue that it was first. The problems with the XML-RPC API are well documented.
XML-RPC itself has lost most of its momentum over the last several years. Its design was quick and not comprehensive which got it to market first but doomed it long-term. It is just a legacy protocol that is so entrenched that it will take a few years to replace. SixApart, Blogger, etc. have moved towards Atom/REST.
That said, XML-RPC is good for highly simplistic call and response services where scalability of data reads is not an issue (or where you intend to depend on REST to handle data reads).
You asked specifically about WordPress. There is an extension for WordPress to allow it to use the new Atom publishing protocol.
rasx(): To me, XML-RPC means Dave Winer. After listening to Dave on Leo Laporte’s TWiT and Dave’s Morning Coffee Notes—and reading his Blog—the strong suggestion to me is that Dave is saying that XML-RPC is simpler and more elegant than SOAP. He talks about getting into what some can regard as “arguments” with SOAP pundits at Microsoft. There is what I consider an historic video recording of Don Box and Dave Winer (off camera) exchanging a few cute words. Now here comes Paul Prescod saying that REST is simpler and more elegant than XML-RPC! Do you place your views in this context? Have you had to “defend” yourself in this context?
P 2 : I wouldn’t say that REST is simpler than XML-RPC. XML-RPC is both brain-dead simple and in many ways simply brain dead. REST is substantially more powerful and scalable than XML-RPC which is why (for example) REST is the foundation for the whole syndication craze. (REST is how RSS files get downloaded, aggregated, etc.) What does XML-RPC have to say about caching? About proxying? About extensibility?
The interesting thing about the talk you sent is that it seems to me that Don Box makes the case for REST. He says that the problem with COM was that it made it too easy to define new abstractions using the IDL compiler. He says that he wishes that it had been more difficult to develop new interfaces. REST is all about using a single pre-standardized interface: the resource.
Don throws in a bit about how he isn’t really saying anything in favour of REST because RESTafarians are trying to “sell an abstraction.” My rejoinder is that a system with no common abstractions is not a system.
Despite what Don said about the mistakes of COM, Microsoft has dedicated the last few years to developing developer tools analogous to IDL compilers for SOAP and WSDL. Microsoft in general has an amazing capacity to state blatantly: “That stuff we sold you last year was total crap but please trust next year’s product.” Read some magazines or books published by Microsofties and you’ll see what I mean. Don Box is playing that same game here: “SOAP was crap. XML Schema was crap. WSDL was crap. But trust us, we know what we’re doing now.” They can’t afford to learn the lesson about premature standardization because they can’t afford to slow the revenue stream (and “thought leadership”) generated by technology churn.
rasx(): Dave Winer also means RSS. This is what Dave Winer says about Atom: “I don’t care if Atom gains strength, just as long as it doesn’t hold anything back or sacrifice any of the progress we’ve made. I said early-on that I would support Atom, and I have. I don’t want to fight over this so I don’t. Unfortunately some of the people involved want a fight, so they act as if one is happening, and this confuses a lot of people.” How would you summarize Dave’s view when trying to explain this Atom/RSS ‘issue’ to an outsider not hip to this discussion?
P 2 : I was not involved in the syndication world at the time of Atom’s inception. My overall view of Dave Winer is that he has brilliant ideas ahead of their time and then does such poor implementation that other people must spend years cleaning up the mess created by the early adopters on the basis of his half-written, barely-thought-out specifications. Atom cleans up his broken idea of RSS. SOAP was supposed to clean up XML-RPC although it made the situation worse. I can’t speak to OPML.
As Don says in the talk you referenced: “The whole world is living with the arbitrary decisions that Dave Winer made when he rolled out of bed that morning.”
rasx(): The assertion here is that Microsoft’s preference for SOAP is yet more evidence that it “hates” the Web. This comes because SOAP seems to work ‘on top’ of the Web while REST is ‘in’ the Web—REST is the programmatic manifestation of HTTP. SOAP seems to build flying buttresses—an abstraction layer on top of an abstraction layer—just in case HTTP goes away—almost hoping HTTP goes away. Do you agree with this perception?
P 2 : Microsoft is a huge company and people will vary widely in their views.
I think that Don Box does kind of “hate” Web technologies, at least as an interprocess communication mechanism.
“We have to do something to make it (HTTP) less important,” said Box. “If we rely on HTTP we will melt the Internet. We at least have to raise the level of abstraction, so that we have an industry-wide way to do long-running requests—I need a way to send a request to a server and not get the result for five days.”
—“Microsoft guru: Stamp out HTTP,” news.zdnet.com
That’s overstated a bit, but I don’t think Don Box has ever really thought about how to use HTTP to its potential.
If you want a “human drama” you might investigate the relationship between Don Box and Roy Fielding. Don once hinted to me that his first-hand knowledge of Roy’s time at University (I think that they both went to UC Irvine) was part of his skepticism of REST. I don’t think that’s the whole story but it does suggest to me that the REST argument was destined to fall on deaf ears in his case.
rasx(): Here’s a strange one: do you resonate with the flippant remark that SOAP networks have more “intellectual property rights” (or proprietary surface area) while REST networks are too “universal” for many corporations to culturally understand?
P 2 : It depends what kind of corporations you are talking about. Software vendors will inherently hate the idea that a “technique” for using existing protocols might be enough. They need to sell new stacks of software and therefore need innovation for its own sake. In the link you sent, Don says: “We live in an industry with a lot of mouths to feed. If something is too simple, how am I going to get paid?”
The non-tech companies (their customers and “users”) are stuck in the position of trusting random guys they read about on the Web or trusting their long-term infrastructure providers. Even if the techies feel confident to make the decision on their own, they are very likely to be overruled by their manager who does not totally trust his tech team.
rasx(): I made quite a big deal in my Blog when we announced our switch from the SOAP-based Google search API to the REST-based Yahoo search API. I was humbled to find that you were dealing with this issue over three years ago! Has anything new come up to further explain what the hell is Google thinking?
P 2 : Not really. Their approach to interoperability has been very team-specific and inconsistent. They’ve used a mix of SOAP, REST, JSON and JavaScript APIs.
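For contrast with the retired SOAP search API, a REST-style search call is nothing more than URL construction followed by a GET. The sketch below shows the shape of such a request in Python; the endpoint and parameter names are illustrative, not an exact transcription of Yahoo’s (long-retired) Web Search API.

```python
# Building a REST-style search request is just URL construction.
# The endpoint and parameter names below are illustrative assumptions,
# not an exact transcription of Yahoo's (long-retired) Web Search API.
from urllib.parse import urlencode

endpoint = "http://api.search.yahoo.com/WebSearchService/V1/webSearch"
params = {"appid": "YahooDemo", "query": "REST vs SOAP", "results": 10}

url = endpoint + "?" + urlencode(params)
print(url)
# A plain GET on this URL returns XML: no envelope, no WSDL, no toolkit.
```

This is the property developers responded to: the whole request fits in a browser address bar, so it can be tested, bookmarked and cached like any other Web resource.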
rasx(): When I review your excellent contributions at xml.com, I see that your publishing stops at 2003. Are you contributing regularly to any online venue? Are your bricks-and-mortar obligations causing a change in priorities over the last few years?
P 2 : I stopped publishing on REST when I thought that the ideas had been promulgated enough that they should stand on their own. I wasn’t confident that they would, but I thought that if the community was not ready to run with the ideas after I had put roughly a person-year into various kinds of promotion and education then I’d have to admit that there were problems with the ideas themselves. But REST has done quite well in the last few years. I love going to talks by engineers at Yahoo or Amazon or Flickr about how their most popular services are the ones that are built in terms of simple HTTP call and response. It isn’t always ideologically pure “high REST” but they usually get the basics right. I’m also pleasantly surprised by what I hear from people at the heart of the Web Services movement who realize that things are much worse than they should be, in part because nobody listened back in 2001 when we tried to tell them things were off the rails.
rasx(): I would be surprised to find that an excellent historian/journalist like Robert X. Cringely, what with his NerdTV, has failed to find you. This begs the larger question: what is your attitude toward the Blog and the podcast? Do you think it is a “threat” to “traditional” media?
P 2 : Traditional media companies may be under threat, because in any time of change there will be winners and there will be losers. But capitalists will always find ways of adapting. I don’t think “traditional” broadcast media will go away, for the same reason that restaurant brands and coffee shop brands don’t go away despite the fact that everyone claims to prefer their down-home restaurants and coffee shops.
rasx(): The famous REST tutorial touts XLink as a supporting technology of REST. After reading “XLink: Who Cares?” at xml.com, I am led to ask: is there a replacement for XLink? Is there new guidance?
P 2 : I believe that “XLink: Who Cares?” is actually older than the REST article. XLink enables the recognition of links in a vocabulary-agnostic manner. It seems that the industry does not need such a facility. Recognizing links in a vocabulary-specific manner is no harder than recognizing paragraphs, titles, or whatever else. I’ve always thought it might be nice to be able to recognize links in general for purposes of content crawling, link checking and deep cloning.
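Vocabulary-specific link recognition really is as easy as recognizing any other element. Here is a short Python sketch using only the standard library; the tiny document vocabulary is invented for illustration.

```python
# Vocabulary-specific link recognition with the Python standard library.
# The tiny <article>/<ref> vocabulary here is invented for illustration.
import xml.etree.ElementTree as ET

doc = """<article>
  <title>On Linking</title>
  <para>See <ref target="http://example.com/a">this</ref> and
        <ref target="http://example.com/b">that</ref>.</para>
</article>"""

tree = ET.fromstring(doc)
# In this vocabulary, "a link" is simply any <ref> with a target attribute.
links = [ref.get("target") for ref in tree.iter("ref")]
print(links)  # prints ['http://example.com/a', 'http://example.com/b']
```

A generic facility like XLink would let a crawler or link checker do the same without knowing the vocabulary, which is the convenience Paul says the industry has mostly declined to pay for.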
rasx(): There simply has to be a human (emotional) story behind the SOAP Working Group working on SOAP HTTP GET. What the hell is (was?) this?
P 2 : David Orchard is a friend of mine, more because he lives in Vancouver than because of any collaboration on this sort of project. The vendors were under quite a bit of pressure to heal the rift between Web architecture and Web Services technologies and this is one artifact of that period. The primary benefit of moving the center of gravity of Web Services standardization to OASIS is that OASIS does not really attempt to build a unified architecture for its standards. The W3C is supposed to be building an information system known as “The Web.” This means that there is pressure to actually make standards work together.
rasx(): For me XSLT plays a major role in the development of human-readable data interfaces. It is the platform/framework-neutral way to design a UI declaratively. Since XML documents play a major role in REST, what do you think about its relationship to XSLT?
P 2 : XSLT’s interaction with external data sources is through the document() function, which interacts with HTTP’s GET. Amazon has some great APIs where you can link to an XSLT on your site and apply it to XML content on their site.
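A minimal sketch of that pattern follows: document() issues an HTTP GET for the named resource and makes the result available to the stylesheet. The URI and element names are placeholders, not a real API.

```xml
<!-- Minimal XSLT sketch: document() fetches a remote XML resource over
     HTTP GET. The URI and element names are placeholders, not a real API. -->
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/">
    <!-- Pull in a remote document and format part of it as a list. -->
    <ul>
      <xsl:for-each select="document('http://example.com/items.xml')/items/item">
        <li><xsl:value-of select="title"/></li>
      </xsl:for-each>
    </ul>
  </xsl:template>
</xsl:stylesheet>
```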
rasx(): When I designed Windows Forms with .NET 1.x, I got seriously burned due to its immaturity—and my need to get it to do Web-browser-like things. For Windows Forms in .NET 2.0, I’ve ignored almost all “improved” and new widgets except for one: the WebBrowser control. My designs basically inject or transfer state via XSLT into div blocks in the WebBrowser control. These ‘injections’ are basically HttpWebRequest calls to a remote server; the calls return chunks of XML (XHTML actually) identified by URIs. Am I on the “right” track under these unfortunate Microsoft circumstances?
P 2 : Well, you’re really doing something very AJAX-y, though it sounds like you are doing it in an application outside of a traditional web browser. AJAX always uses HTTP and often does so in a very REST-y manner, as you describe. I look forward to a future where more and more XML services are available for JavaScript and XSLT-based apps to “mash up.”
This document was generated offline from a Microsoft Office Word WordprocessingML file by a .NET console application that transformed it into XHTML. It is fair to say that this web presentation would not be possible, in large part, without Paul Prescod. Much appreciation! Much respect! Play on!