The Web Hypertext Application Technology Working Group (WHATWG) and the World Wide Web Consortium (W3C) have decided to go their separate ways. There is now a risk of a power struggle over the HTML standard that could cause it to fork. Everyone who develops websites knows that energy, and therefore money, is wasted accommodating the different ways browsers implement the features of different HTML standards. Forking the HTML standard could lead to two parallel webs. Even if it did not, it would certainly be more complex and expensive to implement multiple versions of two living standards rather than one. Two parallel standards would inevitably stifle innovation, with energy wasted on duplicated effort.
Standardisation is what enables the really useful ‘world wide’ part of the World Wide Web, just as it enables any other sophisticated undertaking. It is standardisation that makes complex undertakings requiring many specialists feasible and affordable. Microsoft, Apple, and other influential players that could have undermined standardisation efforts have done so, using their market dominance to their own advantage. Failure to regulate adherence to standards is what allows them to place petty commercial self-interest above progress and the greater good. Some things can be left to choice, but others are too important and must be regulated. Adherence to a single HTML standard should be regulated worldwide to ensure progress.
Google are improving their game. They know a good deal about us, about many of the entities we commonly deal with, and about the associations between them. They can therefore infer more useful search results from the usual thin stream of information we supply in a search, because the context of a search is much greater than what we type in. This should make for more accurate search results. However, it could also tend to push the more interesting results down the list, reducing the chance of serendipitous discoveries. Perhaps they could provide a serendipity slider. In any case, this also opens up a new frontier for SEO and dilutes a little the heavily abused priority attached to links. Good thing too.
The risk of interrupted access to data can be measured in terms of the amount of computing hardware and software the data must pass through. So if the only or main data store and processing is provided by internet services, the risk to access is much higher. The same can be said of data security. Indeed, under UK law some data must be held subject to specific constraints that naturally militate against the main principle of the cloud, which is to delegate IT management, reducing IT to a set of benefits and business-level decisions. Nonetheless, cloud solutions do present opportunities to do things that were difficult or impossible before, especially where scalability and volatile resource requirements are concerned.
All this raises an important question: Are cloud services a sound choice if more proximate services are an option?
I think the cases where cloud services are the only option are few. Nor do I believe that purchasing equipment in bulk makes a substantial cost reduction possible, especially once the costs of running a large organisation are taken into consideration. Locating services in lower-cost economies definitely has price advantages, but at the cost of access time and of greater risk to access and security. I have noticed that where worthwhile cost reductions have been claimed, it is often at the expense of IT jobs. It follows that either the cloud provider must be adding a similar number of people, and hence costs that will eventually return in service prices, or the services offered cannot be as well matched to the specific needs of the business. Such standardisation of IT, and hence of business processes, reduces the opportunity for business differentiation.
In the end there is no generic answer to the question. Care must be taken to make the right decision.