Should computers have their own websites?

I think so, if not now very soon:

Websites designed to be read by computers rather than humans could make it easier to share and use data, says Stephen Wolfram, creator of the “computational knowledge engine” Wolfram Alpha. Writing in a blog post, he suggests that “.data” should join the likes of .com, .org and .net as a new top-level domain (TLD) for organisations to share data in a standard format, creating a “data web” that would run in parallel with the ordinary web.

Under Wolfram’s scheme, an ordinary website would be accompanied by a parallel .data site. A human visitor to the .data site would just see a list of publicly available databases, but a computer would be able to access and interact with the data itself.

Of course, this kind of data sharing is already possible thanks to application programming interfaces (APIs), the software instructions published by many web services that allow programmers to combine data in creative ways, such as plotting Twitter updates on a Google map. Each organisation’s API is different though, which can make them hard to use. Wolfram’s proposal would put data in a standard location and format, making it easier to access.
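As a rough illustration of what such a standard location and format might enable, here is a minimal Python sketch. Note that Wolfram's proposal does not specify any concrete format; the "example.data" domain and the JSON index layout below are invented purely for illustration.

```python
import json

# A hypothetical machine-readable index that a ".data" site might serve.
# Both the domain name and the payload structure are assumptions made
# for this sketch, not part of Wolfram's actual proposal.
SAMPLE_INDEX = json.dumps({
    "domain": "example.data",
    "databases": [
        {"name": "sales", "format": "csv", "path": "/sales.csv"},
        {"name": "inventory", "format": "json", "path": "/inventory.json"},
    ],
})

def list_databases(index_json):
    """Return the names of the publicly listed databases in an index.

    Because every .data site would use the same layout, the same code
    could crawl any organisation's data without a per-site API wrapper.
    """
    index = json.loads(index_json)
    return [db["name"] for db in index["databases"]]

print(list_databases(SAMPLE_INDEX))  # prints ['sales', 'inventory']
```

The point of the sketch is the contrast with today's APIs: one generic client could read every organisation's index, instead of a custom integration per service.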

For the pointer I thank Michelle Dawson.

