The "Web of Data" remains an intriguing concept rather than the next big thing in digital information. But big data could help change that.

Jeff Bertolucci, Contributor

November 26, 2012


The term "Semantic Web" -- also known as the Linked Data Web, Web of Data, or Web 3.0 -- has been around for more than a decade. Its origins trace back to a 2001 Scientific American article by Tim Berners-Lee, better known as the inventor of the World Wide Web, and co-authors James Hendler and Ora Lassila. The piece projected a futuristic Web where a common framework allows applications to share and reuse data.

The concept may seem fuzzy, but Semantic Web technologies could enable uses not possible with today's document-centric Web. The World Wide Web Consortium (W3C), the international standards organization leading the Semantic Web effort, provides about three dozen case studies of how organizations are applying semantic technologies today. One example is data integration, in which information scattered across disparate locations and formats is combined into one unified application.
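To make the data-integration case concrete, here is a minimal sketch in Python using the open source rdflib library. The two "systems," their record formats, and the customer data are invented for illustration; the point is that parsing disparate sources into one RDF graph (the Semantic Web data model, described below) yields a single unified view.

```python
# Minimal data-integration sketch using rdflib (pip install rdflib).
# The systems, URIs, and customer data here are hypothetical.
from rdflib import Graph

# System A exposes its records as RDF/XML.
crm_data = """<?xml version="1.0"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:ex="http://example.org/schema#">
  <rdf:Description rdf:about="http://example.org/customer/42">
    <ex:name>Acme Corp</ex:name>
  </rdf:Description>
</rdf:RDF>"""

# System B publishes the same entity in Turtle.
billing_data = """
@prefix ex: <http://example.org/schema#> .
<http://example.org/customer/42> ex:outstandingBalance "1250.00" .
"""

# Parsing both sources into one graph merges their statements;
# the shared URI is what ties the two records together.
g = Graph()
g.parse(data=crm_data, format="xml")
g.parse(data=billing_data, format="turtle")

for subj, pred, obj in g:
    print(subj, pred, obj)  # one unified view of customer/42
```

Because both systems name the customer with the same URI, no custom mapping code is needed; the merge falls out of the data model.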

Enterprises have been slow to embrace the Semantic Web, but software tools and solutions based on these semantic technologies are available from major vendors, including IBM and Oracle. Smaller players include Cambridge Semantics, which provides semantic data management software for the enterprise.


The W3C envisions the Semantic Web as an extension rather than a replacement of the current Web -- a framework that extends Web principles from documents to data. Many semantic specifications for the Web are already in place, and the W3C continues to develop more specs to standardize the technology.

There are three main Semantic Web standards (a combined code sketch follows the list):

-- The Resource Description Framework (RDF), a general method for data interchange on the Web, which allows the sharing and mixing of structured and semi-structured data across various applications.

-- SPARQL (SPARQL Protocol and RDF Query Language), which is designed to query data across different systems.

-- OWL (Web Ontology Language), which enables users to define concepts in a way that allows them to be mixed and matched with other concepts for various uses and applications, according to Cambridge Semantics' tutorial on Semantic Web technologies.
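The short Python sketch below shows the three standards working together via rdflib; the vocabulary and instance data are made up for illustration. OWL declares a concept, RDF describes resources with it, and SPARQL queries the result.

```python
# RDF + OWL + SPARQL in one sketch, via rdflib; all names are hypothetical.
from rdflib import Graph

data = """
@prefix ex:   <http://example.org/schema#> .
@prefix owl:  <http://www.w3.org/2002/07/owl#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .

# OWL: define the concept "Sensor" so other vocabularies can reuse it.
ex:Sensor a owl:Class ; rdfs:label "Sensor" .

# RDF: describe two resources using that concept.
ex:unit7 a ex:Sensor ; ex:location "Boston" .
ex:unit9 a ex:Sensor ; ex:location "Chicago" .
"""

g = Graph()
g.parse(data=data, format="turtle")

# SPARQL: ask for every sensor and where it sits.
results = g.query("""
    PREFIX ex: <http://example.org/schema#>
    SELECT ?sensor ?place WHERE {
        ?sensor a ex:Sensor ;
                ex:location ?place .
    }
""")

for row in results:
    print(row.sensor, row.place)
```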

In a phone interview with InformationWeek, Cambridge Semantics CTO Sean Martin summed up the Semantic Web: "In essence, all you're doing is tagging data and giving it a description of what it is."

"If you can put more information in -- more metadata with the data -- then the software can interrogate the data to find out what the data is, and what it's capable of," added Martin, who believes the rise of big data could help spur the adoption of Semantic Web technologies.

Why would big data help? "First of all, a lot of the big data efforts are still very crude," said Martin, referring to Hadoop and related technologies. "The tools are relatively immature, and you've got specialized people using them."

And while the Hadoop Distributed File System (HDFS) offers many benefits, including excellent redundancy capabilities for big data operations, it has its shortcomings. "You can create oceans and oceans of files in HDFS. But who knows what they mean?" Martin said. "And how do we get lots of people to be able to use them?"

Semantics, he believes, has the opportunity to bridge that gap. "You can put a layer of semantics on top of any raw, immature system. So now you're using open data standards, which are well written and increasingly world-supported."
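One way to picture that semantic layer, again sketched with rdflib and the standard Dublin Core vocabulary (the HDFS paths and descriptions are invented): catalog otherwise anonymous files as RDF, then find them by what they mean rather than by where they live.

```python
# A hypothetical "layer of semantics" over raw HDFS files, using rdflib
# and Dublin Core (dcterms) metadata terms.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import DCTERMS

HDFS = Namespace("hdfs://cluster/")  # invented base URI for file paths
g = Graph()

# Catalog two files that would otherwise be opaque blobs in HDFS.
clicks = HDFS["logs/2012/11/26/part-00000"]
g.add((clicks, DCTERMS.title, Literal("Web clickstream, Nov. 26, 2012")))
g.add((clicks, DCTERMS.subject, Literal("clickstream")))
g.add((HDFS["sales/q4/part-00000"], DCTERMS.subject, Literal("sales")))

# Find files by meaning, not by path.
results = g.query("""
    PREFIX dcterms: <http://purl.org/dc/terms/>
    SELECT ?file WHERE { ?file dcterms:subject "clickstream" . }
""")
for row in results:
    print(row.file)  # hdfs://cluster/logs/2012/11/26/part-00000
```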

This isn't to say that global acceptance of the Semantic Web is right around the corner. In fact, proponents may find themselves spreading the gospel for years to come -- with no guarantee of success. "There's the issue of trying to foist this on the world -- getting people to understand what it's good for, why it's important, and what they can do with it. All of these things take a long time to surmount," Martin said.


About the Author

Jeff Bertolucci is a technology journalist in Los Angeles who writes mostly for Kiplinger's Personal Finance, The Saturday Evening Post, and InformationWeek.
