ECM Reaches For The Clouds

IT doesn't have to give up content management features when users adopt online storage and document sharing.

Kurt Marko, Contributing Editor

September 21, 2011

3 Min Read

InformationWeek Green - September 26, 2011

Download the InformationWeek September supplement on enterprise content management, distributed in an all-digital format as part of our Green Initiative (registration required). We will plant a tree for each of the first 5,000 downloads.



One goal of enterprise content management software is to provide a standard repository for business content, which ensures that IT can apply policies for access control, retention, and disposition. However, users have become enamored with online storage and file-sharing services, such as Box.net, Google Apps, and Dropbox. They're free and convenient for end users, but they can keep business content hidden from ECM policies and outside IT control.

Sure, you could thwart the use of these services with data loss prevention software or application-aware firewalls, but such draconian countermeasures not only alienate business users, they throw the cloud-sharing baby out with the corporate-policy bathwater. Better to adopt a nuanced approach to cloud storage that monitors and regulates data copied to external services and integrates these distributed content repositories into the corporate ECM ecosystem.

While we're still in the early days of managing online content, several tools have emerged that provide ECM-like visibility into cloud-stored data. The most common approach adds ECM capabilities to online content repositories. For example, Box.net offers an ECM Cloud Connect service that can synchronize information from widely adopted ECM platforms such as EMC's Documentum or IBM's FileNet with its online repository. Given Box's tentacles into numerous mobile clients and many SaaS applications, this makes the information instantly discoverable on everything from the CEO's iPad to the marketing manager's Salesforce home screen. Box can also sync content stored in Google Apps, Jive, or NetSuite, and it publishes a set of APIs for building apps that use Box-hosted content in custom workflows. Box's service isn't a one-way street, either: it also allows data stored online to be archived into on-premises ECM systems.
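To give a sense of what building against such a content API looks like, here is a minimal sketch that lists the items in a Box folder over its REST interface. It assumes Box's v2 content API and an OAuth access token (the token and folder ID are placeholders); endpoint names and authentication details vary by API version, so treat it as illustrative rather than as Box's documented integration path.

    # Illustrative sketch: list the items in a Box folder via the REST content API.
    # Assumes the v2 API (https://api.box.com/2.0) and a valid OAuth access token;
    # the token and folder ID below are hypothetical placeholders.
    import requests

    ACCESS_TOKEN = "YOUR_OAUTH_TOKEN"
    FOLDER_ID = "0"  # "0" denotes the root folder in Box's API

    def list_box_folder(folder_id):
        resp = requests.get(
            "https://api.box.com/2.0/folders/%s/items" % folder_id,
            headers={"Authorization": "Bearer %s" % ACCESS_TOKEN},
            timeout=30,
        )
        resp.raise_for_status()
        for entry in resp.json().get("entries", []):
            print(entry["type"], entry["name"])

    if __name__ == "__main__":
        list_box_folder(FOLDER_ID)

A real workflow integration would also page through large folders and map Box's metadata onto the ECM system's own fields, but the request-and-response pattern is the same.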

Some ECM products, such as Alfresco, can access online file systems via standard protocols like WebDAV, SFTP, and IMAP, or even Amazon's S3 API, which means many cloud storage services can be indexed and searched by internal ECM software with a bit of configuration or coding.
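To illustrate the "bit of configuration or coding" involved, the sketch below crawls an Amazon S3 bucket so that an internal indexer could feed object names, sizes, and timestamps into an ECM catalog. It uses Amazon's boto3 Python SDK against a hypothetical bucket name; it isn't any vendor's actual connector, just an example of the S3 side of such an integration.

    # Illustrative sketch: enumerate the objects in an S3 bucket so an
    # on-premises ECM indexer could ingest their keys, sizes, and timestamps.
    # Assumes the boto3 SDK and AWS credentials configured in the environment;
    # the bucket name is a hypothetical placeholder.
    import boto3

    def crawl_bucket(bucket_name):
        s3 = boto3.client("s3")
        paginator = s3.get_paginator("list_objects_v2")
        for page in paginator.paginate(Bucket=bucket_name):
            for obj in page.get("Contents", []):
                # An indexer would record these attributes in the ECM catalog.
                yield {
                    "key": obj["Key"],
                    "size": obj["Size"],
                    "modified": obj["LastModified"],
                }

    if __name__ == "__main__":
        for record in crawl_bucket("example-content-bucket"):
            print(record)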

An alternative scenario entails using cloud storage to augment internal ECM repositories. Here the roles are reversed: rather than an ECM-capable cloud service reaching into on-premises content, a conventional on-site ECM application uses an infrastructure-as-a-service storage provider as its document repository. For example, Nuxeo offers an extension that lets ECM users transparently access and manipulate data stored in the cloud (Amazon S3 in Nuxeo's case) just as they do locally stored files. Similarly, OpenText can use Microsoft's Azure storage service alongside on-premises SANs as a document repository.
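As a rough illustration of what using cloud storage as the repository means at the API level, the following sketch stores and retrieves documents in an Azure Blob Storage container using Microsoft's azure-storage-blob SDK. It is not how Nuxeo or OpenText implement their back ends; the connection string and container name are hypothetical placeholders.

    # Illustrative sketch: treat an Azure Blob Storage container as a simple
    # document repository, storing and retrieving files by document ID.
    # Assumes the azure-storage-blob SDK; the connection string and container
    # name are hypothetical placeholders, not any vendor's actual configuration.
    from azure.storage.blob import BlobServiceClient

    CONNECTION_STRING = "YOUR_AZURE_STORAGE_CONNECTION_STRING"
    CONTAINER = "ecm-documents"

    service = BlobServiceClient.from_connection_string(CONNECTION_STRING)

    def store_document(doc_id, data):
        blob = service.get_blob_client(container=CONTAINER, blob=doc_id)
        blob.upload_blob(data, overwrite=True)

    def fetch_document(doc_id):
        blob = service.get_blob_client(container=CONTAINER, blob=doc_id)
        return blob.download_blob().readall()

    if __name__ == "__main__":
        store_document("contract-1234.pdf", b"%PDF-1.4 ...")
        print(len(fetch_document("contract-1234.pdf")), "bytes retrieved")

In this arrangement the ECM layer still handles versioning, retention, and access control; the cloud service simply takes over the role the SAN used to play.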

We're still early in the adaptation of ECM systems to cloud services, but as the popularity of off-site storage continues to grow among end users, and IT professionals become more comfortable using IaaS to supplement internal storage systems, it's clear that integration between ECM and the cloud will tighten. Those upgrading their ECM capabilities in the near term would do well to add support for off-site services to their evaluation criteria.

About the Author

Kurt Marko

Contributing Editor

Kurt Marko is an InformationWeek and Network Computing contributor and IT industry veteran, pursuing his passion for communications after a varied career that has spanned virtually the entire high-tech food chain from chips to systems. Upon graduating from Stanford University with a BS and MS in Electrical Engineering, Kurt spent several years as a semiconductor device physicist, doing process design, modeling, and testing. He then joined AT&T Bell Laboratories as a memory chip designer and CAD and simulation developer. Moving to Hewlett-Packard, Kurt started in the laser printer R&D lab doing electrophotography development, for which he earned a patent, but his love of computers eventually led him to join HP’s nascent technical IT group. He spent 15 years as an IT engineer and was a lead architect for several enterprisewide infrastructure projects at HP, including the Windows domain infrastructure, remote access service, Exchange e-mail infrastructure, and managed Web services.
