NIST Seeks New Hash Algorithm

NIST stops accepting submissions for a new cryptographic one-way hash algorithm today. NIST's competition (http://csrc.nist.gov/groups/ST/hash/sha-3/index.html) follows a tradition of peer review, public discussion, and acceptance of algorithms that brought us DES, SHA, and AES. The selection process won't be complete until 2012, but the final selection should address weaknesses in the hash algorithms used today.

Mike Fratto, Former Network Computing Editor

October 31, 2008

NIST stops accepting submissions for a new cryptographic one-way hash algorithm today. NIST's competition follows a tradition of peer review, public discussion, and acceptance of algorithms that brought us DES, SHA, and AES. The selection process won't be complete until 2012, but the final selection should address weaknesses in the hash algorithms used today.

Cryptographic one-way hash algorithms ensure the integrity of data by calculating a unique number based on an input. There are two main benefits of one-way hash algorithms. First, you can't take a hash value -- the output of a hash algorithm -- and re-create the original input; one-way hashing is not encryption. Second, it is highly unlikely that you can find two inputs that produce the same output. A hash value is, for practical purposes, unique to its input, which is why hashes are used with digital signatures.
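To make those two properties concrete, here is a minimal sketch using Python's standard hashlib module. SHA-256 is only a stand-in here, since the SHA-3 winner obviously isn't in any standard library yet; the messages are made up for illustration. The sketch shows that the output is a fixed-length digest you can't work backward from, and that a one-character change in the input produces a completely different digest.

    import hashlib

    # Hash two nearly identical messages with SHA-256 (a stand-in for
    # whatever algorithm eventually wins the SHA-3 competition).
    msg1 = b"Pay Bob $100"
    msg2 = b"Pay Bob $900"

    digest1 = hashlib.sha256(msg1).hexdigest()
    digest2 = hashlib.sha256(msg2).hexdigest()

    # Both digests are the same fixed length (64 hex characters for
    # SHA-256) regardless of input size, and neither reveals the
    # message that produced it.
    print(digest1)
    print(digest2)

    # A one-character change yields an entirely different digest, which
    # is why finding two inputs with the same hash should be infeasible.
    print(digest1 == digest2)  # False

Running this prints two 64-character digests that share nothing obvious in common, even though the inputs differ by a single character. That unpredictability is what a digital signature scheme relies on when it signs the hash rather than the full message.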

Weaknesses in MD5 and SHA, two common one-way hash functions, have led NIST to create the competition for a new algorithm, just as it did when selecting a new encryption standard, AES. NIST is in the middle of a multiyear timeline. Once the submission period ends today, NIST will host a conference in 2009 to select the submissions that meet the minimum criteria it has set forth. Then the evaluation period begins, along with a public comment period. In 2010, a second conference will be held to discuss the analysis and to give submitters a chance to offer improvements to their algorithms. By the end of 2010, the finalists will be selected, and in 2012 the winning algorithm will be chosen.

NIST uses a public selection process because public peer review is the only way to ensure that cryptographic algorithms are reliable and secure. Anyone can create a cryptographic algorithm, but making it strong enough to withstand attack is extremely difficult, and there is no standardized test of cryptographic strength. Peer review lets cryptographers examine the algorithm and point out weaknesses. It works because reviewers might spot a flaw the designers overlooked, or might bring problem-solving skills the designers lack to bear on breaking the algorithm.

The only thing algorithm authors get if their algorithm is selected is bragging rights. The submitter of the selected algorithm has to grant irrevocable and nonexclusive rights to the algorithm, even if it is patented. A guy like Bruce Schneier, who blogs regularly on security issues, runs a company, is a well-respected cryptographer, writes numerous articles, and is often quoted in the trade and mainstream press, could easily let his ego get the better of him.

Schneier announced in his blog that he, along with Niels Ferguson, Stefan Lucks, Doug Whiting, Mihir Bellare, Tadayoshi Kohno, and Jon Callas, submitted their own algorithm, Skein. But his skepticism of new algorithms, even one he co-authored, is healthy and welcome.

A reader of Schneier's blog asked, "When will threefish and skein be available in commercial software?" To which Schneier responded:

As soon as someone implements the algorithms. They're free and open source; so there's nothing stopping anyone.

Except that it would be foolish. The algorithms are much too new to be used in a commercial application. Don't trust us when we tell you Skein and Threefish are secure; we designed them. Give it a year or two; let the community start evaluating the submissions. Let some consensus start to develop. There's no rush.

About the Author

Mike Fratto

Former Network Computing Editor

Mike Fratto is a principal analyst at Current Analysis, covering the Enterprise Networking and Data Center Technology markets. Prior to that, Mike was with UBM Tech for 15 years, and served as editor of Network Computing. He was also lead analyst for InformationWeek Analytics and executive editor for Secure Enterprise. He has spoken at several conferences including Interop, MISTI, the Internet Security Conference, as well as to local groups. He served as the chair for Interop's datacenter and storage tracks. He also teaches a network security graduate course at Syracuse University. Prior to Network Computing, Mike was an independent consultant.
