Humans were responsible for only 41% of network traffic monitored by Distil Networks; bots made up the rest. What does this mean for your applications?

Curtis Franklin Jr., Senior Editor at Dark Reading

May 22, 2015

"Bad bots" look more like humans with each passing year.

We're outnumbered. The rise of the bots has begun, and humans fall farther behind with each passing day. And that's the good news.

The 2015 Bad Bot Landscape Report, released by Distil Networks, said that the share of traffic generated by "bad" bots on the networks the company monitors went down slightly between 2013 and 2014. The catch is that the share of traffic with a human origin declined dramatically in the same period.

In 2013, human beings were responsible for roughly 55% of all network traffic monitored by Distil. By 2014, that had fallen to 41%. In the same time period, traffic from "bad" bots (those that are involved in espionage, fraud, theft, or cyber-vandalism) declined slightly, from 24% to 23%. What makes up the rest? Good bots.


According to Eli May, a spokesman for Distil Networks, "good" bots are those that perform benign or essential network tasks. "Think about the web crawlers that Google uses to index pages for its search engines," he said.

In an email exchange with InformationWeek, Distil Networks' CEO Rami Essaid further explained, "A good bot provides value back to the site. For example, Googlebot and Bingbot index your site so your prospects and customers can find you on the web. Clear value for your site. Also, if a bot doesn't respect robots.txt then we consider it a bad bot. Robots.txt tells bots which areas of your site they are allowed to access, crawl, or scan." Good bot traffic has increased in volume, while bad bot behavior has increased dramatically in sophistication; on some networks, the increase in volume has run to hundreds of percentage points.
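For a sense of what "respecting robots.txt" means in practice, here is a minimal sketch (not from the report) of how a well-behaved crawler checks a site's robots.txt before fetching a page. The example.com URLs and the "ExampleBot" user agent are hypothetical placeholders.

```python
# A minimal sketch, not from the report: a "good" bot consults robots.txt
# before crawling. The example.com URLs and "ExampleBot" agent are hypothetical.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetch and parse the site's robots.txt

url = "https://example.com/private/reports.html"
if rp.can_fetch("ExampleBot", url):
    print("robots.txt allows crawling", url)
else:
    # A good bot stops here; a bad bot, by Essaid's definition, ignores this.
    print("robots.txt disallows crawling", url)
```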

The report said that smaller companies are most at risk from the rise of bot traffic because they lack the ability to effectively block unwanted data flows. Even so, the company with the highest percentage of bot-generated traffic in its network was Amazon, with nearly 78% of all its traffic coming from machine-based sources.

Two facts from the report signal a more significant change in bot trends for the coming years. The first is the rise of mobile-originated bad bots. According to Distil, mobile bots are now responsible for more than 8% of all web traffic. And while each of the 21 largest global wireless companies was a source of bad bot traffic, T-Mobile (USA) became one of the top 20 bad bot hosts for the first time in 2014.

The second important development is the increasing sophistication of bot behavior. Distil's report says that 41% of the bad bots it found attempt to mimic human behavior in order to get into a website's infrastructure. And 23% of the bad bots are so good at this mimicry that they are essentially unstoppable by the technologies employed in most web application firewalls in use today.
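To see why conventional defenses struggle, consider the kind of naive filter many sites still lean on. The sketch below is purely illustrative (it is not Distil's or any vendor's detection logic, and its signatures and rate threshold are made up): it flags requests by User-Agent string and request rate, checks a human-mimicking bot can evade simply by sending a browser-like User-Agent and pacing its requests.

```python
# Illustrative only: a naive bot filter of the kind a human-mimicking bot
# evades. Signatures and the rate threshold are invented for this sketch.
import time
from collections import defaultdict

KNOWN_BOT_SIGNATURES = ("curl", "python-requests", "bot", "spider", "crawler")
MAX_REQUESTS_PER_MINUTE = 60  # hypothetical per-IP limit

request_log = defaultdict(list)  # client IP -> recent request timestamps

def looks_like_bad_bot(client_ip: str, user_agent: str) -> bool:
    """Flag a request using only User-Agent matching and request rate."""
    now = time.time()
    recent = [t for t in request_log[client_ip] if now - t < 60]
    recent.append(now)
    request_log[client_ip] = recent

    ua_suspicious = any(sig in user_agent.lower() for sig in KNOWN_BOT_SIGNATURES)
    too_fast = len(recent) > MAX_REQUESTS_PER_MINUTE
    return ua_suspicious or too_fast

# A scripted client that announces itself gets flagged...
print(looks_like_bad_bot("203.0.113.7", "python-requests/2.7.0"))  # True
# ...while a bot that fakes a browser User-Agent and throttles itself passes.
print(looks_like_bad_bot("203.0.113.8", "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36"))  # False
```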

For companies running public networks, the lessons are clear: They must do more to stop the proliferation of bad bots, or risk choking their networks with machine-generated traffic to the detriment of human users. Organizations that depend on web-facing applications to transact business or reach an audience should step up their game in application-level defense, especially as more of their traffic comes from users on mobile devices.


About the Author

Curtis Franklin Jr.

Senior Editor at Dark Reading

Curtis Franklin Jr. is Senior Editor at Dark Reading. In this role he focuses on product and technology coverage for the publication. In addition he works on audio and video programming for Dark Reading and contributes to activities at Interop ITX, Black Hat, INsecurity, and other conferences.

Previously he was editor of Light Reading's Security Now and executive editor, technology, at InformationWeek where he was also executive producer of InformationWeek's online radio and podcast episodes.

Curtis has been writing about technologies and products in computing and networking since the early 1980s. He has contributed to a number of technology-industry publications including Enterprise Efficiency, ChannelWeb, Network Computing, InfoWorld, PCWorld, Dark Reading, and ITWorld.com on subjects ranging from mobile enterprise computing to enterprise security and wireless networking.

Curtis is the author of thousands of articles, the co-author of five books, and has been a frequent speaker at computer and networking industry conferences across North America and Europe. His most popular book, The Absolute Beginner's Guide to Podcasting, with co-author George Colombo, was published by Que Books. His most recent book, Cloud Computing: Technologies and Strategies of the Ubiquitous Data Center, with co-author Brian Chee, was released in April 2010. His next book, Securing the Cloud: Security Strategies for the Ubiquitous Data Center, with co-author Brian Chee, is scheduled for release in the Fall of 2018.

When he's not writing, Curtis is a painter, photographer, cook, and multi-instrumentalist musician. He is active in amateur radio (KG4GWA), scuba diving, stand-up paddleboarding, and is a certified Florida Master Naturalist.
