Security Researchers Struggle with Bot Management Programs
Bots are a known problem, but researchers will tell you that bot defenses create problems of their own when it comes to valuable data.
Bot management is all the rage in the security world. Every day, I find myself bombarded with articles proclaiming that N percent of Internet traffic is generated by bots, where N is a sufficiently alarming number to make most executives want to dash out and purchase the first bot-defense product in sight. While I can't speak for the accuracy of those reports, one thing's certain: There's a growing demand for effective bot mitigation.
I know. I work for a company that develops one such bot management solution, and I talk to customers about it daily. I do enjoy having some semblance of job security, but being the recovering academic that I am, I'm also genuinely concerned. Conducting large-scale Internet crawls is an all-too-common task in many fields of security research. Does the research community fully understand the implications of bot defenses for their experiments? Do researchers do anything about it? I am not optimistic.
"Bot" is a notoriously overloaded term with numerous meanings. Today, it is generally understood to mean any software that performs automated tasks over the Internet. This includes malicious software, such as the agents that make up a botnet, but also benign software like search engine crawlers and information aggregators. Conveniently, this definition aligns with the feature sets of popular bot management solutions; businesses certainly want malware protection, but they also have strong incentives to monitor, limit, block, or even serve false content to automated requests reaching their web properties.
This is a serious problem for security researchers.
Data collection via Internet crawls is a crucial part of security research. In my own work, I crawled millions of websites and scraped application stores, code repositories, forums, vulnerability databases, and more. Think about it. Researchers meticulously design experiments, build and analyze invaluable data sets in a scientific framework, and (sometimes literally) fight to publish and present their results at prestigious conferences, only to discover that their data set was tainted by a plethora of bot defenses scattered around the Internet.
In the best case, the collected data would merely be biased, because servers equipped with bot defenses would block the connection or return a static page without meaningful content. In the worst case, servers that return false information to thwart harvesters could make it nearly impossible to even detect that something went wrong.
I have no reason to doubt that this situation already significantly affects Internet crawls and measurement studies. In all likelihood, we regularly work with bad data, and then publish and read papers with skewed results. But we simply don't yet have insight into how badly data collection is affected by bot defenses.
A solution is not likely to come from the business side. Widespread adoption of bot defenses won't be tapering off anytime soon. There simply isn't enough motivation for businesses to back down from their strong stance against bots; they won't forgo protection to accommodate a few innocuous crawlers among myriad malicious hits.
As far as researchers are concerned, there's always been a certain degree of awareness of anti-crawling techniques. Researchers came up with best practices such as crafting realistic request headers, limiting connection rates, and building crawlers on headless browsers. However, modern bot defenses are well-prepared to catch these tricks; they analyze browser characteristics, connection patterns, packet structure, and even hardware inputs, and combine these observations in nontrivial ways to distinguish between humans and our robot overlords.
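To make those best practices concrete, here is a minimal sketch of the kind of "polite" crawler they describe, written in Python with the requests library. The target URLs, header values, and delay are purely illustrative; a headless-browser crawler (built on a tool such as Playwright or Selenium) would follow the same pattern with more moving parts.

import time
import requests

# Illustrative targets; in a real study, these come from the experiment design.
TARGETS = ["https://example.com/", "https://example.org/"]

# Headers that mimic a mainstream browser. Modern bot defenses often see
# through this, which is exactly the point made above.
HEADERS = {
    "User-Agent": (
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
        "(KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"
    ),
    "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8",
    "Accept-Language": "en-US,en;q=0.9",
}

REQUEST_DELAY_SECONDS = 5  # keep the connection rate low


def polite_fetch(url):
    """Fetch one page with browser-like headers; return the HTML, or None on failure."""
    try:
        response = requests.get(url, headers=HEADERS, timeout=15)
        response.raise_for_status()
        return response.text
    except requests.RequestException:
        return None


if __name__ == "__main__":
    for url in TARGETS:
        html = polite_fetch(url)
        print(url, "->", "ok" if html else "failed")
        time.sleep(REQUEST_DELAY_SECONDS)  # crude rate limiting between requests

Even a crawler like this, which ticks every box on the traditional best-practices list, is readily fingerprinted by the defenses described above.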
Yes, even the most intricate defense can be reverse-engineered and bypassed given enough resources and dedication. The bar, however, is high. Faced with a growing number of evolving bot management products, researchers are perpetually at a disadvantage.
The Need for Change
We need a paradigm shift. Here is an idea: The next time we run a crawl, let's acknowledge that the entire Internet is out there to corrupt our data, and duly deal with it! Data validation is key. Questionable data collection methodologies and low-quality data sets aren't exactly unknown territory for the research community, but we need even greater focus on this issue today.
I'm all too familiar with that urge to rush through data collection and get to the more interesting data analysis (and then submit a half-decent paper minutes before a deadline). This approach is missing the mark if it leads to inaccurate measurements and incorrect conclusions.
Data validation is a hard problem, but it's also a well-explored area of computer science. We have the necessary tools, such as constraint validation for predictable data, or clustering to spot outliers in complex data sets. When all else fails, manual analysis combined with sampling can be a surprisingly effective and viable approach, even for extremely large data sets. It's well worth putting in the extra time and effort to systematically validate data, and to write at length about the process in publications, so that reviewers and readers know we did our part.
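As a rough illustration of what that validation might look like, the following Python sketch applies two of those ideas to a hypothetical set of crawl results: a simple constraint check (genuine pages should exceed a minimum length and not contain an obvious block phrase) and a crude clustering by content hash to flag identical bodies served by unrelated sites. The threshold, phrase, and sample data are invented for the example.

import hashlib
from collections import Counter

# Hypothetical crawl results mapping URL -> HTML body.
pages = {
    "https://site-a.example/": "<html><body>" + "real article text " * 200 + "</body></html>",
    "https://site-b.example/": "<html><body>Access denied</body></html>",
    "https://site-c.example/": "<html><body>Access denied</body></html>",
}

MIN_LENGTH = 2000               # illustrative constraint: genuine pages tend to be longer
BLOCK_PHRASE = "access denied"  # illustrative marker of a static block page


def passes_constraints(html):
    """Constraint validation: reject pages that look like block or placeholder responses."""
    return len(html) >= MIN_LENGTH and BLOCK_PHRASE not in html.lower()


def duplicate_content_hashes(pages):
    """Group pages by content hash; identical bodies across unrelated sites are suspicious."""
    counts = Counter(hashlib.sha256(html.encode("utf-8")).hexdigest() for html in pages.values())
    return [digest for digest, count in counts.items() if count > 1]


suspicious = [url for url, html in pages.items() if not passes_constraints(html)]

print("Pages failing basic constraints:", suspicious)
print("Content hashes shared by multiple sites:", duplicate_content_hashes(pages))

Anything flagged this way would then go to manual review on a sample, as described above.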
Finally, I'll point out that this problem has an interesting beneficial side effect: the potential to open up unique research directions. Enabling functional yet ethical crawling techniques that are also aligned with businesses' needs is one obvious route this can take. However, I also anticipate novel techniques that can scientifically quantify the impact of bot defenses on measurements.
With better insights and visibility into this issue, we can better recognize our limitations, and pursue the promising paths toward a solution.