When Websites Attack
Windows threats like Cryptolocker and ZeroAccess get all the attention, but malware targeting (Linux) Web servers continues to evolve
Malware became even smarter, stealthier, and shadier in 2013, according to the latest Sophos Threat Report. Nowhere was this more evident than in the use of the Web as a vector for spreading malware to unsuspecting users. Sure, the payloads -- from the disruptive Cryptolocker ransomware to the silent but deadly ZeroAccess botnet -- were more sophisticated this past year, but the unsung "heroes" of cybercrime are the 20,000 to 30,000 new malicious URLs that come online each day.
Those malicious URLs -- 80 percent of which are on compromised, legitimate websites, according to a SophosLabs estimate -- can serve a number of purposes. Some deliver the payload, of course, usually through drive-by downloads, malvertising, or social engineering. But payload-delivery sites are just a small fraction of the total. The rest serve as the funnel that steers users to those delivery sites: generating SEO spam that increases exposure to dangerous URLs, and shuffling users from the legitimate site they were viewing through a series of traffic redirectors to the ultimate payload. Recently, Sophos researchers have seen that some compromised sites are centrally controlled like a botnet, allowing them to serve up DDoS and other coordinated attacks.
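To make those redirect chains concrete, here is a minimal sketch of how a site owner might scan their own pages for the invisible iframes and off-site script tags such chains are typically injected through. The patterns and the example.com URL are illustrative assumptions on my part, not signatures from the Sophos report.

```python
# A crude injected-redirect scanner: fetch a page and flag markup that
# resembles common injection styles (hidden iframes, off-site scripts).
# Purely a sketch; real injections are usually obfuscated far beyond this.
import re
import urllib.request

SUSPICIOUS_PATTERNS = [
    # Zero-size iframes, a classic drive-by redirect vehicle
    re.compile(r'<iframe[^>]*(?:width|height)\s*=\s*["\']?0', re.IGNORECASE),
    # Iframes hidden via inline CSS
    re.compile(r'<iframe[^>]*style\s*=\s*["\'][^"\']*'
               r'(?:display\s*:\s*none|visibility\s*:\s*hidden)', re.IGNORECASE),
    # Scripts loaded from any host other than our own (noisy on purpose:
    # legitimate CDNs will also match, so treat hits as leads, not verdicts)
    re.compile(r'<script[^>]*src\s*=\s*["\']https?://(?!www\.example\.com)',
               re.IGNORECASE),
]

def scan_page(url: str) -> list:
    """Return the suspicious snippets found in the page at `url`."""
    html = urllib.request.urlopen(url, timeout=10).read().decode(
        'utf-8', errors='replace')
    hits = []
    for pattern in SUSPICIOUS_PATTERNS:
        hits.extend(match.group(0) for match in pattern.finditer(html))
    return hits

if __name__ == '__main__':
    # Hypothetical URL; point this at your own site.
    for snippet in scan_page('https://www.example.com/'):
        print('Possible injected content:', snippet[:120])
```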
That coordination, and the delivery of the payloads, is handled by exploit kits. While Blackhole has been on the decline, especially after the arrest of its alleged creator, Paunch, plenty of others have stepped up to take its place. Names like Neutrino and Glazunov have become familiar to security researchers, along with Redkit, which wreaked havoc this spring on high-profile sites like NBC.com and lesser-known URLs advertised by tasteless spam exploiting the Boston Marathon bombings. These new exploit kits build on the leaked source code of Blackhole while adding new features and capabilities, like the aforementioned bot-like behavior.
Hosting the exploit kits are infected Web servers. This past year saw a rise in the use of malicious modules for the Apache Web server, such as Darkleech. This nasty bugger uses all kinds of tricks to avoid detection and analysis, such as responding with malicious behavior only once per IP address or triggering randomly on one in every 10 page requests.
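The evasive logic is simple but effective, and a short sketch shows why: repeat visits from a researcher's machine come back clean. This is written in Python for readability and is not Darkleech's actual code, which runs as a compiled Apache module.

```python
# Sketch of the trigger behavior described above: serve the malicious
# payload at most once per client IP, and only on a random 1-in-10
# subset of requests. Everyone else gets the unmodified page.
import random

served_ips = set()  # IPs that have already received the payload

def should_serve_payload(client_ip: str) -> bool:
    if client_ip in served_ips:
        return False          # repeat visit from the same IP looks clean
    if random.randrange(10) != 0:
        return False          # 9 out of 10 requests look clean anyway
    served_ips.add(client_ip)
    return True
```

An analyst who reloads the page to capture the attack sees nothing; a sandbox that fetches the URL from one IP gets, at best, one shot at observing it.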
It's notable that most of the compromised Web servers out there are running Linux, which should give pause to those who think of the OS as immune to malware. Attackers continue to infect Linux Web servers through vulnerabilities in content management systems (e.g., WordPress and Joomla), plugins for those CMSes, control panels, and development platforms like PHP. Of course, passwords are also a weak link: they can be stolen by malware, guessed from defaults or common user choices, or purchased on the black market following data breaches (since many site owners use the same password in multiple places).
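One small, concrete audit along these lines: many CMS installs advertise their exact version in a "generator" meta tag, handing attackers an easy match against known vulnerabilities. Here is a minimal sketch of a check for that leak; the URL is a placeholder for your own site.

```python
# Check whether a page exposes its CMS version via the generator meta
# tag (e.g. <meta name="generator" content="WordPress X.Y.Z">).
import re
import urllib.request

def find_generator_tag(url: str) -> str:
    html = urllib.request.urlopen(url, timeout=10).read().decode(
        'utf-8', errors='replace')
    match = re.search(r'<meta[^>]*name\s*=\s*["\']generator["\'][^>]*>',
                      html, re.IGNORECASE)
    return match.group(0) if match else ''

tag = find_generator_tag('https://www.example.com/')
if tag:
    print('Version information exposed:', tag)
else:
    print('No generator tag found.')
```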
For organizations, the implications are clear. First, websites and other servers exposed to the Internet must be protected with defense in depth. Web application firewalls, AV software, a robust patching strategy, and even specialized Web protection services may be warranted. Second, a thoughtful strategy for protecting users within the organization from Web-borne malware is critical. This includes robust perimeter protection, but also layered endpoint protection -- Web filtering, Web threat detection, and HIPS -- so users' machines are secure even when they're outside your network perimeter.
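As one example of a server-side layer, given that the Apache-module attacks described above work by slipping a rogue module onto disk, an administrator could baseline the module directory and alert on changes. This is a sketch under stated assumptions: the path is Debian-style and the baseline file is hypothetical; generate the baseline on a known-clean install.

```python
# Compare SHA-256 hashes of Apache modules against a recorded baseline,
# so an injected module (or a tampered one) stands out.
import hashlib
import json
import pathlib

MODULE_DIR = pathlib.Path('/usr/lib/apache2/modules')  # assumed path
BASELINE = pathlib.Path('baseline.json')               # hypothetical file

def hash_modules() -> dict:
    return {p.name: hashlib.sha256(p.read_bytes()).hexdigest()
            for p in sorted(MODULE_DIR.glob('*.so'))}

current = hash_modules()
if BASELINE.exists():
    baseline = json.loads(BASELINE.read_text())
    for name, digest in current.items():
        if name not in baseline:
            print('NEW module (investigate):', name)
        elif baseline[name] != digest:
            print('CHANGED module (investigate):', name)
else:
    BASELINE.write_text(json.dumps(current, indent=2))
    print('Baseline recorded for', len(current), 'modules.')
```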
For more information about the latest threat trends, check out the Sophos Threat Report 2014 at sophos.com/threatreport.