
Last week USENIX held its 20th Security Symposium in San Francisco, and I attended a number of interesting and inspiring presentations.

On Monday, during the WOOT '11 workshop, Chris Kanich from UCSD gave a talk that was closely related to our own BrowserCheck work here at Qualys, but used some very creative means to gain access to test subjects. He and his fellow researchers, Stephen Checkoway and Keaton Mowery, used Amazon’s Mechanical Turk crowdsourcing service to advertise a task and then fingerprint the security of the browsers used by the interested workers.

Amazon’s Mechanical Turk is a “crowdsourcing” marketplace for tasks that are best solved with, or even require, human intelligence. Examples include identifying and labeling an image, translating a foreign text, or categorizing a website. These tasks are called HITs (Human Intelligence Tasks) and are coded by the HIT requestor as webpages. Each HIT is labeled with an expected duration (often less than a minute) and the offered pay (often in the cents range). The workers (“turkers”) use normal web browsers to navigate the site and select HITs that they feel competent to complete. At the end of a pay cycle, Amazon’s payment system charges requestors and pays turkers.

The UCSD team put up a very simple HIT that asked the worker to type in the name of their antivirus (AV) program, and offered to pay 1 cent for the answer. When the turker accepted the HIT, the webpage prompted for the name of the AV in use and also ran JavaScript code to identify the browser and its installed plugins.

Once a turker completed the first HIT, they were offered another task, slightly more complex (download and run a script) and better paid (between 5 and 15 cents). The script had roughly the same purpose: record the security status of the workstation in use.

The results mirror our BrowserCheck data very closely: over 80% of all participating turkers had at least one vulnerable plugin that could be used to take over the machine.

The data from the more complex follow-up HIT, in which the turker ran a script that provided more detail on the machine configuration, confirmed the vulnerability data gathered by JavaScript and added insight into the AV configurations in use: over 90 percent of all turkers had AV installed, but many of them were running outdated AV definitions. The US was particularly disappointing: over 75 percent had outdated AV definition files on their machines, a fact the researchers attribute to the “teaser” AV packages commonly pre-installed on newly purchased PCs, which stop updating after six months unless the user buys a full subscription.


“Up to date” percentages in such a low range make me question whether we (the internet's users as a whole) would not be better off if PC manufacturers refrained from including commercial AV packages in their standard consumer builds. Future versions of our BrowserCheck initiative will add an “AV updated” check, and we will see whether we can confirm this tendency both in the end-user version (https://browsercheck.qualys.com) and for users of the Business Edition (www.qualys.com/browser).


BTW, the real purpose of the research was to determine whether Amazon’s Mechanical Turk can provide an efficient way to install malware on machines, i.e., whether a botnet could be constructed that way. The answer: it depends. Read the full paper, “Putting Out a HIT: Crowdsourcing Malware Installs,” for a detailed answer and more insight into this fascinating experiment.
