The Anti-Detect Browser Royal Rumble - updated with notes
Some comments to the latest Anti-Detect Browser Royal Rumble.
Some weeks ago we published the first edition of the Anti-Detect Browser Royal Rumble, and it was a great success in terms of readership.
In case you’ve missed it, here’s the full article.
Since every anti-detect browser has its own options and features, trying to use the same configuration on each of them can be a daunting challenge and can lead to penalties in the scores of the tests used for the benchmark.
In order to improve the understanding of each solution, I’ve decided to open the article to the companies involved so that they can share their ideas and comments on the tests.
Again, special thanks go to all the companies involved in this project, since they provided me with a free demo for running all the tests needed: this article could not have been written without them.
In this edition, we’ll compare the performances of the following anti-detect browsers:
Some of you may have noticed that there’s one company added to this list compared to the previous article, and you’re right.
In this edition I also wanted to add MultiLogin to the basket, since due to some issues on my side I was not able to test it in time for the publication of the initial article. So let’s recap the testing methodology and see how MultiLogin performs compared to the other browsers.
Testing methodology
In this edition, we’re going to test the performances of these anti-detect browsers against two famous fingerprinting test pages: Creepjs and BrowserScan.
For every browser we’ll follow these steps:
Download their client on a Windows Server virtual machine hosted in an AWS data center.
Create a profile that mimics a macOS desktop using a Chrome browser.
Add residential proxies located in the same region as the data center (eu-central-1).
Disable WebRTC features to avoid leaking the original IP.
Add noise to the Canvas and WebGL renderers and to any other parameter possible.
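The WebRTC step above boils down to making sure that none of the ICE candidates gathered by the browser exposes the machine’s real public IP instead of the proxy’s. As a minimal sketch of that check (the candidate strings below are hypothetical examples, not output from any of the tested browsers), one can extract the IPs from the candidate lines and compare them with the proxy’s public IP:

```python
import re

def leaked_ips(ice_candidates, proxy_ip):
    """Return any IPs found in ICE candidate lines that differ from the proxy IP."""
    ip_pattern = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")
    found = {ip for line in ice_candidates for ip in ip_pattern.findall(line)}
    return sorted(found - {proxy_ip})

# Hypothetical server-reflexive (srflx) candidates as shown by a WebRTC test page
candidates = [
    "candidate:1 1 udp 1686052607 203.0.113.7 54321 typ srflx",
    "candidate:2 1 udp 1686052607 198.51.100.9 54322 typ srflx",
]
print(leaked_ips(candidates, "203.0.113.7"))  # → ['198.51.100.9']
```

If the list is non-empty, the browser is leaking an address other than the proxy’s, which is exactly what costs points in the scoring below.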
After setting up the profiles, we’ll visit the two test pages and assign a score from 0 to 260, split as follows:
From 0 to 100 points according to the trust score of the CreepJS test.
From 0 to 100 points according to the BrowserScan fingerprint authenticity.
20 points if the browser is correctly recognized on both websites, 10 if only on one.
20 points if the operating system is correctly recognized on both websites, 10 if only on one.
20 points if there’s no sign of a WebRTC leak on either website.
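The rubric above can be sketched as a small scoring function (the function name and argument names are my own, chosen just for illustration):

```python
def rumble_score(creepjs_trust, browserscan_auth, browser_hits, os_hits, no_webrtc_leak):
    """Combine the benchmark components into the 0-260 final score.

    browser_hits / os_hits: on how many of the two test sites (0, 1 or 2)
    the browser / operating system was correctly recognized.
    """
    recognition_points = {2: 20, 1: 10, 0: 0}
    score = creepjs_trust + browserscan_auth          # 0-100 each
    score += recognition_points[browser_hits]          # browser recognition
    score += recognition_points[os_hits]               # OS recognition
    score += 20 if no_webrtc_leak else 0               # no leak on either site
    return score

# MultiLogin in this edition: CreepJS 58, BrowserScan 95, browser recognized
# on both sites, OS only on BrowserScan, no WebRTC leak.
print(rumble_score(58, 95, 2, 1, True))  # → 203
```

Plugging in the MultiLogin numbers from this edition reproduces the 203 final score discussed below.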
All these tests were run by myself without any help from the companies, which were not aware of the final scores until the publication of this article.
The purpose of this article is to apply the same methodology to every browser, not to maximize the score of each test, something that could probably be achieved by playing around with some other options.
Anyway, I’m open to reviewing the test results if needed, so please write to me at pier@thewebscraping.club if you think there are errors in the evaluations.
MultiLogin
The MultiLogin browser on BrowserScan has a 95% fingerprint authenticity.
OS and Browser are correctly detected while there are some red flags on the Canvas fingerprint.
The same red flags are also signaled in the CreepJS test, where MultiLogin gets a score of 58, while the baseline is 66.
The test detects that we’re using a Windows machine:
so the final score is 203, a good result when compared to the baseline of 226.
The updated ranking, with the inclusion of MultiLogin, is as follows:
Let’s now see some comments from the companies involved in the test, following the same order as the ranking.
GoLogin
Here’s the comment from Anton from GoLogin:
To be honest, I don't have anything to add or change. We have recently updated the Chrome core to 123 and 124 will be out in a week or so. It doesn't matter a lot, but still.
Kameleo
Talking with Tamas, the CEO of Kameleo, I’ve understood I made an error in the test. I got a little bit confused with Kameleo’s spoofing settings. When setting WebRTC to OFF, as I did in every anti-detect browser, instead of disabling WebRTC features in the browser, I turned off any spoofing. This is why there was a public IP leak. From the next edition, I’ll be more careful about it!
Here are instead some comments from Tamas:
“My general comment for future customers is: use our team’s recommended default settings to reach the best success rate against anti-bot systems. Adding noise to all the fingerprint parameters is not the ideal setup all the time, as having unique values won’t let you blend into the jungle of the web. The whole idea of an anti-detect browser is to appear as a real-life person. To reach it, please follow the above recommendations. However, if you are an advanced web scraper, it is always worth testing multiple settings and benchmarking them on your target site. We have a customer who uses 3 different setups, ranging from mobile profiles to Junglefox, and he always chooses the right setup for the target site he is scraping.”
Octo Browser
Here’s the comment from Artem Sapryco of Octo Browser.
The first part of the Antidetect Royal Rumble was a big hit among the Octo Browser team. Internally, we were already thinking about how to measure the effectiveness of the antidetect browser and the quality of spoofing. We found this task really challenging due to the variety of tasks that antidetects perform.
The WSC approach in the first part of the Rumble is somewhat controversial from our perspective. Firstly, some of the participants aren't actually adding any audio noise to the fingerprint, which improves their CreepJS results. This can be easily verified by checking the audio noise hash. Adding the noise to the fingerprint is an action that can be easily tracked by checkers and only makes sense in certain tasks. That's why Octo has the noise turned off by default. It would be better to either test browsers on sites with protection systems that can only be bypassed with added audio noise or to disable audio noise for all participants in CreepJS tests.
Another controversial topic is that during the test, there were varying numbers of visits to the CreepJS site from different competitors. Our profile visited the site five times during the test, while there were 2-3 visits from other browsers, as seen in the screenshots. The CreepJS score decreases when the site is visited with an inconsistent fingerprint, so it would be better for all participants to conduct the CreepJS test once with a single profile.
In any case, we are eagerly anticipating the next challenges. We're ready to engage and contribute, and we hope to encounter real-life challenges in the future, such as bypassing protections on highly secure sites like hyatt.com or vavada.com. We'd like to thank WSC and Pier for this motivating challenge!
Incogniton
Niels from Incogniton suggested a different setup to raise the test scores, like forcing the user agent to be consistent with the browser used.
Here are his comments on the test.
As already mentioned by Octo Browser, GoLogin doesn’t actually add noise to the fingerprint. Their fingerprint in a browser profile is actually configured based on the input of your own device. So, for example, if you sign in from multiple devices, the reliability of your browser profiles is actually even worse. It doesn’t feel entirely fair that our competitors are not all equally honest about how their browser profiles are configured, while we are transparent about this and are held accountable for it.
I’m happy that this test created a little controversy, since the different products have different strengths and weaknesses and it’s difficult to compare them on common ground. A score in these tests is just a score, and it doesn’t say everything about the quality of the products. In the next episode, we’ll use anti-detect browsers to bypass one famous anti-bot protection, which is what matters the most to the web scraping community.
Thanks again to all the companies, and we’ll see you in the July edition.
Love the honest and in-depth approach you took here. I think for readers it might be useful to give a ballpark cost per request for each option, including any subscription costs, proxy costs, and how long you spent setting each up, so they can get a fuller picture.
I love the comparisons! You could also add to the score how old the Chrome version is: -1 is ok, but -5 is bad. And of course some tests with bot detection sites like you had in the unblocker tests.