While this new version potentially helps things, it feels like the users with more automated interaction methods or those who want to reduce fingerprinting efforts will still fall foul of over-zealous site owners.
At the end of the day, a CAPTCHA is just a method to externalise business costs onto your users.
[1] https://www.gov.uk/service-manual/technology/using-captchas
Now what about literally every other entity in the world that has a website? The vast majority of them aren't businesses, nor do they serve life-critical documentation to all of a country's disabled users.
> Please upgrade to a supported browser to get a reCAPTCHA challenge.
There are many reasons this is bad, but for now I'll point out that creating barriers that prevent new competitors from entering an established market is the behavior of a monopolist abusing their power over a market.
I wouldn't be surprised if they get another fine from the EU over this.
I didn't test Opera or Brave, but neither should have any issues either, as they're just Chrome forks at this point.
In my experience, using Firefox and not being logged into a Google account results in a very long, if not endless, chain of captcha challenges.
- Using Firefox with uBlock Origin
- Not logged in to Google (at least outside of Firefox's containers)
- Third-party cookies disabled + Cookie AutoDelete addon
They present me with 3-5 captchas every time
Lately I've experienced the same thing, often 3-4 rounds of captcha challenges, sometimes more. It's painful from a UX standpoint and insulting as a user. I've been trying to avoid any site that uses them, since they're very user-hostile, but there's only so much you can avoid.
If the site is important to you, you should contact the owners with an alternative (such as an open source captcha, or spam filtering).
The only way to topple the monopoly of Google is to erode it site by site.
"Click the traffic lights"
Do I also click the poles they're attached to?
"Click the store fronts"
If it doesn't have writing on it but it looks like it might be a store do I click it?
"Click all cars"
Do I include trucks and buses too, or just cars?
Besides, if a relatively simple puzzle is too much to ask, then maybe you didn't care all that much to begin with. I can think of various platforms where this would be a good filter even if there was no such thing as abuse. ;)
I'm curious where the line on "relatively simple" is.
Other people have already noted that the tasks can be confusing if you're not an American English speaker, and that the backup tasks for people with vision difficulties are nigh impossible.
But beyond that: some of the time I have to do 1-3 Captcha pages, which I would class as relatively simple. Other times, I've gotten up to 10+ pages with what I believe was perfect accuracy. (They were fairly simple tasks like "click the stoplights", not the sometimes-ambiguous ones like "click the storefronts".) That's usually when I'm traveling, so it's correlated with slow internet. I don't know what the upper limit is, because there are very few pages I care about enough to push through 10+ rounds of Captcha, but I'd argue that "spending 3 minutes studying traffic photos with no end in sight" is way past my definition of 'simple'.
Every time I’m asked to identify the motorbikes or traffic lights, I feel like Google should be paying me a few cents for helping train their machine learning algorithms.
And on mobile the experience is even worse. Depending on the placement of the captcha box, a third to half of the tiles might be off the edge of the screen, making it impossible to solve. Seriously, how can Google not have a mobile version in 2018?
Over and over again, too. It's completely overdone. I have had screen after screen of images to click on before the Recaptcha is finally happy that I'm human / I've provided enough training data for whatever object they're currently trying to get their cars not to run into.
I wouldn't really object to "of these three pictures, which is a motorbike", but when I'm on my 4th or 5th screen of 9 images each I'm getting pretty annoyed... And they're so slow to fade in, too!
That's even before other panopticon questions of who all this added telemetry even benefits.
When I mentioned panopticon benefits, I was getting at the harder "cui bono?" question: does this data further entrench Google's behavioral-analysis operation, which sells our every behavior to advertisers bidding for our attention? It's not the websites using reCAPTCHA that benefit from all that extra advertising information stored on Google's servers, and it's not necessarily the individuals like you or me using those websites who benefit either.
Especially given that, with v2, Google has quite clearly been using reCAPTCHA as its own personal Mechanical Turk to entrench its positions in map data and possibly automated-driving image recognition, this is not an idle question.
I do not want Google to have any more fucking data about me than it already does! "Put this blob of JavaScript on every page of your site so that we can see how users are clicking, scrolling, and browsing around. Think of the children^W spam and abuse!"
I just cannot believe that Google somehow gets away with spinning this as some sort of "guardian of the Internet" thing when it is a transparent attempt to a) make adblocking more difficult and b) force people to accept being tracked by Google or get blacklisted from the web.
Getting banned from sites or treated as a subhuman because you don't want Big Brother to follow your actions around should not be something that we're okay with. It just shouldn't be.
They already do, with Google Analytics.
Well - we're not.
This effort by Google, at its worst possible implementation, could break a huge number of "required" websites for users. As others have pointed out in this thread, the users who will be impeded are likely already at a disadvantage[1]. This just reinforces that, especially if everyone starts adopting this given how it is "free" for businesses/organizations.
A convenient side effect? Google gets more information about us and encourages us to view more ads.
[1] Someone mentioned smaller villages in India. The GOV.UK reference talks about users with disabilities. One could also imagine shared locations like a public library, whose users may not have internet access anywhere else.
The website may deserve to get recompense for their content, but I deserve to know what I'm paying just as much, if not more.
If you're depriving me of something in our transaction that I'm not aware of, then at least part of what happened was theft. It doesn't matter that I didn't know about it until days, weeks, years, or even decades later. If you've deprived me of something we didn't agree upon, you've stolen from me. Even if, in this case, it's privacy. Especially because it's privacy, which you can never get back.
A server can put various measures in place to limit general access, but then by those measures the IP is no longer public. T&Cs that a site may wish to enforce can only be made available after the connection has already been established.
Also, robots.txt is a standard that we agree to respect, but it is not legally binding.
So I have to disagree.
What is the incentive for the site owner to provide a streamlined experience to a user who will consume resources but intentionally prevent monetization?
Not sure I agree with it, but it's to be expected.
Looking through the documentation for reCAPTCHA v3, it wants to load a script from Google's servers. As an end user, I do not want Google to track me across the web, on other people's property, so I want to block this HTTP call. This discussion is pretty orthogonal to "Ad"block, given that all sorts of places have a CAPTCHA implementation.
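For what it's worth, blocking that call is a one-liner in uBlock Origin's static filter syntax. The exact rules below are my own assumption about how one might scope it (the reCAPTCHA script is typically served from google.com and gstatic.com under /recaptcha/):

```
! Block the reCAPTCHA script when loaded as a third-party resource
||google.com/recaptcha/$script,third-party
||gstatic.com/recaptcha/$script,third-party
```

Of course, as discussed elsewhere in the thread, blocking it often just means you're permanently gated from the page instead.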
The cynical side of me thinks Google is doing this precisely because a convenient side effect is crippling the experience for users of content blockers and custom user agents. Even without a cynical motive, that side effect seems to be conveniently ignored, which hurts users. If a non-ad-based company were to build such a system, I'd imagine they would explicitly work around the problems of blockers/tracking so that the website owner could get a "truer" metric of "suspicious" users, since that would likely lead to more sales/engagement/whatever.
But the big point is Google is not the Chinese state. In fact, one might say, quite accurately, that they're not very friendly.
Frictionless user interfaces are great, but could this be a ploy to get websites to add Google-property tracking JS on more pages?
I mean, you must hate cloud software if you've got a problem with this. Surprise! Almost all software, from Steam to Windows to Fusion 360 to Fastmail to GitHub to Office, is cloud software, just to name some random examples. All have the same problem, most without providing any service in return (hats off to Fastmail and GitHub, though, who, like Google, provide a service in trade for the cloud; and boo to MS, for making Office cloud software for no good reason whatsoever).
German contract law has the notion of "unexpected clauses", especially for terms of services. Certain ToS clauses have been invalidated by courts even if the customer agrees to the contract that includes those ToS, because the clauses have been deemed "too unexpected". The basic idea is that people should not be expected to read the entire ToS before agreeing to them.
To me, this is sort of the same: When I visit github.com, my mental model says that GitHub will know about this and be able to run scripts in my browser. However, it would be "unexpected" in this sense, and therefore IMO deceptive, that visiting github.com causes my browser to run code from google.com.
But on the other side, it needs to be understood just how important having a CAPTCHA is. The amount of destruction to user experience that bots can cause is sometimes far worse than the pain the CAPTCHA causes.
The long chains of reCAPTCHAs annoy me to no end, and I hope a middle ground can be reached, but bots are a very serious problem.
I do wonder whether computational challenges are a feasible alternative in some scenarios, or perhaps an alternative choice you could give to users.
In some scenarios such as spam, the profit per bot action isn't high at all, so this might be feasible. If it's a ticket bot, that wouldn't work at all.
Of course, you're hurting users without good devices, which just sucks.
I haven't really thought too deep into the economics of this, I don't know if it'd work at all. Just a thought.
Edit: Meant to include mobile in there; I don't know whether this scheme would work for mid-range and better Android devices or iPhones.
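To make the economics concrete, the kind of computational challenge I mean is a hashcash-style proof of work. This is a minimal sketch, assuming SHA-256 and a difficulty constant I picked arbitrarily; the point is the asymmetry: solving costs roughly 2^bits hash attempts, while verifying costs exactly one.

```python
import hashlib
import os

DIFFICULTY_BITS = 12  # assumed value; ~4096 hashes on average to solve


def issue_challenge() -> str:
    # Server side: a fresh random challenge the client must "pay" to answer.
    return os.urandom(16).hex()


def leading_zero_bits(digest: bytes) -> int:
    # Count leading zero bits of a hash digest.
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
            continue
        bits += 8 - byte.bit_length()
        break
    return bits


def solve(challenge: str) -> int:
    # Client side: brute-force a nonce until the hash has enough leading zeros.
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
        if leading_zero_bits(digest) >= DIFFICULTY_BITS:
            return nonce
        nonce += 1


def verify(challenge: str, nonce: int) -> bool:
    # Server side: a single hash to check, regardless of difficulty.
    digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
    return leading_zero_bits(digest) >= DIFFICULTY_BITS


challenge = issue_challenge()
nonce = solve(challenge)
assert verify(challenge, nonce)
```

The difficulty knob is what makes the per-action economics tunable: cheap enough to be invisible for one signup, expensive enough to matter at spam volume, though, as noted, it still punishes slow devices.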
Need to give them credit for fighting: hurting malicious websites by not sending them traffic, keeping search results relevant against SEO abuses, cutting down email spam effectiveness, ... and reCaptcha.
All of this becomes very relevant when you run your own online business like I do. You can lament that Google knows you're shopping for a new car, but my users lose real dollars if a scammer gets onto my website, and Google provides the tools to combat this.
And no, you can't implement your own captcha. No matter how smart you think you are, you don't have the data that Google does.
Worst part is, even after solving dozens of images (which, by the way, keep refreshing without end), you still sometimes get a "we don't believe you're human" message and no way forward.
Cloudflare and this recaptcha can really break the internet for some people, esp in small Indian cities.
The captcha is broken because, at least in my case, even after solving dozens of images (which now keep refreshing) it still can't be convinced I'm human.
It works by having a bot sign up for thousands of websites at once with your email. The purpose is to flood your inbox with hundreds of welcome emails every minute so you will miss the real security emails (such as someone resetting your password).
What makes this attack so evil is that these are real sites you have to unsubscribe from individually after the attack is over, including many sites in countries without email-unsubscribe laws. So to this day, I still get hundreds of emails every day from sites that think I signed up for their newsletter/product/etc.
I would not be against enforcing a captcha on every site out there just to prevent these kinds of attacks.
Do you get it? A company whose business model is based on its bots' ability to crawl the web will now have more power over other bots.
Brilliant.
For example, "the new way to stop bots", "alert you of suspicious traffic", "identify the pattern of attackers", "pages are being targeted by bots", "stay ahead of attackers and keep the Internet easy and safe to use (except for bots)"
Many companies have built valuable services by automating HTTP requests. One might even think that Google would like them to stop.
Two things particularly worry me about this: a) encouraging sites to apply captchas to pages that have nothing to do with authentication or form inputs, and b) the hint of requiring two-factor authentication and phone numbers to proceed. [edit] Will Google be offering to handle this on behalf of sites?
Exactly this. Note that reCaptcha v3 is meant to be placed on every page, not just forms, and returns a "bot score" which the site can use for any purpose. I can see any new web search engine being horribly muffled by the (mis)use of reCaptcha v3.
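To sketch what that gating looks like on the site side: v3's verification response includes a score from 0.0 (likely bot) to 1.0 (likely human), and the site decides what to do with it. The thresholds and action names below are my assumptions for illustration, not anything Google prescribes:

```python
import json


def decide(score: float, allow_at: float = 0.7, challenge_at: float = 0.3) -> str:
    # Map a reCAPTCHA v3 score (0.0 = likely bot, 1.0 = likely human)
    # to a site-chosen action. The cutoffs here are arbitrary assumptions.
    if score >= allow_at:
        return "allow"
    if score >= challenge_at:
        return "challenge"  # e.g. fall back to a v2 checkbox or 2FA
    return "block"


# Shape of a verification response (the values here are made up):
response = json.loads('{"success": true, "score": 0.2, "action": "view_page"}')
action = decide(response["score"])
```

Since the score can gate any page, not just forms, an unrecognized crawler that consistently scores low simply never gets content, which is exactly the muffling effect described above.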
And who pays for those costs ? Who eliminates the tons of spam posted ?
I understand it for account creation. I understand it a bit less for login (seems like a lazy way of preventing automated attempts).
But for simply accessing a site? What gives?
I'm increasingly finding that only tech blogs, the odd big site I'm logged in on like Amazon, and sites like HN are usable lately, because anything else seems to require a minute-plus gateway of CAPTCHA + GDPR + whatever else before I can actually get to the site.
Is it some way of filtering out users the sites don't want without expressly having a "403 Forbidden" or whatever?
If I block this with uMatrix, I'm likely permanently gated from whatever I'm trying to access. If GitLab updates, I won't be able to access my repos, because they (for some reason) make me solve a reCAPTCHA before doing anything on my own repos.
Why do we want this version of the web again?
Your anger would be better directed at the bad actors who ruin things for you, like how you need to buy a lock for your door. How much time in your life has been wasted by slotting a key into a hole? Ugh, doesn't anyone know this isn't the world I asked for?
I've noticed that if I encounter a recaptcha on a website, I just tend to abandon it and seek the information elsewhere. Last time, I was presented with a recaptcha when setting a search filter on a website. No, thanks. It's too much of a pain to unblock everything and answer a recaptcha. I'll pass.
When I do answer a recaptcha in despair, it's a pain to do.
Not affiliated with the site or any of the alternatives.
What do HN users use them for?
BUY! BUY! BUY! Penis enlargement pills... sorry to put it bluntly, but it's either that or reCaptcha.