On its surface, it’s a good idea. Website hosts don’t want to be inundated with fake traffic and fake data inserted by bots.
But it’s entirely unnecessary when your local library or doctor’s office uses a captcha just to let you fill out a form. It seems excessive because it’s highly unlikely any botnet would be targeting the sign-up fields on their sites; the attackers wouldn’t get anything out of it.
Eh, not that unlikely.
The bots just scan everything they can.
For example, I’ve seen someone run a test with an SSH server on the default port (22). On average, there was a constant 10 Mbps of traffic just from login attempts.
I can imagine it’s similar for websites.
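If you want to see this kind of background scanning for yourself, one rough approach (assuming a Linux host where sshd writes to something like /var/log/auth.log; the log lines below are illustrative samples, not real traffic) is to count failed login attempts per source IP:

```python
import re

# Illustrative sample in the format sshd typically writes to auth.log
SAMPLE_LOG = """\
Jan 10 03:12:45 host sshd[1234]: Failed password for root from 203.0.113.7 port 51234 ssh2
Jan 10 03:12:47 host sshd[1235]: Failed password for invalid user admin from 198.51.100.9 port 40210 ssh2
Jan 10 03:12:49 host sshd[1236]: Accepted publickey for alice from 192.0.2.1 port 22000 ssh2
Jan 10 03:12:51 host sshd[1237]: Failed password for root from 203.0.113.7 port 51240 ssh2
"""

# Match sshd "Failed password" lines and capture the source IP
FAILED = re.compile(r"Failed password for .* from (\S+) port")

def count_failures(log_text):
    """Return a dict mapping source IP -> number of failed login attempts."""
    counts = {}
    for match in FAILED.finditer(log_text):
        ip = match.group(1)
        counts[ip] = counts.get(ip, 0) + 1
    return counts

print(count_failures(SAMPLE_LOG))
```

On a real internet-facing box you’d feed it the actual auth log; even on an otherwise idle server, the counts from unknown IPs tend to climb within minutes of exposing port 22.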