
So the basic answer to my title is "use captchas". I agree, they are great, and I am using Google's newest reCAPTCHA. I wanted to state that before I explain anything else so that we are on the same page.

I have been thinking about user experience, and if I were to secure every form post on my website I'd have to stick a captcha on every page. That is not very user friendly; it's annoying, and people would hate doing it. However, I've noticed functionality on a few websites such as Steam where it will let you try to log in a few times, and if you get it wrong, let's say 3 times, it will start showing you a captcha.

I like this approach: you get 3 attempts, and if you get them wrong you will need to start proving you are human. Furthermore, after let's say 10 attempts, I wish to completely block you from even visiting my website. Unfortunately I do not know how to go about achieving this.

My question is this:

How do I keep track of your attempts to, let's say, log in, then throw you a captcha after 3 attempts and completely cut you off after 10 attempts? I have thought about cookies, but those can be cleared. Would I have to do something with external IPs? Would this be too much overhead? Writing to the database and incrementing your counter every time you submit any form on the site (errors or not) seems like too much. Is there a better way? How would you do this?

I am developing in ASP.NET MVC 5 with EF6 (C#).
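
To make the idea more concrete, here is a rough, untested sketch of what I am imagining: count failed attempts per client IP in MemoryCache, show the captcha once the counter reaches 3, and refuse the request entirely at 10. The `ThrottleByIp` / `FailedAttemptTracker` names, the thresholds, and the 15-minute sliding window are just placeholders, not something I have settled on:

```csharp
using System;
using System.Net;
using System.Runtime.Caching;
using System.Web.Mvc;

// Keeps a per-IP failure counter in the in-process cache.
// Note: MemoryCache is per-server, so a web farm would need a shared store instead.
public static class FailedAttemptTracker
{
    private static readonly MemoryCache Cache = MemoryCache.Default;
    private static readonly TimeSpan Window = TimeSpan.FromMinutes(15); // counter resets after a quiet period

    public static int GetAttempts(string ip)
    {
        return Cache.Get("attempts:" + ip) as int? ?? 0;
    }

    public static void RecordFailure(string ip)
    {
        string key = "attempts:" + ip;
        int current = Cache.Get(key) as int? ?? 0;
        // Not atomic under heavy concurrency, but good enough to show the idea.
        Cache.Set(key, current + 1, new CacheItemPolicy { SlidingExpiration = Window });
    }
}

// Action filter: blocks the client outright after 10 failures,
// and tells the view to render the reCAPTCHA widget after 3.
public class ThrottleByIpAttribute : ActionFilterAttribute
{
    private const int CaptchaThreshold = 3;
    private const int BlockThreshold = 10;

    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        string ip = filterContext.HttpContext.Request.UserHostAddress ?? "unknown";
        int attempts = FailedAttemptTracker.GetAttempts(ip);

        if (attempts >= BlockThreshold)
        {
            filterContext.Result = new HttpStatusCodeResult(HttpStatusCode.Forbidden);
            return;
        }

        // The login view checks this flag and renders the captcha when it is true.
        filterContext.Controller.ViewBag.RequireCaptcha = attempts >= CaptchaThreshold;
        base.OnActionExecuting(filterContext);
    }
}
```

The login POST action would be decorated with `[ThrottleByIp]` and would call `FailedAttemptTracker.RecordFailure(Request.UserHostAddress)` whenever the credentials (or the captcha) are wrong. My worry is whether keying on the IP like this is reliable enough, and whether the counter should live in the database instead.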

EDIT:

I saw a vote for "too broad". I'm not really sure how I can be clearer, but here is my attempt:

What is the best way to track a user's attempts at submitting forms when they are not logged in?

EDIT 2: The question that was linked below as a possible duplicate only addresses showing a captcha after a certain username hits a threshold; I don't like this approach. I want the limit to target the single attacker and block them off, not the legitimate user who will try to log in afterwards. That question also doesn't address how to block that user from loading the site at all after too many attempts.

Bagzli
  • *offtopic* another method to prevent automated posting is keeping track of the time between the page load and the form post. Anything under, let's say, 5 sec could then be regarded as spam. – DarkBee Jun 05 '16 at 19:47
  • @DarkBee In those cases they can just add a delay to the bot scripts; it seems easy to get around. This could be a feature I add, but not a final solution. How would you go about tracking that? – Bagzli Jun 05 '16 at 19:49
  • it is not clear what your primary concern is. Preventing bots from spamming your site? Or tracking users' attempts, as you say (whatever "tracking" means here)? I don't think the logging-in example fits here, as there is a clear indication of a valid vs an invalid request. On the other hand, what would "invalid" be in your general case? – Wiktor Zychla Jun 05 '16 at 19:50
  • @Bagzli just add a hidden field with the current timestamp as its value and calculate the difference on the post. This adds a second layer to prevent spam: you ensure that the poster actually loaded the form before posting. Most bots just post directly to the link without ever loading the form (see the sketch below the comments). – DarkBee Jun 05 '16 at 19:53
  • @WiktorZychla My primary concern is to stop brute force/bot spamming. For example, with the logging-in example, entering a wrong username/password combo would be an invalid attempt. On the other hand, if you take a look at a scenario such as search, I want to stop the user/bot from spamming the search feature and abusing it, causing a DoS. – Bagzli Jun 05 '16 at 20:32
  • @Bagzli, I also voted too broad (you are asking for a complete solution from scratch). You should try to [edit] your question, avoiding "What's best", which is primarily opinion based; if you like a specific approach, write down how it should work and, even better, show some code of your ideas (so people understand your level of programming). – Petter Friberg Jun 05 '16 at 21:43
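
For reference, a minimal sketch of the hidden-timestamp idea DarkBee describes in the comments could look like the following. The `SearchController` name, the `renderedAtUtc` field, and the 5-second minimum are made up for illustration, and a real version should sign or encrypt the timestamp so a bot cannot simply forge it:

```csharp
using System;
using System.Net;
using System.Web.Mvc;

public class SearchController : Controller
{
    private const int MinSeconds = 5; // assumption: anything submitted faster is treated as a bot

    // GET: stamp the form with the time it was rendered.
    // In the view: <input type="hidden" name="renderedAtUtc" value="@ViewBag.RenderedAtUtc" />
    public ActionResult Index()
    {
        ViewBag.RenderedAtUtc = DateTime.UtcNow.Ticks;
        return View();
    }

    // POST: reject submissions that arrive too quickly after the page load.
    [HttpPost]
    public ActionResult Index(string query, long renderedAtUtc)
    {
        var elapsed = TimeSpan.FromTicks(DateTime.UtcNow.Ticks - renderedAtUtc);
        if (elapsed.TotalSeconds < MinSeconds)
        {
            return new HttpStatusCodeResult(HttpStatusCode.BadRequest, "Form submitted too quickly.");
        }

        // ... run the actual search here ...
        return View();
    }
}
```

As noted in the comments, this only raises the bar (a bot can load the form first or wait before posting), so it would be a complement to the attempt counter rather than a final solution.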

0 Answers