0

I am working on a project that needs to shorten URLs to about 25 characters in length. I can create cryptic IDs which map to the full-length URLs and persist them in a DB. My only worry is: how can I prevent someone from randomly generating such IDs and bombarding my service?

Any suggestions?

bluefalcon
  • 143
  • 3

4 Answers

2

You could require proof of work: every submission must be accompanied by a proof-of-work string that, when appended to the original URL and hashed, yields a digest with a specified number of leading zero bits. The number of zeros required could be tuned to your system load.
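A minimal sketch of that scheme in Python, using SHA-256 and a nonce appended to the URL (the difficulty value and function names here are illustrative, not a fixed protocol):

```python
import hashlib
import itertools

DIFFICULTY_BITS = 20  # raise or lower this based on system load


def leading_zero_bits(digest: bytes) -> int:
    """Count the leading zero bits of a hash digest."""
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
            continue
        bits += 8 - byte.bit_length()  # zero bits at the top of this byte
        break
    return bits


def find_proof(url: str, difficulty: int = DIFFICULTY_BITS) -> str:
    """Client side: brute-force a nonce whose hash meets the difficulty."""
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{url}{nonce}".encode()).digest()
        if leading_zero_bits(digest) >= difficulty:
            return str(nonce)


def verify_proof(url: str, nonce: str, difficulty: int = DIFFICULTY_BITS) -> bool:
    """Server side: a single hash verifies the work, whatever the difficulty."""
    digest = hashlib.sha256(f"{url}{nonce}".encode()).digest()
    return leading_zero_bits(digest) >= difficulty
```

The asymmetry is the point: the submitter does roughly 2^difficulty hash attempts on average, while your server spends one hash per submission to verify.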

This reduces the ability of a (D)DoS attack to pollute the database with spammy URLs. It doesn't protect against a volumetric DDoS attack, which will still bring your service down. To protect against the volumetric aspect, you'll need to replicate your servers: front-end servers in many parts of the world that check that a URL submission is valid (i.e. the proof of work checks out) before forwarding the request to the central database.

Additionally, you might also want to check that the expanded URL exists and is accessible before accepting a submission, although there are reasons why a URL that is not resolvable from your vantage point could still be useful to other people (e.g. links to internal services, or non-HTTP URLs).
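Such a reachability check could look like the following best-effort sketch using only the standard library (the timeout and status-range choices are assumptions you would tune):

```python
import urllib.error
import urllib.request


def url_resolves(url: str, timeout: float = 5.0) -> bool:
    """Best-effort HEAD request to see whether the target URL is reachable.

    A False result does not prove the URL is bad -- internal links and
    non-HTTP schemes will fail here -- so treat it as advisory only.
    """
    try:
        request = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return 200 <= response.status < 400
    except (urllib.error.URLError, ValueError):
        return False
```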

schroeder
  • 123,438
  • 55
  • 284
  • 319
Lie Ryan
  • 31,089
  • 6
  • 68
  • 93
1

If you don't require a login, then the IP address is about the only thing you can rate-limit requests by. If you do require or allow logins, then you can also restrict by account. It is common, for example, to restrict the number of calls to a service in any given period (hour, day), especially on a free tier; you might then offer a higher rate limit for paid accounts.
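A per-key sliding-window limiter along those lines can be sketched in a few lines of Python (the class name and limits are illustrative; in production you would likely back this with something shared like Redis rather than in-process memory):

```python
import time
from collections import defaultdict, deque
from typing import Optional


class SlidingWindowLimiter:
    """Allow at most `limit` requests per `window` seconds for each key
    (an IP address here, or an account ID if you have logins)."""

    def __init__(self, limit: int, window: float):
        self.limit = limit
        self.window = window
        self._hits = defaultdict(deque)  # key -> timestamps of recent requests

    def allow(self, key: str, now: Optional[float] = None) -> bool:
        now = time.monotonic() if now is None else now
        hits = self._hits[key]
        # Evict timestamps that have fallen out of the window.
        while hits and now - hits[0] > self.window:
            hits.popleft()
        if len(hits) >= self.limit:
            return False
        hits.append(now)
        return True
```

Each incoming request calls `allow(client_ip)` and gets rejected (e.g. with HTTP 429) once the window is full.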

You will probably also want to have a whole-service monitor running so that you can spot unusual peaks in requests across multiple IP addresses. This is typical of the more sneaky attacks that use botnets so that each individual source IP is operating below the limit.

Julian Knight
  • 7,092
  • 17
  • 23
  • I can't leverage a login flow. I have thought about IP-based limits, but was wondering if there is something else that can be employed. Thanks! – bluefalcon Dec 11 '16 at 18:16
  • Well, you could limit based on any reasonably unique thing you get from the requestor, but the IP address is by far the easiest and most available. Just don't forget about botnets gaming the system. – Julian Knight Dec 11 '16 at 18:25
1

There really is nothing special about URL shortening services in this situation. You need to look at traditional anti-DDoS strategies.

This answer contains a fairly informative summary of how major sites prevent DDoS attacks. IP-based access control will only prevent a DoS, and it cannot help if the access-control component is itself the thing being DDoS'ed.

Summary:

- Buy big bandwidth capacity
- Buy a reputable WAF and/or anti-DDoS solution

NA AE
  • 188
  • 3
0

If a user or bot is trying to create too many short URLs, whether for junk links or for one particular link, you can stop them by using a CAPTCHA. You can also put IP-based restrictions on the number of URLs they can shorten.

jammy47
  • 43
  • 1
  • 6