If you are using a reasonably modern bash, you could use shell expansion like so:
wget http://www.example.com/file/image{1..99}.png
The above will try to get image1.png through image99.png. Obviously, choose your range as appropriate. Files that don't exist will produce an error, but cause no actual problems. If you want, you can suppress the errors with wget's -q (quiet) option, or by redirecting STDERR to /dev/null:
wget http://www.example.com/file/image{1..99}.png 2> /dev/null
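One thing worth knowing: the brace expansion happens in the shell before wget ever runs, so wget just sees a long list of URL arguments. You can preview exactly what wget will receive by substituting echo (a small range is used here to keep the output short):

```shell
# The shell expands the braces first, so echo prints every URL
# that would be passed to wget as separate arguments:
echo http://www.example.com/file/image{1..3}.png
# → http://www.example.com/file/image1.png http://www.example.com/file/image2.png http://www.example.com/file/image3.png
```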
For a truly huge number of files, you may come up against the limits of the command line: the kernel imposes a maximum combined size on the arguments passed to a single command (ARG_MAX). You are highly unlikely to run into this problem, but in the event that you're dealing with a really huge number of files, a for loop is safer, since each wget invocation then receives only a single argument:
for f in {1..1000}; do wget http://www.example.com/file/image$f.png; done
## and the version sending STDERR to /dev/null:
for f in {1..1000}; do wget http://www.example.com/file/image$f.png 2> /dev/null; done
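If you're curious where the argument-size limit actually sits on your system, getconf reports it:

```shell
# Print the maximum combined size (in bytes) of the argument list
# plus environment that a single exec() call will accept:
getconf ARG_MAX
```

On modern Linux this is typically in the megabytes, which is why you would need an enormous range before the single-command version fails.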