Kudos for trying to present honest reviews - a somewhat unusual take when most business models involving user-submitted reviews tend not to favour such an approach.
Regarding what you are doing currently...
You've only told us part of the process here. One common feature of many online sites is that they insist on establishing trust up front before allowing a user to do anything. I find this particularly annoying. I don't want to provide my postal address, my age and shoe size before I'm allowed to share my experience of a product, good or bad. Indeed I'm only likely to jump through such hoops if I really want to post a vitriolic review. Hence I would suggest that any part of the process directly involving the user take place after the review is logged (but before it is displayed). Sending an email with a confirmation link is an easy, minimally obtrusive way to do that.
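The confirm-by-email step can be very light. Here's a minimal sketch using only Python's standard library; the `example.com` URL, the `review_id`, and the endpoint path are placeholders, not anything from your actual site:

```python
import hashlib
import hmac
import secrets

# In practice this would be a persistent server-side secret, not regenerated per run.
SECRET_KEY = secrets.token_bytes(32)

def confirmation_token(review_id: str, email: str) -> str:
    """Derive an unguessable token tying a pending review to an email address."""
    msg = f"{review_id}:{email}".encode()
    return hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()

def confirmation_link(review_id: str, email: str) -> str:
    # The review stays logged-but-hidden until this link is clicked.
    token = confirmation_token(review_id, email)
    return f"https://example.com/reviews/confirm?id={review_id}&token={token}"

def verify(review_id: str, email: str, token: str) -> bool:
    """Constant-time check when the user clicks the link; only then display the review."""
    return hmac.compare_digest(confirmation_token(review_id, email), token)
```

Because the token is derived from a server secret, there is nothing to store per review beyond the pending review itself, and the user supplies nothing up front except an email address.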
Using browser fingerprinting both on submission of the review and on validation will give a good indication of whether the same browser submitted the review and confirmed it - but bear in mind that people can have multiple devices.
A user IP + browser fingerprint combination already exists.
IP addresses for clients are rarely static. Even using a subnet is not all that effective. The ASN (or its associated ORG record) will give you less granular, but much more consistent and accurate, results. Using the ASN also simplifies identifying the locality of the client address; you may wish to restrict reviews to the countries where the product is available. There are organizations (in my experience there is an unusual density of them in the Philippines and Eastern Europe) that will carry out black-ops marketing.
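To make the subnet-vs-ASN point concrete: a real deployment would consult an IP-to-ASN database (MaxMind's GeoLite2-ASN is one commonly used source), but the idea is just a longest-prefix lookup. This sketch uses a tiny in-memory table; the prefixes are the RFC 5737 documentation ranges and the AS numbers are from the private range, so nothing here is a real assignment:

```python
import ipaddress

# Stand-in for a real IP-to-ASN database; entries are illustrative only.
ASN_TABLE = [
    (ipaddress.ip_network("203.0.113.0/24"), 64500, "EXAMPLE-ISP-A"),
    (ipaddress.ip_network("198.51.100.0/24"), 64501, "EXAMPLE-ISP-B"),
]

def asn_for(ip: str):
    """Map a client IP to its (ASN, org name) pair, or None if unknown."""
    addr = ipaddress.ip_address(ip)
    for net, asn, org in ASN_TABLE:
        if addr in net:
            return asn, org
    return None
```

Two reviews arriving from different dynamic IPs on the same provider collapse to one ASN/ORG pair, which is the stable signal to compare on - and the same lookup gives you the registration country for geographic filtering.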
Any advice about fingerprint collisions?
Try to avoid them?
For some reason, companies offering these services are somewhat reticent about publishing stats on the uniqueness of their solutions - and there seems to be very little literature (beyond the original Panopticlick study) comparing methodologies.
You might find some useful pointers in the blog post here. I'd previously reported 1056 unique hashes from 1160 different devices using some of those methods. I've since updated my methodology to include canvas fingerprinting, which ramps up the uniqueness a lot.
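The server-side part of this is just canonicalising whatever attributes you collect (user agent, font list, canvas hash, and so on - the attribute names below are illustrative) into one stable hash, then measuring uniqueness the same way the 1056-of-1160 figure was computed:

```python
import hashlib

def fingerprint(attrs: dict) -> str:
    """Combine collected browser attributes into one stable hash.
    Keys are sorted so the same attributes always produce the same fingerprint."""
    canonical = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.sha256(canonical.encode()).hexdigest()

def uniqueness(fingerprints: list) -> float:
    """Fraction of observed devices that map to a distinct hash."""
    return len(set(fingerprints)) / len(fingerprints)
```

Adding a high-entropy signal such as a canvas hash simply becomes one more key in `attrs`, which is why it lifts the uniqueness figure without changing the pipeline.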
Sooner or later you'll find that you have lots of data for each review - multiple IP addresses, cookies, fingerprints (and potentially vocabularies, writing style and others). You may find it's more appropriate to associate a weighting with the different flags detected, in a similar way to how SpamAssassin works.
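A SpamAssassin-style scorer is only a few lines: each detected flag contributes a weight and the total is compared to a threshold. The flag names, weights, and threshold below are made up for illustration; you would tune them against your own data:

```python
# Illustrative weights - each fraud signal contributes to a suspicion score.
WEIGHTS = {
    "same_asn_as_other_review": 1.5,
    "same_fingerprint": 3.0,
    "same_cookie": 2.5,
    "similar_writing_style": 1.0,
    "asn_outside_market_countries": 2.0,
}
THRESHOLD = 4.0  # scores at or above this get held back

def suspicion_score(flags: set) -> float:
    """Sum the weights of all flags detected for a review."""
    return sum(WEIGHTS.get(f, 0.0) for f in flags)

def hold_for_moderation(flags: set) -> bool:
    """True if the combined evidence is strong enough to withhold the review."""
    return suspicion_score(flags) >= THRESHOLD
```

The advantage over hard rules is the same one SpamAssassin exploits: no single weak signal (a shared ASN, say) blocks a review on its own, but several weak signals together do.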