For your use case, where you seem to be expecting extended outages, I would actually recommend a diesel generator paired with a small UPS (to bridge the gap while the generator starts up), if possible. You could even consider batteries designed for solar installations, which have the added advantage of saving on future power costs ;)
When you're picking out any kind of power supply, you usually want to look at VA (volt-amps), not W (watts). Nominally, W = V (volts) × A (amps), but watts don't account for "unused" power that's "returned" to the source¹. You need to pick a supply capable of handling the peak load you can expect, plus a little extra. It also needs to be able to supply an adequate continuous load.
When you want to know how much the electricity is costing you, use watts. When you are specifying equipment loads, fuses, and wiring sizes, use VA (or the RMS voltage and RMS amperage).
That sorts out whether the power supply is capable of supporting your usage. Then you need to look at the capacity, which determines how long the power supply can sustain that usage. This is normally measured in Wh (watt-hours, where 1 Wh = 1 W drawn for 1 hour) or kWh (kilowatt-hours, 1 kWh = 1000 Wh). To figure out how much you need, take your continuous load and multiply it by the runtime you need, plus some extra.
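For reference, here are the two formulas at work as a minimal Python sketch (purely illustrative; the 0.7 default power factor is just the rule of thumb discussed in the calculation below):

```python
# Rule-of-thumb UPS sizing, as a rough sketch (not exact engineering).

def required_va(continuous_watts, power_factor=0.7):
    """Minimum VA rating: since W = PF * VA, VA = W / PF."""
    return continuous_watts / power_factor

def required_wh(continuous_watts, hours):
    """Minimum battery capacity: energy = power * time."""
    return continuous_watts * hours
```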
Performing the actual calculations with your data:
Your laptop will likely draw 150 W at peak, but far less when idle. I don't have that data, so I'm going to assume a 150 W continuous load. Add in the lights and fan and we reach almost 300 W (rounding up for safety). We can usually approximate the relationship as 1 VA = 0.7 W, though this can vary a lot depending on the equipment; 60%–70% is the typical ratio², and I'm using 70% here. That gives 300 / 0.7 ≈ 430 VA; let's say 450 VA. Remember, this is the value you cannot exceed, so such a supply will never safely support more than about 300 W.
At this point you're looking for a power source that can supply at least 450 VA. The next step is to calculate the capacity: 300 W for 6 hours is 300 × 6 = 1800 Wh, or 1.8 kWh. That would require a pretty beefy battery for a UPS. Also note that batteries do degrade over time, losing capacity, and will eventually need to be replaced. Again, you might want to add a safety margin, say an extra hour. With capacity, if you draw more power it will simply last a shorter time, and vice versa.
Again, I don't have actual figures for your real continuous load, but you can repeat these calculations with adjusted values.
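For instance, as a quick Python sketch, where every figure is one of my assumptions to be swapped for your measured values:

```python
laptop_w = 150          # assumed peak draw; measure your actual load
lights_and_fan_w = 150  # assumed, rounded up for safety
load_w = laptop_w + lights_and_fan_w  # ~300 W continuous

power_factor = 0.7      # conservative rule of thumb (see footnote 2)
runtime_hours = 6 + 1   # required runtime plus an hour of margin

va_rating = load_w / power_factor     # ~429 VA -> shop for >= 450 VA
capacity_wh = load_w * runtime_hours  # 300 * 7 = 2100 Wh = 2.1 kWh

print(f"Look for at least {va_rating:.0f} VA and {capacity_wh / 1000:.1f} kWh")
```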
¹ See What is the practical difference between watts and VA (volt-amps)?
² Some loads, including some 80+ certified SMPSes, can almost reach a one-to-one ratio (1 VA > 0.9 W). But it's safer to assume a lower ratio than the other way around, unless you can measure and confirm.
Add up the watts, compute total watt-hours you need. Get a UPS that can provide the peak watts and which has a watt-hour rating probably 50% more than your requirements. – Daniel R Hicks – 2014-01-20T04:42:31.637
@DanielRHicks UPSes are normally rated in Volt-Amps for peak load (i.e. do not exceed, even when on mains), and SOHO ones typically have very limited battery capacity. Volt-Amps translate to a slightly lower number of Watts. – Bob – 2014-01-20T04:48:41.087
What does the VA mentioned on batteries mean? – MrClan – 2014-01-20T04:51:06.540
Also, at the moment this question reads like a product-rec/shopping question, which are usually closed. You can probably save it by emphasising the method of choosing a UPS, and preferably removing the request for a specific "model". – Bob – 2014-01-20T04:52:05.117
@MrClan Watts (W) literally equal volts (V) × amps (A), but volt-amps (VA) normally means a slightly different thing. It's explained pretty well here, but basically VA refers more to how much power is traveling through the wire, and W more to how much power is "consumed"/"used" (dissipated) at the end. As the linked answer states, some of the VA power is actually returned back to the source "unused", making W lower than VA. – Bob – 2014-01-20T04:57:08.390
VA is important when calculating the maximum you can have flowing through a wire (or UPS) before something melts or gets damaged, while W is how much power you've used and need to pay for. A somewhat conservative bet for computing loads is W = 70% of VA. Something like a fan could have completely different characteristics: it could "return" far more of the power "unused", leading to W being a lower percentage of VA. E.g. you might need a 160 VA supply to run your 80 W fan (those numbers are made up). – Bob – 2014-01-20T04:57:56.317
Actually, VA is always at least as large as W. Watts are computed from the vector multiplication of volts times amps over the 1/60th-second power cycle, and if the two are 90 degrees out of phase you have zero watts, regardless of the VA (which is straight AC volts times amps). The difference between the two is the "power factor". Older computers had really lousy power factors (like 0.3), but since Energy Star rules went into place they have improved quite a bit (though I don't know what PF is typical these days). – Daniel R Hicks – 2014-01-20T12:41:28.503
@DanielRHicks I'm trying to avoid being too technical, partially because I don't completely understand the technical side myself ;) The PF of an 80+ certified PSU should be at least 0.9. – Bob – 2014-01-20T14:43:14.037
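To put numbers on the phase-angle point from the comments above, here is a purely illustrative Python sketch (the 230 V / 2 A figures are assumptions, not from the question):

```python
import math

# Purely illustrative: real power (W) vs apparent power (VA) for a
# sinusoidal load, at a few phase angles between voltage and current.
v_rms, i_rms = 230.0, 2.0  # assumed RMS voltage and current
va = v_rms * i_rms         # apparent power is simply Vrms * Irms

for phase_deg in (0, 45, 90):
    pf = math.cos(math.radians(phase_deg))  # power factor = cos(phase)
    watts = va * pf                         # real (billed) power
    print(f"{phase_deg:>2} deg: {va:.0f} VA, {watts:.0f} W, PF = {pf:.2f}")
```

At 0 degrees VA and W are equal; at 90 degrees the same 460 VA delivers zero watts, which is the extreme case Daniel describes.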