I wouldn't worry about calculating power factor; it's never exactly 1 (more properly called unity), even on resistive loads, since some reactance always manages to get introduced. It's rarely enough of an issue to concern anyone other than the power company (I've handled the capacitor banks power companies use to correct power factor; some of those puppies are freaking huge). The VA rating of a UPS is normally used only to specify how much power is drawn at the outlet the UPS is plugged into, and that will always be the same no matter what power factor the UPS is rated at. That's why I didn't want to confuse the OP with it. Also, UPS manufacturers love to tout the input VA since it makes the UPS look more powerful than the output wattage would. Dividing the VA by the input voltage (normally 120V in the U.S.) to get the amperage needed at the wall outlet is simple enough without having to mess around with charts.
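That division can be sketched in a couple of lines (the 1500VA figure below is just an example number, not from any particular UPS):

```python
def outlet_amps(va_rating, input_voltage=120.0):
    """Amps drawn at the wall outlet for a given UPS input VA rating.

    Defaults to the standard 120V U.S. outlet; both arguments here are
    illustrative assumptions, not specs from any real unit.
    """
    return va_rating / input_voltage

# A hypothetical 1500VA UPS on a 120V circuit:
print(outlet_amps(1500))  # 12.5 amps
```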
That quote from Tripp-Lite must have been for industrial UPSes, since I've never seen a consumer UPS rated in VA for both the input and the output (btw, when quoting something like that, it's standard practice to link the source). As you sorta pointed out, power factor is not a concern for computer gear, so why worry about it? It's always a good idea to slightly oversize equipment (say, 5-10%) to provide a safety margin, and it doesn't hurt to oversize even more to allow for future expansion (as I did with my UPS; I just pulled the trigger on three monitors that will use 50% more juice than my present single monitor). Of course, oversizing also allows for more runtime during an outage.
One more exception I take to the Tripp-Lite directions, beyond what I already pointed out earlier: just because a computer PSU is rated for a certain wattage doesn't mean it will ever draw that much. If one has a PSU that will never draw more than a fraction of its rating, there is no need to shell out shekels for a UPS any bigger than will actually be needed.
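The sizing logic can be sketched like this; the wattage numbers and the 10% default margin are just illustrative assumptions, not from Tripp-Lite or any spec:

```python
def ups_watts_needed(measured_draw_watts, margin=0.10):
    """Minimum UPS output wattage: the system's actual measured draw
    plus a safety margin.

    Deliberately uses the measured draw, NOT the PSU's nameplate rating,
    since a PSU rarely pulls anywhere near its rating. The 10% default
    margin is an illustrative assumption.
    """
    return measured_draw_watts * (1 + margin)

# Hypothetical rig: a 750W PSU that actually draws about 300W under load.
print(ups_watts_needed(300))        # about 330W -- size for this, not 750W
print(ups_watts_needed(300, 0.50))  # about 450W -- extra headroom for expansion
```

Picking a bigger margin buys future expansion and longer runtime; the point is that the starting number should be what the box really draws, not what the PSU sticker says.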