Timeless, mate, here's a quote from the article:
What is the difference between Voltage-Amps (VA) and watts and how do I properly size my UPS? - Power Solutions
UPS manufacturers only publish the VA rating of the UPS. For small UPS designed for computer loads, which have only a VA rating, it is appropriate to assume that the Watt rating of the UPS is 60% of the published VA rating. For larger UPS systems, it is becoming common to focus on the Watt rating of the UPS, and to have equal Watt and VA ratings for the UPS, because the Watt and VA ratings of the typical loads are equal.
I don't know if this is what you are asking for. In any case, I run an 850VA UPS with three machines plus monitors plus a modem on it. I do that through surge boards too, as I am a little "pedantic" when it comes to protecting my gear. Never had a problem with this set-up, except once I plugged in a blower heater without thinking straight and it immediately sent the UPS into panic mode.
I didn't bother to read the entire article you linked (it's WAY past my bedtime and my eyes are starting to cross), but the part you quoted is misleading.
Actually, most manufacturers list both the VA and the wattage of UPSes. First, some definitions. VA means volt-amps (volts multiplied by amps). It's used instead of wattage because the amount of usable wattage is often less, thanks to something called Power Factor.

In an ideal world, the current is always in sync with the voltage, resulting in a Power Factor of 1.0, also called Unity. However, since we have to live in the real world, inductive loads such as large motors cause the current to lag a bit behind the voltage, and the wattage the load actually receives ends up less than the product of volts times amps. That makes the Power Factor less than Unity (btw, power companies switch big capacitor banks in and out to correct power factor). By using VA instead of wattage, the true amounts of voltage and current flowing through a circuit are represented.
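If a worked example helps, here's a minimal sketch of that arithmetic. The 230 V mains, 3.7 A draw, and 0.6 power factor are purely illustrative assumptions on my part, not figures from the linked article:

```python
# Illustrative numbers only: assume 230 V mains, a 3.7 A draw, and a
# 0.6 power factor (1.0 would be Unity; inductive loads push it lower).
volts = 230.0
amps = 3.7
power_factor = 0.6

apparent_power_va = volts * amps                      # what the UPS must supply
real_power_watts = apparent_power_va * power_factor   # what the load actually gets

print(f"Apparent power: {apparent_power_va:.0f} VA")  # ~851 VA
print(f"Real power:     {real_power_watts:.0f} W")    # ~511 W
```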
The input rating of a UPS has to reflect how much power it is going to draw from the mains. While Power Factor will reduce the amount of output wattage a little, the biggest cause of the difference between the input and output power ratings is that the UPS has parasitic loads necessary for its own operation. The power-regulating circuitry will use and lose some power, and the battery-recharging circuitry will use up even more. UPS manufacturers chose to use VA instead of wattage so that users know the true amount of current being drawn from the mains when the UPS is running full tilt. Wattage is used to show the amount of usable power available to the devices protected by the UPS.
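As a rough illustration of that "true current draw" point, you can back out the worst-case mains current from the VA rating. Again, the 230 V figure is just my assumption for the example; substitute 120 V or whatever your region uses:

```python
# Estimate the worst-case mains current implied by a VA rating.
# Assumes a simple single-phase supply; 230 V is illustrative.
ups_rating_va = 850.0
mains_volts = 230.0

max_current_amps = ups_rating_va / mains_volts
print(f"Max mains draw at full tilt: ~{max_current_amps:.1f} A")  # ~3.7 A
```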
On average, the wattage will be roughly 60% of the VA, but that figure will be lower for less efficient units and/or ones that recharge their batteries more quickly. UPSes that are more efficient and/or take more time to recharge the batteries will have a wattage rating that is a higher percentage of the VA. Keep in mind that a UPS with more battery capacity will require more charging time, no matter what the charging rate is.
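Putting that rule of thumb to work, here's a hedged sizing sanity check for a setup like the 850VA one mentioned earlier. The 0.6 factor comes from the quoted article; all the per-device wattages are made-up example figures:

```python
# Sanity-check an 850VA UPS against an example load using the 60%
# rule of thumb quoted above. Per-device wattages are hypothetical.
WATT_FRACTION = 0.6  # assumed Watt rating as a fraction of the VA rating

def usable_watts(rating_va: float) -> float:
    """Estimate a UPS's usable Watt rating from its published VA rating."""
    return rating_va * WATT_FRACTION

load = 3 * 100 + 3 * 30 + 10   # three PCs, three monitors, one modem (W)
budget = usable_watts(850)     # ~510 W for an 850VA unit

print(f"Estimated load:   {load} W")        # 400 W
print(f"Estimated budget: {budget:.0f} W")  # 510 W
print("Fits!" if load <= budget else "Too much; size up or shed load.")
```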
Hopefully, that was a wee bit clearer than mud. It's time for me to become the filling of a sheet sandwich.