
A typical office lease says the tenant has 1.5 watts of electricity per sq. ft. to use for lighting and 3.5 watts per sq. ft. for computers, etc.; anything over this they pay extra for via a sub-meter. How do I calculate the threshold, i.e. the amount they cannot exceed, in kWh?

  • This question is a bit off-topic - are you principally concerned with the cost, or is the sustainability of appliance use also a concern? Voting to close for now. – LShaver Dec 21 '16 at 19:05

1 Answer


Since the owner's meter is unable to differentiate between electricity used by lighting vs computers, the terms are essentially saying that the tenant is limited to 3.5W + 1.5W = 5W/sqft.

This value is for power, often called demand on an electricity bill. Based on the info provided, as long as the tenant draws no more than 5 W of electricity per square foot, they could leave their lights and computers on 24 hours a day without paying anything additional.

This would equate to 5W × 730 hours/month = 3,650 watt-hours (Wh) of energy usage, also called consumption, per sqft, per month.
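In code form, that arithmetic is just the following (a minimal sketch; the 730-hour month is the average used above, and the variable names are mine, not anything from the lease):

```python
# Minimal sketch: monthly energy allowance implied by the lease's demand limit.
# The 730-hour month is an average (8,760 hours/year / 12); nothing here comes
# from the actual lease beyond the 1.5 W and 3.5 W per sq ft figures.
POWER_LIMIT_W_PER_SQFT = 1.5 + 3.5   # lighting + computers, per the lease
HOURS_PER_MONTH = 730

energy_wh_per_sqft = POWER_LIMIT_W_PER_SQFT * HOURS_PER_MONTH
print(f"{energy_wh_per_sqft:,.0f} Wh ({energy_wh_per_sqft / 1000:.2f} kWh) per sq ft per month")
# -> 3,650 Wh (3.65 kWh) per sq ft per month
```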

However, let's say the tenant makes widgets with 1 sq-ft widgetmakers that require 3,650 W of power, but only have to run for one hour each month to produce all the widgets needed to keep the company in business.

Usage remains the same: 3,650W × 1 hour/month = 3,650Wh per sqft, per month. But now the demand is 3,650W, or 3,645W higher than the limit specified in the lease.
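Here is the same comparison as a short sketch, using the hypothetical widgetmaker numbers above:

```python
# Hypothetical widgetmaker: same monthly energy, very different demand.
WIDGETMAKER_POWER_W = 3_650        # per sq ft of widgetmaker (hypothetical)
RUN_HOURS_PER_MONTH = 1
DEMAND_LIMIT_W_PER_SQFT = 5

energy_wh = WIDGETMAKER_POWER_W * RUN_HOURS_PER_MONTH            # 3,650 Wh, unchanged
excess_demand_w = WIDGETMAKER_POWER_W - DEMAND_LIMIT_W_PER_SQFT  # 3,645 W over the limit
print(f"Energy: {energy_wh} Wh/month; demand exceeds the limit by {excess_demand_w} W")
```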

Thus, in order to effectively answer the question, the tenant needs to know the power rating of each appliance they intend to use. As long as the total power demand of all appliances in use simultaneously doesn't exceed 5W/sqft, there will be no demand charge.
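For example, a minimal sketch of that check, assuming a hypothetical 2,000 sq ft office and made-up nameplate ratings:

```python
# Hypothetical check: add up the nameplate power of everything that may run
# at the same time and compare to the lease limit. Office size and appliance
# ratings below are made up for illustration.
OFFICE_AREA_SQFT = 2_000
DEMAND_LIMIT_W = 5 * OFFICE_AREA_SQFT   # 10,000 W for this hypothetical office

appliances_w = {
    "lighting": 2_500,
    "computers": 4_000,
    "printer": 800,
    "coffee maker": 1_200,
}

total_demand_w = sum(appliances_w.values())
status = "within the limit" if total_demand_w <= DEMAND_LIMIT_W else "over the limit"
print(f"Total simultaneous demand: {total_demand_w} W (limit {DEMAND_LIMIT_W} W) -> {status}")
```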

LShaver