As remote monitoring is installed on more and more sites, ranging from sewerage pump stations and dam level monitors to air conditioning and environmental monitoring systems, the need for reliable 12 or 24 V standby power to drive conventional instrumentation and other remote monitoring equipment has increased.
In the past, this area of standby power equipment has been dominated by either 12 V systems as used in the access control and security areas, or by computer and general-purpose UPSs. Neither of these classes of equipment is suited to small, remote industrial applications. In such applications, parameters such as maximum charge time, minimum standby time, battery life, and voltage regulation demand clear specifications to ensure that these installations are fit for purpose. Space and cost are also issues to be addressed during equipment selection.
There is even an inclination by some engineers to simply connect a 24 V DC battery directly across a 24 V DC power supply to achieve a conveniently simple and small standby system. While specifying and installing such industrial-grade systems does not have to be difficult or expensive, there are a number of pitfalls the unwary can easily encounter with the specification and management of small battery standby systems.
This article examines the issues involved and suggests methods to ensure successful equipment selection.
Battery type selection
There are a number of battery types available for standby applications, such as lead-acid (sealed and wet cell), nickel cadmium (NiCad), lithium-ion and nickel metal hydride (NiMH). All of these have differing costs and features that suit different applications.
For the type of systems described here, sealed lead-acid (SLA) cells are the dominant source and the sole focus of this article. This is because the technology is proven, readily available from multiple sources, low-maintenance and inexpensive in the typical ampere-hour (Ah) ratings used in these sorts of applications.
These batteries are normally equipped with a safety valve that opens if the internal pressure rises above a preset limit, typically as a result of overcharging or overheating. When this valve opens, hydrogen can be expelled, which can create an explosive atmosphere – obviously an extremely dangerous situation. Once the valve has opened, the battery is damaged and must be replaced. Batteries of this type are also called valve regulated lead-acid (VRLA) batteries.
There are two types of SLA batteries on the market: absorbent glass mat (AGM) and ‘Gel-cell’. This refers to the method used to immobilise the electrolyte in the battery. Either of these two types of battery may be used with chargers designed for SLA batteries.
Temperature compensation
A lead-acid battery is constructed of a string of cells in series, at approximately 2,3 V per cell when fully charged. A 12 V battery has six such cells, and it is quite common to put two 12 V batteries in series for 24 V applications. This fully charged voltage varies by approximately -3,3 mV/°C per cell. That does not sound like much, but across the 12 cells of a 24 V application it amounts to a change of 0,4 V over a 10°C temperature swing. If the float voltage of the charger does not compensate for this change, it is possible to overcharge the battery at high temperatures and/or undercharge it at low temperatures.
Over a normal ambient working range of 15°C to 35°C, a fixed-voltage charger is considered quite satisfactory and no temperature compensation is required. The fixed voltage selected for the charger will depend upon the standby time required of the battery and how conservative the designer is in their voltage selection. If the average ambient temperature is likely to be outside of this range, then a temperature-compensated float voltage is advisable.
Many chargers are equipped with integral temperature compensation. In industrial applications, however, it is common to mount the batteries away from the charger itself, so the batteries may sit in a different temperature environment to the charger. In that case the charger's compensation tracks its own temperature, not that of the batteries, and the battery voltage will not be controlled correctly.
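The compensation described above can be sketched as a simple calculation. The 2,3 V/cell float and -3,3 mV/°C per-cell coefficient are the figures from the text; the 20°C reference temperature is an assumption and should be checked against the battery datasheet.

```python
# Sketch: temperature-compensated float voltage for an SLA battery, using
# the 2,3 V/cell float and -3,3 mV/degC per-cell coefficient from the text.
# The 20 degC reference temperature is an assumption; check the datasheet.

CELLS_PER_12V = 6          # a 12 V battery has six cells in series
FLOAT_V_PER_CELL = 2.3     # V per cell at the reference temperature
TEMP_COEFF = -0.0033       # V per degC per cell
REF_TEMP_C = 20.0          # assumed reference temperature

def float_voltage(battery_volts: int, battery_temp_c: float) -> float:
    """Return the compensated float voltage for a 12 V or 24 V battery."""
    cells = CELLS_PER_12V * (battery_volts // 12)
    per_cell = FLOAT_V_PER_CELL + TEMP_COEFF * (battery_temp_c - REF_TEMP_C)
    return cells * per_cell

# A 24 V battery needs roughly 0,4 V less float voltage at 30 degC than at 20 degC:
print(round(float_voltage(24, 20.0), 2))  # 27.6
print(round(float_voltage(24, 30.0), 2))  # 27.2
```

This also shows why remote temperature sensing at the battery matters: the calculation is only as good as the temperature fed into it.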
Charging time
While it is true in principle that a battery’s time-to-charge is determined by the charge current and the Ah rating of the battery, this will only be true over a limited range of charging current. This is because charging the battery is a chemical process, and the chemical changes require a finite amount of time to occur in the battery.
Every battery manufacturer quotes a maximum charge current that is normally in the order of 10-20% of the battery’s Ah capacity. Exceeding this charging current will not necessarily charge the battery any quicker, but more importantly, the battery could be damaged in the process.
When using a single-mode, constant-voltage charger it is therefore important to ensure that the current capability of the charger does not exceed this maximum charging rate of the battery. This problem is often encountered when using a conventional power supply as a battery charger. Very few conventional supplies will have a tightly specified maximum current limit, and the current-delivery capability of these types of power supplies can quite often reach twice that of the power supply manufacturer’s minimum specification under certain operating conditions.
Using a constant-voltage charger set to the recommended float voltage, the battery will not be charged in the optimum time after usage in standby. This is because the battery terminal voltage will reach this float voltage well before the battery reaches its fully charged state. As the battery terminal voltage approaches the float voltage, the charging current will tail off to a lower and lower value, and the battery will not reach its fully charged state for a significant period of time. These types of chargers are not suited to applications where the time-to-charge after standby is an important specification of the system.
Many of these limitations are overcome using dual-mode chargers. In a dual-mode charger, the battery is charged in two phases. When the AC power returns after the battery has been supplying the load and therefore requires recharging, the charger will enter its bulk-mode charging phase. In this mode the battery will be charged with a constant current until it reaches its bulk charge voltage. The charger then switches into float charge mode and the voltage is reduced to its float voltage, where the battery can remain indefinitely.
The bulk mode charge rate is chosen to ensure that the battery reaches 85-95% charge in the shortest possible time (within the constraints of the battery’s specifications). The remaining 5-15% charge is then topped up more slowly during the float charge cycle.
If it is important in the application that the battery reach its design capacity by the end of the bulk charge phase, then it is wise to over-rate the battery by up to 15%, and to consider the battery fully charged when it reaches this 85-95% capacity.
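The charging arithmetic above can be sketched as follows. The 10-20% of C maximum charge rate is the figure from the text; the assumption that bulk mode restores about 90% of capacity at constant current is illustrative, since the exact restored fraction varies between batteries.

```python
# Sketch: bulk-charge sizing check, assuming the 10-20 % of C (Ah) maximum
# charge current quoted in the text, and assuming bulk mode restores about
# 90 % of capacity at constant current (the rest tops up slowly on float).

def max_charge_current(capacity_ah: float, fraction: float = 0.2) -> float:
    """Manufacturer's maximum charge current, typically 10-20 % of C."""
    return capacity_ah * fraction

def bulk_charge_hours(capacity_ah: float, charge_current_a: float,
                      restored_fraction: float = 0.9) -> float:
    """Hours of constant-current bulk charging to reach ~90 % capacity."""
    if charge_current_a > max_charge_current(capacity_ah):
        raise ValueError("charger current exceeds the battery's maximum rate")
    return capacity_ah * restored_fraction / charge_current_a

# A 24 Ah battery charged at 2,4 A (10 % of C) reaches ~90 % in about 9 hours:
print(round(bulk_charge_hours(24, 2.4), 1))  # 9.0
```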
Low-voltage cut-out
As described above, a battery is a string of cells in series. The voltage across one cell will range from about 2,1 V when fully charged, to about 1,9 V when fully discharged.
When charging a battery, only the battery terminal voltage is monitored, and it is assumed that each of the cells has the same state of charge at any given time, and therefore that the individual cell voltages are the same across the battery. Because of chemical imbalances in the battery, this is rarely the case in practice. The consequence of this is that the individual cells of the battery will not reach full discharge at the same time.
If a battery continues to supply the load after the first cell reaches its state of full discharge (powered by the other cells) then this discharged cell will enter a deep discharge state. This results in a reversal of the cell voltage in this cell, which will damage the cell if allowed to continue for too long. It is therefore important, for battery life, to disconnect the load before the first cell enters this state. Since the charger is not monitoring the individual cell voltages, this point must be estimated based upon the terminal voltage across the entire battery. This feature of chargers is known as the low-voltage cut-out, and serves to disconnect the battery from the load before it becomes permanently damaged.
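A minimal sketch of estimating the cut-out threshold from the per-cell figures above. The small per-cell safety margin used here is an assumption for illustration; real chargers set this point according to the battery manufacturer's data.

```python
# Sketch: low-voltage cut-out threshold from the ~1,9 V/cell fully
# discharged figure in the text. A small per-cell margin (assumed here)
# disconnects the load before the weakest cell can reverse.

def cutout_voltage(battery_volts: int, margin_v_per_cell: float = 0.02) -> float:
    """Estimated disconnect threshold for a 12 V or 24 V battery string."""
    cells = 6 * (battery_volts // 12)
    return cells * (1.9 + margin_v_per_cell)

print(round(cutout_voltage(12), 2))  # 11.52
print(round(cutout_voltage(24), 2))  # 23.04
```

Because only the terminal voltage is visible, the margin effectively hedges against cell-to-cell imbalance: a battery reading the threshold voltage may already have one cell well below 1,9 V.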
Battery size selection
In order to select an appropriate battery for the application, follow these simple steps:
1. Calculate the ampere-hours of standby time required, by multiplying the number of hours of standby required by the average standing load in amperes.
2. To take into account deterioration of battery capacity over the life of the battery (20% over 48 months is typical), and residual charge remaining at cut-off (20%), multiply this figure by 1,6 (this multiplier may vary from application to application).
3. If the battery is required to provide full standby time at temperatures lower than 20°C, increase this capacity by a further 10% for each 10°C below 20°C.
4. An additional factor of 15% may be added to the battery capacity if the recharge time to required capacity from the discharged state is an important factor of the design.
5. This then gives the minimum ampere-hour capacity for the battery required for the application.
In general, the larger the battery, the better in any given application (taking into account the compromise between size and cost).
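The five sizing steps above can be sketched directly in code. The 1,6 ageing/residual multiplier, the 10% per 10°C cold derating and the optional 15% recharge allowance are the figures from the text; treat them as typical values, not universal ones.

```python
# Sketch of the five battery-sizing steps in the text. The multipliers are
# the article's typical figures and may vary from application to application.

def battery_capacity_ah(standby_hours: float, load_amps: float,
                        min_temp_c: float = 20.0,
                        fast_recharge: bool = False) -> float:
    ah = standby_hours * load_amps            # step 1: raw ampere-hours
    ah *= 1.6                                 # step 2: ageing + residual charge
    if min_temp_c < 20.0:                     # step 3: +10 % per 10 degC below 20 degC
        ah *= 1.0 + 0.10 * (20.0 - min_temp_c) / 10.0
    if fast_recharge:                         # step 4: recharge-time allowance
        ah *= 1.15
    return ah                                 # step 5: minimum Ah capacity

# 8 h standby at 0,5 A, down to 0 degC, with the recharge allowance:
print(round(battery_capacity_ah(8, 0.5, min_temp_c=0, fast_recharge=True), 2))  # 8.83
```

In this example the next standard size up (for instance 10 or 12 Ah) would be selected.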
Choosing a battery for backup
The method for calculating battery backup times and battery sizes, for the batteries used with Omniflex power supplies and chargers, is as follows:
20-hour discharge current
Backup times are based on the so-called 20-hour discharge current, referred to as J20. The product of this current and the time of 20 hours equals the battery capacity. Therefore, if the battery capacity is known, e.g. 24 Ah, the current can be calculated as J20 = 24/20 A = 1,2 A.
If the discharge current is constant and equals J20, the battery will provide backup power for 20 hours. For currents lower than J20, we can assume that the backup time increases proportionately, and that a battery of twice the capacity will provide twice the backup time.
However, this relationship is not linear for currents greater than J20. For larger load currents, the battery needs to be allowed to discharge to a lower terminal voltage and the backup times are shorter than expected. The graph in Figure 1 is a typical example of the recommendations available from battery manufacturers (in this case a 12 V battery).
The graph shows that if the current is four times higher than J20, the battery should be allowed to work down to about 9,8 V and the backup time would be only four hours, instead of the expected five hours.
Normally the power supply must provide specific voltage levels to power the associated equipment, and therefore have a fixed terminal voltage to which the battery can be allowed to discharge. When the battery is discharged to this voltage, it should be galvanically disconnected from the load until the primary power source is restored, in order to protect the battery.
The important conclusion is that if the battery size is reduced to the point where the load draws an average current greater than J20, the backup time must be evaluated using a battery manufacturer's graph similar to that of Figure 1.
Evaluating backup time and battery size
The following example illustrates the evaluation of backup times from the graph in Figure 1, when the average current is greater than J20.
If, for example, the power consumption from the power supply is 38 W and the battery disconnect voltage is required to be 11,3 V, then under the battery backup condition, the load will be working from a fully charged battery. For the greater part of the backup period, the voltage across the output terminals will be 12,6 V. The current drawn by the load will be:
Inom = 38/12,6 A ≈ 3 A
Because of its small size, it would be convenient to use a 12 Ah backup battery. For this battery, J20 is calculated as follows:
J20 = 12/20 A = 0,6 A
The nominal current is five times higher than the J20 current, therefore:
Inom = 3 A = 5 x 0,6 A = 5 x J20
From the graph in Figure 1, it can be established that at the load current which is five times greater than J20, the battery will reach 11,3 V after about three hours. Conversely, if four hours’ backup time is needed, the nominal current should equal three times J20 to ensure a safety margin. This requires a J20 of 1 A and a 20 Ah battery.
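The J20 arithmetic of this worked example can be sketched as follows. The backup time itself must still be read off the manufacturer's curve (Figure 1) once the load exceeds J20; the code only establishes which multiple of J20 the load represents.

```python
# Sketch of the worked example: J20 for a candidate battery, and the
# multiple of J20 that the load represents. For loads above J20 the
# backup time must be read from the manufacturer's discharge curves.

def j20(capacity_ah: float) -> float:
    """20-hour discharge current for a battery of the given capacity."""
    return capacity_ah / 20.0

def load_multiple_of_j20(power_w: float, battery_v: float,
                         capacity_ah: float) -> float:
    """How many times J20 the average load current amounts to."""
    i_nom = power_w / battery_v   # average load current while on battery
    return i_nom / j20(capacity_ah)

# The 38 W load on a 12 Ah battery at 12,6 V runs at about 5 x J20:
print(round(load_multiple_of_j20(38, 12.6, 12), 1))  # 5.0
```

Swapping in a 20 Ah battery gives J20 = 1 A, bringing the same load down to about 3 x J20, which the text shows is enough for the four-hour requirement.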
Conclusion
There are a number of factors to be considered when designing a battery-backed DC power supply system. Ignoring any one of them can lead to a system that will either not meet specification or will result in premature battery failure. It is therefore important to choose a power supply/battery charger combination that meets the needs of the system. If this is done, there is no reason why such systems cannot give years of useful life without needing to be high in cost, space or weight.
Tel: +27 31 207 7466
Email: [email protected]
www: www.omniflex.com
© Technews Publishing (Pty) Ltd | All Rights Reserved