Keith Welsh, Senior Technical Staff at Maxim Integrated Products, discusses a low-voltage HB-LED solution for Li+ battery-operated systems.
High-brightness (HB) LEDs are now available for a wide range of lighting applications. Their efficiency, referred to as ‘luminous efficacy’ and measured in lumens of light output per watt consumed, now exceeds that of even fluorescent lighting systems. Reliability and intrinsically safe operating voltages make HB LEDs a good choice for battery-backup lighting systems such as those found in emergency lighting.
Along with advances in LED devices, battery technology has also improved. Energy densities for the highest-capacity lithium-ion (Li+) cells now exceed ~750kJ/kg. Nickel-metal hydride (NiMH) cells have a lower energy density, at around 200kJ/kg. (For comparison, note that petrol is about 44MJ/kg.)
A single-cell Li+ battery has a terminal voltage of around 3.7V. Because placing multiple cells in series introduces design challenges such as power sharing, a user will often prefer to work with a single-cell solution.
The challenge today is to marry highly efficient LED light sources with high-capacity, single-cell Li+ batteries where the available supply voltage is only 3V to 4V.
An operating voltage of at least 4.5V is required to provide sufficient gate-drive voltage to bring the switching MOSFETs into good conduction. This is a common requirement for an HB LED driver operating in boost mode using n-channel FETs.
A power supply derived from a single Li+ cell could be as low as 3V, so the drive to the FETs and to other supply rails in the circuit would be insufficient for proper operation. If, however, the battery voltage is boosted to a higher value, the device can operate correctly.
Perfecting power efficiency
Boosting the battery supply once for the controller and then again for current control to the LED string has serious consequences for power consumption and, therefore, battery life. This is because the overall efficiency is the product of the efficiencies of each stage. For example, a 70 percent efficient boost stage followed by a 70 percent efficient control stage yields only about 49 percent efficiency overall.
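The cascaded-efficiency penalty can be shown with a short calculation. This is a minimal sketch; the 70 percent figures are the illustrative values from the text, not measured data:

```python
def overall_efficiency(*stage_efficiencies):
    """Overall efficiency of cascaded power stages is the
    product of the individual stage efficiencies."""
    result = 1.0
    for eta in stage_efficiencies:
        result *= eta
    return result

# Two 70%-efficient stages in cascade:
print(round(overall_efficiency(0.70, 0.70), 2))  # 0.49, i.e. ~49% overall
```

A single 90 percent efficient stage, by contrast, delivers 90 percent overall, which is why boosting the battery power only once matters so much here.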
The solution described here uses a low-cost, low-power boost converter to provide a constant 5V supply to the HB LED driver on the evaluation (EV) kit, while the raw battery power is supplied directly to the FET boost-converter stage. This way the battery power is boosted only once to power the LED string.
The whole circuit was used to drive currents up to 1A into a series string of six Seoul Semiconductor P7 LEDs. While the LEDs are capable of much higher currents than those used in this example, the standard MAX16834 EV kit drives up to 1A, which is sufficient for this analysis.
To eliminate the effects of voltage change or impedance increase during battery discharge, a high-current, low-voltage power supply was used instead of the battery. This kept the input voltage near constant as the current drive to the LEDs was changed to vary the system load.
The input and output currents and voltages were measured to characterise the performance of the system at 5V, 4V, and 3V supplies, simulating the range of voltages expected from a single Li+ cell. Measuring the input and output currents directly would normally require separate calibrated digital voltmeters (DVMs), but an alternative approach was taken.
The input current was measured using a current-sense amplifier EV kit fitted with a very low-ohmic-value shunt to minimise measurement errors due to the shunt. The standard shunt is a 50mΩ, four-terminal resistor, but this was bypassed with six parallel 100mΩ resistors to give a combined 12.5mΩ shunt. The transfer ratio of the EV kit therefore changed from 2.5V per amp to 625mV per amp, and the output voltage could be measured with the same DVM used for all measurements throughout the analysis.
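The shunt arithmetic can be verified in a few lines. This is a sketch: the resistor values are those quoted above, while the amplifier gain of 50V/V is an inferred assumption, derived from the 2.5V-per-amp figure for the stock 50mΩ shunt:

```python
def parallel(*resistances):
    """Equivalent resistance of resistors in parallel."""
    return 1.0 / sum(1.0 / r for r in resistances)

stock_shunt = 0.050                       # 50 mΩ four-terminal resistor
bypass = parallel(*[0.100] * 6)           # six 100 mΩ resistors -> ~16.7 mΩ
combined = parallel(stock_shunt, bypass)  # combined shunt -> 12.5 mΩ

gain = 50.0  # V/V -- assumed, inferred from 2.5 V per amp at 50 mΩ
print(round(combined * 1e3, 3))   # 12.5  (mΩ)
print(round(gain * combined, 3))  # 0.625 (V per amp, i.e. 625 mV/A)
```

The four-fold drop in shunt resistance (50mΩ to 12.5mΩ) explains the four-fold drop in transfer ratio (2.5V/A to 625mV/A).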
The output current was determined by measuring the voltage across the 0.1Ω series resistor on the output of the EV kit using the same DVM. This approach ensured that all current and voltage readings were made by measurement of voltage alone; using the same DVM for every measurement essentially nullified any calibration errors in the test gear.
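Because every quantity is captured as a DVM voltage reading, the efficiency calculation reduces to a few multiplications. The sketch below uses hypothetical example readings (not measured data from the article); the 625mV-per-amp transfer ratio and 0.1Ω output resistor are those described above:

```python
V_IN_SENSE_GAIN = 0.625  # V per amp, from the 12.5 mΩ shunt + amplifier
R_OUT_SENSE = 0.1        # Ω, series resistor on the EV-kit output

# Hypothetical DVM readings, chosen only to illustrate the method:
v_supply = 3.6        # V, at the battery input
v_in_sense = 3.75     # V, at the current-sense amplifier output
v_led_string = 21.0   # V, across the six-LED string
v_out_sense = 0.0925  # V, across the 0.1 Ω output resistor

i_in = v_in_sense / V_IN_SENSE_GAIN  # input current, A
i_out = v_out_sense / R_OUT_SENSE    # LED string current, A
efficiency = (v_led_string * i_out) / (v_supply * i_in)
print(f"{efficiency:.1%}")           # about 90 percent for these readings
```

Four voltage readings per operating point are enough to compute input power, output power, and overall efficiency.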
With a few minor modifications to the circuit, the challenges of driving a string of HB LEDs are met. The overall power conversion efficiency was maintained around or above 90 percent even with battery supplies as low as 3V.
Engineers can now use the latest high-capacity Li+ cells to provide lighting for applications that previously required multiple stages of power conversion, an approach that by itself gave poor system efficiency and reduced battery life.
Maxim Integrated Products