Tonkabot wrote: I put a scope on the PEEK fan earlier, and found the following:
12.1V, PEEK fan only
11.7V, head on at 100%
11.5V, bed on at 100%
11.1V, head and bed both on at 100%
And then when the head and bed reach temperature, they switch off and on, so the voltage jumps rapidly between the above values.
I think I'd rather use a SEPIC-type converter like the one I found, and then I can pick a voltage above the input, like 13V for my LEDs.
Are you running the converter you suggested with a 12V output? If it is a buck converter, it doesn't seem like it should be able to keep the voltage at 12V.
LEDs are current-sensitive devices, most with a maximum rating around 20 milliamps. Usually a series resistor limits the current. If your LEDs are brighter when run at automotive voltages, that means
a better resistor value should be selected to give the LEDs more current without exceeding the LED specifications. See below for the calculations.
20 mA is a good MAXIMUM current rating for general-purpose LEDs. LEDs designed for lighting applications have much higher maximum current ratings.
LEDs are current-driven components. This means that if you have a voltage source, such as a power supply or a battery, you need a resistor in series with the LED to limit the current. If you connect an LED directly to a 12V power adapter, a very high current will flow and the LED will glow very brightly... for a very short time!
When you have an LED and a resistor in series, connected to a voltage source, the LED will drop a certain voltage, called its forward voltage. For general purpose LEDs, this voltage varies from about 2V (for red LEDs) to about 4V (for some white LEDs). The remaining voltage is dropped across the resistor.
The voltage across the resistor, and its resistance, together determine the current that will flow in the series circuit. You can calculate the current using Ohm's Law: I = V / R where I is the current, in AMPS; V is the voltage ACROSS THE RESISTOR, in volts, and R is the resistor value, in ohms.
To calculate the right resistor value to use, you can rearrange that formula to R = V / I, where R is the resistance you should use, in ohms; V is the voltage across the resistor, in volts, and I is the desired current, in amps.
For example, suppose you have an LED with a forward voltage of 3.0V and a 12V power supply, and you want to run the LED at 20 mA. The series resistor should be:
R = V / I
V = (12 - 3) = 9 (remember, V is the voltage ACROSS THE RESISTOR)
I = 0.02 (remember, I is in AMPS, not milliamps)
R = 9 / 0.02
= 450 ohms.
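If you want to play with the numbers, here is a quick Python sketch of that same calculation. The 12V / 3.0V / 20 mA figures are just the example values from above; check your own LED's datasheet for its forward voltage and maximum current.

# Series resistor for an LED run from a fixed supply voltage.
def series_resistor(supply_v, forward_v, current_a):
    """Return the series resistance in ohms for the desired LED current."""
    # The resistor drops whatever voltage the LED does not.
    return (supply_v - forward_v) / current_a

# Example: 12V supply, 3.0V white LED, 20 mA target current.
print(series_resistor(12.0, 3.0, 0.020))   # -> 450.0 ohms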
The point is that LEDs couldn't care less what the voltage is; they care about how much current
flows through them. Voltage does matter, however, because an increase or decrease in voltage causes a corresponding rise or fall in the current.
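To put that in terms of the voltages you measured: assuming the same 3.0V LED and 450 ohm resistor from the example above, the swing from 11.1V to 12.1V works out to (11.1 - 3) / 450 ≈ 18 mA at the low end and (12.1 - 3) / 450 ≈ 20 mA at the high end, so the current (and the brightness) only changes by roughly 10%.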
I know that the device I talked about works, and several users use them. It may not work for you, and you should purchase whatever you know will work.
Sharing each other's ideas is part of what this Forum is about.