Originally Posted by LordFoo
Another afterthought, after sleeping on it -- if the output is 5V regulated, you might be able to get away with just putting a diode in series with the load (rated for the appropriate current drawn by the device, of course). One of the weird things about diodes is that as long as they're conducting, they have a roughly constant voltage drop of about 0.7V -- so that'd give you about 4.3V, a bit closer to 4.5V (probably within acceptable tolerance?) and still regulated.
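To make the diode trick concrete, here's a quick back-of-the-envelope calculation (just a sketch -- the 0.7V figure is the typical drop for a silicon diode; a real diode's drop varies a bit with current and temperature):

```python
# Estimate the output voltage after putting one silicon diode
# in series with a regulated 5V supply.
V_SUPPLY = 5.0   # regulated charger output (volts)
V_DIODE = 0.7    # typical forward drop of a silicon diode (volts)

v_out = V_SUPPLY - V_DIODE
print(f"Output after one series diode: {v_out:.1f} V")  # 4.3 V
```

If you needed to drop more, diodes in series stack: two diodes would give roughly 5 - 2*0.7 = 3.6V.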
Again, if the charger is unregulated, it's a whole different ballgame.
EDIT: For something more complicated, you could use a transistor-based circuit; see "Emitter-Follower buffer":
http://hyperphysics.phy-astr.gsu.edu...c/emitfol.html
EDIT AGAIN: Now that I think about it, the easiest way would be to use a normal voltage divider followed by an op-amp configured as a voltage follower -- this basically acts as a buffer between the load (device) and the supply circuit. It'll work only if the charging current isn't too large: typical op-amps can deliver 20-30 mA, and "high current" ones are around 200 mA.
http://hyperphysics.phy-astr.gsu.edu...opampvar2.html
Thanks, but this is all pretty much over my head... I was kind of looking for an "it will blow up your camera" or an "it will work". I guess it's neither. I did some more research on the charger, and it says it provides from 3V to 6V -- would that mean it adjusts the voltage for devices automatically? If so, all I'd have to do is splice a couple of wires...