Solid State Guitar Amp Forum | DIY Guitar Amplifiers
Amp power supply - voltage question

Started by camerongrieve, December 19, 2008, 04:25:53 PM


camerongrieve

Hi guys,
First question here. I've recently revived my interest in electronics and have built a few
FX pedals. Now I'm looking at building a little practice amp. I've built the Ruby, but
I'm after something with a little more power.

Many of the designs I've come across need bipolar power, e.g. ±35 V.
I've seen the schematics and noticed there are three power-in terminals: -35 V, ground, and +35 V.
This has been doing my head in, so there are a few parts to my question.

a. How is this different from using a 0 V ground and +70 V?
b. If a project says it requires an 18 V bipolar power supply, does that mean -9 V, 0 V, +9 V,
a total difference of 18 V? Or -18 V, 0 V, +18 V?
c. What are the advantages of bipolar power supplies? I've noticed that larger DC-powered amps
almost exclusively use them.

I've tried Googling answers to these questions, but I've had very little luck. Maybe I'm searching the
wrong keywords.

Thanks in advance for your help,

Cameron

teemuk

Well... The DC potential at the amplifier's output settles midway between the supply rails. In a 70 V – 0 V amp, for example, the midpoint would be 35 V, so you would need to AC-couple the output to isolate the speaker from that DC potential. With a ±35 V bipolar supply the midpoint is zero volts, so the coupling capacitor can be omitted.
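To put rough numbers on that, here's a minimal Python sketch (my own illustration, not from the post above; the 8 ohm load and 40 Hz cutoff are just assumed example values):

```python
import math

def output_dc_midpoint(v_pos, v_neg):
    """Idle DC level at the amp output: midway between the supply rails."""
    return (v_pos + v_neg) / 2

def coupling_cap_farads(load_ohms, cutoff_hz):
    """Output coupling cap for a given -3 dB low-frequency cutoff into a
    resistive load: f = 1 / (2*pi*R*C), rearranged for C."""
    return 1 / (2 * math.pi * load_ohms * cutoff_hz)

# Single 0-70 V supply: the output idles at +35 V, so a coupling cap is a must.
print(output_dc_midpoint(70, 0))         # 35.0
# Bipolar +/-35 V supply: the output idles at 0 V and can drive the speaker directly.
print(output_dc_midpoint(35, -35))       # 0.0

# Coupling cap a single-supply amp would need for an 8 ohm speaker
# and a 40 Hz cutoff (both values assumed purely for illustration):
print(coupling_cap_farads(8, 40) * 1e6)  # ~497 uF, i.e. a 470-1000 uF electrolytic
```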

Second, the rail voltages are lower, so instead of one high-voltage filter capacitor with a rather high capacitance you can get by with two lower-voltage capacitors of less capacitance each. E.g. instead of a single 80 V 4400 uF capacitor you use two 50 V 2200 uF capacitors. Believe it or not, those two capacitors often work out cheaper than the single one with the higher ratings. Other parts can also get away with lower voltage ratings.
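As a rough sanity check (my own back-of-envelope sketch, assuming a full-wave rectifier on 50 Hz mains and an arbitrary 1.5 A average draw), the classic ripple approximation dV ≈ I / (2·f·C) shows what each capacitor is up against; the smaller per-rail caps ripple more on their own, which is where the cancellation described in the next point helps out:

```python
def ripple_pp_volts(load_current_a, cap_farads, mains_hz=50):
    """Rough peak-to-peak ripple of a full-wave rectified, capacitor-filtered
    supply: the reservoir cap feeds the load for about half a mains cycle
    between charging peaks, so dV ~= I / (2 * f * C)."""
    return load_current_a / (2 * mains_hz * cap_farads)

# Ballpark figures at an assumed 1.5 A average draw:
print(ripple_pp_volts(1.5, 4400e-6))  # single 4400 uF reservoir: ~3.4 V p-p
print(ripple_pp_volts(1.5, 2200e-6))  # each 2200 uF rail cap:    ~6.8 V p-p
```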

Third, an amplifier circuit that runs from a bipolar supply tends to have a better power supply rejection ratio, because the ripple signals on the two supply rails tend to cancel each other out. This means the amplifier can get away with less filtering while still producing output with less hum. A single-supply amplifier is typically poorer in this regard and tends to need more filtering to compensate. Couple that with needing higher voltage ratings for the filter capacitors, plus an output coupling capacitor, and you realise the single-supply amplifier is likely a much more expensive solution.
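Here's a toy illustration of that cancellation (again my own idealised sketch, assuming perfectly symmetric rail sag and a plain resistive divider biasing the single-supply output at half the rail):

```python
import numpy as np

f_ripple = 100                      # full-wave ripple frequency from 50 Hz mains
t = np.linspace(0, 0.05, 2001)      # 50 ms window
sag = 0.75 * (1 - np.cos(2 * np.pi * f_ripple * t))   # ~1.5 V p-p of rail sag

# Bipolar +/-35 V: both rails sag toward zero by the same amount (idealised),
# so the midpoint the output idles at doesn't move.
mid_bipolar = ((35 - sag) + (-35 + sag)) / 2

# Single 70 V rail with the output biased at half the rail by a plain divider:
# half of the rail ripple lands on the bias point unless it's filtered further.
mid_single = (70 - sag) / 2

print(np.ptp(mid_bipolar))  # ~0 V ripple at the idle output
print(np.ptp(mid_single))   # ~0.75 V ripple at the bias point
```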

I can see the point of a single supply when you have to use an inherently single-supply power source such as a battery, which would require some special tricks to be converted into a bipolar supply. But if you have the choice, the bipolar supply is most often the way to go.

camerongrieve

Thanks for your response. It appears I still have a lot of reading to do :o