Dr-David-Banner

Why Aren't There More Aspies Into Electronics?

Recommended Posts

Dr-David-Banner

I have an issue at the moment with mains transformers. I found out that in the very early Forties (or maybe around 1939), the household mains supply was altered from 200 volts A.C. to 230 volts A.C. This meant that mains equipment, such as radio receivers with a primary transformer wound for just 200 volts, was supposed to be modified if it was intended to be plugged into the new 230 volt set-up. That sort of set me off. I'd always assumed that if you applied 230 VAC to a transformer coil wound for, say, 180 VAC, the voltage would automatically be stepped down. However, the engineers of the time reasoned that the gap between the 200 VAC coil and the new 230 VAC mains needed to be addressed. So there is the usual formula for dropping 30 volts via a resistor; I seem to recall it was about 800 ohms in the line supply to the set.

I was up pretty late again doing a bit of maths around this. I was curious about current tolerances and thresholds where a supply voltage is increased. How important was the resistor in this case? I had a look at some of the flux density graphs and watts calculations, so I hope to find the answer to my questions soon.
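As a quick sanity check on that dropper figure, here is a minimal sketch in Python. The primary current is my own assumed figure for a small set, not one taken from any service data:

# Sketch: series dropper to run a 200 V primary from a 230 V supply.
# The 37.5 mA primary current is an assumption for a small tube set;
# the real value depends on the set's total consumption.

V_MAINS = 230.0      # new supply voltage (V AC)
V_PRIMARY = 200.0    # voltage the primary was wound for (V AC)
I_PRIMARY = 0.0375   # assumed primary current in amps

v_drop = V_MAINS - V_PRIMARY        # 30 V to lose across the resistor
r_dropper = v_drop / I_PRIMARY      # Ohm's law: R = V / I
p_resistor = v_drop * I_PRIMARY     # power the resistor must dissipate

print(f"Dropper: {r_dropper:.0f} ohms, dissipating {p_resistor:.2f} W")
# -> Dropper: 800 ohms, dissipating 1.12 W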

Gone away
39 minutes ago, Dr-David-Banner said:

I found out that in the very early Forties (or maybe around 1939), the household mains supply was altered from 200 volts A.C. to 230 volts A.C.

Why was this?

Pinky and his brain

@Dr-David-Banner A transformer always changes the voltage according to the ratio between the primary and secondary windings.

So if you have a transformer with a 4:1 ratio, you'll get 1/4 of the input voltage at the output (200 V in = 50 V out). If you increase the input voltage to 230 V, the output will increase to 57.5 V.

The reason the engineers added a resistor before the primary winding is that they wanted to hold the power drawn by the transformer down to what it was designed for. The winding is made with a certain amount of wire resistance to suit the voltage it's made for, so when the supply voltage is increased, the current through the winding rises, and it will run hot and fail over time. They also wanted to make sure that whatever the transformer is powering won't get overloaded. If the transformer is a step-up transformer, say 200 V to 300 V, then increasing the mains voltage to 230 V would result in an output voltage of 345 V. That could easily fry components that weren't made for this voltage.
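To make the ratio arithmetic concrete, here's a minimal Python sketch of the ideal-transformer scaling described above (losses ignored):

# Ideal transformer: the output voltage simply tracks the input voltage,
# scaled by the turns ratio.

def secondary_voltage(v_in: float, step_down_ratio: float) -> float:
    """Output voltage for a given primary-to-secondary turns ratio."""
    return v_in / step_down_ratio

# The 4:1 step-down example from above:
print(secondary_voltage(200, 4))            # 50.0
print(secondary_voltage(230, 4))            # 57.5

# The 200 V -> 300 V step-up example (ratio 200/300, i.e. 2:3):
print(secondary_voltage(230, 200 / 300))    # ~345.0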

 

@Going home The reason they have increased the voltage over the years is that higher voltage is more efficient. What causes the loss in a power line is the current, so by sending power at a higher voltage, they can reduce the current and thereby the power loss.

The main power grid that distributes power over large distances runs on voltages all the way up to 765 kV. Those are the really big metal towers you can see carrying the big fat wires that run through most countries.

It's all about Ohm's law. :)

U = R * I          P = U * I       and so on.
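As a small worked example of why the higher voltage cuts the loss, here is an illustrative Python sketch (the power figure and line resistance are invented numbers, purely to show the squared-current effect):

# Line loss for the same delivered power at two different voltages.
# P_loss = I^2 * R is the power wasted heating the line itself.

P_DELIVERED = 10e6   # 10 MW carried by the line (illustrative)
R_LINE = 5.0         # assumed line resistance in ohms

for u in (33e3, 765e3):          # a regional voltage vs. a long-haul one
    i = P_DELIVERED / u          # current needed: I = P / U
    p_loss = i ** 2 * R_LINE     # power lost in the line
    print(f"{u / 1e3:5.0f} kV: I = {i:6.1f} A, loss = {p_loss / 1e3:7.2f} kW")

# At 33 kV the line carries ~303 A and wastes ~459 kW;
# at 765 kV it carries ~13 A and wastes under 1 kW.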

Dr-David-Banner
On 26/02/2016 at 10:59 PM, Pinky and his brain said:

@Dr-David-Banner A transformer always changes the voltage according to the ratio between the primary and secondary windings. …

As I understand it, the current in the two windings varies inversely with the voltage. Thus, where a secondary has a higher voltage than the primary, the secondary current will actually be the smaller. So there are three voltages at work: the mains input at 230 volts, the primary winding (wound for 200 volts), and finally the secondary winding, which carries the circuit load in watts.

What I do is calculate the total current for all the tubes and then convert that to watts. Whatever the watts value is becomes the secondary load, and that in turn determines the primary current. The books tell you to calculate this current against a figure of 230 volts supply and then add whatever resistance is applicable. I ended up with about 750 ohms to drop 30 volts. No problem with that as such, since I worked it out, but my curiosity was triggered.
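For anyone curious, the whole chain is short enough to sketch in Python. The tube currents and the 250 V secondary below are invented figures for illustration, not values from my set:

# From the tubes' anode currents -> secondary load in watts -> primary
# current -> dropper resistance. All figures below are illustrative.

V_SECONDARY = 250.0                            # assumed H.T. secondary voltage
V_MAINS = 230.0                                # supply voltage
V_PRIMARY = 200.0                              # voltage the primary is wound for

tube_currents = [0.012, 0.010, 0.008, 0.005]   # assumed anode currents in amps

p_secondary = V_SECONDARY * sum(tube_currents)   # secondary load in watts
i_primary = p_secondary / V_MAINS                # watts in = watts out (ideal)
r_dropper = (V_MAINS - V_PRIMARY) / i_primary    # resistance to drop the 30 V

print(f"Secondary load: {p_secondary:.2f} W")           # 8.75 W
print(f"Primary current: {i_primary * 1000:.1f} mA")    # 38.0 mA
print(f"Dropper: {r_dropper:.0f} ohms")                 # 789 ohms, same ballpark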

The supply in most systems tends to have the neutral house wire going to a centre-tapped return at the distribution supply transformer, with the hot wire being the live, alternating one. Where it gets interesting is that even the old regulations insisted upon a fully isolated primary winding and not an auto-transformer. And even more interesting for me is that the manuals indicate devices in production were plugged into a supply of 220 volts A.C., and yet the radio-set transformers were wound for 250 VAC. Maybe you can see why I got curious. I ask myself why 220 VAC at the time (1950s), and what about the voltage difference? I'm really not quite sure what the present voltage is; I think it's now about 260, although I'm not sure.

Dr-David-Banner
On 26/02/2016 at 6:20 PM, Going home said:

Why was this?

The very first house supplies were D.C.; before WW2 many homes had about 120 volts D.C. Then Tesla's A.C. caught on, and the first A.C. homes were supplied with 200 volts.

Americans still use 120 VAC, but they need thicker cables to carry the higher current.

To satisfy your curiosity, Ohm's law is really very simple. Here it is:

(1) Voltage = merely pressure or force. Compare it to drawing a longbow and arrow: the tighter the bow, the faster the arrow flies and with more force.

(2) Current = the flow of electrons in conductors. Electrons are part of an atom but carry a negative charge. At low voltages (pressure) you need thicker, bigger conductors to enable the flow, which is why 12 volt car batteries have thick cables.

(3) Resistance = anything that slows the current, such as a sudden thinning of the wire, a bad connection, the wrong-sized cable, or even a voltage of the opposite polarity. Electrons always flow towards a positive voltage. In fact, in a battery the electrons leave the negative terminal, flow through the wire, and return to the positive terminal. That's why I rather dislike the + terminal being treated as the source of the "current".

 


Pinky and his brain

@Dr-David-Banner Yes, you're correct. The side with the higher voltage will always carry the lower current. That's also why they always talk about the wattage of a transformer: the wattage is the same on both sides. In reality it's rated in VA, or volt-amperes, to avoid problems with highly inductive or capacitive loads; the VA number indicates how much power it can deliver into a purely resistive load.

The reason they make them for a higher mains voltage is that the mains voltage is never completely stable. I think the standard voltage in the UK is 240 VAC, but it can vary from 220 to 260 VAC, so any manufacturer of transformers has to take that into account.

In the rest of Europe the standard mains voltage is 230 VAC, but anything from 207 - 244 VAC is accepted as good enough.

So being a transformer manufacturer requires a lot of information about the different power networks in all the countries of the world. You can't just make one model and sell it to everyone.
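As a toy illustration of both points (the VA rating fixing the available current, and each network having its own tolerance band), here is a short Python sketch; the bands are just the figures quoted above:

# Maximum current from a transformer's VA rating, plus a check that a
# measured mains voltage sits inside a given network's tolerance band.

def max_current(va_rating: float, v_out: float) -> float:
    """Maximum secondary current into a purely resistive load."""
    return va_rating / v_out

def in_tolerance(v_measured: float, v_low: float, v_high: float) -> bool:
    """True if the measured voltage is inside the accepted band."""
    return v_low <= v_measured <= v_high

print(f"{max_current(50, 12):.2f} A")   # a 50 VA, 12 V winding: 4.17 A
print(in_tolerance(238, 220, 260))      # UK band quoted above -> True
print(in_tolerance(250, 207, 244))      # continental band -> False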


Dr-David-Banner
15 minutes ago, Pinky and his brain said:

@Dr-David-Banner Yes, you're correct. The side with the higher voltage will always carry the lower current. …

I have a battery eliminator from the Forties that I dug out. To be honest, I'm very wary of testing it at the moment, as the primary winding has a break in its casing. The uppermost turns are a bit crooked, even if it's only two or three wires. I thought of testing it with a low voltage, although it may be far wiser to just make a dropper of my own for safety.

Pinky and his brain

If the insulation is cracked, then don't use it. It's not safe any more. No need to get hurt or start a fire.


Dr-David-Banner
22 hours ago, Pinky and his brain said:

If the insulation is cracked, then don't use it. It's not safe any more. No need to get hurt or start a fire.

I had a late night, as I've been puzzling over heater circuits. It's actually kind of interesting, so I'll explain it for anyone who may be reading and curious.

The very first battery-powered tubes essentially had filaments: that is, like the filament in a regular lamp that glows and lights a room, except that the early tube filaments only glowed a bit orange and only needed 2 volts D.C.

The way the tube works is simple: The filament gets hot and emits a cloud of free electrons. There is only one 2 volt battery for the filament or heater circuit.

When the vacuum tube is connected with its filament to the negative and its anode to +100 volts D.C., all the electrons around the filament begin to flow through the tube. It conducts.

The filaments actually connect both to the 2 volt battery's negative (the ground on the chassis) and to the negative of the 100 volt H.T. (High Tension) supply.

A few people may know that later tube radios had the filaments heated indirectly by a low-voltage A.C. supply (about 5 volts). In this case the actual filament was encased, and the whole assembly was collectively called a "cathode".

I can't really stress enough how clever this idea is and how effectively it works. Unlike semiconductors, tubes conduct free electrons inside the vacuum of the tube. Imagine putting a load of iron filings on paper and then drawing a magnet towards them: you'll see the filings immediately flow towards the magnet. That's, more or less, how it works.

When radio signals hit an aerial, the voltage is very weak (a few millivolts). However, the tube uses high-voltage D.C. to amplify the captured waves.

By the late Fifties some tubes were down to about 60 volts. Even the paste 90 volt H.T. battery was only about the size of the large rectangular PP9 battery sold today, so it could fit in the case and the set could be carried on picnics.

I enclose a pic of one of the very last tube battery sets, from just before the transistor radio killed them off. The very first batteries were lead-acid, like in cars, but by the late Fifties there were paste batteries.



 

[Photo: one of the last battery-powered tube portable radios, late Fifties]

Dr-David-Banner

More technically, I mentioned I had a struggle with the filament/heater circuit of my Thirties radio.

A few days ago I was surprised to find that the positive heater pin sockets tested as a small resistance to chassis earth. I hadn't expected that. I did expect the negative heater pin sockets to show zero ohms to earth (chassis), but not the positive pin sockets. In fact, two of the positive sockets didn't show continuity but the other two did (there are four tubes).

I went through it all with a fine-toothed comb. I traced all the pins (anode, grid, screens, heater pins). I found no errors, and any small short circuits elsewhere had been eradicated.

Then I reasoned that the filaments really are, in essence, a 2 volt short circuit: just positive meeting negative across a thin wire, so that it gets hot and glows. So did polarity matter? After all, you can use A.C. to heat a filament, which is why ceiling lights go straight to the A.C. supply.

Last night I did a test I found in my 1930s circuit book. I just identified the triode filaments and put a small bulb and battery across the two. The bulb lit up, which presumably means my triode's filament is O.K.

However, as it stands, I'm still not quite easy about my heater circuit, and the fuse bulb in it doesn't light up. I've been using a 1.5 volt battery, but the correct value is 2 volts, so I need a 2 volt battery or to make one up.

One thing I do know is you must make sure the filaments are "on" before connecting the full 100 volts.
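For anyone trying a similar continuity check, here is a rough Python sketch of what to expect on the meter. The 2 V / 0.1 A rating is an assumption for a typical battery triode, and the cold-resistance factor is only a rule of thumb:

# Rough expectation for a filament continuity check with an ohmmeter.
# Assumed rating: 2 V, 0.1 A (plausible for an old battery triode).

V_FIL = 2.0    # rated filament voltage
I_FIL = 0.1    # assumed rated filament current in amps

r_hot = V_FIL / I_FIL    # resistance at operating temperature: 20 ohms
r_cold = r_hot / 10      # a cold filament reads far lower, very roughly a tenth

print(f"Hot: ~{r_hot:.0f} ohms; expect only ~{r_cold:.0f} ohms on the meter")
# An open-circuit reading (infinite ohms) means the filament has gone.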

Some weeks ago I read a story of an American who tried to get this particular radio working, and it caught fire. He was pretty upset. It can happen, so I figure I'll just take my time with it and not get cocky.

