
Antennas & Transmission Lines (feedlines)


poynt99:
Seems there have been many debates in the past amongst RF engineers, and Ham and CB operators about "tuning" the antenna system by cutting the coax (i.e. transmission line or feedline) to a specific length, usually 1/2 wavelength or multiple thereof, or by trial and error to minimize VSWR at the transceiver. Well, seems I've started such a debate at my job.

Given a 50 Ohm generator, 50 Ohm coax, but 150 Ohm load, there will be a 3:1 VSWR. But the VSWR measurement from the generator end will vary, depending on the length of the coax. This part is easy to prove by measurement, so all agree.

The contention, however, is this: will the length of the coax (same scenario) affect the performance of the antenna, i.e. will changing the coax length change the total radiated power of the antenna?

My belief is that it will not. The only part that really changes is what the generator sees, due to the reflections. Changing the coax length only changes the measurement point on the standing wave, and that is why it varies.

I have tested my hypothesis using a simulator called "bounce", and it seems to validate my belief. It shows that the dissipated power remains constant no matter the length of the coax, and I am assuming this also translates to a constant radiated power as well.
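A quick sketch of that check (this is not the "bounce" simulator itself; it assumes a lossless line and a hypothetical 1 V, 50 ohm generator): with the source matched to the line, the input impedance swings around with coax length, but the power delivered into the line, and hence to the load, does not.

```python
import math

Z0 = 50.0   # coax characteristic impedance (ohms)
ZL = 150.0  # antenna load (ohms) -> |Gamma| = 0.5, i.e. 3:1 VSWR
ZG = 50.0   # generator source impedance (ohms)
VG = 1.0    # generator open-circuit voltage (volts), assumed for illustration

def z_in(length_wl):
    """Lossless-line input impedance for an electrical length in wavelengths."""
    t = complex(0.0, math.tan(2 * math.pi * length_wl))
    return Z0 * (ZL + Z0 * t) / (Z0 + ZL * t)

def power_to_load(length_wl):
    """Real power entering the line; with no loss, it all reaches the load."""
    zin = z_in(length_wl)
    i = VG / (ZG + zin)              # generator loop current
    return 0.5 * abs(i) ** 2 * zin.real

for l in (0.10, 0.25, 0.50, 0.73):
    print(f"l = {l:.2f} wl  Zin = {z_in(l):.1f}  P = {power_to_load(l) * 1e3:.4f} mW")
```

The input impedance (and so anything measured at the radio end) changes with every length, while the printed power stays put.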

What are your thoughts guys?

TinMan:
I was once heavily into CB radios, and still have all my equipment, e.g. SWR meters and auto tuners.

The length and type of coax will have different dB losses. Say you're using a cheap coax that has a 3 dB loss every 25 meters of length, and you change this coax for an expensive coax that has only a 0.1 dB loss every 25 meters. You have gained nearly 3 dB of signal strength to your antenna. The big problem is that the receiving station and your receiver won't really see that gain; you need an increase of around 5 to 6 dB to notice any difference.
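As a quick sanity check on those figures (the loss numbers above are illustrative, and this assumes matched-line loss with no VSWR), the fraction of power surviving a run of coax works out like this:

```python
def surviving_fraction(loss_db_per_25m, length_m):
    """Fraction of input power that reaches the far end of a matched line."""
    total_db = loss_db_per_25m * (length_m / 25.0)
    return 10 ** (-total_db / 10.0)

print(surviving_fraction(3.0, 25.0))   # cheap coax: about half the power is lost
print(surviving_fraction(0.1, 25.0))   # good coax: roughly 98% gets through
```

So 3 dB really is half your power gone as heat before the antenna ever sees it.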

In saying that, I would have to say (although it is negligible in most cases) there is a loss as you increase the length of the coax. It also makes sense in that by increasing the length of a wire, you also increase the resistance. If you increase resistance, you increase the amount of energy dissipated across that resistance out of what is available from the source.

But then we have absorbed energy, reflected energy and re-reflected energy flowing through the coax :-\
Mmm, well, I'm not really sure now that I think about it, poynt, but I do know there is a definite dB loss over a given length of coax; whether that loss is worth worrying about is another story.


End thought.
Current flows forward (toward the antenna), then the absorbed energy is transmitted out from the antenna as electromagnetic waves. What is not radiated is returned back through the coax (reflected energy). This is then re-reflected at the tuner and sent back into the coax, added to the transmitter's energy output. It is said that because of this re-reflection, 100% of the energy that leaves the transmitter will be radiated out of the antenna, minus the losses in the coax.

Chet K:
I used to dabble in 11 meters, and I always felt that what Poynt is investigating
was important: that the coax length be a multiple of a quarter of the wavelength
[at the very least].

I had line-of-sight stations that I would tune to and compare. I would imagine real-time
testing of a faint signal would make quick work of this line of thought.



Centraflow:

--- Quote from: poynt99 on 2015-03-14, 02:31:27 ---The contention however is this; Will the length of the coax (same scenario) affect the performance of the antenna, i.e. will changing the coax length, change the total radiated power of the antenna?

--- End quote ---

Hi Poynt

There are two points to consider, one is the transmitter to transmission line, and the other is transmission line to antenna.

Though in practice an ATU is normally placed at the transmitter end, so that the transmitter sees a good match and is happy and does not get damaged by high reflected current, it does not solve a mismatch of line to antenna.

If the antenna has a good impedance (such as 50 ohms), any mismatch will happen at that end due to the feed line. Feed lines should always be multiples of 1/2 wavelength so as to make sure there are no standing waves generated by the feed line itself. Lengths other than 1/2 wavelength can be used to bring a poor line impedance into the realm of where it should be, e.g. 50 ohms, but that should only be done when the transmitting cable is of poor quality (not exactly 50 ohms). By today's standards these transmission cables are very good, so the length should be multiples of 1/2 wavelength.
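The half-wave "repeater" property being described here can be checked numerically: a lossless line that is a multiple of λ/2 long presents the load impedance unchanged at its input, whatever its own characteristic impedance is (sketch, with the 150 ohm antenna from the example):

```python
import math

def z_in(Z0, ZL, length_wl):
    """Lossless-line input impedance for an electrical length in wavelengths."""
    t = complex(0.0, math.tan(2 * math.pi * length_wl))
    return Z0 * (ZL + Z0 * t) / (Z0 + ZL * t)

ZL = complex(150, 0)          # the mismatched antenna from the example
print(z_in(50, ZL, 0.5))      # half-wave line: input impedance equals ZL
print(z_in(50, ZL, 0.125))    # any other length transforms the impedance
```

Note this repeats the mismatch faithfully to the radio end; it does not remove the 3:1 VSWR on the line.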

All this being said, your scenario is an antenna mismatch of 150 ohms with all else being good, so an ATU should be placed at the antenna end and not the transmitter end. A simple coil and capacitor bridge would solve the problem if the frequency of transmission is fixed.
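For a fixed frequency, that simple coil-and-capacitor bridge can be sized with the standard L-network equations. The 27 MHz figure below is just an assumed example (the CB band), and the shunt-C/series-L arrangement is one common realisation among several:

```python
import math

f = 27e6                      # operating frequency in Hz (assumed for illustration)
R_hi, R_lo = 150.0, 50.0      # antenna side and coax side resistances (ohms)

Q = math.sqrt(R_hi / R_lo - 1)        # loaded Q of the L-section
Xp = R_hi / Q                         # shunt reactance across the antenna (high-Z side)
Xs = R_lo * Q                         # series reactance toward the coax (low-Z side)

# Shunt capacitor on the high-Z side, series inductor on the low-Z side:
C = 1 / (2 * math.pi * f * Xp)
L = Xs / (2 * math.pi * f)
print(f"shunt C = {C * 1e12:.0f} pF, series L = {L * 1e9:.0f} nH")
```

Values in the tens of picofarads and hundreds of nanohenries, i.e. an easy air-wound coil and a small trimmer at that frequency.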

Half wave lengths of feed cable must be used, it is obvious that if this was not so, at the point of the connection to the antenna with say a 5/8 wave termination, there will be a mismatch creating a standing wave at that point.

I hope I have clarified the problem :)

regards

Mike 8)

poynt99:
Thanks guys.

Yeah, resistive losses are known, and a secondary effect that is a reality of life. But that is not really what I'm getting at.

My test using the bounce program shows that standing waves are built up on the coax no matter what length is used. I do agree however that the system can be separated somewhat into the transceiver/feedline and feedline/antenna. In our systems, we do not use ATUs at all.

My findings indicate that the only value added by using specific 1/2 wavelengths of coax is to allow for an accurate measurement of the antenna VSWR back at the radio end. That's all. Coax length seems to have absolutely no effect (taking into account known losses) on forward and reflected power, the ratio of the two, or how much power ultimately radiates from the antenna.

"Tuning" the coax to minimize VSWR is a myth in my opinion; it only masks the real problem, which is the antenna mismatch. Yes, it can minimize the effects of the reflected power at the radio, but it has no effect on the antenna itself.
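That masking effect can be put in numbers: on a lossy line the reflection is attenuated twice (out and back), so the meter at the radio reads a better VSWR the longer or lossier the coax, even though nothing at the antenna has improved. A sketch with assumed loss figures:

```python
gamma_load = 0.5   # |Gamma| for a 150 ohm antenna on 50 ohm coax (3:1 VSWR)

def vswr_at_radio(one_way_loss_db):
    """VSWR the meter reads at the radio end of a lossy line."""
    g = gamma_load * 10 ** (-one_way_loss_db / 10)  # reflection attenuated both ways
    return (1 + g) / (1 - g)

for db in (0.0, 1.0, 3.0):
    print(f"{db:.1f} dB one-way loss -> meter reads VSWR {vswr_at_radio(db):.2f}")
```

With 3 dB of one-way loss the meter shows roughly 1.7:1 on a true 3:1 mismatch, while half the power never reaches the antenna at all.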

As Chet pointed out, I think I will have to set up an experiment whereby I use a spectrum analyzer to monitor the signal strength of the carrier being radiated through a slightly mismatched antenna some distance away. Take three different measurements with three different lengths of feedline, and see if the received signal strength changes at all.

.99
