
Re: in the centrum of the payla



Okay, since it was started already... It's rather off topic.

>On the line to the Internet, only one bit can be transferred at a time.

>From your endpoint, yes.

>When you upgrade from 64k to 128k, it doesn't mean that two bits will
>now travel together, side by side in parallel.

Surprise: this is not necessarily true. At times, the carrier may
(and does) bond two 64Kb/s circuits together to give you one 128Kb/s
link. Admittedly this is not too common at low speeds, but at higher
speeds it's common practice.

>It means that the bits will travel faster to the next hop, so the line
>will carry more information in the same time.

Only in a very simplistic view.

Forget cables for a moment. Let's think of roads and cars.

Assume a road that connects point A and point B.

Define latency as the time it takes you to drive from A to B (and back,
if you wish - doesn't really matter).

Define bandwidth as the number of cars that can get from A to B in (say)
one hour.

For the one road we have assumed above, let the latency be L and the
bandwidth be W.

Now, add another road from A to B, with the same physical dimensions
(width and length).

If we further assume equal distribution of cars on both roads, we see
that the latency remains L, while bandwidth is now W*2.
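The two-road case can be sketched in a few lines (the numbers are made
up, purely for illustration):

```python
# Toy model of the road analogy: latency is the travel time of one car
# from A to B, bandwidth is how many cars arrive at B per hour.

def link(latency_hours, cars_per_hour, parallel_roads=1):
    """Return (latency, bandwidth) for `parallel_roads` identical roads."""
    # Adding identical roads multiplies bandwidth, but each car still
    # drives the same distance at the same speed, so latency is unchanged.
    return latency_hours, cars_per_hour * parallel_roads

L, W = link(1.0, 100)                        # one road
L2, W2 = link(1.0, 100, parallel_roads=2)    # add a second road

assert L2 == L        # latency stays L
assert W2 == 2 * W    # bandwidth becomes W*2
```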

I guess your immediate response will be "yes, but this assumes that bits
are sent in parallel, rather than serially", which was your original point.

Okay, so in addition to my first point (parallel does happen, a lot):

Assume a second (alternative) road from A to B. This road is half as
wide as the first, but it is so modern that you can drive at double
the speed.

For this road, you get latency L/2, while bandwidth remains W.
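The second alternative can be sketched the same way (again with invented
numbers): halving the width halves the cars per hour at a given speed,
while doubling the speed both halves the travel time and doubles the
cars per hour, so the two effects on bandwidth cancel:

```python
def fast_narrow_road(base_latency, base_bandwidth):
    # Half as wide: half the lanes, so half the cars per hour at any speed.
    # Double the speed: each car arrives in half the time, and twice as
    # many cars pass a given point per hour.
    latency = base_latency / 2
    bandwidth = (base_bandwidth / 2) * 2   # halved by width, doubled by speed
    return latency, bandwidth

lat, bw = fast_narrow_road(1.0, 100)
assert lat == 0.5    # L/2
assert bw == 100     # W unchanged
```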

Going back to digital information vocabulary: when you buy a 128Kb/s
line, it means that on *average*, 128K bits will be able to pass from
one end to the other in one second. It says nothing about how fast a
*single* bit will make it - that can even be 5 seconds - only about
the *rate*.
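Put as a formula (a sketch, with made-up numbers): the time to deliver a
message is the one-way latency plus the serialization time at the line's
rate, and the rate term says nothing about the latency term:

```python
def transfer_time(bits, rate_bps, latency_s):
    """Seconds to deliver `bits` over a line with the given rate and latency."""
    return latency_s + bits / rate_bps

# A 128Kb/s line with 5 seconds of latency is still a 128Kb/s line:
# the rate term is the same in both cases, only the latency term differs.
slow_latency = transfer_time(128_000, 128_000, 5.0)    # 6.0 s total
low_latency  = transfer_time(128_000, 128_000, 0.05)   # 1.05 s total
```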

The bottom line: bandwidth is one measure. Latency is another.
They say different things about a communication line. Both are
important (and they are not the only two parameters!).

I guess further discussion on this should not be on the Linux list...
Sorry for being off topic, I thought others would be interested too.

Doron Shikmoni



>So if a bit travels faster, it will make latency much lower (without
>bringing traffic or computer latency into the picture)
>
>Doron Shikmoni wrote:
>>
>> Actually, Asher is right. Bandwidth and latency are two very distinct
>> quantities, which are not related. Academically, I can create a 2Gb/s
>> line with a latency of half an hour. It is true that both measures have
>> their impact on the total "experience" of a network user, but they are
>> technically unrelated.
>>
>> Latency being extended on a loaded line is a phenomenon of queueing.
>> The equipment on the ends of a link usually queues data before transmission.
>> A packet may stay in the queue when the line is loaded. This may extend
>> the latency you see. However this does not have anything to do with the
>> actual bandwidth - only with the bandwidth *utilization*, and in any case,
>> has nothing to do with the link itself.
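The queueing effect described in that quoted paragraph can be sketched
too (hypothetical packet and line sizes): a packet arriving at a loaded
transmitter waits for everything ahead of it to drain, which adds delay
even though the link's own latency never changes.

```python
# Toy queueing model: a new packet must wait while `queue_depth` packets
# of the same size are serialized ahead of it. The extra delay grows with
# load; the link's propagation latency (not modeled here) stays constant.

def queueing_delay(packet_bits, rate_bps, queue_depth):
    """Seconds a new packet waits for the queue ahead of it to drain."""
    serialization = packet_bits / rate_bps   # time to transmit one packet
    return queue_depth * serialization

rate = 64_000    # 64Kb/s link
pkt = 12_000     # 1500-byte packet

print(queueing_delay(pkt, rate, 0))    # empty queue: 0.0 s of extra delay
print(queueing_delay(pkt, rate, 10))   # loaded queue: 1.875 s of extra delay
```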
>>
>> Doron Shikmoni
>