Free Republic

To: SpaceBar

I think this quote cuts through the hype:

“In a situation with zero losses, there was little if any benefit, but loss-free wireless scenarios are rare.”

Loss-free scenarios are not rare. They are entirely predictable, and they just require higher signal strength.

Error correction codes are already part of the data link layer.
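For concreteness, here is a minimal sketch of the kind of single-error-correcting block code that link layers already build in - a Hamming(7,4) code in Python. It is my own illustration, not anything from the article:

    # Hamming(7,4): 4 data bits, 3 parity bits, corrects any single flipped bit.
    def hamming74_encode(d1, d2, d3, d4):
        p1 = d1 ^ d2 ^ d4
        p2 = d1 ^ d3 ^ d4
        p3 = d2 ^ d3 ^ d4
        return [p1, p2, d1, p3, d2, d3, d4]        # codeword positions 1..7

    def hamming74_decode(c):
        c = list(c)
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]             # parity check over positions 1,3,5,7
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]             # parity check over positions 2,3,6,7
        s3 = c[3] ^ c[4] ^ c[5] ^ c[6]             # parity check over positions 4,5,6,7
        pos = s1 + 2 * s2 + 4 * s3                 # syndrome = error position (0 = clean)
        if pos:
            c[pos - 1] ^= 1                        # flip the corrupted bit back
        return [c[2], c[4], c[5], c[6]]            # recovered data bits

    codeword = hamming74_encode(1, 0, 1, 1)
    codeword[5] ^= 1                               # channel flips one bit
    assert hamming74_decode(codeword) == [1, 0, 1, 1]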

What they imply (an order-of-magnitude increase) is a violation of the Shannon-Hartley capacity limit.
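For scale, that hard limit is C = B * log2(1 + S/N): at a fixed bandwidth and signal-to-noise ratio, no coding scheme can push past it. A quick back-of-the-envelope in Python, with purely illustrative numbers:

    from math import log2

    def capacity_mbps(bandwidth_hz, snr_db):
        # Shannon-Hartley: C = B * log2(1 + S/N), with S/N as a linear ratio.
        snr = 10 ** (snr_db / 10)
        return bandwidth_hz * log2(1 + snr) / 1e6

    for snr_db in (3, 10, 20):
        print(f"20 MHz channel at {snr_db:2d} dB S/N: {capacity_mbps(20e6, snr_db):6.1f} Mbit/s")

    # A 10x capacity jump at fixed bandwidth would require (1 + S/N) to be
    # raised to the 10th power - not something a smarter code can deliver.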

What they are doing will not solve plain old congestion problems.

I suspect that they are using the normal packet-transmission overhead in a different way, one that includes error-correction elements. I could see some level of improvement being possible - mainly by managing flow control differently.
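A hypothetical sketch of what "using the overhead differently" could look like: one XOR parity packet appended to each small group of data packets, letting the receiver rebuild a single lost packet without a retransmission. The group size and names are my own illustration, not the scheme in the article:

    from functools import reduce

    def xor_packets(packets):
        # Bytewise XOR across equal-length packets.
        return bytes(reduce(lambda a, b: a ^ b, column) for column in zip(*packets))

    def encode_group(data_packets):
        # Ship the group plus one parity packet built from it (the "overhead").
        return data_packets + [xor_packets(data_packets)]

    group = [b"pkt0", b"pkt1", b"pkt2", b"pkt3"]
    coded = encode_group(group)

    survivors = coded[:2] + coded[3:]            # suppose packet 2 is lost on the air
    assert xor_packets(survivors) == b"pkt2"     # the parity packet fills the hole

The price is one extra packet per group plus a little buffering delay, which is where handling flow control differently would come in.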

Here is the fundamental theory issue: typical digital networks perform consistently and then “fall off a brick wall.” If you can stave off that brick wall with a bit of low-overhead error correction, you might be able to measure a significant increase in performance (10x, even) right at the signal ‘brick wall’.
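A back-of-the-envelope illustration of that brick-wall point, using a single-parity FEC group (9 data packets plus 1 parity) fed into the standard Mathis approximation for TCP throughput, rate ~ MSS / (RTT * sqrt(2p/3)). The numbers are purely illustrative, not measurements of the scheme being reported:

    from math import sqrt

    def residual_loss(p, k=9):
        # With one parity packet per k data packets, a packet stays lost only
        # if some other packet in its group is also lost.
        return p * (1.0 - (1.0 - p) ** k)

    def tcp_rate(p, mss=1460.0, rtt=0.05):
        # Mathis et al. approximation, bytes/second (ignores timeouts, window caps).
        return mss / (rtt * sqrt(2.0 * p / 3.0))

    for p in (0.001, 0.01, 0.05):
        gain = tcp_rate(residual_loss(p)) / tcp_rate(p)
        print(f"raw loss {p:5.3f}: roughly {gain:4.1f}x TCP throughput with the parity packet")

At a fraction of a percent raw loss the sketch shows roughly an order-of-magnitude gain; once loss gets heavy, the single parity packet buys much less - the brick wall again.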

This may slightly increase cell coverage for a specific (fringe) link scenario, but it will not increase designed capacity or any other meaningful measure of a well-designed network - per their quote at the top of this post.


35 posted on 10/24/2012 5:07:56 AM PDT by RFEngineer
[ Post Reply | Private Reply | To 32 | View Replies ]


To: RFEngineer

I was thinking of something like a hybrid FEC scheme with an ACK/NACK fallback, or even variable code robustness similar to the latest versions of PACTOR, which estimate the channel S/N and adjust the coding accordingly.
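A hypothetical sketch of that kind of hybrid scheme: pick an FEC code rate from the estimated channel S/N (as adaptive modes like PACTOR do), try the coded transmission first, and fall back to a NACK-driven retransmit at a more robust rate only if decoding fails. The thresholds and rate table are invented for illustration:

    RATE_TABLE = [              # (minimum S/N in dB, FEC code rate)
        (20.0, 7 / 8),
        (10.0, 3 / 4),
        (3.0, 1 / 2),
        (float("-inf"), 1 / 4),
    ]

    def pick_code_rate(snr_db):
        # Light coding on clean channels, heavy redundancy on poor ones.
        for threshold, rate in RATE_TABLE:
            if snr_db >= threshold:
                return rate

    def send_block(block, snr_db, try_decode, max_retries=3):
        rate = pick_code_rate(snr_db)
        for attempt in range(1 + max_retries):
            if try_decode(block, rate):            # ACK path: FEC alone was enough
                return rate, attempt
            rate = max(rate / 2, 1 / 4)            # NACK path: retry more robustly
        raise RuntimeError("link failed after retries")

    # Toy channel: a decode succeeds whenever the code rate is 1/2 or lower.
    rate, tries = send_block(b"payload", snr_db=12.0, try_decode=lambda b, r: r <= 0.5)
    print(rate, tries)                             # 0.375 1: one NACK, then success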


36 posted on 10/24/2012 5:21:52 AM PDT by SpaceBar
[ Post Reply | Private Reply | To 35 | View Replies ]
