////// Source: http://hasd.tistory.com/entry/Why-use-BCH-but-not-RS-as-out-code-in-DVB-S2 //////
If they selected BCH, they must have assessed that it is sufficient to
handle the residual errors after LDPC decoding. It might become clearer
if you can find out the error-correcting capability of the BCH code used
in DVB-S2 and the error-distribution characteristics after LDPC decoding.
With that, you may be able to deduce whether the outer code satisfies the
BER/FER required by the DVB-S2 specification.
kc
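The assessment kc suggests can be sketched numerically. Assuming, purely for illustration, i.i.d. residual bit errors at rate p after LDPC decoding (real residual errors are bursty, so this is a simplification) and a t-error-correcting BCH code over an n-bit block, the outer-code frame error rate is a binomial tail probability. The parameters n = 32400 and t = 12 below are example values, not taken from the thread:

```python
from math import lgamma, log, exp

def bch_frame_error_rate(n: int, t: int, p: float) -> float:
    """P(more than t bit errors in an n-bit block), i.i.d. errors at rate p.

    A t-error-correcting binary BCH code fails when the block holds more
    than t errors, so this tail probability is the outer-code FER.
    Terms are summed in log space because the tail can be far too small
    to compute as 1 - P(X <= t) in floating point.
    """
    def log_binom_pmf(i: int) -> float:
        log_comb = lgamma(n + 1) - lgamma(i + 1) - lgamma(n - i + 1)
        return log_comb + i * log(p) + (n - i) * log(1.0 - p)

    tail = 0.0
    for i in range(t + 1, n + 1):
        term = exp(log_binom_pmf(i))
        tail += term
        if term < tail * 1e-18:  # remaining terms are negligible
            break
    return tail

# Hypothetical example: 32400-bit block, t = 12, residual BER of 1e-6.
fer = bch_frame_error_rate(n=32400, t=12, p=1e-6)
```

Under this i.i.d. assumption the FER drops very steeply with t, which is one way to sanity-check whether a given BCH strength covers the residual error rate a simulation produces.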
"Eric Jacobsen" <e...@ieee.org> wrote in message
news:4...@news.west.cox.net...
> On 24 Apr 2005 01:31:32 -0700, "Davy" <z...@gmail.com> wrote:
>
>>Hello all,
>>
>>I am simulating the ECC code in DVB-S2. I found that the errors after
>>LDPC decoding are mostly burst errors, and papers say that Reed-Solomon
>>handles burst errors better than BCH. So why use BCH? Is a BCH decoder
>>simpler than an RS decoder? Any ideas will be appreciated.
>>
>>Best Regards,
>>Davy
>
>
> Binary BCH decoders are much simpler to implement than typical RS
> decoders. The LDPC in DVB-S2 is already quite complex, so the outer
> code needs to be simple.
>
> I'm just speculating, but I'd suppose that's why they selected the
> BCH.
>
>
> Eric Jacobsen
> Minister of Algorithms, Intel Corp.
> My opinions may not be Intel's opinions.
> http://www.ericjacobsen.org
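For what it's worth, the burst-error argument Davy mentions comes from the fact that RS works on m-bit symbols: a contiguous burst of b bit errors touches at most ceil(b/m) + 1 symbols, so one symbol correction can absorb up to m flipped bits, whereas a binary code pays for every flipped bit individually. A toy illustration with 8-bit symbols (this is just a counting helper, not a decoder):

```python
def symbols_hit(burst_start: int, burst_len: int, m: int = 8) -> int:
    """Number of m-bit symbols touched by a contiguous burst of bit errors."""
    first = burst_start // m              # symbol containing the first bad bit
    last = (burst_start + burst_len - 1) // m  # symbol containing the last one
    return last - first + 1

# A 16-bit burst is 16 errors for a binary BCH code, but it lands in
# at most 3 eight-bit RS symbols no matter where it starts.
worst = max(symbols_hit(s, 16) for s in range(64))
```

This is why, if the residual errors really are long bursts, an RS outer code would be attractive; the thread's point is that the decoder-complexity trade-off still favored binary BCH in DVB-S2.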