Date: Jun 23, 2000
I have just started to use an XS40 board. I will mostly be using it for analog signal interfacing. I have developed an algorithm and would like to implement it on the FPGA. To start, I have run the loopback (codec demo) successfully.
The demo documentation mentions that the codec chip gives about 20 bits of accuracy. My question is: how can I test whether it really delivers 20-bit accuracy? I would also like to know if it is possible to change the accuracy, e.g. to 12 bits, for other applications. If anybody has a demo for 12-bit (or any other) accuracy, I would appreciate the help.
You should be able to change the resolution by changing the width declaration in the codec interface. Note that this only changes the length of the shift registers that load and unload the codec; the codec itself always works with twenty bits of resolution.
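The effect of narrowing the interface can be sketched in software. This is only a behavioral model, not the actual HDL: the assumption (consistent with the shift-register description above) is that a 12-bit interface keeps the 12 most-significant bits of each 20-bit sample and discards the rest.

```python
def truncate_sample(sample_20bit: int, width: int = 12) -> int:
    """Model a narrowed codec interface: keep the top `width` bits
    of a 20-bit sample, as a shortened shift register would."""
    return sample_20bit >> (20 - width)

# A full-scale positive 20-bit sample (2**19 - 1) becomes the
# full-scale positive 12-bit value (2**11 - 1).
print(truncate_sample(2**19 - 1, 12))
```

So the data path gets narrower, but no new converter mode is selected; the codec still converts at 20 bits internally.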
As to whether you really get 20 bits of resolution from the codec on the XStend board, the answer is probably not. Twenty bits implies a minimum step size of a few microvolts, and that is easily swamped by noise on the board. The board uses separate analog power and ground planes for the codec, but you really need more shielding and improved interface electronics (e.g., don't use LM386 output amplifiers) to get the full precision out of the codec.
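One practical way to test the accuracy question from the original post: ground the codec input, capture a block of samples, and measure the RMS noise; the effective number of bits (ENOB) follows from the standard quantization-noise relation. A quick sketch, assuming a 4 V peak-to-peak input range (check your codec's datasheet for the real value):

```python
import math

FULL_SCALE_VPP = 4.0   # assumed input range; verify against the codec datasheet
BITS = 20

# Ideal step size of a 20-bit converter over this range: a few microvolts.
lsb = FULL_SCALE_VPP / 2**BITS

def enob(noise_rms_volts: float) -> float:
    """Effective bits, given RMS noise measured on a grounded input.
    Uses the ideal relation noise_rms = (FS / 2**N) / sqrt(12)."""
    return math.log2(FULL_SCALE_VPP / (noise_rms_volts * math.sqrt(12)))

print(f"ideal LSB = {lsb * 1e6:.2f} uV")
```

If the measured noise floor is, say, tens of microvolts RMS, the ENOB comes out well below 20, which is what you'd expect on an unshielded board.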