My understanding is that the ADC reference voltage is 1.1V and that different input ranges can be achieved via software-settable "attenuation" settings.
My question is about the accuracy and tolerance of the reference itself. Is it 5%? 1%? And how much does it drift over temperature?
Thanks!
ADC reference voltage accuracy / tolerance
Re: ADC reference voltage accuracy / tolerance
It is not documented anywhere (I couldn't find it), but the reference voltage source should be a bandgap reference with a temperature coefficient of no more than 50 ppm/°C to be usable...