I have posted a reply to a similar question in an adjacent topic: http://www.matrixmultimedia.com/mmforums/viewtopic.php?f=28&t=7256&p=18122#p18122
The resolution of the MIAC ADCs should not be affected by the input circuitry, but the accuracy of the voltage measurement will be limited by the 1% tolerance of the resistors used in the input dividers.
In general, an accuracy of approximately 1% can be expected, allowing a voltage such as 10V to be displayed to one decimal place (10.0V). This makes full use of the 8-bit ADC conversion, which has been set up to convert 12V to a value of 240 (50mV per step).
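As an illustration, an 8-bit reading can be scaled to a displayable voltage using the 50mV-per-step figure above. This is only a sketch, not MIAC library code: adc_reading is a placeholder for whatever value your program reads from the input, and printf stands in for whatever display routine you are using.

#include <stdio.h>

/* Sketch: scale an 8-bit MIAC ADC reading to millivolts,
   assuming 50 mV per step as described above (12 V -> 240 counts).
   'adc_reading' is a placeholder for the raw input value. */
int main(void)
{
    unsigned char adc_reading = 200;              /* example raw value */
    unsigned int millivolts = adc_reading * 50u;  /* 50 mV per step */

    /* Display to one decimal place: 200 steps * 50 mV -> "10.0 V" */
    printf("%u.%u V\n", millivolts / 1000u, (millivolts % 1000u) / 100u);
    return 0;
}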
Sampling in 10-bit mode will improve the resolution (precision) of the measurement, but will not improve the absolute accuracy beyond 1%. 10-bit mode is more useful for tracking changes in voltage than for measuring absolute voltage.
The MIAC input circuit produces an ADC conversion value of 960 when 12V is applied - 12.5mV per step (Medelec, you were right!). The 1% tolerance of the input resistors, coupled with the potential for a small offset voltage generated by the op-amps, could lead to errors when measuring very low voltages.
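For comparison, the same kind of sketch for 10-bit mode, using the 12.5mV-per-step figure above (960 counts at 12V). Again adc_reading is a placeholder; multiplying by 25 and dividing by 2 keeps the conversion in integer arithmetic, which is usually preferable on a small micro.

#include <stdio.h>

/* Sketch: scale a 10-bit MIAC ADC reading to millivolts,
   assuming 12.5 mV per step as described above (12 V -> 960 counts).
   reading * 25 / 2 avoids floating-point arithmetic. */
int main(void)
{
    unsigned int adc_reading = 960;  /* example raw value */
    unsigned long millivolts = (unsigned long)adc_reading * 25u / 2u;

    /* 960 steps * 12.5 mV -> "12.0 V" */
    printf("%lu.%lu V\n", millivolts / 1000u, (millivolts % 1000u) / 100u);
    return 0;
}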
These are common instrumentation issues, covered in many control engineering texts and ADC data sheets.