L. Richard Carley, STMicroelectronics Professor of Engineering
Carnegie Mellon University
In the early days of MOS integrated circuits (way back in the 1970s), getting any kind of accurate analog-to-digital converter (ADC) to work was very challenging. Designers had only depletion-mode and enhancement-mode NMOS transistors to work with, and achieving even modest analog voltage gain was quite difficult. The successive approximation ADC, which requires only a comparator rather than a high-gain linear amplifier, took off as a dominant ADC architecture in the early days of MOS. Then, as CMOS took hold in the 1980s, many other ADC architectures became practical, and successive approximation became just one of many possible choices.
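The successive approximation principle itself is just a binary search: the logic proposes one bit at a time, starting from the MSB, a DAC converts the trial code to a voltage, and a single comparator decides whether to keep the bit. A minimal, idealized Python model (function name, parameters, and the ideal-DAC assumption are mine, for illustration only) might look like this:

```python
def sar_adc(vin, vref, bits=8):
    """Idealized model of an N-bit successive approximation ADC.

    Each cycle the SAR logic sets one trial bit (MSB first), an ideal
    DAC produces the corresponding trial voltage, and a comparator
    decides whether that bit is kept. No amplifier gain is needed --
    only a single comparator decision per bit.
    """
    code = 0
    for i in range(bits - 1, -1, -1):
        trial = code | (1 << i)            # propose the next bit
        vdac = vref * trial / (1 << bits)  # ideal DAC output for trial code
        if vin >= vdac:                    # comparator decision
            code = trial                   # keep the bit
    return code

# A mid-scale input resolves to the MSB-only code after 8 comparisons:
print(sar_adc(0.5, 1.0, bits=8))   # -> 128
```

Note that an N-bit conversion takes exactly N comparator decisions, which is why the architecture suited early NMOS processes: all of the precision lives in the DAC and the comparison threshold, none in amplifier gain.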