I'm simulating a simple analog-input, digital-output feedback loop. The input is a vector of 1024 scalars representing a sampled analog signal. It should be rounded to simulate quantization error, then the one-sample-delayed output signal is subtracted. The result of the subtraction is integrated. The integrator output is the system output and is fed back, forming a first-order feedback loop with a gain of 1. Any ideas about the best way to handle this?

...Larry
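
A minimal sketch of the loop exactly as described above, assuming plain Python with NumPy; the function name `feedback_loop` and the sine-wave test signal are just placeholders, not anything from the original post:

```python
import numpy as np

def feedback_loop(x):
    """First-order feedback loop: round the input to simulate quantization,
    subtract the one-sample-delayed output, integrate, and feed the
    integrator output back with unity gain."""
    y = np.zeros(len(x))
    acc = 0.0        # integrator state
    prev_y = 0.0     # one-sample-delayed output
    for n, sample in enumerate(x):
        q = round(sample)        # quantization (rounding) of the input sample
        acc += q - prev_y        # subtract delayed output, then integrate
        y[n] = acc               # integrator output is the system output
        prev_y = y[n]            # fed back on the next sample (gain of 1)
    return y

# Hypothetical test: a 1024-sample sinusoid standing in for the sampled analog input
t = np.arange(1024)
x = 5.0 * np.sin(2 * np.pi * t / 128.0)
y = feedback_loop(x)
```

The per-sample loop with explicit `acc` and `prev_y` state is one straightforward way to express the unit delay in the feedback path; a vectorized form would need something equivalent to that recursive state, since each output sample depends on the previous one.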