I'm simulating a simple analog-input, digital-output feedback loop. The input is a vector of 1024 scalars representing a sampled analog signal. The one-sample-delayed output is subtracted from it, the difference is rounded to simulate quantization error, and the result is integrated once or twice. The integrator output is the system output, which is rounded and fed back, forming a first-order feedback loop with a gain of 1. Any ideas about the best way to handle this? ...Larry
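P.S. Here's a minimal sketch of the loop as I've described it, in plain Python with NumPy. The function name, the `order` parameter, and the sine test input are just mine for illustration, not from any particular library:

```python
import numpy as np

def feedback_loop(x, order=1):
    """Literal implementation of the loop described above:
    error = x[n] - (rounded, one-sample-delayed output),
    round the error, integrate `order` times (1 or 2);
    the integrator output is the system output y[n]."""
    y = np.zeros_like(x)           # system output
    acc = np.zeros(order)          # integrator state(s)
    fb = 0.0                       # rounded output, delayed one sample
    for n in range(len(x)):
        e = np.round(x[n] - fb)    # subtract delayed output, quantize
        for k in range(order):     # integrate once or twice
            acc[k] += e
            e = acc[k]
        y[n] = e                   # integrator output is the output
        fb = np.round(y[n])        # round and feed back (unit gain)
    return y

# Example: a 1024-sample sine as the "analog" input
t = np.arange(1024)
x = 100.0 * np.sin(2 * np.pi * t / 64.0)
y = feedback_loop(x, order=1)
```

The sample-by-sample loop (rather than vectorized array operations) is deliberate: the feedback term depends on the previous output, so each sample has to be computed in sequence.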