You should do some basic testing.
- Put a tone generator on a track in a new REAPER project. Turn off the speaker feed. Set all faders and audio interface controls to 0 dB. Insert the JS Signal Generator plugin, set to 0 dB with a 400 Hz sine wave.
(You can skip this next step if you don't have a voltmeter)
If you have a voltmeter, measure the AC voltage at your interface's output.
For a -10 dBu interface it should read about 0.245 V RMS.
For a +4 dBu interface it's about 1.228 V RMS.
Edit: I just checked; the US-1800 is a +4 dBu unit.
This calculator comes in handy.
http://www.analog.com/designtools/en/toolbox/dbconvert
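If you'd rather not use the calculator, the conversion is simple enough to do yourself. A minimal Python sketch (the 0.7746 V reference is the standard dBu definition, i.e. the voltage that dissipates 1 mW into 600 ohms):

```python
import math

def dbu_to_vrms(dbu):
    """Convert a level in dBu to volts RMS.
    0 dBu is defined as sqrt(0.6) ~= 0.7746 V RMS."""
    return math.sqrt(0.6) * 10 ** (dbu / 20)

print(round(dbu_to_vrms(-10), 3))  # 0.245
print(round(dbu_to_vrms(4), 3))    # 1.228
```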
Now loop the output back into an input, arm a new track, and monitor the incoming level. If the meter reads 0 dB, all is well.
If you want to go a step further, put a spectrum analyzer plugin on the input track. Ideally there should be a single spike at 400 Hz and nothing else. Harmonic distortion (a possible symptom of a sick interface) will show up as smaller spikes at multiples of 400 Hz (800, 1200, 1600, 2000 Hz).
At 0 dB you are right on the verge of these harmonics showing up (you can force them to appear by pushing the signal generator above 0 dB). As long as they stay at or below -60 dB with all levels set to 0 dB, things are good.
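If you want to see what the spectrum analyzer is measuring, here is a hypothetical sketch in Python that simulates a loopback capture of the 400 Hz tone with a deliberately injected third harmonic, then reads off each harmonic's level. The -70 dB distortion figure and 48 kHz sample rate are made-up example values, not measurements from any real interface:

```python
import numpy as np

fs = 48000
t = np.arange(fs) / fs  # one second of audio -> 1 Hz per FFT bin
fund = np.sin(2 * np.pi * 400 * t)
# inject a -70 dB third harmonic (1200 Hz) to stand in for interface distortion
signal = fund + 10 ** (-70 / 20) * np.sin(2 * np.pi * 1200 * t)

# normalize so a full-scale sine reads 0 dB at its bin
spectrum = np.abs(np.fft.rfft(signal)) / (len(signal) / 2)

for h in (400, 800, 1200, 1600, 2000):
    level = 20 * np.log10(spectrum[h] + 1e-12)  # bin index == frequency in Hz here
    print(f"{h} Hz: {level:6.1f} dB")
```

With a healthy loopback you would see the 400 Hz fundamental near 0 dB and everything at the harmonic bins buried far below the -60 dB line.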