I read Bob Barns' 1996 article about seismograph calibration, and I wonder whether there is a much simpler method that does not require any special equipment. Suppose you just take the damping off a Lehman pendulum and give it a little shove. It will begin to oscillate, and if you know the period and the amplitude you can calculate the peak velocity in nm/s. Then you read the peak value from your ADC, and that lets you compute the velocity sensitivity of your seismograph in nm/s per bit. By observing your background noise level in bits during normal damped operation, you can then convert it to nm/s. Or am I making a stupid mistake?

And what is a good value for sensitivity and background noise level? When I do the tests above, it looks like I have a sensitivity of about 120 nm/s/bit and my noise level is about +/- 3 bits. I have a 16-bit ADC, so should I crank up the analog gain a tad?

The reason I am trying to calibrate my Lehman is that I just got it working and I cannot figure out why it detects some quakes well and others poorly. I am comparing my results with those posted on seismicnet, especially the results from Herndon, VA, which is just a few miles from me at Bailey's Xroads, VA. I am trying to figure out whether I need more gain, or whether my location on an office building basement slab next to a busy street is just too noisy.

TIA for any help,

Dave Saum
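P.S. Here is a rough sketch of the arithmetic I have in mind, written out as a little Python script. The amplitude, period, and peak ADC reading below are made-up illustrative numbers (chosen so they come out near the 120 nm/s/bit figure I quoted), not my actual measurements:

import math

def peak_velocity_nm_per_s(amplitude_nm, period_s):
    # For simple harmonic motion x(t) = A*sin(2*pi*t/T),
    # the peak velocity is 2*pi*A/T.
    return 2.0 * math.pi * amplitude_nm / period_s

def sensitivity_nm_per_s_per_count(v_peak, peak_adc_counts):
    # Velocity sensitivity of the whole recording chain,
    # in nm/s per ADC count (per "bit" in the loose sense above).
    return v_peak / peak_adc_counts

# Hypothetical test: swing the undamped boom by 1 mm (1,000,000 nm)
# with a 15 s free period, and suppose the ADC peaks at 3490 counts.
amp_nm = 1.0e6
period_s = 15.0
peak_counts = 3490

v_peak = peak_velocity_nm_per_s(amp_nm, period_s)           # ~419,000 nm/s
sens = sensitivity_nm_per_s_per_count(v_peak, peak_counts)  # ~120 nm/s per count

print("peak velocity: %.0f nm/s" % v_peak)
print("sensitivity:   %.1f nm/s per count" % sens)

# Background noise converted from counts to ground velocity:
noise_counts = 3.0
print("noise floor:   +/- %.0f nm/s" % (noise_counts * sens))

# Headroom on a 16-bit converter (+/- 32767 counts) at this sensitivity:
print("clip level:    %.2e nm/s" % (32767 * sens))

If the noise really is only +/- 3 counts, that would seem to leave a lot of headroom before a 16-bit converter clips, which is what makes me wonder about raising the analog gain.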