PSN-L Email List Message

Subject: Re: Optical seismometer
From: Charles R Patton charles.r.patton@........
Date: Fri, 18 Jan 2013 10:05:17 -0800


On 1/18/2013 8:10 AM, chrisatupw@....... wrote:
> From: Charles R Patton charles.r.patton@........ 
> 
> Sent: Wed, 16 Jan 2013 17:55
> Subject: Re: Optical seismometer
>
> On 1/16/2013 3:14 AM, chrisatupw@....... wrote:
>> Subject: Optical seismometer
>> Date: Tue, 15 Jan 2013 15:33:51 -0600
>> Larry,
>> I thought you might find this interesting:
>> http://www.ctbto.org/fileadmin/user_upload/SandT_2011/presentations/T3-O5%20J_Berger%20Optical%20Seismometer.pdf
>> -Charlie
>> Hi Charlie,
>>     It looks as if Mark & Co have done quite a bit more development 
>> work.
>> But amateurs would likely have difficulty measuring optical fringes to
>> 1/2 ppm / Root Hz, and Michelson interferometers are not cheap.
>> Amateurs can get about 10 nano metres resolution over 10 Hz using large area
>> photocells and a stabilised light source, but this is likely to be adequate -
>> unless you can 'lay your hands' on a couple of redundant Streckeisens !
>> Regards,
>> Chris Chapman
> OK, how about this for a thought experiment?
> Take a standard $10 or so USB web cam -- definitely a cheaper one, no 
> autofocus, but rather one where you can unscrew the lens easily. It 
> will have 640 x 480 or better resolution at a 60 Hz sample rate. Use 
> it in an optical lever arrangement on the seismometer and project a 
> laser beam spot on the face of the sensor. So how sensitive could it be?
> Let's assume a 20" pendulum and a 20" optical lever length. We're 
> interested in duplicating the interferometer capability in the 
> Zumberge/Berger paper -- about:
> 3e-7 * 1e-6 = 3e-13 m = 1.18e-12"  (see pg 8).
> Although I have a problem with this number. They describe a 16 bit 
> conversion, so the number can't be much better than:
> 3e-7 / 65536 = 4.6e-11 m = 1.8e-9"
> **** I'm a bit confused: 3x10^-13 m = 1.18x10^-11 inches.
> I think that there may be 100cm in every metre and 2.54 cm in every 
> inch.......
> And I'm not sure why we are talking about inches when the seismometer 
> is calibrated in metres ?
> So, assuming the typical 1/3" sensor in the webcam:
> 0.33" / 640 = 5.21e-4"  (640-pixel spacing)
> 5.21e-4" / 256 = 2.03e-6"  due to interpolation from the 8-bit analog 
> digitization
> **** (0.33x25.4/640)x1000 = 13.1 microns
> I don't quite understand where the 256-level interpolation fits in ?
> If the movement covered several pixels, you could reduce the pixel 
> 'noise'.
> If there is only one pixel concerned, you couldn't interpolate.
> As the optical lever length is assumed equal to the pendulum length, 
> then for small movements, the projected laser dot displacement will 
> equal the pendulum movement.
>  **** One problem is that semiconductor lasers are VERY VERY noisy. 
> Another is that the light output is highly temperature sensitive - a 
> factor of ~x7 between 0°C and 100°C.
> So the optical sensor is still a factor of 1000 away from the 
> interferometer. Averaging the sample rate from 60 Hz/16 down to approx 
> 4 Hz could add another 4x resolution improvement, or about:
> 2.03e-6" / 4 = 0.5e-6"
> Not really close enough. No joy there.
> ****???? I don't think that it will make much difference. The pixel 
> size / count stay the same - assuming a 'quiet' signal. If the noise 
> was several pixels (unlikely) you could reduce the error - but it 
> would take more than a factor of x2 -> 4 counts !
> I can't think of another major improvement to the resolution except:
> 1) Maybe project the beam through a cylindrical lens that would 
> increase the deviation, but also spread the beam so probably a wash.
> 2) An optical lever distance of 1000 x 20" = 2e4" = 1667'. I don't think 
> so, unless we did it with a set of parallel mirrors spaced perhaps 2' 
> apart, where we allow the laser beam to enter at an almost 
> perpendicular angle to bounce back and forth 800 times before 
> exiting. Mirror loss per bounce of 1% would attenuate the beam by a 
> factor of e^8 (0.99^800 ~ 3e-4). That shouldn't be a problem -- just 
> the quality of the mirrors would be tough.
>  ****Umm ? Just 2^100 = 1.27x10^30 angular gain..... Do we want that 
> much ?
> Any other ideas?
> **** Larger photocells and more light ? The noise was about 13 nano 
> metres using cheap BPW34 7 sq mm photocells, so we should be able to 
> reduce it quite a bit.
>  Alternatively, use either an LVDT or a capacitative sensor ?
> Regards,
> Chris Chapman
Hi Chris,

Thanks for the sanity check on the calculations. My error; I should 
have gone over the calculations a few more times. Let's go with your 
basic resolution of 13.1 um for the webcam sensor. Then, as I 
explain below, I would still take an 8-bit resolution improvement: 
13.1 um / 256 = approx. 52 nm resolution.
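A quick sketch of that arithmetic in Python (my own recomputation; the 1/3" sensor width, 640-pixel count, and 8-bit digitization are the assumptions from the thread):

```python
# Recomputing the webcam sub-pixel resolution figures from the thread.
SENSOR_WIDTH_IN = 0.33   # nominal 1/3" webcam sensor
PIXELS = 640             # horizontal pixel count
ADC_LEVELS = 256         # 8-bit analog digitization per pixel

pixel_pitch_um = SENSOR_WIDTH_IN * 25.4 * 1000 / PIXELS   # inches -> microns
subpixel_nm = pixel_pitch_um * 1000 / ADC_LEVELS          # microns -> nanometres

print(f"pixel pitch:    {pixel_pitch_um:.1f} um")   # -> 13.1 um
print(f"sub-pixel step: {subpixel_nm:.0f} nm")      # -> 51 nm
```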

  Re why calculate in inches?
1) Some items are given in inches such as a typical webcam sensor that I 
remember as 1/3 inch.
2) I'm the old, provincial fuddy-duddy who learned in inches and just 
can't get my head around metric.  My ruler is the 1 inch length of the 
first joint of my little finger.

Re noise and 256-level resolution -- what I was thinking of is not a 
go/no-go decision based on individual pixels. The key is to do a 
centroid calculation of the blobs of light and dark formed by the 
fringes. The fringe position resolution is therefore strongly affected 
by the analog resolution of the sensor pixels along with the pixel 
pitch. A centroid calculation should also be essentially immune to the 
flicker noise of the laser, as the pixels integrate charge 
relatively equally prior to readout, and the centroid calculation 
shouldn't care about total intensity, only relative intensity through 
the image. Also, the fringe spacing itself is not the key, only the 
movement of the fringes expressed as a percentage of fringe spacing, 
and one doesn't have to do the ellipse calculation as they did in the 
interferometer: the fringes move linearly in one direction or the 
other according to the relative movement of the Ronchi ruling. As I 
thought more about the above concept, a few more considerations came 
to mind.
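To make the centroid idea concrete, here is a minimal sketch (Python/NumPy; the `fringe_centroid` function and the synthetic Gaussian blob are my own illustration, not code from the thread). It shows an intensity-weighted centroid returning a sub-pixel position, and that a uniform brightness change -- laser flicker -- leaves the estimate unchanged:

```python
import numpy as np

def fringe_centroid(row):
    """Intensity-weighted centroid of a 1-D image row, in fractional pixels.

    Because the weights are normalised by the total intensity, a uniform
    brightness change cancels out of the estimate.
    """
    row = np.asarray(row, dtype=float)
    x = np.arange(row.size)
    return float((x * row).sum() / row.sum())

# Synthetic bright blob centred at pixel 12.3 on a 32-pixel line.
x = np.arange(32)
blob = np.exp(-0.5 * ((x - 12.3) / 3.0) ** 2)

pos = fringe_centroid(blob)               # ~12.3: sub-pixel, from 1-px samples
pos_bright = fringe_centroid(2.0 * blob)  # doubled intensity: same centroid
```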

An important problem with fringe counting is that the fringe rate 
becomes very high very easily -- in the MHz range with moderate 
movements. The webcam will lose counts as the fringe rate approaches 
its frame rate -- approx 60 Hz. So movements in excess of 1.5 um * 60 Hz 
= 90 um/s will be lost. Not a very suitable situation. One solution that 
comes to mind is to use a linear sensor such as those used in paper 
scanners. I estimate that they may have a line update rate of 1000 Hz 
or more, so perhaps 900 um/s or more. That is probably sufficient, as 
one also has to account for the PC graphics library's centroid 
calculation rate.
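The velocity ceiling above is just fringe spacing times sample rate -- roughly one fringe per frame before counts are lost. A one-liner makes the scaling explicit (the 1.5 um fringe spacing is the figure assumed above):

```python
# Maximum unaliased motion: roughly one fringe spacing per frame.
FRINGE_SPACING_UM = 1.5   # fringe spacing assumed in the thread

def max_velocity_um_s(frame_rate_hz):
    return FRINGE_SPACING_UM * frame_rate_hz

print(max_velocity_um_s(60))   # 60 Hz webcam -> 90.0 um/s
# A 1 kHz scanner line sensor scales the same way, to 1500 um/s
# (the ~900 um/s figure above reads as a more conservative estimate).
```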

Another potential route around the fringe-rate problem is to use the 
sensor from an optical mouse. An early optical mouse sensor was the HP 
HDNS-2000, which could do 400 cpi at rates of motion up to 12 ips, so 
about 4800 fringes/second. Randall Peters and his colleague, 
Sheng-Chiang "John" Lee, did work with this method and were limited to 
about 50 um resolution. That resolution could be improved by use of the 
fringe-counting technique, but probably by significantly less than 
1000x, so it would not reach 50 nm -- not as fine as we're aiming for. 
Another problem is that the image sensor is really very small -- I seem 
to remember something like 7x7 pixels (I've looked for the spec sheet 
that mentioned it, but can't find it right now) -- and for centroid 
calculation to work, multiple fringes would have to be on the sensor. 
This would tighten up the optical alignment requirements quite a bit, 
and the limited number of pixels per fringe would degrade the centroid 
calculation.

I took a glance at Edmund Scientifics 
(http://www.scientificsonline.com/holographic-diffraction-grating-film-10036.html). 
Two pieces of 2"x12" (?) holographic film (Substrate: Clear polyacetate 
film; Thickness: 0.002"; Grooves/Inch: 12,700; Groove Orientation: 
Linear) is only US$7.95. The groove spacing is comparable to a CD's, 
but now with straight lines and probably a better diffraction amplitude 
in the fringe image. Glue pieces of this film on a plexiglass sheet and 
a really cheap sensor should result.
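For the "comparable to the CD" claim, the groove pitch follows directly from the listing (the 12,700 grooves/inch is from the Edmund specs; the 1.6 um CD track pitch is the standard Red Book figure):

```python
# Groove pitch of the holographic film vs a CD's track pitch.
GROOVES_PER_INCH = 12_700                         # from the Edmund listing
film_pitch_um = 25.4 * 1000 / GROOVES_PER_INCH    # inch -> microns per groove
CD_TRACK_PITCH_UM = 1.6                           # Red Book CD track pitch

print(film_pitch_um)   # -> 2.0 um, vs 1.6 um for a CD
```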

Regards,
Charles R. Patton
