Dave,
I appreciate your viewpoint in some respects but can't agree with it in others. Indeed, your following comment goes straight to the heart of the matter:
"The key figure of merit for any instrument configuration is the =3D
instrument self noise and response as a function of frequency. This =3D
directly determines the minimum seismic motion the instrument is capable =
=3D
of detecting and then providing useful data for analysis."
The feedback instrument is a marvelous invention and went about as far as is possible in the hands of Gunar Streckeisen. If it were the end-all of seismic instrumentation, then the IRIS broadband conference that I attended several years ago would never have happened.
The trend that I see as important to seismology in every quarter involves the exploitation of the latest technologies. Science has always benefited from innovative technology, and in our lifetime nothing has impacted the world like the connectivity of the internet. Its advances have progressed in parallel with the great progress made in the world of MEMS devices. Many thought that MEMS accelerometer chips would change the history of seismology. Unfortunately, those hopes were dashed by the experimental realization that micro-miniaturized cantilever-type accelerometers (seismometers) cannot compete with their macro-sized (traditional) counterparts, because of something the great Richard Feynman anticipated decades ago. When he wrote his article "There's Plenty of Room at the Bottom", he predicted that MEMS devices (such as motors) would meet a whole new class of challenges because of friction. He was smart enough to appreciate, just on the basis of scaling arguments, that friction would take on a new and different role once technology brought us to that scale.
In my own work I have found his predictions to be right on target. The MEMS accelerometers used in cell phones just don't have what it takes to compete, performance-wise, with the traditional instruments. Even if the usual 12-bit ADC were increased to twice that number of bits, they evidently would still 'miss the mark' because of the friction conundrum. This has to be of concern to those in California who have been assembling a network of personal computers, in the hands of what they hope will be many participants, to try to improve earthquake prediction capability by means of a greater database of instruments.
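As a rough check of the dynamic-range side of that argument: the quantization-limited dynamic range of an ideal N-bit converter is about 6.02 N + 1.76 dB. The back-of-envelope sketch below (my own illustrative numbers, in Python) shows why extra bits alone cannot rescue a sensor whose mechanical self-noise floor sits far above the quantization floor:

    # Ideal quantization-limited dynamic range of an N-bit ADC,
    # using the standard 6.02*N + 1.76 dB rule of thumb.
    def adc_dynamic_range_db(bits: int) -> float:
        return 6.02 * bits + 1.76

    for bits in (12, 24):
        print(f"{bits}-bit ADC: ~{adc_dynamic_range_db(bits):.0f} dB")
    # 12-bit ADC: ~74 dB
    # 24-bit ADC: ~146 dB
    # Doubling the bit count adds roughly 72 dB of headroom, but if
    # the MEMS sensor's own (frictional/mechanical) noise floor lies
    # well above the quantization floor, the added bits merely
    # digitize that noise.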
The last sentence above touches on what matters most: a multitude of sites providing useful information. I would love to have not just quiet-site information from a fairly small number of instruments such as you mention. Rather, I would like to have data available in ways that only the internet can manage, from a huge number of instruments distributed around the globe. My reason is different from most, however. Recently, with two highly regarded French theorists, I generated a paper titled "Prediction of catastrophes: an experimental model". It can be read online at
http://arxiv.org/abs/1204.1551
The key to their 'saddle node' model's success, insofar as novel earthquake prediction capability is concerned (should it eventually prove possible), lies in the nature of their data analysis. Unlike others who have concentrated on the time domain, this saddle-node approach uses the frequency domain, which physics discovered about a century ago to be the place where the action is. There are precursors to events that may happen, early enough to save lives, but only if the network looking at the relevant earth motions is larger than some minimum number of appropriate seismometer types, yet to be determined. The 'boxcar-like' pulses that have been identified as a possible precursor to earthquakes do not show up in a proper way when one works with feedback instruments whose output is of 'velocity' type. The derivative of the acceleration-determined signal becomes, for these pulses, a pair of positive and negative 'spikes', typically interpreted as noise rather than as a real signal.
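To make that last point concrete, here is a toy sketch (my own made-up numbers, not a real record) of why differentiation hides a boxcar: the flat top vanishes and only the two edges survive, as a spike pair.

    import numpy as np

    # Toy illustration: a boxcar pulse in an acceleration record,
    # and what its derivative looks like. All values are invented.
    dt = 0.01                         # sample interval, s
    t = np.arange(0.0, 10.0, dt)
    accel = np.where((t > 3.0) & (t < 6.0), 1.0, 0.0)   # boxcar

    deriv = np.gradient(accel, dt)    # numerical derivative

    # The flat top differentiates to zero; only the leading and
    # trailing edges survive, as one positive and one negative
    # spike, exactly the pattern that is easily dismissed as noise.
    print("derivative extrema:", deriv.min(), deriv.max())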
So my interest is two-fold: (i) working with instruments that depart from the norm in terms of signal output, being better suited to seeing precursors and perhaps allowing meaningful earthquake prediction; and (ii) making every effort to extend sensitivity toward ever lower frequencies. As mentioned above, the professionals have never ceased trying to expand the pass-band of even the unmatched-performance Streckeisen instruments.
Perhaps you can see then, from item (ii) above, why I have so much interest in attaining 'zero defects' when it comes to power spectral density calculations. You will find in the above-referenced paper the use of the 'cumulative spectral power' (CSP), which has significant advantages over the PSD (density function) from which it is obtained by integration. The 'clutter' that is natural in the PSD is largely eliminated when one works with the CSP. This allows the meaningful overlay of multiple records, which assists interpretation in a novel way. Professors Le Berre and Pomeau essentially calculate the cumulative power and observe a lowering of characteristic frequencies (at least in the ordinary solder wire of our published study) before avalanches that we expect to be, in some respects, like earthquakes. This 'calm before the storm' is addressed in a quantitative manner by this means, in ways that are not straightforward when one looks only at the temporal data. I believe it is high time that the seismology world began to 'mine' the information that is available from proper spectral analysis, especially of the cumulative spectrum type. There is a whole new world of exciting information just waiting to be discovered, if folks would simply go there and look.
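For anyone who wants to experiment with the idea, here is a minimal sketch of the PSD-to-CSP step as I have described it. The test signal and the Welch settings are my own illustrative choices, not necessarily those used in the paper:

    import numpy as np
    from scipy.signal import welch

    # Sketch: power spectral density of a record, then the
    # cumulative spectral power (CSP) obtained from it by
    # integration over frequency. Parameters are illustrative only.
    fs = 100.0                                 # sample rate, Hz
    t = np.arange(0.0, 600.0, 1.0 / fs)
    x = np.sin(2 * np.pi * 0.2 * t) + 0.5 * np.random.randn(t.size)

    f, psd = welch(x, fs=fs, nperseg=4096)     # the PSD

    # Running integral of the PSD over frequency gives the CSP.
    csp = np.cumsum(psd) * (f[1] - f[0])

    # The CSP is monotonic and much smoother than the PSD, which is
    # what makes overlaying many records practical. One simple
    # characteristic frequency to track over time is where the CSP
    # reaches half of the total power csp[-1].
    half_power_freq = f[np.searchsorted(csp, 0.5 * csp[-1])]
    print("half-power frequency:", half_power_freq, "Hz")

A lowering of that characteristic frequency from one record to the next is the kind of 'calm before the storm' signature the paper describes.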
Randall