Title: History and Future of Seismology
Abstract:
The talk spans the time from 1831 BC, when Chinese chronicles report the "shaking of the Taishan Mountain" in the Province of Shantung, up to our times and current dreams. The lecture highlights the tremendous conceptual influence of the great Greek naturalists, philosophers and geographers, whose ideas dominated the way of thinking about the nature of earthquakes for almost 2000 years. Only with the Enlightenment, from about the 17th century onward, did a rapid and complex development begin. It was driven by pioneering new concepts developed by mathematicians and physicists working in the fields of theoretical mechanics and later also electrodynamics, by geologists, and some 200 to 100 years later also by mechanical engineers, instrument developers and the first geophysicists.
Of crucial importance for the progress of seismology have been the development of international cross-border cooperation in monitoring and data exchange since the turn of the 19th to the 20th century, the introduction of electrodynamic sensors since the early 20th century, and the boom of analog seismology after WWII, driven both by devastating earthquake disasters and by the demands of the nuclear test-ban control communities of the atomic powers. Equally crucial was the impact of the great international cooperative research projects, such as the International Geophysical Year (1957-58), the Upper Mantle Project (1960-70), the Geodynamic Project (IGP, 1980-90), the United Nations International Decade for Natural Disaster Reduction (IDNDR, 1990-1999) and the Global Seismic Hazard Assessment Project (GSHAP, 1990-2000). But the largest influence, as in many other fields of applied science, has come from the digital revolution since the mid-1970s, with the rapid advancement of broadband, high-dynamic-range, miniaturized sensors based on new principles, as well as the global reach of near-real-time data transmission and processing technologies. The resulting flood of data has necessitated increasingly automated data analysis and the calculation of myriads of model solutions, but often at the expense of data quality, of the understanding of complex problems, and of really new ideas and deeper insights. Although the precision of model-based solutions has steadily increased, the same cannot be said of the accuracy of our knowledge in terms of "ground truth". We should discuss the consequences and perspectives of such a development.
Source: LabSis/UFRN
Peter Bormann, Jordi Julià, Rodrigo Pessoa, Joaquim Ferreira