{ "cells": [ { "cell_type": "markdown", "id": "d4f629fb", "metadata": { "tags": [] }, "source": [ "# Earthquake Signals" ] }, { "cell_type": "markdown", "id": "4f8bf1b3-d255-4c4b-92a4-5326b5bb57ba", "metadata": { "tags": [] }, "source": [ "## Part I: Seismicity on Earth\n", "\n", "An earthquake is a vibration of Earth caused by the sudden release of energy, usually as an abrupt breaking of rock along planar fractures called faults. An earthquake initiates at its hypocenter or focus at some depth below the land surface or sea floor. The epicenter of the earthquake is defined as the point on the Earth’s surface directly above the hypocenter.\n", "\n", "Only rocks that are relatively cold and brittle (the Earth’s lithosphere) can be broken in earthquakes. Rocks that are hot and ductile (e.g. the Earth’s asthenosphere) deform without breaking, and therefore do not produce earthquakes. By observing where earthquakes occur, both horizontally and with depth, we can learn where stress is concentrated and infer the material properties of the Earth.\n", "\n", "In this section, we will examine the distribution of earthquakes. You will notice that most of them are near plate boundaries. Each type of plate boundary has a distinctive pattern of motion, shown in the image below. Keep this in mind when you look at the earthquake epicenter plot later in this section.\n", "\n", "\n", "*Image: Three major types of plate boundaries.*\n" ] }, { "cell_type": "code", "execution_count": 1, "id": "68c1a3d8-4916-4e05-87e0-603555e0c5f2", "metadata": { "tags": [ "remove-input" ] }, "outputs": [], "source": [ "import pandas as pd\n", "import json\n", "import numpy as np\n", "import plotly.express as px\n", "import plotly.graph_objects as go\n", "import warnings\n", "warnings.filterwarnings('ignore')" ] }, { "cell_type": "code", "execution_count": null, "id": "b4c20227-5dc0-4fa3-bef6-28dfbeff62a4", "metadata": { "tags": [ "remove-input" ] }, "outputs": [], "source": [ "def cutoff_track(lon, lat, cutoff_lon):\n", " \"\"\"\n", " Insert NaN between two points that cross the cutoff longitude.
\n", " Avoid horizontal line when plot the track\n", " \n", " Input Parameters:\n", " ----------------\n", " \n", " lon: longitude array\n", " lat: latitude array\n", " cutoff_lon: divide the curve at this cutoff longitude\n", " \n", " Return:\n", " ------\n", " \n", " lon, lat: updated arrays\n", " \"\"\"\n", " xtrk = np.mod(lon - cutoff_lon, 360)\n", " is_cross = np.where(np.abs(xtrk[1:-1]-xtrk[0:-2]) > 90)[0]\n", " nans = np.empty(np.size(is_cross))\n", " nans[:] = np.nan\n", " lon = np.insert(lon, is_cross+1, nans)\n", " lat = np.insert(lat, is_cross+1, nans)\n", " return lon, lat\n", "\n", "def load_plate_boundaries(filename):\n", " \"\"\"\n", " Input Parameters:\n", " ----------------\n", " \n", " filename: input JSON file name containing the curve locations\n", " \"\"\" \n", "\n", " # Opening JSON file\n", " f = open(filename)\n", "\n", " # returns JSON object as a dictionary\n", " boundary = json.load(f)\n", " \n", " f.close()\n", "\n", " boundary = np.array(boundary[1])\n", " lons = boundary[:,0]\n", " lats = boundary[:,1]\n", "\n", " lons, lats = cutoff_track(lons, lats, 180)\n", " \n", " return lons, lats\n", "\n", "\n", "def plot_earthquake_map(catalog):\n", " \"\"\"\n", " Input Parameters:\n", " ----------------\n", " \n", " catalog: dataframe containing earthquakes\n", " \"\"\" \n", " \n", " rlons, rlats = load_plate_boundaries('../Geophysics/Files/ridge.json')\n", " tlons, tlats = load_plate_boundaries('../Geophysics/Files/transform.json')\n", " hlons, hlats = load_plate_boundaries('../Geophysics/Files/trench.json')\n", "\n", " \n", " fig1 = px.scatter_geo(catalog, lon=\"ep.lon\", lat=\"ep.lat\", size=\"centroid.MW\", color=\"ep.depth\", \n", " labels={\"ep.lon\":\"longitude (degrees)\", \"ep.lat\":\"latitude (degrees)\", \"centroid.MW\":\"Moment Magnitude\",\n", " \"ep.depth\":\"depth (km)\", \"ep.timestamp\":\"origin time\"}, \n", " hover_data=['ep.timestamp'])\n", "\n", " fig1a = px.line_geo(lon=rlons, lat=rlats)\n", " fig1a.update_traces(line={'color':'skyblue', 'width':3})\n", " \n", " fig1b = px.line_geo(lon=tlons, lat=tlats)\n", " fig1b.update_traces(line={'color':'lime', 'width':3})\n", " \n", " fig1c = px.line_geo(lon=hlons, lat=hlats)\n", " fig1c.update_traces(line={'color':'orangered', 'width':3})\n", " \n", " fig = go.Figure(data=fig1.data + fig1a.data + fig1b.data + fig1c.data, layout = fig1.layout)\n", " fig.update_xaxes(range=[-180, 180], tickvals=np.arange(-180,190,30))\n", " fig.update_yaxes(range=[-90, 90], tickvals=np.arange(-90,100,15))\n", " fig.update_geos(projection_type=\"robinson\", lataxis_showgrid=True, lonaxis_showgrid=True)\n", " fig.update_layout(title_text=\"Global CMT catalog (N = %d earthquakes)\"%len(catalog), title_x=0.5)\n", " fig.show()" ] }, { "cell_type": "code", "execution_count": null, "id": "9ea8acb3-eabf-4344-b4c5-8cee1ca2d25e", "metadata": { "tags": [ "remove-input" ] }, "outputs": [], "source": [ "# Read earthquake catalog\n", "cmt_cat = pd.read_table(\"Files/Global_CMT_1972_2018.csv\", delimiter=',')\n", "cmt_cat = cmt_cat.sort_values(by=['ep.depth', 'centroid.MW'])" ] }, { "cell_type": "markdown", "id": "4909159c-3229-4078-bfe4-77cdb1661ccc", "metadata": {}, "source": [ "
**Review**\n",
"\n",
"Feel free to adjust the filter parameters in the cell below to show more or fewer earthquakes on the map. Hover over a dot to see the details of that earthquake. The line colors indicate the type of plate boundary: red marks a trench, green a transform boundary, and blue a ridge.
" ] }, { "cell_type": "code", "execution_count": null, "id": "c151f47b-eec0-4de0-b1c1-782d0c97382f", "metadata": { "tags": [ "remove-input" ] }, "outputs": [], "source": [ "# Filter to improve readablility\n", "Mw_min = 6.5 # Moment magnitude\n", "Mw_max = 10 # Moment magnitude\n", "depth_min = 0 # minimum depth in km\n", "depth_max = 6400 # maximum depth in km\n", "datetime_min = '1000-01-01 00:00:00.000000' # 'yyyy-mm-dd hh:mm:ss.SSSSSS'\n", "datetime_max = '2050-01-01 00:00:00.000000' # 'yyyy-mm-dd hh:mm:ss.SSSSSS'\n", "\n", "# Don't change these two statements\n", "mask = (cmt_cat['centroid.MW'] >= Mw_min) & (cmt_cat['centroid.MW'] <= Mw_max) & (cmt_cat['ep.depth'] >= depth_min) & (cmt_cat['ep.depth'] <= depth_max) & (cmt_cat['ep.timestamp'] >= datetime_min) & (cmt_cat['ep.timestamp'] <= datetime_max)\n", "plot_earthquake_map(cmt_cat[mask])" ] }, { "cell_type": "markdown", "id": "7840d07c-6e51-4295-bb5c-fa6ce28c4d6b", "metadata": {}, "source": [ "**Question** Examine the regions on and around Earth’s ocean ridges and ocean trenches. Both the spatial distribution and the depths of earthquakes associated with each are different. Describe/illustrate these differences between earthquake associated with ridges and earthquakes associated with trenches." ] }, { "cell_type": "markdown", "id": "850f2488-981a-47a5-97cf-a67a53662235", "metadata": {}, "source": [ "**Answer**:" ] }, { "cell_type": "markdown", "id": "10a99543-4ef0-4bd5-afad-76fb0e3f6238", "metadata": {}, "source": [ "

\n", "
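" ] },
{ "cell_type": "markdown", "id": "added-depth-sketch", "metadata": {}, "source": [ "*Optional cross-check.* If it helps to quantify what you see on the map, the depth distribution of the catalog is one quick thing to look at. A minimal sketch, assuming the `cmt_cat` DataFrame loaded above (paste it into a new code cell to run it); the 70 km cutoff is only a nominal value:\n",
"\n",
"```python\n",
"# Quick look at the depth distribution of the catalogued earthquakes.\n",
"# Assumes cmt_cat from the cell above; column names follow the table used in this notebook.\n",
"import plotly.express as px\n",
"\n",
"deep = cmt_cat[cmt_cat['ep.depth'] > 70]   # 70 km is only a nominal lithospheric thickness\n",
"print(len(deep), 'of', len(cmt_cat), 'events are deeper than 70 km')\n",
"\n",
"fig = px.histogram(cmt_cat, x='ep.depth', nbins=100,\n",
"                   labels={'ep.depth': 'depth (km)'},\n",
"                   title='Depth distribution of Global CMT earthquakes')\n",
"fig.show()\n",
"```\n",
"\n",
"Checking where the deeper events plot on the map above (only near trenches, or also near ridges?) is a useful companion to the histogram.\n",
"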

" ] }, { "cell_type": "markdown", "id": "8467e4a8-291f-40ce-b4fa-1f83ef190ab6", "metadata": {}, "source": [ "**Question**. Using earthquake depths as evidence, discuss whether the Earth’s lithosphere is thicker at ridges or at trenches." ] }, { "cell_type": "markdown", "id": "8dead0b3-ed8d-4669-9d20-4daa1c86ee85", "metadata": {}, "source": [ "**Answer**:" ] }, { "cell_type": "markdown", "id": "ff14aec3-a442-4f87-8538-dc1e0d55595b", "metadata": {}, "source": [ "

" ] }, { "cell_type": "markdown", "id": "c65fb6ff", "metadata": { "tags": [] }, "source": [ "## Part II: Seismic stations, seismographs, and seismograms\n", "\n", "\n", "Seismographs are instruments at seismic stations that measure ground motion. The data is displayed as\n", "seismograms. This data is of fundamental importance to the geosciences. By studying earthquake-generated ground motion we can learn about 1) the nature of the earthquake itself (the source of the motion) and 2) the material the earthquake waves travel through (Earth’s interior). This is because the observed vibration (in the time domain *t*) is the result of the effects of earthquake source, interior structure and how the instrument *senses* the vibration. These effects can be most easily modeled by considering each frequency ($\\omega$) of the waveform in turn, and using their products (i.e. frequency-domain convolution per mathematical definition) to derive the waveform.\n", "\n", "\n", "\n", "*Figure: Relationship between measurement and contribution from instrument, structure and source.*\n", "\n", "\n", "\n", "The United States Geologic Survey estimates that several million earthquakes occur each year. Many go\n", "undetected because they hit remote areas without good seismic station coverage, or have very small\n", "magnitudes. But various seismic networks locate something like 20,000 earthquakes a year. We will focus on two large earthquakes since they caused the whole Earth to ring like a bell, causing waveforms to be recorded at various stations well above the noise level of the instrument.\n", "\n", "**Tohoku earthquake [USGS Link](https://earthquake.usgs.gov/earthquakes/eventpage/official20110311054624120_30/executive)**\n", "\n", "The March 11, 2011, M 9.1 Tohoku earthquake, which occurred near the northeast coast of Honshu, Japan, on the subduction zone plate boundary between the Pacific and North America plates. The location, depth (about 25 km), and focal mechanism solutions of the March 11th earthquake are consistent with the event having occurred on the subduction zone plate boundary. Modeling of the rupture of this earthquake indicates that the fault moved as much as 50–60 m, and slipped over an area approximately 400 km long (along strike) by 150 km wide (in the down-dip direction). The Japan Trench subduction zone has hosted nine events of $m_w$ 7+ since 1973. \n", "\n", "The focal mechanisms that are usually given to convey this configuration are a way of plotting the three dimensional fault information in two dimensions (they’re lower hemisphere [stereographic projections](http://mathworld.wolfram.com/StereographicProjection.html)). The figure below is an attempt to show how a simple block model of the first motions of 5 common types of earthquake look when projected onto a lower hemisphere, and then when the pattern on that focal sphere is projected onto the horizontal to produce a 2D focal mechanism." ] }, { "cell_type": "markdown", "id": "62a76001-bd20-4206-bee3-37a70797dd6d", "metadata": {}, "source": [ "\n", "\n", "*Figure: Types of faulting and their expression as focal mechanism of the earthquake.*\n", "\n", "The [Global Centroid-Moment-Tensor (CMT) Project](https://www.globalcmt.org) has measured the moment tensors of all earthquakes with magntiudes $m_w$ 5.5+ since 1976. The CMT project has been continuously funded by the National Science Foundation since its inception. Let us read the CMT solution of the Tohoku earthquake in the *CMTSOLUTION* format, available from their webpage." 
] }, { "cell_type": "code", "execution_count": null, "id": "95b726b8", "metadata": { "tags": [ "remove-input" ] }, "outputs": [], "source": [ "import numpy as np\n", "import os\n", "import subprocess\n", "from multiprocessing import Pool\n", "import glob\n", "import errno\n", "import cmath\n", "from math import atan2\n", "import pandas as pd\n", "import pdb\n", "from obspy import UTCDateTime,read_events,read,read_inventory\n", "from obspy.geodetics import gps2dist_azimuth\n", "\n", "#########################################################\n", "\n", "def readcmtsolution(file='CMTSOLUTION'):\n", " if not os.path.isfile(file): raise FileNotFoundError(errno.ENOENT, os.strerror(errno.ENOENT), file)\n", " cmt={}\n", " f = open(file,'r')\n", " line=f.readline().split()\n", " cmt['source']=line[0];cmt['iyr']=int(line[1]);cmt['imo']=int(line[2]);cmt['ida']=int(line[3])\n", " cmt['iho']=int(line[4]);cmt['imi']=int(line[5]);cmt['fsec']=float(line[6]);cmt['eplat']=float(line[7]\n", ")\n", " cmt['eplong']=float(line[8]);cmt['depth']=float(line[9]);cmt['mb']=float(line[10]);cmt['ms']=float(line[11])\n", " cmt['region']=' '.join(line[12:])\n", "\n", " cmt['ityphdur']=2\n", " cmt['itypso']=0\n", " cmt['time'] = UTCDateTime(cmt['iyr'],cmt['imo'],cmt['ida'],cmt['iho'],cmt['imi'],cmt['fsec'])\n", "\n", " # read cmts\n", " xm = np.zeros(6)\n", " while line:\n", " line = f.readline()\n", " if line[0:10] == 'event name':\n", " cmt['name']=line[12:].split()[0]\n", " elif line[0:10] == 'time shift':\n", " cmt['tcmt']=float(line[12:].split()[0])\n", " elif line[0:13] == 'half duration':\n", " cmt['hdur']=float(line[15:].split()[0])\n", " elif line[0:8] == 'latitude':\n", " cmt['epla']=float(line[10:].split()[0])\n", " elif line[0:9] == 'longitude':\n", " cmt['eplo']=float(line[11:].split()[0])\n", " elif line[0:5] == 'depth':\n", " cmt['depth']=float(line[7:].split()[0])\n", " #Double couple\n", " elif line[0:3] == 'Mrr':\n", " xm[0]=float(line[5:].split()[0])\n", " elif line[0:3] == 'Mtt':\n", " xm[1]=float(line[5:].split()[0])\n", " elif line[0:3] == 'Mpp':\n", " xm[2]=float(line[5:].split()[0])\n", " elif line[0:3] == 'Mrt':\n", " xm[3]=float(line[5:].split()[0])\n", " elif line[0:3] == 'Mrp':\n", " xm[4]=float(line[5:].split()[0])\n", " elif line[0:3] == 'Mtp':\n", " xm[5]=float(line[5:].split()[0])\n", " # Single force\n", " elif line[0:3] == 'CFr':\n", " cmt['itypso']=1\n", " xm[1]=float(line[5:].split()[0])\n", " elif line[0:3] == 'CFt':\n", " cmt['itypso']=1\n", " xm[2]=float(line[5:].split()[0])\n", " elif line[0:3] == 'CFp':\n", " cmt['itypso']=1\n", " xm[3]=float(line[5:].split()[0])\n", " # Single force\n", " elif line[0:3] == 'SFr':\n", " cmt['itypso']=2\n", " xm[1]=float(line[5:].split()[0])\n", " elif line[0:3] == 'SFt':\n", " cmt['itypso']=2\n", " xm[2]=float(line[5:].split()[0])\n", " elif line[0:3] == 'SFp':\n", " cmt['itypso']=2\n", " xm[3]=float(line[5:].split()[0])\n", "\n", " if cmt['itypso'] > 1:\n", " xm[3]=0.\n", " xm[4]=0.\n", " xm[5]=0.\n", " cmt['xm']=xm\n", " \n", " # Moment tensor\n", " M = np.array([[xm[0], xm[3], xm[4]], [xm[3], xm[1], xm[5]], [xm[4], xm[5], xm[2]]])\n", " \n", " # Scalar moment\n", " M0 = np.sqrt(np.sum(M ** 2) / 2)\n", " cmt['m0'] = M0\n", " \n", " # Moment magnitude\n", " Mw = np.round(np.log10(M0) / 1.5 - 10.73, 1)\n", " cmt['mw'] = Mw\n", "\n", " return cmt" ] }, { "cell_type": "code", "execution_count": null, "id": "66dc21f0", "metadata": { "tags": [ "remove-input" ] }, "outputs": [], "source": [ "# read the CMT solution file of Tohoku earthquake\n", "cmt = 
readcmtsolution('Files/CMTSOLUTION_TOHOKU')\n", "#print(cmt)" ] }, { "cell_type": "code", "execution_count": null, "id": "87ea076a-cbfa-4a79-8073-70c72c3a6304", "metadata": { "tags": [ "remove-input" ] }, "outputs": [], "source": [ "from obspy.imaging.beachball import beachball\n", "fig = beachball(cmt['xm'])" ] }, { "cell_type": "markdown", "id": "59f69af9-48cf-4b89-aa86-f506d9bd0bb8", "metadata": { "tags": [] }, "source": [ "### TO DO\n", "\n", "**Question** Based on the beachball focal mechanism above, what type of faulting caused the Tohoku earthquake? Was the fault shallowly dipping at a low angle from the surface (dip < 30$^{\\circ}$) or steeply dipping at a high angle (dip > 60$^{\\circ}$)? Which plate was the overriding plate?\n", "\n", "**Answer:**" ] }, { "cell_type": "markdown", "id": "503b1034-f7e7-4975-8b24-a90ffbaa84a0", "metadata": {}, "source": [ "

\n", "
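" ] },
{ "cell_type": "markdown", "id": "added-mw-sketch", "metadata": {}, "source": [ "For reference, the moment magnitude $m_w$ quoted for an event is computed from the scalar seismic moment $M_0$ using the same relation as in `readcmtsolution()` above, `Mw = log10(M0)/1.5 - 10.73` with $M_0$ in dyne-cm. A small worked sketch (the $M_0$ value below is only approximate):\n",
"\n",
"```python\n",
"import numpy as np\n",
"\n",
"# Moment magnitude from scalar moment, exactly as in readcmtsolution() above.\n",
"# M0 is in dyne-cm; 5.3e29 is roughly the Global CMT value for the Tohoku earthquake.\n",
"M0 = 5.3e29\n",
"Mw = np.log10(M0) / 1.5 - 10.73\n",
"print(round(Mw, 1))   # about 9.1\n",
"```\n",
"\n",
"Because the scale is logarithmic, a factor of about 30 in the scalar moment corresponds to one unit of magnitude.\n",
"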

" ] }, { "cell_type": "markdown", "id": "fcdbe285-4d33-4ec3-a65e-1b804abd8a52", "metadata": { "tags": [] }, "source": [ "### Section A: Vibrations at a single station\n", "\n", "Modern seismometers measure ground motion in three mutually perpendicular directions:\n", "\n", "- *North-South* (LHN or BHN component). This records the north-south horizontal component of vibration. “Up” on this graph means the ground moved north. “Down” on this graph means the ground moved south.\n", "\n", "- *East-West* (LHE or BHE component). This records the east-west horizontal component of vibration. “Up” on this graph means the ground moved east. “Down” on this graph means the ground moved west.\n", "\n", "- *Vertical* (LHZ or BHZ component). This records the vertical component of vibration. “Up” on this graph the means the ground moved up. “Down” on this graph means the ground moved down. \n", "\n", "The prefix (L-long period, or B-broadband) indicates the sampling rate of seismic data stream. When we wish to analyse signals that are finely spaced, we use the broadband data steam (i.e. think of Nyquist frequency). Sometimes, the horizontal orientations of the sensor in the instrument are not perfectly aligned with the N-S and E-W directions, even if the vertical component is perfectly aligned. In these cases, the horizontal components are denoted as LH1 and LH2 (or BH1 and BH2). We typically analyze the motion along the propagation direction of the wavefront, which gives us clearer information on individual seismic phases (as you will see below). Note that finding the propagation direction from earthquake to a station requires the coordinates and depth of the hypocenter, as denoted by the CMT focal mechanism above. \n", "\n", "\n", "\n", "*Figure: Rotation of vibrations recorded on a seismogram to radial, transverse and vertical components along the propagation direction.*\n" ] }, { "cell_type": "markdown", "id": "7cef9399-f6e0-4279-a19f-6b948d5a4506", "metadata": {}, "source": [ "Let us perform this calculation for the vibrations recorded on the long-period components at Ascension Island (station code: ASCN) in the South Atlantic Ocean from the Tohoku earthquake." ] }, { "cell_type": "code", "execution_count": null, "id": "a528064e-49be-4cc9-b181-954943b9186e", "metadata": { "tags": [ "remove-input" ] }, "outputs": [], "source": [ "st = read('Files/waveforms/II.ASCN.*.LH*.mseed') # Ascension Island, distance=144\n", "print(st)" ] }, { "cell_type": "code", "execution_count": null, "id": "61cc3ed4-2273-4538-924b-2f8b5975ecfe", "metadata": { "tags": [ "remove-input" ] }, "outputs": [], "source": [ "#st.plot();" ] }, { "cell_type": "code", "execution_count": null, "id": "ac19240b-1bb6-4445-ad5e-e34de5056872", "metadata": { "tags": [ "remove-input" ] }, "outputs": [], "source": [ "stationdir = 'Files/stations/'\n", "inv = read_inventory('%s%s.%s.xml' % (stationdir, st[0].stats.network, st[0].stats.station))\n", "#inv.plot_response(0.001, output=\"DISP\");" ] }, { "cell_type": "markdown", "id": "95cae9e3-9a8d-4dd8-9bc1-dd185b14457f", "metadata": {}, "source": [ "We also wish to focus on vibrations in the period range 20-200 seconds to start with. 
This done by *band-pass filtering* the seismogram below:" ] }, { "cell_type": "code", "execution_count": null, "id": "ea48be79-430b-4e78-ac64-62e19cf385f7", "metadata": { "tags": [ "remove-input" ] }, "outputs": [], "source": [ "\n", "periods = [20,200]\n", "\n", "stat_lat = inv.select(station=st[0].stats.station)[0][0].latitude\n", "stat_lon = inv.select(station=st[0].stats.station)[0][0].longitude\n", "evt_lat = cmt['epla']\n", "evt_lon = cmt['eplo']\n", "\n", "# compute back azimuth (check sign, later!):\n", "dist, back_az, az = gps2dist_azimuth(evt_lat, evt_lon, stat_lat, stat_lon)\n", "\n", "# convert distance to degrees\n", "dist = (dist / 6371000) / np.pi * 180\n", "\n", "# select the traces from a station\n", "st3 = st.copy()\n", "\n", "# remove instrument response, filter, and rotate\n", "pre_filt = [0.8/periods[1] ,1/periods[1], 1/periods[0], 1.25/periods[0]]\n", "st3 = st3.remove_response(inventory=inv, output=\"DISP\",pre_filt=pre_filt).taper(0.05)\n", "if not st3.select(channel='LHE'): st3 = st3.rotate(method=\"->ZNE\", inventory=inv,back_azimuth=back_az)\n", "st3 = st3.rotate(method=\"NE->RT\", inventory=inv,back_azimuth=back_az)\n", "st3.plot();" ] }, { "cell_type": "code", "execution_count": null, "id": "ef897c33-b6ae-46f9-a93d-8ab3591d8b53", "metadata": { "tags": [ "remove-input" ] }, "outputs": [], "source": [ "import holoviews as hv\n", "from holoviews.operation.datashader import datashade\n", "from holoviews.streams import PlotSize\n", "import datashader as ds\n", "from holoviews import opts\n", "PlotSize.scale=2\n", "hv.extension('plotly')\n", "\n", "import plotly\n", "import plotly.graph_objects as go\n", "import plotly.express as px\n", "plotly.offline.init_notebook_mode() \n", "\n", "from obspy.taup import TauPyModel\n", "from obspy.geodetics.base import gps2dist_azimuth\n", "import warnings\n", "warnings.filterwarnings(\"ignore\")" ] }, { "cell_type": "markdown", "id": "65a3f866-d8ce-4249-bd30-410684f92259", "metadata": {}, "source": [ "### Section B: Horizontal versus Vertical Motions" ] }, { "cell_type": "markdown", "id": "5dfc5a36-7b46-437c-b811-b14509f66875", "metadata": {}, "source": [ "#### Task I: At a single station" ] }, { "cell_type": "code", "execution_count": null, "id": "8afc8405-d29c-48fa-9e86-286466cd77ba", "metadata": { "tags": [ "remove-input" ] }, "outputs": [], "source": [ "duration_h = 1.5\n", "st3 = st3.slice(endtime=st3[0].stats.starttime + 3600*duration_h)\n", "\n", "# grab the time series data for processing\n", "scalefactor = 0.5\n", "times = np.array([])\n", "signals = np.array([])\n", "component_indicator = []\n", "components = [\"T\", \"Z\"]\n", "for ii, component in enumerate(components):\n", " trace = st3.select(component=component)[0]\n", " t = trace.times()\n", " x = trace.data\n", " x_scale = np.max(np.abs(x)) * scalefactor\n", " times = np.hstack((times, t/60)) # convert to minutes\n", " signals = np.hstack((signals, 2*ii + x / x_scale))\n", " component_indicator += [component] * t.size\n", " \n", "# convert to dataframe\n", "data = dict()\n", "cols = []\n", "\n", "data['time (minutes)'] = times \n", "data['signal (normalized)'] = signals\n", "data['component'] = component_indicator\n", "data['station'] = ['%s-%s' % (st3[0].stats.network, st3[0].stats.station)] * times.size\n", "data['distance'] = [dist] * times.size\n", "data['azimuth'] = [az] * times.size\n", "dfm = pd.DataFrame(data)" ] }, { "cell_type": "code", "execution_count": null, "id": "fd2777ce-654f-4af5-a6c9-30212ae9d237", "metadata": { "tags": [ "remove-input" ] }, 
"outputs": [], "source": [ "fig_title = 'M%.1f %s lat=%.2f lon=%.2f dep=%.2f km band=%d-%d s station=%s-%s' % (cmt['mw'], \n", " cmt['region'].capitalize(), \n", " cmt['epla'], \n", " cmt['eplo'], \n", " cmt['depth'],\n", " periods[0], \n", " periods[1],\n", " st3[0].stats.network,\n", " st3[0].stats.station)\n", "fig_1 = px.line(dfm, x='time (minutes)', y='signal (normalized)', color='component', \n", " hover_data={'station': ':.s', 'distance': ':.2f', 'azimuth':':.2f'}, \n", " title=fig_title)\n", "fig = go.Figure(data=fig_1,layout=fig_1.layout)\n", "plotly.offline.iplot(fig)" ] }, { "cell_type": "markdown", "id": "93ace362-b51b-45ea-ba9a-17645e1e52fb", "metadata": { "tags": [] }, "source": [ "#### TO DO\n", "\n", "**Question** Identify the first seismic wave arrival at your station on the Z component. This is the first P-wave. Note that you should consider when the seismograph first starts to respond to the P-wave – i.e., the “shoulder” of the wave, or where trace starts to climb or fall towards the first peak or trough. Report your P-wave *travel time pick* in minutes and seconds.\n", "\n", "**Answer:**" ] }, { "cell_type": "markdown", "id": "93a4eef1-e4b0-4f44-b66c-2ce61b0db847", "metadata": {}, "source": [ "

\n", "
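" ] },
{ "cell_type": "markdown", "id": "added-taup-sketch", "metadata": {}, "source": [ "If you would like to sanity-check your pick, ObsPy's TauP utilities (used again later in this notebook) predict arrival times for a reference Earth model. A sketch, assuming `dist` (epicentral distance in degrees) and `cmt` from the cells above; note that at a distance of about 144 degrees the first arrival is a diffracted or core phase rather than the direct P wave:\n",
"\n",
"```python\n",
"# Predicted travel times for a 1-D reference model (ak135), as a rough cross-check.\n",
"from obspy.taup import TauPyModel\n",
"\n",
"model = TauPyModel(model='ak135')\n",
"# Beyond ~100 degrees the direct P and S phases are replaced by diffracted/core phases.\n",
"arrivals = model.get_travel_times(source_depth_in_km=cmt['depth'],\n",
"                                  distance_in_degree=dist,\n",
"                                  phase_list=['P', 'Pdiff', 'PKIKP', 'S', 'SKS'])\n",
"for arr in arrivals:\n",
"    print(arr.name, round(arr.time / 60, 1), 'minutes')\n",
"```\n",
"\n",
"Predicted times are measured from the earthquake origin time, which may differ slightly from the start of the plotted trace.\n",
"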

" ] }, { "cell_type": "markdown", "id": "5e3259b3-fdd2-4cf9-b8ca-c6881dd5b555", "metadata": {}, "source": [ "**Question** Identify the first S-wave arrival at your station. Hint: how would the relative horizontal vs. vertical motion differ for an S wave compared to a P wave? There may also be a change in wavelength and/or amplitude; S-waves look ‘different” than P-waves. \n", "But there is likely more uncertainty about when to “pick” the arrival time of the first S-wave. This is because by this time, the station is still responding to the passage of P and other waves; and the S-waves are superimposed on these other waves. This is part of the “art” of seismology – and also partly why we use data from many stations to analyze seismic waves. Report your S-wave *travel time pick* in minutes and seconds.\n", "\n", "**Answer:**" ] }, { "cell_type": "markdown", "id": "3e096dc9-b45a-4e3c-b26d-0c5a109e54ad", "metadata": {}, "source": [ "

" ] }, { "cell_type": "markdown", "id": "355f3cd1-5e71-457e-a21f-c5ba02d1870c", "metadata": {}, "source": [ "**Question** Identify the surface waves on the seismogram. These are the waves with the strongest vibrations (largest amplitudes). Provide your *travel time picks* for the Rayleigh and Love wave packets on Z and T components, respectively. *Note*: Picking a travel time of a surface wave is complicated and depends strongly on the frequency of the surface wave, due to a property called *dispersion*. For this task, simply report the time of the first shoulder from the left (earliest in time). Which of the two waves (Love or Rayleigh) is arriving sooner at the station?\n", "\n", "**Answer:**" ] }, { "cell_type": "markdown", "id": "122bff2a-ca78-4945-b2b5-43060bc7695d", "metadata": {}, "source": [ "

\n", "
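" ] },
{ "cell_type": "markdown", "id": "added-surfacewave-sketch", "metadata": {}, "source": [ "For a rough idea of where to look, you can estimate surface-wave arrival times from the epicentral distance and a typical group speed. The speeds below are assumed round numbers for long-period Love and Rayleigh waves, not measured values:\n",
"\n",
"```python\n",
"# Back-of-the-envelope surface-wave arrival estimate.\n",
"dist_km = 144 * 111.1949          # approximate epicentral distance of ASCN in km\n",
"for name, u in [('Love', 4.4), ('Rayleigh', 3.7)]:   # assumed group speeds in km/s\n",
"    print(name, round(dist_km / u / 60, 1), 'minutes')\n",
"```\n",
"\n",
"Your picks on the T and Z components should land in the same general neighborhood as estimates like these.\n",
"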

" ] }, { "cell_type": "markdown", "id": "40316d9f", "metadata": {}, "source": [ "#### Task II: Record Section of the USArray Transportable Array" ] }, { "cell_type": "markdown", "id": "82c14255-96d0-4df1-b6ec-fc072e03567f", "metadata": {}, "source": [ "A single seismometer can give us some information about the earthquake and interior structure. In fact, this is the idea behind deploying a seismometer on the Moon and Mars. However, *arrays* of seismometer can provide unprecedented insights into geological processes. \n", "\n", "[USArray](http://www.usarray.org) is a 15-year program to place a dense network of permanent and portable seismographs across the continental United States. The seismographs record local, regional, and distant (teleseismic) earthquakes. This mega-project is funded by the US National Science Foundation and we will use data from a part of this project called the *Transportable Array*.\n", "\n", "\n", "\n", "\n", "*Figure: Installation plan of the transportable array of USArray.*\n", "\n", "USArray has a large traveling network of 400 high-quality, portable seismographs that are being placed in temporary sites across the United States in a rolling fashion. Station spacing is ~70 km (42 mi). Transportable Array data are extremely useful for mapping the structure of Earth's uppermost 70 km. The array was initially deployed in the westernmost United States. Unless adopted and made into a permanent installation, after 18-24 months, each instrument is picked up and moved to the next carefully spaced array location to the east. When completed, over 2000 locations will have been occupied during this program.\n", "\n", "Note that since we are looking at the data from Tohoku earthquake in 2011, we are using data from the Transportable Array as it was installed at that time. 
This means the stations that we are analyzing lie on a grid from Michigan to Louisiana.\n" ] }, { "cell_type": "code", "execution_count": null, "id": "a3a8d515", "metadata": { "tags": [ "remove-input" ] }, "outputs": [], "source": [ "from scipy import interpolate\n", "\n", "def plottraces(cmt, \n", " st, \n", " stationdir, \n", " periods=[20,100],\n", " component=\"T\", \n", " phaselist=[\"P\", \"S\", \"PcP\", \"ScS\"],\n", " plottype=\"holoviews\",\n", " duration_h=1.0,\n", " scalefactor=1.0):\n", " '''\n", " Make holoview objects for plotting traces against epicentral distance\n", " \n", " Input:\n", " -----\n", " cmt: CMT Solution\n", " st: Obspy stream object containing 3 components seismograms\n", " stationdir: directory to station XML files for the streams\n", " periods: [min, max] periods for bandpass filter\n", " phaselist: list of phases to be plotted\n", " plottype: either \"holoviews\" or \"datashader\"\n", " duration: the number of hours of seismograms to be plotted\n", " scalefactor: factor to reduce the amplitude of the seismogram\n", " \n", " Output:\n", " ------\n", " overlay_trace: holoview object for trace plots\n", " overlay_phase: holoview object for phase plots\n", " options: options used by datashader\n", " \n", " These outputs will be used in the next command to create a plot:\n", " ---------------------------------------------------------------\n", " datashade(ndoverlay * overlay_phase, cmap=[\"black\"], cnorm='linear', \n", " aggregator=ds.count(), line_width=8).opts(opts)\n", " \n", " '''\n", " \n", " ########################################################\n", " # READ, PREOCESS, AND GATHER TRACES\n", " ########################################################\n", " signals = []\n", " dists = []\n", "\n", " # set the time for all trace to be the same\n", " t_trace = st[0].slice(endtime=st[0].stats.starttime + 3600*duration_h).times()\n", "\n", " for tr in st.select(channel='LHZ'):\n", " inv = read_inventory('%s%s.%s.xml' % (stationdir, tr.stats.network, tr.stats.station))\n", " stat_lat = inv.select(station=tr.stats.station)[0][0].latitude\n", " stat_lon = inv.select(station=tr.stats.station)[0][0].longitude\n", " evt_lat = cmt['epla']\n", " evt_lon = cmt['eplo']\n", "\n", " # compute back azimuth (check sign, later!):\n", " dist, back_az, _ = gps2dist_azimuth(evt_lat, evt_lon, stat_lat, stat_lon)\n", "\n", " # convert distance to degrees\n", " dist = (dist / 6371000) / np.pi * 180\n", " dists += [dist]\n", "\n", " # select the traces from a station\n", " st3 = st.select(station=tr.stats.station).copy()\n", "\n", " # remove instrument response, filter, and rotate\n", " pre_filt = [0.8/periods[1] ,1/periods[1], 1/periods[0], 1.25/periods[0]]\n", " st3 = st3.slice(endtime=st3[0].stats.starttime + 3600*duration_h).remove_response(inventory=inv, \n", " output=\"DISP\",pre_filt=pre_filt).taper(0.05)\n", " if not st3.select(channel='LHE'):\n", " st3 = st3.rotate(method=\"->ZNE\", inventory=inv,back_azimuth=back_az)\n", " st3 = st3.rotate(method=\"NE->RT\", inventory=inv,back_azimuth=back_az)\n", "\n", " # grab the time series data for processing\n", " trace = st3.select(component=component)[0]\n", " t = trace.times()\n", " x = trace.data\n", "\n", " # interpolate to the same time\n", " f = interpolate.interp1d(t, x, kind='linear', fill_value='extrapolate')\n", " x = f(t_trace)\n", " x_scale = np.max(np.abs(x)) * scalefactor\n", " signals += [dist + x / x_scale]\n", " \n", " \n", " # convert to dataframe\n", " data = dict()\n", " cols = []\n", "\n", " for ii in 
range(len(signals)):\n", " data[\"trace_%03d\" % ii] = signals[ii]\n", " cols += [\"trace_%03d\" % ii]\n", " data['time'] = t / 60 # convert to minutes\n", " dfm = pd.DataFrame(data)\n", " \n", " ########################################################\n", " # CALCULATE TRAVEL TIMES\n", " ########################################################\n", " dist_min = np.min(dists)\n", " dist_max = np.max(dists)\n", "\n", " distances = np.arange(np.floor(dist_min), np.ceil(dist_max) + 1, 0.5)\n", " traveltime_phases = np.ones((np.size(distances), len(phaselist))) * np.nan\n", "\n", " model = TauPyModel(model=\"ak135\")\n", " for ii, dist in enumerate(distances):\n", " for jj, phase in enumerate(phaselist):\n", " arrivals = model.get_travel_times(source_depth_in_km=cmt['depth'],\n", " distance_in_degree=dist,\n", " phase_list=[phase])\n", " if arrivals:\n", " traveltime_phases[ii, jj] = arrivals[0].time\n", " \n", " # convert to dataframe\n", " traveltime = dict()\n", " for jj, phase in enumerate(phaselist):\n", " traveltime[phase] = traveltime_phases[:,jj] / 60 # convert to minutes\n", " traveltime['distance'] = distances\n", " dft = pd.DataFrame(traveltime)\n", " \n", " ########################################################\n", " # PLOT\n", " ########################################################\n", " options = hv.opts.RGB(width=800, height=800, xlabel='Time (minutes)', \n", " title='M%.1f %s lat=%.2f lon=%.2f dep=%.2f km comp=%s band=%d-%d s, phase=%s' % (cmt['mw'], \n", " cmt['region'].capitalize(), \n", " cmt['epla'], \n", " cmt['eplo'], \n", " cmt['depth'], \n", " component, \n", " periods[0], \n", " periods[1],\n", " phaselist))\n", " \n", " \n", " if plottype == \"datashader\":\n", " overlay_trace = hv.NdOverlay({c:hv.Curve((dfm['time'], dfm[c]), vdims = ['Distance (degree)'], label='trace') for c in cols})\n", " if len(phaselist) > 0: \n", " overlay_phase = hv.NdOverlay({phase:hv.Curve((dft[phase], dft['distance']), \n", " vdims = ['Distance (degree)']\n", " #label = phase\n", " ) for phase in phaselist})\n", " # Make the phase curve thicker\n", " overlay_phase = overlay_phase * overlay_phase * overlay_phase * overlay_phase\n", " else:\n", " overlay_phase=None\n", " else:\n", " overlay_trace = [hv.Curve((dfm['time'], dfm[c]), vdims = ['Distance (degree)'], group='trace') for c in cols]\n", " if len(phaselist) > 0: \n", " overlay_phase = [hv.Curve((dft[phase], dft['distance']), vdims = ['Distance (degree)'], group = 'phase') for phase in phaselist]\n", " else:\n", " overlay_phase=None\n", " \n", " return overlay_trace, overlay_phase, options" ] }, { "cell_type": "code", "execution_count": null, "id": "8712a172", "metadata": { "tags": [ "remove-input" ] }, "outputs": [], "source": [ "# read waveforms recorded by US Transportable Array\n", "st = read('Files/waveforms/TA.*.LH*.mseed')" ] }, { "cell_type": "code", "execution_count": null, "id": "848b5362", "metadata": { "tags": [ "remove-input" ] }, "outputs": [], "source": [ "#overlay_trace, overlay_phase, options = plottraces(cmt, st, 'Files/stations/', [20,200], \"Z\", [\"S\", \"pP\", \"P\",\"PP\"], \"datashader\")\n", "overlay_trace, overlay_phase, options = plottraces(cmt, st, 'Files/stations/', [20,200], \"Z\", [], \"datashader\")\n", "if overlay_phase != None:\n", " things_to_plot = overlay_trace * overlay_phase\n", "else:\n", " things_to_plot = overlay_trace \n", "datashade(things_to_plot, cmap=\"gist_gray\", cnorm='linear', aggregator=ds.count()).opts(options)" ] }, { "cell_type": "code", "execution_count": null, "id": 
"feceeaec", "metadata": { "tags": [ "remove-input" ] }, "outputs": [], "source": [ "#overlay_trace, overlay_phase, options = plottraces(cmt, st, 'Files/stations/', [20,100], \"T\", [\"S\", \"ScS\"], \"datashader\", 1, 2)\n", "overlay_trace, overlay_phase, options = plottraces(cmt, st, 'Files/stations/', [20,100], \"T\", [], \"datashader\", 1, 2)\n", "if overlay_phase != None:\n", " things_to_plot = overlay_trace * overlay_phase\n", "else:\n", " things_to_plot = overlay_trace \n", "datashade(things_to_plot, cmap=\"gist_gray\", cnorm='linear', aggregator=ds.count()).opts(options)" ] }, { "cell_type": "markdown", "id": "d9fda269-5963-42c7-9d23-fbb5a133d203", "metadata": { "tags": [] }, "source": [ "#### TO DO\n", "\n", "**Question** The *record sections* above show data from the first hour after the earthquake origin time on the Z and T components. Zoom into any distance range 10 degrees in width, and download the image as png from the options (top right). Draw a line with a red pen where you see the P wave (on the Z component) and S wave (on the T component). Find slope of this curve and report this as the speed of the P and S wave (in m/s). Are the slopes expected to vary with distance? If so, why? *Hint:* You can assume that 1 degree equals 111.1949 km on average. Attach the annotated image with the assignment.\n", "\n", "**Answer:**" ] }, { "cell_type": "markdown", "id": "cad4b97a-775c-4b2e-8bd1-6e8df35d1ff2", "metadata": {}, "source": [ "

\n", "
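" ] },
{ "cell_type": "markdown", "id": "added-slope-sketch", "metadata": {}, "source": [ "A minimal sketch of the slope-to-speed arithmetic. The two picks below are made-up numbers purely to show the conversion; substitute your own readings from the annotated image:\n",
"\n",
"```python\n",
"# Converting a record-section slope into an apparent wave speed.\n",
"deg2km = 111.1949                      # km per degree, as given in the question\n",
"d1, t1 = 80.0, 12.1                    # hypothetical pick: distance (deg), time (min)\n",
"d2, t2 = 90.0, 13.0                    # second hypothetical pick\n",
"slope = (d2 - d1) / ((t2 - t1) * 60)   # degrees per second\n",
"speed = slope * deg2km * 1000          # meters per second\n",
"print(round(speed), 'm/s')\n",
"```\n",
"\n",
"Strictly speaking, this is an apparent speed measured along the Earth's surface rather than the true wave speed at depth.\n",
"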

" ] }, { "cell_type": "markdown", "id": "aafb7464", "metadata": { "tags": [] }, "source": [ "### Part 2: Waveforms at different period bands\n", "\n", "Next, we will evaluate if we can glean more information by analyzing waveforms at other period bands. As you will see, there is a plethora of information in these seismograms at various period/frequency bands. Let us look at the vertical (Z) component from the US Transportable Array at two period bands (5-40 seconds vs 20-100 seconds)." ] }, { "cell_type": "code", "execution_count": null, "id": "568b3129", "metadata": { "tags": [ "remove-input" ] }, "outputs": [], "source": [ "overlay_trace, overlay_phase, options = plottraces(cmt, st, 'Files/stations/', [5,40], \"Z\", [], \"datashader\", 1, 2)\n", "if overlay_phase != None:\n", " things_to_plot = overlay_trace * overlay_phase\n", "else:\n", " things_to_plot = overlay_trace \n", "datashade(things_to_plot, cmap=\"gist_gray\", cnorm='linear', aggregator=ds.count()).opts(options)" ] }, { "cell_type": "code", "execution_count": null, "id": "9ebd55ea", "metadata": { "tags": [ "remove-input" ] }, "outputs": [], "source": [ "overlay_trace, overlay_phase, options = plottraces(cmt, st, 'Files/stations/', [20,100], \"Z\", [], \"datashader\", 1, 2)\n", "if overlay_phase != None:\n", " things_to_plot = overlay_trace * overlay_phase\n", "else:\n", " things_to_plot = overlay_trace \n", "datashade(things_to_plot, cmap=\"gist_gray\", cnorm='linear', aggregator=ds.count()).opts(options)" ] }, { "cell_type": "markdown", "id": "009b9715-2f7e-4051-bfaf-5924b07a19cf", "metadata": { "tags": [] }, "source": [ "#### TO DO\n", "\n", "**Question** The *record sections* above show data from the first hour after the earthquake origin time on the Z component for data filtered at two period bands (5-40 seconds vs 20-100 seconds). Do you see any phases that are more prominent in one period band or the other? Annotate this phase by zooming into any distance range 10 degrees in width, and downloading the image as png from the options (top right). Attach the annotated image with the assignment.\n", "\n", "**Answer:**" ] }, { "cell_type": "markdown", "id": "db4758cb-5338-428d-804e-b6c4ff907e3d", "metadata": {}, "source": [ "

\n", "
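" ] },
{ "cell_type": "markdown", "id": "added-bandpass-sketch", "metadata": {}, "source": [ "The record sections above are filtered while the instrument response is removed (via the `pre_filt` argument). If you want to experiment with other bands on a single trace, a simpler band-pass filter is enough for a quick look. A sketch, assuming the Transportable Array stream `st` read above:\n",
"\n",
"```python\n",
"# Compare one station's vertical component in the two period bands.\n",
"tr = st.select(channel='LHZ')[0]\n",
"for tmin, tmax in [(5, 40), (20, 100)]:\n",
"    band = tr.copy()                    # keep the original trace untouched\n",
"    band.detrend('demean')\n",
"    band.filter('bandpass', freqmin=1.0/tmax, freqmax=1.0/tmin,\n",
"                corners=4, zerophase=True)\n",
"    band.plot()\n",
"```\n",
"\n",
"This skips the response removal, so the amplitudes are in raw counts, but the relative visibility of phases between the two bands is the point of the comparison.\n",
"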

" ] }, { "cell_type": "markdown", "id": "14e6c146-8a7f-4a32-8f60-5f2de18d2ba0", "metadata": { "tags": [] }, "source": [ "### Part 3: Spectrum of Vibrations\n" ] }, { "cell_type": "code", "execution_count": null, "id": "26c6ed4e", "metadata": { "tags": [ "remove-input" ] }, "outputs": [], "source": [ "%matplotlib inline\n", "\n", "from scipy.fftpack import fft\n", "import numpy as np\n", "from importlib import reload\n", "import os\n", "from scipy import signal\n", "import matplotlib.pyplot as plt" ] }, { "cell_type": "markdown", "id": "550f7bb8-a54c-4846-9bfe-9c14e5d30a56", "metadata": {}, "source": [ "To complete the illustration of time versus frequency plots (a normal signal versus it's spectrum), here is a seismic signal (ground motion caused by an earthquake). The time series is a complicated looking signal. In fact, this signal can be described as a combination of a very large number of sinusoids, all with amplitudes as shown in the spectrum graph." ] }, { "cell_type": "markdown", "id": "c1a0a17b-cdc7-452e-8b2c-f700e72645fd", "metadata": {}, "source": [ "**2004 Sumatra - Andaman Islands Earthquake [USGS Link](https://earthquake.usgs.gov/earthquakes/eventpage/official20041226005853450_30/executive)**\n", "\n", "The devastating M 9.1 earthquake off the west coast of northern Sumatra on December 26, 2004, occurred as the result of thrust faulting on the interface of the India plate and the Burma microplate. In a period of minutes, the faulting released elastic strains that had accumulated for centuries from ongoing subduction of the India plate beneath the overriding Burma microplate. \n", "\n", "Since this earthquake was so large, it was able to *ring* the Earth like a bell, causing vibrational normal modes or standing waves. Let us look at the data from this earthquake recorded at Echery - Sainte Marie aux Mines, France.\n" ] }, { "cell_type": "code", "execution_count": null, "id": "6391c685", "metadata": { "tags": [ "remove-input" ] }, "outputs": [], "source": [ "from pathlib import Path\n", "import numpy as np\n", "import matplotlib.pyplot as plt\n", "from obspy.clients.fdsn import Client\n", "from obspy import UTCDateTime,read_inventory, read\n", "from obspy.geodetics.base import gps2dist_azimuth\n", "from gatspy.periodic import LombScargleFast\n", "\n", "inventory_store = 'Files/ECH_inventory.xml'\n", "path = Path(inventory_store)\n", "\n", "if path.is_file():\n", " inv = read_inventory(inventory_store)\n", "else:\n", " # Create two times which we will later use to download time series data.\n", " starttime = UTCDateTime(2004, 12, 26, 0, 58, 50)\n", " # Create a new time object by adding 10 days to the previous ones. 
Time\n", " # differences are in seconds.\n", " endtime = starttime + 86400 * 10\n", " client='IRIS'\n", " level='response'\n", " query ='G-ECH-*-LH*'\n", "\n", " c = Client(client)\n", " network,station,location,channel=query.split('-')\n", " inv = c.get_stations(network=network, station=station, location=location, channel=channel,\n", " starttime=starttime, endtime=endtime,level=level)\n", " \n", " inv.write(inventory_store, format='STATIONXML') " ] }, { "cell_type": "code", "execution_count": null, "id": "4b1f5f79", "metadata": { "tags": [ "remove-input" ] }, "outputs": [], "source": [ "st = read('Files/Sumatra_2004_ECH.mseed')" ] }, { "cell_type": "code", "execution_count": null, "id": "7b78f776-922c-4f64-beb3-ee433919acf5", "metadata": { "tags": [ "remove-input" ] }, "outputs": [], "source": [ "nhours = 230\n", "pre_filt=[1e-4,2e-4,3e-3,4e-3]\n", "\n", "# read stat_lon, stat_lat\n", "stat_lon = inv[0][0].longitude\n", "stat_lat = inv[0][0].latitude\n", "\n", "# read evt_lat, evt_lon\n", "evt_lat = 3.09\n", "evt_lon = 94.26\n", "\n", "# compute back azimuth (check sign, later!):\n", "dist, back_az, _ = gps2dist_azimuth(evt_lat, evt_lon,stat_lat, stat_lon) \n", "\n", "st2 = st.copy().slice(endtime=st[0].stats.starttime + 3600*nhours).remove_response(inventory=inv, output=\"ACC\",pre_filt=pre_filt).rotate(method=\"NE->RT\", inventory=inv,back_azimuth=back_az).taper(0.05)\n", "\n", "# Remember the copy()!\n", "tr = st2.select(component=\"Z\")[0]\n", "\n", "\n", "fmin = 0.0001\n", "fmax = 0.001\n", "N = 100000 # frequency resolution\n", "df = (fmax - fmin) / N\n", "\n", "time = np.linspace(start=0,stop=tr.stats.endtime-tr.stats.starttime,num=tr.stats.npts,endpoint=True)\n", "model = LombScargleFast().fit(time, tr.data)\n", "power = model.score_frequency_grid(fmin, df, N)\n", "freqs = fmin + df * np.arange(N)" ] }, { "cell_type": "code", "execution_count": null, "id": "9d8401fa-da22-4559-9b90-5b58db9d07b1", "metadata": { "tags": [ "remove-input" ] }, "outputs": [], "source": [ "# convert to dataframe\n", "data = dict()\n", "data['Frequency [mHz]'] = freqs*1000 \n", "data['Power'] = power\n", "dfm = pd.DataFrame(data)\n", "\n", "\n", "fig_title = 'Station = %s-%s, %s Channel' % (tr.stats.network,tr.stats.station,tr.stats.channel)\n", "fig_1 = px.line(dfm, x='Frequency [mHz]', y='Power', \n", " title=fig_title)\n", "fig = go.Figure(data=fig_1,layout=fig_1.layout)\n", "plotly.offline.iplot(fig)" ] }, { "cell_type": "markdown", "id": "de4ff8b3-a5d0-4c78-8ad2-54687ee2b002", "metadata": { "tags": [] }, "source": [ "#### TO DO\n", "\n", "**Question** Report the approximate frequencies that you observe on the spectrum here. Identify the lowest frequency (longest period) normal mode and describe how it deforms the Earth.\n", "\n", "**Answer:**" ] }, { "cell_type": "markdown", "id": "2f3aec87-d019-4bc3-b7c8-01ed8d7bd55c", "metadata": {}, "source": [ "

\n", "
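" ] },
{ "cell_type": "markdown", "id": "added-period-sketch", "metadata": {}, "source": [ "When reporting spectral peaks, here and for the tidal peaks further below, it helps to convert frequency to period. A minimal sketch of the conversion, with a placeholder frequency; read your own peak frequencies off the plots:\n",
"\n",
"```python\n",
"# Convert a peak frequency in mHz (the x-axis above) to a period.\n",
"f_mHz = 1.0                        # placeholder value, not a pick\n",
"T_sec = 1.0 / (f_mHz * 1e-3)\n",
"print(round(T_sec / 60, 1), 'minutes, or', round(T_sec / 3600, 2), 'hours')\n",
"```\n",
"\n",
"For example, 1.0 mHz corresponds to a period of about 16.7 minutes.\n",
"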

" ] }, { "cell_type": "code", "execution_count": null, "id": "877cd6b7", "metadata": { "tags": [ "remove-input" ] }, "outputs": [], "source": [ "# LOOK AT TIDES\n", "fmin = 0.001*1E-3\n", "fmax = 0.03*1E-3\n", "\n", "pre_filt=[1e-6,2e-6,3e-3,4e-3]\n", "st2 = st.copy().slice(endtime=st[0].stats.starttime + 3600*nhours).remove_response(inventory=inv, \n", " output=\"ACC\",pre_filt=pre_filt).rotate(method=\"NE->RT\", \n", " inventory=inv,back_azimuth=back_az).taper(0.05)\n", "tr = st2.select(component=\"Z\")[0]\n", "tr.plot()\n", "\n", "N = 60 # frequency resolution\n", "df = (fmax - fmin) / N\n", "\n", "# Faster\n", "time = np.linspace(start=0,stop=tr.stats.endtime-tr.stats.starttime,num=tr.stats.npts,endpoint=True)\n", "model = LombScargleFast().fit(time, tr.data)\n", "power = model.score_frequency_grid(fmin, df, N)\n", "freqs = fmin + df * np.arange(N)" ] }, { "cell_type": "code", "execution_count": null, "id": "3a213171-d62b-49dd-bbda-67691742a445", "metadata": { "tags": [ "remove-input" ] }, "outputs": [], "source": [ "# convert to dataframe\n", "data = dict()\n", "data['Frequency [mHz]'] = freqs*1000 \n", "data['Power'] = power\n", "dfm = pd.DataFrame(data)\n", "\n", "\n", "fig_title = 'Station = %s-%s, %s Channel' % (tr.stats.network,tr.stats.station,tr.stats.channel)\n", "fig_1 = px.line(dfm, x='Frequency [mHz]', y='Power', \n", " title=fig_title)\n", "fig = go.Figure(data=fig_1,layout=fig_1.layout)\n", "plotly.offline.iplot(fig)" ] }, { "cell_type": "markdown", "id": "6007757e-d051-4f43-8114-bd8333c1a625", "metadata": { "tags": [] }, "source": [ "#### TO DO\n", "\n", "**Question** Seismometer are so sensitive that they also record vibrations due to tides in the solid Earth. An example is seen in the spectrum above. Report the two frequencies. Convert to minutes/hours and comment on the forces causing these longer period vibrations (compared to seismic waves).\n", "\n", "**Answer:**" ] }, { "cell_type": "markdown", "id": "4c44480e-4c65-44c7-adb2-1f64854c0222", "metadata": {}, "source": [ "

" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.9.12" } }, "nbformat": 4, "nbformat_minor": 5 }