If a cell exists where the probability exceeds a random number uniformly distributed between 0 and 1, that cell is assigned as the nucleation point of a new event, and its occurrence time is assigned a value randomly distributed between t0 + Δt and t0 + 1.5Δt; otherwise, Δt is increased by a factor of 1.5 and new probabilities are computed and compared with new random numbers. If more than one cell fulfills the exceedance criterion, the cell with the largest probability is taken as the nucleation cell. The process is repeated until the exceedance condition is met in at least one cell and the rupture of a new event starts. According to equation [2.3], after the rate change caused by an event, if no other perturbation occurs, the rate r gradually returns to the background rate r0. With this algorithm, the new version of the simulator determines the nucleation point and occurrence time of events by a stochastic procedure, rather than by a deterministic rule. As will be shown later, the synthetic catalogs obtained with this new algorithm contain realistic features of event clustering after strong events, which the previous versions did not achieve.
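The nucleation step described above can be sketched as follows. This is a minimal illustration, not the simulator's actual code; in particular, the Poissonian mapping from the cell rate r to a probability, p = 1 − exp(−r Δt), is an assumption made here for concreteness.

```python
import math
import random

def choose_nucleation(rates, t0, dt0, growth=1.5, rng=random):
    """Stochastic selection of the nucleation cell and occurrence time.

    rates  : per-cell seismicity rates (events per unit time)
    t0     : current catalog time
    dt0    : initial trial time window Delta-t
    growth : factor by which the window grows when no cell qualifies
    """
    dt = dt0
    while True:
        # Probability that each cell nucleates within the window dt
        # (assumed Poissonian rate-to-probability mapping).
        probs = [1.0 - math.exp(-r * dt) for r in rates]
        # A cell qualifies if its probability exceeds a uniform random number.
        winners = [(p, i) for i, p in enumerate(probs) if p > rng.random()]
        if winners:
            # If several cells qualify, take the one with the largest probability.
            _, cell = max(winners)
            # Occurrence time drawn uniformly between t0 + dt and t0 + 1.5*dt.
            t_event = t0 + rng.uniform(dt, 1.5 * dt)
            return cell, t_event
        dt *= growth  # enlarge the window and compare again

# Example with three cells of different rates (arbitrary units):
cell, t_event = choose_nucleation([0.002, 0.010, 0.005], t0=0.0, dt0=1.0)
```

Because the window Δt keeps growing, the probabilities approach 1 and the loop is guaranteed to terminate; more active cells are simply more likely to be selected early.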
2.4. Application of the latest version of the simulator to the Nankai mega-thrust fault system
The physical model on which the latest version of our simulation algorithm is based includes, besides the tectonic stress loading and static stress transfer of the previous versions, the rate-and-state constitutive law. The simulator code can be run on a relatively modest computer and is capable of simulating thousands of years of seismic activity, producing catalogs of tens of thousands of events over a wide range of magnitudes. In this study, we apply the simulator code to a physical model of the Nankai mega-thrust, a well-known seismogenic structure, 650 km long, aligned with the Pacific coast of southern Japan, which has generated several earthquakes of magnitude larger than 8.0 in the last 13 centuries (Table 2.2). This structure is typically modeled as subdivided into five main segments characterized by different slip rates, which can rupture separately from, or simultaneously with, each other (Figure 2.14).
We ran the simulator for 3,000 years, including a warm-up period of 1,000 years. The resulting 2,000-year synthetic catalog contains 9,635 events of magnitude ranging from 5.6 (earthquakes rupturing only two cells) to 8.48 (an earthquake rupturing 1,838 cells across segments A-D). Table 2.3 displays some of the main features of the synthetic catalog, and Figure 2.15 shows its magnitude distribution.
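A standard way to summarize a frequency-magnitude distribution such as that of Figure 2.15 is the maximum-likelihood b-value estimate of Aki (1965) for binned magnitudes. The sketch below uses synthetic illustrative magnitudes, not the actual catalog of Table 2.3:

```python
import math

def b_value_mle(mags, m_min, dm=0.1):
    """Aki (1965) maximum-likelihood b-value for magnitudes binned at width dm.

    mags  : catalog magnitudes (all >= m_min)
    m_min : completeness magnitude of the catalog
    """
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - (m_min - dm / 2.0))

# Illustrative synthetic magnitudes (NOT the Nankai catalog):
mags = [5.6 + 0.1 * (i % 20) for i in range(200)]
b = b_value_mle(mags, m_min=5.6)
```

Comparing the b-value of a synthetic catalog with the value observed for real seismicity is one routine check of a simulator's realism.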
Following the method described in section 2.3.3, we carried out a statistical analysis of the interevent times of the 2,000-year simulation of seismicity in the Nankai fault system. According to the “Ellsworth B” equation (WGCEP 2003), already mentioned in section 2.3.2, the magnitude of earthquakes rupturing the entire area of one of the segments labeled A to E in Figure 2.12 would range from 8.2 to 8.4. In this study, we considered the statistics of events with magnitude equal to or larger than 8.0, capable of rupturing a relevant part of one or more of these segments. Figure 2.16 shows the interevent time distribution of the simulation.
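The “Ellsworth B” relation is M = 4.2 + log10(A), with the rupture area A in km². As a rough check (the segment areas used below are assumptions for illustration, not values taken from the model), a segment area of about 10,000-16,000 km² indeed yields magnitudes in the 8.2-8.4 range:

```python
import math

def ellsworth_b_magnitude(area_km2):
    """'Ellsworth B' magnitude-area relation (WGCEP 2003): M = 4.2 + log10(A)."""
    return 4.2 + math.log10(area_km2)

# Hypothetical segment areas; each model cell covers 5 x 5 km = 25 km^2.
m_small = ellsworth_b_magnitude(400 * 25)   # 400 cells -> 10,000 km^2
m_large = ellsworth_b_magnitude(640 * 25)   # 640 cells -> 16,000 km^2
```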
Table 2.2. List of observed or argued mega-earthquakes that ruptured two or more segments of the Nankai mega-thrust (from Parsons et al. 2013)
Figure 2.14. Simplified model of the Nankai mega-thrust. The seismogenic structure is modeled by quadrilateral faults, each of which is composed of square cells of 5 × 5 km. For a color version of this figure, see www.iste.co.uk/limnios/statistical.zip
Table 2.4 displays the mean interevent time, the standard deviation and the coefficient of variation for each segment. The relatively short average interevent times of the simulations can be explained by the fact that two or more segments often rupture simultaneously in a single earthquake. Along with these temporal parameters, Table 2.4 also reports the difference between the log-likelihoods computed under the BPT renewal model and under the time-independent Poisson model (dlogL). For the likelihood estimation, we adopted the values of Tr and Cv reported in Table 2.4 for each fault segment.
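The dlogL statistic can be sketched as follows. The BPT (Brownian Passage Time) model is the inverse Gaussian distribution with mean recurrence time Tr and aperiodicity Cv, while the Poisson model implies exponentially distributed interevent times with mean Tr; the interevent times below are hypothetical, chosen only to illustrate the computation:

```python
import math

def bpt_logpdf(t, tr, cv):
    """Log-density of the BPT (inverse Gaussian) renewal model
    with mean recurrence time tr and aperiodicity cv."""
    return (0.5 * math.log(tr / (2.0 * math.pi * cv**2 * t**3))
            - (t - tr) ** 2 / (2.0 * tr * cv**2 * t))

def exp_logpdf(t, tr):
    """Log-density of the exponential (Poisson-process) interevent model."""
    return -math.log(tr) - t / tr

def dlogL(interevent_times, tr, cv):
    """Log-likelihood gain of the BPT renewal model over the Poisson model."""
    l_bpt = sum(bpt_logpdf(t, tr, cv) for t in interevent_times)
    l_poi = sum(exp_logpdf(t, tr) for t in interevent_times)
    return l_bpt - l_poi

# Hypothetical quasi-periodic interevent times (years), with tr=100, cv=0.3:
times = [92.0, 105.0, 98.0, 110.0, 95.0]
gain = dlogL(times, tr=100.0, cv=0.3)
```

For a strongly quasi-periodic sequence like the one above, dlogL is positive, i.e. the renewal model fits better than the time-independent hypothesis; for near-random interevent times it would be close to zero or negative.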
Table 2.3. Features of the 2,000-year synthetic catalog
Figure 2.15. Frequency-magnitude distribution of the 2,000-year synthetic catalog
Both Figure 2.16 and Table 2.4 show, as expected, that the most active segments are those characterized by the higher slip rates (such as A, B and C). The simulation also shows that, for the less active segments (such as E), interevent times longer than 400 years are possible. The coefficient of variation Cv is typically close to 0.3, which can be associated with a markedly time-predictable behavior of the seismicity. The log-likelihood differences indicate a better performance of the renewal model than of the time-independent hypothesis. As the simulator algorithm computes the stress on all the cells constituting the seismic structure adopted in the model, we can build up the stress history of each cell and display it as an animation (Figure 2.17).
Figure 2.16. Interevent time distribution from a simulation of 2,000 years of seismic activity across the Nankai mega-thrust fault system. For a color version of this figure, see www.iste.co.uk/limnios/statistical.zip
Table 2.4. Statistical parameters of the 2,000-year synthetic catalog of the Nankai mega-thrust fault system
Figure 2.18(a) shows (in a normalized scale) the time variation of the average stress computed