Detrital MC user guide¶

Figure: Predicted and observed detrital thermochronometer age distributions plotted as empirical cumulative distribution functions
Detrital MC is a Monte Carlo code for comparing predicted and observed detrital thermochronometer ages. This website contains documentation for Detrital MC. If you are interested in the code itself, you can download Detrital MC from its GitHub repository at https://github.com/HUGG/Detrital-MC/.
Quick start¶
This is a quick introduction to how to use Detrital MC. Please check the rest of the documentation for more detailed explanations of how the software operates.
Quick intro coming soon…
Code overview¶
This is a general overview of Detrital MC. Please check the rest of the documentation for more detailed explanations of how the software operates.
Code overview coming soon…
Input file¶
Here we explain the input file used to control Detrital MC, where all parameters for a Detrital MC model and the locations of input data files are specified.
The input file is located in the input subdirectory.
Note that you can freely add comments in your copies of the Detrital MC input file by starting lines with the $ character.
Below, we describe the different sections of the Detrital MC input file and how they work. The general format of this documentation gives information about what should be listed on each line of each section of the input file. The sections and lines are given by number, while the values on each line are given using letters. An example of two lines containing 3 and 5 values is given below.
$=== [3] - Section name ========================================================
VALUE_A VALUE_B VALUE_C
VALUE_A VALUE_B VALUE_C VALUE_D VALUE_E
Please check the rest of the documentation for more detailed explanations of how other parts of the software operate.
Section 1: Basin summary information¶
The first section of the Detrital MC input file is for specifying how many basins are being analyzed, and the names, formats, and associated parameters for the input data files. The input values are described in more detail below.
$=== [1] - Basin summary information ===========================================
1
BH398-AFT 3 BH398_WB009-1km_Pecube_and_topometrics_250m 8 97 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1
Section 1, line 1 (1 required value)¶
Value a: Number of basins to analyze. Type: int
The only input value here is the number of basins to analyze.
Section 1, line 2+ (4+ required values)¶
In Section 1, the remaining line(s) are used to provide information about each of the basins that should be analyzed. You should use one line per basin. Hypothetical example lines are given at the end of this section.
Value a: Name of the observed age data file in the data/observed_ages subdirectory. Type: char
Listing ba1 would tell Detrital MC to read measured ages from the data/observed_ages/ba1.dat file. This should be a character string.
Value b: Predicted age file format. Type: int
Note
Values c, d, … differ depending on the value selected for value b on this line. As a result, we list the corresponding descriptions of those values under each possible value of value b below.
If value b = 1: Comparison.txt file generated by HUGG version of Pecube. Complete this line with the following values:
Value c: Name of the Pecube model run output directory (e.g., RUN00) in the data subdirectory. Type: char
Value d: The predicted thermochronometer age system to use (1 = AHe, 2 = AFT, 3 = ZHe, 4 = ZFT, 5 = MAr). Type: int
If value b = 2: Generic CSV file. Complete this line with the following values:
Value c: Name of the predicted age file in the data subdirectory (e.g., DW001 for DW001.csv). Type: char
Value d: The number of the column containing predicted ages in the CSV file. Type: int
Value e: The number of the column containing predicted erosion rates in the CSV file. Type: int
Note
If value e is equal to 16, 17, 18, 19, 98, or 99, additional information is required.
If value e = 16:
Value f: Bedrock fertility scaling factor 1 (TSS). Type: float
Value g: Bedrock fertility scaling factor 2 (GHS). Type: float
Value h: Bedrock fertility scaling factor 3 (LHS). Type: float
Value i: Bedrock fertility scaling factor 4 (Siwaliks). Type: float
Value j: Bedrock fertility scaling factor 5 (Leucogranites). Type: float
Value k: Bedrock fertility scaling factor 6 (LHS-C). Type: float
Value l: Erosion scaling factor. Type: int
0: None
1: Instantaneous exhumation rates from Pecube
2: Normalized channel steepness
3: Specific stream power
If value e = 17:
Value f: Glacier scaling factor 1 (Glacier-covered areas). Type: float
Value g: Glacier scaling factor 2 (Glacier-free areas). Type: float
Value h: Erosion scaling factor. Type: int
0: None
1: Instantaneous exhumation rates from Pecube
2: Normalized channel steepness
3: Specific stream power
If value e = 18:
Value f: Moraine scaling factor 1 (Moraine-covered areas). Type: float
Value g: Moraine scaling factor 2 (Moraine-free areas). Type: float
Value h: Erosion scaling factor. Type: int
0: None
1: Instantaneous exhumation rates from Pecube
2: Normalized channel steepness
3: Specific stream power
If value e = 19:
Value f: Rock glacier scaling factor 1 (Rock glacier-covered areas). Type: float
Value g: Rock glacier scaling factor 2 (Rock glacier-free areas). Type: float
Value h: Erosion scaling factor. Type: int
0: None
1: Instantaneous exhumation rates from Pecube
2: Normalized channel steepness
3: Specific stream power
If value e = 98:
Value f: Bedrock fertility scaling factor 1 (Checkha/TSS). Type: float
Value g: Bedrock fertility scaling factor 2 (GHS). Type: float
Value h: Bedrock fertility scaling factor 3 (LHS). Type: float
Value i: Bedrock fertility scaling factor 4 (Siwaliks). Type: float
Value j: Bedrock fertility scaling factor 5 (Leucogranites). Type: float
Value k: Bedrock fertility scaling factor 6 (Paro). Type: float
Value l: Glacier scaling factor (Glacier-covered areas). Type: float
Value m: Moraine scaling factor (Moraine-covered areas). Type: float
Value n: Rock glacier scaling factor (Rock glacier-covered areas). Type: float
Value o: Non-glacial scaling factor (Areas free of glacial formations). Type: float
Value p: Erosion scaling factor. Type: int
0: None
1: Instantaneous exhumation rates from Pecube
2: Normalized channel steepness
3: Specific stream power
If value e = 99:
Value f: Glacier scaling factor (Glacier-covered areas). Type: float
Value g: Moraine scaling factor (Moraine-covered areas). Type: float
Value h: Rock glacier scaling factor (Rock glacier-covered areas). Type: float
Value i: Non-glacial scaling factor (Areas free of glacial formations). Type: float
Value j: Erosion scaling factor. Type: int
0: None
1: Instantaneous exhumation rates from Pecube
2: Normalized channel steepness
3: Specific stream power
If value b = 3: Newer generic CSV file. Listed values are the same as for value b = 2, with the addition below:
Note
If value e is equal to 97, additional information is required.
If value e = 97:
Value f: Bedrock fertility scaling factor 1 (Checkha/TSS). Type: float
Value g: Bedrock fertility scaling factor 2 (GHS). Type: float
Value h: Bedrock fertility scaling factor 3 (LHS). Type: float
Value i: Bedrock fertility scaling factor 4 (Siwaliks). Type: float
Value j: Bedrock fertility scaling factor 5 (Leucogranites). Type: float
Value k: Bedrock fertility scaling factor 6 (Paro). Type: float
Value l: Glacier scaling factor (Glacier-covered areas). Type: float
Value m: Moraine scaling factor (Moraine-covered areas). Type: float
Value n: Rock glacier scaling factor (Rock glacier-covered areas). Type: float
Value o: Non-glacial scaling factor (Areas free of glacial formations). Type: float
Value p: Scaling factor for regions with hillslopes >30 degrees. Type: float
Value q: Scaling factor for regions with hillslopes <10 degrees. Type: float
Value r: Erosion scaling factor. Type: int
0: None
1: Instantaneous exhumation rates from Pecube
2: Normalized channel steepness
3: Specific stream power
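For illustration, here are two hypothetical basin lines for the other predicted age file formats (the file names, directory name, column numbers, and age system chosen here are illustrative only; the example listing at the top of this section shows value b = 3):
ba1 1 RUN00 2
ba1 2 DW001 6 12
The first line would read observed ages from data/observed_ages/ba1.dat and predicted AFT ages from the Pecube Comparison.txt file in the data/RUN00 directory. The second would read observed ages from the same file and take predicted ages from column 6 and predicted erosion rates from column 12 of data/DW001.csv; because column 12 is not one of the special values listed above, no additional values are needed.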
Section 2: Number of grains to consider in predicted age PDFs¶
The second section of the Detrital MC input file contains information about the number of “samples” to consider when calculating age distributions. The input values are described in more detail below.
$=== [2] - Number of grains to consider in predicted age PDFs ==================
0
0
Section 2, line 1 (1 required value)¶
Value a: Number of different sample sizes to consider. Type: int
Detrital MC has the option to calculate age distributions using different numbers of ages in the distribution.
If a < 1, the code will use the number of ages in the observed age file
If a > 0, you should list the different sample sizes on the second line
Section 2, line 2 (1 required value, additional optional values)¶
Value a: Number of ‘grains’ in each sample, separated by a single space. Type: int [int int ...]
If value a on line one of this section is less than 1, this value is read, but ignored
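For example, a hypothetical version of this section requesting predicted age distributions for three different sample sizes of 25, 50, and 100 grains might look like the following (the numbers are illustrative only):
$=== [2] - Number of grains to consider in predicted age PDFs ==================
3
25 50 100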
Section 3: PDFs to calculate¶
Section three of the Detrital MC input file contains flags for which age distributions should be calculated. The input values are described in more detail below.
$=== [3] - PDF generation ======================================================
1 0 1
Section 3, line 1 (3 required values)¶
Value a: Flag for whether or not to calculate age distributions for the observed age data. Type: int
The input value must be either 1 or 0.
1 = yes
0 = no
Value b: Flag for whether or not to calculate age distributions for the entire predicted age population. Type: int
The input value must be either 1 or 0.
1 = yes
0 = no
Value c: Flag for whether or not to calculate age distributions from Monte Carlo random samples from the predicted age population. Type: int
The input value must be either 1 or 0.
1 = yes
0 = no
Input data file formats¶
Here we explain the formats of data files that can be read by Detrital MC.
In addition to the input file (input/det_mc_input.txt), Detrital MC can read observed/measured age data and three different formats of predicted age data.
Each is described in more detail below.
Please check the rest of the documentation for more detailed explanations of how other parts of the software operate.
Observed age data file format¶
Predicted age data file format¶
Pecube Comparison.txt file¶
NLINES
LON LAT ELEV PECUBE_ELEV PECUBE_VZ AHE_OBS AHE_PRED AFT_OBS AFT_PRED ZHE_OBS ZHE_PRED ZFT_OBS ZFT_PRED KAR_OBS KAR_PRED BAR_OBS BAR_PRED MAR_OBS MAR_PRED HAR_OBS HAR_PRED FTL_OBS01 FTL_OBS02 FTL_OBS03 FTL_OBS04 FTL_OBS05 FTL_OBS06 FTL_OBS07 FTL_OBS08 FTL_OBS09 FTL_OBS10 FTL_OBS11 FTL_OBS12 FTL_OBS13 FTL_OBS14 FTL_OBS15 FTL_OBS16 FTL_OBS17 FTL_PRED01 FTL_PRED02 FTL_PRED03 FTL_PRED04 FTL_PRED05 FTL_PRED06 FTL_PRED07 FTL_PRED08 FTL_PRED09 FTL_PRED10 FTL_PRED11 FTL_PRED12 FTL_PRED13 FTL_PRED14 FTL_PRED15 FTL_PRED16 FTL_PRED17 RAMAN_OBS RAMAN_PRED
...
CSV file format 1¶
NLINES
UNUSED,UNUSED,UNUSED,UNUSED,SCALE1,AGE_PRED1,AGE_PRED2,AGE_PRED3,AGE_PRED4,UNUSED,UNUSED,AGE_PRED5,UNUSED,UNUSED,UNUSED,UNIT_ID1,UNIT_ID2,UNIT_ID3,UNIT_ID4,SCALE2,SCALE3,SCALE4,SCALE5,UNUSED,UNUSED,UNUSED,UNUSED,UNUSED,UNUSED,UNUSED,SCALE6,SCALE7
CSV file format 2¶
NLINES
UNUSED,UNUSED,UNUSED,UNUSED,SCALE1,AGE_PRED1,AGE_PRED2,AGE_PRED3,AGE_PRED4,UNUSED,UNUSED,AGE_PRED5,UNUSED,UNUSED,UNUSED,UNIT_ID1,UNIT_ID2,UNIT_ID3,UNIT_ID4,SCALE2,SCALE3,SCALE4,SCALE5,UNUSED,UNUSED,UNUSED,UNUSED,UNUSED,UNUSED,UNUSED,SCALE6,SCALE7,SCALE8,UNIT_ID5,UNIT_ID6
Calculation of age distributions¶
This page presents an overview of how age distributions in Detrital MC are calculated. Distributions of ages from detrital samples can be assembled and visualized in several different ways. Below I describe the different distributions and their meanings, as well as how they are used in Detrital MC.
Measured age PDFs and sample age distributions¶
Measured age PDFs¶
In order to calculate age distributions for all grain ages in a sample, the first step is to calculate the probability distribution function \(\mathrm{PDF}(x)\) for a single age assuming a normal distribution of error about the mean age \(\mu\) with the standard deviation \(\sigma\), and with a kernel width scaling factor \(\alpha\).
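A sketch of the implied Gaussian form (assuming \(\alpha\) scales the width of the Gaussian kernel, following Brandon (1996); the exact expression used in Detrital MC may differ) is

\[\mathrm{PDF}(x) = \frac{1}{\alpha \sigma \sqrt{2 \pi}} \exp \left[ -\frac{1}{2} \left( \frac{x - \mu}{\alpha \sigma} \right)^{2} \right]\]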
The default value for \(\alpha\) in Detrital MC is 0.6, but this value can be modified as described in Brandon (1996).
Sample age distributions (SPDFs)¶
Age distributions for measured sample ages can be generated by calculating the sum of the individual measured age PDFs and normalizing that sum to the number of measured ages,
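for example (a sketch of the standard form, where \(\mathrm{PDF}_{i}(x)\) is the PDF of the \(i\)th of \(n\) measured ages; the exact expression in Detrital MC may differ):

\[\mathrm{SPDF}(x) = \frac{1}{n} \sum_{i=1}^{n} \mathrm{PDF}_{i}(x)\]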
This is referred to as the synoptic probability density function (SPDF) by Ruhl and Hodges (2005).
Predicted age PDFs and age distributions¶
Predicted age PDFs¶
The calculation of individual predicted age PDFs is similar to that above, but the predicted age PDFs are scaled by one or more scaling factors collectively referred to as \(f_{\mathrm{eff}}\) in order to account for factors that might increase the probability of an age being present in a catchment predicted age distribution, such as differences in the tectonic uplift rate or bedrock mineral fertility. Thus, the age PDF for a given predicted age can be calculated as
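shown below in sketch form (assuming \(f_{\mathrm{eff}}\) simply multiplies the Gaussian kernel; the exact expression in Detrital MC may differ):

\[\mathrm{PDF}_{\mathrm{p}}(x) = \frac{f_{\mathrm{eff}}}{\alpha \sigma_{\mathrm{p}} \sqrt{2 \pi}} \exp \left[ -\frac{1}{2} \left( \frac{x - \mu_{\mathrm{p}}}{\alpha \sigma_{\mathrm{p}}} \right)^{2} \right]\]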
The values for the scaling factors that are combined as \(f_{\mathrm{eff}}\) are given in the input data file for Detrital MC. Further detail about this is given in the section describing the Detrital MC input file.
Another important difference for the predicted age PDFs is that there are no mean ages or standard deviations for the predicted ages. Instead, the predicted age is used as the mean age \(\mu_{\mathrm{p}}\) and the standard deviation \(\sigma_{\mathrm{p}}\) can be calculated as a function of the uncertainties in the measured ages or a constant percentage of the mean age. For example, it is often the case that the mean uncertainty fraction in the measured ages is used to calculate the predicted age standard deviations such that the calculated standard deviation would be
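something like the following sketch (the exact expression in Detrital MC may differ),

\[\sigma_{\mathrm{p}} = \mu_{\mathrm{p}} \left( \frac{1}{n} \sum_{i=1}^{n} \frac{\sigma_{i}}{\mu_{i}} \right)\]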
where \(\mu_{i}\) and \(\sigma_{i}\) are the mean age and standard deviation for each of the \(n\) measured ages.
Predicted age distributions (SPDFs)¶
The predicted age SPDFs are also calculated similar to those for the sample measured ages, but scaled once again by the scaling factor \(f_{\mathrm{eff}}\). In this case, the age distribution should be normalized to have an area of 1.0, so the predicted SPDF is simply the SPDF divided by the average scaling factor \(\overline{f_{\mathrm{eff}}}\). In other words,
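as a sketch (the exact normalization in Detrital MC may differ),

\[\mathrm{SPDF}_{\mathrm{p}}(x) = \frac{\mathrm{SPDF}(x)}{\overline{f_{\mathrm{eff}}}} = \frac{1}{n \, \overline{f_{\mathrm{eff}}}} \sum_{i=1}^{n} \mathrm{PDF}_{\mathrm{p},i}(x)\]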
Catchment cumulative distributions¶
In order to compare the measured and predicted age distributions, both need to be converted to cumulative distribution functions (CDFs) of some form. There are two options for this in Detrital MC, described below.
Smoothed distributions¶
The standard CDF used in Detrital MC is simply the SPDF integrated using the trapezoid rule. This produces a smooth CDF, since the ages in the SPDF have been smoothed by their measurement uncertainties.
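As a minimal illustration (a Python sketch only, not the Detrital MC source; the function and variable names are hypothetical), the integration could be done as follows:

import numpy as np

def spdf_to_cdf(ages, spdf):
    """Integrate an SPDF over age with the trapezoid rule to get a smooth CDF."""
    ages = np.asarray(ages, dtype=float)
    spdf = np.asarray(spdf, dtype=float)
    # Area of each trapezoid between successive age values
    areas = 0.5 * (spdf[1:] + spdf[:-1]) * np.diff(ages)
    # Running sum of the areas gives the unnormalized CDF
    cdf = np.concatenate(([0.0], np.cumsum(areas)))
    # Normalize so the CDF ends at 1.0
    return cdf / cdf[-1]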
Unsmoothed distributions¶
Option two is to calculate an unsmoothed cumulative distribution function. This function is referred to as the empirical cumulative distribution function (ECDF), which is the same as the cumulative age distribution described by Vermeesch (2007). The result is a step function, where the function value increases by \(1/n\) for each age in the sorted distribution.
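As another minimal, hypothetical Python sketch (again not the Detrital MC source), the ECDF for a set of ages can be constructed as:

import numpy as np

def ecdf(ages):
    """Empirical CDF: a step function that rises by 1/n at each sorted age."""
    x = np.sort(np.asarray(ages, dtype=float))
    y = np.arange(1, x.size + 1) / x.size
    return x, y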
References¶
Brandon, M. T. (1996). Probability density plot for fission-track grain-age samples. Radiation Measurements, 26(5), 663–676.
Ruhl, K. W., & Hodges, K. V. (2005). The use of detrital mineral cooling ages to evaluate steady state assumptions in active orogens; an example from the central Nepalese Himalaya. Tectonics, 24, no.4, 14.
Vermeesch, P. (2007). Quantitative geomorphology of the White Mountains (California) using detrital apatite fission track thermochronology. Journal of Geophysical Research, F, Earth Surface, 112(F3), F03004.
Monte Carlo age sampling¶
Here we explain how Monte Carlo age sampling works in Detrital MC. Please check the rest of the documentation for more detailed explanations of how other parts of the software operate.
Monte Carlo code description coming soon…
Comparing age distributions¶
Here we explain how predicted and measured age distributions are compared in Detrital MC. Please check the rest of the documentation for more detailed explanations of how other parts of the software operate.
Data comparison description coming soon…
Output files and formats¶
Here we explain the files and formats produced by Detrital MC. Please check the rest of the documentation for more detailed explanations of how other parts of the software operate.
Output file format description coming soon…
Using Detrital MC with Pecube¶
Instructions for using Detrital MC with Pecube will be added soon…