What is a model?

A simplification/abstraction/analogy of reality.

What are the different types of models?

- Empirical models
- Theoretical models

What is an empirical model?

Empirical models are based on observations (data driven). They are good for prediction, but not for explanation.

What is a theoretical model?

Theoretical (or physically based) models are based on process representation (theory driven). They are often based on assumptions and need a lot of measurements to fully implement them. They are not always good for prediction but can have high explanatory power.

Which type of model is best for prediction?

Empirical models.

Which type of model is best for explanation?

Theoretical models.

How can the predictive capability of theoretical models be improved?

Calibration to observations.

What are the objectives of a model?

- To facilitate understanding by eliminating unnecessary components
- To aid in decision making by simulating 'what if' scenarios
- To explain, control, and predict events on the basis of past observations

What is diagnostic modelling?

It aims to understand/investigate important processes.

Give an example of diagnostic modelling.

Payne et al. (2004) - Understanding how quickly inland ice can respond to changes at the periphery.

What is prognostic modelling?

It is used to make future predictions/forecasts and to assess the likelihood of something occurring, for example on the basis of past reconstructions. It is useful for long timescales that we cannot observe over.

Describe the study by Shepherd et al. (2001).

Shepherd et al. (2001) - "Inland Thinning of Pine Island Glacier, West Antarctica"

They used satellite altimetry and interferometry. The grounded PIG thinned by up to 1.6 metres per year between 1992 and 1999. They found the thinning cannot be explained by short-term variability in accumulation and must therefore result from glacier dynamics.

Describe the study by Payne et al. (2004).

"Recent dramatic thinning of largest West Antarctic ice stream triggered by oceans"




Used a numerical model of ice flow to determine whether an oceanic trigger can affect the deep interior of the ice stream on the appropriate timescales. Results show changes in PIG's ice shelf and/or ice plain can be transmitted rapidly upstream on decadal timescales. Therefore likely thinning is a response to recent changes in the oceanography of the Amundsen Sea.

Give an example of regression modelling.

Adalgeirsdottir et al. (2003) - "A regression model for the mass-balance distribution of the Vatnajökull ice cap, Iceland"

The regression variables could explain 96% of the spatial and temporal variance observed in the mass-balance data.

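As a rough illustration of regression modelling (a toy example, not the actual Adalgeirsdottir et al. model; the predictor names and numbers are hypothetical), a linear fit and its explained variance can be computed like this:

    # Illustrative only: a toy linear regression, not the model of
    # Adalgeirsdottir et al. (2003). Predictors and data are made up.
    import numpy as np

    rng = np.random.default_rng(0)
    elevation = rng.uniform(100, 2000, 50)     # hypothetical predictor (m)
    distance = rng.uniform(0, 50, 50)          # hypothetical predictor (km)
    mass_balance = 0.002 * elevation - 0.05 * distance + rng.normal(0, 0.2, 50)

    # Least-squares fit: mass_balance ~ a*elevation + b*distance + c
    X = np.column_stack([elevation, distance, np.ones_like(elevation)])
    coeffs, *_ = np.linalg.lstsq(X, mass_balance, rcond=None)
    predicted = X @ coeffs

    # Fraction of variance explained (R^2), analogous to the 96% reported
    r2 = 1 - np.sum((mass_balance - predicted) ** 2) / np.sum((mass_balance - mass_balance.mean()) ** 2)
    print(f"R^2 = {r2:.2f}")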

What is numerical modelling?

A simplification/abstraction of reality (in process and in space and time).

Describe the Gladstone et al. (2012) prognostic modelling example.

"Calibrated prediction of PIG retreat during 21st and 22nd centuries with a coupled flowline model"



This was a 1D flowline ice sheet model coupled to a box model for cavity circulation.


Describe the Cornford et al. (2015) prognostic modelling example.

"Century-scale simulations of the response of the West Antarctic Ice Sheet to a warming climate"




BISICLES adaptive mesh ice sheet model was used. One, two and three century simulations were carried out of the fast-slowing ice streams of the West Antarctic Ice Sheet.

Name the steps in the modelling process.

1. Physical process
2. Mathematical description
3. Computer code
4. Application
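
A minimal sketch of these four steps, using a hypothetical degree-day melt model rather than a model from the lectures:

    # 1. Physical process: snow/ice melt driven by air temperature.
    # 2. Mathematical description: melt = DDF * max(T - T0, 0), where DDF is a
    #    degree-day factor (a process parameter) and T0 a threshold temperature.
    # 3. Computer code:
    def daily_melt(temperature_c, ddf=4.0, threshold_c=0.0):
        """Melt (mm w.e. per day) from a positive-degree-day relation."""
        return ddf * max(temperature_c - threshold_c, 0.0)

    # 4. Application: drive the model with a made-up temperature series.
    temperatures = [-2.0, 1.5, 3.0, 0.5, 4.2]
    total_melt = sum(daily_melt(t) for t in temperatures)
    print(f"Total melt: {total_melt:.1f} mm w.e.")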

Define constant.

Quantity whose value does not vary in the target system (e.g. speed of light).

Define parameter.

Quantity whose value is constant in the case considered but may vary in different cases (e.g. total solar radiation at the top of Earth's atmosphere).

Define variable.

Quantity whose value may change freely in response to the functioning of the system (e.g. amount of precipitation).

Define relation.

Functional connection or correspondence between two or more system elements (e.g. rainfall, run-off and soil erosion).

Define relationship.

State of being related.

Define process.

Operation or event, operating over time (temporal process) or space (spatial process) or both, which changes a quantity in the target system (e.g. evapotranspiration).

Define scale.

Relative dimension, in space or time, over which processes operate and measurements are made (e.g. local, regional, global; diurnal, seasonal, annual).

Define structure.

Manner in which component parts of a system are organised.

Define system.

Set of related elements (e.g. constants, parameters and variables), the relations between them, the functions or processes that govern these relations and the structure by which they are organised (e.g. forest ecosystem, drainage basin, global carbon cycle, Earth's climate).

State a process that occurs in the glacier system.

Internal deformation.

State a relation that exists within internal deformation.

Ice velocity is related to the gravitational driving stress.
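
As an illustrative sketch of this relation (assuming the shallow-ice approximation with Glen's flow law; all parameter values below are illustrative and not taken from the cards):

    # Driving stress: tau_d = rho * g * H * sin(alpha). In the shallow-ice
    # approximation with Glen's flow law, the deformational surface velocity
    # scales with tau_d raised to the power n.
    import math

    rho = 917.0                 # ice density (kg m^-3)
    g = 9.81                    # gravity (m s^-2)
    H = 1000.0                  # ice thickness (m), illustrative
    alpha = math.radians(0.5)   # surface slope, illustrative
    A = 2.4e-24                 # Glen's flow-law rate factor (Pa^-3 s^-1), temperate ice
    n = 3                       # Glen's flow-law exponent

    tau_d = rho * g * H * math.sin(alpha)          # driving stress (Pa)
    u_def = (2 * A / (n + 1)) * tau_d ** n * H     # deformational surface velocity (m s^-1)
    print(f"tau_d = {tau_d/1e3:.1f} kPa, u = {u_def*3.15e7:.1f} m/yr")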

Why do we not always want to represent systems as a whole?

- It may be too difficult numerically (e.g. turbulence)
- We may only be interested in the large-scale result
- We may not have enough data across the model domain (e.g. infiltration rate)

State the two types of parameters.

Physical parameters and process parameters.

Give examples of physical parameters.

Cliff height, basin area and soil height.

What are the two types of process parameters?

Physical resemblance and non-physical resemblance.

Give examples of process parameters.

Infiltration rate, stomatal resistance and weathering rate.

What is a physical resemblance parameter?

A parameter whose value requires field measurements (e.g. infiltration rate).

What is a non-physical resemblance parameter?

A parameter whose value requires optimisation to find its optimal value (e.g. the ice flow parameter).

What is a sensitivity analysis?

An investigation into how a model responds to changes in the parameter values.
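
A minimal sketch of a one-at-a-time sensitivity analysis on a hypothetical toy melt model (all values made up):

    # Vary one parameter across a range and record how the model output responds.
    def total_melt(ddf, temps):
        return sum(ddf * max(t, 0.0) for t in temps)

    temps = [1.0, 3.5, -0.5, 2.0, 4.0]   # made-up daily temperatures (deg C)
    baseline = total_melt(ddf=4.0, temps=temps)

    # Perturb the parameter and record the relative change in the output.
    for ddf in [2.0, 4.0, 6.0, 8.0]:
        change = 100 * (total_melt(ddf, temps) - baseline) / baseline
        print(f"DDF = {ddf}: melt change vs baseline = {change:+.0f}%")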

Why do parameters need calibration?

Physical parameters may not have measurements across the whole model domain; non-physical parameters must be tuned to give the best result.

What is calibration?

The process of tuning parameter values by assessing model predictions against observations (overlaps with validation).

How is a model calibrated?

By minimising a measure of misfit between predictions and observations, such as the Root Mean Square Error (RMSE).
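
A minimal sketch of RMSE-based calibration, assuming a toy model and made-up observations:

    # Choose the parameter value that minimises the root mean square error
    # between predictions and observations.
    import math

    observations = [10.0, 14.0, 9.0, 12.0]

    def predict(param, forcing=(2.5, 3.5, 2.2, 3.0)):
        """Toy model: output is proportional to the forcing."""
        return [param * f for f in forcing]

    def rmse(pred, obs):
        return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

    best = min((rmse(predict(p), observations), p) for p in [2.0, 3.0, 4.0, 5.0])
    print(f"Best RMSE = {best[0]:.2f} at parameter = {best[1]}")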

What is the split sample approach?

Only part of the observational record is used for calibration; the remaining observations are held back and compared against the model predictions.
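
A minimal sketch of the split-sample approach, again with a toy model and made-up data:

    # Calibrate on the first part of the record, validate on the held-back part.
    import math

    obs = [10.0, 14.0, 9.0, 12.0, 11.0, 13.0]
    forcing = [2.5, 3.5, 2.2, 3.0, 2.8, 3.2]

    def rmse(pred, o):
        return math.sqrt(sum((p - x) ** 2 for p, x in zip(pred, o)) / len(o))

    calib_obs, valid_obs = obs[:4], obs[4:]          # split the observations
    calib_frc, valid_frc = forcing[:4], forcing[4:]

    # Calibrate: pick the parameter with the lowest RMSE on the calibration split.
    best_rmse, best_p = min((rmse([p * f for f in calib_frc], calib_obs), p)
                            for p in [2.0, 3.0, 4.0, 5.0])

    # Validate: check the calibrated parameter against the held-back observations.
    valid_rmse = rmse([best_p * f for f in valid_frc], valid_obs)
    print(f"calibrated parameter = {best_p}, validation RMSE = {valid_rmse:.2f}")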

What is validation?

Assessing whether the model's abstraction represents the complex reality in the simplest way that is adequate for the purpose of the modelling.

What is the difference between validation and verification?

Verification asks whether the equations have been correctly converted into computer code; validation asks whether the model formulation itself is fit for purpose.

List the different types of verification/validation.

- Analytical solutions
- Field data
- Comparison with other models
- Benchmarking
- Scale models

Outline limitations to model validation.

- It is often vague and requires subjective judgement
- There may be a lack of data if it was all used in calibration
- Limited available data may fit different models equally well

Outline questions that need to be addressed in experimental design.

- What is the research question?
- What scenarios do I need to set up to answer the question best?

State a consideration that must be taken when using a pre-existing model.

It may need calibrating/validating for the application.

Why is understanding uncertainty important?

People make decisions based on predictions/projections (policy makers, industry). Identifying a worst, best and most 'likely' outcome can be useful.

List the sources of model uncertainty.

- Parameter uncertainty
- Input data uncertainty
- Uncertainty in the forcing conditions
- Conceptual uncertainty

What is parameter uncertainty?

It comes from the model parameters that are inputs to the computer model (mathematical model) but whose exact values are unknown to experimentalists and cannot be controlled in physical experiments, or whose values cannot be exactly inferred by statistical methods.

What are parameter 'ensemble' approaches?

They are part calibration and part uncertainty analysis. A model is run many times with different sets of parameter values. The predicted and observed responses are compared and each set of parameter values is assigned a likelihood of being a good simulator of the system (which may be 0 when it is not a good predictor). All simulations with a likelihood measure significantly greater than zero are retained for consideration.
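
A minimal GLUE-style sketch (simplified and hypothetical; the likelihood measure and acceptance threshold here are illustrative only):

    # Sample parameter sets, score each against observations, and keep
    # only the 'behavioural' runs.
    import random

    random.seed(0)
    obs = [10.0, 14.0, 9.0, 12.0]
    forcing = [2.5, 3.5, 2.2, 3.0]

    def run_model(param):
        return [param * f for f in forcing]

    def likelihood(pred):
        """Informal likelihood measure: inverse of the mean squared error."""
        mse = sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)
        return 1.0 / (mse + 1e-9)

    ensemble = []
    for _ in range(1000):
        param = random.uniform(1.0, 7.0)      # sample a parameter value
        score = likelihood(run_model(param))
        if score > 1.0:                       # retain only behavioural runs
            ensemble.append((param, score))

    params = [p for p, _ in ensemble]
    print(f"{len(ensemble)} behavioural runs, parameter range "
          f"{min(params):.2f}-{max(params):.2f}")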

Give an example of a parameter 'ensemble' approach.

GLUE (Generalised Likelihood Uncertainty Estimation) by Beven and Binley (1992).

What is input data uncertainty?

Errors in the data used to drive the model propagate into the results: a model will unquestioningly process flawed, even nonsensical, input data ('garbage in') and produce flawed, often nonsensical, output ('garbage out').

What is conceptual uncertainty?

It links back to the validity of the model.

How can different conceptual approaches be compared?

Model intercomparison exercises.

Give examples of model intercomparison exercises.

- CMIP (Coupled Model Intercomparison Project)
- PMIP (Paleoclimate Modelling Intercomparison Project)
- EISMINT (European Ice Sheet Modelling INiTiative)