
Chapter 5 Study methods

Chapter summary

There are a number of related guides referred to in the text, which provide further detail on study methods. This chapter:

  • Provides an overview of the study methods used in estuarine management;
  • Describes the techniques used in analysis and modelling, in addition to the application of models (Figure 5.1);
  • Outlines data requirements, resolution and accuracy, model calibration and validation, sensitivity analysis and the usefulness of scenario testing as part of the review on modelling techniques;
  • Discusses errors and uncertainty and their importance in understanding data collection methods and the levels of data accuracy.
Figure 5.1. Flow diagram outlining techniques used to study estuaries

Analysis and modelling

Over recent years, more research effort has been focussed on methods to predict morphological change, both on the coast and in estuaries. As already noted, however, it is widely accepted that no one model or analysis technique adequately represents the behaviour of an estuary system. This has led to the concept of using a suite of techniques to develop an understanding of how the system functions and how it may respond to external influences, or internal changes (Pontee & Townend, 1999; Capobianco et al., 1999; EMPHASYS Consortium, 2000; HR Wallingford et al., 2006).

In order to consider the consequences of a particular change, be it natural or anthropogenic, the potential interactions can be summarised in what has been called a cause-consequence model (Pontee & Townend, 1999). This maps the possible routes from a particular cause of change to the resultant changes in estuary form (see Cause-consequence model). To link the causes of change to the consequences (or system responses), a toolbox of methods is available that describe, or compute, particular responses, and these methods are the focus of this section. This reflects the fact that there is no single integrated model of estuary functional response. The choice of methods invariably depends on the data available, opportunities to collect further data, the available budget and the specific scope of the problem to be studied, as already outlined in the Study approach chapter (see also the EMPHASYS guide - PDF 3.97MB (EMPHASYS Consortium, 2000) and HR Wallingford et al. (2006) for further suggestions on how to evaluate these aspects).

Techniques available

A review of the methods that are available, and that contribute to understanding change in estuaries, suggests four broad categories (EMPHASYS Consortium, 2000):

  1. Data analysis methods
  2. Regime and equilibrium “top-down” methods
  3. Process based, “bottom-up” methods
  4. “Hybrid” techniques that combine “top-down” and “bottom-up” methods

In addition, there is a range of related modelling and analysis techniques that address issues such as water and sediment quality, ecosystem behaviour and socio-economics.

As with any scientific investigation, data analysis forms an important building block in understanding past changes and aspects of behaviour. In general, two approaches have been taken to the prediction of morphological change in estuaries, namely the application of process based models (referred to as “bottom-up” methods) and a variety of techniques based on behavioural system concepts (referred to as “top-down” methods). The latter often define a goal based on some form of equilibrium concept. By combining the top-down target state with process models run at individual time steps, it is possible to set up a goal-seeking iteration to find possible outcomes; this is often referred to as the “hybrid” method. There are then a number of related models and methods, which use the output from the hydrodynamic and morphological models as a basis for making predictions about dependent properties such as water quality, benthic communities, bird/fish populations and so forth.
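
To make the goal-seeking idea concrete, the sketch below (a minimal illustration in Python; the regime relationship A = k·P^n, its coefficients and the relaxation step are placeholders, not any published formulation) iterates a simple process step towards a top-down target state:

```python
# Minimal sketch of a "hybrid" goal-seeking iteration (illustrative only).
# The regime relationship A = k * P**n and the relaxation step are
# placeholders, not any published formulation.

def regime_target_area(tidal_prism, k=1.0e-4, n=0.85):
    """Top-down target: cross-sectional area from a regime relationship."""
    return k * tidal_prism ** n

def process_model_step(area, target):
    """Stand-in for one time step of a bottom-up process model: the channel
    erodes or accretes towards the target, relaxing 10% of the deficit."""
    return area + 0.1 * (target - area)

def hybrid_iteration(area, tidal_prism, tol=1e-3, max_steps=1000):
    """Iterate the process step until the top-down target state is reached."""
    target = regime_target_area(tidal_prism)
    for step in range(max_steps):
        if abs(area - target) < tol:
            return area, step          # equilibrium reached
        area = process_model_step(area, target)
    return area, max_steps             # not converged within max_steps

# Example: response of a 500 m2 channel section to an enlarged tidal prism.
new_area, steps = hybrid_iteration(area=500.0, tidal_prism=5.0e7)
print(f"New equilibrium area ~ {new_area:.1f} m2 after {steps} steps")
```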

The following tables (Tables 5.1 to 5.5) summarise the range of analysis techniques and models available and what they are used for. Further details and some pointers to the relevant literature are given in the Analysis and modelling guide.

Table 5.1 Data analysis methods

  • Accommodation space: Changes in sediment storage capacity over the Holocene time scale (10,000 years).
  • Analytical solutions: Characterisation of the estuary system, or of estuary processes, as manageable stand-alone mathematical equations.
  • Expert geomorphological analysis: Uses many of the above techniques, together with an understanding of how different types of landform evolve, to assess the expected development of the estuary system.
  • Historical trends analysis: Documents changes to the estuary over time from charts, maps and historical archives (e.g. parliamentary records) and identifies any trends. Should include a chronology of human developments (reclamation, dredging, etc).
  • Holocene analysis: Description of the geological development of the basin. Usually includes estimates of sea level change and the identification of periods of marine regression and transgression.
  • Saltmarsh analysis: Relates properties of exposure and tidal range to the presence and distribution of species.
  • Sediment budget analysis: Reconciliation of sediment inputs, outputs and sources/sinks within the estuary.
  • Statistical, spatial and time series analysis: Uses standard data analysis techniques to identify dominant components, trends, cycles and relationships between variables, to give insights into the dynamics and complexity of the system.

Table 5.2 Regime and equilibrium “top-down” methods

  • EstSim Prototype Simulator: Takes a systems-based description of the geomorphological elements present within an estuary and, through a mathematical formalisation of the influences between the morphological and process components, investigates its response to natural and anthropogenic changes.
  • Estuary translation (rollover): Defines the vertical and horizontal movements of the whole system as a consequence of changes in sea level.
  • Form analysis: Uses shape descriptions to characterise the estuary form (e.g. exponential width decay, or power-law width and depth); a fitting sketch is given after this table.
  • Intertidal form analysis: Considers the equilibrium shape of the cross-shore profile.
  • Regime relationships: Relates estuary form properties (such as cross-sectional area, intertidal or subtidal plan area, or volumes to given elevations) to tidal prism, sediment type and erosion threshold.
  • Tidal asymmetry analysis: Examines changes in tidal wave propagation as a function of estuary form.
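
As an illustration of form analysis, the following minimal sketch (Python, with synthetic width data invented for illustration) fits an exponential width decay W(x) = W0·exp(-x/Lw) by log-linear regression:

```python
import numpy as np

# Form analysis sketch: fit an exponential width decay W(x) = W0 * exp(-x/Lw)
# to along-estuary width observations (synthetic data, for illustration only).
x = np.array([0, 5, 10, 15, 20, 25]) * 1000.0          # distance from mouth (m)
width = np.array([4000, 2500, 1500, 950, 600, 380.0])  # channel width (m)

# Log-linear least-squares fit: ln W = ln W0 - x / Lw
slope, intercept = np.polyfit(x, np.log(width), 1)
W0 = np.exp(intercept)          # width at the mouth
Lw = -1.0 / slope               # e-folding (convergence) length

print(f"W0 ~ {W0:.0f} m, convergence length Lw ~ {Lw / 1000:.1f} km")
```
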
Table 5.3 Process based “bottom-up” methods

  • Advection-diffusion models: Calculate the movement and dispersion of a constituent (particulate matter or solute), given an initial concentration field (e.g. dispersion of heat from a power station outfall); a minimal numerical sketch is given after this table.
  • Hydrodynamic modelling: Process based modelling of water levels, discharge, current speed and direction, waves, density currents and secondary circulation patterns.
  • Morphological bed-updating models: Prediction of changes to bed levels based on sediment transport modelling. The bed is updated at regular intervals to provide feedback to the hydrodynamic and sediment transport models.
  • Particle tracking: Prediction of particle movement by seeding particles with given properties (size, density, settling velocity, etc) into the flow and tracking them in a Lagrangian manner.
  • Sediment transport modelling: Process based modelling of bed load and suspended sand and/or mud movement, with relationships to determine the rates of erosion and deposition.
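
For illustration, the sketch below solves a one-dimensional advection-diffusion equation with a simple explicit scheme (upwind advection, central diffusion); the domain size, current speed and dispersion coefficient are invented values, not those of any particular model suite:

```python
import numpy as np

# 1-D advection-diffusion sketch: dC/dt + u*dC/dx = D*d2C/dx2, solved with
# an explicit scheme (upwind advection, central diffusion). All values are
# illustrative.
nx, dx = 200, 50.0                 # 10 km domain in 50 m cells
u, D = 0.5, 5.0                    # current speed (m/s), dispersion (m2/s)
dt = 0.4 * min(dx / u, dx**2 / (2 * D))   # time step inside stability limits

C = np.zeros(nx)
C[50:55] = 1.0                     # initial patch of constituent (e.g. heat)

for _ in range(200):
    adv = -u * (C - np.roll(C, 1)) / dx                         # upwind, u > 0
    dif = D * (np.roll(C, -1) - 2 * C + np.roll(C, 1)) / dx**2  # central
    C = C + dt * (adv + dif)
    C[0] = C[-1] = 0.0             # open boundaries held at zero

print(f"Peak concentration after transport and dispersion: {C.max():.3f}")
```
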
Table 5.4 “Hybrid” techniques that combine “top-down” and “bottom-up” methods

  • Behaviour models: Describe the net behaviour of the system (or some aspect of it) using simplified descriptions, or relationships derived from the use of more detailed process models.
  • Coupled hydraulic and energy relationships: Examine the distribution of bed shear stresses and compare these with erosion thresholds for the types of sediment present.
  • Coupled hydraulic and entropy relationships: As above, but define a target steady state based on the concept of minimum work in the system as a whole.
  • Coupled hydraulic and regime relationships: Given a perturbation to the estuary system, this method uses a target equilibrium, defined by some form of regime relationship, to iterate to a new equilibrium.
  • Uniform sediment flux or sediment balance: Sediment is moved within the estuary until a steady state is achieved, when equal amounts are moved on the flood and ebb tides.

Table 5.5 Related modelling and analysis topics

  • Ecological modelling: Models to describe the interactions between physical, chemical and biological components. Generally these are limited to specific interests (e.g. bird/fish populations, benthic communities, vegetation cover, etc).
  • Sediment quality: Techniques to establish the transport pathways of sediment and the way in which contaminants are adsorbed and released from the sediment.
  • Socio-economic modelling: Techniques to address the pressure-state-impacts-response cycle, usually in terms of some form of economic valuation, as a basis for predicting societal responses and so identifying how the pressures may change in the future.
  • Water quality: Models that represent the advection and dispersion of suspended matter, dissolved oxygen and contaminants.

Clearly, there are significant limits to the predictions that can be made using the methods currently available. Many of the top-down and hybrid techniques do not deal with specific scenarios. Instead, they provide diagnostic tools, which allow an understanding of the estuary's geomorphology to be developed. A degree of interpretation and subjective judgement may then be required to reach conclusions as to how the system will evolve. Drawing the disparate results together then requires a degree of synthesis, making use of a conceptual model as already described, to help summarise the conclusions reached from individual studies. Ongoing research is likely to progressively supplement both the range of tools available and the way in which results are integrated.

Model application

There are a number of issues that arise from the application of the various models available. These are described in some detail in many standard texts (Abbott & Basco, 1989; 1994) and manuals (see, for instance, the EMPHASYS guide - PDF 3.97MB (EMPHASYS Consortium, 2000), the Good modelling practice handbook - PDF 4.15MB (STOWA/RIZA, 1999) and the ERP2 outputs (HR Wallingford et al., 2006; Huthnance et al., 2007; EstSim Consortium, 2007)). A brief summary of some of the main issues is provided in the following sections.

Data requirements

In identifying change, historical data plays a vital role, both for analysis in its own right and for use in calibrating and validating models. Collating the available data at the beginning of a study invariably influences what further data is needed and what analysis and modelling can actually be undertaken. Data quality is also an important issue. For existing data it is essential to establish a description of the data source, how it was obtained and what quality control has been undertaken. Similar information is needed for new data, and this usually forms part of the data collection specification, along with the resolution and accuracy required.

Collecting data in the marine environment is invariably expensive. As a consequence, it is usually necessary to balance the costs of data collection against the quality or detail of the outputs that can be achieved. It is important to relate the significance that is likely to be attached to the results to the quality of output required, and then work back to define the information needed to deliver that quality with an appropriate degree of confidence. Whilst short cuts may be financially attractive, they can have unforeseen consequences, as the cartoon in Figure 5.2 indicates. More information can be found in Data requirements - PDF 180KB (ABPmer, 2007).

Figure 5.2. Cartoon illustrating the importance of accurate field data collection

Resolution and accuracy

Issues of resolution and accuracy need to be considered for both the measured data and the model set-up. Data collection methods and instrumentation, as well as the limitations of fieldwork, will often constrain the resolution of the data and the accuracy with which it can be measured. This is often overlooked, and yet it can have a significant effect on the proper use of the data and the interpretation of model outputs. If models are calibrated against measured data with given accuracy levels, there is little point in reporting model results to a significantly higher degree of accuracy; this simply gives the impression of a precision which does not exist.

The resolution of a predictive model is also critical to the outcome. This is particularly the case for process based “bottom-up” methods, where the way in which space and time are divided up can influence the results. The spatial scale must also be consistent with the features being modelled: to examine the effects of a dredged channel or a land reclamation, it is necessary to have a cell size that adequately resolves the flow. Similarly, if the temporal resolution is inadequate, the model may not be stable, or will not resolve fluxes in sufficient detail to accurately determine the change between different cases.
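
As a simple example of this temporal constraint (a sketch assuming an explicit shallow-water scheme, with illustrative numbers), the Courant condition ties the maximum stable time step to the cell size and the shallow-water wave speed:

```python
import math

# CFL check sketch for an explicit shallow-water model (illustrative values).
# The time step must resolve the fastest signal, here the shallow-water
# wave speed c = sqrt(g * h), so that dt <= dx / c (Courant number <= 1).
g = 9.81          # gravitational acceleration (m/s2)
h = 10.0          # maximum water depth in the model domain (m)
dx = 25.0         # grid cell size chosen to resolve a dredged channel (m)

c = math.sqrt(g * h)          # shallow-water wave celerity (~9.9 m/s)
dt_max = dx / c               # largest stable explicit time step

print(f"Wave speed {c:.1f} m/s -> time step must be <= {dt_max:.2f} s")
```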

For regime and equilibrium “top-down” methods, the issue is often more about recognising what can and cannot be resolved. Questions of accuracy are also more likely to relate to an appreciation of how the particular technique was derived and the degree of statistical certainty associated with it. More information can be found in Data requirements - PDF 180KB (ABPmer, 2007).

Calibration and validation

This is traditionally the first stage in the application of any particular model, once the set-up stage has been completed. Typically, the model is initially run with parameters taken from the literature, available data and the benefit of the user's previous experience. The model outputs are then tested against some measured data, and selected parameters are adjusted until a satisfactory fit is obtained (as outlined in the Data requirements - PDF 180KB (ABPmer, 2007) document).

Subsequently, the input parameters are fixed and the model is run for one or more different cases and validated against some independent measured data. For example, a flow model may be calibrated against water level data for a spring tide and then validated against water level and flow measurements for a neap tide. What is important is that any tuning of the model parameters is only done during the calibration stage, and the validation is then based on information that, as far as possible, is independent of the data used for calibration. This improves the severity of the test and hence adds to the credibility of, and confidence in, the model set-up.
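
A minimal sketch of this two-stage workflow is given below; the "model" is a deliberately trivial stand-in (an amplitude damped by a single friction coefficient) and all values are invented, purely to show calibration against one data set followed by validation against another:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Calibration/validation sketch with a toy "model": predicted water level
# amplitude as a damped function of a friction coefficient (hypothetical,
# purely to illustrate the workflow, not a real hydrodynamic model).
def model_amplitude(offshore_amp, friction):
    return offshore_amp * np.exp(-friction)

spring_measured = 2.45      # measured spring-tide amplitude at a gauge (m)
neap_measured = 1.20        # measured neap-tide amplitude (m), held back
spring_offshore = 3.0       # offshore forcing amplitude, spring tide (m)
neap_offshore = 1.5         # offshore forcing amplitude, neap tide (m)

# 1. Calibrate: tune friction against the spring-tide measurement only.
res = minimize_scalar(
    lambda f: (model_amplitude(spring_offshore, f) - spring_measured) ** 2,
    bounds=(0.0, 2.0), method="bounded")
friction = res.x

# 2. Validate: with friction now fixed, test against the independent neap data.
neap_predicted = model_amplitude(neap_offshore, friction)
print(f"Calibrated friction: {friction:.3f}")
print(f"Neap validation: predicted {neap_predicted:.2f} m vs "
      f"measured {neap_measured:.2f} m")
```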

It is quite common for the results of calibration and validation to be presented as plots of model versus measured data. This provides a useful illustration of how well the model is doing, but gives little information about the quality of the fit or the severity of the test. It is far better also to consider some quantitative measure of the error between measured and modelled data (e.g. RMS values) and the potential error sources (see the discussion in the section on Error and uncertainty - PDF 379KB (ABPmer, 2007)).
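
For example (a minimal sketch with synthetic numbers), RMS error and bias can be computed directly from paired measured and modelled values:

```python
import numpy as np

# Quantitative goodness-of-fit sketch: RMS error and bias between measured
# and modelled water levels (synthetic example values).
measured = np.array([0.12, 0.85, 1.60, 2.10, 1.75, 0.95, 0.20])
modelled = np.array([0.18, 0.80, 1.52, 2.22, 1.70, 1.05, 0.15])

error = modelled - measured
rmse = np.sqrt(np.mean(error**2))   # overall scatter between model and data
bias = np.mean(error)               # systematic over/under-prediction

print(f"RMSE = {rmse:.3f} m, bias = {bias:+.3f} m")
```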

In a formal sense, a hypothesis test can be formulated. This helps to explore whether the model represents real effects or is doing no better than would be expected by chance. For instance, the null hypothesis H0 may be that the model is doing no better than chance. Statistical testing can then examine whether this hypothesis should be accepted or rejected, and consider the probability of rejecting the hypothesis when it is true (a Type 1 error), or accepting it when it is false (a Type 2 error).
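
One way to frame such a test (a sketch, not a procedure prescribed by this guide) is a permutation test: if the model carries no information (H0), randomly re-ordered predictions should fit the measurements about as well as the model itself:

```python
import numpy as np

# Hypothesis-test sketch: is the model doing better than chance?
# H0: the model predictions carry no information about the measurements, so
# shuffling the predictions should fit as well as the model itself.
rng = np.random.default_rng(42)
measured = np.array([0.12, 0.85, 1.60, 2.10, 1.75, 0.95, 0.20])
modelled = np.array([0.18, 0.80, 1.52, 2.22, 1.70, 1.05, 0.15])

def rmse(a, b):
    return np.sqrt(np.mean((a - b) ** 2))

observed = rmse(modelled, measured)
shuffled = [rmse(rng.permutation(modelled), measured) for _ in range(10000)]

# p-value: fraction of random orderings that fit at least as well as the model.
p_value = np.mean(np.array(shuffled) <= observed)
print(f"Model RMSE {observed:.3f} m, p = {p_value:.4f}")
# A small p rejects H0; the chosen threshold trades Type 1 against Type 2 errors.
```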

Sensitivity analysis

Sensitivity tests are aimed at understanding the model response to given model parameters or forcing conditions. The variation introduced depends on the particular variable, but it is common to start with a halving and doubling of the expected value (although order of magnitude changes may be needed for highly non-linear variables). If the model proves to be particularly sensitive to changes in a variable, further tests may then focus on the sensitive regions of the parameter space.
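
A minimal sketch of such a halving/doubling sweep is shown below; the model and parameter (a settling velocity feeding a toy deposition curve) are hypothetical stand-ins:

```python
# Sensitivity-test sketch: halving and doubling a parameter around its
# expected value and recording the response of a toy (hypothetical) model.

def toy_model(settling_velocity):
    """Stand-in model: deposited mass as a function of settling velocity."""
    return 1000.0 * settling_velocity / (settling_velocity + 5.0e-4)

expected = 1.0e-3                      # expected settling velocity (m/s)
for factor in (0.5, 1.0, 2.0):         # halve, keep, double
    w_s = expected * factor
    print(f"w_s = {w_s:.1e} m/s -> deposition = {toy_model(w_s):.1f} kg")
```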

Varying individual model parameters provides an indication of how they affect the model outputs and can be used to assess their relative importance. This is often done as part of the model calibration exercise, but is not always recorded. Doing so can, however, be particularly useful at the interpretation and synthesis phase of any study, as it highlights what is important and how dependent the outputs are on given assumptions.

The other key set of variables is the forcing conditions. Again, sensitivity tests that examine the relative importance of changes in magnitude, frequency and sequencing can reveal behavioural responses that aid interpretation and understanding of the system. For example, sensitivity tests might involve altering the freshwater supply, varying the sequence of storm events, or changing the suspended sediment supply at the head or mouth of the estuary.

Scenario testing

Once a model has been calibrated and validated, it can be used to explore particular changes in conditions. Scenarios can be formulated that look at changes in (i) natural forcing conditions (such as sea level rise); (ii) estuary form (e.g. due to reclamation or dredging); and (iii) historic conditions, in order to reproduce past changes (hindcasts). Given the uncertainty in the natural world and the likely direction of future socio-economic development, it is highly unlikely that any one scenario can be said to be definitive. To address this, studies are increasingly being undertaken that consider a number of different “future worlds” (e.g. the Foresight Flood and Coastal Defence study). This provides a way of testing the robustness of the predictions and accounting for the various future uncertainties (see Future scenario testing - PDF 53KB (ABPmer, 2007)).
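
A minimal sketch of such a multi-scenario run is given below; the scenario values and the intertidal response function are invented for illustration:

```python
# Scenario-testing sketch: running a calibrated toy model under several
# sea-level-rise "future worlds" (values and model purely illustrative).

def intertidal_area(slr_m, initial_area_ha=250.0, loss_ha_per_m=180.0):
    """Stand-in response: intertidal area lost as sea level rises."""
    return max(initial_area_ha - loss_ha_per_m * slr_m, 0.0)

scenarios = {"low": 0.3, "medium": 0.6, "high": 1.0}   # SLR by 2100 (m)
for name, slr in scenarios.items():
    print(f"{name:>6} scenario ({slr:.1f} m SLR): "
          f"{intertidal_area(slr):.0f} ha of intertidal remaining")
```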

Errors and uncertainty

The various studies that can be undertaken, as set out above, all make assumptions. These reflect the boundaries of current understanding or the constraints of the particular model or analysis technique. In addition, both models and measurements will contain errors, which need to be recognised when interpreting the study outputs. Taken together, these comprise the main elements of uncertainty. Whilst the process outlined in the section on Study approach, using a number of different approaches and synthesising the results to maximise understanding, will in general limit the influence of isolated errors or poor assumptions, there remains an uncertainty that should be assessed. There are a number of techniques that can be incorporated in the modelling and synthesis process to help minimise the uncertainty and build confidence in the outputs. These are described in the following sections.

Assumptions

Assumptions are an inherent part of all assessments and modelling exercises: our knowledge of both the natural and the built environment is incomplete. Despite this limitation, the industrial world has developed largely through the application of a range of physical laws and simplifying assumptions, coupled with extensive testing and validation. The models described above are no different. Understanding of the ways in which models can be applied is incomplete, and the degree of knowledge varies with the complexity of the problem. For instance, tide and wave models have been extensively tried and tested. Sediment transport models are more complex, combining more processes and seeking to represent more non-linear interactions; a larger error is therefore likely in the predictions from such models. Nonetheless, these models have now been used extensively and there is a good body of knowledge concerning their use (HR Wallingford et al., 1994; 1996).

The area of most recent endeavour is the long-term prediction of morphological change. One way of overcoming the limits on the ability of such models to provide predictions over very long time periods (greater than a few years) is to develop a probabilistic description of the most likely outcomes.
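
By way of illustration only, a Monte Carlo sketch of this probabilistic approach might sample uncertain forcing and report the probability of an outcome (the distributions and the response function below are hypothetical):

```python
import numpy as np

# Probabilistic-prediction sketch: Monte Carlo sampling of uncertain forcing
# to express long-term morphological outcomes as probabilities (toy model).
rng = np.random.default_rng(1)
n = 10000

# Uncertain inputs (illustrative distributions, not calibrated values):
slr_rate = rng.normal(4.0, 1.5, n)            # sea level rise (mm/yr)
sediment_supply = rng.lognormal(0.0, 0.4, n)  # relative sediment supply (-)

# Toy response: net accretion (mm/yr) = supply-driven accretion minus drowning.
accretion = 3.5 * sediment_supply - slr_rate

p_keeping_pace = np.mean(accretion >= 0.0)
print(f"P(marsh surface keeps pace with sea level) ~ {p_keeping_pace:.2f}")
```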

An alternative approach has been to look for “target” states that the system or feature is trying to reach. This is the top-down approach referred to above, because it seeks to establish a system view of change and equilibrium, rather than examining the internal detail. In due course, it may be possible to define the various possible states for the system and provide a probability that they will actually occur. For now, we must rely on combining the various techniques that are currently available in order to synthesise a consensus of the likely outcome, or range of outcomes.

This is the procedure outlined in the section on Study approach and is the reason why such a wide range of different techniques and models may need to be applied to study the likely impacts of a development. More information can be found in Error and uncertainty - PDF 379KB (ABPmer, 2007).

Errors

Models and measurements both have potential sources of error, and some of these have already been discussed above. Field measurements in the marine environment are notoriously difficult because of its dynamic nature. Nonetheless, one would generally put greater weight on the field measurements, provided they have been collected with suitable quality control and indicate physically realistic values.

A further difficulty is that model output and measurements often have very different spatial and temporal resolutions. This is because a measurement may be taken at a point in space but a model may average over an area or volume. Consequently, differences between model and measured values combine the errors of both, as well as any differences due to spatial and temporal resolution. This issue is discussed more fully in a paper on data and models by Cunge (2003). More information can be found in Error and uncertainty - PDF 379KB (ABPmer, 2007).
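
In practice, this usually means bringing the two onto a common footing, for instance by interpolating the gridded model field to the measurement location before computing errors. The sketch below does this for a synthetic field (the grid, field and gauge position are invented):

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Resolution-mismatch sketch: a gauge measures at a point, while the model
# returns values on a grid, so the model field is interpolated to the gauge
# position before errors are computed. Grid, field and gauge are synthetic.
x = np.linspace(0, 1000, 11)                 # model grid, x-direction (m)
y = np.linspace(0, 500, 6)                   # model grid, y-direction (m)
field = np.add.outer(0.001 * x, 0.002 * y)   # synthetic modelled water levels

interp = RegularGridInterpolator((x, y), field)
model_at_gauge = interp([[430.0, 215.0]])[0]  # gauge sits between grid nodes

print(f"Model value interpolated to the gauge: {model_at_gauge:.3f} m")
```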

Uncertainty

We live in an uncertain world and our knowledge of the world is also limited. Thus when we try to make predictions of how a particular part of the world may behave in the future, we are inevitably faced with a host of uncertainties. Individual models are limited by their inherent assumptions and the ability to prove the models is constrained by the difficulty and expense of obtaining good quality field measurements. This does not mean that the models are bad or of no use. Quite the contrary, they are significantly better than assertion or untested conjecture. The models are built around testable laws and hypotheses. By using them in the manner described they can help to build understanding of specific aspects of the problem, often with known levels of possible error. It is therefore often possible to state the probability of a given outcome and the probability of being wrong. Bringing the results from different models and methods of analysis together allows the understanding to be expanded from one limited perspective to encompass a range of considerations. If this is done around the conceptual model of the system, this should allow confidence in the predictions or forecasts to be progressively developed. More information on the communication of uncertainty can be found in Error and uncertainty - PDF 379KB (ABPmer, 2007).

 
