Property:Describe numerical limitations

From CSDMS

This is a property of type Text.

Showing 100 pages using this property.
A compilation of examples and documentation can be found in the Badlands-doc GitHub repository (http://github.com/badlands-model/Badlands-doc).
A documented numerical stability criterion must be adhered to for the solution to remain stable.
Analytical solution produces unrealistic results with low elastic thickness and/or very large cells due to the Green's function approximation.
Any of the process models could obviously be improved.
Alpha values are artificially limited to between -2 and 2, and the constant values to between 0 and 2, to avoid extreme outliers.
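The clamping described in the entry above can be sketched generically; this is a minimal illustration with hypothetical values, not the model's actual code:

```python
def clamp(value, lo, hi):
    """Restrict value to the closed interval [lo, hi]."""
    return max(lo, min(hi, value))

# Hypothetical values: an extreme fit result for alpha is pulled back into
# [-2, 2], and a negative constant is pulled up into [0, 2].
alpha = clamp(3.7, -2.0, 2.0)      # 2.0
constant = clamp(-0.4, 0.0, 2.0)   # 0.0
```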
As the size of the DEM increases, processing time will increase; an 8000 × 8000 DEM will take several days to process. Documentation is available to help users select appropriate input parameters.
Assumes Bingham rheology.
Assumes constant flexural rigidity.
Assumes mixed-grain sediment compacts linearly.
At present I am trying to improve the wetting and drying algorithm; unphysically large velocities are sometimes produced at the wet/dry front.
Becomes unstable at timesteps much greater than 10 years.
Becomes very slow if run on domains with pits. Options include filling pits before running the model, or using the Landlab DepressionFinderAndRouter.
Can be very slow in complex systems.
Catchment must be small enough that it can be approximated as being flat.
Code is research grade.
Code is research grade.
Concerns about extrapolations beyond the range of data used for model calibration and parameterization.
Convergence failure and instability may occur depending on the stiffness of the problem.
Currently it is not possible to model transgression followed by regression.
Currently only shallow water equations are solved, using explicit finite volume methods. High-order Boussinesq equations to better model dispersive waves (e.g. for short-wavelength submarine-landslide-generated tsunamis) would require implicit time stepping and are still in the experimental phase.
D8 flow codes are used to compute contributing areas. It would be better to use D-Infinity or the Mass-Flux method.
Depends on application/process.
Depends only on computer resources.
Diffusion and other sediment transport routines require short time steps.
Documentation provides a description of the available processes and associated limitations: https://gospl.readthedocs.io/en/latest/tech_guide/index.html
Does not consider the influence of cross-shore sand transport; not intended for short-term storm-induced shoreline change. No wave reflection from structures. No direct provision for changing tide level.
Depending on the size of the DEM this can be quite a slow process, particularly the third step.
Dynamics, chemistry, RT, microphysics, and the slab ocean model are all coupled, thus the model runs slowly.
The equation set is stiff and uses robust but relatively inefficient solvers. Testing with coarse grid resolutions (say, 20 × 20 cells) is recommended before attempting larger/finer grids.
Explicit finite volume routing formulations are time-step limited.
The explicit forward-in-time finite volume method limits the maximum timestep/resolution combination. This can be managed using the adaptive timestep solver that is included.
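The time-step limits of explicit schemes noted in several of these entries follow a generic stability pattern; for 1-D explicit diffusion, for example, the stable step scales as dx²/(2D). A minimal sketch of such a limit (a generic illustration, not any particular model's solver):

```python
def stable_diffusion_dt(dx, diffusivity, safety=0.9):
    """Largest stable explicit timestep for 1-D diffusion,
    dt <= dx**2 / (2 * D), scaled by a safety factor as an
    adaptive timestep solver would."""
    return safety * dx * dx / (2.0 * diffusivity)

# Halving the grid spacing cuts the allowable timestep by a factor of four,
# which is why high-resolution explicit runs are expensive.
dt_coarse = stable_diffusion_dt(dx=10.0, diffusivity=0.01)  # 4500.0
dt_fine = stable_diffusion_dt(dx=5.0, diffusivity=0.01)     # 1125.0
```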
Explicit schemes make high-resolution runs expensive.
The explicit solver can go unstable, and it is not always obvious. Make sure to check the resulting soil depths in the landscape to look for instabilities.
Extreme deformation of the mesh is expected during large deformation in the updated Lagrangian formulation, but is prevented by regridding. The numerical diffusion due to regridding, however, can sometimes make structures (e.g. shear localization) lose their desired sharpness. Regridding itself fails from time to time.
Free surface formulation does not support rapid changes in water elevation like hydraulic jumps in rivers.
If you will be processing very large landscapes then you may need to configure PETSc with the --with-64-bit-indices option.
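PETSc's 64-bit indices are enabled at configure time; a minimal sketch of the configure step (all other options and paths omitted, adjust for your installation):

```shell
# From the PETSc source directory: enable 64-bit integer indices so index
# arrays can address more than ~2^31 entries on very large landscapes.
./configure --with-64-bit-indices
make all
```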
Implicit code. Very stable. Very efficient.
In general, Matlab stores all data in main memory, so the manageable grid size will depend on your available RAM. For conveniently working with grids with ~5000 × 5000 rows and columns, 4 GB of RAM will likely be sufficient.
Limited to medium-size resolutions; typical runs are 256 × 256.
Linear wave theory.
Matlab may max out memory if the drainage basin is too large. Since this is a simplified landscape with wrap-around boundary conditions, one workaround is to do numerous smaller runs and add the output.
Maximum timestep must be determined by trial.
Model Assumptions:
* Mild bottom slope and negligible wave reflection
* Spatially homogeneous offshore wave conditions
* Steady-state waves, currents, and winds
* Linear refraction and shoaling
* Depth-uniform current
* Bottom friction is neglected
Model Limitations:
* TOPMODEL only simulates watershed hydrology, although studies have been conducted to modify it to simulate water quality dynamics.
* TOPMODEL can be applied most accurately to watersheds that do not suffer from excessively long dry periods and have shallow homogeneous soils and moderate topography.
* Model results are sensitive to grid size, and a grid size of <= 50 m is recommended.
Model has difficulty with negative (uphill) slopes.
Model is solved implicitly, but can become inaccurate at very large (~1000 year) timesteps. When baselevel forcing is mild and block effects are significant, slope-inversion instabilities can develop. The model catches these and will not continue running.
Model limitations are related to the use of the Goal Seek function in Excel to find the solution.
Model should never be numerically unstable, but its behavior depends on ratios of various parameters. If the model seems to not be "doing anything", look at the parameter initialization functions in deltaRCM_tools.py.
Model slows down as layers are added, making long runs (>2000 years) impractical.
Model works well for resistant layer dips between 10 and 80 degrees. End members will work, but the domain setup must be altered.
Most Dakota analysis techniques require multiple iterations of a model to explore a requested parameter space, so an experiment created with Dakotathon can take a long time to run and produce a lot of model output.
Most of the heavy-lifting algorithms are implicit, thus numerically stable.
N/A
None identified.
None known; the model requires very little computational expense.
Numerical instabilities occur if the time step is too large.
Numerical limitations and issues:
# Currently the model runs with a constant timestep, which is limited by the maximum inflow. Future versions may include adaptive time-stepping.
# As mentioned above, the model channels tend to be one or two cells wide. Future versions may address this issue with some combination of diffusive regularization or multi-scale modeling.
Overall, the model is very computationally intensive. It is usually run on a grid or a cluster.
Overland flow is currently modeled in a nonstandard way. Diffusive wave and dynamic wave routing routines need more testing. The linkage between the unsaturated zone (infiltration component) and saturated zone (subsurface flow component and water table) is not robust.
Poor scaling for ice-flow models with direct solvers (improves upon use of iterative solvers, but convergence is not systematic).
Presently limited to grids up to 4 GB.