

Development Team

For information on bugs and development projects, consult the ticketing system.

For a complete list of open development tickets, see this report.


AstroBEAR 2.0 Features

Please note that 'Tested' here does not mean rigorous testing. Also, 'Documented' refers to just the most basic information about how to turn on the feature, not what the feature actually does. 'Primary Developer/Porter' refers to whoever either primarily developed the feature or ported it from AstroBEAR 1.0.

||= Feature =||= Designed =||= Implemented =||= Tested =||= Documented =||= Checked In =||= Comments =||= Primary Developer/Porter in AstroBEAR 2.0 =||= Tickets =||
|| '''Elliptic or Parabolic Solvers''' || || || || || || || || ||
|| Self Gravity || X || X || X || X || X || Does not support Cylindrical || Jonathan || #150 ||
|| Thermal Diffusion (iso/ani) || X || X || X || || || Does not support Cylindrical || Shule || #151 ||
|| Implicit Magnetic Resistivity || X || X || || || || Does not support Cylindrical || Shule || #152 ||
|| Explicit Magnetic Resistivity || X || X || || || || Does not support Cylindrical || Shule || #152 ||
|| Viscosity || X || X || || || || Does not support Cylindrical || Shule || #153 ||
|| Ambipolar Diffusion || || || || || || || || ||
|| Radiative Diffusion || || || || || || || || ||
|| Radiative Line Transfer || || || || || || || || ||
|| '''Source Terms/Objects''' || || || || || || || || ||
|| DM Cooling || X || X || X || X || X || || Kris || ||
|| II Cooling || X || X || X || X || X || || Kris || ||
|| Analytic Cooling || X || X || X || X || X || || Kris || ||
|| Modified DM (Non-equilibrium cooling) || || || || || || || || #147 ||
|| Pat Tables (Non-equilibrium cooling) || || || || || || || || #147 ||
|| Cylindrical || X || X || X || || || || Martin || ||
|| Point Gravity || X || X || X || X || || || Jonathan || ||
|| Uniform Gravity || X || X || X || || || || Jonathan || ||
|| '''Hyperbolic Solvers''' || || || || || || || || ||
|| ''Integration Schemes'' || || || || || || || || ||
|| Unsplit CTU w/ CT || X || X || X || X || || || Jonathan || ||
|| Directionally split || || || || || || || || ||
|| Runge-Kutta time integration || || || || || || || || ||
|| ''Spatial-Temporal Reconstruction Schemes'' || || || || || || || || ||
|| PPM || X || X || X || X || X || 3rd order w/ Characteristic Tracing || Jonathan || ||
|| PLM || X || X || X || X || X || 2nd order w/ Characteristic Tracing || Jonathan || ||
|| PCM || X || X || X || X || X || 1st order || Jonathan || ||
|| ''Reconstruction Limiters'' || || || || || || || || ||
|| Van-Leer MinMod || X || X || X || X || || || Jonathan || ||
|| Multi-Dimensional Limiters || X || X || X || X || || || Jonathan || ||
|| ''Riemann Solvers'' || || || || || || || || ||
|| Exact || X || X || X || X || X || Does not support MHD || Jonathan || ||
|| HLL || X || X || X || X || X || Does not support MHD or Isothermal || Jonathan || ||
|| HLLC || X || X || X || X || X || Does not support MHD || Jonathan || ||
|| HLLD || X || X || X || X || X || Does not support Hydro || Jonathan || ||
|| Roe Solver || || || || || || || || #155 ||
|| Multi-D Riemann Solvers || || || || || || || || ||
|| ''Tracer Methods'' || || || || || || || || ||
|| Fully Coupled || X || X || X || X || X || || Jonathan || ||
|| Lagrangian Advection || X || X || X || X || X || || Jonathan || ||
|| ''Equations of State'' || || || || || || || || ||
|| Polytropic || X || X || X || X || X || || Jonathan || ||
|| Isothermal || X || X || X || X || X || || Jonathan || ||
|| SESAME Tables || || || || || || || || ||
|| '''Initial/Boundary Condition Objects''' || || || || || || || || ||
|| Ambients || X || X || X || X || X || || Jonathan || ||
|| Clumps || X || X || X || X || X || || Jonathan || ||
|| Winds || X || X || X || X || || || Jonathan || ||
|| UniformRegions || X || X || X || X || X || || Jonathan || ||
|| SplitRegions || X || X || X || X || X || || Jonathan || ||
|| CollidingFlows || X || X || X || X || X || || Jonathan || ||
|| OutflowObjects || || || || || || || || ||
|| Disks || || || || || || || || ||
|| Jets || || || || || || || || ||
|| Interfaces || X || X || X || X || X || || Jonathan || ||
|| Perturbations || X || X || X || X || || || Jonathan || ||
|| Shapes || X || X || X || X || X || || Jonathan || ||
|| '''AMR Related Features''' || || || || || || || || ||
|| Threading || X || X || X || X || || || Jonathan || #76 ||
|| Scheduling (Pseudo-Threading) || X || X || X || X || || || Jonathan || ||
|| Optional Load Balancing || X || X || X || || || || Jonathan || ||
|| Refinement Variable Factors || X || X || X || X || || || Martin || ||
|| Prolongation Slope Limiters || X || X || X || X || X || || Jonathan || ||
|| '''Miscellaneous''' || || || || || || || || ||
|| SinkParticles || X || X || X || X || X || Not tested with Cylindrical || Jonathan || #154 ||
|| Bear2Fix || X || || || || || || Jonathan || ||
|| Runtime Processing || X || || || || || || Jonathan || ||
|| Adding Tracer Fields || X || X || X || X || || || Jonathan || ||

Here is a list of other projects and improvements for the code:

From Phase III ideas

  • Create a standard interface for module objects (clumps, winds, backgrounds, etc.) with methods such as the following (see the interface sketch after this list):
     CreateObject
     InitObject
     FindObject
     DeleteObject
     ObjectGridInit
     ObjectBeforeStep
     ObjectSource
     ObjectSetErrFlags
    
  • Create ErrFlag modules for refining on Jeans length, cooling length, field gradients, etc., so that any problem module can easily use them (see the Jeans-length sketch after this list).
  • Turn initambient into a background-type object module that can be called from the problem module. That way users can calculate background densities within their problem module, etc. (and we can get rid of modules.data).
  • Allow module control to keep track of the order in which objects are created, or assign each object a creation number, so that overlapping objects are called in a user-specified order. (Winds and colliding flows are both used by the MolecularCloudFormation module, but it requires windbeforestep to be called before collidingflowbeforestep, for example.)
  • Create a web interface where users can create objects, which are then passed to a series of scripts that build a problem module and start the code at low resolution to output the first frame. That frame is then processed by bear2fix to produce an image, so the user can see their problem setup on the fly.
  • Create processing modules that can be called during the run or after the run completes, but that are built into AstroBEAR. This way they can use the same IO routines and can easily be run in parallel on the same system the simulation was run on, avoiding massive data storage and transportation.
  • Preallocate sweep buffers for the entire level, instead of individually for each grid?
  • Reorder sweep directions to operate on contiguous memory blocks (see the loop-ordering sketch after this list).
  • Move all protection routines, pressure calculations, and conversions between conservative and primitive variables to the EOS module (instead of the sweep module and the riemann_solvers module).
  • Clean up the sweep module so that different stages of the update are separated into different subroutines.
  • Add other choices for update schemes.
  • Add a pencil method for directionally split schemes? Or create different stencil sets for different combinations of directional ordering (x-y-z, y-z-x, etc.).
  • Update the limiter method to reduce mbc (Colella and Sekora 2008).
  • Convert variables that are constant for each run into parameters that are set by a configure script at build time (nDim, lMHD, MaintainAuxArrays, iSolver, iEOS, etc.).
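
As a concrete starting point for the object interface, here is a minimal sketch of what it could look like, assuming Fortran 2003 abstract types; all names here are illustrative, not AstroBEAR's actual API. CreateObject, FindObject, and DeleteObject would naturally become module procedures managing a registry of these objects rather than type-bound methods.

{{{
#!fortran
module object_interface
   implicit none

   ! Every object type (clump, wind, background, ...) extends this and
   ! supplies the deferred hooks below.
   type, abstract :: ModuleObject
      integer :: id = 0   ! creation number, so overlapping objects can be
                          ! applied in a well-defined, user-controlled order
   contains
      procedure(obj_hook),  deferred :: InitObject        ! read object data
      procedure(grid_hook), deferred :: ObjectGridInit    ! stamp ICs onto a grid
      procedure(obj_hook),  deferred :: ObjectBeforeStep  ! per-step updates
      procedure(grid_hook), deferred :: ObjectSource      ! apply source terms
      procedure(grid_hook), deferred :: ObjectSetErrFlags ! set refinement flags
   end type ModuleObject

   abstract interface
      subroutine obj_hook(self)
         import :: ModuleObject
         class(ModuleObject), intent(inout) :: self
      end subroutine obj_hook

      subroutine grid_hook(self, q, mx, my, mz, nvars)
         import :: ModuleObject
         class(ModuleObject), intent(inout) :: self
         integer, intent(in) :: mx, my, mz, nvars
         real(8), intent(inout) :: q(mx, my, mz, nvars)
      end subroutine grid_hook
   end interface
end module object_interface
}}}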
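As an example of the ErrFlag idea, a reusable Jeans-length refinement routine might look like the following sketch, which flags any cell where the local Jeans length lambda_J = c_s * sqrt(pi / (G * rho)) spans fewer than a requested number of zones (a Truelove-style criterion). Routine and argument names are hypothetical, not AstroBEAR's actual API.

{{{
#!fortran
subroutine JeansErrFlags(rho, cs, errflag, n, dx, JeansCells)
   implicit none
   integer, intent(in)    :: n
   real(8), intent(in)    :: rho(n), cs(n)   ! density and sound speed (cgs)
   integer, intent(inout) :: errflag(n)      ! set to 1 where refinement is needed
   real(8), intent(in)    :: dx              ! cell width on this level
   real(8), intent(in)    :: JeansCells      ! required cells per Jeans length (e.g. 4)
   real(8), parameter :: G  = 6.67d-8        ! gravitational constant (cgs)
   real(8), parameter :: pi = 3.14159265358979d0
   real(8) :: JeansLength
   integer :: i
   do i = 1, n
      JeansLength = cs(i) * sqrt(pi / (G * rho(i)))
      if (JeansCells * dx > JeansLength) errflag(i) = 1
   end do
end subroutine JeansErrFlags
}}}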
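On the loop-ordering point: Fortran arrays are column-major, so q(i,j,k,v) is contiguous in its first index, and a sweep whose innermost loop runs over i walks memory in order. A hypothetical x-sweep update illustrating the idea (y- and z-sweeps could operate on transposed copies so their inner loops are contiguous as well):

{{{
#!fortran
subroutine XSweepUpdate(q, fx, dtdx, mx, my, mz, nvars)
   implicit none
   integer, intent(in)    :: mx, my, mz, nvars
   real(8), intent(in)    :: dtdx                      ! dt/dx on this level
   real(8), intent(inout) :: q(mx, my, mz, nvars)      ! conserved variables
   real(8), intent(in)    :: fx(mx+1, my, mz, nvars)   ! x-face fluxes
   integer :: i, j, k, v
   do v = 1, nvars
      do k = 1, mz
         do j = 1, my
            do i = 1, mx   ! innermost loop over the contiguous index
               q(i,j,k,v) = q(i,j,k,v) - dtdx * (fx(i+1,j,k,v) - fx(i,j,k,v))
            end do
         end do
      end do
   end do
end subroutine XSweepUpdate
}}}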

Also see

Active enhancement tickets

Old AstroBEAR 1.0 issues

Introduction: The goal of this page is for people to suggest ways to improve AstroBEAR, including known bugs that need fixing.

* To Do List *

iDivB == 2 (Dai and Woodward) needs 3D support: Look in problem.f90:afterfixup; this should be fairly straightforward to implement.

InfoFieldUtils.f90 should be a module?: I believe Kris took a look at this when porting to bluehive?

iScheme==1 only supports the ideal gas EOS or the isothermal EOS: This involves finding inline pressure calculations and replacing them with calls to the pressure function.
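
A minimal sketch of the idea, with hypothetical names (the actual EOS module will differ): every inline (gamma-1)*e style expression gets replaced by a call to a single function that switches on the EOS in use.

{{{
#!fortran
module eos_sketch
   implicit none
   integer, parameter :: EOS_IDEAL = 0, EOS_ISOTHERMAL = 1
   integer :: iEOS  = EOS_IDEAL
   real(8) :: gamma = 5d0/3d0   ! adiabatic index for the ideal gas EOS
   real(8) :: IsoCs = 1d0       ! isothermal sound speed
contains
   ! Pressure from density and internal energy density, for any EOS.
   real(8) function EOS_Pressure(rho, eint) result(p)
      real(8), intent(in) :: rho, eint
      select case (iEOS)
      case (EOS_IDEAL)
         p = (gamma - 1d0) * eint
      case (EOS_ISOTHERMAL)
         p = rho * IsoCs**2
      end select
   end function EOS_Pressure
end module eos_sketch
}}}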

Unifying iScheme==0 and iScheme==1 to use the same sections of code for solvers and eigensystems: They currently use different versions of common solvers, and each has access to a different collection of solvers.

Implementing Self-Gravity in AMR with Hypre: Gravity currently only works in a statically-defined refinement scheme; if grids move, unsightly cracking occurs.

Find a way to store "reverse" fixup fluxes: Currently, during each grid's update, the grid subtracts off the fluxes it will eventually add back in after it receives its children's fixup fluxes. This, however, leaves the values of q in limbo between the grid's "timesteplevel" and its "synchlevels". Algorithms that need access to updated values of q currently rely on allocating and updating a second copy of q (qfix). If, instead of subtracting the fluxes from q, each grid fully updated using all of the fluxes while storing the "reverse" fixup fluxes it would have subtracted, and later compared those with the fixup fluxes it receives from its children, there would be no need for qfix. In principle this would involve fewer calculations and less memory (although the bookkeeping would be a little tricky).
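
A 1D sketch of the proposed bookkeeping, with hypothetical names: the grid updates fully with its own fluxes, stores the coarse flux on each coarse-fine face as the "reverse" fixup flux, and later applies only the difference from its children's time-averaged fluxes, so no second copy of q is needed.

{{{
#!fortran
subroutine ApplyFixup(q, n, dtdx, nfaces, face_cell, face_side, &
                      ReverseFixupFlux, ChildFixupFlux)
   implicit none
   integer, intent(in)    :: n, nfaces
   real(8), intent(inout) :: q(n)                     ! one conserved variable, 1D for clarity
   real(8), intent(in)    :: dtdx                     ! dt/dx on this level
   integer, intent(in)    :: face_cell(nfaces)        ! coarse cell touching each coarse-fine face
   real(8), intent(in)    :: face_side(nfaces)        ! +1 if the face is the cell's left face, -1 if its right
   real(8), intent(in)    :: ReverseFixupFlux(nfaces) ! coarse fluxes stored during the full update
   real(8), intent(in)    :: ChildFixupFlux(nfaces)   ! summed time-averaged fluxes from the children
   integer :: f
   do f = 1, nfaces
      ! The grid already did a full update with its own fluxes, so only the
      ! difference between the fine and coarse fluxes is applied here.
      q(face_cell(f)) = q(face_cell(f)) + &
           face_side(f) * dtdx * (ChildFixupFlux(f) - ReverseFixupFlux(f))
   end do
end subroutine ApplyFixup
}}}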

Stress test i_Protect.f90: Maybe run some very strong rarefactions that should create temperatures/pressures below the min value…


* Known Bugs *

iScheme==0, method(4)==2 does not behave properly with source terms and AMR: Someone will have to explain to me how this algorithm is supposed to work (jjc)

MinTemp not used consistently: There seems to be confusion among the temperature floor used for pressure protections, the cooling cut-off, and the isothermal sound speed.

iProtect (or some mysterious force) causes occasional unnatural explosions: We've all seen 'em, most recently in Sean's single clump cooling runs at very high densities.
