preliminary Mach stem runs
This is a preliminary run for the Mach stem problem with a single stationary clump, shown as a slice of a 3D simulation. The left boundary is reflecting, to simulate two bow shocks interacting from two identical clumps. I am setting up refinement objects around the clump and in a small rectangular region along the left edge where the Mach stem should form. Here are some of the parameters:
Object | Density (particles/cc) | Temperature (K) | Velocity (km/s) | Size (100 AU) |
---|---|---|---|---|
Ambient | 100 | 1000 | 0 | 10 x 20 x 10 |
Wind | 1000 | 1000 | 50 | -- |
Clump | 1e6 | 0.1 | 0 | radius = 1 |
I need to do some more detailed calculations to set the rectangular refinement region more appropriately, but I think this illustrates the basic idea of what I am doing.
UPDATE
I did some calculations and found that the critical angle occurs at angles larger than I thought, which corresponds to regions along the bow shock that are closer to the head. So the above image simulates far too much of the bow shock. To get a Mach stem, the reflecting wall should be much closer, which means we can actually simulate this at a much higher resolution than the previous run. I also took advantage of some symmetry and am only simulating half a clump.
Spectral prolongation
I've written a routine that can take a coarse grid, do a Fourier transform, extrapolate in Fourier space, and transform back to real space, resulting in a spectrally prolongated fine mesh. This could be useful for avoiding excess power in AMR spectra.
The Fourier prolongation in 1D is fairly straightforward. The only tricky part is that every other fine mesh point does not coincide with the coarse mesh, as it would for finite element instead of finite volume data. As a result, each wave needs to be shifted by the offset between the 1st fine cell and the 1st coarse cell (which works out to be 0.5*dx_fine).
Here is the result without any shifting…
To shift each wave by 0.5*dx_fine we have to add a phase of k*(0.5*dx_fine) to each wave. For each element in Fourier space we have to determine the corresponding wave vector k and then multiply the coefficient by exp(i*k*0.5*dx_fine).
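As a concrete illustration, here is a minimal numpy sketch of the 1D procedure. This is hypothetical (the actual routine is Fortran inside AstroBEAR); it assumes periodic data, an even number of cells, and the sign of the shift depends on your FFT and indexing conventions, so flip it if the result lands half a cell off.

import numpy as np

def spectral_prolong_1d(coarse, refine=2, L=1.0):
    # FFT the coarse, cell-centered data (assumed periodic on [0, L])
    n = coarse.size
    nf = refine * n
    dx_fine = L / nf
    c = np.fft.fft(coarse)
    # physical wave vector for each Fourier element
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=L / n)
    # finite-volume staggering: shift every wave by half a fine cell
    # by multiplying its coefficient by exp(i*k*0.5*dx_fine)
    c *= np.exp(-1j * k * 0.5 * dx_fine)
    # zero-pad in Fourier space, then transform back on the fine mesh
    cf = np.zeros(nf, dtype=complex)
    cf[: n // 2] = c[: n // 2]
    cf[-(n // 2):] = c[n // 2:]
    return np.real(np.fft.ifft(cf)) * refine  # rescale for FFT normalization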
I also added the ability to do linear prolongation as well as spectral prolongation of layouts for use with FFTs. Here is a view of what the data looks like going into the FFT: constant interpolation is lower left, then linear (monotone centered) interpolation lower right, parabolic interpolation upper left, and spectral interpolation upper right.
And here are the spectra for those four methods
And here is the side-by-side view of the spectral prolongation and the fixed-grid version.
I also looked at the colliding flow runs. Of course without windowing there ends up being ringing in the spectral prolongation due to the non-periodic nature of the region.
Windowing before doing the spectral prolongation reduces this dramatically
And here is the same window for a later frame
And a comparison between the resulting spectra with and without spectral prolongation
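For reference, the windowing step amounts to tapering the data before the FFT. A hypothetical 1D numpy version (the post doesn't say which window was used; a Hann window is one common choice, and the file name is a placeholder):

import numpy as np

data = np.loadtxt("lineout.dat")            # hypothetical 1D cut through the region
w = np.hanning(data.size)                   # taper to zero at the non-periodic edges
power = np.abs(np.fft.fft(data * w))**2     # windowed power spectrum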
Martin's update, 29 Oct, '12
Poloidal collimation. Running a set of sims to match the morphology of Andrea's latest experiments.
AGN jet truncation. Running the hydro cases with the newest resolution.
Binary global simulations. Started implementation using the golden version of the code.
Disk paper. In Jason's hands now. Aim to submit by Friday.
CRL618. Emails with Bruce. He plans to send us a draft of the paper (or at least its theory section) by week's end.
Will give a talk at RIT this Thursday.
Meeting Update 1029 - Jason and Ivan
- Finished 6 AMR runs. Updated mass loss comparison
I didn't notice that switching to 5 AMR levels still leaves only a single cell for the vertical smoothing, which is not ideal.
The movie reveals something interesting though: Thin disk movie compared to Thick disk movie.
- Initial images of BE sphere in Hydrostatic ambient
Meeting Update 1028
Triggered Star Formation: Implemented cooling function described in Neufeld & Kaufman 1993:
http://articles.adsabs.harvard.edu//full/1993ApJ...418..263N/0000265.000.html
Tried to rerun the problem on bluegene, but the job always terminates with the following message:
RestoreRelationships::StrictFindBackupNode() error: node not associated.
StrictFindBackupNode(box%MPI_id = 596) failed on box [ 49 27 11 96 28 12]
StrictFindBackupNode(box%MPI_id = 550) failed on box [ 49 11 13 96 12 14]
StrictFindBackupNode(box%MPI_id = 578) failed on box [ 49 29 9 96 30 10]
Clump Paper: Revised the paper based on the referee's comments and wrote up the response to the referee:
Revised paper:
http://www.pas.rochester.edu/~shuleli/1028/paper_shule_0823.pdf
Response to referee:
http://www.pas.rochester.edu/~shuleli/1028/referee_response.pdf
Also wrote up a referee response for the HEDLA proceedings.
Resistive Paper: Wrote up the "Method" and "Simulation Setup" sections.
Meeting Update 10/29/2012 - Eddie
- old tickets: #82, #121
- finally got relative error plots for primitive variables of the radiative shock models ehansen10232012
- started setting up refinement criteria for the Mach stem problem; will have images up later this week
Meeting Update 10/29/2012 - Baowei
- Golden Version & Blue Streak
- Tested with testing modules on blue streak; bear2fix can extract BE/LE data automatically (Ticket #262). We no longer need the lDataFromBlueGene flag or the convertfrombluegene functions.
- Still working on running all tests on blue streak
- Need a big project to run on blue streak
- Tickets
- New: #263 (implementing cylindrical coordinates in the reconstruction step)
- Teragrid
- Added Erica to the allocation
Meeting Update
I worked on writing scripts and making figures last week for the BE paper. Here is the page I put together of the results:
https://clover.pas.rochester.edu/trac/astrobear/wiki/u/erica/BERunsFigures
Added a page on adding tracers to objects, linked from https://clover.pas.rochester.edu/trac/astrobear/wiki/CodeExplanation.
This week I have a research project and 30 minute talk to prepare for the astro course I am taking. I will be busy for the next few days working on that.
Next up on my plate: getting the data for the sub-critical case and making a similar figures page for that run, compiling references for the BE paper, writing the BE paper, reading Toro, writing my code, and working with Jonathan on the grid-generating algorithm.
Comparing Shock Models
I regenerated the images for comparing my shock models (dashed) with Anna's (solid). I am comparing density (blue), temperature (red), and ionization fraction (black). I also made plots showing the percent relative error. The error does not look that good even though the plots of the primitive variables look okay. It just goes to show how deceiving plots can be, which is why quantifying differences and errors is so important.
Primitive Variables | Percent Relative Error |
---|---|
UPDATE 1 (10/23 1:00 PM)
I discovered something that might be screwing up the relative error plots. Anna's data does not have a fixed step size while mine does. I set it up so that we have the same number of data points, but Anna's data points are more concentrated where gradients are higher, whereas my simulation is on a fixed grid. Therefore, VisIt could be computing the relative error based on point number rather than the actual distance behind the shock. Perhaps if I use gnuplot it will do some type of interpolation to get a more accurate error plot.
UPDATE 2 (10/23 9:00 PM)
As I previously mentioned, VisIt could not correctly interpolate the data to produce accurate relative error plots. gnuplot can interpolate data, but only as a plotting option; I couldn't figure out how to do operations on the interpolated curves. So I did something different: I fitted polynomials to the data points and then found the relative errors between the polynomial curves. To achieve better fits, I used 10th-order polynomials. I have only done this for vs = 30 km/s, but I will do the rest tomorrow. I will also post images of the fits vs. the data so that you're convinced that the fits are okay.
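In numpy terms, the stopgap fit amounted to something like this (the file name is hypothetical):

import numpy as np

x, y = np.loadtxt("anna_v30_temperature.dat", unpack=True)  # hypothetical data file
coeffs = np.polyfit(x, y, 10)     # 10th-order least-squares polynomial fit
y_fit = np.polyval(coeffs, x)     # evaluate the fit on the same grid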
UPDATE 3 (10/25 11:00 AM)
The above method worked pretty well for vs = 30 km/s. The fits get a little oscillatory for temperature and ionization fraction, but they are not too bad. However, the fits for the higher shock velocities did get pretty bad (especially for temperature), so I will have to come up with something else. I think my only option at this point is to write my own little Fortran routine to interpolate the data and calculate the errors for me.
UPDATE 4 (10/25 3:30 PM)
Finally got accurate relative error plots. I wrote a little Fortran program that resamples/interpolates the data, then calculates relative errors and outputs a curve file to easily create the plots in VisIt. I also redid the primitive variable plots using gnuplot instead of VisIt, so that I could have better control of the tick marks.
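The actual program is Fortran, but the resampling idea is simple; here is a hypothetical Python equivalent (file names are placeholders):

import numpy as np

xa, ya = np.loadtxt("anna_v30.dat", unpack=True)   # adaptive step size
xe, ye = np.loadtxt("mine_v30.dat", unpack=True)   # fixed step size

# interpolate Anna's data onto the fixed grid so the error compares values
# at the same distance behind the shock, not at the same point index
ya_i = np.interp(xe, xa, ya)
rel_err = 100.0 * np.abs(ye - ya_i) / np.abs(ya_i)

# dump a VisIt-readable .curve file
with open("rel_err_v30.curve", "w") as f:
    f.write("# percent_relative_error\n")
    for x, e in zip(xe, rel_err):
        f.write("%e %e\n" % (x, e))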
Meeting Update 10/22/2012 - Baowei
- Golden Version Status
- New revision 1144:3bc7b231aa1a with memory leak fix in main repo.
- Set up a cron job that will notify about new revisions.
- Unstable Endian issues with bear2fix when running testing suites on blue streak (#262)
- Tickets:
- New: #261(Download page for Golden Version AstroBEAR), #262 (Big/Little Endian on blue streak)
- Attended Matlab workshop
- Added Eddie to Teragrid Allocation
Meeting Update 10/22/2012 - Eddie
- learned about geometrical source terms, primarily 2.5D (ehansen10202012)
- learned how to use valgrind and wrote a script to automate testing for memory leaks. This can become part of the test suite in the near future. (ehansen10182012)
- updated the 3DCoolingJets page with some new images/movies. I still need to play around with the volume renderings. The strong cooling case still won't run in 3D, even with the memory leaks patched. Getting this run to work is not crucial, so I will put this on the back burner for now, especially since Shule wants to use the Kraken token this week. I emailed Francisco about getting the parameters to set up a run to simulate his lab experiment.
- found a typo in my H-alpha emission routine. Fixing it helped a little bit, but my data still does not match Anna's data. I am waiting to get a copy of the shock code that she is using so that I can figure out the source of our discrepancies. The following image is from before the typo fix, but this is essentially where I'm at now. v30_p2_f0.01.png
- started setting up a physically realistic run for the Mach Stem project. I still need to play around with the parameters and refinement, but I should have images later this week.
Meeting Update 1022 - Jason and Ivan
We are preparing for production runs:
- Tests runs can be found here
- Probably need to rerun some sims using tracer fields, as suggested by Jason and Adam Frank. Currently, a VisIt expression based on position, x-z velocity, and density is used to measure the mass of the disk.
- Disk tracer test here
- We are currently looking at how many AMR levels we need to have accurate results.
- Waiting for the first 32x32x32x6 sim. It was submitted Friday morning and was expected to run Saturday night, but it keeps getting delayed.
Meeting update
This past week, I have been working on:
- Writing a script to radially average quantities in VisIt. It seems VisIt subcycles (see a previous post with images), so it made sense to write a Python script for VisIt to do the radial averaging, as opposed to doing it in astrobear, which doesn't currently subcycle. I wrote this script (commented on in a previous post) and am attaching it here for anyone to use. I made some preliminary plots just to check whether we are happy with the simulation output, now that we can see what it looks like radially averaged:
- Read Ch.2 Toro
- Looking into debugging mesh asymmetries with Jonathan. Will write up what I have learned so far in the next day or so.
- Played with tracers; I would like to ask someone for help with this, as I am not sure I am setting them properly.
Script for averaging over successive shells in visit
Here is a script I wrote in Python for averaging over successive shells in VisIt, writing the results to a file, and then plotting that file as a .curve in VisIt:
What this script does is the following:
- Begins with a small sphere (or, in my simulations, an octant of a sphere) of radius ~2*dx, where dx is the length of the smallest cell. The sphere is averaged over the quantity of interest (here rho), and this value, along with the radius, is printed to a file.
- The operators are removed and then added again in a for-loop that adjusts the radius of successive shells to clip and then average. inverse = 0 means clip (in other words, get rid of) data within the clip object, whereas inverse = 1 clips data outside of the clip object.
- The for-loop iterates over 48 steps, increasing the averaging radius by ~dx each time, and ends at about 1.5 Rbe.
- Average-value data written to a .curve file is then plotted and visualized in the VisIt window. A stripped-down sketch of the loop follows below.
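This sketch uses VisIt's Python CLI; the database name, variable, and dx are placeholders for whatever your run uses, and it averages everything inside each clipped sphere rather than true shells:

# run with: visit -cli -s shell_average.py
OpenDatabase("chombo_00010.hdf")        # placeholder file name
AddPlot("Pseudocolor", "rho")

dx = 1.0 / 512                          # smallest cell length (placeholder)
out = open("shell_avg.curve", "w")
out.write("# rho_shell_avg\n")

for i in range(2, 50):                  # radii from ~2*dx outward
    AddOperator("Clip")
    c = ClipAttributes()
    c.funcs = c.Sphere                  # clip with a sphere of radius i*dx
    c.center = (0.0, 0.0, 0.0)
    c.radius = i * dx
    c.sphereInverse = 1                 # inverse = 1: keep data inside the sphere
    SetOperatorOptions(c)
    DrawPlots()                         # plots must be drawn before querying
    Query("Average Value")
    out.write("%e %e\n" % (i * dx, GetQueryOutputValue()))
    RemoveAllOperators()

out.close()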
Things I learned during this process:
You can record the steps you take in making a plot in the GUI, and the GUI can then translate these steps into Python. I did this for some of the more complicated plot attributes.
The plot needs to be drawn before VisIt can query; when you add operators, you must also draw the plots in order to query.
Python is pretty cool.
A Brief Lesson in 2.5D
A couple of weeks ago, I learned how to run a 2.5D (axisymmetric cylindrical coordinates) simulation in astrobear. More recently, I learned about geometric source terms in general, and how astrobear actually accomplishes its transformation to 2.5D. This blog post is a brief summary of what I learned and is intended as an introduction for the naive user…
The 2.5D Source Term for Mass Conservation
To illustrate the concept of geometrical source terms, it suffices to look at mass conservation in 2D and 2.5D. As you already know, the mass conservation equation can be written as:

$$\frac{\partial \rho}{\partial t} + \nabla \cdot (\rho \mathbf{v}) = 0 \qquad (1)$$

First, we move the flux term to the other side:

$$\frac{\partial \rho}{\partial t} = -\nabla \cdot (\rho \mathbf{v})$$

Now, we can expand the right hand side using 2D Cartesian coordinates:

$$\frac{\partial \rho}{\partial t} = -\left[ \frac{\partial (\rho v_x)}{\partial x} + \frac{\partial (\rho v_y)}{\partial y} \right] \qquad (2)$$

Now, we can expand eq. (1) again, but this time in axisymmetric cylindrical coordinates. Axisymmetry just means that there is no dependence on $\phi$, so those terms from the del operator go to 0. So eq. (1) becomes:

$$\frac{\partial \rho}{\partial t} = -\frac{1}{r}\frac{\partial (r \rho v_r)}{\partial r} - \frac{\partial (\rho v_z)}{\partial z}$$

This can be simplified further with a product rule:

$$\frac{\partial \rho}{\partial t} = -\left[ \frac{\partial (\rho v_r)}{\partial r} + \frac{\partial (\rho v_z)}{\partial z} \right] - \frac{\rho v_r}{r} \qquad (3)$$

If you compare equations (2) and (3), the meaning of a geometrical source term is realized. The flux terms are essentially the same (just in different directions), but switching to cylindrical coordinates introduces an extra source term, $-\rho v_r / r$. Similar derivations can be done for the other conservation equations to get more geometrical source terms. Those get slightly more complicated when you allow for rotation (velocity in the $\phi$ direction) and when you have magnetic fields. You could also derive geometrical source terms for other coordinate systems.
How AstroBEAR Handles 2.5D
Switching on 2.5D is quite simple; you just set iCylindrical in physics.data: 0 is off, 1 is without rotation (i.e., no angular momentum), and 2 is with rotation. When iCylindrical is 1 or 2, astrobear treats the x-axis as the r-axis and the y-axis as the cylindrical z-axis. Looking back at equations (2) and (3), you can see why mapping x -> r and y -> z makes a lot of sense. This information is also explained in PhysicsDataExplained.
Now, when astrobear is actually solving the hydro (or MHD) equations, it uses a Runge-Kutta method. Those methods can get somewhat involved, but in our case they are basically used to solve equations of this form:

$$\frac{\partial q}{\partial t} = -\nabla \cdot \mathbf{F}(q) + S(q)$$
where change is a time derivative, fluxes are spatial derivatives, and sources are everything else. These methods do not care if the flux terms are in the x direction or r direction; they are just generic flux terms. So the only thing that astrobear has to do differently for 2.5D is add the geometrical source terms to the dqdt-array (the array that represents the time derivative of q). This step is done in the function SrcDerivs in source_control.f90, and the actual cylindrical source terms are defined in cylindrical.f90. If you look at cylindrical.f90 you will see all of the source terms for all of the variables in the dqdt-array (including the one I derived above). Note that source terms also have to be added to any tracers that might be present, and tracers are handled just like density.
Cylindrical Coordinates in Athena
This is the paper that Jonathan was referring to. It is about implementing cylindrical coordinates in the Athena code. The paper mentions the geometrical source terms that I discussed, but includes MHD and rotation (also, in primitive form rather than conservative). The bulk of the paper is focused on how to do the reconstruction in cylindrical coordinates without changing the Riemann solvers. This definitely looks like something we could do for astrobear. http://adsabs.harvard.edu/abs/2011ASPC..444..260S
valgrind script
I made a fancy bash script to run valgrind. Originally, I was just making this so that I could easily compile and run valgrind on any problem module since I am now the "valgrind czar". However, I have since added a lot of functionality so that this could perhaps become part of the test suite.
Basically, the script will compile astrobear, run valgrind, and then check the valgrind output for memory leaks specific to astrobear. The script will print the tracebacks for all the memory leaks that it finds to a file named astrobear_leaks.out. If that file contains leak information, a message will tell the user to check that file. If no leaks are found, that file should be empty, and a message will tell the user that no leaks were found. I implemented many checks and error messages to make the script fairly foolproof. There are also a handful of options the user can specify…
Usage: ./valgrind_leaks [OPTIONS] -m module
specified module must be in the modules directory
Options: -d code_dir
specify code directory (default is present working directory)
-sc --skipclean
skip clean, but still recompiles (use only if you have previously compiled with proper FFLAGS for valgrind, and you have not made changes to Makefile.inc or makefile)
-sm --skipmake
skip compiling completely if you just want to rerun
-sr --skiprun
skip running if you already ran valgrind and just want to check output
-np
choose number of processors to run on (default is 1)
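For example, ./valgrind_leaks -d ~/astrobear_GVpp -np 2 -m SomeModule (module name hypothetical) would compile that copy of the code, run valgrind on 2 processors, and write any leak tracebacks to astrobear_leaks.out.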
The script is in /home/ehansen/astrobear_GVpp if anyone is curious. If anyone has any questions or comments, just post a comment to this blog post or email me.
update on cooling jets
It was much easier to visualize the strong cooling simulations on kraken. For future reference, anyone who wants to run VisIt in parallel on kraken should see this web page. Also, in order for VisIt to read chombos in parallel, you have to do this first:
module swap hdf5 hdf5-parallel
So the new images/movies are from the strong cooling case (2D and 2.5D) and can be viewed at 3DCoolingJets. Still working on the 3D version…it appears to be running farther than before, but I have to dig deeper to see if I can find any more memory leaks. At least I am getting good valgrind practice.
Martin's update, 22 Oct, '12
- Disk formation in binaries. The shape of the stagnation point. Images from the 15AU case. CLICK TO ENLARGE.
My conclusion from these plots is that the stagnation occurs at a surface with a complex shape, not at a point. Let's call this the stagnation region. An improved version of these images has been added to the paper.
- Poloidal collimation of isotropic winds. Phone call with Andrea. He has new unpublished experimental data at magnetic field strengths never explored before. He has simulations of the experiments which agree quite well. He wants me to find astro parameters which will match the morphology and shock structure they see, and to connect the experiments and their sims with our astro sims and thus with astro observations.
Meeting Update
- Found memory leak using valgrind
- Looked at Colliding Flow Spectra https://clover.pas.rochester.edu/trac/astrobear/wiki/CollidingFlows#Spectra
- AstroBEAR II Paper accepted to JComp Phys
- Started what could become a user guide: https://clover.pas.rochester.edu/trac/astrobear/wiki/ScramblerIntro2?version=2
Meeting Update 10/15/2012 - Baowei
- Golden Version
- checked in revision 1125:2c3e80f15c86, which passed all tests on local machines and bluehive. All modules run on blue streak after hypre was installed (#245), and several problems in makefile and Makefile.inc for bgq were fixed. Working on post-processing issues, which will not affect running.
- folks can pull from the main repo to keep your astrobear updated. If you fix something, please make sure you run buildproblem on local machines before checking in.
- testing results can be viewed with USER_MACHINE. For example, "CurrentTests(bliu_grass, width=250px)" wrapped in double square brackets will show the current testing results I run on grass.
- Testing of the main repo will run weekly from now on.
- Tickets
- Trac updates
- Installed Collaps and BackLinksMenu macros
- Tried a wikipage-to-pdf plugin, but it failed
Meeting Update 1015 - Jason and Ivan
- Migrated HSE to 3D (post here).
- Inserted accretion disk back in the model (post here)
- I ran the disk simulation up to t=1, the movies in the linked blog post go up to t=0.7. I haven't noticed any significant events.
- I am a little bit concerned about the contrast in temperature at t=0 between disk and ambient pic
- Updated/created a few wiki pages
- User's Manual? Good idea or not?
Meeting
I have the 5 runs now:
- BP
- Light
- Matched
- 3 in between Light and Matched, log spaced
I am sorry not to have movies posted by tomorrow, but I am having issues with VisIt over my remote connection. I will make them tomorrow after the meeting.
Meeting Update 10/15/2012 - Eddie
Cooling Jets
My priority over the last week has been the cooling jet sims, and I am close to being done with all the runs that I had set out to do. Jonathan found a memory leak that was causing the strong cooling runs to fail. With this patched, I was able to run the 2D and 2.5D versions of the strong cooling case. The 3D version is still having some issues, but we might be able to get this one working too. All images and movies can be found at 3DCoolingJets. The volume renderings still need some work.
Cooling Diagnostics
I made cooling length and cooling time fields in fields.f90 in order to calculate the cooling strength chi (cooling length / jet radius). I had previously calculated chi by running several 1D radiative shock simulations which allowed me to come up with an analytic formula for cooling length in terms of shock velocity and ambient density (see ehansen06122012). With the new cooling diagnostics, I can see more quantitatively the effect of cooling near the jet head (see ehansen10122012). I also confirmed, at least for the weak cooling case, that my analytic method for chi is consistent with the cooling diagnostics method.
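Schematically, the on-the-fly diagnostic is just the following (a hedged sketch with a placeholder power-law cooling curve, not AstroBEAR's actual tables or the exact formula in fields.f90):

import numpy as np

kB = 1.38e-16                                 # Boltzmann constant (cgs)

def Lambda(T):
    return 1.0e-22 * (T / 1.0e4) ** 0.5       # placeholder cooling curve (erg cm^3/s)

def chi(n, T, v, r_jet):
    t_cool = 1.5 * kB * T / (n * Lambda(T))   # thermal energy / volumetric cooling rate
    L_cool = v * t_cool                       # cooling length behind the shock
    return L_cool / r_jet                     # cooling strength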
Comparing Radiative Shock Models
I have been working on comparing my data with Anna's data (Pat's student). The comparison of emission data did not look promising, so I took a step back and compared the hydrodynamic quantities density, temperature, and ionization fraction. I compared 3 different runs which vary only in shock velocity. They all have a preshock density of 100 cm^-3 and a preshock ionization fraction of 0.01.
Configuration Script
I worked on writing a configuration script for astrobear. I made one that works as a bash script, but it is somewhat limited. It worked on grass, and it allowed me to compile and run astrobear without having any modules loaded or a Makefile.inc. However, we probably do not want to use this since it is specific to how our library paths (HDF5, HYPRE, etc.) are named. The best configuration script would be an AutoConf script. I am reading up on this and even started writing one, but this will take some time because it is like learning another language. Depending on when we need this to be done, it might make more sense to pass this project to someone else?
Mach Stems
I did not get around to doing much for this project, but I did notice that Kris has a nice project page that I will be referencing. ClumpClump/MachStems
Meeting Update 1007
Triggered Star Formation: I've been running simulations according to Boss' setup:
http://www.pas.rochester.edu/~shuleli/triggered/bosspaper2.pdf
I'll post three new movies Monday morning.
Qual: I'm currently making the ppt, 15 pages done (total ~45 pages?).
Papers: Done with the revision of the HEDLA proceedings.
http://www.pas.rochester.edu/~shuleli/triggered/shuleconf2012_AF.pdf
Got the referee's response for the ApJ paper, which is pretty positive. The following points require some thought (the remaining points are trivial to address):
2) On page 7 they state that they have used the Dalgarno and McCray (1972) cooling function. This is a bit old: it would have been more realistic to use Wolfire et al. (1994) to get a cooling function. They could also have included heating to get an equilibrium temperature (as in Van Loo et al. MNRAS, 406, 1260, 2010), rather than just switching off the cooling at 50K.
3) The choice of parameters for the cloud on page 8 seems somewhat eccentric. A density of 100 and a temperature of 100 is fine since this is roughly what Wolfire et al. would predict for thermal equilibrium, but the size seems very small. The translucent clumps in molecular clouds have these properties, but they are much bigger (~1 pc instead of 150 a.u.). Objects this small are only interesting if they have much higher densities.
The cloud size only affects the ratio of the cooling time to the other timescales. A more realistic simulation would have a very short cooling time: so short that one might as well assume that the gas is in thermal equilibrium.
Martin's update, 15 Oct, '12
AGN jet truncation
- The X-ray conference in Italy went well. We may have gotten another astrobear user, a PhD student from Bologna who is simulating disks using Zeus-2D; a lady from Grenoble, France, was also very interested in our models of AGN jets, so there is a potential collaboration.
- Will resume running these simulations this week.
Disk paper. Almost done with my corrections. Extending the disk material orbits section. Will give the final version to Adam early this week. I believe it is almost ready for submission.
CRL 618. Exchanging emails with Bruce about previous sims of the object. He's working on the paper and will send it to us in a few weeks' time.
Flipping Disks. Plan to set up the 1st test of the flipping disks interacting with stellar winds.
Collimation of isotropic winds. In correspondence with Andrea Ciardi about his Nature publication.
Teragrid. Getting the tokens to access kraken.
Will give a talk on magnetic towers and disk at the CIRC symposium this Friday.
Also preparing job applications.
Disk in Hydrostatic Ambient
I've put the disk back in the ambient, here are some very early results.
There isn't any overly exciting event with this configuration; maybe I should run it for a little bit more time.
Anyway, I've got two movies that we can discuss in the next meeting
density movie, temperature movie.
There is some evolution going on with the disk; however, the ambient does not undergo big changes. You can see this better when you compare t=0 to t_final=0.7 (~4 h).
plotting local chi ("cooling strength")
I was a bit skeptical that calculating cooling lengths from 1D radiative shock simulations would directly translate to these 3D jet simulations, so I implemented a field variable to calculate the cooling length on the fly for each cell. Doing it this way results in chi's that are very different. Here is a pseudocolor plot of chi from what I have been calling weak cooling.
I had estimated a chi of approximately 0.4 using my 1D radiative shock simulations, but as you can see here chi can locally be much different. However, you sort of have to take the ambient with a grain of salt. You get extremely low chi's in those regions because nothing is moving, and the material has not been shocked yet. Here is a lineout of chi along the center of the jet.
You can see that the ambient is very low, and then chi does some interesting things as it goes through the bow shock, and then it becomes constant once you enter the jet beam. For most of the jet chi > 1. Here is a zoom in of the chi lineout at the head of the jet.
In this specific region chi does fall between 0.1 and 1, so perhaps a chi of 0.4 is a good approximation after all.
UPDATE
The above plots are not the ones originally posted; they are more accurate now.
Also, since plotting the cooling length of the ambient is kind of strange, I decided to plot cooling time as well. Now the ambient looks exactly the same as the jet beam…
10/11/12
I was very excited to get the new data and begin analysing it. First, I wanted to check out the mesh of the octant simulation. This is what I found:
From this 3D plot, we see that the mesh is squarish in the y-z plane, but roundish on the other 2 faces of the cube. Depending on which face I choose to make plots of, the mesh will be slightly different (and the way the fluid flows will be slightly different as well):
A pseudo-color plot of the density can be found here:
As you can see from this movie, the ambient is very poorly resolved. I wonder about the consequences of this. The fluid variables for each cell are calculated by considering gradients along cell-cell interfaces. If the cell is coarse, as in the ambient, then one value of density is assigned to that coarse cell, whereas if that one cell were split into many little cells, there might be more structure than is seen at lower resolution. Now, the structure may be missing here, but is there a significant loss of information? The collapse seems to be (largely) spherical, so the loss of information would be in the radial direction. Is this right? Is it like the density is being "averaged", so to speak, over a larger volume, i.e. the volume of the coarse cell?
I tried to get more refinement in the ambient, so I removed the de-refinement object from my problem module (last post). I expected refinement based on (a) Jeans length in the ambient and (b) the density gradient in the sphere. The grids were initialized as expected. However, later in the simulation, BlueHive killed the job, reporting a seg fault. A look at the last frame revealed massive refinement throughout the grid:
The difference in the refinement calls between this and yesterday's post is that in this most current set-up, I had astrobear refine on Jeans length.
10/10's directory is /alfalfadata/erica/DEBUG/Matched_From_Bluehive 10/11's is /grassdata/erica/from_bluehive/MatchedMeshMadness
I'd like to figure out what the refinement is being triggered by.
10/10/2012
The new set of simulations for the matched case has been a bit difficult to get off the ground. First, I was unable to get frame 0, due to refinement to the highest level being triggered in the ambient, ostensibly by the default gradient-in-phi refinement criterion. By adding clearAllRefinements to the grid, I was able to run the sim.
This past week I learned why the simulation was then getting to frame 13 and slowing down A LOT: a much larger filling fraction of the AMR mesh was being triggered by density gradients forming in the ambient medium:
Now, this triggered refinement may be due to boundary effects, in which case the gradients are extraneous information that isn't important. Or, they may be physically meaningful, and something we may want to actually resolve. If that is the case, then I will have to see about moving to many more processors/more memory to carry the calculation further. For now, the fast way to get data is to de-refine outside of the BE radius.
What I'd like to accomplish with my refinement:
- de-refining outside of the sphere on everything
- refining inside of the sphere on 2 things: Jeans length and density gradients
-The code I used was:
CALL AddRefinementCriterion(Mass_Field)
CALL AddRefinementThreshold(JeansLength_Field, LESSTHAN, (/(4d0*levels(i)%dx, i=0, MaxLevel)/))
CALL CreateRefinement(Refinement)
CALL CreateShape(Refinement%Shape)
Refinement%Shape%size_param = (/1d0, 1d0, 1d0/)
CALL SetShapeBounds(Refinement%Shape)
Refinement%BufferCells = 4
Refinement%tolerance = -1d0
I noticed 2 things of interest:
1) When I do not add the de-refine lines (from the 3rd call, on), the entire ambient was being refined to level 1, whereas the max level was being reached only inside of the sphere:
This seems to make sense if the refinement is being triggered by the Jeans criterion, which I am asking the code to use with the lines:
CALL AddRefinementCriterion(Mass_Field)
CALL AddRefinementThreshold(JeansLength_Field, LESSTHAN, (/(4d0*levels(i)%dx, i=0, MaxLevel)/))
A quick calculation shows that, yes, the Jeans length is < 4*dx in the ambient, so this makes sense:
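The check is a one-liner (placeholder ambient values; plug in your own density, temperature, mean molecular weight, and dx):

import numpy as np

G, kB, mH = 6.67e-8, 1.38e-16, 1.67e-24   # cgs constants
n, T, mu = 1.0e2, 1.0e2, 1.0              # placeholder ambient values
rho = n * mu * mH
cs = np.sqrt(kB * T / (mu * mH))          # isothermal sound speed
lambda_J = np.sqrt(np.pi * cs**2 / (G * rho))   # Jeans length
print(lambda_J, "cm; compare to 4*dx")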
2) When I add the de-refine outside of the BE sphere, the ambient is coarse, and only the sphere is refined maximally:
I am happy with this refinement for now and am running in the AFRANK queue. The higher levels of refinement inside of the BE sphere should be restricted to the Jeans criterion by adjusting qtolerance to have low tolerance for levels 0-5 and high tolerance for levels 6-7.
Questions
- How do cells/jeans work with the refinement subroutine?
- How does refVarFactor work with the refinement subroutine?
the actual 2.5D sims for the cooling jets
So I made a silly mistake last week. The images I showed at Monday's meeting are literally 2D, not 2.5D. I altered my problem.f90 and the .data files accordingly to run actual 2.5D sims. Here are the images for the weak cooling case.
The 2D and 2.5D versions of the moderate cooling case are running right now; when those are done, I will run the 2D and 2.5D versions of the strong cooling case. The 3D versions of both the moderate and strong cooling cases will have to be run on kraken, but as mentioned in previous posts, I do not know if I will be able to get the 3D strong cooling case to work.
I quickly learned how to set up a 2.5D simulation, and there wasn't much on the wiki about it, so I plan on adding some details.
UPDATE
I added a small section about cylindrical symmetry at the bottom of PhysicsDataExplained.
Migrating HSE from 2D to 3D
This morning I ran a simulation of the HydroStatic Star module in 3D.
The simulation was successful and stable.
A remarkable fact is the change in the pressure profile, which leads to a more natural temperature profile.
Here are the two models compared; they have an identical density profile but differ in pressure profiles and therefore temperature profiles.
This is most likely due to the switch from plummersoftening2D to its 3D version.
Edit (Jonathan explained the reason for the change): in 2D the point gravity object is actually a line charge, so the force falls off as 1/r; in 3D it falls off as 1/r^2.
I ran the 3D model with 4 AMR levels, and this led to less mass loss and increased stability overall. Computational time was not too long (~2 hours on 32 cores). movie
As I did with our 2D simulation, I compared the values of this new 3D model with the reference model. I was able to load all the values into the same plot this time for a better understanding. Legend:
- continuous lines: reference model
- dots: our model
- blue/purple: pressure
- red/orange: temperature
- gray/green: density
Overall, this looks like a good result to me. The next step is to put our accretion disk back in the picture.
Hydrostatic Equilibrium Module Recap
So we have had some new developments since last week with the Hydrostatic Equilibrium module. I thought I would give an update on where we are now.
We now have a model that is stable enough to be suitable for a simulation.
Below is a plot that shows the pressure, temperature, and density profiles in log scale; you can also get an impression of what the velocity magnitudes and Mach numbers are.
Here is the density and pressure evolution of the model:
movie1
Jonathan fixed the density infall from the boundaries, and the model is stable up to t=1, with a timeScale of 17403 seconds (~5 hours). There is still some, although not excessive, expansion of the central part.
Things get a little bit ugly when we look at the plots for temperature and Mach number.
movie2
There is still an inverse profile for the temperature; apparently trying to change it is a source of instability, but we can push it a little bit. The Mach number plot seems to be affected by some AMR-related issue; Jonathan hinted at the possibility of such an issue today at the meeting. I will look into that.
Last but not least, let's see how our profile compares to the model of an AGB star (the reference model can be found here). The plots below represent our model (first image) and the reference model (second image).
The two models are not perfectly identical in the central part of the star (where r ≤ 4). This is due to the smoothing that we had to apply to the density and gravitational potential. However, for our specific problem, the central part of the star will be covered by an accretion disk.
Below are the same plots as above but translated by 4 units so that you can see how the two profiles compare at r ≥ 4.
- What is left to do: tweak the temperature profile a bit, bring the problem to 3D, and place our accretion disk in.
meeting summary
- worked on Martin's paper
- Hydrostatic Star Problem
- Applied for Miller fellowship
- Working on Spectra for colliding flow papers
Meeting Update 1008
Star formation: I've been doing more triggered star formation runs. The following three runs are for Mach 3.5, density contrast 13:
Hydro
Bx beta=4
By beta=4
The By run has not finished yet. These runs, with lower Mach number and higher density contrast, still do not form sink particles. For the Hydro and Bx cases, the images are at 1 crushing time, and the simulations actually last until 2 crushing times. We see that Hydro and Bx do not differ much. They seem to stretch quite a lot at about 1 crushing time, which I hope could be remedied by the presence of By. This week I will use Boss' original data and see if we can reproduce the collapse he found. From there, we should be able to add the magnetic field.
Qual: Wrote the brief for the qual. It was sent to the committee members last Friday.
Papers: Almost finished revising the HEDLA proceedings, except for some figures to tweak. It should be ready in 2 or 3 days.
Resistive Paper: Obtained an account on Kraken. Will try compiling and running the last two jobs for the resistive paper on it this week.
MHD sink particles: Andy's comment:
I also simply remove mass inside a volume of r ~ 4*dx.
Physically this means the gas switches from flux-frozen (or as close
to flux-frozen as the code can maintain with finite numerical
dissipation) to perfectly resistive inside the sink region. The
motivation for this is that the mass to flux ratio in young stars is
vastly larger than the mass to flux ratio inferred from observations
and in simulations from infrared dark cores. Flux freezing must break
down somewhere because observationally only a small fraction of the
magnetic flux from the core winds up in the star. Resistive effects,
like ambipolar diffusion, it is conjectured, is the dominant effect
for this, operating on small scale close to the surface of the
protostar. We assume that the magnetic dissipation scale is much,
much smaller than the grid scale and therefore throw flux freezing out
the window at the smallest scale that we can actually resolve (~ 4
dx). One consequence of doing accretion this way is that flux tubes
that get pulled into the sink region become boyant after you remove
mass from them, leading to interchange instabilities in the flow. If
the assumption of "small scale resistivity" is correct, I believe
these interchange instabilities are also correct or, as correct as one
can resolve.
http://adsabs.harvard.edu/abs/2011MNRAS.415.1228P
Meeting Update 10/08/2012 - Baowei
- Golden Version Status
Merged with Ivan's revision last Friday and Jonathan's revision on Sunday, so I was wrong about the final merge last week. Hopefully we are close to the final merge. The following table summarizes the weekend's testing results (with Ivan's code).
Machines | Testing Results |
---|---|
Clover | Goes very slowly; takes hours. Stopped testing on clover |
Alfalfa | All passed |
Bamboo | All passed |
Grass | All passed |
Bluehive | Modules that use bear2fix to process testing chombos passed. Found an issue running modules which don't use bear2fix (those that generate plots, like BrioWuShockTubes). Updated the testing suite script for these modules. |
Blue Gene/P | No reservation. Didn't run tests on it |
Blue Streak | Found a bunch of bugs in the code and data files, but all are fixed. Updated the testing suite script for modules with plot testing results. Got slightly different chombo files when running Bondi, which failed the test. Working on it |
With Jonathan's revision, I found a segmentation fault error running Basic Disk. Working on it.
- Trac 1.0
- Installed a new LaTeX plugin. Tried multiple times, but the old plugin won't work with the current version of Trac. The way of writing equations is slightly different (a single pair of delimiters). Details can be found at Ticket #254. Old wiki pages with equations may look strange. We will probably go back to the old plugin when the new version comes.
- Reinstalled Mercurial plugin.
Meeting Update 1008 - Jason and Ivan
HSE:
We are obtaining the new HSE code/module from the repository.
Jonathan sent us a movie of the new module.
AstroBEAR code improvements:
Ticket 250: module control is now able to keep track of object creation order for gridinit, beforestep, and seterrflag calls.
Different routines, such as <object>RemoveFromList, <object>GridInit, and <module>Init, were removed or ported to the new object module.
Work is complete, code is being tested and merged.
The new code behaves exactly like the old one (for now) for testing purposes.
This is an example of the GridInit subroutine of module control:
CALL ObjectsGridInit(Info, AMBIENTOBJ)
CALL ObjectsGridInit(Info, COLLIDINGFLOWOBJ)
CALL ObjectsGridInit(Info, CLUMPOBJ)
CALL ObjectsGridInit(Info, DISKOBJ)
CALL ObjectsGridInit(Info, WINDOBJ)
CALL ObjectsGridInit(Info, OUTFLOWOBJ)
CALL ObjectsGridInit(Info, UNIFORMREGIONOBJ)
CALL ObjectsGridInit(Info, SPLITREGIONOBJ)
and ObjectsGridInit(Info,type) in objects.f90:
DO WHILE(ASSOCIATED(Object))
   IF (Object%type == TYPE) THEN
      CALL ObjectGridInit(Info, Object)
   ENDIF
   Object => Object%next
END DO
The object type hierarchy is still used although a creation time priority is followed among same-type objects. Before proceeding to a creation time only ordering, we have to make sure each problem module instantiates its objects in a correct order, as Jonathan correctly mentioned.
Licensing:
Ticket 256 - tagging script deployed. It is simply a matter of running it inside the repository.
Below is the copyright notice that will appear in each file (Disk.f90 example):
Copyright 200x-2012 Department of Physics and Astronomy, University of Rochester.

Disk.f90 is part of AstroBEAR.

AstroBEAR is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

AstroBEAR is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.

You should have received a copy of the GNU General Public License along with AstroBEAR. If not, see <http://www.gnu.org/licenses/>.
The full license text, which has to appear once in the code, can be found at http://www.gnu.org/licenses/gpl.txt.
Meeting Update 10/08/2012 - Eddie
- made the cooling jet simulations work in 2D ehansen10032012
- I didn't like the way the moderate cooling case looked. The jet head doesn't form a good bow shock; it just breaks up into the ambient. I did find a small typo in the problem module, so I'm rerunning that.
- still can't get the strong cooling case to run on kraken. It stops after only one frame dump, so it could be a memory issue. I'll look into this more this week.
- got the problem module set up and ready to go for the mach stem research ehansen10052012
Approximating m(t) onto the BE sphere
Did a calculation today to try to get a handle on the mass falling onto the BE sphere in the light ambient case. It is written up here (also accessible from my page):
https://clover.pas.rochester.edu/trac/astrobear/wiki/u/massBElight
Mach stem simulation set up
I have a module written and ready for the Mach stem research that I'll be doing with Kris. This run was just a proof of concept to make sure I could do runs like this and that it works. To start, I just need to tweak the parameters and do a set of runs with different gammas.
BE module write-up
Worked on writing up a description of initializing a BE sphere on the grid:
https://clover.pas.rochester.edu/trac/astrobear/wiki/u/BonnorEbertModule
I hope to add some information tomorrow on the actual BE_module in the physics folder of a-bear, and on controlling the ambient medium inside problem.f90.
Nice reference for Latex writing on wiki
So one no longer needs to flank either side of the LaTeX formula with delimiters.
http://en.wikipedia.org/wiki/Help:Displaying_a_formula
From what I have gathered today, it seems you just need to do:
{{{#!latex formula stuff here }}}
When you are trying to group terms together, use curly brackets {}.
A study of the gravitational potential of my BE sphere and the surrounding ambient
Yesterday, in light of the anomalous crash of a-bear while running my matched ambient case, I began some detective work on the possible cause.
Comparing the light and matched cases in VisIt, I didn't notice any glaring discrepancies in rho, but did see drastic differences in Phi. So, I set out to try to better understand whether these were actually unexpected.
I did some basic calculations of phi yesterday and made some plots. Today I was able to make a nice wiki page describing my methods and findings.
At the bottom of the page, you will see that the calculated phi matches the simulated phi. That is a nice result.
https://clover.pas.rochester.edu/trac/astrobear/wiki/u/GasPhiBE
Cooling Jets - 3D vs. 2D
The cooling jets that I have been working on have all been 3D runs, but the very strong cooling cases require a lot of resolution. In order to get around this, I modified a few things in Martin's jet module to make it work in 2D. I ran the weak cooling case in both 2D and 3D to compare.
2D | |
---|---|
3D Slice |
As expected, they are not exactly the same, so we just have to decide if 2D is good enough. For the very strong cooling case, it may be our only option. The 2D run above ran extremely fast (less than a minute on grass).
Meeting update 1001 - Jason and Ivan
AstroBEAR related updates:
- created a script for source code tagging; waiting for a group decision regarding author tags.
- ticket 250 (unifying calls among object modules) in the works.
Research related updates:
- minimal velocities at beginning of evolution
- an instability develops and is amplified with time: turbulence
Meeting update
- Confirmed the BE sphere in a light medium is stable for 5 crossing times:
https://clover.pas.rochester.edu/trac/astrobear/blog/erica09252012
- Running tests on the matched ambient case. I am getting a seg-fault when running on 16 processors in the Debug queue, and some other quirkiness. Will work on it today or tomorrow and post a ticket if need be.
- Wrote an outline for the BE paper and thought about the next steps for the project. Will talk to Adam about it later today.
- Read and took notes on Toro, ch. 1.
Meeting Update 1001
Sweet-Parker Problem
The AMR issue is kind of fixed; it was due to a typo in the code. However, the error generated on the AMR boundaries depends on resolution. Here is a comparison between a successful 2-level AMR run (top) and a higher resolution 1-level AMR run (bottom). As one can see, the higher resolution run generates errors on the inner boundaries.
Here is a movie for a successful 2 level AMR run:
http://www.pas.rochester.edu/~shuleli/multi_test/spmovie.gif
Magnetic Island Generation
A magnetic field in a sheet pinch configuration can generate many resistive instabilities under perturbation.
The magnetic island formation is induced by perturbing the sheet pinch with a sinusoidal perturbation. X-points and O-points can form due to non-zero resistivity, and dense regions separated by the X and O points form, hence "islands". The growth rate and the size of the "islands" depend on the resistivity and the strength of the perturbation.
Full movie:
http://www.pas.rochester.edu/~shuleli/multi_test/mihr.gif
Triggered Star Formation
I completed a run with the BE sphere defined in the existing module. The density contrast of this BE sphere is 5, the shock is Mach 5, Gamma = 1, in 3D.
We do see post-shock material bound by self-gravity, but no sink particles formed. The dense material is eventually torn apart by the incoming flow. The full movie is here:
http://www.pas.rochester.edu/~shuleli/multi_test/tsf1.gif
This one looks promising: with a very low density contrast, the post-shock material is still well bound by gravity. The next steps should be: (1) reduce the shock Mach number. In the Boss paper, they did Mach 1.5, 2.5, and 5. A slower shock should compress the material enough to trigger collapse but at the same time not be strong enough to fragment it. (2) increase the density contrast so that more material is available during the compression. (3) cooling and magnetic fields can help a lot. Even in the presented setup, a By field would greatly confine the material and prevent the tearing along the x direction. (4) reproduce Boss' result.
MHD Sink Particle
A paper that uses MHD sink particles to produce magnetic field expulsion:
http://arxiv.org/abs/1105.5739
Their method:
"When the matter from a cell is added to a sink particle, the magnetic flux from the cell cannot be added to the sink as well, on both physical and numerical grounds. Physically, the addition of the magnetic flux to the sink particle would make the stellar field strength much higher than observed (which is, of course, the well known “magnetic flux problem”). Numerically, the sink particle cannot hold a large magnetic flux, which would produce a large, unbalanced magnetic force in the host cell of the sink particle. The needed decoupling of the magnetic field from matter during sink particle mass accretion makes the creation of the DEMS unavoidable in an ideal MHD simulation of the protostellar phase of star formation"
Adam, Eric, and I had a short discussion last week via email about their method. This should be easy to do in the code, but is it OK for our application (triggered star formation)?
Resistive Paper
The Reynolds number = 10 runs are running now. They are very slow because the code does not yet have efficient subcycling for the multiphysics. Maybe I will move to the TeraGrid machines to finish them.
This week
Continue the projects presented above, write up the Qual brief (deadline this Friday), and finish the revision of the HEDLA proceedings.
Baowei's Meeting Update -- Oct 1 2012
- Golden Version
- Did the final merge. Running final tests on local machines, bluehive, and bluegene. Will check into devel_branch when all tests pass.
- Total 28 testing modules.
- Things to do: GPL license, configure file, testing pages on the wiki for each local machine, weekly test runs, download page.
- Blue Streak
- Installed AstroBEAR and the necessary libraries with IBM XL compilers; makefile modified (Ticket #245)
- Ran successfully with the Bondi testing module
- Will try to run all testing modules and scaling tests
- Trac
- updated to 1.0.1
- More plugins need to be installed (#254…)
- UCLA visitor
- Reported a compiling error. Solved.