Posts for the month of August 2014

Use AstroBEAR to transfer initial data of 3D Ablative RT from fine grid to coarse grid

The cells of the initial grid for the data from LLE are too small, and AstroBEAR runs slowly with such a base grid. Here's how to use AstroBEAR to transfer the initial data onto a grid with twice the cell size.

  • 1. Set the base grid resolution to half of the data resolution and the maximum AMR level to 1 in global.data, so that the level 1 grid matches the 100x1205x100 input data:
GmX      = 50, 601, 50 !100,1205, 100                   ! Base grid resolution [x,y,z]
MaxLevel = 1                            ! Maximum level for this simulation (0 is fixed grid)
  • 2. Set ErrFlag to 1 everywhere when not restarting (i.e., when reading in the 3D txt data):
  SUBROUTINE ProblemSetErrFlag(Info)
        !! @brief Sets error flags according to problem-specific conditions.
        !! @param Info A grid structure.        
    
    TYPE (InfoDef) :: Info
   
    ! if we need to generate coarse grid data (with 1 level of AMR), set ErrFlag everywhere to 1
    if (InitialProfile .eq. 3 .AND. .NOT. lRestart) then
       Info%ErrFlag(:,:,:) = 1
    end if 
   
  END SUBROUTINE ProblemSetErrFlag
  • 3. Read the txt data into the level 1 grid instead of level 0. The level 0 grid also needs to be initialized to avoid protections:
        DO i =1,mx
        DO j =1,my
        DO k =1,mz
            read(11,*) pos(1),pos(2),pos(3),rho
            read(12,*) pos(1),pos(2),pos(3),p
            read(13,*) pos(1),pos(2),pos(3),vx
            read(14,*) pos(1),pos(2),pos(3),vy
            read(15,*) pos(1),pos(2),pos(3),vz

            rho=5.0*rho/rScale
            p=1.25E+14*p/pScale
            vz=5E+6*vz/VelScale
            vy=5E+6*vy/VelScale
            vx=5E+6*vx/VelScale

           if(Info%level .eq. 0) then
            Info%q(i,j,k,1)=1.0
            Info%q(i,j,k,2)=0.0
            Info%q(i,j,k,3)=0.0
            Info%q(i,j,k,4)=0.0
            Info%q(i,j,k,iE)=0.0
           end if

           if(Info%level .eq. 1) then
            Info%q(i,j,k,1)=rho
            Info%q(i,j,k,2)=rho*vx
            Info%q(i,j,k,3)=rho*vy
            Info%q(i,j,k,4)=rho*vz
            energy = 0.5*rho*(vx**2+vy**2+vz**2)+p/(gamma-1d0)
            Info%q(i,j,k,iE)=energy
           end if

        end do
        end do
        end do
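
Note that the unit numbers above assume the five data files were opened earlier in the routine; a minimal sketch of what that might look like (the file names here are hypothetical):

        OPEN(UNIT=11, FILE='rho.txt', STATUS='OLD')  ! density
        OPEN(UNIT=12, FILE='p.txt',   STATUS='OLD')  ! pressure
        OPEN(UNIT=13, FILE='vx.txt',  STATUS='OLD')  ! x velocity
        OPEN(UNIT=14, FILE='vy.txt',  STATUS='OLD')  ! y velocity
        OPEN(UNIT=15, FILE='vz.txt',  STATUS='OLD')  ! z velocity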
  • 4. Run the program from the start. Frame 0 will have the level 1 grid everywhere:
http://www.pas.rochester.edu/~bliu/AblativeRT/3Dcase/CoarseGrid/test/CoarseGrid_1AMR_test_frame0.png
  • 5. Restart from frame 0 (ErrFlag will now be 0). Frame 1, after a tiny step (or any frame other than frame 0), will only have level 1 at the interface:
http://www.pas.rochester.edu/~bliu/AblativeRT/3Dcase/CoarseGrid/test/CoarseGrid_1AMR_test_dt.png

Meeting Update 08/25/2014 - Eddie

New Outflow Object

The new outflow object module is starting to come together. I successfully ran Martin's problem module with an outflow object of a simple 2.5-D hydro jet.

movie

Next, I will test some of the velocity parameters and the diverging wind. There are some things that were already built into the outflow object module that I have excluded for now. I will have to re-implement those later.

Other Stuff

  • Need to revise the 2.5-D pulsed jet paper
  • Will start 2-D Mach stem runs of clumps with relative velocities

Meeting Update 08/25/2014: I made a shape!

In an attempt to visualize the PN data that Baowei set aside for me in SHAPE, I first need to learn the ropes of the software, so that is what I've been up to. The SHAPE developers have a YouTube channel, and I am going through all their videos, reproducing what they do on my own computer (see: SHAPE 3D Interactive Astrophysics).

So I made a torus. I understand the 3D viewports in SHAPE now; it is pretty intuitive once you know where things are located. It turns out you can also add observational photos to the background, which might be useful as a reference when creating detailed objects. The SHAPE tutorials include a Jet Template Project in their Hydrodynamics module. My short-term goal is to recreate that, and perhaps a few other templates, until I can convert the data we have (see: SHAPE Templates).

Another issue is converting HDF5 to ASCII, which Martin informed me is the format needed to feed data into SHAPE. Baowei has a script (see: Baowei's Page) where one first rewrites each chombo with a single core and then converts it to ASCII.

For fun I threw the chombo files into VisIt to see what they looked like:

GIF GIF

Both sets have the same max and min.

Going forward:

  • Attempting to become a SHAPE aficionado: try to visualize the data we have in SHAPE… and understand it. Dig into some literature on planetary/proto-planetary nebulae.
  • Working on the post-processing for Erica's runs. They should all be complete now.
  • Once I'm done making some of the typical movies from the colliding flows bovs, I'll work on the movie fly-throughs. Martin said he needed some pretty graphics, so I am going to try to get as much of this done as I can soon. Also for the VISTA contest on September 5th.

Meeting update

Development

Working on 2.5-D and 1-D spherical gravity mods to the code.

Here is a page on this I am putting together as I go:

https://astrobear.pas.rochester.edu/trac/wiki/u/erica/CylindricalGravity

Using LeVeque's finite difference book (http://volunteerlecturerprogram.com/wp/wp-content/uploads/2013/01/lecnotes.pdf) to refresh my memory on the matrix form of the equations.
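
To make the target concrete, here is a minimal sketch of the discretization I have in mind for the 1D spherically symmetric Poisson equation (my illustration of the standard centered difference, not necessarily what will go into the code):

$\frac{1}{r_i^2 \, \Delta r^2}\left[\, r_{i+1/2}^2\,(\Phi_{i+1}-\Phi_i) \;-\; r_{i-1/2}^2\,(\Phi_i-\Phi_{i-1}) \,\right] = 4\pi G \rho_i$

Applying this at every interior zone, with boundary conditions closing off the two ends, gives a tridiagonal linear system $A\Phi = b$ that can be solved directly.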

Christina's runs

I now have access to her school's machines and, ostensibly, the hydro data set.

One of the MHD runs is almost complete now

  • Beta 1, shear 60: 166/200
  • Beta 10, shear 15: 172/200
  • Beta 10, shear 30: 179/200
  • Beta 10, shear 60: 188/200

Meeting Update 08/18/14 - Eddie

Basically just working on two things right now:

  • the jet/outflow object module (ticket #370)
    • this is more of a code issue that I would like to discuss briefly tomorrow
  • revisions to my 2.5-D pulsed jets paper

Meeting Update 08/18

No science this week for me. Still working on Erica's runs, and starting to delve into more development-related topics.

On High Res Runs

  • Currently transferring materials to Stampede; about to start submitting jobs today (beta 1, shear 60 & beta 10, shear 15). The other two will run on Bluestreak.
  • Submitting jobs to Bluestreak. Waiting on Carl to respond about renewing the reservation; in the meantime, beta 10, shear 60 is running in the standard queue on 128 nodes.
  • Aiming to update the CF Production Runs page. Some of our runs are almost done; for instance, beta 10, shear 60 is nearly to chombo 190.

Computing Resources

See previous post.

Development

Had a meeting with Jonathan last Friday to start working on development projects.

  • Once I am done working with Erica's data, I am going to focus on creating a knapsack algorithm (which Eddie kindly gave to me :) ). Jonathan and I took a look at the current algorithm we use (see: (17) http://arxiv.org/pdf/1112.1710.pdf). Found this thesis by Pierre Schaus (see: http://becool.info.ucl.ac.be/pub/theses/thesis-pschaus.pdf). Planning to contact both Pierre Schaus (about the nurse-to-patient assignment algorithm) and Mark Krumholz (about load balancing in the accretion module for Orion, an astrophysical AMR code). We also came across the Zoltan library, which might be useful to look at.
  • Cleaning up unused variables in the code.
  • Implementing hydrostatic equilibrium in the disk module, for which I assume I'll have to consider the following.

We know hydrostatic equilibrium is defined generally to be

$\nabla P = -\rho \nabla \Phi$

which I'll take, for some cylindrical coordinate system (with $r$ the distance from the disk axis, and $z$ the height above the plane), to be

$\frac{\partial P}{\partial z} = -\rho \, \frac{GMz}{(r^2+z^2)^{3/2}}$

Then balancing with the ideal gas law and isothermal sound speed ($P = \rho c_s^2$), we can integrate our equation, yielding:

$\rho(r,z) = \rho_0(r)\,\exp\!\left[\frac{GM}{c_s^2}\left(\frac{1}{\sqrt{r^2+z^2}} - \frac{1}{r}\right)\right]$

If anyone wants to go over the derivation with me, that is fine; I just included some highlights here. It is a pretty standard derivation, and I think it can be found in some books. Given that this would be implemented for the disk module, I just assumed a central mass and an infinitesimal chunk of mass at $(r,z)$ in a cylindrical coordinate system.
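
As a concrete illustration, here is a minimal Fortran sketch of evaluating that profile (the function name and argument list are mine, not the actual disk-module interface):

    ! Illustrative only: isothermal hydrostatic density around a central
    ! point mass, from the integration above.
    FUNCTION HydrostaticDensity(r, z, rho0, GM, cs) RESULT(rho)
      REAL(KIND=8), INTENT(IN) :: r, z   ! cylindrical radius and height
      REAL(KIND=8), INTENT(IN) :: rho0   ! midplane density at radius r
      REAL(KIND=8), INTENT(IN) :: GM     ! G times the central mass
      REAL(KIND=8), INTENT(IN) :: cs     ! isothermal sound speed
      REAL(KIND=8) :: rho
      rho = rho0 * EXP( (GM/cs**2) * (1d0/SQRT(r**2 + z**2) - 1d0/r) )
    END FUNCTION HydrostaticDensity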

However, for now I'll be focusing primarily on the knapsack problem, since it'll be more beneficial for the code; a toy version of the idea is sketched below.
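
As a reference point, here is the simplest greedy flavor of the idea in Fortran (made-up names, illustrative only, and much simpler than the algorithms in the references above): always hand the largest remaining work item to the least-loaded processor.

    ! Illustrative greedy (longest-processing-time-first) balancer:
    ! repeatedly assign the largest unassigned workload to the
    ! currently least-loaded processor.
    SUBROUTINE GreedyBalance(cost, nProcs, owner)
      REAL(KIND=8), INTENT(IN)  :: cost(:)   ! estimated work per grid
      INTEGER,      INTENT(IN)  :: nProcs
      INTEGER,      INTENT(OUT) :: owner(:)  ! processor assigned to each grid
      REAL(KIND=8) :: load(nProcs)
      LOGICAL :: unassigned(SIZE(cost))
      INTEGER :: n, j, p
      load = 0d0
      unassigned = .TRUE.
      DO n = 1, SIZE(cost)
        j = MAXLOC(cost, 1, MASK=unassigned)  ! largest remaining grid
        p = MINLOC(load, 1)                   ! least-loaded processor
        owner(j) = p
        load(p) = load(p) + cost(j)
        unassigned(j) = .FALSE.
      END DO
    END SUBROUTINE GreedyBalance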

SHAPE

Downloaded the software. It looks pretty neat. Haven't had any creative ideas yet of what to do with it to explore its capabilities.

Single star

This update is about debugging. The former version of the program let the sink particle move when it interacted with the gas via gravity (not via accretion), and the trajectory of the sink particle was odd.

As a result, asymmetric angular momentum is generated in the outflow. Theoretically, there should not be any angular momentum at all; I guess it is a boundary effect on the xy, xz, and yz planes.

jzasy.gif

Then I fixed the sink particle in place and did not let it move, using these outflow parameters:

Two different outflow velocities

Jz becomes symmetric but its value increases.

jzsymmnonesc.gif jzsymm.gif

Their line plots are:

test1.gif test2.gif

Magnetized Triggered Star Formation - Comparison of Azimuthal Angle

Science Meeting Update 08/18/14 -- Baowei

  1. thick target: movie
http://www.pas.rochester.edu/~bliu/AblativeRT/3Dcase/2ndCut/ThickSec0499.png
  2. thin target: movie
http://www.pas.rochester.edu/~bliu/AblativeRT/3Dcase/2ndCut/ThinSec0340.png

Computing Resources & Local Machines

Noticed, as I am scp-ing data from the CollidingFlows runs, that some of the machines I typically use are starting to become full. I also do visualizations on Clover or my own machine; however, they can be quite hard on my graphics card. This has led me to talk to Baowei and others about potentially looking into developing a new machine quote (see: https://astrobear.pas.rochester.edu/trac/blog/bliu02072012).

Given that the quote is 6 months old, the price is probably no longer accurate. It has been suggested to me that someone in the group should e-mail Sean Kesavan <sean@asacomputers.com> and ask him to give us an updated quote, or change it to meet our requirements (referencing the quote number on the quotation). I also think these are the specs for Bamboo (https://www.pas.rochester.edu/~bliu/ComputingResources/ASA_Computers.pdf).

In my opinion it might be a good idea to look into more disk space/storage and also get a new machine for visualization. In the meantime, I took a look at our local machines and who is using them.

Commands I am using:

  • du -sh * 2>&1 | grep -v "du: cannot" prints the disk space used by each user; redirecting standard error to standard output lets grep filter out the permission-denied noise.
  • df -h inside a directory shows the disk space used (along with the use %) and the space that is available.

alfalfadata

The Disk Space Used

Disk Used   User / Directory
63G bliu
13G ckrau
4.6G ehansen
227G erica
422G johannjc
16K lost+found
1.1T martinhe
4.0K modulecalls
20M nordhaus
1.3T shuleli
300M test.dd
3.5G trac_backup

The Disk Space Available

Filesystem Size Used Avail Use% Mounted on
aufsroot 11T 4.7T 6.3T 43% /
udev 7.9G 4.0K 7.9G 1% /dev
tmpfs 1.6G 1.3M 1.6G 1% /run
none 4.0K 0 4.0K 0% /sys/fs/cgroup
none 5.0M 0 5.0M 0% /run/lock
none 7.9G 108K 7.9G 1% /run/shm
none 100M 160K 100M 1% /run/user
/dev/md127 5.5T 3.3T 1.9T 64% /media/grassdata
deptdisk.pas.rochester.edu:/home/ehansen 11T 4.7T 6.3T 43% /home/ehansen
deptdisk.pas.rochester.edu:/home/shuleli 11T 4.7T 6.3T 43% /home/shuleli
alfalfa.pas.rochester.edu:/alfalfadata 5.3T 4.3T 802G 85% /mnt/net/alfalfadata

bamboodata

The Disk Space Used

Disk Used   User / Directory
1.2T bliu
4.0K brookskj
6.0G chaig
1.6G Defense.zip
8.0K ehansen
580G erica
1.1T johannjc
16K lost+found
2.6T madams
1.2T martinhe
21M rsarkis
4.9G ScalingTest
3.6T shuleli

The Disk Space Available

Filesystem Size Used Avail Use% Mounted on
aufsroot 11T 4.7T 6.3T 43% /
udev 7.9G 4.0K 7.9G 1% /dev
tmpfs 1.6G 1.3M 1.6G 1% /run
none 4.0K 0 4.0K 0% /sys/fs/cgroup
none 5.0M 0 5.0M 0% /run/lock
none 7.9G 108K 7.9G 1% /run/shm
none 100M 160K 100M 1% /run/user
/dev/md127 5.5T 3.3T 1.9T 64% /media/grassdata
deptdisk.pas.rochester.edu:/home/ehansen 11T 4.7T 6.3T 43% /home/ehansen
deptdisk.pas.rochester.edu:/home/shuleli 11T 4.7T 6.3T 43% /home/shuleli
alfalfa.pas.rochester.edu:/alfalfadata 5.3T 4.3T 802G 85% /mnt/net/alfalfadata
bamboo.pas.rochester.edu:/bamboodata 13T 11T 1.2T 91% /mnt/net/bamboodata

botwindata

The Disk Space Used

Disk Used   User / Directory
4.0K bliu
673G johannjc
904K orda
3.4G repositories
4.0K repositories_backup.s
11G scrambler_tests
2.6G tests
65G TO_CLEAN
75G trac
4.0K trac_backup.s
7.2G trac_dev
4.0K trac.wsgi
1.8M www

The Disk Space Available

Filesystem Size Used Avail Use% Mounted on
aufsroot 11T 4.7T 6.3T 43% /
udev 7.9G 4.0K 7.9G 1% /dev
tmpfs 1.6G 1.3M 1.6G 1% /run
none 4.0K 0 4.0K 0% /sys/fs/cgroup
none 5.0M 0 5.0M 0% /run/lock
none 7.9G 108K 7.9G 1% /run/shm
none 100M 160K 100M 1% /run/user
/dev/md127 5.5T 3.3T 1.9T 64% /media/grassdata
deptdisk.pas.rochester.edu:/home/ehansen 11T 4.7T 6.3T 43% /home/ehansen
deptdisk.pas.rochester.edu:/home/shuleli 11T 4.7T 6.3T 43% /home/shuleli
alfalfa.pas.rochester.edu:/alfalfadata 5.3T 4.3T 802G 85% /mnt/net/alfalfadata
bamboo.pas.rochester.edu:/bamboodata 13T 11T 1.2T 91% /mnt/net/bamboodata
botwin.pas.rochester.edu:/botwindata 890G 838G 6.3G 100% /mnt/net/botwindata

cloverdata

The Disk Space Used

Disk Used   User / Directory
28M afrank
368K ameer
72K aquillen
4.0K array.f90
4.0K awhitbe2
20K BasicDisk
137G bcc
31M blin
9.7M bliu
1.7G bobbylc
3.3G chaig
2.6M clover
37M cruggier
0 data
851M DB
4.0K devsrc
107M edmonpp
72G ehansen
301G erica
1019M iminchev
448K jnp1
3.0T johannjc
14M langao
154M laskimr
2.9M lijoimc
3.9G local
16K lost+found
221M madams
13G martinhe
8.0K MegaSAS.log
124K mendygral
37M MHDWaves
13M mitran
648M moortgat
852M munson
3.0G noyesma
20K odat1
4.0K old_accounts
45G pvarni
292K raid
3.3G repositories
5.8G repositories_backup
140M rge21
2.9G rsarkis
192K ryan
126G scrambler_tests
2.1T shuleli
16M spearssj
0 test
54M test3
0 test.me
840M tests
355M tests130
231M tests_old
452K tkneen
14G trac
27G trac_backup
60K wma
1.8M www

The Disk Space Available

Filesystem Size Used Avail Use% Mounted on
aufsroot 11T 4.7T 6.3T 43% /
udev 7.9G 4.0K 7.9G 1% /dev
tmpfs 1.6G 1.3M 1.6G 1% /run
none 4.0K 0 4.0K 0% /sys/fs/cgroup
none 5.0M 0 5.0M 0% /run/lock
none 7.9G 108K 7.9G 1% /run/shm
none 100M 160K 100M 1% /run/user
/dev/md127 5.5T 3.3T 1.9T 64% /media/grassdata
deptdisk.pas.rochester.edu:/home/ehansen 11T 4.7T 6.3T 43% /home/ehansen
deptdisk.pas.rochester.edu:/home/shuleli 11T 4.7T 6.3T 43% /home/shuleli
alfalfa.pas.rochester.edu:/alfalfadata 5.3T 4.3T 802G 85% /mnt/net/alfalfadata
clover.pas.rochester.edu:/cloverdata 11T 5.8T 4.5T 57% /mnt/net/cloverdata

grassdata

The Disk Space Used

Disk Used   User / Directory
184M afrank
4.0K bshroyer
711M bshroyerdata
8.5M chaig
4.0K cruggier
27M data4out.out
127M data5out.out
4.0K ehansen
1.4T erica
4.0K eschroe3
2.1G ferreira
3.2G fnauman
33G grass_swapfile
29G jnp1
119G johannjc
4.0K johannjc:~
4.0K langao
4.0K laskimr
16K lost+found
310G madams
656G martinhe
12M munson
1.4G noyesma
112M root
762G shuleli
1.4G slucchin
4.0K tdennis
1001M test.dd
4.0K testlog
403M tkneen
46M xguan

The Disk Space Available

Filesystem Size Used Avail Use% Mounted on
aufsroot 11T 4.7T 6.3T 43% /
udev 7.9G 4.0K 7.9G 1% /dev
tmpfs 1.6G 1.3M 1.6G 1% /run
none 4.0K 0 4.0K 0% /sys/fs/cgroup
none 5.0M 0 5.0M 0% /run/lock
none 7.9G 108K 7.9G 1% /run/shm
none 100M 160K 100M 1% /run/user
/dev/md127 5.5T 3.3T 1.9T 64% /media/grassdata
deptdisk.pas.rochester.edu:/home/ehansen 11T 4.7T 6.3T 43% /home/ehansen

Magnetized Triggered Star Formation

Top panel: 0.36 Myr; bottom panel: 0.47 Myr

Left to right: beta = 12.8, 204.8, 5120

http://www.pas.rochester.edu/~shuleli/triggered_star_formation/pol12.png
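
(For reference, $\beta$ here is presumably the usual plasma beta, the ratio of thermal to magnetic pressure, $\beta = 8\pi P/B^2$ in cgs units.)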

Movies:

Beta = 12.8

Beta = 204.8

Beta = 5120

Fall back shell

The red line indicates the escape velocity at the boundary.

The green line is the escape velocity at that radius.
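
For reference, the quantity plotted is presumably the standard

$v_{\rm esc}(r) = \sqrt{2GM(r)/r}$

with $M(r)$ the mass enclosed within radius $r$ (my reading of the plots; it is not stated explicitly above).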

test.gif

color.gif

The program still has a bug in the sink particle routine.

Working around bubbles in high speed wind

Bubbles in the wind were prevented from forming by keeping the magnetic field from extending to the grid boundary where the wind comes in.

It remains unanswered why bubbles form when a significant magnetic field is in contact with the edge from which a high-speed wind enters the grid.

Description of attached images:

Image 1 (left): Linear false-color plot of the magnetic field strength at setup, with the wire edge marked by a ring. Note the hard outer boundary of the magnetic field.

Image 2 (right): Optical band emissivity map of simulation with: wind density = 0.01 mg/cc; wind velocity = 150 km/s; B = 7.5 T; sigma = 10.08

Image 3: Bow shock and CD altitudes for this run (empty and filled squares) plotted against data (empty and filled circles) from previous runs with much greater wind densities (2-20 mg/cc) and slower speeds (< 70 km/s).

From the combined results shown in the plot, sigma alone can determine the altitude of the magnetosphere structures; rho, v, and B at the wire surface are degenerate.

Meeting Update 08-05-2014

File Transfer from BS & Run Updates

  • Went through all of the chombos in VisIt. All look great, no issues.
  • Deleted all of the chombos on BS, as I was over the soft limit and could no longer write files.
  • Been updating Erica's Production Runs page: https://astrobear.pas.rochester.edu/trac/wiki/u/erica/CFRunStatsHighRes
  • Reservation starting Thursday at 8 AM for 5 days. Plan on running B1S60 on 128 nodes, B10S15 on 256 nodes, and B10S30 on 128 nodes. Currently have B10S60 running in the standard queue, already at chombo/sink 157.

Post-Processing & Local Machines

  • All of the files I have run on BS are transferred over to either grassdata or bamboodata (including some B10S30: 143-153). Check the ProductionRuns directories under madams in grassdata/bamboodata.
  • We have the following chombos: 144 (B10S15), 153 (B10S30), 153 (B10S60), and 130 (B1S60).
  • We have about ¾ of all the runs done.
  • Post-processing is up to date for B1S60 (completed last night); currently doing post-processing for B10S60 and B10S30 on bamboo and grass (bamboodata), respectively.

GIF GIF

Going Forward

  • Reservations on BS. E-mailed Carl to see if he could lift the soft limit for Erica's account as well as my own.
  • Getting onto Stampede so we can split the jobs between the two machines.
  • Need more space on our local machines. Many of our files are scattered across different directories; it would be smart to keep them organized in the same places.

Meeting Update 08/05/2014 -- Baowei

  • 3D Ablative RT
    1. Extended the 2D data to 3D: expect exactly the same results as in 2D. Tried putting gravity along different directions and found that the code works as expected only when gravity is along y. Running a job with gravity along y.
Gravity comparison: http://www.pas.rochester.edu/~bliu/AblativeRT/3Dcase/checkGravity/3d2dgmass_galongyz.png http://www.pas.rochester.edu/~bliu/AblativeRT/3Dcase/checkGravity/3d2dgmass_galongyz2.png
y-direction movies: density; temperature
    2. Reran the 3D conduction front tests along different directions to a longer time (as long as the test we did for 2D), since the total mass plots look OK. Didn't find anything wrong, although the x- and z-direction cases take a few more cycles to converge than the y-direction case at later times: conduction front

Interaction of Planetary and Stellar winds (Stone/Proga continued)

  • Modified the outer boundary condition to introduce a stellar wind:
    • Fixed the density, internal energy, and velocity in both directions at the boundary.
    • Planet sits inside the wind acceleration region. Non-MHD model.
  • Planetary wind swept back into a parabolic shape.
  • Discontinuity separates the shocked planetary and stellar winds.
  • Shock diverts the stellar wind around the outflow.
  • Constant density at terminal velocity*
  • Nearly isothermal.

Sonic Surfaces:

  • At 0.5R for the planetary wind. Same as the case without a stellar wind.
  • At 2.5R: planetary wind termination shock. Decelerates the wind.
  • At 4.75R: stellar wind termination shock.
  • The exact location depends on the assumed momentum flux in the stellar wind, which confines the planetary wind close to the surface.
  • The mass loss rate is nearly identical to the case with no stellar wind.

Comparison of column density:

  • With a stellar wind, viewing from the night side, the column density is enhanced as the planetary wind is confined.

  • Cometary tail at a different angle.

Limitations:

  • No modelling of thermal processes; fixed parameters.
  • Non-MHD.
  • Orbital motion of the planet and gravitation from the star not included.
  • Valid for low values of lambda.