Export Rerun Grids with POSYDON Post Processing Pipeline

Sometimes the POSYDON MESA practitioner might be challenged with exporting grid points to be rerun systematically across multiple MESA grids, with the aim of fixing the "islands of unhappiness" (cit. Aaron Dotter). These islands are portions of the parameter space where the MESA simulations did not converge to a solution. The POSYDON post-processing pipeline can export the grid points to be rerun to a grid.csv file (according to any predefined logic), together with an ini file preconfigured with the correct MESA-INLIST submodule branch and commit. These files can then be used to run the rerun grid points with the POSYDON MESA submission API tool.

POSYDON MESA grids associated with v2.0.0 made use of the following predefined rerun options in the post-processing pipeline:

- 'PISN': "matthias_PISN-d68228338b91fd487ef5d55c9b6ebb8cc5f0e668"
- 'reverse_MT': "zepei_fix_implicit-afa1860ddf9894aa1d82742ee2a73e8e92acd4a9"
- 'opacity_max': "matthias_PISN-d68228338b91fd487ef5d55c9b6ebb8cc5f0e668"
- 'opacity_max_hms-hms': "zepei_fix_implicit-afa1860ddf9894aa1d82742ee2a73e8e92acd4a9"
- 'TPAGBwind': "development-22c1bb9e730343558c3e70984a99b3fc1f3c346e"
- 'thermohaline_mixing': "development-22c1bb9e730343558c3e70984a99b3fc1f3c346e"

Please refer to the POSYDON v2.0.0 paper for more details.
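For quick reference in a notebook, this mapping can be written down as a plain Python dictionary (for illustration only; the authoritative values are set by the pipeline itself):

# mapping of RERUN_TYPE values to the MESA-INLIST submodule branch-commit
# used for the POSYDON v2.0.0 reruns (reference copy of the list above)
V2_RERUN_BRANCHES = {
    'PISN': 'matthias_PISN-d68228338b91fd487ef5d55c9b6ebb8cc5f0e668',
    'reverse_MT': 'zepei_fix_implicit-afa1860ddf9894aa1d82742ee2a73e8e92acd4a9',
    'opacity_max': 'matthias_PISN-d68228338b91fd487ef5d55c9b6ebb8cc5f0e668',
    'opacity_max_hms-hms': 'zepei_fix_implicit-afa1860ddf9894aa1d82742ee2a73e8e92acd4a9',
    'TPAGBwind': 'development-22c1bb9e730343558c3e70984a99b3fc1f3c346e',
    'thermohaline_mixing': 'development-22c1bb9e730343558c3e70984a99b3fc1f3c346e',
}

# e.g. the branch-commit used by the 'opacity_max' rerun exported in this tutorial
print(V2_RERUN_BRANCHES['opacity_max'])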

Here we show how to export the opacity_max reruns associated with the “getting started” tutorial using the post-processing pipeline.

If you haven't done it already, export the environment variables.

[1]:
%env PATH_TO_POSYDON=/srv/beegfs/scratch/shares/astro/posydon/simone/documentation/POSYDON/
%env PATH_TO_POSYDON_DATA=/srv/beegfs/scratch/shares/astro/posydon/POSYDON_GRIDS_v2/POSYDON_data/230914/
env: PATH_TO_POSYDON=/srv/beegfs/scratch/shares/astro/posydon/simone/documentation/POSYDON/
env: PATH_TO_POSYDON_DATA=/srv/beegfs/scratch/shares/astro/posydon/POSYDON_GRIDS_v2/POSYDON_data/230914/
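As an optional sanity check, you can verify from Python that both variables are set and point to existing directories before configuring the pipeline (a small sketch using only the standard library):

import os

# both variables are read by posydon.config and by the pipeline scripts,
# so they should point to existing directories
for var in ("PATH_TO_POSYDON", "PATH_TO_POSYDON_DATA"):
    path = os.environ.get(var)
    print(f"{var} = {path} (exists: {os.path.isdir(path) if path else False})")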

Preparing the Pipeline initialization file

Let’s copy the pipeline ini file template for the UNIGE HPC cluster.

[2]:
import os
import shutil
from posydon.config import PATH_TO_POSYDON

path_to_ini = os.path.join(PATH_TO_POSYDON, 'grid_params/pipeline_yggdrasil.ini')
shutil.copyfile(path_to_ini, './pipeline.ini')
[2]:
'./pipeline.ini'

We now edit the pipeline ini file to point to the MESA grid directory test_grid/, which contains a set of 100 MESA models of the HMS-HMS grid at 0.1 Zsun for the mass ratio q=0.7; see the running MESA grids getting started tutorial.

For the pipeline to be able to process the data, the grid directory must follow the naming convention /HMS-HMS/1e-01_Zsun/test_grid/.

Here we just want to run the rerun step of the pipeline, in order to create the grid.csv file and the grid ini file that will be used to run the subsample of MESA simulations with the new configuration. Notice that the rerun step can also be used after the second step to export reruns of a concatenated MESA grid (a PSyGrid object h5 file).

After setting up the HPC account options, we set PATH_TO_GRIDS to the value shown below.

[4]:
from posydon.config import PATH_TO_POSYDON_DATA

PATH_TO_GRIDS = os.path.join(PATH_TO_POSYDON_DATA, 'POSYDON_data/tutorials/processing-pipeline')
PATH_TO_GRIDS
[4]:
'/srv/beegfs/scratch/shares/astro/posydon/POSYDON_GRIDS_v2/POSYDON_data/230914/POSYDON_data/tutorials/processing-pipeline'
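With PATH_TO_GRIDS defined, you can optionally verify that the grid directory follows the naming convention described above (the directory names used here are the ones from this tutorial):

import os

# expected layout: <PATH_TO_GRIDS>/<grid type>/<metallicity>/<grid slice>/
grid_dir = os.path.join(PATH_TO_GRIDS, 'HMS-HMS', '1e-01_Zsun', 'test_grid')
print(grid_dir, 'exists:', os.path.isdir(grid_dir))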

We set:

CREATE_GRID_SLICES = False
COMBINE_GRID_SLICES = False
CALCULATE_EXTRA_VALUES = False
TRAIN_INTERPOLATORS = False
EXPORT_DATASET = False
RERUN = True

And edit the [rerun] section of the file to

GRID_TYPES = ['HMS-HMS']
METALLICITIES = [['1e-01_Zsun']]
GRID_SLICES = [['test_grid']]
COMPRESSIONS = [['LITE']]
DROP_MISSING_FILES = True
RERUN_TYPE = 'opacity_max'

Also remember to set

CREATE_PLOTS = []
DO_CHECKS = []

for all other steps, otherwise the pipeline will try to create plots and do checks on the data, which is not possible since we are not running the full pipeline.
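If you prefer to apply these edits programmatically instead of in a text editor, a minimal sketch using the standard library configparser could look like the following. The section name for the step flags (here assumed to be [pipeline setup]) is an assumption based on the template; open your copied pipeline.ini and adjust the section and option names if they differ. Editing the file by hand works just as well.

import configparser

# sketch only, not the official workflow: edit pipeline.ini in place
config = configparser.ConfigParser(interpolation=None)
config.optionxform = str  # preserve the upper-case option names
config.read('pipeline.ini')

# assumed section name for the step switches; check the template
config['pipeline setup'].update({
    'CREATE_GRID_SLICES': 'False',
    'COMBINE_GRID_SLICES': 'False',
    'CALCULATE_EXTRA_VALUES': 'False',
    'TRAIN_INTERPOLATORS': 'False',
    'EXPORT_DATASET': 'False',
    'RERUN': 'True',
})

# the [rerun] section shown above
config['rerun'].update({
    'GRID_TYPES': "['HMS-HMS']",
    'METALLICITIES': "[['1e-01_Zsun']]",
    'GRID_SLICES': "[['test_grid']]",
    'COMPRESSIONS': "[['LITE']]",
    'DROP_MISSING_FILES': 'True',
    'RERUN_TYPE': "'opacity_max'",
})

# remember to also set CREATE_PLOTS = [] and DO_CHECKS = []
# in the sections of all the other steps, as noted above
with open('pipeline.ini', 'w') as f:
    config.write(f)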

Setting up and Running the Post Processing Pipeline

We are now ready to set up the grid pipeline with the posydon-setup-pipeline command, as follows:

[1]:
!posydon-setup-pipeline pipeline.ini
/home/bavera/.conda/envs/posydon_env/bin/posydon-setup-pipeline:4: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
  __import__('pkg_resources').require('posydon==1.0.0+194.g3953a14')

+++++++++++++++++++ACCOUNT+++++++++++++++++++
{ 'ACCOUNT': 'meynet',
  'EMAIL': 'simone.bavera@unige.ch',
  'MAILTYPE': 'ALL',
  'PARTITION': 'public-cpu',
  'WALLTIME': '24:00:00'}

++++++++++++++++++++SETUP++++++++++++++++++++
{ 'CALCULATE_EXTRA_VALUES': False,
  'COMBINE_GRID_SLICES': False,
  'CREATE_GRID_SLICES': False,
  'EXPORT_DATASET': False,
  'PATH': '.',
  'PATH_TO_GRIDS': '/srv/beegfs/scratch/shares/astro/posydon/POSYDON_GRIDS_v2/POSYDON_data/230914/POSYDON_data/tutorials/processing-pipeline/',
  'RERUN': True,
  'TRAIN_INTERPOLATORS': False,
  'VERBOSE': True,
  'VERSION': ''}


-------------CREATE_GRID_SLICES--------------  step_1 :False


-------------COMBINE_GRID_SLICES-------------  step_2 :False


-----------CALCULATE_EXTRA_VALUES------------  step_3 :False


-------------TRAIN_INTERPOLATORS-------------  step_4 :False


---------------EXPORT_DATASET----------------  step_9 :False


--------------------RERUN--------------------  rerun  : True
{ 'COMPRESSIONS': [['LITE']],
  'DROP_MISSING_FILES': True,
  'GRID_SLICES': [['test_grid']],
  'GRID_TYPES': ['HMS-HMS'],
  'METALLICITIES': [['1e-01_Zsun']],
  'RERUN_TYPE': 'opacity_max'}

Great! Let's run the pipeline with the shell command.

[3]:
!./run_pipeline.sh
rerun.slurm submitted as 28473947
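The rerun step is submitted as a Slurm job; while it is queued or running, you can monitor it with the standard Slurm commands (assuming a Slurm scheduler, as on the UNIGE cluster used in this tutorial), for example:

!squeue -u $USER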

Once the job is done, you can check that the files were created in the PATH_TO_GRIDS/HMS-HMS/1e-01_Zsun/ directory.

[11]:
!ls /srv/beegfs/scratch/shares/astro/posydon/POSYDON_GRIDS_v2/POSYDON_data/230914/POSYDON_data/tutorials/processing-pipeline/HMS-HMS/1e-01_Zsun/rerun_opacity_max_test_grid
grid.csv  HMS-HMS_yggdrasil.ini
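Optionally, you can inspect the exported grid.csv to see how many MESA runs were selected for the opacity_max rerun (a quick look with pandas; the exact columns depend on the grid type):

import os
import pandas as pd

# PATH_TO_GRIDS was defined earlier in this notebook
rerun_dir = os.path.join(PATH_TO_GRIDS, 'HMS-HMS', '1e-01_Zsun',
                         'rerun_opacity_max_test_grid')
grid = pd.read_csv(os.path.join(rerun_dir, 'grid.csv'))
print(f'{len(grid)} runs selected for rerun')
grid.head()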

Great! You can now submit the simulation. See the tutorial on how to run the grid with the POSYDON MESA submission API tool.