lightcone-io 0.3.0__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (87)
  1. lightcone_io-0.3.0/.github/workflows/publish.yml +56 -0
  2. lightcone_io-0.3.0/.gitignore +7 -0
  3. lightcone_io-0.3.0/PKG-INFO +14 -0
  4. lightcone_io-0.3.0/README.md +365 -0
  5. lightcone_io-0.3.0/docs/FLAMINGO/Makefile +9 -0
  6. lightcone_io-0.3.0/docs/FLAMINGO/aas_macros.sty +107 -0
  7. lightcone_io-0.3.0/docs/FLAMINGO/lightcone_corrections.tex +115 -0
  8. lightcone_io-0.3.0/docs/FLAMINGO/references.bib +51 -0
  9. lightcone_io-0.3.0/docs/Makefile +20 -0
  10. lightcone_io-0.3.0/docs/make.bat +35 -0
  11. lightcone_io-0.3.0/docs/requirements.txt +4 -0
  12. lightcone_io-0.3.0/docs/source/api.rst +7 -0
  13. lightcone_io-0.3.0/docs/source/conf.py +35 -0
  14. lightcone_io-0.3.0/docs/source/index.rst +9 -0
  15. lightcone_io-0.3.0/examples/make_map_from_particles.py +87 -0
  16. lightcone_io-0.3.0/examples/make_smoothed_map.py +61 -0
  17. lightcone_io-0.3.0/examples/make_smoothed_map.sh +16 -0
  18. lightcone_io-0.3.0/examples/plot_healpix_map.py +35 -0
  19. lightcone_io-0.3.0/examples/plot_pencil_beam.py +72 -0
  20. lightcone_io-0.3.0/examples/read_test.py +43 -0
  21. lightcone_io-0.3.0/lightcone_io/__init__.py +15 -0
  22. lightcone_io-0.3.0/lightcone_io/choose_bh_tracer.py +288 -0
  23. lightcone_io-0.3.0/lightcone_io/combine_maps.py +79 -0
  24. lightcone_io-0.3.0/lightcone_io/compare_maps.py +40 -0
  25. lightcone_io-0.3.0/lightcone_io/correct_maps.py +244 -0
  26. lightcone_io-0.3.0/lightcone_io/downsample_maps.py +147 -0
  27. lightcone_io-0.3.0/lightcone_io/halo_catalogue.py +201 -0
  28. lightcone_io-0.3.0/lightcone_io/healpix_maps.py +434 -0
  29. lightcone_io-0.3.0/lightcone_io/index_particles.py +49 -0
  30. lightcone_io-0.3.0/lightcone_io/kernel.py +92 -0
  31. lightcone_io-0.3.0/lightcone_io/match_black_holes.py +309 -0
  32. lightcone_io-0.3.0/lightcone_io/memory_use.py +43 -0
  33. lightcone_io-0.3.0/lightcone_io/particle_halo_ids.py +593 -0
  34. lightcone_io-0.3.0/lightcone_io/particle_metadata.py +189 -0
  35. lightcone_io-0.3.0/lightcone_io/particle_reader.py +561 -0
  36. lightcone_io-0.3.0/lightcone_io/particle_sorter.py +361 -0
  37. lightcone_io-0.3.0/lightcone_io/smoothed_map.py +395 -0
  38. lightcone_io-0.3.0/lightcone_io/units.py +110 -0
  39. lightcone_io-0.3.0/lightcone_io/util.py +34 -0
  40. lightcone_io-0.3.0/lightcone_io.egg-info/PKG-INFO +14 -0
  41. lightcone_io-0.3.0/lightcone_io.egg-info/SOURCES.txt +85 -0
  42. lightcone_io-0.3.0/lightcone_io.egg-info/dependency_links.txt +1 -0
  43. lightcone_io-0.3.0/lightcone_io.egg-info/requires.txt +8 -0
  44. lightcone_io-0.3.0/lightcone_io.egg-info/top_level.txt +1 -0
  45. lightcone_io-0.3.0/pyproject.toml +29 -0
  46. lightcone_io-0.3.0/scripts/FLAMINGO/choose_tracer.sh +37 -0
  47. lightcone_io-0.3.0/scripts/FLAMINGO/combine_L1000N1800.sh +31 -0
  48. lightcone_io-0.3.0/scripts/FLAMINGO/correct_L1000N1800.sh +34 -0
  49. lightcone_io-0.3.0/scripts/FLAMINGO/downsample_L1000N1800.sh +33 -0
  50. lightcone_io-0.3.0/scripts/FLAMINGO/match_bh_L1000N1800.sh +46 -0
  51. lightcone_io-0.3.0/scripts/FLAMINGO/match_bh_L1000N1800_all.sh +53 -0
  52. lightcone_io-0.3.0/scripts/FLAMINGO/match_bh_L1000N3600.sh +47 -0
  53. lightcone_io-0.3.0/scripts/FLAMINGO/match_dm_L1000N1800.sh +46 -0
  54. lightcone_io-0.3.0/scripts/FLAMINGO/old/README +29 -0
  55. lightcone_io-0.3.0/scripts/FLAMINGO/old/check_index_files.py +53 -0
  56. lightcone_io-0.3.0/scripts/FLAMINGO/old/combine_L1000N0900.sh +29 -0
  57. lightcone_io-0.3.0/scripts/FLAMINGO/old/combine_L1000N3600.sh +29 -0
  58. lightcone_io-0.3.0/scripts/FLAMINGO/old/combine_L2800N5040.sh +30 -0
  59. lightcone_io-0.3.0/scripts/FLAMINGO/old/combine_L5600N5040.sh +30 -0
  60. lightcone_io-0.3.0/scripts/FLAMINGO/old/correct_L1000N0900.sh +34 -0
  61. lightcone_io-0.3.0/scripts/FLAMINGO/old/correct_L1000N1800_ds.sh +41 -0
  62. lightcone_io-0.3.0/scripts/FLAMINGO/old/correct_L1000N3600.sh +33 -0
  63. lightcone_io-0.3.0/scripts/FLAMINGO/old/correct_L2800N5040.sh +33 -0
  64. lightcone_io-0.3.0/scripts/FLAMINGO/old/correct_L5600N5040.sh +33 -0
  65. lightcone_io-0.3.0/scripts/FLAMINGO/old/downsample_L1000N0900.sh +32 -0
  66. lightcone_io-0.3.0/scripts/FLAMINGO/old/downsample_L1000N3600.sh +32 -0
  67. lightcone_io-0.3.0/scripts/FLAMINGO/old/downsample_L2800N5040.sh +38 -0
  68. lightcone_io-0.3.0/scripts/FLAMINGO/old/downsample_L5600N5040.sh +28 -0
  69. lightcone_io-0.3.0/scripts/FLAMINGO/old/halo_ids_L1000N1800.sh +45 -0
  70. lightcone_io-0.3.0/scripts/FLAMINGO/old/match_L1000N0900.sh +38 -0
  71. lightcone_io-0.3.0/scripts/FLAMINGO/old/sort_L2800N5040.sh +44 -0
  72. lightcone_io-0.3.0/scripts/FLAMINGO/old/submit_L1000N1800.sh +24 -0
  73. lightcone_io-0.3.0/scripts/FLAMINGO/old/test_ordering_L1000N1800.sh +51 -0
  74. lightcone_io-0.3.0/scripts/FLAMINGO/old/untar_L2800N5040.sh +24 -0
  75. lightcone_io-0.3.0/scripts/FLAMINGO/sort_L1000N0900.sh +50 -0
  76. lightcone_io-0.3.0/scripts/FLAMINGO/sort_L1000N1800.sh +50 -0
  77. lightcone_io-0.3.0/scripts/FLAMINGO/sort_L1000N1800_bh_only.sh +52 -0
  78. lightcone_io-0.3.0/scripts/FLAMINGO/sort_L1000N3600_bh_only.sh +53 -0
  79. lightcone_io-0.3.0/scripts/FLAMINGO/sort_L2800N5040.sh +50 -0
  80. lightcone_io-0.3.0/scripts/FLAMINGO/sort_L2800N5040_bh_only.sh +49 -0
  81. lightcone_io-0.3.0/scripts/FLAMINGO/submit_L1000N1800_bh.sh +7 -0
  82. lightcone_io-0.3.0/scripts/FLAMINGO/untar_L2800N5040.sh +25 -0
  83. lightcone_io-0.3.0/scripts/virtual_env/make_cosma_env_python3.12.4.sh +37 -0
  84. lightcone_io-0.3.0/setup.cfg +4 -0
  85. lightcone_io-0.3.0/tests/FLAMINGO/hash_L1000N1800_indexed.sh +24 -0
  86. lightcone_io-0.3.0/tests/FLAMINGO/hash_L1000N1800_original.sh +24 -0
  87. lightcone_io-0.3.0/tests/hash_particles.py +145 -0
@@ -0,0 +1,56 @@
name: Upload Python Package

on:
  release:
    types: [published]

permissions:
  contents: read

jobs:
  release-build:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"

      - name: Build release distributions
        run: |
          python -m pip install build
          python -m build

      - name: Upload distributions
        uses: actions/upload-artifact@v4
        with:
          name: release-dists
          path: dist/

  pypi-publish:
    runs-on: ubuntu-latest

    needs:
      - release-build

    permissions:
      # IMPORTANT: this permission is mandatory for trusted publishing
      id-token: write

    # Dedicated environments with protections for publishing are strongly recommended.
    environment:
      name: pypi
      # OPTIONAL: uncomment and update to include your PyPI project URL in the deployment status:
      url: https://pypi.org/project/lightcone_io

    steps:
      - name: Retrieve release distributions
        uses: actions/download-artifact@v4
        with:
          name: release-dists
          path: dist/

      - name: Publish release distributions to PyPI
        uses: pypa/gh-action-pypi-publish@release/v1
@@ -0,0 +1,7 @@
build
dist
*.egg-info
*~
*.pyc
logs
docs/build
@@ -0,0 +1,14 @@
Metadata-Version: 2.4
Name: lightcone_io
Version: 0.3.0
Summary: Module for reading SWIFT lightcone output.
Author-email: John Helly <j.c.helly@durham.ac.uk>
Requires-Python: >=3.10
Requires-Dist: h5py
Requires-Dist: numpy
Requires-Dist: mpi4py
Requires-Dist: unyt
Requires-Dist: healpy
Requires-Dist: virgodc>=1.0.1
Requires-Dist: scipy
Requires-Dist: tqdm
@@ -0,0 +1,365 @@
# Lightcone I/O for SWIFT

This is a python module for reading lightcone output from SWIFT simulations.

## Installation

### Installing in a virtual env on Cosma

The bash script in `lightcone_io/scripts/virtual_env/make_cosma_env_python3.12.4.sh`
will create a new virtual environment in `/cosma/apps/dp004/${USER}/lightcone_env`
and install the necessary dependencies. If you intend to make any changes to the
lightcone_io module you can then install it in editable mode by activating the venv
and running
```
pip install -e .
```
in the source directory. You should then be able to `import lightcone_io` in
python.

### Installing elsewhere

lightcone_io requires mpi4py and h5py with MPI support, which means that h5py
must be built from source and linked to a libhdf5 which uses the same MPI
installation as mpi4py.

To install mpi4py, ensure that the mpicc from the MPI installation you want
to use is in your $PATH and run `pip install mpi4py`.

To install h5py, assuming that we're using the bash shell:
```
export CC="`which mpicc`"
export HDF5_MPI="ON"
export HDF5_DIR=<path to mpi enabled hdf5 installation>
pip install setuptools cython numpy pkgconfig
pip install --no-binary h5py --no-build-isolation h5py
```
The HDF5_DIR path should contain HDF5's lib and include directories. You can
then run
```
pip install -e .
```
in the lightcone_io source directory to install the module.

## Replacement of scripts with runnable modules

Earlier versions of lightcone_io installed executable scripts such as
lightcone_io_index_particles.py to the python environment's bin directory and
relied on $PATH being set correctly to find the scripts.

These scripts have been replaced with modules which can be run with commands
such as
```
mpirun -np 8 python3 -m mpi4py -m lightcone_io.index_particles <parameters>
```

## Reading lightcone HEALPix maps

When running with lightcones enabled, SWIFT can output HEALPix maps of various
quantities in concentric, spherical shells around the observer. The class
lightcone_io.healpix_maps.ShellArray is used to read these maps. The maps may
be in a single file per shell or split over many files.
```
import lightcone_io.healpix_maps as hm

# Location of the lightcone output
basedir="./lightcones/"

# Which lightcone to read
basename="lightcone0"

# Open a set of HEALPix maps
shell = hm.ShellArray(basedir, basename)
```
The ShellArray object is a sequence of Shells. Individual shells are accessed
by indexing with the shell number:
```
# Print the number of shells
print(len(shell))

# Print the inner and outer radii of a single shell
print(shell[0].comoving_inner_radius, shell[0].comoving_outer_radius)
```
Each shell contains HEALPix maps of one or more quantities. These maps
are accessed with dictionary style indexing:
```
# Print names of the available maps for the first shell
print(list(shell[0]))

# Find the TotalMass map for the first shell
total_mass_map = shell[0]["TotalMass"]

# Return some information about this map
print(total_mass_map.nside) # HEALPix nside parameter
print(total_mass_map.dtype) # Data type of the pixel data
print(total_mass_map.units) # Units of the pixel data
print(len(total_mass_map))  # Total number of pixels
```
The pixel data can be read in by indexing the map. Simple [start:end] slices
can be used to read subsets of the pixels, or an Ellipsis (...) can be used to
read all of the pixels:
```
# Read all of the pixels from the TotalMass map of the first shell
pixel_data = shell[0]["TotalMass"][...]

# Or read just the first 100 pixels
pixel_data_partial = shell[0]["TotalMass"][0:100]
```
If the unyt module is available then the results are returned as a unyt array
with unit information derived from the HDF5 attributes in the output files.

If the simulation hasn't completed yet it won't be possible to initialize a
ShellArray because some of the files are missing. In that case you can open
individual shells. E.g.:
```
shell_nr = 10
shell = hm.Shell(basedir, basename, shell_nr)
total_mass_map = shell["TotalMass"][...]
```

## Reading indexed lightcone particle outputs

Lightcone particle outputs can be post-processed to allow faster access to
specified areas of the sky and redshift ranges. These post-processed outputs
can be read with the class lightcone_io.particle_reader.IndexedLightcone:
```
import lightcone_io.particle_reader as pr

# Specify the name of one of the lightcone particle files
filename = "./lightcones/lightcone0_particles/lightcone0_0000.0.hdf5"

# Open the lightcone particle output
lightcone = pr.IndexedLightcone(filename)
```
The lightcone object acts like a dictionary where the particle types are
the keys. E.g. to see which types are available:
```
print(list(lightcone))
```
You can use the properties attribute to see what quantities are available for
each particle type:
```
print(lightcone["Gas"].properties)
```
Each entry in properties is a zero element unyt array with the dtype, units
and shape of the quantity in the file and an `attrs` attribute which contains
a copy of the HDF5 attributes of the dataset.

Particles can be read in as follows:
```
import numpy as np

# Quantities to read in
property_names = ("Coordinates", "ParticleIDs")

# Position and angular radius (in radians) on the sky to read in
vector = (1., 0., 0.)
radius = np.radians(10.)

# Redshift range to read in
redshift_range = (0., 1.0)

# Read dark matter particles
data = lightcone["DM"].read(property_names, vector, radius, redshift_range)
```
The return value is a dictionary containing the quantities read in (in this
case Coordinates and ParticleIDs). If redshift_range=None then all redshifts
are read in. If vector=None and radius=None then the whole sky is read in.

Note that this may return particles outside the specified region because the
indexed lightcone is stored in chunks and all chunks overlapping the region
are returned. There is also a read_exact() method which filters out the extra
particles. E.g.:

```
data = lightcone["DM"].read_exact(property_names, vector, radius, redshift_range)
```
This is likely to be slower because it's necessary to read in the coordinates for
spatial selection and the expansion factors for redshift selection even if
these quantities are not being returned. Computing the angles and redshifts
adds some CPU overhead too.
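
The kind of filtering that read_exact() performs can be sketched with plain
numpy. This is an illustrative stand-alone version under assumed inputs
(positions relative to the observer and per-particle redshifts), not the
module's actual implementation:
```
import numpy as np

def filter_particles(pos, redshift, vector, radius, redshift_range):
    """Keep particles within `radius` (radians) of `vector` and inside
    `redshift_range`. A simplified stand-in for read_exact()'s filtering."""
    vector = np.asarray(vector, dtype=float)
    vector /= np.linalg.norm(vector)
    # Angle between each particle's position vector and the selected direction
    r = np.linalg.norm(pos, axis=1)
    cos_angle = (pos @ vector) / r
    in_cone = np.arccos(np.clip(cos_angle, -1.0, 1.0)) <= radius
    in_z = (redshift >= redshift_range[0]) & (redshift <= redshift_range[1])
    return in_cone & in_z

# Toy data: three particles
pos = np.array([[100.0, 0.0, 0.0],    # on-axis
                [0.0, 100.0, 0.0],    # 90 degrees off-axis
                [100.0, 1.0, 0.0]])   # ~0.6 degrees off-axis, but z too high
z = np.array([0.5, 0.5, 2.0])
keep = filter_particles(pos, z, (1.0, 0.0, 0.0), np.radians(10.0), (0.0, 1.0))
print(keep)  # only the first particle passes both cuts
```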

There is also a way to iterate through the selected particles without reading
them all into memory:
```
for data in lightcone["DM"].iterate_chunks(property_names, vector, radius,
                                           redshift_range):
    pos = data["Coordinates"]
    ids = data["ParticleIDs"]
    # then do something with this subset of the selected particles...
```
Each iteration of this loop will receive a chunk of particles in the dict data.
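
This chunked pattern keeps memory use bounded. As a self-contained sketch,
with a hypothetical fake_iterate_chunks() generator standing in for
iterate_chunks() (which needs real lightcone files), a quantity can be
accumulated chunk by chunk:
```
import numpy as np

def fake_iterate_chunks(nr_chunks, chunk_size):
    """Stand-in for iterate_chunks(): yields dicts of per-particle arrays."""
    rng = np.random.default_rng(42)
    for _ in range(nr_chunks):
        yield {
            "Coordinates": rng.uniform(-100.0, 100.0, size=(chunk_size, 3)),
            "Masses": np.full(chunk_size, 1.5),
        }

# Accumulate total mass chunk by chunk, never holding all particles at once
total_mass = 0.0
nr_particles = 0
for data in fake_iterate_chunks(nr_chunks=10, chunk_size=1000):
    total_mass += data["Masses"].sum()
    nr_particles += len(data["Masses"])

print(nr_particles, total_mass)  # 10000 15000.0
```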

## Combining HEALPix maps

The code above can read SWIFT HEALPix output regardless of how many files it
is split over. However, it may be desirable to reduce the number of files if
they're stored on a file system optimized for small numbers of large files.

The module lightcone_io.combine_maps can be used to combine the maps for each
shell into a single HDF5 file. This code is parallelized using mpi4py and can
be run as follows:

```
input_dir=./lightcones/
output_dir=./indexed_lightcones/

mpirun python3 -m mpi4py -m lightcone_io.combine_maps \
    ${input_dir} ${output_dir} lightcone0 lightcone1 ...
```

This will process all shells for the specified lightcones.

There is an example SLURM batch script to run on the FLAMINGO simulations on
COSMA-8 in scripts/FLAMINGO/combine_L1000N1800.sh.

## Indexing particle outputs

SWIFT lightcone particle outputs are spread over many files and not sorted
in any useful order. The module lightcone_io.index_particles can be
used to sort the particles and generate an index which can be used to
quickly retrieve particles by redshift and position on the sky.

The sky is divided into pixels using a low resolution HEALPix map and
each pixel is split into redshift bins. This defines a set of cells of
varying volume. The redshift bins are chosen such that the number of particles
per cell is roughly constant. The particles are then stored in order of which
cell they belong to and the location of each cell in the output files is
stored. This information is used by the lightcone_io.particle_reader module
to extract requested particles.
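
The cell ordering described above can be sketched with numpy. This is an
illustrative reconstruction, not the module's actual code: the toy pixel and
redshift-bin arrays below are made up, and in the real code healpy supplies
the pixel index for each particle:
```
import numpy as np

# Toy inputs: a HEALPix pixel and a redshift bin for each particle
pixel = np.array([3, 0, 3, 1, 0])
z_bin = np.array([1, 0, 0, 2, 1])
nr_redshift_bins = 4

# One cell per (pixel, redshift bin) pair; the --redshift-first option
# presumably changes which of the two indices varies fastest.
cell = pixel * nr_redshift_bins + z_bin

# Sorting by cell index groups particles belonging to the same cell together,
# so each cell occupies a contiguous range of the output files.
order = np.argsort(cell)
print(cell[order])  # cell indices in increasing order: 0, 1, 6, 12, 13
```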

The code is parallelized with mpi4py and can be run as follows:
```
# Location of the input lightcones
basedir="./lightcones/"

# Name of the lightcone to process
basename="lightcone0"

# Directory to write the indexed output to
outdir="./indexed_lightcones/"

# Number of redshift bins to use
nr_redshift_bins=4

# HEALPix map resolution to use
nside=32

# HEALPix pixel ordering scheme
order="nest"

mpirun python3 -m mpi4py -m lightcone_io.index_particles \
    ${basedir} ${basename} ${nr_redshift_bins} ${nside} \
    ${outdir} --order ${order} --redshift-first
```
There is an example SLURM batch script to run on the FLAMINGO simulations on
COSMA-8 in scripts/FLAMINGO/sort_L1000N1800.sh.

## Computing halo membership in particle lightcones

The module lightcone_io.particle_halo_ids can compute halo membership
for particles in the particle lightcone outputs. It works as follows:

* The full halo lightcone is read in
* For each halo in the halo lightcone we look up a mass and radius from SOAP
  (so SOAP must have been run on all snapshots)
* The lightcone particles are read in
* Particles within the radius of each halo in the halo lightcone are flagged
  as belonging to that halo
* For each particle in the lightcone we write out the associated halo ID and mass

The mass and radius to use are specified by the name of the SOAP group
which they should be read from (e.g. `--soap-so-name="SO/200_crit"`)
so it's possible to run the code using various halo radius
definitions.

Where the radii of several halos overlap there are three different
ways we can decide which halo to assign the particle to. These are
specified using the command line flag `--overlap-method`. Possible
values are

* `fractional-radius`: for each particle we compute the distance to the halo centre in units of the halo radius. Particles are assigned to the halo for which this value is lowest.
* `most-massive`: particles within the radius of multiple halos are assigned to the most massive halo
* `least-massive`: particles within the radius of multiple halos are assigned to the least massive halo
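
For illustration, the fractional-radius method might look like the brute-force
numpy sketch below (toy values throughout; the real code is MPI-parallel and
handles far larger inputs):
```
import numpy as np

# Toy halo lightcone: centres, radii and masses (arbitrary units)
halo_pos = np.array([[0.0, 0.0, 0.0],
                     [3.0, 0.0, 0.0]])
halo_radius = np.array([2.5, 2.0])

# One particle sitting inside BOTH halo radii
part_pos = np.array([[2.0, 0.0, 0.0]])

# Distance from each particle to each halo centre, in units of the halo radius
dist = np.linalg.norm(part_pos[:, None, :] - halo_pos[None, :, :], axis=2)
frac = dist / halo_radius[None, :]

# fractional-radius: assign each particle to the halo with the smallest r/R,
# but only if it lies within at least one halo's radius (-1 otherwise)
inside = frac <= 1.0
best = np.where(inside.any(axis=1), np.argmin(frac, axis=1), -1)
print(best)  # halo 1 wins: r/R = 1.0/2.0 there vs 2.0/2.5 for halo 0
```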

This is also parallelized using mpi4py. To run it:
```
# Location of the lightcone particle data
lightcone_dir="/cosma8/data/dp004/flamingo/Runs/L1000N1800/HYDRO_FIDUCIAL/particle_lightcones/"
lightcone_base="lightcone0"

# Format string to generate halo lightcone filenames
halo_lightcone_filenames="/snap8/scratch/dp004/jch/FLAMINGO/ScienceRuns/L1000N1800/HYDRO_FIDUCIAL/lightcone_halos/${lightcone_base}/lightcone_halos_%(file_nr)04d.hdf5"

# Format string to generate SOAP catalogue filenames
soap_filenames="/cosma8/data/dp004/flamingo/Runs/L1000N1800/HYDRO_FIDUCIAL/SOAP/halo_properties_%(snap_nr)04d.hdf5"

# Directory to write the output to
output_dir="/snap8/scratch/dp004/jch/FLAMINGO/ScienceRuns/${sim}/lightcone_particle_halo_ids/lightcone${lightcone_nr}/"

mpirun python3 -m mpi4py -m lightcone_io.particle_halo_ids \
    "${lightcone_dir}" \
    "${lightcone_base}" \
    "${halo_lightcone_filenames}" \
    "${soap_filenames}" \
    "${output_dir}" \
    --soap-so-name="SO/200_crit" \
    --overlap-method=fractional_radius
```

There is a batch script to run this code on FLAMINGO on COSMA-8 in
./scripts/FLAMINGO/old/halo_ids_L1000N1800.sh.

## Example Scripts

### Plotting a HEALPix map

The script `examples/plot_healpix_map.py` shows how to read in a full HEALPix map
and plot it using the healpy mollview function.

### Plotting a pencil beam from the particle data

The script `examples/plot_pencil_beam.py` reads in all particles in a 2 degree
radius about a vector along the x axis and makes a log scaled plot of projected
mass.

### Making a new HEALPix map from lightcone particle data

The script `examples/make_map_from_particles.py` shows how to make a new
HEALPix map by projecting particles from a lightcone particle output onto
the sky.

The script uses the iterate_chunks() method from the IndexedLightcone class
to read in chunks of particles in the required redshift range. It calculates
which HEALPix pixel each particle maps onto using healpy and adds the particle's
mass to that pixel. The resulting map is written to a new HDF5 file.
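
The core binning step can be sketched without real lightcone data. In the
script, healpy.ang2pix() supplies the pixel index for each particle; here
hypothetical pixel indices are used directly so the sketch stays
self-contained:
```
import numpy as np

nr_pixels = 12  # for a real map this would be healpy.nside2npix(nside)

# Hypothetical per-particle pixel indices (healpy.ang2pix output in the script)
pix = np.array([0, 3, 3, 7, 0, 3])
mass = np.array([1.0, 2.0, 0.5, 4.0, 1.0, 1.5])

# Accumulate particle masses onto pixels; np.add.at handles repeated
# indices correctly where map_data[pix] += mass would not.
map_data = np.zeros(nr_pixels)
np.add.at(map_data, pix, mass)

print(map_data[0], map_data[3], map_data[7])  # 2.0 4.0 4.0
```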

### Combining, correcting and downsampling FLAMINGO L1000N1800 maps on Cosma

To postprocess maps from a 1Gpc FLAMINGO run on Cosma:

* Download and install this module
```
git clone git@github.com:jchelly/LightconeIO.git
module load python/3.10.1
cd LightconeIO
pip install --user .
```
* Modify the output location (variable `output_dir`) in these scripts to point at a writable location:
```
scripts/FLAMINGO/combine_L1000N1800.sh
scripts/FLAMINGO/correct_L1000N1800.sh
scripts/FLAMINGO/downsample_L1000N1800.sh
```
* Create a directory for the log files (the scripts will silently fail if this doesn't exist):
```
cd scripts/FLAMINGO
mkdir -p logs/L1000N1800
```
* Submit the script to combine the maps into one file per shell. The job name (sbatch -J) specifies which simulation to process. The job array index (sbatch --array=...) is which lightcone(s) to do. E.g. for L1000N1800/HYDRO_FIDUCIAL:
```
sbatch -J HYDRO_FIDUCIAL --array=0-1 ./combine_L1000N1800.sh
```
* Once that completes, the combined maps can be corrected:
```
sbatch -J HYDRO_FIDUCIAL --array=0-1 ./correct_L1000N1800.sh
```
* And then the corrected maps can be downsampled:
```
sbatch -J HYDRO_FIDUCIAL --array=0-1 ./downsample_L1000N1800.sh
```
@@ -0,0 +1,9 @@
lightcone_corrections.pdf: lightcone_corrections.tex references.bib
	pdflatex lightcone_corrections.tex
	#bibtex lightcone_corrections
	#pdflatex lightcone_corrections.tex
	#pdflatex lightcone_corrections.tex

clean:
	\rm -f *.log *.out *.pdf *.blg *.aux lightcone_corrections.bbl
@@ -0,0 +1,107 @@
%
% These Macros are taken from the AAS TeX macro package version 5.2
% and are compatible with the macros in the A&A document class
% version 7.0
% Include this file in your LaTeX source only if you are not using
% the AAS TeX macro package or the A&A document class and need to
% resolve the macro definitions in the TeX/BibTeX entries returned by
% the ADS abstract service.
%
% If you plan not to use this file to resolve the journal macros
% rather than the whole AAS TeX macro package, you should save the
% file as ``aas_macros.sty'' and then include it in your LaTeX paper
% by using a construct such as:
% \documentstyle[11pt,aas_macros]{article}
%
% For more information on the AASTeX and A&A packages, please see:
% http://journals.aas.org/authors/aastex.html
% ftp://ftp.edpsciences.org/pub/aa/readme.html
% For more information about ADS abstract server, please see:
% http://adsabs.harvard.edu/ads_abstracts.html
%

% Abbreviations for journals. The object here is to provide authors
% with convenient shorthands for the most "popular" (often-cited)
% journals; the author can use these markup tags without being concerned
% about the exact form of the journal abbreviation, or its formatting.
% It is up to the keeper of the macros to make sure the macros expand
% to the proper text. If macro package writers agree to all use the
% same TeX command name, authors only have to remember one thing, and
% the style file will take care of editorial preferences. This also
% applies when a single journal decides to revamp its abbreviating
% scheme, as happened with the ApJ (Abt 1991).

\let\jnl@style=\rm
\def\ref@jnl#1{{\jnl@style#1}}

\def\aj{\ref@jnl{AJ}}                   % Astronomical Journal
\def\actaa{\ref@jnl{Acta Astron.}}      % Acta Astronomica
\def\araa{\ref@jnl{ARA\&A}}             % Annual Review of Astron and Astrophys
\def\apj{\ref@jnl{ApJ}}                 % Astrophysical Journal
\def\apjl{\ref@jnl{ApJ}}                % Astrophysical Journal, Letters
\def\apjs{\ref@jnl{ApJS}}               % Astrophysical Journal, Supplement
\def\ao{\ref@jnl{Appl.~Opt.}}           % Applied Optics
\def\apss{\ref@jnl{Ap\&SS}}             % Astrophysics and Space Science
\def\aap{\ref@jnl{A\&A}}                % Astronomy and Astrophysics
\def\aapr{\ref@jnl{A\&A~Rev.}}          % Astronomy and Astrophysics Reviews
\def\aaps{\ref@jnl{A\&AS}}              % Astronomy and Astrophysics, Supplement
\def\azh{\ref@jnl{AZh}}                 % Astronomicheskii Zhurnal
\def\baas{\ref@jnl{BAAS}}               % Bulletin of the AAS
\def\bac{\ref@jnl{Bull. astr. Inst. Czechosl.}}
                % Bulletin of the Astronomical Institutes of Czechoslovakia
\def\caa{\ref@jnl{Chinese Astron. Astrophys.}}
                % Chinese Astronomy and Astrophysics
\def\cjaa{\ref@jnl{Chinese J. Astron. Astrophys.}}
                % Chinese Journal of Astronomy and Astrophysics
\def\icarus{\ref@jnl{Icarus}}           % Icarus
\def\jcap{\ref@jnl{J. Cosmology Astropart. Phys.}}
                % Journal of Cosmology and Astroparticle Physics
\def\jrasc{\ref@jnl{JRASC}}             % Journal of the RAS of Canada
\def\memras{\ref@jnl{MmRAS}}            % Memoirs of the RAS
\def\mnras{\ref@jnl{MNRAS}}             % Monthly Notices of the RAS
\def\na{\ref@jnl{New A}}                % New Astronomy
\def\nar{\ref@jnl{New A Rev.}}          % New Astronomy Review
\def\pra{\ref@jnl{Phys.~Rev.~A}}        % Physical Review A: General Physics
\def\prb{\ref@jnl{Phys.~Rev.~B}}        % Physical Review B: Solid State
\def\prc{\ref@jnl{Phys.~Rev.~C}}        % Physical Review C
\def\prd{\ref@jnl{Phys.~Rev.~D}}        % Physical Review D
\def\pre{\ref@jnl{Phys.~Rev.~E}}        % Physical Review E
\def\prl{\ref@jnl{Phys.~Rev.~Lett.}}    % Physical Review Letters
\def\pasa{\ref@jnl{PASA}}               % Publications of the Astron. Soc. of Australia
\def\pasp{\ref@jnl{PASP}}               % Publications of the ASP
\def\pasj{\ref@jnl{PASJ}}               % Publications of the ASJ
\def\rmxaa{\ref@jnl{Rev. Mexicana Astron. Astrofis.}}%
                % Revista Mexicana de Astronomia y Astrofisica
\def\qjras{\ref@jnl{QJRAS}}             % Quarterly Journal of the RAS
\def\skytel{\ref@jnl{S\&T}}             % Sky and Telescope
\def\solphys{\ref@jnl{Sol.~Phys.}}      % Solar Physics
\def\sovast{\ref@jnl{Soviet~Ast.}}      % Soviet Astronomy
\def\ssr{\ref@jnl{Space~Sci.~Rev.}}     % Space Science Reviews
\def\zap{\ref@jnl{ZAp}}                 % Zeitschrift fuer Astrophysik
\def\nat{\ref@jnl{Nature}}              % Nature
\def\iaucirc{\ref@jnl{IAU~Circ.}}       % IAU Cirulars
\def\aplett{\ref@jnl{Astrophys.~Lett.}} % Astrophysics Letters
\def\apspr{\ref@jnl{Astrophys.~Space~Phys.~Res.}}
                % Astrophysics Space Physics Research
\def\bain{\ref@jnl{Bull.~Astron.~Inst.~Netherlands}}
                % Bulletin Astronomical Institute of the Netherlands
\def\fcp{\ref@jnl{Fund.~Cosmic~Phys.}}  % Fundamental Cosmic Physics
\def\gca{\ref@jnl{Geochim.~Cosmochim.~Acta}}   % Geochimica Cosmochimica Acta
\def\grl{\ref@jnl{Geophys.~Res.~Lett.}} % Geophysics Research Letters
\def\jcp{\ref@jnl{J.~Chem.~Phys.}}      % Journal of Chemical Physics
\def\jgr{\ref@jnl{J.~Geophys.~Res.}}    % Journal of Geophysics Research
\def\jqsrt{\ref@jnl{J.~Quant.~Spec.~Radiat.~Transf.}}
                % Journal of Quantitiative Spectroscopy and Radiative Transfer
\def\memsai{\ref@jnl{Mem.~Soc.~Astron.~Italiana}}
                % Mem. Societa Astronomica Italiana
\def\nphysa{\ref@jnl{Nucl.~Phys.~A}}    % Nuclear Physics A
\def\physrep{\ref@jnl{Phys.~Rep.}}      % Physics Reports
\def\physscr{\ref@jnl{Phys.~Scr}}       % Physica Scripta
\def\planss{\ref@jnl{Planet.~Space~Sci.}} % Planetary Space Science
\def\procspie{\ref@jnl{Proc.~SPIE}}     % Proceedings of the SPIE

\let\astap=\aap
\let\apjlett=\apjl
\let\apjsupp=\apjs
\let\applopt=\ao
@@ -0,0 +1,115 @@
\documentclass{article}

\usepackage{cite}
\usepackage{hyperref}
\usepackage{aas_macros}
\usepackage{graphicx}

\graphicspath{ {./images/} }

\title{FLAMINGO Lightcone Output Corrections}
\date{2022-12-08}
\author{John Helly}

\begin{document}
\maketitle

\section{FLAMINGO Lightcone Output Corrections}

This document describes the post processing applied to the lightcone
particle outputs and HEALPix maps generated by the FLAMINGO
simulations. This post processing serves two purposes:
\begin{itemize}
\item{To correct various bugs found in the simulation code after the
  simulations were run.}
\item{To put the output into a more convenient form by, for example,
  splitting it over fewer files and adding indexing to allow
  extraction of useful subsets of the data.}
\end{itemize}
The python code and batch scripts used to do this are stored in a git
repository at \url{https://github.com/jchelly/LightconeIO}. The
README.md file in the git repository explains how to run the scripts.

Slurm batch scripts for processing the FLAMINGO outputs on Cosma are
in the \verb|scripts/FLAMINGO| subdirectory in the git repository.

\subsection{Post processing of the particle output}

The script \verb|lightcone_io_index_particles.py| is used to combine the
lightcone particle outputs into a smaller number of files and sort the
particles by redshift and position on the sky so that sub-regions of
the lightcone can be extracted without reading all of the particles.

The output from this script is required for making lightcone halo
catalogues.

No corrections are made to the particle data by this script.

\subsection{Post processing of the HEALPix maps}

The HEALPix maps are processed in three steps: the maps are combined
into one file per redshift bin, various corrections are made, and then
downsampled versions of the maps are created.

\subsubsection{Combining the maps into single files}

The script \verb|lightcone_io_combine_maps.py| is used to combine the maps
for each shell into a single file. This script also sets the unit
information for the following HEALPix maps to an assumed correct
value, disregarding the units in the input files:

\begin{itemize}
\item{XrayErositaLowIntrinsicPhotons}
\item{XrayErositaHighIntrinsicPhotons}
\item{XrayROSATIntrinsicPhotons}
\item{XrayErositaLowIntrinsicEnergies}
\item{XrayErositaHighIntrinsicEnergies}
\item{XrayROSATIntrinsicEnergies}
\end{itemize}

It should be safe to run this script on output from a version of SWIFT
where this bug has been fixed as long as the actual units of the X-ray
maps have not changed. If the script changes the units of a map it
generates a warning on stdout.

The file \verb|lightcone_io/units.py| defines what the units of these maps
should be and any additional corrections can be added there.

\subsubsection{Applying corrections to the maps}

The script \verb|lightcone_io_correct_maps.py| applies corrections to the map
pixel data. The corrections are:
\begin{itemize}
\item{The DopplerB maps must be multiplied by $(1+z)$ at the shell
  midpoint}
\item{The DM (dispersion measure) maps must be multiplied by $1/(1+z)$
  at the shell midpoint}
\item{Three times the mean neutrino mass per pixel must be subtracted
  from the NeutrinoMass and TotalMass maps}
\end{itemize}
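
Writing $z_{\rm mid}$ for the redshift at the shell midpoint, the
first two corrections amount to
\[
B \rightarrow (1+z_{\rm mid})\,B, \qquad
\mathrm{DM} \rightarrow \mathrm{DM}/(1+z_{\rm mid}).
\]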

This script adds attributes to the maps which describe what
corrections have been made. If the input map already contains these
attributes then the corrections will not be made again.

Note that the script has no way to determine if these corrections have
already been made in SWIFT. If new runs are carried out with the
corresponding bugs fixed then this script should not be run.

\subsubsection{Downsampling the maps}

The script \verb|lightcone_io_downsample_maps.py| can be used to make
low resolution maps. In order to do this it is necessary to know for
each map whether pixels should be combined by taking the sum or the
average: the function \verb|get_power()| in
\verb|lightcone_io_downsample_maps.py| will need to be modified if new
map types are added.

This script does not apply any corrections, so it should be run on
high resolution maps which have already been corrected.

%\bibliography{references}{}
%\bibliographystyle{plain}

\end{document}