hebb 0.7__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
hebb-0.7/LICENSE.md ADDED
@@ -0,0 +1,11 @@
1
+ Copyright 2026 Andrea Negri
2
+
3
+ Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:
4
+
5
+ 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
6
+
7
+ 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
8
+
9
+ 3. Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.
10
+
11
+ THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS “AS IS” AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
hebb-0.7/PKG-INFO ADDED
@@ -0,0 +1,79 @@
1
+ Metadata-Version: 2.4
2
+ Name: hebb
3
+ Version: 0.7
4
+ Summary: Halo Extreme Block Bootstrap package to estimate the largest halo mass given the field of view and redshift depth of a survey
5
+ Author-email: Andrea Negri <anegri@us.es>
6
+ License-Expression: BSD-3-Clause
7
+ Project-URL: Homepage, https://github.com/NegriAndrea/hebb
8
+ Project-URL: Issues, https://github.com/NegriAndrea/hebb/issues
9
+ Classifier: Programming Language :: Python :: 3
10
+ Classifier: Operating System :: OS Independent
11
+ Requires-Python: >=3.10
12
+ Description-Content-Type: text/markdown
13
+ License-File: LICENSE.md
14
+ Requires-Dist: numpy
15
+ Requires-Dist: astropy
16
+ Requires-Dist: scipy
17
+ Requires-Dist: h5py>=3.2.0
19
+ Requires-Dist: matplotlib
20
+ Requires-Dist: mpi4py
21
+ Dynamic: license-file
22
+
23
+ # hebb
24
+ Halo Extreme Block Bootstrap package to estimate the distribution of the heaviest halo mass you can find in a survey at a given redshift.
25
+
26
+ `hebb` uses as base data the Uchuu simulation halo catalogue (https://skiesanduniverses.org/Simulations/Uchuu) to perform a block bootstrap by shooting N boxes of volume equal to the estimated volume of a survey, and recovering the largest halo formed at a particular z, with an uncertainty estimate. The volume of the survey is computed from the survey's field of view and redshift depth, or selected manually. Optionally, the code can dump the list of haloes found in the search, which can be used to perform a trace back in time with the Uchuu merger tree.
27
+
28
+ Please cite the following papers if you use `hebb` in your work: Negri & Belli (2026), [Ishiyama et al. (2021)](https://ui.adsabs.harvard.edu/abs/2021MNRAS.506.4210I) (Uchuu Data Release 1).
29
+
30
+ ## Installation
31
+ There are two ways to install `hebb`: via `PyPI`
32
+ ```
33
+ python3 -m pip install hebb
34
+ ```
35
+ Or by cloning the repository `https://github.com/NegriAndrea/hebb/` and installing locally
36
+ ```
37
+ git clone https://github.com/NegriAndrea/hebb
38
+ cd hebb
39
+ python3 -m pip install -e .
40
+ ```
41
+
42
+ ### Database Setup
43
+ In addition to installing the Python package, you must download a reduced version of the Uchuu database. Two versions are available, with different mass cuts: a light [one](https://uses0-my.sharepoint.com/:u:/g/personal/anegri_us_es/IQCDx5T1XlTnSbM2YmYcFoYMATIk_pI1XKPzp4ycUJ6N12M?e=pDdoDg) (2.3 GB) and a more complete one (~40 GB) from [here](https://uses0-my.sharepoint.com/:u:/g/personal/anegri_us_es/IQBrGP-3e0xHR6pV0YkfCb1hAYS1KLSNnjmDvVb3H6ytsUA?e=fcudjY). To set the database path, define the following environment variable in your `~/.bash_profile`
44
+ ```
45
+ # bash
46
+ export HEBB_DB_PATH=/path/to/database
47
+ ```
48
+ where you have to change `/path/to/database` to the path of the downloaded database. By default, the code looks first for the most complete catalogue; if it is not found, it falls back to the light one.
49
+
50
+ NOTE: In order to keep the file size manageable, the database contains only log10(M200), positions and merger tree IDs; the halo positions have been binned on a 40 ckpc grid and stored as `uint16` integers, which is precise enough for volumes that are usually far larger than 1 cMpc.
51
+
52
+ ### Test installation
53
+ The easiest way to test if everything is correctly installed and set up is to run the command
54
+ ```
55
+ hebb 100 0. -L 1000 -M 1e14
56
+ ```
57
+
58
+ ## Usage
59
+ The simplest way to use `hebb` is via the command line; `hebb -h` prints the user manual:
60
+
61
+
62
+
63
+
64
+
65
+ @ARTICLE{2021MNRAS.506.4210I,
66
+ author = {{Ishiyama}, Tomoaki and {Prada}, Francisco and {Klypin}, Anatoly A. and {Sinha}, Manodeep and {Metcalf}, R. Benton and {Jullo}, Eric and {Altieri}, Bruno and {Cora}, Sof{\'\i}a A. and {Croton}, Darren and {de la Torre}, Sylvain and {Mill{\'a}n-Calero}, David E. and {Oogi}, Taira and {Ruedas}, Jos{\'e} and {Vega-Mart{\'\i}nez}, Cristian A.},
67
+ title = "{The Uchuu simulations: Data Release 1 and dark matter halo concentrations}",
68
+ journal = {\mnras},
69
+ year = 2021,
70
+ month = sep,
71
+ volume = {506},
72
+ number = {3},
73
+ pages = {4210-4231},
74
+ doi = {10.1093/mnras/stab1755},
75
+ archivePrefix = {arXiv},
76
+ eprint = {2007.14720},
77
+ primaryClass = {astro-ph.CO},
78
+ adsurl = {https://ui.adsabs.harvard.edu/abs/2021MNRAS.506.4210I},
79
+ }
hebb-0.7/README.md ADDED
@@ -0,0 +1,57 @@
1
+ # hebb
2
+ Halo Extreme Block Bootstrap package to estimate the distribution of the heaviest halo mass you can find in a survey at a given redshift.
3
+
4
+ `hebb` uses as base data the Uchuu simulation halo catalogue (https://skiesanduniverses.org/Simulations/Uchuu) to perform a block bootstrap by shooting N boxes of volume equal to the estimated volume of a survey, and recovering the largest halo formed at a particular z, with an uncertainty estimate. The volume of the survey is computed from the survey's field of view and redshift depth, or selected manually. Optionally, the code can dump the list of haloes found in the search, which can be used to perform a trace back in time with the Uchuu merger tree.
5
+
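The block-bootstrap idea above can be sketched in a few lines of NumPy. This is a toy illustration on fake halo data, not the package's actual implementation (which uses a KDTree over the Uchuu catalogue):

```python
import numpy as np

rng = np.random.default_rng(0)
BoxSize = 2000.0                            # parent box side (toy units)
halos = rng.random((100_000, 3)) * BoxSize  # fake halo positions
mass = rng.pareto(2.0, 100_000) + 1.0       # fake halo masses

Nboxes, L = 500, 200.0                      # bootstrap boxes, survey box side
centers = rng.random((Nboxes, 3)) * BoxSize
Mmax = np.empty(Nboxes)
for j, c in enumerate(centers):
    d = np.abs(halos - c)
    d = np.minimum(d, BoxSize - d)          # periodic minimum-image separation
    inside = np.all(d < L / 2, axis=1)      # halos in a cube of side L around c
    Mmax[j] = mass[inside].max()

# Mmax samples the distribution of the largest halo mass in a random survey volume
print(np.quantile(Mmax, [0.16, 0.5, 0.84]))
```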
6
+ Please cite the following papers if you use `hebb` in your work: Negri & Belli (2026), [Ishiyama et al. (2021)](https://ui.adsabs.harvard.edu/abs/2021MNRAS.506.4210I) (Uchuu Data Release 1).
7
+
8
+ ## Installation
9
+ There are two ways to install `hebb`: via `PyPI`
10
+ ```
11
+ python3 -m pip install hebb
12
+ ```
13
+ Or by cloning the repository `https://github.com/NegriAndrea/hebb/` and installing locally
14
+ ```
15
+ git clone https://github.com/NegriAndrea/hebb
16
+ cd hebb
17
+ python3 -m pip install -e .
18
+ ```
19
+
20
+ ### Database Setup
21
+ In addition to installing the Python package, you must download a reduced version of the Uchuu database. Two versions are available, with different mass cuts: a light [one](https://uses0-my.sharepoint.com/:u:/g/personal/anegri_us_es/IQCDx5T1XlTnSbM2YmYcFoYMATIk_pI1XKPzp4ycUJ6N12M?e=pDdoDg) (2.3 GB) and a more complete one (~40 GB) from [here](https://uses0-my.sharepoint.com/:u:/g/personal/anegri_us_es/IQBrGP-3e0xHR6pV0YkfCb1hAYS1KLSNnjmDvVb3H6ytsUA?e=fcudjY). To set the database path, define the following environment variable in your `~/.bash_profile`
22
+ ```
23
+ # bash
24
+ export HEBB_DB_PATH=/path/to/database
25
+ ```
26
+ where you have to change `/path/to/database` to the path of the downloaded database. By default, the code looks first for the most complete catalogue; if it is not found, it falls back to the light one.
27
+
28
+ NOTE: In order to keep the file size manageable, the database contains only log10(M200), positions and merger tree IDs; the halo positions have been binned on a 40 ckpc grid and stored as `uint16` integers, which is precise enough for volumes that are usually far larger than 1 cMpc.
29
+
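Given that encoding, a stored position decodes back to comoving coordinates with a single scaling. A minimal sketch, with the box side (2 cGpc/h, h = 0.6774) and the `uint16` range taken from the package source:

```python
import numpy as np

BoxSize = 2000 / 0.6774      # Uchuu box side in cMpc (2 cGpc/h, h = 0.6774)
bin_size = BoxSize / 65_535  # cMpc per uint16 grid cell

raw = np.array([[0, 32768, 65535]], dtype=np.uint16)  # stored positions
coords = raw.astype(np.float32) * bin_size            # decoded, in cMpc
print(coords)   # the last component recovers the full box side
```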
30
+ ### Test installation
31
+ The easiest way to test if everything is correctly installed and set up is to run the command
32
+ ```
33
+ hebb 100 0. -L 1000 -M 1e14
34
+ ```
35
+
36
+ ## Usage
37
+ The simplest way to use `hebb` is via the command line; `hebb -h` prints the user manual:
38
+
39
+
40
+
41
+
42
+
43
+ @ARTICLE{2021MNRAS.506.4210I,
44
+ author = {{Ishiyama}, Tomoaki and {Prada}, Francisco and {Klypin}, Anatoly A. and {Sinha}, Manodeep and {Metcalf}, R. Benton and {Jullo}, Eric and {Altieri}, Bruno and {Cora}, Sof{\'\i}a A. and {Croton}, Darren and {de la Torre}, Sylvain and {Mill{\'a}n-Calero}, David E. and {Oogi}, Taira and {Ruedas}, Jos{\'e} and {Vega-Mart{\'\i}nez}, Cristian A.},
45
+ title = "{The Uchuu simulations: Data Release 1 and dark matter halo concentrations}",
46
+ journal = {\mnras},
47
+ year = 2021,
48
+ month = sep,
49
+ volume = {506},
50
+ number = {3},
51
+ pages = {4210-4231},
52
+ doi = {10.1093/mnras/stab1755},
53
+ archivePrefix = {arXiv},
54
+ eprint = {2007.14720},
55
+ primaryClass = {astro-ph.CO},
56
+ adsurl = {https://ui.adsabs.harvard.edu/abs/2021MNRAS.506.4210I},
57
+ }
@@ -0,0 +1,52 @@
1
+ [build-system]
2
+ requires = ["setuptools >= 77.0.3", "wheel"]
3
+ build-backend = "setuptools.build_meta"
4
+ #[tool.setuptools-git-versioning]
5
+ #enabled = true
6
+
7
+ [project]
8
+ name = "hebb"
9
+ authors = [
10
+ { name = "Andrea Negri", email = "anegri@us.es" }
11
+ ]
12
+ readme = "README.md"
13
+ description = "Halo Extreme Block Bootstrap package to estimate the largest halo mass given the field of view and redshift depth of a survey"
14
+ classifiers = [
15
+ "Programming Language :: Python :: 3",
16
+ "Operating System :: OS Independent",
17
+ ]
18
+ license = "BSD-3-Clause"
19
+ license-files = ["LICEN[CS]E*"]
20
+ version = "0.7"
21
+
22
+
23
+ requires-python = ">=3.10"
24
+ dependencies = [
25
+ "numpy",
26
+ "astropy",
27
+ "scipy",
28
+ "h5py>=3.2.0",
30
+ "matplotlib",
31
+ "mpi4py",
32
+ ]
33
+
34
+ #dynamic = ["version"]
35
+
36
+ [project.optional-dependencies]
37
+
38
+ [project.urls]
39
+ Homepage = "https://github.com/NegriAndrea/hebb"
40
+ Issues = "https://github.com/NegriAndrea/hebb/issues"
41
+
42
+ [project.scripts]
43
+ hebb = "hebb.hebb_cli:hebb_CLI"
44
+ hebb-tr = "hebb.hebb_cli:hebb_trace_CLI"
45
+ hebb-estimate = "hebb.hebb_cli:hebb_estimate_CLI"
46
+
47
+
48
+ [tool.setuptools]
49
+ package-dir = {"" = "src"}
50
+
51
+ [tool.setuptools.packages.find]
52
+ where = ["src"]
hebb-0.7/setup.cfg ADDED
@@ -0,0 +1,4 @@
1
+ [egg_info]
2
+ tag_build =
3
+ tag_date = 0
4
+
@@ -0,0 +1,2 @@
1
+ #!/usr/bin/env python3
2
+ # -*- coding: utf-8 -*-
@@ -0,0 +1,164 @@
1
+ #!/usr/bin/env python3
2
+ # -*- coding: utf-8 -*-
3
+ from .hebb_core import hebb, hebb_estimate
4
+ from .hebb_trace import hebb_trace
5
+
6
+ def hebb_CLI():
7
+ import numpy as np
8
+ import argparse
9
+ import os
10
+
11
+ try:
12
+ def_path = os.environ['HEBB_DB_PATH']
13
+ except KeyError:
14
+ raise ValueError('The environment variable HEBB_DB_PATH must be set')
15
+
16
+ description = (
17
+ 'Compute the heaviest dark matter halo that you can find in a given '
18
+ 'survey with [z_min, z_max] and field-of-view by performing a '
19
+ 'non-parametric block bootstrap over the Uchuu (2/h cGpc)^3 run.'
20
+ )
21
+ parser = argparse.ArgumentParser(description=description)
22
+ parser.add_argument('Nboxes', type=int, help='Number of boxes for bootstrap')
23
+ parser.add_argument('z_target', type=float, help='Redshift of your target')
24
+
25
+ group = parser.add_argument_group( "Processing mode (required, mutually exclusive)")
26
+ group_container = group.add_mutually_exclusive_group(required=True)
27
+ group_container.add_argument('--survey', nargs=3, type=float,
28
+ help='Survey min z, max z, fov in arcmin^2'
29
+ ' (used to compute box volume)',
30
+ metavar=("z_min", "z_max", "fov"))
31
+ group_container.add_argument('-L', type=float, help='Size of the box in cMpc,'
32
+ ' alternative to the boxsize computation from the FOV'
33
+ ' and z-depth of the survey')
34
+
35
+ parser.add_argument('-n', type=int, help='Track the N most massive haloes'
36
+ ' in each box (1 tracks only the most massive) [default: %(default)d]', default=1)
37
+ parser.add_argument('-v', action='count', default=0,
38
+ help='Verbosity level [default: %(default)d]')
39
+ parser.add_argument('-t', action='store_true', help='Create a table with'
40
+ ' the sampled haloes')
41
+ parser.add_argument('--plot', action='store_true', help='Show a plot of the M200 distribution')
42
+
43
+ parser.add_argument('-M', type=float, help='OPTIMIZATION: Database mass'
44
+ ' cut, greatly speed up the search but you can incour'
45
+ ' into empty boxes [default: None]')
46
+ parser.add_argument('--lf', type=int, help='OPTIMIZATION: Leafe size for'
47
+ ' each node of the KDTree [default: %(default)d]',
48
+ default=128)
49
+ parser.add_argument('--force-light', action='store_true', help='DEBUG: '
50
+ 'force the reading of the light catalogue first')
51
+
52
+
53
+ args = parser.parse_args()
54
+
55
+ if args.survey is None:
56
+ z1 = None
57
+ z2 = None
58
+ fov = None
59
+ else:
60
+ z1 = args.survey[0]
61
+ z2 = args.survey[1]
62
+ fov = args.survey[2]
63
+
64
+ if args.n < 1:
65
+ raise ValueError('The value in -n needs to be a positive integer')
66
+
67
+ Mmax, fileNrMax, subNrMax = hebb(args.z_target, args.Nboxes,
68
+ def_path, z1, z2, fov,
69
+ L=args.L, M=args.M, v=args.v, leafsize =
70
+ args.lf, force_light=args.force_light,
71
+ nn=args.n)
72
+
73
+ for i in range(Mmax.shape[0]):
74
+ M16, M50, M86 = np.quantile(10.**Mmax[i,:].astype(np.float64), [0.16, 0.5, 0.86])
75
+ # print(f"log10(M200) = {np.log10(M50)} _-{(M50-M16)/M50/np.log(10)} ^+{(M86-M50)/M50/np.log(10)}")
76
+ print(f"{i} log10(M200) = {np.log10(M50):.2f} _-{(M50-M16)/M50/np.log(10):.2f} ^+{(M86-M50)/M50/np.log(10):.2f}")
77
+
78
+ if args.plot:
79
+ import matplotlib.pyplot as plt
80
+ fig, ax = plt.subplots()
81
+ for i in range(Mmax.shape[0]):
82
+ hist, bins = np.histogram(Mmax[i,:], 50, range=[Mmax.min(), Mmax.max()])
83
+ ax.plot((bins[:-1]+bins[1:])/2, hist, label=f"{i}")
84
+ ax.set_ylabel('N halos')
85
+ ax.set_xlabel(r'$\log (M_{200}/M_\odot)$')
86
+ ax.legend(loc='best')
87
+ plt.show()
88
+
89
+ if args.t:
90
+ from astropy.table import Table
91
+ import astropy.units as u
92
+ # unroll them
93
+ ordering = np.repeat(np.arange(Mmax.shape[0],
94
+ dtype=np.min_scalar_type(args.n)),
95
+ Mmax.shape[1])
96
+
97
+ # alter shape to force an error in case the code attempts to copy the
98
+ # arrays
99
+ fileNrMax.shape = fileNrMax.size
100
+ subNrMax.shape = subNrMax.size
101
+ ordering.shape = ordering.size
102
+ Mmax.shape = Mmax.size
103
+
104
+ t=Table([Mmax, ordering, fileNrMax, subNrMax],
105
+ names=['M200', 'IndMostMassive', 'fileNr', 'subNr'])
106
+ t['M200'].unit = u.dex(u.Msun)
107
+ t.meta = {'Description':'Hebb result table','Nboxes':args.Nboxes,
108
+ 'z_target':args.z_target, '-n':args.n}
109
+ t.sort(['fileNr', 'subNr', 'IndMostMassive'])
110
+ t.write('table_max.txt', format='ascii.ecsv', overwrite=True)
111
+
112
+ def hebb_trace_CLI():
113
+ import argparse
114
+
115
+ description = ("Trace back galaxies found with hebb.")
116
+ parser = argparse.ArgumentParser(description=description)
117
+
118
+ parser.add_argument('hebbTable', type=str, help='Data table produced by hebb')
119
+ parser.add_argument('z_target', type=float, help='Target redshift')
120
+ parser.add_argument('treePath', type=str, help='Path of the merger tree file')
121
+
122
+
123
+ parser.add_argument('-v', action='count', default=0,
124
+ help='verbosity level [%(default)d]')
125
+ parser.add_argument('-s', action='store_true',
126
+ help='Use the code in serial, without the need of MPI')
127
+
128
+ args = parser.parse_args()
129
+
130
+ finalT, mpirank = hebb_trace(args.hebbTable, args.treePath, args.z_target,
131
+ args.v, args.s)
132
+
133
+
134
+ if mpirank == 0:
135
+ finalT.write('hebb_traceback.txt', format='ascii.ecsv', overwrite=True)
136
+
137
+ def hebb_estimate_CLI():
138
+ import numpy as np
139
+ import argparse
140
+ import os
141
+
142
+ try:
143
+ def_path = os.environ['HEBB_DB_PATH']
144
+ except KeyError:
145
+ raise ValueError('The environment variable HEBB_DB_PATH must be set')
146
+
147
+ description = (
148
+ 'Estimate the smallest L at a certain z with a certain mass cut'
149
+ )
150
+ parser = argparse.ArgumentParser(description=description)
151
+ parser.add_argument('z_target', type=float, help='Target redshift')
152
+ parser.add_argument('L', type=float, help='Size of the box in cMpc')
153
+
154
+ parser.add_argument('-v', action='count', default=0,
155
+ help='verbosity level [%(default)d]')
156
+
157
+ parser.add_argument('-M', type=float, help='OPTIMIZATION: Database mass'
158
+ ' cut, greatly speed up the search but you can incour'
159
+ ' into empty boxes [default: None]')
160
+
161
+
162
+ args = parser.parse_args()
163
+
164
+ hebb_estimate(args.z_target, def_path, args.L, M=args.M, v=args.v)
@@ -0,0 +1,327 @@
1
+ #!/usr/bin/env python3
2
+ # -*- coding: utf-8 -*-
3
+ import numpy as np
4
+ import h5py
5
+ from pathlib import PurePath, Path
6
+ import astropy.units as u
7
+ import astropy.cosmology.units as cu
8
+ u.add_enabled_units(cu)
10
+ # from numba import njit
11
+ from scipy import spatial
12
+
13
+ def comov_volume(area, z1, z2):
14
+ """
15
+ Area must carry an astropy unit, e.g. 300*(u.arcmin**2).
+ Returns the side length of a cube with the same comoving volume, in cMpc.
17
+
18
+ """
19
+ from astropy.cosmology import Planck15
20
+ Omega = area.to(u.steradian).value # get rid of unit
21
+ d2 = Planck15.comoving_distance(z1)
22
+ d3 = Planck15.comoving_distance(z2)
23
+ V = Omega/3 * (d3**3 - d2**3)
24
+ newL = np.cbrt(V.to(u.Mpc**3))
25
+
26
+ return newL
27
+
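The geometry in `comov_volume` is the comoving volume of a spherical shell cut by a solid angle, V = Ω/3 (d(z2)³ − d(z1)³). A self-contained numerical check with plain NumPy, assuming flat ΛCDM with Planck-like parameters (H0 = 67.7, Ωm = 0.309), so the snippet runs without astropy:

```python
import numpy as np

H0, Om, c = 67.7, 0.309, 299_792.458   # km/s/Mpc, dimensionless, km/s

def comoving_distance(z, n=10_000):
    """Comoving distance in Mpc for flat LCDM (trapezoidal integration)."""
    zz = np.linspace(0.0, z, n)
    invE = 1.0 / np.sqrt(Om * (1 + zz) ** 3 + (1 - Om))
    dz = zz[1] - zz[0]
    return (c / H0) * float(np.sum((invE[1:] + invE[:-1]) * 0.5 * dz))

def box_side(fov_arcmin2, z1, z2):
    """Side of a cube with the comoving volume of the survey, in cMpc."""
    Omega = fov_arcmin2 * (np.pi / 180 / 60) ** 2  # arcmin^2 -> steradian
    V = Omega / 3 * (comoving_distance(z2) ** 3 - comoving_distance(z1) ** 3)
    return np.cbrt(V)

print(box_side(300.0, 1.0, 2.0))   # ~100 cMpc for a 300 arcmin^2 survey
```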
28
+ # @njit
29
+ def dL(coord1, coord2_in, boxsize):
30
+ """
31
+ Computes the distance in a 3D periodic box between a set of coordinates and
32
+ a point, or another, broadcastable set of coordinates.
33
+
34
+ coord1: array of dimensions (n,3)
35
+
36
+ coord2: array of dimensions (1,3) or (3,) in the case of a single point, or
37
+ array of size (n,3) in case of multiple points
38
+
39
+ boxsize : scalar, the size of the box, each dimension has the same size
40
+
41
+ Returns: the per-axis separations dx, dy, dz, wrapped to the minimum image
42
+
43
+
44
+ usage:
45
+
46
+ import numpy as np
47
+
48
+ box = 0.5
49
+
50
+ # generate fake data
51
+ rng = np.random.default_rng(12345)
52
+ c1 = rng.random((1000,3))
53
+
54
+ dL(c1, np.array([0.,0.,0.]), box)
55
+
56
+
57
+ c2 = rng.random((1000,3))
58
+ dL(c1, c2, box)
59
+
60
+ """
61
+
62
+ message = 'coord1 must be a two dimensional array with shape[1]==3'
63
+ coord2=np.atleast_2d(coord2_in)
64
+
65
+ if coord1.ndim != 2:
66
+ raise ValueError(message)
67
+
68
+ if coord1.shape[1] != 3:
69
+ raise ValueError(message)
70
+
71
+ if coord2.ndim == 1:
72
+ if coord2.size != 3:
73
+ raise ValueError('I need 3 points as np.array([x,y,z])')
74
+ elif coord2.ndim == 2:
75
+ if coord2.shape[1] != 3:
76
+ raise ValueError('I need 3 points as np.array([x,y,z])')
77
+ else:
78
+ raise ValueError('I need 3 points as np.array([x,y,z])')
79
+
80
+ dx = coord1[:,0] - coord2[:,0]
81
+ dy = coord1[:,1] - coord2[:,1]
82
+ dz = coord1[:,2] - coord2[:,2]
83
+
84
+ dx[(dx > boxsize*0.5)] -= boxsize
85
+ dx[(dx < - boxsize*0.5)] += boxsize
86
+ dy[(dy > boxsize*0.5)] -= boxsize
87
+ dy[(dy < - boxsize*0.5)] += boxsize
88
+ dz[(dz > boxsize*0.5)] -= boxsize
89
+ dz[(dz < - boxsize*0.5)] += boxsize
90
+
91
+ return dx, dy, dz
92
+
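A quick standalone check of the minimum-image convention used in `dL` above (re-implemented here in compact form so the snippet runs without installing the package):

```python
import numpy as np

def min_image(coord1, coord2, boxsize):
    """Per-axis separations in a periodic box, wrapped to the nearest image."""
    d = coord1 - np.atleast_2d(coord2)
    d[d > boxsize * 0.5] -= boxsize
    d[d < -boxsize * 0.5] += boxsize
    return d

box = 100.0
p1 = np.array([[1.0, 1.0, 1.0]])
p2 = np.array([99.0, 99.0, 99.0])
print(min_image(p1, p2, box))   # [[2. 2. 2.]] -- near opposite faces, yet close
```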
93
+ # @njit(parallel=True)
94
+ def bootstrap_brute_force(Nboxes, BoxSize, newL, coords, centers, mass, fileNr, subNr):
95
+
96
+ Mmax = np.zeros(Nboxes)
97
+ fileNrMax = np.zeros(Nboxes, dtype=fileNr.dtype)
98
+ subNrMax = np.zeros(Nboxes, dtype=subNr.dtype)
99
+
100
+ for j in range(Nboxes):
101
+
102
+ dx, dy, dz = dL(coords, centers[j,:], BoxSize)
103
+ mask = (np.abs(dx) < newL) & (np.abs(dy) < newL) & (np.abs(dz) < newL)
104
+
105
+ if np.count_nonzero(mask) == 0:
106
+ raise ValueError('Hitting an empty region, in case you used -M try'
107
+ ' to lower (or omit) the mass cut')
108
+ mass_tmp = mass[mask]
109
+ fileNr_tmp = fileNr[mask]
110
+ subNr_tmp = subNr[mask]
111
+
112
+ index = np.argmax(mass_tmp)
113
+
114
+ Mmax[j] = mass_tmp[index]
115
+ fileNrMax[j] = fileNr_tmp[index]
116
+ subNrMax[j] = subNr_tmp[index]
117
+
118
+ return Mmax, fileNrMax, subNrMax
119
+
120
+ def bootstrap_kdtree_double(Nboxes, BoxSize, newL, coords, centers, mass, fileNr, subNr):
121
+ """
122
+ Do a search using a double KDTree and perform a block bootstrap.
123
+
124
+ """
125
+
126
+ Mmax = np.zeros(Nboxes)
127
+ fileNrMax = np.zeros(Nboxes, dtype=fileNr.dtype)
128
+ subNrMax = np.zeros(Nboxes, dtype=subNr.dtype)
129
+
130
+ tree_data = spatial.KDTree(coords, boxsize=BoxSize, leafsize=32)
131
+ tree_centres = spatial.KDTree(centers, boxsize=BoxSize, leafsize=32)
132
+ indexes = tree_centres.query_ball_tree(tree_data, newL, p=np.inf)
133
+
134
+ assert len(indexes) == Nboxes
135
+ for j in range(Nboxes):
136
+
137
+ if len(indexes[j]) == 0:
138
+ raise ValueError('Hitting an empty region, in case you used -M try'
139
+ ' to lower (or omit) the mass cut')
140
+ ind = np.array(indexes[j],
141
+ dtype=np.min_scalar_type(coords.shape[0]))
142
+ mass_tmp = mass[ind]
143
+ fileNr_tmp = fileNr[ind]
144
+ subNr_tmp = subNr[ind]
145
+
146
+ index = np.argmax(mass_tmp)
147
+ Mmax[j] = mass_tmp[index]
148
+ fileNrMax[j] = fileNr_tmp[index]
149
+ subNrMax[j] = subNr_tmp[index]
150
+
151
+ return Mmax, fileNrMax, subNrMax
152
+
153
+ def bootstrap_kdtree_single(Nboxes, BoxSize, newL, coords, centers, mass,
154
+ fileNr, subNr, leafsize=128, nn=1):
155
+ """
156
+ Do a search using a single KDTree and perform a block bootstrap. Better
157
+ than the double one since the centers are uniformly seeded in the volume,
158
+ and it can be parallelized
159
+
160
+ """
161
+
162
+ Mmax = np.zeros((nn,Nboxes), dtype=mass.dtype)
163
+ fileNrMax = np.zeros((nn,Nboxes), dtype=fileNr.dtype)
164
+ subNrMax = np.zeros((nn,Nboxes), dtype=subNr.dtype)
165
+
166
+ tree = spatial.KDTree(coords, boxsize=BoxSize, leafsize=leafsize)
167
+ dtype=np.min_scalar_type(coords.shape[0])
168
+
169
+ for j in range(Nboxes):
170
+ ind = np.array(tree.query_ball_point(centers[j,:], newL, p=np.inf, workers=-1), dtype=dtype)
171
+
172
+ if ind.size < nn:
173
+ raise ValueError('Hitting a region with less than '
174
+ f"{nn} subhalos, in case you used -M try"
175
+ ' to lower (or omit) the mass cut')
176
+
177
+ mass_tmp = mass[ind]
178
+ fileNr_tmp = fileNr[ind]
179
+ subNr_tmp = subNr[ind]
180
+
182
+ index = np.argsort(mass_tmp)[-nn:][::-1]
183
+ Mmax[:,j] = mass_tmp[index]
184
+ fileNrMax[:,j] = fileNr_tmp[index]
185
+ subNrMax[:,j] = subNr_tmp[index]
186
+
187
+ return Mmax, fileNrMax, subNrMax
188
+
189
+
190
+
191
+ def hebb(z_target, Nboxes, path_data, z1, z2, fov, L=None, M=None, v=0,
192
+ leafsize=128, force_light=False, nn = 1):
193
+ from .uchuu_snaps_z import uchuu_snap_list
194
+
195
+ snapNr_list, z = uchuu_snap_list()
196
+ snapNr = snapNr_list[np.abs(z - z_target).argmin()]
197
+
198
+ # Boxsize in cMpc
199
+ BoxSize = 2000/0.6774
200
+
201
+ catFileName = Path(path_data)/f'catalogue_uchuu.hdf5'
202
+ if not catFileName.is_file() or force_light:
203
+ catFileName = Path(path_data)/f'catalogue_uchuu_light.hdf5'
204
+
205
+ if not catFileName.is_file():
206
+ raise IOError('I cannot locate the catalogue file; I have tried '
207
+ f"{Path(path_data)/f'catalogue_uchuu.hdf5'}"
208
+ f" and {Path(path_data)/f'catalogue_uchuu_light.hdf5'}"
209
+ f"; visit https://github.com/NegriAndrea/hebb")
210
+
211
+ if v>0:
212
+ print(f"Reading {catFileName}")
213
+
214
+ with h5py.File(catFileName, 'r') as ff:
215
+
216
+ # the catalogue is sorted in M200, with histograms to load only the
217
+ # haloes above a certain mass without having to read the full dataset
218
+ # if not needed
219
+ if M is None:
220
+ offset = 0
221
+ else:
222
+ m200_indexes = ff[f'S-{snapNr}/M200_indexes'][()]
223
+ m200_bins_edges = ff[f'S-{snapNr}/M200c_bins_edges'][()]
224
+ if np.log10(M) < m200_bins_edges[0]:
225
+ raise ValueError(f"The requested mass {M=:.2e} Msun is too low for "
226
+ f"z={z_target}, database min mass at this z "
227
+ f"is {10.**float(m200_bins_edges[0]):.2e} Msun")
228
+ tmp = np.searchsorted(m200_bins_edges, np.log10(M))
229
+ offset = int(m200_indexes[max(tmp-1,0)])
230
+
231
+ coords = ff[f'S-{snapNr}/Coordinates'][offset:]
232
+ mass= ff[f'S-{snapNr}/M200c'][offset:]
233
+ fileNr = ff[f'S-{snapNr}/fileNr'][offset:]
234
+ subNr = ff[f'/S-{snapNr}/SubNr'][offset:]
235
+ bin_size = BoxSize/65_535
236
+
237
+ coords=coords.astype(np.float32)*bin_size
238
+
239
+ if v>0:
240
+ print(f"READ log(M200/Msun): min={mass[0]:.2f},"
241
+ f" max={mass[-1]:.2f}")
242
+
243
+ if L is None:
244
+ # for surveys
245
+ area = fov*(u.arcmin**2)
246
+ newL = comov_volume(area,z1, z2).value/2 # in cMpc
247
+ else:
248
+ newL = L/2.
249
+
250
+ if v>0:
251
+ print(f"V={8*newL**3:.3e} cMpc^3, L={newL*2:.3f} cMpc")
252
+
253
+ # shoot (Nboxes,3) random numbers between 0 and BoxSize
254
+ rng = np.random.default_rng()
255
+ centers=rng.random(size=(Nboxes,3), dtype=np.float32)
256
+ centers*=BoxSize
257
+
258
+
259
+ # Mmax, fileNrMax, subNrMax = bootstrap_brute_force(Nboxes, BoxSize, newL, coords,
260
+ # centers, mass, fileNr, subNr)
261
+ # Mmax, fileNrMax, subNrMax = bootstrap_kdtree_double(Nboxes, BoxSize, newL, coords,
262
+ # centers, mass, fileNr, subNr)
263
+ Mmax, fileNrMax, subNrMax = bootstrap_kdtree_single(Nboxes, BoxSize, newL, coords,
264
+ centers, mass, fileNr, subNr,
265
+ leafsize=leafsize,
266
+ nn=nn)
267
+
268
+ return Mmax, fileNrMax, subNrMax
269
+
270
+ def hebb_estimate(z_target, path_data, L, M=None, v=0):
271
+ from .uchuu_snaps_z import uchuu_snap_list
272
+
273
+ snapNr_list, z = uchuu_snap_list()
274
+ snapNr = snapNr_list[np.abs(z - z_target).argmin()]
275
+
276
+ # Boxsize in cMpc
277
+ BoxSize = 2000/0.6774
278
+
279
+ catFileName = Path(path_data)/f'catalogue_uchuu.hdf5'
280
+ if not catFileName.is_file():
281
+ catFileName = Path(path_data)/f'catalogue_uchuu_light.hdf5'
282
+
283
+ if not catFileName.is_file():
284
+ raise IOError('I cannot locate the catalogue file; I have tried '
285
+ f"{Path(path_data)/f'catalogue_uchuu.hdf5'}"
286
+ f" and {Path(path_data)/f'catalogue_uchuu_light.hdf5'}")
287
+
288
+ if v>0:
289
+ print(f"Reading {catFileName}")
290
+
291
+ with h5py.File(catFileName, 'r') as ff:
292
+
293
+ # the catalogue is sorted in M200, with histograms to load only the
294
+ # haloes above a certain mass without having to read the full dataset
295
+ # if not needed
296
+ if M is None:
297
+ offset = 0
298
+ else:
299
+ m200_indexes = ff[f'S-{snapNr}/M200_indexes'][()]
300
+ m200_bins_edges = ff[f'S-{snapNr}/M200c_bins_edges'][()]
301
+ if np.log10(M) < m200_bins_edges[0]:
302
+ raise ValueError(f"The requested mass {M=:.2e} Msun is too low for "
303
+ f"z={z_target}, database min mass at this z "
304
+ f"is {10.**float(m200_bins_edges[0]):.2e} Msun")
305
+ tmp = np.searchsorted(m200_bins_edges, np.log10(M))
306
+ offset = int(m200_indexes[max(tmp-1,0)])
307
+
308
+ coords = ff[f'S-{snapNr}/Coordinates'][offset:]
309
+ mass= ff[f'S-{snapNr}/M200c'][offset:]
310
+ fileNr = ff[f'S-{snapNr}/fileNr'][offset:]
311
+ subNr = ff[f'/S-{snapNr}/SubNr'][offset:]
312
+ bin_size = BoxSize/65_535
313
+
314
+ coords=coords.astype(np.float32)*bin_size
315
+
316
+ if v>0:
317
+ print(f"READ log(M200/Msun): min={mass[0]:.2f},"
318
+ f" max={mass[-1]:.2f}")
319
+
320
+
321
+ for n in range(32):
322
+ nbins = 2**n
323
+ H, bins = np.histogramdd(coords, bins=nbins)
324
+ stop_cond = np.any(H==0)
325
+ print(f"L={BoxSize/nbins} {stop_cond}")
326
+ if stop_cond:
327
+ return
@@ -0,0 +1,136 @@
1
+ #!/usr/bin/env python3
2
+ # -*- coding: utf-8 -*-
3
+ from astropy.table import Table, vstack
4
+
5
+ def unique_ordered(x):
6
+ """
7
+ Same output of np.unique when return_counts=True,
8
+ return_index=True, but 1000 faster since x is assumed sorted.
9
+ It works on 1d arrays, untested for other shapes.
10
+ """
11
+ import numpy as np
12
+
13
+ x = np.asarray(x)
14
+ if x.ndim != 1:
15
+ raise ValueError('x must be a 1D array-like')
16
+
17
+ # compare the elements and find where the elements change. This
18
+ # is better than using np.diff, since the comparison always
19
+ # work, while for some arrays (like recarrays) the difference
20
+ # may be not defined
21
+ flt = np.flatnonzero(x[1:] != x[:-1])
22
+
23
+ # by construction, we to add 1 to flt
24
+ off = np.concatenate([[0], flt+1])
25
+ dim = np.concatenate([np.diff(off), [x.size-off[-1]]])
26
+ unique = x[off]
27
+
28
+ return unique, off, dim
29
+
30
+ def hebb_trace(tableName, mergerTreePath, targetZ, v, serial):
31
+
32
+ if serial:
33
+ # do it serially
34
+ t = Table.read(tableName, format='ascii.ecsv')
35
+ return (hebb_trace_single(tableName, mergerTreePath, targetZ, v, 0,
36
+ len(t)),0)
37
+ else:
38
+ import helperspy as hpy
39
+ from mpi4py import MPI
40
+ comm = MPI.COMM_WORLD
41
+ rank = comm.Get_rank()
42
+ size = comm.Get_size()
43
+
44
+ t = Table.read(tableName, format='ascii.ecsv')
45
+ t.sort(['fileNr'])
46
+
47
+ offset, sizes = hpy.chopArray(len(t), size)
48
+
49
+ i_begin = offset[rank]
50
+ i_end = sizes[rank]+i_begin
51
+
52
+ local_t = hebb_trace_single(tableName,
53
+ mergerTreePath, targetZ, v, i_begin, i_end)
54
+
55
+ all_tables = comm.gather(local_t, root=0)
56
+
57
+ if rank == 0:
58
+ final_table = vstack(all_tables)
59
+ return final_table, rank
60
+ else:
61
+ return None, rank
62
+
63
+
64
+
65
+
66
+ def hebb_trace_single(tableName, mergerTreePath, targetZ, v, i_begin, i_end):
+     """
+     Trace back galaxies found with hebb.
+     """
+     import numpy as np
+     from pathlib import PurePath
+     import astropy.units as u
+     import astropy.cosmology.units as cu
+     u.add_enabled_units(cu)
+     import helperspy as hpy
+
+     from .uchuu_snaps_z import uchuu_snap_list
+
+     snapNr_list, z = uchuu_snap_list()
+     # snapshot whose redshift is closest to the requested target
+     targetSnapNr = snapNr_list[np.abs(z - targetZ).argmin()]
+
+     t = Table.read(tableName, format='ascii.ecsv')
+     t = t[i_begin:i_end]
+
+     t.sort(['fileNr', 'subNr'])
+
+     UfileNr, off, size = unique_ordered(t['fileNr'])
+
+     if v > 1:
+         tmp = Table([UfileNr, size], names=['MergTreeID', 'Nhalos'])
+         print(tmp)
+         del tmp
+
+     fullT = []
+     fileNr_original = []
+     subNr_original = []
+
+     # loop over the merger tree files (MT)
+     for jMT, o, s in zip(UfileNr, off, size):
+
+         # sub-table containing only the entries of one merger tree file
+         subt = t[o:o+s]
+
+         fileMT = PurePath(mergerTreePath) / f'mergertree_{jMT}.h5'
+
+         if v > 0:
+             print(f'Read {fileMT}')
+         # in case the positions are needed, use newFields=['Pos']
+         tree = hpy.forestCT(fileMT, newFields=[])
+
+         for sNr in subt['subNr']:
+             if v > 0:
+                 print(f'Doing {sNr}')
+
+             try:
+                 # trace the subhalo history; here we want the progenitor
+                 # TODO: is raiseError=True needed here?
+                 tHist = tree.subHistory(sNr, targetSnapNr=targetSnapNr,
+                                         t=True, raiseError=True)
+
+                 # keep only the entry at the target snapshot
+                 fullT.append(tHist[tHist['SnapNr'] == targetSnapNr])
+                 fileNr_original.append(jMT)
+                 subNr_original.append(sNr)
+             except hpy.CTNoProgenitorError:
+                 if v > 1:
+                     print('NoProgenitor')
+
+     fullT = vstack(fullT)
+     fullT['fileNr_original'] = np.array(fileNr_original,
+                                         dtype=t['fileNr'].dtype)
+     fullT['subNr_original'] = np.array(subNr_original,
+                                        dtype=t['subNr'].dtype)
+
+     return fullT
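+ `unique_ordered` is defined elsewhere in the package; the call site above expects it to return, for a sorted column, the unique values together with the offset and length of each run of equal values. Assuming that contract, it can be sketched with `np.unique` (which, for already-sorted input, yields exactly the run starts via `return_index`):

```python
import numpy as np

def unique_ordered(a):
    """Unique values of a sorted array plus run offsets and run lengths."""
    values, offsets, counts = np.unique(a, return_index=True,
                                        return_counts=True)
    return values, offsets, counts

vals, off, size = unique_ordered(np.array([3, 3, 5, 5, 5, 9]))
print(vals, off, size)  # [3 5 9] [0 2 5] [2 3 1]
```

+ With these offsets and sizes, `t[o:o+s]` in the loop above selects exactly the rows that belong to one merger tree file, so each HDF5 file is opened once.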
@@ -0,0 +1,110 @@
+ #!/usr/bin/env python3
+ # -*- coding: utf-8 -*-
+
+ def uchuu_snap_list():
+     import numpy as np
+
+     # Uchuu snapshot numbers 0..49
+     snapNr = np.arange(50, dtype=np.uint8)
+
+     # redshift of each snapshot
+     z = np.array([
+         13.956626,   12.687381,   11.509382,   10.4416485,  9.472301,
+         8.587728,    7.764242,    7.0256824,   6.347539,    5.725402,
+         5.1549826,   4.630631,    4.268704,    3.9307232,   3.6095695,
+         3.3151808,   3.12916,     2.9508514,   2.779575,    2.6155903,
+         2.4582927,   2.309505,    2.1655588,   2.0288346,   1.8969555,
+         1.7713115,   1.6509027,   1.5358186,   1.4257126,   1.3211012,
+         1.2202487,   1.1240892,   1.0319009,   0.9442769,   0.8595655,
+         0.77916944,  0.70206976,  0.62813413,  0.55789936,  0.48993552,
+         0.42559803,  0.36407042,  0.3046825,   0.2480967,   0.19424379,
+         0.14247847,  0.0928843,   0.04553294,  0.022683144, 2.002716e-05
+         ], dtype=np.float32)
+
+     return snapNr, z
@@ -0,0 +1,79 @@
+ Metadata-Version: 2.4
+ Name: hebb
+ Version: 0.7
+ Summary: Halo Extreme Block Bootstrap package to estimate the largest halo mass given the field of view and redshift depth of a survey
+ Author-email: Andrea Negri <anegri@us.es>
+ License-Expression: BSD-3-Clause
+ Project-URL: Homepage, https://github.com/NegriAndrea/hebb
+ Project-URL: Issues, https://github.com/NegriAndrea/hebb/issues
+ Classifier: Programming Language :: Python :: 3
+ Classifier: Operating System :: OS Independent
+ Requires-Python: >=3.10
+ Description-Content-Type: text/markdown
+ License-File: LICENSE.md
+ Requires-Dist: numpy
+ Requires-Dist: astropy
+ Requires-Dist: scipy
+ Requires-Dist: h5py>=3.2.0
+ Requires-Dist: scipy
+ Requires-Dist: matplotlib
+ Requires-Dist: mpi4py
+ Dynamic: license-file
+
+ # hebb
+ Halo Extreme Block Bootstrap package to estimate the distribution of the heaviest halo mass you can find in a survey at a given redshift.
+
+ `hebb` uses the Uchuu simulation halo catalogue (https://skiesanduniverses.org/Simulations/Uchuu) as its base data to perform a block bootstrap: it shoots N boxes of volume equal to the estimated volume of a survey and recovers the largest halo formed at a particular z, with an uncertainty estimate. The volume of the survey is computed from the survey's field of view and redshift depth, or selected manually. Optionally, the code can dump the list of halos found in the search, which can be used to perform a trace back in time with the Uchuu merger tree.
+
+ Please cite the following papers if you use `hebb` in your work: Negri & Belli (2026), [Ishiyama et al. (2021)](https://ui.adsabs.harvard.edu/abs/2021MNRAS.506.4210I) (Uchuu Data Release 1).
+
+ ## Installation
+ There are 2 ways to install `hebb`: via `PyPI`
+ ```
+ python3 -m pip install hebb
+ ```
+ or by cloning the repository `https://github.com/NegriAndrea/hebb/` and installing locally
+ ```
+ git clone https://github.com/NegriAndrea/hebb
+ cd hebb
+ python3 -m pip install -e .
+ ```
+
+ ### Database Setup
+ In addition to installing the Python package, you must download a reduced version of the Uchuu database. Two versions are available, with different cuts in mass: a light [one](https://uses0-my.sharepoint.com/:u:/g/personal/anegri_us_es/IQCDx5T1XlTnSbM2YmYcFoYMATIk_pI1XKPzp4ycUJ6N12M?e=pDdoDg) (2.3 GB) and a more complete one (~40 GB) from [here](https://uses0-my.sharepoint.com/:u:/g/personal/anegri_us_es/IQBrGP-3e0xHR6pV0YkfCb1hAYS1KLSNnjmDvVb3H6ytsUA?e=fcudjY). To set the database path you can define the following environment variable in your `~/.bash_profile`
+ ```
+ # bash
+ export HEBB_DB_PATH=/path/to/database
+ ```
+ where you have to change `/path/to/database` to the path of the downloaded database. By default, the code looks first for the more complete catalogue and, if it is not found, falls back to the light one.
+
+ NOTE: In order to keep the file size manageable, the database contains only log10(M200), positions, and merger tree IDs; the halo positions have been binned on a 40 ckpc grid and stored as `uint16` integers, which is precise enough for volumes that are usually far larger than 1 cMpc.
+
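+ Assuming the stored `uint16` values are simply grid indices (comoving position divided by the 40 ckpc cell size), recovering approximate comoving coordinates is a single multiplication. A minimal sketch — the function name and the index-times-cell-size convention are assumptions, not the package's documented API:

```python
import numpy as np

CELL_CKPC = 40.0  # assumed grid spacing used to bin halo positions (ckpc)

def decode_positions(pos_idx):
    """Convert stored uint16 grid indices back to comoving Mpc.

    pos_idx : integer array of grid indices, shape (N, 3)
    """
    return pos_idx.astype(np.float64) * CELL_CKPC / 1000.0  # ckpc -> cMpc

# a halo stored at grid index (25000, 100, 3) maps to (1000, 4, 0.12) cMpc
idx = np.array([[25000, 100, 3]], dtype=np.uint16)
pos = decode_positions(idx)
```

+ Note that `uint16` spans 0..65535, so at 40 ckpc per cell a single coordinate can cover up to ~2.6 cGpc, consistent with the 2 cGpc/h Uchuu box.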
+ ### Test installation
+ The easiest way to test whether everything is correctly installed and set up is to run the command
+ ```
+ hebb 100 0. -L 1000 -M 1e14
+ ```
+
+ ## Usage
+ The simplest way to use `hebb` is via the command line; `hebb -h` returns the user manual:
+
+
+ @ARTICLE{2021MNRAS.506.4210I,
+        author = {{Ishiyama}, Tomoaki and {Prada}, Francisco and {Klypin}, Anatoly A. and {Sinha}, Manodeep and {Metcalf}, R. Benton and {Jullo}, Eric and {Altieri}, Bruno and {Cora}, Sof{\'\i}a A. and {Croton}, Darren and {de la Torre}, Sylvain and {Mill{\'a}n-Calero}, David E. and {Oogi}, Taira and {Ruedas}, Jos{\'e} and {Vega-Mart{\'\i}nez}, Cristian A.},
+         title = "{The Uchuu simulations: Data Release 1 and dark matter halo concentrations}",
+       journal = {\mnras},
+          year = 2021,
+         month = sep,
+        volume = {506},
+        number = {3},
+         pages = {4210-4231},
+           doi = {10.1093/mnras/stab1755},
+ archivePrefix = {arXiv},
+        eprint = {2007.14720},
+  primaryClass = {astro-ph.CO},
+        adsurl = {https://ui.adsabs.harvard.edu/abs/2021MNRAS.506.4210I},
+ }
@@ -0,0 +1,14 @@
+ LICENSE.md
+ README.md
+ pyproject.toml
+ src/hebb/__init__.py
+ src/hebb/hebb_cli.py
+ src/hebb/hebb_core.py
+ src/hebb/hebb_trace.py
+ src/hebb/uchuu_snaps_z.py
+ src/hebb.egg-info/PKG-INFO
+ src/hebb.egg-info/SOURCES.txt
+ src/hebb.egg-info/dependency_links.txt
+ src/hebb.egg-info/entry_points.txt
+ src/hebb.egg-info/requires.txt
+ src/hebb.egg-info/top_level.txt
@@ -0,0 +1,4 @@
+ [console_scripts]
+ hebb = hebb.hebb_cli:hebb_CLI
+ hebb-estimate = hebb.hebb_cli:hebb_estimate_CLI
+ hebb-tr = hebb.hebb_cli:hebb_trace_CLI
@@ -0,0 +1,7 @@
+ numpy
+ astropy
+ scipy
+ h5py>=3.2.0
+ scipy
+ matplotlib
+ mpi4py
@@ -0,0 +1 @@
+ hebb