prepostcov 0.1.0__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,29 @@
+ Creative Commons Attribution-NonCommercial 4.0 International License
+
+ Copyright (c) 2026 Maus, Baleato Lizancos and White
+
+ This work is licensed under the Creative Commons Attribution-NonCommercial 4.0
+ International License. To view a copy of this license, visit
+ http://creativecommons.org/licenses/by-nc/4.0/ or send a letter to
+ Creative Commons, PO Box 1866, Mountain View, CA 94042, USA.
+
+ You are free to:
+ - Share — copy and redistribute the material in any medium or format
+ - Adapt — remix, transform, and build upon the material
+
+ Under the following terms:
+ - Attribution — You must give appropriate credit, provide a link to the license,
+   and indicate if changes were made. You may do so in any reasonable manner, but
+   not in any way that suggests the licensor endorses you or your use.
+ - NonCommercial — You may not use the material for commercial purposes.
+ - No additional restrictions — You may not apply legal terms or technological
+   measures that legally restrict others from doing anything the license permits.
+
+ Notices:
+ You do not have to comply with the license for elements of the material in the
+ public domain or where your use is permitted by an applicable exception or
+ limitation.
+
+ No warranties are given. The license may not give you all of the permissions
+ necessary for your intended use. For example, other rights such as publicity,
+ privacy, or moral rights may limit how you use the material.
@@ -0,0 +1,15 @@
+ Metadata-Version: 2.1
+ Name: prepostcov
+ Version: 0.1.0
+ Summary: Covariance and multipole tools for power spectrum and correlation function analysis.
+ Home-page: https://github.com/abaleato/PrePostCov/tree/main
+ Author: A. Baleato Lizancos & M. Maus
+ Author-email: a.baleatolizancos@berkeley.edu
+ License: CC BY-NC 4.0
+ Classifier: Programming Language :: Python :: 3
+ Classifier: License :: Free for non-commercial use
+ Classifier: Operating System :: OS Independent
+ Requires-Python: >=3.7
+ License-File: LICENSE
+ Requires-Dist: numpy
+ Requires-Dist: scipy
@@ -0,0 +1,83 @@
+
+ # PrePostCov
+
+ PrePostCov is a Python package to compute analytical covariances between multipoles of the pre-reconstruction galaxy power spectrum and the post-reconstruction two-point correlation function.
+
+ ## Installation
+
+ You can install the package locally (from the root directory) with:
+
+ ```bash
+ pip install .
+ ```
+
+ Or install it in editable mode with the development dependencies:
+
+ ```bash
+ pip install -e .[dev]
+ ```
+ ---
+
+
+ ## $P_{\ell}(k)^{\rm{pre-recon}}$ - $\xi_{\ell}(r)^{\rm{post-recon}}$ cross-covariance
+ The main novelty presented in this repo is an analytic calculation of the cross-covariance between $P_{\ell}(k)^{\rm{pre-recon}}$ and $\xi_{\ell}(r)^{\rm{post-recon}}$. The required functionality lives in `covariances_pk_xi.py`. Running it requires only basic dependencies such as `numpy` and `scipy`.
+
+ ### Basic usage
+
+ For the basic cross-covariance calculation, see the worked example in [preprost_covariance.ipynb](preprost_covariance.ipynb). In brief:
+
+ ```python
+ # After installation, you can import the main classes and functions as
+ from prepostcov import PkXiCovariance, LegendreMultipoleExtractor
+
+ # Initialize the covariance builder
+ cov_builder = PkXiCovariance(k=k_array, r=r_array, V=survey_volume,
+                              poles_Pk=[0, 2], poles_Xi=[0, 2])
+
+ # Provide the fiducial P_cross(k, mu) (see paper for details)
+ cov_builder.set_Pcross_from_callable(P_cross_of_mu, ells=[0, 2, 4])
+
+ # Compute the Pk-Xi block of the covariance matrix
+ cov_PkXi = cov_builder.compute_covariance()  # shape (n_poles_Pk, n_poles_Xi, nk, nr)
+ ```
+
+ ## Full $P_{\ell}(k)^{\rm{pre-recon}}$ - $\xi_{\ell}(r)^{\rm{post-recon}}$ covariance
+
+ Additionally, we include code to calculate the remaining blocks of the covariance matrix involving $P_{\ell}(k)^{\rm{pre-recon}}$ and $\xi_{\ell}(r)^{\rm{post-recon}}$. To do this, we use [TheCov](https://github.com/cosmodesi/thecov) for the $P_{\ell}(k)^{\rm{pre-recon}}$ auto piece and [RascalC](https://github.com/mikhailov/rascalc) for the $\xi_{\ell}(r)^{\rm{post-recon}}$ auto piece.
+
+ The overall steps are:
+ 1. Compute P(k) and Xi(s) from catalogs
+ 2. Compute P(k) window matrices
+ 3. Obtain fiducial spectra for the covariances
+    (steps 1-3 can be skipped if you already have fiducial spectra)
+ 4. Compute the Gaussian, T0, and SSC pieces of the P(k) covariance
+ 5. Compute the Xi(s) covariance
+ 6. Compute the PkxXi covariance and combine everything
+
+ These are all subroutines of the `full_covariance` class in [Covariances_full.py](Covariances_full.py). The front-end script is [do_everything.py](do_everything.py), to which you pass arguments specifying the desired task:
+
+ 1. **Measure P(k):**
+ ```bash
+ srun -n 64 python do_everything.py --task measure_pk \
+     --tracer LRG --zrange 0.4 0.6 --zeff 0.51 --DR 2 --version v2 \
+     --region GCcomb --outpath ./data/pk_measured/Pk_LRG_z0.4-0.6_DR2_GCcomb
+ ```
+
+ 2. **Compute window:**
+ ```bash
+ srun -n 64 python do_everything.py --task compute_window \
+     --pk_poles_path ./data/pk_measured/Pk_LRG_z0.4-0.6_DR2_GCcomb.npy \
+     --tracer LRG --zrange 0.4 0.6 --zeff 0.51 --DR 2 --version v2 \
+     --region GCcomb --outpath ./data/pk_window/wmat_LRG_z0.4-0.6_DR2_GCcomb
+ ```
+
+ ## Citation
+
+ If you use this code in your research, please cite:
+
+ [**Maus, Baleato Lizancos, White, de Mattia & Chen (2026)**](https://arxiv.org/abs/2602.12343)
+
+ ## License
+
+ This project is licensed under the [Creative Commons Attribution-NonCommercial 4.0 International License](LICENSE) (CC BY-NC 4.0). You are free to use, share, and adapt this code for non-commercial purposes with appropriate attribution.
+
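[Editor's note] The usage snippet above relies on the package's own objects; as a fully self-contained illustration of the Legendre-projection convention that `LegendreMultipoleExtractor` implements, here is a sketch with a toy anisotropic spectrum (the spectrum and the `multipole` helper are illustrative, not from the paper):

```python
import numpy as np

# Toy anisotropic spectrum: P(k, mu) = P0(k) * (1 + mu^2)  (illustrative only)
k = np.linspace(0.01, 0.3, 50)
P0 = 1.0 / (1.0 + (k / 0.1) ** 2)

mu, w = np.polynomial.legendre.leggauss(16)      # Gauss-Legendre nodes on [-1, 1]
Pkmu = P0[None, :] * (1.0 + mu[:, None] ** 2)    # (nmu, nk)

def multipole(ell):
    # P_ell(k) = (2*ell + 1)/2 * int_{-1}^{1} dmu P(k, mu) L_ell(mu)
    L = np.polynomial.legendre.Legendre.basis(ell)(mu)
    return (2 * ell + 1) / 2.0 * np.sum(w[:, None] * L[:, None] * Pkmu, axis=0)

# Analytically for this toy spectrum: P_0 = (4/3) P0, P_2 = (2/3) P0, P_4 = 0.
```

Since the integrand is a low-degree polynomial in mu, the Gauss-Legendre quadrature reproduces the analytic multipoles to machine precision.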
@@ -0,0 +1,3 @@
+ from .multipoles import *
+ from .covariances_pk_xi import *
+ # Optionally, import from Covariances_full if needed
@@ -0,0 +1,5 @@
+ from .multipoles import *
+ from .covariances_pk_xi import *
+
+ def main():
+     # Console-script entry point (referenced as prepostcov.__main__:main in setup.py)
+     print("Welcome to the PrePostCov package! This module is not meant to be run directly.")
+
+ if __name__ == "__main__":
+     main()
@@ -0,0 +1,150 @@
+ import numpy as np
+ from typing import Dict, Iterable, Optional
+ from scipy.integrate import trapezoid
+ from scipy.special import spherical_jn
+
+ from .multipoles import LegendreMultipoleExtractor, integrate_legendre_product_4
+
+
+ class PkXiCovariance:
+     """
+     Compute the disconnected covariance between P_ell(k) and xi_ell(r) using a P_cross(k, mu).
+
+     The covariance block for given (ell1, ell2) is
+         Cov[ P_{ell1}(k), xi_{ell2}(r) ] = i^{ell2} * ((2*ell1+1)*(2*ell2+1) / V) * jbar_{ell2}(k, r)
+             * sum_{L1,L2 in poles} I_{ell1,ell2,L1,L2} * P_{L1}(k) * P_{L2}(k)
+     where I is the integral of four Legendre polynomials over mu in [-1, 1] and
+     jbar_{ell2}(k, r) is the r-bin-averaged spherical Bessel function j_{ell2}(k r).
+
+     Parameters
+     - k: 1D np.ndarray of k values (nk,)
+     - r: 1D np.ndarray of r-bin centers (nr,)
+     - V: survey volume
+     - poles_Pk: list of P multipoles to include (even ells), e.g., [0, 2, 4]
+     - poles_Xi: list of xi multipoles to include (even ells), e.g., [0, 2, 4]
+     - ngauss_leg_product: half-range Gauss-Legendre order for the 4-Legendre integral; 2*ngauss nodes are used in total
+     - r_avg_nsamp: number of samples used for the r averaging in each bin
+     """
+
+     def __init__(
+         self,
+         k: np.ndarray,
+         r: np.ndarray,
+         V: float,
+         poles_Pk: Iterable[int] = (0, 2, 4),
+         poles_Xi: Iterable[int] = (0, 2, 4),
+         ngauss_leg_product: int = 32,
+         r_avg_nsamp: int = 100,
+     ) -> None:
+         self.k = np.asarray(k)
+         self.r = np.asarray(r)
+         self.V = float(V)
+         self.poles_Pk = list(poles_Pk)
+         self.poles_Xi = list(poles_Xi)
+         self.ngauss_leg_product = int(ngauss_leg_product)
+         self.r_avg_nsamp = int(r_avg_nsamp)
+
+         # Storage for P_cross multipoles: dict ell -> array(nk,)
+         self.Pcross_ells: Optional[Dict[int, np.ndarray]] = None
+
+         # Precompute 4-Legendre integrals for all combinations
+         self._I4 = self._precompute_I4()
+
+         # Precompute r-averaging grids per bin
+         self._rr_grid = self._build_r_window_grid()
+
+     # ----- Public API -----
+     def set_Pcross_from_callable(self, P_of_mu, ells: Iterable[int] = (0, 2, 4), ngauss_eval: int = 8) -> None:
+         """Compute and store P_cross multipoles from a callable P(k, mu)."""
+         ext = LegendreMultipoleExtractor(ngauss=ngauss_eval)
+         self.Pcross_ells = ext.project(P_of_mu, ells=tuple(ells))
+
+     def set_Pcross_from_dict(self, Pcross_ells: Dict[int, np.ndarray]) -> None:
+         """Provide precomputed P_cross multipoles as a dict ell -> P_ell(k)."""
+         # Basic validation
+         nk = self.k.shape[0]
+         for ell, arr in Pcross_ells.items():
+             arr = np.asarray(arr)
+             if arr.shape[0] != nk:
+                 raise ValueError(f"P_cross ell={ell} has length {arr.shape[0]} != nk={nk}")
+         self.Pcross_ells = {int(ell): np.asarray(v) for ell, v in Pcross_ells.items()}
+
+     def compute_covariance(self) -> np.ndarray:
+         """
+         Return cov_Pkell_Xiell with shape (len(poles_Pk), len(poles_Xi), nk, nr).
+         Requires Pcross_ells to be set.
+         """
+         if self.Pcross_ells is None:
+             raise RuntimeError("Pcross_ells not set. Call set_Pcross_from_callable or set_Pcross_from_dict first.")
+
+         nk = self.k.shape[0]
+         nr = self.r.shape[0]
+         cov = np.zeros((len(self.poles_Pk), len(self.poles_Xi), nk, nr))
+
+         # Precompute P_ell arrays for all Pk poles
+         P_arrays = {ell: np.asarray(self.Pcross_ells[ell]) for ell in self.poles_Pk}
+
+         # Precompute the bin-averaged Bessel factor for each xi multipole (each entry is (nk, nr))
+         jmean_cache: Dict[int, np.ndarray] = {}
+         for ell2 in self.poles_Xi:
+             jmean_cache[ell2] = self._jmean_kr(ell2)  # (nk, nr)
+
+         # Loop over desired covariance blocks
+         for i, ell1 in enumerate(self.poles_Pk):
+             for j, ell2 in enumerate(self.poles_Xi):
+                 # Sum over L1, L2
+                 S_k = np.zeros(nk)
+                 for L1 in self.poles_Pk:
+                     P1 = P_arrays[L1]
+                     for L2 in self.poles_Pk:
+                         P2 = P_arrays[L2]
+                         I = self._I4[(ell1, ell2, L1, L2)]  # scalar
+                         S_k += I * (P1 * P2)
+                 # Scalar prefactor; i^ell2 is real (+1 or -1) for even ell2
+                 pref = np.real(1j ** ell2) * (2 * ell1 + 1) * (2 * ell2 + 1) / self.V
+                 cov[i, j, :, :] = pref * S_k[:, None] * jmean_cache[ell2]  # (nk, nr)
+
+         return cov
+
+     # ----- Internal helpers -----
+     def _precompute_I4(self) -> Dict[tuple, float]:
+         I4: Dict[tuple, float] = {}
+         for ell1 in self.poles_Pk:
+             for ell2 in self.poles_Xi:
+                 for L1 in self.poles_Pk:
+                     for L2 in self.poles_Pk:
+                         I = integrate_legendre_product_4((ell1, ell2, L1, L2), ngauss=self.ngauss_leg_product)
+                         I4[(ell1, ell2, L1, L2)] = I
+         return I4
+
+     def _build_r_window_grid(self) -> np.ndarray:
+         """Build an r-grid for averaging around each r-bin center, shape (ns, nr)."""
+         r = self.r  # (nr,)
+         nr = r.shape[0]
+         ns = self.r_avg_nsamp
+         if nr > 1:
+             r_step = float(np.mean(np.diff(r)))
+         else:
+             r_step = 0.0
+         # Uniform samples in [r_j - r_step, r_j + r_step] for each r_j
+         t = np.linspace(-1.0, 1.0, ns)[:, None]  # (ns, 1)
+         rr = r[None, :] + t * r_step  # (ns, nr)
+         return rr
+
+     def _jmean_kr(self, ell: int) -> np.ndarray:
+         """Compute ⟨ j_ell(k r) ⟩_r over the r-window; returns an array of shape (nk, nr)."""
+         k = self.k  # (nk,)
+         rr = self._rr_grid  # (ns, nr)
+         ns, nr = rr.shape
+
+         # Denominator per r-bin: ∫ rr^2 drr over the window
+         den = trapezoid(rr ** 2, x=rr, axis=0)  # (nr,)
+
+         # Numerator per r-bin and k-bin
+         num = np.zeros((k.shape[0], nr))
+         for j in range(nr):
+             r_col = rr[:, j]  # (ns,)
+             y = spherical_jn(ell, k[:, None] * r_col[None, :]) * (r_col[None, :] ** 2)  # (nk, ns)
+             num[:, j] = trapezoid(y, x=r_col, axis=1)  # (nk,)
+
+         jmean = num / den[None, :]
+         return jmean  # (nk, nr)
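[Editor's note] The bin-averaged Bessel factor computed by `_jmean_kr` can be sanity-checked in isolation. Below is a minimal standalone sketch (the `jmean` helper is hypothetical, mirroring the volume-weighted average in the class) using only `numpy` and `scipy`; as k -> 0, j_0(kr) -> 1, so the average should tend to 1:

```python
import numpy as np
from scipy.special import spherical_jn
from scipy.integrate import trapezoid

def jmean(ell, k, r_center, half_width, nsamp=200):
    """Volume-weighted average of j_ell(k r) over r in [r_center - hw, r_center + hw]."""
    rr = np.linspace(r_center - half_width, r_center + half_width, nsamp)  # (ns,)
    # Numerator: int j_ell(k r) r^2 dr for each k; denominator: int r^2 dr
    num = trapezoid(spherical_jn(ell, np.outer(k, rr)) * rr ** 2, x=rr, axis=1)  # (nk,)
    den = trapezoid(rr ** 2, x=rr)
    return num / den

k = np.array([1e-4, 0.1])
out = jmean(0, k, r_center=100.0, half_width=2.0)
# out[0] is close to 1 (k r << 1); out[1] is small because j_0 oscillates across the bin.
```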
@@ -0,0 +1,143 @@
+ import numpy as np
+ from typing import Dict, Iterable, Optional
+ from scipy.integrate import trapezoid
+
+
+ class LegendreMultipoleExtractor:
+     """
+     Helper to project an anisotropic P(k, mu) onto Legendre multipoles P_ell(k).
+
+     Usage patterns:
+     - Preferred (accurate): pass a callable P(mu) -> Pk (1D array over k). The class will
+       evaluate it at Gauss-Legendre nodes and use quadrature to integrate over mu in [-1, 1].
+     - Fallback (data grid): pass a 2D array Pkmu with a corresponding mu_array. The class will
+       integrate using the trapezoidal rule (assuming P is even in mu if mu_array lies in [0, 1]).
+
+     Normalization: P_ell(k) = (2*ell+1)/2 * int_{-1}^{1} dmu P(k, mu) L_ell(mu)
+     """
+
+     def __init__(
+         self,
+         ngauss: int = 8,
+     ) -> None:
+         """
+         Parameters
+         - ngauss: number of positive-side Gauss-Legendre nodes to use (total nodes = 2*ngauss)
+         """
+         assert ngauss >= 2, "ngauss should be >= 2"
+         self.ngauss = ngauss
+         # Full-range Gauss-Legendre nodes/weights on [-1, 1]
+         self.mu_full, self.w_full = np.polynomial.legendre.leggauss(2 * ngauss)
+         # Positive half [0, 1] subset of nodes (ascending)
+         self.mu_pos = self.mu_full[ngauss:]
+
+     @staticmethod
+     def _legendre_L(ell: int, mu):
+         return np.polynomial.legendre.Legendre.basis(ell)(mu)
+
+     def project(
+         self,
+         Pkmu,
+         ells: Iterable[int] = (0, 2, 4),
+         mu_array: Optional[np.ndarray] = None,
+     ) -> Dict[int, np.ndarray]:
+         """
+         Project P(k, mu) onto Legendre multipoles.
+
+         Parameters
+         - Pkmu: either a callable f(mu) -> Pk (shape (nk,)) or a 2D array with shape (nmu, nk)
+         - ells: iterable of even multipoles to compute, e.g., (0, 2, 4)
+         - mu_array: if Pkmu is an array, the corresponding mu grid. If None and Pkmu is callable,
+           Gauss-Legendre nodes are used automatically.
+
+         Returns
+         - dict mapping ell -> P_ell(k) as a 1D array of length nk
+         """
+         ells = list(ells)
+         if any(ell < 0 or int(ell) != ell for ell in ells):
+             raise ValueError("Multipoles must be non-negative integers")
+
+         # Case 1: callable -> use Gauss-Legendre quadrature for best accuracy
+         if callable(Pkmu):
+             # Evaluate P at the positive mu nodes (assumes evenness in mu)
+             P_pos = np.array([Pkmu(mu) for mu in self.mu_pos])  # (ngauss, nk)
+             if P_pos.ndim != 2:
+                 raise ValueError("Callable must return a 1D array over k for each mu")
+             # Build full-range values using evenness: P(-mu) = P(+mu)
+             P_full = np.vstack((P_pos[::-1], P_pos))  # (2*ngauss, nk), order matches mu_full
+
+             results: Dict[int, np.ndarray] = {}
+             for ell in ells:
+                 L_full = self._legendre_L(ell, self.mu_full)[:, None]  # (2*ngauss, 1)
+                 pref = (2 * ell + 1) / 2.0
+                 P_ell = pref * np.sum(self.w_full[:, None] * L_full * P_full, axis=0)
+                 results[ell] = P_ell
+             return results
+
+         # Case 2: grid data -> use the provided mu grid and trapezoidal integration
+         if mu_array is None:
+             raise ValueError("mu_array must be provided when Pkmu is an array")
+
+         mu_array = np.asarray(mu_array)
+         P_arr = np.asarray(Pkmu)
+         if P_arr.ndim != 2:
+             raise ValueError("Pkmu array must be 2D with shape (nmu, nk)")
+         if P_arr.shape[0] != mu_array.shape[0]:
+             raise ValueError("First dimension of Pkmu must match length of mu_array")
+
+         # Determine if mu_array spans [0, 1] (half-range) or [-1, 1] (full-range)
+         half_range = (mu_array.min() >= 0) and (mu_array.max() <= 1)
+         full_range = (mu_array.min() >= -1) and (mu_array.max() <= 1) and not half_range
+         if not (half_range or full_range):
+             raise ValueError("mu_array must lie within [0, 1] or [-1, 1]")
+
+         results: Dict[int, np.ndarray] = {}
+         for ell in ells:
+             L_vals = self._legendre_L(ell, mu_array)[:, None]  # (nmu, 1)
+             pref = (2 * ell + 1) / 2.0
+             if half_range:
+                 # Assume even P(k, mu) and integrate over [0, 1] with a factor of 2
+                 P_ell = 2 * pref * trapezoid(P_arr * L_vals, x=mu_array, axis=0)
+             else:
+                 # Full-range integration over [-1, 1]
+                 P_ell = pref * trapezoid(P_arr * L_vals, x=mu_array, axis=0)
+             results[ell] = P_ell
+         return results
+
+
+ def integrate_legendre_product_4(ells, ngauss: int = 16) -> float:
+     """
+     Compute I = ∫_{-1}^{1} dμ ∏_{i=1}^4 L_{ℓ_i}(μ), where L_ℓ is the Legendre polynomial.
+
+     Uses Gauss-Legendre quadrature (exact for polynomials of sufficiently low degree) and
+     cheaply enforces the parity selection rule: if sum(ells) is odd, the integral is 0.
+
+     Parameters
+     - ells: iterable of 4 non-negative integers (ℓ1, ℓ2, ℓ3, ℓ4)
+     - ngauss: number of Gauss nodes per half-range (uses 2*ngauss in total)
+
+     Returns
+     - float integral value
+
+     Notes
+     - An exact analytic expression exists in terms of Wigner symbols via product expansions,
+       but high-order closed forms are cumbersome. The integrand is a polynomial of degree at
+       most 4*max(ells), and 2*ngauss Gauss-Legendre nodes exactly integrate polynomials up to
+       degree 4*ngauss - 1, so choosing ngauss >= max(ells) + 1 guarantees an exact result.
+     """
+     ells = list(ells)
+     if len(ells) != 4:
+         raise ValueError("ells must contain exactly four integers")
+     if any((not isinstance(L, (int, np.integer))) or L < 0 for L in ells):
+         raise ValueError("All ell must be non-negative integers")
+
+     # Parity selection: the product of four Legendre polynomials is odd if sum(ell) is odd.
+     if (sum(ells) % 2) == 1:
+         return 0.0
+
+     # Quadrature
+     mu, w = np.polynomial.legendre.leggauss(2 * ngauss)
+     prod = np.ones_like(mu)
+     for L in ells:
+         prod *= np.polynomial.legendre.Legendre.basis(L)(mu)
+     return float(np.sum(w * prod))
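[Editor's note] The quadrature logic of `integrate_legendre_product_4` can be checked against known Legendre identities with a standalone sketch (only `numpy`; the `legendre_product_integral` helper below re-implements the same rule outside the package):

```python
import numpy as np

def legendre_product_integral(ells, ngauss=16):
    """Integrate the product of Legendre polynomials L_ell over mu in [-1, 1]."""
    mu, w = np.polynomial.legendre.leggauss(2 * ngauss)  # 2*ngauss nodes, exact to degree 4*ngauss - 1
    prod = np.ones_like(mu)
    for L in ells:
        prod *= np.polynomial.legendre.Legendre.basis(L)(mu)
    return float(np.sum(w * prod))

# Normalization: int L0^4 dmu = int 1 dmu = 2
print(legendre_product_integral((0, 0, 0, 0)))  # 2.0
# Orthogonality: int L0^2 L2^2 dmu = 2/(2*2+1) = 2/5
print(legendre_product_integral((0, 0, 2, 2)))  # 0.4
# Odd total parity vanishes: int L1 L2 L2 L2 dmu = 0
print(legendre_product_integral((1, 2, 2, 2)))  # ~0 (machine precision)
```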
@@ -0,0 +1,15 @@
+ Metadata-Version: 2.1
+ Name: prepostcov
+ Version: 0.1.0
+ Summary: Covariance and multipole tools for power spectrum and correlation function analysis.
+ Home-page: https://github.com/abaleato/PrePostCov/tree/main
+ Author: A. Baleato Lizancos & M. Maus
+ Author-email: a.baleatolizancos@berkeley.edu
+ License: CC BY-NC 4.0
+ Classifier: Programming Language :: Python :: 3
+ Classifier: License :: Free for non-commercial use
+ Classifier: Operating System :: OS Independent
+ Requires-Python: >=3.7
+ License-File: LICENSE
+ Requires-Dist: numpy
+ Requires-Dist: scipy
@@ -0,0 +1,13 @@
+ LICENSE
+ README.md
+ setup.py
+ prepostcov/__init__.py
+ prepostcov/__main__.py
+ prepostcov/covariances_pk_xi.py
+ prepostcov/multipoles.py
+ prepostcov.egg-info/PKG-INFO
+ prepostcov.egg-info/SOURCES.txt
+ prepostcov.egg-info/dependency_links.txt
+ prepostcov.egg-info/entry_points.txt
+ prepostcov.egg-info/requires.txt
+ prepostcov.egg-info/top_level.txt
@@ -0,0 +1,2 @@
+ [console_scripts]
+ prepostcov = prepostcov.__main__:main
@@ -0,0 +1,2 @@
+ numpy
+ scipy
@@ -0,0 +1 @@
+ prepostcov
@@ -0,0 +1,4 @@
+ [egg_info]
+ tag_build =
+ tag_date = 0
+
@@ -0,0 +1,28 @@
+ from setuptools import setup, find_packages
+
+ setup(
+     name="prepostcov",
+     version="0.1.0",
+     description="Covariance and multipole tools for power spectrum and correlation function analysis.",
+     author="A. Baleato Lizancos & M. Maus",
+     author_email="a.baleatolizancos@berkeley.edu",
+     url="https://github.com/abaleato/PrePostCov/tree/main",
+     packages=find_packages(),
+     install_requires=[
+         "numpy",
+         "scipy",
+         # Add other dependencies as needed
+     ],
+     entry_points={
+         "console_scripts": [
+             "prepostcov=prepostcov.__main__:main"
+         ]
+     },
+     python_requires=">=3.7",
+     license="CC BY-NC 4.0",
+     classifiers=[
+         "Programming Language :: Python :: 3",
+         "License :: Free for non-commercial use",
+         "Operating System :: OS Independent",
+     ],
+ )