zipline_polygon_bundle 0.2.1__tar.gz → 0.2.3__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (19)
  1. {zipline_polygon_bundle-0.2.1 → zipline_polygon_bundle-0.2.3}/PKG-INFO +86 -6
  2. zipline_polygon_bundle-0.2.3/README.md +195 -0
  3. {zipline_polygon_bundle-0.2.1 → zipline_polygon_bundle-0.2.3}/pyproject.toml +23 -33
  4. {zipline_polygon_bundle-0.2.1 → zipline_polygon_bundle-0.2.3}/zipline_polygon_bundle/config.py +13 -13
  5. {zipline_polygon_bundle-0.2.1 → zipline_polygon_bundle-0.2.3}/zipline_polygon_bundle/trades.py +2 -0
  6. zipline_polygon_bundle-0.2.1/README.md +0 -115
  7. {zipline_polygon_bundle-0.2.1 → zipline_polygon_bundle-0.2.3}/LICENSE +0 -0
  8. {zipline_polygon_bundle-0.2.1 → zipline_polygon_bundle-0.2.3}/zipline_polygon_bundle/__init__.py +0 -0
  9. {zipline_polygon_bundle-0.2.1 → zipline_polygon_bundle-0.2.3}/zipline_polygon_bundle/adjustments.py +0 -0
  10. {zipline_polygon_bundle-0.2.1 → zipline_polygon_bundle-0.2.3}/zipline_polygon_bundle/bundle.py +0 -0
  11. {zipline_polygon_bundle-0.2.1 → zipline_polygon_bundle-0.2.3}/zipline_polygon_bundle/compute_signals.py +0 -0
  12. {zipline_polygon_bundle-0.2.1 → zipline_polygon_bundle-0.2.3}/zipline_polygon_bundle/concat_all_aggs.py +0 -0
  13. {zipline_polygon_bundle-0.2.1 → zipline_polygon_bundle-0.2.3}/zipline_polygon_bundle/concat_all_aggs_partitioned.py +0 -0
  14. {zipline_polygon_bundle-0.2.1 → zipline_polygon_bundle-0.2.3}/zipline_polygon_bundle/nyse_all_hours_calendar.py +0 -0
  15. {zipline_polygon_bundle-0.2.1 → zipline_polygon_bundle-0.2.3}/zipline_polygon_bundle/polygon_file_reader.py +0 -0
  16. {zipline_polygon_bundle-0.2.1 → zipline_polygon_bundle-0.2.3}/zipline_polygon_bundle/process_all_aggs.py +0 -0
  17. {zipline_polygon_bundle-0.2.1 → zipline_polygon_bundle-0.2.3}/zipline_polygon_bundle/quotes.py +0 -0
  18. {zipline_polygon_bundle-0.2.1 → zipline_polygon_bundle-0.2.3}/zipline_polygon_bundle/split_aggs_by_ticker.py +0 -0
  19. {zipline_polygon_bundle-0.2.1 → zipline_polygon_bundle-0.2.3}/zipline_polygon_bundle/tickers_and_names.py +0 -0
@@ -1,6 +1,6 @@
  Metadata-Version: 2.3
  Name: zipline_polygon_bundle
- Version: 0.2.1
+ Version: 0.2.3
  Summary: A zipline-reloaded data provider bundle for Polygon.io
  License: GNU AFFERO GENERAL PUBLIC LICENSE
  Version 3, 19 November 2007
@@ -666,7 +666,7 @@ License: GNU AFFERO GENERAL PUBLIC LICENSE
  Keywords: zipline,data-bundle,finance
  Author: Jim White
  Author-email: jim@fovi.com
- Requires-Python: >=3.10,<4.0
+ Requires-Python: >= 3.10,<4.0
  Classifier: Programming Language :: Python :: 3
  Classifier: License :: OSI Approved :: GNU Affero General Public License v3
  Classifier: Operating System :: OS Independent
@@ -679,17 +679,23 @@ Requires-Dist: polygon-api-client (>=1.14.2)
  Requires-Dist: pyarrow (>=18.1.0,<19)
  Requires-Dist: pytz (>=2018.5)
  Requires-Dist: requests (>=2.9.1)
- Requires-Dist: toolz (>=1)
+ Requires-Dist: toolz (>=0.8.2)
  Requires-Dist: zipline-arrow (>=3.2.2)
  Project-URL: Repository, https://github.com/fovi-llc/zipline-polygon-bundle
  Description-Content-Type: text/markdown

  # zipline-polygon-bundle
- `zipline-polygon-bundle` is a `zipline-reloaded` (https://github.com/stefan-jansen/zipline-reloaded) data ingestion bundle for [Polygon.io](https://polygon.io/).
+ `zipline-polygon-bundle` is a `zipline-arrow` (https://github.com/fovi-llc/zipline-arrow) data ingestion bundle for [Polygon.io](https://polygon.io/).
+
+ Zipline Arrow (`zipline-arrow`) is a fork of Zipline Reloaded (`zipline-reloaded`, https://github.com/stefan-jansen/zipline-reloaded) and is only required if you want to use Polygon.io trades flatfiles. If you only need Polygon daily or minute agg flatfiles, you may want to use `zipline-polygon-bundle<0.2`, which depends on `zipline-reloaded>=3.1`.

  ## GitHub
  https://github.com/fovi-llc/zipline-polygon-bundle

+ ## PyPI
+
+ https://pypi.org/project/zipline_polygon_bundle
+
  ## Resources

  Get a subscription to https://polygon.io/ for an API key and access to flat files.
@@ -706,7 +712,25 @@ Code from *Trading Evolved* with some small updates for convenience: https://git

  One of the modifications I've made to that code lets some of the notebooks run on Colab with a minimum of fuss: https://github.com/fovi-llc/trading_evolved/blob/main/Chapter%207%20-%20Backtesting%20Trading%20Strategies/First%20Zipline%20Backtest.ipynb

- # Ingest data from Polygon.io into Zipline
+ # Zipline Reloaded (`zipline-reloaded`) or Zipline Arrow (`zipline-arrow`)?
+
+ This bundle supports Polygon daily and minute aggregates and now trades too (quotes coming). The trades are converted to minute and daily aggregates for all trading hours (extended both pre and post, as well as regular market). But in order to support those extended hours I needed to change how Zipline handles `get_calendar` for Exchange Calendars (`exchange-calendars`) initialization. To make that work I've forked `zipline-reloaded` as `zipline-arrow`. Versions of this package before 0.2 depend on `zipline-reloaded>=3.1` and only support daily and minute flatfiles. Versions >= 0.2 of `zipline-polygon-bundle` depend on `zipline-arrow` and work with daily and minute flatfiles as well as trades flatfiles.
+
+ # Ingest data from Polygon.io into Zipline using `aws s3` CLI
+ Get the AWS S3 CLI in the usual way: https://docs.aws.amazon.com/cli/latest/reference/s3/
+
+ This will get everything, which is currently around 12TB.
+ ```bash
+ aws s3 sync s3://flatfiles/us_stocks_sip $POLYGON_DATA_DIR/flatfiles/us_stocks_sip --checksum-mode ENABLED --endpoint-url https://files.polygon.io
+ ```
+
+ If you don't need quotes yet (and this bundle doesn't use them yet), syncing only the subdirectories you need (for example `day_aggs_v1`, `minute_aggs_v1`, or `trades_v1`) is faster, since quotes are about twice as large as trades:
+ ```bash
+ aws s3 sync s3://flatfiles/us_stocks_sip/{subdir} $POLYGON_DATA_DIR/flatfiles/us_stocks_sip/{subdir} --checksum-mode ENABLED --endpoint-url https://files.polygon.io
+ ```
+
+ # Alternative: Ingest data using `rclone`.
+ I've had problems with `rclone` on the larger files for trades and quotes, so I recommend using the `aws s3` CLI instead.

  ## Set up your `rclone` (https://rclone.org/) configuration
  ```bash
@@ -741,9 +765,20 @@ register_polygon_equities_bundle(
  )
  ```

+ ## Cython build setup
+
+ ```bash
+ sudo apt-get update
+ sudo apt-get install python3-dev python3-poetry
+
+ CFLAGS=$(python3-config --includes) pip install git+https://github.com/fovi-llc/zipline-arrow.git
+ ```
+
+
  ## Install the Zipline Polygon.io Bundle PyPI package and check that it works.
  Listing bundles will show if everything is working correctly.
  ```bash
+
  pip install -U git+https://github.com/fovi-llc/zipline-reloaded.git@calendar
  pip install -U git+https://github.com/fovi-llc/zipline-polygon-bundle.git

@@ -761,7 +796,7 @@ quantopian-quandl <no ingestions>

  ## Ingest the Polygon.io data. The API key is needed for the split and dividend data.

- Note that ingest currently stores cached API data and shuffled agg data in the `POLYGON_DATA_DIR` directory (`flatfiles/us_stocks_sip/api_cache` and `flatfiles/us_stocks_sip/day_by_ticker_v1` respectively) so write access is needed at this stage. After ingestion the data in `POLYGON_DATA_DIR` is not accessed.
+ Note that ingest currently stores cached API data and shuffled agg ("by ticker") data in the `$CUSTOM_ASSET_FILES_DIR` directory, which is `$ZIPLINE_ROOT/data/polygon_custom_assets` by default.

  ```bash
  export POLYGON_API_KEY=<your API key here>
@@ -795,6 +830,51 @@ This ingestion for 10 years of minute bars took around 10 hours on my Mac using
  zipline ingest -b polygon-minute
  ```

+ ## Using trades flat files.
+ The trades flatfiles take a lot of space (currently around 4TB for the 22 years of trades) and a fair bit of time to convert to minute aggregates. The benefit, though, is that the whole trading day is covered, from premarket open to after-hours close. Also the current conversion logic ignores trade corrections, official close updates, and the TRF "dark pool" trades (because they are not reported when they occurred, nor were they offered on the exchanges). That is to make the aggregates as good a simulation of real time as we can manage for algo training and backtesting. Details are in the `trades_to_custom_aggs` function in `zipline_polygon_bundle/trades.py`.
+
+ The conversion process creates `.csv.gz` files in the same format as Polygon flatfiles in the custom assets dir, which is `$ZIPLINE_ROOT/data/polygon_custom_assets` by default. So while `$ZIPLINE_ROOT` needs to be writable, the Polygon flatfiles (`$POLYGON_DATA_DIR`) can be read-only.
+
+ Get the AWS S3 CLI in the usual way: https://docs.aws.amazon.com/cli/latest/reference/s3/
+
+ ```bash
+ aws s3 sync s3://flatfiles/us_stocks_sip/trades_v1 $POLYGON_DATA_DIR/flatfiles/us_stocks_sip/trades_v1 --checksum-mode ENABLED --endpoint-url https://files.polygon.io
+ ```
+
+ ## `extension.py`
+
+ If you set the `ZIPLINE_ROOT` environment variable (recommended and likely necessary because the default of `~/.zipline` is probably not what you'll want) and copy your `extension.py` config there, then you don't need to put `-e extension.py` on the `zipline` command line.
+
+ If you leave out the `start_date` and/or `end_date` args then `register_polygon_equities_bundle` will scan for the dates of the first and last trade file in `$POLYGON_DATA_DIR` and use them respectively.
+
+ The `NYSE_ALL_HOURS` calendar (defined in `zipline_polygon_bundle/nyse_all_hours_calendar.py`) uses open and close times for the entire trading day, from premarket open to after-hours close.
+
+ Right now `agg_time="1min"` is the only supported aggregate duration because Zipline can only deal with day or minute duration aggregates.
+
+ ```python
+ from zipline_polygon_bundle import register_polygon_equities_bundle, register_nyse_all_hours_calendar, NYSE_ALL_HOURS
+ from exchange_calendars.calendar_helpers import parse_date
+ # from zipline.utils.calendar_utils import get_calendar
+
+ # Register the NYSE_ALL_HOURS ExchangeCalendar.
+ register_nyse_all_hours_calendar()
+
+ register_polygon_equities_bundle(
+     "polygon-trades",
+     calendar_name=NYSE_ALL_HOURS,
+     # start_date=parse_date("2020-01-03", raise_oob=False),
+     # end_date=parse_date("2021-01-29", raise_oob=False),
+     agg_time="1min",
+     minutes_per_day=16 * 60,
+ )
+ ```
+
+ As with the daily and minute aggs, the `POLYGON_API_KEY` is needed for the split and dividend data. Also coming is SID assignment across ticker changes using the Polygon tickers API data.
+
+ ```bash
+ zipline ingest -b polygon-trades
+ ```
+
  # License is Affero General Public License v3 (AGPL v3)
  The content of this project is Copyright (C) 2024 Fovi LLC and authored by James P. White (https://www.linkedin.com/in/jamespaulwhite/). It is distributed under the terms of the GNU AFFERO GENERAL PUBLIC LICENSE (AGPL) Version 3 (See LICENSE file).

@@ -0,0 +1,195 @@
+ # zipline-polygon-bundle
+ `zipline-polygon-bundle` is a `zipline-arrow` (https://github.com/fovi-llc/zipline-arrow) data ingestion bundle for [Polygon.io](https://polygon.io/).
+
+ Zipline Arrow (`zipline-arrow`) is a fork of Zipline Reloaded (`zipline-reloaded`, https://github.com/stefan-jansen/zipline-reloaded) and is only required if you want to use Polygon.io trades flatfiles. If you only need Polygon daily or minute agg flatfiles, you may want to use `zipline-polygon-bundle<0.2`, which depends on `zipline-reloaded>=3.1`.
+
+ ## GitHub
+ https://github.com/fovi-llc/zipline-polygon-bundle
+
+ ## PyPI
+
+ https://pypi.org/project/zipline_polygon_bundle
+
+ ## Resources
+
+ Get a subscription to https://polygon.io/ for an API key and access to flat files.
+
+ https://polygon.io/knowledge-base/article/how-to-get-started-with-s3
+
+ Quantopian's Zipline backtester revived by Stefan Jansen: https://github.com/stefan-jansen/zipline-reloaded
+
+ Stefan's excellent book *Machine Learning for Algorithmic Trading*: https://ml4trading.io/
+
+ *Trading Evolved* by Andreas Clenow is a gentler introduction to Zipline Reloaded: https://www.followingthetrend.com/trading-evolved/
+
+ Code from *Trading Evolved* with some small updates for convenience: https://github.com/fovi-llc/trading_evolved
+
+ One of the modifications I've made to that code lets some of the notebooks run on Colab with a minimum of fuss: https://github.com/fovi-llc/trading_evolved/blob/main/Chapter%207%20-%20Backtesting%20Trading%20Strategies/First%20Zipline%20Backtest.ipynb
+
+ # Zipline Reloaded (`zipline-reloaded`) or Zipline Arrow (`zipline-arrow`)?
+
+ This bundle supports Polygon daily and minute aggregates and now trades too (quotes coming). The trades are converted to minute and daily aggregates for all trading hours (extended both pre and post, as well as regular market). But in order to support those extended hours I needed to change how Zipline handles `get_calendar` for Exchange Calendars (`exchange-calendars`) initialization. To make that work I've forked `zipline-reloaded` as `zipline-arrow`. Versions of this package before 0.2 depend on `zipline-reloaded>=3.1` and only support daily and minute flatfiles. Versions >= 0.2 of `zipline-polygon-bundle` depend on `zipline-arrow` and work with daily and minute flatfiles as well as trades flatfiles.
+
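To make the version split concrete, here is a minimal sketch of the two install paths described above (the pip constraints are inferred from this README rather than quoted from it):

```bash
# Daily/minute agg flatfiles only: stays on zipline-reloaded.
pip install "zipline-polygon-bundle<0.2" "zipline-reloaded>=3.1"

# Trades flatfiles as well: pulls in the zipline-arrow fork instead.
pip install "zipline-polygon-bundle>=0.2"
```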
+ # Ingest data from Polygon.io into Zipline using `aws s3` CLI
+ Get the AWS S3 CLI in the usual way: https://docs.aws.amazon.com/cli/latest/reference/s3/
+
+ This will get everything, which is currently around 12TB.
+ ```bash
+ aws s3 sync s3://flatfiles/us_stocks_sip $POLYGON_DATA_DIR/flatfiles/us_stocks_sip --checksum-mode ENABLED --endpoint-url https://files.polygon.io
+ ```
+
+ If you don't need quotes yet (and this bundle doesn't use them yet), syncing only the subdirectories you need (for example `day_aggs_v1`, `minute_aggs_v1`, or `trades_v1`) is faster, since quotes are about twice as large as trades:
+ ```bash
+ aws s3 sync s3://flatfiles/us_stocks_sip/{subdir} $POLYGON_DATA_DIR/flatfiles/us_stocks_sip/{subdir} --checksum-mode ENABLED --endpoint-url https://files.polygon.io
+ ```
+
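For example, an illustrative expansion of the `{subdir}` placeholder above, using subdirectory names that appear elsewhere in this README:

```bash
# Sync only the flatfile subdirectories this bundle actually uses.
for subdir in day_aggs_v1 minute_aggs_v1 trades_v1; do
  aws s3 sync s3://flatfiles/us_stocks_sip/$subdir \
    $POLYGON_DATA_DIR/flatfiles/us_stocks_sip/$subdir \
    --checksum-mode ENABLED --endpoint-url https://files.polygon.io
done
```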
+ # Alternative: Ingest data using `rclone`.
+ I've had problems with `rclone` on the larger files for trades and quotes, so I recommend using the `aws s3` CLI instead.
+
+ ## Set up your `rclone` (https://rclone.org/) configuration
+ ```bash
+ export POLYGON_FILE_ENDPOINT=https://files.polygon.io/
+ rclone config create s3polygon s3 env_auth=false endpoint=$POLYGON_FILE_ENDPOINT \
+     access_key_id=$POLYGON_S3_Access_ID secret_access_key=$POLYGON_Secret_Access_Key
+ ```
+
+ ## Get flat files (`*.csv.gz`) for US Stock daily aggregates.
+ The default asset dir is `us_stock_sip` but that can be overridden with the `POLYGON_ASSET_SUBDIR`
+ environment variable if/when Polygon.io adds other markets to flat files.
+
+ ```bash
+ export POLYGON_DATA_DIR=`pwd`/data/files.polygon.io
+ for year in 2024 2023 2022 2021; do \
+     rclone copy -P s3polygon:flatfiles/us_stocks_sip/day_aggs_v1/$year \
+     $POLYGON_DATA_DIR/flatfiles/us_stocks_sip/day_aggs_v1/$year; \
+ done
+ ```
+
+ ## `extension.py`
+
+ ```python
+ from zipline_polygon_bundle import register_polygon_equities_bundle
+
+ # All tickers (>20K) are ingested. Filtering is TBD.
+ # `start_session` and `end_session` can be set to ingest a range of dates (which must be market days).
+ register_polygon_equities_bundle(
+     "polygon",
+     calendar_name="XNYS",
+     agg_time="day"
+ )
+ ```
+
+ ## Cython build setup
+
+ ```bash
+ sudo apt-get update
+ sudo apt-get install python3-dev python3-poetry
+
+ CFLAGS=$(python3-config --includes) pip install git+https://github.com/fovi-llc/zipline-arrow.git
+ ```
+
+
+ ## Install the Zipline Polygon.io Bundle PyPI package and check that it works.
+ Listing bundles will show if everything is working correctly.
+ ```bash
+
+ pip install -U git+https://github.com/fovi-llc/zipline-reloaded.git@calendar
+ pip install -U git+https://github.com/fovi-llc/zipline-polygon-bundle.git
+
+ pip install zipline_polygon_bundle
+ zipline -e extension.py bundles
+ ```
+ stdout:
+ ```
+ csvdir <no ingestions>
+ polygon <no ingestions>
+ polygon-minute <no ingestions>
+ quandl <no ingestions>
+ quantopian-quandl <no ingestions>
+ ```
+
+ ## Ingest the Polygon.io data. The API key is needed for the split and dividend data.
+
+ Note that ingest currently stores cached API data and shuffled agg ("by ticker") data in the `$CUSTOM_ASSET_FILES_DIR` directory, which is `$ZIPLINE_ROOT/data/polygon_custom_assets` by default.
+
+ ```bash
+ export POLYGON_API_KEY=<your API key here>
+ zipline -e extension.py ingest -b polygon
+ ```
+
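Since several environment variables interact at this step, a consolidated sketch may help; the paths are illustrative and the defaults are inferred from this README and `config.py`:

```bash
export ZIPLINE_ROOT=/data/zipline               # zipline's working dir; default is ~/.zipline
export POLYGON_DATA_DIR=/data/files.polygon.io  # where the flatfiles were synced; can be read-only
# Optional override; this is the default location for cached API data and "by ticker" aggs.
export CUSTOM_ASSET_FILES_DIR=$ZIPLINE_ROOT/data/polygon_custom_assets
export POLYGON_API_KEY=<your API key here>
zipline -e extension.py ingest -b polygon
```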
+ ### Cleaning up bad ingests
+ After a while you may wind up with old bundles (or empty ones from an error during ingestion) cluttering
+ up the list and wasting space (although old bundles may be useful for rerunning old backtests).
+ To remove all but the last ingest (say after your first successful ingest after a number of false starts) you could use:
+ ```bash
+ zipline -e extension.py clean -b polygon --keep-last 1
+ ```
+
+ ## Using minute aggregate flat files.
+ Minute aggs work too, but everything takes more space and a lot longer to do.
+
+ ```bash
+ export POLYGON_DATA_DIR=`pwd`/data/files.polygon.io
+ for year in 2024 2023 2022 2021; do \
+     rclone copy -P s3polygon:flatfiles/us_stocks_sip/minute_aggs_v1/$year \
+     $POLYGON_DATA_DIR/flatfiles/us_stocks_sip/minute_aggs_v1/$year; \
+ done
+ ```
+
+ If you set the `ZIPLINE_ROOT` environment variable (recommended and likely necessary because the default of `~/.zipline` is probably not what you'll want) and copy your `extension.py` config there, then you don't need to put `-e extension.py` on the `zipline` command line.
+
+ This ingestion for 10 years of minute bars took around 10 hours on my Mac using an external hard drive (not SSD). A big chunk of that was copying from the default tmp dir to the Zipline root (6.3 million files for 47GB actual, 63GB used). I plan to change that `shutil.copy2` to use `shutil.move` and to use a `tmp` dir in the Zipline root for temporary files instead of the default, which should save an hour or two. Also the ingestion process is single-threaded and could be sped up with some concurrency.
+
+ ```bash
+ zipline ingest -b polygon-minute
+ ```
+
+ ## Using trades flat files.
+ The trades flatfiles take a lot of space (currently around 4TB for the 22 years of trades) and a fair bit of time to convert to minute aggregates. The benefit, though, is that the whole trading day is covered, from premarket open to after-hours close. Also the current conversion logic ignores trade corrections, official close updates, and the TRF "dark pool" trades (because they are not reported when they occurred, nor were they offered on the exchanges). That is to make the aggregates as good a simulation of real time as we can manage for algo training and backtesting. Details are in the `trades_to_custom_aggs` function in `zipline_polygon_bundle/trades.py`.
+
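The gist of that conversion, as a minimal pandas sketch rather than the bundle's actual implementation (the column names `ticker`, `sip_timestamp`, `price`, and `size` are my assumption about the trades flatfile schema, and the path is illustrative):

```python
import pandas as pd

# Read one day of trades (path and columns are illustrative).
trades = pd.read_csv(
    "trades_v1/2024/01/2024-01-02.csv.gz",
    usecols=["ticker", "sip_timestamp", "price", "size"],
)

# sip_timestamp is nanoseconds since the epoch (UTC); bucket each trade into its minute.
trades["window_start"] = pd.to_datetime(
    trades["sip_timestamp"], unit="ns", utc=True
).dt.floor("1min")

# Aggregate each (ticker, minute) bucket into an OHLCV bar.
minute_bars = (
    trades.sort_values("sip_timestamp")
    .groupby(["ticker", "window_start"])
    .agg(
        open=("price", "first"),
        high=("price", "max"),
        low=("price", "min"),
        close=("price", "last"),
        volume=("size", "sum"),
        transactions=("price", "count"),
    )
    .reset_index()
)
```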
150
+ The conversion process creates `.csv.gz` files in the same format as Polygon flatfiles in the custom assets dir, which is `$ZIPLINE_ROOT/data/polygon_custom_assets` by default. So while `$ZIPLINE_ROOT` needs to be writable, the Polygon flatfiles (`$POLYGON_DATA_DIR`) can be read-only.
151
+
152
+ Get AWS S3 CLI in the usual way: https://docs.aws.amazon.com/cli/latest/reference/s3/
153
+
154
+ ```bash
155
+ aws s3 sync s3://flatfiles/us_stocks_sip/trades_v1 $POLYGON_DATA_DIR/flatfiles/us_stocks_sip/trades_v1 --checksum-mode ENABLED --endpoint-url https://files.polygon.io
156
+ ```
157
+
158
+ ## `extension.py`
159
+
160
+ If you set the `ZIPLINE_ROOT` environment variable (recommended and likely necessary because the default of `~/.zipline` is probably not what you'll want) and copy your `extension.py` config there then you don't need to put `-e extension.py` on the `zipline` command line.
161
+
162
+ If you leave out the `start_date` and/or `end_date` args then `register_polygon_equities_bundle` will scan for the dates of the first and last trade file in `$POLYGON_DATA_DIR` and use them respectively.
163
+
164
+ The `NYSE_ALL_HOURS` calendar (defined in `zipline_polygon_bundle/nyse_all_hours_calendar.py`) uses open and close times for the entire trading day from premarket open to after hours close.
165
+
166
+ Right now `agg_time="1min"` is the only supported aggregate duration because Zipline can only deal with day or minute duration aggregates.
167
+
168
+ ```python
169
+ from zipline_polygon_bundle import register_polygon_equities_bundle, register_nyse_all_hours_calendar, NYSE_ALL_HOURS
170
+ from exchange_calendars.calendar_helpers import parse_date
171
+ # from zipline.utils.calendar_utils import get_calendar
172
+
173
+ # Register the NYSE_ALL_HOURS ExchangeCalendar.
174
+ register_nyse_all_hours_calendar()
175
+
176
+ register_polygon_equities_bundle(
177
+ "polygon-trades",
178
+ calendar_name=NYSE_ALL_HOURS,
179
+ # start_date=parse_date("2020-01-03", raise_oob=False),
180
+ # end_date=parse_date("2021-01-29", raise_oob=False),
181
+ agg_time="1min",
182
+ minutes_per_day=16 * 60,
183
+ )
184
+ ```
185
+
186
+ As with the daily and minute aggs, the POLYGON_API_KEY is needed for the split and dividend data. Also coming is SID assignment across ticker changes using the Polygon tickers API data.
187
+
188
+ ```bash
189
+ zipline ingest -b polygon-trades
190
+ ```
191
+
192
+ # License is Affero General Public License v3 (AGPL v3)
193
+ The content of this project is Copyright (C) 2024 Fovi LLC and authored by James P. White (https://www.linkedin.com/in/jamespaulwhite/). It is distributed under the terms of the GNU AFFERO GENERAL PUBLIC LICENSE (AGPL) Version 3 (See LICENSE file).
194
+
195
+ The AGPL doesn't put any restrictions on personal use but people using this in a service for others have obligations. If you have commerical purposes and those distribution requirements don't work for you, feel free to contact me (mailto:jim@fovi.com) about other licensing terms.
@@ -1,6 +1,6 @@
  [project]
  name = 'zipline_polygon_bundle'
- version = '0.2.1'
+ version = '0.2.3'
  description = 'A zipline-reloaded data provider bundle for Polygon.io'
  authors = [
      { name = 'Jim White', email = 'jim@fovi.com' },
@@ -14,42 +14,32 @@ classifiers = [
      'Operating System :: OS Independent',
  ]

- [project.urls]
- Repository = 'https://github.com/fovi-llc/zipline-polygon-bundle'
+ requires-python = ">= 3.10,<4.0"

- [tool.poetry]
- name = 'zipline-polygon-bundle'
- version = '0.2.1'
- description = 'A zipline-reloaded data provider bundle for Polygon.io'
- authors = ['Jim White <jim@fovi.com>']
- license = 'AGPL-3.0'
- readme = 'README.md'
- keywords = ['zipline', 'data-bundle', 'finance']
- classifiers = [
-     'Programming Language :: Python :: 3',
-     'License :: OSI Approved :: GNU Affero General Public License v3',
-     'Operating System :: OS Independent',
+ dependencies = [
+     "fsspec>=2024.10",
+     "filelock>=3.16.0",
+     "polygon-api-client>=1.14.2",
+     "pandas>=2.2,<3",
+     # "pandas-market-calendars>=4.4.2",
+     # "pandas-ta>=0.3", # pandas-ta install doesn't work with poetry for some reason.
+     # It is used in compute_signals.py which we're not using yet.
+     "pytz>=2018.5",
+     "requests>=2.9.1",
+     "bcolz-zipline>=1.2.11",
+     # There was an issue in PyArrow 19 which is probably fixed but don't remember how to test it.
+     "pyarrow>=18.1.0,<19",
+     "numpy<2",
+     "toolz>=0.8.2",
+     "zipline-arrow>=3.2.2",
+     # "zipline-arrow = { git = 'https://github.com/fovi-llc/zipline-arrow.git' }"
  ]

- [tool.poetry.dependencies]
- fsspec = ">=2024.10"
- filelock = ">=3.16.0"
- python = ">=3.10,<4.0"
- polygon-api-client = ">=1.14.2"
- pandas = ">=2.2,<3"
- # pandas-market-calendars = ">=4.4.2"
- # pandas-ta install doesn't work with poetry for some reason.
- # It is used in compute_signals.py which we're not using yet.
- # pandas-ta = ">=0.3"
- pytz = ">=2018.5"
- requests = ">=2.9.1"
- bcolz-zipline = ">=1.2.11"
- pyarrow = ">=18.1.0,<19"
- numpy = "<2"
- toolz = ">=1"
- zipline-arrow = { version = ">=3.2.2" }
+ [project.urls]
+ Repository = 'https://github.com/fovi-llc/zipline-polygon-bundle'
+

- [tool.poetry.dev-dependencies]
+ [poetry.group.dev.dependencies]
  pytest = "*"

  [build-system]
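The practical effect of moving the metadata from the `[tool.poetry.*]` tables to the standard `[project]` table is that non-Poetry tooling can consume it as well. A hedged sketch, assuming the `[build-system]` table (whose contents are not shown in this hunk) still declares a PEP 517 backend:

```bash
# Build a wheel/sdist with the generic PEP 517 front end and install the result.
pip install build
python -m build
pip install dist/zipline_polygon_bundle-0.2.3*.whl   # filename is illustrative
```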
@@ -3,7 +3,7 @@ from zipline.utils.calendar_utils import get_calendar

  from .nyse_all_hours_calendar import NYSE_ALL_HOURS

- from typing import Iterator, Tuple
+ from typing import Iterator, Mapping, Tuple

  import pandas as pd
  from pyarrow.fs import LocalFileSystem
@@ -38,7 +38,7 @@ def to_partition_key(s: str) -> str:
  class PolygonConfig:
      def __init__(
          self,
-         environ: dict,
+         environ: Mapping[str, str],
          calendar_name: str,
          start_date: Date,
          end_date: Date,
@@ -71,17 +71,6 @@ class PolygonConfig:
          )
          self.market = environ.get("POLYGON_MARKET", "stocks")
          self.asset_subdir = environ.get("POLYGON_ASSET_SUBDIR", "us_stocks_sip")
-         self.tickers_dir = environ.get(
-             "POLYGON_TICKERS_DIR",
-             os.path.join(os.path.join(self.data_dir, "tickers"), self.asset_subdir),
-         )
-         self.tickers_csv_path = environ.get(
-             "POLYGON_TICKERS_CSV",
-             os.path.join(
-                 self.tickers_dir,
-                 f"tickers_{self.start_timestamp.date().isoformat()}_{self.end_timestamp.date().isoformat()}.csv",
-             ),
-         )
          self.flat_files_dir = environ.get(
              "POLYGON_FLAT_FILES_DIR", os.path.join(self.data_dir, "flatfiles")
          )
@@ -101,6 +90,17 @@ class PolygonConfig:
          self.custom_asset_files_dir = environ.get(
              "CUSTOM_ASSET_FILES_DIR", self.asset_files_dir
          )
+         self.tickers_dir = environ.get(
+             "POLYGON_TICKERS_DIR",
+             os.path.join(self.custom_asset_files_dir, "tickers"),
+         )
+         self.tickers_csv_path = environ.get(
+             "POLYGON_TICKERS_CSV",
+             os.path.join(
+                 self.tickers_dir,
+                 f"tickers_{self.start_timestamp.date().isoformat()}_{self.end_timestamp.date().isoformat()}.csv",
+             ),
+         )

          self.cache_dir = os.path.join(self.custom_asset_files_dir, "api_cache")

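For orientation, a hedged sketch of how these relocated defaults surface to a caller, assuming `PolygonConfig` can be constructed with just the arguments visible in this diff and that ISO date strings are acceptable for the `Date` parameters (neither is confirmed by the diff):

```python
import os

from zipline_polygon_bundle.config import PolygonConfig

# Illustrative only; the calendar and dates are placeholders.
config = PolygonConfig(
    environ=os.environ,  # now accepts any Mapping[str, str], not just a dict
    calendar_name="XNYS",
    start_date="2020-01-03",
    end_date="2021-01-29",
)

# tickers_dir now defaults to <custom_asset_files_dir>/tickers
# ($ZIPLINE_ROOT/data/polygon_custom_assets/tickers by default) instead of
# <data_dir>/tickers/<asset_subdir>, so the Polygon flatfiles dir can stay read-only.
print(config.tickers_dir)
print(config.tickers_csv_path)
```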
@@ -116,6 +116,7 @@ def custom_aggs_schema(raw: bool = False) -> pa.Schema:
          pa.field("low", price_type, nullable=False),
          pa.field("window_start", timestamp_type, nullable=False),
          pa.field("transactions", pa.int64(), nullable=False),
+         pa.field("vwap", price_type, nullable=False),
          pa.field("date", pa.date32(), nullable=False),
          pa.field("year", pa.uint16(), nullable=False),
          pa.field("month", pa.uint8(), nullable=False),
@@ -366,6 +367,7 @@ def batches_for_date(aggs_ds: pa_ds.Dataset, date: pd.Timestamp):
      table = table.sort_by([("part", "ascending"), ("ticker", "ascending"), ("window_start", "ascending"), ])
      return table.to_batches()

+
  def generate_batches_for_schedule(config, aggs_ds):
      schedule = config.calendar.trading_index(
          start=config.start_timestamp, end=config.end_timestamp, period="1D"
      )
@@ -1,115 +0,0 @@
- # zipline-polygon-bundle
- `zipline-polygon-bundle` is a `zipline-reloaded` (https://github.com/stefan-jansen/zipline-reloaded) data ingestion bundle for [Polygon.io](https://polygon.io/).
-
- ## GitHub
- https://github.com/fovi-llc/zipline-polygon-bundle
-
- ## Resources
-
- Get a subscription to https://polygon.io/ for an API key and access to flat files.
-
- https://polygon.io/knowledge-base/article/how-to-get-started-with-s3
-
- Quantopian's Zipline backtester revived by Stefan Jansen: https://github.com/stefan-jansen/zipline-reloaded
-
- Stefan's excellent book *Machine Learning for Algorithmic Trading*: https://ml4trading.io/
-
- *Trading Evolved* by Andreas Clenow is a gentler introduction to Zipline Reloaded: https://www.followingthetrend.com/trading-evolved/
-
- Code from *Trading Evolved* with some small updates for convenience: https://github.com/fovi-llc/trading_evolved
-
- One of the modifications I've made to that code is so that some of the notebooks can be run on Colab with a minimum of fuss: https://github.com/fovi-llc/trading_evolved/blob/main/Chapter%207%20-%20Backtesting%20Trading%20Strategies/First%20Zipline%20Backtest.ipynb
-
- # Ingest data from Polygon.io into Zipline
-
- ## Set up your `rclone` (https://rclone.org/) configuration
- ```bash
- export POLYGON_FILE_ENDPOINT=https://files.polygon.io/
- rclone config create s3polygon s3 env_auth=false endpoint=$POLYGON_FILE_ENDPOINT \
-     access_key_id=$POLYGON_S3_Access_ID secret_access_key=$POLYGON_Secret_Access_Key
- ```
-
- ## Get flat files (`*.csv.gz`) for US Stock daily aggregates.
- The default asset dir is `us_stock_sip` but that can be overriden with the `POLYGON_ASSET_SUBDIR`
- environment variable if/when Polygon.io adds other markets to flat files.
-
- ```bash
- export POLYGON_DATA_DIR=`pwd`/data/files.polygon.io
- for year in 2024 2023 2022 2021; do \
-     rclone copy -P s3polygon:flatfiles/us_stocks_sip/day_aggs_v1/$year \
-     $POLYGON_DATA_DIR/flatfiles/us_stocks_sip/day_aggs_v1/$year; \
- done
- ```
-
- ## `extension.py`
-
- ```python
- from zipline_polygon_bundle import register_polygon_equities_bundle
-
- # All tickers (>20K) are ingested. Filtering is TBD.
- # `start_session` and `end_session` can be set to ingest a range of dates (which must be market days).
- register_polygon_equities_bundle(
-     "polygon",
-     calendar_name="XNYS",
-     agg_time="day"
- )
- ```
-
- ## Install the Zipline Polygon.io Bundle PyPi package and check that it works.
- Listing bundles will show if everything is working correctly.
- ```bash
- pip install -U git+https://github.com/fovi-llc/zipline-reloaded.git@calendar
- pip install -U git+https://github.com/fovi-llc/zipline-polygon-bundle.git
-
- pip install zipline_polygon_bundle
- zipline -e extension.py bundles
- ```
- stdout:
- ```
- csvdir <no ingestions>
- polygon <no ingestions>
- polygon-minute <no ingestions>
- quandl <no ingestions>
- quantopian-quandl <no ingestions>
- ```
-
- ## Ingest the Polygon.io data. The API key is needed for the split and dividend data.
-
- Note that ingest currently stores cached API data and shuffled agg data in the `POLYGON_DATA_DIR` directory (`flatfiles/us_stocks_sip/api_cache` and `flatfiles/us_stocks_sip/day_by_ticker_v1` respectively) so write access is needed at this stage. After ingestion the data in `POLYGON_DATA_DIR` is not accessed.
-
- ```bash
- export POLYGON_API_KEY=<your API key here>
- zipline -e extension.py ingest -b polygon
- ```
-
- ### Cleaning up bad ingests
- After a while you may wind up with old (or empty because of an error during ingestion) bundles cluttering
- up the list and could waste space (although old bundles may be useful for rerunning old backtests).
- To remove all but the last ingest (say after your first successful ingest after a number of false starts) you could use:
- ```bash
- zipline -e extension.py clean -b polygon --keep-last 1
- ```
-
- ## Using minute aggregate flat files.
- Minute aggs work too but everything takes more space and a lot longer to do.
-
- ```bash
- export POLYGON_DATA_DIR=`pwd`/data/files.polygon.io
- for year in 2024 2023 2022 2021; do \
-     rclone copy -P s3polygon:flatfiles/us_stocks_sip/minute_aggs_v1/$year \
-     $POLYGON_DATA_DIR/flatfiles/us_stocks_sip/minute_aggs_v1/$year; \
- done
- ```
-
- If you set the `ZIPLINE_ROOT` environment variable (recommended and likely necessary because the default of `~/.zipline` is probably not what you'll want) and copy your `extension.py` config there then you don't need to put `-e extension.py` on the `zipline` command line.
-
- This ingestion for 10 years of minute bars took around 10 hours on my Mac using an external hard drive (not SSD). A big chunk of that was copying from the default tmp dir to the Zipline root (6.3million files for 47GB actual, 63GB used). I plan to change that `shutil.copy2` to use `shutil.move` and to use a `tmp` dir in Zipline root for temporary files instead of the default which should save an hour or two. Also the ingestion process is single threaded and could be sped up with some concurrency.
-
- ```bash
- zipline ingest -b polygon-minute
- ```
-
- # License is Affero General Public License v3 (AGPL v3)
- The content of this project is Copyright (C) 2024 Fovi LLC and authored by James P. White (https://www.linkedin.com/in/jamespaulwhite/). It is distributed under the terms of the GNU AFFERO GENERAL PUBLIC LICENSE (AGPL) Version 3 (See LICENSE file).
-
- The AGPL doesn't put any restrictions on personal use but people using this in a service for others have obligations. If you have commerical purposes and those distribution requirements don't work for you, feel free to contact me (mailto:jim@fovi.com) about other licensing terms.