viewinline 0.2.0__tar.gz → 0.2.1__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,4 +1,4 @@
- Copyright 2025 Keiko Nomura
+ Copyright 2026 Keiko Nomura
 
  Apache License
  Version 2.0, January 2004
@@ -1,6 +1,6 @@
  Metadata-Version: 2.4
  Name: viewinline
- Version: 0.2.0
+ Version: 0.2.1
  Summary: Quick look geospatial viewer for iTerm2 compatible terminals
  Project-URL: Homepage, https://github.com/nkeikon/viewinline
  Project-URL: Repository, https://github.com/nkeikon/viewinline
@@ -12,9 +12,18 @@ Requires-Python: >=3.9
  Requires-Dist: geopandas
  Requires-Dist: matplotlib
  Requires-Dist: numpy
+ Requires-Dist: pandas
  Requires-Dist: pillow
  Requires-Dist: pyogrio
  Requires-Dist: rasterio
+ Provides-Extra: all
+ Requires-Dist: duckdb; extra == 'all'
+ Requires-Dist: h5py; extra == 'all'
+ Requires-Dist: pyarrow; extra == 'all'
+ Provides-Extra: hdf5
+ Requires-Dist: h5py; extra == 'hdf5'
+ Provides-Extra: parquet
+ Requires-Dist: pyarrow; extra == 'parquet'
  Provides-Extra: sql
  Requires-Dist: duckdb; extra == 'sql'
  Description-Content-Type: text/markdown
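The `Provides-Extra`/`Requires-Dist` pairs added in this hunk use PEP 508 environment markers. As a toy illustration (not how pip actually resolves them — real tools use the `packaging` library), a stdlib-only sketch of grouping these metadata lines by extra:

```python
# Group Requires-Dist entries from PKG-INFO-style metadata by extra.
# Toy parser for illustration only; real resolvers use `packaging`.
PKG_INFO_LINES = [
    "Requires-Dist: duckdb; extra == 'all'",
    "Requires-Dist: h5py; extra == 'all'",
    "Requires-Dist: pyarrow; extra == 'all'",
    "Requires-Dist: h5py; extra == 'hdf5'",
    "Requires-Dist: pyarrow; extra == 'parquet'",
    "Requires-Dist: duckdb; extra == 'sql'",
]

def extras_map(lines):
    extras = {}
    for line in lines:
        req = line.split(":", 1)[1].strip()  # e.g. "duckdb; extra == 'all'"
        if ";" in req:
            dep, marker = (part.strip() for part in req.split(";", 1))
            if marker.startswith("extra =="):
                extra = marker.split("==", 1)[1].strip().strip("'\"")
                extras.setdefault(extra, []).append(dep)
    return extras

print(extras_map(PKG_INFO_LINES))
```

This reproduces the four extras declared above: `all`, `hdf5`, `parquet`, and `sql`.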
@@ -25,10 +34,12 @@ Description-Content-Type: text/markdown
  [![Python version](https://img.shields.io/badge/python-%3E%3D3.9-blue.svg)](https://pypi.org/project/viewinline/)
 
  **Quick-look geospatial viewer for compatible terminals.**
- Displays rasters, vectors, and CSV data directly in the terminal with no GUI and no temporary files.
+ Displays rasters, vectors, and tabular data directly in the terminal with no GUI and no temporary files.
 
  Think of it as `ls` for geospatial files — designed for quick visual inspection at the command line, not a replacement for QGIS, ArcGIS, or analytical workflows.
 
+ Particularly useful on HPC systems and remote servers accessed via SSH. Images render on your local terminal without X11 forwarding, VNC, or file downloads.
+
  This tool combines the core display logic of `viewtif` and `viewgeom`, but is **non-interactive**: you can't zoom, pan, or switch colormaps on the fly. Instead, you control everything through command-line options (e.g. --display, --color-by, --colormap).
 
  It uses the iTerm2 inline image protocol (OSC 1337) to render previews. In incompatible terminals, the escape codes are silently ignored with no errors or crashes.
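The OSC 1337 sequence mentioned above is simple to emit by hand. This is a minimal stdlib-only sketch of the escape-sequence framing (illustrative, not viewinline's actual implementation):

```python
import base64

def iterm2_inline_image(png_bytes: bytes, name: str = "preview.png") -> str:
    """Wrap image bytes in an iTerm2 OSC 1337 inline-image escape sequence.

    Terminals that speak the protocol render the image; others discard the
    sequence, which is why no terminal detection is needed.
    """
    payload = base64.b64encode(png_bytes).decode("ascii")
    b64_name = base64.b64encode(name.encode()).decode("ascii")
    # Framing: ESC ] 1337 ; File = <args> : <base64 data> BEL
    return (
        f"\x1b]1337;File=name={b64_name};size={len(png_bytes)};inline=1:"
        f"{payload}\x07"
    )
```

Writing the returned string to stdout in iTerm2 or WezTerm displays the image inline; in other terminals the bytes are ignored.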
@@ -42,77 +53,130 @@ pip install viewinline
 
  ## Usage
  ```bash
+ # Rasters
  viewinline path/to/file.tif
- viewinline path/to/vector.geojson
- viewinline R.tif G.tif B.tif # RGB composite
- viewinline path/to/multiband.tif --rgb-bands 3,2,1
+ viewinline R.tif G.tif B.tif # RGB composite (also works with --rgbfiles)
+ viewinline path/to/multiband.tif --rgb 3 2 1
  viewinline path/to/folder --gallery 4x3 # show image gallery (e.g. 4x3 grid)
+
+ # NetCDF and HDF
+ viewinline file.nc # list variables
+ viewinline file.nc --subset 2 # display variable 2
+ viewinline file.nc --subset 1 --band 10 # variable 1, slice 10 (--band or --timestep)
+ viewinline temp.nc --subset 1 --colormap plasma --vmin 273 --vmax 310
+
+ # Vectors
+ viewinline path/to/vector.geojson
+ viewinline boundaries.geoparquet --color-by population --colormap viridis
+
+ # CSV and Parquet
  viewinline data.csv # preview rows and columns
- viewinline data.csv --describe # summary statistics for all numeric columns
- viewinline data.csv --describe Income # summary statistics for one column
+ viewinline data.parquet --describe # summary statistics
  viewinline data.csv --hist # histograms for all numeric columns
  viewinline data.csv --hist area_km2 # histogram for one column
  viewinline data.csv --scatter X Y # scatter plot
  viewinline data.csv --where "year > 2010" # filter rows
- viewinline data.csv --sort population # sort rows
+ viewinline data.csv --sort population --desc # sort rows
  viewinline data.csv --sql "SELECT * FROM data WHERE area > 100 ORDER BY year" # full SQL
+
+ # Tabular view of vectors
+ viewinline counties.shp --table # view shapefile as table
+ viewinline counties.shp --table --describe # summary statistics
+ viewinline counties.shp --table --unique STATE_NAME
+ viewinline data.geoparquet --table --where "POP > 100000" --sort POP --desc
  ```
 
  ## Compatible terminals
 
- The iTerm2 inline image protocol is supported by:
+ The iTerm2 inline image protocol (OSC 1337) is supported by:
 
  - **iTerm2** (macOS)
  - **WezTerm** (cross-platform)
  - **Konsole** (Linux/KDE)
- - **Rio**, **Contour** (Linux)
+ - **Rio**, **Contour** (cross-platform)
+
+ **Not compatible:** Mac Terminal, GNOME Terminal, Kitty (uses a different protocol), Ghostty, Alacritty
 
- Not supported: Mac Terminal, GNOME Terminal, Kitty, Ghostty, Alacritty.
+ **SSH/HPC usage:** Works over SSH when connecting from a compatible terminal. Images render on your local machine, not the remote server. No X11 forwarding or VNC required.
 
- > **Note:** Does not work inside tmux or screen.
+ **tmux/screen:** Inline images don't work inside tmux or screen sessions, even with `allow-passthrough on`. Use a plain terminal tab.
 
  ## Features
- - Previews rasters, vectors and CSV files directly in the terminal
- - Non-interactive: everything is controlled through command-line options
+ - Previews rasters, vectors, and tabular data directly in the terminal
+ - Non-interactive: everything is controlled through command-line options
+ - **NetCDF/HDF support:** Display variables from NetCDF (.nc) and HDF5 (.h5, .hdf5) files with automatic nodata detection and multi-slice navigation
+ - **Parquet/GeoParquet:** Render GeoParquet as vector maps or view as tabular data
+ - **Tabular view for vectors:** Use `--table` to access CSV-style operations (filter, sort, describe, hist) on any vector file
 
  ## Supported formats
  **Rasters**
  - GeoTIFF (.tif, .tiff)
  - PNG, JPEG (.png, .jpg, .jpeg)
+ - NetCDF (.nc)
+ - HDF5 (.h5, .hdf5)
+ - HDF4 (.hdf) — requires GDAL with HDF4 support
  - Single-band or multi-band composites
 
  **Composite inputs**
- - You can pass three rasters (e.g. `R.tif G.tif B.tif`) to create an RGB composite
+ - You can pass three rasters (e.g. `R.tif G.tif B.tif`) or use `--rgbfiles R.tif G.tif B.tif` to create an RGB composite
+ - Multi-band files: use `--rgb 3 2 1` to specify band order
 
  **Vectors**
  - GeoJSON (`.geojson`)
- - Shapefile (`.shp`, `.dbf`, `.shx`)
- - GeoPackage (`.gpkg`)
-
- **CSV**
+ - Shapefile (`.shp`)
+ - GeoPackage (`.gpkg`)
+ - Parquet/GeoParquet (`.parquet`, `.geoparquet`)
+
+ **Tabular data (CSV and Parquet)**
+ - CSV (`.csv`)
+ - Parquet (`.parquet`) — requires `pyarrow`
+ - All CSV operations work on parquet files
  - Preview file summary (rows, columns, and names)
  - Summary statistics with `--describe`
-   - Show all numeric columns, or specify one (e.g. `--describe height`)
  - Inline histograms with `--hist`
-   - Show all numeric columns, or specify one (e.g. `--hist area_km2`)
  - Scatter plots with `--scatter X Y`
  - Filter rows with `--where`, sort with `--sort`, limit output with `--limit`
- - Full SQL queries with `--sql` (DuckDB required) — use `data` as the table name (e.g. `--sql "SELECT State, AVG(Income) FROM data GROUP BY State"`)
+ - Full SQL queries with `--sql` (DuckDB required) — use `data` as the table name
+
+ **Tabular view of vectors**
+ - Use the `--table` flag to view any vector file (shapefiles, GeoPackage, GeoParquet) as tabular data
+ - Enables all CSV-style operations: `--describe`, `--hist`, `--scatter`, `--unique`, `--where`, `--sort`
 
  **Gallery view**
  - Display all images in a folder with `--gallery 4x4`
 
+ **NetCDF/HDF notes:**
+ - viewinline lists only variables that can be displayed as 2D or 3D arrays
+ - Variables with additional dimensions (e.g., vertical levels) may be listed but will fail to display with a clear error message
+ - For a complete variable list, use `ncdump -h file.nc` or `viewtif`
+
  ## Dependencies
 
- Core dependencies (installed automatically):
- - `rasterio` — raster reading
+ **Core dependencies** (installed automatically):
+ - `rasterio` — raster reading (includes GDAL)
  - `geopandas`, `pyogrio` — vector reading
  - `matplotlib` — vector rendering
  - `Pillow` — image encoding
  - `numpy`, `pandas` — data handling
 
- Optional:
- - `duckdb` — required for `--where`, `--sort`, `--sql`, and `--limit` with filtering. Install separately with `pip install duckdb`.
+ **Optional dependencies:**
+ - `duckdb` — required for `--where`, `--sort`, `--sql`, `--limit` with filtering
+   ```bash
+   pip install duckdb
+   ```
+ - `pyarrow` — required for Parquet/GeoParquet files
+   ```bash
+   pip install pyarrow
+   ```
+ - `h5py` — fallback for HDF5 files if GDAL lacks HDF5 support (usually not needed)
+   ```bash
+   pip install h5py
+   ```
+
+ **Note on HDF support:**
+ - **HDF5** (.h5, .hdf5): Supported via rasterio if GDAL has HDF5 support (most installations)
+ - **HDF4** (.hdf): Requires GDAL compiled with HDF4 support (common in MODIS data)
+ - **NetCDF** (.nc): Supported via rasterio (uses GDAL's NetCDF driver)
 
  ## Available options
  ```
@@ -120,37 +184,39 @@ General:
  --display DISPLAY Resize only the displayed image (0.5=smaller, 2=bigger). Default: auto-fit to terminal.
 
  Raster:
- --band BAND Band number to display for single-band rasters. (default: 1)
- --colormap Apply colormap to single-band rasters. Flag without value → 'terrain'.
- --rgb-bands RGB_BANDS Comma-separated band numbers for RGB display (e.g., '3,2,1'). Overrides default 1-3.
+ --band BAND Band number to display (single raster), or slice number for NetCDF. (default: 1)
+ --timestep INTEGER Alias for --band when working with NetCDF files.
+ --subset INTEGER Variable index for NetCDF/HDF files (e.g., --subset 1).
+ --colormap Apply colormap to single-band rasters. Flag without the color scheme → 'terrain'.
+ --rgb R G B Three band numbers for RGB display (e.g., --rgb 4 3 2). Overrides default 1 2 3.
+ --rgbfiles R G B Three single-band rasters for RGB composite. Can also provide as positional arguments.
  --vmin VMIN Minimum pixel value for raster display scaling.
  --vmax VMAX Maximum pixel value for raster display scaling.
- --nodata NODATA Override nodata value for rasters if dataset metadata is incorrect.
+ --nodata NODATA Override nodata value for rasters if dataset metadata is missing or incorrect.
  --gallery [GRID] Display all PNG/JPG/TIF images in a folder as thumbnails (e.g., 5x5 grid).
 
  Vector:
  --color-by COLUMN Column to color vector features by.
  --colormap Apply colormap to vector coloring. Flag without value → 'terrain'.
  --width WIDTH Line width for vector boundaries. (default: 0.7)
- --edgecolor COLOR Edge color for vector outlines (hex or named color). (default: #F6FF00)
- --layer LAYER Layer name for GeoPackage or multi-layer files.
+ --edgecolor COLOR Edge color for vector outlines (hex or named color). (default: white)
+ --layer LAYER Layer name for GeoPackage/multi-layer files, or variable name for NetCDF files.
+ --table Display vector/parquet file as tabular data instead of rendering geometry.
 
- CSV:
+ CSV and Parquet:
  --describe [COLUMN] Show summary statistics for all numeric columns or specify one column name.
  --hist [COLUMN] Show histograms for all numeric columns or specify one column name.
  --bins BINS Number of bins for histograms (used with --hist). (default: 20)
- --scatter X Y Plot scatter of two numeric CSV columns (e.g. --scatter area_km2 year).
+ --scatter X Y Plot scatter of two numeric columns (e.g., --scatter area_km2 year).
  --unique COLUMN Show unique values for a categorical column.
- --where EXPR Filter rows using SQL WHERE clause (DuckDB required) (e.g. --where "year > 2010")
- --sort COLUMN Sort rows by values in the specified column, ascending by default. Use --desc to reverse.
+ --where EXPR Filter rows using SQL WHERE clause (DuckDB required). Example: --where "year > 2010"
+ --sort COLUMN Sort rows by column, ascending by default. Use --desc to reverse.
  --desc Sort in descending order (used with --sort).
- --limit N Limit number of rows shown (e.g. --limit 100).
- --select COLUMNS Select specific columns (space separated) (e.g. --select Country City).
- --sql QUERY Execute full DuckDB SQL query. Use 'data' as the table name (e.g. --sql "SELECT * FROM data WHERE Poverty > 40").
+ --limit N Limit number of rows shown (e.g., --limit 100).
+ --select COLUMNS Select specific columns (space separated). Example: --select Country City
+ --sql QUERY Execute full DuckDB SQL query. Use 'data' as table name. Example: --sql "SELECT * FROM data WHERE Poverty > 40"
  ```
 
- </small>
-
  ## Need help?
  You can ask questions about usage via the documentation-based assistant:
 
@@ -159,7 +225,7 @@ You can ask questions about usage via the documentation-based assistant:
  👉 For NASA staff: find 'viewtif + viewgeom + viewinline Helper' via the ChatGSFC Agent Marketplace
 
  ## License
- This project is released under the Apache License 2.0 © 2025 Keiko Nomura.
+ This project is released under the Apache License 2.0 © 2026 Keiko Nomura.
 
  If you find this tool useful, please consider supporting or acknowledging it in your work.
 
@@ -4,7 +4,7 @@ build-backend = "hatchling.build"
 
 [project]
 name = "viewinline"
-version = "0.2.0"
+version = "0.2.1"
 description = "Quick look geospatial viewer for iTerm2 compatible terminals"
 readme = "README.md"
 license = { text = "Apache-2.0" }
@@ -12,17 +12,22 @@ authors = [
     { name = "Keiko Nomura" }
 ]
 requires-python = ">=3.9"
+
 dependencies = [
     "numpy",
     "pillow",
     "rasterio",
     "geopandas",
     "matplotlib",
-    "pyogrio"
+    "pyogrio",
+    "pandas"
 ]
 
 [project.optional-dependencies]
 sql = ["duckdb"]
+parquet = ["pyarrow"]
+hdf5 = ["h5py"]
+all = ["duckdb", "pyarrow", "h5py"]
 
 [project.scripts]
 viewinline = "viewinline.viewinline:main"
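Extras like these are resolved at install time, so code that depends on them must probe for the modules at runtime. A minimal stdlib-only sketch of the guarded-import pattern such optional groups imply (`has_module` is a hypothetical helper, not viewinline's code):

```python
from importlib import util

def has_module(name: str) -> bool:
    """Return True if an optional dependency is importable, without importing it."""
    return util.find_spec(name) is not None

# e.g. gate --sql on duckdb, Parquet support on pyarrow, the HDF5 fallback on h5py
OPTIONAL = {"sql": "duckdb", "parquet": "pyarrow", "hdf5": "h5py"}
available = {extra: has_module(mod) for extra, mod in OPTIONAL.items()}
```

`find_spec` avoids the cost of actually importing heavy packages just to check for their presence.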
@@ -3,14 +3,17 @@
 viewinline — quick-look geospatial viewer with inline image support.
 
 Supports:
-  • Rasters (.tif, .tiff, .png, .jpg, .jpeg)
-  • Vectors (.shp, .geojson, .gpkg)
+  • Rasters (.tif, .tiff, .png, .jpg, .jpeg, .nc, .hdf)
+  • Vectors (.shp, .geojson, .gpkg, .parquet, .geoparquet)
   • CSV (.csv) scatter plots and histograms
 
 Display:
 Sends iTerm2-style inline image escape sequences. Works in terminals that support
 the iTerm2 inline image protocol (iTerm2, WezTerm, Konsole, etc.). In other
-terminals, the escape codes are harmlessly ignored.
+terminals, the escape codes are ignored.
+
+Particularly useful on HPC systems and remote servers accessed via SSH — images
+render on your local terminal without X11 forwarding, VNC, or file downloads.
 
 No detection, no fallbacks. If images are not shown, it means that the terminal
 is not compatible.
@@ -29,7 +32,7 @@ import warnings
 warnings.filterwarnings("ignore", message="More than one layer found", category=UserWarning)
 warnings.filterwarnings("ignore", message="Dataset has no geotransform", category=UserWarning)
 
-__version__ = "0.2.0"
+__version__ = "0.2.1"
 
 AVAILABLE_COLORMAPS = [
     "viridis", "inferno", "magma", "plasma",
@@ -40,7 +43,7 @@ AVAILABLE_COLORMAPS = [
 # ---------------------------------------------------------------------
 # Display utilities
 # ---------------------------------------------------------------------
-def show_inline_image(image_array: np.ndarray, display_scale: float | None = None, is_vector: bool = False) -> None:
+def show_inline_image(image_array: np.ndarray, display_scale = None, is_vector: bool = False) -> None:
     """Encode image and write iTerm2 inline escape sequence to stdout.
 
     Raises only if image encoding fails. Cannot detect whether the terminal
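The signature change above drops the `float | None` annotation, presumably for Python 3.9 compatibility: PEP 604 unions are evaluated when the `def` statement runs and raise `TypeError` before 3.10 unless annotation evaluation is deferred. A sketch of the 3.9-compatible spelling that keeps the type information (np.ndarray elided to `Any` so this snippet stays stdlib-only):

```python
from typing import Any, Optional

# Optional[float] works on all supported Pythons (>=3.9); alternatively,
# `from __future__ import annotations` at the top of the module defers
# evaluation and makes `float | None` safe on 3.9 as well.
def show_inline_image(image_array: Any,
                      display_scale: Optional[float] = None,
                      is_vector: bool = False) -> None:
    """Signature-only sketch; body elided."""
    ...
```

Either approach preserves the annotation rather than discarding it.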
@@ -68,7 +71,7 @@ def show_inline_image(image_array: np.ndarray, display_scale: float | None = Non
     sys.stdout.flush()
 
 
-def show_image_auto(img: np.ndarray, display_scale: float | None = None, is_vector: bool = False) -> None:
+def show_image_auto(img: np.ndarray, display_scale = None, is_vector: bool = False) -> None:
     """Attempt inline image display. No fallbacks, no detection.
 
     Just sends the iTerm2 inline image escape sequence. If the terminal supports it,
@@ -179,7 +182,6 @@ def preview_df(df, max_rows: int = 10, query_mode: bool = False, filename: str =
 # Describe
 # =============================================================
 def describe_df(df, column=None):
-
     if df is None or df.empty:
         print("[WARN] No data rows found.")
         return
@@ -497,7 +499,6 @@ def render_simple_image(filepath: str, args) -> None:
     except Exception as e:
         print(f"[ERROR] Failed to load image: {e}")
 
-
 def render_raster(paths: list[str], args) -> None:
     try:
         import rasterio
@@ -507,14 +508,72 @@ def render_raster(paths: list[str], args) -> None:
  return
 
  try:
+
  if len(paths) == 1:
+ path = paths[0]
+
+ # Handle NetCDF/HDF with subdatasets
+ if path.lower().endswith(('.nc', '.hdf', '.hdf5', '.h5')):
+ try:
+ with rasterio.open(path) as src:
+ subdatasets = src.subdatasets
+
+ # If there are subdatasets, require --subset to select one
+ if subdatasets:
+ if not args.subset:
+ file_type = "variables" if path.lower().endswith('.nc') else "datasets"
+ print(f"Found {len(subdatasets)} {file_type} in {os.path.basename(path)}:")
+ for i, sub in enumerate(subdatasets, 1):
+ # Extract dataset/variable name from GDAL subdataset string
+ ds_name = sub.split(':')[-1].lstrip('/')
+ print(f" [{i}] {ds_name}")
+ print(f"\nUse --subset <N> to display a specific {file_type[:-1]}.")
+ return
+
+ # Select by index
+ if args.subset < 1 or args.subset > len(subdatasets):
+ print(f"[ERROR] --subset must be between 1 and {len(subdatasets)}")
+ return
+
+ path = subdatasets[args.subset - 1]
+ var_name = path.split(':')[-1]
+ print(f"[INFO] Displaying variable {args.subset}: {var_name}")
+
+ except rasterio.errors.RasterioIOError as e:
+ # GDAL lacks support, try h5py fallback for HDF5
+ if path.lower().endswith(('.hdf5', '.h5')):
+ try:
+ import h5py
+ except ImportError:
+ print("[ERROR] HDF5 file cannot be opened.")
+ print("[INFO] Requires either:")
+ print(" - GDAL with HDF5 support, or")
+ print(" - h5py: pip install h5py")
+ return
+
+ # print("[ERROR] h5py fallback not yet implemented.")
+ print("[INFO] Install GDAL with HDF5")
+ return
+
+ elif path.lower().endswith('.hdf'):
+ print(f"[ERROR] Cannot open HDF4 file: {e}")
+ print("[INFO] HDF4 requires GDAL with HDF4 support")
+ return
+ else:
+ # NetCDF error
+ print(f"[ERROR] Cannot open NetCDF file: {e}")
+ return
 
- with rasterio.open(paths[0]) as ds:
+ # Continue with normal raster opening
+ with rasterio.open(path) as ds:
  H, W = ds.height, ds.width
  print(f"[DATA] Raster loaded: {os.path.basename(paths[0])} ({W}×{H})")
-
  band_count = ds.count
-
+
+ # Auto-detect nodata from file metadata
+ if args.nodata is None and ds.nodata is not None:
+ args.nodata = ds.nodata
+
  # Downsample for performance
  max_dim = 2000
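The subdataset handling added above parses GDAL subdataset identifiers, which take forms like `NETCDF:"file.nc":var` or `HDF5:"file.h5"://group/dataset`. A standalone sketch of that name extraction (the file names here are hypothetical, not from the package):

```python
# GDAL reports NetCDF/HDF5 subdatasets as colon-separated strings; the
# variable name is the final field, with any leading slashes stripped.
def subdataset_name(sub: str) -> str:
    return sub.split(':')[-1].lstrip('/')

# Hypothetical GDAL subdataset strings:
subs = [
    'NETCDF:"temp.nc":t2m',
    'HDF5:"granule.h5"://grid/precipitation',
]
for i, sub in enumerate(subs, 1):
    print(f" [{i}] {subdataset_name(sub)}")
```

Note the HDF5 case keeps the internal group path (`grid/precipitation`) once the leading slashes are removed, which matches what the listing prints.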
 
@@ -532,19 +591,25 @@ def render_raster(paths: list[str], args) -> None:
  else:
  data = ds.read()
 
- # MULTI BAND
- if band_count >= 3:
+ # Print band/slice count for all multi-band files
+ if band_count > 1:
+ if paths[0].lower().endswith('.nc'):
+ print(f"[INFO] {band_count} slices detected")
+ else:
+ print(f"[INFO] Multi-band raster detected ({band_count} bands)")
+
+ # MULTI BAND RGB (skip for NetCDF - treat as slices/timesteps, not RGB)
+ if band_count >= 3 and not paths[0].lower().endswith('.nc'):
 
- print(f"[INFO] Multi-band raster detected ({band_count} bands)")
 
- if getattr(args, "rgb_bands", None):
+ if getattr(args, "rgb", None):
  try:
- rgb_idx = [int(b) - 1 for b in args.rgb_bands.split(",")]
+ rgb_idx = [b - 1 for b in args.rgb]
  if len(rgb_idx) != 3:
  raise ValueError
- print(f"[INFO] Using RGB bands: {args.rgb_bands}")
+ print(f"[INFO] Using RGB bands: {args.rgb}")
  except Exception:
- print("[WARN] Invalid --rgb-bands. Using default 1,2,3")
+ print("[WARN] Invalid --rgb. Using default 1 2 3")
  rgb_idx = [0, 1, 2]
  else:
  rgb_idx = [0, 1, 2]
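The `--rgb` change above replaces comma-separated string parsing with three integer arguments, so the conversion to 0-based band indices no longer needs `split`/`int`. The conversion and its fallback can be sketched in isolation (the function name is illustrative, not from the package):

```python
# Convert three 1-based band numbers (as argparse now delivers them for
# --rgb) to 0-based indices, falling back to bands 1 2 3 when the input
# is missing or unusable -- mirroring the try/except in the diff.
def resolve_rgb_indices(rgb):
    try:
        rgb_idx = [b - 1 for b in rgb]
        if len(rgb_idx) != 3:
            raise ValueError
    except Exception:
        rgb_idx = [0, 1, 2]  # default: bands 1 2 3
    return rgb_idx

print(resolve_rgb_indices([4, 3, 2]))  # e.g. a false-color combination
print(resolve_rgb_indices(None))       # no --rgb given: default indices
```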
@@ -565,6 +630,7 @@ def render_raster(paths: list[str], args) -> None:
  else:
 
  band_idx = max(0, min(args.band - 1, band_count - 1))
+ # print(f"[INFO] Displaying band {band_idx + 1} of {band_count}")
  raw_band = data[band_idx].astype(float)
 
  # mask nodata
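The single-band branch above clamps the 1-based `--band` value into the valid range rather than erroring on out-of-range input. Extracted as a sketch (the helper name is hypothetical):

```python
# Map a 1-based --band argument onto a valid 0-based index, clamping
# out-of-range values to the first or last band.
def clamp_band(band: int, band_count: int) -> int:
    return max(0, min(band - 1, band_count - 1))

print(clamp_band(1, 5))   # first band -> index 0
print(clamp_band(99, 5))  # past the end -> last band, index 4
print(clamp_band(-3, 5))  # nonsense input -> first band, index 0
```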
@@ -576,7 +642,14 @@ def render_raster(paths: list[str], args) -> None:
  if np.any(np.isfinite(raw_band)):
  min_val = np.nanmin(raw_band)
  max_val = np.nanmax(raw_band)
- print(f"[DATA] Value range (preview): {min_val:.3f} → {max_val:.3f}")
+
+ if paths[0].lower().endswith('.nc'):
+ print(f"[DATA] Slice {band_idx + 1} of {band_count} — value range: {min_val:.3f} → {max_val:.3f}")
+ if band_count > 1:
+ print(f"[INFO] Use --band <N> to display a different slice")
+ else:
+ print(f"[DATA] Band {band_idx + 1} of {band_count} — value range: {min_val:.3f} → {max_val:.3f}")
+
  else:
  print("[WARN] No valid pixels found.")
 
@@ -629,7 +702,12 @@ def render_raster(paths: list[str], args) -> None:
  show_image_auto(img, getattr(args, "display", None), is_vector=False)
 
  except Exception as e:
- print(f"[ERROR] Raster rendering failed: {e}")
+ if paths[0].lower().endswith('.nc'):
+ print(f"[ERROR] Cannot display this variable.")
+ print("[INFO] viewinline only supports 2D or 3D NetCDF variables")
+ else:
+ print(f"[ERROR] Raster rendering failed: {e}")
+
 
  def render_gallery(folder: str, grid: str = "4x4", display_scale=None, is_vector=False) -> None:
  """Render a folder of rasters/images as small thumbnails in a grid."""
@@ -733,37 +811,51 @@ def render_vector(path, args):
  except Exception as e:
  print(f"[WARN] Could not list layers: {e}")
 
+ # try:
+ # gdf = gpd.read_file(path, layer=getattr(args, "layer", None))
+ # print(f"[DATA] Vector loaded: {os.path.basename(path)} ({len(gdf)} features)")
+ # except Exception as e:
+ # print(f"[ERROR] Failed to read vector: {e}")
+ # return
+
  try:
- gdf = gpd.read_file(path, layer=getattr(args, "layer", None))
+ # Use read_parquet for parquet/geoparquet files
+ if path.lower().endswith(('.parquet', '.geoparquet')):
+ gdf = gpd.read_parquet(path)
+ else:
+ gdf = gpd.read_file(path, layer=getattr(args, "layer", None))
  print(f"[DATA] Vector loaded: {os.path.basename(path)} ({len(gdf)} features)")
+ except ImportError:
+ print("[ERROR] Parquet/GeoParquet support requires pyarrow. Install with: pip install pyarrow")
+ return
  except Exception as e:
  print(f"[ERROR] Failed to read vector: {e}")
  return
-
+
  # Detect non-geometry columns
  all_cols = [c for c in gdf.columns if c != gdf.geometry.name]
 
  if all_cols:
  n = len(all_cols)
- print(f"[INFO] Available columns ({n}):")
- if n <= 20:
- for c in all_cols:
- print(f" {c}")
- else:
- ncols = 2 if n <= 30 else 3 if n <= 100 else 4
- nrows = (n + ncols - 1) // ncols
- padded = all_cols + [""] * (nrows * ncols - n)
- col_width = max(len(c) for c in all_cols) + 3
- for i in range(nrows):
- row = ""
- for j in range(ncols):
- row += padded[i + j * nrows].ljust(col_width)
- print(" " + row.rstrip())
- if not args.color_by:
+ if not args.color_by: # Only show columns if user didn't specify one
+ print(f"[INFO] Available columns ({n}):")
+ if n <= 20:
+ for c in all_cols:
+ print(f" {c}")
+ else:
+ ncols = 2 if n <= 30 else 3 if n <= 100 else 4
+ nrows = (n + ncols - 1) // ncols
+ padded = all_cols + [""] * (nrows * ncols - n)
+ col_width = max(len(c) for c in all_cols) + 3
+ for i in range(nrows):
+ row = ""
+ for j in range(ncols):
+ row += padded[i + j * nrows].ljust(col_width)
+ print(" " + row.rstrip())
  print("[INFO] Showing border-only view (use --color-by <column> to color features).")
  else:
  print("[INFO] No attribute columns found.")
-
+
  # Figure setup
  fig, ax = plt.subplots(figsize=(6, 6), dpi=150, facecolor="gray")
  ax.set_facecolor("gray")
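The column listing moved under `if not args.color_by:` above lays names out column-major in a padded grid (2, 3, or 4 columns depending on the count). The same layout logic, pulled out as a standalone sketch (the function name is illustrative, not from the package):

```python
# Column-major grid layout, mirroring the listing logic in the diff:
# short lists print one per line; longer lists are padded to a full
# nrows x ncols grid and read down each column.
def format_columns(all_cols, max_flat=20):
    n = len(all_cols)
    if n <= max_flat:
        return [f" {c}" for c in all_cols]
    ncols = 2 if n <= 30 else 3 if n <= 100 else 4
    nrows = (n + ncols - 1) // ncols
    padded = all_cols + [""] * (nrows * ncols - n)
    col_width = max(len(c) for c in all_cols) + 3
    lines = []
    for i in range(nrows):
        row = "".join(padded[i + j * nrows].ljust(col_width) for j in range(ncols))
        lines.append(" " + row.rstrip())
    return lines

cols = [f"col{i:02d}" for i in range(25)]  # 25 names -> 2 columns, 13 rows
for line in format_columns(cols):
    print(line)
```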
@@ -808,8 +900,10 @@ def render_vector(path, args):
  cmap=cmap,
  vmin=vmin,
  vmax=vmax,
- linewidth=0.2,
- edgecolor="white",
+ linewidth=0,
+ edgecolor="none",
+ # linewidth=getattr(args, "width", 0.2),
+ # edgecolor=args.edgecolor,
  zorder=1
  )
 
@@ -827,8 +921,10 @@ def render_vector(path, args):
  subset.plot(
  ax=ax,
  facecolor=color,
- edgecolor="white",
- linewidth=0.2,
+ linewidth=0,
+ edgecolor="none",
+ # edgecolor=args.edgecolor,
+ # linewidth=getattr(args, "width", 0.2),
  zorder=1
  )
 
@@ -887,6 +983,157 @@ class SmartDefaults(argparse.ArgumentDefaultsHelpFormatter):
  # ---------------------------------------------------------------------
  # Dispatcher
  # ---------------------------------------------------------------------
+ def handle_tabular_data(df: pd.DataFrame, args, filepath: str) -> None:
+ """Handle CSV/parquet/vector-as-table with all tabular operations."""
+ if args.sql and (args.where or args.sort or args.limit or args.select):
+ print("[ERROR] --sql cannot be combined with --where/--sort/--limit/--select.")
+ sys.exit(1)
+
+ if args.sql:
+ try:
+ import duckdb
+ except ImportError:
+ print("[ERROR] --sql requires DuckDB. Install with: pip install duckdb")
+ sys.exit(1)
+
+ print("[INFO] Executing SQL query...")
+
+ try:
+ query = args.sql.replace("data", f"read_csv_auto('{filepath}')")
+ con = duckdb.connect()
+ df = con.execute(query).df()
+ con.close()
+ except Exception as e:
+ print(f"[ERROR] DuckDB SQL failed: {e}")
+ sys.exit(1)
+
+ if df.empty:
+ print("[WARN] Query returned no rows.")
+ return
+
+ if args.describe:
+ if isinstance(args.describe, str):
+ describe_df(df, column=args.describe)
+ else:
+ describe_df(df)
+ return
+
+ if args.hist:
+ if isinstance(args.hist, str):
+ inline_histogram_df(df, column=args.hist, bins=args.bins,
+ display_scale=args.display, is_vector=False)
+ else:
+ inline_histogram_df(df, bins=args.bins,
+ display_scale=args.display, is_vector=False)
+ return
+
+ if args.scatter:
+ plot_scatter_df(df, args.scatter[0], args.scatter[1],
+ display_scale=args.display, is_vector=False)
+ return
+
+ preview_df(df, max_rows=10, query_mode=True)
+ return
+
+ if args.where or args.sort or args.limit or args.select:
+ try:
+ import duckdb
+ except ImportError:
+ print("[ERROR] Filtering requires DuckDB. Install with: pip install duckdb")
+ sys.exit(1)
+
+ print("[INFO] Building query...")
+
+ base_query = "SELECT * FROM df"
+
+ if args.select:
+ selected = ", ".join(args.select)
+ print(f"[INFO] Selecting columns: {selected}")
+ base_query = f"SELECT {selected} FROM df"
+
+ clauses = []
+
+ if args.where:
+ print(f"[INFO] Applying filter: {args.where}")
+ clauses.append(f"WHERE {args.where}")
+
+ if args.sort:
+ direction = "DESC" if args.desc else "ASC"
+ print(f"[INFO] Sorting by: {args.sort} ({direction})")
+ clauses.append(f"ORDER BY {args.sort} {direction}")
+
+ if args.limit:
+ print(f"[INFO] Limiting rows: {args.limit}")
+ clauses.append(f"LIMIT {args.limit}")
+
+ query = " ".join([base_query] + clauses)
+
+ try:
+ df = duckdb.query(query).to_df()
+ except Exception as e:
+ print(f"[ERROR] DuckDB query failed: {e}")
+ sys.exit(1)
+
+ if df.empty:
+ print("[WARN] Query returned no rows.")
+ return
+
+ if args.unique:
+ col = args.unique
+
+ if col not in df.columns:
+ print(f"[ERROR] Column '{col}' not found.")
+ return
+
+ vals = sorted(df[col].dropna().astype(str).unique())
+ n = len(vals)
+
+ print(f"[DATA] Unique values in '{col}' ({n}):")
+
+ if n == 0:
+ print(" (none)")
+ return
+
+ if n <= 10:
+ for v in vals:
+ print(f" {v}")
+ else:
+ ncols = 2 if n <= 30 else 3 if n <= 100 else 4
+ nrows = (n + ncols - 1) // ncols
+ vals += [""] * (nrows * ncols - n)
+ col_width = max(len(v) for v in vals) + 3
+
+ for i in range(nrows):
+ row = ""
+ for j in range(ncols):
+ row += vals[i + j * nrows].ljust(col_width)
+ print(" " + row.rstrip())
+
+ return
+
+ if args.describe:
+ if isinstance(args.describe, str):
+ describe_df(df, column=args.describe)
+ else:
+ describe_df(df)
+ return
+
+ if args.hist:
+ if isinstance(args.hist, str):
+ inline_histogram_df(df, column=args.hist, bins=args.bins,
+ display_scale=args.display, is_vector=False)
+ else:
+ inline_histogram_df(df, bins=args.bins,
+ display_scale=args.display, is_vector=False)
+ return
+
+ if args.scatter:
+ plot_scatter_df(df, args.scatter[0], args.scatter[1],
+ display_scale=args.display, is_vector=False)
+ return
+
+ preview_df(df, max_rows=args.limit or 10, query_mode=bool(args.where or args.sort or args.select))
+
  def main() -> None:
  parser = argparse.ArgumentParser(
  prog="viewinline",
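The `--where`/`--sort`/`--limit`/`--select` branch of the new `handle_tabular_data` assembles a single SQL string that is then handed to DuckDB. The assembly itself is plain string logic and can be sketched without DuckDB installed (the function name is illustrative, not from the package):

```python
# Build the DuckDB query string the same way the filtering branch does:
# an optional SELECT list, then WHERE / ORDER BY / LIMIT clauses in order.
def build_query(select=None, where=None, sort=None, desc=False, limit=None):
    base = f"SELECT {', '.join(select)} FROM df" if select else "SELECT * FROM df"
    clauses = []
    if where:
        clauses.append(f"WHERE {where}")
    if sort:
        clauses.append(f"ORDER BY {sort} {'DESC' if desc else 'ASC'}")
    if limit:
        clauses.append(f"LIMIT {limit}")
    return " ".join([base] + clauses)

print(build_query(select=["name", "area"], where="area > 100",
                  sort="area", desc=True, limit=5))
```

In the real code the resulting string is executed with `duckdb.query(query).to_df()` against the in-scope `df` DataFrame.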
@@ -901,10 +1148,9 @@ def main() -> None:
 
  # File input
  parser.add_argument(
- "paths", nargs="+",
+ "paths", nargs="*", # Zero or more (optional)
  help="Path to raster(s), vector, or CSV file. Provide 1 file or exactly 3 rasters for RGB (R G B)."
  )
-
  # Display options
  parser.add_argument(
  "--display", type=float, default=None,
@@ -914,7 +1160,11 @@ def main() -> None:
  # Raster options
  parser.add_argument(
  "--band", type=int, default=1,
- help="Band number to display (single raster case)."
+ help="Band number to display (single raster case), or slice number for NetCDF."
+ )
+ parser.add_argument(
+ "--timestep", type=int, default=None,
+ help="Alias for --band when working with NetCDF files (1-based index)."
  )
  parser.add_argument(
  "--colormap", nargs="?", const="terrain",
@@ -922,8 +1172,8 @@ def main() -> None:
  help="Apply colormap to single-band rasters or vector coloring. Flag without value → 'terrain'."
  )
  parser.add_argument(
- "--rgb-bands", type=str, default=None,
- help="Comma-separated band numbers for RGB display (e.g., '3,2,1'). Overrides default 1-3."
+ "--rgb", nargs=3, type=int, metavar=('R', 'G', 'B'), default=None,
+ help="Three band numbers for RGB display (e.g., --rgb 4 3 2). Overrides default 1 2 3."
  )
  parser.add_argument(
  "--vmin", type=float, default=None,
@@ -935,12 +1185,20 @@ def main() -> None:
  )
  parser.add_argument(
  "--nodata", type=float, default=None,
- help="Override nodata value for rasters if dataset metadata is incorrect."
+ help="Override nodata value for rasters if dataset metadata is missing or incorrect."
  )
  parser.add_argument(
  "--gallery", nargs="?", const="4x4", metavar="GRID",
  help="Display all PNG/JPG/TIF images in a folder as thumbnails (e.g., 5x5 grid)."
  )
+ parser.add_argument(
+ "--subset", type=int, default=None,
+ help="Variable index for NetCDF files (e.g. --subset 1)."
+ )
+ parser.add_argument(
+ "--rgbfiles", nargs=3, type=str, metavar=('R', 'G', 'B'),
+ help="Three single-band rasters for RGB composite (e.g., --rgbfiles R.tif G.tif B.tif). Can also provide as positional arguments without the flag."
+ )
 
  # CSV options
  parser.add_argument(
@@ -1017,13 +1275,30 @@ def main() -> None:
  )
  parser.add_argument(
  "--layer", type=str, default=None,
- help="Layer name for GeoPackage or multi-layer files."
+ # help="Layer name for GeoPackage or multi-layer files."
+ help="Layer name for GeoPackage/multi-layer files, or variable name for NetCDF files."
  )
+ parser.add_argument(
+ "--table", action="store_true",
+ help="Display vector/parquet file as tabular data instead of rendering geometry."
+ )
 
  parser.add_argument("--version", action="version", version=f"%(prog)s {__version__}")
 
  args = parser.parse_args()
 
+ # Handle --rgbfiles flag (takes precedence)
+ if args.rgbfiles:
+ args.paths = args.rgbfiles
+
+ # Handle aliases
+ if args.timestep is not None:
+ args.band = args.timestep
+
+ # Validate input
+ if not args.paths:
+ parser.error("No input file(s) provided")
+
  # Basic argument sanity check
  for bad in ("color-by", "edgecolor", "colormap", "band", "display"):
  for a in args.paths:
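The new flags and alias handling added above can be exercised with a trimmed-down parser. This sketch mirrors the `--rgb`, `--timestep`, and `--rgbfiles` wiring from `main()` (the file name `scene.nc` is hypothetical):

```python
import argparse

# Trimmed-down parser covering only the 0.2.1 additions.
parser = argparse.ArgumentParser(prog="viewinline")
parser.add_argument("paths", nargs="*")
parser.add_argument("--band", type=int, default=1)
parser.add_argument("--timestep", type=int, default=None)
parser.add_argument("--rgb", nargs=3, type=int, metavar=('R', 'G', 'B'), default=None)
parser.add_argument("--rgbfiles", nargs=3, type=str, metavar=('R', 'G', 'B'))
parser.add_argument("--subset", type=int, default=None)

args = parser.parse_args(["scene.nc", "--timestep", "3", "--rgb", "4", "3", "2"])

# Alias and precedence handling, mirroring main():
if args.rgbfiles:
    args.paths = args.rgbfiles   # --rgbfiles overrides positional paths
if args.timestep is not None:
    args.band = args.timestep    # --timestep is an alias for --band

print(args.band)  # 3
print(args.rgb)   # [4, 3, 2]
```

With `nargs=3, type=int`, argparse delivers `--rgb` as a list of three ints, which is why the rendering code can drop the old comma-splitting of `--rgb-bands`.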
@@ -1035,8 +1310,8 @@ def main() -> None:
  paths = args.paths
 
  # File routing
- raster_exts = (".png", ".jpg", ".jpeg", ".tif", ".tiff")
- vector_exts = (".shp", ".geojson", ".json", ".gpkg")
+ raster_exts = (".png", ".jpg", ".jpeg", ".tif", ".tiff", ".nc", ".hdf", ".hdf5", ".h5")
+ vector_exts = (".shp", ".geojson", ".json", ".gpkg", ".parquet", "geoparquet")
 
  if len(paths) == 1:
  p = paths[0].lower()
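The widened routing tuples rely on `str.endswith` accepting a tuple of suffixes. Note that the `"geoparquet"` entry has no leading dot but still matches `file.geoparquet`, since `endswith` does plain suffix matching. A quick check (file names are hypothetical):

```python
# Same extension tuples as the 0.2.1 routing code.
raster_exts = (".png", ".jpg", ".jpeg", ".tif", ".tiff", ".nc", ".hdf", ".hdf5", ".h5")
vector_exts = (".shp", ".geojson", ".json", ".gpkg", ".parquet", "geoparquet")

for name in ("scene.tif", "era5.nc", "parcels.geoparquet", "notes.txt"):
    kind = ("raster" if name.lower().endswith(raster_exts)
            else "vector" if name.lower().endswith(vector_exts)
            else "other")
    print(f"{name}: {kind}")
```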
@@ -1054,173 +1329,54 @@ def main() -> None:
  return
 
  elif p.endswith(vector_exts):
- render_vector(paths[0], args)
- return
-
- if os.path.isdir(paths[0]) and args.gallery:
- render_gallery(paths[0], grid=args.gallery, display_scale=args.display, is_vector=False)
- return
-
- elif p.endswith(".csv"):
- # CSV handling (unchanged from original)
- if args.sql and (args.where or args.sort or args.limit or args.select):
- print("[ERROR] --sql cannot be combined with --where/--sort/--limit/--select.")
- sys.exit(1)
-
- try:
- df = pd.read_csv(paths[0])
- except Exception as e:
- print(f"[ERROR] Failed to read CSV: {e}")
- sys.exit(1)
-
- if args.sql:
- try:
- import duckdb
- except ImportError:
- print("[ERROR] --sql requires DuckDB. Install with: pip install duckdb")
- sys.exit(1)
-
- print("[INFO] Executing SQL query...")
-
+ if args.table:
+ # Treat as tabular data
  try:
- query = args.sql.replace("data", f"read_csv_auto('{paths[0]}')")
- con = duckdb.connect()
- df = con.execute(query).df()
- con.close()
- except Exception as e:
- print(f"[ERROR] DuckDB SQL failed: {e}")
- sys.exit(1)
-
- if df.empty:
- print("[WARN] Query returned no rows.")
- return
-
- if args.describe:
- if isinstance(args.describe, str):
- describe_df(df, column=args.describe)
- else:
- describe_df(df)
- return
-
- if args.hist:
- if isinstance(args.hist, str):
- inline_histogram_df(df, column=args.hist, bins=args.bins,
- display_scale=args.display, is_vector=False)
+ if p.endswith(('.parquet', '.geoparquet')):
+ df = pd.read_parquet(paths[0])
  else:
- inline_histogram_df(df, bins=args.bins,
- display_scale=args.display, is_vector=False)
- return
-
- if args.scatter:
- plot_scatter_df(df, args.scatter[0], args.scatter[1],
- display_scale=args.display, is_vector=False)
+ import geopandas as gpd
+ gdf = gpd.read_file(paths[0])
+ df = pd.DataFrame(gdf.drop(columns='geometry'))
+ handle_tabular_data(df, args, paths[0])
  return
-
- preview_df(df, max_rows=10, query_mode=True)
- return
-
- if args.where or args.sort or args.limit or args.select:
- try:
- import duckdb
- except ImportError:
- print("[ERROR] Filtering requires DuckDB. Install with: pip install duckdb")
- sys.exit(1)
-
- print("[INFO] Building query...")
-
- base_query = "SELECT * FROM df"
-
- if args.select:
- selected = ", ".join(args.select)
- print(f"[INFO] Selecting columns: {selected}")
- base_query = f"SELECT {selected} FROM df"
-
- clauses = []
-
- if args.where:
- print(f"[INFO] Applying filter: {args.where}")
- clauses.append(f"WHERE {args.where}")
-
- if args.sort:
- direction = "DESC" if args.desc else "ASC"
- print(f"[INFO] Sorting by: {args.sort} ({direction})")
- clauses.append(f"ORDER BY {args.sort} {direction}")
-
- if args.limit:
- print(f"[INFO] Limiting rows: {args.limit}")
- clauses.append(f"LIMIT {args.limit}")
-
- query = " ".join([base_query] + clauses)
-
- try:
- df = duckdb.query(query).to_df()
  except Exception as e:
- print(f"[ERROR] DuckDB query failed: {e}")
+ print(f"[ERROR] Failed to read file: {e}")
  sys.exit(1)
-
- if df.empty:
- print("[WARN] Query returned no rows.")
- return
-
- if args.unique:
- col = args.unique
-
- if col not in df.columns:
- print(f"[ERROR] Column '{col}' not found.")
- return
-
- vals = sorted(df[col].dropna().astype(str).unique())
- n = len(vals)
-
- print(f"[DATA] Unique values in '{col}' ({n}):")
-
- if n == 0:
- print(" (none)")
- return
-
- if n <= 10:
- ncols = 1
- elif n <= 30:
- ncols = 2
- elif n <= 100:
- ncols = 3
- else:
- ncols = 4
-
- nrows = (n + ncols - 1) // ncols
- vals += [""] * (nrows * ncols - n)
- col_width = max(len(v) for v in vals) + 3
-
- for i in range(nrows):
- row = ""
- for j in range(ncols):
- row += vals[i + j * nrows].ljust(col_width)
- print(" " + row.rstrip())
-
+ else:
+ render_vector(paths[0], args)
  return
 
- if args.describe:
- if isinstance(args.describe, str):
- describe_df(df, column=args.describe)
- else:
- describe_df(df)
- return
+ if os.path.isdir(paths[0]) and args.gallery:
+ render_gallery(paths[0], grid=args.gallery, display_scale=args.display, is_vector=False)
+ return
 
- if args.hist:
- if isinstance(args.hist, str):
- inline_histogram_df(df, column=args.hist, bins=args.bins,
- display_scale=args.display, is_vector=False)
+ elif p.endswith((".csv", ".parquet")):
+ # Try geoparquet first if it's a parquet file
+ if p.endswith(".parquet"):
+ try:
+ import geopandas as gpd
+ gdf = gpd.read_parquet(paths[0])
+ if hasattr(gdf, 'geometry') and gdf.geometry is not None:
+ render_vector(paths[0], args)
+ return
+ except Exception:
+ pass
+
+ # Read as tabular data
+ try:
+ if p.endswith(".parquet"):
+ df = pd.read_parquet(paths[0])
  else:
- inline_histogram_df(df, bins=args.bins,
- display_scale=args.display, is_vector=False)
- return
-
- if args.scatter:
- plot_scatter_df(df, args.scatter[0], args.scatter[1],
- display_scale=args.display, is_vector=False)
- return
-
- preview_df(df, max_rows=args.limit or 10, query_mode=bool(args.where or args.sort or args.select))
+ df = pd.read_csv(paths[0])
+ except ImportError:
+ print("[ERROR] Parquet support requires pyarrow. Install with: pip install pyarrow")
+ sys.exit(1)
+ except Exception as e:
+ print(f"[ERROR] Failed to read file: {e}")
+ sys.exit(1)
+
+ handle_tabular_data(df, args, paths[0])
  return
 
  else:
File without changes