jetson-examples 0.1.8-py3-none-any.whl → 0.1.9-py3-none-any.whl
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- {jetson_examples-0.1.8.dist-info → jetson_examples-0.1.9.dist-info}/LICENSE +21 -21
- {jetson_examples-0.1.8.dist-info → jetson_examples-0.1.9.dist-info}/METADATA +1 -1
- jetson_examples-0.1.9.dist-info/RECORD +109 -0
- reComputer/__init__.py +1 -1
- reComputer/main.py +60 -60
- reComputer/scripts/MoveNet-Lightning/clean.sh +8 -8
- reComputer/scripts/MoveNet-Lightning/getVersion.sh +59 -59
- reComputer/scripts/MoveNet-Lightning/init.sh +6 -6
- reComputer/scripts/MoveNet-Lightning/readme.md +30 -30
- reComputer/scripts/MoveNet-Lightning/run.sh +19 -19
- reComputer/scripts/MoveNet-Thunder/clean.sh +7 -7
- reComputer/scripts/MoveNet-Thunder/getVersion.sh +59 -59
- reComputer/scripts/MoveNet-Thunder/init.sh +6 -6
- reComputer/scripts/MoveNet-Thunder/readme.md +31 -31
- reComputer/scripts/MoveNet-Thunder/run.sh +18 -18
- reComputer/scripts/MoveNetJS/clean.sh +4 -4
- reComputer/scripts/MoveNetJS/readme.md +56 -56
- reComputer/scripts/MoveNetJS/run.sh +13 -13
- reComputer/scripts/Sheared-LLaMA-2.7B-ShareGPT/init.sh +16 -16
- reComputer/scripts/Sheared-LLaMA-2.7B-ShareGPT/run.sh +8 -8
- reComputer/scripts/audiocraft/README.md +35 -35
- reComputer/scripts/audiocraft/clean.sh +5 -5
- reComputer/scripts/audiocraft/init.sh +16 -16
- reComputer/scripts/audiocraft/run.sh +7 -7
- reComputer/scripts/check.sh +4 -4
- reComputer/scripts/clean.sh +33 -33
- reComputer/scripts/comfyui/LICENSE +21 -21
- reComputer/scripts/comfyui/README.md +127 -127
- reComputer/scripts/comfyui/clean.sh +9 -7
- reComputer/scripts/comfyui/config.yaml +30 -29
- reComputer/scripts/comfyui/init.sh +9 -163
- reComputer/scripts/comfyui/run.sh +30 -30
- reComputer/scripts/depth-anything/Dockerfile +5 -5
- reComputer/scripts/depth-anything/LICENSE +21 -21
- reComputer/scripts/depth-anything/README.md +135 -135
- reComputer/scripts/depth-anything/clean.sh +7 -7
- reComputer/scripts/depth-anything/config.yaml +31 -31
- reComputer/scripts/depth-anything/init.sh +164 -164
- reComputer/scripts/depth-anything/run.sh +22 -22
- reComputer/scripts/depth-anything-v2/Dockerfile +5 -5
- reComputer/scripts/depth-anything-v2/LICENSE +21 -21
- reComputer/scripts/depth-anything-v2/README.md +135 -135
- reComputer/scripts/depth-anything-v2/clean.sh +7 -7
- reComputer/scripts/depth-anything-v2/config.yaml +31 -31
- reComputer/scripts/depth-anything-v2/init.sh +164 -164
- reComputer/scripts/depth-anything-v2/run.sh +22 -22
- reComputer/scripts/live-llava/init.sh +16 -16
- reComputer/scripts/live-llava/run.sh +278 -278
- reComputer/scripts/llama-factory/README.md +68 -68
- reComputer/scripts/llama-factory/clean.sh +4 -4
- reComputer/scripts/llama-factory/init.sh +52 -52
- reComputer/scripts/llama-factory/run.sh +10 -10
- reComputer/scripts/llama3/clean.sh +22 -22
- reComputer/scripts/llama3/config.yaml +31 -0
- reComputer/scripts/llama3/init.sh +19 -16
- reComputer/scripts/llama3/run.sh +13 -13
- reComputer/scripts/llava/clean.sh +3 -3
- reComputer/scripts/llava/init.sh +16 -16
- reComputer/scripts/llava/run.sh +9 -9
- reComputer/scripts/llava-v1.5-7b/init.sh +16 -16
- reComputer/scripts/llava-v1.5-7b/run.sh +9 -9
- reComputer/scripts/llava-v1.6-vicuna-7b/init.sh +16 -16
- reComputer/scripts/llava-v1.6-vicuna-7b/run.sh +10 -10
- reComputer/scripts/nanodb/init.sh +16 -16
- reComputer/scripts/nanodb/readme.md +10 -10
- reComputer/scripts/nanodb/run.sh +90 -90
- reComputer/scripts/nanoowl/init.sh +16 -16
- reComputer/scripts/nanoowl/run.sh +7 -7
- reComputer/scripts/ollama/clean.sh +22 -22
- reComputer/scripts/ollama/config.yaml +31 -0
- reComputer/scripts/ollama/init.sh +19 -16
- reComputer/scripts/ollama/run.sh +10 -10
- reComputer/scripts/parler-tts/clean.sh +7 -7
- reComputer/scripts/parler-tts/getVersion.sh +59 -59
- reComputer/scripts/parler-tts/init.sh +8 -8
- reComputer/scripts/parler-tts/readme.md +63 -63
- reComputer/scripts/parler-tts/run.sh +17 -17
- reComputer/scripts/run.sh +48 -48
- reComputer/scripts/stable-diffusion-webui/init.sh +16 -16
- reComputer/scripts/stable-diffusion-webui/run.sh +6 -6
- reComputer/scripts/text-generation-webui/init.sh +16 -16
- reComputer/scripts/text-generation-webui/run.sh +11 -11
- reComputer/scripts/ultralytics-yolo/LICENSE +21 -21
- reComputer/scripts/ultralytics-yolo/README.md +124 -124
- reComputer/scripts/ultralytics-yolo/clean.sh +6 -6
- reComputer/scripts/ultralytics-yolo/config.yaml +31 -31
- reComputer/scripts/ultralytics-yolo/init.sh +4 -4
- reComputer/scripts/ultralytics-yolo/run.sh +26 -26
- reComputer/scripts/update.sh +26 -26
- reComputer/scripts/utils.sh +168 -166
- reComputer/scripts/whisper/init.sh +16 -16
- reComputer/scripts/whisper/run.sh +7 -7
- reComputer/scripts/yolov10/Dockerfile +13 -13
- reComputer/scripts/yolov10/README.md +71 -71
- reComputer/scripts/yolov10/clean.sh +4 -4
- reComputer/scripts/yolov10/config.yaml +31 -31
- reComputer/scripts/yolov10/init.sh +20 -20
- reComputer/scripts/yolov10/run.sh +7 -7
- reComputer/scripts/yolov8-rail-inspection/config.yaml +31 -31
- reComputer/scripts/yolov8-rail-inspection/init.sh +5 -5
- reComputer/scripts/yolov8-rail-inspection/readme.md +35 -35
- reComputer/scripts/yolov8-rail-inspection/run.sh +21 -21
- jetson_examples-0.1.8.dist-info/RECORD +0 -107
- {jetson_examples-0.1.8.dist-info → jetson_examples-0.1.9.dist-info}/WHEEL +0 -0
- {jetson_examples-0.1.8.dist-info → jetson_examples-0.1.9.dist-info}/entry_points.txt +0 -0
- {jetson_examples-0.1.8.dist-info → jetson_examples-0.1.9.dist-info}/top_level.txt +0 -0
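The package ships on PyPI, so picking up the 0.1.9 changes on a device is an ordinary pip upgrade; a minimal sketch (the `reComputer` console command comes from the entry_points.txt listed above, and `ollama` is used here only as one of the example names under reComputer/scripts/):

```bash
# upgrade the wheel from 0.1.8 to 0.1.9
pip install --upgrade jetson-examples==0.1.9

# run one of the bundled examples via the console entry point
reComputer run ollama
```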
reComputer/scripts/nanodb/run.sh
CHANGED
@@ -1,91 +1,91 @@
(lines 1–90 are deleted and re-added with identical text — apparently a whitespace or line-ending-only change; the script body is shown once, line 91 is unchanged)

#!/bin/bash

BASE_PATH=/home/$USER/reComputer
JETSON_REPO_PATH="$BASE_PATH/jetson-containers"

check_disk_space() {
    directory="$1"          # a directory
    required_space_gb="$2"  # how many GB we need

    # get disk of directory
    device=$(df -P "$directory" | awk 'NR==2 {print $1}')
    echo $device

    # get free space in KB
    free_space=$(df -P "$device" | awk 'NR==2 {print $4}')
    echo $free_space

    # change unit to GB
    free_space_gb=$(echo "scale=2; $free_space / 1024 / 1024" | bc)
    echo $free_space_gb

    # check and fast-fail
    if (( $(echo "$free_space_gb >= $required_space_gb" | bc -l) )); then
        echo "disk space ($1) enough, keep going."
    else
        echo "disk space ($1) not enough!! we need $2 GB!!"
        exit 1
    fi
}

# check data files TODO: support params to force download
DATA_PATH="$JETSON_REPO_PATH/data/datasets/coco/2017"
if [ ! -d $DATA_PATH ]; then
    mkdir -p $DATA_PATH
fi
cd $DATA_PATH
# check val2017.zip
if [ ! -d "$DATA_PATH/val2017" ]; then
    if [ ! -f "val2017.zip" ]; then
        check_disk_space $DATA_PATH 1
        wget http://images.cocodataset.org/zips/val2017.zip
    else
        echo "val2017.zip existed."
    fi
    check_disk_space $DATA_PATH 19
    unzip val2017.zip && rm val2017.zip
else
    echo "val2017/ existed."
fi
# check train2017.zip
if [ ! -d "$DATA_PATH/train2017" ]; then
    if [ ! -f "train2017.zip" ]; then
        check_disk_space $DATA_PATH 19
        wget http://images.cocodataset.org/zips/train2017.zip
    else
        echo "train2017.zip existed."
    fi
    check_disk_space $DATA_PATH 19
    unzip train2017.zip && rm train2017.zip
else
    echo "train2017/ existed."
fi
if [ ! -d "$DATA_PATH/unlabeled2017" ]; then
    # check unlabeled2017.zip
    if [ ! -f "unlabeled2017.zip" ]; then
        check_disk_space $DATA_PATH 19
        wget http://images.cocodataset.org/zips/unlabeled2017.zip
    else
        echo "unlabeled2017.zip existed."
    fi
    check_disk_space $DATA_PATH 19
    unzip unlabeled2017.zip && rm unlabeled2017.zip
else
    echo "unlabeled2017/ existed."
fi

# check index files
INDEX_PATH="$JETSON_REPO_PATH/data/nanodb/coco/2017"
if [ ! -d $INDEX_PATH ]; then
    cd $JETSON_REPO_PATH/data/
    check_disk_space $JETSON_REPO_PATH 1
    wget https://nvidia.box.com/shared/static/icw8qhgioyj4qsk832r4nj2p9olsxoci.gz -O nanodb_coco_2017.tar.gz
    tar -xzvf nanodb_coco_2017.tar.gz
fi

# RUN
cd $JETSON_REPO_PATH
./run.sh $(./autotag nanodb) \
    python3 -m nanodb \
        --path /data/nanodb/coco/2017 \
        --server --port=7860
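The `check_disk_space` helper in the script above resolves the filesystem behind a directory with `df -P`, converts the free space to GB with `bc`, and exits if it falls below the requested threshold. A self-contained sketch of the same fail-fast idea using integer arithmetic instead of `bc` — the function name and layout here are illustrative, not the package's code:

```bash
#!/bin/bash
# Abort if the filesystem under a directory has fewer than N GB free.
require_free_gb() {
    local dir="$1" need_gb="$2"
    # df -P prints one record per filesystem; field 4 is available space in KB
    local avail_kb avail_gb
    avail_kb=$(df -P "$dir" | awk 'NR==2 {print $4}')
    avail_gb=$(( avail_kb / 1024 / 1024 ))
    if (( avail_gb < need_gb )); then
        echo "need ${need_gb} GB free under ${dir}, only ${avail_gb} GB available" >&2
        return 1
    fi
}

# e.g. the COCO train2017 download above asks for roughly 19 GB of headroom
require_free_gb "$HOME/reComputer" 19 || exit 1
```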
@@ -1,16 +1,16 @@
(all 16 lines are deleted and re-added with identical text; shown once)

#!/bin/bash


BASE_PATH=/home/$USER/reComputer
mkdir -p $BASE_PATH/
JETSON_REPO_PATH="$BASE_PATH/jetson-containers"
BASE_JETSON_LAB_GIT="https://github.com/dusty-nv/jetson-containers"
if [ -d $JETSON_REPO_PATH ]; then
    echo "jetson-ai-lab existed."
else
    echo "jetson-ai-lab does not installed. start init..."
    cd $BASE_PATH/
    git clone --depth=1 $BASE_JETSON_LAB_GIT
    cd $JETSON_REPO_PATH
    bash install.sh
fi
@@ -1,7 +1,7 @@
(all 7 lines are deleted and re-added with identical text; shown once)

#!/bin/bash

BASE_PATH=/home/$USER/reComputer
JETSON_REPO_PATH="$BASE_PATH/jetson-containers"
cd $JETSON_REPO_PATH

./run.sh $(./autotag nanoowl) bash -c "ls /dev/video* && cd examples/tree_demo && python3 tree_demo.py ../../data/owl_image_encoder_patch32.engine"
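The nanoowl command above runs `ls /dev/video*` inside the container before starting the tree demo, so a V4L2 camera has to be attached and visible. A quick host-side check before launching (illustrative only; `v4l2-ctl` comes from the v4l-utils package, which may need to be installed separately):

```bash
# confirm a capture device exists before starting the container
ls /dev/video* 2>/dev/null || echo "no /dev/video* device found"
v4l2-ctl --list-devices
```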
@@ -1,22 +1,22 @@
(all 22 lines are deleted and re-added with identical text; shown once)

#!/bin/bash
BASE_PATH=/home/$USER/reComputer
JETSON_REPO_PATH="$BASE_PATH/jetson-containers"
# search local image
img_tag=$($JETSON_REPO_PATH/autotag -p local ollama)
# check the return value
if [ $? -eq 0 ]; then
    echo "Found Image successfully."
    sudo docker rmi $img_tag
else
    echo "[warn] Found Image failed with error code $?. skip delete Image."
fi
#
# 4 build whl
read -p "Delete all data for ollama? (y/n): " choice
if [[ $choice == "y" || $choice == "Y" ]]; then
    echo "Delete=> $JETSON_REPO_PATH/data/models/ollama/"
    sudo rm -rf $JETSON_REPO_PATH/data/models/ollama/
    echo "Clean Data Done."
else
    echo "[warn] Skip Clean Data."
fi
@@ -0,0 +1,31 @@
(new file, 31 lines)

# The tested JetPack versions.
ALLOWED_L4T_VERSIONS:
  - 35.3.1
  - 35.4.1
  - 35.5.0
  - 36.3.0
REQUIRED_DISK_SPACE: 15 # in GB
REQUIRED_MEM_SPACE: 7
PACKAGES:
  - nvidia-jetpack
DOCKER:
  ENABLE: true
  DAEMON: |
    {
      "default-runtime": "nvidia",
      "runtimes": {
        "nvidia": {
          "path": "nvidia-container-runtime",
          "runtimeArgs": []
        }
      },
      "storage-driver": "overlay2",
      "data-root": "/var/lib/docker",
      "log-driver": "json-file",
      "log-opts": {
        "max-size": "100m",
        "max-file": "3"
      },
      "no-new-privileges": true,
      "experimental": false
    }
@@ -1,16 +1,19 @@
(only the new 19-line version is legible in this rendering; the removed 16-line version is blank)

#!/bin/bash

# check the runtime environment.
source $(dirname "$(realpath "$0")")/../utils.sh
check_base_env "$(dirname "$(realpath "$0")")/config.yaml"

BASE_PATH=/home/$USER/reComputer
mkdir -p $BASE_PATH/
JETSON_REPO_PATH="$BASE_PATH/jetson-containers"
BASE_JETSON_LAB_GIT="https://github.com/dusty-nv/jetson-containers"
if [ -d $JETSON_REPO_PATH ]; then
    echo "jetson-ai-lab existed."
else
    echo "jetson-ai-lab does not installed. start init..."
    cd $BASE_PATH/
    git clone --depth=1 $BASE_JETSON_LAB_GIT
    cd $JETSON_REPO_PATH
    bash install.sh
fi
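In 0.1.9 the ollama (and llama3) init.sh delegates environment validation to `check_base_env` in `reComputer/scripts/utils.sh`, driven by the new per-example config.yaml shown above (allowed L4T versions, required disk and memory space, Docker daemon settings). The real implementation lives in utils.sh, which this diff does not display; the sketch below only illustrates how such a check could read ALLOWED_L4T_VERSIONS from the YAML and compare it with the device's L4T release — the helper name and the parsing are assumptions:

```bash
#!/bin/bash
# Hypothetical sketch: reject the device if its L4T version is not listed
# under ALLOWED_L4T_VERSIONS in a config.yaml like the one added for ollama.
check_l4t_allowed() {
    local config="$1"

    # collect "  - 35.3.1"-style entries under the ALLOWED_L4T_VERSIONS key
    mapfile -t allowed < <(awk '/^ALLOWED_L4T_VERSIONS:/{f=1; next}
                                /^[^ ]/{f=0}
                                f && $1=="-"{print $2}' "$config")

    # same source the package's getVersion.sh uses:
    # "# R35 (release), REVISION: 3.1, ..." in /etc/nv_tegra_release
    local line release revision
    line=$(head -n 1 /etc/nv_tegra_release)
    release=$(echo "$line" | cut -f 2 -d ' ' | grep -Po '(?<=R)[^;]+')
    revision=$(echo "$line" | cut -f 2 -d ',' | grep -Po '(?<=REVISION: )[^;]+')
    local l4t_version="${release}.${revision}"

    for v in "${allowed[@]}"; do
        [ "$v" = "$l4t_version" ] && return 0
    done
    echo "L4T $l4t_version is not in the allowed list: ${allowed[*]}" >&2
    return 1
}

check_l4t_allowed "$(dirname "$0")/config.yaml" || exit 1
```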
reComputer/scripts/ollama/run.sh
CHANGED
@@ -1,11 +1,11 @@
(lines 1–10 are deleted and re-added with identical text; shown once, line 11 is unchanged)

#!/bin/bash

BASE_PATH=/home/$USER/reComputer
JETSON_REPO_PATH="$BASE_PATH/jetson-containers"
cd $JETSON_REPO_PATH

# try stop old server
docker rm -f ollama
# run Front-end
./run.sh $(./autotag ollama)
# user only can access with http://ip:11434
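The trailing comment in run.sh notes that the service is reachable only at http://ip:11434, which is the standard Ollama REST port. A quick reachability check from another machine on the network (the address is a placeholder):

```bash
# list the models the Ollama server on the Jetson currently knows about
curl http://<jetson-ip>:11434/api/tags
```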
@@ -1,7 +1,7 @@
(all 7 lines are deleted and re-added with identical text; shown once)

#!/bin/bash

# get image
source ./getVersion.sh

# remove docker image
sudo docker rmi feiticeir0/parler-tts:${TAG_IMAGE}
@@ -1,59 +1,59 @@
(all 59 lines are deleted and re-added with identical text; shown once)

#!/bin/bash
# based on dusty - https://github.com/dusty-nv/jetson-containers/blob/master/jetson_containers/l4t_version.sh
# and llama-factory init script

# we only have images for these - 36.2.0 works on 36.3.0
L4T_VERSIONS=("35.3.1", "35.4.1", "36.2.0", "36.3.0")

ARCH=$(uname -i)
# echo "ARCH: $ARCH"

if [ $ARCH = "aarch64" ]; then
    L4T_VERSION_STRING=$(head -n 1 /etc/nv_tegra_release)

    if [ -z "$L4T_VERSION_STRING" ]; then
        #echo "reading L4T version from \"dpkg-query --show nvidia-l4t-core\""

        L4T_VERSION_STRING=$(dpkg-query --showformat='${Version}' --show nvidia-l4t-core)
        L4T_VERSION_ARRAY=(${L4T_VERSION_STRING//./ })

        #echo ${L4T_VERSION_ARRAY[@]}
        #echo ${#L4T_VERSION_ARRAY[@]}

        L4T_RELEASE=${L4T_VERSION_ARRAY[0]}
        L4T_REVISION=${L4T_VERSION_ARRAY[1]}
    else
        #echo "reading L4T version from /etc/nv_tegra_release"

        L4T_RELEASE=$(echo $L4T_VERSION_STRING | cut -f 2 -d ' ' | grep -Po '(?<=R)[^;]+')
        L4T_REVISION=$(echo $L4T_VERSION_STRING | cut -f 2 -d ',' | grep -Po '(?<=REVISION: )[^;]+')
    fi

    L4T_REVISION_MAJOR=${L4T_REVISION:0:1}
    L4T_REVISION_MINOR=${L4T_REVISION:2:1}

    L4T_VERSION="$L4T_RELEASE.$L4T_REVISION"

    IMAGE_TAG=$L4T_VERSION

    #echo "L4T_VERSION : $L4T_VERSION"
    #echo "L4T_RELEASE : $L4T_RELEASE"
    #echo "L4T_REVISION: $L4T_REVISION"

elif [ $ARCH != "x86_64" ]; then
    echo "unsupported architecture: $ARCH"
    exit 1
fi


if [[ ! " ${L4T_VERSIONS[@]} " =~ " ${L4T_VERSION} " ]]; then
    echo "L4T_VERSION is not in the allowed versions list. Exiting."
    exit 1
fi

# check if 36 to change IMAGE_TAG
if [ ${L4T_RELEASE} -eq "36" ]; then
    # image tag will be 2.0
    IMAGE_TAG="36.2.0"
fi
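getVersion.sh is written to be sourced so that the variables it computes stay available to the caller — the parler-tts clean.sh above does exactly that before deleting the image. To see what it resolves on a given device (illustrative usage):

```bash
# run from reComputer/scripts/parler-tts/
source ./getVersion.sh
echo "L4T_VERSION=${L4T_VERSION}  IMAGE_TAG=${IMAGE_TAG}"
```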
@@ -1,8 +1,8 @@
(all 8 lines are deleted and re-added with identical text; shown once)

#!/bin/bash

echo "Creating models directory at /home/$USER/models"

# Create Model dir in User home
mkdir /home/$USER/models

@@ -1,63 +1,63 @@
(all 63 lines are deleted and re-added with identical text; shown once)

# Parler TTS Mini: Expresso


Parler-TTS Mini: Expresso is a fine-tuned version of Parler-TTS Mini v0.1 on the Expresso dataset. It is a lightweight text-to-speech (TTS) model that can generate high-quality, natural sounding speech. Compared to the original model, Parler-TTS Expresso provides superior control over emotions (happy, confused, laughing, sad) and consistent voices (Jerry, Thomas, Elisabeth, Talia).

[You can get more information on HuggingFace](https://huggingface.co/parler-tts/parler-tts-mini-expresso)

![Gradio Interface] (audio1.png)
![Gradio Interface result] (audio2.png)

## Getting started
#### Prerequisites
* SeeedStudio reComputer J402 [Buy one](https://www.seeedstudio.com/reComputer-J4012-p-5586.html)
* Audio Columns
* Docker installed

## Instalation
PyPI (best)

```bash
pip install jetson-examples
```

## Usage
### Method 1
##### If you're running inside your reComputer
1. Type the following command in a terminal
```bash
reComputer run parler-tts
```
2. Open a web browser and go to [http://localhost:7860](http://localhost:7860)
3. A Gradio interface will appear with two text boxes
    1. The first for you to write the text that will be converted to audio
    2. A second one for you to describe the speaker: Male/Female, tone, pitch, mood, etc.. See the examples in Parler-tts page.
4. When you press submit, after a while, the audio will appear on the right box. You can also download the file if yo want.

### Method 2
##### If you want to connect remotely with ssh to the reComputer
1. Connect using SSH but redirecting the 7860 port
```bash
ssh -L 7860:localhost:7860 <username>@<reComputer_IP>
```
2. Type the following command in a terminal
```bash
reComputer run parler-tts
```
3. Open a web browser (on your machine) and go to [http://localhost:7860](http://localhost:7860)

4. The same instructions above.

## Manual Run

If you want to run the docker image outside jetson-examples, here's the command:

```bash
docker run --rm -p 7860:7860 --runtime=nvidia -v $(MODELS_DIR):/app feiticeir0/parler_tts:r36.2.0
```

**MODELS_DIR** is a directory where HuggingFace will place the models downloaded from its hub. If you want to run the image several times, the code will only download the model once, if that diretory stays the same.

This is controlled by an environment variable called HF_HOME.

[More info about HF environment variables](https://huggingface.co/docs/huggingface_hub/package_reference/environment_variables)
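For the manual docker run in the README above, a slightly more explicit variant makes the cache directory and HF_HOME visible in one place — the `${MODELS_DIR}` expansion and the `-e HF_HOME=/app` setting are assumptions about how the image is wired, not documented behaviour:

```bash
# reuse the same host directory across runs so models are downloaded only once
MODELS_DIR=/home/$USER/models
mkdir -p "$MODELS_DIR"
docker run --rm -p 7860:7860 --runtime=nvidia \
    -v "${MODELS_DIR}":/app -e HF_HOME=/app \
    feiticeir0/parler_tts:r36.2.0
```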