@huggingface/transformers 3.3.1 → 3.3.3
This diff shows the changes between publicly released versions of the package, as they appear in their public registries. It is provided for informational purposes only.
- package/README.md +4 -2
- package/dist/ort-wasm-simd-threaded.jsep.mjs +26 -26
- package/dist/ort-wasm-simd-threaded.jsep.wasm +0 -0
- package/dist/transformers.cjs +163 -19
- package/dist/transformers.cjs.map +1 -1
- package/dist/transformers.js +524 -356
- package/dist/transformers.js.map +1 -1
- package/dist/transformers.min.cjs +1 -1
- package/dist/transformers.min.cjs.map +1 -1
- package/dist/transformers.min.js +1 -1
- package/dist/transformers.min.js.map +1 -1
- package/dist/transformers.min.mjs +1 -1
- package/dist/transformers.min.mjs.map +1 -1
- package/dist/transformers.mjs +170 -20
- package/dist/transformers.mjs.map +1 -1
- package/package.json +4 -4
- package/src/configs.js +9 -2
- package/src/env.js +1 -1
- package/src/models/processors.js +1 -0
- package/src/models/wav2vec2/processing_wav2vec2.js +4 -2
- package/src/models/wav2vec2_with_lm/processing_wav2vec2_with_lm.js +17 -0
- package/src/models.js +17 -0
- package/src/ops/registry.js +9 -1
- package/src/tokenizers.js +4 -2
- package/types/configs.d.ts.map +1 -1
- package/types/models/processors.d.ts +1 -0
- package/types/models/wav2vec2/processing_wav2vec2.d.ts +3 -1
- package/types/models/wav2vec2/processing_wav2vec2.d.ts.map +1 -1
- package/types/models/wav2vec2_with_lm/processing_wav2vec2_with_lm.d.ts +14 -0
- package/types/models/wav2vec2_with_lm/processing_wav2vec2_with_lm.d.ts.map +1 -0
- package/types/models.d.ts +12 -0
- package/types/models.d.ts.map +1 -1
- package/types/ops/registry.d.ts.map +1 -1
- package/types/tokenizers.d.ts.map +1 -1
- package/types/tsconfig.tsbuildinfo +1 -1
package/README.md
CHANGED
@@ -47,7 +47,7 @@ npm i @huggingface/transformers
 Alternatively, you can use it in vanilla JS, without any bundler, by using a CDN or static hosting. For example, using [ES Modules](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules), you can import the library with:
 ```html
 <script type="module">
-import { pipeline } from 'https://cdn.jsdelivr.net/npm/@huggingface/transformers@3.3.1';
+import { pipeline } from 'https://cdn.jsdelivr.net/npm/@huggingface/transformers@3.3.3';
 </script>
 ```
@@ -155,7 +155,7 @@ Check out the Transformers.js [template](https://huggingface.co/new-space?templa
 
-By default, Transformers.js uses [hosted pretrained models](https://huggingface.co/models?library=transformers.js) and [precompiled WASM binaries](https://cdn.jsdelivr.net/npm/@huggingface/transformers@3.3.1/dist/), which should work out-of-the-box. You can customize this as follows:
+By default, Transformers.js uses [hosted pretrained models](https://huggingface.co/models?library=transformers.js) and [precompiled WASM binaries](https://cdn.jsdelivr.net/npm/@huggingface/transformers@3.3.3/dist/), which should work out-of-the-box. You can customize this as follows:
 
 ### Settings
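The customization referred to above is done through the library's exported `env` object; a minimal sketch of self-hosting both the models and the WASM binaries (the paths below are illustrative placeholders, not defaults):

```javascript
import { env } from '@huggingface/transformers';

// Self-host models and WASM binaries instead of using the hosted defaults.
// Both paths are illustrative placeholders.
env.localModelPath = '/models/';             // look for models here first
env.allowRemoteModels = false;               // never fall back to the Hugging Face Hub
env.backends.onnx.wasm.wasmPaths = '/wasm/'; // serve the ONNX Runtime .wasm files yourself
```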
@@ -328,6 +328,7 @@ You can refine your search by selecting the task you're interested in (e.g., [te
 1. **Florence2** (from Microsoft) released with the paper [Florence-2: Advancing a Unified Representation for a Variety of Vision Tasks](https://arxiv.org/abs/2311.06242) by Bin Xiao, Haiping Wu, Weijian Xu, Xiyang Dai, Houdong Hu, Yumao Lu, Michael Zeng, Ce Liu, Lu Yuan.
 1. **[Gemma](https://huggingface.co/docs/transformers/main/model_doc/gemma)** (from Google) released with the paper [Gemma: Open Models Based on Gemini Technology and Research](https://blog.google/technology/developers/gemma-open-models/) by the Gemma Google team.
 1. **[Gemma2](https://huggingface.co/docs/transformers/main/model_doc/gemma2)** (from Google) released with the paper [Gemma2: Open Models Based on Gemini Technology and Research](https://blog.google/technology/developers/google-gemma-2/) by the Gemma Google team.
+1. **[GLM](https://huggingface.co/docs/transformers/main/model_doc/glm)** (from the GLM Team, THUDM & ZhipuAI) released with the paper [ChatGLM: A Family of Large Language Models from GLM-130B to GLM-4 All Tools](https://arxiv.org/abs/2406.12793v2) by Team GLM: Aohan Zeng, Bin Xu, Bowen Wang, Chenhui Zhang, Da Yin, Dan Zhang, Diego Rojas, Guanyu Feng, Hanlin Zhao, Hanyu Lai, Hao Yu, Hongning Wang, Jiadai Sun, Jiajie Zhang, Jiale Cheng, Jiayi Gui, Jie Tang, Jing Zhang, Jingyu Sun, Juanzi Li, Lei Zhao, Lindong Wu, Lucen Zhong, Mingdao Liu, Minlie Huang, Peng Zhang, Qinkai Zheng, Rui Lu, Shuaiqi Duan, Shudan Zhang, Shulin Cao, Shuxun Yang, Weng Lam Tam, Wenyi Zhao, Xiao Liu, Xiao Xia, Xiaohan Zhang, Xiaotao Gu, Xin Lv, Xinghan Liu, Xinyi Liu, Xinyue Yang, Xixuan Song, Xunkai Zhang, Yifan An, Yifan Xu, Yilin Niu, Yuantao Yang, Yueyan Li, Yushi Bai, Yuxiao Dong, Zehan Qi, Zhaoyu Wang, Zhen Yang, Zhengxiao Du, Zhenyu Hou, Zihan Wang.
 1. **[GLPN](https://huggingface.co/docs/transformers/model_doc/glpn)** (from KAIST) released with the paper [Global-Local Path Networks for Monocular Depth Estimation with Vertical CutDepth](https://arxiv.org/abs/2201.07436) by Doyeon Kim, Woonghyun Ga, Pyungwhan Ahn, Donggyu Joo, Sehwan Chun, Junmo Kim.
 1. **[GPT Neo](https://huggingface.co/docs/transformers/model_doc/gpt_neo)** (from EleutherAI) released in the repository [EleutherAI/gpt-neo](https://github.com/EleutherAI/gpt-neo) by Sid Black, Stella Biderman, Leo Gao, Phil Wang and Connor Leahy.
 1. **[GPT NeoX](https://huggingface.co/docs/transformers/model_doc/gpt_neox)** (from EleutherAI) released with the paper [GPT-NeoX-20B: An Open-Source Autoregressive Language Model](https://arxiv.org/abs/2204.06745) by Sid Black, Stella Biderman, Eric Hallahan, Quentin Anthony, Leo Gao, Laurence Golding, Horace He, Connor Leahy, Kyle McDonell, Jason Phang, Michael Pieler, USVSN Sai Prashanth, Shivanshu Purohit, Laria Reynolds, Jonathan Tow, Ben Wang, Samuel Weinbach
@@ -337,6 +338,7 @@ You can refine your search by selecting the task you're interested in (e.g., [te
 1. **[Granite](https://huggingface.co/docs/transformers/main/model_doc/granite)** (from IBM) released with the paper [Power Scheduler: A Batch Size and Token Number Agnostic Learning Rate Scheduler](https://arxiv.org/abs/2408.13359) by Yikang Shen, Matthew Stallone, Mayank Mishra, Gaoyuan Zhang, Shawn Tan, Aditya Prasad, Adriana Meza Soria, David D. Cox, Rameswar Panda.
 1. **[Grounding DINO](https://huggingface.co/docs/transformers/model_doc/grounding-dino)** (from IDEA-Research) released with the paper [Grounding DINO: Marrying DINO with Grounded Pre-Training for Open-Set Object Detection](https://arxiv.org/abs/2303.05499) by Shilong Liu, Zhaoyang Zeng, Tianhe Ren, Feng Li, Hao Zhang, Jie Yang, Qing Jiang, Chunyuan Li, Jianwei Yang, Hang Su, Jun Zhu, Lei Zhang.
 1. **[GroupViT](https://huggingface.co/docs/transformers/model_doc/groupvit)** (from UCSD, NVIDIA) released with the paper [GroupViT: Semantic Segmentation Emerges from Text Supervision](https://arxiv.org/abs/2202.11094) by Jiarui Xu, Shalini De Mello, Sifei Liu, Wonmin Byeon, Thomas Breuel, Jan Kautz, Xiaolong Wang.
+1. **[Helium](https://huggingface.co/docs/transformers/main/model_doc/helium)** (from the Kyutai Team) released with the blog post [Announcing Helium-1 Preview](https://kyutai.org/2025/01/13/helium.html) by the Kyutai Team.
 1. **[HerBERT](https://huggingface.co/docs/transformers/model_doc/herbert)** (from Allegro.pl, AGH University of Science and Technology) released with the paper [KLEJ: Comprehensive Benchmark for Polish Language Understanding](https://www.aclweb.org/anthology/2020.acl-main.111.pdf) by Piotr Rybak, Robert Mroczkowski, Janusz Tracz, Ireneusz Gawlik.
 1. **[Hiera](https://huggingface.co/docs/transformers/model_doc/hiera)** (from Meta) released with the paper [Hiera: A Hierarchical Vision Transformer without the Bells-and-Whistles](https://arxiv.org/pdf/2306.00989) by Chaitanya Ryali, Yuan-Ting Hu, Daniel Bolya, Chen Wei, Haoqi Fan, Po-Yao Huang, Vaibhav Aggarwal, Arkabandhu Chowdhury, Omid Poursaeed, Judy Hoffman, Jitendra Malik, Yanghao Li, Christoph Feichtenhofer.
 1. **[Hubert](https://huggingface.co/docs/transformers/model_doc/hubert)** (from Facebook) released with the paper [HuBERT: Self-Supervised Speech Representation Learning by Masked Prediction of Hidden Units](https://arxiv.org/abs/2106.07447) by Wei-Ning Hsu, Benjamin Bolte, Yao-Hung Hubert Tsai, Kushal Lakhotia, Ruslan Salakhutdinov, Abdelrahman Mohamed.
package/dist/ort-wasm-simd-threaded.jsep.mjs
CHANGED
@@ -25,31 +25,31 @@ if(!E){l=new WebAssembly.Memory({initial:256,maximum:65536,shared:!0});if(!(l.bu
var Xa=[],Ya=[],Za=[],$a=0,ab=null,bb=null;function cb(){$a--;if(0==$a&&(null!==ab&&(clearInterval(ab),ab=null),bb)){var a=bb;bb=null;a()}}function K(a){a="Aborted("+a+")";H(a);I=!0;Va=1;a=new WebAssembly.RuntimeError(a+". Build with -sASSERTIONS for more info.");oa(a);throw a;}var db=a=>a.startsWith("data:application/octet-stream;base64,"),Ba=a=>a.startsWith("file://"),eb;
function fb(a){if(a==eb&&Ta)return new Uint8Array(Ta);if(za)return za(a);throw"both async and sync fetching of the wasm failed";}function gb(a){if(!Ta&&(qa||C)){if("function"==typeof fetch&&!Ba(a))return fetch(a,{credentials:"same-origin"}).then(b=>{if(!b.ok)throw`failed to load wasm binary file at '${a}'`;return b.arrayBuffer()}).catch(()=>fb(a));if(ya)return new Promise((b,c)=>{ya(a,d=>b(new Uint8Array(d)),c)})}return Promise.resolve().then(()=>fb(a))}
function hb(a,b,c){return gb(a).then(d=>WebAssembly.instantiate(d,b)).then(c,d=>{H(`failed to asynchronously prepare wasm: ${d}`);K(d)})}function ib(a,b){var c=eb;return Ta||"function"!=typeof WebAssembly.instantiateStreaming||db(c)||Ba(c)||D||"function"!=typeof fetch?hb(c,a,b):fetch(c,{credentials:"same-origin"}).then(d=>WebAssembly.instantiateStreaming(d,a).then(b,function(f){H(`wasm streaming compile failed: ${f}`);H("falling back to ArrayBuffer instantiation");return hb(c,a,b)}))}
-
function Ja(){jb={O:kb,Aa:lb,b:mb,aa:nb,B:ob,qa:pb,Y:qb,_:rb,ra:sb,oa:tb,ha:ub,na:vb,L:wb,Z:xb,W:yb,pa:zb,X:Ab,va:Bb,F:Cb,Q:Db,P:Eb,E:Fb,u:Gb,q:Hb,G:Ib,A:Jb,R:Kb,ua:Lb,ka:Mb,U:Nb,ba:Ob,H:Pb,ja:La,ta:Qb,t:Rb,Ba:Sb,x:Tb,
-
var Lc={
-
-
{B.kb("Sigmoid",a,void 0)},
-
{B.kb("Tanh",a,void 0)},
-
a,void 0)},
-
0)):[]})},
-
axes:d?Array.from(z().subarray(Number(d)>>>0,Number(f)>>>0)):[]})},
-
-
h,u)=>{B.kb("ConvTranspose",a,{format:p?"NHWC":"NCHW",autoPad:b,dilations:[c],group:d,kernelShape:[f],pads:[g,k],strides:[m],wIsConst:()=>!!e()[n>>>0],outputPadding:r?Array.from(z().subarray(Number(r)>>>0,Number(v)>>>0)):[],outputShape:x?Array.from(z().subarray(Number(x)>>>0,Number(h)>>>0)):[],activation:L(u)})},
-
0,(Number(f)>>>0)+2>>>0)),pads:Array.from(z().subarray(Number(g)>>>0,(Number(g)>>>0)+4>>>0)),strides:Array.from(z().subarray(Number(k)>>>0,(Number(k)>>>0)+2>>>0)),wIsConst:()=>!!e()[p>>>0],outputPadding:n?Array.from(z().subarray(Number(n)>>>0,Number(r)>>>0)):[],outputShape:v?Array.from(z().subarray(Number(v)>>>0,Number(x)>>>0)):[],activation:L(h)})},
-
wIsConst:()=>!!e()[n>>>0],outputPadding:r?Array.from(z().subarray(Number(r)>>>0,Number(v)>>>0)):[],outputShape:x?Array.from(z().subarray(Number(x)>>>0,Number(h)>>>0)):[],activation:L(u)})},
-
0)),strides:Array.from(z().subarray(Number(k)>>>0,(Number(k)>>>0)+2>>>0)),wIsConst:()=>!!e()[p>>>0],outputPadding:n?Array.from(z().subarray(Number(n)>>>0,Number(r)>>>0)):[],outputShape:v?Array.from(z().subarray(Number(v)>>>0,Number(x)>>>0)):[],activation:L(h)})},
-
0,Number(k)>>>0)):[],kernel_shape:m?Array.from(z().subarray(Number(m)>>>0,Number(p)>>>0)):[],pads:n?Array.from(z().subarray(Number(n)>>>0,Number(r)>>>0)):[],strides:v?Array.from(z().subarray(Number(v)>>>0,Number(x)>>>0)):[]})},
-
0)):[],kernel_shape:m?Array.from(z().subarray(Number(m)>>>0,Number(p)>>>0)):[],pads:n?Array.from(z().subarray(Number(n)>>>0,Number(r)>>>0)):[],strides:v?Array.from(z().subarray(Number(v)>>>0,Number(x)>>>0)):[]})},
-
Array.from(z().subarray(Number(m)>>>0,Number(p)>>>0)):[],pads:n?Array.from(z().subarray(Number(n)>>>0,Number(r)>>>0)):[],strides:v?Array.from(z().subarray(Number(v)>>>0,Number(x)>>>0)):[]})},
-
0,Number(p)>>>0)):[],pads:n?Array.from(z().subarray(Number(n)>>>0,Number(r)>>>0)):[],strides:v?Array.from(z().subarray(Number(v)>>>0,Number(x)>>>0)):[]})},
-
-
excludeOutside:k,extrapolationValue:m,keepAspectRatioPolicy:L(p),mode:L(n),nearestMode:L(r)})},
-
format:c?"NHWC":"NCHW"})},
-
a,{exclusive:Number(b),reverse:Number(c)})},
-
0,Number(m)+k>>>0)):[],pastPresentShareBuffer:!!p})},
-
0,Number(r)>>>0)):[],w_is_const:()=>!!e()[Number(x)>>>0],activation:L(h),activation_params:u?Array.from(ja().subarray(Number(u)>>>0,Number(y)>>>0)):[]})},
-
simplified:!!d})},
-
a,{epsilon:b,simplified:!!c})},
+
function Ja(){jb={O:kb,Aa:lb,b:mb,aa:nb,B:ob,qa:pb,Y:qb,_:rb,ra:sb,oa:tb,ha:ub,na:vb,L:wb,Z:xb,W:yb,pa:zb,X:Ab,va:Bb,F:Cb,Q:Db,P:Eb,E:Fb,u:Gb,q:Hb,G:Ib,A:Jb,R:Kb,ua:Lb,ka:Mb,U:Nb,ba:Ob,H:Pb,ja:La,ta:Qb,t:Rb,Ba:Sb,x:Tb,o:Ub,m:Vb,c:Wb,n:Xb,k:Yb,w:Zb,p:$b,f:ac,s:bc,l:cc,e:dc,j:ec,i:fc,g:gc,d:hc,ea:ic,fa:jc,ga:kc,ca:lc,da:mc,T:nc,h:oc,D:pc,I:qc,M:rc,y:sc,sa:tc,V:uc,v:vc,z:wc,N:xc,S:yc,za:zc,ya:Ac,la:Bc,ma:Cc,$:Dc,C:Ec,K:Fc,ia:Gc,J:Hc,a:l,xa:Ic,wa:Jc,r:Kc};return{a:jb}}
+
var Lc={916868:(a,b,c,d,f)=>{if("undefined"==typeof B||!B.Fb)return 1;a=L(Number(a>>>0));a.startsWith("./")&&(a=a.substring(2));a=B.Fb.get(a);if(!a)return 2;b=Number(b>>>0);c=Number(c>>>0);d=Number(d>>>0);if(b+c>a.byteLength)return 3;try{const g=a.subarray(b,b+c);switch(f){case 0:w().set(g,d>>>0);break;case 1:B.dc(d,g);break;default:return 4}return 0}catch{return 4}},917583:(a,b,c)=>{B.ec(a,w().subarray(b>>>0,b+c>>>0))},917646:()=>B.bc(),917687:a=>{B.Pb(a)},917723:()=>{B.Wb()},917754:()=>{B.Xb()},
+
917783:()=>{B.ac()},917808:a=>B.Vb(a),917841:a=>B.Zb(a),917873:(a,b,c)=>{B.Ob(Number(a),Number(b),Number(c),!0)},917936:(a,b,c)=>{B.Ob(Number(a),Number(b),Number(c))},917993:()=>"undefined"!==typeof wasmOffsetConverter,918050:a=>{B.kb("Abs",a,void 0)},918101:a=>{B.kb("Neg",a,void 0)},918152:a=>{B.kb("Floor",a,void 0)},918205:a=>{B.kb("Ceil",a,void 0)},918257:a=>{B.kb("Reciprocal",a,void 0)},918315:a=>{B.kb("Sqrt",a,void 0)},918367:a=>{B.kb("Exp",a,void 0)},918418:a=>{B.kb("Erf",a,void 0)},918469:a=>
+
{B.kb("Sigmoid",a,void 0)},918524:(a,b,c)=>{B.kb("HardSigmoid",a,{alpha:b,beta:c})},918603:a=>{B.kb("Log",a,void 0)},918654:a=>{B.kb("Sin",a,void 0)},918705:a=>{B.kb("Cos",a,void 0)},918756:a=>{B.kb("Tan",a,void 0)},918807:a=>{B.kb("Asin",a,void 0)},918859:a=>{B.kb("Acos",a,void 0)},918911:a=>{B.kb("Atan",a,void 0)},918963:a=>{B.kb("Sinh",a,void 0)},919015:a=>{B.kb("Cosh",a,void 0)},919067:a=>{B.kb("Asinh",a,void 0)},919120:a=>{B.kb("Acosh",a,void 0)},919173:a=>{B.kb("Atanh",a,void 0)},919226:a=>
+
{B.kb("Tanh",a,void 0)},919278:a=>{B.kb("Not",a,void 0)},919329:(a,b,c)=>{B.kb("Clip",a,{min:b,max:c})},919398:a=>{B.kb("Clip",a,void 0)},919450:(a,b)=>{B.kb("Elu",a,{alpha:b})},919508:a=>{B.kb("Gelu",a,void 0)},919560:a=>{B.kb("Relu",a,void 0)},919612:(a,b)=>{B.kb("LeakyRelu",a,{alpha:b})},919676:(a,b)=>{B.kb("ThresholdedRelu",a,{alpha:b})},919746:(a,b)=>{B.kb("Cast",a,{to:b})},919804:a=>{B.kb("Add",a,void 0)},919855:a=>{B.kb("Sub",a,void 0)},919906:a=>{B.kb("Mul",a,void 0)},919957:a=>{B.kb("Div",
+
a,void 0)},920008:a=>{B.kb("Pow",a,void 0)},920059:a=>{B.kb("Equal",a,void 0)},920112:a=>{B.kb("Greater",a,void 0)},920167:a=>{B.kb("GreaterOrEqual",a,void 0)},920229:a=>{B.kb("Less",a,void 0)},920281:a=>{B.kb("LessOrEqual",a,void 0)},920340:(a,b,c,d,f)=>{B.kb("ReduceMean",a,{keepDims:!!b,noopWithEmptyAxes:!!c,axes:d?Array.from(z().subarray(Number(d)>>>0,Number(f)>>>0)):[]})},920515:(a,b,c,d,f)=>{B.kb("ReduceMax",a,{keepDims:!!b,noopWithEmptyAxes:!!c,axes:d?Array.from(z().subarray(Number(d)>>>0,Number(f)>>>
+
0)):[]})},920689:(a,b,c,d,f)=>{B.kb("ReduceMin",a,{keepDims:!!b,noopWithEmptyAxes:!!c,axes:d?Array.from(z().subarray(Number(d)>>>0,Number(f)>>>0)):[]})},920863:(a,b,c,d,f)=>{B.kb("ReduceProd",a,{keepDims:!!b,noopWithEmptyAxes:!!c,axes:d?Array.from(z().subarray(Number(d)>>>0,Number(f)>>>0)):[]})},921038:(a,b,c,d,f)=>{B.kb("ReduceSum",a,{keepDims:!!b,noopWithEmptyAxes:!!c,axes:d?Array.from(z().subarray(Number(d)>>>0,Number(f)>>>0)):[]})},921212:(a,b,c,d,f)=>{B.kb("ReduceL1",a,{keepDims:!!b,noopWithEmptyAxes:!!c,
+
axes:d?Array.from(z().subarray(Number(d)>>>0,Number(f)>>>0)):[]})},921385:(a,b,c,d,f)=>{B.kb("ReduceL2",a,{keepDims:!!b,noopWithEmptyAxes:!!c,axes:d?Array.from(z().subarray(Number(d)>>>0,Number(f)>>>0)):[]})},921558:(a,b,c,d,f)=>{B.kb("ReduceLogSum",a,{keepDims:!!b,noopWithEmptyAxes:!!c,axes:d?Array.from(z().subarray(Number(d)>>>0,Number(f)>>>0)):[]})},921735:(a,b,c,d,f)=>{B.kb("ReduceSumSquare",a,{keepDims:!!b,noopWithEmptyAxes:!!c,axes:d?Array.from(z().subarray(Number(d)>>>0,Number(f)>>>0)):[]})},
+
921915:(a,b,c,d,f)=>{B.kb("ReduceLogSumExp",a,{keepDims:!!b,noopWithEmptyAxes:!!c,axes:d?Array.from(z().subarray(Number(d)>>>0,Number(f)>>>0)):[]})},922095:a=>{B.kb("Where",a,void 0)},922148:(a,b,c)=>{B.kb("Transpose",a,{perm:b?Array.from(z().subarray(Number(b)>>>0,Number(c)>>>0)):[]})},922272:(a,b,c,d)=>{B.kb("DepthToSpace",a,{blocksize:b,mode:L(c),format:d?"NHWC":"NCHW"})},922405:(a,b,c,d)=>{B.kb("DepthToSpace",a,{blocksize:b,mode:L(c),format:d?"NHWC":"NCHW"})},922538:(a,b,c,d,f,g,k,m,p,n,r,v,x,
+
h,u)=>{B.kb("ConvTranspose",a,{format:p?"NHWC":"NCHW",autoPad:b,dilations:[c],group:d,kernelShape:[f],pads:[g,k],strides:[m],wIsConst:()=>!!e()[n>>>0],outputPadding:r?Array.from(z().subarray(Number(r)>>>0,Number(v)>>>0)):[],outputShape:x?Array.from(z().subarray(Number(x)>>>0,Number(h)>>>0)):[],activation:L(u)})},922971:(a,b,c,d,f,g,k,m,p,n,r,v,x,h)=>{B.kb("ConvTranspose",a,{format:m?"NHWC":"NCHW",autoPad:b,dilations:Array.from(z().subarray(Number(c)>>>0,(Number(c)>>>0)+2>>>0)),group:d,kernelShape:Array.from(z().subarray(Number(f)>>>
+
0,(Number(f)>>>0)+2>>>0)),pads:Array.from(z().subarray(Number(g)>>>0,(Number(g)>>>0)+4>>>0)),strides:Array.from(z().subarray(Number(k)>>>0,(Number(k)>>>0)+2>>>0)),wIsConst:()=>!!e()[p>>>0],outputPadding:n?Array.from(z().subarray(Number(n)>>>0,Number(r)>>>0)):[],outputShape:v?Array.from(z().subarray(Number(v)>>>0,Number(x)>>>0)):[],activation:L(h)})},923632:(a,b,c,d,f,g,k,m,p,n,r,v,x,h,u)=>{B.kb("ConvTranspose",a,{format:p?"NHWC":"NCHW",autoPad:b,dilations:[c],group:d,kernelShape:[f],pads:[g,k],strides:[m],
+
wIsConst:()=>!!e()[n>>>0],outputPadding:r?Array.from(z().subarray(Number(r)>>>0,Number(v)>>>0)):[],outputShape:x?Array.from(z().subarray(Number(x)>>>0,Number(h)>>>0)):[],activation:L(u)})},924065:(a,b,c,d,f,g,k,m,p,n,r,v,x,h)=>{B.kb("ConvTranspose",a,{format:m?"NHWC":"NCHW",autoPad:b,dilations:Array.from(z().subarray(Number(c)>>>0,(Number(c)>>>0)+2>>>0)),group:d,kernelShape:Array.from(z().subarray(Number(f)>>>0,(Number(f)>>>0)+2>>>0)),pads:Array.from(z().subarray(Number(g)>>>0,(Number(g)>>>0)+4>>>
+
0)),strides:Array.from(z().subarray(Number(k)>>>0,(Number(k)>>>0)+2>>>0)),wIsConst:()=>!!e()[p>>>0],outputPadding:n?Array.from(z().subarray(Number(n)>>>0,Number(r)>>>0)):[],outputShape:v?Array.from(z().subarray(Number(v)>>>0,Number(x)>>>0)):[],activation:L(h)})},924726:(a,b)=>{B.kb("GlobalAveragePool",a,{format:b?"NHWC":"NCHW"})},924817:(a,b,c,d,f,g,k,m,p,n,r,v,x,h)=>{B.kb("AveragePool",a,{format:h?"NHWC":"NCHW",auto_pad:b,ceil_mode:c,count_include_pad:d,storage_order:f,dilations:g?Array.from(z().subarray(Number(g)>>>
+
0,Number(k)>>>0)):[],kernel_shape:m?Array.from(z().subarray(Number(m)>>>0,Number(p)>>>0)):[],pads:n?Array.from(z().subarray(Number(n)>>>0,Number(r)>>>0)):[],strides:v?Array.from(z().subarray(Number(v)>>>0,Number(x)>>>0)):[]})},925296:(a,b)=>{B.kb("GlobalAveragePool",a,{format:b?"NHWC":"NCHW"})},925387:(a,b,c,d,f,g,k,m,p,n,r,v,x,h)=>{B.kb("AveragePool",a,{format:h?"NHWC":"NCHW",auto_pad:b,ceil_mode:c,count_include_pad:d,storage_order:f,dilations:g?Array.from(z().subarray(Number(g)>>>0,Number(k)>>>
+
0)):[],kernel_shape:m?Array.from(z().subarray(Number(m)>>>0,Number(p)>>>0)):[],pads:n?Array.from(z().subarray(Number(n)>>>0,Number(r)>>>0)):[],strides:v?Array.from(z().subarray(Number(v)>>>0,Number(x)>>>0)):[]})},925866:(a,b)=>{B.kb("GlobalMaxPool",a,{format:b?"NHWC":"NCHW"})},925953:(a,b,c,d,f,g,k,m,p,n,r,v,x,h)=>{B.kb("MaxPool",a,{format:h?"NHWC":"NCHW",auto_pad:b,ceil_mode:c,count_include_pad:d,storage_order:f,dilations:g?Array.from(z().subarray(Number(g)>>>0,Number(k)>>>0)):[],kernel_shape:m?
+
Array.from(z().subarray(Number(m)>>>0,Number(p)>>>0)):[],pads:n?Array.from(z().subarray(Number(n)>>>0,Number(r)>>>0)):[],strides:v?Array.from(z().subarray(Number(v)>>>0,Number(x)>>>0)):[]})},926428:(a,b)=>{B.kb("GlobalMaxPool",a,{format:b?"NHWC":"NCHW"})},926515:(a,b,c,d,f,g,k,m,p,n,r,v,x,h)=>{B.kb("MaxPool",a,{format:h?"NHWC":"NCHW",auto_pad:b,ceil_mode:c,count_include_pad:d,storage_order:f,dilations:g?Array.from(z().subarray(Number(g)>>>0,Number(k)>>>0)):[],kernel_shape:m?Array.from(z().subarray(Number(m)>>>
+
0,Number(p)>>>0)):[],pads:n?Array.from(z().subarray(Number(n)>>>0,Number(r)>>>0)):[],strides:v?Array.from(z().subarray(Number(v)>>>0,Number(x)>>>0)):[]})},926990:(a,b,c,d,f)=>{B.kb("Gemm",a,{alpha:b,beta:c,transA:d,transB:f})},927094:a=>{B.kb("MatMul",a,void 0)},927148:(a,b,c,d)=>{B.kb("ArgMax",a,{keepDims:!!b,selectLastIndex:!!c,axis:d})},927256:(a,b,c,d)=>{B.kb("ArgMin",a,{keepDims:!!b,selectLastIndex:!!c,axis:d})},927364:(a,b)=>{B.kb("Softmax",a,{axis:b})},927427:(a,b)=>{B.kb("Concat",a,{axis:b})},
+
927487:(a,b,c,d,f)=>{B.kb("Split",a,{axis:b,numOutputs:c,splitSizes:d?Array.from(z().subarray(Number(d)>>>0,Number(f)>>>0)):[]})},927643:a=>{B.kb("Expand",a,void 0)},927697:(a,b)=>{B.kb("Gather",a,{axis:Number(b)})},927768:(a,b)=>{B.kb("GatherElements",a,{axis:Number(b)})},927847:(a,b)=>{B.kb("GatherND",a,{batch_dims:Number(b)})},927926:(a,b,c,d,f,g,k,m,p,n,r)=>{B.kb("Resize",a,{antialias:b,axes:c?Array.from(z().subarray(Number(c)>>>0,Number(d)>>>0)):[],coordinateTransformMode:L(f),cubicCoeffA:g,
+
excludeOutside:k,extrapolationValue:m,keepAspectRatioPolicy:L(p),mode:L(n),nearestMode:L(r)})},928288:(a,b,c,d,f,g,k)=>{B.kb("Slice",a,{starts:b?Array.from(z().subarray(Number(b)>>>0,Number(c)>>>0)):[],ends:d?Array.from(z().subarray(Number(d)>>>0,Number(f)>>>0)):[],axes:g?Array.from(z().subarray(Number(g)>>>0,Number(k)>>>0)):[]})},928552:a=>{B.kb("Tile",a,void 0)},928604:(a,b,c)=>{B.kb("InstanceNormalization",a,{epsilon:b,format:c?"NHWC":"NCHW"})},928718:(a,b,c)=>{B.kb("InstanceNormalization",a,{epsilon:b,
+
format:c?"NHWC":"NCHW"})},928832:a=>{B.kb("Range",a,void 0)},928885:(a,b)=>{B.kb("Einsum",a,{equation:L(b)})},928966:(a,b,c,d,f)=>{B.kb("Pad",a,{mode:b,value:c,pads:d?Array.from(z().subarray(Number(d)>>>0,Number(f)>>>0)):[]})},929109:(a,b,c,d,f,g)=>{B.kb("BatchNormalization",a,{epsilon:b,momentum:c,spatial:!!f,trainingMode:!!d,format:g?"NHWC":"NCHW"})},929278:(a,b,c,d,f,g)=>{B.kb("BatchNormalization",a,{epsilon:b,momentum:c,spatial:!!f,trainingMode:!!d,format:g?"NHWC":"NCHW"})},929447:(a,b,c)=>{B.kb("CumSum",
+
a,{exclusive:Number(b),reverse:Number(c)})},929544:(a,b,c)=>{B.kb("DequantizeLinear",a,{axis:b,blockSize:c})},929634:(a,b,c,d,f)=>{B.kb("GridSample",a,{align_corners:b,mode:L(c),padding_mode:L(d),format:f?"NHWC":"NCHW"})},929804:(a,b,c,d,f)=>{B.kb("GridSample",a,{align_corners:b,mode:L(c),padding_mode:L(d),format:f?"NHWC":"NCHW"})},929974:(a,b,c,d,f,g,k,m,p)=>{B.kb("Attention",a,{numHeads:b,isUnidirectional:c,maskFilterValue:d,scale:f,doRotary:g,qkvHiddenSizes:k?Array.from(z().subarray(Number(m)>>>
+
0,Number(m)+k>>>0)):[],pastPresentShareBuffer:!!p})},930246:a=>{B.kb("BiasAdd",a,void 0)},930301:a=>{B.kb("BiasSplitGelu",a,void 0)},930362:a=>{B.kb("FastGelu",a,void 0)},930418:(a,b,c,d,f,g,k,m,p,n,r,v,x,h,u,y)=>{B.kb("Conv",a,{format:v?"NHWC":"NCHW",auto_pad:b,dilations:c?Array.from(z().subarray(Number(c)>>>0,Number(d)>>>0)):[],group:f,kernel_shape:g?Array.from(z().subarray(Number(g)>>>0,Number(k)>>>0)):[],pads:m?Array.from(z().subarray(Number(m)>>>0,Number(p)>>>0)):[],strides:n?Array.from(z().subarray(Number(n)>>>
+
0,Number(r)>>>0)):[],w_is_const:()=>!!e()[Number(x)>>>0],activation:L(h),activation_params:u?Array.from(ja().subarray(Number(u)>>>0,Number(y)>>>0)):[]})},931002:a=>{B.kb("Gelu",a,void 0)},931054:(a,b,c,d,f,g,k,m,p)=>{B.kb("GroupQueryAttention",a,{numHeads:b,kvNumHeads:c,scale:d,softcap:f,doRotary:g,rotaryInterleaved:k,smoothSoftmax:m,localWindowSize:p})},931271:(a,b,c,d)=>{B.kb("LayerNormalization",a,{axis:b,epsilon:c,simplified:!!d})},931382:(a,b,c,d)=>{B.kb("LayerNormalization",a,{axis:b,epsilon:c,
+
simplified:!!d})},931493:(a,b,c,d,f,g)=>{B.kb("MatMulNBits",a,{k:b,n:c,accuracyLevel:d,bits:f,blockSize:g})},931620:(a,b,c,d,f,g)=>{B.kb("MultiHeadAttention",a,{numHeads:b,isUnidirectional:c,maskFilterValue:d,scale:f,doRotary:g})},931779:(a,b)=>{B.kb("QuickGelu",a,{alpha:b})},931843:(a,b,c,d,f)=>{B.kb("RotaryEmbedding",a,{interleaved:!!b,numHeads:c,rotaryEmbeddingDim:d,scale:f})},931982:(a,b,c)=>{B.kb("SkipLayerNormalization",a,{epsilon:b,simplified:!!c})},932084:(a,b,c)=>{B.kb("SkipLayerNormalization",
+
a,{epsilon:b,simplified:!!c})},932186:(a,b,c,d)=>{B.kb("GatherBlockQuantized",a,{gatherAxis:b,quantizeAxis:c,blockSize:d})},932307:a=>{B.$b(a)},932341:(a,b)=>B.cc(Number(a),Number(b),B.Gb.hc,B.Gb.errors)};function lb(a,b,c){return Mc(async()=>{await B.Yb(Number(a),Number(b),Number(c))})}function kb(){return"undefined"!==typeof wasmOffsetConverter}function Nc(a){this.name="ExitStatus";this.message=`Program terminated with exit(${a})`;this.status=a}
var Oc=a=>{a.terminate();a.onmessage=()=>{}},Rc=a=>{0==M.length&&(Pc(),Qc(M[0]));var b=M.pop();if(!b)return 6;N.push(b);O[a.Bb]=b;b.Bb=a.Bb;var c={cmd:"run",start_routine:a.ic,arg:a.Rb,pthread_ptr:a.Bb};D&&b.unref();b.postMessage(c,a.nc);return 0},Sc=0,P=(a,b,...c)=>{for(var d=2*c.length,f=Tc(),g=Vc(8*d),k=g>>>3,m=0;m<c.length;m++){var p=c[m];"bigint"==typeof p?(J[k+2*m]=1n,J[k+2*m+1]=p):(J[k+2*m]=0n,la()[k+2*m+1>>>0]=p)}a=Wc(a,0,d,g,b);Xc(f);return a};
function Ic(a){if(E)return P(0,1,a);Va=a;if(!(0<Sc)){for(var b of N)Oc(b);for(b of M)Oc(b);M=[];N=[];O=[];I=!0}wa(a,new Nc(a))}function Yc(a){if(E)return P(1,0,a);Dc(a)}var Dc=a=>{Va=a;if(E)throw Yc(a),"unwind";Ic(a)},M=[],N=[],Zc=[],O={};function $c(){for(var a=B.numThreads-1;a--;)Pc();Xa.unshift(()=>{$a++;ad(()=>cb())})}var cd=a=>{var b=a.Bb;delete O[b];M.push(a);N.splice(N.indexOf(a),1);a.Bb=0;bd(b)};function Na(){Zc.forEach(a=>a())}
var Qc=a=>new Promise(b=>{a.onmessage=g=>{g=g.data;var k=g.cmd;if(g.targetThread&&g.targetThread!=Ia()){var m=O[g.targetThread];m?m.postMessage(g,g.transferList):H(`Internal error! Worker sent a message "${k}" to target pthread ${g.targetThread}, but that thread no longer exists!`)}else if("checkMailbox"===k)Ra();else if("spawnThread"===k)Rc(g);else if("cleanupThread"===k)cd(O[g.thread]);else if("killThread"===k)g=g.thread,k=O[g],delete O[g],Oc(k),bd(g),N.splice(N.indexOf(k),1),k.Bb=0;else if("cancelThread"===
@@ -115,7 +115,7 @@ B._OrtAddFreeDimensionOverride=(a,b,c)=>(B._OrtAddFreeDimensionOverride=Y.Ja)(a,
B._OrtGetOutputName=(a,b)=>(B._OrtGetOutputName=Y.Qa)(a,b);B._OrtFree=a=>(B._OrtFree=Y.Ra)(a);B._OrtCreateTensor=(a,b,c,d,f,g)=>(B._OrtCreateTensor=Y.Sa)(a,b,c,d,f,g);B._OrtGetTensorData=(a,b,c,d,f)=>(B._OrtGetTensorData=Y.Ta)(a,b,c,d,f);B._OrtReleaseTensor=a=>(B._OrtReleaseTensor=Y.Ua)(a);B._OrtCreateRunOptions=(a,b,c,d)=>(B._OrtCreateRunOptions=Y.Va)(a,b,c,d);B._OrtAddRunConfigEntry=(a,b,c)=>(B._OrtAddRunConfigEntry=Y.Wa)(a,b,c);B._OrtReleaseRunOptions=a=>(B._OrtReleaseRunOptions=Y.Xa)(a);
B._OrtCreateBinding=a=>(B._OrtCreateBinding=Y.Ya)(a);B._OrtBindInput=(a,b,c)=>(B._OrtBindInput=Y.Za)(a,b,c);B._OrtBindOutput=(a,b,c,d)=>(B._OrtBindOutput=Y._a)(a,b,c,d);B._OrtClearBoundOutputs=a=>(B._OrtClearBoundOutputs=Y.$a)(a);B._OrtReleaseBinding=a=>(B._OrtReleaseBinding=Y.ab)(a);B._OrtRunWithBinding=(a,b,c,d,f)=>(B._OrtRunWithBinding=Y.bb)(a,b,c,d,f);B._OrtRun=(a,b,c,d,f,g,k,m)=>(B._OrtRun=Y.cb)(a,b,c,d,f,g,k,m);B._OrtEndProfiling=a=>(B._OrtEndProfiling=Y.db)(a);
B._JsepOutput=(a,b,c)=>(B._JsepOutput=Y.eb)(a,b,c);B._JsepGetNodeName=a=>(B._JsepGetNodeName=Y.fb)(a);
-
var Ia=()=>(Ia=Y.gb)(),X=B._free=a=>(X=B._free=Y.hb)(a),Ad=B._malloc=a=>(Ad=B._malloc=Y.ib)(a),Ka=(a,b,c,d,f,g)=>(Ka=Y.lb)(a,b,c,d,f,g),Sa=()=>(Sa=Y.mb)(),Wc=(a,b,c,d,f)=>(Wc=Y.nb)(a,b,c,d,f),bd=a=>(bd=Y.ob)(a),Qa=a=>(Qa=Y.pb)(a),Jd=()=>(Jd=Y.qb)(),ed=(a,b)=>(ed=Y.rb)(a,b),Xc=a=>(Xc=Y.sb)(a),Vc=a=>(Vc=Y.tb)(a),Tc=()=>(Tc=Y.ub)(),fd=B.dynCall_ii=(a,b)=>(fd=B.dynCall_ii=Y.wb)(a,b),ce=a=>(ce=Y.xb)(a),Sd=()=>(Sd=Y.yb)(),be=a=>(be=Y.zb)(a),de=()=>(de=Y.Ab)();B.___start_em_js=
+
var Ia=()=>(Ia=Y.gb)(),X=B._free=a=>(X=B._free=Y.hb)(a),Ad=B._malloc=a=>(Ad=B._malloc=Y.ib)(a),Ka=(a,b,c,d,f,g)=>(Ka=Y.lb)(a,b,c,d,f,g),Sa=()=>(Sa=Y.mb)(),Wc=(a,b,c,d,f)=>(Wc=Y.nb)(a,b,c,d,f),bd=a=>(bd=Y.ob)(a),Qa=a=>(Qa=Y.pb)(a),Jd=()=>(Jd=Y.qb)(),ed=(a,b)=>(ed=Y.rb)(a,b),Xc=a=>(Xc=Y.sb)(a),Vc=a=>(Vc=Y.tb)(a),Tc=()=>(Tc=Y.ub)(),fd=B.dynCall_ii=(a,b)=>(fd=B.dynCall_ii=Y.wb)(a,b),ce=a=>(ce=Y.xb)(a),Sd=()=>(Sd=Y.yb)(),be=a=>(be=Y.zb)(a),de=()=>(de=Y.Ab)();B.___start_em_js=932469;B.___stop_em_js=932715;
function Ee(){var a=Y;a=Object.assign({},a);var b=d=>f=>d(f)>>>0,c=d=>()=>d()>>>0;a.Da=b(a.Da);a.gb=c(a.gb);a.ib=b(a.ib);a.emscripten_main_runtime_thread_id=c(a.emscripten_main_runtime_thread_id);a.tb=b(a.tb);a.ub=c(a.ub);return a}B.stackSave=()=>Tc();B.stackRestore=a=>Xc(a);B.stackAlloc=a=>Vc(a);
B.setValue=function(a,b,c="i8"){c.endsWith("*")&&(c="*");switch(c){case "i1":e()[a>>>0]=b;break;case "i8":e()[a>>>0]=b;break;case "i16":ca()[a>>>1>>>0]=b;break;case "i32":z()[a>>>2>>>0]=b;break;case "i64":J[a>>>3]=BigInt(b);break;case "float":ja()[a>>>2>>>0]=b;break;case "double":la()[a>>>3>>>0]=b;break;case "*":A()[a>>>2>>>0]=b;break;default:K(`invalid type for setValue: ${c}`)}};
B.getValue=function(a,b="i8"){b.endsWith("*")&&(b="*");switch(b){case "i1":return e()[a>>>0];case "i8":return e()[a>>>0];case "i16":return ca()[a>>>1>>>0];case "i32":return z()[a>>>2>>>0];case "i64":return J[a>>>3];case "float":return ja()[a>>>2>>>0];case "double":return la()[a>>>3>>>0];case "*":return A()[a>>>2>>>0];default:K(`invalid type for getValue: ${b}`)}};B.UTF8ToString=L;B.stringToUTF8=pd;B.lengthBytesUTF8=nd;var Fe;bb=function Ge(){Fe||He();Fe||(bb=Ge)};
package/dist/ort-wasm-simd-threaded.jsep.wasm
CHANGED
Binary file
package/dist/transformers.cjs
CHANGED
@@ -752,18 +752,19 @@ function parse(tokens) {
 return left;
 }
 function parseCallMemberExpression() {
-const member = parseMemberExpression();
+const member = parseMemberExpression(parsePrimaryExpression());
 if (is(TOKEN_TYPES.OpenParen)) {
 return parseCallExpression(member);
 }
 return member;
 }
 function parseCallExpression(callee) {
-let
+let expression = new CallExpression(callee, parseArgs());
+expression = parseMemberExpression(expression);
 if (is(TOKEN_TYPES.OpenParen)) {
-
+expression = parseCallExpression(expression);
 }
-return
+return expression;
 }
 function parseArgs() {
 expect(TOKEN_TYPES.OpenParen, "Expected opening parenthesis for arguments list");
@@ -817,8 +818,7 @@ function parse(tokens) {
     }
     return slices[0];
   }
-  function parseMemberExpression() {
-    let object = parsePrimaryExpression();
+  function parseMemberExpression(object) {
    while (is(TOKEN_TYPES.Dot) || is(TOKEN_TYPES.OpenSquareBracket)) {
      const operator = tokens[current];
      ++current;
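The parser change above makes `parseMemberExpression` take an object to extend, so `parseCallExpression` can loop back into it and chains like `a.b().c()` keep parsing past the first call. A toy recursive-descent fragment illustrating the same shape (operating on a pre-tokenized array with arguments elided; `parseChain` and the node shapes are illustrative, not the library's actual parser):

```javascript
// Simplified sketch: member access and calls may interleave, so after a
// call we re-enter member parsing, and after that, possibly another call.
function parseChain(tokens) {
  let current = 0;
  const peek = () => tokens[current];

  function parsePrimary() {
    return { type: "Identifier", name: tokens[current++] };
  }
  function parseMember(object) {
    // Consume `.name` accesses, extending the current expression
    while (peek() === ".") {
      current++; // skip "."
      object = { type: "Member", object, property: tokens[current++] };
    }
    return object;
  }
  function parseCall(callee) {
    current += 2; // skip "(" and ")" (argument parsing elided)
    let expr = { type: "Call", callee };
    expr = parseMember(expr);   // the fix: allow `.x` after a call result
    if (peek() === "(") {
      expr = parseCall(expr);   // ...and further calls after that
    }
    return expr;
  }

  let expr = parseMember(parsePrimary());
  if (peek() === "(") expr = parseCall(expr);
  return expr;
}
```

With this structure, `a.b().c()` parses as a call whose callee is a member access on the result of the inner call, instead of the parse stopping after `a.b()`.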
@@ -1043,6 +1043,41 @@ var StringValue = class extends RuntimeValue {
       new FunctionValue(() => {
         return new StringValue(this.value.trimStart());
       })
+    ],
+    [
+      "split",
+      // follows Python's `str.split(sep=None, maxsplit=-1)` function behavior
+      // https://docs.python.org/3.13/library/stdtypes.html#str.split
+      new FunctionValue((args) => {
+        const sep = args[0] ?? new NullValue();
+        if (!(sep instanceof StringValue || sep instanceof NullValue)) {
+          throw new Error("sep argument must be a string or null");
+        }
+        const maxsplit = args[1] ?? new NumericValue(-1);
+        if (!(maxsplit instanceof NumericValue)) {
+          throw new Error("maxsplit argument must be a number");
+        }
+        let result = [];
+        if (sep instanceof NullValue) {
+          const text = this.value.trimStart();
+          for (const { 0: match, index } of text.matchAll(/\S+/g)) {
+            if (maxsplit.value !== -1 && result.length >= maxsplit.value && index !== void 0) {
+              result.push(match + text.slice(index + match.length));
+              break;
+            }
+            result.push(match);
+          }
+        } else {
+          if (sep.value === "") {
+            throw new Error("empty separator");
+          }
+          result = this.value.split(sep.value);
+          if (maxsplit.value !== -1 && result.length > maxsplit.value) {
+            result.push(result.splice(maxsplit.value).join(sep.value));
+          }
+        }
+        return new ArrayValue(result.map((part) => new StringValue(part)));
+      })
     ]
   ]);
 };
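The `split` builtin added above deliberately mirrors Python's `str.split(sep=None, maxsplit=-1)` rather than JavaScript's `String.prototype.split`, whose `limit` argument truncates instead of preserving the remainder. A standalone sketch of the same semantics over plain strings (`pySplit` is an illustrative name, not a library export):

```javascript
// Python-style split: sep=null splits on whitespace runs; maxsplit caps
// the number of splits, keeping the remainder as the final element.
function pySplit(str, sep = null, maxsplit = -1) {
  if (sep === null) {
    const result = [];
    // Leading whitespace is ignored, matching Python's sep=None behavior
    const text = str.trimStart();
    for (const { 0: match, index } of text.matchAll(/\S+/g)) {
      if (maxsplit !== -1 && result.length >= maxsplit && index !== undefined) {
        // Everything from here on (including internal whitespace) is one piece
        result.push(match + text.slice(index + match.length));
        break;
      }
      result.push(match);
    }
    return result;
  }
  if (sep === "") throw new Error("empty separator");
  const parts = str.split(sep);
  if (maxsplit !== -1 && parts.length > maxsplit) {
    // Re-join the overflow with the separator, as Python does
    parts.push(parts.splice(maxsplit).join(sep));
  }
  return parts;
}
```

For example, `pySplit("a b c d", null, 2)` yields `["a", "b", "c d"]`, whereas `"a b c d".split(" ", 2)` in JavaScript would drop `"c d"` entirely.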
@@ -1379,6 +1414,8 @@ var Interpreter = class {
           }
         })
       );
+    case "join":
+      return new StringValue(operand.value.map((x) => x.value).join(""));
     default:
       throw new Error(`Unknown ArrayValue filter: ${filter.value}`);
   }
@@ -1405,6 +1442,7 @@ var Interpreter = class {
           )
         ).join("\n")
       );
+    case "join":
     case "string":
       return operand;
     default:
@@ -1443,6 +1481,21 @@ var Interpreter = class {
       throw new Error("If set, indent must be a number");
     }
     return new StringValue(toJSON(operand, indent.value));
+  } else if (filterName === "join") {
+    let value;
+    if (operand instanceof StringValue) {
+      value = Array.from(operand.value);
+    } else if (operand instanceof ArrayValue) {
+      value = operand.value.map((x) => x.value);
+    } else {
+      throw new Error(`Cannot apply filter "${filterName}" to type: ${operand.type}`);
+    }
+    const [args, kwargs] = this.evaluateArguments(filter.args, environment);
+    const separator = args.at(0) ?? kwargs.get("separator") ?? new StringValue("");
+    if (!(separator instanceof StringValue)) {
+      throw new Error("separator must be a string");
+    }
+    return new StringValue(value.join(separator.value));
   }
   if (operand instanceof ArrayValue) {
     switch (filterName) {
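The `join` filter added above accepts both arrays (joining their item values) and strings (joining their characters), with the separator supplied positionally or as a `separator` keyword, defaulting to `""`. A plain-JS sketch of those semantics outside the engine's runtime-value classes (`joinFilter` is an illustrative name):

```javascript
// Jinja-style `join`: arrays join their items, strings join their
// characters; any other operand type is an error.
function joinFilter(operand, separator = "") {
  let items;
  if (typeof operand === "string") {
    items = Array.from(operand); // "abc" -> ["a", "b", "c"]
  } else if (Array.isArray(operand)) {
    items = operand;
  } else {
    throw new Error(`Cannot apply filter "join" to type: ${typeof operand}`);
  }
  return items.join(separator);
}
```

So `{{ ["a", "b", "c"] | join("-") }}` renders `a-b-c`, and applying `join` to a string interleaves the separator between characters.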
@@ -1904,8 +1957,10 @@ var Template = class {
       throw new Error(args);
     });
     env.set("range", range);
-
-
+    if (items) {
+      for (const [key, value] of Object.entries(items)) {
+        env.set(key, value);
+      }
     }
     const interpreter = new Interpreter(env);
     const result = interpreter.run(this.parsed);
@@ -5647,6 +5702,8 @@ function getNormalizedConfig(config) {
       break;
     case 'gemma':
     case 'gemma2':
+    case 'glm':
+    case 'helium':
       mapping['num_heads'] = 'num_key_value_heads';
       mapping['num_layers'] = 'num_hidden_layers';
       mapping['dim_kv'] = 'head_dim';
@@ -5719,12 +5776,17 @@ function getNormalizedConfig(config) {
       mapping['encoder_hidden_size'] = mapping['decoder_hidden_size'] = 'd_model';
       break;
     case 'musicgen_decoder':
-    case 'moonshine':
       mapping['num_encoder_layers'] = mapping['num_decoder_layers'] = 'num_hidden_layers';
       mapping['num_encoder_heads'] = mapping['num_decoder_heads'] = 'num_attention_heads';
       mapping['encoder_hidden_size'] = mapping['decoder_hidden_size'] = 'hidden_size';
       break;
-
+    case 'moonshine':
+      mapping['num_decoder_layers'] = 'decoder_num_hidden_layers';
+      mapping['num_decoder_heads'] = 'decoder_num_key_value_heads';
+      mapping['num_encoder_layers'] = 'encoder_num_hidden_layers';
+      mapping['num_encoder_heads'] = 'encoder_num_key_value_heads';
+      mapping['encoder_hidden_size'] = mapping['decoder_hidden_size'] = 'hidden_size';
+      break;
     case 'vision-encoder-decoder':
       // @ts-expect-error TS2339
       const decoderConfig = getNormalizedConfig(config.decoder);
@@ -5970,7 +6032,7 @@ __webpack_require__.r(__webpack_exports__);



-const VERSION = '3.3.
+const VERSION = '3.3.3';

 // Check if various APIs are available (depends on environment)
 const IS_BROWSER_ENV = typeof window !== "undefined" && typeof window.document !== "undefined";
@@ -8108,6 +8170,9 @@ __webpack_require__.r(__webpack_exports__);
 /* harmony export */ GemmaForCausalLM: () => (/* binding */ GemmaForCausalLM),
 /* harmony export */ GemmaModel: () => (/* binding */ GemmaModel),
 /* harmony export */ GemmaPreTrainedModel: () => (/* binding */ GemmaPreTrainedModel),
+/* harmony export */ GlmForCausalLM: () => (/* binding */ GlmForCausalLM),
+/* harmony export */ GlmModel: () => (/* binding */ GlmModel),
+/* harmony export */ GlmPreTrainedModel: () => (/* binding */ GlmPreTrainedModel),
 /* harmony export */ GraniteForCausalLM: () => (/* binding */ GraniteForCausalLM),
 /* harmony export */ GraniteModel: () => (/* binding */ GraniteModel),
 /* harmony export */ GranitePreTrainedModel: () => (/* binding */ GranitePreTrainedModel),
@@ -8115,6 +8180,9 @@ __webpack_require__.r(__webpack_exports__);
 /* harmony export */ GroundingDinoPreTrainedModel: () => (/* binding */ GroundingDinoPreTrainedModel),
 /* harmony export */ GroupViTModel: () => (/* binding */ GroupViTModel),
 /* harmony export */ GroupViTPreTrainedModel: () => (/* binding */ GroupViTPreTrainedModel),
+/* harmony export */ HeliumForCausalLM: () => (/* binding */ HeliumForCausalLM),
+/* harmony export */ HeliumModel: () => (/* binding */ HeliumModel),
+/* harmony export */ HeliumPreTrainedModel: () => (/* binding */ HeliumPreTrainedModel),
 /* harmony export */ HieraForImageClassification: () => (/* binding */ HieraForImageClassification),
 /* harmony export */ HieraModel: () => (/* binding */ HieraModel),
 /* harmony export */ HieraPreTrainedModel: () => (/* binding */ HieraPreTrainedModel),
@@ -12653,6 +12721,19 @@ class LlamaModel extends LlamaPreTrainedModel { }
 class LlamaForCausalLM extends LlamaPreTrainedModel { }
 //////////////////////////////////////////////////

+//////////////////////////////////////////////////
+// Helium models
+class HeliumPreTrainedModel extends PreTrainedModel { }
+class HeliumModel extends HeliumPreTrainedModel { }
+class HeliumForCausalLM extends HeliumPreTrainedModel { }
+//////////////////////////////////////////////////
+
+//////////////////////////////////////////////////
+// Glm models
+class GlmPreTrainedModel extends PreTrainedModel { }
+class GlmModel extends GlmPreTrainedModel { }
+class GlmForCausalLM extends GlmPreTrainedModel { }
+//////////////////////////////////////////////////

 //////////////////////////////////////////////////
 // EXAONE models
@@ -15507,6 +15588,8 @@ const MODEL_MAPPING_NAMES_DECODER_ONLY = new Map([
   ['cohere', ['CohereModel', CohereModel]],
   ['gemma', ['GemmaModel', GemmaModel]],
   ['gemma2', ['Gemma2Model', Gemma2Model]],
+  ['helium', ['HeliumModel', HeliumModel]],
+  ['glm', ['GlmModel', GlmModel]],
   ['openelm', ['OpenELMModel', OpenELMModel]],
   ['qwen2', ['Qwen2Model', Qwen2Model]],
   ['phi', ['PhiModel', PhiModel]],
@@ -15603,6 +15686,8 @@ const MODEL_FOR_CAUSAL_LM_MAPPING_NAMES = new Map([
   ['cohere', ['CohereForCausalLM', CohereForCausalLM]],
   ['gemma', ['GemmaForCausalLM', GemmaForCausalLM]],
   ['gemma2', ['Gemma2ForCausalLM', Gemma2ForCausalLM]],
+  ['helium', ['HeliumForCausalLM', HeliumForCausalLM]],
+  ['glm', ['GlmForCausalLM', GlmForCausalLM]],
   ['openelm', ['OpenELMForCausalLM', OpenELMForCausalLM]],
   ['qwen2', ['Qwen2ForCausalLM', Qwen2ForCausalLM]],
   ['phi', ['PhiForCausalLM', PhiForCausalLM]],
@@ -19192,8 +19277,9 @@ __webpack_require__.r(__webpack_exports__);
 /* harmony export */ SamProcessor: () => (/* reexport safe */ _sam_processing_sam_js__WEBPACK_IMPORTED_MODULE_12__.SamProcessor),
 /* harmony export */ SpeechT5Processor: () => (/* reexport safe */ _speecht5_processing_speecht5_js__WEBPACK_IMPORTED_MODULE_13__.SpeechT5Processor),
 /* harmony export */ VLChatProcessor: () => (/* reexport safe */ _janus_processing_janus_js__WEBPACK_IMPORTED_MODULE_3__.VLChatProcessor),
-/* harmony export */
-/* harmony export */
+/* harmony export */ Wav2Vec2Processor: () => (/* reexport safe */ _wav2vec2_processing_wav2vec2_js__WEBPACK_IMPORTED_MODULE_14__.Wav2Vec2Processor),
+/* harmony export */ Wav2Vec2ProcessorWithLM: () => (/* reexport safe */ _wav2vec2_with_lm_processing_wav2vec2_with_lm_js__WEBPACK_IMPORTED_MODULE_15__.Wav2Vec2ProcessorWithLM),
+/* harmony export */ WhisperProcessor: () => (/* reexport safe */ _whisper_processing_whisper_js__WEBPACK_IMPORTED_MODULE_16__.WhisperProcessor)
 /* harmony export */ });
 /* harmony import */ var _florence2_processing_florence2_js__WEBPACK_IMPORTED_MODULE_0__ = __webpack_require__(/*! ./florence2/processing_florence2.js */ "./src/models/florence2/processing_florence2.js");
 /* harmony import */ var _grounding_dino_processing_grounding_dino_js__WEBPACK_IMPORTED_MODULE_1__ = __webpack_require__(/*! ./grounding_dino/processing_grounding_dino.js */ "./src/models/grounding_dino/processing_grounding_dino.js");
@@ -19210,7 +19296,9 @@ __webpack_require__.r(__webpack_exports__);
 /* harmony import */ var _sam_processing_sam_js__WEBPACK_IMPORTED_MODULE_12__ = __webpack_require__(/*! ./sam/processing_sam.js */ "./src/models/sam/processing_sam.js");
 /* harmony import */ var _speecht5_processing_speecht5_js__WEBPACK_IMPORTED_MODULE_13__ = __webpack_require__(/*! ./speecht5/processing_speecht5.js */ "./src/models/speecht5/processing_speecht5.js");
 /* harmony import */ var _wav2vec2_processing_wav2vec2_js__WEBPACK_IMPORTED_MODULE_14__ = __webpack_require__(/*! ./wav2vec2/processing_wav2vec2.js */ "./src/models/wav2vec2/processing_wav2vec2.js");
-/* harmony import */ var
+/* harmony import */ var _wav2vec2_with_lm_processing_wav2vec2_with_lm_js__WEBPACK_IMPORTED_MODULE_15__ = __webpack_require__(/*! ./wav2vec2_with_lm/processing_wav2vec2_with_lm.js */ "./src/models/wav2vec2_with_lm/processing_wav2vec2_with_lm.js");
+/* harmony import */ var _whisper_processing_whisper_js__WEBPACK_IMPORTED_MODULE_16__ = __webpack_require__(/*! ./whisper/processing_whisper.js */ "./src/models/whisper/processing_whisper.js");
+



@@ -20439,17 +20527,55 @@ class Wav2Vec2FeatureExtractor extends _base_feature_extraction_utils_js__WEBPAC
   \****************************************************/
 /***/ ((__unused_webpack___webpack_module__, __webpack_exports__, __webpack_require__) => {

+"use strict";
+__webpack_require__.r(__webpack_exports__);
+/* harmony export */ __webpack_require__.d(__webpack_exports__, {
+/* harmony export */ Wav2Vec2Processor: () => (/* binding */ Wav2Vec2Processor)
+/* harmony export */ });
+/* harmony import */ var _tokenizers_js__WEBPACK_IMPORTED_MODULE_0__ = __webpack_require__(/*! ../../tokenizers.js */ "./src/tokenizers.js");
+/* harmony import */ var _auto_feature_extraction_auto_js__WEBPACK_IMPORTED_MODULE_1__ = __webpack_require__(/*! ../auto/feature_extraction_auto.js */ "./src/models/auto/feature_extraction_auto.js");
+/* harmony import */ var _base_processing_utils_js__WEBPACK_IMPORTED_MODULE_2__ = __webpack_require__(/*! ../../base/processing_utils.js */ "./src/base/processing_utils.js");
+
+
+
+
+class Wav2Vec2Processor extends _base_processing_utils_js__WEBPACK_IMPORTED_MODULE_2__.Processor {
+    static tokenizer_class = _tokenizers_js__WEBPACK_IMPORTED_MODULE_0__.AutoTokenizer
+    static feature_extractor_class = _auto_feature_extraction_auto_js__WEBPACK_IMPORTED_MODULE_1__.AutoFeatureExtractor
+
+    /**
+     * Calls the feature_extractor function with the given audio input.
+     * @param {any} audio The audio input to extract features from.
+     * @returns {Promise<any>} A Promise that resolves with the extracted features.
+     */
+    async _call(audio) {
+        return await this.feature_extractor(audio)
+    }
+}
+
+
+/***/ }),
+
+/***/ "./src/models/wav2vec2_with_lm/processing_wav2vec2_with_lm.js":
+/*!********************************************************************!*\
+  !*** ./src/models/wav2vec2_with_lm/processing_wav2vec2_with_lm.js ***!
+  \********************************************************************/
+/***/ ((__unused_webpack___webpack_module__, __webpack_exports__, __webpack_require__) => {
+
 "use strict";
 __webpack_require__.r(__webpack_exports__);
 /* harmony export */ __webpack_require__.d(__webpack_exports__, {
 /* harmony export */ Wav2Vec2ProcessorWithLM: () => (/* binding */ Wav2Vec2ProcessorWithLM)
 /* harmony export */ });
-/* harmony import */ var
+/* harmony import */ var _tokenizers_js__WEBPACK_IMPORTED_MODULE_0__ = __webpack_require__(/*! ../../tokenizers.js */ "./src/tokenizers.js");
 /* harmony import */ var _auto_feature_extraction_auto_js__WEBPACK_IMPORTED_MODULE_1__ = __webpack_require__(/*! ../auto/feature_extraction_auto.js */ "./src/models/auto/feature_extraction_auto.js");
+/* harmony import */ var _base_processing_utils_js__WEBPACK_IMPORTED_MODULE_2__ = __webpack_require__(/*! ../../base/processing_utils.js */ "./src/base/processing_utils.js");


-
+
+class Wav2Vec2ProcessorWithLM extends _base_processing_utils_js__WEBPACK_IMPORTED_MODULE_2__.Processor {
+    static tokenizer_class = _tokenizers_js__WEBPACK_IMPORTED_MODULE_0__.AutoTokenizer
     static feature_extractor_class = _auto_feature_extraction_auto_js__WEBPACK_IMPORTED_MODULE_1__.AutoFeatureExtractor

     /**
@@ -21037,9 +21163,12 @@ __webpack_require__.r(__webpack_exports__);
 /* harmony export */ });
 /* harmony import */ var _backends_onnx_js__WEBPACK_IMPORTED_MODULE_0__ = __webpack_require__(/*! ../backends/onnx.js */ "./src/backends/onnx.js");
 /* harmony import */ var _utils_tensor_js__WEBPACK_IMPORTED_MODULE_1__ = __webpack_require__(/*! ../utils/tensor.js */ "./src/utils/tensor.js");
+/* harmony import */ var _env_js__WEBPACK_IMPORTED_MODULE_2__ = __webpack_require__(/*! ../env.js */ "./src/env.js");



+
+const IS_WEB_ENV = _env_js__WEBPACK_IMPORTED_MODULE_2__.apis.IS_BROWSER_ENV || _env_js__WEBPACK_IMPORTED_MODULE_2__.apis.IS_WEBWORKER_ENV;
 /**
  * Asynchronously creates a wrapper function for running an ONNX inference session.
  *
@@ -21055,10 +21184,16 @@ const wrap = async (session_bytes, session_options, names) => {
     const session = await (0,_backends_onnx_js__WEBPACK_IMPORTED_MODULE_0__.createInferenceSession)(
         new Uint8Array(session_bytes), session_options,
     );
+
+    /** @type {Promise<any>} */
+    let chain = Promise.resolve();
+
     return /** @type {any} */(async (/** @type {Record<string, Tensor>} */ inputs) => {
         const proxied = (0,_backends_onnx_js__WEBPACK_IMPORTED_MODULE_0__.isONNXProxy)();
         const ortFeed = Object.fromEntries(Object.entries(inputs).map(([k, v]) => [k, (proxied ? v.clone() : v).ort_tensor]));
-
+
+        // When running in-browser via WASM, we need to chain calls to session.run to avoid "Error: Session already started"
+        const outputs = await (chain = IS_WEB_ENV ? chain.then(() => session.run(ortFeed)) : session.run(ortFeed));

         if (Array.isArray(names)) {
            return names.map((n) => new _utils_tensor_js__WEBPACK_IMPORTED_MODULE_1__.Tensor(outputs[n]));
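The registry change above serializes concurrent calls to `session.run` by threading each call onto a shared promise chain, since the WASM backend cannot run two inferences on one session at once. A standalone sketch of that serialization pattern (`serialize` is an illustrative helper, not a transformers.js API; unlike the bundled code, it also isolates errors so one failed call does not poison the chain):

```javascript
// Wrap an async function so overlapping invocations execute one at a
// time, in call order, by chaining each call onto the previous one.
function serialize(fn) {
  let chain = Promise.resolve();
  return (...args) => {
    // Each caller awaits its own link in the chain
    const result = chain.then(() => fn(...args));
    // Swallow rejections on the chain itself so later calls still run
    chain = result.catch(() => {});
    return result;
  };
}
```

Calling the wrapped function twice without awaiting still runs the bodies back-to-back: the second invocation does not start until the first settles.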
@@ -25004,13 +25139,15 @@ class TokenizerModel extends _utils_generic_js__WEBPACK_IMPORTED_MODULE_0__.Call
             return new BPE(config);

         default:
-            // Some tokenizers, like
-            // In this case, we can infer the tokenizer type based on the structure of the `vocab` field.
+            // Some older tokenizers, like `google-t5/t5-small` and `distilbert/distilbert-base-uncased`, do not have a `type` field.
+            // In this case, we can infer the tokenizer type based on the structure of the `vocab` field and other properties.
             if (config.vocab) {
                 if (Array.isArray(config.vocab)) {
                     // config.vocab is of type `[string, number][]`
                     // @ts-ignore
                     return new Unigram(config, ...args);
+                } else if (typeof config.vocab === 'object' && config.continuing_subword_prefix && config.unk_token) {
+                    return new WordPieceTokenizer(config);
                 } else {
                     // @ts-ignore
                     return new LegacyTokenizerModel(config, ...args);
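The fallback above handles tokenizer configs with no `type` field by inspecting the shape of `vocab`: an array of pairs implies Unigram, an object with a `continuing_subword_prefix` and `unk_token` implies WordPiece, and anything else falls back to the legacy model. A standalone sketch of that shape-based inference, returning labels instead of constructing real tokenizer classes (`inferTokenizerType` is an illustrative name):

```javascript
// Infer the tokenizer model family from the structure of its config
// when the `type` field is absent, mirroring the branching above.
function inferTokenizerType(config) {
  if (config.vocab) {
    if (Array.isArray(config.vocab)) {
      // vocab is `[string, number][]` pairs (token, log-probability)
      return "Unigram";
    }
    if (typeof config.vocab === "object" && config.continuing_subword_prefix && config.unk_token) {
      // e.g. BERT-style configs with a "##" subword prefix
      return "WordPiece";
    }
  }
  return "Legacy";
}
```

This is why older WordPiece checkpoints that predate the `type` field now load as `WordPieceTokenizer` rather than falling through to the legacy model.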
@@ -35344,6 +35481,9 @@ __webpack_require__.r(__webpack_exports__);
 /* harmony export */ GemmaModel: () => (/* reexport safe */ _models_js__WEBPACK_IMPORTED_MODULE_2__.GemmaModel),
 /* harmony export */ GemmaPreTrainedModel: () => (/* reexport safe */ _models_js__WEBPACK_IMPORTED_MODULE_2__.GemmaPreTrainedModel),
 /* harmony export */ GemmaTokenizer: () => (/* reexport safe */ _tokenizers_js__WEBPACK_IMPORTED_MODULE_3__.GemmaTokenizer),
+/* harmony export */ GlmForCausalLM: () => (/* reexport safe */ _models_js__WEBPACK_IMPORTED_MODULE_2__.GlmForCausalLM),
+/* harmony export */ GlmModel: () => (/* reexport safe */ _models_js__WEBPACK_IMPORTED_MODULE_2__.GlmModel),
+/* harmony export */ GlmPreTrainedModel: () => (/* reexport safe */ _models_js__WEBPACK_IMPORTED_MODULE_2__.GlmPreTrainedModel),
 /* harmony export */ GraniteForCausalLM: () => (/* reexport safe */ _models_js__WEBPACK_IMPORTED_MODULE_2__.GraniteForCausalLM),
 /* harmony export */ GraniteModel: () => (/* reexport safe */ _models_js__WEBPACK_IMPORTED_MODULE_2__.GraniteModel),
 /* harmony export */ GranitePreTrainedModel: () => (/* reexport safe */ _models_js__WEBPACK_IMPORTED_MODULE_2__.GranitePreTrainedModel),
@@ -35354,6 +35494,9 @@ __webpack_require__.r(__webpack_exports__);
 /* harmony export */ GroundingDinoProcessor: () => (/* reexport safe */ _models_processors_js__WEBPACK_IMPORTED_MODULE_16__.GroundingDinoProcessor),
 /* harmony export */ GroupViTModel: () => (/* reexport safe */ _models_js__WEBPACK_IMPORTED_MODULE_2__.GroupViTModel),
 /* harmony export */ GroupViTPreTrainedModel: () => (/* reexport safe */ _models_js__WEBPACK_IMPORTED_MODULE_2__.GroupViTPreTrainedModel),
+/* harmony export */ HeliumForCausalLM: () => (/* reexport safe */ _models_js__WEBPACK_IMPORTED_MODULE_2__.HeliumForCausalLM),
+/* harmony export */ HeliumModel: () => (/* reexport safe */ _models_js__WEBPACK_IMPORTED_MODULE_2__.HeliumModel),
+/* harmony export */ HeliumPreTrainedModel: () => (/* reexport safe */ _models_js__WEBPACK_IMPORTED_MODULE_2__.HeliumPreTrainedModel),
 /* harmony export */ HerbertTokenizer: () => (/* reexport safe */ _tokenizers_js__WEBPACK_IMPORTED_MODULE_3__.HerbertTokenizer),
 /* harmony export */ HieraForImageClassification: () => (/* reexport safe */ _models_js__WEBPACK_IMPORTED_MODULE_2__.HieraForImageClassification),
 /* harmony export */ HieraModel: () => (/* reexport safe */ _models_js__WEBPACK_IMPORTED_MODULE_2__.HieraModel),
@@ -35724,6 +35867,7 @@ __webpack_require__.r(__webpack_exports__);
 /* harmony export */ Wav2Vec2ForSequenceClassification: () => (/* reexport safe */ _models_js__WEBPACK_IMPORTED_MODULE_2__.Wav2Vec2ForSequenceClassification),
 /* harmony export */ Wav2Vec2Model: () => (/* reexport safe */ _models_js__WEBPACK_IMPORTED_MODULE_2__.Wav2Vec2Model),
 /* harmony export */ Wav2Vec2PreTrainedModel: () => (/* reexport safe */ _models_js__WEBPACK_IMPORTED_MODULE_2__.Wav2Vec2PreTrainedModel),
+/* harmony export */ Wav2Vec2Processor: () => (/* reexport safe */ _models_processors_js__WEBPACK_IMPORTED_MODULE_16__.Wav2Vec2Processor),
 /* harmony export */ Wav2Vec2ProcessorWithLM: () => (/* reexport safe */ _models_processors_js__WEBPACK_IMPORTED_MODULE_16__.Wav2Vec2ProcessorWithLM),
 /* harmony export */ WavLMForAudioFrameClassification: () => (/* reexport safe */ _models_js__WEBPACK_IMPORTED_MODULE_2__.WavLMForAudioFrameClassification),
 /* harmony export */ WavLMForCTC: () => (/* reexport safe */ _models_js__WEBPACK_IMPORTED_MODULE_2__.WavLMForCTC),