@100mslive/hms-virtual-background 1.3.7 → 1.3.8

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,32 +1,20 @@
- # TSDX User Guide
-
- Congrats! You just saved yourself hours of work by bootstrapping this project with TSDX. Let’s get you oriented with what’s here and how to use it.
-
- > This TSDX setup is meant for developing libraries (not apps!) that can be published to NPM. If you’re looking to build a Node app, you could use `ts-node-dev`, plain `ts-node`, or simple `tsc`.
-
- > If you’re new to TypeScript, checkout [this handy cheatsheet](https://devhints.io/typescript)
-
  # How to use
 
- 1) npm i @100mslive/hms-virtual-background
- 2) import { setVirtualBackground } from '@100mslive/hms-virtual-background'
- 3) setVirtualBackground accepts 2 parameters :-
- a) background, Its a string, can be one of these 3 parameters: ['default' | 'blur' | 'Image']
- i) 'default' -> No virtual background is set
- ii) 'blur' -> Background will be blurred
- iii) 'Image' -> It's a source URL of image you want to replace your background with
- b) stream: It's the input camera feed on which virtual background will work
-
- For ex:->
- To blur your background, Use: setVirtualBackground( background: 'blur', stream: stream )
- To Add Virtual Image, specify an Image URL in background, Use: setVirtualBackground(background: Image, stream: stream)
- For default mode , Use setVirtualBackground( background: 'default', stream: stream )
-
- ## Commands
+ 1. `npm i @100mslive/hms-virtual-background`
+ 2. `import { setVirtualBackground } from '@100mslive/hms-virtual-background'`
+ 3. setVirtualBackground accepts two parameters:
+    a) background: a string, one of ['default' | 'blur' | 'Image']
+       i) 'default' -> no virtual background is applied
+       ii) 'blur' -> the background is blurred
+       iii) 'Image' -> the source URL of the image you want to replace your background with
+    b) stream: the input camera feed the virtual background is applied to
 
- TSDX scaffolds your new library inside `/src`.
+ For example:
+ To blur your background, use `setVirtualBackground( background: 'blur', stream: stream )`
+ To add a virtual image, specify an image URL as background: `setVirtualBackground( background: Image, stream: stream )`
+ For default mode, use `setVirtualBackground( background: 'default', stream: stream )`
 
- To run TSDX, use:
+ ## Commands
 
  ```bash
  npm start # or yarn start
@@ -37,84 +25,3 @@ This builds to `/dist` and runs the project in watch mode so any edits you save
  To do a one-off build, use `npm run build` or `yarn build`.
 
  To run tests, use `npm test` or `yarn test`.
-
- ## Configuration
-
- Code quality is set up for you with `prettier`, `husky`, and `lint-staged`. Adjust the respective fields in `package.json` accordingly.
-
- ### Jest
-
- Jest tests are set up to run with `npm test` or `yarn test`.
-
- ### Bundle Analysis
-
- [`size-limit`](https://github.com/ai/size-limit) is set up to calculate the real cost of your library with `npm run size` and visualize the bundle with `npm run analyze`.
-
- #### Setup Files
-
- This is the folder structure we set up for you:
-
- ```txt
- /src
- index.tsx # EDIT THIS
- /test
- blah.test.tsx # EDIT THIS
- .gitignore
- package.json
- README.md # EDIT THIS
- tsconfig.json
- ```
-
- ### Rollup
-
- TSDX uses [Rollup](https://rollupjs.org) as a bundler and generates multiple rollup configs for various module formats and build settings. See [Optimizations](#optimizations) for details.
-
- ### TypeScript
-
- `tsconfig.json` is set up to interpret `dom` and `esnext` types, as well as `react` for `jsx`. Adjust according to your needs.
-
- ## Continuous Integration
-
- ### GitHub Actions
-
- Two actions are added by default:
-
- - `main` which installs deps w/ cache, lints, tests, and builds on all pushes against a Node and OS matrix
- - `size` which comments cost comparison of your library on every pull request using [`size-limit`](https://github.com/ai/size-limit)
-
- ## Optimizations
-
- Please see the main `tsdx` [optimizations docs](https://github.com/palmerhq/tsdx#optimizations). In particular, know that you can take advantage of development-only optimizations:
-
- ```js
- // ./types/index.d.ts
- declare var __DEV__: boolean;
-
- // inside your code...
- if (__DEV__) {
- console.log('foo');
- }
- ```
-
- You can also choose to install and use [invariant](https://github.com/palmerhq/tsdx#invariant) and [warning](https://github.com/palmerhq/tsdx#warning) functions.
-
- ## Module Formats
-
- CJS, ESModules, and UMD module formats are supported.
-
- The appropriate paths are configured in `package.json` and `dist/index.js` accordingly. Please report if any issues are found.
-
- ## Named Exports
-
- Per Palmer Group guidelines, [always use named exports.](https://github.com/palmerhq/typescript#exports) Code split inside your React app instead of your React library.
-
- ## Including Styles
-
- There are many ways to ship styles, including with CSS-in-JS. TSDX has no opinion on this, configure how you like.
-
- For vanilla CSS, you can include it at the root directory and add it to the `files` section in your `package.json`, so that it can be imported separately by your users and run through their bundler's loader.
-
- ## Publishing to NPM
-
- We recommend using [np](https://github.com/sindresorhus/np).
- # hms-virtual-background
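The three background modes described in the README reduce to a small dispatch. The helper below is purely illustrative (`classifyBackground` and `BackgroundMode` are names invented here, not part of the package) and shows how an app might sort the `background` argument before calling `setVirtualBackground`:

```typescript
// Hypothetical validation mirroring the README's three modes:
// 'default' -> no effect, 'blur' -> blur, anything else -> image URL.
type BackgroundMode = 'default' | 'blur' | 'image';

function classifyBackground(background: string): BackgroundMode {
  if (background === 'default') return 'default'; // no virtual background
  if (background === 'blur') return 'blur';       // blur the camera feed
  // Per the README, any other string is treated as an image source URL.
  return 'image';
}
```

An app would then pass the camera `MediaStream` obtained from `getUserMedia` as the `stream` parameter; the exact call signature of `setVirtualBackground` is not shown in this diff.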
@@ -1,57 +1,58 @@
- import '@tensorflow/tfjs-backend-webgl';
- import { HMSVideoPlugin, HMSVideoPluginType } from '@100mslive/hms-video';
- export declare class HMSVirtualBackgroundPlugin implements HMSVideoPlugin {
- background: string | HTMLImageElement;
- personMaskWidth: number;
- personMaskHeight: number;
- isVirtualBackground: boolean;
- backgroundImage: HTMLImageElement | null;
- backgroundVideo: HTMLVideoElement | null;
- backgroundType: string;
- loadModelCalled: boolean;
- blurValue: any;
- tfLite: any;
- tfLitePromise: any;
- modelName: string;
- input: HTMLCanvasElement | null;
- output: HTMLCanvasElement | null;
- outputCtx: CanvasRenderingContext2D | null;
- timerID: number;
- imageAspectRatio: number;
- personMaskPixelCount: number;
- personMask: ImageData;
- personMaskCanvas: HTMLCanvasElement;
- personMaskCtx: any;
- filters: any;
- enableSharpening?: boolean | false;
- gifFrames: any;
- gifFramesIndex: number;
- gifFrameImageData: any;
- tempGifCanvas: HTMLCanvasElement;
- tempGifContext: any;
- giflocalCount: number;
- constructor(background: string, enableSharpening?: boolean);
- init(): Promise<void>;
- isSupported(): boolean;
- getName(): string;
- getPluginType(): HMSVideoPluginType;
- setBackground(path?: string | HTMLImageElement | HTMLVideoElement): Promise<void>;
- stop(): void;
- processVideoFrame(input: HTMLCanvasElement, output: HTMLCanvasElement, skipProcessing?: boolean): Promise<void> | void;
- private setImage;
- private setGiF;
- private log;
- private resizeInputData;
- private infer;
- private postProcessing;
- private sharpenFilter;
- private drawPersonMask;
- private drawSegmentedBackground;
- private runSegmentation;
- private fitVideoToBackground;
- private fitImageToBackground;
- private fitGifToBackground;
- private fitData;
- private addBlurToBackground;
- private initSharpenFilter;
- }
+ import '@tensorflow/tfjs-backend-webgl';
+ import { HMSPluginSupportResult, HMSVideoPlugin, HMSVideoPluginType } from '@100mslive/hms-video';
+ export declare class HMSVirtualBackgroundPlugin implements HMSVideoPlugin {
+ background: string | HTMLImageElement;
+ personMaskWidth: number;
+ personMaskHeight: number;
+ isVirtualBackground: boolean;
+ backgroundImage: HTMLImageElement | null;
+ backgroundVideo: HTMLVideoElement | null;
+ backgroundType: string;
+ loadModelCalled: boolean;
+ blurValue: any;
+ tfLite: any;
+ tfLitePromise: any;
+ modelName: string;
+ input: HTMLCanvasElement | null;
+ output: HTMLCanvasElement | null;
+ outputCtx: CanvasRenderingContext2D | null;
+ timerID: number;
+ imageAspectRatio: number;
+ personMaskPixelCount: number;
+ personMask: ImageData;
+ personMaskCanvas: HTMLCanvasElement;
+ personMaskCtx: any;
+ filters: any;
+ enableSharpening?: boolean | false;
+ gifFrames: any;
+ gifFramesIndex: number;
+ gifFrameImageData: any;
+ tempGifCanvas: HTMLCanvasElement;
+ tempGifContext: any;
+ giflocalCount: number;
+ constructor(background: string, enableSharpening?: boolean);
+ init(): Promise<void>;
+ isSupported(): boolean;
+ checkSupport(): HMSPluginSupportResult;
+ getName(): string;
+ getPluginType(): HMSVideoPluginType;
+ setBackground(path?: string | HTMLImageElement | HTMLVideoElement): Promise<void>;
+ stop(): void;
+ processVideoFrame(input: HTMLCanvasElement, output: HTMLCanvasElement, skipProcessing?: boolean): Promise<void> | void;
+ private setImage;
+ private setGiF;
+ private log;
+ private resizeInputData;
+ private infer;
+ private postProcessing;
+ private sharpenFilter;
+ private drawPersonMask;
+ private drawSegmentedBackground;
+ private runSegmentation;
+ private fitVideoToBackground;
+ private fitImageToBackground;
+ private fitGifToBackground;
+ private fitData;
+ private addBlurToBackground;
+ private initSharpenFilter;
+ }
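The headline addition in this release is `checkSupport()`. Its logic, transcribed from the minified bundles later in this diff, flags the plugin as supported when the user-agent string names Chrome, Firefox, or Edge. The sketch below takes the UA string as a parameter so it can run outside a browser; the real method reads `navigator.userAgent` and uses the `HMSPluginUnsupportedTypes.PLATFORM_NOT_SUPPORTED` enum from `@100mslive/hms-video` rather than a bare string:

```typescript
interface HMSPluginSupportResult {
  isSupported: boolean;
  errType?: string;
  errMsg?: string;
}

// Transcribed from the minified bundle, parameterized for testability.
function checkSupport(userAgent: string): HMSPluginSupportResult {
  const supported = ['Chrome', 'Firefox', 'Edg', 'Edge'].some(
    browser => userAgent.indexOf(browser) !== -1,
  );
  return supported
    ? { isSupported: true }
    : {
        isSupported: false,
        errType: 'PLATFORM_NOT_SUPPORTED', // HMSPluginUnsupportedTypes.PLATFORM_NOT_SUPPORTED
        errMsg: 'browser not supported for plugin, see docs',
      };
}
```

Unlike the older boolean `isSupported()`, the result object carries an error type and message a caller can surface to the user.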
@@ -1,2 +1,2 @@
- declare const loadTFLite: () => Promise<any>;
- export { loadTFLite };
+ declare const loadTFLite: () => Promise<any>;
+ export { loadTFLite };
@@ -0,0 +1 @@
+ var W=Object.create;var M=Object.defineProperty;var N=Object.getOwnPropertyDescriptor;var U=Object.getOwnPropertyNames;var z=Object.getPrototypeOf,q=Object.prototype.hasOwnProperty;var B=s=>M(s,"__esModule",{value:!0});var J=(s,t)=>()=>(t||s((t={exports:{}}).exports,t),t.exports),Y=(s,t)=>{B(s);for(var e in t)M(s,e,{get:t[e],enumerable:!0})},$=(s,t,e)=>{if(t&&typeof t=="object"||typeof t=="function")for(let i of U(t))!q.call(s,i)&&i!=="default"&&M(s,i,{get:()=>t[i],enumerable:!(e=N(t,i))||e.enumerable});return s},L=s=>$(B(M(s!=null?W(z(s)):{},"default",s&&s.__esModule&&"default"in s?{get:()=>s.default,enumerable:!0}:{value:s,enumerable:!0})),s);var u=(s,t,e)=>new Promise((i,a)=>{var n=h=>{try{r(e.next(h))}catch(l){a(l)}},o=h=>{try{r(e.throw(h))}catch(l){a(l)}},r=h=>h.done?i(h.value):Promise.resolve(h.value).then(n,o);r((e=e.apply(s,t)).next())});var E=J((ut,Q)=>{Q.exports={version:"1.3.8",license:"MIT",main:"dist/index.js",typings:"dist/index.d.ts",files:["dist","src/tflite","src/models"],scripts:{start:'concurrently "yarn dev" "yarn types"',dev:"node ../../scripts/dev","build:only":"node ../../scripts/build",build:"yarn build:only && yarn types:build",types:"tsc -w","types:build":"tsc -p tsconfig.build.json",test:"jest --maxWorkers=1 --passWithNoTests",lint:"eslint -c ../../.eslintrc .","lint:fix":"yarn lint --fix",prepare:"yarn build",size:"size-limit",analyze:"size-limit --why",format:"prettier --write 
src/**/*.ts"},peerDependencies:{"@100mslive/hms-video":"^0.1.45-alpha.1"},name:"@100mslive/hms-virtual-background",author:"ashish17",module:"dist/index.js",devDependencies:{"@100mslive/hms-video":"0.1.48"},dependencies:{"@tensorflow/tfjs-backend-webgl":"^3.3.0","@tensorflow/tfjs-core":"^3.3.0","@webassemblyjs/helper-wasm-bytecode":"^1.11.0","@webassemblyjs/wasm-gen":"^1.11.0","gifuct-js":"^2.1.2","wasm-check":"^2.0.2"},eslintIgnore:["tflite.js","tflite-simd.js","tflite.wasm","tflite-simd.wasm","defineTFLite.ts","importing.test.ts"],gitHead:"159b16c9cf62ce17ad5026a1e8558690c00fc3c3"}});Y(exports,{HMSVirtualBackgroundPlugin:()=>_});var K=E(),V=`https://unpkg.com/${K.name}/src`,y="VBProcessor",X="tflite/tflite.js",Z="tflite/tflite-simd.js",tt="models/selfie_segmentation_landscape.tflite",A=s=>new Promise(function(t,e){let i=document.createElement("script");i.src=s,i.onload=t,i.onerror=e,document.head.appendChild(i)}),et=()=>u(void 0,null,function*(){let s,t=V+"/"+Z;yield A(t);try{s=yield createTFLiteSIMDModule()}catch(e){console.warn("SIMD not supported. 
You may experience poor virtual background effect."),t=V+"/"+X,yield A(t),s=yield createTFLiteModule()}return s}),O=()=>u(void 0,null,function*(){let s=V+"/"+tt,[t,e]=yield Promise.all([et(),fetch(s)]),i=yield e.arrayBuffer(),a=t._getModelBufferMemoryOffset();return t.HEAPU8.set(new Uint8Array(i),a),t._loadModel(i.byteLength),console.debug(y,"Input memory offset:",t._getInputMemoryOffset()),console.debug(y,"Input height:",t._getInputHeight()),console.debug(y,"Input width:",t._getInputWidth()),console.debug(y,"Input channels:",t._getInputChannelCount()),t});var gt=L(require("@tensorflow/tfjs-backend-webgl")),x=L(require("gifuct-js")),w=L(require("@100mslive/hms-video")),v="VBProcessor",it=33,st=E(),at=214,nt=855,ot=120,rt=720,_=class{constructor(t,e=!1){this.backgroundType="none";this.background=t,this.enableSharpening=e,this.backgroundImage=null,this.backgroundVideo=null,this.personMaskWidth=256,this.personMaskHeight=144,this.isVirtualBackground=!1,this.blurValue="10px",this.loadModelCalled=!1,this.tfLite=null,this.modelName="landscape-segmentation",this.outputCtx=null,this.input=null,this.output=null,this.timerID=0,this.imageAspectRatio=1,this.personMaskPixelCount=this.personMaskWidth*this.personMaskHeight,this.personMask=new ImageData(this.personMaskWidth,this.personMaskHeight),this.personMaskCanvas=document.createElement("canvas"),this.personMaskCanvas.width=this.personMaskWidth,this.personMaskCanvas.height=this.personMaskHeight,this.personMaskCtx=this.personMaskCanvas.getContext("2d"),this.filters={},this.gifFrames=null,this.gifFramesIndex=0,this.gifFrameImageData=null,this.tempGifCanvas=document.createElement("canvas"),this.tempGifContext=this.tempGifCanvas.getContext("2d"),this.giflocalCount=0,this.enableSharpening=e,this.log(v,"Virtual Background plugin created"),this.setBackground(this.background)}init(){return u(this,null,function*(){this.loadModelCalled?yield this.tfLitePromise:(this.log(v,"PREVIOUS LOADED MODEL IS 
",this.tfLite),this.loadModelCalled=!0,this.tfLitePromise=O(),this.tfLite=yield this.tfLitePromise),this.enableSharpening&&this.initSharpenFilter()})}isSupported(){return navigator.userAgent.indexOf("Chrome")!==-1||navigator.userAgent.indexOf("Firefox")!==-1||navigator.userAgent.indexOf("Edg")!==-1||navigator.userAgent.indexOf("Edge")!==-1}checkSupport(){let t={};return["Chrome","Firefox","Edg","Edge"].some(e=>navigator.userAgent.indexOf(e)!==-1)?t.isSupported=!0:(t.isSupported=!1,t.errType=w.HMSPluginUnsupportedTypes.PLATFORM_NOT_SUPPORTED,t.errMsg="browser not supported for plugin, see docs"),t}getName(){return st.name}getPluginType(){return w.HMSVideoPluginType.TRANSFORM}setBackground(t){return u(this,null,function*(){if(t!=="")if(t==="none")this.log(v,"setting background to :",t),this.background="none",this.backgroundType="none",this.isVirtualBackground=!1;else if(t==="blur")this.log(v,"setting background to :",t),this.background="blur",this.backgroundType="blur",this.isVirtualBackground=!1;else if(t instanceof HTMLImageElement){this.log("setting background to image",t);let e=yield this.setImage(t);if(!e||!e.complete||!e.naturalHeight)throw new Error("Invalid image. 
Provide a valid and successfully loaded HTMLImageElement");this.isVirtualBackground=!0,this.backgroundImage=e,this.backgroundType="image"}else if(t instanceof HTMLVideoElement)this.log("setting background to video",t),this.backgroundVideo=t,this.backgroundVideo.crossOrigin="anonymous",this.backgroundVideo.muted=!0,this.backgroundVideo.loop=!0,this.backgroundVideo.oncanplaythrough=()=>u(this,null,function*(){this.backgroundVideo!=null&&(yield this.backgroundVideo.play(),this.isVirtualBackground=!0,this.backgroundType="video")});else if(console.log("setting gif to background"),this.gifFrames=yield this.setGiF(t),this.gifFrames!=null&&this.gifFrames.length>0)this.backgroundType="gif",this.isVirtualBackground=!0;else throw new Error("Invalid background supplied, see the docs to check supported background type");else throw new Error("Invalid background supplied, see the docs to check supported background type")})}stop(){var t,e;this.isVirtualBackground&&((t=this.backgroundImage)==null||t.removeAttribute("src"),(e=this.backgroundVideo)==null||e.removeAttribute("src"),this.backgroundType==="video"&&(this.backgroundVideo.loop=!1,this.backgroundVideo=null)),this.outputCtx&&(this.outputCtx.fillStyle="rgb(0, 0, 0)",this.outputCtx.fillRect(0,0,this.output.width,this.output.height)),this.gifFrameImageData=null,this.gifFrames=null,this.giflocalCount=0,this.gifFramesIndex=0}processVideoFrame(t,e,i){if(!t||!e)throw new Error("Plugin invalid input/output");this.input=t,this.output=e;let a=e.getContext("2d");if(a.canvas.width!==t.width&&(a.canvas.width=t.width),a.canvas.height!==t.height&&(a.canvas.height=t.height),this.backgroundType==="video"&&(this.backgroundVideo.width=t.width,this.backgroundVideo.height=t.height),this.outputCtx=a,this.imageAspectRatio=t.width/t.height,this.imageAspectRatio<=0)throw new Error("Invalid input width/height");let n=()=>u(this,null,function*(){yield 
this.runSegmentation(i)});this.background==="none"&&!this.isVirtualBackground?(this.outputCtx.globalCompositeOperation="copy",this.outputCtx.filter="none",this.outputCtx.drawImage(t,0,0,t.width,t.height)):n()}setImage(t){return u(this,null,function*(){return t.crossOrigin="anonymous",new Promise((e,i)=>{t.onload=()=>e(t),t.onerror=i})})}setGiF(t){return fetch(t).then(e=>e.arrayBuffer()).then(e=>(0,x.parseGIF)(e)).then(e=>(0,x.decompressFrames)(e,!0))}log(t,...e){console.info(t,...e)}resizeInputData(){this.personMaskCtx.drawImage(this.input,0,0,this.input.width,this.input.height,0,0,this.personMaskWidth,this.personMaskHeight);let t=this.personMaskCtx.getImageData(0,0,this.personMaskWidth,this.personMaskHeight),e=this.tfLite._getInputMemoryOffset()/4;for(let i=0;i<this.personMaskPixelCount;i++)this.tfLite.HEAPF32[e+i*3]=t.data[i*4]/255,this.tfLite.HEAPF32[e+i*3+1]=t.data[i*4+1]/255,this.tfLite.HEAPF32[e+i*3+2]=t.data[i*4+2]/255}infer(t){t||this.tfLite._runInference();let e=this.tfLite._getOutputMemoryOffset()/4;for(let i=0;i<this.personMaskPixelCount;i++)if(this.modelName==="meet"){let a=this.tfLite.HEAPF32[e+i*2],n=this.tfLite.HEAPF32[e+i*2+1],o=Math.max(a,n),r=Math.exp(a-o),h=Math.exp(n-o);this.personMask.data[i*4+3]=255*h/(r+h)}else if(this.modelName==="landscape-segmentation"){let a=this.tfLite.HEAPF32[e+i];this.personMask.data[i*4+3]=255*a}this.personMaskCtx.putImageData(this.personMask,0,0)}postProcessing(){this.outputCtx.globalCompositeOperation="copy",this.outputCtx.filter="none",this.isVirtualBackground?this.outputCtx.filter="blur(4px)":this.outputCtx.filter="blur(8px)",this.drawPersonMask(),this.outputCtx.globalCompositeOperation="source-in",this.outputCtx.filter="none",this.outputCtx.drawImage(this.input,0,0),this.enableSharpening&&this.output.width>at&&this.output.height>ot&&this.output.width<nt&&this.output.height<rt&&this.sharpenFilter(),this.drawSegmentedBackground()}sharpenFilter(){let 
t=this.outputCtx.getImageData(0,0,this.output.width,this.output.height),e=this.filters.convolute(t);this.outputCtx.putImageData(e,0,0)}drawPersonMask(){this.outputCtx.drawImage(this.personMaskCanvas,0,0,this.personMaskWidth,this.personMaskHeight,0,0,this.output.width,this.output.height)}drawSegmentedBackground(){this.outputCtx.globalCompositeOperation="destination-over",this.outputCtx.imageSmoothingEnabled=!0,this.outputCtx.imageSmoothingQuality="high",this.isVirtualBackground?this.backgroundType==="video"&&this.backgroundVideo!=null&&this.backgroundVideo.readyState>=4?this.fitVideoToBackground():this.backgroundType==="image"?this.fitImageToBackground():this.backgroundType==="gif"&&(this.giflocalCount>this.gifFrames[this.gifFramesIndex].delay/it?(this.gifFramesIndex++,this.gifFramesIndex>=this.gifFrames.length&&(this.gifFramesIndex=0),this.giflocalCount=0):this.giflocalCount++,this.fitGifToBackground()):this.addBlurToBackground()}runSegmentation(t){return u(this,null,function*(){this.tfLite&&(this.resizeInputData(),yield this.infer(t),this.postProcessing())})}fitVideoToBackground(){this.fitData(this.backgroundVideo,this.backgroundVideo.videoWidth,this.backgroundVideo.videoHeight)}fitImageToBackground(){this.fitData(this.backgroundImage,this.backgroundImage.width,this.backgroundImage.height)}fitGifToBackground(){if(this.gifFrameImageData==null){let t=this.gifFrames[this.gifFramesIndex].dims;this.tempGifCanvas.width=t.width,this.tempGifCanvas.height=t.height,this.gifFrameImageData=this.tempGifContext.createImageData(t.width,t.height)}this.gifFrameImageData.data.set(this.gifFrames[this.gifFramesIndex].patch),this.tempGifContext.putImageData(this.gifFrameImageData,0,0),this.fitData(this.tempGifCanvas,this.gifFrameImageData.width,this.gifFrameImageData.height)}fitData(t,e,i){let 
a,n,o,r;e/i<this.imageAspectRatio?(a=e,n=e/this.imageAspectRatio,o=0,r=(i-n)/2):(n=i,a=i*this.imageAspectRatio,r=0,o=(e-a)/2),this.outputCtx.drawImage(t,o,r,a,n,0,0,this.output.width,this.output.height)}addBlurToBackground(){return u(this,null,function*(){let t="15px";this.input.width<=160?t="5px":this.input.width<=320?t="10px":this.input.width<=640?t="15px":this.input.width<=960?t="20px":this.input.width<=1280?t="25px":this.input.width<=1920&&(t="30px"),this.outputCtx.filter=`blur(${t})`,this.outputCtx.drawImage(this.input,0,0,this.output.width,this.output.height)})}initSharpenFilter(){this.filters.tmpCanvas=document.createElement("canvas"),this.filters.tmpCtx=this.filters.tmpCanvas.getContext("2d"),this.filters.createImageData=(t,e)=>this.filters.tmpCtx.createImageData(t,e),this.filters.convolute=(t,e=[0,-1,0,-1,5,-1,0,-1,0],i)=>{let a=Math.round(Math.sqrt(e.length)),n=Math.floor(a/2),o=t.data,r=t.width,h=t.height,l=r,I=h,S=this.filters.createImageData(l,I),p=S.data,R=i?1:0;for(let d=0;d<I;d=d+1)for(let g=0;g<l;g=g+1){let c=(d*l+g)*4;if(o[c+3]!==0&&g<l&&d<I){let j=d,G=g,H=0,P=0,D=0,C=0;for(let m=0;m<a;m++)for(let f=0;f<a;f++){let F=j+m-n,T=G+f-n;if(F>=0&&F<h&&T>=0&&T<r){let k=(F*r+T)*4,b=e[m*a+f];H+=o[k]*b,P+=o[k+1]*b,D+=o[k+2]*b,C+=o[k+3]*b}}p[c]=H,p[c+1]=P,p[c+2]=D,p[c+3]=C+R*(255-C)}}return S}}};
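One detail worth pulling out of the minified bundle above: `addBlurToBackground` scales the CSS blur radius with the input width. The thresholds and values below are transcribed from that code; the function and variable names are descriptive substitutes for the minifier's single letters:

```typescript
// Blur radius chosen by input width, per addBlurToBackground in the bundle.
// Widths above 1920 keep the initial 15px fallback.
function blurRadiusForWidth(width: number): string {
  let radius = '15px';
  if (width <= 160) radius = '5px';
  else if (width <= 320) radius = '10px';
  else if (width <= 640) radius = '15px';
  else if (width <= 960) radius = '20px';
  else if (width <= 1280) radius = '25px';
  else if (width <= 1920) radius = '30px';
  return radius;
}
```

The result is applied as `outputCtx.filter = `blur(${radius})`` before the input frame is drawn onto the output canvas.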
package/dist/index.d.ts CHANGED
@@ -1 +1 @@
- export * from './HMSVirtualBackgroundPlugin';
+ export * from './HMSVirtualBackgroundPlugin';
package/dist/index.js CHANGED
@@ -1,8 +1 @@
-
- 'use strict'
-
- if (process.env.NODE_ENV === 'production') {
- module.exports = require('./hms-virtual-background.cjs.production.min.js')
- } else {
- module.exports = require('./hms-virtual-background.cjs.development.js')
- }
+ var A=(h,t)=>()=>(t||h((t={exports:{}}).exports,t),t.exports);var u=(h,t,e)=>new Promise((i,s)=>{var a=r=>{try{o(e.next(r))}catch(l){s(l)}},n=r=>{try{o(e.throw(r))}catch(l){s(l)}},o=r=>r.done?i(r.value):Promise.resolve(r.value).then(a,n);o((e=e.apply(h,t)).next())});var C=A((et,O)=>{O.exports={version:"1.3.8",license:"MIT",main:"dist/index.js",typings:"dist/index.d.ts",files:["dist","src/tflite","src/models"],scripts:{start:'concurrently "yarn dev" "yarn types"',dev:"node ../../scripts/dev","build:only":"node ../../scripts/build",build:"yarn build:only && yarn types:build",types:"tsc -w","types:build":"tsc -p tsconfig.build.json",test:"jest --maxWorkers=1 --passWithNoTests",lint:"eslint -c ../../.eslintrc .","lint:fix":"yarn lint --fix",prepare:"yarn build",size:"size-limit",analyze:"size-limit --why",format:"prettier --write src/**/*.ts"},peerDependencies:{"@100mslive/hms-video":"^0.1.45-alpha.1"},name:"@100mslive/hms-virtual-background",author:"ashish17",module:"dist/index.js",devDependencies:{"@100mslive/hms-video":"0.1.48"},dependencies:{"@tensorflow/tfjs-backend-webgl":"^3.3.0","@tensorflow/tfjs-core":"^3.3.0","@webassemblyjs/helper-wasm-bytecode":"^1.11.0","@webassemblyjs/wasm-gen":"^1.11.0","gifuct-js":"^2.1.2","wasm-check":"^2.0.2"},eslintIgnore:["tflite.js","tflite-simd.js","tflite.wasm","tflite-simd.wasm","defineTFLite.ts","importing.test.ts"],gitHead:"159b16c9cf62ce17ad5026a1e8558690c00fc3c3"}});var _=C(),F=`https://unpkg.com/${_.name}/src`,M="VBProcessor",R="tflite/tflite.js",j="tflite/tflite-simd.js",G="models/selfie_segmentation_landscape.tflite",S=h=>new Promise(function(t,e){let i=document.createElement("script");i.src=h,i.onload=t,i.onerror=e,document.head.appendChild(i)}),W=()=>u(void 0,null,function*(){let h,t=F+"/"+j;yield S(t);try{h=yield createTFLiteSIMDModule()}catch(e){console.warn("SIMD not supported. 
You may experience poor virtual background effect."),t=F+"/"+R,yield S(t),h=yield createTFLiteModule()}return h}),H=()=>u(void 0,null,function*(){let h=F+"/"+G,[t,e]=yield Promise.all([W(),fetch(h)]),i=yield e.arrayBuffer(),s=t._getModelBufferMemoryOffset();return t.HEAPU8.set(new Uint8Array(i),s),t._loadModel(i.byteLength),console.debug(M,"Input memory offset:",t._getInputMemoryOffset()),console.debug(M,"Input height:",t._getInputHeight()),console.debug(M,"Input width:",t._getInputWidth()),console.debug(M,"Input channels:",t._getInputChannelCount()),t});import"@tensorflow/tfjs-backend-webgl";import{parseGIF as N,decompressFrames as U}from"gifuct-js";import{HMSPluginUnsupportedTypes as z,HMSVideoPluginType as q}from"@100mslive/hms-video";var y="VBProcessor",J=33,Y=C(),$=214,Q=855,K=120,X=720,Z=class{constructor(t,e=!1){this.backgroundType="none";this.background=t,this.enableSharpening=e,this.backgroundImage=null,this.backgroundVideo=null,this.personMaskWidth=256,this.personMaskHeight=144,this.isVirtualBackground=!1,this.blurValue="10px",this.loadModelCalled=!1,this.tfLite=null,this.modelName="landscape-segmentation",this.outputCtx=null,this.input=null,this.output=null,this.timerID=0,this.imageAspectRatio=1,this.personMaskPixelCount=this.personMaskWidth*this.personMaskHeight,this.personMask=new ImageData(this.personMaskWidth,this.personMaskHeight),this.personMaskCanvas=document.createElement("canvas"),this.personMaskCanvas.width=this.personMaskWidth,this.personMaskCanvas.height=this.personMaskHeight,this.personMaskCtx=this.personMaskCanvas.getContext("2d"),this.filters={},this.gifFrames=null,this.gifFramesIndex=0,this.gifFrameImageData=null,this.tempGifCanvas=document.createElement("canvas"),this.tempGifContext=this.tempGifCanvas.getContext("2d"),this.giflocalCount=0,this.enableSharpening=e,this.log(y,"Virtual Background plugin created"),this.setBackground(this.background)}init(){return u(this,null,function*(){this.loadModelCalled?yield 
this.tfLitePromise:(this.log(y,"PREVIOUS LOADED MODEL IS ",this.tfLite),this.loadModelCalled=!0,this.tfLitePromise=H(),this.tfLite=yield this.tfLitePromise),this.enableSharpening&&this.initSharpenFilter()})}isSupported(){return navigator.userAgent.indexOf("Chrome")!==-1||navigator.userAgent.indexOf("Firefox")!==-1||navigator.userAgent.indexOf("Edg")!==-1||navigator.userAgent.indexOf("Edge")!==-1}checkSupport(){let t={};return["Chrome","Firefox","Edg","Edge"].some(e=>navigator.userAgent.indexOf(e)!==-1)?t.isSupported=!0:(t.isSupported=!1,t.errType=z.PLATFORM_NOT_SUPPORTED,t.errMsg="browser not supported for plugin, see docs"),t}getName(){return Y.name}getPluginType(){return q.TRANSFORM}setBackground(t){return u(this,null,function*(){if(t!=="")if(t==="none")this.log(y,"setting background to :",t),this.background="none",this.backgroundType="none",this.isVirtualBackground=!1;else if(t==="blur")this.log(y,"setting background to :",t),this.background="blur",this.backgroundType="blur",this.isVirtualBackground=!1;else if(t instanceof HTMLImageElement){this.log("setting background to image",t);let e=yield this.setImage(t);if(!e||!e.complete||!e.naturalHeight)throw new Error("Invalid image. 
Provide a valid and successfully loaded HTMLImageElement");this.isVirtualBackground=!0,this.backgroundImage=e,this.backgroundType="image"}else if(t instanceof HTMLVideoElement)this.log("setting background to video",t),this.backgroundVideo=t,this.backgroundVideo.crossOrigin="anonymous",this.backgroundVideo.muted=!0,this.backgroundVideo.loop=!0,this.backgroundVideo.oncanplaythrough=()=>u(this,null,function*(){this.backgroundVideo!=null&&(yield this.backgroundVideo.play(),this.isVirtualBackground=!0,this.backgroundType="video")});else if(console.log("setting gif to background"),this.gifFrames=yield this.setGiF(t),this.gifFrames!=null&&this.gifFrames.length>0)this.backgroundType="gif",this.isVirtualBackground=!0;else throw new Error("Invalid background supplied, see the docs to check supported background type");else throw new Error("Invalid background supplied, see the docs to check supported background type")})}stop(){var t,e;this.isVirtualBackground&&((t=this.backgroundImage)==null||t.removeAttribute("src"),(e=this.backgroundVideo)==null||e.removeAttribute("src"),this.backgroundType==="video"&&(this.backgroundVideo.loop=!1,this.backgroundVideo=null)),this.outputCtx&&(this.outputCtx.fillStyle="rgb(0, 0, 0)",this.outputCtx.fillRect(0,0,this.output.width,this.output.height)),this.gifFrameImageData=null,this.gifFrames=null,this.giflocalCount=0,this.gifFramesIndex=0}processVideoFrame(t,e,i){if(!t||!e)throw new Error("Plugin invalid input/output");this.input=t,this.output=e;let s=e.getContext("2d");if(s.canvas.width!==t.width&&(s.canvas.width=t.width),s.canvas.height!==t.height&&(s.canvas.height=t.height),this.backgroundType==="video"&&(this.backgroundVideo.width=t.width,this.backgroundVideo.height=t.height),this.outputCtx=s,this.imageAspectRatio=t.width/t.height,this.imageAspectRatio<=0)throw new Error("Invalid input width/height");let a=()=>u(this,null,function*(){yield 
this.runSegmentation(i)});this.background==="none"&&!this.isVirtualBackground?(this.outputCtx.globalCompositeOperation="copy",this.outputCtx.filter="none",this.outputCtx.drawImage(t,0,0,t.width,t.height)):a()}setImage(t){return u(this,null,function*(){return t.crossOrigin="anonymous",new Promise((e,i)=>{t.onload=()=>e(t),t.onerror=i})})}setGiF(t){return fetch(t).then(e=>e.arrayBuffer()).then(e=>N(e)).then(e=>U(e,!0))}log(t,...e){console.info(t,...e)}resizeInputData(){this.personMaskCtx.drawImage(this.input,0,0,this.input.width,this.input.height,0,0,this.personMaskWidth,this.personMaskHeight);let t=this.personMaskCtx.getImageData(0,0,this.personMaskWidth,this.personMaskHeight),e=this.tfLite._getInputMemoryOffset()/4;for(let i=0;i<this.personMaskPixelCount;i++)this.tfLite.HEAPF32[e+i*3]=t.data[i*4]/255,this.tfLite.HEAPF32[e+i*3+1]=t.data[i*4+1]/255,this.tfLite.HEAPF32[e+i*3+2]=t.data[i*4+2]/255}infer(t){t||this.tfLite._runInference();let e=this.tfLite._getOutputMemoryOffset()/4;for(let i=0;i<this.personMaskPixelCount;i++)if(this.modelName==="meet"){let s=this.tfLite.HEAPF32[e+i*2],a=this.tfLite.HEAPF32[e+i*2+1],n=Math.max(s,a),o=Math.exp(s-n),r=Math.exp(a-n);this.personMask.data[i*4+3]=255*r/(o+r)}else if(this.modelName==="landscape-segmentation"){let s=this.tfLite.HEAPF32[e+i];this.personMask.data[i*4+3]=255*s}this.personMaskCtx.putImageData(this.personMask,0,0)}postProcessing(){this.outputCtx.globalCompositeOperation="copy",this.outputCtx.filter="none",this.isVirtualBackground?this.outputCtx.filter="blur(4px)":this.outputCtx.filter="blur(8px)",this.drawPersonMask(),this.outputCtx.globalCompositeOperation="source-in",this.outputCtx.filter="none",this.outputCtx.drawImage(this.input,0,0),this.enableSharpening&&this.output.width>$&&this.output.height>K&&this.output.width<Q&&this.output.height<X&&this.sharpenFilter(),this.drawSegmentedBackground()}sharpenFilter(){let 
t=this.outputCtx.getImageData(0,0,this.output.width,this.output.height),e=this.filters.convolute(t);this.outputCtx.putImageData(e,0,0)}drawPersonMask(){this.outputCtx.drawImage(this.personMaskCanvas,0,0,this.personMaskWidth,this.personMaskHeight,0,0,this.output.width,this.output.height)}drawSegmentedBackground(){this.outputCtx.globalCompositeOperation="destination-over",this.outputCtx.imageSmoothingEnabled=!0,this.outputCtx.imageSmoothingQuality="high",this.isVirtualBackground?this.backgroundType==="video"&&this.backgroundVideo!=null&&this.backgroundVideo.readyState>=4?this.fitVideoToBackground():this.backgroundType==="image"?this.fitImageToBackground():this.backgroundType==="gif"&&(this.giflocalCount>this.gifFrames[this.gifFramesIndex].delay/J?(this.gifFramesIndex++,this.gifFramesIndex>=this.gifFrames.length&&(this.gifFramesIndex=0),this.giflocalCount=0):this.giflocalCount++,this.fitGifToBackground()):this.addBlurToBackground()}runSegmentation(t){return u(this,null,function*(){this.tfLite&&(this.resizeInputData(),yield this.infer(t),this.postProcessing())})}fitVideoToBackground(){this.fitData(this.backgroundVideo,this.backgroundVideo.videoWidth,this.backgroundVideo.videoHeight)}fitImageToBackground(){this.fitData(this.backgroundImage,this.backgroundImage.width,this.backgroundImage.height)}fitGifToBackground(){if(this.gifFrameImageData==null){let t=this.gifFrames[this.gifFramesIndex].dims;this.tempGifCanvas.width=t.width,this.tempGifCanvas.height=t.height,this.gifFrameImageData=this.tempGifContext.createImageData(t.width,t.height)}this.gifFrameImageData.data.set(this.gifFrames[this.gifFramesIndex].patch),this.tempGifContext.putImageData(this.gifFrameImageData,0,0),this.fitData(this.tempGifCanvas,this.gifFrameImageData.width,this.gifFrameImageData.height)}fitData(t,e,i){let 
s,a,n,o;e/i<this.imageAspectRatio?(s=e,a=e/this.imageAspectRatio,n=0,o=(i-a)/2):(a=i,s=i*this.imageAspectRatio,o=0,n=(e-s)/2),this.outputCtx.drawImage(t,n,o,s,a,0,0,this.output.width,this.output.height)}addBlurToBackground(){return u(this,null,function*(){let t="15px";this.input.width<=160?t="5px":this.input.width<=320?t="10px":this.input.width<=640?t="15px":this.input.width<=960?t="20px":this.input.width<=1280?t="25px":this.input.width<=1920&&(t="30px"),this.outputCtx.filter=`blur(${t})`,this.outputCtx.drawImage(this.input,0,0,this.output.width,this.output.height)})}initSharpenFilter(){this.filters.tmpCanvas=document.createElement("canvas"),this.filters.tmpCtx=this.filters.tmpCanvas.getContext("2d"),this.filters.createImageData=(t,e)=>this.filters.tmpCtx.createImageData(t,e),this.filters.convolute=(t,e=[0,-1,0,-1,5,-1,0,-1,0],i)=>{let s=Math.round(Math.sqrt(e.length)),a=Math.floor(s/2),n=t.data,o=t.width,r=t.height,l=o,x=r,T=this.filters.createImageData(l,x),p=T.data,P=i?1:0;for(let d=0;d<x;d=d+1)for(let g=0;g<l;g=g+1){let c=(d*l+g)*4;if(n[c+3]!==0&&g<l&&d<x){let D=d,B=g,L=0,E=0,V=0,w=0;for(let m=0;m<s;m++)for(let f=0;f<s;f++){let v=D+m-a,I=B+f-a;if(v>=0&&v<r&&I>=0&&I<o){let k=(v*o+I)*4,b=e[m*s+f];L+=n[k]*b,E+=n[k+1]*b,V+=n[k+2]*b,w+=n[k+3]*b}}p[c]=L,p[c+1]=E,p[c+2]=V,p[c+3]=w+P*(255-w)}}return T}}};export{Z as HMSVirtualBackgroundPlugin};
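The `fitData` routine in the minified bundle above computes a centered "cover" crop of the background source (image, video frame, or GIF frame) so that `drawImage` can fill the output canvas without stretching. A readable sketch of that crop math (the function name `coverCrop` is ours, not part of the library's API):

```javascript
// Sketch of the source-crop logic visible in fitData() in the minified bundle:
// pick the largest centered region of the background whose aspect ratio matches
// the camera feed, so it can be drawn onto the output canvas without distortion.
function coverCrop(srcWidth, srcHeight, targetAspectRatio) {
  let cropWidth, cropHeight, x, y;
  if (srcWidth / srcHeight < targetAspectRatio) {
    // Source is narrower than the target: keep full width, trim the height.
    cropWidth = srcWidth;
    cropHeight = srcWidth / targetAspectRatio;
    x = 0;
    y = (srcHeight - cropHeight) / 2;
  } else {
    // Source is wider than the target: keep full height, trim the width.
    cropHeight = srcHeight;
    cropWidth = srcHeight * targetAspectRatio;
    y = 0;
    x = (srcWidth - cropWidth) / 2;
  }
  return { x, y, cropWidth, cropHeight };
}

// Example: a 1920x1080 background fitted to a 4:3 camera feed keeps the full
// height and crops 240px from each side.
console.log(coverCrop(1920, 1080, 640 / 480)); // { x: 240, y: 0, cropWidth: 1440, cropHeight: 1080 }
```

The returned rectangle corresponds to the first four `drawImage` arguments in `fitData`; the plugin then scales that region to the full output width and height.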
package/package.json CHANGED
@@ -1,5 +1,5 @@
  {
- "version": "1.3.7",
+ "version": "1.3.8",
  "license": "MIT",
  "main": "dist/index.js",
  "typings": "dist/index.d.ts",
@@ -9,49 +9,28 @@
  "src/models"
  ],
  "scripts": {
- "start": "tsdx watch",
- "build": "tsdx build",
- "test": "tsdx test",
- "lint": "tsdx lint src",
- "prepare": "tsdx build",
+ "start": "concurrently \"yarn dev\" \"yarn types\"",
+ "dev": "node ../../scripts/dev",
+ "build:only": "node ../../scripts/build",
+ "build": "yarn build:only && yarn types:build",
+ "types": "tsc -w",
+ "types:build": "tsc -p tsconfig.build.json",
+ "test": "jest --maxWorkers=1 --passWithNoTests",
+ "lint": "eslint -c ../../.eslintrc .",
+ "lint:fix": "yarn lint --fix",
+ "prepare": "yarn build",
  "size": "size-limit",
- "analyze": "size-limit --why"
+ "analyze": "size-limit --why",
+ "format": "prettier --write src/**/*.ts"
  },
  "peerDependencies": {
- "@100mslive/hms-video": "^0.0.147"
- },
- "husky": {
- "hooks": {
- "pre-commit": "tsdx lint"
- }
- },
- "prettier": {
- "printWidth": 80,
- "semi": true,
- "singleQuote": true,
- "trailingComma": "es5"
+ "@100mslive/hms-video": "^0.1.45-alpha.1"
  },
  "name": "@100mslive/hms-virtual-background",
  "author": "ashish17",
- "module": "dist/hms-virtual-background.esm.js",
- "size-limit": [
- {
- "path": "dist/hms-virtual-background.cjs.production.min.js",
- "limit": "500 KB"
- },
- {
- "path": "dist/hms-virtual-background.esm.js",
- "limit": "500 KB"
- }
- ],
+ "module": "dist/index.js",
  "devDependencies": {
- "@100mslive/hms-video": "^0.0.147",
- "@size-limit/preset-small-lib": "^4.5.5",
- "husky": "^6.0.0",
- "size-limit": "^5.0.3",
- "tsdx": "^0.14.1",
- "tslib": "^2.2.0",
- "typescript": "^4.2.4"
+ "@100mslive/hms-video": "0.1.48"
  },
  "dependencies": {
  "@tensorflow/tfjs-backend-webgl": "^3.3.0",
@@ -61,14 +40,13 @@
  "gifuct-js": "^2.1.2",
  "wasm-check": "^2.0.2"
  },
- "resolutions": {
- "tsdx/**/node-notifier": "10.0.0"
- },
- "eslintConfig": {
- "env": {
- "browser": true,
- "node": true
- }
- },
- "eslintIgnore": ["tflite.js", "tflite-simd.js", "tflite.wasm", "tflite-simd.wasm", "defineTFLite.ts", "importing.test.ts"]
+ "eslintIgnore": [
+ "tflite.js",
+ "tflite-simd.js",
+ "tflite.wasm",
+ "tflite-simd.wasm",
+ "defineTFLite.ts",
+ "importing.test.ts"
+ ],
+ "gitHead": "159b16c9cf62ce17ad5026a1e8558690c00fc3c3"
  }
package/LICENSE DELETED
@@ -1,21 +0,0 @@
- MIT License
-
- Copyright (c) 2021 Praveen
-
- Permission is hereby granted, free of charge, to any person obtaining a copy
- of this software and associated documentation files (the "Software"), to deal
- in the Software without restriction, including without limitation the rights
- to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
- copies of the Software, and to permit persons to whom the Software is
- furnished to do so, subject to the following conditions:
-
- The above copyright notice and this permission notice shall be included in all
- copies or substantial portions of the Software.
-
- THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
- IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
- FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
- AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
- LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
- OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
- SOFTWARE.