@onirix/ar-engine-sdk 1.8.0 → 1.8.2

package/CHANGELOG.md CHANGED
@@ -1,5 +1,11 @@
  # Changelog
 
+ ## v1.8.1 (2024-07-24)
+
+ ### Fixed
+
+ - Fallback to Onirix world tracking whenever WebXR initialization fails for any reason.
+
  ## v1.8.0 (2024-07-01)
 
  ### Changed
package/README.md CHANGED
@@ -7,6 +7,7 @@ It internally uses advanced web standards like WebGL or WebAssembly combined wit
  Onirix AR Engine SDK is compatible with the following Onirix **tracking modes**:
 
  * **Image**: Will load the image classifier generated for your Onirix project and perform detection and tracking of any of its images.
+ * **Curved**: Will load the image classifier generated for your Onirix project and perform detection and tracking of any of its images, taking into account that those images are wrapped around a cylinder.
  * **QR Code**: Will detect any QR code in the camera feed, return its decoded content and perform tracking.
  * **Surface**: Will use the device camera and motion sensors to track an object fixed over any place or surface. This mode has 2 different sub-modes:
  * **World-Tracking** (Default): Allows placing objects fixed on a surface, so you can walk around and see them from different perspectives. Internally uses the WebXR Device API if the device is compatible (Android with ARCore support for now), and falls back to a custom implementation if not, or when using the config option `disableWebXR = true`.
@@ -29,7 +30,7 @@ Onirix AR Engine SDK is agnostic (not tied) to any rendering engine, so you can
 
  First of all, you'll need to access [Onirix Studio](https://studio.onirix.com?target=_blank) and create a Project.
 
- If you plan to use Image-Tracking mode, then you'll also need to **create an image scene for every marker image** you want to be detected so Onirix can generate the required image classifier. For Spatial-Tracking, you'll also need to create a scene by scanning your environment with the Onirix Constructor App. If none of these modes are used, then there is no need to create any scene as, when using the SDK, you can provide your own assets and interaction through your hosting and code.
+ If you plan to use Image-Tracking or Curved-Tracking mode, then you'll also need to **create an image scene for every marker image** you want to be detected so Onirix can generate the required image classifier. For Spatial-Tracking, you'll also need to create a scene by scanning your environment with the Onirix Constructor App. If none of these modes are used, then there is no need to create any scene as, when using the SDK, you can provide your own assets and interaction through your hosting and code.
 
  Finally, you will have to **publish** your project and **copy the Web SDK token** from the "share" and "settings" top-menu options respectively.
 
@@ -73,6 +74,7 @@ Whenever you want to launch the AR experience, **create a new OnirixSDK instance
  The **mode** can be one of the following depending on the tracking type you want to use for your experience:
 
  * OnirixSDK.TrackingMode.Image
+ * OnirixSDK.TrackingMode.Curved
  * OnirixSDK.TrackingMode.QRCode
  * OnirixSDK.TrackingMode.Surface
  * OnirixSDK.TrackingMode.Spatial
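To make the mode selection above concrete, here is a minimal, illustrative sketch (not part of this package's diff). It assumes the package's default export, an `init(config)` method that resolves with a renderer canvas once the engine is ready, and a `disableWebXR` config key matching the option mentioned in the README; verify these against the version you install.

```js
// Illustrative sketch only — anything not named in the README
// (the default export, `init`, the resolved canvas) is an assumption.
import OnirixSDK from "@onirix/ar-engine-sdk";

const OX = new OnirixSDK("YOUR_WEB_SDK_TOKEN"); // Web SDK token copied from Onirix Studio

const config = {
    mode: OnirixSDK.TrackingMode.Curved, // or Image, QRCode, Surface, Spatial
    // disableWebXR: true,               // Surface mode: force the custom world-tracking fallback
};

OX.init(config)
    .then((rendererCanvas) => {
        // Attach your rendering engine (e.g. Three.js) to the returned canvas here.
    })
    .catch((error) => {
        console.error("Onirix AR Engine SDK failed to initialize", error);
    });
```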
@@ -213,6 +215,10 @@ While in image mode, positive Y is orthogonal coming out from marker image, X go
 
  ![](https://docs.onirix.com/user/pages/05.onirix-sdk/08.web-ar/image-axes.jpg)
 
+ #### Curved mode
+
+ While in curved mode, -Z is orthogonal, coming out from the marker image; X goes right and Y goes down.
+
  #### Spatial mode
 
  While in Spatial mode, the coordinate system aligns with the scanned environment. This means that, similar to Surface mode, the -Z axis points towards the horizon, the Y axis points upward, aligned with gravity, and the X axis is perpendicular to both, forming the cross vector. However, in Spatial mode, these axes are rotated in such a way that they match the orientation and position of the previously scanned space, ensuring that virtual objects are accurately placed within the real-world context.
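As a side note on the axis conventions above, here is a small, hypothetical helper (not an SDK API) that restates the curved-mode frame in code: since both the curved-mode marker frame (X right, Y down, -Z out of the image) and a conventional Y-up frame are right-handed, converting a point between them is a 180° rotation about X, i.e. negating the Y and Z components.

```js
// Hypothetical helper, not part of the SDK: convert a point expressed in the
// curved-mode marker frame (X right, Y down, -Z pointing out of the image)
// into a conventional Y-up frame (X right, Y up, +Z pointing out of the image).
// Both frames are right-handed, so the change of basis is a 180° rotation
// about X, which negates the Y and Z components.
function curvedMarkerToYUp([x, y, z]) {
    return [x, -y, -z];
}

// A point "above" the marker centre and slightly in front of it: in the
// curved-mode frame that is negative Y (up) and negative Z (out of the image).
console.log(curvedMarkerToYUp([0, -1, -0.5])); // -> [0, 1, 0.5]
```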