blocks:app-note:nexmosphere:handgestures — revised 2023-12-21 by mattias
======Use a Nexmosphere hand gesture sensor to control content in Blocks======
:!: WORK IN PROGRESS :!:

This is an application note that describes how to use the Nexmosphere hand gesture sensor XV-H40 to control content in Blocks.

This note assumes a Nexmosphere controller with a built-in serial-to-USB converter (e.g. the XN-185) connected directly to a PIXILAB Player's USB port, but it will also work with a serial controller connected over the network via a serial port server such as a Moxa NPort.

:!: This application note is technical and assumes the reader has essential knowledge of Blocks tasks, properties and behaviors.
====Hardware requirements====
The existing Nexmosphere drivers have been updated with extensive support for the XV-H40 hand gesture sensor.
Update/replace the driver if you have an older version installed.
Most required functions have been implemented as callables on the driver. Any other settings can be set by sending a custom command to the element.
===Activation Zones===
  * //…//
Exposes the following readonly properties:
  * //…//
  * //…//
+ | |||
(This information is stored persistently in the XV-H40.)
===Gesture detection===
Exposes the following readonly property:
  * //gesture// a string property that exposes both the gesture and the direction in a single value, e.g. THUMB:UP or OPENPALM:…
This can be used as a trigger for a Blocks task.
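Since the //gesture// property packs both the gesture and its direction into one string, a small helper can split the value when the two parts are needed separately. A minimal sketch, assuming only the value format shown above (the helper itself is not part of the driver):

```typescript
// Split a combined gesture value such as "THUMB:UP" into its parts.
// A value without a ":" separator yields an empty direction string.
function parseGesture(value: string): { gesture: string; direction: string } {
	const sep = value.indexOf(":");
	if (sep < 0)
		return { gesture: value, direction: "" };
	return {
		gesture: value.substring(0, sep),
		direction: value.substring(sep + 1)
	};
}
```

This makes it easy to, for example, trigger one task per gesture type while using the direction as a parameter.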
===Position Tracking===
Exposes the following readonly properties, one set for hand ID1 and another for hand ID2:
  * //hand#X// number, horizontal position as a fraction of the camera's field of view (reported by the sensor as 0–100%, exposed normalized 0–1)
  * //…//
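A normalized 0–1 position is straightforward to map onto display coordinates, for example to drive the position of a hand icon on screen. A minimal sketch; the 1920×1080 stage size is an assumption for illustration, not something prescribed by the driver:

```typescript
// Map normalized 0–1 sensor coordinates to pixel coordinates,
// clamping out-of-range values to the stage edges.
function toStagePixels(
	x: number, y: number,
	width = 1920, height = 1080
): { left: number; top: number } {
	const clamp = (v: number) => Math.min(1, Math.max(0, v));
	return {
		left: Math.round(clamp(x) * width),
		top: Math.round(clamp(y) * height)
	};
}
```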
+ | |||
====General comments====
Both gesture tracking and Activation Zone triggering work very well. Position tracking works, but the data rate is not quite fast enough.
The fastest reporting interval the controller is capable of is 250 ms, which is not enough for a smooth user experience. That is where it can make sense to combine a captured video stream from the device with the driver's properties to create a good user experience.
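One way to mitigate the 250 ms reporting interval is to smooth the position client-side, trading a little latency for visibly steadier motion between samples. A sketch of simple exponential smoothing; the smoothing factor is a tuning assumption, not something prescribed by the sensor:

```typescript
// Exponential smoothing: each new sample pulls the displayed value
// only part of the way toward the raw reading, hiding the step
// changes caused by the coarse reporting interval.
class Smoother {
	private value: number | undefined;

	// alpha in 0..1; lower = smoother but laggier
	constructor(private alpha = 0.3) {}

	push(sample: number): number {
		this.value = this.value === undefined
			? sample	// first sample: adopt it directly
			: this.value + this.alpha * (sample - this.value);
		return this.value;
	}
}
```

One smoother per axis and per hand would be needed; feeding it on a faster timer than the 250 ms updates (reusing the last sample) gives the in-between frames.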
====Use a captured video stream====
The device can store a custom hand icon and a custom background.
If we use a PNG icon for the hand, e.g. a white outline, and a background that is black (or vice versa), we can use the blend mode settings in Blocks to overlay the captured video stream on other content.
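The reason a white icon on a black background works well is that a screen-style blend leaves black effectively transparent: blending black over any pixel returns that pixel unchanged, while white always stays white. A quick check of the standard screen-blend formula (channel values normalized 0–1; this is general blend math, not Blocks-specific API):

```typescript
// Screen blend per channel: result = 1 - (1 - top) * (1 - bottom).
// top = 0 (black) leaves the bottom layer unchanged;
// top = 1 (white) always yields white.
function screen(top: number, bottom: number): number {
	return 1 - (1 - top) * (1 - bottom);
}
```

So with the screen blend mode, the black background of the stream disappears over the underlying content, leaving only the white hand icon visible.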
+ | |||