Use a Nexmosphere hand gesture sensor to control content in Blocks

This is an application note that describes how to use the Nexmosphere hand gesture sensor XV-H40 with Blocks.

This note assumes a controller with a built-in serial-to-USB interface, connected directly to a USB port on the PIXILAB Player computer (e.g. the XN-185).

:!: This application note is technical and assumes that Blocks users have a working knowledge of tasks, properties and behaviors.

Hardware requirements

  • Nexmosphere XN-185 Experience controller. This is the 8-interface controller device, with USB connection.
  • Nexmosphere XV-H40 Element, the sensor module that handles the image data processing.
  • Nexmosphere S-CLA05 camera module.
  • A computer running PIXILAB Player, allowing the sensor to be connected to Blocks.
  • A USB capture interface (optional), to capture video from the sensor and use it in Blocks content.

Wiring

Follow the vendor's manual for the wiring. Below is a principal schema, including the optional capture interface.

Sensor

All Nexmosphere sensors are designed with simplicity in mind. The XV-H40 is by far the most advanced sensor at the time of writing, yet it is still easy to get started with. One significant difference compared to other Nexmosphere elements is that some of its settings are stored persistently in the element, while others are not.

The Driver

The existing Nexmosphere drivers have been updated with extensive support for the XV-H40 hand gesture sensor. Update or install the driver in any existing system to gain support for this device. Most required functions are implemented as callables on the driver; any other settings can be set by sending a custom command to the element.

Activation Zones

Exposes the following callables:

  • clearSingleZone, clears a single zone with a specific ID.
  • clearAllZones, clears all zones at once.
  • setZone, sets up a single zone with size and position parameters.*
  • setZoneTriggerGesture, specifies a gesture and optional direction to act as a trigger when using Activation Zones.

Exposes the following readonly properties:

  • inZoneX, a Boolean per zone (1-9). Can be bound to, e.g., a Classify behavior placed on a button, or used as a trigger in tasks. Becomes true while a hand is in the zone.
  • clickInZoneX, a Boolean per zone (1-9). Can be bound to a Press behavior placed on a button, or used as a trigger in tasks. Becomes momentarily true when a hand is in the zone and the selected trigger gesture is detected from the same hand.

(* This information is stored persistently in the XV-H40.)

Gesture detection

Exposes the following readonly property:

  • gesture, a string property that exposes both the gesture and the direction in a single value, e.g. THUMB:UP, OPENPALM:CENTER, SWIPE:LEFT. It can be bound to behaviors or used as a trigger for Blocks tasks.
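Since the property packs two pieces of information into one colon-separated value, it can be convenient to split it when reacting to only the gesture or only the direction. The helper below is illustrative (not part of the driver); the GESTURE:DIRECTION format is taken from the examples above.

```typescript
// Splits a combined gesture value such as "SWIPE:LEFT" into its parts.
// Illustrative helper, not part of the Nexmosphere driver.
function parseGesture(value: string): { gesture: string; direction: string } {
  const [gesture, direction = ""] = value.split(":");
  return { gesture, direction };
}
```

For example, parseGesture("SWIPE:LEFT") yields the gesture "SWIPE" and the direction "LEFT", letting a task react to any swipe regardless of direction.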

Get started