As a National Endowment for the Arts (NEA) Research Lab, the Laboratory for the Scientific Study of Dance (LAB:SYNC) is currently developing methods for quantifying dance exposures in adults 18-85 years old. Later this year, LAB:SYNC will release three applications for quantifying dance exposures using sensors. The applications allow researchers to calibrate data acquired from both wearable devices and 2D cameras; provide ground truth labels for wearable sensor and video-based data; and analyze data acquired from wearable sensors and 2D cameras. A brief overview of each application appears below. An announcement about how to access the applications will be included in gravity Volume 2, Issue 4.
iDANCE (Integrated Dance Motion Capture Calibration Suite)
![](https://websites.umass.edu/labsync/files/2022/09/Screenshot-4-1024x616.png)
iDANCE is a calibration application for researchers who use wearable sensors or video to collect motion capture data. The application automates the process of calibrating inertial measurement unit (IMU) data acquired from wearable sensors and provides both a visual and a numerical pre-calibration report of any detected signal drift in each device. The iDANCE application additionally provides estimates of 2D camera intrinsic and extrinsic parameters within a calibration report, and the iDANCE calibration file can then be used to conduct further kinematic analyses of video data in DIVAS.
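To illustrate the kind of signal-drift check a pre-calibration report involves, here is a minimal Python sketch that estimates per-axis gyroscope bias and drift rate from a stationary capture. This is not iDANCE's code; the function name, tolerance, and synthetic data are assumptions for illustration only.

```python
import numpy as np

def drift_report(gyro, fs, bias_tol=0.02):
    """Summarize gyroscope drift from a stationary capture.

    gyro     : (n_samples, 3) array of angular velocity in rad/s
    fs       : sampling rate in Hz
    bias_tol : flag any axis whose mean bias exceeds this value (rad/s)
    """
    t = np.arange(gyro.shape[0]) / fs
    bias = gyro.mean(axis=0)                      # constant offset per axis
    # Linear drift rate per axis via a least-squares fit of signal vs. time
    drift_rate = np.array([np.polyfit(t, gyro[:, i], 1)[0] for i in range(3)])
    return {
        "bias_rad_s": bias,
        "drift_rad_s2": drift_rate,
        "flagged_axes": np.where(np.abs(bias) > bias_tol)[0].tolist(),
    }

# Example: 60 s of synthetic "stationary" data at 100 Hz with a small x-axis bias
rng = np.random.default_rng(0)
gyro = rng.normal(0, 0.005, size=(6000, 3))
gyro[:, 0] += 0.03
print(drift_report(gyro, fs=100))
```

A report like this can flag devices whose bias or drift exceeds a tolerance before data collection begins, which is the role the iDANCE pre-calibration report plays for wearable sensors.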
DIVAS (Dance Image & Video Analysis Suite)
![](https://websites.umass.edu/labsync/files/2022/09/Screenshot-2-1024x650.png)
DIVAS is intended for researchers and educators who would like to conduct direct observation analyses of physical activity behaviors, including dance, using a series of pre-programmed motor behavior codes. The DIVAS application additionally allows researchers to assign ground truth labels to video signals to train deep learning algorithms in MATLAB or to conduct further analyses on another platform; a sketch of this labeling workflow follows below. Researchers and educators can also use the software to conduct exploratory kinematic analyses. LAB:SYNC is currently optimizing an automated solution for conducting kinematic analyses within a markerless motion capture paradigm, which we hope to include as an additional feature during LAB:SYNC's second year as an NEA Research Lab.
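As a rough illustration of what frame-level ground truth labeling produces, the Python sketch below expands range annotations into one behavior code per video frame and exports them to a CSV that a downstream training pipeline (e.g., in MATLAB) could ingest. The behavior codes, file format, and frame counts here are hypothetical and do not reflect DIVAS's actual schema.

```python
import csv

# Hypothetical behavior codes; DIVAS's actual code set may differ.
BEHAVIOR_CODES = {1: "sedentary", 2: "locomotion", 3: "dance"}

# Frame-range annotations: (start_frame, end_frame, code)
annotations = [(0, 299, 1), (300, 1499, 3), (1500, 1799, 2)]

def expand_to_frame_labels(annotations, n_frames, fill_code=0):
    """Expand range annotations into one label per video frame."""
    labels = [fill_code] * n_frames
    for start, end, code in annotations:
        for frame in range(start, min(end, n_frames - 1) + 1):
            labels[frame] = code
    return labels

labels = expand_to_frame_labels(annotations, n_frames=1800)

# Write a frame-indexed CSV of ground truth labels
with open("ground_truth_labels.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["frame", "code", "behavior"])
    for i, code in enumerate(labels):
        writer.writerow([i, code, BEHAVIOR_CODES.get(code, "unlabeled")])
```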
SANDS (Signal Analysis Nexus for Dance Science)
![](https://websites.umass.edu/labsync/files/2022/09/Screen-Shot-2022-09-06-at-11.42.49-PM-1024x585.png)
SANDS allows researchers to analyze physical activity data acquired from wearable devices (e.g., accelerometers) using validated, published algorithms that are pre-programmed into the SANDS application. SANDS facilitates signal processing, data reduction, wear time analyses, and the application of various cut point algorithms for analyzing wearable sensor data. The online version of the SANDS application will feature regular updates from LAB:SYNC with newly published algorithms for quantifying physical activity, especially dance, using wearable devices.
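For readers unfamiliar with cut point and wear time analyses, the minimal Python sketch below classifies one-minute accelerometer count epochs into intensity categories and flags non-wear as long runs of zero counts. The thresholds are illustrative values in the spirit of published adult cut points (e.g., Freedson et al., 1998), and the simplified non-wear rule is an assumption; the specific algorithms pre-programmed into SANDS may differ.

```python
import numpy as np

# Illustrative adult cut points in counts/min; SANDS's published algorithms may differ.
CUT_POINTS = [(0, "sedentary"), (100, "light"), (1952, "moderate"), (5725, "vigorous")]

def classify_minutes(counts_per_min):
    """Assign each one-minute epoch an intensity category."""
    labels = []
    for c in counts_per_min:
        label = CUT_POINTS[0][1]
        for threshold, name in CUT_POINTS:
            if c >= threshold:
                label = name
        labels.append(label)
    return labels

def nonwear_mask(counts_per_min, min_run=60):
    """Simplified non-wear detection: runs of >= min_run consecutive zero-count minutes."""
    mask = np.zeros(len(counts_per_min), dtype=bool)
    run_start = None
    for i, c in enumerate(list(counts_per_min) + [1]):  # sentinel closes a trailing run
        if c == 0 and run_start is None:
            run_start = i
        elif c != 0 and run_start is not None:
            if i - run_start >= min_run:
                mask[run_start:i] = True
            run_start = None
    return mask

# Example: 90 non-wear minutes followed by 120 minutes of activity
counts = np.concatenate([np.zeros(90), np.random.default_rng(1).integers(0, 8000, 120)])
wear = ~nonwear_mask(counts)
print(int(wear.sum()), "wear minutes;", classify_minutes(counts[wear])[:5])
```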