Software/Data
Working Memory Dataset: 1st release Feb 2023
Working Memory (WM) involves the temporary retention of information over short periods of time. It is an important aspect of cognitive function that allows humans to perform a variety of tasks requiring online processing, such as dialling a phone number or recalling a route. Inherent limitations in an individual's capacity to hold information mean that people often forget important specifics during such tasks. Wearable and assistive technologies have largely targeted longer-term memory functions (e.g., episodic memory). Motivated by this, we leverage multimodal, wearable sensor data to extract attentional focus during these activities and to intelligently cue the user in situ, improving recall of those tasks.
Access the dataset here.
Publication: Under Preparation
Dataset Description: We collected data from 20 volunteers performing cognitive-information-retention tasks during desktop-based navigation in four simulated environments: (a) an indoor dorm, (b) a familiar, suburban campus area, (c) the downtown area of a mid-size US city (Baltimore), and (d) a dense, cosmopolitan city (NYC). We designed two data collection procedures for this work: physiology-driven episode extraction, and verbal cueing with navigation retracing. We employed multimodal, wearable and eye-tracking-based sensing modules comprising inertial measurement units (IMUs) (accelerometer, gyroscope, and magnetometer), Galvanic Skin Response (GSR), Photoplethysmography (PPG), Electroencephalogram (EEG), and eye-tracking sensors.
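Sensing modules like these record at different rates, so a common first step is resampling the streams onto a shared timeline. The sketch below does this by linear interpolation; the sampling rates, placeholder signals, and function name are illustrative assumptions, not part of the released dataset's tooling.

```python
import numpy as np

def align_to_timeline(timestamps, values, timeline):
    """Resample one sensor stream onto a common timeline by linear
    interpolation (a sketch; real streams may also need clock-drift
    correction before alignment)."""
    return np.interp(timeline, timestamps, values)

# Hypothetical example: a 4 Hz GSR stream and a 64 Hz PPG stream
# aligned onto a shared 32 Hz grid over a 10-second recording.
gsr_t = np.arange(0, 10, 1 / 4)      # 4 Hz timestamps (s)
gsr_v = np.sin(gsr_t)                # placeholder GSR values
ppg_t = np.arange(0, 10, 1 / 64)     # 64 Hz timestamps (s)
ppg_v = np.cos(ppg_t)                # placeholder PPG values

timeline = np.arange(0, 10, 1 / 32)  # common 32 Hz grid
gsr_on_grid = align_to_timeline(gsr_t, gsr_v, timeline)
ppg_on_grid = align_to_timeline(ppg_t, ppg_v, timeline)
```

After alignment, both channels share one index, which simplifies windowing and feature extraction across modalities.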
------------------------------------------------------##############---------------------------------------------------
Multi-view Dataset: 1st release Feb 2023
Deep video representation learning has achieved excellent performance in video action recognition, but performance degrades significantly when applied to video clips from varying perspectives. Existing video action recognition (VAR) models often include both view and action information, making it challenging to learn a view-invariant representation. To address this issue, we collected a large-scale multiview video dataset. This dataset includes various metadata to facilitate further research for robust VAR systems.
Access the dataset here.
Publications: Under Review in ICASSP 2023 and IEEE Transactions on Image Processing
Dataset Description: We collected data on ten micro-actions, including static and dynamic poses, from 12 volunteers using regular, wide-angle, and drone cameras in different environments and lighting conditions. We obtained approximately ten hours of total video in a time-controlled, safe setup, including background-only recordings of the same backgrounds. The videos were collected indoors and outdoors under varying realistic lighting conditions, with multiple realistic backgrounds and varying camera settings.
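A common protocol for measuring view invariance with such metadata is a leave-one-view-out split: train on clips from all camera views but one, then test on the held-out view. The sketch below illustrates this; the metadata field names and file naming are assumptions, not the dataset's actual schema.

```python
# Hypothetical clip metadata: one clip per (camera, action) pair.
clips = [
    {"file": f"clip_{i:03d}.mp4", "camera": cam, "action": act}
    for i, (cam, act) in enumerate(
        (c, a)
        for c in ("regular", "wide_angle", "drone")
        for a in ("standing", "waving", "walking")
    )
]

def leave_one_view_out(clips, held_out_view):
    """Split clips so one camera view is reserved for testing; a
    model that truly learns view-invariant action features should
    lose little accuracy on the held-out view."""
    train = [c for c in clips if c["camera"] != held_out_view]
    test = [c for c in clips if c["camera"] == held_out_view]
    return train, test

train, test = leave_one_view_out(clips, "drone")
```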
------------------------------------------------------##############---------------------------------------------------
The Firearm Recoil Dataset: 1st release Oct 2021
This dataset was collected using a wrist-worn accelerometer to record the recoil generated by one subject's use of 15 different firearms of the handgun, rifle, and shotgun classes. Each firearm is also labeled by whether or not it auto-loads. Data was collected at a private range, where the subject was instructed to conduct the shooting exercise in the same manner they would during a normal day at a shooting range. Slow, deliberate shots were taken, with the subject taking time to aim at a target from a standing position, feet shoulder-width apart.
Access the dataset here.
Publications:
David Welsh, Abu Zaher Faridee, Nirmalya Roy. Hybrid Distance-Based Framework for Classification of Embedded Firearm Recoil Data, in Proceedings of the 17th Workshop on Context and Activity Modeling and Recognition (CoMoRea’21), co-located with IEEE PerCom, Kassel, Germany, March 2021
Md. Abdullah Al Hafiz Khan, David Welsh, and Nirmalya Roy. Firearm Detection Using Wrist Worn Tri-Axis Accelerometer Signals, in Proceedings of the 4th Workshop on sensing systems and applications using wrist worn smart devices (WristSense’18), co-located with PerCom, March 2018
Dataset Description:
A wrist-worn AX3 (Axivity Ltd) tri-axis accelerometer was used. Data was collected at 1600 Hz with a range of ±16 g. The participant was 27 years old, 6'2'' tall, and 180 lbs. The subject was right-handed, and the sensor was placed on the right wrist. Data files are saved as separate CSV files, one per firearm.
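A recoil transient appears as a short, sharp spike in the tri-axis acceleration magnitude. The sketch below shows a minimal threshold-plus-refractory shot detector on a synthetic trace; the threshold and refractory values are illustrative assumptions, not the parameters used in the papers above.

```python
import numpy as np

FS = 1600  # sampling rate (Hz), as described above

def shot_peaks(ax, ay, az, thresh_g=8.0, refractory_s=0.5):
    """Flag samples where the acceleration magnitude exceeds a
    threshold, with a refractory period so one recoil transient
    is not counted as multiple shots (a simple baseline, not the
    published classification method)."""
    mag = np.sqrt(ax**2 + ay**2 + az**2)
    peaks, last = [], -np.inf
    for i in np.flatnonzero(mag > thresh_g):
        if i - last > refractory_s * FS:
            peaks.append(int(i))
            last = i
    return peaks

# Synthetic 4 s trace: quiet signal with two recoil-like spikes.
n = FS * 4
ax = np.zeros(n); ay = np.zeros(n); az = np.full(n, 1.0)  # gravity on z
ax[FS] = 12.0       # spike at t = 1 s
ax[3 * FS] = 15.0   # spike at t = 3 s
peaks = shot_peaks(ax, ay, az)
```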
------------------------------------------------------##############---------------------------------------------------
The MPSC-rPPG Dataset: 1st release September 2021
The MPSC-rPPG dataset was collected to capture high-resolution, high-frame-rate facial video (input) with a simultaneous wrist PPG signal (ground truth). The dataset covers variations in person, background, skin tone, and brightness. We believe that open access to the MPSC-rPPG dataset will enable the development and validation of different PPG extraction methods, and the data is therefore made available. If you use this dataset for your research, please cite the following paper. You can find the dataset at the website link below. However, please do not use any subject's face or description in your presentation, report, or paper.
Access the dataset here.
Project GitHub page with source code available here.
Publication:
Zahid Hasan, Sreenivasan Ramasamy Ramamurthy, Nirmalya Roy. CamSense: A Camera-Based Contact-less Heart Activity Monitoring, in Proceedings of the IEEE/ACM International Conference on Connected Health: Applications, Systems and Engineering Technologies (CHASE), Washington D.C. December 2021
Dataset Description:
The dataset contains RGB DSLR facial videos under artificial light from a distance of 3-6 feet. The subjects wore an Empatica E4 wristband during video collection to record the PPG simultaneously. We align the videos and the corresponding Empatica PPG signal within an error bound of 1/30 seconds by leveraging Empatica's event-marker feature. The two hours of rPPG data come from two female and six male subjects who volunteered multiple times, covering heterogeneity such as sex, facial hair, fitness level, skin color, and spectacle usage.
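When the two streams are aligned this way, a video frame index can be mapped to its nearest PPG (BVP) sample. The sketch below assumes a 30 fps video (implied by the 1/30 s error bound) and the Empatica E4's 64 Hz BVP rate; the offset handling is a simplification of however the released files encode synchronization.

```python
FPS = 30     # assumed video frame rate (implied by the 1/30 s bound)
BVP_HZ = 64  # Empatica E4 BVP sampling rate

def frame_to_bvp_index(frame_idx, offset_s=0.0):
    """Map a video frame index to the nearest BVP sample index,
    given an offset (s) recovered from the event marker. A sketch;
    the released files may store timestamps differently."""
    t = frame_idx / FPS + offset_s
    return round(t * BVP_HZ)

# Frame 90 (t = 3.0 s) with zero offset lands at BVP sample 192.
idx = frame_to_bvp_index(90)
```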
------------------------------------------------------##############---------------------------------------------------
Badminton Activity Recognition (BAR): 1st release 2020
The Badminton Activity Recognition (BAR) dataset was collected for 12 commonly played badminton strokes. Besides the strokes, the dataset aims to capture the associated leg movements. We believe in open access, and the data is therefore made available without any password protection. If you use this dataset for your research, please cite the following dataset and papers. You can find the dataset via the website link and DOI given below.
Access the dataset here.
Publication:
Indrajeet Ghosh, Sreenivasan Ramasamy Ramamurthy, Nirmalya Roy, StanceScorer: A Data Driven Approach to Score Badminton Player, in Proceedings of the 6th IEEE International Workshop on Sensing Systems and Applications Using Wrist Worn Smart Devices (WristSense), co-located with PerCom, Austin, March 2020
Indrajeet Ghosh, Sreenivasan Ramasamy Ramamurthy, Avijoy Chakma, and Nirmalya Roy. "DeCoach: Deep Learning-based Coaching for Badminton Player Assessment." Pervasive and Mobile Computing 83 (2022): 101608.
Indrajeet Ghosh, Sreenivasan Ramasamy Ramamurthy, Avijoy Chakma, Nirmalya Roy and Nicholas Waytowich, "PerMTL: A Multi-Task Learning Framework for Skilled Human Performance Assessment." In 2022 21st IEEE International Conference on Machine Learning and Applications (ICMLA), December 2022.
Dataset Description:
Four Shimmer3 IMU Units were placed on the dominant palm and wrist, and both the legs at 512 Hz.
Each IMU unit comprises a 3-axis low-noise accelerometer, 3-axis high-noise accelerometer, 3-axis gyroscope, and 3-axis magnetometer.
The data was annotated for two aspects:
the strokes played (12 strokes),
how well the player executed the stroke.
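For stroke classification, continuous 512 Hz IMU recordings are typically cut into fixed-length windows. The sketch below shows simple overlapping windowing; the 1 s window and 50 % overlap are illustrative choices, not the settings used in the papers above.

```python
FS = 512  # Shimmer3 IMU sampling rate (Hz), as above

def sliding_windows(n_samples, win_s=1.0, overlap=0.5):
    """Return (start, end) sample indices for fixed-length,
    overlapping windows over a recording of n_samples samples."""
    win = int(win_s * FS)
    step = int(win * (1 - overlap))
    return [(s, s + win) for s in range(0, n_samples - win + 1, step)]

# Windowing a 5-second recording: 1 s windows every 0.5 s.
windows = sliding_windows(FS * 5)
```

Each window would then be labeled with the stroke (and execution-quality score) that overlaps it before feature extraction or model training.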
------------------------------------------------------##############---------------------------------------------------
A Circuit-level Green Building Dataset with Appliance-, Room-, and Floor-level Information for Energy Disaggregation.
We are making available floor-, room-, and appliance-level data for one of our locations. The data has been collected at the circuit level from a three-story townhome (approx. 2000 sq. ft.) with a variety of appliances. Data is available at minute-by-minute, hour-by-hour, and day-by-day resolution. We believe in open access, and the data is therefore made available without any password protection. If you use these datasets for your research, please cite the following papers. We plan to release subsequent circuit-level datasets from this location in due course. Stay tuned.
Publications:
Nilavra Pathak, Md. Abdullah Al Hafiz Khan, and Nirmalya Roy. “Acoustic based appliance state identifications for fine grained energy analytics”, in Proceedings of the IEEE International Conference on Pervasive Computing and Communications (PerCom), March 2015.
Nirmalya Roy, Nilavra Pathak, and Archan Misra. “AARPA: Combining Mobile and Power-line Sensing for Fine-grained Appliance Usage and Energy Monitoring”, in Proceedings of the IEEE International Conference on Mobile Data Management (MDM), June 2015.
Datasets:
Dataset Description:
Minute-by-minute data: wattage (power) by circuit, plus whole-house voltage and outside temperature (in 7-day or 14-day chunks).
Hourly data: kilowatt-hours (energy usage) by circuit, plus average whole-house voltage and outside temperature (in 30-day chunks).
Daily data: daily kilowatt-hours (energy usage) by circuit and outside temperature (in 6-month, 8-month, or full 1-year chunks).
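The three resolutions are related by simple aggregation: each minute-level average-wattage reading contributes W x (1/60) h of energy, so 60 readings roll up into one hourly kWh value. The sketch below illustrates this conversion; it assumes the minute readings are average watts, which may differ from the exact semantics of the released files.

```python
def minutes_to_kwh(watts_per_minute):
    """Aggregate 60 minute-level average-wattage readings for one
    circuit into an hourly kWh figure: sum(W) * (1/60) h / 1000."""
    assert len(watts_per_minute) == 60
    return sum(watts_per_minute) / 60 / 1000

# A circuit drawing a constant 600 W for an hour uses 0.6 kWh.
kwh = minutes_to_kwh([600] * 60)
```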