The TUM Kitchen Data Set of Everyday Manipulation Activities for Motion Tracking and Action Recognition. Tenorth, Bandouch, Beetz. ICCV Workshops 2009

<Also see>

The TUM Kitchen Data Set contains observations of several subjects setting a table in different ways. Some perform the activity like a robot would, transporting the items one by one; others behave more naturally and grasp as many objects as they can at once. In addition, there are two episodes in which the subjects repetitively performed reaching and grasping actions. Applications of the data are mainly in the areas of human motion tracking, motion segmentation, and activity recognition.

To provide sufficient information for recognizing and characterizing the observed activities, we recorded the following multi-modal sensor data:

  • Video data from four fixed, overhead cameras (384×288 pixels RGB color or 780×582 pixels raw Bayer pattern, at 25Hz)
  • Motion capture data (*.bvh file format) extracted from the videos using our markerless full-body MeMoMan tracker
  • RFID tag readings from three fixed readers embedded in the environment (sample rate 2Hz)
  • Magnetic (reed) sensors detecting when a door or drawer is opened (sample rate 10Hz)
  • Action labels (the data is labeled separately for the left hand, the right hand, and the trunk of the person)
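Since the modalities above run at different rates (25 Hz video, 10 Hz reed sensors, 2 Hz RFID), any joint use of them requires aligning the slower streams to the video timeline. A minimal sketch of nearest-timestamp alignment, assuming illustrative timestamps and values (not the data set's actual file layout):

```python
from bisect import bisect_left

def nearest_reading(timestamps, values, t):
    """Return the value whose timestamp is closest to t.

    Assumes `timestamps` is sorted ascending and parallel to `values`.
    """
    i = bisect_left(timestamps, t)
    if i == 0:
        return values[0]
    if i == len(timestamps):
        return values[-1]
    before, after = timestamps[i - 1], timestamps[i]
    return values[i] if after - t < t - before else values[i - 1]

# Hypothetical example: 2 Hz RFID readings aligned to 25 Hz video frames.
rfid_t = [0.0, 0.5, 1.0]                      # seconds
rfid_v = ["none", "cup", "cup"]               # tag seen at each reading
video_t = [k / 25.0 for k in range(30)]       # 30 frames ≈ 1.2 s
aligned = [nearest_reading(rfid_t, rfid_v, t) for t in video_t]
```

After alignment, each video frame carries the RFID reading closest to it in time, which makes per-frame action labels and sensor readings directly comparable.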
  1. Markerless 51 DOF tracking
    1. Looks like there is no head-mounted camera (although there are 4 static overhead cameras)
  2. Has data on subjects setting a table
  3. <Because there is no headmount, probably not usable for us now, although it may be good for testing later; if so, will revisit>
