Thursday, May 23, 2013

Week 11

Bathroom Integration

This week was spent combining our Kinect interactivity with the bathroom models created by the other team. We began by placing the player character within the virtual bathroom, which required some scaling and careful alignment. Once this was done, we started implementing our planned interactions with the room.

  • Lights: The first, and theoretically easiest, task was to control the lights based on the presence of a user. The room lights only turn on while a user is in the room, and the mirror lights turn on when the user is near the mirror. The room lights turned out to be harder than expected, since the Kinect's range is not large enough to cover the whole room while still sensing a person leaving it. Unfortunately this is largely a hardware limitation, which is difficult to work around given the closed nature of the Kinect hardware.
    The mirror lights work quite well now, once it was realized that the distance between the mirror and the user was being measured from the corner of the mirror rather than its center (see the distance-check sketch after this list).
  • Temperature: The next interaction was controlling the temperature using our custom-built gesture.
    Since this gesture had already been used to control the size of an object during initial testing, it was quite simple to repurpose it to control the height of a rectangular prism in the corner of the shower, representing the temperature change (a mapping along the lines of the sketch after this list).
  • Height Control: This ended up being one of the more difficult interactions to create. Although we'd already created a "bench" that would automatically match your knee height, using the same approach to change the toilet and sink heights was more difficult. For instance, we didn't want the toilet to constantly move around as you moved; instead it should match your height once and lock at that position, which took some time to figure out. Another issue was only activating each fixture when the user was close enough, to ensure that, for instance, the sink did not suddenly lower itself when a user sat on the toilet (see the lock-and-clamp sketch after this list).
    There was also the problem of the bench going to extreme positions when the Kinect was unable to track the user, so there are now limits in place for both the bench and the toilet to ensure they maintain reasonable heights regardless of any extraneous input.
  • Fall Detection: One of the interactions we discussed very early on in the project was detecting when someone falls down. This also proved difficult, since the Kinect cannot properly sense a person lying down: their limbs are less defined, and their angle to the camera means parts of the body are often hidden. In testing, the skeletal representation of the user would glitch and jump around, so our expected check of "is the waist at the same height as the head?" would not work consistently. We developed a process that averages the height difference between the head and the waist over a period of time, so even if the skeleton is glitching and jumping around, it can still detect that the person has fallen over and notify someone automatically (see the fall-detection sketch after this list).
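
For the mirror lights, the fix amounted to measuring distance from the mirror's center rather than its corner. The following is only a minimal sketch of that idea in Python; the variable names (MIRROR_CORNER, MIRROR_SIZE, the threshold value) and the coordinate layout are assumptions for illustration, not the project's actual code.

    import math

    MIRROR_CORNER = (1.2, 1.0, 0.0)   # assumed corner of the mirror in room coordinates (metres)
    MIRROR_SIZE   = (0.8, 0.6)        # assumed width and height of the mirror
    NEAR_MIRROR_THRESHOLD = 0.9       # assumed "near the mirror" distance in metres

    def mirror_centre():
        """Centre of the mirror, rather than its corner, for distance checks."""
        cx = MIRROR_CORNER[0] + MIRROR_SIZE[0] / 2.0
        cy = MIRROR_CORNER[1] + MIRROR_SIZE[1] / 2.0
        return (cx, cy, MIRROR_CORNER[2])

    def user_near_mirror(user_pos):
        """True if the tracked user is close enough to switch on the mirror lights."""
        cx, cy, cz = mirror_centre()
        ux, uy, uz = user_pos
        dist = math.sqrt((ux - cx) ** 2 + (uz - cz) ** 2)  # horizontal distance only
        return dist < NEAR_MIRROR_THRESHOLD

    def update_lights(user_tracked, user_pos):
        room_lights = user_tracked                              # on while someone is in the room
        mirror_lights = user_tracked and user_near_mirror(user_pos)
        return room_lights, mirror_lights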
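The temperature control boils the gesture down to a single value and maps it onto the height of the indicator prism. The sketch below assumes the gesture value is the vertical separation of the two hands, and that the prism scales between a minimum and maximum height; those ranges and names are made up for illustration only.

    def clamp(value, lo, hi):
        return max(lo, min(hi, value))

    MIN_HANDS, MAX_HANDS = 0.1, 0.8      # assumed range of hand separation (metres)
    MIN_HEIGHT, MAX_HEIGHT = 0.05, 1.5   # assumed range of prism heights (metres)

    def temperature_prism_height(left_hand_y, right_hand_y):
        """Map the hand-separation gesture onto the height of the indicator prism."""
        separation = clamp(abs(left_hand_y - right_hand_y), MIN_HANDS, MAX_HANDS)
        t = (separation - MIN_HANDS) / (MAX_HANDS - MIN_HANDS)   # normalise to 0..1
        return MIN_HEIGHT + t * (MAX_HEIGHT - MIN_HEIGHT)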
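The lock-and-clamp behaviour for the height-adjustable fixtures can be expressed as a small state machine: only adjust while the user is within an activation radius, match the knee height once, clamp it to sensible limits, and hold that value until the user walks away. This is a sketch under assumed limits and radius values, not the actual implementation.

    MIN_TOILET, MAX_TOILET = 0.35, 0.55   # assumed sensible seat heights (metres)
    ACTIVATION_RADIUS = 0.8               # assumed distance at which a fixture activates (metres)

    class HeightAdjustable:
        """Matches the user's knee height once, then locks until they walk away."""
        def __init__(self, position, min_h, max_h):
            self.position = position        # (x, y, z) of the fixture
            self.min_h, self.max_h = min_h, max_h
            self.height = min_h
            self.locked = False

        def update(self, user_pos, knee_height):
            dx = user_pos[0] - self.position[0]
            dz = user_pos[2] - self.position[2]
            close = (dx * dx + dz * dz) ** 0.5 < ACTIVATION_RADIUS
            if not close:
                self.locked = False         # release the lock once the user leaves
                return self.height
            if not self.locked:
                # Clamp so a lost skeleton can't drive the fixture to an extreme position.
                self.height = max(self.min_h, min(self.max_h, knee_height))
                self.locked = True
            return self.height

    toilet = HeightAdjustable(position=(2.0, 0.0, 1.5), min_h=MIN_TOILET, max_h=MAX_TOILET)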
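Finally, the fall detection averages the head-to-waist height difference over a window of frames, so a single glitched frame can neither trigger nor hide a fall. The window length and threshold below are illustrative assumptions, not tuned values from the project.

    from collections import deque

    FALL_THRESHOLD = 0.3   # assumed: head barely above waist, on average, means a fall (metres)
    WINDOW_FRAMES = 90     # assumed: roughly 3 seconds of history at 30 fps

    class FallDetector:
        """Averages the head-to-waist height difference over a window of frames."""
        def __init__(self):
            self.samples = deque(maxlen=WINDOW_FRAMES)

        def update(self, head_y, waist_y):
            self.samples.append(head_y - waist_y)
            if len(self.samples) < WINDOW_FRAMES:
                return False   # not enough history yet to decide
            average = sum(self.samples) / len(self.samples)
            return average < FALL_THRESHOLD

    detector = FallDetector()
    # Each frame: if detector.update(head_y, waist_y) returns True, send the alert.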
