Tuesday, May 28, 2013

Week 12

This week we refined the existing bathroom interactions and added a new one.

  • Light brightness control: We decided to add another gesture-based interaction to control the brightness of both the main light and the mirror lights. The gesture simply involves placing both hands at shoulder height and moving them apart to brighten the light, or together to dim it. The control switches between the main and mirror lights depending on where the user is standing, and once the user drops their hands, the light locks to the current brightness.
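
In essence, the gesture logic amounts to something like the following C++ sketch (the joint struct, thresholds and mapping are illustrative assumptions, not our actual flowgraph setup):

    #include <algorithm>
    #include <cmath>

    struct Vec3 { float x, y, z; };

    // Returns the new brightness in [0, 1], or the previously locked value
    // when the gesture is not active (hands away from shoulder height).
    float UpdateBrightness(const Vec3& leftHand, const Vec3& rightHand,
                           const Vec3& leftShoulder, const Vec3& rightShoulder,
                           float lockedBrightness)
    {
        const float shoulderTolerance = 0.15f; // hands within 15 cm of shoulder height count as raised
        bool handsRaised =
            std::fabs(leftHand.y - leftShoulder.y) < shoulderTolerance &&
            std::fabs(rightHand.y - rightShoulder.y) < shoulderTolerance;

        if (!handsRaised)
            return lockedBrightness; // hands dropped: lock to the current brightness

        // Map hand separation, relative to shoulder width, onto a 0..1 range:
        // hands together is dim, hands wide apart is full brightness.
        float handSpan = std::fabs(rightHand.x - leftHand.x);
        float shoulderSpan = std::fabs(rightShoulder.x - leftShoulder.x);
        if (shoulderSpan < 0.01f)
            return lockedBrightness; // skeleton data too unreliable this frame
        return std::clamp((handSpan - shoulderSpan) / shoulderSpan, 0.0f, 1.0f);
    }
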
We also corrected some inconsistencies with the fall detection and improved the smoothness of the sink and toilet movement.

We also, at Russell's suggestion, added more direct on-screen feedback about what is happening within the bathroom. This can be partially seen in the videos that have been uploaded; it effectively lets the user know what's happening. For instance, when the temperature is being changed, a number indicates the current temperature. There's also text that lets the user know whether the bench height is currently tracking them, and whether they're in a position to adjust the main light or the mirror light.
We also placed the "Calling Ambulance" message in the same system, but gave it a more prominent position.

In preparation for the presentation in week 13, we set up an area in the classroom to simulate the bathroom. We outlined major features like the shower and sink with masking tape on the ground, used a chair for the toilet, and tables to represent the walls. We then shot the following video to comprehensively cover all the interactions, in case we were unable to get the live features to work during the presentation (this is also where we filmed the feature videos for this and the previous post):



For our part of the presentation, we decided that a live demonstration would be by far the most effective approach, along with asking Russell or Stephen to try it themselves to see how it works. I'll be the demonstrator, while Steven talks through what I'm doing.

Thursday, May 23, 2013

Week 11

Bathroom Integration

This week was spent combining our Kinect interactivity with the bathroom models created by the other team. We began by positioning the player character within the virtual bathroom, which required some scaling and careful positioning. Once this was done, we began implementing our ideas regarding the interaction with the bathroom.

  • Lights: The first, and theoretically easiest, task was to control the lights with the presence of a user. The lights will only turn on if the user is within the room, and the mirror lights will turn on when the user is near the mirror. The room lights turned out to be more difficult than initially thought, since the Kinect's range was not large enough to scan the entire room while still being able to sense a person leaving it. Unfortunately this is more of a hardware problem, which is difficult to rectify given the closed nature of the Kinect hardware.
    The mirror lights work quite well now that we've realized the distance between the mirror and the user was being measured from the corner of the mirror.
  • Temperature: The next interaction was controlling the temperature using our custom-built gesture.
    Since this gesture had already been used to control the size of an object during initial testing, it was quite simple to repurpose it to control the height of a rectangular prism in the corner of the shower, representing the temperature change.
  • Height Control: This ended up being one of the more difficult interactions to create. Although we'd already created a "bench" that would automatically match your knee height, using the same approach to change the toilet and sink heights was more difficult. For instance, we didn't want the toilet to constantly move around as you moved; rather, it should match your height once and lock at that position. This took some time to figure out. Another issue was only activating the fixtures when the user was close enough, to ensure that, for instance, the sink did not suddenly lower itself when a user sat on the toilet (a simplified sketch of this behaviour appears after this list).
    There was also the problem of the bench going to extreme positions when the Kinect was unable to track the user, so there are now limits in place for both the bench and the toilet to ensure they maintain reasonable heights regardless of any extraneous input.
  • Fall Detection: One of the interactions we discussed very early on in the project was detecting when someone falls down. This was also difficult, since the Kinect cannot properly sense a person lying down: their limbs are less defined, and their angle to the camera means parts of the body often can't be seen. In testing, the representation of the user would glitch and jump around, meaning our expected check of "Is the waist at the same height as the head?" would not work consistently. We developed a process that averages the difference in height between the head and the waist over a period of time, so even if the skeleton is glitching and jumping around, it can still detect that the person has fallen over and notify someone automatically (a rough sketch of this check follows below).
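
By way of illustration, the fall check boils down to something like this C++ sketch (the window size and threshold are assumptions, not measured values):

    #include <cstddef>
    #include <deque>

    class FallDetector {
    public:
        // Call once per frame with the tracked head and waist heights in metres.
        // Returns true once the averaged difference suggests the person is lying down.
        bool Update(float headY, float waistY)
        {
            m_samples.push_back(headY - waistY);
            if (m_samples.size() > kWindow)
                m_samples.pop_front();

            float sum = 0.0f;
            for (float diff : m_samples)
                sum += diff;
            float averageDiff = sum / m_samples.size();

            // Standing, the head sits well above the waist; lying down, the difference collapses.
            return m_samples.size() == kWindow && averageDiff < kFallThreshold;
        }

    private:
        static constexpr std::size_t kWindow = 90;     // roughly 3 seconds at 30 frames per second
        static constexpr float kFallThreshold = 0.25f; // metres
        std::deque<float> m_samples;
    };
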
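The "match once, then lock" behaviour described above for the toilet and sink heights is essentially the following (the distances and limits here are placeholder values, not the ones used in the bathroom):

    #include <algorithm>

    // Tracks one fixture (toilet or sink). Heights and distances are in metres.
    struct FixtureHeightController {
        float height = 0.45f; // current fixture height
        bool locked = false;  // once matched for this visit, stop tracking the user

        static constexpr float kMinHeight = 0.35f;          // clamps guard against bad skeleton data
        static constexpr float kMaxHeight = 0.60f;
        static constexpr float kActivationDistance = 1.0f;  // only react when the user is this close

        void Update(float userDistance, float kneeHeight)
        {
            if (userDistance > kActivationDistance) {
                locked = false; // user has moved away, allow re-adjustment on their next approach
                return;
            }
            if (locked)
                return;         // already matched this user, hold the position
            height = std::clamp(kneeHeight, kMinHeight, kMaxHeight);
            locked = true;      // lock so the fixture doesn't follow every movement
        }
    };
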

Tuesday, May 21, 2013

Group Presentations Review: Remuneration

Vivid Group:

The Vivid Group did the final group presentation on the subject of remuneration.
Their presentation was inconsistent. Some members presented clearly and engagingly but, possibly as a result of the topic rather than personal choice, went into far more detail than was useful, creating something of an information overload, while other members seemed unsure of what they were talking about and were overly vague.
The members in the first group were surprisingly competent at audience engagement, only glancing down at their notes occasionally and speaking clearly and informatively. Those in the second group fell into the common trap of spending most of their time talking to their notes.
The Prezi presentation was well structured but had too much text on the screen, meaning it was difficult to get a clear picture of what they were saying, especially with some members talking at an excessive speed.

The written presentation was detailed but suffered from too many unexplained terms and concepts. It may have assumed too much about the knowledge of the audience, meaning that complex ideas were being explained without explaining the core ideas they were built on.

They gave some good examples which were fairly well explained, but the examples didn't really relate to their project, so their usefulness was limited.
Their presentation didn't have many images in it, and the ones that were present were fairly generic stock photos, though this is somewhat understandable given their topic.
They did seem to have a decent understanding of the topic, but some of them seemed unsure how to present this knowledge effectively.
With that being said, it is a difficult topic to make interesting or engaging, so some leeway could perhaps be given for that, but I think a simpler presentation with more direct links to their project would've made it much more informative and engaging.

Tuesday, May 14, 2013

Group Presentations Review: Conflict

DCLD Group:

The DCLD Group were the first to present on the subject of conflict. Their presentation was detailed and complex, but suffered from the common problem of having too much text on the slides and a lack of engagement with the audience. Their oral presentation leant heavily on the reading of text from the slides, or at least the screen in front of them. This meant that instead of feeling like I was being told something, it was more like I was simply being read something, which I can do on my own. There seemed to be a lack of organisation with the slides as well, as they often skipped over some slides or went backwards to previous ones.
The written component of the presentation contained plenty of detail, but in my opinion consisted of more lists than was useful. Lists are good for creating a starting point, giving a basic outline of what topics you're covering, but after that it's more engaging, and easier to follow, to talk about things progressively, moving from one idea to the next smoothly without just jumping to the next point on a list. As an extension of the list problem, there was a sense of disconnectedness between the topics, with little explanation of why they were in this particular order or how one topic followed on from the previous one.
Their examples were good, especially the BIM one, but were not explained effectively, so while I got a good idea of what conflicts might arise, I didn't really understand what the potential solutions were.
They referenced most of their images, but the references were often too small to read.

Their images were decent, but contained a lot more text than was useful. In some cases, especially the large flowgraph, it was almost impossible to read the text, meaning the audience had only the vaguest idea of how it connected to the topic.
While there was useful information in the presentation, the way it was presented gave the impression that the team didn't really have a deep understanding of the topic and were just reading what was in front of them for the first time.

Kinecting the boxes:

The “Kinecting the boxes” presentation was the second one on the topic of conflict.
Their oral presentation was better than the last group's, as they read mostly from notes rather than off the slides, though there was still a distinct lack of connection with the audience, which could've been achieved by at least looking at them regularly.
Their reading was clear, but not engaging, as if they were simply dictating the text rather than trying to explain something to an audience.
There was also a significant imbalance between the people presenting, as some presented for far longer than others. This may have been due to one person simply being more willing to talk than others, but it did lead to what felt like a less effective group dynamic.
The presentation also went substantially longer than it should've; succinct explanations are far more effective with an audience than lengthy, convoluted details.
Their written presentation was quite clearly written, though it tended on occasion to be oddly melodramatic. They suffered from the same problem as the last group by including far more lists than was useful. They did manage to maintain a reasonable level of flow between the various topics, though having some kind of overall outline would have been useful in identifying this.
Their examples were a bit unrelated and tended to be very general rather than project-specific. While it's possible they had suffered very little conflict in their group, even theoretical examples are more useful than broad generalisations, since conflict resolution is something that should be considered on a case-by-case basis.
Their presentation was well laid out, but a lot of the images had too much text. The flowgraph was interesting, but took far too long to explain.
It was difficult to tell if they had a good understanding of the topic with so much of the information being read off cards rather than delivered to the audience. While they had a fairly basic view of conflict, they explained it well and gave plenty of detail on the potential resolutions, though there were a few occasions of repeated content.

Monday, May 13, 2013

Week 10

Interactivity Testing:

Gestural Control Test: Scaling a ball using a specific gesture to ensure against inadvertent activation, usable for light or temperature control:

 

Proximity Test:
Using the distance between a specific joint, in this case the hand, and an object to interact with the object, usable for operating taps or opening cupboards:
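
The check behind this is essentially a distance comparison between the hand joint and the object, along the lines of the sketch below (the struct, positions and trigger radius are placeholder values):

    #include <cmath>

    struct Vec3 { float x, y, z; };

    // Returns true when the hand is within the trigger radius of the object,
    // for example close enough to a tap to turn it on.
    bool IsHandNearObject(const Vec3& hand, const Vec3& object, float triggerRadius = 0.3f)
    {
        float dx = hand.x - object.x;
        float dy = hand.y - object.y;
        float dz = hand.z - object.z;
        return std::sqrt(dx * dx + dy * dy + dz * dz) < triggerRadius;
    }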


Skeletal Analysis Test:
Using skeletal proportions, in this case from the leg, to control the height of an object, such as a chair or bench, to make it comfortable to use:

Arduino Testing:

Stepper motor test:

Monday, May 6, 2013

Visual Evidence of Progress

Arduino Tests:

Controlling a Hobby Servo:

Controlling a basic motor:

Controlling an LED with a potentiometer:


(Made with the assistance of Sparkfun Tutorials)

Controlling a stepper motor requires a specific driver chip, which is currently in the mail.

Kinect Progress:

CryEngine can now access skeletal positions as well as some basic gestural controls.


Full Skeletal tracking.

(Programmed with the help of Stephen Davey)

Saturday, May 4, 2013

Individual Major Milestone


Introduction:


The key feature of this project (it's in the course name) is collaboration. Unfortunately, this is an area in which our group has not been particularly successful. This project is part of a larger project under the guidance of Stephen Davey; unfortunately, he has had a large amount of other work to attend to and has not had many opportunities to provide direction, though he has been quite willing to give feedback.
I think that, in retrospect, developing a timeline to work to would have helped us both assign tasks and complete them within a reasonable time frame.
However, we have made substantial progress and I hope we will achieve an interesting and innovative solution by the end of this project.
Hopefully we can still deliver an effective idea, even if it's somewhat theoretical, but it would be much more useful to have a functioning prototype.

My personal progress has mostly been in the areas of industrial design and intellectual property. I had hoped to use this project to increase my programming prowess, particularly with the Kinect; however, technical limitations have meant that I am unable to take as active a role as I would have liked in this part of the project.


Collaboration:

Our group has struggled with collaboration throughout this project, due to the language barrier between its members and the lack of the single, fundamental goal that has been present in past studios. I feel this is partly a result of the group formation process, since a group with a wider range of skills might have been able to better divide elements of the project and potentially communicate more effectively.
The detachment of some members of the group has also meant that Steven, as the group leader, receives very little feedback or even basic contributions to the project; effectively, some members' only action is to complete assigned work, slowly and ineffectively, and then to ignore the project. This problem is much more difficult to resolve since it stems from the attitudes of the members themselves, and the only solution so far has been not to rely on getting any input from them and simply to run the project with a smaller group.


Industrial Design:

One area in which I personally have developed is Industrial Design. Having never done anything remotely similar to designing this Kinect mechanism, I was unsure where to start. Unfortunately, none of my team members had any experience in this area either, so I effectively had to use trial and error to find an effective solution. The solution I created turned out to be somewhat over-engineered, meaning it was excessively expensive and only marginally efficient. This was due to my focus on making it laser-cuttable and structurally sound. Perhaps if I had focused more on pre-built alternatives, I could have procured a powered bearing that would have filled the same role as this design while being cheaper and more effective. Thankfully, we have a fall-back in the form of the 2-axis bracket; however, the amount of time spent on this is largely disproportionate to the payback. While this is a useful lesson, it unfortunately comes too late to be of much use in this particular course.


Intellectual Property:

Due to the presentation we were required to deliver, I have become much more proficient in the area of Intellectual Property, particularly in the process of acquiring protection for the various kinds of IP that can be created. This promises to be useful in both this and future projects, as most of the things we create in our course can be protected, and there are ways this protection can be used to generate income as well as guard against IP theft. I also had another assignment in a different course on the subject of Intellectual Property, and this presentation was of great value there. There are very few areas, in fact, where a knowledge of IP would not be useful. I would say that so far it has been the most useful part of this course.


Programming:

While I had planned to focus on the programming side of this project, the lack of an available Kinect, as well as the inability to even create flowgraphs on Windows 8, has meant that this side of the project has rested mostly with Steven.
With that being said, we've worked collaboratively on the programming side and, with Stephen's help, have achieved a working integration of the Kinect sensor and CryEngine 3. We hope to add the ability to track individual limbs to facilitate more intricate interactions with the various elements of the environment, though this may require delving into the Kinect SDK and exposing more of the positional joint data.
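
To give a sense of what exposing that joint data might involve, a hypothetical C++ fragment using the Kinect for Windows SDK could look like the following (this is an illustration only, not our actual integration code):

    #include <Windows.h>
    #include <NuiApi.h>

    // Reads the head position of the first tracked skeleton.
    // Assumes NuiInitialize(NUI_INITIALIZE_FLAG_USES_SKELETON) and
    // NuiSkeletonTrackingEnable(...) have already been called successfully.
    bool GetHeadPosition(Vector4& headOut)
    {
        NUI_SKELETON_FRAME frame = {0};
        if (FAILED(NuiSkeletonGetNextFrame(0, &frame)))
            return false;

        for (int i = 0; i < NUI_SKELETON_COUNT; ++i)
        {
            const NUI_SKELETON_DATA& skeleton = frame.SkeletonData[i];
            if (skeleton.eTrackingState == NUI_SKELETON_TRACKED)
            {
                headOut = skeleton.SkeletonPosition[NUI_SKELETON_POSITION_HEAD];
                return true;
            }
        }
        return false;
    }

The same pattern would give us any of the other joints (hands, knees, waist) for the more intricate interactions.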

I've also worked on understanding the Arduino coding language so that, if we manage to implement it, we can control the movement of the Kinect in reaction to the location of the user.
Using the Arduino control board to control a number of stepper motors would be the main component in moving or rotating the Kinect sensor. The stepper motors work by enabling electromagnets located around a central toothed gear, rotating the gear a minuscule amount at a time, down to a single degree. This gives the motor a high degree of accuracy, though at the cost of lower speed. Rotation speed, however, is not particularly important for this project, since the sensor only has to keep up with a human at walking speed, potentially even slower.
The implementation of this is quite simple: it requires just four wires to be connected, with an electric pulse sent down each in sequence to activate one of the four electromagnets (see the sketch below).
Given their accuracy, these motors are the most effective solution for tracking people and also, if we manage to implement the Kinect Apparatus, for controlling the wire winches that move it around the room.
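
A minimal Arduino sketch of that pulse sequence might look like the following (pin numbers and timing are assumptions, and the coils would be driven through the driver chip rather than directly from the Arduino pins):

    const int coilPins[4] = {8, 9, 10, 11}; // placeholder pins, one per electromagnet

    void setup() {
      for (int i = 0; i < 4; i++)
        pinMode(coilPins[i], OUTPUT);
    }

    void loop() {
      // Energise one electromagnet at a time, in sequence, to advance the gear one step.
      for (int phase = 0; phase < 4; phase++) {
        for (int i = 0; i < 4; i++)
          digitalWrite(coilPins[i], i == phase ? HIGH : LOW);
        delay(10); // a longer delay gives slower, smoother rotation
      }
    }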


Conclusion:


I have made progress towards my individual milestones; however, a lack of group interaction and technical limitations have hampered my ability and enthusiasm to make more substantial progress. Hopefully our group and I will be able to come together to create an effective project solution, but this will require a cohesive effort from every member.

Wednesday, May 1, 2013

Week 8

This week the Kinect Apparatus prototype came back from the laser cutters and was assembled.
Overall I was quite pleased with the result. However, there are a number of issues with it that would need to be resolved. For instance, the brackets that hold the Kinect are substantially too large, due to an error when measuring. This means the Kinect does not sit particularly well and would have to be braced in order to not move around. The brace that stops the Kinect sliding sideways does not connect quite as well as it should, but this can be rectified by some minor filing.
The rotation system works well, but as Russell pointed out, it's rather excessive and somewhat expensive, with the whole setup costing around $80.
We are looking into cheaper alternatives, such as a pre-built hinge. Unfortunately the specific kind of apparatus we require doesn't seem to exist. The options include table bearings, which don't have a motor and are substantially too heavy, or a motorized pan-tilt head, generally used for cameras, which tends to be too expensive and would be difficult to modify.
For the moment we plan to keep the prototype as a basic proof-of-concept and work with the 2-axis bracket that Russell has ordered.

We also gave our Intellectual Property presentation today. I think we did quite well, especially with the way we discussed it in terms of our project, which is something the other group didn't focus on. The text of our presentation can be found on our wiki.