Prototyping

The prototyping stage consists of low-fidelity, medium-fidelity and high-fidelity prototypes. The low-fidelity prototypes include sketches and storyboards that allow me to conduct cognitive walkthroughs of the application, which in turn identify usability issues at an early stage. The medium-fidelity prototype is a basic Unity3D application using the Kinect sensor, with Skype's voice chat feature running in the background. This will allow users to give feedback on their experience navigating a 3D environment while engaged in a conversation. The high-fidelity prototype will be a fully functional representation of the final product. When the empirical studies are evaluated after testing, issues will be addressed in the final stages of development.

Sketches

This sketch depicts how the users will be situated in physical space while interacting with the application. The first shows how the viewer will see it, and the next is a possible staging idea for the DAWN exhibit. Ideally I would like to have separate rooms so users get the full experience of the application. If not, the sketch below would be my plan, as the projector screen will hide one user from the other.


This sketch depicts the room requirements of the Kinect. Users need a clear space to move around in and must stand at least 6 feet from the sensor, as recommended by Microsoft. The display device should be 2 to 6 feet tall, while the sensor can sit above or below the display device.

The sketch below is an idea of how I would like the user to be immersed in an environment map projected onto a dome-shaped surface. I hope that this will give the user a feeling of immersion.

This short storyboard shows the simple flow of how the system will take in the physical environment and render it onto the dome-shaped canvas. The user will select the option to capture the background and will then have to vacate the area so the Kinect has a clear view. The sensor will then take a photo and pass it back into the program, and the application will render the image as an environment map onto the dome-shaped mesh.
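
As a rough illustration of this flow, the capture step could look something like the Unity3D sketch below. The KinectBackgroundCapture component and its CaptureColorFrame() method are hypothetical placeholders for whichever Kinect wrapper ends up supplying the color frame; this is only a sketch of the idea, not the final implementation.

    using UnityEngine;

    // Hypothetical wrapper around the Kinect color stream; the real project
    // would fill CaptureColorFrame() in with actual SDK calls.
    public class KinectBackgroundCapture : MonoBehaviour
    {
        public virtual Texture2D CaptureColorFrame()
        {
            // Placeholder: returns a blank texture until wired to the sensor.
            return new Texture2D(640, 480);
        }
    }

    // Applies a captured background photo as the texture of the dome mesh.
    public class DomeEnvironmentMap : MonoBehaviour
    {
        public Renderer domeRenderer;           // the dome-shaped mesh
        public KinectBackgroundCapture kinect;  // capture component (stub above)

        // Called after the user selects "capture background" and vacates the area.
        public void CaptureAndApply()
        {
            Texture2D photo = kinect.CaptureColorFrame();
            domeRenderer.material.mainTexture = photo; // environment map on the dome
        }
    }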

The sketch below depicts the avatars mirroring the physical actions of the user.

Finally, this sketch shows different ideas for controls that are triggered by poses.
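
To make the idea concrete, here is a minimal sketch of how one such pose could be interpreted as a control in Unity3D. The joint data is a stand-in for whatever the skeleton tracker supplies each frame, and the particular pose and threshold are illustrative assumptions.

    using UnityEngine;

    // Simple stand-in for a tracked joint supplied by the skeleton tracker.
    public struct TrackedJoint
    {
        public Vector3 Position;
    }

    // Fires a command when a pose is held: here, right hand raised above the head.
    public class PoseControls : MonoBehaviour
    {
        public TrackedJoint rightHand; // updated each frame by the tracker
        public TrackedJoint head;

        void Update()
        {
            // Illustrative pose: hand at least 0.2 m above the head => "select".
            if (rightHand.Position.y > head.Position.y + 0.2f)
            {
                OnSelect();
            }
        }

        void OnSelect()
        {
            Debug.Log("Select pose recognised");
        }
    }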

Testing the Kinect

These medium-fidelity prototypes are nowhere near the final product, but they are mockups that allow me to carry out user testing. When I first installed the Kinect SDK, I tested all the aspects the sensor is capable of in order to become familiar with its abilities.

This module provides sample code demonstrating Kinect NUI processing, such as capturing the depth stream, color stream and skeletal tracking frames and displaying them on screen. When the sample is executed I was able to see the depth stream: the background in grayscale and different people in different colors, with darker colors meaning a greater distance from the camera. This is important for the navigation requirement of my thesis. The sample was also able to tell what angle audio is coming from. As my application is intended to have a chat facility, it may be important that the environment can tell where the user is speaking from.
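
For reference, the essentials the sample exercises can be condensed as below, using the Kinect for Windows SDK v1 API: enabling the depth stream and listening for the sound source angle. This is my own trimmed-down approximation, not the sample's actual code.

    using System;
    using Microsoft.Kinect; // Kinect for Windows SDK v1.x

    class KinectStreamsDemo
    {
        static void Main()
        {
            KinectSensor sensor = KinectSensor.KinectSensors[0];

            // Depth stream: per-pixel distance, with player indices in the
            // low bits (which is how the sample colors different people).
            sensor.DepthStream.Enable();
            sensor.DepthFrameReady += (s, e) =>
            {
                using (DepthImageFrame frame = e.OpenDepthImageFrame())
                {
                    if (frame == null) return;
                    short[] depthData = new short[frame.PixelDataLength];
                    frame.CopyPixelDataTo(depthData);
                }
            };

            sensor.Start();

            // Audio: report the angle incoming sound arrives from.
            KinectAudioSource audio = sensor.AudioSource;
            audio.SoundSourceAngleChanged += (s, e) =>
                Console.WriteLine("Sound source angle: " + e.Angle);
            audio.Start();

            Console.ReadLine();
            sensor.Stop();
        }
    }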

This next test was to see how much a test subject enjoyed using the Kinect. It also showed me how well the user was able to interact with the Kinect using their body, as opposed to a controller. This simple shape game also had voice commands such as “slow down”, “speed up” and “reset”. I chose this game to see if there was any difficulty in speaking commands while trying to achieve the goals of the game.
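
Voice commands like these rest on a small speech grammar. Below is a hedged sketch of how such a three-command grammar can be set up with the .NET speech recognition API (the Kinect samples use the near-identical Microsoft.Speech namespace and feed the recognizer from the Kinect microphone array; the default audio device stands in here).

    using System;
    using System.Speech.Recognition;

    class VoiceCommands
    {
        static void Main()
        {
            var engine = new SpeechRecognitionEngine();

            // The three commands the shape game listens for.
            var commands = new Choices("slow down", "speed up", "reset");
            engine.LoadGrammar(new Grammar(new GrammarBuilder(commands)));

            engine.SpeechRecognized += (s, e) =>
                Console.WriteLine("Heard: " + e.Result.Text);

            engine.SetInputToDefaultAudioDevice();
            engine.RecognizeAsync(RecognizeMode.Multiple);

            Console.ReadLine();
        }
    }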

As mentioned previously, I tested a simple Kinect skeleton tracker within Unity3D while a user had a conversation with a family member, who was connected via Skype. This scene shows how a skeleton is generated, how the bones are tracked by the Kinect, and how a model is controlled. It also tests the preparation of GameObjects. The video below, crude as it is, also displays an example of voice recognition and of completing a task using gesture-based interaction. As I delve further into my thesis, each stage will be documented and tested in more detail.
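
The essence of that Unity test, driving prepared GameObjects from tracked joints, can be sketched roughly as follows. The KinectSkeletonSource stub is a placeholder for the actual tracking wrapper used in the scene.

    using UnityEngine;

    // Stub joint provider so the sketch compiles; the real scene replaces
    // this with calls into the Kinect tracking wrapper.
    public class KinectSkeletonSource : MonoBehaviour
    {
        public virtual Vector3 GetJointPosition(int jointIndex)
        {
            return Vector3.zero;
        }
    }

    // Drives one prepared GameObject per tracked bone.
    public class SkeletonRig : MonoBehaviour
    {
        public Transform[] boneObjects;       // one GameObject per joint
        public KinectSkeletonSource skeleton; // joint provider (stub above)

        void Update()
        {
            for (int i = 0; i < boneObjects.Length; i++)
            {
                boneObjects[i].position = skeleton.GetJointPosition(i);
            }
        }
    }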


Technological Requirements

Kinect Sensor

The Kinect is a motion-sensing input device by Microsoft, originally designed for the Xbox 360 game console and later for Windows PCs. It enables control of and interaction with the Xbox 360 without the need for a game controller, allowing for natural user interfaces using gestures and spoken commands.

OpenNI

The name stands for Open Natural Interaction. Natural interaction is interpreted as interacting with a device or piece of technology without depending on input devices such as a mouse or keyboard; the ambition is to engage with the technology in the same manner that humans interact naturally, which can be achieved through speech and gestures.
OpenNI is a cross-platform framework that defines APIs for creating applications that benefit from natural interaction. It separates the API for the sensor from the API for the middleware that conducts tasks such as tracking 3D objects in space, which breaks the dependency between sensor and middleware. This enables applications to be written once and ported across devices; OpenNI describes it as “write once, deploy everywhere”.

Unity3D

Unity3D is a game development engine and integrated authoring tool primarily used for creating 3D multi-platform games. Unity consists of an editor for developing content and a game engine for executing final builds. Programming is based on Mono scripting: a script attached to a GameObject is a persistent instance of a class derived from MonoBehaviour. As such, it is one of multiple components defining both the behaviour (what it can do and how it does it) and the state of the GameObject it is attached to. It can be called “persistent” when attached to the GameObject because any public variables are serialized and their values stored persistently (Jashan, Unity Answers, 2009).
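
A minimal example of this scripting model: attach the following component to a GameObject and its public field is serialized by the editor and persists with the object, while Update() supplies the behaviour.

    using UnityEngine;

    public class Spinner : MonoBehaviour
    {
        public float degreesPerSecond = 45f; // serialized; editable in the Inspector

        void Update()
        {
            // Behaviour: rotate the GameObject this component is attached to.
            transform.Rotate(0f, degreesPerSecond * Time.deltaTime, 0f);
        }
    }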

Autodesk’s 3DsMax

3DsMax is a computer graphics application for modeling and rendering 3D animations, models and images. It also has video compositing capabilities used by commercial and architectural visualization studios. 3DsMax also offers features such as shaders, which can be rendered and imported into Unity3D; both applications work quite well with one another.

Skype

This software application is a Voice over Internet Protocol (VoIP) service created by Niklas Zennström and Janus Friis in 2003, before its acquisition by Microsoft in 2011. Skype provides users with the ability to communicate with peers by voice, video and IM over the Internet and traditional telephone networks. Unlike other VoIP services, the application is a hybrid peer-to-peer and client-server system that performs background processing on the device running the software.

Design and implementation

As the proposed project is quite technical, I believe it is crucial to follow the Software Development Lifecycle while maintaining the evaluation techniques proposed in the User Centred Design cycle, to ensure the design of the project meets the needs of the users. The SDLC should produce a high-quality system that meets expectations and reaches completion on time and on budget. It will also enforce rigorous testing regimes that ensure the system works correctly and efficiently. The analysis phase will tie down all the aims and objectives of the project. Following that, the requirements phase will identify all functional and non-functional requirements. The design phase will deliver layouts, process diagrams and other documentation, which will significantly enhance testing. The implementation phase speaks for itself, while testing and maintenance follow it.
The user centred design process will ensure that the approach places the user at the heart of the design and development process. It will also ensure that iterative evaluation and testing are implemented constantly throughout. These techniques should lead to a successful outcome.
