Live Visualization for our lo-fi prototype

Our lo-fi prototype, designed on a computer

This illustration shows the interface of our visualization. The five components are: ambient light, indicated by the intensity of the sunlight in the upper right-hand corner; magnetic field, illustrated by the red triangle and blue waves extending toward the person; orientation, expressed in terms of north, south, east, and west; accelerometer readings, shown as how quickly the phone is being moved; and proximity, the distance at which the phone is held from the person’s body.
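
For concreteness, here is a minimal Android sketch of how these five sensors could be read on the phone. The class name SensorCollector is our own illustration, but the sensor types and SensorManager calls are the standard Android API.

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Registers listeners for the five sensors shown in the mock-up.
public class SensorCollector implements SensorEventListener {

    private static final int[] SENSOR_TYPES = {
        Sensor.TYPE_LIGHT,           // ambient light (lux)
        Sensor.TYPE_MAGNETIC_FIELD,  // magnetic field (micro-tesla, per axis)
        Sensor.TYPE_ORIENTATION,     // compass heading (deprecated, but simple)
        Sensor.TYPE_ACCELEROMETER,   // acceleration (m/s^2, per axis)
        Sensor.TYPE_PROXIMITY        // distance from the body (cm, often binary)
    };

    public SensorCollector(SensorManager manager) {
        for (int type : SENSOR_TYPES) {
            Sensor sensor = manager.getDefaultSensor(type);
            if (sensor != null) {
                manager.registerListener(this, sensor, SensorManager.SENSOR_DELAY_NORMAL);
            }
        }
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Forward event.sensor.getType() and event.values to the visualization.
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```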

Hand-Drawn Sketch of our lo-fi prototype

This is a hand-drawn sketch of our lo-fi prototype.

This hand-drawn sketch represents the intermediate phase between brainstorming and the detailed mock-up designed on the computer (shown at top).

Webcam Visualization

The image above shows a possible area for exploration: live capture of users’ activities with a camera, visualized using augmented reality. We may combine two techniques, brightness tracking and color sorting, to build our visualization. By assigning a color to each of the five sensors, we can visually display the data for each one. The benefit of using a camera is that the visualization becomes a more realistic and visually alive representation of the data.
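
As a rough sketch of the brightness-tracking half of this idea, assuming plain Java and a camera frame already decoded into a BufferedImage (the class name, method name, and color choices are all hypothetical):

```java
import java.awt.Color;
import java.awt.image.BufferedImage;

// Finds the brightest point in a camera frame; a colored marker for each
// sensor stream could then be drawn relative to this point.
public class BrightnessTracker {

    // One overlay color per sensor stream (illustrative choices).
    static final Color[] SENSOR_COLORS = {
        Color.YELLOW,  // ambient light
        Color.RED,     // magnetic field
        Color.GREEN,   // orientation
        Color.BLUE,    // accelerometer
        Color.MAGENTA  // proximity
    };

    // Returns the (x, y) coordinates of the brightest pixel in the frame.
    public static int[] brightestPixel(BufferedImage frame) {
        int bestX = 0, bestY = 0;
        double best = -1;
        for (int y = 0; y < frame.getHeight(); y++) {
            for (int x = 0; x < frame.getWidth(); x++) {
                Color c = new Color(frame.getRGB(x, y));
                // Standard perceptual luminance approximation.
                double lum = 0.299 * c.getRed() + 0.587 * c.getGreen() + 0.114 * c.getBlue();
                if (lum > best) {
                    best = lum;
                    bestX = x;
                    bestY = y;
                }
            }
        }
        return new int[] { bestX, bestY };
    }
}
```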


Use Cases

Our system is composed of three parts: (1) a mobile client that runs in the background of the mobile device, collects sensor data, and sends it back to a remote server; (2) a server that acts as middleware, storing the data and keeping it ready to be queried; and (3) a desktop application that fetches live data from the server and visualizes it for the purpose of understanding human behavior.
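
To make part (1) concrete, here is a hedged sketch of how the mobile client could send a reading to the server over HTTP. The endpoint URL and JSON payload shape are assumptions; the HttpURLConnection calls are the real standard-library API.

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class SensorUploader {
    // Hypothetical endpoint; the real server URL and payload format may differ.
    private static final String ENDPOINT = "http://example.com/sensors";

    // Posts one sensor reading to the middleware server as a small JSON object.
    public static void upload(String sensorName, float[] values, long timestamp) throws Exception {
        String json = String.format("{\"sensor\":\"%s\",\"values\":\"%s\",\"t\":%d}",
                sensorName, java.util.Arrays.toString(values), timestamp);
        HttpURLConnection conn = (HttpURLConnection) new URL(ENDPOINT).openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(json.getBytes(StandardCharsets.UTF_8));
        }
        conn.getResponseCode(); // force the request and read the status
        conn.disconnect();
    }
}
```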

Our system can help in situations such as the following design case: Brian is developing a messaging application for mobile devices. He wants his application to be able to adapt to its surrounding environment. Following traditional methods, he would conduct a contextual inquiry, build personas, and construct scenarios to support his design process. With our proposed approach, Brian can instead install the mobile client on users’ phones and collect data passively, without interrupting users’ regular tasks. With our system, he notices that users sometimes send text messages on the go under intense light, which degrades their typing performance. He can then devise features (e.g. brightness self-tuning, voice input) to make his application context-aware and able to adapt to the changing environment.
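
The brightness self-tuning Brian arrives at could look something like the following Android sketch. The lux threshold and the class name are hypothetical assumptions; the light-sensor and window-brightness APIs are real.

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.view.Window;
import android.view.WindowManager;

// Hypothetical adaptation rule: boost the in-app brightness when the
// ambient light sensor reports strong sunlight.
public class BrightnessAdapter {
    private static final float SUNLIGHT_LUX = 10000f; // rough outdoor threshold (assumption)

    public static void onLightReading(SensorEvent event, Window window) {
        if (event.sensor.getType() != Sensor.TYPE_LIGHT) return;
        float lux = event.values[0];
        WindowManager.LayoutParams params = window.getAttributes();
        // Push brightness to maximum in direct sunlight, otherwise defer to the system.
        params.screenBrightness = (lux > SUNLIGHT_LUX)
                ? 1.0f
                : WindowManager.LayoutParams.BRIGHTNESS_OVERRIDE_NONE;
        window.setAttributes(params);
    }
}
```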

Progress

The entire architecture of our system has been set up: the mobile client, server, and desktop client all communicate with each other, so at this stage we can already obtain live, continuously updated sensor data. The next step is to display the data in an aesthetically pleasing and creative visualization. We will post additional ideas about the visualization as our brainstorming and planning progress.
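
For part (3) of the pipeline, one simple way for the desktop client to obtain live data is to poll the server. A minimal sketch, assuming a hypothetical /sensors/latest endpoint that returns the most recent readings as text:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class LiveDataPoller {
    // Hypothetical query endpoint on the middleware server.
    private static final String ENDPOINT = "http://example.com/sensors/latest";

    public static void main(String[] args) throws Exception {
        while (true) {
            HttpURLConnection conn = (HttpURLConnection) new URL(ENDPOINT).openConnection();
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream()))) {
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(line); // hand off to the visualization layer instead
                }
            }
            conn.disconnect();
            Thread.sleep(1000); // poll once per second
        }
    }
}
```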
