A10.png

MFA Thesis: Human Interface

A0.png
 

Human Interface

SOLUTION

This project aims to develop an improved method of interacting with Ambient Intelligence (AmI) environments to enhance human experiences in daily life.

Human Interface creates a more intuitive and natural system of interaction by using Wearable Technology (WT) more effectively in the home environment.

CHALLENGE

Over the past decade, computing technology has evolved enormously. The way we interact with it, however, has not; we still cling to the age-old idea of adapting to the machine in front of you.

There is a need for a more meaningful interaction and experience with computing technology, one that makes humans the center of interaction.

 

How might we create a more intuitive and natural system of interaction by using Wearable Technology more effectively in a smart home environment?

 
01.png

UBIQUITOUS COMPUTING (UC) & AMBIENT INTELLIGENCE (AmI)

Some initial research led me to the concept of Ubiquitous Computing (UC), put forth by Weiser in 1991: a human-centered approach to computing technology where machines fit the human environment instead of forcing humans to enter theirs (Kerasidou & Charalampia, 2017).

This idea has since evolved into the concept of Ambient Intelligence (AmI): environments that are aware of the people present within them (Jose et al., 2011) and anticipate user needs without user intervention. AmI achieves this with everyday devices fused with computational technology and sensing capabilities.

 

Sensing styles

AmI relies on one of two methods of sensing environments and actions to understand human needs. Apart from environment and action, however, humans also rely on emotion when making decisions. This is where technology falls short: it is very adept at understanding data and communicating with other machines, but it does not understand human intent.

02.png

A combination of these two styles of sensing could lead to a more defined system in which each solves the problems of the other: the user remains the center of interaction, but their actions are read and translated by an intermediate wearable device that helps the system understand user intent.

01.2.png

A wearable, however, is more like a piece of clothing than a PC or an appliance, and clothing has been shown to help define identity and supply clues to categorize oneself and others in the culture (Kelly & Gilbert, 2016).

A significant departure from what is considered normal in current society can lead to the rejection of technology. It therefore becomes very important to identify a balance between form and interaction to promote adoption of this concept.

 

USER RESEARCH

Now I needed user input to gauge reactions to the concept I was building. I conducted research in two phases.

Phase I

Phase I of the research focused on gauging users’ feelings toward WT devices and identifying a balance between invisibility and interaction. Participants were asked to interact with three WT devices in a predetermined-scenario observation study conducted in private and public spaces.

04.png

Phase II

For Phase II, participants were asked to take photos of inconveniences faced in and around the house. The goal of these participatory photo interviews was to get a glimpse of participants’ everyday rituals and identify problems that AmI could positively influence.

05.png
 

Insights

Based on the data collected from secondary and primary research, I found that:

06.2.png
 

Wearable Device Parameters

The research also led to identifying the parameters of the device. The wearable acts as a key to the ambient smart environment around the user; as such, it should be able to read human actions and convey the necessary data to the specific smart device.

11.2.png
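To make that "key" behavior concrete, here is a minimal TypeScript sketch of the idea, assuming a hypothetical event model in which the wearable packages an observed action with the user's location and the environment routes it to the devices that can act on it. All names here (HumanAction, SmartDevice, routeAction) are my own illustrations, not part of the thesis.

```typescript
// Illustrative sketch only: the wearable as a "key" to the ambient environment.
// It reads a human action, attaches location context, and the environment
// routes that data to the smart devices that can act on it.

interface HumanAction {
  kind: "enter-room" | "sit-down" | "lie-down" | "leave-home";
  room: string;       // where the wearable places the user
  timestamp: number;
}

interface SmartDevice {
  id: string;
  room: string;
  handles(kind: HumanAction["kind"]): boolean;
  actuate(action: HumanAction): void;
}

// Only the necessary data (action kind + location) reaches each device.
function routeAction(action: HumanAction, devices: SmartDevice[]): void {
  devices
    .filter((d) => d.room === action.room && d.handles(action.kind))
    .forEach((d) => d.actuate(action));
}
```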
 

Form Ideation & Development

From here, I started exploring various WT forms that would fit the concept I was building.

10.png
13.png

Considering that users interact with most objects and environments with their hands, it made the most sense for the wearable to sit in that area. Based on participant feedback from Phase I and later interviews, the following three forms were selected.

I then did a pro-con analysis to compare the forms against each other and against the devices from Phase I.

ITERATION I

Based on user feedback, I further explored and refined the selected forms so they would merge into everyday objects. The devices had to be small enough to go unnoticed by others but still provide clear feedback and control.

A3.png

While iterating and refining the device forms, it became obvious that no single form could satisfy every user’s preferred level of ambiguity. As a result, I decided to work with a range of devices that could target a larger audience.

A26.png
 

Participant Feedback & Further research

At this stage, I decided to get some user insight into the concept I was working on. The feedback mainly focused on integration into current smart device systems.

“But how will this work with my current smart devices?”

-Research Participant


So I conducted further user research to gain an understanding of user interaction with existing smart spaces.

00 Phase 3.jpg
 

Gaps in Concept

This further research led to the identification of a few gaps in the concept.

A7.6.png
 

Many people don’t trust smart devices, and rightly so!

-Danko Nikolic (AI & ML Expert)

 

Moravec’s Paradox

“We can teach machines to solve the hard problems, but it's the easy ones that are difficult”

-Hamer A. 2018

A22.2.png
 

The Idea

A24.png

These thoughts led to a concept that I started calling Human Interface.

The concept of human interfaces moves away from the idea of users interacting with technology and instead focuses on technology that interacts with humans.

It is a method of interaction that uses human attributes like gesture, location, and biological readings to control the AmI environment around the user.

09.1.png

Typical actions to activate interfaces are hidden within larger actions, making the change in state invisible to the user.

The challenge here is to create a higher level of ambiguity without losing trust in the technology. Feedback and manual overrides therefore become very important to keep user trust intact.

08.jpg

This concept is similar to AI in that it predicts user needs without user intervention. However, instead of using machine learning (as most smart devices do today), HI uses natural human actions, read through WT sensing, to predict user needs.

 

Ecosystem Design

20200113_110811.jpg

User Experience Maps

Based on my observations and earlier research, I put together experience maps that focused on the low points of interacting with smart spaces, to understand where this lack of trust came from and how those moments could be improved.

View Experience Maps

A7.png

CURRENT/FUTURE STATE MAP

I also started looking at popular smart devices available today and mapped their current state of use against my proposed future state to analyze the effects on the device ecosystem. A rough data-model sketch of these categories follows the map below.

INPUT MODALITIES - HI Actions | Manual Controls | Decision vs. Suggestion

CENTRAL COMMAND - Ecosystem & HI Set-up | Ecosystem Fail-safes

DEVICE GROUPS/MODES - Active | Inactive || Home | Away | Sleep

A11.png
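As one way of pinning these categories down, here is a rough data-model sketch in TypeScript. The type names and groupings are my own reading of the map, not a spec from the project.

```typescript
// Rough data model for the future-state map's categories; names illustrative.

type InputModality = "hi-action" | "manual-control";  // how input arrives
type ResponseStyle = "decision" | "suggestion";       // act vs. ask first
type WearMode = "home" | "away" | "sleep";            // sleep = wearable charging

interface DeviceGroup {
  name: string;             // e.g. "Living Room"
  active: boolean;          // active vs. inactive group
  memberIds: string[];
}

interface EcosystemConfig {
  wearMode: WearMode;
  groups: DeviceGroup[];
  // Central command: per-device choice between HI deciding or suggesting.
  responseStyle: Record<string, ResponseStyle>;
  // Ecosystem fail-safes: the assistant app remains a manual fallback.
  failSafes: { assistantApp: boolean; manualOverride: boolean };
}
```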

WEARABLE STATE DIAGRAM

This led to the finding that, apart from HI decision-making, suggestions and manual control would be essential to building trust in the smart ecosystem.

The wearable’s digital user flows and feedback systems were represented using state diagrams that match the app’s flows and visual style. The diagrams visually represent the two major functions: invisible-action feedback and manual control.

A8.png
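For readers who prefer code to diagrams, here is a minimal state-machine sketch of those two flows. The state and event names are invented for illustration and do not mirror the diagram exactly.

```typescript
// Minimal state-machine sketch of the wearable's two major flows:
// invisible-action feedback and manual device control.

type WearState =
  | { at: "idle" }
  | { at: "action-feedback"; message: string; undoOffered: boolean }
  | { at: "manual-browse"; deviceIndex: number }
  | { at: "manual-adjust"; deviceId: string };

type WearEvent =
  | { type: "hi-action-performed"; description: string }
  | { type: "feedback-dismissed" }
  | { type: "scroll"; delta: number }
  | { type: "select"; deviceId: string }
  | { type: "back" };

function next(state: WearState, event: WearEvent): WearState {
  switch (event.type) {
    case "hi-action-performed":
      // An invisible action fired: surface it, and offer an undo override.
      return { at: "action-feedback", message: event.description, undoOffered: true };
    case "feedback-dismissed":
      return { at: "idle" };
    case "scroll":
      // Scrolling browses devices; from any other state it opens the browser.
      return state.at === "manual-browse"
        ? { at: "manual-browse", deviceIndex: state.deviceIndex + event.delta }
        : { at: "manual-browse", deviceIndex: 0 };
    case "select":
      return { at: "manual-adjust", deviceId: event.deviceId };
    case "back":
      return { at: "idle" };
  }
}
```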

DEVICE ECOSYSTEM MAP

Next, I tried to understand device connections and how each smart device can influence others, based on the journey maps. I found that the more devices a user has, the more completely the ecosystem functions.

Further, certain devices like the smart bed can also be used to address sensing limitations during sleep mode (when the wearable is charging).

 

Assistant App design

The most important finding, however, was that an assistant app would be needed to perform the more complex tasks of setting up the ecosystem. As a starting point, I identified the major functions that the app would need to perform.

A27.png
 

HIGHER LEVEL BLUEPRINT (ASSISTANT APP)

These functions were then broken down further into features and used for a card sorting analysis to help build the blueprint for the assistant app.

20200123_014735.jpg
A15.png
 

Lo-Fi WIREFRAMES

Next, I started building wireframes based on the higher-level blueprint to begin identifying user flows.

A28.png
 

STYLE GUIDE

I also created a style guide to define a visual language for the app and wearable that resonated with the ideas of energy and interconnectedness I was trying to achieve with this smart device ecosystem concept.

Group 259.png
 

HI-FI WIREFRAMES & QUASI EMPIRICAL USER TEST

I finally ended up with a hi-fi prototype that I used for user testing to identify gaps in the app’s flow and confirm that it matched users’ mental models. Participants were asked to perform predefined tasks. The test brought some user-flow flaws to light, which were addressed in the final prototype.

A30.png
 
 

Hive Mind

Hive Mind is more than just another wearable; it is the key to your home. Hive Wear performs smart device tasks by reading invisible human actions and relative position, making the user the center of interaction. It also provides feedback on invisible actions and manual control of smart devices to help build user trust in the technology and compensate for human tendencies.

You can choose from three devices that fit your needs.

A10.1.png
Band final.png

Hive Band

The Hive Band is a sleek 10 mm wristband with a feedback screen. The controller on the right can be used to quickly find the device you need and set it the way you want.

 
Watch Final.png

Hive Watch

The Hive Watch comes with a larger 44 mm screen. The hexagonal face balances the look of a typical watch with a shape that is highly optimized for screen real estate.

The watch uses a crown that performs the required control actions of scrolling, selecting and going back.

Lens final 2.png

Hive Lens

The top-of-the-line Hive Lens uses retinal projection to make the user’s ordinary glasses feel like a screen. The Hive Lens comes with a control ring that lets you navigate your smart devices with ease.

 

How it Works

The wearable has four major functionalities that it can perform within the ecosystem (a rough code sketch follows the list):

HI Decision Making
These are actions performed automatically by the device based on Human Interface.
Ex. Walk into the room >> Lights turn on

HI Suggestions
These are suggested actions performed based on Human Interface & user input.
Ex. Sit on Couch >> Turn on TV ? >> Yes/No

Shortcut Toggles
These are quick-access manual actions that the user can perform through the wearable.
Ex. Someone’s at the door >> Scroll to door icon >> Toggle Unlock

Manual Device Control
These are room-specific manual controls for each smart device in the system.
Ex. Too much sunlight >> Find Room >> Find Blinds >> Set blinds
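Here is a minimal sketch of how a hub might dispatch the two HI-driven modes, with shortcut toggles and manual control reducing to direct sends from the wearable UI. The Rule shape and function names are my assumptions, not the project's actual logic.

```typescript
// Hypothetical dispatch logic for the four interaction types above.
// HI decisions fire immediately; HI suggestions ask on the wearable first.
// Shortcut toggles and manual control simply call send() from the UI.

type Command = { deviceId: string; action: string };

interface Rule {
  trigger: string;              // e.g. "enter:living-room" or "sit:couch"
  command: Command;
  style: "decision" | "suggestion";
  prompt?: string;              // wearable yes/no text for suggestions
}

async function onHiAction(
  trigger: string,
  rules: Rule[],
  ask: (prompt: string) => Promise<boolean>,  // wearable yes/no prompt
  send: (cmd: Command) => void,               // smart-device dispatch
): Promise<void> {
  for (const rule of rules.filter((r) => r.trigger === trigger)) {
    if (rule.style === "decision") {
      send(rule.command);       // e.g. walk into room >> lights on
    } else if (await ask(rule.prompt ?? `${rule.command.action}?`)) {
      send(rule.command);       // e.g. sit on couch >> "Turn on TV?" >> yes
    }
  }
}
```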

 

Hive Mind Assistant App

The Wear comes with an assisting Hive App that acts as a fail-safe for the Wear and performs more complex actions like setting up new devices and Wear routines.

A16.png

Central Control

The home tab displays all connected devices, segregated by room, which can be controlled through the app via:

-Discrete Control
-Discrete + Variable Controls
-Discrete + Variable + Media Controls

Additionally, there are user-generated device toggle shortcuts at the top of the page for quick access to certain devices.
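To illustrate the three control tiers, here is one way they could be typed, with each tier extending the last. The interface names are mine, not the app's.

```typescript
// Illustrative typing of the three control tiers; each tier adds capability.

interface DiscreteControl {
  power: boolean;                 // on/off, lock/unlock, open/close
}

interface VariableControl extends DiscreteControl {
  level: number;                  // 0-100: brightness, blinds, volume
}

interface MediaControl extends VariableControl {
  transport: "play" | "pause" | "next" | "previous";
}

// A plug needs only DiscreteControl; a dimmer adds level; a TV adds media.
type DeviceControls = DiscreteControl | VariableControl | MediaControl;
```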

A17.png

Connected Wear

The Wear tab shows the users (and their Wear devices) connected to the home. It also shows the mode each wearable is in, based on its state:

-Home
-Away
-Asleep (charging)

It further shows “Routines” that are activated by the Wear device when the user performs specific actions. These actions, and the devices they control, can be set up in this tab.

A18.png

New Additions

The dedicated add button in the top left corner of the home tab allows for quick setup of new devices, Wear devices, and routines. Additionally, devices can be added from the settings menu at the top right.

Routines can also be added or imported within the setup process to save time and effort.

Prototype

A20.png
A21.png

WIZARD OF OZ (WoZ) TESTING

To gauge user acceptance of form and interaction within this system and identify possible flaws and pitfalls that could lead to failure, a WoZ test was conducted using one of the three WT devices (with screen prototypes) and a phone running the app prototype. One researcher walked with participants and enacted WT feedback while a second researcher controlled smart devices in the space via another phone.

 

Takeaways & Further study

Research & design as a process never ends
This project provides an initial set of principles that help build a list of criteria for developing wearables that function as communicators within AmI environments, and it defines parameters to assist the adoption of invisible technologies. However, further research needs to be done on shared-space interactions, human interaction tendencies with their environments, device platforms and compatibility, and connected device ecosystems. While this thesis focuses on moving away from screens and making humans the center of interaction, there is still a need for some screen interactions, which seems counterintuitive. Much more research needs to be conducted before we can completely move away from the concept of “adapting to the machine in front of you.” I hope that the findings in this project and the subsequent research paper are a step in the right direction toward a truly ubiquitous system of interaction that makes technology more accessible.

User research, collaboration & testing are key
This project highlights the importance of user research, mental models, and continuous collaboration with users throughout the design process in order to develop a product that truly holds value for them. This project taught me that design gives us an opportunity to develop user-focused products that act as an extension of the user and help define their identity.

Failing Fast and living in complexity
Failure is not necessarily a bad thing. Sometimes failed methods and unreliable results teach you the most and force you to adapt and pivot in order to solve the real issues. Complexity takes time to unravel and synthesize; don’t rush the process of absorbing and reflecting. It often helps you see things you would otherwise miss.