Pelion: Digital Product Design for Arm

Imagine the software that powers the internet of things

Arm Pelion wanted to reimagine the software for their industry-leading image recognition technology. Modern Human went undercover with customers, developers and product owners, before designing an intuitive and learnable user interface that reflects the power of Pelion’s machine vision platform.

Pelion is a privacy-sensitive, artificially intelligent machine vision platform. It takes the CCTV cameras that are already present in many environments like city streets, stores and offices, and adds machine vision and artificial intelligence to create smart environments. For example: Pelion can take the existing CCTV cameras in a grocery store, and apply machine vision and artificial intelligence to create a smart store; where people can put items in their bags and be automatically charged for the products they choose; where shopper behaviour can be analysed; where staff are alerted when products on the shelf run low; and where shoplifters are highlighted to security.

But Pelion isn’t just a solution for smart retail environments. It can take the CCTV cameras on city streets to create a smart city, or the cameras available in an office to create a smart office, or the cameras available in a museum to create an interactive environment. Applying machine vision and artificial intelligence to physical environments brings the internet of things to life and opens up all kinds of possibilities. This makes Pelion incredibly powerful, but it also presents quite a design challenge.

A Misaligned Experience

When Modern Human first engaged with them, the Pelion team had, understandably, concentrated on building the underlying technology; their initial focus was making the artificial intelligence and machine vision work. They had done it brilliantly, and the product was considered technologically superior to all of its competitors. The user interface enabled the product’s developers to test the technology in different environments but, unfortunately, it was largely unusable by anyone outside the development team.

To really understand the needs of the different types of potential users we shadowed systems integrators, expert customers (e.g. facilities managers) and end-customers. We observed up close what work was like for them and how they interacted with the Pelion proof of concept, as well as any other systems that sat within their workflow. This ethnographic design research provided an unrivalled understanding of potential Pelion users and uncovered behaviour, needs, values and motivations that participants wouldn’t have thought to tell us about in an interview. It enabled us to witness their context of use, and understand the things that were important to them (and the things that were not).

The research highlighted just how advanced the product was compared to its competitors, but it also confirmed that the current interface made it almost impossible for people to apply this power to their own setting. This gave us our vision for our work: to reveal the power of the product through its user interface. We radically simplified the workflow, starting with a strong product walkthrough on first use. We created a simple initial configuration workflow that got users to establish a setup that they could build on and add complexity to as they mastered the basics and gained confidence with Pelion.

A Lexicon of Recognition

Our research highlighted how unintelligible the language of machine learning and artificial intelligence can be, even to expert systems integrators. There was a lack of conventions about what things were called and how to refer to them. This led to an inconsistency in the language of the prototype user interface. In some places, the same word was used to describe different things. In other places different words were used as labels for the same thing. This inconsistent application of language added to the perceived complexity of the prototype.

As designers, we needed a lexicon for the product. We needed a set of common terms used to describe concepts, features and details of the system that we could use consistently.

We set out to choose those words very carefully. We wanted the lexicon to imbue Pelion with the perfect character, present the right concepts to the user and, of course, be intuitively understandable. The Pelion lexicon made sure we always used the same word for the same thing, and that each word was carefully and deliberately chosen from all of the available synonyms.

If a lexicon sets out the words we use to describe concepts, features and details, an ontology describes the relationships between those things. Creating an ontology ensures things always behave consistently. The Pelion ontology ensures that the user has the correct expectations about the behaviour of the system, and that the system always behaves consistently with those expectations. This leads to an intuitive interface in which users feel empowered to experiment with new features or functionality, without a manual or prior education.

The ontology enabled us to create a design that connected key components in the most intuitive way, and to ensure that people could perform the actions they wanted from where they expected to do it. Establishing these basic principles meant that we could radically simplify the user interface. We could encapsulate all of that power in just three screens…

An Insight is the point at which the data provided by different sensors is translated into information that is useful to the user.

Each Insight has a type associated with it, for example ‘room occupancy’. This tells the system what it should measure and the type of visualisation that should be used. Some visualisations are simple graphs, while others are more complex, such as heat maps.

The UI we designed enables users to associate workflows with Insights. A workflow contains a series of steps based around what the Insight is measuring, and is able to perform actions on it. This also provides a location for third parties to integrate with the system, by providing additional inputs and outputs for workflows. For example, Slack could be integrated to send a message to a channel when the occupancy of a room is too high, or Google Calendars could be integrated as an input to only monitor a space at specific times.
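As an illustration, the relationship between an Insight and its workflow might be sketched like this. Everything here is hypothetical: the class names, fields and occupancy threshold are ours, chosen for the sketch, and do not reflect Pelion’s actual implementation.

```python
# Illustrative sketch only: class and field names are hypothetical,
# not Pelion's actual API.
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class Insight:
    """Translates sensor data into a measurement the user cares about."""
    name: str
    insight_type: str  # e.g. 'room occupancy'
    value: float = 0.0


@dataclass
class Workflow:
    """A series of steps that act on an Insight's measurement."""
    insight: Insight
    steps: List[Callable[[Insight], None]] = field(default_factory=list)

    def run(self) -> None:
        for step in self.steps:
            step(self.insight)


alerts: List[str] = []

def alert_if_over_capacity(insight: Insight, limit: int = 8) -> None:
    # A third-party output (e.g. a Slack message) could be plugged in here.
    if insight.value > limit:
        alerts.append(f"'{insight.name}' occupancy is {insight.value:.0f} (limit {limit})")


meeting_room = Insight(name="Meeting Room 1", insight_type="room occupancy", value=11)
workflow = Workflow(insight=meeting_room, steps=[alert_if_over_capacity])
workflow.run()  # 11 > 8, so the alert step fires
print(alerts[0])
```

A calendar input would slot in the same way: an extra step that short-circuits the workflow outside monitored hours.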

Being able to see the history of an Insight’s measurements was important to understanding the use of a Space or series of Spaces. Each Insight type has a different history view, accessible from the graph icon in the top right of an Insight.

A Space translates a physical area into a digital one. It enables the system to gain a deeper level of understanding of where it operates, thereby providing more meaningful Insights.

Spaces translate to a range of different environments:

  • In an office, a Space could be a floor, a breakout area or a meeting room
  • In a smart city, a Space could be an intersection, a park or a section of a road
  • In retail, a Space could be an aisle, or an area such as the checkouts or a section of the store

Each Space can have one or more types associated with it. In a retail environment this means that a Space could have a type of ‘aisle’ but also ‘orientation’. These types then make it easier to find Spaces on which to build Insights.

Spaces contain the Devices that are physically installed within them. This enables anyone maintaining the system to understand the environment, even if they are not on site. Users can be added to a Space and granted access either to only that Space, or to all the Spaces that are contained within it.
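The containment model described above can be sketched as a small tree: a Space carries its own Devices and any nested Spaces, so access granted at a parent flows down to its children. The names and structure below are hypothetical, not Pelion’s actual data model.

```python
# Illustrative sketch only: names are hypothetical, not Pelion's data model.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Device:
    name: str


@dataclass
class Space:
    """A digital representation of a physical area."""
    name: str
    types: List[str] = field(default_factory=list)      # e.g. ['aisle', 'orientation']
    devices: List[Device] = field(default_factory=list)  # physically installed here
    children: List["Space"] = field(default_factory=list)

    def all_devices(self) -> List[Device]:
        """Devices in this Space and every Space contained within it."""
        found = list(self.devices)
        for child in self.children:
            found.extend(child.all_devices())
        return found


checkouts = Space("Checkouts", types=["checkout"],
                  devices=[Device("CCTV camera 3")])
store = Space("Store floor", types=["floor"],
              devices=[Device("CCTV camera 1")], children=[checkouts])

# A user granted access to 'Store floor' can also see the Spaces it contains.
print([d.name for d in store.all_devices()])
```

The recursive `all_devices` walk is what lets someone maintaining the system see the whole environment from a single parent Space, even when they are not on site.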

The Device is the source of the data that the system relies on to provide meaningful insights for users.

A Device is an object installed within a Space, such as a camera, a thermostat or a light meter. Each Device has one or more streams of data that can be accessed from this screen.

For cameras, the visual snapshot or live stream gives the user reassurance that the camera is connected and operating as it should. The ability to add a location to a Device is another way to connect to the real world and allow the system to understand the context it is operating in.

We made the icon for each Device an illustration of the actual device make and model. This gave the person adding the Device an easy way to confirm that they had added the right one.

A Truly Intuitive User Experience

Establishing a comprehensive lexicon and ontology provided us with a stepping stone to begin imagining a user experience that truly reflects the power of Pelion. We began by mapping out, in detail, a series of user journeys - starting from the process of registering and signing in for the first time, and encompassing everything from adding new users and devices to creating Insights and receiving notifications. We then used wireframes to imagine the functionality of the product, mapping out the system structure and information architecture, and created a series of site maps that illustrated how individual screens connect. From there, we created two high-fidelity prototypes - one for mobile, and one for desktop - which we took through a period of user testing and iteration.

In addition to the product prototypes, we also provided the client with a series of roadmaps to support the implementation of the reimagined platform. These roadmaps detailed the optimum order and timeframe for feature implementation, beginning with the functionality that brings the most value to users. As part of our handover to the client, we provided a comprehensive style guide including colours, fonts, icons and interactions across both the mobile and desktop versions, in order to support the development of the product.

Stepping into the future

Following the initial project handover, we were asked by the client to support them through implementation. During this time we worked closely with the client and their external developers to build the Arm Pelion product. The redesign of the product was so successful that it has since spun out from Arm to form its own company, SeeChange.