Robots in Retail: The Design Process

When exploring the idea of retail robots, it’s important to take a step back and try to understand platform innovations throughout history to gain a clearer view of what the future may hold. Technological innovation, as it pertains to platform creation, opens up new dimensions of development and user experience design. In the period shortly after a new platform technology is released, the creation process is inherently revolutionary and full of creative problem solving. Such design requires a mental alignment with concepts that often exist only in the mind, never having had the opportunity to manifest in physical production. It was in learning to develop applications for temi that I began to realize what this truly means. temi is the world’s first scalable robotics platform. Building a platform with tremendous potential to scale was of utmost importance to us as a company, and this priority manifests itself both in the robot’s affordable price of $1,999 and in the development resources that we provide for third-party developers. These resources include engaging developer support, a fully open SDK, and the physical components of the temi robot that allow anyone to create applications that use temi’s ability to move and understand its immediate environment.
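To give a sense of what that looks like in practice, here is a minimal Kotlin sketch built on the publicly available temi SDK for Android. It only waits for the robot to be ready, says a line, and drives to a saved location; the location name "entrance" and the spoken text are my own illustrative choices, and the exact SDK surface may vary between versions.

```kotlin
import android.app.Activity
import com.robotemi.sdk.Robot
import com.robotemi.sdk.TtsRequest
import com.robotemi.sdk.listeners.OnRobotReadyListener

// A minimal sketch of "programmable movement" with the temi SDK.
// "entrance" is an assumed location name saved on the robot itself.
class HelloTemiActivity : Activity(), OnRobotReadyListener {

    private val robot: Robot by lazy { Robot.getInstance() }

    override fun onStart() {
        super.onStart()
        robot.addOnRobotReadyListener(this)
    }

    override fun onStop() {
        super.onStop()
        robot.removeOnRobotReadyListener(this)
    }

    // Called once the SDK is connected to the robot's launcher.
    override fun onRobotReady(isReady: Boolean) {
        if (!isReady) return
        robot.speak(TtsRequest.create("Hello, I can move around the store.", false))
        robot.goTo("entrance") // drive autonomously to a previously saved location
    }
}
```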

Just the fact that temi can move in a three-dimensional space opens up so many doors for new types of applications. Thinking in this new way, trying to create something that has not truly existed at a large scale, has proven to be a rigorous exercise, but I think that there is much to gain from taking a peek into the creation process during the advent of new development and UX dimensions.

“A small change at the beginning of the design process defines an entirely different product at the end.”

Jony Ive

My creation process always begins with a question: what needs to be accomplished? Are we trying to solve a problem, or improve upon a similar experience? The first application that I set out to create was a retail solution that turned the robot into an in-store shopping tool. The solution needed not only to improve the experience for the shopper, but also to solve a problem for the store. After establishing what needed to be accomplished, more questions arose:

What are some problems that store owners face?

How do I create a new experience that feels familiar?

Learning about the issues that affect store owners was fairly straightforward, but it was the second question that proved to be the most educational. It was in thinking about how to create a new experience that feels familiar that I came across the discipline of Human Interface Design. One evening I was scrolling through my Twitter feed and came across someone by the name of Dorian Dargan. Dorian works on Human Interface Design for Apple, and the way he wrote about this form of design inspired me to learn more. Human Interface Design is the area of design that focuses on human-object interactions. It involves the experience of tools ranging from cell phones to spoons, games to door handles. It quickly became evident to me that Human Interface Design would play a pivotal role in the creation of the first retail applications on temi.

To create a new, but familiar, experience I had to (1) understand the in-store shopping experience at the most fundamental level and (2) see where there were opportunities for tools to improve that experience. This exercise became really interesting when I introduced the idea of a robot as the tool. It was at this point that a new, and very important, question emerged:

How can temi, a robot with a graphical user interface that can autonomously navigate a three-dimensional space, use its ability to move to improve the experience for the shopper and solve operational problems for the store?

It was a Tuesday afternoon when I first showcased my initial designs for a retail application, and something about them was just not sitting well with me. The user interface was inspired by a clothing brand’s website. It felt all wrong. Why would someone go all the way to a store just to use something that looked, and felt, like a website they could have accessed from home? On my commute out of Manhattan and back to Brooklyn that evening, I suddenly realized that the application needed to feel familiar to the in-store shopping experience rather than the online shopping experience. It was as if a lightbulb went off in my head. We were scheduled to present the progress of the application the next day, and these changes needed to happen. Ten hours later, as the sun began to rise, I closed my computer and got ready for my morning commute, only this time I was crossing the East River with a useful application.

The big shift manifested in the actual user experience of the application. It started off as a relatively stationary, kiosk-type experience that required the shopper to spend a good amount of time looking down at the screen to choose the article of clothing they were interested in finding or purchasing. Over the course of that evening, I constructed a new experience: instead of looking down at the robot, the shopper is invited to put it into following mode and walk through the aisles of the store as the robot moves with them. If the shopper likes a piece of clothing on the wall, they can simply take it off the rack, scan its barcode with the robot’s camera, and then try it on with augmented reality. Not only is this an engaging experience for the shopper, but it also opens up a tremendous number of potential selling points and valuable user-experience additions for the store. Within the screen that allows the shopper to try on the piece of clothing, they can also be invited to share the image with their friends or via social media, which, if done, could earn them an in-store discount.
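To make that flow concrete, here is a rough Kotlin sketch of the following-mode experience. The follow and stop calls come from the temi SDK; the barcode callback and the AR try-on screen are hypothetical stand-ins for a standard Android barcode library (such as ML Kit or ZXing) and the app’s own UI.

```kotlin
import com.robotemi.sdk.Robot
import com.robotemi.sdk.TtsRequest

// Sketch of the "walk with me" flow: the robot follows the shopper and
// reacts when a barcode is read from its camera feed.
class ShoppingCompanion(private val robot: Robot) {

    // Put the robot into follow mode so it trails the shopper through the aisles.
    fun startWalkingWithShopper() {
        robot.beWithMe()
        robot.speak(TtsRequest.create("I'll follow you. Show me a barcode to try something on.", false))
    }

    // Hypothetical callback: invoked by whatever barcode library reads the camera feed.
    fun onBarcodeScanned(sku: String) {
        robot.stopMovement() // pause following while the shopper interacts
        openArTryOn(sku)     // app-specific AR try-on screen (stub)
    }

    private fun openArTryOn(sku: String) {
        // Launch the try-on UI for this SKU; sharing and discount prompts live here.
    }
}
```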

After the shopper adds the item to their bag, they can put the piece of clothing back on the shelf as someone in the back begins to pack their items for checkout. This shift in user experience allows the shopper to continue walking around the store completely hands-free, while simultaneously improving the operational efficiency of the store. Once the shopper is ready to check out, the robot guides them to the counter, or serves as a point of sale where they can pay directly on the robot. If the shopper experiences any difficulty along the way, they can easily use temi to call a Personal Stylist, who can navigate the robot from thousands of miles away and provide further guidance around the store.
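A sketch of that checkout guidance might look like the following, assuming a saved location named "checkout counter" and the SDK’s go-to status listener. The Personal Stylist call would go through the SDK’s telepresence features, which I leave out here.

```kotlin
import com.robotemi.sdk.Robot
import com.robotemi.sdk.TtsRequest
import com.robotemi.sdk.listeners.OnGoToLocationStatusChangedListener

// Sketch: lead the shopper to the counter, then hand the screen over to payment.
class CheckoutGuide(private val robot: Robot) : OnGoToLocationStatusChangedListener {

    fun guideToCheckout() {
        robot.addOnGoToLocationStatusChangedListener(this)
        robot.speak(TtsRequest.create("Follow me to the checkout counter.", false))
        robot.goTo("checkout counter")
    }

    override fun onGoToLocationStatusChanged(
        location: String,
        status: String,
        descriptionId: Int,
        description: String
    ) {
        // "complete" is the status string I would expect the SDK to report on arrival.
        if (location == "checkout counter" && status == "complete") {
            robot.removeOnGoToLocationStatusChangedListener(this)
            // Switch the screen into point-of-sale mode so the shopper can pay on the robot.
        }
    }
}
```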

Since the creation of this application format, retailers have jumped on the opportunity to place temi robots in stores across the United States as personal shopping assistants. The temi Retail application is always evolving, and something tells me that this is just the beginning of what temi can do in retail stores. Guided by Human Interface Design principles, it is crucial that we move forward in the creation of these experiential tools in a way that allows temi to assist as many people as possible.