Episode 112: Designing for Humans and Machines
Jon Prial: I think we all know that to make a truly great product, you need to design the product so that the user has a great experience and perceives value. This focus on the user has been called human-centered design. And now we have to design products that apply AI to these solutions. With my guest today, these two trends converge, and we'll be discussing centaur design: designing for a world of AI-human hybrid systems. I'm Jon Prial, and welcome to the Georgian Impact Podcast. I'm excited to welcome Lindsay Ellerby, Senior Design Director at Normative. Lindsay, welcome. Why don't you kick things off by telling us more about your journey and how you ended up thinking about centaur design?
Lindsay Ellerby: Yeah, I'm really kind of a child of human-centered design. I started doing interaction design in the early 2000s, when Don Norman coined the term human-centered design and really built that practice, and IDEO pioneered it and created the toolkit that a lot of us have been using ever since. And I really came to it because I saw an opportunity to shape experiences around human problems. When I started off, it was mostly in the realm of information architecture and building websites, and then I realized that I wanted to get into software design: building software products and applying my design skills and problem-solving skills to creating better technology, better software tools for people to use day to day and to help them in their everyday lives. That's what brought me to Normative, and I've been here for 10 years.
Jon Prial: I love the evolution from the traditional design and marketing mindset to software. And now you're part of Normative, which calls itself a software innovation firm. So tell me a little bit more about why a CEO might pick up the phone and call Normative.
Lindsay Ellerby: We really view every business as a software business in some way shape or form. And one of the main challenges in leveraging the power that software can bring to a business is in getting that confidence and clarity behind the tool that you're building. And so what we do is we really help leaders, our clients, our partners, get into working software very quickly through an iterative prototyping process that helps them to answer, should we do this? And can we do this from a technology perspective? So we do lots of design concepts and conceptual work. We don't spend a lot of time refining concepts and building mock- ups and wire frames. We want to get into working software quickly, even if it's the smallest feature set we can, so that we can answer that question, we can help our clients answer that question. Can we do this? And should we move forward with it?
Jon Prial: Now, I saw this phrase, technology has a high IQ, but often lacks EQ. And when I think of EQ and I think it's emotional quotient, I think about people. Can you talk to me about your thoughts of EQ and technology?
Lindsay Ellerby: Yeah, that's a great question. The way that we think of technology and specifically sort of thinking about AI, and let's say nonhuman types of intelligence, is that those tools really augment people. And so when we look at the sort of impact of technology over time, technology also could be like a hammer or a tool that we used a couple hundred years ago has always augmented people, given us new capabilities. Now we're in this phase where technology is augmenting us, giving us new capabilities in really different and new ways. And so I think that software being designed to adhere or to be able to understand emotion a little bit better and respond with emotion a little bit better and have better sort of if you will, digital conversations is really important when we're creating these tools. If you think of emojis and gifs as a really interesting product of our digital age, those are really great examples of how technology can help us with our own EQ and can have EQ and help us have those more emotional and meaningful conversations.
Jon Prial: I do feel like the fight over emojis is funny, as to what's the next one to put out there to communicate some particular feeling. It seems like we're evolving, and I'm not sure a lot of the B2C software that I interact with has done a great job of giving me a good experience. So my sense is that what you're talking about is more than watching people use a system through one-way glass. And I'm not saying companies did a great job with that, but I get the sense, particularly with AI, that you want to look at much more than that. Is that fair?
Lindsay Ellerby: Yeah, absolutely, and I think you brought up a great term there, which is system. In design now we really need to take a systems approach. We need to look at every different aspect and level of the system and the ecosystem in which a human is interacting in the world and using technology to do that. A mobile phone is basically a supercomputer in your back pocket; it hooks you into GPS and social networks and many different apps, interacting with data as you move through the city, for example. So we really need to look at that at the systems level, and also at interacting with technology that doesn't have a screen: voice-enabled technology, where you're speaking to the machine and the machine is responding. And when I say design, I think that's a broad term. I would look at designers as problem solvers. So there are developers working on that, there are designers, there are business people, researchers. All of us really have to understand: how do we make that experience better? How do we think about the spoken communication, the script, if you will, of a voice-enabled technology? The screen-based technology is still important, and we have to consider all of those factors.
Jon Prial: And those factors will make interacting with systems somewhat frictionless, but not necessarily better yet, right? As these systems begin to direct us, they've got to get to more seamlessness as well. How do you see us evolving toward taking that broader, holistic view of things?
Lindsay Ellerby: I have a very specific opinion about seamlessness, and that is that I think to have positive human-machine relationships and interactions, seams are really important. There is a concept in design called seamfulness. And I think what that's about, especially where we're at right now with technology and with machines and with AI, is allowing the human to see what the machine is really good at and let it do its thing, recognize the differences and sort of celebrate the dissimilarities between humans and machines, instead of expecting a machine to be exactly like a human. And of course, instead of expecting a human to be able to operate on the level of a machine. J. C. R. Licklider, who wrote Man-Computer Symbiosis decades ago, defined this concept of machine and human interaction by saying it's the cooperative living together in intimate association, or even close union, of two dissimilar things. And so I think that dissimilarity, and those seams, is something that we should actually design into the system as a signal to humans to say, this is what you can expect from the machine, and also for the machine to be able to help the human and understand what the human constraints are.
Jon Prial: What's the best way to get out in front of this and ensure there really aren't any unintended consequences?
Lindsay Ellerby: A big part of what I think is the future of design is doing implications analysis. When you're designing a tool or an application, or you're using AI in an applied way, do an analysis of: okay, in 50 years or even 100 years, and I know that seems far out, what are some of the imagined implications we can see this technology having? I think of when the automobile was designed. What if we had thought about some of the implications of what mass production and use of the automobile might be? That might have put us on a slightly different trajectory. So I think frameworks for implications and for ethical considerations are something a lot of people are talking about right now, especially in the world of AI, and I think using and developing those frameworks is really important. And then, more day to day, I think that getting into functional prototypes and working software as quickly as we can can help us manage the risk of those unintended consequences, because we can prototype out specific scenarios and start to test and validate: what is going to happen? What is the value we're creating, or what are the more dangerous things this product might produce? We can start to validate that early in the process.
Jon Prial: When we think about human-centered design, obviously for you that's the right starting point. But you also have to make sure you think outside of the box. I have a history in business process management software, and the story was always about paving the cow path: the cows went from point A to point B on some path because there was grass to eat, and all we would do is use IT to pave it, versus redesigning a new process. So when you're thinking about human-centered design, that's one of the lenses. But there are other lenses, such as innovation and implications, which we talked about a little bit in terms of new products and unintended consequences, but also ensuring that we come up with the most efficient process. So as you're looking at this, what becomes your focus?
Lindsay Ellerby: I'm really thinking about the future of design beyond human-centered design. As I said before, and this is absolutely true, I'm a child of human-centered design and have gone through my career using and working within that discipline. I now see that we need to evolve past it. The lens that I have been using to think about what the next 25 years of design look like is centaur design. And bear with me, I know that sounds like a fantastical...
Jon Prial: I'm excited, this sounds good, this is cool.
Lindsay Ellerby: Yeah. A fantastical concept, and it sort of is, but centaur design is really a new practice that involves designing for different types of species: human-machine hybrids that are operating within really complex networked systems. The thinking there is that design's next big paradigm will have to account for a wide variety of what it means to be a person. Centaur design is really thinking of people as hybrids, because ultimately we can't escape it: we are augmented by technology. Not to go so far as to say cyborgs, but centaurs. And I think that concept of the centaur is really interesting because it helps us, like you said before, think outside of the box. A centaur is a mythological creature known for being sort of chaotic and wild. They appear in modern fiction, in Harry Potter, for example. They self-identify as beasts rather than beings, and they have a rich history of divination and being able to read the stars. And I really like this label for the future of design and the lens we should be using, because it means hybrid, it means fluid and chaotic, it means future-focused. And it also gives permission to try new things and to explore what it means to be human. So that's really the lens that I've been using a lot lately.
Jon Prial: And what it means to be human in this world of new technology. So let me go on a journey toward becoming a true centaur, not necessarily a person and a horse, but a person and a computer. So I'm walking down the street today and I'm staring at a screen and I walk into a lamppost, or I'm now walking down the street because I'm smarter and I have earbuds in my ears. I'm close to the device. But if I'm working on a factory floor, it might be that I'm doing a piece of work and there's a robot right next to me waiting for some of my output, or a robot handing me some of its output. So it doesn't necessarily mean that it's a single entity; it could be a person and technology working closely together now, before we get to some point X decades down the road. Is that a way for me to think about this evolution of centaurism, if that's a fair word?
Lindsay Ellerby: Yeah, I think what you described is a really good scenario for where we're at right now in our relationships with machines, certainly on the factory floor, or in the case of our partnership with Inspiren to build a patient monitoring system for hospital units. Inspiren had identified a hidden inefficiency costing hospitals billions of dollars every year: nurses' noncompliance with standard patient safety protocols, such as hourly rounding, which requires caregivers to engage and evaluate patients on an hourly basis and is something that's really hard to track. Especially for busy and understaffed units, this task becomes really difficult. It gets compounded by archaic bedside reporting practices that leave hospitals with no actionable data on vital care metrics. So it can create a sort of domino effect of consequences: dissatisfied patients, medication errors, and longer hospital stays. Inspiren, who is our client, saw a need to modernize the solution. But when we started, it was really unclear how a solution like that could function within the constraints of a busy hospital, not to mention significant costs and integration risks. So what Inspiren created is a unit, called the In-Unit, that is installed in every hospital room. The unit itself, mounted to the wall, is a physical computing device with sensor capabilities and computer vision software, and it monitors the interactions and what's happening in the hospital room in real time. And then we designed an intuitive mobile app that nurses use; they're able to track along and see what is happening in each hospital room, especially for their rounds.
They're able to see, if they've got 12 beds they're meant to be visiting hourly, the app sort of helps them track which beds they need to visit within that hour.
Jon Prial: Yeah. If I think about process management, obviously I love the tracking and the recognition of when a nurse is in a room, because I think patients will be happy to know that we're tracking that the nurses are doing what's expected. At the same time, will it also perhaps notify nurses: say, the patient in bed number seven hasn't been visited in a bit, so you need to get over there? Does it also do a little bit of nudging?
Lindsay Ellerby: It does, yeah. It nudges nurses if there's a patient at a bed that needs to be visited, and it can also flag a bell call. So if there's an issue in one of the rooms, it flags it for them so they can go and check in on that patient. The other great thing about this system is that it really brings all the nurses who are on shift together. It networks them together, so it helps them see what other nurses are doing. You may have one nurse who has a little bit of downtime and can see that a colleague has three beds that haven't been visited, and so that nurse can go and help out, visit those beds, backfill for them, and really support each other. So there's also that kind of network effect that ends up happening within the system.
Jon Prial: When I was looking at the case study, you mentioned that there were four intuitive colors. So obviously you took a lesson from all your research and history in terms of making it easy for a nurse to see what's going on. What do you mean by four intuitive colors? What do they see?
Lindsay Ellerby: Yeah, that's great. One of the biggest challenges was creating an application that really supported nurses, empathized with them, and helped them do their job, and also helped them feel that this system is not there to watch over them; it's there to support and help them. And so one of the really important things in designing the application that they would be interacting with on their shift was to make it approachable. We were talking before about how you imbue technology with a sense of EQ. One of those ways, at a very tactical level, is to use color. So we used four different colors. The application is split into four sections: hourly rounding, bedside reporting, unit intensity, and patient feedback. For those four components of the experience, we used different colors, green, blue, orange, and yellow, and there's sort of a warm palette to it. The other thing that really helps with, and this applies to other work environments too, is that you're not staring at the app the whole time. Especially as a nurse, or if you're working on a factory floor, you're looking around and there's a lot of different stuff happening. So color helps you orient very quickly when you glance down at the application to the information that you need, and that color coding system helps you get that information instantly.
Jon Prial: Interesting. Now I've got to ask the question: how about trust? Is it at the forefront of some of the design work you do, which obviously includes an element of privacy?
Lindsay Ellerby: Yeah, trust is huge. And when we're designing within an ecosystem such as a hospital, it's sort of a multi-sided experience. Of course, we have our primary users, who are nurses. We have other stakeholders, hospital administrators, who are looking at the data that comes out of the application at the aggregate level. And then of course you have patients in that scenario who, while not directly interacting with the application, although that is a potential future state, are impacted by it because it's being used in their presence. So understanding how to design for trust is really, really important, and I think that means making data very accessible and easily explainable. When you're looking at a data point, when you're looking at your hourly rounding, you understand what the data means; it's presented in such a way that the human brain can quickly process it. A lot of times we use icons and imagery to communicate concepts easily and make them accessible for people. So that's just one tactical aspect of it, but that trust component is really important.
Jon Prial: Right. And what I like about this solution is that, when I looked at who was part of the team, you had technologists, you had designers such as yourself, and you had clinicians. I liked that you were calling it a cognitive care system, and I'll throw out a plug for it: it turns out it received an honorable mention last year in Fast Company's Innovation by Design Awards, which is pretty cool. And I think it starts with how you build the team, to make sure you don't lose any of these critical elements of getting the design right. So my hat's off to you on that. What I find really fascinating is how assumptive you already are that the data and the AI are being worked on; you're making all of it much more usable. We're not having an AI and big data discussion or an analytics discussion. That's inherent in everything you're building; it's the foundation you're building on. It's not just "let's get all the data we think we need," it's "let's start thinking about what we can do to deliver more appropriate and better applications given that we've got all this data and these machine learning tools and techniques." Artificial intelligence is just the starting point; it really is foundational.
Lindsay Ellerby: We're now at a point where, unless we can surface the insight in that data, and the value to people around what it helps them achieve in their lives, or how it helps them do their job better or faster, how it's really augmenting them, then we're sort of going around in circles. Most of the conversations I have about AI end up not really being about AI but about people: first of all, what people need, and the state of human-machine interaction right now and where it's going. And then also, to your point, the teams that we need to build these tools, which are not purely engineering teams, and definitely not just design. I would say it's so many other disciplines that we need to fold in to make this cross-functional, in order to be building within these systems.
Jon Prial: That's great. And when you talk about the data, you specifically called out insight. That's right: it's the insights, and how they get applied to people, that really matter. Lindsay Ellerby, this was a great conversation. Thanks for spending the time to be with us on the podcast.
Lindsay Ellerby: Thank you, this was really fun.
Human-centered design has helped to make truly great products, by designing the product so that the user has a great experience and perceives value. Now, to be successful, we have to design products that consider intelligent machines as users and optimize the interactions they have with human users.
In this episode of the Georgian Impact Podcast, Jon Prial and Lindsay Ellerby, Senior Design Director at Normative, discuss the concept of centaur design - designing solutions for a world of AI-human hybrid systems.
You’ll hear about:
- What EQ looks like in machines
- How to think about the user in an age of intelligent machines
- How to create positive human-machine interactions
- What centaur design is and how it will impact the future of design
Who is Lindsay Ellerby?
Lindsay Ellerby is a designer who excels in the messy overlaps between business, technology and human context. Throughout her 10+ year career, she has honed skills in information architecture, interaction design, design research and strategy. Since joining Normative in 2009, she has delivered strategic and tactical leadership on a diverse range of projects - everything from design and software development to creating strategic roadmaps for clients.