Inside Out Quality

Boeing 737 MAX: Human Factors with Dr. Mica Endsley

August 10, 2021 | Aaron & Diane | Season 2, Episode 1

In this episode we explore the tragic failure in human factors engineering on the Boeing 737 MAX and why early incorporation of the user’s needs and limitations is key to developing better, safer products. Dr. Mica Endsley, former Chief Scientist of the Air Force and founder of SA Technologies, joins us to discuss Human Factors Engineering and what we can learn from the 737 MAX tragedies.  

Human Factors & Ergonomics Society: www.hfes.org


Garret Graves:

Here we are in Washington, and everybody in this town, nearly everybody in this town, you sit up here and you're dealing with billions and trillions of dollars and all these crazy acronyms and processes, and often none of it makes sense or fits the common-sense test. And oftentimes you see people just forget about objectives: why are we actually doing this? What is the purpose of this whole process that we go through, the regulations, the procedures, why? At the end, it's always about people. That's what we're here for. We're here for people, for fellow Americans, fellow citizens. And it is amazing to me, just being here, how often that is forgotten.

Aaron Harmon:

Hi, I'm Aaron Harmon.

Diane Cox:

And I'm Diane Cox. Welcome to Inside Out Quality.

Aaron Harmon:

Both Diane and I build and implement quality systems in the biotech and medical device industries. But we often get asked: Is this really necessary? How do we know if we are doing too much too early? Do we even need a quality system?

Diane Cox:

Our goal is to explore questions like these through real-life events and experiences shared by our guests from various regulated industries. We will show you why quality is not just about compliance, and how, when it's done right, it can help your product and company improve lives and make a difference.

Aaron Harmon:

You just heard Congressman Garret Graves speaking at the Transportation Committee hearing on the Boeing 737 MAX 8 crashes. He nailed it. All the regulations, all the bureaucracy, processes, dollars, acronyms: it's all to help protect people. That's why we do it. Now let's go back in time. On March 9, 2017, Boeing announced that the FAA had certified their 737 MAX 8 for safety and reliability; the new plane could now be sold to customers. A year and a half later, on October 29, 2018, a 737 MAX 8 departed from Jakarta and 13 minutes later crashed into the sea off Indonesia. On March 10, 2019, a flight from Ethiopia heading to Kenya crashed in a similar manner. A total of 346 people lost their lives. On my way home from work, I pulled up behind a truck hauling gravel at a stop sign. There were two hard-to-read signs on the back: the first, "Construction vehicle, do not follow"; the second, "Warning: stay back 200 feet, not responsible for broken windshields." The problem with both of these signs is you can't read them 200 feet away; the font size was so small that I had to be nearly at the back of the truck to read them. The designer of these signs failed to take into account a human's ability to read at a distance. They did not apply human factors engineering. Now back to Boeing. When there's a tragedy, it's easy to hear numbers and not think of what they truly mean. 346 means moms, dads, children, grandchildren, friends lost forever. Maybe we gloss over these facts and prefer to use numbers and scales so that we don't have to take in the gravity of the human loss. I find myself doing this also. But what is hardest for me is when I realize that the loss could have been prevented. This is Congressman Peter DeFazio sharing his frustration.

Peter DeFazio:

We don't know why Boeing designed a plane with a safety-critical system tied to a single point of failure. Inexplicable, inexcusable, and as far as I know, unprecedented in the history of passenger aviation production. We do know, and we have seen, that pressures from Wall Street and market forces have a way of influencing the decisions of the best companies in the worst way, endangering the public and jeopardizing the good work of countless hard-working employees on the factory lines. I hope that's not the story that is ultimately going to be written about this long-admired company.

Aaron Harmon:

We can't go backwards in time. We can't undo what happened to these people or all the others lost in tragic events; we can only move forward. Inside Out Quality is a podcast about moving forward and using lessons from the past to make things better. That's what quality assurance is really about: continuous improvement. So what was the underlying cause of the 737 MAX 8 failures? A failure to use human factors engineering; errors in design caused the crashes. I have only known about human factors engineering for a few years, and I now believe the best primer on the subject is Dr. Mica Endsley's congressional testimony on the 737 MAX 8 failures. The mistakes being made aren't unique to aviation. Human factors engineering impacts everything from the products of biotech and medtech to how we run our companies, and it's the reason the FDA requires human factors studies on medical devices prior to approval. We are honored to have Dr. Mica Endsley with us today. Dr. Endsley received her Bachelor of Science degree in Industrial Engineering from Texas Tech University, a Master of Science degree in Industrial Engineering from Purdue, and her PhD in Industrial and Systems Engineering from the University of Southern California. She served as Chief Scientist of the Air Force, founded SA Technologies, and remains its CEO and President. She has authored over 200 papers in the human factors field and testified before Congress in the Boeing investigation. Welcome to the show, Dr. Endsley.

Mica Endsley:

Thank you.

Aaron Harmon:

So to dive right in, the first question is: can you tell us more about human factors and what it is?

Mica Endsley:

Sure. Human factors engineering is based on scientific information about how people perceive, think, move, and act, particularly when they're interacting with technology. It turns out that the way any technology is designed significantly affects the performance of the people who are interacting with it. The user interface of the technology can make performance much more efficient and human error less likely when it's designed to be compatible with basic human capabilities. But it also makes human error more likely if it is not. So human factors engineering puts the emphasis on designing the technology to be consistent with how people actually work. Rather than just blaming people for making errors, or thinking that you can train people to overcome technology design problems, human factors recognizes that people are an important part of system performance. We all know training is important, and while that's true, it can't overcome poor system design. In the long run, people are still likely to make the same types of errors if the system design is not consistent with human capabilities and limitations. The human factors profession can be traced back to the early years of aviation, when it was discovered that a large number of crashes were occurring due to human error, and that human error resulted from cockpits that were inconsistent with the basic capabilities of the pilots who were trying to fly them. By redesigning the controls and displays of the aircraft to be more consistent with pilot characteristics, we've significantly reduced aviation accidents over the past century to the current rate, which is very low. Since its early inception, human factors has been applied in many different domain areas, starting with aviation, but it's also been included in the design of automobiles, manufacturing systems, power plants, consumer products, and now in healthcare.

Diane Cox:

Great, thanks for that explanation. So if we had to make a case for performing human factors engineering, what would you say is important about it? Or why is it important, I should say?

Mica Endsley:

Well, human performance is important for a number of reasons. Basically, it's very important to think about the human when you're developing a system. What we find is that if you neglect to take human factors into consideration, it's going to take longer for people to carry out their tasks, and they're going to have much higher rates of error. So neglecting human factors engineering actually has a direct impact on a company's bottom line, because it affects the performance of the people using the system, and it increases errors, which can result in things like patient injury in the healthcare profession, for example. One good example of this: we found that when human factors was neglected in the design of infusion pumps, the rate of wrong medication dosages being administered went up dramatically. Human factors based solutions make it easy to do things right and hard to do things wrong, and when errors do occur, they make it less likely that they lead to patient harm. We also know that when good human factors is included in the design of a product, users are far more satisfied and experience less frustration. So applying human factors to the design of electronic health records, for example, can decrease problems with poor communication across the clinic staff, and it can decrease problems with clinician burnout. We find that good human factors is good for everyone.

Aaron Harmon:

I can't tell you how many times I've been involved in investigations where something went wrong, and the conclusion people want to jump to really quickly is that there wasn't adequate training.

Mica Endsley:

Yes. And that's kind of a default mode that we always want to fall back on: if we just train people. But it turns out, no matter how hard you train people, they'll still make those same kinds of errors if the design is not consistent with their expectations. So for example, imagine pulling up to a stop sign that's green. You're going to be very confused, right? You're going to be getting conflicting cues, and people, if they're not paying 100% attention, are likely to just blow right past it. We have expectations about what the design of our environment is going to look like, and we need to be consistent with those expectations. That has a significant effect on human performance.

Aaron Harmon:

Yeah, definitely.

Diane Cox:

Makes sense. Aaron and I work in the medical device industry primarily. So one thing I'm curious about from your perspective is: do we need to outsource, or look into having experts perform human factors engineering for the company, or can companies do this on their own?

Mica Endsley:

Well, you definitely need people with expertise in human factors carrying out these tasks in some way. Whether you hire consultants to help you or do it in house is really a matter of the size of your organization. But you do need people who are qualified in doing this. Most human factors professionals have master's degrees or PhDs in human factors engineering; they have a combination of psychology or engineering degrees, but they've specialized in understanding human performance and how to design technology to work with it. If we think, oh, we can just do this ourselves, it's like, would you go out and build a bridge on your own without involving a structural engineer? Probably not, and you wouldn't be too surprised when the bridge fell down. That's a big mistake people make: they think, oh, we can do this ourselves, without really having people on staff who have the training and expertise in that area.

Aaron Harmon:

So do you have any insight into why there is, I feel like, a resistance to incorporating human factors in design? I've seen so many products with poor design features that made them difficult to use, features that could have been prevented. And just as many times, I've seen people try to train things in and keep falling back to: we just have to train, and people have to learn, and if they aren't learning, then it must be something wrong with that person, instead of the concept that this is a human factors issue. Any insight into why?

Mica Endsley:

I think the biggest problem is that a lot of people have never heard of human factors, and they don't know that this whole science base exists. So they keep making the same mistakes over and over again. And because they don't understand why these problems occur, they just end up blaming the user, and that, of course, doesn't really solve the problems. When we go and look at accidents, errors, and these events, we find the same kinds of errors that many people will make over and over again. And that tells us it's not that individual; it's the design of the system.

Aaron Harmon:

When you discuss aviation, was there an issue with ego involved, where pilots were just assuming they were so talented that this would not be a problem?

Mica Endsley:

That can certainly occur. But I find that most pilots actually have gotten at least a little bit of human factors training, and they get frustrated with systems not working correctly too. They want systems that are designed well, because they know their lives are on the line; they don't want to be making those kinds of errors.

Aaron Harmon:

So in the case of the Boeing 737 MAX 8, where there were human factors errors, are there lessons we can take from that and apply to other industries, like the medical device and biotech space? Are there some obvious ones that jump out?

Mica Endsley:

So the accidents involving the 737 MAX 8 were quite tragic. We lost two airplanes, and hundreds of lives were lost. And they were lost due to some fairly basic engineering problems, where they didn't incorporate human factors the way they should have done. There are a number of issues that I can cover about this accident, and then I'll talk about how those apply to the kinds of systems your listeners are developing in the healthcare arena. Basically, the 737 MAX 8 was designed with a new automation system called MCAS, the Maneuvering Characteristics Augmentation System. MCAS basically controlled the pitch of the aircraft, because they added new engines to the plane, and the new engines were much bigger and heavier and changed the weight and balance of the plane. So MCAS was designed to keep the pitch level so that the plane wasn't tilting up, especially with these larger engines. There were a number of different kinds of problems; I'll hit the big ones. The first and foremost problem in the accident was that there was insufficient reliability of that automated MCAS system. Redundancy is a key engineering principle, a key safety principle, because if a sensor gives bad input, or it gets blocked or damaged, or, in this case, gets misaligned due to a maintenance problem, then if you don't have redundancy in those sensors, your automation is getting bad data. And if it's getting bad data, it's going to perform inappropriately. When they designed MCAS, they didn't draw from the two parallel sensors that were available on the plane; they relied on just one sensor at a time. It was called an angle-of-attack sensor, which basically senses the pitch of the aircraft in relationship to its movement. So they only relied on one sensor rather than both sensors, and when that sensor gave bad readings, the automation acted inappropriately. It kept forcing the airplane nose down despite the actions of the pilots, who were trying their hardest to counteract that. Furthermore, it didn't just do this once; it kept doing it over and over repeatedly. So the pilots really weren't able to easily overcome the problem. So our first lesson is that system reliability is really important. It can have a big effect on everything that comes afterwards. Like many accidents, there's a fundamental error in the performance of the technology that underlies people then being frustrated and trying to work with the technology to make it do what they need it to do. The second problem was that there was a significant lack of training on MCAS. Because Boeing wanted to get the plane out there without requiring pilots to go through an expensive retraining process, they basically convinced the FAA that more training wasn't going to be necessary. As a matter of fact, they didn't even put a notification about this new MCAS system into the pilot manual. Their assumption was: oh, we've automated this, the pilots don't need to know about it. And that, of course, is another major error. Automation actually doesn't mean less training; it means more training, because automation is always complex and it's never perfect, and people actually need to understand a lot about how it works so they know how to interact with it appropriately.
In the first accident that occurred on the 737 MAX 8, the pilots weren't even aware that this MCAS system existed. It wasn't in their flight manuals, and they weren't trained on it. So they were completely confused as to why the plane was behaving erratically. So that was the second big problem: you have to pay attention to user training, but you also have to make the system simple enough to understand so that the training requirements aren't onerous. That's a second major lesson out of this accident.

Aaron Harmon:

So human factors is trying to prevent relying on training, but not remove training, just making sure that training is where it needs to be. Those two elements are critical, is that right?

Mica Endsley:

Yes. You want to make something easy to use, but you can't neglect the fact that people will need training. Certainly in this case, whenever you're involving automation, you need training on the automation and how it works. You can't just assume that you're going to be able to operate something without even knowing how that system works at all. So training is important; we just can't rely on training to overcome bad design. And that's the big difference.

Diane Cox:

I see a parallel with the risk management standard for medical devices and what Dr. Endsley is talking about with regards to training. I mean, it actually says that training, labeling, and information for the user are not enough; they shouldn't be your top mitigation for risks that are discovered during the design process. You need to focus on inherent design and manufacturing best practices before you even consider training and labeling as a mitigation. So very good parallels there.

Mica Endsley:

That's absolutely true. And if you design something well, it turns out that training requirements are actually significantly reduced. You don't have to go through extensive training where it takes three hours to learn how to use a system; maybe you can learn how to use it in 15 minutes if you've designed it well.

Diane Cox:

Right,

Aaron Harmon:

Right. So I ran into a training problem in our facility, where we have an environmental chamber that requires air regulation, and when we're not using it, we have to turn the valve off so that the air compressor isn't continuously running. I had verbally told two individuals, make sure that when you're done with it, you turn the valve, and it did not happen. So I came back, thought about it, and figured my approach was not working and I should try something different. I took a picture of the valve, enlarged it, put a red highlighter mark around it, placed it on the front of the machine, and added clear instructions saying, when turning off, please close this valve. That solved it immediately. It was not just the training, but how I applied the training.

Mica Endsley:

Yeah, that's a really good example. Because, you know, making people remember to do things like that is sort of an invitation for error; people don't remember things very well. We often have checklists in environments to try and deal with those kinds of problems. What you did was create a user-designed heuristic, a workaround, because the design of the system didn't take care of that for you. We find a lot of environments where people have tried to create these heuristics to work around problems with systems because the engineers who were developing them didn't think about these kinds of things.

Aaron Harmon:

Could you define heuristics real quick?

Mica Endsley:

A heuristic is like a mental shortcut. Just like you did: you put a big sticky note, essentially, on top of the machine to remind you to do something, or you tie a string around your finger. Those are simple heuristics that people might use as shortcuts to try to remember something, for example.

Aaron Harmon:

I've heard the term a lot and I've had to look it up. But I figured you could do a much better job defining it than I could for the listeners.

Mica Endsley:

So those are some of the basic problems that occurred. This was a classic aviation automation accident, though, and we have about 30 years of experience adding automation to systems in the aviation environment. Automation is usually designed with the idea that it's going to reduce workload or do something more accurately, but in the process, automation introduces built-in problems that actually make the task more complex for the user. It can add to confusion, and it can create a loss of situation awareness, where people don't necessarily know what the automation is doing or what the state of the system is that the automation is working on. This has real implications for the healthcare field, because a lot of the automation that was originally done in aviation is now being done in many other environments. We're starting to see automated vehicles on the roadway, for example, and we're seeing more and more automation added into medical devices in the medical environment. So there are some very real lessons learned about automation that we need to be thinking about in the healthcare field. This accident was actually a classic example of what happens when automation gets added. The pilots who were dealing with this MCAS system were very confused about what the automation was doing. This was, first of all, because they weren't trained on it; they didn't know it was there. But even then, the displays really didn't give them any good information about what the automation was doing. There were no displays to indicate that MCAS was acting on the aircraft trim, and there was no display to help the pilots understand that it was getting erroneous data. Those are basic automation transparency issues that really weren't addressed here. They should have had displays that showed what sensor data was coming in, so the pilots could have cross-checked whether the system was acting accurately. But they didn't have that, and they didn't have any information that showed that this automation was acting on the trim, or how it was acting on the trim. So to them, the aircraft was just acting erratically, and they were completely in the dark as to what was going on. Boeing had basically assumed that if something went wrong with MCAS, the pilots would be able to diagnose the problem and take corrective action within three seconds. That, of course, is very fast, particularly given that, again, they had all these problems with understanding that the system was even there and how it was performing. We know that decision making and performance depend on having an accurate understanding of what's happening in the situation. Because they didn't have any displays to help them form accurate situation awareness, they were really not able to carry out those actions. So they had an aircraft that was repeatedly making uncommanded pitch changes. They did receive a lot of alerts; however, those alerts weren't telling them anything related to the MCAS system. The alerts were actually about something completely different: they told them they had airspeed disagreement problems, and they told them they had altitude disagreement problems. So the system diagnosis messages, the alert messages, were actually sending them down the wrong path.
They were pulling up their checklists and trying to run these procedures to address the different kinds of alerts they were getting, and all that did was add to their workload and create distractions that prevented them from even thinking through what was really happening with the aircraft. So they weren't getting the information that they needed. A key principle for people who have to oversee or interact with automation systems is that you have to give them displays that will let them know how the system is performing, as well as what's really happening, so that they can detect problems and correct them. And that just didn't happen in this case. I think this goes back to the problem that engineers often mistakenly think that when you automate some function, you no longer have to worry about the human operator. But in fact, the opposite is true. Because automation increases the complexity of the system, and because it's generally not 100% reliable, it actually becomes even more critical to provide people with good displays so they can understand what the system is doing and intervene when they need to. And that's a principle that we definitely need to apply to automated systems, whether it's in an aircraft, an automobile, or your operating room.

Aaron Harmon:

Now we'll take a quick break to hear from one of our sponsors.

Tony Johnson:

Today's startups become tomorrow's growth engines. In South Dakota, we're entering a new stage of expansion for our biotech industry, and you'll want to be part of it. Hi, I'm Tony Johnson, Executive Director of South Dakota Biotech, the state affiliate of the international BIO organization, and we're proud to be leading a state that's driving innovation to feed, fuel, and heal the world. South Dakota Biotech is here to inform, to connect, and to advocate for our critical industry. Whether you're directly involved in biotechnology or looking to learn more about it, we want to hear from you. Find us at www.sdbio.org. Now back to the show.

Aaron Harmon:

With automation, one of the things I had read is that there's a risk where you create long periods of boredom and low activity, and then sudden instances of a lot of demand on the pilot, or the person in the process that's being automated. Is that something that was happening here as well? It sounds like it, with three to four seconds to react.

Mica Endsley:

Yes, that's called the irony of automation. Lisanne Bainbridge coined that term about 40 years ago, and it's still true today. It was very much true in these accidents. Automation simplifies the simple and makes the emergency, the off-nominal condition, more complex: the situation where you have to understand and react very quickly to something new that's happening. And that's a real problem with automation.

Aaron Harmon:

I've heard stories about manufacturing fill lines where people were doing repetitive tasks with low cognitive requirements, and they got so used to what they were doing that they were making errors and were completely unaware of it, because they had essentially mentally checked out due to the way the task was designed.

Mica Endsley:

Yes. And unfortunately, that's a problem with a lot of these things: they can oversimplify things to such an extent that people get bored and have vigilance problems. People aren't good at just monitoring something, for example, or just doing repetitive, rote labor; that's actually a poor use of human beings. So when I talked earlier about human factors being about augmenting the things people are good at and not putting people in situations they're not good at, with automation we often do exactly the wrong thing. We make the boring more boring. And that creates a massive problem for human performance.

Aaron Harmon:

I can imagine that with driving automation; that's probably where I would see it the most, since I'm not a pilot. If I've got a car driving me everywhere, eventually, if something key happens, my ability to react and avoid a problem would be pretty greatly diminished.

Mica Endsley:

That is exactly what happens. Essentially, it makes you a passenger in your own car. Sometimes people don't have the displays they need, but even when they do have the displays they need, automation can make them less engaged in the task. It's like if you've ever gone to a party with someone else driving, and at the end of the night you found that you had to drive back: do you have as good an understanding of how you got there as if you'd been the one driving the car? Probably not, because you were just sort of along for the ride; you weren't really actively mentally engaged in it. And that's what automation does to us, and that's a very difficult problem to overcome. So when we automate that vehicle, like you say, when something happens, you're mentally checked out a little bit. Even if you're visually looking where you need to be looking, you often don't see it, or don't connect and understand what's happening. A good example of this is with automated vehicles right now. I drive a Tesla. I looked up and I saw that I was getting very close to an RV that was in the next lane, and I thought, that's really interesting. And we kept getting closer and closer and closer, and finally I realized, oh, I'd better jump in, and I hit the brakes. I realized a little too late that the lanes were merging. Apparently, the Tesla didn't know how to do a lane merge, but I didn't know this because, of course, there's no training on it. It was interesting to me how long I sat there and watched the two vehicles get closer and closer together before realizing, oh, I need to do something here. And even then I really didn't understand what was happening. And that's very classic.

Aaron Harmon:

Do people doubt themselves and start to believe that the automation is right and they're not, and that's why some of these things happen?

Mica Endsley:

That can happen. What we find is that people either over-trust or under-trust. Sometimes they become complacent because they think, oh, the automation can do all this, and they don't realize the types of things it doesn't do; then we say they were over-trusting. In other cases they under-trust: if the automation ever disappoints them, people are very slow to regain trust, and they may not trust it at all. It's actually very hard for people to properly calibrate their trust based on a good understanding of what the automation can and can't do. What they really need to know is, should I trust it right now, to do these kinds of tasks, in these kinds of conditions? And because the designers of automation have often not made it as transparent as it needs to be, it's very hard for people to make those kinds of assessments in real time.

Diane Cox:

Well, it sounds to me like incorporating human factors engineering early on in the development process is important. Now, what's typical? Is it right at the beginning, when you have a design, or a design idea I should say, that you should start thinking about this? Where does this fit in the development process?

Mica Endsley:

Well, the best place to put it is early on. You want human factors to be involved right from the get-go in terms of defining what the system requirements are, how people are going to interact with the automation and the technology, and the kind of information they're going to need and the kind of tasks they need to perform. When you do that, you actually can design in good performance fairly easily and cheaply right from the beginning. Unfortunately, what happens is many companies will leave it to the end and say, well, let's design all the hardware and software first, and then we'll get a human factors engineer to sort of wave their magic wand at the end. And that practically never works. It's just too expensive then to make the changes, and they can only apply a few band-aids; making changes becomes very expensive at the end. So human factors engineering needs to be done at the beginning, right along with system development.

Aaron Harmon:

How do you find someone with the expertise to bring on board or to have as a consultant for human factors?

Mica Endsley:

Well, your best resource is the Human Factors and Ergonomics Society. This is the largest nonprofit association for human factors and ergonomics professionals. It includes researchers, practitioners, and federal agency officials, and it's been around for over 70 years; it's really one of the earliest professional societies in the field. You can find them at hfes.org. They do a lot of things in the healthcare arena now. They have an annual symposium on human factors and ergonomics in health care, which is a great resource for finding the latest research and best practices in the field. They're also in the process of launching a new journal on human factors in health care. In addition, the society has a consulting directory and a good career center where you can find people who have the skills and expertise that you're looking for, for your company.

Aaron Harmon:

Yeah, we'll get that link in the show notes so people can find it easily. Do you mind talking about SA Technologies, the company you founded?

Mica Endsley:

Sure. So I was in academia, and I did a lot of research. I started SA Technologies in 1997, and we've done work in situational awareness for a wide variety of systems: military systems, space, oil and gas drilling, healthcare, just about any place where people are trying to make real-time decisions in complex systems and need to have good awareness of what's happening in their system. A lot of my work has been at the research level, but also in applying that to good system design and doing user testing to make sure that those systems are successfully presenting the information that's needed for people to be able to interact with them. A lot of the work has been in automation, complex systems, things like that.

Aaron Harmon:

What drew you into human factors engineering?

Mica Endsley:

So my background is in industrial engineering; many human factors engineers come out of industrial engineering. At some point in my coursework, I got really interested in this whole idea that how we design systems can actually affect human behavior. That was a lightbulb moment for me. Like many people, I'd really never thought about the design of technology; it was just there, and you used it. The whole idea that how we designed it could so dramatically affect how people interact with it, I thought was fascinating. So I went and got my master's, and then later my PhD, and I was working at an aircraft company in the design of systems. At that time, we were looking at implementing AI in the cockpit, and that was a really interesting challenge. So I have found it to be just a fascinating field.

Aaron Harmon:

From there, how'd you end up in the Air Force?

Mica Endsley:

So most of my career I was in private industry, I was in academia, I was in private consulting. And in 2013, I got a call to come and be the Chief Scientist of the Air Force. That's a really interesting position. The chief scientist is a role where they bring in people from the outside, generally for a two-year appointment, to provide scientific and technical advice to the Secretary of the Air Force and the Chief of Staff of the Air Force. They intentionally try to bring in people from the outside and get people with different kinds of expertise. So I was the first person who'd ever come in from the human factors area, and I was the first woman chief scientist, as it turned out. It was just a wonderful opportunity to see how the government works from the inside, and to get a better understanding of how they develop their technology and how we could better apply and address some of these critical kinds of human factors issues in their programs.

Aaron Harmon:

That had to be a fun phone call.

Mica Endsley:

It was a very fun phone call, yes. And it was a wonderful experience as well; I very much enjoyed it.

Aaron Harmon:

Yeah, that would make my day if the Air Force called me up.

Mica Endsley:

Yeah, I do want to say one other thing about the Boeing accident that I think is really important for your listeners and the kinds of systems they're developing. And that is that the kinds of problems the pilots experienced weren't new problems; they could have easily been foreseen, because we've seen them in many past aviation accidents. And we have a lot of good design principles for improving people's ability to successfully oversee and interact with automated systems, to avoid these kinds of problems, whether they be simple problems or more complex ones like these. It's really important that human factors engineers get involved in the design of the system. They need to be involved early to determine system requirements and design the displays, but they also need to be involved in testing. All the problems that we saw in this system could have been detected early and remedied before the system got fielded, if they had just done detailed user testing. Things like assuming that pilots would be able to detect, understand, and respond in three seconds: that should have been tested to see, is that realistic? Will pilots be able to understand the operation of the system and understand what needs to be done, for example? So you need to involve human factors engineers heavily in testing your system design, to make sure that people can perform in both normal conditions and in the case of system failures or unexpected events. You can't just assume that the design is going to work or that people will be able to perform. That's really critical for any of these technologies. And it's something that's required by the FDA now for medical systems, for example: going in and doing good user testing to verify that the designs you've come up with are actually sufficient for supporting the kinds of human performance that's needed.

Aaron Harmon:

I've gone through the standards on human factors that apply to medical devices, and one thing that they really emphasize is doing those tests in the use environment. You try to simulate, as much as you can, the real-world conditions that users are working in when they're using the instrument, or whatever piece of equipment or technology it might be.

Mica Endsley:

That's absolutely critical. Healthcare systems have to be used under a wide variety of conditions with a lot of different types of patients, and they're used in so many different settings, in conjunction with a lot of other technologies and a lot of different kinds of organizational practices. So doing that sort of in situ testing, where you can really understand the conditions in which the technology is going to be used and the types of users who are going to be interacting with it, is really critical for bringing to light any of these kinds of problems so that you can fix them before the system gets fielded.

Aaron Harmon:

Sure. In any of these spaces, I can just imagine the stress level if something is going wrong, like in the case of the Boeing crashes. But even in the healthcare space, where you have a patient in critical condition, you've got a large team, loud noise, and a lot of urgent things you have to do. If things aren't designed for that space, I could see it being very error prone.

Mica Endsley:

Absolutely. And even before you get into that stage, we typically will do a lot of simulation testing, because you don't want to be putting something out there where it might actually create a problem. When you're doing that testing, you can really exercise the system in a simulation environment where accidents aren't going to hurt anybody.

Aaron Harmon:

Seems like such a daunting task.

Diane Cox:

Yes and no. To me, it's kind of one of those things where, I mean, we're talking about the basics of validation here: putting the product in actual users' hands and having them go through simulated or actual use cases. It's kind of what we've been preaching, right?

Aaron Harmon:

I think of aviation as an example: building a new aircraft and making sure that you've got those elements of design there. I imagine it takes a large team to pull that off.

Mica Endsley:

Oh yeah, for sure. It's interesting; it's really just prioritizing this and realizing that people are a part of your system. They're not something separate that happens later and that you don't worry about; they're very much a component of the technology that you're developing, and you have to consider them when you're designing it and when you're testing it. That makes all the difference in the world. We find that you don't need a huge human factors team, but you do need at least a couple of people who are dedicated to this part of the job, and they should be a fundamental part of your engineering design team.

Aaron Harmon:

This has been really good.

Diane Cox:

Yes, fantastic. Very great information that you've been able to share. Thank you so much.

Mica Endsley:

Well, thank you. Hopefully, we've gotten the message across that designing the technology to fit the human pays enormous dividends. It not only saves lives, but it prevents the companies who are developing these systems from getting caught up in many of these sorts of problems. And we find that systems designed around human users actually sell much better, because people like to buy them and they like to use them. So it's a real win-win.

Aaron Harmon:

Definitely. When they're not designed well, they're not fun to use; I can verify that. All right, well, thank you for being on the show.

Mica Endsley:

Thank you for having me, and I hope you're very successful in applying human factors to your systems.

Aaron Harmon:

Well, thank you again, Dr. Endsley. And thank you, listeners. We hope you stay tuned for the next episode of Inside Out Quality.

Diane Cox:

We hope you enjoyed this episode. This was brought to you thanks to the South Dakota Biotech Association. If you have a story you'd like us to explore and share, let us know by visiting www.sdbio.org.

Aaron Harmon:

Other resources for quality include the University of South Dakota's biomedical engineering department, where you can find courses on quality systems, regulatory affairs, and medical product development. Also, if you live in the Sioux Falls area, check out QUIBIT, a local quality assurance professionals network; you can find out more about QUIBIT by clicking on the link on our website. Finally, Diane and I would like to thank several people, but a few who stand out are Nate Peple for support with audio mixing, Barbara Durrell Christian for support with graphic design and web, and lastly, South Dakota Bio for their support.