Aaron Irizarry (a.k.a. “Ron”) is the Director of User Experience at Nasdaq and has been building products for startups and large corporations for over 15 years.

He is also a lifelong Dodgers fan, heavy metal enthusiast, and co-author of Discussing Design: Improving Communication and Collaboration Through Critique.

You can follow his frequent ramblings about food, sports, music, and design on Twitter at @aaroni.

Episode Transcript

J Cornelius: You might think Nasdaq is that little ticker that scrolls stock prices across your screen or around Times Square. Turns out, they do a whole lot more. Today, we'll hear how they use human-centered design and rapid prototyping to make business apps that keep stock markets moving.

Intro: This is Design Driven. The podcast about using design thinking to build great products and lasting companies. Whether you're running a startup, or trying something new inside a Fortune 1,000, the tools, methods, and insights we talk about will help you create things people love. And now, your host, J Cornelius.

J: We are excited to have Aaron Irizarry on the show today. He is the Director of User Experience at Nasdaq. Yes, that Nasdaq you've probably all heard of. He's also co-author of "Discussing Design," a book about improving communication and collaboration through critique. Great book available from O'Reilly. You can go get that now. And he is a frequent speaker on the conference circuit, and master of barbecue. Aaron, welcome to the show.

Aaron Irizarry: Thanks for having me. I'm psyched to be here.

J: Yeah, it's great to have you. We've talked off and on throughout the years about a lot of this stuff, so I'm excited to actually sit down and have an actual, formal conversation, so to speak, and record it so everybody else can hear what's going on.

Aaron: Noted that this is being recorded. Gotcha, I'll make sure to be on my best behavior. Yeah, it's great. I'm excited to talk some of this stuff. I hope whatever conversations, whether they be about barbecue or anything else, prove to be helpful to listeners.

J: Why don't we start by giving people an idea of what you're doing at Nasdaq. People think of it as an exchange, but there's a lot more to it, and it might be surprising that there's someone in your role actually controlling the experience design. What do you do day-to-day?

Aaron: That's funny. That's one of the most common questions I get when I'm ... Whether I'm at an event or a meetup, or just chatting with people saying, "Oh, what do you do for a living?" That kind of thing. I tell them, "I design at Nasdaq," and they're like, "Do you design that little ticker that runs across the bottom of the screen?" "No, no man. That probably wouldn't have been a very long career, because that thing's not very complex."

Nasdaq is probably traditionally known as more of a technology exchange. Over time, they realized that they had the opportunity to generate additional revenue by providing certain services to companies who list on the exchange. They have a business within Nasdaq, which is primarily the main business our design team works in, called "Corporate Solutions." This is a collection of software as a service, like a suite of products that's offered to these companies who list on the exchange.

It allows them to better monitor and facilitate a lot of the things they have to do now as a public company. You get things like press release distribution and reporting, earnings reporting, and webcasting for your earnings calls. Software that allows you to facilitate, monitor, and promote these types of things. The big, meaty piece of software that we work on is an investor relations application. This allows the IRO or the CFO, or some of those teams within companies, to be able to manage the ownership in their company, and who's investing in them, and set up road shows to go meet with them, to continue to build that relationship. It also keeps them informed of what's happening in the market, and their peers, and their sector, these types of things. This is really helpful software for these folks who...

Financial services have, in my opinion, gotten the short end of the stick when it comes to design. We want to go design the next car service that's going to piss somebody off, or the next cute thing over here, or this Internet of Things in my home, where my refrigerator tweets at me to bring eggs home ... All these things that we deem sexy design projects. And yet, financial services is incredibly antiquated. It's so old. The software project we redesigned, which launched in 2016, is something people had been using for over 10 years. Think about software you used ten years ago.

J: Yeah, it was horrible.

Aaron: It was horrible, and even what was cutting-edge software then would be horrible by today's standards. We have these immense challenges, and I think it's really cool to see. It started with my manager Chris Avore, who some listeners may be familiar with ... I know you know him ... When he came in as a consultant and then they asked him to stay on and build our team ... He came in 2011, I think. I joined the team in 2013. We're at like 20 folks now. We've been as high as about 30. We've really grown into this role within the company of trying to facilitate product design, and working alongside product managers and developers, and contributing to strategy, and really striving to understand the business goals of Corporate Solutions within Nasdaq, and how we can use design to facilitate and meet those goals. It's been a really unique challenge. Nothing I could've imagined coming in to work at Nasdaq at all. I honestly took the role, and I was like, "I have no idea what I'm actually going to be doing."

J: It sounds like it's a lot more complicated than most realize. There's obviously very high-value stakeholders, people who have a lot to lose if things go south. You've got to be really on top of making sure that the stuff you're designing not only works well from a usability perspective and an aesthetic perspective, but that functionally, on the back end, everything actually works.

Aaron: There's multiple levels to this. One of the things that's been a really great lesson for us is that good design is not always what's best. I think people would balk at that and be like, "Yo, whoever's on your show is smoking something." But in reality, we found that with some of the software people have been using for the last ten years, whether it was designed well or not, they've become really efficient at using poorly-designed software. When you're going to redesign a platform that's a part of the workflow that someone uses for their job every single day ... Or the other end of that, someone logs into once a quarter, and they're so used to it being a certain way, you really have to think through things.

For us, data density is huge, tables ... It's financial services; there's tons of tabular data that we have to constantly surface. What you end up doing as a designer is you've got nice spacing in there, which lends to readability, there's a nice aesthetic to it, and you'll just hear from the customer, "This looks beautiful, but it's useless to me, because when I'm in a rush trying to put together earnings, I need to see 40 or 50 results at a time." The designer in me is thinking, "No, but your mind can't process all that stuff," and they're like, "Well, my 10 years of experience doing it says otherwise." You realize you're like, "Okay, where do I find the happy middle here? I'm trying to improve the design, but am I really improving the design if I'm completely disrupting the workflow and making their lives harder?" I think that reverses the intent of what design should be.

We've had to really challenge ourselves and find these happy mediums and balance these needs with business goals. There's a tendency to fall into idealism when it comes to the work we do, and I think that working at Nasdaq and seeing the business goals, and seeing the different customers, and the variety of customers... We constantly get feedback from customers wanting something different in the system, but for every one that wants it one way, you've got ten that want it a different way... Who do you cater to? Who do you design for? It's kind of hard to find that happy middle sometimes. I think that's a great challenge for designers, to find that place, right?

J: Yeah, exactly. I think you spoke to something that is a common fallacy, is that design is about the aesthetics and about the way it looks. It's really not. It's much more about the way that it works, and what it enables you to accomplish, right?

Aaron: Yeah, I'm sure I could probably start like six or seven Twitter arguments right now, but in my mind, when I look at these things, it's like, when I'm making something for someone else, why am I doing it the way I want? This is common sense ... I don't know, this is probably my lack of technical design education speaking.

J: It's the whole "you are not the user" thing.

Aaron: Yeah, and it's ... We can go back to all of our human-computer interaction, we can go back to all the stuff we've read, but when it comes down to it, if you're designing basically on what you want to do, in my mind, you're really just doing art. Because art is for me. I can paint something that's absolutely horrid, which could be some splatters on a canvas with a picture of a stop sign, and it was a fun project of expression and enjoyment for me, and I really think it looks cool. Someone will look at that and be like, "Wow, that was a waste of a couple hours." It doesn't matter, it was still useful, because it's for me.

In my mind, when I design for myself, if I'm only designing things that I think are of value, or that I think are going to work, and I'm not listening to others, I'm not designing for others, and I'm not balancing what I think should be done with what their needs actually are… Then I'm leaning toward the art side of design. Not so much the functional. I would even say the product side or the service side of design, right?

J: Yeah, exactly. Can you talk a little bit about how you decide what you're going to put into a particular round of work? How do you decide which problems you're going to solve? How do you go about figuring out the right solution?

Aaron: I think, fortunately, as an in-house team we usually have a starting point, as the business has a set of goals. We're now in a pretty awesome position where we're hearing from business units, or from the Corporate Solutions teams we work with within Nasdaq, and have some level of influence, so we can be involved in some of those early stages about what we want to do, or what the needs are. We can kind of show how we work as a team, and then work alongside them. That's what we really strive for, anyways.

We do a ton of research. When we were looking at our year in review, I think last year we did like 187 sessions combined. That's with a director of research, one dedicated researcher, and then those two leading, encouraging, mentoring, and facilitating the leads on our design team to lead and participate in the research practice. For two people to do that much and do summaries is overwhelming, if not impossible, so we have to involve the whole team. Which is great. I think that's a great way to get everyone involved.

We use a lot of research. We are driven by it. We do a lot of discovery. We do any form of research we can. Whether it's a survey or ... And when we get into the process and start prototyping, we start usability testing. We really try to find ways to use that information to inform what we do.

Fortunately, because of the type of software we design, Nasdaq has in-house the same roles our customers have. If we're designing investor relations software, Nasdaq has an IRO. We can go to them, maybe, ahead of time and pick their brain a little bit. Put some screens in front of them. Chris, actually, has been really great in paving those ways, doing that relationship management, and building relationships with individuals in the organization who have given us that audience with them. That's really helped our design. We just try to use what we're learning to inform the work we do. To get it back in front of customers and get their feedback.

Fortunately for us, we have an interesting dynamic with our product teams in that some of them definitely have a product background. They've built something before, and they know the space. Other times, a lot of our product managers have come up through the ranks of the company and just been put into a product role. While they're maybe not as used to the product design process, they have enormous amounts of domain knowledge that can be incredibly helpful. A lot of it is really partnering with these folks to set goals and share an understanding of what we want to accomplish, and try to move it forward as a team. That's really the driving factor in a lot of what we do.

When we make decisions, early in the process we'll do a collaborative exercise, like Design Studio. I think that's usually the best place for that, earlier in the process, to get that idea generation going. Then we try to have a lot of sessions with critiquing, bouncing ideas off each other, talking through things, putting them back in front of folks. Keeping constant dialogue about what we're trying to do that's rooted in the goals we're trying to accomplish is kind of what really helps us in our decision-making process and where we take things.

J: It sounds like it's a lot of collaboration and iteration really getting the stakeholders ... The end user or somebody who's representative of that end user involved in the process of making things, and making sure that you stay really tight to what their expectations are. Making sure that you're going to meet their goals, and not really caring about your personal ... Or the internal goals around aesthetics or anything else.

Aaron: Yeah. It doesn't always work. A lot of the stuff ... I definitely will say that I say we like to do all these things and, "This is our approach," and, "This is how I view it and want to do it." Sometimes it doesn't work that way. There's been instances where we've done tons of research, got the product in front of folks, and we still missed the mark on certain things. It just happens. That's the nature of design, but that's what's kind of cool about building a team and trying to facilitate a team's efforts and help it be resilient: we can say, "Okay, cool. We found it didn't work. Let's put that effort now into finding out what does," and we work from there.

Aesthetics are super cool. I dig it. I love seeing great design. I love seeing creative ideas and concepts. I love all that stuff, and that's great. There's a place for all of that. And there is within financial services, even though financial services sounds wicked boring. But… How do we apply some of those things? Maybe I don't do some cool animations and slide-outs in the interface, but maybe I apply a level of problem-solving and creative thinking to the solution. Maybe the design surfaces itself as a graph for some numbers, or a table, but if I was creative in my thinking of how to solve a problem, I think I'm still being creative. I think I'm still providing the right aesthetic for the right time. I don't think that function and form have to be polar opposites. I think they can be intertwined, and it's important to remember, "I can still make this look good and have it meet someone's needs." That's not that far-fetched of an idea in my mind.

J: Yeah, of course. And the trick to getting to that spot ...

Aaron: Yeah.

J: Is finding a way to solve the user needs first, and then using creativity in aesthetics and visuals to make it look nice at the same time.

Aaron: Yeah, yeah-

J: It's kind of the "form follows function" rule, but I'm not sure if that applies specifically anymore.

Aaron: I'm not sure any of this applies, anymore. The more I do this and the more I listen to the community, and the more I see things, it's like ... I think there's a lot of great things being said, there's a lot of cool things being done, but to me, it always comes back to the context of where you are, what you're trying to accomplish. If you want to use lean, use lean. If you want to try to say you're Agile ... Whatever is working, do that. If it's not working, do something else and figure out what that thing is. Try these different ideas. Listen to different perspectives from people within the community, and the things we're reading in books. But more importantly, be driven by the need of who you're designing for and allow that to determine the process that you think will facilitate that best.

I think there are going to be times where we think, you're right, "form follows function," or even throw it on its head and "function follows form," who knows? I think the scenario we're in can determine those things for us. If we just look at the valuable resources of what's being said, and taught, and published around us and say, "Okay, how does that fit into the situation that I am in? If it does, cool. If it doesn't, cool. But maybe this one little part does, so I'll borrow that." Awesome.

That's kind of the beauty of the work we do. As much as we like to be pragmatic in our view on Twitter about this or that, I think that we're probably okay if we use a little of this and a little of that. Maybe not even anything over there, or what we've learned before. We do it completely new, you know?

J: It's like we say internally at our shop, "There's one answer to every problem. And that answer is: it depends." There's so many different variables. A lot of it can come down to subjectivity. Earlier you said something about, "Good design is not always what's best." It's because "best" is so subjective. Best for who? Best for what situation? What are you actually trying to accomplish here? So, yeah, I think that's a good point. Can you talk a little bit about the processes that you're using to go through some of those exercises? What type of research tools are you using, and how are you capturing information? Just talk a little bit about that.

Aaron: What's kind of cool is that with our Director of Research, Megan, she kind of sets the tone for like, "Here's the type of scripts we're going to use for usability testing. Here's the process we'd like to do. Here's how to facilitate an interview." She's always willing to work with team members, maybe who their primary role is not research, to give them advice, give them feedback, help them refine that skillset.

When we can go onsite, we do. We try to record as many conversations as we can, obviously with the permission of whoever we're working with. I know we've talked to you about this before: we have our tool Mosaic, which is basically a tool that allows us to aggregate all of our research findings and information in a way that's useful and tied to the campaigns and products we're working on.

We do a couple types of campaigns, because getting participants sometimes can be a little ... Recruiting can be a little tough, depending on people's schedules. Some people will say they're interested and you never hear from them again, or the schedules just don't line up. Sometimes you do very short bursts, like one week, almost mini-campaigns where we just talk to whoever we can and then share the summary and say, "Was there enough to make an iterative decision from this?" When there is-

J: So, a campaign ... Are you defining a campaign as a research initiative?

Aaron: Yeah, sorry about that. So in a campaign, maybe we're going to test out an option for a new navigation structure.

J: Sure.

Aaron: We'll say, "Okay, we have some ideas. Maybe let's try some different language on the nav. Maybe different groupings of sub-nav items. Let's throw together a little campaign for that. Try to talk to maybe four or five folks. Get four or five sessions booked and see what we learn from it." And then do that. Depending on how that goes, we'll get some information. Sometimes, things surface that are very definitive and we know to act on. Sometimes, they don't, so maybe we have to run another one and say, "Okay, we didn't get quite the information we wanted from that, so let's try this." Or, "People didn't respond to the name of this navigation item, let's try a different name and run it again." It allows us to try to work as quickly as we can on a large software project within a large company. Sometimes by the nature of the work we do at a large company, you don't get to move as quickly as you like. We try to do that as best we can.

J: Yeah, of course. How many people are you testing with? When you make a change and send that out, how many people see it? How much data do you feel is enough data to make a good-

Aaron: Yeah. I think Megan says ... And if I'm wrong, maybe she'll hear this and correct me, or I'll ask her again and then send you the info ... But, I think we try to do, like if you hear the same thing, or something very similar on your fourth, fifth, maybe even sixth interview, you're probably solid. You've heard that consistent thing a handful of times, then you're probably good there. If you're hearing varying things every single time, it's maybe time to change up the process, or to continue on and grab some new participants until something sticks. Or maybe try a different method. If you're not getting that response, then maybe you don't do usability sessions or discovery sessions, maybe you try something like card sorting, or you tried something else ... A different tool to facilitate finding whatever common theme you're trying to identify, or the response you're ... The assumption you're trying to validate with your customers.

J: That's interesting. If you're not identifying a thread within maybe five to ten people, you try a different tool to try to identify that thread.

Aaron: Yeah, I would say ... In the same sense that if something is working five times in a row and you're hearing the same thing, you're probably onto something, I would say the opposite could very well be true. If you're trying something and you're in your fifth, sixth, seventh session, and there's nothing common ... I don't know how often that really happens ... But there's nothing sticking and every person is saying something different, maybe it's time to reevaluate and see if there's a different way to facilitate that conversation. What I would wonder, then, is if what you're asking for feedback on, or what you're putting in front of them, or the questions you're asking maybe aren't resonating.

I think of this in the context of our work. We're designing for, usually, a specific group of people. If we're designing an investor relations software, we're talking to people who work in that industry, so it's possibly a little bit easier for us to have certain things resonate, because there's an industry there.

J: Well, it sounds like your audience is just really well-defined. If you're talking to IROs within companies of a certain size, there's only so many of those people, and you already know how to contact them, so getting that feedback should be relatively straightforward.

Aaron: On the other end ... And I don't know how Google does this ... But, if I was doing a campaign about how to improve search ... Good Lord. There's a million ... Some bajillion users for search, and you can be searching for local businesses, you can be searching for something funny on the Internet, you can be searching for music. You obviously have to drill in deeper to find that. Really, it's just keep trying to find ways to have conversations. If you do that, you'll get answers.

J: That's where the "it depends" comes back, right? It depends on the problem you're trying to solve. On the research side, do you think it might also be true that if you're using a certain tool and you're getting very consistent answers from everybody, the tool might actually be creating a bias, and you should try a different tool to confirm?

Aaron: That's where having variance helps, whether it's in your testing or in what you're putting in front of people. Sometimes the way the artifact that you're putting in front of someone is set up ... Say you're reviewing this screen, and you're looking at the priority of items on that screen. The different modules on the page ... And you're talking to folks ... Sometimes how things are worded, or the data you put in there, or that type of information can be biasing itself. We've learned that as well.

Sometimes you put fake data in. So you're like ... Maybe you put someone ... You're looking at a screen and there's a profile of someone or some type of metric, and if you're not clear in what that is, because you've just kind of thrown in, "Oh this is placeholder text," the person responding to that doesn't know the context behind that. They're only going to respond to what they see. You could be setting them up to give you information that's not as helpful. It's really important to think through that, and how you're presenting to them. "How real can you make this to what they're doing? Let's not use fake stuff." Just looking through, like ... Reviewing your screens before you get them in front of people to understand, "Are there things on here that are going to confuse people as far as what they mean?" Which is okay if they do, because you could learn something there, where something needs to be clarified or better communicated. At the same time, they need to at least know what's going on, right?

J: Yeah, exactly.

Aaron: Don't go and do anything with Lorem Ipsum on the screen. Or something that's not relevant there. Try to make it as real for them as possible, so that they can respond and say, "Oh, I understand what you're trying to do here, but that's not useful to me." That kind of a scenario.

J: We've seen that, too, where you put placeholder content in something and people just don't take it seriously because they know it's placeholder content. They're too focused on trying to imagine things, rather than actually thinking about how they would use it.

Aaron:  Sometimes you don't have the content, right? I'm sure folks will come across that, where they're designing for someone, and they honestly don't know what goes there.

J: Right.

Aaron: Maybe that's a position where you are thinking about maybe having something a little less defined there and saying, "Hey, if you had these five sections on a screen, as far as the priority goes... Section one, two, three, four, five... Based on what this application would do for you, what would be the type of content you would value on this page, and then what priority?"

J: That's where you drop to a lower fidelity prototype, right?

Aaron: Exactly, that's what I was going to say.

J: Yeah, where you just put a gray line in place of where the text would go, so that they can see that text would go there, but they're not distracted by it being words they don't understand.

Aaron: Exactly.

J: Speaking of prototypes, how are you using prototyping?

Aaron: As a team, really since I've been on, we've constantly prototyped in the browser. I think, one, when there's functionality there, it's so much easier to explain and it reduces the need for extra documentation. I still think that it's important to provide documentation alongside prototypes, but you just don't have to do as much now. I know a lot of people say that the prototype is the documentation ... That, to me, is kind of a "yes, and," or a, "yes, but," there, because I feel like if you're handing your prototype to someone to build out, like a developer, they still need to know, "Okay, what's going on here?" Especially if you're working on something over a number of sprints for an upcoming release, and you give them the same prototype every couple weeks, they're going to need to know what's changed. What features are where, or what things are still a work in progress that they don't have to worry about yet. I think some level of documentation and context is always helpful. It just means fewer assumptions from the other side, which possibly means fewer things end up not being what they're supposed to be. Fewer conversations to clarify things. The more you can do to clarify things, the better, within reason.

Another thing we found extremely valuable, outside of just handing prototypes over to developers or for usability testing, is that our business owners really like them. The stakeholders, the business side of the house that we work with can use them for demos. They can show them to customers, or they can show them to other senior management or executives. They see something functional, and it's easier for them to convey the idea.

As a designer, it's often pretty easy for me to look at a static screen and say, "Yeah, I know it's going to do this, and it's going to do that," because I kind of have a vision there. But if I hand that screen over to someone else, and then they go to talk to someone else, communication gets broken down there. Then I'm expecting that person to explain this vision for this ... Even if they know it really well, and we've talked about it a million times, it's just so much easier for them to see it working. It shows that we can make it, and we can get in there. That's been really good for us.

But there's other times, too, where we deliver screens, or we've used InVision to crank out some prototypes. I love the pairing of InVision and Sketch together. That works really well, and that's been helpful. That allows ... Our team has a lot of generalists, and some of the folks on our team can prototype things out in code, but it takes them a lot longer than others, because maybe that's not their primary skillset. Something like InVision helps them crank out and simulate interactions quicker and more efficiently. It's like, "Why force a skillset if it's going to be more frustrating to accomplish something or reduce efficiency?"

"Use the tool that works," is kind of what we try to do, but in the end, try to get most everything back into a prototype. It allows us to compare. It allows us to demonstrate. When we get things back from development, we can look at the prototypes side-by-side with it and say, "When we click on the prototype, it does this. When we click in the production or the testing environment, it kind of does that." Maybe it needs fixing, or actually, "Cool, it does work." I think prototyping is really, incredibly helpful for us in that sense.

J: When you say a prototype, you're talking about something that's actually in application code that could run on a website, or ...

Aaron: Yeah. It's all front-end stuff. HTML, CSS, JavaScript. We're not doing any crazy back-end stuff.

J: Could you talk a little bit about... As the team has been built out, and as you've introduced some of these research tactics, and some of these processes around prototyping, and getting things in front of people, and testing, and all of that... Can you talk a little bit about how that's changed the way Nasdaq has approached what they've done as a business, and maybe some of the business impact of those processes?

Aaron: Yes. I will put an asterisk before this and say Chris can definitely speak to some of this a little better than I can, because he's a lot closer to the business, but I will say this: We've been working towards building products that customers get value from. The business has definitely come to better understand design and better understand what we do. Not all of the business does. Sometimes they do, sometimes they don't. Sometimes there's turnover, new people come in, and we have to reeducate or show what we do.

It's a constant process. I don't think that at any point sharing how we approach our research or how quickly we turn designs around has instantly earned us a certain standing. But it has given us a lot of recognition, support, and advocacy within the company. We've seen that. I think one of the easiest ways to see that value is ... I think the individual is not on our team anymore, but someone used to send out a deal-of-the-day email: when a customer bought the software-as-a-service product we design, they'd send an email saying this person landed this killer deal. What we would see, interestingly enough, is they'd say, "Oh, yeah. They liked this better than the competitor," and they would note that the design was much better than the competitor's, or this approach was much better than the competitor's, and those were things that our teams had direct influence on. So, I think, in a large company, that's kind of a cool way internally to see that your process, how you're approaching your work, is having an impact.

Sometimes it's also really hard to see, though. With an organization that can tend to be slow-moving, you may not see things for a long time. One of the things I think is really hard when you're trying to measure the business impact of design is that when you're really close to the work, and you're really close to the business, you only see the issues. There can be a tendency to look at things and say, "Oh, man. This customer complained about this, or about that. Okay, let's go talk to them and see what we can do." That's one of the areas where I think we've tried to help: when a customer complains, maybe about migrating to the new design, or they're having some issue with it, our design team is willing to hop on the phone with them and listen. I think that has helped. Accounts have been saved. Customers have said, "We were going to cancel; now we're not."

I think those are the things. One, customer retention is impacted. Two, new sales are impacted, when someone switches from a competitor to us. There's revenue tied to those customers, and they have different contracts and different value. Sometimes it's a deal worth thousands and thousands of dollars, a $20,000 or $30,000 deal. Other times, it's a $1,200 service upgrade or something. It's not always this huge, massive impact. But what I love is that people can see that the efforts of the design team influenced a customer's decision to do something positive with the company.

Obviously, money is always one of the more important things in a business. But keeping customers on, and pulling customers from the competition because we offer a new feature, or because something's more intuitive or easier for them than it was in the competitor's application ... Those are the things we've seen. We've seen the opposite, too. We've seen people frustrated with the work, and we just listen. We jot it down, work on it, try to improve it, and go back to them. We show it to them again and say, "Hey, look. We're listening to you."

I think it's one of those things where sometimes there's a short game, and you see an instant win: "Yeah, cool! That customer switched to us, that's awesome! Because we used this design process and facilitated and executed on that design idea, it had awesome business impact." Sometimes, though, you don't see that for six or seven months. So there's a long game, as well. It's about being patient, trusting the process, and believing in what you're doing.

J: I think, especially in financial services, there could be significant risk in switching platforms, right?

Aaron: Oh, absolutely.

J: So, when you get a customer to move from a competitor's platform because they've experienced the result of your human-centered design processes ... That's really powerful. Even if it is $1,200, that stuff adds up.

Aaron: Yeah, absolutely.

J: As we're wrapping up here, Aaron, any final thoughts?

Aaron: No, I think ... I definitely appreciate you giving me the opportunity to ramble and just speak off the top of my head about some of this stuff; it's been cool. I would just say that the thing that's helped me have whatever level of longevity, or persistence, or success is trying to remain open, and trying not to be so set in my way of thinking or approaching something that I allow it to limit me. I would just encourage listeners in that. Look for new ways. Try something different.

Also, on the other end of that, don't be afraid if something's working and going well but you're not using the coolest, newest thing. That's also totally 100% fine. Really just look at what you're working on in the moment and try to figure out what's best, whether it's something old, something new, or a combination of various things. Just figure it out. I think the more you listen, the more you take notes, and the more you're willing to learn from those experiences, to be teachable, the more you're going to see progress over time that's really valuable.

J: So just listen to people, keep an open mind, and use the tools and processes that are right for you.

Aaron: Yeah, exactly.

J: Yeah, cool. If somebody wants to get in touch with you, chat about things, learn more about what you're doing at Nasdaq, what's the best way to get ahold of you?

Aaron: Probably Twitter. It's easy to just say, "Hey, @AaronI." That's the easiest way, I think. Shoot me something on Twitter. We can exchange information that way, or meet up for coffee, or food, or a drink, and chat. I'm always up for stuff like that. I think that's the best way.

J: Sounds good. I'd like to get you back on this show at some point. Hear a little bit more about what's going on. Maybe get you and Chris on at the same time. I think that'd be a lot of fun.

Aaron: Oh, yeah.

J: Maybe next time we can chat a little bit more about brisket and barbecue.

Aaron: I like all of those things.

J: Thanks a lot for being on the show, Aaron. It was really valuable, and we'll chat again soon.

Aaron: Awesome, I really appreciate it. Thanks a lot, man.

Outro: That's it for today. Thanks for listening to Design Driven. We're glad you enjoyed the show. Have comments, questions, or an idea you'd like us to cover? Point your browser to designdriven.biz and click Contact Us at the top of your screen. We'd love to hear from you. Tell your friends and colleagues about the Design Driven podcast. Post on Facebook, Twitter, or LinkedIn, or send them an email, and tell them to go to designdriven.biz, or wherever they find their podcasts. Until next time, remember what Thomas Watson, founder of IBM, said: "Good design is good business."