Top of Mind with Tambellini Group

Building Digital Trust on Campus

Tambellini Group Season 5, Episode 51

How does your institution define digital trust? In the August installment of Tambellini Group’s Top of Mind podcast series, Dr. Donna Kidwell, CISO and Chief Digital Trust Officer at Arizona State University (ASU), joins the conversation to offer listeners a glimpse into what digital trust looks like on ASU’s campus. Beyond simply defining the concept, Dr. Kidwell explains the power of having intentional conversations with stakeholders, faculty and staff, and, most importantly, students.

Liz Farrell:

Hello, and welcome to the August Tambellini Group Top of Mind podcast. I'm your host, Liz Farrell. Today we're talking about digital trust at colleges and universities. What is digital trust? How do we build trust between institutions, students, and faculty? And how are increased concerns over data collection, data privacy, and cyber breaches shaping the way institutions protect their communities online? Here to answer those questions and more is Donna Kidwell. Donna currently serves as the chief information security and digital trust officer at Arizona State University. Prior to that role, she served as CTO at EdPlus, ASU's online arm, and she has also served as a faculty member in business and law at various universities, both domestic and abroad. So welcome, Donna.

Donna Kidwell:

Thank you. It's so great to be here.

Liz Farrell:

We're excited to have you. <laugh> I'd like to start things off by asking: how does someone come to have the title of chief digital trust officer? I don't know that we've seen a lot of that, so maybe you can give us a bit of background on your career trajectory, how you came to this role, and your interest in it.

Donna Kidwell:

Sure. Well, thank you for asking. You know, I think careers are always such wonderful and winding paths, and they take you places you never expected. As a history major in the late eighties, I certainly would never have thought that my future job title would be chief information security and digital trust officer. That is not a path I would have expected. But, you know, history...

Liz Farrell:

Major, for sure. <laugh>

Donna Kidwell:

Right, right. Which is just, you know, go liberal arts. But I did that history major at the University of Texas at Austin at the beginning of what became a wonderful technology ride in that city. I started a technology company when I was an undergrad, a whole other wonderful adventure, and then went straight into technology, primarily leveraging the skills I had in what at the time was more technical writing: trying to interpret what people really needed into the technology systems that we were building. That turned into this entire adventure of working in technology in the private sector for a long time, and then moving into higher ed. At about the midpoint of my career, I'd had a lot of opportunity to really look at the way technology serves us, and the unintended consequences when the technology doesn't serve us in the way we anticipated, or that, frankly, we didn't even think about, and it becomes an issue on the back end. I had a whole tenure where part of what I was doing was environmental health and safety data with Exxon, and I got to see how powerful data is. Then, as I was pivoting, my career took me to a really large real estate franchise, where I got to see a lot of very powerful data and how, if you mobilize that data, it could really change somebody's career. That kind of led toward what ended up bringing me to ASU, because there's no place else in the world that's as committed to creating digital learning experiences for learners that really do help them chart their own course. In the process, there's more and more data, whether that's LMS data, your SIS, the operational data that keeps the lights on and runs the university itself, or client-side data: incredible marketing technologies that help us reach out to students who would never think, hey, I could get a degree in astronomy. Which, by the way, today is kind of an exciting day for students getting degrees in astronomy, with the Webb telescope. So cool. <laugh> But how amazing: we have some 300 degree programs, and it requires a very special use of some pretty sophisticated marketing technologies to even create awareness that those things exist and to help people recognize that they're an alternative for them. In the process of doing that, you start using very popular technologies that for the most part marketers have used, or whose footprints we see in Twitter and Facebook, and you worry, I think to some extent, about surveillance kinds of activities in these spaces. As a public university, we have this really unique opportunity to say: yes, we have all of this incredible data; everything that you're doing on any device you're on is creating a really incredible digital footprint. How could we take that and become stewards of it, where we turn it around for digital trust, and really do some interesting things that only a public university might be motivated to do? So when the opportunity came to take that more MarTech angle of my career and bring it into information security, the future of cybersecurity, the future of digital trust, absolutely. It's a really interesting place to be.

And most CISOs, in the traditional CISO role, do not have relationships per se with chief marketing officers or with the teams that are building out Salesforce architectures. Those have traditionally been very different worlds. And yet, if you think of the basics of what's happening in cybersecurity, there are a lot of places where that Venn diagram becomes really interesting.

Liz Farrell:

Yeah, I hadn't thought about it that way, but it makes perfect sense. I mean, when we think of all the data breaches, a lot of what's exposed is the marketing data that people are using to track us and our preferences. So it just seems obvious. <laugh>

Donna Kidwell:

Right? Except, yeah, we don't necessarily have the organizational models or institutional relationships. We speak very different languages; those particular domains have completely different vocabularies and use very different tools. So at the heart of it, rethinking issues around data governance, rethinking issues around identity, rethinking the relationship between a person, the device they're using, and the networks they're on: those are all the spaces where, for someone in the digital trust space, we really get to try to tackle some pretty gnarly problems.

Liz Farrell:

Yeah, it sounds like it. So let's start at the basic level. I read something you said: that building trust is more precious than ever in today's data-driven world. How do you define digital trust?

Donna Kidwell:

That is such a great question. To keep it really simple: every interaction that you have in a digital experience, and that's really complicated right now, is an opportunity to either build trust or erode trust, based on how you feel the experience, and the company or the people on the other side of that experience, are actually serving you. So digital trust is really about purposefully, intentionally building that trust in every digital interaction we have. That requires a whole lot of complexity underneath it, but at its core, that's what we're really aspiring to do: make sure that we are building and creating trust. What do we do with that trust? Well, here at ASU, we're really dedicated to using that digital trust toward agency: enabling learners to have more transparency into their own experience, to make quicker, faster decisions about what they're doing, and to have a sense that the university is a safe but also incredibly fostering place for them to reshape what they hope to do with their own personal journeys. You only get to do that if you feel like you've got an underpinning of serious trust. For example, students will give us, sometimes when they're in high school, because a guidance counselor told them it was a good idea, access to pretty much the most precious information about them: their academic histories, their extracurricular activities, their financial data, their parents' financial data, all of that, without even really thinking, is this a good idea to share all of this? You just trust that your guidance counselor told you it was a good idea, and FAFSA is a thing, and you hear a lot about it, and you just trust. We start from such a bedrock of pretty incredible information about somebody that I think that's another reason for us to really think about what that digital trust is. How do we keep it? How do we expand on it? How do we develop it into something that really, really does serve people?

Liz Farrell:

Yeah. And it's interesting when you say they give this information over without questioning it or anything. A big part of building digital trust seems to be education. Is there a component of this when we're talking about all the stakeholders, and not just students, right? Because we have many community stakeholders, whether faculty, staff, or administrators, who have a lot of sensitive data flowing through payroll or donations or anything like that. How do we get them to understand they should care about this? Is that a big part of it, or is that happening organically?

Donna Kidwell:

I mean, it is happening organically, but if we waited for the organic work, it'd be pretty spiky. We are really intentional about that. So it means a lot to me that I'm able to pick up the phone and talk to one of my colleagues at the provost's office and say: hey, I'm seeing this inside of our cybersecurity space, and I've got particular concerns. You also have access to data about student journeys, whether in academic integrity or maybe student services. If I'm interested in a holistic picture of what's happening with learners on the campus, let's join forces. Another good example, on the CISO side of the house: one of the pieces of data I'm the most concerned about protecting is student data; that's FERPA data. So I'm aligning the training and awareness campaigns that my office is doing, because I'm resourced to be able to do that, with our FERPA colleagues and with my HIPAA colleagues. Ultimately, there will be a combined sense of the training that you're getting and the awareness that you have. The touchpoints that each of us has, because we're coming from different parts of the life cycle for a staff member, student, or faculty member, will carry a common message, a common voice, and a common set of learning outcomes, because ultimately it's the FERPA data that I'm protecting in those spaces. So, to come back to your question: it's about having intentional conversations with the other players who have a different kind of vested interest in what they're protecting and what they're stewards of, and then talking very seriously about how we design those experiences together.

Liz Farrell:

Gotcha. Now, when you talked about learner agency, the way you described it makes me think of it as educating all the learners to be empowered by this data. Would you say that's correct?

Donna Kidwell:

I think we as universities have this really unique opportunity to set both the tone and an example for what your experience should look like, so that when learners then sign up for, I don't know, TikTok or a bank account or whatever, they can think it through and say: hey, that's a different experience than the way ASU treated me, or, ASU was able to share that they had this type of data, and this company probably does too. I think we get a chance to set expectations for what a learner's experience should look like, to be role models, and, frankly, to experiment with them, not on them, on what these models should look like and how to build them. That's a pretty exciting thing, because it's absolutely true that, as digital natives, they have more fluency and a faster uptake of what's actually happening out there. My teams can learn faster by actually being in design sessions with them about what these things could look like.

Liz Farrell:

I like the distinction that you make between experimenting with them, not on them. And it seems that ASU is very proactive in establishing digital trust; I want to talk a little bit about the summit and some of the outcomes from that. But given that point, can you give any examples of this experimenting with students? I've heard from some other institutions that when you're running these pilot programs, student feedback on the experience, or students helping design it, is so crucial. I don't know if you have examples of specific tools or applications of data, either in process or completed, that drew on a lot of that student input. Could you tell us about those and what you learned from the students?

Donna Kidwell:

Oh yeah, so much. One of our more interesting examples, which will be launching and moving forward in a big way over the next fiscal year, is called Pocket. Pocket is our work in self-sovereign identity. It's our work in actually giving a digital wallet to students, where they can start understanding what to do with verified credentials and badges, and really thinking about what the reality of that space, after many, many years of trying to understand it, is going to look like for students. And we could not do that without them. So whether it's user acceptance testing, beta testing, or focus groups where they sit down and we work with them as they walk through it, Dr. Timothy Summers and his team have done just an extraordinary job of keeping the voice of the students at the center; it's so much of a focus of his work. It's pretty incredible. So that's one good example. Another good example is simply trying to pull students into the dialogue really intentionally and to be really welcoming. I regularly have conversations with students who have an experience where they realize: oh, this seemed like more data than I should have had to give. Frankly, I love that I've got students who are thinking that way. And then they're willing to walk me through the journey. So we'll do privacy mapping. Privacy mapping will actually take somebody through the steps of a particular process they're going through, and we'll walk through it: what was your experience here? We use a technique that draws from empathy mapping: what did they hear, what did they see, what were they clicking on? So we're using a bit of interaction design around that, but grounded in empathy, to then ask: okay, did you feel like that was appropriate? When you then had a question and had to pick up the phone, did you have a sense that the person you talked to understood the whole process holistically and could guide you through it? Being able to actually sit down with students like that, and frankly having them feel they've got an open invitation to help us see it, matters. I think another element is that my teams really try to be pretty open and vulnerable to conversations that are scary or hard. So we don't shy away from the things that are difficult or challenging, or from problems where we know, institutionally, we're only one part. Conversations around international students who are concerned about the level of scrutiny their data will get when they apply for a student visa, or around residency requirements: these are all really important issues that are at the heart of ASU's mission of access. The ability to have that conversation, and to walk through it with techniques and teams that understand how to start doing privacy by design... it's really real and meaningful and, frankly, fun work.
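For readers who haven't encountered verified credentials, the sketch below shows roughly what a credential held in a student wallet can look like, loosely following the public W3C Verifiable Credentials data model. The issuer, identifiers, badge name, and field values here are illustrative assumptions for the sake of the example, not details of ASU's actual Pocket implementation.

```typescript
// Minimal sketch of a verifiable credential as a student wallet might hold
// it, loosely following the W3C Verifiable Credentials data model. All
// names and values below are illustrative, not ASU's actual schema.

interface VerifiableCredential {
  "@context": string[];          // JSON-LD contexts defining the vocabulary
  type: string[];                // credential types, most general first
  issuer: string;                // DID or URL identifying who issued it
  issuanceDate: string;          // ISO 8601 timestamp
  credentialSubject: {           // claims about the holder
    id: string;                  // the learner's decentralized identifier
    [claim: string]: unknown;
  };
  proof?: unknown;               // issuer's signature, checked cryptographically
}

// A hypothetical course badge issued into a learner's wallet.
const badge: VerifiableCredential = {
  "@context": ["https://www.w3.org/2018/credentials/v1"],
  type: ["VerifiableCredential", "CourseBadge"],
  issuer: "did:example:university",
  issuanceDate: "2022-07-12T00:00:00Z",
  credentialSubject: {
    id: "did:example:learner-123",
    achievement: "Introduction to Astronomy",
  },
};

// The learner, not the institution, decides when to present the credential;
// a verifier checks the issuer's proof rather than calling the registrar,
// which is the core of the self-sovereign identity idea mentioned above.
console.log(JSON.stringify(badge, null, 2));
```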

Liz Farrell:

Yeah. I imagine there are probably similar sensitive issues when you're dealing with undocumented students and their concerns about privacy.

Donna Kidwell:

Yeah. And, you know, we've learned more and more about how freely, in some ways, we've given our data, and what that then means. You can't exactly take it back once you've signed up for an app and been using it for a long time. So I think we're starting to get a better sense of the reality of what that means, and there's more willingness to participate in trying to figure out: okay, how do we get better at this?

Liz Farrell:

That makes a lot of sense. So, switching gears: ASU is obviously very committed to digital trust, to the extent that it has someone like you with the joint title of CISO and chief digital trust officer. When we previously spoke, you mentioned that the biggest driver of this commitment is the university's mission to ensure access. In modern learning environments, digital trust is central not only to ensuring access to the tools and safe online environments students need to succeed, but to ensuring that they have agency by being, as you said, masters of their own space. But there are other ways ASU's mission and charter align with its commitment to digital trust. Can you tell us a bit more about those?

Donna Kidwell:

We've got another component of the charter that says we are deeply committed to research that helps the public good, and this is a space where we need better examples and models of what that looks like so that we can actually start creating the public good. There's action-based research: actually working with PhD students who are interested in issues around privacy, and my ability to partner with them so that they have access to data I might have, as we figure out how to partner appropriately. That's another huge part of what we can do; our graduate students are pretty amazing. And the third part of the charter is that we take actual responsibility for our communities; we're actually engaged with them. So whether it's me sitting down with a vendor, or sitting down with our digital equity work and talking to libraries, we can activate the things we're trying to do as a university across all sorts of partners. It's really an incredible opportunity. Digital trust is pretty new. We're still trying to figure out what it really is, where its boundaries are, and how it fits across efforts in privacy and security. Universities have a very singular opportunity to figure that out, and at ASU, we kind of got it mandated in the charter itself.

Liz Farrell:

Really cool. So I know that coming up, I think in the next couple of weeks, is the third Digital Trust Summit?

Donna Kidwell:

It's in October, October 11th and 12th. This will be the third annual Digital Trust Summit. It's really exciting because we bring together such a diverse set of folks: everybody from financial aid advisors to faculty members and students, and folks who are deeply interested in both privacy and security, which are not communities that always spend a lot of time together. We're really thoughtful about the individuals we bring in as keynote speakers to help us work through those issues. I usually try to find folks like ADA Paris, who came last year, or I think at our first one, and really helped us to ground what kind of ancestors we would like to be and to think deeply about design elements. Then we'll bring in privacy professionals or cybersecurity professionals. Stay tuned for who we're going to bring this year; we're still talking to our keynote speakers to pull them together. It'll be a hybrid event. Our past events have been all online, but this year the campus is rocking and rolling, so we'll actually do events on the Tempe campus as well as online. Like many events at ASU, it's designed to be unstructured in such a way that we can do some working, participatory efforts with the students. For example, some of the activities I'm planning are in partnership with HIDA, our design school. It's really interesting to pull those groups together. It's a really fun event.

Liz Farrell:

It's interesting, hearing you talk about the range of topics and the range of attendees and presenters; it drives home the idea that digital trust cuts across the board. It has so many different touchpoints, and it touches so many different disciplines in terms of who needs to be involved. You were talking about grad students and others, and I'm thinking back to you saying you had this liberal arts background in history, and the design school being another example. It sounds like there are experts brought in who don't really have what we think of as a computer or technical background at all.

Donna Kidwell:

No, not always. You know, we try to draw from legal backgrounds; we try to draw in artists. Another one of our keynotes is Marcus Anderson, a retired NFL player who, in the NFL, became driven to work on global leadership and has thought pretty deeply about how to work with communities across the world to bring in local wisdom. That's not what you'd expect from somebody who was an NFL player and had a whole different trajectory before going into global leadership. We really do try to bring in very diverse thinking to help us tackle these pretty complex challenges.

Liz Farrell:

What were the outcomes of the last summit, the second one? Were there action items? Are there takeaways?

Donna Kidwell:

Yes. The teams themselves, like my teams, will actually leverage the summit as a way to do working sessions around the work that they do. So we may bring in students who have been working on Pocket, or students who have been participating on different kinds of policy panels with us. There's definitely a correlation between the voices we bring to the summit, the kinds of conversations we're having, and the work that's ongoing. This year, we're talking about doing a series of privacy salons on the campus and working with student clubs to run those salons. That gives us an opportunity to hear the students and what they're worried about. For example, it is fantastic to talk to the students who run our esports groups; they are very aware of some of the challenges in this space. Where else are you going to find such a cogent appreciation for the challenges of proctoring than from a group of students who spend a lot of their time understanding how to be effective at Twitch streaming? There's a really interesting level of awareness there that we can certainly learn from. So there's an intentional component to make sure that the kinds of conversations we have at the summit are part of a thread that runs through the work we're doing all year round.

Liz Farrell:

Yeah, I hadn't even thought of the whole esports aspect of it, but they must be sitting on a treasure trove of privacy information.

Donna Kidwell:

And they swim in these waters every day, so they absolutely are. There's a notion, I don't know if it's a myth, and we should do some research around it, that people have kind of given up on privacy. That's not what we discover at all when we actually talk to students. Some of them are very good at navigating their own privacy, at recognizing the borders of what they want public and what they want private, and at navigating that in the tools they're using.

Liz Farrell:

So what are we seeing? And I know you've been in this role not quite two years yet, correct?

Donna Kidwell:

That's right. That's right.

Liz Farrell:

So it's hard to say what has changed since then, but you joined during the peak of the pandemic, and we know how remote learning, and everything else that had to happen in a virtual environment, brought up a lot of privacy concerns. Is there anything we can say about what has changed, or how the thinking, the questions, and the problems you're looking to solve have evolved in just that time period?

Donna Kidwell:

Well, there's nothing like big, horrible challenges to rally hard conversations around. A couple of good examples. We had teams looking at and trying to understand how we were going to handle the pandemic on the campus, thinking about what became our mobile health app. This was a mobile app that ASU built out to serve all of our students; it's very widely adopted and quite a fun way for us to communicate with students. We instituted a privacy committee that pulled together and started thinking about all the different issues around that mobile app, and how to approach it in a way that supported responsible innovation. We did that at a time when we absolutely had to move fast, because in the summer of 2020, nobody was really sure what was going to be happening on the campus and what kinds of tools we'd need for communicating. So there's nothing like trying to do privacy by design when you're also under duress and facing a lot of ambiguity. It created a space where we could have lots of great conversations with partners across the campus who had very different reasons for joining us in the conversation, and that then cracked open the opportunity to keep those conversations rolling in lots of different ways. I don't think anybody would have chosen that path to get here, and we're a very collaborative community, so I think we would have done it anyway. But it certainly did give us some really poignant and very real opportunities to talk about how we wanted to do it, how we wanted to learn quickly about what we did well and what we didn't do well, and how to sustain those kinds of structures: the kinds of things that help you convene a squad, specifically around privacy, to rally around an issue.

Liz Farrell:

It definitely sounds like, as with most other things in higher education, it was a catalyst.

Donna Kidwell:

Yes. <laugh> In so many ways.

Liz Farrell:

A lot of momentum there.

Donna Kidwell:

<laugh> In so many ways. It also broadened things. You know, ASU already had such a large online footprint; we had the technology capability, and we were pretty well positioned to make the pivot. Our challenge was making that pivot at scale, and making it in spaces that had not necessarily bought into the asynchronous kind of online experience that ASU Online offers. So I'm not saying it was easy, but in a lot of ways we were well positioned to mobilize very quickly. That also created an opportunity for us to think, because we have to think at scale, about how we do this in ways that are really transformative. That's led to a lot of the kinds of conversations we're having now. In the 18 months I've had in the role, we're rethinking things like: how do we handle privacy, security, and accessibility at scale when we're deciding on a technology we want to bring into our space? How do we work with the units and the faculty to understand risk mitigation, so that they can not only assess what they're trying to accomplish, but we can actually help them reduce that risk? How do we simplify some of the technology we use, so that it's easier for us to get a view across the board of what's happening and respond to it, which is paramount in today's cybersecurity space? So, definitely a lot of catalyzing going on.

Liz Farrell:

Definitely. And it sounds like you're operating at a much higher altitude than a lot of other institutions. Before we close things out, that's a perfect segue: you had also mentioned that you were fortunate to have these resources, the sophisticated brainpower of a large research institution, and the people there to use in building this out. Knowing that many other institutions don't have that, what is the most important piece of advice you would give to institutions that are far behind on this but want to build greater digital trust?

Donna Kidwell:

Like many things, I think it starts with relationships. If you are resource constrained, and your team is tiny but has a big, ambitious purview, one of the best things you can do is identify who's adjacent to the work you're doing and also doing important work, embrace them, and then figure out the common set of things you can go solve together. I have an example, somewhat of a tragic one: I do some mentoring for CISOs of school districts, and right now those CISOs have lots of adjacent conversations around active-shooter security issues, trying to understand how they would message their communities and how they think about social media. If you're a CISO, you have access to a lot of digital experiences, and you have an understanding of a lot of incident response. That's unfortunate, but it's a really interesting space in which to gather forces and say: hey, what problems are you solving? This is the set of problems I'm solving. Can we come together and have a common conversation, so that we can start tackling some really serious issues together? On the cheerier side: sit down with your chief marketing officers. They're the ones trying to figure out how the university wants to represent itself in social media, what we look like on TikTok, and what that environment means. Those kinds of conversations, forging those relationships so that you can find the spaces where you can rally together and have greater force, because you're on a team that's trying to move in a direction together: that's the easiest and simplest place to start.

Liz Farrell:

And it comes back to those liberal arts skills, right? That communication and interaction. <laugh> Well, Donna, thank you so much for taking the time today to share your perspectives and insights with us for the Top of Mind podcast.

Donna Kidwell:

Oh, it's been a pleasure to talk to you, and just a really wonderful conversation. Thank you so much for having me.

Liz Farrell:

And that concludes this month's episode of the Tambellini Group Top of Mind podcast. Don't forget to check out our other resources on our blog at thetambellinigroup.com.