FACULTY PROFILE
Charlotte Tschider

Tech savvy

Assistant professor Charlotte Tschider uses experiential knowledge in cybersecurity, privacy, and health law to approach problems posed by ever-evolving technology

Growing up in Bismarck, North Dakota, Charlotte Tschider taught herself how to code at age 15, sparking a passion for technology and how people interact with it. Before attending law school, she worked for 10 years in upper management and consultative capacities in information technology, cybersecurity, privacy, and legal compliance for consumer and healthcare organizations including retail giant Target and Medtronic Corporation, a global leader in medical device manufacturing. Today she serves as assistant professor in the School of Law, using her experiential knowledge to approach new problems posed by ever-evolving technology. Here, Tschider talks about career transitions, the intersection of technology and health care, and privacy rights.

You already had a thriving career as a technologist before you went to law school. What made you want to become a lawyer?

I remember working with Target on a third-party contract, and the legal contracting team said they didn’t understand the cybersecurity terms in the contract. They were excellent attorneys, but I realized there was a need to bridge technology knowledge and legal knowledge…a need for attorneys who can speak technology in its various forms, whether that’s cybersecurity, privacy, or healthcare technology.

Do you advise organizations on legal matters related to privacy, cybersecurity, and artificial intelligence (AI)?

Due to the newness of healthcare AI, I am frequently engaged to weigh in on business approaches that do not fit our typical understanding under the law, such as those of artificial intelligence creators and small businesses trying to “do the right thing” with cybersecurity and privacy. I feel really motivated to at least help them think about the right questions, because they will change the face of how we do medicine. Start-ups in this space want to make sure that the technology they’ve developed is safe, that it’s effective, and that it actually makes a difference for people.

Health care is tremendously expensive. It’s also largely unavailable to a huge part of the population. So having technologies that are less invasive and, in some cases, can be used at home, promoting access without requiring people to go through or to a doctor—that makes a huge difference for a lot of people. Other than my scholarship, that’s the work I find most motivating today.


What are some of the biggest challenges in your work?

In the AI space, one of the biggest challenges is, “What does good look like?” It’s very, very hard from a legal perspective to say, “This is the right legal direction,” when the industry itself can’t yet agree on what those standards are. There certainly is the potential for abuse and harm as well. In the medical space in particular, people could be hurt—physically hurt. For example, there is an artificial pancreas that is in its third set of clinical trials now, and it’s designed to work autonomously. If there are issues with the code, if the AI doesn’t work as expected, or if it hasn’t been tested properly, people could be hurt. And you’re not going to have a traditional medical malpractice kind of claim, either, because this is a device that was designed by a manufacturer, and often [manufacturers] are protected.

In my scholarship, I try to explain the technology to the audience so they can make their own determinations about the right go-forward step. And I do believe it’s important to educate people about the technology. Instead of jumping to “This is what the law should be,” we first need to understand how the technology works, to figure out whether the law is actually going to influence things in the right direction.

Which of your research projects are you most proud of?

I’ve done a lot of work at the intersection of medical devices and AI—medical devices that are connected through the internet, especially devices that are pervasively connected to the human body, like hearing aids and insulin pumps. These do not diagnose; rather, they effectively manage a medical condition. Take the artificial pancreas example: Imagine a situation in which the FDA [U.S. Food and Drug Administration] does not have AI experts reviewing the methodology behind the AI algorithm; they’re probably not validating that the algorithm has been adequately tested. They’re also probably not looking at the overall technology structure to ensure its security protections are strong. Because the AI and the instructions associated with the AI drive how the device works, the device could function dramatically differently than it was designed to. Once the device is available to the general public, anybody who has it could be killed or seriously injured. … It’s the AI that we don’t often talk about or think about that could actually cause the most injury.

I’m most proud of this research because I’m noticing that the FDA is taking AI more seriously. I had the opportunity to present with one of the drafters of recent AI guidance. I have been publishing and pushing the FDA to do better, and the agency does seem to be responding. Seeing that kind of movement makes me feel like I’ve actually made a difference for the people who depend on these devices.

Your research argues that overreliance on consent is the biggest threat to people’s privacy rights. What else concerns you?

A lot of privacy scholars remark that [young people] don’t care about privacy. My response usually is that they may not care about privacy generally, but they do care when the question is who can see or access something, rather than whether it is shared at all. For example, individuals may not have an issue with sharing pictures of a recent party with their friends, but they may have an issue sharing those pictures with their parents or their grandmother. That makes a difference. Who the audience is always matters. Remember the rock band Cinderella’s song “Don’t Know What You Got (Till It’s Gone)”? That’s how I feel about privacy. Privacy is one of those things that we sort of take for granted. We feel safe, until suddenly we are not safe anymore.

I’m open with my students about this: I’ve been stalked. The individual who stalked me found my information online. I went through a two-year process of trying to have that information removed, and there were many organizations that would not remove my private information from their websites. I share this openly because I probably didn’t care that much about the fact that my address was listed until, suddenly, I figured out that somebody was stalking me. The minute that you feel like your private life is interrupted, suddenly you care very much about your privacy.

I’ll give you another example. I’ve worked with healthcare providers, and the single biggest problem we have is family members or ex-partners coming into a healthcare facility and looking at a patient’s files without legitimate authorization to do so. Maybe they’re going through a divorce, for example. They may be in the middle of a legal proceeding, or they may just be nosy. It matters a lot to an individual if an ex-partner is looking at their health condition. Does anyone really think someone might misuse healthcare information in a legal proceeding? No. It’s not the kind of thing that’s front of mind. We’re concerned about receiving appropriate treatment, making sure that we don’t get the wrong medication. Those are the things you’re thinking about when you’re seeing a doctor. You’re not thinking about someone taking your information and misusing it. That’s why privacy is such a complicated thing. Privacy laws do matter, because when things go wrong, we care a lot.

Is the intersection of cybersecurity and health law a growing area of law?

It definitely is. There are students who have been hired into positions where they focus specifically on healthcare data protection. Compliance roles in particular—those teams are only growing. They’re even hiring in the middle of the COVID-19 pandemic. It’s an area where few people have the background or specialization to understand how the technology works, so those who do are truly indispensable. Manufacturers, in particular, are going to continue to make products. Healthcare organizations have to continue to function, and they have to do so in a way that’s compliant with the law. They’re not going anywhere. –Kristi Turnbaugh

Learn more about Tschider’s research focusing on FDA activities, healthcare privacy, cybersecurity, and a variety of other health and science law topics on SSRN.




Nationally recognized as a leading center devoted to the study of health care law, Loyola’s Beazley Institute for Health Law and Policy educates health law leaders and policymakers through a curriculum grounded in transactional, regulatory, life science, and public health and policy issues. The program supports the JD certificate in health law, the Master of Jurisprudence in Health Law, and the Master of Laws in Health Law, and it creates networking, symposia, clinician, and research opportunities for students.