How Will AI Change Business School Requirements and Admissions?
Artificial intelligence (AI) innovations such as ChatGPT have raised concerns amongst educators and school staff. How will students use this new tool, and will it support learning or simply promote cheating and plagiarism? Christian Terwiesch, a professor at the Wharton School of the University of Pennsylvania, found that ChatGPT would have earned a B to B- grade on the final exam for operations management, an MBA core course. ChatGPT also scored at or near the passing threshold for the United States Medical Licensing Exam (USMLE).
With widespread access to a tool that promises to do the writing for them, admissions teams and faculty are looking for ways to ensure that the essays and work students submit are original and their own, not the output of an AI bot. But how will AI change business school admissions and assignments?
“We acknowledge that this problem exists, and we aren’t going to stick our heads in the sand and pretend it’s not a thing,” shares Anthony Whitten, director of diversity admissions at the Haas School of Business at the University of California, Berkeley. “What we are looking at and trying to understand is how it might impact what we see from candidates. We are asking: how might admissions applicants use this tool?”
ChatGPT is relatively new, and, according to Whitten, it didn’t significantly impact the admissions process this past winter. His admissions team, though, has been closely monitoring how applicants might use it. “Our team has made mock-ups with AI and shared them so we can see examples and know what we are looking for. I’ve not personally read an essay produced with AI where I’d be thinking, ‘Admit this person straightaway,’” shares Whitten.
If Whitten did suspect a student had used AI to write their admissions essay, it wouldn’t be an automatic denial. “It’s something I would note in my review. If the rest of the profile was strong enough, I’d send them on to the later stage, including the interview. That’s where we can really suss out how much of their essay was the real deal, not just AI,” he explains.
Overall, Whitten believes that business school education is multidisciplinary enough that a student can’t get by relying solely on AI: “There are some places where it probably could be beneficial, but there are still human elements involved in that experience. Suppose the assignment is answering a question where all the information can be found online—then it might work. But that isn’t the end-all and be-all of your business school experience,” he says.
Keep reading to learn how AI may change business school requirements and admissions.
Meet the Expert: Anthony Whitten
Anthony Whitten is the director of diversity admissions at the Haas School of Business at the University of California, Berkeley. Before this role, he held full-time positions on the business school’s admissions team as both an associate director and a senior director.
Before starting his career in higher education admissions, he spent over a decade as a public high school teacher in Virginia. During his teaching tenure, he taught subjects ranging from journalism and US history to economics and personal finance. He holds a bachelor’s degree in history and African American studies from the University of Virginia, a master’s degree in educational leadership and policy from Portland State University, and a master’s in teaching from the University of Virginia.
Authenticity Can’t Be Faked
For the Haas School of Business, the application is a multi-step process that goes far beyond an essay that could be written with AI: “In business school admissions, we spend a lot of time thinking about authenticity. As we review applications, we want to understand who we are considering adding to our community,” explains Whitten. “Sure, ChatGPT can be a shortcut and may ultimately get an applicant a pretty solid essay. But there’s a humanity that’s clearly lacking in every AI essay I have read. I personally am not super worried about the influx of students using AI in their applications.”
He adds, “No application, at least in our business school, is just the essays, though. There are so many different other components of the actual review process. Sure, one element of their application could use a shortcut, but that isn’t going to translate if you’re invited to an interview, and you have to talk about yourself and tell your story. ChatGPT is not going to be there. So sure you can use it, but is it really serving you in what you want to accomplish in your education?”
AI Can Be a Valuable Tool
Much like computers and the internet have made writing and research easier when used appropriately, ChatGPT, and AI programs like it, can be valuable tools that help students accomplish their work. “The best case scenario, as people integrate this technology into their lives, is that it should provide some type of ease. For some people who may not be academically prepared as strong writers, it might help them organize their thoughts. I don’t see that as a bad thing,” explains Whitten.
For decades, students have been using other tools to help them gain admission to business schools. “Many applicants use consultants to work through their application process. They use them to prepare for interviews or to give feedback, commentary, and review on their essays. AI, as a tool, could be similar to that process,” says Whitten. “It would be weird to lay a line down and say you can do things like use a consultant but don’t use AI. If used appropriately, it could be helpful.”
However, these tools, including consultants, should be strictly limited to providing help, not replacing a student’s original work. “It’s important to recognize that there are lines between assistance, plagiarism, and cheating. Students have to make sure that, as they are reducing their effort and work, they are still producing something they have truly done,” he clarifies.
AI can also be a valuable tool for enrolled business school students, particularly in examining how to apply AI to business models. “In business school, there’s an interest in how AI impacts the industry and how it is being utilized to make decisions. I think there’s a wealth of information available, so we now see students thinking about how they can utilize their tools,” notes Whitten.
The diversity of assignments in most business schools will, in many scenarios, prevent outright plagiarism with AI: “The concern has been discussed amongst faculty, but I don’t think ChatGPT will be able to support students in every scenario because there are varying types of assignments. Sure, it could lead to some faculty redesigning their curriculum where perhaps they no longer ask for papers but rather assign a culminating activity emphasizing what they have learned,” suggests Whitten. “There will be adaptations no matter what. We will have to do things differently so students aren’t encouraged to cheat.”
Equity and Diversity Concerns with AI in Business School
The inevitable integration of AI and ChatGPT into business schools and admissions processes naturally raises concerns about equity and diversity. The ACLU notes, “There is ample evidence of the discriminatory harm that AI tools can cause to already marginalized groups. After all, AI is built by humans and deployed in systems and institutions that have been marked by entrenched discrimination.” Often, the data used to train AI tools is biased, which then bakes racism and inequity directly into the products the AI produces.
“I think it’s important to have conversations where we spell things out, particularly in the diversity and equity inclusion spaces. How do we not let AI kind of run rampant? What are the ethics behind it? So how do we use it responsibly? And how do we use it to make lives better and not cause more harm?” asks Whitten.
“AI isn’t just AI. It was programmed. That program comes in with certain expectations and rules they’ve established,” continues Whitten. “While it is not necessarily AI, every time I go into an airport bathroom with automatic sensors on any of the equipment, there’s a moment of dread. I know that I have to approach that sensor, as a Black person, with my palms up. Otherwise, it’s a 50-50 chance of it seeing my darker skin.”
As AI software development continues, it will be critical for developers and their management to ensure their tools are inclusive and diverse: “There will always be the concern of how it was developed. Who does it impact, and what specific values might it be reinforcing? Conversations about ChatGPT or other AI systems need to include such things as whether they can incorporate African American Vernacular English or only produce the standard English taught in schools. How are we capturing people’s actual voice, and how does that translate out?” says Whitten.