Meet Vee, the Vibes Manager.
“I make sure our team feels good and has fun even during deadlines,” said Vee.

She’s an artificial persona created for a new Boise State University class called “Teamwork in the Digital Age.” There are 12 personas altogether, including Cass the Charismatic Communicator and Rex the Devil’s Advocate.
“I question everything so we can strengthen our ideas. I know it’s annoying sometimes, but someone has to play devil’s advocate,” said Rex.

Students use ChatGPT to simulate a workplace conflict with the personas. They debate scenarios, such as whether to use new AI tools in a workshop. ChatGPT narrates the conversation.
“I’ll embody both of them alongside you, letting them interact with you and with each other like a real team. That means you’ll hear Rex grumbling at Vee sometimes, and Vee reminding Rex to chill,” ChatGPT tells the user.
The real-life students share what they’ve learned from the artificial conversation on a discussion board and receive written feedback from each other. Everything happens online.
Kelly Arispe is the launch director of the university’s School for the Digital Future, which offers an artificial intelligence certificate program. Arispe said there are about 600 students in the program and the majority only take courses online without meeting in person.
“We have to understand what human-centered AI is, and we need to lead our students to be able to think about that and problem solve and work on teams and, um, to be ready to engage,” said Arispe.
This is part of a broader effort to integrate artificial intelligence into education at the university. Boise State also just launched an in-house chatbot service called boisestate.ai.
It’s a collection of large language models created by Meta, Amazon and OpenAI. The models, trained on data from the internet, summarize and generate text and answer user questions. Staff can fill out a Google Form to add the AI to their course syllabus.
The university is laying out use guidelines for the chatbot. Students can use it to summarize homework and help with writing, but are not allowed to get direct answers to homework questions or generate essays.

Elvira Gomez, a third-year radiology major, said AI is now integrated into academics.
“Now we have to talk more about cheating, we have to talk more about using it in creative ways instead of just to do our work for us,” said Gomez.
The university also has guidelines for instructors. Educators can paste students’ work into the chatbot to generate assignment feedback. They’re warned not to use it for “grading complex assignments.”
![A Boise State AI feedback prompt. It reads: "You are a feedback specialist helping faculty provide meaningful student evaluations. I am a Boise State professor grading [assignment type] for my [course] class. Here's the assignment: [brief description] and here's a student's work: [sample work]. Can you help me provide constructive, specific feedback that acknowledges strengths and gives actionable suggestions for improvement?" Instructors can fill in the prompt and paste it into the chat bot to generate assignment feedback.](https://npr.brightspotcdn.com/dims4/default/6bbacb1/2147483647/strip/true/crop/2040x734+0+0/resize/880x317!/quality/90/?url=http%3A%2F%2Fnpr-brightspot.s3.amazonaws.com%2Ff3%2F96%2F634e9ce844dab78ecd19552d13f7%2Fscreenshot-2025-09-15-at-11-35-22-am.png)
Jen Schneider, interim dean of the College of Innovation and Design, said the proprietary service is meant to protect student data.
“If students are writing papers, that is ostensibly their intellectual property. If they upload it to a commercial model, that paper, their ideas, then go and train that commercial model. It doesn’t really belong to them anymore,” said Schneider.
Boise State’s chatbot is also cheaper than commercial models because it’s hosted on university servers. The campus Office of Information Technology estimates that giving everyone on campus a ChatGPT license would cost about $7.2 million a year.
Other institutions, like the California State University system, have paid more than double that to give their students licenses.

Schneider said Boise State isn’t trying to encourage AI adoption, but it recognizes the technology’s presence in student and professional spaces. The Pew Research Center reported this year that a majority of U.S. adults under 30 have used ChatGPT at least once.
“We’re trying to promote engagement with AI. And that might mean that you decide, ‘Wow, I never want to use these tools. They’re not for me. I’m ethically opposed to them,’” said Schneider.
Natalie Elder, a business and economics major, said AI is a helpful tool in moderation, but she is concerned about its long-term effects.
“People become a lot more lazy when they use AI more frequently. They're just, they don't want to think of things,” said Elder.
A study released by the MIT Media Lab this June found that using artificial intelligence for essay writing decreases brain connectivity, which could have long-term implications for the brain development of people who rely on AI.
Trinity, a student at Boise State University, was using ChatGPT to summarize and complete an essay about a piece of art.
“Instead of reading the instructions, I just paste it into ChatGPT and then it reads it for me and tells me the instructions in a way shorter, like one sentence,” Trinity said. “I get a lot more done. I have time to do more fun things in college than just only school. But I also learn a lot from it because it, like, tells you everything.”

The university has no institution-level policy on the use of AI detection tools. Schneider said the technology is evolving so quickly that any tool the school purchases would soon become obsolete.
Back in the simulated teamwork discussion with AI personas Rex and Vee, ChatGPT comments on the rapid development of AI.
“The chaos is because I keep pointing out trends we’re ignoring — AI shifts so fast that if we don’t adapt, we’ll be outdated before the workshop even launches,” said ChatGPT.
Schneider said the university is carefully considering the effects of AI on student learning and thinks students will use AI appropriately.
“We never want students using AI to produce work uncritically and then faculty using AI to grade that work. That is the dystopic narrative,” said Schneider.
Schneider said cheating has always existed and AI may make it easier.
“But generally speaking, the data that we have on what students want, and certainly what faculty want, suggests that people are not wanting to scam their way through university.”