New AI-Powered Sensors Could Tell Teachers What’s Really Going on With Students

In a busy classroom, teachers sometimes miss social interactions that contain clues about how students are faring. New technology, powered by artificial intelligence, is trying to change that.

Researchers at Michigan State University have received a $600,000 grant from the National Science Foundation to test wearable technology that senses children’s location in the classroom and the orientation of their bodies, as well as their movement and speech. The research will start in preschool, with a focus on students with developmental disabilities, including autism and attention deficit/hyperactivity disorder, or ADHD, who sometimes struggle to interact and connect with their peers.

Teachers will receive feedback—both real-time and after-the-fact—based on what the sensors show. The researchers say that more data on how these students are engaging in class will help teachers better support them and tailor interventions to improve outcomes.

“The traditional way in which teachers have addressed these needs is through direct observation of children—but in a classroom of 15 to 20 children, it’s difficult to capture all the nuances of what’s going on and really understand what’s best for the child to support their social and emotional development,” said Sarah Douglas, an associate professor who studies special education at MSU and is one of the researchers on this project.

The researchers say they believe the technology could eventually be used for older grades, too, and could help teachers support students with depression, anxiety, or other mental health conditions.

The sensors clip onto children's shirts and can also be placed throughout the classroom; the teacher can wear one, too.

“It can detect where you are exactly in the classroom [and] detect which direction you’re looking at, who else is around you,” said Subir Biswas, a professor in MSU’s department of electrical and computer engineering and the other researcher on the project. “We don’t measure eye contact, but we can tell by collecting this data who is possibly talking to whom.”

The data from the sensors is wirelessly uploaded to the cloud, and AI is then used to send relevant alerts to teachers.

For example, a teacher who is concerned that a student is not engaging with their peers might request to be notified if that child is playing alone for more than two minutes. But if the teacher gets that alert, and the child is actually putting away their backpack or doing something else appropriate, the AI can use that feedback to adjust its knowledge of different areas of the classroom and become smarter for the future.
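The article does not describe the system's internals, but the alert-and-feedback loop it outlines can be sketched in a few lines. The sketch below is purely illustrative: the class name, zone labels, and two-minute threshold are assumptions drawn from the example in the text, not the researchers' actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class AloneAlertRule:
    """Alert when a child has been alone past a threshold, unless they
    are in a zone teachers have marked as appropriate for solo activity."""
    threshold_seconds: int = 120  # the two-minute example from the article
    appropriate_zones: set = field(default_factory=set)

    def check(self, child_id, zone, seconds_alone):
        # Suppress alerts in zones the teacher has flagged (e.g., cubbies).
        if zone in self.appropriate_zones:
            return None
        if seconds_alone > self.threshold_seconds:
            return f"{child_id} has been alone in {zone} for {seconds_alone}s"
        return None

    def teacher_dismissed(self, zone):
        # Teacher feedback: being alone here is fine, so the rule
        # "learns" and stops alerting for this zone in the future.
        self.appropriate_zones.add(zone)

rule = AloneAlertRule()
print(rule.check("child_7", "cubbies", 150))  # fires an alert
rule.teacher_dismissed("cubbies")             # teacher marks it as fine
print(rule.check("child_7", "cubbies", 150))  # None — alert suppressed
```

In practice, the researchers describe this adjustment as the AI refining its knowledge of classroom areas over time; the zone set above is just the simplest stand-in for that learned state.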

Other behaviors might be compiled into a report that will be sent to the teacher at the end of the day or week.

“I wouldn’t want a teacher to constantly be getting alerts,” Douglas said. “There’s a fair amount of individualization that goes into the tool that makes it really beneficial.”

The sensors can give teachers eyes in the back of their heads

The researchers said the data could reveal some important patterns and behaviors that might have been missed otherwise.

For example, if two children are talking, the sensors can tell which student initiated the conversation, how long it lasted, and which student was the first to walk away.

“For a single conversation, that might not be too much information, but it can be quite rich when you start collecting this information for a given person over a period of time,” Biswas said. “A pattern will emerge.”
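To make concrete what "initiated, duration, and first to walk away" might look like as data, here is a minimal sketch that summarizes one conversation from a time-ordered list of sensor events. The event format and field names are hypothetical, invented for illustration; the article does not specify how the system represents its data.

```python
def conversation_summary(events):
    """Summarize a conversation from time-ordered sensor events.

    Each event is (timestamp_seconds, child_id, kind), where kind is
    'speak' (speech detected near a peer) or 'leave' (walked away).
    Returns who initiated, how long it lasted, and who left first.
    """
    speakers = [e for e in events if e[2] == "speak"]
    leavers = [e for e in events if e[2] == "leave"]
    return {
        "initiator": speakers[0][1] if speakers else None,
        "duration_seconds": events[-1][0] - events[0][0],
        "first_to_leave": leavers[0][1] if leavers else None,
    }

events = [(0, "A", "speak"), (5, "B", "speak"),
          (40, "A", "speak"), (62, "B", "leave")]
print(conversation_summary(events))
# {'initiator': 'A', 'duration_seconds': 62, 'first_to_leave': 'B'}
```

Aggregating such summaries per child over weeks is what would surface the patterns Biswas describes, such as a student who rarely initiates or is usually the first to leave.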

The sensors can also alert teachers when children are being excluded or interacting less often with their peers.

“So many times as early-childhood teachers, we end up spending a lot of energy on the children who make the most noise—who stand out in their behaviors and challenges they’re facing—but the children who fly under the radar are the ones who are the most timid or don’t know how to engage,” Douglas said. “There might be a lot of ways in which teachers might naturally not notice the social interactions.”


When teachers do have access to this level of granular detail, they can better design interventions and supports for students who need them, the researchers said.

And Douglas said she hopes that having clear data will remove bias from discipline decisions, since teachers won't be relying solely on their own observations. Boys and Black children are suspended from preschool at disproportionate rates.

Teachers are as likely to show racial biases as any other American adult, and research has shown that Black boys as young as 10 are routinely seen as older and less innocent than their white peers. One study found that student-teachers were more likely to view Black children’s expressions as “angry” compared to those of white children.

Having objective data from the sensors about children’s behavior will hopefully “allow teachers to have a more reflective nature,” instead of having knee-jerk reactions, Douglas said.

Teachers have logistical questions and concerns

The researchers have already tested the sensors in action but are planning to expand their research with the grant money. They will test the sensors in Michigan classrooms over the next couple of years and use feedback from educators to improve the technology.

“Administrators [have said], ‘This is super cool; this is really going to help my teachers. This is a need—this is an area that’s a struggle for them,’” Douglas said.

Still, she said, the teachers she has spoken to have been a little more apprehensive so far, with questions about the logistics.

For example, teachers expressed concerns about getting parents on board. (Ideally, Douglas said, all students in the class will have permission to wear sensors. In order to have a better chance of getting parents’ permission, the researchers won’t record audio or video.)

And teachers have wondered whether young students—who are prone to putting things in their mouths—would be able to take the sensors off by themselves. (The sensors are snapped into place, so they’re difficult for a child to remove independently, Douglas said, but the wires are also safely encased.)

The research is expected to be completed in July 2026. In the final year, the researchers will conduct an experiment in which a teacher has access to the sensors at some times and not others, so they can see which leads to more effective interventions: human observation or the sensor data.

If their hypothesis—that the sensor will be more effective—is correct, the researchers will pursue mass production of the sensors so that they can become widely available for educators, Douglas said.

“We’re really making strides toward future classrooms that use technology in clever ways to allow teachers to do their jobs more effectively and efficiently,” she said.
