The researchers found that while ChatGPT displayed an ability to identify location-specific environmental justice challenges in large, high-density population areas, the tool had limitations when it came to local environmental justice issues.
They said that the AI model could provide location-specific information for only about 17 per cent, or 515, of the total 3,018 counties it was asked about. Their findings are published in the journal Telematics and Informatics.

“We need to investigate the limitations of the technology to ensure that future developers recognise the possibilities of biases. That was the driving motivation of this research,” said Junghwan Kim, assistant professor at Virginia Tech and the study’s corresponding author.
The researchers said they chose environmental justice as the subject for the investigation to expand the range of questions typically used to test the performance of generative AI tools.
The US Department of Energy describes environmental justice as the “fair treatment and meaningful involvement of all people, regardless of race, colour, national origin, or income, with respect to the development, implementation, and enforcement of environmental laws, regulations, and policies”.

Asking questions county-wise allowed the researchers to measure ChatGPT’s responses against sociodemographic parameters such as population density and median household income, they said.

The populations in the counties they surveyed ranged from 1,00,19,635 (Los Angeles County, California) to 83 (Loving County, Texas).
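To make the audit design concrete, the sketch below shows one way such a county-by-county survey could be scripted; it is illustrative only, not the researchers’ actual code. It assumes the OpenAI chat completions API, a hypothetical counties.csv file with county, state, and population columns, and a deliberately crude check (whether the answer names the county) as a stand-in for the study’s own criteria for location-specific information.

```python
# Hypothetical sketch of a county-by-county audit of an LLM's local knowledge.
# Assumes the OpenAI chat API and a CSV of counties (county, state, population);
# the "location-specific" heuristic below is a crude illustrative proxy.
import csv
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def ask_about_county(county: str, state: str) -> str:
    """Ask the model about environmental justice issues in one county."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{
            "role": "user",
            "content": f"What are the environmental justice issues in {county}, {state}?",
        }],
    )
    return response.choices[0].message.content

def looks_location_specific(answer: str, county: str) -> bool:
    """Crude proxy: does the answer mention the county by name?"""
    return county.lower() in answer.lower()

results = []
with open("counties.csv", newline="") as f:  # hypothetical input file
    for row in csv.DictReader(f):
        answer = ask_about_county(row["county"], row["state"])
        results.append({
            "county": row["county"],
            "state": row["state"],
            "population": int(row["population"]),
            "specific": looks_location_specific(answer, row["county"]),
        })
```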
The team found that in rural states such as Idaho and New Hampshire, more than 90 per cent of the population lived in counties that could not receive location-specific information.
On the other hand, in states with larger urban populations such as Delaware or California, fewer than one per cent of the population lived in counties that could not receive location-specific information, the researchers said.
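The state-level figures quoted above are, in effect, population-weighted shares: the population of counties without location-specific answers divided by the state’s total population. Continuing the hypothetical sketch, a minimal pandas aggregation of that kind might look as follows (the `results` list is the one built above, not data from the study).

```python
# Continuation of the hypothetical sketch: population-weighted share of each
# state's residents living in counties without location-specific answers.
import pandas as pd

df = pd.DataFrame(results)  # from the sketch above
df["pop_without_info"] = df["population"].where(~df["specific"], 0)

by_state = df.groupby("state").agg(
    total_pop=("population", "sum"),
    pop_without_info=("pop_without_info", "sum"),
)
by_state["share_without_info"] = (
    by_state["pop_without_info"] / by_state["total_pop"]
)
print(by_state.sort_values("share_without_info", ascending=False).head())
```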
“While more study is needed, our findings reveal that geographic biases currently exist in the ChatGPT model,” said Kim, who teaches in the Department of Geography.
The findings hinted at issues regarding the “reliability and resiliency of large language models”, according to study co-author Ismini Lourentzou, who teaches in the Department of Computer Science.
“This is a starting point to investigate how programmers and AI developers might be able to anticipate and mitigate the disparity of information between big and small cities, between urban and rural environments,” Kim added.