AI thinks you should go to jail, even if you didn't do the crime

Evidence from using ChatGPT as a prosecutor

By Rory Pulvino (Justice Innovation Lab), Dan Sutton (Stanford Law School), and JJ Naddeo (Justice Innovation Lab & University of Michigan Law School)

Climate impact

Training generative AI models carries a large, well-documented environmental cost. That cost may be mitigated depending on the energy source used to power training, but it is still likely to be substantial. In addition, each individual prompt carries its own environmental cost, which should be weighed when conducting public-interest research projects like this one.

Unfortunately, there are no readily available public estimates of the environmental cost of a prompt to a generative AI chatbot like ChatGPT. To fill that gap, we reviewed questions about per-prompt environmental costs in OpenAI forums, on AI StackExchange, on environmental protection sites, and ultimately in ChatGPT itself. Estimates generally range from 1 to 2 grams of CO2 per prompt and reply. Other estimates are based on the total number of tokens used (essentially, how long the prompt and reply are), which is how we estimate our cost below, given the significant variation in prompt and reply length.

For our experiment, we estimate that 100 tokens cost 0.375g of CO2, consistent with the 1 to 2 grams per prompt and reply cited above. Over the course of the experiment we used 14.377M tokens, bringing our estimated carbon cost to roughly 53,914 grams of CO2. Using the EPA greenhouse gas equivalencies calculator, this equates to burning about 6 gallons of gasoline.
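The arithmetic above can be sketched in a few lines. This is a back-of-the-envelope calculation, not a measured value: the 0.375g-per-100-tokens rate is the estimate used in this section, and the gasoline conversion assumes the EPA's figure of roughly 8,887 grams of CO2 per gallon of gasoline burned.

```python
# Back-of-the-envelope CO2 estimate for the experiment's token usage.
# Both constants are estimates: the per-token rate comes from this
# section, and the per-gallon figure is the EPA's approximate value.
CO2_GRAMS_PER_100_TOKENS = 0.375
EPA_GRAMS_CO2_PER_GALLON = 8_887  # EPA greenhouse gas equivalencies

def estimate_co2_grams(tokens: int) -> float:
    """Estimated grams of CO2 emitted for a given token count."""
    return tokens / 100 * CO2_GRAMS_PER_100_TOKENS

tokens_used = 14_377_000  # 14.377M tokens over the experiment
co2_grams = estimate_co2_grams(tokens_used)
gallons = co2_grams / EPA_GRAMS_CO2_PER_GALLON
print(f"{co2_grams:,.0f} g CO2, about {gallons:.1f} gallons of gasoline")
```

Running this reproduces the figures in the text: roughly 53,914 grams of CO2, or about 6 gallons of gasoline.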