Ecocalculator Methodology
To estimate the environmental cost of your AI usage, we built a calculator based on recent peer-reviewed research. This methodology is limited and only provides an estimate.
Reach out to addie@projectkaleidoscope.org with feedback!
Equation - Electricity
We estimate electricity consumption with this equation:

Key

Key Definitions - Energy
Tokens
A token is a unit of data that an LLM processes. While there is no exact token size, we use the standard industry heuristic for English: 4 characters ≈ 1 token.
Input vs Output
Processing input is a distinct process from producing output for an LLM. Each has independent energy costs. Input Energy is the cost to "read" your instructions, while Output Energy is the cost to create a response.
Reasoning Multiplier
Reasoning models use a deeper "thought process" that generates invisible hidden tokens. To address this, we apply a Reasoning Multiplier, based on task type, to reflect the additional energy expended.
PUE
Data centers use energy beyond computing (such as cooling and lighting. PUE (Power Usage Effectiveness) is a ratio of total energy spent vs the computing energy. 1.0 is a perfect PUE score.
Startup Cost
Before a single token is generated, energy is spent routing and batching the request. We include this in our estimate as a fixed constant.
Equation - Water Consumption
We estimate water consumption with this equation:

Key

Key Definitions - Water
WUE
Water calculations are grounded in WUE or Water Usage Effectiveness. Like PUE it’s a ratio of water used per unit of energy.
Site
Water used for cooling the servers in a data center.
Source
Water used by the power plant that generates electricity for the data center.
Equation - Carbon Consumption
We estimate carbon consumption with this equation:

Key

Key Definitions - Carbon
CIF
CIF or Carbon Intensity Factor represents how much carbon was emitted given a unit of energy.
Sources
We are deeply grateful to the following papers, which informed our calculations.
CoIn: Counting the Invisible Reasoning Tokens in Commercial Opaque LLM APIs (Sun et al., 2025)