Carbon-Aware Inference Under Deployment Constraints: Extending the CCI Framework

Faculty-advised research — team forming

Core contributors included as co-authors

AI Energy Research

This thread extends the published CCI energy benchmarking framework to deployment environments with infrastructure constraints. It characterizes per-query energy costs and studies carbon-aware model selection under those constrained conditions.

Energy · Carbon-Aware Compute · LLM · Deployment Constraints · Sustainability · AI Efficiency

The crisis

  • Current energy benchmarking frameworks do not characterize inference costs at the infrastructure edge. Field offices, disconnected facilities, and cost-constrained environments deploy AI systems without per-query carbon visibility.
  • Carbon-aware model selection requires measurement frameworks that extend beyond cloud API contexts. No such framework currently exists for constrained deployment environments.
  • Organizations operating under these constraints therefore make model deployment decisions without per-query energy cost data.
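The selection problem the second bullet describes can be sketched in simplified form. This is an illustrative example, not part of the CCI framework itself: the model names, quality scores, energy figures, and grid intensity below are hypothetical placeholders, and the `select_model` helper is an assumption introduced here for clarity.

```python
# Sketch: carbon-aware model selection under a task-quality floor.
# All names and numbers are hypothetical illustration values.
from dataclasses import dataclass

@dataclass
class ModelProfile:
    name: str
    quality: float        # task quality score in [0, 1], from offline evaluation
    wh_per_query: float   # measured energy per query, watt-hours

def select_model(profiles, grid_gco2_per_kwh, min_quality):
    """Pick the lowest-carbon model that still meets the quality floor."""
    eligible = [p for p in profiles if p.quality >= min_quality]
    if not eligible:
        raise ValueError("no model meets the quality floor")
    # Per-query carbon in grams CO2e: Wh -> kWh, then scale by grid intensity.
    return min(eligible,
               key=lambda p: (p.wh_per_query / 1000.0) * grid_gco2_per_kwh)

models = [
    ModelProfile("large",  quality=0.92, wh_per_query=4.0),
    ModelProfile("medium", quality=0.85, wh_per_query=1.5),
    ModelProfile("small",  quality=0.70, wh_per_query=0.4),
]
choice = select_model(models, grid_gco2_per_kwh=400.0, min_quality=0.80)
print(choice.name)  # -> "medium": lowest carbon among models meeting the floor
```

The point of the sketch is the decision rule, not the numbers: without the per-query energy column, the comparison inside `min` cannot be made at all, which is the measurement gap this research targets.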

About this research

The CCI framework established per-query carbon cost measurement for cloud-based LLM inference. Constrained deployment environments present a distinct energy profile not characterized by the original framework. This thread extends the CCI methodology to address that measurement gap.
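In simplified form, per-query carbon cost comes down to attributing measured energy across queries and converting via grid carbon intensity. The helper below is a minimal sketch of that arithmetic, assuming even attribution and a fixed intensity; it does not reproduce the CCI framework's actual instrumentation, and the input values are hypothetical.

```python
def per_query_carbon_g(total_energy_wh, n_queries, grid_gco2_per_kwh):
    """Attribute measured energy evenly across queries, return gCO2e per query."""
    kwh_per_query = (total_energy_wh / n_queries) / 1000.0
    return kwh_per_query * grid_gco2_per_kwh

# Hypothetical batch: 120 Wh consumed over 1000 queries on a 500 gCO2e/kWh grid.
print(per_query_carbon_g(120.0, 1000, 500.0))  # -> 0.06 g CO2e per query
```

Constrained deployments complicate both inputs: total energy must be measured on heterogeneous local hardware rather than read from cloud telemetry, and grid intensity may be unavailable or dominated by on-site generation, which is the distinct energy profile this extension studies.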

Key findings

  • (In Progress)