10/26/2025
Built at Gator Hack IV
Climate change is an urgent challenge, but the impact of proposed sustainability actions is often abstract and difficult to visualize. Without clear spatial predictions, it is hard to evaluate which interventions will have the greatest effect and where resources should be directed. This affects citizens, urban planners, and especially policymakers, who need to make informed, data-driven decisions about climate action but lack tools to easily visualize the outcomes of different strategies. Dense, complex environments like New York City feel this gap most acutely, which makes the city a natural starting point.
CarbonIQ is an AI-powered climate impact simulator that translates natural language prompts about sustainability into visual, data-driven simulations. Using real NYC data and geography, it generates spatial patterns showing how interventions like converting taxis to EVs or installing green roofs affect emissions across boroughs. The resulting carbon emission data is displayed visually on the map, making the impact of climate actions tangible and measurable.
CarbonIQ's frontend is a React + Vite app that provides an interactive Leaflet.js map. Users enter natural language prompts (e.g., “Convert 30% of taxis to EVs in Manhattan”), which are sent to the Python backend. The backend parses the prompt with Anthropic’s Claude API, applies sector- and borough-specific modeling, and produces geographic emission data. The results are returned to the frontend through a FastAPI endpoint and visualized as baseline vs. simulation maps with interactive data points. Large datasets and model files are managed with Git LFS for scalability.
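The sector- and borough-specific modeling step can be sketched roughly as below. This is a hypothetical simplification, not CarbonIQ's actual implementation: the function name `simulate_intervention` and the baseline figures are illustrative placeholders, not real NYC emission data.

```python
# Hypothetical sketch of the borough/sector modeling step.
# Baseline figures below are illustrative placeholders, not real NYC data.
BASELINE_TONNES = {  # assumed annual CO2e per borough for one sector (taxis)
    "Manhattan": 120_000,
    "Brooklyn": 80_000,
    "Queens": 70_000,
    "Bronx": 40_000,
    "Staten Island": 15_000,
}

def simulate_intervention(borough: str, sector_reduction: float) -> dict:
    """Apply a fractional emission reduction (e.g. 0.30 for a 30% EV
    conversion) to one borough and return baseline vs. simulated values."""
    baseline = BASELINE_TONNES[borough]
    simulated = baseline * (1.0 - sector_reduction)
    return {
        "borough": borough,
        "baseline_tonnes": baseline,
        "simulated_tonnes": simulated,
        "delta_tonnes": baseline - simulated,
    }

result = simulate_intervention("Manhattan", 0.30)
print(result["simulated_tonnes"])  # 84000.0
```

A response shaped like this dictionary is what a FastAPI endpoint could serialize back to the frontend, with one record per borough for the map layer to color.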
We struggled with integrating real NYC geography into deterministic, AI-driven patterns, which required careful tuning to balance realism with reproducibility. Another hurdle was handling the large datasets and ensuring smooth, efficient communication between the backend and frontend. We overcame these challenges by implementing data normalization techniques for our visualizations, designing an efficient API structure for data transfer, and leveraging Git LFS for handling the heavy data and model files.
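One normalization approach consistent with what the paragraph above describes is min-max scaling, so baseline and simulation layers share a comparable color range on the map. This is a minimal sketch under that assumption; the helper name `normalize` is illustrative, not taken from the actual codebase.

```python
def normalize(values: list[float]) -> list[float]:
    """Min-max normalize emission values to [0, 1] so map color
    intensity is comparable across baseline and simulation layers."""
    lo, hi = min(values), max(values)
    if hi == lo:  # avoid division by zero on a flat series
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

print(normalize([40.0, 70.0, 120.0]))  # [0.0, 0.375, 1.0]
```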
We learned how to merge AI reasoning with real-world geographic data. We also learned how to connect all the technical layers (the Python backend, the React frontend, and the Leaflet.js visualization) into a single system. We built a working pipeline where a user's natural language prompt can dynamically update emission maps in real time. Most importantly, we accomplished our goal of creating a tool that makes the complex impact of climate interventions visible and easy to understand.
Future directions include integrating real-time emissions data, training machine learning models for better predictive accuracy, and expanding the platform beyond NYC to other cities. We also plan to add more advanced visualization options and mobile accessibility.