(It is easy to calculate in series packs, but parallel ones are tricky, since the bus links will equalize the voltage. Did you manually remove the links and then measure each battery's voltage? Or did you estimate the spread by measuring the voltage drop across the bus links?)
Because the cells are hard‑paralleled, their terminals are forced to the same potential, so true inter‑battery divergence can’t be measured without isolation taps. Instead, “spread” refers to:
*Voltage_Max – Voltage_Min of the pack‑level voltage within each hourly window.*
This captures short‑term variation in the measured pack voltage (ADC noise, EMI artifacts, inverter mode shifts, temperature coefficient), not cell‑to‑cell imbalance.
The raw data is in `Data/combined_output.csv` with columns:
```
Timestamp, Voltage_Min, Voltage_Max
```
Those come from 60‑second samples aggregated hourly. The analysis scripts compute:
```
Spread = Voltage_Max - Voltage_Min
```
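For concreteness, here is a minimal sketch of that computation, assuming the column layout described above. The sample rows are made up for illustration; the real data lives in `Data/combined_output.csv`.

```python
import csv
import io

# Hypothetical sample rows in the format of Data/combined_output.csv
# (hourly aggregates of 60-second samples). Values are illustrative only.
sample = io.StringIO(
    "Timestamp,Voltage_Min,Voltage_Max\n"
    "2024-01-01 00:00,53.210,53.222\n"
    "2024-01-01 01:00,53.198,53.213\n"
)

def hourly_spreads(csv_file):
    """Spread = Voltage_Max - Voltage_Min for each hourly row, in millivolts."""
    return [
        (float(row["Voltage_Max"]) - float(row["Voltage_Min"])) * 1000.0
        for row in csv.DictReader(csv_file)
    ]

spreads = hourly_spreads(sample)
print([round(s, 3) for s in spreads])  # [12.0, 15.0]
```

Note this is the spread of the single pack-level measurement within each window, so it bounds measurement noise, not inter-battery divergence.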
So the ~10–15 mV “spread” in the report reflects the measurement envelope of the pack, not divergence between individual batteries. Measuring true per‑battery drift would require either per‑cell taps or momentary isolation, which wasn’t part of this study.
Happy to go deeper if you want details on sampling or noise characterization.