shankari opened this issue 1 day ago
Wrt the max current: it is set via Node-RED using everest_external/nodered/#/cmd/set_max_current, which invokes
// external input to charger: update max_current and new validUntil
bool set_max_current(float ampere, std::chrono::time_point<date::utc_clock> validUntil);
and checks that the requested value is non-negative and below the absolute maximum:
if (c >= 0.0 and c <= CHARGER_ABSOLUTE_MAX_CURRENT) {
It then sets it on the BSP and signals:
bsp->set_overcurrent_limit(c);
signal_max_current(c);
This absolute maximum is 1000 A, so it is clearly not the source of the 48:
static constexpr float CHARGER_ABSOLUTE_MAX_CURRENT{1000.};
and it is clearly published from
mod->charger->signal_max_current.connect([this](float c) {
mod->mqtt.publish(fmt::format("everest_external/nodered/{}/state/max_current", mod->config.connector_id), c);
limits.uuid = mod->info.id;
so where is it invoked from?
It is initially invoked in the giant EvseManager::ready:
// start with a limit of 0 amps. We will get a budget from EnergyManager that is locally limited by hw
// caps.
charger->set_max_current(0.0F, date::utc_clock::now() + std::chrono::seconds(120));
and the energy manager uses https://github.com/EVerest/everest-core/blob/828072742f816d74d44d55fdf13d01c8fbecd449/modules/EvseManager/energy_grid/energyImpl.cpp#L353
if (value.limits_root_side.value().ac_max_current_A.has_value()) {
limit = value.limits_root_side.value().ac_max_current_A.value();
}
...
float a = value.limits_root_side.value().total_power_W.value() / mod->config.ac_nominal_voltage /
mod->ac_nr_phases_active;
But the only energy configuration that I see is:
grid_connection_point:
module: EnergyNode
config_module:
fuse_limit_A: 40.0
phase_count: 3
Need to add some logs here to see what is going on.
In parallel, let's figure out where the default composite schedule comes from. The composite schedules that we use come from
std::vector<CompositeSchedule> ChargePoint::get_all_composite_schedules(const int32_t duration_s,
const ChargingRateUnitEnum& unit) {
which calls
auto schedule = this->smart_charging_handler->calculate_composite_schedule(
Bingo! Here's where the defaults are set https://github.com/EVerest/libocpp/blob/925e9cd3049faf6d31e496e33aad619501f5c3d9/lib/ocpp/v201/smart_charging.cpp#L632
const auto default_amps_limit =
this->device_model->get_optional_value<int>(ControllerComponentVariables::CompositeScheduleDefaultLimitAmps)
.value_or(DEFAULT_LIMIT_AMPS);
const auto default_watts_limit =
this->device_model->get_optional_value<int>(ControllerComponentVariables::CompositeScheduleDefaultLimitWatts)
.value_or(DEFAULT_LIMIT_WATTS);
Looking at /ext/dist/share/everest/modules/OCPP201/component_config/standardized/SmartChargingCtrlr.json, we do indeed see 48 A and 33120 W. This was added in https://github.com/EVerest/libocpp/commit/8d74ff558945eb189738555be2d60b22800cf962. I looked through the commit and the related PR, and there was no explanation for the defaults.
Going back to the other limits, they are indeed set from the fuse_limit_A
modules/EnergyNode/energy_grid/energyImpl.cpp
local_schedule.limits_to_root.ac_max_phase_count = mod->config.phase_count;
local_schedule.limits_to_root.ac_max_current_A = mod->config.fuse_limit_A;
local_schedule.limits_to_leaves.ac_max_phase_count = mod->config.phase_count;
local_schedule.limits_to_leaves.ac_max_current_A = mod->config.fuse_limit_A;
or
if (!e.limits_to_root.ac_max_current_A.has_value() ||
e.limits_to_root.ac_max_current_A.value() > mod->config.fuse_limit_A)
e.limits_to_root.ac_max_current_A = mod->config.fuse_limit_A;
if (!e.limits_to_root.ac_max_phase_count.has_value() ||
e.limits_to_root.ac_max_phase_count.value() > mod->config.phase_count)
e.limits_to_root.ac_max_phase_count = mod->config.phase_count;
The default config has it at 32, but this variable is set to 40 in our config
This has been true since the very first commit
Let's add some logs and figure out what is going on...
Rooting around within the energyImpl, I also see another location from which we can get the max current: the hardware capabilities.
entry_import.limits_to_root.ac_max_current_A = hw_caps.max_current_A_import;
entry_import.limits_to_root.ac_min_current_A = hw_caps.min_current_A_import;
Let's see where they are coming from. Bingo! They come from the hardware (aka the BSP), and in our simulator the BSP is the JsYetiSimulator, which does have it defined as 32.
Note also that the API publishes these values once a minute
Let's get some more logs, and then make sure that we set all the locations to 32 and see if everything is more consistent then!
Logs confirm that the limits are coming from the hardware capabilities. I am not seeing the call to the fuse API happen yet
2024-11-18 07:25:14.773429 [INFO] evse_manager_2: :: Clearing import request schedule by setting max current from hw_caps = 32
2024-11-18 07:25:14.773564 [INFO] evse_manager_2: :: Clearing export request schedule by setting max current from hw_caps = 32
2024-11-18 07:25:14.773658 [INFO] evse_manager_2: :: Clearing import request schedule by setting max current from hw_caps = 32
2024-11-18 07:25:14.773744 [INFO] evse_manager_2: :: Clearing export request schedule by setting max current from hw_caps = 32
2024-11-18 07:25:14.774441 [INFO] evse_manager_2: :: Clearing import request schedule by setting max current from hw_caps = 32
2024-11-18 07:25:14.774556 [INFO] evse_manager_2: :: Clearing export request schedule by setting max current from hw_caps = 32
Double-checked this morning with a fresh pair of eyes, and can confirm that:
- we do see the value from hw_capabilities:
2024-11-18 15:05:35.427388 [INFO] evse_manager_1: :: Handle enforce limits with ac_max_current_A = 32
2024-11-18 15:05:35.481733 [INFO] evse_manager_1: :: Clearing import request schedule by setting max current from hw_caps = 32
2024-11-18 15:05:35.481918 [INFO] evse_manager_1: :: Clearing export request schedule by setting max current from hw_caps = 32
2024-11-18 15:05:36.483674 [INFO] evse_manager_1: :: Clearing import request schedule by setting max current from hw_caps = 32
2024-11-18 15:05:36.483872 [INFO] evse_manager_1: :: Clearing export request schedule by setting max current from hw_caps = 32
- we don't see any updates from the fuse (even after plugging in)
- it is still mismatched with the charge profile:
2024-11-18 15:09:28.420556 [INFO] ocpp:OCPP201 :: {
"chargingRateUnit": "A",
"chargingSchedulePeriod": [
{
"limit": 48.0,
"numberPhases": 3,
"startPeriod": 0
}
],
"duration": 600,
"evseId": 0,
"scheduleStart": "2024-11-18T15:09:28.000Z"
}
2024-11-18 15:09:28.421033 [INFO] ocpp:OCPP201 :: {
"chargingRateUnit": "A",
"chargingSchedulePeriod": [
{
"limit": 48.0,
"numberPhases": 3,
"startPeriod": 0
}
],
"duration": 600,
"evseId": 1,
"scheduleStart": "2024-11-18T15:09:28.000Z"
}
Maybe that is OK: the CSMS thinks the station can deliver 48 A even though the station has a hardware limit of 32 A, so the effective limit should be 32 A. But shouldn't that 32 A then be reflected as an external constraint? Need to discuss with the community @the-bay-kay and @abby-wheelis.
When plugged in, the kW also seems to reflect the 48 (240 * 48 = 11.5 kW), which is wrong since we can only provide 32 A at the hardware level. Going to investigate that next...
Why do we not use charging curves when the departure time is not set?
It seems there are a few different issues occurring in the powercurve generation. For one, we're not generating any of the previews, because the exec module is failing to run. The first error we get is:
/bin/bash: line 1: /usr/src/node-red/python_environments/everest/bin/python3: No such file or directory
So, we can assume our venv isn't configured correctly. As a sanity check, we attempt to run with just python3
...
python3: can't open file '/bin/scripts/preview_curve_4_nodered.py': [Errno 2] No such file or directory
... and we fail to find the script entirely. Running find / -name preview_curve_4_nodered.py in the node-red container shows that our script hasn't been copied over. The script is present in the repository, but the steps to copy it over are absent. Originally, I implemented this as setup within the Node-RED Dockerfile -- would it be better if we set this up in a separate script, as we now do with the manager patches here?
> When plugged in, the kW also seems to reflect the 48 (240 * 48 = 11.5kw) which is wrong since we can only provide 32A at the hardware level. Going to investigate that next...
Will update this comment as I find more details, but wanted to link to my findings on the PowerMeter issues here before I forget (this thread was originally concerned with piping through the ChargingProfileSchedules, but the findings concerning JSYetiSimulator's PWM implementation should remain relevant).
Ah maybe I missed that while porting over the changes. In the Dockerfile is fine, there were just soooo many patches for the manager that I thought it would be tidier to put them into scripts. But the nodered Dockerfile is fairly simple now.
Some updates on the Node-RED changes!
Video of existing changes is included in the cut below -- I'll upload my existing changes and link to the fork in this comment. The fork can be found here -- this builds Node-RED locally; I can upload an image if that'd be preferred! We shouldn't need to change Node-RED beyond the script setup.
So I finally figured out the value in the gauge. And it is in fact tied back to the 48 and some confusion around naming.
Starting with basic electrical engineering: AC, unlike DC, has phases. In the US, we typically use 3-phase charging. The total power is drawn across the three phases. Most chargers in the US apparently use 40-48 A on a 240 V line, giving us around 11520 W, or 11.5 kW. While supplying 48 A of current, the total current is split between three phases, giving us ~16 A per phase. That is the L1, L2, L3 current/voltage that we see in the powermeter output logs.
The demo/simulator seems to have some inconsistencies between values that we need to resolve. The inconsistencies are:
The draw_power message indicates 16 A, 3-phase:
2024-11-18 16:44:11.562515 [INFO] car_simulator_1 :: {
cmd: 'iso_draw_power_regulated',
args: [ 16, 3 ],
exec: [Function (anonymous)]
}
So those are internally consistent and make sense.
The 32A comes from a different location - the hardware capabilities of the charger, defined at https://github.com/EVerest/everest-core/blob/828072742f816d74d44d55fdf13d01c8fbecd449/modules/simulation/JsYetiSimulator/index.js#L1437
It appears, though, that this capability is not specified at the phase level but is combined across phases. The slider on the UI, however, is at the phase level, which makes it confusing: we see it at 32 A, but nothing happens to the power-drawn gauge until we get below 16 A.
So how can we fix this for the demo?
We should then go over all this and understand which limit is being specified where, and unify them for readability.
Finally, the simulation settings for the power meter are defined in modules/simulation/JsYetiSimulator/index.js. The voltage there is 230 V, not 240, which is how we get 230 * 48 ≈ 11 kW:
mod.simdata_setting = {
cp_voltage: 12.0,
pp_resistor: 220.1,
impedance: 500.0,
rcd_current: 0.1,
voltages: { L1: 230.0, L2: 230.0, L3: 230.0 },
currents: {
L1: 0.0, L2: 0.0, L3: 0.0, N: 0.0,
},
frequencies: { L1: 50.0, L2: 50.0, L3: 50.0 },
};
Note also that we can see the delivered power in greater detail in the "Debug" screen; see screenshots below.
Power gauge | without charging (note fake voltage "noise") | with charging (note current around 16A) |
---|---|---|
I also double-checked, and the Yeti reference hardware has a cap of 16A.
./modules/YetiDriver/board_support/evse_board_supportImpl.cpp: caps.max_current_A_import = 16;
./modules/YetiDriver/board_support/evse_board_supportImpl.cpp: caps.max_current_A_export = 16;
That's another option!
I changed the value in the simulator and verified that it is getting invoked by adding logs, but the max limit is not changing. I am first going to cheat by setting the limit as part of the node-red SIL, and then will ask the community / figure out where it is coming from.
Hardcoding also does not work. However, I do notice
{
std::scoped_lock lock(hw_caps_mutex);
hw_capabilities = c;
// Maybe override with user setting for this EVSE
if (config.max_current_import_A < hw_capabilities.max_current_A_import) {
hw_capabilities.max_current_A_import = config.max_current_import_A;
}
if (config.max_current_export_A < hw_capabilities.max_current_A_export) {
hw_capabilities.max_current_A_export = config.max_current_export_A;
}
}
This is indeed the reason
2024-11-19 07:48:08.647155 [WARN] evse_manager_1: module::EvseManager::init()::<lambda(types::evse_board_support::HardwareCapabilities)> :: Received new capability {
"connector_type": "IEC62196Type2Cable",
"max_current_A_export": 16.0,
"max_current_A_import": 48.0,
"max_phase_count_export": 3,
"max_phase_count_import": 3,
"min_current_A_export": 0.0,
"min_current_A_import": 6.0,
"min_phase_count_export": 1,
"min_phase_count_import": 1,
"supports_changing_phases_during_charging": true
}
comparing to config 32
Let's try to set the config as well!
Ah it is set to 32 by default. Let's override...
max_current_import_A:
description: User configurable current limit for this EVSE in Ampere
type: number
default: 32
Now we see a limit of 48, but it is then dialed down to 40, presumably because of the fuse value. Setting that to 48 as well.
2024-11-19 08:02:43.451230 [INFO] evse_manager_1: :: Clearing import request schedule by setting max current from hw_caps = 48
2024-11-19 08:02:43.451392 [INFO] evse_manager_1: :: Clearing export request schedule by setting max current from hw_caps = 48
2024-11-19 08:02:43.497242 [INFO] evse_manager_1: :: Handle enforce limits with ac_max_current_A = 40
Great! That worked. I am now going to fix the limit-setting slider, and then pull everything and generate a new release. I can verify that:
I think that the easiest option at this point is to reset everything to either 16 or 32. Let's go with 16 to be consistent with the uWMC.
wrt https://github.com/EVerest/everest-demo/issues/92#issuecomment-2484422103
@the-bay-kay can you pull my node-red changes, merge them and submit a PR with your changes? I think we can be GtG then!
> ...can you pull my node-red changes, merge them and submit a PR with your changes? I think we can be GtG then!
Awesome! I'll leave docker-compose.ocpp201.yaml set up to build Node-RED locally, and then you can create a new image from that if you'd like. (Speaking of, I think there may be a missing image -- .env was set to .22, but we're only up to .21! Easy enough fix : ))
There were no merge conflicts, so I'm assuming it should be smooth sailing -- I'll create the PR once I know I've run some tests and confirm things are working!
@the-bay-kay great! I think it is particularly important to ensure that, when we specify a power delivery curve as part of the power delivery req/res, that the charging "gauge" reflects that. If it doesn't, you may need to take a look at modules/EvseManager/energy_grid/energyImpl.cpp
and modules/simulation/JsYetiSimulator/index.js
(in particular, drawPower
) to see what is going on.
I think this is really and truly the last piece for the demo. I don't plan to investigate the other issues in the time left.
> Awesome! I'll leave docker-compose.ocpp201.yaml set up to build Node-RED locally, and then you can create a new image from that if you'd like.
No need to do this; all images (except the manager) are built by the CI and pushed. We should be using pre-built images, and only pre-built images in the demo.
> (Speaking of, I think there may be a missing image -- .env was set to .22, but we're only up to .21! Easy enough fix : ))
The manager is not built by the CI due to lack of resources. So there will always be a ~ 30 min lag while I build and push it locally. Just try again in a bit....
I think the demo is in pretty good shape for the kind of high-level, superficial exploration that we can do in 15 mins. But there are still some inconsistencies I found that I would like to take the time to fix. Some of these may just be my lack of understanding (e.g. #90), but then I would like to deepen my understanding to be ready for questions during the office hours.
I will keep a summary in the description and the details below.