zephyrproject-rtos / zephyr

Primary Git Repository for the Zephyr Project. Zephyr is a new generation, scalable, optimized, secure RTOS for multiple hardware architectures.
https://docs.zephyrproject.org
Apache License 2.0

STM32H747I_DISCO_M4 STM32H747I_DISCO_M7 IPC (Inter-processor-communication) #69580

Closed RIPHeisenberg closed 7 months ago

RIPHeisenberg commented 8 months ago

Hello everyone,

I have been working with the STM32H747I Discovery board for a couple of weeks now. One particularly difficult thing to achieve is IPC (inter-processor communication).

The IPC mailbox feature doesn't support data communication, as mentioned here: https://github.com/zephyrproject-rtos/zephyr/pull/68758

The STM32H7 provides shared memory in the SRAM4 section (0x38000000). This seems to be useful for RPMsg or OpenAMP. I tried both, but RPMsg seems to use the mailbox feature to transport data, so I'm back to square one. The OpenAMP implementation is made to be used with an OpenAMP Linux client, which utilises RPMsg.

https://github.com/zephyrproject-rtos/zephyr/pull/68741

discusses this issue in greater detail. @uLipe seems to be working on a solution.

Are there efforts to fix this issue? Is there a workaround in place to get ahead? This is a very important feature for me, and I'd really like to stay on native Zephyr. Am I misunderstanding something fundamental about the platform? Any help is greatly appreciated.

Board: STM32H747i_DISCOVERY
Zephyr version: 3.5.99
Zephyr SDK version: 0.16.5
Commit: https://github.com/zephyrproject-rtos/zephyr/commit/023d4838fbb8513f9d9ff6f26b065591131c347d

best regards

uLipe commented 8 months ago

Hey @RIPHeisenberg, have you tried the RPMsg service samples?

They are located under samples/subsys/ipc.

OpenAMP takes a somewhat different approach through the RPMsg protocol. The shared RAM area is used by both cores to exchange data via virtqueues (from the virtio specification), and RPMsg is built on top of that mechanism by placing specially formatted data in the shared memory. The mailbox therefore serves only as a doorbell to synchronize the two cores and does not need to transport any data.

Please check the RPMsg and OpenAMP samples; they should work for M7 <---> M4 communication.

RIPHeisenberg commented 8 months ago

Hello again, so you would recommend using RPMsg on the M7 and OpenAMP on the M4, right? I'll try it and give you feedback. As a small side note, there is an issue with the libmetal implementation for the Cortex family; https://github.com/OpenAMP/open-amp/issues/463 explains a solution in detail. I'm not sure if they are going to fix it in the near future.

RIPHeisenberg commented 8 months ago

So I tried to mix and match, but it doesn't seem to be a very promising way forward. I tried RPMsg on the M7 and OpenAMP on the M4, but there isn't proper communication between the two. ST provides a PingPong demo, which seems to have strongly inspired the RPMsg examples; it mixes RPMsg and OpenAMP similarly to your recommendation: https://github.com/STMicroelectronics/STM32CubeH7/tree/master/Projects/STM32H747I-DISCO/Applications/OpenAMP/OpenAMP_PingPong.

My current configuration looks like this.

Example_openAMP_STM32H7.zip

It would be very helpful to have an example for this with the STM32 chips. Would this be something we could work on?

regards @uLipe

RIPHeisenberg commented 7 months ago

I have seen some efforts to tackle the IPC issues in https://github.com/zephyrproject-rtos/zephyr/pull/69680/commits. I'll wait until things are smoothed out and proceed from there.

gizmo1904 commented 7 months ago

Did you manage to get things working, @RIPHeisenberg? Could you please share your working configuration?

RIPHeisenberg commented 7 months ago

So I wasn't able to run any of the IPC services on the STM32H7. The IPC API and the RPMsg API both rely on a data transport layer that is not supported on the STM32; it is a known issue in the community. I talked to one of the IPC maintainers, and they want to use the MBOX API for the STM32 going forward. There is actually a pull request in progress that tackles this issue: https://github.com/zephyrproject-rtos/zephyr/pull/69680/commits. One of the maintainers told me it should be possible to use RPMsg and OpenAMP for this, but I'll wait for the feature to be implemented. It seems like the easiest way to achieve my goals, even if I have to wait a while.

gizmo1904 commented 7 months ago

Thank you very much for the quick reply, very kind of you!

celinakalus commented 6 months ago

@RIPHeisenberg I have only now become aware of this issue. It is true that in #69680 I am implementing the MBOX driver so that the IPC service works on the STM32H7, but RPMsg should already be working after a modification I made in #68741, which has already been merged upstream. You appear to be using an older version of Zephyr, however; make sure to use a version containing the fix. The problem I fixed was that the RPMsg service expected the IPM driver to be able to send at least a small amount of data, but the IPM implementation for the STM32H7 was unable to do so.

Please look at PR #68758 for how to set up the device tree to use the RPMsg sample. That PR is currently vastly out of date and I have not yet attempted to fix it, but it should give you an idea of where to start. You will also have to disable caching on the MCU, either by configuring the MPU in code, by setting up the device tree properly, or simply by setting CONFIG_DCACHE=n in your Kconfig (easiest for a proof of concept). I have not yet added that to the sample.
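For a quick proof of concept, the cache workaround is just a one-line Kconfig fragment in the application's `prj.conf` (`CONFIG_DCACHE` is named in the comment above; the rationale in the comment below is a summary, not Zephyr documentation):

```kconfig
# prj.conf fragment: disable the data cache so both cores see a coherent
# view of the shared RAM region. Easiest option for a proof of concept;
# the alternatives are configuring the MPU in code or marking the shared
# region non-cacheable via the device tree.
CONFIG_DCACHE=n
```

Disabling the whole D-cache costs performance everywhere, so for production the MPU/device-tree approach of making only the shared region non-cacheable is preferable.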

I will hopefully also get back to finishing up the MBOX driver for the IPC service to work soon, but I am not sure how long it will take for it to be merged still, and the necessity to disable cache will remain the same for now. I hope to implement a fix for that in ICMsg directly soon.

Hope that helps!

UPDATE: I have made an updated sample available now. It should work out of the box with no modification. Check out PR #71867

RIPHeisenberg commented 5 months ago

Hi Celina,

Thanks for responding to this issue. Our project doesn't currently run in dual-core mode; the effort to get everything running was too high. I managed to use the IPM driver as you described. The problem, however, is that (as you already mentioned) no data can be transported between the two cores. The RPMsg sample was too complicated; might be a skill issue on my side. We have to stay on Zephyr 3.2 because a vendor library doesn't support a more recent version yet, which is very unfortunate. But I am looking forward to trying out your implementation. :D Multicore applications are a very interesting subject.

Best regards

celinakalus commented 4 months ago

Hi @RIPHeisenberg,

I hope you are well. You might have already seen that the MBOX driver is now merged, and I have added two samples to showcase its usage. If you experience any difficulties using my code, feel free to let me know, I will gladly try to help out if I can :)

> Multicore applications are a very interesting subject.

Indeed, they are :)

Kind regards