google-home / sample-apps-for-matter-android

The Google Home Sample App for Matter (GHSA for Matter) uses the Home Mobile SDK to create an Android app that's similar to Google Home.
Apache License 2.0

Matter/Chip libraries in Maven #211

Open jonsmirl opened 3 months ago

jonsmirl commented 3 months ago

https://central.sonatype.com/artifact/com.google.matter/matter-android-demo-sdk/versions

Is there a script to generate this? I would find it more useful to have multiple versions of the artifact: Matter 1.0, Matter 1.1, Matter 1.2, and Matter HEAD. Can you see if the Google Matter developers will put up official release builds in Maven?
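
For reference, pulling the published artifact into a Gradle module looks roughly like this (the version string is a placeholder; the Maven Central link above lists the actual versions):

    dependencies {
        implementation 'com.google.matter:matter-android-demo-sdk:<version>'
    }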

aBozowski commented 3 months ago

Hi Jon,

Thanks for your interest and engagement with the sample!

We're not planning to update matter-android-demo-sdk unless updates to the sample app require us to use a new SDK version. This artifact primarily exists for convenience reasons (as compared to hosting the blobs in the GH source tree).

We don’t want to incorporate the build process into the sample offering because supporting potential SDK build issues in different user environments is out of scope for this project.

Here is one possible strategy to build new library versions, in case it's helpful to you; you might incorporate a similar script into your project's build process. Note that the steps below were executed against 01facfdc41b61779387f904325a5101dea871fb6 (CHIP HEAD at the time of writing); you may need minor changes for earlier versions of the SDK.

Edit these to suit your environment

# Input
chipvers=<Desired SDK hash>
destdir=<The /libs directory of your library in your Android app project>

Find a build container

git clone https://github.com/project-chip/connectedhomeip.git
cd connectedhomeip
git checkout $chipvers
buildimage=$(cat .github/workflows/full-android.yaml | grep chip-build-android | head -n 1 | awk '{print $2}')
echo "Using $buildimage"

# Run the build container
docker run -it --rm \
  --mount source=$(pwd),target=/workspace,type=bind \
  --workdir="/workspace" \
  $buildimage /bin/bash

Within the container:

# Execute build
chown -R $(whoami) /workspace
git submodule update -f --init --recursive
source ./scripts/bootstrap.sh
./scripts/build/build_examples.py \
  --target android-arm-chip-tool \
  --target android-arm64-chip-tool \
  --target android-x86-chip-tool \
  --target android-x64-chip-tool \
  build
# Strip for size
ndkpath="$ANDROID_HOME/.."
ndktgt=$(ls -la $ndkpath | awk '{print $NF}' | sort -r | grep ndk | head -n 1)
ndkfq="$ndkpath/$ndktgt/toolchains/llvm/prebuilt/linux-x86_64/bin/llvm-strip"
builddir="/workspace/examples/android/CHIPTool/app/libs/"
echo "Using ndk at $ndkfq to strip .so"
find $builddir -name "*.so" | xargs $ndkfq

Exit the container and copy libs to project tree

srcdir="connectedhomeip/examples/android/CHIPTool/app/libs/"
cp -R $srcdir $destdir

Then, in your Android library's Gradle config, you can reference the blobs like this:

    sourceSets {
        main {
            jniLibs.srcDirs = ['libs/jniLibs']
        }
    }
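
If the SDK build also drops .jar files next to the .so blobs in that libs directory, a hedged sketch for picking those up as well (plain Gradle, nothing specific to this SDK) would be:

    dependencies {
        // Pick up any Java archives copied alongside the jniLibs; adjust the path as needed.
        implementation fileTree(dir: 'libs', include: ['*.jar'])
    }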

The topic of an official release exceeds the scope of this sample app project, but we will work to ensure your feedback is noted where relevant.

law-ko commented 2 weeks ago

@aBozowski We are trying to read our own MEI attributes and it seems like we would need to regenerate the Matter lib, so we followed this guide. Do we still need to create the Docker container as quoted above in order to generate the jniLibs?

> [quotes the "Find a build container" steps and Gradle snippet from the previous comment]

jonsmirl commented 2 weeks ago

Using Docker makes the build more reproducible, but it is not required.

It is also not required to rebuild the library in order to read MEI clusters. I use reporting to read the devices. The custom attributes get reported like normal attributes (as AttributeState), but they don't get decoded. Example of manually decoding:

    fun decodeSensors(deviceId: Long, attrib: AttributeState) {
        val sensor = mutableListOf<Sensor>()
        TlvReader(attrib.tlv).apply {
            enterArray(AnonymousTag)
            while (!isEndOfContainer()) {
                enterStructure(AnonymousTag)
                val s = getUtf8String(ContextSpecificTag(1))
                val c = getUInt(ContextSpecificTag(2))
                val r = getUInt(ContextSpecificTag(3))
                exitContainer()
                Timber.d("Sensor Decoded %s %s %s", s, c, r)
                sensor += Sensor(deviceId, s, c, r)
            }
            exitContainer()
        }
        // ... use the data ...
    }

This is the same thing as writing XML for the cluster and then recompiling the CHIP lib; I just did it manually to avoid maintaining a modified CHIP library. The same process works for writing attributes.
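
For the reverse direction, here is a minimal sketch of encoding such an attribute, assuming the TlvWriter in the same chip.tlv package mirrors the reader calls above (put/startStructure/startArray/getEncoded); the Sensor field names and tag layout are the same hypothetical ones used in decodeSensors:

    fun encodeSensors(sensors: List<Sensor>): ByteArray =
        TlvWriter().apply {
            startArray(AnonymousTag)
            for (s in sensors) {
                startStructure(AnonymousTag)
                put(ContextSpecificTag(1), s.label)    // UTF-8 string (tag 1)
                put(ContextSpecificTag(2), s.cluster)  // unsigned int (tag 2)
                put(ContextSpecificTag(3), s.reading)  // unsigned int (tag 3)
                endStructure()
            }
            endArray()
        }.getEncoded()

The resulting byte array is what a write-attribute request would carry for the custom attribute.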

It would really be useful to have a small tool that takes an IDL description of the cluster and generates the appropriate snippet of Kotlin or C++, but no such standalone tool exists. Instead, just look at what ZAP generates in the zzz directories and build your version of it by hand. Once you write this code, it rarely changes.

jonsmirl commented 2 weeks ago

It would be useful if this sample included an example of manually decoding a cluster. Note that you can manually decode the standard clusters too. AttributeState.tlv is there for every cluster you read. You can also access AttributeState.json to read/write in JSON format.
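
As a concrete example of that, here is a minimal sketch for a standard attribute whose value is a single UTF-8 string (for instance Basic Information's VendorName), using the same TlvReader as above:

    // Hedged sketch: for numeric attributes the matching getUInt/getLong/etc. calls apply.
    fun decodeStringAttribute(attrib: AttributeState): String =
        TlvReader(attrib.tlv).getUtf8String(AnonymousTag)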

law-ko commented 1 week ago

> [quotes jonsmirl's previous comment on manually decoding the TLV]

@jonsmirl We are trying to implement onReport and onError based on readAttribute, but it seems like execution never falls into either of those callbacks. Any tips on how you got the report from readAttribute? Also, where did you put your readAttribute call? We want to run it during the pairing stage, when we add the device to our own fabric.

Here's our attempt:

    suspend fun readAttribute(devicePtr: Long, attributePath: ChipAttributePath): AttributeState? {
        return readAttributes(devicePtr, listOf(attributePath))[attributePath]
    }

    /** Wrapper around [ChipDeviceController.readAttributePath] */
    suspend fun readAttributes(
        devicePtr: Long,
        attributePaths: List<ChipAttributePath>
    ): Map<ChipAttributePath, AttributeState> {
        return suspendCoroutine { continuation ->
            val callback: ReportCallback =
                object : ReportCallback {
                    override fun onError(
                        attributePath: ChipAttributePath?,
                        eventPath: ChipEventPath?,
                        e: Exception?
                    ) {
                        Timber.i("ReadAttributes Fail")
                        continuation.resumeWithException(
                            IllegalStateException(
                                "readAttributes failed",
                                e
                            )
                        )
                    }

                    override fun onReport(nodeState: NodeState?) {
                        val states: HashMap<ChipAttributePath, AttributeState> = HashMap()
                        Timber.i("ReadAttributes Success")
                        for (path in attributePaths) {
                            var endpoint: Int = path.endpointId.id.toInt()
                            states[path] =
                                nodeState!!
                                    .getEndpointState(endpoint)!!
                                    .getClusterState(path.clusterId.id)!!
                                    .getAttributeState(path.attributeId.id)!!
                        }
                        continuation.resume(states)
                    }

                    override fun onDone() {
                        super.onDone()
                    }
                }
            chipDeviceController.readAttributePath(callback, devicePtr, attributePaths)
        }
    }
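
A call site for the wrapper above might look roughly like this; the endpoint/cluster/attribute IDs are placeholders for a custom (MEI) cluster, and ChipAttributePath.newInstance / ChipPathId.forId are assumed to be available from the CHIP Java model classes:

    // Hypothetical usage (inside a coroutine); all IDs below are placeholders.
    val path = ChipAttributePath.newInstance(
        ChipPathId.forId(1L),           // endpoint
        ChipPathId.forId(0xFFF1FC01L),  // custom (MEI) cluster ID
        ChipPathId.forId(0x0000L)       // attribute ID
    )
    val state = readAttribute(devicePtr, path)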

jonsmirl commented 1 week ago

Look in the chip directory; there are subscribeToAttribute() and SubscriptionHelper(), which can subscribe to any cluster, MEI included.

This will subscribe to everything:

class SubscriptionHelper @Inject constructor(private val chipClient: ChipClient) {

    suspend fun awaitSubscribeToPeriodicUpdates(
        connectedDevicePtr: Long,
        subscriptionEstablishedCallback: SubscriptionEstablishedCallback,
        resubscriptionAttemptCallback: ResubscriptionAttemptCallback,
        reportCallback: ReportCallback
    ) {
        return suspendCoroutine { continuation ->
            Timber.d("subscribeToPeriodicUpdates()")
            val endpointId = ChipPathId.forWildcard()
            val clusterId = ChipPathId.forWildcard()
            val attributeId = ChipPathId.forWildcard()
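            // ... (snippet truncated in the original comment; see SubscriptionHelper.kt
            // in the sample app's chip directory for the full implementation)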

mattluoint commented 1 week ago

> [quotes jonsmirl's SubscriptionHelper comment and snippet above]

We successfully implemented onReport and onError based on readAttribute, but the NodeState getEndpointState() value is null. Do you have any ideas about what might be the cause of the problem?

jonsmirl commented 1 week ago

The source code to the Java/Kotlin libraries is in the CHIP tree. You can look at the source to see what is happening. I used SubscriptionHelper from the example without needing to change it.

law-ko commented 1 week ago

> It would be useful if this sample included an example of manually decoding a cluster. Note that you can manually decode the standard clusters too. AttributeState.tlv is there for every cluster you read. You can also access AttributeState.json to read/write in JSON format.

It appears that our custom cluster returns null while the default ones (like OnOff) do not. Does this mean the CHIPTool lib needs to be rebuilt because our custom cluster does not exist in the default CHIPTool lib?

jonsmirl commented 1 week ago

You need to decode the TLV on the AttributeState. It is also available as JSON via attrib.json.

    fun decodeSensors(deviceId: Long, attrib: AttributeState) {
        val sensor = mutableListOf<Sensor>()
        TlvReader(attrib.tlv).apply {
            enterArray(AnonymousTag)
            while (!isEndOfContainer()) {
                enterStructure(AnonymousTag)
                val s = getUtf8String(ContextSpecificTag(1))
                val c = getUInt(ContextSpecificTag(2))
                val r = getUInt(ContextSpecificTag(3))
                exitContainer()
                Timber.d("Sensor Decoded %s %s %s", s, c, r)
                sensor += Sensor(deviceId, s, c, r)
            }
            exitContainer()
        }
    }

law-ko commented 1 week ago

> [quotes jonsmirl's decodeSensors snippet above]

@jonsmirl It appears that when I try to getEndpointState for my custom cluster it returns null, whereas if I read an OnOff cluster it is fine.

The problem is that since I get null from getEndpointState, I am unable to do the decoding.

jonsmirl commented 1 week ago

I use a wildcard subscription for the endpoint, and then I am able to decode it like this (LOWPAN_CLUSTER_BLE is a custom cluster). I did need to add locking because these callbacks are not on the main app thread; device.update { ... } acquires a lock. Maybe your subscription isn't matching?

[screenshot]
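
A rough sketch of the approach described above (not the code from the screenshot; the cluster/attribute IDs and the device.update { } lock helper are placeholders), using the NodeState accessors shown earlier in this thread:

    private val LOWPAN_CLUSTER_BLE = 0xFFF1FC20L  // hypothetical MEI cluster ID
    private val SENSOR_ATTRIBUTE_ID = 0x0000L     // hypothetical attribute ID

    override fun onReport(nodeState: NodeState) {
        val attrib = nodeState.getEndpointState(1)
            ?.getClusterState(LOWPAN_CLUSTER_BLE)
            ?.getAttributeState(SENSOR_ATTRIBUTE_ID)
            ?: return
        // Callbacks arrive off the main app thread, so mutate shared state under a lock.
        device.update { decodeSensors(deviceId, attrib) }
    }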

law-ko commented 1 week ago

> [quotes jonsmirl's comment above about the wildcard subscription and locking]

The wildcard does not include my custom cluster ID in it. Any idea why?

[screenshot]

jonsmirl commented 1 week ago

Look in your Android log and make sure your device is really reporting it. The reports are printed in detail in the logs.

law-ko commented 1 week ago

> Look in your Android log and make sure your device is really reporting it. The reports are printed in detail in the logs.

If I use a wildcard, the log does not print it out; if I explicitly specify the clusterId of my custom cluster, then I do see the CHIPTool JSON.

We can confirm that the device is reporting it, since we tested with CHIPTool and it also prints the JSON in its log.

jonsmirl commented 1 week ago

The logs print all report messages from the devices. Search for where it reports the other clusters on that endpoint.

You could have an error in the TLV. If there is an error it won't report in Android. But there will be a log message.

law-ko commented 1 week ago

> The logs print all report messages from the devices. Search for where it reports the other clusters on that endpoint.
>
> You could have an error in the TLV. If there is an error it won't report in Android. But there will be a log message.

This is the log where it prints out correctly: [screenshot]

law-ko commented 1 week ago

> The logs print all report messages from the devices. Search for where it reports the other clusters on that endpoint.
>
> You could have an error in the TLV. If there is an error it won't report in Android. But there will be a log message.

If we have an error in the TLV, would it cause the cluster to not show in the endpointState?

[screenshot]

jonsmirl commented 1 week ago

It will print an error in the logs about invalid TLV and it won't decode it.

This is my custom cluster and it works using the code above. My data is an array of structures. [screenshot]

jonsmirl commented 1 week ago

Note I am using SubscriptionHelper and you are using something different.

law-ko commented 1 week ago

> It will print an error in the logs about invalid TLV and it won't decode it.
>
> This is my custom cluster and it works using the code above. My data is an array of structures. [screenshot]

Do you need to define your custom cluster attributes somewhere in the Android Studio project so it knows the datatypes? I see your log has the datatype in brackets right after the data value. Note I am not using any custom CHIPTool lib at this point.

jonsmirl commented 1 week ago

TLV encodes the data types in its headers.
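
For example (the exact byte values here are my reading of the Matter TLV encoding, so treat them as an assumption), a one-octet unsigned integer with context tag 2 and value 5 is carried as three bytes:

    0x24 0x02 0x05
     |    |    '-- value: 5
     |    '------- tag: context-specific tag 2
     '------------ control byte: context-specific tag form + unsigned integer, 1 octet

so a reader can tell the element's type from the control byte alone, without an external schema.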

law-ko commented 1 week ago

> TLV encodes the data types in its headers.

So this is not the JSON payload that CHIPTool prints out, but output from your TLV decoder function?

jonsmirl commented 1 week ago

Somewhere down inside the Android library it is printing that. I don't know where. It is getting printed when the library decodes the TLV.

law-ko commented 4 days ago

@jonsmirl We are still having trouble reading the custom cluster ID. Where did you put the subscribe or read-attribute call in the sample app?

jonsmirl commented 4 days ago

Note that each reportCallback is unique, so you have to track one for each device.

    private suspend fun addDevice(deviceId: Long) {
        val reportCallback = object : SubscriptionHelper.ReportCallbackForDevice(deviceId) {
            override fun onReport(nodeState: NodeState) {
                //super.onReport(nodeState)
                Timber.d(
                    "ReportCallbackForDevice %x", deviceId
                )
                // do stuff with the NodeState here
            }
        }

        try {
            val connectedDevicePointer = chipClient.getConnectedDevicePointer(deviceId)
            update { devices ->
                devices[deviceId] = Device(reportCallback)
            }
            Timber.d(
                "JDS add device %x, count %d", deviceId, _devices.size
            )
            subscriptionHelper.awaitSubscribeToPeriodicUpdates(
                connectedDevicePointer, SubscriptionHelper.SubscriptionEstablishedCallbackForDevice(
                    deviceId
                ), SubscriptionHelper.ResubscriptionAttemptCallbackForDevice(
                    deviceId
                ), reportCallback
            )
        } catch (e: IllegalStateException) {
            Timber.e("Can't get connectedDevicePointer for ${deviceId}.")
        }
    }