agittins / bermuda

Bermuda Bluetooth/BLE Triangulation / Trilateration for HomeAssistant
MIT License

Implement trilateration of devices #21

Open agittins opened 1 year ago

agittins commented 1 year ago

This is the primary goal of this integration, but the first milestones to hit are:

The milestones for this feature are:

Extension goals:

jaymunro commented 6 months ago

Some thoughts on this:

agittins commented 6 months ago

Heheh... if there's anything this project has taught me, it's that "ideas are cheap" - it's the implementation where the real cost comes in! :-D

I've been mulling over most of those ideas for quite a while, and I chose the above path because I think it might be the most viable approach. Some of my thought processes on this are:

Anyway, I could talk for ages about the what-ifs and nice-to-haves, but for a while I am going to be limited to:

Something I don't see a lot of in the prior art is addressing the "nature" of RSSI noise. Lots of filtering assumes that noise is symmetrical around the signal, but RSSI has the property that almost all noise (I'm guessing over 99%) adds to the distance. So if you receive a stronger signal, you should assume it's "more accurate" than any weaker signal, because nearly everything that messes with signal strength (propagation) reduces it. The exception is the rare case of multipath (reflections) that happen to arrive in phase - which should be a minuscule fraction, since the odds heavily favour a reflected signal arriving out of phase. I probably need to read up on that a bit if I'm going to pin my entire reasoning on it! :-)
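As a rough illustration of that asymmetry, a filter could trust stronger readings immediately and let weaker ones pull the estimate down only slowly. This is just a sketch of the idea (the function name and decay constant are made up, not Bermuda code):

```python
def asymmetric_filter(readings, decay=0.1):
    """Track RSSI with a fast attack on stronger readings and a slow
    decay on weaker ones, reflecting that noise mostly attenuates."""
    estimate = None
    filtered = []
    for rssi in readings:
        if estimate is None or rssi > estimate:
            # A stronger signal is assumed closer to the truth: adopt it.
            estimate = rssi
        else:
            # A weaker reading may just be occlusion; blend it in slowly.
            estimate += decay * (rssi - estimate)
        filtered.append(estimate)
    return filtered
```

For a burst like [-60, -75, -74, -58, -80] dBm this stays within a few dB of -60 through the weak readings, rather than averaging them in symmetrically.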

I expect that once I make an official release with the recent changes (UUID/iBeacon support in particular) we might get a lot more users coming on board, as tracking android phones I think is a common desired use-case. So I want to get some of the basics bedded-down a bit better for that, like having it recover from restarts/reloads, not causing errors in logs etc - the sort of things that result in more time spent supporting it rather than working on it.

Also I think my documentation is becoming less apt. The readme feels a bit long and unwieldy, so I might need to move some of that info into a separate "manual" and trim the readme down so it gets the basics across in a clear way. I'm always frustrated by projects that are hard to get a solid overview of without digging too deep.

Anyway, thanks for sharing your thoughts. I think we have similar ideas about how to progress to realising the ultimate goal for Bermuda, and we've got a good chance to build it into something pretty cool (actually, I think it's already pretty cool, and that's just in trying to achieve feature-parity with what else is out there).

jaymunro commented 6 months ago
  • We don't get to measure rssi between receivers

There may be some magic happening here then. I am able to add my proxies as 'beacons' in Bermuda and see distances/RSSI from other proxies.

In the screenshot below, the 'Bathroom wall heater switch' is a Shelly Plus1PM controlling a wall heater in the bathroom, as seen from a Shelly 1PMMini in the Office (next room), which itself is marked as present in the 'Hallway' because there is an M5Stack running ESPHome about 0.5m away from it in the Hallway.

The bathroom heater Shelly switch:

Screenshot 2024-03-07 at 2 54 40 PM

The office ceiling light Shelly:

Screenshot 2024-03-07 at 2 57 40 PM

Note: It sees itself as unknown, which makes sense.

Upshot is, there already seems to be a large volume of data on static positions that could potentially be tapped without involving the user, so long as there is a way to identify these as proxies (probably using the MAC addresses of the found 'scanners', I guess).

Do you not see the same?

jaymunro commented 5 months ago

After some research, further notes on this line for when the time comes...

It is indeed possible to perform trilateration without knowing the exact locations of the proxies, as long as we have information about which room each proxy is in. This approach is known as anchor-free or anchor-unknown localization.

The basic idea is to use the distance estimates between the proxies themselves, in addition to the distance estimates between the proxies and the target device, to estimate the relative positions of the proxies and the device within the same coordinate system.

Here's a general outline of the steps to follow:

  1. Distance Matrix Estimation: Estimate the pairwise distances between all the proxies based on the received signal strength indicator (RSSI) or other distance estimation techniques. This will give a distance matrix between all the proxy pairs.

  2. Multidimensional Scaling (MDS): Use a technique called Multidimensional Scaling (MDS) to estimate the relative positions of the proxies in a common coordinate system, based on the distance matrix. MDS tries to find a configuration of points in a low-dimensional space that best preserves the pairwise distances.

  3. Device Localization: With the relative positions of the proxies obtained from MDS, we can then use trilateration or multilateration techniques to estimate the location of the target device relative to the proxies, based on the distance estimates between the device and each proxy.

  4. Room Probability Calculation: Since we know which room each proxy is in, we can convert the distances between the device's estimated location and each room's proxy into a probability of the device being in that room. We can use a probability density function (PDF), such as a Gaussian or exponential distribution, where the probability decreases as the distance from the room centroid increases.
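A minimal sketch of step 4 with a Gaussian fall-off might look like this (the sigma value is an assumed tuning parameter and the function name is illustrative):

```python
import numpy as np

def room_probabilities(distances, sigma=3.0):
    """Turn device-to-proxy distances (metres) into per-room weights
    using a Gaussian fall-off, then normalise them to sum to 1."""
    d = np.asarray(distances, dtype=float)
    weights = np.exp(-0.5 * (d / sigma) ** 2)
    return weights / weights.sum()

probs = room_probabilities([1.5, 9.5, 10.2, 16.7])
```

An exponential distribution could be substituted by replacing the weight expression with np.exp(-d / scale).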

This anchor-free localization approach has the advantage of not requiring any prior knowledge of the proxy locations, but it typically requires more computational effort and may be less accurate than methods that use known proxy locations.

Additionally, we may need to incorporate techniques to handle potential issues like flip ambiguities (reflections in the coordinate system) and translation ambiguities (arbitrary shifts in the coordinate system) that can arise in anchor-free localization.

The specific algorithm and techniques to use will depend on factors such as the number of proxies, the accuracy of the distance estimates, and the complexity of the room layouts.

jaymunro commented 5 months ago

Assuming we can get the Distance Matrix Estimation in metres from each proxy to each other proxy, like I posted on March 7, into an array of distance estimates, an example of the code needed might be:

import numpy as np
from sklearn.manifold import MDS

# Distance matrix between proxies
# Assuming you have estimated the pairwise distances between proxies
# and stored them in distance_matrix (symmetric, with a zero diagonal)
distance_matrix = np.array([[0, 5, 8, 9],
                            [5, 0, 7, 6],
                            [8, 7, 0, 3],
                            [9, 6, 3, 0]])

# Perform MDS directly on the precomputed distance matrix
mds = MDS(n_components=2, dissimilarity='precomputed', random_state=42)
proxy_positions = mds.fit_transform(distance_matrix)

# Print the estimated relative positions of the proxies
print("Estimated relative positions of the proxies:")
for i, pos in enumerate(proxy_positions):
    print(f"Proxy {i+1}: ({pos[0]:.2f}, {pos[1]:.2f})")

In this example:

  1. We assume that we have already estimated the pairwise distances between the proxies and stored them in a symmetric distance_matrix numpy array with a zero diagonal.
  2. We create an instance of the MDS class, specifying that we want to embed the data in a 2-dimensional space (n_components=2). We also set dissimilarity='precomputed' to indicate that we are supplying the distances directly, rather than feature vectors for MDS to compute distances from.
  3. We call the fit_transform method of the MDS instance, passing in the square distance matrix. This runs the MDS algorithm and returns the estimated relative positions of the proxies.
  4. Finally, we print the estimated relative positions of the proxies.
  5. Note we may need to handle potential issues like flip ambiguities and translation ambiguities that can arise in MDS.

Then for the device localisation:

import numpy as np
from scipy.optimize import least_squares

def trilaterate(proxy_positions, distances):
    """
    Trilaterate the position of a device given the relative positions of the proxies
    and the distances between the device and each proxy.

    Args:
        proxy_positions (numpy.ndarray): 2D array of relative positions of the proxies (x, y)
        distances (numpy.ndarray): 1D array of distances between the device and each proxy

    Returns:
        numpy.ndarray: Estimated position of the device (x, y)
    """
    def objective_function(coords, proxy_positions, distances):
        x, y = coords
        residuals = []
        for i in range(len(proxy_positions)):
            x_proxy, y_proxy = proxy_positions[i]
            distance_expected = np.sqrt((x - x_proxy) ** 2 + (y - y_proxy) ** 2)
            residuals.append(distance_expected - distances[i])
        return residuals

    initial_guess = np.mean(proxy_positions, axis=0)
    result = least_squares(objective_function, initial_guess, args=(proxy_positions, distances))
    return result.x

# Example usage
# Assuming you have obtained the relative positions of the proxies using MDS
proxy_positions = np.array([[-2.5, 1.0],
                             [1.5, 2.0],
                             [3.0, -1.5],
                             [-1.0, -2.5]])

# Distances between the device and each proxy (in meters)
distances = np.array([5.0, 4.0, 6.0, 3.5])

# Trilaterate the device's position
device_position = trilaterate(proxy_positions, distances)
print(f"Estimated position of the device: ({device_position[0]:.2f}, {device_position[1]:.2f})")

The result of the example Estimated position of the device: (-3.05, -2.22) represents the estimated coordinates of the device's location in the relative coordinate system established by the Multidimensional Scaling (MDS) step.

In the MDS step, the relative positions of the proxies were determined, and these positions were used to define a new coordinate system. The proxy positions in the example were:

proxy_positions = np.array([[-2.5, 1.0],
                             [1.5, 2.0],
                             [3.0, -1.5],
                             [-1.0, -2.5]])

These proxy positions effectively define the origin and scale of the new coordinate system.

The trilateration step then estimates the device's position within this relative coordinate system, using the distances between the device and each proxy. The estimated position (-3.05, -2.22) means that, relative to the proxy positions and the established coordinate system, the device is located at the coordinates (-3.05, -2.22).

However, this estimated position is not an absolute location in physical space. It's a position within the relative coordinate system defined by the proxy positions: because metric MDS was fed distances in metres, the scale is approximately metres, but the origin and orientation of the axes are arbitrary, and the solution may also be mirrored.

To interpret the estimated device position more meaningfully, you would need to map or transform the relative coordinates to physical coordinates or room boundaries. This could involve techniques like:

  1. Scaling and translating the relative coordinates to match the actual dimensions and layout of the rooms or environment.
  2. Associating each proxy with its known physical location or room, and using this information to map the relative coordinates to physical locations.
  3. Combining the estimated device position with other information, like room centroids or boundaries, to calculate the probability of the device being in each room.

So, while the estimated position (-3.05, -2.22) itself doesn't directly represent a physical location, it serves as an intermediate step in the overall process of locating the device and determining the probability of it being in each room.
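One way to do that mapping, assuming at least a few proxies have known physical coordinates, is a rigid Procrustes fit: centre both point sets, solve for the best rotation/reflection, and re-apply the translation. A sketch (not Bermuda code; it assumes both frames are already in metres, so no rescaling is applied):

```python
import numpy as np
from scipy.linalg import orthogonal_procrustes

def align_to_known(relative, known):
    """Rigidly align MDS output with known physical positions.

    Solves for the rotation/reflection that best maps the centred
    relative positions onto the centred known positions, which also
    resolves the mirror (flip) ambiguity automatically.
    """
    rel_c = relative - relative.mean(axis=0)
    known_c = known - known.mean(axis=0)
    rotation, _ = orthogonal_procrustes(rel_c, known_c)
    return rel_c @ rotation + known.mean(axis=0)

# Example: a relative frame that is the known frame mirrored and shifted.
known = np.array([[0.0, 0.0], [8.0, 0.0], [3.0, 10.0]])
relative = known * np.array([-1.0, 1.0]) + np.array([5.0, -2.0])
aligned = align_to_known(relative, known)
```

The same rotation and translation can then be applied to the device's estimated position to place it on the floor plan.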

jaymunro commented 5 months ago

Putting that all together:

import numpy as np
from scipy.optimize import least_squares
from sklearn.manifold import MDS

def trilateration(distances, positions):
    def error(x, c, r):
        return np.linalg.norm(x - c, axis=1) - r

    l = len(positions)
    S = sum(distances)
    # Approximate initial guess
    x0 = [sum([positions[i][0] * distances[i] / S for i in range(l)]),
          sum([positions[i][1] * distances[i] / S for i in range(l)])]

    res = least_squares(error, x0, args=(positions, distances))
    return res.x

# Proxy data in metres from each other
proxy_names = ['Office', 'Garage', "Living room", 'Gate']
proxy_distances = np.array([[0, 8, 11, 17],
                            [8, 0, 15, 19],
                            [11, 15, 0, 7],
                            [17, 19, 7, 0]])

# Beacon data in meters
beacon_name = "Jay iPhone"
beacon_distances = np.array([2, 9, 10, 17])

# Perform multidimensional scaling
mds = MDS(n_components=2, dissimilarity='precomputed', random_state=42)
proxy_positions = mds.fit_transform(proxy_distances)

# Calculate beacon position using trilateration
beacon_position = trilateration(beacon_distances, proxy_positions)

# Calculate distances from beacon to each proxy
distances_to_proxies = np.linalg.norm(beacon_position - proxy_positions, axis=1)

# Calculate probabilities based on inverse distances
inverse_distances = 1 / distances_to_proxies
probabilities = inverse_distances / np.sum(inverse_distances)

# Print the results
print("Proxy positions:", proxy_positions)
print("Beacon position:", beacon_position)
print("Distances to proxies:", distances_to_proxies)
print("Probabilities:")
for i in range(len(proxy_names)):
    print(f"{proxy_names[i]}: {probabilities[i]:.3f}")

Output from Google Colab is:

Proxy positions: [[  3.26972281   6.20398504]
 [ -4.3259723    8.60916266]
 [  2.83524773  -4.73209981]
 [ -1.77899824 -10.08104789]]
Beacon position: [4.57211838 5.34202899]
Distances to proxies: [ 1.56179463  9.4789335  10.22275848 16.67956783]
Probabilities:
Office: 0.709
Garage: 0.117
Living room: 0.108
Gate: 0.066

EDIT. The above output perfectly matches reality. It is somewhat helped by me choosing proxies far apart from each other, so I am unsure how well it's going to scale with more rooms/areas and proxies that are all closer to each other.

I calculated the proxies' distances from each other using a ruler on a house plan, as the vectors have to somewhat make sense for the equation to be solvable, and then measured the distances from an imaginary location in the office to all 4 proxies.

The proxy_distances array is simply the distances of each proxy to the others; that's why it has zeroes down the diagonal and is a reflection of itself across that diagonal. It easily scales to however many proxies there are.

I plotted the result on a graph and I am amazed at how well it matches my original plan locations, to a T. Well, with the exception that it is mirrored in the horizontal, but I don't think that matters.

Screenshot 2024-04-04 at 12 53 14 AM
jaymunro commented 5 months ago

And here is a version to handle missing data, such as when the signal is too weak.

import numpy as np
from scipy.optimize import least_squares
from sklearn.manifold import MDS

def trilateration(distances, positions):
    def error(x, c, r):
        return np.linalg.norm(x - c, axis=1) - r

    l = len(positions)
    S = sum(distances)
    # Approximate initial guess
    x0 = [sum([positions[i][0] * distances[i] / S for i in range(l)]),
          sum([positions[i][1] * distances[i] / S for i in range(l)])]

    res = least_squares(error, x0, args=(positions, distances))
    return res.x

# Proxy data in metres from each other
proxy_names = ['Office', 'Garage', "Living room", 'Gate']
proxy_distances = np.array([[0, 8, 11, 17],
                            [8, 0, 15, 19],
                            [11, 15, 0, np.nan],  # Missing data
                            [17, 19, np.nan, 0]])

# Beacon data in meters
beacon_name = "Jay iPhone"
beacon_distances = np.array([2, 9, np.nan, 17])  # Missing data

# Handle missing data in proxy distances
proxy_distances_masked = np.ma.masked_invalid(proxy_distances)
proxy_distances_filled = proxy_distances_masked.filled(proxy_distances_masked.mean())

# Perform multidimensional scaling
mds = MDS(n_components=2, dissimilarity='precomputed', random_state=42)
proxy_positions = mds.fit_transform(proxy_distances_filled)

# Handle missing data in beacon distances
beacon_distances_masked = np.ma.masked_invalid(beacon_distances)
valid_indices = np.where(~beacon_distances_masked.mask)[0]
valid_beacon_distances = beacon_distances_masked[valid_indices].data
valid_proxy_positions = proxy_positions[valid_indices]

# Calculate beacon position using trilateration with valid data
beacon_position = trilateration(valid_beacon_distances, valid_proxy_positions)

# Calculate distances from beacon to each proxy
distances_to_proxies = np.linalg.norm(beacon_position - proxy_positions, axis=1)

# Calculate probabilities based on inverse distances
inverse_distances = 1 / distances_to_proxies
probabilities = inverse_distances / np.sum(inverse_distances)

# Print the results
print("Proxy positions:", proxy_positions)
print("Beacon position:", beacon_position)
print("Distances to proxies:", distances_to_proxies)
print("Probabilities:")
for i in range(len(proxy_names)):
    print(f"{proxy_names[i]}: {probabilities[i]:.3f}")

Output here is:

Proxy positions: [[  3.16277882   6.01075486]
 [ -4.15451769   8.49639084]
 [  4.49918298  -4.3901149 ]
 [ -3.50744411 -10.1170308 ]]
Beacon position: [4.37666947 4.82150806]
Distances to proxies: [ 1.69936414  9.28902136  9.21243763 16.89139397]
Probabilities:
Office: 0.681
Garage: 0.125
Living room: 0.126
Gate: 0.069
Screenshot 2024-04-04 at 1 57 11 AM

(md) points are from the set with missing data.

jaymunro commented 5 months ago

And here it is cleaned up a bit, with the potentially resource-heavy MDS function separated out, since the proxies are fixed and so do not need to be recalculated often. I also added a plot and some x/y mirroring flags to help with testing.

import numpy as np
from scipy.optimize import least_squares
from sklearn.manifold import MDS
import matplotlib.pyplot as plt

def calculate_proxy_positions(proxy_distances):
    # Handle missing data in proxy distances
    proxy_distances_masked = np.ma.masked_invalid(proxy_distances)
    proxy_distances_filled = proxy_distances_masked.filled(proxy_distances_masked.mean())

    # Perform multidimensional scaling
    mds = MDS(n_components=2, dissimilarity='precomputed', random_state=42, normalized_stress='auto')
    proxy_positions = mds.fit_transform(proxy_distances_filled)

    # Print the results
    print("Proxy positions:", proxy_positions)
    print()

    return proxy_positions

def localize_beacon(proxy_positions, proxy_names, beacon_name, beacon_distances, mirror_x=False, mirror_y=False):
    def trilateration(distances, positions):
        def error(x, c, r):
            return np.linalg.norm(x - c, axis=1) - r

        l = len(positions)
        S = sum(distances)
        # Approximate initial guess
        x0 = [sum([positions[i][0] * distances[i] / S for i in range(l)]),
              sum([positions[i][1] * distances[i] / S for i in range(l)])]
        res = least_squares(error, x0, args=(positions, distances))
        return res.x

    # Handle missing data in beacon distances
    beacon_distances_masked = np.ma.masked_invalid(beacon_distances)
    valid_indices = np.where(~beacon_distances_masked.mask)[0]
    valid_beacon_distances = beacon_distances_masked[valid_indices].data
    valid_proxy_positions = proxy_positions[valid_indices]

    # Calculate beacon position using trilateration with valid data
    beacon_position = trilateration(valid_beacon_distances, valid_proxy_positions)

    # Calculate distances from beacon to each proxy
    distances_to_proxies = np.linalg.norm(beacon_position - proxy_positions, axis=1)

    # Calculate probabilities based on inverse distances
    # Note: Probabilities are more an indicator of relative proximity than a true statistical probability
    inverse_distances = 1 / distances_to_proxies
    probabilities = inverse_distances / np.sum(inverse_distances)

    # Mirror the positions if requested
    if mirror_x:
        proxy_positions[:, 0] *= -1
        beacon_position[0] *= -1
    if mirror_y:
        proxy_positions[:, 1] *= -1
        beacon_position[1] *= -1

    # Print the results
    print("Beacon name:", beacon_name)
    print("Beacon position:", beacon_position)
    print("Distances to proxies:", distances_to_proxies)
    print("Probabilities:")
    for i in range(len(proxy_names)):
        print(f"{proxy_names[i]}: {probabilities[i]:.3f}")

    # Plot the positions
    plt.figure(figsize=(8, 6))
    proxy_x = [x for x, y in proxy_positions]
    proxy_y = [y for x, y in proxy_positions]
    x_lim = max(abs(min(proxy_x)), abs(max(proxy_x)))
    y_lim = max(abs(min(proxy_y)), abs(max(proxy_y))) * 1.1
    lim = max(x_lim, y_lim)
    for i, (x, y) in enumerate(proxy_positions):
        plt.scatter(x, y, marker='o')
        plt.annotate(proxy_names[i], (x, y), textcoords="offset points", xytext=(0, -15), ha='center')

    beacon_x, beacon_y = beacon_position
    plt.scatter(beacon_x, beacon_y, marker='*', s=200, c='r')
    plt.annotate(beacon_name, (beacon_x, beacon_y), textcoords="offset points", xytext=(0, -15), ha='center')

    # plt.legend()
    plt.xlabel('X')
    plt.ylabel('Y')
    plt.xlim(-lim, lim)
    plt.ylim(-lim, lim)
    plt.gca().set_aspect('equal', adjustable='box')
    plt.title('Positions of Proxies and Beacon')
    plt.show()

    return beacon_position, distances_to_proxies, probabilities

# Proxy data in metres from each other
proxy_names = ['Office', 'Garage', "Living room", 'Gate', 'Kitchen']
proxy_distances = np.array([[ 0.0,  8.0, 11.0, 17.0,  9.5],
                            [ 8.0,  0.0, 15.0, 19.0, 15.0],
                            [11.0, 15.0,  0.0,  7.0,  2.5],
                            [17.0, 19.0,  7.0,  0.0,  9.5],
                            [ 9.5, 15.0,  2.5,  9.5,  0.0]])

proxy_positions = calculate_proxy_positions(proxy_distances)

# Beacon data in meters
beacon_name = "iPhone1"
beacon_distances = np.array([2, 9, 10, np.nan, 8.5])
beacon_position, distances_to_proxies, probabilities = localize_beacon(proxy_positions, proxy_names, beacon_name, beacon_distances, mirror_x=True, mirror_y=True)

beacon_name = "iPhone2"
beacon_distances = np.array([2, 6, 10.5, 16, 9.5])
beacon_position, distances_to_proxies, probabilities = localize_beacon(proxy_positions, proxy_names, beacon_name, beacon_distances, mirror_x=False, mirror_y=False)
agittins commented 5 months ago

Now that's exciting! Amazing stuff :-)

For some reason I hadn't seen your post from March 7, either.

So glad you were able to find the algo's that I was trying to describe! I knew that once we had estimates for each side of each triangle we'd be able to reach a solution, and I was pretty sure numpy or others would have those things but was at a loss of how to find them. This is awesome. MDS is exactly what I was trying to describe previously - once we have a "relative" solution for all the distances, we simply scale against the known real-world datum points.

I also really like that you get a probabilistic result, too - since our measurements are inherently noisy I definitely wanted to be able to return a "radius of confusion" for the estimates.

Re the inter-proxy distances, the Shellys must be acting as beacons, which the esphome proxies do not. I tried setting them up to do so, but in the config at least the beacon and proxy roles are mutually exclusive. I don't think this is an inherent limitation though, just one that esphome hasn't tackled, probably due to not foreseeing a use-case. That's super handy for the Shellys though!

The last week or so I've been thinking that Bermuda might be at about the right stage to start tackling the trilateration, with the core functionality reasonably bedded down and the filtering being probably "good enough", so I'm keen to start working on this - your timing is perfect! :-)

I think I'll start with implementing a trilateration.py that gets fed the filtered distances so that we have a clean(ish) interface that doesn't need to deal with the homeassistant stuff, which might make it easier for you to contribute directly, does that sound like a plan?

I'll also get on to gathering the inter-proxy distances (directly in the case of Shellys, and indirectly via the triangle inequality for others). We might need the facility to tag certain proxies as "mobile" - for example, if a proxy is mounted to a robot vacuum :-) we'd need to exclude it from the static proxies used for the solution.
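The indirect triangle-inequality gathering could work roughly like this: any device seen by two proxies at about the same time bounds the inter-proxy distance from both sides, and many such observations tighten the bracket. A purely illustrative sketch:

```python
def interproxy_bounds(observations):
    """Bound the distance between proxies A and B from devices both see.

    observations: (d_a, d_b) pairs giving one device's estimated distance
    to each proxy at roughly the same moment. The triangle inequality
    gives |d_a - d_b| <= dist(A, B) <= d_a + d_b for every pair.
    """
    lower = max(abs(d_a - d_b) for d_a, d_b in observations)
    upper = min(d_a + d_b for d_a, d_b in observations)
    return lower, upper

lo, hi = interproxy_bounds([(2.0, 7.5), (4.0, 5.0), (6.5, 1.5)])
```

A midpoint of the resulting bracket could seed the distance matrix where no direct measurement exists.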

Thanks again for this amazing work!

jaymunro commented 5 months ago

Great idea to put it in its own .py. I haven't developed an integration myself, and from what little looking into it I have done, it seems quite a thing to wrap your head around. Hat tip to you Ashley.

So yes that would help me and friends give our input. I've made a few very capable friends who've helped with this, particularly a chap named Claude.

The probabilistic result is what I was aiming for, but what we actually have now is a normalised estimation of proximities rather than a true statistical probability. But for the purposes of this project, it should hopefully be sufficient.

For the proxies, maybe this'll be a good excuse to automate some light switches or power points. I find them cheap and very reliable. They report changes (switch, temp, power, etc.) immediately (less than 500ms); I believe the connection is live. The Gen2+ ones have two radios, which can be used for two WiFi connections or other things, but most likely helps with detecting Bluetooth without interrupting the WiFi connection. Highly recommend the Gen2+ models. They've also helped me get a comprehensive picture of live power and energy usage (kids hate it, lol).

Yes, I am the same: I do not see my three ESPHome M5Stacks, but I do see all (I think) the Gen2 Shellys (Gen1 doesn't have Bluetooth) and some others like smart speakers, a water sprinkler, door locks and such. There are actually 110 more devices I can select, but I suspect the majority are people or cars passing by outside the gate. Might be an idea to have a configurable expiration on these things some day.

Point is, people may have many more devices linked to rooms that, although they won't proxy, will aid in trilateration of the rooms through clustering of the devices in a room to get a mean location of the room/area and its boundaries. For example, in my living room I have 2 Shellys and a smart speaker, all up against the room boundaries (walls), so anything that is estimated to be within those beacons can be regarded as actually in the room. More definitive than just near the room.

Hence, it may not be needed but would be good to keep in mind the facility to add other tags, in addition to fixed/mobile, such as against wall vs. inside room.
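The room-clustering idea could be sketched as a centroid plus enclosing radius per room (a crude stand-in for real boundaries; function names are illustrative, not Bermuda code):

```python
import numpy as np

def room_centroid_and_radius(device_positions):
    """Summarise a room by the centroid of its fixed devices and the
    radius that just encloses them all."""
    pts = np.asarray(device_positions, dtype=float)
    centroid = pts.mean(axis=0)
    radius = np.linalg.norm(pts - centroid, axis=1).max()
    return centroid, radius

def inside_room(point, centroid, radius):
    """Crude containment test: within the enclosing radius."""
    return np.linalg.norm(np.asarray(point, dtype=float) - centroid) <= radius

# Four wall-mounted devices roughly outlining a 4m x 3m room.
centroid, radius = room_centroid_and_radius([[0, 0], [4, 0], [4, 3], [0, 3]])
```

A convex hull of the in-room devices would be a tighter boundary, at the cost of a bit more geometry.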

Let me know when you're ready for me to look at the new file or any questions about the code above, of course.

tbrasser commented 5 months ago

Awesome

Hmm I flashed all my shellies (Gen3) with esphome. Following this thread!

jaymunro commented 5 months ago

Ashley, do your esphome proxies see any other static Bluetooth devices in your house like mine do (speakers and such)? If they are in specific rooms, perhaps they can be used to provide some trilateration data too.

First principles tell me that if proxies in specified areas can calculate distances to something in a specific fixed area, that can be used to provide data that would help with the trilateration.

Just trying to think of ways that you could get some distances in your home (in a repeatable way for general users) for testing.

agittins commented 5 months ago

Oh yeah, heaps of stuff. Including neighbours driving past (someone has a corporate vehicle which has an OBD-II interface attached to it).

I deliberately retain all seen bluetooth devices in Bermuda's internal state, as even though a user may only be interested in a small number of devices, I knew that it would be helpful to leverage data from all devices to build a sense of the environment. So the plan was always to keep all adverts seen, so that the data can be used to refine estimates regardless of whether those particular devices are of interest to the user.

It may become helpful to allow the user to tag certain devices as "known-static-devices" to serve as datum-points, but there's also a lot that can be done just by considering the overall "picture" of devices, such as when a proxy gets occluded by something, we'll see many device ranges to that proxy increase, so the algo's can weight that accordingly. Some of this I think will happen for free, some will take deliberate incorporation - but it's been a core idea from the start, and why I've always believed that this could work despite people saying that rssi measurements are too noisy - because we have a lot more measurement points than first appears obvious, we can use that to smooth out some of the ephemeral glitchyness.

I also have the initial bits of doing some "fingerprinting" of devices, so that certain device types can be automatically tagged as being "static" - google home speakers, chromecasts, appleTV, door/window sensors - all those product types can typically be assumed as being physically static in order to establish some fixed points of reference to scale everything else around.

agittins commented 4 months ago

Raised issue in esphome for allowing ble proxies to also act as beacons (thus providing inter-proxy distance estimations) at https://github.com/esphome/feature-requests/issues/2689

nliaudat commented 4 months ago

Some research on BLE-Based Indoor Positioning:

agittins commented 4 months ago

Cheers, I'll take some time to read these properly when I can. The first looks interesting after reading the abstract, as I was thinking that Kalman filters might become useful in the latter stages of the pipeline (I don't think they're well-suited to RSSI values directly, but as a step later in the processing they may help with restoring some latency / semi-predictive aspects).

The second one I'll have to try again when I am less tired and less grumpy - it reads badly, like it was put together by an LLM switching between basic and overly-specific levels of info, and some of the claims are... less than well-supported, like their conclusions about tx power and interval... very dodgy reasoning there, as far as I can see right now. But that might be just me :-) I'll read more closely when I can take the time to process it properly.

Thanks for the tip-off!

asucrews commented 1 month ago

Raised issue in esphome for allowing ble proxies to also act as beacons (thus providing inter-proxy distance estimations) at esphome/feature-requests#2689

looks like this request has been merged

https://github.com/esphome/esphome/pull/6908

Rembock commented 6 days ago

Some interesting developments in Bluetooth 6 with Bluetooth Channel Sounding: https://www.bluetooth.com/channel-sounding-tech-overview/