AndersMalmgren / FreePIE

Programmable Input Emulator

YEI 3-Space Sensor accelerometer and magnetic axis #37

Open artimage25 opened 10 years ago

artimage25 commented 10 years ago

I see that I can output the pitch, yaw and roll, and the two buttons, of the YEI sensor. But how can I access the three accelerometer axes (x, y, z) and the magnetic sensor? I believe that to replicate any real movement it is necessary to mix the signals of the gyros and the accelerometers, for example to see if a person goes crouch or prone.

AndersMalmgren commented 10 years ago

Maybe use this to get corrected (calibrated?) sensor values

//Corrected Data Commands
//  37(0x25)
 /*****************AutoGenerated*************//**
 * \brief Get all corrected component sensor data
 *
 * Returns the corrected gyro rate vector, accelerometer vector, and compass vector. Note that the gyro vector is in units of radians/sec, the accelerometer vector is in units of G, and the compass vector is in units of gauss. 
 *
 * Input:
 * None
 *
 * Output:
 * Gyro Rate in units of radians/sec (Vector x3), Acceleration Vector in units of G (Vector x3), Compass Vector in units of gauss (Vector x3)
 ********************************************/
TSS_EXPORT TSS_Error tss_getAllCorrectedComponentSensorData(TSS_Device_Id device, float * gyro_rate3, float * accelerometer3, float * compass3, unsigned int * timestamp);
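For prototyping outside the FreePIE plugin, the corrected-data call above could be marshalled from Python with ctypes along these lines. This is only a sketch: the DLL name is commented out, the assumption that `TSS_NO_ERROR` is 0 is unverified, and `read_corrected` and `_FakeApi` are hypothetical helper names, not part of the SDK.

```python
import ctypes

# float* arguments pointing at three floats each
Vec3 = ctypes.c_float * 3

def read_corrected(api, device_id):
    """Call tss_getAllCorrectedComponentSensorData and return
    (gyro, accel, compass, timestamp) as plain Python tuples."""
    gyro, accel, compass = Vec3(), Vec3(), Vec3()
    timestamp = ctypes.c_uint()
    err = api.tss_getAllCorrectedComponentSensorData(
        device_id, gyro, accel, compass, ctypes.byref(timestamp))
    if err != 0:  # assumption: 0 == TSS_NO_ERROR
        raise RuntimeError("TSS_Error %d" % err)
    return tuple(gyro), tuple(accel), tuple(compass), timestamp.value

# With hardware you would load the real library, e.g.:
# api = ctypes.CDLL("ThreeSpace_API.dll")  # DLL name is an assumption

# Quick self-check with a stand-in for the DLL (no hardware needed):
class _FakeApi:
    def tss_getAllCorrectedComponentSensorData(self, dev, gyro, accel,
                                               compass, ts):
        accel[0], accel[1], accel[2] = 0.0, 0.0, 1.0  # flat: 1 G on Z
        return 0

gyro, accel, compass, ts = read_corrected(_FakeApi(), 0)
```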

Or this one to get unfiltered raw data

//Raw Data Commands
//  64(0x40)
 /*****************AutoGenerated*************//**
 * \brief Get all raw component sensor data
 *
 * Returns the raw gyro rate vector, accelerometer vector and compass vector as read directly from the component sensors without any additional post-processing. The range of values is dependent on the currently selected range for each respective sensor.
 *
 * Input:
 * None
 *
 * Output:
 * Gyro Rate in counts per degrees/sec (Vector x3), Acceleration Vector in counts per g (Vector x3), Compass Vector in counts per gauss (Vector x3)
 ********************************************/
TSS_EXPORT TSS_Error tss_getAllRawComponentSensorData(TSS_Device_Id device, float * gyro_rate3, float * accelerometer3, float * compass3, unsigned int * timestamp);

Someone who owns a YEI device has to implement this.

AndersMalmgren commented 10 years ago

@baggyg maybe you can help out here?

baggyg commented 10 years ago

As far as I am aware, what is output is the orientation based upon the various sensor types within the sensor suite. The magnetometer and gyro aren't there to detect positional movement but to add accuracy to the overall reading. The sensor is 3DOF, not 6DOF. You could output the raw data, but I think that would cause more problems than it solves, since you would lose the sensor suite's automatic removal of "fuzzy" raw data.

This would potentially change if you had two or more sensors (as the PrioVR suit will have), because you could then use distance assumptions together with angles and inverse kinematics to determine their positions relative to each other.
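To illustrate the two-sensor idea: with orientation readings from, say, a thigh and a shin sensor plus known segment lengths, plain forward kinematics already gives the foot's position relative to the hip. The 2D sagittal-plane simplification and the segment lengths here are assumptions for the sketch, not anything the SDK provides.

```python
import math

def foot_position(hip_pitch_deg, knee_pitch_deg,
                  thigh_len=0.45, shin_len=0.43):
    """2D forward kinematics in the sagittal plane.

    Angles are measured from vertical; knee pitch is relative to the
    thigh. Returns (x, y) of the foot relative to the hip in metres.
    """
    a1 = math.radians(hip_pitch_deg)
    a2 = math.radians(hip_pitch_deg + knee_pitch_deg)
    x = thigh_len * math.sin(a1) + shin_len * math.sin(a2)
    y = -(thigh_len * math.cos(a1) + shin_len * math.cos(a2))
    return x, y
```

Standing straight (both angles 0) puts the foot directly below the hip at the full leg length; swinging the whole leg to horizontal moves it forward by the same amount.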

Or perhaps I have completely misinterpreted what you are asking....

I suppose if you were to compare them you might be able to conclude something about the person's position, but all output is around the three axes. I don't think a vertical movement downward would register on any of the sensors.

artimage25 commented 10 years ago

Hello @baggyg. In my project, for example, I want to detect when a person goes crouch or prone. I have a knee pad where I put one YEI 3-Space sensor. It calculates the angle of my knee, but I also want to use one of the accelerometer axes to tell whether the person bending the knee is going down or up, to avoid errors. Also, in the same way the Wiimote works, I want to detect whether a person is running, and I believe the only way is with the accelerometers, because I want to use at most one or two sensors to replicate those movements. Accuracy is not important; what matters is the direction of the applied force and its magnitude. The magnetic sensor, I believe, is more useful in augmented reality, to know the real-world heading. But the accelerometer values are more important. Thanks and greetings.
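The down/up check being described could be sketched like this: with corrected accelerometer data in G (so gravity reads about 1 G at rest), the sign of the deviation along the vertical axis hints at whether the knee-mounted sensor is dropping or rising. The axis choice, the rest value, and the threshold are all assumptions for illustration.

```python
def vertical_motion(accel_z_g, rest_g=1.0, threshold=0.15):
    """Classify one vertical accelerometer reading (in G).

    Less than gravity suggests the body is starting to drop
    (crouching); more than gravity suggests pushing up.
    """
    delta = accel_z_g - rest_g
    if delta < -threshold:
        return "down"
    if delta > threshold:
        return "up"
    return "still"
```

A real script would low-pass filter the samples first; a single noisy reading crossing the threshold should not count as movement.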

sthalik commented 10 years ago

@baggyg @AndersMalmgren Isn't existing AHRS code enough for sensor fusion?

What else is there as an issue?

AndersMalmgren commented 10 years ago

The YEI SDK has built-in AHRS fusion, but AHRS covers the rotation vector only. @artimage25 wants to use the accelerometer sensor to do some crude gesture detection. The SDK has a bunch of methods; one is tss_getAllCorrectedComponentSensorData. My guess is it calibrates the readings to values set at the factory, for example so that the axis facing the floor returns 1 G (roughly 9.8 m/s²).

AndersMalmgren commented 10 years ago

It would be cool to have some kind of gesture recorder/detector in FreePIE.

sthalik commented 10 years ago

Good luck with that! /sarcasm

baggyg commented 10 years ago

Having used similar things, even with Kinect sensors, I would agree this tends to turn out "not good". In this instance I would be tempted to prototype something in Python before committing anything to the code.

I still do not think the YEI sensor will give adequate readings, at least for what artimage is trying to do. A delta vector of sorts is probably already achievable just with the current sensor readings.
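The delta-vector point could be sketched as follows: frame-to-frame differences of the yaw/pitch/roll values FreePIE already exposes give crude motion cues without touching the raw accelerometer. The wrapper class itself is hypothetical, not part of the plugin.

```python
class DeltaTracker:
    """Track frame-to-frame change in (yaw, pitch, roll)."""

    def __init__(self):
        self.prev = None

    def update(self, yaw, pitch, roll):
        """Feed the current angles; returns the per-frame delta."""
        cur = (yaw, pitch, roll)
        if self.prev is None:
            delta = (0.0, 0.0, 0.0)  # first frame: no motion yet
        else:
            delta = tuple(c - p for c, p in zip(cur, self.prev))
        self.prev = cur
        return delta
```

In a FreePIE script this would be fed each update from the yei plugin's orientation values; a fast pitch delta on a knee sensor, for instance, already distinguishes bending from holding still.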

sthalik commented 10 years ago

Get a few raw readings (not doing desired pose, doing desired pose) and go from there.

AndersMalmgren commented 10 years ago

My guess is that it's as easy as adding a call to tss_getAllCorrectedComponentSensorData from this method:

https://github.com/AndersMalmgren/FreePIE/blob/master/FreePIE.Core.Plugins/Yei3Space/Api.cs#L119

artimage25 commented 10 years ago

Maybe ChrisAtYeiTech on the MTBS3D FreePIE forum can also help us with these questions?