Hi!
In a project of mine I will be using a 9-axis IMU (accelerometer, gyroscope, and magnetometer) that will provide me with data on the sensor's linear acceleration, angular velocity, and the surrounding magnetic field.
I know that most smartphones contain the same kind of 9-axis sensor package, which is why I wanted to code an Android app first. However, I'm confused by the sensor types that exist in the Android API.
More specifically, I am unsure what kind of data the API gives me, and even more so how the API acquires that data, especially for TYPE_ROTATION_VECTOR. The documentation says that it measures the orientation of the device, and that the data can originate either in software or in hardware. However, I thought that hardware sensors can only measure angular velocity, not absolute orientation, so I am unsure how much the raw hardware data has already been processed by the time I receive it.
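To make my question concrete: as far as I understand, TYPE_ROTATION_VECTOR delivers the components of a unit quaternion, which the app can expand into a rotation matrix. Here is a plain-Java sketch of what I believe SensorManager.getRotationMatrixFromVector does internally (the class name and the sample values in main are my own, made up for illustration):

```java
// Sketch: expand one rotation-vector sample (a unit quaternion) into a
// row-major 3x3 rotation matrix. This mirrors the standard
// quaternion-to-matrix formula; I assume this is roughly what
// SensorManager.getRotationMatrixFromVector computes.
public class RotationVectorDemo {

    // values[] as I understand TYPE_ROTATION_VECTOR delivers it:
    // {x*sin(t/2), y*sin(t/2), z*sin(t/2), cos(t/2)} for a rotation
    // by angle t around the unit axis (x, y, z).
    static float[] toRotationMatrix(float[] v) {
        float x = v[0], y = v[1], z = v[2], w = v[3];
        float[] m = new float[9];
        m[0] = 1 - 2*y*y - 2*z*z; m[1] = 2*x*y - 2*z*w;     m[2] = 2*x*z + 2*y*w;
        m[3] = 2*x*y + 2*z*w;     m[4] = 1 - 2*x*x - 2*z*z; m[5] = 2*y*z - 2*x*w;
        m[6] = 2*x*z - 2*y*w;     m[7] = 2*y*z + 2*x*w;     m[8] = 1 - 2*x*x - 2*y*y;
        return m;
    }

    public static void main(String[] args) {
        // Identity rotation: zero vector part, w = 1 -> matrix diagonal of ones
        float[] m = toRotationMatrix(new float[]{0f, 0f, 0f, 1f});
        System.out.println(m[0] + " " + m[4] + " " + m[8]);
    }
}
```

My question is essentially about what happens before this point: whether that quaternion comes from a dedicated hardware unit or from fusion in software.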
I am thus pretty sure that I have misunderstood how either the API or the hardware works, and would therefore be grateful if someone could explain to me how the pipeline Sensor [type/name] -> Raw Data [units] -> Android Phone [processing] -> Android API [what data do I get] works, for three kinds of data: acceleration, rotation, and magnetic field.
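To show the kind of answer I am hoping for, here is my rough guess at the software half of that pipeline: fusing one accelerometer sample with one magnetometer sample into an orientation matrix, in the spirit of SensorManager.getRotationMatrix. This is plain Java, the class name is my own, and the sensor values in main are hypothetical:

```java
// Sketch of the fusion step I assume Android performs: combine one
// accelerometer sample (gravity, in m/s^2) with one magnetometer sample
// (in microtesla) into a row-major 3x3 rotation matrix whose rows point
// east, north, and up in device coordinates.
public class OrientationDemo {

    static float[] rotationMatrix(float[] gravity, float[] geomagnetic) {
        float Ax = gravity[0], Ay = gravity[1], Az = gravity[2];
        float Ex = geomagnetic[0], Ey = geomagnetic[1], Ez = geomagnetic[2];
        // H = E x A: horizontal, pointing east
        float Hx = Ey*Az - Ez*Ay, Hy = Ez*Ax - Ex*Az, Hz = Ex*Ay - Ey*Ax;
        float normH = (float) Math.sqrt(Hx*Hx + Hy*Hy + Hz*Hz);
        Hx /= normH; Hy /= normH; Hz /= normH;
        // Normalize gravity: points up (away from the Earth)
        float invA = 1f / (float) Math.sqrt(Ax*Ax + Ay*Ay + Az*Az);
        Ax *= invA; Ay *= invA; Az *= invA;
        // M = A x H: horizontal, pointing north
        float Mx = Ay*Hz - Az*Hy, My = Az*Hx - Ax*Hz, Mz = Ax*Hy - Ay*Hx;
        return new float[]{Hx, Hy, Hz, Mx, My, Mz, Ax, Ay, Az};
    }

    public static void main(String[] args) {
        // Hypothetical samples: device lying flat, top edge toward magnetic
        // north, with the field dipping into the ground.
        float[] R = rotationMatrix(new float[]{0f, 0f, 9.81f},
                                   new float[]{0f, 22f, -41f});
        double azimuth = Math.atan2(R[1], R[4]); // 0 rad = facing north
        System.out.println(azimuth);
    }
}
```

If this is roughly what the platform does in software, I would like to confirm it, and to understand what the equivalent hardware path looks like.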
Thanks a lot!