riansanderson edited this page May 22, 2014 · 7 revisions

Note: if you have additional questions not covered below, feel free to submit them to the OSP forum.

FAQ

Getting Started

Can I use this source code in a commercial project?

Yes. This Open Sensor Platform project is open sourced for your use under the Apache License Version 2.0. This is the same license as the Android Open Source Project (AOSP) for the same reasons.

How can I access the latest documentation?

After downloading the source, you can build the latest documentation using doxygen:

~> cd docs/
~> doxygen Doxygen.in
~> sensible-browser generated-docs/html/index.html

or get the latest generated PDF here.

How does software versioning work in this project?

Stable tested builds are tagged with a version using the format major.minor.bugfix where the 'major' version number represents updates to the API that are potentially incompatible with the previous release. The 'minor' version number represents updates to existing functionality, or addition of a new capability or API that does not break the existing API. The 'bugfix' number increments as bug fixes are deemed necessary to incorporate into the release, with no change to the API.

Contributions will be staged for merge into the master branch after review. As they are merged, the revision may increment.

If you are writing applications that might depend on the version, be sure to cross check the version using OSP_GetVersion().
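
The compatibility rule above can be sketched in C. Note the OspVersion_t layout and helper below are illustrative assumptions, not the actual OSP_GetVersion() API; see include/osp-api.h for the real definitions:

```c
#include <stdint.h>

/* Hypothetical version structure; the real layout returned by
 * OSP_GetVersion() is defined in include/osp-api.h. */
typedef struct {
    uint8_t major;   /* incremented on API-breaking changes      */
    uint8_t minor;   /* incremented on compatible additions      */
    uint8_t bugfix;  /* incremented on bug fixes, API unchanged  */
} OspVersion_t;

/* An application built against one version is compatible with any
 * runtime reporting the same major version and an equal-or-newer
 * minor version; a different major version may break the API. */
static int osp_version_compatible(OspVersion_t built, OspVersion_t runtime)
{
    if (built.major != runtime.major)
        return 0;
    return runtime.minor >= built.minor;
}
```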

How can I try the embedded side?

The currently supported platforms are in the embedded/projects/ directory. There is a more detailed explanation for the sample project.

How can I try the Android side?

First [create the Linux kernel] with the necessary drivers to bring sensor data into the application processor.

Next, download the Android NDK and toolchain, or create a toolchain for the version and target of interest. Then build sensorhubd using android-cmake. This can be done out-of-tree, for example, as follows:

~> mkdir osp-builds/
~> cd osp-builds/
~> android-cmake ~/open-sensor-platforms/linux/sensorhubd/
~> make

Then [compile the sensor Android HAL] into a sensors library.

How can I license the FreeMotion Library?

Although not required, OSP is compatible with third-party libraries such as Sensor Platforms' FreeMotion Library. Please contact us for more information.

Utilizing OSP

Where can I get hardware that OSP runs on?

The initial release of OSP runs on the first-generation Nexus 7 (Asus ME370T, aka grouper, aka nakasi, available on Amazon) connected over I2C to OSP firmware running on an STEVAL-MKI119V1 (available at Digikey and other distributors).

Support for more Android platforms and sensor hub boards is in progress.

How is the code organized?

There are two main directories: embedded/ and linux/. They are related to each other but can be used independently. All embedded code is ANSI C99-compliant C for portability across various embedded compilers. The Linux code is mostly C++ for extensibility and uses the Android framework. An include/ directory holds header files shared between the embedded and Linux code, and the external/ directory contains third-party open source code needed to run certain examples.

What is the difference between sensor descriptors, types, attributes, and data conventions?

The Sensor Descriptor collects all information related to a sensor:

  1. Sensor type such as accelerometer or gyroscope,
  2. Data convention such as raw (LSBs) or Android (in dimensionful units),
  3. Attributes such as uncalibrated or calibrated, which are defined in the flags,
  4. Supported output data rates,
  5. Pointers to callback functions invoked when output is ready, calibration is to be written, or sensor control is to be sent,
  6. Any platform- or vendor-specific information, such as further description of input sensors like axes mapping and vendor identification.

This struct represents the data from any type of sensor in the OSP framework.
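
A simplified, hypothetical sketch of such a descriptor is shown below. The field names and types are illustrative only; the real struct in include/osp-api.h carries more fields, including all three callback pointers:

```c
#include <stdint.h>

/* Illustrative sensor kinds and data conventions (item 1 and 2 above). */
typedef enum { SENSOR_ACCELEROMETER, SENSOR_GYROSCOPE } SensorKind_t;
typedef enum { DATA_CONVENTION_RAW, DATA_CONVENTION_ANDROID } DataConvention_t;

/* One of the callback types (item 5); invoked when output is ready. */
typedef void (*OutputReadyCb_t)(const void *data);

/* Hypothetical, simplified descriptor collecting items 1-6. */
typedef struct {
    SensorKind_t      type;          /* e.g. accelerometer or gyroscope   */
    DataConvention_t  convention;    /* raw LSBs vs. Android units        */
    uint32_t          flags;         /* attributes, e.g. a CALIBRATED bit */
    uint32_t          outputRateHz;  /* a supported output data rate      */
    OutputReadyCb_t   onOutputReady; /* invoked when output is ready      */
    void             *platformData;  /* vendor/platform-specific info     */
} SensorDescriptorSketch_t;
```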

What about composite or virtual sensors?

Composite or virtual sensors are algorithmic interpretations of sensor input, e.g. a quaternion or step counting. These logical sensors produce data in a similar way to physical sensors and so are represented in the same way. Some composite sensors have discrete values, such as an in-car context or a double-tap gesture being detected. These events are normally reported with a confidence or probability of occurrence, so an additional enum is needed to describe the indices into the probability vector.

Why is the OSP sensor list different than the one provided by Android?

OSP can be used with Android, but it is also flexible enough to support other host operating systems, or no operating system at all. New sensors or virtual sensors may be ahead of Android releases and need to be specified independently. The final hardware abstraction layer (HAL) presented to the host OS still follows that OS's required API.

Some distinct differences worth mentioning:

  • Physical sensors such as a gyroscope generally have three different representations: raw (in LSBs), uncalibrated (in rps), and calibrated (also in rps). Android assumes the raw representation is taken care of by low-level drivers and differentiates the other two using sensor types. OSP deals with all three representations. Android's SENSOR_TYPE_GYROSCOPE and SENSOR_TYPE_GYROSCOPE_UNCALIBRATED are replaced in OSP with SENSOR_GYROSCOPE_CALIBRATED and SENSOR_GYROSCOPE_UNCALIBRATED. In addition, these can be differentiated in the sensor descriptor attributes if needed. The RAW and ANDROID data conventions can also be specified to distinguish different types of uncalibrated data. See include/osp-api.h for more information.

  • Discrete virtual sensor events such as significant motion are grouped logically into a set of device motion events. This means the single boolean SENSOR_TYPE_SIGNIFICANT_MOTION is replaced by an array of probabilities, SENSOR_DEVICE_MOTION, with indices for STILL, ACCELERATING, ROTATING, TRANSLATING, FREE_FALLING, SIGNIFICANT_MOTION, and SIGNIFICANT_STILLNESS. See include/osp-sensors.h for more information.
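
The device motion grouping above can be sketched as follows. The enum and function names are illustrative assumptions; the authoritative definitions are in include/osp-sensors.h:

```c
/* Hypothetical indices into the device-motion probability vector,
 * following the grouping described above. */
typedef enum {
    MOTION_STILL,
    MOTION_ACCELERATING,
    MOTION_ROTATING,
    MOTION_TRANSLATING,
    MOTION_FREE_FALLING,
    MOTION_SIGNIFICANT_MOTION,
    MOTION_SIGNIFICANT_STILLNESS,
    MOTION_COUNT
} DeviceMotionIndex_t;

/* Instead of a single boolean event, each report carries a probability
 * per motion state; a consumer can pick the most likely one. */
static int most_likely_motion(const float probs[MOTION_COUNT])
{
    int best = 0;
    for (int i = 1; i < MOTION_COUNT; ++i)
        if (probs[i] > probs[best])
            best = i;
    return best;
}
```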

Contributing to OSP

How do I contribute to this project?

Just download and sign this contributor license agreement and return it to osp at sensorplatforms dot com. You will then be able to submit pull requests to the main repository for review and subsequent merge.

How do I add a sensor or virtual sensor?

  1. Sensors and virtual sensors are defined in include/osp-sensors.h. First check if the sensor is already defined there. If not, check whether the sensor is related to an existing one through a simple data convention or sensor attributes. Finally, especially if it is a virtual sensor, check whether the new sensor fits within an existing logical grouping such as SENSOR_GESTURE_EVENT or SENSOR_CONTEXT. If applicable, add the new sensor to SensorType_t or the relevant enum.
  2. Define a struct describing the data of the new sensor in include/osp-api.h. The header file has many examples for guidance. Consider using fixed-point format as defined in include/osp-fixedpoint-types.h for generality.

The sensor is then sufficiently defined to make use of the OSP infrastructure, including initialization, registration as an input, and subscription as an output.
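
As an illustrative sketch of step 2 above: the Q16.16 type, conversion helpers, and struct below are assumptions for illustration, not the actual formats defined in include/osp-fixedpoint-types.h or include/osp-api.h:

```c
#include <stdint.h>

/* Hypothetical Q16.16 fixed-point type: 16 integer bits, 16 fraction
 * bits, so one unit of the integer value is 1/65536. */
typedef int32_t q16_16_t;

static q16_16_t q16_from_float(float f)  { return (q16_16_t)(f * 65536.0f); }
static float    q16_to_float(q16_16_t q) { return (float)q / 65536.0f; }

/* Example data struct for a hypothetical new three-axis sensor, in the
 * style of the result structs declared in include/osp-api.h. */
typedef struct {
    int64_t   timestamp;  /* time of measurement          */
    q16_16_t  x, y, z;    /* fixed-point axis values      */
} NewSensorData_t;
```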

How do I add a platform?

The embedded sources are organized in a modular fashion:

  • Hardware dependent code is placed in a folder specific to the platform in the projects/ folder,
  • Device specific drivers are placed either in common/modules/ or external/modules/,
  • Vendor provided device support libraries are in external/MCU/,
  • Common device independent support framework and application source files are placed in the common/ folder.

An [example] is provided as a guideline. Create a new folder in the projects/ directory and add hardware specific initialization, configuration, and interfacing source code for the new hardware.

How do I add an RTOS?

Although a real-time operating system (RTOS) is not required, the OSP Application Support Framework (ASF) provides an abstraction over an underlying RTOS. Currently the ASF is implemented on top of Keil RTX. Future updates target the CMSIS-RTOS API, which will allow RTX to be replaced by any CMSIS-compliant RTOS. Note that RTX with the CMSIS-RTOS API is provided by ARM under an open-source license.

For a non-CMSIS RTOS, or for processors other than Cortex-M, the ASF is easy to adapt. The source code is written in a clear and concise manner, with comments, allowing for easy understanding and minimal porting effort.

Will this work on other CPU architectures or host OSes?

Yes. Most of the application and driver source code is written in ANSI C. Only the target-specific sources and supporting libraries contain CPU-architecture-specific implementations. OSP can easily be adapted to other CPU architectures.