Eye Oculus app

- Implement speech recognition
- Add vocal feedback (ongoing)
- Get actual data from the microcontroller
- Improve the design
- Add semantic labels (accessibility mode)

The two protocols can be extended to work with multiple appliances.
Protocol A: used for the connection between the microcontroller and the API.
| Code | Feature |
|---|---|
| A | turn |
| B | light |
| C | temperature |
| D | timer |
| E | heating |
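The table above can be sketched as a lookup on the API side. This is a minimal illustration, assuming each Protocol A message carries one of the single-letter codes; the function name and framing are not part of the spec:

```python
# Map Protocol A single-letter codes to feature names (from the table above).
FEATURE_CODES = {
    "A": "turn",
    "B": "light",
    "C": "temperature",
    "D": "timer",
    "E": "heating",
}

def decode_feature(code: str) -> str:
    """Resolve a Protocol A code to its feature name, rejecting unknown codes."""
    try:
        return FEATURE_CODES[code]
    except KeyError:
        raise ValueError(f"unknown Protocol A code: {code!r}")
```

For example, `decode_feature("C")` returns `"temperature"`.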
Protocol B: used for the connection between the API and the client-side application. We use HTTP.
```json
{
    "feature" : feature_code
}
```

With all fields (bounds and option list included):

Range:

```json
{
    "featureType" : feature_type, // range | switch | dropdown
    feature_name : {              // temperature | timer
        "value" : value,
        "lower_bound" : lower_bound_value,
        "upper_bound" : upper_bound_value
    }
}
```

Switch:

```json
{
    "featureType" : feature_type, // range | switch | dropdown
    feature_name : {              // turn | light
        "on" : boolean
    }
}
```

Dropdown:

```json
{
    "featureType" : feature_type, // range | switch | dropdown
    "list" : [
        "option1",
        "option2",
        "option3",
        "option4"
    ]
}
```

With only the current value:

Range:

```json
{
    "featureType" : feature_type, // range | switch | dropdown
    feature_name : {              // temperature | timer
        "value" : value
    }
}
```

Switch:

```json
{
    "featureType" : feature_type, // range | switch | dropdown
    feature_name : {              // turn | light
        "on" : boolean
    }
}
```

Dropdown:

```json
{
    "featureType" : feature_type, // range | switch | dropdown
    "value" : option
}
```
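A client can dispatch on `featureType` to decide how to present each payload. The sketch below assumes the Protocol B payloads above arrive as parsed JSON objects; the function name and summary strings are illustrative, not part of the protocol:

```python
def describe_feature(payload: dict) -> str:
    """Summarize a Protocol B payload based on its featureType."""
    kind = payload["featureType"]
    if kind == "range":
        # Range payloads nest the state under the feature name
        # (e.g. "temperature" or "timer"); bounds may be absent.
        name, body = next((k, v) for k, v in payload.items() if k != "featureType")
        lo = body.get("lower_bound", "?")
        hi = body.get("upper_bound", "?")
        return f"{name}: {body['value']} ({lo}..{hi})"
    if kind == "switch":
        name, body = next((k, v) for k, v in payload.items() if k != "featureType")
        return f"{name}: {'on' if body['on'] else 'off'}"
    if kind == "dropdown":
        # Either the full option list or just the current selection is sent.
        if "list" in payload:
            return "options: " + ", ".join(payload["list"])
        return f"selected: {payload['value']}"
    raise ValueError(f"unknown featureType: {kind}")
```

For example, `describe_feature({"featureType": "switch", "light": {"on": True}})` returns `"light: on"`.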