24 changes: 12 additions & 12 deletions README.md
@@ -28,14 +28,14 @@ audio-annotator also provides mechanisms for providing real-time feedback to the
1. none (There is no feedback provided. Solution set is not needed)
2. silent (Annotation score is calculated and recorded with each action the user takes. Solution set is required)
3. notify (Annotation score is calculated and recorded with each action the user takes. A message will appear telling the user if they are improving or not. Solution set is required)
-4. hiddenImage (Annotation score is calculated and recorded with each action the user takes. A message will appear telling the user if they are improving or not. Also parts of a hidden image will be revealed to the user. Solution set and image src are required)
+4. hiddenImage (Annotation score is calculated and recorded with each action the user takes. A message will appear telling the user if they are improving or not. Also, parts of a hidden image will be revealed to the user. Solution set and image src are required)
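The four feedback modes differ only in which extra inputs they need. A small lookup, sketched below in Python, captures the rules from the list above; the names (`required_inputs`, `solution_set`, `image_src`) are illustrative assumptions, not part of audio-annotator's actual API:

```python
# Illustrative only: maps each feedback mode to the extra inputs the
# README says it requires. Key and value names are assumptions.
REQUIRED_INPUTS = {
    "none": set(),                                  # no feedback at all
    "silent": {"solution_set"},                     # score recorded silently
    "notify": {"solution_set"},                     # score recorded + message shown
    "hiddenImage": {"solution_set", "image_src"},   # notify + image reveal
}

def required_inputs(feedback_mode):
    """Return the set of inputs a feedback mode needs, per the README."""
    try:
        return REQUIRED_INPUTS[feedback_mode]
    except KeyError:
        raise ValueError(f"unknown feedback mode: {feedback_mode!r}")
```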

### To Demo
1. In the audio-annotator/ directory run `python -m SimpleHTTPServer` (Python 2; under Python 3, the equivalent is `python -m http.server`)
-2. Visit <http://localhost:8000/examples> in your browser to see the verison with annotation and proximity tags. This demo also uses the spectrogram visualization, and does not provide the user with feedback as they annotate the clip.
-3. Visit <http://localhost:8000/examples/curiosity.html> in your browser to see the verison with just annotation tags. This demo also uses the spectrogram visualization, and provides the user feedback in the form of revealing a hidden image as the user correctly annotate the sound clip.
+2. Visit <http://localhost:8000/examples> in your browser to see the version with annotation and proximity tags. This demo also uses the spectrogram visualization and does not provide the user with feedback as they annotate the clip.
+3. Visit <http://localhost:8000/examples/curiosity.html> in your browser to see the version with just annotation tags. This demo also uses the spectrogram visualization and provides the user with feedback in the form of revealing a hidden image as the user correctly annotates the sound clip.

-Note: In the examples, the submit annotations btn will output data to the web console, since the POST is not hooked up to the backend
+Note: In the examples, the submit annotations button will output data to the web console since the POST is not hooked up to a backend.
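Because the demos only log the submitted annotations, hooking up a backend means accepting that POST yourself. A minimal stdlib-only Python sketch follows; the endpoint path and the payload shape (a JSON object with an `annotations` list) are assumptions for illustration, not audio-annotator's actual contract:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class AnnotationHandler(BaseHTTPRequestHandler):
    """Accepts the annotation POST that the demos currently just log."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        # A real backend would validate and persist the annotations here.
        body = json.dumps(
            {"received": len(payload.get("annotations", []))}
        ).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the console quiet in the demo
        pass

def serve(port=8000):
    """Serve forever on localhost; pair with the static demo pages."""
    HTTPServer(("localhost", port), AnnotationHandler).serve_forever()
```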

### Interfacing with backends
The examples in the **examples/** directory do not depend on any specific backend. They fetch a JSON file containing fake data in order to render the interface. Extra information for specific backends:
@@ -54,7 +54,7 @@ To view the curio versions of these files, take a look at **curio_original/audio
* [urban-ears.css](static/css/urban-ears.css)
Custom CSS for the urbanears interface
* [materialize.min.css](static/css/materialize.min.css)
-Minified version of materlize css
+The minified version of Materialize CSS

* [**static/js/**](static/js/)
* [colormap/](static/js/colormap/)
@@ -63,8 +63,8 @@ To view the curio versions of these files, take a look at **curio_original/audio
run `source gen_colormap.sh` in the colormap directory to generate the new colormap.min.js
* [gen_colormap.js](static/js/colormap/gen_colormap.js)
This file is used by gen_colormap.sh to generate colormap.min.js
-It that requires colormap node module and adds it as a global variable
-This file also defines the magma colour scheme
+It requires the colormap node module and adds it as a global variable
+This file also defines the magma color scheme
* [colormap.min.js](static/js/colormap/colormap.min.js)
Generated JS file
* [lib/](static/js/lib/)
@@ -79,15 +79,15 @@ To view the curio versions of these files, take a look at **curio_original/audio
* [hidden_image.js](static/js/src/hidden_image.js)
Defines: HiddenImg (Creates elements to hide an image behind a canvas, and reveal random parts of the image)
* [main.js](static/js/src/main.js)
-Defines: UrbanEars (Creates and and updates all parts of the interface when a new task is loaded. Also submits task data)
+Defines: UrbanEars (Creates and updates all parts of the interface when a new task is loaded. Also submits task data)
* [message.js](static/js/src/message.js)
Defines: Message (helper functions that alert the user with different messages using Materialize toast)
* [wavesurfer.drawer.extended.js](static/js/src/wavesurfer.drawer.extended.js)
-Using the logic from the wavesurfer spectrogram plugin to override the wavesurfer drawer logic in order to have waveform visiualizations as well as spectrogram and inivisble visiualizations
+Uses the logic from the wavesurfer spectrogram plugin to override the wavesurfer drawer logic in order to support waveform visualizations as well as spectrogram and invisible visualizations
* [wavesurfer.labels.js](static/js/src/wavesurfer.labels.js)
-Defines: WaveSurfer.Labels (creates container element for lables and controls the positioning of the labels), WaveSurfer.Label (individual label elements)
+Defines: WaveSurfer.Labels (creates the container element for labels and controls the positioning of the labels), WaveSurfer.Label (individual label elements)
* [wavesurfer.regions.js](static/js/src/wavesurfer.regions.js)
-Modified version of wavesurfer regions plugin
+A modified version of the wavesurfer regions plugin
(https://github.com/katspaugh/wavesurfer.js/blob/master/plugin/wavesurfer.regions.js)

* [**static/json/**](static/json/)
@@ -103,4 +103,4 @@ To view the curio versions of these files, take a look at **curio_original/audio
Django view of the interface from the curio repo
* [main.js](curio_original/main.js)
Curio repo version that makes API calls using user information provided by the Django layer.
-Defines: UrbanEars (Creates and and updates all parts of the interface when a new task is loaded. Also submits task data)
+Defines: UrbanEars (Creates and updates all parts of the interface when a new task is loaded. Also submits task data)