10) Trigger a sound sample to be played through the effects graph ASAP but without causing any blocking
11) Trigger a sound sample to be played through the effects graph at a given time (see the sketch after this list)
12) Capture video from a camera and analyze it (e.g. face recognition)
13) Capture video, record it to a file and upload the file (e.g. Youtube)
14) Capture video from a canvas element, record it and upload (e.g. Screencast/"Webcast" or composite multiple video sources with effects into a single canvas then record)
15) Synchronized MIDI + Audio capture
16) Synchronized MIDI + Audio playback (Would that just work if streams could contain MIDI data?)
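These use cases predate the standardized APIs; purely as a non-normative illustration, the sketch below shows how use cases 10 and 11 can be expressed with today's Web Audio API. The sample URL and the single gain node standing in for "the effects graph" are placeholder assumptions, not part of any proposal here.

<pre>
// Minimal sketch (runs in a module, where top-level await is allowed).
const ctx = new AudioContext();

// Hypothetical sample URL; decode the file into an AudioBuffer once up front.
const response = await fetch('sample.wav');
const buffer = await ctx.decodeAudioData(await response.arrayBuffer());

// Trivial stand-in for "the effects graph": sources feed a shared gain node.
const effects = ctx.createGain();
effects.gain.value = 0.8;
effects.connect(ctx.destination);

function playSample(when) {
  // Source nodes are one-shot and cheap to create per trigger.
  const src = ctx.createBufferSource();
  src.buffer = buffer;
  src.connect(effects);
  src.start(when); // scheduling is handled by the audio thread; start() never blocks
}

playSample(0);                     // use case 10: play ASAP without blocking
playSample(ctx.currentTime + 2.5); // use case 11: play at a given time (2.5 s from now)
</pre>

Use cases 13 and 14 would map similarly onto what later became the MediaStream Recording API (MediaRecorder) and canvas.captureStream(), but no sketch is given here.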
=== Straw-man Proposal ===