Here are the threads in WebRTC (signaling threads are excluded):
# (MediaStreamGraph) Media stream graph run thread: audio/video coding (MediaStreamGraphImpl::RunThread in MediaStreamGraph.cpp).
# (Network) Socket transport service: sends/receives packets. (Entry point of a user-space callback function?)
# (Capture) Camera capture thread (from the camera API): receives image frames from the camera. (Entry point of a user-space callback function?)
# (Capture) Audio capture thread: receives audio frames from the microphone. (Entry point of a user-space callback function?)
# (Process) Process thread (worker threads in GIPS): performs many other tasks. The process thread has a task queue into which clients can inject tasks.
In a nutshell, we can divide these threads into three categories.
===Encode path===
*The encode path starts from capture (getUserMedia).
*MediaPipelineListener listens for update notifications (NotifyQueuedTrackChanges) from the MSG run thread and:
**Encodes audio chunks on the MSG run thread.
**Encodes video chunks on another thread (the ViECapture thread).
***Puts the encoded media data onto the socket transport service thread for sending to the network.
<gallery>
File:Camera_cap_.jpg|Camera capture
File:MicCapture_.jpg|Mic capture
</gallery>
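The encode-path dispatch rule above can be sketched as a small listener. The class and method names follow the text; the routing logic is an illustration of the described behavior, not Firefox's real MediaPipeline code.

```cpp
#include <string>

// Sketch: on each track-change notification from the MSG run thread,
// audio is encoded inline on the calling (MSG) thread, while video is
// handed to a separate capture/encode thread. The encoded output then
// goes to the socket transport (network) thread.
class MediaPipelineListener {
 public:
  // Called from the MSG run thread whenever a track queues new media.
  // Returns the name of the thread that will run the encoder.
  std::string NotifyQueuedTrackChanges(const std::string& chunk_kind) {
    if (chunk_kind == "audio") {
      // Audio encoding is done inline on the MSG run thread.
      return "MSG run thread";
    }
    // Video encoding is heavier and runs on the ViECapture thread.
    return "ViECapture thread";
  }
};
```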
===Decode path===
*''Steven, please update the whole story from the network/jitter buffer to the renderer.''
<gallery>
File:AudioDecode_.jpg|Audio decode
File:ReceiveRTPPakcets_.jpg|RTP receive
</gallery>
===Process dispatcher threads===
*RTCP - NACK / statistics
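One periodic job the process dispatcher threads run is RTCP NACK generation from receive statistics. A minimal sketch of that idea, assuming packets are tracked by sequence number (all names and the simple gap detection are illustrative, not WebRTC's actual implementation; it also ignores sequence-number wrap-around and the locking a real cross-thread tracker would need):

```cpp
#include <cstdint>
#include <set>
#include <vector>

// Sketch: track received RTP sequence numbers; the process thread
// periodically asks for the list of missing ones to send as a NACK.
class RtcpNackTracker {
 public:
  // Called (conceptually on the network thread) per received packet.
  void OnReceived(uint16_t seq) {
    received_.insert(seq);
    if (seq > max_seq_) max_seq_ = seq;
  }

  // Called periodically from the process thread: report every
  // sequence number missing below the highest one seen.
  std::vector<uint16_t> BuildNackList() const {
    std::vector<uint16_t> missing;
    for (uint32_t s = 0; s <= max_seq_; ++s) {
      if (received_.count(static_cast<uint16_t>(s)) == 0) {
        missing.push_back(static_cast<uint16_t>(s));
      }
    }
    return missing;
  }

 private:
  std::set<uint16_t> received_;
  uint16_t max_seq_ = 0;  // Assumes the stream starts at sequence 0.
};
```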
<gallery>