WeeklyUpdates/EmergingTechnology

2,175 bytes added, 15:54, 16 December 2019
Added ET headlines for this week
{|
! colspan="2" | 2019 ET Headlines
|-
! colspan="2" | '''Latest''': [[#December 16th, 2019|December 16th, 2019]]
|-
| December
| [[#December 2nd, 2019|2nd]], [[#December 9th, 2019|9th]], [[#December 16th, 2019|16th]]
|-
| November
|-
|}
== December 16th, 2019 ==
* '''Getting WebXR to 1.0''' -- The WebXR standard is in the [https://blog.mozvr.com/webxr-1-0-is-here/ home stretch to hit 1.0], and we’ve updated our tools to the final API. WebXR is the new standard for virtual and augmented reality on the web, letting web developers create immersive experiences without writing native code or installing an app. People can browse VR catalogs, play VR games, and view 360 videos. On the AR side, you can build a web app that places objects in real 3D space inside a viewer’s living room, while still protecting user privacy and security. Anticipating the final API, we’ve updated the [https://blog.mozvr.com/webxr-emulator-extension/ WebXR Emulator] add-on for Firefox and Chrome with a brand new look and feel, as well as our [https://blog.mozvr.com/custom-elements-for-the-immersive-web/ Immersive Web Components], which let you drop a 360 image or video into your page without writing any JavaScript at all. Additionally, [https://ecsy.io/ ECSY], an entity component system for the web, now has its own [https://blog.mozvr.com/ecsy-developer-tools/ dev tool add-on], with remote debugging to test on real headsets. You can learn about these tools and more at our new [https://developer.mozilla.com/topics/mixed-reality/ WebXR Developer Portal], and at [https://immersiveweb.dev/ immersiveweb.dev] for more great WebXR info.
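As a rough sketch of what the final API looks like, the core calls are <code>navigator.xr.isSessionSupported()</code> and <code>navigator.xr.requestSession()</code>. The helper below is illustrative (the function name and the omitted render-loop details are our own), and it bails out harmlessly where WebXR isn’t available:

```javascript
// Minimal WebXR bootstrap: feature-detect, request an immersive VR
// session, and hook up a WebGL layer. Scene drawing is omitted.
async function enterVR(canvas) {
  if (typeof navigator === 'undefined' || !navigator.xr) {
    return null; // WebXR not available (or not running in a browser)
  }
  if (!(await navigator.xr.isSessionSupported('immersive-vr'))) {
    return null; // no VR-capable device or runtime
  }
  // requestSession must be triggered by a user gesture, e.g. a
  // button's click handler.
  const session = await navigator.xr.requestSession('immersive-vr');
  const gl = canvas.getContext('webgl', { xrCompatible: true });
  await session.updateRenderState({
    baseLayer: new XRWebGLLayer(session, gl),
  });
  session.requestAnimationFrame(function onFrame(time, frame) {
    // ...draw the scene for this frame here...
    session.requestAnimationFrame(onFrame); // re-register each frame
  });
  return session;
}
```

The same pattern with <code>'immersive-ar'</code> requests an AR session where the hardware supports it.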
* '''Watch the Videos''' -- We promised we’d let you know, and they’re here. All [https://www.youtube.com/watch?v=P47kN2WyoFU&list=PLo3w8EB99pqLt4A5jZP5Fiw2Gi08CYj6H View Source 2019 videos] have now been posted and are available on YouTube.
* '''IndieWebCamp Record''' -- Last week we hosted [https://2019.indieweb.org/sf IndieWebCamp San Francisco] at our SF office ([https://indieweb.org/2019/SF/Schedule session notes & videos], [https://indieweb.org/2019/SF/Photos photos]), completing a record 12 IndieWebCamps this year across [https://indieweb.org/cities 11 cities] worldwide. Information is already available on [https://2020.indieweb.org/ planned 2020 IndieWebCamps]: Austin, London, and the Summit (Portland, OR), with more to come.
 
== December 9th, 2019 ==
* '''Major DeepSpeech Release''' -- Last week our Machine Learning team [https://github.com/mozilla/DeepSpeech/releases/tag/v0.6.0 released version 0.6 of DeepSpeech], our automatic speech recognition engine, which aims to make speech recognition technology openly available to developers. DeepSpeech provides a speech-to-text (STT) engine with a [https://deepspeech.readthedocs.io/en/v0.6.0/ simple API] along with pre-trained English models. Version 0.6 offers the highest-quality, most feature-packed model so far -- with lower latency and memory utilization, and the addition of TensorFlow Lite support for smaller models and faster start-up times. This all means that DeepSpeech can deliver better STT results faster and on a wider variety of devices, from the data center down to a Raspberry Pi. Read more about the new release, and the great community contributions that made it happen, in the team’s [https://hacks.mozilla.org/2019/12/deepspeech-0-6-mozillas-speech-to-text-engine/ launch blog post].