
Hackanooga: Chattanooga Ignite Hack Days
Sept. 14-16, 2012
Register here

If you enjoy pushing the limits of the open web platform, we want you to join us September 14-16 in the Gig City of Chattanooga, Tennessee for a weekend of good food, good friends, and — most importantly — a unique opportunity to get your hands dirty on a citywide, 1 gigabit per second network.

This wiki is intended primarily as a place for team formation and idea sharing among teams planning to go to the Chattanooga Hack Days. If you're curious about the event, read more at the Mozilla Hacks blog: "Push the web further at Hackanooga"

App idea / team formation discussion call
Conference Number: 800-503-2899
Secondary Conference Number: +1 303-248-0817
7-Digit Access Code: 5435555
Tues., Aug. 21 @ 5 PM ET
Tues., Aug. 28 @ 5 PM ET
Tues., Sept. 4 @ 5 PM ET
Tues., Sept. 11 @ 5 PM ET

For the calls themselves, please use this etherpad to take notes. It's a little easier for real-time communication and we will migrate it to the wiki after the meeting:

Hackanooga2012 etherpad

Apps and App Teams Forming for Hackanooga

We're interested in demonstrating innovation in education, workforce training, healthcare, and other public benefit areas. We'll be prototyping using client-side open web technologies (HTML5, WebGL, WebRTC) and a local private cloud. The types of applications we're talking about include:

  • applications that require high bandwidth (100Mbps to 1Gbps)
  • applications using huge data sets
  • applications that take advantage of layer 2 programmability/software defined networking
  • demonstrations of the above running point-to-point with local anchor institutions (over community fiber or wireless)

Team idea 1: High-Quality Open Source Web Conferencing

WHO: Fred Dixon (ffdixon .at. bigbluebutton .dot. org), Calvin Walton, Ryan Seys

WHAT: Four hacks on BigBlueButton to leverage high speed networks and HTML5 clients.

Special experimental Firefox builds for WebRTC

You can get the "Alder" builds of Firefox Nightly, which include the bleeding-edge WebRTC code, from the Tinderbox downloads directory on ftp.mozilla.org. These are roughly hourly test builds and are not necessarily stable, so you may have to hunt around for one that works for you. Be sure to get one of the builds whose name starts with "alder-"; the others are used for testing other experiments.

Note: For all the following hacks, we want to run BigBlueButton on Chattanooga's network. We need servers running Ubuntu 10.04 LTS 64-bit.

Hack #1: Standalone HD Video Chat application -- Modify BigBlueButton so it starts up with Video Doc as the main screen. Create a Rails application that lets anyone set up and join an online session. Modify the record and playback scripts to create an HD video file showing a checkerboard pattern of all webcams. We should be able to get 16 simultaneous users doing video using BigBlueButton. [Fred Dixon]

NEEDS: Web designers for the Rails application. We'll also need a physical server on the Gig network to install BigBlueButton.

Hack #2: Integrated HTML5 client -- We've already created a prototype HTML5 client for BigBlueButton, but it's currently separate from BigBlueButton. We'll be hacking this weekend on the integration, with the goal of demonstrating an HTML5 client joining a live BigBlueButton session by the end of the weekend. [Ryan Seys].

NEEDS: UI designers for mocking up HTML5 interface for web conferencing. We have some initial designs, but it would be great to brainstorm on how to layout the controls.

Hack #3: Output recording to Popcorn Maker -- We already use popcorn.js for playback. Working with David Seifried (Popcorn developer), create scripts to export a subset of the BigBlueButton recording (video + slides) into Popcorn Maker, enabling students to create mashups with other web content. [David Seifried]

NEEDS: Ruby skills for extracting and converting the XML data from events.xml into JSON format for integration with Popcorn Maker.
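The hack calls for Ruby skills, but the shape of the conversion is easy to sketch in any language; here is a minimal Python version. The events.xml fragment below is invented for illustration, and the real element names and attributes in BigBlueButton's recording format may differ:

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical fragment of a BigBlueButton events.xml recording;
# the real element names and attributes may differ.
EVENTS_XML = """
<recording>
  <event timestamp="1000" module="PRESENTATION" name="GotoSlide">
    <slide>3</slide>
  </event>
  <event timestamp="2500" module="CHAT" name="PublicChat">
    <message>Hello Chattanooga!</message>
  </event>
</recording>
"""

def events_to_json(xml_text):
    """Flatten <event> elements into a JSON array Popcorn Maker could consume."""
    root = ET.fromstring(xml_text)
    events = []
    for ev in root.findall("event"):
        entry = dict(ev.attrib)               # timestamp, module, name, ...
        entry["timestamp"] = int(entry["timestamp"])
        for child in ev:                      # copy child elements as plain fields
            entry[child.tag] = child.text
        events.append(entry)
    return json.dumps(events, indent=2)

print(events_to_json(EVENTS_XML))
```

A timestamped JSON array like this maps naturally onto Popcorn's event model, where each entry becomes a cue on the media timeline.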

Hack #4: Broadcast audio to HTML5 client -- BigBlueButton uses FreeSWITCH which already integrates with Icecast. Recently, Icecast added support for WebM. This hack will be to extract a live audio stream from FreeSWITCH and broadcast it to the HTML5 client. This will extend Hack #2. [Calvin Walton]

NEEDS: Experience with Icecast and WebM.

Team Idea 2: 3D videoconferencing

WHO: Andor Salga
WHAT: 3D videoconferencing using Kinect sensors for capture.
NEEDS: WebGL wizard

Discussion: Why not just use Kinect with Zigfu and either three.js or Unity? (WebGL from scratch would likely make this impossible to complete within the hackathon period.)

Team Idea 3: Chattanooga Public Library

WHO: Nate Hill, Chattanooga Public Library
WHAT: Imagine an immersive, interactive information environment where a map of the city of Chattanooga is projected onto the floor. Looking and walking around the map, you orient yourself. First you find the street you live on and step over to it. You tap your foot twice and zoom in. Cool! You scuff your foot to the left and zoom back out. Next you find the location of the art museum and the piece of public sculpture you love. You tap your foot once on an icon, and another projector lights up the wall with information about this piece of sculpture. A life size photograph of a Tom Otterness bronze is displayed, along with biographical information about the artist and suggestions of other similar works nearby or in other cities. Links to resources about Otterness, bronze casting, and public art from the library catalog and across the internet are displayed as well. The Otterness sculpture is actually a part of a larger exhibition, a tour of public art in Chattanooga. When you discover this, you tap again and all of the other items on this tour light up on the floor around you.
This is a proposal to create an interactive digital map of the city of Chattanooga that would be projected on the concrete floor of the fourth floor space in the Chattanooga Public Library. The map would make use of projection mapping technology, gig-speed wireless connectivity, Esri GIS data, and Open Street Maps to create an inverted augmented reality space. This map would be an exhibition space, an urban planning tool, and an educational asset for Chattanooga. In addition, the map could link to other gig-speed communities featuring similar compatible geographic interfaces and exhibitions.
NEEDS: Development help. I'm rallying some folks from the Chattanooga area, but expertise hacking this together would be fantastic.

Discussion: It's cool and feasible, but not sure how this utilizes a Gigabit network. (Yosun 21:01, 24 August 2012 (PDT))

@Yosun I was trying to tell the story of an inspiring, immersive geographic interface, knowing that it's one implementation of something that could be much bigger. If you simply wanted to make this interface leverage the gig, you'd have location-based videoconferencing. I step on a point that has an active user, *blip*, I open a live channel with that user. Done. This is an idea that I hope has hooks, an idea that is extensible. If you had an immersive location-based tool like this, what would you make it do?

Video-conferencing alone is already feasible on non-gigabit networks. Video-conferencing with many, many users simultaneously could use a gigabit network, but then there would be too much cross-talk in the lag for regular-connectivity users. It seems a "walking-based" interface would rely on a much too expensive setup for precision -- Kinect doesn't do well for precision, so you would end up needing an array of mounted IR detectors (6 digits). What about a touch-screen interface? (Yosun 18:47, 2 September 2012 (PDT))

Team Idea 4: High performance distributed research computing (for science, business, etc)

WHO: Proposed by Roger Pincombe (OkGoDoIt), but very open for suggestions and discussion

WHAT: Projects like Folding@home (http://folding.stanford.edu), Seti@Home (http://setiathome.ssl.berkeley.edu/) and LHC@Home (http://lhcathome.web.cern.ch/LHCathome/) enable researchers to harness spare computing power to do insane amounts of distributed number crunching. There are even commercial efforts like CPUsage (http://cpusage.com/) and Plura Processing (http://www.pluraprocessing.com/). One of the issues with massively distributed computing is that network overhead limits the types of tasks that can be effectively distributed: tasks that are easily broken into separate chunks that can be worked on independently. The software clients download a data set, process it, and then upload results.

With the power of gigabit internet, massively distributed computing could be applied to a much wider set of scenarios. Perhaps systems that require constant communication among the workers or where data sets cannot be broken into reasonably small sizes for computation. I'm no expert in distributed computing and I don't have a specific idea yet, but this is an area I think will be hugely empowered by the rise of superfast internet. I encourage us all to discuss possible scenarios where this could be applied, or even methods to allow arbitrary computation (like less expensive AWS EC2 spot instances for data processing) without compromising the security of the end user. Maybe even client software that can run on smartphones (at night, when charging and connected to home wifi). Millions of surprisingly powerful smartphones are idle for a large portion of the night. With "big data" being the buzzword that it currently is, I imagine there is a lot of potential here.
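The chunked "download, process, upload" model described above can be illustrated with a toy sketch. Everything here is hypothetical; in a real system the per-chunk work would run on remote clients over the network rather than in a local loop:

```python
# Toy illustration of the chunked distributed-computing model:
# work distributes well only when chunks are independent.

def split_into_chunks(data, n):
    """Divide the data set into (up to) n independent work units."""
    size = (len(data) + n - 1) // n
    return [data[i:i + size] for i in range(0, len(data), size)]

def process_chunk(chunk):
    """Stand-in for the expensive per-client computation."""
    return sum(x * x for x in chunk)

def merge(results):
    """Combine partial results uploaded by the clients."""
    return sum(results)

data = list(range(1000))
chunks = split_into_chunks(data, 10)           # "download" phase
partials = [process_chunk(c) for c in chunks]  # each client works alone
total = merge(partials)                        # "upload" / merge phase
assert total == sum(x * x for x in data)
print(total)
```

The gigabit question is what happens when `process_chunk` calls can no longer be independent: constant communication between workers, which this model forbids, becomes plausible when the network is fast enough.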

I can add some specific ideas here soon, but I wanted to get the discussion started and see what you all think.

NEEDS: This is less of an idea and more of a starting point for idea discussion. It would be great if people more familiar with distributed computing and "big data" can add their thoughts.

DISCUSSION: (your thoughts here...)

Team 5: New Opportunity for Community-Based Public Media-casting

  • Partner with other nonprofits/businesses.
  • Enter into “content sharing” arrangements with partners, in addition to the sharing of other resources.
  • Assume a leadership position to develop a Community Media Center. Leverage deep digital library and high-quality production assets.
  • Share/rent its content production capability with other nonprofits/businesses and New Media ventures that need multimedia support.
  • Develop new revenue streams based on marketing this community resource.
  • More deeply engage social media platforms.
  • Develop a mobile app that functions as a “content aggregator” for local content produced not only by WTCI but others.

NEEDS: Seriously talented coder(s). We have an active volunteer board and professional staff eager to support this effort in a wide variety of ways.

Team 6: Compound Media File Format

WHO: Hoyt Jolly of Simulosity LLC

WHAT: I have been working and talking with many innovators in the city of Chattanooga, each trying to figure out how to take advantage of a gig pipe. All of the ideas I have heard (outside of gaming) that could make a dent in a gig pipe involve synchronized video streams. The format is intended to focus on a single point in time, shown from different places and angles.

This format can be used to:

  • Allow viewers to watch talk shows from any camera angle they want.
  • Allow a family to record their Christmas tree hunt from everyone's cellphone.
  • Allow events to be recorded and viewed from any angle by anyone.
  • Turn film watching into an act of exploration.
  • Allow recording and reviewing of video conferences.
  • Provide better accessibility to security footage.
  • Possibly use a gig pipe; it justifies the gig for the average person.
  • Open the door to a new arena of development for great innovators to play with.
  • Continue the natural progression of static media: image / radio / video / CMF.

This process will lead to a basic app, one that can present users with a synchronized video sharing and viewing platform.

NEEDS: People to brainstorm on how this format should be built, how it would be used, and what it should do. We also need to figure out how the file needs to be built. It would be nice to build a prototype, but focusing on the initial planning phase this weekend would be great.


  • Why enclose the streams in a single format? It is important to have them available and tagged for all to use.
  • Free information from the confines of restriction.
  • Remove the stigma of ownership.
  • Store video on the server in its rawest form, and re-encode it to be effectively transferred along with other video files.
  • Ensure that key frames do not get transferred at the same time.
  • Lots of different types of media could be [linked] around the same event. Classes include audio, video, PowerPoint, screen caps.
  • Versioning can become important as people develop more ideal versions of their presentation files.
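One possible starting point for the planning discussion is a manifest that ties multiple streams of one event to a shared timeline. This sketch is purely illustrative; every field name below is invented, not part of any agreed format:

```python
import json

# Hypothetical manifest for a "compound media file": several streams of
# one event, synchronized against a shared clock. All field names are
# invented for illustration.
manifest = {
    "event": "Christmas tree hunt",
    "clock": "unix-ms",            # shared timebase all streams sync to
    "streams": [
        {"id": "mom-phone", "type": "video", "codec": "vp8",
         "url": "mom.webm", "offset_ms": 0},
        {"id": "dad-phone", "type": "video", "codec": "vp8",
         "url": "dad.webm", "offset_ms": 1340},
        {"id": "room-mic",  "type": "audio", "codec": "vorbis",
         "url": "room.ogg", "offset_ms": -220},
    ],
}

def stream_start(manifest, stream_id, event_start_ms):
    """Map a stream's local time zero onto the shared event timeline."""
    s = next(s for s in manifest["streams"] if s["id"] == stream_id)
    return event_start_ms + s["offset_ms"]

print(stream_start(manifest, "dad-phone", 0))
print(json.dumps(manifest, indent=2))
```

The per-stream offset is the crux: a viewer can switch between angles only if every stream's clock has been mapped onto one timeline, which is exactly the kind of design question worth settling in the planning phase.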

Team Idea 7: Long-Term Medical Monitoring & Crisis Event Handling System

WHO: Amr Ali & Dmitri Boulanov

WHAT: A cloud-based complex event processing and monitoring system using existing equipment for signal input will create value for both the patients and their providers.

During the hackathon, we propose to work on a simple app prototype, which will aggregate signal information from multiple existing devices. A simple event based alert could also be implemented. We plan to implement a fall-detection system, which may be utilized by the elderly population, using the sensors and open APIs (Android) available at our disposal. Upon a detected fall, the user will be able to choose to notify an emergency service or an emergency contact. More literature on fall simulation and detection may be found here: http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0037062
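As a rough illustration of the kind of detection logic involved, here is a minimal threshold-based fall detector over synthetic accelerometer samples. The thresholds and window size are invented for illustration and are in no way clinically validated; a real implementation would read the device's sensor stream:

```python
import math

# Minimal threshold-based fall detection: a free-fall dip in
# accelerometer magnitude followed shortly by an impact spike.
# Thresholds are illustrative only, not clinically validated.
FREE_FALL_G = 0.4   # magnitude well below 1 g suggests free fall
IMPACT_G = 2.5      # a sharp spike suggests impact
WINDOW = 10         # samples allowed between dip and spike

def detect_fall(samples):
    """samples: list of (x, y, z) accelerometer readings in g."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    for i, m in enumerate(mags):
        if m < FREE_FALL_G:  # free-fall dip found; look for the impact
            for j in range(i + 1, min(i + 1 + WINDOW, len(mags))):
                if mags[j] > IMPACT_G:
                    return True
    return False

# Synthetic traces: normal standing vs. a fall (dip, then spike).
standing = [(0.0, 0.0, 1.0)] * 20
fall = ([(0.0, 0.0, 1.0)] * 5 + [(0.0, 0.0, 0.1)] * 3
        + [(0.0, 3.2, 1.0)] + [(0.0, 0.0, 1.0)] * 5)
print(detect_fall(standing), detect_fall(fall))
```

On a detected fall, the app would then present the notify-emergency-service / notify-contact choice described above.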

With the emergence of biomedical signal monitoring devices, many people identify the importance of self-monitoring in keeping with good health. To stay fit, runners use mobile device applications to keep track of their heart rate and speed, which they can analyze after completing their exercise. Advances in modern biomedical signal monitoring will allow for deeper and more informative description of one’s health state at any given moment.

Existing technologies like pulse oximeters, glucose meters, EKG and ECG sensors presently allow for the monitoring of the elderly and patients with chronic diseases as well as general lifestyle tracking (Bachmann et al, 2012). Zeo and Fitbit are examples of commercially available monitoring equipment that has been adopted by the public in recent years. However, there is no backend infrastructure that goes along with this type of hardware. Due to the current limitations of live analysis and the lack of signal interpretation, these tools provide only nonessential functions. For example, a patient would need a trained physician to analyze an EKG signature.

NEEDS: A FitBit device ($100 at http://www.fitbit.com/product/specs) {other options are Wahoo Fitness BlueHR, Scosche myTrek, iBG Star}, a Neurosky Mindwave Mobile EEG device ($130 at http://store.neurosky.com/products/mindwave-mobile), more C++/Java server-side developers (people with experience implementing a back-end to handle app requests), Android/iOS developers, physician contacts, UX/UI folk, quantitative physiologists

Team Idea 8: City Budget/Priorities Impact Visualizer

WHO: Aaron Gustafson

WHAT: Using available algorithms for city growth and its financial impact on taxes, services, and infrastructure, this tool would enable citizens and politicians to see how the choices they make with regard to budgeting would affect their city/town over time (possibly out to 40 years, to help with cities’ 40-year plans).

For example, a citizen could see the adverse impact pulling money from the public works department would have on a city’s roads, bridges and sidewalks. Or she could see how unregulated growth (sprawl) would affect city services or how increased development can affect stormwater runoff.
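To make the idea concrete, here is a deliberately simplistic sketch of a 40-year projection. The decay and repair rates below are made up and stand in for the real growth/impact algorithms the tool would use:

```python
# Toy projection: infrastructure condition compounding over 40 years
# under different maintenance budgets. All rates are invented stand-ins
# for the real city-planning algorithms.
def project_condition(condition, budget_ratio, years, decay=0.04, repair=0.05):
    """condition: 0-100 score; budget_ratio: fraction of full funding."""
    history = [condition]
    for _ in range(years):
        condition -= condition * decay                           # wear and tear
        condition += (100 - condition) * repair * budget_ratio   # funded repairs
        history.append(round(condition, 1))
    return history

full = project_condition(90.0, 1.0, 40)   # fully funded public works
cut = project_condition(90.0, 0.5, 40)    # budget cut in half
print(full[-1], cut[-1])
```

Even a toy model like this shows the visualizer's core message: a budget cut that looks small in year one compounds into a visibly different city at year 40.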

NEEDS: Other developers, "big data" folks, mapping and/or WebGL folks.


  • We have local folks with the data we would need to start building out a prototype for Chattanooga.

Team Idea 9: "Wanderlust"

Wanderlust - a realtime, cross-platform, multiuser virtual collaboration platform.

What if you could turn your phone into a monocle that lets you look into a fantasy reality... and collaborate with people around you and beyond?

Move virtual objects with the same realistic response rate as real objects!

- Tested Qualcomm AllJoyn (open source ad hoc connectivity) with a gigabit LAN. Not sure how a citywide gigabit network would be set up - but if it's just a huge LAN spanning several miles (possible?)...

Team Idea 10: "TentWatch"

WHO: Alex Ogle (From Tubatomic)

TentWatch is an open platform for recording location based data using a defined reporting structure.

All reports are anonymous and easily filed using selectable drop-down lists of events via free mobile apps on iOS and Android.

The report viewer is a real-time map feed of reported activities in specific areas. This feed will contain cumulative data of pre-defined events that, when played chronologically, create a "radar-like" visual that illustrates the development of a trend that can be used by individuals and communities to make decisions based on those results.

The data feed that renders the map is a live stream of information. Using a high-speed network allows reports to download faster and refresh as events are recorded. The cumulative reports can then be reviewed like rewinding a video timeline to establish trends, and users can jump back to specific days to review events. The "reporting" of this data by the masses requires cellular, wifi and broadband networks to get the data into the system. Amassing and processing the data to create real-time and/or cumulative reports, and presenting them as visual "radar-like" maps, requires large amounts of data storage, fast servers and a next-generation high-speed network so that the results can scale based on bandwidth (i.e., larger imagery, faster "dragging around" of map data to illustrate trends, etc.) and be delivered to the end user.
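The "radar-like" playback could be prototyped by bucketing timestamped reports into frames. This sketch uses invented event names and an hourly bucket size; the real recording system would live server-side in PHP/MySQL:

```python
from collections import defaultdict

# Sketch of the "radar" playback idea: anonymous, timestamped reports
# are grouped into time buckets, and each bucket becomes one frame of
# the chronological map feed. Event names and bucket size are invented.
BUCKET_SECONDS = 3600  # one map frame per hour

reports = [
    {"t": 100,  "lat": 35.045, "lon": -85.309, "event": "noise"},
    {"t": 1800, "lat": 35.046, "lon": -85.310, "event": "noise"},
    {"t": 4000, "lat": 35.050, "lon": -85.300, "event": "outage"},
]

def radar_frames(reports, bucket=BUCKET_SECONDS):
    """Group reports by time bucket; return frames in chronological order."""
    frames = defaultdict(list)
    for r in reports:
        frames[r["t"] // bucket].append(r)
    return [frames[k] for k in sorted(frames)]

for i, frame in enumerate(radar_frames(reports)):
    print(f"frame {i}: {len(frame)} report(s)")
```

Playing the frames in sequence is the "rewinding a video timeline" view; filtering `reports` by event type before bucketing gives the selectable drop-down behavior.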

NEEDS:
  • Visual designers
  • Front-end HTML5 developers
  • PHP developers to help write open source code for the server recording system

The only infrastructure needed for the project is a local install of PHP and MySQL.