New England Machine Learning Accessibility Hackathon

Location

Microsoft New England R&D

1 Memorial Drive

Cambridge, MA 02142

Description

Join us on Monday, June 11th for a day of Machine Learning for Accessibility! The goal will be to create solutions that promote accessibility and inclusion. Prizes include Xbox One X bundles and $100 Amazon gift cards.

Please register to attend and note if you are interested in leading or participating in a specific team. Projects include:

  • Data Analytics Tool for parents and therapists using Pathfinder Health Innovations, which tracks multi-year behaviors and skill acquisition for children in autism therapy and special education. Led by Leo Junquero & Brent Samodien, Microsoft.

  • Neurodiversity Social Chatbot. Led by Dr. Joel Salinas, Harvard Medical School/MGH and Dr. Jordi Albo-Canals, NTT Data/Tufts University. How do we learn to relate with another person? How do we communicate so we both feel heard, honored, and respected for who we are? How--despite so many barriers--can we connect better? We all struggle with these questions. But for some, these questions feel unanswerable and insurmountable. While there is still no replacement for all the benefits of face-to-face interaction with others, we can begin to overcome this challenge through the thoughtful application of machine learning to make face-to-face connections easier and better. As featured in this New York Times Modern Love essay, Gus, a 13-year-old on the autism spectrum, learned how to connect better with other people on his own terms with some unexpected help: Siri. Yes, Siri on his iPhone. BIG challenges don't have easy solutions. But if that gets your blood pumping, then let's work together and tackle this epic problem head on!

  • American Sign Language: Fact or Opinion Quiz. Led by Danielle Bragg, University of Washington/Microsoft Research, and Dr. Naomi Caselli, Boston University. The ability to distinguish between facts and opinions is an important skill taught in K-12 education. Exercises used in schools are all in English, which is not the primary language of the Deaf community -- American Sign Language (ASL) is. Help us build a tool entirely in ASL that quizzes students on whether content is fact or opinion. The system will both display content in signed ASL and evaluate answers signed to a camera.

  • ASL Scattergories. Led by Danielle Bragg, University of Washington/Microsoft Research, and Dr. Naomi Caselli, Boston University. Sign language translation lags far behind spoken language translation, in large part due to a lack of proper training data. Help us build an online American Sign Language (ASL) Scattergories game to help collect a large, labelled corpus of signs executed by diverse signers to boost translation efforts.

  • Seeing AI App - improving UPC barcode identification, particularly on non-flat surfaces. Led by Rob Gallo, Microsoft Accessibility Engineer, and Saqib Shaikh, Seeing AI Tech Lead.

  • Aphasia Augmented Language interface tool designed to facilitate word finding when needed without disrupting a conversation. Led by Kristin Williams, Carnegie Mellon University.

  • Augmented Screen Reader that uses audio and vibration input/output, taking a single website and creating an easy-to-navigate, interactive semantic mapping of its contents. Led by Kalli Retzepi, MIT.

AGENDA

• 9am: Doors Open, Check-In, Coffee

• 9:30am: Kick-Off & Team Orientations, Planning

• 12:30pm: Idea Exchange & Lunch

• 4:30pm: Team Submissions due

• 5pm - 7pm: Team Presentations, Dinner, Prizes and Awards

JUDGES

  • Dr. Meryl Alper, Northeastern University

  • Dr. Daniel Hannon, Tufts University

  • Elaine Harris, MIT Hack for Inclusion organizer

  • Jamie MacLennan, Microsoft Azure ML

  • Paul Medeiros, President of Easter Seals MA

  • Jaya Narain, MIT ATHack Cofounder

  • Dr. Ognjen Rudovic, MIT Media Lab

  • Dr. D. Sculley, Google Brain


MENTORS


  • Anastasiya Belyaeva, MIT Institute for Data, Systems and Society
  • Dr. Manohar Swaminathan, Microsoft Research India
  • Dr. Adam Kalai, Microsoft Research New England
  • Dr. Bill Thies, Microsoft Research New England

PRIZES: First prize: Xbox One X bundles. Second prize: $100 Amazon gift cards.


Below are the notes to the team leaders, and I have attached a sample email for each of the projects. Eventbrite with details: //aka.ms/ml4inclusion.

Owen – could Sarah be with you when you tape the interviews? Maybe two team leaders, two judges, and the mentors? Would that be too many?

  • Two of the team leaders could be interviewed from 4:30pm to 5pm, after their submissions but before their presentations (not during the hacking itself).
  • Two Judges could be interviewed during that same time.
  • Could we have the mentors sit together for an interview?

Thanks – see you Monday!

Sandra

Thanks so much again for leading a Hackathon Project on Monday – we are so excited for the event! NOTE: Please arrive by 9:00am on Monday as we have adjusted the agenda to kick off the teams at 9:30am.

Below is the latest hackathon information. If you are available and interested in a team leaders’ call, we will have a quick 30-minute check-in tomorrow, Friday, at 12pm ET.

We currently have over 100 registrations, 7 projects, and 8 judges. All teams will present three items to the judges at the end of the day: 1) their defined challenge and starting point; 2) a storyboard describing a “Day in the Life” of their user; and 3) their solution. Prizes include Xbox One X bundles and $100 Amazon gift cards.

PROJECT TEAMS

  1. Data Analytics Tool for parents and therapists using Pathfinder Health Innovations, which tracks multi-year behaviors and skill acquisition for children in autism therapy and special education. Led by Leo Junquero & Brent Samodien, Microsoft.
  2. Neurodiversity Social Chatbot. Led by Dr. Joel Salinas, Harvard Medical School/MGH and Dr. Jordi Albo-Canals, NTT Data/Tufts University. How do we learn to relate with another person? How do we communicate so we both feel heard, honored, and respected for who we are? How--despite so many barriers--can we connect better? We all struggle with these questions. But for some, these questions feel unanswerable and insurmountable. While there is still no replacement for all the benefits of face-to-face interaction with others, we can begin to overcome this challenge through the thoughtful application of machine learning to make face-to-face connections easier and better. As featured in this New York Times Modern Love essay, Gus, a 13-year-old on the autism spectrum, learned how to connect better with other people on his own terms with some unexpected help: Siri. Yes, Siri on his iPhone. BIG challenges don't have easy solutions. But if that gets your blood pumping, then let's work together and tackle this epic problem head on!
  3. American Sign Language: Fact or Opinion Quiz. Led by Danielle Bragg, University of Washington/Microsoft Research, and Dr. Naomi Caselli, Boston University. The ability to distinguish between facts and opinions is an important skill taught in K-12 education. Exercises used in schools are all in English, which is not the primary language of the Deaf community -- American Sign Language (ASL) is. Help us build a tool entirely in ASL that quizzes students on whether content is fact or opinion. The system will both display content in signed ASL and evaluate answers signed to a camera.
  4. ASL Scattergories. Led by Danielle Bragg, University of Washington/Microsoft Research, and Dr. Naomi Caselli, Boston University. Sign language translation lags far behind spoken language translation, in large part due to a lack of proper training data. Help us build an online American Sign Language (ASL) Scattergories game to help collect a large, labelled corpus of signs executed by diverse signers to boost translation efforts.
  5. Aphasia Augmented Language interface tool designed to facilitate word finding when needed without disrupting a conversation. Led by Kristin Williams, Carnegie Mellon University.
  6. Augmented Screen Reader that uses audio and vibration input/output, taking a single website and creating an easy-to-navigate, interactive semantic mapping of its contents. Led by Kalli Retzepi, MIT.
  7. Seeing AI App - improving UPC barcode identification, particularly on non-flat surfaces. Led by Rob Gallo, Microsoft Accessibility Engineer, and Saqib Shaikh, Seeing AI Tech Lead.

AGENDA

  • 9am: Doors Open, Check-In, Breakfast
  • 9:30am: Kick-Off & Team Orientations, Planning
  • 10am: Inclusive Design
  • 10:30am: Break
  • 11am: Final Idea Selection & Hacking
  • 12:30pm: Lunch and Idea Exchange
  • 3:00pm: Storyboard exercise & submission prep
  • 4:30pm: Team Submissions due
  • 5pm - 7pm: Team Presentations, Dinner, Prizes and Awards

JUDGES

  • Dr. Meryl Alper, Northeastern
  • Dr. Daniel Hannon, Tufts
  • Elaine Harris, MIT Hack for Inclusion organizer
  • Jamie MacLennan, Microsoft ML
  • Paul Medeiros, President Easter Seals MA
  • Jaya Narain, MIT ATHack Cofounder
  • Dr. Ognjen Rudovic, MIT Media Lab
  • Dr. D. Sculley, Google Brain

MENTORS

  • Anastasiya Belyaeva, MIT Institute for Data, Systems and Society
  • Dr. Manohar Swaminathan, Microsoft Research India
  • Dr. Adam Kalai, Microsoft Research New England
  • Dr. Bill Thies, Microsoft Research New England
