Our BirdVox project, BirdVox: Automatic Bird Species Identification from Flight Calls, conducted jointly by NYU and the Cornell Lab of Ornithology (CLO), which leads the project, has been awarded a $1.5 million Big Data program grant. Here's an excerpt from the NYU press release:

Collecting reliable, real-time data on the migratory patterns of birds can help foster more effective conservation practices, and – when correlated with other data – provide insight into important environmental phenomena. Scientists at CLO currently rely on information from weather surveillance radar, as well as reporting data from over 400,000 active birdwatchers, one of the largest and longest-standing citizen science networks in existence. However, there are important gaps in this information: radar imaging cannot differentiate between species, and most birds migrate at night, unobserved by citizen scientists. The combination of acoustic sensing and machine listening in this project addresses these shortcomings, providing valuable species-specific data that can help biologists complete the bird migration puzzle.

For further information about the project, check out the BirdVox website.
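To give a flavor of what acoustic monitoring involves: the first stage of any flight-call pipeline is detecting candidate sound events in long sensor recordings. The sketch below is a minimal, illustrative energy-threshold detector on a synthetic signal, not the BirdVox system itself (which relies on learned classifiers over spectrogram features); the function name, frame length, and threshold are all assumptions for the example.

```python
import math

def detect_calls(signal, sr, frame_len=256, threshold=0.1):
    """Flag frames whose RMS energy exceeds a threshold.

    Returns (start_time, end_time) pairs, in seconds, for each
    contiguous run of above-threshold frames.
    """
    events, start = [], None
    for i in range(0, len(signal) - frame_len + 1, frame_len):
        frame = signal[i:i + frame_len]
        rms = math.sqrt(sum(x * x for x in frame) / frame_len)
        if rms > threshold and start is None:
            start = i / sr          # event onset
        elif rms <= threshold and start is not None:
            events.append((start, i / sr))  # event offset
            start = None
    if start is not None:
        events.append((start, len(signal) / sr))
    return events

# Synthetic test: 1 s of silence with a 100 ms "call" (3 kHz tone) at 0.4 s.
sr = 8000
signal = [0.0] * sr
for t in range(int(0.4 * sr), int(0.5 * sr)):
    signal[t] = 0.5 * math.sin(2 * math.pi * 3000 * t / sr)

print(detect_calls(signal, sr))  # one event near (0.4, 0.5)
```

A fixed energy threshold breaks down in noisy urban soundscapes, which is precisely why the project turns to machine listening; but it illustrates the detection-then-classification structure the prose describes.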
Following BirdVox's appearance on the Science Friday radio show, Public Radio International (PRI) has published a follow-up article about BirdVox: "Scientists are using sound to track nighttime bird migration". Here's an excerpt:
A group of researchers at New York University and the Cornell Lab of Ornithology are helping to track the nighttime migratory patterns of birds by teaching a computer to recognize their flight calls. The technique, called acoustic monitoring, has existed for some time, but the development of advanced computer algorithms may provide researchers with better information than they have gathered in the past.
You can read the full article here, and listen to the Science Friday interview here. Further information about the project can be found on the BirdVox website.
On Friday, June 24th, the popular Science Friday radio show featured a segment about our BirdVox project. The segment included sound bites from fellow BirdVox researcher Andrew Farnsworth and me, followed by a live interview with Juan Pablo Bello.
In the segment we discuss our latest work on automatic bird species classification from flight call recordings captured with acoustic sensors for bioacoustics migration monitoring.
You can listen to the BirdVox segment here.
Today I'll be giving an invited talk at the Machine Learning for Music Discovery Workshop as part of the ICML 2016 conference.
The talk is about Pitch Analysis for Active Music Discovery:
A significant proportion of commercial music comprises pitched content: a melody, a bass line, a famous guitar solo, etc. Consequently, algorithms capable of extracting and understanding this type of pitched content open up numerous opportunities for active music discovery, ranging from query-by-humming to musical-feature-based exploration of Indian art music or recommendation based on singing style. In this talk I will describe some of my work on algorithms for pitch content analysis of music audio signals and their application to music discovery, the role of machine learning in these algorithms, and the challenge posed by the scarcity of labeled data and how we may address it.
And here's the extended abstract:
Pitch Analysis for Active Music Discovery
Machine Learning for Music Discovery workshop, International Conference on Machine Learning (ICML), invited talk, New York City, NY, USA, June 2016.
The workshop has a great program lined up; if you're attending ICML 2016, be sure to drop by!
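For readers curious what "pitch content analysis" means at its simplest: a fundamental-frequency estimate can be obtained by finding the lag that maximizes a signal's autocorrelation. The snippet below is a toy sketch on a synthetic tone, not any of the algorithms from the talk (which use machine learning and handle polyphonic music); the function name and frequency bounds are assumptions for the example.

```python
import math

def estimate_pitch(signal, sr, fmin=80.0, fmax=1000.0):
    """Estimate fundamental frequency (Hz) via autocorrelation.

    Searches lags corresponding to [fmin, fmax] and returns the
    frequency of the lag with maximum autocorrelation.
    """
    n = len(signal)
    lag_min = int(sr / fmax)
    lag_max = int(sr / fmin)
    best_lag, best_corr = lag_min, float("-inf")
    for lag in range(lag_min, min(lag_max, n - 1) + 1):
        corr = sum(signal[i] * signal[i + lag] for i in range(n - lag))
        if corr > best_corr:
            best_corr, best_lag = corr, lag
    return sr / best_lag

# Half a second of a 220 Hz sine at an 8 kHz sample rate.
sr = 8000
tone = [math.sin(2 * math.pi * 220.0 * t / sr) for t in range(sr // 2)]
print(estimate_pitch(tone, sr))  # close to 220 Hz
```

Real music breaks this approach quickly (octave errors, overlapping sources, unpitched content), which is where the learned methods discussed in the talk come in.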
Hot off the press: SONYC is an NYC BigApps finalist! Two weeks ago we pitched the SONYC project at the BigApps semifinals. The results have just been announced, and we're excited to report that SONYC has made it to the BigApps finals in the Connected Cities category!
The event will take place on December 2nd at the Brooklyn Academy of Music (BAM). Each team will pitch their project in front of a panel of judges, and there will also be time for Q&A and demos. The event will close with the BigApps Award Ceremony, during which the winner in each category will be announced.
To learn more about the SONYC project have a look at the video below. Further information, including a list of academic publications, is available on the SONYC website.
The SONYC project has made it to the NYC BigApps semifinals! The event will take place this Sunday as part of the BigApps Demo Day (details and RSVP here). From the BigApps website:
BigApps Demo Day – NYC’s largest civic tech expo – takes place on Sunday, November 1 from 12PM to 5PM. Hosted at DUMBO’s Made in NY Media Center, Demo Day is an all-out celebration of all things tech!
The event is open to the public and promises to be very interesting, with some 40 projects in 5 categories: Affordable Housing, Zero Waste, Connected Cities (us!), Civic Engagement, and Wildcard.
As part of the event there'll be a "People's Choice" public vote, where you get to pick your favorite 3 projects and help them make it to the finals. If you're in NYC, come and help us make it to the next round!
What: BigApps Demo Day
When: 12:00PM – 5:00PM | Sunday, November 1, 2015
Where: Made in NY Media Center | 30 John Street | Brooklyn, NY 11201
To learn more about the SONYC project have a look at our video:
Over the past two years I've been working together with a fantastic team of researchers on the SONYC: Sounds of New York City project. Check out our new video!
The objectives of SONYC are to create technological solutions for: (1) the systematic, constant monitoring of noise pollution at city scale; (2) the accurate description of acoustic environments in terms of their constituent sources; (3) broadening citizen participation in noise reporting and mitigation; and (4) enabling city agencies to take effective, information-driven action for noise mitigation.
Noise pollution is one of the top quality-of-life issues for urban residents in the United States. It has been estimated that 9 out of 10 adults in New York City (NYC) are exposed to excessive noise levels, i.e., levels beyond what the EPA considers harmful. Extrapolated to U.S. cities of more than 4 million inhabitants, this estimate corresponds to over 72 million urban residents.
To learn more about the SONYC project please check out the project website: wp.nyu.edu/sonyc
To read our publications on automatic urban sound classification as well as the development of low-cost, high-quality acoustic sensors, check out the project's publication page: wp.nyu.edu/sonyc/publications