Ever since I can remember, music has been a huge part of who I am. Growing up, my parents formed a traditional Mexican trio band and their music filled the rooms of my childhood home. I’ve always felt deeply moved by music, and I’m fascinated by the emotions music brings out in people.
When I attended community college and took my first physics course, I was introduced to the science of music: how it's a complex assembly of overlapping sound waves that we perceive through the vibrations they create in our eardrums. Though my parents had always taken an artistic approach to playing with sound waves, I took a scientific one. Studying acoustics opened doors I never thought possible, from pursuing a career in electrical engineering to studying whale calls using machine learning.
I applied to the Monterey Bay Aquarium Research Institute (MBARI) summer internship program, where I learned about John Ryan and Danelle Cline's research using machine learning (ML) to monitor whale sounds. Once again, I found myself fascinated by sound, this time by analyzing the sounds of endangered blue and fin whales to further understand their ecology. By identifying and tracking the whales' calls and changing migration patterns, scientists hope to gain insight into the broader impacts of climate change on ocean ecology, and into how human influence negatively impacts marine life.
MBARI had already collected thousands of hours of audio, but sifting through all of that data by hand to find whale calls would have been far too cumbersome. That's what led Danelle to introduce me to machine learning. ML enables us to pick out patterns from very large data sets like MBARI's audio recordings. By training a model using TensorFlow, we can efficiently sift through the data and track these whales with 98 percent accuracy.
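To give a rough sense of what this kind of classifier can look like, here is a minimal TensorFlow sketch: a small convolutional network that takes a spectrogram of an audio clip and predicts whether it contains a whale call. The input size, layer shapes, and training setup here are illustrative assumptions for the sketch, not the actual model used at MBARI.

```python
import tensorflow as tf

# A minimal sketch of a call/no-call classifier on audio spectrograms.
# The 128x128 single-channel input and all layer sizes are assumptions
# chosen for illustration, not MBARI's real architecture.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128, 128, 1)),      # e.g. a mel spectrogram of one clip
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # probability the clip contains a call
])

model.compile(
    optimizer="adam",
    loss="binary_crossentropy",
    metrics=["accuracy"],
)

# Training would then run over labeled spectrogram clips, for example:
# model.fit(train_spectrograms, train_labels,
#           validation_data=(val_spectrograms, val_labels))
```

Once trained, a model like this can be run over every clip in an archive, flagging likely whale calls so researchers only review a small, promising fraction of the recordings.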