Decoding the language of things through deep signal analytics
Samarjit Das has always been fascinated with space, from dreaming of being an astronaut as a young boy to pursuing that passion into adulthood and landing a project with NASA to develop novel AI-based sensing technologies that advance space exploration.
Sam leads the Bosch research team that developed SoundSee — a system that uses sound recognition with machine learning to help monitor nominal operation of critical systems and infrastructure aboard the International Space Station (ISS).
Through integrated artificial intelligence, the technology builds a visual map of its surroundings, tracing sound patterns back to their sources to detect potential anomalies.
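For readers curious about the underlying idea, the sketch below shows one simple way an audio anomaly detector of this general family might work: learn the spectral "signature" of a machine's normal hum from known-good recordings, then score new recordings by how far they drift from that signature. It is a minimal illustration under assumed inputs, not SoundSee's actual algorithm, and the function names and numbers are hypothetical.

```python
# Illustrative sketch of spectrum-based audio anomaly detection, assuming
# clips arrive as mono NumPy arrays. This is not Bosch's SoundSee algorithm;
# it only demonstrates learning a "nominal" sound signature and flagging
# recordings that deviate from it.
import numpy as np


def spectral_signature(audio: np.ndarray, frame_size: int = 1024) -> np.ndarray:
    """Average magnitude spectrum over fixed-size frames of a clip."""
    n_frames = len(audio) // frame_size
    frames = audio[: n_frames * frame_size].reshape(n_frames, frame_size)
    return np.abs(np.fft.rfft(frames, axis=1)).mean(axis=0)


def fit_baseline(nominal_clips):
    """Mean and spread of the signature across known-good recordings."""
    sigs = np.stack([spectral_signature(c) for c in nominal_clips])
    return sigs.mean(axis=0), sigs.std(axis=0) + 1e-8


def anomaly_score(clip: np.ndarray, mean: np.ndarray, std: np.ndarray) -> float:
    """Average z-score deviation of a new clip from the nominal signature."""
    return float(np.mean(np.abs((spectral_signature(clip) - mean) / std)))


# Usage with synthetic data standing in for real microphone recordings.
rng = np.random.default_rng(0)
nominal = [rng.normal(size=16000) for _ in range(10)]              # "healthy" hum
mean, std = fit_baseline(nominal)
suspect = rng.normal(size=16000) + 0.5 * np.sin(np.arange(16000))  # extra tone
print("nominal score:", anomaly_score(nominal[0], mean, std))
print("suspect score:", anomaly_score(suspect, mean, std))
```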
Sam hasn’t physically made it to space yet, but a part of him was with SoundSee when it was sent to the ISS in November 2019, making his dream come true.
Sam is a native of India and grew up in a part of the world where human space exploration did not exist.
Working with astronauts and leading this mission has been an emotional journey for Samarjit and his family.
When you meet Sam, you quickly learn that his passion for space and for making the world a better place is not limited to his day job. His daughter is named Antariksa, which means “outer space” in Sanskrit.
On many nights, Sam and Antariksa can be found gazing at the sky through a telescope from the comfort of their backyard in Pittsburgh.
One thing is clear: Sam is not slowing down anytime soon. His accomplishments and discoveries have driven him to keep exploring potential solutions and opportunities to make life better for his community and for people around the globe. Scientists at Bosch Research are taking lessons learned from SoundSee and applying them to healthcare, working with collaborators such as Highmark to explore detecting early signs of lung abnormalities related to childhood asthma in pediatric patients.
As the team continues to develop technologies for various audio AI applications, there is a larger vision in mind: decoding the “language of things” through deep signal analytics.
“We continue to push the boundaries of what you can do with audio, but it may not just be audio. What we call deep signal analytics AI means all the other sensory cues that might be coming out of ‘things,’ like temperature, pressure, radio waves, and more, and decoding their stories, making sense of them, reasoning about and understanding them like we do with human language.”
“We have had this vision of decoding the language of things way before ChatGPT, but we see ChatGPT as an enabler to realize that original vision.”
Understanding that everything around us is talking, just not in the same language, Sam aims to immerse himself in the world of things to better understand their communication patterns: a fuel cell, your radiator, your computer, your HVAC system. Everyday items that are communicating constantly through sound, vibration, frequency, current, and the like.
Sam likes to imagine the possibilities if we had technology that could tell us exactly what is happening inside our home appliances before they fail completely, and even recommend a solution. He also sees the technology potentially recording and analyzing your child’s cough through an app for a preliminary diagnosis, recommending a doctor’s office visit depending on the results.
“We see everything through signals and what signals are telling us. What rich stories are we not picking up?”
The work is never done for people like Sam, who have made it their mission to continuously work toward the greater good: to invent for life.