NASA Telescope Data Gets Turned Into Music You Can Play, Includes "Where Parallel Lines Converge"

Composer Sophie Kastner teamed up with researchers to develop versions of NASA telescope data that can actually be played by musicians. This pilot program is currently focusing on a small region at the center of our Milky Way galaxy where a supermassive black hole resides.



Why this region? NASA has telescope data on it from the Chandra X-ray Observatory, the Hubble Space Telescope, and the now-retired Spitzer Space Telescope, and the region spans about 400 light-years across. Computers then used algorithms to mathematically map the digital data from these telescopes to sounds that humans can perceive. We’d imagine Google’s MusicLM would generate something completely different-sounding if the right text prompt were entered.
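Neither the article nor NASA spells out the exact mapping used, but a minimal sketch of the general sonification idea, assuming a left-to-right scan where a pixel’s row sets pitch and its brightness sets loudness (the function name and parameters below are illustrative, and NumPy/SciPy are assumed), might look like this:

```python
import numpy as np
from scipy.io import wavfile

def sonify_image(image, duration_s=20.0, sample_rate=44100,
                 f_min=220.0, f_max=880.0):
    """Scan a 2D brightness array left to right: each column becomes a chord,
    a pixel's row picks its pitch, and its brightness sets its loudness.
    This is a hypothetical sketch, not NASA's actual pipeline."""
    n_rows, n_cols = image.shape
    samples_per_col = int(duration_s * sample_rate / n_cols)
    t = np.arange(samples_per_col) / sample_rate
    # Map row index (top = high pitch) onto a log-spaced frequency ladder.
    freqs = np.geomspace(f_max, f_min, n_rows)
    chunks = []
    for col in image.T:
        weights = col / (image.max() + 1e-9)  # normalized brightness per pixel
        chord = (weights[:, None] * np.sin(2 * np.pi * freqs[:, None] * t)).sum(axis=0)
        chunks.append(chord)
    audio = np.concatenate(chunks)
    audio /= np.abs(audio).max() + 1e-9  # normalize to [-1, 1]
    wavfile.write("sonification.wav", sample_rate, (audio * 32767).astype(np.int16))
    return audio

# Toy 64x128 "telescope image": faint noise plus one bright horizontal feature,
# which comes out as a sustained tone in the resulting WAV file.
demo = np.random.rand(64, 128) * 0.1
demo[20, :] = 1.0
sonify_image(demo)
```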


“I like to think of it as creating short vignettes of the data, and approaching it almost as if I was writing a film score for the image. I wanted to draw listener’s attention to smaller events in the greater data set,” said Sophie Kastner, a composer and vocalist based in Montreal.



NVIDIA Jetson-Powered CUREE Robot Dives Deep to Gather Data on Reefs and Sea Creatures

The NVIDIA Jetson-powered CUREE robot, developed by researchers from the Woods Hole Oceanographic Institution (WHOI) Autonomous Robotics and Perception Laboratory (WARPLab) and MIT, dives deep to study coral reefs as well as their ecosystems. CUREE (Curious Underwater Robot for Ecosystem Exploration) can collect visual, audio, and other environmental data alongside divers to help them better understand the human impact on reefs and the sea life around them.



CUREE uses NVIDIA Jetson-powered edge AI to build 3D models of reefs and to track both creatures and plant life. It’s capable of running models to navigate and collect data autonomously. The robot comes equipped with four forward-facing cameras, four hydrophones for underwater audio capture, depth sensors, and inertial measurement unit (IMU) sensors.
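WHOI hasn’t released CUREE’s data-collection code, but here’s a hypothetical Python sketch of how readings from the depth sensor and IMU might be stamped onto one timeline so they can later be lined up with camera frames and hydrophone audio; the record layout and reader functions are stand-ins, not WARPLab’s API:

```python
import time
from dataclasses import dataclass, field

@dataclass
class SensorRecord:
    timestamp: float
    source: str           # e.g. "camera_0", "hydrophone_2", "depth", "imu"
    payload: dict = field(default_factory=dict)

def read_depth_m():
    """Stand-in for a real pressure/depth sensor driver (hypothetical)."""
    return 12.4

def read_imu():
    """Stand-in for a real IMU driver (hypothetical)."""
    return {"roll": 0.01, "pitch": -0.03, "yaw": 1.57}

def collect_once(log):
    """Append one time-stamped snapshot of the non-visual sensors."""
    now = time.time()
    log.append(SensorRecord(now, "depth", {"depth_m": read_depth_m()}))
    log.append(SensorRecord(now, "imu", read_imu()))

log: list[SensorRecord] = []
for _ in range(3):
    collect_once(log)
    time.sleep(0.1)
print(f"collected {len(log)} records, first: {log[0]}")
```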


“The problem is that, underwater, the snapping shrimps are loud. If only we could figure out an algorithm to remove the effects of sounds of snapping shrimps from audio, but at the moment we don’t have a good solution. We manually drive the vehicle until we see an animal that we want to track, and then we click on it and have the semi-supervised tracker take over from there,” said Yogesh Girdhar, an associate scientist at WHOI, who leads WARPLab.
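WARPLab’s semi-supervised tracker isn’t public, but the click-to-track workflow Girdhar describes resembles generic single-object tracking. Below is a rough sketch using OpenCV’s CSRT tracker as a stand-in (requires opencv-contrib-python; the video path is a placeholder, not WHOI footage):

```python
import cv2

# Hypothetical stand-in for CUREE's workflow: an operator draws a box around an
# animal in one frame, then a generic single-object tracker follows it.
cap = cv2.VideoCapture("reef_survey.mp4")  # placeholder path
ok, frame = cap.read()
if not ok:
    raise SystemExit("could not read video")

# "Click on it": the operator selects the animal of interest.
bbox = cv2.selectROI("select animal", frame, showCrosshair=True)
cv2.destroyWindow("select animal")

tracker = cv2.TrackerCSRT_create()
tracker.init(frame, bbox)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    found, (x, y, w, h) = tracker.update(frame)
    if found:
        cv2.rectangle(frame, (int(x), int(y)), (int(x + w), int(y + h)), (0, 255, 0), 2)
    cv2.imshow("tracking", frame)
    if cv2.waitKey(30) & 0xFF == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```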

[Source]

