herakoi 0.2.0
herakoi is a motion-sensing sonification experiment.
It uses a Machine Learning (ML)-based hand-recognition algorithm to track, in real time, the position of your hands in the scene observed by a webcam connected to your computer. The landmark coordinates of your hands are then re-projected onto the pixel coordinates of your favorite image. The visual properties of the "touched" pixels (at the moment, color and saturation) are then converted into sound properties of your favorite instrument, which you can choose from your favorite virtual MIDI keyboard.
In this way, you can hear the sound of any image, for educational, artistic, or just-for-fun purposes!
Fully written in Python, herakoi requires relatively little computational power and runs on the most popular operating systems (macOS, Microsoft Windows, Linux).
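The pixel-to-sound mapping described above can be sketched in a few lines of plain Python. This is an illustration of the idea only, not herakoi's actual internals: the function name, the note range, and the choice of hue for pitch and brightness for volume are assumptions made for the example.

```python
import colorsys

def pixel_to_note(r, g, b, lo=48, hi=84):
    """Map an RGB pixel (0-255 channels) to a (MIDI note, velocity) pair.

    Hue selects the pitch within [lo, hi]; brightness sets the velocity.
    Hypothetical sketch -- herakoi's real mapping may differ.
    """
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    note = lo + round(h * (hi - lo))   # hue -> pitch within the note range
    velocity = round(v * 127)          # brightness -> loudness (0-127)
    return note, velocity

# A pure red pixel: hue 0 gives the lowest note, full brightness gives
# the maximum velocity.
print(pixel_to_note(255, 0, 0))  # -> (48, 127)
```

A real implementation would sample the pixel under each detected fingertip every frame and send the resulting note over a virtual MIDI port.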
Usage

run herakoi path_to_your_favorite_image
open your favorite MIDI player (e.g., if you run herakoi on an Apple computer, GarageBand is a good option)
have fun!

You can customize your herakoi by using the following flags:

--notes XX YY, which makes the pitch span the range from note XX to note YY (with XX equal to, e.g., C4 for middle C)
--volume ZZ, which sets the lower threshold for the note volume (with ZZ as a percentage)
--switch, which inverts the color-brightness mapping
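The --notes flag takes note names such as C4. A small helper shows how such names can be converted to MIDI note numbers; this is a hypothetical sketch following the common "C4 = 60" MIDI convention, not herakoi's actual parser.

```python
# Semitone offset of each natural note within an octave.
NOTE_OFFSETS = {"C": 0, "D": 2, "E": 4, "F": 5, "G": 7, "A": 9, "B": 11}

def note_to_midi(name):
    """Convert a note name like 'C4' or 'F#3' to a MIDI note number."""
    letter, rest = name[0].upper(), name[1:]
    offset = NOTE_OFFSETS[letter]
    if rest.startswith("#"):     # optional sharp
        offset += 1
        rest = rest[1:]
    octave = int(rest)
    return 12 * (octave + 1) + offset   # MIDI convention: C4 == 60

print(note_to_midi("C4"))  # -> 60 (middle C)
print(note_to_midi("A4"))  # -> 69 (concert A)
```

With a helper like this, --notes C4 C6 would translate into the MIDI range 60 to 84.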

FAQs
A list of frequently asked questions.
Contributing
Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change, or contact the authors.
License
Copyright 2022 Michele Ginolfi, Luca Di Mascolo, and contributors.
herakoi is free software made available under the MIT License. For details, see the LICENSE file.
