I coded up an audiovisual website that lets me control a dynamic, interactive flower with hand gestures and voice.
I wanted something I could control with hand gestures, so MediaPipe hand tracking was a good fit for my use case. I also wanted a camera preview, so I could see how my hand movements drive the dynamism of the flower. All the visuals are procedural; there are no pre-made assets. For the audio side, I used the browser's Web Speech API for real-time speech recognition, which outputs text.
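On the gesture side, the core idea is to turn MediaPipe's hand landmarks into a continuous control value for the flower. Here is a minimal sketch: landmark indices 4 (thumb tip) and 8 (index fingertip) come from the MediaPipe Hands landmark model, but the pinch-to-bloom mapping and its tuning constants are my own illustrative assumptions, not the site's actual code.

```javascript
// Euclidean distance between two normalized landmarks ({x, y} in [0, 1]),
// as returned per-point by MediaPipe Hands.
function landmarkDistance(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

// Map the thumb-to-index pinch distance to a 0..1 "bloom" amount.
// minDist and maxDist are hypothetical tuning constants.
function pinchToBloom(landmarks, minDist = 0.02, maxDist = 0.25) {
  const d = landmarkDistance(landmarks[4], landmarks[8]);
  const t = (d - minDist) / (maxDist - minDist);
  return Math.min(1, Math.max(0, t)); // clamp to [0, 1]
}

// Example with a fake 21-point landmark array (indices 0..20, as in MediaPipe):
const fake = Array.from({ length: 21 }, () => ({ x: 0.5, y: 0.5 }));
fake[4] = { x: 0.4, y: 0.5 }; // thumb tip
fake[8] = { x: 0.6, y: 0.5 }; // index fingertip
console.log(pinchToBloom(fake)); // pinch distance 0.2 maps to roughly 0.78
```

A value like this can then be fed each frame into whatever drives the petals, so opening and closing the hand opens and closes the flower.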
The flower was initially a feedback-loop flower, but it lacked the elegance and dynamism I was after. So I switched to a more volumetric, translucent petal system, built with Three.js and GLSL shaders.
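Since the petals are fully procedural, their shape has to come from a function rather than an asset. As a minimal sketch of that idea, the outline generator below produces 2D points that could feed a Three.js geometry; the shape function r(t) = sin(pi * t)^p is an assumption for illustration, not the site's actual shader math.

```javascript
// Generate one side of a procedural petal outline.
// length: petal length; width: max petal width;
// pointiness: exponent controlling how sharply the petal tapers (assumed knob).
function petalOutline(length, width, pointiness = 2, steps = 32) {
  const points = [];
  for (let i = 0; i <= steps; i++) {
    const t = i / steps; // 0 at the base, 1 at the tip
    // sin(pi * t) is 0 at both ends, so the petal closes at base and tip.
    const halfWidth = width * Math.pow(Math.sin(Math.PI * t), pointiness) / 2;
    points.push({ x: -halfWidth, y: length * t });
    // Mirror x for the other side when building the mesh.
  }
  return points;
}

const outline = petalOutline(1.0, 0.4);
console.log(outline.length); // 33 points for 32 steps
```

In a Three.js setup, points like these would be extruded or instanced per petal, with translucency handled in the GLSL fragment shader.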
The entire process was inspired by building new knowledge systems on top of old ones. The connection between new ideas and existing ones can happen automatically through LLMs, much like pollination in flowers. After using Obsidian, I was also able to see how two completely unrelated ideas can merge into one.