
Flick HAT : 3D Tracking & Gesture Control

Tap, swipe or flick your palm to control your Raspberry Pi using the Flick HAT!


We are used to controlling our computers with a mouse. What if we could do the same with just a finger, and no mouse at all? And what if we could give input not only in a two-dimensional space, but in a three-dimensional one? Does all this sound cool and exciting enough to make you want some hands-on experience? If your answer is “Yes!”, then you have come to the right place.

Flick HAT is an electric-field-based 3D tracking and gesture control device built around the MGC3130 capacitive sensor chip. It lets the user issue commands with natural hand and finger movements. Applying the principles of electric near-field sensing, the MGC3130 contains all the building blocks needed to develop robust 3D gesture-input sensing systems.


The details and the library package for the HAT are available in a repository on GitHub.

The Flick library can be installed by running the following command in the terminal window:
curl -sSL https://pisupp.ly/flickcode | sudo bash

After installing the complete package on the Raspberry Pi, we can test the Flick HAT with a few demo programs provided by the Pi Supply people. Although they have been generous enough to include some really nice samples (like controlling a robotic arm or the Pi's volume), most of these programs need extra hardware and will not work with the Flick HAT alone. So we run ‘flick-demo’, which brings up an interface showing the x, y, z co-ordinates and a few other parameters. As we move a hand above the HAT, the corresponding values change on the screen. Note that the sensing range is only about 15 cm.
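A stripped-down readout along the lines of flick-demo can be sketched in Python. This is only a sketch: it assumes the Pi Supply library is importable as `flicklib` and exposes a `move()` decorator whose callback receives normalised x, y, z coordinates, so check the GitHub repository for the exact API. The formatting helper is kept separate so it works without the HAT attached.

```python
def format_position(x, y, z):
    """Render one x/y/z sample from the sensor as a status line."""
    return "x: {:.3f}  y: {:.3f}  z: {:.3f}".format(x, y, z)


def run_readout():
    """Print live hand coordinates; requires the Flick HAT and flicklib."""
    import time
    import flicklib  # hardware-dependent import (assumed module name)

    @flicklib.move()
    def on_move(x, y, z):
        # Overwrite the same terminal line with the latest position.
        print(format_position(x, y, z), end="\r")

    while True:
        time.sleep(1)  # callbacks fire in the background
```

Calling `run_readout()` on a Pi with the HAT fitted should give a continuously updating coordinate display, much like the one flick-demo shows.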

We can also try something more interesting, like changing images on the screen using hand gestures. For this, I used the ‘uinput’ library together with the ‘flick’ function from the flick library. We emulate the left and right arrow keys of the keyboard and trigger them according to left or right flicks of the palm above the Flick HAT. This is formulated as an if-else block in the Python script. Now open some images and watch them change as you flick your hand over the HAT.
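The if-else block described above can be sketched as follows. The gesture-to-key mapping is a pure function so it can be tested without hardware; the wiring in `run_demo()` assumes the Pi Supply library exposes a `flick()` decorator whose callback receives start and finish directions (e.g. "west" to "east"), and uses the `python-uinput` package to emit keypresses, so adapt it to the actual library on GitHub.

```python
def key_name_for_flick(start, finish):
    """Map a flick gesture (start -> finish direction) to an arrow key name."""
    if start == "west" and finish == "east":
        return "KEY_RIGHT"  # left-to-right flick: next image
    if start == "east" and finish == "west":
        return "KEY_LEFT"   # right-to-left flick: previous image
    return None             # ignore up/down flicks


def run_demo():
    """Emulate arrow keys from flicks; requires the HAT, flicklib and uinput."""
    import time
    import uinput    # python-uinput: virtual keyboard device
    import flicklib  # hardware-dependent import (assumed module name)

    device = uinput.Device([uinput.KEY_LEFT, uinput.KEY_RIGHT])

    @flicklib.flick()
    def on_flick(start, finish):
        key = key_name_for_flick(start, finish)
        if key is not None:
            device.emit_click(getattr(uinput, key))

    while True:
        time.sleep(1)  # callbacks fire in the background
```

With an image viewer focused, each horizontal flick over the HAT then behaves exactly like pressing the corresponding arrow key.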

We hope you found this interesting and informative. Thank you!
