Final Project: Across Kazakhstan 🐎

Yay, I got to turn my final idea into reality! 🏄‍♀️

Idea 🙆‍♀️

As you may have noticed, I struggled to come up with a project idea for quite a while; it took me two weeks. I had even started building my first idea, which was a 3D controller for a game. Honestly, I found that idea on the internet and just wanted to make my own snake game to go with the controller. The idea was really “half-baked,” though, and I was not excited about building it. I had assumed that simply having an idea would be enough to carry me through the implementation; apparently not for me. After struggling with the concept for a while, I talked to Professor Aaron, who emphasized that there should be a part of me in the project. I was lost for a couple of weeks until I was eventually inspired by a song by Kazakh artists. The song made me think about the nature of our country, and suddenly ideas for an interactive experience started to come to me. I then discussed them with Professor Aaron, who helped me reach a more refined view of the project, and I got really excited about implementing my final idea.

Process 👩‍💻

I booked an iPad from the lab to create some visuals (that was my first time using one, and I really enjoyed exploring its possibilities for creative purposes). I found some photos of our country’s nature on the internet and smudged the colours to create a painting effect.

Also, special thanks to Jack for expanding my idea by suggesting a physical interaction with 3D printed grass. He kindly guided me through using the 3D printer for the first time.

I had some struggles controlling the ZX gesture sensor, which is supposed to detect a user waving a hand over the grass, turning on the LED strip and setting the horse running. At first I used the library example that reports hand coordinates. However, the sensor would sometimes return random values, and the horse would demonstrate its audacity by running on its own instead of “obeying” the movements of my hand. So, with the Professor’s help, we switched from the coordinate example to the library’s gesture example. Unlike the raw coordinates, gestures were easy to control: the sensor only reacts when it recognizes an actual swipe.
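The difference between the two approaches can be sketched in plain C++ (no hardware calls; the enum names mirror the ZX_Sensor library’s RIGHT_SWIPE/LEFT_SWIPE constants, but the function is a hypothetical model of my logic, not library code): a recognized swipe flips the horse’s direction, and anything else leaves the state alone, so noisy readings can no longer send the horse off on its own.

```cpp
#include <cassert>

// Illustrative model of the gesture logic (names mirror the ZX_Sensor
// library's gesture constants; the function itself is hypothetical).
enum Gesture { NO_GESTURE, RIGHT_SWIPE, LEFT_SWIPE };

// Direction state: 0 = horse runs right, 1 = horse runs left.
// Only a clean swipe changes the state; noise keeps the horse obedient.
int updateDirection(int current, Gesture g) {
  if (g == RIGHT_SWIPE) return 0;
  if (g == LEFT_SWIPE) return 1;
  return current; // no recognized gesture -> no change
}
```

This is exactly why the gesture example tamed the horse: with coordinates, every noisy reading changed the state, while here an unrecognized input is simply ignored.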

Also, I have a potentiometer, hidden behind the sun and moon pictures. I use it to change the background of the program so that there is a transition from day to night. Compared to the sensor, the potentiometer was far less “capricious” :))

Result 🤸‍♀️

I am really happy with the final result! Thank you to the whole IM Lab team and Professor Aaron for helping me throughout this process! I learned a lot of interesting and practically useful things this semester. There were several hardships along the way, but it was a rewarding experience 🙂

#include <Wire.h>
#include <ZX_Sensor.h>
#include <FastLED.h>

#define LED_PIN     5
#define NUM_LEDS    25
#define BRIGHTNESS  64
#define LED_TYPE    WS2811
#define COLOR_ORDER GRB

CRGB leds[NUM_LEDS];
int currentVal = 0;

const int ZX_ADDR = 0x10;  // ZX Sensor I2C address

// Global Variables
ZX_Sensor zx_sensor = ZX_Sensor(ZX_ADDR);
uint8_t x_pos;
uint8_t z_pos;

CRGBPalette16 currentPalette;

GestureType gesture;

void setup() {
  delay(3000); // power-up safety delay
  FastLED.addLeds<LED_TYPE, LED_PIN, COLOR_ORDER>(leds, NUM_LEDS).setCorrection(TypicalLEDStrip);
  FastLED.setBrightness(BRIGHTNESS);

  // Initialize the ZX sensor over I2C
  zx_sensor.init();

  // Initialize Serial port and wait for p5.js to respond
  Serial.begin(9600);
  while (Serial.available() <= 0) {
    Serial.println("0");
    delay(300);
  }
}

void loop() {
  // Any swipe over the grass changes the LED colour and the horse's direction
  if (zx_sensor.gestureAvailable()) {
    gesture = zx_sensor.readGesture();
    switch (gesture) {
      case RIGHT_SWIPE:
        currentPalette[0] = CRGB(255, 255, 0);
        x_pos = 0;
        break;
      case LEFT_SWIPE:
        currentPalette[0] = CRGB(0, 255, 0);
        x_pos = 1;
        break;
      default:
        break;
    }
  }

  // paint the whole strip with the current colour
  for (int i = 0; i < NUM_LEDS; i++) {
    leds[i] = currentPalette[0]; //CHSV(hue++, 255, 255);
  }
  FastLED.show();

  // on each newline from p5.js, reply with the gesture state and
  // the potentiometer reading that drives the day/night background
  while (Serial.available()) {
    byte inComing = Serial.read();
    if (inComing == '\n') {
      int sensor = analogRead(A0);
      Serial.print(x_pos);
      Serial.print(',');
      Serial.println(sensor);
    }
  }
}