makerdad

Rover Hardware Lessons

As I was putting together the Rover v2, there were quite a few lessons along the way, mostly about hardware. I’ll capture them here for posterity.

Handling Power

Since I have two motors for driving and another for the LIDAR, I had to use two L298N modules, with different input voltages. I already had a voltage booster for the drive motors, getting 7.4v from the LiPo and boosting it to 14v (12v for motors + 2v for the module). I thought I could simply plug in a direct line from the LiPo for the other module, since the LIDAR motor is 6v. In practice this proved to be unreliable. Without a booster, there was a lot of variability in the current and it caused stalling issues with the motor. Once I added a secondary boost to 8v, it started humming along as expected.

Sharing Logic Power

L298N modules have a very useful two-pin bridge that lets battery power be bucked down to 5v and passed back through to the microcontroller. The problem is having to constantly fit the little jumper cap on and off to toggle this when switching between programming mode (Arduino powered by USB) and mobile driving mode.

I ended up adding a second flip switch to the control panel on the rover to make this easy. Now I can flip between mobile vs. plugged mode at any time.

Preventing Motor Stall

I noticed that the LIDAR motor needs an initial power boost to start turning. This meant that the default PWM sometimes wasn't enough to generate the torque needed for the initial move, and the motor would get stuck. The solution was actually easy: start with a higher PWM and slowly bring it down to the desired level. This ensures the motor always starts turning properly.
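
In code, the ramp is just a couple of lines (a sketch; the pin and timing values are made up):

const uint8_t LIDAR_MOTOR_PWM = 9;  // hypothetical pin

void startLidarMotor(uint8_t targetPwm) {
  // Kick the motor with max PWM so it overcomes static friction...
  analogWrite(LIDAR_MOTOR_PWM, 255);
  delay(50);

  // ...then ramp down to the desired level
  for (int pwm = 255; pwm > targetPwm; pwm--) {
    analogWrite(LIDAR_MOTOR_PWM, pwm);
    delay(2);
  }
}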

Adding PID for LIDAR Motor

Since I had quite a bit of experience dealing with PID, adding one for the LIDAR motor was relatively easy. I added a simple delta check to make sure it's turning at the desired speed. This ensures that the LIDAR can capture scans at a fairly constant rate, and the degree distribution is more even.

Hardware Interrupts

Despite using an Arduino Mega, I ran out of hardware interrupts. This is mainly because the Mega shares two interrupts with the I2C ports. It's non-trivial (or probably impossible) to have them work at the same time, so I had to change my port planning a bit. I found the Enable Interrupt library, which also has a great write-up about PCINT capability on the Arduinos. I ended up using the A8 and A9 pins for the LIDAR detection, and Enable Interrupt made it really easy to set things up.
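
Setup is about as simple as it gets (a minimal sketch; the ISR name is made up, and A9 is wired the same way):

#include <EnableInterrupt.h>

// A8 is one of the PCINT-capable pins on the Mega
void onLidarMark() {
  // count or flag the detection; keep ISRs short
}

void setup() {
  pinMode(A8, INPUT);
  enableInterrupt(A8, onLidarMark, RISING);
}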

One downside: You can't use Enable Interrupt alongside the excellent Encoder library; it won't compile. However, Enable Interrupt doesn't have native support for quadrature encoders, so I had to do some more digging to make it work with the driving motors. In the end I settled on this Quadrature Encoder library, which uses Enable Interrupt under the hood, so the two play well together. Unfortunately Quadrature Encoder doesn't seem to be maintained anymore, but it's only a few lines of code to count the turns properly, so I integrated it directly into the project.
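
The counting boils down to the classic quadrature trick (a sketch with hypothetical pin assignments):

#include <EnableInterrupt.h>

const uint8_t ENC_A = A10;  // hypothetical pins
const uint8_t ENC_B = A11;

volatile long ticks = 0;

// Interrupt on channel A only (CHANGE). When the two channels read
// the same, the wheel is turning one way; when they differ, the other.
void onEncoderChange() {
  if (digitalRead(ENC_A) == digitalRead(ENC_B)) {
    ticks++;
  } else {
    ticks--;
  }
}

void setup() {
  pinMode(ENC_A, INPUT_PULLUP);
  pinMode(ENC_B, INPUT_PULLUP);
  enableInterrupt(ENC_A, onEncoderChange, CHANGE);
}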

Wheels and Traction

I spent a lot of time experimenting with different wheel configurations (probably too much). The first version had two fixed wheels at the front. This was great for going straight, very stable. Unfortunately, making turns was problematic. The front wheels had too much friction and caused the back wheels to spin. This became a big issue, since it made the encoder counts very unreliable for odometry.

I then tried a bunch of different configurations, using a single wheel at the front. I tried building a makeshift “landing gear” style wheel, which looked really cool but didn’t work in practice (it would get stuck in weird positions). I also tried making a version of this that had limits on its Z rotation axis. That didn’t improve things either.

In the end I had to settle for a caster wheel. I am still quite unhappy about this, since the caster is too small and very noisy on hard surfaces. It also gets stuck, and I had to flip the driving direction of the rover 180 degrees, since the caster only worked well at the back.

I also ended up redesigning the back wheels. I replaced them with AndyMark HiGrip wheels. They work really well, have amazing traction and look great.

LIDAR v2 Design

Now that the Rover v2 base is working and driving well, I am turning my attention to the new LIDAR module. After reconsidering the previous GT2 pulley-based design, I decided to redesign it completely on the new Rover base, using a gear-based approach.

The reasoning behind the geared drive is fairly simple: the pulley and timing belt approach never created as smooth a drive mechanism as I had imagined. It also takes up a lot of space, and it's a nightmare to calculate for. Finally, printing GT2 pulleys doesn't produce very clean results on the 3D printer, which in turn contributes to the lack of smoothness.

So there are lots of reasons to simplify the design with this version. I kept the bearing the same, but, learning from the previous version, added an extrusion that can be detected by the QRD1114, which will sit conveniently underneath. I also have a nice little lid for the LIDAR.

LIDAR Rover Design

After a few nights of printing and tinkering, I got a successful implementation of the new LIDAR, completely integrated with the Rover!

The new design worked really well. The gear-based integration with the motor is really smooth, and the sensor placement is perfect. The only issue was getting the sensor to detect the PLA extrusion. I ended up solving this by gluing on a small piece of white printer paper with a thin black line printed on it. The QRD1114 detects the switch from white to black on the paper, and nothing else.

The bigger issue was the motor. The 6V encoder motors I got from Banggood didn't have enough torque to drive the gears. I could get them to work at max PWM, but then it could only capture 24 scans per revolution. That's not enough resolution for mapping.

Thankfully I had another set of encoder motors I had bought on sale. These are tiny 6V motors, but they come with a gear reducer attached. I thought this might give me enough torque, and I was right. Once I printed a custom holder for the small motor and attached it, it could drive the whole mechanism at a PWM of around 40. As a bonus, the motor is four times lighter.

Once I ran this setup, to my delight I got 360 scans per revolution. That's one-degree resolution per scan, which is pretty amazing. I can also visualize the laser scan array in Rviz:

Driving Rover with an Xbox Controller

Before jumping to the PID implementation and eventually autonomous driving, I wanted to make sure that the Rover v2 could drive and turn with stability. I could have done a simple keyboard implementation but I really wanted to integrate an old Xbox controller I had lying around.

What I have seems to be an Xbox One controller and it connected to my Mac over Bluetooth with no issues. After some research, I found out that pygame is actually the best way to work with these game pads.

I had never used pygame before, but it felt very natural. It has a great abstraction layer for the various controls on the gamepad, so I could extract axis data from the joysticks with just a few lines:

import pygame

MAX_POWER = 255  # maximum PWM value the Arduino expects

pygame.init()
controller = pygame.joystick.Joystick(0)
controller.init()

while True:
    pygame.event.pump()  # let pygame process joystick events

    left_power = controller.get_axis(1)   # left stick, vertical axis
    right_power = controller.get_axis(3)  # right stick, vertical axis
    left_speed = int(abs(left_power) * MAX_POWER)
    right_speed = int(abs(right_power) * MAX_POWER)

I then coupled this with the previous XBee implementation I had done and hooked it up with pyserial. To Python, the XBee looks like just another serial port, and thanks to the magic of XBees, all the remote control is taken care of.

After some tinkering, I had a sweet little Python script that got the control values from the Xbox controller and passed them on to the Arduino on the rover. Here it is in action:

Overall I’m very happy with this. I think the crappy tires are causing issues with traction though. I will have to swap them out or print new ones using the new direct drive extruder I got.

Next step: PID implementation, two way data transfer, and eventually ROS integration for control. I also need to redesign the LIDAR piece to fit on the rover. Instead of pulleys, I’m planning to do a gear based implementation.

Driving Straight for Real Using PID

After implementing the simple straight driving sample for the rover, and integrating the XBee wireless protocol, it was time to test things out. I installed the new gearboxes (they came with motors installed and cables soldered, yay!) and did a test run. However, the first issue I ran into after switching to the battery setup was the OLED screen not starting up. I lost a bit of time on this, because the screen was actually working well when I connected the Arduino to the computer.

The OLED screen uses I2C, so I figured it was an issue with the I2C discovery step and spent some time debugging it, to no avail. Then I realized that the screen only worked when the serial monitor was turned on. This led me to suspect the software serial implementation, and I was right.

I was using SoftwareSerial to run the XBee serial communication module. Some research showed that SoftwareSerial has some serious performance issues. An alternative suggested was AltSoftSerial, so I gave it a try and magically things started working again! Lesson learned. One note for the future though: AltSoftSerial is very well optimized, but it takes over a timer and disables PWM on pin 10.
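
The swap itself was painless, since AltSoftSerial mirrors the SoftwareSerial interface (a minimal sketch, assuming the XBee's default 9600 baud):

#include <AltSoftSerial.h>

// No pin arguments: AltSoftSerial is tied to specific timer pins
// (RX 8, TX 9 on an Uno), which is why pin 10 loses PWM
AltSoftSerial XBee;

void setup() {
  Serial.begin(9600);
  XBee.begin(9600);
}

void loop() {
  if (XBee.available()) {
    Serial.write(XBee.read());
  }
}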

Rover v1

Now that everything was running smoothly, I ran my first test on the ground. Results were better, but not great. The rover was still swerving quite a bit to the left. I suspected some of this was mechanical, since I had a caster at the front, and it's not the best solution for keeping the robot on a straight line. So I reworked the front into a two-wheel setup. This helped somewhat and definitely removed a lot of friction, but the swerve was still there.

I also suspected that the simplistic motor speed modulation script I took from the Sparkfun sample wasn't working great. So I did some research into how this is achieved and came across a concept called a PID controller. A proportional–integral–derivative controller is apparently a logic loop that keeps adjusting an output value based on an input and an expected input value. The algorithm itself is not that complicated, and there is also a great Arduino library implementing it that simplifies things significantly.

That said, it took me a bit of time to understand how to fit PID into my control system. PID itself is agnostic to how it's implemented; all it cares about is three values: input, output and setpoint. It then runs on a specific time cycle, giving back output values.

The output is easy to understand: it's essentially the PWM value that's pushed out to the motor, so it determines the motor speed. The input and setpoint obviously need to be in the same units; in my case, they need to relate to the encoder values. What I was doing before was simply checking how close the two motor encoder counts were and varying the speed accordingly. That approach doesn't really work with a PID controller.

The solution was to separate the two motors completely. Instead of doing a comparative analysis, each motor needs its own PID controller, checking its encoder output against a preset expected value (hence the term “setpoint”). So I ended up calculating the number of encoder clicks expected over a specific period of time, effectively a click frequency, and comparing the measured frequency against that expectation.

As an example, I could say that I run the PID controller calculation every 100ms, and that the expected encoder click change every 100ms is 50 clicks. I then pass in the actual encoder click delta over the same timeframe, and the PID controller compares the two and decides how much power should be output to the motor (as PWM).
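
With the Arduino PID Library, that per-motor loop looks roughly like this (a sketch: the pin, gains and setpoint are made up, and the encoder ISR is assumed to exist elsewhere):

#include <PID_v1.h>

const uint8_t LEFT_MOTOR_PWM = 5;      // hypothetical pin
const unsigned long SAMPLE_MS = 100;

volatile long leftClicks = 0;          // incremented by the encoder ISR

double input = 0, output = 0;
double setpoint = 50;                  // expected clicks per 100ms

// Made-up gains; these are what need the black-magic tuning
PID leftPid(&input, &output, &setpoint, 2.0, 5.0, 1.0, DIRECT);

long lastClicks = 0;
unsigned long lastSample = 0;

void setup() {
  leftPid.SetSampleTime(SAMPLE_MS);
  leftPid.SetOutputLimits(0, 255);     // output maps straight to PWM
  leftPid.SetMode(AUTOMATIC);
}

void loop() {
  if (millis() - lastSample >= SAMPLE_MS) {
    lastSample = millis();

    noInterrupts();
    long clicks = leftClicks;
    interrupts();

    input = clicks - lastClicks;       // actual clicks this cycle
    lastClicks = clicks;

    leftPid.Compute();
    analogWrite(LEFT_MOTOR_PWM, (int)output);
  }
}

The right motor gets an identical, fully independent copy of this loop.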

This article on element14 was very helpful. There is also a Zumo bot project I found that was very useful for understanding the concepts, however I didn’t end up using the implementation since the Arduino PID Library provided much better abstraction.

Of course, the actual PID implementation is more complicated than this. There are three tuning parameters: the proportional, integral and derivative gains. These are used to “tune” the PID algorithm and have it output expected results. Unfortunately, tuning PID seems to be a bit of black magic, and I will have to spend some time on this.

Good news is that the initial implementation is working really well after some tinkering with it. Here is a video of it in action:

Next step is to add rotation to this flow. And then I will work on a dead reckoning implementation to have it move reliably on a specific path.

XBee Protocol Design

One of the goals of the self driving rover project has always been to have it controlled by a base “brain” unit that does the actual analysis and motion control. I have a few XBee S1 units I had purchased a long time ago and I wanted to put them to good use for this.

XBees are quite versatile but notoriously hard to configure. There are tons of configuration options available and the XCTU software is not the most friendly.

I started out by plugging both XBees into a USB FTDI board and launching XCTU to make sure they were properly visible. After finding them, I reset them to factory settings in case some weird config was lurking in there. Then I configured one as the controller and the other as the end unit, setting the matching address attributes so they could auto-discover each other. There are tons of good articles on this part. This one from Sparkfun and this beginner’s guide to XBee on Atomic Object were very helpful.

Once configured, I hooked up the end unit to the Arduino. It's a simple connection, and thanks to the SoftwareSerial library, you don't have to use the hardware RX/TX pins. In fact, it's recommended that you avoid them, since RX/TX might interfere with USB serial communications.

Then I uploaded a very simple sketch on the Arduino. It simply prints out every received byte from the XBee:

// XBee is a SoftwareSerial instance, e.g. SoftwareSerial XBee(2, 3);
while (XBee.available()) {
    char value = XBee.read();

    Serial.println(value);
}

I then went into terminal mode on XCTU and started typing random commands. Output immediately came back through the Arduino to the serial monitor. Things were looking good.

Next step was designing a simple protocol for sending a command and a value. My thinking was that I could get away with three commands:

  • Forwards: F
  • Backwards: B
  • Rotate: R

Then I could append the value to this. For instance F100 would move 100cm forwards.

Reading from the XBee is no different than any other serial interface. However, I wanted to ensure full commands were being read without having to enforce a fixed command length. I decided to use byte 0x0D as the end delimiter. That's the carriage return character, so it keeps the terminal tidy as well (one command per line).

Implementation required me to refresh my C knowledge a bit. I had to figure out how to properly convert an array of char values into an integer and then how to reset the values in that array (memset to the rescue).

Here is the relevant implementation:

const char COMMAND_FORWARDS = 'F';   // 0x46
const char COMMAND_BACKWARDS = 'B';  // 0x42
const char COMMAND_ROTATE = 'R';     // 0x52
const char COMMAND_END = '\r';       // 0x0D, carriage return

char command = 0x00;
char payload[5] = {};  // up to 4 digits plus a terminating NUL for atoi
uint8_t payload_index = 0;

while (XBee.available()) {
    char value = XBee.read();

    if (value == COMMAND_END) {
      int argument = atoi(payload);

      Serial.println(command);
      Serial.println(argument);

      display.clearDisplay();
      display.setCursor(0, 0);

      if (command == COMMAND_FORWARDS) {
        display.println(F("Forwards:"));
      } else if (command == COMMAND_BACKWARDS) {
        display.println(F("Backwards:"));
      } else if (command == COMMAND_ROTATE) {
        display.println(F("Rotate:"));
      }

      display.println(argument);
      display.display();

      payload_index = 0;
      command = 0x00;

      memset(payload, 0, sizeof(payload));
    } else if (payload_index == 0) {
      // First byte of a packet is the command character itself
      command = value;
      payload_index++;
    } else if (payload_index < sizeof(payload)) {
      // Subsequent bytes are the numeric argument; the bounds check
      // keeps an overlong command from overflowing the buffer
      payload[payload_index - 1] = value;
      payload_index++;
    }
}

For good measure I attached a tiny OLED on the rover and had it display the active command. Because why not?

Next step: Attach the new gearboxes (old one is definitely broken) and drive this thing remotely. I will also need to calibrate the encoders to make sure it’s travelling the exact distance I expect.

XBee

Driving Straight

I made some OK progress on the driving unit last night. I finally got the T connector for the LiPo battery, so I could hook everything up and power both the Arduino and the motors from a portable source.

I was worried about powering them together, but it turned out to be simple. I soldered a buck converter (which steps the voltage down) between the battery line and the motor shield and set it to give out a constant 4.5V, which should be more than enough for the 3V motors I have in there.

The Arduino specs say that the power jack can safely handle 7-12V input. This is perfect, since the LiPo I'm using is 2S (7.4V). I hooked the LiPo straight into the Arduino power jack and then split a line out to the buck converter. This also prevents any power back-feed from the motor shield. I now have a mobile rover!

Things didn't go so well on the ground though! The robot immediately swerved left and hit the wall. I suspected this might happen, since the encoders were showing that the right side was clocking half the turns of the left side. After looking at (and listening to) the wheels and gears, I think the gearbox on the right one is busted.

I ordered some replacement gear boxes and will open up the one I have to see if I can fix it tomorrow.

In the meantime I also hooked up an XBee S1 module to the unit. The plan is to be able to remotely control it from the computer. Configuring the XBees was easier than I remembered. The XCTU app is a dumpster fire, but once you know how to do a few basic things, configuration is fairly easy.

Plan for tomorrow: Implement a simple serial protocol to control the driver unit. I'm imagining a couple of commands for starters: Direction + Distance and Rotation + Angle (i.e. F100, B50, R90).

Rover

Visualizing LIDAR Data

I had my first try at visualizing incoming data from the LIDAR last night. Good news is that it worked, and I got a successful SLAM map of the room. That’s pretty awesome.

LIDAR map

Bad news is I definitely have to redesign the pulley mechanism. Stepper motor is ridiculously loud, heavy and slow. I need to design a smaller pulley system that runs on an encoded DC motor.

I used a Python library called BreezySLAM for the SLAM implementation. It really simplifies the amount of work you have to do. I had to do some figuring out on Python serial comms, but once I got the data from Arduino to Jupyter, I had the image rendered in a few minutes. Here is the relevant piece of code:

import PIL.Image as Image
import IPython.display

from breezyslam.sensors import Laser
from breezyslam.algorithms import RMHC_SLAM

MAP_SIZE_PIXELS = 800
MAP_SIZE_METERS = 7

# scans_mm was read from the Arduino over serial earlier: one list of
# distance readings (in mm) per revolution
TOTAL_SCANS = len(scans_mm)

# Laser(scan_size, scan_rate_hz, detection_angle_degrees, distance_no_detection_mm)
garmin = Laser(332, 4, 360, 4000)

slam = RMHC_SLAM(garmin, MAP_SIZE_PIXELS, MAP_SIZE_METERS)

for i in range(TOTAL_SCANS):
    slam.update(scans_mm[i])

map_data = bytearray(MAP_SIZE_PIXELS * MAP_SIZE_PIXELS)
slam.getmap(map_data)

image = Image.frombuffer('L', (MAP_SIZE_PIXELS, MAP_SIZE_PIXELS), map_data, 'raw', 'L', 0, 1)
IPython.display.display(image)

So, success on the software but back to the drawing board on the mechanical. This is not such a big deal, since I can apply the learnings to the new design. Notes for v2:

  • Keep pulleys as small as possible
  • Reduce timing belt distance to 200mm if possible (currently 400mm)
  • Move QRD1114 to the bottom for easier mounting (and hiding the calibration disk)

Encoding DC Motor Steps

Now that I have successfully encoded the stepper motor position for LIDAR, I turned my attention to encoding the DC motors that I am planning to use for running the robot wheels.

There are several good examples of this online, but most use quadrature encoders (DC motors with embedded dual encoders, usually magnetic). Unfortunately I don't have any such motors at hand, and since I already have many, many plain DC motors, I'm hesitant to buy more.

So after watching this fun little video from Sparkfun, I decided to build my own simple encoder. I already have the optical encoders with disks ready to go. So I designed and printed a motor holder that can hold the gear shaft and encoder disk in place.

3D printed motor encoder

Funny enough, the biggest problem I had was soldering the cables on these flimsy 3V motors. In retrospect using the bigger 6V motors would have been a better idea. I might swap things around once I have a working prototype.

The Sparkfun video comes with a handy little code snippet that speeds up and slows down the two motors so they drive more or less equally. I adapted this for the Arduino Motor Shield and ran a few tests. It seems to be working, but it's hard to tell without an actual test on the ground.
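
My adaptation boiled down to something like this (a rough sketch: the channel A/B pins are from the Motor Shield documentation, the counts come from the optical encoder ISRs, and the starting speeds are made up):

const int PWM_A = 3, DIR_A = 12;   // Motor Shield channel A
const int PWM_B = 11, DIR_B = 13;  // Motor Shield channel B

volatile long countA = 0, countB = 0;  // incremented by the encoder ISRs

int speedA = 180, speedB = 180;

void setup() {
  pinMode(DIR_A, OUTPUT);
  pinMode(DIR_B, OUTPUT);
  digitalWrite(DIR_A, HIGH);  // both motors forward
  digitalWrite(DIR_B, HIGH);
}

void loop() {
  delay(100);

  // Nudge the faster motor down and the slower one up so the
  // cumulative encoder counts stay roughly equal
  if (countA > countB) {
    speedA--;
    speedB++;
  } else if (countB > countA) {
    speedA++;
    speedB--;
  }

  speedA = constrain(speedA, 0, 255);
  speedB = constrain(speedB, 0, 255);

  analogWrite(PWM_A, speedA);
  analogWrite(PWM_B, speedB);
}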

I will now have to convert everything to run on a LiPo battery so I can autonomously test this on the floor. Objective is to get it to drive straight a specific distance.

Once I have reliable results, I am planning to add a pair of coupled XBee units to be able to remotely control the robot from my laptop.

3D printed motor encoders in place

Encoding Stepper Motor Position

I finally ran into an issue that I have been dreading for a while: Encoding the actual position of the LIDAR pulley mechanism. Now that the LIDAR scan works, I need accurate position data alongside the measurements. As reliable as my ridiculously powerful stepper is, it’s not a servo and doesn’t know which position it starts from.

There are various ways of solving this, with varying levels of budget and difficulty:

  • Absolute encoders use code wheels alongside optical or magnetic sensors that have a unique pattern per step. Whatever orientation the wheel is in, the sensors are always able to determine the exact angle.
  • Quadrature encoders use two coupled sensors (either magnetic Hall effect sensors or optical ones) to determine wheel direction. They cannot determine absolute position.
  • Single optical encoder wheels that rely on a chip such as the LM393 are able to determine incremental movement, but not direction. They also cannot determine absolute position.

Unfortunately, among these options, the only one that could work is the absolute encoder. These are hard to find and as far as I can tell quite expensive. One can probably DIY it, but it wouldn’t be a simple implementation.

As I was thinking through this, it occurred to me that all I really care about is the starting position. I had initially assumed the stepper motor wouldn't be reliable, but some quick testing showed that it is. A stepper is very effective at counting its steps, since that's literally the main thing it's good at. So as long as I position the starting point properly, I can get accurate positional data.

This is obviously not a novel idea, since all stepper motors work like this. In a 3D printer or CNC machine, the steppers move until they hit the stop switch, which tells the device that it found the starting position. Then it can count its steps from that position and know its exact location.

I really didn’t want to use a physical stop switch, even though I have quite a few lying around from the flip clock project. This got me thinking: Could I use an optical switch instead? I started looking into line tracking robots, since that’s a very similar use case.

Research led me to the QRD1114 optical detector module. The device is a neat mashup of an IR LED coupled with a phototransistor. The result is that it's very good at telling black from white on non-reflective surfaces (i.e. standard printer paper).

The device looked familiar, so I looked in my sensors box and got lucky! I already had one. I got to work prototyping it and had it working very quickly. The Sparkfun guide above was very helpful.

After some trial and error, I prepared a circular drawing in Illustrator with the exact dimensions of the pulley extension. Then I printed it on the laser printer and cut it out. The sensor detected it very reliably.

LIDAR with makeshift encoder

To make the code just as efficient, I used a hardware interrupt whenever the sensor triggers on black:

attachInterrupt(digitalPinToInterrupt(2), detectMark, RISING);

Every time the LIDAR starts running, I run a calibration cycle to determine where the line is. I can then start scanning from that position and increment with the stepper. Every time the line is detected, the position is reset, since that means we've completed a full circle scan.
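
The calibration cycle itself is only a handful of lines (a sketch; stepOnce() stands in for whatever pulses the stepper one step):

volatile bool markSeen = false;
long currentStep = 0;

void detectMark() {
  markSeen = true;  // the QRD1114 just saw the black line
}

void calibrate() {
  // Step until the sensor fires; that position becomes zero
  markSeen = false;
  while (!markSeen) {
    stepOnce();  // hypothetical helper that advances one step
  }
  currentStep = 0;
}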

Early testing showed very promising results today. I will attempt to create a Processing.org sketch that can visualize the results tomorrow.

Stepper Motor Success

After a few days of struggling with stepper motor controls, I finally got everything working reliably last night. There were a lot of learnings along the way, most of them not very well explained in the tutorials I found online.

My problem was essentially that I have a ridiculously powerful set of steppers. I kept referring to them as NEMA-14, but while reading this guide from Adafruit, I learned that NEMA nomenclature doesn’t mean much other than enclosure size.

So I searched the serial number and found the product spec sheet on Pololu. I bought these many years ago, so I had no idea what the specs were.

As it turns out, specs are critical when dealing with steppers. You specifically need to know the current requirements per coil (amps per phase and resistance per phase). In my case, it was a fairly high 1A per coil. Problem is, the motor drivers I had couldn't supply more than 0.6A per coil, which is probably why they kept heating up and the motor was so erratic.

Thankfully, I also found some A4983 drivers in a motor drivers box I had in the basement. Perusing the specs showed that they are capable of 2A output! That sounded promising.

After hooking the driver up with the basic setup, I plugged it into a 24V supply. This was the other learning: even though the stepper is specced at 2.7V, that doesn't mean much. Steppers are power hungry and run better at higher voltages, since the driver limits the current anyway and a higher supply voltage just pushes the coils to that current faster.

Another thing you have to do with the A4983 is adjust the output current using the potentiometer on board. This is done by measuring the REF pin and doing a simple calculation. I set it to output roughly 0.8A for starters. This guide on RepRap was excellent for figuring this out.
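
For reference, the calculation (assuming the 0.05 Ω sense resistors on my Pololu carrier; other revisions use different values, so check the board):

I_limit = VREF / (8 × R_sense)
VREF = 8 × 0.8A × 0.05Ω = 0.32V

So for my 0.8A target, I dialed the potentiometer until the REF pin measured about 0.32V.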

In any case, I plugged it in and… nothing happened. I even re-soldered some pins that didn't look great, and tried a second driver. No success. The motor was quiet and there was no movement at all.

This wasn’t too surprising to me because I had tried to make these drivers work back in October for a Halloween project. But this time I was determined. Back to the specs I went.

That's when I noticed something weird: an obscure reference to the board requiring voltage input from the logic controller. Even though I was powering the board and the Arduino separately, the A4983 still wanted 5V from the Arduino on its VDD pin. I plugged it in, and the stepper burst into motion as expected!

Next step was testing this alongside the LIDAR. Earlier this week, I had identified the old LIDAR unit I had as a Garmin LIDAR Lite v1. Unfortunately, v1 is from 2015 and Garmin dropped support for it a while back (they are on v4 now). They have a well-maintained Arduino library, but I couldn't get it to work with the v1 unit I have. After much trial and error, I found an Arduino sketch that a kind soul had posted for v1. After some tweaking, I had the Garmin LIDAR Lite v1 delivering consistent results.

Thankfully, making the LIDAR work alongside the stepper was easy. The A4983 makes stepper control a breeze: all you have to do is send a HIGH to the STEP pin and it steps. That's it, no fuss. With a few lines of code, I had the stepper moving reliably and mapping position data to LIDAR distance measurements.
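
Those few lines look roughly like this (hypothetical pin numbers; the microsecond delays set the speed):

const uint8_t STEP_PIN = 3;  // hypothetical wiring
const uint8_t DIR_PIN = 4;

void setup() {
  pinMode(STEP_PIN, OUTPUT);
  pinMode(DIR_PIN, OUTPUT);
  digitalWrite(DIR_PIN, HIGH);   // pick a rotation direction
}

void loop() {
  digitalWrite(STEP_PIN, HIGH);  // one pulse = one step
  delayMicroseconds(500);
  digitalWrite(STEP_PIN, LOW);
  delayMicroseconds(500);        // shorter delays = faster rotation
}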

Of course, the stepper positional data is not reliable and I’m not planning on using it. But if I can couple this with an optical encoder, I should be able to output reliable mapping data.

Pulley design in Fusion 360

Assembled LIDAR pulley

Stepper Motor Hell

I got the stepper motor working tonight. This was significantly harder than I expected. I had several motor driver boards lying around, and thought I would make my job easier and use an Arduino Motor Shield. As it turns out, running steppers with it is not that straightforward. I tried several libraries, everything from the standard Stepper library to the AccelStepper, which is supposed to be state of the art.

There are a few things that threw me off, in no particular order:

The NEMA-14 I am using is a 4-cable bipolar motor. I tried to initialize the libraries with pins for all 4 coil cables, and that doesn't work. Apparently what you are supposed to do is use the 2-cable setup, but then set the PWM pins to HIGH in setup:

#include <Stepper.h>

// Arduino Motor Shield pin mapping
const int DIR_A = 12, DIR_B = 13;
const int PWM_A = 3, PWM_B = 11;
const int BRAKE_A = 9, BRAKE_B = 8;

const int stepsPerRevolution = 200;  // 1.8° per step

// Two-pin setup: the library only drives the direction pins
Stepper stepper(stepsPerRevolution, DIR_A, DIR_B);

void setup() {
  pinMode(PWM_A, OUTPUT);
  pinMode(PWM_B, OUTPUT);
  pinMode(BRAKE_A, OUTPUT);
  pinMode(BRAKE_B, OUTPUT);

  // Keep both channels enabled and the brakes off
  digitalWrite(PWM_A, HIGH);
  digitalWrite(PWM_B, HIGH);
  digitalWrite(BRAKE_A, LOW);
  digitalWrite(BRAKE_B, LOW);

  stepper.setSpeed(400);
}

The same idea applies to AccelStepper. The brake pins are only needed because of the Arduino Motor Shield's layout.

I eventually got the motor to run with the AccelStepper library, but it was very unreliable. Even though I wanted it to run at a constant speed, the motor would continuously accelerate or decelerate, and eventually it would stall. In addition, I noticed that using Serial during operation was a big problem with AccelStepper: it would change the way the motor was running.

I also ran some tests on the Adafruit Motor Shield rev1 I had from many years ago. Again, I could get the motor to run, but it was unreliable. On top of that, the board started smelling funny, and I noticed the L293D H-bridge driver getting very hot.

Finally, I decided to get back to basics. I hooked the Arduino Motor Shield up, loaded up the Stepper library again and wrote the simplest stepper program:

void setup() {
  // Same Motor Shield pin setup and Stepper instance as above
  stepper.setSpeed(400);
}

void loop() {
  stepper.step(1);
}

After fiddling with the speed a bit, this actually worked! I elaborated on it a bit more to confirm the RPM:

long currentStep = 0;
unsigned long lastCheckTime = 0;

void loop() {
  stepper.step(1);
  currentStep += 1;

  // Once a minute: steps taken over steps per revolution gives RPM
  if (millis() - lastCheckTime >= 60000) {
    Serial.print("RPM: ");
    Serial.println(currentStep / stepsPerRevolution);

    currentStep = 0;
    lastCheckTime = millis();
  }
}

And sure enough, I got an RPM reading of 398. It’s good enough for the LIDAR, and I can safely print things out to the Serial.

Next step will be designing the pulley system. I got the 42mm bearing and the timing belt today. It’s going to be fun to hook up the whole mechanism.

Notes on LIDAR research

I have been researching various ways to build a radar for robotic projects. I already have a Garmin LIDAR Lite I had bought years ago, so LIDAR seems to be the best way to go. However, my research shows that there are several additional parts that are also required:

  • A slip ring, to keep the rotating LIDAR module in connection with the rest of the robot
  • Some kind of pulley mechanism to rotate the unit
  • A stepper motor or a brushed motor for rotation
  • In the case of a brushed motor, an encoder would be needed to calculate exact position
  • Wide metal bearing of some sort
  • A timing belt or similar for pulley

I found several projects that show some level of success with this method.

I couldn't get my head around how to design the pulley mechanism until I saw this project: DIY Arduino LIDAR. Basically, the slip ring attaches to the base plate, and the rotating part is left to turn on its own. The aluminum bearing does the bulk of the work, since it's inserted between the large pulley wheel and an extrusion from the base plate.

The Garmin LIDAR Lite can scan at a maximum rate of 500 scans per second! Even at half that speed, it would be amazing. Unfortunately, I will be limited by the mechanical aspects of the components. The slip ring is rated at 300 RPM. It could go higher, but it would start introducing significant noise into the signal. Using a brushed motor with an encoder doesn't make sense in this scenario, since I wouldn't benefit from the higher RPM it provides.

It seems like using one of the NEMA-14 steppers I already have is the best way to go. I have smaller stepper motors from the flipclock project, but they only seem capable of 20-25 RPM at most. NEMA-14 is rated at 600 RPM, with 200 steps per revolution. So here is the plan:

  1. Get NEMA-14 stepper working reliably, with continuous speed at 300 RPM and reporting its exact position
  2. Design and print a pulley mechanism connected to the stepper, with the LIDAR attached to the large part
  3. Have LIDAR scanning with every step and returning distance data, couple stepper position with distance data
  4. Implement a simple visualizer to show the results

This should get me a simple, 2D live map of the environment. Once I have this, I can start getting into using SLAM algorithms using the ROS platform.

Designing with Fusion 360

Despite my initial aversion to it, Autodesk Fusion 360 is truly an amazing tool. The combination of sketching on faces and timeline editing makes it incredibly powerful. Following the great Layer by Layer tutorials from Adafruit, I designed a simple case for the ZoomBox. I was able to import the official DXF files from Arduino right into Fusion 360 and map the screw holes exactly.

Fusion 360 Design

Surprisingly, there was no good information available on component heights; I had to measure them manually. For reference, the USB-A port is 12mm by 11mm and the power jack is 9mm by 11mm. They are respectively 9mm and 4mm inset from the edges of the board. These two drawings were also good references, but they didn't provide all the information I needed.

After adding some simple snap fit joints to the bottom part of the box, I sent it to print and went to bed. To my mild surprise, I found a perfectly fitting box in the morning!

Next step: Design the top part with button inserts.

Printed Arduino box

Printed Arduino box from top

Zoom Mute Controller with Arduino

I had a very informative hacking session tonight and quite a few trips down rabbit holes. I decided to do a little side project: A couple of big hardware buttons that can control Zoom mute/unmute and screen share on/off states on my laptop. I spend most of my days on Zoom and it would be very handy.

Easy, right? Well, it turns out that it’s not that easy.

The first thing I did was confirm that Zoom already had global hotkeys for the controls I needed. Found them under Settings > Keyboard Shortcuts. Looked good.

Then I did some research on Arduino keyboard controlling. Initial results suggested that the Keyboard library included with the Arduino IDE could easily do this. Great!

Finally I hooked up a couple of big push buttons to an Arduino UNO and spent a bit of time on doing a clean button test implementation. I even got double click and long press actions working thanks to the excellent AceButton library.

It was looking like I was about to wrap this project up. That’s when I tried to compile the whole thing and got a weird error: Keyboard.h doesn’t exist.

My first thought was that something went awry with the IDE's included libraries. After some digging, I learned that the Keyboard and Mouse libraries are not available for the Arduino UNO. Some more digging taught me that this is because the ATMEGA328p processor is incapable of native USB communication. So essentially I couldn't have the Arduino Uno, or any of the other Nanos or Megas I have at home, show up to the computer as a USB HID (Human Interface Device).

Some more research brought me to this article and similar articles and YouTube posts that suggested the bootloader on the Arduino UNO could be swapped with a custom bootloader that would allow USB HID protocol on the ATMEGA328p. I went down several rabbit holes, and even loaded up the official Arduino bootloader using dfu-programmer, following the guide on Arduino.cc.

I was hesitant to upload a custom bootloader on the Arduino. Even though you can’t brick it with this method, none of the tutorials I looked at seemed reputable. They each had their own sketchy Github repos with weirdly named hex file downloads. Also most of the articles were at least 5 years old. Things didn’t seem very promising.

Just as I was getting ready to change tracks – I started considering converting this into a Raspberry Pi Pico project, which I just received in the mail – I came across Hoodloader2. Immediately I could see that this is what I was looking for: A well-maintained, open source project that has been around for years. It came with a very detailed wiki but even then it took me an hour to understand what it was actually doing. This is primarily my fault, because apparently I had no idea how the Arduino UNO boards actually worked.

Here is what I learned: The Arduino UNO prototyping boards actually don’t just contain the ATMEGA328p processor, but also a second processor: the ATMEGA16u2 (older ones have the ATMEGA8u2). ATMEGA16u2 and its siblings are small but powerful microprocessors that have embedded USB HID capabilities. The Arduino UNO board essentially uses this secondary chip to do the USB to Serial translation behind the scenes.

This is where Hoodloader's genius comes in: it's a custom bootloader for the ATMEGA16u2 that allows you to load Arduino sketches onto it! This is the part I didn't initially understand fully. I thought Hoodloader was going to automatically create a bridge between the ATMEGA328p and the new bootloader, so I could use the Keyboard libraries directly from the Arduino UNO side. Apparently this is technically possible, but Hoodloader doesn't do it. So essentially you build a sketch for the 16u2, use its pins (you can have up to 7 digital pins) and run everything in a little corner of the board. You can still upload a secondary sketch to the main Arduino board, but it's not necessary.

Armed with this knowledge, and by also leveraging NicoHood's HID library, I was able to get the keyboard input working fairly quickly. I now have a nice little prototype running that can mute and unmute Zoom. More actions can easily be added.
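
The keyboard part ends up being just a few lines with the HID library (a sketch; it assumes Zoom's default macOS mute shortcut, Cmd+Shift+A, with global shortcuts enabled):

#include <HID-Project.h>  // NicoHood's HID library

void toggleZoomMute() {
  Keyboard.press(KEY_LEFT_GUI);    // Cmd
  Keyboard.press(KEY_LEFT_SHIFT);
  Keyboard.press('a');
  Keyboard.releaseAll();
}

void setup() {
  Keyboard.begin();
  // In the real sketch, toggleZoomMute() is wired to an AceButton
  // click handler
}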

I also did some more research on the ATMEGA16u2 and its bigger brother, ATMEGA32u4. It’s possible to build fairly simple PCBs that use these microprocessors as the brain. That would be a fun little project, and very handy. I’ll dig into this in the upcoming weeks.

Rainbow Protocol

Made surprisingly good progress on the Bluetooth implementation tonight. Built a simple iOS app that can connect to the Arduino and read existing device settings. I’m pretty happy with the clean Swift implementation and the object oriented Arduino controller.

Protocol design ended up being very simple: A command byte, then three optional argument bytes. It’s enough to cover all cases. Same commands can be used for both reading and writing, which is nice.
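
In C terms, every packet is just four bytes (the command values here are made up for illustration):

const uint8_t CMD_SET_COLOR = 0x01;  // args: R, G, B
const uint8_t CMD_SET_SLEEP = 0x02;  // args: hour, minute, unused
const uint8_t CMD_GET_COLOR = 0x81;  // no args, device replies

struct RainbowPacket {
  uint8_t command;
  uint8_t args[3];  // optional, zeroed when unused
};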

This also ended up being a nice little recap for me on Swift 5 data types and advanced use of enums. Porting the same code over to Arduino was much easier than I anticipated.

Next step: Writing configuration changes to the device!

Bluetooth Rainbows

Decided to add Bluetooth control to the LED Rainbow. Every once in a while we need to change the go-to-sleep and wake-up times, and M wants to be able to control the colours from her iPad.

Got my old RedBearLab BLE units out. Unfortunately RedBearLab seems to have gone out of business in the last few years. But the units are still working well and thankfully their GitHub repos are still up.

Spent a few hours converting an old BLE library into a working Swift implementation. Next step will be writing a simple protocol to control the time and colour settings on the rainbow.

Workout Timer Initial Steps

Made some good progress on the workout timer tonight. I have been coding for Arduino for years but I had never bothered to learn the class-based architecture. It takes a bit of getting used to, but being able to organize methods into classes like I normally do at the day job is delightful. I ended up designing a simple and clean architecture for the LCD menu system, as well as the timer itself. As a bonus, I got the large digits working!

A little bit about the large digits: These LCD screens are pretty simple. They essentially have 5x8 blocks that can be filled in using a byte array. This handy tool is great for generating custom characters. I also found this Instructable to be very helpful.

Unfortunately, these characters need to be written to the screen’s memory, and that’s very limited. On the version I have, I only have space for 8 characters. Thankfully, that’s just enough to be able to generate all digits from. I just had to find a bit of a hack to get the large timer colon.
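
The mechanics of it, for reference (a minimal sketch; the wiring pins and the segment shape are made up):

#include <LiquidCrystal.h>

LiquidCrystal lcd(12, 11, 5, 4, 3, 2);  // hypothetical wiring

// One 5x8 building block of a large digit; each byte is a pixel row
byte upperBar[8] = {
  B11111,
  B11111,
  B00000,
  B00000,
  B00000,
  B00000,
  B00000,
  B00000
};

void setup() {
  lcd.begin(16, 2);
  lcd.createChar(0, upperBar);  // uses one of the 8 CGRAM slots
  lcd.setCursor(0, 0);
  lcd.write(byte(0));           // draw the custom character
}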

Another thing I worked on was a stable rotary encoder class. There has been some great work done by the community on this, with tons of example code and libraries. Unfortunately, there is almost too much content, and everyone seems to have a different solution to the problem. After dabbling with a few libraries and not getting great results, I found this clear and concise Instructable on the topic and integrated the straightforward implementation into a class-based architecture.

To top it off, I merged this with a standard software button debouncer implementation and ended up with a nice little reusable class for tackling encoders. I am planning to add more “gestures” to this, such as long press and double click.

My progress so far is in this Github repo.

New Project: Workout Timer

Started work on a little side project: A workout timer. I need a simple timer during morning workouts and this is a good challenge for it.

Requirements and problems to solve:

  • Smooth rotary encoder feedback (surprisingly hard)
  • Large numbers on LCD
  • Object oriented menu system
  • Battery powered and rechargeable

Built a prototype board with LCD, encoder and Arduino for easy testing. The encoder implementation with interrupt pins is already working nicely. Next step: the OO menu system.