Convert a Turtlebot3 Waffle Pi to use UP board with R200 camera

Recently I purchased a Turtlebot3 Waffle Pi, and I would like to convert it to use an UP board instead of the RPi3, and to use the R200 camera, like the earlier Waffle did. I have also mentioned this on the Robotis Forum: http://en.robotis.com/service/forum_view.php?bbs_no=2406761&page=1&sca=&stx=&sfl=

I believe this should be very doable, but I suspect I could run into some unforeseen issues, and I would appreciate any suggestions.

Hardware-wise, I am planning on mounting the UP board using the same setup that was put in place for the RPi3, as it has the same form factor.

Currently I have two UP boards (actually 3 if I count the UP2 board). I am trying to decide between the UP board with a heatsink and the one that came with the RealSense development kit (https://www.mouser.com/new/Intel/intel-realsense-robotic-dev-kit/), which has a fan mounted on it. While I like the heatsink version, as it is quieter and probably takes a little less battery power, I will probably go with the fan version to reduce heat issues.

With this setup I will need to see what changes I need to make to the USB cables, since the R200 plugs into the USB3 connector; I also need to add a WiFi adapter and optionally a Bluetooth adapter.

I also need 5V power to run the UP board plus the camera and the like. I am not sure yet whether the voltage regulator (VR) on the OpenCR board will be sufficient (I believe it has a 4 amp regulator for 5V?) or whether I need an external VR (BEC). If I need an external BEC, I will see if I can simply plug it into the 12V connector on the OpenCR board, which is hopefully after the switch and not controlled by the circuit that enables power to the servos.

Software on the UP board: for the most part, I believe I can simply follow the setup instructions given for the Intel Joule: http://emanual.robotis.com/docs/en/platform/turtlebot3/joule_setup/#install-linux-ubuntu. Actually, I already have both of the UP boards set up with Ubuntu, plus ROS and a catkin_ws with the TurtleBot code.

Mounting the R200 camera: it looks like two brackets shipped with the original Turtlebot3 Waffle robot. I wonder if the 3D layout for these parts is available; if not, it looks like a simple L bracket, so it should not be hard to make.

With ROS, I am wondering if I should set up the UP board like the Joule and RPi, with an external Linux workstation set up as the ROS_MASTER_URI running roscore, or instead make the UP board the master. I will probably try the latter, so that I only need to run my secondary desktop machine (which has Ubuntu installed) when necessary. That way I can use my standard Windows desktop with PuTTY (KiTTY) and potentially VNC for most of the playing with the Turtle.

Does this all make sense? Any suggestions?

Thanks
Kurt


Some updates,

I am planning to set the UP board up as the ROS master.

I also started playing around with a bracket for the R200 camera. I tried downloading the 3D design of the actual bracket from OnShape (as pointed to by the main Turtlebot3 page). So far I have not gotten a great 3D print, so I redid the design to make it a little simpler and tried a couple of prints.

IMG_0272-(002)

But I also ordered some parts from Robotis that I should also be able to use: http://www.robotis.us/bracket-dual-l-spl-2b3-w-5pcs/

I would have preferred the black ones here, but they were out of stock… I also ordered a rivet set.

@Kurt

Hi,
it seems you are going to upgrade the Waffle Pi to the newer components; that's great.

We did, in fact, test the UP board with the RealSense R200 and found no problems (though those tests were carried out long ago, and I'm not 100% sure it still works, since there have been many updates to both systems).

Installation of Linux and ROS is nearly the same as it would be on any laptop. Don't follow the Joule board setup; it is only for the Joule board, not for the others.

For this:
With ROS, I am wondering if I should set up the UP board like the Joule and RPi, with an external Linux workstation set up as the ROS_MASTER_URI running roscore, or instead make the UP board the master. I will probably try the latter, so that I only need to run my secondary desktop machine (which has Ubuntu installed) when necessary. That way I can use my standard Windows desktop with PuTTY (KiTTY) and potentially VNC for most of the playing with the Turtle.

it is completely up to how you want to use it. If you need more computation bandwidth, you should run some of the heavy tasks externally (as you said, Ubuntu is OK). I usually use the computer inside the robot as a sensor-data transmitter and locomotion layer, which is why cloud robotics systems like Amazon's are still popular. roscore has traditionally been run on the robot's computer (to avoid critical malfunctions), but I tend to run it externally.

Thanks @Leon_Ryuwoon_Jung,

I should hopefully make some progress on this in the next few days. The UP board that came with the R200 kit did not include a WiFi adapter. I could simply use the one from my other UP board, but I decided to order a new one, which should arrive today (the new one should have better Linux support and be faster and dual-band, compared to the one the UP shop sells).

Also, for mounting the camera, I 3D-printed a couple of brackets that may work:
IMG_0276
They are not pretty, but they may work. I also have some on order from Robotis… hopefully they will ship soon.

Again thanks, will update again as I make progress.

Kurt

Quick update: I have not tried the earlier brackets out yet; I am still waiting for my order to ship. It looks like it is held up by the OpenCM9.04C being out of stock…

I decided to try a different mounting technique. I used the bracket that was used for the RPi camera and printed a simple 3D bracket that fits in the slot on the R200 camera, then used some M2 screws to hold it…
IMG_0285-(002)

I have not fully assembled it yet, as I am playing around with adding some simple sensors to hopefully detect that we are coming up to a stair drop-off before the robot falls.

I am currently experimenting with a VL6180 sensor for use with the Turtle… I have talked about it some on the Robotis Forum: ROBOTIS

I am not sure whether to discuss here the progress of adding a sensor to the OpenCR and then adding a new ROS topic (or topics), and what steps are necessary, which probably include changes to the Arduino sketch as well as to the TurtleBot Arduino libraries.

Then maybe to some of the ROS configuration and Launch files.

Current state: I have one of these units connected to the I2C pins on the OpenCR board, using the Wire library. Note: currently the Wire library is bit-banged, but at least it appears to work.

I started off looking at the Adafruit library as well as the SparkFun library, and decided to hack up my own. I used the same initialization code as Adafruit, which comes from the sensor's documentation. The reason I rolled my own is that with those libraries, when you ask for the current range, the code sits and spins until the range operation completes. I instead want a version where I can tell it to start a range and then have the board trigger an interrupt when it completes. Currently the interrupt code sets a flag to say it is done, and code in the main loop of my test program detects the completion and then asks for the results… It appears to be working.

Next up: try integrating this code with the Turtlebot3 Arduino code base, and see if I can add a new topic or two to the rosserial code associated with the board.

I am not sure if anyone is interested in this or not. If so, would it be better to extract it into its own topic? Is anyone interested in the WIP code? If so, I could upload a ZIP and/or add it to a GitHub project.

Now back to playing around.


Hi Kurt,

Thank you for your consideration and support.

I recommend using various sensors with the TurtleBot.
This was also discussed at ROSCon 2016, where TurtleBot3 was announced. Please refer to the video below.

However, I have not been able to create a tutorial on this part because other tasks currently have higher priority.
Hearing your story made us feel once again how important and necessary it is.
Thank you so much.

I will be preparing a tutorial on how to use various analog sensors and related embedded programming using OpenCR.

Thank you for sharing all the progress and the references.
I appreciate your suggestions and help. :slight_smile:

Thanks @Pyo,

I noticed the ROSCon PDF earlier, which also mentioned these sensors. I will watch the video as well.

I can imagine you have lots of other high-priority tasks to do as well, including maybe making sure everything works on the new Ubuntu LTS (18.04), along with the corresponding ROS release. Plus probably lots of other things.

For me, this is more of a learning exercise. As I mentioned, just reading the book and following the tutorials (which are great) only goes so far; sometimes simple projects like this help things sink in and make sense.

I have been busy with other stuff (it is spring here :smiley:), so I have not worked on it for a couple of days. I also purchased a couple more sensors, so I can have three (with a spare) for left, center, and right, which I will work on in the next day or two.

I need to redo my 3D bracket for the sensor, as I did not like the first one… Plus I then need to figure out how I wish to wire it. I may start off with a breadboard… and maybe later do a quick-and-dirty circuit board for it.

Yesterday I was looking into switching over from using its own new topic to instead using the SensorState cliff field (with the three defines), which probably should not be too difficult to do. But then I wondered how to set up the thresholds for what value should indicate that I am at a cliff. Likewise, if I should still export the raw data, how do I specify the frequency?

Currently it looks like everything is hard-coded into the Arduino sketch.

Example: the frequency at which the sensor data is published is set in the main loop in this section:

if ((t - tTime[2]) >= (1000 / DRIVE_INFORMATION_PUBLISH_PERIOD))
{
  publishSensorStateMsg();
  publishBatteryStateMsg();
  publishDriveInformation();
  tTime[2] = t;
}

Where the turtlebot_core_config.h file contains:

#define DRIVE_INFORMATION_PUBLISH_PERIOD    30   //hz

This is a minor nit, but to me DRIVE_INFORMATION_PUBLISH_PERIOD sounds like it should specify an amount of time, such as a number of milliseconds, not a frequency. From the name, I would expect the define to be (1000/30), or else the name should perhaps be something like DRIVE_INFORMATION_PUBLISH_FREQUENCY.

I obviously could do something similar by simply defining something like:
#define TOF_CLIFF_LEFT 50

And use it to decide when we are at a cliff. But I wonder if this should instead be a ROS parameter, to allow these values to be configured? i.e., use some of the stuff mentioned in rosserial/Overview/Parameters on the ROS Wiki.

Again, still learning, which is the main point of this exercise.

Thanks again


Hi Kurt,

With your help, OpenCR, OpenCM, and TurtleBot3 software are getting better and better. Thank you very much.

The TurtleBot3 and OpenCR hacking you mentioned is a very good idea. We will prepare a tutorial for it, explaining how to connect additional sensors to OpenCR and how to change the ROS messages. Here are the sensors we plan to include in the tutorial:

  • Bumper (using switch, I/O)
  • IR (for sensing Cliff, ADC)
  • Ultrasonic Sensor (for avoidance, Timer)
  • LED bar (for display, I/O)
  • Light Sensor (using CdS, ADC)

This work will be completed and released this month.

Thanks,
Pyo


Thanks and you are welcome,

I am having fun and learning, which is the whole point of this!

I will have to pick up a few of the different sensors you are using, so I can play along.

One of the interesting things to figure out as I play with the Turtlebot3 Arduino code base is how much of it is ROS-specific, and how much of it is, or could be considered, standalone rover code.

That is, what code makes sense to run when nh.connected() is false? On the surface, most of the stuff is ROS: obviously you cannot publish data, receive data from topics you subscribe to, or retrieve parameters.

However, there are things that might still be active. For example, do you still process the data from the RC unit? Can it still drive the rover even if ROS is not running? Likewise, maybe the diagnostic buttons?

Assuming yes to these (and in some cases maybe regardless), what things should be self-contained and still function even when ROS is not available? Things like:

a) Battery voltage checking. I would hope the system would still alert you, and hopefully automatically shut down, if the power gets too low, ideally before the battery is damaged.

b) Some Odometer stuff?

c) Kill switch?

The reason I mention some of this is: suppose we add the cliff sensors and bumpers to hopefully keep it from running off a cliff (down the stairs). Should that code work while ROS is not connected? That is, should some of the code that processes this data live on the OpenCR, or should there be an external ROS node, maybe equivalent to kobuki_safety_controller, that runs on the host?

Again lots of stuff to have some fun experimenting with!

Hello @Kurt :slight_smile:

Nice to see you again!

First, if you deactivate the nh.spinOnce() function (nh means node handle), then the ROS communication layer, rosserial, stops running.

Second, you can use all of the code when ROS is not available: battery, odometry, Dynamixel SDK, buttons, etc. The ROS framework is used to send information about the TurtleBot3 to applications like SLAM or navigation.

Third, the sensors do not need the ROS framework; you can do everything in Arduino code. However, if you want to develop a safety_controller (a node or package in ROS) for the TurtleBot3, you will want to publish the sensor data over rosserial.

Thank you for your inquiry, and I am looking forward to your rover :wink:

Thanks

Thanks @Darby,

I hope to get started again on some of this. I took a few days' diversion back to doing some stuff with a Teensy 3.5, which with the current beta IDE now has 256KB of memory; that allowed me to enable a frame-buffer mode like we had on the 3.6, and with it the capability of doing DMA screen updates… Sorry, I know that this is off topic…

Next up: wire up the three sensors, and/or wait until I know which analog sensors you are using, to set up the cliff detection.

But I may still have a few more diversions before that, like again seeing how hard it would be to better automate uploads: have Arduino running on the PC copy the built binary over to the UP board (in my case; it could also be an RPi), and have a script and/or program running on the host that waits for directory/file change notifications on some specific location, and when one arrives, runs the code to convert the Arduino file to OpenCR format and then updates the OpenCR board…

We had something like that when I was doing stuff with the Edison boards (at that time Eclipse on a PC), and I have done it sort of manually with Arduino. That is, use the Arduino command "Export compiled Binary", then use WinSCP set up for the appropriate directory and drag and drop the file to copy it down to the UP, then use an open PuTTY window to run the commands on the UP to update the OpenCR… But it would be nice to start automating some of this. I wish we could build the UI into Arduino, to select the IP address and the like, probably something like the WiFi101 firmware update commands…

But again, that is a diversion. One issue I run into on the Waffle when doing experiments with the OpenCR is that it is hard to press the Reset button (let alone press it together with the other button) when the third deck is installed. I wonder if I can install some remote buttons? I need to look to see if there are pins available to connect up other buttons for this…

Maybe I should ask this in its own question thread.

Suppose I wish to move the OpenCR board on my Waffle from the 2nd deck to the 3rd deck? There are a couple of reasons. First, while I am programming this board, it sure is easier to get to the buttons and the like when the Arduino IDE decides it does not want to talk to the board ("No USB device located…").

Second, maybe I want to do like @ROBOTIS-Will and plug a display into it…

Disregarding the obvious things like needing to reroute cables, what else needs to change?

I would assume probably some stuff in the turtlebot3_waffle_pi.urdf.xacro file, such as this block:

<joint name="imu_joint" type="fixed">
  <parent link="base_link"/>
  <child link="imu_link"/>
  <origin xyz="0.0 0 0.068" rpy="0 0 0"/>
</joint>

The 0.068 is probably the Z offset in meters, which would need to increase by the deck height, maybe to something like 0.105.

I am not sure about the X, Y… Need to verify where the IMU is on the board.

Now suppose I wish to rotate the board 90 degrees. I am assuming I would need to change the rpy value (roll, pitch, yaw). I need to figure out which one and by how much; probably something like 3.1416/2?

Likewise if I do this will probably need to move a few things to make room, so probably similar changes.

But I am wondering what else I am missing. I probably need to look to see whether the Waffle Arduino code base uses any of these values directly.

Hello… thanks for sharing this information. In my experience, since the LIDAR is on top of the robot, it only detects the thin tube sticking out of a stand fan's round base. When I command the robot to pass around a stand fan, it always hits the round base; the wheels start to slip and the robot loses its orientation in space. There is a big difference between the data coming from odometry and from the LIDAR, and the robot has difficulty determining its position in space.