Adding Sensors to the Raspberry Pi Camera Robot Kit

Sensors are a vital part of any robotic project, as they allow a robot to get information about the environment in which it’s operating. Without sensors, a robot has no information about the world around it, and it’s very tough to program intelligent behaviours for the robot.

With sensors your robot can find out about the world

Now, you’ve always been able to attach sensors to our Raspberry Pi Camera Robot and the Arduino Mini Driver board we use for motor control, but previously you would have had to modify quite a bit of code in order to get your sensor data out. To fix this, we’ve just released an update to the software for our Raspberry Pi Camera robot which makes things much easier. You can now get a large number of sensors to work simply by connecting them to the Mini Driver. The new Mini Driver firmware repeatedly reads from the attached sensors and sends the readings up to the Pi at a rate of 100 times a second. Once on the Pi, the sensor values can be easily retrieved using the robot web server’s websocket interface. Sensors can also be connected directly to the Pi, with sensor readings returned in the same way.

In this tutorial we show you how to update your robot's software if needed, how to connect sensors to the robot, and how to read the sensor values using the Python py_websockets_bot library. This will let you write control scripts for your robot that use sensors, and which run either on the Pi or on another computer connected over the network.

Upgrading the Software

If you bought your robot before this post was published, then it’s likely that you’ll need to upgrade the software on your robot. Probably the easiest way to do this is to download the latest SD card image from here, and then flash it to an SD card using the instructions here. If you’re interested, you can also find details of how we built the image from the ground up here.

Moving the Neck Pan and Tilt Servo Connectors

Originally, the pan and tilt servos were connected to pins D4 and D3 respectively. With this software update, we've had to move the servo connections in order to be able to attach wheel encoder sensors (these need the interrupts on D2 and D3). Therefore, to make sure that the neck servos keep working, move the pan servo to the row marked D11, and the tilt servo to the row marked D6.

Move the neck servos to pins D6 and D11

Connecting Sensors to the Robot

You have two main options when connecting sensors to the robot. You can either connect sensors to the Mini Driver, or alternatively you can connect them to the Raspberry Pi.

Connecting sensors to the Mini Driver is usually the simpler option, as the Mini Driver can talk to sensors running at 5V, and the sensor values are read automatically at a rate of 100Hz by the firmware on the Mini Driver. Sometimes, however, it can be useful to connect sensors to the Raspberry Pi instead. This can happen if you run out of pins on the Mini Driver, or if you need to use I2C or SPI to communicate with a sensor (these protocols aren't supported by the Mini Driver firmware yet, and probably won't be due to a lack of space). Connecting sensors to the Raspberry Pi will probably involve a bit more work, as you may need to shift 5V sensor outputs to the 3.3V used by the Pi, and you'll also need to write a little bit of Python code to talk to the sensor.

With that in mind, we’ll look at the two options for connecting the sensors in a bit more detail. Please Note: In the examples below we use sensors we sell in our store, but there’s nothing to stop you from using any other sensor that you can find, as long as they’re electrically compatible. This means that there’s a truly vast array of sensors out there that you can use with your robot. :D

Connecting Sensors to the Mini Driver

The range of sensors you can attach to the Mini Driver includes digital sensors, analog sensors, encoders and an ultrasonic sensor. The ability to read from analog sensors is really useful as the Raspberry Pi doesn’t have an Analog to Digital Converter (ADC) of its own.

The Mini Driver runs at 5V, so it's assumed that any sensors you connect to it run at 5V as well. Please check that your sensors are happy running at 5V to avoid damaging them. Also, please check your 5V (VCC) and GND connections carefully before turning the power on, as getting them the wrong way round may let the magic smoke out of your sensors, rendering them useless…

Digital Inputs

Digital sensors are sensors which give a simple high or low output as a result of detecting something. There are 8 pins that you can attach digital sensors to: pins D12, D13 and the 6 analog pins A0 to A5. It may seem odd that you can attach digital sensors to the analog pins, but the pins can be used as both types of input. We'll configure the exact input type for each pin in software later on.

As an example of attaching digital sensors we use the trusty line sensor, which gives a high signal when it detects a black line on a white background.

Multiple sensors allow you to detect the direction to the line

In the image above we’ve attached 3 line sensors to the front of the chassis using some M2 spacers, and we then connect them to the Mini Driver using some female jumper wires, as shown in the image below.

It doesn’t matter which of the 8 possible digital input pins you use, but we’ve used pins A1, A2 and A5, and will then configure these pins as digital inputs in software. With these 3 sensors attached it’s now possible to make the robot follow a black line laid down using a marker pen, or insulation tape.

One possible way of attaching line sensors. Note: One of the pins on the line sensor is marked as NC (No connection) so the extra wire can be tied back or cut away.

Attaching an Ultrasonic Sensor

Ultrasonic sensors emit bursts of high pitched sound waves (beyond the range of human hearing) which bounce off obstacles in front of the robot. By measuring how long it takes for signals to go out and come back, the robot can work out if there are any obstacles in front of it, and if so, how far away they are.

You can fit both a camera and an ultrasonic sensor on the robot's pan/tilt head.

In the image above we’ve attached an ultrasonic sensor to the pan/tilt head of the robot, and then connected it to the Mini Driver using a 3 pin jumper wire. The only pin you can attach the ultrasonic sensor to with our Mini Driver firmware is pin D12. Also, because reading from the ultrasonic sensor can take a long time, at the moment we only read from the ultrasonic sensor at a rate of 2Hz. If you need to attach more than one ultrasonic sensor to your robot, or if you need to read at a faster rate than 2Hz, then you’ll need to attach the other ultrasonic sensors to the Raspberry Pi’s GPIO pins.

Connect the ultrasonic sensor to D12
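As a taster of what this makes possible, here's a minimal obstacle-avoidance sketch. It uses the py_websockets_bot API that's covered in detail in the 'Reading from the Sensors' section below; the 20cm threshold is arbitrary, and we're assuming that the SensorConfiguration constructor's other pins can be left at their defaults:

import time
import py_websockets_bot

bot = py_websockets_bot.WebsocketsBot( "ROBOT_IP_ADDRESS" )

# Configure pin D12 for the ultrasonic sensor
sensorConfiguration = py_websockets_bot.mini_driver.SensorConfiguration(
    configD12=py_websockets_bot.mini_driver.PIN_FUNC_ULTRASONIC_READ )
robot_config = bot.get_robot_config()
robot_config.miniDriverSensorConfiguration = sensorConfiguration
bot.set_robot_config( robot_config )

while True:
    # The ultrasonic reading is an integer distance in centimetres
    status_dict, read_time = bot.get_robot_status_dict()
    distance_cm = status_dict[ "sensors" ][ "ultrasonic" ][ "data" ]

    if distance_cm < 20:
        bot.set_motor_speeds( 0.0, 0.0 )     # obstacle ahead - stop
    else:
        bot.set_motor_speeds( 50.0, 50.0 )   # path clear - drive forwards

    time.sleep( 0.1 )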

Analog Inputs

The analog inputs of the Mini Driver can be used to read from analog sensors where the output signal is between 0V and 5V. The ADC on the Mini Driver is a 10-bit ADC so the readings will be returned as values between 0 and 1023. In the image below we’ve connected the X, Y and Z channels of the ADXL335 accelerometer to pins A1, A2 and A3 of the Mini Driver. Accelerometers are great for detecting the acceleration due to gravity (and thus working out if the robot is on a slope), and they can also be used as a neat method for detecting collisions, as we showed in our blog post on ‘bump navigation‘.

Attach analog sensors such as an ADXL335 accelerometer to the ADC input pins.
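Converting a raw reading back to a voltage is just a bit of arithmetic:

# Convert a raw 10-bit ADC reading (0 to 1023) back to a voltage (0V to 5V)
def adc_to_voltage( raw_reading ):
    return 5.0 * raw_reading / 1023.0

print adc_to_voltage( 512 )    # prints roughly 2.5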

Incremental Encoders

Incremental encoders are sensors which tell you how fast a motor is turning and, usually, in what direction it's turning as well (see here for a good introductory article). We don't yet sell encoders for the Dagu 2WD Chassis (coming soon), but in the meantime you may still find this feature useful if you're building your robot on a different chassis which does have encoders.

Quadrature encoders have 2 outputs, phase A and phase B, and can be used to detect both the speed and direction in which the motor is turning. Wire the left encoder to pins D2 and D4, and wire the right encoder to pins D3 and D5. The Mini Driver firmware can also be made to work with single output encoders. In this case wire the left encoder to D2 and the right encoder to D3.
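If you do have encoders attached, you can turn the raw tick counts into a speed estimate by sampling the encoder entry of the status dictionary (covered in the 'Reading from the Sensors' section below) at two points in time. Here's a rough sketch; the ticks-per-revolution figure is an assumption, so substitute the value for your own encoders and gearbox:

import time
import py_websockets_bot

TICKS_PER_REV = 192.0    # assumed figure - check your encoder/gearbox datasheet

def estimate_left_wheel_speed( bot ):
    # Sample the left encoder tick count twice, half a second apart
    status_dict, read_time = bot.get_robot_status_dict()
    start_ticks = status_dict[ "sensors" ][ "encoders" ][ "data" ][ 0 ]
    start_time = status_dict[ "sensors" ][ "encoders" ][ "timestamp" ]

    time.sleep( 0.5 )

    status_dict, read_time = bot.get_robot_status_dict()
    end_ticks = status_dict[ "sensors" ][ "encoders" ][ "data" ][ 0 ]
    end_time = status_dict[ "sensors" ][ "encoders" ][ "timestamp" ]

    # Return the speed in wheel revolutions per second
    return ( end_ticks - start_ticks ) / ( TICKS_PER_REV * ( end_time - start_time ) )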

Connect the encoders up to the first 4 digital pins

Reading from the Sensors

Once you’ve connected up all your sensors, the next thing to think about, is how to read from the sensors, to make use of them in your control programs. The firmware on the Mini Driver (actually an Arduino sketch which can be found here) reads from the sensors at a rate of 100Hz and sends the data back to the Pi over the USB cable using serial.

On the Pi, you have two main options for talking to the Mini Driver. The first is to use the Mini_Driver Python class we provide in your own Python script (example script here; read the comments at the top of the file). The second, recommended and more flexible option, is to talk to the robot web server running on the Pi using the Python py_websockets_bot library.

Instructions for installing the py_websockets_bot library can be found in this blog post here. If you've already got py_websockets_bot installed then you may need to update it to get the latest version. This can be done by navigating to the py_websockets_bot directory and running:

git pull
sudo python setup.py install

We’ve added an example script to py_websockets_bot called get_sensor_readings.py which shows you how to read sensor values from the robot. Run the script using the following command

examples/get_sensor_readings.py ROBOT_IP_ADDRESS

where ROBOT_IP_ADDRESS is the network address of your robot. After a short delay you should see sensor values streaming back from the robot.

Looking at the example script in more detail, the important bits of the script are as follows.

Configure the Sensors

Firstly we construct a SensorConfiguration object and send it over to the robot

# Configure the sensors on the robot
sensorConfiguration = py_websockets_bot.mini_driver.SensorConfiguration(
    configD12=py_websockets_bot.mini_driver.PIN_FUNC_ULTRASONIC_READ, 
    configD13=py_websockets_bot.mini_driver.PIN_FUNC_DIGITAL_READ, 
    configA0=py_websockets_bot.mini_driver.PIN_FUNC_ANALOG_READ, 
    configA1=py_websockets_bot.mini_driver.PIN_FUNC_ANALOG_READ,
    configA2=py_websockets_bot.mini_driver.PIN_FUNC_ANALOG_READ, 
    configA3=py_websockets_bot.mini_driver.PIN_FUNC_DIGITAL_READ,
    configA4=py_websockets_bot.mini_driver.PIN_FUNC_ANALOG_READ, 
    configA5=py_websockets_bot.mini_driver.PIN_FUNC_ANALOG_READ,
    leftEncoderType=py_websockets_bot.mini_driver.ENCODER_TYPE_QUADRATURE, 
    rightEncoderType=py_websockets_bot.mini_driver.ENCODER_TYPE_QUADRATURE )

# We set the sensor configuration by getting the current robot configuration 
# and modifying it. In this way we don't trample on any other 
# configuration settings
robot_config = bot.get_robot_config()
robot_config.miniDriverSensorConfiguration = sensorConfiguration

bot.set_robot_config( robot_config )

Pin D12 can be set as either PIN_FUNC_ULTRASONIC_READ or PIN_FUNC_DIGITAL_READ, pin D13 can be set as either PIN_FUNC_INACTIVE or PIN_FUNC_DIGITAL_READ, and the analog pins A0 to A5 can be set as either PIN_FUNC_ANALOG_READ or PIN_FUNC_DIGITAL_READ. The encoders can be set to be either ENCODER_TYPE_QUADRATURE or ENCODER_TYPE_SINGLE_OUTPUT.

Estimate Time Difference to Robot

For some robot applications, it can be important to know exactly when a sensor reading was made. In our software, whenever a sensor reading reaches the Raspberry Pi, it is timestamped with the time that it arrived at the Pi. The problem is however, that if your robot control script is running on a PC connected to the robot over a network, then the system clock of the control PC is likely to be different from the system clock of the Pi. To resolve this problem, we provide a routine to estimate the offset from the system clock to the Raspberry Pi’s clock.

robot_time_offset = bot.estimate_robot_time_offset()

At the moment the algorithm for estimating the time offset is not particularly efficient, and will block for about 10 seconds or so. In the future we’d like to modify this routine so that it estimates the time offset asynchronously and continuously in the background. In the meantime, if you’re not interested in the precise time at which sensor readings were made, then you can leave this line out of your own programs.

Read the Status Dictionary

Sensor readings are returned as part of the status dictionary which, as you might expect, contains data about the current status of the robot. Retrieve the status dictionary using the following line

status_dict, read_time = bot.get_robot_status_dict()

Read the Sensor Values

Within the status dictionary, the sensor readings are held in a dictionary called 'sensors'. Each sensor reading couples a timestamp, which gives the time on the Pi system clock when the reading arrived at the Pi, with the data for the sensor reading.

# Print out each sensor reading in turn
sensor_dict = status_dict[ "sensors" ]
for sensor_name in sensor_dict:

    # Get the timestamp and data for the reading
    timestamp = sensor_dict[ sensor_name ][ "timestamp" ]
    data = sensor_dict[ sensor_name ][ "data" ]

    # Calculate the age of the reading
    reading_age = (time.time() + robot_time_offset) - timestamp

    # Print out information about the reading
    print sensor_name, ":", data, "reading age :", reading_age

The format of the data entry will depend on the type of sensor being read. For the sensor types that can be attached to the Mini Driver the dictionary entries are

  • batteryVoltage – This is a floating point number giving the current voltage of the batteries attached to the +/- pins of the Mini Driver.
  • ultrasonic – This is an integer giving the distance reading in centimetres. The maximum range of the sensor is 400cm, and if it looks as if no ultrasonic sensor is attached to the Mini Driver then the value returned will be 1000.
  • analog – This is an array of 6 integers, one for each of the analog pins A0 to A5, giving a value from 0 to 1023, representing 0V to 5V. If an analog pin is configured as a digital input then it will still have an entry in this array, but it will be set to 0.
  • digital – This is a byte, with each bit representing one of the possible digital pins. Reading from the most significant bit down to the least significant bit, the bits correspond to pins A5, A4, A3, A2, A1, A0, D13 and D12 (so D12 is bit 0 — see the sketch after this list).
  • encoders – This is a pair of integers giving the current tick count for the left encoder and the right encoder.
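Individual pin states can be picked out of the digital byte with a bit of shifting and masking. For example, here's how you might read the three line sensors that we wired to pins A1, A2 and A5 earlier in the tutorial:

# Bit positions in the digital byte: D12=0, D13=1, A0=2, A1=3, A2=4, A3=5, A4=6, A5=7
def read_digital_pin( digital_byte, bit_index ):
    return ( digital_byte >> bit_index ) & 1

digital_byte = sensor_dict[ "digital" ][ "data" ]

left_sensor = read_digital_pin( digital_byte, 3 )     # A1 is bit 3
centre_sensor = read_digital_pin( digital_byte, 4 )   # A2 is bit 4
right_sensor = read_digital_pin( digital_byte, 7 )    # A5 is bit 7

Each value will be 1 when its sensor can see the line, which is all you need to start writing a simple line following behaviour.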

Connecting Sensors to the Raspberry Pi

You should now have a good idea of how to connect a variety of sensors to your robot, and how to read values from those sensors using the py_websockets_bot library. You may find, however, that you also need to connect some sensors to the Raspberry Pi's GPIO pins. This could be because you've run out of space on the Mini Driver (the more sensors the better!) or because you have a sensor that uses I2C or SPI to communicate.

As an example, in the images below we’ve attached a digital light sensor, and a digital compass to the I2C GPIO pins of the Pi using some jumper wires and a Grove I2C hub, to create a robot that can navigate with the compass, and detect light levels (perhaps it wants to hide in dark places…).

You can attach a range of sensors (such as a digital light sensor and a compass) using I2C

The sensors fixed to the robot

We don’t have the space here to go into detail about how you would wire up all the different sensor types to the Pi’s GPIO pins, and then communicate with them using Python. But there are lots of good tutorials on attaching sensors to the Pi that you can find with Google. Once you’re in the situation where you can connect to, and communicate with the sensor, the steps you need to take to integrate it with the robot are

  1. Take a copy of default_sensor_reader.py and rename it to something like ‘my_sensor_reader.py’. Leave it in the sensors directory.
  2. Fill in the routines in the file. The comments in the file should tell you what you need to do: basically, you need to provide a routine to set up your sensors, a routine to read from them, and optionally one to shut them down. When you read from the sensors you'll create a SensorReading object (timestamp and data) and put it into a dictionary for integration with the main sensor dictionary. Note: Do not change the name of the class from PiSensorReader.
  3. Update the robot configuration to use your sensor reader module. This can be done with code that looks like this
robot_config = bot.get_robot_config()
robot_config.piSensorModuleName = "sensors.my_sensor_reader"
bot.set_robot_config( robot_config )

If all goes well then your sensor readings should now be returned to you in the sensor dictionary.
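To give you an idea of the shape of such a module, here's an illustrative skeleton. The method names and bodies here are placeholders; copy default_sensor_reader.py and follow its comments for the real signatures:

# my_sensor_reader.py - illustrative skeleton only. Start from a copy of
# default_sensor_reader.py, whose comments give the real method signatures.

class PiSensorReader:    # do not change the class name

    def __init__( self ):
        # Set up your sensors here, e.g. open the I2C bus and configure
        # the light sensor and compass
        pass

    def read_sensors( self ):    # hypothetical method name
        # Read each sensor, wrap each result in a SensorReading
        # (timestamp and data) and return the readings in a dictionary
        # so that they can be merged into the main sensor dictionary
        return {}

    def shutdown( self ):    # hypothetical method name
        # Optionally, release any resources held by your sensors
        pass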

Taking Things Further

This tutorial has shown some of the many different types of sensor you can attach to your Raspberry Pi robot, and hopefully it's given you a good idea of how you'd go about wiring them up and reading from them. Now the sky is the limit, as putting sensors onto your robot makes it much easier to program interesting and intelligent behaviours. Build a robot that can drive around a room avoiding obstacles, or use a light sensor to get the robot to chase bright lights; the possibilities are endless.

If you have any questions about attaching sensors to your robot, or need clarification on anything we’ve talked about in this tutorial, then please post a comment below or in our forums. Also we’d love to hear what sensors you decide to put on your robot, and how you end up using them. :)

New Product – A Power Bank for your Raspberry Pi Robot

One issue that has caught quite a few people out when building our Raspberry Pi robot kit is power. The kit comes with a 6xAA battery holder, but the trouble is that not all AA batteries are the same, which is easy to overlook when you're grappling with all the other complexities of building a Raspberry Pi robot. :)

We recommend that the robot be powered with good quality, high capacity, rechargeable (NiMh or NiCd) batteries, such as Duracell 2400mAh NiMh batteries. Non-rechargeable (Alkaline) batteries are not recommended, as they will struggle to provide enough current to power both the Pi and the motors of the robot.

Good for robots

Bad for robots

Pretty (and also good for robots)

As an alternative to AA batteries, we're now selling the TeckNet iEP387 USB power bank, which can be used to power the entire robot. The power bank is more expensive than 6 AA rechargeable batteries, but you get the advantage of increased runtime (approximately 5 hours compared to 3 hours for the NiMh Duracells), and you don't have to buy a battery charger.

In this blog post we show you how to use the power bank with the robot.

Connecting the Battery to the Robot

Please Note: If you are using a USB battery pack to power the Pi and mini driver, then you do not need to use the UBEC which we’ve started to supply with more recent versions of our Raspberry Pi robot.

Once you’ve built the robot following these instructions, Slide the iEP387 USB power bank into the chassis behind the wheels.

The iEP387 should come with 2 USB cables: a micro USB cable for powering the Pi, and a USB cable with 2.54mm connectors for connecting to the Mini Driver and powering the motors. First plug the micro USB cable into the 5V 2.1A output of the iEP387 (the Pi needs this) and connect it to the power connector on the Pi (the extra cable can be wound around the Raspberry Pi mounting struts).

Secondly, use the USB power cable with red and black leads and 2.54mm connectors to attach the 5V 1.0A output of the iEP387 to the battery pins of the Mini Driver (marked + and – next to the mini USB connector). The red wire should attach to the + pin and the black wire to the – pin. Don't worry if you get them the wrong way round though, as the Mini Driver has reverse polarity protection.

Connecting the iEP387 USB Power Bank to the Robot

Make sure that you leave a bit of slack in the cables so that you are able to slide the iEP387 sideways slightly and press the on/off button.

Turning on the iEP387 Power Bank

Turning on the Robot

To turn on the robot, first switch on the power switch on the Mini Driver. This is important in order to provide power to the motors. Then slide the iEP387 sideways, and press the power button. This should turn on the robot.

Please Note: When using the new Model B+ Pi, it's possible to knock the Micro SD card so that it comes out slightly. In this situation the power bank will turn off after a while, as the Pi won't boot properly and so won't draw enough current to keep the power bank on. If you see the power bank turning off for no reason, please check that the Micro SD card is seated properly. This forum post gives more details.

Turning off the Robot

To turn off the robot you need to unplug the micro USB connector from the Pi, and turn off the power switch on the Mini Driver. This is not as neat as we’d like it, but there’s not really an easy way (without adding more hardware, and therefore more cost) to stop the Pi from drawing power from the power bank.

Unplug the micro USB cable to turn off the Pi

Pulling the power from the Pi shouldn’t damage anything as the robot’s software doesn’t write anything to the SD card. However, it’s always nice to let the Pi shutdown cleanly if possible, and so to do that you can use the shutdown button on the robot’s web interface.

Updated: Extra Photos to Show Installation of Battery

I’ve had a couple of people say that their battery pack touches the wheels so have posted the following pictures to try to clarify things. The pictures are not great quality, but hopefully I’ll have time to update them in the new year.

There should be a small (approx 2mm) gap between the battery pack and the wheels

The battery pack should be held back by the plastic motor tabs.

As an alternative to inserting the battery pack in from the side, it’s also possible to remove the central support and slide the battery pack in from the front. This is a snug fit, but again there should be a few millimetres between the battery pack and the wheels. This gap can be increased by sliding the wheels slightly along the motor axles.

If the central support is removed, the battery pack can be slid in from the front.

There should be a few millimetres of clearance between the battery pack and the wheels.

Programming a Raspberry Pi Robot Using Python and OpenCV

Our Raspberry Pi robot has proven to be very popular, as it allows people to easily put together a fun little robot that they can drive around using a smartphone, tablet or computer, whilst viewing the world with the camera on the robot. However, fun as this is, it's hard to view this 'robot' as much more than a remote controlled toy. Our personal view has always been that a robot should be autonomous in some way, and that's why we've been working on a programming interface for our robot that will let users create cool, autonomous behaviours.

Bring your robot to life with Python and OpenCV

The interface is a Python library called py_websockets_bot. The library communicates with the robot over a network interface, controlling its movements and also streaming back images from its camera so that they can be processed with the computer vision library OpenCV. Communicating over a network interface means that your scripts can either run on the robot, or on a separate computer. This feature is really useful if you want to use a computer more powerful than the Pi to run advanced AI and computer vision algorithms, or if you want to coordinate the movement of multiple robots.

In this post we show you how to install the interface library, and provide some example programs, that show you how to make the robot move, how to retrieve images from the camera and how to manipulate the images with OpenCV.

Software Overview

Before we talk about installing the py_websockets_bot library, the image below should hopefully give you a better idea about how the software on the robot works.

Overview of the Raspberry Pi Robot software

At a low level, we use an Arduino compatible board called a Mini Driver to control the hardware of the robot. Running on the Mini Driver is a sketch that listens over serial USB for commands from the Pi, whilst sending back status data (at the moment just battery voltage, but other sensors could also be added).

On the Pi, we run a web server written in Python that provides a web interface, and which listens for commands over the WebSockets protocol. When it gets commands, it sends them onto the Mini Driver via serial USB. The other program running on the Pi is a camera streamer called raspberry_pi_camera_streamer. This is a custom Raspberry Pi camera program we’ve written that streams JPEG images over the network. It can also stream reduced size images (160×120) for computer vision, along with motion vectors (coarse optical flow data) from the Raspberry Pi camera.

To control the robot, you can either use a web browser such as Chrome or Firefox, or now, you can also write a program using the py_websockets_bot library. Both of these will communicate with the web server and the camera streamer on the Pi using WebSockets and HTTP.

It may seem a bit overcomplicated to run a web server and communicate with that, rather than controlling the robot's hardware directly, but it gives us a lot of flexibility. The web interface can be used to observe the robot whilst it's being controlled by a script, and, as mentioned before, control scripts can be written just once and then run either on the robot or on a separate computer, depending upon how much computing power is needed. Also, in theory, you can write your control scripts in whatever language you like, as long as you have a library that can speak WebSockets. We have provided a Python library, but there are WebSockets libraries available for many other languages, such as Javascript, Ruby and the .NET framework.

Installing the Software

Starting with the standard Dawn Robotics SD card, run the following commands on your Pi to make sure that the robot’s software is up to date

cd /home/pi/raspberry_pi_camera_streamer
git pull
cd build
make
sudo make install
cd /home/pi/raspberry_pi_camera_bot
git pull

Reboot your Pi to use the updated software.

Installing py_websockets_bot on a Linux PC or the Raspberry Pi

Run the following commands to install the library's dependencies

sudo apt-get update
sudo apt-get install python-opencv

then

git clone https://bitbucket.org/DawnRobotics/py_websockets_bot.git
cd py_websockets_bot
sudo python setup.py install

Installing py_websockets_bot on Windows

This is trickier but involves the following steps

If needed, more details for OpenCV setup on Windows can be found at http://docs.opencv.org/trunk/doc/py_tutorials/py_setup/py_setup_in_windows/py_setup_in_windows.html

Installing py_websockets_bot on a Mac

We don’t have a Mac to do this, but hopefully, the installation process should be similar to installing on Linux. If there are any Mac users out there who could give this a go and let us know how they get on, we’d be very grateful. :)

Making the Robot Move

Making the robot move is very straightforward, as shown in the code snippet below

import py_websockets_bot

bot = py_websockets_bot.WebsocketsBot( "ROBOT_IP_ADDRESS" )
bot.set_motor_speeds( -80.0, 80.0 )    # Spin left

For ROBOT_IP_ADDRESS you would put something like "192.168.42.1", or "localhost" if the script is running on the robot. The code snippet connects to the robot, and then starts it turning left by setting the left motor to -80% speed and the right motor to +80% speed.
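For example, a complete script that drives forwards for a couple of seconds and then stops might look like this. We stop by setting both motor speeds back to zero; the disconnect call at the end is our assumption of the tidy-up routine, so see motor_test.py for the real pattern:

import time
import py_websockets_bot

bot = py_websockets_bot.WebsocketsBot( "ROBOT_IP_ADDRESS" )

bot.set_motor_speeds( 80.0, 80.0 )    # both motors forwards at 80% speed
time.sleep( 2.0 )
bot.set_motor_speeds( 0.0, 0.0 )      # stop

bot.disconnect()    # assumed tidy-up call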

The example file motor_test.py shows more of the commands you can use. It can be run from the py_websockets_bot directory by using the command

examples/motor_test.py ROBOT_IP_ADDRESS

Getting and Processing Images from the Robot

One of the really exciting things about using the Pi for robotics is that it has a good standard camera, and enough processing power to run some computer vision algorithms. The example file get_image.py shows how to get an image from the camera, and then use the OpenCV library to perform edge detection on it.

Run it as follows

examples/get_image.py ROBOT_IP_ADDRESS
The result of running edge detection on a camera image from the Raspberry Pi robot
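In outline, get_image.py does something along these lines. This is a simplified sketch, and the image-fetching routine name is our assumption, so check the real example file for the exact call:

import cv2
import py_websockets_bot

bot = py_websockets_bot.WebsocketsBot( "ROBOT_IP_ADDRESS" )

# Fetch the latest camera image (assumed routine name)
image, image_time = bot.get_latest_camera_image()

# Convert to greyscale and run Canny edge detection with OpenCV
grey_image = cv2.cvtColor( image, cv2.COLOR_BGR2GRAY )
edge_image = cv2.Canny( grey_image, 100, 200 )

cv2.imshow( "Edges", edge_image )
cv2.waitKey( 0 )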

Small Images

The Pi can run computer vision algorithms, but the processing power of its CPU is very limited when compared to most laptops and desktop PCs. One crude but effective way to speed up a lot of computer vision algorithms is simply to run them on smaller images. To support this, the py_websockets_bot library also offers routines to stream ‘small’ images which have been reduced in size on the GPU of the Pi. The standard camera images from the robot are 640×480, and the small images are 160×120.
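Continuing the sketch above, fetching one of the reduced size images might look like this. Again, the routine name is our assumption (mirroring the full size call), so check the library's documentation for the exact name:

# Fetch the latest 160x120 image for faster computer vision processing
small_image, image_time = bot.get_latest_small_camera_image()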

Getting Motion Vectors (Optical Flow) from the Robot’s Camera

The Raspberry Pi contains a very capable GPU which is able to encode images from the Pi camera into H264 video in real time. Recently, the clever people at Pi Towers added a feature to the Pi's firmware that allows the vectors generated by the motion estimation block of the H264 encoder to be retrieved. What this means is that it's possible to get your Pi to calculate the optical flow for its camera images practically for free (0% CPU)! We haven't managed to do anything cool with this yet, but we've provided routines to retrieve the vectors from the camera so that other people can.

The motion vectors from waving a hand in front of the Raspberry Pi Robot's camera (no really, squint a bit...)

To see this for yourself, run the following example

examples/display_motion_vectors.py ROBOT_IP_ADDRESS
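As a starting point for your own experiments, here's a sketch of a crude motion detector built on the vectors. Both the routine name and the array layout are assumptions on our part, so check display_motion_vectors.py for the real code:

import numpy as np
import py_websockets_bot

bot = py_websockets_bot.WebsocketsBot( "ROBOT_IP_ADDRESS" )

# Fetch the latest motion vectors (assumed routine name). We also assume the
# vectors come back as an array with an x and y component per macroblock.
motion_vectors, vectors_time = bot.get_latest_motion_vectors()
motion_vectors = motion_vectors.astype( np.float32 )

# Movement in front of the camera shows up as a large average vector magnitude
magnitudes = np.sqrt( motion_vectors[ ..., 0 ]**2 + motion_vectors[ ..., 1 ]**2 )
if np.mean( magnitudes ) > 2.0:    # arbitrary threshold
    print "Motion detected!"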

Taking Things Further

Hopefully you can see that there are lots of possibilities for creating cool behaviours for your robot using the py_websockets_bot library and OpenCV. For example, you could write a script to get the robot to drive around your home. Perhaps you could get it to act as a sentry bot, emailing you an image if it detects motion. Or perhaps you could get it to detect and respond to people using face detection.

You can get more details about the capabilities of py_websockets_bot in its documentation here, and the documentation for OpenCV can be found here. If you have any questions about the py_websockets_bot library, please post on our forums. You can also use our forums as a place to share your robotic projects with the rest of the world. :)

Using the Pi Co-op as a General Purpose I/O Board for the Raspberry Pi

We released the Pi Co-op, an Arduino add-on board for the Raspberry Pi, back in the middle of January. But for various reasons, we haven't had the time to do much promotional work, and explain to people why it's so cool, and why you'd actually want to add an Arduino to your Pi.

To fix that, we’ve created a video, because reading text can be really boring :) , and then, we’ve written this blog post to show you one of the really useful things you can do with the Pi Co-op. We show you how you can use your Pi Co-op as a general purpose I/O board for the Pi.

So now, instead of having to buy loads of different add-on boards for your Pi, you can just buy the Pi Co-op here. :) You can use it as an Analog to Digital Converter (ADC), you can use it to connect to 5V devices, you can use it to generate PWM signals, and you can use it for I2C. To top it all off, you can also control all of this functionality from a high level language such as Python.

Getting Started

It’s very easy to set up your Raspberry Pi to work with the Pi Co-op. Full details are provided in the manual, but as a quick recap, these are the steps you need to follow in Raspbian.

Open up a terminal window and run the following commands

sudo apt-get update
sudo apt-get install arduino git

This will install the Arduino IDE and Git. Now run the following commands

git clone https://bitbucket.org/DawnRobotics/pi_co-op.git
cd pi_co-op
sudo python setup_pi_co-op.py install

This will alter configuration files to allow us to use the serial port on the GPIO pins of the Pi. Finish the configuration by restarting the Raspberry Pi

sudo reboot

If everything goes well, then you will now be ready to start programming the Pi Co-op.

Installing PyMata

To use the Pi Co-op as a general purpose I/O board we make use of a project called Firmata. Firmata is a program for an Arduino (the Pi Co-op is compatible with an Arduino Uno) that allows you to control all of its functionality using serial communication. The upshot is that if you have a library that speaks the correct serial protocol with Firmata, then you can control the Pi Co-op from any language you want, and you don't have to program it directly!

The Firmata Github page contains links to client libraries for most languages. For example there are libraries for Python, Javascript, Ruby and the .NET Framework, to name but a few. To show how this works, for this post we’re going to use Python and a library called PyMata. PyMata is an open source library that was written by Alan Yorinks. We’ve extended PyMata slightly so that it can also use a tool called Ino to automatically program the Pi Co-op with Firmata, if needed.

To get started with the installation, open up a terminal window and enter the following

sudo apt-get update
sudo apt-get install python-pip python-dev python-serial
sudo pip install tornado ino

Now install PyMata by entering

git clone https://github.com/DawnRobotics/PyMata.git
cd PyMata
sudo python setup.py install

Blinking an LED with PyMata

The classic 'Hello World!' program to run on an Arduino is to blink an LED. The Pi Co-op has an LED, so you can get it to start blinking by executing the following Python code on the Pi.

import time

from PyMata.pymata import PyMata

# Pin 13 has an LED connected on most Arduino boards.
# give it a name:
LED = 13

# Create an instance of PyMata.
SERIAL_PORT = "/dev/ttyS0"
firmata = PyMata( SERIAL_PORT, max_wait_time=5 )

# initialize the digital pin as an output.
firmata.set_pin_mode( LED, firmata.OUTPUT, firmata.DIGITAL )

try:
    # run in a loop over and over again forever:
    while True:

        firmata.digital_write( LED, firmata.HIGH ) # turn the LED on (HIGH is the voltage level)
        time.sleep( 1.0 ) # wait for a second
        firmata.digital_write( LED, firmata.LOW ) # turn the LED off by making the voltage LOW
        time.sleep( 1.0 ) # wait for a second

except KeyboardInterrupt:

    # Catch exception raised by using Ctrl+C to quit
    pass

# close the interface down cleanly
firmata.close()

When you run this program, if Firmata is already installed on your Pi Co-op then you should just see a short bit of connection text and the LED on your Pi Co-op should start blinking.

If Firmata isn’t installed on your Pi Co-op then you will see lots of text scroll past as PyMata compiles the Firmata code and uploads it to the Pi Co-op. This does take a little bit of time, but don’t worry, the next time you run the script, the connection process will be a lot quicker. Also, the compiled Firmata is cached, so that if you upload a different Arduino sketch to your Pi Co-op, then the next time you use PyMata it will just upload the already compiled version.

Code Explanation

Hopefully, if you’re familiar with the Arduino Blink sketch then the PyMata LED blink code should be fairly self explanatory. The important bits are

# Create an instance of PyMata.
SERIAL_PORT = "/dev/ttyS0"
firmata = PyMata( SERIAL_PORT, max_wait_time=5 )

This creates an instance of the PyMata class to connect to Firmata on the given serial port. The parameter max_wait_time specifies the time to wait in seconds when trying to connect to Firmata, before giving up and starting the upload process.

# initialize the digital pin as an output.
firmata.set_pin_mode( LED, firmata.OUTPUT, firmata.DIGITAL )

This line sets the LED pin as a digital output.

        firmata.digital_write( LED, firmata.HIGH ) # turn the LED on (HIGH is the voltage level)
        time.sleep( 1.0 ) # wait for a second
        firmata.digital_write( LED, firmata.LOW ) # turn the LED off by making the voltage LOW
        time.sleep( 1.0 ) # wait for a second

Finally, these lines, which form the body of the while loop, write the LED pin HIGH and then LOW in order to flash the LED.

Running the Other Examples

We’ve written a number of example scripts for using the Pi Co-op with PyMata, and put them in the examples folder of the Pi Co-op repository. The example we’ve just discussed is called pymata_blink.py, and there are also examples for reading from the ADC (pymata_analog_read.py) and controlling a servo (pymata_servo_sweep.py).

To make sure you’ve got the latest code, and then to run one of the examples, you can use the following commands

cd pi_co-op
git pull
./examples/pymata_blink.py

Taking Things Further

Using Firmata can give you a lot of flexibility when it comes to deciding how to use your Pi Co-op. If you stick with Python, then you can learn more about the functionality offered by PyMata by looking at the documentation here. Alternatively, you might decide that you want to use another language to control the Pi Co-op. There are lots of libraries to choose from, including the excellent Johnny-Five for JavaScript.

Also, if you’ve got any questions in general about the Pi Co-op, and how you can use it with your Raspberry Pi, then please head over to our forums, as we’re always happy to help. :)

Using the Dagu Mini Driver to Build a Raspberry Pi Camera Robot

A Raspberry Pi with a camera gives you a small, low cost, embedded vision system, but it's not very mobile. In this tutorial we show you how to fix that by attaching it to a robot to give you a Raspberry Pi camera robot! The robot is WiFi enabled which means you can drive it around using a tablet, phone or computer, using the camera to explore remote areas.

We’ve tried to keep the components for this tutorial as affordable as possible, and as such we’re using the Dagu Arduino Mini Driver to control the motors and servos of the robot. This board also contains a 1A voltage regulator which we can use to power the Pi. Now a limit of 1A is a bit tight for the Pi, but we’ve found that with the right WiFi dongle, the Mini Driver voltage regulator can happily power itself, the Pi, a camera, and WiFi. All the ingredients you need for a camera robot. :)

Update: This robot also works with the Model B+ Pi, and the new Raspberry Pi 2. We’ve updated the instructions to reflect this below.

Update: We’ve now changed the robot slightly so that it uses a UBEC to power the Pi. This extends the battery life of the robot, and increases the current limit for the Pi to 3A.

Required Materials

This tutorial uses quite a lot of materials, but if you already have a Raspberry Pi and a Raspberry Pi camera to hand, then it’s probably one of the most affordable camera robots you can build.

Update: We now sell a chassis bundle for this robot in our store. This comes with wires soldered onto the motors and connectors crimped onto all of the wires, which makes the robot much easier to build.

Assembling the Robot

Assemble the 2WD robot chassis by following the instructions that come with it, and the assembly instructions from this blog post.

Assemble the pan/tilt kit by following its instructions, and then mount it on the chassis using some M3 hex spacers and screws. You may need to widen the holes on the pan servo before the screws will fit, but this can be done fairly easily using a small screwdriver. Update: If you bought a chassis bundle on or after 3rd April 2014 then it should contain M2 hex spacers instead, which are easier to attach to the pan servo. These are used as shown in the pictures below

Attach the Pan/Tilt kit as shown (new chassis bundles come with M2 hex spacers, not the M3 hex spacers shown in the image)

M2 hex spacers come with washers to attach to the 2WD chassis

Use washers as shown

Mount the Pi on the chassis using the Raspberry Pi Fixing Kit. The picture below shows the best position we found for the spacers. This gets the Pi close enough to the Pan/Tilt neck so that the standard 15cm Pi camera cable can be used to attach the camera. Tip: only the position of the bottom left and top right spacers are critical, the other 2 spacers are just there to support the Pi.

Update: If you have a Model B+ Pi, or a Raspberry Pi 2 see the section on attaching the Model B+ and Pi 2 below.

Attach the Pi to the 2WD chassis

Mount the Mini Driver Board on the chassis using the spacers it comes with. We mounted ours to leave space for an optional breadboard in the future, but this means that we weren’t able to get screws into all of the mounting holes on the Mini Driver. Still, 2 or 3 screws is all it needs for stability.

Attach the Mini Driver to the 2WD chassis

Connect the Pi to the Mini Driver board using a USB cable. The shorter the cable you can use here the better. We used the 1m USB cable that we stock and so wrapped it round the Pi 3 times to keep things neat.

Connect the Pi to the Mini Driver using a USB Cable

Attaching a Model B+ or Raspberry Pi 2

The Raspberry Pi Model B+ and the new Raspberry Pi 2 are both a great fit for this robot kit, as they use less power than the old Model B Pi and have two extra USB ports; in the case of the Pi 2, you also get more computing power. To attach them to the chassis we recommend positioning the Raspberry Pi Fixing Kit spacers and the Mini Driver as shown below. We moved the Mini Driver over to the right as this allows an HDMI monitor to be attached if required for debugging purposes.

Attaching the Model B+ Pi or Pi 2 to the Chassis

Now attach the camera to the Pan/Tilt kit using the sensor fixing kit. Once the camera is secure, use the camera cable to connect the camera to the Pi. The cable attaches to the Pi using the connector closest to the network port, with the reinforcing strip facing the network port and the exposed pins facing away from the network port. The connector is used by raising up the top part of the connector to loosen it, inserting the cable and then pushing the top part of the connector down to hold the cable in place. The cable attaches to the camera with the exposed pins facing forward. Note: When threading the cable through the pan/tilt unit, please be very careful not to let it snag.

Attach the camera to the Pan/Tilt neck

Now connect the wheel motors and the neck servo motors to the Mini Driver as shown in the diagram below. The exact order of the red and black wires for the wheel motors is not critical: if you get it wrong, the motors will simply turn in the wrong direction, which can be fixed by swapping the wires over when you notice it.

On the pan/tilt servos, the red (middle) wire is Vcc, the brown wire is Ground and the orange wire is the signal wire. Attach the pan servo to the row marked D11, and the tilt servo to the row marked D6.

Please Note: The servo motors used to be attached to pins D3 and D4, but had to be moved to allow for the possibility of attaching encoders.

Attach the wheel motors and the neck servo motors to the Mini Driver.

WiFi

Plug your USB WiFi dongle into the Pi. We highly recommend using the Edimax EW-7811Un WiFi dongle as it seems to consume less power than other Wifi dongles, and the software we provide works really well with the Edimax EW-7811Un. Other WiFi dongles may work however.

Powering the Robot

To power the robot, you can either use 6xAA batteries, or you can use a USB powerbank.

Running the Robot from 6xAA Batteries

To get as much life as possible out of the batteries, we use a device called a UBEC (Universal Battery Elimination Circuit). This is an efficient switching voltage regulator which means that less of the battery pack’s power is wasted as heat as its high voltage is converted to 5V for the Pi.

First connect the UBEC to the Mini Driver. This is done by connecting the red input wire of the UBEC to the pin directly next to the on/off switch of the Mini Driver (this is connected to battery voltage), and then by connecting the black input wire of the UBEC to a GND pin on the Mini Driver. We’ve used the GND pin on the 3×2 programming header as this is unlikely to be used for anything else. Finally, plug the USB connector of the UBEC into the Pi.

Please Note: If you are planning to use a USB powerbank to power the Pi and mini driver, then you do not need to attach the UBEC to the robot. Instead, refer to the next section for instructions on how to attach the powerbank.

Connect the UBEC to the Mini Driver and the Pi

Now make sure that the Mini Driver switch is set to off, and attach a set of 6xAA batteries to the +/- pins of the Mini Driver. It’s important to use the 6xAA battery holder, rather than the 4xAA battery holder that comes with the robot, as 4xAA batteries just aren’t enough to power everything. Also, whilst you can just about get by with non-rechargeable (Alkaline) batteries we recommend that you use 6 good rechargeable (NiMh) batteries as these can provide more current for running the Pi and motors, and will give you much better battery life.

Rechargeable batteries such as these Duracell 2400mAh NiMh batteries are great for powering the robot

The robot with UBEC and 6xAA batteries attached

Running the Robot from a USB Powerbank

As an alternative to running the robot from 6xAA batteries, we now also offer a USB power bank which can be used to run the entire robot (AA batteries not needed), and which gives even better battery life (over 5 hours compared to approximately 3 hours). You can find details of this option, and how to connect it to the robot, here.

Preparing the Software for the Robot

If you get to this part of the tutorial then congratulations: the hardware of your robot is now complete! :) Now, setting up the software for this robot from scratch can be a little involved, so we've created an SD card image, with Raspbian and all the necessary software installed, that you can download here (get the latest version). Instructions for copying the image to an SD card can be found here. Please Note: If you choose this option, please remember to expand the filesystem after installation by running

sudo raspi-config

and choose the ‘Expand Filesystem’ option.

If you lack the confidence to copy the image to an SD card then you can buy a pre-installed SD card from us here. Alternatively you can see all of the steps we went through to set up the image on our main site here. All of the software we wrote for this robot is open source and can be found in this repository.

Whichever route you choose to go down, once you have your SD card, plug it into the Pi.

Powering up and Driving the Robot

Power on the robot using the switch on the Mini Driver, and give the robot 2 to 3 minutes to power on and settle down. The first time you power the robot on, it needs to download a sketch to the Mini Driver and so will take a bit longer to start up than normal. For those of you that know that the Mini Driver is really an Arduino – don’t worry – you don’t need to program it first, that’s done automagically by the wonderful library Ino.

The Raspberry Pi is configured to act as a WiFi access point, so connect to the new wireless network that should appear called ‘Pi_Mini_Robot’. The password for the network is ‘Raspberry’.

Note: Very occasionally the WiFi dongle on the Pi won't get an IP address (a known bug), and so you won't be able to connect to the network (your device will spend ages authenticating and trying to get an IP address). This problem is usually resolved by turning the robot off and on again.

The robot is controlled with a web interface which means it should hopefully be accessible from the widest range of devices possible. The web interface does use HTML5 however, so you’ll need to use an up to date browser. We found that Chrome works well on all the platforms we’ve tested it on.

To control the robot type the IP address 192.168.42.1 into the address bar.

Drive the robot with the left joystick, and look around with the right joystick

The interface screen should consist of the view from the robot’s camera, and 2 joysticks (not nipples). The left joystick controls the robot’s wheels, the right joystick controls the robot neck. Tap on the right joystick to return the neck to its starting position.

Adjusting the Range of Motion of the Pan/Tilt Servos

Depending upon how accurately you were able to assemble the servo motors of the neck, you may find that setting both servos to 90 degrees does not centre the head. In some cases you may need to partially disassemble the pan/tilt head to correct this, but you may also be able to get away with adjusting the PWM pulse widths that correspond to 0 and 180 degrees. Access the configuration page at 192.168.42.1/config.html. The sliders let you move the pan and tilt servos around; send modified PWM values to the robot by pressing the save button.

The robot configuration screen

Troubleshooting

We’ve tried to make everything as straightforward as possible in this tutorial, but there are still a number of things that can go wrong. If you have any questions or problems, please post in our forums, or leave a comment below.

Software Outline

This blog post has already turned into a bit of an epic, so we'll leave a full discussion of the robot's software to another day. The important points are that it's open source and can be found in this repository here. :) We'd like to develop it further and adapt it to other Raspberry Pi robots, so any comments, suggestions, or bug fixes are very much appreciated. Also, we're happy to offer pointers if you'd like to use it for your own Raspberry Pi robot.

Update: Programming the Robot

We’ve now released a Python library that lets you write control scripts to make your robot autonomous, and to process images from its camera using OpenCV. You can find details of the library in this blog post.

Update: Adding Sensors to the Robot

We’ve updated the robot’s software so that it’s really easy to add sensors to the robot for use in your robot control scripts. Find out more in this blog post.

Taking things Further

The power of the Raspberry Pi gives a huge amount of flexibility for expanding on this tutorial and taking things further. A robot with a camera can use computer vision to recognise objects and people, or to recognise locations in order to build a map of its environment. Some processing can be done on the robot, but the WiFi connection means that processing can also be done on another, more powerful computer.

The robot can also be expanded with more sensors and servo motors, connected either to the Pi or to the Mini Driver board. The software can also be adapted fairly easily to work with other robots.

Have fun exploring. :)

 

Creating a Dawn Robotics SD Card
http://blog.dawnrobotics.co.uk/2014/01/creating-a-dawn-robotics-sd-card/
Thu, 30 Jan 2014

Update: This post is now out of date as we’ve released a new version of the software. You can find an updated version of the instructions here.

This post describes all the steps we go through to set up a Dawn Robotics SD Card. This SD card contains Raspbian with software installed on it to support a Raspberry Pi robot, and also to support the Pi Co-op Arduino add-on board we sell. If you want to get up and running quickly, then you can just download a complete version of the SD card image here, or alternatively buy a pre-installed SD card from us here. For people who want to build their SD card image from scratch however, or who want to customise it for their own Raspberry Pi robot, hopefully this set of notes will be a good guide to show you what we’ve done.

Set up a Basic SD Card

The SD card is built around the 2013-07-26-wheezy-raspbian Raspbian image which can be found here. The reason we use this image is that we found that we got very jerky, or non-functional camera streaming when using more recent distributions. This may be something to do with the new distributions, or possibly due to a bug in our code. If you’re able to get this setup working with more recent versions of Raspbian we’d be very interested to hear from you.

Anyway, download the Raspbian image and copy it to an SD card using the instructions here.

Log onto the Raspberry Pi and run

sudo raspi-config

In the configuration program, enable the camera, enable boot to desktop, and expand the disk image to fill the SD card.

Create a WiFi Access Point

Set up the Pi as an access point using this great tutorial from Adafruit. The USB WiFi adaptor we sell, the Edimax EW-7811Un, uses the same driver as the one that Adafruit sells, so you don't need to diverge from the tutorial. The only changes we make are that we set the SSID of the network to be Pi_Mini_Robot, and we don't set up network address translation.

Set up Camera Streaming

In order to stream camera images from the Pi we use mjpg-streamer, which sends a constant stream of JPEGs over the network to computers that request them. This doesn't give the highest resolution video (typically we get 640×480 at about 10fps), but it does have the advantage of working with a lot of web browsers, and of providing a low-lag connection, which is very important when trying to teleoperate a robot.
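
If you're curious what that "constant stream of JPEGs" looks like to a client, the sketch below pulls frames out of the stream by hand. It assumes mjpg-streamer's default HTTP plugin on port 8080; like the rest of the code in this post it's written for Python 2.

# A rough sketch (assuming mjpg-streamer's default HTTP plugin on port
# 8080) of reading the MJPEG stream by hand. Each JPEG frame starts with
# the bytes 0xFFD8 and ends with 0xFFD9, so we just scan for those markers.
import urllib2

stream = urllib2.urlopen( "http://192.168.42.1:8080/?action=stream" )

buf = ""
while True:
    buf += stream.read( 1024 )
    start = buf.find( "\xff\xd8" )
    end = buf.find( "\xff\xd9" )
    if start != -1 and end != -1 and end > start:
        jpegData = buf[ start:end + 2 ]    # one complete JPEG frame
        buf = buf[ end + 2: ]
        print( "Got a frame of %i bytes" % len( jpegData ) )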

First, we move the tmp directory to RAM. This potentially gives us a small speed advantage, as the JPEGs are not written to the SD card, and extends the life of the SD card as it is not being subjected to lots of repeated write operations.

Add the following line to the bottom of the file /etc/default/rcS

RAMTMP=yes

Now add the following line to /etc/fstab and reboot

tmpfs /tmp tmpfs nodev,nosuid,size=40M,mode=1777 0 0

Secondly, we install mjpg-streamer by following this tutorial by Miguel Grinberg. You only need to follow the tutorial up to the end of step 6, as we control the process of starting up raspistill and mjpg-streamer with a Python script.
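
For reference, the sketch below shows roughly how the two programs can be started from Python, following the approach in Miguel's tutorial: raspistill writes timelapse JPEGs to /tmp, and mjpg-streamer's file input plugin serves the latest one over HTTP. The exact flags and paths are assumptions, not a copy of our script.

# A rough sketch (flags and paths are assumptions, not our actual script)
# of starting raspistill and mjpg-streamer from Python
import subprocess

raspistillProcess = subprocess.Popen( [
    "raspistill", "--nopreview",
    "-w", "640", "-h", "480", "-q", "5",   # 640x480 at JPEG quality 5
    "-o", "/tmp/pic.jpg",
    "-tl", "100",                          # a new frame every 100ms (~10fps)
    "-t", "9999999" ] )

streamerProcess = subprocess.Popen( [
    "mjpg_streamer",
    "-i", "input_file.so -f /tmp -n pic.jpg",
    "-o", "output_http.so -w /usr/local/www" ] )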

Install Dawn Robotics’ Software

These next steps set up the Pi so that it will work with our Arduino add-on board, the Pi Co-op, and also so that it runs our robot control web server on start up.

Install the following packages using apt-get

sudo apt-get install python-pip python-dev python-serial arduino

Install the Tornado web framework and the Ino Arduino tools using

sudo pip install tornado ino

Install SockJS

git clone https://github.com/mrjoes/sockjs-tornado.git
cd sockjs-tornado
sudo python setup.py install

Set up the Pi to work with the Pi Co-op

git clone https://bitbucket.org/DawnRobotics/pi_co-op.git
cd pi_co-op
sudo python setup_pi_co-op.py install

Get the scripts for the robot web server, by installing our Pi camera robot software

git clone https://DawnRobotics@bitbucket.org/DawnRobotics/raspberry_pi_camera_bot.git

Install the robot web server so that it starts up when the Raspberry Pi boots

cd raspberry_pi_camera_bot
sudo cp init.d/robot_web_server /etc/init.d/robot_web_server
sudo chmod a+x /etc/init.d/robot_web_server
sudo update-rc.d robot_web_server defaults

Finally, reboot your Pi and enjoy. :)

Building a Raspberry Pi Robot and Controlling it with Scratch – Part 3
http://blog.dawnrobotics.co.uk/2014/01/building-raspberry-pi-robot-controlling-scratch-part-3/
Wed, 08 Jan 2014

Happy New Year everyone! Things have been a bit quiet on this blog due to the Christmas rush, and the fact that we’ve been spending time on product development (more on that in a future post). But here at last is the 3rd and final post in our series on the Raspberry Pi robot we used for a workshop at the now not so recent Digimakers event at @Bristol.

In part 1 we described the hardware of the robot, and in part 2 we talked about the software that ran on it. In this post we'll talk about the Scratch simulator which the workshop participants used to create their robot control programs, and our experience of running the workshop.

Workshop Outline

The workshop was aimed at children aged 7 and up, and sought to teach them to write a simple robot control program. The background story for the workshop was that they were remotely controlling a rover on the Martian surface, and had to write a control program to help it find an ‘alien artifact’ which was hidden behind some obstacles. The worksheet for the workshop can be found here, and gives more details about the task along with the commands that can be used to drive the robot.

Running the Workshop

Once the students had been introduced to the workshop, we handed out SD cards which had been pre-loaded with Raspbian for them to use in their Raspberry Pis. The students opened up Scratch and loaded the file scratch/simulator.sb. This file uses remote sensor connections to communicate with a Python script running in the background, and this Python script simulates the movement of the robot, and the data its sensors get back from the world.

Commands are sent to the robot using a slightly clunky process: you first broadcast a message telling the robot to prepare for a command, then you send the command, and finally you wait until a sensor value tells you that the command has completed. The image below shows how you tell the robot to move forward 50cm, for example.

How to tell the robot to move forward 50cm
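
Under the hood, Scratch's remote sensor protocol is pleasantly simple: every message is a 4-byte big-endian length followed by the message text, sent over TCP port 42001. The sketch below shows the shape of that exchange from Python; the broadcast and sensor names here are made up for illustration, so see the worksheet for the real ones.

# A minimal sketch of Scratch's remote sensor protocol: each message is a
# 4-byte big-endian length followed by the message text, over TCP port
# 42001. The broadcast and sensor names are made up for illustration --
# the worksheet lists the real ones.
import socket
import struct

def sendScratchMessage( sock, msg ):
    sock.sendall( struct.pack( ">I", len( msg ) ) + msg )

def readScratchMessage( sock ):
    numBytes = struct.unpack( ">I", sock.recv( 4 ) )[ 0 ]
    msg = ""
    while len( msg ) < numBytes:
        msg += sock.recv( numBytes - len( msg ) )
    return msg

sock = socket.socket( socket.AF_INET, socket.SOCK_STREAM )
sock.connect( ( "127.0.0.1", 42001 ) )    # Scratch listens here when remote sensors are enabled

sendScratchMessage( sock, 'sensor-update "Distance" 50' )
sendScratchMessage( sock, 'broadcast "Forward"' )

# Wait for a sensor update saying the command has completed
while True:
    msg = readScratchMessage( sock )
    if msg.startswith( "sensor-update" ) and "CommandComplete" in msg:
        break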

The workshop worksheet gives the full list of available commands and sensor values, and the image below shows a Scratch program that gets the robot to drive in a square.

Program to make the robot drive in a square

Setting up the Simulator SD Card for the Students

The SD cards were set up with the following steps

  • Install Raspbian
  • Run raspi-config to enable the graphical login
  • Grab the workshop code (you might want to delete the solution files from the scratch folder)
git clone https://github.com/DawnRobotics/digimakers_scratch_workshop.git
  • Get the simulator script to run at start up by first opening the crontab file
sudo crontab -e
  • Then add the following line to the bottom of the file, and save and exit by pressing Ctrl-O, Ctrl-X
@reboot python /home/pi/digimakers_scratch_workshop/scripts/scratch_simulator_background_script.py &

The robot simulator running in Scratch – Photo reused with permission from Raspberry Pi Spy

Controlling the Robot

Once the students had created a program that ran well in the simulator, they saved their Scratch file, turned off their Pi and took their SD card to a laptop that had a USB Bluetooth dongle and was running scripts/scratch_rover_5_background_script.py in the background. This script takes the place of the simulator script and uses Bluetooth to pass the commands from the robot control program to the robot.

The laptop used to control the robot

The laptop used to control the robot – Photo reused with permission from Raspberry Pi Spy

It was at this point that most of the students got a fairly rude awakening to the world of robotics, in that the commands they sent were never executed precisely. :) Due to wheel slip on the carpet, 90 degree turns were never quite 90 degrees, and 50cm in the simulator didn’t translate to exactly 50cm in the real world. This prompted a lot of back and forth as the students refined their control programs (they were able to do this on the laptop) to get them working.

The workshop was really well received, and we were really impressed by how well even the youngest participants did in the two hours available. There were two broad ways to complete the task: either the students could navigate around the obstacles using dead reckoning to look for the artifact, or else they could use the robot's ultrasonic sensor to navigate a potentially quicker path through a gap between the obstacles. All of the students who took part in the workshop were able to implement the first solution, and a few were making good progress on the second solution by the end.

We’re hoping to run the workshop again at some point in the future, which will hopefully give us a chance to refine it a bit. People are also welcome to use the source code and support material as the basis for running their own version of the workshop. If you’re thinking of doing that, then get in touch via this thread or via our forum, as we’re always happy to offer advice. :)

Building a Raspberry Pi Robot and Controlling it with Scratch – Part 2
http://blog.dawnrobotics.co.uk/2013/11/building-a-raspberry-pi-robot-and-controlling-it-with-scratch-part-2/
Thu, 28 Nov 2013

Welcome to the second part of our series of posts, describing the workshop we ran at the recent Digimakers event at @Bristol. In the last post we described the outline of the workshop and looked at the hardware of the Raspberry Pi robot that we built for the event. In this post we describe the software running on the robot, and how we set it up. Hopefully this post will give some useful ideas for those wanting to set up their own Raspberry Pi robot.

Overall Architecture

This post is, to say the least, a bit geeky and a bit technical, so you may find your eyes glazing over if you're just reading these posts out of general interest. Hopefully the diagram below gives you an overview of the general architecture of the system, and you can then dip into the relevant sections of the post for more detail if you want.

Robot Control Architecture

Software Installation

We began with a vanilla Raspbian install, simply because that’s what we’re most familiar with, and then installed a number of packages

sudo apt-get update
sudo apt-get install python-serial libopencv-dev cmake arduino python-dev python-pip

This gives us pySerial for communication via serial ports, OpenCV for computer vision, CMake to build code and the Arduino IDE to upload control code to the Arduino Mini Driver board that was used to control the motors of the robot. It also allows us to install the RPIO library to use the GPIO pins of the Raspberry Pi by running

sudo pip install RPIO

Getting the Robot Code

All of the robot code can be found in our Github repository here. Download it to the Raspberry Pi by running

git clone https://github.com/DawnRobotics/digimakers_scratch_workshop.git

Programming the Arduino Mini Driver

As we discussed in part 1 of this series, we could have controlled the motor drivers on the robot's Explorer PCB using the Raspberry Pi, but we had problems getting interrupts working quickly enough on the Raspberry Pi to accurately keep track of the robot's encoders. Therefore we used an Arduino Mini Driver board, and created a small sketch that accepted commands over a serial USB link to go forwards, go backwards, turn left etc., and then continuously sent back the tick counts for the left and right encoders.

The sketch can be found in the mini_driver_sketch folder of our code, and can be uploaded with the Arduino IDE.

Using the Camera to Recognise AR Markers

One of the goals for students attending the workshop was to get the robot to locate an 'alien artifact' hidden behind some Martian rocks. For our alien artifacts we used Augmented Reality (AR) markers. These are similar to QR codes, and are essentially a lot easier for computer vision code to identify than arbitrary objects such as a cup or a cuddly toy. To generate and recognise the AR markers we used Aruco, which is an excellent open source AR marker library that makes use of OpenCV.

You can make and install Aruco on your Raspberry Pi by first downloading aruco-1.2.4.tgz from here, and then running the following commands

tar xvzf aruco-1.2.4.tgz
cd aruco-1.2.4
mkdir build
cd build
cmake ../
make
sudo make install

Aruco comes with a number of example programs that you can play around with to get a feel for it. One of these is a program called aruco_create_marker which you can use to create the AR markers for the alien artifacts. We created a number of different markers for the workshop and then put them together into a PDF for easy printing.

We used a modified version of one of the Aruco test programs to detect AR markers in images taken using the raspistill program. All of this was bolted together using a Python script to take a photo and then run the detection program to look for the AR markers. It's a bit hacky, and could probably be done much better, but it shows the power of Python for throwing stuff together when under pressure. ;)
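
In outline, the glue script does something like the sketch below. The file name and the detector path are made up for illustration; the real script lives in the scripts folder of the repository.

# A rough outline of the glue logic (the file name and detector path are
# made up for illustration -- the real script is in the repository's
# scripts folder): take a photo with raspistill, then run the detection
# program over it and capture what it prints.
import subprocess

IMAGE_FILE = "/tmp/artifact_photo.jpg"
DETECTOR = "/home/pi/digimakers_scratch_workshop/ar_marker_detector/detect_markers"  # hypothetical name

# Take a photo, keeping the preview time short
subprocess.call( [ "raspistill", "-o", IMAGE_FILE, "-t", "500" ] )

# Run the detector and capture whatever it prints about the markers it found
output = subprocess.check_output( [ DETECTOR, IMAGE_FILE ] )
print( output )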

To get the detection code to work, first enable the Raspberry Pi’s camera by running

sudo raspi-config

and choosing the ‘Enable Camera’ option. Then build the detection program by running

cd ~/digimakers_scratch_workshop/ar_marker_detector
make

You can test that the detection works by putting an AR marker in view of the camera and running

python ~/digimakers_scratch_workshop/scripts/artifact_find_test.py

Enabling Bluetooth Serial Communication

In the workshop, the Raspberry Pi robot was controlled remotely by a laptop running Scratch, using a USB Bluetooth dongle. Now, we could have used a USB Bluetooth dongle on the Raspberry Pi robot as well, but quite frankly, in the limited time we had available before the workshop, working out how to pair 2 Linux computers with USB Bluetooth dongles and setting up serial communications was beyond us. Therefore we wired up the Bluetooth serial module that we sell to the RX and TX lines of the Raspberry Pi's GPIO header and used that UART port instead.

This meant that we needed to enable the GPIO serial port using the instructions at the bottom of this page.

Running the Control Code

At this point the robot is set up and we’re ready to run the Python control program scripts/rover_5_server.py. This control program listens continuously on the Bluetooth serial port for commands from the controlling laptop. In turn it sends commands to the Arduino Mini Driver whenever the robot needs to move, and transmits data about the robot back across the Bluetooth serial port.
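
In outline (and only in outline; this is not the actual rover_5_server.py), the control program's main loop is a relay between the two serial ports. The port names and baud rates below are assumptions.

# A rough outline only (not the actual rover_5_server.py) of the relay
# between the Bluetooth serial port and the Arduino Mini Driver. The port
# names and baud rates are assumptions.
import serial

bluetoothPort = serial.Serial( "/dev/ttyAMA0", baudrate=9600, timeout=0.1 )    # Pi GPIO UART
miniDriverPort = serial.Serial( "/dev/ttyUSB0", baudrate=57600, timeout=0.1 )

while True:
    # Forward any single-character commands from the laptop to the Mini Driver
    command = bluetoothPort.read( 1 )
    if command:
        miniDriverPort.write( command )

    # Stream the Mini Driver's sensor readings back to the laptop
    telemetry = miniDriverPort.readline()
    if telemetry:
        bluetoothPort.write( telemetry )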

You can start the control program manually by running

sudo python ~/digimakers_scratch_workshop/scripts/rover_5_server.py

However, what we really want is to have this program running every time the Raspberry Pi robot starts up. This can be done by running it as a cron job. Run

sudo crontab -e

and add the following line to the bottom of the file

@reboot python /home/pi/digimakers_scratch_workshop/scripts/rover_5_server.py &

Now, you can test that the robot works by using a terminal program such as Hyperterminal or Cutecom to talk to it. It will take a bit of time for the robot to start responding after boot up, but once it does, you should see it print out a stream of numbers. These correspond to

left encoder tick, right encoder tick, ultrasonic sensor distance, last AR detection attempt number, detected AR marker ID, heading to detected AR marker

You can control the robot by sending the following single-character commands over serial (a short Python example follows the list)

  • f – Drive forwards
  • b – Drive backwards
  • l – Turn left
  • r – Turn right
  • s – Stop
  • u – Sound the buzzer (if a BerryClip is attached)
  • p – Try to detect an artifact
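
For example, here's a minimal sketch of sending these commands from Python with pySerial. The port name is an assumption; it would be /dev/rfcomm1 or similar if you've bound an RFCOMM serial port as in our Bluetooth serial tutorial.

# A minimal sketch of driving the robot with the single-character commands.
# The port name is an assumption -- use whatever your Bluetooth serial
# connection is bound to.
import time
import serial

robotPort = serial.Serial( "/dev/rfcomm1", baudrate=9600, timeout=1.0 )

robotPort.write( "f" )     # drive forwards...
time.sleep( 2.0 )
robotPort.write( "s" )     # ...then stop

# Read back one line of the telemetry stream described above
print( robotPort.readline() )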

Next Time…

So, in this post we’ve looked at the software running on the Raspberry Pi robot, and gone over the main installation steps that were needed to set it up. In the next and final post in this series, we’ll look at the software needed to interface Scratch to the robot, and talk about our experience actually running the workshop…

Talking to a Bluetooth Serial Module with a Raspberry Pi
http://blog.dawnrobotics.co.uk/2013/11/talking-to-a-bluetooth-serial-module-with-a-raspberry-pi/
Wed, 27 Nov 2013

Bluetooth is a very low cost and flexible way to add wireless communication to your projects. However, it can also be a bit tricky to set up. In this post we show you how to set up a Raspberry Pi with a USB Bluetooth dongle so that it can communicate with an Arduino using a Bluetooth serial module.

Once the set up is complete, we’ll have a new serial port on the Raspberry Pi that can be used to communicate with the serial Bluetooth module, either using a program such as Cutecom, or using one of the many serial programming libraries such as pySerial.

This tutorial is aimed at a Raspberry Pi running Raspbian, but it should work on other popular Linux distributions such as Ubuntu. To get a USB serial port set up on Windows, a good tutorial can be found here.

Required Materials

  • Raspberry Pi with Raspbian installed (this tutorial may also work with other distributions)
  • USB Bluetooth dongle
  • Arduino, or an Arduino compatible device such as a Seeeduino
  • A serial Bluetooth module

Preparing the Arduino/Seeeduino for Connection

To test out the Bluetooth serial connection we'll use a Seeeduino attached to a Bluetooth serial module. First upload the following sketch to the Seeeduino. This program listens on the serial connection for 2 numbers, and then adds them together before sending the result back over the serial connection.

void setup()
{
    Serial.begin( 9600 );    // 9600 is the default baud rate for the
                              // serial Bluetooth module
}

void loop()
{
    // Wait until at least 2 bytes have arrived on the serial connection
    if ( Serial.available() > 1 )
    {
        // Read in 2 numbers
        float a = Serial.parseFloat();
        float b = Serial.parseFloat();

        // Add them together and return the result
        Serial.println( a + b );
    }
}

Connect the two devices up as shown in the diagram below. These connections will power the Bluetooth serial module from the 5V line of the Seeeduino, and connect RX => TX and TX => RX. Whilst the Bluetooth serial module is connected to the UART lines of the Seeeduino, you won’t be able to program the Seeeduino, but you can still power it with a USB cable.

Circuit diagram

Setting up the USB Bluetooth Dongle and Pairing it with the Bluetooth Serial Module

Plug the USB Bluetooth dongle into the Raspberry Pi. Then open up a command line terminal and run the following commands

sudo apt-get update
sudo apt-get install bluetooth bluez-utils blueman

Get the name of your USB Bluetooth dongle by running

hciconfig

It should be something like 'hci0'.

Now, ensuring that the Seeeduino is powered on, run the following command to find out the address of the serial Bluetooth module.

hcitool scan

After a short delay, this should return the addresses of nearby Bluetooth devices. The Bluetooth serial module that we sell should be called something like ‘linvor’.

Before Bluetooth devices can communicate, they need to be paired. This can be done by running the following command

sudo bluez-simple-agent hci# xx:xx:xx:xx:xx:xx

where # is the number of your device (probably hci0) and xx:xx:xx:xx:xx:xx is the address of the serial Bluetooth module. After a pause, this program should ask you for the pin code of the Bluetooth module. By default, the pin for the module we sell is 1234.

At this point, we have 2 Bluetooth devices that can communicate with each other, but we need to set up a protocol called RFCOMM so that they can communicate over a serial connection. Run

sudo nano /etc/bluetooth/rfcomm.conf

to edit rfcomm.conf, and add the following lines

rfcomm1 {
    bind yes;
    device xx:xx:xx:xx:xx:xx;
    channel 1;
    comment "Connection to Bluetooth serial module";
}

Where again, xx:xx:xx:xx:xx:xx is the address of the Bluetooth serial module. Once that is done, save the file and run the following command in order to get the serial port /dev/rfcomm1.

sudo rfcomm bind all

To test that your connection works, you could use a serial terminal such as minicom or cutecom (you may need to install these first). However, we’re going to use a Python script and the pySerial library.

Install pySerial if you don’t already have it using

sudo apt-get install python-serial

Then save the following Python script on your Raspberry Pi as bluetooth_serial_example.py

#! /usr/bin/python

import serial

bluetoothSerial = serial.Serial( "/dev/rfcomm1", baudrate=9600 )

a = None
while a is None:
    try:
        a = float( raw_input( "Please enter the first number: " ) )
    except ValueError:
        pass    # Not a valid number, so ask again

b = None
while b is None:
    try:
        b = float( raw_input( "Please enter the second number: " ) )
    except ValueError:
        pass    # Not a valid number, so ask again

bluetoothSerial.write( "{0} {1}".format( a, b ) )
print bluetoothSerial.readline()

Navigate to the script and run it using

python bluetooth_serial_example.py

If all goes well, then you should be asked for 2 numbers which are then added together for you. All very trivial, but the cool thing is that the calculation was done on the Seeeduino, with communication over thin air via a Bluetooth serial connection. :)

Taking Things Further

You now have a wireless serial connection which you can use for a whole host of applications. You could set up your Raspberry Pi to communicate with remote sensors, or use it to control a swarm of robots!

If you need a baud rate higher than 9600, then the Bluetooth serial module can be reconfigured by sending it AT commands from the Seeeduino. The command set supported by the Bluetooth serial module is given here.

Building a Raspberry Pi Robot and Controlling it with Scratch – Part 1
http://blog.dawnrobotics.co.uk/2013/11/building-a-raspberry-pi-robot-and-controlling-it-with-scratch-part-1/
Thu, 21 Nov 2013

Last weekend we ran a workshop at the Digimakers event at @Bristol where we taught people how to program a Raspberry Pi robot with the Scratch programming language. It went really well, and it was amazing to see kids as young as 7 grasp the basics of robot control, and produce some really good control programs. Quite frankly, our careers are obviously in jeopardy once they reach the jobs market. :)

This is the first in a series of posts where we describe the basics of how we built the robot, programmed it and put on the workshop. All of the workshop code, and the workshop worksheet can be found here if people are interested in recreating our work.

Aim of the Workshop

This workshop asked the students to write a control program for a robotic rover that had been landed on Mars by NASA. The robot had to explore an area of Mars strewn with rocks (in this case cardboard boxes) in order to find an alien artifact which was represented by an Augmented Reality (AR) marker. This was inspired by some robot rover lessons that our friend Graham Taylor at Raspberry Pi School put together for his wife’s primary school.

The Raspberry Pi Robot used for the workshop

The students were each given an SD card to put into their Raspberry Pis that contained Scratch and a simple robot simulator running in the background. This allowed them to write their control programs in a simulation environment first, before moving over to the real robot. Once the students had perfected their control programs, they put their SD cards into our laptop, which was communicating with the Raspberry Pi robot using Bluetooth. This allowed their Scratch programs to control the robot. At this point they learnt that the behaviour of a robot in a simulator doesn't necessarily correspond precisely to the behaviour of a robot in real life. Due to low-level controller issues and wheel slippage on the carpet, turns weren't exactly 90 degrees, and the robot didn't always go the exact distance it was told to. Cue back-and-forth debugging sessions as the young robot programmers tuned their control programs. :)

Building the Raspberry Pi Robot

The robot has both a camera and an ultrasonic sensor

Our Raspberry Pi robot is adapted from the Seeeduino/Arduino Robot Kit that we sell. This was extended in a number of ways. We first laser cut an acrylic support platform to mount the Raspberry Pi, although cardboard would have worked just as well. We then took advantage of the extra processing power of the Pi to mount the Pi camera on the pan/tilt head, alongside the ultrasonic range sensor.

The batteries in the Rover 5 chassis may well be able to run the Pi as well as the motors, but for stability, we decided to give the Pi a power supply of its own in the form of a USB power bank.

The Raspberry Pi can produce the PWM signals needed to drive the motors of the robot, but we found that we couldn't accurately keep track of the pulses generated by the chassis encoders with the Raspberry Pi. We were trying to use GPIO interrupts in Python, but when the wheels started turning quickly, our code just couldn't keep up. Doubtless, with better code or a C/C++ extension, we may have been able to solve this, but in the end we decided to offload the control of the motors and the reading of the encoders to a small Arduino compatible board, the Dagu Arduino Mini Driver. The Arduino Mini Driver runs a simple program that listens for commands from the Raspberry Pi, and constantly streams back encoder readings. This is a great example of how simple tasks can be offloaded to a micro-controller, freeing up the Pi to handle higher level logic. The sketch for the Arduino Mini Driver can be found here.
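
For the curious, the approach that couldn't keep up looked roughly like the sketch below (the pin numbers are made up; the RPIO library is the one installed in part 2). Every encoder tick fires a Python callback, and at speed the ticks simply arrive faster than the interpreter can service them.

# A rough sketch of the approach that couldn't keep up (pin numbers are
# made up): counting encoder ticks with GPIO interrupt callbacks in Python,
# using the RPIO library installed in part 2.
import RPIO

LEFT_ENCODER_PIN = 17     # hypothetical BCM pin numbers
RIGHT_ENCODER_PIN = 18

tickCounts = { LEFT_ENCODER_PIN: 0, RIGHT_ENCODER_PIN: 0 }

def encoderTick( gpioId, value ):
    # At speed these callbacks arrive faster than Python can service them,
    # which is why we moved the counting onto the Arduino Mini Driver
    tickCounts[ gpioId ] += 1

RPIO.add_interrupt_callback( LEFT_ENCODER_PIN, encoderTick, edge="rising" )
RPIO.add_interrupt_callback( RIGHT_ENCODER_PIN, encoderTick, edge="rising" )
RPIO.wait_for_interrupts()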

The hardware used to build the robot, showing a USB power bank, BerryClip+, Arduino Mini Driver, and a Bluetooth module (the thing with the lit red LED).

In addition to the Arduino Mini Driver, we also added a serial Bluetooth module for wireless control, and a BerryClip so that we could use its LEDs and buzzer to provide feedback on the state of the robot. The BerryClip shown here is actually a prototype of the new, expanded version of the BerryClip, the BerryClip+, so many thanks to Matt Hawkins, owner of Raspberry Pi Spy, and creator of the BerryClip for loaning it to us.

Next Time…

In this article, we’ve outlined the hardware we used to build our Raspberry Pi Robot. In the next article we’ll talk about the software that we wrote to control the robot, and how to install it all on an SD card for the robot.
