The post Closing Down Dawn Robotics appeared first on Dawn Robotics Blog.
People who know Dawn Robotics well may be wondering about the grant we got from Jisc back at the start of August, which was intended to help us develop our robot kits further and to produce educational material for robotics. As we’re not going to be able to fulfill those commitments, this grant has been repaid in full. In fact, the need to do this was one of the reasons for making this decision now, rather than letting things drag out.
Now, whilst it’s a bit sad that Dawn Robotics is closing down, the good thing from a ‘cheap stuff’ point of view is the Closing Down Sale! So, please head along to our store to grab yourself a bargain. We don’t have many robot kits left, but we do have lots of Arduino Shields, Raspberry Pi Accessories and Adafruit items, all with up to 40% off.
For people who have already bought kits, or items from our store, please rest assured that I’ll continue to provide support for these items, and I’m holding back some stock in case I need to ship out spares or replacement parts. Also, the instructions to build our kits, along with the open source software required to do so, will remain freely available, and all the main parts we used are available from other UK retailers.
Finally, I’d like to say a big thank you to all the customers who have bought stuff from us over the years, and to all the people who have supported us and who have made this experience so much fun.
Happy robot building.
Regards
Alan
The post New Products – We now Stock Adafruit! appeared first on Dawn Robotics Blog.
At the moment, the new Adafruit items mainly consist of parts to make your robot flashy, along with a great servo controller board, and a number of small items that we’ve found useful when building our robots and connecting sensors to them. However, we’ll hopefully be expanding the range of Adafruit items we stock in the future, so for now check out the Adafruit section on our website, and please let us know if there’s anything they make that you’d particularly like us to stock.
Aside from that, we also have some new products from Seeedstudio: a nifty all-in-one accelerometer and compass board, and a 4A motor shield for the Arduino for those bigger robotics projects.
Finally, we are also now stocking a few extra Raspberry Pi accessories such as a case and a power supply.
The post Update on Jisc Competition Entry appeared first on Dawn Robotics Blog.
Thank you so much to everyone who voted for us. What this means is that we now have the money to continue developing our robot kit, and producing educational materials to help teachers to teach robot programming in schools. Specifically, we’ll be developing some lesson plans for teaching robotics, and will hopefully be piloting these in a local school or college sometime before the end of the year.
The Jisc funding is delivered as part of a 6-month program, and so we hope to have the final version of the teaching materials ready around the end of January 2016. We’ll try to post a draft version sometime before then, though, for comments and feedback.
The post Why so Quiet, and Please Help us to Develop Pi Robot Educational Materials appeared first on Dawn Robotics Blog.
Readers who check back on this blog from time to time may have noticed that since the end of November 2014, we’ve been very quiet, and nothing much has been written on this blog (apart from responses to comments). The reason for this is that I (Alan Broun) have been focusing pretty much full time on trying to finish off my PhD at the Bristol Robotics Laboratory, and this hasn’t left much time to spend on Dawn Robotics (which is based in the BRL Technology Incubator). My colleague Martin has been doing a wonderful job, packing orders, building kits and keeping Dawn Robotics going over the last 5 months, but I do most of the development work, so things have been in stasis for a bit.
Now though, I’m almost (almost… so close…) at the point where I can hand in, and although I do need to go off and do some contract programming work for a couple of months to rebuild my finances (Dawn Robotics doesn’t quite pay the bills yet) I am hopeful that Dawn Robotics will soon be back to producing new stuff.
This brings me to the point of this blog post. Dawn Robotics has entered into a competition run by an organization called Jisc which is offering up to £20,000 to help startup companies to develop and pilot products for use within schools and universities. But in order to be considered for a prize we need votes from members of the public i.e. you!
The product that we’ve put forward for consideration is our Raspberry Pi Camera Robot. This was originally released last year, targeted at hobbyists, but since then, a number of tech savvy teachers have bought the kit, and started to use it in schools. This makes sense in a lot of ways, as computing and programming are being seen as increasingly important in UK schools, and we feel that robotics is an absolutely fantastic way to engage kids and get them excited about programming. The trouble is that it’s not as easy to use the robot in schools as it could be. The documentation for the robot is a bit lacking, being spread out over a number of blog posts, and we don’t have any dedicated teaching materials, such as lesson plans to use with the robot.
If we were lucky enough to win funding in the competition, then we’d use the money to change that. We’d use the money to improve the documentation for the robot and to create a range of lesson plans to make it easy for teachers to use the Pi robot in schools. We’d also use some of the money to refine the kit and its software to make it easier to use in schools (currently, connecting multiple robots to a school’s network is tricky). Also, the kit would be piloted in a number of schools to see if there were any other ways that we could improve the kit further. You don’t have to worry that this is all for the benefit of Dawn Robotics either. The documentation and lesson plans will be released under a Creative Commons Share Alike license, and all of our software for the Pi Robot is open source under the BSD license. So hopefully, enthusiastic teachers will be able to get value out of the lesson plans and software, even if they don’t have one of our robot kits.
Anyway, hopefully you’re convinced. Please check out our pitch video below, and vote for us here. Also, if you could pass the link on and encourage other people to vote, then that would be most awesome.
The post Off to Pitch@Palace appeared first on Dawn Robotics Blog.
As part of our involvement with the Technology Incubator, we’re also fortunate enough to have been invited to take part in Pitch@Palace, a platform for young British startups organised by HRH The Duke of York on 5th November at St. James’s Palace.
We’ll be heading along with 2 other tech startups from the BRL incubator, Reach Robotics, and OmniDynamics. This is a great opportunity for us to explain our vision for robotics in education and research to a wider audience (see the slightly rushed video pitch below…).
As part of the Pitch@Palace event, there’s a Pitch@Palace people’s choice award which is given to the company that gets the most votes from the public. So if you could find time to support us and cast a vote for us then that would be very much appreciated.
The post Adding Sensors to the Raspberry Pi Camera Robot Kit appeared first on Dawn Robotics Blog.
Now, you’ve always been able to attach sensors to our Raspberry Pi Camera Robot and the Arduino Mini Driver board we use for motor control, but previously you would have had to modify quite a bit of code in order to get your sensor data out. To fix this, we’ve just released an update to the software for our Raspberry Pi Camera robot which makes things much easier. You can now get a large number of sensors to work simply by connecting them to the Mini Driver. The new Mini Driver firmware repeatedly reads from the attached sensors and sends the readings up to the Pi at a rate of 100 times a second. Once on the Pi, the sensor values can be easily retrieved using the robot web server’s websocket interface. Sensors can also be connected directly to the Pi, with sensor readings returned in the same way.
In this tutorial we show you how to update your robot’s software if needed, how to connect sensors to the robot, and then how to read the sensor values using the Python py_websockets_bot library. This will let you write control scripts for your robot that use sensors, and which either run on the Pi, or which run on another computer connected over the network.
If you bought your robot before this post was published, then it’s likely that you’ll need to upgrade the software on your robot. Probably the easiest way to do this is to download the latest SD card image from here, and then flash it to an SD card using the instructions here. If you’re interested, you can also find details of how we built the image from the ground up here.
Originally, the pan and tilt servos were connected up to pins D4 and D3 respectively. With this software update, we’ve had to move the servo connections in order to be able to attach wheel encoder sensors (these need the interrupts on D2 and D3). Therefore, in order to make sure that the neck motors keep on working, move the pan servo to the row marked D11, and the tilt servo to the row marked D6.
You have two main options when connecting sensors to the robot. You can either connect sensors to the Mini Driver, or alternatively you can connect them to the Raspberry Pi.
Connecting sensors to the Mini Driver is usually the simpler option, as the Mini Driver can talk to sensors running at 5V, and the sensor values are read automatically at a rate of 100Hz by the firmware on the Mini Driver. It can be useful however, to sometimes connect sensors to the Raspberry Pi. This can happen if you run out of pins on the Mini Driver, or if you need to use I2C or SPI to communicate with a sensor (as these protocols aren’t supported in the Mini Driver firmware yet, and probably won’t be due to a lack of space). Connecting sensors to the Raspberry Pi will probably involve a bit more work as you may need to shift 5V sensor outputs to the 3.3V used by the Pi, and you’ll also need to write a little bit of Python code to talk to the sensor.
With that in mind, we’ll look at the two options for connecting the sensors in a bit more detail. Please Note: In the examples below we use sensors we sell in our store, but there’s nothing to stop you from using any other sensor that you can find, as long as they’re electrically compatible. This means that there’s a truly vast array of sensors out there that you can use with your robot.
The range of sensors you can attach to the Mini Driver includes digital sensors, analog sensors, encoders and an ultrasonic sensor. The ability to read from analog sensors is really useful as the Raspberry Pi doesn’t have an Analog to Digital Converter (ADC) of its own.
The Mini Driver runs at 5V, and so it’s assumed that any sensors you connect to it run at 5V as well. Please check that the sensors you connect to the Mini Driver are happy running at 5V to avoid damaging them. Also, please check your 5V (VCC) and GND connections carefully before turning the power on, as getting them the wrong way round may let the magic smoke out of your sensors, rendering them useless…
Digital sensors are sensors which give a simple high or low output as a result of detecting something. There are 8 pins that you can attach digital sensors to: pins D12 and D13, and the 6 analog pins A0 to A5. It may seem odd that you can attach digital sensors to the analog pins, but the pins can be used as both types of input. We’ll configure the exact type of input for each pin in software later on.
As an example of attaching digital sensors we use the trusty line sensor which gives a high signal when it detects a black line on a white background.
In the image above we’ve attached 3 line sensors to the front of the chassis using some M2 spacers, and we then connect them to the Mini Driver using some female jumper wires, as shown in the image below.
It doesn’t matter which of the 8 possible digital input pins you use, but we’ve used pins A1, A2 and A5, and will then configure these pins as digital inputs in software. With these 3 sensors attached it’s now possible to make the robot follow a black line laid down using a marker pen, or insulation tape.
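As a sketch of how those three readings might feed into line following, the steering decision can be written as a simple pure function. This is illustrative only: the control logic below is an assumption for this example, not part of py_websockets_bot, and how the readings actually arrive is covered later in the tutorial.

```python
# Illustrative steering logic for a 3-sensor line follower. Each input is
# True when that sensor sees the black line. This function and its command
# strings are assumptions for the example, not part of the robot's software.

def line_follow_command( left, centre, right ):
    if centre and not left and not right:
        return "forward"
    elif left and not right:
        return "turn_left"      # the line has drifted off to the left
    elif right and not left:
        return "turn_right"     # the line has drifted off to the right
    else:
        return "search"         # line lost (or a junction) - spin to find it

print( line_follow_command( False, True, False ) )   # forward
```

A real control script would call a function like this in a loop, feeding in the latest sensor readings and translating the returned command into motor speeds.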
One possible way of attaching line sensors. Note: One of the pins on the line sensor is marked as NC (No connection) so the extra wire can be tied back or cut away.
Ultrasonic sensors emit bursts of high pitched sound waves (beyond the range of human hearing) which bounce off obstacles in front of the robot. By measuring how long it takes for signals to go out and come back, the robot can work out if there are any obstacles in front of it, and if so, how far away they are.
In the image above we’ve attached an ultrasonic sensor to the pan/tilt head of the robot, and then connected it to the Mini Driver using a 3 pin jumper wire. The only pin you can attach the ultrasonic sensor to with our Mini Driver firmware is pin D12. Also, because reading from the ultrasonic sensor can take a long time, at the moment we only read from the ultrasonic sensor at a rate of 2Hz. If you need to attach more than one ultrasonic sensor to your robot, or if you need to read at a faster rate than 2Hz, then you’ll need to attach the other ultrasonic sensors to the Raspberry Pi’s GPIO pins.
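The Mini Driver firmware handles the timing and the conversion to a distance for you, but the maths involved is straightforward. As a sketch (not the actual firmware code):

```python
# How an ultrasonic echo time converts to a distance. The Mini Driver
# firmware does this conversion for you; this just illustrates the maths.

SPEED_OF_SOUND_CM_PER_S = 34300.0   # approximate, in air at room temperature

def echo_time_to_distance_cm( echo_time_seconds ):
    # The pulse travels out to the obstacle and back again,
    # so halve the round-trip distance
    return ( SPEED_OF_SOUND_CM_PER_S * echo_time_seconds ) / 2.0

print( echo_time_to_distance_cm( 0.01 ) )   # 171.5 (cm)
```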
The analog inputs of the Mini Driver can be used to read from analog sensors where the output signal is between 0V and 5V. The ADC on the Mini Driver is a 10-bit ADC so the readings will be returned as values between 0 and 1023. In the image below we’ve connected the X, Y and Z channels of the ADXL335 accelerometer to pins A1, A2 and A3 of the Mini Driver. Accelerometers are great for detecting the acceleration due to gravity (and thus working out if the robot is on a slope), and they can also be used as a neat method for detecting collisions, as we showed in our blog post on ‘bump navigation’.
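As a quick illustration of what those 10-bit readings mean, converting a raw count back to a voltage is just a matter of scaling by the 5V reference. This is a sketch of the maths only, not library code:

```python
# Convert a raw 10-bit Mini Driver ADC reading back to a voltage.
# Full scale (1023) corresponds to the Mini Driver's 5V supply.

def adc_counts_to_volts( counts ):
    return counts * 5.0 / 1023.0

print( round( adc_counts_to_volts( 1023 ), 1 ) )   # 5.0
print( round( adc_counts_to_volts( 512 ), 3 ) )    # 2.502 (roughly mid scale)
```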
Incremental encoders are sensors which tell you how fast a motor is turning and, usually, which direction it’s turning in as well (see here for a good introductory article). We don’t yet sell encoders for the Dagu 2WD Chassis (coming soon), but in the meantime you may still find this feature useful if you’re building your robot with a different chassis that does have encoders.
Quadrature encoders have 2 outputs, phase A and phase B, and can be used to detect both the speed and direction in which the motor is turning. Wire the left encoder to pins D2 and D4, and wire the right encoder to pins D3 and D5. The Mini Driver firmware can also be made to work with single output encoders. In this case wire the left encoder to D2 and the right encoder to D3.
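Once the encoders are wired up, the cumulative tick counts reported by the firmware can be turned into a speed estimate by differencing two readings and dividing by the time between them. A minimal sketch of the maths follows; the TICKS_PER_REV value is a placeholder, so check your own encoder and gearbox:

```python
# Estimate wheel speed from encoder tick counts. TICKS_PER_REV depends on
# your encoder and gearbox - the value here is a placeholder assumption.

TICKS_PER_REV = 48.0

def ticks_to_rpm( ticks_then, ticks_now, dt_seconds ):
    revs = ( ticks_now - ticks_then ) / TICKS_PER_REV
    return revs / dt_seconds * 60.0

print( ticks_to_rpm( 0, 96, 1.0 ) )   # 120.0 (2 revolutions in 1 second)
```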
Once you’ve connected up all your sensors, the next thing to think about is how to read from them so that you can make use of them in your control programs. The firmware on the Mini Driver (actually an Arduino sketch which can be found here) reads from the sensors at a rate of 100Hz and sends the data back to the Pi over the USB cable using serial.
On the Pi, you have two main options for talking to the Mini Driver. The first is to use the Mini_Driver Python class we provide in your own Python script (example script here, read the comments at the top of the file). The second, recommended and more flexible option, is to talk to the robot web server running on the Pi using the Python py_websockets_bot library.
Instructions for installing the py_websockets_bot library can be found in this blog post here. If you’ve already got py_websockets_bot installed then you may need to update it to get the latest version. This can be done by navigating to the py_websockets_bot directory and running:
git pull
sudo python setup.py install
We’ve added an example script to py_websockets_bot called get_sensor_readings.py which shows you how to read sensor values from the robot. Run the script using the following command:
examples/get_sensor_readings.py ROBOT_IP_ADDRESS
where ROBOT_IP_ADDRESS is the network address of your robot. After a short delay you should see sensor values streaming back from the robot.
Looking at the example script in more detail, the important bits of the script are as follows.
Firstly, we construct a SensorConfiguration object and send it over to the robot:
# Configure the sensors on the robot
sensorConfiguration = py_websockets_bot.mini_driver.SensorConfiguration(
    configD12=py_websockets_bot.mini_driver.PIN_FUNC_ULTRASONIC_READ,
    configD13=py_websockets_bot.mini_driver.PIN_FUNC_DIGITAL_READ,
    configA0=py_websockets_bot.mini_driver.PIN_FUNC_ANALOG_READ,
    configA1=py_websockets_bot.mini_driver.PIN_FUNC_ANALOG_READ,
    configA2=py_websockets_bot.mini_driver.PIN_FUNC_ANALOG_READ,
    configA3=py_websockets_bot.mini_driver.PIN_FUNC_DIGITAL_READ,
    configA4=py_websockets_bot.mini_driver.PIN_FUNC_ANALOG_READ,
    configA5=py_websockets_bot.mini_driver.PIN_FUNC_ANALOG_READ,
    leftEncoderType=py_websockets_bot.mini_driver.ENCODER_TYPE_QUADRATURE,
    rightEncoderType=py_websockets_bot.mini_driver.ENCODER_TYPE_QUADRATURE )

# We set the sensor configuration by getting the current robot configuration
# and modifying it. In this way we don't trample on any other
# configuration settings
robot_config = bot.get_robot_config()
robot_config.miniDriverSensorConfiguration = sensorConfiguration
bot.set_robot_config( robot_config )
Pin D12 can be set as either PIN_FUNC_ULTRASONIC_READ or PIN_FUNC_DIGITAL_READ, pin D13 can be set as either PIN_FUNC_INACTIVE or PIN_FUNC_DIGITAL_READ, and the analog pins A0 to A5 can be set as either PIN_FUNC_ANALOG_READ or PIN_FUNC_DIGITAL_READ. The encoders can be set to be either ENCODER_TYPE_QUADRATURE or ENCODER_TYPE_SINGLE_OUTPUT.
For some robot applications, it can be important to know exactly when a sensor reading was made. In our software, whenever a sensor reading reaches the Raspberry Pi, it is timestamped with the time that it arrived at the Pi. The problem is however, that if your robot control script is running on a PC connected to the robot over a network, then the system clock of the control PC is likely to be different from the system clock of the Pi. To resolve this problem, we provide a routine to estimate the offset from the system clock to the Raspberry Pi’s clock.
robot_time_offset = bot.estimate_robot_time_offset()
At the moment the algorithm for estimating the time offset is not particularly efficient, and will block for about 10 seconds or so. In the future we’d like to modify this routine so that it estimates the time offset asynchronously and continuously in the background. In the meantime, if you’re not interested in the precise time at which sensor readings were made, then you can leave this line out of your own programs.
Sensor readings are returned as part of the status dictionary which, as you might expect, contains data about the current status of the robot. Retrieve the status dictionary using the following line:
status_dict, read_time = bot.get_robot_status_dict()
Having obtained the status dictionary, the sensor readings are returned as a dictionary called ‘sensors’. Each sensor reading is represented as a timestamp which gives the time on the Pi system clock when the reading arrived at the Pi, coupled with the data for the sensor reading.
# Print out each sensor reading in turn
sensor_dict = status_dict[ "sensors" ]
for sensor_name in sensor_dict:

    # Get the timestamp and data for the reading
    timestamp = sensor_dict[ sensor_name ][ "timestamp" ]
    data = sensor_dict[ sensor_name ][ "data" ]

    # Calculate the age of the reading
    reading_age = (time.time() + robot_time_offset) - timestamp

    # Print out information about the reading
    print sensor_name, ":", data, "reading age :", reading_age
The format of the data entry will depend on the type of sensor being read; each of the sensor types that can be attached to the Mini Driver has its own entry format in the dictionary.
You should now have a good idea of how to connect a variety of sensors to your robot, and how to read values from those sensors using the py_websockets_bot library. You may find, however, that you also need to connect some sensors to the Raspberry Pi’s GPIO pins. This could be because you run out of space on the Mini Driver (the more sensors the better!) or because you have a sensor that uses I2C or SPI to communicate.
As an example, in the images below we’ve attached a digital light sensor, and a digital compass to the I2C GPIO pins of the Pi using some jumper wires and a Grove I2C hub, to create a robot that can navigate with the compass, and detect light levels (perhaps it wants to hide in dark places…).
We don’t have the space here to go into detail about how you would wire up all the different sensor types to the Pi’s GPIO pins and then communicate with them using Python, but there are lots of good tutorials on attaching sensors to the Pi that you can find with Google. Once you’re at the point where you can connect to, and communicate with, the sensor, the main step needed to integrate it with the robot is to tell the robot web server which Python sensor module to load:
robot_config = bot.get_robot_config()
robot_config.piSensorModuleName = "sensors.my_sensor_reader"
bot.set_robot_config( robot_config )
If all goes well then your sensor readings should now be returned to you in the sensor dictionary.
This tutorial has shown some of the many different types of sensor you can attach to your Raspberry Pi robot, and hopefully it’s given you a good idea of how you’d go about wiring them up and reading from them. Now the sky is the limit, as putting sensors onto your robot makes it much easier to program interesting and intelligent behaviours. Build a robot that can drive around a room avoiding obstacles, or use a light sensor to get the robot to chase bright lights; the possibilities are endless.
If you have any questions about attaching sensors to your robot, or need clarification on anything we’ve talked about in this tutorial, then please post a comment below or in our forums. Also we’d love to hear what sensors you decide to put on your robot, and how you end up using them.
The post Improving the Battery Life of Your Raspberry Pi Robot with a UBEC appeared first on Dawn Robotics Blog.
A UBEC provides a nice efficient way to power your Pi from batteries
We recently started selling a USB powerbank which can be used to power the Raspberry Pi robot kit that we sell. Using the powerbank provides great battery life, but we’re still interested in making the robot run well from AA batteries, as this may be the cheaper option if you already have rechargeable batteries and a charger lying around.
Therefore, we’re modifying the kit a bit to include a UBEC (Universal Battery Elimination Circuit). This is an efficient switching voltage regulator which takes the load off the linear voltage regulator of the Mini Driver and improves the running time of a robot being powered by AA batteries a lot. This post gives some details about why we’re making this change and also describes a battery testing script that we’ve written to determine what kind of run times can be expected for different methods of powering the Raspberry Pi robot.
The Dagu Arduino Mini Driver board used in the Raspberry Pi robot contains an L4941B 1A linear voltage regulator which can be used to run the robot from batteries. Linear voltage regulators are cheap, which helps keep the Mini Driver affordable, but they’re not very efficient, so a lot of battery power is wasted as heat. Also, the Mini Driver, Raspberry Pi and a USB WiFi dongle use up quite a lot of the available 1A, so there’s not much headroom for adding extra sensors to the robot.
By contrast, the UBEC we’re using is a switching voltage regulator, which is a lot more efficient than a linear regulator and wastes much less battery power. The UBEC is also able to provide up to 3A of current, so provided the batteries can keep up, there’s a lot more current available to supply USB peripherals plugged into the Pi, and sensors plugged into the robot.
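Some rough back-of-the-envelope numbers illustrate the difference. A linear regulator burns off the difference between input and output voltage as heat, whereas a switching regulator’s loss is set by its conversion efficiency. The figures below are illustrative assumptions (a 7.2V nominal NiMH pack, a 0.8A load, and a ballpark 90% UBEC efficiency), not measured values:

```python
# Back-of-the-envelope comparison of the two regulator types.
# All figures here are illustrative assumptions, not measurements.

V_IN = 6 * 1.2      # nominal voltage of 6 NiMH cells (7.2V)
V_OUT = 5.0
I_LOAD = 0.8        # amps drawn by the Pi, WiFi dongle etc.

# A linear regulator burns the voltage difference as heat
linear_loss_w = ( V_IN - V_OUT ) * I_LOAD
linear_efficiency = V_OUT / V_IN

# A switching regulator's loss comes from its conversion efficiency
ubec_efficiency = 0.90
ubec_loss_w = ( V_OUT * I_LOAD ) / ubec_efficiency - ( V_OUT * I_LOAD )

print( round( linear_loss_w, 2 ) )       # 1.76 (watts wasted as heat)
print( round( linear_efficiency, 2 ) )   # 0.69
print( round( ubec_loss_w, 2 ) )         # 0.44 (watts wasted)
```

On these assumptions the linear regulator wastes roughly four times as much battery power as the UBEC, which is where the longer run times come from.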
Making use of the UBEC is very straightforward. If you’ve already wired up your robot to provide power to the Raspberry Pi from the Mini Driver, then first remove these wires. After that, connect the red input wire of the UBEC to the pin directly next to the on/off switch of the Mini Driver (this is connected to battery voltage), and then connect the black input wire of the UBEC to a GND pin on the Mini Driver. We’ve used the GND pin on the 3×2 programming header as this is unlikely to be used for anything else. Finally, plug the USB connector of the UBEC into the Pi and attach a set of 6xAA batteries to the +/- pins of the Mini Driver and you’re good to go.
To help us test and compare different ways of powering our robot, we’ve developed a little battery testing script using our py_websockets_bot control library (installation instructions here). The test script runs on a PC and connects to the robot over WiFi. It runs in a loop, making the robot look in different directions and periodically making the robot spin either left or right. In this way we hope to simulate the conditions in which the robot might be used, i.e. lots of stop/start motion. The Mini Driver has one of its ADC pins hard-wired to a voltage divider to measure its battery voltage, and so the test script periodically records this voltage. When the robot runs out of juice it stops responding and the script can be ended; at this point the script saves the voltages out to a CSV file so we can plot how the voltage of the battery changes over time.
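A stripped-down sketch of the core of such a test loop is shown below. The method names set_neck_angles() and set_motor_speeds(), and the "batteryVoltage" status key, are assumptions for illustration; check the py_websockets_bot examples and the robot’s status dictionary for the real names on your version.

```python
# Sketch of a battery test loop along the lines described above. The bot
# object is assumed to offer set_neck_angles(), set_motor_speeds() and
# get_robot_status_dict(); the "batteryVoltage" status key is a guess at
# where the Mini Driver's battery reading might appear.
import random
import time

def run_battery_test( bot, num_cycles, pause_seconds=2.0 ):
    """Exercise the robot and log (elapsed_seconds, battery_voltage) pairs."""
    voltage_log = []
    start_time = time.time()
    for _ in range( num_cycles ):
        # Look in a random direction, then spin briefly to load the motors
        bot.set_neck_angles( random.uniform( 45.0, 135.0 ), 90.0 )
        spin_speed = random.choice( [ -80.0, 80.0 ] )
        bot.set_motor_speeds( spin_speed, -spin_speed )
        time.sleep( pause_seconds )
        bot.set_motor_speeds( 0.0, 0.0 )

        # Record the battery voltage reported by the Mini Driver
        status_dict, read_time = bot.get_robot_status_dict()
        voltage_log.append(
            ( time.time() - start_time, status_dict.get( "batteryVoltage" ) ) )
    return voltage_log
```

In a real run you would construct the bot object with py_websockets_bot, loop until the robot stops responding, and then write voltage_log out to a CSV file for plotting.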
The graph below shows some of the tests we’ve run. All of the tests were run on our Raspberry Pi robot using a Model B, so we’d expect runtime improvements if using a Model B+, as this has lower power consumption. The voltage readings are quite noisy, but you still get the general discharge curve over time when running off batteries. The graph shows the advantage of using good rechargeable batteries (NiMH) instead of non-rechargeable (Alkaline) batteries. On rechargeable batteries (Duracell 2400mAh NiMH) the robot ran for approximately 3 hours, which was 3 times longer than when using Alkalines (Sainsbury’s Extra Long Life). The graph also shows the results of running the robot on the TeckNet iEP387 USB powerbank that we sell.
We’ll update this graph with more tests as we run them, i.e. we’re planning to run tests using the Model B+. Also, we’d be really interested to see the results of any battery tests that users run on their camera robots to see how they compare with our results.
The post Robotics and the Raspberry Pi Model B+ appeared first on Dawn Robotics Blog.
]]>The new Model B+
So, the big news this week (if you’re a Raspberry Pi fan) is that the Raspberry Pi foundation announced the release of an upgraded version of the Model B Pi, the Model B+. The Raspberry Pi Model B+ is a really nice incremental update of the Model B, and it’s especially good for people wanting to use the Pi in robotic projects. This is because, alongside extra USB ports, it now uses switching voltage regulators, which means that it consumes less power (between 0.5W and 1W less) and will therefore last longer on batteries.
As soon as we got a Model B+ this week, we put it onto one of our Raspberry Pi Camera Robots, and it works great. The Model B+ has a different layout and mounting holes from the Model B, so we’ve updated the assembly instructions for the robot to show how the Model B+ should be mounted.
We’ve also released a new version of the software for the robot, as the Model B+ needs different drivers for its USB and network ports. This new software has a few bugfixes, and also allows the robot to be driven at slower speeds than before. This new feature may not sound like much, but with the old software the motors often couldn’t be made to go slower than about 33% speed before they stalled due to friction in the gearboxes. Now, by driving the motors in a different way, we get them to turn with more torque at lower speeds to overcome this friction. This means that it’s easier to drive the robot whilst looking through the camera, and makes precise turns easier.
The new SD card image can be downloaded here. Other options for getting the software are discussed here.
The post New Product – A Power Bank for your Raspberry Pi Robot appeared first on Dawn Robotics Blog.
We recommend that the robot be powered with good quality, high capacity, rechargeable (NiMH or NiCd) batteries, such as Duracell 2400mAh NiMH. Non-rechargeable (Alkaline) batteries are not recommended, as they will struggle to provide enough current to power both the Pi and the motors of the robot.
As an alternative to AA batteries, we’re now selling the TeckNet iEP387 USB power bank, which can be used to power the entire robot. The power bank is more expensive than a set of 6 AA rechargeable batteries, but you get the advantage of increased runtime (approximately 5 hours, compared to 3 hours for the NiMH Duracells), and you don’t have to buy a battery charger.
In this blog post we show you how to use the power bank with the robot.
Please Note: If you are using a USB battery pack to power the Pi and mini driver, then you do not need to use the UBEC which we’ve started to supply with more recent versions of our Raspberry Pi robot.
Once you’ve built the robot following these instructions, slide the iEP387 USB power bank into the chassis behind the wheels.
The iEP387 should come with 2 USB cables, a micro USB cable for powering the Pi, and a USB cable with 2.54mm connectors for connecting to the Mini Driver, and powering the motors. First plug the micro USB cable into the 5V 2.1A output of the iEP387 (the Pi needs this) and connect it to the power connector on the Pi (the extra cable can be wound around the Raspberry Pi mounting struts).
Secondly, use the USB power cable with red and black leads, and 2.54mm connectors to attach the 5V 1.0A output of the iEP387 to the battery pins of the Mini Driver (marked + and – next to the mini USB connector). The red wire should attach to the + pin and the black wire should attach to the – pin. Don’t worry if you get them the wrong way round though, as the Mini Driver has reverse bias protection.
Make sure that you leave a bit of slack in the cables so that you are able to slide the iEP387 sideways slightly and press the on/off button.
To turn on the robot, first switch on the power switch on the Mini Driver. This is important in order to provide power to the motors. Then slide the iEP387 sideways, and press the power button. This should turn on the robot.
Please Note: When using the new Model B+ Pi, it’s possible to knock the micro SD card so that it comes out slightly. In this situation the power bank will turn off after a while, as the Pi won’t boot properly and so not enough current will be drawn to keep the power bank on. If you see the power bank turn off for no reason then please check that the micro SD card is seated properly. This forum post gives more details.
To turn off the robot you need to unplug the micro USB connector from the Pi, and turn off the power switch on the Mini Driver. This is not as neat as we’d like it, but there’s not really an easy way (without adding more hardware, and therefore more cost) to stop the Pi from drawing power from the power bank.
Pulling the power from the Pi shouldn’t damage anything as the robot’s software doesn’t write anything to the SD card. However, it’s always nice to let the Pi shutdown cleanly if possible, and so to do that you can use the shutdown button on the robot’s web interface.
I’ve had a couple of people say that their battery pack touches the wheels, so I’ve posted the following pictures to try to clarify things. The pictures are not great quality, but hopefully I’ll have time to update them in the new year.
As an alternative to inserting the battery pack in from the side, it’s also possible to remove the central support and slide the battery pack in from the front. This is a snug fit, but again there should be a few millimetres between the battery pack and the wheels. This gap can be increased by sliding the wheels slightly along the motor axles.
The post New Product – A Power Bank for your Raspberry Pi Robot appeared first on Dawn Robotics Blog.
The post Programming a Raspberry Pi Robot Using Python and OpenCV appeared first on Dawn Robotics Blog.
The interface is a Python library called py_websockets_bot. The library communicates with the robot over a network interface, controlling its movements and also streaming back images from its camera so that they can be processed with the computer vision library OpenCV. Communicating over a network interface means that your scripts can either run on the robot, or they can run on a separate computer. This feature is really useful if you want to use a computer more powerful than the Pi to run advanced AI and computer vision algorithms, or if you want to coordinate the movement of multiple robots.
In this post we show you how to install the interface library, and provide some example programs that show you how to make the robot move, how to retrieve images from the camera, and how to manipulate the images with OpenCV.
Before we talk about installing the py_websockets_bot library, the image below should hopefully give you a better idea about how the software on the robot works.
At a low level, we use an Arduino compatible board called a Mini Driver to control the hardware of the robot. Running on the Mini Driver is a sketch that listens over serial USB for commands from the Pi, whilst sending back status data (at the moment just battery voltage, but other sensors could also be added).
On the Pi, we run a web server written in Python that provides a web interface, and which listens for commands over the WebSockets protocol. When it gets commands, it sends them onto the Mini Driver via serial USB. The other program running on the Pi is a camera streamer called raspberry_pi_camera_streamer. This is a custom Raspberry Pi camera program we’ve written that streams JPEG images over the network. It can also stream reduced size images (160×120) for computer vision, along with motion vectors (coarse optical flow data) from the Raspberry Pi camera.
To control the robot, you can either use a web browser such as Chrome or Firefox, or now, you can also write a program using the py_websockets_bot library. Both of these will communicate with the web server and the camera streamer on the Pi using WebSockets and HTTP.
It may seem a bit overcomplicated to run a web server and communicate with it, rather than controlling the robot’s hardware directly, but it gives us a lot of flexibility. The web interface can be used to observe the robot whilst it’s being controlled by a script, and as mentioned before, control scripts can be written just once and then run either on the robot, or on a separate computer, depending upon how much computing power is needed. Also, in theory, you can write your control scripts in whatever language you like, as long as you have a library that can speak WebSockets. We have provided a Python library, but there are WebSockets libraries available for many other languages, such as Javascript, Ruby and the .NET framework.
Starting with the standard Dawn Robotics SD card, run the following commands on your Pi to make sure that the robot’s software is up to date
cd /home/pi/raspberry_pi_camera_streamer
git pull
cd build
make
sudo make install
cd /home/pi/raspberry_pi_camera_bot
git pull
Reboot your Pi to use the updated software.
Run the following commands to install the library’s dependencies
sudo apt-get update
sudo apt-get install python-opencv
then
git clone https://bitbucket.org/DawnRobotics/py_websockets_bot.git
cd py_websockets_bot
sudo python setup.py install
Installing on Windows is trickier, but involves the following steps
If needed, more details for OpenCV setup on Windows can be found at http://docs.opencv.org/trunk/doc/py_tutorials/py_setup/py_setup_in_windows/py_setup_in_windows.html
c:\Python27\python.exe setup.py install
We don’t have a Mac to test this on, but hopefully the installation process should be similar to installing on Linux. If there are any Mac users out there who could give this a go and let us know how they get on, we’d be very grateful.
Making the robot move is very straightforward, as shown in the code snippet below
import py_websockets_bot

bot = py_websockets_bot.WebsocketsBot( "ROBOT_IP_ADDRESS" )
bot.set_motor_speeds( -80.0, 80.0 )    # Spin left
For ROBOT_IP_ADDRESS you would put something like "192.168.42.1", or "localhost" if the script was running on the robot. The code snippet connects to the robot, and then starts it turning left by setting the left motor to -80% speed and the right motor to +80% speed.
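The left/right percentages follow the usual differential-drive convention: equal and opposite speeds spin the robot on the spot, while equal speeds drive it straight. If you find yourself thinking in terms of "forward speed" and "turn rate" instead, a small helper can mix the two into the motor values. This is a hypothetical helper of our own, not part of py_websockets_bot:

```python
def mix_speeds( forward, turn ):
    """Mix a forward speed and a turn rate (both -100..100) into
    (left, right) motor percentages, clamped to the -100..100 range.
    Positive turn steers right, so the left motor speeds up."""
    clamp = lambda v: max( -100.0, min( 100.0, v ) )
    left = clamp( forward + turn )
    right = clamp( forward - turn )
    return left, right

# Spinning left on the spot, as in the snippet above
left, right = mix_speeds( 0, -80 )    # gives (-80.0, 80.0)
```

You could then call, for example, bot.set_motor_speeds( *mix_speeds( 50, 20 ) ) to drive forwards whilst curving gently to the right.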
The example file motor_test.py shows more of the commands you can use. It can be run from the py_websockets_bot directory by using the command
examples/motor_test.py ROBOT_IP_ADDRESS
One of the really exciting things about using the Pi for robotics is that it has a good standard camera, and enough processing power to run some computer vision algorithms. The example file get_image.py shows how to get an image from the camera, and then use the OpenCV library to perform edge detection on it.
Run it as follows
examples/get_image.py ROBOT_IP_ADDRESS
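Edge detection itself is conceptually simple: look for places where pixel intensity changes sharply between neighbouring pixels. The sketch below hand-rolls that idea in plain NumPy so you can see the mechanism — OpenCV’s cv2.Canny, which get_image.py uses, does the same job far more robustly, with smoothing and hysteresis thresholding on top:

```python
import numpy as np

def simple_edges( gray, threshold=50 ):
    """Mark pixels where the horizontal or vertical intensity
    gradient exceeds a threshold. `gray` is a 2D uint8 array."""
    g = gray.astype( np.int32 )               # avoid uint8 wrap-around
    dx = np.abs( np.diff( g, axis=1 ) )       # horizontal gradient
    dy = np.abs( np.diff( g, axis=0 ) )       # vertical gradient
    edges = np.zeros_like( g, dtype=bool )
    edges[:, :-1] |= dx > threshold
    edges[:-1, :] |= dy > threshold
    return edges

# A synthetic image that is black on the left, white on the right
img = np.zeros( ( 4, 8 ), dtype=np.uint8 )
img[:, 4:] = 255
# The only edges found lie along the black/white boundary
```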
The Pi can run computer vision algorithms, but the processing power of its CPU is very limited when compared to most laptops and desktop PCs. One crude but effective way to speed up a lot of computer vision algorithms is simply to run them on smaller images. To support this, the py_websockets_bot library also offers routines to stream ‘small’ images which have been reduced in size on the GPU of the Pi. The standard camera images from the robot are 640×480, and the small images are 160×120.
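To see why this helps, note that a 160×120 image has one sixteenth of the pixels of a 640×480 one, so any per-pixel algorithm does roughly sixteen times less work. The robot does the resize on its GPU, but the size reduction can be mimicked on any image with a crude NumPy stride trick (cv2.resize, which filters the image properly, is the better tool in real code):

```python
import numpy as np

# A stand-in for a 640x480 greyscale camera frame
frame = np.arange( 480 * 640, dtype=np.uint8 ).reshape( 480, 640 )

# Take every 4th pixel in each direction: 640x480 -> 160x120.
# This is naive nearest-neighbour decimation; the robot's GPU
# (or cv2.resize) would low-pass filter the image first.
small = frame[::4, ::4]

assert small.shape == ( 120, 160 )
assert frame.size // small.size == 16    # 16x fewer pixels per frame
```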
The Raspberry Pi contains a very capable GPU which is able to encode images from the Pi camera into H264 video in real time. Recently, the clever people at Pi Towers added a feature to the Pi’s firmware that allows the vectors generated by the motion estimation block of the H264 encoder to be retrieved. What this means is that it’s possible to get your Pi to calculate the optical flow for its camera images, practically for free (0% CPU)! We haven’t managed to do anything cool with this yet, but we’ve provided routines to retrieve them from the camera so that other people can.
The motion vectors from waving a hand in front of the Raspberry Pi Robot’s camera (no really, squint a bit…)
To see this for yourself, run the following example
examples/display_motion_vectors.py ROBOT_IP_ADDRESS
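The H264 encoder produces one (dx, dy) vector per 16×16 macroblock, so a 640×480 frame yields roughly a 40×30 grid of vectors. One simple way to turn that grid into a single “how much is moving?” number is to average the vector magnitudes. The sketch below is illustrative only — the exact shape and layout of the array that py_websockets_bot hands back may differ:

```python
import numpy as np

def motion_amount( vectors ):
    """Average magnitude of a grid of (dx, dy) motion vectors.
    `vectors` is a (rows, cols, 2) array of block displacements."""
    v = vectors.astype( np.float64 )
    magnitudes = np.sqrt( v[..., 0]**2 + v[..., 1]**2 )
    return magnitudes.mean()

# A still scene gives zero motion...
still = np.zeros( ( 30, 40, 2 ), dtype=np.int8 )

# ...whilst a hand waved across the top of the frame does not
waving = still.copy()
waving[:10, :, 0] = 3    # 10 rows of blocks shifted 3 pixels right
waving[:10, :, 1] = 4    # and 4 pixels down (magnitude 5)
```

Thresholding the result of motion_amount is enough to trigger behaviour when something moves in front of the camera.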
Hopefully you can see that there are lots of possibilities for creating cool behaviours for your robot using the py_websockets_bot library and OpenCV. For example, you could write a script to get the robot to drive around your home. Perhaps you could get it to act as a sentry bot, emailing you an image if it detects motion. Or perhaps you could get it to detect and respond to people using face detection.
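The sentry bot idea, for instance, only needs a way to decide whether two consecutive camera frames differ. A minimal frame-differencing detector might look like the following — illustrative only, as a real detector would blur the frames first and require motion across several frames to avoid false triggers from noise:

```python
import numpy as np

def motion_detected( prev_frame, frame, pixel_threshold=25, fraction=0.01 ):
    """Return True if more than `fraction` of the pixels changed by
    more than `pixel_threshold` grey levels between the two frames."""
    diff = np.abs( frame.astype( np.int32 ) - prev_frame.astype( np.int32 ) )
    changed = ( diff > pixel_threshold ).mean()
    return changed > fraction

# Two identical 160x120 frames: no motion
a = np.zeros( ( 120, 160 ), dtype=np.uint8 )

# A bright 40x40 blob appears in the second frame: motion
b = a.copy()
b[40:80, 60:100] = 200
```

In a sentry script you would compare each small image from the camera against the previous one, and email a full-size snapshot whenever motion_detected returns True.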
You can get more details about the capabilities of py_websockets_bot in its documentation here, and the documentation for OpenCV can be found here. If you have any questions about the py_websockets_bot library, please post on our forums. You can also use our forums as a place to share your robotic projects with the rest of the world.