IPTPI is a custom-made construction. The two geared motors run on 5 V and are controlled directly by the Pololu DRV8835 Dual Motor Driver Kit for Raspberry Pi.
The motors are custom-enhanced by embedding optical encoders (each with two micro IR opto-pairs at 90°) directly in the gear case. The black-and-white disk is mounted directly on the motor shaft, before the gear reduction, so the encoders have very high precision. If you want to make the same hack – we share how to 🙂
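Two channels 90° out of phase form a standard quadrature signal, which gives both position and direction. Here is a minimal sketch of the decoding logic in Java (class and method names are illustrative, not the robot's actual firmware, which runs this kind of loop on the A-Star):

```java
// Minimal quadrature decoder sketch: channels A and B, 90 degrees out of
// phase, are combined into a 2-bit state; the (previous, current) state
// transition tells us whether the shaft stepped forward or backward.
public class QuadratureDecoder {
    // Transition table indexed by (previousState << 2) | currentState,
    // where a state is (A << 1) | B. Forward Gray sequence: 00->01->11->10.
    // +1 = forward step, -1 = backward step, 0 = no change / invalid jump.
    private static final int[] STEP = {
         0, +1, -1,  0,
        -1,  0,  0, +1,
        +1,  0,  0, -1,
         0, -1, +1,  0
    };

    private int prevState;
    private long ticks;

    public QuadratureDecoder(boolean a, boolean b) {
        prevState = state(a, b);
    }

    private static int state(boolean a, boolean b) {
        return ((a ? 1 : 0) << 1) | (b ? 1 : 0);
    }

    /** Feed the current levels of channels A and B; returns accumulated ticks. */
    public long update(boolean a, boolean b) {
        int cur = state(a, b);
        ticks += STEP[(prevState << 2) | cur];
        prevState = cur;
        return ticks;
    }

    public long ticks() { return ticks; }
}
```

Because the disk sits before the gear reduction, every motor-shaft step is multiplied by the gear ratio into a tiny wheel movement, which is where the high positioning precision comes from.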
The Raspberry Pi 2 (quad-core ARMv7 @ 900 MHz, 1 GB RAM) has a lot of processing power for an embedded board, so we are going to use it wisely. It is possible to program the robot directly on the Pi – no external computer needed. Both Eclipse and the Arduino IDE run well (actually there was a small problem with the Raspbian versions of Eclipse and Arduino – they were quite old, so we had to add Debian Jessie repositories manually, but a week ago the Raspberry Pi folks published Raspbian Jessie, so this has become easier).
The Raspberry Pi offloads most of the real-time event processing to an Arduino Leonardo clone – the A-Star 32U4 Micro. It has a very compact (coin) size (1.05″ × 0.6″ including the USB Micro-B connector), a 16 MHz Atmel ATmega32U4 AVR microcontroller, 32 KB flash, 2.5 KB SRAM, native full-speed USB (12 Mbps), a preloaded Arduino bootloader, 15 general-purpose I/O pins, an ISP header, 7 hardware PWM outputs, 8 analog input pins, and 5 V operation.
Some sensors are already mounted:
- 2 motor optical encoders – custom made, each with two channels; their hot event streams are offloaded to the Arduino micro and allow very high-precision robot positioning;
- 5 IR opto-pairs forming an optical array that allows IPTPI to follow a line and to stop if the end of the table is reached (at the front part of the robot frame – see the picture);
- 3D accelerometer, gyroscope, and compass provided by the MinIMU-9 v2 Gyro, Accelerometer, and Compass (L3GD20 and LSM303DLHC carrier) – integrated directly with the RPi2 board through GPIO (3 I2C addresses) – providing precise robot positioning and orientation feedback.
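To give a flavor of how the encoder streams turn into positioning, here is a classic differential-drive odometry sketch in Java. All the constants (ticks per revolution, wheel diameter, wheel base) are illustrative placeholders, not IPTPI's measured values:

```java
// Differential-drive odometry sketch: accumulate a 2D pose (x, y, heading)
// from left/right encoder tick deltas. Constants below are hypothetical.
public class Odometry {
    static final double TICKS_PER_REV = 1200.0;  // placeholder: counts per wheel revolution
    static final double WHEEL_DIAMETER_M = 0.065; // placeholder wheel diameter
    static final double WHEEL_BASE_M = 0.18;      // placeholder distance between wheels

    double x, y;     // position in meters
    double heading;  // orientation in radians

    /** Update the pose from tick deltas since the last call. */
    public void update(long dLeftTicks, long dRightTicks) {
        double mPerTick = Math.PI * WHEEL_DIAMETER_M / TICKS_PER_REV;
        double dLeft = dLeftTicks * mPerTick;
        double dRight = dRightTicks * mPerTick;
        double dCenter = (dLeft + dRight) / 2.0;          // distance traveled by robot center
        double dTheta = (dRight - dLeft) / WHEEL_BASE_M;  // change in heading
        // Integrate using the midpoint heading for better accuracy on arcs.
        x += dCenter * Math.cos(heading + dTheta / 2.0);
        y += dCenter * Math.sin(heading + dTheta / 2.0);
        heading += dTheta;
    }
}
```

In practice the IMU heading would be fused with this dead-reckoned pose, since wheel odometry alone drifts over time.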
There is also some more work to be done. We still have to mount additional sensors:
- 2 Sharp infrared short-to-mid-range precise distance sensors – SNS-GP2Y0A21YK0F from Olimex – allow the robot to detect obstacles at short to mid range;
- 2 infrared adjustable mid-range distance sensors with analog and digital outputs – SNS-IR-3-80 from Olimex – same purpose;
- two USB cameras + a precise ultrasonic distance sensor (6 m range, 1 cm precision) – to be mounted on a 2-axis gimbal driven by two metal-gear servos (expect more details about the construction soon).
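Ultrasonic rangers report the echo round-trip time, so distance is just time-of-flight arithmetic. A quick sketch (assuming ~343 m/s speed of sound, i.e. air at about 20 °C; the constant should be adjusted for temperature):

```java
// Convert an ultrasonic echo round-trip time to distance. Sound travels
// to the obstacle and back, so the one-way distance is half the path.
public class Ultrasonic {
    static final double SPEED_OF_SOUND_M_PER_S = 343.0; // assumes ~20 degrees C air

    /** Distance in centimeters from an echo round-trip time in microseconds. */
    public static double distanceCm(double echoMicros) {
        double pathM = SPEED_OF_SOUND_M_PER_S * (echoMicros / 1_000_000.0);
        return pathM / 2.0 * 100.0;
    }
}
```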
The idea is to utilize the power of Java 8 (multi-threading; reactive programming – Reactor, RxJava; the actor model – Akka; AI – subsumption architecture) + (asynchronous) GPIO provided by the Pi4J library (a WiringPi wrapper).
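To sketch the reactive style we have in mind, here is a tiny push-based event stream in plain Java 8 (kept dependency-free on purpose; the real robot code would use RxJava/Reactor observables fed by asynchronous Pi4J GPIO pin-change events, and all names here are illustrative):

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.function.Consumer;
import java.util.function.Predicate;

// A minimal push-based event stream: subscribers receive every published
// event, and filter() derives a new stream that forwards only matching
// events - the core idea behind RxJava/Reactor pipelines.
public class EventStream<T> {
    private final List<Consumer<T>> subscribers = new CopyOnWriteArrayList<>();

    public void subscribe(Consumer<T> subscriber) {
        subscribers.add(subscriber);
    }

    public void publish(T event) {
        subscribers.forEach(s -> s.accept(event));
    }

    /** Derived stream that forwards only events matching the predicate. */
    public EventStream<T> filter(Predicate<T> p) {
        EventStream<T> out = new EventStream<>();
        subscribe(e -> { if (p.test(e)) out.publish(e); });
        return out;
    }
}
```

With real hardware, the source events would be encoder edges or line-sensor readings pushed from Pi4J GPIO listeners, and behaviors (follow line, avoid obstacle, stop at table edge) would subscribe to filtered streams – which maps naturally onto the subsumption architecture.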
Some demos are already available, but most of the programming is still to be done – if you are interested, welcome to the RoboLearn hackathons. The source will be available on GitHub under the Apache v2 license.