Two cameras travel along the pipe, one on top of each platform, and encoders are used to measure the distance they have moved. Both cameras are connected to a single PC, which runs vision recognition to find the red and green houses and our minion robots. The PC software plans paths for the minions, and communication is done using rkmesh radio modules.
The minion is a small remote-controlled robot designed to push cubes out of the area. It is built on the PisiBot platform. The new addition is a box-shaped structure whose front and back surfaces act as shovels for moving cubes. The top surface serves as a base for a marker sheet, which carries the information needed for image processing. The minion structure is 3D printed in PLA.
The pipe drive's objective was to place two cameras on the pipe so that they cover the whole competition field from above.
The pipe drive consists of two cars that are detached from each other at a fixed distance. Both cars had a camera attached to them, and the chassis of both were entirely 3D printed in PLA or ABS.
The leading car carried the electronics, a DC motor with an encoder, and the active part of the linkage system. It was built to level itself on one axis by keeping its center of mass below the drive wheel; additional ballast was used to achieve this. This also helped keep the camera facing the competition field.
The trailing car consists of a camera housing, two 3D-printed hooks, and the passive part of the linkage system.
The linkage system was composed of two parts. On the lead car there was a servo-controlled lever, which hooked onto the trailing car; the trailing car had notches to lock the lever in place. During movement, the servo could then unhook the two cars as necessary.
To make the trailing car stop when unhooked, another lever was placed on the trailing car, equipped with a high-strength magnet. It was positioned so that, as the first lever unhooked the two cars, it swung the magnet lever down close enough to the pipe for the magnet to lock onto it. This was enough to stop the trailing car immediately after it was unhooked.
The pipe drive was moderately successful. It managed to place the cameras over the competition field with 5 cm accuracy, but had problems with cable management and with the safety of the leading car.
The pipe drive electronics are built on a prototyping board whose main controller is an Arduino Pro Mini. The board has connectors for a DC motor and an encoder. In addition, there is a servo interface for the unlocking system and a switch for setting the servo position. Communication with the computer goes over a USB interface, through which the Arduino also gets its supply. A 5 V voltage regulator powers the servo motor. The pipe drive electronics seem to work well.
The minion code is relatively simple. Every cycle it attempts to read a packet from the radio module over UART and then executes whichever command function it has loaded. Each command is implemented as an ordinary function, and the functions are stored in a static array. A packet contains a function number followed by an array of arguments.
All commands are stateful through static variables, and usually reset their state when executed for the first time. This allows the same function to keep operating continuously until a new command is loaded.
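A minimal sketch of this dispatch scheme, assuming a fixed-size packet of one function number and four argument words; all identifiers here (Packet, commandTable, runCycle) are illustrative rather than the project's actual names, and the UART reading is omitted:

```cpp
#include <cstdint>

constexpr int MAX_ARGS = 4;

// A received packet: a function number followed by its arguments.
struct Packet {
    uint8_t function;
    int16_t args[MAX_ARGS];
};

// Every command is an ordinary function taking the argument array and
// a flag telling it whether this is its first run after being loaded.
using Command = void (*)(const int16_t* args, bool firstRun);

void cmd_stop(const int16_t*, bool) { /* stop the motors */ }

void cmd_drive(const int16_t* args, bool firstRun) {
    static long ticks = 0;        // state persists across cycles
    if (firstRun) ticks = 0;      // reset on first execution
    ticks += args[0];             // e.g. integrate the commanded speed
}

// Commands stored in a static array, indexed by the function number.
static const Command commandTable[] = { cmd_stop, cmd_drive };

// One main-loop cycle: load a new packet if one arrived, then run the
// currently loaded command, flagging its first execution.
void runCycle(const Packet* received, Packet& current, bool& firstRun) {
    if (received) { current = *received; firstRun = true; }
    commandTable[current.function](current.args, firstRun);
    firstRun = false;
}
```

Calling runCycle with no new packet simply re-executes the current command, which is how continuous operation falls out of the scheme.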
The pipe drive code waits for a packet containing the target positions of both cars. Once this packet arrives, a PID regulator starts moving both cars (the second car is attached to the first by the locking system). When the second car's target position is reached, the locking system opens and releases the second car while the first car keeps moving. When the first car arrives at its target position, it stops, and the controller sends a message to the PC indicating that both cars, with their cameras, have been transported to the desired locations. The pipe drive also sends a "ping" message at one-second intervals.
The general logic of the computer can be described as a single loop. Each cycle, the cameras scan the field for the target blocks and robots. The computer then handles pathfinding, either generating new paths for the robots or guiding them along existing ones.
The only input to the system is the camera feed. Feedback from the minions was not implemented, though they were capable of communicating with the PC by the end of the project.
The project is written in C++, utilizing the OpenCV 3.x library.
Identification of the target buildings was done with a standard closed-contour approach coupled with colour filtering.
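The project implemented this with OpenCV 3.x (typically a colour threshold followed by contour extraction). As a self-contained illustration of the colour-filtering half, a pixel can be classified by its hue; the threshold values here are illustrative guesses, not the project's calibrated ranges:

```cpp
#include <algorithm>
#include <cmath>
#include <string>

// Convert an 8-bit RGB pixel to a hue angle in degrees [0, 360).
double rgbToHue(int r, int g, int b) {
    double rf = r / 255.0, gf = g / 255.0, bf = b / 255.0;
    double mx = std::max({rf, gf, bf}), mn = std::min({rf, gf, bf});
    double d = mx - mn;
    if (d == 0.0) return 0.0;                 // achromatic, no hue
    double h;
    if (mx == rf)      h = std::fmod((gf - bf) / d, 6.0);
    else if (mx == gf) h = (bf - rf) / d + 2.0;
    else               h = (rf - gf) / d + 4.0;
    h *= 60.0;
    return h < 0.0 ? h + 360.0 : h;
}

// Classify a pixel as a red house, a green house, or background.
std::string classifyPixel(int r, int g, int b) {
    if (std::max({r, g, b}) < 60) return "background";  // too dark to trust hue
    double h = rgbToHue(r, g, b);
    if (h < 20.0 || h > 340.0) return "red";
    if (h > 90.0 && h < 150.0) return "green";
    return "background";
}
```

In the real pipeline the surviving pixels form a binary mask, and closed contours found in that mask mark candidate buildings.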
Identifying the robots proved more difficult. In the end, each robot was designated with two dots, one key dot and one unique dot. The key dot was blue on all robots. By comparing the distances between the detected dots, the program was able to identify individual robots, and by comparing the relative positions of the two dots, it was able to ascertain each robot's heading.
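The two-dot scheme can be sketched as follows, assuming the dot detector has already produced image coordinates; the pairing by nearest distance and the heading taken from the dots' relative position mirror the description above, while the names and structs are illustrative:

```cpp
#include <cmath>
#include <vector>

constexpr double kPi = 3.14159265358979323846;

struct Dot { double x, y; };   // dot centre in image coordinates

// Index of the unique-colour dot closest to a given blue key dot.
int nearestDot(const Dot& key, const std::vector<Dot>& uniques) {
    int best = -1;
    double bestDist = 1e18;
    for (std::size_t i = 0; i < uniques.size(); ++i) {
        double dx = uniques[i].x - key.x, dy = uniques[i].y - key.y;
        double d = dx * dx + dy * dy;    // squared distance is enough
        if (d < bestDist) { bestDist = d; best = static_cast<int>(i); }
    }
    return best;
}

// Heading in degrees, taken from the key dot toward the unique dot.
double headingDeg(const Dot& key, const Dot& unique) {
    return std::atan2(unique.y - key.y, unique.x - key.x) * 180.0 / kPi;
}
```

In practice a maximum pairing distance would also be needed, so that dots belonging to neighbouring robots are not matched together.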
The maximum number of robots is determined by the number of colours the system can reliably distinguish. In our case, given the camera's limitations, this turned out to be five.
Once the robots and targets were identified, a path could be generated. Paths consisted of nodes on the camera image, with an accuracy of one pixel. The plan was to apply a general pathfinding algorithm such as A* to the problem; however, due to the time it took to get the other components working, this was not accomplished.
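For reference, the planned step might have looked like the following minimal A* over a pixel grid, with 4-connected moves and a Manhattan heuristic; this is a sketch of the intended approach, not code from the project:

```cpp
#include <array>
#include <climits>
#include <cstdlib>
#include <functional>
#include <queue>
#include <vector>

// Length in steps of a shortest path from (sx, sy) to (gx, gy) on a
// grid where true marks an obstacle, or -1 if no path exists.
int aStar(const std::vector<std::vector<bool>>& blocked,
          int sx, int sy, int gx, int gy) {
    int h = static_cast<int>(blocked.size());
    int w = static_cast<int>(blocked[0].size());
    auto heur = [&](int x, int y) { return std::abs(x - gx) + std::abs(y - gy); };

    std::vector<std::vector<int>> best(h, std::vector<int>(w, INT_MAX));
    using Node = std::array<int, 4>;                     // {f = g + h, g, x, y}
    std::priority_queue<Node, std::vector<Node>, std::greater<Node>> open;
    open.push({heur(sx, sy), 0, sx, sy});
    best[sy][sx] = 0;

    const int dx[4] = {1, -1, 0, 0}, dy[4] = {0, 0, 1, -1};
    while (!open.empty()) {
        Node n = open.top(); open.pop();
        int g = n[1], x = n[2], y = n[3];
        if (x == gx && y == gy) return g;
        if (g > best[y][x]) continue;                    // stale queue entry
        for (int i = 0; i < 4; ++i) {
            int nx = x + dx[i], ny = y + dy[i];
            if (nx < 0 || ny < 0 || nx >= w || ny >= h || blocked[ny][nx]) continue;
            int ng = g + 1;
            if (ng < best[ny][nx]) {
                best[ny][nx] = ng;
                open.push({ng + heur(nx, ny), ng, nx, ny});
            }
        }
    }
    return -1;                                           // goal unreachable
}
```

A node-per-pixel grid matches the one-pixel path accuracy described above; recording each cell's predecessor during the search would turn the step count into the actual node list.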
Pathfinding and guidance worked in principle: the robots were guided by forwarding move commands with the correct PID parameters to them as often as possible. At certain speeds, this was controllable enough.
Unfortunately, the robot was not finished in time.