Friday 17 June 2011

Course is finished

Today we had the final lecture, which concluded the course. As a closing post we would like to share a video of a real-time Mars mission with temperature measurement and duck dropping on a real lake.
After 8 weeks of intensive work we gained the following skills and knowledge:

-design and implementation of embedded motion systems;
-experience working with actuators, sensors, encoders and processors in the BrickOS environment;
-C programming skills such as multitasking, synchronization and communication;
-working in a team;
-time management and planning;
-applying theoretical knowledge in practice;
-problem solving in a real-time environment;
-independent learning with limited supervision;
-various examples from specialists with experience in industrial embedded software design.

Last but not least: it was a fun 8 weeks. We really enjoyed working as a team, with everyone trying to contribute as much as possible according to his skills. Although we didn't win the competition, we acquired very valuable experience that we will definitely use in our future careers. I think that is the most important part of a university education.

Thursday 16 June 2011

Mars mission: final presentation

Today we had the final presentation of the course. Eight weeks of hard work were summarized in 5 minutes per group.

Overall, no team managed to find all three lakes. We found and measured one lake. Another team measured the temperature twice, but on the same lake. The winning team measured the temperature on two lakes.

The next step is the peer evaluation. Tomorrow we are going to have a lecture by a guest lecturer from Mathworks.

Wednesday 15 June 2011

Mars mission: day before presentation

Here it is: Day X is coming. Everyone is in a rush, trying to polish their programme so that it will succeed.

Therefore the lab has been very crowded for the last two days. It was very difficult to find a place to put a laptop and the rover itself. The Mars landscape, camera and Earth computer were also used intensively by all groups, including us. We used our slots for the following final preparations.

1) Found the best lighting conditions for the rover to find lakes with the camera.
2) Determined the best initial position for the Mars mission.
3) Solved the problem of getting stuck in the space between the edge and a lake: we changed the steering angle and added a reverse movement so that the rover does not get lost between the edge and the lake.
4) Finalised the threshold values for the light sensors, in order to distinguish a lake from the edge.
5) Adjusted the length of the thread for the temperature motor.

All these preparations were done yesterday evening and early this morning, when we had reserved time slots. We also had a meeting with Jos, where he gave us final advice and wished us good luck.

We are also looking forward to the final presentations of the other groups.

Tuesday 14 June 2011

Temperature measurement and duck dropping

After the experiments and adjustments partially explained in previous posts, we performed a temperature measurement together with duck dropping. Here is the video.

Temperature sensor calibration.

To measure temperature we first needed to familiarize ourselves with how the sensor works. We used a trial-and-error approach combined with a least-squares fit.

We measured the temperature of various media such as ambient air, cold water, hot water, the human body and so on. We created a script that shows the measured values on the LCD of the brick.

We found out that the values are reported as raw readings (displayed in hexadecimal), so we had to convert between data types.

Next, we needed to calibrate the sensor. The values obtained by the sensor were compared with the values measured by a thermometer and plotted on an x-y graph. We then assumed the relation is linear, i.e. of the form alpha + beta*sensor_value, so the task was to find alpha and beta. After some trial and error we found the formula 113.78 - 0.002457*SENSOR_3.
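
Below is a minimal sketch of how this conversion can look in BrickOS code. It is not our exact programme: the use of ds_passive() for the (passive) Lego temperature sensor and the one-decimal display trick are assumptions.

#include <conio.h>    /* lcd_int() */
#include <dsensor.h>  /* SENSOR_3, ds_passive() */
#include <unistd.h>   /* msleep() */

int main(int argc, char **argv) {
    ds_passive(&SENSOR_3);   /* the Lego temperature sensor is a passive sensor */
    while (1) {
        /* linear calibration found above: degrees Celsius from the raw reading */
        float temp = 113.78 - 0.002457 * SENSOR_3;
        lcd_int((int)(temp * 10.0));   /* show tenths of a degree on the LCD */
        msleep(500);
    }
    return 0;
}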

Mars mission: integration

For the Mars mission we basically have three modules: lake finding using the camera, cliff avoidance using the light sensors, and temperature measurement using the probe. All three modules have been developed and now need to be integrated.

Here we faced several problems:

-Use of the bricks' sensor slots. We have now split the sensors by type: all three light sensors are connected to the first brick, whereas the encoders and the temperature sensor are connected to the second one. This forced us to change our separate programme modules during integration.

-The temperature sensor is sometimes lowered onto the cliff by mistake. When the rover then reverses, the sensor gets stuck and hinders the movement. We needed to change the probe-lowering mechanism and adapt the measurement software to avoid measuring temperature in the wrong places.

-Edge detection sometimes conflicts with the lake-finding module. This causes performance problems: the rover gets stuck in one place or tries to reach the lake while ignoring an obstacle on the way. We solved this problem by changing the priorities of the threads.
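
As an illustration, here is a minimal sketch (not our integration code) of how thread priorities are set in BrickOS with execi(): the edge-detection task is started with a higher priority than lake finding, so it can preempt it near a cliff. The task names and the priority offset are placeholders.

#include <unistd.h>   /* execi(), msleep(), PRIO_NORMAL, DEFAULT_STACK_SIZE */

/* Placeholder task bodies; the real ones poll the light sensors and the camera data. */
int edge_task(int argc, char **argv) { while (1) msleep(50); return 0; }
int lake_task(int argc, char **argv) { while (1) msleep(50); return 0; }

int main(int argc, char **argv) {
    /* Edge detection gets the higher priority so it wins when both are ready. */
    execi(&edge_task, 0, NULL, PRIO_NORMAL + 2, DEFAULT_STACK_SIZE);
    execi(&lake_task, 0, NULL, PRIO_NORMAL, DEFAULT_STACK_SIZE);
    return 0;
}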

Next, we need to test the integrated system in the lab in the real environment, instead of with simulated lakes and edges and fictitious camera coordinates.

Mars mission: relevant hardware changes

After facing communication problems and some integration problems, we decided to change the hardware configuration slightly.

First of all, we mentioned the communication problem in our last post. The solution is to point both bricks towards the ceiling or towards the tower on top of the setup, as shown in the photo.

Next, we mounted a frame for the camera so that it does not make the rover unstable and does not interfere with the rotating part of the front wheel.

There was also a problem with the temperature probe: it did not reach deep enough. So we changed the probe-lowering arm. In cruise mode it is now retracted and does not touch the floor; in measurement mode it goes down to the required depth and allows us to measure the temperature of a deep lake.
As can be seen, we also added a battery for additional weight and the duck-dropping mechanism.

Sunday 12 June 2011

Mars mission: first problems with communication

After completing the line-tracking assignment we are now focused more on the Mars mission. Technically speaking, we started this mission earlier and worked on it in parallel. The experience and code we gained during line tracking are partially recycled for the Mars mission; for example, the tape-detection routine is now used as the cliff-avoidance routine.

However, we ran into some problems with the Mars mission. They are related to the communication between the bricks and the Earth computer.

We found that the communication is very unreliable and looked for the reasons. The first key factor is the orientation of the RCX: the best orientation is with the RCX pointing at the ceiling, so that it sends its signals directly to the receiver or bounces them off the ceiling towards it. The other key factor is the communication error that occurs at the Earth computer when the two RCXs transmit at the same time. We therefore changed the configuration of the two RCXs so that only the master RCX continuously sends requests for coordinates to the Earth computer, and the slave RCX sends the temperature only when the master requests it.

We still see another problem: when the slave RCX sends the temperature, the Earth computer no longer receives from the master RCX and we lose communication again. We are checking a few things to solve this; as a last resort we could put the temperature sensor on the master brick and sacrifice the centre light sensor.
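
For reference, here is a minimal sketch of the request/reply pattern on the master, written directly against the BrickOS LNP addressing layer rather than our com.c wrapper. The port number, destination address and request byte (MY_PORT, EARTH_ADDR, REQ_COORDS) are placeholders, not the real values from our setup.

#include <lnp.h>      /* lnp_addressing_write(), lnp_addressing_set_handler() */
#include <unistd.h>   /* msleep() */

#define MY_PORT     1       /* hypothetical LNP port of the master RCX    */
#define EARTH_ADDR  0x80    /* hypothetical address of the Earth computer */
#define REQ_COORDS  'C'     /* hypothetical one-byte request code         */

static volatile int have_reply = 0;

/* Called by LNP when a packet for MY_PORT arrives. */
static void coords_handler(const unsigned char *data, unsigned char length,
                           unsigned char src) {
    have_reply = 1;   /* the real handler parses the lake coordinates here */
}

int main(int argc, char **argv) {
    unsigned char request = REQ_COORDS;
    lnp_addressing_set_handler(MY_PORT, coords_handler);
    while (1) {
        have_reply = 0;
        lnp_addressing_write(&request, 1, EARTH_ADDR, MY_PORT);
        msleep(300);   /* give the Earth computer time to answer;
                          on a timeout we simply ask again, and the slave
                          stays silent until it is polled */
    }
    return 0;
}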

All these problems, and others, are solvable. We need to run more tests and adjust both hardware and software to make the mission successful.

Friday 10 June 2011

Line tracking: presentation and results

Today we had the line-tracking presentation. The slides can be found at the following link:
Presentation.pdf

As for questions, the jury asked mostly about hardware and team management. For example:
-How did you divide your responsibilities among colleagues?
-Why didn't you choose a tank-based chassis?
-Why did you use wires when connecting the encoders to the chassis?
-How did you calibrate your encoder?

The second part was unsuccessful. On the one hand, the rover didn't get stuck and it did make the beep sound. On the other hand, it suddenly stopped in the middle of the tape and consequently measured only part of the tape length. Thus, the crate of beer went to group five.

Later we repeated the experiment and it worked perfectly. This lesson showed us that we can never be 100% sure that we have solved, or even predicted, all possible problems.

Thursday 9 June 2011

Line tracking: last minute preparation

Here it is: the line-tracking day is coming. In a hurry to improve as much as possible, we did some last-minute preparation.

The main problems were as follows:
-Lack of robustness: if the tape has breaks or scratches, line tracking fails
-Measurement error: about 3%
-Problems with the sound: sometimes it didn't work at all
-The rover kept moving after the tape ended
-The presentation was not self-explanatory enough

Experiments until late at night; blood and sweat in the lab... Finally, we fixed the problems and created a very nice presentation. A video of two different cases is shown below.

Monday 6 June 2011

Line-tracking: length measurement and sound making

After successfully implementing the steering and light-sensor processing parts, we started on the other two parts of this assignment: we tested the length measurement and created a programme for making sound.

We used the rotation encoder to measure length. We measured the length of the actual tape and recorded the corresponding number of encoder counts. Using these data we calibrated the sensor and added the necessary calculations to the software; the measured length is displayed on the LCD of the brick. As a result, we achieved a measurement with a relative error of about 3%.
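
A minimal sketch of the calculation is shown below. It is not our exact code, and the calibration constant TICKS_PER_CM is a placeholder for the measured encoder count divided by the measured tape length.

#include <conio.h>    /* lcd_int() */
#include <dsensor.h>  /* SENSOR_1, ROTATION_1, ds_active(), ds_rotation_on(), ds_rotation_set() */
#include <unistd.h>   /* msleep() */

#define TICKS_PER_CM 4.8    /* hypothetical value from our calibration run */

int main(int argc, char **argv) {
    ds_active(&SENSOR_1);           /* the rotation sensor is an active sensor */
    ds_rotation_set(&SENSOR_1, 0);  /* reset the tick count */
    ds_rotation_on(&SENSOR_1);
    while (1) {
        int length_cm = (int)(ROTATION_1 / TICKS_PER_CM);
        lcd_int(length_cm);         /* distance covered so far, in cm */
        msleep(200);
    }
    return 0;
}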

We also added a sound signal: when the rover detects the start of the tape it beeps.

Plan adjustment

We updated the initial plan. Now that line tracking is almost finished, our main priority is the final contest. Since there are only a few slots left in the camera schedule, our plan depends on that schedule. The main changes are as follows:

-the available camera hours will be used with maximum efficiency;
-since TU/e is closed on weekends, we are going to have an additional off-site session at Serhat's house;
-the main focus will be on integrating the three subtasks: edge detection, lake detection and temperature measurement;
-the presentations for both assignments will be discussed with Jos.


Camera tests: results and problems

Today we did our first experiments with the camera in the DCT lab. The results are as follows:

-We found the optimal viewing angle and height of the mounting frame;
-We chose the optimal lighting conditions so that lakes are identified without noise;
-We determined that the communication with the camera is not reliable, so we need to change the software slightly to make it more robust against communication loss.

Sunday 5 June 2011

Line-tracking preparation and edge avoidance for Lake-finding

After some tests the line-tracking code was improved further. The movement is now smoother and more precise.


We also started on the edge-avoidance routine. This routine will run in parallel with lake detection and will keep the rover from falling off the Mars surface. It is based on detecting the different light reflection at the edge: if the left light sensor detects the edge, the rover stops and the front wheel turns 90 degrees to the right; for the right light sensor it is the other way around. Separately we will add a routine for avoiding the plateau using the central sensor. The programme is not ideal yet: sharp edges, where both sensors detect the edge at once, still need to be handled better.
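
Below is a minimal sketch of the idea, built on the helpers from our SSSRLibrary.h (see the post below). The threshold EDGE_REF, the sensor ports and the steering angles are placeholders; the real values come from calibration.

#include <dsensor.h>      /* SENSOR_1, SENSOR_3, LIGHT_1, LIGHT_3, ds_active() */
#include <unistd.h>       /* msleep() */
#include "SSSRLibrary.h"  /* move(), steer(), stopDriving(), isObstacle() */

#define EDGE_REF 35       /* hypothetical light threshold for the edge */

int main(int argc, char **argv) {
    ds_active(&SENSOR_1);                        /* left light sensor (port assumed)  */
    ds_active(&SENSOR_3);                        /* right light sensor (port assumed) */
    move(0.5);                                   /* cruise forward at half speed */
    while (1) {
        if (isObstacle(LIGHT_1, EDGE_REF)) {     /* edge seen on the left  */
            stopDriving();
            steer(90);                           /* turn the front wheel to the right */
            move(0.5);
        } else if (isObstacle(LIGHT_3, EDGE_REF)) {  /* edge seen on the right */
            stopDriving();
            steer(-90);                          /* turn the front wheel to the left */
            move(0.5);
        }
        msleep(50);
    }
    return 0;
}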

New library - SSSRLibrary.h

After some additional experiments we decided to create a new library. In it we collected all the functions we had written and verified that will be useful for the Line-tracking and Lake-finding assignments. We called it SSSRLibrary =)

It consists of the following functions:


void move(float speedfrac);                  /* drive forward at a fraction of full speed */
void stopDriving(void);                      /* stop the drive motor */
void stopSteering(void);                     /* stop the steering motor */
void steer(int angle);                       /* turn the front wheel to the given angle */
int  isObstacle(int sensor, int reference);  /* compare a sensor reading against a threshold */

Tuesday 31 May 2011

Time management and team management

After the meeting with Jos we decided to be more organized.

We looked at our agenda and suddenly realised that the final contest is within two weeks, while we are still not sufficiently prepared for it. We also found that the sign-up list for the camera is quite crowded. Taking into account the many upcoming holidays, time will be crucial. All of this pushed us to step up our preparation and start using our time and people more effectively.

First we defined the main tasks and split them into subtasks. Then we created subteams, each responsible for a particular task. Based on this we found suitable time slots and distributed the activities over the two-week period. The result is a "2BeDone" table, shown below. The next group meeting is tomorrow.


Meeting with Jos, week 7

We met Jos and showed him the results of our line-tracking assignment. He was satisfied with the current progress. However, he strongly encouraged us to start implementing the LakeFinding assignment in parallel. He also recommended that we post conclusions about our coding instead of posting the code as-is.
The next meeting is, as usual, on Monday.

Line tracking: current results and problems

After implementing both the software and hardware parts mentioned before, we ran initial tests. These tests revealed some problems to work on:

-Switching between rotational and forward movement at specific light-sensor positions
-Turns that are too sharp, which yield inaccurate measurements and an extreme slalom trajectory

However, we obtained good results. This assignment lets us test the machine before the final contest. For example, we will reuse our code and knowledge in areas such as:

-Communication between the bricks and the Earth station
-Estimating the distance and direction to the lakes
-Using the sensors to detect landscape edges and the plateau
-Controlling the turning motions to choose the best trajectory to a lake

The results are shown in the video below. They are still not ideal, but we are on the right track.

Line tracking: hardware part

As mentioned in the previous post, we created a programme that performs "slalom-style" movement. With this in mind, we decided to change the hardware slightly.

First of all, we decided to increase the sensitivity of the light sensors. Robin adjusted this at the software level so that the difference between the tape and the table is detected more clearly.

Then we decided to increase the precision of the encoder. Initially it was connected directly to the front-wheel axle, which has two disadvantages. First, the length measurement is disturbed by the curvy trajectory caused by the rotations of the front steering wheel. Second, one full rotation of the front wheel equals only 16 encoder ticks, which is not enough precision for small distances. After analysing this we decided to improve the mechanical design. We made the following changes (a small worked example follows the list).

1) Moved the encoder from the front wheel to one of the rear wheels.
2) Connected the encoder to the axle indirectly through a coupling of two gear wheels: the gear with the larger radius is mounted on the wheel axle and drives the gear with the smaller radius, which is connected to the encoder. As a result, one full rotation of the wheel corresponds to several rotations of the encoder, which increases the precision of the length measurement.
3) Added a differential to the rear axle.
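
As a small worked example of the gain in resolution (compiled on a PC, not the RCX; the 24:8 gear pair and the 8.2 cm wheel circumference are hypothetical numbers, not measurements of our rover):

#include <stdio.h>

int main(void) {
    double ticks_per_rev   = 16.0;        /* encoder ticks per encoder revolution  */
    double gear_ratio      = 24.0 / 8.0;  /* wheel gear teeth / encoder gear teeth */
    double circumference   = 8.2;         /* wheel circumference in cm             */
    double ticks_per_wheel = ticks_per_rev * gear_ratio;   /* 48 ticks per wheel turn */
    printf("with gearing:    %.2f cm per tick\n", circumference / ticks_per_wheel);
    printf("without gearing: %.2f cm per tick\n", circumference / ticks_per_rev);
    return 0;
}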




These changes improved the rover's chassis and will make the programming easier and the results better for both assignments.

Line tracking: software part

After successfully fixing the encoder problem we had two sessions, on Saturday 28 May and on Monday 30 May, and made good progress on the line-tracking assignment. Here is a short explanation of the software part.

We decided to use a master-slave architecture, meaning that the two bricks run different code. The master brick does the main computational work; the slave brick reads its sensors and transfers the values to the master.

This is useful because for the Lake-finding assignment we will need communication between the bricks as well as with the Earth computer. We therefore decided to use the com.c library and studied threading and semaphores in the BrickOS environment.
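
As a minimal sketch (not our actual code) of how a BrickOS semaphore can protect values shared between the communication thread and the control thread; the variable names and timings are placeholders:

#include <conio.h>      /* lcd_int() */
#include <semaphore.h>  /* sem_t, sem_init(), sem_wait(), sem_post() */
#include <unistd.h>     /* execi(), msleep(), PRIO_NORMAL, DEFAULT_STACK_SIZE */

static sem_t data_sem;
static int slave_light = 0;     /* last light value received from the slave */

int comm_task(int argc, char **argv) {   /* writer: stores values received over IR */
    while (1) {
        sem_wait(&data_sem);
        slave_light = 42;       /* placeholder: the real code parses the IR packet */
        sem_post(&data_sem);
        msleep(100);
    }
    return 0;
}

int control_task(int argc, char **argv) {  /* reader: uses the shared values */
    int light;
    while (1) {
        sem_wait(&data_sem);
        light = slave_light;    /* copy under the lock, then act on it */
        sem_post(&data_sem);
        lcd_int(light);
        msleep(50);
    }
    return 0;
}

int main(int argc, char **argv) {
    sem_init(&data_sem, 0, 1);  /* binary semaphore used as a mutex */
    execi(&comm_task, 0, NULL, PRIO_NORMAL, DEFAULT_STACK_SIZE);
    execi(&control_task, 0, NULL, PRIO_NORMAL, DEFAULT_STACK_SIZE);
    return 0;
}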

The slave brick has a very simple programme. It reads the values from the three light sensors and the rotation encoder, then starts and initializes a thread for the IR port and sends the values to the master brick, which has a fixed ID.

The master brick's programme consists of four motion-control functions: move(), stopDriving(), stopSteering() and steer(int angle). A service function init() initializes all sensors. The function isObstacle(int sensor, int trigger) determines whether the rover is on the line or outside it. Finally, main(int argc, char **argv) implements the main logic of the assignment.

The main idea is as follows. At the start the light sensors record the contrast values of the tape and the table. These values are stored and continuously compared with the new sensor readings. If the sensors are detected to be off the tape, the direction of movement is calculated immediately and the steering motor turns to the required angle. To stay on track, forward movement is stopped while the rotation is performed; as soon as the required angle is reached, forward movement continues so the rover returns to the track. Once it is back on the tape, the steering angle is set to zero for straight movement; otherwise, rotation and steering continue.
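
A minimal sketch of this loop is shown here; it is not our full master programme. The helpers are the functions listed above (declared here, implemented elsewhere in the master code), and TAPE_REF, the sensor ports and the steering angles are placeholders for the calibrated values.

#include <dsensor.h>   /* SENSOR_1, SENSOR_3, LIGHT_1, LIGHT_3, ds_active() */
#include <unistd.h>    /* msleep() */

#define TAPE_REF 40    /* hypothetical threshold between tape and table */

/* Provided elsewhere in the master programme (see the list above). */
void move(float speedfrac);
void stopDriving(void);
void steer(int angle);
int  isObstacle(int sensor, int trigger);

int main(int argc, char **argv) {
    ds_active(&SENSOR_1);          /* left light sensor (port assumed)  */
    ds_active(&SENSOR_3);          /* right light sensor (port assumed) */
    move(0.5);
    while (1) {
        if (isObstacle(LIGHT_1, TAPE_REF)) {        /* drifting off to the left  */
            stopDriving();
            steer(30);             /* steer back towards the tape */
            move(0.5);
        } else if (isObstacle(LIGHT_3, TAPE_REF)) { /* drifting off to the right */
            stopDriving();
            steer(-30);
            move(0.5);
        } else {
            steer(0);              /* back on the tape: go straight */
        }
        msleep(20);
    }
    return 0;
}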

Meanwhile, the encoder reports the rotations, and this information is converted into the distance covered by the rover. This distance is taken as the line length and shown on the LCD of the brick.

As a result, we get a "slalom-style" movement that lets the rover stay on the line no matter how curvy it is.

Tuesday 24 May 2011

Problem with hardware: encoder malfunction

Yesterday we did our first experiments for the line-tracking competition. We have done the following:

1) Activated and read out the sensors, i.e. the motor encoders and light sensors
2) Implemented an initial programme that adjusts the movements according to the sensor information

Here we had a problem: one of the encoders doesn't work; it doesn't send any information to the RCX. We double-checked by swapping the two encoders and the problem remained, which means it is a hardware problem, not a software one. The faulty part is a rotation encoder.

Jeroen said that this is not a problem and is easy to fix: we just need to return the defective one and he will replace it. Hopefully we will do that tomorrow.

A Three-Wheeled Design, Why?

By now our hardware design is finished. Since the line-tracking assignment and the final contest are no more than a week apart, we decided to create a rover that can do both tasks without design adjustments. Furthermore, we would like to be original and think for ourselves, so we're not using any of the standard designs.

The most important requirements we determined during our first meetings were:

-It should be able to explore the Mars landscape with the camera without going back and forth; the rover should be able to turn around its vertical axis.

-Because we will have to use encoder values in order to measure the length of the tape, the wheels are not allowed to slip.

-Hysteresis on the steering mechanism should be minimized, so we can be very sure about the direction we're going with respect to our previous position.

-The risk of getting stuck in a lake should be reduced as much as possible.

At first these constraints resulted in a caterpillar-like design, but since the wheels that drive the caterpillar tracks might slip within the track, this idea was dropped. Then a more ingenious design was proposed, with two wheels on the left and two on the right that allow the rover to steer in a caterpillar-like way. But since these tyres could also slip, and because one of the wheels could get stuck in a lake, this design was also dropped.

Our final design uses three wheels. Steering and driving are both done on one wheel, which reduces hysteresis in the steering mechanism. Because we're using three wheels the rover is not 'statically over-determined', which reduces the risk of slipping wheels (important for line tracking). Using fewer wheels also reduces the risk of a wheel getting stuck in a lake, but it comes at the cost of less stability. That's why we had to put the heaviest parts (the bricks) as low as possible. Below, the finalized design is shown with some more explanation.



A – Light sensors are in line-tracking position; the ones on the side can easily be moved further outwards for the Mars mission.

B – Direct connection between the wheel itself and the steering gear in order to reduce hysteresis. The steering wheel can be turned 90 degrees to allow the rover to turn around its vertical axis.

C – The eyes of the rover are small crosses; it makes the robot look drunk, but that is just to mislead people.

D – We were too late collecting extension cords; only the very long ones were still available...

E – The motor that moves the temperature sensor up and down is placed in the back to improve weight distribution.

F – Handle to lift the rover without Lego parts falling off.

G – The gear ratio has been changed slightly, compared to the standard designs, to make the rover go faster. Of course, this comes at the cost of torque.

H – To increase the stiffness of the steering axis, a sliding surface is created with Lego bricks placed upside down. Scotch tape is used to reduce friction.

Monday 23 May 2011

Lake finding algorithm:

This is a state diagram of the robot's operation for finding a lake. The algorithm is based on the mechanical design of the robot, whose front wheel can rotate 90 degrees. It starts by checking whether there is a lake in the camera view. If not, it steers the front wheel 90 degrees to the right and rotates until it either finds a lake or a light sensor detects a hole. On hole detection it steers to the other side and keeps searching for a lake. If it finds a hole again, it is between two holes, so it moves backwards a bit and starts searching again. Upon finding a lake, the robot moves forward while adjusting the steering angle of the wheel as a function of the lake's orientation relative to the camera image. When the lake disappears from the image because of the dead zone, the robot moves further until the centre light sensor finds the lake; it then measures the temperature and sends it. If a hole is detected while moving towards the lake, the robot tries to move around the hole while keeping track of the orientation of the targeted lake. This is a recursive procedure of steering to the other direction, moving forward and steering back until there is no obstacle and the lake is reached safely.
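
A minimal sketch of these states in C is given below; it is not our actual implementation. The condition functions are stubs standing in for the camera and light-sensor checks, and the transitions follow the description above.

#include <unistd.h>   /* msleep() */

enum state { SEARCH_RIGHT, SEARCH_LEFT, BACK_UP, APPROACH_LAKE, MEASURE };

/* Stubs standing in for the real checks. */
int lake_in_view(void)  { return 0; }   /* Earth computer reports a lake in the image */
int hole_detected(void) { return 0; }   /* a light sensor sees a hole                 */
int at_lake(void)       { return 0; }   /* the centre light sensor is in the lake     */

int main(int argc, char **argv) {
    enum state s = SEARCH_RIGHT;
    while (1) {
        switch (s) {
        case SEARCH_RIGHT:      /* rotate right until a lake or a hole appears */
            if (lake_in_view())       s = APPROACH_LAKE;
            else if (hole_detected()) s = SEARCH_LEFT;
            break;
        case SEARCH_LEFT:       /* rotate the other way; a second hole means back up */
            if (lake_in_view())       s = APPROACH_LAKE;
            else if (hole_detected()) s = BACK_UP;
            break;
        case BACK_UP:           /* reverse a bit, then start searching again */
            s = SEARCH_RIGHT;
            break;
        case APPROACH_LAKE:     /* drive forward, steering on the lake's bearing */
            if (hole_detected())      s = SEARCH_LEFT;   /* go around the hole */
            else if (at_lake())       s = MEASURE;
            break;
        case MEASURE:           /* lower the probe, send the temperature, continue */
            s = SEARCH_RIGHT;
            break;
        }
        msleep(50);
    }
    return 0;
}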






Wednesday 18 May 2011

Software planning and chassis improving

Today all five of us worked on the project. We started with a discussion of the chassis design. There were two options and we chose the better one.

Then we split up into two teams. Robin and Serhat worked on the mechanical design. Sergiy, George and I defined the general block scheme of the software algorithm as well as the software modules. We concluded that there should be three modules: sensors & actuators, image processing & communication, and motion control. We drew a state diagram and defined an information flow chart.

Next meeting is on Monday.



Tuesday 17 May 2011

Meeting with Jos

Today we had our first meeting with our tutor, Jos.

The general conclusion of the meeting is as follows: we can use anything we want, and there are no ready-made solutions. The only approach is to experiment and check.

Robin also changed the chassis; it is now smaller.




The next session is tomorrow, after lunch.

Monday 16 May 2011

Questions to tutor, mechanical changes

Today we met to discuss our questions for the tutor. Tomorrow we will have a meeting with him and we need to clarify some issues.

For example, we need more information about image processing, software architecture, steering mechanics, the lake-exploration algorithm, the line-tracking conditions, etc.

We also realised that there are only 4 weeks left until the final presentation, so we need to work twice a week instead of once.

As for progress, over the weekend Robin made the rover perform some open-loop movements using simple C scripts.
He also proposed various chassis configurations that will make the rover more robust.
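
For the curious, a minimal example of such an open-loop script in BrickOS is given below; the motor port and the timings are assumptions, not Robin's actual code.

#include <dmotor.h>   /* motor_a_speed(), motor_a_dir(), fwd, rev, brake, MAX_SPEED */
#include <unistd.h>   /* sleep() */

int main(int argc, char **argv) {
    motor_a_speed(MAX_SPEED);   /* drive motor assumed on output A */
    motor_a_dir(fwd);           /* drive forward...                */
    sleep(2);                   /* ...for two seconds, no feedback */
    motor_a_dir(rev);           /* then back up for one second     */
    sleep(1);
    motor_a_dir(brake);         /* and stop                        */
    return 0;
}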

Friday 13 May 2011

Getting Familiar

This week Robin joined the group and started making up for lost time. He got to know most of the other members and was quickly briefed on the progress so far. Working towards our first tutor meeting, the group spent time getting familiar with the Lego Mindstorms setup and refreshing our knowledge of C. For the latter we used a PDF of the book recommended by Van de Molengraft, 'The C Programming Language' by Kernighan and Ritchie. We've also started reading 'Real-time Concepts for Embedded Systems' by Qing Li and Caroline Yao.

We've inspected the Mars rover as provided to us. Two concerns are (1) backlash in the steering mechanism and (2) the differential behaviour once one of the driving wheels gets stuck in a lake. Once stuck, we expect the rover to keep circling around the lake with one wheel trapped, because the wheel that's in the lake can rotate freely (allowing the other driving wheel to keep exerting force on the Mars surface). Despite these concerns we'll focus on developing software first and leave the hardware unchanged for a while. The current state of our software is that we can run the various demo examples on the RCX.

A shared Dropbox folder has been created to share project code efficiently amongst group members.

Thursday 12 May 2011

Meeting with Robin

Finally, all 5 group members have arrived. Today we welcomed Robin and gave him a short intro to what we have done so far.

As agreed, Serhat, Sergiy and I will not work this coming Monday because we will be busy with another project, the Pizza robot. Robin decided to work independently on Monday.

We saw that other groups created new chassis for their rovers, using caterpillar tracks instead of wheels. That's a nice idea; maybe we should think about that too.

Tuesday 10 May 2011

Initial group meeting results

On Monday we had our first group meeting. We created a rough plan for the coming 5 weeks, outlining the basic steps and formulating some problems to solve.

We decided to work once a week, on Mondays, spending the entire day from 9:00 to 18:00. We will also take a parallel approach, with different subgroups working on different parts, and integrate after each step, as we were taught in the lecture.

The first problems started with Ubuntu. Each of us has a different laptop, so a single approach didn't work: some of us use a USB stick, some a live CD, and some a virtual machine. Sergiy and I successfully set up the USB stick, ran the simple examples shown on the Wiki and made our rover drive straight. Serhat and George are still working on a correct installation.

This week Robin will join us. Next Monday three of us will be busy with the Pizza-robot presentation, so we need to find a solution for this scheduling overlap.

Wednesday 4 May 2011

Group created, tutor assigned

Finally we completed our group; we now have five members.

The tutor has also been assigned, and a meeting request was sent to him.

This coming Monday we plan to have our initial group meeting. Everyone except Robin will attend; Robin will join us later.