
...

  1. Located in path_fns.c
  2. Takes multiple (NUM_MEAS) position measurements at the initial rest position.
  3. Moves the vehicle forward slightly and takes multiple measurements at the new rest position.
  4. Selects the median x- and y-coordinates at each position and uses those to compute the initial heading angle (z_zeta[0]) via the arctan helper atan_smart() (http://wiki.eecs.umich.edu/delvecchio/index.php?title=Lab_Documentation#atan_smart.28.29); a sketch of this computation follows the list. Also sets the median coordinates of the second rest position as the initial position for the estimator, run_estimator() (http://wiki.eecs.umich.edu/delvecchio/index.php?title=Lab_Documentation#run_estimator.28.29), i.e. xx[0] and yy[0].
  5. Sets the zeta_set flag to indicate that initialization is complete. This prevents init_heading() from being run during future iterations.
  6. Uses ovrd_throttle to control vehicle motion during initialization (0 = no override, -1 = stop, 1 = high throttle)
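
A minimal sketch of the geometry in step 4, assuming atan_smart() behaves like a quadrant-aware arctangent; the function name initial_heading and its signature are mine (the real code in path_fns.c also handles the NUM_MEAS sampling and median selection):

    #include <cmath>

    // Hypothetical helper showing how the initial heading can be derived from the
    // two median rest positions (x1, y1) and (x2, y2), in the spirit of init_heading().
    double initial_heading(double x1, double y1, double x2, double y2)
    {
        // Heading is the direction of travel between the two rest positions.
        return std::atan2(y2 - y1, x2 - x1);   // stands in for atan_smart()
    }

The second rest position (x2, y2) is also what gets stored as xx[0] and yy[0] for the estimator.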

...

This is the version of maintain_velocity() that must be used on cars 1, 2, and 3. It uses a PID controller to keep the car moving at a target velocity.
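
For reference, a generic discrete PID velocity loop is sketched below. This is not the lab's maintain_velocity() implementation (its gains and interface live in the car code); it only illustrates the structure such a controller takes, and the gains and loop period are placeholders:

    // Generic PID velocity controller sketch (not the actual maintain_velocity()).
    struct PID {
        double kp, ki, kd;        // gains -- placeholders, must be tuned per car
        double integral = 0.0;
        double prev_error = 0.0;

        // target and measured are velocities (e.g. mm/s); dt is the loop period in seconds.
        double update(double target, double measured, double dt) {
            double error = target - measured;
            integral += error * dt;
            double derivative = (error - prev_error) / dt;
            prev_error = error;
            return kp * error + ki * integral + kd * derivative;   // throttle command
        }
    };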

maintain_velocity_PWM()

This is the function used on cars 4, 5, and 6 to maintain a target velocity. It uses a simple linear mapping from desired speed to PWM. It takes as arguments a pointer to an integer PWM value and an integer giving the target speed; the PWM integer will be set to the PWM necessary to maintain that speed. This function must be used somewhat differently than the maintain_velocity() function used for cars 1-3. We need to let the .tea code know that we are giving a PWM value directly, rather than a throttle value that needs to be converted to a PWM. We do this by setting the throttle value to -1000 before setting the PWM value. The .tea code on cars 4-6 is written such that a throttle value of -1000 will cause it to read from the PWM slot in the Brainstem (slot 2) and give that value directly to the motor (after scaling it by a factor of 3). Thus, to use this function, first set the throttle to -1000 [set(THROTTLE, -1000, stemRef)], call maintain_velocity_PWM() to find the correct PWM value for the speed you want, and then write that value into the PWM slot [set(PWM, pwm, stemRef)].
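
A sketch of that calling sequence, under the assumption that set() takes (constant, value, stemRef) as shown above and that maintain_velocity_PWM() takes a pointer to the PWM value plus the target speed. Every declaration here is a placeholder standing in for the real headers on the car (setget.h etc.); the THROTTLE and PWM constant values in particular are not the real ones:

    typedef void *StemRef;   // placeholder for the real Brainstem handle type
    enum { THROTTLE = 0 /* placeholder */, PWM = 1 /* placeholder; writes to Brainstem slot 2 */ };

    void set(int which, int value, StemRef stem);             // from setget.c
    void maintain_velocity_PWM(int *pwm, int target_speed);   // linear speed-to-PWM map

    void drive_at(StemRef stemRef, int target_speed)
    {
        int pwm = 0;
        set(THROTTLE, -1000, stemRef);              // tell the .tea code to use the PWM slot directly
        maintain_velocity_PWM(&pwm, target_speed);  // find the PWM for the desired speed
        set(PWM, pwm, stemRef);                     // .tea code scales this by 3 and drives the motor
    }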

Also note that in some of the older code, the setget.c file has the PWM section of the set() function commented out with a note that it can't be set. Simply un-comment the line and make sure that the correct slot (2) is being written to, and you should be able to set PWM without any trouble.

...

string
The string type is a fixed series of ASCII-encoded text characters.
Warning: The string type is fairly limited and quite expensive in terms of storage in both the program
code space as well as stack space. Care should be taken when using this type.unmigrated-wiki-markup


Tea Programs

...


Reset_Reflex.tea

  1. Enables the Push Button override to stop the car
  2. Load to 4 0

...


timer.tea

  1. Runs the timer for the dynamics
  2. Load to 4 1


timer_HV.tea

  1. Use for cars 4, 5, 6. Runs the timer for the dynamics
  2. Load to 4 1


final_program.tea

  1. Contains all the calls to the dynamics functions
  2. Computes and applies the PWM
  3. Load to 4 11
  4. Compiles in the car libraries

...


final_program_HV.tea

  1. Use for cars 4, 5, 6
  2. Contains all the calls to the dynamics functions
  3. Computes and applies the PWM
  4. Load to 4 11
  5. Compiles in the car libraries


final_program_contMapC1.tea

  1. Same as final_program.tea but contains braking functionality and slightly different coefficients in the motor map
  2. Braking uses brake values set in the scratch pad, but uses them as negative torque values in a continuous motor map
  3. Braking uses a different filter from the other dynamics
  4. Load to 4 11, use on cars 1, 2, 3
  5. Currently the most up-to-date .tea containing braking functionality
  6. Braking filter still under construction but works fine for the car at speeds <= 1000 mm/s

...


RoboCar_Lib.tea

  1. Library functions used by final_program.tea
  2. Load to 4 2


RoboCar_Lib_d6.tea

To be added.


final_program_2.tea

  1. Contains all the calls to the dynamics functions with the braking dynamics added as well.
  2. Computes and applies the PWM for both positive and negative torque-maps (braking added)
  3. Load to 4 11
  4. Compiles in the car libraries

...


General FAQ

  1. What commands can be used to look at or edit code in the SSH terminal?
    • You can use "cat" to read only, or any of the text editors to edit (vim/emacs).
  2. How do you move your code into the SSH terminal so that you can compile it?
    • Go to the transfer window instead of the client window (the one that has the file-manager-type interface).
  3. Do you have to be linked to the lab's internet network to use the SSH terminal via your laptop, or can you use it on CAEN?
    • The way it's set up right now, the IP addresses you connect to are local to the lab network. You can forward the port and access it from the Internet, but I don't know the details of that. Read up on port forwarding if you think the time spent doing this is worth the convenience of working from home. Note: it is very easy to link to the lab's internet network. Connect like you would to CAEN and use the password given in the wireless troubleshooting section below.
  4. Is the positioning data stored locally on the cars or is it stored on the computer running the simulation?
  5. How do you create a new project on the cars?
    • Copy the entire directory of a working project. Change the project name in makefile and makefile.Linux. Delete all .d files in the aUnix_<project name>/aDebug and aUnix_<project name>/aRelease folders.
    • Note: The .d files specify the source files used to compile the object files. The way make_program.Linux is written now, poorly formatted .d files will cause the make to fail with the "missing separator" error, instead of overwriting the .d files.
  6. Where is all the software located on the cars?
    • The source code for all of the projects is located at /root/projects, /root/project, or /root/Project, depending on the particular car. After compiling a program, the executable will be located at /root/Desktop/brainstem/aDebug/aUnix/i686. Alternatively, on cars 3-6 you can just go to /root/i686 and find the compiled program there.


Hardware


Car


Car Power-Up Procedure

Be sure to follow the power up/down procedure posted on the wall (reproduced below):

...

  1. Power vehicle down
    • Turn off MiniITX
    • Turn vehicle power switch OFF
  2. Charge batteries
    • Connect charger to vehicle

...


New car motor settings (cars 4,5,6)

...

  1. Hold down the button on the motor controller described in the previous one touch programming instructions (but start with the controller already on).
  2. Keep holding the button while the 4 lights on the controller cycle through different patterns until all 4 lights are lit simultaneously, at which point you should release the button.
  3. The number of times that the lights now flash indicates the current throttle profile.
  4. Use the button to toggle through the profiles until #2 is selected (if #2 is already selected, you can just wait a few seconds and the controller will return to normal operation).
  5. In case of confusion, these directions can be found under "Throttle Profile Selection" on the "HV Pro Custom Programming & Proper Gear Selection Sheet" in the orange Novak motor box.

...


Troubleshooting

If the car doesn’t behave correctly (e.g. turns off suddenly, wireless goes down, etc), try shutting it down and then disconnecting the power (make sure it’s not running on battery). Then wait a bit to make sure everything is discharged (a minute should be plenty) before following the power-up procedure again.

...

  • Check all connections
  • Reconnect the Brainstem serial cable
  • Reconnect the Moto1.0 power cable
  • Reload the .tea files
  • Connect the Brainstem to the PC and try running the motors using Moto.exe
    • If they don’t spin, something’s probably misconfigured or fried…
      • Do the hardware reset and hope nothing’s actually fried. Good luck.
    • If they do spin, it’s just a software configuration error
      • Double check the .tea files and try running old ones
        • Note that each process on the Brainstem can only have so many variables. If you have too many variables, the .tea files will still compile but will not execute correctly. Test this as described in the .tea file constraints section to make sure the files you are loading will execute correctly.
      • If that doesn’t work, try the hardware reset.
  • If all else fails, do a hardware reset as described below… but keep this as a last resort.

...


Hardware Reset Procedure

(If configuring a brand new Brainstem, skip the resets and go to step 1. below.)

...

If all else fails, it may indicate that the Brainstem has gone bad (unlikely, but it is a possibility). See if a different Brainstem unit works in its place, and if so, the original unit has probably gone bad.


Some Notes About Batteries

...

As of this writing, cars 5 and 6 (as well as all the old cars) are using the blue batteries. According to Jeff Lovell, when these batteries are switched out in the new cars they should be replaced with the new 7.4V Thunder Power batteries. Since the Thunder Power batteries are only half the voltage, they should be installed in series to provide a total voltage of 14.8V (the old blue batteries are 14.8V but are installed in parallel). The old cars may continue to use the blue batteries.


Positioning system

Currently runs on ~11.6V, but must be between 7V and 16V. On the ceiling, all the rows are in parallel and all the units in a given row are in series. So there is a voltage drop down each row. 10-12V should be enough to maintain a reasonably strong signal at the end. Increasing the voltage also increases the strength of reflections, which can cause more interference. This is less of an issue with the foam pads on the ceiling, but still should not be ignored.


Power Up Procedure

The positioning system can be damaged if the voltage being supplied to it drops below 7V for even a fraction of a second. To prevent this, we are using alligator clips to provide essentially instantaneous changes between 0V and the operating voltage. The procedure is as follows:

...

  1. Disconnect the alligator clips from the power supply
  2. Turn off the supply when done using it

...


Troubleshooting

If the positioning system doesn’t seem to be working well, try reconnecting power to it before you try anything else. If you hear a buzzing sound from any of the units, this should resolve that as well.

Check/set configuration of positioning units by connecting the laptop to the unit via a serial cable and running config232. Press “Find Devices” and wait until the ID shows up (may take 20 seconds or so), then “Read EEPROM”. Make necessary changes to the configuration and “Write EEPROM” to save. It’s a good idea to double check the changes by reading the EEPROM again before disconnecting. Configuration details are as follows (in config232), listed in the order CtlByte, txDelay, xpDelay, IdOvride:

Callers (cars): [ 3, see below, 0, 15 ]

Transponders (ceiling): [ 0, 255, see below, 33 ]
The txDelay and xpDelay should be changed for each car to prevent conflicts. Currently, we’re using values around 20 and 30 for txDelay. The xpDelay values range from 0 to 5 and are spaced to prevent collisions of transponder signals. The diagram at http://wiki.eecs.umich.edu/delvecchio/index.php?title=Image:XpDelay.PNG illustrates the setup.

To check ceiling units without removing them, connect the laptop to the open slot on the end unit. Find Devices should bring up all units in that row, and Read EEPROM will get all the configurations at once. The diagram at http://wiki.eecs.umich.edu/delvecchio/index.php?title=Image:DeviceID_11032008.jpg shows the grid positions corresponding to the device IDs, with IDs in red indicating devices that are prone to error. As of 2/5/09, 12180 is out of commission, while the red "X" in the back row is unit 11424.
Reconnecting power to the unit(s) may help if they don’t seem to be responding correctly after changing the configuration. I’ve mainly noticed this with the units on the cars after changing txDelay.


Wireless


Installing Drivers for the New D-link Routers

...

At this point you should be able to go to System->Administration->Network and activate the 'rausb' device. The wireless adapter should now be working. If it isn't working or it can't be enabled, try restarting the car.


Troubleshooting

If the cars are not connected to the wireless network (LNK LED not lit on the D-link USB wireless adapter), perform the following steps in order.

...

Note: The network adapters can be started/stopped using /etc/rc.d/init.d/network
5. If there is some network delay, try using different channels; in fact, it has been demonstrated that some network delay was caused by wireless interference (crowded channels).


Electronics

Circuitry schematics, etc will go here.


Problems

List recurring problems that currently cannot be solved, along with possible hypotheses or fixes when discovered.


1. Network Failures (by Mads)

...

(Updated by Dan 7/31/09): The network seems to have stopped having these failures as of a few weeks ago (I don't know why), but it is experiencing periodic lag spikes rather frequently. These are often long enough to cause path following to fail. At one point I updated the firmware of the router and that seemed to improve things for the rest of that day...but when I came back the next day the lag was back. Re-updating the firmware had no effect.


2. SOLVED: Segmentation Fault (by Mads)

I have run car 3 for many weeks without any issues and now it is giving me "Segmentation Fault", "Brainstem Failures", or "ovrd_throttle=-1" upon program execution. I have checked the Brainstem and reloaded the TEA files with no errors. Car 2 has the same TEA file as Car 3 and Car 2 does not have any problems. 2/5/09 Solution: It turned out that a wire had come loose, causing shorts and confusion on board the computer. I re-stripped the wires, put them back in the conductor, and crimped the wires together.


3. Solved: Car 2 compilation error (by Jeff D.)

...

To fix this, use the date command in the SSH window on the car to change the date. The car does, in fact, have a different date than the desktop. If you don't know how to change this you can search "linux changing date" online and it will tell you the syntax. I have not gotten these errors since then.


4. SOLVED: Car 3's motor not working (by Jeff D.)

The fuse on the archbridge blew on Car 3, making the motor not receive a voltage. I fixed this by replacing the fuse.


5. Vehicle-to-Computer Delay (by Mads and Jeff D.)

Using the Ca2.c code is now causing a delay in the system in excess of 1 second. This is undesirable for human control. A possible short-term fix is commenting out create_string(*) as used in ca2.c. However, this is NOT a permanent fix, as create_string from vehicle_communication.c is essential for our final demos: it is used for setting up stored_date in ca2.c and for allowing the vehicles to receive position and speed information about each other.


6. SOLVED: UltraSteer wouldn't output speed data (by Iman)

I was using UltraSteer on one of the cars and I wanted it to output the speed. I used the same function that outputs speed in PathPlan, "speed = get(SPEED, stemRef);", but it didn't work in UltraSteer and would give me an error (SPEED variable not defined) every time I tried to compile it. It turned out that the setget.h file in the PathPlan source folder had the variable SPEED, but the setget.h file in the UltraSteer source folder had the variable THETA (THETA was in the same place in UltraSteer as SPEED in PathPlan). So in UltraSteer use "speed = get(THETA, stemRef);" in the program to output speed data.


7. SOLVED: New wireless card stopped working (by Matt)

One of the new (used on cars 4, 5, 6) wireless cards stopped working. I plugged it into my computer, downloaded the drivers, and it reset itself automatically. It now works again on the car; apparently there was some reset that the car couldn't perform.


8. SOLVED: Car 6 IP address was switching (by Matt)

...

  • touch /var/lock/subsys/local
  • rmmod rt2570
  • modprobe rt2570
  • ifconfig rausb0 up
  • iwlist rausb0 scan
  • iwconfig rausb0 essid eecs4315
  • iwconfig rausb0 rate Auto
  • iwconfig rausb0 mode Managed
  • iwconfig rausb0 ap any
  • iwconfig rausb0 enc FAB9000FAB
  • ifconfig rausb0 192.168.1.106 (THIS LINE MUST BE CHANGED TO THE CORRECT IP ADDRESS FOR WHATEVER CAR YOU ARE WORKING ON)
  • ifconfig rausb0 netmask 255.255.255.0
  • route add default gw 192.168.1.1

...


9. SOLVED: Car 5 wheels spin when turned on (by Iman)

When I turned car 5 on the wheels would spin for a quick second then the car would just stop and the small green light on front of the MiniITX would blink off and on. It turns out that the moto settings just needed to be reset, refer to section 2.1.2 New car motor settings (cars 4,5,6).


10. SOLVED: New Servos wouldn't work when connected to brainstem

The new servos (GWS S03N STD) have to have the wires switched before they will work with the Brainstem. Check car 1, 4, 5, or 6 to see in what order the wires must be attached to the Brainstem. DO NOT use car 2 or 3 as a reference because they have completely different servos.


11. SOLVED: Estimator not working properly

...

Solution: The estimator may need to be repositioned. The glue is brittle and the estimator can easily be snapped off the car. After doing this, make sure the estimator and the wheel are clean and dust-free. Now, while using Moto.exe to slowly spin the wheels of the car, move the estimator around near where it was before until you find the spot where Moto.exe picks up a consistent velocity measurement. Use some epoxy glue to glue it into this place and either hold it or clamp it until the glue sets sufficiently (all while using Moto.exe to make sure that you've still got it in the right spot).


12. SOLVED: Car 1 slowly comes to a stop

...

Solution: The wire going to the analog 4 pin on the brainstem (where the brainstem gets its voltage reading from) was loose in the blue clamp that connects the wire to the power bus. Note that this is the clamp that connects the wire to the bus, not the clamp after that between the bus and the brainstem. I just took off the clamp and re-clamped the wires.


Status of cars

Car 1: Working.

...

Car 5: Seems to work well, but cannot change the period in moto.exe. If you figure out a solution, please post.

Car 6: Working


Overhead Vision System


Using the Overhead Vision System

...

If you close and then re-open CPS.exe on one computer while leaving it running on the other, you will need to hit reset on the one that was left running for them both to start working properly. Avoid running two copies of CPS.exe at once on the same computer--this tends to make the cameras angry (see Troubleshooting #4).


A Note About Orientation and Filters

...

I've done quite a bit of experimentation with switching the estimator's calculated heading for the heading given by the camera system, and all I've been able to conclude is that it usually makes little difference which one you use. Graphical comparison shows that the data from both are fairly similar. Sometimes one seems to work slightly better than the other (the car will follow the path with slightly less weaving in and out), but which one works better is not consistent from program to program. At the moment (7/30/09) most of the programs on the cars are still using the heading from the estimator, and that is not really a problem--however, if a car/program is having trouble following a path, or if you are sick of having to wait for init_heading to run, then you can go to ca2.c and switch the 'z_zeta[est_count]' argument in run_controller to 'angle' to use the camera system's heading. You can also choose to use the positioning values straight from the camera instead of waiting until they have been filtered by run_estimator. To do so, switch 'xx[est_count]' to 'xc' and 'yy[est_count]' to 'yc' in the arguments to run_controller. In the tests I ran, doing so had no apparent effect on path following. If you switch over all three of these arguments, you can safely comment out init_heading. If you do so, make sure that you set zeta_set to 1 somewhere, or the program will get stuck trying to initialize.


Troubleshooting

This is a list of possible problems with the vision system and/or camera computers, along with their (usually very simple) solutions.

1. Problem: The position data the car is receiving seems to be wrong / the car isn't following the path. Solution: Check to see if the system needs a reset. It's easy to forget, but it needs to be done whenever the car is picked up and moved or the camera's view of it is blocked. If that doesn't fix the problem, make sure you are using the correct pattern; the number on the pattern should match the car number. Another thing that can prevent path following from working is if the car's IP address is switching, as in Problem #8.

2. Problem: The copies of CPS.exe on the three computers are not talking to each other. Solution: First check that COMP_NUM in CPS.h is correct on all computers. If it is, then check the IP addresses of the computers. They have been known to change for no observable reason, and this will keep CPS from communicating properly. You will need to (on all three camera computers) go to CPS.h, update the values of COMP0_IP_ADDRESS, COMP1_IP_ADDRESS, and/or COMP2_IP_ADDRESS, and recompile the program (see Problem #3).

3. Problem: After changing something in the header file, an attempt to compile gives a bunch of errors in the code. Solution: In Dev C++, a normal compilation only rebuilds the files you have just edited, which can be a problem if the change to your header file affects things in other files. You will need to do a full compilation by hitting Ctrl+F11.

4. Problem: When trying to start the program, you get a message that there was an error setting up the sockets. This will most likely happen after accidentally trying to run two camera programs at once on the same computer. Solution: If you look in the task manager, it will probably show that a copy of CPS.exe is running even though no window for it is open. This 'ghost' copy of CPS.exe can't seem to be stopped, and it prevents the program from being started normally. This must be fixed by restarting the computer.

5. Problem: The tracking/pattern location isn't working very well. Solution: Make sure that all the lights in the lab are on and the blinds are closed--these were the conditions when the pictures were taken, so having half of the lights off can keep the tracking from working properly. Also, make sure that the patterns on the cars are as flat as possible.

6. Problem: Tracking fails when a car crosses from one camera frame to another. Solution: This can happen if a camera is bumped or shifted at all. The problem can be fixed by re-calculating the mapping between frames for the border causing problems.


Hardware

The overhead vision system consists of six cameras mounted on the ceiling around the track. These cameras are connected via FireWire to three desktop PCs, which implement a tracking algorithm to determine the position of the cars. The positions are converted to a global coordinate system and transmitted to the cars via the lab network. Each camera runs on its own PCI card, because this allows us to run two cameras per computer without any loss in frame rate.

In the tracking program the computers are numbered 0-2, from right to left in terms of their positions on the desk.


Software

Rather than having a central computer that oversees all of the tracking, we simply run the tracking independently on each computer. Each computer knows at all times which cars it is responsible for tracking and sending data to. When a car is going to move outside of its frame of vision, that computer sends a message via the lab network to another camera computer, which then takes over responsibility for that car.

The tracking is performed by doing a pixel-by-pixel comparison of the images captured by the camera with images of the patterns taken beforehand. Only central strips of the patterns are compared, in hopes of minimizing the effects of camera distortion.


reset.exe

This is a small program that can be used to reset the vision system from another computer. It can be found in the camera_programs folder on either camera computer. When you run it from any Windows machine on the lab network, the camera system will reset as if you had hit 'r' on one of the vision system computers. The program will produce no output. If the IP addresses of the camera computers change, the program will have to be edited to account for this change.


The Header File

The header file CPS.h defines many important constants for the Camera Positioning System. Many of these are explained elsewhere, but some miscellaneous (yet important) ones will be clarified here. With the exception of COMP_NUM, all of these constants should be the same for the program on all computers. If you are having trouble compiling after changing something in CPS.h, see Troubleshooting #3.

...

CAR_IP_ADDRESS_# where '#' is a target number 1-6: These values set the IP addresses that will receive position data corresponding to each target number (written on the back of each target). As of this writing, the target numbers match the car numbers. If this ever needs to be changed, or if a car needs to be prevented from receiving data for some reason, this can be done by changing these values. However, make sure that no car will be receiving positioning data from two targets at once.
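
For orientation, here is a hypothetical excerpt of what these CPS.h entries might look like. The constant names come from this documentation, but the values (and whether they are #defines at all) are placeholders that must be checked against the real file:

    /* Hypothetical excerpt of CPS.h -- names from this documentation, values are placeholders. */
    #define COMP_NUM          0                  /* 0, 1, or 2: which camera computer this build runs on */
    #define COMP0_IP_ADDRESS  "192.168.1.200"    /* placeholder */
    #define COMP1_IP_ADDRESS  "192.168.1.201"    /* placeholder */
    #define COMP2_IP_ADDRESS  "192.168.1.202"    /* placeholder */
    #define CAR_IP_ADDRESS_1  "192.168.1.101"    /* placeholder: car that receives target 1's data */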


Setting up a new computer to run/compile vision system code

...

If the computer you're on doesn't have Visual Studio on it (or maybe even if it does), then you're probably getting an error that sounds something like 'Failure to initialize', with error code 0xc01500002, when you try to run your compiled program. To fix this, download and install the Microsoft Visual C++ 2005 Redistributable Package (http://www.microsoft.com/downloads/details.aspx?familyid=32bc1bee-a3f9-4c13-9c99-220b62a191ee&displaylang=en). This installs necessary runtime components of some Visual C++ libraries. If you do this and still get the error, then the SP1 version (http://www.microsoft.com/downloads/details.aspx?FamilyID=200b2fd9-ae1a-4a14-984d-389c36f85647&displaylang=en) might work instead.

If you do all this and still get runtime errors, then try copying all the .dll files in C:\Program Files\OpenCV\bin straight into the System32 folder (C:\WINDOWS\system32). I didn't have to do this for the first computer I set this up on, but I did for the second, so this step may or may not be necessary. [Make sure you don't modify anything else in this folder.]

Now you should be able to compile and run code for the Overhead Vision System. As previously mentioned, I had to do some things for the second computer that I didn't have to do for the first. So, if you have completed all of the above steps and it still isn't working, then I suggest doing some internet research and just playing around with it until you get a setup that works.


Camera Calibration

Before any meaningful position data can be obtained from the cameras, they must be calibrated both intrinsically and extrinsically. There is a lot of software to do this, but the best documented and apparently most reliable is the Caltech Camera Calibration Toolbox for Matlab (http://www.vision.caltech.edu/bouguetj/calib_doc/). Also necessary for calibration is a large (about 2ft by 2ft) checkerboard pattern--there should be one in the lab somewhere. The square size on that checkerboard is 34.925mm on each side. You should use the middle 15 by 15 squares for calibration.

For background information about camera calibration and parameters, refer to the "Multiple View Geometry" textbook in the lab. Particularly of interest is the information on radial distortion on pg. 189-193.


Intrinsic

To do intrinsic calibration of a camera, a series of about 15-20 calibration images needs to be obtained from that camera. This can be done using the FlyCap.exe software that came with the cameras. Alternatively, you can use a program in the camera_programs folder called pictureTaker, which can take a series of images with a customizable time delay between each shot (this time delay can be changed in the program's header file). Each image should be primarily taken up by the checkerboard. It is important that the checkerboard be rotated and held at many different angles in order to get a good calibration. Examples can be found in the previous calib folders. After obtaining these images, follow the instructions in the first calibration example on the Caltech page to obtain the calibration parameters. The calibration program will put the parameters into a file called 'Calib_Results.m'. These new parameters can be used in the tracking program by renaming the file appropriately ('Calib_Results_CamX.m') and placing it in the program's directory on both camera computers.


Extrinsic

This extrinsic calibration only applies if all the tracked objects are of the same height, i.e., you have a common ground plane.

...

  1. First you should understand the global coordinate system in the lab. The origin of our coordinate system is at the South-West corner of the room. The units are in millimeters. The X-direction is to the East ('down' on the computer screens) and the Y-direction is to the North ('right' on the computer screens).
  2. Place the checkerboard on the ground, at the same height where your tracking patterns will be when mounted on the cars (I prop it on some boxes to get it to the correct height). It's best to place the checkerboard fairly close to the center of the camera frame, however this is pretty flexible as long as you know its distance to the origin. Make sure the sides of the checkerboard are parallel to the x-y axis of your coordinate system.
  3. Choose one corner of the checkerboard to be a reference point and measure its (x,y) position in millimeters using the lab's tape measure. You will need this measurement later.
  4. After you figure out the position of your checkerboard and the orientation is correct, you can take a picture of it using FlyCap.exe or the pictureTaker program. You can find examples of extrinsic calibration images in the calib 7-1-10\extr folder.
  5. Run MATLAB again and read in the intrinsic parameters for the camera you are calibrating using the calibration toolbox's load function (these parameters are in the Calib_Results file produced by the intrinsic calibration). Then click on Comp. Extrinsic in the Camera calibration tool, and follow the instructions at http://www.vision.caltech.edu/bouguetj/calib_doc/htmls/example.html. When clicking the positions of the corners, the one you click first should be the one whose position you measured.
  6. Copy the parameters that the calib_gui program gives you to the appropriate Extrinsic_Calib_Data_X.txt file in the CPS\calib_data folder. You will also need to enter the position of the checkerboard pattern that you measured earlier.
  7. You can check that the calibration is accurate by placing some objects on the floor at known positions and checking their positions from the camera system in RECORD_OBJECT_DATA mode, which outputs position data.

...

Normalize() is nothing but the C++ version of the MATLAB script normalize.m, and convert() is the rest of the math in pixel2position.m. Note you may need to add/subtract the position of your checkerboard in the convert() function.
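
For readers who want the math behind normalize.m/pixel2position.m, here is a self-contained sketch of the standard pinhole back-projection onto a ground plane (Z = 0 in world coordinates). It is not a transcription of the lab's convert() function -- lens distortion, the checkerboard offset, and the exact conventions of the Caltech toolbox output are omitted -- but the geometry is the same:

    #include <array>
    #include <cstdio>

    using Vec3 = std::array<double, 3>;
    using Mat3 = std::array<Vec3, 3>;   // row-major rotation R (world -> camera), X_cam = R*X_world + t

    // Multiply the transpose of R by v (rotate from camera axes back to world axes).
    static Vec3 mulT(const Mat3 &R, const Vec3 &v)
    {
        return { R[0][0]*v[0] + R[1][0]*v[1] + R[2][0]*v[2],
                 R[0][1]*v[0] + R[1][1]*v[1] + R[2][1]*v[2],
                 R[0][2]*v[0] + R[1][2]*v[1] + R[2][2]*v[2] };
    }

    // Convert a pixel (u, v) to a world point on the Z = 0 plane, given intrinsics
    // (fx, fy, cx, cy) and extrinsics (R, t). Distortion is ignored here.
    static Vec3 pixelToGround(double u, double v,
                              double fx, double fy, double cx, double cy,
                              const Mat3 &R, const Vec3 &t)
    {
        // "Normalize": undo the intrinsics to get a ray direction in camera coordinates.
        Vec3 d = { (u - cx) / fx, (v - cy) / fy, 1.0 };

        // Express the ray direction and translation in world axes.
        Vec3 dW = mulT(R, d);   // R^T * d
        Vec3 tW = mulT(R, t);   // R^T * t

        // X_world = s*dW - tW; choose s so that the world Z component is zero.
        double s = tW[2] / dW[2];
        return { s * dW[0] - tW[0], s * dW[1] - tW[1], 0.0 };
    }

    int main()
    {
        Mat3 R = {{ {1, 0, 0}, {0, 1, 0}, {0, 0, 1} }};   // placeholder extrinsics
        Vec3 t = { 0.0, 0.0, 2500.0 };                    // placeholder translation (mm)
        Vec3 p = pixelToGround(320, 240, 800, 800, 320, 240, R, t);
        std::printf("world position: (%.1f, %.1f) mm\n", p[0], p[1]);
        return 0;
    }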


Camera Positioning System (CPS)

...


Pattern Recognition

Before you do anything about searching or tracking, you need to generate the patterns that can be searched or tracked. What I have been using for the project are patterns that are "target" like. There are currently six different patterns. The backgrounds are black or white, and the targets are black and white rings. Below is the list of all the patterns.

  1. B wwwwbb
  2. W bbwwwb
  3. B wwbbbb
  4. B bbwwbb
  5. B bwbwww
  6. W wbbbww

...

The capital letter for each pattern indicates its background color. Generally the black background is slightly better than the white one, since the floor is so bright that in some places it looks very much like the white paper [Note: this may change once the black flooring mats have been placed]. The six letters after the capital letter are the sequence of ring colors, from outer to inner. There is a template of the target ring (as well as pre-made images of the 6 patterns listed above) in the CPS folder. You can edit the template and then print it out to make new patterns. The patterns above were designed to minimize the intersymbol interference as much as possible. So far these patterns have been working successfully.


Why so many sections

The total area of the lab covered by the vision system is divided into 36 "sections". The divisions between these sections are shown by the horizontal and vertical black lines that can be seen in the camera windows when running CPS. Each pattern has a calibration image for every section. When a car moves into a particular section, the image from that section is used for the tracking.

...

http://wiki.eecs.umich.edu/delvecchio/index.php?title=Image:New_section_layout_small.png


Record picture

If a pattern stops working in a certain section or if one of the patterns is changed, you may need to re-take the picture of that pattern in the affected sections. The positioning system code contains a function called recordObjectData which allows you to take pictures of the patterns for the computer to use in its searches. To take the pictures:

...

Note that running the system in RECORD_OBJECT_DATA mode is also helpful when checking the accuracy of an extrinsic calibration, as the program will output the position of the upper-left corner of the 'box', in both pixel coordinates and the global coordinate system in millimeters.


Algorithms

Once you have all the pictures you need, you can start searching and tracking. There is one very important parameter that is used throughout the program, BOX_SIZE. BOX_SIZE is the size (in pixels) of the square box that contains your tracking pattern. In the previous version of our algorithm, we were trying to compare everything in the box to the picture we took before and then find the best match. The problem is that, even with so many sections, we still had trouble finding a good match because of the distortion. The updated algorithm only compares the middle stripe of the box, since the middle part suffers the smallest effects of distortion. Since we are now only comparing the middle one fifth of the entire box, this reduces the complexity and increases the performance.
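
A minimal, self-contained sketch of the middle-stripe comparison: a sum of absolute differences taken over only the central rows of the candidate box. The real program works on IplImage data and the recorded pattern files; plain grayscale arrays are used here so the idea stands alone, and the name stripeDiff is mine, not the program's:

    #include <cstdlib>

    // Compare the middle stripe (central 1/5 of the rows) of a box whose top-left
    // corner is (x0, y0) in a row-major grayscale image against a stored
    // boxSize-by-boxSize template. Smaller return values mean better matches.
    static long stripeDiff(const unsigned char *img, int stride,
                           int x0, int y0,
                           const unsigned char *tmpl, int boxSize)
    {
        const int stripe = boxSize / 5;                 // height of the middle stripe
        const int yStart = y0 + (boxSize - stripe) / 2; // first image row of the stripe
        const int tStart = (boxSize - stripe) / 2;      // matching row in the template

        long diff = 0;
        for (int r = 0; r < stripe; ++r)
            for (int c = 0; c < boxSize; ++c)
                diff += std::abs(int(img[(yStart + r) * stride + x0 + c]) -
                                 int(tmpl[(tStart + r) * boxSize + c]));
        return diff;
    }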


Searching

...

The search algorithm is implemented in the function void findObject(IplImage *img[], int objData[][NUM_CARS][NUM_FILES], Position loc[], socketData &sendSocket, socketData &receiveSocket). You need to pass in the images, the object data that you recorded, the locations of all your objects, and the socket data used for communication between the computers. In this function, initial determines where the search starts, and r_x and r_y determine how far it will look. Once those parameters are specified, it starts searching for the objects. The algorithm searches the whole region defined by initial and (r_x, r_y): it slides the box (of size BOX_SIZE) pixel by pixel, compares the image in the box with those taken previously, and finds the best match. Again, it only compares the middle strip of the box. After the search for all the objects, two arrays will contain the locations of the patterns and the differences between the best matches and the pictures taken earlier. These are passed to the searchExchange function, which communicates with the other camera computers to determine the best overall match.


Exchange of search data

The searchExchange function is used to exchange search data between the computers and find out which computer has the best match. This function behaves differently depending on whether COMP_NUM is 0 or not.

...

For more detail, see the code of the searchExchange function in the search.cpp file.


Tracking

Tracking is very similar to searching. It's a local area search. The size of the local area is defined by the global variable ITERS. The value of ITERS should be determined by how fast your computer is and how fast the object you are trying to track moves. I wrote a MATLAB script to calculate the value for ITERS. The MATLAB function is defined as follows:

    function iters = iter_calculator(pixel_size, max_speed, fps, delay)
    overall_fps = 1000/(1000/fps + delay);
    speed_per_frame = max_speed / overall_fps;
    iters = speed_per_frame / pixel_size;
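
As a hedged numerical illustration (the values here are assumptions, not measured lab numbers): with pixel_size = 3 mm per pixel, max_speed = 1000 mm/s, fps = 60, and delay = 10 ms, the script gives overall_fps = 1000/(1000/60 + 10), about 37.5; speed_per_frame = 1000/37.5, about 26.7 mm of travel per processed frame; and iters = 26.7/3, about 9, so ITERS would be set to at least 9 pixels.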

...

After you specify ITERS, the program will do a search in the same way as the searching function, but the difference is that it will only search within a box of side length 2*ITERS+BOX_SIZE centered at the object's previous location. There needs to be a balance between ITERS and the frame rate. If you increase ITERS, you will lower your frame rate, since you are doing more calculation for each frame. If your frame rate is low, then you need to increase ITERS, because otherwise you may not be able to track the object. So the overall goal is to maximize the product of frame rate and ITERS. This can be done experimentally.


Orientation

At the end of the tracking function, we run the search for the orientation. After you get the symmetric target patterns, you will need to put a "dot" on each of them somewhere outside the outer circle. The algorithm searches in a bigger box, centered at the center of the box that contains the symmetric target, and finds the dot. The location of this dot is then used to determine the orientation of the pattern.
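
A minimal sketch of turning the dot location into a heading, assuming the convention is simply the angle of the dot relative to the pattern center; the real code's sign, offset, and units may differ:

    #include <cmath>

    // Heading of the pattern, in radians, from the orientation dot's position
    // relative to the pattern's center. Axis and sign conventions are assumptions.
    static double patternHeading(double dotX, double dotY,
                                 double centerX, double centerY)
    {
        return std::atan2(dotY - centerY, dotX - centerX);
    }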

...

Note: At this point (8/27/2010), none of the demos (that I am aware of) actually use the orientation data given by the camera system.


What to send to the vehicle

...

Then we will send the message, in the form of a character string, using sockets. The string will contain the position of the object in millimeters and also its heading information.
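
A sketch of building such a message. The exact field layout CPS uses is not spelled out in this section, so the format below (car number, position in millimeters, heading) is illustrative only; the resulting string would then be handed to the existing socket-send code:

    #include <cstdio>

    // Format an illustrative position/heading message for one car.
    // x_mm and y_mm are world coordinates in millimeters; heading is in radians.
    static int formatPositionMessage(char *buf, int bufLen, int car,
                                     double x_mm, double y_mm, double heading)
    {
        return std::snprintf(buf, bufLen, "%d %.1f %.1f %.4f", car, x_mm, y_mm, heading);
    }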


Switching Mechanism

Each camera has some overlap with the other cameras, so that when an object is leaving one camera's view it can easily be switched to another camera that has a better view of it. There are two cases: one is a switch within one computer, i.e., a switch between cameras 0 and 1, or between cameras 4 and 5. The other case is a switch between computers. The latter case is slightly more complicated, since we need to send a message from one computer to the other, which may involve transmission delay. There are a total of 7 borders between the cameras, and there is a mapping in both directions for each border. We name each of these transition mappings in the following way:

...

The data for these mappings can be found in the CPS/calib_data/ folder. There are 14 different files (1 for each transition), named with the format compX_transitionY.txt.


Mapping

The mapping between two cameras will have to be re-calculated whenever the cameras are moved or bumped. If the tracking messes up when a car travels from one camera frame to another, then the camera transition mapping is usually at fault and will have to be re-measured. We draw grids on the images shown on the screen and the lines at the edges between cameras indicate at which point we will switch the pattern from one camera to another. The following is an example of how to calculate this mapping:

...

6. Be sure to test the new mapping by driving one of the cars through several points along the border in question.


When to switch

In the program, the parameters that define when to switch are CAM0_VERT, CAM0_HOR, CAM1_VERT, CAM1_HOR, and CAM1_LOWER_HOR. These parameters are set in CPS.h, and they are different for each of the three camera computers. They are chosen so that if the image is very distorted on one camera, the object will be switched to the other one, where the distortion is less serious. So, when the object passes these lines it will be switched to the other camera. But oscillation between the two cameras might occur if you don't choose the parameters carefully. Also note that even if you calculate the mapping for one direction, you will still need to do the inverse mapping separately. Unfortunately, you cannot simply reverse the linear functions to get the inverse map; you will need to follow the procedure described above again for each switch.


Switch between computers

At this point, you have everything that is needed for switching between the two cameras that are on the same computer. Switching between computers is slightly different, because you will need to send a message to the other computer. We use sockets to send the message. The message is a character string with the following format: " Switch object_number camera_number projected_x_coordinate projected_y_coordinate " (note the whitespace at the beginning and end of the string). In this string, the first number will always be Switch, which is an enumerator and has the value of 1. Then comes the object number. The camera number is a negative number; you need to increment it by one and then take the inverse to get the new camera number. The last two components are the projected pixel positions in the new camera. Each coordinate has two parts, the mapping part and the velocity projection part. The mapping part is as described above, and the projection part is based on the object's current velocity in the previous camera. Since there will be transmission delay and processing delay, the projection takes that delay into account when projecting the new position. Note that if there are multiple objects switching between computers, one message will be sent for each of them. So we may have to send up to six messages in a single frame, but this won't cause any problems, since sending a message through a socket is very fast.
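
A sketch of parsing the switch message described above. The camera-number decoding follows my reading of the text ("increment it by one and then take the inverse", i.e. -1 encodes camera 0 and -2 encodes camera 1); verify against the actual code before relying on it:

    #include <cstdio>

    // Parse " Switch object_number camera_number projected_x projected_y ".
    // The Switch enumerator has the value 1. Returns true on success.
    static bool parseSwitchMessage(const char *msg, int &object, int &newCamera,
                                   double &projX, double &projY)
    {
        int type = 0, camField = 0;
        if (std::sscanf(msg, " %d %d %d %lf %lf",
                        &type, &object, &camField, &projX, &projY) != 5)
            return false;
        if (type != 1)                 // 1 == Switch
            return false;
        newCamera = -(camField + 1);   // -1 -> camera 0, -2 -> camera 1 (assumed decoding)
        return true;
    }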


Other important functions in the program


void getCalib(calibData data[ ])

This is the function you should call at the beginning of the program. data is an array of calibData type. This function reads in all the intrinsic and extrinsic calibration data for all four cameras. However, before you run this you need to change the directory, since all of that data is stored in the CPS/calib_data folder. This change can be done by simply calling the changeDirectory() function.


int imgDiff(IplImage *img, int data[][NUM_CARS][NUM_FILES], int box, Position &loc, int k, int section)

This function is used in the searching and tracking algorithm. You need to pass in the image (in OpenCV format), the picture data that was recorded and read in, the size of the box that contains the object (usually BOX_SIZE), the object's position structure, the object index k, and which section it's in. The function will return the difference between the image in the box at loc and the recorded image.


setup(IplImage *img[ ], FlyCaptureContext contexts[ ])

This function will set up the connection between FlyCapture images and OpenCV images for the first time. It will also set up the windows that are used for displaying images on your screen. Since we are using two cameras on each computer, we also display two windows on each computer; that's why we pass an array of OpenCV images and FlyCapture images to the function. The camera sends images to the computer at a constant 60 frames per second. This frame rate is set in the setup function but can also be modified manually with the FlyCapture software, which can be accessed from "C:\Program Files\Point Grey Research\PGR FlyCapture\bin\FlyCap.exe". However, due to limited computational power and the complexity of the algorithm, the program usually iterates at a slower speed. The frame rate of the program (usually somewhere between 30 and 58 FPS) varies with the value of ITERS and the number of objects currently being tracked. Every time the program finishes processing the previous image, it grabs a new image from the camera; the computer simply drops the images that are not grabbed by the CPS program.


void dotSearch(IplImage *img, int objData[][NUM_CARS][NUM_FILES], Position &loc, int k, int section, IplImage *img2)

It's best to understand what this function does graphically. When you run the program, you will see several boxes drawn around each object being tracked. The inner box tells you where the object is, while the outer box marks the boundary of the region in which we search for the dot/stripe that's around the pattern. The tiny box is the location of the orientation dot/stripe. This function searches in units of a small box: it slides this small window through all possible positions inside the outer box but outside the inner box. It's essentially searching along the four sides of the big box, and that's why you see four for-loops there. Because the dots have different colors for different patterns, we need to distinguish them: for patterns 1, 3, 4, and 5 the dot is white, while for patterns 2 and 6 the dot is black. So we add a bias when we search for the dot. The function used to do the searching is int boxAvg(IplImage *img, int x, int y, int box), which does nothing but tell you the average pixel brightness in the box located at (x, y). So for different dot colors, we have different bias values. After searching the entire area, dotSearch(...) will draw the dot at the position that looks most like our orientation dot/stripe.
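
A minimal stand-alone version of what boxAvg() is described as doing -- the average pixel brightness inside a box -- written over a plain row-major grayscale array rather than an IplImage, so the signature here is not the program's:

    // Average brightness of the box whose top-left corner is (x, y).
    // img is row-major grayscale with the given stride; box is the side length in pixels.
    static int boxAvg(const unsigned char *img, int stride, int x, int y, int box)
    {
        long sum = 0;
        for (int r = 0; r < box; ++r)
            for (int c = 0; c < box; ++c)
                sum += img[(y + r) * stride + (x + c)];
        return int(sum / (long(box) * box));
    }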


Where is all the stuff

On the desktop of the camera computers, the folder CPS contains all the stuff for the program:

  • calib 7-1-10 : Contains data from the most recent intrinsic and extrinsic camera calibrations.
  • CPS : The folder for the main tracking program.
    • CPS/calib_data : contains the text files that we recorded the pattern images in, as well as the data from the intrinsic and extrinsic calibrations.
  • matlab scripts : contains several useful matlab scripts for the camera system.
  • old calib : Contains intrinsic and extrinsic calibration data from older calibrations.
  • patterns&template : contains the patterns that can be printed for tracking and the pattern template.
  • pictureTaker : Contains a program for taking pictures with the camera system. Useful for doing camera calibration.
  • reset: contains the program to reset the camera system remotely.
  • test programs: contains various programs useful for testing the functionality of the cameras and the network.

...


Everything you need to know about filtering

...


Linear filters

Linear filtering is nothing but multiplication and addition. In matlab, type help filter to see the filtering algorithm: once you have the input data, you multiply the samples by the corresponding filter coefficients and add the products up to get the output. Where do the coefficients come from? In matlab, type help firpm. The help text is long, but you only need to pay attention to the example at the bottom. Since our data does not change very fast, we can use a lowpass filter to reject noise at high frequencies. % Example of a length 31 lowpass filter: h = firpm(30,[0 .1 .2 .5]*2,[1 1 0 0]);
The first input to this function is the filter length minus 1. The longer the filter, the less your signal is distorted, but the longer the delay. In our project the generally acceptable delay is ~100 ms. The delay of a lowpass filter can be roughly calculated as the filter length divided by two, times your sample duration. For the filter implemented on the computer, a filter of length 10 fits within that budget. For the filter on the car, the sample duration is already 100 ms, so it is impossible to get any useful filter with a delay of less than 100 ms there. The second and third inputs to the matlab function correspond to each other: the example above says the gain should be 1 over the band [0 .1]*2 and 0 over [.2 .5]*2 (normalized frequencies), and the transition band in between is left unconstrained. If you know the specific frequency content of your true data, you can shrink the interval that has gain 1; otherwise just keep it as is. Note that if the gain-1 interval is very small while the delay is kept short, the filtered signal will usually be distorted. Once you have decided what linear filter you want, just plug the coefficients from the firpm function into your filter. It is just one line of many multiplication-and-addition operations in C++.
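As a concrete version of that one line, here is a minimal FIR filter sketch; the filter length and the placeholder coefficient array are assumptions, and the real coefficients should come from firpm.

// Minimal causal FIR filter sketch: output = sum of h[i] * x[n-i].
// FILT_LEN and h[] are placeholders -- paste in the coefficients returned by firpm.
#define FILT_LEN 10

double fir_filter(const double h[FILT_LEN], double history[FILT_LEN], double newSample)
{
    // Shift the history and insert the newest sample at the front.
    for (int i = FILT_LEN - 1; i > 0; i--)
        history[i] = history[i - 1];
    history[0] = newSample;

    // The "one line" of multiplication-and-addition operations.
    double y = 0.0;
    for (int i = 0; i < FILT_LEN; i++)
        y += h[i] * history[i];
    return y;
}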


Nonlinear filters

The only nonlinear filter I have been using is the median filter. A median filter applies a window of size N (N >= 3) to your input data, picks the median value in the current window as the output, and slides the window forward by one sample, so the delay of the median filter is N-1. It can eliminate salt-and-pepper noise in your data. We apply a median filter to the velocity calculation on the computer and it works fine: in the program, we sort the values in the velocity history plus the newly calculated one, find the median of those, and set it as the current value. Note that every value in the history is the newly calculated one, not the filtered one.
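A minimal sketch of that median step, assuming a window size of 5 and a history buffer that holds the raw (unfiltered) velocities:

// Median filter sketch: copy the raw velocity history plus the newest value,
// sort the copy, and return the middle element. MED_WIN is an assumed window size.
#include <algorithm>

#define MED_WIN 5   // assumed window size (N >= 3, odd)

double median_filter(const double rawHistory[MED_WIN - 1], double newValue)
{
    double window[MED_WIN];
    for (int i = 0; i < MED_WIN - 1; i++)
        window[i] = rawHistory[i];        // raw values, not previously filtered ones
    window[MED_WIN - 1] = newValue;

    std::sort(window, window + MED_WIN);  // sort the copy, not the history itself
    return window[MED_WIN / 2];           // middle element = median
}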

Methods

Finding Motor Maps

The following links are the matlab files I used to come up with the motor map for car 4:
car 4 motor map data (https://mfile.umich.edu/download/?path=/afs/umich.edu/user/a/a/aaboutal/Public/car4motormapdata.m)
car 4 motor map data analysis (https://mfile.umich.edu/download/?path=/afs/umich.edu/user/a/a/aaboutal/Public/car4motormapdataanalysis.m)
(These same files can be used for cars 5 and 6; you just need to change the variables from "car4_PWM#_t#" to "car5_PWM#_t#" or "car6_PWM#_t#" in both the data and data analysis files.)

...

  1. Plot speed v time data in matlab
    • plot((1:length(car5_PWM10_t1)),car5_PWM10_t1(1:length(car5_PWM10_t1)));
    • -- car5_PWM10_t1 is a variable that contains the speed data for the first trial at a PWM of 10 for car 5.
  2. Fit a polynomial to the data
    • data10_t1=car5_PWM10_t1(1:t);
    • time10_t1=1:(length(car5_PWM10_t1(1:t)));
    • -- t is a number that indicates the time at which the speed stopped increasing
    • [p,s]=polyfit(time10_t1',data10_t1,4);
    • -- the number 4 in the line above is the degree of the polynomial (4th degree)
    • -- A lower degree polynomial may be used if a 4th degree doesn't fit the data well
    • y10_t1=polyval(p,time10_t1); -- evaluate the fitted polynomial at the sample times
    • plot(time10_t1,y10_t1); -- this line plots the polynomial that fits the data
    • -- Plot the polynomial and the data on the same graph to confirm that it is a good fit
  3. Calculate and plot torque
    • plot(y10_t1(2:length(y10_t1)),diff(y10_t1)*w*.033/10); -- w is the weight of the vehicle in kg
    • [p10_t1,s]=polyfit((y10_t1(2:length(y10_t1))),diff(y10_t1)*w*.033/10,2);
    • y=((y10_t1(2:length(y10_t1))).^2).*p10_t1(1)+(y10_t1(2:length(y10_t1))).*p10_t1(2)+p10_t1(3);
    • plot(y10_t1(2:length(y10_t1)),y,'r'); -- this is the torque curve
  4. Average all the torque curves
    • x10=[1:m]; -- m is the maximum speed that the vehicle reaches with a PWM of 10 (this will be different for different PWM values)
    • p10 = (p10_t1 + p10_t2 + p10_t3 + p10_t4 + p10_t5)/5;
    • y10 = (x10.^2).*p10(1)+x10.*p10(2)+p10(3);
    • plot(x10,y10,'k');
    • -- the 10 after x, y, and p represent the data for a PWM of 10. Do this for all the other PWM values (i.e. for PWM of 20 use x20,y20,p20)
  5. Come up with the PWM equation (motor map)
    • T10 = [x10; y10; ones(1,length(y10))]';
    • PWM10 = ones(1,length(y10))'*10;
    • PWM = [PWM10; PWM20; PWM30; PWM40; PWM50; PWM60; PWM70; PWM80; PWM90; PWM10];
    • T = [T10; T20; T30; T40; T50; T60; T70; T80; T90; T10];
    • A = inv(T' * T)*(T' * PWM); -- the three numbers in this matrix are the constants a, b, and c in the equation: PWM=av+bt+c
    • -- again the 10 after T and PWM represent the data for a PWM of 10. Do this for all the other PWM values.

Note: be sure of the units in all the programs and calculations.


Finding dynamic parameters ("a" and "b" parameters)

...

note: "a" should be a positive number and "b" should be a negative numberunmigrated-wiki-markup


Finding Parameters for maintain_velocity_PWM()

...

4. If you want to check these parameters graphically (recommended), this can be done in the following way:
Xeval = linspace(0, 2500, 2501);
Yeval = polyval(p, Xeval);
plot(Xdata, Ydata, '.')
hold on
plot(Xeval, Yeval, 'r')
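To see how the fitted coefficients might be applied on the car side, here is a minimal sketch that evaluates the polyfit coefficients p at a target speed to produce a PWM. The function name speed_to_pwm and the idea of evaluating the full polynomial on the car are assumptions, not the lab's actual code.

// Sketch only (hypothetical helper): evaluate the polyfit coefficients p
// (highest order first, as MATLAB returns them) at a target speed to get a PWM.
int speed_to_pwm(const double p[], int order, double target_speed)
{
    double pwm = 0.0;
    for (int i = 0; i <= order; i++)   // Horner's method over order+1 coefficients
        pwm = pwm * target_speed + p[i];
    return (int)(pwm + 0.5);           // round to the nearest integer PWM (assumes pwm >= 0)
}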


Car Dynamics

Updated summer 2009


Car Motor Maps

  • v = speed, t = torque
  • Car 1: PWM = .58v + 3000t + 93
  • Car 2: PWM = .63v + 3000t + 102
  • Car 3: PWM = .62v + 3000t + 107
  • Car 4: PWM = .45v + 444t - 9
  • Car 5: PWM = .39v + 399t - 7.4
  • Car 6: PWM = .42v + 450t - 6.5
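To show how one of these maps is applied in code, here is a minimal sketch using the car 4 coefficients from the table above; the helper name motor_map_pwm is hypothetical and not part of the lab's code.

// Sketch only: evaluate a motor map PWM = a*v + b*t + c.
// Coefficients below are the car 4 values from the table above.
double motor_map_pwm(double v, double t)
{
    const double a = 0.45;    // speed coefficient
    const double b = 444.0;   // torque coefficient
    const double c = -9.0;    // constant offset
    return a * v + b * t + c;
}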

...


Car Dynamic Parameters

  • x = at + b
  • x = acceleration, t = torque
  • Car 1: N/A (4.00t - 30 seems to work decently in combination with feedback-Jeff)
  • Car 2: x = 3.77t - 55
  • Car 3: x = 5.07t - 122
  • Car 4: x = 6.43t - 133
  • Car 5: x = 6.81t - 114
  • Car 6: x = 6.95t - 109

Note: the unit for acceleration in these equations is mm/s^2.
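If you need the PWM that produces a desired acceleration, the dynamic parameters and the motor map can be combined. The sketch below shows the idea using car 4's numbers from the two tables above; it is an assumption about usage rather than the lab's actual controller, and the units (mm/s^2 for acceleration) must be kept consistent.

// Sketch only (assumed usage): invert x = a*t + b to get the torque needed for
// a desired acceleration, then run that torque through the motor map.
double pwm_for_accel(double v, double x_desired)
{
    const double a_dyn = 6.43,  b_dyn = -133.0;               // car 4: x = a*t + b
    const double a_map = 0.45,  b_map = 444.0, c_map = -9.0;  // car 4: PWM = a*v + b*t + c
    double t = (x_desired - b_dyn) / a_dyn;                   // torque for the desired acceleration
    return a_map * v + b_map * t + c_map;                     // motor map
}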

Demos

3 Car Autonomous

Description: 3 cars traveling on different printed lab circles imitating traffic-roundabout geometry. Utilizes collision avoidance and autonomous cruise control algorithms. Currently the demo is set up to run with car 1 on the outermost circle, car 2 on the smallest circle, and car 3 on the intermediate circle.

...

  1. Connect to cars via SSH
    • Open Secure File Transfer Client on Desktop. A window to connect to the cars will appear. On the toolbar click profiles and select car 1 (it may be labeled as MiniITX_101).
    • A window will appear as the computer tries to connect to the car. When connected, it will prompt for a password, which is hal9000. If this does not happen after about 5 seconds, there is a problem with the wireless: usually either the router is down, the car's wireless is down, or you are trying to connect to the wrong car. If the car's wireless isn't working (the D-Link flashes on and off when it is working and ready to connect), make sure the car is on; it takes several minutes for the D-Link to activate after the car has been turned on and the ITX powered up. If this doesn't do the trick, power cycle the car and try again.
  2. Open up a new file transfer and transfer files to cars
    • If the connection window is a SSH secure shell then go to window and select New File Transfer. A SSH file transfer window will appear.
    • In the right window, click "projects" (or some obvious variation of that name: Project, project, Projects) to access the correct file transfer location on the car. On the local side, the ca2 folder is located under Desktop/Project_2009/Final_demos/3_car_autonomous_roundabout: for car 1 open "large_car", for car 2 "small_car", and for car 3 "inter_car". Select "ca2" and click the Up arrow in the toolbar to transfer it. The transfer progress can be seen in the bottom window.
  3. Compile the code
    • In the SSH secure file transfer window go to "window" and click New Terminal. Type "cd /root/Desktop/brainstem/aProject/ca2" to enter the location to compile. Type "make clean; make new" to compile. Once this is complete type "cd /root/Desktop/brainstem/aDebug/aUnix/i686" to enter the location to run the code. Repeat the last steps on each car before going further.
  4. Place cars
    • All the cars travel counter-clockwise around the circle. Place Car 1 on the furthest end from the door of the big circle, car 2 on the closest end to the door of the smallest circle, and car 3 on the closest end to the door of the intermediate circle (which intersects with the large circle).
  5. Start camera system
    • Double-click on "Shortcut to CPS" on the desktops of both computers furthest from the door to activate the camera positioning system.
    • Note: all the main lights in the lab must be turned on for the system to run properly.
  6. Run the program
    • Type "./ca2 " followed by the car's path file name, then press enter to start running the demo. The filename for car 1 is "bigcircle_9points.txt", car2 is "60percent.txt", and car 3 is "inter_11points.txt.
    • To stop the cars, highlight its corresponding terminal and press spacebar.
    • The car's data is saved until the next run under /root/Desktop/brainstem/aDebug/aUnix/i686/ca_output. The relevant files are ca, acc, and terminal_file.


2 Car Semi-autonomous

Steps to Run Program:

  1. Set Up Autonomous Vehicle:
    • Turn on car 4.
    • Go to the project directory and then to the folder ca2_semiauto_demo
    • Compile using the command "make clean; make new"
    • Go to i686, the directory where the executable ca2 was compiled
  2. Set Up Human Controlled Vehicle:
    • Turn on one of the old cars, numbers 1-3.
    • Go to the project directory and then to the folder ManualDrive_demo
    • Compile using the command "make clean; make new"
    • Go to i686 or debug_dir depending on the car, the directory where the executable ManualDrive was compiled
  3. Set Up Human Interface:
    • Turn on the laptop.
    • Use the long ethernet cable to plug the laptop into our network - verify that it is connected to the internet through the router.
    • Plug the wheel/pedals into the laptop using the USB cable.
    • Check that the pedals are properly plugged into the back of the wheel - this connection comes loose very easily
    • Run the program called Wheel.exe from the laptop. A command prompt window should come up. Verify by turning the wheel and pushing the pedals that all input is being registered properly; the output should show steering, then acceleration, then braking.
  4. Set Up Camera System:
    • Turn on both computers in the corner of the lab.
    • Verify that both are connected to the internet through the local router.
    • Run the camera program by double clicking Shortcut to CPS on each.
    • If both computers show two windows with overhead views of one quadrant of the lab, then everything is working properly.
  5. Run Demo:
    • On the autonomous car, run the program in the program directory using the command "./ca2 inter_11points.txt 4 <human car number>"
    • On the human controlled car, run the program in the program directory using the command "./ManualDrive"
    • The demo should be up and running; proceed until you are done, there is a crash, a car goes off course, or anything else stops the experiment.
    • At the end of the experiment, download the folder "ca_output" from the program directories on both cars, appending -4 and -<human car num> to each folder. You will end up with ca_output-4 and ca_output-<human car num>.


New semi-autonomous

This demo will include one autonomous car that does no collision avoidance and one car that will be driven by a human. The human car will give the human full control until there is danger of a collision, at which point the car will warn the human to either accelerate or brake, depending on what is required to avoid a crash. If the human does not react properly within a given time frame, the car will take control and act to avoid the collision.

...