Version history:
version 1.1 – May 2008
Polished release. Improved compilation and aesthetic touches.
version 1.0 – September 2007
First release at the time of submission.
This is the source code that accompanies the paper
Andrea Censi, Gian Diego Tipaldi, “Lazy localization using the Frozen-Time Smoother”, in ICRA’08.
- final version (PDF)
- see additional material at http://purl.org/censi/2007/fts.
The source code is distributed under the Creative Commons License (Attribution-NonCommercial-ShareAlike). In addition to the code by Andrea Censi and Gian Diego Tipaldi, the MCL localizer is based on code from GMapping copyrighted by Giorgio Grisetti, Cyrill Stachniss, and Wolfram Burgard (under the same license).
Pre-requisites:
Linux or Mac OS X. We have no idea whether it also works on Windows; we haven’t touched a Windows-infected machine in a long time.
CMake and pkg-config, for the build process.
The GNU Scientific Library (GSL), a pre-requisite of CSM.
Optional pre-requisites:
Note that all of these can be easily installed using your Linux distribution package manager.
For Mac, you can use fink.
pkg-config
This software links to CSM using the pkg-config system. If you installed it correctly, you should be able to write:
$ pkg-config --libs csm
and obtain a list of CSM’s libraries.
If you installed CSM in a place different from /usr/local, then you will see a message like:
Package csm was not found in the pkg-config search path.
Perhaps you should add the directory containing `csm.pc'
to the PKG_CONFIG_PATH environment variable
Do what the message says. For example, I gave /Users/andrea/svn/cds/csm/deploy/ as the installation directory for CSM, so I set PKG_CONFIG_PATH as follows:
export PKG_CONFIG_PATH=/Users/andrea/svn/cds/csm/deploy/lib/pkgconfig/
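To double-check that CSM is now visible to pkg-config, you can query it again (the exact output depends on your installation):
$ pkg-config --modversion csm
$ pkg-config --cflags --libs csm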
Then, issue the commands:
$ cd src/fts
$ cmake .
$ make
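If you prefer to keep the source tree clean, CMake also supports an out-of-source build (assuming the project's CMakeLists.txt does not insist on an in-source build):
$ cd src/fts
$ mkdir build && cd build
$ cmake ..
$ make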
For the MCL localizer, installation just requires running the configure script:
$ cd src/mcl
$ ./configure
$ touch manual.mk # add your switches here
$ make
If some library is not found, you might have to edit the file manual.mk.
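What goes into manual.mk depends on the generated Makefile. As a purely hypothetical sketch, assuming manual.mk is included by the main Makefile and that the usual compiler/linker variables are honored, it could look like this:
$ cat manual.mk
# hypothetical switches -- adjust to the variables your Makefile actually uses
CPPFLAGS += -I/opt/local/include
LDFLAGS  += -L/opt/local/lib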
For the FTS, the source code is contained in the fts/ directory. Note, however, that it uses the CSM library for the basic stuff (the laser_data structure definition, JSON input/output, etc.).
For the fts_loc application, the relevant files are:
- fts_loc.h contains the parameter definitions (to get a description, run fts_loc -help, as shown below).
- fts_loc.c contains the command-line parsing, the setup procedures, and the loop that chooses which scans to integrate.
- fts_loc_ght.c contains the GHT loop.
Note that these files contain a number of optimization methods that we used to improve efficiency; they can be activated by command-line options. We did NOT use them in the paper’s experiments, as the FTS was fast enough as a naive implementation.
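For example, to print the description of all parameters (assuming you run it from the directory where fts_loc was built):
$ ./fts_loc -help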
Other files:
- fts_loc_nmi.c contains the code for some optimizations to the GHT outer loop.
- spatial_*: various code for handling 1D, 2D, and 3D buffers.
- normal_map*: can you guess?
Other applications:
- fts_log2pdf.c visualizes the data as PDFs or PNGs.
- fts_exp_eval.c computes the statistics.
- fts_log_split.c splits one log into chunks.
- fts_log_recover.c recovers the estimate field in the SLAM log and puts it into the scan matching log as true_pose (ground truth).
In the following, dataset is one of aces, intel. You can use other logs, of course. You need to have the files:
${dataset}.log.bz2 (original log)
${dataset}.gfs.log.bz2 (used as a map)
${dataset}.sm.log.bz2 (used as incremental guess)
These are bz2-compressed files in either Carmen or JSON format. First off, prepare the logs, dividing them into chunks, using:
./prepare_logs.sh ${dataset}
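For example, for the aces dataset you need aces.log.bz2, aces.gfs.log.bz2, and aces.sm.log.bz2 in place, and you prepare them with:
$ ./prepare_logs.sh aces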
The following commands run the FTS and MCL experiments:
$ nice -n 10 ./run_fts.sh ${dataset}
$ nice -n 10 ./run_exp1.sh ${dataset}
To create the statistics in Matlab/Octave-readable format:
$ ./create_stats.sh ${dataset} ${algo}
This will create .m files.
From Matlab, load these files using:
exp = stats_read(dataset, algorithm);
for example:
exp = stats_read('aces', 'fts1');
stats_show(exp);
To create figures, try:
$ nice -n 10 ./create_figures.sh ${dataset} ${algo}
To create the HTML pages, use:
$ ./create_slideshow.rb ${dataset} ${algo}
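Putting the whole pipeline together for one dataset and one method (here aces and fts1, as in the examples above):
$ ./prepare_logs.sh aces
$ nice -n 10 ./run_fts.sh aces
$ nice -n 10 ./run_exp1.sh aces
$ ./create_stats.sh aces fts1
$ nice -n 10 ./create_figures.sh aces fts1
$ ./create_slideshow.rb aces fts1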
In the following, dataset = {aces, intel}, method = {fts1, mcl1, mcl2}, and XXX is a three-digit number.
Input files:
- ${dataset}.log.bz2 (used as a map)
- ${dataset}.gfs.log.bz2 (used as a map)
- ${dataset}.sm.log.bz2 (used as incremental guess)
Intermediate files:
- ${dataset}_chunks/XXX: chunk with scan matching data
- ${dataset}_chunks/XXX.clf: same, in Carmen Log Format
- ${dataset}_chunks/XXX.odo: chunk with odometry data
- ${dataset}_chunks/XXX.odo.clf: same, in Carmen Log Format
Output files:
- ${dataset}_chunks/${method}/XXX: method output (spatial buffer or particles). This is read by fts_exp_eval.
- ${dataset}_chunks/${method}/XXX_go: method graphical output (spatial buffer or particles, with text written on it). This is read by fts_log2pdf.
- ${dataset}_chunks/${method}/XXX_stats.json: output of fts_exp_eval, created from XXX. This is read by json2matlab.
- ${dataset}_chunks/${method}/XXX_stats.m: conversion to Matlab of XXX_stats.json.
Other files:
- ${dataset}_chunks/${method}/XXX_go.pdf: created by fts_log2pdf.
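As a concrete example, for dataset = aces, method = fts1, and XXX = 007 (an arbitrary chunk number), these files would be:
aces_chunks/007
aces_chunks/007.clf
aces_chunks/007.odo
aces_chunks/007.odo.clf
aces_chunks/fts1/007
aces_chunks/fts1/007_go
aces_chunks/fts1/007_stats.json
aces_chunks/fts1/007_stats.m
aces_chunks/fts1/007_go.pdf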