Navigation Strategies for Exploring Indoor Environments
Hector H. González-Baños, Jean-Claude Latombe
Presented by Miler Lee, 6 May 2002
CS 326A: Motion Planning
Automatic model building
Task: Build a representation of an unknown environment relying only on the robot's sensors
Next-best-view (NBV) problem: What path should the robot take to optimize its sensing operations?
Challenges to next-best view
• Avoid obstacle collisions, given that the robot moves through an unknown environment and the sensors are mounted on the robot itself
• Localize against the map-so-far, remaining robust despite odometry errors (requires a minimum overlap between successive sensing operations)
Thrun 2001
NBV algorithm overview
Given sensor information at the current position and the map-so-far,
• Compute the local safe region
• Align with the map-so-far
• Generate a list of candidate NBVs:
  • Sample points near the edge of the safe region
  • Remove candidates not adequately visible from the boundaries of the safe region
  • Remove candidates without a collision-free path
• Select the NBV maximizing visibility gain vs. path length
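The loop above can be summarized in Python-style pseudocode; every helper named here (sense, compute_safe_region, align_and_merge, sample_candidates, adequately_visible, gain, path_to, length) is a hypothetical stand-in for a component described on the following slides, not code from the paper:

```python
def explore(robot, planner, lam):
    model = None
    while True:
        scan = robot.sense()
        local = compute_safe_region(scan)        # local safe region
        model = local if model is None else align_and_merge(model, local)
        candidates = [q for q in sample_candidates(model)
                      if adequately_visible(q, model)   # enough overlap to align
                      and planner.path_to(q, model) is not None]
        if not candidates:
            return model                         # nothing left to explore
        q_next = max(candidates, key=lambda q:
                     gain(q, model) * exp(-lam * planner.length(q, model)))
        robot.move_to(q_next)                    # go to the next best view
```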
Safe region
The largest region the robot can safely occupy, given the map-so-far
• Compute the visibility region
• Extrapolate the safe region
Visibility
• W ⊂ ℝ² is the actual workspace
• ∂W is the boundary of W
• w ∈ ∂W is visible from q ∈ W if the following constraints are satisfied:
1. Line-of-sight constraint: the open line segment S(w, q) does not intersect ∂W
2. Range constraint: Euclidean distance d(q, w) ≤ rmax for some rmax > 0
3. Incidence constraint: ∠(n, v) ≤ τ for some τ ∈ [0, π/2], where n is the normal to ∂W at w and v is the vector from w to q
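As a concrete sketch, the three constraints can be checked with a small predicate, assuming the rest of the boundary is given as a list of line segments; the names `visible` and `segments_cross` are illustrative, not from the paper:

```python
import math

def segments_cross(p1, p2, p3, p4):
    """True if the open segment p1-p2 properly crosses segment p3-p4."""
    def orient(a, b, c):
        return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
    d1, d2 = orient(p3, p4, p1), orient(p3, p4, p2)
    d3, d4 = orient(p1, p2, p3), orient(p1, p2, p4)
    return d1 * d2 < 0 and d3 * d4 < 0

def visible(q, w, n, boundary, r_max, tau):
    """Check the three visibility constraints for boundary point w with
    inward unit normal n, as seen from robot position q. `boundary` lists
    the wall segments other than the one containing w."""
    d = math.dist(q, w)
    if d > r_max:                       # 2. range constraint
        return False
    v = ((q[0] - w[0]) / d, (q[1] - w[1]) / d)   # unit vector from w to q
    if n[0] * v[0] + n[1] * v[1] < math.cos(tau):
        return False                    # 3. incidence constraint
    # 1. line-of-sight: open segment S(w, q) must not cross the boundary
    return not any(segments_cross(q, w, a, b) for a, b in boundary)
```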
Solid curves
• The visible contour is described in polar coordinates as a piecewise continuous function r(θ) with respect to the robot origin
• r(θ) can be decomposed into non-overlapping, continuously visible solid curves, defined by r(θ; θa, θb) over (θa, θb), where θa and θb are critical values of θ
• Let Γ contain all solid curves in the boundary
Free curves
• Each pair of succeeding solid curves ⟨r1(θ; θa1, θb1), r2(θ; θa2, θb2)⟩ is connected with a free curve f(θ; θb1, θa2), such that:
  • Any ray from the robot origin in the polar region θb1 < θ < θa2 intersects the curve before any obstacle
  • The area defined by f is as large as possible
González-Baños / Latombe 2002
Solid curve approximation
Finite resolution: rather than visible curves, the sensor returns a list of visible points
Solution:
• group the points into clusters
• fit a conservative (minimal number of vertices) polyline to each cluster, such that every point lies within ε of a line segment
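A standard way to get such a conservative polyline is recursive splitting in the Douglas-Peucker style; this is a sketch assuming each cluster is an ordered list of points with tolerance eps, not the paper's exact fitting routine:

```python
import math

def _dist_to_segment(p, a, b):
    """Euclidean distance from point p to segment a-b."""
    dx, dy = b[0] - a[0], b[1] - a[1]
    if dx == 0 and dy == 0:
        return math.hypot(p[0] - a[0], p[1] - a[1])
    t = ((p[0] - a[0]) * dx + (p[1] - a[1]) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))
    return math.hypot(p[0] - (a[0] + t * dx), p[1] - (a[1] + t * dy))

def fit_polyline(points, eps):
    """Recursively split an ordered point cluster at the point farthest
    from the end-to-end chord, until every point is within eps of
    some segment of the resulting polyline."""
    if len(points) <= 2:
        return list(points)
    a, b = points[0], points[-1]
    idx, dmax = 0, -1.0
    for i in range(1, len(points) - 1):
        d = _dist_to_segment(points[i], a, b)
        if d > dmax:
            idx, dmax = i, d
    if dmax <= eps:
        return [a, b]            # one segment already fits every point
    left = fit_polyline(points[:idx + 1], eps)
    return left[:-1] + fit_polyline(points[idx:], eps)
```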
Model alignment / merging
Define the partial global model at qk-1 to be Mg(qk-1) = ⟨Γg(qk-1), Sg(qk-1)⟩
• Γg(qk-1) is the union of the solid curves in ∂W up to stage k-1 (subject to transform)
• Sg(qk-1) is the safe region defined by Γg(qk-1) and the connecting free curves
Let the local model be ml(qk) = ⟨Γl(qk), sl(qk)⟩
Model alignment / merging
Given Mg(qk-1) = ⟨Γg(qk-1), Sg(qk-1)⟩ and ml(qk) = ⟨Γl(qk), sl(qk)⟩
Calculate the transform T by matching the line segments in Γg(qk-1) and Γl(qk) with an alignment algorithm
Create the new safe region Sg(qk) = T(Sg(qk-1)) ∪ sl(qk)
Model alignment / merging
Example ALIGN algorithm:
• Randomly select pairs of segments from Γl
• For each pair (u1, u2) in Γl, find (v1, v2) in Γg with the same relative angle
• Look for the best-fit transform that results
Not perfect; odometric / distance information can be used as well
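Once candidate segment pairs are matched, the best-fit transform step has a closed form. Here is a least-squares rigid-transform sketch over matched 2-D points (e.g. segment endpoints); the function name and interface are illustrative, not the paper's:

```python
import math

def fit_transform(src, dst):
    """Least-squares rotation theta and translation (tx, ty) mapping the
    matched points src onto dst (classic 2-D Procrustes / Kabsch step)."""
    n = len(src)
    cs = (sum(p[0] for p in src) / n, sum(p[1] for p in src) / n)
    cd = (sum(p[0] for p in dst) / n, sum(p[1] for p in dst) / n)
    # accumulate the 2x2 cross-covariance of the centered point sets
    sxx = sxy = syx = syy = 0.0
    for (x, y), (u, v) in zip(src, dst):
        x, y, u, v = x - cs[0], y - cs[1], u - cd[0], v - cd[1]
        sxx += x * u; sxy += x * v; syx += y * u; syy += y * v
    theta = math.atan2(sxy - syx, sxx + syy)
    c, s = math.cos(theta), math.sin(theta)
    # translation that carries the rotated source centroid onto the target
    tx = cd[0] - (c * cs[0] - s * cs[1])
    ty = cd[1] - (s * cs[0] + c * cs[1])
    return theta, (tx, ty)
```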
Feasible set of NBV candidates
1. Randomly pick points within the visibility range of the free curves bounding Sg(qk-1)
2. Rule out points that are too far away from the visible solid curves of Γg(qk-1), to allow for accurate model alignment
3. Rule out points for which a path planner cannot generate a collision-free path
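Steps 1 and 2 can be sketched as rejection sampling, assuming the free and solid curves are given as point lists; `d_align` is an illustrative overlap threshold, and the path-planner check of step 3 is left out:

```python
import math
import random

def sample_candidates(free_curves, solid_points, r_max, d_align, n=200):
    """Sample n points within sensor range r_max of the free curves, then
    keep only those within d_align of a visible solid-curve point, so the
    next scan will overlap the model enough for alignment."""
    candidates = []
    for _ in range(n):
        p = random.choice(random.choice(free_curves))
        # jitter the sample around the free curve, within sensor range
        r, a = random.uniform(0, r_max), random.uniform(0, 2 * math.pi)
        q = (p[0] + r * math.cos(a), p[1] + r * math.sin(a))
        if any(math.dist(q, s) <= d_align for s in solid_points):
            candidates.append(q)
    return candidates
```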
Scoring NBV candidates
The score of an NBV candidate q is defined by
g(q) = A(q) · exp(-λ L(q))
• A(q) measures the visibility gain of the new position
• L(q) is the length of the path generated by the planner
• λ weights path cost against information gain; small λ if motion is cheap
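The scoring rule g(q) = A(q) · exp(-λ L(q)) is one line of code; a minimal sketch, with `lam` standing in for the weighting parameter λ:

```python
import math

def score(area_gain, path_length, lam):
    """g(q) = A(q) * exp(-lam * L(q)): visibility gain discounted by the
    length of the planned path to the candidate."""
    return area_gain * math.exp(-lam * path_length)

# A nearby candidate with modest gain can beat a distant, richer one
# when motion is moderately expensive:
best = max([(10.0, 5.0), (6.0, 1.0)],          # (A(q), L(q)) pairs
           key=lambda c: score(c[0], c[1], lam=0.5))
# best is (6.0, 1.0): 6*exp(-0.5) ~ 3.64 beats 10*exp(-2.5) ~ 0.82
```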
Visibility gain metric A
Example A(q) calculation:
• Cast a set of equally spaced rays from q
• For each ray:
  • if the segment between distances 0 and rmax from q hits a solid edge, ignore the portion of the ray beyond the intersection
  • compute the length len of the remaining segment that falls outside of Sg(qk)
• set A(q) to the sum of len over all rays
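The calculation above can be sketched as follows, with the geometry hidden behind two assumed callbacks: `first_hit(q, theta)` returns the distance to the nearest solid edge along a ray (or None), and `inside_safe(p)` tests membership in Sg(qk); both names are illustrative:

```python
import math

def visibility_gain(q, n_rays, r_max, first_hit, inside_safe):
    """Estimate A(q) by casting n_rays equally spaced rays from q and
    summing the length of each ray's unoccluded portion that falls
    outside the current safe region."""
    total = 0.0
    step = r_max / 100.0            # sampling resolution along each ray
    for i in range(n_rays):
        theta = 2 * math.pi * i / n_rays
        hit = first_hit(q, theta)
        # ignore the ray beyond a solid-edge intersection, and beyond r_max
        reach = min(r_max, hit) if hit is not None else r_max
        r = step / 2
        while r < reach:
            p = (q[0] + r * math.cos(theta), q[1] + r * math.sin(theta))
            if not inside_safe(p):  # portion outside Sg(qk) counts as gain
                total += step
            r += step
    return total
```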
NBV computation
Remarks
• Experiments showed that a moderately low number of motion / sense operations is required to build a map
• Parameters can be tuned to implement a cautious, detailed mapper or a high-exploration, rough mapper
• Can be extended to multiple robots; efficient if they can work relative to each other
Limitations
• 2D model only
• No error recovery or retroactive adjustment of the map
• Struggles in feature-poor environments
Future work: integrate NBV algorithm with other simultaneous localization and mapping (SLAM) techniques