TRANSCRIPT
4D/RCS Reference Model Architecture for Unmanned Vehicle Systems
• Hierarchical structure of goals and commands
• Representation of the world at many levels
• Planning, replanning, and reacting at many levels
• Integration of many sensors: stereo CCD & FLIR, LADAR, radar, inertial, acoustic, GPS, internal
James S. Albus, Hui-Min Huang -- National Institute of Standards and Technology
Robert Finkelstein -- Robotic Technology, Inc.
What is an Architecture?
• Functional modules, components, interfaces
• Network connectivity, latency, bandwidth, reliability
• Timing and coordination
• Knowledge database
• Communications protocols
• Human interfaces: displays and control inputs, simulation and training tools, programming and debugging tools
What is 4D/RCS?
A reference model architecture
designed to enable any desired level of intelligence
up to and including human level intelligence
What is a Reference Model?
• Functions, entities, events, relationships
• Interaction and information flow between systems and subsystems
• Structures for representation of knowledge, goals, plans, tasks, schedules, intentions, beliefs, values
• Mechanisms for perception, attention, cognition
• Mechanisms for reasoning, modeling, and learning
4D/RCS Reference Model Architecture
[Diagram: a command hierarchy from Battalion HQ, Platoon HQ, and Section HQ down through Surrogate Battalion (24 hr plans), Surrogate Platoon (2 hr plans), Surrogate Section (10 min plans), and Individual Vehicle (1 min plans); vehicle subsystems for RSTA, Communications, Weapons, and Mobility (5 s plans); a Primitive level (500 ms plans) with Driver and Gaze controllers; and a Servo level (50 ms plans) commanding Focus, Pan, Tilt, Iris, Heading, Speed, and Select outputs to the Sensors and Actuators.]
How 4D/RCS Relates to WSTAWG
What is the history?
• Under development at NIST and elsewhere for 25 years
• Originally developed for intelligent manufacturing systems
• Transitioned by DARPA and NASA for unmanned vehicle and manipulator systems
• Used by Army TEAM, TMAP, FMR, Demo I, III
• Has been implemented in JAUGS
• To be integrated into the Vetronics Technical Architecture
RCS History
NBS/NIST -- Automated Manufacturing Research Facility
DARPA -- Unmanned Combat Air Vehicle, China Lake target drones (proposed)
DARPA -- Multiple Unmanned Undersea Vehicles (MAUV)
DARPA -- Submarine Operational Automation System (SOAS)
GD Electric Boat -- Next generation nuclear submarine
NASA -- Space Station Flight Telerobotic Servicer (NASREM)
Bureau of Mines -- Coal mine automation
U.S. Postal Service -- Stamp distribution center, General mail facility
Army -- TEAM, TMAP, MDARS, Demo I, II, and III
ARL -- Collaborative Technology Alliance
DARPA -- FCS (Boeing, GD), MARS, PerceptOR, Tactical Mobile Robotics (USAR test course)
DOT -- Intelligent vehicle, performance measures
What is the level of maturity?
• A free public domain software library
• A variety of software development tools
• A variety of process visualization tools for:
analysis
debugging
control
human interface design
• Documentation and training materials
4D/RCS Documentation
Version 0.1 -- Issued with Demo III RFP, 1997
Version 1.0 -- Issued in January 1999
Version 2.0 -- Available in draft form
Books:
Engineering of Mind -- Wiley, 2001
RCS Handbook -- Wiley, 2001
Intelligent Systems -- Wiley, 2002
Numerous journal articles, reports, and conference papers
Extensive software library http://isd.cme.nist.gov/projects/rcslib
[Diagram: three parallel hierarchies. The ORGANIZATIONAL HIERARCHY (GROUP, INDIVIDUAL, ELEMENTARY MOVE, PRIMITIVE, SERVO) maps onto a COMPUTATIONAL HIERARCHY of nodes, each containing Behavior Generating (BG1-BG5), World Modeling (WM1-WM5), Value Judgment (VJ1-VJ5), and Sensory Processing (SP1-SP5) modules connected through Sensors and Actuators to the ENVIRONMENT. The BEHAVIORAL HIERARCHY traces a task through state space over time: ASSEMBLE A B decomposes into FETCH A, FETCH B, MATE B TO A, FASTEN B TO A, and further into elementary moves such as REACH TO A, GRASP, MOVE TO X, RELEASE, REACH TO B, GRASP, MOVE TO Y, LOCATE HOLE, MOVE TO TOUCH, INSERT, TWIST, LOCK.]
Three Aspects of 4D/RCS
4D/RCS Reference Architecture
[Diagram: a hierarchy of SP-WM-BG nodes spanned by an OPERATOR INTERFACE at every level. SERVO nodes make 0.05 second plans (actuator output); PRIMITIVE nodes make 0.5 second plans (steering, velocity); SUBSYSTEM nodes (Locomotion, Communication, Mission Package) make 5 second plans (subtask on object surface, obstacle-free paths); the VEHICLE node plans the next 50 seconds (task to be done on objects of attention); SURROGATE SECTION plans the next 10 minutes (tasks relative to nearby objects, section formation); SURROGATE PLATOON plans the next 2 hours (platoon formation); SURROGATE BATTALION plans the next 24 hours (battalion formation). Sensory processing attends to points, lines, surfaces, and objects of attention, grounded in the SENSORS AND ACTUATORS.]
4D/RCS Computational Node
[Diagram: each RCS node combines SENSORY PROCESSING, WORLD MODELING, VALUE JUDGMENT, and BEHAVIOR GENERATION around a shared KNOWLEDGE DATABASE (images, maps, entities, events, task knowledge). A COMMANDED TASK (GOAL) enters Behavior Generation, whose planners generate tentative plans and whose executors issue COMMANDED ACTIONS (SUBGOALS) and report STATUS. World Modeling predicts sensory input, simulates plan results, and updates state; Value Judgment evaluates plan results and situations. Sensory Processing compares OBSERVED INPUT with PREDICTED INPUT (classification, estimation, computation, grouping, windowing) and reports PERCEIVED OBJECTS & EVENTS. Each node connects to higher and lower level world modeling, to peers (peer input/output), to sensors and actuators in the world, and to an OPERATOR INTERFACE.]
4D/RCS Reference Model Elements
Behavior Generation -- Planning and Execution
Sensory Processing -- Sensing and Perception
World Modeling -- Estimation, Prediction, Simulation
Value Judgment -- Cost/Benefit, Confidence
Knowledge Database -- What is known
Communications -- Message passing
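The division of labor among these elements can be sketched in code. The following is a minimal, hypothetical illustration (not the NIST RCS library API): one node whose SP, WM, VJ, and BG methods share a knowledge database, with stand-in logic for each element.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical sketch of one 4D/RCS node: Sensory Processing (SP),
// World Modeling (WM), Value Judgment (VJ), and Behavior Generation (BG)
// sharing a Knowledge Database (KD). All names and logic are illustrative.
struct KnowledgeDatabase {
    std::vector<std::string> entities;  // symbolic entities
    double state = 0.0;                 // a stand-in state variable
};

struct Node {
    KnowledgeDatabase kd;

    // SP: fold an observation into the KD (trivial update here).
    void sense(double observation) { kd.state = observation; }

    // WM: predict the next state (identity model as a placeholder).
    double predict() const { return kd.state; }

    // VJ: score a candidate subgoal -- lower cost is better.
    double evaluate(double subgoal) const {
        double err = subgoal - kd.state;
        return err * err;
    }

    // BG: choose the cheapest subgoal among the candidates.
    double decide(const std::vector<double>& candidates) const {
        double best = candidates.front();
        for (double c : candidates)
            if (evaluate(c) < evaluate(best)) best = c;
        return best;
    }
};
```

In the real architecture each of these methods is a full module with its own planners, estimators, and message interfaces; the point of the sketch is only how the five elements divide the work around a shared KD.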
Behavior Generation
[Diagram: each BG module contains a PLANNER and a set of EXECUTORS (EX), each executor following its own plan. Every executor issues subtask command outputs to the planner of a subordinate BG module (e.g. Agent1), repeating the planner/executor pattern at each level of the hierarchy.]
[Diagram: the planning loop within a node. A Task Command Input enters the Task Decomposition PLANNER in BEHAVIOR GENERATION, which proposes Tentative Plans to the WORLD MODELING SIMULATOR/PREDICTOR; the Expected Results are scored by VALUE JUDGMENT (cost, benefit) and the winning plan is handed to the EXECUTORs. SENSORY PROCESSING (recognize, filter, compute, group, window) maintains the knowledge database (KD: images, maps, entities, events, states, attributes) that feeds the simulator, while executor status and sensory feedback close the loop.]
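The plan-simulate-evaluate loop described above can be shown as a short sketch. This is an illustrative reduction under assumed stand-ins (a plan is a waypoint list, the WM simulator just returns the final waypoint, VJ cost is squared goal error), not the NIST implementation.

```cpp
#include <cassert>
#include <limits>
#include <vector>

// Sketch of the planner loop: propose tentative plans, let the world
// model simulator predict their results, let value judgment score them,
// and hand the best plan to the executors. All logic is a stand-in.
struct Plan { std::vector<double> waypoints; };

// WM simulator: predict the end state of a plan (here, its last waypoint).
double simulate(const Plan& p) { return p.waypoints.back(); }

// VJ: cost = squared distance between predicted result and the goal.
double cost(double predicted, double goal) {
    double e = predicted - goal;
    return e * e;
}

// Planner: keep the tentative plan whose simulated result best meets the goal.
const Plan& selectPlan(const std::vector<Plan>& tentative, double goal) {
    const Plan* best = &tentative.front();
    double bestCost = std::numeric_limits<double>::infinity();
    for (const Plan& p : tentative) {
        double c = cost(simulate(p), goal);
        if (c < bestCost) { bestCost = c; best = &p; }
    }
    return *best;
}
```

In 4D/RCS this loop runs concurrently at every level, each with its own planning horizon, so "simulate" at one level is itself a planner/executor cycle at the level below.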
4D/RCS Commands and Plans at Vehicle Level (4)

TASK COMMAND received at the Vehicle level:
ActionCommand = ac(1,4,1), GoalCommand = gc(1,4,1), GoalTime = gt(1,4,1) ~ t + 1 min
NextActionCommand = ac(2,4,1), NextGoalCommand = gc(2,4,1), NextGoalTime = gt(2,4,1) ~ t + 2 min

This command would be decomposed into three plans for the Subsystem level of the form:

Autonomous Mobility Plan: ap(i,3,1), gp(i,3,1), gt(i,3,1) for i = 1..10
RSTA Plan: ap(i,3,2), gp(i,3,2), gt(i,3,2) for i = 1..10
Communications Plan: ap(i,3,3), gp(i,3,3), gt(i,3,3) for i = 1..10

with planned goal times stepping from gt(1,3,k) = t + 5 sec up to gt(10,3,k) = t + 1 min,

where ap is a planned action, gp is a planned goal, gt is a planned goal time, and ap(i,j,k) is the i-th planned action for the k-th subordinate BG module at the j-th level.
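The (action, goal, goal-time) triples above can be carried in a simple data structure. This is a hypothetical rendering (field names and the evenly spaced subgoal times are illustrative assumptions), sketching how a vehicle-level command decomposes into a 10-step plan for each subordinate subsystem module.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical plan-step structure for the ap/gp/gt triples; the real
// NIST messages differ. Times are evenly spaced here for simplicity.
struct PlanStep {
    std::string action;   // ap(i,j,k) -- planned action
    std::string goal;     // gp(i,j,k) -- planned goal
    double goalTime;      // gt(i,j,k) -- planned goal time, seconds from t
};

// Decompose a command with horizon `horizonSec` into `steps` waypoints
// for subordinate module k at level j.
std::vector<PlanStep> decompose(int j, int k, double horizonSec, int steps) {
    std::vector<PlanStep> plan;
    for (int i = 1; i <= steps; ++i) {
        double gt = horizonSec * i / steps;  // evenly spaced subgoal times
        std::string idx = "(" + std::to_string(i) + "," + std::to_string(j) +
                          "," + std::to_string(k) + ")";
        plan.push_back({"ap" + idx, "gp" + idx, gt});
    }
    return plan;
}
```

For the vehicle-to-subsystem case above, `decompose(3, 1, 60.0, 10)` yields the Autonomous Mobility plan's ten steps out to t + 1 min.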
Message List
[Diagram: messages flowing down the BG hierarchy. SECTION LEVEL (10 min plans): Section1's planner produces Vehicle1 and Vehicle2 plans. VEHICLE LEVEL (1 min plans): Vehicle1's planner produces AM, RSTA, and Communications plans. SUBSYSTEM LEVEL (5 s plans): Autonomous Mobility produces Driver and Gaze plans; RSTA produces Stereo Gaze and LADAR Gaze plans; Communications produces a Communications plan. PRIMITIVE LEVEL (500 ms plans): the Driver produces a Velocity plan; the Gaze modules produce Pan and Tilt plans. SERVO LEVEL (50 ms plans): Velocity, Stereo Gaze, and LADAR Gaze executors drive the ACTUATORS (F Wheel, R Wheel, F Steer, R Steer, Pan, Tilt) at a 5 ms update. Each executor (EX) passes a TASK COMMAND of the form ac1, gc1, gt1; ac2, gc2, gt2 -- the current and next ActionCommand, GoalCommand, and GoalTime -- to the PLANNER of its subordinate BG module.]
4D/RCS for Demo III -- Organizational Hierarchy
[Diagram: the Demo III hierarchy from sensing to section-level planning.
Sensory processing: SP1 takes actuator, vehicle, and sensor state from raw signals (LADAR, stereo CCD, stereo FLIR, color CCD, radar, actuator, and navigational signals; actuator power). Successive stages SP1-SP5 compute attributes, filter, and classify -- pixels, labeled pixels, labeled lists, labeled surfaces, labeled objects, labeled groups -- each stage applying grouping/attention, filtering/attribute computation, and classification/confirmation of grouping. Object image pointers label regions such as vehicle, ground, sky, tree, rock, hill, building.
World modeling: a WM simulator at each level maintains maps of increasing range and decreasing resolution -- 5 m range at 4 cm resolution, 50 m at 40 cm, 500 m at 4 m, 5000 m at 40 m -- registered via coordinate transformations with a priori maps and a digital terrain map database. Knowledge is held as FRAMES (entities, events, attributes, states, relationships), IMAGES (labeled regions, attributes), and MAPS (labeled features, attributes, icons, cost, risk, plans).
Behavior generation: SERVO PLANNER (50 ms horizon), PRIMITIVE PLANNER (500 ms horizon), SUBSYSTEM PLANNER (5 s horizon), VEHICLE PLANNER (1 min horizon), and SECTION PLANNER (10 min horizon), each paired with an EXECUTOR and driven by its task command (Section, Vehicle, Subsystem, Primitive, Servo tasks with their plans) from the level above; status flows back up, and sensors and actuators close the loop with the WORLD.]
4D/RCS for Demo III -- Computational Hierarchy
SUBSYSTEM LEVEL -- 5 second planning horizon
[Diagram: the commanded task from the VEHICLE LEVEL (next 5 seconds) enters the subsystem planner. A World Map (50 meters, built from the digital terrain map database: ground cover, terrain elevation, traversability, roughness, slope, obstacles) and a WM inverse simulator using vehicle inverse kinematics generate planned actions and planned waypoints, scored by VJ into a Plan Map (50 meters). The executor sends the commanded task to the PRIMITIVE LEVEL for the next 500 ms, along with the predicted vehicle position, heading, and velocity. A new plan is produced every 1 second.]
Planning at Subsystem Level
[Images: Possible Paths from Hard Right Wheel Position; 3-D Terrain Traversability; Path Cost Evaluation; Planning to Turn Off-Road; Replanning to Avoid Obstacle During Turn]
Planning at Vehicle Level
VEHICLE LEVEL -- 50 second planning horizon
[Diagram: the commanded task from the SECTION LEVEL (next 50 seconds) enters the vehicle planner. A World Map (500 meters, built from the digital terrain map database: risk, visibility from(x), ground cover, terrain elevation, traversability, obstacles) and a WM inverse simulator using a vehicle mobility inverse model generate planned actions and planned waypoints, scored by VJ into a Plan Map (500 meters) showing the vehicle position. The executor sends the commanded task to the SUBSYSTEM LEVEL for the next 5 seconds. A new plan is produced every 10 seconds.]
A Typical Cost Function

cost = ( costWeight->obstacle      * detectObs
       + costWeight->pathLength    * segLength
       + costWeight->unknown       * detectUnknown
       + costWeight->road          * aRoads
       + costWeight->building      * aBuildings * notDetectUnknown
       + costWeight->forest        * aForest * notDetectUnknown
       + costWeight->nearRoad      * aNearRoads
       + costWeight->nearBuilding  * aNearBuildings
       + costWeight->nearForest    * aNearForest
       + costWeight->risk          * riskMap
       + costWeight->frontSlopeMax * abs(maxFrontSlope)
       + costWeight->sideSlopeMax  * abs(maxSideSlope)
       + costWeight->frontSlopeAvg * avgFrontSlope
       + costWeight->sideSlopeAvg  * avgSideSlope
       + costWeight->field         * fieldConformance
       + costWeight->offset ) * length;
Avoid Obstacles
Cost Field
Follow Roads
Follow Tree Line
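The weighted-sum form of the cost function can be exercised with a reduced sketch. This keeps only a few of the terms, with illustrative weight and segment values (not Demo III's), to show how one path segment is scored.

```cpp
#include <cassert>
#include <cmath>

// Reduced, hypothetical version of the path-segment cost function:
// a weighted sum of terrain terms scaled by segment length. Weights
// and inputs are stand-ins, not values from the Demo III planner.
struct CostWeight {
    double obstacle, pathLength, risk, frontSlopeMax, offset;
};

struct Segment {
    double detectObs;      // obstacle detection measure
    double segLength;      // path-length term
    double riskMap;        // risk read from the map
    double maxFrontSlope;  // worst front slope along the segment
    double length;         // overall scale factor, as in the original
};

double segmentCost(const CostWeight& w, const Segment& s) {
    return (w.obstacle      * s.detectObs +
            w.pathLength    * s.segLength +
            w.risk          * s.riskMap +
            w.frontSlopeMax * std::fabs(s.maxFrontSlope) +
            w.offset) * s.length;
}
```

The planner searches for the path whose summed segment costs are lowest, so raising a single weight (e.g. `risk`) shifts whole routes, which is how behaviors like "follow roads" or "avoid obstacles" are tuned.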
Video: Army Research Laboratory Demo III Experimental Unmanned Ground Vehicle using the NIST 4D/RCS Architecture
The 4D in 4D/RCS
Dickmanns video
4-D Model Based Vision for Automated Driving
Developed at the Universität der Bundeswehr, Munich, by Ernst Dickmanns
Sensing and Perception
• Resolution and field of view
• Tracking, stabilization, and saccades
• Measure attributes and features, infer 3-D shape
• Group features into patterns, objects, events, situations
• Establish relationships
• Classify, recognize, and evaluate objects, events, situations
• Distinguish what is important from what is not
• Focus attention on what is important
• Fuse a priori with sensed data
• Generate expectations and predictions
• Function in real-time with computed confidence levels
Hypothesize and Test
[Diagram: SENSORY PROCESSING correlates Sensory Observations with World Model Predictions. WORLD MODELING predicts and updates, reading from and writing to the Knowledge Database; strong correlations confirm a Recognized Object or Event, while differences drive the update.]
Recursive Estimation

System model:                  x*(k|k-1) = A(k-1)x(k-1|k-1) + B(k-1)u(k-1)
Measurement model:             y(k) = S(k)x(k) + D(k)u(k) + n(k)
Perspective projection model:  y*(k|k-1) = F(k)x*(k|k-1)
State update:                  x(k|k) = x*(k|k-1) + K(k)[y(k) - y*(k|k-1)]

where
x(k) = state of world at time k
y(k) = observed attribute image at time k
y*(k|k-1) = predicted attribute image at time k after the (k-1)th measurement
x(k-1|k-1) = estimated state at time k-1 after the (k-1)th measurement
x*(k|k-1) = predicted state at time k after the (k-1)th measurement
u(k) = control input at time k
A(k) = propagation of system state at time k
B(k) = effect of control input on system state at time k
F(k) = forward perspective projection at time k
S(k) = sensor transformation at time k
D(k) = effect of control on sensor data at time k
K(k) = confidence in model at time k
n(k) = sensor noise at time k

[Diagram: World Modeling (WM) performs the temporal projection of the state via A(k-1) and B(k-1), projects visible features through F(k) into the predicted attribute image y*(k|k-1), and computes dy*/dx* and K(k). Sensory Processing (SP) forms the error y(k) - y*(k|k-1) between the observed and predicted attribute images; the error, weighted by K(k) and mapped through the inverse perspective projection, updates the estimated state attributes and the symbolic entity frames (Entity Frame Name) in the knowledge database.]
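The recursive estimation equations can be exercised in scalar form. The sketch below assumes constant scalar coefficients a, b, f and a fixed gain k (the slide's A, B, F, K with the dy*/dx* factor folded into the gain), purely for illustration.

```cpp
#include <cassert>

// Scalar sketch of the recursive estimation loop:
//   predict:  x* = a*x + b*u          (system model)
//   project:  y* = f*x*               (perspective projection model)
//   update:   x  = x* + k*(y - y*)    (state update)
// Coefficients are illustrative constants, not a full Kalman filter.
struct Estimator {
    double a, b, f, k;  // state propagation, control effect,
                        // perspective projection, confidence gain
    double x;           // current state estimate x(k|k)

    double step(double u, double y) {
        double xPred = a * x + b * u;  // x*(k|k-1)
        double yPred = f * xPred;      // y*(k|k-1)
        x = xPred + k * (y - yPred);   // x(k|k)
        return x;
    }
};
```

With a = f = 1 and gain k = 0.5, repeated steps against a constant observation move the estimate halfway to the observation each cycle, which is the behavior the confidence factor K(k) controls.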
Recognition -- compare group with class attributes
Filtering -- track, estimate attributes, verify gestalt
Computation -- compute group attributes
Grouping -- gestalt hypothesis for grouping
Windowing -- attend to relevant region
Image Processing at Each Level
[Diagram: the processing pipeline repeated at each level. Driven by the task goal and priorities, BG selects entity classes from the KD's library of entity class frames, and WM supplies the entities of attention. Windowing (SP) sizes attention windows and masks irrelevant subentities to yield the relevant subentity image; grouping (SP) applies gestalt hypotheses to group subentities into a hypothesized entity image; computation (SP&WM) computes entity attributes; filtering (SP&WM) applies recursive estimation to produce hypothesized entity attributes; recognition (SP) compares those attributes with class attributes. Above a confidence threshold the hypothesis is confirmed, and WM stores the confirmed entity attributes, named entity frames, links, and a labeled entity image -- what entities look like and where they are. Below threshold the hypothesis is denied and a new (novel) entity is hypothesized.]
LADAR is a Critical Break-Through
[Images: Color Image; Range Image; Color overlaid on LADAR; High Resolution LADAR; LADAR Image coded for z; Foveal - Peripheral Vision]

Advances Needed in Sensors
• Most are developed for ATR or air reconnaissance
• Need sensors for driving on the ground: range and color, 1 - 500 meters, 20 frames/sec
• Need foveal / peripheral / wrap-around imaging, pan / tilt, neck
• Need saccades, stabilization, tracking
• Need inexpensive, rugged systems
• Need foliage penetration
• Need to measure load bearing properties of ground under tall grass, weeds, marsh, mud, snow, and water
World Modeling & Value Judgment
[Diagram: World Model functions and Value Judgment functions around the Knowledge Database (KD: state variables, entity lists, maps, plans). Sensory recognition answers "What Is?" through the WM update/predict loop; the Task Planner asks "What If?" through simulate, with VJ evaluating the simulated plans; the Task Executor issues commands while WM compares predictions with observations, passing correlations, differences, and errors to VJ for evaluation. Actions flow out to the world; observations and generalizations flow back into the KD.]
Not just an interface between perception and action
World Modeling
• Terrain representation (multiresolutional maps)
• Represent geometry, state, and symbolic information
• Link iconic and symbolic representations
• Segmentation and classification
• Immediate experience, short term, and long term memory
• Dynamic modeling (multiresolutional time)
Knowledge Database
Iconic -- signals, images, maps (arrays)
- Support communication, geometry, and navigation
- Have range and resolution in space and time
Symbolic -- objects, events, classes (abstract data structures)
- Support mathematics, logic, and linguistics
- Have vocabulary and ontology
Links -- relationships (pointers)
- Support syntax, grammar, and semantics
- Have direction and type
Entity Exemplar Frame:
NAME = Object#13
Type = Tank, U.S.
Model = M-1
Weapon = 90 mm cannon
Speed = 45 mph
Part-of = Blue force
Has-part1 = Turret
Has-part2 = Gun
Has-part3 = Body
Has-part4 = Wheels
Has-part5 = Dust cloud

Entity Exemplar Image:
IMAGE = Object#13 [side view, front view]
Knowledge Representation -- Iconic and Symbolic
[Diagram: iconic entity images (attribute images, e.g. intensity, color, dI/dt, dI/dx, dI/dy; list, surface, and object entity images) are linked to symbolic entity frames. Each surface frame (e.g. Surface Name = surf1, surf2, surf3) carries attributes (color, size, shape), state (position, velocity), class pointers (entity image, generic class, specific class), a parent link (is_a, belongs to an object), has_parts links (edges, surface patches), behavior, and value (worth to preserve, worth to acquire, worth to defend).]
ICONIC Images
[Table: per-pixel image planes linking the iconic to the symbolic.
Attribute images: red = rd, blue = bl, green = gr, brightness = I, xgrad = dI/dx, ygrad = dI/dy, tgrad = dI/dt, range = r, rxgrad = dr/dx, rygrad = dr/dy, rflow = dr/dt, xflow = dx/dt, yflow = dy/dt.
Entity images: the list, surface, object, and group entity to which the pixel belongs.
Class images: the generic class 1, generic class 2, generic class 3, and specific class to which the pixel belongs.
Value images: worth to acquire; cost/risk/worth to traverse; worth to defend; worth to destroy; worth to defeat.]
LADAR Provides 3-D Images
LADAR Generated Terrain Map
LADAR Derived Maps
Registration of LADAR Image with A Priori Map
MULTI-RESOLUTION MAPS
• 0.4 m grid, 50 m wide
• 4 m grid, 500 m wide
• 30 m grid terrain map
• Data flows up and down between the different maps
• Path planning occurs at each level
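The nested, vehicle-centered grids above can be sketched as a data structure. The struct and the coarse map's 3000 m width are illustrative assumptions (the slide gives only its 30 m grid); the sketch shows how a query point selects the finest map that still covers it.

```cpp
#include <cassert>
#include <cmath>

// Hypothetical sketch of the three map resolutions: 0.4 m grid / 50 m
// wide, 4 m grid / 500 m wide, and a 30 m grid terrain map (width here
// is an assumed 3000 m). Maps are centered on the vehicle.
struct GridMap {
    double cellSize;  // meters per cell
    double width;     // map width in meters, vehicle-centered

    int cellsPerSide() const {
        return static_cast<int>(std::lround(width / cellSize));
    }
    // True if a point (x, y) relative to the vehicle falls on this map.
    bool covers(double x, double y) const {
        double half = width / 2.0;
        return std::fabs(x) <= half && std::fabs(y) <= half;
    }
};

// Pick the finest map that still covers the point: planning at each
// level uses its own resolution, while data flows between the maps.
const GridMap& mapFor(double x, double y, const GridMap& fine,
                      const GridMap& mid, const GridMap& coarse) {
    if (fine.covers(x, y)) return fine;
    if (mid.covers(x, y)) return mid;
    return coarse;
}
```

Note that 50 m / 0.4 m and 500 m / 4 m both give 125 cells per side: each level sees roughly the same number of cells over a ten-times larger area, which keeps per-level planning cost roughly constant.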
SYMBOLIC ENTITY -- Frame or Object
[Diagram: an Entity Frame holds a NAME, attributes (shape, size, color, behavior), state (position, motion, observed states, expected states), and pointers (belongs-to, has-part1, has-part2, class1, class2).]
Model Based Perception
[Diagram: a driving scene processed bottom-up: observe pixel brightness, color, range; detect edges, lines, surfaces; group into objects; label objects; establish patterns (i.e., relationships between objects); transform into map coordinates (azimuth/elevation to a 50 m map); then focus attention, group features into entities, compute entity attributes, apply recursive estimation, compare with class prototypes, classify and label entities, link to symbolic data structures, analyze situations, and support driving behavior. On the 50 m map with own lane, right lane, on-coming lane, and far left lane:
Object#1 -- class = car, in right lane, range 20 m, bearing 13 deg, rel speed 5 m/s, tail lights off, ETA: pulling away
Object#2 -- class = HMMWV, in right lane, range 30 m, bearing 9 deg, speed 25 m/s, tail lights off, ETA: pulling away
Object#3 -- class = car, in oncoming lane, range 30 m, bearing -8 deg, rel speed -40 m/s, ETA .75 s (30 m @ 40 m/s = .75 s)
Object#4 -- class = truck, in oncoming lane, range 40 m, bearing -4 deg, rel speed -40 m/s
Object#5 -- class = tree, range 15 m, bearing 60 deg
Object#6 -- class = tree, range 30 m, bearing -42 deg
Group#1 -- class = on-coming traffic, in on-coming lane, obj 3 in front, obj 4 in back
Group#2 -- class = rt lane traffic, obj#2 in front, obj#1 in back
Group#3 -- class = trees
Own vehicle -- class = HMMWV, in own lane, speed 20 m/s, turn rate = 0, acceleration = 0]
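The ETA entries in the object frames above follow directly from range and relative speed. A minimal sketch, assuming the scene's sign convention (negative relative speed means approaching):

```cpp
#include <cassert>

// ETA = range / closing speed. Returns a negative sentinel when the
// object is holding or pulling away (rel speed >= 0), matching the
// "pulling away" entries in the object frames. Illustrative helper only.
double etaSeconds(double rangeMeters, double relSpeedMps) {
    if (relSpeedMps >= 0.0) return -1.0;  // no closing: no ETA
    return rangeMeters / -relSpeedMps;
}
```

For Object#3 this reproduces the slide's arithmetic: 30 m at a 40 m/s closing speed gives 0.75 s.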
Images, Geometric Entity Frames, and Entity Class Prototypes
[Diagram: attribute images and list, surface, and object entity images segment the scene into objects O1-O9 out toward the horizon (3, 5, 10, 20, 30, 50 meters). Each object's geometric entity frame (class, class1, class2, attributes, state, belongs_to, has_part1, has_part2, relationships) points to entity class prototypes, each with its own attributes: Road, Gravel Road, Grass, Grass_on_road, Tree, Obstacle, Off_road_drivable.]
SYMBOLIC ENTITY -- Frame or Object

NAME = entity_id (uncertainty)   // this is the frame address in the KD

// attributes -- characteristics that address the question What?
color = red, green, blue intensities
size = length, height, width dimensions
shape = curvature, moments, axes of symmetry, etc.

// state -- dynamic properties that address the question Where?
position = azimuth, elevation, range (uncertainty)
orientation = roll, pitch, yaw (uncertainty)
velocity = v-azimuth, v-elevation, v-range, v-roll, v-pitch, v-yaw (uncertainty)

// class -- pointers to the entity image and classes to which the entity belongs
entity image = pointer to the entity image in which the entity appears
generic class1 = pointer to the generic class1 exemplar
generic class2 = pointer to the generic class2 exemplar
generic class3 = pointer to the generic class3 exemplar

// value or worth of the entity
worth to preserve = value
worth to acquire = value
worth to defend = value
worth to defeat = value

// pointers that define parent-child relationships
belongs to = pointer to parent entity
has part1 = pointer to subentity1
has part2 = pointer to subentity2
has part3 = pointer to subentity3

// pointers that define situational relationships
on top of = pointer to entity below
beside-right = pointer to entity on right

// queues that store short-term state history and expected future states
short term memory = pointer to STM queue
short term expectations = pointer to STE queue

// functions that define behavior
behavior1 = responds-to
behavior2 = acts-like
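The frame layout above maps naturally onto a struct. The following is a hypothetical C++ rendering (field names follow the slide; KD pointers become indices, and the STM queue becomes a simple vector), not the NIST data definition.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical entity frame; -1 marks a null KD pointer.
struct EntityFrame {
    std::string name;                // entity_id, the frame address in the KD
    // attributes -- What?
    double color[3];                 // red, green, blue intensities
    double size[3];                  // length, height, width
    // state -- Where?
    double position[3];              // azimuth, elevation, range
    double velocity[3];              // v-azimuth, v-elevation, v-range
    // class pointers
    int entityImage = -1;            // index of the entity image
    int genericClass1 = -1;          // index of the class1 exemplar
    // value
    double worthToPreserve = 0.0;
    // parent-child relationships
    int belongsTo = -1;              // parent entity index
    std::vector<int> hasParts;       // subentity indices
    // stand-in for the short-term memory (STM) queue of past states
    std::vector<double> shortTermMemory;
};
```

Because every pointer field is an index into the shared knowledge database, the same frame can be reached from the symbolic side (by name) or from the iconic side (through the labeled entity image).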
Situation Assessment
[Example: symbolic events in time --
• Car turning left (position, velocity)
• Oncoming cars (position, velocity)
• Traffic signals (stop)
• Truck on own road (position, velocity)
• Own road edges (Old Georgetown Road, heading North)
• Intersecting road edges (Democracy Boulevard, to West)
• Self in lane 2 (position, velocity), intent (go straight)]
HISTORICAL TRACES and FUTURE PLANS
[Diagram: traces and plans across the echelons, with T=0 at the ACTUATOR DRIVE. Planning horizons: BATTALION ~20 hr (plan for the day / events of the day), PLATOON ~2 hr, SECTION ~10 min, VEHICLE ~50 sec, SUBSYSTEM ~5 sec, PRIMITIVE ~500 msec, SERVO 50 msec, with an output update interval of 5 msec. Command subgoal intervals step down accordingly: ~2 hr, ~10 min, ~50 sec, ~5 sec, ~500 msec. Looking backward, event integration intervals run from ~2 hr through ~5 min, ~50 sec, ~5 sec, and ~500 msec down to 50 msec, with short term memories of ~20 hr, ~2 hr, ~10 min, ~50 sec, ~5 sec, ~500 msec, and 50 msec, and a sensory sample interval of 5 msec.]
4D/RCS System Engineering Guidelines
• Software library and development tools
• Hardware design and testing experience
• Test and evaluation methods and procedures
• Integration and testing methodology
• Field experiments and operational testing results
How does 4D/RCS relate to other architectures?
• JTA -- complements
• C4ISR -- complements
• VTA -- complements with perception, modeling, behavior, goals, values
• JAUGS -- complements + perception, organic units, higher level intelligence
• RPA -- similar + perception, echelons, ground vehicles
• IEEE 1471-2000 -- complies + reference model
Summary

4D/RCS is a reference model architecture that is:
Open, Portable, Reliable, Intelligent, Mature

4D/RCS provides for:
• Many sensors, a rich world model
• High speed sensory processing
• Deliberative and reactive behavior
• Engineering tools and methodology
4D/RCS is Suitable for Standards
• Consistent with current military operations
• Consistent and standard software component, interfacing, and organization structures
• Open and scalable
• Comprehensive, scientific engineering guidelines
• Complements and expands from current scope
• Proven
• Mature
Standardization Approach (open to discussion)
• Standard interface to the current OE API standard
• Standard control node component shells
• Standard task and knowledge models, including mission statements, objects and situations, sensor and actuator specifications
• Standard inter-node interfaces
• Standard execution model, including timing and coordination
• Recommended engineering guidelines
• Serve as a roadmap for standards requirements
4D/RCS Interface Examples

PL-EX:

#define EX_EXECUTE_TYPE 2005

class EX_EXECUTE : public RCS_CMD_MSG
{
public:
    // Constructor
    EX_EXECUTE();
    // Update function
    void update(CMS *);
    // Place custom variables here.
    PLAN_FORMAT vehiclepathPlan[LENGTH];
};

EX-subordinate PL:

#define MOBILITY_GOTO_TYPE 1003

class MOBILITY_GOTO : public RCS_CMD_MSG
{
public:
    // Constructor
    MOBILITY_GOTO();
    // Update function
    void update(CMS *);
    // Place custom variables here.
    POSITION_FORMAT mobNextPosition;
};