For sophisticated robots, it may be best to accept and reason with noisy sensor data, instead of assuming complete observation and then dealing with the effects of that assumption. We model these uncertainties with a formalism called the partially observable Markov decision process (POMDP). The planner developed in this paper is implemented in Golog, a theoretically and practically 'proven' agent programming language. A working implementation of our POMDP planner exists.
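The paper itself gives the details; as background, the core operation a POMDP agent performs when reasoning with noisy sensors is the belief-state update: after taking action a and observing o, the new belief satisfies b'(s') ∝ O(o | s', a) · Σ_s T(s' | s, a) · b(s). The sketch below is a minimal, hypothetical illustration of that update (the two-state model, the `listen` action, and the 0.85 observation accuracy are assumptions for illustration, not taken from the paper):

```python
def belief_update(b, T, O, a, o):
    """Return the updated belief after executing action a and observing o.

    b: dict mapping state -> probability (current belief)
    T: dict mapping (s, a) -> {s': probability}   (transition model)
    O: dict mapping (s', a) -> {o: probability}   (observation model)
    """
    new_b = {}
    for s2 in b:
        # Prediction step: probability of landing in s2 under action a.
        pred = sum(T[(s, a)][s2] * b[s] for s in b)
        # Correction step: weight by the likelihood of the observation.
        new_b[s2] = O[(s2, a)][o] * pred
    norm = sum(new_b.values())
    return {s: p / norm for s, p in new_b.items()}

# Hypothetical two-state example: a "listen" action leaves the hidden
# state unchanged; the observation matches the true state 85% of the time.
states = ("left", "right")
T = {(s, "listen"): {s2: 1.0 if s2 == s else 0.0 for s2 in states}
     for s in states}
O = {(s, "listen"): {o: 0.85 if o == s else 0.15 for o in states}
     for s in states}

b0 = {"left": 0.5, "right": 0.5}
b1 = belief_update(b0, T, O, "listen", "left")  # belief shifts toward "left"
```

Starting from a uniform belief, one noisy observation shifts the belief to 0.85/0.15, which is exactly the kind of gradual, probabilistic state estimation that replaces the complete-observation assumption.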
Reference:
Rens, G., Ferrein, A., & van der Poel, E. (2008). Extending DTGolog to deal with POMDPs. In Proceedings of the 19th Annual Symposium of the Pattern Recognition Association of South Africa (PRASA 2008), Cape Town, South Africa, 27-28 November 2008, pp. 49-54. http://hdl.handle.net/10204/2972