Title: Speeding up Online POMDP planning - unification of observation branches by belief-state compression via expected feature values
Author: Rens, G
Type: Conference Presentation
Date issued: 2015-01
Date accessioned/available: 2016-02-23
Language: en
ISBN: 978-989-758-074-1
Conference: 7th International Conference on Agents and Artificial Intelligence (ICAART) 2015, Lisbon Marriott Hotel, Portugal, 10-12 January 2015
Publisher link: http://www.scitepress.org/DigitalLibrary/ProceedingsDetails.aspx?ID=+mGlly8Sp00=&t=1
URI: http://hdl.handle.net/10204/8409
Note: Due to copyright restrictions, the attached PDF file contains only the abstract of the full-text item. For access to the full text, please consult the publisher's website.

Abstract:
A novel algorithm to speed up online planning in partially observable Markov decision processes (POMDPs) is introduced. I propose a method for compressing nodes in belief-decision trees while planning occurs. Whereas belief-decision trees normally branch on both actions and observations, with my method they branch only on actions. This is achieved by unifying the branches otherwise required by the nondeterminism of observations. The method is based on the expected values of domain features. The new algorithm is experimentally compared with three other online POMDP algorithms, and it outperforms them on the given test domain.

Keywords: Online POMDP planning; POMDP; Partially Observable Markov Decision Processes; Heuristic; Optimization; Belief-state Compression; Expected Feature Values

Citations:
APA: Rens, G. (2015). Speeding up Online POMDP planning - unification of observation branches by belief-state compression via expected feature values. Scitepress Digital Library. http://hdl.handle.net/10204/8409
Chicago: Rens, G. "Speeding up Online POMDP planning - unification of observation branches by belief-state compression via expected feature values." (2015): http://hdl.handle.net/10204/8409
Vancouver: Rens G. Speeding up Online POMDP planning - unification of observation branches by belief-state compression via expected feature values; Scitepress Digital Library; 2015. http://hdl.handle.net/10204/8409.

RIS export:
TY - Conference Presentation
AU - Rens, G
AB - A novel algorithm to speed up online planning in partially observable Markov decision processes (POMDPs) is introduced. I propose a method for compressing nodes in belief-decision trees while planning occurs. Whereas belief-decision trees normally branch on both actions and observations, with my method they branch only on actions. This is achieved by unifying the branches otherwise required by the nondeterminism of observations. The method is based on the expected values of domain features. The new algorithm is experimentally compared with three other online POMDP algorithms, and it outperforms them on the given test domain.
DA - 2015-01
DB - ResearchSpace
DP - CSIR
KW - Online POMDP planning
KW - POMDP
KW - Partially Observable Markov Decision Processes
KW - Heuristic
KW - Optimization
KW - Belief-state Compression
KW - Expected Feature Values
LK - https://researchspace.csir.co.za
PY - 2015
SM - 978-989-758-074-1
T1 - Speeding up Online POMDP planning - unification of observation branches by belief-state compression via expected feature values
TI - Speeding up Online POMDP planning - unification of observation branches by belief-state compression via expected feature values
UR - http://hdl.handle.net/10204/8409
ER -
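For readers unfamiliar with the idea in the abstract, the following is a minimal sketch of belief-state compression via expected feature values and of unifying observation branches. It is one possible reading of the technique, not the paper's actual implementation: all names (belief_update, expected_feature_values, unified_child_features), the tensor layouts T[s, a, s'] and O[a, s', z], and the toy numbers are illustrative assumptions.

    import numpy as np

    def belief_update(b, a, z, T, O):
        """One Bayes-filter step: b'(s') is proportional to
        O[a, s', z] * sum_s T[s, a, s'] * b(s).
        Returns the normalised posterior and P(z | b, a)."""
        pred = b @ T[:, a, :]          # predicted state distribution after action a
        post = O[a, :, z] * pred       # weight by observation likelihood
        p_z = post.sum()               # probability of observing z from belief b
        return (post / p_z if p_z > 0 else post), p_z

    def expected_feature_values(b, features):
        """Compress a belief b into one vector of expected feature values:
        phi_k = sum_s b(s) * features[s, k]."""
        return b @ features

    def unified_child_features(b, a, T, O, features):
        """Rather than creating one child node per observation, form a single
        compressed child: the P(z | b, a)-weighted average of the
        per-observation children's expected feature vectors (a sketch of
        'unifying observation branches')."""
        phi = np.zeros(features.shape[1])
        for z in range(O.shape[2]):
            b_z, p_z = belief_update(b, a, z, T, O)
            phi += p_z * expected_feature_values(b_z, features)
        return phi

    # Toy 2-state, 2-action, 2-observation POMDP (made-up numbers).
    T = np.array([[[0.9, 0.1], [0.2, 0.8]],
                  [[0.5, 0.5], [0.5, 0.5]]])   # T[s, a, s']
    O = np.array([[[0.8, 0.2], [0.3, 0.7]],
                  [[0.6, 0.4], [0.4, 0.6]]])   # O[a, s', z]
    features = np.array([[0.0], [1.0]])        # one scalar feature per state
    b0 = np.array([0.5, 0.5])
    print(unified_child_features(b0, a=0, T=T, O=O, features=features))

Under these assumptions, each planning-tree node stores a single expected-feature vector instead of a set of per-observation beliefs, so the tree branches only on actions, which is the source of the speed-up claimed in the abstract.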