

Artificial Intelligence: Bayesian Networks (reference version)


Graphical Models
- If no assumption of independence is made, then an exponential number of parameters must be estimated for sound probabilistic inference. No realistic amount of training data is sufficient to estimate so many parameters.
- If a blanket assumption of conditional independence is made, efficient training and inference are possible, but such a strong assumption is rarely warranted.
- Graphical models use a graph over the random variables to encode an intermediate set of independence assumptions, giving tractable training and inference without an unwarranted blanket assumption.

Naïve Bayes as a Bayes Net
- Naïve Bayes is a simple Bayes net: Y → X1, X2, …, Xn.
- Priors P(Y) and conditionals P(Xi | Y) for Naïve Bayes supply the network's parameters.

Combined Patterns of Inference
- Diagnostic and intercausal: P(Burglary | JohnCalls ∧ ¬Earthquake)

Collective Classification [Craven, 1998]
[Figure: example network of linked web pages with classes Faculty, Grad Student, Research Project, Other]
- Traditional learning methods assume that the objects to be classified are independent (the first "i" in the i.i.d. assumption).
- In structured data, the class of an entity can be influenced by the classes of related entities.
- Need to assign classes to all objects simultaneously to produce the most probable globally consistent interpretation.

Logical AI Paradigm
- Represents knowledge and data in a binary symbolic logic such as FOPC.
  + Rich representation that handles arbitrary sets of objects, with properties, relations, quantifiers, etc.
  − Unable to handle uncertain knowledge and probabilistic reasoning.

Probabilistic AI Paradigm
- Represents knowledge and data as a fixed set of random variables with a joint probability distribution.
  + Handles uncertain knowledge and probabilistic reasoning.
  − Unable to handle arbitrary sets of objects, with properties, relations, quantifiers, etc.

Statistical Relational Models
- Integrate methods from predicate logic (or relational databases) and probabilistic graphical models to handle structured, multi-relational data:
  – Probabilistic Relational Models (PRMs)
  – Stochastic Logic Programs (SLPs)
  – Bayesian Logic Programs (BLPs)
  – Relational Markov Networks (RMNs)
  – Markov Logic Networks (MLNs)
  – Other TLAs

Conclusions
- Bayesian learning methods are firmly based on probability theory and exploit advanced methods developed in statistics.
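The combined diagnostic and intercausal query P(Burglary | JohnCalls ∧ ¬Earthquake) can be sketched with inference by enumeration. This is a minimal illustration, not the lecture's own code; the CPT numbers are the commonly used textbook figures for the burglary-alarm network (Russell & Norvig), which the slides may or may not match.

```python
from itertools import product

# CPTs for the classic burglary-alarm network (textbook numbers; assumption).
P_B = {True: 0.001, False: 0.999}                     # P(Burglary)
P_E = {True: 0.002, False: 0.998}                     # P(Earthquake)
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}    # P(Alarm=true | B, E)
P_J = {True: 0.90, False: 0.05}                       # P(JohnCalls=true | A)
P_M = {True: 0.70, False: 0.01}                       # P(MaryCalls=true | A)

def joint(b, e, a, j, m):
    """Full joint probability from the network's chain-rule factorization."""
    p = P_B[b] * P_E[e]
    p *= P_A[(b, e)] if a else 1 - P_A[(b, e)]
    p *= P_J[a] if j else 1 - P_J[a]
    p *= P_M[a] if m else 1 - P_M[a]
    return p

def query_burglary(j, e):
    """P(Burglary | JohnCalls=j, Earthquake=e), summing out Alarm and MaryCalls."""
    dist = {}
    for b in (True, False):
        dist[b] = sum(joint(b, e, a, j, m)
                      for a, m in product((True, False), repeat=2))
    return dist[True] / (dist[True] + dist[False])

print(query_burglary(j=True, e=False))  # ≈ 0.016
```

Note how evidence that John called raises the burglary posterior well above its 0.001 prior, while ruling out an earthquake removes the competing explanation.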
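The point that Naïve Bayes is just a two-layer Bayes net (class Y as the lone parent of every feature Xi) can be made concrete: the joint factorizes as P(Y) · Π P(Xi | Y), and classification normalizes that product over classes. The spam/ham labels and probabilities below are toy values invented for illustration, not taken from the slides.

```python
# Naïve Bayes as a Bayes net: parameters are the prior P(Y) and one
# conditional P(Xi | Y) per feature (toy numbers, for illustration only).
prior = {"spam": 0.3, "ham": 0.7}                      # P(Y)
cond = {                                               # P(Xi=present | Y)
    "spam": {"offer": 0.60, "meeting": 0.10},
    "ham":  {"offer": 0.05, "meeting": 0.40},
}

def posterior(features):
    """P(Y | X) ∝ P(Y) * Π P(Xi | Y), normalized over the classes."""
    scores = {}
    for y in prior:
        p = prior[y]
        for word, present in features.items():
            p *= cond[y][word] if present else 1 - cond[y][word]
        scores[y] = p
    z = sum(scores.values())
    return {y: p / z for y, p in scores.items()}

print(posterior({"offer": True, "meeting": False}))
```

Because each Xi has only Y as a parent, the model needs O(n) parameters instead of the exponential number a full joint would require.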
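The parameter-count argument behind graphical models can be checked with a line of arithmetic: a full joint over n binary variables needs 2^n − 1 independent numbers, while a Bayes net needs one CPT row per parent configuration. The fan-in bound k below is an illustrative assumption, not a figure from the slides.

```python
# Parameter counts: full joint vs. a Bayes net with bounded fan-in.
n = 20                      # number of binary random variables
full_joint = 2**n - 1       # independent entries in the full joint table

k = 3                       # assumed maximum number of parents per node
bayes_net_max = n * 2**k    # upper bound on CPT rows across the network

print(full_joint, bayes_net_max)
```

Even for modest n, the factored representation is smaller by orders of magnitude, which is exactly why estimating it from realistic amounts of data becomes feasible.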