Questions tagged [bayesian-probability]

Questions in this tag should be about the Bayesian approach to probability theory and its relevance to AI-related issues.

Bayesian probability is an interpretation of the concept of probability, in which, instead of frequency or propensity of some phenomenon, probability is interpreted as reasonable expectation representing a state of knowledge or as quantification of a personal belief. The Bayesian interpretation of probability can be seen as an extension of propositional logic that enables reasoning with hypotheses.
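The "quantification of a personal belief" reading above can be made concrete with a single Bayes'-rule update, where a prior degree of belief in a hypothesis is revised after seeing evidence. A minimal sketch; the prior and likelihood values are arbitrary illustrative numbers:

```python
# Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E), read as updating a
# degree of belief in hypothesis H after observing evidence E.
# All numeric values below are made up for illustration.

def bayes_update(prior, likelihood, likelihood_given_not_h):
    """Return the posterior belief P(H|E) for a binary hypothesis."""
    # Total probability of the evidence under both hypotheses
    evidence = likelihood * prior + likelihood_given_not_h * (1 - prior)
    return likelihood * prior / evidence

posterior = bayes_update(prior=0.3, likelihood=0.9, likelihood_given_not_h=0.2)
print(round(posterior, 4))  # -> 0.6585, belief raised from 0.3 by the evidence
```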

9 questions
7 votes · 1 answer

How does the Dempster-Shafer theory differ from Bayesian reasoning?

How does the Dempster-Shafer theory differ from Bayesian reasoning? How do these two methods handle uncertainty and compute posterior distributions?
4 votes · 0 answers

How can I draw a Bayesian network for this problem with birds?

I am working on the following problem to gain an understanding of Bayesian networks and I need help drawing it: Birds frequently appear in the tree outside of your window in the morning and evening; these include finches, cardinals and robins.…
4 votes · 1 answer

What is the relationship between fuzzy logic and objective Bayesian probability?

I understand that fuzzy logic is a variant of formal logic in which, instead of just 0 or 1, a given sentence may have a truth value in the $[0, 1]$ interval. I also understand that logical probability (objective Bayesian) understands probability as an…
olinarr
1 vote · 1 answer

Why is the E step in the expectation-maximisation algorithm called so?

The E step of the EM algorithm asks us to set the variational distribution in the lower bound equal to the posterior probability of the latent variable, given the data points and parameters. Clearly we are not taking any expectations here, so why…
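For context, what the E step computes in practice are the responsibilities $q(z) = P(z \mid x, \theta)$; this choice of $q$ is what defines the *expected* complete-data log-likelihood maximised in the M step, hence the name. A minimal sketch for a two-component 1-D Gaussian mixture, with made-up parameter values:

```python
import math

# E step as the excerpt describes it: set q(z) to the posterior of the
# latent component indicator given the data point and current parameters.
# Mixture weights, means, and standard deviations are illustrative.

def normal_pdf(x, mu, sigma):
    """Density of a 1-D Gaussian at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def e_step(x, weights, mus, sigmas):
    """Responsibilities q(z = k | x) for each mixture component k."""
    joint = [w * normal_pdf(x, m, s) for w, m, s in zip(weights, mus, sigmas)]
    total = sum(joint)  # evidence P(x) under the current parameters
    return [j / total for j in joint]

resp = e_step(x=1.0, weights=[0.5, 0.5], mus=[0.0, 2.0], sigmas=[1.0, 1.0])
print(resp)  # x = 1.0 is equidistant from both means, so each gets 0.5
```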
1 vote · 1 answer

Understanding how to calculate $P(x|c_k)$ for the Bernoulli naïve Bayes classifier

I'm looking at the Bernoulli naïve Bayes classifier on Wikipedia, and I understand Bayes' theorem along with Gaussian naïve Bayes. However, I don't understand how $P(x|c_k)$ is calculated. The Wikipedia page says it's calculated as…
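For reference, the Bernoulli naïve Bayes likelihood treats each binary feature as an independent coin per class: $P(x \mid c_k) = \prod_i p_{ki}^{x_i}(1 - p_{ki})^{1 - x_i}$, where $p_{ki}$ is the probability that feature $i$ is present given class $c_k$. A minimal sketch; the feature vector and parameters are made up:

```python
import math

def bernoulli_nb_likelihood(x, p):
    """P(x | c_k) for Bernoulli naive Bayes.

    x: binary feature vector for one example.
    p: per-feature probabilities p_ki = P(feature i present | class c_k).
    Each feature contributes p_ki if present, (1 - p_ki) if absent.
    """
    return math.prod(pi if xi else (1 - pi) for xi, pi in zip(x, p))

# Illustrative values: features 1 and 3 present, feature 2 absent
print(bernoulli_nb_likelihood([1, 0, 1], [0.8, 0.1, 0.5]))  # 0.8 * 0.9 * 0.5 = 0.36
```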
1 vote · 1 answer

How does maximum a posteriori (MAP) estimation choose a distribution?

I was learning about the maximum a posteriori probability (MAP) estimation for machine learning and I found a nice short video that essentially explained it as finding a distribution and tweaking the parameters to fit the observed data in a way that…
user8714896
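It may help to note that MAP does not choose the distribution family itself: the family is fixed by the model, and MAP selects the single parameter value maximising posterior ∝ likelihood × prior. A small sketch for a Bernoulli likelihood with a conjugate Beta prior, whose posterior mode has a closed form; the counts and hyperparameters are illustrative:

```python
# MAP estimate of a Bernoulli parameter theta under a Beta(a, b) prior.
# The posterior is Beta(heads + a, tails + b); MAP is its mode.
# All numbers below are made up for illustration.

def map_bernoulli(heads, tails, a=2.0, b=2.0):
    """Posterior mode of theta given Bernoulli data and a Beta(a, b) prior."""
    return (heads + a - 1) / (heads + tails + a + b - 2)

# With a Beta(2, 2) prior the estimate is pulled toward 0.5
# relative to the maximum-likelihood estimate 7/10.
print(map_bernoulli(heads=7, tails=3))  # (7 + 1) / (10 + 2) = 0.666...
```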
0 votes · 0 answers

Methods for a sequential decision optimization problem with a nonlinear Bayesian reward function

I am trying to work out whether there are other methods I am not aware of that could be beneficial in my problem context. Being inspired by the optimal-experimental-design and RL communities, I have a sense there are. To…
0 votes · 0 answers

What makes Sequential Bayesian Filtering and Smoothing tractable?

I'm currently diving into the Bayesian world and I find it pretty fascinating. I've so far understood that applying Bayes' rule, i.e. $$\text{posterior} = \frac{\text{likelihood}\times \text{prior}}{\text{evidence}},$$ is most of the time…
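One standard answer to the tractability question is that the filtering recursion stays closed-form in special cases, e.g. a finite state space (discrete Bayes filter) or a linear-Gaussian model (Kalman filter). A minimal sketch of one predict/update cycle for a finite state space; the transition and observation numbers are made up:

```python
# One predict/update cycle of a discrete-state Bayes filter. With a finite
# state space the posterior is just a small probability vector, which is
# what keeps the recursion tractable. All matrices below are illustrative.

def bayes_filter_step(belief, transition, likelihood):
    """belief[i]       = prior P(x_{t-1} = i)
    transition[i][j]   = P(x_t = j | x_{t-1} = i)
    likelihood[j]      = P(z_t | x_t = j)
    Returns the posterior P(x_t | z_{1:t})."""
    n = len(likelihood)
    # Predict: P(x_t = j) = sum_i P(x_t = j | x_{t-1} = i) P(x_{t-1} = i)
    predicted = [sum(belief[i] * transition[i][j] for i in range(len(belief)))
                 for j in range(n)]
    # Update: posterior proportional to likelihood * prior; evidence normalises
    unnorm = [likelihood[j] * predicted[j] for j in range(n)]
    evidence = sum(unnorm)
    return [u / evidence for u in unnorm]

post = bayes_filter_step(belief=[0.5, 0.5],
                         transition=[[0.9, 0.1], [0.2, 0.8]],
                         likelihood=[0.7, 0.1])
print(post)  # observation favours state 0
```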
-1 votes · 1 answer

Given A and B, C are independent of each other. Given A, B and C, D and E are independent of each other. What is the minimal number of parameters?

Assuming all variables $A, B, C, D,$ and $E$ are binary random variables, I came up with the Bayes net $D \rightarrow B \rightarrow A \leftarrow C \leftarrow E$, which I think has the minimal number of parameters, 10. However, the given choices are…
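As a sanity check on the count of 10: with binary variables, each node's conditional probability table needs one free parameter per configuration of its parents, i.e. $2^{\#\text{parents}}$ parameters. A quick sketch for the network from the question:

```python
# Free-parameter count for a Bayes net over binary variables: each node
# needs 2^(number of parents) parameters, one P(node = 1 | parent config)
# per parent configuration. Structure taken from the question's network
# D -> B -> A <- C <- E.

parents = {"A": ["B", "C"], "B": ["D"], "C": ["E"], "D": [], "E": []}
params = sum(2 ** len(p) for p in parents.values())
print(params)  # 4 + 2 + 2 + 1 + 1 = 10
```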