For questions about ensemble learning, which refers to machine learning techniques where multiple models (e.g. a neural network and a decision tree) are trained on the same problem and their predictions are combined. Bagging and boosting are two popular ensemble learning techniques.
Questions tagged [ensemble-learning]
23 questions
11
votes
2 answers
Do deep learning algorithms represent ensemble-based methods?
According to the Wikipedia article on deep learning:
Deep learning is a branch of machine learning based on a set of algorithms that attempt to model high-level abstractions in data by using a deep graph with multiple processing layers, composed of…

Erba Aitbayev
- 357
- 1
- 10
10
votes
1 answer
How can an ensemble be more accurate than the best base classifier in that ensemble?
BACKGROUND: Ensemble classifiers are said to reduce bias by taking an "average" of predictions of several base classifiers that comprise the ensemble. However, I am uncertain if this necessarily means that they can increase accuracy. My intuition…

Snehal Patel
- 912
- 1
- 1
- 25
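The intuition behind this question can be made concrete with a toy example (invented numbers, not from the question): if the base classifiers err on *different* examples, a majority vote can be correct everywhere even though no single base model is.

```python
# Minimal sketch: three base classifiers, each 2/3 accurate on a toy set,
# but erring on *different* examples. A majority vote over their
# predictions is then correct on every example, beating each base model.
from collections import Counter

y_true = [1, 1, 1]                 # toy labels
preds = [
    [0, 1, 1],                     # classifier 1 wrong on example 0
    [1, 0, 1],                     # classifier 2 wrong on example 1
    [1, 1, 0],                     # classifier 3 wrong on example 2
]

def majority_vote(columns):
    return [Counter(col).most_common(1)[0][0] for col in zip(*columns)]

def accuracy(p, y):
    return sum(a == b for a, b in zip(p, y)) / len(y)

ensemble = majority_vote(preds)
base_accs = [accuracy(p, y_true) for p in preds]
print(base_accs)                   # each base model: 2/3
print(accuracy(ensemble, y_true))  # ensemble: 1.0
```

The gain disappears if the base models all make the same mistakes, which is why diversity of errors matters as much as individual accuracy.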
4
votes
0 answers
When do two identical neural networks have uncorrelated errors?
In Chapter 9, section 9.1.6, Raul Rojas describes how committees of networks can reduce the prediction error by training N identical neural networks and averaging the results.
If $f_i$ are the functions approximated by the $N$ neural nets,…

EmmanuelMess
- 207
- 3
- 14
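The committee argument in the excerpt — with $f_i = f + e_i$ and zero-mean errors $e_i$, the average $(1/N)\sum_i f_i$ has error $(1/N)\sum_i e_i$ — depends entirely on the error correlation, which is what the question asks about. A deterministic toy sketch (made-up error values) showing the extreme, perfectly anti-correlated case:

```python
# Sketch of the committee-averaging argument (hypothetical numbers):
# if f_i(x) = f(x) + e_i with zero-mean errors e_i, the committee
# average has error (1/N) * sum e_i. With uncorrelated errors the
# expected squared error drops by roughly a factor of N; here the
# errors are perfectly anti-correlated, so they cancel entirely.
errors = [
    [ 1.0, -1.0,  2.0, -2.0],   # per-example errors of net 1
    [-1.0,  1.0, -2.0,  2.0],   # net 2: same magnitudes, opposite sign
]

def mse(e):
    return sum(v * v for v in e) / len(e)

committee_error = [sum(col) / len(errors) for col in zip(*errors)]

print([mse(e) for e in errors])   # each net: 2.5
print(mse(committee_error))       # committee: 0.0
```

At the other extreme, two truly identical networks have identical (perfectly correlated) errors and averaging gains nothing — the interesting regime is in between.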
4
votes
2 answers
When do the ensemble methods beat neural networks?
In many applications and domains, computer vision, natural language processing, image segmentation, and many other tasks, neural networks (with a certain architecture) are considered to be by far the most powerful machine learning…

spiridon_the_sun_rotator
- 2,454
- 8
- 16
2
votes
0 answers
Why don't ensembling, bagging and boosting help to improve the accuracy of a Naive Bayes classifier?
You might think to apply some classifier combination techniques like ensembling, bagging, and boosting, but these methods would not help. Actually, “ensembling, boosting, bagging” won’t help, since their purpose is to reduce variance. Naive Bayes has…

Sivaram Rasathurai
- 316
- 1
- 10
2
votes
1 answer
What is an architecture called in machine learning where a second model validates the first one?
I have a mix of two deep models, as follows:
if model A is YES --pass to B--> if model B is YES--> result = YES
if model A is NO ---> result = NO
So basically model B validates if A is saying YES. My models are actually the same, but trained on two…

Tina J
- 973
- 6
- 13
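The excerpt's flow can be sketched as a two-stage cascade (the model names below are placeholders, not the asker's actual models): B is only consulted when A says YES, and the final answer is YES only if both agree — logically an AND of the two decisions.

```python
# Minimal sketch of the described cascade: model B validates model A's
# positive predictions; A's negatives are final. Toy stand-in models.
def cascade(model_a, model_b, x):
    if not model_a(x):        # A says NO -> stop, result is NO
        return False
    return model_b(x)         # A said YES -> let B validate

model_a = lambda x: x > 0     # toy stand-ins for the two trained models
model_b = lambda x: x % 2 == 1

print(cascade(model_a, model_b, 3))   # True: both say YES
print(cascade(model_a, model_b, 2))   # False: B vetoes A
print(cascade(model_a, model_b, -1))  # False: A rejects, B never runs
```

This pattern is commonly called a cascade (as in cascade classifiers) or a two-stage verifier.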
2
votes
0 answers
How can we combine different deep learning models?
I know that ensembles can be made by combining sklearn models with a VotingClassifier, but is it possible to combine different deep learning models? Will I have to make something similar to a VotingClassifier?

Arnav Das
- 101
- 4
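One common answer to the question above is "soft voting" done by hand: average the per-class probability vectors each network outputs, then take the argmax. A minimal sketch with invented probability vectors (no actual deep models involved):

```python
# Soft voting over deep models: average each network's softmax output,
# then pick the class with the highest averaged probability.
def soft_vote(prob_lists):
    n = len(prob_lists)
    avg = [sum(ps) / n for ps in zip(*prob_lists)]
    return max(range(len(avg)), key=avg.__getitem__), avg

probs_net1 = [0.6, 0.3, 0.1]   # e.g. softmax output of network 1
probs_net2 = [0.2, 0.5, 0.3]
probs_net3 = [0.1, 0.7, 0.2]

label, avg = soft_vote([probs_net1, probs_net2, probs_net3])
print(label)   # 1: class 1 wins on the averaged probabilities
print(avg)     # approximately [0.3, 0.5, 0.2]
```

Because this only touches the output vectors, it works across frameworks — the networks being combined don't even need the same architecture, only the same label set.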
2
votes
2 answers
Are there ensemble methods for regression?
I have heard of ensemble methods, such as XGBoost, for binary or categorical machine learning models. However, does this exist for regression? If so, how are the weights for each model in the process of predictions determined?
I am looking to do…

niallmandal
- 211
- 1
- 6
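Ensembles do exist for regression — XGBoost itself supports regression objectives, and scikit-learn provides a VotingRegressor. A hand-rolled sketch of the weighting idea the question asks about, with made-up models and validation errors: a common heuristic is to weight each model inversely to its validation error, normalized to sum to 1.

```python
# Regression ensemble by weighted averaging (toy models, toy errors).
def ensemble_predict(models, weights, x):
    return sum(w * m(x) for m, w in zip(models, weights))

models = [lambda x: 2 * x, lambda x: 2 * x + 1, lambda x: 2 * x - 1]
val_errors = [1.0, 2.0, 2.0]                     # pretend validation MSEs
inv = [1 / e for e in val_errors]
weights = [v / sum(inv) for v in inv]            # [0.5, 0.25, 0.25]

print(weights)
print(ensemble_predict(models, weights, 3.0))    # 0.5*6 + 0.25*7 + 0.25*5 = 6.0
```

Uniform weights (a plain average) are a reasonable default when the models' validation errors are similar.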
1
vote
1 answer
How do I check that the combination of these models is good?
I've selected more than 10 discriminative (classification) models, each wrapped with a BaggingClassifier object, optimized with a GridSearchCV, and all of them placed within a VotingClassifier object.
Alone, they all bring around 70% accuracy, on a…

Miko Diko
- 177
- 1
- 2
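One hedged way to sanity-check a combination like the one described (not from the question itself): a vote only helps if the base models' errors differ, so measure pairwise disagreement between their held-out predictions. The prediction vectors below are invented for illustration.

```python
# Pairwise disagreement rate between classifiers' held-out predictions:
# near-zero disagreement means a model adds no diversity to the vote.
def disagreement(p, q):
    return sum(a != b for a, b in zip(p, q)) / len(p)

preds = {
    "model_1": [0, 1, 1, 0, 1],
    "model_2": [0, 1, 0, 0, 1],   # differs from model_1 on one example
    "model_3": [0, 1, 1, 0, 1],   # identical to model_1: redundant
}

pairs = [("model_1", "model_2"), ("model_1", "model_3"), ("model_2", "model_3")]
for a, b in pairs:
    print(a, b, disagreement(preds[a], preds[b]))
```

If all pairs disagree on almost nothing, the VotingClassifier can't be expected to beat its best member, whatever the individual accuracies are.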
1
vote
0 answers
What is the precise relation between Swarm Intelligence and Ensemble Methods?
I come from the machine learning side of AI, and have recently become more interested in the bio-inspired side of AI. Specifically I started reading about swarm intelligence and immediately started drawing analogies to ensemble methods in machine…

Jack Ding
- 11
- 1
1
vote
2 answers
GAN with multiple discriminators
I am looking for literature recommendations regarding GANs with multiple discriminators.
In particular, I am looking for examples where each discriminator has a slightly different learning objective, rather than learning on different data. My…

postnubilaphoebus
- 345
- 1
- 11
1
vote
0 answers
How to properly combine multiple readings/measurements?
In an AI application (for example, self-driving), there are usually many different reading devices/sensors to ensure the outcome is correct. More specifically, a self-driving car can use object tracking with cameras, road-integrated optic fiber,…

seermer
- 111
- 1
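A standard textbook recipe for the scalar version of this fusion problem (a sketch under simplifying assumptions — independent Gaussian noise — not the asker's actual sensor stack): weight each reading by the inverse of its noise variance.

```python
# Inverse-variance weighting of noisy scalar readings: the fused
# estimate is pulled toward the less noisy sensor. Numbers are made up.
def fuse(readings, variances):
    weights = [1 / v for v in variances]
    total = sum(weights)
    return sum(w * r for w, r in zip(weights, readings)) / total

readings = [10.0, 12.0]       # camera says 10 m, lidar says 12 m
variances = [4.0, 1.0]        # lidar is 4x less noisy

print(fuse(readings, variances))   # 11.6: pulled toward the better sensor
```

Full self-driving stacks use recursive versions of the same idea (e.g. Kalman-filter-style fusion) that also track how the uncertainties evolve over time.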
1
vote
0 answers
How to calculate uncertainty in Deep Ensembles for Reinforcement Learning?
Let's take the following example: I must predict the return (Q-values) of x state-action pairs using an ensemble of m models. Using NumPy, I could have the following for x = 5 and m = 3:
>>> predictions = np.random.rand(3, 1, 5)
[[[0.22668968…

HenDoNR
- 81
- 4
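Following the excerpt's shapes (m = 3 members, x = 5 state-action pairs, predictions of shape (3, 1, 5)), a common uncertainty estimate in deep ensembles is simply the per-pair mean and standard deviation across the ensemble axis. Deterministic toy values below in place of the excerpt's random ones:

```python
# Deep-ensemble uncertainty sketch: mean over members as the Q estimate,
# std over members as the (epistemic) uncertainty.
import numpy as np

predictions = np.array([
    [[1.0, 2.0, 3.0, 4.0, 5.0]],
    [[1.0, 2.0, 3.0, 4.0, 5.0]],
    [[4.0, 5.0, 6.0, 7.0, 8.0]],
])                                  # shape (3, 1, 5): m=3, x=5

q_mean = predictions.mean(axis=0)   # shape (1, 5): ensemble Q estimate
q_std = predictions.std(axis=0)     # shape (1, 5): member disagreement

print(q_mean)   # [[2. 3. 4. 5. 6.]]
print(q_std)    # equal spread for every pair with these toy values
```

High `q_std` for a pair signals that the members disagree there, which is typically read as high epistemic uncertainty.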
1
vote
1 answer
How do I take the correct classification predictions of an ML algorithm (e.g. random forest/neural net) and sort the instances in each category?
I am trying to sort the instances within each of 5 classification categories in a dataset that has been put through both a random forest classifier and a neural network with 99% accuracy on each.
Essentially what I am trying to do is stack a sorting…

Rocko
- 11
- 2
1
vote
1 answer
In ensemble learning, does accuracy increase with the number of models you combine?
I want to make predictions with the same model on multivariate time series data in a time series prediction problem.
Example:
pa = model predict result(a)
pb = model predict result(b)
pc = model predict result(c)
...
model ensemble([pa, pb, pc,...]) ->…

KYH
- 17
- 4
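A hedged sketch of the setup in this last question (invented numbers, not the asker's data): one model's per-series predictions pa, pb, pc, … are averaged, and we watch how the error of the combination changes as more predictions are added. The answer to the title is "not necessarily" — the gain depends on how correlated the prediction errors are, and it can plateau or even reverse.

```python
# Averaging the first k of several predictions of the same target:
# the combined error is not guaranteed to shrink monotonically with k.
true_value = 10.0
preds = [11.0, 9.5, 10.4, 10.3]     # pa, pb, pc, ... (made up)

def ensemble_error(k):
    avg = sum(preds[:k]) / k        # combine the first k predictions
    return abs(avg - true_value)

for k in range(1, len(preds) + 1):
    print(k, ensemble_error(k))     # improves from k=1 to k=2, then plateaus
```

With these numbers the error drops from 1.0 (one prediction) to 0.25 (two), then creeps back up as partly redundant predictions are added — more members only help while they contribute errors that partially cancel.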