Questions tagged [1d-convolution]

For questions about 1D convolutions in convolutional neural networks.

6 questions
2 votes · 2 answers

What does 'channel' mean in the case of a 1D convolution?

While reading about 1D convolutions in PyTorch, I encountered the concept of channels: in_channels (int) – number of channels in the input image; out_channels (int) – number of channels produced by the convolution. Although I encountered this…
hanugm
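In case it helps frame the question, here is a minimal sketch of PyTorch's nn.Conv1d; the input sizes (2 channels, 100 steps) and the 16 filters are made up for illustration:

```python
import torch
import torch.nn as nn

# A stereo audio clip: 2 input channels, 100 time steps.
# Conv1d expects input shaped (batch, in_channels, length).
x = torch.randn(1, 2, 100)

# in_channels must match the input (2 here); out_channels (16) is the
# number of learned filters. Each filter spans all input channels and
# slides along the length axis only.
conv = nn.Conv1d(in_channels=2, out_channels=16, kernel_size=3)

y = conv(x)
print(y.shape)  # torch.Size([1, 16, 98]): length shrinks by kernel_size - 1
```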
1 vote · 0 answers

How is batch data processed in a 1D convolution layer?

Suppose I have time-series data stored in a matrix $\mathbf{X} \in \mathbb{R}^{N \times d}$. The sequence length is $N$ and $d$ is the number of features (I have $d$ series). Say I have a batch of $B$ samples. So I have one batch of data in the…
poglhar
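A small sketch of how such a batch would be fed to PyTorch's nn.Conv1d (the sizes $B=4$, $N=50$, $d=8$ and the 16 output channels are hypothetical): PyTorch treats the $d$ features as channels, so each $\mathbf{X}$ is transposed to shape $(d, N)$, and every sample in the batch is convolved independently with the same filters.

```python
import torch
import torch.nn as nn

B, N, d = 4, 50, 8                 # batch size, sequence length, features
X = torch.randn(B, N, d)           # a batch of R^{N x d} matrices

# Conv1d treats the d features as channels and convolves along the
# length axis, so the expected layout is (B, d, N).
conv = nn.Conv1d(in_channels=d, out_channels=16, kernel_size=5)
y = conv(X.transpose(1, 2))        # -> (B, 16, N - 4)

# The same kernels are applied to each sample; batching only vectorizes.
y0 = conv(X[0].T.unsqueeze(0))     # convolve sample 0 alone
print(torch.allclose(y[0], y0[0], atol=1e-6))  # True
```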
1 vote · 1 answer

Keras 1D CNN always predicts the same result even though accuracy is high on the training set

The validation accuracy of my 1D CNN is stuck at 0.5, and that's because I'm always getting the same prediction out of a balanced data set. At the same time, my training accuracy keeps increasing and the loss keeps decreasing, as intended. Strangely, if I do…
1 vote · 0 answers

Why does the number of channels in PointNet increase as we go deeper?

For example, in PointNet, you see 1D convolutions with the following channel progression: 64 -> 128 -> 1024. Why not, e.g., 64 -> 1024 -> 1024 or 1024 -> 1024 -> 1024?
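For context, a sketch in the spirit of PointNet's shared MLP (an illustrative reconstruction, not the original implementation): with kernel_size=1, each "convolution" is the same linear layer applied to every point independently, and 64 -> 128 -> 1024 are the channel widths.

```python
import torch
import torch.nn as nn

# Per-point feature extractor in the PointNet style: kernel_size=1
# convolutions apply one shared MLP to every point, widening the
# feature channels 64 -> 128 -> 1024.
mlp = nn.Sequential(
    nn.Conv1d(64, 128, kernel_size=1), nn.ReLU(),
    nn.Conv1d(128, 1024, kernel_size=1), nn.ReLU(),
)

points = torch.randn(1, 64, 2048)       # (batch, channels, num_points)
feats = mlp(points)                     # (1, 1024, 2048)

# A global max-pool over points then yields a single 1024-d shape
# descriptor, which is where the wide final layer pays off.
global_feat = feats.max(dim=2).values   # (1, 1024)
print(global_feat.shape)
```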
0 votes · 2 answers

What do people refer to when they use the word 'dimensionality' in the context of convolutional layers?

In practical applications, we generally talk about three types of convolution layers: 1-dimensional convolution, 2-dimensional convolution, and 3-dimensional convolution. Most popular packages like PyTorch, Keras, etc., provide Conv1d, Conv2d, and…
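A short sketch (assuming PyTorch; the tensor sizes are arbitrary) that makes the naming concrete: the 1d/2d/3d suffix counts the axes the kernel slides along, not the rank of the input tensor.

```python
import torch
import torch.nn as nn

# The suffix counts the spatial axes the kernel moves along; batch and
# channel axes come on top, so a "1D" input is actually a 3-D tensor.
x1 = torch.randn(1, 3, 100)           # (batch, channels, length)
x2 = torch.randn(1, 3, 32, 32)        # (batch, channels, height, width)
x3 = torch.randn(1, 3, 16, 32, 32)    # (batch, channels, depth, height, width)

print(nn.Conv1d(3, 8, 3)(x1).shape)   # slides along 1 axis -> (1, 8, 98)
print(nn.Conv2d(3, 8, 3)(x2).shape)   # slides along 2 axes -> (1, 8, 30, 30)
print(nn.Conv3d(3, 8, 3)(x3).shape)   # slides along 3 axes -> (1, 8, 14, 30, 30)
```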
0 votes · 1 answer

Why would adding all the possible embeddings be "worse" than using 1D-convolutions?

Suppose we are using word2vec and have embeddings of individual words $w_1, \dots, w_{10}$. Let's say we wanted to analyze $2$ grams or $3$ grams. Why would adding all the possible embeddings, $\binom{10}{2}$ or $\binom{10}{3}$, be "worse" than…
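A small sketch of the contrast (the 10-word sentence and 300-dimensional embeddings are hypothetical): summing all $\binom{10}{2} = 45$ pairs is order-agnostic and has no learned weights, while a Conv1d with kernel size 2 looks only at the 9 adjacent bigrams through trainable filters.

```python
import torch
import torch.nn as nn
from itertools import combinations

emb_dim, n_words = 300, 10
words = torch.randn(n_words, emb_dim)        # embeddings w_1, ..., w_10

# "Adding all possible embeddings": one vector per unordered pair,
# C(10, 2) = 45 of them, regardless of word order or adjacency.
pair_sums = torch.stack([words[i] + words[j]
                         for i, j in combinations(range(n_words), 2)])
print(pair_sums.shape)                       # (45, 300)

# A 1D convolution instead learns a filter over *adjacent* bigrams:
# only 9 windows, with trainable weights rather than a fixed sum.
conv = nn.Conv1d(emb_dim, 128, kernel_size=2)
bigram_feats = conv(words.T.unsqueeze(0))    # (1, 128, 9)
print(bigram_feats.shape)
```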