> GPT-3 obviously knows this
GPT-3 doesn't "know" things in the sense that it has learned specific translations, maths or science etc.
GPT-3 is not a knowledge database or question answering system, but it can behave like one if prompted carefully. Sometimes it is hard to get specific responses. This appears to be one of those cases.
> Did I ask it in a wrong way or is GPT-3 unable to answer questions like this one?
GPT-3 doesn't really "answer questions"; it completes text, probabilistically fitting that text to the prompt. The fact that it gets a lot of Q&A, maths, or translation-based prompting correct is an interesting side effect of the strong language model. It is not clear where the gaps in its ability to perform secondary tasks like this lie, but it is very reasonable to expect some.
In general, you should not expect GPT-3 to automatically recognise a question and answer it as written. Some reasonable questions might look to it like the start of a novel or newspaper article, or, in your case, it might be treating the sentences as only loosely related, like a list of different translations. To reduce the flexibility in possible completions, it is normal to set some context in the first paragraph - a kind of meta-explanation of what the rest of the text is doing.
I managed to get it to work like this:
> **The following is a translation, with individual words explained.**
>
> **English: The man removes the can.**
>
> **German for "can" is** "Dose"
My prompts are in bold. I am sure plenty of variations on this wording will work for this specific case, but I don't think you will find a prompt that is 100% reliable for extracting arbitrary words from arbitrary sentences.
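If you are calling GPT-3 through the API rather than the Playground, the same idea applies: put the context-setting line at the top of the prompt and cut the completion off after one line. Here is a minimal sketch using the legacy `openai` Python bindings; the model name, temperature, and stop sequence are my assumptions, not values from the question:

```python
import openai  # legacy (pre-1.0) bindings, matching GPT-3-era Completion models

# Assumes OPENAI_API_KEY is set in the environment.
# The prompt mirrors the example above: a context-setting line first,
# then the partial sentence we want GPT-3 to complete with the German word.
prompt = (
    'The following is a translation, with individual words explained.\n'
    'English: The man removes the can.\n'
    'German for "can" is'
)

response = openai.Completion.create(
    model="text-davinci-002",  # assumed model; any GPT-3 completions model works
    prompt=prompt,
    max_tokens=5,       # the answer is a single word, so keep it short
    temperature=0.0,    # minimise randomness for a lookup-style completion
    stop=["\n"],        # stop at end of line so it doesn't keep "translating"
)

print(response["choices"][0]["text"].strip())  # e.g. '"Dose"'
```

The low temperature and the newline stop sequence are doing the same job as the context paragraph: they narrow the space of plausible continuations so the completion stays a single-word answer rather than drifting into further loosely related text.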