Cross Validated is a question-and-answer site. I think it's worth asking, "How well does this website serve people who ask (on-topic, answerable) questions?"
To answer this question in a quantifiable way, we can look at the gap between the total number of (non-closed) questions asked each month and the number of (non-closed) questions that have answers. I have a query that does this. Behold! The chasm between the number of questions asked in a month and the number of answered questions yawns!
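For reference, a query along these lines might look like the following. This is a sketch against the public SEDE `Posts` schema; my actual query may differ in its exact filters.

```sql
-- Sketch: non-closed questions per month, and how many of them have
-- at least one answer. Table/column names follow the public SEDE schema.
SELECT
  DATEADD(month, DATEDIFF(month, 0, q.CreationDate), 0) AS Month,
  COUNT(*)                                              AS Questions,
  SUM(CASE WHEN q.AnswerCount > 0 THEN 1 ELSE 0 END)    AS Answered
FROM Posts q
WHERE q.PostTypeId = 1        -- questions only
  AND q.ClosedDate IS NULL    -- exclude closed questions
GROUP BY DATEADD(month, DATEDIFF(month, 0, q.CreationDate), 0)
ORDER BY Month;
```

The monthly gap in the plot is then `Questions - Answered`.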
This plot clearly shows that, since 2016, the gulf between questions and answers has exceeded 1,000 unanswered questions per month. In recent years, roughly half of all questions go unanswered. Moreover, the number of answered questions is in gentle decline.
If we amend my query to break out questions with the [tag:neural-networks] tag versus those without it, we can see that the answer gap does not arise solely from neural-network questions. (It was easier to just do the post-processing in R.)
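The amended query might look like the following sketch (in the SEDE schema, `Tags` stores a question's tags as a single `<tag1><tag2>` string, so a `LIKE` match splits the two groups; as above, my actual query may differ, and in practice I did this step as post-processing in R):

```sql
-- Sketch: same monthly counts, split by presence of the
-- neural-networks tag in the Tags string.
SELECT
  DATEADD(month, DATEDIFF(month, 0, q.CreationDate), 0) AS Month,
  CASE WHEN q.Tags LIKE '%<neural-networks>%'
       THEN 'neural-networks' ELSE 'other' END          AS TagGroup,
  COUNT(*)                                              AS Questions,
  SUM(CASE WHEN q.AnswerCount > 0 THEN 1 ELSE 0 END)    AS Answered
FROM Posts q
WHERE q.PostTypeId = 1
  AND q.ClosedDate IS NULL
GROUP BY DATEADD(month, DATEDIFF(month, 0, q.CreationDate), 0),
         CASE WHEN q.Tags LIKE '%<neural-networks>%'
              THEN 'neural-networks' ELSE 'other' END
ORDER BY Month, TagGroup;
```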
Glen_b asked a related question several years ago: Are we seeing a dramatic drop in answers per question? But I don't think answers per question is the best way to measure how we are doing. Instead, I think we should ask, "If a user asks an on-topic, answerable question, will that user get an answer?" Additionally, I think we can decisively answer Glen_b's titular question in the affirmative (look at the plot!), since, over the past several years, a bunch of smart statisticians have all made their own graphs and measured their own quantities of interest. Now that we have evidence that there is a problem, I think we should start looking for solutions.

