The problem with trying to apply math to argue against the singularity is that the singularity, especially as Vinge posited it, is a metaphor.
I'm not saying the concept is unreal. I'm saying the idea of a point is unreal. It's also a mixed metaphor.
Vinge describes it as a point that we can't see beyond--a point at which our models of human life stop applying.
This has a physical analog, so it's important to keep both the purely mathematical metaphor and the physical metaphor in mind. Because in both, singularities probably don't exist, but in the physical system, there's something there.
The physical system is a black hole.
The first half of Vinge's idea, the point we can't "see beyond," is the event horizon. When Jeph says the singularity happened yesterday, this is the best analog we have. (Allowing for Hawking's argument that event horizons don't exist as well. Keep in mind that his argument does require a "horizon" that behaves in almost all of the same ways as an event horizon. It's just different enough to avoid some really freaky things. For our purposes--a metaphor--"event horizon" is a good enough phrase.) Unless a black hole is fairly small (as black holes go), crossing the event horizon is a non-event. Being on one side of the horizon is not really different from being on the other (except that if you look back the way you came, you get to watch the universe end, but that's not important).
Vinge's idea is that there will be a moment, or point, beyond which we can't predict the future with any certainty. That point lies in the future. Until it doesn't.
But then there's the other half of the concept... the point where our human model stops working. In his 1993 paper/talk Vinge expressly describes it as a point where we need new models to make any sense of what's going on. This is analogous to the black hole's singularity. And here's where we see that "the singularity" doesn't have to be a thing. But there must be something very like it. A black hole's singularity probably isn't a point of infinite density. The issue is, we don't know of any physical process that prevents a point of infinite density. But that's because we need a new model. The singularity is the part of the physical process that exceeds our theories' abilities to model. Infinite, as a physical thing, doesn't make sense. But there is a physical thing. The Milky Way is rotating around something. It seems to have an event horizon. What, exactly, that thing is is beyond our ability to say. We know it "is" a black hole. But we're kinda fuzzy on what a black hole is.
Vinge's conjecture is straightforward. Unless you believe intelligence is magic (in which case, there's nothing to discuss), Vinge's singularity is inevitable. In our lifetimes? Depends on how old you are. Barring some event that dumbs down the collective intelligence of humanity or kills us all, it's going to happen. This might seem like evangelism, but it's just the practical consequence of information technology. Eventually we will create intelligence. Eventually we, or it, will develop intelligence that exceeds that of any human. Eventually a created intelligence will be smarter than all of us.
People like to speculate: once it happens, things will change so fast we can't keep up. If you define "we" as humans today, this is true. Keep in mind, this is already happening. Technology regularly outstrips the ability of our institutions to manage it. People have been complaining of future shock since the late 80's. While a few clear seers have looked toward us here and now--from the past--and made good predictions, the vast majority have been getting it wrong since the mid-50's (I exclude anything trying to look more than 50 years ahead). By the first half of Vinge's concept, the singularity seems to be already happening.
But Vinge also expressly compared his rate of change to rates of change in biological systems. The rate of natural selection. Comparatively, that aspect of the singularity has also already happened. We spent ~9000 years doing pretty much the same shit. Then we invented industry and spent ~900 years doing different shit. Then we invented the modern world and spent about 90 years doing different shit. There's a pattern. The only thing left is for superhuman intelligence to fundamentally change the game plan. IMO we have crossed the event horizon. We can't go back. We can only destroy ourselves before we reach the point, or develop a new understanding as we get near it.
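The shrinking-era pattern above is just a geometric sequence. A quick sketch, using the rough era lengths from this paragraph (they're the essay's ballpark numbers, not data):

```python
# Rough era lengths from the paragraph above: each new era lasts
# about a tenth as long as the one before it (a geometric sequence).
eras = [9000, 900, 90]

# Ratio between consecutive era lengths.
ratios = [eras[i] / eras[i + 1] for i in range(len(eras) - 1)]
print(ratios)  # [10.0, 10.0]

# If the pattern held, the next era would last about 9 years.
print(eras[-1] / 10)  # 9.0
```

That extrapolated ~9-year era is exactly the kind of "rate of change our models can't track" that the singularity metaphor gestures at.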
Kurzweil, correctly, points out that when you are on a hockey stick trajectory, your local environment looks flat. Thus the problem of convincing the deniers that the global temperature is going up. The graphs prove it, but last year isn't all that different from this year.
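Kurzweil's point is easy to see numerically: on an exponential curve, the year-over-year change is a small constant fraction, so any short window looks nearly flat even though the long-run curve is a hockey stick. A minimal sketch (the 3% growth rate is purely illustrative):

```python
import math

r = 0.03  # assumed 3% annual growth rate, chosen for illustration

def f(t):
    """Exponential trajectory f(t) = e^(r*t)."""
    return math.exp(r * t)

# Locally: next year is barely different from this year...
print(f(1) / f(0))    # ~1.03, a 3% change -- looks flat up close
# ...globally: over a century the curve has grown ~20x.
print(f(100) / f(0))  # ~20.1
```

Same curve, same math; only the width of the window changes.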
It's possible that a rapture of the nerds is coming. It's possible that strong AI will directly bootstrap to stronger AI in timescales we can't comprehend. Unlikely, but possible. It seems more likely that we are close to a break point. Strong AI will still live in the same physical world we do, for a while. Stronger AI will still require new hardware. Developments will happen on the order of years and decades for a while yet. But it seems equally unlikely that we will reach the end of the century without an exponential intelligence explosion. Unless we act to stop it.
Kurzweil's in the happy singularity camp. Hawking is not. Hawking has been famously wrong, so it might be tempting to dismiss his position. But, to my knowledge, Hawking has never been publicly wrong to the extent that the thing he was wrong about didn't exist. If Hawking thinks superhuman AI will be a thing, it's a good bet it's going to happen. Whether he's right to think it will be a threat to us is debatable.
There's a lot of appeal to authority up in there. I'm not against all rhetoric. Just the rhetoric I don't like. But keep in mind what a computer could do in 1990. Or in 2000. We are making our machines smarter. We might not get them to actually think on Vinge's time scale (2023, btw), but they will eventually think. And they will eventually think better than we can. Unless thinking is magic.
Maybe even then.
The Singularity Vinge described is just machines thinking better than we can. We keep making machines "smarter." It seems absurd to posit that we can't ever make a machine smarter than us. The fundamental limits of Moore's Law (with our current technology) run out around the time we develop hardware with human brain power for $1000 USD.
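As a back-of-envelope sketch of how that kind of projection works (every number here is an assumption in the Kurzweil style, not a fact: say the brain does ~1e16 ops/s, $1000 of hardware does ~1e13 ops/s today, and price-performance doubles every two years):

```python
import math

brain_ops = 1e16      # assumed ops/sec of a human brain (rough guess)
dollar_ops = 1e13     # assumed ops/sec per $1000 of hardware today (rough guess)
doubling_years = 2.0  # assumed price-performance doubling time

# Doublings needed to close the gap, then convert to years.
doublings = math.log2(brain_ops / dollar_ops)
years = doublings * doubling_years
print(round(years, 1))  # ~20 years under these assumptions
```

Change any of the three assumptions and the date moves by decades, which is exactly why these projections are metaphor-adjacent rather than predictions.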