Comic Discussion > QUESTIONABLE CONTENT
Robots and love
Skewbrow:
Thanks, but I prefer to read material reinforcing my prejudice that "object-oriented programming is the bane of hobbyists and the source of much grief", so Dijkstra may be my best hope? I lost a dear hobby when support for DOS was discontinued. In those days my PC did what I wanted it to do, not what Microsoft thinks I should be allowed to do with stuff that I own. My programs had full control of the hardware, and didn't need to ask Windoze for permission to do anything. They were also free of bugs (well, not always, but after I was done). When the change became inevitable, I suddenly felt very sympathetic to the old-timers smashing spinning jennies. Mind you, my livelihood was not at risk, but still...
Dijkstra: "A program with a single bug is not almost correct - it is wrong." <- Something that should be sent to Bill Gates' inbox each and every time Microsoft releases a "critical update"
pwhodges:
I had written my own real-time, multi-tasking operating system in the 1970s. For me, DOS represented a huge step backwards, both in its lack of security and in its reliability - so I hated it the way you hated Windows.
OS/2, and especially OS/2 v2, showed the way forward for small systems, but MS managed to subvert it, first with Windows and then by changing the direction of the NT development (NT was originally a.k.a. OS/2 v3). It was also MS who imposed on IBM the one truly bad design decision in the OS/2 Workplace Shell (the Single Input Queue), and IBM never fixed it because they were not prepared to break full compatibility for existing customers over it (also a bad decision).
Dijkstra and Hoare (and later, Knuth) were my gods; but in the end I have always allowed pragmatism to temper idealism, because I could see that the perfect theoretical world of program proofs was simply going to get left behind.
Random Al Yousir:
Skewbrow, there are no prejudices against object-oriented programming. There are hard facts.
And it was Dijkstra who pointed out that you should be painfully aware of the limited size of your skull. Which induced much (intentionally) badly feigned yawning, IIRC.
At some point you just have to deal with the need to abstract complexity "away", which might be a no-brainer for a mathematician; however, computers are real-world, unclean, mutable state machines, and as long as you insist on controlling the whole shebang "per pedes" you will run into the oh-so-tight limits of your skull. If you want to go for something more ambitious, you want to go for abstraction. And abstractions are leaky. Another hard rule. Rock hard, in fact.
That's why I think you might enjoy the research of Philip Wadler; his work is centered around a mathematically "clean", referentially transparent approach to the art of abstraction, based on the lambda calculus and combinatory logic.
A good entry point into the topic should be here, to "hook you up". Maybe this vein of research could restore your faith in computer science.
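To give a flavour of what "referentially transparent" means in practice, here is a minimal Haskell sketch (the function names are illustrative, not taken from Wadler's work): a pure expression can always be replaced by its value, so computing it twice and computing it once are interchangeable.

```haskell
-- A pure function: its result depends only on its argument,
-- with no hidden mutable state involved.
square :: Int -> Int
square x = x * x

-- Referential transparency: these two expressions are interchangeable,
-- because evaluating 'square 3' twice cannot differ from reusing it.
twiceA :: Int
twiceA = square 3 + square 3          -- evaluate the call twice

twiceB :: Int
twiceB = let y = square 3 in y + y    -- evaluate once, reuse the value

main :: IO ()
main = print (twiceA == twiceB)
```

This substitution property is exactly what breaks down once mutable state enters the picture, which is why the lambda-calculus tradition treats purity as the foundation for reasoning about abstraction.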
Oh, and pwhodges is right: There will be bugs. The way to go is to find a way to deal with this fact.
Object-oriented programming is a hack-up, a kludge. You won't find much disagreement there. But it's a necessary kludge, because it's so damn hard to do things right.
Skewbrow:
Proofs of program correctness! Luckily I never had to do more than a couple to pass that intro course, and those were trivial for a math dude. :-)
My grumpiness mostly comes from the fact that in math the tools are eternal, so I was unprepared for the reality that the other tools I took the trouble of learning didn't last a decade.
For work-related programs I just wanted (and still do!) the full computing power. The same was true for entertainment with (self-made or bought) games. On those occasions not having to fight the operating system or share the cycles with something else was nice (if not necessary). Reliability was not an issue. If I crashed the system while debugging a hooked interrupt or by attempting to read past a nil pointer, that was solely my fault, and the PC would reboot in under twenty seconds anyway. If a work-related simulation crashed, that meant lost productivity, but again it was my own fault, and a faulty sim might not give reliable results anyway. Yes, I realize that you cannot run a system serving several people with an attitude like that, but the point is that my PC was truly personal.
I did start learning Delphi for Windows a year ago (they taught me Pascal as an undergrad, so that's what the salesdude recommended). If I find the time, I may do a full comeback - at least port my old stuff to Win. So much to learn/unlearn. :cry:
RAIY seems to have posted more links. Maybe I should make a serious attempt at learning?
Random Al Yousir:
Well, not so much "more" links. The first one you can safely ignore, the second one is just a good starting point on Wadler's site, which I referenced already. :wink: