…and again he is defeated. Admittedly, the Satan of Dante’s Inferno was definitely more of a badass than the final boss in Castlevania: Lords of Shadow, who is pretty much just a naked dude with smoke leggings. For a break from this Devil/Angel/God mythology it’s probably time to slay the Stay Puft Marshmallow Man, or maybe take a look at Red Dead Redemption. Or find out what’s new and exciting in the PC gaming world, although seeing as most PS3 titles still suffer from the same annoyances as always (e.g. camera angle vs. walk direction), probably not much.
…but I’ve completed Dante’s Inferno. Finally.
I’m pretty much giving up on Dead Space, however; very sparse ammunition combined with my bad aiming skills on the PS3 controller does not add up to a fun experience. Might give it another shot if/when I get a decent graphics card for my PC.
Now all I need is a heaping helping of motivation with a topping of perseverance to complete some other projects. There’s still a half-finished preamp, amplifier cases that need work, subwoofers, the temporary cases on the home theater speakers, not to mention a beagleboard and DAC…
Things that came close to being thrown out this weekend, in no particular order:
- Original Turbo Pascal 5.5 Upgrade on 5.25″ disks
- Modula-2 compiler, also on 5.25″ disks
- Microsoft DirectX barf bag
- 3dfx key chain
- Pen with integrated 80s-style LCD game
- 3 Iomega Jaz disks (all probably dead from click-of-death syndrome)
- 12 Ikea LACK table legs
- CodeWarrior for PlayStation Net Yaroze
- Watcom C++ 10.5
- Intel spring-loaded wind-up modem cable thingy
- Tomb Raider Christmas-themed clock-in-a-can
No wonder there were so many boxes to move…
Reed-Solomon codes, Galois field arithmetic, Berlekamp-Massey algorithm…
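The Galois field arithmetic part at least fits in a few lines. As a rough sketch (assuming GF(2^8) with the primitive polynomial 0x11D, a common choice in Reed-Solomon implementations, e.g. QR codes; other codecs use other polynomials), multiplication looks something like:

```python
def gf_mul(a: int, b: int, poly: int = 0x11D) -> int:
    """Multiply a and b in GF(2^8), carry-less 'Russian peasant' style.

    poly is the reduction polynomial; 0x11D encodes
    x^8 + x^4 + x^3 + x^2 + 1 (an assumption, not the only choice).
    """
    result = 0
    while b:
        if b & 1:
            result ^= a      # "addition" in GF(2^n) is XOR
        b >>= 1
        a <<= 1
        if a & 0x100:        # degree reached 8: reduce modulo poly
            a ^= poly
    return result
```

In this field, 2 * 2 is still 4, but 0x80 * 2 wraps around and reduces to 0x1D; getting those reductions consistent is half the battle before Berlekamp-Massey even enters the picture.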
At least GPU programming with CUDA is reasonably straightforward, even if NVIDIA/Intel kinda screwed up the ION2 platform. Yes, the GPU got a bit faster, but come on: a single PCIe lane was the best you could do? It’s good enough for HD video decoding, but anything requiring actual bandwidth is pretty hopeless, although it’s often a Good Thing if developers are forced to test against a slow machine. My work machine, however, is in dire need of a replacement (not a mere upgrade) since I’ve had it for over three years, and I don’t think I’ve ever kept a computer in regular use that long. But thinking about it triggers another overload: AM3, 1155, 1156, 1366, quad-core, hex-core, i5, i7…
So much of this seems so familiar…
One thing that is a bit of a mystery to me is why so much time and money are spent (sorry, “invested”) on finding the bestofthebestofthebestsir, only to use them on dead-end projects where only the lowest-common-denominator skills are actually required. And why bother even filling your ranks with experts when decisions are not made on technical merit, and management will often simply ignore reality anyway, for example by claiming that their 16-year-old son could implement a complex and invasive feature with far-reaching consequences over the weekend…
Not to mention the fun of design by committee or corporate-politics-driven development. In the best case, the quality converges on mediocre, but it can also become unstable and oscillate wildly before exploding. I’m pretty sure the layers of abstraction (either hierarchical or intellectual) between those doing the work and those giving “vision” and “leadership” also factor into the equation.
Yes, I know: rants and anecdotes are not data.