UFCS for D
Nick Sabalausky
a at a.a
Fri Mar 30 01:21:12 PDT 2012
"Nick Sabalausky" <a at a.a> wrote in message
news:jl3n59$qf7$1 at digitalmars.com...
> "Jacob Carlborg" <doob at me.com> wrote in message
> news:jl3kar$ie4$1 at digitalmars.com...
>> On 2012-03-30 04:05, Nick Sabalausky wrote:
>>> "Walter Bright"<newshound2 at digitalmars.com> wrote in message
>>>>
>>>> True, but I upgraded recently to 64 bit Win 7, with a 6 core processor
>>>> and
>>>> SSD drive. Reddit seems a lot zippier :-)
>>>
>>> I don't understand why people think it's ok for basic, basic shit that
>>> would have run fine on a Pentium 1 (and less) to now require what quite
>>> literally is a super-fucking-computer-on-the-desktop just to run
>>> acceptably.
>>>
>>> Seriously, what the fuck's the point of buying all this insanely
>>> powerful hardware if software just turns the damn thing right back into
>>> a fucking single-core P1? That's just insane. People are seriously
>>> fucking bat-shit crazy.
>>
>> Have you seen this:
>>
>> http://hallicino.hubpages.com/hub/_86_Mac_Plus_Vs_07_AMD_DualCore_You_Wont_Believe_Who_Wins
>>
>> They compare an old Macintosh from the '80s against a fairly new PC.
>>
>
> Yea, I've seen that. It's a very good article, though. Although I've been
> saying this since before that article, and even before multi-cores.
> Contrary to the title, I wasn't at all surprised which won ;)
>
> Of course, I don't expect software to be as super-fine-tuned as it was on,
> say, the Apple 2 or Atari 2600. There *is* definitely some value in
> losing a certain amount of performance to abstractions, up to a point.
> But we've blown way, way, WAAAY beyond that point.
>
> It's sickening how much gratuitous waste there is in a lot of "modern"
> software, and really for not much benefit, as D proves.
>
Actually, one thing that really gets me is shutdown times: RAM is
*volatile*. How much processing can really be needed when the RAM's just
gonna get wiped anyway? You ask the user if they want to save, you flush the
output queues for anything non-volatile, and you cut the power. Sheesh!
Desktops are the worst offenders, and particularly WinXP. But then even on my
brother's PS3, you can literally count the seconds before it actually turns
off. It's just a set-top gaming console, is that really necessary? (They can
spare me their "It does everything!" - like I give a crap about any of those
gimmicks.) On my old (super-low-power) NES, you could hit the power button,
and within one second you were at the title screen. Hit one button and
you're immediately playing (and I mean *playing*, not "watching exposition"
or "learning how to turn left"). And then power button again, and the
system's off. Try doing any of that on a PS3. It's amazing that the faster
and more powerful the systems become, the longer and longer it takes them to
start/stop tasks. ('Course, the Apple 2 is a notable exception: that thing
seemed to take forever to boot. It did shut down pretty damn quick though.)
Some of that stuff isn't even a technical matter at all, but deliberate
design: Who the hell decided we need twenty company logos (fully animated,
one at a time), then 10+ minutes of exposition and building "atmosphere",
followed by half an hour of (typically patronizing) tutorials before
actually getting to the real gameplay? Zelda Skyward Sword is the worst
offender: it literally takes *hours* to get past all the initial exposition,
tutorials and shit into the real core of the game (I honestly started
wondering if there even *was* a game - "Did I pick up a Harry Potter movie
by mistake?"). The original Zelda, you could get from power off to the meat
of the gameplay in literally seconds. Game devs won't let you do that now:
They've gotta show off their cinematography so they can get hired by Pixar,
where they *really* wanted to be all along. (Meh, Dreamworks was always
better anyway ;) )
Sheesh, (and now I'm *really* getting sidetracked here ;) ), even
*Hollywood* hates exposition (you can tell by how the actors/directors
always rush through those lines as fast as they can). But go figure: with
all the Hollywood brown-nosing the game devs do, and imitating them even in
ways that make no sense for an interactive medium, that hatred for
exposition and rambling on, and on, and on, is the *one* thing in Hollywood
that game devs aren't tripping over themselves trying to ape.
More information about the Digitalmars-d-announce
mailing list