dmd platform support - poll
terminal.node at gmail.com
Sat Dec 27 22:52:53 PST 2008
> "Walter Bright" <newshound1 at digitalmars.com> wrote in message
> news:gj6mld$294o$1 at digitalmars.com...
>> John Reimer wrote:
>>> Incidentally, I'm still using my Compaq Presario X1000 laptop
>>> (Pentium M 1.4 GHz) which is probably close to 6 years old now.
>>> I've updated certain aspects of it and fixed it a couple of times.
>>> Amazingly it keeps running... and performs quite well for my needs.
>> As my main machine, I use a P4 at 1.6 GHz, 512 Mb ram. I'm not sure
>> how old it is, but when the power supply failed and I went to the
>> nerd store to replace it, the guy said "I haven't seen one of these
>> power supply configurations in years!" He sent me to the local pc
>> recycler, where I got one out of a bin for $10.
> Now that I don't feel like I'd be laughed out of the discussion in a
> flurry of posts involving words like "archaic": Mine's a:
I should confess something here. Two or three years ago, I actually purchased
components and built two AMD Athlon 64 systems for myself (sequentially...
not both at once). The last one was a dual core. But I gave them away to
family and settled on just using my old laptop.
Both systems had fairly powerful graphics cards in them too. They were good
systems and were great for playing the latest flight simulators... but I
decided I wanted to spend less time on games. :) Right now the dual core
system is put to very good use by my younger brother and sisters for video
editing... one can never have too much power or memory for that task. Also,
nowadays, video editing among non-professionals is quite common... so I think
there may just be a whole lot more justification for buying into some of
these powerful systems than you might realize.
> - 1.7 GHz Celeron (was a 1.2GHz AMD K6-2 for a long time, but I bought
> this CPU/MB off someone for about $25, seems to be about the same
> performance though (makes sense, Celerons are notoriously low on
> cache, or at least were last I checked)).
I've never heard of a 1.2 GHz K6-2. Was that overclocked or something?
I think most of those maxed out at 500 MHz. I've used K6-3s and K6-2+s
before. Excellent CPUs for the time.
> - 1 GB RAM (Only reason I upgraded from 512MB was I had a job that
> needed MS's bloated .NET era SQL Server client),
I also have 1 GB. This is actually a little limited for a system that needs
to run a VM like VirtualBox, VMware or Parallels. I've made use of these
tools for porting both win32 and linux software at the same time. I now
use coLinux, and while that works much better, I could still use more memory.
Also, since I use VMs, I miss the hardware virtualization support that is
now available in modern CPUs.
> - Graphics card that's pixel shader v1 (was a pre-pixel-shader
> GeForceMX 2 for a long time, only upgraded because I found this one
> for about $40 and wanted to play around with pixel shaders).
I'm looking forward to the time I can purchase another laptop. I wouldn't
mind getting a graphics chipset that supports some of these features so that
I can experiment. But admittedly, graphics technology is one of the fastest
moving targets around. For now, I can do just fine with my laptop and its
ATI Radeon 9000 64 MB chipset.
> - The motherboard's USB is v1.x
I can't stand USB 1.x... it's way too slow for hard drive operation. The
bandwidth just isn't sufficient anymore.
> - 21" CRT I got from a CompUSA store-closing for $25. (Funny thing is,
> this was made years ago and goes higher than HD resolution and has no
> native resolution, good contrast, no ghosting, no realistic risk of
> burn-in, and zero frames of "image processing" delay. Silly people and
> their hundreds/thousands-of-dollars LCD/Plasma/DLP HDTVs ;) ) I can't
> hang it on the wall, but what do I care? My desk's big enough.
LCDs are among the greatest advances of today's technologies. CRTs are a
horrible throwback to the days of triodes, pentodes, lightbulbs... and other
high-voltage Edison derivatives. I've considered them so for probably
over 10 years. I was itching for the day I could stop staring at an
electron beam sprayed directly into my eyes. I've never lamented the CRT's
fall from grace.
> So, yea, about on par with you two. (Although I do have damn near a TB
> of HD space and still crave more...yea, I'm a packrat.) The only thing
> about it that I feel is insufficient is the number of PCI ports (it's
> one of those reduced-size motherboards...in a non-reduced-size case),
> but I'm still getting by.
> I do some occasional video processing/editing, 3D stuff (mainly to
> learn it), and gaming (but nothing like Gears of War or Halo or
> anything like that, besides I prefer to game on a living-room
> console). If I were to get really serious about any of those things, I
> would probably want a new system, but I don't do enough of them to
> really justify it.
> I would kind of like the convenience of a laptop (mine's dead), but
> the only reason I'd be interested in the fancier CPUs on that is for
> the reduced heat/power consumption.
> Speaking of laptops, if anyone hears about a company that makes
> quality laptops with an actual built-in trackball, let me know. I
> can't stand those awful touchpads or IBM's "nubs", and dragging around
> a real trackball in addition to power cord, etc, starts taking away
> from the whole "portability" thing.
I haven't seen or heard of trackballs in a laptop for a long time. :)
I know what you're saying about the touchpads... I'm actually surprised they
stuck. I don't consider them the best invention to be adopted for
laptops. They do the job, but I think we should have stuck with "nubs".
Nubs were hard to get familiar with, but once you did... they were extremely
practical and space-efficient... or so I felt anyway.
> One other funny anecdote about "newer/trendier is not always better":
> I've been recruited by a friend of my mom to replace/supplement her
> small business's wireless network with a wired one.
> And now I'll stop rambling ;)
Wired is not necessarily backwards. :) Despite the tangle of cables, it's
just easier to keep secure.