Technological Advancement and Money
Less than a year ago, I got my first laptop. At $2,600, I normally couldn't have hoped to afford it, but it was on sale for $1,500 for two reasons:
- It was a display model. That's supposed to mean it would be beat up, but the only ill effects of its treatment were two small holes in the back where a padlock had gone, and a tiny dent in the paint on the back of the lid.
- It was a discontinued model, though not in the usual negative sense. Sony sold this exact model, with the same specs, in several screen sizes. Mine was the largest, with a resolution of 1920x1200, and that size had been discontinued. So apart from having a nicer screen, it was identical to models still on sale; the technicality of being "discontinued," however, lowered the price further.
Naturally, I was like a three-year-old with a new toy. Not only could I, a total computer geek, now have a computer anywhere I went, but it was a computer with GOD-like specs (at least for a laptop) and a screen that was not only sharper than my CRT, but also physically larger.
You'd think that now, less than a year later, this computer would still be godly, or at the very least better than average, but you'd be wrong. Now it is nothing. It will barely run Windows Vista at all, and it cannot run any of the latest games even at minimum quality.
In my mind, something is very, very wrong here. I understand that with the market's increasing demand for more advanced technology, higher system requirements for newer software are inevitable, but come on now, how much are we really expected to pour into upgrades? With $6 a week in allowance, how am I supposed to pay $2,600 a year for new laptops? Yet somehow it's not their fault that the software demands so much; it's mine for not paying the hardware industry enough nonexistent cash.
It doesn't look like I'm alone, either. My system specs are similar to the ones listed in the system specs thread on this board, so why is my computer considered old? I remember reading a thread on a Windows forum about whether Microsoft was right to support my laptop's 32-bit CPU in Vista, and a lot of people felt that those still clinging to 32-bit technology have it coming to them, and that it's time to buy a new machine.
I'm terribly sorry, but I just don't get the 64-bit craze. 64-bit is nothing new; the Nintendo 64 had it back in 1996. Yet the technology didn't appear in the personal computing world until now, and that leaves me skeptical about its legitimacy. If it really brought anything truly revolutionary, hardware companies would have been a lot quicker in bringing it to PCs. Instead, it seems the technology has popped up at the most opportune time: to get people to buy new PCs right before Vista comes out, then throw those out and buy new ones again a few months from now when it turns out their specs aren't high enough.
I have friends with 64-bit PCs. I know what that world is like. Windows XP 64-bit Edition, currently the only MS operating system for the platform, is about as stable as a tower of children's blocks. Moreover, it's nearly impossible to find software that runs properly on it, especially security software such as antivirus, firewall, and antispyware apps. At the same time, using a 32-bit OS removes all the benefits of having a 64-bit machine and runs like molasses. The result is that most of these people dual-boot Windows XP Professional and Windows XP 64-bit. I've set up dual-boots in a variety of settings before, and they aren't exactly convenient arrangements, not to mention a pain in the ass to set up. The fact that 64-bit machines actually require one to function in today's world is a sign to me that it's simply not yet time for such an upgrade. Hell, the main reason to go 64-bit is the RAM: 32-bit machines can only handle up to 4 GB, which everyone seems to feel is pitifully small, yet I have yet to see a machine that uses even more than 2 GB, 64-bit ones included.
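For the curious, that 4 GB ceiling is nothing mysterious, just address-space arithmetic: a 32-bit pointer can only name 2^32 distinct byte addresses. A minimal sanity check in Python (my own illustration, nothing more):

```python
# A 32-bit pointer can take 2**32 distinct values, one per byte address.
addressable_bytes = 2 ** 32
print(addressable_bytes)              # 4294967296 bytes
print(addressable_bytes / 1024 ** 3)  # 4.0 -> exactly 4 GiB
```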
I don't know about you, but I think any software company that releases a product that can't run on a machine with 4 GB of RAM is going to have a hard time justifying that requirement. Yes, new features keep appearing in software such as IDEs, runtime environments, operating systems, word processors, browsers, and so on, and I realize we're talking very long-term about what new features will come out for such products. But I really can't see how a few new features, most of them relatively simple like tabbed browsing or a better-organized GUI, can quadruple a program's system requirements. Something is just fishy about that.
Take IE7, for example. I realize the program isn't finished yet, but it's already at the release candidate stage, and only bug fixes remain until completion. While IE6 started instantly, humiliating Firefox, IE7 takes 15-30 seconds to start on my machine. What about the new browser features requires it to take THAT long to load? What, I ask you?
Think about this. Windows XP required 128 MB of RAM as a dead minimum to function at all, with 256 MB to run at a decent speed. Vista requires a minimum of 512 MB, with 1 GB to run at a decent speed. That means Windows Vista is as heavy to run as four copies of Windows XP running at the same time, despite the fact that very few new features have been added to the OS. What the hell is going on here?
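To spell out the arithmetic (using the figures as quoted above; just a quick sketch):

```python
# XP vs. Vista RAM requirements, in MB, per the figures above.
xp_min, xp_decent = 128, 256
vista_min, vista_decent = 512, 1024
print(vista_min / xp_min)        # 4.0 -> four times the bare minimum
print(vista_decent / xp_decent)  # 4.0 -> four times the comfortable amount
```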
I blame the problem on what I like to call the "breathing room" philosophy. When I took my principles-of-computer-programming class, I kept asking my instructor which of several ways of doing the same thing would be cheaper for the computer to run. He told me it didn't matter; if it mattered, he said, I'd clearly be able to tell from the varying speed of my computer at runtime. In my strong opinion, this fails to take into account that when you take all these wasteful ways of doing things and multiply them a million-fold in a very complicated application, you end up with something that runs far less efficiently than it otherwise would, yet it "doesn't matter" because we "have the room" for it. That attitude pisses me off even when I can run the software, since it means it runs far less smoothly than it could.
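To make the compounding concrete, here's a toy sketch of my own (Python, deliberately simple): two ways to answer the exact same question. Either one is "fast enough" when asked once; repeat it the way a big application does, and the wasteful version falls hopelessly behind.

```python
import timeit

items = list(range(100_000))
as_list = items        # membership test scans the whole list: O(n)
as_set = set(items)    # membership test hashes once: O(1)
missing = -1           # worst case: the item isn't there at all

# Ask the same question 1,000 times, as a large program might.
print("list:", timeit.timeit(lambda: missing in as_list, number=1_000))
print("set: ", timeit.timeit(lambda: missing in as_set, number=1_000))
```

The set version finishes almost instantly while the list version takes on the order of a second. Neither choice is visible in a classroom exercise, which is exactly the instructor's point, and exactly the problem.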
Take FarCry, for example. It's a great game, but it's murder on system specs. Given the lush outdoor setting it takes place in, that would be entirely understandable, were it not for the fact that the game also takes place in indoor areas. On a machine with a gig of RAM and a Radeon 9800 Pro, the old clunker ran the thing at a steady 15 fps. In a desperate attempt to play a game that I thought might be fun, I did the following:
- Removed all textures from the game.
- Set the display resolution to 320x200, the same resolution used in the original Doom.
- Changed all trees to sprites.
- Degraded the lighting quality so badly it looked like I was in a sandstorm.
- Changed smoke and muzzle flash effects to colored flat squares.
Was this enough to get it working? Yes. Finally, I had it: the game ran at a consistent 30 fps, and I was at last able to play it without my mouse lagging everywhere. However, the experience of the game was gone, certain parts were impassable due to the quality, and the lack of graphics exposed the fact that, aside from being a picture show, the game had little else to offer, particularly in the area of gameplay. Taking all this into consideration, I removed the POS from my system and went to play Half-Life 1, which had graphics far superior to what I'd reduced FarCry to, yet ran at over a thousand fps on the same machine. Sure, it doesn't have ragdoll physics and the soldiers are stupid as hell, but aside from that, there's really no reason it should run so much faster.
When I told this to my friend, who was able to run FarCry just fine at 25 fps on full quality, he remarked that it's my fault for not keeping my system up to date. Personally, I don't see how that's relevant, seeing as even his machine would benefit from the game being made properly. There's no reason he shouldn't be able to run it at 100 fps at that same quality; then he wouldn't have to deal with the game's choppy moments.
What frustrates me, as this situation gets worse and worse, is that I see my precious world being taken away from me. I just don't have the funds to keep up with it all, and ten years from now, I see myself no longer being able to run any proprietary software. This is all complete stupidity, but hey, it's making them money, and I seem to be the only one who's noticed the problem. Maybe that's why they haven't stopped it yet.
Well, that's my rant. I'm off to play Super Mario Bros. on an emulator, something that doesn't require a supercomputer to do, yet still never fails to entertain me. Farewell.
Technological advances are part of history. For example, cavemen used sticks and stones to fight each other; then it was spears and arrows, then horsemen and swords, then muskets and cannons, and now bombers and tanks. Get used to it.
I do, however, agree with you about Vista.
Yes, but how many Neanderthals do you see operating those bombers and tanks? Technology is now advancing at an incredible rate, presumably due to the widespread availability of information.
Don't misunderstand me here; I think that's a great thing. The thought of how much computers could improve over the course of my lifetime thrills me. I'm currently 19 years old, and by the time I die, it's not unreasonable to expect the availability of 1xb RAM sticks. That's a thrilling prospect. Right now, I'm thinking of all the things that could be done with such technology. Think about how complex a game could become. Think about what kinds of new advances in software and operating systems would be possible with something like that.
My rant wasn't about being unhappy with technology; it was about how this new technology is being used. I'm not naive enough to say that things haven't improved since, say, five years ago, but given how much I've spent on all this new stuff, they just haven't improved enough. Rather, the positive effects of the new technology are largely eliminated by the breathing room philosophy. Instead of newer programs truly taking advantage of it, they just fill in the space with useless bloat. This means I'm compelled to buy a new machine to use software that really isn't that much better.
My FarCry example was designed to illustrate this. I lowered the quality of the game to something that made Half-Life 1 look amazing, yet it ran at only a tiny fraction of the speed. Yes, it was still more complicated due to the engine physics, but come on now, is it really reasonable to expect a 30-versus-1000 fps difference just for those? I don't think that's justified, and I think that had the breathing room philosophy not been employed, such software would have run fine on maybe 150% of Half-Life 1's system requirements. I think about how much farther all this stuff could go if people didn't bloat up their software.
So if this doesn't stop, I already know what that 1xb of RAM would be used for: something only slightly better than what's available today, because the bloat would eliminate any benefit the technology could bring. Meanwhile, existing systems will slowly get worse at running the same software they've always run, due to the bloat.
I brought up Super Mario Bros. to point out that, once the bloat is factored in, even most of the visible improvements in the technology are wasted on garbage that doesn't really contribute to a better game. I could probably get more fun out of modern games, but those aren't available to me because I don't have the money for them. Old games are sufficient, and if I'm forced to resort to those for entertainment because of how much waste is poured into otherwise great technology, so be it. I'll live in the dark until people come to their senses. Then maybe my gig of RAM can be put to better use than killing Goombas.
TheMachine1
When I was at college ('93-'96), an IT student told me memory cost $50 per MB. In 2006 I have 768 MB, which would have cost something like $38,400 back then. I remember the used laptop I bought for $1,000 from my sister in the same period had a memory upgrade, and my sister said it cost hundreds of dollars.
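Checking that figure at the quoted price (a back-of-the-envelope sketch, nothing official about it):

```python
# 768 MB at the early-90s price of $50 per MB, as quoted above.
price_per_mb = 50  # USD
ram_mb = 768
print(ram_mb * price_per_mb)  # 38400 -> $38,400, matching the estimate
```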
The people who believe in the singularity are idealistic dreamers with no idea how evil people really are. We currently have the technology and resources to feed the hungry, cure poverty, and fix most people's medical problems.
But people in general are wasteful. The entire idea of a consumer society is to make people never happy with what they have so they work harder at jobs to buy stuff they don't need.
To be honest, new games are not as fun as the old games. Starcraft is still the best game that ever was, and all the old Sega Master System games were more fun than today's mindless killing games.
Today's games also seem to be conditioning people to violence so they accept it as the norm. I see so many people's morals around the world being influenced by TV and video games to accept things that would have been unimaginable even 20 years ago. Go show your grandparents one of today's shoot-'em-ups and see what they say.
When people have a constant hunger to get more items in the consumer society, they are never happy, never satisfied. They are more willing to break the rules and take advantage of any chance to gain wealth, because their motives are for material goods alone. Slaves to capitalism with no real heart or compassion.
I find today's games morally wrong. I have a problem playing a lifelike game that's all about going around killing people and has no interesting gameplay besides the killing.
The singularity is a nice dream, but completely unrealistic. I wish it were realistic. My interest in people with Asperger's is that there seems to be a greater percentage of them who still have morals and compassion than among most people. Don't get me wrong, I'm not saying everyone else is heartless, because no two people are identical...
OK, I'm going to stop myself now; I could go on and write a book on this stuff if I carried on much longer.
My desktop is a few years old, and is for the most part sufficient.
Each new version of Windows seems to be less and less efficient. Microsoft tends to release chunks of its source code (not the whole thing, of course), and it's terrifying: extremely difficult to understand.
So imagine what the programmers working on it have to deal with. And each new set of them makes things worse, because the foundation is so warped.
There aren't many uses for 64-bit processors at this point. I've heard Solaris is good as far as 64-bit goes.