Having read it and gotten the gist, this title seemed about right. This text comes from http://www.gamedev.net/community/forums/to...?topic_id=78662
OK...I'm writing this post to hopefully answer some vital questions and "tune" the way of thinking and approach to programming of all newbies and many experienced programmers and game developers out there. This post will most likely have two outcomes: either a major flame war, or a major revolution in approach to development. Hopefully, it will be the second one. Everyone feel free to comment on the subjects I discuss and ask questions. Here goes...
Everyone wants to be like John Carmack (the creator of Doom and Quake) these days, yet does almost nothing about it - that's the problem. It seems that all newcomers to the programming industry think they can just learn C++, make Quake 4, and tell Carmack he can go home now. Well, it's not like that. Compilers and high-level languages were initially designed to give assembly programmers a quicker and easier way to develop. However, this invention was quickly abused, and the phenomenon of "I know how to program but I don't know how it works" programmers was born. I'm not bashing anyone here, just trying to explain something that has been misunderstood by many people. Just think about it: if all the real programmers instantly disappeared, what would happen to the industry? The answer is long yet simple - no new graphics cards coming out every month, no new operating systems, no new consoles, no new 3D engines, no Doom 3, no DirectX 9, no OpenGL 2.0... no technological progress, in other words, as far as the computer industry is concerned. And considering that technology revolves around the computer industry, things would get messed up badly.
As far as assembly language goes, I think it's about time someone noticed that nVidia, ATI, AMD, Intel, etc. release a new product every month. And no, their hardware does not understand high-level languages, just like it doesn't understand plain English. Of course, this is rather obvious, but then why do game developers want to make their engines as good as possible by relying on the raw processing power of modern graphics cards while ignoring their new technological advancements? C++ is definitely a powerful language, but its core power is limited to the technology that was available at the time of its creation. So why is everyone speaking against assembly language, yet complaining that consoles are years ahead of PCs where gaming is concerned?
Have you noticed that progress in the software industry has started to decline? New operating systems, games, you name it, have started to be cloned and mass-produced, resulting in countless repetitions of the same concepts. How many "different" operating systems are there? Tens? Hundreds? Yet many of them are the same thing with a different flavour. How many cloned games are there? Countless! The game developers in this community, as well as the commercial gaming industry, spend so much time making "new" engines and games. OK, so you've just made the one hundredth clone of Quake. So what? Those who have just made the one thousandth clone of Tetris, no matter how crap and simple it is, have potentially achieved more than you have, because starting with small games like Tetris is the best way to learn programming. Those Tetris cloners can then apply their new skills to make something that is really NEW, instead of spending months cloning the latest hit by ripping apart its source and modifying it into a game that adds nothing new to the genre or the technology.

I'm not saying that cloning is totally wrong. It would be the totally correct thing to do if you added something new to each clone you made. Even a small improvement - a single bit of code that employs a new hardware capability or a new algorithm - counts. It takes some knowledge and experience to understand this, but even after having done so, most programmers just happily go on cloning existing technology as if nothing ever happened. My question is why? It doesn't take a lot of brain power to understand what the word "clone" means and that you're developing something which already exists. Reinventing the wheel. So why does everyone carry on going in the wrong direction?
To understand this even better, let's look at some examples of what I've just babbled about. Every new gaming console creates chaos in the gaming industry. Counter-Strike is still the most popular multiplayer game, even though it originated from an ancient game - Half-Life was built on a heavily modified Quake 1 engine. Every new engine from Carmack brings a revolution to the gaming industry. Updates on the development of Neverwinter Nights have RPG fans drooling all over their monitors. It seems that new installments of Final Fantasy can keep being released forever and gain more popularity each time. What do all these examples have in common that makes them so popular? The element of technological progress. So why are we no longer excited to hear about the release of another Command & Conquer clone? Simply because there's nothing to be excited about: it's the same concept with supposedly new features. We know there's nothing new to see in it, so we don't bother looking (except for those who can't think of anything better to do than play games all day).
So what's the conclusion of this story? WAKE UP! It's time to realise that we're sinking. Start using your time productively, rather than spending it on "reinventing the wheel" over and over again. The infamous newbie question "How do I make Quake?" can be answered much more simply than you think: don't make Quake - make something with your own design in it, and make sure it's worth the effort. Also understand that unless you implement a new concept or technology in your project, it just won't be worthwhile. Another question answered - "How do I become like Carmack?" - spend your time learning useful and powerful things, and make sure you learn from the start. I.e., don't learn any useless/dead programming languages, and don't jump into C++ thinking that because it's the industry standard you'll become a guru just by learning it. Start from the beginning: learn assembly language and how the hardware deals with your code. Afterwards you can move on to C++ knowing that you're on the route to success. You can't learn nuclear physics without learning basic maths.
Many people speak against assembly. But what if you actually used your brain to think about whether it's really that bad? You'd figure out that, due to new processor technologies like MMX, SSE and 3DNow!, and the graphics PROCESSING units on which modern graphics cards are based, assembly language is the only way to make better software. Even if C++ had support for these technological advancements, it still wouldn't be possible to become an expert without learning how programming works at the hardware level. When expert programmers say "I don't use assembly that much anymore", people tend to get the impression that assembly is a waste of time. It's the opposite. It isn't used that much anymore because compilers and high-level languages were created to give programmers a quicker way of developing, PROVIDED THAT THESE PROGRAMMERS KNOW HOW PROGRAMMING IS IMPLEMENTED AT LOW LEVEL. In other words, if you want to become a real expert, LEARN ASSEMBLY. It's true that assembly is hard to get a hold of at first, but once you know the basics it is MUCH EASIER to master than high-level languages, because there are no complicated functions, loops, algorithms, etc. to learn. One important piece of advice, however: start off by learning the basics of a high-level language (preferably C++, Java or Visual Basic) in order to understand what programming is all about, and once you're confident that programming is your thing, move on to assembly language.
Believe it or not, all those new CPUs and GPUs produced by Intel, AMD, nVidia and ATI are not made just so the feature lists of graphics cards and processors look cool. They were made to give developers more power and control, which can only be harnessed if you know how to use the new technology. This is the reason console games look much better than PC games: the power and control provided by modern hardware is actually employed in the games' engines, while the PC, which has a lot more power than consoles, suffers from an "I'll buy a new PC and my 3D engine will be faster" attitude.
I certainly hope this post has answered the newbie programmers' questions as well as given some inspiration to the experienced programmers. Now I'll either have my ass flamed for "disturbing the peace" or get some intelligent and understanding replies.