I would suggest you look at some sample engines:
http://www.devmaster.net/engines/
/////////////////////////////////////////////////
A game engine is the core software component of a computer or video game or other interactive application with real-time graphics. It provides the underlying technologies, simplifies development, and often enables the game to run on multiple platforms such as game consoles and desktop operating systems such as Linux, Mac OS X, and Microsoft Windows. The core functionality typically provided by a game engine includes a rendering engine (“renderer”) for 2D or 3D graphics, a physics engine or collision detection (and collision response), sound, scripting, animation, artificial intelligence, networking, streaming, memory management, threading, and a scene graph. The process of game development is frequently economized by reusing, in large part, the same game engine to create multiple different games.
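To make that list of components a little more concrete, here is a minimal sketch, in C++, of how such subsystems might be composed into a single engine object. Every class and method name below is invented purely for illustration and is not taken from any real engine.

// Purely illustrative composition of the subsystems listed above.
class Renderer      { public: void drawFrame() {} };
class PhysicsEngine { public: void step(float /*dt*/) {} };
class AudioSystem   { public: void update() {} };
class InputSystem   { public: void poll() {} };

class GameEngine {
public:
    // One iteration of the main loop: gather input, advance the
    // simulation, then produce sound and a new frame.
    void tick(float dt) {
        input_.poll();
        physics_.step(dt);
        audio_.update();
        renderer_.drawFrame();
    }

private:
    InputSystem   input_;
    PhysicsEngine physics_;
    AudioSystem   audio_;
    Renderer      renderer_;
};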
//////////////////////////////////////////////////////////////////////////////////////////////////////
An example game engine
The following paragraphs will describe a generic gameplay occurrence in a generic game engine, to demonstrate some aspects of the design. The occurrence will be a playable character interacting with its environment: moving from one spot to another and, while doing so, hitting an object, say a ball, and observing the result.
The first step would be input. At this stage, our player decides what to do, then enters this decision into the computer using a keyboard, mouse, or whatever method the particular game uses. In most cases, movement is activated by pressing a key on the keyboard to move forward. This action is received by the computer, which, using several methods that will not be covered in this paper, announces to the game engine through the input module that the player has pressed a key, which in this case is W.
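A minimal sketch of what such an input module might look like follows, assuming a hypothetical KeyCode/InputEvent pair and a platform layer that calls onRawKey whenever the operating system reports a key press; none of these names come from a real API.

// Minimal sketch of an input module translating a raw key press into an
// engine-level event. KeyCode, InputEvent, and onRawKey are invented
// names for illustration; a real engine would sit on top of an OS or
// windowing API here.
#include <optional>
#include <queue>

enum class KeyCode { W, A, S, D, Unknown };

struct InputEvent {
    KeyCode key;
    bool pressed;   // true = key down, false = key up
};

class InputModule {
public:
    // Called by the platform layer when the OS reports a key press.
    void onRawKey(KeyCode key, bool pressed) {
        pending_.push(InputEvent{key, pressed});
    }

    // The engine drains this queue once per frame.
    std::optional<InputEvent> nextEvent() {
        if (pending_.empty()) return std::nullopt;
        InputEvent e = pending_.front();
        pending_.pop();
        return e;
    }

private:
    std::queue<InputEvent> pending_;
};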
The input module, having received this information, will now process it, moving it up one stage to the event dispatcher. This is a module on the way to the central module of the engine (the game engine proper). The event dispatcher informs the game engine that input was detected and then passes the input along. Note as well the networking module, which handles incoming data from other players in games that involve more than one player at a time.
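One common way to build such an event dispatcher is to let modules subscribe callbacks per event type. The following C++ sketch is illustrative only: EventType, Event, and the integer payload are assumptions, not anything standardized.

// Hypothetical event dispatcher: modules register callbacks for event
// types, and the dispatcher forwards each event to every subscriber.
#include <functional>
#include <unordered_map>
#include <vector>

enum class EventType { Input, Network, Physics };

struct Event {
    EventType type;
    int payload;    // e.g. a key code or entity id; simplified to an int
};

class EventDispatcher {
public:
    using Handler = std::function<void(const Event&)>;

    void subscribe(EventType type, Handler handler) {
        handlers_[type].push_back(std::move(handler));
    }

    void dispatch(const Event& e) const {
        auto it = handlers_.find(e.type);
        if (it == handlers_.end()) return;
        for (const auto& h : it->second) h(e);
    }

private:
    std::unordered_map<EventType, std::vector<Handler>> handlers_;
};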
At this point, we leave the input module alone and go into the core of the game engine, the so-called brain of the engine. This is the part of the engine that does most of the decision making and tells the rest of the game what to do. Here, everything conceptual is handled, such as the fact that the player is moving from point A to point B. In this case, the game engine will process the information it has received and then do what is called a status update. A status update means handling new input and generating a new 'picture' of what the game world looks like.
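As a rough illustration of a status update, assuming a world reduced to a single axis and an invented walkSpeed constant, one tick of the core engine might look like this:

// Sketch of the "status update" idea: once per tick, the core engine
// applies pending input to the world state and produces a new world
// picture. World, PlayerInput, and statusUpdate are invented names.
struct World {
    float playerX = 0.0f;   // player position along one axis, for brevity
};

struct PlayerInput {
    bool moveForward = false;
};

// One tick of the core engine: consume input, compute the new state.
World statusUpdate(World world, const PlayerInput& input, float dt) {
    const float walkSpeed = 2.0f;           // metres per second, assumed
    if (input.moveForward)
        world.playerX += walkSpeed * dt;    // advance the player this tick
    return world;                           // the new "picture" of the world
}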
This is computed using the processor (CPU), which updates the game engine with the new player position, based on the data the game engine has given it. This involves the CPU and memory management modules, which deal with scheduling the different stages of the game engine's operation (CPU) and making sure it has the data it needs to perform computations (memory management). However, these modules are not relevant at this scope.
Once the game engine receives its new picture of the world, it can generate messages (events) to the other modules, dealing with all kinds of actions that occur as a result of the player's action. I will first examine the technical parts, and then the parts that produce something noticeable to the player.
In our case, we have decided that there is a ball in the way of our player, and that by moving forward, he has hit it and moved it.
This has been computed with the help of a collision detection module, a fancy way of saying that this module (which in most engines is part of the physics module) checks whether bodies have collided and informs the engine so it can feed information into other parts of the system.
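A simple way to picture such a check is a sphere-against-sphere overlap test, shown below with the player and the ball both approximated as spheres; real physics modules use far more elaborate broad-phase and narrow-phase structures, so this is only a sketch.

// Illustrative collision check between the player (approximated as a
// sphere) and the ball.
struct Vec3 { float x, y, z; };

struct Sphere {
    Vec3 center;
    float radius;
};

// Two spheres collide when the distance between their centers is less
// than the sum of their radii (compared as squared distances).
bool spheresCollide(const Sphere& a, const Sphere& b) {
    const float dx = a.center.x - b.center.x;
    const float dy = a.center.y - b.center.y;
    const float dz = a.center.z - b.center.z;
    const float distSq = dx * dx + dy * dy + dz * dz;
    const float radii = a.radius + b.radius;
    return distSq < radii * radii;
}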
In this case, the game engine sends the Physics module information. However, the physics module does not receive the entire world picture. To save time, the module works piece by piece. Therefore, the input in human terms would look more like this:
A ball, of size A, at position X,Y,Z has been hit by force B in direction C.
The physics module does the math and returns to the game engine information about the ball's new position in the world relative to its former position. At this point, the game engine once more creates a new world picture (in the same fashion as before). This step is repeated several times for each change in the world state occurring in a single tick of time. At a certain point, several other changes to the world state also occur, the most important among them being messages to the artificial intelligence.
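As a sketch of "doing the math", assuming a constant force applied over one time step and a basic explicit Euler integrator (real physics modules typically use more stable integrators), the update might look like this:

// The force on the ball becomes an acceleration, which is integrated to
// get the new velocity and position. All values are illustrative.
struct Vec3 { float x, y, z; };

struct Ball {
    Vec3 position;
    Vec3 velocity;
    float mass;
};

// Advance the ball by one time step dt under a constant force.
void integrate(Ball& ball, const Vec3& force, float dt) {
    // a = F / m
    const Vec3 accel{force.x / ball.mass, force.y / ball.mass, force.z / ball.mass};
    // v += a * dt
    ball.velocity.x += accel.x * dt;
    ball.velocity.y += accel.y * dt;
    ball.velocity.z += accel.z * dt;
    // p += v * dt
    ball.position.x += ball.velocity.x * dt;
    ball.position.y += ball.velocity.y * dt;
    ball.position.z += ball.velocity.z * dt;
}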
Artificial intelligence (henceforth referred to as AI) can refer to many things. Most often, it refers to other, non-player-controlled characters, such as enemies or friendly characters. In the case of our player moving forward, a plausible update would be sending the updated world picture to the enemy AI module, which would then return a decision such as not firing. There are many types of AI in current games, and covering more than the barest basics of what such a module does is beyond the scope of this paper.
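Purely as a toy illustration of such a decision, assuming an invented firing range and a pre-computed line-of-sight flag, an enemy AI module might boil down to something like this:

// Toy enemy AI: consume the updated world picture and return a decision.
// The Decision enum and firingRange threshold are invented for this sketch.
enum class Decision { Fire, DoNotFire };

struct EnemyView {
    float distanceToPlayer;   // taken from the updated world picture
    bool hasLineOfSight;
};

Decision decide(const EnemyView& view) {
    const float firingRange = 20.0f;   // assumed range in world units
    if (view.hasLineOfSight && view.distanceToPlayer < firingRange)
        return Decision::Fire;
    return Decision::DoNotFire;        // e.g. the player merely walked forward
}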
Once our game engine is finished processing all status changes, among them player input (such as movement), physics events (such as a ball being kicked), and AI decisions (such as not firing), it starts the output stage.
The output stage is split into two main pieces.
The first, and easier to explain, is the sound module. This is the subsystem responsible for handling all acoustical events in a game. Examples include background music, speech, gunfire, or, in our case, movement on a surface (say, wood) and hitting a ball.
In our case, the sound module receives a message telling it to play, on the appropriate speakers, the sound of our player walking on wood and of a ball being hit. It pulls up the audio files linked to these events (using the streaming subsystem, another sub-module under memory management that is not discussed in depth) and plays them. This is a simplistic description; modern sound modules can change how a sound is heard depending on what speakers the player has, what direction the sound comes from, and other such conditions, which makes today's sound modules capable of producing very realistic sound.
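A sketch of that mapping from gameplay events to audio clips follows. The event names, file paths, and the playClip stand-in are all invented for illustration; a real module would hand decoded buffers to an audio API such as OpenAL rather than print a message.

// Sketch of a sound module mapping engine events to audio clips.
#include <iostream>
#include <string>
#include <unordered_map>

class SoundModule {
public:
    SoundModule() {
        // Which clip belongs to which gameplay event (assumed file names).
        clips_["footstep_wood"] = "sfx/footstep_wood.ogg";
        clips_["ball_hit"]      = "sfx/ball_hit.ogg";
    }

    // Called when the core engine reports an acoustical event.
    void onSoundEvent(const std::string& eventName) {
        auto it = clips_.find(eventName);
        if (it == clips_.end()) return;
        playClip(it->second);
    }

private:
    void playClip(const std::string& path) {
        // Stand-in for streaming the file and submitting it to the mixer.
        std::cout << "playing " << path << '\n';
    }

    std::unordered_map<std::string, std::string> clips_;
};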
The second piece is the graphics engine, which has in some cases been mistakenly referred to in the press as the actual engine. This is the subsystem responsible for everything that gets displayed on the screen. It is a rather large and complex module, which in fact can be taken on its own and called an engine in its own right (which may be the source of the press's confusion).
The graphics engine can be described as a pipeline. In the first stage of the pipeline, data about the world picture is received and processed. At this stage, all the information is delivered in terms of points (vertices). The first step of the graphics engine is replacing these points with actual models that resemble what the final image should look like.
Afterwards comes the animation stage. At this stage, the models are modified to appear in the proper stance. For example, in the case of our player moving, the proper stance would be mid-step. This is done by 'transforming' a model into the proper shape, according to pre-defined instructions. Next comes the texturing stage. Textures are basically drawings of different types that are 'pasted' onto a model to give the model color. A texture can be a drawing of anything, such as clothing, and usually several textures are overlaid to create complex shapes such as a human face.
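The 'transforming' step can be pictured as multiplying each vertex of the model by a matrix that poses it for the current animation frame. The bare-bones Mat4 below is illustrative only and is not taken from any real math library.

// Rough sketch of vertex transformation: each vertex is multiplied by a
// 4x4 matrix that poses the model for the current animation frame.
#include <array>

struct Vec4 { float x, y, z, w; };

struct Mat4 {
    std::array<float, 16> m;   // row-major 4x4 matrix

    Vec4 transform(const Vec4& v) const {
        return {
            m[0]  * v.x + m[1]  * v.y + m[2]  * v.z + m[3]  * v.w,
            m[4]  * v.x + m[5]  * v.y + m[6]  * v.z + m[7]  * v.w,
            m[8]  * v.x + m[9]  * v.y + m[10] * v.z + m[11] * v.w,
            m[12] * v.x + m[13] * v.y + m[14] * v.z + m[15] * v.w,
        };
    }
};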
Following this, a series of steps which as a whole are classified as lighting occurs. At this stage, light is cast upon the world, shadows are drawn, and other such information is filled in. By this stage, the process is nearly complete and the game almost looks lifelike.
Afterwards, just a few small effects are left. These are called particle effects because they deal with small particles, such as dust rising from the wood being walked upon. This is a simplification; in practice, all these steps, such as lighting and effects, are done in the same stage, but there is a difference in importance, which is why, in design, they are usually considered separate stages.
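A minimal particle update, assuming each particle simply drifts and expires after a fixed lifetime, might look like the following; the values and structure are illustrative only.

// Minimal particle update for an effect like dust kicked up from a wooden
// floor: each particle drifts and is removed when its lifetime runs out.
#include <algorithm>
#include <vector>

struct Particle {
    float x, y, z;      // position
    float vx, vy, vz;   // velocity
    float life;         // seconds remaining
};

void updateParticles(std::vector<Particle>& particles, float dt) {
    for (auto& p : particles) {
        p.x += p.vx * dt;
        p.y += p.vy * dt;
        p.z += p.vz * dt;
        p.life -= dt;
    }
    // Drop particles whose lifetime has expired.
    particles.erase(
        std::remove_if(particles.begin(), particles.end(),
                       [](const Particle& p) { return p.life <= 0.0f; }),
        particles.end());
}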
At the end of this process, we have a finished image that, excluding specific technical details to clean up the image, is ready to be displayed on the screen. It now passes to the rendering stage.
The rendering process deals with displaying the image on a screen, taking our digital information and translating it into a video signal ready to be transmitted to the screen. In some cases, additional technical work can be done at this stage, such as cleaning up the image, but after this our engine rests, for a minuscule fraction of a second, waiting to process the next frame.
All these complex, interlocking pieces of technology cooperate to create the final appearance of the game, visuals and sound combined. It is ironic that these complex systems spend most of their time waiting for the player to make a decision, all the while keeping the player seamlessly immersed in the game. Most of the time, an engine is busy just displaying the same image, waiting for player input, but even while the player is idle, the engine must still work hard to display a consistent output that presents an alternate world.
//////////////////////////////////////////////////////////////////////////////////////////////////////
Hardware abstraction:
Most often, 3D engines or the rendering systems in game engines are built upon a graphics API such as Direct3D or OpenGL, which provides a software abstraction of the GPU or video card. Low-level libraries such as DirectX, SDL, and OpenAL are also commonly used in games, as they provide hardware-independent access to other computer hardware such as input devices (mouse, keyboard, and joystick), network cards, and sound cards. Before hardware-accelerated 3D graphics, software renderers were used. Software rendering is still used in some modeling tools or for still-rendered images when visual accuracy is valued over real-time performance (frames per second), or when the computer hardware does not meet requirements such as shader support or, in the case of Windows Vista, support for Direct3D 10.
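A common way to achieve this abstraction inside an engine is to hide the graphics API behind an interface, as in the hedged C++ sketch below; the method names are illustrative, and a Direct3D or OpenGL backend would simply implement the same interface.

// Sketch of hardware abstraction: the engine talks to an abstract
// Renderer interface, and a concrete backend (Direct3D, OpenGL, or a
// software rasterizer) implements it.
class Renderer {
public:
    virtual ~Renderer() = default;
    virtual void beginFrame() = 0;
    virtual void drawMesh(int meshId) = 0;   // meshId stands in for real mesh data
    virtual void endFrame() = 0;
};

// One possible backend. An OpenGL or Direct3D backend would implement the
// same interface, so the rest of the engine never touches the API directly.
class SoftwareRenderer : public Renderer {
public:
    void beginFrame() override { /* clear a framebuffer in main memory */ }
    void drawMesh(int /*meshId*/) override { /* rasterize the mesh on the CPU */ }
    void endFrame() override { /* copy the framebuffer to the window */ }
};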
MMOG Middleware:
Middleware for massively-multiplayer online games is far more complex than for single-player video games. However, the increasing popularity of MMOGs is spurring development of such middleware packages. Some prominent solutions include:
Gamebryo
HeroEngine
BigWorld Technology
Kaneva Game Platform
Multiverse Network
Monumental Technology Suite
Project Darkstar
RealmCrafter