
I have an SDL/OpenGL game I'm working on for fun. On average I get a decent FPS, but the movement is really choppy because SDL_GL_SwapBuffers() randomly takes an extremely long time to return. With textures loaded and written to the buffer, it sometimes takes over 100ms! I cut out a lot of code to try to figure out whether it was something I was doing wrong, but I haven't had much luck. Even when I run this bare-bones program, it sometimes blocks for up to 70ms.

Main:

// Don't forget to link to opengl32, glu32, SDL_image.lib

// includes
#include <stdio.h>

// SDL
#include <cstdlib>
#include <SDL/SDL.h>

// Video
#include "videoengine.h"

int main(int argc, char *argv[])
{
    // begin SDL
    if ( SDL_Init(SDL_INIT_VIDEO) != 0 )
    {
        printf("Unable to initialize SDL: %sn", SDL_GetError());
    }

    // begin video class
    VideoEngine videoEngine;

    // BEGIN MAIN LOOP
    bool done = false;
    while (!done)
    {
        int loopStart = SDL_GetTicks();

        printf("STARTING SWAP BUFFER : %dn", SDL_GetTicks() - loopStart);
        SDL_GL_SwapBuffers();


        int total = SDL_GetTicks() - loopStart;
        if (total > 6)
            printf("END LOOP  : %d ------------------------------------------------------------>n", total);
        else
             printf("END LOOP  : %dn", total);

    }
    // END MAIN LOOP

    return 0;
}
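
Note that this bare-bones loop never pumps the SDL event queue, so done is never set and the window cannot be closed normally. Below is a minimal sketch of draining the queue each frame, assuming the same SDL 1.2 API; pump_events is a hypothetical helper, not part of the original program:

#include <SDL/SDL.h>

// Hypothetical helper: drain the OS event queue once per frame.
// Returns false once SDL_QUIT is seen so the main loop can exit.
bool pump_events()
{
    SDL_Event event;
    while (SDL_PollEvent(&event))   // non-blocking; empties the queue
    {
        if (event.type == SDL_QUIT)
            return false;
    }
    return true;
}

Calling done = !pump_events(); at the top of the while loop keeps the event queue drained and lets the window close cleanly.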

My "VideoEngine" constructor:

VideoEngine::VideoEngine()
{
    UNIT = 16;
    SCREEN_X = 320;
    SCREEN_Y = 240;
    SCALE = 1;


    // Begin Initialization

        SDL_Surface *screen;

        SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );  // [!] SDL_GL_SetAttributes must be done BEFORE SDL_SetVideoMode

        screen = SDL_SetVideoMode( SCALE*SCREEN_X, SCALE*SCREEN_Y, 16, SDL_OPENGL );  // Set screen to the window with opengl
        if ( !screen )  // make sure the window was created
        {
            printf("Unable to set video mode: %sn", SDL_GetError());
        }

        // set opengl state
        opengl_init();

    // End Initialization

}
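
On the setup side, SDL 1.2.10+ also exposes a swap-control attribute, so vsync can be requested on or off before SDL_SetVideoMode. That makes it possible to check whether the long SDL_GL_SwapBuffers() calls are simply the driver blocking on the vertical retrace. A rough sketch under that assumption (init_video_with_vsync is a hypothetical variant of the constructor, not the original code):

#include <stdio.h>
#include <SDL/SDL.h>

// Hypothetical variant of the video setup with explicit swap control (SDL 1.2.10+).
SDL_Surface *init_video_with_vsync( int vsync_on )
{
    SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );                 // before SDL_SetVideoMode
    SDL_GL_SetAttribute( SDL_GL_SWAP_CONTROL, vsync_on ? 1 : 0 );  // 0 = no vsync, 1 = vsync
    SDL_Surface *screen = SDL_SetVideoMode( 320, 240, 16, SDL_OPENGL );
    if ( !screen )
        printf( "Unable to set video mode: %s\n", SDL_GetError() );
    return screen;
}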

void VideoEngine::opengl_init()
{
    // Set the OpenGL state after creating the context with SDL_SetVideoMode

        //glClearColor( 0, 0, 0, 0 );                             // sets screen buffer to black
        //glClearDepth(1.0f);                                     // Tells OpenGL what value to reset the depth buffer when it is cleared
        glViewport( 0, 0, SCALE*SCREEN_X, SCALE*SCREEN_Y );     // sets the viewport to the default resolution (SCREEN_X x SCREEN_Y) multiplied by SCALE. (x,y,w,h)
        glMatrixMode( GL_PROJECTION );                          // Applies subsequent matrix operations to the projection matrix stack.
        glLoadIdentity();                                       // Replaces the current matrix with the identity matrix
        glOrtho( 0, SCALE*SCREEN_X, SCALE*SCREEN_Y, 0, -1, 1 ); //describes a transformation that produces a parallel projection
        glMatrixMode( GL_MODELVIEW );                           // Applies subsequent matrix operations to the modelview matrix stack.
        glEnable(GL_TEXTURE_2D);                                // Need this to display a texture
        glLoadIdentity();                                       // Replaces the current matrix with the identity matrix
        glEnable(GL_BLEND);                                     // Enable blending for transparency
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);      // Specifies pixel arithmetic
        //glDisable( GL_LIGHTING );                               // Disable lighting
        //glDisable( GL_DITHER );                                 // Disable dithering
        //glDisable( GL_DEPTH_TEST );                             // Disable depth testing

        //Check for error
        GLenum error = glGetError();
        if( error != GL_NO_ERROR )
        {
         printf( "Error initializing OpenGL! %sn", gluErrorString( error ) );
        }

    return;
}
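
To narrow down where the time actually goes, one option (hypothetical instrumentation, not in the original code) is to call glFinish() right before the swap: glFinish() blocks until every queued GL command has completed, so if the stall moves there, it is accumulated driver/GPU work rather than the swap itself. A sketch:

#include <stdio.h>
#include <SDL/SDL.h>
#include <SDL/SDL_opengl.h>

// Hypothetical helper: time the GL flush and the buffer swap separately.
void timed_swap()
{
    Uint32 t0 = SDL_GetTicks();
    glFinish();                       // block until all queued GL work is done
    Uint32 t1 = SDL_GetTicks();
    SDL_GL_SwapBuffers();             // only the actual swap / vsync wait remains
    Uint32 t2 = SDL_GetTicks();
    printf( "glFinish: %u ms, swap: %u ms\n",
            (unsigned)(t1 - t0), (unsigned)(t2 - t1) );
}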

I'm starting to think I might have a hardware problem? Although I've never run into this issue before.

