OpenGL Error: Invalid Enumerant
GLSL Invalid Enumerant Error [Fixed] (GameDev.net forum, OpenGL and Vulkan)
Started by Giawa, 16 February 2008

#1 Giawa
Posted 16 February 2008 - 02:37 PM

Hi everyone, I've been searching for a few hours for a solution to this problem. I wa
OpenGL error GL_INVALID_ENUM (0x0500) while glewInit() (Stack Overflow, 4 votes)

I'm new to OpenGL and am trying to learn OpenGL 4 by following the http://www.openglbook.com tutorial. On page two, we create a simple triangle using shaders, but right after creating the vertex shader the program crashes. By debugging, I found an OpenGL error GL_INVALID_ENUM (0x0500). See the code below. Unfortunately, I couldn't find any solution. Maybe you know something?

Edit: After searching again for GL_INVALID_ENUM and glewInit(), I found that there are already many posts on other websites (e.g. http://www.gamedev.net/topic/483148-glsl-invalid-enumerant-error-fixed/ and http://stackoverflow.com/questions/19453439/opengl-error-gl-invalid-enum-0x0500-while-glewinit), but most of those posters had typos in their shaders, and I definitely don't. On http://www.opengl.org/wiki/OpenGL_Loading_Library I found: "You might still get GL_INVALID_ENUM (depending on the version of GLEW you use), but at least GLEW ignores glGetString(GL_EXTENSIONS) and gets all function pointers." Some recommend just ignoring the error, but I still can't run the program. By the way, could anyone of you try to run the program? If you got the same error, we'd know for sure it's not a wrong IDE/project configuration on my part.

Specs: Windows 8.1 64-bit; Intel Core i7-3517U with Intel HD 4000 GPU (OpenGL 4.0 support); 8 GB RAM; IDE: Eclipse IDE for C/C++ Developers, Kepler Service Release 1 (build 20130919-0819), using the MinGW compiler.

Code: #include
glewInit() apparently successful, sets error flag anyway (Stack Overflow, 1 vote)

I have recently migrated from Windows to Linux (Debian, 64-bit) and am trying to get a GPGPU development environment up and running, so I am testing a program which worked under Windows. Compiling and linking go fine, but when I run the program I get some odd errors. I am using GLEW and freeglut.

First snippet (OpenGL only):

i = 1;
info = PROGRAM_NAME;
glutInitContextVersion(4, 2);
glutInit(&i, &info);
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
glutInitWindowSize(W_SIZEX, W_SIZEY);
glutInitWindowPosition(W_POSX, W_POSY);
glutCreateWindow(info);
glClearColor(1.0, 1.0, 1.0, 0);
printf("Before glewInit: %i\n", glGetError());
printf("glewInit returns: %i\n", glewInit());
printf("After glewInit: %i\n", glGetError());

From which I get the following output:

Before glewInit: 0
glewInit returns: 0
After glewInit: 1280

This is an invalid enum error (1280 == 0x0500 == GL_INVALID_ENUM). I don't know what's causing it, but I suspect it might be related to the next error I get, later in the program's execution.

Related: http://stackoverflow.com/questions/14046111/glewinit-apparently-successful-sets-error-flag-anyway and http://gamedev.stackexchange.com/questions/29990/opengl-glgeterror-returns-invalid-enum-after-call-to-glewinit
Second snippet (OpenCL-OpenGL interop):

/* BUFFERS */
(*BFR).C[0] = clCreateBuffer(*CTX, CL_MEM_READ_WRITE, SD, 0, 0);
(*BFR).C[1] = clCreateBuffer(*CTX, CL_MEM_READ_WRITE, SD, 0, &i); dcl(i);
glGenBuffers(2, (*BFR).G);
glBindBuffer(GL_ARRAY_BUFFER, (*BFR).G[0]);
glBufferData(GL_ARRAY_BUFFER, SI, 0, GL_DYNAMIC_DRAW);
(*BFR).D[0] = clCreateFromGLBuffer(*CTX, CL_MEM_WRITE_ONLY, (*BFR).G[0], &i); dcl(i);
glBindBuffer(GL_ARRAY_BUFFER, 0);

Here, the dcl(int) method just decodes the CL error code. When I run this, I get a CL_INVALID_GL_OBJECT error from clCreateFromGLBuffer(). However, OpenGL has no issues generating, binding or unbinding the buffers in question. The OpenCL contex
OpenGL: glGetError() returns invalid enum after call to glewInit() (Game Development Stack Exchange, 1 vote)

I use GLEW and freeglut. For some reason, after a call to glewInit(), glGetError() returns error code 1280. Reinstalling the drivers didn't help. I tried to disable glewExperimental; it had no effect. The code worked before, but I am not aware of any changes I could have made. Here's my code:

int main(int argc, char* argv[])
{
    GLenum GlewInitResult, res;

    InitWindow(argc, argv);
    res = glGetError();          // res = 0
    glewExperimental = GL_TRUE;
    GlewInitResult = glewInit();
    res = glGetError();          // res = 1280
    glutMainLoop();
    exit(EXIT_SUCCESS);
}

void InitWindow(int argc, char* argv[])
{
    glutInit(&argc, argv);
    glutInitContextVersion(4, 0);
    glutInitContextFlags(GLUT_FORWARD_COMPATIBLE);
    glutInitContextProfile(GLUT_CORE_PROFILE);
    glutSetOption(GLUT_ACTION_ON_WINDOW_CLOSE, GLUT_ACTION_GLUTMAINLOOP_RETURNS);
    glutInitWindowPosition(0, 0);
    glutInitWindowSize(CurrentWidth, CurrentHeight);
    glutInitDisplayMode(GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA);
    WindowHandle = glutCreateWindow(WINDOW_TITLE);

    GLenum errorCheckValue = glGetError();
    if (WindowHandle < 1) {
        fprintf(stderr, "ERROR: Could not create new rendering window.\n");
        exit(EXIT_FAILURE);
    }

    glutReshapeFunc(ResizeFunction);
    glutDisplayFunc(RenderFunction);
    glutIdleFunc(IdleFunction);
    glutTimerFunc(0, TimerFunction, 0);
    glutCloseFunc(Cleanup);
    glutKeyboardFunc(KeyboardFunction);
}

Could someone tell me what I am doing wrong? Thanks.

(c++ / opengl / glut; asked Jun 1 '12 at 21:04 by malymato, edited Jun 1 '12 at 21:32)

Comment: is GLEW returning an error