What's wrong with writefln("%s", glGetString(GL_EXTENSIONS));
    Chris Sauls 
    ibisbasenji at gmail.com
       
    Sat Mar  4 10:28:34 PST 2006
    
    
  
Cris wrote:
> it doesn't work that way: "src\engine\renderer.d(104): cannot implicitly 
> convert expression ((glGetString)(7939u)) of type ubyte* to char*"
> 
Given that its natural type is ubyte*, you should be able to do this:
# private import std.string;
#
# // Treat the zero-terminated ubyte* as a C string and hand it back as a char[].
# char[] toString (ubyte* foo) {
#   return std.string.toString(cast(char*) foo);
# }
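At the call site that would look roughly like this (just a sketch, untested; I'm assuming your GL binding declares glGetString as returning ubyte* and that GL_EXTENSIONS is in scope):
# char[] exts = toString(glGetString(GL_EXTENSIONS));
# writefln("%s", exts);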
I would think so, anyhow.  If the compiler will not allow a direct cast to char*, you can always add an intermediate cast to void* first, as in the sketch below.
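That variant of the helper would just be (again, only a sketch):
# char[] toString (ubyte* foo) {
#   // Go through void* first, then to char*, before wrapping as a char[].
#   return std.string.toString(cast(char*) cast(void*) foo);
# }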
-- Chris Nicholson-Sauls
    
    