Monday, November 22, 2010
There's nothing quite like the feeling of accomplishment that comes from getting something done. Best natural high EVER!
What is it that I accomplished? Nothing much, just got a true 6-degrees-of-freedom camera to work correctly and fly around a boring plane. It can roll, it can climb, it can move forward, it can do pretty much everything I want it to do. The control scheme I'm using is WASD to move around the plane, R and F to climb or fall, and Q and E to roll clockwise and counterclockwise, plus what I had done before: the mouse for pitch and yaw.
I'm very happy, but I also need to document how I do this.
Keyboard events are generated whenever someone presses or releases a key. The camera class has a number of boolean toggles that are flipped whenever the corresponding event happens. Then, during the camera update, each toggle is checked, and if it is true the camera is moved along the corresponding axis. Here, the use of vectors to store the camera's position and view pays off, since with these already in place advancing is trivial.
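In rough outline, the key handling looks something like this (the toggle struct and its field names are illustrative stand-ins, not the actual class members):

    #include <SDL/SDL.h>

    // Illustrative stand-in for the camera's boolean toggles.
    struct MoveToggles {
        bool forward, back, left, right, up, down, roll_cw, roll_ccw;
    };

    // Flip the toggle that matches the key: true on press, false on release.
    void handle_key(const SDL_KeyboardEvent &key, MoveToggles &t)
    {
        bool pressed = (key.type == SDL_KEYDOWN);
        switch (key.keysym.sym) {
            case SDLK_w: t.forward  = pressed; break;   // move forward
            case SDLK_s: t.back     = pressed; break;   // move back
            case SDLK_a: t.left     = pressed; break;   // strafe left
            case SDLK_d: t.right    = pressed; break;   // strafe right
            case SDLK_r: t.up       = pressed; break;   // climb
            case SDLK_f: t.down     = pressed; break;   // fall
            case SDLK_q: t.roll_cw  = pressed; break;   // roll clockwise
            case SDLK_e: t.roll_ccw = pressed; break;   // roll counterclockwise
            default: break;
        }
    }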
To move forward, I multiply the view vector by a scalar that determines my speed. The resulting vector is added to the camera's position vector, and that's it. To move back, it's the same, except the resulting vector is subtracted from the camera's position. The other movements follow the same logic along the up and right vectors. Finally, roll is a rotation of the up vector around the view vector, done with quaternions just like yaw and pitch.
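Putting that together with the toggles from the sketch above, the per-frame translation works out to roughly the following (Vec3 and its helpers are bare-bones stand-ins for my own vector class):

    #include <cmath>

    // Bare-bones vector type for the sketch; the real camera uses its own class.
    struct Vec3 {
        float x, y, z;
        Vec3(float x = 0, float y = 0, float z = 0) : x(x), y(y), z(z) {}
    };

    Vec3 add(const Vec3 &a, const Vec3 &b) { return Vec3(a.x + b.x, a.y + b.y, a.z + b.z); }
    Vec3 sub(const Vec3 &a, const Vec3 &b) { return Vec3(a.x - b.x, a.y - b.y, a.z - b.z); }
    Vec3 scale(const Vec3 &v, float s)     { return Vec3(v.x * s, v.y * s, v.z * s); }
    Vec3 cross(const Vec3 &a, const Vec3 &b)
    {
        return Vec3(a.y * b.z - a.z * b.y,
                    a.z * b.x - a.x * b.z,
                    a.x * b.y - a.y * b.x);
    }

    // Per-frame translation along the camera's own axes.
    void move_camera(Vec3 &position, const Vec3 &view, const Vec3 &up,
                     const MoveToggles &t, float speed)
    {
        Vec3 right = cross(view, up);   // the camera's right vector

        if (t.forward) position = add(position, scale(view,  speed));
        if (t.back)    position = sub(position, scale(view,  speed));
        if (t.right)   position = add(position, scale(right, speed));   // strafe
        if (t.left)    position = sub(position, scale(right, speed));
        if (t.up)      position = add(position, scale(up,    speed));   // climb
        if (t.down)    position = sub(position, scale(up,    speed));
        // Roll (Q/E) rotates the up vector around the view vector with the same
        // quaternion machinery used for pitch and yaw, so it isn't shown here.
    }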
I'm very excited to be able to get around the world like this. Now I just need to populate this world with something interesting to visit. The next few updates should prove a lot more interesting, now that the basic framework at least is done.
Sunday, November 21, 2010
A small step for man, a giant leap for the game
I have finally managed to get the camera to work! I control the vertical! I control the horizontal!
Ok, so the mouse now moves the camera around. After days and hours of experimentation, I have figured out how it all fits together. I am making a 6DOF camera, although right now only attitude and bearing are affected (pitch and yaw).
The way this works is that the camera stores three vectors: the position of the camera, the view vector, and the up vector. Position is fairly straightforward. View is the direction in which the camera is looking. The up vector orients the camera, indicating which way is up. All pretty self-explanatory.
I then use the gluLookAt function to set the correct perspective. One warning about the way I do this: my view vector stores the direction the camera is facing, but gluLookAt expects, as one of its parameters, the position of something you're looking at, which is then used as the center of the screen. The way I get this position is to add the camera's position and the view vector and pass the result along to gluLookAt.
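Roughly, the call ends up looking like this (plain float triples here, standing in for my vector class):

    #include <GL/glu.h>

    // Hand the camera's vectors to gluLookAt; the "center" argument is
    // position + view, since gluLookAt wants a point to look at, not a direction.
    void apply_camera(const float pos[3], const float view[3], const float up[3])
    {
        float target[3] = { pos[0] + view[0], pos[1] + view[1], pos[2] + view[2] };

        gluLookAt(pos[0],    pos[1],    pos[2],
                  target[0], target[1], target[2],
                  up[0],     up[1],     up[2]);
    }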
gluLookAt then subtracts the position from the target to get its own view vector, which it then normalizes. So gluLookAt is undoing some of my work. I will write a function of my own to do what gluLookAt does, so I can avoid the unnecessary step, but it isn't a priority at this time.
The trick is how to get the camera to rotate with what I have. The solution I am using is quaternions to describe the rotations. The way this works is by incrementally rotating the view and up vectors with the quaternions.
I use SDL to detect when the mouse moves, which generates an event. This provides me with both absolute and relative movement. Absolute movement gives the screen coordinates of the mouse, with (0,0) being the top left corner. Relative movement is the accumulated movement since the last event. Because the program doesn't keep track of the mouse outside of the window, I have to warp the mouse back to the center of the screen each time it moves away.
This was creating a problem: the warp itself generated a mouse movement event that was the exact opposite of the movement I had just handled. To fix it, I now check the absolute position of the mouse when I catch an event, and if it is at the center I ignore that particular event. Since the mouse can only be at the center right after warping, and any movement I care about draws it away, I don't lose anything.
Once I have the relative movement, I multiply it by a sensitivity factor and store it as the number of degrees the camera turned in that frame. Horizontal movement of the mouse affects the bearing of the camera, while vertical movement affects the attitude.
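The mouse handling, warp included, comes down to something like this (the window size and sensitivity value are placeholders, not the real numbers):

    #include <SDL/SDL.h>

    const int   CENTER_X    = 400;     // half of an assumed 800x600 window
    const int   CENTER_Y    = 300;
    const float SENSITIVITY = 0.1f;    // degrees of rotation per pixel of travel

    // Convert a mouse motion event into bearing/attitude changes in degrees.
    // Returns false for the echo event produced by warping the cursor back to center.
    bool handle_mouse(const SDL_MouseMotionEvent &motion,
                      float &delta_bearing, float &delta_attitude)
    {
        // The warp below generates its own motion event, landing exactly on the
        // center; real movement always reports some other absolute position.
        if (motion.x == CENTER_X && motion.y == CENTER_Y)
            return false;

        delta_bearing  = motion.xrel * SENSITIVITY;   // horizontal travel -> bearing (yaw)
        delta_attitude = motion.yrel * SENSITIVITY;   // vertical travel   -> attitude (pitch)

        SDL_WarpMouse(CENTER_X, CENTER_Y);            // keep the cursor in the window
        return true;
    }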
So, on to the rotations themselves. Changes in bearing are described as a rotation around the up vector of the camera. This modifies the view vector, but leaves position and up unchanged. Quaternions describe this rotation: I use a convert_axis_angle function, building the quaternion from the up vector and the angle and then normalizing it. I then create a quaternion from my view vector; for any given vector v = [x, y, z], the corresponding quaternion is q = [0, v] = [0, x, y, z].
The rotation of the view vector is the sandwich product q_bearing x q_view x conjugate(q_bearing). The result is stored back in view.
For attitude changes, I need the cross product of view and up, which gives me the right vector of the coordinate trio (view, up, right). This time, both view and up are modified by the rotation. So I create the attitude quaternion with convert_axis_angle, using the right vector this time, and apply both rotations:
q_attitude x q_view x conjugate(q_attitude)
q_attitude x q_up x conjugate(q_attitude)
The results are stored back in view and up respectively.
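For reference, a bare-bones version of that quaternion math looks something like the sketch below; the names are my own for illustration, not the project's actual code.

    #include <cmath>

    // Minimal quaternion type: w is the scalar part, (x,y,z) the vector part.
    struct Quat { float w, x, y, z; };

    // Hamilton product of two quaternions.
    Quat quat_mul(const Quat &a, const Quat &b)
    {
        Quat r;
        r.w = a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z;
        r.x = a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y;
        r.y = a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x;
        r.z = a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w;
        return r;
    }

    Quat conjugate(const Quat &q) { Quat r = { q.w, -q.x, -q.y, -q.z }; return r; }

    // convert_axis_angle: unit quaternion for a rotation of 'degrees' around the given axis.
    Quat convert_axis_angle(float ax, float ay, float az, float degrees)
    {
        float half = degrees * 3.14159265f / 360.0f;    // half the angle, in radians
        float len  = std::sqrt(ax*ax + ay*ay + az*az);  // normalize the axis
        float s    = std::sin(half) / len;
        Quat q = { std::cos(half), ax*s, ay*s, az*s };
        return q;
    }

    // Rotate the vector (vx,vy,vz) by q: the sandwich product q x [0,v] x conjugate(q).
    void rotate_vector(const Quat &q, float &vx, float &vy, float &vz)
    {
        Quat v = { 0.0f, vx, vy, vz };
        Quat r = quat_mul(quat_mul(q, v), conjugate(q));
        vx = r.x; vy = r.y; vz = r.z;
    }

A bearing change would call rotate_vector on view with a quaternion built around up; an attitude change would call it on both view and up with a quaternion built around the right vector.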
After this is done, I use gluLookAt as described above, and voila, it is done.
There's room for growth, of course. Roll can easily be added once the input is mapped; in the case of roll, the rotation would be around the view vector, and only up would be modified. I also need to add translation, to move the camera around. This is easily done now that I have a view vector: lateral movement (strafing) means taking the cross product of view and up to get the right vector and moving along it, vertical movement is done along the up vector, and forward movement along the view vector.
Ok, that's it for now. Next up, making the world a more interesting place.
Friday, November 19, 2010
Square One
Well, got the framework back up and running. That was fairly painless, though I noticed my previous posts were a bit light on details. So, just in case something like this ever happens again, the libraries I'm linking to are listed below, followed by a sample link command:
mingw32
SDLmain
SDL.dll
opengl32
glu32
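Since the project files are gone, I can't copy the exact build line, but with MinGW it should look roughly like this (SDL.dll itself just needs to sit next to the executable at run time):

    g++ main.cpp -o game.exe -lmingw32 -lSDLmain -lSDL -lopengl32 -lglu32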
I will try to focus on getting the camera working for now. The camera is after all at the heart of the game. You can't have a 3d space game without freedom in all three dimensions.
Thursday, November 18, 2010
Why backups are important
So it turns out the USB drive where I was keeping my project files died a couple of days ago. Attempts to revive it have not been successful. As a result, I'm back at square one. What I learned up to now will help me get up to speed quickly when I start again, but it is fairly sad to see all that work gone. :(
Friday, November 12, 2010
Project 1 - Life getting in the way
Had a busy week. Between a short trip and work, I haven't had the chance to fix the camera. I have managed to break it, and to get the new computer's development environment up and running, though. I'll get some more stuff up here as soon as I can.
I might just get the camera up to some basic stuff and continue working on other things to keep everything going.
Wednesday, November 3, 2010
Project 1 - Moving the camera, part 5
I said I'd work on something else, but I couldn't leave the camera as it was. I knew there had to be a better way to do what I'd been doing, and it turns out there is. It's called quaternions, and it's confusing. Thankfully I've found plenty of explanations online, most directed at modeling in 3D. So I'm taking a bit from a few different tutorials to make my camera more robust. The code from NeHe's Lesson: Quaternion Camera Class is coming in particularly handy.
I'm not lifting the camera class completely (it doesn't do everything I need), but the quaternion class is handy and saves me the time it would take to fully grasp how to multiply them and turn them into matrices. So I'm keeping that. I'll still need to read up on them, though. As it is, things mostly work, though I'm still stuck with the controls being reversed after making a half turn (up and down become inverted).
After a couple of days of thinking about it, it's occurred to me that the way I'm going about it might be to blame, particularly with the mouse. I'm trying to keep track of attitude and bearing, both over a full 360 degrees. I had to, because if I didn't let them move freely (by limiting attitude, for instance, to +/- 90°) I wouldn't be able to make full revolutions. The problem is that when I go past 90° I start flying inverted, but moving the mouse still adjusts the absolute angle around the fixed y axis, which inverts the controls. I could try telling the program to invert the rotation when I go past 90°, but then if I rolled while flying forward I'd have the same problem.
The idea I'm toying with now is to keep track of the camera's frame of reference. I am not yet sure how I'll go about doing this, though some vector algebra and the cross product seem to offer promising opportunities.