I've been doing a lot of searching on the math behind this conversion, and the best I've been able to come up with so far is this:
x = sin(horizontal_angle) * cos(vertical_angle)
y = sin(horizontal_angle) * sin(vertical_angle)
z = cos(horizontal_angle)
For arbitrary angles, this works fine. The problem arises when one of the rotations is 0 degrees. At 0 degrees (or 180, or 360, or...), sin() is going to be zero, which means both the x and y coordinates I get out of the above formulas will be zero, regardless of what the other angle is set to.
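To make the behavior concrete, here's a minimal Python sketch of the formulas above (assumptions: angles are in radians, radius is 1, and the function name to_cartesian is just for illustration):

import math

def to_cartesian(horizontal_angle, vertical_angle):
    # Direct translation of the formulas above (unit radius, angles in radians).
    x = math.sin(horizontal_angle) * math.cos(vertical_angle)
    y = math.sin(horizontal_angle) * math.sin(vertical_angle)
    z = math.cos(horizontal_angle)
    return x, y, z

# With horizontal_angle = 0, sin(0) is 0, so x and y come out as 0 no matter what vertical_angle is:
print(to_cartesian(0.0, math.radians(45)))  # (0.0, 0.0, 1.0)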
Is there a better formula out there that doesn't mess up at certain angles? My searches so far haven't found one, but there has to be a solution to this issue.
Update: After some experimentation, I found that my main misunderstanding was that I had assumed the poles of my spherical coordinates were vertical (like latitude and longitude on a planet), when they were actually horizontal (projected into the screen). This is because I'm working in screen space (x/y mapped to the screen, z projected into the screen) rather than in a traditional 3D environment, but I somehow didn't think that would be a contributing factor.
The final formula that worked for me to get the poles oriented correctly:
x = cos(horizontal_angle) * sin(vertical_angle)
y = cos(vertical_angle)
z = sin(horizontal_angle) * sin(vertical_angle)
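Here's the same idea as a small Python sketch for the screen-space convention (again assuming angles in radians and a unit radius; the function name screen_space_direction is made up for illustration):

import math

def screen_space_direction(horizontal_angle, vertical_angle):
    # Screen-space convention: x/y map to the screen, z points into the screen.
    x = math.cos(horizontal_angle) * math.sin(vertical_angle)
    y = math.cos(vertical_angle)
    z = math.sin(horizontal_angle) * math.sin(vertical_angle)
    return x, y, z

# With vertical_angle = 90 degrees the result lies in the screen's x-z plane,
# and horizontal_angle sweeps it from +x toward +z:
print(screen_space_direction(math.radians(0), math.radians(90)))   # (1.0, ~0.0, 0.0)
print(screen_space_direction(math.radians(90), math.radians(90)))  # (~0.0, ~0.0, 1.0)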
Your formula is correct for all angles, but the names you've given the angles are probably not quite right. What you've called "horizontal angle" is the inclination angle, that is, the angle between the vector and the z-axis. So if "horizontal angle" is 0, the point lies on the z-axis, which means it is correct for x and y to both be 0. What you've called "vertical angle" is actually the azimuth, the angle in the x-y plane. If it is 0, the point lies in the x-z plane, so y is correctly set to 0.
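To illustrate that point, here is the question's original formula with the conventional names attached (a sketch, assuming radians and a unit radius; inclination and azimuth are just the standard labels for what the question calls horizontal_angle and vertical_angle):

import math

def spherical_to_cartesian(inclination, azimuth):
    # inclination: angle measured from the z-axis ("horizontal angle" in the question)
    # azimuth:     angle in the x-y plane ("vertical angle" in the question)
    x = math.sin(inclination) * math.cos(azimuth)
    y = math.sin(inclination) * math.sin(azimuth)
    z = math.cos(inclination)
    return x, y, z

# inclination = 0 puts the point on the z-axis, so x = y = 0 is the expected result:
print(spherical_to_cartesian(0.0, math.radians(30)))               # (0.0, 0.0, 1.0)
# inclination = 90 degrees puts the point in the x-y plane:
print(spherical_to_cartesian(math.radians(90), math.radians(30)))  # (~0.87, ~0.5, ~0.0)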