Hey, I am developing a Cocos2d-x application with Xcode for an iPhone 5c, and I want to switch the app to portrait orientation. I followed these directions: http://www.cocos2d-x.org/wiki/Device_Orientation. According to them, you need to change a couple of methods in the RootViewController so that iOS has already handled rotating the container before Cocos does its magic.
Now the applicable methods of my RootViewController.mm look as follows. It builds and runs.
#ifdef __IPHONE_6_0
- (NSUInteger) supportedInterfaceOrientations {
    return UIInterfaceOrientationMaskPortrait;
}
#endif

- (BOOL) shouldAutorotate {
    return YES;
}

- (BOOL) shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation {
    return UIInterfaceOrientationIsPortrait(interfaceOrientation);
}

- (void) didRotateFromInterfaceOrientation:(UIInterfaceOrientation)fromInterfaceOrientation {
    [super didRotateFromInterfaceOrientation:fromInterfaceOrientation];

    auto glview = cocos2d::Director::getInstance()->getOpenGLView();
    if (glview)
    {
        CCEAGLView *eaglview = (__bridge CCEAGLView *)glview->getEAGLView();
        if (eaglview)
        {
            CGSize s = CGSizeMake([eaglview getWidth], [eaglview getHeight]);
            cocos2d::Application::getInstance()->applicationScreenSizeChanged((int)s.width, (int)s.height);
        }
    }
}
@end
The init() method of my level looks like this:
bool Level1::init()
{
    //////////////////////////////
    // 1. super init first
    if ( !Scene::init() )
    {
        return false;
    }

    auto visibleSize = Director::getInstance()->getVisibleSize();
    Vec2 origin = Director::getInstance()->getVisibleOrigin();

    for (int i = 0; i < MAX_TOUCHES; ++i) {
        labelTouchLocations[i] = Label::createWithSystemFont("", "Arial", 42);
        labelTouchLocations[i]->setVisible(false);
        this->addChild(labelTouchLocations[i]);
    }

    // Create an eventListener to handle multiple touches, using a lambda, cause baby, it's C++11
    auto eventListener = EventListenerTouchAllAtOnce::create();
    eventListener->onTouchesBegan = [=](const std::vector<Touch*>& touches, Event* event) {
        // Clear all visible touches in case there are fewer fingers touching than last time
        std::for_each(labelTouchLocations, labelTouchLocations + MAX_TOUCHES, [](Label* touchLabel) {
            touchLabel->setVisible(false);
        });
        // For each touch in the touches vector, set a Label to display at its location and make it visible
        for (size_t i = 0; i < touches.size(); ++i) {
            labelTouchLocations[i]->setPosition(touches[i]->getLocation());
            labelTouchLocations[i]->setVisible(true);
            labelTouchLocations[i]->setString("Touched");
            std::cout << "(" << touches[i]->getLocation().x << "," << touches[i]->getLocation().y << ")" << std::endl;
        }
    };

    _eventDispatcher->addEventListenerWithSceneGraphPriority(eventListener, this);

    return true;
}
When I touch the bottom-left corner of the iPhone's screen, the x coordinate that gets printed is about 150, while the y coordinate is 0, as expected. Why is the x coordinate not zero? Is there something I am missing in the creation of the scene? I believe I made all the changes that the Cocos documentation requires. Are their docs out of date?
As it turns out, the coordinate system does not start at (0,0) in the bottom-left corner, as the documentation would suggest. However, there are simple methods, already used in the default scene, that return the origin and size of the visible area.
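For reference, here is a minimal sketch (assuming cocos2d-x v3.x, placed inside a touch callback where `touch` is one of the `cocos2d::Touch*` objects from the touches vector) of how the visible origin and size can be used to express a touch position relative to the bottom-left corner of the visible area; the ~150 offset I was seeing would then show up as origin.x:

    // Sketch: express a touch location relative to the visible area's bottom-left corner.
    // Assumes cocos2d-x v3.x; "touch" is a cocos2d::Touch* from the touch callback.
    auto origin      = Director::getInstance()->getVisibleOrigin();   // e.g. roughly (150, 0) in my case
    auto visibleSize = Director::getInstance()->getVisibleSize();

    Vec2 worldPos    = touch->getLocation();   // OpenGL/world coordinates
    Vec2 relativePos = worldPos - origin;      // (0,0) at the visible area's bottom-left

    std::cout << "origin: (" << origin.x << "," << origin.y << ")"
              << " visible: " << visibleSize.width << "x" << visibleSize.height
              << " touch relative to origin: (" << relativePos.x << "," << relativePos.y << ")"
              << std::endl;

As far as I can tell, the non-zero origin comes from how the design resolution is fitted to the actual screen, so positions are best computed relative to the visible origin and size rather than assumed to start at (0,0).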