I'm fairly new to coding, and brand new to graphics. I would be grateful for a bit of direction, as I haven't been able to find the answer I need.
I have a class that extends View. I created several 3-layered rects on a custom canvas. I can touch and move each rect, pan the whole canvas, move the individual rects after panning, and zoom the whole screen. The one thing I cannot do is move the individual rects after I zoom: I can't "find" the rects after zooming. I'm about ready to give up on the whole custom view and try to do this with an ImageView or something I can get a handle on, versus trying to work out what was touched from coordinates.
My question is: how do you find the coordinates of a rect after zooming? My guess was to recalculate every single rect based on the movement or scale of the canvas, but clearly I'm missing something, as this seems overly complex and doesn't work for me.
Here's some of the code - please forgive the mess; I'm just trying to get the idea down. I've tried using the matrix as well as the scaleFactor in the calculations, but neither seems to get me to the right place. (Some of the code isn't used - I tried it but couldn't get it to calculate where the rects were properly... if that's even the right way to do this?)
@Override
protected void onDraw(Canvas canvas) {
    super.onDraw(canvas);
    canvas.translate(xOffset, yOffset);
    canvas.scale(scaleFactor, scaleFactor);
    for (RectF r : RectFs) {
        paint.setStyle(Paint.Style.STROKE);
        paint.setColor(Color.BLACK);
        canvas.drawRoundRect(r, 20, 20, paint); // Set a border
        paintShader.setStyle(Paint.Style.FILL);
        paintShader.setShader(new LinearGradient(r.left, r.bottom, r.right, r.top,
                Color.CYAN, Color.YELLOW, Shader.TileMode.REPEAT));
        canvas.drawRoundRect(r, 20, 20, paintShader); // Gradient fill
        paint.setColor(Color.BLACK);
        paint.setTextSize(textSize);
        paint.setTypeface(Typeface.defaultFromStyle(Typeface.BOLD));
        paint.setLinearText(true);
        canvas.drawText(text, r.left + px, r.top + py, paint); // Label
    }
}
@Override
public boolean onTouchEvent(MotionEvent event) {
    scaleGestureDetector.onTouchEvent(event);
    touched = true;
    if (!panned) {
        left = event.getX() - rectWidth / 2;
        top = event.getY() - rectHeight / 2;
    }
    switch (event.getAction()) {
        case MotionEvent.ACTION_DOWN:
            startPos.set(event.getX(), event.getY());
            findWhatWasTouched();
            invalidate();
            break;
        case MotionEvent.ACTION_POINTER_DOWN:
        case MotionEvent.ACTION_MOVE:
            if (freezeAction)
                break;
            // Only move if the ScaleGestureDetector isn't processing a gesture.
            if (!scaleGestureDetector.isInProgress() && panned) {
                xOffset += event.getX() - startPos.x;
                yOffset += event.getY() - startPos.y;
            } else {
                resetAllRectsAfterScrollOrZoom();
            }
            findWhatWasTouched();
            invalidate();
            startPos.set(event.getX(), event.getY());
            break;
        case MotionEvent.ACTION_POINTER_UP:
            // When two fingers come off a zoom, don't let the remaining finger jump everything around.
            freezeAction = true;
            break;
        case MotionEvent.ACTION_UP:
            if (!freezeAction) {
                findWhatWasTouched();
                invalidate();
                if (panned || zoomed)
                    resetAllRectsAfterScrollOrZoom();
            }
            liveRect = null;
            panned = false;
            freezeAction = false;
            xOffset = 0;
            yOffset = 0;
            break;
    }
    return true;
}
public void resetAllRectsAfterScrollOrZoom() {
    float height, width;
    for (RectF r : RectFs) {
        height = r.height();
        width = r.width();
        if (zoomed) {
            r.left += (xOffset * scaleFactor);
            r.top += (yOffset * scaleFactor);
        } else {
            r.left += xOffset;
            r.top += yOffset;
        }
        r.right = r.left + width;
        r.bottom = r.top + height;
    }
}
public void findWhatWasTouched() {
    boolean rectWasTouched = false;
    for (RectF r : RectFs) { // Update positions due to a touch event.
        if (r.contains(startPos.x, startPos.y)) {
            if (liveRect == null)
                liveRect = r;
            rectWasTouched = true;
        }
    }
    if (!rectWasTouched)
        panned = true;
    if (!zoomed && !panned && liveRect != null) {
        // We have a single moving rect - reset its position here.
        float height = liveRect.height();
        float width = liveRect.width();
        liveRect.left = left;
        liveRect.top = top;
        liveRect.right = liveRect.left + width;
        liveRect.bottom = liveRect.top + height;
    }
}
private class ScaleListener extends ScaleGestureDetector.SimpleOnScaleGestureListener {
    @Override
    public boolean onScale(ScaleGestureDetector detector) {
        scaleFactor *= detector.getScaleFactor();
        zoomed = true;
        // Don't let the object get too small or too large.
        scaleFactor = Math.max(0.1f, Math.min(scaleFactor, 5.0f));
        return true;
    }

    @Override
    public void onScaleEnd(ScaleGestureDetector detector) {
        zoomed = false;
    }
} // End ScaleListener class
} // End view class
The biggest issue is that touching the rects after zooming moves them further down and to the right, increasingly so the further I move in that direction... which leads me to think the math is wrong in the zoom branch of resetAllRectsAfterScrollOrZoom(). But for the life of me I can't figure it out. I still don't know whether resetting the rect coordinates every time is even the right way to do this.
This took me days to sort out, and I don't think I could ever have figured it out without SO. Thanks especially to @Awais Tariq for the clue I needed here: Get Canvas coordinates after scaling up/down or dragging in android. Basically everything (movements, translations, and most importantly, checks against pointer positions) must run through a calculation that accounts for the current scale and translation of the canvas. Here's what I changed, in case anyone runs into this mess in the future.
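Roughly like this - a sketch of the idea rather than a line-for-line copy, using the same xOffset, yOffset and scaleFactor fields as in onDraw() above. The key point is that the pan offsets are screen pixels while the rects live in canvas coordinates, so the offsets get divided by the current scale instead of multiplied by it:

public void resetAllRectsAfterScrollOrZoom() {
    float height, width;
    for (RectF r : RectFs) {
        height = r.height();
        width = r.width();
        // xOffset/yOffset are accumulated in screen pixels; dividing by the
        // current scale converts them into canvas units before moving the rect.
        // At scaleFactor == 1 this is identical to the un-zoomed case.
        r.left += xOffset / scaleFactor;
        r.top += yOffset / scaleFactor;
        r.right = r.left + width;
        r.bottom = r.top + height;
    }
}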
This recalcs the positions of the rects after a zoom. But the pointer is also 'off', increasingly so the further you get from (0,0). To correct that, I used the following kind of formula inside the if-check that tests what was touched.
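Something along these lines - again a sketch, assuming startPos holds the raw screen touch point and the rects are stored in canvas coordinates:

// Inside the loop in findWhatWasTouched(): undo the translate/scale that
// onDraw() applies, so the touch point and the rect are compared in the
// same coordinate space.
float touchX = (startPos.x - xOffset) / scaleFactor;
float touchY = (startPos.y - yOffset) / scaleFactor;
if (r.contains(touchX, touchY)) {
    if (liveRect == null)
        liveRect = r;
    rectWasTouched = true;
}

In other words, since onDraw() maps canvas coordinates to the screen as screenX = canvasX * scaleFactor + xOffset, the touch point has to be mapped back with canvasX = (screenX - xOffset) / scaleFactor before the contains() check.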
Everything is working with this solution. I may go back and try this with a layout instead, to see whether passing a view into the if-check might be easier; but for now, this works. It was way too complex for my newbie coder brain to figure out on its own, and I wonder if this isn't something that could be built into Android in the future.