Closed hankcs closed 8 years ago
Thanks @hankcs
Yes, you need to handle the touch events (onTouch, onMove, or whatever they are called on Android). When the finger first touches the drawing area, just call startStroke(). As the finger moves, call strokeTo(x, y) with the finger's coordinates on your drawing area.
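The event-to-stroke mapping described above can be sketched like this. This is a minimal, illustrative Python sketch: the `Engine` class is a stand-in for the paint engine, and only the `startStroke`/`strokeTo` call names mirror qtmypaint; everything else is a placeholder.

```python
# Minimal sketch of the touch-event -> stroke mapping described above.
# "Engine" is an illustrative stand-in for the paint engine; it records
# calls so the flow is visible instead of actually painting.

class Engine:
    def __init__(self):
        self.calls = []

    def start_stroke(self):          # mirrors startStroke()
        self.calls.append(("startStroke",))

    def stroke_to(self, x, y):       # mirrors strokeTo(x, y)
        self.calls.append(("strokeTo", x, y))

# Touch handler: a DOWN event begins a stroke, MOVE/UP events feed
# finger coordinates into it.
def on_touch(engine, action, x, y):
    if action == "DOWN":
        engine.start_stroke()
        engine.stroke_to(x, y)       # also paint at the initial contact point
    elif action in ("MOVE", "UP"):
        engine.stroke_to(x, y)

engine = Engine()
for event in [("DOWN", 10, 10), ("MOVE", 12, 14), ("UP", 15, 18)]:
    on_touch(engine, *event)

print(engine.calls)
```

On Android the `action` string would come from MotionEvent's ACTION_DOWN / ACTION_MOVE / ACTION_UP constants; the structure is the same.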
The drawing area can be anything, but the easiest approach is to create a QGraphicsView with a QGraphicsScene. When the lib calls your newTile(MPSurface, MPTile) callback, you need to add the MPTile (which is a QGraphicsItem) to your QGraphicsScene.
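The tile flow can be sketched as follows. Note this is a hedged, pure-Python illustration: `Scene`, `Surface`, and the tile representation are all stand-ins (not the real MPSurface/MPTile types), and `add_item` only mimics the role of QGraphicsScene::addItem; the one real detail is that libmypaint works in 64x64-pixel tiles.

```python
# Sketch of the newTile callback flow: the surface creates tiles lazily
# and hands each new one to the view layer, which adds it to the scene.
# All classes here are illustrative stand-ins, not the real qtmypaint types.

TILE_SIZE = 64  # libmypaint surfaces are made of 64x64 pixel tiles

class Scene:
    def __init__(self):
        self.items = []

    def add_item(self, item):        # mimics QGraphicsScene::addItem
        self.items.append(item)

class Surface:
    def __init__(self, on_new_tile):
        self.tiles = {}
        self.on_new_tile = on_new_tile   # the newTile(surface, tile) callback

    def tile_at(self, x, y):
        key = (int(x) // TILE_SIZE, int(y) // TILE_SIZE)
        if key not in self.tiles:
            self.tiles[key] = key        # a real tile would hold pixel data
            self.on_new_tile(self, self.tiles[key])
        return self.tiles[key]

scene = Scene()
surface = Surface(on_new_tile=lambda surf, tile: scene.add_item(tile))

# Two of these points fall in the same tile, so only two tiles are created.
for x, y in [(10, 10), (70, 10), (75, 12)]:
    surface.tile_at(x, y)

print(len(scene.items))  # 2
```

The design point: the view never iterates the whole surface; it only learns about tiles the moment the engine first touches them.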
I suggest you look at the demo app included in this project for a real-world example.
Good luck with your Android app!
Let me know about your project when it is live! I am curious about new apps using this lib! :)
Thank you!
Your advice is really helpful, and I'll study your code before I start.
I'm working with a small team. We've made an app that aims to teach people how to draw. It currently uses a vector graphics library, and I've decided to switch to a more powerful engine like libmypaint. I hope our work will help people improve their drawing skills.
I'll add "inspired by qtmypaint" to the about page in the next version of our app. Thank you again!
Hello feeel, thank you for your Qt port.
I'm looking to port libmypaint to Android, using OpenGL ES for the rendering, but I'm not familiar with Qt.
As I understand it, libmypaint runs its algorithm on a "surface"; what I need to do is pass the touch gesture coordinates (x, y) to "stroke_to" and draw the surface on the screen, whether with OpenGL or other native graphics functions. Am I right?
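One detail worth planning for on the rendering side: since the surface is tiled, after each stroke segment only the tiles the stroke touched need to be re-uploaded as GL textures. A hedged Python sketch of that dirty-tile idea follows; every name here is a placeholder, not the libmypaint or qtmypaint API.

```python
# Illustrative dirty-tile tracking for an OpenGL ES renderer: after each
# stroke segment, only the touched tiles are re-uploaded as textures.
# All names are placeholders, not the real libmypaint API.

class Renderer:
    def __init__(self):
        self.uploaded = set()   # tiles that already have a GL texture
        self.dirty = set()      # tiles changed since the last frame

    def mark_dirty(self, tile_key):
        self.dirty.add(tile_key)

    def frame(self):
        """Re-upload only the dirty tiles, then draw. Returns upload count."""
        uploads = len(self.dirty)
        self.uploaded |= self.dirty   # a real renderer would call glTexSubImage2D here
        self.dirty.clear()
        return uploads

r = Renderer()
# Suppose one stroke segment touched two distinct tiles:
for key in [(0, 0), (1, 0), (0, 0)]:
    r.mark_dirty(key)

print(r.frame())  # 2: only the two touched tiles are uploaded
print(r.frame())  # 0: nothing is dirty on the next frame
```

Uploading only dirty tiles keeps per-frame texture traffic proportional to the stroke, not to the canvas size, which matters on mobile GPUs.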
Thank you in advance.