Handling Multitouch Input

A tutorial on implementing pinch-zoom


Preface

The files associated with this tutorial should be required one at a time within main.lua. Uncomment only one require statement at a time to follow the workings in the logic below.
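
For reference, a minimal main.lua might look like the sketch below; the module names are assumptions based on the sample file names used in the headings of this tutorial:

-- main.lua - uncomment exactly one sample at a time
require("Sample1")        -- single touch
--require("Sample2")      -- multiple touches
--require("Sample3")      -- spawning tracking dots
-- ...and so on, up to...
--require("Sample11")     -- pinch zoom on a display group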


Introduction

Most applications (more than you'd expect) can perform perfectly fine with just one touch point. If you consider the large number of apps out there you can see that many have a huge feature set but still get by with just a single point of input because they are designed around buttons or individual swipe actions, etc.

Take Angry Birds, for example. This game requires that every tap, drag and swipe is performed by one finger. Navigating the menu, opening up settings options and firing the afore-mentioned birds with attitude is all done with one finger, and rightly so. It makes for a simple, intuitive and engrossing game.

However, even this most basic interface requires one simple trick learned from iOS. This involves using two fingers to "pinch" zoom in and out of the parallax-scrolling action.

So, that's simple, isn't it? The rule is: When one finger is used, perform the action for the object being touched. When two fingers are used, perform a gentle scaling of the top-level parent display group.

This tutorial aims to show you how to handle these multiple touch scenarios with as little hassle as possible. It will also try to provide some insight into the oft-requested pinch zoom...


Touch Basics

If you're reading this tutorial you probably already have some experience with the Corona touch model, so I will just highlight the core tenets.

    • addEventListener() is used to listen to a particular display object for user touches
    • There are two types of touch event: touch and tap
    • The touch event is comprised of phases: began, moved and ended
    • Listening to one display object for both touch and tap events will fire the touch event phases before the tap event fires
    • Returning true from an event function stops Corona from passing that event to any display objects beneath the object
    • system.activate("multitouch") enables multitouch
    • Once a touch event has begun future touch phases are directed to the same listener by calling display.getCurrentStage():setFocus()
    • setFocus can only be called once per object per event (without cancellation)
    • Calling dispatchEvent() on display objects fires artificial events
    • Events fired with dispatchEvent do not propagate down the display hierarchy
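
To illustrate the last two points, here is a minimal sketch (not one of the tutorial's sample files) showing dispatchEvent() firing an artificial touch event at a display object:

-- minimal sketch: dispatching an artificial touch event
local rect = display.newRect( 100, 100, 80, 80 )

local function onTouch( e )
    print( "received phase:", e.phase, e.x, e.y )
    return true
end
rect:addEventListener( "touch", onTouch )

-- fire a fake 'began' phase at the rect; unlike a real touch, this
-- dispatched event will not propagate to objects beneath the rect
rect:dispatchEvent( { name="touch", phase="began", x=100, y=100, target=rect } )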


The Tap Problem

As described above, touch events have a number of phases which literally describe the user’s interaction with the device: putting the finger on the screen, moving it around and letting go.

When it is listened for, the normal tap event is fired if the above event phases occur within a given time span (iOS employs about 350 milliseconds) and the distance between the began and ended locations is no greater than roughly 10 pixels.

This means that if you are listening for both touch and tap events you need to actually detect a tap within your touch listener function to know whether your tap listener function is going to be called. So, if you're already detecting taps you may as well not attach a tap listener at all.
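
As an illustrative sketch of that idea (the 350 ms and 10 px thresholds are the approximate values mentioned above, not official constants), detecting a tap inside a touch listener could look like this:

-- rough sketch: detecting a tap inside a touch listener
local button = display.newCircle( 160, 240, 40 )

local function onTouch( e )
    if (e.phase == "began") then
        -- remember when and where this touch started
        e.target.startTime = system.getTimer()
        e.target.startX, e.target.startY = e.x, e.y
    elseif (e.phase == "ended" and e.target.startTime) then
        local elapsed = system.getTimer() - e.target.startTime
        local dx, dy = e.x - e.target.startX, e.y - e.target.startY
        -- a tap is a quick touch which has not moved very far
        if (elapsed < 350 and (dx*dx + dy*dy) < 10*10) then
            print( "tap detected" )
        end
    end
    return true
end

button:addEventListener( "touch", onTouch )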

For the purposes of this tutorial that's exactly what we'll do: we will leave out tap events because they simply complicate our code.


Single Touch

Sample1.lua

To demonstrate the typical touch event, let's create a display object with a standard touch listener and use it to move the display object around.

-- single touch sample

-- create a user interface object
local circle = display.newCircle( 0, 0, 50 )

-- make it less imposing
circle.alpha = .5

-- standard single-touch event listener
function circle:touch(e)
    -- get the object which received the touch event
    local target = e.target
    -- handle each phase of the touch event life cycle...
    if (e.phase == "began") then
        -- tell corona that following touches come to this display object
        display.getCurrentStage():setFocus(target)
        -- remember that this object has the focus
        target.hasFocus = true
        -- indicate the event was handled
        return true
    elseif (target.hasFocus) then
        -- this object is handling touches
        if (e.phase == "moved") then
            -- move the display object with the touch (or whatever)
            target.x, target.y = e.x, e.y
        else -- "ended" and "cancelled" phases
            -- stop being responsible for touches
            display.getCurrentStage():setFocus(nil)
            -- remember this object no longer has the focus
            target.hasFocus = false
        end
        -- indicate that we handled the touch and not to propagate it
        return true
    end
    -- if the target is not responsible for this touch event return false
    return false
end

-- listen for touches starting on the touch layer
circle:addEventListener("touch")

The above function handles touch events when multitouch is not activated. This is not the simplest touch listener but it is practical and safe. It's also not the most complex, but any other work it could do should be performed by functions it can call. It caters for the following situations:

    • The touch starts on the object
    • The touch is used to move the object
    • Touches which started away from the object are ignored
    • Handled touches do not get passed to other display objects
    • Ignored touches get propagated to other display objects
    • The display object has its own :touch(e) function and not a global function

Note that the object will ignore touches which start elsewhere. This is because setting hasFocus indicates that the object should accept touch phases after began. Also, it will not lose the touch once it acquires it because setFocus tells Corona to direct all further input to this object.


Multiple Touches

Sample2.lua

Fortunately, converting this function to be used by multiple display objects is not difficult. The catch with setFocus is that each display object can only listen for one touch because, once it begins handling a touch, all other touch events are ignored on that object.

To demonstrate multitouch we will convert the above code to create multiple objects which will handle one touch each.

-- multi touch sample

-- turn on multitouch
system.activate("multitouch")

-- creates an object to be moved
local function newDragObj( x, y )
    -- create a user interface object
    local circle = display.newCircle( x, y, 50 )
    -- make it less imposing
    circle.alpha = .5
    -- standard multi-touch event listener
    function circle:touch(e)
        -- get the object which received the touch event
        local target = e.target
        -- handle each phase of the touch event life cycle...
        if (e.phase == "began") then
            -- tell corona that following touches come to this display object
            display.getCurrentStage():setFocus(target, e.id)
            -- remember that this object has the focus
            target.hasFocus = true
            -- indicate the event was handled
            return true
        elseif (target.hasFocus) then
            -- this object is handling touches
            if (e.phase == "moved") then
                -- move the display object with the touch (or whatever)
                target.x, target.y = e.x, e.y
            else -- "ended" and "cancelled" phases
                -- stop being responsible for touches
                display.getCurrentStage():setFocus(target, nil)
                -- remember this object no longer has the focus
                target.hasFocus = false
            end
            -- indicate that we handled the touch and not to propagate it
            return true
        end
        -- if the target is not responsible for this touch event return false
        return false
    end
    -- listen for touches starting on the touch layer
    circle:addEventListener("touch")
    -- return the object for use
    return circle
end

-- create layer for the draggable objects
local group = display.newGroup()

-- create 5 draggable objects
for i=1, 5 do
    -- create object
    local circle = newDragObj( 100, i*100 )
    -- add it to the control layer
    group:insert( circle )
end

Note the key differences in this code:

    • We have activated multitouch
    • We have wrapped the display object creation so that it can be called repeatedly
    • setFocus accepts a specific touch ID to differentiate between user screen contacts
    • When ending the touch, setFocus accepts nil to release the object's touch input

With the code above we should be able to create 5 large circles and each one can be moved independently. Note that, as before, due to setting hasFocus and setFocus now accepting a specific touch ID, the display objects will ignore touches which start elsewhere and will not lose a touch once it begins.


The Multitouch Problem

Now, remember that the strength of the code above is that it can distinguish between multiple touches easily. This is because objects will not lose their touch once they acquire it. This is both a huge bonus and a bit of a problem...

    • The bonus is that setFocus allows us to say "Send every move this user's touch makes to my object's event listener and nowhere else."
    • The problem is that setFocus also stops our display object from receiving any other touch events.

If we have not yet called setFocus, using hasFocus conveniently allows our object to ignore touches which don't begin there. This is useful because users often make a swiping gesture (by accident) on the background (or inactive part of the screen) and swipe across our object. We want it to ignore touches which don't begin on it.

So, the problem is: how do we convince Corona to let our objects receive multiple touches when the very functions which give us this great ease-of-use prevent exactly that?

The answer is to create a tracking object in the began phase.


The Concept

With a small change to the code above we can create a single object which spawns multiple objects in its began phase. These objects will then track each touch individually. We will also change the code further to remove the tracking object when the touch ends.

The complete code will have one function to listen for the touch event began phase and another to listen for the moved, ended and cancelled phases. These two functions will be added to the target listening object and the tracking dot objects, respectively.
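
In outline, the structure we are about to build looks roughly like the sketch below (a condensed preview only; the full working code follows in the samples):

-- rough outline of the tracking-dot approach (sketch, not a complete listing)
system.activate("multitouch")

-- each tracking dot claims its own touch via setFocus and handles later phases itself
local function newTrackDot(e)
    local dot = display.newCircle( e.x, e.y, 50 )
    function dot:touch(e)
        if (e.phase == "began") then
            display.getCurrentStage():setFocus(dot, e.id)
        elseif (e.phase == "moved") then
            dot.x, dot.y = e.x, e.y
        else -- "ended" and "cancelled" phases
            display.getCurrentStage():setFocus(dot, nil)
        end
        return true
    end
    dot:addEventListener("touch")
    dot:touch(e) -- hand the began phase over to the new dot
    return dot
end

-- the listening object only handles the began phase and spawns a tracker
local rect = display.newRect( 200, 200, 200, 100 )
function rect:touch(e)
    if (e.phase == "began") then
        newTrackDot(e)
        return true
    end
    return false
end
rect:addEventListener("touch")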


Spawning Tracking Dots

Sample3.lua

First, we need to create an object which will handle the began phase as before, but this time it will call a function to create a tracking dot.

-- turn on multitouch
system.activate("multitouch")

-- create object to listen for new touches
local rect = display.newRect( 200, 200, 200, 100 )
rect:setFillColor(0,0,255)

-- standard multi-touch event listener
function rect:touch(e)
    -- get the object which received the touch event
    local target = e.target
    -- handle began phase of the touch event life cycle...
    if (e.phase == "began") then
        -- create a tracking dot
        local dot = newTrackDot(e)
        -- we handled the began phase
        return true
    end
    -- if the target is not responsible for this touch event return false
    return false
end

-- listen for touches starting on the touch object
rect:addEventListener("touch")

This is pretty straightforward. It just creates a display object which listens for the began phase of any unhandled touch events. When it receives a touch with a began phase it calls the function which will create a new display object. This new object will be able to track the touch by directing the future touch phases to itself (instead of the rect) by calling setFocus.

Note that we are not setting the hasFocus value here because this listening object only needs to handle the began phase.

Next, we need to create the tracking dot. This code is almost identical to the previous multitouch function.

-- creates an object to be moved
local function newTrackDot(e)
    -- create a user interface object
    local circle = display.newCircle( e.x, e.y, 50 )
    -- make it less imposing
    circle.alpha = .5
    -- standard multi-touch event listener
    function circle:touch(e)
        -- get the object which received the touch event
        local target = circle
        -- handle each phase of the touch event life cycle...
        if (e.phase == "began") then
            -- tell corona that following touches come to this display object
            display.getCurrentStage():setFocus(target, e.id)
            -- remember that this object has the focus
            target.hasFocus = true
            -- indicate the event was handled
            return true
        elseif (target.hasFocus) then
            -- this object is handling touches
            if (e.phase == "moved") then
                -- move the display object with the touch (or whatever)
                target.x, target.y = e.x, e.y
            else -- "ended" and "cancelled" phases
                -- stop being responsible for touches
                display.getCurrentStage():setFocus(target, nil)
                -- remember this object no longer has the focus
                target.hasFocus = false
            end
            -- indicate that we handled the touch and not to propagate it
            return true
        end
        -- if the target is not responsible for this touch event return false
        return false
    end
    -- listen for touches starting on the touch layer
    circle:addEventListener("touch")
    -- pass the began phase to the tracking dot
    circle:touch(e)
    -- return the object for use
    return circle
end

Note that the only two changes we’ve made to this function are:

    • We call circle:touch(e) because the circle has only been created and has not actually received the touch event’s began phase. Calling this allows the circle object to take control of the touch event away from the rect object and handle all future touch phases.
    • At the start of the :touch() function we also change to using the circle as the target because the e.target property is actually the rect object (where the touch began.)

When this code is used with the code above we will see a small blue rectangle which can create multiple white circles. Each circle is moved by an independent touch. It is this mechanism which we can use to direct all of the touch information to our blue rect and pretend that it is receiving multitouch input.


Faking Multitouch Input

Sample4.lua

Our blue rect object is going to become the recipient of multiple touch inputs. To do this we need to first modify the touch listener function of the rect. At first we will simply add some print() statements for the moved, ended and cancelled phases.

Here is the modified :touch() listener function for the small blue rectangle:

-- standard multi-touch event listener
function rect:touch(e)
    -- get the object which received the touch event
    local target = e.target
    -- handle began phase of the touch event life cycle...
    if (e.phase == "began") then
        print( e.phase, e.x, e.y )
        -- create a tracking dot
        local dot = newTrackDot(e)
        -- we handled the began phase
        return true
    elseif (e.parent == rect) then
        if (e.phase == "moved") then
            print( e.phase, e.x, e.y )
        else -- "ended" and "cancelled" phases
            print( e.phase, e.x, e.y )
        end
        return true
    end
    -- if the target is not responsible for this touch event return false
    return false
end

The major change here is the addition of the moved, ended and cancelled phases. Doing this allows the tracking dots to call the :touch() function of the blue rectangle, passing in the event parameter received by the white circle’s touch function.

The elseif statement is also important here. If the tracking dots pass the event parameter to the rect the e.target will be a reference to the dot, not the rect. We will store the reference to the rect in the .parent property. This way, the rect:touch() function can determine if it is the rightful recipient of the touch event.

Of course, we haven’t changed the circle’s touch function to call the rectangle’s :touch() yet. Before we do that, we need to make sure that each circle keeps a reference to the rect object so that it can call the rect:touch() function and pass it the event parameter.

Here is the start of the newTrackDot() function, which needs to make a local copy of the original .target property of the event parameter.

-- creates an object to be moved
function newTrackDot(e)
    -- create a user interface object
    local circle = display.newCircle( e.x, e.y, 50 )
    -- make it less imposing
    circle.alpha = .5
    -- keep reference to the rectangle
    local rect = e.target
    -- standard multi-touch event listener
    function circle:touch(e)

Keeping a reference to the object which received the original began event phase allows our tracking dots to send the multitouch events back to it.

Now, we don’t need our tracking dots to send the began phase event parameter to the rect because rect has already received that event. What we do need is to call rect:touch(e) in the :touch() function of the tracking dot so that the other phases get sent to our rect object.

-- standard multi-touch event listener
function circle:touch(e)
    -- get the object which received the touch event
    local target = circle
    -- store the parent object in the event
    e.parent = rect
    -- handle each phase of the touch event life cycle...
    if (e.phase == "began") then
        -- tell corona that following touches come to this display object
        display.getCurrentStage():setFocus(target, e.id)
        -- remember that this object has the focus
        target.hasFocus = true
        -- indicate the event was handled
        return true
    elseif (target.hasFocus) then
        -- this object is handling touches
        if (e.phase == "moved") then
            -- move the display object with the touch (or whatever)
            target.x, target.y = e.x, e.y
        else -- "ended" and "cancelled" phases
            -- stop being responsible for touches
            display.getCurrentStage():setFocus(target, nil)
            -- remember this object no longer has the focus
            target.hasFocus = false
        end
        -- send the event parameter to the rect object
        rect:touch(e)
        -- indicate that we handled the touch and not to propagate it
        return true
    end
    -- if the target is not responsible for this touch event return false
    return false
end

Pretty simple. We now have a rectangle which creates a tracking dot for each touch it detects. Each of those dots also sends its touch information back to the rect, using its original touch handler function. The rect will also know that it is the proper target.

The trick now is to make use of this multitouch information.


Employing Multitouch

Sample5.lua

We now have an object which can detect the start of multiple touch points. It spawns tracking dots for each point and receives the subsequent touch events through them.

To make some basic use of this multitouch information we will position the rect display object at the centre of the touch points. This can all happen within the :touch() function of the rect object.

To position the rect object at the centre of our multiple touch points we first need to find the average x and y of all the touch points. We’ll use a separate function for that.

-- calculates the average centre of a list of points
local function calcAvgCentre( points )
    local x, y = 0, 0
    for i=1, #points do
        local pt = points[i]
        x = x + pt.x
        y = y + pt.y
    end
    return { x = x / #points, y = y / #points }
end

In order to call this function rect needs to keep a list of the tracking dots it creates. We will add this list to the rect object as a property when we create the rect display object.

-- create object to listen for new touches
local rect = display.newRect( 200, 200, 200, 100 )
rect:setFillColor(0,0,255)

-- keep a list of the tracking dots
rect.dots = {}

Now we’ll get the average centre of those dots and update the x and y position of rect:

-- standard multi-touch event listener
function rect:touch(e)
    -- get the object which received the touch event
    local target = e.target
    -- handle began phase of the touch event life cycle...
    if (e.phase == "began") then
        print( e.phase, e.x, e.y )
        -- create a tracking dot
        local dot = newTrackDot(e)
        -- add the new dot to the list
        rect.dots[ #rect.dots+1 ] = dot
        -- we handled the began phase
        return true
    elseif (e.parent == rect) then
        -- calculate the average centre position of all touch points
        local centre = calcAvgCentre( rect.dots )
        -- update the position of rect
        rect.x, rect.y = centre.x, centre.y
        if (e.phase == "moved") then
            print( e.phase, e.x, e.y )
        else -- "ended" and "cancelled" phases
            print( e.phase, e.x, e.y )
        end
        return true
    end
    -- if the target is not responsible for this touch event return false
    return false
end

Run this code and you’ll see a small blue rectangle. Touch the rectangle and it produces a white circle. Moving this first circle will cause the blue rectangle to follow it precisely. Release the touch and create another white circle and you’ll see that the blue rectangle now stays at the midpoint between the two white circles. Create yet another and it will stay between the three circles, and so on.


Debugging and Devices

Sample6.lua

We now have a good way to debug multitouch-capable display objects in the simulator. You’ll notice, however, that when you release your touch from one of the tracking dots the dot does not disappear. This is really great for debugging in the simulator because you can pretend to have multiple touch points. It is not so great on the device because you’re filling up the screen with white circles.

To fix this, if it is running on a physical device, the rect:touch() function needs to remove the tracking dots in the ended phase. First, however, we need to store a variable at the start of our code which indicates whether we are running on a device.

-- which environment are we running on?
local isDevice = (system.getInfo("environment") == "device")

The isDevice variable will be true if the code is running on a real, physical device and can be used to automatically remove the tracking dot when the user lifts their finger.

if (e.phase == "moved") then
    print( e.phase, e.x, e.y )
else -- "ended" and "cancelled" phases
    print( e.phase, e.x, e.y )
    -- remove the tracking dot from the list
    if (isDevice or e.numTaps == 2) then
        -- get index of dot to be removed
        local index = table.indexOf( rect.dots, e.target )
        -- remove dot from list
        table.remove( rect.dots, index )
        -- remove tracking dot from the screen
        e.target:removeSelf()
    end
end
return true

Notice that or e.numTaps == 2 is used. This allows the tracking dot to have a tap listener which also calls the rect:touch() function so that in the simulator we can use a double tap to remove the tracking dot.

The tap listener should only listen for taps if the code is running in the simulator, so we'll use the isDevice variable again. The tap listener is added inside the newTrackDot() function which creates tracking dots.

-- listen for a tap when running in the simulator
function circle:tap(e)
    if (e.numTaps == 2) then
        -- set the parent
        e.parent = rect
        -- call touch to remove the tracking dot
        rect:touch(e)
    end
    return true
end
-- only attach tap listener in the simulator
if (not isDevice) then
    circle:addEventListener("tap")
end

Note that we also:

    • Check for two taps, so that only a double tap will remove a tracking dot
    • Set the .parent property, just as we do in the touch function
    • Only attach the tap listener if the code is running on the simulator


Making it Useful

The code so far is useful but doesn’t do very much. We can move a small, blue rectangle around with more than one finger. The beauty of multitouch input devices is that the real world has an impact on the virtual. If all we want to do is move an image or collection of display objects around we can add this code to those objects and have them respond to the user’s touch. If we want it to be a bit more realistic, we should add some rotation and scaling.


Relative Motion

Sample7.lua

Before we do that, however, take a look at how the rectangle moves when you use one finger. It centres itself directly under the touch point. To be believable it should really move relative to the motion of the touch point. Unfortunately, this is not as simple a change as it would appear, because we need to cater for removing a touch point. We now need to move some code into the moved and ended phases.

To illustrate the complete change and to lay out the full rect:touch(e) code - it has changed a lot, after all - here’s the whole function:

-- advanced multi-touch event listener
function rect:touch(e)
    -- get the object which received the touch event
    local target = e.target
    -- handle began phase of the touch event life cycle...
    if (e.phase == "began") then
        print( e.phase, e.x, e.y )
        -- create a tracking dot
        local dot = newTrackDot(e)
        -- add the new dot to the list
        rect.dots[ #rect.dots+1 ] = dot
        -- pre-store the average centre position of all touch points
        rect.prevCentre = calcAvgCentre( rect.dots )
        -- we handled the began phase
        return true
    elseif (e.parent == rect) then
        if (e.phase == "moved") then
            print( e.phase, e.x, e.y )
            -- calculate the average centre position of all touch points
            local centre = calcAvgCentre( rect.dots )
            -- update the position of rect
            rect.x = rect.x + (centre.x - rect.prevCentre.x)
            rect.y = rect.y + (centre.y - rect.prevCentre.y)
            -- store the centre of all touch points
            rect.prevCentre = centre
        else -- "ended" and "cancelled" phases
            print( e.phase, e.x, e.y )
            -- remove the tracking dot from the list
            if (isDevice or e.numTaps == 2) then
                -- get index of dot to be removed
                local index = table.indexOf( rect.dots, e.target )
                -- remove dot from list
                table.remove( rect.dots, index )
                -- remove tracking dot from the screen
                e.target:removeSelf()
                -- store the new centre of all touch points
                rect.prevCentre = calcAvgCentre( rect.dots )
            end
        end
        return true
    end
    -- if the target is not responsible for this touch event return false
    return false
end

The fairly significant change here is to:

    • Calculate the centre of all touches and store it for reference in the began phase
    • Add the difference between the previous and current touch centres to the rect.x and rect.y in the moved phase
    • Update the stored touches centre in the ended phase so that removing a finger does not throw off the next moved phase

The user can now place any number of fingers on the rect, even change them, and move it around as if shifting a photo on a table. Of course, what it doesn’t do is rotate with their touch.


Scaling

Sample8.lua

With multitouch control of a display object, each transformation we want to apply to it requires taking the average across all of the tracking dots and applying that to the display object relative to the midpoint (the average location) of those dots.

For scaling, this means that the mathematical process is:

    • Measure each tracking dot’s distance from the midpoint of all the dots
    • Compare it with that dot’s previous distance to get a scaling ratio for that dot
    • Average those ratios across all of the tracking dots
    • Apply the average ratio as a multiplier to the display object’s .xScale and .yScale

This is only slightly more involved than how we applied the average translation of the display object when moving multiple tracking dots.
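
As a quick sanity check of that arithmetic, here is a tiny standalone sketch with made-up numbers (not part of the sample files):

-- two hypothetical tracking dots, before and after a pinch
local dots = {
    { prevDistance = 100, distance = 150 },  -- ratio 1.5
    { prevDistance = 120, distance = 180 },  -- ratio 1.5
}
local total = 0
for i = 1, #dots do
    total = total + dots[i].distance / dots[i].prevDistance
end
print( total / #dots )  -- 1.5, so .xScale and .yScale are multiplied by 1.5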

To help us get these scaling values we’ll need some basic library functions.

-- returns the distance between points a and b
function lengthOf( a, b )
    local width, height = b.x-a.x, b.y-a.y
    return (width*width + height*height)^0.5
end

The most important library function calculates the distance between two points on the screen. This is a very typical trigonometry function and widely used.

To get the midpoint of the tracking dots we’ll use the calcAvgCentre() function described above. To get and store the average distance between the midpoint and the tracking dots we’ll use these functions:

-- calculate each tracking dot's distance from the midpoint
local function updateTracking( centre, points )
    for i=1, #points do
        local point = points[i]
        point.prevDistance = point.distance
        point.distance = lengthOf( centre, point )
    end
end

-- calculates scaling amount based on the average change in tracking point distances
local function calcAverageScaling( points )
    local total = 0
    for i=1, #points do
        local point = points[i]
        total = total + point.distance / point.prevDistance
    end
    return total / #points
end

The first of these gets the current distance of each dot from the midpoint, stores it in the tracking dot and also saves the previously known distance. The second function calculates the average ratio between the current and previous distances across all the dots.

Using these functions is simple. For the began and ended phases of the rect:touch() we just call them and they update our tracking dots with the appropriate values. Here is the additional update call for the began phase:

-- pre-store the tracking dot scale and rotation values
updateTracking( rect.prevCentre, rect.dots )
-- we handled the began phase
return true

And the update for the ended phase:

-- store the new centre of all touch points
rect.prevCentre = calcAvgCentre( rect.dots )
-- refresh tracking dot scale and rotation values
updateTracking( rect.prevCentre, rect.dots )
end

The moved phase is a little more complex because this is where the real work is done. Fortunately, all we need to do here is update the tracking dots again and only apply the scaling if there is more than one tracking dot.

if (e.phase == "moved") then
    print( e.phase, e.x, e.y )
    -- declare working variables
    local centre, scale, rotate = {}, 1, 0
    -- calculate the average centre position of all touch points
    centre = calcAvgCentre( rect.dots )
    -- refresh tracking dot scale and rotation values
    updateTracking( rect.prevCentre, rect.dots )
    -- if there is more than one tracking dot, calculate the rotation and scaling
    if (#rect.dots > 1) then
        -- calculate the average scaling of the tracking dots
        scale = calcAverageScaling( rect.dots )
        -- apply scaling to rect
        rect.xScale, rect.yScale = rect.xScale * scale, rect.yScale * scale
    end
    -- update the position of rect
    rect.x = rect.x + (centre.x - rect.prevCentre.x)
    rect.y = rect.y + (centre.y - rect.prevCentre.y)
    -- store the centre of all touch points
    rect.prevCentre = centre
else -- "ended" and "cancelled" phases

Above, we’ve made the following changes to the moved phase:

    • Declared variables to work with the forthcoming transformation values
    • Called updateTracking to refresh the stored distance values of the tracking dots
    • Used those distance values to calculate the average change in tracking scaling
    • Applied that scaling to the display object rect

The display object now translates (moves) and scales (zooms) along with our tracking dots (touch points.)


Rotation

Sample9.lua

To rotate our display object the basic logic follows that we work out how much each tracking dot has rotated around the midpoint (of all the tracking dots), get the average and add the difference between that and the previous amount to our object's .rotation value.

This requires adding some more general purpose library maths functions to our code.

-- returns the degrees between (0,0) and pt
-- note: 0 degrees is 'east'
function angleOfPoint( pt )
    local x, y = pt.x, pt.y
    local radian = math.atan2(y,x)
    local angle = radian*180/math.pi
    if angle < 0 then angle = 360 + angle end
    return angle
end

-- returns the degrees between two points
-- note: 0 degrees is 'east'
function angleBetweenPoints( a, b )
    local x, y = b.x - a.x, b.y - a.y
    return angleOfPoint( { x=x, y=y } )
end

The code above performs two standard operations. angleOfPoint returns the angle between (0,0) and pt. angleBetweenPoints uses angleOfPoint to return the angle between points a and b.

Because of an oddity in angle calculations, we will also need another function which can determine the smallest angle between two points on the perimeter of a circle. This is important because when we’re using the angle which a tracking dot has rotated through, we may accidentally end up with the larger of the two possible angles: between headings of 10 degrees and 260 degrees, for example, the raw difference is 250 degrees. What we want here is the smaller angle of 110 degrees, not 250 degrees.

-- returns the smallest angle between the two angles
-- ie: the difference between the two angles via the shortest distance
function smallestAngleDiff( target, source )
    local a = target - source
    if (a > 180) then
        a = a - 360
    elseif (a < -180) then
        a = a + 360
    end
    return a
end
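
A quick sanity check of this function (not part of the sample files):

print( smallestAngleDiff( 260, 10 ) )    -- prints -110 (the short way round), not 250
print( smallestAngleDiff( 10, 350 ) )    -- prints 20, not -340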

As in the calcAverageScaling function, we’ll make use of the above function in the calcAverageRotation function to determine the average amount that all of the tracking dots have rotated around the midpoint:

-- calculates rotation amount based on the average change in tracking point rotation
local function calcAverageRotation( points )
    local total = 0
    for i=1, #points do
        local point = points[i]
        total = total + smallestAngleDiff( point.angle, point.prevAngle )
    end
    return total / #points
end

We also want to update each tracking dot’s angle - and store its previous angle - at the same time. Fortunately, we’re already doing this for tracking dot distances from the midpoint, so we can add this code there:

-- calculate each tracking dot's distance and angle from the midpoint
local function updateTracking( centre, points )
    for i=1, #points do
        local point = points[i]
        point.prevAngle = point.angle
        point.prevDistance = point.distance
        point.angle = angleBetweenPoints( centre, point )
        point.distance = lengthOf( centre, point )
    end
end

Now, due to this small addition of code, the rect:touch() function is already updating the appropriate values in the began and ended phases. All we have to do is apply rotation to the rect display object in the moved phase. Of course, we only need to do this if there is more than one tracking dot.

-- if there is more than one tracking dot, calculate the rotation and scaling
if (#rect.dots > 1) then
    -- calculate the average rotation of the tracking dots
    rotate = calcAverageRotation( rect.dots )
    -- calculate the average scaling of the tracking dots
    scale = calcAverageScaling( rect.dots )
    -- apply rotation to rect
    rect.rotation = rect.rotation + rotate
    -- apply scaling to rect
    rect.xScale, rect.yScale = rect.xScale * scale, rect.yScale * scale
end

So, here we simply call the functions described earlier to calculate the average amount of rotation around the tracking dots’ midpoint and apply it to the display object.


Pinch Centre Translation

Sample10.lua

Run the code now and you’ll notice that while the display object rotates, scales and moves with the tracking dots, it doesn’t quite shift with the tracking dots. This is because, unless the user is very lucky (or not paying attention,) they will never quite get the midpoint to be the very centre of the display object being manipulated.

To solve this we don’t just need to apply the translation, scaling and rotation to the display object - we also need to apply it to the centre point location of the display object. This means that:

    • Scaling should be applied to the distance between the midpoint and the rect centre
    • Rotation should be applied to the rect centre, rotated around the tracking dot midpoint
    • But, fortunately, we’re already applying translation, so that can be ignored.

OK, so what standard library maths functions do we need? Well, we want to rotate a point around another point, so here they are:

-- rotates a point around the (0,0) point by degrees
-- returns new point object
function rotatePoint( point, degrees )
    local x, y = point.x, point.y
    local theta = math.rad( degrees )
    local pt = {
        x = x * math.cos(theta) - y * math.sin(theta),
        y = x * math.sin(theta) + y * math.cos(theta)
    }
    return pt
end

-- rotates point around the centre by degrees
-- rounds the returned coordinates using math.round() if round == true
-- returns new coordinates object
function rotateAboutPoint( point, centre, degrees, round )
    local pt = { x=point.x - centre.x, y=point.y - centre.y }
    pt = rotatePoint( pt, degrees )
    pt.x, pt.y = pt.x + centre.x, pt.y + centre.y
    if (round) then
        pt.x = math.round(pt.x)
        pt.y = math.round(pt.y)
    end
    return pt
end

The moved phase also needs to change quite a bit. Here is the updated remainder of that phase:

        -- apply rotation to rect
        rect.rotation = rect.rotation + rotate
        -- apply scaling to rect
        rect.xScale, rect.yScale = rect.xScale * scale, rect.yScale * scale
    end
    -- declare working point for the rect location
    local pt = {}
    -- translation relative to centre point move
    pt.x = rect.x + (centre.x - rect.prevCentre.x)
    pt.y = rect.y + (centre.y - rect.prevCentre.y)
    -- scale around the average centre of the pinch
    -- (centre of the tracking dots, not the rect centre)
    pt.x = centre.x + ((pt.x - centre.x) * scale)
    pt.y = centre.y + ((pt.y - centre.y) * scale)
    -- rotate the rect centre around the pinch centre
    -- (same rotation as the rect is rotated!)
    pt = rotateAboutPoint( pt, centre, rotate, false )
    -- apply pinch translation, scaling and rotation to the rect centre
    rect.x, rect.y = pt.x, pt.y
    -- store the centre of all touch points
    rect.prevCentre = centre
else -- "ended" and "cancelled" phases

The moved phase is now doing a number of things, whether there’s one tracking dot or many:

    • pt is declared to use as a working space for the display object’s position
    • The midpoint translation is applied to the working object
    • The distance between the midpoint and the display object centre is scaled
    • The centre of the display object is rotated around the midpoint

Run the code now and no matter where you place your fingers, real or virtual (in the simulator,) as long as the touch (tracking dot) is started on the display object it will pinch-zoom with the touch points.

The effect is most obvious when using two fingers because the tracking points stay precisely relative to their starting location on the display object, but more can be used and the result is the same, just a little more averaged across the touch points.


One More Thing

Sample11.lua

Everything so far has relied on a single display object being manipulated. When does that happen in the real world? Realistically, a program will need a group of objects to be pinch-zoomed. More importantly, what use is a complex function if it can’t be re-used?

To re-use the :touch() function so that it can be attached to any display object - image or group - simply change the references it uses. To show that, let’s create a display group with a number of objects contained.

-- spawning tracking dots

-- create display group to listen for new touches
local group = display.newGroup()

-- populate display group with objects
local rect = display.newRect( group, 200, 200, 200, 100 )
rect:setFillColor(0,0,255)

rect = display.newRect( group, 300, 300, 200, 100 )
rect:setFillColor(0,255,0)

rect = display.newRect( group, 100, 400, 200, 100 )
rect:setFillColor(255,0,0)

-- keep a list of the tracking dots
group.dots = {}

-- advanced multi-touch event listener
function touch(self, e)
    -- get the object which received the touch event
    local target = e.target
    -- get reference to self object
    local rect = self
    -- handle began phase of the touch event life cycle...
    if (e.phase == "began") then

Now, it’s the group which will be manipulated, so attach the listener and the function to the group.

end

-- attach pinch zoom touch listener
group.touch = touch

-- listen for touches starting on the touch object
group:addEventListener("touch")


And there we have it - a touch listener function which can be applied to any display object or group to implement multitouch pinch-zoom-rotation.