M5Core2 library fork that supports multi-touch and provides Arduino-style virtual buttons [update: and gestures and events] [update2: MERGED !]



  • Actually, I have a bug report now.
    I am trying to integrate this with a gesture recognizer I'm working on. In most respects it seems significantly more reliable; I am getting much more accurate samples, allowing me to reliably recognize swipe-in without heuristics (hacks).
    However, one issue is significant. The last sample in a while(M5.Touch.ispressed()) loop is always wrong.
    In this example program (using your fork of M5Core2):

    #include "M5Core2.h"
    
    void setup() {
      M5.begin();
      M5.Lcd.fillScreen(BLACK);
      M5.Lcd.setTextColor(WHITE, BLACK);
    }
    
    void loop() {
      bool draw = false;
      TouchPoint_t pt = { 0, 0 };
      M5.update();
      while(M5.Touch.ispressed()) {
        pt = M5.Touch.point[0];
        draw = true;
      }
      if(draw) {
        M5.Lcd.drawCentreString(String("    ") + pt.x + "," + pt.y + "    ", 160, 40, 4);
        draw = false;
      }
    }
    

    The screen will always display -1, -1.
    What I expect, when ispressed() is true, is to get a valid measurement.
    I see that the functions are decoupled in the library, and one might expect a race condition. But the result is always -1, -1.
    I think this is a typical application, so it would be great if the library could do its best to handle this situation.
    If I have to test every value for -1,-1 when ispressed() is true, I don't really need ispressed().



  • Hi @vkichline et al. I was working on gestures too, as well as a more generic touch event interface. Wouldn't it be cool if you could do this?

    #include <M5Core2.h>
    
    TouchZone topEdge(0,0,320,40);
    TouchZone bottomEdge(0,200,320,280);
    Gesture swipeDown(topEdge, bottomEdge, "Swipe Down");
    
    TouchButton lt = TouchButton(0, 0, 159, 119, "left-top");
    TouchButton lb = TouchButton(0, 120, 159, 240, "left-bottom");
    TouchButton rt = TouchButton(160, 0, 320, 119, "right-top");
    TouchButton rb = TouchButton(160, 120, 320, 240, "right-bottom");
    
    void setup() {
      M5.begin();
      M5.Touch.addHandler(eventDisplay);
      M5.Touch.addHandler(colorButtons, TE_BTNONLY + TE_TOUCH + TE_RELEASE);
      swipeDown.addHandler(yayWeSwiped);
      rt.addHandler(dblTapped, TE_DBLTAP);
    }
    
    void loop() {
      M5.update();
    }
    
    void eventDisplay(TouchEvent& e) {
      Serial.printf("%-12s finger%d  %-18s ", M5.Touch.eventTypeName(e), e.finger, M5.Touch.eventObjName(e));
      switch(e.type) {
        case TE_TOUCH:
        case TE_TAP:
        case TE_DBLTAP:
          Serial.printf("(%3d, %3d)\n", e.from.x, e.from.y);
          break;
        case TE_RELEASE:
        case TE_MOVE:
        case TE_GESTURE:
          Serial.printf("(%3d, %3d) --> (%3d, %3d)  %5d ms\n", e.from.x, e.from.y, e.to.x, e.to.y, e.duration);
          break;
      }
    }
    
    void colorButtons(TouchEvent& e) {
      TouchButton& b = *e.button;
      M5.Lcd.fillRect(b.x0, b.y0, b.x1 - b.x0, b.y1 - b.y0, b.isPressed() ? WHITE : BLACK);
    }
    
    void yayWeSwiped(TouchEvent& e) {
      Serial.println("--- SWIPE DOWN DETECTED ---");
    }
    
    void dblTapped(TouchEvent& e) {
      Serial.println("--- TOP RIGHT BUTTON WAS DOUBLETAPPED ---");
    }
    

    It's on the gestures branch of https://github.com/ropg/M5Core2

    Pre-release, I'd be happy to hear if you see problems. And note that the documentation has not changed yet, so you'll have to figure it out on your own.... :)

    Next up are some more events, such as TE_PRESSED and TE_DRAGGED, which will not fire for a button if you happened to start a gesture there.



  • @vkichline said in M5Core2 library fork that supports multi-touch and provides Arduino-style virtual buttons.:

    Rop - excellent work!
    Feedback: I noticed you used left, top, right, bottom for rectangular dimensions of the TouchButtons, a common scheme.
    But throughout the M5 library, rectangles seem to be described with left, top, width, height.
    The difference may lead to frequent coding errors.

    Their own touch interface uses x0, y0, x1, y1, so that's what I followed. It can be changed. I would love to hear from M5Stack to see whether they would consider using my code in their library, and I am happy to change whatever they want changed. Otherwise this is just a lot of work for no good reason...



  • @vkichline You should not use .ispressed() in combination with the new API. It queries the INT line on pin 39 directly (the old API does not use M5.update()), so you will indeed see (-1, -1): the M5.update() just before your while loop concluded there was no touch yet, and nothing inside the loop refreshes that data.

    The new API provides .points to tell you how many points are pressed right now; if (M5.Touch.points) { ... } should do the job. Or go play with the newer library (the gestures fork) and use the events interface.
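
    Something along these lines should do what your sketch intends (untested, just to sketch the idea; it only relies on .points and point[0]):

    #include <M5Core2.h>

    void setup() {
      M5.begin();
      M5.Lcd.fillScreen(BLACK);
      M5.Lcd.setTextColor(WHITE, BLACK);
    }

    void loop() {
      M5.update();
      // point[0] is only refreshed by M5.update(), so read it right after the call
      if (M5.Touch.points) {
        M5.Lcd.drawCentreString(String("    ") + M5.Touch.point[0].x + "," +
                                M5.Touch.point[0].y + "    ", 160, 40, 4);
      }
    }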



  • Hi @Rop

    very cool stuff indeed. Thank you very much for sharing. Hopefully the M5Stack guys will pick it up rather sooner than later or at least let us know their intentions.

    For me swipe down worked after some 'training' to not do the swipe too fast.

    Double tap almost never worked until I increased MAX_BETWEEN_TAP from 150 to 250. With the extended time it's very reliable.
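
    In case it helps others, this is all I changed (assuming it is still a plain #define in the fork's touch.h - the exact name and location may have moved since):

    // touch.h on the gestures branch (assumed location): widen the double tap window
    #define MAX_BETWEEN_TAP 250   // was 150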

    Cheers
    Felix



  • OK, I just finished the touch API in my fork of the M5Core2 library. What that means is that the version with buttons, gestures and events is now production quality, fully documented and ready for extensive testing. Please note that I changed the coordinate system for any screen areas from xyxy to xywh to match the display library. (Except for the legacy API, so that existing code for that, such as the M5Core2 Factory Test, still works.)
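
    Concretely, the right-top button from my earlier example changes like this (same rectangle, just expressed as x, y, width and height; I'm assuming the optional name argument stays where it was):

    // gestures branch (xyxy):  TouchButton rt(160, 0, 320, 119, "right-top");
    // this version   (xywh):
    TouchButton rt(160, 0, 160, 120, "right-top");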

    I filed a new pull request against the M5Core2 library and closed the old one.

    Scroll through the documentation (from touch.h) below.

    /*
    
    M5Stack Core2 touch library
    
        This is the library behind the M5.Touch object that you can use to read
        from the touch sensor on the M5Stack Core2.
    
        This library also provides ways to describe points and rectangles on
        the sensor as well as a TouchButton class (Arduino-style buttons for
        any rectangle) and a Gesture object for recognising touch gestures and
        a number of supporting data types. You can use the built-in events to
        trigger your own functions when things happen on the sensor.
    
    
    About the touch sensor in the M5Stack Core2
    
        Touch is provided by a FocalTech FT6336 chip, which supports two
        simultaneous touches. However, the M5Stack Core2 touch display is only
        multi-touch in one dimension. What that means is that it can detect two
        separate touches only if they occur on different vertical positions.
        This has to do with the way the touch screen is wired, it's not
        something that can be changed in software. So you will only ever see
        two points if they do not occur side-by-side. Touches that do happen
        side-by-side blend into one touch that is detected somewhere between
        the actual touches.
    
        While this limits multi-touch somewhat, you can still create multiple
        buttons and see two that are not on the same row simultaneously. You
        could also use one of the buttons below the screen as a modifier for
        something touched on the screen.
        
        The touch sensor extends to below the screen of the Core2: the sensor
        is 320x280 pixels, the screen is 320x240. The missing 40 pixels are
        placed below the screen, where the printed circles are. This is meant
        to simulate the three hardware buttons on the original M5Stack units.
        Note that on at least some units the touch sensor in this area only
        operates properly if the USB cable is plugged in or if the unit is
        placed on a metal surface.
    
    
    Describing points and areas
    
        TouchPoint and TouchZone allow you to create variables of these types
        to hold a point or an area on the screen.
    
        TouchPoint
            Holds a point on the screen. Has members x and y that hold the
            coordinates of a touch. Values -1 for x and y indicate an invalid
            touch value.
        
        TouchZone
            Holds a rectangular area on the screen. Members x, y, w and h are
            for the x and y coordinate of the top-left corner and the width and
            height of the rectangle.
            
        The 'valid' method tests if a point is valid. Using the 'in' or
        'contains' method you can test if a point is in a zone. Here are some
        examples to make things clear.
            
            TouchPoint a;
            TouchPoint b(20, 120);
            Serial.println(a);              // (-1, -1)
            a.set(10, 30);
            Serial.println(a.valid());      // 1    
            Serial.println(b.y);            // 120
            TouchZone z(0,0,100, 100);
            Serial.println(z.w);            // 100
            Serial.println(z.contains(a));  // 1
            Serial.println(b.in(z));        // 0
            
    
    Basic API
    
        The basic API provides a way to read the data from the touch sensor.
        You can use this directly, but you may want to skip it: even for
        simple applications, the more advanced ways of using the touch sensor
        are easier.
    
        M5.update()
            In the loop() part of your sketch, call "M5.update()". This is the
            only part that talks to the touch interface. It updates the data
            used by the rest of the two-touch API.
    
        M5.Touch.changed
            Is true if M5.update() detected any changes since the last time it
            was called.
    
        M5.Touch.points
            Contains the number of touches detected: 0, 1 or 2.
    
        M5.Touch.point[0], M5.Touch.point[1]
            M5.Touch.point[0] and M5.Touch.point[1] are TouchPoints that hold
            the detected touches.
            
        So the simplest sketch to print the location where the screen is
        touched would be:
            
            #include <M5Core2.h>
    
            void setup() {
              M5.begin();
            }
    
            void loop() {
              M5.update();
              if (M5.Touch.changed && M5.Touch.points == 1) {
                Serial.println( M5.Touch.point[0] );
              }
            }
    
    
    Buttons
    
        You can create virtual buttons for any given rectangle on the screen by
        creating a global variable to hold the TouchButton object and providing
        the coordinates (x, y, width and height). These buttons can be used in
        two ways. You can either use them the way you would a normal Arduino
        button, or you can provide handler functions to process various events
        for the button. We'll talk about the events later, but here's a simple
        sketch that defines a button and then does something when it's pressed.
    
            #include <M5Core2.h>
    
            TouchButton b(0,0,100,100);
    
            void setup() {
              M5.begin();
            }
    
            void loop() {
              M5.update();
              if (b.wasPressed()) Serial.print("* ");
            }
            
        wasPressed() will only be true once after you release the button. You
        can also use the other Arduino button functions, such as isPressed(),
        which is true as soon as and for as long as the button is touched. Note that
        the TouchButtons only become pressed if the touch starts within the
        button, not if you swipe over it, and that it will stay pressed as long
        as the finger touches, even if it leaves the button area. You may want
        to read about the events further down to distinguish between different
        kinds of touches.
    
        The three buttons BtnA, BtnB and BtnC from the older M5Stack units come
        already implemented as buttons that lie just below the screen where the
        three circles are. If you want them to be a little bigger and also
        cover the area of the screen where you may be showing labels for the
        buttons, simply raise the top of the buttons like this:
        
             M5.BtnA.y0 = M5.BtnB.y0 = M5.BtnC.y0 = 220;
    
        Buttons are only active when their variables exist, meaning that if you
        define button variables in a function that has its own loop that calls
        M5.update(), they will not be detected anywhere else.
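
        For example (just a sketch of the idea, with arbitrary coordinates),
        a dialog-style function could create its own buttons like this:

            // These buttons exist only while confirmDialog() is running
            void confirmDialog() {
              TouchButton yes(  0, 0, 160, 240);
              TouchButton no (160, 0, 160, 240);
              while (true) {
                M5.update();
                if (yes.wasPressed()) { Serial.println("yes"); return; }
                if (no.wasPressed())  { Serial.println("no");  return; }
              }
            }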
    
    
    Gestures
    
        When you create a gesture you can specify two TouchZones. Whenever a
        swipe on the sensor starts in the first zone and ends in the second,
        that gesture counts as detected, as in the following simple example
        sketch:
    
            #include <M5Core2.h>
    
            TouchZone topHalf(0,0,320,120);
            TouchZone bottomHalf(0,120,320,160);
            Gesture swipeDown(topHalf, bottomHalf);
    
            void setup() {
              M5.begin();
            }
    
            void loop() {
              M5.update();
              if (swipeDown.wasDetected()) Serial.println("Swiped down!");
            }
    
        After the start and end zones, you can also optionally specify a name,
        a maximum time for the gesture (default 500 ms) and minimum distance
        the swipe must cover on the screen (default 75 pixels). Only the first
        gesture for which a swipe meets the criteria will be detected. 
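
        For example (values purely for illustration, with the arguments in
        the order just described), a slower but longer swipe could be
        specified as:

            Gesture swipeDown(topHalf, bottomHalf, "Swipe Down", 800, 100);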
    
    
    Events
    
        The most powerful way to react to things happening with buttons,
        gestures or just touches and swipes on random parts of the screen is by
        using events. For this you need to define one or more event handler
        functions. This is done like this:
    
            void myHandler(TouchEvent& e) { ... }
    
        It's important to do it exactly this way, only changing the name of the
        function. You can then set things up so that this function receives
        events. Note that the function name is provided without the brackets. 
    
            M5.Touch.addHandler(myHandler);
    
            - or -
    
            testbutton.addHandler(myHandler);
    
        With the first line the function gets all events, with the second line
        only those that pertain to a specific button or gesture. Events have
        the following members:
        
            e.type
                The type of event, such as TE_TOUCH or TE_TAP, see below.
                
            e.finger
                0 or 1, whether this is the first or second finger detected
                
            e.from and e.to
                TouchPoints that say from where to where this event happened
                
            e.duration
                Duration of the event in ms
                
            e.button and e.gesture
                Pointers to the button or gesture attached to the event
                    
        Here's a list of possible event types and when they're fired:
        
            TE_TOUCH
                A finger comes on the screen. If that is within the area of a
                button, a pointer to that button will be in e.button.
                
            TE_MOVE
                A finger has moved. e.from and e.to contain the coordinates. If
                this swipe started within a button, that button will be in
                e.button for all subsequent TE_MOVE and the eventual TE_RELEASE
                event, even if the finger is not on the button anymore. This
                allows buttons to be swiped.
                
            TE_RELEASE
                The e.from and e.to will hold the beginning and ending
                positions for this entire swipe, e.duration holds the time
                since the finger was placed on the screen.
                
            TE_GESTURE
                After TE_RELEASE, the first gesture that matches a swipe fires
                a TE_GESTURE, with a pointer to the gesture in e.gesture. None
                of the further events below are then generated, even if the
                gesture started on a button.
                
            TE_TAP
                Fired when a short tap on the screen is detected. Will come
                with the associated button in e.button if applicable.
                
            TE_DBLTAP
                Like TE_TAP but for a quick double tap. Note that no TE_TAP will
                be fired in this case.
                
            TE_DRAGGED
                Fired when the finger has left the button before it is lifted.
                
            TE_PRESSED
                Fired after the button is released and it wasn't a TE_TAP,
                TE_DBLTAP or TE_SWIPE. 
    
        When you add a handler function you can also specify what events it
        should receive by supplying them as a second argument after the
        handler function. If you want to register multiple events for the same
        function, don't register the handler twice, but simply add (or bitwise
        or) the event values. The default value there is the pseudo-event
        TE_ALL, which is simply a value with all the event bits turned on. You
        can also subtract event type values from TE_ALL to exclude them.
    
        If you add the pseudo-event value TE_BTNONLY to the value supplied to
        addHandler, it indicates that you only want the function to receive
        events that have a button attached. (This only makes sense if you
        register your handler with M5.Touch, where it could also see events
        that were not tied to a button.)
        
        Here are some examples of ways to add a handler function:
        
            button1.addHandler(b1Function, TE_TOUCH + TE_RELEASE);
                b1Function only gets these two events for button1
                
            M5.Touch.addHandler(btnHandle, TE_ALL + TE_BTNONLY - TE_MOVE);
                btnHandle gets all events (except TE_MOVE) tied to buttons 
                
            swipeUp.addHandler(nextPage);
                Handler nextPage is called when the swipeUp gesture is detected.
                Note that nextPage must still be in the handler format,
                accepting the event as an argument, even when it can
                completely ignore it.
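
        To make this concrete, here is a minimal sketch (the button
        coordinates and names are just an example) that reacts to a tap on
        one button via a handler:

            #include <M5Core2.h>

            TouchButton middle(110, 80, 100, 80);

            void tapped(TouchEvent& e) {
              Serial.println("middle button tapped");
            }

            void setup() {
              M5.begin();
              middle.addHandler(tapped, TE_TAP);
            }

            void loop() {
              M5.update();
            }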
                
        If your handler accesses e.button or e.gesture, remember that these
        are pointers. Without going into too much detail, that means you must
        use the -> notation, so to read the button's x position, you would
        say "e.button->x".
    
        If you registered a function with M5.Touch and do this on an event that
        has no button attached, your program will crash. So make sure you only
        get details about buttons if you know your event has a button pointer.
        You can either test that with "if (e.button) ..." or make sure with
        TE_BTNONLY.
    
        Please have a look at the example sketch (see below) to understand how
        this all works and run the sketch to see all the events printed to the
        serial port. 
    
    
    Legacy API
    
        There was a previous version of this library, and it provided a number
        of functions that were single touch only. The older version did not
        have M5.update(). Instead it used ispressed() and getPressedPoint()
        functions as well as HotZones, which provided something that worked a
        little bit like TouchButtons. This older API is still supported (the
        M5Core2 Factory Test sketch still works), but you should not use it
        for new programs. The ispressed() function specifically does not mix
        well with code that uses M5.update().
    
    
    Example
    
        It may sound complicated when you read it all in this document, but
        it's all made to be easy to use.
        
        Under File/Examples/M5Core2/Basics/touch in your Arduino environment is
        an example sketch called "touch" that shows this library in action.
        Please have a look at it to understand how this all works and run the
        sketch to see all the events printed to the serial port. It shows
        buttons, gestures and events and should be pretty self-explanatory.
    
    */
    


  • @felmue I'll meet you in the middle: I raised it to 200 because it was a bit tight indeed. 250 felt like too much because then it takes just a little too long for my taste for single taps to register.



  • @vkichline said in M5Core2 library fork that supports multi-touch and provides Arduino-style virtual buttons.:

    Rop - excellent work!
    Feedback: I noticed you used left, top, right, bottom for rectangular dimensions of the TouchButtons, a common scheme.
    But throughout the M5 library, rectangles seem to be described with left, top, width, height.
    The difference may lead to frequent coding errors.

    Fixed in latest version, switched to xywh except for the legacy interface so that the Factory Test still compiles.



  • @rop said in M5Core2 library fork that supports multi-touch and provides Arduino-style virtual buttons.:

    @felmue I'll meet you in the middle: I raised it to 200 because it was a bit tight indeed. 250 felt like too much because then it takes just a little too long for my taste for single taps to register.

    Hi @Rop

    thank you for all your hard work. I like the gestures and I like your coding style, very clear and well documented.

    I fully understand your reasoning about not making the double-tap delay too long so as not to compromise single taps too much - 200 is fine for me too.

    Cheers
    Felix



  • Little drawing program, just to show the power and simplicity.

    (Use today's version of the library; I made a small change...)

    // Simple drawing program. Swipe from top to bottom of display to clear
    
    #include <M5Core2.h>
    
    TouchZone topEdge(0,0,320,50);
    TouchZone bottomEdge(0,190,320,90);
    Gesture swipeDown(topEdge, bottomEdge, "Swipe Down");
    
    void setup() {
      M5.begin();
      swipeDown.addHandler(clearScreen);
      M5.Touch.addHandler(thickLine, TE_MOVE + TE_TOUCH);
    }
    
    void loop() {
      M5.update();
    }
    
    void thickLine(TouchEvent& e) {
      // Draw circle every 3rd pixel between detected points to close lines
      int p = e.from.distanceTo(e.to) / 3;
      int dx = e.to.x - e.from.x;
      int dy = e.to.y - e.from.y;
      for (int n = 0; n <= p; n++) {
        float f = p ? (float)n / p : 0;
        M5.Lcd.fillCircle(e.from.x + (dx * f), e.from.y + (dy * f), 5, WHITE);
      }
    }
    
    void clearScreen(TouchEvent& e) {
      M5.Lcd.fillRect(0, 0, 320, 240, BLACK);
    }
    


  • It got merged... Yay!



  • Hey @Rop

    Congratulations, this is great news!

    Cheers
    Felix



  • Congrats, Rop, but I have bad news, too.
    They did not bump the version number, so Arduino and PlatformIO still install the original release.
    I manually downloaded and installed the update, verified the touch example was updated, and it works fine.
    However, I can no longer run the factory test at all. The old hack that made it work no longer works, and Felix's suggestions don't help either.
    So unfortunately, there are still issues to be addressed.



  • Hi guys

    in the M5Core2 library all I2C devices are using the internal I2C bus on Wire1 (GPIO 21 and 22).

    That actually hasn't changed with Rop's touch pull request - all that has changed is that Wire is now set up to use GPIO 32 and 33, which is the external I2C bus.

    The factory test source code was, and still is, wrongly using Wire for scanning and testing the internal I2C devices.

    @vkichline - to fix the factory test, open Core2_Factory_test.ino and change the four occurrences of Wire to Wire1.
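
    For example, the I2C scan would then talk to the internal bus like this (just a sketch of the change, not the exact factory test code):

    // Scan the internal I2C bus (GPIO 21/22) via Wire1 instead of Wire
    for (uint8_t addr = 1; addr < 127; addr++) {
      Wire1.beginTransmission(addr);
      if (Wire1.endTransmission() == 0) {
        Serial.printf("I2C device found at 0x%02X\n", addr);
      }
    }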

    Happy Stacking!
    Felix



  • @felmue, confirmed that the correct fix for Core2_Factory_test is to change lines 446 & 447 from Wire. ... to Wire1. ...
    With that, @Rop's update works fine.
    Thanks to both of you!



  • @rop, perhaps this topic should be closed since it's part of the standard lib now, but I have a question about TouchButtons and don't know where else to post it.
    With your change checked in, I feel that I can start to develop some Core2 code at last. I am trying to get M5Stack-SD-Updater working with your addition to touch, and I have found a difference between the simulated and physical BtnA/B/C behavior which is problematic.
    If you run this brief sketch on an M5Stack Fire and on a Core2 (changing the header as required)

    #include <M5Core2.h>
    void setup() { M5.begin(); }
    void loop() {
      M5.update();
      if(M5.BtnA.isPressed()) M5.Lcd.fillScreen(WHITE);
      else M5.Lcd.fillScreen(BLACK);
      delay(100);
    }
    

    ...you will see that on the Fire, if you reset the device with BtnA pressed, the screen will immediately turn white.
    On the Core2, if you reset with the simulated BtnA pressed, the screen will remain black until you release and press the button again.
    This makes it tricky to use the M5Stack-SD-Updater gesture of resetting with A pressed to load the menu application.

    Can you think of some work-around I could insert, perhaps in the setup() function, to make an already pressed button read as pressed in loop()?



  • I find that touch::ispressed() does not return true until you release and press again if you reset with the screen pressed.
    So this may be quite difficult...



  • Hi @vkichline

    hmm, I think that is a tricky one. I tried a few things but have not found a solution.

    I believe the touch IC constantly monitors the electrostatic field and automatically adapts to slow changes to establish a 'baseline'. Only big changes within a short period of time are taken into account as a touch.

    So when the finger is already on the screen when the device boots, that is simply taken as the baseline, and only lifting the finger and touching the screen again is a big enough change to count as a touch.

    Thanks
    Felix



  • @vkichline Booting with the screen pressed. I had not thought of that... It will be fixed.