Briefly explain the programming methods of touch and click events in Android development

  • 2020-04-01 04:27:27
  • OfStack

On Android, there is more than one way to listen for the events through which users interact with your application. For user-interface events, the usual approach is to capture them from the specific View object the user is interacting with; the View class provides the means to do so.

Among the various View classes used to build layouts, you may notice several public callback methods that look useful for handling user-interface events. These methods are called by the Android framework when the corresponding action occurs on that object. For example, when a view such as a Button is touched, its onTouchEvent() method is called. However, to intercept the event this way you must extend the class and override the method, and extending every view object just to handle a single event is clearly impractical. This is why the View class also contains a collection of nested interfaces with much simpler callbacks to implement. These interfaces, called event listeners, are your ticket to capturing the user's interaction with your interface.

Although you will more commonly use event listeners to respond to user actions, there may be times when you do want to extend a View class in order to build a custom component. Perhaps you want to extend the Button class to make something fancier. In that case, you can define the default event behavior for your class using the class's event handlers.

Event Listeners

An event listener is a nested interface in the View class that contains a single callback method. That method is called by the Android framework when the listener registered on a view is triggered by user interaction with the item in the UI. The following callback methods are included in the event listener interfaces:

onClick(): from View.OnClickListener. Called when the user either touches the item (in touch mode), or focuses on the item with the navigation keys or trackball and then presses the "enter" key or presses down on the trackball.
onLongClick(): from View.OnLongClickListener. Called when the user either touches and holds the item (in touch mode), or focuses on the item with the navigation keys or trackball and then presses and holds the "enter" key or the trackball (for one second).
onFocusChange(): from View.OnFocusChangeListener. Called when the user navigates onto or away from the item using the navigation keys or trackball.
onKey(): from View.OnKeyListener. Called when the user has focused on the item and presses or releases a key on the device.
onTouch(): from View.OnTouchListener. Called when the user performs an action that qualifies as a touch event, including a press, a release, or any movement gesture on the screen (within the bounds of the item).
onCreateContextMenu(): from View.OnCreateContextMenuListener. Called when a context menu is being built (as the result of a sustained "long click").
These methods are the sole occupants of their respective interfaces. To define one of them and handle your events, implement the nested interface in your activity or define it as an anonymous class. Then pass an instance of your implementation to the respective View.set...Listener() method. For example, call setOnClickListener() and pass it your implementation of OnClickListener.

The following example shows how to register a click listener for a button:


// Create an anonymous implementation of OnClickListener
private OnClickListener mCorkyListener = new OnClickListener() {
  public void onClick(View v) {
   // do something when the button is clicked
  }
};
 
protected void onCreate(Bundle savedValues) {
  ...
  // Capture our button from layout
  Button button = (Button)findViewById(R.id.corky);
  // Register the onClick listener with the implementation above
  button.setOnClickListener(mCorkyListener);
  ...
}

You may also find it more convenient to implement OnClickListener as part of your activity. This avoids the extra class load and object allocation. For example:


public class ExampleActivity extends Activity implements OnClickListener {
  protected void onCreate(Bundle savedValues) {
    ...
    Button button = (Button)findViewById(R.id.corky);
    button.setOnClickListener(this);
  }
 
  // Implement the OnClickListener callback
  public void onClick(View v) {
   // do something when the button is clicked
  }
  ...
}

Note that the onClick() callback in the example above has no return value, but some other event-listener callbacks must return a boolean. The reason depends on the event. For the few that do, here is why:

onLongClick() returns a boolean to indicate whether you have consumed the event and it should not be carried further. That is, return true to indicate that you have handled the event and it should stop here; return false if you have not handled it and/or the event should continue to any other on-long-click listeners.
onKey() returns a boolean to indicate whether you have consumed the event and it should not be carried further. That is, return true to indicate that you have handled the event and it should stop here; return false if you have not handled it and/or the event should continue to any other on-key listeners.
onTouch() returns a boolean to indicate whether your listener consumed this event. What matters is that a touch event can consist of multiple actions that follow one another. So if you return false when the down action event is received, you indicate that you have not consumed the event and are not interested in the subsequent actions: you will not be called for the other actions within the event, such as a movement gesture or the eventual up action event.
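The onTouch() rule above, where not consuming ACTION_DOWN means giving up the rest of the gesture, can be sketched with a minimal stand-in for the dispatch loop. TouchDispatchDemo and its types are invented for illustration; this is not the real Android pipeline:

```java
import java.util.ArrayList;
import java.util.List;

// Simplified model of per-gesture delivery: if the listener does not
// consume ACTION_DOWN, no further actions of that gesture are delivered.
public class TouchDispatchDemo {
  enum Action { DOWN, MOVE, UP }

  interface OnTouchListener {
    boolean onTouch(Action action); // true means "consumed"
  }

  // Delivers a gesture action by action, stopping after an unconsumed DOWN.
  static List<Action> dispatchGesture(OnTouchListener l, Action... gesture) {
    List<Action> delivered = new ArrayList<>();
    for (Action a : gesture) {
      delivered.add(a);
      boolean consumed = l.onTouch(a);
      if (a == Action.DOWN && !consumed) {
        break; // listener opted out of the rest of this gesture
      }
    }
    return delivered;
  }

  public static void main(String[] args) {
    // Listener returns false: it only ever sees the DOWN action.
    System.out.println(dispatchGesture(a -> false,
        Action.DOWN, Action.MOVE, Action.UP)); // [DOWN]
    // Listener consumes DOWN: the full gesture arrives.
    System.out.println(dispatchGesture(a -> true,
        Action.DOWN, Action.MOVE, Action.UP)); // [DOWN, MOVE, UP]
  }
}
```

The real framework applies the same idea per view: an unconsumed down event tells the system this view is not a target for the gesture.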
Remember that key events are always delivered to the view that currently has focus. They are dispatched starting from the top of the view hierarchy and travel down until they reach the appropriate destination. If your view (or a child of your view) currently has focus, you can watch the event travel through the dispatchKeyEvent() method. As an alternative to capturing key events through your view, you can also receive all of them inside your activity with onKeyDown() and onKeyUp().

Note: Android calls event listeners first, and then the appropriate default event handlers from the class definition second. As such, returning true from an event listener stops the propagation of the event to other event listeners and also blocks the callback to the default event handler in the view. So be certain that you want to terminate the event when you return true.

Event Handlers

If you are building a custom component from a View, you will be able to define several callback methods used as default event handlers. In the documentation on building custom components, you will learn some of the common callbacks used for event handling, including:

onKeyDown(int, KeyEvent) - called when a new key-down event occurs.
onKeyUp(int, KeyEvent) - called when a key-up event occurs.
onTrackballEvent(MotionEvent) - called when a trackball motion event occurs.
onTouchEvent(MotionEvent) - called when a touch-screen motion event occurs.
onFocusChanged(boolean, int, Rect) - called when the view gains or loses focus.

You should also be aware of some other methods that are not part of the View class but can directly affect the way you handle events. So, when managing more complex events inside a layout, consider these methods:

Activity.dispatchTouchEvent(MotionEvent) - allows your Activity to intercept all touch events before they are dispatched to the window.
ViewGroup.onInterceptTouchEvent(MotionEvent) - allows a ViewGroup to watch events as they are dispatched to its child views.
ViewParent.requestDisallowInterceptTouchEvent(boolean) - call this on a parent view to indicate that it should not intercept touch events with onInterceptTouchEvent(MotionEvent).

Touch Mode

When the user is navigating a user interface with the directional keys or a trackball, it is necessary to give focus to actionable items (such as buttons) so the user can see which item will accept input. If the device has touch capability, however, and the user interacts with the interface by touching it, there is no need to highlight items or give focus to a particular view. Hence, there is an interaction mode called touch mode.

For a touch-capable device, once the user touches the screen, the device enters touch mode. From then on, only views for which isFocusableInTouchMode() returns true are focusable, such as text-editing widgets. Other touchable views, such as buttons, do not take focus when touched; they simply fire their on-click listeners when pressed. Any time the user presses a directional key or scrolls the trackball, the device exits touch mode and finds a view to take focus. The user can then resume interacting with the interface without touching the screen.

The touch-mode state is maintained throughout the system (all windows and activities). To query the current state, you can call isInTouchMode() to see whether the device is currently in touch mode.

Handling Focus

The framework handles routine focus movement in response to user input. This includes changing the focus as views are removed or hidden, or as new views become available. Views indicate their willingness to take focus through the isFocusable() method.

To change whether a view can take focus, call setFocusable(). In touch mode, you may query whether a view allows focus with isFocusableInTouchMode(), and you can change this with setFocusableInTouchMode(). Focus movement is based on an algorithm that finds the nearest neighbor in a given direction. In rare cases, the default algorithm may not match the behavior the developer intends. In these situations, you can provide explicit overrides with the following XML attributes in the layout file: nextFocusDown, nextFocusLeft, nextFocusRight, and nextFocusUp. Add one of these attributes to the view from which focus is leaving, and set its value to the id of the view that should receive focus. For example:


<LinearLayout
  android:orientation="vertical"
  ... >
 <Button android:id="@+id/top"
     android:nextFocusUp="@+id/bottom"
     ... />
 <Button android:id="@+id/bottom"
     android:nextFocusDown="@+id/top"
     ... />
</LinearLayout>

Normally, in this vertical layout, navigating up from the first button would not go anywhere, nor would navigating down from the second. Now that the top button has defined the bottom one as nextFocusUp (and vice versa), the navigation focus will cycle from top to bottom and from bottom to top.

If you would like to declare a view as focusable in your user interface (when it traditionally is not), add the android:focusable XML attribute to the view in your layout declaration and set its value to true. You can also declare a view as focusable while in touch mode with android:focusableInTouchMode.
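For example, assuming a TextView that should take focus even in touch mode (the id here is purely illustrative), the two attributes combine like this:

```xml
<!-- A TextView made focusable, including in touch mode -->
<TextView android:id="@+id/status"
    android:focusable="true"
    android:focusableInTouchMode="true"
    ... />
```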

To request a specific view that receives focus, call requestFocus().

To listen for focus events (to be notified when a view gains or loses focus), use onFocusChange(), as described in the Event listener above.

The difference between a touch event and a click event

How does Android determine whether a user action on a View control on the screen is an onTouchEvent, an onClick, or an onLongClick event?
In Android, a single user action can be handled in turn by different views, and a view that fully responds to the user's UI action is said to consume the event. In what order does Android pass the event along? And under what circumstances is an event considered consumed?
Clearing up these questions is important for writing UI code that responds correctly to user actions, especially when different views on the screen need to respond differently to the same operation. A typical example is a Widget the user has placed on the home screen: as the user performs various operations on the Widget, the desktop itself must sometimes respond to those operations and sometimes ignore them. Only by understanding how events are triggered and passed along can you ensure that UI controls respond correctly to user actions even in very complex interface layouts.
 
1.   onTouchEvent
The three events most commonly handled in onTouchEvent are ACTION_DOWN, ACTION_MOVE, and ACTION_UP.
These three events identify the most basic touch-screen operations, and their meanings are clear. Although everyone uses them every day, one point deserves attention: ACTION_DOWN, as the starting event, is more important than ACTION_MOVE and ACTION_UP. If an ACTION_MOVE or ACTION_UP occurs, an ACTION_DOWN must have occurred before it.
From the Android source code you can see several interaction mechanisms built on this difference in importance, some of which are explicitly mentioned in the SDK. For example, in ViewGroup's onInterceptTouchEvent() method, if you return true for the ACTION_DOWN event, the subsequent events of the gesture are delivered directly to the ViewGroup's own onTouchEvent() rather than continuing through onInterceptTouchEvent().
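That interception rule can be modeled with a small stand-alone sketch. InterceptDemo, Group, and the log strings are invented for illustration; the real routing in ViewGroup.dispatchTouchEvent() is considerably more involved:

```java
import java.util.ArrayList;
import java.util.List;

// Simplified model of ViewGroup interception: once the group claims the
// gesture on ACTION_DOWN, every later event goes to its own onTouchEvent.
public class InterceptDemo {
  enum Action { DOWN, MOVE, UP }

  static class Group {
    final boolean interceptOnDown; // what onInterceptTouchEvent returns for DOWN
    boolean intercepting = false;
    final List<String> log = new ArrayList<>();

    Group(boolean interceptOnDown) { this.interceptOnDown = interceptOnDown; }

    // Mimics the routing decision made while dispatching one event.
    void dispatch(Action a) {
      if (a == Action.DOWN) {
        // The intercept question is asked at the start of the gesture.
        intercepting = interceptOnDown;
      }
      if (intercepting) {
        log.add("group.onTouchEvent:" + a);
      } else {
        log.add("child.onTouchEvent:" + a);
      }
    }
  }

  public static void main(String[] args) {
    Group g = new Group(true); // intercepts on DOWN
    g.dispatch(Action.DOWN);
    g.dispatch(Action.MOVE);
    g.dispatch(Action.UP);
    System.out.println(g.log);
    // [group.onTouchEvent:DOWN, group.onTouchEvent:MOVE, group.onTouchEvent:UP]
  }
}
```

With interceptOnDown set to false, all three log entries would instead go to the child, which is the default behavior for a ViewGroup.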
 
2.   OnClick, onLongClick, and onTouchEvent
Some posts claim that if you handle onTouchEvent in a View, you no longer need to deal with onClick, because Android will only trigger one of the two methods. This understanding is not quite correct. For a given view, when the user completes one touch operation, the signals obtained from the sensor are clearly two operations: the finger pressing down and then lifting up.
In Android, the triggering of onClick and onLongClick is tied to ACTION_DOWN and ACTION_UP. In terms of timing, if we override onClick, onLongClick, and onTouchEvent in a View, onTouchEvent is the first to capture the ACTION_DOWN and ACTION_UP events, and only afterwards is onClick or onLongClick triggered. The main logic is implemented in the onTouchEvent() method of View.java:


case MotionEvent.ACTION_DOWN:
  mPrivateFlags |= PRESSED;
  refreshDrawableState();
  if ((mViewFlags & LONG_CLICKABLE) == LONG_CLICKABLE) {
    postCheckForLongClick();
  }
  break;

case MotionEvent.ACTION_UP:
  if ((mPrivateFlags & PRESSED) != 0) {
    boolean focusTaken = false;
    if (isFocusable() && isFocusableInTouchMode() && !isFocused()) {
      focusTaken = requestFocus();
    }
    if (!mHasPerformedLongPress) {
      // A short press: cancel the pending long-press check ...
      if (mPendingCheckForLongPress != null) {
        removeCallbacks(mPendingCheckForLongPress);
      }
      // ... and perform the click, unless the tap merely took focus
      if (!focusTaken) {
        performClick();
      }
    }
    ...
  }
  break;

   
   
As you can see, the click is triggered after the system captures ACTION_UP, and it is carried out by performClick(), which invokes the onClick() method of the previously registered listener:
   


public boolean performClick() {
  ...
  if (mOnClickListener != null) {
    playSoundEffect(SoundEffectConstants.CLICK);
    mOnClickListener.onClick(this);
    return true;
  }
  return false;
}

   
     
 
The trigger for LongClick, by contrast, starts with ACTION_DOWN and is carried out by the postCheckForLongClick() method:
   


private void postCheckForLongClick() {
  mHasPerformedLongPress = false;
  if (mPendingCheckForLongPress == null) {
    mPendingCheckForLongPress = new CheckForLongPress();
  }
  mPendingCheckForLongPress.rememberWindowAttachCount();
  postDelayed(mPendingCheckForLongPress, ViewConfiguration.getLongPressTimeout());
}

 
As you can see, once the ACTION_DOWN event is captured, the system posts a delayed operation (the delay is 500 ms on Eclair 2.1), which triggers execution of the CheckForLongPress runnable:
   


class CheckForLongPress implements Runnable {
  ...
  public void run() {
    if (isPressed() && (mParent != null)
        && mOriginalWindowAttachCount == mWindowAttachCount) {
      if (performLongClick()) {
        mHasPerformedLongPress = true;
      }
    }
  }
  ...
}

   
 
If all the conditions are met, CheckForLongPress executes performLongClick(), which in turn calls onLongClick():
   


public boolean performLongClick() {
  ...
  if (mOnLongClickListener != null) {
    handled = mOnLongClickListener.onLongClick(View.this);
  }
  ...
}

   
 
As these implementations show, whether onClick() or onLongClick() is triggered is determined from the captured ACTION_DOWN and ACTION_UP events. So if we listen for, or override, onClick(), onLongClick(), and onTouchEvent() in an Activity or View, it does not follow that only one of them will fire.
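Under the assumptions above (a 500 ms timeout and a single DOWN followed by a single UP), the click-versus-long-click decision implemented by postCheckForLongClick() and the ACTION_UP branch reduces to a toy timing model. LongPressModel is invented for illustration; timestamps are plain longs rather than a real Handler/Looper:

```java
// Toy timeline model of "DOWN posts a delayed long-press check;
// an early UP cancels it and performs a click instead".
public class LongPressModel {
  static final long LONG_PRESS_TIMEOUT_MS = 500; // value cited for Eclair 2.1

  // Returns "longClick" if the finger stayed down past the timeout,
  // otherwise "click".
  static String classify(long downTimeMs, long upTimeMs) {
    if (upTimeMs - downTimeMs >= LONG_PRESS_TIMEOUT_MS) {
      return "longClick"; // CheckForLongPress ran before ACTION_UP
    }
    return "click"; // removeCallbacks cancelled it; performClick() runs
  }

  public static void main(String[] args) {
    System.out.println(classify(0, 120)); // click
    System.out.println(classify(0, 900)); // longClick
  }
}
```

In the real framework the timeout comes from ViewConfiguration.getLongPressTimeout(), and a long click that returns true also sets mHasPerformedLongPress so the later ACTION_UP does not perform a click as well.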

