React Native touch event processing detailed explanation

Touch is the core input method on mobile devices and the basis of mobile application interaction. Android and iOS each have a complete touch event processing mechanism. React Native (hereinafter RN) provides a unified way to handle touch events on interface components, user gestures, and so on. This article introduces touch event handling in RN.

1. RN basic touch components

Except for Text, RN components do not support click events by default, nor can they respond to basic touch events. RN therefore provides several components that handle touch responses directly and cover most click-handling needs: TouchableHighlight, TouchableNativeFeedback, TouchableOpacity, and TouchableWithoutFeedback. Since these components share essentially the same API and differ only in their touch feedback effect, we will refer to them collectively as Touchable**. Touchable** provides the following callbacks:

  • onPressIn: called when the touch starts (finger down);
  • onPressOut: called when the touch ends or the finger moves off the component;
  • onPress: called on a completed tap;
  • onLongPress: called on a long press.

Their basic usage is as follows, here we take TouchableHighlight as an example:

  <TouchableHighlight
    onPressIn={() => console.log('onPressIn')}
    onPressOut={() => console.log('onPressOut')}
    onPress={() => console.log('onPress')}
    onLongPress={() => console.log('onLongPress')}
  >
    <Image
      style={styles.button}
      source={require('./img/rn_logo.png')} />
  </TouchableHighlight>

The touch components provided by RN are very simple to use; see the official documentation for details, which will not be repeated here. The rest of this article focuses on handling user touch events directly.

2. Single component touch event processing

We know that RN components do not process touch events by default. To process touch events, a component must first "apply" to become the responder for touch events; once its handling is finished, it releases the responder role. A touch event processing cycle starts when the user's finger presses the screen and ends when the finger lifts — that is one complete touch interaction.

The life cycle of a single touch interaction handled by a single component is shown in the figure below.

Let's analyze the life cycle of event processing in detail. During event processing, a component is in one of two roles and can switch between them: non-responder and event responder.

Non-event responder

By default, touch event input is not delivered to a component, and the component cannot handle events — it is not an event responder. If a component wants to handle touch events, it must first apply to become the event responder. A component applies through the following two properties:

  • View.props.onStartShouldSetResponder: this property takes a callback with the prototype function(evt): bool. When a touch starts (touchDown), RN calls it to ask the component whether it wants to become the event responder and receive event processing. Returning true means the component is applying to respond;
  • View.props.onMoveShouldSetResponder: similar to the previous property, but asked while the touch is in progress (touchMove); again, returning true means the component is applying to respond.

If a component returns true from one of these callbacks, it has applied to become the event responder and wants to receive subsequent event input. Because there can be only one responder at a time, RN must arbitrate between all applying components, so not every application succeeds. RN reports the result through the following two callbacks:

  • View.props.onResponderGrant: (evt) => {}: the application succeeded and the component is now the event responder; it begins receiving subsequent touch input. The component typically enters its active state here and initializes its event handling or gesture recognition.
  • View.props.onResponderReject: (evt) => {}: the application failed: another component is currently handling events and is not willing to give up the responder role, so subsequent input events will not be delivered to this component.
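The grant/reject negotiation can be pictured with a small plain-JavaScript simulation. This is a toy model of the "only one responder at a time" rule, not RN's actual implementation; the arbiter, component factory, and field names are all made up for illustration:

```javascript
// Toy arbiter: grants the responder role to the first applicant and
// rejects later applicants while a responder is active.
function createArbiter() {
  let responder = null;
  return {
    touchDown(component) {
      if (!component.onStartShouldSetResponder()) return; // did not apply
      if (responder === null) {
        responder = component;
        component.onResponderGrant();  // application succeeded
      } else {
        component.onResponderReject(); // someone else is responding
      }
    },
    touchUp() {
      responder = null; // interaction finished, role released
    },
  };
}

// Hypothetical components that record what happened to them.
function makeComponent(name, wants) {
  return {
    name,
    granted: false,
    rejected: false,
    onStartShouldSetResponder: () => wants,
    onResponderGrant() { this.granted = true; },
    onResponderReject() { this.rejected = true; },
  };
}

const arbiter = createArbiter();
const a = makeComponent('A', true);
const b = makeComponent('B', true);
arbiter.touchDown(a); // A applies first and is granted
arbiter.touchDown(b); // B applies while A is responding, so B is rejected
```

In real RN the arbitration also takes the component hierarchy into account, as described in the nested-components section below.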

Event responder

If the component has become the event responder through the steps above, subsequent event input is delivered to it through the following callbacks:

  • View.props.onResponderStart: (evt) => {}: called for the touch start (touchDown) once the component has successfully become the responder;
  • View.props.onResponderMove: (evt) => {}: called as the touching finger moves. This callback can fire very frequently, so keep its body as cheap as possible;
  • View.props.onResponderRelease: (evt) => {}: called when the touch ends (touchUp), i.e. the user has completed the interaction. Gesture recognition should be finished here; afterwards the component is no longer the responder and is deactivated;
  • View.props.onResponderEnd: (evt) => {}: called when the component's event response ends.
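As a sketch of this lifecycle, the following plain-JS responder object records the path of the finger as hand-made evt objects (not real RN events) are fed through its callbacks:

```javascript
// Hypothetical responder: accumulates touch positions during one interaction.
const responder = {
  path: [],
  done: false,
  onResponderStart(evt) {
    // Touch down: begin a new path.
    this.path = [[evt.nativeEvent.locationX, evt.nativeEvent.locationY]];
  },
  onResponderMove(evt) {
    // Fires very frequently; keep this cheap.
    this.path.push([evt.nativeEvent.locationX, evt.nativeEvent.locationY]);
  },
  onResponderRelease(evt) {
    // Touch up: finish gesture recognition; the component deactivates after this.
    this.done = true;
  },
};

// Simulated touch sequence: down, two moves, up.
const evt = (x, y) => ({ nativeEvent: { locationX: x, locationY: y } });
responder.onResponderStart(evt(0, 0));
responder.onResponderMove(evt(5, 3));
responder.onResponderMove(evt(12, 7));
responder.onResponderRelease(evt(12, 7));
```

In an app these functions would be passed as props on a View, and RN would invoke them with real events.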

As the earlier figure shows, while a component is the event responder, other components may also apply to handle the touch. RN then asks the current responder, through the following callback, whether it is willing to release the responder role:

  View.props.onResponderTerminationRequest: (evt) => bool

If this callback returns true, the responder role is released, and the following function is called to notify the component that its event handling has been terminated:

  View.props.onResponderTerminate: (evt) => {}

This callback is also invoked when the system terminates the component's event handling directly, for example when a phone call arrives in the middle of a touch interaction.

Event data structure

As seen above, every touch callback receives an evt parameter whose nativeEvent field carries the touch data. The fields of nativeEvent are:

  • identifier: the touch ID, generally corresponding to one finger; used to tell fingers apart in multi-touch;
  • locationX and locationY: the position of the touch point relative to the component;
  • pageX and pageY: the position of the touch point relative to the screen;
  • timestamp: the timestamp of the touch event, useful for velocity calculations;
  • target: the ID of the component receiving the touch event;
  • changedTouches: an array of evt objects: all touch events reported since the last callback. A touch generates a large volume of events that cannot always be reported one by one, so the system batches them this way;
  • touches: an array of evt objects: one entry per currently active touch point during multi-touch.

Of these fields, locationX and locationY are used most often. Note that because they come from native code, their units are actual pixels; to convert them to RN logical units, divide by the pixel ratio:

  var pX = evt.nativeEvent.locationX / PixelRatio.get();
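For instance, a helper like the following (hypothetical; the pixel ratio is passed in as a parameter so the code can run outside RN, whereas in an app you would call PixelRatio.get()) pulls the commonly used fields out of evt.nativeEvent:

```javascript
// Extract the common fields from a touch event and convert the
// component-relative position by dividing by the given pixel ratio.
function describeTouch(evt, pixelRatio) {
  const e = evt.nativeEvent;
  return {
    finger: e.identifier,        // which finger, for multi-touch
    x: e.locationX / pixelRatio, // position relative to the component
    y: e.locationY / pixelRatio,
    touches: e.touches.length,   // number of active touch points
  };
}

// Fake event as it might arrive on a device with a pixel ratio of 2.
const fakeEvt = {
  nativeEvent: {
    identifier: 1,
    locationX: 100,
    locationY: 60,
    touches: [{}],
  },
};
const info = describeTouch(fakeEvt, 2);
```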

3. Nested component event handling

The previous section described event handling for a single component. As mentioned, a component becomes the event responder by returning true from its onStartShouldSetResponder or onMoveShouldSetResponder callback. But if several nested components all return true from these callbacks, while only one component can be the event responder at a time, how is the conflict resolved? For ease of description, assume our component layout is as follows:

By default, RN uses a bubbling mechanism: the deepest component that applies gets to respond first. So in the situation shown in the figure, if on*ShouldSetResponder of components A, B, and C all return true, only the innermost component C becomes the responder. This mechanism ensures that every component in the interface can get a response.

In some cases, however, a parent component needs to handle the event itself and prevent its children from responding. For this, RN provides a hijacking (capture) mechanism: as the touch event travels down the tree, each parent is first asked whether it wants to intercept the event and keep it from its children, via the following two callbacks:

  • View.props.onStartShouldSetResponderCapture: this property takes a callback with the prototype function(evt): bool. When a touch starts (touchDown), RN calls it on the container component to ask whether it wants to hijack the responder assignment and handle the event itself; returning true means hijack;
  • View.props.onMoveShouldSetResponderCapture: similar, but asks the container whether to hijack the touch move events (touchMove).

This hijacking mechanism can be thought of as a "sinking" pass. Together with the bubbling pass above, the RN event processing flow can be summarized as follows:

Note: The * in the figure indicates that it can be Start or Move. For example, on*ShouldSetResponderCapture indicates onStartShouldSetResponderCapture or onMoveShouldSetResponderCapture, and the others are similar.

When a touch event starts, component A's onStartShouldSetResponderCapture is called first. If it returns false, the event is passed down to component B, as shown in the figure, and B's onStartShouldSetResponderCapture is called next. If that returns true, the event is not passed on to component C; instead, B's onResponderStart is called directly, B becomes the event responder, and subsequent events are delivered to it. The other cases are analogous.
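The two-pass flow above can be sketched as a plain-JS walk over a nested chain A → B → C. This is a toy model of the ordering, not RN's implementation; the component objects and findResponder helper are invented for illustration:

```javascript
// chain[0] is the outermost component, chain[chain.length - 1] the innermost.
function findResponder(chain) {
  // Capture ("sinking") pass: outermost first; returning true hijacks the event.
  for (const c of chain) {
    if (c.onStartShouldSetResponderCapture()) return c.name;
  }
  // Bubbling pass: innermost first; the deepest applicant wins.
  for (const c of [...chain].reverse()) {
    if (c.onStartShouldSetResponder()) return c.name;
  }
  return null; // nobody applied
}

const A = { name: 'A', onStartShouldSetResponderCapture: () => false, onStartShouldSetResponder: () => true };
const B = { name: 'B', onStartShouldSetResponderCapture: () => true,  onStartShouldSetResponder: () => true };
const C = { name: 'C', onStartShouldSetResponderCapture: () => false, onStartShouldSetResponder: () => true };

const winner = findResponder([A, B, C]); // B hijacks during the capture pass
```

If B's capture callback returned false instead, the bubbling pass would run and the innermost component C would win, matching the default behavior described above.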

Note the onTouchStart/onTouchEnd callbacks in the figure. These callbacks are not affected by the responder mechanism: every component within the touched area receives them, called in order from the innermost component outward.

4. Gesture Recognition

The preceding sections covered only the basic touch event mechanism and its usage. In practice, sequences of touch events form higher-level gestures: the common cases of scrolling the screen, or pinching and rotating a picture, are all implemented through gesture recognition.

Because some gestures are so common, RN ships with a built-in gesture recognition library, PanResponder. It wraps the event callbacks above, processes the raw touch data, performs pan (slide) gesture recognition, and exposes a higher-level, more meaningful interface:

  • onMoveShouldSetPanResponder: (e, gestureState) => bool
  • onMoveShouldSetPanResponderCapture: (e, gestureState) => bool
  • onStartShouldSetPanResponder: (e, gestureState) => bool
  • onStartShouldSetPanResponderCapture: (e, gestureState) => bool
  • onPanResponderReject: (e, gestureState) => {…}
  • onPanResponderGrant: (e, gestureState) => {…}
  • onPanResponderStart: (e, gestureState) => {…}
  • onPanResponderEnd: (e, gestureState) => {…}
  • onPanResponderRelease: (e, gestureState) => {…}
  • onPanResponderMove: (e, gestureState) => {…}
  • onPanResponderTerminate: (e, gestureState) => {…}
  • onPanResponderTerminationRequest: (e, gestureState) => {…}
  • onShouldBlockNativeResponder: (e, gestureState) => bool

As you can see, these interfaces correspond closely to the basic callbacks described earlier and behave similarly, so they are not repeated here. One special callback, onShouldBlockNativeResponder, controls whether the native platform's event handling is blocked: by default it is blocked and all event handling stays in JS. Note that this callback currently works only on Android. The callbacks here also take a new gestureState parameter, which carries pan-related data derived from the raw touch events. Its fields are:

  • stateID: the ID of the pan gesture; it stays the same throughout one complete interaction;
  • moveX and moveY: the latest screen coordinates of the most recently moved touch;
  • x0 and y0: the screen coordinates where the pan gesture recognition started;
  • dx and dy: the accumulated distance moved since the gesture started;
  • vx and vy: the current velocity of the gesture;
  • numberActiveTouches: the number of fingers currently touching.
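As an illustration of how these fields get used, a horizontal-swipe classifier might inspect dx, dy, and vx when the gesture is released. This is a hypothetical helper with hand-picked thresholds, not part of PanResponder:

```javascript
// Classify a released gesture from its gestureState.
// The 50-unit distance and 0.1 velocity thresholds are arbitrary,
// chosen only for illustration.
function classifySwipe(gestureState) {
  const { dx, dy, vx } = gestureState;
  if (Math.abs(dx) < 50 || Math.abs(dx) < Math.abs(dy)) {
    return 'none'; // too short, or mostly vertical
  }
  if (Math.abs(vx) < 0.1) {
    return 'none'; // too slow to count as a swipe
  }
  return dx > 0 ? 'right' : 'left';
}

// Fake gestureState objects as they might look on release.
const fling = { dx: 120, dy: 8, vx: 1.4 };   // fast move to the right
const wander = { dx: 30, dy: 25, vx: 0.05 }; // slow, short drag
```

In an app, a function like this would be called from onPanResponderRelease with the real gestureState.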

The following simple example implements a circular control that can be dragged around with a finger:

  import React from 'react';
  import {
    AppRegistry,
    PanResponder,
    StyleSheet,
    View,
    processColor,
  } from 'react-native';

  var CIRCLE_SIZE = 80;
  var CIRCLE_COLOR = 'blue';
  var CIRCLE_HIGHLIGHT_COLOR = 'green';

  var PanResponderExample = React.createClass({

    statics: {
      title: 'PanResponder Sample',
      description: 'Shows the use of PanResponder to provide basic gesture handling.',
    },

    _panResponder: {},
    _previousLeft: 0,
    _previousTop: 0,
    _circleStyles: {},
    circle: (null : ?{ setNativeProps(props: Object): void }),

    componentWillMount: function() {
      this._panResponder = PanResponder.create({
        onStartShouldSetPanResponder: (evt, gestureState) => true,
        onMoveShouldSetPanResponder: (evt, gestureState) => true,
        onPanResponderGrant: this._handlePanResponderGrant,
        onPanResponderMove: this._handlePanResponderMove,
        onPanResponderRelease: this._handlePanResponderEnd,
        onPanResponderTerminate: this._handlePanResponderEnd,
      });
      this._previousLeft = 20;
      this._previousTop = 84;
      this._circleStyles = {
        style: {
          left: this._previousLeft,
          top: this._previousTop
        }
      };
    },

    componentDidMount: function() {
      this._updatePosition();
    },

    render: function() {
      return (
        <View style={styles.container}>
          <View
            ref={(circle) => {
              this.circle = circle;
            }}
            style={styles.circle}
            {...this._panResponder.panHandlers}
          />
        </View>
      );
    },

    _highlight: function() {
      const circle = this.circle;
      circle && circle.setNativeProps({
        style: {
          backgroundColor: processColor(CIRCLE_HIGHLIGHT_COLOR)
        }
      });
    },

    _unHighlight: function() {
      const circle = this.circle;
      circle && circle.setNativeProps({
        style: {
          backgroundColor: processColor(CIRCLE_COLOR)
        }
      });
    },

    _updatePosition: function() {
      this.circle && this.circle.setNativeProps(this._circleStyles);
    },

    _handlePanResponderGrant: function(e: Object, gestureState: Object) {
      this._highlight();
    },
    _handlePanResponderMove: function(e: Object, gestureState: Object) {
      this._circleStyles.style.left = this._previousLeft + gestureState.dx;
      this._circleStyles.style.top = this._previousTop + gestureState.dy;
      this._updatePosition();
    },
    _handlePanResponderEnd: function(e: Object, gestureState: Object) {
      this._unHighlight();
      this._previousLeft += gestureState.dx;
      this._previousTop += gestureState.dy;
    },
  });

  var styles = StyleSheet.create({
    circle: {
      width: CIRCLE_SIZE,
      height: CIRCLE_SIZE,
      borderRadius: CIRCLE_SIZE / 2,
      backgroundColor: CIRCLE_COLOR,
      position: 'absolute',
      left: 0,
      top: 0,
    },
    container: {
      flex: 1,
      paddingTop: 64,
    },
  });

As you can see, we create a PanResponder instance in componentWillMount, set the relevant callbacks, and then spread the resulting handlers onto the View's props:

  <View
    {...this._panResponder.panHandlers}
  />

The rest of the code is relatively simple and will not be described in detail here.

5. Conclusion

From the above introduction, we can see that RN provides an event handling mechanism similar to the native platforms', so it can implement all kinds of touch event processing and even complex gesture recognition.

For nested components, RN offers event handling in two directions, "bubbling" and "sinking" (capture), somewhat like the NestedScrolling APIs recently introduced on native Android; together they make for a more powerful event handling mechanism.

Also note that, due to RN's asynchronous communication and execution model, all the callbacks described above run on the JS thread rather than the native UI thread, whereas native touch events are handled on the UI thread. Animations driven from JS by touch or gestures may therefore exhibit some delay.
