Touch is the core input method on mobile devices and the basis of mobile application interaction. Android and iOS each have a complete touch event processing mechanism, and React Native (hereinafter RN) provides a unified abstraction over both, making it easy to handle touch events and user gestures on interface components. This article introduces touch event processing in RN.

1. RN Basic Touch Components

Apart from Text, RN components do not support click events by default, nor can they respond to basic touch events. RN therefore provides several components dedicated to handling touch responses, which cover most click-handling needs: TouchableHighlight, TouchableNativeFeedback, TouchableOpacity and TouchableWithoutFeedback. Since these components work in essentially the same way and differ only in their touch feedback effect, we refer to them collectively as Touchable**. The Touchable** components offer the following callbacks:
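The original listing here was lost; as a sketch based on the RN documentation, these are the press callbacks the Touchable** components share:

```javascript
// Press callbacks shared by the Touchable** components.
const touchableProps = {
  onPressIn: () => {},   // finger touches down inside the component
  onPressOut: () => {},  // finger is lifted or dragged outside the component
  onPress: () => {},     // a complete tap: press-in followed by press-out
  onLongPress: () => {}, // finger held down beyond the long-press threshold
};
```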
Their basic usage is shown below, taking TouchableHighlight as an example:
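The original example here was lost; the following is a minimal sketch (component and handler names are illustrative) of wrapping a Text in a TouchableHighlight, which darkens its child with `underlayColor` while pressed:

```javascript
import React from 'react';
import { TouchableHighlight, Text } from 'react-native';

// Minimal TouchableHighlight usage: the child is highlighted while
// pressed, and onPress fires when the finger is released.
const PressableLabel = () => (
  <TouchableHighlight
    underlayColor="#dddddd"
    onPress={() => console.log('pressed')}
    onLongPress={() => console.log('long pressed')}
  >
    <Text>Tap me</Text>
  </TouchableHighlight>
);

export default PressableLabel;
```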
The touch components provided by RN are very simple to use and are well covered by the official documentation, so they are not described in detail here. The following sections focus on handling raw user touch events.

2. Single-Component Touch Event Handling

As noted above, RN components do not process touch events by default. To handle touch events, a component must first "apply" to become the touch event responder; once its handling is finished, it releases the responder role. A touch event processing cycle begins when the user's finger presses the screen and ends when the finger is lifted; this constitutes one complete touch interaction. The life cycle of a single interaction on a single component is as follows: Let's analyze this life cycle in detail. Throughout event processing, a component is in one of two roles, and can switch between them: non-responder and event responder.

Non-responder

By default, touch input is not delivered directly to a component and the component cannot handle the event; that is, it is not an event responder. A component that wants to handle touch events must first apply to become the responder, using the following two properties:
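The original snippet was lost; as a sketch, these are the two "application" props, each of which asks RN to make the component the responder when it returns true:

```javascript
// The two "application" props. Returning true asks RN to make this
// component the responder for the current touch interaction.
const applyProps = {
  // Asked when a touch starts inside the component's bounds.
  onStartShouldSetResponder: (evt) => true,
  // Asked on every touch move while the finger is down.
  onMoveShouldSetResponder: (evt) => true,
};
```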
If a component returns true from either of the above callbacks, it is applying to become the event responder and to receive subsequent event input. Because only one component can be the responder at any moment, RN must arbitrate among all applying components, so not every application succeeds. RN notifies a component of the result through the following two callbacks:
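The original snippet was lost; a sketch of the two result callbacks:

```javascript
// Result callbacks for the responder application.
const resultProps = {
  // The application succeeded; this component is now the responder.
  onResponderGrant: (evt) => console.log('became responder'),
  // The application failed; another component holds the responder role.
  onResponderReject: (evt) => console.log('application rejected'),
};
```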
Event responder

Once a component has become the event responder through the steps above, subsequent event input is delivered to it through the following callbacks:
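The original snippet was lost; a sketch of the callbacks an active responder receives:

```javascript
// Callbacks delivered while this component is the responder.
const activeProps = {
  // The finger is moving; evt.nativeEvent carries the touch data.
  onResponderMove: (evt) => console.log('touch moved'),
  // The finger was lifted; the touch ends and the responder role is released.
  onResponderRelease: (evt) => console.log('touch ended'),
};
```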
As the earlier figure shows, while one component is the event responder, other components may also apply to handle touch events. In that case RN asks the current responder, via a callback, whether it is willing to release the responder role to another component. The callback is as follows:
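The original snippet was lost; a sketch of that callback:

```javascript
// Asked while this component is the responder and another component has
// applied; returning true hands the responder role over.
const onResponderTerminationRequest = (evt) => true;
```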
If this callback returns true, the responder role is released, and the following function is then called to notify the component that its event handling has been terminated:
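The original snippet was lost; a sketch of the termination callback:

```javascript
// Called when the responder role has been taken away, either because
// onResponderTerminationRequest returned true or because the system
// intervened. Clean up any in-progress gesture state here.
const onResponderTerminate = (evt) => console.log('responder terminated');
```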
This callback is also invoked when the system terminates the component's event handling directly, for example when a phone call arrives in the middle of a touch interaction.

Event data structure

As seen above, each touch callback receives an evt parameter, whose nativeEvent field carries the touch data. The fields of nativeEvent are as follows:
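The original field listing was lost; as a sketch based on the RN documentation, the shape of `evt.nativeEvent` for a touch event (field values here are placeholders):

```javascript
// Shape of evt.nativeEvent for a touch event (values are placeholders).
const nativeEvent = {
  changedTouches: [], // touch events that changed since the last event
  identifier: 0,      // id of this touch
  locationX: 0,       // X position, relative to the responding view
  locationY: 0,       // Y position, relative to the responding view
  pageX: 0,           // X position, relative to the root view
  pageY: 0,           // Y position, relative to the root view
  target: 0,          // node id of the view receiving the event
  timestamp: 0,       // time of the touch; useful for velocity calculation
  touches: [],        // all touches currently on the screen
};
```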
Among these fields, locationX and locationY are the most commonly used. Note that because these values come from the Native side, they are expressed in physical pixels; to convert them to RN's logical units, you can use the following method:
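The original conversion snippet was lost. As a standalone sketch: in an RN app the density ratio comes from `PixelRatio.get()`; here it is passed in as a parameter (a hypothetical `pxToDp` helper) so the example runs outside RN:

```javascript
// Hypothetical helper: convert physical pixels to RN logical units (dp).
// In an app, ratio would be PixelRatio.get(); it is a parameter here so
// the sketch is self-contained.
function pxToDp(px, ratio) {
  return px / ratio;
}

console.log(pxToDp(300, 3)); // → 100 logical units on a 3x-density screen
```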
3. Nested-Component Event Handling

The previous section described the event-handling process and mechanism for a single component. As mentioned, a component applies to become the event responder by returning true from its onStartShouldSetResponder or onMoveShouldSetResponder callback. But what happens when components are nested and several of them return true, given that only one component can be the responder at a time? For ease of description, assume our component layout is as follows: By default, RN uses a bubbling mechanism: the deepest component gets to respond first. So in the situation above, if the on*ShouldSetResponder callbacks of components A, B, and C all return true, only component C becomes the responder. This mechanism ensures that every component in the interface has a chance to respond. In some cases, however, a parent component needs to handle the event itself and prevent its children from responding. For this, RN provides a hijacking mechanism: as a touch event is passed down the tree, each parent is first asked whether it wants to capture the event and stop it from reaching its children, via the following two callbacks:
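The original snippet was lost; a sketch of the two capture-phase props:

```javascript
// Capture-phase props on a parent component. Returning true "hijacks"
// the event, so it is never offered to the child components.
const captureProps = {
  onStartShouldSetResponderCapture: (evt) => true,
  onMoveShouldSetResponderCapture: (evt) => true,
};
```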
This hijacking mechanism can be regarded as a "sinking" (capture) phase, complementing the bubbling phase above. The overall RN event-processing flow can be summarized as follows: Note: the * in the figure means Start or Move; for example, on*ShouldSetResponderCapture stands for onStartShouldSetResponderCapture or onMoveShouldSetResponderCapture, and likewise for the others. When a touch begins, onStartShouldSetResponderCapture of component A is called first. If it returns false, the event is passed down to component B, whose onStartShouldSetResponderCapture is called next. If that returns true, the event is not passed on to component C; instead B's onResponderStart is called directly, B becomes the responder, and subsequent events are delivered straight to it. The other paths can be analyzed in the same way. Also note the onTouchStart/onTouchEnd callbacks in the figure: they are not affected by responder negotiation. Every component within the touch area receives them, called in order from the deepest component up to the topmost one.

4. Gesture Recognition

So far we have covered only the basic touch event mechanism and its usage. In practice, sequences of touch events form higher-level gestures: common operations such as swiping the screen or pinching and rotating an image are all implemented through gesture recognition. Because sliding gestures are so common, RN ships with a built-in gesture recognition library, PanResponder, which wraps the event callbacks above, processes the raw touch data, recognizes sliding gestures, and exposes a higher-level, more meaningful interface, as follows:
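The two-phase flow described above can be illustrated with a small simulation. This is not RN source code, just the negotiation order expressed as plain functions over a hypothetical view list ordered outermost-first (A > B > C):

```javascript
// Illustrative simulation of RN's responder negotiation for nested views,
// ordered outermost-first. Capture runs top-down; bubbling runs bottom-up.
function findResponder(views) {
  // Capture ("sinking") phase: a parent returning true hijacks the event.
  for (const v of views) {
    if (v.captureHandler()) return v.name;
  }
  // Bubbling phase: the deepest component that applies becomes responder.
  for (const v of [...views].reverse()) {
    if (v.bubbleHandler()) return v.name;
  }
  return null;
}

const views = [
  { name: 'A', captureHandler: () => false, bubbleHandler: () => true },
  { name: 'B', captureHandler: () => false, bubbleHandler: () => true },
  { name: 'C', captureHandler: () => false, bubbleHandler: () => true },
];

console.log(findResponder(views)); // → 'C': deepest wins when nobody captures
views[1].captureHandler = () => true;
console.log(findResponder(views)); // → 'B': B hijacks during the capture phase
```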
As you can see, these interfaces largely mirror the basic callbacks introduced earlier and behave similarly, so they are not repeated here. One special callback, onShouldBlockNativeResponder, controls whether the Native platform's own event handling is blocked while the gesture is in progress; by default Native handling is blocked and all events are handled in JS. Note that this callback currently takes effect only on Android. The handlers here also receive a new parameter, gestureState, which contains sliding-related data derived from the basic touch data. Its contents are as follows:
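The original field listing was lost; as a sketch based on the RN documentation, the shape of `gestureState` (field values here are placeholders):

```javascript
// Shape of the gestureState argument (values are placeholders).
const gestureState = {
  stateID: 0,             // id of the gesture; persists while a touch is down
  moveX: 0,               // latest screen X of the most recent touch
  moveY: 0,               // latest screen Y of the most recent touch
  x0: 0,                  // screen X when the responder was granted
  y0: 0,                  // screen Y when the responder was granted
  dx: 0,                  // accumulated X distance since the gesture started
  dy: 0,                  // accumulated Y distance since the gesture started
  vx: 0,                  // current X velocity of the gesture
  vy: 0,                  // current Y velocity of the gesture
  numberActiveTouches: 0, // number of fingers currently on the screen
};
```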
Below is a simple example: a circular view that can be dragged with a finger. Its usage is as follows:
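The original usage snippet was lost; as a hypothetical sketch (`DraggableCircle` and its file path are assumed names, not from the original source), the component would simply be rendered inside a full-screen View:

```javascript
import React from 'react';
import { View } from 'react-native';
// Hypothetical module path for the draggable component sketched below.
import DraggableCircle from './DraggableCircle';

// Render the draggable circle inside a full-screen container.
const App = () => (
  <View style={{ flex: 1 }}>
    <DraggableCircle />
  </View>
);

export default App;
```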
As shown, we create a PanResponder instance in componentWillMount, set the relevant handlers on it, and then spread the resulting object onto the View's properties, as follows:
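The original implementation was lost; the following is a sketch under the structure the text describes (component and field names are illustrative): a PanResponder is created before mount, its `onPanResponderMove` offsets the circle by the accumulated `dx`/`dy`, and `panHandlers` is spread onto the View.

```javascript
import React from 'react';
import { PanResponder, StyleSheet, View } from 'react-native';

// Sketch of a finger-draggable circle. The PanResponder is created once
// in componentWillMount and its handlers are spread onto the View.
class DraggableCircle extends React.Component {
  constructor(props) {
    super(props);
    this.state = { left: 0, top: 0 };
    this._previousLeft = 0; // position where the last drag ended
    this._previousTop = 0;
  }

  componentWillMount() {
    this._panResponder = PanResponder.create({
      onStartShouldSetPanResponder: () => true,
      onMoveShouldSetPanResponder: () => true,
      // Follow the finger: offset the circle by the accumulated dx/dy.
      onPanResponderMove: (evt, gestureState) => {
        this.setState({
          left: this._previousLeft + gestureState.dx,
          top: this._previousTop + gestureState.dy,
        });
      },
      // Remember the final position so the next drag continues from it.
      onPanResponderRelease: (evt, gestureState) => {
        this._previousLeft += gestureState.dx;
        this._previousTop += gestureState.dy;
      },
    });
  }

  render() {
    return (
      <View
        {...this._panResponder.panHandlers}
        style={[styles.circle, { left: this.state.left, top: this.state.top }]}
      />
    );
  }
}

const styles = StyleSheet.create({
  circle: {
    position: 'absolute',
    width: 80,
    height: 80,
    borderRadius: 40,
    backgroundColor: 'skyblue',
  },
});

export default DraggableCircle;
```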
The rest of the code is straightforward and is not described in detail here.

5. Conclusion

As the above shows, RN provides an event-handling mechanism similar to the Native platforms', so it can implement all kinds of touch event processing and even complex gesture recognition. For nested components, RN supports event handling in two directions, "bubbling" and "sinking" (capture), which is somewhat similar to the NestedScrolling mechanism recently introduced on Android Native, and gives it a powerful event-handling model. Finally, note that because of RN's asynchronous communication and execution model, all the callbacks described above run on the JS thread, whereas Native touch events are handled on the UI thread. Animations driven by touches or gestures in JS may therefore exhibit some latency.