The truth about brain-to-brain interfaces: Is the "telepathy" that Musk pursues possible?

Though we lack the phoenix's twin wings, our hearts are linked and beat as one. (After a line by the Tang poet Li Shangyin)

Written by Gu Fanji (School of Life Sciences, Fudan University)

In 2017, Tim Urban, author of the well-known technology blog "Wait But Why", was invited by Elon Musk to pay an extended visit to Neuralink, the neural-interface company Musk founded. Urban also had in-depth discussions with Musk and most of the founding team, in meetings and in private. After the visit, Urban published a long blog post summarizing what he had learned, quoting Musk [1]:

I can imagine a bouquet of flowers and have a very clear picture in my mind, but if you try to describe it with words, you'd need a lot of words and still only be able to give a rough idea.

You have a lot of thoughts in your head, and they have to be compressed by your brain into data that is transmitted very slowly, either in spoken words or typed words. That's language. Your brain runs a compression algorithm on the transmission of thoughts and concepts. In addition, you have to listen and decompress the information you hear. There is also a lot of data loss in this process. So, while you are decompressing to try to understand, you are also trying to reconstruct the other person's state of mind to understand its source and reorganize in your own mind the various concepts that the other person's mind is trying to convey to you. ... If both people have a brain interface, you can communicate concepts directly with another person without compression.

Musk calls this kind of conceptual communication "non-linguistic consensual conceptual telepathy." [2]

Musk's dream is not new. Science fiction novels and films such as "Speaker for the Dead," "The Destroyer," and "Avatar" long ago depicted direct communication between minds. In many other works of science fiction, communication without language, in which one brain directly receives the thoughts of other people or even other creatures, is associated with human progress and ultimate goals (such as crossing a wormhole, or uploading all of human consciousness to a network where it merges together). One of Musk's original motivations for founding Neuralink was to let us communicate "real thoughts" directly, without encoding them in language.

Of course, Musk was not the first to propose a "brain-to-brain interface." As early as 1994, Murray Gell-Mann, the Nobel laureate in physics, wrote in his book The Quark and the Jaguar: "For better or worse, one day a person will be able to connect directly to an advanced computer (not through spoken language or an interface like a console) and, through that computer, to one or more other humans. Thoughts and feelings would be completely shared, without the selectivity or deception that language permits… I am not sure whether to recommend doing this (although if all goes well, it may alleviate some of the most difficult problems we humans face). But it would certainly create a new form of complex adaptive system, a true composite of many human beings." [3]

The Quark and the Jaguar

The first person to put brain-to-brain interfaces into practice was Miguel Nicolelis, a professor of neuroscience at Duke University and a pioneer of brain-computer interfaces. In 2011, in his well-known book Beyond Boundaries: The New Neuroscience of Connecting Brains with Machines---and How It Will Change Our Lives, he reported that his group had implanted brain-to-brain interfaces in two rats and had them complete a preset task together [4]. At the 2014 World Cup in Brazil, Juliano Pinto, a 29-year-old man paralyzed by a high spinal-cord injury, used his brain to control a robotic exoskeleton built by Nicolelis' laboratory and successfully delivered the opening kickoff.

On August 28, 2020, Neuralink held its second press conference, at which Musk presented the company's latest progress and live demonstrations. Over the preceding years Neuralink had made significant advances in chip miniaturization, surgical robots, wireless transmission and other technologies, taking a big step from principle toward practical application. At the event they showed neural activity being recorded from a pig's brain and used to predict the pig's limb movements. The press conference attracted worldwide attention and was a great public-relations victory. But it contained no conceptual innovation, and Musk's vision of making people superhuman through a fusion of the human brain and artificial intelligence remains a myth, at least for the foreseeable future.

On April 8, 2021, Neuralink released another 5-minute video online (see below), showing a 9-year-old macaque named Pager playing a game of video table tennis on a computer. In the video, Pager needs no game controller (joystick): it moves the paddle by "intention" alone (in reality, by its recorded brain signals), and plays quite well. This was Neuralink's next major progress report after the August 2020 press conference, although Nicolelis had already trained monkeys in 2008 to make a robot in faraway Japan walk in synchrony with them through their "intentions." For a time the internet was abuzz, and many people concluded that Neuralink had achieved "mind control" and that "telepathy" between people was not far off.

The researchers trained Pager to control the paddle with its "mind."

However, Nicolelis clearly opposed Musk’s ideas. At the Tencent Scientist WE Conference in November 2020, Nicolelis bluntly stated that Musk’s talk about brain-computer interface mind control, memory uploading, and even immortality is just a marketing strategy. Such talk is of no benefit to the scientific development of the brain-computer interface field. “I don’t agree with a word he said.” [5]

Why did Nicolelis say this?

To answer this question, we need to look at what has actually been achieved so far in work on brain-to-brain interfaces.

First, let's look at a specific example of a "brain-to-brain interface" published by Nicolelis' laboratory in 2013 [6].

In the experiment, rats that had been trained to press a lever according to an indicator light were divided into an encoder group and a decoder group, housed in two identically furnished rooms where they could not see each other. Microelectrodes were implanted in the motor cortex of both groups of rats, with the electrode cables connected through a signal-acquisition and conversion device. When an encoder rat correctly pressed lever A according to the cue, the microelectrodes picked up the corresponding dense burst of neuronal firing, which the apparatus converted into a train of high-frequency pulses (the A signal); when it correctly pressed lever B, the recorded firing pattern was converted into a single pulse (the B signal). These pulse patterns were then delivered through microelectrodes into the brain of the decoder rat as weak stimulation of the cerebral cortex, known as intracortical microstimulation (ICMS). When the ICMS was a train of high-frequency pulses (the A signal), the decoder rat pressed lever A; when it was a single pulse (the B signal), it pressed lever B.
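The conversion rule at the heart of this experiment can be sketched in a few lines. The spike threshold, pulse counts, and function names below are illustrative assumptions for exposition, not the actual parameters used by Pais-Vieira et al.

```python
# Illustrative sketch of the encoder-to-decoder conversion rule described
# above. The threshold of 20 spikes and the 40-pulse train are invented
# numbers, not the parameters of the real experiment.

def encode_choice(spike_count, threshold=20):
    """Classify the encoder rat's motor-cortex activity into a lever choice:
    a dense burst of spikes means lever A, sparse firing means lever B."""
    return "A" if spike_count >= threshold else "B"

def icms_pattern(choice):
    """Translate the choice into an intracortical microstimulation (ICMS)
    pattern for the decoder rat: a high-frequency train for A, one pulse for B."""
    return [1] * 40 if choice == "A" else [1]

def decoder_response(pattern):
    """The decoder rat's trained rule: a pulse train means press A,
    a single pulse means press B."""
    return "A" if len(pattern) > 1 else "B"

# End to end, the decoder reproduces the encoder's choice without any
# "mind reading": each stage is a fixed, experimenter-defined mapping.
for spikes in (35, 5):
    choice = encode_choice(spikes)
    assert decoder_response(icms_pattern(choice)) == choice
```

The point of the sketch is that every step is a hand-built mapping, which is why the article goes on to argue that the result is a reflex chain rather than telepathy.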

In this way, the decoder rat pressed the same lever as the encoder rat. The researchers concluded that "the brain-to-brain interface between the encoder rat and the decoder rat enables the decoder rat to rely entirely on the neural patterns of the encoder rat to reproduce its behavioral choices" [6]. In this way, "telepathy" was achieved.

So how did the decoder rat read the "neural pattern" of the encoder rat? In other words, how did it know that a train of high-frequency pulses meant pressing lever A, and a single pulse meant pressing lever B? Could it really have had a telepathic connection with the encoder rat?

The answer is: the researchers told it.

The study had two parts: before the experiment proper came an essential training phase. Using conditioning (recall Pavlov's dog), the researchers trained the decoder rats to associate the different ICMS patterns with the different levers. Thus, in the experiment, the cortical firing pattern of the encoder rat was artificially converted into one pulse signal or the other, and the decoder rat pressed the appropriate lever according to the rule it had learned long before.

In other words, the decoder rat was able to "reproduce the behavioral choice of the encoder rat" only because it had learned to respond to ICMS during the training phase. The authors do not say whether an untrained decoder rat could still do this; my guess is that it could not. If so, the decoder rat is actually unaware of the encoder rat's choice: the experimenters converted that choice into a stimulus that reliably triggers the corresponding action in the decoder rat. This is really just a reflex.

Invasive brain-to-brain interfaces

In 2020, Luo Minmin's laboratory at the National Institute of Biological Sciences, Beijing / Chinese Institute for Brain Research, Beijing developed an optical brain-to-brain interface that transmits movement-speed information from one mouse to another and precisely controls the latter's running speed in real time [7].

In the brainstem there is a nucleus called the nucleus incertus (NI), which contains a type of neuron that expresses neuromedin B (NMB). Luo Minmin's group had earlier discovered that the activity of these neurons can accurately predict and control an animal's running speed. They head-fixed two mice (one encoder, one decoder) while letting their bodies run freely on treadmills. They recorded the calcium signals of a population of NMB neurons in the encoder mouse's nucleus incertus and, via machine learning, converted them into light-pulse stimulation of varying frequency applied to the same type of neuron population in the decoder mouse's nucleus incertus. This kept the running speeds of the two mice highly synchronized.
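The speed-transfer pipeline can be caricatured as a clipped linear map from the encoder's calcium signal to an optogenetic pulse rate. The gain and frequency limits below are invented for illustration; the real transform was fitted by machine learning.

```python
# Hypothetical stand-in for the learned calcium-to-light transform.
# gain, min_hz and max_hz are illustrative numbers only.

def calcium_to_light_freq(ca_signal, gain=40.0, min_hz=0.0, max_hz=20.0):
    """Map a normalized calcium signal (0..1) recorded from the encoder
    mouse's NI neurons to a light-pulse frequency (Hz) delivered to the
    same neuron type in the decoder mouse."""
    return max(min_hz, min(max_hz, gain * ca_signal))

# Stronger NI activity in the encoder -> higher stimulation frequency in
# the decoder -> faster running, keeping the two speeds synchronized.
```

Because the controlled quantity is continuous rather than a binary choice, the mapping has to track the signal moment by moment, which is what distinguishes this work from the lever-press experiments.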

This work by Luo Minmin's group is certainly a big step beyond the early work of Nicolelis and others: the controlled behavior of the decoder mouse is no longer a simple binary choice but a continuously variable quantity, movement speed.

However, the raw brain signals of the encoder mouse did not directly control the decoder mouse. They were first artificially converted into a sequence of light pulses, which were then used to stimulate the decoder mouse. Does this count as "telepathy"?

Although inserting microelectrodes directly into the brain achieves higher resolution and a better signal-to-noise ratio, it is hard to justify in healthy subjects. Not long ago, the American animal-protection organization PCRM (Physicians Committee for Responsible Medicine) filed a complaint with the US Department of Agriculture over the collaborative research conducted by Neuralink and the University of California, Davis from 2017 to 2020, arguing that implanting chips into the skulls of macaques is an act of cruelty [8].

Therefore, many laboratories are also studying non-invasive brain-to-brain interfaces.

Non-invasive brain-to-brain interface

The laboratory of Rajesh P. N. Rao [9] at the University of Washington is one of the world's centers for research on non-invasive brain-to-brain interfaces. Since publishing the first paper on a human brain-to-brain interface in 2013, it has produced a series of related studies; two representative ones are introduced here.

Experiment 1 [10]

Experimental task: two subjects completed a game together. A missile or a passenger plane flew across the "sender's" screen, and the sender had to use the brain-to-brain interface to make the "receiver's" hand press a button and shoot down the missile (but not the plane). The two subjects were connected by a brain-to-brain interface consisting of electroencephalography (EEG) on the sender's side and transcranial magnetic stimulation (TMS) on the receiver's side.

Task training: the sender's EEG signals were recorded while he was trained to move a one-dimensional cursor by imagining wrist movements whenever a missile flew across the screen. For the receiver, the researchers first located the part of the motor cortex that controls the wrist muscles and positioned a transcranial magnetic stimulation coil over it, so that a magnetic pulse from the TMS device would make the hand jerk upward and press the button.

During the experiment, the two subjects sat in two buildings a mile apart and could neither hear nor see each other. When the sender imagined moving his wrist, the resulting EEG signal was detected and transmitted wirelessly to the receiver's TMS device, which fired the coil, causing the receiver's wrist to move and press the button. In this way, the two subjects completed the game together through the brain-to-brain interface alone.

Transcranial magnetic stimulation (TMS) is a non-invasive, painless, non-destructive form of brain stimulation. A pulsed magnetic field applied over the scalp induces currents in the cerebral cortex, changing the membrane potential of cortical neurons and thereby affecting brain metabolism and neural electrical activity, which can trigger physiological and biochemical responses (such as a simple movement).

Experiment 2 [11]

In this experiment, three subjects, two senders and one receiver, sat in different rooms and completed a Tetris-like game together. The rules were as follows:

The senders and the receiver had to cooperate: each sender decided whether the falling block needed to be rotated and "told" the receiver his decision through the brain-to-brain interface. The receiver then rotated and placed the block so as to clear the bottom row.

On one side of each sender's screen the word "yes" was displayed, meaning the block should be rotated, with a light-emitting diode below it flashing 17 times per second; on the other side the word "no" was displayed, meaning no rotation, with a diode flashing 15 times per second. Flicker at different frequencies evokes EEG components at the corresponding frequencies.

When a sender made his judgment and looked at one of the two words, the control device determined, from the frequency content of the EEG recorded from his head, whether the TMS coil behind the receiver's head should fire. The magnetic pulse stimulated the receiver's occipital cortex (which processes visual information), making the receiver see a flash of light, which by prior agreement meant the sender's decision was "rotate the block."

Schematic diagram of the brain-to-brain interface (the control device in the figure). The control device converts the sender's EEG signals into pulse signals that stimulate the receiver. | Image: Mark Stone/University of Washington [12]

The whole game requires the three participants to communicate and cooperate. The receiver decides whether to rotate the block after receiving the instructions (as visual flashes) from the two senders. The receiver's EEG is collected and transmitted back to the senders in a similar way, so that each sender knows the receiver's decision and can give feedback in turn. The result of each move is thus communicated to all three players.
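The key decoding step, telling a 17 Hz "yes" gaze from a 15 Hz "no" gaze, can be sketched by comparing EEG power at the two flicker frequencies. The single-bin Fourier estimate below is a simplified stand-in for the study's actual signal processing, and the sampling rate is an assumption.

```python
import math

def power_at(samples, freq_hz, sample_rate=250):
    """Power of one frequency component, via a single Fourier coefficient."""
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * freq_hz * i / sample_rate)
             for i, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * freq_hz * i / sample_rate)
             for i, s in enumerate(samples))
    return (re * re + im * im) / n

def decode_decision(eeg_window, yes_hz=17, no_hz=15):
    """The EEG over visual cortex follows the flicker the sender is looking
    at, so the stronger frequency component reveals the decision."""
    if power_at(eeg_window, yes_hz) > power_at(eeg_window, no_hz):
        return "rotate"          # triggers a TMS pulse to the receiver
    return "do not rotate"       # no pulse is sent
```

Note that the "message" never leaves the realm of frequency tagging: the sender's gaze direction is encoded by the screen, not by any thought the device reads out.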

Cross-species hybrid brain-to-brain interfaces

In experiments where a human brain controls an animal, researchers generally use a hybrid brain-to-brain interface: brain signals are collected from the human non-invasively, while microelectrodes implanted in the animal control its movements. Invasive stimulation gives more precise and reliable control.

Zhang Shaomin and colleagues at Zhejiang University [13] built a brain-to-brain interface from a human brain to a rat brain. In the experiment, a participant imagined waving his left or right arm; the corresponding EEG signals were converted into turn-left or turn-right commands and sent wirelessly to microelectrodes implanted in the rat's motor cortex, which delivered stimulating pulses to the rat's brain.
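The left/right decoding exploits a well-known asymmetry: imagining a movement of one arm suppresses the roughly 10 Hz mu rhythm over the opposite motor cortex. The variance-as-power proxy, channel names, and stimulation-site labels below are illustrative assumptions, not details from Zhang et al.

```python
# Hedged sketch of the human-to-rat command mapping. Assumes the two EEG
# channels (C3 over left motor cortex, C4 over right) have already been
# band-pass filtered around the mu rhythm (~10 Hz).

def mu_power(channel):
    """Crude proxy for mu-band power: variance of the filtered window."""
    mean = sum(channel) / len(channel)
    return sum((s - mean) ** 2 for s in channel) / len(channel)

def classify_imagery(c3, c4):
    """Imagining the RIGHT arm suppresses mu over the LEFT hemisphere (C3),
    and vice versa, so the quieter channel tells us which arm was imagined."""
    return "turn_right" if mu_power(c3) < mu_power(c4) else "turn_left"

def rat_stimulation(command):
    """Map the decoded command to a stimulation target in the rat's brain
    (the site names here are invented placeholders)."""
    return {"turn_left": "pulse_left_electrode",
            "turn_right": "pulse_right_electrode"}[command]
```

As in the other experiments, the rat obeys because it has been trained to respond to each stimulation site, not because it receives the human's intention.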

On a screen, the controller could watch the rat in the maze. The researchers used a complex three-dimensional maze (see the figure below) in which the rat had to climb slopes, descend stairs, avoid obstacles, make detours, and cross walkways. In the experiment, the controller, using imagined movements alone, could steer the rat along a predetermined route through the complex maze within the allotted time.

Figure 3 Schematic diagram of the complex maze used by Zhang Shaomin et al. from Zhejiang University. [13]

From these representative studies it is clear that hardly any brain-to-brain interface research to date can be called true "telepathy." This may explain why Nicolelis, a senior expert on brain-computer interfaces, regards Musk's "telepathy" as a "marketing strategy."

Indeed, judging only by the surface behavior, these experiments show that the receiver performs the action the experimenter wants purely according to commands imagined in the sender's mind, not according to verbal instructions. If we equate the sender's brain signals with "thoughts" or "intentions," and interpret the receiver's actions as the reception of those "thoughts" or "intentions," then we will conclude that this is "telepathy." But brain signals are not equivalent to thoughts or intentions. Benjamin Libet's classic experiment showed long ago that before we become aware of wanting to turn our wrist, a related "readiness potential" can already be recorded from the brain: the readiness potential (a brain signal) precedes the "intention" we perceive. We can detect the readiness potential with EEG equipment, and of course we can also process this signal to stimulate another person's brain into performing some action. In my view, this is not telepathy, because on the receiver's side the experimenter already knows in advance what kind of stimulation, applied to which part of the receiver's brain, will produce the desired action. It is really just a reflex.

Libet's experiment

In the early 1980s, the American neurophysiologist Benjamin Libet conducted an experiment in which subjects decided for themselves when to move their wrists while their electromyogram (EMG) and electroencephalogram (EEG) were recorded.

It has long been known that when a muscle moves, an EMG signal can be recorded from it, marking the moment the movement begins. Muscle movement is controlled by the primary motor cortex, and before that, other brain areas draw up the movement plan and send it to the primary motor cortex, which then issues the commands that drive the muscles. Activity related to this earlier "movement planning" appears in the EEG as a component called the "readiness potential," which emerges a second or more before the actual movement begins.

Most people would assume that we first form the thought (decision) "I want to move," after which the cortex responsible for motor planning makes a plan and, through the primary motor cortex, commands the wrist muscles to move (as measured by EMG).

Libet asked the subjects to watch a spot of light sweeping around a clock face on a screen while they turned their wrists (see the figure), and to report afterwards where the spot had been at the moment they made up their minds to move. He found that the readiness potential appeared more than half a second before the subjects' reported decision. This shows that the readiness potential precedes the conscious intention to move the wrist, and that the EEG signal is not equivalent to the intention itself. In fact, it is still unclear what the neural substrate of intention is, or in which brain area it arises.

Figure 1 shows a schematic diagram of the Libet experiment. (Quoted from Blackmore, 2005)

The brain-to-brain interface experiments introduced so far can each be divided into two parts. One part measures some brain signal correlated with what the sender is thinking (correlation only, not causation! We do not know the neural substrate of that "thinking"). The other part works out what kind of stimulation, applied to which part of the receiver's brain, will make the receiver perform the desired action; this is "stimulus and reflex" rather than "understanding." Finally, machine learning converts the recorded signals of the sender into the stimulation pattern the experimenter needs, joining the two parts into one and creating the impression of "telepathy." This is all the more obvious when the action is a choice between two options, as in every experiment described here except the optical interface from Luo Minmin's laboratory.

It is worth pointing out that the second half of every experiment controls a movement of the receiver. This is because experimenters know where the brain area that drives the movement is, and because the neural coding of movement control is population coding, so there is no need to find any specific neuron: the experimenter can know in advance which area to stimulate and how. All of this work rests on the relatively well-understood principle of population coding in motor control. If the receiver were asked to perform not a motor task but some other mental activity, it could not be done, because we currently do not know the neural mechanism of that activity or what stimulation would induce it. As Greg Horwitz, a neuroscientist at the University of Washington, put it: "If you want me to move my arm, I know where to put the electrode." "Even if you could put electrodes anywhere in my brain, if you want me to vote for Biden or Trump, I don't know where you would stimulate, or in what pattern, to achieve that." [14]

More than a decade has passed since Nicolelis formally proposed the brain-to-brain interface. Despite progress on many fronts, there has been no breakthrough on the problems described above, and they cannot be solved merely by improving implant technology. Five years have passed since Musk claimed that telepathy could be achieved within eight to ten years; that promise looks likely to go unfulfilled.

Of course, science should not exclude bold ideas. Such ideas encourage scientists to challenge the unknown, and perhaps they will be realized one day in the future. However, we should not confuse imagination with reality. Let's take a look at other ideas about brain-to-brain interfaces.

Nicolelis et al. proposed: "Finally, it must be emphasized that the topology of brain-to-brain interfaces need not be limited to just one sender and one receiver. On the contrary, we have pointed out that, in theory, the accuracy of the channel could be improved if a grid of many interconnected brains were used instead of just two brains. Such a computing structure could be the first to create an 'organic computer' that could solve heuristic problems that ordinary Turing machines cannot." [15]

If many brains were interconnected and communicated directly, they might form a "giant brain," just as interconnected neurons form a brain far more capable than any single neuron. What new phenomena such a giant brain would produce is still hard to imagine.

Rao and colleagues have suggested that "a great deal of information in our brains is not accessible to consciousness through introspection and therefore cannot be readily expressed in verbal form" [10]. This is precisely why surgeons and master musicians find it so hard to impart their expertise to novices: they cannot tell students exactly how to "position and move their fingers to perform critical operations" [10]. They hope that brain-to-brain interfaces may bypass these inherent limitations of verbal communication.

Of course, some people have begun to worry about the negative effects of brain-to-brain interfaces. They ask [16]: could a brain-to-brain interface let the sender exert a coercive influence on the receiver, eroding the receiver's sense of autonomy? Does extracting information from recordings of the sender's brain violate his privacy? What people do not say is often more important than what they say, and the privacy of the mind is the core of individual autonomy; developing brain-to-brain interfaces may not be worth the cost... Judging from the current state of the field, the visions of some experts cannot be realized in the foreseeable future, so these worries are premature, but sounding the alarm early may still help research in this area develop in a healthy way.

References

[1] Tim Urban (2017) Neuralink and the Brain's Magical Future. Wait But Why April 20, 2017 (
https://waitbutwhy.com/2017/04/neuralink.html)

[2] https://www.wired.com/story/elon-musk-neuralink-brain-implant-v2-demo/

[3] Murray Gell-Mann (1994) The Quark and the Jaguar. W.H. Freeman and Company.

Chinese translation: Gell-Mann, trans. Yang Jianye et al. (2002) The Quark and the Jaguar, Hunan Science and Technology Press.

[4] Miguel Nicolelis (2011) Beyond Boundaries: The New Neuroscience of Connecting Brains with Machines---and How It Will Change Our Lives. Times Books.

[5] https://news.sciencenet.cn/sbhtmlnews/2020/11/358719.shtm

[6] Miguel Pais-Vieira, et al. (2013) A Brain-to-Brain Interface for Real-Time Sharing of Sensorimotor Information. Scientific Reports, 3: 1319. DOI: 10.1038/srep01319

[7] Lu, L., Wang, R., and Luo, M. (2020). An optical brain-to-brain interface supports rapid information transmission for precise locomotion control. Sci China Life Sci 63(6):875-885, https://doi.org/10.1007/s11427-020-1675-x

[8] https://www.theguardian.com/world/2022/feb/15/elon-musk-neuralink-animal-cruelty-allegations

[9] https://en.wikipedia.org/wiki/Rajesh_P._N._Rao

[10] Rao, Rajesh et al. “A Direct Brain-to-Brain Interface in Humans.” PLOS ONE 2014: 1-12.

[11] Linxing Jiang, Andrea Stocco, Darby M. Losey, Justin A. Abernethy, Chantel S. Prat, Rajesh PN Rao. (2019) BrainNet: A Multi-Person Brain-to-Brain Interface for Direct Collaboration between Brains. Scientific Reports, 9 (1) DOI: 10.1038/s41598-019-41895-7

[12] Anthony Cuthbertson (2019) First brain-to-brain interface to communicate using only your mind successfully tested, researchers claim. (https://www.independent.co.uk/life-style/gadgets-and-tech/news/computer-brain-interface-university-washington-neuralink-a8984201.html)

[13] Shaomin Zhang et al. (2019) Human Mind Control of Rat Cyborg's Continuous Locomotion with Wireless Brain-to-Brain Interface. Scientific Reports, 9:1321 (https://doi.org/10.1038/s41598-018-36885-0)

[14] Adam Rogers (2020) Neuralink Is Impressive Tech, Wrapped in Musk Hype. Wired 04/09/2020 (https://www.wired.com/story/neuralink-is-impressive-tech-wrapped-in-musk-hype/)

[15] Pais-Vieira, Miguel; Mikhail Lebedev; Carolina Kunicki; Jing Wang; Miguel AL Nicolelis (2013). A Brain-to-Brain Interface for Real-Time Sharing of Sensorimotor Information. Scientific Reports. Nature Publishing Group. 3: 1319. doi:10.1038/srep01319 (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3584574).

[16] Martone, Robert. (2020) Scientists Demonstrate Direct Brain-to-Brain Communication in Humans. Scientific American Mind. 31 (1):7-10


Copyright statement: Personal forwarding is welcome. Any form of media or organization is not allowed to reprint or excerpt without authorization. For reprint authorization, please contact the backstage of the "Fanpu" WeChat public account.
