Summary of live data analysis report template

This data-analysis article of over 10,000 words takes you into the world of live broadcast data, interpreting the data problems of the live broadcast process comprehensively and from a professional perspective.

Why do we do data analysis?

Is the goal the analysis itself, or effectively solving the problems we encounter?

Only once you are clear on this question will you realize that the daily post-broadcast session, although it looks like a live broadcast review, often fails to solve any actual problem.

A team that is truly capable of data analysis must understand the three levels of data.

First, a general review only presents results; it cannot get at the essence of a problem.

Live broadcast data spans roughly ten modules, including traffic, users, products, content, paid delivery, the funnel, and anchors. Good data analysis is more than a summary presentation.

Second, the lack of timeline and cross-data analysis makes it impossible to approach the optimal solution.

For any element of live data, a single time frame is meaningless. You need periodic year-on-year and period-on-period comparisons to discover the pattern.

It is equally meaningless to look at any element of live data from a single dimension. For example, what can online-user numbers tell you if you ignore the flow rate feeding the room?

Third, the focus of data analysis must be different for accounts of different modes and stages.

Different live broadcast methods focus on different data, and accounts at different stages focus on different data. Data analysis does not seek to be large, but only precise.

The data breakdown that follows integrates the three points above and decomposes the data model module by module.

1. Review summary table module

Any intuitive analysis of a live broadcast starts from a data summary table.

In daily analysis, I divide the review table into four categories: traffic data, user data, transaction data, and product data.

Traffic data: audience size, peaks (opening, closing, popularity peak, average), and flow rate (opening, mid-session, closing).

User data: length of stay, interaction count and interaction rate, follower count and follower growth rate, fan-group count and group growth rate, old-fan count and proportion.

Transaction data: GMV, GPM (transactions per thousand views), UV value, viewing conversion rate, click conversion rate, and old-fan transaction share.

Product data: top-selling products, number of items sold, number of buyers, refund rate, refund GMV, and refund count.

Summarizing these four modules gives a basic picture of a session. But, as I said, the goal of data analysis is not to obtain data; it is to surface the problems latent in the data.
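The derived transaction metrics in the summary table are simple ratios. A minimal sketch in Python; the field names and numbers are hypothetical, not platform fields, and GPM is computed per thousand viewers here as a simplification:

```python
# Hypothetical session totals; the field names are illustrative, not platform fields.
session = {
    "impressions": 150_000,   # live-room exposures in the feed
    "viewers": 12_000,        # total audience
    "product_clicks": 1_800,  # clicks on product cards
    "orders": 240,            # paid orders
    "gmv": 18_000.0,          # gross merchandise value
}

entry_rate = session["viewers"] / session["impressions"]   # exposure entry rate
uv_value = session["gmv"] / session["viewers"]             # GMV per viewer
gpm = uv_value * 1000                                      # GMV per 1,000 viewers
view_cvr = session["orders"] / session["viewers"]          # viewing conversion rate
click_cvr = session["orders"] / session["product_clicks"]  # click conversion rate

print(f"entry rate {entry_rate:.2%}, UV value {uv_value:.2f}, GPM {gpm:.0f}")
print(f"view CVR {view_cvr:.2%}, click CVR {click_cvr:.2%}")
```

Keeping one such row per session is what makes the timeline comparisons below possible.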

How to discover the problem?

The simplest method is to line up multiple metrics along the timeline on a periodic basis, but here a problem arises.

Although the store backend provides a large data dashboard, it lacks comparative analysis of sectional data. Do not assume Doudian Compass is especially capable: its developers have never really thought from the user's perspective, and many of its piled-up features look powerful but are of little practical use.

Therefore, the best way is to set your own standards for data dimensions, download data in the background, and create tables according to the timeline.

By comparing along the timeline, we can see which metrics grew and which turned abnormal over a given period. A few examples will illustrate the logic of the analysis.
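One way to operationalize the timeline comparison is to flag any session whose metric deviates sharply from the running average of prior sessions. A toy sketch; the 30% threshold and the numbers are assumptions, not platform rules:

```python
def flag_anomalies(series, threshold=0.30):
    """Return indices whose value deviates more than `threshold` (30% by
    default, an assumed rule of thumb) from the mean of all prior points."""
    flags = []
    for i in range(1, len(series)):
        baseline = sum(series[:i]) / i
        if baseline and abs(series[i] - baseline) / baseline > threshold:
            flags.append(i)
    return flags

# Opening flow for seven consecutive sessions (hypothetical numbers)
opening_flow = [900, 950, 920, 940, 910, 600, 930]
print(flag_anomalies(opening_flow))  # the sixth session (index 5) stands out
```

The same function applies to any column of the summary table: audience, GPM, refund rate, and so on.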

1) Analysis of opening flow rate and online flow data

A common misconception: when gauging the growth of account weight, people like to compare the overall audience, flow rate, and GMV scale. From a more precise perspective, the scale of the opening traffic surge better represents weight.

We can use the opening flow rate and online numbers to check weight changes periodically. Generally, if the account is trending up, the opening surge stays stable or grows; if the weight is moving down, you will see repeated declines.

2) Analysis of user behavior in user data

For a cold-starting new account that wants live recommendations, the interactive indicators must climb. In the table, look for growth in retention, interaction rate, follow conversion rate, and fan-group conversion rate; this is the most direct way to judge whether the cold start is taking off.

For accounts whose cold start fails, the spiral rise is invariably missing at the interaction layer. Grounding the diagnosis in objective data like this greatly reduces the trader's subjective guesswork, such as blaming the platform or assuming something is wrong with the account.

3) Analysis of transaction indicators in transaction data

The viewing entry rate reflects the quality of the material (live frame or short video). If the traffic structure is unchanged but the entry rate keeps declining, then for short-video-driven traffic you should check whether the videos' guidance has drifted, and for Qianchuan delivery you should check what is wrong with the delivery creatives.

4) Analysis of refund indicators in product data

Take a week of statistics. Suppose GMV is stable or rising steadily, but the refund rate is growing much faster than the transaction rate. We then need to analyze what pushed the refund rate up: a change in the products offered, or a problem with pacing and over-hyping in the live room?
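The comparison described here reduces to two growth rates. A minimal sketch with hypothetical weekly figures:

```python
def wow_growth(prev: float, curr: float) -> float:
    """Week-over-week growth rate."""
    return (curr - prev) / prev

# Hypothetical figures: GMV rises 10%, but the refund rate jumps 50%
gmv_growth = wow_growth(100_000, 110_000)
refund_rate_growth = wow_growth(0.08, 0.12)

if refund_rate_growth > gmv_growth:
    print("refund rate is outpacing GMV growth - investigate product or pacing changes")
```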

Given the recent abnormal fulfillment environment, periodic analysis of refund indicators matters even more. If delayed shipping caused by the epidemic, or a pre-sale model, is clearly making the refund rate surge, you must weigh whether to adjust the shipping regions or scale back delivery.

You don’t need to remember the above, I just want to express the systematic nature of the overall review and the method of data information mining.

Imagine if all the above data were changed from after the event to during the event, would their meaning change?

Can the exposure entry rate be optimized in real time? Yes, can click-through conversion rates be optimized in real time? Sure, isn't GPM itself used for real-time optimization?

Review does not stop after the event: data review follows the 80/20 rule.

In terms of time, people spend most of their effort on post-event review, yet it solves less than 20% of problems. In-event (real-time) review looks like improvised analysis, yet it can solve 80% of an account's problems.

In the first point, I said that the review summary table solves the fundamentals of the review, but if you want to dig deeper into the essence of the problem, you need to combine diversified sector data.

In my review framework, there are multiple modules including traffic, funnel, delivery, content, etc. Does that mean that all module data need to be analyzed?

The answer is no. Each live broadcast room has its own traffic, products, and operation system. Regarding this part, I will list in detail how I focus on using section data.

2. Traffic structure module

In daily data analysis, the traffic structure includes natural recommendations, squares, short videos, Qianchuan, Suixintui, attention, search, and other channels. At the same time, under each channel, the timeline is used to periodically compare traffic, interaction and transaction data.

If the traffic structure is simple, there is no need for traffic-structure analysis. For example, a live room built on the activity-based account-building, order-holding method runs almost entirely on natural traffic; as a trader you can remember the recommendation share of each session without keeping statistics.

Another example is a Qianchuan-driven single-product room, which runs almost entirely on paid traffic. There we only need to improve the output of the Qianchuan side; analyzing more data types is not just useless, it adds time cost.

However, if it is a diversified traffic structure, that is, there is both short video traffic and natural recommendations and Qianchuan traffic, then it is necessary to do data analysis of the traffic structure.

What happens without this analysis? Consider an example of the total masking the parts.

Say the week's data shows a sudden drop. The summary table alone cannot tell you which part changed; for that you need the sectional data of the traffic structure.

You might assume the problem is on the paid side, but checking the channels reveals that natural recommendations have fallen.

Or say the exposure entry rate drops sharply and the summary table cannot explain it. With the traffic-structure data you can see that a shift in one channel, such as a surge in short-video traffic, dragged the overall entry rate down.

The traffic structure summary table can reflect the traffic channel distribution of an account. If it is a live broadcast room that mainly relies on natural recommendations, it is necessary to always observe the increasing proportion of natural traffic and the corresponding traffic volume.

Traffic structure analysis can also solve a problem. In addition to the traffic share of each channel, we can obtain detailed information about the traffic quality of different channels and the transaction amount brought in. This provides a key reference for our focus on traffic support.

3. Product data module

In daily product analysis I use four dimensions: single-session products, key products, 7-day products, and 15-day products. For each dimension, let me explain why it is analyzed daily.

Single-session analysis solves product scheduling, 7-day analysis solves product grouping, 15-day analysis solves product quality, and key-product analysis solves hot sellers.

Single-session analysis for product scheduling means analyzing the product data of the session that has just ended, across dimensions such as exposure click-through rate, click-to-order conversion rate, unpaid-order rate, and refund rate.

Through the analysis of a single live broadcast, we can see the product data in the opening stage and the conversion efficiency of different products during the order-placing period, thereby enabling us to make diversified data combinations.

Combining dwell time with interaction data tells us whether the opening product needs replacing; combining product click-through rate with click conversion rate tells us which products belong in the best order-taking window.

7-day analysis solves product grouping: after seven days of accumulated data, the live room has formed relatively stable product data, and it has been verified which products deserve focus and which should be removed.

15-day analysis solves product quality from the after-sales perspective. By then the cycle from order to after-sales is largely complete, and the live room should judge the goods of the past 15 days, deciding from returns, after-sales cost, and so on whether a product needs to be relisted or dropped.

Key-product analysis solves hot sellers, the products with the largest transaction volume. Every hot product has a life cycle; we must track its rise and fall through product clicks, transaction conversion, transaction amount, and transaction share.

Generally, when a hot product declines, its share in transaction data will decrease. At this time, what the live broadcast room needs to do is to cultivate new hot products.

Based on the product analysis of the above four dimensions, if it is a single-product live broadcast room, I think it is sufficient to do a good job of single product data analysis. If it is a multi-category live broadcast room, it is necessary to analyze all dimensions.

In terms of product analysis, I think the store backend has provided a relatively detailed product compass, but it is still not friendly enough for newcomers. If you want a structured analysis, I suggest you refer to my daily spreadsheet.

4. Short video data module

In daily data analysis, for short videos, we need to distinguish between natural flow short videos and placement short videos.

Data analysis of naturally flowing short videos does not need to be done in every live broadcast room. For example, many live broadcast rooms that hold back orders do not pay attention to video traffic at all. What is the point of analyzing short video data?

Short video analysis focuses on live broadcast rooms that mainly feature short videos. The data is mainly divided into two dimensions. The first part is the account quality dimension, and the second part is the product quality dimension.

The account-quality dimension includes a short video's daily playback volume, completion rate, like rate, comment rate, share rate, follow rate, and so on. These weighted data, combined with the horse-racing mechanism, largely determine the account's traffic level.

The product-quality dimension includes live-room exposures, the live transaction rate, and the corresponding live transaction amount; it best reflects a short video's contribution to the live room.

I habitually use spreadsheets to track short-video data. The limitation of Doudian Compass is that it can only sort data on a single dimension; it cannot sort on a combination of factors.

What is short-video data actually used for? Let's break down how to make day-to-day short-video decisions with data.

1) Assessing the performance of the short video team

The assessment of short video teams depends on the quality of their works and the quality of the products they bring.

The data for this comes from the video section of Doudian Compass; the two dimensions were listed above. What you need to do is set a daily assessment standard, with a minimum line and a high-reward line, and require the team to execute around those KPIs.

2) Deciding whether to put Dou+ behind a work

A short video's completion rate, playback growth, and interaction data largely reflect its volume-generating ability on the account-quality side, but that is not enough; you must also look at its GPM and transaction amount to judge whether it can be monetized.

If a video is still gaining plays in the short term and its GPM is above the account's daily median, you can test ROI with small incremental budgets, and keep reinvesting while ROI stays positive.
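The decision rule here, boost only when a video's GPM beats the recent median and keep reinvesting while ROI is positive, can be sketched as follows; all numbers are invented for illustration:

```python
import statistics

daily_gpms = [310, 280, 295, 330, 305]   # recent per-video GPM, hypothetical
candidate_gpm = 420                      # the video being considered for a boost

if candidate_gpm > statistics.median(daily_gpms):
    spend, attributed_gmv = 300.0, 540.0  # small test budget and its resulting GMV
    roi = attributed_gmv / spend
    print(f"test ROI {roi:.1f}: {'reinvest' if roi > 1 else 'stop'}")
```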

3) Which short videos are the dark horses?

Using a sort in the spreadsheet, weight traffic negatively and live-room GPM and transaction amount positively. The contrast surfaces a particular type of short video: works with low playback but a high transaction contribution.

For this type of work, you can expand the traffic scale through payment, and also invest small amounts to test the ROI. If it is positive, you can continue to invest.
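The sort described above, traffic weighted negatively and GPM positively, takes only a few lines; the per-video stats are made up for illustration:

```python
# Made-up per-video stats: views and live-room GPM attributed to the video
videos = [
    {"id": "v1", "views": 500_000, "gpm": 120.0},
    {"id": "v2", "views": 8_000,   "gpm": 950.0},
    {"id": "v3", "views": 300_000, "gpm": 80.0},
    {"id": "v4", "views": 12_000,  "gpm": 700.0},
]

# GPM descending, views ascending: low-traffic, high-conversion works float to the top
dark_horses = sorted(videos, key=lambda v: (-v["gpm"], v["views"]))
print([v["id"] for v in dark_horses[:2]])  # candidates for paid amplification
```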

4) Error correction video template

A good work must have both explosive traffic and high-value transactions. However, some works often have large traffic but the actual transaction amount is very low.

Works that draw many entries but little transaction amount usually put form above substance: the video never digs into the product's selling points, so users leave after watching. When such works keep appearing, re-optimize them or rebuild the template.

The above is a data analysis of natural short video traffic. As for the short videos of delivery materials, it will be expanded separately in the subsequent Qianchuan article series.

5. Funnel data module

In daily data analysis, the marketing funnel is divided from top to bottom into exposure entry rate, product exposure rate, product click-through rate, order creation rate, and order conversion rate.

Every live room should maintain this data module, because no live room escapes the influence of the marketing funnel.

A closed-loop method for verifying live-room data: data presentation, then problem mining, then solution combination, then optimization follow-up.

Applied to the funnel, compare the data at each level against its reference value to find anomalies; then move to the second step, finding the reasons each layer's value dropped.

Although funnel optimization is relatively complex, many factors can be qualitatively analyzed. For example, the exposure click-through rate is affected by factors such as product display vision, anchor guidance, field control clicks, etc. Another example is the order conversion rate. Factors that affect it include anchor's sales pitch, price factors, and after-sales service reminders, etc.

The only hard call is the exposure entry rate under multi-channel traffic. As broken down in the traffic structure module, use the traffic-channel table to examine each channel's entry rate and optimize it in a targeted way.

In earlier funnel analyses, the club published a funnel optimization table; newcomers can refer to it directly when optimizing.

One difficulty of funnel analysis is setting reference values: what result should each level be expected to achieve?

This is actually easy to solve. Fill in a week of funnel data session by session, then take the best indicator at each level this week as the target indicator for next week.

A formula in the table then shows how many sessions in the new week beat the previous best. The more that do, the more the method is verified and the data optimized; if results are poor, go back and recheck the problem layer.
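The best-of-week-becomes-next-target mechanic can be sketched as follows; the funnel layer names and values are hypothetical:

```python
# One dict per session: funnel rates for the past week (hypothetical values)
week = [
    {"entry": 0.041, "prod_click": 0.18, "order": 0.032},
    {"entry": 0.038, "prod_click": 0.21, "order": 0.029},
    {"entry": 0.045, "prod_click": 0.19, "order": 0.035},
]

# Best value at each funnel layer becomes next week's target
targets = {layer: max(s[layer] for s in week) for layer in week[0]}
print(targets)

# Count how many new-week sessions beat the target at one layer
new_week_entry = [0.046, 0.043, 0.050]
beat = sum(1 for rate in new_week_entry if rate > targets["entry"])
print(f"{beat} of {len(new_week_entry)} sessions beat last week's best entry rate")
```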

6. User data module

In daily data analysis, user data is divided into four categories: transactions, non-transactions, fans, and non-fans. However, depending on specific usage, each dimension has different subdivision scenarios.

The store's best improvement over the past year is user data, which I use in four main ways.

1) User tag growth

The journey of a new account from cold start, to traffic carrying, to hot-product testing is the process of its live labels becoming vertical. The daily e-commerce big screen only displays basic labels; to view the account's recent in-depth labels, open Doudian Compass and its crowd portrait section.

In this section, Compass presents users' basic attributes, content preferences, purchase preferences, and the eight common population groups. But these data are meaningless viewed from a single dimension. Why?

For example, if you want to see whether the tags of the people currently interacting in the live broadcast room can support conversions on a large scale, then you should not view it in a single dimension, but compare the data, select the people your content reaches, and compare them with your consumer group.

Generally, the closer the audience your content reaches is to your consumer profile, the better the live room's feedback and the smoother the conversion. If not, you need to think about where the conversion gap comes from.

Similarly, we can compare the portraits of the "product display crowd" and the "product transaction crowd" to see what the user portraits look like for the people who have already completed transactions. This can help us make good decisions on the product selection direction or the directional direction of delivery, and avoid the hollowing out of the user layer and the transaction layer.

2) User tag correction

There is no such thing as a permanently precise label in a live room; labels keep shifting. A single event, or a large volume of transactions against the wrong schedule, can shift the label.

In practice, live-room labels do sometimes suddenly go inaccurate. If you cannot judge this at the level of perception, use Compass's user data for analysis.

How? Select the population exposed to the product in the past one or seven days and compare it with the population exposed in the past 30 days; the drift in population labels becomes clearly visible.

When the share of old fans is too high, natural recommendations are low, or new-fan conversion is too low, you likewise need multi-factor comparisons with scene and crowd portraits. What is the old-fan portrait, and how should it be maintained? If new-fan conversion is low, what is the portrait of new but unconverted users, and what is the matching solution?

3) User tag delivery

The precise groups of people in the store backend can be pushed and used in conjunction with Qianchuan, but the usage strategy will involve the selection of data.

Normally, Qianchuan backend plans are created per product, and each product's population package differs. When selecting in the backend you cannot import the store portrait directly; you must first enter the product population, choose the single-product dimension, and select on that basis.

For single-product population packages, A/B testing is recommended across the click, add-to-cart, and transaction populations in batches. These population packages push more precisely, and combined with more competitive bids, good output can follow once volume picks up.

4) Maintain user tags in the live broadcast room

This partly serves the "iron fan" (core fan) strategy, which is not simply maintaining the fan group but managing users in tiers.

The data can be obtained through insights into the live broadcast crowd, dividing users into three major groups, namely fans & transaction groups, fans & non-transaction groups, and non-fans & non-transaction groups.

These three groups represent the portraits of level one to three users respectively, and in the live broadcast strategy, corresponding maintenance measures can be adopted based on different portraits.

7. Qianchuan data module

In daily data analysis, Qianchuan delivery covers plan life-cycle analysis, the marketing funnel, real-time analysis, and data-point and accident analysis.

1) Plan periodicity analysis

There are different ways to analyze Qianchuan data. I never believe a data point can be judged apart from the life cycle of a plan. This is like an example I often give.

From the perspective of human behavior, adults and children react to the same thing completely differently. A child refusing to eat may simply not be hungry, while an adult may be in a bad mood.

So for identical data performance we cannot diagnose causes one-size-fits-all. The effective method is to divide all backend plans into life-cycle stages and handle them differently based on the data within each stage.

All plans created by Qianchuan can be divided into the new stage, cultivation stage, maturity stage, decline stage, activation stage and death stage.

In the new stage, a large batch of freshly created plans sits at zero or low consumption. No analysis is needed here. The priority is raising bids to remove bidding risk; second, enabling smart volume expansion to adjust targeting risk; only then replacing the creative.

In the cultivation stage, plans begin to consume, but the data are diverse: consumption with low output, qualified consumption and output, slightly low output, stable consumption with fluctuating output, fluctuating consumption with stable output, and so on. I have given answers for these situations in the earlier delivery breakdown and will go deeper in the subsequent Qianchuan series.

The mature stage means the unstable phase has passed: on a base of qualified output, consumption is stable and rising. The work here is to keep digging into the value of creatives and plans and to replicate them with variation.

The decline stage means the plan has reached the end of its life cycle: either the plan or the creative is fading, but in essence competitiveness has fallen. So raise bids to restore competitiveness, or expand volume intelligently to reach more people.

The activation stage covers plans that can still recover after the decline-stage measures. Here the plan is pushed to its limit: keep raising bids and scaling volume until ROI falls below the acceptable line.

The death stage means no optimization will grow the plan any further, so the creative is fissioned and reshot to carry forward the model of the material that once exploded in volume.

2) Marketing funnel optimization

Implementing cycle management for all backend plans from the beginning can optimize plan effects in real time. However, a campaign is not entirely determined by the campaign itself, but also involves the people, goods and venues in the live broadcast room. Therefore, when analyzing the marketing funnel in the live broadcast room, it is necessary to disassemble Qianchuan’s independent marketing funnel.

Since the methodology has been outlined above, this section will not be expanded in detail.

3) Real-time data analysis

Beyond periodic plan optimization, Qianchuan requires real-time adjustment during delivery, which involves a large amount of data. Let's break down how to use the big screen in daily delivery.

The core data of the Qianchuan big screen consists of ad output (ROI), exposure entry rate, entry-to-transaction conversion rate, GPM, online user count, and length of stay.

Overall ROI is only a staged result. What we should watch most is the time-segmented output of Qianchuan's main consuming plans, and use that to decide whether to adjust a plan.

The exposure entry rate measures the quality of the current creative. For direct live-room delivery, set a parameter value in advance as the red line for the entry rate, and steer the live rhythm in real time as the data changes.

The transaction rate should be read together with GPM. Products at different price points affect the two differently: higher-priced items often lower the transaction rate but, with some probability, raise GPM.

Online user count must be combined with flow rate and GPM. Analyzing the online count is not about current consumption speed, but about whether normal GPM can be maintained at the current online level. Every anchor's carrying capacity differs: some can handle 300 concurrent users at most, others only 50.

A note on carrying capacity: if flow rates run high and you cannot swap in an anchor who can carry the traffic, it is better to cut consumption than to take on the traffic recklessly.

On the big screen you need not watch traffic sources. The marketing funnel can be viewed in today's review and optimized in real time per the requirements above. For creative ranking, enter the plan backend to view the real-time ordering; after the broadcast, jump straight to the short-video material-library analysis in today's review.

The same goes for plan ranking: the big screen is only for capture and display, and most optimization happens in the backend. As for product ranking, when multiple products are being promoted, stay in close touch with the live room so the live end stays consistent with the delivery end.

4) Data points and accident analysis

This part mainly focuses on data problems in daily delivery that cannot be summarized by the above, as well as account anomalies, through analysis of minable data. Since the Qianchuan series will be written starting in April, please see subsequent articles.

8. Anchor data module

In daily data analysis, the anchor's data layer covers five dimensions: traffic-attraction ability, retention ability, interaction ability, fan-attraction ability, and conversion ability.

I call this the anchor's five-force model.

Traffic-attracting capability refers to the host's ability to attract users into the live broadcast room. The specific data is the live broadcast room entry rate.

The ability to retain users refers to the host's ability to attract users to stay in the live broadcast room. The longer they stay, the stronger the retention ability. The specific data is the length of time they stay in the live broadcast room.

Interactive ability refers to the host's ability to attract users to like and comment. The specific data include comment rate and like rate.

The ability to attract fans refers to the host's ability to attract users' attention and form a fan group. The specific data include the fan increase rate and the fan group increase rate.

Conversion ability refers to the anchor's ability to get users in the live room to transact. The specific data are the click conversion rate and GPM.

This is a set of anchor training and competition (PK) tools that works in any live room.

If it is a personal live broadcast room, starting from the day the anchor goes live, you can record the anchor's five major ability indexes in a data table, rank the anchor's strengths and weaknesses in different indexes, and then continuously optimize the anchor's live broadcast room effect on a weekly basis.
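Tracking and ranking the five-force indexes in a table might look like the sketch below; the anchor names and scores are invented for illustration:

```python
# Hypothetical five-force index scores (0-100) recorded per anchor
anchors = {
    "anchor_A": {"attract": 72, "retain": 65, "interact": 80, "fans": 55, "convert": 90},
    "anchor_B": {"attract": 85, "retain": 70, "interact": 60, "fans": 75, "convert": 50},
}

forces = ["attract", "retain", "interact", "fans", "convert"]

# Rank anchors on each force to surface strengths and weaknesses
for force in forces:
    ranking = sorted(anchors, key=lambda a: anchors[a][force], reverse=True)
    print(f"{force:>8}: {' > '.join(ranking)}")
```

Weekly snapshots of this table show which gaps are closing and which anchor suits which session.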

If it is a carousel-style live room, you can monitor anchors through the store backend, or use the field-control feature of a third-party data tool to record each anchor's time-slot broadcasts, then export the data.

Comparing the data of different anchors shows that their ability profiles are often completely different, so each anchor can be optimized and deployed differently.

For optimization, anchors who are weak on an index should benchmark against those who are strong on it and close the gap through targeted learning. For deployment, assign anchors with strong conversion ability to the important sessions, so that each anchor's advantages are fully used.
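The benchmarking step above can be sketched as a small routine that, for each anchor, finds their weakest index relative to the team and names the teammate to learn from. Everything here (anchor names, metric names, the higher-is-better assumption) is illustrative:

```python
def rank_anchors(scores):
    """For each anchor, find the weakest metric relative to the team and
    the best-performing anchor on that metric to benchmark against.

    `scores`: {anchor_name: {metric: value}}. Assumes higher is better
    for every metric (true for the five-forces indexes in this model).
    """
    metrics = next(iter(scores.values())).keys()
    # Best anchor on each metric: the benchmark to learn from
    benchmark = {m: max(scores, key=lambda a: scores[a][m]) for m in metrics}
    report = {}
    for anchor, vals in scores.items():
        # Weakest metric = the one where this anchor beats the fewest peers
        weakest = min(
            vals, key=lambda m: sum(vals[m] >= scores[o][m] for o in scores)
        )
        report[anchor] = {"weakest": weakest, "learn_from": benchmark[weakest]}
    return report

team = {
    "anchor_A": {"entry_rate": 0.05, "gpm": 5000},
    "anchor_B": {"entry_rate": 0.03, "gpm": 8000},
}
print(rank_anchors(team))
```

Run weekly on the five-forces table, this turns the PK comparison into a concrete "who learns what from whom" list.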

9. Account score module

In daily data analysis, the analysis of account scores includes account reputation scores, store experience scores, and account credit scores.

In the early days, when there was no score maintenance system, countless accounts and small stores died within a year. The situation improved greatly after such a system was introduced.

This is an inevitable result of live streaming operating at scale, and a lesson learned from the chaos of score maintenance during rapid expansion. I include it in the data analysis model; not every team needs it, but it can serve as a reference.

Assign dedicated staff to manage the scoring system through a unified table covering experience points, word-of-mouth points, and all the detailed indicators related to product quality, after-sales, customer service, and so on.

Register the data daily in the table, so that each live broadcast room can view its indicators arranged by date. When an indicator turns abnormal, backend staff can intervene promptly to minimize any drop in scores.
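The daily check for abnormal indicators can be sketched as a simple day-over-day comparison. The indicator names and the 0.2-point drop threshold are illustrative assumptions; tune them to your own score table:

```python
def flag_abnormal(today, yesterday, drop_threshold=0.2):
    """Flag score indicators that fell by more than `drop_threshold`
    points versus the previous day, so backend staff can intervene.

    `today` / `yesterday`: {indicator_name: score}. Indicators missing
    from yesterday's table are skipped rather than flagged.
    """
    return [
        k for k in today
        if yesterday.get(k, today[k]) - today[k] > drop_threshold
    ]

today = {"experience_score": 4.6, "word_of_mouth": 4.8, "after_sales": 4.2}
yday = {"experience_score": 4.7, "word_of_mouth": 4.8, "after_sales": 4.5}
print(flag_abnormal(today, yday))  # → ['after_sales']
```

A daily run of this check against the registration table is enough to surface which indicator is dragging the score down before the platform penalizes the account.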

To sum up the above, a systematic data analysis model is formed with the "Review Summary Table" as the main module, and traffic channels, products, marketing funnels, short videos, Qianchuan delivery, anchors, and account scores as sub-modules.

In 2022, the live streaming track, in addition to the gameplay layer, is more about the refined operation of accounts, with data analysis being the first priority.

Data analysis may seem complicated, but all data have their own patterns, provided that as a trader, you can mentally shape a comprehensive analysis model.

Related reading:

Tips for breaking 1 billion GMV in live streaming!

How to make money from live streaming?

SOP for live broadcast room with monthly sales of one million

Author: Yin Chen's live broadcast

Source: Yin Chen's live streaming
