Today we will look at the data management and data analysis capabilities that are closely tied to problem analysis and problem solving. First, let's be clear that what we mean here is not just 'numbers' but 'data': compared with a bare number, data carries more information, attributes, and measurement dimensions. Please keep this in mind while reading what follows.

Why are data-related jobs (data analyst, accountant, financial analyst, and so on) so valuable? Because data is an objective presentation of a fuzzy reality (this is also why theoretical mathematics underpins all theoretical and applied sciences): it describes reality rationally and measures objective existence quantitatively. Deep analysis and mining of data yields the closest possible interpretation of reality and guides people toward the most effective strategies for acting on it. Of course, for event operations the data side of the job is not that mysterious, but the essence is the same: use data management to grasp the reality of the business you are responsible for, and use data analysis to extract effective information from the data. Put simply, when your boss asks about the effectiveness of a recent campaign, you can explain from multiple angles whether it went well or badly, and the backing of data makes your conclusion hard to refute; that backing comes from your rigorous process of data analysis and mining. So I hope this part gives you an introduction to understand:
We take the data management and analysis capabilities required for event operations as the boundary of this part of the knowledge. Data management and analysis is a deep and endless discipline; new theories and methods are still being created today. If this part interests you, study it further on your own.

1. Basic data management capabilities

The essence of data management work is to help us understand reality as well as we can: whether your products, plans, and campaigns have truly and effectively affected users, what real process produced the actual results, whether the operational actions you took performed better or worse than the alternatives, and whether the costs invested were worth it. Understanding reality through data management is a process of challenging intuitive cognition. Is a project the leader praises endlessly necessarily a successful one? Is a project with seemingly no output necessarily worthless? Do users actually agree that the content you think is good is good? The scary thing, in my personal experience, is that data management is often used not as a tool for grasping reality but as a means of covering up problems and manufacturing false realities to deceive superiors, partners, customers, and the market.
A common method is to first write down an optimistic conclusion that helps achieve the goal ('this project is very popular in the market, so we should get more resources'), then work backwards through the data, keeping the real but not-so-rigorous numbers that support the conclusion (the click-through rate on our product ads is very high; in fact, only because the ad dangled an iPhone lottery prize) and hiding the real numbers that lead to pessimistic conclusions (users lured in by the lottery have no interest in the product at all). People gradually come to believe wrong and one-sided data, and often end up hypnotizing themselves. You have probably done this kind of thing more than once. We also understand that in a competitive environment, facing the market, customers, and superiors, we sometimes need to exaggerate and package conclusions to keep a project alive and win more time and resources. But that process should not affect your own grasp of the real situation. Only by knowing the truth in your own mind can you judge whether the resources won by manufacturing an illusion can make up for earlier mistakes and buy a second chance. So before you start this part, set your mindset: this section will not teach you how to package and present data beautifully and plausibly; there are plenty of courses on the market for that kind of methodology. The data management I want to share and advocate is about helping you grasp the real reality. What is the real reality under data management? I think it needs to include at least three parts:
The three parts above constitute the most basic data management framework and represent your grasp of a complete reality. They also mean you can withstand the three classic soul-searching questions from your leader: Did it go well? How well? What was good about it? (Actually, in another leadership course series I keep teaching leaders to ask exactly these three questions; I hope you can play along perfectly.) Leaders like these three questions because of the hidden meaning behind them: together they cover the complete reality. What happened, how did it happen, and how can it improve? So let's break down what each of the three parts should do, how to do it, and how to improve at it.

Describing reality means grasping the complete, objective picture and seeing the essence through appearances. Results are often treated as if they were the whole of reality, which is partial and even misleading. Results and reality are two different things: results are only the most intuitive, superficial slice of reality (though admittedly, in some black-and-white matters, results are the only reality). Knowing only the results often leaves you unable to draw conclusions or guide your next actions; master the reality, and your conclusions and follow-up actions become very clear.
The essence of building a business data management framework is to build a data framework that reflects all the realities of your business that you need to master, and then to logically break it down and interpret it so as to draw effective conclusions. Completeness is the goal of the first step; only with completeness can the later steps stand on a solid foundation.

(1) First ask yourself a question: what are all the real factors that affect this business? Since this is a problem, we are back in the territory of problem analysis. Referring to the methodology of the problem analysis article, this is a relatively chaotic problem: we don't know what the final analysis will look like or which elements the breakdown needs, so we use the simple, brute-force 6W2H method to enumerate exhaustively (see skyhahalife's previous article for details):
Knowledge points:
The exhaustive process continues: as long as you think a piece of data can help you understand the complete reality, it should be included in the framework. Keep using the other 'W's and 'H's to break things down until all the data related to the project is exhausted. Improving at this step means accumulating business familiarity through round after round of breaking down and interpreting data, learning which numbers are important, and which implicit ones deserve attention, in each business scenario. Once this ability accumulates far enough, you can skip the explicit breakdown and build the framework in your head. But before you are proficient, or whenever you face a new business scenario, it is best to apply the full process for the first few rounds to get you started on mastering the complete reality.

Now you have a pile of disorganized data in your hands. It is already a complete description of reality, but you still can't make sense of it or draw valid conclusions, because these are just unordered numbers; they need further logical decomposition and interpretation, which are the next two steps.

(2) Weighing the good and the bad. I had a 50-yuan bowl of spicy hot pot for dinner today. Does that number leave any impression on you? Probably not. But if I tell you that in the same shop a 30-yuan serving is called the 'couple meal', you will immediately conclude, 'Are you a pig?' This step is about evaluating the messy numbers you now hold. Computers, and many straight guys, have only two outcomes in mind, 1 and 0, where the positive and negative conclusions are obvious. But there is no absolute rule for evaluating a number that is neither 0 nor 1; the only way to draw a relative conclusion is through comparison.
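The comparison idea can be sketched in a few lines of Python. The thresholds and the benchmark below are invented for illustration; real reference standards come from your own business, as the text goes on to discuss:

```python
# A number means little in isolation; comparing it against a reference
# standard is what turns it into a conclusion. Thresholds here are made up.

def evaluate(metric: float, benchmark: float) -> str:
    """Classify a metric relative to a reference value."""
    ratio = metric / benchmark
    if ratio >= 1.2:
        return "clearly better than reference"
    if ratio >= 0.8:
        return "roughly in line with reference"
    return "clearly worse than reference"

# The 50-yuan dinner from the text, judged against the 30-yuan couple meal
# (here "better" just means "higher"; for spending, higher may be bad news):
print(evaluate(50, 30))  # clearly better than reference
```

The point is not the function itself but that every judgment it returns is relative to the benchmark you chose.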
The conclusions of good and bad you draw are never absolutely good or bad, only relatively better or worse. This theory also explains an epic question that confuses many straight guys: how expensive a gift must I give my girlfriend on our anniversary to satisfy her? In most scenarios the answer is simply: more expensive than what her best friend received. So this step needs to produce a conclusion for every number, and the key is to find a reasonable and effective reference standard for each one:
Improving here is straightforward: it comes down to the data dimensions you regularly touch and how familiar you are with their references and measurement standards. Before you have mastered this module, force yourself to comprehensively sort out and understand the measurement standards of every data point inside a complete data framework, and you will be well on your way. Now you have a pile of messy numbers with evaluations attached, and you can draw some obvious conclusions, such as 'my page conversion rate is very bad'. But you still cannot fully describe the whole reality, because there is no logical relationship between these numbers. That is what we do next.

(3) Constructing logic to form a complete description of reality. A lot of data analysis work stops at the previous step, because the conclusions seem clear enough (a high click-through rate means users like it, good sales mean the product is popular, long page dwell time means users are willing to browse). And for those who use data management as an expectation-management tool, this stage can already produce many one-sided conclusions that conveniently support the expected results. Various subjective or objective factors make them stop here and start assembling project reports from these scattered conclusions. But if you look more carefully, these single-point numerical conclusions can always be questioned and challenged. This is also why you are often unable to respond when your leader pushes back: the data do not form a logic, so they do not form a complete description of reality, and the conclusions are necessarily incomplete and contestable. A complete and effective data management framework is a rigorous logical description of reality, and rigor and logic are both necessary.
Data can easily deceive, for the reasons above: a single data point can support many different readings without logical deduction, and people tend to believe the positive conclusion and ignore the other possibilities, which is how misleading happens. Building complete and rigorous logic is the effective countermeasure, and the method is actually not complicated:
Keep trying to combine the numbers you have, running through the process above, to form one or more rigorous logical descriptions, until there are no more important or necessary descriptions left to sort out. At that point you truly understand all the realities of a project, which is the essence of effective data management and the basic guarantee for the optimization actions that follow. Being able to face success and failure objectively and draw effective conclusions is what a business and a team really need. Now you can confidently tell your leader to ask anything about the project. Writing a summary becomes even easier: it is just a matter of presenting these logics simply and clearly with some visualization tools. There are plenty of visualization tutorials on the market, so I won't rehash them here; find one yourself. Of course, implementing all three parts in full is extremely complex and time-consuming, and strictly doing so for every project is obviously unrealistic. What matters more is understanding the real operating principles of data management; faced with a complex data management task, this theory gives you an entry point and a method so that you can proceed without going wrong. As you grow more proficient with the tools at each step and your business capability improves, your data management skill will let you grasp the full facts of a project easily, efficiently, and pointedly. Compared with those who burn energy running data aimlessly and writing reports just to squeeze out conclusions, you are attacking from a higher dimension.

2. How to complete data mining and draw conclusions with data analysis tools?

We now begin the second part, which is about how to use data analysis tools to help you complete data mining and draw conclusions.
This process corresponds to the step in the data management part about extracting value and meaning from single-point numbers. This section will not introduce specific data analysis tools in depth; there are plenty of ready-made methodologies on the market for that. As someone who once won a first prize in the National Mathematical Olympiad and then spent ten years clubbing, I no longer really remember what regression analysis or probability calculation are. What I want to share are the more important data analysis ideas behind the tools. In a sense, these ideas are the truly powerful data analysis tools: they help you understand the essence of the tools, how to analyze a number in different scenarios, and how to avoid over-reliance on a methodological process that leads to wrong conclusions. Master the ideas, and the tools are ready-made skills you can look up on Baidu whenever needed. Common data analysis ideas (tools) that event operations should master:

1. Trend analysis

Why is it that a company can still be growing, with a still-huge business scale, yet its market value plunges the moment its growth rate dips slightly? Because the capital market values a company on its business scale and returns over many years to come: a company's future and development trend matter more than its current state. The way the capital market evaluates a company's business represents the core of trend-analysis thinking: the higher the dimension of the trend analysis, the more clearly you see the nature of the trend. How to understand this? The growth rate is the first dimension of trend analysis; it tells you the business is still developing positively.
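A minimal sketch of this first dimension, using made-up quarterly revenue figures:

```python
# Hypothetical quarterly revenue (invented numbers for illustration).
revenue = [100, 130, 160, 184]

# First dimension of trend analysis: period-over-period growth rate.
growth_rate = [(curr - prev) / prev for prev, curr in zip(revenue, revenue[1:])]
print([f"{g:.0%}" for g in growth_rate])  # ['30%', '23%', '15%']
```

Notice that revenue keeps rising even while the growth rate falls; differencing this growth-rate series again is exactly how you climb to the higher dimensions of trend analysis.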
However, the trend of the growth rate, the second dimension of trend analysis (the 'declining growth rate' you often see in reports), tells you the direction of the business's development: whether growth is speeding up (implying continued growth ahead) or slowing down (implying stagnation and decline ahead). Go one level further, to the trend of that trend, the third dimension, and you see an even more essential direction (the company's growth rate is declining, but the decline is smaller than last quarter's, which may mean the company has found a new growth curve that is taking over from the old one). Applied to actual work, this principle means consciously moving to higher dimensions in every trend analysis in order to see more of the trend's essence; staying at a low dimension often makes you miss future trends.

2. Comparative tests

A commonly used analysis method is the A/B test: two control groups with essentially identical conditions and one contrasting difference, used to determine the actual effect of an action. Whether you call it an A/B test or a C/D test doesn't matter; what matters is the analytical thinking behind it. To obtain the true impact of an operational action or an interfering factor, the most rigorous and effective method is a comparative test that eliminates the interference. Science students will find this familiar: keep the two experimental subjects under conditions as consistent as possible, with only the test factor differing, and observe the final results. Only then is there a rigorous logical inference from the test factor to the outcome.
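One common way to formalize this comparative-test idea (the text does not name it, and the counts below are invented) is a two-proportion z-test on the conversion rates of group A and group B:

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the assumption that the two groups are identical.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical data: variant A converts 120 of 1000 users, variant B 90 of 1000.
z = two_proportion_z(120, 1000, 90, 1000)
print(round(z, 2))  # 2.19
```

If |z| exceeds roughly 1.96, the difference between the two groups is unlikely to be random noise at the conventional 95% confidence level; libraries such as scipy or statsmodels provide ready-made versions of this test.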
With this idea in mind, you can try to figure out why the goddess rejected you: is it because you are ugly, or because you are poor?

3. User paths and the funnel model

A data analysis idea commonly used in activity operations is to return to the user's path and build a data logic framework around it, clearly identifying the truly critical conversion links that shape user behavior. Everyone can draw a funnel model: take the absolute numbers at each link, from ad clicks to page traffic to product details to order placement, calculate the conversion rates, judge whether each step's conversion is good or bad, and then optimize the weak links. In most simple business models such a funnel works fine and yields basically valid conclusions. But most people's use of the funnel model is still limited to the template their mentor handed them six months before graduation; they just mechanically fill in the numbers at each funnel level every time. Such rigid use often becomes completely ineffective when the business model changes (with social sharing, users often do not enter from the page you designed) or when user behavior subtly evolves (a user sees incomplete page information, leaves to search elsewhere, but eventually places the order anyway). So what I advocate is mastering the core idea of user path analysis. Instead of applying the ready-made funnel directly, try to draw the user's path from their first contact with your business, like telling a story, clearly describing every direction the user might take until the final transaction conversion. Only by sorting out user paths this way and finding the key points can you form a truly effective funnel model. In an era where social, vertical, and decentralized businesses (social e-commerce, distribution, etc.)
are becoming increasingly common, users' browsing paths are far more complicated than the traditional homepage to feature page to product details model (traditional e-commerce and portal businesses all work that way). Only by mastering the core idea of user path analysis can you keep up with business changes.

4. Black-box analysis

When the operating mechanism of a business or a module is so complicated that analyzing its internal logic is too costly or outright impossible, black-box thinking gives you an easy, simple path: treat the unclear, complex mechanism as a black box, consider only its inputs and outputs, and look for the relationship between them. In fact, most of humanity's scientific conclusions were reached through black-box analysis. For example, scientists have discovered that if a Ferrari key is placed on a table in a disco, many young ladies will be attracted. The theory of black-box analysis is not hard to understand; the key is using it appropriately at the right time. To some extent, black-box analysis conflicts with the data management described above, which seeks root causes and the full picture of the facts. But too often we dwell on a project's details for too long, insisting on digging into some cause that cannot be pinned down. When you find that a factor producing only linear changes has trapped the project team in analysis for too long, it is time to package the non-critical or unanalyzable parts into a black box and draw effective conclusions from the input-output relationship alone. Used reasonably, black-box analysis can save you a great deal of effort.

5. Big data thinking: simplify into models

This way of thinking says that when the amount of data is large enough, we must learn to simplify complex individuals into big-data individuals.
The simplification process is to extract the key and necessary factors and discard those that are irrelevant, or whose individual differences are huge but whose overall differences are negligible. For example, many e-commerce companies use big data to label each person with income and city tier, because the intersection of these two attributes forms a basic consumer model: rich people in first-tier cities chase trends, poorer people in first-tier cities may buy Uniqlo, rich people in lower-tier cities buy luxury goods, and poorer people in lower-tier cities buy purely on price. There may be some stingy rich people in first-tier cities who refuse to buy a Dyson, but that's not the point. This is big data thinking, and building the simplest user model is its essence. The business situation event operators most often face is formulating strategy through analysis of target users. Your business purpose determines the user model you need to build, and the user model determines the layout and classification of hooks on your campaign page. Suppose you are in charge of an activity selling mooncakes. Selling mooncakes is your business purpose; the user model tied to that purpose might be why users buy mooncakes and how expensive a mooncake they can afford. So you build a user model like this: rich people buying gifts, poorer people buying gifts, rich people buying for themselves, poorer people buying for themselves. What you need to do next is pretty straightforward.

Finally

The above is my definition and analysis of the data management and data analysis capabilities needed for event operations. The important thing is to understand the principles. Specific tools abound on Baidu, but if you use them mindlessly without understanding the principles, you will easily be deceived by the data.
That's all; I hope it helps~ Author: SKY#沙璇浩, published with authorization by Qinggua Media. Source: SKY#沙璇皓 (skyhahalife)