1. ‘Data as core’ theory
1.1. From ‘process as core’ to ‘data as core’
In the big data era, the computing model has shifted from 'process as core' to 'data as core'. Hadoop is the paradigm of 'data as core'. Unstructured data and new analysis requirements are changing how IT systems are upgraded: from simply adding capacity to changing the architecture itself.
A new concept of the big data era is the transformation of computation itself. IBM, for instance, adopts a 'data as core' design aimed at reducing the need for data exchange among supercomputers. In the big data era, cloud computing embodies 'data as core' in both storage and computation: cloud computing provides powerful tools and methods for big data, while big data provides a platform for cloud computing and can make use of existing cloud computing resources.
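To make the 'data as core' idea concrete, here is a minimal sketch, in plain Python rather than Hadoop's actual API, of the map/shuffle/reduce pattern Hadoop popularized: the computation is expressed as small functions applied where the data blocks live, instead of moving all the data to one central process. The blocks and words are invented for illustration.

```python
from collections import defaultdict

# Each "block" stands in for a chunk of data stored on a different node.
blocks = [
    "big data moves computation to data",
    "data as core means computation travels",
]

def map_phase(block):
    # Runs locally where the block is stored: emit (word, 1) pairs.
    return [(word, 1) for word in block.split()]

def shuffle(pairs):
    # Group intermediate pairs by key before reduction.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Combine the counts for each word.
    return {word: sum(counts) for word, counts in groups.items()}

intermediate = [pair for block in blocks for pair in map_phase(block)]
print(reduce_phase(shuffle(intermediate)))
```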
2. Data value theory
2.1. From function value to data value
Big data puts data online, which is a defining feature of the internet. In the pre-internet era, a product's value lay in its function; in today's internet products, the value lies in the data.
The value of big data lies in creating and filling in what was previously unrealized. Big data is about 'useful' rather than merely 'big'. For example, data can reveal each customer's consumption trends, what they want and like, how customers differ from one another, and which information can be integrated and classified. As the amount of data grows, so does its quality.
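As a toy illustration of turning raw records into that kind of customer-level insight, the sketch below groups hypothetical purchase records (the customers, categories, and amounts are all invented) and summarizes each customer's spending and favorite category.

```python
from collections import defaultdict

# Hypothetical purchase records: (customer, category, amount).
purchases = [
    ("alice", "books", 12.0),
    ("alice", "books", 8.5),
    ("alice", "games", 30.0),
    ("bob", "games", 25.0),
    ("bob", "games", 40.0),
]

profile = defaultdict(lambda: {"total": 0.0, "by_category": defaultdict(float)})
for customer, category, amount in purchases:
    profile[customer]["total"] += amount
    profile[customer]["by_category"][category] += amount

for customer, stats in profile.items():
    favorite = max(stats["by_category"], key=stats["by_category"].get)
    print(f"{customer}: spent {stats['total']:.2f}, favors {favorite}")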
3. All-sample theory
3.1. From random sampling to all-sample
All-sample theory means thinking about and solving problems with the full population of data. Random sampling introduces deviations that the all-sample approach avoids: the full sample is more reliable because it contains all the information.
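A quick numerical sketch of that claim: repeated random samples of a population scatter around the true mean, while computing over the full data set gives the exact answer every time. The population and sample sizes here are arbitrary choices for illustration.

```python
import random
import statistics

random.seed(42)
population = [random.gauss(100, 15) for _ in range(100_000)]

true_mean = statistics.fmean(population)  # the "all-sample" answer
sample_means = [
    statistics.fmean(random.sample(population, 100))  # a 100-item random sample
    for _ in range(1000)
]

print(f"all-sample mean: {true_mean:.2f}")
print(f"spread of sample means (std dev): {statistics.stdev(sample_means):.2f}")
```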
4. Efficiency priority theory
4.1. From precision priority to efficiency priority
By prioritizing efficiency over precision, big data marks a giant step in quantifying and understanding the world: information that previously could not be measured, stored, or analyzed is now turned into data. Having vast amounts of less precise data opens a new door to understanding the world. Big data can improve production and sales efficiency because it tells us what the market needs and what consumers require, allowing enterprises to make decisions scientifically.
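One way to read 'efficiency over precision' in code: accept a slightly imprecise answer computed from a small fraction of the data in exchange for speed. A minimal sketch on synthetic data (the data set and sample size are arbitrary):

```python
import random
import time

random.seed(0)
sales = [random.uniform(1, 500) for _ in range(5_000_000)]

t0 = time.perf_counter()
exact = sum(sales) / len(sales)        # precise: touches every record
t1 = time.perf_counter()

sample = random.sample(sales, 10_000)  # about 0.2% of the data
approx = sum(sample) / len(sample)     # fast, slightly imprecise
t2 = time.perf_counter()

print(f"exact  mean {exact:.2f} in {t1 - t0:.3f}s")
print(f"approx mean {approx:.2f} in {t2 - t1:.3f}s")
```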
5. Relevance priority theory
5.1. From cause/effect to relevance priority
A major feature of data thinking is the shift from cause and effect to correlation. Big data does not need to find the cause or prove that events are causally related; it only needs to know that when one event happens, a particular outcome follows with high probability. So when that event occurs, we can decide what to do.
For example, American developers built data-mining software with a 'personalized analysis report automatic visualization program'. This data-mining program can extract important information from diverse data sources, then analyze it and link it with historical data to derive useful information.
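The sketch below shows this correlation-first logic on synthetic data: we never model why umbrella sales track rainfall, we only measure that the two move together strongly and act on it. The names and numbers are invented, and statistics.correlation requires Python 3.10+.

```python
import statistics

# Synthetic daily observations: rainfall (mm) and umbrella sales (units).
rainfall = [0, 2, 5, 1, 12, 8, 0, 15, 3, 9]
umbrellas = [3, 6, 14, 4, 30, 22, 2, 37, 9, 25]

# Pearson correlation: how strongly the two series move together.
r = statistics.correlation(rainfall, umbrellas)
print(f"correlation: {r:.2f}")

# No causal model needed: if the association is strong, act on it.
if r > 0.8:
    print("rain forecast tomorrow -> stock more umbrellas")
```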
6. Prediction theory
6.1. From unpredictable to predictable
The core of big data is prediction, and big data prediction shows up in many areas. Big data is not about teaching machines to think like humans; rather, it predicts possibilities by applying mathematical algorithms to large amounts of data. Under big data's principles, everyone's behavior resembles that of others, so merchants come to know consumers better.
Big data also eliminates subjectivity in prediction. These systems succeed because they are built on massive data. Mathematical prediction models are not a new concept, but they are becoming more and more accurate. In this era, data-analysis capability has begun to match data-collection capability: analysts not only have more information with which to build models, but also the technology to convert that information into data.
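As a minimal example of such a prediction model, the sketch below fits a least-squares line to past observations and extrapolates. The data are made up, and a real system would use far more variables and records; this only illustrates the "algorithm applied to data" idea.

```python
import statistics

# Hypothetical history: advertising spend -> units sold.
spend = [1.0, 2.0, 3.0, 4.0, 5.0]
sold = [12.0, 19.0, 33.0, 38.0, 51.0]

# Ordinary least squares for a line: sold = a + b * spend.
mean_x, mean_y = statistics.fmean(spend), statistics.fmean(sold)
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(spend, sold)) / \
    sum((x - mean_x) ** 2 for x in spend)
a = mean_y - b * mean_x

# Predict the outcome for a spend level we have not observed.
print(f"model: sold = {a:.2f} + {b:.2f} * spend")
print(f"predicted sales at spend 6.0: {a + b * 6.0:.1f}")
```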