The History of Data
At the beginning of the digital revolution in the 1950s and 1960s, as technology transformed from analog and mechanical to digital, computational power was the limiting factor on progress. It was understood that improvements in hardware would yield improvements in performance and efficiency. As the nascent computer industry grew, so did the complexity and density of processors: Moore’s Law (Moore, 1965) projected a yearly doubling in the number of components per integrated circuit, and this projection proved roughly accurate for the decades that followed, as illustrated in Figure 1.
Figure 1
As processing power increased, more and more information was collected and stored, and the ability to comprehend this data in a meaningful way came to be viewed as another limit on progress. By 1990, researchers were acknowledging that this deluge of data was becoming problematic, from the viewpoints of both storage and comprehension: “The imperative [for scientists] to save all the bits forces us into an impossible situation: The rate and volume of information flow overwhelm our networks, storage devices and retrieval systems, as well as the human capacity for comprehension…” (Denning, 1990). The size and complexity of the data being stored was beginning to outstrip the capability of traditional analytical methods.
In their retrospective study of the evolution of storage systems, Morris and Truskowski noted that an important economic tipping point occurred in the mid-1990s: by 1996, digital storage had become more cost-effective than paper for storing data (Morris & Truskowski, 2003), and the price of storage continued to fall. As data storage became ever more efficient and cost-effective, the quantities being stored grew exponentially, projected to exceed 40 ZB by 2020, as illustrated in Figure 2.
Figure 2
An illustration of the scale of this growth is that by 2014, mankind was producing as much data every two days as had been produced from the dawn of civilisation up to 2003 (Kitchin, 2014).
In 2005, the term Big Data was coined by Roger Mougalas to describe sets of data so large as to be almost impossible to manage and process using traditional business intelligence tools (van Rijmenam, 2016). Technology companies, financial institutions and governments across the world began to see Big Data as the next great challenge and opportunity. The ability to comprehend and utilise the flood of data now being captured and stored was recognised as being of great technological and economic value. At the World Economic Forum in Davos in January 2012, data was declared “a new class of economic asset, like currency or gold” (Lohr, 2012). The time of Big Data had come.
In common with other new concepts, the term Big Data quickly came to mean whatever the speaker wanted it to mean. It became an industry buzzword and a new field to report on and champion. Because it impacted fields as diverse as medicine, sociology, economics, computer science, radiology, agriculture and sports science, the term itself was in danger of becoming ambiguous.
To formally define the term, De Mauro, Greco and Grimaldi surveyed contemporary definitions of Big Data, aggregating over 1,500 conference papers and journal articles from 2014 that used the term in either their title or abstract, and synthesised a consensual definition: “Big Data represents the Information assets characterised by such a High Volume, Velocity and Variety to require specific Technology and Analytical Methods for its transformation into Value” (De Mauro, et al., 2015).
Capturing and storing Big Data, however, is only half of the challenge; extracting value from datasets of this scale requires analytical techniques that can operate without constant human direction, and it is here that machine learning comes to the fore.
The fields of predictive analytics and data mining have long been concerned with finding and describing structural patterns in data, which can then be used to explain the data, influence decisions or predict behaviour. When faced with a very large dataset, the automation of this process becomes a necessity. Machine learning can be defined as “an automated process that extracts patterns from data” (Kelleher, et al., 2015).
It can be thought of as the use of statistical models and algorithms to perform a task without explicit, task-specific instructions.
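To make this idea concrete, the short sketch below fits a small decision tree that learns a classification rule directly from labelled examples rather than from hand-written instructions. It assumes the scikit-learn library is available; the toy transaction data and the meaning of its columns are invented purely for illustration.

# A minimal sketch of "extracting a pattern from data" with scikit-learn.
# The dataset below is hypothetical: each row is (daily transactions, account age in years),
# and the label records whether the account was later flagged as unusual (1) or not (0).
from sklearn.tree import DecisionTreeClassifier

X = [[2, 5], [3, 7], [40, 1], [55, 2], [4, 10], [60, 1]]
y = [0, 0, 1, 1, 0, 1]

# The model is never given a rule; it infers a decision boundary from the examples.
model = DecisionTreeClassifier(max_depth=1)
model.fit(X, y)

print(model.predict([[50, 1], [3, 8]]))  # expected output: [1 0]

In this sketch the extracted “pattern” is simply a split threshold on the transaction count that the algorithm discovers for itself; nothing in the code specifies what the rule should be, which is the sense in which the task is performed without specific instructions.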
Bibliography
Al-Jarrah, O. Y. et al., 2015. Efficient Machine Learning for Big Data: A Review. Big Data Research, 2(3), pp. 87-93.
Denning, P. J., 1990. The Science of Computing – Saving All the Bits. American Scientist, Volume 78, pp. 402-405.
Kelleher, J. D., Mac Namee, B. & D’Arcy, A., 2015. Fundamentals of Machine Learning for Predictive Data Analytics (Algorithms, Worked Examples, and Case Studies). 1st ed. Cambridge, Massachusetts: The MIT Press.
Kitchin, R., 2014. The Data Revolution – Big Data, Open Data, Data Infrastructures & Their Consequences. 1st ed. London: SAGE Publications Ltd.
Lohr, S., 2012. The Age of Big Data. The New York Times, 11 February.
Morris, R. & Truskowski, B., 2003. The Evolution of Storage Systems. IBM Systems Journal, 42(2), pp. 205-217.
Obermeyer, Z. & Emanuel, E. J., 2016. Predicting the Future – Big Data, Machine Learning and Clinical Medicine. The New England Journal of Medicine, 29 September, 375(13), pp. 1216-1219.
Qiu, J. et al., 2016. A survey of machine learning for big data processing. EURASIP Journal on Advances in Signal Processing, 28 May, 2016(67), pp. 1-16.
van Rijmenam, M., 2016. datafloq.com. [Online] Available at: https://datafloq.com/read/big-data-history/239 [Accessed 4 June 2019].
Witten, I. H., Frank, E. & Hall, M. A., 2011. Data Mining – Practical Machine Learning Tools and Techniques. 3rd ed. Burlington, Massachusetts: Morgan Kaufmann.