Although the term Big Data was coined by Roger Mougalas in 2005, the concept existed without a name as far back as the late 1600s.
Big Data, in general, is a relative term. The "Big" in Big Data depends partly on the size of the organization dealing with it: the volume of data handled and analyzed by a company like Google cannot be compared to the volume handled by a mid-sized analytics firm. In other words, the volume of data that qualifies as "Big Data" depends on how much data the company can handle. One could even argue that the early astronomical studies, which observed stellar objects and recorded them to analyze and explain various events and concepts, were the "Big Data" collection and interpretation of their time. As our technologies evolve, so does our ability to record, store, and interpret data, which makes Big Data a term relative to time as well.
The statistical analysis of data dates back to 1663, when John Graunt used vast amounts of data and statistics to analyze patterns and identify the origin, infection rate, and mortality rate of the bubonic plague that was terrorizing Europe at the time. Around 1800, the field of statistical analysis grew exponentially. Management experts today define Big Data as an immense, vast, and unmanageable amount of data.
In 1880, the USA conducted a nationwide census. It was estimated at the time that it would take at least ten years to manually process all the data collected by the census initiative. Fortunately, Herman Hollerith came to the rescue: he invented the Tabulating Machine, which summarized information stored on punched cards. This invention reduced ten years' worth of labor to a mere three months. It also spawned a class of machines called unit record equipment, and eventually the data processing industry itself.
After this, the importance of data analysis grew beyond imagination. It became one of the prime weapons during the First and Second World Wars: receiving swift information about troop movements, enemy infiltration, and the like was vital to winning a battle, or even the war itself.
In the September 8, 1888 issue of Electrical World, Oberlin Smith publicized the first description of magnetic storage, in the form of wire recording. In 1927, a German-Austrian engineer named Fritz Pfleumer invented the first magnetic tape storage. He had earlier devised a way to attach metal strips to cigarette papers, to prevent smokers from getting stains on their lips from the roll-on papers available at the time. He later realized that a similar technique could be used to store information on magnetic strips, and after experiments with a variety of materials, in 1928 he patented a method of storing data on very thin paper striped with iron oxide and coated with lacquer.
In 1931, IBM developed a large custom-built tabulator for Columbia University, which The New York World newspaper termed a Supercomputer. In 1943, the British, desperate to crack the Nazi codes and interpret the information being passed by the Germans, built a machine called "Colossus" that scanned 5,000 characters a second, reducing weeks of work to hours. It is often regarded as the first data processor. Two years later, in 1945, John von Neumann published the first documented discussion of program storage, his report on the Electronic Discrete Variable Automatic Computer (EDVAC), laying the foundation for computer architecture as we know it today.

We at PromptCloud believe in providing quality data to all our customers across industries, and we have excelled at doing so for the past decade.