One of the technological developments that has produced great social transformation is the computer. Its inventors tried to imitate the way humans carry out mathematical operations; in the end, the ability to perform arithmetic and logical operations, combined and at great scale, underlies everything computers do today and much that we can only imagine. Computers have long since surpassed human capability and speed in arithmetic calculation.
The first computers were based on vacuum tubes, which made them large and heavy. With the advent of the transistor in the late 1940s, sizes began to decrease, and they shrank again in the 1960s with integrated circuits. Miniaturization then reached another milestone with microprocessors, which have continued to shrink as technology progresses.
Computer evolution advanced at the same pace as communication networks, which began with wired telegraphy, radio waves, and telephony. According to the historical record, data was transmitted remotely for the first time in 1940 from Dartmouth College (U.S.A.), but it was not until the end of the 1960s that the Advanced Research Projects Agency Network (ARPANET) was conceived, which later gave way to the National Science Foundation Network (NSFNET), created to allow communication between universities.
The project led to packet switching, in which messages are split into packets that may take different paths through the network, according to traffic, and are recombined when they reach the destination. The Transmission Control Protocol/Internet Protocol (TCP/IP) then appeared in the 1970s and still underpins Internet communication, including email, today.
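The packet-switching idea described above can be sketched in a few lines of Python. This is a toy illustration, not a real network protocol: a message is split into numbered packets, arrival order is simulated by shuffling, and the receiver reassembles the original from the sequence numbers. All names and sizes are invented for the example.

```python
import random

PACKET_SIZE = 4  # bytes of payload per packet (illustrative only)

def split_into_packets(message: str) -> list[tuple[int, str]]:
    """Split a message into (sequence_number, payload) packets."""
    return [(i, message[i:i + PACKET_SIZE])
            for i in range(0, len(message), PACKET_SIZE)]

def reassemble(packets: list[tuple[int, str]]) -> str:
    """Reorder packets by sequence number and rebuild the message."""
    return "".join(payload for _, payload in sorted(packets))

message = "Packets can take many different paths."
packets = split_into_packets(message)
random.shuffle(packets)  # simulate packets arriving out of order
assert reassemble(packets) == message
```

The sequence numbers are what let the destination rebuild the message regardless of the path, or order, in which each packet arrived.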
Later, in the 1980s, the foundations of the Internet were laid, giving rise in the 1990s to the World Wide Web, an interconnected system of documents that may be accessed via the Internet. Documents on the web are hypertexts, which may include text, images, and video, among other media.
On the other hand, the Global Positioning System, better known as GPS, helps determine the position of an object (a person or vehicle) on Earth, normally with great precision. It was developed by the U.S. Department of Defense and uses a satellite system that, through a trilateration process based on distances to satellites, can pinpoint any location on Earth. GPS was conceived for military use, and at the beginning its full-precision signals were not available for civil purposes. However, in 2000, during the Clinton administration, the system was made fully open to all, with all its technological, economic, and social implications.
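The geometric core of trilateration can be shown with a toy two-dimensional example: given the known positions of three reference points and the measured distances to each, solve for the unknown position. Real GPS works in three dimensions, derives distances from signal travel times, and must also solve for the receiver's clock error; this sketch, with invented coordinates, shows only the geometry.

```python
def trilaterate(anchors, distances):
    """Locate a 2-D point from distances to three known anchor points.

    Subtracting the first circle equation from the other two cancels the
    quadratic terms, leaving a linear 2x2 system solved by Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = distances
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# A point at (3, 4), measured from anchors at (0,0), (10,0), and (0,10):
x, y = trilaterate([(0, 0), (10, 0), (0, 10)], [5.0, 65 ** 0.5, 45 ** 0.5])
assert abs(x - 3) < 1e-9 and abs(y - 4) < 1e-9
```

With noisy real-world measurements, more than three references and a least-squares fit are used instead of an exact solution.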
With the boom of IT and communications, among other factors, the abundance of data opened both the opportunity and the need to analyze it in order to obtain useful knowledge for decision-making, by machines and human beings alike. This led to what is known today as Big Data, which covers the processing and analysis of very large data volumes. The production and processing of large amounts of data may be traced almost to the beginning of computing. One of the first computers, the UNIVAC, was installed at the United States Census Bureau in 1951; another, at the United States Atomic Energy Commission, was used by the commercial broadcast television network CBS to predict the winner of the 1952 presidential election. With returns from about 1% of the voting population, it correctly predicted the election of Dwight D. Eisenhower.
The great amount of data comes from sources such as:
This barrage of data has posed great challenges for computer science, IT, and communications, as traditional methods lacked the capability to collect, store, organize, share, analyze, or visualize massive data.
The way knowledge is produced and transmitted has also undergone great change in the past decades, particularly with the generation, availability, and analysis requirements of big data sets. Scientific development, formerly based on the pillars of experimentation, theory, and computer simulation, now includes the analysis of great amounts of data as a fourth component.
Automatic learning, or machine learning, is a branch of artificial intelligence whose objective is to develop computers capable of learning; in other words, to create software capable of generalizing behaviors from information provided in the form of examples.
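A minimal sketch of "generalizing from examples" is a 1-nearest-neighbour classifier: it memorizes labelled example points and labels a new point the same way as its closest known example. The feature values and labels below are invented purely for illustration.

```python
def predict(examples, point):
    """Return the label of the training example closest to `point`."""
    def sq_dist(a, b):
        # Squared Euclidean distance between two feature vectors.
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    _, label = min(examples, key=lambda ex: sq_dist(ex[0], point))
    return label

# Training examples: (feature vector, label), invented for the example.
examples = [((1.0, 1.0), "small"), ((1.5, 2.0), "small"),
            ((8.0, 9.0), "large"), ((9.0, 8.5), "large")]

# The classifier generalizes to points it has never seen:
assert predict(examples, (2.0, 2.0)) == "small"
assert predict(examples, (7.5, 8.0)) == "large"
```

Even this tiny program captures the essential idea: behavior is not hand-coded rule by rule, but derived from the examples it is given.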
The challenge and opportunity of analyzing massive data to obtain non-obvious knowledge is known as Knowledge Discovery in Databases, or KDD. One of its most important phases is data mining, which is largely based on statistics. In this analysis stage, the goal is to discover patterns in large data volumes, drawing on artificial intelligence, machine learning, and database systems.
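One classic data-mining task can be sketched in a few lines: finding item pairs that frequently occur together in transaction records (association analysis, of the kind behind "customers who bought X also bought Y"). The transactions and the 50% support threshold below are invented for illustration.

```python
from collections import Counter
from itertools import combinations

# Toy transaction data: each set is one customer's basket (invented).
transactions = [
    {"bread", "milk"},
    {"bread", "milk", "eggs"},
    {"milk", "eggs"},
    {"bread", "milk"},
]

# Count how often each pair of items appears in the same basket.
pair_counts = Counter()
for basket in transactions:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# Keep only pairs appearing in at least 50% of the transactions.
min_support = 0.5
frequent = {pair: count for pair, count in pair_counts.items()
            if count / len(transactions) >= min_support}

assert frequent == {("bread", "milk"): 3, ("eggs", "milk"): 2}
```

Real data-mining systems apply the same counting idea at far larger scale, with algorithms that prune the search space rather than enumerating every pair.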
Through the extraction of hidden knowledge of great interest from data, using statistical methods, artificial intelligence, and particularly machine learning, systems such as the following have been developed, among many others:
The generation and processing of great data volumes in the past decades has been framed by the convergence of digital, physical, and biological technologies, resulting in what is known as the Fourth Industrial Revolution, a phenomenon that will bring many transformations to the world as we know it.
To face the challenges of the Fourth Industrial Revolution, it is important for governments, and particularly educational institutions, to be prepared and to ensure training processes for citizens. In fact, educational institutions should begin transforming their curricular programs to respond to a new society and its new requirements. It is also important to continue advancing in access to information and essential services, as well as in citizen oversight, among other areas.
Great transformations have been caused, and will continue to be caused, by this barrage of data. This has been possible thanks to technologies such as the Internet of Things and cloud computing. Some of these technologies seek, in part, a greater understanding of living beings at the macro and micro levels, including:
The availability of great volumes of data has proved very valuable and is perceived as a great opportunity to discover inherent knowledge, identify relationships between variables, establish associations, relate data of different kinds, and identify patterns, among other functions.
Lastly, it is important to mention that in the face of this barrage of data there are also great risks, due to the inadequate use of information and to flaws in computer security that fail to restrict unauthorized parties' or machines' access to critical information. This is why it is important to emphasize that the real threat of technological developments lies in the use humans make of them.
Editorial Board: Fredy Chaparro Sanabria, Director of Unimedios; Nelly Mendivelso Rodríguez, Press Office; Liseth Sayago Cortes, Audiovisual Production Office; Carlos Raigoso Camelo, Radio Production Office; Ramiro Chacón Martinez, Strategic Projects Office.
Editor: Álvaro Enrique Duque Soto
Website design and development: Martha Lucía Chaves Muñoz, Digital Media Office