What Is Big Data (BD) Today?


This term refers to a set of techniques and technologies focused on knowledge discovery in support of decision making. That definition could equally describe earlier paradigms such as Business Intelligence. Indeed, most of the algorithms developed to solve specific problems, such as simplex linear programming or the travelling salesman problem (TSP), date back decades.
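To make the point concrete, the classic brute-force approach to the TSP mentioned above can be written in a few lines. This is a minimal illustrative sketch, not production code: the four-city distance matrix is made-up data, and exhaustive search is only feasible for a handful of cities.

```python
from itertools import permutations

# Symmetric distance matrix for 4 hypothetical cities (illustrative data only).
DIST = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 8],
    [10, 4, 8, 0],
]

def tsp_brute_force(dist):
    """Return (best_cost, best_tour) by checking every round trip from city 0."""
    n = len(dist)
    best_cost, best_tour = float("inf"), None
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)  # start and end at city 0
        cost = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if cost < best_cost:
            best_cost, best_tour = cost, tour
    return best_cost, best_tour

cost, tour = tsp_brute_force(DIST)
print(cost, tour)
```

The algorithm itself is decades old; what changed with Big Data is the scale of the data and of the machines the same ideas now run on.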

The real leap forward in Big Data is the huge increase in processing capacity, achieved not only through the improving performance of the machines within our grasp, but through new capabilities arising from new needs. It is no coincidence that the first real case of parallel programming was born in Yahoo's hands and was decisively improved by Google: they had to store a large amount of information every day and decided to turn it into a business asset. Simultaneously, Amazon decided to learn its customers' online behavior in order to generate purchase suggestions.

The challenge now is to "drag" all the elite's expertise and "drop" it into the rest of the market, knowing that BD is a powerful tool for changing the rules of business. And this is precisely the step our company is about to take. We have to be able to read our clients' needs in order to position ourselves efficiently as suppliers in this area. In fact, if the business-management gurus are not wrong, only companies that understand Big Data as an integral part of their processes will survive and have better chances to grow in the times ahead.

Some questions remain open regarding Big Data as the current paradigm of information management, such as how it will affect countries like Spain, where the great majority of companies do not appear to be within the sphere of BD. It will also be very interesting to see what role Big Data will play in areas of the economy that are less subject to market laws. BD's ability to predict facts related to human activity (epidemics, plagues, etc.) is well known, but will there be comparable progress on climate evolution or related phenomena? At the same time, there are many temptations to avoid: the ability to predict, even statistically speaking, cannot ride roughshod over basic individual rights.

It goes without saying that all this knowledge cannot be used for purposes other than those for which it was intended. This is the only way to ensure that it contributes to progress, since science is the only thing that can.


Written by Arturo Remartínez

This entry has 2 comments
  1. "the first real case of parallel programming was born in Yahoo's hands": It's pretty bold to claim that. Do you mean that the invention of the Fortran language was worthless, and that the use of Cray supercomputers in the 1970s for physics, chemistry, climatology, and biomedical simulation and research was not a real case of parallel programming? Or maybe parallel programming for uses not related to finding ways to make customers buy what they really don't need is just a waste of resources?

    1. Thank you for your comment! I'll try to explain these points: Big Data is basically understood as an explosion in the data volumes handled around Internet-related activities and in the quantification of facts enabled by modern technologies. That does not mean that the algorithms and the problem-solving and management-optimization techniques implicit in Big Data are newly made. These techniques were invented many years ago, but now they take on a new meaning. I hope I have answered your questions.
