Big Data: The Fuel for Artificial Intelligence Journalism

Dr. Mohamed Abdulzaher

Media Consultant at UAE Government Communication Office

Since the First Industrial Revolution, and through the Second and Third Industrial Revolutions, the development of media cannot be isolated from the development of the ways information and data are transmitted, whether that data is big, medium-sized or simple.

I disagree with those who describe today's big data as purely a product of the technologies of the Fourth Industrial Revolution, or who limit it to the current century and the coming decades.

The History of Big Data

Big data has always existed. The First Industrial Revolution created enormous data relative to the world population of the time, around one billion people. That revolution, which first surfaced in Britain, left behind big data that the rest of the world could not access.

For example: how many machines were manufactured, how much they reduced human labour, how much they increased production, what the nature of those machines was, and what energy sources were stored in the ground, whether oil, gas or others. This big data was hidden from the world for many years, which helped Britain lead the world's economies at the time.

With the Second Industrial Revolution (2IR), the volume of global big data increased as the population grew to two billion by 1930. The 2IR focused on steelmaking, railway development, electricity and the chemical industries. The media struggled to obtain data and news, since publishing news across borders was considered a crime in some parts of the world.

Then came the Third Industrial Revolution (3IR), as the world's population passed three billion in 1960, five billion in 1987 and six billion in the 1990s.

The 3IR began with mainframe computing, then personal computing and the internet. It was the real beginning of the big data explosion, allowing the world to emerge from a period in which data was locked away by wars and political conflicts into a period of rapid, open data from all countries.

This coincided with a major revolution in media built on satellites, the internet and social media. Data volumes became so massive that the media failed to analyse even 5% of the data that had accumulated over decades or was circulating at the time.

The Role of Big Data in AI Journalism

In the Fourth Industrial Revolution (4IR) and the era of Artificial Intelligence Journalism, big data has become the foundation and dynamic engine of communication, media and data sources, serving as the fuel for all 4IR technologies.

In 2000, Peter Lyman and Hal Varian (now chief economist at Google) attempted for the first time to quantify the amount of digital information in the world and its rate of growth. They concluded: “The world’s total yearly production of print, film, optical and magnetic content would require roughly 1.5 billion gigabytes of storage. This is the equivalent of 250 megabytes per person; that is for each man, woman and child on Earth.”
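The per-person figure follows from simple division. As a sketch of the arithmetic (assuming a world population of roughly six billion, the approximate figure around the time of the study, and 1 GB = 1,000 MB):

```python
# Check of Lyman and Varian's per-person figure: 1.5 billion gigabytes
# of yearly content spread over roughly 6 billion people.
total_gb = 1.5e9          # total yearly production, in gigabytes
population = 6.0e9        # approximate world population circa 2000

per_person_gb = total_gb / population   # 0.25 GB per person
per_person_mb = per_person_gb * 1000    # 250 MB, matching the quoted figure

print(f"{per_person_mb:.0f} MB per person")
```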

Big data investments are expected to grow at a CAGR of approximately 13% over the next ten years, exceeding $200 billion by the end of 2030, which will strongly boost Artificial Intelligence Journalism and all its tools and techniques.
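As a rough check on what that projection implies (the article does not state a starting market size, so the figure below is derived, not sourced), compounding at 13% per year for ten years multiplies a base value by roughly 3.4, which would put the implied starting market at around $59 billion:

```python
# Implied starting market size if big-data investment compounds at a
# 13% CAGR for 10 years and reaches $200 billion by the end of 2030.
cagr = 0.13
years = 10
target = 200.0  # billions of USD, the article's 2030 figure

growth_factor = (1 + cagr) ** years      # ~3.39x over the decade
implied_start = target / growth_factor   # ~$59 billion

print(f"growth factor over {years} years: {growth_factor:.2f}x")
print(f"implied starting market size: ${implied_start:.0f} billion")
```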

There are three main areas where big data has the potential to disrupt the status quo and stimulate economic growth within the media and entertainment sectors:

Products and Services: Big data-driven media businesses can publish content in more sophisticated ways.

Customers and Suppliers: Ambitious media companies will use big data to learn more about their customers, their preferences, profiles and attitudes, and will use that information to build more engaged relationships. Without big data applications, finding the most interesting content remains a wasteful and random exercise.

Infrastructure and Process: While start-ups and SMEs can operate efficiently with open-source and cloud infrastructure, updating legacy IT infrastructure is a challenge for larger, older players. Legacy products and standards still need to be supported during the transition to big data ways of thinking and working, and processes and organisational culture may also need to evolve to keep pace with what big data offers.
