We produce more data than ever. Every day, people send more than 500 million tweets, create four petabytes of Facebook data, and generate four terabytes of location data.
At the same time, companies are turning to networks of Internet of Things devices to deliver constant data streams that they use to drive policy. In some cases, this information becomes publicly available. Government agencies collect and publish massive amounts of information, such as minute-to-minute wind speed readings for the year's hurricanes or a public log of earthquakes and their intensities.
The entire digital universe is expected to reach 44 zettabytes by 2020. Put another way, that is roughly 40 times as many bytes as there are stars in the observable universe.
This is both good news and bad news for data journalists. There is more information than ever to analyze, and for almost any topic, someone has quantified it. At the same time, a single data set can contain more information than one journalist could ever hope to analyze by hand. To take advantage of this information, journalists need advanced tools.
As a result, data journalists are turning to artificial intelligence (AI) to analyze these massive data sets. Along with big data analytics, AI can help …
Read More on Datafloq