Background: The digital revolution has set in motion a ‘big bang’ of data generated by human digital interactions and information storage. Data-analytic tools and sources of large datasets present great opportunity, as well as risk, for evidence generation. This presentation outlines the current status of big and small data analytics and their potential uses to improve and innovate evidence generation.
Context: Big data are data generated at high volume, velocity and variety. An estimated 1.7 billion bytes of data are generated digitally every minute. Using big data in evidence generation means turning imperfect, complex, often unstructured data into actionable information, which requires computational techniques to uncover trends and patterns within and between extremely large and complex datasets, including development data sources.
Big data are characteristically digitally generated, passively produced, automatically collected, geographically or temporally trackable, and continuously analysed. Such data have been used, particularly in the private sector, through geo-sensing, community radio, postal data, drone data, social media and service data. However, access to this form of secondary data poses a challenge. In South Africa, organisations such as Code4SA and Open Data Durban have begun to drive access to public datasets. At this nascent stage of big data analytics, harnessing its use requires development professionals to become more aware of its potential, its sources, and its uses. In addition, applying big data analytic approaches to small data could improve how data are translated into evidence.
Approach: The paper further presents cases of big and small data use in development, current open-source and advanced platforms for sourcing and analysing data, and the debate on privacy and policy surrounding big data use.