Most scientists of the 19th and 20th centuries relied on small data for their discoveries. According to Martin Lindstrom, author of “Small Data: The Tiny Clues That Uncover Huge Trends”, who studied the 100 biggest innovations, about 65% of them were based on small data.
Small data is data that comes in a volume and format that make it accessible, informative, and actionable for humans. Big data is usually associated with machines, while small data is usually associated with humans.
The only way to comprehend big data is to reduce it to smaller, visually appealing representations of the various aspects of a large data set. Two human-centered principles that emerged from the experiment can help organizations get started on their own small data initiatives:
- Balance machine learning with human domain expertise: A number of AI tools have been developed for training AI with small data. For example, few-shot learning teaches AIs to identify object categories (faces, cats, motorcycles) from only one or a few examples instead of hundreds of thousands of images. In zero-shot learning, the AI can accurately predict the label for an image or object that was not present in its training data at all.
- Focus on the quality of human input, not the quantity of machine output: A small data set can be more informative than a large one, as it may consist of videos, images, text, etc. that capture the reality of a lived experience within its contextual environment. Therefore, companies should focus on collecting qualitative data for analysis, for example through open-ended surveys and questionnaires that give respondents a comfortable space to be honest, yielding accurate, high-quality data as a basis for the company’s analysis.
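The few-shot idea above can be sketched with a minimal nearest-prototype classifier: assuming each class is summarized by the mean of just a few labeled example feature vectors (a simplification of prototypical networks; the 2-D vectors and class names below are invented for illustration), a new sample is labeled by its closest class prototype:

```python
from math import dist
from statistics import mean

def fit_prototypes(support, labels):
    """Average the few labeled examples of each class into one prototype."""
    prototypes = {}
    for label in set(labels):
        examples = [x for x, l in zip(support, labels) if l == label]
        prototypes[label] = [mean(dim) for dim in zip(*examples)]
    return prototypes

def predict(prototypes, query):
    """Label a new sample by its nearest class prototype."""
    return min(prototypes, key=lambda lbl: dist(query, prototypes[lbl]))

# Toy 2-D "features" (invented for illustration): two examples per class.
support = [(0.0, 0.1), (0.1, 0.0),   # class "cat"
           (1.0, 0.9), (0.9, 1.0)]   # class "motorcycle"
labels = ["cat", "cat", "motorcycle", "motorcycle"]

protos = fit_prototypes(support, labels)
print(predict(protos, (0.05, 0.05)))  # -> cat
print(predict(protos, (0.95, 0.95)))  # -> motorcycle
```

With only two examples per class, the classifier still separates the two toy categories, which is the essence of learning from small data; real few-shot systems apply the same idea to learned feature embeddings rather than raw coordinates.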
By Maen Altengi, edited by Mazeat Koreny