Because we care about data big and small!
We (humans) produce more and more data every day.
Unless you work for Google, chances are your “big data” is not that big at all.
What used to be “big” yesterday is “large-ish” today and will be “small” tomorrow.
Definitions of “big data” usually involve more than sheer volume, such as the velocity at which data arrive and the variety of their formats.
Big data technologies are great for data that are truly big.
For many “big data” applications, setting up a cluster of machines would be overkill and not financially viable.
Most users are stuck with laptops, workstations or individual servers.
The tools we have for those machines tend to break down even with modest amounts of data.
People often run “big data” technologies on single machines, which is inefficient.
Ergo, we need new tools, inspired by the “big data” hype, that can process larger amounts of data without the hardware and management overhead of current “big data” technologies.
Many users of such tools also lack experience in setting up and running a data-intensive project.
Ergo, we need project management tools for such endeavours.