It does not depend on professionalism; it depends on data management conditions.
There are four data management complexity levels, sorted by mode of software use and type of data input:
- Individual use + vanilla data
- Individual use + imported data
- Collaboration + vanilla data
- Collaboration + imported data
Each level can reach a professional state. But.
At the lowest level of complexity (individual + vanilla data, created from scratch), you can afford any kind of data management flexibility, including heavy fractioning, since you are the only data handler, owner, and generator. You can cope with everything you create in many possible ways.
At the highest level of complexity (collaboration + imported data), where lots of uncontrollable garbage data is imported from many different sources, generated automatically by various software applications, and all of it is multiplied by teamwork, the autopurge + fake user strategy is the only viable solution, and any other data management flexibility is quite deadly, because the amount of garbage at this level forms a dense stream that is not humanly possible to manage.
So data management demands depend only on the level you belong to.
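For illustration only, here is a minimal sketch of the two axes and the four resulting levels as a lookup from (mode of use, data input) to the viable strategy argued above. The enum names and the `viable_strategy` function are hypothetical, introduced just to make the mapping explicit; they do not come from any particular software.

```python
from enum import Enum, auto

class Use(Enum):
    INDIVIDUAL = auto()      # single data handler, owner, and generator
    COLLABORATION = auto()   # teamwork multiplies every data problem

class DataInput(Enum):
    VANILLA = auto()         # data created from scratch by the user
    IMPORTED = auto()        # data pulled in from external sources and other software

def viable_strategy(use: Use, data: DataInput) -> str:
    """Hypothetical mapping of complexity level to the viable data management approach."""
    if use is Use.INDIVIDUAL and data is DataInput.VANILLA:
        # lowest level: any flexibility is affordable, including heavy fractioning
        return "any flexibility, including heavy fractioning"
    if use is Use.COLLABORATION and data is DataInput.IMPORTED:
        # highest level: a dense stream of garbage, only autopurge + fake user survives
        return "autopurge + fake user only"
    # intermediate levels: somewhere in between
    return "limited flexibility"

if __name__ == "__main__":
    for use in Use:
        for data in DataInput:
            print(f"{use.name} + {data.name}: {viable_strategy(use, data)}")
```

The point of the sketch is simply that the recommended strategy is a function of the level, not of how professional the user is.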