So we are running the risk of normalising the creative field by letting you take control of your processes?
That risk exists, but it depends on how the systems are used.
AI models are trained on large datasets. That creates two structural tendencies:
* Convergence toward dominant patterns: widely represented styles, themes, and formats are easier to reproduce than marginal or highly local ones
* Compression of variation: outputs often sit near statistical averages unless explicitly pushed otherwise
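The "compression toward averages" tendency can be made concrete with sampling temperature, the standard knob for sharpening or flattening a model's output distribution. A minimal sketch (the four-way "style" distribution and its logit values are invented for illustration):

```python
import math

def softmax(logits, temperature):
    # Scale logits by 1/temperature, then normalize into probabilities.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Toy distribution: one dominant style, then increasingly niche ones.
logits = [3.0, 1.0, 0.5, 0.2]

low_t = softmax(logits, 0.5)   # sharpened: mass collapses onto the dominant style
high_t = softmax(logits, 2.0)  # flattened: niche styles regain probability

print(low_t[0] > high_t[0])    # the dominant style dominates more at low temperature
```

Low temperature concentrates probability on the most represented pattern (convergence); higher temperature redistributes mass toward marginal options, which is one mechanical sense in which outputs can be "explicitly pushed otherwise."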
In the creative industries, a field often studied under cultural economics, this can lead to partial standardization if AI is used passively (e.g., default prompts, template workflows).
However, normalization is not automatic or total:
* AI does not “take control” independently; it operates within human-defined prompts, datasets, and selection processes
* Users can deliberately steer outputs toward specific localities, minority aesthetics, or non-dominant references
* Constraints can be inverted: instead of averaging, AI can be used to amplify niche or hybrid forms if guided precisely
The actual risk emerges under specific conditions:
* heavy reliance on default outputs
* platform incentives favoring scalable, familiar content
* reduced investment in local or experimental practices
Under those conditions, diversity can narrow. Under different conditions—curated inputs, critical use, and strong local ecosystems—AI can also increase variation by recombining influences that would not normally meet.
So the issue is less “AI taking control” and more how much agency is ceded to automated averages versus actively directed differentiation.