One must also consider phenomena such as Wikipedia, which through a participatory mechanism has created a new form of universal encyclopedia, constantly updated in dozens of languages. Such a project would have been simply unthinkable only 20 years ago, in the days of CD-ROM encyclopedias.

No publisher would have had the resources to take on such a cyclopean and universal publishing project, which was made possible by a network of volunteer contributors. The issue of quality control should always be taken into consideration, even with Wikipedia, but its usefulness is unquestionable, and the same applies to Wikimedia Commons, the sister project that offers public domain materials and documents.

The “Open Source” paradigm, characterized by the collective and open development of software, has also been applied to other contexts, such as cartography, resulting in large projects that are “open” to the contributions of creative communities and free for the public to use.

Institutions have also initiated large participatory “Open Content” projects, such as Europeana, the European digital library built with contributions from member states and hundreds of cultural institutions.

These open and shared knowledge projects are now confronted with Artificial Intelligence, which on the one hand exploits their content to train its models, and on the other can be used to support and improve the work of contributors to collective knowledge.

Forms of Open Content should not be confused with, or replaced by, Artificial Intelligence models.