A minister of the French government summoned a few of the most eminent merchants and asked them for suggestions on how to stimulate trade, as if he would know how to choose the best of these. After one had suggested this and another that, an old merchant who had kept quiet so far said: “Build good roads, mint sound money, give us laws for exchanging money readily; but as for the rest, leave us alone! [Lasst uns machen]” If the government were to consult the Philosophy Faculty about what teachings to prescribe for scholars in general, it would get a similar reply: just don’t interfere with the progress of understanding and science. (I. Kant, The Conflict of the Faculties, AK VII, 19-20 n2)
When Francesca Di Donato wrote the article we are proposing for open peer review, the COARA principles and the coalition’s internal governance could perhaps still have been developed in a Kantian way. Now, with the benefit of hindsight, we are in a better position to see whether that potential has in fact been realized.
In the concluding remarks of her article, Francesca Di Donato argues, following Kant, that the evaluation of research belongs only to the scientific community: “philosophical activity is fundamental research, the exercise of a method which consists in subjecting any doctrine to criticism, and as such it is the fundamental precondition of all knowledge. It consists of free communities of peers who learn from their mistakes and constantly self-correct.” Therefore, she concludes, “changing the way we evaluate is not enough if we do not also discuss the evaluators themselves. The last point is at the core of a responsible research assessment reform. In fact, the ARRA requires the direct involvement of individual academics and of scientific communities in the definition of new criteria and processes (ARRA, 2022, pp. 3, 5, 6, 9), but academic communities should assume collective ownership and control over the infrastructures necessary for successful reform. This last point is not as prominent in the ARRA as it should have been – and should be a central governing principle in the future CoARA.”
The following presentation addresses two questions:
- Did COARA take Francesca Di Donato’s suggestions seriously?
- If not, why not? Was it simple reluctance, or were there deeper structural reasons?
If this were only a domestic issue, the fact that some of the international literature on research assessment in Italy appears misleading to many Italian-speaking researchers would not matter much. Now, however, ANVUR, the Italian research assessment agency appointed by the government, is participating in the reform process initiated by the COARA coalition in a way that is not only inconsistent but may put the entire COARA project at serious risk of failure. We have therefore decided to present a translation of a 2017 article dealing with Andrea Bonaccorsi’s closed-access book La valutazione possibile. Teoria e pratica nel mondo della ricerca (Bologna: Il Mulino, 2015). Andrea Bonaccorsi, a former member of ANVUR’s board of directors, has attempted to provide one of the broadest theoretical justifications for the Italian research assessment system, which is pervasive, centralized, mostly bibliometric, and under government control.
We are putting the article out for open peer review by inviting a few experts, but anyone’s comments are welcome. To take part, read the instructions in the grey box at the bottom of this page.
So-called ‘AI’ is a derivative of a surveillance business model that allows Big Tech to provide extrajudicial surveillance services for both civil and military purposes. Since such mass surveillance is banned, Big Tech has, through regulatory capture, produced the AI Act, under which fundamental rights can be violated with impunity as long as there is no foreseeable harm. Big Tech’s next target is therefore any norm that still protects fundamental rights.
The attacks on the GDPR are sneaky attacks on the fundamental rights that the GDPR protects, not attacks on the alleged obstacles that stifle innovation.
On this neoliberal attack on fundamental rights, Daniela Tafani submits for open peer review her article GDPR could protect us from the AI Act. That’s why it’s under attack.