COGNIMAN Conversations – with Debora Zanatto

4 mins to read

The third episode of COGNIMAN Conversations has aired on YouTube. Subscribe to our YouTube channel to stay updated.


Scroll below for the readable version of the interview.


My name is Debora Zanatto and I am a Lead Consultant at Deep Blue. I have a background in psychology, with a little bit of neuroscience and human-robot interaction.


Basically, I bring a little bit of human knowledge into human-machine, or human-robot, interaction. Within Deep Blue and within COGNIMAN, I lead Work Package 4, which is the one focused on human-machine interaction: setting guidelines for what the future interaction is going to look like, and working on acceptance and the ethical side of this newly designed interaction, as well as the new or redesigned skills that future operators and users will need.


What we did at the initial stages of the project was try to understand the current role of the operator, at time zero, within each use case: what they do, what kind of issues they are having, and what the project can do for them. Once we understood the current situation, we tried to match it with the potential solutions the technical partners are proposing, and imagine the new scenario: how the new situation would look, where the operator would be, where the user would be, what such a person would be doing, and whether the solution would bring benefits in reality or not. Because if not, we need to start rethinking the solution a little bit.


There is a little bit of everything. Every use case brought kind of a different vibe, I would say. But overall, the general feeling was curiosity towards the new solution. A little bit of doubt, too, probably brought on by the fear of not knowing what is coming. That we can easily fix by keeping the user and the operator in the loop and feeding them much more information about what we are doing, so that they're a little less scared.


I think the extraordinary side of this project is the fact that we are diving into very particular use cases, situations in which the technological solution still needs to be very tailored to the situation. It's not something that can be widely applied, not at this stage. But at the same time, it seems that the results we are taking back home, especially from a user perspective, are recurrent, always the same. Trust, for example. So I think the special thing is that we are diving into something very specific, probably to find something that was already there, that we already knew.


The biggest achievement, especially on the Deep Blue side, is having a wider knowledge of the new scenarios. The main difficulty for us was finding a way to communicate properly with the technical partners. Different languages, absolutely different. And it's difficult sometimes to fine-tune and find the same kind of vibration, or frequency, let's say. But we managed to do that. Now everything seems to open up and become a little clearer. So we managed to get a vision of more or less where the user is going to be and what they're going to be doing, which is amazing, really.


Still keeping people together! Keeping them working, keeping them focused on what we are going to be doing: that is probably going to be the biggest challenge, because we don't know exactly what kind of issues we might encounter in the next few months. The technology might work nicely in simulation, but then it doesn't fit the user, and we need to sit around a table and discuss everything. But it might even be the opposite: the technology is not yet at the stage we hoped, and we need to start redesigning the solution.


This was initially required by the European Commission. This is such a unique project, with such heterogeneous pilots, that there are many things to take into account, especially from the ethical point of view. And since we are not 100% aware of what the repercussions on the operator are going to be, we need much bigger experts than ourselves to oversee the work and tell us: listen, you have to pay attention to a few issues, and you need to take some steps to fix them.


Mainly, their contribution is going to be monitoring the state and progress of the project through deliverables and through milestone meetings. They are going to review particular deliverables that are linked to ethical concerns, and they are going to participate in meetings in which they also present, so to speak, the outcomes of their evaluation.


I think it is the fact that we are diving into something very specific, and somehow unique in terms of pilots, only to find that the solution is something that can be used in other contexts and other situations. Because if it works in something so intricate at this stage, it means it can be applied to easier scenarios or situations.
