A really interesting talk on AI versus humans, and on computers replicating and replacing human processes.
In research, outcomes are often measured as human performance versus AI: essentially, can it be proven that AI is as good as or better than humans?
Dyer demonstrated that combining AI with human performance consistently gives better results than either AI or humans can produce alone. The question is: how can we work with AI to improve existing processes and complement human skills?
Dyer spoke of the narrative of AI in the media: the recurring theme of robots as a threat to human beings, that they will take our jobs and eventually threaten our very existence by replacing us. In this portrayal, we are the weaker party.
Image from film 'Ex Machina'
Dyer framed things differently: he spoke of AI cooperating with humans, complementing our skills, and being used as a second opinion or cross-reference.
In terms of diagnosis, AI is easily trained to determine the severity of symptoms and to decide accurately whether a patient needs immediate treatment or no treatment at all. This means a clinician's workload can be significantly decreased, and more time can be spent on complex cases that need further analysis.
The crucial part of the AI puzzle yet to be deciphered is what happens when the machine sees something unfamiliar. Instead of saying "I'm not sure", it might automatically place it in its nearest category. These are the instances where we see how vital the clinician's role is and how irreplaceable they are.
Imagine, though, AI in a clinical setting taking on the tasks that involve memorising and recalling information (essentially what we would define as the 'knowledge' involved). Clinicians would then have more time to engage other important skills, such as compassion, empathy and understanding, to make a fuller, more rounded diagnosis.
This resonated with me and my thoughts on our project. Could this lead to more of the patient narrative being heard and given value in the clinical setting? By giving clinicians better resources to utilise their human skills (e.g. compassion), richer and more valuable information might be drawn from the patient, helping the clinician to make a better assessment. At the same time, AI could capture the patient narrative and analyse it using its knowledge. Together, this would provide a more rounded picture of the patient.
My mental wellbeing is severely affected by feeling misunderstood. Simply feeling heard is in itself medicine for me.