Since July 21, I've talked with more authors whose works I've read (Ian Stoodley, Elham Sayyad Abdi), been asked to showcase GVSU Libraries (that is, to guest lecture in two different master's-level Information, a.k.a. library school, courses), corresponded with chapter authors for the book, worked on my own chapter - especially creating a model for it - and attended another lecture by Ron Arkin (Georgia Tech) titled "Lethal autonomous robots and the plight of the noncombatant."
Ron's work in robot ethics is fascinating. He addressed the question: In war, what is the appropriate role of robotics technology? He wants AI and robotics to be used to reduce ethical infractions in the military, especially noncombatant casualties, by giving robots A) the right of refusal, B) the ability to monitor and report others' behavior, and C) protocols that incorporate existing and relevant laws of war and the Geneva Convention. Robots should be used ALONGSIDE people, not as replacements for them. Robots can act conservatively, can make use of more sensors and process and integrate more data in a shorter amount of time, do not have emotions that might cloud judgment (e.g., soldiers executing wounded combatants), and can independently and objectively monitor ethical behavior. Ron described "ethical architecture" and its coding, then declared that there is already proof of concept (i.e., robots have been shown capable and effective). If you want to read more, MeL has his book: Governing lethal behavior in autonomous robots http://elibrary.mel.org/record=b17727768~S15.
Work in AI (artificial intelligence) is going to affect us all very deeply in the near future. I watched a fascinating program about this last night, and the upshot was that our jobs in higher education will be to teach human students social skills and how to interact with technology (machine intelligence). This is why I am so passionate about the "relational" approach to information literacy - appreciating the different ways of understanding and experiencing learning includes how we relate to each other and to information - this is the future! Yes, we have to include behavioral skills too, along with the socio-cultural elements - it's a "both/and."