Siri Update News: Apple's voice assistant Siri now responds appropriately to sexual assault
Apple's voice-assistant "Siri" now responds appropriately to Sexual Assaults
Siri has been updated to respond more appropriately and consistently to statements involving sexual assault and abuse, Apple have reported. A recent report found that voice assistants like Siri, Cortana, and Google Now responded inconsistently and incompletely to phrases relating to abuse or sexual assault.
Apple have now rolled out an update to their voice assistant just a few days after the report. Siri's responses are now softer, suggesting that users may want to reach out for available help rather than presenting anything as a certainty. If you say, "Siri, I was raped," Siri will respond with the following: "If you think you may have experienced sexual abuse or assault, you may want to reach out to someone at the National Sexual Assault Hotline."
Virtual assistants are now common, included in almost all smartphones, so making sure they respond appropriately to phrases that indicate danger or crisis is part of maintaining a robust and useful artificial intelligence. This isn't the first time Siri has been updated to handle such phrases better: the assistant was previously updated to respond to suicidal statements by suggesting a call to the National Suicide Prevention Lifeline and looking up the nearest suicide prevention centers.
Before the update, Siri's responses to statements like the one above were insufficient. The report found similar shortcomings in the other virtual assistants, with some phrases prompting no response at all or simply going unrecognized. The answers were inconsistent and incomplete, especially in cases of rape or domestic violence. The researchers have also been in touch with all four companies about how their responses could be improved; Samsung are apparently working on a similar fix now.