Later this year, Apple will be unveiling a new safety feature in iOS 15 for iPhone and iPad that uses technology to detect and report potential child sexual abuse material (CSAM). At the same time, the company says it will preserve user privacy.
This feature aims to give parents and children who use these services better protection from online harm. For example, it will use filters to block potentially sexually explicit photographs sent to or from a child’s iMessage account, and it will intervene when someone searches for terms related to CSAM via Siri and Search.
BFM talked to the Head of Technology, Media and Telecommunications about how effective this technology is likely to be and the concerns surrounding user privacy. Among other topics, they discussed how effective it would be at curbing the spread of child abuse imagery online; the privacy concerns that may arise from implementing such technology; and whether information obtained through it would be usable in a court of law.
Listen to the episode here: https://www.bfm.my/podcast/evening-edition/evening-edition/apple-to-launch-child-safety-feature