https://www.abc.net.au/news/2024-10-05/robot-vacuum-deebot-ecovacs-photos-ai/104416632
Ecovacs, a major manufacturer of robot vacuums, has come under fire for its data collection practices. The company is reportedly collecting photos, videos, and audio recordings from inside customers’ homes to train its AI models.
While Ecovacs claims users are “opting in” to this data collection through a smartphone app, critics argue the process lacks transparency. The opt-in prompt does not tell users which specific data will be collected, and the privacy policy, located elsewhere in the app, permits broad data collection for research purposes.
This data collection raises significant privacy concerns, especially considering recent reports of critical security flaws in Ecovacs vacuums. These flaws could potentially allow hackers to remotely access the camera and microphone on the devices.
Data Used to Train AI Models
Ecovacs confirms the collected data is used to train its AI models, which power features like obstacle avoidance and navigation. However, the company maintains that the data is anonymized before being uploaded to its servers.
History of Robot Vacuum Data Leaks
This incident is not the first time robot vacuum data has been leaked. In 2022, intimate photos taken by iRobot vacuums were leaked on social media, highlighting the potential risks associated with in-home cameras.
Privacy-Preserving Camera Technology
Researchers in Australia are developing a solution that could eliminate the need for high-definition cameras in robot vacuums altogether. Their approach scrambles the image data before it is processed, so the result is unintelligible to a human viewer while remaining usable by the robot itself for navigation.
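The article does not describe the researchers' actual scrambling scheme, which reportedly operates on the optical signal before a conventional image ever exists. As a rough software analogy only, the sketch below uses a keyed pixel permutation to show the general idea: scrambling destroys the spatial layout a human would need to recognize a scene, yet aggregate cues a robot might use (such as overall brightness) survive. All function names and the data format here are hypothetical illustrations, not Ecovacs or research code.

```python
import hashlib
import random

def scramble(image, key):
    """Permute pixel positions using a permutation derived from a secret key.

    `image` is a flat list of pixel intensities (a toy, hypothetical format).
    The scrambled output reveals no spatial structure to a viewer, but a
    device holding the key could reproduce or invert the permutation.
    """
    seed = int.from_bytes(hashlib.sha256(key.encode()).digest()[:8], "big")
    order = list(range(len(image)))
    random.Random(seed).shuffle(order)  # deterministic for a given key
    return [image[i] for i in order]

def mean_brightness(pixels):
    """A global statistic unaffected by pixel order: simple navigation cues
    like 'is this area dark?' remain computable on scrambled data."""
    return sum(pixels) / len(pixels)

image = [10, 200, 30, 40, 250, 60, 70, 80]
scrambled = scramble(image, key="device-secret")

# The same pixels are present (permutation), and aggregate cues survive,
# but the original spatial arrangement is hidden from anyone without the key.
assert sorted(scrambled) == sorted(image)
assert mean_brightness(scrambled) == mean_brightness(image)
```

The design point the analogy captures is that scrambling is not encryption of a finished photo: the goal is that a usable photo never exists in the first place, so even a compromised device has nothing human-viewable to leak.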
The Takeaway
The Ecovacs data collection case underscores the ongoing tension between user privacy and the development of AI technology. Consumers are urged to carefully consider the data collection practices of any smart home device before purchase. The development of privacy-preserving camera technology offers a potential solution for the future.