Concerns about Zoom’s New Terms of Service
| By Xuan Zhong |
Zoom’s latest terms of service add a new clause concerning users’ personal information. Although Zoom has repeatedly emphasized that users can decline to share data with the company while using the application, many users have criticized the provision as evidence that Zoom will use their private data to train its AI systems. It also raises a deeper concern: users have no way to verify that their personal data is not being misappropriated.
In Section 10.4 of the terms, users agree, before using the software, to “grant Zoom a perpetual, worldwide, non-exclusive, royalty-free, sublicensable, and transferable license” for various purposes, including “machine learning, artificial intelligence, training, testing, improvement of the Services, Software, or Zoom’s other products, services, and software, or any combination thereof.” Zoom currently uses AI to generate automated meeting summaries and detect spam activity, and it is developing other AI-based services.
Behind the buzz generated by Zoom’s updated terms lies an ambivalence about AI. More and more people feel that the spread of AI has put them in a dilemma. AI will undeniably penetrate ever more deeply into daily life, and it does bring users real convenience. However, because AI improves through training on large amounts of data, many users are torn between enjoying smarter services and disclosing more personal information. How to draw the boundary between the personal and public spheres has sparked debate. In short, while some people do not mind their personal information being made public, the wish to keep all of it private deserves equal respect.
The “unequal” status between software companies and users, as reflected in the Zoom incident, also deserves attention. Software developers have strong legal and technical teams and can design mechanisms that users cannot easily detect to extract their personal information: inserting one or two new clauses into a long set of terms and conditions, presenting the Terms of Service with only two options, “Accept All” or “Cancel”, or silently enabling the microphone in the default settings. Users must be very careful not to fall into these “traps”. We need a strong third party, for example more detailed legal regulation, to restrain software companies and ensure that their expertise is not abused. At the same time, users should pay closer attention to privacy protection and choose companies that are transparent about how they use users’ data and respect their privacy.