This is an AI translated post.
Ambiguity Towards Digital Privacy
- Writing language: Korean
- Base country: All countries
- Category: Information Technology
Summarized by durumis AI
- The US government ordered the deletion of TikTok from federal agency devices due to concerns about its data collection practices, with the European Union and Canada following suit.
- However, TikTok continues to grow, recording high advertising revenue and individual dwell time, reflecting the reality that users exhibit ambiguous attitudes towards digital privacy.
- Companies should consider strategies to meet users' demands for privacy by providing clear, concise information and easy-to-follow guidelines.
At the end of last month, the US government formalized a measure to delete TikTok from all devices and systems within federal agencies. The decision, which the European Union and Canada are following, is based on TikTok's sophisticated methods of collecting user data and on the Chinese government's potential access to corporate data. Users have widely recognized the privacy concerns surrounding TikTok for the past two to three years, as reflected in search keywords such as "delete TikTok" on YouTube and Google, and in refusals of TikTok's cookie settings. Last month, France imposed a €5 million fine on TikTok UK and TikTok Ireland for making it difficult to refuse all cookie collection, another very practical backlash against concerns over how user data is collected and used.
However, TikTok is expected to have generated close to $10 billion in advertising revenue last year, its per-user dwell time is the longest among major platforms, and its in-app purchase revenue has grown for seven consecutive quarters.
Therefore, it is difficult to conclude that TikTok's growth stems simply from the platform's appeal. Rather, the phenomenon should be seen as a mirror reflecting users' currently ambiguous attitude towards digital privacy.
Privacy is an individual's ability to control who can know their information, why, and by what means. If privacy in the physical world can be protected simply and clearly with a hat or a laptop privacy film, what about privacy management in the digital world? People are outraged by articles about companies' excessive location tracking, yet they do not change their iPhone settings. Perhaps they are simply yielding to social pressure to say they care, while compromising in order to keep using the products and services built on that technology. This looks like a textbook example of what social scientists call the "intention-behavior gap."
Digital privacy is difficult for many people even to understand. How many people read the privacy policy at the bottom of the sites they visit? The New York Times, after reviewing the privacy policies of 150 companies, described them as "an incomprehensible disaster," with some exceeding the complexity of Kant's "Critique of Pure Reason."
For most people, digital privacy has so far been a largely theoretical concern. They feel fear when they hear stories of others being "hacked," having private photos leaked online, or becoming victims of data-driven advertising; otherwise they simply experience targeted advertising, such as pop-up notifications and spam calls, as an everyday annoyance.
However, the growing sense of being a constant target of attempts to probe their intentions is fostering a "disconnect" culture, in which people take a stricter stance toward companies and their interests. Apple has already turned this into a business opportunity: within its own ecosystem, the App Store, it lets users decide whether existing digital advertising platforms may access their data, symbolically appealing to a user-centric approach to digital privacy.
So, how can companies strategically link users' potential demands for privacy to their future goals?
First, simplicity gives a sense of security. Fears about privacy stem from opaque intentions and complex policies. People start worrying when they do not know the scope in which their personal data is used. Policy pages laid out as long walls of text read as if they were written to protect the company rather than to inform people. Instead, people need simple sentences that help them understand what the company's technology can and cannot do.
Second, provide easy behavioral guidelines. In the physical world, privacy is intuitive and tangible: it can be controlled easily with clothing, masks, curtains, and the like. Privacy in the digital realm should feel the same way, and this can start with opportunities to take small, easy, yet symbolic actions. Many people still tape over their laptop cameras. Snapchat differentiated itself as early as 2016 by letting users act on their privacy controls through its My Eyes Only feature. The key is that such privacy is visible in the interface and controllable through everyday, intuitive interactions.
Strategies that make privacy intuitive and concrete must no longer ring hollow.
*This article was originally published as a named column in the Electronic Newspaper on March 14, 2023.