Don't want to contradict your point, just a FWIW ...
Yesterday I saw a TV ad for Meta suggesting that with VR you will be able to stand in a small conversational circle with Socrates.
Yea, it will be educational.
But skepticism aside, with a personal instance of AI some benefits could be significant. Just having a personal AI critic to read your writing and give feedback could be a huge benefit -- or if not your writing, your thought process.
Right? The fact that we have all this cool tech, but it's all actively being used against us instead of working for us really dulls my enthusiasm for technology. What good is it if we get robot butlers or personal AI assistants if the assistants can't keep our personal info to themselves and the robots record everything they see and hear and stream that back to MegaCorp along with extensive telemetry detailing every time we have sex with them? Already we've seen Roombas being used to spy on the people who paid for them, with iRobot Corporation selling maps of the inside of their homes to random 3rd parties.
No, this is not an accurate assessment of what is at stake here.
So yeah, targeted ads and spooky TV sets are annoying at the least, but the point is that the infrastructure could easily support 1984-style monitors.
Regardless of the purpose of current usage, this is infrastructure for a surveillance society. That is the issue. The infrastructure is not going to go away unless we rip it down [down to and including the transport layer] and rebuild with privacy and other human rights protections as a primary requirement of ubiquitous networking. DARPA was not concerned about these matters, so all of this falls under "operator's discretion".
machine voice: Your post shows a combined social volatility plus initiative score over 37. Please report to the HappyWellnessCenter for your injections. ... While you are there, the weed is free.
Their algorithm dispenses a personalized feed that uses soooo much data from your watch preferences alone, regardless of its permissions. This is far ahead of the suggestion systems on other platforms, which are generally driven by collaborative filtering -- which, inadvertently, acts as a stopgap against psychological exploitation of users. Other platforms plateau in their profile-building because of the implicit conformity of the exchanges on social media. Not TikTok, though.
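For context, the collaborative filtering that those other platforms lean on is simple enough to sketch in a few lines: recommend items liked by users whose interaction history resembles yours. The data and names below are purely illustrative, not any platform's actual system:

```python
from math import sqrt

# Toy interaction matrix: user -> {item: rating}. Entirely made-up data.
ratings = {
    "alice": {"cats": 5, "cooking": 3, "news": 1},
    "bob":   {"cats": 4, "cooking": 2, "news": 1, "dance": 5},
    "carol": {"news": 5, "politics": 4},
}

def cosine(u, v):
    """Cosine similarity between two users' rating vectors."""
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    dot = sum(u[i] * v[i] for i in shared)
    norm_u = sqrt(sum(x * x for x in u.values()))
    norm_v = sqrt(sum(x * x for x in v.values()))
    return dot / (norm_u * norm_v)

def recommend(user, k=2):
    """Score items the user hasn't seen, weighted by similar users' ratings."""
    scores = {}
    for other, theirs in ratings.items():
        if other == user:
            continue
        sim = cosine(ratings[user], theirs)
        for item, r in theirs.items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim * r
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

Note the ceiling this implies: the system only knows what users explicitly rated or shared, which is exactly the "plateau" described above -- it never sees *how* you watched, only *what* you engaged with.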
I want to start by saying this isn't your Cold War-sheen propaganda. This is Harmonious Society, "river crab" propaganda. This method can belie and camouflage the salient points of persuasion, sure. But it's not only intended to stream a deluge of coercive media. The purpose is to dominate the aesthetic discourse around topics, learning as much about you as they can, while ingratiating their clusters of community norms at the same time. There isn't a monolithic identity either. It's about picking the winners of our long-form cultural evolution. Country-to-country data definitely shows varying preferences, but aside from the initial exposure to determine what clusters you associate with, these feeds are tailored heavily by user interaction.
The public description of their process divulges the use of object and character recognition, along with audio/tag keywords, applied to all items in a video. So even on a virtual machine with no app permissions, this site is building an incredibly detailed profile of your media consumption habits. For example, when you stop watching a video at a certain point, TikTok records what items were in it up to that point that kept you watching, and what audio/objects occurred at the point of losing interest. This catalog culminates in a set of meticulous preferences that end up correlating with all sorts of useful suggestion information: preferences for video formats (do you like dance videos, rants, or documentary content?), types of speech (language and culture obviously, but also intonation, volume, frequency of use, monologue vs. dialogue), length and pacing, even framing of the camera. It's all logged and associated with your device and you. Along with this, topics and community clusters are accrued.
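The per-view signal described above -- tally what tagged content held your attention, and what was on screen at the moment you bailed -- can be sketched roughly as follows. Every name and field here is my own guess for illustration, not TikTok's actual schema:

```python
from dataclasses import dataclass, field
from collections import Counter

@dataclass
class VideoTags:
    """Timestamped content tags, as produced by object/audio recognition."""
    tags: list  # list of (second, tag) pairs, e.g. (3, "dance")

@dataclass
class Profile:
    """Per-device tally: tags that held attention vs. tags at drop-off."""
    watched: Counter = field(default_factory=Counter)
    dropped_at: Counter = field(default_factory=Counter)

    def log_view(self, video: VideoTags, stopped_at: int):
        for t, tag in video.tags:
            if t < stopped_at:
                self.watched[tag] += 1      # content seen before losing interest
            elif t == stopped_at:
                self.dropped_at[tag] += 1   # content on screen when you bailed

# Illustrative usage: a viewer quits the instant a "rant" segment starts.
p = Profile()
p.log_view(VideoTags(tags=[(0, "dance"), (5, "music"), (12, "rant")]),
           stopped_at=12)
```

The point of the sketch is that none of this needs app permissions: the watch position alone, joined against server-side tags, is enough to accumulate the preference catalog the comment describes.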
This goes way beyond blanket brainwashing or disinformation propaganda in its effectiveness. It's as if a personal interrogator were assigned to each of billions of people to learn every minute detail of their personal media narratives, and at the same time became a congenial friend coaxing all of them to subtly shift their identities and preferences. Mass media has always tended in this direction to some degree, but TikTok isolates the individual's identity. (I didn't write this lol)
It's not about ads. It's about manipulation, and about gathering data to be used against you at every and any opportunity. The data that's sold and passed around is used to determine things like how much you pay for things versus your neighbor. It determines what services you are offered, and on what terms.
If a store thinks I make more money than you, I might for example be told they've got a return policy that is much more generous than what they'll tell you. There are places where you will wait longer on hold, getting repeatedly bumped back to make room for people who have a better "consumer score" than you do.
The data is handed over to the state and locked away until it's useful to them. It can get you arrested or questioned by police when you've committed no crime. It can get you prosecuted for things which should not be crimes but are. It's used by politicians who carve out the borders of their districts specifically to limit the ability of your vote to make any difference.
That data is used to target you with lies and misinformation. It allows you to be targeted by extremist groups. It can be used to decide if you get hired or not. You could be turned down for housing because of it. There are millions of things that might prejudice another person against you. Your lifestyle, your politics, your religion, the people you associate with, your sexual history, your medical conditions, etc. You can't know who is accessing your information or for what reason, or know how accurate that data is, or know if/how it is being protected.
It's a gross violation of your rights. Yes, sometimes it's used to manipulate you into buying something you might not have bought otherwise, but increasingly it's used for much, much more -- and once the intimate details of your life have been taken from you, you can never take them back or prevent them from spreading.
However much you enjoy the products that are spying on you, you can't ever say whether the benefits outweigh the costs. You aren't even allowed to know what the costs are or when you are paying them, and there will never come a time when you have paid in full and it can no longer cost you something more.
Ten years ago Sherry Turkle, in Alone Together, described various electronic companions being deployed, even for very young kids. Imagine a dossier/record of all your secret whispers to electronic "friends", recorded from the time you are three years old. As the comments above suggest, the risks go beyond merely violating informed consent.
Paraphrasing, one of the best quotes in Turkle's book: ~We have evolved voices over millions of years to convey information, feeling and nuance. But [young people, who won't answer the phone] want to text instead.~
Much of the stuff you mention, like shops treating rich customers better, predates tech and is just kind of how the world is. Likewise, "arrested or questioned by police when you've committed no crime" happened pre-tech and is more a problem with the police and legal system than with, say, Facebook leaking what your preferences are. The worst places I've been for real spying, where you got arrested or killed for doing the wrong thing, were 1980s China and Myanmar, with little tech and many human spies. I'm skeptical tech makes this worse.
I mean, tech is a double-edged sword, but I'm guessing the positives, like police having a harder time beating people up without being filmed by someone's phone, outweigh things like your Roomba uploading your floor plan.
> Much of the stuff you mention like shops treating rich customers better predates tech and just is kind of how the world is.
Of course, human nature and corporate greed don't change. The difference is scale. In the past, stores had to know you were rich to treat you better. They weren't checking your bank account balance at the door. Now they can, and increasingly they do exactly that.
Same with real spying. Maybe in 1980s China you had to watch what you said around your neighbors. Today you have to watch what you say while alone in your own home. And it's not even just what you say today, but everything you've said or written over decades that can be mined. You don't have to go to the other side of the globe for historical examples of how this would have been abused, either; Joseph McCarthy would have loved to have this data.
Again, the tech isn't the problem. It's great that we can film police abuses. It's great that we have GPS. What we don't have are regulations that prevent the abuse of our data, which is collected and disseminated at an unprecedented scale and never goes away. There exists a permanent record of what you've said, what you've done, and everywhere you've been. We're going to see it used to hurt people more and more often. There are already plans to use this data to attack people who have gotten abortions. It could very soon be used against those in homosexual or interracial relationships, as those too are at risk of becoming crimes again.
This is such a dystopian and paranoid view of the system. And unfortunately, I think if it's not totally a reality now, it's one we're heading for as it is the unchecked natural end of our current path.
Everything I mentioned has already been happening; it's just a matter of scale, and it's becoming more common as time goes on. If it hasn't impacted you personally already, it will, although you may not be aware of it.
If your insurance rates go up next year, you won't be told that it was because people in your area spent more on fast food over the last six months than the year before. You just get the higher bill.
What I fear is that we'll end up with a digital caste system where people are treated very differently based on hidden scores that follow them everywhere.
I could be badly misinformed, but I have the sense that the Chinese social credit score system might be relatively open in its criteria, and prosocial, compared to whatever emerges in some other places.
In general, with a more open system, you might also have more hope to redeem yourself, or at least evaluate effective options.
I have heard that China's social credit system isn't all that much different from what we've been developing here in the US, it's just more transparent. At least the person in China who can't board a train to travel knows why. We can have all the same oppression right here at home while our oppressors can boast about how free we all are, how they are empowering us with choices, and how grateful we should be for their benevolence.
> But skepticism aside, with a personal instance of AI some benefits could be significant. Just having a personal AI critic to read your writing and give feedback could be a huge benefit.
Cue the spying-on-you part.