"Big Brother" is now slightly less scary than "Smart" Brother with our data

circuitbored
Site Admin
Posts: 102
Joined: Fri Aug 18, 2017 9:03 pm

"Big Brother" is now slightly less scary than "Smart" Brother with our data

Post by circuitbored » Sat Nov 06, 2021 11:08 pm

Due to the current state of inescapable tracking imposed on all of us online by "the powers that be," I have been reducing a lot of my normal interactions online, to the dismay of many of my friends and family. I fear that, as time goes on, this may grow into a desire to avoid the Internet entirely, the same way I already avoid cable TV because of the endless stream of pharmaceutical and reverse mortgage commercials on every "brief commercial break."

I noticed somewhere around the middle of last year that things I had been posting on the social media accounts I run were suddenly painting an inaccurate public picture of me, one not very well connected to my real life and, in some cases, far too tellingly inaccurate. People were suddenly taking me very seriously last year rather than reading my heartfelt opinions on this site. Many people judged me based on my social media posts, even the more humorous ones. The judgment ran deeper than ever before and began to show up in real-life conversations with friends as rumors, with no explanation as to why it was occurring. I would also hear gossip about others I knew more frequently, in a similarly judgmental fashion, and of course that gossip and speculation was based on social media posts they had made... Ahh, the conversations that happen behind our backs... Luckily it was not as bad as Will and Jada, but it did cause some people to question my grasp on reality at times, because they did not truly know me: that I've always been an aspiring comedian, and that my humor and music sometimes gets "edgy" (without the intent of being offensive, of course), because I have a bunch of people following me who don't know me in addition to those who do, and folks, that's what entertainment is all about.

Even though I was regularly careful about not sharing my private personality with the entire world on social media, I had to step back from sharing even more of my personal experiences, thoughts, and information, even with my own friends and family, because of a rising sense of public social judgment, including judgment from people who already knew me. I realized more and more that I had many friends and/or followers whose views were dramatically, even radically, different from mine, and that social media was amplifying my thoughts and possibly promoting them to others as "controversial" and/or "edgy," all without my awareness that this had become a new practice.

I'd like to consider myself an open-minded and centered individual who works hard not to impose my views on others, but from last year into this year, in almost every way possible, everyone became increasingly divided and opinionated, interested only in entertaining opinions like their own... This attitude/trend was possibly also bolstered by television news (but I'll leave that up to you to decide). The entire world became dramatically divided around political identity as a basis of life, and we all suffered deeply for it during political meltdowns across the world, because everyone picked sides and played a football game on a wide variety of topics, from human rights to politics.

The real, honest truth is that (truly democratic) politics and human rights cannot be reported on and watched like a football game. We were not meant to be on "sides". Bad things happen when we take "sides" and wear "team colors" on things that can do harm... For sports, entertainment, and some personal decisions, it's probably not a big deal, but for what really matters in life, being unflinchingly partisan is absolutely and perilously toxic.

Democracy is about maintaining as many freedoms as possible while upholding unity and individual rights (including privacy), so when division is promoted as a basis for political discussion without the goal of creating unity, it simply fails... I was a Political Science student; I learned that early on in my life...

For example... It's really hard to fix a "family trip" once trust is gone within the "family". It's just as hard to fix relationships when any one individual within the family home is suffering because of the "rights and freedoms" or "toxic leadership" of another family member, and thus you get a recurring state of conflict, which is where the entire world is right now.

Politics and technology are now inextricably linked because of social media, advertising/promotion, and online news. Our future is as completely unpredictable to us (and possibly as wildly expensive) as the next forthcoming iPhone... When you combine those issues with rising partisanship, toxic leaders, suffering individuals, a lack of real opportunity, and personal data mining by greedy corporate execs, you're in for a wild societal ride... Luckily, today I'm going to stick to addressing just the "data mining by greedy corporate execs" thing... I hope.

Data mining is real, and it's a subject whose depth and potential to do harm most people are unaware of. Historically, governments have taken great steps to limit what data they collect on individuals, especially concerning health care and non-criminal behavior. In recent years, though, private companies have slowly built up their own data-mining operations, and in many cases they purchase this type of data from platforms to guide and refine their business goals, and it's likely part of what is quietly driving a rise in partisanship on a worldwide basis.

Potentially now, each survey you take online, each ad or image you click on, each article you choose to read, and many other actions you commit to online can be an information-gathering event for another party or company. Many of our online actions now send details about our choices and thoughts to others to drive targeted content and advertising. Each of us also has a completely different experience when we go online, because of algorithms and the introduction of artificial intelligence to modern web sites and applications, which is exactly what makes cases of online manipulation and deception (on a larger public scale) so hard to identify. Many people and companies stay silent about this manipulation because it is profit-driven and they may benefit from the practice directly or indirectly, especially when you consider stock investments. Many companies also suffer embarrassingly bad exposés when their practices come to light, and it may all get much more sensationalized with proper investigation and regulation as time passes (fair warning).
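
To make "information-gathering event" concrete, here is a minimal sketch, in Python, of the kind of payload a hypothetical tracking script might assemble when you click an ad. The field names, identifiers, and endpoint behavior are all assumptions for illustration; this is not any specific platform's API.

```python
# Hypothetical sketch: what a single "click" can turn into behind the scenes.
# All field names and identifiers are invented for illustration only.
import json
import time
import uuid

def build_click_event(user_id: str, ad_id: str, page_url: str) -> dict:
    """Bundle one ad click into a tracking event a platform might log."""
    return {
        "event_id": str(uuid.uuid4()),
        "event_type": "ad_click",
        "user_id": user_id,            # ties the click to a persistent profile
        "ad_id": ad_id,                # which creative you responded to
        "page_url": page_url,          # what you were reading at the time
        "timestamp": time.time(),      # when you clicked
        "client_hints": {              # fingerprint-style context
            "language": "en-US",
            "timezone": "America/New_York",
            "screen": "1920x1080",
        },
    }

if __name__ == "__main__":
    event = build_click_event("user-8675309", "ad-42", "https://example.com/article")
    # A real tracker would POST this to an analytics endpoint;
    # here we only print what gets collected.
    print(json.dumps(event, indent=2))
```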

A huge problem is that the engineered actions that occur via algorithms, based on the collection of our personal data, are not always rooted in responsibility or properly thought out right now. Frances Haugen's recent testimony about internal practices at Facebook, where children were exposed to negative influence and harmful mental-health experiences, was just the tip of the iceberg for how platforms can impose control and manipulation on users once they have reached a large enough scale of support, funding, or user base. There is a real possibility that social algorithms become the modern-day Hindenburg, crashing hard and hurting many, or perhaps the Titanic of our generation, because they have serious and truly unpredictable consequences for our mental health through indirect and unidentifiable means, and because they were created and applied quickly, with reckless disregard, by overly self-assured (yet flawed and financially motivated) human beings. Privacy, security, community, and control were paramount to the sales pitch that platforms originally used to build their vast user bases. They abandoned those promises time and time again as they grew, arriving now at a point where they ask you to pay to be visible while selling the account data they have compiled on you as a patron, because they represent a critical structure on the Internet.
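
As a rough illustration of what "engineered actions via algorithms" can look like, here is a minimal sketch, assuming a purely engagement-ranked feed: posts are scored only by predicted interaction, so the most provocative content floats to the top regardless of its effect on the reader. The scoring weights and field names are invented for illustration, not taken from any real platform.

```python
# Minimal sketch of a purely engagement-driven feed ranker (hypothetical).
# Nothing here models user well-being; only predicted interaction counts.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_clicks: float
    predicted_comments: float
    predicted_shares: float

def engagement_score(post: Post) -> float:
    """Weight the reactions that keep people on the platform the longest.
    Weights are made up for illustration; real systems tune them constantly."""
    return (1.0 * post.predicted_clicks
            + 3.0 * post.predicted_comments   # arguments generate comments
            + 5.0 * post.predicted_shares)    # outrage travels farthest

def rank_feed(posts: list[Post]) -> list[Post]:
    # Sort purely by engagement -- no notion of accuracy, tone, or harm.
    return sorted(posts, key=engagement_score, reverse=True)

if __name__ == "__main__":
    feed = rank_feed([
        Post("calm-update", 2.0, 0.5, 0.1),
        Post("divisive-rant", 8.0, 12.0, 6.0),
    ])
    print([p.post_id for p in feed])  # the divisive post wins every time
```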

Everyday users, working to succeed on platforms, are encouraged to post insight into their private lives -- all for free, in hopes of popularity. This includes your friends and children. Platform users fill out surveys from a variety of sources, communicate with all kinds of people and bot scripts, and share their individual art and thoughts with companies and individuals every time they post, while the platforms often don't treat rewarding or paying them as relevant or feature-worthy unless they pull out a credit card and boost each individual post they make... Ad revenue is no longer just from Coca-Cola; it's generated from tons of kids doing makeup tutorials and posting videos of their pets in your living rooms (provided they pay to boost EACH POST). The rewards come mostly in serotonin. As posters see their audience improve while their debt rises, they eventually reach the point of clarity that a normal job is probably what they need to pay off all that advertising debt, and hopefully they will get one; in the meantime, social media companies and execs are busy hawking NFTs and Bitcoin while the money keeps rolling in the door so they can climb toward the top of the Forbes list for the year. The money rolls in for a platform owner because there are so many ways money is made through social media, and very little benefit or value beyond entertainment that the platform needs to provide to its loyal, eager-for-success user base of patrons... Once you set up the lawn mower, the grass pretty much cuts itself, and on top of that, as a platform owner, you have access to all of the analytics the platform generates for everyone on it.

Social platform owners now (as a success road map) initially set up vast and "temporarily free" communities with great features that foster enthusiasm to sign up for their service. They manage and create private business partnerships that drive their growth, and even promote and create personalities to build their branding, often at huge start-up cost. This is why social platform execs often only intervene with operations and ethical statements if profit drops or a negative story about them hits the news... Once a community is built, the users drive the content, most of the grass "begins to cut itself," and content and comfort take a back seat to cash. Execs are often insulated from politics concerning the platform because they can simply blame "bad actors causing the problems" rather than their own failure to build features or operate their service ethically. The whole thing turns bad once platform execs reach the point where they have a large enough user base to implement "ad-driven" profit models and algorithms that generally corrupt user experiences and remove the control over content that users once had. Many of these platforms become monopolies: too big to fail, and devastating if they were to shut down. We had a lesson in that in 2021, when Facebook went down for a few hours and mysteriously crippled many other dependent sites and services as a possible result.

At the same time, platform leaders can collect a wide variety of personal/individual data on members of their massive user base, and support the same types of data collection, and sharing of that data with other companies, behind the scenes... Data from sources like the political policy survey you completed, your travel and shopping habits, the history of charitable contributions you've made on the platform, or even a movie you posted about liking can give anyone holding that data a unique and possibly very specific insight into who exactly you may be in real life.

Imagine if you, as a person, owned access to a very personal data library on almost anyone you met in life: their private messages, location history, people they've dated, their birthday, things they've purchased, their family relationships, maybe even how much money they currently hold in their bank account. Would you swear never to search anyone's name? Yes, my friends, that is the real Metaverse... Private companies now hold more data on you than the US Government does. To think that people were worried about "Big Brother" all those years ago... It's a total shame that we've gotten to this point, to be honest... Totally shameful.
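
As a purely hypothetical sketch of what that "data library" could look like once a few sources are joined on a single identifier, here's one in Python. Every record, field name, and identifier below is invented for illustration; the point is only that individually mundane datasets become a dossier once they share a key.

```python
# Hypothetical sketch: joining a few "harmless" data sources on one identifier
# quickly becomes a detailed dossier. All records below are invented.
purchases = {"user-8675309": ["reverse mortgage guide", "sleep aid", "baby monitor"]}
locations = {"user-8675309": ["home 11pm-7am", "clinic Tue 9am", "casino Fri 10pm"]}
survey_answers = {"user-8675309": {"political_lean": "center", "household_income": "55k-70k"}}

def build_dossier(user_id: str) -> dict:
    """Merge per-source records into a single profile keyed by user id."""
    return {
        "user_id": user_id,
        "purchases": purchases.get(user_id, []),
        "location_history": locations.get(user_id, []),
        "survey_answers": survey_answers.get(user_id, {}),
    }

if __name__ == "__main__":
    # Each source alone looks mundane; the merged view is what gets sold.
    print(build_dossier("user-8675309"))
```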

This is NOT what the Internet was invented for... AT ALL.

There is a dark, voyeuristic undercurrent in social media and software development now, perpetrated by several technology and platform leaders, that is far out of control in our world, and it all needs to be reined in. The desire to collect data on massive numbers of people, especially for a private company's or individual's access, is not ethical, and only serves to feed the ego, influence, and profits of whoever controls that data. This kind of data-gathering practice is also flowing over into software-driven consumer devices like new cars, computers, televisions, thermostats, and home security devices, and if left unchecked, even your next "smart" toilet seat may come with a microphone or camera embedded in it.

Possibly the biggest crime being perpetrated undercover these days is that we're eagerly lining up to pay for shiny new devices, and signing up for apps and services that aid personal data tracking, at a higher rate and greater cost to us than ever before.

Most people are unaware that some of their new phones ship with a LiDAR sensor. It's very interesting technology, but it can be very invasive to our privacy if misused... Nowadays, people are paying for the very same technology and devices that can scan or monitor any room they are in, and the device can potentially share those scans back to (God knows who) anyone with access to its telemetry or storage. In the future, apps like TikTok, Facebook, Meta(whateverse), or even a hacker could potentially access that scanner, or the device's storage, and have a wealth of information on any user who purchased one of those phones.

Facial recognition, when it was introduced, was just the surface of a festering violation of individual user privacy. When devices and companies manage to capture multi-dimensional personal user data, it retains value for years into the future, even if the service is shut down at some later point. This kind of data can also be held offline, in private, even after a public promise by companies to delete it, because it is so valuable by nature (in a voyeuristic, analytical, strategic, or other sense).

The key to privacy protection is to make sure privacy is not violated to begin with, and we are failing radically at that by paying for the very devices and tools that invade it.

Our online privacy rights have already been violated and owned on a massive scale... Privacy should be protected by law at the source of data collection -- AT THE DEVICE LEVEL -- not just at the app level. Phones, cameras, ATMs, cars, etc... The devices and apps that we pay for and use (respectively) should not enable data collection that can be used against us in any way for extortion or manipulation, especially in social and economic circumstances. Current law already provides protections for this to an extent, but courts are still not educated and active enough, and the laws are too weak in specificity to properly enforce those standards on the Internet.

I obey the law, of course, but even the simplest data companies gather on us can become a means of covert manipulation or extortion over time. Think about private/personal data being used in determining bank loans, job opportunities, court cases, and/or health insurance. Suddenly data stored on private citizens can be weaponized against them without anyone truly knowing why or how it occurred, and evidence of it all can be removed and covered up with a simple software update.

We are far past the point of merely calling on Congress to get up and enforce proper protections of individual rights regarding technology. There should be an escalated mandate for government to get this situation under control. Online communities are truly only useful to us when we can share our moments, talents, and faces online with proper respect, on a transparently equal field of success, safety, community, and privacy... Otherwise, we really don't need any of it at all. We can always return to a culture of talent shows, farmer's markets, art exhibits, and cold phone calls, because back then you could usually just go home and not worry about stalkers showing up later, or about "in-app" purchases just to be seen and heard.

We have to be aware of what the current culture of tracking us, and of building reactive algorithms on top of that tracking data, is doing to us mentally and emotionally. We need to prevent narcissism and voyeurism from becoming normal among leaders of all industries, but especially in social IT, and figure out how to keep it all from turning badly against us if we're going to be good and ethically balanced humans into the future. We also need to take steps to protect each other from this new toxic culture of sharing too much personal information. Companies need to set up accountable oversight for privacy decisions, and emphasize responsibility, simplicity, and transparency regarding the data they collect and hold about their audiences, now more than ever, and Congress needs to get off its duff and update laws to protect tax-paying, law-abiding citizens from this ego-driven privacy-violation nightmare.

Occasionally now I host fake conversations, only click "like" on things I want to see more of, and visit completely out-of-character sites and locations just to throw the algorithms and the tracking of me off base. It's one of the only real ways left to protest, and to keep some control over what I see when I'm online in shorter sprints.
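
For what it's worth, here's a minimal sketch, in Python, of the kind of "decoy browsing" I mean: periodically fetching a few out-of-character pages to add noise to whatever profile is being built. The URL list and timing are placeholders, and the `requests` library is assumed to be installed; it's an illustration of the idea, not a recommendation of specific sites.

```python
# Hypothetical sketch of "decoy browsing": visiting out-of-character pages
# to add noise to a tracking profile. URLs and timing are placeholders.
import random
import time

import requests  # assumed installed: pip install requests

DECOY_URLS = [
    "https://example.com/reverse-mortgages",
    "https://example.com/competitive-yodeling",
    "https://example.com/antique-tractor-parts",
]

def browse_decoys(rounds: int = 3) -> None:
    """Fetch a few random decoy pages with human-ish pauses between them."""
    for _ in range(rounds):
        url = random.choice(DECOY_URLS)
        try:
            requests.get(url, timeout=10)
            print(f"visited decoy: {url}")
        except requests.RequestException as err:
            print(f"skipping {url}: {err}")
        time.sleep(random.uniform(5, 30))  # pause so it doesn't look scripted

if __name__ == "__main__":
    browse_decoys()
```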
