English philosopher Jeremy Bentham (1748-1832) conceived the “Panopticon” as part of his philosophy of utilitarianism. A panopticon is a building (a prison, a hospital, a school, a factory) with a central tower surrounded by cells that are visually accessible to the authorities in the tower. The tower casts a brilliant light upon the people in the cells, so bright that they cannot see the watcher in the tower. Because they cannot tell when they are being observed, they must assume they are always being watched. Over time, however unconsciously, that sense of being watched changes their behavior. Their actions become moderated and careful, and a mild tension invades their minds.

Many versions of the panopticon still exist in our institutions’ physical architecture today. However, the panopticon is rapidly being updated; it no longer requires a central tower and cells in which we must dwell, work, eat, sleep and be observed by a watcher. We can simply be at home doing what we do, and still provide all kinds of information to the observers hidden in the digital panopticon. We can no longer live in peaceful obscurity, but the watchers and their digital “tower” remain invisible to us, so we do not feel violated.

Businesses are gradually convincing us to make our homes digitally smarter, so that electricity costs can be controlled and security devices will make our homes safer. Our installation of smart thermostats, lights, computers, digital assistants, drones, wearable devices, televisions, digestible sensors and the like will allow these devices to interact with each other, and with us, even when we are far away.

Of course, these handy devices also “scrape” (as the term goes) information from our homes: sounds, facial and voice data, key words from our conversations, our moods and various kinds of technical data, silently transmitting it all back to the companies that sold us the stuff. How long will it be until, while eating our cereal, we ingest little sensors that pass our lips looking like one more tiny granola chunk, but transmit the state of our respiratory system, our blood platelets, our emotions and our bowels? (Depending on our colon’s complaints, either anti-diarrheal or laxative ads will probably appear immediately on our smartphones.)

Businesses claim they use our information to “improve our experience” and “improve services” to us, which means seducing us into buying more of their stuff. They also sell our data to other companies that want to sell us even more stuff based on what our “smart” devices have sniffed out about our desires, addictions, secret lusts, political perspectives and past patterns of product use.

We have been sucked into a digital “black hole,” a place from which we cannot escape (unless the apocalypse intrudes on our technology-addicted society and destroys the electrical grid, or we each move into the forest and live in a hollow tree).

Willingly, sometimes unwittingly, we all have given over massive amounts of data which have been sold and resold multiple times to numerous companies and institutions. We are still walking around, tending to business, having fun, sharing our intimacies with our intimates (and with the digital devices that listen to us). So…what is the problem?

So what if businesses are trying to make more money off of us by scooping out info from inside our homes and heads? Millions of people are willing to make that trade, which may show our true lust for more stuff (a pretty creepy commentary on us), but it’s also “just business.” Or is something else unfolding here?

Jeremy Bentham’s philosophy of “utilitarianism” had a basic tenet: the best governing system is one which provides the greatest number of people with the greatest happiness. In a democracy, that “greatest good” tenet provides the measure for all morals and legislation, and the basis of what is considered either morally right or wrong. However, the current political (sometimes violent) conflicts among citizens in the world’s polarized democracies show how difficult democratic societies find it to achieve consensus about what is right or what is wrong.

China’s government is sweeping this debate off the table by installing its new “social credit system” throughout the nation over the next several years (I wrote about this some time ago). Their government claims the purpose of the new system is to introduce greater civility and honesty between people and businesses, to build a more civilized society. Better behavior is their stated goal, but one Chinese official, when discussing the new “system,” chillingly said, “Whosoever violates the rules somewhere shall be restricted everywhere.”

China’s new “system” uses the same digital tools we do: rapidly improving artificial intelligence systems, facial and speech recognition and language translation software, computerized decision-making algorithms and the mass collection of citizen information. China uses these tools to identify citizens who do good or bad deeds, and to decide who will be rewarded and who will be punished. We are not there yet, but we are headed in that direction.

Our own government has already eavesdropped on, bought, subpoenaed or hacked much of our information. We are ripe for government exploitation and manipulation because our “leaders” know that many citizens are anxious about a rapidly changing world that is growing more crowded and more ethnically diverse, with many people on the move, trying to escape war, famine and other disasters.

People are fearful and insecure, and thus willing to give up traditional freedoms to “be kept safe” from immigrants and refugees, and from what seem to be a thousand threats presented so dramatically by the media. (How many times each day do we hear some statement from government or other institutions reminding us that they are keeping us safe?) They only want to save us from calamity.

What dismal baloney.

But there is clear evidence throughout the world of peoples’ growing appreciation for authoritarian leaders and right-wing political agendas, including here in the US with Trump’s election. In 1930, many highly capable, accomplished Germans despised the “little corporal” and his “brown shirts,” yet voted him into power just a few years later, a path that ended in World War II and some 85 million deaths from violence, disease and famine.

Retired Harvard Business School professor Shoshana Zuboff has written a new book, “The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power.” In it she describes our largely unconscious drift into a new era of political power and into an ideology she calls “instrumentarianism.” Just as in the panopticon’s cells under the bright lights, we are merely objects of information; we cannot see the observers, and we receive no open, honest communication from them.

Business and government do not relentlessly monitor us in order to “serve” us better; they simply wish to manipulate and herd us. Just as in China, the hope is to instigate widespread modification of human behavior, whether for business purposes or political ones.

But, no worries. All this manipulation is simply to promote the greatest good so that the greatest number of us can experience the greatest happiness. No doubt 60-some million of us already think this is just fine, and won’t give it much thought as it silently unfolds.

