Blocked: The broken loop of digital society

The ability to block users *with impunity*, without any expectation to reconsider the decision or to reframe the originating situation, is one of the low-level root causes of the divisional echo chambers of the Internet. The cross-service block-map (the map of who blocks whom) of the Internet is the invisible, ever-growing membrane that insidiously divides us.
In the digital society and economy, the perception of other actors (suppliers, platforms) is dictated by consumer opinion, which is increasingly easy to manipulate, twist around and antagonize into predictability. What we lost from the more traditional society is the reputation-centric approach, which was more robust to hype and gossip and which in turn was based on the local visibility and perceived immutability of an actor’s identity.

The core issue of the societal hyper-division, as it is brought forth by the digital age, is the ability to block interactions/people with impunity: without consequence, consideration or second thought. This is opposed to being incentivized/motivated/pushed to deal with the issues leading to the block by the full spectrum of human insight and experience. Of course, there is a whole spectrum between social oppression (the absurd belief that the gathering of the elders is *always* the ultimate authority) and individual supremacy (the absurd belief that the individual *always* knows what’s best for them, in spite of others’ experience).

This ability to block interactions without having to deal with any aftermath of the choice creates a societal ability to avoid situations that in real life one would have to confront. It is a way to filter, to build a kind of sound-proof glass box around communication. While some level of isolation is mandatory for privacy and safety, excessive anonymity opens a Pandora’s box of moral hazard: blocking interactions makes it easier to unwittingly justify a gradual shift of human attention from what is *factually relevant*, *circumstantially useful* and *historically validated* to what is perceptually _pleasurable_ and _cognitively easy_ (cf. “cognitive ease”), which is a wide gateway to excess, polarization and a general departure from traditionally co-observed reality.

This creates a reluctance to actually communicate and confront issues. While this may be necessary as a protective measure, people have no safeguard against abuse of freedom of speech on the Internet. We stop acting cordially with one another because our mind perceives the lack of the other’s physical presence as proof that *there is no stake, there is nothing to lose*.

We need a new, better equilibrium on the spectrum of reconciliation between society and the individual. We need a new social contract, one that expands the definition of the individual in the world of IoT, AI and mobile devices. For instance, if you block or get blocked frequently, your access to the service should be limited. There should be a two-way cost for the decision to “non-communicate”, even if the cost is not symmetrical. This cost should also draw on the judgment of a third-party human (or panel) regarding the cause of the incident.
One has the right to protect oneself and be protected from harassment. However, one should also be motivated not to abuse that right, for instance by repeatedly instigating anger and then retreating to the position of the victim.
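To make this concrete, the asymmetric “cost of non-communication” could be sketched as a simple scoring rule: blocking carries a small cost, being blocked by others a larger one, and past a threshold the case escalates to a human panel instead of being decided algorithmically. Everything below (names, weights, thresholds) is a hypothetical illustration, not a real service API.

```python
from dataclasses import dataclass

@dataclass
class BlockLedger:
    """Per-account tally of block events (illustrative assumption)."""
    blocks_issued: int = 0    # how often this account blocks others
    blocks_received: int = 0  # how often others block this account

def access_penalty(ledger: BlockLedger,
                   issue_weight: float = 1.0,
                   receive_weight: float = 2.0,
                   review_threshold: float = 10.0) -> tuple[float, bool]:
    """Return (penalty_score, needs_human_review).

    The cost is deliberately two-way but asymmetric: being blocked
    by others weighs more than blocking, and a high score triggers
    review by a third-party human or panel rather than an automatic
    verdict.
    """
    score = (issue_weight * ledger.blocks_issued
             + receive_weight * ledger.blocks_received)
    return score, score >= review_threshold

# A user who blocked 3 people and was blocked by 4:
score, escalate = access_penalty(BlockLedger(blocks_issued=3, blocks_received=4))
# score = 1*3 + 2*4 = 11.0, above the threshold, so escalate is True
```

The point of the sketch is the shape of the incentive, not the numbers: both sides of a block pay something, and the hard cases end up in front of humans.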

We need a way to filter and protect ourselves. If continuing a conversation with someone who has not taken “no” for an answer is mentally detrimental, then that is an issue no one should have to put up with.

One of the pillars of mediating the global problem of digital trust is to redefine the *social contract*. And for the novel version of such a contract, one needs a new definition of identity: perhaps one that separates the avatar from the identity (i.e. the physical embodiment of the consciousness manifesting it). Examples of identities include a physical person, or the physical hardware that executes the code of an autonomous artificial agent. Examples of avatars include an account ID, an alias, a nickname or a phone number.
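The separation described above can be sketched as a minimal data model: one identity (a person or a piece of hardware) behind many service-local avatars, so that reputation can attach to the identity and survive changes of handle. All names and the fingerprint format are hypothetical assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Identity:
    """The physical embodiment: a person, or hardware running an agent."""
    kind: str         # e.g. "person" or "hardware"
    fingerprint: str  # stable, hard-to-forge reference to the embodiment

@dataclass(frozen=True)
class Avatar:
    """A service-local handle: account ID, alias, nickname, phone number."""
    handle: str
    service: str
    identity: Identity  # many avatars may map to one identity

# One identity, two avatars on different services:
alice = Identity(kind="person", fingerprint="fp:example")
work = Avatar(handle="alice_w", service="forum.example", identity=alice)
home = Avatar(handle="al1ce", service="chat.example", identity=alice)

# Distinct handles, same underlying identity
assert work.handle != home.handle
assert work.identity == home.identity
```

Keeping `Identity` as the stable anchor is the design choice that matters here: blocks, costs and reputation accrue to the embodiment, not to a disposable alias.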

Written by: Bogdan