Wednesday, January 18, 2006

A utilitarian explanation of the first subject of my last post

Using utility-maximizing logic: let's assume that you and I don't like each other. You offer me $1.00 for my candy bar, which would only bring me about $0.75 worth of utility in the form of something else I desire. Meanwhile, the candy bar would offer you about $1.25 worth of utility. It would seem the transaction would be a "steal" for both parties, win-win. But I still won't sell you my candy bar. What this tells us is that I count the opportunity cost of associating with you on top of the cash value of the transaction; doing business with you gives me negative utility. In this case, let's say I wouldn't part with my candy bar for less than $5.00, which is way too much for you to get any utility out of it.
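The arithmetic here can be sketched in a few lines of code. The dollar figures are the ones from the example above; the function and variable names are my own, not anything standard:

```python
def seller_surplus(offer, value_of_bar_to_seller, association_cost):
    # Net gain to the seller: cash received, minus what the bar was
    # worth to them, minus the disutility of dealing with this buyer.
    return offer - value_of_bar_to_seller - association_cost

# With no animosity, a $1.00 offer for a bar worth $0.75 to me is a clear win:
print(seller_surplus(1.00, 0.75, 0.00))   # 0.25 -> trade happens

# But if associating with this buyer "costs" me $4.25 of utility, my
# effective reserve price climbs to $5.00 and the deal dies:
print(seller_surplus(1.00, 0.75, 4.25))   # -4.0 -> no trade
```

The point the code makes explicit is that the disliked buyer faces a higher effective price for the identical good, even though nothing about the good itself has changed.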

In real-world settings we've all run into this: places that couldn't pay you enough to work for them because the working environment or your co-workers were so bad, or people you simply refused to do any business with.


Perfect information vs. Human Trust:

Let us assume for a moment that the two actors in a transaction have perfect information about the product(s) being offered, and perfect information regarding buyer and seller. Let us also assume that the product is satisfactory, and both buyer and seller know the other will not cheat. Now, is it possible that they still will not deal with each other? Disregard reasonable substitutes, other vendors, etc.

I say yes, for a simple matter of human nature. If the buyer and seller do not trust or like each other, they may not cooperate, even when they know the other will not cheat on their bargain. It's utterly irrational from an "enlightened self-interest" paradigm. However, I think there may be an evolutionary explanation. Let's take an example:

Assume a fairly modest transaction with no life-or-death consequences. You do not like or trust the person who's come to your lemonade stand to buy lemonade; they've proffered money up front, there are plenty of witnesses, and they are well known enough that you know they won't steal. Now, you might cooperate and sell, or you might not. We're not talking about lack of empathy through anonymity, e.g., a random customer. Rather, this is someone you actively dislike or mistrust. Why might you not do business with them even though they've got the money? Essentially, the reasons boil down to association. You do not want yourself, your product, etc. associated with that person. There are human rather than financial reasons compelling you not to associate with them. Often, though, these can turn into financial reasons. Take for instance a baby-food manufacturer that also makes bombs or what-not, and you get the picture: the association can ruin business.

As for why this might be important evolutionarily, it has to do with the fact that trading in our colder economic sense developed out of a tit-for-tat survival sense in which a given transaction implied future dealings. The people you traded with were likely to be neighbors; ergo, even if you structured a given deal so you could not be cheated, you would still avoid, as much as you possibly could, associating or cooperating with someone you disliked or distrusted in the human sense. The feeling would be that they will do some harm at some point in the future, either by cheating or through harm by association. This is more or less the origin of social stigma, and why no one wants to be known as "friend of the outcast" in the schoolyard: it serves as a reinforcement of social norms and thus contributes to group conformity and survival.
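The tit-for-tat intuition here is usually modeled as an iterated prisoner's dilemma. A minimal sketch, using the standard textbook payoff matrix (the strategy and function names are mine):

```python
PAYOFF = {  # (my move, their move) -> my payoff; C = cooperate, D = defect
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def tit_for_tat(history):
    # Cooperate on the first dealing, then mirror the partner's last move.
    return "C" if not history else history[-1]

def always_defect(history):
    return "D"

def play(strat_a, strat_b, rounds=10):
    hist_a, hist_b = [], []   # each side's record of the *other's* moves
    score_a = score_b = 0
    for _ in range(rounds):
        a, b = strat_a(hist_a), strat_b(hist_b)
        score_a += PAYOFF[(a, b)]
        score_b += PAYOFF[(b, a)]
        hist_a.append(b)
        hist_b.append(a)
    return score_a, score_b

# Two tit-for-tat neighbors cooperate throughout: (30, 30).
print(play(tit_for_tat, tit_for_tat))
# Against a known defector, tit-for-tat is exploited only once: (9, 14).
print(play(tit_for_tat, always_defect))
```

The repeated-dealings structure is the whole trick: when every transaction implies future transactions, refusing to cooperate with someone who has shown bad faith stops being irrational and starts being the winning strategy.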

Which brings me to something somewhat unrelated. Given that social groups are designed to reinforce themselves, creating inclusive and exclusive norms, symbols, identifications, etc., it brings me to a postulate: that subtle coercion is the underlying dynamic of all group behavior.

This would lead to a rather dim view of the species, I suppose. Perhaps, then, there needs to be a slight redefinition of the term for our purposes. In a group dynamic, norms and the like tend to be decided by an asymmetrical consensus over time. These become self-reinforcing, as one desires the benefits of belonging to a given group, or conversely fears the consequences of not belonging. Excepting certain cases (cults of personality, etc.), who decides these customs and conventions cannot be explicitly defined; rather, they arise organically from the interactions of the members of the group. Ergo, the creation of norms and social signals cannot in itself be thought of as coercive, so long as all members of the group are willing members. Its cultural dynamics are purely voluntary. The trouble lies, of course, in when groups compete, and that's where inter-group dynamics certainly do get coercive. Moreover, trouble arrives when one has less choice in the matter of belonging to a group, e.g., when your status vis-à-vis group consensus is thrust upon you. Examples include being born into a given status, or having beliefs or characteristics which automatically exclude you. In general, this does not become touchy until you start involving real-world goods and services, and things like weapons.

It becomes no wonder, then, how legalism and the rise of the state came into being: as an effort to normalize economic relations between groups that might not associate otherwise. I'm sure you can imagine examples for yourself with no difficulty. I work with and provide services for many evangelical Christians, and I would certainly be excluded from their cultural functions, as I am neither evangelical nor Christian. In the absence of a Hobbesian construct of some sort to normalize relations between easily identified cultural differences, it's not hard to see what happens: one group will try to forcibly supersede the other, whether by force of arms or otherwise.

The conundrum, however, arises thus. We are all quite unsatisfied with the existence of the state in general. Its existence always feels vaguely paternalistic and threatening, and indeed it has been shown variously that the existence of the state is contrary to personal (and often public) freedom even in its best form. How then do we structure a non-state construct that, in effect, literally "forces men to be free"? In this case, free by not killing each other?

Moreover, if we desire not to create a professional class of legalists (which would seem to replicate quite a few of the functions of the state, or at least plant the seeds for the state to be reborn), how do we make it simple and functional enough to be perfectly comprehensible to every man, while assuring there are no loopholes?

I suggest some sort of reverse engineering of human social dynamics. Take for instance a universal identification system. It's not bad in and of itself; it's the element of the state that creates the problem. When all such a system does is identify you to others you wish to interact with, it functions as no more than an essentially highly trustworthy form of social signaling. It's only when the state (or another powerful entity) can track and harass you with it that it becomes problematic. But again, there has to be some form of order, functioning in tandem, that keeps different groups peaceful. Observe the Rwandan genocide, where it was not necessarily the state that used the identification against you, but a crazed group of "others" out to get you.

So we get to the next point, which is that infractions would have to be easily dealt with, and peacekeeping would have to be easily obtainable. Of course, one of the original selling points of the state is convenience: you don't necessarily have to worry about the underlying public order, as others are doing it for you. So there's that caveat to worry about as well.

It's also worth noting that this should not be conceived as a "minimalist state" solution, because in my opinion minimalist states either rarely remain minimalist or rarely remain states. Likewise, I find a will defined by pure majority to be equally repugnant; though it may satisfy a non-state solution, it certainly does not offer the best one, specifically if you happen not to be in the majority, and also happen to be right about something.

Just things to ponder toward a better tomorrow, I guess. These are the underlying problems of the next few centuries. If people don't figure out their own behavior and how to "hack it" to create a more free world, other powers will figure it out and learn how to program it toward their more controlled world.

Wednesday, January 04, 2006

Why the "Information Society" is bogus.

Well, not bogus per se. But it's not happening. People don't care about "information" in the abstract sense. They will not build the "information society". Rather, they will take your "information technology" and turn it into their "social networking technology".

While people may like having all their information on portable devices, being able to do their bills from their cell phone, and so on, that's not the primary thrust of what people actually use communication devices for. We use them for social networking. Remember how the PC was supposed to be a word-processor and spreadsheet system? Remember how it was supposed to be a fad? Note to gadget makers: people do not want "your" gadget. They want "their" gadget, which you just happen to make. Observe the social-status signaling derived from custom accessories and the individualization of cell phones and iPods.

People want to build, maintain, and customize their social networks with communication and coordination devices. They really don't give a flying fuck about the informational capacity, or that their pocketbook can run Excel. Granted, these are pluses, and they will want them as features. But what people actually want is something like this:

Take one video iPod, one cell-phone camera, one PlayStation Portable, and one Pocket PC; smash them all together until you have something that'll fit in your pocket; give it full wireless fidelity; and give it an easy and intuitive OS that's also secure and robust. And when I say easy, I mean exactly that. They'll want to be able to customize it just by maneuvering a few icons around, making a few gestures, or responding by voice. They'll want to be able to literally say, "Find me a Thai restaurant within 10 minutes of where I am now," and be able to instantly tell all their friends that they're going to be at the Thai restaurant. They'll want to be able to make movies, draw on it, manage all of their finances, pay with it, watch movies, play video games, chat with friends, coordinate activities, etc.

But this is the kicker, which applies especially to you gadget makers: they don't want you to have any control over any of their information. They do not want to play in your "information society". They want their social network to be easy to manage and as expandable and contractible as they like, and they want full control over every feature they do or do not use on this little mega-gizmo. And they don't want you to know anything about them. I.e., they don't want your database compiling econometric information on their buying habits, and they don't want to have to worry about their identity being stolen, whether by losing the device or by having your database raided.

It has to be secure, so secure that no one else could get their info, but at the same time not asking them for a password every few seconds or bothering them every time they link to a wireless network. They want a tool; you want to offer them your technocratic vision of the future; they want what people have always wanted: their social life.

Ergo, I cannot stress this enough: they don't want "your" gadget. They don't care how convenient it is. It's not even that they care loads about privacy; it's about control. They'll gladly tell you much about themselves and share demographic information, so long as it's opt-in rather than a standard feature. Otherwise it turns your wonder-gizmo into a creepy paperweight.

Which brings me to two more points. First: PSP vs. iPod. The iPod has done a lot to encourage both personal customization and buying aftermarket products from third-party vendors, and it has done extremely well this way. Sony has purposely crippled the PSP's OS so that you can't run your own programs on it, which is a stupid, stupid mistake. People want their PSP to be theirs, not Sony's. They don't want Sony to decide what it can and cannot run. It's not as if they'd stop buying Sony products; rather, they're going to buy stuff that Sony doesn't sell, or doesn't sell or package well, and use it. It's a network benefit for the PSP that they completely shot out of the water, which is why I predict the PSP will become a dead-paperweight system pretty shortly.

The second point relates to the conflict I've been trying to illustrate. Technology is moving in two directions. From one perspective, the suppliers of various communication technologies are trying to create an ordered, technocratic informational supra-structure where all data on anyone can be cataloged and accessed at any time, down to their DNA signature. Granted, they don't want, say, the average person to be able to access it, but they want the technocratic bureaucracy to have this information, at least in aggregate, though likely over time it will become specific too. Governments and corporations both fall into this category, as they want it for slightly different but essentially the same reason: it makes their lives easier. They know more about their customers or citizens; they know what they want, what they're doing, and how to manipulate them. One wants this ability for the good of the bottom line, the other for the good of the state.

Now, the other side of the equation is that people want this technology too. They want fluid movement, fluid social networking, ease of use and communication, etc. But they don't want people eavesdropping, compiling them into biometric data, regurgitating specifically designed ads, or knowing where they are all the time. People want the ability to be left alone and not to be watched or cataloged as much as they want fluid networks. In other words, people want a reliable and accurate information supra-structure... and they want no one to control it. They want all their communications to go through essentially a PGP system that doesn't require as much technical knowledge: a global free-net that's easy to use and fast. They'll still gladly shell out demographic data, so long as it's their choice to do so.
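As a rough illustration of the kind of machinery an "easy PGP" would have to hide from its users, here's a toy Diffie-Hellman key exchange, one of the basic primitives by which two people can agree on a shared secret without anyone watching the wire learning it. To be clear, this is a demo sketch, not real PGP: the 64-bit prime is far too small to be secure, and real systems use much larger groups or elliptic curves.

```python
import secrets

P = 0xFFFFFFFFFFFFFFC5  # largest 64-bit prime; demo-sized only, NOT secure
G = 5                    # public base for the exchange

# Each party keeps a private number and publishes only G^private mod P.
alice_secret = secrets.randbelow(P - 2) + 1
bob_secret = secrets.randbelow(P - 2) + 1
alice_public = pow(G, alice_secret, P)
bob_public = pow(G, bob_secret, P)

# Each side combines its own secret with the other's public value.
# Both arrive at the same shared key, yet nothing secret ever crossed
# the wire -- an eavesdropper sees only P, G, and the two public values.
alice_key = pow(bob_public, alice_secret, P)
bob_key = pow(alice_public, bob_secret, P)
assert alice_key == bob_key
```

The user-experience problem is that all of this has to happen invisibly: the moment people have to think about keys and moduli, they stop using it.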

So again: they don't want your information society. They want their mobile commons, their personal city on the hill. In other words: they're human, and they're going to use this in human ways, not in technocratic ways.