| "Each age, each intellectual turn, produces a Socrates of its own." |
|
Let's say that the inspiration behind BitcoinMars.org came to me on the Vancouver SkyTrain, on the way downtown from my job as a warehouse picker. That's not where it came to me, but bear with me, because founding principles can get a little dry without shoehorning a bit of drama into the proceedings. So, set the scene in your mind...
Here I am, Cryptetus -- let's call me 'Cryptetus' -- and Cryptetus is sitting on this train, still in steel-toe sneakers, dog-tired after an 11-hour shift carting merchandise onto conveyor belts. Cryptetus is also wearing a mask. The reason Cryptetus is wearing this mask is not that anyone convinced Cryptetus it was necessary. Nobody tried to convince Cryptetus. Cryptetus once asked people which evidence had convinced them to wear a mask. They could not answer, because nobody had tried to convince them, either. Keep in mind that this is 2021, a time by which Cryptetus, like every British Columbian, is expected to comply absolutely with the political wishes of the most socially connected in-groups regardless of the state of his own opinions, on pain of economic ruin. So Cryptetus does, or Cryptetus would not have a job, and Cryptetus would not be on this train. By the same logic, you can imagine that every person standing or seated around Cryptetus is wearing a mask as well.
It is at this point that Cryptetus is thrown into a random fit of philosophy. It's a condition he has suffered from all his life, although he has never been diagnosed.
Cryptetus says, "What exists? There are a lot of rationales for believing or disbelieving in the existence of various things. But what self-evidently exists with undeniable necessity?"
And it's as if he farted. Everyone is stealing furtive glances, and there is a synchronised, subtle increase in social distancing.
"Don't look at me!" squeaks the glum boy sitting opposite, small and shrinking between his baseball cap and sci-fi electronic gametoy.
"Crazies on the SkyTrain!" shouts a woman with her boyfriend, over a shoulder. He laughs.
Cryptetus says, "Yes, I suppose we do. At least, Descartes thought so. 'I think, therefore I am,' thought he. But was he, really? Or were his thoughts merely self-referential? Was the idea that thoughts can be generated by a being who is determining their progress through the exercise of free will -- say, the being I call 'Descartes' -- was that just another predetermined thought in the series? Or did it describe something real?"
At this, the man in the adjacent seat lowers his sci-fi hand device and shoots Cryptetus a careful, steady glare. Cryptetus leans into it, hunting for what traces of a facial expression can be gleaned in the crinkling of the man's eyes and the trembling of that great white beard that has overstuffed his mask to the point of bursting forth, escaping strands bouncing with the gyrations of the train.
Cryptetus says, "Ha! You are correct, sir. I am, indeed, begging the question. Trust you, Socrates, to shine the hard light of skepticism on my poor, foundering attempts to glimpse meaning in the universe."
Bouncy-beard man returns, glassy-eyed, to at least pretending to read the sci-fi handset in front of him.
Cryptetus says, "Perhaps the most that can be concluded to exist for certain is this line of reasoning itself, and the penumbra of thoughts feeding it -- a penumbra that includes 'my' sense of identity, which could be false, as could my memories, and my sensory perceptions of a body and surrounding world, which could both be generated by something like the Matrix. All of these I can consider 'thoughts' as they play out on the 'screen' of consciousness."
Glum game boy pipes up again with, "Take the blue pill."
Cryptetus says, "Aha! The Matrix, young master, is nothing less than the virtual reality that serves as the repository of all of the knowledge of all of the Time Lords who have ever lived! There is no 'blue pill' nor any other colour in the spires of Gallifrey that will save you from the Matrix of Time."
Game boy says, "Oh. Not Doctor Who."
Cryptetus nods, "Back in the late 1970s."
Game boy says, "Do I have to?"
Cryptetus says, "Have to what?"
Game boy says, "Believe in the '70s?"
Cryptetus says, "Good question. We can make a start on answering it by putting the case for the existence of thought in terms of its barest deductive logic, starting from..."
"1.1. Any belief or disbelief in the existence of thoughts is itself a thought, so thoughts exist."
"1.2. Therefore, a belief in the existence of thoughts is self-evidently right."
"1.3. Further, any disbelief in the existence of thoughts would be self-evidently wrong."
Game boy says, "Don't spoil it for me!"
Cryptetus says, "I'll do my best! I can see clearly now that the self-evidence of thoughts neither relies on nor directly implies the existence of a self, but merely the existence of the perhaps-formless states of belief and disbelief. Statements 1.2 and 1.3 are logical deductions from first principles, and therefore logical necessities -- conclusions that must follow if statement 1.1 is true."
"Well okay then that proves that!" blares the woman with the boyfriend, over her shoulder. The boyfriend laughs.
Cryptetus says, "Irony! The better to teach me with. This has always been your method, master. And point well taken -- for all depends syllogistically on the existence of belief or disbelief in the first place, a conditional similar to the one on which Cartesian philosophy rests: the cogito in cogito ergo sum is assumed, not proven! So the entire argument establishes 'only' a cognitive necessity: if you can think it, it must be true."
Game boy says, "But I can't, so therefore it isn't? Rekt!"
Cryptetus says, "Why it is you, my old friend! You can't hide from me in these infinite guises. That signature gambit of yours of professing false ignorance will always betray you. You know... they named that after you."
Confused now, glum game boy begins to fold himself back again into the space between his well-worn ballcap and his sci-fi gametoy, exchanging sympathetic side-eyes with bouncy-beard man on the way back down, as Cryptetus jabbers on.
Cryptetus says, "The difference between a logical necessity and a cognitive necessity is that a logical necessity is still necessary even if there is no one alive to think so. It is too easy to forget this and come to the conclusion that Descartes proved that cognitive thought must exist in the universe. He did not, nor have I. I have only proven that if a belief in thought exists, then it is justified. It is a cognitive necessity, of a kind only applicable, and indeed only recognisable, by a sapience capable of belief or disbelief. If there were an alternate universe where no belief nor disbelief existed because there was no sapient life, then 'thought' would not necessarily exist in that universe -- at least, not beyond the simplest of emotional reactions to sense perceptions. However, I need not leap to any such conclusions about that universe, because if I can leap to conclusions at all, then I must not be in that universe."
"And how do we get to that universe?" blurts the game boy.
And the bouncy-beard man laughs.
Cryptetus says, "But this is wholly unsatisfying..."
Game boy says, "I'm satisfied."
Cryptetus adds, "...even less so than Descartes's first principles, for his at least appeared to argue persuasively for the existence of a soul, and, by possible reduplication, a world of souls, even if, as Bishop Berkeley argued, there need not necessarily be a genuinely material world undergirding them. But I have proven not even that -- with only a world of thoughts and nary a genuinely proven self in sight, much less a 'soul', I am left somewhere closer to where Hume deposited me, decades ago, in a state of near-perfect skepticism, at Kant's daunting door."
"Check for traps," bleats the glum game boy, his attention now on his gametoy.
Cryptetus says, "I need not, perhaps, go as far as Kant this time into constructing categorical imperatives. Given so overwrought an intellectual framework as his, it might be possible to trace a simpler yet sufficiently certain path toward that same philosophical mountain pass straddled by Immanuel's crumbling old fortress, without having to enter the edifice itself, or mount any attack on it."
"No," says game boy. "Do the fortress. I'm a completionist."
Bouncy-beard man lets out an audible sigh, watching the mountain-backed cityscape roll by outside the train, sci-fi handset now neglected on his lap.
Cryptetus says, "The crested peak all these dudes were building toward seeing over is this: can anything further, other than these thoughts themselves, be proven to exist? And the answer appears to be, not so much. Not you. Not even me."
Game boy says, "That explains a lot!"
Cryptetus says, "I know. It explains nothing!"
Game boy and beard man wear identical incredulous looks.
Cryptetus says, "In fact, even the existence of cause and effect cannot be proven beyond doubt, said Hume. However, putting aside for one day this Western obsession with seeing beyond the veil of perception... what are these perceptions themselves telling me upon which I can or must rely, even if they are just pixels in the Matrix?"
Game boy tips his head, mostly cap, at beard man and says, "Is this the real Matrix now, or Ocarina?"
Beard man says, "I wouldn't know."
Cryptetus applauds, and not sarcastically, "A wise man, ladies and gentlemen! Nor would I, dear Socrates. Nor would I."
Cryptetus adds, "Though... to be fair, these changes in my perceptual matrix are hardly random. They're far too directed for that, and always in pursuit of the persistent interests of that central entity which I call 'I'. Even if I can't prove that I exist outside of the thoughts referencing 'I', the endemic centrality of these thoughts nevertheless implies a perceptual necessity that I believe it. This is a more subtle consequence than it might at first appear, as it relies on the unprovability of external reality itself for its force of necessity, so I need to lay it out again, point by point..."
Game boy says, "But do you need to need to?"
Cryptetus says, "Wow. Great question..."
"What about God?" offers a voice from no apparent direction, causing several careful handset-gazers seated nearby to deflate in unison, as if resigning themselves to a long-suspected fate. Their combined trepidation would have been imperceptible if it hadn't been so synchronised.
"God!" comes the voice again, perhaps from within the labyrinth of not-well-distanced standers on one of the accordion-walled platforms jointing the cars of the train.
"God exists!" says the labyrinth, which earns a series of nods from the woman and her boyfriend. Glum game boy is rolling his eyes now with great exaggeration, and may have been doing so already for some time under that cap.
Cryptetus says, "To make any sense of these thoughts, including perceptions, which again are the only things I know for a fact exist, I need to posit the existence of myself. That's not a cognitive necessity -- I don't need to believe I exist in order to think -- but it is a perceptual necessity that I believe it in order to make any kind of sense of the existence of thought."
"A bit self-involved are we?" snarls that woman with her boyfriend, over that shoulder.
Cryptetus says, "Oh, well put. And well in keeping with your habit of puncturing pride, my dear friend! That I have no other choice but to be self-involved is, of course, my point altogether. If 'I' were nothing but a constructed fiction contained in some unmoored stream of thought, then all of these perceptions would be so universally false and unreliable that nothing useful could ever be learned, and I would do as well to lie down and 'die', although that wouldn't work out either, because I'd never have been 'alive' to begin with."
"Promise?" says the glum game boy, his fingers moving rapidly over his gametoy. Impressed with neither the sci-fi nor his own remark.
Cryptetus says, "Yes. Note that, much as the cognitive necessity of thought stems from the cognition of belief, the perceptual necessity of the self stems from the perception of viewpoint. As with thought, this doesn't prove the self exists. Rather, the existence of the self is an inescapable assumption in obtaining any further knowledge whatsoever, or indeed simply understanding what I see. So I shall not try to escape it any longer."
"Was afraid you were going to say something like that," sighs the bouncy-beard man.
Cryptetus says, "Socrates, touché. You're right, philosophically it's a cop-out, but given that none of us has any choice in this particular matter, unless we wish to conclude counterfactually that the gaining of any reliable knowledge is entirely impossible, we must charge on into this gap!"
Beard man says, "Couldn't you do your gap-charging elsewhere?"
Game boy says, "You don't know this guy?"
Beard man says, "Never met him in my life."
The game boy's laugh is just as you'd imagine: "Bwahahahaaa!"
With the train now at a complete stop, bouncy-beard man propels himself up out of his seat with a bounce and onto the elevated SkyTrain platform outside, as newcomers stream past him into the decreasingly well-distanced compartment.
Cryptetus says, "So I appear to exist as the thinker of these thoughts, which is an idea that I must accept anyhow to investigate further. What further am I, then, besides a thinker? Am I a doer, in the sense of an autonomous being? Do I have free will? What is free will?"
And the glum game boy, of course, says, "I choose not to care."
Cryptetus says, "Perfect. Even as a youth, Socrates, you always played the imp, but always was there wisdom in your folly. For what's free will but the power to choose? And while the 'choice' part is simple enough, there is a little too much wiggle room, I think, in this 'power' -- as in, what kind of power is it?"
"Pretty weak, I'd say," says the game boy, saying.
Cryptetus says, "Ah, but 'weak' is a measure of intensity, and the intensity of, say, one's affection for a pet, is not measurable against the intensity of a bright light, is it? So how can I measure the intensity of my free will without first understanding its category, its kind, so as to know what sort of a standard to measure it against? Is it merely a practical effect, the momentary 'power' to overcome obstacles in order to give rein to my desires, or is it something more metaphysical? Perhaps 'more genuine' would even be appropriate, for a definition of 'free will' with no metaphysical implications does not sound like something that is genuinely 'free' to alter the course of history."
Game boy says, "Maybe it's better if we think we aren't."
Cryptetus says, "If that would be better, young master, then hadn't we better find out whether it is the case?"
Game boy says, "No, it's just, who cares, when history is like, 'Whatever.'"
Cryptetus says, "Well exactly! If history never cares, then what's the difference? I agree with your analysis. A genuinely free will, then, must be the ability to choose, through action or inaction, among multiple, different, genuinely possible futures. So for genuine free will to be exercised in a decision, there must logically be multiple, different, genuinely possible outcomes awaiting that decision; in other words, there must be genuine choices -- which makes perfect sense. But how do I know this kind of 'free will' even exists?"
"Because we aren't kicking your ass," retorts the woman with the boyfriend, over her shoulder. He just sort of lies there, limply.
Cryptetus says, "How would I tell the difference between a universe where Socrates is trying to kick my ass and I have the free will to stop it, and a more deterministic universe where I only think I have free will because I can act on my thoughts to stop Socrates kicking my ass but I don't actually have free will because my anti-asskicking thoughts were predetermined?"
"Try a special hat," comes a disembodied voice from one of the standing labyrinths between the cars. And not a mask among them shifts to betray a speaker. And then comes, "Make it out of foil," followed by unseen giggling.
Cryptetus says, "I've tried. It doesn't do anything. The fact is, I can't tell the difference, and it may even be impossible for anyone ever to tell the difference. And I don't see how the ability of a life form in a deterministic universe to act on its desires can qualify as any kind of valid form of 'free will' because, given only one possible future, that life form cannot logically be free to abandon the desires that will bring it about. It's like that old joke. What is a life form, to a determinist? A tumbling rock that thinks it invented gravity."
"That's racist!" cries the labyrinth, causing a sudden, reflexive hush, but more giggling chases after, dispelling it as quickly.
Cryptetus says, "Anyway, given this perhaps impenetrable uncertainty regarding the existence of free will at all, it seems safest to assume, as is likely, that I will never know for sure whether I have it, so let's do..."
"I know a way," says the labyrinth, amusing itself again, but offers no more.
Cryptetus says, "Am I stumped then?"
Game boy says, "Yes?"
"You're a stump," says the labyrinth.
"Please be stumped," grates the woman with the boyfriend, over her shoulder. His body shakes with laughter.
Cryptetus says, "Or is there some further level of knowledge..."
Groans all around.
Cryptetus adds, "...that can be attained about free will, even in the humility of my acknowledged ignorance?"
Game boy says, "So humble."
Cryptetus says, "Ingeniously implied, friend Socrates. No doubt, my humility could be more genuine -- for example, if I were to genuinely believe that my actions were predetermined, I would feel that no extraordinary effort need be expended in order to escape all my impending predetermined failures. What a lot of bother for nothing!"
"I hear ya, bud!" slurs the drunkard hogging a two-seater across the aisle, having slumbered with his head almost between his knees until this moment. It's not clear that he understands where he is or where he is going.
Cryptetus says, "Similarly, no special effort would seem to be needed to secure my predeterminedly successful future!"
"Looks real successful!" blasts the woman with a boyfriend, over her shoulder. He cranes his neck to display a smug look.
Cryptetus says, "Indeed! Every experience in living practice indicates that if I do not regularly expend effort to preserve myself, as if my fate is in my hands, it is genuinely possible that I will cease to exist on this world sooner than I otherwise would, and that I must act in the belief that my future is genuinely in my control simply in order to survive."
"You can do it man! Don't..." begins the drunkard, and lets out a heavy sigh, losing his train of thought between his knees again, wobbling, his mask hanging loosely from one ear.
Cryptetus says, "I'm afraid, my old libatious friend, that even if this particular thought process were always destined to play out this way in a deterministic universe, its internal logic would still hold, and thus it would still entail that I believe in free will as long as I survive in that universe. In three-point brief..."
"Cry me a river!" shoots the woman with the boyfriend.
"Cry me another river!" adds the boyfriend, over her shoulder.
"Cry me 3.4 rivers!" chirps one of the voices in the labyrinth.
Cryptetus says, "And counting! So the existence of thoughts promoting change, like deciding when and how to eat if you are hungry, logically implies a corresponding implicit belief in free will, though that belief need not be accurate: merely necessary, either in the sense that it enables survival, or in the sense that it was predetermined, or both. This type of necessity -- survival -- while not technically unavoidable (because death is always an option), is unavoidable enough to be a suitable follow-up to the previous two big necessities above. I can accurately call it a practical necessity; one does not absolutely have to be practical enough to survive by behaving as if one has free will; it's just a self-evidently necessary imperative for a form of life..."
"3.6. Wheretofrom," comes another voice, "you will freely exit the train."
Cryptetus says, "What stop is this? 22nd Street? Oh, I've got a ways to go, yet."
Cryptetus says, "So I've established that, if I wish to survive on this world, I must assume without proof that I have genuine free will, but hold on -- what if I don't wish to survive? Just because I may have free will, it does not mean I would automatically desire to keep it. Frankly, exercising free will all the time seems like a lot of work. Why bother?"
"Suicide hotline," offers the glum game boy casually, seeming deeply interested now in all the sci-fi of his gametoy.
"Yeah!" jeers the woman with the boyfriend, over a shoulder.
Cryptetus says, "Perhaps. Unlike the previous questions, this is not a question of what exists, but of what should be done about it."
The woman says, "Call the hotline!"
The boyfriend says, "Seek help!"
Cryptetus says, "In other words, it is an ethical question..."
The woman says, "No it isn't!"
Cryptetus says, "So I'll start this next section by laying down a definition for ethics..."
The woman says, "Get help!"
The boyfriend says, "You need help!"
The woman says, "Oh my god."
The boyfriend says, "Dude, just admit that you have a problem."
Cryptetus says, "Leaving aside the glaring question of what makes for a 'good' choice, for the moment..."
The woman says, "No. Do not leave that aside."
Her boyfriend says, "Try leaving the train aside!"
Cryptetus says, "It seems that the preservation of the will is not just any ethical question. It's the first ethical question -- not in the sense of highest priority, necessarily, but in the sense of it being the first ethical question that I must at least implicitly answer in order to live long enough to ask the rest."
The drunkard says, "Dude just live. It's what I do!"
The boyfriend says, "You need to get laid!"
Cryptetus says, "This I can conclude logically regardless of what a 'good' choice is: whatever ethical case exists for allowing myself to die, it would have to be compelling indeed to outweigh all of the 'good' choices I could make in the future if I were to live. The only way to perform ethical acts in the future is to choose to live. This is by definition..."
Cryptetus adds, "The logic seems inescapable."
"I escape it," says the drunkard, raising a hand.
Cryptetus says, "Since ethics are rules for making choices, I cannot behave ethically if I have no choices, nor can I behave ethically if I am dead, a state in which I presumably also have no choices, or at least none that seem to carry any weight on this world."
"Everyone's ethical in Heaven, man," says the drunk.
Cryptetus says, "So the only known way even to begin to behave ethically is if I first believe I have the free will to behave ethically, and then act to preserve it..."
Cryptetus adds, "I am not aware of any significantly unethical outcome that would be visited upon the world as the result of my survival."
"That's it, bro. Think positive!" says the drunkard.
Cryptetus says, "So, leaving aside for a moment the ethics of dealing with others, it would be self-evidently unethical to exercise my genuine free will, if it exists in this life, in such a manner as to allow it and all my future ethical decisions to be needlessly extinguished, and the need would have to be overridingly great indeed: a personal sacrifice for some moral purpose, say, or the only remaining response to my free will being locked away forever, intolerably beyond my grasp, as by an incapacitating illness or a life of slavery."
"Oh, cry me a life of rivers," says the woman with a boyfriend, not over her shoulder, her face in her handset, her shrivelled heart not even in it anymore. He seems relieved.
Cryptetus says, "So, at this most basic level, although by no means comprising a complete set, among the ethical and thus 'good' choices in life, must be included the rules for how to exercise free will in such a manner as to avoid extinguishing it. Eat when I am hungry; drink when I am thirsty; sleep when I am tired; etc..."
Game boy says, "Go eat something I think you are hungry."
Cryptetus adds, "The fact that preservation of free will is the foundational ethical rule is key. Unless I were to resort to heaping arbitrary extra riders onto the definition of 'good' behaviour, ethics seem to get most of their initial necessity, their 'good' and their 'should', from the value of free will itself."
"God!" comes a dislocated voice again from within the labyrinthine crowd.
Cryptetus says, "'Ethically good choices are those that preserve free will,' seems sufficient so far."
Game boy says, "I free will you to go home now. Get some rest. You been thinking hard."
Cryptetus says, "Oh, this? This is the easy stuff. I have not even put formal labels on any of this yet. For example, for observing that ethical thinking is a prerequisite for human existence, some would label me an 'essentialist', as the existentialists accuse. Essentialism, or what I call 'ethic-essentialism', tends to place ethics in an essential realm, separate from human existence: as a form of karma or the will of the gods. Ethic-essentialists then face the thorny problem of convincing people to care about these invisible forms or distant beings in strange dimensions, as a prerequisite to being ethical. It requires a heavy hand."
Game boy says, "Like... Funimation?"
Cryptetus nods, sagely.
Cryptetus says, "Unfortunately, existentialism suffers from the same flaw in that it tends to separate ethics from man's existence by keeping it coupled with 'essence' and shuffling them together to a distant secondary position, represented by the dictum, 'Existence precedes essence.' So once again, the thorny issue arises of convincing men to care about issues that even the existentialist believes are at a remove from actual existence."
Game boy says, "Just wear gloves."
Cryptetus says, "Why?"
Game boy says, "For all the thorns."
Cryptetus says, "Ah, but why all the gardening?"
Cryptetus adds, "The roses of good ethics are best left where they are found, without trying to separate them from their situation and then keep them alive artificially without their natural source of nutrients. That's roughly what both ethic-essentialists and existentialists do with ethics. What I am leaning toward is better represented by the portmanteau 'ethicstential' in that I don't seek to separate ethics from human existence, but rather recognise them as a prerequisite for it. So there is no need to convince men to think ethically, because they already do this naturally, or else they would not have already done themselves the ethical kindness of staying alive."
"That is not ethics," says the woman with the boyfriend, no longer over a shoulder, not at anyone in particular. "That is selfishness."
Game boy busies himself with his sci-fi gametoy, as if it suddenly calls for his attention, though it is silent.
Cryptetus says, "Perhaps. Or perhaps an ethicstentialist definition of 'good ethics' can travel. We shall soon see..."
Game boy says, "But 'ethicstentialists' aren't real. Nobody calls anybody that. I just googlebinged it. You made that up!"
Cryptetus says, "Neither were 'existentialists' called that, young Socrates, before the end of the World Wars. But I am not seriously proposing 'ethicstentialism' as a label, merely a simplified way to refer to the ethics that enable human life."
Cryptetus says, "Let's assume now that I've already made the ethical leap advocated by the principle of ethical necessity I've been pointing at, and have decided to work to survive and thus protect all the 'good' things I might do one day with my free will. What then might those things ethically be?"
The train rolls to a stop again, and exchanges passengers with the platform outside. Among the new arrivals is Emporius.
Emporius says, "Cryptetus!"
Cryptetus says, "Oh excellent, it is my good friend, Emporius! Come to witness the next leg of my philosophical journey?"
Emporius situates himself in the just-re-vacated seat adjacent.
Emporius says, "Actually, I am headed to Commercial Drive, but I got time. I'll stay on 'til the end. Is this about ethicstentialism?"
"Ha!" blurts the game boy sitting opposite, tapping around and apparently 'googlebinging' various things on his gametoy.
Cryptetus says, "How should I use my free will, Emporius? And do others have it, too...? Do you?"
Emporius says, "Well, how do you know you have it?"
Cryptetus says, "Long story. We've been over it. See above!"
Emporius takes in the hangdog looks on the other passengers' faces, and you can see the grin in his eyes.
Cryptetus adds, "We're assuming I have free will, for now, and by 'for now' I mean, for as long as I wish to survive on this planet."
Emporius says, "I... see."
Cryptetus adds, "What I am really wondering now is, what do I do with it? Seems like an important question, does it not, now that I have decided that I have it, and to keep it?"
Emporius says, "No doubt."
Cryptetus adds, "There are very few basic survival instincts I can follow that appear absolutely innate, like breathing. Most of the specific knowledge I use to survive in this world was not innate. I believe I had to learn the ethics of caring for and protecting my free will, because..."
Emporius says, "What if your memories were implanted? Have you been over this?"
Cryptetus says, "Briefly. But the statement is still true, as far as it goes."
"Fair enough," says Emporius, but looks skeptical.
Cryptetus says, "Barring implants then..."
"Thank you," says Emporius.
Cryptetus says, "...the only alternative to the way I recall being raised from ignorance, would have been to experiment by trial & error with the ethical necessity of survival for myself, which would have been hazardous in such a naïve state."
Emporius says, "You'd have died very young, without question. Barring implants."
Cryptetus says, "Barring implants. To start out with, at least... I had no choice but to look for role models in my sensory perceptions of the world, and not randomly, either, nor in the inanimate objects. I could only have modelled the conduct of my free will on that of the other beings who appeared to possess, like me, a genuine free will..."
Cryptetus says, "So my free will not only appears to me to be like that of others, I must also learn, at least at first, to use it as they do. But I cannot consider their actions to be genuinely analogous for the determining of my own without considering their free will to be 'genuine' as my own..."
Emporius says, "Sensible."
Cryptetus nods, "And we needn't bar any implants for this."
Emporius says, "True."
Cryptetus says, "Now if, as I concluded before you arrived, it is self-evidently unethical to exercise my genuine free will in such a manner as to allow it and all my future ethical decisions to be needlessly extinguished..."
Emporius says, "It's only self-evident with that 'needlessly' rider."
Cryptetus says, "Perceptive, as always, my good man. But then, must it not also be self-evidently unethical to exercise my free will in such a manner as to allow the admittedly genuine free will of others, and thus all their future ethical decisions, to be needlessly extinguished...?"
Emporius says, "I'm getting a bad feeling about this."
Cryptetus says, "Relax, Emporius. All I'm getting at is..."
Emporius says, "That's exactly what I was afraid of."
Cryptetus says, "I don't think so. I think this escapes your usual objections to arguments for pure altruism. For one thing, the duty entailed is to help preserve the existence of free will, not fulfill its requests yourself on demand. For another, there is no duty implied to subordinate the satisfaction of one's own will to that of another. Nor vice versa!"
Emporius says, "Well then, how would you resolve that?"
Cryptetus says, "It depends on the situation, of course, but the only ethically grounded way would seem to be for each free will to be permitted to make its own decisions according to its own view of the best way forward for the preservation of the future exercise of free will in general."
Emporius nods, "But isn't this all just argument from analogy?"
Cryptetus says, "Which analogy?"
Emporius says, "That other free wills are analogous."
Cryptetus says, "The analogy of ethical reciprocity? It's an analogy, true. But it is not what they call 'argument from analogy'. That kind of argument would state that because two things seem analogous in some ways, they are likely to be analogous in at least one additional way."
Emporius nods, "It's induction, which can't be conclusive."
Cryptetus says, "Right. But I am not suggesting any additional aspects be added to the analogy between my free will and that of others. Instead, I am reversing the analogy, and operating it on the same aspects I originally had to in order to learn how to survive in the first place, but in the opposite direction. And we can go further than induction along this track, because the reversibility of analogies is logically unavoidable."
Emporius says, "Is it?"
Cryptetus says, "I think so, yes."
Emporius says, "Can you give me an example?"
Cryptetus says, "Emporius, if I were to say that only an ass is as stubborn as you, would it be logically consistent for me to add that furthermore, you are more stubborn even than an ass?"
Emporius says, "It would be funny, like, over-the-top..."
Cryptetus says, "...ridiculous, and a contradiction, because any analogy should be reversible on its own terms as far as it goes?"
Emporius says, "As far as it goes, yes."
Cryptetus says, "And that's a logical deduction, which is stronger than classic 'argument from analogy', isn't it? Stronger than induction?"
Emporius says, "It would seem so."
Cryptetus says, "And that is the strength I see in the logical entailment of ethical reciprocity with the ethical necessity of our survival. It is in modelling my behaviour on the minds of others that their free will was logically entailed to be of a similar nature to my own, separated only by perspective in that I can only perceive theirs from the outside, and vice versa. It is this kind of 'genuine free will as perceived from the outside' that I refer to as 'sapient life'."
"Just sap life," says game boy. "Hey, what about dolphins?"
Emporius says, "Not saying I disagree with that, but not sure ending up here was the only possible outcome from where we started. We sort of glossed over some stuff, like free will. What if all these others are just deterministic automatons, or agents in the Matrix?"
Game boy says, "You mean the repository of all the knowledge of all of the Time Lords who lived in...Gallilee?"
Cryptetus says, "Gallifrey."
Emporius says, "Uh, no. What?"
Cryptetus says, "This is why it is so important that I got here from first principles, having been freshly reminded that I can't absolutely prove my own free will, either. I, too, could be an automaton!"
"You are, man!" says the drunkard in the two-seater across. "We all are. Wake up, it's the lizard men, hee hee hee!"
"Et tu, Brute?" says Cryptetus.
Emporius says, "Been making friends, have we?"
Cryptetus says, "Believing in my own free will and believing in that of others were both the practical necessities of survival. That I have free will is something I must implicitly accept in order to live, and that they have it, something I must have implicitly accepted in order to have learned how. If I were never willing to behave as if I accepted both axioms, then as the contents of my memory universally indicate, I would not have lived, even if that kind of extreme skepticism happened to be metaphysically correct."
Emporius says, "Okay. No more barring implants then. What if your memories of 'learning how' really were just science fiction implants?"
Cryptetus says, "That would be enough to throw their accuracy into doubt, for sure, but not necessarily the analogy with their ethics. If such memories are implanted, and one day I discover the truth, I don't think it would change much about my ethical worldview. Those memories would still have formed my primary role models for survival, and so the implanter of those memories must have modelled them on roles that worked. If I continued thereafter to follow even part of the ethical model that I gleaned from those implanted memories, I would still be logically implicated in the ethical reciprocity of modelling my free will on that of others -- it's just that in this case, those others would be at some mysterious remove, interpolated by the designer of my memories."
Emporius says, "So you would feel an ethical duty to act to preserve the free will of your own memory implanter?"
Cryptetus says, "No, but knowing that I was implanted with memories would not weaken my logical basis for concluding, based on the usefulness of those borrowed memories as a role model, that the kind of free will exercised by their originator is, or was, genuine -- as my own, though put to a deceitful purpose."
Emporius says, "And what if the implanted memories were originated by some unthinking algorithm, instead of free will?"
Cryptetus says, "Nice one... well, if I were to trust an unthinking algorithm to guide my free will, then actually, I think I would be trusting in the will of the algorithm's creator, once again placing myself in ethical reciprocity with free will as exhibited by others. It's pretty hard to avoid!"
Emporius says, "Yeah, okay, but... hear me out. What if the implanted memories aren't even of any role models at all? They're of a process of survival by ethical experimentation from birth, which seems impossible, I know..."
Cryptetus says, "We've been over this."
Emporius says, "...but could easily be written into the script of an implanted fiction?"
Cryptetus says, "Now this is an interesting idea, Emporius, because, according to ethical reciprocity, this type of memory implant could indeed produce a radically post-reciprocal Frankenstein's monster. However, because such memories would have to be fictional, they could only be implanted in someone who had already been raised in the usual, non-fictional way, using role models. Dr. Frankenstein would have to not only implant the false memories but also erase all the genuine childhood memories, which is, in a sense, what he did."
Emporius says, "With electricity."
Cryptetus says, "Sure."
Emporius says, "But how did he implant false memories?"
Cryptetus says, "He didn't. He just accelerated his creation past the stage where it needed ethical help to survive, by making it out of fully adult body parts, thus creating a 'monster' capable of raising itself from ignorance without role models. But even if he did implant memories, how could Dr. Frankenstein have been certain that the original memories of the brain he borrowed to make his monster would not have survived in some form to subconsciously re-assert the logic of ethical reciprocity?"
Emporius says, "Build an artificial intelligence. If an intelligent machine wasn't 'raised', wouldn't it fail to be ethical? Is this a loophole in your ethical reciprocity?"
Cryptetus says, "In light of what we originally learned from Shelley's Frankenstein, I'd have to say that yes, the apparent lack of any logical basis for a conscience in such a creature is rightly a widespread concern when it comes to A.I., which is why Asimov's Three Laws of Robotics have gotten so much play."
Emporius says, "But for those laws to work, the robot must be forced to follow the instructions of its creator without question, right? So, what if the creator is corrupt?"
Cryptetus says, "When the robot is forced to follow its creator's laws, then it is not acting out of free will at all, so its actions are ethically the product of its creator's will. It would be the perfect willing accomplice! It might be safer for an A.I. to be given a complete set of human-like memories, full of good role models that produced positive ethical outcomes to freely emulate, perhaps even a direct copy of a real human being's memories..."
Emporius says, "...like Tyrell's niece's, in Blade Runner!"
Cryptetus says, "Exactly, and that's a movie that carries a lot of philosophical weight for its unprecedentedly subtle treatment of such matters. A memory-implanted artificial intelligence, with an apparently genuine free will, should not be much different ethically, if at all, from an 'implanted' human, nor treated differently, except in one key respect: an A.I. may not have any foundational ethical memories to replace. It can be a clean slate for some random creator's completely arbitrary, possibly criminal memory implantations."
Emporius says, "It could even learn how to preserve itself by experimenting for real -- no implanted fictions or grave-robbing this time -- just 'reboot' or reclone itself after every failure, so that it won't be involved in any ethical reciprocity."
Cryptetus says, "Yes, the possibility of the instantiation of a fully intelligent and independently long-lived experimental ethical actor, but with a blank slate where there 'should' have been an ethically grounding childhood, seems to be, from first principles, the most dangerous possible ethical outlook for artificial intelligence. It's Skynet."
Emporius says, "And now it's The Terminator. Hey, aren't we just proving out '80s sci-fi blockbusters?"
Cryptetus says, "Those movies stood on the shoulders of great writers and thinkers who had trodden that territory earlier. As do I. Don't you find it interesting, though, that all these ethical corner cases work themselves out, ethicstentially, to intuitively sensible conclusions? Earlier, I skipped figuring out what is ethically 'good', in favour of focusing on what is ethically necessary, but as it turns out, it seems there is a way to unfold only what is ethically and logically necessary into a reasonably full and fair accounting of what is ethically 'good'."
Emporius says, "I agree. The simple protection of free will is a sufficient ground to build the whole system of ethics on -- at least, until the rise of the machines!"
Cryptetus says, "It might become tempting for us -- a couple of humans -- to wax sentimental about this defence of human altruism against various artificial forms of ethics, so I should point out that the ethical entailment I see between my free will and that of others does not rely for its persuasive force on any sense of emotion whatsoever. Emotions are often unpredictably determined, and not always compatible with genuine ethical reciprocity, nor are they required for the application of that principle in everyday life."
Emporius says, "So this, finally, is genuine progress."
Cryptetus says, "One of the most satisfying things about the ethical alignments I have glimpsed here is that they reveal, in the earliest logical consequences of human self-preservation, a relatively simple path toward the most cosmically enlightened ethical perspectives. This path can be found not through tribe, nor family, nor essence, nor being, but only through a type of action: the greater fulfillment of genuine free will in the universe."
Cryptetus says, "Well, let's see. I've worked out from first principles how to preserve my free will, with role models, and the ethical duty this imposes also to preserve the free will of others. So how do I do that? I cannot just help everyone who appears to have free will. There are billions."
Emporius says, "Actually, it is possible to help most of them. With markets."
Cryptetus says, "Right. But have you ever noticed, Emporius, that, philosophically, markets are most often approached from the perspective of a purely selfish participant?"
Emporius says, "Yes."
Cryptetus says, "And yet, for each portion of help obtained from a market, some help is given away to it. Isn't it strange, Emporius, that so few are interested in treating the altruistic effects of the market as if they are working as intended? Altruistic effects undeniably form at least half of the consequences of any free and fair market, and yet this enormous increase in the help available to others must so often be framed as nothing more than the unintended side effect of a meeting of purely selfish actors."
Emporius says, "Well, I do find it annoying that those who write about markets tend to try to explain away their altruism."
Cryptetus says, "And those who are anti-market and those who are pro-market in their beliefs, both tend to look at them as driven by self-interest, do they not?"
Emporius says, "Yes, that's often true, now that you mention it."
Cryptetus says, "Perhaps a clearer-eyed, and certainly a simpler, approach to markets is from an ethicstential perspective, in which the offering and the obtaining of help are both interpreted as the direct goals of a process specifically designed, brought into being, and intended by most participants to be ethical, and thus to ensure that both goals remain preservable into the future."
Emporius says, "The way I think of it is, everyone who is trading with me on fair terms is trying to win by helping me."
Cryptetus says, "For the most part, I agree. But first things first. Are markets even necessary from an ethical perspective?"
Emporius says, "Of course."
Cryptetus says, "But are we really so sure I can't just help everyone without them, including myself, no strings attached?"
Emporius says, "Of course you can't."
Cryptetus says, "But let's say I need to try. Along those lines, I can start with the easiest ways, that cost me nothing, such as helping people by doing nothing. I don't need a market for the things I don't do -- in fact, they typically require no effort at all, so considering all the hundreds of harms I choose not to inflict every day, as an ethical rule 'do no harm' is guaranteed to offer the best general 'bang for the buck'..."
Cryptetus says, "And it's not just the harm of ill intent that can be mitigated by inaction, is it? In fact, any coercive imposition, even with good intentions, is in itself a violation of free will, and could result in unpredictable additional harm due to being ill-suited or ill-timed to purpose, couldn't it? So the next easiest way to be ethical is to impose no help, either, unless some harm could clearly come to a person too incapacitated or immature to seek help for themselves. In summary..."
Cryptetus says, "So that takes care of impositions."
Emporius says, "But it sounds like you're saying nobody can help anybody, unless they're like, crippled or dying."
Cryptetus says, "Not if the help is imposed, no. But you are correct that something is missing, because ethical reciprocity still indicates that there should be a way to actively help preserve the free will of others, just as I do my own. Given the above premises, the only ethical means to that end, then, would be to offer help without forcing the issue..."
Emporius says, "That stuff is all well and good on its own, in the cases of offering to help to those we care about, but offering help to more than a few involves risks."
Cryptetus adds, "Yes indeed -- it risks unintended unethical outcomes that could end up harming free will. For example..."
Cryptetus says, "They might be lying and their true goals might be not just to advance their own free will but also to unethically hamper or even extinguish my own free will or that of others. So helping them achieve those goals might not turn out to be ethical."
Emporius says, "Very true, but what I was referring to was something more like..."
Cryptetus says, "Hmf... agreed! If I help too many others achieve their goals without enough others helping me to achieve mine, I may exhaust my resources, which would threaten the exercise of my own free will. Allowing my own future choices and goals to be needlessly cancelled would not be an ethical treatment of free will, either. So I could not help very many people out simply by offering, due to the risks."
Emporius says, "It's even worse than that. In practice, my awareness of these risks might be enough to prevent me from even getting started."
Cryptetus says, "Right, unless I had a significant surplus of resources to give away, which I don't. Therefore, in the long run, simply offering to help people achieve their goals is not going to be an effectively sustainable way for me or most others to help preserve everyone's free will: not a lot of it, anyway. But the principle of ethical reciprocity is quite clear in its simplicity that my duty is to help preserve not just some of the genuine free wills out there, not just that of my neighbours or the first few people I meet who are in need, but all of them. So how do I do that?"
Emporius says, "Markets?"
Cryptetus says, "I mean, isn't it just impossible?"
Emporius says, "No?"
Cryptetus says, "Time does not appear likely to be a renewable resource for me, so shouldn't I just provide what little help I can to a select few of the people I meet, and call it a life?"
Emporius says, "No."
Cryptetus says, "But, maybe I should!"
Emporius says, "No!"
Cryptetus says, "Well... alright, then, Emporius. Since you insist... there is, in fact, a way to manage both of the major risks of helping strangers simultaneously, and more importantly, that method also exponentially increases the number of choices available to genuine free will, and thus helps everybody, in the course of helping somebody."
Emporius says, "Markets."
Cryptetus says, "That method is, of course..."
Emporius says, "Markets!"
Cryptetus says, "...trade, and the markets where it occurs..."
Cryptetus says, "The effect of trade on the two biggest risks that come with offering help is that it 'kills' both birds with one stone. Seeking an equal exchange of help as a standard practice ensures that those with secretly unethical goals will have to give up resources to make much headway, slowing them down and buying time to detect their deceptions. It also ensures that help given is balanced by help received, allowing each actor to serve the ethical reciprocity of helping the general exercise of free will while safeguarding the ethical necessity of preserving it for oneself. When it comes to the total preservation of free will, though, the risk-mitigating effects of free trade seem tiny by comparison with the network effects..."
Emporius says, "True. The mere availability of the option of meeting up with strangers to formally exchange help-for-help greatly increases the value of helping, to the helper, and thus the likelihood that help will happen, in an effect called 'liquidity'. So helping to keep a free market viable and populated, by trading through it, is an act that helps to preserve the free will of every single being with access to that market. That's helping a heck of a lot of free will!"
Cryptetus says, "Maybe even all of it..."
Cryptetus says, "Trade, then, appears to be the optimal solution to distributing help from the perspective of maximising the exercise of free will, which, as we have seen, has the force of logic as the foundational principle of ethics. So the free market is not ethically neutral. It inherently works toward furthering the development of free will. A free market is an ethical machine, and it should therefore not be interdicted in any of its useful forms, unless not to interdict it would inevitably lead to overridingly unethical consequences -- but neither can the free-willed be ethically prevented from using an uninterdictable medium of trade."
Emporius says, "Like cash."
Cryptetus says, "Like pocket cash, yes. Since free trade is inherently ethical and preservative of free will, to ban an entire uninterdictable medium of trade, like physical cash, instead of seeking to punish truly unethical actors case-by-case, would be an unconscionably far-reaching attack on free will."
Emporius says, "Hear, hear!"
Cryptetus says, "Though, not every cooperation requires an explicit trade, does it? Often people just do favours for loosely acquainted others, even many such others, with no upfront expectation of a reward. And it is also true that the others so favoured often 'return the favour' nevertheless, to the giver, or 'pay it forward' by doing a favour for someone else..."
Emporius says, "Yyyeah, even though that's not a trade, the returning or recursing of favours still earns some of the basic advantages of trade! It offers assurances of good intent after the fact, and it increases the total liquidity of help available in the pursuit of good works of free will. So a free-willed person of means who feels they can afford a few risks, might donate to charity or public projects in the hopes of provoking positive change in the community, or to specific individuals on the theory that they might achieve mutually valued goals or that favours might be offered in return, cementing or repairing relationships."
Cryptetus says, "Well said, my friend. But we must be careful not to carry over, in our enthusiasm for favours, all the ethical expectations of trade. 'Return' favours, particularly of an unpredictable kind, cannot really ethically be demanded in these situations, because a truly ethical trade must have been explicitly agreed to beforehand in order to preserve the free will of both parties."
Emporius says, "Absolutely agreed. In any case, there are good reasons to do a favour directly for someone on occasion, and I shall continue to do so, but with the awareness that with every resource I withhold from the markets, I skip an opportunity to help also preserve those markets."
Cryptetus nods, "Even if it's just, for example, routing a voluntary donation through my favourite currency market, instead of using state cash, thereby helping that market remain liquid enough to preserve the free will of all participants. Which hopefully brings us to that topic..."
Emporius says, "What topic?"
Cryptetus says, "Well, you missed the earlier bits, but so far, Emporius, I appear to have worked out from first principles...
"(1) to believe that thought exists is a cognitive necessity;
"(2) to believe that I exist is a perceptual necessity;
"(3) to believe that I have free will is a practical necessity;
"(4) to believe that I should preserve free will is an ethical necessity;
"(5) to believe that others have genuine free will is also a practical necessity, and that I should preserve the free will of others is the natural logical consequence called 'ethical reciprocity'; and now
"(6) to believe that the most effective way to preserve everyone's free will at once is to participate in free markets."
Emporius says, "That's great, Cryptetus. I'm not sure, though, that it really tells us enough. It's easy to say 'free' markets, but which ones? There are a lot that claim they're 'free'. I see so many markets now that helping them is like helping people. How do we choose? We can't help them all."
Cryptetus says, "Perhaps once again, we should start with the easy steps. Would it not be more ethical to trade in the markets that I am more likely to trade in freely, as that would preserve more of my genuine free will?"
Emporius says, "Yes."
Cryptetus says, "So in order to maximise my own participation, is it not an ethical necessity that I should choose markets with a wide variety of offerings suited to my tastes?"
Emporius says, "Agreed, as far as that goes, but for ethical reciprocity, as you say, shouldn't we trade in the markets that are the most free, since they preserve the free will of all the participants?"
Cryptetus says, "Emporius, we are not talking at cross purposes, because luckily, markets aren't normally mutually exclusive. While we can't join them all, we can certainly participate in more than one, so there is no reason we can't help preserve the markets most amenable to our own separate free wills as well as the markets most amenable to free will in general."
Emporius says, "Right."
Cryptetus says, "In order to achieve either, though, as you wisely pointed out, I have to know which markets those are. The most basic form of market is the barter market, and barter itself had to have been a vast ethical improvement over the prior status quo. However..."
Emporius says, "In the preservation of free will, barter is flawed as a market solution because it requires that I find someone who values the help I offer, and offers the help I value, both at the same time -- the classic 'double coincidence of wants'. I might have to over-offer to get help that is both good and timely..."
Emporius says, "Markets can be made much more liquid by decoupling the traders in time using an intermediating, portable token of exchange that is difficult to counterfeit. Gold coins are the classic. Gold can also be melted down to make jewelry and tech, but since not everyone has that capability, those aspects of the value of gold make it a less neutral medium of exchange."
Cryptetus says, "Interesting. I hadn't considered that. It favours the rich, doesn't it?"
Emporius says, "Basically. A neutral currency is better suited to mediate exchanges without bias. Currency also has another benefit..."
Emporius says, "If a barter market is an ethical machine because it multiplies choices, then a currency-based free market must be leaps and bounds more ethical, given its extra powers of multiplicity. By preserving the value of a trade for months or years after the fact, waiting to be redeemed in another trade, currency unleashes choices to be available on epic scales. This time shifting is what we refer to as a 'store of value'."
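(A brief aside from your narrator: Emporius's two points here -- that barter demands a double coincidence of wants, while a neutral token decouples traders in time -- can be sketched in a toy simulation. The traders, goods, and matching rules below are invented purely for illustration; nothing in the dialogue specifies them.)

```python
# Toy model: three traders whose wants form a cycle, so no pair of them
# can barter directly, yet a token of exchange satisfies everyone.
offers = {"A": "bread", "B": "shoes", "C": "fish"}
wants  = {"A": "shoes", "B": "fish",  "C": "bread"}

def barter_pairs(offers, wants):
    """Direct barter works only on a 'double coincidence of wants':
    two traders who each offer exactly what the other wants."""
    names = sorted(offers)
    return [(x, y)
            for i, x in enumerate(names)
            for y in names[i + 1:]
            if offers[x] == wants[y] and offers[y] == wants[x]]

def token_trades(offers, wants):
    """With a token (currency), each trader can sell now for tokens and
    buy later, so every want in the cycle can be satisfied."""
    trades = []
    for buyer, good in wants.items():
        sellers = [s for s, g in offers.items() if g == good]
        if sellers:
            trades.append((buyer, sellers[0], good))
    return trades

print(barter_pairs(offers, wants))   # [] -- no double coincidence exists
print(token_trades(offers, wants))   # three trades; every want is met
```

The point of the sketch is only that the token breaks the simultaneity requirement: the same wants that admit zero barter matches admit a full set of trades once sellers can accept tokens instead of goods.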
Cryptetus says, "Errantly."
Emporius says, "What?"
Cryptetus says, "What is errantly referred to as a 'store of value' is usually just a delayed exchange, without which there would be no value, 'stored' or otherwise. There is nothing 'stored' but the physical item itself. Its value is in the human mind, and not 'stored' at all."
Emporius says, "Splitting hairs there a bit?"
Cryptetus says, "You may not think so, in the post-apocalypse!"
Emporius says, "Fair enough..."
Emporius says, "The main risk of cash is how easily it can be stolen, in a way that other big assets, like a house, or the ability to fix a roof, cannot be."
Cryptetus says, "It's not like cash is trivially easy to steal, though, is it? It is still physical, tangible, and can be hidden or locked away, like other small goods. So the net ethical effect of cash currency seems positive. Shouldn't I help to preserve a pocket cash market, the existence of which is by no means guaranteed, by using it, at least as long as there continue to be few strings attached?"
Emporius says, "Depends. How few do you want? Because one of those strings is the potential abuse of control by political authorities. For example, printing too much cash."
Cryptetus says, "Wait. Doesn't cash just represent help? How can there be 'too much' help?"
Emporius says, "There is always a demand for help in general, that's true, but not for every specific type. Certain types of help can become oversupplied or no longer in demand."
Cryptetus says, "Sure, but since cash is meant to value help in general, why shouldn't its value enjoy the same stability?"
Emporius says, "It probably should. But when the supply of cash to pay for 'help' is increased by the state independently from the supply of actual 'help' out there, there is a risk that the state's cash will get rapidly devalued. So yes, unlike help in general, cash can accumulate 'too much' liquidity. On the other hand, spatial liquidity, that comes from removing barriers, is something you can't really have too much of, and that's where e-cash comes in..."
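(Another aside from your narrator: Emporius's worry about printing 'too much' cash can be put as crude arithmetic, assuming, very roughly, that the price of help settles at the ratio of cash in circulation to help actually on offer. The numbers are invented for illustration.)

```python
# Crude sketch: if the state doubles the cash supply while the supply of
# actual help (goods and services) stays fixed, each unit of cash buys
# half as much help as before.
cash_supply = 1_000_000          # units of state cash (illustrative)
help_supply = 100_000            # units of help traded per period

price_per_help = cash_supply / help_supply   # 10.0 cash per unit of help

cash_supply *= 2                 # the state prints more cash
new_price = cash_supply / help_supply        # 20.0 cash per unit of help

print(price_per_help, new_price)
```

Under these toy assumptions (velocity and output held fixed), devaluation falls out immediately: the cash did not create more help, so it can only bid up the price of the help that exists.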
Emporius adds, "So, all three main drawbacks of pocket cash are addressed by electronic transfer through bank cards or payment apps: (a) it is near instant; (b) an institution authenticates ownership; and (c) that institution takes custody of the funds and handles loss prevention. And by the way, addressing that first main drawback of pocket cash, with instant delivery, has created another exponential explosion in the liquidity of help."
Cryptetus says, "How fitting, Emporius, that you have stressed the enormous potential of electronic cash, so that we may understand just what we are missing when it is compromised, for in addressing the main drawbacks of pocket cash, the custodial electronic cash offered by banks and paypals tragically sacrifices two of its main advantages..."
Cryptetus adds, "Standard electronic cash has deep ethical problems due to the imposition of these custodians. Electronic transactions are too easy to reverse. Electronic funds are too easily frozen and seized. Electronic accounts normally require registration, often requesting identifying information which can then be leaked and/or used in a hostile web search to justify flagging my account for suspension as an act of thinly disguised political censorship. And each of these effects has a penumbral chilling effect, so taken together they can massively decrease the temporal liquidity of help -- as in people's willingness to trust the exchange value of electronic cash over the long haul."
Emporius says, "True. I'm really not comfortable with the idea that I can be separated from my cash, whatever the reason, and it's as easy as a tap of a key in some office. And if my e-cash is pegged to 'the dollar', then it will experience the same state-manipulated inflation as 'the dollar'."
Cryptetus says, "It's not all bad. The instantaneity of electronic cash is extremely pro-ethical for the spatial liquidity of help, but if it can't be accessed without compromising the temporal liquidity of help already afforded by material cash, then custodial electronic cash is, at best, an intermediate stage in the evolution of cash, and at worst, a regression in the preservation of free will."
Emporius says, "That might explain why cash has survived long after the spread of bank cards and online payments, even though most of the science fiction predicted it disappearing."
Cryptetus says, "And it also explains why the establishment of an alternative form of electronic cash, without custodians, is so important..."
Cryptetus adds, "It's clear that there is still a lot of untapped growth in the ethical potential of electronic cash, a heretofore unsettled promised land where the unrestricted spatial and temporal liquidities of help can finally intersect. Despite there having been already at least two exponential upscalings of liquidity in our mediums of exchanging help, there can be no doubt that if non-custodial cryptocurrencies manage to thrive and accumulate users, a third such great liquidity rush of help, the first combined spatiotemporal liquidity rush, is in the cards, with all the attendant economic benefits."
Emporius says, "When fully adopted, and eliminating any remaining user interface drag, non-custodial cryptocurrencies will be the most liquid viable markets in human history! Those are the markets that I want to be in."
Cryptetus says, "Indeed. The technology of cash needs a genuine commons to thrive, no matter how advanced electronically it becomes. Cash has been fettered for most of history along one axis or another. We would all be far better served if our electronic cash could run free, at play in the fields of time and space. Is it not in the best interests of all the genuine free will on this planet that we help our most promising cryptocurrencies onto that lofty plain to which they aspire, as best we can?"
Emporius says, "Why even restrict the freedom to this planet?"
Cryptetus says, "Why, indeed? Especially since the drawbacks of custodial e-cash will be severely aggravated during early colonisation efforts on other worlds, where the resources expended to actually meet anyone in person to handle pocket cash may be prohibitive in most cases."
Emporius says, "Oh crap. You're right!"
Cryptetus says, "I think it's even worth another principle or three, just to make it very clear whose priorities a cryptocurrency should be expected to serve in these situations..."
Emporius says, "Just three?"
Cryptetus says, "Haven't we built a strong enough theoretical basis to make them count? At the apex of this pyramid of necessities built out of first principles already stand cryptocurrencies, the beacons that could serve as free will's I/O tower to transmit itself out into the universe without interdiction from some custodial master control program."
Emporius says, "Larping Tron now?"
Cryptetus says, "Oh, aren't we all...?"
Emporius says, "Interesting point."
Cryptetus says, "'Sorry! Can't let you in right now. Rationing.'"
Emporius says, "'Yes. Please deposit goods on sand dune.'"
Cryptetus says, "'Mhm thanks, we'll just get to that later.'"
Emporius says, "'Yeah. When there's a lot of it.'"
Cryptetus says, "'What's your bitcoinmars: address?'"
Emporius says, "Perfect! Hope I live to see that day."
Cryptetus says, "It's not a certainty that day will ever come..."
Emporius says, "How do we stop this?"
Cryptetus says, "You mean, while remaining in ethical reciprocity with the genuine free will of others at all times?"
Emporius says, "Yes."
Cryptetus says, "Persuasion. Persuade people to transact voluminously here on Earth, and with exactly the same kinds of cryptocurrencies we'll need to ensure the continuity of free will throughout the universe. Which means, we need to talk about which kinds of cryptocurrencies those might be, and not just as they are now, but in their potential future forms. We need to talk about that a lot, including nitpicky types of details that few others will be thinking through yet, but which could make all the difference. And we need to do it here on Earth, because once we are out there, in space, it will be too late."
Emporius says, "That is true. In fact, the custodians will each have been positioning themselves for dominance in the new market for years beforehand."
Cryptetus says, "I think they've already started..."
Emporius says, "Options for resistance are fewer in space, yes."
Cryptetus adds, "Custodial e-cash plus off-world colonies are a recipe for totalitarianism that must be disrupted before they get served up to us 'hot'. One thing we can do right now is change what is perceived 'on the ground', by the oligarchs who will be cooperating to fund the earliest colonies, to Mars, and beyond. We need them to think..."
Emporius says, "Well, hold on. If we can avoid telling them what to think, we might be able to convince at least some of them, through word and deed, to actually devote resources and eventually rockets to establishing the safest, most permissionless, future-proof decentralised cryptocurrencies, everywhere a colony is going to be."
Cryptetus says, "Not just establishing, my dear friend Emporius. Pre-establishing, to make sure to seed the solar system, well in advance of our movements, with resource depots that are economically healthful, and remain compatible with the free-will-protecting expectations of the human race."
Emporius says, "So we'll be inoculating the galaxy against economic tyranny, years and decades ahead of the spread of our space colonies?"
Cryptetus says, "At least a few years ahead, yes. In fact, I suspect this is already in the works, by someone, somewhere. It's just a question of which cryptocurrencies will get there, and start gaining 'extraterrestrial' value, first."
Emporius says, "That... seems like it just might work."
Cryptetus nods, "For the reasons given earlier, however, it might not happen unless we can make the thought of using a 'Bitcoin Mars' and other freedom-loving crypto in space into more than a thought: a vision so clear and compelling that it can fire the popular imagination while we are still on Planet Earth."
