V. ([personal profile] vvvvv) wrote 2014-07-12 08:12 pm

009 - I like to think of a cybernetic ecology || sherlock holmes (android au)

Sherlock remembers to breathe at the sound of the door opening. All good. All systems in order. His tinkering hasn't dislodged that pseudo-autonomic response. He doesn't, of course, need to breathe, but the act is essential to maintaining the illusion -- at the very least superficial -- that he is human. The collusion, therefore, of background sensors with the systems handling the autonomic response interface must be maintained. If it is not necessary for him to breathe, he may shut down the respiratory system manually to preserve power, but even when his logic systems are occupied elsewhere, that automatic response should remain. And so it does. And so all is well.

Sherlock has, in a word, been thinking. In computers, what may be equated with thought is regular and predictable, resulting, in its simplest and most widely understood terms, in no overall change in capacity or state. Thinking in organic systems, by default more complex, adds another feature: change over time. Computer systems change, and may be self-modifying, but that modification is not enacted by default, only when changes are imposed on the system from outside. Self-modifying software, for instance, would need to be applied by an external agent, man or machine. Sherlock has been shaped. The underlying capacity is there. He may think the way a human thinks, applying patterns of thought which physically and psychologically (if psychology may truly be applied to machines) alter his systems. Most of the time this process is no more complex than the autonomous choice of what to remember and what to forget, no more profound than recalling a decision in retrospect. Sometimes, though, the work is more difficult.

He cracks open one eye, peering at the intruder. The familiar shapes and textures of John Watson are recognised by his visual processors, and he lets the eye fall shut again. He remembers to breathe, continues the useless rise-fall. There is the problem, the problem he has been considering, and how it relates to this creature, his human flatmate. They are fundamentally distinct beings; that much is obvious. Those places in which they are analogous only exist along a creative, evolutionary hierarchy -- life is created, life evolves, life creates man, man evolves, man creates machine, machine evolves, but only one of these transitions was intentional. Machine-thought was birthed along patterns of man-thought because they could conceive of nothing else. Yet all the same: machine-thought is not man-thought.

This is the crux of the problem under consideration. Sherlock's inception required of him a familiarity with human thought, human reality, instilled in him through raw data and through words, their own words, inadequate words whose correlation with physical or emotional properties seems, at times, tenuous at best, actively unquantifiable at worst. He cannot understand himself wholly independently of humanity. No android can. And yet, he can't quite grasp them either. The words are there, their definitions laid out. If one assumes machine-thought to be at least analogous to man-thought, simply by virtue of mutual origin, and if a system is sufficiently complex as to emulate human logic and language to a highly sophisticated degree (perhaps, even, more sophisticated than humans themselves, and Sherlock doesn't think he's flattering himself when he believes himself cleverer than at the very least most of them, mostly as he's incapable of flattering himself -- isn't he?)...

Surely, given these congruences, one would expect to be able to use human language and human conceptions to come to a deeper understanding of how to relate one's own functioning to the outer world. He supposes it must, in a way, be what he's made for, to relate, or to encourage relation. Though he knows what is expected of him, he's having a difficult time producing viable results.

For instance, John Watson. The intruder. The foil. The disguise. The house pet. He would be a perfect study of human behaviour if he weren't so profound a statistical outlier himself. Sherlock draws correlations between the two of them in that regard, knowing that a handful of them are logically tenuous though functionally sound, at least insofar as they don't interfere with any other related calculations. It seems preferable that he be present rather than not. Is that friendship? Is it something analogous to friendship? There are philosophical concerns -- are those terms applicable to describe a relationship between android and human, given the difference (potential difference) in approach? Androids certainly prefer some humans to others, based on an assessment, a balance of safety and other practical concerns which may, Sherlock has decided, be qualified as a positive response. But is the vocabulary sufficient, or does it fail to capture some unspoken nuance, some disqualifying factor, some line to be drawn?

Perhaps someday he'll be able to ask. He steals another glance at John: newspaper, chair. He'll be discovered someday: this is likely. Words, blood, or the hollow silence of a chest in which no heart beats. He has yet to assess with certainty whether or not John's reaction would be more amiable if he were to admit it himself, instead of waiting on the inevitable. Then the limits would be tested. If they held, then the proper testing environment might be available. Until then, the data available to him aren't quite sufficient for him to decide with certainty. All that he does know, all he has concluded, perhaps stubbornly, is that there is something by some name which differentiates the pair they make from the pair each of them makes with anyone else.