Saturday, April 21, 2012

Adventures in Beta: Diablo III

On April 20th, 2012 (that's yesterday), Blizzard Entertainment released the beta version of their upcoming dungeon crawler, Diablo III, to the public.  This demo is only playable until the 23rd, so if you're reading this too far in the future, you'll probably have to pay to play this one.  But since I have friends who are on the ball about these sorts of things, I got to give the new Diablo a whirl for free, and now I'm going to tell you all about it.

For those unfamiliar, Diablo is at its core an adventure/role-playing game, with an interface somewhat similar to Blizzard's other big hit, World of Warcraft, but from a top-down perspective.  The setting is a sort of fantasy dark age period, and the story follows the player's struggle to defeat the forces of evil, headed by Diablo, the ruler of hell.  Gameplay consists of exploring randomly-generated maps and slaying the demons and monsters which emerge from the landscape, while collecting the weapons and armor that gradually transform you into a greater badass.

Those are the basics, anyway.  The truth is, though I've played plenty of other Blizzard games (particularly the Warcraft and Starcraft series), I'm just a noob to Diablo.  So this beta served as my own introduction to the series, much as I feel ashamed to admit it to my gaming friends.  The first step of course was to create my character.  Unlike in WoW, this doesn't entail a lot of cosmetic customization.  It boils down to three choices: name, character class, and gender.  For my first outing, I decided to go with the male witch doctor.
Exceptionally trained by a fully accredited witch medical school.
My first hour or so consisted of running around blindly, not really knowing what the hell I was up to.  I actually missed a good deal of the early story, because I was in a party with my friend Bau, who knew what he was doing and kept advancing story moments and quest objectives while I tried to figure out how to use the dagger I'd equipped.  Online multiplayer is a core part of the Diablo experience (I'm told), but it wasn't until I started playing solo that I started to get my bearings on the world.

The graphics are recognizably in Blizzard's fantasy style, but somewhat darker and less cartoon-like than the art in World of Warcraft.  The environment is appropriately misty and shadowy, with lots of gloom and doom to bring across an apocalyptic atmosphere.  Things don't get too dark, though, and that's a blessing because I like to be able to see what I'm doing.  The visuals are very well designed, and I seldom lost track of where an enemy or an item was.  The sound design is also very good; that fine balance of voice overs, sound effects, and music is one of Blizzard's more under-appreciated strengths.  The music by itself isn't much beyond your typical spooky ambiance, but that's just where it needs to be.

I don't know if the events seen in the beginning constitute a major piece of the story, so I'll try to be as vague as I can, but the main quest revolves around my witch doctor's attempt to relieve the townsfolk of the menace posed by their undead ex-sovereign, Leoric the Skeleton King.  After leveling up in graveyards and crypts for a while, assembling key items and gathering bits of lore, I acquired a number of useful spells and skills for accomplishing this task, including the ability to summon a pack of zombie dogs to tank my damage and do much of my fighting for me.
They might not actually be dogs, but they are definitely zombies.
The last stage of the beta took me through the subterranean floors of a spooky cathedral, where the Skeleton King lay entombed.  For a penultimate stage, however, the monsters in this area didn't pose much of a challenge.  It might just be because it's a beta, but once you've figured out what you're doing the "normal" difficulty mode is essentially a piece of cake.  With the rare exception of lumbering abominations, few enemies could stand up to my flaming bat attacks.
The only thing worse than a wall of fire is a wall of bats.  On fire.
It was around this point that Diablo III started really bugging out on me.  I'll give the game a pass because it's a beta, but being randomly disconnected from the server is a pretty big issue.  The dungeons are randomized and regenerate with each login, so I had to do a lot of legwork over again once they finally let me back in.  It wasn't the end of the world, but it was annoying.

In fact, there were a number of smaller glitches that I noticed throughout my time with the beta.  The basic functionality of the game is point and click; one click of a mouse button should cause your character to attack a monster, pick up an item, or interact with the environment.  In multiple situations, however, I had to click two or three times before the game actually performed the desired action, and occasionally the lag was so bad that I could hardly take two steps without being sent back to my starting point.

When the game isn't losing its mind, the action is extremely fluid and fast paced.  It's also very simple; click an enemy to cast some kind of spell, and continue to do so until he is your enemy no more.  Other spells and abilities can be used with a small number of hot keys, but with the relative difficulty thus far I found that I could usually get away fine with just mouse clicks.  Occasionally you encounter elite or boss characters, who throw a little variety into the mix with their bizarre personalities and longer HP bars.  Still, they don't pose too much of a challenge.
Lloigor here doesn't appreciate people reading over his shoulder.
Fighting Leoric himself, however, was a little bit of a shock.  I suffered my first in-game death at the hands of the Skeleton King, for being too slow on the healing potions.  I wouldn't necessarily describe this battle as very hard, because I took him out on my second try, but there is a distinct increase in difficulty.  In fact, this fight more closely resembled the game I had been expecting, with a greater amount of strategy and skill than in slaying hordes of mindless zombies.  Sending off this boss took a little more careful micromanagement, and a lot more running away from his whirling-mace-of-death attack.
That thing he's holding there?  He swings it at your head.
After taking the big bad down, I was informed by helpfully large white text that I had beaten the Diablo III beta.  Hooray!  A few short hours of play time, and I had accomplished just about all that Blizzard was willing to let me do for free.  To see more, I'll have to hand over cash money for the full game; otherwise, I'll simply be left wondering forever.  For now I can just replay this quest, leveling up my witch doctor with more fantastical witch doctor powers, or try out some of the other character classes.  At least I can until Monday, when the open beta closes.

Will I take the plunge and buy the full game?  At this point, I'm not sure.  There's no doubt that it's fun, and at a higher difficulty setting it will probably be a real challenge.  It has a lot of elements that I liked about World of Warcraft, without being as much of an unavoidable time sink (not to mention a money sink; fifteen dollars every month can add up).  On the other hand, I don't have the same personal attachment to the Diablo franchise that other people have, and I wasn't exactly waiting for this one with bated breath.  I might hold off for a while, since it seems like there are a few glitches with the servers that need to be worked out anyway.

It looks to me like Diablo III is on track to be a very entertaining and very popular game; I can't say if it's better than the first two, but I think newcomers will find something worthwhile in the experience.  The fans I know seem to like it too; sounds like a hit to me.

Tuesday, April 17, 2012

Curie the Class Bunny

Being a substitute teacher is a lot like being a regular teacher, minus the security or sense of belonging that comes with getting to write your name on the same whiteboard every day.  It's basically itinerant work, and difficult, too.  Keeping thirty kids in line with less than an hour to learn their names is not an ideal arrangement.  Being called on short notice to expound upon a subject you may not have given much thought to in a while can lead to unnecessary flirtations with mediocrity.  The ability to do any of it successfully rests with your skill in projecting a personality that commands a modicum of credibility without having time to earn it the old-fashioned way.

But like all things, it has its perks.  The kids are generally happy to see you because they believe that your presence indicates a lighter workload, and they tend to forgive you if it turns out they're wrong.  You can make lame jokes that confuse the hell out of everyone, without worrying that your perceived insanity will hurt your ability to educate properly in the future. And if the teacher decides to softball the lesson plan in his or her absence, you get to watch children watch movies!

Oh, and sometimes there are bunnies.
This is Curie, who lives in room B109 of Agnes Stewart Middle School in Springfield, Oregon.  She sits in the corner while the students learn science and math, chewing on whatever that stuff in there is.  Grass?  I'm pretty sure she got hold of a paper towel at some point, somehow.  She's pretty damned adorable.

Apart from an improvised review/lecture on scatter plots and some homework help, my job on Monday morning largely consisted of keeping the room quiet and focused while Bill Nye rattled through the glorious history of genetics research.  Since the kids were pretty well behaved, I mostly just paced from one end of the room to another, looking responsible.  Curie, meanwhile, would chew on stuff constantly.  I don't know if that improves the learning environment, but it definitely made the room more awesome.

I'll be honest: this entire post was cooked up for the sole purpose of posting a picture of a rabbit on the internet.  Is that so wrong?  I don't care.  You'll thank me when you need your bunny fix.

Saturday, April 14, 2012

Let Me Tell You the Truth

In writing this essay, I don't really mean to assert that all of these ideas and observations are original or especially profound.  They aren't original, and they are only profound to the extent that they haven't occurred to the reader in the way I express them.  But they are important, and I feel a need to put them forward for the benefit of others.

The main issue at hand is the nature of truth: whether it is absolute or relative; the limits of humanity's ability to perceive it; whether it is wholly external or at least partly internal to the human mind; whether it exists at all.  This is an extremely heady subject, and I don't come at it with the benefit of systematic research, or training in philosophy beyond familiarity with a few famous ideas.  These are my thoughts, whatever they are worth.


Metaphors, Language, and Doubt at Small Scales

We live in a world that is inherently difficult to understand.  Or rather, we seem to be inherently designed to be unable to understand it.  It's a natural human tendency to organize as much knowledge as possible into yes-or-no propositions; we assume that everything can be boiled down to either it is or it isn't so.  And possibly, some things can be.  But the wishful thinking that everything can be seems like the underlying premise of most casual thought.  Given that many things in the experience of our lives cannot be so neatly pared down, it stands to reason that either we or the universe are being obtuse.  Applying my own logic to that line of reasoning, I'm left wondering if both might be the case.

A lot of it seems to be our own fault.  Putting aside our perceptual limitations (because some things could be obviously true, and we simply lack the capacity or perspective to see them), we don't always make things clearer in our attempts to describe the experience of living in the universe.  In a very real sense, we live in a world of abstractions as much as we live in a world of matter.  We give names to ideas and feelings, and treat them in our minds like physical objects.  Of course, we aren't stupid: rationally, we can distinguish between the conditions of a burning bush and a burning romance, because one is literally on fire and the other is only metaphorically so.  But the urge to make metaphors at all seems strange on the surface.  The instinct to accept them as true is even stranger.  If anything, a metaphor is only a truth with an asterisk: "this statement is true if by these elements we are actually referring to other, unspoken elements."

The use of language is the biggest metaphor of all: a system of sounds that stands in for the thinly ordered chaos in our brains.  To put it simply, speaking is a metaphor for thinking; and if language is a metaphor, then even an objective statement of fact must be rendered with an asterisk after every word: the* third* stone* from* the* left* is* more* massive* than* the* other* stones*.  The idea is not complicated or easily misunderstood, but it is possible to interpret each of the words used to express it in more than one way.  If a word has more than one definition, its intended meaning can theoretically be misunderstood; alter some of the intended meanings of those words, and the statement could easily be true or untrue for any group of stones.  It's a good thing for our sanity that we're pretty skilled at deducing meaning from context, because otherwise we'd spend all of our time straining for an impossibly specific description of every mundane occurrence, and still be only half sure of what we were hearing.

I believe it's our facility with words that makes us so keen on metaphors, exaggerations, and other cases of not-quite-truth.  It could be the other way around, and I'm not going to spend much effort arguing that it isn't; the real point is that they are connected.  Our willingness to accept that a word or phrase means a certain thing in one context (for example, "the other stones" means only a given subset of stones, and not every stone that ever existed or will exist) has something to do with our acceptance of metaphors: that a person's blood can boil and that this means something other than superheated fluids scalding that person's veins.  

So what am I going on and on about?  Well, it bothers me a little too much that we have such a deep-seated attachment to the idea of absolute truth, when the only means available for expressing truth lies in sound-symbols that cannot be tied to absolute meanings.  Even if thoughts are absolute (and I don't assume they are), if words are not then there's nothing for thoughts to rest on outside of our heads.

Absolute truth must be technical and extremely precise: what does it mean, then, if a certain level of precision is beyond our reach?  It might be said that we're already precise enough that any further distinction would make no difference in our actual perception, and that is probably the case with some matters.  But absolute truth implies the absolute absence of doubt, and this way of looking at things concedes doubt on very small scales.  As a human being, this troubles me, and I can't help wanting to tease the meaning of it out, even if I strongly suspect that it's impossible.

Alternatives, Delusions and Subjectivity

When it comes to efforts at communication between two or more thinkers, we've got this system of grains of sand resting on grains of sand, with nothing firm to act as a final support.  There may be an independent, absolute standard of truth, but it is beyond the ability of language to express it.  For practical purposes then, I suppose that since it is not possible to express an absolute truth, the possibility that such a truth does not exist cannot be ruled out.  Its presence may be taken as an article of faith, but it doesn't have much else to lean on.

There are basically two alternatives to the idea that it can be stated with absolute certainty that some things are always either true or false.  The first is relative truth: some things are true or false depending on context and perspective.  The second is nihilism: truth itself is an erroneous concept and nothing can be said to be true or false in any sense. 

The full implications of nihilism are deeply confusing to me, and most of my thoughts on the subject have been in search of ways to avoid it.  It's easy to fall into an either-or between absolute truth and nihilism, because they are natural opposites and it isn't obvious where the limits of nihilism would lie.  Does it apply only to "truths" about values and morality?  Or does it extend into the seemingly independent physical world as well?  And what, exactly, is the distinction between them?

As I mentioned above, much of what we experience in life, and much of what can rightly be termed a part of our world, is not made up of physical objects.  In fact, many of these things seem more like delusions that we impose upon an objective world to better suit our psychological natures.  For example, laws are only words, vibrations of sound or symbols on a paper, yet breaking them can lead to the loss of freedom.  "Freedom" is something people are said to have or not have, yet it is described in myriad and contradictory ways.  Even the idea of possession, of "having" something, is more rooted in abstraction than anything objectively physical: a possession can be in your hand or in another state, and you can have something like freedom as easily as you can have a wrench.

In our minds, we conflate these things on a regular basis.  But perhaps "conflation" is the wrong term; it strongly implies that identifying the physical with the imaginary is somehow wrong, and I don't think that's necessarily obvious.  Regardless of whether there is an independent physical reality or not, it doesn't seem to make much of a difference to our minds.  If nihilism applies to value judgments and other intangibles, it may as well apply to tangibles too, as far as our perceptions are concerned.  After all, we can't live anywhere except within our minds.

It's that interweaving of traditionally objective and subjective realities in our perception that leads me to favor the concept of relative truth.  If nihilism is the case (and that may be a contradiction in terms, but I'd go crazy trying to express this any other way) and there is no truth, it can only be the case from a perspective outside the human mind; in other words, from a perspective that no human will ever have.  But if we accept the essential, mental nature of ourselves and our perceived reality at face value, then relative truth gives us a decent model.  It's messy, subject to reinterpretation, and as apparently real as any abstract noun we've come up with; in other words, from our perspective it makes perfect sense.

Relative truth is unsettling, because it embodies the asterisk: inherent to relative truth is the caveat that what is boldly stated is not true without exception, that in another circumstance it might come out differently.  It's the admission that we float over nothingness, without the sense of firm ground to hold us up.  Dwelling on the absence of terra firma is a frightening prospect, especially if we can't tell floating from falling.  But a relative model of truth jibes with our essentially relative perspective on the universe, and if an absolute exists beyond our grasp, the relative may be our best approximation of it.

Human Nature, Intelligence, and Emotion

I suppose what I'm saying in a condensed form is that "the world," the sum of all our possible experiences, is a subjective phenomenon.  The question of whether an objective world, where a thing called absolute truth might be, exists outside of our little bubble is still open; I don't see how it might be closed.  It's sort of a cliche to imagine we live in a simulation imposed upon us by aliens or robots or gods, but it could just as easily be one we impose on ourselves.  If that were so, then human nature would color the nature of the world we perceive.  I think it's very important, therefore, to figure out what human nature is.

There are a lot of ideas about what it means to be a human being.  From what I can gather from the ideas I've been circling through, it would seem that to be a human is to be a participant in the shared shaping of perceived reality.  But the most popular way of defining humanity, from before the time of the ancient Greeks to the present day, has been to define us in contrast with animals.  This makes a certain amount of sense, because of all the things we easily observe in nature, animals are the things most like people.  It's easy to see many things which all animals, including humans, have in common; identifying what we have that no other animal has (or perhaps, what we lack that no other animal lacks) would tell us what makes the human animal unique.

By a wide margin, of all the living things we've found, only humans display the amount of intelligence that we have.  There are other intelligent creatures (some apes, dolphins and birds have mental abilities comparable to a small child), but their potential does not reach as high as ours.  It might also be that we have souls and other animals do not, but this has never been conclusively shown (to say nothing of the fact that a soul is a tricky thing to define).  Our massive intelligence is a much more obvious distinction, and in any case the presence of a soul has historically been connected with the ability to reason, both intellectually and morally.  It may simply be a more spiritual side of the same coin.

However, here I have some difficulties.  Humans are uniquely intelligent among creatures we know about, but that does not make them uniquely intelligent among all creatures that might exist.  Given time and the right conditions, some species of animal could evolve into our intellectual equals.  As far as anyone can tell with science, our intelligence is a function of our brains, which evolved by natural selection the same as any other animal's brains.  It's a remarkable evolution, but not marked by a unique destiny; it just sort of turned out that way.

All of this has led me to think that what is essential about our nature is not what makes us different from the other animals.  We may be different from them, but that is beside the point, and it may be argued that we are more like them than not.  And in any event, we might not be able to fully articulate the differences between us anyway; humans only have the perspective of the human mind, and can't see the world from any other creature's point of view.  Isn't it possible that our true nature is in a more primitive aspect of ourselves, one that we share with the animals?

A professor of mine once told my class something that I quickly accepted as true from my own experience, that humans are not really the rational, intelligent creatures celebrated by classical theories on behavior; rather, we are "emotional creatures who happen to think."  I've debated friends of mine on this point, and found a great deal of resistance to the idea, particularly from people engaged in rigorously intellectual fields like physics and other "hard sciences."  Science works on the basis of unbiased reporting of empirical observation; strong emotional attachments or reactions are not good for this process.  Without wanting to put words in anyone's mouth or turn people into straw men, I think the notion that emotion is primary over intellect may offend some people, or scare them.  Many people are not comfortable with strong emotions; many of those people work in the hard sciences, because an empirical study of the material world provides a comforting sense of control.

I can only speak from my own experience, because I only know my own mind.  I consider myself an intelligent person with a high regard for clear and rational thought.  But I also know that I've been at the mercy of strong emotions my whole life; thinking is something I do, but feeling emotions is more or less something that just happens.  I can put on a neutral face and accept or reject my feelings, but I have no control over their existence; they are a deeply essential part of myself.

Then again, I do not have absolute control over my thoughts, either.  Thinking hard is a lot like driving a car over a field of ice; you can't predict or control every turn the tires make.  Thoughts can be as troubling and persistent as emotions, and sometimes it's hard to tell the difference between them.

It may not be an either/or proposition.  If humans think, and also feel emotions, then both are obviously integral to the human experience.  But which is primary?  I can't move past that muddled mess of relativism; sometimes one, and sometimes the other.  Sometimes, both at once.  I can't commit to one or the other, and all I sense is an unrelenting tension between them.  It's the same tension that exists between the world as it exists outside of our minds, and the world we actually live in and experience.  It's not possible to say that one is real and one is fake; only that one is right in front of us, and the other is beyond our sight.


When I started writing this essay, I wondered roughly what philosophical camps I might be placing myself into.  I can never be quite sure what philosophy I am actually in line with, because many contradictory ideas seem perfectly reasonable in the moment that I consider them.  Looking back on the threads I've woven here and trying to sum them up, I'm not sure exactly where I fit, but I think I can briefly restate them in a consistent way.

Ultimately, I am agnostic on the question of the existence of absolute truth that is separate from human experience, whether it is material, spiritual or both.  I posit the existence of a reality that is subjective in nature and is made up of a combination of information from our senses and information supplied by our collective imaginations, and that it is this world that we can actually be said to live in.  I believe that the truths of this world are inherently relative, and mirror the tension between our rational and emotional selves.

I cannot say whether there is absolute truth in the world where the information our senses gather originates.  I lack the knowledge to prove whether that world even exists, and if it is the case that it does not exist, then I cannot explain where the information we sense comes from.  But the existence of the subjective world should be obvious; it is the world as it seems to be, consisting of matter and energy and abstract nouns.

Our desire for knowledge of the absolute, and the ability to live our lives with complete certainty of that truth, is probably an expression of the desire to transcend the reality of human life.  Real life is tricky and dangerous and frightening, but the chance to see beyond it to a reality of order and peace, untouched by human imperfection, is almost too much to resist.  But I wonder if we might not be happier embracing that imperfection and accepting that truth as we understand it can only be relative.  If nothing else, it opens our world to possibility, and gives us a chance to see things with more open minds.

Sunday, April 1, 2012

The Busiest March

Yes, ladies and gentlemen, that was the most productive March in the history of The Wave Function Junction.  In fact, it was the second most productive of all months, by number of posts.  How I managed to make fourteen posts back in May 2009 is something of a mystery, given the rarity of updates in general around here.  I usually consider a month with five updates to be some kind of crazy, obsessive posting bender.  Eleven has got to be a symptom of something.

And truth be told, I actually meant to make one more.  The last post of the month was supposed to be another book review, of Barack Obama's Dreams from My Father.  Unfortunately, a number of factors prevented this post from being made.  Mostly, I didn't finish the book (about a hundred and forty more pages to go, actually), and I don't generally like to write about things I haven't finished.  Then there was the EMP, and I had to write about that, and The Big Lebowski, and between all the things I was doing with friends in and around Seattle, I figured the President could wait.  It's not like I haven't been boosting him around here.

So apart from that, I think the review train is going to come to a slow around here.  It's not that I'm done with them entirely, but my little experiment has run its course.  I got to practice my review writing, and hopefully enriched the blog experience.  Like all good experiments, the results taught me things I didn't know, things that will be useful in the future.

So what did I learn from the experience?  For one thing, I learned that when you post an article about a phenomenally popular book within a stone's throw of its release as a major motion picture, this happens to your page view ratings:
Yes, the effect was highly temporary, and page views don't really matter, but it's kind of interesting to see what a difference it makes just to mention The Hunger Games at an opportune time.

(Hunger Games Hunger Games Hunger Games Hunger Games Hunger Games!)

Sorry about that.  Page views feed my ego, and my ego is hungry.

I also learned what a valuable source of input and feedback my girlfriend can be.  At least three of the items I reviewed last month (including The Hunger Games) were things I experienced either with her or at her insistence.  I keep thanking her in public and in private, and it's probably getting a little annoying at this point.  But you know what?  It's still pretty damn awesome.  So thanks to her again!

As far as the writing goes, several substantive points emerged for me.  I tried approaching most of these posts from a different angle of attack, so hopefully a couple of different stylistic effects were apparent to the average reader who wandered by.  But I think the biggest divide came between subjects I already knew a lot about (such as Pet Sounds or The Legend of Zelda) and things I came to as a relative newbie (Einstein on the Beach).  I want to say that background knowledge makes for better reviews, since I was able to approach them more confidently, but I also tend to think the Einstein review was probably the best-written thing I did the whole month.  That might not be the consensus opinion, but it feels that way to me.

I also relearned that, while posting grainy pictures of famous things from museums with commentary may be an easy way to rack up page views (the one I did about the San Francisco Museum of Modern Art a year ago is currently my most viewed post of all time by a wide margin), it doesn't actually lead to good writing on my part.  So unless I actually get some vocal feedback one way or another on those, I probably won't do many more of them.  Unless I really want to.

So, what's next for this blog?  Well, that Dreams from My Father post needs to be written (which means the book needs to be read), so that'll probably go up soon.  I also had an idea for an essay about different ways of perceiving truth; sounds boring and abstract, I know, but I really want to write it and you can't stop me, dammit.  The question of whether humans are essentially intellectual or emotional creatures was recently raised by a friend of mine, and I feel as though an essay on this blog is probably the best place to sort out my thoughts on the issue.

In addition to those things, I have a pile of poems that should probably be read sooner or later, so why not sooner?  I've also got some new ideas for fiction that need writing, so expect a short story or two in the near future.  If you expect anything at all, anyway.

Anyway, that's roughly the state of the blog these days.  Thank you all, dear readers, for making this the best month in WFJ history.  Have a lovely day.

Hunger Games.