Freedom of Speech

       Freedom of speech is a difficult subject because it depends a lot on how many rights you believe the government has. You could also say it’s reflective of your faith in other people doing the right thing without being moderated, and for some people, that expectation is just really low. For me, freedom of speech means that the government will never tell me I cannot say something I believe in, unless, and only unless, it puts a number of people at risk. And I don’t mean like cyberbullying or harassment. I mean like releasing military plans or divulging national secrets or stealing identities. Things that either affect people’s lives, as in keeping them alive, or their economic well-being, which keeps them alive.

       Anything beyond that and things start to get really complicated. I’m going to sound like an awful person for saying that the government shouldn’t really intervene in cases of cyberbullying, at least not on the national level, but there is a broader reason for that. Some, though definitely not all, hateful or derogatory speech needs to exist out there, not because it’s right, but because it is still the equal expression of someone’s beliefs. Furthermore, without the presence of discontent, there would be nothing to test your own argument against. There is some merit to the idea that without “hate” there would be no growth in an individual, and that being a well-informed and educated adult means being able to handle those situations maturely and with well-thought-out responses. So while there are awful people out there who say awful things, they are within their rights to do so, and the government shouldn’t say they can’t.

        That kind of censorship should come from–oh, I sound even worse now–peer pressure. But I think there is such a thing as good peer pressure. If you have a friend group who expects you to be a decent person, hey, you might turn out to be a decent person. So the censorship that ought to occur shouldn’t originate from the government but from the people who make up its population and who expect a society that is above slurs, hate, and illegal activity. Because if we want change in a society, then we need to encourage its movement forward. So if we want people to be safe from hate speech, crime, and intolerance, then it’s our job to refuse to give those people either an educated-sounding voice, because hate isn’t really educated, or a voice at all. What we do or do not tolerate is ours alone to moderate.

Connecting with Community

So, I’m sad that I missed participating in Day of DH on the actual Day of DH because of class, but I’m even sadder about it after reading through these projects and seeing how connected they all felt. Even though all of the blogs I read were entirely separate projects or tasks, there was still this overwhelming sense of community and closeness in the idea of there being a whole day dedicated to DH.

And I really just love this idea of something as simple as a dedicated date bringing so many people together. It’s part of the reason I really want to get into livestreaming myself, in that I can foster that feeling on the Internet. In hosting events in games or on my channel, I’ve enjoyed being part of this goal of bringing together a bunch of people with maybe only limited shared interests, just to celebrate the idea of community. And that’s what Day of DH, granted some days later, felt like: this massive celebration of a diverse, interconnected community.

And, that’s kinda the point, isn’t it? Even if we all have absolutely nothing else in common, we have this. We have this idea of wanting to make things better in the digital space. And maybe today it’s Day of DH. Maybe in a couple years this kind of thing will be a quarterly event. And then there may just be spontaneous eruptions of DH celebration. And in the long run, I really want to be part of that.

I’ve watched and helped foster some of these kinds of things in gaming communities. It’s an amazing way to watch people make new friends, renew their interest in a game, or look at something in a totally new light from just some offhand comment or joke made at a totally nonsensical party. And in a field where we’re constantly striving to be connected, that feeling blends so well into what we’re already trying to do in digital humanities, where we’re working to remove the barrier people seem to have about the person on the other side of the screen not really being human. This, I guess in a way no other assignment has done, made DH feel like it was about being human and being connected to other humans.

That’s why I livestream. That’s why I have my channel. And all my blogs. And my mod responsibilities. All of it, to try and bring people together. And even if there was no one else posting on Day of DH today that shared that interest, there were so many people reaching out with the exact same intentions. It’s things like that that make you feel like you’re part of a whole and not just a rambling individual on their own blog.


Nowviskie’s argument is something I’ve had to wrestle with a lot while being engaged to a self-proclaimed Nihilist, or at least, I’ve been contemplating whether it matters in the end and, if it does, to what effect. To be fair, my self-proclaimed Nihilist doesn’t really have a very clear definition of his own opinions either. But that’s not the point.

Following this kerfuffle between Blizzard and the Nostalrius Private Server, I’m brought back to this idea of how we want to be remembered. For reference, a large number of World of Warcraft fans have been asking Blizzard for ages to create “legacy servers,” or servers that run older versions of the game, which Blizzard has somewhat inelegantly refused to do. Things escalated recently when Blizzard shut down a non-profit fan server, Nostalrius Private Server, that was running the original WoW. This gets into copyright, but the largest argument I have seen for why the community is so upset is that Blizzard is refusing to accept that the community’s fondness for the game lies not with the current content but with the older content.

This produces an interesting idea. There is the developers’ idea of what WoW’s legacy will be, and that is the most modern iteration of the game. The community, on the other hand, largely wants to remember the older versions of WoW, either out of nostalgia or perhaps because of quality, though things start to get subjective at that point. The point is that there is a hard, large clash of perspectives over what a single game’s legacy will be. Well, to be fair, WoW has shaped the online gaming atmosphere for years, so perhaps that legacy will have a huge effect on the genre for years to come. But, that being the case, which legacy is “right”? It’s complicated because that’s an opinion, but is it fair for the developers to cling too heavily to something they released on the world and are struggling to control? At the same time, does the community have the right to take an IP and not let it evolve, purely out of nostalgia? In the end, whatever gets remembered will be a combination of these two major factors, but which one will be the more dominant?

Fan servers are becoming increasingly popular as time goes on. While I don’t follow the WoW ones, I am keeping an eye on a huge recreation of a game called Alicia Online. But as much as I am looking forward to the new AO, I’m always wondering if it’s worth trying to recreate an old title. What if, instead of recreating AO, they just made AO 2? AO has made its mark, so why does it matter so much to recreate something that’s already had its run? Of course, the issue of copyright comes up here, but my point is more this: why is there this resurgent need to recreate the old instead of trying to be innovative?

Nostalgia seems to be having a curious effect on how legacies are shaped now, particularly with the tools available to recreate some of those older materials in new mediums. If that’s the case, how do we classify the history of an event? From its original finale? Do we include the recreations? In wanting to make a mark, where does our mark end if it can just be brought back up over and over? And where does our mark end in our own content, and where does it become the product of an entirely different community?

10,000 Feet Up

So, I’m sitting in a really old hotel, working on channel stuff, and just trying to get over the weird creaks this building makes when you move ever so slightly. Yeah, my family takes weird vacations. Despite all that, I’m still on the internet, uploading content, and enjoying my downtime from the ridiculous hikes my dad wants to take at 10,600-something feet above sea level. And I’m sitting here in a weird reflection period, practicing my presentation for Monday for the umpteenth time.

This town has history everywhere. My parents have already stalked Doc Holliday’s room at least twice (I’m so glad no one is staying there). We’ve hiked around old mines and shopped in buildings older than my grandparents. And it’s weird bringing my brand new computers into something like that. Not that I’m complaining about having them, but it’s such an odd contrast.

All of that has me thinking about what a lot of my parents’ generation worries about with the growth of DH, and that’s experiencing history firsthand. Obviously, that’s not the easiest thing, and I’ve been extremely lucky to have traveled to all the places I have. But I can understand that there is some resistance, and it’s a tricky question to answer. If everything does become accessible digitally, where do we create the incentive to actually visit it?

I think to some people, that question is like comparing apples to oranges. Seeing something in person isn’t the same as seeing it online. How can you replace the experience of seeing Monet’s Water Lilies in person with a replica? That is, for some people, a tear-jerking experience. But for others, that’s not as true. And as things like VR become more believable, visceral, and convincing, I can understand where the feeling that technology is going to “take over” comes from. Whether I think it’s completely right or wrong, I’m still working my way through, but I can grasp the idea.

So, where do we make that cut? Do we limit how real virtual images are allowed to be in order to encourage travel? Do we just upload everything and allow ourselves a sort of Matrix or Oasis type of experience? Sitting in my hotel room, with not nearly as much oxygen as I need, on spotty wifi, fingers crossed this post will actually go through, I don’t know if I know the answer. I don’t want people to stop experiencing the real world, but I get why that is really hard right now. And so I’m struggling to square my love for the experience with the understanding of a realistic budget. So, what has to give?

Journal of Digital Humanities

((I’m adding this in at the beginning because I want to note that I did write this up and had it set to post right after my presentation, but I think I tabbed out of the window just short of it completing the process. My apologies.))

Journal of Digital Humanities was a fascinating read, and while there happened to be an issue I took particular interest in for my presentation, it was still incredibly interesting to go through and look at other entries to the journal beyond what I focused on. In that regard, the journal is fairly extensive and diverse for the number of entries it currently has, bridging a wide gap from discussing the differences between digital presentation mediums to how visuals affect the presentation of studies. Unfortunately, as the journal has continued, those articles have become more and more sparse, making some topics feel more thoroughly discussed than others. Which is a real shame, since all the articles are very interesting to read.

What I found incredibly unique about the journal, though, was the submission process. Before even submitting an article, writers are required to submit a sort of “digital resume” through their online activity and posts. In a way, this validates an individual by showing they have been producing a lot of good content before even coming here, and it also means that if you enjoyed an article, you can find that writer making similar content on another platform. It’s a great way to make sure someone is going to bring a good name to the journal, and in return, that person gets free advertising for their other media outlets. These articles are then reviewed through a volunteer editor program called Editor-at-Large, a body of DHers with some standing in the community. The requirements to apply for Editor-at-Large are a little broad, but it’s encouraged that the applicant be associated with an institution. In any case, the editorial board being somewhat prestigious only reinforces that the articles that get into Journal of Digital Humanities are of a high quality.

I think the major issue with the website, though, is the feel of it. I tried for a while to identify who I thought would be attracted to this website, and I found it difficult to decide on any particular age group. Mostly, I think this is due to the layout of the journal. There are so many facets to it that don’t really mix well with each other that it feels a little disjointed. The home page launches the reader into the most recent issue, instead of offering a summary of the site or any direction that allows the reader to pick what they want to read. There isn’t necessarily anything wrong with this, except this kind of assumed interest is carried across the whole layout. It is very difficult to actually find the topics of the issues individually. I think this is unappealing to a lot of readers, which is a shame. If you actually click on and read an article, the content itself will definitely be quality. It’s just a matter of assuming that the reader is interested in the first place, with nothing given away besides a single image, which could point in all kinds of directions. That makes it feel very click-bait-y, which is an unhealthy approach for a journal of this quality, or at least, that’s how I feel about it.

In short, Journal of Digital Humanities is a diverse source of content about DH, but it is severely suffering from a poor layout that doesn’t really encourage readers to engage with the content. If the website’s layout were less cluttered and more linear, and maybe if the high gate of article entry were slightly lowered, I feel like readers would be far more engaged and more writers would be encouraged to contribute. I hope the team behind it considers adding that layer of transparency, because the content that is already there is thoroughly enjoyable and entertaining. Catching up on the whole contents thus far is definitely something I am writing down as a summer read.

Where Do We Go?

I start with this video because, one, I love this artist, and two, because of the variety this rap covers. Despite being presented as a “summary of gaming” over the past 40 years (as of 2014), this video hardly scratches the surface of gaming as a whole. So where do we actually start with summarizing gaming?

Dan Bull’s song hits on some of the most iconic and memorable entries, that is certain. The song is definitely more marketable that way, since more people can relate. But how would someone else go back and encompass gaming? Like literature, gaming jumps across genres freely. Beyond story genres, games also have mechanical genres, based on how the game is played. And then you add stylistic labels, and age ratings, and so on and so on until you have the mess that is categorizing video games, much like their novel counterparts.

So, with all that said, where would we start? What genre do we classify games in? Is it based on their stories? Or do we base it on how they’re played? What kinds of studies do we use stylistic choices for? All of this is important when thinking about how to put gaming into a timeline. In particular, it’s important for deciding what was the “first of its kind” in a genre.

Since games, like novels, are so diverse, it’s difficult to say. Is a game the “first of its kind” because it was the first fantasy game or because it was the first text-based adventure game? That’s where I feel the timelines would get incredibly complicated, because it would come down to personal opinions. And presenting games based on those biases completely changes how we would view games. Generally speaking, “first of its kind” games have a kind of prestige. They’re games every gamer needs to play because they were the basis for the games that came after, and understanding the evolution of gaming means playing them. But where on that spectrum do we decide what is the first of its kind? Because the game we choose to be the “first of its kind” will get that prestige, and the other game may disappear into obscurity because it wasn’t, according to our classifications.

I think that’s a huge problem with timelines. Visuals are so useful, but the visuals we choose to build our timelines out of shape so much. The expression is “History is told from the side of the victor,” and that’s true for a lot of reasons. For my argument, what we choose to be important, and to reprint into our visualizations of history over and over again, becomes important in the eyes of the majority. And unless we find a way to have an infinite spectrum understandable by the human mind, that’s always going to be an issue with visual representations.

I mean, there is a reason those games were in Dan Bull’s rap. They are important to our gaming history as it is. For a lot of good reasons, but that doesn’t mean there haven’t been equally good games that failed because of these games’ success. We chose the success of these games, and therefore we’ve populated a certain timeline of games that are important. So how do we represent everything else equally?

Games Are About Choice

So, I don’t want to sound like I’m countering Cae’s argument, because I really like what they said, but I felt kinda differently about their argument. I was going to write this anyway, but now I kinda have a comparison. So, I guess my point is, this argument stands on its own, but I think it’s a good contrast to what Cae already said if you read both (because what they said is really interesting, so you should read it).

As a narrative writer for video games, I would readily argue that games are strong when they give the player choice. Yes, there are powerful games where choice is completely removed, like The Beginner’s Guide or Presentable Liberty. Those games are experiences, and I think the lack of choice in those cases makes them extremely powerful. And if you want a real punch in the gut, I highly suggest either of those games. The Beginner’s Guide personally knocked the breath out of me for days, and my fiancé can say the same for Presentable Liberty. They are both marvels because of how they constrict choice. And in all honesty, there isn’t any.

Yes, there is something to be said about development in a game when there isn’t 100% free choice. Unless it’s an open-world sandbox like Minecraft, with limited to no plot, there isn’t really 100% choice. That’s a technical setback that we just haven’t overcome yet in terms of development. But we are getting there. My point, though, is that saying games aren’t about choice because they aren’t 100% free is unfair when, on a technical level, we haven’t gotten there yet. We already have games like Life Is Strange, Oxenfree, and Until Dawn that have pushed the boundaries of what we thought games centered around choice could do. Those games have so many choices and outcomes in them that we’re still finding combinations not yet explored.

But in the end, I have no better example of games using choice than The Stanley Parable. If you’ve never heard of this game, I would highly suggest you play it. And again. And again. And then a few more times just for fun. This game is hilarious and you will not be disappointed. There is so much dialogue programmed into The Stanley Parable, though, that the player can choose to completely ignore the plot at any given time in the game. Granted, they are limited by the game world, but there is no instance where the player has to follow along with what the game is telling them to do.

I said that video games as a medium are about manipulating choice. While on a mechanical level all choices and outcomes are constructed, the player deciding which choices they want to make makes the game about choice. Some plots take this deeper than others. If you picked up a Legend of Zelda title, no, the game would not let you decide whether or not you wanted to kill Ganondorf. That is part of the plot. But the game does let you choose how you want to fight him in the gameplay. You could do the easy thing and use your sword. Or you could decide that for a certain battle you were only allowed to use the bow. This is obviously limited because this series isn’t known for choice, but there are still some remnants in there. If you want a fantasy game about choice, you’re probably playing an Elder Scrolls game.

On the other hand, if you picked up a game like Harvest Moon, you would be presented with an entirely different set of options. For this comparison, I’m going to use Harvest Moon: A Wonderful Life in particular. Yes, there are a limited number of overall outcomes, but there are a number of ways to get there in this game. For those of you not familiar with the franchise, Harvest Moon is a farming simulator, for lack of a better comparison. The game is typically about having a successful farm, but there are some underlying plots in these games. In my particular example, your choices determine what your son will grow up to be. And there are all kinds of things that affect this. Primarily, it’s based on who you marry in the game and what you give your son once he’s born. But there are some more subtle effects as well. Some of it comes from where you get your money. And there are a lot of ways to get money in this game. You can farm, ranch, fish, cook, mine, forage, or be super lazy and just plug your Gamecube controller into the third slot and hit Start and Z a bunch of times to cheat yourself a bunch of items and gold. Whatever you end up doing, what you sell affects what your son wants to do as well. So you can run a “farm” entirely on mushrooms you find in the forest, if you want to, and the game will just keep going plot-wise.

The Scarecrow game falls into the first category. Obviously there is no choice to join the crows, and you have to play the way the game says you can play, but how quickly you do it and in what order is a matter of choice. A minimal spectrum, but it is one. I would say, though, that this game would have been better serving purely as an experience game. It would have made its point and felt far less forceful in its delivery. But at the same time, it is the player’s decision to pick a game up and play it in the first place or to just leave it alone and never delete it. See Pony Island for a better example of that, but another game with massive spoilers.

Games definitely have choices. Some of them are as simple as choosing to play the game. But if the player does interact with a game world, then how they choose to interact with that world is a choice, because they are sculpting their experience in the game based on how they play it.