Why is the webhead so lucrative? An answer to The Daily Wire

From: Judeus Samson via unsplash.com

The content below may contain spoilers for the new movie, Spider-Man:  No Way Home.  Do with this information what you will. 

I am a fan of the web-head.  I have been a fan of the classic Peter Parker since I was in middle school.  And as such, I have consumed a lot of Spider-Man content over the years, from television shows, to comics, to live action movies (yes, all eight of them.)  I very much enjoyed the latest installment:  Spider-Man: No Way Home. 

Apparently, I’m not alone.  According to The Daily Wire, Spider-Man pulled in $253 million domestically on opening weekend alone.  Thanks to Covid, an opening like that hasn’t happened in a long time. 

In my perusing of the internet after watching the movie (mostly to validate my own feelings towards it), I came across an article by The Daily Wire (yes, the same article that I referred to not two sentences ago) that made me pause.  It suggested that the reason the webhead made so much money was because it wasn’t woke—no LGBTQ+ stuff, no racial politics, etc.  While I don’t tend to watch “woke” movies, I did find this assertion odd.  I think it displays a remarkable misunderstanding of the webhead, the MCU, and the fanbase, so, in true “me” fashion, I decided, “Hey, why not pick this article apart into oblivion?” 

So, let’s take a walk through this article, refuting it point by point.  I will be taking quotes from the article:  “‘Spider-Man: No Way Home’ Suggests Movie Fans Are Fed Up With Woke Propaganda.” 

I’ll start with the premise of the article, which cites Breitbart as its source:  “Newsflash: If you make a decent movie that seeks to entertain and move — instead of lecture and shame — we will show up,” John Nolte wrote. “‘Spider-Man: No Way Home’ does precisely that. You will not only have a great time at the movies again, but there’s no gay, transsexual, or racial nonsense. This movie is about only one thing: Delivering the goods.”

Premise:  This movie is “good” because there is no “gay, transsexual, or racial nonsense.”  Okay, fine.  Let’s see how they back this claim up. 

Their first point:  “Black Widow bombed because of overtly feminist themes that superseded the central narrative. Eternals had a similar issue because audiences saw the extraordinarily politically correct agenda and rebelled against it.”

I’d like to disagree with the claims made about these movies, but first I will briefly point out that this article seems to contain a logical flaw.  If these movies “bombed,” it does not necessarily follow that doing the opposite of what these films did, i.e. not “being woke,” will give you a successful movie.  But given that this is the point the article makes, let’s look at why these two movies in particular didn’t do well.  Given that all of these movies (Black Widow, Eternals, and Spidey) were released at differing stages of the pandemic, it’s worth taking a quick look at how Covid was doing for each of these releases. 

Black Widow was released on July 9, 2021.  This was around the time life was “getting back to normal.”  Which theoretically means that if people really were excited about this movie, they could have gone to the theater maskless and watched Black Widow as if it were an entirely normal experience, and the movie should have been a smashing success.  Assuming “bombed” means “did poorly at the box office,” then the article would be correct:  Black Widow made $80.4 million on opening weekend—less than half the revenue Spidey cashed in at the end of his debut, which seems to indicate that fans weren’t willing to spend their money to watch this movie in theaters.

But this movie wasn’t just released in theaters.  This movie was released on Premier Access with Disney+, which meant that if you wanted, you could pay $30 to watch the movie as many times as you liked.  Disney didn’t delay the release of Widow on Disney+, either—the streaming and theatrical releases were simultaneous.  Typically, if fans want to watch a movie multiple times before the DVD or streaming release, they have to go to the theater, paying the ticket price over and over again rather than a one-time fee, and this spending adds to the overall gross of the film (though in all likelihood it would not affect opening-day numbers).  With the streaming service, as everyone knows, you pay a one-time flat fee and the movie is yours forever (well, yours on the “cloud,” whatever that is).  My guess?  People mooched off the one person they knew who had a Disney+ account and was willing to splurge for the $30 Premier Access streaming fee.  The streaming release affected the film’s box office performance to the point that Scarlett Johansson, the film’s star, sued Disney over the fact that she wasn’t able to get the bonuses tied to said box office.  (And that situation is a whole different can of worms.)   

Now, to the actual point they’re making about feminism:  I honestly don’t know what they’re talking about.  I watched this movie—while it is a “girl power” movie, meaning there is a fair amount of women handing out unrealistic beatings to men, I didn’t think anything about it was “overtly feminist.”  Honestly?  I thought it was kinda cute.  So we can safely say that Widow didn’t bomb because it was feminist; it most likely bombed because of the circumstances of its release. 

Eternals fared even worse at the box office:  $71.3 million on opening weekend domestically.  Why the poor turnout?  Well, there could be several reasons for this one—one could be that the film was released on November 5, 2021, right at the start of cold and flu season, so the more cautious among us may have felt the desire to stay indoors.  The reason The Daily Wire is implying it did poorly?  Eternals features a gay couple kissing. 

While it’s entirely possible that some people boycotted this movie for that reason alone, the more likely reason is that audiences had no desire to watch yet another Marvel movie about characters they know nothing about.  If we’re honest, absolutely no one knows who the Eternals are.  They are a very niche group of heroes, and most casual comic-book fans (who currently make up the majority of the MCU’s audience) would know them only tangentially at best.  Marvel has turned obscure superhero groups into box-office hits before (Guardians of the Galaxy and Big Hero 6), but those were marketed as fun summer blockbusters with goofy jokes.  No one knew who the Eternals were, no one cared to know, and they don’t affect the main MCU storyline we’ve already invested one-and-a-half decades in.      

Also, this might just be me living under a rock, but I saw zero ads for this movie.  I didn’t know it had come out until after the conservatives were freaking out about gay characters onscreen.  Marketing left something to be desired for this one, if you ask me. 

Last, the article makes this point:  “No Way Home has Peter Parker (Tom Holland) visit Doctor Strange (Benedict Cumberbatch) with the request that everyone forgets he’s a superhero. But our leading man quickly learns that actions have consequences. Peter Parker makes mistakes, but the movie also includes a place for second chances and redemption. Plus, there are nods to American pride in the film, which is virtually nonexistent in the rest of Hollywood.”

The point I assume the article attempts to make is that because there are brief moments of American pride, and Peter Parker doesn’t treat himself as a victim, this movie did well at the box office.  I’m not entirely sure what “nods to American pride” the article is referencing.  There is a fight on the Statue of Liberty, which has been recently renovated to hold Cap’s shield (in honor of his sacrifice?  I didn’t know and didn’t care).  I’m assuming that’s what this means.  Oddly enough, while this article praises the “American pride,” I actually found it startlingly lacking—for a Spider-Man movie. 

Spider-Man movies typically have heavy American imagery.  In most movies (including ones with Holland, the MCU’s pick for Spidey), he’s seen posing with a flag in the background.  This might appear to be an odd choice, considering Captain America would be considered a more appropriate character for American pride, but it was a very deliberate decision made in the first live-action Spider-Man starring Tobey Maguire. 

You see, Spider-Man was released in May of 2002.  Eight months prior to the release of the Raimi movie, America experienced 9/11.  Two planes were deliberately crashed into the World Trade Center, murdering thousands and shattering the illusion that America was untouchable.  In the months following that attack, national American pride was at an all-time high, and countries all over the world were showing solidarity.  Spider-Man also found a way to support the U.S.A.—there were flags everywhere, yes, but some of the most iconic shots feature the webhead clinging to a flagpole flying the American flag.  He’s from NYC, after all.  Ever since then, American flags have dotted Spidey’s backgrounds.  The movie has “nods to American pride” because at this point, they’re basically required; it’s not because the webslinger is responding to woke Hollywood.

So, to recap, The Daily Wire basically says that No Way Home did well because it wasn’t woke like Widow or Eternals, and has a few American moments in it. 

Nothing could be further from the truth.  Pitting Spider-Man against any other hero is unfair—box-office-wise, at least.  Spider-Man is the most popular superhero globally, which is an impressive feat considering his competition is Superman, Batman, and Wonder Woman, three iconic DC Comics characters created decades before the web-slinger.  The U.S.A. crowns Spidey as its favorite hero—above Captain America, Iron Man, and Batman.  Spider-merch sells—you really think a child is going to want an Eternals Happy Meal toy when he could have a pair of cheap webshooters?  (Fun fact—we did get a cheap pair of webshooters in a Happy Meal once, and I also got an Eternals toy much more recently.  The webshooters were cooler.)  A fairer comparison would be pitting Spider-Man against the Nolan Batman trilogy—perhaps they aren’t on the same cinematic playing field, but The Dark Knight did end up making $1 billion at the box office worldwide (not on opening weekend, but still an impressive number.) 

What makes Spider-Man such a beloved character is Peter Parker, his alter ego.  He has been such a massive success as a character because his personal life is a complete mess, but he still tries to do the right thing and use his gifts to the best of his ability.  The Spider-Man costume is one of the few in the MCU that hasn’t completely succumbed to the “battle-chic” aesthetic, which makes him bright and interesting to watch.  Even if there were an entire cast with a diversity quota and a gay romance subplot, I would be willing to bet Spidey would still get quite the showing on opening night.  Obviously, there would be people who boycott it, but most people just love Spider-Man and would be willing to forgive quite a bit just to watch him swing across the city.

Oh, and one more thing:  this movie had rumors flying around it for months.  With the release of shows like WandaVision, rumors started that this movie, perhaps, was going to attempt to introduce a “multiverse”—all three Spider-Men in one movie working together.  The trailers only amped up the excitement for fans, who hadn’t seen Molina on the screen as Doc Ock in over a decade.  No one wanted this movie spoiled, so they tried their damnedest to see it the night it opened. 

And to be fair, it was worth going to see on opening night.  Watching Andrew Garfield walk through that portal—I haven’t had that much fun in a theater in a very, very long time.  The cheers that came out of everyone when Tobey waved made me smile. 

Spider-Man didn’t get views because he’s not woke.  Spidey got views because he’s Spidey.  Stan Lee created an icon, and he’s not going to be going away any time soon.  Does it help that there wasn’t a bunch of propaganda in this film?  Of course, but I guarantee that’s not why he’s rolling in dough. 

The best part about Spider-Man is that anyone could wear the mask.  It’s just that Peter Parker wears it very, very well. 

The Poetics

Image Credit: Kyle Head via Unsplash.com

Good evening, ladies and gentlemen.  This month, we mark the one-year anniversary of the Hopeless Academic!  To mark this very auspicious occasion, I will be doing a summary of one of my favorite analytical works when it comes to literature, everyone’s absolute favorite:  The Poetics by Aristotle.  I referenced this work in almost every paper I wrote in college, and for good reason.  In this work, Aristotle lays out several rules that are still followed (with several exceptions and improvements) by storywriters and even filmmakers today.  If you studied literature at all in high school or college, you were probably introduced to the “story arc,” a diagram which reduces literature to a mathematical equation.  (Exposition + Rising Action = Climax… or at least the graphical representation of that.)  Aristotle is the great-great-great-grandfather of that diagram, or at the very least, parts of it. 

Obviously, I’m going to be leaving some things out, because… well… it literally is Greek to me.  (Seriously, the words are in actual Greek.  At least I’m pretty sure they’re Greek.)  Besides, I want to make this more fun, and if you wanted a chapter-by-chapter summary, you could go read SparkNotes. 

Now, in the Poetics, Aristotle makes a distinction between Tragedy and Comedy, but he focuses mostly on how a Tragic play should be constructed.  Most people extrapolate his advice on tragic plays to how stories in general should be constructed, including comedies, which is what we’re going to do today.  Aristotle’s advice is solid regardless, so why not use it? 

Most of what Aristotle has to say centers around plot, and how the elements of a tragic play can serve that plot.  To him, the essence of a story should center around an action rather than a particular character, though modern filmmakers and authors might disagree.  If we get down to brass tacks, though, I think that most people would choose to watch a movie where a bunch of boring people do something rather than a film where a bunch of interesting people do nothing.  I think Aristotle knows this, too, but he sort of insulates himself from criticism by saying that a plot needs characters in order to accomplish anything.  Even if he says that plot is the most important thing, he knows that plot almost by definition relies on characters. 

In order to have a good plot, Aristotle also lays out what he thinks are the most essential structural elements to said plot.  Two of the structural elements that he thinks every good plot should have are a peripeteia and an anagnorisis.  Which are… what? 

For those who know their Greek, a peripeteia is a reversal of the situation — in other words, that’s when things go from good to bad, or from bad to good.  For example, in Cinderella’s story, her peripeteia is when she puts on the glass slipper — at least, that’s officially when her situation reverses.  Before, her life was full of misery and abuse, but after she put on that slipper, her fate was sealed as the bride of Prince Charming, or whatever his name is.  “But what about before, when Cinderella’s life was wonderful and then her father died?”  you ask, and rightfully so.  Didn’t her situation reverse then, too?  Because this happens so early on in the story, most people would qualify this as exposition, or simply setting up the story.  In fact, the animated Disney version has this part in voiceover, giving you only the necessary details so you know the facts of the story — the story itself doesn’t really begin until after her father dies. 

So much for a peripeteia.  What about anagnorisis?  An anagnorisis, or recognition, is when somebody goes from ignorance to knowledge — honestly, it’s just when someone recognizes something.  (Quite the fancy term for such a basic thing.)  Aristotle apparently likes a bit of suspense in his stories.  There are a bunch of movies and stories with recognition scenes, like the animated movie Anastasia.  Anastasia has amnesia and doesn’t remember if she is the Grand Duchess of Russia, but she recognizes the perfume on her grandmother’s hands, and the memories come flooding back to her.  Her grandmother recognizes the necklace she had given Anastasia before they were separated, and the two reunite. 

The other elements of the plot are a lot more basic — the complication is basically the rising action, or setting up the conflict in the story.  The unraveling is the falling action in the story; this comes after the climax, and wraps up the story in a neat little bow. 

“Great,” you think, “I have all of the basic structure in order to write a story that Aristotle would love.”  Not so fast.  Those are just some basic plot elements — he has a lot more rules. 

Aristotle also has some thoughts on the length of the story.  To him, the perfect plot follows an action that is roughly twenty-four hours in length.  While it may seem overly restrictive, Aristotle’s idea is that the human brain can only handle so much — remember, people are remarkably lazy.  Once you hit two days or longer, the plot gets too complicated for people to handle.  Or at least, Aristotle thinks so.  This is one of the rules I’d personally chalk up to preference, rather than a mandate to do a Jack Bauer-esque plot where everything you watch is shown in “real time.”  Sadly, Aristotle wouldn’t even appreciate that type of television show, because it’s episodic.  According to him, episodic plots are “the worst” (translated from the Greek.)  His argument would be that it would be too difficult to keep up with all of that action.  And honestly?  This is actually kinda fair.  Before the Netflix binge culture we somehow created, episodes were shown a week apart — who remembers all of the details of a show from three weeks ago?  This is why, most of the time, the only things that remained consistent were the characters, and each episode was a story unto itself.  With shows that aired on network TV, like The Office or Friends, you could skip most installments and just watch that one singular episode.  Even with shows that do have a continuous plot, there’s a quickie “Previously On…” recap.  Maybe the ol’ man was onto something. 

Ok.  So we’ve got a complication, a peripeteia, an anagnorisis, some falling action, and a singular story that takes place within twenty-four hours.  Great!  But you’re not done yet.  This next rule is probably one of the most important. 

You gotta keep it consistent.  Aristotle calls this rule the “rule of probability and necessity.”  It basically means that all of the characters have to be consistent unto themselves, and the plot can’t just randomly jump to a conclusion because… plot.  Let’s take a character like Captain America, for example.  He has been built up as a caring, noble character who would go to the ends of the earth for his friends.  Above all, he’s seen as loyal, especially to those he views as family, and he’s been pretty consistent in every MCU movie he’s been in over the course of a decade.  It would make no sense if, oh, I don’t know, he decided to make an entirely selfish choice and leave all of his friends behind just because he had a crush on some girl.  Especially if he abandoned said friends after nearly killing himself to save said friends from brainwashing.  That would make no sense at all. 

Aristotle also wants to make sure that your story makes sense.  You can’t have Superman showing up at every twist and turn just to save the day.  Superman has a schedule to keep, and honestly, having him show up at every twist and turn is unimaginative and unexciting. Imagine watching a Batman movie and Superman just shows up at the very last second to save Batman from Joker.  Sure, you’re happy Batman was saved, but wouldn’t it have been a bit more fun to watch Batman use his own ingenuity to save himself?  Pixar actually has a rule for this, and they are famous for their quality storytelling. Their rule: “Coincidences that get characters into trouble are great; coincidences that get them out of it are cheating.”  Aristotle, however, explains that sometimes, you have to cheat.  It’s inevitable.  Tony Stark needs his suit, but the nearest warehouse is thousands of miles away, and in order to get there in time, his Iron Man suit would need to break the sound barrier, which would be nearly impossible.  That’s ok!  Just don’t make us watch the suit traveling across the country.  Just have it show up — CinemaSins can nitpick it later.  

Obviously, Aristotle talks a lot more about what goes into a good story.  There’s how you present everything, and the set design, and even the meter of the songs.  (Which I suppose is the Ancient Greek precursor to sound design.)  He has a lot of rules, not to mention his own philosophy on what tragedy is for and how it fits into the human experience. 

Maybe some of his rules are outdated.  Clearly, the episode thing is going to stick around for a long time to come.  And stories told within only twenty-four-hour periods don’t really exist that much anymore either.  We’ve grown in sophistication since the days of Ancient Greek plays, too—almost everything that we depict onscreen in the twenty-first century looks incredibly realistic.  The movie industry is worth billions, and actors are praised like the gods Aristotle worshipped.  But he certainly got something right, if studios like Pixar share his ideas.  Every aspiring playwright, author, and screenwriter should memorize The Poetics.  Maybe then Hollywood might start turning out some original films. 

The Black Pen

From: Kelly Sikkema via unsplash.com

Okay, we’re going to be brief today. I do not consider myself a poet, but I wrote you all a poem. This is a blatant ripoff of William Carlos Williams’ poem “The Red Wheelbarrow.”

I am not sorry. Williams’ poem is just horrible. I will not elaborate, because a quick Google search and a three second glance at the poem will prove me right. Just as Williams did his best to shape his poem like a wheelbarrow, so too will my poem be pretentiously shaped. Here we go.

The Black Pen

So
Much
Depends
Upon
A
Little
Black
Pen
Filled
With
Black
Ink
Beside
The
White
Paper

Ta-da! I can make shaped poetry, too, Williams! (And mine makes more sense… just sayin’… I feel like infinitely more people depend on pens than wheelbarrows, but maybe that’s just me.)

I do have real posts coming, but inspiration struck like a lightning bolt and I just had to share.

Go forth, and mock more shaped poems.

That is all.

A Love Letter to Children’s Literature

From: Ben White at Unsplash.com

I’ll admit it, I’m a sucker for children’s lit.  Obviously, when I was a child, that was all I read.  I felt rather guilty about this around middle and high school, because I was “no longer a child.”  I was obsessed with the idea of “reading at my level,” because as a third grader, I was told that my reading level was very high.  Naturally, this fueled my ego, and I became one of those “burnt out at twenty” types who didn’t have the attention span to read a single two-hundred-page novel in one sitting.  I have since learned to fall back in love with reading, but even when I “wasn’t reading,” I could still make it through a children’s book. 

Now, when I say children’s lit, I don’t mean the picture books we read at bedtime as kids.  (Though I will always have a special place in my heart for “Froggy Goes To Bed.”)  I mean things like fairytales and the books we all devoured in middle school—the Percy Jackson books, the Harry Potter stories, The Lord of the Rings, Anne of Green Gables, and those other random books we were all assigned during those grades and somehow still remember.  Most people look back on these books with the fondness that only nostalgia can bring.  There are people who outgrow children’s literature—those are the people who will give their childhood books to their children but never open the stories themselves.  I am not one of those people.  And I’m gonna tell you why. 

One of my favorite things about children’s literature—and stories in general—is that they are simple.  Our own lives are messy, and they are difficult to understand as we live them.  Stories organize events to form a cohesive whole; as Aristotle would say, a plot follows one action from beginning to end.  Each story is like a microcosm of the world, but without the extraneous details that tend to complicate it.  Adult literature often explores the world of adults, dealing with more nuance and, shall we say, PG-13 themes.  This is not to say that children’s literature can’t have a complex plot, or that children can’t understand those “PG-13” themes, or even that those themes shouldn’t be there.  Children’s literature just distills those themes into manageable bite-size pieces.  For example, children’s literature is what first teaches children about the existence of dragons, which is a PG-13 theme if I’ve ever seen one. 

There is a quote attributed to G.K. Chesterton that goes like this:  “Fairy tales do not tell children the dragons exist.  Children already know dragons exist.  Fairy tales tell children that dragons can be killed.”  Not every story has an actual dragon, of course.  Harry Potter’s “dragon” is Voldemort, Frodo’s is Sauron, and so on and so forth.  I must say, I have to disagree with Mr. Chesterton here.  Most loving parents don’t want to show their children dragons, and so shelter them as long as possible.  As a consequence, a lot of these children don’t meet a dragon until a bit later in life, or at least not until after they’ve been introduced to their first fairytale.  Chesterton got it only half wrong:  fairytales, in fact, do teach children that dragons exist, and that those dragons have dangerous claws and teeth, and if you get too close or don’t pay attention, the dragon can and will strike.  Fairytales introduce evil in that manageable bite-size nugget.

But the second half of Chesterton’s quote is spot-on:  fairytales teach that dragons can be beaten, and they show a range of different ways to beat them, because not every dragon is defeated the same way.  Some dragons, like Sauron, are defeated by two little half-dead hobbits; others, like Voldemort, are defeated in a very public battle with a simple spell.  Most of the time, the fact that the dragon is defeated isn’t nearly as important as how the dragon is defeated.  Children’s books show that defeating the dragon rarely comes from pure luck (that’s just lazy writing), but requires a certain amount of skill (which means that you’re going to have to learn something).  They also show perhaps the most important ingredient of all for defeating a dragon:  bravery.

Bravery is a running theme throughout almost all of children’s literature.  Real children often feel small, not least because everyone else in the world is so much larger than they are, and those larger people certainly do seem to make a mess of things.  In stories, children feel the exact same way, which makes those characters relatable not only to children, but to adults as well.  (Don’t pretend like you’ve never felt that the world was entirely too large and that you were entirely too small.)  Merely existing as a child takes a certain amount of bravery, but to exist as a child in a novel would be a great feat given the dangerous mission that likely lies ahead in every adventure story.  These children are usually terrified, but they face their dragon anyway.  Somehow, they find the resolve to carry out the duty no one else wanted, because they understand that if they don’t, more horrible things could happen.  In fact, most of the children in storybooks are cheerful in the face of adversity, or at least they do their best, and that is perhaps the bravest thing of all.  

As an adult, I am continually impressed with these literary children.  I understand that not only are they being braver than I ever could be (I would most definitely abandon the wizarding world to its fate rather than face Voldemort), but they are dealing with these grandiose scenarios on top of the ever-so-difficult process of growing up.  It’s no small feat to grow up.  Growing up means you start to understand the world and your place in it; you start to understand why and how society is structured the way it is, and it comes with an appreciation (and a certain cynicism) of that structure.  I think this is the key to why I like children’s literature so much:  I believe that no one ever truly grows up.  I don’t mean in the idealistic Peter Pan sense, in which some adults deliberately choose not to become more mature as they grow older.  And I don’t mean in the “every adult has a child in them” sense either.  I mean that I think everyone is in a state of becoming, and that people try to be a better version of themselves every single day.  Children understand this concept very well, because they continually reference the future—“when I grow up.”  They want to become a great person.  Adults usually lose this sense because they are grown up.

Children’s literature helps me, at least, to remember that I am not fully grown up, and I never will be.  I relate to Anne of Green Gables for her awkwardness and her ability to speak out of turn, and I especially relate to her remorse when she does the wrong thing.  Learning how to deal with people is part of maturing.  I relate to Harry when he feels like everything crashes in on him at once, and he doesn’t want the responsibility of saving the wizarding world.  Shouldering one’s duty the way he does is part of becoming an adult.  My problems won’t always look like theirs, and I think that’s part of growing up, too.  The problems in children’s literature will forever serve as reminders:  I’m not complete.  I’m still working to become the “grown up” version of me. 

If you think you have fully grown up, perhaps you should open one of those old dusty children’s books. Maybe you won’t feel as grown up as you think.

Emma Woodhouse: The Original YouTube Apology Video

From: SeattleRefined: “Review: ‘Emma.’ sets a spritely pace that doesn’t lose its luster along the way” Image credited to Focus Features.

If you’ve ever had the displeasure of following a YouTube beauty guru channel, you’ll eventually realize that most of the big-name YouTube beauty gurus all know each other, they all make way too much money, and they all like to start drama.  Like . . . a LOT of drama.  While I am not an avid follower of these things, somehow I was made aware of the existence of these beauty guru people through memes and the ever-knowing YouTube algorithm.  (God bless big tech, she says sarcastically.)  Anyway, with drama comes the inevitable hurt feelings and overstepping, along with real problems like marketing infringement and sometimes even child exploitation.  (Oh, the things we overlook in the rich and famous.)  Because fans drive a YouTuber’s career more directly than a traditional celebrity’s, it is a good move for these creators to apologize for past missteps in an attempt to expiate their past.  (Cancel culture, anyone?)  Thus comes the rise of the ever-popular apology video.

If you’ve never seen one of these, I’ll give you a quick rundown.  YouTube apology videos are typically characterized by being insanely long (can be up to forty minutes, or even longer), crocodile tears, false promises, disassociation, and insincere apologies.  Most apology videos are demonetized, which means the creator doesn’t make money off the video, which theoretically makes the video appear more “genuine,” even though the only people being fooled are the people making the video.  Typically, the creator says something like, “I apologize for how my behavior hurt others,” or “I apologize for being manipulated by (insert name of other person making another apology video here).”  As anyone can tell, these aren’t actual apologies—these people apologize to their fans for the fans’ perception of the creator’s behavior, not for the problematic behavior in the first place.  Responsibilities are not shouldered properly, or they apologize to the wrong people entirely.  That would be like me hitting a family member in public and then apologizing to a witness.  My main priority should be apologizing to the person I hit, first for hitting them and then for degrading their reputation in public.  YouTubers, though, try to follow a different set of rules.

But you know who nailed this whole false apology thing long before YouTube?  Jane Austen, with Emma Woodhouse, the protagonist of her novel Emma.

I have been told time and time again that when Austen wrote Emma, she was writing a protagonist only she could love.  Despite this tidbit, I also know that a lot of people absolutely love this book, but it’s rubbed me the wrong way ever since I read it, and since I have no advertisers to please, I can give whatever opinion I like.  I personally think it’s more fun to write a rant, and maybe it’ll be more fun for you to read. 

Oh, for those who haven’t read this book, spoilers ahead. This review, though, might make a bit more sense if you have read the book, as I’ll be referencing characters with the assumption that you have a vague idea of who they are.

As stated, I don’t like Emma.  I think she’s selfish and self-righteous, and I am thoroughly displeased that any man ends up with her, let alone the fabulous catch Mr. Knightley (her sister’s husband’s brother).  There are several things about Emma’s behavior that are reprehensible, and most of them are displayed in the last portion of the book.

Don’t get me wrong, Emma is pretty unbearable throughout.  Spurred on by the one matchmaking success she’s ever had, she finds it to be her Mission In Life ™ to make sure everyone is happily married (except herself, of course, as she is above such things as romance and affection).  Not only does Emma build up the young Harriet Smith beyond her social station, giving Harriet ambitions beyond what she can achieve; Emma is also tactless and imprudent, indiscriminately giving her opinion on the personal lives of others of whom she knows nothing. 

But those are not the main issues I have with Emma’s character.  All of her previous faults could be forgiven.  I have no sympathy for her, though, after she insults Miss Bates.  After this occurrence, in my mind, Emma does nothing to redeem herself. 

For those who don’t remember the story (or who don’t particularly mind spoilers), Emma and her friends go on a picnic, and Frank (Emma’s best friend’s stepson) asks everyone at said picnic to tell him one clever thing, two sorta-kinda clever things, or three boring things.  (Ah, the games played in Austen’s time.  They sound like a hoot.)  Miss Bates, an older woman who has a tendency to prattle, says she’ll have absolutely no problem fulfilling the last requirement.  Emma (for some reason) absolutely roasts her, agreeing wholeheartedly and quipping that Miss Bates will be hard-pressed to limit herself to only three dull things at a time.

Yeah.

Mr. Knightley, being a gentleman, decides at the end of the picnic to give Emma the grace she didn’t give Miss Bates—he takes her aside, tells her that what she did was tactless, and says he’s disappointed in her. 

And this is where the YouTube apology part comes into play.  You see, before Mr. Knightley told her about her misbehavior, Emma was seemingly unaware of it.  She knew she shouldn’t have said it, but after the words were spoken, she didn’t seem to feel bad, nor did she apologize or have the good grace to leave.  After Mr. Knightley spoke to her, Emma sat miserably in her carriage—alone—crying on the way back home.  Remember those YouTuber crocodile tears?  Yeah, I found them. 

Now, I would normally consider these tears of contrition, but the rest of the book proves otherwise.  Emma goes to Miss Bates’ home the next day and sits with her for a long time, but not once does she actually apologize for what happened at the picnic.  (Trust me, it’s not there; I was waiting for it, and it never came.)  Emma does leave Miss Bates’ company with everything all patched up, but as I read, I got the distinct impression Emma went to assuage her own conscience and not to atone for her wrongdoing.  If she were truly sorry, don’t you think she’d say so?  Insert the classic YouTuber Apology Video move of not shouldering responsibility properly. 

My theory was further confirmed when Emma came back to her own home and found Mr. Knightley visiting.  To her credit, Emma doesn’t mention what she’s been doing, but when her father asks after Miss Bates, Emma and Mr. Knightley share a significant look, and she considers herself back in his good graces.

Again, I ask, dear reader procrastinating by reading this blog: if Emma was trying to make up for insulting Miss Bates, why would she care about Knightley’s opinion?  As a reader, I felt as though I was supposed to believe that Mr. Knightley’s forgiveness mattered more than Miss Bates’.  In typical YouTuber fashion, Emma apologized to her audience rather than to the actual person she wronged. 

I could be persuaded to like Emma after all the havoc she caused if this episode had gone differently.  I have been in Emma’s situation more times than I can count—I’ve blurted out wretched things to people who didn’t deserve it, and I didn’t apologize, at least not with words.  (Sibling apologies are different, okay?)  But Emma’s comments were unprovoked and completely untoward.  If she had felt bad before Knightley spoke to her, I could easily like her character, but I also understand that sometimes we need people to guide us along a tactful path.  I can even forgive the tears—they weren’t in front of anyone, so she wasn’t trying to put on a show for anybody.  I don’t even mind that she gets back into Knightley’s good graces.  After all, good friends will call you out when you do something wrong, and that’s all he was being: a good friend.  But I just can’t quite get over the fact that she never actually apologized.  Because of that, the whole thing felt insincere.  The tears were frustration over being called out; the apology was just to get back on Knightley’s good side.  And boy, did she end up on his good side. 

I have other problems with Emma’s behavior after this episode. I’m not particularly a fan of how she treats Harriet in the last bit of the book after she realizes they both like the same man. But as they say, all’s fair in love and war.

Perhaps people will say I am being too harsh on Miss Woodhouse. After all, there are a myriad of people who actually do love her, despite the fact that Austen created Emma to please only herself. To her credit, Emma is very relatable specifically because of her flaws. I understood a lot of her motivations (written and unwritten), because I’ve been in situations like hers before. Though I’ve never set people on blind dates together, I have spoken out of turn and said stupid things that were probably better left in my head. All in all, she’s definitely one of the most real characters I’ve ever read.

In my humble opinion, however, Emma is a protagonist only a mother could love, and unfortunately, her father is a widower. 

The Bells of Notre Dame

From: Bennett Tobias at Unsplash.com

Today is the second anniversary of the burning of Notre Dame cathedral in Paris.  On April 15, 2019—the Monday of Holy Week—men and women from across the globe watched as Notre Dame was consumed in what can only be described as hellfire.  France watched on its knees as its beloved cathedral wouldn’t stop burning. 

I am not French.  As far as I’m aware, I don’t have a drop of French blood in me.  But the burning of Notre Dame cathedral holds a special place in my life because I was one of the last few thousand people who attended Mass there.  My friend and I had been visiting Paris for the weekend during our semester abroad, and we decided to go to Notre Dame.  We checked the Holy Week schedule, and as luck would have it, there was a vigil Mass in a few minutes.  I dropped a euro into a basket, and a man in turn handed me what Parisians apparently use in place of palms:  a short boxwood branch. 

The Mass itself felt a lot like any other Mass.  I have zero understanding of spoken French, but I was able to decipher the Palm Sunday readings well enough.  I remember attempting to understand the homily but ultimately giving up and rereading the booklet I was given at the beginning of Mass—also in French.  I have no idea what the homily was about that day, and to be frank, my experience of Notre Dame is actually embarrassingly limited.  I went to Mass, but in the shuffle afterward, my friend and I didn’t actually get to see the whole cathedral.  We stepped outside into the setting sun, snapped a few photos, and took a metro ride back to our hotel. 

The next day, we flew out of Paris, and a few hours later, my parents and grandparents were texting me, asking me if I was all right—Notre Dame was burning to the ground.  I opened the livestream on my phone and watched as the structure I had sat in two days before went up in flames.  The spire fell, the roof caved in, and the glass melted away—at least, that’s what we were told in the early hours of the fire.  Eight centuries of survival, a testament to Christendom:  gone.  And I was one of the last people to see it intact.  I had philosophy homework I was supposed to be reading, and I sat on my bed, trying to concentrate.  The livestream glowed with orange and red flames as the sky darkened around Notre Dame.  The sun was setting around me, too.  I realized I couldn’t read.  Instead, I watched Notre Dame burn in fascinated horror. 

In the wake of the fire, I thought about permanence and, oddly enough, death.  Being relatively young, I have not encountered much death.  My grandparents are still living, and the funerals I have been to were for people I didn’t know terribly well.  When I lived in Europe for those few months during my semester, I noticed that somehow, that world didn’t seem to care so much about death—or at least, it took death more as a matter of course.  Things were built to last, or at the very least, recycled; it seemed that almost every structure in Europe had been broken down and used for something else.  Stones that adorned the outside of the Colosseum, I was told, were used elsewhere after a shift in power.  The United States, by contrast, seems to like to place its history behind glass cases—only gloved experts are allowed to breathe the same air as the Declaration of Independence.  “Preservation” didn’t seem as big a deal to Europeans, or at least, not to the ancient ones.  And human death, at least in ancient times, seemed to be more of an opportunity to show off your wealth in a ridiculously lavish grave (that is, if you were rich enough). 

This, obviously, was not the attitude of modern Parisians as Notre Dame was suddenly destroyed.  My thought process went something to the tune of suddenly losing a family member:  “But I just spoke to him last night, and he seemed perfectly fine!”  While I cannot speak for France, there were likely many Parisians who felt the same way.  That cathedral withstood two world wars and was eight hundred fifty-six years old.  And if this cathedral could be destroyed in one afternoon, what does that mean for a human life?  As the flames started to die down, the old phrase proclaimed at papal coronations flashed through my mind:  sic transit gloria mundi, “thus passes the glory of the world.”  Notre Dame was the glory of Paris.  It’s not gone, by any stretch, but it is no longer quite so glorious.

Death happens, even sudden, unexpected death, and it’s only human to attach significance to stone cathedrals.  The glory of the world does pass away, and philosophers and theologians are always the first to point these things out.  Change is inevitable, they say, and the world itself is ephemeral and fleeting.  One thing I rarely hear about, at least in philosophy or theology, is humanity’s almost desperate need to rebuild.  That is what people have done and what people will do until the end of time.  I don’t have the ability to grieve over Notre Dame the way the people of France did.  It may not be rebuilt in the same way, and it is possible it will never be quite as beautiful again.  But there were glimmers of hope shortly after the fire was extinguished.  The Crown of Thorns was rescued by a brave priest, the windows weren’t as damaged as first thought, and it turned out that much of the art hadn’t been inside anyway because of the restoration project. 

This world was not made for permanence.  Not even stone structures will last forever.  Paris lost something very dear to it, but we will pick up the pieces, salvage what we can, grieve, and start again.  It might not look quite the same, but rebuilding seems to be almost essential to being human.  The ancients seemed to understand that as they recycled their old stone buildings and statues.  Tragedy changes the landscape, but human beings tend to take those scars and reintegrate them into something cohesive and whole once again.

I expect nothing less from humanity.  Even when things seem hopeless, we somehow muddle through. 

The Interpretive Ring

By: Lucas Lenzi from unsplash.com

When you read a book, do you spend all of your time dissecting the intentions of the author, using his historical and cultural context? If you said no, you’re like most people. If you said yes, you’re probably in college.

Most people read books however they want.  They ignore the intentions of the author, historical and cultural context, and sometimes they can even ignore the other books surrounding the one book they are currently reading.  And to be honest, as long as someone isn’t going around and publishing their reviews while pretending to be some great scholar (though there are plenty of those people), I think that can be perfectly fine.  Why wouldn’t it be?  (Okay, sometimes people twist it and ignore the author’s intent entirely—cancel culture, anyone?—but I’m talking about benign reinterpretation.)  A book, article, or blog post has two sides to it: first, the author writing the thing, and then the reader, who has his own cultural and historical context.  No one has a brain exactly like the author, and because of that, no one will interpret his ideas exactly the same way.

Readers of generally similar backgrounds, however, will interpret things in generally the same way.  Students at secular universities might read The Iliad and decide that Achilles and Patroklus are gay, while students at private Christian schools might read the same book and decide the relationship between the two men is platonic.  And then there might be a third party who doesn’t care about Achilles and his sexuality and just wants to get the darn book over with because if I have to read yet another horrible death sequence. . .

I digress. 

This brings me to our topic for today:  Reader Response Criticism and its (possible) pitfalls. 

When we last talked about Reader Response Theory, there was a pretty extensive discussion about Ideal/Informed Readers.  Today, we’re going to be talking more generally about something called an interpretive community, one of the main points in Stanley Fish’s Reader Response Theory. 

The basic idea of the interpretive community is that the readers, not the author, create the meaning of the text, because every single statement is contextual and will be interpreted in different ways by different people.  Interpretive communities are made up of people who share the same general interpretation of a work of art.  To illustrate, let’s take the following sentence:  “I never said she stole my money.”  This sentence can mean several different things depending on which word is emphasized, but it would easily be understood if it were read in the context of a conversation.  In the case of the interpretive community, it’s not the surrounding conversation that provides the context for specific works of art (e.g., novels and poems), but the reader’s own personal background and experiences.  Remember those Christian students who think the Achilles/Patroklus bond is platonic?  That’s right—they’re an interpretive community.  Those students who believe that Achilles and Patroklus are totally gay?  They’re an interpretive community, too.  Interpretive communities can be very small, like a group of friends who discuss literature together (the Inklings come to mind), or as big as a university, and everyone is in one.

I hear an objection.  “Sometimes things are just easily understood without cultural context,” you say.  “You don’t need to always be in an interpretive community.”  After all, people of all kinds of different backgrounds tend to understand poetry written by soldiers in WWI—it doesn’t take any special ring of people to show you how that poetry is supposed to work.  Fair enough.  I may have been a bit misleading in my previous paragraph.  Interpretive communities can be very small, but they can also encompass an entire language.  Sure, these poems are easy enough to understand for the vast majority of people who can speak and write English.  That’s an interpretive community, too, since its members actually do have a relatively homogeneous background (at least, the native English speakers do).  Obviously, there is diversity to this interpretive community, and it can become almost like a Russian nesting doll situation.  See, for me, I’m part of the English-speaking interpretive community, then the American interpretive community, then a further subset that I would call “Christian,” and then. . . well, you get the idea.  Basically, the communities have the potential to become concentric rings or Venn diagrams, with some bleed-through and overlap. 

And this is where I see the potential for a problem, but this is also where I have to ask you to hang tight while we discuss one more idea.  C.S. Lewis has an essay titled “The Inner Ring,” where he discusses the possible dangers of becoming exclusionary and closed off from others.  He explains that every person has a special ring of friends, but the everyday politics that go on between people can create rings inside rings.  Inside jokes and experiences can bond people in special ways, starting to make people feel like they belong to a special “club,” though the edges of the ring aren’t as defined as all that.  There is nothing wrong with having friends—and even, to some extent, having a closer circle of friends within that ring.  Besides, everyone wants to feel as if they belong.  The problem is introduced when that Inner Ring preys on that desire.  For a quick (and stupid) example (that Lewis definitely wouldn’t approve of, seeing as it’s way too obvious), in Mean Girls, Cady Heron becomes Regina George’s double.  Cady ditches khakis and starts wearing miniskirts in an attempt to fit in with the “Plastics,” the popular girls who supposedly run the school. She changes who she is and gives up essential parts of herself to be in that group, which is the biggest problem of the Inner Ring.  The ring of “Plastics,” however, is clearly defined.  With life, you rarely find circles that obvious.

How does this relate to the interpretive community?  Interpretive communities are pretty harmless, yeah?  Well, usually.  In my four years in college, though, I did pick up on a very subtle hint of “inner ringedness” while combing through scholarship and even when talking with friends.  Scholars tend to pontificate as if they’re right, which makes sense considering all the research and time it took to get where they are.  The problem is, a few strains of academia have difficulty with criticism, and the pesky thing about humans is that they have problems with criticism in general.  Those kids who assume Achilles and Patroklus are gay?  They’re missing out on an analysis of platonic friendship.  Those Christian kids who argue they’re just friends?  Well, they’re ignoring the fact that the Greeks didn’t mind a bit of homosexuality.  Perhaps they aren’t so much interpretive communities as interpretive rings.

But does that mean that every single person who belongs to multiple interpretive communities has to pay attention to what everyone else thinks?  No, of course not.  To paraphrase G.K. Chesterton:  don’t be so open-minded your brains fall out.  There are going to be some interpretations that are more correct than others, and there are going to be some interpretations that are so wrong that they should be dismissed completely.  There’s no reason to accept everything everyone else says.  So how can one person be part of an interpretive community and avoid the interpretive ring that begins to form later? 

Perhaps we can pull a solution from Jordan Peterson, a popular psychologist and author.  In his book 12 Rules for Life, Rule 9 states:  “Assume that the person you are listening to might know something you don’t.”  This approach acknowledges that each person naturally prefers his own ideas and the company of people who agree with him.  It does, however, leave a bit of wiggle room for new thoughts—the small epiphany of, “Oh, I hadn’t thought of that before.”  Inner rings are basically inevitable and can be relatively benign, or they can force people to change their souls in order to be considered part of the group.  Interpretive rings can be the same way, but with an added academic veneer.  If they are allowed to form, interpretive rings can encourage the people within to gaze sneeringly at the people without, as if the latter are too stupid to understand the “correct” point of view.

Now, obviously, “the interpretive ring” doesn’t exactly exist as a term in scholarship.  As far as I know, most people don’t pay much attention to Reader Response Theory, so they’re not very inclined to apply its ideas elsewhere.  I might be completely misguided, but I have noticed a growing trend of closing oneself off from people with whom one disagrees, especially among the upper echelons of academia.  In my own mind, this signals a shift from communities to rings, and it seems nearly inevitable.  Don’t fall prey to the Inner Ring—there is no need to change your soul for others who don’t want to accept you.  But don’t fall prey to the Interpretive Ring, either, by looking down on others and their ideas.  Part of being human is being a lifelong learner. 

In order to avoid shutting yourself off from other people and their ideas, listen to their arguments as arguments; don’t just listen in order to strengthen your own position.  Even if you end up dismissing their ideas outright, at least you know they exist now.  Remember, there’s no reason to stifle yourself just because the world wants you to remain inside an echo chamber.  You’re just one person, after all, and it’s very unlikely you’re right about everything.

Why is the Noodle Incident like Duncan’s Murder?

Image: Markus Spiske at Unsplash.com

Why is Calvin and Hobbes like a Greek tragedy, or like Macbeth?  This question can be answered in many legitimate ways—it probably has as many answers as the question posed by the Mad Hatter in Alice’s Adventures in Wonderland:  “Why is a raven like a writing desk?”  (Fun fact:  Lewis Carroll wrote this question without an answer in mind, but when pressed, he said that both can produce a few notes.  I, however, am unsatisfied with this answer; my favorite is “Poe wrote on both.”)  Unlike Carroll, however, I did write this question with an answer in mind.  My answer: because all three leave things to the imagination.

Plays, like movies, are a visual and auditory medium, and each has certain constraints and freedoms.  Movies can easily move the camera from one set to another, but camera shots must be planned carefully to show only the important information within the edges of the screen.  Plays are able to bridge the gap between audience and actor more easily, because stage actors have the ability to touch the audience.  However, stage acting is limited by set pieces that can’t move (unless you’re on Broadway; apparently Broadway just has an unlimited amount of money) and by the fact that much of the audience will have a difficult time seeing an actor’s face.  Movies, however, do have one significant leg up on plays:  they can show violence well. 

Violence in stories is absolutely not a new concept; in fact, Greek tragedies were doing horribly violent things to their characters long before the introduction of modern-day special effects.  Most good playwrights, however, would keep action like that offstage.  But why?  Was it because the Greeks were simply too upright and couldn’t stand such things?  Hardly.  We’re talking about the society that produced The Iliad, one of the most violent works I’ve ever read.  (Seriously.  It makes the baptism scene in The Godfather look tame.)  The reason the Greeks kept this stuff offstage is something that Horace explains in “The Art of Poetry.”  He explains that usually, it’s better to “show, not tell.”  This refers to the storytelling technique where the audience gets to learn about a character or plot through the actions in the story, rather than being told in something like a voiceover or a Star Wars crawl text.  Horace says that things are usually better absorbed if people get to see them rather than hear about them, except with violence.  The reason: violence looks fake as hell. 

Okay, so he didn’t say exactly that, but that was his general point.  Violence simply doesn’t look good onstage, and for a long time, it didn’t look good in movies either.  Making a bloody death look as realistic as possible would require, well, an actual bloody death.  Not only would this be unethical, but it would also make actors difficult to replace.  Contract negotiation alone would be a nightmare.  Movies, even now, tend to follow this rule, or at least the ones that aren’t rated R.  Typically, the camera pulls away at the last second before a villain hits the hero over the head, or sometimes both characters are shown only in silhouette (conveniently eliminating the need to show blood), or in some cases, all you can hear is the scuffle.  Movies that are rated R now have the ability to showcase violence in all of its “glory” because of advances in the movie and theater industries.  Fake blood looks almost like real blood, makeup and prosthetics can simulate severed limbs, and CGI special effects can make up for any deficit.  (Though I know there will be people who say that CGI always looks tacky.)  But why does it have to look so great?  Who cares if it looks fake? 

Most of you have probably already answered this question:  the audience doesn’t want to be reminded that they’re watching a movie, or that they’re being told a story.  People are willing to suspend what they think reality looks like in order to enjoy a book, movie, or play.  But live-action movies and plays have an interesting problem:  their world looks like ours.  If fights look ridiculously fake, or blood looks like ketchup, you’ll be reminded that the people on the screen are actors, not characters.  Once, when I was in high school, my teacher put on a clip of the scene in Julius Caesar where Caesar is stabbed to death.  This was clearly a low-budget production, because the clip showed a group of men surrounding Caesar, calmly lifting their swords up and down as he “writhed” on the floor.  We all laughed, because it looked ridiculous—stabs are quick, short, jerky movements, not fluid and slow.  This is why Macbeth is better:  Macbeth kills Duncan offstage, and thus the audience isn’t subjected to a shoddy recreation of murder.  Consciously or not, Shakespeare followed the tragic Greek formula for violence, and in doing so, he did the play a favor.  Besides, I demand only the highest quality in my murder recreations. 

But is that the only reason violence is better offstage?  If you can show violence so long as it looks real, why do some movies still shy away from showing it?  This is why Macbeth is like Calvin and Hobbes:  sometimes, leaving things to the imagination is just as effective, if not more effective, than showing them.  Longtime Calvin and Hobbes fans will remember “The Noodle Incident” referenced in the comics.  Internet theories abound, but all the strip tells us is that Calvin came home from school early (apparently there were sirens), his parents don’t appear to know what happened, and there were noodles involved.  Bill Watterson, the writer and illustrator of the comic, says that he leaves the incident unexplained because his readers could come up with something much more imaginative than he could.  (I personally think that Calvin blew up the teachers’ lounge microwave.)  The same concept applies to Macbeth, albeit more graphically.  Macbeth enters from offstage, his hands smeared in blood.  The audience can imagine the murder—did Duncan stir?  Did Macbeth hesitate, even a little?  Where’d he stab him?  Did he look back at Duncan’s body?  If he did, was he triumphant, or remorseful?  The lines afterward indicate a trembling man, but an actor could choose to inject a moment of triumph if he desired.

Imagination can sanitize or terrorize as much as a person allows.  Some Calvin and Hobbes fans think that Calvin’s “Noodle Incident” was that he played with his food at lunch; others, like myself, have crafted more elaborate stories.  It was left up to the readers, and many playwrights and screenwriters trust their audience to do this.  As many people know, the monsters we imagine under our beds at night are always worse than what we find when we flick the light switch. 

This post was last edited on December 24, 2020.

Why (I think) Flannery O’Connor Needs to Chill

Photo from Angelus News. (Originally from CNS Photo.)

WARNING: If you are in love with O’Connor’s work, the following may trigger, disturb, or cause emotional distress to you. Do what you will with this information.

Have you ever watched a movie based on a book you haven’t read? Typically, if you watch the movie with a bookworm, they’ll completely understand the plot and where the movie is headed, and more likely than not they’ll lean over to you halfway through the movie and say something like, “This part was way better in the book.” Ignoring that obnoxious behavior, sometimes the more obnoxious part is not understanding the movie at all because the director assumed you read the book. (Looking at you, Percy Jackson and the Lightning Thief. Also Divergent.) Books usually provide context and motivations that don’t always translate well to movies. But I am firmly of the opinion that if a book is going to be translated to the silver screen, the movie has to make sense without your having read the book. For example, The Hunger Games can be watched without reading the books; the world is coherent, and character motivations are shown through acting. The Lord of the Rings might have more depth in the books, but you don’t have to read them to watch the movies. When I go to a movie theater, I go to be entertained. I don’t want a thousand-plus-page trilogy assigned as homework. I want to be inspired to read the book, not told to read it to make sense of the movie.

I feel the same way about books. In order to understand the point an author is making, I shouldn’t have to dig through outside scholarship for hours. I have two reasons for this: first, my four years of digging through outside scholarship for hours, and second, my assertion that human beings are supremely lazy. If you assign “homework” to something that’s supposed to be for pleasure, most people won’t do it. This would be why most people don’t know why Tolkien wrote The Lord of the Rings: he wanted his books to be to the English people what the ancient myths were for the Greeks–a kind of pre-historical web of story passed down from generation to generation. As it stands now, only people who have actually studied Tolkien and his work know this. (Though the YouTube channel Wired has put out some excellent videos on Tolkien lately.) LOTR is now just a fun novel-esque story that only “nerds” get really into.

Making people fish around to understand intentions doesn’t really work–though perhaps I have been too harsh. It’s not actually that people are lazy; it could be that people don’t realize there is anything beyond the canon at all. Would it occur to a casual reader of LOTR that they should read Tolkien’s letters? Probably not. His purpose was lost not necessarily because people are lazy (though that remains true) but because people didn’t realize there was more.

Don’t assign a handbook to your work. Your brain is a complex web of motivations; it’s arrogant to expect everyone to understand what you’re doing. You shouldn’t assume stupidity on the part of your audience, but don’t you dare assume they know what’s going on in your head, or even that they care. If you are an author, it is your job to make me care, and you cannot make people care by assigning outside source material.

Which is why I really just don’t like Flannery O’Connor’s work.

Flannery O’Connor spent her life battling two things: lupus and nihilism. She eventually died of lupus at thirty-nine years old, and she never gave in to nihilism. If you’re just a casual reader of her stories, though, you might think otherwise.

Don’t let the beautiful watercolors on her books fool you: O’Connor’s stories are typically dark. One story, “A Good Man is Hard to Find,” follows a grandmother riding in a car with her son, his wife, and their children. The car crashes, and a man called The Misfit kills each member of the family, including an infant. The grandmother is the last to die, and the story ends with her being shot. The grandmother is a pusillanimous woman, thinking much of herself and her appearances. If you read the story “correctly,” however, you can tease out a brief moment of grace just before the grandmother is shot: she forgives the Misfit and sees him as a child of God. The Misfit appears to possibly have a brief flash of grace himself but kills the old woman anyway. A supremely hopeful reader might think that perhaps he will turn his life around someday.

The story doesn’t end in a particularly hopeful vein, though Flannery leaves the door to the Misfit’s redemption very, very slightly ajar. Flannery writes her stories to shock the reader. The Misfit is such a horrible, evil man that the reader is supposed to look at him and be shocked by the consequences of his nihilistic philosophy. The reader is also supposed to relate to the small-souled grandmother and share her shock at the Misfit. This revelation at the end of the story enables her to escape her pusillanimous prison and have an epiphany of grace just before her death.

I object to this for two reasons. First, without that explanation, Flannery’s work is incredibly difficult to decipher. Most readers, unaware of Flannery’s Catholic roots, would just read the story and remark that it’s a little dark. Most people will not be teasing out the meanings behind short stories; let’s be real, not everyone took a short story class in college. (I sure didn’t.) Most people I know who like the stories enjoy dark literature and would have no problem with the story above, even without the explanation. Before moving on to my second reason, let’s assume for a moment that the reader does pick up on this way of writing and the meaning behind it. My second reason for disliking Flannery’s writing is that I strongly object to the idea that only evil can shock a small-souled person out of their shell of a world. In my experience, evil hardens people and tends to close them in more, not less. It is in the nature of evil to divide people and make them isolated. In fairness to Flannery, her characters can be read as either giving in to nihilism or holding out hope for a new beginning, and most Flannery scholars focus on the idea that there is hope shining through the darkness in her stories. It is my personal belief that kindness and beauty are more solid options than such a roundabout route.

I’ve also had teachers explain to me that there is beauty in her stories, but it’s harder to find, just like in real life. This is a perfectly valid point, and my objection to it isn’t so much a reasoned argument as a personal preference. I do not like dark short stories. In my personal opinion, they are entirely too nihilistic, and they lack the closure I expect from a plot. Life is already too dark, and I do not need to be reading dark literature in my spare time.

I have been told that readers must be humble before the work of an artist, because it’s always better to give the benefit of the doubt. As has already been established, the human mind is complex, and there could be a good message hidden behind the words. I highly encourage you to act this way toward new authors and new works. There does come a time, however, when your own logic and opinion matter. Artists aren’t above criticism just because they’re artists. I respect the work Flannery O’Connor did, but in my own personal opinion, I believe she failed in her mission. I realize, however, that people much smarter than I am believe she succeeded. Perhaps someday, unlikely though it may be, I will come to appreciate the work of Flannery O’Connor.