Tuesday, September 29, 2015

You mean my opinion DOESN'T make a difference??? Sheesh, why do we even have this blog?


It's scary how accurate Jodi Dean's ideas are about the paradoxical effect that abundant communication is having on our society, particularly when it comes to politics and other issues that actually matter. Thanks to the internet and technology, there are so many modes of communication and opportunities for "connectivity" these days that you'd think democracy would be booming. After all, more people are able to express their opinions! Yay! This should make the whole world better! Go democracy!

But in her article, "Communicative Capitalism," Dean bursts our bubble, arguing that our advanced technologies of communication have actually isolated us more than connected us, at least as far as government goes. The reason, she explains, is that "communicative exchanges, rather than being fundamental to democratic politics, are the basic elements of capitalist production." In other words, communication is focused much more on advertising than on debating issues. This is communicative capitalism.

I know what you're probably thinking. "What the ...? How could increased communications REDUCE democracy and, well, communication? That's so self-contradictory!" Yup, I'm asking myself the same question. In fact, even Dean feels a bit incredulous about it all: "Why, at a time when the means of communication have been revolutionized, when people can contribute their opinions and access those of others rapidly and immediately, has democracy failed? Why has the expansion and intensification of communication networks, the proliferation of the very tools of democracy, coincided with the collapse of democratic deliberation and, indeed, struggle? These are the questions the idea of communicative capitalism helps us answer ... " She goes on to explain that when people voice their opinions nowadays, those opinions never have to be answered; they are just considered a "contribution" to the pool rather than a message that requires a response. And if we don't ever respond, there's no debate. It's like we live in a world of ads. People communicate in order to sell their opinions rather than to discuss them. Just as there is no pressure to respond to an ad, there is no pressure to respond to opposing opinions, either, even if it's a protest against something very serious. Even if it's a protest against something that our governing body is doing.

That's frustrating, considering that we, the people, are supposed to be in charge of the government. But how can we pretend to be if it ignores the issues we bring up?

It's interesting, isn't it? I mean, this blog post is just going to be added to the giant pool of opinions that are read and tossed aside. “Thank you for your contribution, Miss Stout. It won't make any difference whatsoever, but we're glad that you got that off your chest.” Seriously? Does anyone's opinion even matter anymore?

Well, I would argue that just because communication can't change the government quickly doesn't mean that it won't eventually. After all, communication can change people. It can alter the paradigms of individuals. And everyone knows that if there is an overwhelming majority of people pushing for something, it becomes more likely that the government will take note of them. However, will people join a cause because they debated it and truly deliberated it for themselves, or because they bought into the convincing "ads" and "catchphrases" going around about it? I'm sure that the brain-scattering effect of the internet that we discussed in earlier posts has a lot to do with this. After all, if our capacity to carefully and deliberately read through and decipher complicated issues has been greatly weakened by the internet's overstimulating influence, we will probably skim over things that in fact require, and even deserve, much of our attention and thought. And if that is indeed our tendency, we'll probably side with whoever has the best "sales pitch," if you will—the party with the easiest, quickest-to-understand arguments that require little actual thinking on our part. Honestly, thinking for ourselves is so last century.

Friday, September 25, 2015

A Roundabout Way of Saying: "We've Put a Noose on Communication."



“Not only are people accustomed to putting their thoughts online but also in so doing they believe their thoughts and ideas are registering … Contributing to the infostream, we might say, has a subjective registration effect. One believes that it matters, that it contributes, that it means something.” - Jodi Dean
I think I see what Jodi Dean, in her Communicative Capitalism essay, was trying to say about how politics get lost because people can never see the whole issue. Personal opinions and neighbor bashing take precedence over real issues. Does anyone even know what the real issues are? After slogging through Dean’s biased political jargon, I found something that actually resonated with me, that got my mind working. “One believes that it matters, that it contributes, that it means something.” And whether or not it applies to communicative capitalism (which is TERRIBLE to say out loud, by the way), I’ve got a point here that I’m going to try to express. Bear with me.

Belief and reality don’t necessarily coincide.

On my phone there is an app that allows me to have an ongoing group chat with my mother and my three sisters. One day, I received many messages from one sister saying that she was quitting Facebook forever, because it was “fake and negative.”

“And besides,” she said, “the only people I need to communicate with that actually matter are right here on this app.”

My sister didn’t say that things were positive and negative. She said they were fake and negative. In essence, she was saying that people tend to show an absolute side of themselves, one that is either extraordinarily negative or so markedly good that the presentation seems perfect – and pretend. My sister could not handle the negativity – the complaining, the sharing of terrible things, the atrocities making headlines with seemingly nothing to buffer the onslaught. She also couldn’t handle the pretenders – those people who only post the positive things while hiding the side of life that isn’t a perfect meal, perfect children, an exquisitely frosted cupcake. So she quit.

The deluge of mono-personalities is both overwhelming and overpowering. Behind our screens, humans become thorough aggressors, opinionated authorities who relay information born of passion rather than research. And that is the thing that bothers me most. Which takes me back to what I said before.

Belief and reality don’t necessarily coincide. They don’t. Just because you believe something doesn’t mean that it’s fact. In that vein, maybe this whole communicative capitalism thing is, in effect, the death of argument. And that is SUCH a shame. People don’t know how to argue anymore. They don’t know how to learn from one another. It seems that when discussion begins, it is so often murdered by bigotry on all fronts.

I can’t help but feel that our inability to argue, to see things critically instead of with charged emotion, is going to allow things to happen over our heads. If the US is supposed to be governed by the People, but the People are too busy squabbling ineffectively, what’s to stop our government from doing things that we might not want them to do? If our attentions are occupied by the death of a lion, or the sex change of an athlete, we’re impeccably distracted from things that might be of real national import. The things we first-world, overly privileged humans care about are things that third-world people don’t waste their energy on.

In the case of the lion that was shot by a dentist, there was extreme uproar in the United States, so extreme that some of the most offended were calling for the dentist's own death. But the people who actually live where the lion was shot did not care. As a matter of fact, they were relieved. The lion was called “beloved” by someone with a little too much creative license on their hands. In Zimbabwe, where people are killed all too often by lions, nobody shed a single tear. To them, the death of a lion, whether legal or not, is welcome. It’s all a matter of point of view.

I guess what I’m trying to say is, the art of discussion has become tremendously skewed because of our ability to pick and choose what will be deemed important, whether or not it actually has any value. Argument has cancer, as malignant and deadly as they come. If rhetoric is the body, and our cells are topics of discussion, then communicative capitalism symbolizes the malignant cells that multiply into horrible, useless, strangling tumors. There’s a tumor on the windpipe of argument, and it is growing with alarming rapidity. Asphyxiation is imminent.

Tuesday, September 22, 2015

Wir Sind Die Roboter

One of my favorite songs ever is called Die Roboter, or Wir Sind Die Roboter (The Robots, or We Are The Robots in English) by one of my favorite bands ever, Kraftwerk. Here’s the video if you’re feeling curious.


One of the major ideas that Kraftwerk’s performances and music explore is the synthesis of humankind and technology. A lot of their recordings and live shows blended those notions, asking what could happen when one becomes the other, and vice versa.

Carr’s article made me think of this song. Both the song and the text offer fascinating insights into how our thought processes are being shaped by this melding of man and machine.

There are a million things that can be teased from this article, but the notion of understanding in AI really stuck out to me. What is understanding, exactly, especially in the age of the internet? Is there a difference between understanding and simply knowing? What the heck is knowledge anyway? Are the differences only semantic? If yes, why would that be? Who is capable of understanding? Can machines be taught to think the way a human can? The questions are endless, and it’s going to take someone a lot smarter than I am to sort these out. Let’s turn to the text.

While discussing Google’s data retrieval efforts in terms of a stepping-stone to AI, Carr states, “It suggests a belief that intelligence is the output of a mechanical process, a series of discrete steps that can be isolated, measured, and optimized. In Google’s world, the world we enter when we go online, there’s little place for the fuzziness of contemplation. Ambiguity is not an opening for insight but a bug to be fixed. The human brain is just an outdated computer that needs a faster processor and a bigger hard drive.”

Are we simply “programmed” with information through the course of our life? How do we process it? It’s interesting to note how the language we use to talk about computers and brains is remarkably similar.

John Searle’s “Chinese Room” argument comes to mind (60-second version here).


The esteemed philosopher’s thought experiment sheds a little more (but not a lot of) light on the subject. Searle argues that computers are only able to mimic human thinking, not actually do it: for a computer, it’s just a matter of following instructions to produce a correct result, or a relevant piece of information.
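
To make Searle's rule-following idea concrete, here's a toy sketch in Python. The "rule book" below is entirely my own invention for illustration (it isn't from Searle or Carr, and a real conversation would need an absurdly bigger one). The program hands back sensible-looking Chinese answers by pure lookup, and at no point does it understand a single symbol:

```python
# A toy "Chinese Room": correct-looking output from rote rule-following.
# The rule book is invented for illustration; no understanding is involved.
RULE_BOOK = {
    "你好吗？": "我很好，谢谢。",          # "How are you?" -> "I'm fine, thanks."
    "天空是什么颜色？": "天空是蓝色的。",  # "What color is the sky?" -> "The sky is blue."
}

def chinese_room(symbols: str) -> str:
    """Look up the incoming symbols and hand back the matching output.
    No step here involves knowing what any of the symbols mean."""
    return RULE_BOOK.get(symbols, "对不起，我不明白。")  # "Sorry, I don't understand."

print(chinese_room("你好吗？"))  # looks like comprehension, but it's only a lookup
```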

Don’t we do the same? I’m definitely no neuroscientist, but I would argue that our minds store and process information in largely the same way. This is pure speculation on my part, but maybe our unique ways of thinking about things are caused by those “bugs” Carr wrote about.


Perhaps having a larger hard drive, say, the internet, wired to our brains would only benefit us. Perhaps as humans and machines grow closer together, human thinking will continue to grow more machinelike and machine logic will grow more humanlike. Like Kraftwerk told us almost 40 years ago, the lines between man and machine are being blurred. After all, Wir Sind Die Roboter.


Squirrel! Or Why I Blame the Internet for Everything, Especially This Post

I finally figured out the joke on those evolution t-shirts, the ones that show the evolution of man and end in him crouching in front of a computer. (Took me long enough. I blame the Internet for distracting me.)

Carr says that the Internet “returns us to our native state of distractedness, while presenting us with far more distractions than our ancestors ever had to contend with” (15). Well, doesn’t that make you feel grand and intellectual? Apparently all of humanity’s efforts have led to searching the screen the way a dog lets its nose guide it from one delicious smell to the next. And we can hardly stay away.

Oh yes, we’ve made great progress. We’ve got all of this knowledge, all of this experience, all of these frameworks to build upon. But the thing is, we haven’t actually got it. All that juicy information belongs to the hive mind, otherwise known as the Internet. Carr says, “As we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence” (10). I do sense a bit of role reversal. It’s almost like I’m uploading information into my brain for short-term storage only. My real brain, the one I’d be lost without, I access through my fingertips and view through a screen.

You’d think some benefits would come from this. And there are—easy communication, vast information, etc.—but none for the individual. I’d assumed that heavy Internet users are good at multitasking. But then I read Carr’s article. The study he cites, which compared heavy media multitaskers with non-multitaskers, suggests otherwise. He says, “The heavy multitaskers weren't even good at multitasking. They were considerably less adept at switching between tasks than the more infrequent multitaskers. ‘Everything distracts them’” (13).

What does it even mean to multitask? When I multitask, I’m not actually doing multiple things at once. Rather, while I’m working on something, my mind starts slipping to the next task partway through. It turns into a fast cycle of engagement and disengagement, and while I might be able to cram in or spew out a whole lot of information very quickly, a great deal of it is not completely thought through. Take my blog posts, for instance. Read through them and you’ll notice the lack of transitions from one topic to another. Carr said that the Internet changes “our habits of mind” (14). Viewing information in short, quick clumps has led me to care less about tying my own information together. Once I’ve devoted a few minutes to one idea, I’m on to the next. Then it’s up to you to connect the strings. Or do you? How much does your Internet-saturated brain mind my chaotic writing?

I know that even away from the Internet, my brain loves to be distracted. Carr says, of how Internet-use changes our brains, “The cellular alterations continue to shape the way we think even when we're not using the technology” (14).

When I opened up Above the Fold, I started flipping through the pages to get to the required reading. I was just going to skip the preface, until I saw a little blurb off to the side, bright orange, big letters. I allowed myself to read that. And then I went from distraction to intrigue and curiosity, so I scanned the page for where that blurb came from and read that section. You might say my distraction is caused by the Internet. But consider newspapers—long before the Internet, big bold headlines caught people’s attention. Sure, publishers used to have newsboys shouting, drawing attention, but that stopped as deliveries became a thing. And then the information had to speak for itself, which, ironically, it’s not very good at. Sometimes the design has to shout the words. Even away from the Internet, that principle is well understood. With the Internet, I suppose that’s kind of like CSS working for HTML.

Oh, well, I can’t think of a way to tie in this next part, so I’m just going to jump straight into another topic. (I can do that, you know. Since my post is about Internet distraction, it turns my lack of effort into a “rhetorical choice.”)


Carr quotes Patricia Greenfield, who explains how media affect our brains. She states, "‘Every medium develops some cognitive skills at the expense of others’” (13). It’s hard to argue that the Internet is pure evil when there’s some form of balance. I can list pros and cons of the Internet, but I wonder at what point the cons will so heavily outweigh the pros that the Internet has simply gone too far. Or, at that point, what if the cons aren’t recognized as cons? What if most people spend their whole lives in some virtual reality, but they don’t see what’s wrong with that? I’d want to tell them they aren’t experiencing the real world. They don’t understand hard work. They don’t know what real excitement is. But thirty years ago, people would have said the same thing about me. I think no matter how many articles are published about how Google makes us stupid or the Internet makes us dumb, none of that is going to stand up to Netflix. While right now the idea of humans becoming almost like machines alarms me, people adapt so quickly. I don’t think there’s any “upgrade” that most people wouldn’t welcome eventually.

Monday, September 21, 2015

The internet shortens our attention span so much that we don't even finish our

After reading Nicholas Carr's articles about how the internet is affecting our brains, I am suddenly a lot more concerned about the omnipresence of the digitized world than I was before. Society has adapted so quickly to this new virtual reality, this new "habitat," that we should have guessed such adaptations would have an effect on our brains.
Duh.

To be honest, I have noticed a difference in the way my brain processes information. Before reading Carr's articles, I had been calling it "impatience." I was not patient enough to read for very long in one sitting, nor was I patient enough to do all of my homework in one sitting. I'd rather do it in spastic spurts of energy throughout the day, interrupted every five to ten minutes with texting, watching YouTube videos, Facebooking, reading an interesting article about an underlying plot-line of Star Wars III: Revenge of the Sith, etc. It's not just my activity on the internet that is being affected, but my activity everywhere, because of how easily distracted I am. It's frustrating, but Nicholas Carr is at least helping me recognize the culprit: "... A growing body of scientific evidence suggests that the Net, with its constant distractions and interruptions, is also turning us into scattered and superficial thinkers."

Net, it's all your fault!
... Or is it?

Well, yes. It is. The internet is mostly to blame for this new epidemic of extreme scatterbrained-ness. However, knowing that empowers us to combat it. Now that we know what to combat, it will be our own fault if we do nothing to change it ... if we do in fact want to improve our cognitive abilities and not turn into shallow robotic thinkers. Some might not care at all. But I do. Why do we need to give in to the constant messages and advertisements and whims that attack us and try to divert our attention so that we are never doing the same thing for longer than about fifteen minutes? Don't we have more control over ourselves than that? Something that Carr mentioned gives me hope:

"The human brain is almost infinitely malleable. People used to think that our mental meshwork ... was largely fixed by the time we reach adulthood. But brain researchers have discovered that that's not the case. James Olds, a professor of neuroscience who directs the Krasnow Institute for Advanced Study at George Mason University, says that even the adult mind "is very plastic." Nerve cells routinely break old connections and form new ones. 'The brain,' according to Olds, 'has the ability to reprogram itself on the fly, altering the way it functions.'"

Look! Just as our brains have been "reprogrammed" to take in so much information at once and to skim rather than read, can they not be reprogrammed to do the opposite if we change our habits and determine to do more things that require our undivided attention, such as reading more books? Can we not set aside a specific time to check texts and emails so that we don't get interrupted in the task we are trying to complete? Can we not also set aside a specific time to surf YouTube, if we so desire, so that our study time is not fragmented by it? Maybe in this way, by consciously changing our habits, environment, and attitude toward the internet, we can recuperate our attention spans, at least to a degree.

Some might argue that the internet is too prevalent in this day and age to avoid the negative effects it will have on our minds, but I say that it really is our choice what we do with our time and who we decide to become, and if we can do something, we should. We don't have to live in the automatic mode that Carr suggests is an effect of the "rapid shifts in focus" that vigorously surfing the web produces. We don't have to fall into the category of the multitaskers who are "less creative and less productive than those who do one thing at a time."

I will add, however, that I got off track at least 25 times while writing this blog post because of texts and emails and other things that distracted me. But this time, I at least noticed it. I even laughed at myself. My brain really does seem to have been molded into a baby internet Frankenstein. Wonderful. It's going to be much harder implementing self-discipline than it was to just write about doing it. But it'll be worth it.

Oh, look! I got a text! I'll get back to writing this post in a sec.

Once Upon a Time the Internet Stole Our Minds

A few weeks ago, as I sat in my first day of classes, I heard the same thing over and over again. "You're English majors. You know what you're doing. You know how to write." By the end of the day, I felt very confident that I did not, in fact, know how to write. Not academically, anyway. But besides the writing issue, there was a bigger one - reading. There was so much to read. So much! Every night I'd read anywhere from 75 to 125 pages, and at first I was struggling to take it all in.

Over the last four years, I have been a stay-at-home mom. During those years, when my kids were really little, I think I read maybe...four books? (And I mean adult books, not picture books. I've read hundreds of those.) Several months ago, I was invited into a book club, which I joined reluctantly, because, as I lamented to my spouse, I had essentially forgotten how to read.

Obviously, I could still read. I could interpret the symbols of the alphabet correctly, see them as words and so forth, but I wasn't getting much from the reading. I was so used to asking a direct question, getting quick, factual answers (or, at least, answers that seemed factual), and utilizing the gleaned information that I had no need to go deeper.

In his article, "Is Google Making Us Stupid?", Nicholas Carr quotes developmental psychologist Maryanne Wolf:
"'We are how we read.' Wolf worries that the style of reading promoted by the Net, a style that puts 'efficiency' and 'immediacy' above all else, may be weakening our capacity for the kind of deep reading that emerged when an earlier technology, the printing press, made long and complex works of prose commonplace. When we read online, she says, we tend to become 'mere decoders of information.' Our ability to interpret text, to make the rich mental connections that form when we read deeply and without distraction, remains largely disengaged."
This is what was happening to me. I couldn't make "rich mental connections" because I'd lost the ability to read effectively, critically.

But I joined the book club, and it proved to be effective in slowly re-engaging my ability to "interpret text."

It seems to me that in order to read in such a way as to learn effectively, the ability has to be exercised. I have a grandmother who is 72, and she is a master violinist. Since the age of three, she has practiced her violin daily, and though she is no longer in an orchestra, she still practices every day. "Because it's a skill," she says. "And if you don't keep up your skills, they'll go away."

In his article, Carr relates the story of Frederick Winslow Taylor, who went about timing the job functions of factory workers in an effort to make them more efficient. Though the factory workers complained about becoming "automatons," Taylor's experiments were a success.
"The goal, as Taylor defined it in his celebrated 1911 treatise, The Principles of Scientific Management, was to identify and adopt, for every job, the 'one best method' of work and thereby to effect 'the gradual substitution of science for rule of thumb throughout the mechanic arts.' Once his system was applied to all acts of manual labor, Taylor assured his followers, it would bring about a restructuring not only of industry but of society, creating a utopia of perfect efficiency. 'In the past the man has been first,' he declared; 'in the future the system must be first.'"
I can't help but scowl when I read this. It just doesn't sit well with me. Sure, efficiency is great, and it would seem that first-world humanity is forever eager to find new ways to streamline any sort of process. Even if the process is already simple, we will seek to simplify it further. But what about humanity itself?

I don't like that Taylor wants the "system" to be first in our consideration, but much to my own chagrin, I think his wish is coming true. Because the internet is so efficient, and capable of doing so many things for us, not only is our ability to read affected, but our ability to do things in general is affected. In his article, Carr discusses how even maps, television, and the printing press are being digitized. We rely on these digitized "conveniences" to do our thinking for us. What happens if all you have to find your way is a paper map and the sun?

My point is, though efficiency is great, maybe Carr is right. Maybe Google, and the internet in general, is making us stupid. I have this strong conviction that part of what makes the brain smart is the trial and error we have to go through to get there. Actually reading a book with the intent of finding meaning, writing things out by hand, coming up with something creative on your own instead of relying on Pinterest to be creative for you - I feel like these sorts of situations are what make us strong thinkers.

"Maybe I’m just a worrywart," said Carr. "Just as there’s a tendency to glorify technological progress, there’s a countertendency to expect the worst of every new tool or machine."

I am so a worrywart. I'm a bit skeptical of progress until it's been proven by other people. You won't find me being a guinea pig in any sort of study. Going forward into the future, I suppose I'll continue to take every newfangled gadget with a grain or two of salt.

But then, who knows, the salt might improve the savor.

Tuesday, September 15, 2015

Memoir of a Flesh and Blood Girl Before All of Humanity Was Uploaded to WiFi.

I spent a ridiculous amount of time deciding between a print and a digital format for our HTML textbook. I ran my hands over the smooth pages, in awe of the strikingly clean text against the simple colors, and lingered on the pretty, simple layout and font. Really, that’s all the same in an eBook. But it was just different to hold it in my hands, to interact with it beyond the keyboard. We view everything online from behind a glass wall. And those things can slip away and be replaced with a click. Eventually, though, I went with the eBook, because it made sense to use a digital manual when I’m making something digital.

When something goes digital, anything associated with it will probably end up digital, too. I used to buy CDs. But then there was a way to buy and store my music in the same place, so I used iTunes and downloaded the music. Later, I could listen to music straight from the Internet, in my pocket, for free. Hello, Pandora. The Internet is sucking in everything physical. I swear, by the end of the world our reality will be digital, and we'll be riding signals up into a computer in space.

I used to think about buying music as buying the album of an artist I liked, and I’d just skip the songs I didn’t really like. Now I generally care more about individual songs. Spotify still allows me to “open” an album to look at the songs in a hierarchical fashion, but I don’t have to.
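
Just for fun, here's how I picture that shift in a little Python sketch. (The album is real Kraftwerk, but everything else here is invented for illustration; this has nothing to do with Spotify's actual software.) The old model nests songs under an album in a fixed order; the streaming-era habit treats them as one flat, shuffleable pool:

```python
import random

# The old model: songs live inside an album, in a fixed, hierarchical order.
album = {
    "artist": "Kraftwerk",
    "title": "Die Mensch-Maschine",
    "tracks": ["Die Roboter", "Spacelab", "Metropolis"],
}

# The new habit: the same songs as one flat pool, pulled out of their hierarchy.
pool = [f"{album['artist']} - {track}" for track in album["tracks"]]
random.shuffle(pool)  # once shuffled, the album's order means nothing

print(pool)  # e.g. ['Kraftwerk - Spacelab', 'Kraftwerk - Die Roboter', ...]
```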

Manovich wrote, “We are no longer interfacing to a computer but to culture encoded in digital form.” When I go on Pinterest to look at images, its digital form is part of the experience. It wouldn’t feel the same looking at recipes or origami tutorials on paper, even if I could “like” them by storing them in a book. Ever since I started using it, the Internet has been the go-to place for entertainment. It’s just where that part of culture is.

I’m also fascinated by the advances in virtual reality, and what it might mean someday as a “go-to place for entertainment.” But I’m disappointed by the traditional means of interacting—the keyboard and mouse. How scared can I be walking through a dark hall when I am obviously not walking and my finger is on an arrow key? But there are controllers that are more immersive (as far as I can tell, treadmills that track the user's movement), and there are even plans for a virtual reality center where users can run around, interacting with the environment. That’s a huge interface change. I can’t imagine the traditional “keyboard, mouse, window-like screen” setup changing for a long time, especially for work purposes. But as far as entertainment goes, I can imagine the next generation having a very different idea of the human-computer interface.

Third-wave Ska and Second-rate Pirates

Back in the day, I was thrilled with the advent of file-sharing services. It was the early aughts, and I was deep in the throes of newfound teen mopiness. My outlet was music, and my parents’ broadband was the conduit. I spent hours upon hours downloading myriad illegal mp3s from less-than-reputable sources. The internet freed me from having to save my meager wages to buy albums; my music collection was suddenly unchained. Little did I know, this access to a world’s wealth of music would influence my future media habits so drastically.

It didn’t bother me that my parents would have killed me if they had known about all of the pirating I was involved in. I downloaded hundreds of hours of third-wave ska, enough to make any potential danger worth it. Back before the fear of litigation was planted in my then 15-year-old heart, Limewire, KaZaA and Napster provided me with an endless source of tunes and greatly influenced my taste in music (yes, I managed to branch out from my ska days. Gone, but certainly not forgotten). New and old music came at me fast—there was always something different to explore.

As it turns out, this enormous, hodgepodge collection of misspelled filenames, out-of-order track listings and blatant mislabeling was incredibly formative in the development of my media-ingestion and information-gathering habits. My brain was rewired to treat music and other information as a fractured whole, where any semblance of cohesion was born from arbitrary filenames, sorted either chronologically or by name. I eventually graduated to the shuffle feature, and this rocked the boat as well. Information was presented at a million miles a minute, and I tried my best to adjust.

Now, it’s second nature for me to multitask. Granted, I’m not very good at it, but my tether to technology all but demands it. Like that steady stream of broken filenames and varied genres, we process and assemble information according to the new rules imposed by the internet. Everything is fragmented; we thrive on eclecticism.


Don’t get me wrong—there is nothing wrong with albums or books. These forms should be celebrated. But it seems as though those mediums are the exception now. We are taking all of these things in and creating a unique blend of info and media that hasn’t been seen before. It’s an exciting era.