Georgia GOP voters deciding their candidate for governor will have been able to cast ballots in the race for 10 days by the time this paper comes out. And they still have until July 20th to vote early before the final option of showing up in person on election day, July 24th.
It’s a great idea that Georgia and other states do whatever they can to make voting convenient. But it’s unclear that the efforts have paid off in any substantive gains in turnout, and there are solid reasons to believe so much early voting time is detrimental to elections.
The Georgia political magazine James, in its May/June issue, ran a well-researched article questioning whether excessively long early voting periods hurt turnout. The article noted, “While voters may find early voting more convenient, turnout data show that early voting may actually decrease turnout, not increase it.”
Making voting as easy as picking up a gallon of milk makes it seem trivial. In the words of Revolutionary-era pamphleteer Thomas Paine, “What we obtain too cheap, we esteem too lightly.”
The James article by Hans von Spakovsky reported that Texas was the first state to adopt early voting in 1988. Thirty-seven states now offer some form of early voting.
A study from American University looked at the 2008 presidential election. The 2008 turnout was up 2.4 percent over the 2004 election. But seven of the 13 states with the highest turnouts had no form of early voting, and 10 of the 12 states that saw a decline in participation had some form of early voting. The magazine notes that a similar 2013 study from the University of Wisconsin came to the same conclusion -- states offering convenience voting don’t see much, if any, participation benefit.
In Pickens County, early voting totals are generally below election day totals. In the spring primary, 3,160 waited until election day, while 1,917 cast early ballots. The highest number to vote on any single early voting day was 243 (the last day of early voting), while several days saw only about 85 people voting early during the primary. Our local election office says in a presidential election, more people will cast early ballots (almost equal to the number of election day voters), possibly out of fear of lines on election day. It should be noted that there are 12 regular polling places open on election day and only one early voting location, meaning the lines can be longer for early voting than on election day, when we are spread out across the county.
Aside from the cost of running the early polls, there are real problems we see in the long voting period. The extra days throw off mobilization attempts by candidates and public groups. When do you run your best ads or most urgent appeal? When do you rally your troops? At the start of early voting or right before election day? If you wait until election day, then a sizeable chunk of ballots has already been cast and you miss those voters entirely.
At the same time, the extended voting window makes the election drag on too long, and voter apathy increases as people tire of hearing campaign news and ads every day from the start of early voting to election day. The constant cycle may leave voters ignoring campaign appeals entirely, with fewer people participating out of a backlash to the intended convenience.
A related problem arises when news develops between the time early ballots are cast and the final count. Once a vote is cast, it is cast. You can’t change it later if something truly earth-shattering comes to light. An example cited by James magazine: in 2016, GOP presidential hopeful Marco Rubio dropped out of the race a week before the Arizona primary, yet he still finished third because many people had already cast ballots.
The James article concluded by quoting the American University study, which found that the lack of voter participation is a real problem but “it’s not procedural, it’s motivational.”
We do recognize that many people need a couple of chances to get to the polls, but surely a week of early voting, including the Saturday prior to the election, is sufficient.
By Christie Pool
Last week I finished season two of Amazon’s Goliath, starring Billy Bob Thornton, and before the final credits had rolled I was already thinking: “How could I have just invested eight hours of my life watching this?”
Like many Americans, I spend some down time plugged in to Netflix and Amazon Prime and, of course, big time sporting events like the NBA finals and the FIFA World Cup. From sporting events to comedies to dramas, we Americans like our television shows. Critics and regular viewers say streaming television is where it’s at – more so than movies – for the smartest, deepest storytelling and most nuanced and morally complex characters.
But in this golden age of television it seems every show is engaged in a race to see which can have the meanest, sickest character.
Your main character skins his rivals like Ramsay Bolton on Game of Thrones? No worries, our main character beat a stroke victim to death with an aluminum chair (Scandal), and another show’s main villain shows off by beating one of the most popular characters to death with a barbed-wire baseball bat (The Walking Dead).
So when I realized, at the end of Goliath, that I really had sat through a show where one character had a fetish for amputated stumps and another enjoyed playing surgeon, chopping off the limbs of people who crossed him, I was aghast.
And to be honest, the entire premise of a show like that makes me wonder just what we’re all doing watching this twisted material and what effect it might have on us. Americans watch on average five hours of TV a day (that’s a lot). And this is the stuff we’re watching? The shows I reference are among the most popular, not something you have to seek out in the dark corners of the internet. Not too long ago, the shows that filled our screens considered it inappropriate to show even two married people sleeping in the same bed (I Love Lucy, The Dick Van Dyke Show).
How have we gone from Leave It to Beaver and The Andy Griffith Show to our current top shows like House of Cards, where First Lady Claire Underwood kills her own mother for voter sympathy? Game of Thrones is filled with so many despicable acts that it would take the whole paper to detail them.
Cop shows have always been popular, and someone had to be killed for Sherlock Holmes to have a case, but television is escalating its crimes into ever more elaborately gruesome and strange territory. Even on shows like CSI and Criminal Minds, it’s never just a serial killer. He has to be depraved in some outlandish way.
The best shows, the ones we want to commit to watching full seasons of, should be challenging, with great performances, snappy scripts and well-developed themes. We want compelling plots that develop naturally by putting characters into a difficult or interesting situation, then allowing them to behave authentically, like real people -- even the bad guys. Why does the drug dealer go to such elaborate lengths to torture people by playing surgeon?
When we flip on our TVs we want to be entertained, but we also want to feel connected and to fight for the hero. We love shows that inspire, characters who grow and mature or are crazy funny, and even the just plain bad guys.
So give us more funny, more genuine human drama and less torture and horror.
Thanks to the Georgia legislature, we are all soon to be at liberty to listen to rock and roll, ride with the windows down and enjoy the sights from our cars on warm summer evenings. (Boy, that is a compliment you never think of giving a government body.)
But come July 1, we can all turn up that Skynyrd or crank up the Big K.R.I.T. and listen undisturbed while we cruise home from work. Or you may be the type to take that special someone on a moonlight drive and want to talk to them, as opposed to someone on a cell – “Oh, I just got to take this one.”
And you may be surprised at what has changed in Pickens County since the last time you went for a spin – one where you checked out the scenery rather than social media.
What we’re talking about is the state’s Hands Free Georgia law, going into effect at the end of this month, requiring you to “keep your eyes on the road and your hands upon the wheel,” as Roadhouse Blues advises drivers.
When those powerful, stylish, gas-guzzling machines of the mid 1900s to the 1970s rolled out of a driveway, you didn’t call, didn’t sit with eyes glued to a little screen while the real world slid past. You cruised and you sure as heck didn’t spend the whole drive yakking on a little phone with your spouse about plans for the front yard.
Back when America was inhabited by real people, not a herd of social media sheep, winding out on the highway, you sat back and thought about life, or at least about what you were going to do when you got home. Reflective time is as close as many of us get to being philosophical, and it can’t be had while fielding calls from customers, bosses or co-workers about a project that really can wait until you get to work.
Work is where work happens, cars are where driving happens.
The fact that the state had to pass a law making it illegal to fiddle around with some computer-toy-phone while driving shows how much we’ve all been sucked into the online world. It’s unbelievable that our Georgia highway rules have to state that it is illegal to watch a video and drive a car at the same time.
“It’s become a habit we don’t think twice about since we have been talking on our phones while driving for more than three decades and it is going to take time for all of us to stop automatically reaching for the phone when it rings,” GOHS Communication Director Robert Hydrick said in a press release last week. “If you want to talk on your phone or use GPS while driving, now is the time to implement those measures so hands-free will become the instinctive thing to do.”
Better yet, don’t implement those things. Don’t do anything to compel you to keep on squawking while driving. In fact, tell people you want to make it illegal to talk on a cell phone in a restaurant or while waiting in line at the grocery store.
We encourage all our readers to use that time in the car to take a break from the constant contact that cell phones force upon us. Studies have shown that people develop automatic reactions when they hear that little ping indicating a new alert -- like Pavlov’s dogs starting to slobber at a sound. Give yourself a break; studies are also finding that more social media/digital communication may produce depression and anxiety.
While the state passed this law as a sorely needed improvement to highway safety, we hope it also has some beneficial social effects, with people regaining a sense of reflectiveness and sanity that follows them even when they aren’t behind the wheel.
By Angela Reinhardt
My father-in-law was at my house a few days ago, telling stories like he likes to do. He reminisced about a guy most people know as “Smiley.”
This “Smiley” character was the latest in a War and Peace-sized list of nicknames I’d heard him and others bring up over the years, so many that I wondered if anyone here went by their birth name. Monikers on the list include: Codeye, Birdhead, Tiny, Slapface (also referred to as “Ol’ Slapface”), Lager, Squirrel, Lambhead, Doodle, Sod, Brush, Red, Buzz, Hutch, and on and on. My father-in-law has two himself.
I didn’t recall the nickname phenomenon where I grew up, or within my own family for that matter. Was this a regional thing? A rural thing? And why don’t I have one?
The issue is explored in an episode of Cartoon Network’s Uncle Grandpa. Uncle Grandpa, a magical grandpa who travels the world helping people, lands his flying RV in a neighborhood and kids gather. He goes down the line excitedly high-fiving “Larry Picture,” “Blondie,” “Duck Head Tony,” “Spaghetti Legs,” and gets to the boy at the end.
“And who are you?” Uncle Grandpa asks.
“Eric,” the boy answers.
“No, I meant your nickname.”
Eric lowers his head despondently.
“Oh that. I don’t have one.”
The downtrodden Eric says he guesses he’s too boring to have a nickname.
Uncle Grandpa dishes out some wisdom, telling Eric he can’t just get a nickname.
“You gotta earn it,” he says. “You have to do something legendary.”
You also can’t give yourself one.
I agree some nicknames are the result of legendary acts (e.g. “The Great Bambino”), but the origins of a nickname are many and varied. They range from commentary on physical traits (“Curly” for someone with curly hair); to ironic nicknames for physical traits (“Shorty” for someone who is tall); to riffs on occupation (“Bones” for a doctor); and many others.
[Note: Some don’t count. Occasionally I’m referred to as “Ang” or “Reinhardt,” but variations on your name don’t cut the mustard].
While a legendary act might not be required to get your own, Uncle Grandpa is still onto something: the certain sine qua non that nicknames have.
In his book The Means of Naming, author Stephen Wilson says nicknames are common in small groups or communities, where they can represent a hierarchy of power in which the nickname, “more than any other type of personal name, reflects the social power the namer holds over the named,” or where they can “stigmatize anything uncommon – heritage, accent, appearance, attitudes.” But nicknames also exist in a friendly, affectionate sense, used respectfully to signal membership in a group.
All of this makes sense. I’d argue most people get their nickname in high school or college, or in tight-knit groups like the military. Nicknames are also decidedly a male phenomenon and seem to have a strong relationship to hazing.
Still, I’ve lamented not having my own. Maybe I’m boring like Eric. I wrote an article about biscuits last year and friends called me “Biscuit” a couple times, but it didn’t stick. Another friend called me “Tangelo” one night several years ago, but that one didn’t stick either.
I was discussing this at the office and a guy who helps us deliver papers on Wednesday (and who goes by ‘Big-D’) said, “Oh, no. We all call you ‘Rhino.’”
For a split-second I got excited at the thought of having a nickname I didn’t know about. I was in the club!
Then he told me, “No, not really.”
Oh well. But for all you lucky folks out there, all you Maddogs and Papa Smurfs and Bruisers, enjoy being part of a chummy subculture that I, and many others, may never know.
(really trying to make this one stick)
Anthony Bourdain got paid, very well, to travel the world eating unusual food. He redefined chef culture and was judged a very hip 61-year-old, amazingly fit from martial arts.
Kate Spade, 55, was described in her New York Times obituary as having an “accessory empire.” Empire is a strong word, but even people who couldn’t recognize her handbags might know her name as someone famous.
Both were rich. Bourdain’s mother said in his obituary, “Success beyond his wildest dreams. Money beyond his wildest dreams.”
Both had children: Bourdain’s was 11 years old; Spade’s was 13.
Yet both hanged themselves last week – ending lives that most would trade for in a split second. What dude wouldn’t want to be paid to eat and travel, and how many girls dream of fashion jobs?
The tortured artist is a stereotype for a reason. As early as 1897, poet Edwin Arlington Robinson penned a verse about super-rich-man-about-town Richard Cory who “went home and put a bullet through his head.”
Not making People magazine, however, is the fact that we commoners have been killing ourselves in growing numbers since the 2000s rolled in.
According to figures at both the CDC and World Health Organization websites, Americans have been killing themselves more and more often. Forty-five thousand Americans took their own lives in 2016. The suicide rate has increased more than 30 percent in half of the states since 1999.
While the United States often ranks among the worst in social problem statistics, with suicide that is not the case. The countries where it is most common are generally poorer ones; Sri Lanka and Guyana top the list. The most common method worldwide is poisoning with agricultural pesticide, if that gives some insight into the demographics.
In the United States, the most common scenario for self-inflicted death is by far a man with a gun. Men here are three times more likely to kill themselves than women, and guns account for about half of all suicides. Hanging and poison are the next two leading methods.
Suicide has become the 10th leading cause of death in the United States, according to a New York Times article on the rising number. It included experts who expressed dismay that the rise has occurred despite years of research and preventative programs.
The Centers for Disease Control reports that suicide is rarely caused by any one thing, and many of the people who die by suicide are not known to have had a mental health problem at the time. Bourdain apparently shocked friends by killing himself, but Spade’s relatives said she had battled depression for years.
The CDC found suicides are linked to relationship problems 42 percent of the time; problematic substance abuse, 28 percent; “general crisis,” 29 percent; job/financial problems, 16 percent; and physical health, 22 percent. These factors often overlap, and many cases involve a mixture of them along with other specific issues.
There is clearly something wrong in a world where killing one’s self is among the top 10 causes of death. It’s hard to know what can be done to improve these statistics. On a larger scale, two areas of modern life need to improve their efforts: the church and psychiatric medicine. At the bottom of the issue is a spiritual problem. People are hopeless. Whether from internal factors or what they see around them, people feel alone and miserable. More emphasis on community, and specifically the church family, is needed to address this rot in the modern psyche.
Secondly, as the results in the New York Times story indicated, the rise in suicides has come despite the fact that more Americans than ever take antidepressants -- some 15 million, triple the number on a regular prescription in 2000. Clearly the modern approach to mental health on a national scale is a failure.
On a personal level, there are plenty of websites with tips for spotting warning signs in loved ones, generally beginning with talk of suicide – don’t ignore it if you hear someone speak of it. Googling suicide prevention returns a bountiful number of resources.