Wednesday, December 23, 2015

The War on Christmas

How the Atheists stole Christmas...

Christmas time is the time of year when carols play through stores, tinsel adorns everything, twinkly lights cover trees instead of snow, and people have lots of silly accidents falling off ladders. It's a time when every church has a nativity scene outside and inflatable Santas start appearing everywhere. It's a time when family gets together to celebrate the year they have just had, eat good food, and exchange gifts.

However, around December, we also have people who do not embrace the spirit of this holiday. What is ironic is that these angry, argumentative people who do nothing but glare at others and complain are the very ones claiming there is a "war on Christmas". They turn purple in the face because someone says "Happy Holidays" instead of "Merry Christmas". They boycott stores because their winter displays do not specifically say "Christmas Display", and they make poor Starbucks employees call them "Merry Christmas" as a way to "get back at" the store for daring to offer an undecorated red holiday cup.

I can't count the number of people I have heard complaining about how we need to "put the Christ back in Christmas" and how this country is becoming so very "un-Christian"... Honestly, it's a little embarrassing to even be near these people when they start ranting about the lack of snowmen and reindeer in stores. And really, when was the last time a snowman was present at the birth of Jesus... Oh yeah, never. Because Jesus was born in the Middle East, likely nowhere near midwinter, and the only artiodactyl present was a camel, not a reindeer...

The world as a whole, and not just America, has been shifting away from obsessing over Christmas and trying to be more inclusive by remembering that there are other celebrations around the month of December as well. Hanukkah, Kwanzaa, St. Lucia Day, Boxing Day, Yule, and (in some years, since it follows the lunar calendar) Ramadan are just some of the many celebrations that can occur in December. There are more than ten different celebrations that take place around this time, some rooted in religions like Christianity, Judaism, and Islam. Others are more general and non-religious, celebrating the close of the old year, the rebirth of the new one, and the joy of a new time ahead. Most of these holidays center around gift-giving, family time, and a mysterious person sneaking around giving out coal to naughty children.

It is a harmless conglomeration of holidays that we all enjoy, and yet somehow the very fact that people are now recognizing more than just Christmas is offensive to some Christians. I love how they say that we are commercializing Christmas for profit and corrupting the spirit of the holiday in one breath, and then complain in the next that stores don't hang enormous "Merry Christmas" banners or run TV ads saying the same. The hypocrisy is real.

The celebration of Christmas was adapted from pagan winter solstice holidays to begin with. Christians took bits and pieces from other peoples' and religions' versions of the holiday, such as gift-giving, naughty vs. nice, decorated trees, stockings filled with little gifts, and so on. None of that has anything to do with the birth of Jesus. It all came from other people's holidays, and the Christians simply appropriated it. Now, I have no problem with that, but they can't get upset when others don't feel like following their celebrations. They can't force everyone else to do what they do, and they have no right to try.

I also find it interesting how Santa Claus has somehow become synonymous with what Christians like to call "their" holiday... I mean really, what do Santa, reindeer, snow, gift-giving, and decorated fir trees have to do with Jesus? And what does a move away from only recognizing the holiday of the fat bearded man in red have to do with a war on Christianity?

I really want to know how on earth any sane person could ever believe there is a "War on Christmas"... Really, all I have to say to those people is: Get over yourselves and grow the fuck up.

Sunday, December 6, 2015

Growing up Unconventionally

To be born to Immigrant Parents

To begin, I'll go over my background a little bit. My parents are foreign in that they were neither born nor raised in the US. My father was born in England to Polish parents, grew up between England and Poland, and lived much of his adult life in South Africa. My mother was born in South Africa, grew up there, and lived most of her young adult life there. My older sister was born in South Africa and grew up there until the age of six. My parents moved to the United States in 1995 because they were concerned about the rising rates of violence, particularly against white women, and did not want my sister or me (though I had yet to be born) to become a rape statistic. At the time, it was said that one in three white women in South Africa would be raped in her lifetime, and they wanted their children to avoid such a fate. And so they moved to the United States, and I was born that same year.

Now, I may have been born and raised in the USA, but my parents did not raise me to be an American. They raised me to be a worldly person, and they raised me as they themselves were raised, understandably. When other kids were watching American cartoons like SpongeBob SquarePants, Kim Possible, and The Powerpuff Girls, I was watching British programs like The Animals of Farthing Wood or National Geographic animal documentaries. When other parents were playing Elvis Presley in the house, mine were playing the Beatles, Queen, and classical orchestral pieces. When other parents were watching old westerns and war movies with John Wayne in them, my parents were watching what was popular in England and South Africa at the time.

Up until I began kindergarten, I had a very strong South African accent, because that is what I was surrounded by. I lost most of that accent gradually once I started school and was exposed to other American children. However, I was still not exposed to much "American culture". When I made my first friend, she was English. Her parents were from England, and she and her older sister had been born there. She was in the same boat as me, growing up the child of immigrant parents in a country of - what were to us - foreigners. We have been friends since we were 6 or 7, and for a very long time best friends, until geographic separation disrupted that.

At the end of 4th grade, my parents removed me from school and began homeschooling me, because they knew they could do as good a job of it as the school system, if not better. We followed the requirements of the homeschooling association, so I did learn many of the things American children did; however, we mainly focused on what my parents had learned growing up. I learned world history: the Greeks and Romans, the Egyptians, the French and the English. When I learned about the World Wars, I didn't just hear about how heroic America was for joining at the last minute and helping end them. I learned about how the European Allied forces struggled for years to stay afloat and to prevent the spread of Hitler's armies. I learned about the contributions of various countries and how the war affected Europe and the Europeans who were closest to the conflict zones. I didn't learn endlessly about how brave and selfless America was, or how the housewives suffered because their men were not there to help them. I did learn about those things, but the focus was on the grand scheme of events, not just on America's contribution. When we studied geography, I learned WORLD geography and did not have to memorize the capital city of each and every state. My parents didn't think it was as useful to know the capital of Mississippi, a place I would likely never go, when knowing that there were countries out there called Australia, Japan, and Madagascar was probably going to be more helpful in my future. Knowing the difference between Australia and Austria, or Sweden and Switzerland, and learning about the various languages, cultures, and people in those places was important to my future and my knowledge of the world.

I had a very well-rounded education, but not much interaction with pop culture, American music, or the American cartoons and TV shows popular at the time. My parents listened to NPR on the radio constantly and watched BBC World News, so that is what I listened to and watched. I had a thoroughly foreign upbringing, and I am endlessly grateful for that. I know more about the world, its views, and its issues than many of my "peers". I understand more about foreign politics and issues that go beyond the borders of the country I live in than even many adults I meet. I understand the consequences that the government I live under will have upon the world. I can think beyond myself and my country, and for that I am grateful.

And yet I have always been picked on, bullied, or teased about where I come from and what I know. When I was younger, it was about music and the fact that I didn't know who "Fall Out Boy" were. When I lived in Australia, I was teased about my "American" accent. When I came to college, it was about the TV shows I didn't know, or the actors I had never heard of, like John Wayne.

And I am sick and tired of having people criticize me for my lack of knowledge of all things American and wonderful when they cannot take it in return. I am always astounded that people don't know about Maggie Smith or Dame Judi Dench... or British TV shows like Miss Marple, Midsomer Murders, and Masterpiece Theatre... or movies like Keeping Mum and Death at a Funeral (the original, not the horrible American remake). And yet, if I dare to voice how astounded I am, all I get is... "Well, it's not American, so why should we know it?"

The world does not begin and end with the blessed USA. There is more out there than the portion of North America that houses the United States. There is more land, more political systems, more music and culture and film and literature, more human beings out there. And yet I am always the odd one out, the less knowledgeable, the less aware, simply because my specialties and likes and dislikes and humour are not American.

In the past I have had to deal with this type of ostracism because I do not share the same likes as my peers, and in the past I have ignored it, moved on, and been the bigger person for the most part (excluding my obsession with proper pronunciation). But the past is not now, and I am not the same little girl who had to put up with people being mean to her.

Why do my tastes have to be wrong? Why does my knowledge have to be irrelevant? Why does what I like always have to take a backseat to those around me? Why does no one ever ask if I could show them what I like and let them decide if they like it too? Why, oh god fucking why, am I always the one who is weird when I act surprised that those around me don't know some of the biggest names in foreign film and TV?

And the argument is always that I was born here, in the US, so I should know all this stuff and love America and its culture. Because I was BORN here. I wish I hadn't been. I wish my mother had gotten pregnant a few months earlier so I could have been born in South Africa. Then at least I'd have the excuse that people seem to need: that I WASN'T born here. Maybe then I'd catch a break and people wouldn't hound me for not knowing what they know. Maybe then people would actually be willing to try to understand why I'm so different. Maybe then they'd actually care.

I am sick and tired of being treated as strange just because I had a different upbringing and foreign parents and was raised to take more of an interest in world media than in American media. It hurts to have people I think of as friends belittle and disregard me and my upbringing because it wasn't the same as theirs, because it wasn't fucking American. So I just don't give a damn anymore.