By Benjamin J.N. Harrison
Jews are among history's most persistently persecuted peoples; perhaps no group has been so consistently vilified as the world's Jewish community. Unfortunately, hatred of Jews is a tradition so ancient and widespread that it has entrenched itself in civilization as a grim constant of human culture. Jews have been blamed for many calamities, among them the spread of disease, an accusation that led to one of Europe's bloodiest and most unforgiving pogroms: the Strasbourg Massacre.
In the mid-14th century, the first major European outbreak of the bubonic plague's second pandemic was peaking and was already responsible for the deaths of millions across the continent. By 1348, pogroms against Jews had begun in European cities, the first in Toulon, as many Christians believed Jews to be the cause of the pandemic. Although the reasons for these accusations are not certain, they likely stemmed from the general Christian detestation of Jews at the time; Jews were viewed in a deeply negative light, accused of killing Christ or plotting world domination, and feared for refusing the Catholic Church's teachings. Moreover, the Jewish community in Europe was less affected by the plague and was thus falsely blamed for bringing it upon Christian Europeans by poisoning wells and food, although this relative immunity was in fact largely due to Jews of the time living in isolation in ghettos.
The Strasbourg Massacre, then, was one in a string of pogroms that took place from 1348 to 1351 in Western Europe; thirty occurred in the Alsace region alone, in which Strasbourg is situated. The massacre began on 14 February 1349, before the plague had even reached Strasbourg, when the city's burghers, convinced the Jews had caused the plague, erected a scaffold in the Jewish cemetery, where the Jews were burnt alive. Although city policy guaranteed protection for the Jews, they were forced to choose between being burned, expelled from the city, or baptized. Low estimates place the number of Jews killed at 250 to 300, but some higher estimates suggest that as many as 2,000 Jews were burnt alive during the massacre. In the end, the entire Jewish community of Strasbourg was eliminated.
Civilization’s history of anti-Semitic violence must not be forgotten. While it is comforting to imagine that today’s world is one of peace and harmony, anti-Semitism persists: the 2018 Pittsburgh synagogue shooting and the 2017 Unite the Right rally in Charlottesville are two recent instances of anti-Semitic hatred and bigotry, and Louis Farrakhan, leader of the Nation of Islam (NOI), continues to spread venomous hate and baseless conspiracy theories about Jewish people and their intentions. Together, the citizens of the world can work to make anti-Semitism a thing of the past, where it should be locked away forever.
By Benjamin J.N. Harrison
For many, hummus b’tahini (the full Arabic name for hummus), a dip of ground chickpeas mixed with sesame paste, garlic, and olive oil, is appreciated solely for its gastronomic merit; however, a deep political and cultural divide in the Middle East has the dish at its epicenter.
The seemingly never-ending dispute over hummus between Israelis and Arabs concerns the origins of the dish; while the earliest recipes for a dish resembling hummus b’tahini appear in 13th-century Cairo cookbooks, as well as in the Hebrew Bible, its precise geographic and historical origins are difficult to determine, and the issue is thus largely open to speculation. In the 21st century, the hummus quarrel rests on questions of identity and patriotism. Countries claiming ownership of the dish include Greece, Israel, Jordan, Egypt, Palestine, Turkey, Syria, and Lebanon; perhaps the most notable international dispute over hummus was the inconclusive battle between Israel and Lebanon.
2008 marked the beginning of the infamous ‘hummus wars’, when Lebanon accused Israel of culinary imperialism for its widespread adoption of hummus b’tahini. Worried that hummus would become known not as a Lebanese but as an Israeli dish, the Association of Lebanese Industrialists took legal action and sought protected status from the European Commission for hummus as a uniquely Lebanese dish. While many in Israel dismissed this as trivial and ridiculous, Lebanon hastened to point out that feta cheese had been granted protected status as a Greek product in 2002, and that its attempt was analogous.
The hummus wars went quiet until 2010, when the dispute metamorphosed into something even more ludicrous: Israel and Lebanon decided to settle the battle by competing over which nation could make the largest dish of hummus. In January of that year, fifty Israeli chefs gathered in the village of Abu Gosh to mash roughly nine thousand pounds of chickpeas in a six-meter satellite dish, beating the then-current Guinness World Record. Lebanon, however, was not one to ignore such a challenge; in May of 2010, some three hundred Lebanese chefs prepared a batch of hummus b’tahini weighing in at 22,046 pounds, taking the Guinness World Record from Israel. According to local news outlets, the recipe included eight tons of boiled chickpeas, two tons of tahini, two tons of lemon juice, and seventy kilograms of olive oil.
Have the hummus wars truly ended? Probably not. Was there a clear winner? Not necessarily. Will Israel and Lebanon battle over culinary hegemony in the future? Perhaps. Amidst all these unanswered questions, one can at least be sure of the wars’ positive outcomes: (a) a true moment of peaceful Arab-Israeli competition and (b) some 30,000 extra pounds of hummus.
By Benjamin J.N. Harrison
Introduction & History:
The year was 1932; the location, Macon County, Alabama, home to a large population of poor and illiterate African-American sharecroppers. The U.S. Public Health Service (USPHS) had just initiated an experiment in collaboration with Tuskegee University, following results from a Rosenwald Fund survey indicating high rates of syphilis among African-American males in certain southern counties, especially Macon County, where 36% of the black population was afflicted. The experiment, led originally by Taliaferro Clark, initially sought to follow untreated syphilis in a group of black men for a short period before following up with a treatment phase. However, the intentions of the study soon became distorted by perceptions of different, and often inferior, physiology in black males relative to white males; it was commonly believed that the black male had an underdeveloped nervous system and would therefore suffer not the neurological damage of syphilis but the cardiovascular damage. The study was no longer pursued to explore the frontiers of biomedical and biopathological science, but rather to push a racially centered agenda and to exert hegemony, as will be demonstrated. For the next forty years, until 1972, the Macon County men who participated in the study would be treated not as patients but as mere experimental subjects: cadavers of sorts that had not yet died.
The USPHS selected six hundred African-Americans to participate in the study; two hundred and one were identified as non-syphilitic and formed the control group, while the other three hundred and ninety-nine were identified as syphilitic and comprised the study group. Those selected were eager to participate, as they were promised treatment for their “bad blood” – a loose colloquial Southern term for various conditions such as anemia, syphilis, and fatigue – which, as will be discussed, they were constantly denied. Furthermore, the subjects were promised coverage of their funeral expenses, should they allow an autopsy, as well as occasional free meals and free transport to the treatment center. And so, with willing but uninformed consent, six hundred African-Americans of Macon County, Alabama began the next forty years of their lives as constantly deceived, increasingly sick, and profoundly mistreated walking cadavers.
From a rigorously scientific point of view, the experiment was deeply flawed from the beginning; what was supposed to be a study of untreated black males began with the administration of a small dose of what was then a therapeutic for syphilis: a compound combining mercury, arsenic, and bismuth. Although largely ineffective, this had the potential to distort the results of the study. Furthermore, many of those in the control group contracted syphilis over the experiment’s duration and were simply transferred into the study group.
Although the study was initially meant to last a mere six months, it persisted for forty years, from 1932 to 1972. This was due to the perceived need for autopsy; “we have no further interest in these patients until they die”, said Oliver C. Wenger, a physician working on the study. And so, the study would continue indefinitely until every untreated patient died. Over these forty years, several major papers were published in medical journals, the first in 1936. This first paper was criticized for its lack of clarity concerning treatment for the subjects, but the experiment was generally regarded in a very positive light, largely because its fine details were hidden from the medical community.
As mentioned earlier, treatment, although promised, was denied to the study’s subjects. For example, during World War II, many of the subjects registered for the draft. However, the USPHS made efforts to keep these men out of the military, as there they would be identified as syphilitic and treated. Furthermore, when penicillin was established as an effective treatment for syphilis in the late 1940s, the patients of the Tuskegee experiment were not offered the antibiotic, and measures were taken to ensure that local Macon County doctors did not prescribe it to them.
The physicians did many things during the study to create the guise of progress. For example, to encourage consent for painful, potentially dangerous, and therapeutically useless spinal taps intended to identify signs of neurosyphilis, the doctors sent the syphilitic patients a misleading letter titled “Last Chance for Special Free Treatment”. Furthermore, the patients were administered placebo treatments, as well as “pink medicine”, which was merely aspirin.
By the early 1970s, a USPHS venereal disease investigator named Peter Buxtun went to the press with his concerns about the immoral practices of the Tuskegee syphilis study. Needless to say, the story became front-page news, and the study was terminated in 1972, due largely to public opinion. By this point, twenty-eight of the original three hundred and ninety-nine subjects had died directly of syphilis, while another hundred had died of syphilis-related complications. Moreover, forty of the syphilitic patients’ wives had been infected, and nineteen of their children had been born with congenital syphilis.
The Tuskegee Syphilis Experiment sparked debate about ethics in biomedical research with human subjects; and rightly so, as there was a clear neglect of ethical considerations throughout the study’s duration.
Firstly, the Tuskegee Syphilis Experiment violated legal and scientific norms concerning the treatment of syphilis. In 1927, an Alabama law was established mandating treatment for those afflicted with syphilis, a law the study simply disregarded. Moreover, even at the beginning of the study, most major medical textbooks recommended that all cases of syphilis be treated, as the consequences of the disease were serious.
The Tuskegee study also ignored many of the medical principles that emerged over its duration. For example, in 1946, while the experiment was underway, the American Medical Association (AMA) judicial council issued a report establishing the principle of voluntary consent. Furthermore, in 1964, the World Medical Association issued the Declaration of Helsinki, which placed informed consent at the basis of biomedical research. Now, while many may argue that the participants of the Tuskegee study did consent, it was misinformed consent. The participants, who were largely illiterate and easily persuaded, were not informed of the nature of the study and were promised treatment, which they were instead constantly denied. In other words, the participants consented to one thing but got something completely different.
The Effects of the Tuskegee Experiment & Conclusion:
While the Tuskegee experiment was an overwhelmingly negative moment in medical history, it brought forth one major positive change in medicine: stricter ethical guidelines for biomedical research involving humans. At the beginning of the Tuskegee experiment, biomedical ethics were not at the forefront of discussion; but the outrage of the 1970s was followed by heated debate on exactly that topic. In 1974, Congress passed the National Research Act, which created the commission that produced the Belmont Report, establishing guidelines for research involving human subjects and seeking to oversee and regulate human experimentation in medicine. These, one cannot deny, are very positive outcomes.
However, one must also be reminded of the profoundly negative effects the Tuskegee experiment brought forth. Firstly, it damaged many families; as mentioned before, the end of the experiment saw many patients dead, forty wives infected, and nineteen babies born with congenital syphilis. Furthermore, many argue that the Tuskegee experiment led black Americans to develop a lasting distrust of clinical medicine. The experiment also damaged the reputation of Tuskegee University, which had generally been perceived as a progressive and respected institution. To repair some of the damage done, attorney Fred Gray initiated a lawsuit on behalf of the patients and achieved a settlement of $10 million, along with medical treatment for the seventy-two surviving participants.
In summation, the Tuskegee Syphilis Experiment was a devastating and inhumane experiment that did irreversible damage to the black community of Macon County, Alabama and sparked debate about biomedical ethics concerning studies that used human subjects. Perhaps President Bill Clinton put it most eloquently when he formally apologized for the event in 1997:
“To the survivors, to the wives and family members, the children and the grandchildren, I say what you know: No power on Earth can give you back the lives lost, the pain suffered, the years of internal torment and anguish. What was done cannot be undone. But we can end the silence. We can stop turning our heads away. We can look at you in the eye and finally say on behalf of the American people, what the United States government did was shameful, and I am sorry.”
By Ben J.N. Harrison
The first religion to embrace monotheism was not Judaism, but rather a short-lived faith of Egypt’s Eighteenth Dynasty, with the heretic Pharaoh Ikhnaton as its most fervent proponent.
After a life of hedonic pleasures and luxury, Amenhotep III died in 1380 B.C., leaving his legacy as a bringer of architectural prosperity and general stability. With his reign over, it was time for his son, Amenhotep IV, soon to be known as Ikhnaton, to take the throne and initiate one of the most troubling and outrageous periods of Egyptian history.
Much controversy exists amongst Egyptologists as to whether Amenhotep IV began his reign when Amenhotep III died, or if they ruled together in co-regency before the latter’s death. Certain Egyptologists believe Amenhotep IV unequivocally ruled for eight to twelve years in co-regency with his father, but others remain dubious. For the sake of this paper, it shall be asserted that Amenhotep IV and his father co-reigned for at least eight years, an assertion based on the findings from The Egyptian Ministry for Antiquities in February 2014.
Upon coming to power alone, Amenhotep IV began to abhor the religion of the god Amon and its practices. This ardent dislike was exacerbated by the large harem in the great Karnak temple, supposedly composed of Amon’s many concubines. A strong believer in fidelity, Amenhotep IV was outraged.
It wasn’t, however, until Amenhotep IV’s fifth year of rule that he decided to take action against the prevalent religious conservatism of Egypt. It all began with a mere name change, from Amenhotep IV to Ikhnaton, a title that translates roughly as ‘Aton is satisfied’. This unprecedented change of name was indicative of revolution. Shortly after, Ikhnaton declared that all other gods were mere trivialities, and that there was but one god: Aton.
The nation of Egypt was witnessing a turning point in world history: the first monotheist. Before Moses, before Isaiah, there was Ikhnaton. In a region as conservative and polytheistic as Egypt, this was confounding.
But Ikhnaton found great delight in Aton, god of the sun. He found divinity above all else in the great shining solar disk, the source of all earthly life. During his reign, he composed zealous hymns, passionate songs, and elegant poetry, all dedicated to Aton. Of his poems, the Great Hymn to the Aton is one of the most beautiful and splendid pieces of surviving Ancient Egyptian literature. In it, Ikhnaton praises Aton ardently, declaring him not merely god of the Egyptian people, but god of every nation equally. In a time when each nation had its own tribal deities, this was radical. Moreover, he states that ‘there is no other that knoweth thee’, asserting that he understands better than anyone else the love and glory that Aton intends for all to enjoy. Ideas such as these solidified Ikhnaton’s position as a true revolutionary.
Not only was Ikhnaton’s introduction of monotheism radical; the kind of god he praised was itself radical. In a nation where most gods could simply be erected as statues, Aton was more abstract. In an almost pantheistic fashion, Aton was to be found in all forms of life and growth. Furthermore, Aton was not limited to human form; rather, his ultimate power lay within the heat of the sun, and the setting and rising orb was merely an emblem of Aton’s true divinity.
While Ikhnaton may have been a visionary who apparently desired unity, he was not satisfied with a slow integration of his religion. He supposedly – and there is much debate about this – gave orders that the names of all gods except Aton be effaced and chiseled from every public inscription in Egypt. Furthermore, he shut down the old temples of the other, lesser gods: Osiris, Maat, Isis, Ptah, and the rest. While Ikhnaton certainly desired that his people love Aton as much as he did, happiness could simply not be obtained when no god but Aton was permitted to exist. Artists tried to participate in the love of Aton by building him statues, but Ikhnaton forbade this on the grounds that Aton was formless.
Ikhnaton then moved from Thebes, the religious capital of Egypt, to a place known today as Tel el-Amarna. In doing so, he disrupted not only religion but the very tradition of kingship, moving the capital of Egypt two hundred miles north to a barren, lonely stretch of desert. There he built his beautiful new capital of Akhet Aton, which translates as ‘horizon of the Aton’. In this newly built city, Ikhnaton neglected his domestic and foreign duties and instead devoted himself to his role as high priest. This neglect left domestic tax collection at an all-time low, invited invasion by the Hittites, halted the working of the mines, and emptied the Egyptian treasury. Needless to say, Egypt was quickly falling into ruin. The nation was disgusted and quietly awaited Ikhnaton’s death.
Two years after Ikhnaton’s death at thirty years old in 1362 B.C., his son Tutankhamen ascended the throne and brought Egypt back to its traditional form. He returned to Thebes, abandoning Akhet Aton, made his peace with the priesthood, and announced a restoration of the ancient gods. He commanded that the names Ikhnaton and Aton be erased from all monuments. It became illegal to utter Ikhnaton’s name, and he was instead referred to as ‘The Great Criminal’.
While hatred of Ikhnaton was widespread, his legacy as the first monotheist still echoes today. The Egyptologist James Henry Breasted deemed him ‘the first individual in human history’, the first person in recorded history to stand out as a distinct personality. Before Judaism, Islam, Christianity, and the other Abrahamic faiths, there was that which Ikhnaton preached. Though his faith did not survive, its mark remains on the sands of time.
In the early 1900s, Steel Pier in Atlantic City, New Jersey opened its first diving-horse exhibition. The act's inventor, William "Doc" Carver, had conceived of it after a bridge he was crossing in North Platte, Nebraska collapsed and his horse plunged into the water below. Building on the idea, Carver, in partnership with Al Floyd Carver, began touring the country, stopping at Hanlan's Point Amusement Park in Toronto, Canada in 1907 with his rider Lorena Carver, who would dive with horses from towers of up to 60 feet. Later, in 1924, Sonora Webster joined the show with her horse Red Lips. The High Diving Horses became the highlight of the Water Circus at Steel Pier, with horses diving three to four times a day. One would assume the horses and riders were injured repeatedly, but surprisingly this was not the case; the only injury during the show's run was the blinding of Sonora Webster in 1931, after she and her horse lost their balance on the platform before a dive. She continued diving blind, and in 1991 the film Wild Hearts Can't Be Broken was released, based on her life. The shows drew strong criticism over animal welfare, and their popularity declined after World War II amid allegations that prods, electrical jolts, and trap doors were used to force unwilling horses to dive. Finally, in the 1970s, horse-diving shows ceased to exist, though a few years ago an attempt was made to revive them at Steel Pier. It was quickly halted, with the president of the Humane Society of the United States saying, "This is a merciful end to a colossally stupid idea."
The Second World War is known to have been dominated by modern technology such as machine guns and artillery fire. Yet one man pushed all that aside and donned a basket-hilted sword, a longbow, and bagpipes to attack the enemy. Lieutenant Colonel Jack 'Mad Jack' Churchill, a soldier in the British Army, had been an actor before the war. When war broke out, he enlisted and soon became known as a 'madman' who deemed modern technology unnecessary, opting instead for kit out of the Napoleonic era. His motto, "Any officer who goes into action without his sword is improperly dressed", was viewed as outlandish in a time when artillery, tanks, and machine guns won battles. Yet he proved otherwise. Known as the only man in WWII to have felled an enemy with a longbow, Mad Jack lived up to his nickname. During Operation Archery, a raid on a Nazi garrison in Norway, Churchill was on the first landing craft. As the doors fell, he charged forward onto the field, playing 'March of the Cameron Men' on his bagpipes, before throwing a grenade and vanishing from sight. He quickly cut down the men unfortunate enough to cross his path. The raid's effect was enough to persuade Hitler to divert a large share of his troops to Norway, contributing to many successful raids elsewhere in Europe. Throughout his career, he was captured while playing his bagpipes, knocked unconscious by explosions, and even sent to a concentration camp, only to escape shortly after and walk 150 km to Verona, Italy to rejoin the fight. Churchill left his mark on history not only through his courageous actions but with his words; after the war ended, he remarked, "If it wasn't for those damn Yanks, we could have kept the war going another 10 years."