Opinion: Why I Love Wilson’s Creek Battlefield

Yesterday as I perused my Twitter feed (@Hoghighlander for those who want to follow for more great history content!), the anniversary of the Battle of Wilson’s Creek was trending. Many of the historians, bloggers, and podcasters I follow were posting about the battle, its outcomes, its significance in the American Civil War, and the central character who died while leading the Union army, Brigadier General Nathaniel Lyon. Some Civil War historians have overlooked this small battle (small from a military statistics perspective when compared to Gettysburg, Chancellorsville, Chickamauga, Stones River, and others). However, its significance lies more in the impact it had on the Midwest and the Trans-Mississippi Theater. Missouri was precariously situated between free and slave state supporters, and there was a race to tip the balance and solidly secure the state. On August 10, 1861, Confederate soldiers from Arkansas and Louisiana, along with the Missouri State Guard (pro-Confederate Missouri soldiers), commanded by Sterling Price and Benjamin McCulloch, were attacked by a smaller Union army led by Nathaniel Lyon and Franz Sigel. The fierce fighting carried on for eight hours, but early on, a bullet ripped through Lyon’s chest, killing him almost instantly. By 4:00 pm, Union forces had pulled from the battlefield and left the nearby town of Springfield to the Confederate army. However, due to the losses the Confederates suffered at Wilson’s Creek, Price and McCulloch were split on how to proceed. Price wanted to pursue the Union army further north, but McCulloch wanted to remain close to Arkansas to maintain his supply lines. Springfield would occasionally shift allegiance, but Lyon’s determined stand would later help cement Missouri for the Union.

This post isn’t about the battle itself (for a great discussion on the background, action, and aftermath of Wilson’s Creek, listen to this wonderful podcast from the Civil War Breakfast Club: Battle of Wilson’s Creek-CWBC). Instead, I want to explain why I love visiting this battlefield, now a national park. The park was created in 1960 with a small visitor center and some museum displays. While the park preserves only 1,750 acres, there’s a lot of natural and historic beauty in those acres. I was born in Springfield, Missouri, less than 15 miles from the battlefield, and it was one of the first national parks I ever visited as a child. My fiery passion for history was stoked by frequent visits, gift shop coloring books, re-enactments, and moonlight tours where actors portrayed various personalities in the battle’s aftermath. It didn’t dawn on me until high school that the actor playing the Union chaplain was my high school history teacher, Mr. Elkins, who worked part-time for the National Park Service (lucky dog).

Split rail fencing is a common sight at Wilson’s Creek Battlefield. Many volunteers and park employees have painstakingly recreated the fences as they would have appeared back in 1861.

The museum underwent some amazing updates recently: new exhibits, historical items, an upgraded fiber optic map of the battle (that was my favorite attraction as a kid and it still is today), and an expanded Civil War research library. Anyone who wants to research the Trans-Mississippi Theater and the war in Missouri must visit this library and take advantage of all the resources it offers. Upon passing through the gate, you come up to the first Confederate encampment and the small farm buildings run by the Sharp and Ray families, who lived there when the battle broke out. The stunning rolling hills of corn and wheat are quite a sight in the fall. As a kid, I often imagined the two sides thrashing one another, even when I came to see historical re-enactments. The billowing smoke and bayonets shining in the hot August sun; it’s hard to forget such an impression.

The park may be small, but damn is it chock full of amazing things.

The Ray House is the only original building on the battlefield, featuring much of what would have been in the house in 1861. In fact, the bed frame there now is the same one used to lay out General Lyon’s body. The house served as a field hospital treating both Union and Confederate troops, and during the fight, the family hid in an underground cellar. The house is a popular stop on tours and is the centerpiece of the park’s moonlight tours. When you walk through and see people in period dress, bloody rags lying everywhere, and screaming men trying desperately to escape the pain, you feel as if you were right there in the thick of it. You’re transported back to that warm, humid evening of August 10, 1861.

The Ray House is meticulously maintained to preserve its original condition. There are some modern features like climate control to preserve the artifacts, but visitors can see what the house was like during the Civil War.

What really draws me to Wilson’s Creek are the vast stretches of fields and forests, all so well maintained. Underneath it all, though, is a bloody historical narrative. Missouri witnessed intensely savage fighting during the war years, with bands of roaming guerrillas and bushwhackers slashing at each other. The social and political divisions here ripped families apart, and vendettas scarred relationships for decades after. Being a farmer in Missouri back then was almost riskier than being a Union or Confederate soldier; you didn’t know if you would die today, or by whose hand.

Finally, as you wind around the one-way roads, you make your way up a steep grade. Whenever I rode my bike, this was a struggle that ended with me dismounting and pushing the bike up the hill. But to the Civil War aficionado, this is the climax of visiting Wilson’s Creek: Bloody Hill. The bulk of Lyon’s army was situated on the hill, controlling the high ground. They repelled four separate Confederate assaults while artillery pounded their lines, trying to dislodge them from the high ground. Lyon himself led one charge, which unfortunately cost him his life. Today a concrete marker stands in the spot where veterans say he fell.

The Lyon Marker sits at the bottom of Bloody Hill. While it’s quite a hike to get there, you can’t help but experience the transcendent feeling of standing where men died, their remains quite possibly right under your feet.

Every few years, park employees or visitors find artifacts in the battlefield grounds. Stories are still popping up about people whose ancestors fought or died at the battle. In a recent discovery, I learned that my 5x great-grandfather Presley Beal was responsible for building a makeshift coffin for General Lyon in order to transport his body back to its final resting place in Connecticut. Who would have known? My Wilson’s Creek connection just got stronger. Even now that I live in St. Louis, I still try to visit the battlefield whenever possible. The draw is undeniable. The scenery is beautiful, the history is rich, and the people who keep it open for public enjoyment are the most endearing and educated history keepers I know. Wilson’s Creek will always hold a special place in my heart as I continue to travel the country seeing historic places. No matter how far I go, I’ll always know right where to come back to: a small, winding creek in southwest Missouri where the birds sing, the wheat shines, and the soil gives up the dead and tells a story of our nation’s struggle and reconstruction.

For more information about the battlefield and park, visit the NPS website: Wilson’s Creek National Battlefield

Protecting the Tiger: The Korea Defense Service Medal

The United States Armed Forces has installations around the world and partners with critical nations for their national defense. After World War II, we created a special command for the Far East, we maintain a massive presence in NATO and Western Europe, and our Navy criss-crosses the globe. At the close of the Korean War, the armistice signed on July 27, 1953 may have ended the actual fighting, but no formal peace has ever followed. Because of this, the U.S. has maintained a defensive garrison in South Korea. United States Forces Korea (USFK), part of the larger Indo-Pacific Command, oversees the combined command with the Republic of Korea Armed Forces and conducts a series of military training exercises and humanitarian missions. Over 28,000 U.S. troops are stationed in South Korea at any given time.

Lieutenant General William Harrison and General Nam Il signing the armistice at Panmunjom. ROK President Rhee refused to sign the armistice and no formal treaty has been ratified between the two nations (Image courtesy of Department of Defense)

For fifty years, South Korea was another nation in the larger geopolitical defense policy of the U.S. and a less than desirable posting. In 2002, service members finally began receiving recognition for their contributions in South Korea with the creation of the Korea Defense Service Medal (KDSM). Signed into law by President George W. Bush, the KDSM is awarded to any service member who serves at least thirty consecutive days in South Korea or sixty non-consecutive days. If someone is wounded by enemy combatants while in South Korea, they automatically receive the award, regardless of time overseas.
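For readers who like to see rules written out explicitly, the day-count criteria above can be sketched as a small function. This is purely an illustrative sketch of the criteria as described in this article (the function name and parameters are my own invention, not any official tool):

```python
def kdsm_eligible(consecutive_days: int, total_days: int,
                  wounded_by_enemy: bool = False) -> bool:
    """Sketch of the KDSM criteria described above: thirty consecutive days
    in South Korea, sixty non-consecutive days, or wounded by enemy
    combatants (in which case no minimum time applies)."""
    if wounded_by_enemy:
        return True  # automatic award regardless of time overseas
    return consecutive_days >= 30 or total_days >= 60
```

A tour of a single month would qualify under the consecutive-day rule, while several shorter visits totaling sixty days would qualify under the non-consecutive rule.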

Under the award criteria, any veteran who was stationed in South Korea after July 28, 1954 may receive the KDSM. Within this period, a veteran who served in Korea between October 1, 1966 and June 30, 1974 can also qualify for the Armed Forces Expeditionary Medal. This was in response to the Korean DMZ Conflict of the late 1960s.

The Korea Defense Service Medal (KDSM). Only one medal is issued no matter how long the service; no oak leaf clusters, service stars, or other appurtenances are authorized

The significance of this medal isn’t only in recognizing overseas service; it’s a reminder of the legacy of the Korean War. The status quo that has held for over sixty years may continue for decades more as the two Korean nations remain divided at the 38th parallel. The U.S. remains a staunch ally of the South Koreans, and the KDSM signifies our perpetual commitment to the Republic of Korea.

Contact and Brawls: The Combat Action Ribbon

When the general public looks at a veteran, how can they tell that they’ve served in combat, short of asking them directly? The U.S. Army has the Combat Infantryman Badge and Combat Action Badge, and the U.S. Air Force has the Combat Action Medal, but the focus of this article is on the award given to members of the U.S. Navy and Marine Corps: the Combat Action Ribbon. Every Marine and sailor knows the significance of having that ribbon on their rack. This coveted ribbon is awarded under some of the most stringent criteria and is simultaneously one of the most retroactively issued awards in the U.S. Armed Forces.

The Navy and Marine Corps Combat Action Ribbon

The Combat Action Ribbon (CAR) was established on February 17, 1969 by the Secretary of the Navy. The criteria set by the Department of the Navy require bona fide evidence that the member engaged in direct combat with an enemy. Not only must a person have been in combat, but they must have acted satisfactorily, i.e. not surrendered or disobeyed orders from commanding officers. The CAR is normally awarded to ground troops or sailors stationed aboard ships, but not aircrews. The Navy and Marine Corps provide Strike and Flight Numbers on the Air Medal to denote combat operations, but some aircrew members can still receive the CAR at the discretion of the Secretary of the Navy. [See Aerial Heroism article for more information about the Air Medal]

For an award like this, it was to be expected that many Navy and Marine Corps veterans would want to verify their eligibility. Initially the award was made retroactive to 1961 to accommodate those serving in Southeast Asia and other special operations around the globe. In October 1999, Public Law 106-65 shifted the retroactive date to 7 December 1941. This allowed World War II and Korean War veterans to apply for and wear the CAR. But how do the Navy and Marine Corps determine entitlement during those conflicts? Fortunately for the veteran and the NPRC reference technician who researches the service record, massive ledgers and rubrics contain the movements and engagements of every ship and ground unit since World War II. Those are broken down further to specific locations and cross-referenced with a veteran’s service record. If the veteran was attached to a unit or ship that saw combat during their time frame, they are eligible for the CAR. Since over four million sailors and Marines served in World War II and Korea, applications for the CAR are some of the most common requests among Navy and Marine Corps awards.

Where is the Coast Guard in all of this? Historically the Coast Guard has followed the same pattern as the Navy, especially when it comes to awards. Coast Guard members attached to units that saw combat were eligible to receive the Navy CAR. It wasn’t until 2008 that the Department of Homeland Security created a separate Coast Guard CAR. The majority of CGCARs have been issued for the Vietnam War, when servicemembers served in the ‘brown water navy’ patrolling the Mekong Delta in South Vietnam.

The Coast Guard Combat Action Ribbon

As per an agreement between the NPRC and the service branches, the Department of the Navy verifies the service record information provided by the NPRC and determines whether or not an individual receives the CAR. If all the specific criteria are met, they receive it. However, as with many other awards, there can be some grey areas. Simply being in a theater of operations doesn’t ensure entitlement. Many Navy veterans of World War II who served in the Pacific qualify only if they participated in certain operations, such as the Battle of Leyte Gulf or the long island-hopping campaign toward the Japanese home islands. Details matter when it comes to the CAR. Veterans of combat deserve to be recognized for their actions, and the CAR does just that.

Marine Corporal Eugene Sledge participated in some of the deadliest combat in the Pacific Theater of WWII, including Peleliu and Okinawa. The ribbons shown in the above picture predate the creation of the CAR.
Marine Corporal Eugene Sledge’s ribbons with all retroactive awards showing, including the CAR first in the order of precedence.

Opinion Piece: Why Study History?

Rather than write about a specific historical topic today in my succession of articles (I have the time now since I’m taking time off from work this week to vacation in Colorado and Idaho), I thought I’d write about why I believe studying history is a critical part of education and daily life. Many historians will tell you the importance of a well-rounded historical education: better engagement with other cultures, understanding government policy, socio-economic insight, and so on. The tired trope of ‘history repeats itself’ carries an element of truth, but it takes a careful eye to discern the subtleties when similar historical patterns make the rounds.

I often think of Hari Seldon, the central protagonist of Isaac Asimov’s cornerstone science-fiction series ‘Foundation’. As he debates the merits of his psychohistory theory, whereby one can forecast future events from the statistical behavior of mass populations, his listeners filter through the complex mathematics and only hear the word ‘prediction’. Seldon rejects the idea that his theory proves a person can predict the future, claiming that one can only anticipate large-scale social changes, not individual action. Despite his reservations, Seldon is given resources by the Galactic Empire to refine the mathematics and develop a working formula for weighing multiple probabilities. Fast-forwarding through the narrative, his predictions about the fall of the empire are fulfilled, and the ensuing chaos is managed by Seldon’s successors in the Foundation society. His prophetic ability earns him the moniker of ‘Raven’ Seldon.

Now why did I give that example? Because Seldon’s psychohistory rests on the same premise as the study of history: while we cannot predict individual human behavior, we can develop firm understandings of societal behavior and the trajectory of certain actions. The academic study of history incorporates elements of anthropology, sociology, and psychology because history analyzes human behavior, interaction, and customs on multiple levels of organization. We don’t study the American Civil War only by regurgitating battlefield statistics and tracking the movement of armies. We also look at the economic factors of the Union and Confederacy, the political machinations of certain parties, racial issues concerning slavery and emancipation, the emotional challenges faced by families on the home front, and innumerable other things. History compiles everything together, but it also takes time to uncover the primary sources and conduct our own research. Go to any history conference and you’ll see firsthand how specific presenters make their topics. They uncover new evidence challenging a traditional argument, or venture into territory that can open up a new niche of historical research. The duty of historians is not only to recount past events, but to make them relatable today. This can be difficult because the historical education most people receive in school routinely teaches only essential, memorizable segments. Wider comprehension of more complex and interconnected historical narratives is more prevalent in higher and post-graduate education, but most mainstream historical education ceases after the 12th grade. After that, most historical learning is self-taught or consumed through mainstream or entertainment media. The latter, as we know, is consistently inaccurate.

History teachers go to great lengths to educate their students, but not every student receives the same level of instruction. I was fortunate enough to have a well-rounded set of teachers who gave me multiple interpretations of a historical event and encouraged my critical thinking. One teacher, Mrs. Dickey, instructed us on Central and South American history, a subject I might never have been exposed to in high school had she not built her own curriculum rather than following only what the state of Missouri required. I came away with not only stronger knowledge of Central and South America, but a heightened awareness that what we take away from history should be used to improve lives, not glorify the problems that still exist. If there’s a problem, shouldn’t you do something to fix it?

I study history to improve my life and the lives of others. Every day I work with veterans, and when they have issues getting their financial or medical benefits, I use my historical research skills to discern what they need and how they can get correct answers if it’s beyond my jurisdiction. I also study history to improve people’s knowledge of the world. If one’s historical education is reduced to sound bites, a conspiracy vlogger on YouTube, or even worse, watching Oak Island and Swamp People on the “History” Channel, it can adversely impact their real-life decision making. How can one be an informed voter without researching which government policies have worked or failed in the past? How can diplomats engage with other countries if they remain insensitive to another nation’s cultural heritage and history, especially if that nation was once a colonial subject?

The concept of ‘history repeating itself’ is more abstract than we think, honestly. We may not wholly prevent a specific event from ever happening again, but it can recur in new forms, which can take time to recognize. This month I closely followed the ongoing withdrawal of U.S. armed forces from Afghanistan after twenty years of operations combating the Taliban. Billions were spent on supporting a democratically elected Afghan government, equipping and training the Afghan National Army, and negotiating a peace agreement with the Taliban. Getting into the weeds of our role in Afghanistan since 2001 and Middle Eastern policy is NOT for this article. But despite the accomplishments made by coalition forces against the Taliban, the conflict echoed another in many respects: Vietnam. I couldn’t help but discern similar patterns: the U.S. entering a foreign nation to defeat an insurgency, partnering with the domestic government, equipping and training a national army, fighting a prolonged war with said insurgency, and eventually withdrawing after a peace settlement, leaving the regime behind to fight the insurgency on its own. A gross simplification for sure, and it’s hard to judge both conflicts with the same mold, but the broad spectrum of activity in Vietnam and Afghanistan shares many of the same characteristics. Will the Afghan government endure and defeat the Taliban? My personal thought is no, and to go one step further, a total collapse of the government in Kabul is highly probable in the coming years.

I’m not Hari Seldon (especially since I suck at math), but one doesn’t need to be Hari Seldon to educate oneself on the patterns of history. Without studying history, human civilization would be far worse off and far less intelligent for not taking lessons from our ancestors. The human experience is very much trial and error, and it’s the errors that allow us to adapt and progress. History chronicles those errors, but people are far more reluctant to adopt those lessons since they don’t often happen within their lifetime. That’s why we have educators who teach us everything they can about history. It boils down to us, the individual, to make decisions that positively impact the wider world. Only then can we all be a bit more like Hari Seldon.

(Header Image: Reading to Children, Germany, 8/1950, Image Courtesy of the National Archives, NAID 23932386)

Efficiency, Honor, Fidelity: The Good Conduct Medal

The U.S. Armed Forces expect the best from every servicemember, from basic training to an honorable discharge. Servicemembers represent the highest ideals of their branch, striving for excellence. As a result, everyone’s performance record is tracked for posterity. Evaluations track a member’s aptitude and accomplishments, which helps determine promotions and awards. One award, whose origins stretch back to just after the American Civil War, recognizes exemplary behavior, commitment, and dedication to military service: the Good Conduct Medal.

Good Conduct Medal from each service branch. Left to right; Army, Marine Corps, Navy, Air Force, Coast Guard

In 1869, the Navy created the first Good Conduct Medal (GCM). The purpose was to recognize a period of honorable service following a sailor’s discharge. If sailors completed at least three years of honorable service, they received the GCM (which was actually a badge) along with their discharge paperwork or re-enlistment contract. The award wasn’t even allowed to be worn on the uniform until 1885, when the second version of the medal was released. Between 1869 and 1996, the Navy GCM underwent four revisions, each with a different design and criteria. Designers switched between a Maltese cross and a simple circle design with varying types of ships, anchors, or a globe (some officers rejected the globe version because it signaled ‘imperialist qualities’).

Navy Good Conduct Medal, circa 1886 (Image courtesy of the Naval History and Heritage Command)

In 1896, the Marine Corps followed suit and created its own GCM, with the added feature of having the recipient’s name stamped onto the reverse of the medal. The Coast Guard GCM came later in 1921, the Army GCM in 1941 following Executive Order 8809, and the Air Force was last in 1963. Until 1963, Air Force personnel were given the Army GCM because the two branches shared the same regulations and award standards until the early 1960s.

The GCM is one of the most commonly issued awards in the military, outpaced only by the National Defense Service Medal and the Army Service Ribbon. Unlike some awards, the GCM has specific time requirements. Service members must demonstrate three or four years of honorable service to qualify, depending on the time period. How does an enlisted person’s superior determine honorable service? High standards of job performance and a record free of infractions or UCMJ violations are absolutely necessary for anyone hoping to receive the GCM. The Navy maintains a grade system where every 90 days a sailor receives marks for their performance; if at any point those marks dip too low, the sailor immediately becomes ineligible for the award. The same criteria extend to the Marine Corps as well. A Marine must have three years of ‘honorable and faithful service’; prior to December 1945, it was four years. The Coast Guard GCM, established in 1921 by the Coast Guard Commandant, uses many of the same criteria as the Navy and Marine Corps: a grading system combined with three years of honorable service (reduced from the original requirement of four years).

The Army GCM has the unique distinction of being created by the President of the United States. Under Executive Order 8809, signed by President Franklin Roosevelt in 1941, the Army GCM was established with a three-year requirement. In 1943, FDR signed a follow-up order, EO 9323, amending the time requirement to one year while the United States was at war. Since the order was signed in the midst of World War II, many Army veterans unknowingly qualified for the medal because they had enlisted for the duration of the conflict. Thousands of veterans applied for the medal retroactively after the war. Qualifications changed again during the Korean War when President Harry Truman issued Executive Order 10444 in 1953. It allowed service members to receive their first GCM after June 27, 1950 for a period of less than three years, but more than one year. It also included a clause covering soldiers who were discharged due to combat injuries or who died in the line of duty, even if they had served less than a year.

Elvis Presley returning to the US after serving three years in the Army. He received the Good Conduct Medal (wearing the appropriate ribbon in the above picture) along with a handful of weapon qualification badges, circa 1960 (Image Courtesy of the Graceland Archives)

The Air Force was the last to adopt a GCM. It also holds the special distinction of being the only GCM authorized by an Act of Congress, in 1960. In the interim years between the branch’s creation in 1947 and the first Air Force GCM awarded in 1963, Air Force servicemembers were judged by Army standards. Additionally, airmen serving before and after 1963 can wear both the Army and Air Force versions of the GCM. By 2006, debates arose within the Department of the Air Force over whether the branch should even have a GCM. The rationale was that Air Force personnel should be held to a higher standard of conduct than any other branch; a medal for good conduct therefore seemed out of place, since exemplary performance and behavior were the expectation, not an aberration. In 2006, the Air Force GCM was discontinued. The policy didn’t last long, however. Within two years, officials began reconsidering the decision, and they reversed themselves in 2008. All servicemembers who would have qualified for the award in those years were retroactively issued the medal.

Now the uniqueness doesn’t end there for the GCM. Appurtenances go with almost every award in the U.S. Armed Forces: oak leaf clusters, stars, arrowheads, etc. The Navy, Marine Corps, and Coast Guard use bronze stars to denote multiple GCM awards, and the Air Force uses oak leaf clusters. The Army, however, uses an appurtenance wholly unique to its version: a loop. The loop harkens back to when the Navy used enlistment bars on its GCM badges to denote years of service. Bronze, silver, and gold loops mark the number of subsequent awards:

  • Bronze loops are used for the second through the fifth awards.
  • Silver loops are used for the sixth through the tenth awards.
  • Gold loops are used for the eleventh through the fifteenth awards.
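The loop ranges above can be expressed as a small lookup. This is just an illustrative sketch of the rules as this article describes them (the function name is my own, not an official Army tool):

```python
from typing import Optional

def army_gcm_loop(award_number: int) -> Optional[str]:
    """Return the loop worn for a given Army GCM award number,
    per the bronze/silver/gold ranges described above."""
    if not 1 <= award_number <= 15:
        raise ValueError("the loop system covers awards 1 through 15")
    if award_number == 1:
        return None          # first award: medal alone, no loop
    if award_number <= 5:
        return "bronze"      # second through fifth awards
    if award_number <= 10:
        return "silver"      # sixth through tenth awards
    return "gold"            # eleventh through fifteenth awards
```

So a soldier receiving their seventh GCM would wear a silver loop, and an eleventh award would switch to gold.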

By this process, a servicemember can theoretically receive the GCM a grand total of fifteen times, meaning they could have served more than forty-five years in the military without any infractions or judicial punishments while receiving stellar ratings in every performance report. Is such a scenario possible? Yes, but only a handful of people have served in the U.S. Armed Forces for such a duration. Dwight D. Eisenhower, Douglas MacArthur, John William Vessey Jr., Chesty Puller, and Omar Bradley are such record holders (Bradley with the highest record at 69 years, 8 months, 7 days).

Fidelity, exemplary behavior, and honor, reflecting the high standards of conduct expected of a sailor, soldier, or airman, are encapsulated in the GCM. Recipients have shown that they can lead by example and embody the golden standard of honor, hard work, and loyalty everyone expects of each other in the U.S. military.

‘Unternehmen Walküre’: Killing Hitler (Almost)

“This is General Olbricht, calling on behalf of General Fromm, commander of the Reserve Army. The Fuhrer, Adolf Hitler, is dead. A group of radicals in the SS are attempting to seize control of the government. Initiate Operation Valkyrie.”

Bill Nighy delivered that line as General Friedrich Olbricht, one of the participants in the famous July 20 plot intended to assassinate Adolf Hitler, arrest members of his inner circle, and negotiate an armistice with the Allies. The bomb failed to eliminate its intended target, and the plotters were quickly arrested and executed. For a brief moment, though, men like Claus von Stauffenberg, Ludwig Beck, Friedrich Fromm, Werner von Haeften, Henning von Tresckow, Carl Goerdeler, and Erwin von Witzleben believed that they had incapacitated the Nazis and rescued Germany.

Adolf Hitler, the target of over twenty assassination attempts since 1934. With every escape from death, his hubris and ego grew to mammoth proportions (Image courtesy of the German Federal Archives)

Popularized in the 2008 film Valkyrie, the July 20 plot became the most famous assassination attempt against Hitler, who by then had avoided death more than a dozen times. The German Resistance comprised senior political figures, military officers, and private sector businessmen who had lost faith that Germany could prevail against the Allies. U.S., British, and Canadian forces were in the midst of liberating France, and renewed offensives by the Soviet Union on the Eastern Front placed Germany in an untenable position. In the early summer of 1944, after a failed bomb assassination, a new plan was put forward to resistance leaders.

Colonel Henning von Tresckow, a July 20 plotter, tried detonating a bomb on Hitler’s plane in 1943, but it failed. He later rewrote the Valkyrie plan to meet the plotters’ goal of removing the Nazi government (Image courtesy of the German Federal Archives)

The German Reserve Army (or Replacement Army) retained operational plans for maintaining law and order during a national emergency. The plan was codenamed ‘Operation Valkyrie’, and General Olbricht believed it could be retooled to use the Replacement Army if a coup d’état were to erupt. Resistance members recruited the commander of the Replacement Army, Friedrich Fromm, who agreed to keep silent in exchange for a senior position in the new regime. Colonel Henning von Tresckow (who had tried to kill Hitler in 1943) drafted a new copy of the Valkyrie plan and distributed it to various Nazi installations across Europe over a period of several weeks. The new draft called for seizing communication hubs, government offices, and concentration camp offices so as to quickly secure German infrastructure.

But how to kill Hitler himself? There had been twenty-one attempts, including shooting, stabbing, bombing, and even poisoning (allegedly Hitler’s vegetarian diet spared him that fate). The answer came with Colonel Stauffenberg’s appointment as Chief-of-Staff of the Replacement Army. The post granted him access to Hitler’s advisors and itinerary, which was manna from heaven to the Resistance. Armed with knowledge of Hitler’s movements and retinue, they could settle on a method for killing him. Two bombs armed with chemically timed pencil detonators, carried inside a leather briefcase, were judged the best option. The bombs would detonate inside a concrete bunker at the Wolf’s Lair complex in East Prussia, and the resulting concussive blast would instantly kill everyone in the room. Stauffenberg’s job required him to attend military briefings, so he volunteered to deliver the bombs.

Colonel Claus von Stauffenberg before sustaining injuries and receiving his signature eye patch (Image courtesy of the German Federal Archives)

The planners originally chose July 11 to carry out their mission, but there was a hiccup. Resistance members understood that if Hitler alone were killed, he would simply be replaced by a close associate like Himmler or Goering. The attempt was aborted because Himmler wasn’t present at the briefing. A second attempt came on July 15, when Himmler and Goering were in attendance, but Hitler was called out for another meeting and Stauffenberg hastily removed the detonator from the bombs. Even as the Resistance carried out its plans, the Gestapo was investigating the alleged plotters, and many investigators concluded that some assassination attempt was in the works. Stauffenberg, Beck, Tresckow, Olbricht, and others resigned themselves to the fact that even if Hitler miraculously survived, they needed to complete the second half of their plot: seizing control of the German government. Failure meant the firing squad.

July 20, 1944: Stauffenberg and his adjutant, Lieutenant Werner von Haeften, arrived at the Wolf’s Lair and, under the pretense of using a washroom, armed the bombs and walked to the briefing. A last-minute change occurred when the meeting was moved from the concrete bunker to a wooden cabin with large windows, as it was an especially hot and humid day. Stauffenberg placed the bomb as close to Hitler as possible and left the room shortly thereafter under the pretense of a phone call. At 12:42 PM, an explosion ripped through the cabin, shattering windows, ripping off doors, and splintering rafters. Believing Hitler dead, Stauffenberg and von Haeften sped away from the Lair and flew back to Berlin, where plotters received the flash: “HITLER IS DEAD.”

Hermann Goering and Martin Bormann inspect the damage following the bomb blast

Hitler wasn’t dead. General Fellgiebel, another plotter present at the Wolf’s Lair, saw Hitler alive and informed other members, but when Stauffenberg arrived in Berlin, he maintained that Hitler was dead. At 4:00 PM, Operation Valkyrie was initiated and the Replacement Army quickly went to work arresting ‘conspirators’ in the Nazi Party and Wehrmacht. As the coup proceeded, though, news of Hitler’s survival began undermining the plan. Field Marshal Wilhelm Keitel deduced that Stauffenberg had planted the bomb, and orders went out for his arrest and that of the others.

At around 7:00 PM, Hitler had recovered enough from his minor injuries to begin making phone calls to Berlin. Members of the coup who had wavered in their support for the Resistance shifted sides after hearing of Hitler’s survival. The coup quickly disintegrated and orders were given to take the plotters alive. In an attempt to prove his loyalty, General Fromm held an impromptu court-martial and pronounced death sentences on the conspirators. They were escorted to the courtyard of the Bendlerblock (administrative offices for the War Ministry), lined up, and shot.

Conspirators in the July 20 plot appear before the Nazi German judge Roland Freisler, who seemed more intent on intimidating and chastising the accused than on eliciting testimony.

In the weeks and months following the July 20 plot, dozens more conspirators were identified, condemned before kangaroo courts, and summarily executed. It was the last assassination attempt against Hitler, and after World War II it became the most famous of them all, since it had come the closest to ending the war. In those six hours on July 20, the Resistance had its chance to stop the most savage fighting in all of Europe, and its members made the most of them before facing the gallows. Stauffenberg and his fellow conspirators became heroes in the postwar world, and their actions were recognized by the German government in 1980 with the Memorial to the German Resistance. A plaque hangs above the spot where the plotters were executed, displaying four solemn lines honoring their cause:

You did not bear the shame.

You resisted.

You bestowed the eternally vigilant signal to turn back

by sacrificing your impassioned lives for freedom, justice and honour.

American Pallas Athene: The Women’s Army Corps Service Medal

General Douglas MacArthur called them ‘my best soldiers.’ Without them, many believed, the U.S. war effort would have been severely shorthanded. They were a vital force at home and abroad, and by the end of World War II there were over 150,000 active duty personnel serving in every theater of operations.

The Women’s Army Corps was established as an auxiliary unit and activated to full duty status on July 1, 1943, serving in communication and mechanical duties both in the United States and overseas. During World War II, the servicewomen endured rigorous training and a great deal of slander from WAC opponents. Some believed that women could not rise to the challenge, and others believed women should not perform any wartime duties at all. Despite the public backlash, the Women’s Army Corps boasted over 150,000 active duty members and inspired the creation of other women’s auxiliaries: the Navy WAVES, the Coast Guard SPARS, and the USMC Women’s Reserve. Like the rest of the armed forces, the WAC was segregated between black and white women, but senior leaders made it a priority to ensure that everyone received the same training and opportunities to work in different specialties.

The Women’s Army Corps Service Medal

For their service, all enlisted members of the WAC received the Women’s Army Corps Service Medal. Created by Executive Order 9365, signed by President Franklin Roosevelt on July 29, 1943, the medal was given to anyone who served with the WAC or its predecessor, the Women’s Army Auxiliary Corps. Unlike other service medals, the WACSM has no appurtenances and is only awarded once. Following the corps’ disbandment, the medal was no longer issued, but those who served during the qualifying time frame can still apply to receive it. The obverse of the medal features a profile of Athena, the Greek goddess of wisdom and warfare. For many women, World War II proved that they were capable of taking on the most demanding wartime jobs and accomplishing them with great valor and gallantry.

Fugger the Insanely Rich: A Review of ‘The Richest Man Who Ever Lived’ by Greg Steinmetz

Do you ever daydream about what you’d accomplish with endless funds? Would you buy that new car you always wanted? Embark on a month-long vacation to the tropics? Eat a hamburger at every establishment listed on ‘Diners, Drive-Ins, and Dives’? All of these are admirable, but consider this: what if all that wealth wasn’t enough, and making money was a gift bestowed on you by God? What if your sole purpose was generating profit for yourself, your company, and your family? If so, then you share a kindred soul with one of the richest men in history: Jakob Fugger.

Greg Steinmetz’s book, ‘The Richest Man Who Ever Lived: The Life and Times of Jacob Fugger’, recounts the founding and influence of the powerful Fugger merchant family. Long before large capitalist enterprises and industrial monopolies became the norm in European economies, Fugger singlehandedly cornered the copper and silver markets. His business network reached into several royal households and the Vatican. Merchants, bankers, and businessmen followed his advice and bowed to his financial acumen, believing that he really could turn anything into gold. Steinmetz’s research draws on a vast archive and the Fugger family papers that have survived for nearly six hundred years. That’s right: Jakob Fugger isn’t your 20th-century mogul or Elon Musk-style tech entrepreneur. His fortune equaled roughly 2% of Europe’s GDP in the 15th and 16th centuries.

Portrait of Jakob Fugger by Albrecht Dürer, 1518

Steinmetz unveils the humble origins of the Fugger family and their ascent into the wealthy echelons of society. Jakob’s grandfather, Hans, was a lowly peasant who moved to Augsburg and entered the textile trade. Textiles were a powerful industry in Europe, and Augsburg’s proximity to Italy, which produced many of the necessary dyes, created a rich market in Germany. The Fugger children and grandchildren worked in various capacities in the merchant business, while others were encouraged to study theology and become priests. Jakob was one of the latter, but by the age of fourteen he was pursuing business interests on the family’s behalf in Venice. The formative Venice years were invaluable to Jakob as he learned the value of building networks, investments, new enterprises, accounting, and honoring contracts. After returning from Venice, Fugger sought out new ventures in Central Europe, and his greatest windfall came with mining. Through a series of deals with Hapsburg nobility, he secured rights to silver and copper mines throughout Austria. The mines made the Fuggers rich beyond comprehension. By the time Jakob died in 1525, the vast majority of the copper and silver used for minting coins and for commercial purposes came from his mines.

Material wealth was only part of the Fugger fortune. The family was closely allied with the Hapsburg royal household and were early supporters of their claims to titles of nobility. The Hapsburgs, in turn, provided many of the contracts and rights that allowed the Fuggers to operate in their territories, and given Jakob’s ability to raise funds, they came to rely on him for loans and credit. Whenever the Hapsburgs needed funds to raise armies or influence elections, they went to Jakob. Border disputes or problems with the Catholic Church? The Fugger network had agents strategically placed in key positions who could resolve disagreements. Nothing was without its cost, though, and Fugger routinely made a profit from these ventures. Steinmetz takes no shortcuts in emphasizing the relationship between the Fuggers and the Hapsburgs. Banking and nobility were tailor-made for each other during the Renaissance, and Jakob certainly capitalized on this political network.

Augsburg in the 15th century. This prosperous German city was the center of much of the wool and textile trade in Central Europe, making many families enormously wealthy. Augsburg would be Jakob’s home for most of his life, and many in the city were touched by his financial and political influence

Steinmetz makes another salient point in his analysis of Jakob Fugger and the merchant family. Jakob was traditionally seen as the poorest family member, one who made his fortune from nothing, but that’s far from the truth. His father and grandfather made important in-roads in the textile industry and built valuable relationships with German and Italian markets. His mother, Barbara Basinger, managed the Fugger bank following the death of his father, Jakob the Elder. She was just as shrewd and enterprising as her husband and sons: she exponentially increased the family fortune, and by her death she had left vast inheritances and dowries for her children. Steinmetz’s study of the family relations sheds light on the centrality of the business to the Fuggers. Outside members worked as agents or informants throughout Europe, but the Fuggers alone managed the money and were allowed to learn the art of accounting. Traditional historians emphasized the importance of Jakob, but he wasn’t a one-man operation; the extended family made it all happen with him.

Fugger’s profit-generating skills weren’t entirely selfish, though. Steinmetz recounts the Fuggers’ generosity toward local churches and impoverished citizens, all of it motivated by Jakob’s devout Roman Catholic faith. Originally destined for the life of a priest, Fugger donated large sums of money to St. Anna’s Church and paid the salaries of many parish priests. In 1512, a chapel designed by the Renaissance artists Albrecht Dürer, Hans Burgkmair, Jörg Breu the Elder, and Hans Daucher was dedicated to the Fuggers, later becoming a mausoleum for the Fugger brothers Ulrich and Georg. The most iconic fixture of Fugger’s legacy is the Fuggerei. In 1518, Jakob established a trust funding the building and maintenance of a large social housing complex for struggling laborers. The apartments had modern features for the time, complete with private kitchens and bedrooms, all within an enclosed community where residents lived under a specific set of rules. Rent was set at one guilder, and residents were to pray for the Fugger family several times a day. The Fuggerei exists today with the same rent of one guilder (equal to 0.88 euro) and houses around 150 people.

The Fuggerei, Augsburg, Germany

‘The Richest Man Who Ever Lived’ is a perfect book for those who enjoy larger-than-life personalities and biographies of influential people. Medievalists will appreciate the historical research and contextual evidence Steinmetz uses throughout the text. In conclusion, ‘The Richest Man Who Ever Lived’ is a fitting testament to a man who, had he gone bankrupt, could have singlehandedly sent Europe back to the Dark Ages just as it was entering the Renaissance.

American Baronetcy: A Review of ‘The House of Morgan’ by Ron Chernow

The Gilded Age is a staple of middle and high school social studies classes in the United States. Students learn about the great robber barons who commanded American industry. The rapid transformation of the economy from a rural agrarian landscape to factories, foundries, and railroads signaled the shift in American life. The captains of industry who instigated this transformation amassed financial and political fortunes that could give Bill Gates, Jeff Bezos, and other billionaires a run for their equity. We hear of men like John D. Rockefeller, Andrew Carnegie, and Cornelius Vanderbilt when thinking of the Gilded Age, but one figure evolved with the economy and became a critical component of the U.S. economic engine. From humble roots in England, the House of Morgan grew into a global financial institution that bankrolled industries and foreign governments. One name was synonymous with banking in the Gilded Age: John Pierpont Morgan.

Ron Chernow’s book ‘The House of Morgan’ explores far more than the biography of J.P. Morgan and his legacy. Rather, in true Chernow fashion, the book runs a fine comb over the rise, dominance, splintering, and restructuring of the most influential corporate financial company of the 19th and 20th centuries. Examining American finance without the House of Morgan is akin to researching the development of the polio vaccine without Jonas Salk; it’s impossible to discuss one without the other. What Chernow illustrates is how pivotal the House of Morgan became in the banking world and how that power transferred between generations. Coupled with the family history, Chernow examines the company, J.P. Morgan & Co., and how, at various times in history, it was at the center of economic growth, government crisis intervention, controversy and scandal, and the diversification of high finance.

The office of J.P. Morgan & Co. at 23 Wall Street. This corner was both powerful and mysterious in the world of high finance, with J.P. Morgan and his numerous partners consolidating industries and giving banking security to the most influential companies and men in the U.S. (Image courtesy of the Library of Congress)

Chernow’s narrative follows a round-robin pattern, focusing on the multiple offices and personalities connected to the House of Morgan over a period of 120 years. Embedded in this structure is the rise and fall of what he calls the ‘Gentleman Banker’s Code’. Throughout the 19th century, banks were private institutions, and we do mean PRIVATE. The House of Morgan never advertised its services, publicly listed client names, or dealt with the rabble of early Wall Street. A bank like that would hardly survive in today’s fast finance world powered by vast digital databases. But the House of Morgan was a product of its time; the bank served institutions and businesses, not the public at large. Chernow’s in-depth research reveals how the Morgan enterprise amassed its fortune through acquisitions, controlling interests, and issuing bonds and loans to corporations and governments. Each branch in North America, England, and France conducted business in a slightly different fashion, but they all followed the Banker’s Code.

Obviously J.P. Morgan dominates the early narrative; his face is on the book cover. The Morgan name began with his father, Junius Spencer Morgan, who built J.S. Morgan & Co. with his business partner George Peabody. Through intensive training, J.P. rises as a powerful figure who takes the company beyond what his father could have dreamt. Renamed J.P. Morgan & Co. in 1895, the bank quickly became the focal point for corporate finance. Captains of industry came to respect Morgan’s financial acumen because it produced results. His method of consolidating fractured businesses and taking controlling interests became known as ‘Morganization’, a forerunner of the modern practice of mergers and acquisitions.

The story of Morgan isn’t limited to Wall Street. Morgan branches in England, such as Morgan Grenfell, illustrate the dichotomy between the American and European methods of banking. While the Bank of England and Morgan Grenfell formed an integral component of the state economy, J.P. Morgan & Co. maintained an independent streak, occasionally interceding on behalf of the U.S. federal government. The relationship was not always a happy one, as Chernow recounts. The Progressive Era aimed to reduce poverty and rein in the unrestrained capitalism of the robber barons. Morgan was a prominent target, a trend that followed the company for years. The firm appeared in court cases and Congressional hearings, and was the subject of numerous federal investigations ranging from illegal price fixing to underwriting loans to belligerent foreign nations. The Glass-Steagall Act forced the bank to re-evaluate its business model once it was prevented from intermingling commercial and investment banking. The result was the spin-off of multiple Morgan entities that later evolved into the modern firms we know today: Morgan Guaranty, JPMorgan Chase, and Morgan Stanley.

The House of Morgan didn’t survive by the Morgans alone. An army of junior and senior partners came and went through the 23 Wall Street office, bringing with them their education and their prejudices. Figures like Tom Lamont, Russell Leffingwell, and George Whitney were instrumental in expanding Morgan’s reach into new territory, according to Chernow. The stresses of such a job, however, are evident in Chernow’s writing: many died young from heart attacks, strokes, overexertion, and alcoholism. Racial and ethnic prejudices were not absent either, as an unspoken code barred Jewish, black, Hispanic, and other non-white hires, unless they were serving lunch in the private dining halls. Chernow wastes no paper in examining the darker side of the dominant banking business.

The immensity of Chernow’s work speaks as a testament to the changes that reshaped the House of Morgan. His analysis illustrates the remarkable shift in policy and public connection as the old Gentleman Banker’s Code was discarded and replaced by younger proteges working harder and faster. Gone was the smoke-filled, leather-armchair partners’ room where deals were finished over brandy and cigars. When information technology altered Wall Street in the 1970s and 1980s, the various Morgan entities adapted to the times, but their historical provenance never faded. ‘The House of Morgan’ is a bold history that highlights the best and worst of American finance without denying or revising its subject’s legacy. Chernow’s trademark intensive research ignores neither scandal nor the trivial, making this an enthralling read for anyone who knows the name of Morgan.

Nubian Neighbors: The Long Overshadowed African Kingdom

To preface this article: I knew practically nothing about ancient Nubian civilization. Vague childhood memories from my Egypt-o-mania phase recall a passing reference to Nubia as a subservient client kingdom. My bookshelf had plenty of Ancient Egypt books for kids, and most neighboring kingdoms were glossed over. The Egyptians, I assumed, exerted authority over the Nubians through military and economic oppression, rendering them impoverished vassals. A recent trip to the St. Louis Art Museum fundamentally changed those notions. Nubian Treasures, a traveling exhibit from the Museum of Fine Arts, Boston, displayed an astonishing array of artwork and artifacts from the Nubian kingdoms that existed over two thousand years. The exhibit left such an impression that it compelled me to write this post. As time passes we discover more details and nuances about ancient civilizations than we knew existed; where knowledge is missing, the gaps get filled by other sources (no, ancient aliens do not count. Kill that thought right now). Who were the Nubians, and why don’t we know more about them?

Faience lion that adorned a Nubian temple. Many of the raw materials and artistic design patterns were heavily influenced by Egypt, which made it difficult for early archaeologists to separate Nubian artifacts from Egyptian ones (St. Louis Art Museum)

Among those who study ancient civilizations, there is a tendency to attribute cultural traits from an established society to newly discovered adjacent ones. What does this mean? Essentially, when a powerful kingdom has large cultural exports such as art, language, religion, and government, bordering kingdoms can rely heavily on their neighbor, leading to appropriation. The famed Egyptologist George Reisner subscribed to this theory when excavating Nubia in the early 20th century. Archaeological evidence collected at the time led Reisner to believe that much of the Egyptian-like artwork and artifacts were remnants of an Egyptian-occupied land and a subservient people. Hieroglyphs and artwork reinforced this notion, as Westerners interpreted the darker-skinned depictions of Nubian figures as servants or slaves. The theory took hold in the academic world and remained unchallenged for decades.

The Nubians were definitely not pushovers who allowed the Egyptians to dictate their civilization. The first recorded cultural group, Kerma, lived in Nubia from 2500 BCE until it was conquered by Thutmose I around 1500 BCE, during the Egyptian New Kingdom period. For that thousand years, Nubians peacefully co-existed with Egypt and other African kingdoms; trade flourished between them, and subgroups of kingdoms developed throughout the region. Nubia is first mentioned in Egyptian accounts in the 24th century BCE, during the Old Kingdom. This didn’t mean the Nubians were unimportant; in fact, they were Egypt’s largest trading partner. Substantial imports of gold, ebony, incense, ivory, and copper made the Nubian kingdoms incredibly valuable to Egypt. The Nubians were also renowned for their archery, boasting some of the best archers in Northern Africa, and on several occasions they participated in Egyptian military campaigns. Nubian and Egyptian intermarriages were commonplace, and many archaeologists speculate that a handful of Egyptian pharaohs may have had Nubian ancestry. There is no doubting the mutual influence the two civilizations had on each other for nearly a thousand years.

Nubian ushabtis were small figurines that were placed in the burial chamber of a tomb. They served a spiritual purpose by serving their master in the afterlife. This is one of the many cultural and religious practices that Nubians and Egyptians shared over generations (St. Louis Art Museum)

The dynamic changed drastically around 1500 BCE, when the pharaoh Thutmose I expanded Egypt’s borders into the Levant and Nubia. The occupation lasted nearly 400 years, but during that time competing Nubian factions challenged Egyptian authority, creating a near-constant state of civil conflict in the region. By 1000 BCE, the Kushites were emerging as the dominant power in Nubia, and Egyptian control was relinquished. By the 8th century BCE, the tables had turned: a massive Kushite army led by Piye began a systematic conquest of Egypt. He founded the 25th Dynasty, whose pharaohs ruled the two lands for a little over 100 years. In the 7th century BCE, an invading Assyrian army drove the Kushites from Egypt, keeping them in the south for the next thousand years. It’s at this time that the ancient city of Meroe came to prominence as the cultural and power center of Nubia following the collapse of the 25th Dynasty. The Kushite kingdom preserved many Egyptian traditions, customs, and religious practices, and developed its own language, Meroitic. Today it remains one of the few undeciphered ancient languages.

The bottom script is Meroitic, one of the few ancient languages yet to be deciphered (St. Louis Art Museum)

The extent of my knowledge of African civilizations is slim, but the point of this blog is to broaden my own historical horizons. If the Nubian Treasures exhibit taught me anything, it was that civilizations constantly borrow from one another. Whether it’s religious beliefs, economic practices, cultural customs, or government bureaucracy, the Nubian peoples and the Egyptians certainly had a strong and complex relationship. Nubian archaeology has received increased attention in the past two decades, and as that interest continues, it’s likely we’ll uncover more about this long-overshadowed civilization and its people.

For more information about the St. Louis Art Museum and its exhibits, visit their website: SLAM.org