Washington Post Book Reviews
Monday August 16, 2010
THE LONG ROAD TO ANNAPOLIS: The Founding of the Naval Academy and the Emerging American Republic
William P. Leeman
Univ. of North Carolina
ISBN 978-0-8078-3383-4
292 pages
$39.95
Reviewed by John Lehman
After American independence, patriots believed that standing armies and professional navies were instruments of royal tyranny and had no place in the new republic. George Washington, "Cincinnatus of the West," set the model, taking up arms to defend his country and then returning to the plow when the threat was past. Militias were to suffice for the new republic's defense.
Barbary pirates, conflict with France and the War of 1812 brought the realization that, indeed, an army and navy were needed, but Thomas Jefferson did not trust their officer corps because they were largely in sympathy with his political enemy Alexander Hamilton. This fear helped motivate Jefferson to establish the military academy at West Point in 1802 to inculcate republican values. Naval officers, while equally pro-federalist, did not present as great a threat as their army counterparts for they were usually away at sea. Thus, he saw much less need for a naval academy.
So for the first half-century of the navy's existence, its officers studied their profession in "the school of the ship." Captains took responsibility for the professional and moral training of young midshipmen, requiring them to take classroom instruction at sea from chaplains or civilian schoolmasters, who assigned extensive reading in the classics, science, philosophy and history.
There were always advocates for a naval academy, beginning with the greatest naval hero of the revolution, John Paul Jones, but political support came only after a major scandal. In 1842, midshipman Philip Spencer, who happened to be the son of the secretary of war, was hanged aboard the training brig Somers by his captain on suspicion of conspiracy to mutiny. In 1845, Secretary of the Navy George Bancroft seized on the Somers affair as a reason finally to establish a naval academy at Annapolis.
William Leeman has given us an excellent history of the politics and personalities animating the long debate over whether or not to establish a naval academy, with many interesting anecdotes along the way. He chronicles President Theodore Roosevelt's effort to establish Annapolis as the professional and cultural heart of the navy. In 1906, Roosevelt made the discovery in Paris of John Paul Jones' body the occasion for a publicity event. In order to draw world, and especially Congressional, attention to his plans for building a world-class navy, Roosevelt sent a squadron of cruisers to France to bring Jones' body back to Annapolis for interment in a crypt under the Naval Academy chapel, patterned on Napoleon's Tomb at Les Invalides.
Like West Point, the Naval Academy rapidly became an important institution in the American ruling establishment. While merit certainly had an important role in the selection of cadets and midshipmen, appointments went disproportionately to the sons of influential and wealthy supporters of members of Congress. Between 1845 and 1945 only 2 percent of midshipmen came from working-class backgrounds. Whereas the Army consistently commissioned officers from the enlisted ranks, only 2 percent of naval officers in World War II had prior enlisted service. The ratio of officers from the better civilian universities to Annapolis grads in World War II was 70 to 1, with Annapolis grads usually found on the big prestigious ships and the grads from Officer Candidate School (OCS) relegated to the lesser craft. The saying was "It don't mean a thing if you ain't got that ring."
If one were starting from scratch without historical anxieties and political pressures, it is unlikely that the current Naval Academy would be the outcome. There is an inherent conflict between a liberal education based on skeptical inquiry and military indoctrination requiring unquestioning obedience. Combining the two educational cultures tends to create a pressure chamber with too much to do and no time to think and absorb.
European military training has evolved in very different ways. In the United Kingdom, the naval academy at Dartmouth and the military academy at Sandhurst are based on an 18-month military-only curriculum by which military science and leadership are taught to the exclusion of purely academic subjects. Those who wish to get a university degree normally do so at a civilian university before or after they graduate from the academy. Officers thus absorb a better understanding of civilian culture and intellectual freedom, while also getting an undiluted indoctrination in military professionalism. This form of military education is also a lot less expensive.
Some argue that relying on civilian university programs like ROTC would not produce officers who stay for a full career. Yet fewer than 50 percent of academy graduates make the navy or marine corps a career, about the same rate as for officers commissioned through ROTC and OCS. The cost to the taxpayer for a commissioned officer from a service academy is much higher than for one from ROTC. OCS is by far the best bargain of all.
Critics have suggested that an institution led by people who all went to the same school breeds closed, inbred thinking and resistance to outside ideas and innovations. That has certainly not been my experience over the years with the senior naval officer corps. Innovation has originated far more often with naval officers than with outside critics. Submarines, aircraft carriers and, in my day, cruise missiles, UAVs and ultra-high-tech communications were all introduced by strong, innovative naval officers working with like-minded civilians.
Leeman has told a fine tale of how the Naval Academy came to be. His next book should take on the even more tumultuous story of how it became what it is today.
John Lehman served as secretary of the navy in the Reagan administration, was a member of the 9-11 Commission and is a member of the National Defense Commission.
Copyright 2010 Washington Post Writers Group
BEING WRONG: Adventures in the Margin of Error
Kathryn Schulz
Ecco
ISBN 978-0-06-117604-3
405 pages
$26.99
WRONG: Why Experts Keep Failing Us -- and How to Know When Not to Trust Them
David H. Freedman
Little, Brown
ISBN 978-0-316-09329-3
295 pages
$25.99
Reviewed by Michael Washburn
Error arrives cloaked in certainty. In our politics, in our relationships and in the advice we solicit, we're at the mercy of an ever-present unreliability. Such are the lessons taught by Kathryn Schulz's "Being Wrong" and David H. Freedman's "Wrong," complementary explorations of our relentless genius for getting it ... wrong.
The good news, from Schulz's perspective, is that mistakes shouldn't be condemned, at least not in any traditional sense. Schulz draws on philosophers, neuroscientists, psychoanalysts and a bit of common sense in an erudite, playful rumination on error. "We are wrong about what it means to be wrong," she writes. "Far from being a sign of intellectual inferiority, the capacity to err is crucial to human cognition ... (and) it is inextricable from some of our most humane and honorable qualities: empathy, optimism, imagination, conviction, and courage." By understanding the dynamics of error, we open a space for tolerance of both our own and others' failings. This is error as pedagogy, and it does not come naturally.
Being wrong, Schulz notes, feels exactly like being right. We often fall victim to the "cuz it's true" dynamic: the self-serving circularity of taking one's own belief in an idea as a sign of that idea's veracity. We engage in this thinking all the time, about everything: We decide what flavor ice cream is best or what the long-term prospects of the "tea party" are, and we promote these beliefs, often fiercely. Humans are engines of strong, often ill-formed or arbitrary opinion, and any assertion of one's knowledge carries with it an accusation of others' errors.
We judge others' mistakes more harshly than our own, often assuming that others' errors stem from one of three flaws: idiocy, ignorance or evil. As Schulz writes, "Moral and intellectual wrongness are connected not by mere linguistic coincidence but by a long history of associating error with evil -- and, conversely, rightness with righteousness."
"Being Wrong" traverses disciplines and eras, deftly interweaving etymology (Schulz reminds us that "error" derives from the Latin word for "to stray or wander") with such sources as Saint Augustine's "Confessions," contemporary neuroscience and vivid examples of radical mistakes. For instance, to illuminate the resilience of error, she tells the disquieting tale of a woman suffering from anosognosia, the unawareness or denial of a disability: The woman remained convinced that she could see despite having recently gone totally blind, going so far as to describe in vivid yet absolutely incorrect detail her hospital room.
Through such cases, Schulz lays bare the inductive failures, misperceptions and biased assumptions that exist in less extreme form in everyone and that everyone should find instructive.
In "Wrong," Freedman takes a darker view of our errors, and while his book is less artfully written than Schulz's, it is more forcefully argued, focusing on the point where error shades into deceit.
Why do experts fail? Political pundits and business writers are paid to give opinions, not to be right. And as Freedman points out, despite life's complexity, we prefer the simple advice proffered by informal experts, particularly when it reinforces our pre-existing ideas or affirms our hopes (sleep your way to a six-pack!). The startling lesson of "Wrong," however, is how often medical researchers engage in similar crowd-pleasing charlatanism.
"If," Freedman writes, "a scientist wants to or expects to end up with certain results, he will likely achieve them, often through some form of fudging, whether conscious or not." He supports this assertion with a torrent of research -- experts on experts -- hoping to disenthrall readers from the seduction of scientific expertise. He notes this irony and devotes an appendix to explaining why his sources should prove more reliable: "Experts who study other experts' failings are better equipped ... to avoid those troubles." On balance, Freedman is convincing, even if his explanation sometimes exhibits a bit of "cuz it's true."
"Wrong" makes a powerful case for the prevalence of scientific ineptitude. For example, data are often disregarded if they contradict the results that were predicted in research proposals written to secure funding. Moreover, most studies honored with publication benefit from journals' preference for provocative, positive findings. This may, at first, seem reasonable, until Freedman points out that when multiple studies address the same question, the ones with positive results are much more likely to see print, even if the majority present negative findings. In this sense, prestigious scientific journals are no better than tabloids: They seek out attention-grabbing headlines, and researchers are eager to oblige.
Brazen fraud also plays a role in research. Among other examples, Freedman explains how an oncologist faked the results of a widely praised genetics experiment by coloring white mice with a Magic Marker.
Mice themselves are a problem, too. Researchers' reliance on different species, subjected to extreme conditions, to shed light on human ailments falls under Freedman's withering gaze. For instance, anyone who sees a correlation between human depression and the length of time a lab rat swims before giving up and floating is probably fooling himself.
But Freedman's lazy scientists are Schulz's self-deceived strivers. Experts are only human and, therefore, unaware of many of their mistakes, even if they profit from our culture of credulity. Despite this rash of errors, we remain in denial over our own fallibility -- we're our own worst experts. In her discussion of medical expertise, Schulz writes, "What is both philosophically and practically interesting about (the question of medical error) is the paradox that lurks at its heart: if you want to try to eradicate error, you have to start by assuming that it is inevitable."
Perhaps this quasi-philosophical first premise should supplement Descartes' cogito: Not only I think, therefore I am, but I'm certain, therefore I'm often wrong.
Michael Washburn is the assistant director of the Center for the Humanities at the Graduate Center of the City University of New York.
Copyright 2010 Washington Post Writers Group